Need Help on Fastest Search Logic

  • You could be right, but the FTI wouldn't be on Table 2 according to the OP's latest post. It would only be needed on Table 1. Fortunately, I don't see any requirement to join Table 1 to Table 2 under such conditions.

    Wasn't the requirement to search the rows in Table 2 for those that matched a row in Table 1? That was how I read it.

  • lnardozi 61862 (10/13/2013)


    You could be right, but the FTI wouldn't be on Table 2 according to the OP's latest post. It would only be needed on Table 1. Fortunately, I don't see any requirement to join Table 1 to Table 2 under such conditions.

    Wasn't the requirement to search the rows in Table 2 for those that matched a row in Table 1? That was how I read it.

    Ah. Yes... that was the original problem. I thought you were talking about the latest request, which is...

    born2achieve (10/13/2013)


    Hi Jeff,

    I'm back.

    I've got one more tricky situation from my client, and I was alarmed when I heard about it. The idea is that I will have to take the product name from Table 1 and search for it with a %product name% condition, not whole-word matching.

    In your example, after we split the comma-separated values into a temp table, we would fetch each item from Table 1 and match it with %table1.productname% against the temp table. I'm worried about this ugly approach because it will kill performance. Do you have any suggestions? Sample below:

    If the product name in Table 1 is "milk" and the temp table has "milk with fat", "milk with out fat", and "milk with less fat", then we have to fetch all three of those product names. For this, I think we would have to use %table1.productname%.

    Could you please

    Going back to the original problem, and considering how fast the solution for that turned out to be, I don't believe I'd go through setting up FTS for it. Shifting gears to my original suggestion, what they really need to do is store the data in a normalized format rather than using the CSV column. The fast solution actually does exactly that... it changes the denormalized Table 2 to a normalized version on the fly and then does a normal join to that (a sketch of both versions follows this post). Still, it would be better to avoid such on-the-fly normalization or FTS by properly normalizing Table 2 to begin with.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
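
    Here is a minimal sketch of the two joins being discussed, using assumed names (dbo.Table1 with a ProductName column; dbo.Table2 with Table2ID and a CSV column CsvList). STRING_SPLIT needs SQL Server 2016 or later; on earlier versions, a splitter function such as DelimitedSplit8K can stand in.

    -- 1) Original problem: whole-item match after splitting the CSV column
    --    (on-the-fly normalization followed by an ordinary join).
    SELECT  t1.ProductName, t2.Table2ID
    FROM    dbo.Table2 AS t2
    CROSS APPLY STRING_SPLIT(t2.CsvList, ',') AS s
    JOIN    dbo.Table1 AS t1
      ON    t1.ProductName = LTRIM(RTRIM(s.value));

    -- 2) Latest request: wildcard match, e.g. "milk" should also return
    --    "milk with fat", "milk with out fat", and "milk with less fat".
    --    The leading % makes the predicate non-SARGable (no index seek),
    --    which is why this version is so much slower.
    SELECT  t1.ProductName, s.value AS MatchedItem
    FROM    dbo.Table2 AS t2
    CROSS APPLY STRING_SPLIT(t2.CsvList, ',') AS s
    JOIN    dbo.Table1 AS t1
      ON    s.value LIKE '%' + t1.ProductName + '%';

    With Table 2 properly normalized up front, the CROSS APPLY disappears and only the join (or the LIKE, for the wildcard case) remains.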

  • Going back to the original problem, and considering how fast the solution for that turned out to be, I don't believe I'd go through setting up FTS for it. Shifting gears to my original suggestion, what they really need to do is store the data in a normalized format rather than using the CSV column. The fast solution actually does exactly that... it changes the denormalized Table 2 to a normalized version on the fly and then does a normal join to that. Still, it would be better to avoid such on-the-fly normalization or FTS by properly normalizing Table 2 to begin with.

    --Jeff Moden

    I never seem to be able to sell people on normalizing the data and creating a view with an INSTEAD OF trigger that looks like the original table. It's the best of both worlds: you get to store things the way God intended without all the bother of actually changing your application. True, writes aren't nearly as performant, but many applications read lots of rows and tend to write them one at a time.
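
    Here is a minimal sketch of that idea, with assumed names throughout. It presumes the existing denormalized table has been renamed out of the way so the view can keep the old name, and it uses STRING_AGG/STRING_SPLIT, which need SQL Server 2017/2016 or later (on older versions, FOR XML PATH and a splitter function do the same job). UPDATE and DELETE would need similar INSTEAD OF triggers, omitted here for brevity.

    -- Normalized storage: one product per row.
    CREATE TABLE dbo.Table2_Normalized
    (
        Table2ID    int          NOT NULL,
        ProductName varchar(100) NOT NULL,
        CONSTRAINT PK_Table2_Normalized PRIMARY KEY (Table2ID, ProductName)
    );
    GO

    -- The view presents the original denormalized shape (one row per Table2ID
    -- with a comma-separated product list), so the application needn't change.
    CREATE VIEW dbo.Table2
    AS
    SELECT  Table2ID,
            STRING_AGG(ProductName, ',') AS CsvList
    FROM    dbo.Table2_Normalized
    GROUP BY Table2ID;
    GO

    -- INSERTs written against the old shape are split back into normalized rows.
    CREATE TRIGGER dbo.trg_Table2_Insert
    ON dbo.Table2
    INSTEAD OF INSERT
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.Table2_Normalized (Table2ID, ProductName)
        SELECT  i.Table2ID,
                LTRIM(RTRIM(s.value))
        FROM    inserted AS i
        CROSS APPLY STRING_SPLIT(i.CsvList, ',') AS s;
    END;
    GO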
