Slow clustered index seek?

  • If it's a "human" client, then there's no way they need 92,000 rows... that's a lot of pages to read. A normal typewritten page only holds about 60 rows of 80 characters.

    If it's a program that's going to consume the data, I recommend you find the equivalent method in T-SQL and send a summary.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
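
    To illustrate the advice above about finding the equivalent method in T-SQL and sending a summary: instead of shipping tens of thousands of raw rows for the client to crunch, aggregate on the server and return a small result set. This is only a sketch, and the table and column names (dbo.Provider, Specialty, etc.) are hypothetical:

    ```sql
    -- Hypothetical example: rather than sending 92,000 provider rows to
    -- the client, aggregate on the server and send a compact summary.
    SELECT  p.Specialty,
            p.State,
            COUNT(*)            AS ProviderCount,
            MAX(p.ModifiedDate) AS LastModified
    FROM    dbo.Provider AS p          -- hypothetical table
    WHERE   p.IsActive = 1
    GROUP BY p.Specialty, p.State
    ORDER BY p.Specialty, p.State;
    ```

    A GROUP BY like this typically returns dozens of rows instead of tens of thousands, so far fewer pages are read and far less data crosses the wire.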

  • I'm going to pretend I didn't see SSN and TaxID being transmitted in the "open" to user queries...(I'm sure your doctors will appreciate having that broadcast...:)). Is that REALLY necessary?

    Even with all of that - you have a perfectly good unique identifier in there (the NPI), which likely should be your clustered key (assuming it's completely filled in by now).

    Also - the address is a TEXT field? When was the last time you needed to store a 2000-line address?

    Start with a structure that makes sense, and you should be able to cut down on how much has to travel over the wire. As Jeff and Gail mentioned, if this is for a human - then set up some kind of paging system, so that you can send "human-sized" chunks of data across the wire instead of ninety thousand records.

    ----------------------------------------------------------------------------------
    Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user. All right - what was my emergency again?
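
  • Putting the suggestions above together - a sane clustered key, a sized address column, and server-side paging so only "human-sized" chunks cross the wire - a sketch might look like the following. All object names are made up, and the ROW_NUMBER() form is shown because it works on SQL Server 2005+ (OFFSET/FETCH needs 2012+):

    ```sql
    -- Hypothetical schema fixes: cluster on NPI and replace TEXT with a
    -- sized varchar for the address.
    ALTER TABLE dbo.Provider ALTER COLUMN Address varchar(200) NOT NULL;
    CREATE UNIQUE CLUSTERED INDEX CIX_Provider_NPI ON dbo.Provider (NPI);

    -- Server-side paging with ROW_NUMBER(), so each request sends one
    -- page of rows instead of the full result set.
    DECLARE @PageNumber int,
            @PageSize   int;
    SELECT  @PageNumber = 1,
            @PageSize   = 50;

    WITH Numbered AS
    (
        SELECT  NPI, ProviderName, Address,
                ROW_NUMBER() OVER (ORDER BY NPI) AS rn
        FROM    dbo.Provider
    )
    SELECT  NPI, ProviderName, Address
    FROM    Numbered
    WHERE   rn BETWEEN (@PageNumber - 1) * @PageSize + 1
                   AND  @PageNumber * @PageSize
    ORDER BY rn;
    ```

    With the clustered index on NPI matching the ORDER BY, the engine can satisfy each page with a narrow range scan rather than materializing and sorting the whole table.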

  • Thank you all for all your suggestions!!!

    I think I've learned quite a bit about indexing just working on this query :-) And sure, Gail, I'll remember to post in the correct section next time!

    I worked on getting minimal records from the db over the past few days and got it to run within 15 seconds - which still is not a good query, but at least it works for the time being! Phew :)

    Anyway, the problem was that this particular program was written in FoxPro, connecting to a SQL database. It simply pulled all of the required records from the db, did the paging inside the program, and sent that to the front-end. (Until now the result was never 20,000+ records, so there wasn't a problem!)

    Since I am new to this project (and to FoxPro as well) and didn't have much time, I was trying not to touch all the messy FoxPro code, so as not to break the existing functionality, and wanted to handle everything within the SQL query instead. But that didn't help, so I had to break my head over cleaning up the code, and I moved a lot of logic into SQL to send only the required records to the front-end!

    Thanks again, and sorry for taking up all your time. All your suggestions have been really helpful to me!

    Rgds,

    Suku

    P.S. Yes, Jeff, it is a "human client" and they do display all 90,000+ records on the front-end, with paging that only goes next/prev - not even first/last :-) We have been trying to talk to our clients about revising the search criteria... yet to get a go-ahead.
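
    Since that front-end only pages next/prev anyway, keyset ("seek") paging fits naturally: the client remembers the last key it displayed and asks for the rows after it, which lets a clustered index on the key answer each page with a cheap range seek instead of renumbering all 90,000 rows. A minimal sketch, with hypothetical names:

    ```sql
    -- Keyset paging: the client remembers the last NPI it displayed and
    -- requests the next page after it. Table and columns are hypothetical.
    DECLARE @LastNPI  char(10),
            @PageSize int;
    SELECT  @LastNPI  = '1234567890',   -- last key on the current page
            @PageSize = 50;

    SELECT TOP (@PageSize)
            NPI, ProviderName, Address
    FROM    dbo.Provider
    WHERE   NPI > @LastNPI              -- seek past the previous page
    ORDER BY NPI;                       -- for "prev": NPI < @LastNPI, ORDER BY NPI DESC
    ```

    Unlike ROW_NUMBER()-style paging, the cost of each page stays constant no matter how deep the user pages, since the WHERE clause seeks directly to the start of the page.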
