Jeff Moden (10/5/2015)
Eric M Russell (10/3/2015)
898 million page reads? Each page is 8,192 bytes... :w00t: The final 42 million reads is pretty bad as well, even for a batch run. That's like doing a full scan of a 344GB table.
Hello Jeff
I think the problem is that I need a sub-query that compares and ranks data from two different systems, and I have to repeat it 10 times. Table scans unfortunately have to occur to some extent, as I am not always joining on the primary key (I didn't design these systems!).
If I can get it down to 30 seconds, I can live with it, though! :cool:
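One common way to avoid repeating the same ranking sub-query 10 times is to materialize the comparison data from both systems into an indexed temp table once, then rank it once with a windowed function. This is only a sketch under assumptions: the table names (SystemA.dbo.Orders, SystemB.dbo.Orders), the join column (AccountRef), and the ranking key (Amount) are all hypothetical placeholders for your actual schema.

```sql
-- Stage the cross-system comparison ONCE instead of in 10 sub-queries.
-- All object and column names below are illustrative, not from the post.
SELECT a.AccountRef,
       a.Amount   AS AmountA,
       b.Amount   AS AmountB
INTO   #Compare
FROM   SystemA.dbo.Orders AS a
JOIN   SystemB.dbo.Orders AS b
       ON b.AccountRef = a.AccountRef;   -- non-PK join, as described

-- Index the temp table so later reads are seeks, not scans.
CREATE CLUSTERED INDEX IX_Compare ON #Compare (AccountRef);

-- Rank once with a window function; every later query reads #Compare
-- instead of re-scanning the two source tables.
SELECT AccountRef,
       AmountA,
       AmountB,
       RANK() OVER (ORDER BY AmountA - AmountB DESC) AS DiffRank
FROM   #Compare;
```

Since the non-PK join forces at least one scan of each source table, paying that cost once and reusing the indexed temp table tends to cut the page reads roughly in proportion to how many times the sub-query was repeated.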