• ricardo_chicas (8/19/2010)


    Hello All,

    I have this situation:

    For several reports we need to extract about 5 million records from a table of 124 million rows (173 GB).

    I am able to use the clustered index (with a "like" statement), and the only other filter is a column with a state, so it is not helping much. The problem is that the resulting data set is really big (5 GB): it takes forever to load and consumes a lot of server resources (we usually get the error "there is not enough system memory to run this query...").

    Exactly how are you generating the report?

    SSRS? A custom program? T-SQL?

    Is the report generator running on the same machine as the SQL Server?

    Do you really need to return all 5 million rows to the report generator?

    I suppose the report generator does not actually print a report with 5 million rows, but instead performs some aggregation of the data to produce a more reasonable number of rows. If that is the case, could you perhaps do this aggregation in the source query instead of in the report?
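    As a rough sketch of what I mean (the table and column names here are invented, since the actual schema wasn't posted), pushing the aggregation into the source query means only the summarized rows travel to the report:

    ```sql
    -- Hypothetical example: aggregate on the server instead of
    -- shipping all 5 million detail rows to the report generator.
    -- dbo.BigTable, AccountKey, StateCode and Amount are assumed names.
    SELECT  StateCode,
            COUNT(*)    AS RowCnt,
            SUM(Amount) AS TotalAmount
    FROM    dbo.BigTable
    WHERE   AccountKey LIKE 'ABC%'   -- the seek you already get on the clustered index
      AND   StateCode = 'CA'
    GROUP BY StateCode;
    ```

    The report then only has to render the aggregated result, which should be a few rows instead of 5 GB of detail data.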

    Another possible thing to look for is an ORDER BY in the source query - that can force SQL Server to sort the result set, which is fairly expensive for 5 GB of data.
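    To illustrate (again with invented names): if the ORDER BY matches the clustered index key, SQL Server can return rows in index order without a sort operator, while ordering by any other column forces it to sort the whole 5 GB result:

    ```sql
    -- Names are hypothetical. Ordering by the clustered index key
    -- adds no extra sort; ordering by another column does.
    SELECT AccountKey, StateCode, Amount
    FROM   dbo.BigTable
    WHERE  AccountKey LIKE 'ABC%'
    ORDER BY AccountKey;   -- clustered index order: no sort operator

    -- versus e.g. ORDER BY Amount, which would sort the full result set
    ```

    If the ordering is only needed for presentation, it is usually cheaper to drop it from the source query and let the report sort the (much smaller) aggregated result.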

    /SG