• ricardo_chicas (8/20/2010)


    OK, the T-SQL program loads all the needed data into a table, and then another process uses that data for a web page (believe me, all of that data is needed). That said,

    Again:

    What is the best way to retrieve that much data without using a single query that blocks my main table for more than an hour (I can't use NOLOCK since I can't afford dirty reads) and still remain optimal?

    I'm trying to help you here, but you are really providing too little information. Please be more detailed.

    My understanding so far is that you have one large table with 124 million rows.

    You then use a T-SQL script to move 5 million of those rows to a secondary table.

    An external program then reads that secondary table and sends the data over a network link to an end user.

    You then say that something blocks the main table for over an hour. Which table is being blocked?

    Is the problem in the T-SQL script that moves data between the two tables, or is the problem in the external program that tries to send data over a slow network?
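    If it is the T-SQL script, one common way to keep the main table from being locked for an hour at a time is to move the rows in small batches, so each transaction stays short and locks are released quickly. A minimal sketch, with made-up table and column names (dbo.MainTable, dbo.StagingTable, NeedsExport):

        DECLARE @batch INT;
        SET @batch = 10000;

        WHILE 1 = 1
        BEGIN
            -- Move one small batch atomically: rows deleted from the
            -- main table are written straight into the staging table.
            -- (Assumes StagingTable has matching columns and no
            -- triggers or foreign keys, which OUTPUT INTO requires.)
            DELETE TOP (@batch)
            FROM dbo.MainTable
            OUTPUT DELETED.* INTO dbo.StagingTable
            WHERE NeedsExport = 1;

            -- Stop when there is nothing left to move.
            IF @@ROWCOUNT = 0 BREAK;
        END;

    Each loop iteration is its own short transaction, so other readers and writers only ever wait for one small batch instead of the whole 5 million rows.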

    Right now it sounds like the external program is the problem.

    If that is the case, a suggestion could be to let the external program read all the data from the database as quickly as possible and just save the data to a temporary file on disk.

    Then the external program can read the data from the temporary file and send it to the client.
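    As a rough sketch of that approach, the bcp utility can spool a query result to a flat file very quickly; the server, database, table, and file names below are placeholders:

        bcp "SELECT * FROM MyDb.dbo.StagingTable" queryout "D:\export\staging.dat" -c -S MyServer -T

    That way the database connection is open only for as long as the export takes, and the slow network transfer to the client happens from the file, without holding any locks in the database.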

    Have I understood your problem?