performance optimization of selecting more than 140 columns with over 3000 MB of data

  • I have a query which returns more than 140 columns and over 100K records, with a total data size of more than 3,000 MB. It is taking more than 26 seconds to retrieve the data. The query is just select * from table. Please give me some suggestions to get the data faster, ideally in 3-4 seconds, into the results grid.

    Also, I have a table with over 140 columns, and an insert query is taking a lot of time. It is an insert from a select: the select query runs in 1 second, but the insert is taking a lot of time. The table does not have any indexes on it. Please help me optimize this insert.

    Thanks,

    Akash

  • For the first part of your question:

    To send 3GB across a network and process it at the client side takes a while.

    I can't think of any scenario in which displaying 3GB of data would be useful in any way...

    Please clarify why you need to display it in a grid view.
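    If the grid only needs to show a page of data at a time, one common alternative is to fetch it in pages rather than all at once. A minimal sketch, assuming SQL Server 2012 or later and a hypothetical table `dbo.BigTable` with a key column `Id` (names are placeholders, not from your post):

    ```sql
    -- Fetch one page of rows instead of the whole 3GB result set.
    -- Select only the columns the grid actually displays.
    SELECT Id, Col1, Col2, Col3
    FROM dbo.BigTable
    ORDER BY Id                               -- a deterministic sort key is required for paging
    OFFSET 0 ROWS FETCH NEXT 100 ROWS ONLY;   -- OFFSET/FETCH needs SQL Server 2012+
    ```

    Subsequent pages just change the OFFSET value; the network only ever carries one screenful of data.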

    Regarding your second question: What do you mean by "a lot of time"? Can you post the actual insert query, including the actual execution plan?

    Furthermore, does the table have no index at all? Not even a clustered index? Is there possibly any locking/blocking involved due to other users/processes?
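    While the insert is running, a quick way to see whether it is being blocked is to query the request DMV from another session. A sketch (standard SQL Server DMV, no table names assumed):

    ```sql
    -- Show any sessions currently blocked by another session,
    -- including what they are waiting on.
    SELECT session_id,
           blocking_session_id,
           wait_type,
           wait_time,      -- milliseconds spent waiting
           command
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0;
    ```

    If your insert's session_id shows up here with a non-zero blocking_session_id, the problem is contention rather than the insert itself.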



    Lutz
    A pessimist is an optimist with experience.

    How to get fast answers to your question
    How to post performance related questions
    Links for Tally Table, Cross Tabs and Dynamic Cross Tabs, Delimited Split Function

  • 3GB of data in 26 seconds, that's fast!

    What is the end user going to do with 100 000 rows?

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass

