Taking a very long time to execute

  • I have a table with the following stats:
    443,114,382 rows,
    45,511,728 KB reserved,
    680,616 KB index size, and
    248 KB unused. One of our reporting apps runs SELECT * against this table with no WHERE clause, and it is taking far too long to execute. It's just a simple SELECT statement. When I talked to them, they said they want to query it that way, and they asked me to find the reason and a solution to fix it.
    The execution plan shows an index scan because the user is querying SELECT *.
    Please advise if you have any suggestions.
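
    For reference, those numbers look like sp_spaceused output; a minimal sketch of how to pull them (the table name is a placeholder, not from the original post):

        -- Returns rows, reserved, data, index_size, and unused for one table
        EXEC sp_spaceused N'dbo.BigReportTable';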

  • saptek9 - Wednesday, April 4, 2018 1:58 PM


    To be honest, the query is stupid.  The query asks to return all columns of 443,114,382 rows... to where?  If it's the screen, who in Blue Blazes is going to read 443 million rows from a 45-gigabyte table?  If they're returning it to the screen just to "save" the data to a file, then write it to the file directly without going through the screen.

    If the report is to gather information for aggregated reporting, then do the aggregations in SQL Server before you send the data to the reporting app.
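
    As a minimal sketch of that idea, assuming a hypothetical table and a report that only needs yearly totals per region (every name here is a placeholder, not from the original post):

        -- Aggregate server-side; ship only the summary rows the report charts
        SELECT Region,
               YEAR(OrderDate) AS OrderYear,
               COUNT(*)        AS OrderCount,
               SUM(Amount)     AS TotalAmount
        FROM   dbo.BigReportTable
        GROUP  BY Region, YEAR(OrderDate);

    A few thousand summary rows cross the network instead of 443 million detail rows.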

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • The reason is that they are trying to return 400+ million rows of data; the fix is to not return that much data.

  • There is absolutely ZERO way to tune a query that is selecting all columns and all rows. You can't do it.

    What you can do is tune the hardware and the network. By "tune" I mean, of course: buy bigger, faster hardware. Throw lots of money at this problem and you'll fix it.

    Otherwise, go back to the business and try to figure out what they really need, because it's very seldom 400 million rows of data.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Grant Fritchey - Thursday, April 5, 2018 6:22 AM


    How can I prove that this is a hardware or insufficient-memory issue?

  • Grant Fritchey - Thursday, April 5, 2018 6:22 AM


    Is there any way I can show them a screenshot to prove that they need faster hardware or more memory?

  • saptek9 - Wednesday, April 4, 2018 1:58 PM


    My max server memory is 64 GB and min server memory is 32 GB. I am asking them to increase the memory so these kinds of queries can run successfully. I have seen four direct SELECT queries retrieving more than a billion records, but they want it that way. How can I prove that we need more memory or hardware to do this?
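
    For reference, a sketch of how those settings are expressed, assuming the 32 GB / 64 GB values quoted above (sp_configure takes megabytes):

        -- min/max server memory are set in MB: 32 GB = 32768, 64 GB = 65536
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'min server memory (MB)', 32768;
        EXEC sp_configure 'max server memory (MB)', 65536;
        RECONFIGURE;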

  • saptek9 - Thursday, April 5, 2018 8:57 AM


    This is just about volumetric measures on the server: pages/sec and sec/page measures on the disk, plus the wait statistics.

    You shouldn't be trying to prove they need more hardware. You should be trying to prove that just running completely open-ended queries is silly.
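
    If you do want numbers rather than a screenshot, here is a minimal sketch of the wait-statistics side (the excluded wait types are just a few common idle waits, not an exhaustive list):

        -- Top cumulative waits since the last restart/clear; check it while
        -- the SELECT * runs to see where the time actually goes
        SELECT TOP (10)
               wait_type,
               waiting_tasks_count,
               wait_time_ms,
               signal_wait_time_ms
        FROM   sys.dm_os_wait_stats
        WHERE  wait_type NOT IN (N'LAZYWRITER_SLEEP', N'SLEEP_TASK',
                                 N'SQLTRACE_BUFFER_FLUSH', N'WAITFOR')
        ORDER  BY wait_time_ms DESC;

    If ASYNC_NETWORK_IO dominates, SQL Server is mostly waiting for the client to consume rows, which is an argument against buying hardware, not for it.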

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Grant Fritchey - Thursday, April 5, 2018 9:09 AM


    It's not even necessarily a SQL Server issue with that much data; it's entirely possible there is lag on the receiving end of that query.
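
    A minimal sketch of how to check that while the query runs (the session id is a placeholder): if the request sits in ASYNC_NETWORK_IO, the server is waiting on the receiving end rather than doing work itself.

        -- Current wait for the reporting app's session (53 is a placeholder)
        SELECT session_id, status, wait_type, wait_time,
               cpu_time, total_elapsed_time
        FROM   sys.dm_exec_requests
        WHERE  session_id = 53;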

  • ZZartin - Thursday, April 5, 2018 9:16 AM


    There's going to be lag everywhere. It's inescapable.

    Next question... is this one person running one query a day, or is it lots of people running lots of queries, all of them SELECT * FROM without a WHERE clause? If the latter... hoo boy.
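
    A sketch of one way to answer that on a live system (the table name is a placeholder, not from the original post):

        -- Sessions currently running statements that touch the big table
        SELECT r.session_id, s.login_name, s.program_name, t.text
        FROM   sys.dm_exec_requests r
        JOIN   sys.dm_exec_sessions s
               ON s.session_id = r.session_id
        CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
        WHERE  t.text LIKE N'%BigReportTable%';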

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • saptek9 - Thursday, April 5, 2018 9:01 AM


    Ask them if it is OK to email them a spreadsheet with the data returned by that query. They will likely freak out and tell you that is entirely too much data for an email, to which your response is "exactly".

    _______________________________________________________________

    Need help? Help us help you.

    Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

    Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.

    Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
    Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
    Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
    Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/

  • saptek9 - Thursday, April 5, 2018 8:57 AM


    What you need to do is tell them that they don't need to get all the data, because it already exists in the database.  What they really need to do is correctly identify the problem they actually wish to solve, rather than "getting all the data", and then let someone who actually knows what they're doing solve that problem correctly. 😀

    --Jeff Moden

  • Sean Lange - Thursday, April 5, 2018 9:44 AM


    A spreadsheet with a BILLION rows?
    1.) Won't fit on one tab.
    2.) Will need ~1,000 tabs, plus or minus a few (quick arithmetic check below).
    3.) They can forget about any kind of analysis across the entire data set.
    4.) Who's going to spend the time figuring out how to generate those 1,000 different tabs? (Here, maybe you get lucky with bulk copy, but I doubt it...)
    5.) What possible value would such a behemoth of a spreadsheet have for anyone, assuming it even manages to avoid the tab-count limitation, or for that matter, the Windows file-size limitation?
    6.) Anyone care to try adding a formula to all of those rows, on all tabs, and once you have, anyone wanna press F9 (calc)?  After all, with a spreadsheet that size, you DID remember to turn off automatic recalculation, didn't you?
    7.) And 6 assumes you can even OPEN the spreadsheet without waiting a few years for it to load, or maybe just crashing Excel or your computer as a whole...
    8.) You'd better have at least 64 GB of RAM... and a lot of $$$... and even then you might well be in trouble. Heck, you might even need a server machine just to be capable of holding that much RAM; current desktop and laptop motherboards are limited to 64 GB max, and only the very high end accommodates that much.

    Need I go on?  You can reference the Excel specifications and limits here:  https://support.office.com/en-us/article/excel-specifications-and-limits-1672b34d-7043-467e-8e27-269d656771c3

    Be careful what you ask for, Jeff... someone out there is bound to be foolish enough to try to take you up on it.  Maybe you could make this a fun project, just to see if it's even possible... (ha ha, tee hee, tee hee) :hehe:
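
    For what it's worth, the arithmetic behind item 2 as a quick T-SQL check (1,048,576 is Excel's documented per-sheet row limit):

        -- A billion rows at 1,048,576 rows per sheet needs ~954 tabs
        SELECT CEILING(1000000000.0 / 1048576) AS TabsNeeded;  -- 954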

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • sgmunson - Friday, April 6, 2018 12:49 PM


    It wasn't me that made the suggestion... although I thought the suggestion was appropriately humorous. 🙂

    --Jeff Moden

  • Jeff Moden - Friday, April 6, 2018 2:09 PM


    It was me who made the suggestion, and all the absurdity of how completely stupid and unmanageable it would be as a spreadsheet was exactly the point.

    -- Sean Lange
