SS2019 Running Extremely Slow

  • Hi everyone

    I got a new computer and copied over the DB from my old machine.  I backed up the DB from the old machine onto an external drive and then restored it on the new computer.  I ran one stored procedure and it is super slow on the new machine.  On the old machine the script ran in 1 min 10 seconds. On the new one it has been over 10 minutes and still running.  Clearly something is wrong.

    Old machine - 2015 Intel i5 with 8 GB ram

    New machine - 2022 Intel i9 with 128 GB ram

    What could possibly be the issue that is causing the script to run super slow on my new machine?

    Thank you

  • the old machine is running windows 10 pro and the new machine is running windows 11 pro

  • Same version of SQL Server on both machines?


  • Phil Parkin wrote:

    Same version of SQL Server on both machines?

    Yes

Both are running SS2019.  The version of SSMS is slightly different.

The values produced by both queries look the same.

It is almost as if the new computer is using unindexed tables when the query does searches.  Are indexes also restored, or do I have to remake them?

Do I have to tell SQL Server how much memory it can use, or does it know automatically to use whatever the system has?

  • What edition of SQL Server? Hopefully not Express, as it can't take advantage of all that horsepower.

    Compare actual execution plans.

    You might try updating statistics on the new computer.

    You have more cores on the i9. I wonder if parallelism is an issue. What are Cost Threshold for Parallelism & Max Degree of Parallelism set to?
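
    If it helps, you can read both of those settings without opening the server properties dialog. A quick way (sys.configurations is a standard catalog view):

```sql
-- Current values for both parallelism settings.
-- value_in_use is what the engine is actually running with.
SELECT name, value_in_use
FROM sys.configurations
WHERE name IN ('cost threshold for parallelism',
               'max degree of parallelism');
```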

  • ratbak wrote:

    What edition of SQL Server? Hopefully not Express, as it can't take advantage of all that horsepower.

    Compare actual execution plans.

    You might try updating statistics on the new computer.

    You have more cores on the i9. I wonder if parallelism is an issue. What are Cost Threshold for Parallelism & Max Degree of Parallelism set to?

    I am using Developer edition

    Cost Threshold for Parallelism = 5

    Max Degree of Parallelism = 12

    Are these values correct?

    I will rerun the query to get the statistics and execution plan.  The query took 5 hours to finish yesterday.  The one on the old computer took about 1 minute.
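
    For context, 5 is the out-of-the-box default for Cost Threshold for Parallelism, and many admins raise it on modern hardware. If you want to experiment, this is the sp_configure form; the value 25 below is only an illustration, not a recommendation for your workload:

```sql
-- Raising Cost Threshold for Parallelism (25 is an illustrative value).
-- The change takes effect after RECONFIGURE, no restart needed.
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'cost threshold for parallelism', 25;
RECONFIGURE;
```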

  • ratbak wrote:

    What edition of SQL Server? Hopefully not Express, as it can't take advantage of all that horsepower.

    Compare actual execution plans.

    You might try updating statistics on the new computer.

    You have more cores on the i9. I wonder if parallelism is an issue. What are Cost Threshold for Parallelism & Max Degree of Parallelism set to?

I am not familiar with 'updating statistics'.  What exactly is this?  How do I do it?

Statistics are what the query optimizer uses to choose an execution plan. They contain statistical information about the distribution of values. The query optimizer uses these statistics to estimate the cardinality, or number of rows, in the query result.

    Updating statistics ensures those estimates are valid. "SAMPLE" does just what it sounds like -- samples a percentage of the rows to extrapolate an estimate. "FULLSCAN" will look at every single row. 100% accurate, and maybe not a bad idea the first time, but may take a long time. After that statistics will normally be updated as data changes.

    You can specify the table, index, or even specific statistic.

    EXEC sp_msforeachtable 'UPDATE STATISTICS ? WITH FULLSCAN' will update statistics on all tables with a full scan.

    EXEC sp_msforeachtable 'UPDATE STATISTICS ? WITH SAMPLE 20 PERCENT' will update statistics using a 20% sample.
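
    To illustrate the narrower scopes (the table and index names below are hypothetical placeholders, not from your database):

```sql
-- All statistics on one table:
UPDATE STATISTICS dbo.Orders;

-- One index's statistic, with a full scan:
UPDATE STATISTICS dbo.Orders IX_Orders_CustomerID WITH FULLSCAN;

-- Check when each statistic on a table was last updated:
SELECT s.name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE s.object_id = OBJECT_ID('dbo.Orders');
```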

In SSMS, you can right click on the server & select Properties, then select Memory. You should set max memory to something lower than total server memory so that sufficient memory is left for Windows -- Windows processes fall back to the pagefile if they are memory starved, and that can significantly impact performance -- and for any other applications/services you run (e.g., SSRS, SSIS). There are also other SQL Server memory allocations that aren't counted under max memory. There are various rule-of-thumb calculations, like one I've seen that comes out to excluding 13 GB on a 128 GB machine... but they generally don't take into account other services/applications running on the SQL Server host. If you're running anything else, I'd personally limit SQL Server to 112 GB, and probably less... but there are wiser experts than me who might have better advice.

    Limiting SQL maximum memory is about indirect impacts to performance & not starving other processes -- it won't directly improve SQL performance.
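
    The same setting can be changed in T-SQL instead of the SSMS dialog. The value is in MB; 114688 below is just the 112 GB example from above:

```sql
-- T-SQL equivalent of the SSMS Memory page.
-- 'max server memory (MB)' caps the buffer pool and most engine allocations.
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'max server memory (MB)', 114688;  -- 112 GB, illustrative
RECONFIGURE;
```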

  • Do you know how to get execution plans? Do you know how to interpret them?

    You can generate an execution plan in SSMS if you click the "Include Actual Execution Plan" button or menu item (under Query) & then execute the query. You can then right click the execution plan and select "Save Execution Plan As..." to save as a file (XML format).

    If you provide those (identifying which is old and which is new), experts here can provide feedback on what those reveal.

    If there is confidential information, you probably shouldn't upload directly. You could use Sentry One Plan Explorer to anonymize the data first.
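
    As a lighter-weight first comparison before digging into full plans, you could also run the procedure on both machines with timing and I/O statistics on, and compare elapsed time and logical reads per table. The procedure name below is a placeholder:

```sql
-- Emit per-statement CPU/elapsed time and per-table I/O to the Messages tab.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

EXEC dbo.YourProcedure;  -- hypothetical name; substitute your slow procedure

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```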


  • ratbak wrote:

Statistics are what the query optimizer uses to choose an execution plan. They contain statistical information about the distribution of values. The query optimizer uses these statistics to estimate the cardinality, or number of rows, in the query result.

    Updating statistics ensures those estimates are valid. "SAMPLE" does just what it sounds like -- samples a percentage of the rows to extrapolate an estimate. "FULLSCAN" will look at every single row. 100% accurate, and maybe not a bad idea the first time, but may take a long time. After that statistics will normally be updated as data changes.

    You can specify the table, index, or even specific statistic. EXEC sp_msforeachtable 'UPDATE STATISTICS ? WITH FULLSCAN' will update statistics on all tables with a full scan. EXEC sp_msforeachtable 'UPDATE STATISTICS ? WITH SAMPLE 20 PERCENT' will update statistics using a 20% sample.

    Thank you for this.  I ran the full scan version.

  • ratbak wrote:

In SSMS, you can right click on the server & select Properties, then select Memory. You should set max memory to something lower than total server memory so that sufficient memory is left for Windows -- Windows processes fall back to the pagefile if they are memory starved, and that can significantly impact performance -- and for any other applications/services you run (e.g., SSRS, SSIS). There are also other SQL Server memory allocations that aren't counted under max memory. There are various rule-of-thumb calculations, like one I've seen that comes out to excluding 13 GB on a 128 GB machine... but they generally don't take into account other services/applications running on the SQL Server host. If you're running anything else, I'd personally limit SQL Server to 112 GB, and probably less... but there are wiser experts than me who might have better advice. Limiting SQL maximum memory is about indirect impacts to performance & not starving other processes -- it won't directly improve SQL performance.

The default max was set to 2,147,483,647 MB.  That is a lot.  I changed it to 100,000 MB (i.e., roughly 100 GB), which leaves about 28 GB for other processes. I also restarted SQL Server from Configuration Manager.

The query has been running for close to 5 minutes, which is clearly wrong.  The old machine took 1 minute.

  • ratbak wrote:

    Do you know how to get execution plans? Do you know how to interpret them?

    You can generate an execution plan in SSMS if you click the "Include Actual Execution Plan" button or menu item (under Query) & then execute the query. You can then right click the execution plan and select "Save Execution Plan As..." to save as a file (XML format). If you provide those (identifying which is old and which is new), experts here can provide feedback on what those reveal. If there is confidential information, you probably shouldn't upload directly. You could use Sentry One Plan Explorer to anonymize the data first.

I know what they are but am not knowledgeable enough to interpret them.  I have that set up.  Once I get the results I will review and share them.  The last run of the query took close to 5 hours, so I suspect it will be similar this time.
