• I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.
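
    For reference, if you did decide to cap memory on a test instance, the mechanism is a straightforward sp_configure change (shown here only to illustrate the setting, not as a recommendation for 512 MB):

    -- Requires sysadmin; 'max server memory (MB)' is an advanced option.
    EXEC sys.sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sys.sp_configure 'max server memory (MB)', 512;
    RECONFIGURE;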

    If you are trying to tune a query against a TB-sized object, you must ensure your test environment recreates the conditions your full-sized database experiences. Queries against a 1 GB object behave differently from queries against a 1 TB object.

    For example, with a 1 GB object you may try using a table variable to hold some intermediate data and get better performance than using a #temp table. Scale up to even 100 GB and the table variable could be holding so much data that a properly indexed #temp table is vastly more performant. Likewise, many access plans that work well with a 1 GB object will be poor with a 1 TB object. You can easily live with half your data being pulled into a workfile with a 1 GB object, but copying 500 GB into tempdb will add a LOT of time to your query execution, and it may be better to formulate a query that avoids this.
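
    As a rough sketch of the two approaches (the dbo.Orders table and its columns are made up purely for illustration):

    -- Table variable: convenient for a small intermediate set, but the
    -- optimizer has no statistics on it and (before SQL Server 2019's
    -- deferred compilation) estimates it at roughly one row.
    DECLARE @Staging TABLE (CustomerID int PRIMARY KEY, OrderTotal money);
    INSERT INTO @Staging (CustomerID, OrderTotal)
    SELECT CustomerID, SUM(Amount)
    FROM dbo.Orders
    GROUP BY CustomerID;

    -- #temp table: has real statistics and can be indexed after loading,
    -- which usually scales far better once the intermediate set is large.
    CREATE TABLE #Staging (CustomerID int NOT NULL, OrderTotal money);
    INSERT INTO #Staging (CustomerID, OrderTotal)
    SELECT CustomerID, SUM(Amount)
    FROM dbo.Orders
    GROUP BY CustomerID;
    CREATE CLUSTERED INDEX IX_Staging_CustomerID ON #Staging (CustomerID);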

    So, if you simply dive into a 2 GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counter-productive.
