Dynamically Managing SQL Memory

  • Here's a scenario I've recently come across involving a multi-instance SQL Server box.

    The server hosts a multi-tiered DW, with Staging, ODS and Warehouse each located in their own instance of SQL Server 2008. The Max Memory allocations are dynamically altered between the instances depending on the state of the ETL process: during the Staging->ODS load the memory is configured to favour the ODS, then during the ODS->Warehouse load it is switched to favour the Warehouse (see the sketch below).

    The actual DBs (ODS and Warehouse) are very large (1-2 TB each), and as far as I'm aware the server has a total of 512GB of physical RAM.

    Is there a justification for doing this, or would it be better to leave each instance with a fixed amount of memory?
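
    For reference, the switching itself is just a pair of sp_configure calls fired by the ETL at each phase boundary. A rough sketch of what one of those steps looks like (the memory figures here are made up for illustration, not the real production values):

        -- Run against the ODS instance before the Staging->ODS load.
        -- 'max server memory (MB)' is an advanced option, so expose it first.
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;

        -- Raise the ODS cap while it is doing the heavy lifting
        EXEC sp_configure 'max server memory (MB)', 350000;
        RECONFIGURE;

        -- A matching step runs against the Warehouse instance to trim it, e.g.
        -- EXEC sp_configure 'max server memory (MB)', 100000; RECONFIGURE;

    Both changes take effect without a restart, although the trimmed instance gives memory back gradually rather than instantly.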

  • I'm doing the same, although in my case I give the database 59GB out of 64GB for the ETL and then trim it down to 12GB to free up the rest for processing an Analysis Services cube.

    It seems to work, but are there downsides? And no, we can't ask for more memory at this stage; we only just got it bumped up from 32GB six weeks ago.

    I am considering the same method as in the original post for the last part of the ETL as well, since that step is very memory-intensive and would utilise as much as we can throw at it. The trouble is that by the time the ETL gets to the last step, SQL has already taken most of the memory available to it (the check below is how I keep an eye on that).
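
    Before trimming, I sanity-check how much the instance is actually holding versus the configured cap using the standard DMVs; something along these lines (nothing here is specific to my setup):

        -- Memory the instance currently holds (available in SQL 2008 and later)
        SELECT physical_memory_in_use_kb / 1024 AS memory_in_use_mb,
               memory_utilization_percentage
        FROM sys.dm_os_process_memory;

        -- The cap it is currently running under
        SELECT value_in_use AS max_server_memory_mb
        FROM sys.configurations
        WHERE name = 'max server memory (MB)';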
