Hardware requirements for Conversion / Migration Server

  • scottminer1205


    Points: 14

    We perform data migrations for clients.

    We intend to allocate drive space according to the following guidelines:

    C:\ OS
    D:\ SQL Binaries and system databases
    E:\ User DB Data Files
    F:\ User DB Log Files
    G:\ Temp DB (Data and Log Files)
    H:\ Backups

    My question is: are there any formulas, based on the source data, for allocating the drive space (in GB)?  For instance, we have determined that we need to take approximately 4 backups throughout this process, so if we know the size of the original .bak file (say it’s 100GB), we can multiply that by 4 to suggest that the H:\ drive receive 400GB of space.
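    That multiply-it-out idea can be sketched as a quick calculator. This is illustrative only: the 4 backups and 100GB size are the example figures from the question, and the headroom factor is an assumption, not a recommendation.

    ```python
    def backup_drive_gb(full_backup_gb, num_backups, headroom=1.25):
        """Suggested backup-drive size: full backups kept on disk, plus headroom.

        The headroom factor (assumed 25% here) leaves room for growth and
        for an extra in-flight backup; set it to 1.0 for the bare multiple.
        """
        return full_backup_gb * num_backups * headroom

    # Example from the question: a 100GB .bak kept 4 times.
    print(backup_drive_gb(100, 4, headroom=1.0))  # 400.0 - the bare 4x estimate
    print(backup_drive_gb(100, 4))                # 500.0 with assumed 25% headroom
    ```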

    Are there any recommendations or formulas for how to size the remaining drives?  Your help is much appreciated.

  • Steve Jones – SSC Editor

    SSC Guru

    Points: 713653

    Here’s what I’d say.

    For OS/binaries, I’d probably put 100GB there, maybe more for pagefile depending on RAM. Guess high, since the system drive is a pain to alter later.
    For SQL binaries, this is minimal, and you might drop the pagefile here. System databases should be fairly small, though depending on history, perhaps these might be in the 10s of GB range.
    For user dbs, you need to know/guess data size. I usually try to size the data file for data + 3mo growth, re-evaluating this every (or every other) month. You do need extra space since your data will grow. Hopefully you can estimate a data size for the next year+. I don’t know how easy it is to add space. If it’s easy (you’re on a SAN of some sort), guess low. If not, guess high.
    For log files, this is workload dependent. You will be completely guessing if you don’t have any history here. It’s not as if a log file is always 10% of data: it could be that, could be lower, could be much higher. The log tracks the record of changes, so lots of changes can mean lots of log, even if data isn’t growing. As a gross (and likely bad) rule of thumb, I’d set my log file at 15-20% of data and then see where I am. Log backups give me an idea of how much log I’m generating during each period. However, no matter what, I want extra space here. I also like to include placeholder files on this drive (and the data drive) to get me out of emergencies: https://voiceofthedba.com/2014/11/24/placeholders-for-emergencies/
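    The 15-20% rule of thumb above can be sketched as a starting range (the band comes straight from the post; the 500GB data size is a made-up example, and as the post says, you should re-check against actual log backup volumes):

    ```python
    def log_file_estimate_gb(data_gb, low=0.15, high=0.20):
        """Gross rule-of-thumb log sizing: 15-20% of data as a starting band.

        This is a first guess only; real log usage is workload dependent,
        so monitor log backup sizes and adjust.
        """
        return data_gb * low, data_gb * high

    lo, hi = log_file_estimate_gb(500)  # assumed 500GB of data
    print(f"Log file starting range: {lo:.0f}-{hi:.0f} GB")  # 75-100 GB
    ```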

    For backups, you have the idea for fulls, but you may need log backups as well, so make sure you account for those. More frequent log backups means more files, but the aggregate log volume is roughly the same. If I generate 10GB of log records a day, that could be four 2.5GB log backups (one every 6 hours) or twenty 500MB files if I back up 20 times a day. Again, remember data grows, so I need to account for the fact that 4 full backups today at 100GB might be 4 backups at 150GB in a year.
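    Putting the full- and log-backup pieces together: the 100GB full and 10GB/day of log are the post’s example numbers, while the 7-day log retention and 50% annual growth are assumptions added here purely for illustration.

    ```python
    def backup_drive_total_gb(full_gb, fulls_kept, daily_log_gb,
                              log_retention_days, annual_growth=0.5):
        """Backup-drive estimate: fulls kept plus retained log backups,
        then grown one year out.

        Splitting the day's log into more backup files doesn't change the
        aggregate size, so only the daily volume and retention matter here.
        """
        today = full_gb * fulls_kept + daily_log_gb * log_retention_days
        return today * (1 + annual_growth)

    # 4 fulls at 100GB plus 7 days of 10GB/day log, sized for assumed 50% growth.
    print(backup_drive_total_gb(100, 4, 10, 7))  # 705.0
    ```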
