Costing per database

  • Hi,

    I am just checking to see if anyone has ever done a costing per database (based on its size). I am trying to figure out what the cost of a database is throughout its lifecycle. I am looking for how much disk space each DB will use, how much memory it requires, and how many CPU cycles it consumes.

    Any help is appreciated.

    Thanks


    I think that is very much a per-use-case question.

    For example, the cost to host a data warehouse is going to be different from the cost to host a heavily transactional OLTP workload.

    And there are a lot of questions that come into play, such as: does your instance do a lot of calculations?  If so, you may need more CPU to do those calculations efficiently.  Do you have downtime?  What I mean is, does your company operate 24/7 or 9-5?  Having downtime means there are windows where CPU, memory, and disk can be set aside for other tasks (antivirus scans, disk defragmentation, etc.).

    How busy is your SQL instance?  Getting 1 transaction per hour versus 1 million transactions per hour will mean very different costs for disk space, memory, and CPU.  As for disk space, it again depends on how much data you store and the type of disk you use.  If you are running the latest and greatest SSD/flash technology, it may be very expensive, but if you are running on 5400 RPM spinning platters, the cost per GB is a lot lower at the cost of greatly increased lookup times.
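
    If you just want to see what each database on the instance currently takes up on disk, a rough sketch like the one below (assuming you have permission to query sys.master_files) adds up the data and log files per database.  The size column is stored in 8 KB pages.

    -- Rough per-database disk footprint (data vs. log), in MB.
    -- sys.master_files reports file size in 8 KB pages.
    SELECT  DB_NAME(mf.database_id)                                                 AS database_name,
            SUM(CASE WHEN mf.type_desc = 'ROWS' THEN mf.size ELSE 0 END) * 8 / 1024 AS data_mb,
            SUM(CASE WHEN mf.type_desc = 'LOG'  THEN mf.size ELSE 0 END) * 8 / 1024 AS log_mb
    FROM    sys.master_files AS mf
    GROUP BY mf.database_id
    ORDER BY data_mb DESC;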

    An example of why there isn't a magic "one size fits all" answer to this: where I work we have a 400 GB database for our ERP system.  That has a lot of INSERTs and UPDATEs happening on it daily (200,000 per day on a normal 9-5 day is a very rough estimate).  From that system, we capture some audit data in a second database that is JUST inserts, with very few other actions ever happening on it.  Maybe a few SELECTs per year for support purposes.  The main ERP database, since it has a lot of SELECTs, INSERTs, and UPDATEs, is going to need more memory and CPU than the audit one.  BUT the audit one is likely going to be larger, as it captures all of the data changes along with who changed them and when.  The main database also has an archiving process that runs yearly to remove old, useless data; this doesn't happen on the audit database.  And the archive database gets even less action than the audit one: it gets one large set of writes per year and otherwise sits pretty much idle.

    In my opinion, there is not enough information here to even guess at a cost analysis, because memory and CPU utilization do not scale simply with database size.  CPU and memory scale much more closely with the number of concurrent users and the number of transactions, and even then it won't be a linear scale and will still vary a bit with the use case.  For example, if you are going by concurrent users: 100 users connected and all running 100 queries at the same time will likely use more CPU and memory than 1 million users connected but running 0 queries.
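
    As a rough illustration of how loosely memory tracks database size, a sketch like this (assuming you can query the DMVs on the instance) shows how much of the buffer pool each database currently occupies; a small, busy database can easily hold more of it than a much larger database that mostly sits idle.

    -- Approximate buffer pool usage per database, in MB (each buffered page is 8 KB).
    SELECT  CASE WHEN bd.database_id = 32767 THEN 'ResourceDb'
                 ELSE DB_NAME(bd.database_id) END   AS database_name,
            COUNT(*) * 8 / 1024                     AS buffer_pool_mb
    FROM    sys.dm_os_buffer_descriptors AS bd
    GROUP BY bd.database_id
    ORDER BY buffer_pool_mb DESC;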


    On top of all of the above, I think it also depends on your hardware vendors.  For example, if you have it running on a self-contained physical server (i.e. no SAN, just a bare-metal box running SQL), the cost will be different than running it on a SAN and in VMware.  It also depends on what you consider a database lifecycle.  I have some databases that are likely to never see EOL and will be running as long as the company keeps running.  Yet I've seen others that only survived a year before we shut them down.

    The above is all just my opinion on what you should do. 
    As with all advice you find on a random internet forum - you shouldn't blindly follow it.  Always test on a test server to see if there are negative side effects before making changes to live!
    I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.

  • TJ_T wrote:

    Hi,

    I am just checking to see if anyone has ever done a costing per database (based on its size). I am trying to figure out what the cost of a database is throughout its lifecycle. I am looking for how much disk space each DB will use, how much memory it requires, and how many CPU cycles it consumes.

    Any help is appreciated.

    Thanks

    This is a tough thing to do, because such estimates are normally quite a bit different from what actually happens, simply because of the complexity of making such an estimate and the inaccuracy of estimating the actual load and usage unless you have the benefit of being able to examine similar databases.  One place to start is the following URL.

    https://docs.microsoft.com/en-us/sql/relational-databases/databases/estimate-the-size-of-a-database?view=sql-server-ver15
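
    If you do have a similar database you can examine, a sketch like this (run in the context of that database) shows reserved versus used space and row counts per table, which feeds straight into the kind of estimate that article walks through.

    -- Current reserved vs. used space per table in the current database.
    -- Page counts are 8 KB pages; row_count comes from the heap or clustered index only.
    SELECT  s.name + N'.' + t.name                  AS table_name,
            SUM(ps.reserved_page_count) * 8 / 1024  AS reserved_mb,
            SUM(ps.used_page_count) * 8 / 1024      AS used_mb,
            SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_count
    FROM    sys.dm_db_partition_stats AS ps
    JOIN    sys.tables  AS t ON t.object_id = ps.object_id
    JOIN    sys.schemas AS s ON s.schema_id = t.schema_id
    GROUP BY s.name, t.name
    ORDER BY reserved_mb DESC;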

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

    Take a look at the Azure measure called the DTU. This is how Microsoft primarily charged, per database, for Azure. Now the DTU is largely a secondary measure and they're primarily focused on vCore and storage.

    Originally, DTU meant Database Transaction Unit, and it was a weird melange of transactions, CPU use, and memory. However, over time, the fact that the DTU didn't really correlate to transactions in any way meant they redefined DTU to just mean DTU. But if you look at some of the articles out there that tried to measure the DTU, it might help you make that determination.

    If you really want detailed metrics per database, measuring full I/O per database, or full CPU per database, I'd suggest looking at Extended Events. You'll be gathering stupid amounts of data, but unlike most other options, it's easily segregated by database.
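
    As a rough sketch of what such a session could look like (the session name, the database_id of 7, and the file path are placeholders, not anything specific to your system):

    -- Capture completed statements for one database so CPU, reads, and writes
    -- can be summed per database afterwards.
    CREATE EVENT SESSION [PerDatabaseCost] ON SERVER
    ADD EVENT sqlserver.sql_statement_completed
    (
        ACTION (sqlserver.database_id, sqlserver.sql_text)
        WHERE (sqlserver.database_id = 7)   -- placeholder: the database you are costing
    )
    ADD TARGET package0.event_file
    (
        SET filename = N'C:\XE\PerDatabaseCost.xel', max_file_size = (256)
    );

    ALTER EVENT SESSION [PerDatabaseCost] ON SERVER STATE = START;

    -- Later, read the files with sys.fn_xe_file_target_read_file and sum the
    -- cpu_time, logical_reads, and writes fields from the event payload.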

    Also, depending on your system, don't forget that you may have to deal with cross-database queries. How do you plan to measure those?

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning
