Determining Filegrowth for a completely new database

  • Hi

    I'm hoping to create a new database for an inventory control system. I drew ER diagrams and created a well-organized series of tables, but I'm confused about how to determine the file growth for the database.

    please explain 😀

    Double posting tends to tick folks off because they may spend some time on one of your posts trying to give you an answer, only to find that a good answer is already on the other post. We do this for free... please have a little consideration for our time and don't double post. We'll find your post, because most of us look at all the forums for new posts...

    http://www.sqlservercentral.com/Forums/Topic430916-5-1.aspx

    Now, for your problem. Congrats on a well designed system and for being concerned about growth... but, to start with, you need to identify two things right out of the gate...

    1. How many rows you'll add, and...

    2. The average size of those rows.

    I can't help you with #1 nor can anyone on this forum... only you can determine that on your end by finding out what the anticipated usage of the Inventory system will be.

    While you're trying to find the information for #1, you can work on determining #2 for each table... lookup "estimating table size" in Books Online and maybe write a little dynamic SQL to go through the tables you've built. If you haven't built any tables, yet, then a template on a spreadsheet would help you in your estimations.
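
    As a starting point, here's a sketch of the kind of query Jeff describes (assuming SQL Server 2005 or later, where sys.dm_db_partition_stats is available); it derives an average row size from the space each table already reserves, which only helps if you've loaded some representative sample data:

    ```sql
    -- Rough per-table size check; looks at base data only (heap or
    -- clustered index) so rows aren't double-counted across indexes.
    SELECT  t.name                          AS TableName,
            SUM(ps.row_count)               AS [Rows],
            SUM(ps.reserved_page_count) * 8 AS ReservedKB,
            CASE WHEN SUM(ps.row_count) > 0
                 THEN SUM(ps.reserved_page_count) * 8192.0 / SUM(ps.row_count)
                 ELSE 0
            END                             AS AvgBytesPerRow
    FROM    sys.dm_db_partition_stats ps
    JOIN    sys.tables t ON t.object_id = ps.object_id
    WHERE   ps.index_id IN (0, 1)
    GROUP BY t.name
    ORDER BY ReservedKB DESC;
    ```

    If the tables are still empty, the "estimating table size" topic in Books Online walks through the same arithmetic by hand from the column definitions.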

    Personally, I'd figure out how much disk space it would take to hold the first year or two of data, and then triple that. That'll leave room for all sorts of indexes/keys, unanticipated growth, currently unknown "support/reporting requirements", and some room for a decent-size TempDB.
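
    That rule of thumb is easy to turn into back-of-the-envelope arithmetic. The row count and average row size below are placeholder assumptions you'd replace with your own answers to #1 and #2 (the inline DECLARE initializers assume SQL Server 2008 or later):

    ```sql
    -- Sizing sketch: raw data for N years, then tripled for indexes,
    -- unanticipated growth, and reporting requirements.
    DECLARE @RowsPerYear BIGINT        = 2000000; -- your answer to #1
    DECLARE @AvgRowBytes DECIMAL(10,2) = 350;     -- your answer to #2
    DECLARE @Years       INT           = 2;

    SELECT CAST(@RowsPerYear * @AvgRowBytes * @Years
                / 1048576.0 AS DECIMAL(12,1))     AS RawDataMB,
           CAST(@RowsPerYear * @AvgRowBytes * @Years * 3
                / 1048576.0 AS DECIMAL(12,1))     AS SizedWithHeadroomMB;
    ```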

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Hi,

    1. Repeat: try to search before opening a new topic 🙂

    2. Anyway: if the question is about the FILEGROWTH parameter of the data and log files, then the answer is fairly standard: 100-200 MB, no more. Automatic filegrowth is a failsafe feature and can be quite ineffective on a database under heavy load. Imagine the situation: thousands of queries are trying to modify a database that is completely full, so everything has to wait while the file grows and is zero-initialized. So you should estimate the growth up front, and the DBA should manually grow the files from time to time as necessary.
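
    For reference, setting a fixed growth increment looks like this; the database and logical file names here are hypothetical, so check sys.master_files for yours:

    ```sql
    -- Fixed-size growth increments instead of the default percentage growth.
    ALTER DATABASE InventoryDB
        MODIFY FILE (NAME = InventoryDB_Data, FILEGROWTH = 200MB);
    ALTER DATABASE InventoryDB
        MODIFY FILE (NAME = InventoryDB_Log,  FILEGROWTH = 100MB);
    ```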

  • Hi,

    I think in the beginning there is not much you can do (unless you already know the volume/frequency of insert/delete/update transactions :)) to figure out the file growth of your database, but over time you will be able to check your file growth progression.

    Then you will see if your design needs to be updated (though I am sure if you follow Third Normal Form it will be OK); sometimes you also need to denormalize for performance reasons.

    Check the following link

    http://vyaskn.tripod.com/track_sql_database_file_growth.htm

    For database design check the following link http://www.databaseanswers.org/data_models/

    Regards,

    Ahmed

  • Hold up your thumb, point it at a ruler and guess 🙂

    Seriously, there's good advice above, but for a new system, it could vary wildly. If the app is successful, it could blow your estimate. If it's not, then you could have wasted a lot of time.

    Take a guess, and go big. Disk is cheap, so set up a 10GB database.

    I'd then set up a procedure and get a count of all rows in all tables each day. Store this off and run some comparisons, watching for growth over a few months. That way you can see which tables are growing and which aren't. Don't forget to trim this table over time.
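
    A minimal sketch of that daily capture might look like this (the table and column names are made up for illustration, and it assumes SQL Server 2005 or later):

    ```sql
    -- One-time setup: a history table for daily row counts.
    CREATE TABLE dbo.TableRowHistory
    (
        CaptureDate DATE    NOT NULL DEFAULT (CONVERT(DATE, GETDATE())),
        TableName   SYSNAME NOT NULL,
        [RowCount]  BIGINT  NOT NULL
    );
    GO
    -- Run daily (e.g. from a SQL Agent job):
    INSERT INTO dbo.TableRowHistory (TableName, [RowCount])
    SELECT  t.name, SUM(ps.row_count)
    FROM    sys.dm_db_partition_stats ps
    JOIN    sys.tables t ON t.object_id = ps.object_id
    WHERE   ps.index_id IN (0, 1)  -- heap/clustered only, no double counting
    GROUP BY t.name;
    ```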

    You can also watch backup sizes. If you track the sizes of the .bak files every day, you get two things: one, an idea of how much data you're adding overall, and two, what to expect from growth. I used to track backup sizes, and if the diff between two days was more than 20%, I'd start investigating to see whether it was a real change, a mistake, or a one-time load of some sort.
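
    If the backup history in msdb is intact, that day-over-day comparison can be done in a query rather than by eyeballing .bak files. A sketch, assuming SQL Server 2012+ for LAG and a hypothetical database name:

    ```sql
    -- Day-over-day full-backup size comparison from msdb history.
    SELECT  bs.database_name,
            CONVERT(DATE, bs.backup_finish_date) AS BackupDay,
            bs.backup_size / 1048576.0           AS SizeMB,
            100.0 * (bs.backup_size
                     - LAG(bs.backup_size) OVER (ORDER BY bs.backup_finish_date))
                  / NULLIF(LAG(bs.backup_size) OVER (ORDER BY bs.backup_finish_date), 0)
                                                 AS PctChange
    FROM    msdb.dbo.backupset bs
    WHERE   bs.database_name = 'InventoryDB'
      AND   bs.type = 'D'   -- full backups only
    ORDER BY bs.backup_finish_date;
    ```

    Anything where PctChange jumps past your threshold (Steve's 20%, say) is worth a look.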
