Are there good practices for creating a database and schema using a stored procedure?

  • Hi,

    We would like to automate database and schema creation, including master data, for one of our projects. Is there a better way, or a new feature in recent versions, to create a database and schema with master data using a stored procedure? If so, kindly share your ideas or scripts.

    Regards,

    Ram

  • Can't you just add all the objects you want in every one of your databases to the model database?

    If you want to add the objects after a database is created, you'd have to script them all, check for existence, and create any that don't already exist.
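
    The existence-check pattern might look something like this (a minimal T-SQL sketch; the table and constraint names are made up for illustration):

    ```sql
    -- Hypothetical example: create a lookup table only if it does not already exist.
    -- Object and column names are illustrative, not from any real deployment.
    IF NOT EXISTS (SELECT 1 FROM sys.objects
                   WHERE object_id = OBJECT_ID(N'dbo.StatusCode') AND type = N'U')
    BEGIN
        CREATE TABLE dbo.StatusCode
        (
            StatusCodeId int         NOT NULL CONSTRAINT PK_StatusCode PRIMARY KEY,
            Description  varchar(50) NOT NULL
        );
    END;
    ```

    Repeating that guard for every object keeps the script safe to re-run against an existing database.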

You should have a deployment script(s) of some sort to do this. Something like FlywayDB would make this easy: https://flywaydb.org/

    Disclosure: I work for Redgate Software.

  • If your deployment requires data to exist as well as the schema, then a base database kept as a backup file is probably easier to deploy. Restore the database from that backup and you have a database that is ready for use.

    Creating just the schema means you then have to load supporting data before the database is useful to the application.
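
    The restore step might be sketched like this (the file paths and logical file names are assumptions; check the real ones with RESTORE FILELISTONLY):

    ```sql
    -- Hypothetical example: deploy a new database by restoring a prepared "base" backup.
    -- Database name, paths, and logical file names are illustrative only.
    RESTORE DATABASE NewProjectDB
    FROM DISK = N'C:\Backups\BaseDB.bak'
    WITH MOVE N'BaseDB'     TO N'C:\Data\NewProjectDB.mdf',
         MOVE N'BaseDB_log' TO N'C:\Data\NewProjectDB_log.ldf',
         RECOVERY;
    ```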

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

The backup approach is better, but if you are changing this set of schema/data often, it can become a pain point to manage. Ideally, you deploy changes from your pipeline both to production and to the database used to produce the backups, but in practice this becomes cumbersome.

  • Steve Jones - SSC Editor wrote:

The backup approach is better, but if you are changing this set of schema/data often, it can become a pain point to manage. Ideally, you deploy changes from your pipeline both to production and to the database used to produce the backups, but in practice this becomes cumbersome.

It is all relative... if your system requires hundreds of tables with pre-loaded lookup data before it can be used, then you either use a backup of a database with the schema and data ready to go, or you build a deployment script that creates all of the objects and then loads the data (another set of scripts and files).

Or you build a deployment model that uses a combination: a baseline database that is rebuilt with the latest schema every couple of months, plus deployment scripts covering everything since the baseline was last updated. You restore the baseline and then apply the deployment scripts from that version up to current.

    Each method has different pain points...
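
    The combined model above might be sketched with a version guard in each deployment script (the SchemaVersion table and the version numbers here are assumptions for illustration, not a standard mechanism):

    ```sql
    -- Hypothetical sketch: the restored baseline carries a schema version, and each
    -- deployment script checks it so only changes newer than the baseline are applied.
    DECLARE @CurrentVersion int = (SELECT MAX(VersionNumber) FROM dbo.SchemaVersion);

    IF @CurrentVersion < 42  -- this script represents change #42
    BEGIN
        ALTER TABLE dbo.Customer ADD LoyaltyTier tinyint NULL;
        INSERT dbo.SchemaVersion (VersionNumber, AppliedOn)
        VALUES (42, SYSUTCDATETIME());
    END;
    ```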

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Create a "base" db with the schema, tables, etc. that you need.

    When you need to create a new db, restore a backup of that db to the new db name you want.

As for the base db changing over time, the easiest approach is likely to just back it up every 30 minutes (or on whatever schedule), overwriting the previous backup file. The backup will always hold a reasonably recent version of the base, and naturally you can run the backup on demand any time you want. I personally would never try to maintain scripts for this because they are such a pain to keep accurate.
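
    The overwrite-in-place backup might look like this (a minimal sketch; the database name and path are made up, and you would normally schedule it in a SQL Agent job):

    ```sql
    -- Hypothetical example: back up the base database, overwriting the previous
    -- backup file each time (WITH INIT) so only the latest copy is kept.
    BACKUP DATABASE BaseDB
    TO DISK = N'C:\Backups\BaseDB.bak'
    WITH INIT, COMPRESSION, CHECKSUM;
    ```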

SQL DBA, SQL Server MVP (07, 08, 09)
    Prosecutor James Blackburn, in closing argument in the Fatal Vision murders trial: "If in the future, you should cry a tear, cry one for them [the murder victims]. If in the future, you should say a prayer, say one for them. And if in the future, you should light a candle, light one for them."

  • Depending on the amount of master data, I've done all of this with scripts in source control. That assumes the master data is just lookup information: 5 rows, 50 rows, certainly fewer than 100 rows per table. As soon as it expands beyond that, maintaining the data in scripts becomes too much of a headache.

    Beyond that point, we maintained the master data in a dedicated system where we could version it appropriately, and built a data load process into our builds and deployments. Any new database is built from source control, and the master data is loaded from the dedicated database. This gave us a much more dynamic, fast-moving system than trying to maintain versions of backups.
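
    A lookup-data script of the kind that lives comfortably in source control is typically written to be idempotent, for example with MERGE (a hedged sketch; the table and values are invented for illustration):

    ```sql
    -- Hypothetical example: re-runnable seed script for a small lookup table.
    -- Inserts missing rows and corrects changed descriptions; safe to run repeatedly.
    MERGE dbo.StatusCode AS target
    USING (VALUES (1, 'Active'), (2, 'Inactive'), (3, 'Pending'))
          AS source (StatusCodeId, Description)
       ON target.StatusCodeId = source.StatusCodeId
    WHEN MATCHED AND target.Description <> source.Description
        THEN UPDATE SET Description = source.Description
    WHEN NOT MATCHED BY TARGET
        THEN INSERT (StatusCodeId, Description)
             VALUES (source.StatusCodeId, source.Description);
    ```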

    ----------------------------------------------------
    The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood...
    Theodore Roosevelt

    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software
