SQLServerCentral is supported by Red Gate Software Ltd.

Administering a Development Environment
Posted Wednesday, August 10, 2005 8:00 AM
SSC Journeyman

On this topic, I agree with the concerns raised about "blanket refreshes", yet I do see the value of regular, simple refreshes. With that in mind, I did the following pretty soon after I arrived at my new company. Feel free to comment on it (the more I hear, the more I learn).

I proposed, and created, a "staging" server. This lives in the production world and has very restricted access. The staging process follows this simple paradigm, and does so on a daily basis:

1> Receive production backups (the production backup job's last step is to copy the nightly full backup file to a specific location).

2> Restore the backup

3> Scrub all sensitive, private and security data (a separate exercise, done before we built the staging server, was to review the existing "scrubbing" process)

4> Remove production users from the databases

5> Shrink the logs

6> Create a new "scrubbed" backup in a date-based folder

7> Email success/failure
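In case it helps anyone picture it, steps 2 to 6 boil down to something like the following T-SQL for each database (a minimal sketch only - the database name SalesDB, the file paths, the user name and the scrub statement are all made-up placeholders, and the scrub is obviously site-specific; DROP USER is the SQL 2005 syntax, on SQL 2000 you'd use sp_revokedbaccess):

```sql
-- Step 2: restore last night's production backup onto the staging server
RESTORE DATABASE SalesDB
FROM DISK = N'D:\Incoming\SalesDB_Full.bak'
WITH REPLACE,
     MOVE N'SalesDB_Data' TO N'D:\Data\SalesDB.mdf',
     MOVE N'SalesDB_Log'  TO N'D:\Log\SalesDB.ldf';

-- Step 3: scrub sensitive data (illustration only; the real process
-- is a reviewed, site-specific script)
UPDATE SalesDB.dbo.Customers
SET    Email = 'scrubbed@example.com';

-- Step 4: drop production users from the restored copy
USE SalesDB;
DROP USER prod_app_user;

-- Step 5: shrink the log, since this copy will never be rolled forward
DBCC SHRINKFILE (N'SalesDB_Log', 1);

-- Step 6: write the scrubbed backup into a date-based folder (yyyymmdd)
DECLARE @path nvarchar(260);
SET @path = N'D:\Scrubbed\'
          + CONVERT(nvarchar(8), GETDATE(), 112)
          + N'\SalesDB.bak';
BACKUP DATABASE SalesDB TO DISK = @path WITH INIT;
```

Each of these is just one job step per database; multiply by 35 databases and you get the 35-step job described below.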

What we have is the "server" (a 35-step SQL Server Agent job) creating these scrubbed backups on a daily basis (35 databases, totalling roughly 55 GB of databases and 55 GB of source backups). The process above takes 1h45 (and does a few extras, like taking a monthly backup and pushing it to a special directory so we can re-create month-end, and an annual directory as well). We keep 2 days' worth, so at any point in time we have yesterday's full set, today's full set, the last month-end's full set and the last year-end's full set.

We then have "client" jobs (SQL Server Agent jobs) on each of our development, test and training servers (we have 4 dev boxes, 4 test boxes and 1 training box - 2 dev and 2 test boxes in a development stream, and 2 dev and 2 test boxes in a production support stream). These jobs fetch the backups from the date-based folder, restore the backups, apply environment-specific settings and scrubs (like resetting emails to dev2@somewhere.outthere), add the environment's users, and email success/failure. This can take anywhere between 2 and 5 hours (given the different box specifications).
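A client job's restore and scrub steps then look roughly like this (again a sketch under the same made-up names; the fetch itself would typically be a CmdExec step doing an xcopy/robocopy from the staging share, and sp_grantdbaccess is the SQL 2000-era way to re-add a user):

```sql
-- Restore today's scrubbed backup, moving files to this box's drive layout
RESTORE DATABASE SalesDB
FROM DISK = N'E:\Refresh\SalesDB.bak'
WITH REPLACE,
     MOVE N'SalesDB_Data' TO N'E:\Data\SalesDB.mdf',
     MOVE N'SalesDB_Log'  TO N'E:\Log\SalesDB.ldf';

-- Environment-specific scrub, e.g. redirect all outbound mail to a dev alias
UPDATE SalesDB.dbo.Customers
SET    Email = 'dev2@somewhere.outthere';

-- Add this environment's users back to the restored database
USE SalesDB;
EXEC sp_grantdbaccess 'DOMAIN\DevTeam', 'DevTeam';
```

The final job step emails success or failure, so each morning you know whether the box is fresh.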

Typically our production support stream will have weekly refreshes (on a Sunday), but they can, and have, gone through periods of nightly refreshes - especially the first week after a new production rollout.

The development stream will typically get a refresh the first week of a new release, as does training.

Of course, any client can have an ad hoc refresh run as and when required, but the agent gives us the ability to make use of "down time" during the night (unless we are all working late, AGAIN).

Hmm - anything else to add? Oh yes - this might sound expensive - all these different servers etc.? Not really (relatively speaking), since all our dev, test, training and staging servers are PCs using IDE hard drives. Yes, that's unfortunate, since we can't get a proper estimation of production performance etc., but the business is still a startup, and can't afford 4 x 8-processor clusters for dev and test.

Thoughts, comments?

Post #209093
Posted Wednesday, August 9, 2006 9:15 AM
SSC Eights!

Dan, you said: "I now have a handy C# app that rolls db changes out from one source tree to another and applies them against the db, so in effect I have a source tree in sync with the db. From this we can easily track changes when debugging, refresh dbs backwards (from prod through UAT & testing to dev) and roll forward new releases, providing a rollback mechanism!"

I would very much like to look at this. I've been thinking of writing just such a beast. I had a good tool to do table and schema sync, but it's in VB 6 and uses third-party DLLs. I just have not had the time to move it to .Net. I have installed my old version at several customers. I send them an updated schema file, they run it against their production database and get a customized change script. This they run to apply the upgrade.

I want a tool that will do that but be in .Net, so that I can just drop in the executable and the current definition file, but extend it to views, keys, and stored procedures. I could whip it out in VB.Net, but I just don't have the two days that it would take right now.
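Until such a tool exists, a crude starting point for the compare step itself is the INFORMATION_SCHEMA views (a sketch only - it assumes both databases, here called NewSchemaDB and ProdDB as placeholder names, sit on the same server, and it only finds missing columns, not type or key differences):

```sql
-- Tables/columns present in the new schema but missing from production
SELECT  n.TABLE_NAME, n.COLUMN_NAME
FROM    NewSchemaDB.INFORMATION_SCHEMA.COLUMNS AS n
LEFT JOIN ProdDB.INFORMATION_SCHEMA.COLUMNS AS p
       ON  p.TABLE_NAME  = n.TABLE_NAME
       AND p.COLUMN_NAME = n.COLUMN_NAME
WHERE   p.COLUMN_NAME IS NULL
ORDER BY n.TABLE_NAME, n.COLUMN_NAME;

-- Stored procedure and view definitions can be diffed the same way via
-- INFORMATION_SCHEMA.ROUTINES (ROUTINE_NAME, ROUTINE_DEFINITION) and
-- INFORMATION_SCHEMA.VIEWS; generating the ALTER scripts is the hard part.
```

The query only tells you *what* differs - the real value of a tool like Dan's is generating the ordered change script with a rollback.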

 



ATB

Charles Kincaid

Post #300634
Posted Friday, January 15, 2010 6:23 AM
Grasshopper
I'm glad someone mentioned "remove sensitive data". I know many places simply test against copies of live data, but we should be resisting that and pushing for proper development data sets.
Post #848207

