Request for review of proposed test environment

  • We do have a completely isolated test environment. It's a pain to use precisely because it is completely isolated: all the server names, IP addresses, etc. are identical to live, so that we are testing "as per the live environment", and because of that isolation even copying a file there is tedious. That environment is designed to allow a test of "everything", to prove that we can perform a specific procedure, so as well as the SQL databases and the APP we also have a complete copy of Exchange, the whole of the Document Management System, and all the rest of the cast.

    For my specific application I can cope with a much skinnier environment.

    I am thinking that I just need multiple SQL environments to isolate my DEV, TEST and QA database copies from PRODUCTION.

    Can I just have one SQL VM with multiple instances on it for DEV, TEST and QA?

    I presume that would mean that I only have to buy one SQL licence (compared to putting each environment on a different SQL VM)

    On DEV I would have my application database ("MYAPP_DEV") and copies of the 3rd party databases that my APP queries. These would be renamed from "OTHER_DATABASE" to "DEV_OTHER_DATABASE" in order to enforce that all my references to [OTHER_DATABASE] go through a SYNONYM, and that nothing directly references:

    FROM [OTHER_DATABASE].dbo.[SOME_TABLE]
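
    A synonym set up that way might look like the following sketch (database, schema and table names here are illustrative assumptions, not the actual ones):

    ```sql
    -- In MYAPP_DEV: point a synonym at the DEV copy of the 3rd party database.
    -- All names below are hypothetical.
    CREATE SYNONYM dbo.[SOME_TABLE]
        FOR [DEV_OTHER_DATABASE].dbo.[SOME_TABLE];

    -- Application code then references only the synonym:
    SELECT * FROM dbo.[SOME_TABLE];
    ```

    Promoting to TEST / QA is then just a matter of dropping and re-creating the synonyms against the appropriately prefixed database copy.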

    My APP is browser based, so URLs can be

    MyApp-DEV.MyDomain.com

    MyApp-TEST.MyDomain.com

    and the CONFIG for the website will have a Connect String for the database, so it can route to the appropriate Server / Instance.
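
    For example, the per-environment site CONFIG might carry nothing more than a connection string like this (server, instance and database names are assumptions):

    ```xml
    <!-- Hypothetical web.config fragment for the MyApp-DEV.MyDomain.com site -->
    <connectionStrings>
      <add name="MyAppDb"
           connectionString="Data Source=SQLVM01\DEV;Initial Catalog=MYAPP_DEV;Integrated Security=SSPI"
           providerName="System.Data.SqlClient" />
    </connectionStrings>
    ```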

    Is it as simple as that, or is there more segregation that I need to do?

    Any suggestions would be appreciated

  • Unless there is a specific set of requirements, such as auditing or compliance, that forces you to isolate these environments this way, there should be no need to keep things the way they are.

    Same IP addresses?  Wow.

    I would not combine dev/qa/prod on the same server with different instances.  That's a surefire way to cause headaches and a lot of administration work to isolate performance to the different instances.

    In most places I've been, there are 3-4 environments, Dev, QA, staging or UAT, and production.

    The dev environment is fairly wide open.

    QA is more controlled, deployments are approved and access is limited to the QA testers.

    UAT is the last place where the final sign off occurs when a change is made.  This is secured in the same manner as production.  You only access this by using the applications.

    Production access is limited to admins.  If a developer, as an example, needs to run a query to do troubleshooting, they request access via an interface we developed, it then gets approved (or denied), and the system grants them access for 24 hours.

    All deployments are 100% automated via the DevOps deployment process.  The only manual work is admin tasks, such as expanding disk space.  Everything related to applications, reporting, or BI is automated.

    All environments are identical from the software and patching levels. The resources allocated to Dev and QA are somewhat less than production as far as memory and CPU.  UAT and production are identical from a hardware perspective.

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/

  • I presume that would mean that I only have to buy one SQL licence (compared to putting each environment on a different SQL VM)

    My understanding is that the Developer version of SQL Server can be used for Dev and Test environments. I would be inclined to use multiple VMs.

  • I'm with Michael. I wouldn't load dev/test with prod. You can use developer edition for dev/test, so use that, with separate environments.

    Having the same IP shouldn't matter. You should never connect with an IP. Connect with an FQDN, which you can override for machines with the local hosts file. If you worry about setup, everything should be deployed with infrastructure as code, so that nobody clicks on anything. A process (Chef, puppet, etc.) builds a server from a config, so that you can ensure that everything is the same. The only override would be the IP and server name.
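
    As a sketch, a hosts-file override of that kind looks like this (IP address and FQDN are placeholders):

    ```
    # C:\Windows\System32\drivers\etc\hosts on the client machine:
    # send the production FQDN to the isolated test copy, for this client only.
    10.10.10.55    appserver.mydomain.com
    ```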

  • Michael, Thank you, that's very helpful.

    Michael L John wrote:

    Unless there is a specific set of requirements, such as auditing or compliance, that forces you to isolate these environments this way, there should be no need to keep things the way they are.

    We don't have Audit / Compliance restriction.

    Same IP addresses?  Wow.

    This is the environment which, historically, we have only used to test 3rd party APPs. Nowadays we have strict requirements on 3rd party Vendor code (any use of NOLOCK has to be authorised, database names have to conform to our naming convention, etc.) ... but that is NOW; way back then we didn't have any such rules. Some of the 3rd party stuff is horrific ... full of hardwired links to other databases (from other 3rd party vendors that they pull data from) ... NOLOCK used like confetti ... and so on. One Vendor named their database "LIVE", which is not particularly helpful in knowing which Supplier Name / Product it refers to! ... but we are where we are on that.

    So that test environment is intended to guarantee exact duplication with Production, to ensure that 3rd party Vendor upgrade tests perform "identical to the live system".

    I don't want that for our in-house stuff, I think I just need a different SQL instance, and a different Web Site, and then point browser at the appropriate sub-domain and Voila! you are connected to DEV / TEST / QA etc.

    Please assume that our Production environment is completely separate from DEV / TEST.

    But I would like to have DEV / TEST / QA (and Help Desk) on separate SQL instances on a single machine (and one SQL licence). There are only 50 people in the company, and I can tolerate some lesser formality over who can do what on DEV vs. QA, but basically the way you have very helpfully described it is very similar to what we want.

    In most places I've been, there are 3-4 environments, Dev, QA, staging or UAT, and production.

    The dev environment is fairly wide open.

    QA is more controlled, deployments are approved and access is limited to the QA testers.

    UAT is the last place where the final sign off occurs when a change is made.  This is secured in the same manner as production.  You only access this by using the applications.

    I think my objective is similar, but I would appreciate any comments anyone has.

    We do fresh development on DEV. For us 99% of what we do is SQL. The APP is just an engine, and it uses metadata in the database to figure out how to display stuff to the user. We don't do any APP programming, except for generating "metadata". For example, to make a Data Entry Form we just choose which Database Table Columns are to appear on the form. We do not create any HTML for the form, and no APP data validation; just SProcs and metadata. The actual form displayed to the user, all the data validation, which SProcs to call, etc., is all mechanically generated in real time (well ... "cached").

    So at the end of today on DEV we will have some modified SProcs and some new Metadata (and maybe some DDL changes)

    We run an automated regression test system. Overnight the "Base Test Database" is restored, all the SProc / metadata changes on DEV are applied, a new BACKUP is taken (for tomorrow's base restore), and then the regression test runs. It records any difference between Baseline and Current. Tomorrow we get a list of any differences and we either "re-base" them (an intended change) or raise a ticket for a fix (an error).
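
    That nightly cycle could be sketched in T-SQL roughly as follows (paths, file names and database names are assumptions, not the actual scripts):

    ```sql
    -- 1. Restore last night's base copy.
    RESTORE DATABASE [MYAPP_REGRESSION]
        FROM DISK = N'D:\Backups\MyApp_Base.bak'
        WITH REPLACE;

    -- 2. Apply today's SProc / metadata / DDL changes (deployment script runs here).

    -- 3. Take a fresh backup to act as tomorrow's base.
    BACKUP DATABASE [MYAPP_REGRESSION]
        TO DISK = N'D:\Backups\MyApp_Base.bak'
        WITH INIT;

    -- 4. Run the regression suite and record differences against the baseline.
    ```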

    When we get to a review milestone we produce a Rollout Script. (In practice we also run that through the Regression Test to check it didn't screw anything up ...)

    We restore Production to QA and apply the rollout script, and then selected users review the changes we made.

    (My naming of QA seems to be more like your UAT)

    We usually have several cycles before a formal release. When we are ready for a formal release we make a Rollout Script, restore Production to QA and apply the script. We are reusing QA for this purpose ... maybe we should have another variant for that.

    At this point QA is used for "user training" for any new features and so on. There is no harm if a user tries something and it all goes Pete Tong ...

    Once it's all signed off we run that same rollout script on Production.

    Separate to that we have Help Desk copy databases. That is, a copy of each Production database is copied to "HD" every night. When a problem is raised with Help Desk they will use that "yesterday's data" version to investigate. For us, HD are permitted to investigate on Production, but they would only do that if the issue can only be reproduced on Production (e.g. it depends on data entered today) AND the problem will not wait until tomorrow.

    Production access is limited to admins

    That's true for us at an SQL level, but the APP has loads of diagnostic capability, so Help Desk can do diagnostics on Production (i.e. restricted to the tools provided in our APP - basically "Reports")

    All deployments are 100% automated via the DevOps deployment process.

    We are heading that way. Ours is "largely automated" but there are a few manual-ish steps.

    All environments are identical from the software and patching levels. The resources allocated to Dev and QA are somewhat less than production as far as memory and CPU.  UAT and production are identical from a hardware perspective

    I'm aiming for that too, thanks.

    My "Separate SQL Server with multiple instances for DEV/TEST/QA" will just be a virtual machine and, so IT tell me :), the physical hardware available to it will make it pretty much indistinguishable from the Production SQL Server environment.

    By comparison, the server which hosts our "fully isolated test environment" has significantly less RAM than Production. That is a pain, because when we RESTORE the Production SQL VM onto Test, SQL won't even start ... so we have to mess about starting SQL in minimal / single-user mode, change the CONFIG to reduce max server memory, and only then can we start SQL normally for the Test environment.
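
    For reference, the workaround described above is roughly this sequence (instance name and memory figure are placeholders):

    ```bat
    REM Start SQL Server with minimal configuration, limited to sqlcmd connections:
    net start MSSQLSERVER /f /m"SQLCMD"

    REM Lower max server memory so the instance fits the smaller Test host:
    sqlcmd -S . -E -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'max server memory (MB)', 8192; RECONFIGURE;"

    REM Restart normally.
    net stop MSSQLSERVER
    net start MSSQLSERVER
    ```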

  • Ken McKelvey wrote:

    My understanding is that the Developer version of SQL Server can be used for Dev and Test environments. I would be inclined to use multiple VMs.

    That's obvious, now you have said it. Thanks, that will make a big difference

    Steve Jones - SSC Editor wrote:

    I wouldn't load dev/test with prod. You can use developer edition for dev/test, so use that, with separate environments.

    Production is separate.

    If I have separate VMs for Dev / Test / QA, will that need more underlying hardware "reserved" than if I had them as different instances on a single SQL Server? If not, then separate VMs would definitely be better.

    We are a small company, a couple of developers, 3 people in Help Desk (generalists but each with a speciality - one for IT, one for our in house APP, and one for all the 3rd party stuff). We also have a technical author.

    As such the load on the DEV / TEST / QA systems is tiny, so I am reluctant to put them on separate VMs if that would "reserve" more physical hardware than multiple instances on one SQL Server.

    Having the same IP shouldn't matter.

    That TEST environment is a restore of the Production VMs, so we have identical names and everything as a side effect of that. But given how poorly written our 3rd party stuff is, "any change could be critical", and that was the reason we did it that way.

  • Using them on one instance works, but you still have a name resolution issue. The db names are different, so the connection string is different. I guess you can repoint the default_db for logins, but this still feels like a place for a change that could impact the exact duplicate.

    You can have an instance with the same name/IP, but you need network separation. We used to do this, before VMware, with hardware that duplicated multiple environments for us. Each was exactly duplicated, down to IPs and machine names. Clients needed different networking connections, which were dynamically changed by the scripts setting up environments, to allow a route from a->b, where a was a particular client and b was the environment being set up. b could change into a new environment, but it had the same network access from the client perspective, as in 10.10.10.55 was always the main web server.

  • Thanks for your help Steve

    Steve Jones - SSC Editor wrote:

    Using them on one instance works, but you still have a name resolution issue. The db names are different, so the connection string is different. I guess you can repoint the default_db for logins, but this still feels like a place for a change that could impact the exact duplicate.

    The "name resolution issue"?

    I am envisaging:

    I have a "MyApp" database, and other 3rd party databases "OtherDB1" and "OtherDB2"

    On DEV (whether VM or Instance) these will be named

    MyApp_DEV and DEV_OtherDB1 and DEV_OtherDB2

    (I have a separate reason for using Suffix on the first, and Prefix on the others)

    and thus on QA they will be

    MyApp_QA and QA_OtherDB1 and QA_OtherDB2

    In my "MyApp" databases I will change the SYNONYMs for the 3rd party databases to reference them by those PREFIX names. The copies of those 3rd party databases will be in the "local" SQL VM/Instance - so all the DEV DBs in the DEV SQL VM/Instance.

    My DEV APP is browser based, so the DEV sub-domain will have its own IIS website, and that will have a CONFIG file with a Connect String which will connect to the DEV VM/Instance and the appropriate "MyApp_DEV" database. The CONFIG also holds the login.

    That Login will have permissions to access DEV_OtherDB1 and DEV_OtherDB2

    That's my plan. If you can see any snags, pointing them out would be appreciated.

    Given that planned environment, are there any Pros / Cons to VM vs Instance? (i.e. where VM would be a separate VM for each of DEV, TEST, QA, and Instance would be one SQL VM with a separate INSTANCE for each of DEV, TEST, QA)

    The thing I think I am most keen to achieve is minimum dedicated / reserved physical hardware

    I also want to prove that ALL access to 3rd Party Databases goes through SYNONYMs, and that no code accidentally accesses them directly. The way I plan to achieve this is the deliberate prefix naming of "DEV_xxx" on the 3rd party databases. On the TEST and QA environments the prefixes will be different (and the SYNONYMs changed as appropriate), so any accidentally hardwired direct reference should break once it gets to TEST ... QA.
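
    One way to back that up with a check: SQL Server records cross-database references made by modules, so a query along these lines (an untested sketch, to be run in each MyApp database) should list any stored procedure that bypasses the synonyms:

    ```sql
    -- Modules that reference another database directly (three/four-part names).
    -- References made via a synonym resolve to the local synonym object,
    -- so they will NOT appear here.
    SELECT OBJECT_SCHEMA_NAME(d.referencing_id) AS referencing_schema,
           OBJECT_NAME(d.referencing_id)        AS referencing_object,
           d.referenced_database_name,
           d.referenced_schema_name,
           d.referenced_entity_name
    FROM sys.sql_expression_dependencies AS d
    WHERE d.referenced_database_name IS NOT NULL
      AND d.referenced_database_name <> DB_NAME();
    ```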

    I presume that, on either a VM or an INSTANCE, if I am in TEST and I do:

    SELECT ... FROM OtherDB1.MyTable -- this will use my SYNONYM and actually connect to TEST_OtherDB1

    SELECT ... FROM DEV_OtherDB1.MyTable -- this will fail, and will NOT connect to DEV_OtherDB1, which is in the DEV VM/Instance and is NOT a database in the current TEST VM/Instance

    If I have got that wrong, and that name separation is NOT achieved using INSTANCEs, then I definitely need VMs.
