Detailed Test Poll

  • It's another Friday and so it's time for another poll. Once again it's a technical one though I'll try for another fun one next week 🙂

    In the Feb 6, 2006 issue of eWeek, there was an article about the next tools after Sparkle, Microsoft's design tool for something. I don't really know what it is and don't care too much, but I happened to read one small section near the end. I apologize for not linking, but I cannot find this article on the eWeek site, so I wonder if they have more separation between print and web than I realize.

    Anyway, the sentence that caught my eye had to do with testing for Visual Studio. Microsoft was trying for 100% automation in the testing and said that the "current Visual Studio 2005 test covers more than 10 million tests, with 9,000 servers in the lab, and a full test pass takes 21 days." (eWeek, Feb 6, 2006, pg 14)

    Wow!

    21 days? I'm not sure whether that or the 9,000 servers is more amazing. Both are, and it makes me want to get a look at the SQL Server test suite and setup.

    So with that in mind, here's the poll:

    What's the testing setup at your company? Formal? Automated?

    I wish I could say I have some great setup here, but since we mostly just patch the web site, our testing consists of me, Brian, and Andy clicking around and seeing if things work. I use 3 or 4 test accounts, but no formal process.

    🙁

    Steve Jones

  • Well, for one of our clients, certain development teams have a process that we call "Testing in Production"...

  • The short answer: we use a mix of automated and manual test scripts, plus what I call the "real user experience" approach to testing. (The latter consists of clicking all over the place till something falls apart, which generally seems to be the most reliable way of ensuring quality since it's what users do.)

    Visual Studio 2005 is actually proof of why a fully automated test suite is a fatal approach to real quality. As a user of the tool almost daily since early Beta releases, I find the final product is still plagued by some bugs that are obvious, easily reproduced, and painful hits against productivity. So, while I think VS 2005 is a fantastic product overall, it proves that automating testing is pointless if the automation script fails to address functional processes outside a narrow range. Some of the bugs in the product are so obvious I have often heard my coders grumble, "Maybe they should have tried launching this f---ing thing."

    SQL Studio was probably also a victim of automated testing, and it shows the same obvious fractures. Anyone who has run into the error where, after deleting a slew of objects, the system informs you that some index number doesn't exist, leaves the Summary page entirely blank, and essentially forces you to close and re-open the Studio has seen the fine end result of no human being using their eyeballs.

    Maybe a more interesting question, though, than how we all test, would be why we test the way we do. My bet is that if Microsoft examined that question and its response, they would change their tools to reflect it.

  • On our team - manual, with test cases specified and controlled by an independent Test Lead.  Elsewhere in the company, some automation via scripts does take place, but in a very limited IT area.

  • We have some test scripts that are used on the current data migration project.  These scripts check each table to see if it passes certain migration rules (a rough sketch of that kind of check follows this post).  That's helped a great deal in cutting out errors before trying to load the data.

    Other than that, we use a test database to develop sprocs etc. in.  We check the output there and then release it to the live environment if all is well.

    With the sort of work we do, bugs aren't a big problem.  Mostly, it's down to data quality issues and changes in working practice, and updating SQL scripts to reflect that.
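
    To make the idea concrete, a minimal sketch of a per-table migration-rule check might look like the query below. The Staging database, Customer table, and "Postcode is required" rule are illustrative assumptions, not the poster's actual scripts.

        -- Hypothetical migration-rule check: list staging rows that would
        -- fail the "Postcode is required" rule before any load is attempted.
        SELECT  c.CustomerID,
                'Missing Postcode' AS FailedRule
        FROM    Staging.dbo.Customer AS c
        WHERE   c.Postcode IS NULL
                OR LTRIM(RTRIM(c.Postcode)) = '';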

  • Unfortunately, "testing in production" tends to be our most popular method too.

  • At my company, we have a full team of testers (formally known as business analysts or business technicians).  Since we have multiple lines of business (around 9), each of our testers specializes in a few of those areas.  When a project comes up involving one of their specialties, they're assigned to the project to give us (the programmers) the specs and then test the project when it's complete.  Each project is tested three times before it's let loose, so to speak: once on our development server, once when it moves to QA, and once more when it moves to production (usually at night, after business hours).  Since we're not a 24-hour operation -- we're an insurance company that provides web access to our agents -- this works just fine.  The programmers work very closely with the BAs/BTs, and they're great.  Very rarely does something slip through that's wrong.  And because we work closely with them, we can give them areas to pinpoint-test if we need to.  While our system is prone to human error, I think it works great for us.

  • We have a Development box that is also used for Testing, followed by Testing in Production through test accounts, and a final release to customers. The tests are manual, run from test plans created by both the Developer and the Tester.

  • For every Stored Procedure, there is a commented section containing test cases for that sproc (a sketch of the idea is at the end of this post).

    For .NET development I use NUnit for every class method.
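
    Purely as an illustration of the commented-test-case idea, a sproc header might carry something like the block below; the procedure, parameter, and table names are made up, not taken from the poster's code.

        /* Test cases (run by hand against the dev database):
           EXEC dbo.usp_GetOrdersByCustomer @CustomerID = 1;     -- expect that customer's orders
           EXEC dbo.usp_GetOrdersByCustomer @CustomerID = -1;    -- expect zero rows
           EXEC dbo.usp_GetOrdersByCustomer @CustomerID = NULL;  -- expect zero rows, no error
        */
        CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
            @CustomerID int
        AS
        BEGIN
            SET NOCOUNT ON;

            SELECT OrderID, OrderDate, TotalDue
            FROM   dbo.Orders
            WHERE  CustomerID = @CustomerID;
        END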

  • Testing Phase One - The Developer (that would be me) tries to test the app out and see if it breaks. Usually I don't get it to break.

    Testing Phase Two - A group of users get the app and are told to try to break it.

    Testing Phase Three - The application in its finished state is deployed to clients, and hopefully the thing works.

    Though sometimes we just do "testing in production", which usually ends up as "working 50-60 hour weeks in production" since almost all of it is done on mission-critical systems.

    -- Aleksei


    A failure to plan on your part does not constitute an emergency on my part!

  • Unfortunately, testing in our environment is mostly human-based as well.

    Developers test in the Development environment.

    Then test users test in a testing/training environment.

    Then there's a release to production.


  • I use MbUnit almost religiously. There are some things that are difficult to test, such as code written against non-deterministic APIs. One of my primary responsibilities right now is SharePoint development, and testing components that modify SP sites is something that is over my head right now. (Send me a message or email if you know how to do this!)

    The developers I work with mostly came from the VB 6.0 world, and a lot of the OO capabilities available in .NET are not completely understood. OO concepts are crucial for creating components that are easy to test, so I am trying to pull them through the learning curve.

    One thing I want to try soon is code coverage and I want to learn more about how to use the TestDriven.net package. I am probably not taking full advantage of the productivity enhancements that are available there either.

    [font="Tahoma"]Bryant E. Byrd, BSSE MCDBA MCAD[/font]
    Business Intelligence Administrator
    MSBI Administration Blog

  • At my place of work we use a similar process of clicking on things to see if they are working. We have a process to do a quarterly "test" restore of the critical databases. We do not have an automated system in place, just a manual restore and testing of the databases.
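
    As a rough illustration only, a manual test restore along those lines might look like the statements below; the database name, backup path, and logical file names are placeholders, not this shop's actual setup.

        -- Restore the latest full backup to a side-by-side test copy,
        -- then run a basic integrity check before signing off on it.
        RESTORE DATABASE SalesDB_TestRestore
        FROM DISK = N'D:\Backups\SalesDB_Full.bak'
        WITH MOVE N'SalesDB_Data' TO N'D:\TestRestore\SalesDB_TestRestore.mdf',
             MOVE N'SalesDB_Log'  TO N'D:\TestRestore\SalesDB_TestRestore.ldf',
             RECOVERY, STATS = 10;

        DBCC CHECKDB (SalesDB_TestRestore) WITH NO_INFOMSGS;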

  • Our tests are all performed manually, although following written test procedures.  Unit testing takes place in the development environment, or is supposed to, followed by QA testing in a test environment.  Once passed there, any in-house developed systems are tested against a Stage environment then promoted to production, if they succeed.  Third-party apps are pushed to production following a successful result in the test environment.

    Oh, then there is the documentation to prove that this has been done.  Thanks, SOX...

  • "Testing in Production" here, too.  We're always told "it doesn't matter how ugly it is.  'Good Enough' is good enough."

