This is part 3 of a 3-part series of thoughts on certification and Microsoft technologies.
We'll never be able to completely and accurately measure a person's skills in technology. At least not in any cost- and time-effective way. Ultimately we want to come up with some way to weed through candidates and ensure they have a minimum aptitude for technology and some level of skill in the areas that are important to us. We want a way, with some level of confidence, to say that a person who has xx certification knows yy skills.
In the Microsoft world we can be sure that our platforms and technologies will change at least every two to three years, with major or minor revisions to every part of the product we use. We might see minor tool changes alongside fundamental feature enhancements, or the reverse. Even when there are major changes, though, the effective way we accomplish tasks doesn't change much. It evolves, and I think a core set of skills can be measured and, more importantly, scored.
How we do that, I'm not sure. As Brent Ozar said, however, the experiment must go on. We, as an industry and a community, should be finding ways to assess ourselves and drive our profession forward. I'd like to think we could build an open source framework that presents a situation and evaluates the result. It could be a framework like tSQLt, which lets us write tests that a scoring system can evaluate. By taking a script of some sort and comparing it to a "question", some automated measurement could determine whether the question was answered (or partially answered).
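To make the idea concrete, here's a minimal sketch of what such a scoring engine might look like. This is purely hypothetical, not part of tSQLt or any real framework: it assumes each "question" pairs a setup script with a reference query, runs the candidate's SQL against the same data (SQLite stands in for SQL Server here), and compares result sets, granting partial credit for overlapping rows.

```python
import sqlite3

def score_answer(setup_sql, reference_sql, candidate_sql):
    """Hypothetical scorer: 1.0 for a matching result set, partial
    credit for overlap, 0.0 for a script that fails to run."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(setup_sql)
    expected = set(conn.execute(reference_sql).fetchall())
    try:
        actual = set(conn.execute(candidate_sql).fetchall())
    except sqlite3.Error:
        return 0.0  # a broken script scores zero
    if not expected:
        return 1.0 if not actual else 0.0
    # Partial credit: expected rows the candidate produced,
    # penalized for any spurious extra rows.
    hits = len(expected & actual)
    return hits / max(len(expected), len(actual))

# An illustrative "question" from a hypothetical question bank.
setup = """
CREATE TABLE orders (id INTEGER, amount REAL);
INSERT INTO orders VALUES (1, 10.0), (2, 25.0), (3, 40.0);
"""
reference = "SELECT id FROM orders WHERE amount > 20;"
candidate = "SELECT id FROM orders WHERE amount >= 25;"

print(score_answer(setup, reference, candidate))  # prints 1.0: same rows
```

A real framework would need much more (ordering rules, schema comparisons, timing, grading of non-query tasks like index design), but the core loop of "set up, run, compare, score" is mechanical enough for a community to build around.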
Our community could easily build a bank of hundreds, if not thousands, of questions. Want to evaluate someone? Download 50 questions, drop the person in a room for an hour, and see how much they get done. They might not finish, which would be a good test in and of itself. Run their answers through a scoring engine and get a report back. With tags, we could easily separate questions into a variety of packs that employers could use to test certain areas. Testing core skills, without too much worry about version-specific items, would allow questions to live for years. Heck, given the age of some SQL Server instances out there, I bet some companies still need SQL Server 2000 based tests.
Ultimately I don't think Microsoft will properly build and maintain a framework to evaluate candidates; they have too much incentive to cheat. They can fool plenty of employers with easy-to-pass paper diplomas and turn a profit on certifications that sound good but don't really test skills. The future of measurement in technology will be like it is in many other fields: independent bodies that certify a minimum level of skill for most individuals. It will consist of granular tests that measure skills in real situations, not question-and-answer trivia. Some people will slip through, some will cheat, but it will work well enough once it falls out of the hands of vendors.
Until that time, all you can do is prove your own skills: in person, through your publications, or with lots of good, valuable answers given to others.