• Well, I can't say that I've ever had the need for 2100 parameters in a stored procedure, but I have built and worked on some fairly large systems. One in particular, at a large manufacturing corporation, encompasses about 30 databases hosting 12,000+ objects and spanning two different servers. Believe it or not, most of it is still running on SQL 2000. We've talked about converting many times, but the feeling in the IT department is "if it's not broke, don't fix it". The system is quite complex and uses remote database queries to ferry data between the SQL 2000 system and the SQL 2005 system. On SQL 2005, we have a very large data warehouse used to generate Cognos reports from data that is aggregated, pivoted, diced, sliced, massaged, and re-purposed into tables for quick access. Some of those tables contain two million-plus records, so speed is a factor for the management relying on the reports.
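    For what it's worth, the cross-server ferrying described above is typically done in SQL Server with linked servers and four-part names. A minimal sketch of the pattern (the server, database, and table names here are hypothetical, not the actual system's):

    ```sql
    -- Assumes a linked server named [SQL2000SRV] has already been registered
    -- on the SQL 2005 instance (e.g. via sp_addlinkedserver).
    -- Four-part naming: server.database.schema.object
    INSERT INTO Warehouse.dbo.SalesStaging (OrderID, OrderDate, Amount)
    SELECT o.OrderID, o.OrderDate, o.Amount
    FROM [SQL2000SRV].Production.dbo.Orders AS o
    WHERE o.OrderDate >= DATEADD(day, -1, GETDATE());

    -- Alternatively, OPENQUERY sends the whole query text to the remote
    -- server for execution there, which can reduce data pulled across
    -- the wire when the remote side does the filtering:
    SELECT q.OrderID, q.OrderDate, q.Amount
    FROM OPENQUERY([SQL2000SRV],
        'SELECT OrderID, OrderDate, Amount FROM Production.dbo.Orders
         WHERE OrderDate >= DATEADD(day, -1, GETDATE())') AS q;
    ```

    The choice between four-part names and OPENQUERY mostly comes down to where you want the filtering to happen; with large remote tables, pushing the predicate to the remote server usually wins.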

    I thought our 20 GB reporting database was big. But then again, when I think of "big data", I'm thinking of Amazon or Facebook, with their dozens, if not hundreds, of replicated servers running all over the globe and the speed expectations of a teenager with a smartphone. I'm always a bit mystified by how they accomplish these seemingly lightning-fast data retrievals against such a behemoth system.

    Hats off to the analysts who have to keep all that straight. Documentation will only get you so far; the rest relies on specialized skills and experience.

    Jerry Boutot, MCAD MCP, MTA
    Jerry Boutot Official