SQLServerCentral is supported by Red Gate Software Ltd.
 

Can SQL Server do ‘Big’?
Posted Thursday, September 19, 2013 9:18 PM


Mr or Mrs. 500


Group: General Forum Members
Last Login: Yesterday @ 11:26 AM
Points: 561, Visits: 2,415
Comments posted to this topic are about the item Can SQL Server do ‘Big’?


Best wishes,

Phil Factor
Simple Talk
Post #1496687
Posted Friday, September 20, 2013 2:17 AM
SSC Rookie


Group: General Forum Members
Last Login: Monday, April 07, 2014 3:58 AM
Points: 27, Visits: 363
I have seen one of those limits breached a number of times. It's the number of parameters (2,100), which applies to any query.

Amusingly I've had developers complain that this is not high enough and that there must be something wrong with SQL as it won't let them do what they want. It's not that big or complex a DB either (1TB or so, 700 tables).

The queries themselves don't come via a stored proc but via NHibernate. What you typically see is a query being passed in as follows:

SELECT blah FROM dbo.Table WHERE MyParam IN (@P0, @P1...@P2100)

This all depends on how many params get passed in, so if I have 2,099 I get

SELECT blah FROM dbo.Table WHERE MyParam IN (@P0, @P1...@P2099)

and the hash of the query changes, so we compile a new execution plan and so on.

Mix this in with another approach, where the size of the values being passed in is explicitly stated, and you get another variation for each parameter length. If I'm passing in a surname with 7 characters I get one execution plan, and if I pass in a surname with 6 characters I get another. You can get both together for extra fun!
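For anyone hitting this, the usual workaround is to replace the ever-growing IN (@P0 ... @Pn) list with a single table-valued parameter, so the query text (and therefore the plan-cache hash) stays the same no matter how many values are supplied. A minimal sketch, assuming SQL Server 2008+ and a hypothetical dbo.SomeTable:

```sql
-- Hypothetical sketch: one TVP instead of up to 2,100 discrete parameters.
CREATE TYPE dbo.IntList AS TABLE (Value int PRIMARY KEY);
GO
CREATE PROCEDURE dbo.GetRows
    @Values dbo.IntList READONLY
AS
BEGIN
    SET NOCOUNT ON;
    SELECT t.blah
    FROM dbo.SomeTable AS t
    JOIN @Values AS v ON v.Value = t.MyParam;  -- one cached plan, any list size
END;
```

NHibernate can be configured to pass a structured parameter this way, though it takes a custom user type rather than working out of the box.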
Post #1496746
Posted Friday, September 20, 2013 6:17 AM
Mr or Mrs. 500


Group: General Forum Members
Last Login: Friday, April 04, 2014 8:42 AM
Points: 598, Visits: 1,504
I've commonly talked to people at the NTSSUG who have very large SQL Server environments (mainly medical and financial), but I personally haven't had a client that requires something of that scale.

Where I encounter colleagues with insanely huge data requirements is in the physics, weather, astronomy, and biology realms. They truly have "Big Data" issues. Most of their tools are not well known in the Microsoft world (outside of folks like Buck Woody), but I believe they could really benefit anyone with an open mind who loves data. Some really cool software has been developed to deal with their issues, and I'm incorporating it into the "normal" business data world.
Post #1496837
Posted Friday, September 20, 2013 6:58 AM
Old Hand


Group: General Forum Members
Last Login: Friday, April 11, 2014 6:56 AM
Points: 372, Visits: 578
2,100 parameters for a stored procedure?


I can't imagine writing a stored procedure that takes in 2,100 parameters. I haven't even written a C# method that takes more than 25 parameters. 2,100 parameters would make my head spin.
Post #1496861
Posted Friday, September 20, 2013 7:02 AM


Ten Centuries


Group: General Forum Members
Last Login: 2 days ago @ 6:03 AM
Points: 1,240, Visits: 1,384
Nothing in my world comes anywhere close to "BIG" so far...
Post #1496866
Posted Friday, September 20, 2013 7:07 AM
Grasshopper


Group: General Forum Members
Last Login: Thursday, February 27, 2014 12:11 PM
Points: 19, Visits: 80
When I worked for a pharmaceutical giant handling the Clinical Trials database, it was close to that number, but the company used Oracle, so it doesn't really apply.
Post #1496868
Posted Friday, September 20, 2013 7:34 AM
Ten Centuries


Group: General Forum Members
Last Login: Today @ 7:21 AM
Points: 1,381, Visits: 2,001
Only a few tens of gigs, but we hit the foreign key limit, which gave us some headaches.
Also, the object-name limit of 128 characters (sysname) is far too low. With several tens of thousands of objects, naming objects by their functionality is mandatory if we're to find what we need rapidly.
Post #1496887
Posted Friday, September 20, 2013 7:46 AM


Forum Newbie


Group: General Forum Members
Last Login: Monday, February 03, 2014 7:35 AM
Points: 8, Visits: 133
Well, I can't say that I've ever had the need for 2,100 parameters in a stored procedure, but I have built and worked on some fairly large systems. One in particular, in a large manufacturing corporation, encompasses about 30 databases hosting around 12,000+ objects and spanning two different servers. Believe it or not, most of it is still running on SQL 2000. We talked about converting many times, but the feeling of the IT department is "if it's not broke, don't fix it". The system is quite complex and has remote database queries that are used to ferry data between the SQL 2000 system and the SQL 2005 system. In SQL 2005, we have a very large data warehouse that is used to generate Cognos reports for data that is aggregated, pivoted, diced, sliced, massaged, and re-purposed in tables for quick access. Some of the tables contain 2 million plus records, so speed is a factor for the management relying on the reports.

I thought our 20 GB reporting database was big. But then again, when I think of "big data", I'm thinking of Amazon or Facebook with their dozens if not hundreds of replicated servers running all over the globe with the speed requirements of a teenager with a smartphone. I'm always a bit mystified by how they accomplish these seemingly lightning fast data retrievals against such a behemoth system.

Hats off to the analysts that have to keep all that straight. Documentation will only get you so far. The rest relies on specialized skills and experience.


Jerry Boutot, MCAD MCP
Owner, AppDataWorks, LLC
AppDataWorks, LLC
Jerry Boutot Music
Post #1496894
Posted Friday, September 20, 2013 7:46 AM
SSC Veteran


Group: General Forum Members
Last Login: Today @ 11:13 AM
Points: 257, Visits: 901
I can appreciate domain-specific big data requirements in the sciences or in aggregating streams of sensor data. However, I wonder if there exists a properly normalized schema that needs 253 foreign key references on a table. If you know how that happens, please do share - I'm sure it would make an interesting article (*nudge*)

ryan.offord (9/20/2013)
I have seen one of those limits breached a number of times. It's the number of parameters (2,100), which applies to any query.

Amusingly I've had developers complain that this is not high enough and that there must be something wrong with SQL as it won't let them do what they want. It's not that big or complex a DB either (1TB or so, 700 tables).

The queries themselves don't come via a stored proc but via NHibernate. What you typically see is a query being passed in as follows:

SELECT blah FROM dbo.Table WHERE MyParam IN (@P0, @P1...@P2100)

This all depends on how many params get passed in, so if I have 2,099 I get

SELECT blah FROM dbo.Table WHERE MyParam IN (@P0, @P1...@P2099)

and the hash of the query changes, so we compile a new execution plan and so on.

Mix this in with another approach, where the size of the values being passed in is explicitly stated, and you get another variation for each parameter length. If I'm passing in a surname with 7 characters I get one execution plan, and if I pass in a surname with 6 characters I get another. You can get both together for extra fun!


Of course, that kind of automated error requires a framework. I haven't heard from developers who love NHibernate, but I do hear from DBAs who hate it. It seems to actively abuse the database.

No doubt there are many other applications that "worked" as prototypes and then scaled up to production, where they "sorta worked" as long as there was enough surplus resource on the server to accommodate the inefficient design. Unfortunately, the cost to re-engineer often exceeds the cost of more server resources, so the poorly-written applications end up with dedicated servers, while the done-right-the-first-time applications lose deadlock contests (for example) to the apps that haven't been isolated yet. :)
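If you want to see what those one-off ad-hoc statements are costing your plan cache, a quick sanity check (a sketch; it assumes VIEW SERVER STATE permission) is to count the single-use ad-hoc plans:

```sql
-- Sketch: measure plan-cache bloat from single-use ad-hoc queries.
SELECT COUNT(*) AS single_use_plans,
       SUM(CAST(size_in_bytes AS bigint)) / 1048576 AS approx_mb
FROM sys.dm_exec_cached_plans
WHERE objtype = 'Adhoc'
  AND usecounts = 1;
```

A large count here is the symptom the earlier post describes: every distinct IN-list length (or string-parameter length) hashing to its own plan.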
Post #1496895
Posted Friday, September 20, 2013 9:09 AM


SSC Eights!


Group: General Forum Members
Last Login: Yesterday @ 2:51 PM
Points: 826, Visits: 1,220
My largest database isn't big based on size, but I'd count it big based on the number of objects. I'm pushing 120,000 tables, with a minimum of 50,000 more in the near future. Before anyone criticizes the design, it's a 3rd-party product. I only support it; I didn't create it.

Tim
Post #1496925