SQLServerCentral is supported by Red Gate Software Ltd.
 

The Standard Limitation
Posted Tuesday, August 6, 2013 8:08 PM


SSC-Dedicated


Group: Administrators
Last Login: Today @ 6:53 PM
Points: 31,177, Visits: 15,623
Comments posted to this topic are about the item The Standard Limitation






Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1481629
Posted Wednesday, August 7, 2013 2:43 AM
SSCrazy


Group: General Forum Members
Last Login: Yesterday @ 4:30 AM
Points: 2,885, Visits: 3,253
I agree that the SQL Server license model is badly designed, and I would much prefer a model based around capacity rather than features. Also the 'virtualisation tax' of Software Assurance being needed to run SQL Server in the cloud is almost insulting.

SQL 2014 has some great new features, such as updatable columnstore indexes and in-memory tables, but to fully exploit these, Enterprise Edition is needed.

Compare SQL 2014 to AWS Redshift. This already has much the same BI feature set as SQL 2014, but pricing starts at $1000 per year (albeit for an environment almost too small to do useful work), and scales up roughly linearly to 32 cores and 128GB memory.

For a business with only about 3TB of data, SQL Enterprise Edition is no longer competitive compared to cloud offerings. We will no doubt continue to use SQL Server for a number of years, but our license needs peaked late in 2012 and will now go down year on year.

Moving to a different DBMS is not a trivial matter, which is why SQL will stay a part of our infrastructure, but what now gets developed for a different DBMS is unlikely to ever get ported to SQL Server.

By the time Microsoft feels the pinch, it will be because businesses have already left SQL Server behind. In some ways this is sad, but in others it means opportunities to learn new things and have a career for as long as it is needed.


Original author: SQL Server FineBuild 1-click install and best practice configuration of SQL Server 2014, 2012, 2008 R2, 2008 and 2005. 18 October 2014: now over 31,000 downloads.
Disclaimer: All information provided is a personal opinion that may not match reality.
Concept: "Pizza Apartheid" - the discrimination that separates those who earn enough in one day to buy a pizza if they want one, from those who can not.
Post #1481726
Posted Wednesday, August 7, 2013 2:46 AM
SSCrazy


Group: General Forum Members
Last Login: Yesterday @ 7:36 AM
Points: 2,907, Visits: 1,832
Look at Amazon instances. High-Memory Quadruple Extra Large DB = 68GB RAM!

It really depends on what your use case is.
Heavy reads and a large database? You are going to need RAM for your buffer pool.
Mainly writes and serving up reference data? Memory isn't such an issue.

We use Enterprise features (compression, partitioning, Resource Governor, etc.) but, with very few exceptions, our instances come in well below the 64GB RAM limit. Part of that is because most of the estate is virtualised.


LinkedIn Profile
Newbie on www.simple-talk.com
Post #1481731
Posted Wednesday, August 7, 2013 3:48 AM


SSC Eights!


Group: General Forum Members
Last Login: Saturday, October 11, 2014 8:18 PM
Points: 831, Visits: 1,588
(Don't panic - tongue is firmly in cheek here)
Wouldn't it be good if they created a licensing structure that forced db designers to normalise their OLTP databases properly? I'm heartily sick of people going on about how database XYZ must be important because it occupies 150GB on disk, and we need all this RAM, etc. Then you dig into it, and it's a mess of unnecessary clustered PK GUIDs, ints that should be tinyints, nvarchars that should be varchars, redundant columns, strings that should be lookup values, and on and on it goes. The smaller the db, the cheaper the license, and the world will slowly become a better place. Because there are cases where even really well normalised databases have billions of very compact rows, you'd scale the cost at a decelerating rate. So you could do something like Size = 0.005*(Cost^2).

You know you want it.
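For what it's worth, the joke curve can be inverted: if Size = 0.005*(Cost^2), then Cost = sqrt(Size/0.005), so the license cost rises with database size but at a decelerating rate. A throwaway sketch of the arithmetic (the constant is GPO's own, purely illustrative, not a real price list):

```python
import math

def license_cost(size_gb: float) -> float:
    """Invert the tongue-in-cheek curve Size = 0.005 * Cost^2,
    giving Cost = sqrt(Size / 0.005): bigger databases cost more,
    but each doubling only raises the price by ~41% (sqrt(2))."""
    return math.sqrt(size_gb / 0.005)

for size in (150, 300, 3000):
    print(f"{size} GB -> ${license_cost(size):,.0f}")
```

So the bloated 150GB database pays more than a lean one, while a genuinely huge, well-normalised estate isn't punished linearly.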




One of the symptoms of an approaching nervous breakdown is the belief that one's work is terribly important.
Bertrand Russell
Post #1481762
Posted Wednesday, August 7, 2013 4:21 AM


Old Hand


Group: General Forum Members
Last Login: Today @ 7:16 AM
Points: 304, Visits: 3,417
GPO (8/7/2013)
(Don't panic - tongue is firmly in cheek here)
Wouldn't it be good if they created a licensing structure that forced db designers to normalise their OLTP databases properly? I'm heartily sick of people going on about how database XYZ must be important because it occupies 150GB on disk, and we need all this RAM, etc. Then you dig into it, and it's a mess of unnecessary clustered PK GUIDs, ints that should be tinyints, nvarchars that should be varchars, redundant columns, strings that should be lookup values, and on and on it goes. The smaller the db, the cheaper the license, and the world will slowly become a better place. Because there are cases where even really well normalised databases have billions of very compact rows, you'd scale the cost at a decelerating rate. So you could do something like Size = 0.005*(Cost^2).

You know you want it.


It's called cloud databases. They even push you strongly in the direction of properly designed indexing, because you pay for each IO. Your unnecessary scans on badly designed tables using SELECT * cost you money.


I'm a DBA.
I'm not paid to solve problems. I'm paid to prevent them.
Post #1481774
Posted Wednesday, August 7, 2013 7:26 AM


SSC-Dedicated


Group: Administrators
Last Login: Today @ 6:53 PM
Points: 31,177, Visits: 15,623
GPO (8/7/2013)
...
You know you want it.


LOL, that's funny and hopefully someone gets the joke and doesn't consider this.

On one hand I'd like this; on the other, there are good reasons I break normalization. What I'd rather have is better testing that is hard, and cheap, and that makes people use the real tools in the real world to prove some level of knowledge.







Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1481846
Posted Wednesday, August 7, 2013 8:20 AM
Grasshopper


Group: General Forum Members
Last Login: Monday, September 29, 2014 9:05 AM
Points: 22, Visits: 155
DB software licensing is a strange thing. Imagine if car companies ran their business like software companies, charging you extra if you added a bunch of performance parts after the initial purchase. I guess when you only purchase the "permission to use" something, you are bound by the will of the entity that technically owns that thing (the software). There should be an option to buy software outright rather than just a license. That way, you could pay a bunch up front but not have to worry about cost when you upgrade hardware.
Post #1481896
Posted Wednesday, August 7, 2013 10:58 AM


SSCommitted


Group: General Forum Members
Last Login: Yesterday @ 9:38 AM
Points: 1,907, Visits: 2,056
GPO (8/7/2013)
Wouldn't it be good if they created a licensing structure that forced db designers to normalise their OLTP databases properly? I'm heartily sick of people going on about how database XYZ must be important because it occupies 150GB on disk, and we need all this RAM, etc. Then you dig into it, and it's a mess of unnecessary clustered PK GUIDs, ints that should be tinyints, nvarchars that should be varchars, redundant columns, strings that should be lookup values, and on and on it goes. The smaller the db, the cheaper the license, and the world will slowly become a better place....


You bring up some good points, and others which I'm a bit confused about. Good database design would drive people to use the proper datatypes where appropriate. What I think a number of people forget, though, is that normalization is analysis, not design. As Steve mentioned, there are some cases where specific denormalizations will improve performance and require less RAM and CPU, e.g. storing the current status of an item in the item's record itself instead of having to search through the item's history to find it.

As for licensing itself, Andrew's comparison of what you describe to cloud data services makes me think we're talking apples and oranges here. If businesses are paying for the extra CPU and RAM hardware for in-house servers, and also paying extra for the software to use that hardware, it's a double hit to their budget, and the software company did nothing extra to earn that money: SQL Server is coded the same for a 4-core system with 4GB of RAM as for a 16-core system with 64GB. In the cloud you're really paying only once for the "service", so it makes more sense in that model to price on size.

What I'd like to see is a more a-la-carte, priced-by-feature approach, so that every SMB doesn't essentially have to pay the Enterprise Edition penalty for just the handful of features they will use that aren't in Standard Edition. I highly doubt MS will ever go that route, since they like big software bundles, even though a feature-based approach would probably help them better determine how people are using their software and which parts of the system are worth putting more research and development into.
Post #1481976
Posted Wednesday, August 7, 2013 11:00 AM


SSCommitted


Group: General Forum Members
Last Login: Yesterday @ 9:38 AM
Points: 1,907, Visits: 2,056
Andrew-H (8/7/2013)
DB Software licensing is a strange thing. Imagine if car companies ran business like software companies, charging you extra if you added a bunch of performance parts after the initial purchase...


Yeah, or charged you more for the car depending on how many miles you anticipated driving in the next three years.
Post #1481977
Posted Wednesday, August 7, 2013 11:10 AM
Grasshopper


Group: General Forum Members
Last Login: Monday, February 17, 2014 11:53 AM
Points: 10, Visits: 40
I support a 2TB database that is the backend for a key application in our business. As it is vendor designed, I have little say in the structure. The application pumps in about 3 million data values every hour, 24 hours a day. I'd love to use all 96 GB I have on the server, but I am living with the 64 GB limit in SQL Standard simply because of cost. The SQL Standard license was $3,500, the Enterprise license was $70,000! And we are a "Microsoft Partner" so we get "special pricing". For some reason, I don't feel very special.
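Purely as back-of-the-envelope arithmetic on the figures quoted in this post (not an official pricing model), the numbers are striking:

```python
# All figures as stated in the post above.
values_per_hour = 3_000_000
values_per_second = values_per_hour / 3600      # sustained inserts/sec, 24x7

standard_license = 3_500    # USD, as quoted
enterprise_license = 70_000 # USD, as quoted
premium = enterprise_license / standard_license # price multiple for Enterprise

server_ram_gb = 96
standard_cap_gb = 64
idle_ram_gb = server_ram_gb - standard_cap_gb   # installed RAM Standard can't use

print(f"~{values_per_second:.0f} values/sec sustained")
print(f"Enterprise costs {premium:.0f}x Standard to unlock the remaining {idle_ram_gb}GB")
```

That is roughly 833 inserts per second around the clock, and a 20x license premium just to use 32GB of RAM that is already sitting in the box.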
Post #1481982