The Standard Limitation

  • I agree that the SQL Server license model is badly designed, and I would much prefer a model based around capacity rather than features. The 'virtualisation tax' of needing Software Assurance just to run SQL Server in the cloud is almost insulting.

    SQL 2014 has some great new features, such as updatable columnstore indexes and in-memory tables. But to fully exploit these, Enterprise Edition is needed.
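
    For example, the updatable clustered columnstore index is Enterprise-only in SQL 2014. A minimal sketch (dbo.FactSales is a made-up table):

        CREATE TABLE dbo.FactSales
        (
            SaleID   BIGINT        NOT NULL,
            SaleDate DATE          NOT NULL,
            Amount   DECIMAL(18,2) NOT NULL
        );

        -- Fails on Standard Edition in SQL Server 2014
        CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales ON dbo.FactSales;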

    Compare SQL 2014 to AWS Redshift. Redshift already has much the same BI feature set as SQL 2014, but pricing starts at about $1,000 per year (albeit for an environment almost too small to do useful work) and scales up roughly linearly to 32 cores and 128GB of memory.

    For a business with only about 3TB of data, SQL Enterprise Edition is no longer competitive with cloud offerings. We will no doubt continue to use SQL Server for a number of years, but our license needs peaked late in 2012 and will now go down year on year.

    Moving to a different DBMS is not a trivial matter, which is why SQL will stay part of our infrastructure, but whatever now gets developed on a different DBMS is unlikely ever to be ported to SQL Server.

    By the time Microsoft feels the pinch, it will be because businesses have already left SQL Server behind. In some ways this is sad, but in others it means opportunities to learn new things and have a career for as long as it is needed.

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • Look at Amazon instances. High-Memory Quadruple Extra Large DB = 68GB RAM!

    It really depends on what your use case is.

    Heavy reads and a large database? You are going to need RAM for your buffer pool.

    Mainly writes and serving up reference data? Memory isn't such an issue.

    We use Enterprise features (compression, partitioning, Resource Governor etc) but with very few exceptions our instances come in well below the 64GB RAM limit. Part of that is because most of the estate is virtualised.
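
    If you want to see how much of the buffer pool each database is actually holding, a quick sketch against sys.dm_os_buffer_descriptors does the job (pages are 8KB, hence the arithmetic):

        SELECT DB_NAME(database_id) AS database_name,
               COUNT(*) * 8 / 1024  AS buffer_mb
        FROM sys.dm_os_buffer_descriptors
        GROUP BY database_id
        ORDER BY buffer_mb DESC;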

  • (Don't panic - tongue is firmly in cheek here)

    Wouldn't it be good if they created a licensing structure that forced db designers to normalise their OLTP databases properly? I'm heartily sick of people going on about how database XYZ must be important because it occupies 150GB on the disk, and we need all this RAM, etc etc. Then you dig into it, and it's a mess of unnecessary clustered PK guids, ints that should be tinyints, nvarchars that should be varchars, redundant columns, strings that should be lookup values, and on and on it goes. The smaller the db, the cheaper the license, and the world will slowly become a better place. Because there are cases where even really well-normalised databases have billions of very compact rows, you'd scale the cost at a decelerating rate. So you could do something like Size = 0.005 * (Cost^2), i.e. Cost = SQRT(Size / 0.005), so the cost grows with the square root of the size.
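
    In T-SQL terms, the joke tariff works out like this (150GB used purely as an example size):

        -- Size = 0.005 * Cost^2, so Cost = SQRT(Size / 0.005)
        DECLARE @SizeGB float = 150;
        SELECT SQRT(@SizeGB / 0.005) AS LicenceCost;  -- ~173 for a 150GB database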

    You know you want it.

    ...One of the symptoms of an approaching nervous breakdown is the belief that one's work is terribly important... Bertrand Russell

  • GPO (8/7/2013)


    (Don't panic - tongue is firmly in cheek here)

    Wouldn't it be good if they created a licensing structure that forced db designers to normalise their OLTP databases properly? ...

    You know you want it.

    It's called cloud databases. They even push you strongly in the direction of properly designed indexing, because you pay for each IO: unnecessary scans of badly designed tables using SELECT * cost you money.
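
    A quick way to see what you would be paying for (dbo.Orders is a hypothetical table; logical reads stand in for billed IO):

        SET STATISTICS IO ON;

        -- Narrow, indexed read: a handful of logical reads
        SELECT OrderID, OrderDate
        FROM dbo.Orders
        WHERE CustomerID = 42;

        -- SELECT * with no useful index: a full scan, and every page read costs you
        SELECT *
        FROM dbo.Orders
        WHERE ShipCity = 'London';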

    I'm a DBA.
    I'm not paid to solve problems. I'm paid to prevent them.

  • GPO (8/7/2013)


    ...

    You know you want it.

    LOL, that's funny; hopefully everyone gets the joke and nobody seriously considers it.

    On one hand I'd like this; on the other, there are good reasons I break normalization. What I'd rather have is better testing that is hard and cheap, and that makes people use the real tools in the real world to prove some level of knowledge.

  • DB software licensing is a strange thing. Imagine if car companies ran their businesses like software companies, charging you extra if you added a bunch of performance parts after the initial purchase. I guess when you only purchase the "permission to use" something, you are bound by the will of the entity that technically owns it. There should be an option to "buy" software rather than just license it. That way, you could pay a lot up front but not have to worry about cost when you upgrade hardware.

  • GPO (8/7/2013)


    Wouldn't it be good if they created a licensing structure that forced db designers to normalise their OLTP databases properly? ... it's a mess of unnecessary clustered PK guids, ints that should be tinyints, nvarchars that should be varchars, redundant columns, strings that should be lookup values... The smaller the db the cheaper the license and the world will slowly become a better place....

    You bring up some good points, and others I'm a bit confused about. Good database design would drive people to use the proper datatypes where appropriate. What I think a number of people forget, though, is that normalization is analysis, not design. As Steve mentioned, there are cases where specific denormalizations improve performance and need less RAM and CPU, e.g. keeping an item's current status on the item's own record instead of having to search through the item's history to find it.
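
    A sketch of that trade-off (hypothetical dbo.Item and dbo.ItemHistory tables):

        -- Normalised: the current status has to be dug out of the history
        SELECT TOP (1) Status
        FROM dbo.ItemHistory
        WHERE ItemID = 42
        ORDER BY ChangedAt DESC;

        -- Denormalised: CurrentStatus duplicated onto the item row, a cheap single-row read
        SELECT CurrentStatus
        FROM dbo.Item
        WHERE ItemID = 42;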

    As for licensing itself, Andrew's comparison of what you describe to cloud data services makes me think we're talking apples and oranges here. If businesses are paying for the extra CPU and RAM hardware for in-house servers, and also paying extra for the software to use that hardware, it's a double hit to their budget for something the software company did nothing extra to earn. SQL Server is coded the same for a 4-core system with 4GB of RAM as for a 16-core system with 64GB. In the cloud you're really paying only once, for the "service", so pricing on size makes more sense in that model.

    What I'd like to see is a more à-la-carte, priced-by-feature approach, so that every SMB doesn't essentially have to pay the Enterprise Edition penalty for just a handful of features they will use that aren't in Standard Edition. I highly doubt MS will ever go that route, since they like big software bundles, even though a feature-based approach would probably help them better determine how people are using their software and which parts of the system are worth putting more research and development into.

  • Andrew-H (8/7/2013)


    DB Software licensing is a strange thing. Imagine if car companies ran business like software companies, charging you extra if you added a bunch of performance parts after the initial purchase...

    :hehe: yeah, or charged you more for the car depending on how many miles you anticipated driving in the next 3 years

  • I support a 2TB database that is the backend for a key application in our business. As it is vendor designed, I have little say in the structure. The application pumps in about 3 million data values every hour, 24 hours a day. I'd love to use all 96GB I have on the server, but I am living with the 64GB limit in SQL Standard simply because of cost. The SQL Standard license was $3,500; the Enterprise license was $70,000! And we are a "Microsoft Partner" so we get "special pricing". For some reason, I don't feel very special.
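
    For anyone wanting to check where their own instance stands, a quick sketch (note: the Standard Edition buffer pool cap applies regardless of the max server memory setting):

        -- Which edition is this, and what memory cap is configured?
        SELECT SERVERPROPERTY('Edition') AS Edition,
               SERVERPROPERTY('ProductVersion') AS Version;

        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'max server memory (MB)';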

  • Dick Cooman (8/7/2013)


    I support a 2TB database that is the backend for a key application in our business. ... And we are a "Microsoft Partner" so we get "special pricing". For some reason, I don't feel very special.

    "special" is defined in how you pronounce it :w00t:

  • ...and pay more as we need to add hardware.

    Why not bring a feature of the mobile world into server software: in-app purchases!

    Then Microsoft could sell just one edition of SQL Server, and when you install it you'd have the option to buy or rent additional capability, whether that's support for more RAM or CPU, or a software feature of the product.

  • Perhaps the standard / enterprise licensing model is more about ease of enforcement than anything else. A lot easier to tie size limits to a particular edition than to individual site requirements?

  • Tony Parfitt-465405 (8/7/2013)


    Perhaps the standard / enterprise licensing model is more about ease of enforcement than anything else. A lot easier to tie size limits to a particular edition than to individual site requirements?

    I think that has a lot to do with it.

    It might also be a real problem if a database automatically stopped working when it hit a size limit... that could open MS up to liability claims.

    cloudydatablog.net

  • :hehe: yeah, or charged you more for the car depending on how many miles you anticipated driving in the next 3 years

    That's outsourced to the VAR in the form of servicing costs... which are of course random and seem to bear no resemblance to the quality or quantity of the work!
