May 5, 2018 at 1:25 pm
Comments posted to this topic are about the item Pride in Azure SQL Database
May 5, 2018 at 4:13 pm
While I agree that this all sounds absolutely wonderful, I wonder how much pride there will be when they do an automatic upgrade that has the same effect as 2014 SP1 had on a lot of people or like the "new and improved" cardinality estimator had on our core code. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
May 6, 2018 at 12:38 am
Jeff Moden - Saturday, May 5, 2018 4:13 PM
While I agree that this all sounds absolutely wonderful, I wonder how much pride there will be when they do an automatic upgrade that has the same effect as 2014 SP1 had on a lot of people or like the "new and improved" cardinality estimator had on our core code. 😉
IIRC, the demise of the DBA has been prognosticated going all the way back to SQL Server 2000 and the Database Tuning Advisor...
It seems like every time MS releases some new groundbreaking technology, I get the impression that they've lost the plot and all the new innovations are being driven by the latest IT/marketing buzzwords without any consideration of what their current customer base is actually asking for.
How many instances are using any of the following?
Elastic Pools
Stretch Database
In-Memory OLTP
Graph Databases
R language functionality
...the list can go on.
I'm not saying this stuff isn't cool or interesting, but I'd be shocked if more than 2% of MS's customer base is using any of it. So, a) why are we being forced to pay for it, and b) why are these things taking priority over basic functionality?
To this day, I'm of the opinion that 2012 was the single greatest new release... for no other reason than the expansion of the set of window functions and the introduction of window frames.
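For anyone who hasn't played with frames yet, here's a minimal sketch (the dbo.Sales table and its columns are just placeholders made up for illustration):

SELECT
    OrderDate,
    Amount,
    -- The ROWS frame (new in 2012) is what turns SUM() OVER into a true running total
    SUM(Amount) OVER (
        ORDER BY OrderDate
        ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
    ) AS RunningTotal
FROM dbo.Sales
ORDER BY OrderDate;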
Also, rather than introducing Azure Machine Learning as a new buzzword/feature, why not actually apply machine learning to query tuning and/or plan optimization?
Why are we still, to this day, living with "good enough" execution plans? It seems like, if just a small portion of the effort being directed at features no one is asking for were instead directed at actually improving the core product, everyone would be happier.
Before we let MS off the hook by saying it's impossible to test millions of possible plan combinations quickly, how about we say that we let them have it for ad-hoc queries but demand better for stored procedures?
Also, have a look at this... GTC 2018 Keynote with NVIDIA CEO Jensen Huang
Given the processing power being delivered by these new GPUs, it should not only be possible to test millions (if not billions) of possible plan combinations, it should also be possible to offer up reverse-engineered, rewritten (but functionally equivalent) T-SQL scripts.
May 7, 2018 at 8:45 am
I'm not convinced that execution plan optimization can be significantly improved by throwing more CPU power at it. Sub-optimal plans are typically the result of stale or missing statistics and poor cardinality estimates.
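As a rough illustration of the point (assuming SQL Server 2008 R2 SP2 or later, where sys.dm_db_stats_properties is available), checking for stale statistics is usually a better first stop than hoping for more optimizer horsepower:

SELECT
    OBJECT_NAME(s.object_id) AS table_name,
    s.name                   AS stats_name,
    sp.last_updated,
    sp.rows,
    sp.modification_counter  -- rows modified since the statistics were last updated
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE OBJECTPROPERTY(s.object_id, 'IsMSShipped') = 0
ORDER BY sp.modification_counter DESC;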
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
May 7, 2018 at 9:02 am
Jeff Moden - Saturday, May 5, 2018 4:13 PM
While I agree that this all sounds absolutely wonderful, I wonder how much pride there will be when they do an automatic upgrade that has the same effect as 2014 SP1 had on a lot of people or like the "new and improved" cardinality estimator had on our core code. 😉
I think you're overestimating how many people had problems. Some did, and I know you did, but lots of people didn't notice it or have significant issues. They also provided trace flags to ensure backwards compatibility with the old cardinality estimator.
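For anyone who does run into it, a quick sketch of the documented escape hatches (the table name below is just a placeholder): trace flag 9481 per query, or the database-scoped option on 2016 and later.

-- Per-query fallback to the legacy (pre-2014) cardinality estimator
SELECT *
FROM dbo.SomeTable  -- placeholder table
OPTION (QUERYTRACEON 9481);

-- Database-wide fallback on SQL Server 2016+ / Azure SQL Database
ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = ON;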
May 7, 2018 at 9:10 am
Jason A. Long - Sunday, May 6, 2018 12:38 AM
Jeff Moden - Saturday, May 5, 2018 4:13 PM
While I agree that this all sounds absolutely wonderful, I wonder how much pride there will be when they do an automatic upgrade that has the same effect as 2014 SP1 had on a lot of people or like the "new and improved" cardinality estimator had on our core code. 😉
IIRC, the demise of the DBA has been prognosticated going all the way back to SQL Server 2000 and the Database Tuning Advisor...
...
I'm not saying this stuff isn't cool or interesting, but I'd be shocked if more than 2% of MS's customer base is using any of it. So, a) why are we being forced to pay for it, and b) why are these things taking priority over basic functionality?
To this day, I'm of the opinion that 2012 was the single greatest new release... for no other reason than the expansion of the set of window functions and the introduction of window frames.
Also, rather than introducing Azure Machine Learning as a new buzzword/feature, why not actually apply machine learning to query tuning and/or plan optimization?
Why are we still, to this day, living with "good enough" execution plans? It seems like, if just a small portion of the effort being directed at features no one is asking for were instead directed at actually improving the core product, everyone would be happier.
Before we let MS off the hook by saying it's impossible to test millions of possible plan combinations quickly, how about we say that we let them have it for ad-hoc queries but demand better for stored procedures?
Machine learning is being used for tuning. It's being tuned and worked on in Azure. The goal appears to be to get this on premises, though I have no idea when.
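The piece of this you can already reach from T-SQL in Azure SQL Database is automatic tuning. As a minimal sketch (the syntax below is the documented form):

-- Let the service force the last known good plan when it detects a regression
ALTER DATABASE CURRENT SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);

-- Review what the tuning engine has recommended or applied
SELECT type, reason, score, state
FROM sys.dm_db_tuning_recommendations;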
Not to defend MS, but they do introduce features that help us. You mention window functions; what about AGs? Most of what you list are developer features that expand the boundaries of the core engine, and they do get more people to use the product. This is a business, and part of the goal is to get more sales. That's part of the deal. Like it or not, small enhancements to something like the query optimizer don't sell. They don't make a big difference to most people. Heck, most people don't even know what their compilation time is for plans, and they think it's trivial. It can be, but it can also be longer than execution time. There is little benefit to MS in improving some small things that niggle at you or me.
The other thing to keep in mind is that it takes time to adopt certain features. While I wasn't sure In-Memory OLTP was great, now that it's been around for a few versions, more and more people are using it. I do keep seeing increased adoption, though there are still constraints on those features, so there are limits. Columnstore has been widely adopted.
May 9, 2018 at 2:33 pm
Steve Jones - SSC Editor - Monday, May 7, 2018 9:02 AM
Jeff Moden - Saturday, May 5, 2018 4:13 PM
While I agree that this all sounds absolutely wonderful, I wonder how much pride there will be when they do an automatic upgrade that has the same effect as 2014 SP1 had on a lot of people or like the "new and improved" cardinality estimator had on our core code. 😉
I think you're overestimating how many people had problems. Some did, and I know you did, but lots of people didn't notice it or have significant issues. They also provided trace flags to ensure backwards compatibility with the old cardinality estimator.
My point is, if they had not provided a backwards-compatible trace flag, we'd have been in deep kimchi. And, if we had SSIS doing anything worthwhile, 2014 SP1 stood a pretty good chance of killing the instance, which is why they had to recall and reissue SP1. Too late if your instance was murdered by the bad change.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 9, 2018 at 5:06 pm
I'm still unsure of your complaint. If they hadn't done something that allowed you to work around a change, you'd be upset? It sounds like you want to complain about the change, even though they anticipated potential issues and included the trace flag.
As for SP1, that's a problem for sure. I think it's embarrassing and unfair for them to issue patches with fundamental problems. Clearly there should be a major root cause analysis of how/why that happens, with a goal of reducing the chances of it in the future. I commend them for reacting and responding quickly, which is more than some vendors do.
May 9, 2018 at 6:47 pm
Steve Jones - SSC Editor - Wednesday, May 9, 2018 5:06 PM
I'm still unsure of your complaint. If they hadn't done something that allowed you to work around a change, you'd be upset? It sounds like you want to complain about the change, even though they anticipated potential issues and included the trace flag.
As for SP1, that's a problem for sure. I think it's embarrassing and unfair for them to issue patches with fundamental problems. Clearly there should be a major root cause analysis of how/why that happens, with a goal of reducing the chances of it in the future. I commend them for reacting and responding quickly, which is more than some vendors do.
No... I'm tickled that they have the trace flag to avoid using the "improved" cardinality estimator. Thank goodness they did.
Just like the point about SP1, though, you don't really have a choice as to when a change will be installed. Here it comes, ready or not. Thanks to their EULA, poor you if it kills something of yours. There's no recourse for shoddy workmanship in software, especially with MS, even if it's forced upon you.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 9, 2018 at 7:51 pm
Here we go again... another possible bum SP for 2016.
https://www.sqlservercentral.com/Forums/1935421/Service-Pack-2-for-SQL-Server-2016
The OP wouldn't have a choice if they were on Azure, would they?
--Jeff Moden
Change is inevitable... Change for the better is not.
May 9, 2018 at 7:53 pm
As a bit of a sidebar, leave it to this great community to come up with a fix. Hope MS is listening.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 10, 2018 at 9:05 am
They would be able to delay updates in Azure... though not forever.
May 10, 2018 at 3:27 pm
Steve Jones - SSC Editor - Monday, May 7, 2018 9:10 AM
Jason A. Long - Sunday, May 6, 2018 12:38 AM
Jeff Moden - Saturday, May 5, 2018 4:13 PM
While I agree that this all sounds absolutely wonderful, I wonder how much pride there will be when they do an automatic upgrade that has the same effect as 2014 SP1 had on a lot of people or like the "new and improved" cardinality estimator had on our core code. 😉
IIRC, the demise of the DBA has been prognosticated going all the way back to SQL Server 2000 and the Database Tuning Advisor...
...
I'm not saying this stuff isn't cool or interesting, but I'd be shocked if more than 2% of MS's customer base is using any of it. So, a) why are we being forced to pay for it, and b) why are these things taking priority over basic functionality?
To this day, I'm of the opinion that 2012 was the single greatest new release... for no other reason than the expansion of the set of window functions and the introduction of window frames.
Also, rather than introducing Azure Machine Learning as a new buzzword/feature, why not actually apply machine learning to query tuning and/or plan optimization?
Why are we still, to this day, living with "good enough" execution plans? It seems like, if just a small portion of the effort being directed at features no one is asking for were instead directed at actually improving the core product, everyone would be happier.
Before we let MS off the hook by saying it's impossible to test millions of possible plan combinations quickly, how about we say that we let them have it for ad-hoc queries but demand better for stored procedures?
Machine learning is being used for tuning. It's being tuned and worked on in Azure. The goal appears to be to get this on premises, though I have no idea when.
Not to defend MS, but they do introduce features that help us. You mention window functions; what about AGs? Most of what you list are developer features that expand the boundaries of the core engine, and they do get more people to use the product. This is a business, and part of the goal is to get more sales. That's part of the deal. Like it or not, small enhancements to something like the query optimizer don't sell. They don't make a big difference to most people. Heck, most people don't even know what their compilation time is for plans, and they think it's trivial. It can be, but it can also be longer than execution time. There is little benefit to MS in improving some small things that niggle at you or me.
The other thing to keep in mind is that it takes time to adopt certain features. While I wasn't sure In-Memory OLTP was great, now that it's been around for a few versions, more and more people are using it. I do keep seeing increased adoption, though there are still constraints on those features, so there are limits. Columnstore has been widely adopted.
Also to note here, Azure Machine Learning is a product that allows you to build machine learning packages similar to how you would build SSIS packages, except it turns into a usable API that you can then share with the world. In this instance, instead of hard-coding the algorithms in C++, Python, or whatever directly into the application, you can simply call the API that your data scientists have full control over. This is essentially the same as using stored procedures, but for machine learning and within its own managed environment, so it's not directly embedded in your database or your code, for scalability purposes.