Moore's Law: Navigating the Pace of Technological Change

  • Comments posted to this topic are about the item Moore's Law: Navigating the Pace of Technological Change

  • Thanks for a thought-provoking article, Ryan. Good info on adoption of SQL Server. And my deepest condolences to Sir Moore's family. Great life lived.

    On Moore's Law, is it inaccurate to say it referred largely to semiconductor technology? On the adoption of technology these days, could it be because most out-of-the-box solutions come much more "fully formed" than before?

    Br. Kenneth Igiri
    https://kennethigiri.com
    All nations come to my light, all kings to the brightness of my rising

  • Thank you, Ryan, for a great article. I was unaware that Gordon Moore had recently passed away. WOW, what a legacy he left behind.

    I work in state government. In my experience, government agencies tend to lag far behind the private sector in upgrading. My personal observation is that any technology, be it hardware or software, is left to continue functioning until it can no longer turn on. In fact, just last week I had to investigate a Windows server for some old software, to see if we could get anything out of it. (We can't.) This server runs Windows Server 2003. That's right, it's literally 20 years old! And we were still powering it. We decided to turn it off, as we could no longer get any data out of the ancient software on it.

    However, on the other side, I am pleased to say that we are busy upgrading our SQL Server instances. There are a lot of them, but the plan is to move them all to SQL 2019, unless there's vendor software that won't run on SQL 2019. There are a few servers where we're waiting to see what the vendors say.

    Kindest Regards, Rod
    Connect with me on LinkedIn.

  • Thanks for the thoughts Kenneth!

    You're absolutely right. Moore's Law was specifically about the advancement of semiconductors. However, in my experience, the concept took on a life of its own around technology in general (maybe specifically correlated with the advent of the iPhone and modern cell phones), so that "everything" in technology was doubling in power every so many years. It's certainly anecdotal, but I couldn't help but bring it up after hearing about Gordon Moore. 🙂
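The doubling itself compounds quickly, which is part of why the idea spread so far beyond chips. As a rough sketch (the `transistors` helper is hypothetical, and the clean two-year doubling period is an idealization -- the real cadence has varied):

```python
def transistors(start_count: int, years: float, doubling_period: float = 2.0) -> int:
    """Project a transistor count forward, assuming a clean doubling
    every `doubling_period` years (an idealization of Moore's Law)."""
    return int(start_count * 2 ** (years / doubling_period))

# Intel's 4004 (1971) had roughly 2,300 transistors. Twenty years of
# two-year doublings is ten doublings, i.e. a 1024x increase:
print(transistors(2_300, 20))  # 2355200 -- the right order of magnitude
                               # for early-1990s CPUs
```

Ten doublings in twenty years is a thousandfold increase, which is why the "everything doubles" framing felt plausible far outside semiconductors.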

  • Rod, I appreciate your point that the vertical you work in will certainly influence how stable the servers have to be over time and how much tolerance there is for some level of experimentation. As shocking as it is to hear about a Windows 2003 server still running, I'm almost more impressed that you were able to turn it off working for a state government! 🙂

    Hope the SQL Server 2019 migration goes well!

  • I appreciate your blog post, both the information about Moore's passing and your perspective on why it appears that companies are keeping up with new SQL Server versions.

    I don't disagree completely with what you wrote, but I have a different perspective, at least in part. I'm a big believer in change that has a net benefit. I'm against change for the sake of change, or change that comes at a net cost. Yes, things are changing fast, but if those 'improvements' in SQL Server do not provide a net benefit for the business needs of the particular business in question, then upgrading is not warranted for that business -- except in the case of aging hardware (which is not the issue if everything is in the cloud, as you mentioned) or if MS forces the upgrade by no longer supporting the older version. It's those last two issues which drove my agency's last upgrade from 2008 R2 to 2019 a couple of years ago -- not a net benefit in terms of meeting our relatively modest business needs. I can't see how I could justify an upgrade to SQL Server 2022 any time in the next few years.

    So, a third reason so many companies may have SQL Server versions that are so recent is that they were "forced" to upgrade due to: a) needing to replace old hardware (so why not upgrade then) and/or b) needing to move away for security reasons from SQL Server versions which will no longer be supported by MS.

    I also suspect that there is a sampling bias in the data Brent Ozar shared. I would guess that companies which can afford Brent's services have more complex needs compared to companies that can't afford a top-notch consultant. Hence, I would expect Brent's company to be working with businesses that have more recent versions of SQL Server. Please don't get me wrong, I'm a huge fan of Brent and appreciate what he does for the SQL Server community. I'm just wondering how helpful that post is in telling us the distribution of SQL Server versions out there.

    There was a SQL Server Central blog post not too long ago where the author (sorry, I forgot his name) essentially made the case that if DBAs aren't using the latest and greatest features in SQL Server, then they must not actually understand those features or must not have proper imagination for the business needs or both.  I understand the point that was being made, and I'm sure it is true in some cases.

    However, I think the side of the equation that is being missed, in that previous blog post and in this one, is that a feature is only as beneficial as the business need it meets. I personally found many great improvements in SQL Server in versions up through SQL Server 2008 R2. With each of our previous upgrades, I would immediately take advantage of changes to meet our business needs. I've found some nice changes in SQL Server 2019, but the improvements in 2019 over 2008 R2 for *our* business needs are nowhere near the Moore's Law projection. I can't say that the upgrade was worth the cost. We just had to do it. (For my own career/personal benefit, I'm glad we upgraded.)

    To explain the point in car terms: Suppose the top speed limit anywhere in the country is, say, 80 mph (miles per hour), and the driver never intends to go above 90 mph, maybe 100 mph in an absolute emergency. If the driver already has a car in good working order that goes 150 mph, it is not really a benefit to pay for a new car just because the new car goes 200 mph. The need is already being met perfectly with the old car.

    So I guess the question is:  Are humanity's business needs increasing in scope and complexity as much as our technology solutions are improving?  I have no doubt that for some businesses, this is absolutely true.  I'm just wondering how many businesses really need to upgrade SQL Server all the time in order to take advantage of features which would meet unfulfilled business needs.

    I want to end by repeating that I liked and appreciated your post.  I don't think you are wrong.  I just think the situation is more complicated.  Your post got me to thinking and moved me to post a reply.  🙂

  • JJ,

    I'm SO glad that you did respond. I think you're completely correct that the issue is nuanced and there probably is a bias in this specific form of data. In my previous companies, my experience was similar to yours. SQL Server 2008 R2 was a very stable release for us and it served hundreds of client databases/servers well. For us, the biggest jump was when Query Store was added to SQL Server 2016, a game changer for our SaaS business and our ability to better solve some of the harder performance issues.

    Having been on the product support side of things, I do think part of the "forced" upgrade is simply an issue of priorities. 15 years ago when I had to rely on a CD to install SQL Server (or an ISO that we could build the image from), there just wasn't as much expectation for fast fixes and new features. Stability was so essential and expected. Now, the competition for proving value and keeping pace is always at odds with maintaining support for an older version... and I think customers get caught in the middle.

    To your major point, however, I agree that features and total software value haven't magically doubled every two years (give or take). Sometimes we just get carried along with the flow. 😉

    Thanks again! I really appreciate your thoughts and input!

    RE: "In my previous companies, my experience was similar to yours. SQL Server 2008 R2 was a very stable release for us and it served hundreds of client databases/servers well. For us, the biggest jump was when Query Store was added to SQL Server 2016, a game changer for our SaaS business and our ability to better solve some of the harder performance issues.

    Having been on the product support side of things, I do think part of the "forced" upgrade is simply an issue of priorities. 15 years ago when I had to rely on a CD to install SQL Server (or an ISO that we could build the image from), there just wasn't as much expectation for fast fixes and new features. Stability was so essential and expected. Now, the competition for proving value and keeping pace is always at odds with maintaining support for an older version... and I think customers get caught in the middle."

    This was a very interesting and helpful addition to the discussion! Thanks for taking the time to post. I appreciate how responsive you have been to others' posts. It has created a nice conversation.

  • It takes time for people to understand a new feature, what benefits it will bring and at what cost.  It is not always easy for a technical person and a business person to communicate effectively.

    I can remember the jump from SQL 6.5 to SQL 7 and SQL 2000. Overnight, performance jumped by orders of magnitude. The jump from SQL 2000 to SQL 2005 offered another major leap in performance.

    SQL2012 gave us column stores which, for analytics loads, gave another major leap forward.  If you didn't have a column store use case then an upgrade was a much tougher sell.

    One thing I have noticed is that cloud vendors are much quicker to upgrade their offerings.

    DevOps practices allow for rapid patching of software too.  Releases to production used to be major events and not every week.  Now there may be 20 or 30 releases per day!  That desensitises people to release phobia.

  • You overlooked part of Moore's Law. Maybe I'm weird (most people would say yes), but I think the halving of cost of computing is just as important as the doubling of transistors. Perhaps it was not germane to the discussion, but if the count of transistors doubles at double the cost (or more), it doesn't provide the same benefit.
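The two halves of the observation do fit together: if the transistor count doubles each period while the chip's price stays roughly flat, the cost per transistor halves each period. A toy model with illustrative numbers (not historical data), just to make the point concrete:

```python
def cost_per_transistor(chip_cost: float, transistor_count: int, periods: int) -> float:
    """Cost per transistor after `periods` doublings of the count,
    assuming the chip's price stays flat (the idealized case)."""
    return chip_cost / (transistor_count * 2 ** periods)

base = cost_per_transistor(100.0, 10_000, 0)   # $0.01 per transistor
later = cost_per_transistor(100.0, 10_000, 3)  # after three doublings
print(base / later)  # 8.0 -- three doublings cut the unit cost by 8x
```

If the chip's price doubled along with the count, the per-transistor cost would stay flat and the benefit described above would disappear, which is exactly the point being made.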

    Trying to figure out the world of SQL as marketing consultant for SQL Solutions Group https://sqlsolutionsgroup.com/

  • re: "One thing I have noticed is that cloud vendors are much quicker to upgrade their offerings. ... Now there may be 20 or 30 releases per day! That desensitises people to release phobia."

    I want to quibble with the word "phobia."  This is just a pet peeve of mine.

    A phobia is an irrational fear.   I would argue that from the user's perspective, there's nothing irrational about fearing updates.  Even change that is a net benefit can have a cost to users who have to learn something new.  I don't think this should be discounted.

    I'll use Microsoft's current model of releasing updates to, say, Office 365 all the time as an example. My agency rolls out these changes without any review or warning to staff. So one day Office works one way, and the next day it works another way. Just yesterday, at a Teams meeting with my fellow IT staff, we were just trying to have our meeting, but we paused to try to figure out one feature because, unexpectedly, "They [MS] have changed this." I heard sounds of frustration and even a quiet but passionate, "I hate this." One nice thing for users about big releases over little releases is that they have a stable product to work with until there is an upgrade. When the upgrade rolls around, there is a known time (no surprises in the middle of trying to get something done) when changes will be made, and a single learning curve for all the changes, followed by another stretch of time when the product is stable and known.

    Problems with constant upgrades go beyond the hit on user productivity from unexpected learning curves. Sometimes bugs are introduced with the "upgrades." Our agency experienced this just a few weeks ago, when Excel started behaving differently than it had for the last 20 years (and presumably prior). This new behavior broke some spreadsheet functionality that is a key component of one of our applications. Because we're on Excel 365, with 'upgrades' rolled out whenever MS feels like it, we had no warning that Excel was going to change and give us this bug. I.e., we couldn't plan to address the change at a time that was good for our agency and would minimize user pain.

    Prior to being on Office 365, we had the straight desktop versions of Office, and everything would work smoothly for years at a time. If a problem occurred, it was because of a developer's code. I don't know what kind of control agencies have when they use Azure, so maybe this is a non-issue when it comes to SQL Server. However, as of right now, I'm pretty happy having our own SQL Server installation, where we can fully control when we deal with upgrades.

    Note: I'm not really arguing that changes should be made to software only rarely. You might even call me hypocritical, since I do frequent releases of my own applications. I'm just arguing that we should recognize the real cons of the current cloud models of constantly rolling out changes without warning, and even without the users having a say. It's worth it for developers to keep this in mind, to see if we can make our processes even better for users. And it's worth thinking about when we use software, like SQL Server, ourselves. Would we want SQL Server to be upgraded (beyond security upgrades) constantly? Would having to guess whether a new problem was our own doing or the result of a change MS made be a good use of anyone's time?

    Thanks for letting me vent.  🙂
