Another Re-Release

  • Comments posted to this topic are about the item Another Re-Release

  • I would agree that putting out a new CU, rather than re-releasing it under the same CU number, would've been a better choice. It's far too easy to look and say "I've already got that," and move on.

    Admittedly, they did put a new KB number on the download, but how many people look at that?
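
    For what it's worth, the update that's actually installed is queryable, so a quick check beats squinting at KB numbers on a download page. A minimal T-SQL sketch (ProductUpdateLevel and ProductUpdateReference return NULL on builds that predate those properties):

        SELECT
            SERVERPROPERTY('ProductVersion')         AS ProductVersion,         -- e.g. 12.0.4427.24
            SERVERPROPERTY('ProductLevel')           AS ProductLevel,           -- RTM or SPn
            SERVERPROPERTY('ProductUpdateLevel')     AS ProductUpdateLevel,     -- e.g. CU6
            SERVERPROPERTY('ProductUpdateReference') AS ProductUpdateReference; -- KB number of the last update applied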

  • The whole "let's release this stuff NOW and worry about patches later" is stupid.

    Who *wants* to have to update so often? Certainly not DBAs. Not MS's customers. The whole idea of "fast releases" is deeply flawed for two reasons, both of which are deal killers.

    First, it encourages release of buggy products. "Oh, we'll be releasing another patch anyway, we'll just fix stuff in that". Not to MY server you won't!

    Second, who has time to bring down a production server (or all of them) to apply a patch *the manufacturer hasn't had time to thoroughly test*? What happens when this is a mission-critical server? Or a *life-sustaining* server?

    And to do it with an increasing tempo?

    No. Somebody needs to take a baseball bat to the heads of whoever thinks this is a good idea.

    Obviously somebody's been slipping something in IT's coffee again. :hehe: Probably the accountants! Or the sales weasels...

  • roger.plowman (6/2/2016)


    The whole "let's release this stuff NOW and worry about patches later" is stupid.

    First, that's not what's happening. The patches are tested, run through a(n ever increasing) set of tests. It's taken them years to get to the point where the level of testing for a CU is what it is for an SP. Internally.

    What is different is there isn't an external set of customers testing in different environments. I think we still need this, at least once a year for SPs.

    Who *wants* to have to update so often? Certainly not DBAs. Not MS's customers. The whole idea of "fast releases" is deeply flawed for two reasons, both of which are deal killers.

    First, it encourages release of buggy products. "Oh, we'll be releasing another patch anyway, we'll just fix stuff in that". Not to MY server you won't!

    Second, not really true. People hit by bugs do want these patches. I haven't recommended patching with CUs unless you are affected by a bug. In that case, there are MS customers that want these patches.

    I don't think this inherently encourages or discourages buggy releases. What it does is get written software, written code, which is inventory, off the shelves and into customers' hands. If you don't test, or don't have precautions, this can encourage buggy software. However, that's semi-flawed thinking: there will always be bugs.

    What quick releases allow is for you to patch things quickly, including patching a patch.

    Second, who has time to bring down a production server (or all of them) to apply a patch *the manufacturer hasn't had time to thoroughly test*? What happens when this is a mission-critical server? Or a *life-sustaining* server?

    And to do it with an increasing tempo?

    This is certainly the issue I brought up.

    However, Windows moved to monthly patches years ago. All these complaints and worries existed then. Now most people, and most organizations, including mission-critical servers, get patched monthly.

  • When the major release cycle is ~2 years, the license model changes, there's a recommendation to apply the ~bi-monthly CUs, and Patch Tuesday comes on top of it all, it seems that MS is driving us SMBs toward more stable and less expensive environments.

    Linux/Postgres/PHP/Apache are sure beginning to get a lot of attention in my shop.

  • Well, I certainly understand cost and licensing changes. I tend to agree there.

    However, Linux has patches coming out almost every month, same as Windows. MySQL has quarterly patches, so you're talking 3 months vs. 2 months. If there are issues, do you want to wait the extra month?

    Note, you don't have to apply these patches, for any OS/platform/software. However, you do want to apply some periodically so you don't fall too far behind. In that case, do you want them available more frequently or less? I'd argue more.

  • I attempt to install CUs in bunches... meaning I will go with the current CU minus one (a "-1" CU) for installs for about 6 months or so, then install the next -1 CU 6 months after that. With the number of SQL Servers we have, I don't have enough time to keep them all current, or even one behind, all of the time... let alone convince the apps folks they need to test DEV, TEST, and PQA every few months to keep current.

    I won't apply newer CUs on purchased software installs. There's no way to know if they will break the vendor code... and usually when I ask, the vendor says they don't test individual CUs, only SPs. So, if I apply a newer CU and have an issue, they won't support me.
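
    A minimal sketch of that kind of "-1 CU" baseline check (the build number below is a hypothetical placeholder; real build numbers are listed in each CU's KB article):

        -- Hypothetical baseline: the build number of the CU one behind
        -- current for this branch, taken from that CU's KB article.
        DECLARE @BaselineBuild int = 4422;

        -- ProductVersion looks like '12.0.4427.24'; PARSENAME part 2 is the build.
        DECLARE @CurrentBuild int = CAST(PARSENAME(
            CAST(SERVERPROPERTY('ProductVersion') AS nvarchar(32)), 2) AS int);

        SELECT CAST(SERVERPROPERTY('ProductVersion') AS nvarchar(32)) AS CurrentVersion,
               CASE WHEN @CurrentBuild < @BaselineBuild
                    THEN 'Behind baseline - patching due'
                    ELSE 'At or above baseline'
               END AS PatchStatus;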

  • Steve Jones - SSC Editor (6/2/2016)


    roger.plowman (6/2/2016)


    The whole "let's release this stuff NOW and worry about patches later" is stupid.

    First, that's not what's happening. The patches are tested, run through a(n ever increasing) set of tests. It's taken them years to get to the point where the level of testing for a CU is what it is for an SP. Internally.

    What is different is there isn't an external set of customers testing in different environments. I think we still need this, at least once a year for SPs.

    We need this for EVERY release, patch or not, and many times MS releases new features along with patches. In the past you didn't hear about this patch or that patch having to be withdrawn.

    MS is overdriving their lights here. Automated testing only goes so far; to support SQL Server, or other products of equal complexity, you need feedback on all the myriad combinations of hardware the software has to deal with.

    This is especially true of a product like SQL Server that could, in fact, kill someone if a bug gets through.

    This kind of testing takes time. A lot of time. Making this insane cadence something of a danse macabre.

    Who *wants* to have to update so often? Certainly not DBAs. Not MS's customers. The whole idea of "fast releases" is deeply flawed for two reasons, both of which are deal killers.

    First, it encourages release of buggy products. "Oh, we'll be releasing another patch anyway, we'll just fix stuff in that". Not to MY server you won't!

    Second, not really true. People hit by bugs do want these patches. I haven't recommended patching with CUs unless you are affected by a bug. In that case, there are MS customers that want these patches.

    I don't think this inherently encourages or discourages buggy releases. What it does is get written software, written code, which is inventory, off the shelves and into customers' hands. If you don't test, or don't have precautions, this can encourage buggy software. However, that's semi-flawed thinking: there will always be bugs.

    What quick releases allow is for you to patch things quickly, including patching a patch.

    Not arguing patches are bad. But these aren't just patches, are they? Not when a patch breaks existing behavior in another part of the system. Don't get me wrong, automated testing is great. Fast *CORRECT* patches are great. But there is a line, and MS crossed it some years ago. Their code quality is dropping. This fast-cadence BS is one of the biggest contributors.

    Second, who has time to bring down a production server (or all of them) to apply a patch *the manufacturer hasn't had time to thoroughly test*? What happens when this is a mission-critical server? Or a *life-sustaining* server?

    And to do it with an increasing tempo?

    This is certainly the issue I brought up.

    However, Windows moved to monthly patches years ago. All these complaints and worries existed then. Now most people, and most organizations, including mission-critical servers, get patched monthly.

    Except, this isn't just patching. It's also new versions. Win10 is a perfect example. It's still buggy. SQL Server 2008 is still buggy. All the other MS products are too. Partially because they're so busy releasing new stuff they don't have the time (read: "accountants want higher revenues") to make bulletproof stuff.

    Such a rapid pace can't be sustained without unacceptable loss of quality.

    Full stop.

  • roger.plowman (6/3/2016)


    Steve Jones - SSC Editor (6/2/2016)


    roger.plowman (6/2/2016)


    The whole "let's release this stuff NOW and worry about patches later" is stupid.

    First, that's not what's happening. The patches are tested, run through a(n ever increasing) set of tests. It's taken them years to get to the point where the level of testing for a CU is what it is for an SP. Internally.

    What is different is there isn't an external set of customers testing in different environments. I think we still need this, at least once a year for SPs.

    We need this for EVERY release, patch or not, and many times MS releases new features along with patches. In the past you didn't hear about this patch or that patch having to be withdrawn.

    Not correct. Patches have been withdrawn and re-released by MS and other vendors, including database vendors, in the past.

  • roger.plowman (6/3/2016)


    But there is a line, and MS crossed it some years ago. Their code quality is dropping. This fast-cadence BS is one of the biggest contributors.

    I don't know what your experience is, but this hasn't been mine. Nor have I seen any reports of this being the case from any significant number of users. I'd be curious what the released bug count is, but I would guess it's lower now in the core engine.

    I know the number of security patches has gone down over time.

  • CU#'s are identifiers. Identifiers should (must?) be unique.
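
    A minimal sketch of that idea (a hypothetical table): treat the CU number as part of a unique key, and a re-release under the same number simply gets rejected.

        -- Hypothetical catalog of released CUs; the primary key makes a
        -- duplicate (branch, CU number) pair impossible.
        CREATE TABLE dbo.CumulativeUpdate
        (
            ProductBranch varchar(32) NOT NULL,  -- e.g. 'SQL2014 SP1'
            CUNumber      int         NOT NULL,
            KBNumber      int         NOT NULL,
            ReleaseDate   date        NOT NULL,
            CONSTRAINT PK_CumulativeUpdate PRIMARY KEY (ProductBranch, CUNumber)
        );

        -- A second insert with the same branch and CU number fails with a
        -- primary-key violation, forcing a new CU number for the re-release.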

    Gaz

    -- Stop your grinnin' and drop your linen...they're everywhere!!!
