Microsoft once released an update to Windows XP (or maybe it was 2000) that broke any PC running Norton or McAfee. Oracle shipped a version of Java so flawed that Apple blocked that version from running on its products. Oracle then followed up with a patch that was just as bad. Adobe patches come out so often it seems to be hourly, and each is worse than the one before!
Patches frequently fix major security flaws. They fix performance issues. They fix bugs that cause apps to crash.
As noted above, patches frequently break things, cause major security issues, cause performance issues, and introduce new bugs that cause apps to crash.
So seriously, if my system works, and I am not aware of any issues, why in the world would I apply a patch? No, really?
In my case, I support more than 20 different applications and more than 40 separate SQL Server database instances in production, with almost as many test instances, all spread out over more than 100 actual Windows servers. So again, why would I patch something that isn't broken? No, really?
Do we need to patch? Yes. Do vendors need to step up and do even the smallest amount of testing before releasing patches? That would be nice. It would also be nice if the economy turned around (yeah I know, good luck!) and companies started hiring enough staff that their people could actually be proactive instead of reactive. I think that will happen soon, or at least as soon as a purple dinosaur runs for president.