I've encountered this scenario a few times. At the highest level, I think it's the result of the business and its decision makers treating IT as simply overhead and an afterthought. You then get a small cadre of beleaguered individuals whose superiors expect them to do everything from AD administration to un-jamming the printer, since it's all just "that computer stuff," all on little to no budget. Complex, sometimes outright impossible, tasks with unrealistic timelines and commitments become the norm.
Obviously, this is sub-optimal for the IT staff. Typically, the most skilled individuals, the ones with other options, vote with their feet and depart. It also ultimately harms the business. Beyond the disaster-recovery scenarios and firefighting that result, which can be expensive and in some cases calamitous, the turnover and the institutional knowledge that walks out the door with it are an enormous drain on productivity.
The dynamic that often exists between IT and the business at large is honestly quite odd. Technology is ubiquitous in most industries, yet somehow IT isn't considered that important. Critical technology decisions are made by people with little, or even no, technical expertise or desire to learn, despite the potentially huge ramifications.
However, I don't think you can fully blame the business side for this arrangement. There are of course extremes, and I'm painting with a very broad brush here, but bear with me. Oftentimes, IT projects or improvements don't easily lend themselves to quantification. If you're a salesperson, you can say, "I made $X in sales this month." That's an easy-to-digest metric everyone understands. How do you quantify the business value of migrating from an old SQL Server 2005 instance to 2014? It's not clear how to even estimate that value, much less present it to a decision maker.
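About the closest you can get is a rough risk-and-cost framing. Here's a back-of-envelope sketch (plain Python, and every number in it is a hypothetical placeholder I made up, not real data) of how you might frame that SQL Server migration for a decision maker: the expected annual cost of staying on the unsupported version versus the one-time cost of migrating.

```python
# Back-of-envelope estimate: expected annual cost of staying on an
# out-of-support SQL Server 2005 instance vs. a one-time migration.
# Every number below is a hypothetical placeholder, not real data.

# Assumed inputs -- each one is a guess the business would have to vet.
outage_probability_per_year = 0.10      # chance of a major incident on unpatched software
outage_cost = 250_000                   # revenue loss + recovery cost of one major incident
extra_maintenance_hours_per_year = 200  # hours spent firefighting the legacy box
loaded_hourly_rate = 85                 # fully loaded cost of one IT hour

migration_cost = 60_000                 # licenses, hardware, and project labor

# Expected annual cost of doing nothing: probability-weighted incident
# cost plus the ongoing maintenance drag.
expected_annual_risk = outage_probability_per_year * outage_cost
annual_maintenance_drag = extra_maintenance_hours_per_year * loaded_hourly_rate
annual_cost_of_status_quo = expected_annual_risk + annual_maintenance_drag

# Simple payback period: years until the migration pays for itself.
payback_years = migration_cost / annual_cost_of_status_quo

print(f"Expected annual cost of staying put: ${annual_cost_of_status_quo:,.0f}")
print(f"Migration pays for itself in roughly {payback_years:.1f} years")
```

Even this toy model illustrates the problem: the answer swings wildly with the outage probability and incident cost, and nobody actually knows those numbers. That uncertainty is exactly why these proposals are such a hard sell next to "$X in sales this month."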