I think there is a hierarchy of requirements that stems from what a company sees as its fundamental needs for IT. As an example, a CEO may decide that one of the company's aims is to know its customers better, and will see a CRM application as a means to get there. The IT department will see that the app needs a database underneath it, that both app and database need servers to sit on, and then that the servers will need a network OS to support them and a network over which to communicate.
My experience has generally been that, whilst market forces nudge the cost of rare skills a little, the further someone's job is from the company's fundamental stated need, the lower the level of remuneration. OK, it's a little simplistic, but a network, no matter how well designed, can never actually add value for the business; it can merely reduce costs. However, an application/database pair does have the ability to add value, so the company is much more ready to spend on people directly related to that pair. And if someone understands two or more of the necessary areas (e.g. a DBA who is also an acceptable programmer), the company will start spouting buzzwords like "synergy" and "integration", and will value that person all the more.
In short, what determines how much a person is paid isn't how important they are to the business, but how well the decision makers in that business recognise why that person's job is important. Visibility, not utility.
Semper in excretia sumus, solum profundum variat ("We are always in the excrement; only the depth varies")