SQLServerCentral Editorial

Caching


How many of you have worked with a cache of data for your application? I would bet that the vast majority of you have never built a caching system. There's no shame in this, as most of us don't deal with very high levels of concurrency in our applications. Even when we do hit concurrency problems, they're rare enough that we wouldn't have bothered with the extra coding when we designed the application because, well, it's extra coding. It takes time, requires more testing, and our project managers might not be willing to invest in it "just in case" we hit some concurrency issue. In fact, most of us would probably view this as a "good problem" to have, one that would justify re-architecting the application.

The problem is that often no one wants to re-architect a working application and potentially delay other, more exciting enhancements. Many highly visible web sites have gone through these growing pains, and usually only those with a tremendous amount of resources to devote to the problem manage to solve it.

However, it doesn't have to be hard. Brent Ozar published a piece on caching results that uses a creative solution: scaling out to a new database and table that exist only to return results. In some sense, this is a great application of the KISS principle, one that doesn't use replication or any other complex technology. It even employs a technique that NoSQL platforms often hawk: eventual consistency.
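
The linked piece has the details, but the general shape of the pattern is simple. As a rough sketch (the database, table, and column names below are hypothetical, not Brent Ozar's actual schema): a results-only table lives in a separate cache database, and a scheduled job repopulates it from the base data, so the application only ever reads pre-computed rows.

-- Results-only table in a separate database that exists just to serve reads.
-- (CacheDB, SalesDB, and the object names are hypothetical.)
USE CacheDB;
GO

CREATE TABLE dbo.TopSellers
(
    ProductID   int           NOT NULL PRIMARY KEY,
    ProductName nvarchar(200) NOT NULL,
    UnitsSold   int           NOT NULL,
    RefreshedAt datetime2     NOT NULL DEFAULT SYSUTCDATETIME()
);
GO

-- Repopulate on a schedule, e.g. from a SQL Agent job. The expensive
-- aggregation runs once per refresh; readers only ever touch dbo.TopSellers.
BEGIN TRANSACTION;

    DELETE FROM dbo.TopSellers;

    INSERT INTO dbo.TopSellers (ProductID, ProductName, UnitsSold)
    SELECT      p.ProductID, p.ProductName, SUM(ol.Quantity)
    FROM        SalesDB.dbo.Products   AS p
    JOIN        SalesDB.dbo.OrderLines AS ol ON ol.ProductID = p.ProductID
    GROUP BY    p.ProductID, p.ProductName;

COMMIT;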

That's the idea behind caching. The data isn't necessarily the most up to date, and it doesn't depend on being synced with your other data. Ideally you'd have some system that updates your cache when the base data changes, but even then there's an "eventual consistency" delay at work. It might be tens or hundreds of milliseconds, which is what we often find with distributed systems, and that's usually fine. It's very rare that a problem requires absolute synchronization of all the data in an application.

I'd urge you to think about building some scaffolding into your application that can be built out later to incorporate caching, or some other scaling technology, in case you find your system is more popular than anyone expected. It shouldn't be too hard, especially if you think about it early on.
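
One cheap piece of scaffolding, sketched here with the same hypothetical object names as above: have the application read through a view rather than querying the base tables directly. Today the view computes results from live data; if load ever justifies it, the view can be repointed at a cached results table without changing a line of application code.

USE SalesDB;
GO

-- Today: the view computes results directly from the live tables.
CREATE VIEW dbo.TopSellersForApp
AS
SELECT   p.ProductID, p.ProductName, SUM(ol.Quantity) AS UnitsSold
FROM     dbo.Products   AS p
JOIN     dbo.OrderLines AS ol ON ol.ProductID = p.ProductID
GROUP BY p.ProductID, p.ProductName;
GO

-- Later, if load demands it: repoint the view at the cached results,
-- with no change to the application code that selects from it.
ALTER VIEW dbo.TopSellersForApp
AS
SELECT ProductID, ProductName, UnitsSold
FROM   CacheDB.dbo.TopSellers;
GO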
