Published on November 8, 2023

Exploring Different Types of Cache Systems

Caching is a common technique in modern distributed systems for improving performance and reducing response time. The general idea is to reuse previously computed values and avoid repeated trips to the server or database. A distributed system can have multiple caching points: the browser cache, CDN, load balancer cache, distributed cache, and database cache. The techniques below assume that the cached data is not stale. Minimal sketches of several of them appear after this overview.

Browser caching stores HTTP responses locally and can shave a significant amount off the response time. It is enabled by adding an expiration policy to the response HTTP headers.

Web assets such as images, videos, and documents are ideal candidates for caching because they change infrequently. They are typically cached in content delivery networks (CDNs), which are geographically distributed so that content is served as close to the request origin as possible. Content can also be personalized at the edge nodes.

Load balancer caching helps reduce load on the servers and improve response time. Depending on the implementation, a load balancer can be configured to respond with cached results for subsequent requests that carry the same parameters.

Distributed caches such as Redis are in-memory key-value stores with high read and write throughput. One common application of a distributed cache is an inverted index for full-document search.

Lastly, depending on the implementation, the database itself may provide caching features such as a buffer pool and materialized views. The buffer pool keeps frequently accessed data in allocated memory so it can be served without going to disk, while materialized views store precomputed query results to reduce latency.
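As a concrete illustration of the browser-cache expiration policy, the sketch below uses Flask (an assumption; any HTTP framework works the same way) to attach a Cache-Control header to an asset response. The route, file path, and one-day max-age are illustrative.

```python
from flask import Flask, make_response, send_file

app = Flask(__name__)

@app.route("/assets/logo.png")
def logo():
    # Serve a static asset and allow the browser to reuse the response
    # for up to one day (86400 seconds) before asking the server again.
    response = make_response(send_file("static/logo.png"))
    response.headers["Cache-Control"] = "public, max-age=86400"
    return response
```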
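For the distributed cache, a common usage pattern is cache-aside: check Redis first, fall back to the database on a miss, and write the result back with an expiry. This sketch uses the redis-py client; the key format, the five-minute TTL, and the load_profile_from_db helper are hypothetical.

```python
import json

import redis

cache = redis.Redis(host="localhost", port=6379)

def load_profile_from_db(user_id: int) -> dict:
    # Placeholder for the real database query.
    return {"id": user_id, "name": "example"}

def get_user_profile(user_id: int) -> dict:
    """Cache-aside read: try Redis first, then the database."""
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                # cache hit
    profile = load_profile_from_db(user_id)      # cache miss
    cache.setex(key, 300, json.dumps(profile))   # keep for 5 minutes
    return profile
```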
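The inverted-index use case mentioned above can also be sketched with Redis sets: each term maps to the set of document IDs containing it, and a multi-term search intersects those sets. The whitespace tokenization and key names here are simplified assumptions.

```python
import redis

r = redis.Redis(host="localhost", port=6379)

def index_document(doc_id: str, text: str) -> None:
    # Add this document to the posting set of every term it contains.
    for term in set(text.lower().split()):
        r.sadd(f"term:{term}", doc_id)

def search(*terms: str) -> set:
    # Documents that contain all of the given terms.
    keys = [f"term:{t.lower()}" for t in terms]
    return {member.decode() for member in r.sinter(*keys)}

index_document("doc1", "distributed caching with redis")
index_document("doc2", "redis is an in-memory key value store")
print(search("redis", "caching"))  # {'doc1'}
```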
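Finally, the precomputed-view idea can be illustrated with a PostgreSQL materialized view created and refreshed through psycopg2; the database, table, and view names are made up for the example.

```python
import psycopg2

conn = psycopg2.connect("dbname=shop")
with conn, conn.cursor() as cur:
    # Precompute an expensive aggregation once so readers can query the
    # view instead of re-running the aggregation on every request.
    cur.execute("""
        CREATE MATERIALIZED VIEW IF NOT EXISTS daily_sales AS
        SELECT order_date, SUM(total) AS revenue
        FROM orders
        GROUP BY order_date
    """)
    # Refresh periodically (e.g. from a scheduled job) to pick up new rows.
    cur.execute("REFRESH MATERIALIZED VIEW daily_sales")
```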

Tags: Browser Cache, Cache, Content Delivery Network, Database Cache, Distributed Cache, Distributed Systems, Load Balancer, Load Balancer Cache, Tech