Cache
There are only two hard problems in computer science: cache invalidation and naming things... yadda yadda.
Challenge
#wip
When caching result sets from a slow API or database, which is the better strategy:
- cache one large object and process it every time I pull it out of the cache, or
- cache many small objects and process each one before putting it into the cache, maybe even precomputing results before caching?
Let's say I've got a collection of items, and each item is a complex object itself:
$items = fetch('/slow/API/endpoint');
/*
$items = [
'id1' => [ // large objects
'key1' => 'a',
'key2' => 'b',
],
'id2' => [
'key1' => 'c',
'key2' => 'd',
],
'id3' => [
'key1' => 'e',
'key2' => 'f',
],
]
*/
$cache->set('itemsCacheKey', $items);
// now, instead of hitting the API, I use
$cache->get('itemsCacheKey');
// and pull that large blob out of the cache
That's painless if I use the whole collection to list it somewhere, or even to filter it.
What if I need only the object with id2, or a subset ['id1','id3']?
Do I fetch it and cache it despite the redundancy, since all the data already exists in $cache->get('itemsCacheKey')?
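One option is to keep the single blob and slice it after the get, so no extra cache entries are needed. A minimal sketch, assuming the cached blob is a plain id-keyed array like the one above (`$wanted` is a hypothetical name):

```php
<?php
// The cached blob, id-keyed, as in the example above.
$items = [
    'id1' => ['key1' => 'a', 'key2' => 'b'],
    'id2' => ['key1' => 'c', 'key2' => 'd'],
    'id3' => ['key1' => 'e', 'key2' => 'f'],
];

// Single item: a plain array lookup, no second cache entry.
$item = $items['id2'] ?? null;

// Subset: keep only the wanted ids from the blob.
$wanted = ['id1', 'id3'];
$subset = array_intersect_key($items, array_flip($wanted));
```

array_intersect_key preserves the order of the blob, so the subset comes back in the same order the items were cached in.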
$item = fetch('/slow/API/endpoint/id2');
/*
$item = [
'id2' => [
'key1' => 'c',
'key2' => 'd',
],
]
*/
$cache->set('itemCacheKey_id2', $item);
//
$items = fetch('/slow/API/endpoint?ids=id1,id3');
/*
$items = [
'id1' => [ // large objects
'key1' => 'a',
'key2' => 'b',
],
'id3' => [
'key1' => 'e',
'key2' => 'f',
],
]
*/
$cache->set('itemCacheKey_id1,id3', $items);
Or do I add more logic before the caching step and cache every single key separately?
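Caching every single key might look like this. A sketch under assumptions: the per-item key scheme and the `ArrayCache` stand-in are mine; `$cache` just follows the set/get style of the snippets above:

```php
<?php
// Hypothetical in-memory cache standing in for $cache (PSR-16-style set/get).
class ArrayCache {
    private array $store = [];
    public function set(string $key, $value): void { $this->store[$key] = $value; }
    public function get(string $key) { return $this->store[$key] ?? null; }
}

$cache = new ArrayCache();
$items = [
    'id1' => ['key1' => 'a', 'key2' => 'b'],
    'id2' => ['key1' => 'c', 'key2' => 'd'],
];

// Write-through: one cache entry per item, keyed by id.
foreach ($items as $id => $item) {
    $cache->set("itemCacheKey_$id", $item);
}

// Later, a single item comes straight out of the cache,
// and invalidating one item does not touch the others.
$item = $cache->get('itemCacheKey_id2');
```

The upside is granular invalidation; the downside is that listing the whole collection now means one get per id (or a multi-get, if the backend supports it).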
I think I'll stay with the raw data dump into the cache, unless it becomes too big of an object. Then split it.
I must be able to remove the cache layer and work with the data directly.
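That requirement suggests a read-through wrapper: try the cache, fall back to the fetch, and let the whole thing degrade to a direct fetch when there is no cache. A sketch, with `fetchItems()` as a hypothetical stand-in for `fetch('/slow/API/endpoint')` and a minimal in-memory cache:

```php
<?php
// Hypothetical slow source standing in for fetch('/slow/API/endpoint').
function fetchItems(): array {
    return [
        'id1' => ['key1' => 'a', 'key2' => 'b'],
        'id2' => ['key1' => 'c', 'key2' => 'd'],
    ];
}

// Read-through wrapper: with a cache, misses populate it;
// with null, it degrades to a plain fetch (cache layer removed).
function getItems(?object $cache): array {
    if ($cache === null) {
        return fetchItems();               // work with the data directly
    }
    $items = $cache->get('itemsCacheKey');
    if ($items === null) {
        $items = fetchItems();             // miss: hit the slow source once
        $cache->set('itemsCacheKey', $items);
    }
    return $items;
}

// Minimal in-memory stand-in for $cache.
$cache = new class {
    private array $s = [];
    public function get(string $k) { return $this->s[$k] ?? null; }
    public function set(string $k, $v): void { $this->s[$k] = $v; }
};
```

Because the cache is just an optional argument, deleting the cache layer is a one-line change at the call site rather than a rewrite of the data access code.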
What are the pros and cons of caching each small call versus caching the result of many small calls in server-side caching?
ANSWER
Caching is a technique that can significantly improve the performance of web applications. When it comes to caching small calls versus caching the result of many small calls, there are pros and cons to each approach.
Caching each small call:
Pros:
- Reduced latency: Caching each small call can reduce latency and improve the response time of the application. Since the result of each call is cached, subsequent requests for the same data can be served faster.
- Smaller cache footprint: Since each call is cached separately, the cache footprint can be smaller. This can be useful if the application has limited memory for caching.
Cons:
- Increased cache misses: Caching each small call can result in more cache misses, which can negatively impact performance. If the cache is not warmed up, there can be a high number of cache misses, which can result in slower response times.
- Increased complexity: Caching each small call can add complexity to the application. The caching logic needs to handle each small call separately, which can be more difficult to implement and maintain.
Caching the result of many small calls:
Pros:
- Reduced cache misses: Caching the result of many small calls can reduce cache misses and improve performance. Since the result of many small calls is cached together, subsequent requests for the same data can be served faster.
- Simplified caching logic: Caching the result of many small calls can simplify the caching logic. The caching logic can be implemented once and used for all small calls that share the same result.
Cons:
- Larger cache footprint: Caching the result of many small calls can result in a larger cache footprint. Since the result of many small calls is cached together, the cache can consume more memory.
- Increased latency for some calls: Caching the result of many small calls can result in increased latency for some calls. If a small call that is part of the cached result changes, then the entire cached result needs to be invalidated and recalculated. This can result in increased latency for subsequent requests for the same data.
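That invalidation cost can be made concrete. A sketch, reusing the id-keyed blob and key names from the question above (the in-memory cache object is a hypothetical stand-in):

```php
<?php
// Minimal in-memory stand-in for $cache (same set/get style as above).
$cache = new class {
    private array $s = [];
    public function get(string $k) { return $this->s[$k] ?? null; }
    public function set(string $k, $v): void { $this->s[$k] = $v; }
};

// One big blob: changing a single item means rewriting the whole entry.
$cache->set('itemsCacheKey', [
    'id1' => ['key1' => 'a', 'key2' => 'b'],
    'id2' => ['key1' => 'c', 'key2' => 'd'],
]);
$blob = $cache->get('itemsCacheKey');
$blob['id2']['key1'] = 'updated';
$cache->set('itemsCacheKey', $blob);   // entire collection re-serialized

// Per-item entries: only the touched key is rewritten or invalidated.
$cache->set('itemCacheKey_id2', ['key1' => 'updated', 'key2' => 'd']);
```

With a real backend the blob rewrite also re-serializes and re-transmits every untouched item, which is where the latency of the "one large object" approach shows up on writes.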
In summary, caching each small call can reduce latency and result in a smaller cache footprint, but can also increase complexity and result in more cache misses. Caching the result of many small calls can reduce cache misses and simplify caching logic, but can also result in a larger cache footprint and increased latency for some calls. The choice between these two approaches depends on the specific requirements of the application.
Answer via Phind.com, a ChatGPT-4-powered search engine, for the query: What are the pros and cons of caching each small call versus caching the result of many small calls in server-side caching?
- Caching challenges and strategies
- Caching strategies to speed up your API - LogRocket Blog
- Different Types of Caching: Server Cache vs Browser Cache vs Site Cache
[[../programming/API Design]]