Caching
Caching in httpc
Caching is a mechanism to store data, usually produced by expensive processing or fetched with long waiting times, in fast-access locations where it can later be retrieved and reused. Reusing cached data avoids reprocessing or refetching the same data, saving resources and time.
@httpc/kit provides two ways to cache data:
- Output caching: cache the whole call result, avoiding its execution for subsequent invocations. You can differentiate by arguments, so different values are stored when the calling client specifies different arguments.
- Value caching: explicitly cache a variable for granular control. Useful to avoid re-executing expensive computations, queries or long external calls.
All caching features rely on the CachingService, which is used under the hood by friendlier, more integrated components.
Call output caching
You can cache a call's return value with the Cache middleware.
import { httpCall, Cache } from "@httpc/kit";

const getProfile = httpCall(
    Cache("1h"),
    async () => {
        return /** return the value */
    }
);
The Cache middleware lets you specify an expiration to limit the cached value's validity.
Check out the Cache middleware section for all options and configuration details.
Value caching
You can cache any variable thanks to the useCached hook. Value caching lets you store data imperatively in code, for full control.
import { useCached } from "@httpc/kit";

async function getExpensiveQuery() {
    let data = await useCached("expensive-query");
    if (!data) {
        // data is not in cache
        // we need to retrieve it
        const result = await db.doExpensiveQuery();

        // set the result into the cache
        // and assign it back to the data variable
        data = await useCached("expensive-query", result);
    }

    return data;
}
Value caching via the useCached hook is available everywhere in your application, not just inside httpc calls. You can use it in parsers, middlewares and, in general, anywhere within the request processing execution.
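For instance, here is a minimal sketch of a plain helper function that could be called from a middleware or a parser; the lookupUserRoles name and the db.getUserRoles query are hypothetical placeholders for your own code:
import { useCached } from "@httpc/kit";

// hypothetical helper: usable from middlewares, parsers or plain services,
// not only from inside an httpc call
async function lookupUserRoles(userId: string) {
    let roles = await useCached(`roles:${userId}`);
    if (!roles) {
        // not cached yet: load from your own data source
        roles = await db.getUserRoles(userId);

        // store it with a 1 minute expiration
        roles = await useCached(`roles:${userId}`, roles, { ttl: 60000 });
    }

    return roles;
}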
Check out the useCached section for all options and configuration details.
Builtin components
Cache middleware
With the Cache middleware you can transparently cache the call's return value and set its expiration.
import { httpCall, Cache } from "@httpc/kit";

const getAllPosts = httpCall(
    Cache("2h"),
    async () => {
        return await db.select("posts");
    }
);
You can specify the expiration, also known as TTL, with a human-readable string in the format (amount)(unit), where:
- amount is a positive number
- unit is one of: ms (milliseconds), s (seconds), m (minutes), h (hours), d (days)
So you can set 2h for 2 hours, 5m for 5 minutes, and so on.
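For instance, all of the following expirations are valid (shown standalone, outside an httpCall, just to illustrate the format):
// illustrative expirations using the (amount)(unit) format
Cache("500ms");  // half a second
Cache("45s");    // 45 seconds
Cache("7d");     // one week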
Specific cache
By default, the Cache middleware uses the default cache as the data store. You can specify a different cache with the cache option:
import { httpCall, Cache } from "@httpc/kit";

const getAllPosts = httpCall(
    Cache("2h", { cache: "redis" }),
    async () => {
        return await db.select("posts");
    }
);
Cache Key
The cache key is the unique identifier for the slot in which the cached value will be stored. When a call has no arguments, the cache key is unique and every execution will use the same key to read and write the value from the cache. In this case, the cache key is autogenerated and nothing needs to be configured.
But when the call has one or more arguments, the slot where the returned value is stored needs to be differentiated. For example, for the following getPost call, the cache needs to use a different slot for each postId specified:
const getPost = httpCall(
    Cache("2h"),
    async (postId: string) => {
        // function logic
    }
);
The Cache middleware includes a predefined way to generate a cache key from the arguments. The default implementation builds a unique key from all arguments: it hashes the arguments as strings and concatenates them, so anything that is not a string is stringified first.
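To make this concrete, a key factory with a similar behavior could look like the following; this is only an illustration, not the actual @httpc/kit implementation:
// illustration only: not the real @httpc/kit key generation
function defaultStyleKeyFactory(args: any[]): string {
    return args
        .map(arg => typeof arg === "string" ? arg : JSON.stringify(arg))
        .join(":");
}

defaultStyleKeyFactory(["post-42"]);        // "post-42"
defaultStyleKeyFactory(["post-42", true]);  // "post-42:true"
You could pass a function like this as the keyFactory option if you need a predictable, readable key format.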
For the previous getPost example, the default implementation is able to differentiate the cache slot for each post, as the postId is taken into consideration. So nothing needs to be configured: the default is good enough.
You can specify your own implementation with the keyFactory option. The keyFactory is a function that takes all arguments as input and must return a string, which will be the cache key used to store the call result.
type KeyFactory = (args: any[]) => string
In the following example, you want to exclude the second parameter from the cache key, as it has no influence on the returned value.
const getPost = httpCall(
    Cache("2h", { keyFactory: ([postId]) => postId }),
    async (postId: string, markAsRead?: boolean) => {
        // function logic
    }
);
In-Memory cache
You can store the value in memory without using a cache provider. This is a quick way to cache values locally, scoped strictly to the specific call.
const getAllPosts = httpCall(
    Cache("2h", { inMemory: true }),
    async () => {
        // function logic
    }
);
You can set the keyFactory to identify the cache key from the parameters, if you need something different from the default implementation.
const getPost = httpCall(
    Cache("2h", { inMemory: true, keyFactory: ([postId]) => postId }),
    async (postId: string, markAsRead?: boolean) => {
        // function logic
    }
);
CachingService
The CachingService is the core service that provides caching features to all components.
You can register different cache providers. A provider is defined by a key and a factory.
import { CachingService, CachingServiceOptions, REGISTER_OPTIONS } from "@httpc/kit";
import { LruCache } from "@httpc/kit/caching-lru";
import { RedisCache } from "@httpc/kit/caching-redis";

REGISTER_OPTIONS<CachingServiceOptions>(CachingService, {
    caches: {
        memory: () => new LruCache(),
        remote: () => new RedisCache({ url: process.env.REDIS_ENDPOINT })
    },
});
Builtin components let you specify the cache you want to use. For example, with the Cache middleware:
const getAllPosts = httpCall(
    Cache("2h", { cache: "remote" }),
    async () => {
        // omitted
    }
);
The builtin CachingService is registered with the ICachingService interface. For advanced scenarios, you can require it explicitly:
import { useInjected } from "@httpc/kit";

const caching = useInjected("ICachingService");
Default cache
The CachingService allows you to specify a default cache.
import { CachingService, CachingServiceOptions, REGISTER_OPTIONS } from "@httpc/kit";

REGISTER_OPTIONS<CachingServiceOptions>(CachingService, {
    caches: {
        memory: () => /** omitted */,
        remote: () => /** omitted */
    },
    defaultCache: "memory"
});
You may want to specify a default cache so you can use the shorthand version of all components. This way, you don't have to specify the target cache every time, and your code stays short.
In the following example, useCached will transparently use the default cache.
const value = await useCached("some-key");
InMemory cache provider
// TODO
3rd party providers
@httpc/kit offers pre-configured integrations with 3rd party caching providers and libraries.
Usually a 3rd party cache can be enabled with a single import:
import "@httpc/kit/caching-*";
where * is the 3rd party package name, for example @httpc/kit/caching-lru or @httpc/kit/caching-redis.
The pre-configured caching providers are:
LRU
@httpc/kit provides an out-of-the-box integration with the lru-cache package.
First of all, ensure the package is installed:
npm install lru-cache
pnpm add lru-cache
yarn add lru-cache
To enable the integration, register the cache with any key you like. For example, with the key memory:
import { CachingService, CachingServiceOptions, REGISTER_OPTIONS } from "@httpc/kit";
import { LruCache } from "@httpc/kit/caching-lru";

REGISTER_OPTIONS<CachingServiceOptions>(CachingService, {
    caches: {
        memory: () => new LruCache()
    }
});
Options
The LruCache can be configured with LruCacheOptions.
| property | type | default | description |
|---|---|---|---|
| size | number | 100 | the max count of items the cache can hold |
| ttl | number | 0 | the expiration of items in milliseconds; use 0 to have no expiration |
const cache = new LruCache({
    size: 10,   // 10 items
    ttl: 60000  // 1 minute
});
To use the pre-configured LruCache defaults, just instantiate it without arguments:
// use defaults
const cache = new LruCache();
Redis
@httpc/kit provides an out-of-the-box integration with Redis via the @redis/client package.
First of all, ensure the client is installed:
npm install @redis/client
pnpm add @redis/client
yarn add @redis/client
To enable the integration, register the cache with any key you like. For example, with the key remote:
import { CachingService, CachingServiceOptions, REGISTER_OPTIONS } from "@httpc/kit";
import { RedisCache } from "@httpc/kit/caching-redis";

REGISTER_OPTIONS<CachingServiceOptions>(CachingService, {
    caches: {
        remote: () => new RedisCache({
            url: "redis://usr:pwd@redis-server:6380"
        })
    }
});
Options
To instantiate the RedisCache, you can either:
provide your own redis client:
import { RedisCache } from "@httpc/kit/caching-redis";
import { createClient } from "@redis/client";

const cache = new RedisCache({
    client: createClient({
        // configuration
    })
});
or provide redis client options to configure a new one:
import { RedisCache } from "@httpc/kit/caching-redis";

const cache = new RedisCache({
    url: "...",
    // all redis options here
});
Hooks
useCached
With the useCached hook you can get and set items from the caching service. The hook is asynchronous and returns a Promise, so it has to be awaited.
import { useCached } from "@httpc/kit";

async function doSomething() {
    const value = await useCached("item-key");
    if (value) {
        // the value is in the cache
    }
}
With no options, all operations target the default cache. But you can also specify the cache you want to operate on.
Get a cached value
From the default cache:
const value = await useCached("item-key");
You can specify from which cache you want to get it:
const value = await useCached("item-key", { cache: "redis" });
Set a value
To the default cache:
await useCached("item-key", value);
You can specify which cache you want to write to:
await useCached("item-key", value, { cache: "redis" });
You can specify the TTL, that is, the expiration in milliseconds (not all providers support this):
// expire after 1 minute
await useCached("item-key", value, { ttl: 60000 });
When writing a value, useCached also returns it, so you can write a one-liner:
// both writing a value and returning it
const value = await useCached("item-key", 12345);
// here value = 12345
useCache
With the useCache hook you can get a specific cache previously defined in the CachingService.
import { useCache } from "@httpc/kit";
const cache = useCache("memory");
You can omit the cache name to get the default cache.
// gets the default cache
const cache = useCache();
If no default cache is registered, useCache will throw a misconfiguration error.
Decorators
cache
You can inject a specific cache in your service constructor.
import { cache, ICache } from "@httpc/kit";

class Service {
    constructor(
        @cache("redis") private cache: ICache,
    ) {
    }
}
With no parameters, you can inject the default cache.
class Service {
    constructor(
        @cache() private cache: ICache, // default cache injected
    ) {
    }
}
If no default cache is registered, the resolution will fail and raise an error.
Interfaces
ICachingService
// TODO
ICache
// TODO
ICacheSync
// TODO