Data Caching

The cache function allows you to cache the results of data fetching or computations. It gives you fine-grained control over cached data and is suitable for various scenarios, including Client-Side Rendering (CSR), Server-Side Rendering (SSR), and API services (BFF).

Basic Usage

import { cache } from '@module-federation/bridge-react';

export type Data = {
  data: string;
};

export const fetchData = cache(async (): Promise<Data> => {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve({
        data: `[ provider ] fetched data: ${new Date()}`,
      });
    }, 1000);
  });
});

Parameters

  • fetchData: The data fetching or computation function to be cached, such as the fetchData function of a DataLoader.
  • options (optional): Cache configuration.
    • tag: A tag to identify the cache, which can be used to invalidate it.
    • maxAge: The cache's validity period (in milliseconds).
    • revalidate: A time window for revalidating the cache (in milliseconds), similar to the stale-while-revalidate feature of HTTP Cache-Control.
    • getKey: A simplified cache key generation function that creates a key based on the function's arguments.
    • customKey: A fully custom cache key function that receives the call's parameters, the original function reference, and the framework-generated key, and returns a string or Symbol (see customKey below).
    • onCache: A callback invoked on every cache operation (hit, miss, or stale), used for custom handling of cached data.

The type for the options parameter is as follows:

interface CacheOptions {
  tag?: string | string[];
  maxAge?: number;
  revalidate?: number;
  getKey?: <Args extends any[]>(...args: Args) => string;
  customKey?: <Args extends any[]>(options: {
    params: Args;
    fn: (...args: Args) => any;
    generatedKey: string;
  }) => string | symbol;
  onCache?: (info: CacheStatsInfo) => boolean;
}

Return Value

The cache function returns a new fetchData function with caching capabilities. Calling this new function multiple times with the same arguments will not re-execute the underlying function; the cached data is returned instead.
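
For illustration, here is a minimal sketch of this behavior; getTimestamp is a hypothetical cached function used only for this example:

import { cache } from '@module-federation/bridge-react';

// A hypothetical cached function: the underlying async function only runs on a cache miss.
const getTimestamp = cache(async () => {
  console.log('executing underlying function');
  return Date.now();
});

const first = await getTimestamp();  // Executes the underlying function and caches the result.
const second = await getTimestamp(); // Returns the cached result; the function is not executed again.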

Scope of Use

This function is only supported for use within a DataLoader.

Detailed Usage

maxAge

After each computation, the framework records the time the data was written to the cache. When the function is called again, it uses the maxAge parameter to check whether the cache has expired. If it has, the cached function is re-executed; otherwise, the cached data is returned.

import { cache, CacheTime } from '@module-federation/bridge-react';

const getDashboardStats = cache(
  async () => {
    return await fetchComplexStatistics();
  },
  {
    maxAge: CacheTime.MINUTE * 2,  // Calling this function within 2 minutes will return cached data.
  }
);

revalidate

The revalidate parameter sets a time window for revalidating the cache after it has expired. It can be used with the maxAge parameter, similar to the stale-while-revalidate model of HTTP Cache-Control.

In the following example, if getDashboardStats is called within the 2-minute non-expired window, it returns cached data. If the cache is expired (between 2 and 3 minutes), incoming requests will first receive the old data, and then a new request will be made in the background to update the cache.

import { cache, CacheTime } from '@module-federation/bridge-react';

const getDashboardStats = cache(
  async () => {
    return await fetchComplexStatistics();
  },
  {
    maxAge: CacheTime.MINUTE * 2,
    revalidate: CacheTime.MINUTE * 1,
  }
);
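
Continuing the getDashboardStats example above, a hypothetical call timeline might look like this (the timings are illustrative, and exactly when refreshed data becomes visible depends on when the background request completes):

// t = 0:00 — cache miss: fetchComplexStatistics runs and the result is cached.
await getDashboardStats();

// t = 1:00 — within maxAge (2 minutes): the cached data is returned, no request is made.
await getDashboardStats();

// t = 2:30 — past maxAge but inside the 1-minute revalidate window:
// the stale cached data is returned immediately and a background refresh is triggered.
await getDashboardStats();

// Subsequent calls return the refreshed data once the background request has completed.
await getDashboardStats();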

tag

The tag parameter is used to identify a cache with a label, which can be a string or an array of strings. This tag can be used to invalidate the cache, and multiple cache functions can share the same tag.

import { cache, revalidateTag } from '@module-federation/bridge-react';

const getDashboardStats = cache(
  async () => {
    return await fetchDashboardStats();
  },
  {
    tag: 'dashboard',
  }
);

const getComplexStatistics = cache(
  async () => {
    return await fetchComplexStatistics();
  },
  {
    tag: 'dashboard',
  }
);

revalidateTag('dashboard'); // This will invalidate the caches for both getDashboardStats and getComplexStatistics.
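
Since tag also accepts an array of strings, a single cached function can belong to several tags and be invalidated through any of them. A brief sketch, where getWeeklyReport and fetchWeeklyReport are hypothetical:

import { cache, revalidateTag } from '@module-federation/bridge-react';
import { fetchWeeklyReport } from './api'; // Hypothetical fetcher, used only for illustration.

const getWeeklyReport = cache(
  async () => {
    return await fetchWeeklyReport();
  },
  {
    // This cache carries both tags, so revalidating either one invalidates it.
    tag: ['dashboard', 'reports'],
  }
);

revalidateTag('reports'); // Invalidates getWeeklyReport, but not caches tagged only with 'dashboard'.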

getKey

The getKey parameter allows you to customize how cache keys are generated. For example, you might only need to rely on a subset of the function's parameters to differentiate caches. It is a function that receives the same arguments as the original function and returns a string as the cache key:

import { cache, CacheTime } from '@module-federation/bridge-react';
import { fetchUserData } from './api';

const getUser = cache(
  async (userId, options) => {
    // Here, options might contain many configurations, but we only want to cache based on userId.
    return await fetchUserData(userId, options);
  },
  {
    maxAge: CacheTime.MINUTE * 5,
    // Only use the first parameter (userId) as the cache key.
    getKey: (userId, options) => userId,
  }
);

// The following two calls will share the cache because getKey only uses userId.
await getUser(123, { language: 'zh' });
await getUser(123, { language: 'en' }); // Hits the cache, no new request.

// Different userIds will use different caches.
await getUser(456, { language: 'zh' }); // Does not hit the cache, new request.

You can also use the generateKey function provided by Modern.js with getKey to generate cache keys:

INFO

The generateKey function in Modern.js ensures that a consistent, unique key is generated even if the order of object properties changes, guaranteeing stable caching.

import { cache, CacheTime, generateKey } from '@module-federation/bridge-react';
import { fetchUserData } from './api';

const getUser = cache(
  async (userId, options) => {
    return await fetchUserData(userId, options);
  },
  {
    maxAge: CacheTime.MINUTE * 5,
    getKey: (userId, options) => generateKey(userId),
  }
);

customKey

The customKey parameter is used to customize the cache key. It is a function that receives an object with the following properties and must return a string or Symbol to be used as the cache key:

  • params: An array of arguments passed to the cached function.
  • fn: A reference to the original cached function.
  • generatedKey: The original cache key automatically generated by the framework based on the input parameters.

INFO

Generally, the cache will be invalidated in the following scenarios:

  1. The reference to the cached function changes.
  2. The function's input parameters change.
  3. The maxAge condition is not met.
  4. revalidateTag is called.

customKey is mainly useful when you want different function references to share the same cache, for example when a cached function is re-created but should still return previously cached data. If you only need to customize the cache key, it is recommended to use getKey instead.

import { cache, CacheTime } from '@module-federation/bridge-react';
import { fetchUserData } from './api';

// Different function references, but they can share a cache via customKey.
const getUserA = cache(
  fetchUserData,
  {
    maxAge: CacheTime.MINUTE * 5,
    customKey: ({ params }) => {
      // Return a stable string as the cache key.
      return `user-${params[0]}`;
    },
  }
);

// Even if the function reference changes, it will hit the cache as long as customKey returns the same value.
const getUserB = cache(
  (...args) => fetchUserData(...args), // New function reference.
  {
    maxAge: CacheTime.MINUTE * 5,
    customKey: ({ params }) => {
      // Return the same key as getUserA.
      return `user-${params[0]}`;
    },
  }
);

// Even though getUserA and getUserB are different function references, because their customKey returns the same value,
// they will share the cache when called with the same parameters.
const dataA = await getUserA(1);
const dataB = await getUserB(1); // This will hit the cache and not make a new request.

// You can also use a Symbol as the cache key (usually for sharing a cache within the same application).
const USER_CACHE_KEY = Symbol('user-cache');
const getUserC = cache(
  fetchUserData,
  {
    maxAge: CacheTime.MINUTE * 5,
    customKey: () => USER_CACHE_KEY,
  }
);

// You can also use the generatedKey parameter to modify the default key.
const getUserD = cache(
  fetchUserData,
  {
    customKey: ({ generatedKey }) => `prefix-${generatedKey}`,
  }
);

onCache

The onCache parameter allows you to track cache statistics, such as hit rates. It is a callback function that receives information about each cache operation, including its status, key, parameters, and result. You can return false from onCache to prevent a cache hit.

import { cache, CacheTime } from '@module-federation/bridge-react';
import { fetchUserData } from './api';

// Track cache statistics.
const stats = {
  total: 0,
  hits: 0,
  misses: 0,
  stales: 0,
  hitRate: () => stats.hits / stats.total
};

const getUser = cache(
  fetchUserData,
  {
    maxAge: CacheTime.MINUTE * 5,
    onCache({ status, key, params, result }) {
      // status can be 'hit', 'miss', or 'stale'.
      stats.total++;

      if (status === 'hit') {
        stats.hits++;
      } else if (status === 'miss') {
        stats.misses++;
      } else if (status === 'stale') {
        stats.stales++;
      }

      console.log(`Cache ${status}, key: ${String(key)}`);
      console.log(`Current hit rate: ${stats.hitRate() * 100}%`);
    }
  }
);

// Example usage:
await getUser(1); // Cache miss.
await getUser(1); // Cache hit.
await getUser(2); // Cache miss.

The onCache callback receives an object with the following properties:

  • status: The status of the cache operation, which can be:
    • hit: Cache hit, returns cached content.
    • miss: Cache miss, executes the function and caches the result.
    • stale: Cache hit but the data is stale, returns cached content and revalidates in the background.
  • key: The cache key, which may be the result of customKey or the default generated key.
  • params: The arguments passed to the cached function.
  • result: The result data (either from the cache or newly computed).

This callback is only called when the options parameter is provided. When using a cache function without options, the onCache callback is not invoked.

The onCache callback is very useful for:

  • Monitoring cache performance.
  • Calculating hit rates.
  • Logging cache operations.
  • Implementing custom metrics.
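
As mentioned above, onCache can return false to prevent a cache hit. The following is a hedged sketch that uses this as a cache filter, under the assumption that returning false causes the cached value to be ignored for that call; verify the exact behavior against your version of the library:

import { cache, CacheTime } from '@module-federation/bridge-react';
import { fetchUserData } from './api';

const getUserStrict = cache(fetchUserData, {
  maxAge: CacheTime.MINUTE * 5,
  onCache({ status, result }) {
    // Assumption: returning false rejects the cached value for this call.
    // Here we refuse to serve cached results that look empty.
    if (status === 'hit' && result == null) {
      return false;
    }
    return true;
  },
});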

Storage

Currently, whether on the client or server, the cache is stored in memory. By default, the shared storage limit for all cache functions is 1GB. When this limit is reached, the LRU (Least Recently Used) algorithm is used to remove old caches.

INFO

Because the data cached by the cache function is generally not expected to be large, it is currently stored in memory by default.

You can specify the cache storage limit using the configureCache function:

import { configureCache, CacheSize } from '@module-federation/bridge-react';

configureCache({
  maxSize: CacheSize.MB * 10, // 10MB
});