# Quick Start
Get up and running with redis-graph-cache in 60 seconds.
## Step 1: Declare your schema
Define your entities, relations, and lists. Use `entity` for objects, `list` for small JSON-array collections, and `indexedList` for paginated, sorted, ZSET-backed feeds.
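Note that the `globalFeed` definition below uses `createdAt`, an ISO-8601 string, as its `scoreField`, while a Redis sorted set needs a numeric score. A reasonable mental model is a `Date.parse`-based conversion — a sketch of one plausible mapping, not necessarily the library's internals:

```typescript
// Hypothetical helper: derive a numeric ZSET score from an ISO timestamp.
// Date.parse returns milliseconds since the Unix epoch, so newer items
// receive higher scores; read with reverse: true for newest-first.
function toScore(createdAt: string): number {
  const ms = Date.parse(createdAt);
  if (Number.isNaN(ms)) {
    throw new Error(`Invalid scoreField value: ${createdAt}`);
  }
  return ms;
}

const older = toScore('2026-04-26T12:00:00Z');
const newer = toScore('2026-04-26T12:05:00Z');
console.log(newer > older); // true — newer posts sort later in the ZSET
```

Any monotonically increasing value works as a score; epoch milliseconds are just the most common choice for time-ordered feeds.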
```ts
const schema = {
  post: {
    type: 'entity' as const,
    id: 'id',
    key: (id: string | number) => `post:${id}`,
    fields: {
      title: { type: 'string' as const },
      content: { type: 'string' as const },
      createdAt: { type: 'string' as const },
    },
    relations: {
      author: { type: 'user', kind: 'one' as const },
      comments: { type: 'comment', kind: 'many' as const },
    },
    ttl: 3600, // seconds
  },
  user: {
    type: 'entity' as const,
    id: 'id',
    key: (id: string | number) => `user:${id}`,
    fields: { name: { type: 'string' as const } },
    ttl: 7200,
  },
  comment: {
    type: 'entity' as const,
    id: 'id',
    key: (id: string | number) => `comment:${id}`,
    fields: { body: { type: 'string' as const } },
    ttl: 1800,
  },
  // Plain JSON-array list — fine for short collections
  recentPosts: {
    type: 'list' as const,
    entityType: 'post',
    key: () => 'posts:recent',
    idField: 'id',
    ttl: 600,
  },
  // ZSET-backed list for the global feed
  globalFeed: {
    type: 'indexedList' as const,
    entityType: 'post',
    key: () => 'feed:global',
    idField: 'id',
    scoreField: 'createdAt',
    maxSize: 10_000,
    trackMembership: true,
    ttl: 86400,
  },
};
```

## Step 2: Instantiate the cache
Create the cache instance with your schema and configuration options.
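The `enableCompression` and `compressionThreshold` options below suggest that values are compressed only once their serialized size crosses the threshold. A minimal sketch of that gating decision — assumed behavior, measuring the byte length of the serialized JSON:

```typescript
// Hypothetical size gate: compress only when the serialized payload
// is at least `threshold` bytes (here, the 4096 used in the config).
function shouldCompress(value: unknown, threshold: number): boolean {
  const serialized = JSON.stringify(value);
  return Buffer.byteLength(serialized, 'utf8') >= threshold;
}

console.log(shouldCompress({ id: 1, title: 'Hello' }, 4096)); // false: tiny payload
console.log(shouldCompress({ body: 'x'.repeat(5000) }, 4096)); // true: over threshold
```

A threshold like this avoids paying compression CPU cost on small values where the savings would be negligible.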
```ts
import { RedisGraphCache } from 'redis-graph-cache';

const cache = new RedisGraphCache(schema, {
  redis: {
    host: process.env.REDIS_HOST || 'localhost',
    port: 6379,
    poolSize: 8, // optional; default 1
  },
  cache: {
    defaultTTL: 3600,
    enableCompression: true,
    compressionThreshold: 4096,
  },
});
```

## Step 3: Write data
Write entities with nested relations. The cache automatically normalizes the data.
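"Normalizes" here means the nested payload is split into flat, per-entity records before caching, so each entity is stored exactly once. Roughly like this — a simplified sketch using the schema's key patterns; the `normalize` helper below is hypothetical, not part of the library's API:

```typescript
type Entity = { id: number | string; [field: string]: unknown };

// Hypothetical normalizer: store the embedded author under its own key
// and keep only a reference on the post record.
function normalize(post: Entity & { author: Entity }) {
  const { author, ...rest } = post;
  return {
    [`post:${post.id}`]: { ...rest, authorId: author.id },
    [`user:${author.id}`]: author,
  };
}

const records = normalize({
  id: 1,
  title: 'Hello',
  content: 'World',
  createdAt: '2026-04-26T12:00:00Z',
  author: { id: 9, name: 'Ada' },
});
console.log(Object.keys(records)); // ['post:1', 'user:9']
```

Because the author lives under its own key, updating the user once updates it for every post that references it.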
```ts
// Write a post with embedded author
await cache.writeEntity('post', {
  id: 1,
  title: 'Hello',
  content: 'World',
  createdAt: '2026-04-26T12:00:00Z',
  author: { id: 9, name: 'Ada' },
});

// Atomic add to the global feed
await cache.addIndexedListItem('globalFeed', {}, {
  id: 2,
  title: 'Second',
  createdAt: '2026-04-26T12:05:00Z',
  author: { id: 9, name: 'Ada' },
});
```

## Step 4: Read data
Read entities with automatic hydration of relations.
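Hydration is the inverse of the normalization sketched in Step 3: referenced entities are fetched by key and re-embedded into the result. A rough model, with a hypothetical in-memory store standing in for Redis:

```typescript
// Stand-in for the normalized cache: key → record.
const store: Record<string, any> = {
  'post:1': { id: 1, title: 'Hello', content: 'World', authorId: 9 },
  'user:9': { id: 9, name: 'Ada' },
};

// Hypothetical hydrator: resolve the authorId reference back into a
// nested author object, mirroring what a hydrated read returns.
function hydratePost(id: number) {
  const { authorId, ...post } = store[`post:${id}`];
  return { ...post, author: store[`user:${authorId}`] };
}

console.log(hydratePost(1));
// → { id: 1, title: 'Hello', content: 'World', author: { id: 9, name: 'Ada' } }
```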
```ts
// Read a single post (author is hydrated automatically)
const post = await cache.readEntity('post', 1);
// → { id: 1, title: 'Hello', content: 'World', author: { id: 9, name: 'Ada' } }

// Paginated read from the global feed, newest-first
const latest = await cache.readIndexedList('globalFeed', {}, {
  limit: 50,
  reverse: true,
});
```

## Step 5: Invalidate data
Cascade invalidation removes entities from all tracked lists atomically.
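With `trackMembership: true`, the cache can record which lists each entity belongs to, so invalidation knows exactly which lists to clean up. A toy in-memory model of that bookkeeping — hypothetical; the real library would issue the equivalent DEL/ZREM commands against Redis:

```typescript
// In-memory stand-ins for the Redis structures.
const entities = new Map<string, object>([['post:1', { id: 1, title: 'Hello' }]]);
const feeds = new Map<string, Set<string>>([['feed:global', new Set(['post:1'])]]);
// Reverse index maintained on insert when membership tracking is enabled.
const membership = new Map<string, Set<string>>([['post:1', new Set(['feed:global'])]]);

// Hypothetical cascade: delete the entity, then remove it from every
// list the membership index says it appears in.
function invalidate(key: string): void {
  entities.delete(key);
  for (const listKey of membership.get(key) ?? []) {
    feeds.get(listKey)?.delete(key);
  }
  membership.delete(key);
}

invalidate('post:1');
console.log(entities.has('post:1')); // false
console.log(feeds.get('feed:global')!.has('post:1')); // false — no stale feed entry
```

Without the reverse index, cleaning up would require scanning every list; the membership set makes the cascade proportional to the lists the entity actually belongs to.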
```ts
// Removes the entity AND ZREMs from every tracked indexed list
await cache.invalidateEntity('post', 1);
```

## Step 6: Monitor and disconnect
Get real metrics and close connections gracefully on shutdown.
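The `hitRate` field is presumably derived from the hit and miss counters; a minimal sketch of that arithmetic, under the assumed definition of hits as a fraction of all lookups:

```typescript
// Hypothetical hit-rate calculation, guarding against division by zero
// before any traffic has arrived.
function hitRate(cacheHits: number, cacheMisses: number): number {
  const total = cacheHits + cacheMisses;
  return total === 0 ? 0 : cacheHits / total;
}

console.log(hitRate(75, 25)); // 0.75
console.log(hitRate(0, 0));   // 0 (no operations yet)
```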
```ts
// Real metrics (no fake zeros)
console.log(cache.getMetrics());
// {
//   cacheHits, cacheMisses, hitRate, totalOperations,
//   avgResponseTime, memoryUsage, activeConnections,
//   failedOperations, lastUpdated
// }

// Health check
console.log(await cache.getHealthStatus());

// Shutdown
await cache.disconnect();
```

## You're up and running!
In just 60 seconds, you've:
- Defined a schema with entities, relations, and lists
- Connected to Redis with connection pooling
- Written nested data with automatic normalization
- Read data with automatic hydration
- Used cascade invalidation to keep lists in sync
- Monitored cache performance with real metrics