Core Library Documentation
The foundation of Count Cachula - a lightweight cache-first library using the stale-while-revalidate pattern.
Installation
```bash
npm install @countcachula/core
```

Quick Setup
```javascript
import * as CountCachula from '@countcachula/core';

// Create a request
const request = new Request('/api/users');

// Fetch with cache-first strategy
const observable = CountCachula.fetch(request);

// Subscribe to updates
observable.observe(async (response) => {
  const data = await response.json();
  console.log('Data received:', data);
});
```

Core Concepts
Stale-While-Revalidate Pattern
The stale-while-revalidate pattern is at the heart of Count Cachula. It provides instant responses from cache while fetching fresh data in the background:
- Check Cache - Look for existing cached response
- Return Immediately - If found, return cached data instantly
- Background Update - Simultaneously fetch fresh data from network
- Notify Observers - When fresh data arrives, notify all subscribers
This pattern eliminates loading spinners for returning users while ensuring they always get fresh data.
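The four steps above can be sketched as follows. This is a simplified illustration, not the library's actual implementation; `cacheGet`, `cachePut`, and `networkFetch` are hypothetical stand-ins for the Cache API and `fetch` so the flow is easy to follow:

```javascript
// Simplified stale-while-revalidate flow (illustrative sketch, not the library's source).
// cacheGet, cachePut, and networkFetch stand in for the Cache API and fetch().
async function staleWhileRevalidate(request, { cacheGet, cachePut, networkFetch }, notify) {
  // 1. Check cache for an existing response
  const cached = await cacheGet(request);

  // 2. Return it immediately if found
  if (cached !== undefined) notify(cached);

  // 3. Fetch fresh data from the network regardless
  const fresh = await networkFetch(request);
  await cachePut(request, fresh);

  // 4. Notify observers with the fresh response
  notify(fresh);
}
```

On a cache hit, `notify` fires twice (stale first, then fresh); on a miss it fires once with the network response.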
CacheObservable Architecture
CacheObservable is a custom observable implementation designed specifically for the cache-first pattern:
- Lazy Execution - Network requests only start when the first observer subscribes
- Multiple Observers - Many components can observe the same request without duplicate network calls
- Two-Value Stream - Emits up to 2 values: cached response (if available) + fresh response
- Automatic Cleanup - Returns unsubscribe function for proper resource management
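A minimal sketch of how a lazy, multi-observer observable with these properties could be built (illustrative only; the real `CacheObservable` internals may differ):

```javascript
// Illustrative lazy observable (not the library's actual implementation).
class CacheObservable {
  constructor(producer) {
    this.producer = producer;   // work starts only on first subscription
    this.observers = new Set();
    this.started = false;
    this.emitted = [];          // replayed to late subscribers
  }

  observe(callback) {
    this.observers.add(callback);

    // Late subscribers still receive values emitted earlier
    this.emitted.forEach(callback);

    // Lazy execution: the producer runs once, on the first observe() call
    if (!this.started) {
      this.started = true;
      this.producer((value) => {
        this.emitted.push(value);
        this.observers.forEach((cb) => cb(value));
      });
    }

    // Unsubscribe function for cleanup
    return () => this.observers.delete(callback);
  }
}
```

Because the producer runs only once, any number of observers share a single underlying request, and each receives at most the cached value plus the fresh value.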
How Caching Works
Count Cachula leverages the browser's Cache API for persistent storage:
```javascript
// Internally, Count Cachula:
// 1. Creates a cache key from the request
// 2. Checks the Cache API for existing response
// 3. Returns cloned response if found
// 4. Stores new responses in cache
```

The Cache API provides:
- Persistent storage across sessions
- Automatic request/response matching
- Proper handling of headers and methods
- Built-in cache expiration support (planned)
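As a rough sketch of that lookup-and-store step, a cache-first read over the browser's Cache API might look like this (illustrative only, not the library's actual code; the cache name `'count-cachula'` is an assumption for this example):

```javascript
// Illustrative cache-first lookup built on the Cache API.
// The cache name 'count-cachula' is an assumption, not the library's real key.
async function cacheFirst(request) {
  const cache = await caches.open('count-cachula');

  // Check the Cache API for an existing response
  const cached = await cache.match(request);
  if (cached) {
    // Return a clone so the stored copy remains readable
    return cached.clone();
  }

  // Nothing cached: go to the network and store a copy for next time
  const fresh = await fetch(request);
  await cache.put(request, fresh.clone());
  return fresh;
}
```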
Request/Response Lifecycle
Cache Hit Flow
Request → cache lookup (hit) → cached response emitted to observers immediately → network fetch continues in the background → cache updated → fresh response emitted.
Cache Miss Flow
Request → cache lookup (miss) → network fetch → response stored in cache → fresh response emitted (single emission).
API Reference
fetch(request: Request): CacheObservable<Response>
Main entry point for making cache-first requests.
Parameters:
request: Request - Standard Fetch API Request object
Returns:
CacheObservable<Response> - Observable that emits Response objects
Example:
```javascript
const request = new Request('/api/data', {
  headers: { 'Accept': 'application/json' }
});

const observable = CountCachula.fetch(request);
```

CacheObservable<T>
Observable implementation for handling cached and fresh responses.
observe(callback): unsubscribe
Subscribe to response updates.
Parameters:
callback: (data: T) => void - Function called with each response
Returns:
() => void - Unsubscribe function for cleanup
Behavior:
- First call with cached data (if available)
- Second call with fresh data from network
- Responses are cloned for safe multiple reads
Example:
```javascript
const unsubscribe = observable.observe(async (response) => {
  // Called up to twice: once with cache, once with fresh
  const data = await response.json();
  updateUI(data);
});

// Clean up when done
unsubscribe();
```

Guides
Basic Usage Patterns
Simple GET Request
```javascript
const observable = CountCachula.fetch(new Request('/api/items'));

observable.observe(async (response) => {
  const items = await response.json();
  console.log('Items:', items);
});
```

With Request Options
```javascript
const request = new Request('/api/user', {
  headers: {
    'Authorization': 'Bearer token123',
    'Accept': 'application/json'
  }
});

const observable = CountCachula.fetch(request);
```

Error Handling
Handle errors gracefully with try-catch:
```javascript
observable.observe(async (response) => {
  try {
    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }
    const data = await response.json();
    updateUI(data);
  } catch (error) {
    console.error('Failed to process response:', error);
    showErrorMessage(error.message);
  }
});
```

Framework Integration
Vanilla JavaScript
```javascript
class DataManager {
  constructor() {
    this.unsubscribers = [];
  }

  loadData(url) {
    const request = new Request(url);
    const observable = CountCachula.fetch(request);

    const unsubscribe = observable.observe(async (response) => {
      const data = await response.json();
      this.render(data);
    });

    this.unsubscribers.push(unsubscribe);
  }

  render(data) {
    document.getElementById('content').innerHTML = data
      .map(item => `<div>${item.name}</div>`)
      .join('');
  }

  cleanup() {
    this.unsubscribers.forEach(unsub => unsub());
  }
}
```

Web Components
```javascript
class DataList extends HTMLElement {
  connectedCallback() {
    const observable = CountCachula.fetch(
      new Request(this.getAttribute('data-url'))
    );

    this.unsubscribe = observable.observe(async (response) => {
      const data = await response.json();
      this.innerHTML = data
        .map(item => `<li>${item.name}</li>`)
        .join('');
    });
  }

  disconnectedCallback() {
    this.unsubscribe?.();
  }
}

customElements.define('data-list', DataList);
```

Advanced Topics
Custom Cache Keys
Count Cachula automatically generates cache keys from requests:
```javascript
// These create different cache entries:
new Request('/api/users?sort=name');
new Request('/api/users?sort=date');

// Headers also affect cache key:
new Request('/api/data', {
  headers: { 'Accept-Language': 'en' }
});
new Request('/api/data', {
  headers: { 'Accept-Language': 'fr' }
});
```

Response Cloning
Responses are automatically cloned to allow multiple reads:
```javascript
observable.observe(async (response) => {
  // First read
  const json = await response.json();

  // Would normally fail, but Count Cachula clones responses
  observable.observe(async (response2) => {
    const json2 = await response2.json(); // Works!
  });
});
```

Memory Management
Best practices for preventing memory leaks:
```javascript
class Component {
  constructor() {
    this.subscriptions = new Set();
  }

  fetch(url) {
    const observable = CountCachula.fetch(new Request(url));

    const unsubscribe = observable.observe(response => {
      // Handle response
    });

    this.subscriptions.add(unsubscribe);
  }

  destroy() {
    // Clean up all subscriptions
    this.subscriptions.forEach(unsub => unsub());
    this.subscriptions.clear();
  }
}
```

Examples
Polling with Cache
```javascript
function pollEndpoint(url, interval = 5000) {
  const request = new Request(url);

  const poll = () => {
    CountCachula.fetch(request).observe(async (response) => {
      const data = await response.json();
      updateDashboard(data);
    });
  };

  poll(); // Initial fetch
  return setInterval(poll, interval);
}

// Start polling
const intervalId = pollEndpoint('/api/stats', 10000);

// Stop polling
clearInterval(intervalId);
```

Dependent Requests
```javascript
// First request
CountCachula.fetch(new Request('/api/user')).observe(async (response) => {
  const user = await response.json();

  // Second request depends on first
  const teamsRequest = new Request(`/api/users/${user.id}/teams`);
  CountCachula.fetch(teamsRequest).observe(async (teamsResponse) => {
    const teams = await teamsResponse.json();
    displayUserWithTeams(user, teams);
  });
});
```

Batch Preloading
```javascript
function preloadDetails(items) {
  items.forEach(item => {
    // Start fetching but don't wait for response
    const request = new Request(`/api/items/${item.id}`);
    CountCachula.fetch(request).observe(() => {
      // Cache is now warm for this item
    });
  });
}

// Later, when user clicks an item, it loads instantly from cache
function showItemDetail(itemId) {
  const request = new Request(`/api/items/${itemId}`);
  CountCachula.fetch(request).observe(async (response) => {
    const detail = await response.json();
    displayDetail(detail); // Instant if preloaded!
  });
}
```

Request Deduplication
Multiple components requesting the same data trigger only ONE network request:
```javascript
// Component A
CountCachula.fetch(new Request('/api/config')).observe(response => {
  // Handle config
});

// Component B (simultaneously)
CountCachula.fetch(new Request('/api/config')).observe(response => {
  // Gets same response, no duplicate request
});

// Component C (later)
CountCachula.fetch(new Request('/api/config')).observe(response => {
  // Gets cached version immediately, then fresh update
});
```