CacheKit Docs

High-performance cache policies and supporting data structures.


Quickstart

Get up and running with CacheKit in minutes.

Install

Add CacheKit to your Cargo.toml:

[dependencies]
cachekit = "0.3"

Each eviction policy is gated behind a feature flag. The default features include policy-s3-fifo, policy-lru, policy-fast-lru, policy-lru-k, and policy-clock. For minimal builds, use default-features = false and enable only the policies you need:

[dependencies]
cachekit = { version = "0.3", default-features = false, features = ["policy-lru"] }

See Compatibility and Features for the full list of per-policy feature flags.

Build Your First Cache

Use CacheBuilder with an eviction policy:

use cachekit::builder::{CacheBuilder, CachePolicy};

fn main() {
    // Create an LRU cache with capacity 100
    let mut cache = CacheBuilder::new(100).build::<u64, String>(CachePolicy::Lru);

    // Insert values
    cache.insert(1, "alpha".to_string());
    cache.insert(2, "beta".to_string());

    // Read values (updates recency)
    assert_eq!(cache.get(&1), Some(&"alpha".to_string()));

    // Check presence and size
    assert!(cache.contains(&2));
    assert_eq!(cache.len(), 2);
}

Pick a Policy

Start with one of the policies enabled by default: LRU, fast-LRU, LRU-K, CLOCK, or S3-FIFO.

See the full policy guidance in Policy overview.

Thread Safety

Cache implementations are not thread-safe by default. Wrap with a lock for shared access, or enable the concurrency feature for built-in concurrent wrappers:

use std::sync::{Arc, Mutex};
use cachekit::builder::{CacheBuilder, CachePolicy};

let cache = Arc::new(Mutex::new(
    CacheBuilder::new(100).build::<u64, String>(CachePolicy::Lru)
));

let cache_clone = cache.clone();
std::thread::spawn(move || {
    let mut guard = cache_clone.lock().unwrap();
    guard.insert(1, "value".to_string());
});

Enable the feature:

[dependencies]
cachekit = { version = "0.3", features = ["concurrency"] }

Tiny example with a built-in concurrent wrapper:

use cachekit::ds::ConcurrentClockRing;

let cache = ConcurrentClockRing::new(2);
cache.insert("a", 1);
cache.insert("b", 2);

assert_eq!(cache.get(&"a"), Some(1));

Metrics (Optional)

Enable the metrics feature to collect basic hit/miss data:

[dependencies]
cachekit = { version = "0.3", features = ["metrics"] }

Then record a few accesses and read a snapshot:

use cachekit::policy::lru::LruCore;
use cachekit::traits::CoreCache;

let mut cache = LruCore::<u64, String>::new(100);
cache.insert(1, "value".to_string());
cache.get(&1);

#[cfg(feature = "metrics")]
{
    let metrics = cache.metrics_snapshot();
    println!("Hits: {}", metrics.hits());
    println!("Misses: {}", metrics.misses());
    println!("Hit rate: {:.2}%", metrics.hit_rate() * 100.0);
}

Direct Policy Access

Use a policy directly when you need policy-specific operations:

use cachekit::policy::lru::LruCore;
use cachekit::traits::{CoreCache, LruCacheTrait};

let mut cache: LruCore<u64, &str> = LruCore::new(100);
cache.insert(1, "value");

if let Some((key, _)) = cache.peek_lru() {
    println!("LRU key: {}", key);
}

Next Steps