
cachly – Rust SDK

Official Rust SDK for cachly.dev
Managed Valkey/Redis cache built for AI apps. GDPR-compliant · German servers · Live in 30 seconds.


Installation

```sh
cargo add cachly
```

Or add manually to Cargo.toml:

```toml
[dependencies]
cachly = "0.1.0-beta.1"
tokio  = { version = "1", features = ["full"] }
```

Quick Start

```rust
use cachly::{CachlyClient, Result};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct User { name: String, plan: String }

#[tokio::main]
async fn main() -> Result<()> {
    let cache = CachlyClient::new("redis://:password@my-instance.cachly.dev:6379").await?;

    // Set with a 300-second TTL
    cache.set("user:42", &User { name: "Alice".into(), plan: "pro".into() }, Some(300)).await?;

    // Get
    let user: Option<User> = cache.get("user:42").await?;

    // Exists
    let exists = cache.exists("user:42").await?;

    // Atomic increment
    let count = cache.incr("page:views").await?;

    // Delete
    cache.del(&["user:42"]).await?;

    Ok(())
}
```

Get-or-Set Pattern

```rust
let data: String = cache
    .get_or_set(
        "expensive:query",
        || async { Ok(run_db_query().await) },
        Some(60),
    )
    .await?;
```
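Conceptually, `get_or_set` is a cache-aside helper: on a hit it returns the cached value; on a miss it runs the closure, stores the result (under the given TTL), and returns it, so the expensive computation runs at most once per expiry window. A minimal stdlib-only sketch of that logic, assuming a synchronous in-memory store without TTL handling (the `Cache` type and its internals here are illustrative, not the SDK's implementation):

```rust
use std::collections::HashMap;

// Illustrative in-memory store; the real SDK talks to Valkey/Redis
// asynchronously and also handles TTL, which this sketch omits.
struct Cache {
    store: HashMap<String, String>,
}

impl Cache {
    fn new() -> Self {
        Cache { store: HashMap::new() }
    }

    // Cache-aside: return the cached value on a hit,
    // otherwise compute, store, and return it.
    fn get_or_set<F>(&mut self, key: &str, compute: F) -> String
    where
        F: FnOnce() -> String,
    {
        if let Some(v) = self.store.get(key) {
            return v.clone(); // hit: no recomputation
        }
        let v = compute();
        self.store.insert(key.to_string(), v.clone());
        v
    }
}

fn main() {
    let mut cache = Cache::new();
    let first = cache.get_or_set("expensive:query", || "result".to_string());
    // Second call is served from the cache; the closure never runs.
    let second = cache.get_or_set("expensive:query", || unreachable!());
    assert_eq!(first, second);
    println!("{first}");
}
```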

Semantic AI Cache (Speed / Business tiers)

```rust
use cachly::{CachlyClient, SemanticOptions};

let client = CachlyClient::new(&std::env::var("CACHLY_URL")?).await?;

// Your embedding function (OpenAI, Fastembed, local model, …)
let embed = |text: String| async move {
    openai_embed(&text).await.map_err(|e| cachly::CachlyError::Embed(e.to_string()))
};

let sem = client.semantic(embed);
let result = sem
    .get_or_set(
        &user_prompt,
        || async { call_llm(&user_prompt).await },
        SemanticOptions { threshold: 0.90, ttl: Some(3600), ..Default::default() },
    )
    .await?;

println!("{} {}", if result.hit { "⚡" } else { "🔄" }, result.value);
```
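The `threshold` above controls how close an incoming prompt's embedding must be to a stored one to count as a hit. Assuming the similarity measure is cosine similarity (a common choice for semantic caches; the source does not specify the metric), a stdlib-only sketch of that comparison, with illustrative vectors:

```rust
// Cosine similarity between two embedding vectors of equal length.
fn cosine_similarity(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (norm_a * norm_b)
}

fn main() {
    // Hypothetical embeddings of two near-identical prompts.
    let cached = [0.6, 0.8, 0.0];
    let incoming = [0.59, 0.81, 0.01];
    let threshold = 0.90;

    let sim = cosine_similarity(&incoming, &cached);
    // Nearly identical embeddings score close to 1.0, above the threshold.
    assert!(sim >= threshold);
    println!("similarity = {sim:.4} -> {}", if sim >= threshold { "hit" } else { "miss" });
}
```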

API Reference

CachlyClient

| Method | Signature | Description |
| --- | --- | --- |
| `new` | `async fn new(url: &str) -> Result<Self>` | Connect to cache |
| `get` | `async fn get<T>(key: &str) -> Result<Option<T>>` | Get value |
| `set` | `async fn set<T>(key: &str, value: &T, ttl: Option<u64>) -> Result<()>` | Set value |
| `del` | `async fn del(keys: &[&str]) -> Result<usize>` | Delete keys |
| `exists` | `async fn exists(key: &str) -> Result<bool>` | Check existence |
| `expire` | `async fn expire(key: &str, seconds: u64) -> Result<bool>` | Update TTL |
| `incr` | `async fn incr(key: &str) -> Result<i64>` | Atomic increment |
| `get_or_set` | `async fn get_or_set<T, F, Fut>(...) -> Result<T>` | Get-or-set pattern |
| `semantic` | `fn semantic<EF>(embed_fn: EF) -> SemanticCache<EF>` | Semantic cache |

Batch API — Multiple Ops in One Round-Trip

Bundle GET/SET/DEL/EXISTS/TTL operations into one HTTP request or Redis pipeline.

```rust
use cachly::{CachlyClient, BatchOp};

let client = CachlyClient::new_with_batch(
    &std::env::var("CACHLY_URL")?,
    std::env::var("CACHLY_BATCH_URL").ok().as_deref(),
).await?;

let results = client.batch(vec![
    BatchOp::get("user:1"),
    BatchOp::get("config:app"),
    BatchOp::set("visits", "42", Some(86400)),
    BatchOp::exists("session:xyz"),
    BatchOp::ttl("token:abc"),
]).await?;

// results[0].value     → Option<String>
// results[2].ok        → Option<bool>
// results[3].exists    → Option<bool>
// results[4].ttl_secs  → Option<i64>  (-1 = no TTL, -2 = key missing)
```
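The `ttl_secs` sentinels follow the usual Redis `TTL` convention: `-2` means the key does not exist, `-1` means it exists but has no expiry, and any non-negative value is the remaining lifetime in seconds. A small illustrative helper that makes those cases explicit (the `Ttl` enum is ours, not part of the SDK):

```rust
// Redis-style TTL sentinels: -2 = missing key, -1 = no expiry.
#[derive(Debug, PartialEq)]
enum Ttl {
    Missing,
    NoExpiry,
    Expires(u64), // remaining seconds
}

fn interpret_ttl(ttl_secs: i64) -> Ttl {
    match ttl_secs {
        -2 => Ttl::Missing,
        -1 => Ttl::NoExpiry,
        s if s >= 0 => Ttl::Expires(s as u64),
        _ => Ttl::Missing, // other negative values are not expected
    }
}

fn main() {
    assert_eq!(interpret_ttl(-2), Ttl::Missing);
    assert_eq!(interpret_ttl(-1), Ttl::NoExpiry);
    assert_eq!(interpret_ttl(86_400), Ttl::Expires(86_400));
    println!("{:?}", interpret_ttl(86_400));
}
```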

AI Dev Brain — Persistent Memory for Your Coding Assistant

cachly ships a 30-tool MCP server that gives Claude Code, Cursor, GitHub Copilot, and Windsurf persistent memory across sessions, so they never forget your architecture, lessons learned, or last-session context.

```sh
npx @cachly-dev/init
```

Or configure manually in your editor (~/.vscode/mcp.json / .cursor/mcp.json):

```json
{
  "servers": {
    "cachly": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@cachly-dev/mcp-server"],
      "env": { "CACHLY_JWT": "your-jwt-token" }
    }
  }
}
```

`session_start(instance_id, focus)` returns a full briefing in one call: last session summary, relevant lessons, open failures, brain health. 60% fewer file reads, instant context, zero re-discovery.

→ Full docs: cachly.dev/docs/ai-memory


Environment Variables

```sh
CACHLY_URL=redis://:your-password@my-app.cachly.dev:30101
CACHLY_BATCH_URL=https://api.cachly.dev/v1/cache/YOUR_TOKEN   # optional
# Speed / Business tier – Semantic AI Cache:
CACHLY_VECTOR_URL=https://api.cachly.dev/v1/sem/your-vector-token
```

Find these values in your cachly.dev dashboard.

License

MIT © cachly.dev
