
Cachly.Client – .NET SDK

Official .NET SDK for cachly.dev – Managed Valkey/Redis cache.

GDPR-compliant · German servers · Live in 30 seconds

Installation

dotnet add package Cachly.Client

Targets .NET 8 and later. Uses StackExchange.Redis.
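Alternatively, reference the package directly in your project file (the version wildcard below is illustrative — pin a concrete version in production):

```xml
<ItemGroup>
  <PackageReference Include="Cachly.Client" Version="1.*" />
</ItemGroup>
```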

Quick Start

using Cachly;

await using var cache = await CachlyClient.ConnectAsync("redis://:password@my-instance.cachly.dev:6379");

// Set with TTL
await cache.SetAsync("user:42", new UserDto("Alice", "pro"), ttl: TimeSpan.FromMinutes(5));

// Get
var user = await cache.GetAsync<UserDto>("user:42");

// Exists
bool exists = await cache.ExistsAsync("user:42");

// Atomic increment
long count = await cache.IncrAsync("page:views");

// Delete
await cache.DelAsync("user:42");

Get-or-Set Pattern

var report = await cache.GetOrSetAsync(
    key: "expensive:report",
    fn: () => db.RunExpensiveReportAsync(),
    ttl: TimeSpan.FromSeconds(60));

Semantic AI Cache (Speed / Business tiers)

Cache LLM responses by meaning, not just by exact key — repeated and paraphrased prompts can cut your OpenAI costs by up to 60%.

using Cachly;

await using var cache = await CachlyClient.ConnectAsync(
    Environment.GetEnvironmentVariable("CACHLY_URL")!);

// Provide your own embedding function.
// Illustrative only — adapt the call to your embedding provider's SDK;
// openAI here stands in for your configured embedding client.
async Task<double[]> EmbedAsync(string text)
{
    var response = await openAI.GetEmbeddingsAsync(
        new EmbeddingGenerationOptions { Model = "text-embedding-3-small" }, text);
    float[] floats = response.Value[0].ToFloats().ToArray();
    return Array.ConvertAll(floats, f => (double)f);
}

var result = await cache.Semantic.GetOrSetAsync<string>(
    prompt: userQuestion,
    fn: () => openAI.AskAsync(userQuestion),
    embedFn: EmbedAsync,
    options: new SemanticOptions { SimilarityThreshold = 0.90, Ttl = TimeSpan.FromHours(1) });

Console.WriteLine(result.Hit
    ? $"⚡ Cache hit (similarity={result.Similarity:F3})"
    : "🔄 Fresh from LLM");
Console.WriteLine(result.Value);

Dependency Injection (ASP.NET Core)

// Program.cs — connect once at startup and register the instance.
// (Top-level statements in .NET 6+ can await directly.)
var cachly = await CachlyClient.ConnectAsync(builder.Configuration["Cachly:Url"]!);
builder.Services.AddSingleton(cachly);

// Or register a factory — blocks once, on first resolution
builder.Services.AddSingleton<CachlyClient>(sp =>
    CachlyClient.ConnectAsync(sp.GetRequiredService<IConfiguration>()["Cachly:Url"]!)
        .GetAwaiter().GetResult());
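Once registered, the client can be injected anywhere DI reaches. A minimal sketch of consuming it from an ASP.NET Core minimal-API endpoint — the route, `db`, and its `LoadUserAsync` method are hypothetical placeholders for your own data access:

```csharp
// Resolve CachlyClient from DI as an endpoint parameter
app.MapGet("/users/{id}", async (string id, CachlyClient cache) =>
{
    // Try the cache first; fall back to the database on a miss
    var user = await cache.GetOrSetAsync(
        key: $"user:{id}",
        fn: () => db.LoadUserAsync(id),
        ttl: TimeSpan.FromMinutes(5));

    return user is null ? Results.NotFound() : Results.Ok(user);
});
```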

API Reference

CachlyClient

Method                           Description
ConnectAsync(url)                Connect and return a new client
GetAsync<T>(key)                 Get a value (null if not found)
SetAsync<T>(key, value, ttl?)    Set a value
DelAsync(keys[])                 Delete keys; returns the count deleted
ExistsAsync(key)                 Check existence
ExpireAsync(key, ttl)            Update TTL
IncrAsync(key)                   Atomic increment
GetOrSetAsync<T>(key, fn, ttl?)  Get-or-set pattern
Semantic                         SemanticCache helper for AI workloads
Raw                              Direct StackExchange.Redis IDatabase access
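Raw drops down to the underlying connection for anything the typed API doesn't cover. A minimal sketch, assuming Raw exposes a StackExchange.Redis IDatabase (key names are illustrative):

```csharp
// Use the raw IDatabase for commands the typed API doesn't wrap,
// e.g. Redis list operations for a simple job queue
IDatabase db = cache.Raw;
await db.ListLeftPushAsync("jobs:pending", "job-123");
long queueLength = await db.ListLengthAsync("jobs:pending");
```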

Batch API — Multiple Ops in One Round-Trip

Bundle GET/SET/DEL/EXISTS/TTL operations into one HTTP request or Redis pipeline.

await using var cache = await CachlyClient.ConnectAsync(
    Environment.GetEnvironmentVariable("CACHLY_URL")!,
    batchUrl: Environment.GetEnvironmentVariable("CACHLY_BATCH_URL")); // optional

var results = await cache.BatchAsync(new[]
{
    CachlyBatchOp.Get("user:1"),
    CachlyBatchOp.Get("config:app"),
    CachlyBatchOp.Set("visits", "42", ttl: TimeSpan.FromDays(1)),
    CachlyBatchOp.Exists("session:xyz"),
    CachlyBatchOp.Ttl("token:abc"),
});

string? user  = results[0].Value;       // null on miss
bool    ok    = results[2].Ok ?? false;
bool    found = results[3].Exists ?? false;
long    secs  = results[4].TtlSeconds ?? -2; // -1 = no TTL, -2 = key missing

Without batchUrl, the client falls back to a StackExchange.Redis pipeline.

Environment Variables

CACHLY_URL=redis://:your-password@my-app.cachly.dev:30101
CACHLY_BATCH_URL=https://api.cachly.dev/v1/cache/YOUR_TOKEN   # optional
# Speed / Business tier – Semantic AI Cache:
CACHLY_VECTOR_URL=https://api.cachly.dev/v1/sem/your-vector-token

Find these values in your cachly.dev dashboard.

AI Dev Brain — Persistent Memory for Your Coding Assistant

cachly ships a 30-tool MCP server that gives Claude Code, Cursor, GitHub Copilot, and Windsurf a persistent memory across sessions.

npx @cachly-dev/init

session_start(instance_id, focus) returns a full briefing in one call: last session summary, relevant lessons, open failures, brain health.

→ Full docs: cachly.dev/docs/ai-memory


MIT © cachly.dev
