
Muscle Memory Cache for Agents
We’re building a new type of database, creating structure out of natural language, and we need your help!
In this role, you’d be the lead engineer on Butter’s data layer, building a distributed storage and query engine that scales to thousands of RPS.
Requires: Experience building complex systems in compiled systems languages (Rust, Zig, C++) and familiarity with kernel APIs & syscalls, file formats, consensus algorithms, and more.
Impressive side projects are as valid as work experience; all that matters is curiosity and the ability to go deep.
Butter is building an LLM proxy that records and deterministically replays tool call trajectories. Our goal is to get LLMs out of the hot path for repetitive tasks, increasing speed, reducing variability, and eliminating token costs for the many cases that could have just been a script.
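To make that concrete, here is a minimal sketch in Rust of what a recorded tool-call trajectory could look like. The types and field names are illustrative assumptions, not our actual schema:

```rust
// Illustrative sketch only: types and field names are assumptions, not Butter's schema.

/// A single tool call the LLM answered with, as it was originally emitted.
struct ToolCall {
    name: String,      // e.g. "click" or "read_file"
    arguments: String, // JSON-encoded arguments, stored verbatim for exact replay
}

/// One step of a trajectory: a fingerprint of the conversation state that produced
/// a response, and the response to replay when that same state is seen again.
struct Step {
    request_fingerprint: u64,
    response: ToolCall,
}

/// A full recorded trajectory for one repeat workflow.
struct Trajectory {
    task_id: String,
    steps: Vec<Step>,
}
```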
Why
We discovered this problem first-hand while building computer-use agents. We realized that many process-automation tasks are deeply repetitive, simple data transformations that could be run as scripts. Critically, what pulled users toward agents was not replacing these scripts with agents, but using agents to discover the scripts and self-heal them when new edge cases are encountered.
We believe these cases extend beyond computer use to any agent tasked with performing repeated workflows. Learn a skill once, run it forever.
As an LLM proxy, we stand in for the LLM, spoofing responses to deterministically guide agents down cached paths and cleanly falling back to actual LLMs on a cache miss.
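Here is a minimal sketch, in Rust, of the cache-or-fallback decision at the core of that proxy, assuming a simple hash of the conversation prefix as the cache key. Every name here is illustrative rather than our actual implementation:

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

// Illustrative sketch only: names and the prefix-hash cache key are assumptions,
// not Butter's actual implementation.

type Fingerprint = u64;

/// Hash the messages the agent has sent so far; identical prefixes map to the same key.
fn fingerprint(messages: &[String]) -> Fingerprint {
    let mut hasher = DefaultHasher::new();
    for message in messages {
        message.hash(&mut hasher);
    }
    hasher.finish()
}

/// Recorded LLM responses keyed by conversation prefix.
struct MuscleMemoryCache {
    cached: HashMap<Fingerprint, String>,
}

impl MuscleMemoryCache {
    /// Replay a cached response on a hit; call the real LLM and record it on a miss.
    fn respond(
        &mut self,
        messages: &[String],
        call_real_llm: impl Fn(&[String]) -> String,
    ) -> String {
        let key = fingerprint(messages);
        if let Some(replayed) = self.cached.get(&key) {
            // Cache hit: spoof the LLM's response, keeping the model out of the hot path.
            return replayed.clone();
        }
        // Cache miss: fall back to an actual LLM, then record the result for next time.
        let fresh = call_real_llm(messages);
        self.cached.insert(key, fresh.clone());
        fresh
    }
}
```

Keying on the full conversation prefix is one way to keep replay deterministic: a cached response is only spoofed when the agent is in exactly the state that produced it, and anything else falls through to a real LLM.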