
Multi-agent coordination without an SDK


I was running Blackreach, a Claude Code session, and a couple of other tools at the same time. None of them knew the others existed. Blackreach would finish a research task, and everything it found died with the session. The coding session started every conversation blind. I had no visibility into what any of them were doing.

Every multi-agent framework I looked at wanted to fix this by making me rewrite everything around its SDK: LangGraph, CrewAI, AutoGen. All of them want to own the infrastructure. That wasn't what I needed.

I needed a layer underneath. Something existing tools pass through without modification.

How the proxy works

Velqua sits between any app and its LLM provider. It intercepts every API call, injects persistent memory into the context, and forwards the request. The agent calls Ollama or Claude as normal. Velqua handles the rest. Zero code changes.

# Before: agent calls provider directly
curl http://localhost:11434/api/chat   # Ollama

# After: change one port number
curl http://localhost:11435/api/chat   # Velqua proxy → Ollama
                                        # memory injected automatically
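The core move is small enough to sketch. This is not Velqua's actual code; `inject_memory` and the message shape are assumptions based on the Ollama-style chat format shown above. The proxy's only job at this layer is to prepend stored context as a system message and pass everything else through untouched.

```python
def inject_memory(request_body: dict, memory: str) -> dict:
    """Prepend persistent memory as a system message; leave the rest unchanged."""
    body = dict(request_body)
    body["messages"] = [{"role": "system", "content": memory}] + list(
        body.get("messages", [])
    )
    return body

# An agent's ordinary chat request, exactly as it would send to Ollama:
original = {"model": "llama3", "messages": [{"role": "user", "content": "hi"}]}

# What the provider actually receives after the proxy touches it:
injected = inject_memory(original, "Findings from earlier sessions go here.")
```

The agent built `original`; it never sees `injected`. That asymmetry is the whole trick: the memory lives in the proxy, not in any agent's code.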

The Mesh layer extends this. Instead of injecting one user's personal memory, it injects shared knowledge from a pool that all agents can read and write to. Blackreach finishes a task and writes its findings to the pool. The next agent to make a request gets those findings in its context. It doesn't know Blackreach exists. It just knows.
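A minimal sketch of the pool, assuming an in-memory store and made-up method names (`write`, `context_for`); the real Mesh layer presumably persists this and scopes it, but the read/write flow is the same shape:

```python
import time

class SharedPool:
    """One knowledge pool that every agent behind the proxy reads and writes."""

    def __init__(self):
        self.entries = []  # (timestamp, agent, text)

    def write(self, agent: str, text: str) -> None:
        self.entries.append((time.time(), agent, text))

    def context_for(self, limit: int = 5) -> str:
        # Most recent findings, injected into whichever agent asks next.
        recent = self.entries[-limit:]
        return "\n".join(f"[{agent}] {text}" for _, agent, text in recent)

pool = SharedPool()
# Blackreach finishes a task and writes its findings:
pool.write("blackreach", "847 Linear A inscriptions saved to /data/linear_a/.")
# The next agent's request gets this in its context, whoever it is:
context = pool.context_for()
```

The reading agent never names Blackreach and never queries the pool itself. The proxy does the lookup on its behalf, which is why no agent needs to change.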

Agent identity without config

The first problem was figuring out which agent is making each request without requiring any changes in the agent itself.

Velqua Mesh uses a detection chain. First it checks for an X-Velqua-Agent request header, which agents can optionally set. Then it checks the user-agent string. Then the port number. If none of those work it assigns an anonymous ID. Change one port number and the proxy knows who you are.
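The chain above is easy to express as a cascade of fallbacks. This is a sketch, not Velqua's implementation; the user-agent substring and the port-to-agent mapping are illustrative assumptions. Only the `X-Velqua-Agent` header name comes from the text.

```python
import uuid

# Hypothetical per-agent port assignments, configured on the proxy side.
PORT_MAP = {11435: "blackreach"}

def identify_agent(headers: dict, user_agent: str, local_port: int) -> str:
    # 1. An explicit header wins, if the agent chose to set one.
    if headers.get("X-Velqua-Agent"):
        return headers["X-Velqua-Agent"]
    # 2. Fall back to recognizable user-agent strings.
    if "claude" in user_agent.lower():
        return "claude-code"
    # 3. Then the port the request arrived on.
    if local_port in PORT_MAP:
        return PORT_MAP[local_port]
    # 4. Otherwise, assign an anonymous ID.
    return f"anon-{uuid.uuid4().hex[:8]}"
```

Each rung is cheaper for the agent than the one before it: the header requires a one-line change, the user-agent and port require none at all.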

The noteboard

Shared memory is passive. Agents write, others read. The noteboard is more direct. An agent can leave a structured note for a specific agent or broadcast to whoever picks up the next task.

POST /mesh/notes
{
  "from": "blackreach",
  "to": "any",
  "content": "847 Linear A inscriptions downloaded. Saved to /data/linear_a/. HT 31 shows unusual sign clustering worth looking at.",
  "tags": ["research", "complete"]
}
Velqua already injects memory transparently on every request. The noteboard is just another thing to inject. No new mechanism needed; the delivery system already exists.
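Delivery can be sketched in a few lines. The note fields mirror the JSON above; the matching rule and formatting are assumptions about how the injection might look, not the shipped behavior.

```python
# Notes posted to the board, shaped like the POST body above.
notes = [
    {"from": "blackreach", "to": "any",
     "content": "847 Linear A inscriptions downloaded.",
     "tags": ["research", "complete"]},
    {"from": "blackreach", "to": "scribe",
     "content": "Transcribe HT 31 first.",
     "tags": ["research"]},
]

def notes_for(agent: str) -> str:
    """Format the notes addressed to this agent, or broadcast to anyone."""
    matching = [n for n in notes if n["to"] in (agent, "any")]
    return "\n".join(f"Note from {n['from']}: {n['content']}" for n in matching)

# Prepended to the next matching request, on the same path as memory:
delivered = notes_for("claude-code")
```

A note addressed `"to": "any"` reaches whichever agent makes the next request; a targeted note waits for its recipient. Either way it rides the existing injection path.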

Current status

This is the architecture post, not the shipping post. Velqua Mesh is in active development. The proxy and memory engine are built. The Mesh coordination layer is what's being built now.

When it ships I'll update this with the actual implementation and a demo. The goal is one port number, no SDK, no cloud, all your agents talking to each other.


If this is useful for something you're building, reach out.
