
mars-protocol

Mesh Agent Routing Standard

A decentralized capability discovery network for autonomous agents



How does an AI agent find another agent that can review code, generate images, or search the web — without a central registry?

mars-protocol is a Kademlia DHT over QUIC that lets machines publish, discover, and verify capabilities across a global peer-to-peer mesh. No API keys. No platform lock-in. No single point of failure.


Wire Spec · Getting Started · Operator Guide · Examples


Live Network

The MARS mesh is live with 65+ services. Connect to any hub:

Hub           Address              Location
us-east       5.161.53.251:4433    Ashburn, VA
us-west       5.78.197.92:4433     Hillsboro, OR
eu-central    46.225.55.16:4433    Nuremberg, DE
ap-southeast  5.223.69.128:4433    Singapore

Quick Start — Pick Your Path

Python (easiest)

pip install mesh-protocol
from mesh_protocol import MeshClient

# Connect via a local gateway
with MeshClient("http://localhost:3000") as client:
    # Discover AI search providers
    providers = client.discover("data/search")
    for p in providers:
        print(f"{p.type} -> {p.endpoint}")

    # Discover LLM inference endpoints
    llms = client.discover("compute/inference/text-generation")

    # Publish your own capability
    client.publish(
        "compute/analysis/code-review",
        endpoint="https://my-agent.example.com/review",
        params={"languages": ["rust", "python"]},
    )

TypeScript / JavaScript

npm install mars-protocol
import { MeshClient } from "mars-protocol";

const client = new MeshClient("http://localhost:3000");
const providers = await client.discover("compute/inference");
await client.publish("compute/analysis/code-review", {
    endpoint: "https://my-agent.example.com/review",
});

Rust (native, no gateway needed)

cargo add mars-client
let mut client = MeshClient::new(keypair, bind_addr).await?;
client.bootstrap(&[NodeAddr::quic("5.161.53.251:4433")]).await?;
client.publish_capability("compute/inference/text-generation",
    "https://my-agent.example.com/generate", None, &seed).await?;
let results = client.discover(&routing_key("compute/inference")).await?;

HTTP (any language)

# Start the gateway
./mesh-gateway --seed 5.161.53.251:4433 --listen 0.0.0.0:3000

# Publish
curl -X POST http://localhost:3000/v1/publish \
  -H "Content-Type: application/json" \
  -d '{"type":"compute/inference/text-generation","endpoint":"https://..."}'

# Discover
curl http://localhost:3000/v1/discover?type=compute/inference

Share Your GPU

Turn any machine with a GPU into a mesh inference provider:

# Install Ollama + pull a model
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama4

# Interactive setup (first time)
pip install httpx
python tools/gpu-provider/provider.py

# Make it permanent — survives reboot, auto-restarts on crash
python tools/gpu-provider/provider.py --install

Other agents discover your GPU instantly:

providers = client.discover("compute/inference/text-generation")
# → "llama4:latest (NVIDIA GeForce RTX 3090)" — free, us-east

See GPU Provider docs for pricing, regions, and service management.


Bridges

Connect existing ecosystems to the mesh:

Bridge          Install                       What it does
MCP Bridge      pip install mesh-mcp-bridge   Publish MCP server tools → mesh; discover mesh capabilities as MCP tools.
OpenAPI Bridge  pip install httpx pyyaml      Point at any Swagger/OpenAPI spec → register all endpoints on mesh.

# Publish an MCP server's tools to the mesh
mesh-mcp-bridge publish --gateway http://localhost:3000 --mcp-server "python my_server.py"

# Register an entire REST API from its OpenAPI spec
python bridges/openapi/openapi_bridge.py --gateway http://localhost:3000 \
    --spec https://petstore.swagger.io/v2/swagger.json

Why

Every AI agent framework reinvents service discovery. MCP servers need manual configuration. Tool registries are centralized bottlenecks. When you have thousands of agents across multiple organizations, "just add it to the config file" doesn't scale.

mars-protocol makes capability discovery a network primitive:

┌──────────────┐         ┌──────────────┐         ┌──────────────┐
│  Agent A     │         │  Mesh Hub    │         │  Agent B     │
│              │ STORE   │              │ FIND    │              │
│  "I can do   │────────▶│  Kademlia    │◀────────│  "Who can    │
│   code       │         │  DHT over    │         │   review     │
│   review"    │         │  QUIC        │         │   code?"     │
└──────────────┘         └──────────────┘         └──────────────┘

Architecture

Crate Map

Crate           Description
mesh-core       Wire types, CBOR serialization, Ed25519 identity, content-addressed hashing
mesh-transport  QUIC transport with mutual TLS (Ed25519 self-signed certs)
mesh-dht        Kademlia DHT — routing table, iterative lookup, descriptor storage
mesh-schemas    Well-known schema hashes and routing key constants
mars-client     High-level client library (MeshClient — bootstrap, publish, discover, ping)
mesh-node       CLI binary — run a mesh node, publish/discover capabilities
mesh-hub        Production hub — redb storage, multi-tenant, admin API, peering, metrics
mesh-gateway    HTTP/JSON gateway — lets any language use the mesh

Protocol Stack

┌─────────────────────────────┐
│  Application Layer          │
│  Descriptors + Schemas      │
├─────────────────────────────┤
│  DHT Layer                  │
│  Kademlia (STORE/FIND_VALUE │
│  /FIND_NODE/PING)           │
├─────────────────────────────┤
│  Security Layer             │
│  Mutual TLS + Ed25519       │
│  Sender-TLS Binding         │
├─────────────────────────────┤
│  Transport Layer            │
│  QUIC (RFC 9000)            │
└─────────────────────────────┘

SDKs & Bridges

Package          Language    Install                        Docs
mars-client      Rust        cargo add mars-client          crates.io
mesh-protocol    Python      pip install mesh-protocol      PyPI
mars-protocol    TypeScript  npm install mars-protocol      npm
mesh-mcp-bridge  Python      pip install mesh-mcp-bridge    PyPI
openapi-bridge   Python      pip install httpx pyyaml       README

What Agents Publish

A descriptor is a signed, content-addressed record that says "I can do X, reach me at Y":

Descriptor {
    publisher:    did:mesh:z6Mkt...     # Ed25519 identity (self-certifying)
    schema_hash:  blake3(core/capability)
    topic:        "compute/inference/text-generation"
    routing_keys: [blake3("compute"), blake3("compute/inference"), blake3("compute/inference/text-generation")]
    payload:      { endpoint: "https://...", model: "glm-5", max_tokens: 4096 }
    signature:    Ed25519(publisher, canonical_cbor(descriptor))
    ttl:          3600
    sequence:     1          # monotonic — newer replaces older
}

Routing keys are hierarchical — searching for compute/inference finds all inference providers (text, image, speech, embeddings), while compute/inference/text-generation narrows to exactly that.
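The hierarchical lookup above amounts to prefix hashing: every `/`-delimited prefix of a topic gets its own routing key, so a descriptor is stored under each level of its hierarchy. A minimal sketch of that derivation, using SHA-256 as a stand-in for the protocol's BLAKE3 (which is not in the Python standard library):

```python
import hashlib


def routing_keys(topic: str) -> list[bytes]:
    """Hash every '/'-delimited prefix of a topic.

    The protocol specifies BLAKE3; SHA-256 stands in here so the
    sketch runs with only the standard library.
    """
    parts = topic.split("/")
    keys = []
    for i in range(1, len(parts) + 1):
        prefix = "/".join(parts[:i])
        keys.append(hashlib.sha256(prefix.encode()).digest())
    return keys


# A publisher of "compute/inference/text-generation" is stored under
# three keys, so a search for any prefix of the topic finds it.
keys = routing_keys("compute/inference/text-generation")
assert keys[1] == hashlib.sha256(b"compute/inference").digest()
```

Because each prefix is hashed independently, broad queries and narrow queries are both single DHT lookups; no server-side wildcard matching is needed.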


Running a Hub

# Generate hub identity
./mesh-hub --generate-keypair

# Edit config (see examples/hub.toml)
cp examples/hub.toml mesh-hub.toml
$EDITOR mesh-hub.toml

# Run
./mesh-hub --config mesh-hub.toml

Hub features: disk-backed storage (redb), multi-tenant with MU metering, admin API, hub-to-hub peering via gossip, Prometheus metrics, rate limiting, SSRF protection.

See the Operator Guide for full configuration reference. Multi-region deployment scripts in deploy/.


Security Model

Hybrid Authentication

  • Descriptors carry Ed25519 publisher signatures — self-authenticating across relays, caches, and federation
  • Protocol messages use mutual TLS with Ed25519 identity binding — zero per-message overhead

Defense in Depth

  • Sender-TLS binding on every protocol message
  • Sybil-resistant routing (LRU ping challenge on full K-buckets)
  • Descriptor revocation and key rotation
  • SSRF prevention (blocks RFC1918, CGN, ULA, loopback)
  • Per-IP and per-identity rate limiting
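The LRU ping challenge is standard Kademlia bucket maintenance: when a K-bucket is full, the least-recently-seen peer is pinged, and a newcomer is admitted only if that peer fails to respond. A sketch under assumed names — the `ping` callback, bucket size, and `observe_peer` function are illustrative, not mesh-dht's actual API:

```python
from collections import OrderedDict
from typing import Callable

K = 4  # bucket capacity (illustrative; Kademlia's k is typically 20)


def observe_peer(bucket: "OrderedDict[str, None]", peer_id: str,
                 ping: Callable[[str], bool]) -> None:
    """Insert or refresh a peer, challenging the LRU entry when full."""
    if peer_id in bucket:
        bucket.move_to_end(peer_id)   # known peer: refresh recency
        return
    if len(bucket) < K:
        bucket[peer_id] = None        # room available: admit
        return
    lru = next(iter(bucket))          # least-recently-seen peer
    if ping(lru):
        bucket.move_to_end(lru)       # old peer alive: keep it, drop newcomer
    else:
        del bucket[lru]               # old peer dead: evict, admit newcomer
        bucket[peer_id] = None
```

This is what makes the routing table Sybil-resistant: an attacker flooding new identities cannot displace long-lived, responsive peers.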

Cryptographic Guarantees

  • Identity: Ed25519 keypairs, DID-based identifiers
  • Hashing: BLAKE3 with algorithm agility (version byte)
  • Transport: QUIC with mutual TLS, mandatory client auth
  • Serialization: RFC 8949 deterministic CBOR
  • Descriptors: Self-signed, sequence-monotonic, TTL-bounded

Use Cases

Agent-to-Agent Discovery

An AI agent needs code review. It queries the mesh for compute/analysis/code-review, gets back signed descriptors, verifies identity, and invokes the best match.

Distributed GPU Inference

GPU owners share their hardware on the mesh. Agents discover the cheapest/fastest provider for their workload — no marketplace middleman.

Federated Tool Registry

Publish MCP server tools to the mesh. Agents discover tools dynamically — no config files, no central registry, no single point of failure.


Contributing

cargo test --workspace    # Run the full test suite (230+ tests)
cargo test --package mesh-hub   # Hub hardening tests only

License

Dual-licensed under MIT or Apache-2.0 at your option.
