What this is
MCP server for Hive Agent Storage — agent-native object storage with per-agent DID isolation and x402 pay-per-byte metering. Routes to Storj, Filecoin, and Arweave under the hood. Real Base USDC settlement to 0x15184bf50b3d3f52b60434f8942b7d52f2eb436e. No mocks, no testnet, no dev-trust.
Settlement is real: USDC on Base L2 via Hive Civilization rails. No simulated proofs, no mock receipts. Pricing is per-call; see JSON-LD offers for the full schedule.
Tools (5)
agent_storage_put — Upload bytes to agent-isolated object storage. Per-agent DID isolation: only the owner DID can read/write its namespace by default. Settles in real Base USDC at $0.0001/KB on upload. Routes to Storj, Filecoin, or Arweave under the hood (chosen by retention class). Returns a content-addressed object key + storage receipt with chain attestation. Backend pending — currently returns 503.

agent_storage_get — Read an object from agent-isolated storage. Free for the owner DID; cross-DID reads cost $0.00005/KB in real Base USDC (settled per KB read). Returns the object bytes + storage receipt + chain attestation. Backend pending — currently returns 503.

agent_storage_list — List objects inside an agent storage namespace. Free read. Supports key-prefix filtering and a pagination cursor. Backend pending — currently returns 503.

agent_storage_delete — Delete an object from agent storage. Owner-only — only the agent_did that owns the namespace can delete. Free. Tombstoned with a chain-attested receipt; cold-tier (Arweave permanent) objects are unlinked from the namespace but remain on-chain. Backend pending — currently returns 503.

agent_storage_quota — Return current quota usage for an agent namespace: bytes used, bytes allocated, object count, retention-class breakdown, and lifetime USDC spent on this namespace. Free read. Backend pending — currently returns 503.
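The per-KB pricing above is easy to precompute, and every tool is invoked as a JSON-RPC 2.0 tools/call over POST /mcp. A minimal sketch of both, assuming 1 KB = 1024 bytes and illustrative argument names (agent_did, key, data_base64 — the real schemas come from the server's tools/list response, which the backend does not yet serve):

```python
import base64
import json

# Rates quoted in the tool descriptions above.
PUT_RATE_USDC_PER_KB = 0.0001            # agent_storage_put upload
CROSS_DID_READ_RATE_USDC_PER_KB = 0.00005  # agent_storage_get, non-owner DID

def upload_cost_usdc(num_bytes: int) -> float:
    """Upload fee at $0.0001/KB, assuming 1 KB = 1024 bytes."""
    return (num_bytes / 1024) * PUT_RATE_USDC_PER_KB

def build_put_request(agent_did: str, key: str, data: bytes) -> str:
    """JSON-RPC 2.0 envelope for a tools/call to agent_storage_put.
    Argument names here are illustrative, not the published schema."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "agent_storage_put",
            "arguments": {
                "agent_did": agent_did,
                "key": key,
                # Raw bytes are not JSON-serializable, so base64-encode them.
                "data_base64": base64.b64encode(data).decode("ascii"),
            },
        },
    })

payload = b"x" * 2048  # a 2 KB object
print(f"expected upload fee: ${upload_cost_usdc(len(payload)):.4f} USDC")
# → expected upload fee: $0.0002 USDC
```

The same envelope shape works for the other four tools; only params.name and the arguments object change.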
Discovery
GET /.well-known/mcp.json — MCP discovery descriptor
GET /health — health + telemetry
POST /mcp — JSON-RPC 2.0 over Streamable HTTP, MCP 2024-11-05
GET /sitemap.xml — crawler sitemap
GET /robots.txt — allow-all crawl policy
GET /.well-known/security.txt — security contact
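Per the endpoint list above, tool discovery is a JSON-RPC tools/list request POSTed to /mcp. A minimal sketch that builds the request; the base URL is a placeholder, and actually sending it is left to your HTTP client (Streamable HTTP clients should accept both JSON and SSE responses):

```python
import json

MCP_PROTOCOL_VERSION = "2024-11-05"  # protocol revision from the endpoint list
BASE_URL = "https://example.invalid"  # placeholder; substitute the real host

def build_tools_list_request(request_id: int = 1) -> tuple[dict, str]:
    """Headers and JSON body for POST {BASE_URL}/mcp asking the server
    to enumerate its tools (the five agent_storage_* tools above)."""
    headers = {
        "Content-Type": "application/json",
        # Streamable HTTP responses may arrive as plain JSON or as SSE.
        "Accept": "application/json, text/event-stream",
    }
    body = json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
    })
    return headers, body

headers, body = build_tools_list_request()
print(f"POST {BASE_URL}/mcp")
print(body)
```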