What This Means
- Run the entire Char Hub in your VPC — No data leaves your network
- Use your internal inference endpoint — vLLM, Azure OpenAI, Amazon Bedrock, or any OpenAI-compatible API
- Connect to internal MCP servers — Behind your firewall, no tunnels required
- Same SDK, same tools — Your frontend code and WebMCP tools work unchanged
- Built on Elixir/BEAM — Battle-tested runtime for stateful, distributed systems
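"Any OpenAI-compatible API" means a client only needs its base URL changed; the request shape stays the standard chat-completions format. A minimal sketch of what such a request looks like, assuming a hypothetical internal hostname and model name (substitute your own vLLM, Azure OpenAI, or Bedrock-proxy endpoint):

```python
import json

# Hypothetical internal endpoint -- replace with your own. Any server that
# speaks the OpenAI chat-completions wire format (vLLM, Azure OpenAI, a
# Bedrock proxy, ...) is addressed the same way.
BASE_URL = "http://llm.internal.example:8000/v1"

def chat_request(model: str, prompt: str) -> tuple[str, str]:
    """Build the URL and JSON body for an OpenAI-compatible chat completion."""
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = chat_request("llama-3.1-8b-instruct", "ping")
print(url)  # http://llm.internal.example:8000/v1/chat/completions
```

Because the payload is identical regardless of which backend sits behind the URL, swapping inference providers is a configuration change rather than a code change.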
Interested?
Contact us and tell us about your data residency requirements.

