16-brain local command router

Local Brain Router

Ask it like an operator. It routes the question to the right MetrAIyux 0S owner brain, adds the secondary reviewer, pulls live proof surfaces, and keeps private admin setup out of public answers.

16 brains · FS27 gate aware · Live surfaces · Static fallback

Brain mode


Default mode runs in the browser. It loads Site Operator, 0meg4kAI, Central Command, Gray, the cabinet brains, live surface routing, and the sales/proof registries before answering.
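The owner-lane pick described above can be sketched as a small keyword router. This is a hypothetical sketch: the lane keywords, owner/reviewer pairings, and fallback choice are illustrative assumptions, not the shipped registry.

```typescript
// Hypothetical owner-lane router: keyword buckets map a question to an
// owner brain plus a secondary reviewer. Lane contents are assumptions.
type Route = { owner: string; reviewer: string };

const LANES: Array<{ keywords: string[]; route: Route }> = [
  { keywords: ["pricing", "buyer", "sales"], route: { owner: "Site Operator", reviewer: "Gray" } },
  { keywords: ["fs27", "auth", "gate"], route: { owner: "Central Command", reviewer: "0meg4kAI" } },
];

// Static fallback lane when no keyword bucket matches.
const FALLBACK: Route = { owner: "Site Operator", reviewer: "Gray" };

function routeQuestion(question: string): Route {
  const q = question.toLowerCase();
  for (const lane of LANES) {
    if (lane.keywords.some((k) => q.includes(k))) return lane.route;
  }
  return FALLBACK;
}
```

Because the lane is picked before any model is consulted, the router still answers (via the fallback lane) when no local model bridge is configured.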

Command question console

Ask about buyers, FS27 auth, pricing, public proof, client deployments, cabinet ownership, and production guardrails.

Optional local model bridge

Keep it light: Ollama or llama.cpp can be plugged in later.

This site does not require a model to work. The optional bridge is for when you want answers rewritten by a small local model through an OpenAI-compatible endpoint after the browser router has already picked the owner lane.

Ollama exposes an OpenAI-compatible API, and llama.cpp and llama-cpp-python can run OpenAI-compatible local servers as well. This means the brain can later point at localhost instead of a paid cloud provider.
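The rewrite step against a local OpenAI-compatible server can be sketched as a plain POST. The model tag is a placeholder for whatever the local server serves; Ollama's compatibility layer listens on http://localhost:11434/v1 by default, while a llama.cpp server exposes the same routes on its own port.

```typescript
// Sketch of the optional local-model rewrite step. Endpoint and model tag
// are assumptions; any OpenAI-compatible local server works the same way.
type ChatRequest = {
  model: string;
  messages: Array<{ role: "system" | "user"; content: string }>;
};

function buildRewriteRequest(routedAnswer: string, question: string): ChatRequest {
  return {
    model: "llama3.2", // placeholder: whichever model the local server serves
    messages: [
      { role: "system", content: "Rewrite the routed answer. Do not add new claims." },
      { role: "user", content: `Question: ${question}\nRouted answer: ${routedAnswer}` },
    ],
  };
}

// A local server needs no cloud API key; the request is an ordinary POST.
async function rewriteLocally(req: ChatRequest): Promise<string> {
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

The router's answer is produced first; the local model only rewrites it, so a missing or offline server degrades to the static answer rather than failing.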


Internal limits

This is a company knowledge brain, not an unauthorized legal filing engine.

The router can explain the cabinet structure, live surfaces, public profiles, governance language, AE positioning, FS27 gate relationship, pricing source of truth, and deployment steps. It should not expose tokens, submit filings, invent licenses, publish unsupported claims, or treat a local answer as legal, financial, tax, HR, or security advice.
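The limits above amount to a guardrail pass over every drafted answer. A minimal sketch, assuming illustrative patterns and refusal wording that are not the production rules: redact anything that looks like a token, and refuse lanes the brain does not own.

```typescript
// Hypothetical guardrail pass: the secret pattern and refused-topic list
// are illustrative assumptions, not the production guardrails.
const SECRET_PATTERN = /sk-[A-Za-z0-9]{8,}|Bearer\s+\S+/g;
const REFUSED_TOPICS = ["legal advice", "tax advice", "submit a filing"];

function applyGuardrails(draft: string): string {
  const lowered = draft.toLowerCase();
  // Refuse whole lanes the brain must not answer in.
  if (REFUSED_TOPICS.some((t) => lowered.includes(t))) {
    return "Out of scope: route this to a licensed professional, not the brain.";
  }
  // Redact anything shaped like a credential before the answer leaves.
  return draft.replace(SECRET_PATTERN, "[redacted]");
}
```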