Pick the hardware. State your intent. Ship it.
We generate the hardware-specific runtime and the manifest that drives it. Change behavior by talking to NOUS, our platform's brain. The runtime never recompiles.
From the operator's first sentence to a live system. Three steps. Same path in every industry.
Honest answer: we don't know yet. Honest prediction: less than 90 days.
The typical platform onboarding is a long chain of translators. What your operators want has to pass through engineers, architects, vendors, deployment teams, QA, and back to the operators again. Every link slows the work. Every link dilutes the original intent.
We collapse that chain. The operator states the WHAT in plain language. NOUS and the-last-platform deliver the HOW — directly, on the metal. That is the source of the speed.
biRT is a complete computing platform written from scratch in C. It runs directly on hardware. No operating system, no kernel, no filesystem, no external libraries.
One binary. Direct hardware access. From power-on to thinking in under a millisecond. biRT lands on any hardware that doesn't lock the boot chain after BIOS. Two GCP VM tiers run biRT bare metal in production today — ELITE compute (c2-standard-8) and FOUNDATION catalog (e2-micro), six nodes across three continents. ARM bring-up runs on consumer phones where the bootloader is reachable; where it isn't, biRT-A ships the same intent contract as an Android-form runtime until the device unlocks. RISC and other architectures are next, on the recommendation of our hardware engineer Norman Campbell.
Because the runtime is pure machine code and small enough to fit beside the silicon — not on top of a half-gigabyte of operating-system overhead — the hardware does what it was built for. The OS isn't in the way.
Cryptographic keys are generated on the device and never transmitted off it. Local storage is sealed with passphrase-derived keys (PBKDF2). Brute-force lockout is built into the vault. Anything in transit is wrapped in AES-256-GCM. Anything at rest is wrapped in AES-256-GCM. Authentication uses Ed25519 signatures, implemented from finite-field arithmetic up — no library shortcuts.
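The sealing flow above can be sketched in a few lines. This is illustrative Python, not the runtime's C; the iteration count and lockout threshold are assumptions (the vault's real parameters are not published here), and the AES-256-GCM payload encryption around the sealed data is elided.

```python
import hashlib
import hmac
import os

PBKDF2_ITERS = 100_000   # assumed count; the vault's real parameter may differ
MAX_ATTEMPTS = 5         # assumed brute-force lockout threshold

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # Passphrase-derived sealing key, as in the vault's PBKDF2 step.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, PBKDF2_ITERS, dklen=32)

class Vault:
    def __init__(self, passphrase: bytes):
        self.salt = os.urandom(16)
        self._tag = hmac.new(derive_key(passphrase, self.salt), b"vault", "sha256").digest()
        self.failures = 0

    def unlock(self, passphrase: bytes) -> bool:
        if self.failures >= MAX_ATTEMPTS:
            raise PermissionError("vault locked out")   # brute-force lockout
        candidate = hmac.new(derive_key(passphrase, self.salt), b"vault", "sha256").digest()
        # Constant-time comparison: the verdict leaks no timing side channel.
        if hmac.compare_digest(candidate, self._tag):
            self.failures = 0
            return True
        self.failures += 1
        return False
```

The key never leaves the device: it is derived on unlock, used, and discarded.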
This is what we have shipped, in the codebase, today. The conversational privacy layer for AI interactions — the part that mediates between operators and external models — is being designed now. We will describe it on this page when we can point at running code.
The runtime is small, fast, and stable. Behavior is the manifest — a deployment-specific description of what the runtime should do, on which targets, and when. Change the behavior by changing the manifest. The runtime never recompiles. The conversational and authoring layers above this are in active development; the example below is illustrative of the design, not a transcript from a production system today.
Operators don't write code to change behavior. They have a conversation with NOUS — our platform's brain. NOUS turns the conversation into pure intent — expressed in BASICI — and ships it as a signed manifest the runtime can execute. Any human language in. BASICI out. The runtime stays the same.
Two worlds, joined: single-purpose machine-level code that runs as fast as the silicon allows, and human thought translated into runtime instructions by NOUS. The runtime never sees the conversation; the conversation never has to know about the runtime.
BASICI is the language pure intent is written in. Not the old IBM BASIC. Not a tutorial language. The AI version — canonical, unambiguous, hardware-aware.
An operator speaks to NOUS in their own language. NOUS generates and deploys the new intent to the runtime, confirms successful operation, then responds to the operator in their language, dialect, and style.
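To make the manifest contract concrete, here is a hedged Python sketch of the verify-then-apply step at the runtime's edge. Every field name and the BASICI line are invented placeholders, and an HMAC stands in for the runtime's Ed25519 signatures; the shape of the flow, not the wire format, is the point.

```python
import hashlib
import hmac
import json

# Hypothetical manifest: field names and the BASICI string are placeholders.
manifest = {
    "targets": ["node-eu-1"],
    "intent": "ON ALERT ROUTE TO OPERATOR CONSOLE",
    "apply": "next-tick",
}

def canonical(m: dict) -> bytes:
    # Deterministic bytes: the same manifest signs identically on every node.
    return json.dumps(m, sort_keys=True, separators=(",", ":")).encode()

def sign(m: dict, key: bytes) -> bytes:
    # HMAC stands in here; the shipped runtime signs with Ed25519.
    return hmac.new(key, canonical(m), hashlib.sha256).digest()

def apply_if_valid(state: dict, m: dict, sig: bytes, key: bytes) -> bool:
    if not hmac.compare_digest(sign(m, key), sig):
        return False          # tampered or unsigned intent is dropped
    state["behavior"] = m     # behavior changes; the binary never recompiles
    return True
```

The runtime only ever sees signed intent; the conversation that produced it never reaches the metal.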
The memory layer is memory-mapped (zero-copy), journaled for crash safety, encrypted at rest with AES-256-GCM, and indexed by up to 16 keys per store. Binary serialization compresses roughly 79x versus raw text. Knowledge and code share the same representation, which means the system reasons about its own behavior the same way it reasons about anything else.
Internal seek-speed measurements are very promising. We will not claim "fastest on Earth" until we have published the numbers. We are running the architecture against Wikipedia and other public datasets next, and the benchmarks — methodology, hardware, raw numbers, comparisons — will be published in full and reproducible by anyone who wants to verify them.
We turn pure intent into a runtime that runs on your hardware, under your keys.
Some pieces are not finished. We are not going to pretend otherwise.
| Capability | Status | What it means |
|---|---|---|
| biRT on consumer ARM phones (where the boot chain unlocks) | Shipped | Bare-metal bring-up on consumer ARM phones where the post-BIOS chain hands off — boots through the device's bootloader to biRT, no OS. Framebuffer, debug, sensors, runtime all on the metal. The proof that biRT lands wherever the bootloader is reachable; per-device bring-up clears each new vendor's chain. |
| biRT-A — Android-form runtime | Shipped | When a device's bootloader is locked and the unlock path is exhausted, biRT-A ships the same intent contract as a single Android kiosk app — fullscreen AMOLED, all hardware sensors via SensorManager, identical 16-byte event log. When the device unlocks, the bare-metal binary takes over. Until then, biRT-A is what runs. |
| biRT on GCP — ELITE + FOUNDATION (x86) | Shipped | Two VM tiers running biRT bare metal in production, six nodes across three continents. ELITE (c2-standard-8) handles bulk translation, repo import, mesh hubs. FOUNDATION (e2-micro) serves encrypted triples at the edge. From-scratch gVNIC driver, UDP/IP/DHCP/ARP, bare-metal libc replacement, HW-accelerated crypto (AES-NI, SHA-NI) with software fallback. |
| Intelligence stack | Shipped | Natural-language understanding, reasoning, multi-format ingestion (PDF, JSON, CSV, plain text), intent ingestion with multi-source corroboration, 155,000+ word lexical database. ~28,000 lines of production C. |
| Cryptographic primitives | Shipped | AES-256-GCM, SHA-1/256/512, HMAC, PBKDF2, Ed25519 implemented from finite-field arithmetic up. Authentication uses constant-time comparison — no timing side channels. |
| Console & QR provisioning | Shipped | Cross-platform interactive terminal with F-key command bar (search, try, feed, confirm, ship, refactor, trace, save, inspect). QR generation from scratch (Galois field GF(256), Reed-Solomon error correction) for scan-to-register device provisioning. |
| Translation proofs (BASICI) | Shipped | ~50,000 lines of validated roundtrips proving every Intelligence skill survives translation between C, BASICI, and natural language. The receipts behind the BASICI claim. |
| Operator console for AI (metal console) | Drafting | Operator-facing console that mediates AI conversations and authors manifests. Designed; planned and scoped in the metal repo. Not yet running end-to-end. |
| Multi-AI desktop edition | Drafting | Single surface routing to multiple popular AI providers. Designed; not yet built. Will not be claimed shipped until anonymization behavior is implemented and demonstrated. |
| Noise On The Wire (NOTW) | Shipped | Core protocol, mesh, and relay are done. Discovery, routing, store-and-forward, AES-256-GCM transport encryption, cross-platform. Additional transport backends (BLE, LoRa, WiFi/UDP, serial) being added. |
| Intelligent Data Store (IDS) | Shipped | Memory-mapped, journaled, encrypted at rest, indexed for sub-microsecond seek. Tested across persistence, growth, scope, and journaling. Wikipedia-scale public benchmarks pending; the architecture itself is done. |
| Pure-intent authoring tools | In Progress | Engineers can author and edit intents directly. End-user authoring (no engineer required) still in development. |
| Conversational manifest authoring | Drafting | Operators describe behavior changes in plain language; the system drafts a signed manifest. AI-assisted, multi-language. Designed and scoped; not yet running end-to-end. |
| Manifest delivery over mesh | In Progress | Signed payload broadcast over the mesh, applied at next runtime tick. The transport, signing, and broadcast paths are shipped; manifest-as-runtime-config is in active integration. |
| biRT on RISC and emerging architectures | Drafting | Architectural targets identified with our hardware engineer. Bring-up work not yet started. |
| Custom silicon | Not Yet | Direction, not deliverable. Stated for honesty, not as a commitment. |
| External operator deployments | Not Yet | Internal use across multiple repos. No third-party in production yet. |
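The QR row above rests on GF(256) arithmetic. A short Python sketch of the field multiply and the Reed-Solomon generator polynomial, using the standard QR reducing polynomial 0x11D; the shipped implementation is C, and this is the idea, not the code.

```python
def gf_mul(a: int, b: int) -> int:
    # Carry-less multiply reduced by the QR polynomial x^8+x^4+x^3+x^2+1 (0x11D).
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11D
        b >>= 1
    return r

def poly_mul(p: list, q: list) -> list:
    # Polynomial product over GF(256); addition in the field is XOR.
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] ^= gf_mul(a, b)
    return out

def rs_generator(degree: int) -> list:
    # Reed-Solomon generator: product of (x - α^i), with α = 2 generating GF(256)*.
    g, alpha = [1], 1
    for _ in range(degree):
        g = poly_mul(g, [1, alpha])  # subtraction equals XOR, so x - α^i is [1, α^i]
        alpha = gf_mul(alpha, 2)
    return g
```

Dividing the data polynomial by this generator yields the error-correction codewords a scanner uses to recover a damaged code.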
The Last Platform is being used to build The Last Platform. The system reasons about itself, updates itself, and is being built to repair and ship itself end-to-end. Today a human still pulls the final lever on every release. Tomorrow we don't.
This is what a complete AI-native company looks like — first of its kind in the AI renaissance. Not a wrapper around someone else's model. Not a thin orchestration layer over the existing stack. Bare metal up, runtime up, intelligence up — every layer ours, every layer auditable, every layer composable.
Anyone not throwing everything away and starting from nothing is doing it wrong. Goodtimes.