Why We Chose Rust for a Cross-Platform Agent

The Life Savor agent runs on macOS, Windows, six Linux distributions, NetBSD, and OpenBSD — all from a single codebase. We chose Rust for this, and it wasn't because Rust is trendy. Here's the actual reasoning.

The Requirements

We needed a language that could:

  1. Compile to a single native binary (no runtime dependencies)
  2. Run on every major desktop OS without a VM or interpreter
  3. Handle concurrent WebSocket connections, model inference, and skill execution simultaneously
  4. Manage memory safely without a garbage collector (predictable latency for real-time inference)
  5. Interface with native platform APIs (Keychain on macOS, Credential Manager on Windows, libsecret on Linux)
  6. Embed PyTorch and ONNX Runtime for local model inference

What We Considered

Go — great for servers, but cgo makes cross-compilation painful. Embedding PyTorch via cgo is fragile, and the garbage collector introduces latency spikes during inference.

C++ — meets all technical requirements, but memory safety is manual. For a security-critical application handling PII, encrypted vaults, and sandboxed processes, we didn't want to bet on manual memory management.

Node.js / Python — require a runtime. Distributing a Node.js app as a native binary means bundling the entire runtime. Performance for local inference is inadequate.

Swift — excellent on Apple platforms, but cross-platform support is limited. We use Swift for the macOS-native GUI, but the core agent needs to run everywhere.

Rust — single binary, no runtime, memory safety enforced at compile time, excellent async ecosystem (tokio), first-class C FFI for PyTorch/ONNX, and compiles to every target we need.

What Rust Gives Us

Zero-Cost Abstractions for the Pipeline

The message interception pipeline processes every message through PII scanning, content safety, and routing. In Rust, this pipeline is a chain of zero-cost abstractions — no heap allocations in the hot path, no virtual dispatch overhead, no garbage collection pauses.
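To make that concrete, here is a minimal sketch of the pipeline idea. The names (`Message`, `scan_pii`, `route`) are illustrative, not the agent's actual API: each stage is a plain function composed through iterator adapters, so the compiler can inline the whole chain with no boxed trait objects, no virtual dispatch, and no per-message heap allocation in the hot path.

```rust
#[derive(Debug, PartialEq)]
enum Route {
    Local,   // handle on-device
    Blocked, // dropped by a safety stage
}

struct Message<'a> {
    body: &'a str,
}

// Stage 1: flag obvious PII (illustrative check only).
fn scan_pii(msg: &Message) -> bool {
    msg.body.contains("ssn:")
}

// Stage 2: content safety (illustrative check only).
fn is_safe(msg: &Message) -> bool {
    !msg.body.contains("malware")
}

// Stage 3: routing decision, built from the stages above.
fn route(msg: &Message) -> Route {
    if scan_pii(msg) || !is_safe(msg) {
        Route::Blocked
    } else {
        Route::Local
    }
}

fn process(msgs: &[Message]) -> Vec<Route> {
    // The adapter chain compiles down to a simple loop over the slice.
    // (The Vec here is only for the demo; the real hot path would
    // consume decisions without collecting.)
    msgs.iter().map(route).collect()
}

fn main() {
    let msgs = [
        Message { body: "hello" },
        Message { body: "ssn: 123-45-6789" },
    ];
    let routes = process(&msgs);
    assert_eq!(routes, vec![Route::Local, Route::Blocked]);
    println!("{:?}", routes);
}
```

Because each stage is a monomorphized function rather than a trait object, the optimizer sees the whole chain at once and can fuse it into a single pass.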

Fearless Concurrency

The agent juggles:

  • A persistent WebSocket connection (heartbeats, commands)
  • Multiple skill processes (spawned, monitored, terminated)
  • Model inference (potentially GPU-bound)
  • File system watchers (skill directory changes)
  • Encrypted vault operations

Rust's ownership system prevents data races at compile time. We don't need mutexes around everything — the type system tells us when shared state is safe.
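A toy illustration of what "data races are a compile error" means in practice (this is not the agent's actual code). Shared state crosses a thread boundary only when wrapped in types that are `Send + Sync`; forget the `Arc<Mutex<...>>` and the program is rejected at compile time rather than debugged in production.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn `threads` workers that each bump a shared counter `per_thread`
// times. The counter may cross thread boundaries only because it is
// wrapped in Arc<Mutex<..>>, which is `Send + Sync`.
fn run_workers(threads: usize, per_thread: u64) -> u64 {
    let processed = Arc::new(Mutex::new(0u64));

    let handles: Vec<_> = (0..threads)
        .map(|_| {
            let processed = Arc::clone(&processed);
            thread::spawn(move || {
                for _ in 0..per_thread {
                    // The lock guard gives exclusive access; it unlocks
                    // automatically when dropped at the end of the statement.
                    *processed.lock().unwrap() += 1;
                }
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    // With a plain `u64` (no Arc, no Mutex) the closures above would not
    // compile: the borrow checker rejects sharing a mutable borrow
    // across threads.
    let total = *processed.lock().unwrap();
    total
}

fn main() {
    assert_eq!(run_workers(4, 1000), 4000);
}
```

The interesting part is what you cannot write: removing the `Mutex` or the `Arc` turns a latent race into a type error, which is why most of the agent's state needs no locking discipline beyond what the types already enforce.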

Cross-Compilation

We build for seven targets from a single CI pipeline:

  • x86_64-unknown-linux-gnu
  • aarch64-unknown-linux-gnu
  • x86_64-apple-darwin
  • aarch64-apple-darwin
  • x86_64-pc-windows-msvc
  • x86_64-unknown-netbsd
  • x86_64-unknown-openbsd

Each target produces a single self-contained binary. No installer dependencies, no runtime to bundle, no "works on my machine" issues.
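Mechanically, cross-building mostly comes down to installing each target's standard library with `rustup target add` and pointing Cargo at a suitable linker. A minimal sketch of the Cargo config (the linker name is illustrative and depends on the cross-toolchain installed on the CI image, not something prescribed by our pipeline):

```toml
# .cargo/config.toml — illustrative cross-linker configuration.
# Build with: cargo build --release --target aarch64-unknown-linux-gnu

[target.aarch64-unknown-linux-gnu]
linker = "aarch64-linux-gnu-gcc"

# Native targets (e.g. x86_64-pc-windows-msvc on a Windows runner)
# find their platform linker automatically and need no override.
```
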

Safe FFI for ML Runtimes

The agent embeds PyTorch (via tch-rs) and ONNX Runtime (via ort). These are C/C++ libraries with complex memory management. Rust's FFI boundary lets us wrap them in safe types, so common misuses of the C API — use-after-free, double-free, leaked handles — become compile-time errors instead of crashes.
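The underlying pattern looks roughly like this (a toy sketch, not tch-rs or ort internals): a raw handle from C is owned by one Rust struct whose `Drop` impl frees it exactly once, so freeing twice or using after free cannot be expressed in safe code. Here `raw_alloc`/`raw_free` are Rust stand-ins for C functions that a real binding would declare in an `extern "C"` block.

```rust
// Stand-in for memory the C library would own.
struct RawTensor {
    data: Vec<f32>,
}

// Pretend C API: returns an owning raw pointer.
fn raw_alloc(len: usize) -> *mut RawTensor {
    Box::into_raw(Box::new(RawTensor { data: vec![0.0; len] }))
}

// Pretend C API: frees the pointer. Calling it twice would be undefined
// behavior — exactly what the wrapper below makes unwritable in safe code.
unsafe fn raw_free(ptr: *mut RawTensor) {
    drop(Box::from_raw(ptr));
}

/// Safe wrapper: owns the handle, frees it exactly once on Drop.
struct Tensor {
    ptr: *mut RawTensor,
}

impl Tensor {
    fn new(len: usize) -> Self {
        Tensor { ptr: raw_alloc(len) }
    }

    fn len(&self) -> usize {
        // SAFETY: `ptr` is valid for the lifetime of `self` by construction.
        unsafe { (*self.ptr).data.len() }
    }
}

impl Drop for Tensor {
    fn drop(&mut self) {
        // SAFETY: `ptr` came from `raw_alloc` and is freed only here.
        unsafe { raw_free(self.ptr) }
    }
}

fn main() {
    let t = Tensor::new(16);
    assert_eq!(t.len(), 16);
    // `t` is freed exactly once when it goes out of scope; a double-free
    // or use-after-free cannot be written without `unsafe`.
}
```

All the `unsafe` is confined to a few audited lines inside the wrapper; everything that calls `Tensor` stays in safe Rust.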

The Tradeoffs

Compile times. A full release build takes 5-8 minutes. Incremental debug builds are fast, but CI pipelines feel the cost.

Learning curve. The borrow checker is real. New contributors take longer to become productive. But once code compiles, it tends to work correctly — fewer runtime bugs, fewer production incidents.

Ecosystem maturity. Some libraries are younger than their Go/Node equivalents. We've contributed patches upstream and occasionally vendor dependencies.

Binary size. A release binary with PyTorch and ONNX Runtime linked is ~50MB. Acceptable for a desktop application, but larger than a Go binary would be.

Would We Choose It Again?

Yes. The combination of memory safety, native performance, cross-platform compilation, and safe C FFI is unique to Rust. For a security-critical application that runs on user devices, handles sensitive data, and embeds ML runtimes — there isn't a better option today.

The agent has been running in production across all supported platforms with zero memory-safety-related bugs. The type system catches those at compile time, so they never reach users.