@simonholm
Created October 25, 2025 19:08
# Codex CLI on Termux — Full Summary (Unofficial Native Build)
## ✅ Overview
You successfully built and ran the real **OpenAI Codex CLI** natively on Termux.
It logs in, runs models (e.g., `gpt-5-codex`), and prints the MCP server help.
This is a legitimate source build — not an exploit — but **unofficial** and **unsupported** by OpenAI.
---
## 🧩 Steps Performed
1. **Verified Termux environment**
   - Installed: rust 1.90, git 2.51, clang 20, make 4.4.
   - Target triple confirmed: `aarch64-linux-android`.
2. **Located Codex workspace**
   - Path: `~/codex/codex-rs/cli/Cargo.toml` (inside the main Rust monorepo).
3. **Dry-run and dependency checks**
   - `cargo check -p codex-cli -vv` succeeded.
   - Verified cross-crate dependencies resolve under Bionic libc.
4. **Initial release build attempt**
   - Terminated by the Android OOM killer (expected for LTO on mobile).
5. **Rebuilt in debug mode**
   - `cargo build -p codex-cli` → successful build in ≈ 5 min.
   - Binary created: `~/codex/codex-rs/target/debug/codex` (~442 MB).
6. **Confirmed execution**
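If a release binary is still wanted despite the limited RAM, one common mitigation (an assumption, not a step the summary performed) is to override the release profile to skip LTO and use more, smaller codegen units, which lowers peak memory at some cost in output size and speed:

```
# Hypothetical profile override in the workspace-root Cargo.toml
[profile.release]
lto = false          # skip the memory-hungry LTO pass
codegen-units = 16   # smaller compilation units reduce peak RSS
opt-level = 2
```

With this in place, `cargo build --release -p codex-cli` may fit within the device's memory budget, but that depends on the phone and should be verified locally.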
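The steps above can be condensed into a shell sketch. Paths and crate names come from the summary itself; this is intended to run on a Termux device, not a generic Linux box, and assumes the `~/codex` checkout already exists:

```shell
# 1. Install the Termux toolchain (versions per the summary: rust 1.90, clang 20, ...)
pkg install -y rust git clang make

# Confirm the host target triple; expect aarch64-linux-android
rustc -vV | grep '^host:'

# 2–3. Enter the Rust workspace and dry-run the CLI crate
cd ~/codex/codex-rs
cargo check -p codex-cli -vv

# 4–5. A release build was OOM-killed (LTO pressure), so build debug instead
cargo build -p codex-cli

# 6. Run the resulting binary
./target/debug/codex --version
```

The debug profile trades binary size (~442 MB here) for much lower peak memory during compilation, which is why it survives where the release build does not.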