Install
Install the Codex CLI:
npm install -g @openai/codex
Usage with Ollama
Codex works best with a large context window; a context window of at least 64k tokens is recommended.
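For reference, when calling Ollama's native chat API directly, the context window is set per request with the num_ctx option. A minimal sketch that only constructs the request payload (nothing is sent; the model name is taken from the examples below):

```python
import json

# Build the body an Ollama /api/chat request would carry, with the
# context window set to the recommended 64k tokens via "num_ctx".
payload = {
    "model": "gpt-oss:120b",
    "messages": [{"role": "user", "content": "hello"}],
    "options": {"num_ctx": 65536},  # context window, in tokens
}

body = json.dumps(payload)
print(body)
```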
Quick setup
To configure Codex without launching it:
ollama launch codex --config
Manual setup
To use Codex with Ollama, pass the --oss flag:
codex --oss
To use a specific model, pass the -m flag:
codex --oss -m gpt-oss:120b
To use a cloud model:
codex --oss -m gpt-oss:120b-cloud
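Behind these commands, Codex talks to Ollama's OpenAI-compatible API (the same base_url used in the profile-based setup below). A sketch of the equivalent raw chat-completions request, constructing only the URL and body (nothing is sent here):

```python
import json

# Ollama's OpenAI-compatible API, as used in the profile config below.
base_url = "http://localhost:11434/v1"
endpoint = f"{base_url}/chat/completions"

# Same model as the -m example above.
body = json.dumps({
    "model": "gpt-oss:120b",
    "messages": [{"role": "user", "content": "Say hello"}],
})

print(endpoint)
print(body)
```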
Profile-based setup
For a persistent configuration, add an Ollama provider and profiles to ~/.codex/config.toml:
[model_providers.ollama-launch]
name = "Ollama"
base_url = "http://localhost:11434/v1"
[profiles.ollama-launch]
model = "gpt-oss:120b"
model_provider = "ollama-launch"
[profiles.ollama-cloud]
model = "gpt-oss:120b-cloud"
model_provider = "ollama-launch"
Then run Codex with either profile:
codex --profile ollama-launch
codex --profile ollama-cloud