
January 17, 2026

OpenAI Codex Goes Open Source with Ollama Integration

OpenAI Codex CLI now works with open-source models through Ollama

Codex Is Now Open Source

OpenAI just made a significant move: Codex, its AI coding assistant CLI, now works with open-source models through Ollama.

This means you can run a powerful code generation tool locally without relying on OpenAI's proprietary API. The CLI can read, modify, and execute code in your working directory using models like gpt-oss:20b or gpt-oss:120b.

Getting Started

Setup is straightforward:

npm install -g @openai/codex
codex --oss

By default, it uses the local gpt-oss:20b model. For larger tasks, switch to a bigger model:

codex --oss -m gpt-oss:120b
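If you'd rather not pass flags on every run, Codex can also read defaults from a config file. A minimal sketch of ~/.codex/config.toml, assuming the provider keys shown below (exact key names may vary between Codex releases, so check your version's documentation):

```toml
# ~/.codex/config.toml — sketch only; key names may differ across Codex releases
model = "gpt-oss:120b"
model_provider = "oss"

[model_providers.oss]
name = "Ollama"
base_url = "http://localhost:11434/v1"
wire_api = "chat"
```

With defaults in place, a bare codex invocation picks up the local model automatically.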

Important: Codex requires a large context window. Ollama's default is only 4096 tokens, but coding tools need at least 32K. Set this before running:

OLLAMA_CONTEXT_LENGTH=32000 ollama serve

Keep in mind that larger context windows require more VRAM. Verify your setup with ollama ps to confirm the context size is allocated correctly.

Cloud Options

If local hardware isn't sufficient, Ollama Cloud models are fully supported:

codex --oss -m gpt-oss:120b-cloud

This gives you the flexibility to run smaller models locally for quick tasks while offloading heavier workloads to the cloud.
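One way to make that local-versus-cloud split explicit is a pair of shell helpers (hypothetical names, wrapping the commands shown above):

```shell
# Hypothetical wrappers around the commands above:
# quick local runs on the 20B model, heavier jobs on the cloud-hosted 120B.
codex_local() { codex --oss -m gpt-oss:20b "$@"; }
codex_cloud() { codex --oss -m gpt-oss:120b-cloud "$@"; }
```

For example, codex_local for an inline refactor, codex_cloud for a large multi-file change.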

Why This Matters

Three reasons this is significant:

1. Local-first AI development - Run code generation entirely on your machine. No API keys, no usage limits, no data leaving your system.

2. Model flexibility - Choose the model that fits your hardware and task. The 20B parameter model works for most coding tasks; scale up to 120B when needed.

3. Lower barrier to entry - Open-source models and local execution democratize access to AI coding tools. You're no longer locked into a subscription or pay-per-token model.

What This Signals

OpenAI open-sourcing Codex reflects a broader industry trend. As open-weight models become increasingly capable, proprietary APIs face pressure to offer more value or open up.

For developers, this is a win. More options, more control, and the ability to integrate AI coding assistance into workflows without vendor lock-in.

The Bottom Line

Codex with Ollama is worth trying if you're building software. Local execution, open models, and a simple CLI make it easy to experiment without commitment.

The gap between proprietary and open-source AI tools continues to narrow—and developers benefit every time it does.


Posted by Fahad Siddiqui, Founder, Datum Brain

Copyright © 2026 Datum Brain
