
Commit 29d154c

fix: use o4-mini as the default model (#1135)
Rollback of #972.
1 parent 6b5b184 commit 29d154c

2 files changed · +2 −2 lines changed


codex-rs/README.md

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@ The `config.toml` file supports the following options:
 The model that Codex should use.
 
 ```toml
-model = "o3" # overrides the default of "codex-mini-latest"
+model = "o3" # overrides the default of "o4-mini"
 ```
 
 ### model_provider

codex-rs/core/src/flags.rs

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ use std::time::Duration;
 use env_flags::env_flags;
 
 env_flags! {
-    pub OPENAI_DEFAULT_MODEL: &str = "codex-mini-latest";
+    pub OPENAI_DEFAULT_MODEL: &str = "o4-mini";
     pub OPENAI_API_BASE: &str = "https://api.openai.com/v1";
 
     /// Fallback when the provider-specific key is not set.
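
As a hedged illustration (not part of this commit), the sketch below shows the resolution order the two files imply: `"o4-mini"` is the compiled-in fallback, the `OPENAI_DEFAULT_MODEL` environment variable can replace that default, and a `model` entry in `config.toml` overrides either. The `default_model` helper and its `config_model` parameter are hypothetical names for illustration; the generated API of the `env_flags!` macro is not reproduced here.

```rust
// Minimal sketch, assuming the precedence described above; `default_model`
// and `config_model` are hypothetical, not the actual Codex code.
use std::env;

/// Resolve the model name, lowest to highest precedence:
/// built-in default -> OPENAI_DEFAULT_MODEL env var -> config.toml `model`.
fn default_model(config_model: Option<&str>) -> String {
    let from_env = env::var("OPENAI_DEFAULT_MODEL").ok();
    config_model
        .map(str::to_owned)
        .or(from_env)
        .unwrap_or_else(|| "o4-mini".to_string())
}

fn main() {
    // With no config value and no env var set, this prints "o4-mini".
    println!("{}", default_model(None));
    // A config.toml entry such as `model = "o3"` would take precedence.
    println!("{}", default_model(Some("o3")));
}
```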
