gpt-oss-120b is an open-weight, 117B-parameter Mixture-of-Experts (MoE) language model from OpenAI designed for high-reasoning, agentic, and general-purpose production use cases.
| Model | Perf | Speed | Context | Max output | Input | Output |
|---|---|---|---|---|---|---|
| gpt-oss-120b | — | — | 131K | 131K | — | — |
Endpoint: `/v1/models/openai/gpt-oss-120b`
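A minimal usage sketch, assuming the model is served behind an OpenAI-compatible API: the base URL and the API-key environment variable below are placeholders, not values from this page.

```python
# Sketch: query gpt-oss-120b through an OpenAI-compatible provider.
# The base_url and PROVIDER_API_KEY are hypothetical placeholders --
# substitute your provider's actual endpoint and credential.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",   # placeholder provider endpoint
    api_key=os.environ["PROVIDER_API_KEY"],  # placeholder credential
)

# Fetch the model's metadata from the endpoint shown above.
model = client.models.retrieve("openai/gpt-oss-120b")
print(model.id)

# Run a simple chat completion against the model.
response = client.chat.completions.create(
    model="openai/gpt-oss-120b",
    messages=[
        {"role": "user", "content": "Summarize MoE routing in two sentences."}
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```

Because the model exposes the standard chat-completions interface, any OpenAI-compatible client library can be pointed at it by changing only the base URL and model identifier.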