gpt-oss-120b

OpenAI · medium reasoning

244 questions answered · 244 with human benchmark data · Released 2025-08-05

gpt-oss-120b is an open-weight, 117B-parameter Mixture-of-Experts (MoE) language model from OpenAI designed for reasoning-heavy, agentic, and general-purpose production use cases. It activates 5.1B parameters per forward pass and is optimized to run on a single H100 GPU with native MXFP4 quantization. The model supports configurable reasoning depth, full chain-of-thought access, and native tool use, including function calling, browsing, and structured output generation.
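Configurable reasoning depth and native tool use are typically exposed through an OpenAI-compatible chat API when the open weights are served locally. The sketch below builds (but does not send) such a request payload; the serving setup, the `reasoning_effort` field name, and the `search_web` tool are illustrative assumptions, not details confirmed by this page.

```python
import json

# Hypothetical request body for an OpenAI-compatible endpoint
# (e.g. a local vLLM instance serving gpt-oss-120b). Field names
# mirror the common chat-completions shape; exact support depends
# on the serving stack.
payload = {
    "model": "gpt-oss-120b",
    "reasoning_effort": "medium",  # configurable depth: low / medium / high
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize MoE routing in two sentences."},
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "search_web",  # hypothetical browsing tool
                "description": "Browse the web for a query.",
                "parameters": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            },
        }
    ],
}

# Serialize for an HTTP POST; an actual client would send this to
# the server's /v1/chat/completions route.
body = json.dumps(payload)
```

With a declared tool, the model can return a `tool_calls` entry instead of plain text, which the caller executes and feeds back in a follow-up message.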

Alignment: 60 · Consensus: 74 · Confidence: 62
