gpt-oss-120b
gpt-oss-120b is an open-weight, 117B-parameter Mixture-of-Experts (MoE) language model from OpenAI designed for high-reasoning, agentic, and general-purpose production use cases. It activates 5.1B parameters per forward pass and is optimized...
Official pricing (input)
$0.50 / M tokens
Official pricing (output)
$1.50 / M tokens
LMSYS Elo
-
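As a worked example of the per-token pricing listed above, the sketch below computes the cost of a single request. The rates come from this card; the token counts and the helper name `request_cost` are hypothetical.

```python
# Rates from the card above (USD per 1M tokens).
INPUT_RATE = 0.5
OUTPUT_RATE = 1.5

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates.

    Token counts here are illustrative, not from the card.
    """
    return (input_tokens / 1_000_000) * INPUT_RATE \
         + (output_tokens / 1_000_000) * OUTPUT_RATE

# e.g. a request with a 200k-token prompt and a 50k-token completion
# costs 0.2 * $0.50 + 0.05 * $1.50 = $0.175
print(f"${request_cost(200_000, 50_000):.3f}")
```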