
Llama 4 Maverick

meta-llama

Llama 4 Maverick 17B Instruct (128E) is a high-capacity multimodal language model from Meta, built on a mixture-of-experts (MoE) architecture with 128 experts and 17 billion active parameters per forward...
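The "128E / 17B active" pairing reflects how MoE layers work: a router picks a small subset of the 128 experts per token, so only that subset's parameters participate in each forward pass. A minimal sketch of top-k expert routing, with toy sizes chosen for illustration (only the 128-expert count comes from the model name; hidden sizes, top-k, and weight shapes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts = 128   # matches the "128E" in the model name
top_k = 1         # experts activated per token (illustrative assumption)
d_model = 64      # toy hidden size, not the real model dimension

def moe_forward(x, router_w, expert_ws):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router_w                              # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]      # chosen expert indices
    # Softmax over only the selected experts' logits to get gate weights.
    sel = np.take_along_axis(logits, top, axis=-1)
    gate = np.exp(sel - sel.max(-1, keepdims=True))
    gate /= gate.sum(-1, keepdims=True)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for j, e in enumerate(top[t]):
            # Only top_k expert weight matrices are touched per token --
            # this is what "active parameters" counts.
            out[t] += gate[t, j] * (x[t] @ expert_ws[e])
    return out

x = rng.normal(size=(4, d_model))                      # 4 tokens
router_w = rng.normal(size=(d_model, n_experts))
expert_ws = rng.normal(size=(n_experts, d_model, d_model)) * 0.02
y = moe_forward(x, router_w, expert_ws)
print(y.shape)  # (4, 64)
```

The total parameter count scales with all 128 experts, but per-token compute scales only with `top_k`, which is why the model can be large in capacity while staying at 17B active parameters.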

Official pricing (input): $/M tokens
Official pricing (output): $/M tokens
LMSYS Elo: 1380

Category: General