Llama 4 Scout
meta-llama
Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model developed by Meta, activating 17 billion parameters out of a total of 109B. It supports native multimodal input...
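The active/total parameter split is the defining property of an MoE model: only a subset of expert weights runs per token. A minimal sketch of that accounting, using the figures from this card (17B active, 109B total, 16 experts); the per-expert breakdown is illustrative and not Meta's published one:

```python
# Sketch of MoE parameter accounting (figures from the card; split is illustrative).
total_params = 109e9    # all weights, shared layers + every expert
active_params = 17e9    # weights actually used per forward pass
num_experts = 16        # the "16E" in the model name

# Fraction of the model exercised for any single token.
active_fraction = active_params / total_params
print(f"Active per token: {active_fraction:.1%} of total weights")
```

This is why a 109B-parameter MoE model can have inference cost closer to a ~17B dense model: compute scales with the active parameters, while memory must still hold the full 109B.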
Official pricing (input)
$/M tokens
Official pricing (output)
$/M tokens
LMSYS Elo
1400
Category
General