Llama 3.1 family

Meta's open-weight Llama 3.1 family, available in three sizes.

Llama 3.1 ships at 405B, 70B, and 8B parameter counts. Open weights mean you can self-host or pick whichever inference provider gets you the best $/token; quality and capability climb with parameter count.

Specifications side-by-side

Spec             Meta: Llama 3.1 70B Instruct    Meta: Llama 3.1 8B Instruct
Provider         meta-llama                      meta-llama
Context          131K                            16K
Max output       16,384                          16,384
Input $/Mtok     $0.40                           $0.02
Output $/Mtok    $0.40                           $0.05
Modalities       text                            text
License          Llama 3.1 Community License     Llama 3.1 Community License
Released         2024-07-23                      2024-07-23
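To make the $/Mtok figures concrete, here is a small Python sketch that converts the per-million-token rates in the table into a per-request cost. The model keys and token counts are illustrative, not real provider identifiers:

```python
# Per-million-token prices from the table above: (input, output) in USD.
PRICES_PER_MTOK = {
    "llama-3.1-70b-instruct": (0.40, 0.40),
    "llama-3.1-8b-instruct": (0.02, 0.05),
}

def request_cost_usd(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost of one request, given token counts and the table's $/Mtok rates."""
    in_price, out_price = PRICES_PER_MTOK[model]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000

# A 10K-token prompt with a 1K-token completion (fits both context windows):
print(request_cost_usd("llama-3.1-8b-instruct", 10_000, 1_000))   # ≈ $0.00025
print(request_cost_usd("llama-3.1-70b-instruct", 10_000, 1_000))  # ≈ $0.0044
```

At these rates the same request costs roughly 18x more on 70B than on 8B, which is the trade-off the "quality climbs with parameter count" note asks you to price in.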

Other families