AI21 Labs
Jamba
Open-source variants
AI21 Labs' 52B-parameter hybrid SSM-Transformer model: a Mamba + mixture-of-experts (MoE) architecture with a 256K-token context window and high throughput.
256K tokens · Free / Open weights · MoE · Apache 2.0