Stability AI Reveals 12 Billion-Parameter Stable Language Model 2 and Enhanced 1.6 Billion Iteration

Stability AI has introduced the latest offerings in its Stable Language Model 2 series: a 12 billion parameter base model and an instruction-tuned variant. Both were trained on two trillion tokens spanning seven languages: English, Spanish, German, Italian, French, Portuguese, and Dutch.

The 12 billion parameter model aims to balance strong performance with efficiency, memory usage, and speed. It follows the framework established in Stability AI's previously published technical report on Stable Language Model 2 1.6B. This release expands the company's model lineup, giving developers a transparent and capable tool for advancing AI language technology.

In addition to the 12B model, Stability AI has rolled out an updated version of its Stable Language Model 2 1.6B. This updated 1.6B variant improves conversational capabilities across the same seven languages while keeping system requirements remarkably low.

Stable Language Model 2 12B is crafted as a resource-efficient open model optimized for multilingual tasks, delivering smooth performance on readily available hardware.

According to Stability AI, this model can tackle tasks typically reserved for much larger models, which often demand significant computational and memory resources, such as large Mixture-of-Experts (MoE) models. The tuned variant is especially suitable for a variety of applications, including serving as a central component in retrieval-augmented generation (RAG) systems, owing to its strong performance in tool usage and function calling.
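To make the tool-usage and function-calling pattern concrete, here is a minimal sketch of the plumbing an application typically provides around such a model: advertise tool schemas in the prompt, then parse a structured tool call out of the model's reply. The prompt format, tool name, and parser below are illustrative assumptions for demonstration, not Stability AI's official API.

```python
import json

def build_tool_prompt(tools, user_query):
    """Embed the available tool schemas and the user query in a prompt.

    This layout is a hypothetical convention; real deployments would use
    the chat/tool template the model was actually tuned with.
    """
    return (
        "You may call one of these tools by replying with JSON "
        '{"name": ..., "arguments": {...}}.\n'
        f"Tools: {json.dumps(tools)}\n"
        f"User: {user_query}"
    )

def parse_tool_call(model_output):
    """Extract (name, arguments) from a JSON tool call, or None for plain text."""
    try:
        call = json.loads(model_output)
        return call["name"], call.get("arguments", {})
    except (json.JSONDecodeError, KeyError, TypeError):
        return None

# Hypothetical tool schema for illustration only.
tools = [{"name": "get_weather", "parameters": {"city": "string"}}]
prompt = build_tool_prompt(tools, "What's the weather in Lisbon?")
```

A model reply such as `{"name": "get_weather", "arguments": {"city": "Lisbon"}}` would parse to `("get_weather", {"city": "Lisbon"})`, while an ordinary text reply falls through to `None` and can be shown to the user directly.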

In evaluations against popular strong language models such as Mixtral, Llama 2, Qwen 1.5, Gemma, and Mistral, Stable Language Model 2 12B performs commendably on zero-shot and few-shot tasks across the general benchmarks of the Open LLM Leaderboard.

With this latest release, Stability AI expands the Stable Language Model 2 family into the 12B realm, offering an open and transparent model without compromising on potency and precision. The company expresses confidence that this release will empower developers and enterprises to forge ahead with future developments while maintaining complete control over their data.

Developers and enterprises can now leverage Stable Language Model 2 12B for both commercial and non-commercial purposes through a Stability AI Membership.