
Snowflake ropes in AI21’s Jamba-Instruct to help enterprises decode long documents


Jamba-Instruct's hybrid architecture makes it more cost-effective than other models in its class available on Snowflake.
Data cloud giant Snowflake has announced it is adding Israeli AI startup AI21 Labs' enterprise-focused Jamba-Instruct LLM to its Cortex AI service.
Available starting today, the model will enable Snowflake’s enterprise customers to build generative AI applications (like chatbots and summarization tools) capable of handling long documents without compromising on quality and accuracy.
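For a sense of what that looks like in practice, here is a minimal sketch of calling the model through Cortex's COMPLETE function from Python. The connection parameters, file name, and the model identifier 'jamba-instruct' are placeholders/assumptions, not confirmed values from the article; check your Snowflake account for the exact identifier and availability in your region.

```python
# Minimal sketch: summarizing a long document with Jamba-Instruct via Snowflake Cortex.
# Assumes the snowflake-connector-python package is installed and that the model is
# exposed in your account under the (assumed) identifier 'jamba-instruct'.

import snowflake.connector

# Connection parameters are placeholders; replace with your own.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_warehouse",
)

# Load a long document (hypothetical file) and build a summarization prompt.
long_document = open("contract.txt", encoding="utf-8").read()
prompt = "Summarize the key obligations in the following contract:\n\n" + long_document

cur = conn.cursor()
# SNOWFLAKE.CORTEX.COMPLETE(model, prompt) returns the model's text completion.
cur.execute(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('jamba-instruct', %s)",
    (prompt,),
)
print(cur.fetchone()[0])

cur.close()
conn.close()
```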
Given enterprises' heavy reliance on large files and documents, Jamba-Instruct could be a valuable asset for teams. However, AI21 is not Snowflake's only large language model (LLM) partner. The Sridhar Ramaswamy-led company has been laser-focused on generative AI, striking several partnerships to build an ecosystem for developing highly performant, data-driven AI apps.
Just a couple of days ago, the company announced a partnership with Meta to bring the all-new Llama 3.1 family of LLMs to Cortex. Before that, it debuted a proprietary enterprise model called 'Arctic'. The approach has been quite similar to that of rival Databricks, which acquired MosaicML last year and has since been moving aggressively, building its own DBRX model and adding new LLMs and tools for customers to build upon.

What does Jamba-Instruct offer to Snowflake users?
Back in March, AI21 made headlines with Jamba, an open generative AI model combining the tried-and-tested transformer architecture with a novel, memory-efficient Structured State Space model (SSM). The hybrid model gave users access to a massive 256K-token context window (the amount of text an LLM can process at once) and activated just 12B of its 52B parameters, delivering a solution that is both powerful and efficient.
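To put those two numbers in perspective, the back-of-the-envelope sketch below works out roughly how much text fits in a 256K-token window and what fraction of the model's parameters are active per token. The 256K and 12B-of-52B figures come from the article; the words-per-token ratio is a rough rule-of-thumb assumption.

```python
# Back-of-the-envelope arithmetic for the figures cited above.

CONTEXT_WINDOW_TOKENS = 256_000   # context window reported for Jamba
WORDS_PER_TOKEN = 0.75            # assumption: rough average for English text

TOTAL_PARAMS = 52e9               # total parameters
ACTIVE_PARAMS = 12e9              # parameters activated per token

approx_words = CONTEXT_WINDOW_TOKENS * WORDS_PER_TOKEN
active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS

print(f"~{approx_words:,.0f} words fit in a single prompt")        # ~192,000 words
print(f"~{active_fraction:.0%} of parameters active per token")    # ~23%
```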
According to AI21, Jamba delivered 3x the throughput on long contexts compared to Mixtral 8x7B (another model in its size class), making it an enticing offering for enterprises. This led the company to debut Jamba-Instruct, an instruction-tuned version of the model with additional training, chat capabilities and safety guardrails to make it suitable for enterprise use cases.
