
Snowflake Launches Arctic: The Most Open, Enterprise-Grade Large Language Model

This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI.

Snowflake, the Data Cloud company, has unveiled its latest innovation: Snowflake Arctic. This state-of-the-art large language model (LLM) marks a significant milestone in AI technology, designed to be the most open and enterprise-grade LLM available on the market.

Advancing AI with Openness and Innovation

Snowflake Arctic incorporates a unique Mixture-of-Experts (MoE) architecture, delivering top-tier intelligence with unparalleled efficiency at scale. What sets Arctic apart is its commitment to openness: Snowflake has not only released Arctic’s weights under an Apache 2.0 license but also shared details of the research behind its development, setting a new standard for transparency and collaboration in enterprise AI technology.
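Because the weights are openly licensed, the model can be pulled directly from a public model hub. The snippet below is a minimal sketch, assuming the instruct checkpoint is published on Hugging Face under a model ID along the lines of Snowflake/snowflake-arctic-instruct; the exact ID and the need for trust_remote_code are assumptions to verify on Snowflake’s Hugging Face organization page.

```python
# Minimal sketch: loading the openly licensed Arctic weights with Hugging Face
# transformers. The model ID and the trust_remote_code flag are assumptions;
# check Snowflake's Hugging Face page for the exact checkpoint names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Snowflake/snowflake-arctic-instruct"  # assumed model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # shard across available GPUs; Arctic is large
    trust_remote_code=True,   # MoE models often ship custom modeling code
)

inputs = tokenizer("Summarize last quarter's sales in one sentence.", return_tensors="pt")
outputs = model.generate(**inputs.to(model.device), max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```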

Sridhar Ramaswamy, CEO, Snowflake: “This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI. By delivering industry-leading intelligence and efficiency in a truly open way to the AI community, we are furthering the frontiers of what open source AI can do. Our research with Arctic will significantly enhance our capability to deliver reliable, efficient AI to our customers.”

Empowering Enterprise AI Strategies

According to a recent Forrester report, nearly half of global enterprise AI decision-makers are leveraging existing open source LLMs for their organization’s AI strategy. With Snowflake Arctic, the company is empowering all users to harness the power of industry-leading open LLMs. Snowflake Arctic is available under an Apache 2.0 license, enabling ungated personal, research, and commercial use.

Yoav Shoham, Co-Founder and Co-CEO, AI21 Labs: “Snowflake Arctic is poised to drive significant outcomes that extend our strategic partnership, driving AI access, democratization, and innovation for all. We are excited to see Snowflake help enterprises harness the power of open source models, as we did with our recent release of Jamba — the first production-grade Mamba-based Transformer-SSM model. Snowflake’s continued AI investment is an important factor in our choosing to build on the Data Cloud, and we’re looking forward to continuing to create increased value for our joint customers.”

Collaboration and Compatibility

Snowflake Arctic is not just about providing access; it’s about facilitating collaboration and compatibility. Snowflake offers code templates, flexible inference, and training options, allowing users to quickly deploy and customize Arctic using their preferred frameworks. Arctic is available for serverless inference in Snowflake Cortex, the company’s fully managed service offering machine learning and AI solutions in the Data Cloud. Additionally, it will be available on Amazon Web Services (AWS), expanding its accessibility.
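For teams that prefer the fully managed route, Arctic can be reached through Cortex’s COMPLETE function from SQL or from any Snowflake client. The sketch below uses the Python connector; the connection parameters are placeholders and the 'snowflake-arctic' model identifier is an assumption to check against the Cortex documentation for your account.

```python
# Minimal sketch: calling Arctic through the Snowflake Cortex COMPLETE function
# from Python via the Snowflake connector. Credentials are placeholders and the
# 'snowflake-arctic' model name is an assumption; verify in your account.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder connection parameters
    user="my_user",
    password="my_password",
    warehouse="my_warehouse",
)

prompt = "Classify this support ticket as 'billing' or 'technical': my invoice is wrong."
cur = conn.cursor()
cur.execute(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('snowflake-arctic', %s)",
    (prompt,),
)
print(cur.fetchone()[0])  # the model's completion text
cur.close()
conn.close()
```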

David Brown, Vice President of Compute and Networking, AWS: “Snowflake and AWS are aligned in the belief that generative AI will transform virtually every customer experience we know. With AWS, Snowflake was able to customize its infrastructure to accelerate time-to-market for training Snowflake Arctic. Using Amazon EC2 P5 instances with Snowflake’s efficient training system and model architecture co-design, Snowflake was able to quickly develop and deliver a new, enterprise-grade model to customers. And with plans to make Snowflake Arctic available on AWS, customers will have greater choice to leverage powerful AI technology to accelerate their transformation.”

Efficiency and Performance Redefined

Snowflake’s AI research team has achieved remarkable feats with Arctic. Not only was it built in less than three months, but it also incurred significantly lower training costs than comparable models. The model’s MoE design improves both training efficiency and model quality, while its carefully crafted data composition targets enterprise needs. During both training and inference, Arctic activates only a fraction of its parameters compared to leading models, setting new benchmarks for efficiency without compromising performance.
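To see why an MoE model can activate only a fraction of its parameters per token, consider the routing step: a small router scores the experts and only the top-scoring few actually run. The sketch below is an illustrative toy with made-up layer sizes, not Snowflake’s implementation.

```python
# Illustrative Mixture-of-Experts routing sketch (not Snowflake's actual code):
# a router picks the top-k experts per token, so only a small fraction of the
# total expert parameters is used for any single token. Sizes are invented.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 16, 2                     # illustrative sizes

router_w = rng.normal(size=(d_model, n_experts))          # router weights
experts = rng.normal(size=(n_experts, d_model, d_model))  # one FFN matrix per expert

def moe_layer(x):
    """x: (d_model,) token representation -> (d_model,) output."""
    logits = x @ router_w                        # score every expert for this token
    top = np.argsort(logits)[-top_k:]            # keep only the top-k experts
    gates = np.exp(logits[top] - logits[top].max())
    gates /= gates.sum()                         # softmax over the selected experts
    return sum(g * (experts[e] @ x) for g, e in zip(gates, top))

token = rng.normal(size=d_model)
out = moe_layer(token)
print(f"output shape {out.shape}, ~{top_k / n_experts:.0%} of expert parameters active per token")
```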

Shishir Mehrotra, Co-Founder and CEO, Coda: “As the pace of AI continues to accelerate, Snowflake has cemented itself as an AI innovator with the launch of Snowflake Arctic. Our innovation and design principles are in-line with Snowflake’s forward-thinking approach to AI and beyond, and we’re excited to be a partner on this journey of transforming everyday apps and workflows through AI.”

Clement Delangue, CEO and Co-Founder, Hugging Face: “There has been a massive wave of open-source AI in the past few months. We’re excited to see Snowflake contributing significantly with this release not only of the model with an Apache 2.0 license but also with details on how it was trained. It gives the necessary transparency and control for enterprises to build AI and for the field as a whole to break new grounds.”

Snowflake Continues to Accelerate AI Innovation for All Users

Snowflake continues to provide enterprises with the data foundation and cutting-edge AI building blocks they need to create powerful AI and machine learning apps with their enterprise data. When accessed in Snowflake Cortex, Arctic will accelerate customers’ ability to build production-grade AI apps at scale, within the security and governance perimeter of the Data Cloud.

In addition to the Arctic LLM, the Snowflake Arctic family of models also includes the recently announced Arctic embed, a family of state-of-the-art text embedding models available to the open source community under an Apache 2.0 license. The family of five models is available on Hugging Face for immediate use and will soon be available as part of the Snowflake Cortex embed function (in private preview). These embedding models are optimized to deliver leading retrieval performance at roughly a third of the size of comparable models, giving organizations a powerful and cost-effective option for combining proprietary datasets with LLMs in a Retrieval-Augmented Generation (RAG) or semantic search service.
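As an illustration of how such embedding models slot into the retrieval step of a RAG or semantic search pipeline, here is a minimal sketch using sentence-transformers; the model ID Snowflake/snowflake-arctic-embed-m is an assumption, and the documents are placeholders.

```python
# Minimal sketch of semantic search with an Arctic embed model via
# sentence-transformers. The model ID is an assumption; see Snowflake's
# Hugging Face page for the exact checkpoint names.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-m")  # assumed ID

docs = [
    "Arctic is released under an Apache 2.0 license.",
    "Snowflake Cortex offers serverless inference in the Data Cloud.",
    "The cafeteria menu changes every Tuesday.",
]
query = "What license is the Arctic model under?"

doc_emb = model.encode(docs, normalize_embeddings=True)      # (n_docs, dim)
query_emb = model.encode([query], normalize_embeddings=True)[0]

scores = doc_emb @ query_emb          # cosine similarity on normalized vectors
best = int(np.argmax(scores))
print(f"Best match: {docs[best]!r} (score={scores[best]:.3f})")
```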

Snowflake also prioritizes giving customers access to the newest and most powerful LLMs in the Data Cloud, including the recent additions of Reka and Mistral AI’s models. Moreover, Snowflake recently announced an expanded partnership with NVIDIA to continue its AI innovation, bringing together the full-stack NVIDIA accelerated platform with Snowflake’s Data Cloud to deliver a secure and formidable combination of infrastructure and compute capabilities to unlock AI productivity. Snowflake Ventures has also recently invested in Landing AI, Mistral AI, Reka, and more to further Snowflake’s commitment to helping customers create value from their enterprise data with LLMs and AI.

In conclusion, Snowflake’s launch of Arctic represents a groundbreaking advancement in AI technology. With its Mixture-of-Experts architecture and commitment to openness, Snowflake Arctic sets a new standard for enterprise-grade large language models. By providing access to cutting-edge AI building blocks and fostering collaboration through its Data Cloud platform, Snowflake empowers enterprises to harness the full potential of AI and machine learning. With Arctic and its family of models, Snowflake continues to accelerate AI innovation, driving value for customers and shaping the future of AI-enabled solutions in the Data Cloud era.

Anshika Mathews
Anshika is an Associate Research Analyst working for the AIM Leaders Council. She holds a keen interest in technology and related policy-making and its impact on society. She can be reached at anshika.mathews@aimresearch.co