At Snowflake’s annual Summit, Christian Kleinerman, the Executive Vice President of Product, unveiled the company’s latest innovations and provided insights into Snowflake’s strategic approach to artificial intelligence (AI). A recurring theme throughout the conversation was Snowflake’s commitment to bringing AI models directly to enterprise data, ensuring data governance and security remain top priorities.
Addressing Regional Availability Challenges
One of the key questions concerned the availability of Snowflake’s AI platform, Cortex, across different regions. Kleinerman acknowledged that Cortex is currently available on AWS in the Sydney and Tokyo regions. However, he emphasized the importance of cross-region calling capabilities, since cloud providers do not offer larger GPUs such as the NVIDIA A100 and H100 in every region.
“Cloud providers themselves don’t have the larger GPUs in every region,” Kleinerman explained. “So we actually have documentation on which models, not GPUs, but whether it’s a small instance or a large instance of which LLMs (large language models) are available in which region.”
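The cross-region behaviour Kleinerman describes can be pictured as a simple routing decision: serve the call locally when the model is hosted in the caller’s region, otherwise forward it to a region that does host it. The sketch below is purely illustrative — the availability table, model names, and function are hypothetical, not Snowflake’s actual API; the real model-to-region mapping lives in Snowflake’s documentation.

```python
# Illustrative sketch of cross-region routing for LLM calls.
# MODEL_REGIONS is a made-up availability table: larger models need
# A100/H100-class GPUs, which exist in fewer regions.
MODEL_REGIONS = {
    "large-llm": {"aws-us-west-2"},
    "small-llm": {"aws-us-west-2", "aws-ap-southeast-2", "aws-ap-northeast-1"},
}

def resolve_region(model: str, home_region: str) -> str:
    """Return the caller's own region if the model is hosted there,
    otherwise fall back to a region that does host it (a cross-region call)."""
    hosted = MODEL_REGIONS.get(model, set())
    if home_region in hosted:
        return home_region
    if hosted:
        return sorted(hosted)[0]  # deterministic fallback choice
    raise ValueError(f"model {model!r} is not hosted in any region")

# A Sydney-based caller using a small model stays local...
print(resolve_region("small-llm", "aws-ap-southeast-2"))  # aws-ap-southeast-2
# ...but a large model is transparently routed to a region with larger GPUs.
print(resolve_region("large-llm", "aws-ap-southeast-2"))  # aws-us-west-2
```

The point of the sketch is that the caller never changes its request; the platform decides where the inference actually runs.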
The Business Value Perspective on AI Costs
Kleinerman addressed concerns about the potential high costs associated with AI workloads, offering a nuanced perspective. He cited a real-world example of a small startup where an AI-powered customer support bot handled more cases than the entire human support team combined, with lower response times and higher customer satisfaction.
“In many, many use cases, it’s just going to be dramatic savings,” Kleinerman asserted. “The cost of an LLM call is very low. So in my mind, the question of cost is a question of what is the business value.”
Bringing AI to the Data, Not the Other Way Around
A core tenet of Snowflake’s AI strategy is to bring the models directly to the data, rather than sending data to external APIs. This approach addresses critical security and governance concerns, as customer data never leaves the trusted Snowflake environment.
“Our whole AI strategy is, instead of taking data and sending it to some API by someone like OpenAI or someone else, and then once you send the data, you have a lot of security questions or governance concerns, our whole strategy is to bring the models to the data,” Kleinerman stated.
He further emphasized that Snowflake guarantees the models are hosted by Snowflake, ensuring customer data is never used for training and prompts are never retained, even for custom models deployed through Snowflake’s container services.
Embracing Open Source and Interoperability
Addressing the theme of democratization and open source, Kleinerman acknowledged that competitors often follow Snowflake’s lead, but he views this as validation of their approach. He underscored Snowflake’s commitment to interoperability and open catalogues through initiatives like Polaris and Iceberg, with support from major cloud providers like Amazon, Microsoft, and Google.
“We all agree. We want to interoperate. We want to have open catalogues,” Kleinerman said, referring to Snowflake’s partnerships with cloud giants.
Regarding Databricks’ acquisition of Tabular, the company founded by Apache Iceberg’s creators, Kleinerman predicted mounting pressure on them to open-source their offerings, as Snowflake and other partners remain committed to an open and interoperable ecosystem.
Accelerating Innovation Through Pragmatism
Kleinerman attributed the acceleration of innovation at Snowflake to three key factors:
1. Completion of large, complex infrastructure projects, such as being data-complete and compute-complete.
2. Extensibility through native apps, which are faster to build than full-fledged features.
3. A mindset shift towards more pragmatic choices and incremental delivery, exemplified by the early general availability (GA) release of Cortex LLM functions.
“If there are instances where we can deliver value to our customers sooner, faster, we’re not going to delay that,” Kleinerman explained, highlighting Snowflake’s commitment to rapid value delivery.
Positioning as the AI Data Cloud
Reflecting the convergence of data and AI teams in enterprises, Snowflake now positions itself as the “AI Data Cloud.” This strategic shift acknowledges the blending of skill sets and responsibilities within modern data and AI teams.
Strategic Acquisitions for Skills and Roadmap Acceleration
Kleinerman also discussed Snowflake’s acquisition strategy, which is driven primarily by acquiring skills or accelerating product roadmaps, rather than by buying customers or revenue. Recent acquisitions like Neeva brought valuable AI and LLM expertise to the company.
“The areas where we choose to acquire companies are based on one of two things,” Kleinerman explained. “Either folks that can help us accelerate a roadmap, or when we have skills gaps that we want to fill.”
Throughout the conversation, Kleinerman emphasized Snowflake’s customer-centric approach, commitment to open standards, and focus on enabling enterprises to leverage AI while keeping data secure and governed within their trusted environments. As AI continues to transform industries, Snowflake’s strategy of bringing models to enterprise data could prove to be a competitive advantage in the rapidly evolving data and AI landscape.