After Valuation Talks in May, ClickHouse Lands $350 Million at $6.35 Billion

The future of analytics isn’t just dashboards. It’s intelligent agents that interpret data, trigger workflows, and power real-time decisions.

ClickHouse has spent the last decade optimizing for speed. That focus on performance, not extensibility, has made it one of the most widely adopted systems for real-time analytical workloads. Now, it’s also one of the most highly valued.

After reports in May that it was in advanced talks to raise capital at a $6 billion valuation, ClickHouse has now confirmed a $350 million Series C at a valuation of $6.35 billion. The round was led by Khosla Ventures, with participation from BOND, Battery Ventures, Bessemer Venture Partners, and IVP, alongside existing investors Index Ventures, Lightspeed, Benchmark, Coatue, FirstMark, Nebius, and GIC. It also secured a $100 million credit facility from Stifel and Goldman Sachs, bringing total funding to more than $650 million.

ClickHouse, a column-oriented database originally developed at Yandex in 2009, was built to solve a clear problem: how to process complex analytical queries at high speed, across massive datasets, without unnecessary overhead. It remained focused on that core for more than a decade. Then, as companies began turning to real-time telemetry, observability, and machine-driven systems, ClickHouse became the kind of infrastructure that more teams realized they needed.

The company says the capital will be used to scale its cloud product, expand global operations, and strengthen customer partnerships. It has also begun to grow its capabilities through acquisition, adding HyperDX, focused on observability, and PeerDB, which replicates Postgres data into ClickHouse with minimal latency.

“As AI agents proliferate across data-driven applications, observability, data infrastructure, and beyond, the demand for agent-facing databases like ClickHouse has reached an inflection point,” said Aaron Katz, CEO and co-founder. “The future of analytics isn’t just dashboards. It’s intelligent agents that interpret data, trigger workflows, and power real-time decisions.”

From internal tool to core infrastructure

ClickHouse was originally developed by engineer Alexey Milovidov at Russian tech firm Yandex to support its internal web analytics platform. The system was open-sourced in 2016, and in 2021 it became a standalone company, co-founded by Milovidov, Katz (former CRO at Elastic), and Yury Izrailevsky (a former engineering VP at Netflix and Google).

Its architecture has stayed close to its origins. ClickHouse uses a column-oriented storage engine, allowing it to read only relevant data for each query. It performs vectorized execution in-memory, compresses data efficiently, and scales horizontally across clusters of hundreds of nodes. The result is a database designed for high-throughput analytical queries on petabyte-scale structured datasets, with low latency and high concurrency.
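The advantage of column orientation described above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not ClickHouse's actual storage engine: it only shows why an aggregate over one column is cheaper when each column is stored contiguously, since the scan never touches the columns it doesn't need.

```python
# Toy illustration of row-oriented vs column-oriented scans.
# Simplified sketch for intuition only; ClickHouse's real engine adds
# compression, vectorized execution, and on-disk columnar formats.

rows = [
    {"user_id": i, "country": "US" if i % 2 else "DE", "revenue": i * 0.5}
    for i in range(1_000)
]

# Row-oriented layout: aggregating one field still walks whole records.
total_row_oriented = sum(r["revenue"] for r in rows)

# Column-oriented layout: each column is a contiguous array, so the
# same aggregate reads only the "revenue" column.
columns = {
    "user_id": [r["user_id"] for r in rows],
    "country": [r["country"] for r in rows],
    "revenue": [r["revenue"] for r in rows],
}
total_column_oriented = sum(columns["revenue"])

assert total_row_oriented == total_column_oriented
```

The same principle, applied with compressed column files and SIMD-vectorized scans, is what lets a columnar database answer analytical queries over petabytes while reading only a fraction of the stored bytes.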

ClickHouse now counts over 2,000 customers, including Tesla, Meta, Mercado Libre, Sony, Memorial Sloan Kettering, Lyft, GitLab, and Instacart. Its architecture has also proven especially useful to AI-native companies such as Anthropic, LangChain, Weights & Biases, Sierra, and Poolside, which use ClickHouse to power model telemetry, logging, and performance analysis pipelines.

“We designed and built ClickHouse from day one to support a broad spectrum of real-time data applications across industries,” Katz said. “Our momentum reflects that enterprises are hungry for a platform that can keep up with their scaling ambitions.”

Commercial traction and cloud expansion

In late 2022, the company launched ClickHouse Cloud, its fully managed commercial offering. The service, available across AWS, Google Cloud, Microsoft Azure, and Alibaba Cloud, delivers the same performance as the open-source core, with the operational simplicity required for production deployments. It includes proprietary components such as ClickHouse Keeper, which replaces ZooKeeper, and ClickPipes, a built-in ingestion engine.

According to the company, ClickHouse Cloud consistently outperforms traditional data warehouses like Snowflake in speed, compression, and infrastructure efficiency, particularly for structured, high-volume use cases.

The company has grown more than 300% over the past year and has seen demand across sectors where real-time data processing is central to business operations. Its cloud platform now powers systems for fraud detection, observability, real-time personalization, and streaming analytics.

“ClickHouse is solving one of the most important infrastructure challenges of this era of AI and agents,” said Ethan Choi, Partner at Khosla Ventures. “The ability to deliver fast, scalable, and cost-efficient analytics is becoming foundational.”

Showcasing production-scale AI systems

ClickHouse made its funding announcement during its first user conference, held in May, where it highlighted how customers are using its database to support production AI and data-intensive systems. Across presentations, a recurring theme emerged: machine-generated workloads now dominate, and existing infrastructure often struggles to support them.

As Katz explained on stage: “Agent throughput will be constrained by how quickly they can get the data and the insights to make the right decisions. Agents also create an explosion of data that needs to be stored, processed, and analyzed cheaply and quickly. That’s what ClickHouse is built for.”

The systems discussed weren’t prototypes. They included model evaluation engines, experimentation dashboards, inference monitoring systems, and near-real-time feedback loops. What they had in common was the need for consistent, sub-second analytical queries on live data.

While competitors like Databricks and Snowflake have expanded their platforms into full data stacks, adding workflow orchestration, notebooks, AI development kits, and data governance layers, ClickHouse has held its focus. Its managed cloud service is an extension of its core product, not a reinvention. Its acquisitions have been additive to ingestion and observability, not architectural shifts.

“What we’re seeing is that businesses are no longer building just analytics dashboards, they’re building data products,” Katz said. “That includes internal monitoring systems, external user-facing analytics, fraud detection engines, and increasingly, pipelines that support AI agents operating on streaming data.”

Anshika Mathews
Anshika is the Senior Content Strategist for AIM Research. She holds a keen interest in technology and related policy-making and its impact on society. She can be reached at anshika.mathews@aimresearch.co