EnCharge AI Raises $100 Million To Challenge The AI Chip Status Quo With In-Memory Computing

For decades, AI computing has revolved around the same fundamental bottleneck: transferring massive amounts of data between memory and processors, consuming both power and time in the process. While companies like Nvidia and AMD have prospered under this paradigm, a new breed of chip startups is taking a different approach, one that aims to eliminate that inefficiency altogether.

Enter EnCharge AI, a Santa Clara-based startup that has just raised $100 million in Series B funding, bringing its total raised to $144 million. The round was led by Tiger Global, with participation from a long list of investors, including Maverick Silicon, Capital TEN, SIP Global Partners, Zero Infinity Partners, CTBC VC, Vanderbilt University, Morgan Creek Digital, Samsung Ventures, and HH-CTBC. Previous backers like RTX Ventures, Anzu Partners, Scout Ventures, AlleyCorp, ACVC, and S5V also doubled down on their investment.

Founded in 2022, EnCharge AI is built on nearly a decade of research from CEO and co-founder Naveen Verma, whose work at Princeton University explored the potential of non-volatile memory devices—a type of memory that retains information even when power is removed. This research has now evolved into an AI accelerator chip that integrates memory and computation, significantly reducing energy consumption while improving performance.

A Different Kind of AI Chip

The AI hardware market has largely been defined by GPUs, which have become the backbone of model training and inference. But GPUs are designed for high-performance parallel computing, not necessarily for power-efficient AI inference, especially in edge computing environments where real-time decision-making is needed.

This is where EnCharge AI’s analog in-memory computing chips come in. Unlike traditional architectures that constantly shuffle data between memory and processing units, EnCharge’s technology performs AI computations directly within memory, slashing energy usage and boosting performance. The company claims this method delivers a 20x improvement in energy efficiency, making AI more accessible to industries that need localized intelligence, such as automotive sensing, industrial robotics, and smart retail.
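The idea can be made concrete with a toy sketch. The following Python snippet is purely illustrative (it does not model EnCharge's actual analog circuitry, and the transfer-counting classes are invented for this example): in a conventional accelerator, weights must be fetched from memory for every matrix-vector product, while a compute-in-memory design keeps weights stationary and only moves the activations.

```python
class ConventionalAccelerator:
    """Weights are fetched from memory for every matrix-vector product."""
    def __init__(self, weights):
        self.memory = weights           # weights live in off-chip memory
        self.transfers = 0              # count simulated memory transfers

    def matvec(self, x):
        fetched = [row[:] for row in self.memory]  # copy weights to compute unit
        self.transfers += sum(len(row) for row in fetched)  # one transfer per weight
        return [sum(w * xi for w, xi in zip(row, x)) for row in fetched]


class InMemoryAccelerator:
    """Weights stay put; the multiply-accumulate happens where they are stored."""
    def __init__(self, weights):
        self.array = weights            # weights written once into the compute array
        self.transfers = 0

    def matvec(self, x):
        self.transfers += len(x)        # only the input activations move
        return [sum(w * xi for w, xi in zip(row, x)) for row in self.array]


weights = [[1, 2], [3, 4], [5, 6]]     # toy 3x2 weight matrix
x = [10, 20]

conv = ConventionalAccelerator(weights)
cim = InMemoryAccelerator(weights)

# Identical results, very different data movement:
assert conv.matvec(x) == cim.matvec(x) == [50, 110, 170]
print(conv.transfers, cim.transfers)   # 6 weight transfers vs 2 activation transfers
```

Since real models involve billions of weights reused across many inputs, keeping them stationary is where the claimed energy savings come from.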

This shift is particularly relevant as companies seek alternatives to cloud-dependent AI processing. While cloud infrastructure remains essential, running AI models directly on local devices—also known as edge computing—reduces reliance on data centers, lowers operational costs, and enables faster, more secure processing.

The funding comes at a crucial time as EnCharge prepares to launch its first commercial AI accelerator products in 2025. The startup has already partnered with semiconductor manufacturing giant TSMC to refine its first-generation chips.

However, Verma has remained tight-lipped about the company’s valuation and customer base. While PitchBook data suggested EnCharge had raised money at a $438 million post-money valuation, the company disputed this claim. The identity of its customers also remains undisclosed, though the composition of its investor base hints at interest from both strategic and financial players in the industry.

The Analog Bet

EnCharge AI’s approach represents a departure from the dominant trends in AI chip development, where companies have focused on scaling digital processing rather than betting on analog computing. But the potential of analog AI chips is gaining attention.

IBM researchers, in a recent paper on analog processing, noted that eliminating the separation between compute and memory could make processors more efficient and economical than traditional digital architectures. However, they also highlighted that analog chips have historically struggled with noise and accuracy, making them less viable for AI model training—though they remain promising for inference.

This is where EnCharge AI claims to have made a breakthrough. According to Verma, the startup’s noise-resilient design solves one of the biggest challenges in analog chip development.

“If you have 100 billion transistors on a chip, they can all have noise, and you need them all to work,” Verma explained. “The big breakthrough we had is figuring out how to make analog not sensitive to noise.”

The company achieves this using a precise set of geometry-dependent metal wires, which are readily available in standard semiconductor supply chains. This, Verma argues, makes EnCharge’s technology scalable and cost-effective compared to other experimental AI architectures.
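Why aggregation can tame analog noise is a general statistical point, not specific to EnCharge's wire-geometry approach. The sketch below (an assumption-laden toy model, with invented function names) adds independent Gaussian noise to each analog multiply: the accumulated signal of a dot product grows linearly with its length while the noise grows only with its square root, so the relative error shrinks as the computation gets longer.

```python
import random
import statistics

random.seed(0)

def noisy_dot(w, x, sigma):
    # each analog multiply picks up independent Gaussian noise
    return sum(wi * xi + random.gauss(0, sigma) for wi, xi in zip(w, x))

def relative_error(n, sigma=0.1, trials=2000):
    """Mean relative error of an n-term dot product with per-multiply noise."""
    w = [1.0] * n
    x = [1.0] * n
    exact = float(n)
    errs = [abs(noisy_dot(w, x, sigma) - exact) / exact for _ in range(trials)]
    return statistics.mean(errs)

# signal grows like n, noise only like sqrt(n): longer accumulations
# are relatively more accurate
print(relative_error(16), relative_error(256))
```

This is one reason inference, which is dominated by long multiply-accumulate chains, is considered a more natural fit for analog hardware than training, where small errors compound across gradient updates.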

Can EnCharge Compete in an Already Crowded Market?

While EnCharge AI’s pitch is compelling, the AI chip industry is notoriously difficult to break into. Established players like Nvidia, AMD, and Intel dominate the space, while newer entrants such as Mythic and Sagence are also developing analog AI chips with a similar vision.

For investors, this means being highly selective about which startups they back. Jimmy Kan, an investment partner at Anzu Partners who previously worked at Qualcomm, noted that his firm had evaluated over 50 AI chip startups before deciding to invest in EnCharge.

“One out of every five of those was some sort of new novel architecture like analog or spiking neural network computation chips,” Kan said. “We really had it in our mind to find an AI compute technology that was really, really differentiated, versus incremental, versus something that Nvidia might just develop next quarter or next year.”

EnCharge AI’s ability to secure funding from a range of semiconductor-focused investors signals confidence in its long-term prospects. Whether it can turn its research into a commercially viable product that competes with established players, however, remains to be seen.

Unlike many deep tech startups that rush to secure venture funding based on early-stage research, EnCharge AI took a different path. The company spent years refining its technology before officially launching in 2022.

“There’s certain kinds of innovations where you can jump to venture backing very early on,” Verma said. “But if what you’re doing is developing a fundamentally new technology, there’s a lot of aspects of that that need to be understood to de-risk. A lot of them fail.”

The challenge now is execution. With $144 million in funding and a roster of experienced semiconductor leaders at the helm, EnCharge AI is betting that its in-memory computing technology will redefine AI inference not just for cloud computing, but for the broader world of edge AI applications.

As the AI hardware race intensifies, the coming years will determine whether EnCharge AI’s vision can translate into tangible market success or if the AI industry remains firmly in the grip of the GPU giants.

For now, Verma is optimistic about what’s ahead.

“Our Series B is a pivotal milestone for the company that signals our readiness to bring our full-stack AI solutions to market in 2025,” he said. “We are grateful to the fantastic group of investors who will help us unlock the potential of artificial intelligence for countless industries and applications in a way that is sustainable, cost-effective, and scalable.”


Anshika Mathews
Anshika is the Senior Content Strategist for AIM Research. She holds a keen interest in technology and related policy-making and its impact on society. She can be reached at anshika.mathews@aimresearch.co