
Computer Chips Today Are Way Too Hot and Lightmatter Knows Why

Since light defines the speed limit in our universe, it’s the fastest thing out there.

Nick Harris’s journey into technology began at the intersection of two fields that would shape his career: quantum computing and semiconductors. As a doctoral student at MIT, Harris delved into the world of quantum computers, exploring how they manipulate light for computation. Yet his fascination with solving real-world problems had begun even earlier, during his tenure as an R&D engineer. Fresh out of college, Harris immersed himself in the physics of transistors—the nanoscale switches at the heart of computer chips.

It was there that he witnessed a looming crisis: the voracious energy consumption of semiconductors. Not only were they guzzling electricity, but projections suggested their appetite would skyrocket over the next two decades. This realization stayed with him, even as he advanced in quantum research. The turning point came when Harris began contemplating a bold idea: Could the principles of quantum computing—particularly its use of light—be adapted to address the semiconductor energy problem?

This question became the foundation of Lightmatter, the company Harris co-founded with a mission to revolutionize the way computers process information. Leveraging his dual expertise in quantum systems and semiconductors, Harris envisioned a future where AI computing could be faster, cheaper, and vastly more energy-efficient.

The Light-Based Solution to a Growing Problem

At its core, Lightmatter’s innovation lies in photonic computing, a technology that uses light to process AI workloads. While most chips rely on electrical signals that generate significant heat and consume vast amounts of power, photonic chips harness the speed and efficiency of light. “Since light defines the speed limit in our universe, it’s the fastest thing out there,” Harris explained.

This shift from electrons to photons addresses the breakdown of Dennard scaling, the long-standing rule that as transistors shrink, their operating voltage and current shrink in proportion, holding power density constant. That rule no longer holds: at today's nanometer-scale feature sizes, transistors leak current (in part through quantum tunneling), wasting energy and generating heat. Photonic computing sidesteps this issue entirely.
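Dennard scaling can be summarized with the dynamic-power formula P = C·V²·f. A short numeric sketch (all values illustrative, not measured from any real chip) shows why ideal scaling kept chips cool, and why its breakdown does not:

```python
# Illustrative sketch of ideal Dennard scaling: shrinking features by a
# factor k reduces capacitance C and voltage V by 1/k while frequency f
# grows by k, so per-transistor dynamic power P = C * V^2 * f falls by
# 1/k^2 -- exactly offsetting the k^2 more transistors per unit area,
# leaving power density constant. All numbers are illustrative.

def dynamic_power(C, V, f):
    """Dynamic switching power of a CMOS circuit: P = C * V^2 * f."""
    return C * V * V * f

k = 2.0                               # one ideal scaling step

C0, V0, f0 = 1e-15, 1.0, 1e9          # baseline: 1 fF, 1 V, 1 GHz
P0 = dynamic_power(C0, V0, f0)

# Per-transistor power after one ideal scaling step:
P1 = dynamic_power(C0 / k, V0 / k, f0 * k)

print(round(P0 / P1, 6))              # -> 4.0 (power per transistor drops k^2)

# Area per transistor also drops by k^2, so power density is unchanged:
density_ratio = (P1 / (1 / k**2)) / P0
print(round(density_ratio, 6))        # -> 1.0
```

When voltage can no longer scale down (leakage rises too fast), the V²·f term stops shrinking while transistor counts keep growing, which is the energy-density wall Harris describes.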

Using lasers and optical components, Lightmatter’s processors handle data-intensive tasks, such as neural network calculations, with significantly less energy. “We’ve developed a new type of compute element that is based on light—processing light, from lasers. Using that element, we’re able to get around this fundamental technology challenge, this energy problem,” said Harris.
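The article does not detail Lightmatter's circuit design, but photonic AI accelerators of the kind Harris studied at MIT typically realize a weight matrix W as W = U·S·Vᴴ, where the unitaries U and Vᴴ map to lossless interferometer meshes and the diagonal S to per-channel attenuation. A hedged sketch of the math only (not the device physics), with NumPy standing in for the optics:

```python
# Hedged sketch: how an arbitrary neural-network weight matrix can be
# factored into pieces a photonic circuit can implement. U and Vh are
# unitary (realizable as lossless interferometer meshes) and S is
# diagonal (realizable as per-channel gain/attenuation). Light encoding
# the input vector traverses the three stages; the output amplitudes
# equal the matrix-vector product. This models the linear algebra only.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))      # a layer's weight matrix
x = rng.normal(size=4)           # input vector, encoded in optical amplitudes

U, s, Vh = np.linalg.svd(W)      # mesh / attenuators / mesh factorization

y_optical = U @ (s * (Vh @ x))   # light passing through the three stages
y_digital = W @ x                # reference electronic computation

assert np.allclose(y_optical, y_digital)
```

The key property is that the expensive multiply-accumulate work happens passively, as interference, rather than as billions of switching transistors.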

Building Chips for the Future of AI

Lightmatter’s chips are designed with AI in mind, specifically to accelerate neural networks. This technology enables high-speed data processing with reduced energy costs. Unlike traditional chips where data movement requires turning electrical wires on and off—leading to energy losses due to resistance and capacitance—optical systems don’t suffer from these limitations.

“I saw Intel has a new 600-watt computer chip. Those are pretty wild numbers; you start to get to a point where you’re getting close to fundamental limits on your ability to pull heat out of silicon.”

Lightmatter’s chips circumvent these issues by computing with laser light. Photons traveling through optical waveguides don’t lose energy to resistance the way electrons in copper wires do, so computations can run at extremely high speeds with minimal energy loss. According to Harris, this approach overcomes what he calls the primary challenge in computing today: energy density.

Harris broke it down: “On an electronic chip, when you want to send a signal between points, you have to turn a wire off and on. And turning a wire off and on costs energy—you have to dissipate energy through resistance, inductance, and capacitance. But with an optical wire, you don’t have those properties. So turning it off and on takes very, very little energy, and you don’t have time delays associated with that, which allows you to run the processor at very high clock speeds.”
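A back-of-envelope calculation makes Harris's point concrete: fully charging and discharging an electrical wire dissipates roughly C·V² per cycle through its resistance, no matter how cleverly you switch it. The capacitance and voltage figures below are generic assumptions for on-chip interconnect, not Lightmatter measurements:

```python
# Back-of-envelope energy cost of signaling on an electrical wire.
# Charging a wire to V stores (1/2) * C * V^2 and dissipates the same
# amount in the driver's resistance; discharging dissipates the stored
# half too, so one full on/off cycle costs C * V^2. Values are assumed.

def wire_energy_per_cycle(capacitance_farads, voltage):
    """Energy dissipated by one full charge/discharge cycle: C * V^2."""
    return capacitance_farads * voltage ** 2

C_wire = 0.2e-12   # assumed: ~0.2 pF for a 1 mm on-chip wire
V = 0.8            # assumed: 0.8 V logic swing

e_bit = wire_energy_per_cycle(C_wire, V)
print(f"{e_bit * 1e15:.0f} fJ per cycle")   # -> 128 fJ per cycle

# At 10 Gb/s of worst-case toggling, that single wire dissipates:
power = e_bit * 10e9
print(f"{power * 1e3:.2f} mW per wire")     # -> 1.28 mW per wire
```

Multiply that by the millions of wires on a modern accelerator and the chip-level wattage Harris cites becomes plausible; an optical waveguide avoids the C·V² toll entirely.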

This approach not only saves energy but also enables larger, more sophisticated AI models to operate effectively. Harris pointed out that the energy savings and computational efficiency could democratize AI, allowing companies of all sizes to deploy advanced neural networks without prohibitive hardware costs.

The Impact on Data Centers and the AI Industry

One of Lightmatter’s key markets is data centers, which are the backbone of AI operations for companies like Google, Amazon, and Meta. Today’s data centers rely heavily on traditional chips from industry leaders like Nvidia, but these chips are power-hungry. Lightmatter’s technology offers a compelling alternative.

The company claims its photonic chips are 10 times faster than Nvidia’s AI processors and significantly more energy-efficient. By Lightmatter’s estimate, replacing the Nvidia DGX A100 systems in just 100 data centers with its chips would save energy equivalent to the annual electricity usage of 61.6 million homes.

This energy efficiency directly addresses a growing concern in AI computing. As models like GPT-3 demand ever-increasing computational power, the environmental and financial costs of running them have become unsustainable. Harris believes Lightmatter’s technology could democratize access to advanced AI capabilities, preventing a future in which only tech giants can afford to deploy neural networks at scale.

For major tech companies, this energy efficiency translates into cost savings and environmental benefits. “We’re allowing them to operate cheaper, so it costs them less money to do each inference, which means they can run bigger models,” said Harris. This could unlock AI features that have previously been too expensive to deploy, from smarter voice assistants like Siri to improved search algorithms and retail recommendations.

The Road Ahead for Lightmatter

Lightmatter has already gained significant traction, securing $80 million in funding in May 2023 from investors including Hewlett Packard Enterprise and Lockheed Martin. The capital is funding the company’s go-to-market strategy and the teams to execute it. However, as Harris acknowledged, Lightmatter is still largely in the research phase, with significant hurdles ahead before it can displace incumbent technologies in data centers.

One of the company’s early backers is Google Ventures, a fitting investor given that Lightmatter’s chip architecture resembles Google’s Tensor Processing Unit (TPU), a leading AI processor. But where TPUs perform their calculations electrically, Lightmatter performs them in the optical domain, achieving greater energy efficiency.

For Harris, the payoff isn’t just raw speed: “It would speed it up, but I think the proper way to use it would be to do a much more advanced neural network in the same amount of time—to give you a lot more information that’s relevant.”

The transition to photonic computing represents a profound shift in how AI workloads are managed. As Harris noted, “With current technologies, you’re going to see the opposite of democratization. The only people who will be able to run their neural nets will be the big software companies, places like Google, Amazon, and Facebook. I don’t want it to be a future where compute is inaccessible.”

A Vision for an Equitable and Sustainable Future

The implications of Lightmatter’s technology extend beyond energy savings. By making AI computation more affordable and scalable, Lightmatter is paving the way for broader access to advanced AI tools. This could enable smaller companies and startups to compete in an arena currently dominated by tech giants.

Moreover, Lightmatter’s focus on energy efficiency addresses one of the tech industry’s most pressing concerns: sustainability. With AI models like OpenAI’s GPT-3 requiring immense computational resources, the environmental footprint of AI is growing. Lightmatter’s chips could be a crucial step toward mitigating this impact while enabling further advancements in the field.

“There are only a few players that can really do this, and those machines using tons of energy is not something that I’m excited about. We’re enabling a world where AI isn’t just faster but more accessible, efficient, and meaningful for everyone.”

Anshika Mathews
Anshika is an Associate Research Analyst working for the AIM Leaders Council. She holds a keen interest in technology and related policy-making and its impact on society. She can be reached at anshika.mathews@aimresearch.co