How Light Could Change Computing And Data Transfer

Photo credit: LRZ / Veronika Hohenegger

Modern workloads are pushing high-performance computers to the breaking point, so tech giants and startups are racing to find ways to ease growing energy and performance demands. Photonics is shedding light on potential ways forward.

In a September paper in the journal Nature, researchers at Microsoft’s lab in Cambridge, England, describe building a small analog optical computer (AOC) from micro-LEDs, lenses and phone camera sensors that uses light instead of electricity to perform calculations. Photons can transmit data faster than electrons and work in low-energy environments, which makes them well suited to processing intensive workloads like scientific computations, machine learning and optimization problems. The goal of building the AOC was to show that using light for computation could make future systems faster and more sustainable than today’s digital machines, with potentially huge upsides for business applications.
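To picture how such a machine computes, note that a camera pixel naturally sums whatever light falls on it, and an attenuating mask naturally multiplies intensities. The sketch below, our own illustrative model rather than Microsoft’s published design, simulates an analog optical matrix-vector multiply in that spirit, with a noise term standing in for analog imperfections:

```python
import numpy as np

# Our own illustrative model, not Microsoft's published design: an analog
# optical matrix-vector multiply. Values travel as light intensities, so
# inputs and weights are non-negative, and a camera pixel physically sums
# whatever light reaches it.

rng = np.random.default_rng(0)

x = rng.uniform(0, 1, size=8)        # input vector: micro-LED brightness levels
W = rng.uniform(0, 1, size=(4, 8))   # weights: a transmission mask (0..1)

# Pixel i integrates light from every LED j, attenuated by mask entry W[i, j];
# the physics performs the dot product in a single pass.
light_at_sensor = W @ x

# Analog readout is noisy: model shot/readout noise at the camera.
readout = light_at_sensor + rng.normal(0, 0.01, size=4)

print("digital result: ", W @ x)
print("optical readout:", readout)
```

Because the physics does the multiply-accumulate in a single pass rather than one transistor flip at a time, the energy cost grows only modestly with problem size, which is part of the appeal for the workloads named above.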

Microsoft’s multi-year research project with Barclays Bank PLC involved using the AOC to solve the type of optimization problem that is posed every day at the clearinghouses that serve as intermediaries between banks and other financial institutions. Another promising area for analog optical computers is healthcare. Microsoft used its AOC to enable much quicker MRI scans, which would make it possible to do more scans with one MRI machine each day.
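To make the clearinghouse example concrete, here is a deliberately tiny, hypothetical version of a settlement problem; the names, balances and amounts are invented, and the real Barclays formulation is vastly larger. The task is to choose which pending transactions to settle so that the settled value is maximized while no party’s final balance goes negative:

```python
from itertools import product

# A deliberately tiny, hypothetical settlement problem (names, balances and
# amounts invented; the real Barclays formulation is vastly larger): choose
# which pending transactions to settle so that settled value is maximized
# while no party's final balance goes negative.

transactions = [("A", "B", 50), ("B", "C", 80), ("C", "A", 40), ("A", "C", 30)]
balances = {"A": 30, "B": 40, "C": 10}

def feasible(selection):
    # All selected transactions settle simultaneously; check final positions.
    net = dict(balances)
    for chosen, (sender, receiver, amount) in zip(selection, transactions):
        if chosen:
            net[sender] -= amount
            net[receiver] += amount
    return all(v >= 0 for v in net.values())

best = max(
    (sel for sel in product([0, 1], repeat=len(transactions)) if feasible(sel)),
    key=lambda sel: sum(t[2] for c, t in zip(sel, transactions) if c),
)
print("settle:", [t for c, t in zip(best, transactions) if c])
```

Brute force checks 2^n subsets, which is hopeless at the scale of a real clearinghouse processing thousands of transactions a day. That combinatorial blow-up is exactly the kind of problem an analog optimizer is meant to attack.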

Microsoft’s work gives a glimpse of photonic computing’s potential. The next wave of computing will be shaped by advances in simulation, image analysis, classification and other complex applications – areas where existing chip technologies are reaching their physical and economic limits: performance stagnates, energy demand skyrockets and costs explode. Photonic processing is poised to become a critical pillar of next-generation computing because it excels in exactly these domains.

Like Microsoft, German scale-up Q.ANT is using photons instead of electrons to process information, an approach that could make computations faster, greener and far more efficient. Meanwhile, IBM and Spanish scale-up IPronics are separately developing ways to move data between chips using light, a shift that could make it easier to train large AI models and develop AI applications.

A Platform for Multitask, High-Performance Computing

The concept of optical computing – performing analog computation with light rather than translating complex mathematical functions into digital bits and transistor flips – was introduced several decades ago, but a resurgence of interest has been spurred by rapid progress in silicon photonics, nanophotonics and materials science.

Another paper published in Nature in April sheds light on why U.S. research and advisory firm Gartner just included photonic computing in its 2025 Hype Cycle for Data Center Infrastructure Technologies.

A variety of technical developments have demonstrated integrated photonics’ potential for accelerating computation, says the April Nature paper. These efforts, complemented by promising results on optical transformers and nonlinear activation functions, have confirmed the extensive potential of photonics as a platform for multitask, high-performance computing. Recent studies have also shown that integrated silicon photonics can perform AI training tasks. And, importantly for businesses, photonic computing has demonstrated its potential to solve complex problems more efficiently.

Photonic computing is one of the future computing technologies being tested at the Leibniz Supercomputing Centre (LRZ) on the Garching campus near Munich, operated by the Bavarian Academy of Sciences and Humanities (see the photo). Among other things, LRZ runs an experimental lab for data centers.

LRZ is testing German photonic computing scale-up Q.ANT’s photonic AI accelerator, which CEO and Co-founder Michael Förtsch says enables drastic energy savings and performance gains: for complex AI workloads, Q.ANT claims its photonic processors can reach 50x better performance than state-of-the-art GPUs and 30x better energy efficiency.

“We wanted to see if you could bring this technology into a data center and use it for applications, so we got our hands on it to measure and evaluate and report afterwards,” says Prof. Dieter Kranzlmüller, director of LRZ and an expert in next-level compute paradigms. LRZ won’t complete its testing of Q.ANT’s system for another few months, but he calls its technology “very promising.”

In an interview with The Innovator, Kranzlmüller talks about the stark difference between graphics processing units (GPUs) – the chips used today to power AI – and Q.ANT’s photonic chips.

“When I open the doors to a rack of GPUs in LRZ there is a lot of heat emanating from it, even if the system is idle,” says Kranzlmüller. “When I walk around the corner and open the cabinet with working Q.ANT photonic chips there is no heat. You can feel it when you are in front of the system.”

Q.ANT has been working on data processing with light since 2018. On October 30 it announced the second closing of its Series A funding round, securing an additional investment from Duquesne Family Office, the investment firm of American billionaire Stanley F. Druckenmiller. The latest raise brings Q.ANT’s total funding to $80 million, the largest Series A financing round for photonic computing in Europe. The company said the funds will help accelerate commercialization of Q.ANT’s light-based processors, drive next-stage technology development to improve AI infrastructure and support the company’s expansion into the U.S. market.

Other startups, including Lightmatter (which is backed by Google), Lightelligence, Luminous and LightSolver, are also working on photonic computing. Q.ANT, whose advisors include Hermann Hauser, a serial entrepreneur and founder of Amadeus Capital Partners, and Hermann Eul, a former Intel executive and member of the board of Infineon Technologies, Europe’s largest semiconductor manufacturer, says it is the first photonic computing company to have publicly presented a prototype that easily plugs into existing data centers with no need for changes to current hardware or software stacks.

“Q.ANT has a product that you can buy and integrate into existing ecosystems,” says Leon Varga, Q.ANT’s Senior Software Developer. “It is already implemented on standard protocols, in a standard form factor, and runs in the data center.”

That is a major difference from Microsoft’s approach, he says. Microsoft’s AOC “is a laboratory setup on an optical table,” he says, which currently can’t be integrated into a product format or compute environment.

He also points out that Microsoft’s AOC is going in a different direction technologically. The researchers are drawing on insights from the quantum computing space and so-called “quantum-inspired” algorithm optimization schemes. “That lies in the future,” says Varga. “At Q.ANT, we look at the current market and try to support the latest use cases and open up new applications.” These include next-generation AI applications such as computer vision (image-based AI), highly nonlinear algorithms with key applications in machine learning, and other algorithms that can be used to solve current industrial problems involving classification, such as quality inspections or inventory optimization, says Varga. “We are also looking at the field of physics simulations in HPC [high-performance computing], such as thermodynamic simulations or weather forecasting,” he says.

Q.ANT has also tested an MRI scan use case with its native processing unit, a specialized microprocessor designed to mimic the processing function of the human brain. “We proved that error propagation of analog compute does not occur and that AI based on photonic compute is feasible, with the same performance at lower energy and a smaller footprint,” says Varga. Q.ANT’s photonic system achieves 99.5% consistency with digital CPU/GPU [central processing unit/graphics processing unit] results across deep, multi-layer architectures, he says.
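As an illustration of what such a consistency figure measures – this is our own sketch of a plausible methodology, not Q.ANT’s – one can run the same deep network twice, once cleanly and once with small per-layer noise standing in for analog imperfections, and count how often the two runs agree:

```python
import numpy as np

# Our own illustrative sketch of how a consistency figure might be measured
# (not Q.ANT's actual methodology): run the same deep network once cleanly
# and once with small per-layer noise standing in for analog imperfections,
# then count how often the two runs agree on the predicted class.

rng = np.random.default_rng(1)
layers = [rng.normal(size=(32, 32)) / np.sqrt(32) for _ in range(8)]  # deep stack

def forward(x, noise=0.0):
    for W in layers:
        x = np.tanh(W @ x)                              # nonlinear layer
        if noise:
            x = x + rng.normal(0, noise, size=x.shape)  # analog imperfection
    return x

inputs = rng.normal(size=(1000, 32))
digital = [np.argmax(forward(x)) for x in inputs]
analog = [np.argmax(forward(x, noise=0.02)) for x in inputs]

consistency = np.mean([d == a for d, a in zip(digital, analog)])
print(f"consistency with digital reference: {consistency:.1%}")
```

The concern such a test addresses is error propagation: in a deep stack, small analog deviations could in principle compound layer by layer until outputs diverge from the digital reference.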

“The really important thing is this,” says Varga. “Our photonic processor is built to unlock completely new functional spaces, enable other algorithms that were previously out of reach, and thus open up new applications. This is what we mean when we say we give the industry a new tool to compute new things – to rethink computing.”

That may take some time. Photonic computing requires the development of compatible models and algorithms to be implemented on this new form of hardware. LRZ’s Kranzlmüller points to Nvidia’s AI chips as an example. Nvidia improved its technology over time, but the real key to its success was that it built an ecosystem of software [called CUDA] and hardware that can be blended and is easy to use, he says. “Q.ANT has built the hardware; now it needs to build the rest of the software environment” so there is a full-stack hardware/software environment of products, he says.

Optical Data Transfer

AI is not just putting a strain on compute processors. It is also testing the limits of data transfer. Here, too, advanced optical technologies are playing a role. “We are starting to see that coming to the fore now,” says Bob Sorensen, Senior Vice President of Research at U.S.-based Hyperion Research. “Optical is stepping in as a replacement technology for data transfer, driven primarily by the significant data demands of AI applications.

“While optical computing is more speculative – it is likely to be a few years before it is commercialized and becomes widespread – optical interconnect is closer to reality and has much broader applications and potential for revenue,” he says.

IBM has developed a cost-efficient technology based on photonic integration that reduces the energy needed to optically connect computing units.

Instead of focusing on light to do computation, IBM’s labs are developing photonic links that shuttle data between chips, memory and boards, with the aim of easing the bottlenecks that slow AI systems today. “Light is used to transfer data from chip to chip, from CPU to memory or from one board to another,” IBM researcher Jean Benoit Héroux was quoted as saying in an IBM blog posting. “That’s where IBM’s optical efforts are focused.”

Every bit of data that moves through a data center costs energy. For Héroux and his colleagues, the key number to watch is picojoules per bit, a measure of how much energy it takes to send a single bit of information. “Today, we’re around five to 10 picojoules per bit,” Héroux says in the blog. “The goal is to get below one.”
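A quick back-of-the-envelope calculation shows what those numbers mean at scale. The link speed and link count below are our own hypothetical figures, not IBM’s; the arithmetic is simply power equals bits per second times energy per bit:

```python
# Back-of-the-envelope arithmetic for what picojoules per bit means at scale.
# The link speed and link count below are our own hypothetical figures, not
# IBM's: power = bits per second x joules per bit.

link_rate_bps = 1.6e12   # a hypothetical 1.6 Tb/s chip-to-chip optical link
links = 10_000           # links in a hypothetical large AI cluster

for pj_per_bit in (10, 5, 1):
    watts_per_link = link_rate_bps * pj_per_bit * 1e-12
    print(f"{pj_per_bit:>2} pJ/bit -> {watts_per_link:4.1f} W per link, "
          f"{watts_per_link * links / 1000:5.0f} kW across {links:,} links")
```

On those assumed figures, dropping from 10 pJ/bit to below 1 would cut the interconnect’s share of a cluster’s power budget by an order of magnitude.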

Achieving that will require new materials and designs. Last year, IBM unveiled a polymer optical waveguide technology that aims to squeeze more optical channels into less space.

Meanwhile, Spanish scale-up IPronics is working on a new approach to optical circuit switching (OCS) that is expected to enable the structured, high-volume data movement that AI and advanced computing require. Just as electrical packet switching powered the Internet era, optical circuit switching is expected to shape the AI era.

The first commercial OCSs appeared in the early 2000s, based on micro-electro-mechanical systems (MEMS), which use tiny movable mirrors to steer light between ports. MEMS-based OCSs are mature and wavelength-agnostic, but they face some reliability challenges. Their ability to support large port counts has enabled their adoption in high-performance environments. However, with the rise of AI the technology faces practical limits in switching speed, cost per port and integration density.

To overcome these constraints, silicon photonics (SiPh) has emerged as the next stage of OCS development. SiPh has no moving parts and enables faster reconfiguration times and higher on-chip integration, says Ana Gonzalez, IPronics’ vice-president of business development. Its compatibility with standard CMOS processes allows manufacturers to scale production while reducing costs. The technology is still maturing, but its strong potential is already visible as it begins to make its way into early commercial deployments and next-generation AI data centers, she says.

In real-world deployments, the benefits of next-generation OCSs are most apparent in structured workloads, such as AI/ML clusters, where traffic patterns are predictable and repetitive, she says. In these scenarios, circuits can be reconfigured in microseconds, enabling low-latency interconnects and contention-free communication across thousands of processors. This makes the latest generation of OCSs particularly effective in GPU clusters, where data must move in tightly synchronized, all-to-all patterns, says Gonzalez.
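A toy schedule shows why such predictable, synchronized traffic suits a circuit switch. This is a generic textbook-style rotation schedule, not IPronics’ actual scheduler: each switch configuration realizes one permutation of ports, and cycling through N-1 rotations gives every node a circuit to every other node with no two senders ever targeting the same receiver:

```python
# A generic textbook-style rotation schedule, not IPronics' actual scheduler:
# each switch configuration realizes one permutation of ports, and cycling
# through N-1 rotations gives every node a circuit to every other node with
# no two senders ever targeting the same receiver.

N = 8  # nodes (e.g., GPUs) attached to the optical circuit switch

for t in range(1, N):  # N-1 reconfiguration slots
    circuits = [(i, (i + t) % N) for i in range(N)]
    receivers = [dst for _, dst in circuits]
    assert len(set(receivers)) == N  # valid permutation: contention-free
    print(f"slot {t}: " + "  ".join(f"{s}->{d}" for s, d in circuits))
```

With microsecond reconfiguration, the time spent switching between slots becomes negligible next to the bulk transfers within each slot.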

“Super-fast optical switches are becoming essential to growing the performance of scale-up networks of GPUs and memory,” says IPronics CEO Christian Dupont. “IPronics is the first to launch into the market the next-generation optical circuit switch product based on silicon photonics,” he says. “We are already shipping to customers that are testing the technology and we expect the first network tests in 2026.”

The Future Of Computing Will Resemble Netflix

The advancement of photonic computing and photonic data transfer underscores how the future of computing will be heterogeneous, says Hyperion Research’s Sorensen. “Sometimes using CPUs and Intel microprocessors will be enough. Other times you may need GPUs and AI. Certain applications may require quantum computing. Photonic computing may be the fastest hardware implementation for some specific applications and algorithms. The workload will define the kind of hardware you need.”

Sorensen uses the analogy of television. It used to be that broadcast television was the only option. You watched whatever was on. That was traditional computing. You bought what the vendor offered. Today there are multiple streaming services like Netflix, he says, and you search for the one that has what you want. He advises business leaders to do the same with the current range of computing architectures. “Understand your current computational workloads and find the pain points,” he says. “Determine what are the things you need to reach your ultimate goals and what are the compute problems you have that matter the most. Explore the architectural options in earnest and figure out where you can benefit most from looking at new technology.”


About the author

Jennifer L. Schenker

Jennifer L. Schenker, an award-winning journalist, has been covering the global tech industry from Europe since 1985, working full-time, at various points in her career, for the Wall Street Journal Europe, Time Magazine, International Herald Tribune, Red Herring and BusinessWeek. She is currently the editor-in-chief of The Innovator, an English-language global publication about the digital transformation of business. Jennifer was voted one of the 50 most inspiring women in technology in Europe in 2015 and 2016 and was named by Forbes Magazine in 2018 as one of the 30 women leaders disrupting tech in France. She has been a World Economic Forum Tech Pioneers judge for 20 years. She lives in Paris and has dual U.S. and French citizenship.