How is photonic computing moving toward commercialization?


Photonic computing uses light instead of electricity to perform calculations.

Image source: Getty Images/iStockphoto




Ruti Ben-Shlomi

Co-founder and CEO of LightSolver



  • Silicon-based computing has always been a key driver of innovation, but today’s workloads are pushing high-performance computers to their limits.

  • Photon-based (or optical) computing—using light instead of electricity—is paving the way for a new world of faster, more efficient computation.

  • As the demand for faster, more energy-efficient computing continues to grow, photonic systems not only complement traditional technologies but also hold the potential to outperform them in certain applications.


In recent decades, advancements in silicon-based computing have fueled innovation. However, today’s demanding workloads—such as large-scale projects like climate simulations and AI training—are pushing high-performance computers to their limits.


Meanwhile, quantum computing is still in its early stages and will likely take years before it can be deployed for large-scale commercial applications. In this gap, a promising contender has emerged: photonic, or optical, computing.


Photonic computing uses light instead of electricity to perform calculations. Although the concept has been explored for decades, recent advancements have made it far more commercially viable.


In fact, Gartner has just included photonic computing in its "2025 Data Center Infrastructure Technology Hype Cycle" report—clear evidence that this technology is capturing the attention of industry leaders and investors alike.


Why photonic computing matters


Photonic computing leverages the speed and efficiency of light. Photons can transmit data much faster than electrons and at lower power—features that make them ideally suited to high-intensity workloads such as scientific computing, machine learning, and optimization tasks.


Additionally, electronic systems generate heat and require extensive cooling measures, whereas optical systems have minimal thermal demands. Photonics-based computing is gaining momentum, with current research paving the way for a faster, more efficient computing paradigm.


There are several architectures in the field of photonic computing, each with its own advantages and disadvantages:


1. Free-Space Optics (FSO)


As one of the earliest forms of optical computing, FSO systems use components such as lenses, spatial light modulators, and optical masks to manipulate light—whether in air or a vacuum—in order to process information. FSO systems can range in size from small boxes to occupying several racks, offering both flexibility and high-speed performance.


FSO still faces several challenges before it can become a practical product. A key issue is improving system durability and reliability, which calls for further optomechanical engineering—for example, integrating solid-state optical modules, building in spatial light modulators (SLMs), or leveraging photonic metamaterials.


Another issue is that the response speed of the SLMs currently used to steer light in these systems is far slower than that of electronic devices. However, next-generation SLMs with higher speeds and resolutions are already in development, holding promise for overcoming this limitation.


2. Photonic Chips


Photonic chips integrate miniature optical components such as lasers, beam splitters, and interferometers, enabling quick and easy integration into existing electronic architectures. While several technologies exist in this field, most struggle to scale to more complex computing tasks.
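The workload these chips target is usually matrix-vector multiplication, with numbers encoded in the intensity or phase of light passing through the interferometer mesh. As a conceptual sketch only—not any vendor's API—the snippet below shows that operation, using coarse quantization of the weights as a crude stand-in for the limited precision of analog optics:

```python
import numpy as np

rng = np.random.default_rng(0)

# A photonic chip encodes a weight matrix W in a mesh of interferometers
# and multiplies it against an input vector x carried by light.
W = rng.standard_normal((4, 4))
x = rng.standard_normal(4)

exact = W @ x  # what an ideal, noise-free optical core would compute

# Analog optics is imprecise: model limited bit depth by quantizing the
# weights before the multiply (a rough stand-in for optical noise).
def quantize(a, bits=6):
    scale = (2 ** (bits - 1) - 1) / np.max(np.abs(a))
    return np.round(a * scale) / scale

approx = quantize(W) @ x
print("max error at 6-bit precision:", np.max(np.abs(exact - approx)))
```

The gap between `exact` and `approx` illustrates why analog precision—discussed under "Accuracy and Stability" below—is a central engineering concern.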


Currently, most chip designs are two-dimensional, which limits their versatility. To boost performance, some companies have experimented with connecting chips together—but this often leads to significant signal loss. Another challenge is that photonic chips cannot store data optically, forcing constant switching between light and electricity, which degrades both performance and accuracy. Moreover, in large-scale optical circuits, analog signals may weaken and become prone to noise, making it difficult to achieve precise results.


Since scaling is so challenging, some companies are shifting their focus from developing all-optical AI chips to creating optical interconnect devices—technologies that leverage light to transmit data at high speeds between electronic components. This approach involves researching and engineering new materials and devices to minimize signal loss and enhance computational accuracy. For instance, lithium niobate has already shown promising results in early experiments.


3. Fiber-Optic Systems


These systems leverage mature fiber-optic communication infrastructure, using light-guiding capabilities within optical fibers to perform complex computations—making them ideal for tackling challenging problems such as optimization tasks and artificial intelligence applications.


One example is the Coherent Ising Machine (CIM), which performs computations by sending light pulses through fiber-optic loops. Unfortunately, its core functionality still relies on electronic components, forcing frequent opto-electronic conversions that significantly slow down the computing speed. In the future, the system may shift toward chip-based architectures to enhance integration and scalability. To fully leverage the advantages of optics, the system’s electronic Field-Programmable Gate Array (FPGA)-based control unit must eventually be replaced with an all-optical processor, thereby eliminating the drawbacks associated with conversion processes altogether.
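The problem a CIM attacks is minimizing an Ising energy: finding spins of ±1 that best satisfy a set of pairwise couplings. The sketch below illustrates that optimization target with a greedy classical solver—purely a conceptual stand-in for the optical pulse dynamics, not a model of the hardware:

```python
import numpy as np

rng = np.random.default_rng(1)

# An Ising machine—optical or electronic—searches for spins s_i = ±1
# that minimize the energy E(s) = -1/2 * s^T J s for couplings J.
n = 12
J = np.triu(rng.standard_normal((n, n)), 1)
J = J + J.T  # symmetric coupling matrix with zero diagonal

def energy(s):
    return -0.5 * s @ J @ s

# Crude classical stand-in for the optical dynamics: greedy single-spin
# flips, accepted only when they lower the energy.
s0 = rng.choice([-1, 1], size=n)
s = s0.copy()
for _ in range(300):
    i = rng.integers(n)
    candidate = s.copy()
    candidate[i] *= -1
    if energy(candidate) < energy(s):
        s = candidate

print("energy before:", energy(s0), "after:", energy(s))
```

Many hard optimization problems (scheduling, routing, portfolio selection) can be recast as such coupling matrices, which is why Ising machines attract commercial interest.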


Another innovative design leverages multi-core specialty optical fibers to simultaneously handle multiple computations—much like multitasking with light. Currently, most of these multi-core fiber systems remain in the research phase.


The Bottleneck on the Path to Commercialization


Although photonic computing is gaining significant momentum, several critical technological bottlenecks still need to be overcome before it can achieve widespread adoption.


  • Accuracy and Stability: Optical systems are susceptible to interference from component misalignment, temperature fluctuations, and random signal noise. Researchers are addressing these challenges by implementing closed-loop feedback systems for automatic adjustment, employing specially designed adjustable-focus lenses, and leveraging machine-learning tools to keep the system properly calibrated at all times.

  • Memory and Storage: Optical data storage remains a significant challenge. Advances in optical components such as flip-flops—used to store individual bits of information—and delay lines, which help control the timing of light signals, could transform the landscape. A particularly promising exception is systems based on optical cavities, which use light itself as the medium for both storage and computation, eliminating the energy losses of transferring data between memory and processors.

  • Integration and Packaging: Whether calibrating lasers in free-space systems or miniaturizing photonic circuits on-chip, physical integration remains a significant challenge. Innovations in 3D packaging and materials could improve scalability while reducing costs.
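The closed-loop feedback idea above can be illustrated with a toy controller—this is a generic proportional-feedback sketch under assumed drift and gain values, not a description of any real calibration stack:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy closed-loop calibration: an optical path's phase drifts slightly
# each step; a proportional controller measures the deviation from the
# target and nudges the phase back toward it.
target = 0.0   # desired phase offset (radians)
phase = 0.8    # initial miscalibration
gain = 0.5     # proportional feedback gain (assumed value)

history = []
for _ in range(50):
    phase += rng.normal(0, 0.01)  # slow thermal/mechanical drift
    error = phase - target        # measured deviation
    history.append(abs(error))
    phase -= gain * error         # feedback correction

print("initial |error|:", history[0], "final |error|:", history[-1])
```

The controller cannot remove the drift, but it keeps the residual error pinned near the noise floor—the same role the feedback loops play in keeping optical systems calibrated.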


The Future Path of Photonic Computing


We are entering a critical window of opportunity for the development of photonic computing. Amidst the growing global demand for faster, more energy-efficient, and higher-performance computing capabilities, photonic computing systems offer a promising new path—capable of complementing traditional silicon-based technologies while also holding the potential to outperform them in certain applications.


In the short term, all-optical free-space systems appear to be the most feasible, while hybrid systems that combine optical and electronic components also hold significant potential—especially as OEO (optical-electrical-optical) conversion technology advances, leading to reduced energy losses.


In-memory computing that uses light for both computation and storage also holds significant potential. In the medium term, processing architectures that combine spatial and temporal dimensions could deliver even greater performance and energy efficiency.








The above content represents the author's personal views only. This article is translated from the World Economic Forum's Agenda blog; the Chinese version is for reference purposes only.

Translated by: Di Chenjing | Edited by: Wang Can

The World Economic Forum is an independent and neutral platform dedicated to bringing together diverse perspectives to discuss critical global, regional, and industry-specific issues.





