Is photonics the end of the CPU?

Image Description: an optical circuit.

As time goes on, our computers are becoming more and more powerful. This is primarily due to the ongoing development of the CPU (and GPU): the ‘brain’ of the computer. A pressing question facing the electronics industry is whether computers will one day become more powerful than human beings. As it stands, however, conventional electronic computers still face ‘the von Neumann bottleneck’ – a limit on the speed at which data can be transferred between the processor and memory. As a result, computers still have a long way to go before they’re likely to start outsmarting us.

Photonic computers may one day have the same intellectual capacity as humans.

A computer’s CPU is responsible for carrying out computations and storing the results in the nearest available memory, as the CPU has very little storage capability of its own.

The term ‘Moore’s Law’ refers to the observation that the number of transistors that can be fitted onto a chip doubles roughly every two years as technology improves. However, this trajectory has plateaued significantly in recent years, indicating that we may be approaching the physical limit of shrinking transistor size.

Researchers are now investigating new technologies to bypass this restriction, including photonic and neuromorphic computing. It is thought that these might one day replace our current computer systems, perhaps even achieving the same intellectual capacity as a human brain.

Photonic computational strategies use light instead of electricity to carry out data processing and storage. This is likely to be one of the easiest technological developments to implement, while still improving computing performance significantly. Its basic principle is similar to that of optical fibre communication: light is used to transfer data between the CPU and memory, achieving much faster processing speeds than are possible with electrical signals.

Another advantage of this system is that different streams of information can share the same fibre. Instead of using hundreds of electrical wires, optical fibres can elegantly separate information based on wavelength – a technique known as wavelength-division multiplexing.
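The idea can be illustrated with a toy simulation: several bit streams share one signal, each riding on its own carrier frequency (standing in for an optical wavelength), and each can be recovered independently at the far end. The channel names and frequencies below are illustrative choices, not real telecom channels.

```python
import math

SAMPLES = 1000
# One carrier frequency per channel (arbitrary units, standing in for wavelengths).
CHANNELS = {"ch1": 5.0, "ch2": 11.0, "ch3": 17.0}

def multiplex(bits):
    """Sum one sine carrier per channel; amplitude 1 encodes a '1' bit."""
    signal = [0.0] * SAMPLES
    for name, freq in CHANNELS.items():
        amp = 1.0 if bits[name] else 0.0
        for i in range(SAMPLES):
            signal[i] += amp * math.sin(2 * math.pi * freq * i / SAMPLES)
    return signal

def demultiplex(signal):
    """Recover each bit by correlating the shared signal with its own carrier."""
    bits = {}
    for name, freq in CHANNELS.items():
        corr = sum(signal[i] * math.sin(2 * math.pi * freq * i / SAMPLES)
                   for i in range(SAMPLES))
        # Self-correlation averages ~0.5 when the bit is set; other carriers
        # are orthogonal and contribute ~0, so 0.25 is a safe threshold.
        bits[name] = corr / SAMPLES > 0.25
    return bits

sent = {"ch1": True, "ch2": False, "ch3": True}
received = demultiplex(multiplex(sent))
print(received)  # each channel's bit recovered from the one shared signal
```

Because the carriers are mutually orthogonal, the three streams travel together without interfering, which is exactly what lets one fibre replace many wires.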

However, one of the biggest obstacles facing photonic computing is that its network cannot store information – it can only be used for data transfer or real-time computation. In other words, the photonic computer is inherently volatile.

This posed a significant problem for the photonic industry for decades, until a potential solution was identified in 2015. A research group led by Professor Harish Bhaskaran in the Department of Materials at the University of Oxford developed a system using phase-change materials that was able to achieve permanent data storage in an all-optical network. This discovery was published in Nature Photonics in 2015.

Phase-change materials are not new inventions – they are often used in rewritable CDs and DVDs. Data storage is achieved by using a laser to switch this kind of material between a crystalline and an amorphous phase. Importantly, this process is both stable and reversible.

In the all-optical network, researchers place a thin layer of this phase-change material on top of a structure known as a waveguide. An optical pulse carrying information passes through the waveguide, switching the phase of the material on a nanosecond timescale. The material’s state, in turn, alters the light transmission inside the waveguide, enabling optical read-out. The network can even utilise the ‘transition’ states between the two phases in order to increase the scale of possible computation.

Since storage capacity increases exponentially with the number of bits, phase-change photonics may theoretically expand our computing capacity far beyond that of electronic computers.
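A quick back-of-the-envelope sketch makes this concrete: an ordinary binary cell stores two states, but a phase-change cell that can be set to N distinguishable ‘transition’ levels stores log₂(N) bits, and the state space of a group of cells grows exponentially with their number. The level counts below are illustrative, not measured device values.

```python
import math

def bits_per_cell(levels):
    """Bits stored by one cell with `levels` distinguishable states."""
    return math.log2(levels)

def total_states(levels, cells):
    """State space of `cells` cells: grows exponentially with cell count."""
    return levels ** cells

print(bits_per_cell(2))     # 1.0 bit  – ordinary binary cell
print(bits_per_cell(8))     # 3.0 bits – 8 distinguishable phase levels
print(total_states(8, 10))  # 8**10 states from just ten 8-level cells
```

Ten binary cells give only 2¹⁰ ≈ a thousand states, whereas ten 8-level cells give over a billion, which is why multi-level storage is so attractive.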

However, as most of the signals in our world are electrical, existing systems will need to have their output converted before it can be transmitted in an all-optical network. This pre-processing requires both a lot of energy and ancillary circuits, which may compromise performance.

Another hurdle that needs to be overcome if computers are to catch up with human brains is the computing architecture.

In human brains, computing and storage happen simultaneously. This is remarkable given the complexity of the processing going on inside your head. Your laptop tends to overheat when handling particularly difficult computing tasks, so why do we never get a fever from thinking too much?


All the answers are hidden in the unique architecture of our brains. Our neurons play the role of processors, and synapses that of memory. When a neuron is excited, it generates a spike pulse which propagates across synapses to other neurons. When the post-synaptic neuron receives a pulse larger than its threshold, it will generate its own spike pulse. This can alter the strength of synaptic connections, making them either easier or harder to excite in future, a process broadly comparable to data storage.

Researchers are now exploring how to mimic this behaviour computationally in an active area of technological development known as ‘neuromorphic computing’.

Compared to traditional computer architecture, neuromorphic computing is much more flexible and efficient. It has distributed computing nodes with collocated processing and memory, like neurons and synapses. When it executes an algorithm, the parameters of the network (e.g. the resistance) change correspondingly.

Its activity depends on the input it receives and thus, theoretically, avoids unnecessary power consumption.

However, the development of the ‘neurosynaptic circuit’ is still very much in its infancy. In May 2019, Bhaskaran’s research team, collaborating with the University of Münster, created an all-optical spiking neurosynaptic network with self-learning capabilities. The results of this study were published in Nature.


While most of these technologies are still at the earliest stages of development, the rate of progress we are currently experiencing suggests we may be heading for a computing revolution. At present, photonic computing has demonstrated the potential for large-volume storage, in-memory computing, and volatile and non-volatile capabilities in a single device. Similarly, further developments in neuromorphic computing may offer a computational strategy capable of achieving complexity comparable to human cognition. In addition to these incredible technologies, researchers are also exploring an array of other approaches to next-generation computing, including quantum computing and magnetic computing.

However, moving forward, progress in this industry is likely to depend on how well we are able to integrate these exciting new strategies with existing technology.

Image Credit: Nathan Youngblood.