As modern transistors approach the size of individual atoms, we are seeing the end of Moore’s Law, the exponential scaling of transistor density on microchips that has been one of the key driving forces behind economic growth and technological progress for more than half a century. Computing power using classical methods (conventional transistors) is nearly maxed out due, ironically, to quantum mechanics: as transistors reach the scale of individual atoms, electrons can “quantum tunnel” across a transistor, rendering it useless.

But the quantum realm could also provide the way forward beyond Moore’s Law, leaving even the best classical chips in the dust. However, just like their classical predecessors, quantum computers are approaching a junction that will define the winners in the race for a commercially viable model: photons versus massive, charged particles.

Taking a step back to the origins of classical computers: up to and including the 1940s, most computers ran on vacuum tubes. Vacuum tube-based computers were in fact used during World War II to crack Nazi Germany’s encryption. Working on a similar principle to traditional cathode ray tube (CRT) televisions, they could steer electrons to generate useful outputs. However, much like those inefficient, power-hungry, cumbersome CRT televisions, they were quickly usurped by a superior technology as soon as it came along.

The 1950s brought the introduction of solid-state transistors, which were smaller, more scalable, and more efficient. Vacuum tubes quickly became a thing of the past, and there was no turning back.

Quantum computers are also now reaching a similar crossroads.

Whilst they can reduce some of the most taxing (or even practically impossible) workloads for a standard CPU to just minutes (for example, a task that would take the fastest supercomputer an estimated 10,000 years can be completed in a mere four minutes [1]), they face a similar fork in the road to the one classical computing technology faced in the 1950s.

Quantum computers can run on two types of particle: either i) massive, charged particles such as ions or electrons, or ii) massless, charge-free photons.

Massive, charged particles are the vacuum tubes of today: they got us much of the way, helping to develop the framework for computer logic, data input and output, and the other, less tangible foundations required. The Goliaths of the computing world have been making those first leaps with their transmon-based (i.e. charge- and mass-based) processors such as Sycamore (Google) and Eagle (IBM), but they could ultimately be defeated by the Davids of the quantum realm using photonic processors. Photonic quantum computing could be the leap that makes quantum computing truly scalable and allows it to achieve its full potential.

Until recently, it was impossible to create and manipulate single photons, meaning charged particles were the way forward. These particles are large relative to photons, with a mass and charge that make them more likely to interact with the environment and decohere, i.e. become unstable and lose their quantum computing capabilities.

This led to the development of large dilution refrigerators, designed to maintain temperatures near absolute zero, lower even than those in deep space. In such an environment there is incredibly little background noise, which prevents interaction with, and interference of, the particles being used, i.e. it prevents decoherence.

A workaround has been developed by the likes of Honeywell with their trapped-ion quantum computer, which removes the need for millikelvin temperatures, but at the cost of lower clock speeds.

Traditionally, a “bit” in a computer (an individual unit of information) is restricted to either a 1 or a 0. Quantum computers, however, exploit the probabilistic nature of the very small, such that a single quantum bit (qubit) is a superposition of both 1 and 0, and they also exploit another concept called entanglement. When two particles become entangled, their properties become intertwined: if we were to know the properties of one particle, we would implicitly know the properties of the other.
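To make the idea concrete, the sketch below uses plain numpy and standard textbook vector notation (not any particular quantum SDK, and nothing specific to the hardware discussed here) to show a single qubit in an equal superposition and two qubits in an entangled Bell state, where measuring one immediately tells you the value of the other.

```python
# A minimal, illustrative sketch of superposition and entanglement using numpy.
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector over the |0> and |1> basis states.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ ket0
print("P(0), P(1) for a superposed qubit:", np.abs(plus) ** 2)  # 0.5, 0.5

# Entangle two qubits into a Bell state, (|00> + |11>) / sqrt(2), using a CNOT gate.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, ket0)

# Measurement outcomes are only ever 00 or 11, each with probability 0.5:
# learning one qubit's value implicitly tells you the other's.
probs = np.abs(bell) ** 2
for outcome, p in zip(["00", "01", "10", "11"], probs):
    print(f"P({outcome}) = {p:.2f}")
```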

However, the conditions required for such a process are incredibly fragile. Entanglement is a delicate state, and any interaction with the environment results in decoherence: the particle is effectively “detected” by its surroundings, its wave function collapses, and the information held in that wave is lost. To isolate the particle completely, we have to maintain a temperature of a few millikelvin (i.e. a few thousandths of a degree above absolute zero).
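As a rough illustration of that loss, the sketch below (again plain numpy, with a purely illustrative coherence time rather than a measured figure for any real device) shows the “quantum” off-diagonal part of a qubit’s state decaying towards zero as environmental noise acts on it, a simple textbook dephasing model.

```python
# Illustrative dephasing model: environmental noise decays the off-diagonal terms
# of a qubit's density matrix, destroying the superposition (decoherence).
import numpy as np

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition of |0> and |1>
rho = np.outer(plus, plus.conj())                     # density matrix of the pure state

T2 = 50e-6  # assumed coherence (dephasing) time of 50 microseconds, purely illustrative
for t in [0, 25e-6, 100e-6, 500e-6]:
    decay = np.exp(-t / T2)        # fraction of coherence surviving after time t
    rho_t = rho.copy()
    rho_t[0, 1] *= decay           # the off-diagonal "quantumness" fades away,
    rho_t[1, 0] *= decay           # leaving an ordinary classical mixture
    print(f"t = {t * 1e6:5.0f} us, remaining coherence = {abs(rho_t[0, 1]):.3f}")
```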

Recent advances in photonics by companies such as Quandela and Sparrow Quantum mean we are now able to produce single-photon emitters (based on quantum dots). These make the problem of decoherence much more manageable and provide the basis for photonic quantum computing.

Photons have no charge or mass, so they are much less likely to interact with their environment. Rather than a few millikelvin, temperatures of a few kelvin will suffice, which are already widely achieved in existing applications such as hospital MRI scanners and petroleum and natural gas exploration, and we no longer need bespoke dilution refrigerators.

So whilst there are alternative approaches to quantum computing, such as transmon-based processors, only photonic solutions provide both i) long-lived coherence and ii) optimal processor speed.

The underlying duel in quantum computing may in fact have been resolved by the development of photonics, which is smaller, more scalable and more efficient, and which opens a new stage in the commercialisation of quantum computers.

In the same way that Fairchild Semiconductor pioneered the future in the 1950s through the commercial development of the semiconductor, and the Traitorous Eight (including Gordon Moore of Moore’s Law) built the foundations of modern technology and venture capital, photonic quantum computing could do the same in the 2020s. Investors need to focus on the picks-and-shovels plays in photonic quantum computing that can unlock that future, e.g. quantum dot single-photon sources, photonic demultiplexers and specialist fibre optic cables, to ensure they can ride the tidal wave of innovation that is coming.

We have worked extensively with deep tech companies across multiple areas, including advising Bright Computing on its sale to Nvidia; its software is used in the world’s most powerful classical supercomputers to ensure cluster stability and scalability across highly heterogeneous computational architectures. We have also advised ETA Devices on its sale to Nokia; its technology dynamically adds gain to digital radio signals (rather than applying traditional, linear gain) to dramatically reduce power loss in cellular base stations. We look forward to working with those looking to unleash the future.

Find out more about us.

 

[1] Source
