Lightning-fast communications

Light is a medium that enables efficient data transmission at the very highest rates. Jürg Leuthold, head of the Institute of Electromagnetic Fields, explains how researchers keep pushing the boundaries of what is possible.

I remember as children we used to play in the dark, flashing our torches to send each other messages in Morse code. How do things stand now?
Actually, that principle was essentially the only one in use until recently. Up to 2005, optical communication technology operated on just that basis: if the signal – that is to say, light – is present, that corresponds to a value of “one”; if the signal, or light, is absent, that corresponds to a value of “zero”. The only thing that really changed before 2005 was the switching speed, which got faster and faster.
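
As a rough illustration (not from the interview), that on-off principle can be sketched in a few lines of Python: light on stands for a “one”, light off for a “zero”, and the receiver simply checks whether light is present.

```python
# Minimal sketch of on-off keying (OOK), purely illustrative.

def ook_modulate(bits):
    """Map each bit to a light intensity: 1.0 = light on, 0.0 = light off."""
    return [1.0 if b else 0.0 for b in bits]

def ook_demodulate(intensities, threshold=0.5):
    """Recover the bits: light present (above the threshold) -> 1, absent -> 0."""
    return [1 if i > threshold else 0 for i in intensities]

message = [1, 0, 1, 1, 0, 0, 1]
assert ook_demodulate(ook_modulate(message)) == message
```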

Am I right in thinking that the use of fibre optic cables was an important step en route to ever-higher data transmission speeds?
Absolutely. Instead of sending electrons down copper cables the old-fashioned way, we now drive photons – tiny particles of light – through fibre optic cables. This makes it possible to transmit much more information per unit of time. It was in 1996 that a data rate of one terabit per second was achieved for the very first time using the familiar light-on, light-off technique. That means switching a trillion, or 10^12, times a second. This represented a real breakthrough.

But there was still plenty more to come, wasn’t there?
In 2001, the journal Nature published an important theoretical paper addressing the maximum possible transmission capacity of fibre optic cables. It put that limit at around 100 terabits per second per fibre – though this was reckoned unachievable in practice, with the best attainable data rate estimated at around 10 terabits per second. That very year, however, we realised that in optics, too, it was possible to encode light signals in a new way. Prior to 2001, we had access only to the signal intensity – i.e. the light was on or the light was off. Now, we had new components that allowed for improved encoding – or phase encoding, as we call it.

In his laboratory, Jürg Leuthold works with the "fastest light switches in the world". (Photo: Marvin Zilm)
"Home users will soon be surfing at 10 Gbit/s." Jürg Leuthold

What exactly do you mean by that?
Light is a wave. A wave’s phase indicates at what point in time the wave’s peak and its valley are transmitted. That means that for a given wave you could transmit either the peak first and then the valley, or vice versa. That in itself is information. The difficulty is that the light waves we are talking about oscillate some 200 trillion times a second. Identifying the absolute phase of such a waveform was something many at the time viewed as impossible.
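
To put a rough number on that claim: for light in the 1550 nm band commonly used in telecommunications (an assumed wavelength, not stated in the interview), the frequency works out to roughly 200 terahertz, i.e. about 200 trillion oscillations per second.

```python
# Back-of-the-envelope check of the "200 trillion times a second" figure,
# assuming light at a 1550 nm telecom wavelength (illustrative assumption).
c = 3.0e8              # speed of light in m/s
wavelength = 1550e-9   # 1550 nm in metres
frequency = c / wavelength
print(f"{frequency:.2e} Hz")   # ~1.9e14 Hz, roughly 200 trillion oscillations per second
```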

What was your part in this development?
At the time, I was working at Bell Labs in the US. A colleague of mine and I were fortunate in that we weren’t familiar with the theories surrounding the presumed impossibility of phase encoding in optical fibre communications. We hit upon the idea of measuring not the absolute phase but rather the relative phase from one bit to the next, which is much easier to do. The encoding technique is known as differential phase-shift keying. I built a special receiver for the purpose.
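
Here is a minimal sketch of the idea behind binary differential phase-shift keying: the information sits in the phase change from one bit to the next, so the receiver never needs the absolute phase. This is purely illustrative and not the receiver design described here.

```python
import cmath
import math

def dpsk_encode(bits):
    """Bit 1: flip the phase by pi relative to the previous symbol; bit 0: keep it."""
    phase = 0.0
    symbols = []
    for b in bits:
        if b:
            phase += math.pi
        symbols.append(cmath.exp(1j * phase))
    return symbols

def dpsk_decode(symbols):
    """Compare each symbol with the one before it: a phase flip means bit 1."""
    bits = []
    prev = complex(1.0, 0.0)  # initial reference symbol, assumed known to the receiver
    for s in symbols:
        bits.append(1 if (s * prev.conjugate()).real < 0 else 0)
        prev = s
    return bits

message = [1, 0, 0, 1, 1, 0]
assert dpsk_decode(dpsk_encode(message)) == message
```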

And your plan worked?
It allowed us to beat the then world record in data transmission by a factor of two straight off the bat. Since 2005, the first networks featuring differential phase-shift keying have been entering service. Since then, the big network operators have increasingly been turning to phase-encoded signals to transmit data – and it looks like the “light on, light off” era is drawing to a close. About four years ago, we reached a data transmission rate of 100 terabits per second in a single fibre optic cable for the first time. So we haven’t just achieved what ten years ago was considered theoretically possible but practically beyond reach – by now we’ve even exceeded that benchmark.

How does that help me as a private user?
Think back 15 years – at that time, you were lucky to get 128 kilobits per second on your desktop computer. These days, you can easily get a gigabit per second with a fibre optic connection. Private users today have around 10,000 times the bandwidth they had 15 years ago. Now imagine the same thing happening in a different context: it would be like asking car manufacturers to make their cars 10,000 times faster or 10,000 times more energy efficient.
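
For what it’s worth, the arithmetic behind the “around 10,000 times” figure looks like this:

```python
old_rate = 128e3   # 128 kbit/s, the rate mentioned for 15 years ago
new_rate = 1e9     # 1 Gbit/s over a fibre optic connection
print(new_rate / old_rate)   # ~7800, i.e. roughly four orders of magnitude
```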

What are you working on at the moment?
In communications technology, we start off with electrical signals. To be used in optical communications, these need to be converted into an optical laser signal. This is where modulators come in, converting the electrical signal into an optical one. Standard optical telecommunications modulators are about 10 centimetres long and 2 centimetres wide. They work at speeds of up to 40 gigabits per second and consume five picojoules of energy per bit encoded. That may not sound like much, but when you repeat the operation 40 billion times a second it quickly adds up – especially when you have up to a thousand modulators in a single room. What we’ve done is develop new modulators, reducing their size to a millimetre or even less. The latest generation of modulators is just a few micrometres long and operates twice as fast with far lower power consumption.
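
The energy figures add up as follows: at five picojoules per bit and 40 gigabits per second, each conventional modulator dissipates about 0.2 watts, so a thousand of them in one room come to roughly 200 watts for the modulators alone.

```python
energy_per_bit = 5e-12   # 5 picojoules per encoded bit
bit_rate = 40e9          # 40 gigabits per second
power_per_modulator = energy_per_bit * bit_rate
print(power_per_modulator)          # 0.2 W per modulator
print(1000 * power_per_modulator)   # ~200 W for a thousand modulators in a single room
```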

How did you manage to make the modulators so small and energy efficient while delivering such performance?
We no longer work so much with light as with plasmons. The only time the information is in the form of a light signal is in the fibre optic cable itself. As soon as the signal arrives on the chip, we convert it into a plasmon. Plasmons are oscillations of electrons at the frequency of optical light signals. It’s much easier to manipulate plasmons than light because you’re dealing with electrons, not photons. These plasmons are switched, then converted back into an optical signal – now carrying the information – a fraction of a picosecond later and fed back into the fibre optic cable.

What’s the advantage of this miniaturisation?
It now seems conceivable that we could combine optics and electronics on the same chip. Before, this was out of the question because of the discrepancy in the size of the components involved. Generally speaking, high-performance optical components are still comparatively large. A terabit transmitter, for instance, takes up a lot of room. If we wanted to house 1,000 of them in a central data exchange, we would need an entire building. And with all the additional components involved, energy consumption would simply be too high. That’s why it’s so important that we work on miniaturising the components.

The bulk of data traffic seems to be transferring to mobile communications. How can your research help in this context?
It will take new approaches in the mobile communications sector, too, to deal with the huge amount of data that customers will demand in future. Just as optics works at very high frequencies, mobile communications will soon have to move to higher frequencies as well – more precisely, from conventional microwaves to waves that oscillate around 100 to 1,000 times faster. Experts envision the start of a new era – the terahertz era. And of course we want to be right at the forefront.
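
As a rough orientation (assuming a conventional mobile carrier of a few gigahertz, which is an assumption for illustration), waves oscillating 100 to 1,000 times faster land in the hundreds of gigahertz to terahertz range:

```python
carrier = 3e9   # a conventional microwave carrier of 3 GHz (illustrative assumption)
for factor in (100, 1000):
    print(f"{carrier * factor / 1e12:g} THz")   # 0.3 THz and 3 THz
```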

New modulators to convert electrical signals into optical ones: conventional modulators are around 10 centimetres in size and encode 40 gigabits per second. The new generation of modulators developed by Jürg Leuthold and his team measures just a few micrometres and operates at higher speeds with lower power consumption. (Photo: Antal Thoma)