To understand why OFDM is such an amazing invention, you first have to understand a little about digital transmission of data. At heart, digital modulation schemes are all simply ways of converting the digital signal (ones and zeros) to an analogue signal that can both be transmitted easily, and then decoded easily at the other end.
There are three basic classes of digital modulation:
- Amplitude modulation. In these schemes, you look at different signal levels to decode different bit patterns - for example, you might have a rule that says "loud signal is a 1, quiet signal is a 0". The advantage here is simplicity - the transmitter and receiver for these schemes are easy to implement (a Morse code transmitter and receiver form a crude amplitude modulation scheme). The disadvantage is that they're very prone to disruption by noise or imperfections in the transmission channel, so you struggle to approach the Shannon limit.
- Frequency modulation. In these, you look at the frequencies in the signal to determine which bit pattern is being transmitted - for example, you might say that "a 2 kHz (high) tone is a 1, a 200 Hz (low) tone is a 0". DTMF in touch-tone phones is an example of a frequency modulation scheme. Again, these systems are simple (although not quite as simple to implement as amplitude systems), but they're still easily disrupted by some classes of noise.
- Phase modulation. In these schemes, you represent data as a phase shift of the carrier - this is much harder to decode, but has the advantage of being much more noise resistant than either of the other two classes.
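To make the three classes concrete, here's a minimal sketch in Python with NumPy (the frequencies, sample rate, and function names are illustrative choices of mine, not taken from any real system), generating one symbol of each scheme:

```python
import numpy as np

fs = 8000                        # sample rate (Hz) - an arbitrary choice
t = np.arange(0, 0.01, 1 / fs)   # one 10 ms symbol period

def ask(bit):
    # Amplitude modulation: loud signal is a 1, quiet signal is a 0
    amp = 1.0 if bit else 0.2
    return amp * np.sin(2 * np.pi * 1000 * t)

def fsk(bit):
    # Frequency modulation: 2 kHz (high) tone is a 1, 200 Hz (low) tone is a 0
    f = 2000 if bit else 200
    return np.sin(2 * np.pi * f * t)

def psk(bit):
    # Phase modulation: a 180-degree phase shift distinguishes 1 from 0
    phase = 0 if bit else np.pi
    return np.sin(2 * np.pi * 1000 * t + phase)
```

Note that `psk(0)` is exactly the negative of `psk(1)` - same amplitude, same frequency, so neither an amplitude nor a frequency detector can tell them apart; only the phase carries the data.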
This brings us to QAM (quadrature amplitude modulation). By changing the amplitudes of two waves that are 90° apart in phase, and combining them, we effectively shift both the phase and the amplitude of the signal at the same time. QAM is thus a mixture of phase and amplitude modulation, getting some of the advantages of both, at the expense of a more complex receiver.
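As a sketch of that combining step (the 4-QAM constellation and the names here are my own toy choices, not from any standard):

```python
import numpy as np

fs, fc = 8000, 1000              # sample rate and carrier frequency (Hz)
t = np.arange(0, 0.01, 1 / fs)   # one symbol period

def qam_symbol(i_amp, q_amp):
    # Scale two carriers 90 degrees apart and sum them: the result is a
    # single sinusoid whose amplitude AND phase depend on (i_amp, q_amp).
    return (i_amp * np.cos(2 * np.pi * fc * t) +
            q_amp * np.sin(2 * np.pi * fc * t))

# A 4-QAM (QPSK) alphabet: two bits pick the signs of the two amplitudes
constellation = {(0, 0): (-1, -1), (0, 1): (-1, +1),
                 (1, 0): (+1, -1), (1, 1): (+1, +1)}
wave = qam_symbol(*constellation[(1, 0)])
```

Larger constellations (16-QAM, 64-QAM and so on) simply allow more amplitude levels on each of the two carriers, packing more bits into each symbol.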
So, how does this all tie into OFDM? Well, firstly, you need to recall some basic trigonometry: QAM uses two carriers 90° out of phase with each other. Another way of looking at this is that QAM uses the sine and cosine waves at a given frequency and phase as its carrier pair. The first step of a QAM demodulator is to separate the received signal into the cosine and the sine wave at that frequency.
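That separation step can be sketched like this (a toy correlator, my own construction): multiplying by each carrier and averaging over a whole number of cycles isolates that carrier's amplitude, because sine and cosine are orthogonal.

```python
import numpy as np

fs, fc = 8000, 1000              # sample rate and carrier frequency (Hz)
t = np.arange(0, 0.01, 1 / fs)   # exactly 10 carrier cycles

def demodulate(signal):
    # Multiply by the cosine and sine carriers and average; over a whole
    # number of cycles the cross-terms vanish, leaving each amplitude.
    i_amp = 2 * np.mean(signal * np.cos(2 * np.pi * fc * t))
    q_amp = 2 * np.mean(signal * np.sin(2 * np.pi * fc * t))
    return i_amp, q_amp

tx = 0.7 * np.cos(2 * np.pi * fc * t) - 0.3 * np.sin(2 * np.pi * fc * t)
print(demodulate(tx))   # recovers approximately (0.7, -0.3)
```

A real receiver then quantises the recovered pair to the nearest constellation point to get the bits back.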
The other bit of information you need to make OFDM make sense is to realise that signal impairments (noise, attenuation, multipath interference etc) don't affect all frequencies equally. For example, attenuation in a length of copper wire goes up as the frequency of the signal goes up; electronic equipment emits noise at various frequencies, but concentrated around some specific (equipment-dependent) frequencies - such as 33 MHz for a PC with a PCI bus.
So, finally, onto OFDM; the basic idea is that rather than trying to use the entire channel bandwidth to transmit a single high bitrate signal (as is done in Ethernet, for example), we split the channel into lots of narrow channels, and transmit a low bitrate signal on each of them. We can then recombine all the low bitrate signals into the original high bitrate signal at the other end.
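A toy illustration of the splitting idea (four subcarriers and invented parameters; real systems build this sum far more cleverly): deal a "fast" bit stream out across several slow carriers at different frequencies, and sum the narrow-band signals into one transmission.

```python
import numpy as np

fs = 8000
t = np.arange(0, 0.02, 1 / fs)              # one long, slow symbol period
subcarrier_hz = [500, 1000, 1500, 2000]     # four narrow channels

bits = [1, 0, 1, 1]                         # four bits sent in parallel
signal = sum((1 if b else -1) * np.cos(2 * np.pi * f * t)
             for b, f in zip(bits, subcarrier_hz))
```

Because each subcarrier completes a whole number of cycles in the symbol period, the carriers are mutually orthogonal - that's the "orthogonal" in OFDM - and each bit can be recovered at the far end by correlating against its own carrier, undisturbed by the others.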
This sounds expensive at first sight; 802.11a/g use OFDM with relatively few carriers (52), but some applications, like DVB-T, use thousands of carriers. If we had to have a full-blown QAM receiver for each carrier, it would be impossible to afford; here's where the miracle comes in. There's a mathematical trick, called the Fourier transform, which can split your incoming signal by frequency, giving you the amplitude of the sine wave and cosine wave at each frequency. Conveniently, that is exactly the first step of a QAM decoder: splitting the signal into sine and cosine waves at each frequency. It's this trick that makes OFDM economical to implement; instead of having a QAM decoder for every frequency in the OFDM signal, we perform a Fourier transform on the incoming signal (which we can do efficiently using an algorithm called the FFT), and then all that remains is to quantise each of the amplitudes that comes out, and reconstruct the original digital signal.
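Here's a round-trip sketch of that trick (toy sizes, my own names; NumPy's complex FFT bundles each cosine/sine amplitude pair into the real and imaginary parts of one number). The transmitter uses an inverse FFT to place one QAM symbol on each subcarrier; the receiver's single FFT then acts as all the QAM front ends at once.

```python
import numpy as np

rng = np.random.default_rng(0)
bits_i = rng.integers(0, 2, 8) * 2 - 1     # +/-1 in-phase amplitudes
bits_q = rng.integers(0, 2, 8) * 2 - 1     # +/-1 quadrature amplitudes
tx_symbols = bits_i + 1j * bits_q          # one QAM symbol per subcarrier

signal = np.fft.ifft(tx_symbols)           # transmit: sum of subcarriers
rx_symbols = np.fft.fft(signal)            # receive: one FFT = all QAM decoders

# Quantise: the sign of each real/imaginary part gives the bit back
assert np.array_equal(np.sign(rx_symbols.real), bits_i)
assert np.array_equal(np.sign(rx_symbols.imag), bits_q)
```

Thanks to the FFT, the cost of demodulating N carriers grows only as N·log(N), not as N separate receivers.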
From this point, we can do all sorts of tricks; ADSL, for example, uses a different QAM constellation at each frequency to get the maximum data rate out of a single line. DVB-T uses the same QAM constellation in each sub-band, but uses a different subset of the available OFDM carriers for data in each symbol, so narrow band interference doesn't wipe out too much of the wanted data.
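A toy bit-loading rule in the spirit of ADSL's per-carrier constellation choice (the real algorithms are considerably more involved; the gap figure and 15-bit cap are plausible illustrative values, not a quote from the standard): give each subcarrier as many bits as its measured signal-to-noise ratio will support.

```python
import numpy as np

def bits_per_carrier(snr_db, gap_db=9.8):
    # Shannon-style estimate with an implementation "gap", capped at
    # 15 bits per carrier - both numbers are illustrative assumptions.
    snr = 10 ** ((snr_db - gap_db) / 10)
    return int(min(15, max(0, np.floor(np.log2(1 + snr)))))

snrs = [35.0, 22.0, 9.0, 41.0]   # hypothetical per-carrier SNRs (dB)
print([bits_per_carrier(s) for s in snrs])
```

A carrier sitting in a noisy part of the spectrum ends up carrying few bits (or none), while quiet parts of the line carry dense constellations - which is exactly how ADSL squeezes the maximum rate out of an ageing copper pair.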
Further, because OFDM transforms a high symbol-rate (a symbol is the waveform carrying one bit-pattern) wideband signal into a collection of low symbol-rate narrow-band signals, it makes it possible to play tricks that rely on the fact that you can receive at a higher symbol rate than you actually need; for example, DVB-T can operate in a "single frequency network" mode, where several synchronised transmitters all send the same signal. Because the symbols are relatively slow, and all contain identical data, a receiver can work out what the original digital signal was, and correctly ignore the interfering transmitters.
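One way to see why slow symbols tolerate a delayed copy of themselves is the guard interval DVB-T puts before each symbol, implemented as a cyclic prefix. In this sketch (toy sizes and my own names), an echo arriving within the prefix doesn't smear one symbol into the next; instead, each subcarrier just sees a fixed complex gain that an equaliser can divide out.

```python
import numpy as np

n, cp, delay = 8, 3, 2                     # carriers, prefix length, echo delay
tx_symbols = np.array([1+1j, 1-1j, -1+1j, -1-1j,
                       1+1j, -1-1j, 1-1j, -1+1j])
block = np.fft.ifft(tx_symbols)
with_cp = np.concatenate([block[-cp:], block])   # cyclic prefix prepended

# A second, synchronised transmitter arrives 'delay' samples late
rx = with_cp.copy()
rx[delay:] += with_cp[:-delay]

rx_symbols = np.fft.fft(rx[cp:cp + n])     # strip the prefix, demodulate

# Because the echo falls inside the prefix, the FFT window sees a circular
# shift, so carrier k is just scaled by (1 + e^{-2j*pi*k*delay/n}).
k = np.arange(n)
expected = tx_symbols * (1 + np.exp(-2j * np.pi * k * delay / n))
assert np.allclose(rx_symbols, expected)
```

The longer the guard interval, the greater the transmitter spacing a single frequency network can tolerate - at the cost of a little wasted airtime per symbol.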
All in all, OFDM is amazingly simple for what it actually achieves. There are other systems that do similar things (such as CDMA, used in mobile telephones and GPS), but they aren't as simple to explain.
Next on my list of interesting technologies to get the hang of is MIMO, as used in 802.11n wireless networks. This one's even more of a brain-strainer, as it somehow manages to use multiple aerials, all tuned to the same frequency, to get more data rate than could be achieved with a single perfect aerial.