Smoke signals and the first transatlantic cable were on my mind last week, and they got me thinking about the electrical engineering behind early long-distance wired communication.
John Steele Gordon’s A Thread Across the Ocean gives a nice account of laying the cable, including a deadly storm and the sound of the cable snapping, nearly halfway across the ocean, under the weight of the miles already laid. It’s mostly told, however, from the business people’s point of view.
The signal level was very low. One of the engineers developed a more sensitive receiver to pull the weak signals out of the noise. Dots and dashes ran together unless they were keyed very slowly. Managers downplayed how long it took to send messages.
What was going on from a technical standpoint? We can model a transmission line as a series of inductances and capacitances. The model also has to include the unwanted resistance of the conductor and the unwanted leakage through the dielectric material.
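One segment of that lumped model looks like this, with the leakage drawn as a large resistance in parallel with the shunt capacitance:

```
 in o───[R]───[L]───┬─────────┬───o out
                    │         │
                   [C]    [R_leak]
                    │         │
 return o───────────┴─────────┴───o return
```

Chain enough of these segments together and the model approaches the behavior of the continuous cable.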
An Analog Electronics Companion by Scott Hamilton provides values for the resistance, capacitance, and inductance of the original transatlantic telegraph cable:
R = 4.81 ohms / km
C = 0.21 uF / km
L = 280 uH / km
I simulated four 1000-km segments in PartSim. It’s not really valid to model 1000 km of cable as a single lumped element, but PartSim could not handle simulating sixteen 250-km segments.
Simulating this model in PartSim gives results similar to what Scott Hamilton got with 16 elements in SPICE. In these plots, solid green shows the source signal, red shows the signal after 1000 km, and blue shows it after 4000 km.
The faint signal represented by those blue slopes came from somewhere a week's travel away. It must have been amazing to see any galvanometer deflection triggered by someone on the other side of the ocean.
Dots:
Dashes:
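For anyone who wants to reproduce these plots without PartSim, here’s a minimal sketch of the same four-segment model in Python. It lumps the leakage as a single 1 MΩ resistor per segment, which is a rough simplification, so the amplitudes won’t match the plots above exactly:

```python
import numpy as np
from scipy.integrate import solve_ivp
import matplotlib.pyplot as plt

# Per-km cable constants from Hamilton
R_KM, L_KM, C_KM = 4.81, 280e-6, 0.21e-6

N = 4                   # four lumped segments
SEG_KM = 1000           # each representing 1000 km
R = R_KM * SEG_KM       # series resistance per segment (ohms)
L = L_KM * SEG_KM       # series inductance per segment (henries)
C = C_KM * SEG_KM       # shunt capacitance per segment (farads)
R_LEAK = 1e6            # assumed shunt leakage per segment (ohms)

def vs(t):
    """Source: a single 2-second 'dot' at 10 V."""
    return 10.0 if 0.0 <= t < 2.0 else 0.0

def deriv(t, y):
    i, v = y[:N], y[N:]          # inductor currents, node voltages
    di, dv = np.empty(N), np.empty(N)
    for k in range(N):
        v_prev = vs(t) if k == 0 else v[k - 1]
        di[k] = (v_prev - v[k] - R * i[k]) / L
        i_next = i[k + 1] if k < N - 1 else 0.0   # open far end
        dv[k] = (i[k] - i_next - v[k] / R_LEAK) / C
    return np.concatenate([di, dv])

t = np.linspace(0, 8, 2000)
sol = solve_ivp(deriv, (0, 8), np.zeros(2 * N), t_eval=t, max_step=0.01)

plt.plot(t, [vs(tk) for tk in t], "g", label="source")
plt.plot(t, sol.y[N], "r", label="after 1000 km")
plt.plot(t, sol.y[2 * N - 1], "b", label="after 4000 km")
plt.xlabel("time (s)"); plt.ylabel("volts"); plt.legend(); plt.show()
```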
One difficulty in using those signals was amplitude. My model puts the leakage through the dielectric at 1 MΩ per mile; the dielectric on the original cable conducted much more than that, so the real received signal was weaker than in my simulation.
A bigger problem is intersymbol interference (ISI). The cable filters out the high-frequency components, causing dots and dashes to run together. This model suggests a minimum dot length of about 2 seconds. At that rate a typical word would take four minutes to send. On an ideal connection, experienced telegraph operators can send and receive 20 words per minute.
Viewed in the frequency domain, we could say the cable rolls off and does not pass much beyond 1 Hz.
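To put numbers on that roll-off, here’s a small sketch that sweeps the same four-segment ladder with complex impedances, working backward from the open far end. It uses the same per-segment values (and the same assumed leakage) as the time-domain sketch above:

```python
import numpy as np

# Per-1000-km-segment values, same assumptions as the time-domain sketch
R, L, C, R_LEAK = 4810.0, 0.28, 210e-6, 1e6

def ladder_gain(f, n=4):
    """|Vout/Vin| of the n-segment RLC ladder at frequency f in Hz."""
    w = 2 * np.pi * f
    Zs = R + 1j * w * L                  # series branch of one segment
    Zp = 1 / (1j * w * C + 1 / R_LEAK)   # shunt branch (C parallel leakage)
    gain, Zload = 1.0, Zp                # far-end segment sees only its shunt
    for _ in range(n):
        gain *= Zload / (Zs + Zload)     # voltage divider at this node
        Zin = Zs + Zload                 # impedance looking into this segment
        Zload = (Zp * Zin) / (Zp + Zin)  # next node: shunt in parallel with it
    return abs(gain)

for f in (0.01, 0.1, 1.0, 10.0):
    print(f"{f:6.2f} Hz: gain {ladder_gain(f):.6f}")
```

The gain collapses roughly between 0.1 Hz and 1 Hz, which is consistent with the 2-second dots above.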
Engineers considered speeding up the fall time by adding controlled leakage throughout the cable, but the loss of amplitude would have put the signal down in the noise. The solution would come many years later: adding inductance. This makes intuitive sense: the problem is that the cable does not pass high frequencies, so attenuating the lower frequencies to match gives a flatter frequency response, and a flatter frequency response means less distortion in the time domain.
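Oliver Heaviside later made this intuition precise: a line is distortionless when R/L = G/C, where G is the leakage conductance per unit length. Practical cables have R/L far larger than G/C, and adding series inductance (what telephone engineers later did with loading coils) pushes the ratio toward that condition.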
A solution the early engineers did not try was a larger alphabet. They could have sent information in the signal level, which we would now call amplitude shift keying (ASK). They could also have encoded the data in bits rather than dots and dashes, since dashes are needlessly long. This would have required a whitening algorithm to prevent long strings of 1s or 0s, which are hard to count.
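As a toy illustration of whitening, here’s an additive scrambler built on a 7-bit linear-feedback shift register. The polynomial and seed are arbitrary choices for the example; the point is that XORing the data with a pseudo-random bit stream breaks up long runs, and because the key stream doesn’t depend on the data, running the same operation again undoes it:

```python
def scramble(bits, seed=0x5A):
    """XOR a bit sequence with the key stream of a 7-bit LFSR (x^7 + x^3 + 1)."""
    state = seed & 0x7F
    out = []
    for b in bits:
        key = ((state >> 6) ^ (state >> 2)) & 1   # feedback taps 7 and 3
        out.append(b ^ key)
        state = ((state << 1) | key) & 0x7F       # shift in the feedback bit
    return out

message = [1] * 16                  # a long run of 1s, hard to count on a slow line
wire = scramble(message)            # the run is broken up by the key stream
assert scramble(wire) == message    # same operation with the same seed descrambles
print(wire)
```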
Transmitting more data through a channel has come a long way in 150 years. In urban wireless channels, we’re bumping against Shannon’s Law, which quantifies the theoretical maximum data rate possible at a given bandwidth and signal-to-noise ratio. The first transatlantic cable suffered from being band-limited to less than 1 Hz and from poor SNR. (Increasing signal strength meant voltages that could damage the cable.) An 802.11n Wi-Fi channel has 40 MHz of bandwidth and 35 dB of SNR. To increase the data rate, we face the same roadblocks as 150 years ago: more throughput requires either more bandwidth or a better SNR.
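Plugging the numbers in makes the gap concrete. The Wi-Fi figures are the ones above; the 0 dB SNR for the cable is just a guess of mine to set the scale:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# 802.11n channel: 40 MHz at 35 dB SNR -> a ceiling of roughly 465 Mbit/s
print(f"{shannon_capacity(40e6, 35) / 1e6:.0f} Mbit/s")

# The first transatlantic cable: ~1 Hz of bandwidth, assumed 0 dB SNR -> ~1 bit/s
print(f"{shannon_capacity(1.0, 0):.1f} bit/s")
```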