I want to explore some of the basics of transmitting digital signal over a satellite link, both for general interest and also as a way of getting these concepts more solidly integrated into my own understanding. For the most part, in my work supporting a couple of over-the-air IP networks, the mechanics of the space link are often a sort of amorphous set of givens, much like the "cloud" graphic used to depict the Internet in network diagrams. My goal here is to remove as much of the fuzziness as I can without getting too lost in details of things like link budgets and Reed-Solomon codes.
Communicating digitally over a satellite connection can often feel a lot like using an old-school modem: your sleek, digital broadband datastream gets snarled in a slow, often balky analog link. Indeed, the mechanics are the same - data is modulated, carried over a long (in this case, very long) distance, then demodulated and passed along to its destination. When you begin to get a sense of all the things that can (and often do) happen to an electromagnetic signal on the way to and from a satellite 35,000 kilometres above the Earth, you may be struck, as I was, by what a miracle it is that the data can pass at all, let alone at a decent speed and error rate!
Our fundamental obstacle in bouncing our signals off a spacecraft in geosynchronous orbit is signal loss due to the distance involved and the absorption of signal by atmospheric gases. "Noise," or interference to the signal, is introduced at various points along the link, and it's the job of a radio frequency (RF) engineer to determine the right amount of power to overcome that noise without saturating the receiver. Think of this optimization as being like trying to shout to be heard in a noisy room: if you speak too softly, no one can hear you, but if you scream your head off, your "signal" will be distorted, and no one will understand you. So there's a fairly delicate balance at work here: there needs to be enough power to get the signal through the various sources of loss without over-driving it, while leaving enough "extra" power available to overcome transient sources of loss like rain.
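That balancing act is easiest to see in decibels, where the whole problem reduces to simple arithmetic. Here is a minimal sketch in Python of how that "extra" power, the link margin, absorbs a rain fade; every number is an assumed, illustrative figure, not from any real link:

```python
# Toy illustration of link margin. All values are assumed examples.
clear_sky_cn_db = 12.0  # carrier-to-noise ratio under clear sky, dB
required_cn_db = 8.0    # minimum C/N the modem needs for the target error rate
rain_fade_db = 3.0      # extra attenuation during a heavy rain event

margin_db = clear_sky_cn_db - required_cn_db
print(f"Clear-sky margin: {margin_db:.1f} dB")
print(f"Margin in rain:   {margin_db - rain_fade_db:.1f} dB")
# If a rain fade ever exceeds the clear-sky margin, the link drops below
# its required C/N and errors climb - hence the power held in reserve.
```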
I often find it useful to follow the path from source to receiver as a way of understanding how a given link works. So, in a drastically simplified model, let's travel along with a packet as it traverses a satellite link between two terrestrial networks.
As in any digital-analog-digital transmission, the first thing that happens is the conversion from a digital bitstream to an analog bandpass signal. The details of this process are well beyond the scope of this brief introduction, so let's just say that our packet has now been magically transformed into a series of symbols. Depending on the type of modulation we are using, our symbol rate may vary, since each symbol can carry more than one bit (we used to call this "baud rate" back in the day). Now our structured, discrete bitstream has fallen through the rabbit hole into the often confusing world of continuous waveforms. Things are about to get much more interesting.
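To make the bit/symbol distinction concrete, here is a quick sketch (with an assumed 1 Msym/s symbol rate) of how the same symbol rate yields different raw bit rates under different modulation schemes, since an M-ary scheme carries log2(M) bits per symbol:

```python
import math

# Common modulation schemes and their constellation sizes (M).
modulations = {"BPSK": 2, "QPSK": 4, "8PSK": 8, "16QAM": 16}

symbol_rate = 1_000_000  # symbols per second - an assumed example rate

for name, m in modulations.items():
    bits_per_symbol = math.log2(m)
    print(f"{name}: {symbol_rate * bits_per_symbol / 1e6:.1f} Mbit/s raw")
# Note: real links also spend some of these bits on forward error
# correction, so usable throughput is lower than this raw figure.
```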
Our waveform is now traveling across the inter-facility link, which we might think of as the "runway" the signal traverses before it takes off into space at the satellite dish. Because it is traveling across a shielded coaxial cable, it can use a much lower frequency than it will later require. For the VSAT (Very Small Aperture Terminal) systems I am used to, this is typically a 70 or 140 MHz intermediate frequency, or an L-band signal. Before it gets to the dish, it will be upconverted to a much higher frequency. The Ku band used by the equipment that I work with operates anywhere between 12 and 18 GHz; other equipment uses C-band, at lower frequencies. This microwave signal is then dramatically boosted (by many orders of magnitude) with a high-power amplifier and transmitted into space.
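Upconversion itself is conceptually just addition: an idealized mixer shifts the intermediate frequency up by the local oscillator (LO) frequency. A toy sketch, where the LO value is an assumed example chosen to land in the Ku-band uplink range:

```python
# Idealized upconversion: RF = IF + LO. Values are assumed for illustration.
if_hz = 70e6       # 70 MHz intermediate frequency on the inter-facility link
lo_hz = 14.180e9   # local oscillator frequency (assumed example)

rf_hz = if_hz + lo_hz
print(f"Transmit RF: {rf_hz / 1e9:.3f} GHz")  # 14.250 GHz, within Ku band
# A real mixer also produces the difference product (LO - IF) and other
# spurious outputs, which must be filtered off before amplification.
```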
The feed on a satellite dish is placed at the dish's focal point, which directs most of the signal in the direction of the spacecraft. Because this is a radio transmission, of course, it does not stay confined to a tight beam like a laser, but inevitably spreads out in other directions. In general, the larger the antenna, the better its ability to concentrate power. The measure of the effectiveness of a directional antenna is called gain, and it becomes of crucial importance as we attempt to get our signal out to the satellite.
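For a parabolic dish, gain grows with the square of both the diameter and the frequency. Here is a rough Python sketch using the standard aperture formula, with an assumed (but typical-looking) 60% efficiency:

```python
import math

def parabolic_gain_dbi(diameter_m, freq_hz, efficiency=0.6):
    """Approximate parabolic dish gain: G = eta * (pi * D / lambda)^2."""
    wavelength = 3e8 / freq_hz
    gain_linear = efficiency * (math.pi * diameter_m / wavelength) ** 2
    return 10 * math.log10(gain_linear)

# Compare a small VSAT dish to larger antennas at a Ku-band uplink
# frequency (the sizes here are assumed examples).
for d in (1.2, 3.8, 9.0):
    print(f"{d} m dish at 14 GHz: {parabolic_gain_dbi(d, 14e9):.1f} dBi")
```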
When planning a satellite link, an RF engineer uses what's called a "link budget" to predict the performance of the link. The link budget takes into account the loss and gain of power along the link, as well as the various impairments (noise) that affect the signal. Given a specific power level at the transmit terminal, the performance at the receive terminal can be calculated - and, working backwards, the transmit power necessary to achieve a desired performance on the receive side.
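Because everything is expressed in decibels, the budget itself is a chain of additions and subtractions. A drastically simplified one-way (uplink only) sketch follows; every figure is an assumed, illustrative value, and a real budget has many more terms (pointing loss, atmospheric absorption, polarization mismatch, and so on):

```python
# Drastically simplified uplink budget. All values in dB(-relative) units,
# and all of them assumed for illustration.
tx_power_dbw = 10.0         # 10 W high-power amplifier
tx_antenna_gain_dbi = 42.7  # 1.2 m Ku-band dish (see the gain sketch above)
misc_losses_db = 1.5        # feed/cable and miscellaneous losses
free_space_loss_db = 206.5  # path loss to GEO at 14 GHz (see the sketch below)
sat_rx_gain_dbi = 31.0      # satellite receive antenna gain

eirp_dbw = tx_power_dbw + tx_antenna_gain_dbi - misc_losses_db
rx_power_dbw = eirp_dbw - free_space_loss_db + sat_rx_gain_dbi
print(f"EIRP: {eirp_dbw:.1f} dBW, power at satellite: {rx_power_dbw:.1f} dBW")
```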
As our packet - now a series of symbols - rises from the Earth, it must first fight its way through the atmosphere. In addition to rain (a notorious absorber of microwave signal) and thermal noise, our signal may also have to contend with other signals, or "interferers," as it heads out to space. By the time a signal reaches the satellite, even the most effective antenna cannot prevent it from spreading out hundreds of miles. Each satellite carries multiple transponders in its payload, so with our signal drastically attenuated and battered by noise so far out in space, a key consideration now becomes adjacent interference: the effect of other signals passing to and from the target satellite... or to and from other satellites (it is getting crowded out there). Regulations require that your own signal minimize such interference, so your concern now becomes how to get the most out of your own performance while being mindful of those on adjacent transponders. Here, being a good neighbor is not just a good idea... it's the law.
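The "drastically attenuated" part can be quantified with the standard free-space path loss formula, which depends only on distance and frequency. A quick sketch:

```python
import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss: 92.45 + 20*log10(d_km) + 20*log10(f_GHz)."""
    return 92.45 + 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz)

# Slant range to a geostationary satellite is at least ~35,786 km
# (more when the satellite sits low on your horizon).
print(f"Ku uplink (14 GHz): {fspl_db(35786, 14.0):.1f} dB")
print(f"C uplink  (6 GHz):  {fspl_db(35786, 6.0):.1f} dB")
# 200+ dB of loss: the signal arrives roughly twenty orders of magnitude
# weaker than it left the dish, before rain or interference even enter in.
```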
The satellite is usually used as a "bent-pipe" repeater; that is to say, it receives the signals sent to it, filters and amplifies them, and transmits them back to Earth over its downlink. The satellite receiver is itself a major source of noise in the link and must be taken into consideration. Here, a stray signal can become intermodulated with yours - this noise is then injected into your signal and blasted back to Earth.
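From the signal's point of view, a bent pipe is little more than a frequency translation plus amplification. A toy sketch, where the translation offset is an assumed example rather than any particular satellite's frequency plan:

```python
# Bent-pipe transponder modeled as pure frequency translation: the satellite
# shifts the uplink carrier down to the downlink band and re-amplifies it.
uplink_ghz = 14.250
translation_ghz = 2.300  # assumed uplink-to-downlink offset for this example

downlink_ghz = uplink_ghz - translation_ghz
print(f"Downlink carrier: {downlink_ghz:.3f} GHz")  # 11.950 GHz
# Whatever noise and intermodulation products fall inside the transponder's
# passband get translated and amplified right along with your carrier.
```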
At the ground receiver, the signal is received in whatever state it arrives, and error correction is applied to counteract the effects of degradation over its long journey. The reconstruction of a noisy carrier into its original signal is one of the great miracles of this whole process. Remember that, at this level, we do not simply ask for re-transmission of garbled data - if that happens at all, it is handled by higher-layer digital protocols. Rather, the signal is reconstructed from redundancy built into the transmission itself (the details of which are, again, out of the scope of this introduction).
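The simplest possible illustration of that idea is a repetition code: send each bit several times and take a majority vote at the far end. Real links use far more powerful codes (convolutional, Reed-Solomon, LDPC); this Python toy only shows the principle of recovering data from redundancy rather than retransmission:

```python
import random

def encode(bits):
    """3x repetition code: transmit every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

data = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = [b ^ (random.random() < 0.05) for b in encode(data)]  # ~5% bit flips
# With a low flip rate, the vote almost always recovers the original bits.
print("recovered ok:", decode(noisy) == data)
```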
Once received, this signal is mixed with a local oscillator to convert it back down to L-band, then passed to the demodulator for conversion back into bits. Only now can the bithead's beloved internet protocols do their magic - and yet, with the high latency of the space link, TCP's handshake-and-acknowledge rhythm becomes a joke. With a modest window size, transmission slows to a crawl as each window of packets must cross the link and be acknowledged before the next is allowed to transmit. There are various IP acceleration schemes that more or less spoof TCP so that speeds approaching the broadband transfer rates we have grown accustomed to can be achieved. The most famous of these is the Space Communications Protocol Specification, or SCPS (pronounced "skips"), though there are newer (and better) commercial implementations being released all the time.
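The arithmetic behind that crawl is the bandwidth-delay product: over a geostationary hop the round trip runs roughly 500-600 ms, and TCP throughput is capped at about window size divided by round-trip time. A quick sketch, using ballpark figures:

```python
# Why stock TCP crawls over GEO: throughput <= window / round-trip time.
rtt_s = 0.6            # ~600 ms round trip over a GEO hop (approximate)
window_bytes = 65_535  # classic TCP window without window scaling

max_throughput_bps = window_bytes * 8 / rtt_s
print(f"Max TCP throughput: {max_throughput_bps / 1e3:.0f} kbit/s")  # ~874

# Window needed to fill, say, a 10 Mbit/s carrier at this RTT:
needed_window = 10e6 * rtt_s / 8
print(f"Needed window: {needed_window / 1024:.0f} KiB")  # ~732 KiB
# This gap is why accelerators acknowledge packets locally on each side
# of the space link instead of waiting out the full round trip.
```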
This introduction is presented in draft form, and I welcome - in fact am begging for - your comments and clarifications.