Why is the transmission data rate between Mars and Earth so low?

Physics · Asked by scrrr on October 28, 2021

I read that we get between 500 and 32k bits per second when sending data from Mars to Earth. Apparently it's substantially higher between the Moon and Earth. What are the reasons? Please explain the relationship between the frequency band used, the range, and the bandwidth (if there is any), and perhaps the term bandwidth in general. It sometimes seems to mean a range of frequencies and sometimes a data rate. Thanks!

One Answer

Why does data rate decrease with distance?

This comes down to the relation between the signal-to-noise ratio of a communication channel and the data rate that can theoretically be transmitted through it.

The basic reasoning goes like this: The data transmitter has only a limited amount of power available to it. The inverse square law then tells you that the power received on Earth, at a distance $R$, scales as $1/R^2$ (let's assume for simplicity that we're comparing the same receivers and transmitters, just at different distances).

The received signal always contains some noise. Its intensity typically depends on the receiver design and its environment and is mostly independent of the signal you're actually receiving. To be able to decode your signal, you need to average it for some amount of time $T$. This basically reduces the standard deviation of the noise by a factor proportional to $\sqrt{T}$ and its power (which scales quadratically with the amplitude) by $T$. To recognize your signal among the noise, you need to push the noise power below the signal level. This means that the necessary averaging time $T$ scales as $R^2$.

So let's say you're sending your bit stream with simple on-off keying, i.e. signal on for 1 and signal off for 0. You need to make each bit last for a time of roughly $T$ such that the receiver, after averaging the received signal for this time, can reliably tell a 0 from a 1. And so we get that the data rate you can realistically send (with a fixed transmitter and receiver) goes down with the distance as $1/R^2$.
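As a rough numerical illustration of that scaling (not part of the original answer), here is a small Python sketch. The Earth-Moon and Earth-Mars distances are approximate averages, and the 1 Mbit/s Moon-link rate is an assumed placeholder, not a real figure:

```python
# Illustrative only: how the 1/R^2 scaling plays out for Moon vs. Mars distances.
R_MOON = 3.84e8    # Earth-Moon distance in metres (approximate)
R_MARS = 2.25e11   # typical Earth-Mars distance in metres (varies a lot with orbits)

rate_moon = 1e6    # assumed data rate from the Moon, bits per second (placeholder)

# Same transmitter and receiver, just farther away: rate scales as 1/R^2.
rate_mars = rate_moon * (R_MOON / R_MARS) ** 2

print(f"distance ratio: {R_MARS / R_MOON:.0f}x")
print(f"scaled Mars data rate: {rate_mars:.1f} bit/s")
```

With these assumed numbers, a few hundred thousand-fold drop in received power turns megabits per second into a handful of bits per second.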

Caveats

Remember that we have made this comparison between different distances for the same transmitters and receivers. Without this constraint, you could in principle reach equal data rates from the Moon and from Mars by scaling up the transmitter power on Mars proportionally to the squared ratio of the two distances, making the signal-to-noise ratio at the receiver end the same. Similarly, you could make the receiving antenna for the Mars link larger and get the same result.
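To put a number on that caveat, a minimal sketch with the same illustrative (assumed) distances as above:

```python
# How much more transmitter power (or, equivalently, receiving antenna area)
# would the Mars link need to match the Moon link's signal-to-noise ratio?
R_MOON = 3.84e8    # metres (approximate)
R_MARS = 2.25e11   # metres (typical; varies with orbital positions)

power_scale_up = (R_MARS / R_MOON) ** 2
print(f"required scale-up: ~{power_scale_up:.1e}x")   # roughly 3e5 times
```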

"Data bandwidth" versus "frequency bandwidth"

As for your question about the relation between data bandwidth and frequency bandwidth, consider that to send a data stream with a bit rate $B$ using on-off keying, you need to modulate the carrier wave at this rate. This means that the spectrum of the modulated signal will have a bandwidth on the order of $B$. So vaguely speaking, you could say that (at least for on-off keying) the data bandwidth gives you a lower bound on the frequency bandwidth needed to send the data. Or conversely, the available frequency bandwidth gives you an upper bound on the data bandwidth you can achieve.
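As a sanity check on that claim, here is a minimal numerical sketch (not from the answer; the bit rate and all other parameters are made up). It generates a random on-off keyed baseband stream and estimates the bandwidth containing 90% of its power:

```python
import numpy as np

rng = np.random.default_rng(0)

bit_rate = 1000.0                # bits per second (assumed value)
samples_per_bit = 64
fs = bit_rate * samples_per_bit  # sampling rate of the simulated waveform

# Random bit stream, on-off keyed at baseband: 1 -> signal on, 0 -> signal off
bits = rng.integers(0, 2, size=4096)
waveform = np.repeat(bits, samples_per_bit).astype(float)

# Power spectrum (DC removed so the constant offset does not dominate)
spectrum = np.abs(np.fft.rfft(waveform - waveform.mean())) ** 2
freqs = np.fft.rfftfreq(waveform.size, d=1.0 / fs)

# Crude bandwidth estimate: frequency below which 90% of the power lies
cumulative = np.cumsum(spectrum) / spectrum.sum()
bandwidth_90 = freqs[np.searchsorted(cumulative, 0.9)]

print(f"bit rate: {bit_rate:.0f} bit/s, 90%-power bandwidth: {bandwidth_90:.0f} Hz")
# Prints a bandwidth on the same order as the bit rate, as argued above.
```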

It's important to note that on-off keying is only one of many possible types of data encodings and other ones could allow you to transmit higher data rates. More generally, the relation between frequency bandwidth and data bandwidth (quantified by an information theory concept called "channel capacity") also depends on the signal-to-noise ratio and is given by the Shannon-Hartley theorem.
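For completeness, a small sketch of the Shannon-Hartley formula $C = B \log_2(1 + S/N)$ with made-up numbers, showing how a fixed frequency bandwidth supports very different data rates as the signal-to-noise ratio changes (which is what happens as the distance grows):

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: maximum error-free data rate in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Fixed 1 MHz of frequency bandwidth (an arbitrary example figure); only the
# signal-to-noise ratio changes, as it would with increasing distance.
bandwidth = 1e6  # Hz
for snr_db in (20.0, 0.0, -10.0):
    snr = 10.0 ** (snr_db / 10.0)
    capacity = channel_capacity(bandwidth, snr)
    print(f"SNR {snr_db:+6.1f} dB -> capacity {capacity / 1e6:.3f} Mbit/s")
```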

Answered by aekmr on October 28, 2021
