Sunday, June 5, 2011
My wife likes to listen to a small shower radio during her morning routine, and she laments that she can only pick up one decent classical station, and none of the popular music she likes. However, the Christian radio stations come in loud and clear, as do a few particularly awful country stations and a long string of Spanish-language stations. (It seems neither of us is fond of modern country music.)
We were discussing this phenomenon: the stations we did not like seemed to come in strongly across the dial, while the "good" stations were weak or short-range by comparison.
"Perhaps the reason why the bad stations are clearer is because fewer people listen to them, and are sucking up all the good stations." My wife commented.
I found this a bit funny and started smiling. "It does not work that way!" I stated confidently, but then I thought about it a bit...
"Why do you think that?" I asked?
"Well, my cell phone gets worse signals in areas with lots of users, and the WiFi networks get slower when there are a lot of people using the connection." Why not radio stations.
Do radio stations have a set bandwidth, like the cell phone towers or WiFi hubs?
Well, not really. The power a radio station puts out is attenuated mostly by distance from the tower, and by the amount of matter absorbing, scattering, and reflecting the radio waves.
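To put a rough number on the distance part: in free space the transmitted power spreads over an ever-larger sphere, so the power density falls off as the inverse square of distance. A minimal sketch, assuming free-space propagation and a made-up (but plausible) 50 kW FM transmitter:

```python
import math

def power_density_w_per_m2(tx_power_watts, distance_m):
    """Free-space power density: transmitter power spread over a sphere."""
    return tx_power_watts / (4 * math.pi * distance_m ** 2)

# Hypothetical 50 kW FM station, receiver 20 miles (~32 km) away
distance = 20 * 1609.34  # miles to meters
density = power_density_w_per_m2(50e3, distance)
print(f"power density at 20 miles: {density:.2e} W/m^2")
```

That works out to a few microwatts per square meter, before any absorption or scattering by buildings and terrain is taken into account.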
Also, given that a receiving antenna is sucking down all the radio stations at once, and it is the tuner that filters out the unwanted stations, one could claim that a receiving antenna attenuates all the stations equally. One could get picky about how some antenna lengths resonate better with particular frequencies than others, I suppose.
But one may still wonder... does a receiving antenna that is actively tuned to a particular frequency absorb more power from the signal than one that is not active, or tuned to another station?
And if so, how much power?
----- Start Fermi Problem ------
Estimate how many radios there are within 20 miles of a transmitter in any given city.
Estimate how much power a single receiving antenna attenuates from the signal.
When it is active? When it is not tuned to the station?
Calculate the difference in power a receiving antenna would observe at 20 miles if suddenly all the other antennas were to physically disappear!
----- End Fermi Problem -----
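Here is one way I might run the numbers, with every figure a rough guess: a city of a million people with one radio per two people, listeners averaging ten miles from the tower, and each tuned antenna modeled as a resonant half-wave dipole, whose effective aperture is about 0.13 wavelengths squared (around 1 m² in the FM band). An untuned or detuned antenna absorbs far less, so this is the generous case:

```python
import math

# Fermi estimate -- all numbers are guesses, not measurements.
TX_POWER_W = 50_000          # a large FM station, ~50 kW
N_RADIOS = 500_000           # ~1 radio per 2 people, city of a million
AVG_DISTANCE_M = 10 * 1609   # listeners average ~10 miles out
WAVELENGTH_M = 3.0           # FM band, ~100 MHz

# Effective aperture of a tuned half-wave dipole: ~0.13 * lambda^2
aperture_m2 = 0.13 * WAVELENGTH_M ** 2

# Free-space power density at the average listener distance
density = TX_POWER_W / (4 * math.pi * AVG_DISTANCE_M ** 2)

# Total power soaked up by all the tuned radios combined
absorbed_w = N_RADIOS * aperture_m2 * density
fraction = absorbed_w / TX_POWER_W

print(f"absorbed by all radios: {absorbed_w:.1f} W of {TX_POWER_W} W "
      f"({fraction:.1e} of the transmitted power)")
```

Under these guesses, half a million tuned radios together soak up only on the order of ten watts out of fifty kilowatts, a few hundredths of a percent. So even if every other antenna in town physically disappeared, a receiver at 20 miles would see essentially no change.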
There are probably many other reasons for a perceived difference in signal strength between certain types of stations. And this is a fun topic to think about.
Perhaps country music just seems louder to those who do not want to hear it.
Perhaps country music stations cater more towards a demographic that travels more during the day, and thus they are simply transmitted at a higher power than popular music stations.
Perhaps Christian radio stations like to be louder to catch your attention! After all, they are trying to get their message out and save your soul!
But more listeners would seem to be a very small contributor to relative signal strengths, at least from a physics standpoint.