
Radio vs. Sound Waves

Published by marco on

Updated by marco on

I played Kahoot[1] the other day with the family. The quizzes are pretty wide-ranging and pretty decent fun, especially for a mix of ages. One of the quizzes concerned sound and electromagnetic waves and I tried to explain why one of the answers was incorrect “in the moment”, as it were. Concerned that my explanation had engendered rather than answered questions, I take another crack at it below.

Radio vs. Sound Waves

I was thinking again about how you didn’t seem too convinced by my fumbling attempts to explain the difference between sound and radio waves. The point I was trying to make is better made by the article Difference Between Radio Waves and Sound Waves (Pediaa), which writes,

“[…] radio waves are a type of electromagnetic wave that can travel when there is no medium, whereas sound waves are a type of mechanical wave that cannot travel if there is no medium.”

In describing how the two do interact, I used the words “encode” and “decode”. Sound waves are recorded by a microphone, transformed into a digital or analog signal, transmitted via radio waves, transformed back into a signal, and, finally, used to drive a speaker, which is generally a vibrating surface that generates sound waves in the air near you, reproducing the original sound. There’s a whole lot more detail to the encoding format, the process, the electronics, etc., but that’s the basic gist.
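The “recording” half of that chain can be sketched in a few lines of code. This is a minimal, hypothetical example, assuming an idealized microphone and analog-to-digital converter: it “encodes” a pure 440 Hz tone by sampling the continuous wave at regular intervals and storing each measurement as a number, which is all a digital recording really is.

```python
import math

# Hypothetical sketch: sampling a continuous 440 Hz sound wave into
# discrete numbers, the way a microphone plus an analog-to-digital
# converter "encodes" sound. Values here are assumptions for illustration.
SAMPLE_RATE = 8000  # samples per second (real audio uses 44100 or more)
FREQ = 440          # A4 concert pitch, in Hz

def record(duration_s):
    """Turn a continuous sine wave into a list of discrete samples."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * FREQ * i / SAMPLE_RATE) for i in range(n)]

samples = record(0.01)  # 10 ms of "sound" becomes 80 plain numbers
print(len(samples))     # 80
```

Playing the sound back is the reverse: a digital-to-analog converter turns those numbers back into a voltage that drives the speaker’s vibrating surface.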

Taking Pictures

You didn’t sound convinced even by that, so I thought of what is, perhaps, a better analogy.

When you take a picture with your phone, you’re doing nearly exactly the same thing, but with electromagnetic waves in the visible spectrum. These waves exist in a particular place; they strike the CCD (charge-coupled device) in your camera, which records millions of data points that are then encoded in a picture format (e.g. JPEG or RAW).

When you send that picture, you don’t think of it as “sending the light waves over radio waves”, though, do you? You think of it as sending the picture. But that picture is just a frozen representation of light waves that once existed in a certain place and time.

That you then use your phone to encode radio waves to transmit the picture to someone else (or to store in the cloud) has nothing to do with the original light waves. You could just as easily have copied the JPEG to a computer, onto a USB stick and transported it that way.

That’s what I meant when I said that sound waves don’t have anything to do with radio waves. The radio wave is not intrinsic to the process of hearing sound waves; it’s just a common transmission mechanism for all sorts of data (digital or analog, though increasingly digital these days).

Recording and Playing Sound

In the olden days, a microphone (a vibration-sensitive membrane connected to a sensor, generally piezoelectric) transformed, or encoded, the vibrations into a voltage with a certain amplitude and frequency, which drove a needle that inscribed a wax cylinder or vinyl disc. You could then carry (not transmit) this cylinder somewhere else, where a device that, essentially, reverses the recording process would decode the sound and “play” it. No radio waves in sight.

You can extrapolate the example of the picture to a series of moving pictures that even includes sound (a video). In that case, you are capturing information using two detectors (a microphone and a camera/CCD), 30-60 times per second, and the software in the phone combines these inputs into a single recording, storing it as a file. That file has nothing to do with radio or sound or light waves.

“Files”

Even the “file” isn’t anything like the metaphorical mental picture we tend to use for it. The flash storage in which it resides is a block of material that records trillions of individual “values” in the form of a physical property that is in one of two distinct states, each of which can be read by yet another sensor. This sensor applies a voltage to these distinct regions of the material and interprets the results as 1s and 0s. Hard drives use magnetism instead of electrical charge.
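To make the point above concrete, here is a tiny sketch showing that a “file” is nothing more than a run of bytes, each of which is eight of those two-state “values” (bits). It takes a short piece of text and prints the raw 1s and 0s the storage hardware actually holds:

```python
# A file is just bytes, and each byte is eight two-state values (bits).
# Here we "store" a two-character text and show the raw bit pattern.
data = "Hi".encode("utf-8")            # two bytes: 0x48, 0x69
bits = "".join(f"{byte:08b}" for byte in data)
print(bits)  # 0100100001101001
```

Whether those sixteen bits end up as charge states in flash storage or magnetic domains on a hard drive is entirely up to the hardware; the software only ever sees the 1s and 0s.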

A file comprises a region of these physical states that the phone can read and write and that the software can reliably interpret as 1s and 0s. It is read with these sensors and interpreted by the software to produce signals that “play” it, applying further voltages to red, green, and blue emitters in the “screen” that emulate the original light waves (and doing the same for the original sound waves using a speaker, as described above), but those are new light and sound waves.
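The decoding step described above can be sketched as well. This is a deliberately simplified, hypothetical example: it assumes an uncompressed 3-bytes-per-pixel layout (real formats like JPEG are compressed), and shows how software groups stored bytes into the red/green/blue intensities that drive the screen’s emitters.

```python
# Hypothetical sketch: interpreting stored bytes as red/green/blue
# intensities, the way software decodes image data before driving the
# screen. The 3-bytes-per-pixel layout is an assumption for illustration.
raw = bytes([255, 0, 0, 0, 255, 0, 0, 0, 255])  # nine stored bytes

def decode_pixels(data):
    """Group raw bytes into (R, G, B) triples, one per pixel."""
    return [tuple(data[i:i + 3]) for i in range(0, len(data), 3)]

pixels = decode_pixels(raw)
print(pixels)  # [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
```

The bytes themselves carry no light; they only become light again when the screen’s emitters are driven to those intensities, producing brand-new light waves.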

A Miracle in Your Hand

It’s actually kind of a miracle that any of this works at all, don’t you think? I’ve got a 3-year-old used iPhone 6s that can record 4K video at 30 FPS (33.3ms per frame) and store it in a format that can be immediately replayed and shared. It’s not only that it works, but that it can do all of this so efficiently, using so little power relative to the task, and in the palm of your hand. Amazing, really.

And don’t even get me started about how that phone talks to space all day long (using radio waves). Check out the video All you need to know to understand 5G by Sabine Hossenfelder (YouTube).


[1]

For the hell of it, I switched the UI to French to improve the learning effect and saw a very odd encoding error.

 Tu vois ton pseudo à l&#x27écran ?

All of the accented characters are correct, but the single quote remained encoded, for some strange reason—there is never any reason to encode it in the first place, really, since it’s not a reserved character in HTML.