A radio transmitter broadcasts plane electromagnetic waves whose electric field amplitude at a particular location is 1.550 times 10 to the negative third volts per meter. What is the amplitude of the electromagnetic wave’s magnetic field at that location?
We’re told that at a certain location, the electric field amplitude of this electromagnetic wave is 1.550 times 10 to the negative third volts per meter; we’ll call that 𝐸. We want to solve for the amplitude of the wave’s magnetic field at that same location; we’ll call that 𝐵. In this problem, we will assume that 𝑐, the speed of light, is exactly 3.00 times 10 to the eighth meters per second.
If we draw a sketch of this scenario, we have a radio tower emitting radio waves. And each of those waves has an electric and a magnetic component to it. There is a relationship between the magnitude of the electric field and the magnitude of the magnetic field in an electromagnetic wave.
This relationship says that the magnitude of the magnetic field 𝐵 is equal to the magnitude of the electric field 𝐸 divided by the speed of light 𝑐. When we apply this to our situation, we’re given 𝐸, the electric field, and 𝑐 is a known constant, so we can plug in for those now.
When we calculate this fraction, we find that the magnetic field magnitude 𝐵 is 5.17 times 10 to the negative 12th tesla, rounded to three significant figures. That’s the amplitude of the magnetic field given the amplitude of the electric field.
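As a quick check of the arithmetic, here is a short Python sketch of the same calculation (the variable names are our own, taken from the symbols in the solution):

```python
# Values from the problem statement.
E = 1.550e-3   # electric field amplitude, in volts per meter
c = 3.00e8     # speed of light, in meters per second (taken as exact here)

# In an electromagnetic wave, B = E / c.
B = E / c      # magnetic field amplitude, in tesla

print(f"B = {B:.3e} T")  # about 5.17e-12 T to three significant figures
```

Running this confirms the result of roughly 5.17 times 10 to the negative 12th tesla.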