@livinglava,
Quote:They are both frequencies because frequency just refers to waves-per-second; it says nothing about what's causing the frequency to be what it is.
Sure, it's true that the cause of the apparent difference in frequency (motion) is not evident just from looking at the frequency measured by the receiver. So what?
Quote:you [are] defending the idea that a light wave exists in its own unchanging frame and that other frames that interact with the light's frame are 'perceiving' it differently without actually changing it in any way within its own frame.
Of course I'm "defending" that idea. Who, other than a solipsist, would deny it to begin with?
Why is it that a radar gun, based on the known "doppler effect," can measure relative speed?
It can't, without some assumptions.
Assumption: Electromagnetic waves always travel at one uniform speed, i.e. "c" (in a vacuum). Its speed does not vary with time, distance traveled, or anything else. It is "constant." Or, as you put it, "a light wave exists in its own unchanging frame."
Therefore, if you DON'T measure it at that speed in some other frame, then you are wrong. You are mismeasuring the speed. You are the victim of an "optical illusion," because you are not measuring its "true" speed.
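As a concrete sketch of how a radar gun turns that constant-c assumption into a relative speed: the reflected wave comes back Doppler-shifted, and the gun solves for the speed that would produce that shift. The frequencies below are hypothetical example values, not the specs of any real radar gun, and the formula is the usual non-relativistic approximation.

```python
# Sketch: inferring relative speed from the Doppler shift.
# Rests on the assumption stated above: light always travels at c.

C = 299_792_458.0  # speed of light in vacuum, m/s

def speed_from_doppler(f_emit, f_recv):
    """Relative speed of a reflecting target (non-relativistic approximation).

    The wave is shifted twice: once on the way to the moving target
    and once on reflection back to the gun, hence the factor of 2.
    """
    return C * (f_recv - f_emit) / (2.0 * f_emit)

# Hypothetical example: a 24 GHz radar sees a return shifted up by 4 kHz.
f0 = 24.0e9
shift = 4.0e3
v = speed_from_doppler(f0, f0 + shift)
print(round(v, 2))  # roughly 25 m/s, i.e. about 90 km/h
```

A positive shift (received frequency above emitted) means the target is closing; a negative shift means it is receding.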
You will mismeasure the "true" speed of light if you miscalculate the distance traveled. If you are moving toward, or away from, a source of light, then you will miscalculate the distance the light must travel if you assume there is no relative motion between you and the emitter, because you fail to account for the fact that the distance is changing all the time. You must correct for that to get the "true" distance the light must travel to reach you. Only by using the correct distance traveled can you measure the true speed of light.
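The distance-correction argument above can be put in numbers. This is a sketch under the paragraph's own classical assumptions; the separation and observer speed are made-up values chosen only to make the effect visible.

```python
# Sketch: mismeasuring c by ignoring that the distance is closing.
# All values are hypothetical; units are meters and seconds.

C = 299_792_458.0   # assumed "true" speed of light, m/s

d0 = 3.0e8          # separation when the pulse is emitted, m
v_obs = 3.0e7       # observer moving toward the emitter at 0.1c, m/s

# Time for the pulse to reach the closing observer: solve c*t = d0 - v*t.
t = d0 / (C + v_obs)

# Naive measurement: assume the separation was still d0 at arrival.
naive_speed = d0 / t                 # comes out near c + v, not c

# Corrected measurement: use the distance the light actually traveled.
true_distance = d0 - v_obs * t
corrected_speed = true_distance / t  # recovers c

print(f"naive:     {naive_speed:,.0f} m/s")   # about c + v
print(f"corrected: {corrected_speed:,.0f} m/s")  # about c
```

The naive figure overstates the speed by exactly the observer's own speed; subtracting the distance the observer closed during the transit time restores the assumed constant c.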