Pentcho Valev
2015-07-05 17:10:27 UTC
http://www.einstein-online.info/spotlights/doppler
Albert Einstein Institute: "The frequency of a wave-like signal - such as sound or light - depends on the movement of the sender and of the receiver. This is known as the Doppler effect. (...) Here is an animation of the receiver moving towards the source:
[Animation: stationary receiver]
[Animation: moving receiver]
By observing the two indicator lights, you can see for yourself that, once more, there is a blue-shift - the pulse frequency measured at the receiver is somewhat higher than the frequency with which the pulses are sent out. This time, the distances between subsequent pulses are not affected, but still there is a frequency shift: As the receiver moves towards each pulse, the time until pulse and receiver meet up is shortened. In this particular animation, which has the receiver moving towards the source at one third the speed of the pulses themselves, four pulses are received in the time it takes the source to emit three pulses." [end of quotation]
That is, the speed of the pulses relative to the stationary receiver is c = 3d/t, but relative to the moving receiver is c' = 4d/t = (4/3)c, where d is the distance between subsequent pulses and t is "the time it takes the source to emit three pulses".
Clearly the speed of light (relative to the receiver) varies with the speed of the receiver, in violation of Einstein's relativity.
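The arithmetic of the quoted animation can be sketched numerically. This is just the classical closing-speed count described in the quotation; the numbers for c and d are illustrative, not measured values:

```python
# Sketch of the animation's arithmetic: the receiver moves toward the
# source at one third of the pulse speed, so in the time t in which the
# source emits 3 pulses, the receiver meets 4 of them.
c = 300000.0   # pulse speed, km/s (illustrative)
v = c / 3.0    # receiver speed in the animation
d = 300000.0   # distance between subsequent pulses, km (illustrative)
t = 3 * d / c  # "the time it takes the source to emit three pulses"

pulses_received = (c + v) * t / d  # classical closing-speed count
ratio = (c + v) / c                # c'/c quoted in the text

print(pulses_received)  # 4.0 pulses received in time t
print(ratio)            # 4/3, i.e. c' = (4/3)c
```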
Let us formulate the problem in more precise terms. Consider a light source emitting a series of pulses separated by a distance d (e.g. d = 300000 km). The frequency of the pulses at a stationary receiver is f = c/d:
http://www.einstein-online.info/images/spotlights/doppler/doppler_static.gif
The receiver starts moving with speed v towards the light source - the frequency shifts from f = c/d to f' = (c+v)/d:
http://www.einstein-online.info/images/spotlights/doppler/doppler_detector_blue.gif
Question: Why does the frequency shift from f = c/d to f' = (c+v)/d?
Answer 1 (fatal for Einstein's relativity): Because the speed of the pulses relative to the receiver shifts from c to c' = c+v.
Answer 2 (saving Einstein's relativity): Because the motion of the receiver somehow changes the distance between the pulses - this distance should shift from d to d' = cd/(c+v) (otherwise goodbye Einstein!).
Answer 1 is reasonable (it applies to all types of waves); Answer 2 is obviously absurd. So the attempt to save Einstein's relativity amounts to a reductio ad absurdum, which means that the underlying premise - Einstein's 1905 constant-speed-of-light postulate - is false.
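The quantities appearing in the two answers can be checked numerically. The sketch below simply evaluates the classical formulas stated above; v is an illustrative receiver speed, not a value from the animation:

```python
c = 300000.0  # pulse speed, km/s (illustrative)
d = 300000.0  # distance between subsequent pulses, km, so f = 1 Hz
v = 100000.0  # receiver speed toward the source, km/s (illustrative)

f  = c / d            # frequency at the stationary receiver
f2 = (c + v) / d      # f' - frequency at the moving receiver
d2 = c * d / (c + v)  # d' - pulse spacing demanded by Answer 2

print(f)   # 1.0 Hz
print(f2)  # 4/3 Hz
print(d2)  # 225000.0 km, chosen so that c/d' equals f'
```

Note that d' is defined precisely so that c/d' = f', i.e. so that the observed frequency shift can be attributed to a changed spacing rather than a changed speed.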
Pentcho Valev