Galactic Redshifts and Supernova Light Curves

Ever since its discovery by Edwin Hubble in the 1920s, the linear dependence of the redshift of galaxies on their distance has been interpreted in terms of the well-known Doppler effect, i.e. as an overall recession of all galaxies (for the sake of historical correctness it should be pointed out that Hubble himself was apparently never certain about this interpretation of the redshift). Despite the obvious philosophical and logical problems with such a model (see my main Cosmology page), other redshift mechanisms have been ruled out on both alleged theoretical and observational grounds.

The latest alleged argument for a recession-related redshift is the observed change in the shape of the light curves of supernovae in distant galaxies, which appear to be broadened by exactly the same factor by which the wavelength is stretched. According to Big-Bang proponents, this phenomenon should not be observed if the redshift is not velocity-related but has a different physical cause (as in steady-state models). However, this conclusion is apparently based on the particle model of light, in which the observed intensity is given by the number of photons (assumed to be constant). This model is, strictly speaking, flawed, as it is not only theoretically inconsistent but also contradicts experimental results (see my page regarding the Photoeffect). If light is interpreted correctly in terms of an electromagnetic field of a certain amplitude and coherence length, and if one assumes that a hypothetical non-Doppler redshift mechanism 'stretches' the corresponding wavetrains, it is reasonable to assume that this also reduces their amplitude, which in turn would be interpreted as a reduced intensity of the light.
If this intensity reduction is not taken into account, it leads to an underestimation of the absolute brightness of the supernova, and one hence concludes that the light curve must have broadened (because for the apparent brightness one would expect a faster decay of the light curve, according to the observed dependence of the shape of supernova light curves on absolute brightness).
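The size of such an error can be put in numbers. The following is a minimal sketch in Python, assuming purely for illustration that the unaccounted intensity loss is a factor 1/(1+z) (i.e. inversely proportional to the stretch factor); the function name and the chosen redshift are not from any published analysis:

```python
import math

def magnitude_error(z):
    """Error in the inferred absolute magnitude if a hypothesized
    1/(1+z) intensity loss is not taken into account.
    A flux reduction by a factor f corresponds to 2.5*log10(1/f) magnitudes."""
    dimming = 1.0 / (1.0 + z)          # hypothesized extra flux reduction
    return 2.5 * math.log10(1.0 / dimming)

# Under this assumption, a supernova at z = 0.5 would have its absolute
# brightness underestimated by about 0.44 mag.
print(round(magnitude_error(0.5), 2))
```

Via the observed relation between absolute brightness and decline rate, a brightness misjudged in this way would then be misread as a change in the light-curve shape.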

As suggested on my website under Plasma Theory of Hubble Redshift of Galaxies, the reason for the redshift could be the small scale electric field due to the intergalactic plasma. This could stretch the 'wave trains' progressively (leading to the redshift) and simultaneously reduce their amplitude (leading to an intensity reduction) as illustrated below (top: original wavetrains; bottom: redshifted wavetrains).
Schematic illustration of the redshift of light in the intergalactic plasma
Note: the above illustration is actually not quite correct, as the wave field has no gaps but is continuous in space, with random phase shifts after one coherence length (on average). This means that a stretching of the wave trains not only leads to a reduction of the amplitude (as indicated in the diagram) but also to an overlap of neighbouring wavetrains, which on the other hand increases the amplitude again. As the overlapping wavetrains are however randomly out of phase, this increase is only proportional to the square root of the stretch factor (incoherent superposition). The overall reduction of the amplitude is therefore inversely proportional to the square root of the redshift. This results in an inversely proportional reduction of the intensity, if the radiation can be considered as incoherent (see my page regarding the Photoelectric Effect).
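The square-root argument can be checked numerically. The Python sketch below is purely illustrative: it assumes, as the note implies, that stretching by a factor s reduces the amplitude of each individual train by 1/s, and then verifies that summing s unit phasors with independent random phases yields a mean intensity proportional to s (not s²), so that the net intensity scales as (1/s)²·s = 1/s. The stretch factor and trial count are arbitrary choices:

```python
import math
import random
import cmath

def mean_intensity(n_trains, trials=20000, seed=1):
    """Mean of |sum of n_trains unit phasors with random phases|^2
    (incoherent superposition)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = sum(cmath.exp(1j * rng.uniform(0.0, 2.0 * math.pi))
                for _ in range(n_trains))
        total += abs(s) ** 2
    return total / trials

stretch = 4  # hypothetical stretch factor (1 + z)
# Each stretched train is assumed to have amplitude 1/stretch,
# and 'stretch' neighbouring trains now overlap at any point.
net_intensity = (1.0 / stretch) ** 2 * mean_intensity(stretch)
# net_intensity comes out close to 1/stretch, i.e. intensity ~ 1/(1+z)
```

The key point the simulation reproduces is that random-phase (incoherent) addition grows the summed amplitude only like the square root of the number of overlapping trains.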

It should furthermore be pointed out that a typical delay time of 1 week amounts to only a fraction of about 10^-11 of the total light travel time (1 billion years), which is less than (or at best equal to) the accuracy with which the speed of light is known. It is therefore theoretically possible that the speed of light in the intergalactic plasma depends on its intensity, i.e. higher intensities travel slightly faster (which could at least explain the change in the decaying part of the supernova light curves).
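The quoted fraction is easy to verify with a back-of-the-envelope check in Python (the 1-week delay and 10^9-year travel time are simply the round figures used above):

```python
# Ratio of a ~1 week light-curve delay to ~1 billion years of travel time.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
delay = 7 * 24 * 3600                 # 1 week, in seconds
travel_time = 1e9 * SECONDS_PER_YEAR  # 1 billion years, in seconds
ratio = delay / travel_time
# ratio is about 1.9e-11, i.e. of order 10^-11
```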

However, whatever the actual reason for the galactic redshift may be, the dilation of the supernova light curves can in no way be considered proof that it is caused by a recession via the Doppler effect.



Thomas Smid (M.Sc. Physics, Ph.D. Astronomy)