Einstein v Hubble
The first serious opposition to Einstein came from Edwin Hubble, an astronomer who followed on from the work of Vesto Slipher, who first made the connection between Red Shift and Doppler Shift. Hubble then measured the Red Shift of light from distant bodies and related it to distance. It was by then assumed by everybody that Red Shift was synonymous with Doppler Shift. Hubble also established by observation that not only were more distant galaxies moving away from us, but that the speed of recession increased with distance. This was formalized as an expanding-universe expression and used ever after as a means of calculating the distances of stellar objects.
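Hubble's relation is conventionally written v = H0 × d, so a measured Red Shift is converted first to a recession velocity and then to a distance. The sketch below illustrates that conversion; the value of H0 and the small-z Doppler approximation v = c × z are illustrative assumptions, not figures from this text.

```python
# Illustrative sketch of Hubble's Law under the Doppler interpretation.
# The H0 value here is an assumption for illustration only.

H0 = 70.0          # Hubble constant, km/s per megaparsec (assumed value)
C = 299_792.458    # speed of light, km/s

def recession_velocity(z):
    """Non-relativistic Doppler approximation v = c*z (valid for small z)."""
    return C * z

def distance_mpc(z):
    """Distance implied by Hubble's Law, d = v / H0, in megaparsecs."""
    return recession_velocity(z) / H0

z = 0.01  # a small measured Red Shift
print(f"z = {z}: v = {recession_velocity(z):.0f} km/s, "
      f"d = {distance_mpc(z):.1f} Mpc")
```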
I was having problems at the time with the fact that not only had the big bang theory left the Universe in a state of expansion, but that the rate of expansion seemed to be increasing too, when it should reasonably be slowing towards an eventual apogee. Initially I postulated that during a big bang the mass would be made up of parts of varying mass and velocity; those parts would then experience varying outward speeds, so that after a while everything would settle down with the maximum velocities at extreme distance and minimal velocities close to the origin. Recently, however, I have found references to work showing that those outward velocities were in fact also subject to subsequent acceleration. With no evidence of any forces available to cause this, the scientific community seemed to be functioning on a “because it is” basis. I found some vague references to space itself being affected, or even created, by the big bang, but nothing I could use.
The Hubble Space Telescope (named after Edwin Hubble, who died in 1953) carried out the Deep Field experiment in 1996. The experiment was intended to give insight into the formation of primeval galaxies, so a patch of seemingly empty space was selected and a narrow field of view fitted into it. At that time the big bang was reckoned to be some 13 Billion years ago. A comment from the time is as follows (an American Astronomical Society meeting release from NASA’s Hubblesite):
January 15, 1996: One peek into a small part of the sky, one giant leap back in time. The Hubble telescope has provided mankind's deepest, most detailed visible view of the universe.
Representing a narrow "keyhole" view stretching to the visible horizon of the universe, the Hubble Deep Field image covers a speck of the sky only about the width of a dime 75 feet away. Though the field is a very small sample of the heavens, it is considered representative of the typical distribution of galaxies in space, because the universe, statistically, looks largely the same in all directions. Gazing into this small field, the Hubble Telescope team uncovered a bewildering assortment of at least 1,500 galaxies at various stages of evolution.
So low was the light level from the target area that it took 10 days of accumulated light to form the 1996 image; a later Ultra Deep Field image took 11 days. From the then-current theories and the distances implied by the solid angle of vision, a view of the universe from nearly the time of the big bang had been expected: a star field of mostly nebulae and infant, or at least juvenile, galaxies. What was found, however, was a variety of galaxies of various types, statistically similar to our own view of the night sky and of widely varying apparent ages! Not only were we not seeing the edge of the Universe, we were nowhere near far enough out for that. (Or was it going to be the same all the way to infinity?) In the Ultra Deep Field experiment the work was repeated with a different patch of “dark” sky in 2009. The result was much the same as in 1996, so that the age of the Universe, or the time from the big bang, is no longer held to be 13 Billion years but is now estimated to be 78 Billion years.
From the results of the Deep Field experiments, cosmologists of the time simply revised the theoretical size and age of the Universe. There seemed to me to be no science to this; a figure was simply chosen to be beyond the ability of our technology to verify (or contradict).
Along the way, scientific and theoretical physicists were getting into all manner of tangles with the Einstein–Hubble universe, where some very creative work had added more “fudge factors” overlaying Einstein’s original one. Scenarios were being proposed that worked well in science fiction novels but, to me, did not belong in scientific work. Some of this work simply beggared belief, so I went back to my roots rather than follow it; I could not accept things on trust then, nor can I now. Fudge factors have now been applied over others in attempts to keep theories going (and research grants too, I suspect) long after the experiments that were meant to confirm them to the status of theories had failed.
My approach was to go all the way back to Edwin Hubble. There was no point in going back to Einstein, because he already knew it was wrong but couldn’t fix it. What did the damage was the reconciling of General Relativity with Hubble’s Law. Hubble considered that the red shift visible in light from distant objects was due to Doppler Shift. Throughout recent history this has been treated as a given, but I think there is another, better possibility.
I am proposing a hypothesis that as light travels vast distances a “drag” accumulates on it, akin to friction. Everything in nature has its own opposing frictional forces, but light speed is universally assumed to be fixed at “c”. The relationship VELOCITY = FREQUENCY × WAVELENGTH is a given. If we then note that photon velocity is constant at “c”, a frictional force would appear in the frequency term, such that as frequency was reduced, wavelength would increase. The energy of the wave, E, would then go down with time and distance, giving a red shift that has nothing whatsoever to do with the work of Doppler. The third equation below, Max Planck’s, expresses it all; λ is wavelength, and frequency is c/λ.
f = c/λ    or    f = E/h    or    E = hc/λ
where E is energy, h is Planck’s constant, f is frequency, c is the velocity of light, and λ is wavelength.
A friction effect acting on “E” over time or distance will then lead to an increase in wavelength or a shift in frequency towards red without any reference to a recessional speed of the source.
Note that this does not mean there is no recessional velocity and thus no Doppler shift; it simply is not the dominant phenomenon behind observed Red Shift.
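The argument above can be put into numbers directly from E = hc/λ: with c fixed, any loss of photon energy must appear as a longer wavelength. The sketch below assumes an arbitrary 10% energy loss purely for illustration, since no functional form for the drag is specified here.

```python
# Sketch of the argument: since c is fixed and E = h*c/λ, any loss of photon
# energy E must appear as an increase in wavelength λ, i.e. a red shift.
# The 10% energy loss below is an arbitrary illustrative figure.

H = 6.62607015e-34   # Planck's constant, J*s
C = 2.99792458e8     # speed of light, m/s

def energy_from_wavelength(lam):
    return H * C / lam

def wavelength_from_energy(E):
    return H * C / E

lam0 = 500e-9                      # green light, 500 nm
E0 = energy_from_wavelength(lam0)
E1 = 0.9 * E0                      # assume 10% of the energy has drained away
lam1 = wavelength_from_energy(E1)  # the wavelength must stretch to compensate
z = (lam1 - lam0) / lam0           # resulting red shift
print(f"λ: {lam0 * 1e9:.1f} nm -> {lam1 * 1e9:.1f} nm, z = {z:.3f}")
```

Note that the red shift depends only on the fraction of energy lost, not on any motion of the source, which is the point being made.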
Planck’s constant “h” is a physical constant reflecting the sizes of quanta in quantum mechanics. It is named after Max Planck, one of the founders of quantum theory. The Planck constant was first described as the proportionality constant between the energy (E) of a photon and its frequency (ν). This relationship between energy and frequency is called the Planck relation or the Planck–Einstein equation:
E = hν
Since the frequency ν, wavelength λ, and speed of light c are related by λν = c, the Planck relation can also be expressed as
E = hc/λ (in Planck’s work frequency was “ν”, while elsewhere in this work it is “f”)
This is the expression used above to illustrate the effect of a reduction of “E”, and it predates me considerably. As to its existence in reality, I simply considered the quadrature relationship of the electric and magnetic excursions of the plane wave, and the fact that they are oscillatory, the one inducing the other, pushing the wave front forward by a process of continuous re-radiation. In my experience everything that moves or changes is subject to resistive losses in some form, but because the photon speed is fixed at “c”, the drain on E has to appear elsewhere, and the only remaining part other than the constants is the wavelength λ or its inverse, the frequency f.
Olbers’ Paradox is the argument that the darkness of the night sky conflicts with the assumption of an infinite and eternal static universe. It is one of the pieces of evidence for a non-static universe, such as the current Big Bang model, and is also referred to as the “dark night sky paradox.” The paradox states that in an infinite universe, at any angle from the Earth the sight line will end at the surface of a star, so the night sky should be completely light.
Using a frictional concept (the Woollvin drag factor “w”) at the level of Planck’s quantum model of a photon, as above, over interstellar distances the resultant “red shift” will also diminish the intensity of distant light sources, such that they literally fade out. What would have been distant light in the “big bang” model is now still electromagnetic radiation, but greatly attenuated and shifted down in frequency below that of visible light. Radiation of higher frequency than light will also be shifted down, and some will appear as light, but at too low a level to support the idea of the paradox.
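To see how such a drag would make distant sources fade, the factor “w” needs a trial mathematical form. The exponential decay below, and the value of w, are my assumptions purely for illustration; the text specifies neither.

```python
import math

# Hypothetical exponential model of the proposed drag: E(d) = E0 * exp(-w*d).
# Both the exponential form and the value of W are illustrative assumptions.

W = 1e-4  # assumed drag coefficient, per megaparsec

def surviving_fraction(d_mpc):
    """Fraction of a photon's original energy remaining after d_mpc megaparsecs."""
    return math.exp(-W * d_mpc)

def redshift(d_mpc):
    """Red shift implied by the energy loss, via E = h*c/λ (λ ∝ 1/E)."""
    return 1.0 / surviving_fraction(d_mpc) - 1.0

for d in (100, 1_000, 10_000, 100_000):
    print(f"d = {d:>7} Mpc: E/E0 = {surviving_fraction(d):.4f}, "
          f"z = {redshift(d):.3f}")
```

Under any model of this shape the flux from distant stars falls off faster than the inverse-square law alone, so the summed sky brightness stays finite even with infinitely many sources, which is the resolution of the paradox being claimed.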
The concept of Dark Matter has been used to cover the apparent shortfall in the mass of the Universe. This has then led to postulations of Dark Energy and Dark Flow. In each case the postulation is necessary because, without it, theoretical work on observed phenomena fails. This is typical of many attempts to patch up the theoretical basis of the “Big Bang” Universe. In a “Steady State” Universe the total mass of the Universe is infinite, so we are concerned only with its local density instead, which does not lead to any conflicts with observations and leaves no need for dark matter. The view of the Universe at 78 Billion years’ distance is very similar to the view from here, and if the next version of the “Hubble” space telescope can resolve an image significantly further out, we should now expect the view to be essentially the same again. While the idea of an invisible form of matter having no associated radiation and a zero albedo is reasonable, the fact that it seemingly does not reveal itself by occlusion either seems to me to go too far. The fact that this cosmology does not require “Dark Anything” to work is very appealing to me.
When Einstein looked for a way to render general relativity compatible with a steady state Universe, I now think he was right. When Hubble used Doppler shift to deduce interstellar speeds of recession, he was wrong. By then, however, Einstein was ageing and not a well man, and he did not pursue the issue further. The scientific community then argued amongst itself and formed a consensus, seemingly without the benefit of further work, and so the Big Bang theory gained majority support. It now seems that it was not in fact entitled even to the status of “theory”, because everywhere it was subsequently used it failed and needed patchwork. In my view some of these patches were outrageous and, instead of being adopted, should have been considered as evidence for tearing down the whole edifice and starting again. If my work above turns out to be correct, said edifice collapses on itself.
My position can be summarised as “Interstellar Red Shift is not Doppler Shift, there was no Big Bang, and the Universe extends outwards without limits forever and has already existed forever before now.”
There is even an explanation for the microwave noise background, as the “Planck–Woollvin Red Shift” works all the way to zero from infinite distance. It is not the residue of a Big Bang; it is radiant energy that has run down over vast distances from a near-infinite number of sources, out of the visible bands and down into the microwaves and below, weakening all the way. To prove this we need to find a suitable group of spectral “lines”, establish their relationships to each other, then search for that pattern in much lower bands in radiation from very distant sources. The Deep Field data may well already hold some suitable information, but the answer is more likely to come from radio telescope work. A lot of work is being done with radio telescopes, but to my knowledge each telescope operates on a fairly wide bandwidth, and to see the spectral lines it will be necessary to scan along a frequency axis using signals from a relatively small aperture. Ideally we need a band of energy from a single source and a search bandwidth significantly less than the frequency separation of the spectral lines of common elements.
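The proposed test amounts to looking for a set of lines whose mutual ratios match a laboratory pattern, since a uniform shift stretches every line by the same factor. A minimal sketch, using the hydrogen Balmer wavelengths and an assumed shift factor and tolerance (all illustrative choices):

```python
# Sketch of the proposed test: a uniformly shifted spectrum preserves the
# *ratios* between spectral lines, so a laboratory line pattern can be
# searched for at much longer wavelengths. Shift and tolerance are assumed.

BALMER_NM = [656.3, 486.1, 434.0, 410.2]  # hydrogen Balmer lines, lab frame

def matches_pattern(observed, reference, tol=1e-3):
    """True if `observed` equals `reference` scaled by one common factor."""
    scale = observed[0] / reference[0]
    return all(abs(o / r - scale) / scale < tol
               for o, r in zip(observed, reference))

# Simulate a heavily run-down source: every line stretched by the same factor.
shift = 1500.0  # assumed: optical lines pushed down towards the microwave bands
shifted = [lam * shift for lam in BALMER_NM]

print(matches_pattern(shifted, BALMER_NM))   # True: ratios preserved
# A candidate whose third line does not fit the common scale is rejected:
print(matches_pattern([984.4, 729.1, 700.0, 615.3], BALMER_NM))   # False
```

In practice the search would slide a trial scale factor along the frequency axis of narrow-band radio data and flag any position where a whole known pattern lines up at once.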
As a footnote, we on Earth have conceptually always started at the center of all things. Copernicus began the process of demotion, so that instead of us being at the center of everything, the Sun was, and we were in orbit around it. With each major work we have been demoted to lesser status. Now we should exercise caution, because whichever direction we point our telescopes we get similar results, and it would be very reasonable to assume that we are therefore, after all, at the center of the Universe. The reality is that this too is an illusion, because a steady state Universe will be infinite and thus have no center.
My photon drag factor “w” then progressively removes energy from incoming light to a point where it is too weak to register on our instrumentation. This does not mean there are no further stellar bodies “out there”, and with every technical advance in detection sensitivity the apparent size of the Universe will appear larger. Already, however, the figures are so vast we may as well assume infinity.
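That claim about sensitivity and apparent size can be made quantitative under the same assumed exponential drag as before: a detector that registers down to a fraction 1/R of the original photon energy sees out to d = ln(R)/w, so each improvement in sensitivity pushes the horizon outward by a fixed increment without ever reaching an edge. Both the exponential form and the value of w are illustrative assumptions.

```python
import math

# Under an assumed exponential drag E(d) = E0 * exp(-w*d), a detector whose
# threshold is E0/R registers sources out to d_max = ln(R) / w.
# The drag form and the value of W are assumptions for illustration.

W = 1e-4  # assumed drag coefficient, per megaparsec

def horizon_mpc(sensitivity_ratio):
    """Distance at which photon energy falls to 1/sensitivity_ratio of E0."""
    return math.log(sensitivity_ratio) / W

# Each tenfold improvement in sensitivity adds the same fixed distance.
for ratio in (1e3, 1e4, 1e5):
    print(f"sensitivity x{ratio:.0e}: horizon = {horizon_mpc(ratio):,.0f} Mpc")
```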
I have absolutely no idea as to the physical mechanism behind my drag factor, but the work I have done, added to that of those who have gone before, leaves a situation where, while we may not understand it yet, its existence is proven and capable of numerical evaluation. We can thus use “w” in the future and proceed accordingly. The situation is the same as with gravity: we can feel and see its effects, measure them, and use the results to predict further outcomes, but establishing what gravity actually is will take more time.