Quantum gravity physics based on facts, giving checkable predictions: November 2005

Monday, November 28, 2005

Comments on Tony Smith's Physics Site

Tony Smith has an extensive and fascinating collection of mathematical physics insights at http://www.valdostamuseum.org/hamsmith. He is a mathematician, lawyer, and veteran of the Vietnam War. He is also a string theorist, but has made testable predictions such as in his paper http://arxiv.org/abs/physics/0207095. The reason for the cowboy hat is a medical condition, a reaction to ultraviolet light in sunlight. I'm interested in the reason why supersymmetry in string theory postulates that every fundamental particle is paired to an (unobserved) superpartner with a different spin. This is the 1-1 boson-fermion supersymmetry in string theory. Smith has a paper which uses the E6 Lie algebra to avoid the need for this supersymmetry, but to still describe gravity and the standard model in 26 dimensions: http://cdsweb.cern.ch/search.py?recid=730325&ln=en. It was suppressed by arXiv.org, although arXiv.org does host Tony Smith's physical interpretation of string theory (illustration above): http://arxiv.org/ftp/physics/papers/0102/0102042.pdf

[If you first want a background to symmetry problems in physics, here is a summary:

There are many symmetry theories in modern physics, but the two vital to force mechanisms are the weak force symmetry (which exists for energies above 250 GeV but breaks spontaneously at 250 GeV, because of the Higgs mechanism, which mires low-energy particles but yields to higher-energy ones, just as a ball thrown hard enough will penetrate a glass window), and supersymmetry.

Electroweak theory was developed by Sheldon Glashow, Steven Weinberg and Abdus Salam. They showed that early in the big bang, there were three weak gauge bosons and a neutral boson, and that the photon which now exists is a combination of two of the original gauge bosons purely because this avoids being stopped by the weak charge of the vacuum; other combinations are stopped so the photon exists uniquely by the filtering out of other weak gauge bosons. Because the photon does not interact with the weak charge of the vacuum, it only interacts with electric charges. The vacuum is composed of weak charge, but not electric charge, so the photon can penetrate any distance of vacuum without attenuation. This is why electric forces are only subject to geometrical dispersion (inverse-square law).

These developments in the 1960s led to the Standard Model of fundamental particles. In this model, the strong nuclear, weak nuclear and electromagnetic forces all become similar in strength at around 10^14 GeV, but beyond that they differ again, with the electromagnetic force becoming stronger than the strong and weak forces. In 1974, Howard Georgi and Sheldon Glashow suggested a way to unify all three forces into a single superforce at an energy of 10^16 GeV. This ‘grand unified theory’ of all forces apart from gravity has the three forces unified above 10^16 GeV but separated at lower energies. The way they did this was by ‘supersymmetry’, doubling the particles of the Standard Model, so that each fundamental particle has a supersymmetric partner. The energy of 10^16 GeV is beyond testing on this planet and in this galaxy, so the only useful prediction they could make was that the proton should decay, with a half-life shorter than the lower limit already established by experiment, so the prediction is ruled out.

Edward Witten developed the current mainstream superstring model, which has 10/11 dimensions. The history of string theory begins in the 1920s with the Kaluza-Klein theory. Kaluza showed that adding a fifth dimension to general relativity unifies the gravity and electromagnetism tensors, while Klein showed that the fifth dimension could remain invisible to us by being rolled up to a very small size. In the late 1960s, it was shown that vibrating strings could represent fundamental particle energies. In 1985, Philip Candelas, Gary Horowitz, Andy Strominger and Edward Witten suggested that 10-D string theory with the 6 extra dimensions curled up into a Calabi-Yau manifold could reproduce the Standard Model, preserving supersymmetry and yet giving rise to an observable 4-D spacetime in which there is the right amount of difference between left- and right-handed interactions to account for the parity-violating weak force. This ‘breakthrough’ speculative invention was called ‘superstrings’ and led to the enormous increase in research in string theory.

Finally, in March 1995, Edward Witten proved that 10-D strongly coupled superstring theory is equivalent to 11-D weakly coupled supergravity. Apparently because it was presented in March, Witten named this new 10/11-D mathematics ‘M-theory’.

Witten then made the misleading claim that ‘string theory predicts gravity’:

‘String theory has the remarkable property of predicting gravity’: false claim by Edward Witten in the April 1996 issue of Physics Today, repudiated by Roger Penrose on page 896 of his book Road to Reality, 2004: ‘in addition to the dimensionality issue, the string theory approach is (so far, in almost all respects) restricted to being merely a perturbation theory’. String theory does not predict the strength constant of gravity, G!
]


So it is very interesting that Tony Smith's approach gets rid of the need for supersymmetry. My view is that string theory is acceptable if it is made scientific, which means getting the simplest theory which meets the experimental facts and can be induced to make some predictions. His prediction of quark masses, http://www.valdostamuseum.org/hamsmith/d4d5e6hist.html, is of interest. Those parts of the model I can understand, which are largely at the connections to experimental facts, remind me of Feynman's approach to the unification of physics by guessing. Because the mechanism is not being investigated, but connections are being guessed, there is a tendency to jump hurdles to make connections between the model and the experimental facts. The hurdle jumping in some cases may be a correct leap, but in others it may well be wrong.

UPDATE: see http://www.math.columbia.edu/~woit/wordpress/?p=318#comment-7025

Since I want to know the mechanism, the whole basis of the approach - the mathematical speculations at a high level in unfamiliar territory - is a difficulty. The discussion on http://www.valdostamuseum.org/hamsmith/StringMFbranegrav.html is fascinating. A clear discussion is also at http://www.valdostamuseum.org/hamsmith/VodouPhysics.html.

He offers a $100,000 prize at http://www.valdostamuseum.org/hamsmith/VoDouPhysicsPrize5.html for further work.

At http://www.valdostamuseum.org/hamsmith/cnfGrHg.html, Smith quotes Feynman:

Richard Feynman's book Lectures on Gravitation (1962-63 lectures at Caltech), Addison-Wesley 1995, contains a section on Quantum Gravity by Brian Hatfield, who says: "... Feynman ... felt ... that ... the fact that a massless spin-2 field can be interpreted as a metric was simply a "coincidence" ... In order to produce a static force and not just scattering, the emission or absorption of a single graviton by either particle [of a pair of particles] must leave both particles in the same internal state ... Therefore the graviton must have integer spin. ... when the exchange particle carries odd integer spin, like charges repel and opposite charges attract ... when the exchanged particle carries even integer spin, the potential is universally attractive ... If we assume that the exchanged particle is spin 0, then we lose the coupling of gravity to the spin-1 photon ... the graviton is massless because gravity is a long ranged force and it is spin 2 in order to be able to couple the energy content of matter with universal attraction ...".

The whole basis of this spin-2 graviton approach to quantum gravity is wrong, because it is mechanism-less. It is possible to produce a physical mechanism which works and makes predictions, http://feynman137.tripod.com/, although it is very unorthodox or 'crackpot'. To me, the more 'crackpot' the facts seem, the better. This mechanism shows that gravity is a background effect of the Feynman electromagnetic mechanism, whereby charges are continually exchanging gauge boson radiation, which produces the electromagnetic force. Experimentally, I'm building on Heaviside/Catt work, although both Heaviside and Catt had crackpotism in interpreting their work. Dr Arnold Lynch, instructed by J.J. Thomson in the 30s, and who worked on the computer used to break German codes in WWII, wrote to me a few years before he died that Heaviside made a mess of the 'Heaviside slab of energy current' theory in one important respect: waveguides. Although Heaviside had developed the Poynting vector independently of Poynting, to describe light speed electric morse code signals in the undersea cable between Newcastle and Denmark in 1875, he fell down when he visualised a waveguide (a rectangular metal box used to feed radio signals at UHF or microwave to an antenna without radiation escaping while in transit) as a short-circuited transmission line (two parallel plates connected together by two more plates to form a box). This, Heaviside pointed out, could never carry radio waves. However, it works, so Heaviside's belief that a radio waveguide would function as a short-circuited transmission line was false. Catt does not appear to know or care about the mechanisms going on in electromagnetism, despite his political type 'concerns'. He certainly refuses to do science when I try to discuss it with him. So Catt is really crackpot!

Heaviside neglected the fact that his mechanism for electricity is not merely 'guided' by the surfaces of the conductors, but actually TIED to them, whereas radio waves between metal plates just bounce around. The TEM wave of electricity is totally different to the radio wave: the radio wave starts at one conductor, travels through the vacuum at c, then arrives (after the delay t = d/c) at the receiving conductor. This is radio. It is Maxwell's 'DISPLACEMENT CURRENT' energy, not his 'light wave' model. The Heaviside/TEM wave of electricity goes in a direction 90 degrees different to the DISPLACEMENT CURRENT that is called radio waves. I've explained this to Catt, but he prefers rearranging the deck chairs aboard the Titanic as it sinks. He won't take notice of me, despite mechanism, proof and experimental fact.

Tony Smith, on the other hand, is way into the mathematical side of physics, and is open minded to string theories, extra dimensions, dark energy and dark matter epicycles in cosmology, and the multiverse (the multiple-'universes' interpretation of quantum mechanics). This tends to send my blood pressure up a great deal, since these speculations are pushing toward the religion side of science. However, it isn't as bad as Michio Kaku’s UFOs and Parallel Worlds.

Tony Smith's page http://www.valdostamuseum.org/hamsmith/CornellBan.html is my favourite page on the internet. I just love the page, the heresy and the suppression by arXiv.org: 'Sufi Islam, IFA, the Rig Veda, and Physics and the multicultural backgrounds of Jesus and Mary Magdalene was banned by Cornell...'. I've downloaded the 4 MB book which arXiv.org banned. I just love the fact that it was suppressed by arXiv.org. It is full of mathematics and advanced geometries, with fascinating asides into religion. It really is good, the kind of book to have on your laptop as an escape. I like Tony Smith's recent poem:

'http://www.math.columbia.edu/~woit/wordpress/?p=302#comment-6124

'It seems to me that this modification of part of Rudyard Kipling’s “The Gods of the Copybook Headings” gives an optimistic (from my point of view) vision of the future of SuperString Theory:

'As I pass through my incarnations
in every age and race,
I make my proper protestations
to the Gods of the String Theory Place.
Peering at reverent Stringers
I watch them flourish and fall.

'And the Gods of Experiment Results,
I notice, outlast them all.
We were living in trees when they met us.
They showed us each in turn.
That water would certainly wet us,
as Fire would certainly burn:
They denied that Wishes were Horses;
they denied that a Pig had Wings.

'So we worshiped the Gods of String Theory
Who promised these beautiful things.
But, though we had plenty of Strings,
there was nothing our Strings could predict,
And the Gods of Experiment said:
‘That means that String Theory is sick.’

'Then the Gods of String Theory tumbled,
and their smooth-tongued wizards withdrew,
And the hearts of the meanest were humbled
and began to believe it was true
That All is not Gold that Glitters,
and Two and Two make Four—
And the Gods of Experiment Results
limped up to explain it once more:
As surely as Water will wet us,
as surely as Fire will burn,
The Gods of Experiment Results
with inevitable truth will return!

'Tony Smith
http://www.valdostamuseum.org/hamsmith/'

The illustration above attempts to show the electron schematically. The Heaviside energy current, or Poynting vector, has magnetic field, electric field, and propagation all perpendicular. The magnetic field forms loops normally, but if - as in the case of the electron - the Heaviside energy current is trapped in a loop due to the strength of gravity on small distance scales, it gives rise to a magnetic dipole but radially symmetric electric field, as shown in the April 2003 Electronics World article. The polarised vacuum around the electron core has an outward directed electric field (positive toward negative) which opposes the inward electric field of the core to the extent that 99.27% of the electric charge is shielded. The core's polar magnetic field is completely unaffected, of course, as it is parallel to - not crossing - the polarised electric field.

‘All charges are surrounded by clouds of virtual photons, which spend part of their existence dissociated into fermion-antifermion pairs. The virtual fermions with charges opposite to the bare charge will be, on average, closer to the bare charge than those virtual particles of like sign. Thus, at large distances, we observe a reduced bare charge due to this screening effect.’ – I. Levine, D. Koltick, et al., Physical Review Letters, v.78, 1997, no.3, p.424.

Since the Heisenberg uncertainty formula d = hc/(2.Pi.E), i.e., E = hc/(2.Pi.d), works for d and E as realities in calculating the observed ranges of forces carried by gauge bosons of energy E, we can introduce work energy as E = Fd, which gives us the electron core (unshielded) force law: F = hc/(2.Pi.d^2). This is 137.0... times Coulomb's law. Actually this is a short cut, but it works.
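As a quick numerical check (a sketch using standard constants, not part of the original argument), the ratio of F = hc/(2.Pi.d^2) to Coulomb's law is independent of d and comes out as 137.0..., which is also the 99.27% shielding factor mentioned above:

```python
# Sketch: ratio of the 'unshielded core' force law F = hc/(2*pi*d^2)
# to Coulomb's law; d cancels and the ratio equals 1/alpha ~ 137.
import math

h    = 6.62607015e-34    # Planck constant, J s
c    = 2.99792458e8      # speed of light, m/s
e    = 1.602176634e-19   # electron charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

d = 1e-15  # any distance will do; it cancels in the ratio
F_core    = h * c / (2 * math.pi * d**2)
F_coulomb = e**2 / (4 * math.pi * eps0 * d**2)

print(F_core / F_coulomb)      # ~137.036
print(1 - F_coulomb / F_core)  # ~0.9927, i.e. the 99.27% shielding quoted earlier
```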

The 'ordered chaos' described statistically by Schroedinger's wave equation arises in the atom from the interactions of 3 bodies, the Poincare effect. Bohr's semi-classical atom is perfectly consistent with Schroedinger's wave equation for a nucleus, an electron and one other particle, such as one in the measuring instrument (it's not classical, since it disagrees with Maxwell's equations: Bohr ignores radiation due to the centripetally accelerating charge, which is the gauge boson mechanism in the case of a spinning particle, but unfortunately spin was only introduced in 1925). This is because the normal circular or elliptical orbit of each electron is chaotically altered continuously by each of the other electrons as they move relative to each other. If you deal with a hydrogen atom with just 1 electron orbiting a nucleus, you have 2 bodies in effect, and the thing will obey Bohr's model, but you can never check it because you need another (third) particle in the instrument to probe where the electron is. This Poincare chaos effect, and the derivation on my page of Schroedinger's wave equation, is not speculative but well established fact.

http://motls.blogspot.com/2005/11/discrete-physics.html

Nigel said...

Dear Lumos,

Be careful ... This makes various predictions and contains no speculation whatsoever; it is a fact-based mechanism, employing Feynman's mechanism as exhibited in the Feynman diagrams - virtual photon exchange causing forces in QFT.

He noted that the path integral has a deeper underlying simplicity:

"It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities." - Character of Physical Law, pp 57-8.

Bests,
Nigel

Here is a check of Quantoken's 'GUITAR' theory:

http://quantoken.blogspot.com/2005/02/proton-and-neutron-mass-from-guitar.html

Quantoken begins saying that nucleons like the proton have 3 quarks: 'I find that given if the structure is constructed using a building element of 3 different flavors, there is exactly one way of forming one solid piece, and exact one way when all three pieces are separated from each other. And there are 3 ways one separate from the group of the other two. That's (1, 3, 1). When I futher study how many different ways within each scenary they can interact with each other, there are (1, 5!,7!) ways respectively. So the total number of intrisic states are: Wi = (1,3,1) * (1,5!7!) = (1x1 + 3*5! + 1*7!) = 1 + (3*5!) + 7! Isn't that elegant? Now don't forget that externally, for proton, it has a spin up and spin down state. That's two different states. The total number of states would then be two multiplied by the intrisic number of states above:W = Wspin * Wi = 2x(1+3*5!+7!). The entropy then would be S = ln(Wi). The simplest structure has two states, 0, 1, and the entropy is ln(2). So that's it. We have obtained the proton mass! Since proton is considered a point particle so far and NO geometric factor is involved, it's entropy from interla states corresponds to its mass linearly: Mp = S/ln(2) = ln(W)/ln(2) = ln(2*(1+3*5!+7!))/ln(2)Mp = ln(10802)/ln(2) = 13.39901083. That is the proton mass! ... It agrees with experimental value excellently! How come? Remember we are using the natural units so far. In the matural unit set, the electron mass is: Me = alpha * M0 = alpha = 1/137.03599911 = 7.297352568x10^-3. Let's see the mass ratio between proton and electron: Mp/Me = 13.39901083/7.297352568x10^-3 Mp/Me = 1836.146836. My calculation matches excellently with the accepted value of 1836.15, See http://physics.nist.gov/cgi-bin/cuu/Value?mpsmesearch_for=proton+mass'
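The arithmetic in that quotation is easy to verify; here is a minimal sketch checking only the quoted numbers, not the physical interpretation behind them:

```python
# Sketch: check the arithmetic quoted above.
import math

W  = 2 * (1 + 3 * math.factorial(5) + math.factorial(7))  # claimed state count, = 10802
Mp = math.log(W) / math.log(2)                            # 'proton mass' in his natural units
Me = 1 / 137.03599911                                     # 'electron mass' = alpha in those units

print(W)        # 10802
print(Mp)       # ~13.399
print(Mp / Me)  # ~1836.15, close to the measured proton/electron mass ratio
```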

Quantoken's theory is pretty abstract and the speculative interpretations are justified by the results so obtained. This is easier to ignore than to reasonably dismiss as nonsense, particularly when you look at the landscape of alternative universe speculations in string theory, and compare the level of guesswork. At least Quantoken produces a result and compares it to nature! However, the leaps of faith involved in Quantoken's methodology are relatively large. For example, Quantoken goes on to use the Sommerfeld fine structure number 137.0..., and if you scroll down to the comments of his post (cited above), you see it varies with energy because it is the shielding factor of the electron core charge, which is 137e-, and is reduced at low energies to e- by the polarised vacuum shield around the core. At high energies, collisions break partly through the polarised vacuum shield, so more of the strong core electric field is revealed. Quantoken lacks this. He does however recognise that the muon mass is 1.5 times the electron mass times 137.0..., but his explanation of the 1.5 factor is non-mechanistic and numerological.

Sunday, November 27, 2005

Penrose’s Perimeter Institute lecture is interesting: ‘Are We Due for a New Revolution in Fundamental Physics?’

Penrose suggests quantum gravity will come from modifying quantum field theory to make it compatible with general relativity.

I like the questions at the end where Penrose is asked about the ‘funnel’ spatial pictures of blackholes, and points out they’re misleading illustrations, since you’re really dealing with spacetime not a hole or distortion in 2 dimensions. The funnel picture really shows a 2-d surface distorted into 3 dimensions, where in reality you have a 3-dimensional surface distorted into 4 dimensional spacetime. In his essay on general relativity in the book It Must Be Beautiful, Penrose writes:

‘… when there is matter present in the vicinity of the deviating geodesics, the volume reduction is proportional to the total mass that is surrounded by the geodesics. This volume reduction is an average of the geodesic deviation in all directions … Thus, we need an appropriate entity that measures such curvature averages. Indeed, there is such an entity, referred to as the Ricci tensor …’

Feynman discussed this simply as a reduction in radial distance around a mass of (1/3)MG/c^2 = 1.5 mm for Earth. It’s such a shame that the physical basics of general relativity are not taught, and the whole thing gets abstruse. The curved space or 4-d spacetime description is needed to avoid Pi varying due to gravitational contraction of radial distances but not circumferences.
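Here is a rough numerical sketch of Feynman's figure, assuming standard values for G, the Earth's mass and c:

```python
# Sketch: radial 'contraction' GM/(3 c^2) for the Earth.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # Earth mass, kg
c = 2.998e8     # speed of light, m/s

print(G * M / (3 * c**2))  # ~1.5e-3 m, i.e. about 1.5 mm
```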

Hubble's observations of the big bang (above) illustrate an apparent increase in recession speeds with distance. But we're looking back in time with increasing distance.

By the time the light arrives, the stars are not where they were when the light was emitted, so the recession speed only corresponds to a specific time in the past, not to a specific distance now.

Hence, the proper correlation to get a constant is not the quotient 'speeds/distances' but rather 'speeds/time', which is an acceleration, a = c/(age of universe) = 6 x 10^-10 m/s^2.
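A minimal sketch of that arithmetic, assuming the 15,000,000,000 year age used throughout this page:

```python
# Sketch: a = c / (age of universe), taking 15 Gyr.
c = 2.998e8                    # m/s
t = 15e9 * 365.25 * 24 * 3600  # 15,000,000,000 years in seconds

print(c / t)  # ~6.3e-10 m/s^2
```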

This acceleration is real, unlike the fiddled acceleration implied by the naive application of general relativity to the big bang that ignores the mechanism of gravity in the big bang itself.

The latter, fictional, acceleration is an invention to keep general relativity, minus the gravity mechanism, fitting astronomical data. Because the inward force depends on the reaction of gauge bosons to the outward expansion force of the surrounding universe (since gravity is the shielding of a push rather than a pull force), the most distant galaxies have no inward push retarding them, so the naive application of general relativity to the big bang is false.

General relativity is the basis of the mechanism of gravity, but by the physical properties of space, not by the naive and false application of mechanism-less general relativity to the big bang.

The fiddled acceleration is supposed to prove that there is a lot of mysterious 'dark energy'. As we have seen, this is simply due to the lack of the mechanism for gravity in general relativity. When this lack is filled, general relativity works properly.

Similarly, the mysterious 'dark matter' is due to the same problem, the application of general relativity naively to the big bang, which predicts a critical density for the universe which is 0.5e^3 times higher than the correct density, given by heuristically finding and introducing the correct mechanism for gravity into the general relativity framework.

The factor 0.5e^3 is equal to about 10. The 0.5 comes from the geometric shielding effect in the gravitational mechanism (proved identically using two distinct calculations), while the e^3 is a density correction factor derived from a solution to an integral of the visible spacetime divergence with changing time past, which varies the density. The divergence is equivalent to a red-shift effect, so that the extremely high (approaching infinite) densities as we look towards time zero (tremendous spacetime distance or time past) don't introduce an infinity. They don't introduce an infinity because the divergence weakens the contribution from colossal distances, or very early times, toward zero.

On the home page, I show that the density falls with the inverse cube of time, if there is light speed expansion with no gravitational retardation (because with the gravity mechanism, there is no push to slow down the most distant matter/energy in the universe).

The density variation formula is [1 - r/(ct)]^-3. Setting this equal to density factor e^3 we see that 1 - r/(ct) = 1/e. Hence r = 0.632ct. This means that the effective distance at which the gravity mechanism source lies is at 63.2% of the radius of the universe, R = ct.

At that distance, the density of the universe is 20 times the local density where we are, at a time of 15,000,000,000 years after the big bang. Therefore, the effective average distance of the gravity source is 9,500,000,000 light years away, or 5,500,000,000 years after the big bang. What is scary is that this, all of it, is very simple, but the 'experts' just want to sneer and say general relativity (describing absolute accelerations like rotation) is wrong because special relativity (which is a flat earth theory, unable to deal with accelerations at all) doesn't contain absolute motions like accelerations. We now live not in the unscientific age Feynman complained about in the 60s, but in a fascist age where it is fashionable to score points by suppressing science.
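Here is a minimal numerical sketch of the figures in the preceding paragraphs, assuming the 15,000,000,000 year age and light-speed expansion used above:

```python
# Sketch: the 0.5e^3 factor, the 0.632ct distance and the 20x density.
import math

print(0.5 * math.e**3)     # ~10, the claimed overestimate factor for critical density

x = 1 - 1 / math.e         # from [1 - r/(ct)]^-3 = e^3
print(x)                   # ~0.632, so r = 0.632ct
print((1 - x) ** -3)       # ~20, density at r relative to the local density
print(0.632 * 15e9)        # ~9.5e9 light years, effective distance of the gravity source
print((1 - 0.632) * 15e9)  # ~5.5e9 years after the big bang
```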

Matt Edwards, who makes money from the book 'Pushing Gravity', rejects the scientific method, like Ivor Catt: http://en.wikipedia.org/wiki/Talk:Gravity/Archive_3

Dear Nigel,

Since you stated above that I endorsed your theory, I have no choice but to reply. I did not endorse it. I merely noted that there were some mathematical similarities between your model and something I was working on (which did not appear in the book). Your model is premised on the Big Bang model. As I made clear in the full text of my letter to you, I do not subscribe to the Big Bang. I support a model featuring a universe in perpetual equilibrium, in the manner of Jaakkola's work. Please be more careful about quoting others.

Best wishes, Matt Edwards

Dear Matt Edwards,

I read your email, and you apparently endorsed the mathematical similarities between my model's mechanism-based predictions and the measured experimental gravity laws of Newton and Einstein: the Newtonian law and the general relativity contraction mechanism. Either you want science or not. Ivor Catt has also done this [refusing to back proved fact if someone else has worked it out; preferring to imagine science is a personal business not a matter of hard facts about the universe]. If you understand what science is about, you know it rests on facts. If you don't subscribe to the big bang model, you need a replacement model which has at least as much testable evidence for it. It really is sickening that the few people who understand the LeSage mechanism chicken out of the science of the big bang. Do you subscribe to evolution, or is it just the big bang which your personal feelings don't tag along with? Please be reasonable here. I'm talking about predictions confirmed by experimental measurements; this is the scientific criterion, not prejudice.

Best wishes,
Nigel Cook

Comment on Plato's blog

The first thing to do is understand the causal mechanism behind the exclusion principle, the spin thing, which is a magnetism force effect. The electrons and quarks are small normal dipole magnets and align in pairs, cancelling each other's magnetism out, just as when you have a pile of magnets. They don't align themselves naturally into one super magnet, but into pairs pointing opposite ways, because the entropy increases that way. Electrons in an atom feel each other's magnetism as well as the electric force. In fact the polar (radial) magnetic field from the electron core won't be shielded by the polarised vacuum, so it will produce greater magnetic force effects than the electric field from the core which is reduced by a factor of 137 by the polarised vacuum shield.


Heuristically, gauge bosons (virtual photons) are transferred between charges to cause electromagnetic forces, and those gauge bosons don't discriminate against charges in neutral groups like atoms and neutrons. The Feynman diagrams show no way for the gauge bosons/virtual photons to stop interactions. Light then arises when the normal exchange of gauge bosons is upset from its equilibrium.

You can test this heuristic model in some ways. First, most gauge bosons are going to be exchanged in a random way between charges, which means the simple electric analogue is a series of randomly connected charged capacitors (positive and negative charges, with vacuum 377 ohm dielectric between the 'plates'). Statistically, if you connect an even number of charged capacitors at random along a line across the universe, the sum will on average be zero. But if you have an odd number, you get an average of 1 capacitor unit. On average any line across the universe will be as likely to have an even as an odd number of charges, so the average charge sum will be the mean, (0 + 1)/2 = 1/2 capacitor. This is weak and always attractive, because there is NO force at all in the sum = 0 case and an attractive force (between oppositely charged capacitor plates) in the sum = 1 case.

Because it is weak and ALWAYS attractive, it's gravitation?

The other way the charges can add is in a perfect summation where every charge in the universe appears in the series + - + -, etc. This looks improbable, but is statistically a drunkard's walk, and by the nature of path-integrals gauge bosons do take every possible route, so it WILL happen. When capacitors are arranged like this, the potential adds like a statistical drunkard's walk because of the random orientation of "capacitors", the diffusion weakening the summation from the total number to just the square root of that number because of the angular variations (two steps in opposite directions cancel out, as does the voltage from two charged capacitors facing one another). This vector sum of a drunkard's walk is the average step times the square root of the number of steps, so for ~10^80 charges, you get a resultant of ~10^40.
The ratio of electromagnetism to gravity is then (~10^40)/(1/2).
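The square-root scaling is easy to demonstrate numerically; here is a minimal Monte Carlo sketch illustrating only the statistics, not the physics:

```python
# Sketch: the net sum of N randomly-signed unit charges scales as sqrt(N).
import math, random

def rms_sum(N, trials=2000):
    total = 0.0
    for _ in range(trials):
        s = sum(random.choice((-1, 1)) for _ in range(N))
        total += s * s
    return math.sqrt(total / trials)

for N in (100, 400, 1600):
    print(N, round(rms_sum(N), 1), math.sqrt(N))  # RMS resultant tracks sqrt(N)

print(math.sqrt(1e80))  # 1e40, the scale of the claimed electromagnetism/gravity ratio
```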

Notice that this proof shows gravity is caused by the same agent as electromagnetism; by gauge bosons. It does away with needing separate quantum gravity and unobserved gravitons.

The distances between the charges are ignored. This is explained because on average half the gauge bosons will be going away from the observer, and half will be approaching the observer. The fall due to the spread over larger areas with divergence is offset by the concentration due to convergence.

The Feynman problem with virtual particles in the spacetime fabric retarding motion does indeed cause the FitzGerald-Lorentz contraction, just as they cause the radial gravitationally produced contraction of distances around any mass (equivalent to the effect of the pressure of space squeezing things and impeding accelerations). What Feynman thought may cause difficulties is really the mechanism of inertia:

Maxwell’s 1873 Treatise on Electricity and Magnetism, Articles 822-3: ‘The ... action of magnetism on polarised light [discovered by Faraday not Maxwell] leads ... to the conclusion that in a medium ... is something belonging to the mathematical class as an angular velocity ... This ... cannot be that of any portion of the medium of sensible dimensions rotating as a whole. We must therefore conceive the rotation to be that of very small portions of the medium, each rotating on its own axis [spin] ... The displacements of the medium, during the propagation of light, will produce a disturbance of the vortices ... We shall therefore assume that the variation of vortices caused by the displacement of the medium is subject to the same conditions which Helmholtz, in his great memoir on Vortex-motion [of 1858; sadly Lord Kelvin in 1867 without a fig leaf of empirical evidence falsely applied this vortex theory to atoms in his paper ‘On Vortex Atoms’, Phil. Mag., v4, creating a mathematical cult of vortex atoms just like the mathematical cult of string theory now; it created a vast amount of prejudice against ‘mere’ experimental evidence of radioactivity and chemistry that Rutherford and Bohr fought], has shewn to regulate the variation of the vortices [spin] of a perfect fluid.’

‘… the source of the gravitational field can be taken to be a perfect fluid…. A fluid is a continuum that ‘flows’... A perfect fluid is defined as one in which all antislipping forces are zero, and the only force between neighboring fluid elements is pressure.’ – Professor Bernard Schutz, General Relativity, Cambridge University Press, 1986, pp. 89-90.

‘In this chapter it is proposed to study the very interesting dynamical problem furnished by the motion of one or more solids in a frictionless liquid. The development of this subject is due mainly to Thomson and Tait [Natural Philosophy, Art. 320] and to Kirchhoff [‘Ueber die Bewegung eines Rotationskörpers in einer Flüssigkeit’, Crelle, lxxi. 237 (1869); Mechanik, c. xix]. … it appeared that the whole effect of the fluid might be represented by an addition to the inertia of the solid. The same result will be found to hold in general, provided we use the term ‘inertia’ in a somewhat extended sense.’ – Sir Horace Lamb, Hydrodynamics, Cambridge University Press, 6th ed., 1932, p. 160. (Hence, the gauge boson radiation of the gravitational field causes inertia. This is also explored in the works of Drs Rueda and Haisch: see http://arxiv.org/abs/physics/9802031 http://arxiv.org/abs/gr-qc/0209016 , http://www.calphysics.org/articles/newscientist.html and http://www.eurekalert.org/pub_releases/2005-08/ns-ijv081005.php .)

Saturday, November 26, 2005

Above: the powerpoint slide from Smolin's talk, a diagram from Craig B. Markwardt's paper 'Independent Confirmation of the Pioneer 10 Anomalous Acceleration'. http://arxiv.org/abs/gr-qc/0208046

Dr Lee Smolin's talk at 2005 Loop Quantum Gravity: PIONEER 10 ACCELERATING AT ~8.6 x 10^-10 m/s^2

The acceleration of the big bang is the variation in velocities (0 - c) over variations in the time component of spacetime (0 - 15,000,000,000 years), a = c/t = 7 x 10^-10 m/s^2.

I've just listened to a video of Smolin's talk, Some persistent puzzles in background independent approaches to quantum gravity. Holy smoke! I never knew that Pioneer 10 and 11 are accelerating at the amount given by the big bang gravity model on my page. Here is a transcript of the relevant part of Smolin's presentation (he suggests anomalies because of the 1 in 100 million asymmetry in the quantum gravity loops that upsets accelerations on cosmic scales):

'One can multiply by the speed of light, but that just turns the Hubble distance into the Hubble time. If one multiplies by two speeds of light to get a big number and divides by the Hubble scale, one gets an acceleration which is 10^-8 cm/s^2, and every place where we observe that acceleration there are anomalies in Newton's laws!

'Now I don't know whether to believe this set of anomalies, the Pioneer anomaly, I'm not going to argue for it, but the claim is that what is measured in two independent satellites, maybe three if you stretch the data, is an anomalous acceleration toward the sun of 8 x 10 ^-8 cm/s^2. And then there's the dark matter, the fact that the [galaxy] rotation curves go flat, because it turns out that that's a typical acceleration for stars in the outer edges of the galaxy. ...'

References: http://www.space-time.info/pioneer/pioanomlit.html
http://arxiv.org/abs/gr-qc/0104064

Supersymmetry in the Standard Model allows unification of all fundamental forces apart from gravity at high energy; the M-theory Calabi-Yau manifold rolls up 6 dimensions in such a way that the parity-violating weak force is explained. Witten proved that 10-D strongly coupled superstring theory is equivalent to 11-D weakly coupled supergravity.

‘String theory has the remarkable property of predicting gravity’ -Edward Witten, Physics Today, April 96.

How beautiful... Just don't ask how well the only 'prediction' compared to experiment.

'There is, however, one physical prediction that string theory does make: the value of a quantity called the cosmological constant (a measure of the energy of the vacuum). Recent observations of distant supernovae indicate that this quantity is very small but not zero. A simple argument in string theory indicates that the cosmological constant should be at least around 55 orders of magnitude larger than the observed value. This is perhaps the most incorrect experimental prediction ever made...' - Woit, American Scientist, March-April 2002.

Thursday, November 24, 2005

Science orthodoxy = science

Above is the ‘greatest’ formula in the world, the formula used to keep progress from occurring!

‘Men are deplorably ignorant with respect to natural things … they must be made to quit the sort of learning that comes only from books, and that rests only on vain arguments from probability and upon conjectures [M-theory].’ – William Gilbert, De Magnete, 1600 AD.

Tony Smith hosts formerly secret pics of the 7-kt Fishbowl-Checkmate detonation at 147 km altitude, effectively in space: http://valdostamuseum.org/hamsmith/fishbowl1.gif . I identified this test from the black rocket exhaust trails blown about by the wind at low altitudes, comparing the photos to a few stills in Dolan’s Capabilities of Nuclear Weapons, U.S. Department of Defense, 1972, chapter 1 (originally secret-restricted data). The outward force of a ‘big bang’ in space (the outward pressure times the spherical area, F=PA) should cause an inward reaction force of gauge bosons, to help explain gravity simply, and this does seem to make the right predictions: http://nigelcook0.tripod.com/.

Tony Smith’s CERN document server paper, EXT-2004-031, uses the Lie algebra E6 to avoid 1-1 boson-fermion supersymmetry: ‘As usually formulated string theory works in 26 dimensions, but deals only with bosons … Superstring theory as usually formulated introduces fermions through a 1-1 supersymmetry between fermions and bosons, resulting in a reduction of spacetime dimensions from 26 to 10. The purpose of this paper is to construct … using the structure of E6 to build a string theory without 1-1 supersymmetry that nevertheless describes gravity and the Standard Model…’

I wonder why this sort of work is excluded by the string theorists who censor arXiv?

Could it be that THEY are the paranoid ones? Peter Woit says he doesn’t like the claims made for the Calabi-Yau manifold being beautiful, and that 11-d supergravity doesn’t predict anything testable. If there is useful science in M-theory, mainstream string theorists have had a decade to find it! More likely, it’s a dead end, like Kelvin’s vortex atom or Maxwell’s elastic aether. Just because Kelvin and Maxwell were top mathematicians as well as physicists, did not make their speculations correct. String theorists should study Kelvin’s vortex atom and Maxwell’s aether to see the fate of paranoia-type defence of crackpotism. Science can’t cover up the ineptitude of famous people endlessly.

See Frederick Forsyth's essay in the Daily Express (7 Oct 05, p11): 'Fascism is not a doctrinal creed; it is a way of behaving towards your fellow man. What, then, are the tell-tale hallmarks of this horrible attitude? Paranoid control-freakery; an obsessional hatred of any criticism or contradiction; the lust to character-assassinate anyone even suspected of it; a compulsion to control or at least manipulate the media... the majority of the rank and file prefer to face the wall while the jack-booted gentlemen ride by. ... An interesting man, John Reid. A socialist (actually he started out as a communist) all his life, he too has sold the pass. Now he is happy to send British troops to die in a faraway place in a war he knows perfectly well was started on a tissue of deliberate lies. It's a pity. I never flagged John Reid as a second-rater. The other apostates, Prescott, Straw, Blunkett, yes; and of course the founding Blairites, who never had a faith at all, yes; and the stark-naked power lusters, yes. But to see Reid unfazed by what happened to the old Jewish refugee and pacifist [Walter Wolfgang] and making limp excuses... that was sad. ... But I do not believe the innate decency of the British people has gone. Asleep, sedated, conned, duped, gulled, deceived, but not abandoned.' The same applies in science!

Catt's work explained simply and briefly

‘In 1964 I went to Motorola to research into the problem of interconnecting very fast (1 ns) logic gates ... we delivered a working partially populated prototype high speed memory of 64 words, 8 bits/word, 20 ns access time. ... I developed theories to use in my work, which are outlined in my IEEE Dec 1967 article (EC-16, n6) ... In late 1975, Dr David Walton became acquainted ... I said that a high capacitance capacitor was merely a low capacitance capacitor with more added. Walton then suggested a capacitor was a transmission line. Malcolm Davidson ... said that an RC waveform [Maxwell’s continuous ‘extra current’ for the capacitor, the only original insight Maxwell made to EM] should be ... built up from little steps, illustrating the validity of the transmission line model for a capacitor [charging/discharging]. (This model was later published in Wireless World in Dec 78.)’ - Ivor Catt, Electromagnetic Theory Volume 2, St Albans, 1980, pp207-15. (See http://www.ivorcatt.org/icrwiworld78dec1.htm)




Above: Two electrons repel because gauge bosons exchanged between them are non-redshifted, unlike the gauge bosons pushing them together from the surrounding, receding universe. So repulsion is explained. An electron and a proton are pushed together because they shield one another, being opposite charges. It is all a matter of recoil due to the momentum of gauge bosons. The electron is emitting and receiving gauge bosons from charges in all directions, but the big bang means that those from the surrounding universe suffer redshift. Between two nearby similar charges, the exchange on facing sides of the charges where they couple is not redshifted, although it is redshifted where they are being pushed together on the far sides. The net exchange is like two machine gunners firing bullets at each other; they recoil apart. The gauge bosons pushing them together are redshifted, like nearly spent bullets coming from a great distance, and are not enough to prevent repulsion. In the case of attraction, the same principle applies. The two opposite charges shield one another and get pushed together. Although each charge is radiating and receiving energy on the outer sides, the inward push is from redshifted gauge bosons, and the emission is not redshifted. The result is just like two people, standing back to back, firing machine guns. The recoil pushes them together, hence the attraction force.

‘... the view of the status of quantum mechanics which Bohr and Heisenberg defended - was, quite simply, that quantum mechanics was the last, the final, the never-to-be-surpassed revolution in physics ... physics has reached the end of the road.’ - Sir Karl Popper, 'Quantum Theory and the Schism in Physics', Rowman and Littlefield, NJ, 1982, p6.

Charged capacitors consist, most basically, of two opposite charges separated by an insulator like vacuum. The plates ‘attract’. An atom is a capacitor, and the attraction is offset by the motion of the electron. Gauge bosons deliver the electric force. If you put a line of capacitors in series, the potential difference, or voltage to use the old expression (a usage, like calling distance ‘mileage’, that is disapproved of by teachers who are generally better at imposing red tape than real physics), adds up simply. A whole atom is neutral, but it is also positive and negative. Whether another particle 'sees' the atom as neutral or as containing two charges, one positive and one negative, depends on the situation:

EXAMPLE 1:

+
…….Gauge bosons cause attraction!
-


EXAMPLE 2:
+…….
…………………Gauge bosons cease to exist?!
-……. +

(Negative charges in diagonals repel, positives in diagonals also repel, and these two forces tend to result in neutral. Actually, I think because the diagonals are a bigger distance, by the square root of two, than the horizontal and vertical spacings, the point below about there being no net force is wrong in this example. The repulsion forces will be weaker, due to the extra distance along the diagonals and the inverse square law, than the attractive forces that act over shorter distances. But forget that. The point I’m making is that gauge bosons don’t discriminate between a lone electron and an electron in a neutral atom. The apparent lack of electric force for a balanced ‘neutral’ atom, or indeed for a ‘neutral’ neutron composed of three charged quarks, is an illusion, due to equilibrium.)
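As a check of the parenthetical caveat above, here is a minimal sketch computing the net Coulomb-law force on one corner charge of a unit square with alternating charges; it is not zero but a small net pull toward the centre, precisely because the diagonal repulsion acts over the longer distance:

```python
# Sketch: net Coulomb-law force on a corner charge of a unit square
# with charges +, -, -, + on the corners (arbitrary units).
import math

charges = [(+1, (0.0, 0.0)), (-1, (1.0, 0.0)), (-1, (0.0, 1.0)), (+1, (1.0, 1.0))]

q0, (x0, y0) = charges[0]
fx = fy = 0.0
for q, (x, y) in charges[1:]:
    dx, dy = x0 - x, y0 - y
    r = math.hypot(dx, dy)
    f = q0 * q / r**2      # positive means repulsion, pushing along (dx, dy)
    fx += f * dx / r
    fy += f * dy / r

print(fx, fy, math.hypot(fx, fy))  # ~(0.65, 0.65), magnitude ~0.91: net pull toward the centre
```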

In the second charge arrangement above, a square with charges at each corner, the attraction and repulsion forces are in equilibrium, so there is no net force. But as soon as you distort the geometry from a square, a net force appears. Two explanations are offered by different theories: (1) forces don't operate unless they can be measured, so there are no gauge bosons in an equilibrium (Mach's principle of economy or ignoring the unobservable), and (2) there is an equilibrium of forces, so gauge bosons continue to be exchanged between charges in all 'neutral' situations, where the net charge is zero.

If electromagnetism gauge bosons do continue to be exchanged in the second situation, then we have a simple way to unify electricity and gravity quantitatively. The gauge bosons exchanged between charges in a big bang universe containing a random distribution of equal amounts of each charge have provable properties, based on the capacitor addition effect. Charged capacitors in series add up like the cells in a battery. Similarly, a series of + - + - ... charges is like a series of cells. The gauge bosons (like real bosons in Feynman's path integrals) go all possible ways, and will sniff out a zig-zag line of capacitors in the universe. The dielectric between the capacitor plates (fundamental particles, charged quarks and charged electrons, not 'neutral' atoms or 'neutral' neutrons) is vacuum, and we have vacuum capacitors that work.

This zig-zag summation of every charge in the universe gives a strong electric force, since the basic charge force for a single particle gets multiplied by the square root of the number of charges in the universe. This comes from the vector resultant in drunkard's walk statistics, where a drunk taking 100 paces, each being in a random direction, will end up on average 10 paces from where he started. This is not speculation but vector statistics, mathematical fact!

With 10^80 charges in the universe (equal to roughly the volume of a sphere of 15 giga light years radius, times the average density of the universe at 15 giga years after the big bang, divided by the mass of a hydrogen atom, since the universe is mainly hydrogen), the strength of electromagnetism will be the square root of this times gravity, 10^40.

Why 'times gravity'? In say a straight line through the universe, the charges will be randomly aligned, like a series of randomly placed capacitors or indeed battery cells. The average potential from such a series is zero for an even number of cells, + -, but is 1 unit for an odd number. In the universe there will therefore be an average of (0 + 1)/2 = 1/2 cells for any straight line. This 1/2 cell corresponds to 1 charge, gravity. The ratio of electromagnetism/gravity is thus 10^40/(1/2).

You could of course ask about the distances of the charges distributed in space, and try to make the mechanism absurd on this ground. But you would be wrong. You see, on average half the gauge bosons in the universe will be going away from you, while half will be coming toward you. Those going away are diverging outward, but those approaching are converging inward. Just as divergence results in a fall by the inverse-square law, so does convergence result in an increase by the same factor. The two distance factors cancel as (R^2).(1/R^2) = 1. Hence the drunkard's walk is merely a summation of charges and the two distance factors due to proximity or non-proximity of charges simply cancel out!

The magnetic force in electromagnetism results from the spin of gauge bosons, and this seems to be one thing about Maxwell’s spacetime fabric that was not entirely wrong. Maxwell’s 1873 Treatise section 822-3: ‘The ... action of magnetism on polarised light [discovered by Faraday not Maxwell] leads ... to the conclusion that in a medium ... is something belonging to the mathematical class as an angular velocity ... This ... cannot be that of any portion of the medium of sensible dimensions rotating as a whole. We must therefore conceive the rotation to be that of very small portions of the medium, each rotating on its own axis... The displacements of the medium, during the propagation of light, will produce a disturbance of the vortices ... We shall therefore assume that the variation of vortices caused by the displacement of the medium is subject to the same conditions which Helmholtz, in his great memoir on Vortex-motion, has shewn to regulate the variation of the vortices of a perfect fluid.’

Tuesday, November 22, 2005

Danny Ross Lunsford’s major paper, published in Int. J. Theor. Phys., v 43 (2004), No. 1, pp. 161-177, was submitted to arXiv.org but was removed, apparently by censorship, since it investigated a 6-dimensional spacetime, which again is not exactly worshipping Witten’s 10/11 dimensional M-theory. It is however on the CERN document server at http://doc.cern.ch//archive/electronic/other/ext/ext-2003-090.pdf, and it shows the errors in the historical attempts by Kaluza, Pauli, Klein, Einstein, Mayer, Eddington and Weyl. It proceeds to the correct unification of general relativity and Maxwell’s equations, finding 4-d spacetime inadequate: ‘… We see now that we are in trouble in 4-d. The first three [dimensions] will lead to 4th order differential equations in the metric. Even if these may be differentially reduced to match up with gravitation as we know it, we cannot be satisfied with such a process, and in all likelihood there is a large excess of unphysical solutions at hand. … Only first in six dimensions can we form simple rational invariants that lead to a sensible variational principle. The volume factor now has weight 3, so the possible scalars are weight -3, and we have the possibilities [equations]. In contrast to the situation in 4-d, all of these will lead to second order equations for the g, and all are irreducible - no arbitrary factors will appear in the variation principle. We pick the first one. The others are unsuitable … It is remarkable that without ever introducing electrons, we have recovered the essential elements of electrodynamics, justifying Einstein’s famous statement …’ D.R. Lunsford shows that 6 dimensions in SO(3,3) should replace the Kaluza-Klein 5-dimensional spacetime, unifying GR and electromagnetism: ‘One striking feature of these equations ... is the absent gravitational constant - in fact the ratio of scalars in front of the energy tensor plays that role. This explains the odd role of G in general relativity and its scaling behavior. The ratio has conformal weight 1 and so G has a natural dimensionfulness that prevents it from being a proper coupling constant - so this theory explains why ordinary general relativity, even in the linear approximation and the quantum theory built on it, cannot be regularized.’

Major revision of http://nigelcook0.tripod.com/ just uploaded. Most changes are near the beginning, including two new illustrations and a more detailed discussion.

Monday, November 21, 2005

Rueda and Haisch, Physical Review A v49 p 678 (1994), showed that the virtual radiation of electromagnetism can cause inertial mass, and in Annalen der Physik v14 p479 they do the same for gravity in general relativity. The virtual radiation acts on fundamental particles of mass, quarks and electrons, which are always charged. It doesn’t ignore all the quarks in a neutron just because they have no net charge. A gauge boson going at light speed doesn’t discriminate between neutrons and protons, only the fundamental quarks inside them. Therefore, the background field of virtual radiation pressure besides causing inertia (and the contraction of moving objects in the direction of motion) also causes gravity (and the contraction in the direction of gravitational fields, the reduction in GR).

The coupling constant for electromagnetism is then naturally related to gravity. Between similar charges, the electric field causing the radiation pressure adds up like a series of batteries. In any line, there will be approximately equal numbers of both charges, so the sum will be zero. The only way it can add up is by a drunkard’s walk, where the statistics show the net charge will be the square root of the number of similar charges in the universe. Since there are 10^80 charges, electromagnetism will be 10^40 times stronger than gravity. Attraction is due to opposite charges screening each other and being pushed together by the radiation from the surrounding universe, while repulsion is due to the fact that nearby charges exchange gauge bosons which aren’t redshifted by cosmic expansion (and so produce a mutual recoil), while the radiation pushing them together is red-shifted by the big bang and so is weaker.


Lunsford’s CERN document server paper http://doc.cern.ch//archive/electronic/other/ext/ext-2003-090.pdf discounts the historical attempts by Kaluza, Pauli, Klein, Einstein, Mayer, Eddington and Weyl. It proceeds to the correct unification of general relativity and Maxwell’s equations, finding 4-d spacetime inadequate: ‘… We see now that we are in trouble in 4-d. The first three [dimensions] will lead to 4th order differential equations in the metric. Even if these may be differentially reduced to match up with gravitation as we know it, we cannot be satisfied with such a process, and in all likelihood there is a large excess of unphysical solutions at hand. … Only first in six dimensions can we form simple rational invariants that lead to a sensible variational principle. The volume factor now has weight 3, so the possible scalars are weight -3, and we have the possibilities [equations]. In contrast to the situation in 4-d, all of these will lead to second order equations for the g, and all are irreducible - no arbitrary factors will appear in the variation principle. We pick the first one. The others are unsuitable … It is remarkable that without ever introducing electrons, we have recovered the essential elements of electrodynamics, justifying Einstein’s famous statement …’

D.R. Lunsford shows that 6 dimensions in SO(3,3) should replace the Kaluza-Klein 5-dimensional spacetime, unifying GR and electromagnetism: ‘One striking feature of these equations ... is the absent gravitational constant - in fact the ratio of scalars in front of the energy tensor plays that role. This explains the odd role of G in general relativity and its scaling behavior. The ratio has conformal weight 1 and so G has a natural dimensionfulness that prevents it from being a proper coupling constant - so this theory explains why ordinary general relativity, even in the linear approximation and the quantum theory built on it, cannot be regularized.’

Sunday, November 20, 2005

'Unified force theory' frauds

With such a dramatic lack of experimental support, string theorists often attempt to make an aesthetic argument, professing that the theory is strikingly "elegant" or "beautiful." Because there is no well-defined theory to judge, it's hard to know what to make of these assertions, and one is reminded of another quotation from Pauli. Annoyed by Werner Heisenberg's claims that, though lacking in some specifics, he had a wonderful unified theory (he didn't), Pauli sent letters to some of his physicist friends each containing a blank rectangle and the text, "This is to show the world that I can paint like Titian. Only technical details are missing." Because no one knows what "M-theory" is, its beauty is that of Pauli's painting. Even if a consistent M-theory can be found, it may very well turn out to be something of great complexity and ugliness. - Dr Peter Woit, 'Is string theory even wrong?', American Scientist, March-April 2002, http://www.americanscientist.org/template/AssetDetail/assetid/18638/page/2#19239

The illustration above is the anti-Heisenberg campaign of exclusion-principle discoverer, Wolfgang Pauli. It reads: 'Comment on Heisenberg's Radio advertisement. This is to show the world that I can paint like Titian. Only technical details are missing. W. Pauli.'

http://cosmicvariance.com/2005/11/14/our-first-guest-blogger-lawrence-krauss/:

The whole basis of the energy-time version of the uncertainty principle is going to be causal (random interactions of the gauge boson radiation, which constitutes the spacetime fabric).

Heuristic explanations of QFT are required to further the basic understanding of modern physics. For example, Heisenberg’s uncertainty principle (based on the impossible gamma-ray microscope thought experiment): pd = h/(2.Pi), where p is uncertainty in momentum and d is uncertainty in distance. The product pd is physically equivalent to Et, where E is uncertainty in energy and t is uncertainty in time. Since, for light speed, d = ct, we obtain: d = hc/(2.Pi.E). This is the formula the experts generally use to relate the range of a force, d, to the energy of the gauge boson, E. Notice that both d and E are really uncertainties in distance and energy, rather than real distance and energy, but the formula works for real distance and energy, because we are dealing with a definite ratio between the two. Hence for the roughly 80 GeV mass-energy W and Z intermediate vector bosons, the force range is about 2.5 x 10^-18 m. Since the formula d = hc/(2.Pi.E) therefore works for d and E as realities, we can introduce work energy as E = Fd, which gives us the strong nuclear force law: F = hc/(2.Pi.d^2). This inverse-square law is 137 times Coulomb’s law of electromagnetism.
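A short numerical sketch of the two formulas just stated (Python; the physical constants below are standard values assumed here, not taken from the text): it evaluates d = hc/(2.Pi.E) for an 80 GeV boson and then checks that F = hc/(2.Pi.d^2) is about 137 times the Coulomb force between two electron charges at the same distance, i.e. the 1/alpha factor.

# Range and force from the uncertainty-principle formulas above.
import math

hbar = 1.054571817e-34     # J s, equal to h/(2 pi)
c = 2.99792458e8           # m/s
e = 1.602176634e-19        # C
eps0 = 8.8541878128e-12    # F/m

E = 80e9 * e               # 80 GeV converted to joules

d = hbar * c / E                                   # d = hc/(2 pi E)
F_strong = hbar * c / d**2                         # F = hc/(2 pi d^2)
F_coulomb = e**2 / (4 * math.pi * eps0 * d**2)     # Coulomb force at the same d

print("range d =", d, "m")                         # about 2.5e-18 m
print("F ratio =", F_strong / F_coulomb)           # about 137 (i.e. 1/alpha)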

So surely the heuristic explanation of this 137 anomaly is just the shielding factor by the polarised vacuum?

‘All charges are surrounded by clouds of virtual photons, which spend part of their existence dissociated into fermion-antifermion pairs. The virtual fermions with charges opposite to the bare charge will be, on average, closer to the bare charge than those virtual particles of like sign. Thus, at large distances, we observe a reduced bare charge due to this screening effect.’ – I. Levine, D. Koltick, et al., Physical Review Letters, v.78, 1997, no.3, p.424.

The muon is 1.5 units on the scale of 137 electron masses (about 70 Mev), but this is heuristically explained by a coupling of the core (mass 1) with a virtual particle, just as the electron’s coupling to the vacuum increases its magnetic moment to about 1 + 1/(2.Pi.137). The mass increase of a muon is 1 + 1/2 because the Pi is due to spin and the 137 shielding factor doesn’t apply to bare particle cores in proximity, as it is due to the polarised vacuum veil at longer ranges. This is why unification of forces is approached with higher energy interactions, which penetrate the veil.
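For the magnetic moment figure just quoted, a one-line check (taking alpha as 1/137, a standard approximation): the first-order correction 1 + 1/(2.Pi.137) comes out at about 1.00116, the electron’s magnetic moment in Bohr magnetons to that order.

# Quick check of the 1 + 1/(2.Pi.137) figure quoted above.
import math
print(1 + 1 / (2 * math.pi * 137))   # about 1.00116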

This idea predicts that a particle core with n fundamental particles (n = 1 for leptons, n = 2 for mesons, and obviously n = 3 for baryons) coupling to N virtual vacuum particles (N is an integer) will have an associated inertial mass of Higgs bosons of:

(0.511 Mev).(137/2)n(N + 1) = 35n(N + 1) Mev, where 0.511 Mev is the electron mass. Thus we get everything from this one mass plus integers 1,2,3 etc, with a mechanism.
Accuracy tested against data for mass of muon and all ‘long-lived’ hadrons:

LEPTON (n=1): Muon (N=2): 105 Mev (105.66 Mev measured)

HADRONS

Mesons (contain n=2 quarks):

Pions (N=1): 140 Mev (139.57 and 134.96 actual)
Kaons (N=6): 490 Mev (493.67 and 497.67 actual)
Eta (N=7): 560 Mev (548.8 actual)

Baryons (contain n=3 quarks):

Nucleons (N=8): 945 Mev (938.28 and 939.57 actual)
Lambda (N=10): 1155 Mev (1115.60 actual)
Sigmas (N=10): 1155 Mev (1189.36, 1192.46, and 1197.34 actual)
Xi (N=12): 1365 Mev (1314.9 and 1321.3 actual)

The mechanism is that the charge of the bare electron core is 137 times the Coulomb (polarisation-shielded) value, so vacuum interactions of bare cores of fundamental particles attract 137 times as much virtual mass from the vacuum, increasing the inertia similarly. It is absurd that these close fits, with only a few percent deviation, are random chance, and this can be shown by statistical testing using random numbers as the null hypothesis. So there is strong evidence that this heuristic interpretation is on the right lines.
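A minimal sketch of the arithmetic behind the table above (Python; the n, N assignments and measured masses are the ones quoted on this page, and the deviations are computed relative to the measured values, so the rounding may differ slightly from percentages quoted elsewhere on this page):

# The 35n(N+1) Mev mass rule, compared with the measured masses quoted in the text.
particles = [
    # name,      n, N,  measured masses (Mev)
    ("Muon",     1, 2,  [105.66]),
    ("Pions",    2, 1,  [139.57, 134.96]),
    ("Kaons",    2, 6,  [493.67, 497.67]),
    ("Eta",      2, 7,  [548.8]),
    ("Nucleons", 3, 8,  [938.28, 939.57]),
    ("Lambda",   3, 10, [1115.60]),
    ("Sigmas",   3, 10, [1189.36, 1192.46, 1197.34]),
    ("Xi",       3, 12, [1314.9, 1321.3]),
    ("Omega",    3, 15, [1672.5]),
]

for name, n, N, measured in particles:
    predicted = 35 * n * (N + 1)     # 35 Mev is 0.511 x 137/2, as in the formula above
    errors = ", ".join(f"{100 * abs(predicted - m) / m:.1f}%" for m in measured)
    print(f"{name:9s} predicted {predicted:5d} Mev   measured {measured}   deviation {errors}")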

The problem is that people are used to looking to abstruse theory, due to the success of QFT in some areas, and looking at the data is out of fashion. If you look at the history of chemistry, there were measured atomic masses, and it took school teachers like Dalton and the Russian Mendeleev to work out atomic theory and periodicity, because the bigwigs were obsessed with vortex atom maths, the ‘string theory’ of that age.

Eventually, the obscure school teachers won out over the mathematicians, because the vortex atom (or string theory equivalent) did nothing, but empirical analysis did stuff.

Mesons
Pions = 1.99 (charged), 1.93 (neutral)
Kaons = 7.05 (charged), 7.11 (neutral)
Eta = 7.84

Baryons

Nucleons = 13.4
Lambda = 15.9
Sigmas = 17.0 (positive and neutral), 17.1 (negative)
Xi = 18.8 (neutral), 18.9 (negative)
Omega = 23.9

The masses above for all the major long-lived hadrons are in units of (electron mass)x137. A statistical Chi-squared correlation test confirms they are close to integers. The mechanism is that the charge of the bare electron core is 137 times the Coulomb (polarisation-shielded) value, so vacuum interactions of bare cores of fundamental particles attract 137 times as much virtual mass from the vacuum, increasing the inertia that much too.
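To reproduce the numbers in the list above, a short sketch (Python; the measured masses are the standard table values quoted on this page): each mass is divided by 137 electron masses, about 70 Mev, and the distance of the ratio from the nearest integer is printed.

# Hadron masses in units of 137 electron masses (about 70 Mev).
unit = 137 * 0.511   # about 70.0 Mev

masses = {
    "Pion+/-": 139.57, "Pion0": 134.96,
    "Kaon+/-": 493.67, "Kaon0": 497.67,
    "Eta": 548.8,
    "Proton": 938.28, "Neutron": 939.57,
    "Lambda": 1115.60,
    "Sigma+": 1189.36, "Sigma0": 1192.46, "Sigma-": 1197.34,
    "Xi0": 1314.9, "Xi-": 1321.3,
    "Omega-": 1672.5,
}

for name, mass in masses.items():
    ratio = mass / unit
    print(f"{name:8s} {ratio:5.2f}   (distance from nearest integer: {abs(ratio - round(ratio)):.2f})")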

Leptons and nucleons are the things most people focus on, and are not integers when the masses are in units of (electron mass)x137. The muon is about 1.5 units on this scale but this can be explained by a coupling of the core (mass 1) with a virtual particle of similar size for an average of half the time, just as the electron couples increasing its magnetic moment to 1 + 1/(2.Pi.137). The mass increase of the muon is 1 + 1/2 because the Pi is due to spin and the 137 shielding factor doesn’t apply to bare cores in proximity. [More at http://members.lycos.co.uk/nigelbryancook/]

http://cosmicvariance.com/2005/11/14/our-first-guest-blogger-lawrence-krauss/:

"Your comment is awaiting moderation."

Nigel Cook on Nov 20th, 2005 at 8:25 am

Dear Plato,

This fear is that U.S. government part-funded physics pre-print server arXiv.org is controlling where science goes on the basis of prejudice and blacklisting heretics to the string theory mainstream. Anonymous censors at arXiv.org have a high degree of arbitrary power to blacklist: http://www.valdostamuseum.org/hamsmith/jouref.html#arxivregreq2002

See also http://www.valdostamuseum.org/hamsmith/MENTORxxxarXiv.html

The root of Tony Smith’s blacklisting by arXiv.org seems to be his defence in one paper of 26-dimensional string theory which is now officially replaced by 10/11 dimensional M-theory: http://cdsweb.cern.ch/search.py?recid=730325&ln=en

Another example is would-be astronaut Danny Ross Lunsford, http://cdsweb.cern.ch/search.py?recid=688763&ln=en

Lunsford got this paper published in Int. J. Theor. Phys.: 43 (2004) no. 1, pp.161-177, but it was removed from arXiv.org since it investigated a 6-dimensional spacetime which again is not exactly worshipping Witten’s 10/11 dimensional M-theory.

http://www.math.columbia.edu/~woit/wordpress/?p=128#comment-1932 :

“I certainly know from experience that your point about the behavior of the gatekeepers is true - I worked out and published an idea that reproduces GR as low-order limit, but, since it is crazy enough to regard the long range forces as somehow deriving from the same source, it was blacklisted from arxiv (CERN however put it up right away without complaint).

“Nevertheless, my own opinion is that some things in science just can’t be ignored or you aren’t doing science, which is not a series of wacky revolutions. GR is undeniably correct on some level - not only does it make accurate predictions, it is also very tight math. There are certain steps in the evolution of science that are not optional - you can’t make a gravity theory that doesn’t in some sense incorporate GR at this point, any more than you can make one that ignores Newton on that level.”

I’ve evidence on my home page that Lunsford’s conclusion that gravity is a residual of the other forces is the right way to view it, and this resolves problems of mechanism and cosmology, making predictions. But because Lunsford is suppressed by arXiv.org, this whole sidestream is forced underground. All because of a few dictators in string theory who can’t be bothered to check what they are deleting.

Copied here in case deleted by Cosmic Variance!

Saturday, November 19, 2005

New discussion with Quantoken on Motl's blog

Nigel said...
Dear Quantoken,

The e^3 result comes as follows. Mass continuity equation for the galaxies in the space-time of the receding universe: dρ/dt + div.(ρv) = 0. Hence: dρ/dt = -div.(ρv).

Taking dx = dy = dz = dr, where r is radius, the divergence term is -div.(ρv) = -3d(ρv)/dr. For spherical symmetry the Hubble equation is v = Hr. Hence:

dρ/dt = -div.(ρv) = -div.(ρHr) = -3d(ρHr)/dr = -3ρH dr/dr = -3ρH.

So dρ/dt = -3ρH. Rearranging: -3H dt = (1/ρ) dρ. Integrating: -3Ht = (ln ρ1) – (ln ρ). Using the base of natural logarithms (e) to get rid of the ln’s: e^(-3Ht) = density ratio.

Because H = v/r = c/(radius of universe) = 1/(age of universe, t) = 1/t, the density ratio is e^(-3Ht) = e^[-3(1/t)t] = e^-3 ≈ 1/20.

All this says is that the gravity-causing mass of the receding universe reflects the higher effective density at earlier times, which is partly offset by the divergence. Another way to analyse it is to calculate numerically the gauge boson pressure from successive shells around the observer, which causes gravity and electromagnetism. If you look at the actual mechanism by which QFT gauge bosons cause forces, you can get the whole thing from pushing. Similar charges recoil apart since the exchange between them is stronger than the red-shifted gauge bosons from the surrounding universe that are pushing them together, whereas opposite charges shield one another and are thus pushed together. Gravity, as D. R. Lunsford concluded in his paper on the subject, is a residual of electromagnetism. Electromagnetism is 10^40 times gravity because the gauge boson exchange between similar charges adds up in a statistical drunkard’s walk, whereby the vector sum is equal to one step times the square root of the total number of steps, and there are 10^80 similar charges in the universe. It’s obvious that there is no way string theory is going to deal with gravity strength or mechanism, because these numbers can’t come out of any string theory. You have to look at the heuristic explanation of GR and QFT, plus the empirical facts of the big bang (not ‘facts’ derived from naive assumptions that gravity has no mechanism in the universe). The 377-ohm vacuum impedance discussion has been removed from my page. Every time someone says there’s something on my page which isn’t helpful, I remove it. Your GUITAR may have useful elements in it, but it is likely to be incomplete. Best wishes, Nigel

Nigel said...
As for 137: Heisenberg’s uncertainty principle (based on the impossible gamma-ray microscope thought experiment) gives pd = h/(2.Pi), where p is uncertainty in momentum and d is uncertainty in distance. The product pd is physically equivalent to Et, where E is uncertainty in energy and t is uncertainty in time. Since, for light speed, d = ct, we obtain: d = hc/(2.Pi.E). This is the formula the experts generally use to relate the range of a force, d, to the energy of the gauge boson, E. Notice that both d and E are really uncertainties in distance and energy, rather than real distance and energy, but the formula works for real distance and energy, because we are dealing with a definite ratio between the two. Hence for the roughly 80 GeV mass-energy W and Z intermediate vector bosons, the force range is about 2.5 x 10^-18 m. Since the formula d = hc/(2.Pi.E) therefore works for d and E as realities, we can introduce work energy as E = Fd, which gives us the strong nuclear force law: F = hc/(2.Pi.d^2). This inverse-square law is 137 times Coulomb's law of electromagnetism.
10:04 AM

FROM:
http://motls.blogspot.com/2005/11/congratulations-to-devin-walker.html
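As a small check of the continuity-equation step in the first comment copied above, here is a symbolic sketch (Python with sympy assumed available): for the Hubble flow v = Hr, with the density treated as uniform in space, the divergence of ρv is 3ρH, and the resulting density ratio e^(-3Ht) with Ht = 1 is roughly 1/20.

# Check div(rho v) = 3 rho H for v = H r, and the e^-3 density factor.
import sympy as sp

x, y, z, H, rho = sp.symbols("x y z H rho", positive=True)
vx, vy, vz = H * x, H * y, H * z       # Hubble flow, v = H r
div_rho_v = sp.diff(rho * vx, x) + sp.diff(rho * vy, y) + sp.diff(rho * vz, z)

print(div_rho_v)                        # prints 3*H*rho, so d(rho)/dt = -3*rho*H
print(float(sp.exp(-3)))                # about 0.0498, i.e. roughly 1/20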

Friday, November 18, 2005

The Templeton Foundation awarded P. C. W. Davies its $1M religion prize after Davies, a physics professor, wrote a lot of popular books about the beauty of mysterious and unexplained equations. The award ceremony was held in a London cathedral.

In 1995, physicist Davies wrote on pp54-57 of his book ‘About Time’:

‘Whenever I read dissenting views of time, I cannot help thinking of Herbert Dingle… who wrote … Relativity for All, published in 1922. He became Professor … at University College London… In his later years, Dingle began seriously to doubt Einstein’s concept … Dingle … wrote papers for journals pointing out Einstein’s errors and had them rejected … In October 1971, J.C. Hafele [used atomic clocks to defend Einstein] … You can’t get much closer to Dingle’s ‘everyday’ language than that.’

Now, let’s check out J.C. Hafele.

J. C. Hafele is not horses***. Hafele writes in Science vol. 177 (1972) pp 166-8 that he uses G. Builder (1958) as analysis for the atomic clocks.

G. Builder (1958) is an article called ‘ETHER AND RELATIVITY’ in Australian Journal of Physics, v11, 1958, p279, which states:

‘… we conclude that the relative retardation of clocks… does indeed compel us to recognise the CAUSAL SIGNIFICANCE OF ABSOLUTE velocities.’

Just to remind ourselves of what Einstein and his verifier Sir Arthur Eddington wrote on this:

‘The special theory of relativity … does not extend to non-uniform motion … The laws of physics must be of such a nature that they apply to systems of reference in any kind of motion. Along this road we arrive at an extension of the postulate of relativity….’ – Albert Einstein, ‘The Foundation of the General Theory of Relativity’, Annalen der Physik, v49, 1916.

‘The Michelson-Morley experiment has thus failed to detect our motion through the aether, because the effect looked for – the delay of one of the light waves – is exactly compensated by an automatic contraction of the matter forming the apparatus…. The great stumbling-block for a philosophy which denies absolute space is the experimental detection of absolute rotation.’ – A.S. Eddington, Space Time and Gravitation, Cambridge, 1921, pp. 20, 152.

So the contraction of the Michelson-Morley instrument made it fail to detect absolute motion. This is why special relativity needs replacement with a causal general relativity:

‘According to the general theory of relativity space without ether is unthinkable.’ – Albert Einstein, Leyden university lecture ‘Ether and Relativity’, 1920. (A. Einstein, Sidelights on Relativity, Dover, 1952, p. 23.)

‘… with the new theory of electrodynamics [vacuum filled with virtual particles] we are rather forced to have an aether.’ – P.A.M. Dirac, ‘Is There an Aether?,’ Nature, v168, 1951, p906. (If you have a kid playing with magnets, how do you explain the pull and push forces felt through space? As ‘magic’?)

‘Children lose interest … because a natural interest in the world around them has been replaced by an unnatural acceptance of the soundness of certain views, the correctness of particular opinions and the validity of specific claims.’ – Dr David Lewis, You can teach your child intelligence, Book Club Associates, London, 1982, p. 258.

Wonder why the ‘defenders of Einstein’ don’t attack general relativity? I suppose they are too busy defending obsolete special relativity, which only applies to non-accelerating motion…. Open debate, promised by the religion-science Templeton Foundation, sounds very nice, and was promised also by Britain’s Prime Minister who stated we were going to war to defend liberty.

When people wanted to raise the issue of liberty in Iraq during the last Labour Party conference, the Government used its power over the police to have an old member held under the ‘Prevention of Terrorism Act’ to prevent free speech. He was lucky compared to the guy who was repeatedly shot in the back until dead ‘by accident’.

Templeton will have to turn similarly paranoid when it runs into the real world. The whole point of ‘blacklisting’ seems to be accusing people of terrorism or being unethical, without foundation.
They will simply take the moral high ground like religion and when you start asking questions or providing ideas they’ll fabricate some irrelevant excuse to suppress you, saying you’re a cowboy. [This was inspired by Woit's post: http://www.math.columbia.edu/~woit/wordpress/?p=297]

Thursday, November 17, 2005

Bigwigs

When Electronics World published my papers in 2002-3, I got a pretty amazing attack from Sir Kevin Aylward (http://www.anasoft.co.uk/band/bio.htm), Warden of the King's Ale (a defunct title, according to Catt, who actually tried to look it up: there is only a Queen in England at present anyhow!), an electronics engineer (http://www.anasoft.co.uk/) who worked on the ill-fated Superconducting Super Collider and has a page on Relativity for Tellytubbies: http://www.anasoft.co.uk/physics/gr/index.html

I did not like the conventional treatment of general relativity, so I pored over Kevin's collection and came up with a heuristic approach that fits Penrose's and Feynman's description of it (volume reduction due to radial contraction around a mass). Then I approached it from another angle, the LeSage mechanism of gravity. Plugging the Hubble expansion into one of the throwaway equations on Kevin's page (the mass continuity equation, which I had seen years before in some textbook but had never used), I got an analytical proof for the density correction factor!

The problem to me was trying to see the tensors as differential geometry, in a physical way rather than an abstract mathematical tool. Reading Einstein's 1916 paper on the foundation of general relativity, together with several textbooks by lesser authors, helped fill in gaps.

I also enjoyed the evolution of women: http://www.anasoft.co.uk/replicators/malefemale.html

What is so amazing about Kevin is that he doesn't seem to see how funny it is to me. He writes a negative letter which is printed, gets rebuked by me and Catt, then writes a really violent letter to the Electronics World editor which allegedly dismisses me and Catt as fools, the editor blocks the letter to prevent legal action, then we discover Kevin is actually 'Warden of the King's Ale' and has a site full of his own scientific ideas, some very sensible too: http://www.anasoft.co.uk/quantummechanics/index.html

I love the bit where Kevin writes: 'Kevin is a firm Atheist. Gods do not exist. End of story.' (http://www.anasoft.co.uk/band/bio.htm)

You always find that atheists believe in things. They start believing in equations or the future, or themselves, in place of God. Being Catholic (http://www.catholic-ew.org.uk/), I believe in Jesus as well as the scientific method. I don't really believe that Jesus was not a human being, because if he was superman he could have been acting when being crucified and not really have felt pain. I realised this as a kid, but was soon forbidden to ask annoying questions, like why Jesus didn't teach people how to do first aid or basic surgery, which would have saved millions over the last 2000 years. The answer is 'don't ask' or you will be crucified by your fellow Catholics. The whole thing is based on mystery, morality and justifiable social work.

The deep strength of religious mysticism is vagueness, like string theory. Start asking clear questions, and you meet a conspiracy of paranoia. Don't ask silly questions, or join another tribe; the choice is yours. The whole subject is a bit dangerous to discuss in public, because I don't want to annoy the Pope and be excommunicated from religion in the same way that I was excommunicated from physics. It isn't a good experience. The bottom line is, Jesus is a moral character to believe in, and 'God' is such a vague concept it can mean anything from Nature to Father Christmas for different people. If religion is to progress, it will get stronger by becoming weaker, i.e., less specific to particular groups (Muslims versus Christians, etc.). Science, to progress, needs to do the exact opposite and give up vague, untestable horses*** string theory!

Further developments in causality

Revised http://nigelcook0.tripod.com/ as a brief summary, omitting string theory controversy which is on the more complete http://members.lycos.co.uk/nigelbryancook.

From http://nigelcook0.tripod.com/:

There was a crude empirical equation for ... masses by A.O. Barut, PRL, v. 42 (1979), p. 1251. We can extend the basic idea to hadrons. The muon is 1.5 units on this scale but this is heuristically explained by a coupling of the core (mass 1) with a virtual particle, just as the electron couples increasing its magnetic moment to about 1 + 1/(2.Pi.137). The mass increase of a muon is 1 + 1/2 because Pi is due to spin and the 137 shielding factor doesn’t apply to bare particle cores in proximity, as it is due to the polarised vacuum veil at longer ranges. This is why unification of forces is approached with higher energy interactions, which penetrate the veil. This idea predicts that a particle core with n fundamental particles (n=1 for leptons, n = 2 for mesons, and obviously n=3 for baryons) coupling to N virtual vacuum particles (N is an integer) will have an associated inertial mass of Higgs bosons of: (0.511 Mev).(137)n(N + 1)/2 = 35n(N + 1) Mev.

Accuracy tested against data for mass of muon and all ‘long-lived’ hadrons:

LEPTON (n=1)

Muon (N=2): 105 Mev (105.66 Mev measured), 0.6% error!

HADRONS

Mesons (contain n=2 quarks):

Pions (N=1): 140 Mev (139.57 and 134.96 actual), 0.3% and 3.7% errors!
Kaons (N=6): 490 Mev (493.67 and 497.67 actual), 0.7% and 1.6% errors!
Eta (N=7): 560 Mev (548.8 actual), 2% error!

Baryons (contain n=3 quarks):

Nucleons (N=8): 945 Mev (938.28 and 939.57 actual), 0.7% and 0.6% errors!
Lambda (N=10): 1155 Mev (1115.60 actual), 3.5% error!
Sigmas (N=10): 1155 Mev (1189.36, 1192.46, and 1197.34 actual), 3.0%, 3.2% and 3.7% errors!
Xi (N=12): 1365 Mev (1314.9 and 1321.3 actual), 3.8% and 3.3% errors!
Omega (N=15): 1680 Mev (1672.5 actual), 0.4% error!

The mechanism is that the charge of the bare electron core is 137 times the Coulomb (polarisation-shielded) value, so vacuum interactions of bare cores of fundamental particles attract 137 times as much virtual mass from the vacuum, increasing the inertia similarly. It is absurd that these close fits, with only a few percent deviation, are random chance, and this can be shown by statistical testing using random numbers as the null hypothesis. So there is empirical evidence that this heuristic interpretation is on the right lines, whereas the ‘renormalisation’ is bogus: http://www.cgoakley.demon.co.uk/qft/

From http://members.lycos.co.uk/nigelbryancook/:


The falsity of mainstream unified field theory (‘string theory’): untestable, multiple universe/dimension fantasy speculation

Dr Luboš Motl, Harvard University: ‘… quantum mechanics is perhaps the deepest idea we know. It is once again a deformation of a conceptually simpler picture of classical physics.’

Dr Peter Woit is a Columbia University mathematician who runs the weblog ‘Not Even Wrong’ about string theory; the name comes from physicist Pauli’s dismissal of speculative belief systems which predict nothing and cannot be tested or checked as ‘not even wrong’. He has written a book that will sort out the nonsense in physics.

For problems of speculation in physics about purely mathematical approaches to quantum gravity, see his weblog where Dr Woit concludes:

‘the danger is that there may be lots of ways of "quantizing gravity", and with no connection to experiment you could never choose amongst them. String theory became so popular partly because it held out hope for being able to put the standard model and gravity into the same structure. But there’s no reason to believe it’s the only way of doing that, and people should be trying different things in order to come up with some new ideas.’

Skeptical San Francisco Chronicle newspaper article (http://www.math.columbia.edu/~woit/blog/):

March 14 2005: Today’s San Francisco Chronicle contains an article about string theory entitled ‘Theory of Everything’ Tying Researchers Up In Knots. The lead sentence is:

‘The most celebrated theory in modern physics faces increasing attacks from skeptics who fear it has lured a generation of researchers down an intellectual dead end.’

Davidson contrasts Michio Kaku’s very pro-string theory point of view in his new book Parallel Worlds, with the much more skeptical views of Lawrence Krauss, who has a book entitled Hiding in the Mirror: The Mysterious Allure of Extra Dimensions coming out [on 20 October 2005].
Stanford’s Robert Laughlin makes the point that string theorists are trying to camouflage the theory’s increasingly obvious flaws, comparing the theory to ‘a 50-year-old woman wearing way too much lipstick.’

The article, using Woit's blog for scientific data, pointed out further problems, concluding: 'it is a disaster for string theory because the sheer number of estimated universes – equal to the number one followed by 500 zeroes – is unimaginably large. If true, it means that string theory is so flexible that it can be used to predict almost any kind of universe you want, no matter how crazy, and hence it predicts nothing specific enough to be scientifically interesting.'

Update: On 20 October 2005, Dr Lawrence Krauss’s book called Hiding in the Mirror about the failure of string theory to do anything scientific came out. He wrote an entertaining and well-checked book on ‘The Physics of Star Trek’. So perhaps Hollywood will make a film about string theory, a kind of epic tragedy, depicting the collective insanity of the generation of theoretical physicists who ignored Feynman’s warning. Dr Woit himself has an exciting book coming out in March 2006, which digs further into the mathematical frauds than Krauss did. (At the bottom of this page are more details from the Wikipedia encyclopedia entry on Not Even Wrong that is due to be deleted by the philistines.) The Guardian newspaper has now joined the fight for factual physics. Earlier in 2005, I arranged with Melanie Hewitt at News International to publish a page advert in The Times for £4,000. She told me on the phone that it would need to be run past the editor, since it was not very boring. The crank editor censored it, being too paranoid to allow questions to be raised about the funding of crackpottery.

The sound wave has an outward overpressure followed by an equal under-pressure phase, giving an outward force and equal-and-opposite inward reaction that allows music to propagate. Nobody hears the force of music on the eardrum (but they ‘hear’ talk about the equation of a sound ‘wave’!). The outward force phase of a sound wave is like the forward thrust of a rocket in air, while the inward force of sound that follows is like the backward thrust of exhaust. ‘String theory’ cranks suppress all evidence of the outward force of an explosion, because its equal inward reaction by the third law of motion in the case of the big bang is the force of gravity, as proved mathematically on this page.

‘It has been said that more than 200 theories of gravitation have been put forward; but the most plausible of these [the Lesage-Feynman pressure shielding scheme] … had the defect that they lead nowhere and admit of no experimental test.’ - Sir Arthur Eddington, Space Time and Gravitation, Cambridge University Press, 1921, p64. We prove below that LeSage’s push gravity results from a pressure source mechanism in the universe that makes correct predictions of the strength of gravity, etc. Light speed gravitons are force gauge bosons, established in quantum field theory, that can’t fill in shadows. Weblog: http://electrogravity.blogspot.com/

There is little ‘dark matter’ around because the false ‘critical density’ in general relativity is out by a factor of e^3/2 ≈ 10. The acceleration of the universe implied from supernovae red-shifts is false (since gravity is a response to the surrounding matter, distant galaxies in the explosion are not slowed down by gravity, so there is no need to claim there is an acceleration offsetting a fictitious gravity pull-back). This is completely unique: F = (3/4)mMH^2/(Pi.r^2.ρ.e^3) ≈ 6.7 x 10^-11 mM/r^2 newtons! [see http://members.lycos.co.uk/nigelbryancook/ for the proper maths symbols, which don't come out on this blog]
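A rough numerical check of the reconstructed formula (Python). The inputs below are illustrative assumptions rather than values quoted in this extract: a Hubble time of 15,000 million years as used elsewhere on this page, and an effective density of about 8 x 10^-28 kg/m^3, roughly a tenth of the conventional critical density in line with the e^3/2 factor above.

# F = (3/4) m M H^2 / (Pi r^2 rho e^3) implies G_effective = (3/4) H^2 / (Pi rho e^3).
import math

t = 15e9 * 3.156e7        # assumed age of the universe, in seconds
H = 1.0 / t               # Hubble parameter taken as 1/t, as on this page
rho = 8e-28               # assumed effective density, kg/m^3 (an illustrative value)

G_effective = (3.0 / 4.0) * H**2 / (math.pi * rho * math.e**3)
print(G_effective)        # comes out around 6.6e-11, close to the measured 6.7e-11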

...

‘An Electronic Universe’, Part 2, N. Cook, Electronics World, Vol. 109, No. 1804 (2003): downloads of two articles titled ‘An Electronic Universe’, the first part published August 2002 and the second part April 2003, containing illustrations of the mechanism of the electron and electromagnetic forces ... [http://members.lycos.co.uk/nigelbryancook/ contains this article at the end, which deals with causality in special relativity, etc.]

Analysis of gauge bosons in big bang and analysis of particle masses

As we look to larger distances, we simultaneously look further back in time. In this sense, Minkowski was right in saying spacetime is one thing, because of the speed of light and force fields. In the big bang universe, density was greater in the past, which means at greater distances. As time zero is approached, the density would approach infinity. We don’t see either the radiation or the gravitational effects from infinite density at time zero, 15,000 million light years away. The reason is obviously red-shift of both the light and the gravity-causing radiation (‘gravitons’, or rather Lunsford’s residual effect of the other gauge bosons). The red-shift wipes out contributions from the highly receding universe at the theoretical boundary, as witnessed by the fact that the cosmic background radiation, emitted 300,000 years after the big bang, has a frequency reduction by a factor of 1,000. Radiation from 1 second after time zero would have such an immensely greater red-shift reduction factor that it would, despite the high frequencies associated with radiation at such early times, be completely undetectable to us.

I’m going to put some illustrations of the force mechanisms on my home page, plus curves comparing the observed big bang to nuclear explosions in a spacetime fabric, dealing with outward force of the explosion and the inward reaction force at various stages, and how the nuclear, electromagnetic and gravity forces derive from quantum field theory gauge boson exchange in this dynamic big bang.

Another thing I’m going to post is an analysis of the mechanism behind the masses of all observed particles, which form a kind of early periodic table by analogy to Dalton and Newlands’ early chemical ideas from examining apparent atomic masses. Obviously, the pioneers were confused by the mass of chlorine (which contains two isotopes in such proportions that it is not close to an integer multiple of the mass of hydrogen), and they had problems working out that water is H2O not just HO. However, the analogy is a good starting point. Here is a list of all the long-lived hadron masses, in units in which the electron has a mass of 1/137 (i.e., units of 137 electron masses, about 70 Mev); the muon mass of 1.5 on this scale was addressed with the other leptons in the last post. Most hadron particle masses are near integers! This is supported by a Chi-squared test.

Mesons
Pions = 1.99 (charged), 1.93 (neutral)
Kaons = 7.05 (charged), 7.11 (neutral)
Eta = 7.84

Baryons
Nucleons = 13.4
Lambda = 15.9
Sigmas = 17.0 (positive and neutral), 17.1 (negative)
Xi = 18.8 (neutral), 18.9 (negative)
Omega = 23.9

Force of sound

For sound, the outward force and inward force are like the thrust of rocket exhaust and the reaction driving the rocket forward, when the rocket is going at a steady speed in air. An aeroplane is another example: the forward force is balanced. We know there is force in sound from the sine-wave pressure plot, which proves it. The total pressure times the wavefront area is the outward force; the overpressure times that area is the net outward force (of importance in an air burst, where the 14.7 psi ambient air pressure confuses Kevin into thinking it stops the shock).
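A simple numerical illustration of those two phases (Python with numpy; the amplitude, frequency and area below are arbitrary illustrative values, not data): the instantaneous force is the overpressure times the wavefront area, the peak outward force is the pressure amplitude times that area, and the impulse integrated over a whole cycle is zero because the outward and inward phases cancel.

# Outward/inward force phases of a sine-wave sound pulse over one cycle.
import numpy as np

P0 = 20.0       # overpressure amplitude, Pa (illustrative)
A = 0.5         # wavefront area considered, m^2 (illustrative)
f = 100.0       # frequency, Hz (illustrative)

t = np.linspace(0.0, 1.0 / f, 10_001)            # one full cycle
overpressure = P0 * np.sin(2 * np.pi * f * t)    # overpressure, then equal under-pressure
force = overpressure * A                         # instantaneous net force on that area

print("peak outward force  :", force.max(), "N")                     # equals P0 * A
print("impulse over a cycle:", np.sum(force) * (t[1] - t[0]), "N s")  # ~0: phases cancel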

Kevin claimed falsely: “The ‘thrust’ of a rocket is of the ‘expansion of gases’ accelerating mass away from the rocket. Conservation of momentum requires that the rocket accelerate as well.”
Wrong. The rocket has air drag, like a sound wave hitting air ahead of it, which prevents acceleration from the force. Similarly, weight is a force. My downward force on the floor does not cause the floor to accelerate or me to accelerate the other way: there is no acceleration because the forces are equal. Take a refresher course in elementary mechanics from an A-level physics or applied maths... Forces in equilibrium don't induce accelerations.