Quantum gravity physics based on facts, giving checkable predictions

Tuesday, October 10, 2006

John Horgan interviews Peter Woit; Lubos Motl's ex-student Jim Weatherall claims that string theory isn't important; British University Physics Teaching Half Shuts Down

‘Since 1982 A-level physics entries have halved. Only just over 3.8 per cent of 16-year-olds took A-level physics in 2004 compared with about 6 per cent in 1990.

‘More than a quarter (from 57 to 42) of universities with significant numbers of physics undergraduates have stopped teaching the subject since 1994, while the number of home students on first-degree physics courses has decreased by more than 28 per cent. Even in the 26 elite universities with the highest ratings for research the trend in student numbers has been downwards.’

- http://www.buckingham.ac.uk/news/newsarchive2006/ceer-physics-2.html (Much more info: here.)

In the interview by John Horgan (you need to download the 206 MB file before playing, because the video will not stream), Peter Woit talks about the suppression of his efforts to counter stringy propaganda. I can add some explanation to the problem of raising awareness of the tragic consequences of string theory for physics. You see, I authored the opinion page in the October 2003 issue of Electronics World, before Peter Woit's blog started in March 2004. (Weirdly, the closure of British university physics departments is now being blamed on Peter Woit by stringers: see my comments about this dispute here, here, here, and here.)

On that Electronics World page, I gave reasons why the closure of university physics departments in the UK is due to string theory. Kids suffer a ‘crimestop’ fear of asking real questions about physics, because they don't want a sneering rebuff of the Euler sort.

‘Crimestop means the faculty of stopping short, as though by instinct, at the threshold of any dangerous thought. It includes the power of not grasping analogies, of failing to perceive logical errors, of misunderstanding the simplest arguments if they are inimical to (an authority) and of being bored or repelled by any train of thought which is capable of leading in a heretical direction. Crimestop, in short, means protective stupidity.’ - http://www.lewrockwell.com/hein/hein95.html

The consequence is that, instead of asking physics teachers for the evidence of string theory, kids give up physics: physics is supposed to answer questions, so kids can tell that something is amiss when string hype goes unopposed. So I’m glad that Professor Michio Kaku has admitted the great truth of human nature (at least in his draft article, ‘accidentally’ published on his weblog for a short time):

‘It’s a sign of the vitality of theoretical physics that people are so passionate ... Science flourishes with controversy.’

- http://www.math.columbia.edu/~woit/wordpress/?p=457#comment-15821

Contrast that to the referee report from the string theorist (at one time blamed on CVJ, whom I admire greatly for his wonderful posts on the weblog Cosmic Variance) who stopped Not Even Wrong from being published by the academic publisher Cambridge University Press some years ago.

The mainly anonymous censors (so far only CVJ and LM have sneakily confessed; update 16 Dec 2006: CVJ has indicated that he has not read Peter Woit's book and appears to be suggesting he did not therefore review it) who stopped Not Even Wrong’s publication by Cambridge University Press and other university presses helped make the attack worse by opening the way for Penrose to get the book printed by Jonathan Cape and other popular publishers. So they are to blame for the damage done to string theory today.

Here in the UK, the string theory crusades, hyped not just by Americans but also by Hawking in best-selling books, have correlated (supposedly by pure coincidence) with a near 50% SLUMP in the number of students doing A-level physics (a vital asset required for undergraduate physics entry) since the rise of string theory in 1985.

As a result many university physics undergraduate teaching departments have had to close and staff have been laid off or had to be integrated into electronics, electrical engineering, and mathematics departments.

It is extremely sad to feel you know a cause, and to be told to shut up. People say the A-level physics decline is caused by a lack of hype of string theory (which, given the vast sales of books by Hawking and other string theory acolytes in the UK, is ridiculous), or they blame a lack of physics teachers due to a lack of students of physics (which is a circular argument, because the lack of physics teachers is due to a lack of people studying physics, which in turn is due to the unhealthy religious-type stringy hype lacking evidence, etc.).

The evidence is that string theory is a real problem and so I disagree a bit with Lubos Motl's former student Jim Weatherall's statement: ‘ultimately, string theory simply isn’t very important ... there must be better things to worry about. For all the snide comparisons by string theory’s critics, at least a religion would speak to ethics and suffering.’

I've seen some suffering caused by string theory, so if evidence of mere suffering is all that remains to prove string is a religion, then that's it: string is religion, and New Scientist magazine is its sacred scroll.

New Scientist's old editor Dr Alun M. Anderson and its new editor Jeremy Webb BSc (electronics, Exeter University) a former BBC sound engineer, reject everything I have submitted which has been pro-physics, ie, string critical. See the damage they have done to the magazine and to student interest in physics by their financial expediency and bigmouth sycophancy: http://golem.ph.utexas.edu/category/2006/09/a_plea_to_save_new_scientist.html

Note: the last post on this blog - below - has now been revised and updated and posted at https://nige.wordpress.com/ which is more reliable than this blogger site. Since this site is often inaccessible due apparently to server overload, https://nige.wordpress.com/ will persist.

Update: Lambda (the CC) -> 0, when G -> 0. Gravity dynamics which predict gravitational strength and various other observable and further checkable phenomena, are consistent with the gravitational-electromagnetic unification in which there are 3 dimensions describing contractable matter (matter contracts due to its properties of gravitation and motion), and 3 expanding time dimensions (the spacetime between matter expands due to the big bang according to Hubble’s law). Lunsford has investigated this over SO(3,3):

http://www.math.columbia.edu/~woit/wordpress/?p=128#comment-1932:

‘…I worked out and published an idea that reproduces GR as low-order limit, but, since it is crazy enough to regard the long range forces as somehow deriving from the same source, it was blacklisted from arxiv (CERN however put it up right away without complaint). … my work has three time dimensions, and just as you say, mixes up matter and space and motion. This is not incompatible with GR, and in fact seems to give it an even firmer basis. On the level of GR, matter and physical space are decoupled the way source and radiation are in elementary EM. …’ - drl

“… the flat universe is just not decelerating, it isn’t really accelerating …”

- http://cosmicvariance.com/2006/01/03/danger-phil-anderson

All you need to do to make gravitational strength fall toward zero over cosmic distances is to recognise the very plain, simple fact that any exchange of force-causing gauge boson radiation between receding masses will suffer redshift-related problems not seen in the QFT of nuclei and atoms.

Over short distances, any Yang-Mills quantum gravity will be unaffected because the masses aren’t receding, so exchange radiation won’t be upset. But over great distances, recession of galaxies will cause problems in QFT gravity that aren’t physically included in general relativity.
I don’t know if gauge bosons are redshifted or slowed down, but it’s clear that between two masses receding from one another at a speed near c, the force will be weakened. That’s enough to get gravity to fade out over cosmic distances.

This means G goes to zero for cosmology sized distances, so general relativity fails and there is no need for any cosmological constant at all, CC = 0.
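As a rough illustration of that claim (my sketch, not a calculation from this post), here is a minimal numerical example assuming Hubble-law recession, a relativistic Doppler redshift, and a toy rule that the effective gravitational coupling falls in proportion to the received gauge boson energy, i.e. as 1/(1+z); the Hubble parameter value is assumed:

    # Toy sketch: recession speed v = H0*d, relativistic Doppler redshift of the
    # exchanged gauge bosons, and an assumed effective coupling falling as 1/(1+z).
    import math

    H0  = 2.27e-18      # Hubble parameter in s^-1 (about 70 km/s/Mpc) - assumed value
    c   = 2.998e8       # speed of light, m/s
    Mpc = 3.086e22      # metres per megaparsec

    for d_Mpc in (1, 100, 1000, 3000):
        v = H0 * d_Mpc * Mpc                          # recession speed from Hubble's law
        beta = min(v / c, 0.999)                      # keep below c for the Doppler formula
        z = math.sqrt((1 + beta) / (1 - beta)) - 1    # relativistic Doppler redshift
        print(f"d = {d_Mpc:5d} Mpc, v/c = {beta:.3f}, z = {z:.2f}, G_eff/G = {1/(1+z):.3f}")

The numbers are only illustrative; the point is simply that the assumed weakening becomes severe only at distances where recession speeds approach c.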

http://cdsweb.cern.ch/search.py?recid=688763&ln=en shows that CC = 0 if gravity and electromagnetism are unified by having three expanding time dimensions instead of one.

When you think about it, it’s obviously correct: GR deals with contractable dimensions describing matter, and one time dimension. Lunsford simply expands the time to three dimensions, hence the orthogonal symmetry group SO(3,3). The three expanding time dimensions give the cosmological recession! The Hubble expansion then becomes a velocity variation with time, not distance, so it becomes an acceleration. Newton’s laws then tell us the outward force of the big bang and the inward reaction, which have some consequences for gravity prediction (see the rough numerical sketch below).
We already talk of cosmological distances in terms of time (light years). The contractable dimensions always describe matter (rulers, measuring rods, instruments, planet earth). Empty space doesn’t contract in the expanding universe, no matter what the relative motion or gravity field strength is. Only matter’s dimensions are contractable. Empty spacetime volume expands. Hence 3 expanding dimensions, and 3 contractable dimensions replace SO(3,1). More here.
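To make the arithmetic of that argument concrete, here is a minimal sketch under stated assumptions (a constant Hubble parameter of about 70 km/s/Mpc and roughly 3 x 10^52 kg for the mass of the observable universe, both my illustrative values, not figures from this post): reading v = HR with R = ct, the recession speed at a fixed lookback time grows as dv/dt = Hc, and Newton's second law then gives an outward force.

    # Minimal sketch of the 'Hubble expansion as acceleration' reasoning above.
    # Assumed illustrative values: H0 ~ 70 km/s/Mpc, M ~ 3e52 kg for the observable universe.
    H0 = 2.27e-18       # Hubble parameter, s^-1
    c  = 2.998e8        # speed of light, m/s
    M  = 3e52           # rough mass of the observable universe, kg (assumed)

    a = H0 * c          # if v = H*R = H*c*t, then dv/dt = H*c (H treated as constant)
    F_outward = M * a   # Newton's second law: outward force of the expansion
    print(f"a = {a:.2e} m/s^2, outward force ~ {F_outward:.2e} N")

This gives an apparent acceleration of order 10^-10 m/s^2 and an outward force of order 10^43 N on these assumed inputs.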

25 Comments:

At 2:10 PM, Anonymous Anonymous said...

Copy of a comment to Bee's blog after she claimed - or seemed to claim - in comments on LM's blog that if you ask the question of whether redshift is accompanied by velocity change, that is like denying redshift. (Redshift is a fact!! You don't have to deny a fact to ask a question when the speed of significantly redshifted light - such as the CBR - has never been measured. This is NOT the Michelson-Morley experiment or any variant, because that just determines the effect of receiver speed, and the receiver undergoes a contraction which compensates for any change in c, making relativity work. Nobody has ever measured the speed of the CBR, which is infrared light so redshifted that it is now in the microwave band. Measuring that speed would determine emitter effects on light speed, which are vital to validate relativity. Anyone who denies the value of continually making new measurements on new phenomena in physics to check ideas which have never been tested before, is UNHELPFUL):

Hi Bee,

"... to say that the GPS work because of General Relativity is not wrong, but it is not the whole story: It is a catch phrase to show that GR is not some abstract mathematics, but plays indeed a role in the real world."

On the topic of mathematical models in general, see Feynman:


‘It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities.’

- R. P. Feynman, Character of Physical Law, November 1964 Cornell Lectures, broadcast and published in 1965 by BBC, pp. 57-8.

Now you want me to explain a candidate mechanism I suppose...

Nothing works because of a mathematical model, and until quantum gravity is included general relativity won't even be a complete mathematical model ;-)

You might as well claim that people meet and marry because of the equation 1 + 1 = 2.

Underlying general relativity, there are real dynamics. If it is analogous to a Yang-Mills quantum field theory, exchange radiation will behave differently in the universe than in an atom or nucleus, due to redshift ;-)

Smolin et al. show in LQG that a path integral is a summing over the full set of interaction graphs in a Penrose spin network. The result gives general relativity without a metric (ie, background independent). Next, you simply have to make gravity consistent completely with standard model-type Yang-Mills QFT dynamics to get predictions:

Over short distances, any Yang-Mills quantum gravity will be unaffected because the masses aren’t receding, so exchange radiation won’t be upset. But over great distances, recession of galaxies will cause problems in QFT gravity that aren’t physically included in general relativity.

I don’t know if gauge bosons are redshifted or slowed down (background independence upsets SR, and Maxwell's model is hogwash since his displacement current equation, which depends on vacuum polarization, can't occur in a QFT unless the electric field strength exceeds the IR cutoff, which corresponds to about 10^18 V/m, FAR higher than the field strengths of Hertz' radio waves, which he falsely claimed proved Maxwell's equations correct), but that simply doesn't matter: either way, it’s clear that between two masses receding from one another at a speed near c, the force will be weakened. That’s enough to get gravity to fade out over cosmic distances.

This means G goes to zero for cosmology sized distances, so general relativity fails and there is no need for any cosmological constant at all, CC = 0.

Lambda (the CC) -> 0, when G -> 0. Gravity dynamics which predict gravitational strength and various other observable and further checkable phenomena, are consistent with the gravitational-electromagnetic unification in which there are 3 dimensions describing contractable matter (matter contracts due to its properties of gravitation and motion), and 3 expanding time dimensions (the spacetime between matter expands due to the big bang according to Hubble’s law). Lunsford has investigated this over SO(3,3):

http://www.math.columbia.edu/~woit/wordpress/?p=128#comment-1932:

‘... I worked out and published an idea that reproduces GR as low-order limit, but, since it is crazy enough to regard the long range forces as somehow deriving from the same source, it was blacklisted from arxiv (CERN however put it up right away without complaint). ... my work has three time dimensions, and just as you say, mixes up matter and space and motion. This is not incompatible with GR, and in fact seems to give it an even firmer basis. On the level of GR, matter and physical space are decoupled the way source and radiation are in elementary EM. ...’ - drl

Nobel Laureate Phil Anderson:

“... the flat universe is just not decelerating, it isn’t really accelerating ...”

- http://cosmicvariance.com/2006/01/03/danger-phil-anderson


Hence Lunsford's model is right. Note that this PRECEDES experiment. I got a publication in Electronics World Oct 96, which is for a dynamical model.

When you think about it, it’s obviously correct: GR deals with contractable dimensions describing matter, and one time dimension. Lunsford simply expands the time to three dimensions, hence the orthogonal symmetry group SO(3,3). The three expanding time dimensions give the cosmological recession! The Hubble expansion then becomes a velocity variation with time, not distance, so it becomes an acceleration. Newton’s laws then tell us the outward force of the big bang and the inward reaction, which have some consequences for gravity prediction, predicting G to within experimental error!
We already talk of cosmological distances in terms of time (light years). The contractable dimensions always describe matter (rulers, measuring rods, instruments, planet earth). Empty space doesn’t contract in the expanding universe, no matter what the relative motion or gravity field strength is. Only matter’s dimensions are contractable. Empty spacetime volume expands. Hence 3 expanding dimensions, and 3 contractable dimensions replace SO(3,1).

The question is, how long will stringers with only hype be defended by non-falsifiable predictions about soft scatter of heavy ions, similar to the predictions of large extra dimensions?



BTW, if you want to contribute a cent to determining experimentally whether redshifted light suffers a velocity change, go over to LM's blog. ;-)


Best,
nc

 
At 2:39 PM, Anonymous Anonymous said...

Copy of a comment to http://discovermagazine.typepad.com/horganism/2006/10/the_end_of_stri.html


‘It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities.’

- R. P. Feynman, Character of Physical Law, November 1964 Cornell Lectures, broadcast and published in 1965 by BBC, pp. 57-8.

Nothing works because of a mathematical model which so-and-so invented to describe something in the natural world. For example, until quantum gravity is included in general relativity, the latter won't even be a complete mathematical model for gravity, let alone the cause for all gravitational phenomena.

You might as well claim that people meet and marry because of the equation 1 + 1 = 2.

Underlying general relativity, there are real dynamics. If it is analogous to a Yang-Mills quantum field theory, exchange radiation will behave differently in the universe than in an atom or nucleus, due to redshift.

Smolin et al. show in LQG that a path integral is a summing over the full set of interaction graphs in a Penrose spin network. The result gives general relativity without a metric (ie, background independent). Next, you simply have to make gravity consistent completely with standard model-type Yang-Mills QFT dynamics to get predictions:

(1) Over short distances, any Yang-Mills quantum gravity will be unaffected because the masses aren’t receding, so exchange radiation won’t be upset.

(2) But over great distances, recession of galaxies will cause problems in QFT gravity that aren’t physically included in general relativity.

I don’t know if gauge bosons are redshifted with constant velocity or if they are slowed down due to recession, being exchanged less frequently when masses are receding from one another.

It doesn't matter: either way, it’s clear that between two masses receding from one another at a speed near c, the force will be weakened. That’s enough to get gravity to fade out over cosmic distances.

This means G goes to zero for cosmology sized distances, so general relativity fails and there is no need for any cosmological constant at all, CC = 0.

Lambda (the CC) -> 0, when G -> 0. Gravity dynamics which predict gravitational strength and various other observable and further checkable phenomena, are consistent with the gravitational-electromagnetic unification in which there are 3 dimensions describing contractable matter (matter contracts due to its properties of gravitation and motion), and 3 expanding time dimensions (the spacetime between matter expands due to the big bang according to Hubble’s law). Lunsford has investigated this over SO(3,3):

http://www.math.columbia.edu/~woit/wordpress/?p=128#comment-1932:

‘... I worked out and published an idea that reproduces GR as low-order limit, but, since it is crazy enough to regard the long range forces as somehow deriving from the same source, it was blacklisted from arxiv (CERN however put it up right away without complaint). ... my work has three time dimensions, and just as you say, mixes up matter and space and motion. This is not incompatible with GR, and in fact seems to give it an even firmer basis. On the level of GR, matter and physical space are decoupled the way source and radiation are in elementary EM. ...’ - D. R. Lunsford.

Nobel Laureate Phil Anderson:

“... the flat universe is just not decelerating, it isn’t really accelerating ...”

- http://cosmicvariance.com/2006/01/03/danger-phil-anderson


Hence Lunsford's model is right. Note that this PRECEDES experiment. I got a publication in Electronics World Oct 96, which is for a dynamical model.

When you think about it, it’s obviously correct: GR deals with contractable dimensions describing matter, and one time dimension. Lunsford simply expands the time to three dimensions, hence the orthogonal symmetry group SO(3,3). The three expanding time dimensions give the cosmological recession! The Hubble expansion then becomes a velocity variation with time, not distance, so it becomes an acceleration.

Newton’s laws then tell us the outward force of the big bang and the inward reaction, which have some consequences for gravity prediction, predicting G to within experimental error.

We already talk of cosmological distances in terms of time (light years). The contractable dimensions always describe matter (rulers, measuring rods, instruments, planet earth). Empty space doesn’t contract in the expanding universe, no matter what the relative motion or gravity field strength is. Only matter’s dimensions are contractable. Empty spacetime volume expands. Hence 3 expanding dimensions, and 3 contractable dimensions replace SO(3,1).

Lunsford's paper:

http://cdsweb.cern.ch/search.py?recid=688763&ln=en

I'd be keen for you to ask Peter Woit about Lunsford, and also about Woit's use of representation theory to generate the Standard Model in low dimensions. This is the really big problem if gravity is successfully modelled by Lunsford's approach.

Wikipedia gives a summary of representation theory and particle physics:

‘There is a natural connection, first discovered by Eugene Wigner, between the properties of particles, the representation theory of Lie groups and Lie algebras, and the symmetries of the universe. This postulate states that each particle “is” an irreducible representation of the symmetry group of the universe.’

Woit’s historical approach in his course notes is very clear and interesting, but is not particularly easy to read at length on a computer screen, and ideally should be printed out and studied carefully. I hope it is published as a book with his arXiv paper on applications to predicting the Standard Model. I’m going to write a summary of this subject when I’ve finished, and will get to the physical facts behind the jargon and mathematical models. Woit offers the promise that this approach predicts the Standard Model with electroweak chiral symmetry features, although he is cautious about it, the exact opposite of the string theorists' style; see page 51 of the paper (he downplays his success in case it is incomplete or in error, instead of hyping it).

Lunsford's paper on gravity: http://cdsweb.cern.ch/search.py?recid=688763&ln=en

Woit's paper producing the Standard Model particles on page 51: http://arxiv.org/abs/hep-th/0206135

Maybe you can find out why these ideas are being neglected by string theorists!

Try to get a rational and reasonable response from Lubos Motl, Jacques Distler, Sean Carroll (who is not a string theorist but a cosmologist, so he should be willing to make a comment on the cosmological effects of Lunsford's paper - the end of the cosmological constant in particular), and also Clifford Johnson who is a string theorist.

Since the string theorists have been claiming to have the best way to deal with gravity, it would be interesting to see if they will defend themselves by analysing alternatives, or not.

(I predict you will get a mute reaction from Woit, but don't let him fool you! He is just cautious in case he has made an error somewhere.)

nc

BTW, I tried to shut down string theory in the Oct. 2003 issue of Electronics World, but discovered that there is a lot of public support for string theory, because string theorists (and fellow travellers like Hawking) have had sufficient good sense to censor viable alternatives, including Lunsford's paper, which was removed from arXiv even after it was published in a peer-reviewed journal.

 
At 3:40 AM, Blogger nige said...

On the negative side of advance

I have some empathy with one danger feared by people like Lubos Motl and probably Ed Witten, Peter Woit, Lee Smolin, Lenny Susskind:

the destruction of rigor in physics by crackpotism.

Actually they should take the occasional glance in the mirror when complaining about "crackpotism". Usually "crackpotism" can be disproved and is WRONG, which is far better than string theory, which is non-falsifiable and thus NOT EVEN WRONG.

But probably they falsely think that people such as myself want to destroy physics.

Well, if string theory is your definition of "physics" then according to your crackpot definition, yes I would like to see string theory derated. But I don't see any physics in "string theory" which is mathematical crackpotism and is totally different to mathematical sanity in say the checked applications of GR and the SM (although both mainstream SM and GR are incomplete, lacking evidence for symmetry breaking mechanisms and Yang-Mills dynamics for quantum gravity).

If string theory does go down, there will possibly be a small loss, because some of the people in physics may not "like" the physical dynamics of the universe, which Assistant Professor Lubos Motl illustrates by the Rube-Goldberg machine (a whole series of physical mechanisms, one causing the other, and so on; see http://motls.blogspot.com/2006/10/rube-goldberg-machine-video.html ).

Those people who may include Ed Witten and Lenny Susskind, can just take a hike into the maths department or the astronomy department or the computer department.

If they don't like the universe as it is, that's their problem, and they don't have to seek to replicate the Final Solution of Hitler by exterminating other people's work from arXiv.org and suppressing EVIDENCE which shows them to be wrong.

If they want to go on believing in M-theory or S-matrix or Steady State Cosmology or the Ptolemaic Universe or Flat Earth Theory or Divine Intervention or Kepler's magnetic force holding planets in orbit around the sun, or Newton's fiddled "theory" of sound waves which is just dimensional and ignores the adiabatic effect, or Maxwell's FALSE mechanical gear cog and idler wheel displacement current in a light ray, or ESP or UFOs or the Loch Ness monster or fairies at the bottom of the garden, then fine. (There is allegedly no Thought Police to define/stop/punish drivel-type insanity in a democratic system.) But those people don't have any democratic right to use that religious belief type stuff, be it a mathematical Calabi-Yau manifold or Biblical Creationism, to suppress science on arXiv.org

In the long run, people can adjust to a Rube-Goldberg causal universe just as they have adjusted to an extra dimensional stringy one in the past.

It can't make the physics crisis much worse. If done reasonably, demanding evidence and Popperian checks, it will not sink physics in crackpot noise and error. And it may increase interest in physics.

I hope someone in string like Brian Greene http://www.pbs.org/wgbh/nova/elegant/greene.html will write a 400 page book called "The INelegant Universe: The world as a Rube-Goldberg Machine" to offset harm from the stringy drivel.

 
At 8:23 AM, Blogger nige said...

Lubos Motl reports Hawking is now in Hollywood directing a lying film about 11 dimensional string theory being true.

http://motls.blogspot.com/2006/10/beyond-horizon-hawking-in-imax-cinemas.html

Dear Lumo,

Thanks for this exciting news!

It should brane-wash the few remaining string disbelievers out there, and do-in physics completely.

Actually, believing in 11 dimensions is probably less dangerous than Muslim beliefs in killing all critics. (No offense to Islamic extremists, so don't now crash planes in my neighbourhood please.)

So I'm quite tolerant towards Hawking. I don't think he should be arrested for insanity.

Best,
nc | Homepage | 10.16.06 - 11:16 am | #

 
At 9:50 AM, Blogger nige said...

Did string theory kill Feynman??


http://motls.blogspot.com/2006/10/beyond-horizon-hawking-in-imax-cinemas.html

In Mlodinow’s book Feynman’s Rainbow, in chapter 13, with Feynman hating stringy s***:

“This whole discussion is pointless! It’s getting on my nerves! I told you - I don’t want to talk about string theory!”
Censored Lubos Motl fan | Homepage | 10.16.06 - 12:33 pm | #


http://www.math.columbia.edu/~woit/wordpress/?p=89#comment-1037

JC Says:

October 11th, 2004 at 1:50 pm

From some folklore stories I vaguely recall, allegedly Richard Feynman finally gave in and decided to learn string theory in the last few months of his life in 1987-1988.

D R Lunsford Says:

October 11th, 2004 at 2:15 pm

And I thought it was cancer. Live and learn!

Chris Oakley Says:

October 12th, 2004 at 11:25 am

No it wasn’t ... String theory is what killed him.



Censored Lubos Motl fan | Homepage | 10.16.06 - 12:39 pm | #

 
At 10:17 AM, Blogger nige said...

Lumos deleted above comments, so we try again as good scientists must each time they fail:


http://motls.blogspot.com/2006/10/beyond-horizon-hawking-in-imax-cinemas.html


Excellent post!
nigel cook | Homepage | 10.16.06 - 1:07 pm | #

BTW, will real string be used, or props? Also, will real Hawking radiation be demonstrated?
nigel cook | Homepage | 10.16.06 - 1:10 pm | #

 
At 10:35 AM, Blogger nige said...

Lubos Motl is deleting other stuff too, such as my slightly disappointed remarks about egotism (and no solid result) from Hawkrose and Penman


http://motls.blogspot.com/2006/10/precision-black-hole-measurements.html

Don't believe Hawkrose-Penman! Name one, just ONE, Nobel prize they have received for checked predictions!

A particle has a black hole area for gravitational interactions: proof at http://feynman137.tripod.com/#h

I calculate gravity two ways, as two different versions of the same mechanism (radiation pressure and Dirac sea perfect fluid type pressure). One method (fluid pressure) uses a mathematical trick which gets around needing to put the cross-section for gravity interactions into the calculation, but predicts G accurately.

The shielding area of an electron so calculated equals Pi(2GM/c^2)^2.

This is the cross-section of a black hole event horizon.

This does not prove what an electron is at the black hole level, but it gives some clues: it is possibly an electromagnetic Heaviside energy wave trapped in a small loop by gravity (obviously all negative electric field Heaviside energy, not an oscillating light wave).
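As a minimal numerical sketch of the cross-section quoted above (just plugging the electron mass into Pi(2GM/c^2)^2; nothing here depends on the mechanism itself, and the constants are standard values):

    # Sketch: evaluate the quoted shielding cross-section pi*(2GM/c^2)^2 for an electron.
    import math

    G   = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
    c   = 2.998e8       # speed of light, m/s
    m_e = 9.109e-31     # electron mass, kg

    r_s  = 2 * G * m_e / c**2      # black hole event horizon radius for the electron mass
    area = math.pi * r_s**2        # the cross-section quoted in the comment
    print(f"r_s ~ {r_s:.2e} m, cross-section ~ {area:.2e} m^2")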

This means that we have to look at Dr Thomas Love's theorem. Love (California State Uni) shows that if you set the kinetic energy of a planet equal to the gravitational potential energy, ie, (1/2)mv^2 = mMG/R, that gives Kepler's law!!!

See http://nige.wordpress.com/2006/0...kinetic-energy/

Extending this to the electron as a c-speed Heaviside wave gravity-trapped loop and we get mc^2 = mMG/R where M is mass of universe and R is a fraction of the radius of the universe which corresponds to the effective distance of the mass radially outward from us.
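For what it's worth, here is a small check (my sketch, not Love's derivation) that the quoted energy balance reproduces the Kepler scaling: with (1/2)mv^2 = mMG/R one gets v = sqrt(2GM/R), and a circular period T = 2*Pi*R/v then satisfies T^2 proportional to R^3, which is the Kepler law scaling (the numerical constant differs by a factor of two from the standard circular-orbit relation v^2 = GM/R).

    # Check that (1/2)mv^2 = mMG/R gives the Kepler scaling T^2 proportional to R^3.
    import math

    G, M = 6.674e-11, 1.989e30            # using the Sun's mass as the central body
    for R in (1.0e11, 2.0e11, 4.0e11):    # three orbital radii in metres
        v = math.sqrt(2 * G * M / R)      # speed from the quoted energy balance
        T = 2 * math.pi * R / v           # period of a circular path of radius R
        print(f"R = {R:.1e} m, T^2/R^3 = {T**2 / R**3:.3e} s^2 m^-3")  # constant ratio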

Anyway, there are no singularities in black hole electrons: nothing exists below that size!
nc | Homepage | 10.13.06 - 4:53 pm | #


By the way, I get the shielding area by effectively comparing the fluid mechanism prediction (which predicts G without needing shielding area) to the Yang-Mills radiation mechanism.

The radiation mechanism does NOT predict G unless the shielding cross-section is put in by hand.

Normalizing the radiation mechanism to give the value of G that the fluid mechanism gives, yields the shielding area Pi(2GM/c^2)^2.

So the gravity mechanism calculations immediately give two independent predictions: G and the gravitational size of the electron "core".

The two mechanisms are duals in so much as the gauge boson radiations will spend part of their existence as charged particle pairs in loops in space (Dirac sea).

Each mechanism (the fluid of charged particles, and the radiation) in my calculation is assumed to be 100% responsible for gravity, so they are a dual of one another. In reality, if the contribution to gravity from radiation pressure is 100f %, the contribution from Dirac sea pressure will be 100(1-f) %, so the sum of both mechanisms in practice is the same as either mechanism considered to be the complete cause separately.

Nigel
nc | Homepage | 10.13.06 - 5:01 pm | #

 
At 1:03 PM, Blogger nige said...

Lubos deleted a previous attempt. Another comment:


According to the Foreword of Hawking and Leonard Mlodinow, A Briefer History of Time, Bantam, 2005, p1:

"A brief History of Time was on the London Sunday Times best-seller list for 237 weeks and has sold about one copy for every 750 men, women, and children on earth."

Where would string be without him? :-)

nigel cook | Homepage | 10.16.06 - 3:57 pm | #

 
At 1:35 AM, Blogger nige said...

Lubos has let the previous comment remain! ;-)

Now another:

http://motls.blogspot.com/2006/10/jihad-on-mass-ave.html

Quantoken,

You are right about the Stefan-Boltzmann radiation law, but in a nuclear explosion 80% of the light/heat is emitted after the shock wave has cooled below 3000 K. The reason is that the initial flash of x-rays from the bomb heats and compresses nearby air so much, before it can expand, that red-brown colored nitrogen dioxide forms (this gives the fireball its rust-like color), which absorbs most further heat and light. Hence the nitrogen dioxide formation cuts out the initial pulse, and it is only after the fireball cools (by expanding until it is below 3000 K) that the nitrogen dioxide stops being formed and is engulfed by the expanding fireball edge, so thermal radiation peaks again.

In the Mike H-bomb, a 10 megaton bomb, it took 3.25 seconds until the final flash peak occurred. The time scales in proportion to the square-root of the bomb power.

The result is that there isn't as much difference between a chemical and a nuclear explosion as you might expect as regards light/heat; although chemical explosions do emit slightly less light/heat, it is not a vast difference in percentage terms.
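A minimal sketch of the scaling stated above, normalised to the quoted 3.25 s final-peak time for a 10 megaton burst (the yield values below are arbitrary examples):

    # Time to the final thermal-pulse peak, scaled as the square root of yield and
    # normalised to the quoted 3.25 s for the 10 Mt Mike shot.
    for yield_Mt in (0.02, 0.1, 1.0, 10.0, 20.0):
        t_peak = 3.25 * (yield_Mt / 10.0) ** 0.5    # seconds
        print(f"{yield_Mt:6.2f} Mt -> final peak at about {t_peak:.2f} s")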

Lubos,

Regards God mythology, I'm planning to give my revised paper on SM and GR mechanism the title:

Loop Quantum Gravity and QFT: The Universe as Rube-Goldberg Machine, so Goodbye Mr God!

Do you think people will like it? ;-)

nigel cook | Homepage | 10.17.06 - 4:28 am | #

 
At 7:55 AM, Blogger nige said...

Copy of a comment to Clifford's blog:

http://asymptotia.com/2006/10/16/manifold-yau/#comment-2155

nc
Oct 17th, 2006 at 7:46 am

Howdi Clifford,

I'm [widely regarded as] a moronic crackpot, so maybe you can help with a question on the Calabi-Yau manifold?

Suppose GR is as Lunsford investigates 6 dimensional (3 distance-like dimensions, 3 time-like dimensions).

All I need is know is whether 10-d superstring is compatible with 6-d GR instead of the usual 4-d GR, ie, what if anything is known about 4-d Calabi-Yau manifolds?

Cheers,
nc

Reference links

Lunsford 6-d unification of Maxwell and GR: http://cdsweb.cern.ch/search.py?recid=688763&ln=en

Suppression of peer-reviewed published paper on 3 time dimensions from arXiv: http://www.math.columbia.edu/~woit/wordpress/?p=128#comment-1932

Interpretation of Lunsford's finding that CC = 0 in terms of a Yang-Mills exchange radiation theory of gravity: http://discovermagazine.typepad.com/horganism/2006/10/the_end_of_stri.html#comments

(1) Over short distances, any Yang-Mills quantum gravity will be unaffected because the masses aren’t receding, so exchange radiation won’t be upset.

(2) But over great distances, recession of galaxies will cause problems in QFT gravity that aren’t physically included in general relativity.

I don’t know if gauge bosons are redshifted with constant velocity or if they are slowed down due to recession, being exchanged less frequently when masses are receding from one another.

It doesn't matter: either way, it’s clear that between two masses receding from one another at a speed near c, the force will be weakened. That’s enough to get gravity to fade out over cosmic distances.

This means G goes to zero for cosmology sized distances, so general relativity fails and there is no need for any cosmological constant at all, CC = 0.

Lambda (the CC) -> 0, when G -> 0. Gravity dynamics which predict gravitational strength and various other observable and further checkable phenomena, are consistent with the gravitational-electromagnetic unification in which there are 3 dimensions describing contractable matter (matter contracts due to its properties of gravitation and motion), and 3 expanding time dimensions (the spacetime between matter expands due to the big bang according to Hubble’s law). Lunsford has investigated this over SO(3,3): http://cdsweb.cern.ch/search.py?recid=688763&ln=en


BTW Clifford, if this is off-topic I'm copying this comment to my blog so you are free to discuss there if you prefer (and delete this comment naturally). I've some experimental evidence (rejected by Nature) that supersymmetry is flawed because the correct way to achieve unification is through a representation of energy conservation of the various fields, but I don't want to rule out supersymmetry completely until I know what the Calabi-Yau is like if it is 4-d and not 6-d.

 
At 7:58 AM, Blogger nige said...

Copy of a comment to John's blog:

http://discovermagazine.typepad.com/horganism/2006/10/why_brian_josep.html#comment-23970903

Thanks for this post, which is very interesting. So it was the Copenhagen Interpretation to blame?

The Josephson Junction led to practical high-sensitivity magnetic field sensors, SQUIDs,
http://en.wikipedia.org/wiki/SQUID , but the quantum weirdness based on Cooper pairs and quantum tunnelling doesn't validate ESP.

The failure is ultimately in classical physics, which should be formulated with inbuilt indeterminacy for the 3+ body problem (which leads to chaos, as Poincare discovered). The whole myth of classical physics being somehow deterministic (Maxwell and GR) is based on ignoring this:

‘... the ‘inexorable laws of physics’ ... were never really there ... Newton could not predict the behaviour of three balls ... In retrospect we can see that the determinism of pre-quantum physics kept itself from ideological bankruptcy only by keeping the three balls of the pawnbroker apart.’ – Tim Poston and Ian Stewart, Analog, November 1981.

Professors David Bohm and J. P. Vigier in their paper ‘Model of the Causal Interpretation of Quantum Theory in Terms of a Fluid with Irregular Fluctuation’ (Physical Review, v 96, 1954, p 208), showed that the Schroedinger equation of quantum mechanics arises as a statistical description of the effects of Brownian motion impacts on a classically moving particle. However, the whole Bohm approach is wrong in detail, as is the attempt of de Broglie (his ‘non-linear wave mechanics’) to guess a classical potential that mimics quantum mechanics on the small scale and deterministic classical mechanics at the other size regime.

The actual cause of the Brownian motion is explained by Feynman in his QED lectures to be the vacuum 'loops' of virtual particles being created by pair production and then annihilated in the small spaces in intense fields within the atom. Feynman:

‘... when the space through which a photon moves becomes too small (such as the tiny holes in the screen) ... we discover that ... there are interferences created by the two holes, and so on. The same situation exists with electrons: when seen on a large scale, they travel like particles, on definite paths. But on a small scale, such as inside an atom, the space is so small that ... interference becomes very important.’ (Feynman, QED, Penguin, 1985.)

The tragedy is that Bohm ignored the field fluctuations when he tried to invent "hidden variables", which were unnecessary and false, and failed when tested by the Aspect check on Bell's inequality. The Dirac sea correctly predicted antimatter. It is clear from renormalization of charge and mass in QFT that the Dirac sea only appears to become real at electric fields over about 10^18 volts/metre, which corresponds to the "infrared (IR) cutoff", ie the threshold field strength to create an electron + positron pair briefly in vacuum. The existence of pairs of charges being created and annihilated in quantum field theory only appears real between the IR cutoff and an upper limit "ultraviolet (UV) cutoff", which is needed to stop the charges in the loops having so much momentum that the field is unphysical. All this is just a mathematical illusion, due to QFT ignoring discontinuities and assuming Heisenberg's uncertainty principle is metaphysical (creating something from nothing) instead of describing the energy of a discrete background Dirac sea of particles which gain energy from the external field they are immersed in:

‘What they now care about, as physicists, is (a) mastery of the mathematical formalism, i.e., of the instrument, and (b) its applications; and they care for nothing else.’ – Karl R. Popper, Conjectures and Refutations, R.K.P., 1969, p100.

‘… the Heisenberg formulae can be most naturally interpreted as statistical scatter relations, as I proposed [in the 1934 German publication, ‘The Logic of Scientific Discovery’]. … There is, therefore, no reason whatever to accept either Heisenberg’s or Bohr’s subjectivist interpretation of quantum mechanics.’ – Sir Karl R. Popper, Objective Knowledge, Oxford University Press, 1979, p. 303.

Hence, statistical scatter gives the energy form of Heisenberg’s equation, since the vacuum is full of gauge bosons carrying momentum like light, and exerting vast pressure; this gives the foam vacuum.
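As a rough check on the pair-production threshold field mentioned above, here is a minimal sketch computing the Schwinger critical field E_c = m^2 c^3 / (e*hbar), which comes out at roughly 1.3 x 10^18 V/m:

    # Schwinger critical field for electron-positron pair creation: E_c = m^2 c^3 / (e*hbar).
    m_e  = 9.109e-31    # electron mass, kg
    c    = 2.998e8      # speed of light, m/s
    e    = 1.602e-19    # elementary charge, C
    hbar = 1.055e-34    # reduced Planck constant, J s

    E_c = m_e**2 * c**3 / (e * hbar)
    print(f"Schwinger critical field ~ {E_c:.2e} V/m")    # about 1.3e18 V/m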

Posted by: nc | October 16, 2006 at 02:12 PM

 
At 8:22 AM, Blogger nige said...

http://asymptotia.com/2006/10/16/manifold-yau/#comment-2156


Jacques Distler
Oct 17th, 2006 at 7:52 am

There is precisely one Calabi-Yau manifold of real dimension 4. It’s called K3.

It is very well studied, both in physics and mathematics.

3 nc
Oct 17th, 2006 at 8:08 am

Hi Jacques,

Thank you very much for this K3 name. It is stated on http://en.wikipedia.org/wiki/K3_manifold that:

“In mathematics, in the field of complex manifolds, a K3 surface is an important and interesting example of a compact complex surface (complex dimension 2 being real dimension 4).

“Together with two-dimensional complex tori, they are the Calabi-Yau manifolds of dimension two. Most K3 surfaces, in a definite sense, are not algebraic. This means that, in general, they cannot be embedded in any projective space as a surface defined by polynomial equations. However, K3 surfaces first arose in algebraic geometry and it is in this context that they received their name — it is after three algebraic geometers, Kummer, Kähler and Kodaira, alluding also to the mountain peak K2 in the news when the name was given during the 1950s. …

“K3 manifolds play an important role in string theory because they provide us with the second simplest compactification after the torus. Compactification on a K3 surface preserves one half of the original supersymmetry.”

It also refers to http://www.cgtp.duke.edu/ITP99/morrison/cortona.pdf which is almost unintelligible [to my level of maths] and http://arxiv.org/abs/hep-th/9611137 which looks similar. I’ll take a closer look when I have time.

Many thanks,
nc

 
At 11:52 AM, Blogger nige said...

http://motls.blogspot.com/2006/10/emperor-of-math.html

Dear Lumo,

Suppose once upon a time in a fairytale, in a distant parallel universe codenamed "DRL", there was a GR theory with 3 contractable matter dimensions and 3 expanding time dimensions, ie 6 dimensions.

Suppose they discovered 10-d supersymmetry and needed to roll up the remaining 4-d into a Calabi-Yau manifold. Would this work? I mean presumably the landscape would be smaller than 10^500 because the Calabi-Yau manifold is then 4-d not 6-d?

Is there any list anywhere of the number of solutions for different numbers of dimensions in the Calabi-Yau manifold? Jacques has pointed out that 4-d Calabi Yau manifolds are well studied as K3:

http://asymptotia.com/2006/10/16/manifold-yau/#comment-2156

Unfortunately, I can't understand any of the Wiki papers very easily, because I haven't been trained in that type of maths.

Best,
nigel cook | Homepage | 10.17.06 - 2:44 pm | #

--------------------------------------------------------------------------------

Actually Jacques says there is ONLY ONE manifold of that type, but I don't know how many solutions that one manifold actually has?
nigel cook | Homepage | 10.17.06 - 2:46 pm | #

 
At 1:08 PM, Blogger nige said...

Comment copy in case Clifford needs to edit/cut it a bit:

http://asymptotia.com/2006/10/16/not-in-tower-records/#comment-2182

nc Oct 17th, 2006 at 12:52 pm

Congratulations! Sounds very interesting.

In the post about the DVD you touch on spirituality and quantum theory a bit. Do you agree with Feynman's claim that path integrals are due to interference by virtual charges in the loops which occur out to 10^-15 m from an electron (the [IR] cutoff, distance corresponding to 0.51 MeV electron collision energy closest approach)?

‘... when the space through which a photon moves becomes too small (such as the tiny holes in the screen) ... we discover that ... there are interferences created by the two holes, and so on. The same situation exists with electrons: when seen on a large scale, they travel like particles, on definite paths. But on a small scale, such as inside an atom, the space is so small that ... interference becomes very important.’ - R. P. Feynman, QED, Penguin Books, London, 1985.

Also, there is news of a new film about 11 dimensions starring Stephen Hawking:

http://www.cambridge-news.co.uk/news/city/2006/10/16/2367bc3d-644d-42e9-8933-3b8ccdded129.lpf

Hawking's Brief History of Time sold one copy for every 750 men, women and children on the planet and was on the Sunday Times bestseller list 237 weeks, according to page 1 of A briefer History of Time which seems to be the same text but has beautiful illustrations of photon and electron interference (pp 96, 98), and a nice simple illustration of the Yang-Mills recoil force mechanism (p 119).

Pages 118-9 state: "... the forces or interactions between matter particles are all supposed to be carried by particles. What happens is that a matter particle, such as an electron or a quark, emits a force-carrying particle. The recoil from this emission changes the velocity of the matter particle, for the same reason that a cannon rolls back after firing a cannonball. The force-carrying particle then collides with another matter particle and is absorbed, changing the motion of that particle. The net result of the process of emission and absorption is the same as if there had been a force between the two matter particles.

"Each force is transmitted by its own distinctive type of force-carrying particle. If the force-carrying particles have a high mass, it will be difficult to produce and exchange them over a large distance, so the forces they carry will have only a short range. On the other hand, if the force-carrying particles have no mass of their own, the forces will be long-range..."

Do you agree with popularization of the Yang-Mills theory by the cannon ball analogy? I do, but that's because I've worked out how attractive forces can result from this mechanism, and how to predict stuff with it. However, I know this makes some people upset, who don't want to deal with a Rube-Goldberg machine type universe because it gets rid of God.

Best,
nc

 
At 2:30 PM, Blogger nige said...

http://asymptotia.com/2006/10/16/not-in-tower-records/#comment-2193

nc
Oct 17th, 2006 at 2:19 pm

Second correction to my comment: “Do you agree with Feynman’s claim that path integrals are due to interference by virtual charges in the loops...” should read: “Do you agree with Feynman’s claim that the chaotic nature of sub-atomic sized path integrals are due to interference by virtual charges in the loops...”

Sorry again!

 
At 2:43 PM, Blogger nige said...

Lubos has replied:

http://motls.blogspot.com/2006/10/emperor-of-math.html

Dear NC,

according to the normal definitions, a higher number of time dimensions than one implies the existence of closed time-like curves that violate causality and allow you to convince your mother to have an abortion before you're born, which is a contradiction, using the terminology of Sidney Coleman.

That's one of the problems with Danny Ross Lunsford's 3+3D theories as well as all other theories with two large times or more.

Even if the signature of the Universe were 7+3, you would have to compactify or otherwise hide 4+2=6 dimensions to get realistic physics.

Jacques is right that all 4-real-dimensional Calabi-Yau manifolds are homeomorphic to a K3 manifold: it's a proven theorem. The only other smooth topology, if you have a more tolerant definition of a CY manifold, would be a 4-torus.

The possible Ricci-flat geometries on a K3 manifold form a 57-real dimensional moduli space isomorphic to SO(19,3,Z)SO(19,3)/SO(19) x SO(3). All of the solutions are continuously connected with each other.

Dear Charles, I probably mismeasured the measurement of the sign of your correction of Overbye's sentence . Thanks anyway.

All the best
Lubos
Lubos Motl | Homepage | 10.17.06 - 5:26 pm | #

--------------------------------------------------------------------------------

Dear Lumos,

Thanks for the 4-torus idea! The 3 time dimensions are the orthogonal dimensions of empty and expanding space between masses, so they can't form closed loops. The 3 distance-like dimensions describe the dimensions of matter, which is non-expanding and indeed contractable ;-)

Best,
nc
nigel cook | Homepage | 10.17.06 - 5:39 pm | #

 
At 1:59 AM, Blogger nige said...

http://brahms.phy.vanderbilt.edu/~rknop/blog/?p=108#comment-7583

nc Says: Your comment is awaiting moderation.

October 18th, 2006 at 2:37 am
Is it true that the CBR is the most perfect blackbody radiation spectrum ever observed? I heard that claim somewhere. I’m not sure if it is true because I know how Planck got his theory, and it was fiddling the theory to meet the already known curve, which was fairly precisely known even in 1900 from lab measurements. Before Planck’s formula, there were various attempts to construct semi-empirical equations to fit the curve (which failed because the underlying theory couldn’t be rigorously constructed). Basically the Rayleigh-Jeans law came first but fails due to UV problems: http://en.wikipedia.org/wiki/Rayleigh-Jeans_law . Also, what about the ‘new aether drift’ in the CBR spectrum? Why don’t people popularize it as a reference frame for measuring absolute motion? Muller, R. A., ‘The cosmic background radiation and the new aether drift’, Scientific American, vol. 238, May 1978, p. 64-74, [Go through Google here first hit, Scientific American paper abstract]:

“U-2 observations have revealed anisotropy in the 3 K blackbody radiation which bathes the universe. The radiation is a few millidegrees hotter in the direction of Leo, and cooler in the direction of Aquarius. The spread around the mean describes a cosine curve. Such observations have far reaching implications for both the history of the early universe and in predictions of its future development. Based on the measurements of anisotropy, the entire Milky Way is calculated to move through the intergalactic medium at approximately 600 km/s.”

More: http://en.wikipedia.org/wiki/Talk:Herbert_Dingle#Disgraceful_error_on_article_page

http://en.wikipedia.org/wiki/Herbert_Dingle
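Referring back to the Rayleigh-Jeans versus Planck point above, here is a minimal sketch comparing the two spectral radiance laws at the CBR temperature of 2.725 K; it shows that the two agree at low frequency and that Rayleigh-Jeans blows up at high frequency (the ultraviolet catastrophe Planck's formula cured):

    # Compare Rayleigh-Jeans and Planck spectral radiance at the CBR temperature.
    import math

    h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # Planck constant, speed of light, Boltzmann constant
    T = 2.725                                  # CBR temperature, K

    for nu in (1e9, 1e10, 1e11, 1e12):                                  # frequency, Hz
        rj     = 2 * nu**2 * k * T / c**2                               # Rayleigh-Jeans law
        planck = (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))  # Planck law
        print(f"nu = {nu:.0e} Hz: RJ = {rj:.3e}, Planck = {planck:.3e} (W m^-2 Hz^-1 sr^-1)")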

 
At 6:30 AM, Blogger nige said...

{For a while, up to 2pm London time on 18 Oct 2006, this electrogravity.blogger.com site was down, and when the root site blogger.com returned I then had to republish this weblog to get it restored.}


Update:

http://discovermagazine.typepad.com/horganism/2006/10/the_end_of_stri.html

Thanks for keeping the comment above! I've now got some dialogue from string theorists Prof. Jacques Distler and Ass. Prof. Lubos Motl, which is a step in the right direction.

On Prof. Clifford Johnson's blog, Prof. Jacques Distler replied to my question regards the compatibility of Lunsford and superstring:
http://asymptotia.com/2006/10/16/manifold-yau/#comment-2156

Because Lunsford has 3 orthogonal time dimensions (describing basically empty, non-contractable, expanding volumes of space) and 3 orthogonal contractable distance dimensions (describing matter like rulers, hence if you want to measure the distance in empty space and use a ruler then you are not actually measuring the space, you are measuring contractable matter - ie the ruler), and unifies GR and electromagnetism in the limit where the cosmological constant disappears, it's evident that to preserve supersymmetry you would need a 4-d Calabi-Yau manifold.

Jacques says the 4-d Calabi-Yau is just a single thing called a K3 manifold, which Wiki says is: "the second simplest compactification after the torus. Compactification on a K3 surface preserves one half of the original supersymmetry."

Next, Lubos says:

http://www.haloscan.com/comments/lumidek/116109989645197647/#626354

Dear NC,

according to the normal definitions, a higher number of time dimensions than one implies the existence of closed time-like curves that violate causality and allow you to convince your mother to have an abortion before you're born, which is a contradiction, using the terminology of Sidney Coleman.

That's one of the problems with Danny Ross Lunsford's 3+3D theories as well as all other theories with two large times or more.

Even if the signature of the Universe were 7+3, you would have to compactify or otherwise hide 4+2=6 dimensions to get realistic physics.

Jacques is right that all 4-real-dimensional Calabi-Yau manifolds are homeomorphic to a K3 manifold: it's a proven theorem. The only other smooth topology, if you have a more tolerant definition of a CY manifold, would be a 4-torus.

The possible Ricci-flat geometries on a K3 manifold form a 57-real dimensional moduli space isomorphic to SO(19,3,Z)SO(19,3)/SO(19) x SO(3). All of the solutions are continuously connected with each other. ...

All the best
Lubos
Lubos Motl | Homepage | 10.17.06 - 5:26 pm | #


The claim that 3 orthogonal time dimensions can form closed loops is naughty of Lubos, since it is implicitly ruled out by the dynamics of the model. The 3 time dimensions are the orthogonal dimensions of empty and expanding space between masses, so they can't form closed loops. The 3 distance-like dimensions describe the dimensions of matter, which is non-expanding and indeed contractable.

In conventional superstring theory, there are 10-d and to make that work with a 4-d GR you roll up 6-d into the Calabi-Yau manifold. Hence for Lunsford's 6-d GR unification, you would need to roll up only 4-d.

This is totally different to the Kaluza-Klein [pseudo] 5-d unification where you have 1 extra distance dimension rolled up. Lunsford is adding two extra time dimensions, not two extra distance dimensions.

I'm not sure that supersymmetry is correct. The usual role of it in getting rid of UV divergences due to massive loops and unifying everything near the Planck energy scale, is extremely extravagant since it postulates an unobserved superpartner for every observed partner, and can't predict anything checkable about the superpartners (such as their exact energy, etc.). I think a lot can be done to understand unification by working at the problem the other way, and asking questions about the conservation of gauge boson energy when the electromagnetic field is attenuated by vacuum polarization above the IR cutoff. That energy presumably goes into creating short range nuclear forces via vacuum dynamics. If so, that would predict unification automatically because when say EM force becomes very much stronger at higher energy (closer distance) due to less vacuum polarization separating the observer from the particle core, the amount of attenuated force causing nuclear forces will start to decline because there is less attenuation of EM to provide the energy to power strong nuclear interactions. Hence, perfect unification should occur anyway, just on the principle of conservation of energy in gauge boson fields.

However, I could be wrong about this. Pauli used energy conservation with a beta decay "anomaly" to predict the neutrino. Bohr used the same anomaly to argue that energy conservation fails to work in beta decay, and at first dismissed Pauli's prediction as speculative. However, Bohr was wrong. See http://cosmicvariance.com/2006/10/03/the-trouble-with-physics/#comment-124189

Posted by: nc | October 18, 2006 at 06:42 AM


http://asymptotia.com/2006/10/16/not-in-tower-records/#comment-2182


nc
Oct 17th, 2006 at 12:52 pm

Congratulations! Sounds very interesting.

In the post about the DVD you touch on spirituality and quantum theory a bit. Do you agree with Feynman’s claim that [the chaotic nature of sub-atomic sized path integrals is] due to interference by virtual charges in the loops, which occur out to about 10^-15 m from an electron (the [IR] cutoff; this distance corresponds to the closest approach in a 0.51 MeV electron collision)?

‘... when the space through which a photon moves becomes too small (such as the tiny holes in the screen) ... we discover that ... there are interferences created by the two holes, and so on. The same situation exists with electrons: when seen on a large scale, they travel like particles, on definite paths. But on a small scale, such as inside an atom, the space is so small that ... interference becomes very important.’ - R. P. Feynman, QED, Penguin Books, London, 1985.
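
As a quick sanity check on that 10^-15 m figure (my own rough Python estimate, not Feynman's, assuming the cutoff distance is simply the classical closest approach at which the Coulomb potential energy equals the 0.511 MeV collision energy):

# Closest approach r where e^2/(4*pi*eps0*r) = 0.511 MeV
hbar_c = 1.97327e-7    # hbar*c in eV*m (i.e. 197.327 MeV*fm)
alpha = 1 / 137.036    # fine structure constant
E = 0.511e6            # collision energy in eV
r = alpha * hbar_c / E # uses e^2/(4*pi*eps0) = alpha*hbar*c
print(r)               # ~2.8e-15 m, i.e. of order 10^-15 m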

Also, there is news of a new film about 11 dimensions starring Stephen Hawking:

http://www.cambridge-news.co.uk/news/city/2006/10/16/2367bc3d-644d-42e9-8933-3b8ccdded129.lpf

Hawking’s Brief History of Time sold one copy for every 750 men, women and children on the planet and was on the Sunday Times bestseller list for 237 weeks, according to page 1 of A Briefer History of Time, which seems to be the same text but has beautiful illustrations of photon and electron interference (pp. 96, 98), and a nice simple illustration of the Yang-Mills recoil force mechanism (p. 119).

Pages 118-9 state: “... the forces or interactions between matter particles are all supposed to be carried by particles. What happens is that a matter particle, such as an electron or a quark, emits a force-carrying particle. The recoil from this emission changes the velocity of the matter particle, for the same reason that a cannon rolls back after firing a cannonball. The force-carrying particle then collides with another matter particle and is absorbed, changing the motion of that particle. The net result of the process of emission and absorption is the same as if there had been a force between the two matter particles.

“Each force is transmitted by its own distinctive type of force-carrying particle. If the force-carrying particles have a high mass, it will be difficult to produce and exchange them over a large distance, so the forces they carry will have only a short range. On the other hand, if the force-carrying particles have no mass of their own, the forces will be long-range...”

Do you agree with popularization of the Yang-Mills theory by the cannon ball analogy? I do, but that’s because I’ve worked out how attractive forces can result from this mechanism, and how to predict stuff with it. However, I know this makes some people upset, who don’t want to deal with a Rube-Goldberg machine type universe because it gets rid of God.

Best,
nc


http://discovermagazine.typepad.com/horganism/2006/10/why_brian_josep.html


Thanks for this post, which is very interesting. So it was the Copenhagen Interpretation to blame?

The Josephson Junction led to practical high-sensitivity magnetic field sensors, SQUIDs,
http://en.wikipedia.org/wiki/SQUID , but the quantum weirdness based on Cooper pairs and quantum tunnelling doesn't validate ESP.

The failure is ultimately in classical physics, which should be formulated with inbuilt indeterminacy for the 3+ body problem (which leads to chaos, as Poincare discovered). The whole myth of classical physics (Maxwell and GR) being somehow deterministic is based on ignoring this:

‘... the ‘inexorable laws of physics’ ... were never really there ... Newton could not predict the behaviour of three balls ... In retrospect we can see that the determinism of pre-quantum physics kept itself from ideological bankruptcy only by keeping the three balls of the pawnbroker apart.’ – Tim Poston and Ian Stewart, Analog, November 1981.

Professors David Bohm and J. P. Vigier in their paper ‘Model of the Causal Interpretation of Quantum Theory in Terms of a Fluid with Irregular Fluctuation’ (Physical Review, v 96, 1954, p 208), showed that the Schroedinger equation of quantum mechanics arises as a statistical description of the effects of Brownian motion impacts on a classically moving particle. However, the whole Bohm approach is wrong in detail, as is the attempt of de Broglie (his ‘non-linear wave mechanics’) to guess a classical potential that mimics quantum mechanics on the small scale and deterministic classical mechanics at the other size regime.

The actual cause of the Brownian motion is explained by Feynman in his QED lectures to be the vacuum 'loops' of virtual particles being created by pair production and then annihilated in the small spaces in intense fields within the atom. Feynman writes:

‘... when the space through which a photon moves becomes too small (such as the tiny holes in the screen) ... we discover that ... there are interferences created by the two holes, and so on. The same situation exists with electrons: when seen on a large scale, they travel like particles, on definite paths. But on a small scale, such as inside an atom, the space is so small that ... interference becomes very important.’ (Feynman, QED, Penguin, 1985.)

It is a tragedy that Bohm ignored the field fluctuations when he tried to invent "hidden variables", which were unnecessary and false, and failed when tested by the Aspect check on Bell's inequality. The Dirac sea correctly predicted antimatter. It is clear from the renormalization of charge and mass in QFT that the Dirac sea only appears to become real at electric fields over 10^20 volts/metre, which corresponds to the "infrared (IR) cutoff", i.e. the threshold field strength to create an electron + positron pair briefly in the vacuum. The existence of pairs of charges being created and annihilated in quantum field theory only appears real between the IR cutoff and an upper-limit "ultraviolet (UV) cutoff", which is needed to stop the charges in the loops having so much momenta that the field is unphysical. All this is just a mathematical illusion, due to QFT ignoring discontinuities and assuming Heisenberg's uncertainty principle is metaphysical (creating something from nothing), instead of describing the energy of a discrete background Dirac sea of particles which gain energy from the external field they are immersed in:

‘What they now care about, as physicists, is (a) mastery of the mathematical formalism, i.e., of the instrument, and (b) its applications; and they care for nothing else.’ – Karl R. Popper, Conjectures and Refutations, R.K.P., 1969, p100.

‘... the Heisenberg formulae can be most naturally interpreted as statistical scatter relations, as I proposed [in the 1934 German publication, ‘The Logic of Scientific Discovery’]. ... There is, therefore, no reason whatever to accept either Heisenberg’s or Bohr’s subjectivist interpretation of quantum mechanics.’ – Sir Karl R. Popper, Objective Knowledge, Oxford University Press, 1979, p. 303.

Hence, statistical scatter gives the energy form of Heisenberg’s equation, since the vacuum is full of gauge bosons carrying momentum like light, and exerting vast pressure; this gives the foam vacuum.

Posted by: nc | October 16, 2006 at 02:12 PM

 
At 2:24 AM, Blogger nige said...

http://www.math.columbia.edu/~woit/wordpress/?p=475#comment-17846

nc Says: Your comment is awaiting moderation.

October 19th, 2006 at 5:12 am

Surely it is important to think about the motivation behind string theory, i.e., is it based on 'mystical reasons ... the most perfect ... complete model ... passionate...':

Aristotle thought that the earth was stationary ... He believed this because he felt, for mystical reasons, that the earth was the centre of the universe and that circular motion was the most perfect. ... a complete model of the heavens. Ptolemy was passionate about his studies. 'When I follow at my pleasure the serried multitude [he wrote].... my feet no longer touch the earth.'

- Stephen Hawking with Leonard Mlodinow, A Briefer History of Time, Bantam, London, 2005, p. 8.

Is it just possible that some of the enthusiasm for string theory has similar psychological origins? ;-)

 
At 2:21 PM, Blogger nige said...

http://twistedphysics.typepad.com/cocktail_party_physics/2006/10/baby_take_a_bel.html#comment-24130254

In addition to the shell structure magic numbers, it is supposedly impossible to get to element number 137 for theoretical reasons: the short range attractive strong force between nucleons will be exactly balanced by the long-range electromagnetic repulsion of 137 protons!

This assumes that the ratio of the strong inter-nucleon coupling to the electromagnetic coupling is indeed exactly 137. The whole reason for the radioactivity of heavy elements is linked to the increasing difficulty the strong force has in offsetting electromagnetism as you approach 137 protons, accounting for the shorter half-lives. So here is a derivation of the 137 number in the context of the strong nuclear force mediated by pions:

Heisenberg’s uncertainty principle says p*d = h/(2*Pi), where p is the uncertainty in momentum and d is the uncertainty in distance.

This comes from the resolving power of Heisenberg’s imaginary gamma ray microscope, and is usually written as a minimum (instead of with “=” as above), since there will be other sources of uncertainty in the measurement process. The factor of 2 would be a factor of 4 if we consider the uncertainty in one direction about the expected position (because the uncertainty applies to both directions, it becomes a factor of 2 here).

For light wave momentum p = mc, pd = (mc)(ct) = Et where E is uncertainty in energy (E=mc^2), and t is uncertainty in time. OK, we are dealing with massive pions, not light, but this is close enough since they are relativistic:

Et = h/(2*Pi)

t = d/c = h/(2*Pi*E)

E = hc/(2*Pi*d).

Hence we have related distance to energy: this result is the formula used even in popular texts to show that an 80 GeV W+/- gauge boson will have a range of about 10^-17 m. So it’s OK to do this (i.e., it is OK to take the uncertainties in distance and energy to be the real range and energy of the gauge bosons which cause fundamental forces).
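
A quick numerical check of that range claim (my own Python sketch, using d = hbar*c/E, which is just the E = hc/(2*Pi*d) formula above rearranged):

# Range d = hbar*c / E for an 80 GeV gauge boson
hbar_c = 1.97327e-16   # hbar*c in GeV*m
E_W = 80.0             # W boson mass-energy in GeV
d = hbar_c / E_W
print(d)               # ~2.5e-18 m, roughly the order of magnitude quoted in popular texts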

Now, the work equation E = F*d (work is the product of force and the distance moved in the direction of the force), where again E is the uncertainty in energy and d is the uncertainty in distance, implies:

E = hc/(2*Pi*d) = Fd

F = hc/(2*Pi*d^2)

Notice the inverse square law resulting here!

This force is 137.036 times the Coulomb force between unit fundamental charges! This is the usual value given for the ratio of the strong nuclear force to the electromagnetic force (I’m aware that the QCD inter-quark, gluon-mediated force takes different and often smaller values than 137 times the electromagnetic force).
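
To make the 137 explicit (my own Python sketch, comparing the F = hc/(2*Pi*d^2) result above with the Coulomb force between two unit charges at the same distance):

from math import pi

# F = h*c/(2*pi*d^2) = hbar*c/d^2, versus Coulomb's F = e^2/(4*pi*eps0*d^2)
hbar = 1.0545718e-34     # J*s
c = 2.99792458e8         # m/s
e = 1.60217662e-19       # C
eps0 = 8.8541878128e-12  # F/m
d = 1e-15                # any distance (m); it cancels in the ratio

F_uncertainty = hbar * c / d**2
F_coulomb = e**2 / (4 * pi * eps0 * d**2)
print(F_uncertainty / F_coulomb)  # ~137.036, i.e. 1/alpha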

I first read about this amazing 137 factor in nuclear stability (limiting the number of elements to a theoretical maximum below 137) in Glenn Seaborg’s article ‘Elements beyond 100’ (Annual Review of Nuclear Science, v. 18, 1968), which I came across by accident after getting the volume to read Harold Brode’s article ‘Review of Nuclear Weapons Effects’, which followed Seaborg’s.

I just love the fact that elements 99-100 (Einsteinium and Fermium) were discovered in the fallout of the first Teller-type H-bomb test at Eniwetok Atoll in 1952, formed by successive neutron captures in the U-238 pusher, which was within a 25-cm thick steel outer case according to some reports. Many of the neutrons must have been trapped inside the bomb. (Theodore Taylor said that the density of neutrons inside the bomb reached the density of water!)

‘Dr Edward Teller remarked recently that the origin of the earth was somewhat like the explosion of the atomic bomb…’ – Dr Harold C. Urey, The Planets: Their Origin and Development, Yale University Press, New Haven, 1952, p. ix.

‘It seems that similarities do exist between the processes of formation of single particles from nuclear explosions and formation of the solar system from the debris of a supernova explosion. We may be able to learn much more about the origin of the earth, by further investigating the process of radioactive fallout from the nuclear weapons tests.’

– Dr P.K. Kuroda, ‘Radioactive Fallout in Astronomical Settings: Plutonium-244 in the Early Environment of the Solar System,’ Radionuclides in the Environment (Dr Edward C. Freiling, Symposium Chairman), Advances in Chemistry Series No. 93, American Chemical Society, Washington, D.C., 1970.

Posted by: nc | October 19, 2006 at 05:04 PM

Comment by nigel cook | October 19, 2006

 
At 8:26 AM, Anonymous Anonymous said...

THE DECLINE IN U.K. PHYSICS CORRELATES WITH THE DECLINE IN RELIGION IN THE U.K.

See http://www.vexen.co.uk/religion/rib.html for the background of the fall in religious faith in the U.K.:

PEOPLE ARE STILL INTERESTED IN RELIGION, THEY JUST DON'T ATTEND CHURCH SO MUCH.

This is similar to the physics situation: they buy physics books, but they don't get more deeply involved!

BECAUSE RELIGION IS STRONGER IN THE U.S.A. WE WOULD EXPECT PHYSICS TO BE HEALTHIER, as is indeed the case!

There is thus some evidence that people in the U.K. see physics like a religion, which is due to people like Hawking in the U.K. selling one copy of his book to every 750 people on the planet and remaining in the London Sunday Times best-seller list for 237 weeks.

Because vast segments of humanity in China and suchlike presumably did not buy their proportion of "A Briefer History of Time", it is actually most abundant in the U.K., where there are piles in second-hand bookshops stretching from floor to ceiling. Presumably a fair percentage of the U.K. population bought Hawking's book (or were given it as a present), and it did not convince them to major in physics!

 
At 6:22 AM, Blogger nige said...

Here is a copy of a relevant comment of mine from my other blog, http://nige.wordpress.com/2006/10/20/loop-quantum-gravity-representation-theory-and-particle-physics :

Rather than put up a new post to discuss rubbish, or include it in the present post, I'm going to discuss the errors in Dr Stephen Hawking and Dr Leonard Mlodinow, A Briefer History of Time (Bantam Press, London, 2005, 162pp) here. It is almost entirely sh*t and makes me angry.

The Foreword (p. 1) states that the original Brief History of Time of 1988 'was on the London Sunday Times best-seller list for 237 weeks and has sold about one copy for every 750 men, women and children on earth.'

A disproportionate number of copies were sold in England, where physics has collapsed (although John 'Jupiter Effect' Gribbin, author of the sh*t 1984 In Search of Schroedinger's Cat, together with editors of New Scientist like the self-promoting current editor Jeremy 'out of personal interest' Webb and the egotistic ex-editor Dr Alun 'I'm great because I'm a Green environmentalist' Anderson, richly deserve to share much of the credit with Dr Hawking in the next U.K. Government-sponsored 'kick-physics-in-the-face awards', which will no doubt be named something more dishonest!).

Hawking's and Mlodinow's biggest offense is on p125: 'In string theories, the basic objects are not point particles but things that have a length but no other dimension, like an infinitely thin piece of string.'

How can something be 'infinitely thin'? Is this a joke? Surely the Emperor's New Clothes were woven out of infinitely thin cloth? Clearly, I'm thick, because I think they mean the width is 1/infinity metres = 0 metres. If my bank balance is X, that doesn't mean I have any money, particularly if I have infinitely little money, X = 0. A merely quantitative difference, i.e. the difference between a 0-width string and a 1 mm-width string, is a QUALITATIVE difference, because it is the difference between having the Emperor's New Clothes thread and having real thread! Clearly, even string theorists can't be that dumb, so Hawking and Mlodinow are bad explainers.

Hawking and Mlodinow's second biggest offense is on page 10, where they repeat the damn lie that '... Copernicus['] ... revolutionary idea ... was much simpler than Ptolemy's model...'

This lie was disproved by Koestler in his analysis of the Copernican revolution, The Sleepwalkers, published in 1959. He showed that Copernicus had ~80 epicycles, compared to only ~40 in Ptolemy's model. The liars claim that every new theory is simpler, but that is not true. Sometimes reality has a technical complexity: for example, the 'theory' that 'God created and controls everything' is 'simple' in some sense, but it is not a step forward for science to dump all the complex knowledge and prefer a simpler hypothesis which fits all the facts. What makes science good is PREDICTIONS that can be checked objectively.

On page 14, Hawking and Mlodinow prove how confused they are:

'... you can disprove a theory by finding even a single observation that disagrees with the predictions ... what actually happens is that a new theory is devised that is really an extension of the previous theory.'

These two sentences contradict one another, so either the first needs correction or the second needs deletion (actually the second is correct and the first is wrong).

Page 23 contains another ignorant lie:

'Actually, the lack of an absolute standard of rest has deep implications for physics: it means that we cannot determine whether two events that took place at different times occurred in the same position in space.'

This is a lie because you can find out the locations and absolute times of, say, two supernova explosions from their redshifts, etc., and you can determine motion against an absolute standard from the +/- 3 mK redshift/blueshift in the 2.7 K microwave background due to the earth's absolute motion in space:

http://en.wikipedia.org/wiki/Talk:Herbert_Dingle#Disgraceful_error_on_article_page

... Quantum field theory now clearly demonstrates that this absolute background exists because quantum field theory has a vacuum filled with virtual particles (ground state of Dirac sea), which look different to an observer who is moving than to an observer who is stationary. See [6] page 85:

"In Quantum Field Theory in Minkowski space-time the vacuum state is invariant under the Poincare group and this, together with the covariance of the theory under Lorentz transformations, implies that all inertial observers agree on the number of particles contained in a quantum state. The breaking of such invariance, as happened in the case of coupling to a time-varying source analyzed above, implies that it is not possible anymore to define a state which would be recognized as the vacuum by all observers." (Emphasis added to the disproof of special relativity postulate 1 by quantum field theory.)

As for a background against which to determine absolute motion: the cosmic background radiation is ideal. By measuring time from the big bang, you have absolute time. You can easily work out the corrections for gravitation and motion. It is easy to work out the gravitational field strength, because it causes accelerations which are measurable. Your absolute motion is given by the anisotropy in the cosmic background radiation due to your motion. See

Muller, R. A., 'The cosmic background radiation and the new aether drift', Scientific American, vol. 238, May 1978, p. 64-74 [ http://adsabs.harvard.edu/abs/1978SciAm.238...64M ]:

"U-2 observations have revealed anisotropy in the 3 K blackbody radiation which bathes the universe. The radiation is a few millidegrees hotter in the direction of Leo, and cooler in the direction of Aquarius. The spread around the mean describes a cosine curve. Such observations have far reaching implications for both the history of the early universe and in predictions of its future development. Based on the measurements of anisotropy, the entire Milky Way is calculated to move through the intergalactic medium at approximately 600 km/s."

After all, if the Milky Way has an absolute motion of 600 km/s according to the CBR, that is a small value compared to c, so time dilation is small. Presumably galaxies at immense distances have higher speeds.

The current picture of cosmology is an infinitely big currant bun, expanding in an infinitely big oven with no edges so that each currant moves away from the others with no common centre or "middle".

However, the universe is something like 15,000,000,000 years old, and although the 600 km/s motion of the Milky Way is mainly due to attraction toward Andromeda, which is a bigger galaxy, we can still take 600 km/s as an order-of-magnitude estimate of our velocity since the big bang.

In that case we have moved a distance of about s = vt = (600,000 m/s)t = 0.002ct = 0.002R, where R = ct is the radius of the universe. Hence we are at 0.2% of the radius of the universe from our starting point, or very near the "middle". The problem is that the steady-state (infinite, expanding) cosmology model was only finally discredited in favour of the BB by the discovery of the CBR in 1965, and so people today still tend to hold on to the steady-state vestige which says it is nonsensical to talk about the "middle" of a big bang fireball! In fact, it is perfectly sensible to do so until someone actually goes to a distant galaxy and disproves it, which nobody has. There is plenty of orthodoxy masquerading as fact in cosmology, not just in string theory!
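
A minimal sketch of that arithmetic (Python; it just transcribes the figures above, and assumes the 600 km/s has been held for the whole age of the universe):

# Fraction of the radius R = c*t travelled at v = 600 km/s since the big bang
v = 600e3        # m/s, Milky Way speed relative to the CBR
c = 3e8          # m/s
fraction = v / c # the age t cancels out of s/R = (v*t)/(c*t)
print(fraction)  # 0.002, i.e. about 0.2% of the radius of the universe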

Once you have found your absolute velocity and position in the universe, you can calculate the absolute amount of motion- and gravity-caused time dilation (provided the observer can see the observable distribution of mass around them, and can determine their velocity through the universe from the fact that the CBR temperature is about 0.005 Kelvin hotter in the direction the Milky Way is heading than in the opposite direction, due to blueshift as we move into a radiation field and redshift as we recede from one).

The matter distribution around us tells us how to correct our clocks for time-dilation. Hence, relativity of time disappears, because we can know for absolutely what the time is from time of the BB. (This is similar to the corrections you need to apply when using a sundial, where you have to apply a correction called the "equation of time", for the time of year. For old clocks you would need to correct for temperature because that made the clock run at different rates when it was locally hot or cold. Time dilations are not a variation in the absolute chronology of the universe where time is determined by the perpetual expansion of matter in the big bang. Time dilations only apply to the matter which is moving and/or subject to gravitation. Time dilation to a high energy muon in an accelerator doesn't cause the entire universe to run more slowly, it just slows down the quark field interactions and the muon decays more slowly. There is no doubt that all "relativistic" effects are local!)

Also, you can always tell the absolute time by looking at the recession of the stars. Measure the Hubble constant H, and since the universe isn't decelerating ("... the flat universe is just not decelerating, it isn’t really accelerating..." - Nobel Laureate Phil Anderson, [ http://cosmicvariance.com/2006/01/03/danger-phil-anderson/#comment-10901 ]), the age of the universe is t = 1/H. (If the universe were slowing due to critical density, the Friedmann solution to GR would give not t = 1/H but t = (2/3)/H; however, the reason the classical Friedmann critical-density solution fails is probably that gravitons are redshifted by cosmic expansion, so the quantum gravity coupling falls over vast distances in the expanding universe, preventing gravitational retardation of the expansion. This effect is not accounted for in GR, which ignores quantum gravity; instead an ad hoc cosmological constant is simply added by the mainstream to force GR to conform to the latest observations.)
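
For example (my own Python sketch, using an assumed Hubble constant of about 70 km/s per megaparsec, a value not taken from the text):

# Age of the universe t = 1/H for a non-decelerating expansion
H_kms_per_Mpc = 70.0              # assumed value
Mpc_in_km = 3.0857e19             # kilometres per megaparsec
H = H_kms_per_Mpc / Mpc_in_km     # Hubble constant in 1/s
t_seconds = 1.0 / H
t_years = t_seconds / (3600 * 24 * 365.25)
print(t_years / 1e9)              # ~14 billion years; (2/3)/H would give only ~9.3 billion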

Alternatively, all you need to observe to determine absolute time is the value of the CBR temperature. This tells you the absolute time after the BB, regardless of what your watch says: just measure the ~2.728 Kelvin microwave background.

The average temperature of that is a clock, telling you absolute time. When the temperature of the CBR is below 3000 Kelvin, the universe is matter dominated so:

Absolute time after big bang = [current age of universe].(2.728/T)^1.5, where T is the CBR temperature in Kelvin.

For T above 3000 Kelvin, the universe was of course opaque due to ionisation of hydrogen, so it was radiation dominated, and the formula for time in that era is more strongly dependent on temperature:

Absolute time after big bang = [current age of universe].(2.728/T)^2.

Reference: [ http://hyperphysics.phy-astr.gsu.edu/hbase/astro/expand.htm ]. Although the Friedmann equation used on that page is wrong according to a gravity mechanism [ http://feynman137.tripod.com/ ], the error in it is only a dimensionless multiplying factor of 0.5e^3, so the underlying scaling relationship (ie the power-law dependence between time and temperature) is still correct.
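
Putting both formulae in one place (a minimal Python sketch; it is a direct transcription of the two scaling laws above, using the 15 billion year age assumed earlier, and does not worry about matching the two eras exactly at the 3000 K transition):

def absolute_time_from_cbr(T, current_age_gyr=15.0, current_T=2.728):
    # Rough absolute time after the big bang (in Gyr) from the mean CBR temperature T (in K).
    # t ~ T^-1.5 in the matter-dominated era (T < 3000 K); t ~ T^-2 when radiation dominated.
    exponent = 1.5 if T < 3000 else 2.0
    return current_age_gyr * (current_T / T) ** exponent

print(absolute_time_from_cbr(2.728))  # 15 Gyr, i.e. the present
print(absolute_time_from_cbr(5.456))  # ~5.3 Gyr: at twice today's temperature, about a third the present age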

Defining absolute time from the CBR temperature averaged in every direction around the observer gets away from local time-dilation effects. Of course it is not possible to accurately measure time this way like a clock for small intervals, but the principle is that the expansion of the universe sets up a universal time scale which can in principle be used to avoid the problem of local time dilations.

There is a limitation with the 1887 Michelson-Morley experiment, which caused FitzGerald and Lorentz to come up with the contraction of spacetime to save the aether: it measured effects of the motion of the light receiver, not of the motion of the light emitter. If light speed varies with redshift, then the CBR will be approaching us at a speed of 6 km/s instead of c. This comes from: c x (300,000 years / 15,000,000,000 years) = 0.00002c = 6 km/second, compared to the standard value of c = 300,000 km/second. This would be easily measurable by a simple instrument, confirming what the velocity of severely redshifted light is. For suggested experimental equipment, see: [ http://mrigmaiden.blogspot.com/2006/09/update-to-riofrio-equations-post.html ]. Nigel
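
That arithmetic in Python (just transcribing the figures above):

# Claimed speed of severely redshifted CBR light, if light speed scales with the age ratio
c = 300000.0             # km/s
emission_age = 300000.0  # years (recombination era)
current_age = 15e9       # years
v = c * (emission_age / current_age)
print(v)                 # 6.0 km/s, i.e. 0.00002 c
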
_______________

Another lie in Hawking and Mlodinow is on p. 28, where they claim Maxwell found the speed of electromagnetic waves to 'match exactly the speed of light!' The exposure of this lie (with full references) is in my own Electronics World article published in the U.K. in April 2003: Maxwell BEGAN with Weber's 1856 discovery that the square root of the reciprocal of the product of the electric and magnetic force constants equals c. He then fiddled the theory, making serious farces with mechanical gear-cog and idler-wheel aethers in 1861-2, to obtain a false classical theory for light waves. There was no Eureka! moment, because he never calculated the speed of light from his equations: he put the speed of light in and used it (working backward) to calculate a false (continuous) displacement current formula, the 'extra current' he added to Ampere's law.

Another lie in Hawking and Mlodinow is on p33: '...the theory of relativity requires us to put an end to the idea of absolute time'. See http://en.wikipedia.org/wiki/Talk:Herbert_Dingle#Disgraceful_error_on_article_page for why that is a lie. The next paragraph of Hawking-Mlodinow, worshipping special relativity as a Holier than Holy religion, is total anti-objectivity, physics-hating sh*t.

Another lie in Hawking and Mlodinow is on p. 48: 'In the theory of relativity there is no unique absolute time...' should be corrected to read: 'In the religion of relativity there is no unique absolute time...'

Another deception in Hawking and Mlodinow is on p58: 'The behaviour of the universe could have been predicted from Newton's theory of gravity at any time in the nineteenth, the eighteenth, or even the seventeenth century.' This is a deception because it gives the deceiving idea (for convenient reasons of obfuscation as we shall see) that nobody predicted the big bang. In fact they did!

http://www.math.columbia.edu/~woit/wordpress/?p=273#comment-5322

... please note Erasmus Darwin (1731-1802), father of Charles the evolutionist, first defended the big bang seriously in his 1790 book ‘The Botanic Garden’:

‘It may be objected that if the stars had been projected from a Chaos by explosions, they must have returned again into it from the known laws of gravitation; this however would not happen, if the whole Chaos, like grains of gunpowder, was exploded at the same time, and dispersed through infinite space at once, or in quick succession, in every possible direction.’

Weirdly, Darwin was trying to apply science to Genesis. The big bang has never been taken seriously as an explosion by cosmologists, because they have assumed that curved spacetime makes the universe boundless and suchlike. So a kind of belief system in the vague approach to general relativity has blocked considering it as a 10^55 megatons space explosion. Some popular books even claim falsely that things can’t explode in space, and so on.

In reality, because all gravity effects and light come to us at light speed, the recession of galaxies is better seen as a recession speed varying with known time past, than varying with the apparent distance. Individual galaxies may not be accelerating, but what we see and the gravity effects we receive at light speed come from both distance and time past.

So the acceleration of the universe = variation in recession speeds / variation in time past = c/t = cH, where H is the Hubble constant. The implication of this comes when you know the mass of the universe, m, because then you remember Newton’s 2nd law, F = ma, so you get the outward force. The 3rd law then tells you there’s an equal inward force (Higgs/graviton field). When I do the simple LeSage-Feynman gravity shielding calculations, I get gravity to within 1.7%.
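
A minimal numeric sketch of the first step (Python; the Hubble constant and the mass of the universe here are my own assumed illustrative values, not figures from the text, and the 1.7% shielding calculation itself is not reproduced):

# Outward acceleration a = c*H, and outward force F = m*a from Newton's 2nd law
c = 3e8                  # m/s
H = 70e3 / 3.0857e22     # ~70 km/s/Mpc converted to 1/s (assumed value)
a = c * H                # ~7e-10 m/s^2
m = 3e52                 # kg, an assumed rough mass for the observable universe
F = m * a                # ~2e43 N outward; the 3rd law then implies an equal inward force
print(a, F)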

It is suppressed by arXiv.org, just like Tony Smith’s prediction of the top quark mass.

Another bad deception in Hawking and Mlodinow is on p59: 'In fact, in 1922, several years before Edwin Hubble's discovery, Friedmann predicted exactly what Hubble later found.'

This is a bad deception because:

(1) General relativity doesn't include the redshift of gravity-causing gauge bosons, so it doesn't even describe the universe (a quantum gravity is needed for that!),

(2) As Hawking and Mlodinow themselves admit on pp. 63-4, Friedmann only found the solution to general relativity which falsely claims that the universe will collapse; there is no evidence of this particular solution whatsoever in Hubble's results, so in no conceivable way was Hubble's finding supportive of the particular result which Friedmann 'predicted'.

(3) Friedmann was a crackpot because general relativity (which is wrong for cosmology due to ignoring Yang-Mills quantum gravity exchange dynamics in an expanding universe, like redshift/energy degradation of the exchange radiation) has a LANDSCAPE OF ALL WRONG SOLUTIONS! The wrong solution Friedmann found was just one wrong solution. There were also two other major equally wrong solutions to general relativity for cosmology, which Friedmann did not even mention. In real physics, ALL SOLUTIONS ARE REAL, which is why Dirac had to predict antimatter due to the negative energy solution for the Dirac sea in his equation, see

http://en.wikipedia.org/w/index.php?title=Dirac_sea&oldid=83000933

and

http://cosmicvariance.com/2006/10/03/the-trouble-with-physics/#comment-123700

Another bad deception in Hawking and Mlodinow is on p61: '... Penzias and Wilson had unwittingly stumbled across a striking example of Friedmann's first assumption that the universe is the same in every direction.' This is a lie because as already said:

Muller, R. A., 'The cosmic background radiation and the new aether drift', Scientific American, vol. 238, May 1978, p. 64-74 [ http://adsabs.harvard.edu/abs/1978SciAm.238...64M ]:

"U-2 observations have revealed anisotropy in the 3 K blackbody radiation which bathes the universe. The radiation is a few millidegrees hotter in the direction of Leo, and cooler in the direction of Aquarius. The spread around the mean describes a cosine curve. Such observations have far reaching implications for both the history of the early universe and in predictions of its future development. Based on the measurements of anisotropy, the entire Milky Way is calculated to move through the intergalactic medium at approximately 600 km/s."

Hence, on the basis of the cosmic background radiation, the universe definitely ain't the same in every direction! Hawking and Mlodinow are writing just ignorant, ill-informed, intellectually insulting, badly written rubbish.

Still another bad deception in Hawking and Mlodinow is on p65: 'Our galaxy and other galaxies must also contain a large amount of "dark matter" that we cannot see directly but which we know must be there because of the influence of its gravitational attraction on the orbits of stars in the galaxies.' They miss the point that most if not all of the dark matter may possibly be accounted for by energy considerations such as http://www.gravity.uk.com/galactic_rotation_curves.html (NOTE THAT OTHER PAGES ON www.gravity.uk.com CONTAIN WRONG AND NOT EVEN WRONG SPECULATIVE RELIGIOUS BELIEFS AS I'VE POINTED OUT IN COMMENTS ON PREVIOUS POSTS OF THIS BLOG; www.gravity.uk.com is not my site!).

Still another bad deception in Hawking and Mlodinow is on p102: 'If general relativity is wrong, why have all experiments thus far supported it?' This is due to the LANDSCAPE OF ENDLESS SOLUTIONS you get from general relativity (where you fiddle the cosmological constant and other parameters to meet observed values, so it can't be falsified - except by the fact that it lacks the dynamics of quantum gravity like redshift of gauge boson radiation).

From chapter 10 onwards, the number of errors/misunderstandings of the authors per chapter falls rapidly, although as I commented at the beginning, they either don't understand string theory, or they don't know how to make it sound rational if they do understand it.

 
At 7:27 AM, Anonymous Anonymous said...

(I should add to the last comment the fact that Hawking and Mlodinow don't even mention loop quantum gravity, which has been around since about 1988 when the earlier book came out! What moronic fascists. But then, perhaps there is a parallel universe in Hawking and Mlodinow's big patronising ignorant branes where they are right.)

 
At 7:37 AM, Anonymous Anonymous said...

A DEFINITION OF “FASCISM”: http://feynman137.tripod.com/#d :

Fact based predictions and comparison with experimental observations

‘String/M-theory’ of mainstream physics is falsely labelled a theory because it has no dynamics and makes no testable predictions; it is abject speculation, unlike tested theories such as General Relativity or the Standard Model, which predicts nuclear reaction rates and unifies the fundamental forces other than gravity. ‘String theory’ is more accurately called ‘STUMPED’; STringy, Untestable M-theory ‘Predictions’, Extra-Dimensional. Because these ‘string theorists’ suppressed the work below within seconds of it being posted to arXiv.org in 2002 (without even reading the abstract), we should perhaps politely call them the acronym of ‘very important lofty experts’, or even the acronym of ‘science changing university mavericks’. There are far worse names for these people.

HOW STRING THEORY SUPPRESSES REALITY USING PARANOIA ABOUT ‘CRACKPOT’ ALTERNATIVES TO MAINSTREAM

‘Fascism is not a doctrinal creed; it is a way of behaving towards your fellow man. What, then, are the tell-tale hallmarks of this horrible attitude? Paranoid control-freakery; an obsessional hatred of any criticism or contradiction; the lust to character-assassinate anyone even suspected of it; a compulsion to control or at least manipulate the media ... the majority of the rank and file prefer to face the wall while the jack-booted gentlemen ride by. ... But I do not believe the innate decency of the British people has gone. Asleep, sedated, conned, duped, gulled, deceived, but not abandoned.’ – Frederick Forsyth, Daily Express, 7 Oct. 05, p. 11.

 
At 12:46 PM, Blogger nige said...

http://www.math.columbia.edu/~woit/wordpress/?p=483#comments

nigel cook Says: Your comment is awaiting moderation.

October 27th, 2006 at 3:36 pm

Can I just ask a simple question: how thick is a string (not how long is a string)? If it has zero thickness, then it is the same thickness - by coincidence - as the fabric from which the Emperor’s New Clothes were woven in the Hans Christian Andersen story!

Predicted answers to this question from string theorists:

(1) The string has zero thickness as seen in 4-d because the other 6-d (Calabi-Yau manifold) provide the thickness in extra dimensions.

(2) The string is Planck length, so this answers the question.

(3) Asking questions is stupid.

(4) Go to school.

(5) **** off.

I’ve listed the answers above as a multiple-choice test so that string theorists can at least make a guess at the correct answer, if they don’t know it.

nigel cook Says: Your comment is awaiting moderation.

October 27th, 2006 at 3:37 pm

(By the way, just to emphasise, the question is about the thickness of a string, and not that of a string theorist.)

 
