Quantum gravity physics based on facts, giving checkable predictions

Wednesday, January 04, 2006

From: "Brian Josephson"
To: "Nigel Cook"
Sent: Wednesday, January 04, 2006 4:59 PM
Subject: Re: mathematics

... An old Cambridge story, concerning a person who found himself sitting next to the taciturn Prof. X (X being variously named as Dirac, Stokes ..) at dinner.

Person sitting next to X: "someone has bet me I won't get more than 2 words out of you tonight"

Prof. X: "You lose!"

=b=

* * * * * * * Prof. Brian D. Josephson :::::::: bdj10@cam.ac.uk
* Mind-Matter * Cavendish Lab., J J Thomson Ave, Cambridge CB3 0HE, U.K.
* Unification * voice: +44(0)1223 337260 fax: +44(0)1223 337356
* Project * WWW: http://www.tcm.phy.cam.ac.uk/~bdj10
* * * * * * *


Thanks to Professor Josephson for the joke above! {See the footnote to this post for the context of Josephson's email.} The entire physics community is taciturn and unamused at any innovation. It is nearly as funny as Dr Woit's rip-roaringly scientific response here.

‘(1). The idea is nonsense. (2). Somebody thought of it before you did. (3). We believed it all the time.’ - Professor R.A. Lyttleton's summary of inexcusable censorship (quoted by Sir Fred Hoyle in ‘Home is Where the Wind Blows’ Oxford University Press, 1997, p154).



Geometry of Feynman gauge boson exchange (gravity force) mechanism.

See http://feynman137.tripod.com/ for two different proofs.

A shield, like the planet earth, is composed of very small, sub-atomic particles. The very small shielding area per particle means that there will be an insignificant chance of the fundamental particles within the earth ‘overlapping’ one another by being directly behind each other. The total shield area is therefore directly proportional to the total mass: the total shield area is equal to the area of shielding by 1 fundamental particle, multiplied by the total number of particles. (Newton showed that, by the inverse-square gravity law, a spherically symmetrical arrangement of masses, say in the earth, produces the same gravity as the whole mass located at the centre, because the mass within a shell depends on its area and the square of its radius.) The earth’s mass in the standard model is due to particles associated with up and down quarks: the Higgs field.
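As a rough numerical check of the 'no overlapping' point, here is a minimal Python sketch (illustrative only; it uses a round value for the nucleon count of the earth and takes the per-particle shield area to be the black-hole cross-section adopted in the derivation below):

import math

G, c = 6.674e-11, 2.998e8
m_nucleon = 1.67e-27                          # kg, round value
r_shield = 2 * G * m_nucleon / c**2           # event-horizon radius for one nucleon, ~2.5e-54 m
N = 5.97e24 / m_nucleon                       # number of nucleons in the earth, ~3.6e51
total_shield_area = N * math.pi * r_shield**2
earth_cross_section = math.pi * 6.371e6**2    # pi times the square of the earth's radius

print(total_shield_area / earth_cross_section)   # ~5e-70: effectively zero coverage, so overlaps are negligible and shield areas simply add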



From the illustration above, the total outward force of the big bang is

(total outward force) = ma = (mass of universe).(Hubble acceleration, a = dv/dt = Hc),

while the gravity force is the shielded inward reaction (by Newton’s 3rd law the outward force has an equal and opposite reaction):

F = (total outward force).(cross-sectional area of shield projected to radius R) / (total spherical area with radius R).





The cross-sectional area of the shield projected to radius R is equal to the area of the fundamental particle (π multiplied by the square of the radius of the black hole of similar mass), multiplied by (R/r)², which is the inverse-square law for the geometry of the implosion. The total spherical area with radius R is simply 4πR². Inserting the simple Hubble law results c = RH and R/c = 1/H gives us

F = (4/3)πρG²M²/(Hr)².

We then set this equal to F=Ma and solve, getting

G = (3/4)H²/(πρ).

When the effect of the higher density in the local universe at the great distance R is included, this becomes

G = (3/4)H²/(πρlocal e³),

which is accurate to within 1.65% and identical to the result obtained in the older analysis (both proofs are at http://feynman137.tripod.com/).

The density correction factor explained: for mass continuity of any expanding gas or explosion debris, dρ/dt = -∇.(ρv) = -3ρH. Inserting the Hubble expansion rate v = Hr and solving yields ρ = ρlocal e³ (the early visible universe has higher density). The reason for multiplying the local measured density of the universe up by a factor of about 20 (the number e³, the cube of the base of natural logarithms) is that it is the denser, more distant universe which contains most of the mass producing most of the inward pressure. Because we see further back in time with increasing distance, we see a more compressed age of the universe. Gravitational push comes to us at light speed, with the same velocity as the visible light that shows the stars. Therefore we have to take account of the higher density at earlier times. What counts is what we see, the spacetime in which distance is directly linked to time past, not the simplistic picture of a universe at constant density, because we can never see or experience gravity from such a thing due to the finite speed of light. The mass continuity equation dρ/dt = -∇.(ρv) is simple hydrodynamics based on Green’s theorem and allows the Hubble law (v = HR) to be inserted and solved. An earlier method of calculation, in the notes of CERN preprint EXT-2004-007, is to set up a formula for the density at any particular time past, so as to calculate red-shifted contributions to inward spacetime fabric pressure from a series of shells surrounding the observer. This gives the same result, ρ = ρlocal e³.
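As a check on the e³ factor, here is a minimal sympy sketch (illustrative; H is treated as constant over the interval, exactly as in the argument above), which solves dρ/dt = -3ρH and confirms that the density one Hubble time (1/H) of look-back is e³ ≈ 20 times the local value:

import sympy as sp

t, H = sp.symbols('t H', positive=True)
rho = sp.Function('rho')

# Mass continuity with v = Hr reduces to d(rho)/dt = -3*H*rho (H treated as constant)
solution = sp.dsolve(sp.Eq(rho(t).diff(t), -3*H*rho(t)), rho(t))
print(solution)                                  # rho(t) = C1*exp(-3*H*t)

# Ratio of the density one Hubble time (1/H) earlier to the density now:
ratio = sp.exp(-3*H*(t - 1/H)) / sp.exp(-3*H*t)
print(sp.simplify(ratio), float(sp.E**3))        # exp(3), i.e. about 20.1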

The acceleration is

a = (variation in velocity)/(variation in time) = c/(1/H) = cH ≈ 6 x 10⁻¹⁰ ms⁻².

The outward force (F = ma) is therefore very large. The 3rd law of motion implies an equal inward force, like an implosion, which in LeSage gravity gives the right value for G, disproving the ‘critical density’ formula of general relativity by a factor of ½e³ ≈ 10. This disproves most speculative ‘dark matter’. Since gravity is the inward push caused by the graviton/Higgs field flowing around the moving fundamental particles to fill in the void left in their wake, there will only be a gravitational ‘pull’ (push) where there is a surrounding expansion. Where there is no surrounding expansion there is no gravitational retardation to slow matter down. This is in agreement with observations that there is no slowing down (a fictitious acceleration is usually postulated to explain the lack of slowing down of supernovae).
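To put rough numbers on these quantities, here is a minimal Python sketch (the Hubble parameter is an assumed round value of 70 km/s/Mpc, not a figure from the derivation). It prints the acceleration a = cH, and the local density that G = (3/4)H²/(πρlocal e³) would require in order to reproduce the measured G, for comparison with whatever density estimate the reader prefers:

import math

c = 2.998e8                      # speed of light, m/s
G_measured = 6.674e-11           # measured gravitational constant, m^3 kg^-1 s^-2
H = 70e3 / 3.086e22              # assumed Hubble parameter, s^-1 (70 km/s/Mpc)

a = c * H                        # outward (Hubble) acceleration, a = dv/dt = Hc
rho_required = 3 * H**2 / (4 * math.pi * G_measured * math.e**3)

print(f"a = cH = {a:.2e} m/s^2")                        # ~7e-10, compare the ~6e-10 quoted above
print(f"required rho_local = {rho_required:.2e} kg/m^3")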


'... the source of the gravitational field [in general relativity] can be taken to be a perfect fluid... A fluid is a continuum that 'flows' ... A perfect fluid is defined as one in which all antislipping forces are zero, and the only force between neighboring fluid elements is pressure.' - Bernard Schutz, General Relativity, Cambridge University Press, 1986, pp. 89-90.



Current teaching of general relativity, as causing a flat surface like a rubber sheet to curve into a manifold, is unhelpful to further progress in unifying quantum space with gravitation, since physical space fills volume, not surface area: 'The Michelson-Morley experiment has thus failed to detect our motion ... because the effect looked for - the delay of one of the light waves - is exactly compensated by an automatic contraction of the matter forming the apparatus ... The great stumbling-block for a philosophy which denies absolute space is the experimental detection of absolute rotation.' - Professor A.S. Eddington (who confirmed Einstein's general theory of relativity in 1919), MA, MSc, FRS, Space Time and Gravitation: General Relativity Theory, Cambridge University Press, Cambridge, 1921, pp. 20, 152.

Einstein repudiated the completeness of special relativity in 1916: 'The special theory of relativity ... does not extend to non-uniform motion. The laws of physics must be of such a nature that they apply to systems of reference in any kind of motion. Along this road we arrive at an extension of the postulate of relativity. The general laws of nature are to be expressed by equations which hold good for all systems of co-ordinates, that is, are co-variant with respect to any substitutions whatever (generally co-variant).' - Albert Einstein, 'The Foundation of the General Theory of Relativity', Annalen der Physik, v49, 1916.

(Every physicist I spoke to at the Open University in 1996-7 conceded that the Earth's absolute motion in space is about 400 km/s toward Andromeda, as proved by the +/-0.003 Kelvin cosine variation in the 2.7 Kelvin microwave background radiation. This blueshift/redshift effect due to motion in the universe is well known; an article on it appeared in Scientific American around 1977, titled The New Aether Drift, and it is always removed from the cosmic microwave background before the data is processed to find ripples, which are millions of times smaller in size than the relatively massive effect due to the earth's motion. Special relativity is not wrong mathematically, it is just replaced by general relativity which is broader.)

Feynman noted that the drag effect of the spacetime fabric only works on dv/dt, not on v, so it is not like air. He also noted that in general relativity the earth's radius is squeezed 1.5 mm by the spacetime distortion; gauge boson pressure in space squeezes it. It only resists accelerations. The 'drag' is thus inertia, and causes the Lorentz-FitzGerald contraction. The reason why it doesn't continuously slow things down is that equilibrium re-establishes as a result of the Lorentz-FitzGerald contraction. Penrose has a diagram depicting the contraction of electric field strength in the direction of motion around a moving charge. This makes it clear how the equilibrium is restored, allowing motion: the charge distorts in shape when moving, so that the pressure on it from each direction remains equal, preventing continuous drag.

The editor of Physical Review Letters, Dr Brown, emailed me in January 2003, stating that the cause of gravity is already known, so an alternative is unnecessary. The physics pre-print server arxiv.org, funded by the U.S. government supposedly in the interests of genuine science, also deleted my preprint paper after hosting it for 2 minutes. Subsequent emails showed that they felt it more important to avoid publishing a hoax than to avoid failing to publish a genuine advance. Because so much of what is taken to be 'acceptable' physics and published today is crazy speculation, the editors cannot distinguish rubbish from genuine advances, and to some extent they are prejudiced towards comradeship with those on string theory research payrolls. Therefore the scientific referee system works like an old-time trade union, ignoring the hard work of outsiders (non-members) and helping comrades by publishing their trivia or speculations. The casualty of this corruption is the status of physics, with U.K. physics A-level take-up dropping 4% each year. String theory and ESP may make the occult end of the subject more appealing, but it does not help to recruit normal people into physics.

I wish Dr Peter Woit would start saying nice things about suppressed science, which would help discredit string theory. He knows a lot about the problem, but prefers to do things one step at a time. The next step is for the publication of the attack on string theory, Not Even Wrong, in London in April 2006. I hope he has positive material in the book to replace string theory!

Of course, the only way to get rid of string theory would be for Woit to investigate the facts on how to deal with gravity. Notice Dr 't Hooft/Plato also ignores the facts. It is curious Woit did not delete my anonymous comment quoting Feynman:

‘You can recognise truth by its beauty and simplicity. … The inexperienced, and crackpots, and people like that, make guesses that are simple, but you can immediately see that they are wrong, so that does not count. … We have to find a new view of the world that has to agree with everything that is known, but disagree in its predictions somewhere, otherwise it is not interesting. And in that disagreement it must agree with nature.’ (Character of Physical Law, 1965, p171.)

No reply yet from Woit to my email:

From: Nigel Cook
To: woit@math.columbia.edu
Sent: Monday, December 12, 2005 1:17 PM
Subject: Woit on Santa Claus
Dear Peter Woit,

Why don't you write a post for kids on Christmas Eve, explaining how Santa Claus uses string theory, parallel universes, and extra dimensions to deliver presents all over the world at the same time? You might finish it off by hinting that Witten looks as well as writes like Santa Claus, so perhaps he is really Santa.

Hope you find this a sensible and helpful suggestion.

Yours etc.,
Nigel Cook


'It does not make any difference how smart you are, who made the guess, or what his name is - if it disagrees with experiment it is wrong.' - Feynman, Character of Physical Law, 1965, p156.

The great problem was that Maxwell produced his equations, then Hertz claimed to have confirmed Maxwell's prediction. When Planck and Bohr later showed that Maxwell's equations were definitely faulty, it was too late to re-examine the theory, as the physics had already been dumped (Maxwell had a false aetherial gear cog and idler wheel 'displacement current' mechanism). So Maxwell's equations were labelled classical physics instead of being corrected.

Theory: storks deliver babies to families! Prediction: houses with more storks nests on their roofs will have more children. Confirmation: in general, the bigger the house, the more likely you are to find more children living there. So Popper's insistence on evidence to subsequently test a theory can lead to errors being falsely "confirmed".

Popper's insistence that any theory must be potentially wrong (falsifiable) steers clear of the classical straightforward mathematical proofs by Archimedes. Archimedes lacked mechanisms for hydrodynamics like buoyancy, but at least he used a methodology anchored in empirical fact. Popper would be forced to dismiss Archimedes. (For my dynamical treatment of buoyancy, see: http://members.lycos.co.uk/nigelbryancook/emails.htm)

Popper's error is obvious: his definition of science ignores proved science while allowing false theories to dominate, which become so protected by ignorant orthodoxy that nobody can ever hope to debunk them. Epicycles would be protected under Popper's scheme, because it is impossible to actually disprove unnecessary maths.

The Maxwell equation for displacement current is wrong as it is not stepwise. When corrected, taking into account that an atom is a charged capacitor, you get quantum theory. The capacitor charges in steps, each step lasting the time for energy travelling at light speed to transit a capacitor plate and reflect back - http://www.ivorcatt.org/icrwiworld78dec2.htm
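To illustrate the stepwise charging, here is a minimal Python sketch (my own illustration, not code from the Catt paper) of the usual bounce-diagram model of a capacitor treated as an open-circuited transmission line: a step source V feeds the line through resistance R, and the far-end voltage climbs in discrete steps every two transit times, approaching the same final value as the conventional lumped RC exponential. The component values are arbitrary assumptions:

import math

V, R, Z0, tau = 10.0, 200.0, 50.0, 1e-9   # source volts, source ohms, line impedance ohms, one-way transit time s (assumed)
rho_s = (R - Z0) / (R + Z0)               # reflection coefficient at the source end
v_launch = V * Z0 / (R + Z0)              # amplitude of the first wavefront launched into the line
C = tau / Z0                              # equivalent lumped capacitance of the line

def v_staircase(t):
    # Voltage at the open (far) end in the bounce-diagram model: rises in steps every 2*tau.
    if t < tau:
        return 0.0
    n = int((t - tau) // (2 * tau))       # completed round trips since the first arrival
    return 2 * v_launch * sum(rho_s**k for k in range(n + 1))

def v_rc(t):
    # Conventional lumped-RC exponential for comparison.
    return V * (1 - math.exp(-t / (R * C)))

for k in range(0, 20, 2):
    t = k * tau
    print(f"t = {t:.1e} s   staircase = {v_staircase(t):6.3f} V   RC = {v_rc(t):6.3f} V")

Both columns head towards the same 10 V: the staircase is the transmission-line picture, the exponential the lumped-circuit approximation to it.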

An unclassified official admission that mutual inductance/EMC (electromagnetic compatibility) problems led to the disaster reported on the updated page http://feynman137.tripod.com/ is at http://www.ams.mod.uk/ams/content/topics/pages/2551.htm:

'Examples of Problems caused by lack of attention to EMC/MI: Mutual Interference between the SATCOM and EW systems onboard HMS Sheffield resulted in the inability to detect an incoming exocet missile during the Falklands war resulting in the loss of the ship and the lives of 20 sailors.'

Compare this with the cover-up in the prejudiced BBC report at http://news.bbc.co.uk/onthisday/hi/dates/stories/may/4/newsid_2504000/2504155.stm:

'The ship caught fire when a French-made Exocet missile penetrated deep into HMS Sheffield's control room ... The Exocet missile is designed to skim the sea to avoid radar detection. It has its own radar that guides it to its intended target ... It was the first of four Royal Navy ships sunk during the Falklands War.'

These lying crackpots, or prejudiced loony 'BBC expert journalists' (translation: abusive, dismissive, enraged, patronising assertive ignoramuses), are thereby censoring off the radar the scientific, experimentally-proved FACT that Ivor Catt had way back in the 1960s sorted out the practical issue of 'mutual inductance' (cross-talk) in his paper published in IEEE Transactions on Electronic Computers, vol. EC-16, December 1967, Cross-talk (Noise) in Digital Systems, and subsequent papers. Any finding which proves suppression of Catt is costing lives must be covered up by the paranoid BBC, an example of a 'Science-Contradicting Unclear Megalomaniacs and Very Important Lazy Experts' (acronyms s.c.u.m. and v.i.l.e.).

Let's take a look at their methods.

First, ignore Catt since 1969.

Second, claim or strongly hint without naming sources or providing scientific evidence that anyone trying to save lives is a crackpot, egotist, paranoid, mad, etc.: "Depending on who you talk to in the generally conservative semiconductor industry, Catt is either a crank or a visionary. ..." - New Scientist, 12 June 1986, p35.

Third, when Catt's wafer-scale "chip" receives £16 million funding and works as the first wafer-scale technology in the world to succeed, change tune and deplore censorship: "Ivor Catt [is] an innovative thinker whose own immense ability in electronics has all too often been too far ahead of conventional ideas to be appreciated..." - Wafers herald new era in computing, New Scientist, 25 February 1989, p75.

Fourth, when Catt tries to apply the same technology to life-saving, http://www.ivorcatt.com/3ew.htm, censor him all over again, forgetting the lesson learned:

From: Jeremy.Webb@rbi.co.uk [mailto:Jeremy.Webb@rbi.co.uk]
Sent: Mon 30/08/2004 11:29
To: ivorcatt@electromagnetism.demon.co.uk; Cook, Nigel B
Cc: Jeremy.Webb@rbi.co.uk
Subject: RE: Catt and New Scientist

Dear Ivor and Nigel

If this is mediation, I'm a Dutch uncle. ... Hawking and Penrose are well regarded among their peers. I am eager to question their ideas but I cannot afford to ignore them. Any physicist working today would be daft to do so. ...

Yours Jeremy [Editor, New Scientist, who said in an interview with The Hindu: ‘Scientists have a duty to tell the public what they are doing...’]

What is interesting is that most people imagine that the corruption, and prejudices in favour of status quo, which dominated medieval science have disappeared as if by magic due to enlightenment. Despite two world wars in the 20th century? Despite only 25 generations in the 500 years since Copernicus? How exactly is human nature supposed to have radically altered? By evolution? By avoiding the worship of false knowledge (see above example of Maxwell)? Really all that has happened is that science itself has become the new religion, complete with sacrosanct dogmas based on speculation and guesswork; in fact this is unfair to religion! I should say that most religion has at least some basis in morality, but Maxwell's equations carry all of the dogma with none of that. Radio effects were predicted by Michael Faraday in his 1846 paper, Thoughts on Ray Vibrations.

Maxwell admits, in Article 110 of the final (1873) edition of his book A Treatise on Electricity and Magnetism:

‘... we have made only one step in the theory of the action of the medium. We have supposed it to be in a state of stress, but we have not in any way accounted for this stress, or explained how it is maintained...’

In Article 111, he admits further confusion and ignorance:

‘I have not been able to make the next step, namely, to account by mechanical considerations for these stresses in the dielectric [spacetime fabric]... When induction is transmitted through a dielectric, there is in the first place a displacement of electricity in the direction of the induction...’

First, Maxwell admits he doesn’t know what he’s talking about in the context of ‘displacement current’. Second, he talks more! Now Feynman has something about this in his lectures about light and EM, where he says idler wheels and gear cogs are replaced by equations. So let’s check out Maxwell's equations.

One source is A.F. Chalmers’ article, ‘Maxwell and the Displacement Current’ (Physics Education, vol. 10, 1975, pp. 45-9).

Chalmers states that Orwell’s novel 1984 helps to illustrate how the tale was fabricated:

‘… history was constantly rewritten in such a way that it invariably appeared consistent with the reigning ideology.’

Maxwell tried to fix his original calculation deliberately in order to obtain the anticipated value for the speed of light, proven by Part 3 of his paper, On Physical Lines of Force (January 1862), as Chalmers explains:

‘Maxwell’s derivation contains an error, due to a faulty application of elasticity theory. If this error is corrected, we find that Maxwell’s model in fact yields a velocity of propagation in the electromagnetic medium which is a factor of root 2 smaller than the velocity of light.’

It took three years for Maxwell to finally force-fit his ‘displacement current’ theory into the form which allows it to give the already-known speed of light without the 41% error. Chalmers noted: ‘the change was not explicitly acknowledged by Maxwell.’ Weber, not Maxwell, was the first to notice that, by dimensional analysis (which Maxwell popularised), 1/√(magnetic permeability x electric permittivity) = light speed.
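A quick numerical check of that dimensional-analysis relation, using the standard values of the constants:

import math

mu_0 = 4e-7 * math.pi            # magnetic permeability of free space, H/m
eps_0 = 8.8541878128e-12         # electric permittivity of free space, F/m

print(1 / math.sqrt(mu_0 * eps_0))   # ~2.998e8 m/s, the speed of light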

"... the innovator has for enemies all those who have done well under the old conditions, and lukewarm defenders in those who may do well under the new. This coolness arises partly from fear of the opponents, who have the laws on their side, and partly from the incredulity of men, who do not readily believe in new things until they have had a long experience of them. Thus it happens that whenever those who are hostile have the opportunity to attack they do it like partisans, whilst the others defend lukewarmly..." -
http://www.constitution.org/mac/prince06.htm

As long as you have some asymmetry in the current, any conductor can be made to work, with the radio emission occurring in a direction perpendicular to the varying current. A spherical conductor with a central feed would not emit radio waves, because there would be no net current in any direction, but you can use a cylindrical conductor in coax as an aerial.

Catt's analysis applies to the case where the capacitor plates are close together in comparison to the length of the plates. For all capacitors used in electronics, this is true, since only a thin insulating film separates the foil plates, which are long and are generally rolled up. In this situation, any delay from one plate to the other is small.

But if you separate the plates by a large distance in the air, the capacitor behaves more like a radio, with an appreciable delay time. The signal induced in the second plate (receiver aerial) is also smaller than that in the first plate (transmitter aerial) because of the dispersion of energy radiated from the first plate. The second plate (receiver aerial) responds with a time-lag of x/c seconds (where x is the distance between the aerials or plates), and with a voltage of vy/(y + x), where v is the voltage in the first plate, y is the length of the plates (assuming both are parallel), and x is the distance between the plates. This is the simplest possible formula that reduces to v volts when the ratio x/y is small (normal capacitors) and becomes vy/x volts for radio systems (so that the radio signal strength in volts/metre falls off inversely with the distance of the constant-length receiver aerial from the transmitter).
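A minimal sketch of that interpolation formula with arbitrary example numbers, showing the two limiting cases:

def received_voltage(v, y, x):
    # v = voltage on the first plate, y = plate/aerial length, x = separation (consistent units)
    return v * y / (y + x)

print(received_voltage(5.0, 0.1, 1e-4))   # x << y (ordinary capacitor): ~5 V, i.e. ~v
print(received_voltage(5.0, 0.1, 100.0))  # x >> y (radio link): ~v*y/x = 0.005 V, falling off as 1/x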

Maxwell's displacement current equation can be derived entirely from Catt's work. It is: i = ε(dE/dt), where ε is the permittivity. My analysis below is confirmed by Catt's co-author Dr Walton. The two curl 'Maxwell' (Heaviside) equations are unified by the Heaviside vector equation E = cB, where E is electric field strength and B is magnetic field strength, and all three vectors E, c, and B are orthogonal, so the curl vector (difference in gradients in perpendicular directions) can be applied simply to this unique E = cB:

curl.E = c.curl.B
curl.B = (1/c).curl.E

Now, because any field gradient or difference between gradients (curl) is related to the rate of change of the field by the speed of motion of the field (e.g., dB/dt = -c dB/dr, where t is time and r is distance), we can replace a curl by the product of the reciprocal of -c and the rate of field change:

curl.E = c [-(1/c)dB/dt] = -dB/dt (Faraday's law of induction)
curl.B = (1/c) [-(1/c) dE/dt] = -(1/c^2 ) dE/dt

which, when compared to Ampere's empirical law, immediately gives Maxwell's dimensionally correct approximation: i = ε(dE/dt). For the second part, see the Catt-Davidson-Walton paper, http://www.ivorcatt.org/icrwiworld78dec2.htm: Walton's stepwise voltage charging contradicts the continuous exponential model from Maxwell's displacement current.
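As a check on the relation dB/dt = -c dB/dr used in the step above, here is a minimal sympy sketch: for any field profile f(t - r/c) travelling at speed c, the time derivative is exactly -c times the space derivative:

import sympy as sp

t, r, c = sp.symbols('t r c', positive=True)
f = sp.Function('f')
B = f(t - r/c)                                        # any disturbance propagating at speed c

print(sp.simplify(sp.diff(B, t) + c * sp.diff(B, r)))   # 0, i.e. dB/dt = -c dB/dr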

Part of the problem is Maxwell ignoring the mechanism of the spatial flow of energy in the capacitor plates, and part is the error of Catt and Walton in using Heaviside's flawed model of 'energy current', where at the front the change in voltage from 0 to v volts is assumed to occur instantly (in zero time), so Maxwell's displacement current is there i = permittivity.dE/dt = permittivity.dv/(x.dt) = permittivity.dv/(0) = infinite. The electric current associated with that infinite gradient in electric field would be infinite, and would result in infinite electromagnetic radiation/'radio' transmission sideways (in the direction of traditional 'displacement current'). This is obviously wrong. In all real pulses, electric fields can't alter instantly. There is a small delay, a real rise time, which prevents the infinite result. The 'displacement current' effect is, however, a perfect mutual electromagnetic radiation energy exchange in a transmission line, which exactly cancels out at large distances from the transmission line by virtue of total (complete) interference.

Hence, Maxwell's model is wrong: instead of a time-varying electric field causing a perpendicular displacement current, the true model is this: the gradient of the electric field in the conductor causes an electric current, which in turn causes perpendicular emission of electromagnetic radiation, which causes a similar effect to the mythical 'displacement current' postulated by Maxwell. Hence, Maxwell's entire electromagnetic unification is a fraud. The corrected model is entirely compatible with quantum field theory, and sheds light on why classical electromagnetism failed to describe the radiation emission from atomic electrons in Bohr's model!

In normal radio transmission the signal frequency is obviously matched to the aerial like a tuning fork, with a loading coil as necessary. So the dE/dt due to the radio feed would govern the transmission, not steps. Catt's stepwise curve kicks in where you have a constant step applied to the aerial, like a capacitor plate charging up. dE/dt then becomes very high while the pulse is reflecting (and thus adding to further incoming energy) at the end of the aerial or capacitor plate. Obviously any real signal will have a rise time, so dE/dt will not be infinite.



The actual value of dE/dt will gradually fall as the capacitor charges, and is approximately (assuming a uniform rise) v/(XT), where X is the distance over which the voltage step v rises and X = cT, with T the rise-time of the Heaviside signal. Hence, dE/dt ~ v/(XT) = v/(cT²).
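Worked numbers for that estimate, with assumed illustrative values (a 5 V step with a 1 ns rise time; neither figure comes from the text above):

v, T = 5.0, 1e-9          # assumed step height (volts) and rise time (seconds)
c = 3.0e8                 # m/s
X = c * T                 # spatial length of the rise, about 0.3 m
dE_dt = v / (X * T)       # = v/(c*T^2), in V per metre per second

print(X, dE_dt)           # ~0.3 m and ~1.7e10 V m^-1 s^-1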

It is often very large, however, causing high frequency transients to be emitted which cause EMC problems that cost innocent people their lives in many situations in today's electronic world.


"In electrical energy transfer the charges don't have to move (if you fire two similar logic steps through one another in opposite directions, there is no electric current while they overlap, and no resistance, so electric current is dead as the mechanism for energy transfer in electricity), electric current plays no role in the process of the electrical energy transfer (only in resistance, where electrons remove energy from the Heaviside energy current), and the illusion of moving charge is the edge of the light-speed Heaviside energy current; electromagnetic force radiation (gauge boson photon of quantum field theory)." - http://en.wikipedia.org/wiki/Ivor_Catt

A turkey breeding (or string theory) community can’t afford a diversity that would make the most popular theory (strings/turkeys) look silly. Dictators just try to suppress all criticism and shoot the messengers. Instead of telling them what they don’t like to hear and having them doing the shooting, you need to adopt more sturdy methods and go on a turkey cull.

Once they define string theory as being such a beautiful piece of genuine science, they are actually defending science by censoring out criticism. You have to see this from their warped perspective!

So the Royal Society is actively trying to get science taken completely off the internet, see Motl’s complaints about it. Because string theory predictions are so similar (in testability) to certain crackpotism about how many fairies can sit on the tip of a pin, the Royal Society fears that the enlightenment will soon end unless authoritative censorship of ideas and data is stringently enforced. A terrible danger to real science arises from allowing heretics to come up with new models which might one day discredit some dogma or orthodoxy which is revered. Since there is so much of this in string theory, the danger is really very acute.

One man who should be applauded is Professor Lee Smolin. At http://www.logosjournal.com/issue_4.3/smolin.htm he writes:

"Einstein's ability to see flaws and his fierce refusal to compromise had real repercussions. His professors did not support him in his search for an academic job and he was unemployed until he found work as a patent inspector."

"By virtue of his involvement, Catt knows all the ins and outs of one of the major scientific scandals of the last 15 years, viz. the systematic suppression in the world of electronics of all publications about the phenomenon of the so-called glitch and its ramifications."

- Professor Edsger W. Dijkstra, Burroughs Research Fellow

"There was a realisation in the mid 1970s that a capacitor was in fact a transmission line ... Catt, attempting to bypass what he felt were erroneous interpretations, based everything on those concepts first proposed by Heaviside. The price that must be paid for this is computational complexity as the treatment is distributed in space. Nevertheless, his formulations of propagating TEM waves involve a network which looks identical to what we now call a two-dimensional series TLM mesh … Both John and Catt provide numerical modelling systems which are based on the use of electrical networks to treat electromagnetic analogues of physical problems. … The approaches of Johns and Catt provide a firm basis for the rules that are applied and once this is clear, then it is possible to intrude into that 'what-if' land (what if we relax some of the strict electromagnetics rules?) and research of this nature is in progress at this moment."

- Some insights into the history of numerical modelling, by D. de Cogan, School of Information Systems, University of East Anglia, Norwich NR4 7TJ, recent IEE paper. From http://en.wikipedia.org/wiki/Ivor_Catt

1. Knowledge is power.
2. It's not what you know, but who you know.

The two sayings above tend to contradict each other... In the same way a single word can convey opposite information depending on the tone of voice: compare "yes!" spoken softly with a smile to "Y--E--S?" shouted in anger at a fool. The political world has plenty of Orwellian double-speak.

The fool is the person who ignores these subtle issues, and pretends that they don't exist. People with a hearing impairment which leads to problems in understanding the subtle underlying signs and double-speak, like the various tones of voice to convey different meanings of the same word, are said to have a medical condition such as autism. The mainstream is defined as being correct. In 1930s Germany or Russia, merely pointing out the existence of irrational dictatorship would be defined by the mainstream as insanity. So some types of eccentricity are not absolutely fixed definitions for all time, but are merely relative to the political situation in fashion or in power.

One politician might look "arrogant" to one group of people and simultaneously "authoritative" to another. It is not clear which the person is; there is a relativism at play. In science, as Tony Smith says, there should be shame but there is not. Science is just as political as the worst, most bigoted politics there has ever been in human history!

Compare the mountaineer to the scientist. If the mountaineer climbs a mountain which has never been climbed before, he needs to be famous enough or rich enough or well connected enough in the first place that the event is recorded, or at least to have the means (rugged cameras, etc.) to provide some evidence. You could say he could leave something behind at the top to "prove it", but the item may be buried under ice or blown away. It may never be found, or, if you are paranoid and conspiratorial, the next person to climb the mountain may not spend much time searching for evidence that he is the second guy to arrive at the summit. (Easier to just take some pictures and start down again without seriously looking!)

The mountaineer who has a difficult struggle, or who keeps on trying despite setbacks and failed attempts, might gradually lose media interest even if he had it to begin with (sorry if this sounds illucid, it is half-joking!). If he eventually makes it with no reporters, he could end up having done something which is unrecognised. (Of course, it is a bad analogy. A mountaineer risks his life for an abstract goal, while a scientist risks merely a career for useful knowledge.)

(Re: the joke at the top of the page. The context is this: Ivor Catt is the victim of the joke. Cook explained to Catt that Catt's dismissal of 'Maxwell's equations' is wrong because Catt starts out by taking 2 of the 20 differential equations of Maxwell and then shows that those 2 equations, 10% of the total, contain 2 important constants of nature. Catt falsely concludes that 'Maxwell's equations have no content'. Catt REFUSES TO RESPOND FOR NEARLY A DECADE TO COOK'S EXPLANATION OF CATT'S ERROR. Catt then proceeds to try to say (on his own internet site somewhere) that Cook is 'confused'. In fact, Cook is certainly not confused, and Catt is refusing to drop a falsified claim. Catt does not, however, get everything wrong - the capacitor charging in steps and wafer scale integration, for example. Even if Catt eliminated errors from his writings, his genuine advances might still be suppressed by the bureaucratic inertia, incompetence, and vindictiveness of Professor X.)

Radio emission results when the current in the aerial varies with time, ie if di/dt is not zero (this is equivalent to saying that radio emission results from the acceleration of charge). There is a variation in the E-field along the conductor, even in direct current, over the small distance at the front of the step where the voltage rises from 0 to v. The current similarly rises from 0 to i. So there is radio energy transfer in a charging capacitor.

(1) In order to detect radio energy, you need to have an oscillatory wave. Feynman says the normal forces of electromagnetism (for example, the attraction between the two charged capacitor plates) are some kind of exchange of force-carrying energy (photons called gauge bosons). Feynman does not say any more about the dynamics. However, the continuous action of such forces implies a continuous exchange of energy. This is like Prevost's breakthrough in thermodynamics of 1792, when he realised that bodies at constant temperature are continuously exchanging radiation (what we would now call oscillatory photons, infrared radiation).

(2) Point (1) above says that energy is being continuously exchanged, as shown by the Feynman diagrams of quantum field theory. This is not a heresy. Heuristic development of the subject in a physical way is a step forward. Oscillatory photons carry heat, others carry forces. Proof: http://feynman137.tripod.com/

6 Comments:

At 10:46 AM, Anonymous Anonymous said...

http://cosmicvariance.com/2005/12/19/the-universe-is-the-poor-mans-particle-accelerator :

Sean claims (falsely) that the big bang nucleosynthesis validates the universal gravitational constant as having the same value during fusion of the light elements within the first few minutes of the big bang as today.

http://cosmicvariance.com/2005/12/19/the-universe-is-the-poor-mans-particle-accelerator#comment-9295 : "When you claim +/- 10% agreement on G could you provide reference please? (The one big failure of general relativity for the big bang is that the gravitational effect is out by a factor of 10, implying unobserved dark matter.) "

http://cosmicvariance.com/2005/12/19/the-universe-is-the-poor-mans-particle-accelerator#comment-9296: "Best I could find after 30 seconds of searching was this paper, which puts a 20% limit on the variation of G. So I edited the post, just to be accurate. "

http://cosmicvariance.com/2005/12/19/the-universe-is-the-poor-mans-particle-accelerator#comment-9299: "Thanks! Fusion rate would increase (due to compression) if G rises, but would be reduced if the Coulomb repulsion between protons also rises: the two effects offset one another. So G will appear constant if it is really varying and you ignore a similar variation with time of Coulomb’s law. The strong nuclear force can’t cause fusion beyond a very limited range, so the longer range forces control the fusion rate."

http://cosmicvariance.com/2005/12/19/the-universe-is-the-poor-mans-particle-accelerator#comment-9381: "On quantum gravity: the error is not necessarily in quantum mechanics, but in the incompleteness of GR. My point is that if the difference between electromagnetic force and gravity is the square root of the number of charges in the universe (and there is only one proof of this available), this ratio is nearly fixed to the present-day figure within 1 second of the big bang by the creation of quarks. If gravity was x times weaker during the fusion of nucleons into light elements in the first few seconds to minutes, then Coulomb’s law would also be x times weaker. So the gravitational compression would be less, but so would the Coulomb barrier which hinders fusion. The two effects cancel each other out. Therefore you have no basis whatsoever to claim G only varies by 20%. What you should say is that fusion calculations validate that the ratio of gravity to electromagnetism was the same to within 20% of today’s ratio."

Hence G could have been any number of times weaker when fusion occurred and the same fusion would have resulted, simply because Coulomb's law is linked to G!

 
At 10:47 AM, Anonymous Anonymous said...

I like what Einstein did in general relativity, but don't like the restricted theory as it is popularised. Since rotational motion has never been relative, the mere relativism of linear motion is a fraud because in general relativity Einstein denies the possibility of going in an undeflected straight line in the real universe.

Why not just respect Einstein for Bose-Einstein statistics, the simple photoelectric effect equation (not the photoelectric effect discovery, which goes back to Thomas Edison and other real, experimental, physicists), general relativity forgetting the biggest blunder - the ad hoc fiddle called the cosmological constant (although David Hilbert obtained a field equation similar to Einstein's at the same time), his drunkard's walk statistical summation in Brownian motion (which I use for another purpose on my home page!!!), and his letter to Roosevelt warning of the risk of a Nazi atom bomb?

Lorentz and FitzGerald from 1889-93 had the Lorentz transformation developed to the point of including length contraction, mass increase, and time dilation (Larmor has a lot about it in his book "Aether and Matter" 1901).

Several people had obtained E=mc2 from electromagnetism before Einstein.
E=mc2 is not personal property!

http://www.guardian.co.uk/international/story/0,3604,253524,00.html

"Olinto De Pretto, an industrialist from Vicenza, published the equation E=mc2 in a scientific magazine, Atte, in 1903..."

Einstein has so many defenders that other people don't get a look in. Eventually historians may say there was only one physicist in the 20th century, Einstein.

Notice that Planck was the one to provide the first good derivation of E=mc2, after Einstein had merely obtained it for a limited class of examples such as the energy delivered by radiation pressure doing work against the surface which reflects it.

There are many approaches to the so-called Einstein equations of restricted ("special") relativity, and Einstein's approach contradicts Einstein's own general relativity.

Why ignore absolute rotational motion (you get dizzy if spun around even if you are in a box which is also being spun around)?

"Special" (RESTRICTED) relativity only works when you make the fiddling postulate that you can ignore all accelerations.

Problem: in this universe you are always in a gravitational, accelerative field.

Lubos Motl is so far into double-talk that he can happily fail to see that accelerations are a problem to the restricted theory.

The heresy surrounding any critical mention of bigwig physicists is holding back physics:

"... it is not possible anymore to define a state which would be recognised as the vacuum by all observers.

"This is precisely the situation when fields are quantized on curved backgrounds. ... different observers will identify different vacuum states. As a consequence what one obsever calls the vacuum will be full of particles for a different observer."

- Introductory Lectures on Quantum Field Theory, arXiv: hep-th/0510040 v1, 5 Oct 05, page 85.

Hence "special" relativity is discredited (in favour of GENERAL relativity, which includes absolute motions like acceleration) is DISCREDITED by QUANTUM FIELD THEORY.

Yet popularisers continue to ignore general relativity!

 
At 10:48 AM, Anonymous Anonymous said...

“… the innovator has for enemies all those who have done well under the old conditions, and lukewarm defenders in those who may do well under the new. This coolness arises partly from fear of the opponents, who have the laws on their side, and partly from the incredulity of men, who do not readily believe in new things until they have had a long experience of them. Thus it happens that whenever those who are hostile have the opportunity to attack they do it like partisans, whilst the others defend lukewarmly…” - http://www.constitution.org/mac/prince06.htm

http://www.math.columbia.edu/~woit/wordpress/?p=215#comment-4082:

Nigel Says:
July 7th, 2005 at 7:15 pm
Editor of Physical Review Letters says

Sent: 02/01/03 17:47
Subject: Your_manuscript LZ8276 Cook
MECHANISM OF GRAVITY
Physical Review Letters does not, in general, publish papers on alternatives to currently accepted theories ... Yours sincerely, Stanley G. Brown, Editor, Physical Review Letters

Now, why has this nice genuine guy still not published his personally endorsed proof of what is a 'currently accepted' prediction for the strength of gravity? Will he ever do so?

'String theory has the remarkable property of predicting gravity': false claim by Edward Witten in the April 1996 issue of Physics Today, repudiated by Roger Penrose on page 896 of his book Road to Reality, 2004: 'in addition to the dimensionality issue, the string theory approach is (so far, in almost all respects) restricted to being merely a perturbation theory'. String theory does not predict the strength constant of gravity, G! However, the Physical Review Letters editor still 'believes in' Edward Witten and Physics Today.

http://www.math.columbia.edu/~woit/wordpress/?p=215#comment-4081:

Peter Woit Says:
July 7th, 2005 at 7:27 pm

I’m tempted to delete the previous comment, but am leaving it since I think that, if accurate, it is interesting to see that the editor of PRL is resorting to an indefensible argument in dealing with nonsense submitted to him (although the “…” may hide a more defensible argument). Please discuss this with the author of this comment on his weblog, not here. I’ll be deleting any further comments about this.

http://www.math.columbia.edu/~woit/wordpress/?p=215#comment-4080:

Alejandro Rivero Says:
July 8th, 2005 at 6:34 am

currently accepted

is not different of the typical forms to request funds in some project, where you are basically asked what are you to discover, and when. I call this part of science, very botanic-wise, the “classification” side. The (also botanic) counterpart, “exploration”, is always more problematic. Smolin article on “New Einstein” was about this, wasn’t it?

 
At 10:49 AM, Anonymous Anonymous said...

The diagram with a photon striking an electron shows the "Compton effect".

It is key to Heisenberg's uncertainty principle.

The claims of "quantum entanglement" take it for certain that somehow when you measure a PHOTON not an electron, the PHOTON's polarisation is changed by a similarly uncertain amount, and since the Bell-Aspect experiment starts with two identical photons, you then conclude that measuring the polarisation of one magically changes that of the other (the result shows correlation).

However, how does a photon's state change when measured, as it is going at light speed?

Are you certain that the Compton effect "uncertainty principle" applies to measuring polarisation of light? There is no evidence for this. It is guesswork, so the experiment proves nothing about entanglement.

I did a lot of undergraduate experiments on polarised light, and I've used photomultipliers to register individual quanta.

Light is a very complex problem in physics! Maxwell's theory of light has a "displacement current" created by electric field in the vacuum which creates a magnetic field, which in turn creates another electric field.

Nobody in modern physics, particularly string theorists, knows anything about displacement current. Yet they have the cheek to be abusive to investigators who work in electronics on capacitors and investigate it.

String theorists have no interest in the experiments on light and its electromagnetic basis which come from electronics, which is the very seat of electromagnetism.

All these people want to do is to pontificate, suppress, and make rude patronising comments on subjects they by their own admission are proud to remain ignorant of.

Before claiming that you know for certain that two photons metres apart are "entangled" don't you first think you should be certain where the conflict between Maxwell's wave theory of light and quantum theory arises?

If you don't have a clue about the model of light in classical electromagnetism and how it relates to the Heaviside light-speed electric energy flow, then you may be building a mud castle in quicksand.

 
At 1:46 PM, Anonymous Anonymous said...

Discussion with Assistant Professor of Physics at Harvard, string theorist Dr Lubos Motl, at

http://motls.blogspot.com/2006/01/pure-quantum-gravity-cannot-work.html

(fast comments extract, in case later deleted)

Luis Alvarez-Gaume and Miguel A. Vazquez-Mozo, Introductory Lectures on Quantum Field Theory, arXiv.org hep-th/0510040 v1, 5 October 2005.

These guys have now made a major step in developing a classical model of QFT, see pp 70-71, 83-85: p71: "... the electromagnetic coupling grows with energy... the polarisation of the vacuum [ether] ... electron-positron pairs around the location of the [core of the] charge. These virtual pairs behave as dipoles that, as in a dielectric medium, tend to screen this charge ... decreasing its value at long distances (i.e. lower energies)."

p85: "Here we have illustrated the creation of particles [pair-production asquantum tunnelling] by semiclassical sources in Quantum Field Theory... what one observer calls the vacuum will be full of particles for a different observer [hence special/restricted relativity is horses***, giving way to the absolute motion implicit in accelerations and general motion, hence general relativity is an ether theory not a non-ether theory]."

It is curious to see restricted/special relativity being abandoned on page 85 with the technically obscure words: "The breaking of such invariance, as happened in the case of coupling to a time-varying source analyzed above, implies that it is not possible anymore to define a state which would be recognised as the vacuum by all observers."

So special relativity is ditched because of quantum field theory! Didn't Dirac do this in his paper stating quantum field theory implies an ether, published in Nature in 1951? Or did he make the error of talking clearly?

‘… with the new theory of electrodynamics we are rather forced to have an aether.’ – P.A.M. Dirac, ‘Is There an Aether?,’ Nature, v.168, 1951, p.906. See also Dirac’s paper in Proc. Roy. Soc. v.A209, 1951, p.291.

‘Recapitulating, we may say that according to the general theory of relativity, space is endowed with physical qualities... According to the general theory of relativity space without ether [physical continuum] is unthinkable.’ – Albert Einstein, Leyden University lecture on ‘Ether and Relativity’, 1920. (Einstein, A., Sidelights on Relativity, Dover, New York, 1952, pp. 15, 16, and 23.)

‘In many interesting situations… the source of the gravitational field can be taken to be a perfect fluid…. A fluid is a continuum that ‘flows’... A perfect fluid is defined as one in which all antislipping forces are zero, and the only force between neighboring fluid elements is pressure.’ – Bernard Schutz, General Relativity, Cambridge University Press, 1986, pp. 89-90.

‘The Michelson-Morley experiment has thus failed to detect our motion through the aether, because the effect looked for – the delay of one of the light waves – is exactly compensated by an automatic contraction of the matter forming the apparatus.’ – Professor A.S. Eddington, MA, MSc, FRS (Plumian Professor of Astronomy and Experimental Philosophy, Cambridge), Space Time and Gravitation: An Outline of the Gener
Anonymous | Homepage | 01.04.06 - 1:45 pm | #

--------------------------------------------------------------------------------

To the extent LQG methods are like a Feynman Checkerboard, I don't think you can call LQG useless. I do agree you need to be looking at all the bosons (and fermions).
John G | Homepage | 01.04.06 - 2:41 pm | #

--------------------------------------------------------------------------------

Dear Anonymous,

I was initially afraid that Alvarez-Gaume et al. promote aether in their lectures on quantum field theory. That would be pretty bad. After a closer scrutiny, it turned out that all the bizarre sentences about the aether were inserted by you, and Alvarez-Gaume et al. talk about the standard issues of particle production in curved spaces.

In 1951, Dirac meant nothing else than the Dirac sea by the word "aether" - it is the usual Lorentz-invariant vacuum. Just a terminological issue whether we would call such vacuum "an aether". We don't call it this way today.

Dirac has wrote strange things, too (for example, against the renormalization group), but this one is not a particularly clear example.

John, Feynman checkerboard is a discretization of 2D Dirac particle's path integral - which is itself just a game that I would hardly call "useful". But Feynman did not intend to suggest that the discrete character of the checkerboard was real physics. It was just a game and an approximation of the real physics. LQG wants to argue that the real world "is" a discrete checkerboard. So the purposes and interpretations are very different in these two cases.

Incidentally, the Feynman checkerboard does give the Dirac equation in the continuum while LQG does not give gravity.

All the best
Lubos
Lubos Motl | Homepage | 01.04.06 - 3:00 pm | #

--------------------------------------------------------------------------------

Lubos,

The word aether should be banned for vagueness and confusion with Maxwell's horses*** gear cog space "displacement current" mechanism, but I included it just because that is the word people used.

I don't like your efforts to promote bigotry. If Dirac uses such a word, he does that.

If you say the quantum vacuum or "Dirac sea" you lose contact with reality you ******* *****!

You need to ACCEPT NATURE AS IT IS, not impose ******* string crap with no damn evidence.

Best wishes,

Anon | Homepage | 01.04.06 - 4:46 pm | #

 
At 4:01 AM, Anonymous Anonymous said...

http://www.math.columbia.edu/~woit/wordpress/?p=320#comment-7181

anon Says:

January 4th, 2006 at 5:03 pm

Lubos is symptomatic of the bigotry of many theoretical physicists.

In a way you could say it is refreshingly honest to see a person behaving so outrageously instead of quietly suppressing opposition.

Lubos loudly proclaims without a fig leaf of evidence that string theory must be right, and that alternatives are a waste of time.

This in my opinion is more honest than the usual “conspiracy of silence” which theoretical physicists use. Without people like Lubos, there wouldn’t be much evidence of the misery caused!

 
