
Alternative Cosmologies, Part I

A Conversation with Reg Cahill About Process Physics


Sunday, February 17, 2013

The term “ether” is unique in the history of physics not only because of the many different meanings in which it has been used but also because it is the only term that has been eliminated and subsequently reinstated, though with a different connotation, by one and the same physicist [Einstein]. – Max Jammer, in the Foreword to Einstein and the Ether (2000)

Physics is considered the hardest of the sciences, in terms of its degree of precision and certainty regarding its data and theories. Physics is not, however, as “hard” a science as most people like to think. In fact, physical theories, like all theories, are human creations that approximate reality and, as such, are never the whole story. There is always more than one theory that can explain the available evidence – far more than one, in fact. The difficulty is finding a theory that is self-consistent, explains the evidence at issue, and fits within a broader explanatory framework.

Tam Hunt

In reading books like Brian Greene’s The Fabric of the Cosmos and The Elegant Universe, or Lee Smolin’s The Trouble With Physics, we realize that the one certainty in physics is this: our physical theories will continue to change over time. We will never – literally – have a complete description of the universe and its workings because we simply don’t know the full extent of what we don’t know. And we will never know the full extent of what we don’t know.

It’s important to keep in mind also that all theories, including physical theories, rest on assumptions about the nature of reality, generally called “postulates” or “principles.” If the assumptions turn out to be wrong, the theory will very likely be wrong. Last, theories can never be proven – they can only be supported by experimental evidence or disproved by experimental evidence that contradicts the theory at issue. The degree to which theories are rejected as invalid depends on the degree to which experimental evidence disproves key features of the theory.

For example, even though there is a fairly strong consensus among cosmologists that the universe is expanding at a faster rate than previously predicted by general relativity (the prevailing theory of gravity), very few physicists were willing, based on this difficulty alone, to reject general relativity as a theory. Rather, various fixes, including the now widespread concept of “dark energy,” have been developed to reconcile general relativity with the unexpected data about accelerating expansion (which led to the 2011 Nobel Prize in physics) and other gravitational anomalies. Dark matter is a similar patch to general relativity, for which there is even less evidence.

There are many other possible views, however, that can explain the data as well or better. Reg Cahill, my interviewee for this first installment of a new series on alternative cosmologies, for example, has pointed out that the supernovae data relied upon for the accelerating expansion model of the universe can equally support the view of a universe expanding at a constant rate. A later interviewee, Jayant Narlikar, a supporter of the quasi steady-state cosmology, believes that we may be fundamentally misinterpreting Hubble’s law on redshift, which is the basis for the prevailing view that the universe is expanding.

While even scientists and science journalists often speak sloppily about theories being proven, it simply is not the case that any theory – even theories that rise to the level of “laws,” due to very strong experimental support – are ever proven. Richard Feynman, a Nobel Prize-winning physicist, stated: “[E]ven those ideas which have been held for a very long time and which have been very accurately verified might be wrong …. [W]e now have a much more humble point of view of our physical laws – everything can be wrong!”

There remains a chorus of discontent over the state of physics today. David Gross, another Nobelist and a physics professor at UC Santa Barbara, ended a 2005 conference on string theory, the dominant research interest for most theoretical physicists over the last two decades, by saying: “We don’t know what we are talking about … The state of physics today is like it was when we were mystified by radioactivity … They were missing something absolutely fundamental. We are perhaps missing something as profound as they were back then.”

Reg Cahill is a professor of physics at Flinders University in Adelaide, Australia. He has for many years challenged the mainstream physics consensus in many ways. He’s published widely, but generally in “dissident” physics journals (yes, there are dissident journals in physics) because he challenges ideas that are generally considered settled. While Cahill is a proud maverick in his field, he has nevertheless received recognition for his achievements: he was awarded a Gold Medal in 2010 by the Telesio-Galilei Academy of Science for his development of the “process physics” model that is his signature achievement.

I had the pleasure of interviewing Reg, by email, on his process physics, why he believes Einstein got it wrong on gravity, and why a new understanding of the nature of mind is key for a more accurate physics. I am also working on a book about Reg, his physical theories and process philosophy as an alternative to the prevailing materialism of our era. I am highly intrigued by Reg’s ideas, but I am not equipped at this time to make my own determination about their validity, let alone their superiority to the mainstream views. Reg seems to me to be overly categorical in his pronouncements at times, and can tend toward the grandiose at other times. It is, however, clear to me that Reg’s voice should be part of today’s discussions about cosmology and physics more generally.

What led you into physics?

At a very young age, maybe five or six, I was fascinated by questions of how things worked – my first case was the radio. I thought it was an astounding process. Others, it seemed to me, didn’t ask such questions. From there my interest in physics simply developed. On leaving school I was initially planning on being an electrical engineer, but dropped that very quickly when the University of New South Wales (in Australia) offered me a scholarship to do a physics degree, and then stay to do a PhD. During the latter years of school and early years of university I was involved with radios, etc., as a hobby – and built my own oscilloscope, and later modified a military aircraft scope to work on mains voltage.

Who is the most influential thinker with respect to your own worldview, or your own brand of physics?

I don’t think there was any one person. My research over the years has simply gone deeper into fundamental issues – first doing low-energy nuclear physics, then high-energy particle physics: quarks and gluon theory, and finally now into developing new understandings of space and matter – at the “process physics” level. So this was an ongoing process of following the clues deeper and deeper. I came at the process physics point of view independently, and then was amazed to learn that Alfred North Whitehead, and other similar process thinkers, including Heraclitus of ancient Greece, had arrived at the same sort of thinking, but by very different routes.

Could you describe briefly your “process physics” and how it differs from mainstream views in physics?

My process physics perspective is fleshed out in a number of papers and my 2005 book, Process Physics. Basically, conventional physics uses a syntax-based model – where symbols stand for things: matter, space, time … The physical laws are encoded in rules of manipulations of these symbols [as described by the equations of various physical theories]. In process physics, to the contrary, one is modeling, at the deepest levels, that nature is all about process – processes involving the generation of patterns, their interaction by means of pattern recognition, and change – so this is an information-based model.

It is not, however, about our knowledge or information about reality; rather, it is about interacting patterns, where the structure of the patterns determines their interaction and evolution over time. This is a semantic information theory, whereas conventional physics uses syntactical information. Also, process physics does not begin by assuming the existence of space, matter, etc. It assumes only a cosmic-indexing type of time, which is emergent. These phenomena, in my theory, emerge from the more fundamental level of reality.

Physics has been in the news a lot lately with the discovery of evidence supporting the existence of the Higgs boson. Could you explain this finding and how it impacts, if at all, your process physics?

The standard model [of particle physics] starts by assuming that the equations have a certain symmetry. That symmetry requires all particles to be massless – which they are not. To avoid that outcome, a new field is introduced which, in an ad hoc way, forces most particles to have mass. So the whole procedure lacks elegance. This field in turn results in the supposed existence of a new particle – the Higgs boson. Given the manifest inelegance of this model I would be very surprised if the claimed discovery of the Higgs boson survives scrutiny. As for process physics, I doubt the new Higgs field data has any significance.

Is the Higgs field a modern name for what was previously called the “ether” and perhaps wrongly dismissed? How does your work relate to ether-based theories of physics?

No. The “ether” in process physics has been replaced by the term “dynamical space”. In conventional physics, space is a geometrical “container,” to the extent that its existence is even acknowledged. The 19th Century notion of the “ether” was considered to exist in the “container” of space. The “dynamical space” is, however, a complex fractal system, which only manifests geometrical properties at the higher level. Dynamical space is not just a concept. It has been detected repeatedly for more than 120 years without being widely acknowledged. Contrary to the widespread views on this issue, the speed of light is in fact anisotropic [not constant for all observers], when measured by an observer moving through the dynamical space. For example, the famous Michelson-Morley experiment did, in fact, when analyzed correctly, find evidence for light anisotropy. And the dynamics of dynamical space have also been discovered. I would expect that it is the dynamics of this new type of space – in particular its detected fractal texture – which causes particles to have mass. I am working on this conjecture.

There are many notions of the “ether,” and ultimately terminology is far less important than the concepts they convey. However, you write in your recent paper, suggesting that Einstein’s special relativity theory has been falsified, and Lorentz’s competing theory of relativity supported, the following as your introduction: “Physics has failed, from the early days of Galileo and Newton, to consider the existence of space as a structured, detectable and dynamical system, and one that underpins all phenomena…” This sounds to me like the ether that Lorentz himself advocated, so is it not fair to call your neo-Lorentzian theory a type of ether theory?

Aether theories are [generally] dualistic – they have both a space and an aether embedded in that space. Indeed, physicists find it almost impossible to abandon this dualism, except in special relativity and general relativity, where both space and aether were abandoned in favor of space-time [a four-dimensional reality that views time as akin to an additional spatial dimension]. Lorentzian relativity is also a dualistic theory, with an aether embedded in a space but with time a separate phenomenon. In neo-Lorentzian relativity [which is a fair characterization of my process physics] we abandon this dualism by positing a structured dynamical space [as the fundamental level of reality]. This dynamical space appears to be fractally textured – according to experiment and theory. This dynamical space is different both from the older notion of space (as a perfect geometrical system) and from an aether, as some form of particulate system embedded in and possibly moving through the geometrical space. In neo-Lorentzian relativity, the “geometry” of the dynamical space is emergent – including its three-dimensionality [and other properties].

More generally, hasn’t physics come around in recent decades to the idea of space as a real entity and not simply a vessel for matter and energy? Mainstream publications like Brian Greene’s book The Fabric of the Cosmos focus on the fact that empty space has certain properties. Einstein also later repudiated his own suggestion, in his seminal 1905 paper on special relativity, that space has no properties.

To the contrary, conventional physics focuses on spacetime, not space as a separate entity. The very concept of “space” is actually rejected by special relativity and general relativity, although sloppy language often confuses the issue. So referring, in special relativity and general relativity, to empty space as having properties is actually misleading. In other words, what one observer identifies as the spatial part of spacetime is different from another observer’s spatial part of that same spacetime.
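Cahill’s point here about observer-dependent space can be made concrete with the standard, textbook Lorentz transformation – this is conventional special relativity, not his framework:

```latex
% Lorentz transformation between frames with relative speed v:
t' = \gamma\left(t - \frac{v x}{c^{2}}\right), \qquad
x' = \gamma\,(x - v t), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}} .
% A surface of simultaneity for the primed observer, t' = const,
% satisfies t = v x / c^{2} + const in the unprimed frame: a tilted
% slice. The two observers thus carve the same spacetime into
% different "space" parts.
```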

Why is relativity theory so hard to challenge in mainstream physics journals? Are physicists generally group-thinkers who are highly resistant to challenges from the fringes, as respected thinkers like Lee Smolin and Thomas Kuhn have suggested?

In my view, few physicists actually understand special or general relativity. Most physicists’ complete belief in these theories is just that: belief without deep understanding – and they defend that belief with ferocity. Indeed, most physicists appear not to accept the scientific method – namely that ongoing experiments should decide whether a theory survives or not. Of course, special relativity, in particular, has been the foundation of physics for more than 100 years – and most physicists would say that its falsification would be incredibly unlikely. However, my recent paper on neo-Lorentzian relativity (“Dynamical 3-Space: neo-Lorentz Relativity”) shows just that – that special relativity is exactly derivable from Galilean relativity, and special relativity does not do the job claimed for it – meaning that its predictions are inconsistent with experiment.

Can you describe briefly your recent work suggesting that Einstein’s Special Relativity has been falsified?

Special relativity, rather than being a fundamentally new theory, is exactly derivable from Galilean relativity by an exact linear change of space and time coordinates, which mixes the Galilean space and time coordinates. So it turns out that there is no new physics in special relativity that is not already in Galilean relativity. In particular, the various so-called relativistic effects (length contraction, time dilation …) are merely coordinate artifacts. Such actual phenomena cannot emerge from merely a change of coordinates.
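Cahill’s claim can be sketched as follows; the notation is mine, not taken from his paper. Starting from Galilean coordinates, one defines new coordinates as an exact linear mixture of them:

```latex
% Galilean transformation between frames with relative speed v:
x_{G} = x - v t, \qquad t_{G} = t .
% Define mixed coordinates (an exact linear change of variables):
x_{L} = \gamma\,(x - v t), \qquad
t_{L} = \gamma\left(t - \frac{v x}{c^{2}}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}} .
% These have the form of Lorentz-transformed coordinates; on Cahill's
% reading, "length contraction" and "time dilation" then appear as
% artifacts of this coordinate choice rather than as new physics.
```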

One can also show experimentally that these supposed “relativistic effects” are not those actually detected in experiments. One example is that the length contraction effect in neo-Lorentzian relativity is determined by the speed of an object relative to the dynamical space (which is some 500 km/s for an object at rest on earth), whereas the special relativity length contraction is determined by the object’s speed with respect to the observer, which in most experiments is 0 km/s. This extreme contrast in predictions can be checked directly by comparing results from Michelson interferometer experiments with spacecraft earth-flyby Doppler shift data: the outcome is that the special relativity prediction is falsified, and the neo-Lorentzian relativity prediction is confirmed.
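The quantitative gap between the two predictions can be illustrated with a few lines of arithmetic. This is my own back-of-the-envelope sketch using the 500 km/s figure quoted above, not a calculation from Cahill’s papers:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def contraction_factor(v: float) -> float:
    """Lorentz length-contraction factor sqrt(1 - v^2/c^2)."""
    return math.sqrt(1.0 - (v / C) ** 2)

# Special relativity: contraction depends on speed relative to the
# observer, which is essentially 0 for lab apparatus at rest in the lab.
sr_factor = contraction_factor(0.0)

# Neo-Lorentzian claim (per the interview): contraction depends on speed
# through the dynamical space, roughly 500 km/s for an object on Earth.
nl_factor = contraction_factor(500e3)

print(f"SR prediction:             {sr_factor:.10f}")
print(f"neo-Lorentzian prediction: {nl_factor:.10f}")
print(f"fractional contraction:    {1 - nl_factor:.2e}")
```

A fractional length change on the order of one part per million is far below everyday notice but well within reach of interferometry, which is why the two predictions can be compared experimentally at all.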

Your process physics aligns well with Alfred North Whitehead’s work in philosophy, mathematics, and physics. Whitehead was a well-known panpsychist in that he believed that all matter has some mind associated with it, such that as matter complexifies so mind complexifies. How important is panpsychism in your process physics? Is this idea captured in your notion of “semantic information”?

I developed process physics before I became aware of Whitehead’s philosophy – I was more aware of the work of Heraclitus at that time. Nevertheless, I was happy to acknowledge the philosophical ideas of these and other process philosophers, when I became aware of them, and I now have an ongoing working relationship with various process philosophers. One should note that these philosophers of course had no detailed mathematical implementation – no theory or model – for their philosophies, and this is what my process physics attempts to provide. I also suspect that panpsychism is a valid property of reality, and yes, it is an aspect of “semantic information.”

John Archibald Wheeler made famous the notion that information may be fundamental to reality with his phrase “it from bit.” Do you agree with this idea and if not how would you modify it?

I agree, although Wheeler did not have an implementation mechanism or [detailed] model. In any case one must carefully distinguish between syntactical information, which I suspect is what Wheeler was referring to, and semantic information.

What do you mean by “semantic information” and how does this idea relate to the philosophical view known as panpsychism?

Process physics is about self-generated patterns – and how these patterns interact. So the theory and the reality it models are about active information – information that has meaning for the system, and so is called semantic information. Syntactical information is that stored by way of symbols, and then “interactions” are by way of rules, i.e., equations. Equations always presuppose some a priori syntactical rules, and so cannot be fundamental. Semantic information, being active, suggests that the universe is self-aware in some manner, and at all levels. This is my preferred concept of panpsychism.

At the risk of beating this horse to death, another question on the nature of the ether and ether theories: I see the ether concept, or what you call dynamical space, as pretty key for the development of a more ideal future physics, and this is one of the key reasons I was intrigued by your Process Physics when I first came across it. Einstein stated in a 1919 letter to Lorentz that “with the word ether we say nothing else than that space has to be viewed as a carrier of physical qualities.” Do you agree with Einstein here? Would you agree that your “dynamical space” could be described in the same way, as a carrier of physical qualities, which are necessary for an accurate view of nature?

The (new) ether is, in my theory, a dynamical system, the “dynamical space” I’ve described above – which has a very complicated structure at the deepest level. I describe it as a “quantum foam,” meaning that at a deep level the dynamical space is describable by a wave-function whose time evolution is described by a Schrödinger-type theory. On larger scales the dynamical space can be described as being somewhat geometrical, i.e., having three dimensions, etc. It is this aspect that we use as spatial coordinates x, y, z. I would agree with Einstein’s above statement except that dynamical space is not the “carrier” of these properties; rather, disturbances of the dynamical space are in fact what we generally call “physical stuff.”

You stated above that you are working on the concept that the fractal nature of dynamical space may be the underlying reason that particles have mass. Can you flesh out this idea and contrast it with the Higgs field concept that has gained some recent support?

Wave functions propagating through a fractal space will have their energy changed. My conjecture is that this is equivalent to giving “mass” to the wave function. In the Higgs model there is no such structured space, only a smooth space-time, and so the Higgs field is, incorrectly in my view, constructed to provide mass to massive particles.

You have been a consistent critic of Einstein’s relativity theories. Can your process physics be viewed as a full substitution for Einstein’s theory of gravity, general relativity? If so, what kind of real world/technological changes would this substitution lead to?

This is a complete change: Einstein’s special relativity and general relativity have failed in almost every case to explain the observed data – and this contrary evidence grows stronger every day. The process physics perspective could lead to a fundamental revolution in physics – and there isn’t much that will not be changed if these ideas are adopted. Process physics will also have impacts outside of physics – such as in providing a more firm theoretical basis for non-local interactions [“entanglement”] that are often denied by physics at present, and a broader interconnectedness of the universe that is not acknowledged at present.

Comments

OK pk, where are you?

billclausen (anonymous profile)
February 17, 2013 at 4:38 a.m.

STR in a space-time of one dimension each (x,t) is based on two principles ("In the beginning, there was a coordinate system"):

1. The "speed" of light is a constant ("Light" is an excitation of an intellectual coordinate system – "Let there be light"), and is defined (at this point) by c = x(c)/t(c)

At this point, v = (v/c)*c = Beta*c

2. "Velocity" is specified to be independent of C. Since V is now independent of (orthogonal to) C, "distances" can be defined such that CT is orthogonal to VT', where T is a related scaling factor. If a density p=1 is specified, then a "rest mass" is defined as M0 = pCT

The so-called "Time Dilation" equation (actually a metric) can then be defined via the Pythagorean Theorem, as can the relativistic relations between energy and momentum (E = MC^2)

Note that these equations, being specified in terms of V and C, are independent of the (orthogonal) coordinate system, in the same way that Newton's laws (P = MV, E = 1/2MV^2) are.
(That is, once C is specified via a number, V is a function of the scaling factors: V = C*Gamma(T,T').)

C is the Lagrangian, and (1)CT is the "Action" (= Hbar in QM)..(x,t) is now a scalar field, and Maxwell can be expressed in terms of the EM field tensor, which is valid up to a scalar field (gauge invariance) - that is, for any value of constant C. So STR (and QFT) ignore the scalar field, which is why they can set C=1, and refer everything to EITHER momentum space (Heisenberg p = hbK) or energy space (Schroedinger E=hbCT), with hb the action per unit cycle in a system of conserved mass (the distinction between charge and mass is the distinction between linear EM and Gravity).....

If a standard "length" is specified in space, "time" is a change in mass with velocity (at the particle v = 0), and the Lorentz transform simply indicates the rest mass has changed.

If a standard "time" is specified (simultaneity, time is not relevant to dynamics) then a change in "space" with v is a density.

If BOTH space and time change, interactions (however small) between particles of different "times" (masses) and "spaces" (scalar field densities) are involved, AND interactions are included via infinitesimals, then GTR is involved. (An alternative is the introduction of "virtual particles" a la Feynman.)

The basic problem is the "zero point" energy, since we exist experimentally as linear noise in an infinitesimally curved (GTR) "flat" space (STR) – think of the mass at the center of the galaxy, vs the mass of earth.....

("Sense", which is degenerate in the Time Dilation equation, is an imbalance in the local field, corresponding to attraction / repulsion, especially if the radius is (r(x,y),t) or (r(x,y,z),t))

It gets much, much worse.....:-)

BuleriaChk (anonymous profile)
February 17, 2013 at 9:48 p.m.

If one specifies that the source and sensor systems are exactly the same at the end of the visible universe and locally, then a "red shift" can only mean that a photon has lost energy during its journey (by interaction with other photons as a headwind – photon interaction being gravity, either through E or B fields on its way – i.e. NOT Compton scattering). This can only be considered an "expansion" if the photon physically exists all the way from source to sensor. In reality, a photon is like a radar pulse longitudinally, with polarization (spin). Furthermore, it is negative energy relative to the source and sink (if the nuclei of the emitting atoms are protons)

It gets much, much worse......

BuleriaChk (anonymous profile)
February 17, 2013 at 9:55 p.m.

Photon like a radar pulse - that is, it has a "lifetime", or a "coherence" length...

BuleriaChk (anonymous profile)
February 17, 2013 at 9:56 p.m.

bill: drawing and painting; scouring ebay for printed ephemera; spending time with his grandsons; reading books about early jewish apocalyptic and merkabah mysticism; & spending time with his grandsons.

finding evanescent appeal in the murmurings of heterodox physicists. & to show that although an atheist, he can appreciate a good song about jesus, recommending the following for anyone with aching head from too much immersion in those murmurings:

https://www.youtube.com/watch?v=ebj_e...

pk (anonymous profile)
February 17, 2013 at 11:18 p.m.

ps. in spite of my declining interest in this sort of thing, i do think Tam is doing a good job in presenting these ideas.

pk (anonymous profile)
February 18, 2013 at 7:45 a.m.

Aether has mass. Aether physically occupies three dimensional space. Aether is physically displaced by particles of matter. There is no such thing as non-baryonic dark matter anchored to matter. Matter moves through and displaces the aether.

Galaxies move through and displace the aether.

A 'new dark force' is more speculative than understanding space itself has mass.

'Galactic Pile-Up May Point to Mysterious New Dark Force in the Universe'
http://www.wired.com/wiredscience/201...

"The reason this is strange is that dark matter is thought to barely interact with itself. The dark matter should just coast through itself and move at the same speed as the hardly interacting galaxies. Instead, it looks like the dark matter is crashing into something — perhaps itself — and slowing down faster than the galaxies are. But this would require the dark matter to be able to interact with itself in a completely new and unexpected way, a "dark force" that affects only dark matter."

It's not a new force. It's the aether displaced by each of the galaxy clusters interacting analogous to the bow waves of two boats which pass by each other.

The Milky Way's halo is what is referred to as the curvature of spacetime.

The Milky Way's halo is the state of displacement of the aether.

The geometrical representation of gravity as curved spacetime physically exists in nature as the state of displacement of the aether.

Displaced aether pushing back and exerting inward pressure toward matter is gravity.

A moving particle has an associated aether displacement wave. In a double slit experiment the particle travels through a single slit and the associated wave in the aether passes through both.

mpc755 (anonymous profile)
February 21, 2013 at 7:50 p.m.

Whoa! Did anyone see a fireball in the sky tonite at 10:30pm?

It was travelling east to west and seemed huge and low on the horizon!

What tipped me off was the sky outside my window started to get really bright; a second later, the fireball whizzed by (through the aether ... haha).

EastBeach (anonymous profile)
February 21, 2013 at 10:59 p.m.

Wild.. there's been a lot of these occurrences worldwide as of late.. is Melancholia soon to follow?

Ken_Volok (anonymous profile)
February 21, 2013 at 11:57 p.m.

I heard an explosion at 10:35 and it sounded distant. I was in Mission Canyon at the time.

billclausen (anonymous profile)
February 22, 2013 at 4:37 a.m.

KV: Melancholia, yes, a strange, if occasionally overwrought, but very interesting/compelling film by Lars von Trier dealing with an end-of-world scenario. Was just thinking about this too in light of recent cosmological events. For those who haven't seen or maybe even heard of it, it's most definitely worth checking out, but be aware that it is a von Trier film with all that that implies.

zappa (anonymous profile)
February 22, 2013 at 5:58 a.m.

billclausen (anonymous profile)
February 23, 2013 at 3:13 a.m.

Thanks BC, that was it.

It was really spooky how the sky started glowing brighter before I actually saw the fireball cross my field of view.

From my vantage on the Riviera, it looked to me like the fireball was going to hit the Mission ... in divine retribution for Bishop Thomas Curry's transgressions.

EastBeach (anonymous profile)
February 23, 2013 at 12:06 p.m.
