The standard model of cosmology is acclaimed in physics as accurate, robust, and well-tested: our best scientific theory of the cosmos. But it has long had serious anomalies, including the Hubble tension, anomalous galaxies, and the completely unexplained nature of dark energy and dark matter. And lurking behind it all is the lack of a unified theory: General Relativity (GR) and quantum mechanics (QM) are inconsistent. Now startling new observations of the early universe by the James Webb Space Telescope (JWST) in 2022 present the strongest challenge yet to the standard model, and whispers have started that this shows there is something wrong with the fundamental theory, General Relativity itself. This would be a crisis for cosmology. But haven’t they tested this theory already, and shown it is correct? How could it turn out wrong at this late stage? Here we compare the standard cosmology with an alternative fundamental theory that has a strikingly different overall cosmological behavior: a simple, deterministic cyclic expansion function with only two or three general parameters. The interesting result is that this alternative cosmology: (A) closely matches the expansion observed and modelled through the ΛCDM standard model, now going back to redshifts of 5-15; and (B) predicts the unexpected early galaxy formation now being reported by the JWST. The point here is not to prove this alternative theory, however, but to show how it compares with the conventional cosmology. This shows us clearly how weak the empirical evidence for the standard model really is against a counterfactual fundamental theory. Some results established in science are robust against theory change, but we find the standard cosmological model and the implications drawn from it are not robust at all.
The Schwarzschild solution (Schwarzschild, 1915/16) to Einstein’s General Theory of Relativity (GTR) is accepted in theoretical physics as the unique solution to GTR for a central-mass system. In this paper I propose an alternative solution to GTR, and argue it is both logically consistent and empirically realistic as a theory of gravity. This solution is here called K-gravity. The introduction explains the basic concept. The central sections go through the technical detail, defining the basic solution for the geometric tensor, the Christoffel symbols, Ricci tensor, Ricci scalar, Einstein tensor, stress-energy tensor and density-pressure for the system. The density is integrated, and some consistency properties are demonstrated. A notable feature is the disappearance of the event horizon singularity, i.e. there are no black holes. So far this is for a single central mass. A generalization of the solution for multiple masses is then proposed. This is required to support K-gravity as a viable general interpretation of gravity. Then the question of empirical tests is discussed. It is argued that current observational data is almost but not quite sufficient to verify or falsify K-gravity. The Pioneer spacecraft trajectory data is of particular interest, as this is capable of providing a test; but the data (which originally showed anomalies that match K-gravity) is now uncertain. A new and very practical experiment is proposed to settle the matter. This would provide a novel test of GTR, and a novel test of the cause of the Pioneer anomalies. In conclusion, K-gravity has extensive ramifications for gravitational physics and for the philosophy of GTR and space-time.
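Since the K-gravity solution itself is not reproduced in this abstract, the chain of quantities it mentions (metric, Christoffel symbols, Ricci tensor, and so on) can at least be sketched for the standard Schwarzschild case. The following is a minimal symbolic computation using `sympy`; the metric used is the ordinary Schwarzschild one, not the K-gravity alternative:

```python
import sympy as sp

# Coordinates (t, r, theta, phi) and the Schwarzschild radius r_s = 2GM/c^2
t, r, th, ph, rs = sp.symbols('t r theta phi r_s', positive=True)
x = [t, r, th, ph]

# Standard Schwarzschild metric, signature (-+++); K-gravity would replace this
g = sp.diag(-(1 - rs/r), 1/(1 - rs/r), r**2, r**2 * sp.sin(th)**2)
ginv = g.inv()

def christoffel(a, b, c):
    """Gamma^a_{bc} = (1/2) g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc})."""
    return sp.simplify(sum(
        sp.Rational(1, 2) * ginv[a, d] *
        (sp.diff(g[d, c], x[b]) + sp.diff(g[d, b], x[c]) - sp.diff(g[b, c], x[d]))
        for d in range(4)))

# For example, Gamma^t_{tr} works out to r_s / (2 r (r - r_s))
gamma_t_tr = christoffel(0, 0, 1)
```

From the Christoffel symbols, the Ricci tensor, Ricci scalar, and Einstein tensor follow by the usual contractions; the abstract's claim is that carrying out the same sequence for the K-gravity metric yields a different stress-energy and no event horizon singularity.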
Physicists routinely claim that the fundamental laws of physics are 'time symmetric' or 'time reversal invariant' or 'reversible'. In particular, it is claimed that the theory of quantum mechanics is time symmetric. But it is shown in this paper that the orthodox analysis suffers from a fatal conceptual error, because the logical criterion for judging the time symmetry of probabilistic theories has been incorrectly formulated. The correct criterion requires symmetry between future-directed laws and past-directed laws. This criterion is formulated and proved in detail. The orthodox claim that quantum mechanics is reversible is re-evaluated. The property demonstrated in the orthodox analysis is shown to be quite distinct from time reversal invariance. The view of Satosi Watanabe that quantum mechanics is time asymmetric is verified, as well as his view that this feature does not merely show a de facto or 'contingent' asymmetry, as commonly supposed, but implies a genuine failure of time reversal invariance of the laws of quantum mechanics. The laws of quantum mechanics would be incompatible with a time-reversed version of our universe.
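The criterion can be illustrated with a toy probabilistic process (a sketch of the general idea only, not the paper's formalism). For a Markov chain in its stationary state, the past-directed law is fixed by the future-directed law via Bayes' theorem; time reversal invariance then requires the two laws to coincide, which fails for a chain that cycles in one direction:

```python
# Future-directed law: a 3-state chain that mostly cycles 0 -> 1 -> 2 -> 0
P = [[0.1, 0.9, 0.0],
     [0.0, 0.1, 0.9],
     [0.9, 0.0, 0.1]]

# Stationary distribution (uniform, since P is doubly stochastic)
pi = [1 / 3, 1 / 3, 1 / 3]

# Past-directed law via Bayes: P_back[i][j] = pi[j] * P[j][i] / pi[i]
P_back = [[pi[j] * P[j][i] / pi[i] for j in range(3)]
          for i in range(3)]

# P_back is the transpose of P: looking backward, the chain cycles
# 0 -> 2 -> 1 -> 0 instead.  Future-directed and past-directed laws
# differ, so this process is not time reversal invariant.
```

The point of the correct criterion is exactly this comparison: the symmetry to test is between the forward transition law and the Bayes-derived backward law, not some weaker property of the forward law alone.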
In his [1937, 1938], Paul Dirac proposed his “Large Number Hypothesis” (LNH), as a speculative law, based upon what we will call the “Large Number Coincidences” (LNCs), which are essentially “coincidences” in the ratios of about six large dimensionless numbers in physics. Dirac’s LNH postulates that these numerical coincidences reflect a deeper set of law-like relations, pointing to a revolutionary theory of cosmology. This led to substantial work, including the development of Dirac’s later [1969/74] cosmology, and other alternative cosmologies, such as the Brans-Dicke modification of GTR, and to extensive empirical tests. We may refer to the generic hypothesis of “Large Number Relations” (LNRs) as the proposal that there are lawlike relations of some kind between the dimensionless numbers, not necessarily those proposed in Dirac’s early LNH. Such relations would have a profound effect on our concepts of physics, but they remain shrouded in mystery. Although Dirac’s specific proposals for LNR theories have been largely rejected, the subject retains interest, especially among cosmologists seeking to test possible variations in fundamental constants, and to explain dark energy or the cosmological constant. In the first two sections here we briefly summarize the basic concepts of LNRs. We then introduce an alternative LNR theory, using a systematic formalism to express variable transformations between conventional measurement variables and the true variables of the theory. We demonstrate some consistency results and review the evidence for changes in the gravitational constant G. The theory adopted in the strongest tests of Ġ/G, by the Lunar Laser Ranging (LLR) experiments, assumes: Ġ/G = 3(dr/dt)/r – 2(dP/dt)/P – (dm/dt)/m, as a fundamental relationship. Experimental measurements show the RHS to be close to zero, so it is inferred that significant changes in G are ruled out.
However, when the relation is derived in our alternative theory it gives: Ġ/G = 3(dr/dt)/r – 2(dP/dt)/P – (dm/dt)/m – (dR/dt)/R. The extra final term (which is the Hubble constant) is not taken into account in conventional derivations. This means the LLR experiments are consistent with our LNR theory (and others), and they do not really test for a changing value of G at all. This failure to transform predictions of LNR theories correctly is a serious conceptual flaw in current experiment and theory.
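Written out, the two versions of the relation differ by the single Hubble term:

```latex
% Relation assumed in the conventional LLR analysis:
\frac{\dot G}{G} \;=\; 3\,\frac{\dot r}{r} \;-\; 2\,\frac{\dot P}{P} \;-\; \frac{\dot m}{m}

% Relation derived in the alternative LNR theory,
% with the extra term \dot R / R = H (the Hubble rate):
\frac{\dot G}{G} \;=\; 3\,\frac{\dot r}{r} \;-\; 2\,\frac{\dot P}{P} \;-\; \frac{\dot m}{m} \;-\; \frac{\dot R}{R}
```

On the first relation, a measured right-hand side close to zero forces Ġ/G ≈ 0; on the second, the very same near-zero measurement is compatible with a Ġ/G of order the Hubble rate, which is why the LLR data do not by themselves test for a changing G.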
IBE ('Inference to the best explanation' or abduction) is a popular and highly plausible theory of how we should judge the evidence for claims of past events based on present evidence. It has been notably developed and supported recently by Meyer following Lipton. I believe this theory is essentially correct. This paper supports IBE from a probability perspective, and argues that the retrodictive probabilities involved in such inferences should be analysed in terms of predictive probabilities and a priori probability ratios of initial events. The key point is to separate these two features. Disagreements over evidence can be traced to disagreements over either the a priori probability ratios or predictive conditional ratios. In many cases, in real science, judgements of the former are necessarily subjective. The principles of iterated evidence are also discussed. The Sceptic's position is criticised as ignoring iteration of evidence, and characteristically failing to adjust a priori probability ratios in response to empirical evidence.
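The separation described here can be sketched in odds form (a minimal illustration in my own notation, not the paper's): the posterior odds between two candidate initial events equal the a priori probability ratio multiplied by the predictive likelihood ratio of each piece of present evidence, and iterated evidence simply multiplies in further ratios:

```python
def posterior_odds(prior_odds, likelihood_ratios):
    """Odds form of Bayes' theorem for retrodiction.

    prior_odds: a priori probability ratio P(H1) / P(H2) between two
        candidate initial events.
    likelihood_ratios: predictive ratios P(E_i | H1) / P(E_i | H2) for
        each piece of present evidence, assumed independent.
    """
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# Even with sceptical prior odds of 1:100 against H1, three independent
# pieces of evidence, each ten times more probable under H1, leave H1
# favoured roughly 10:1.
odds = posterior_odds(0.01, [10, 10, 10])
```

The criticism of the Sceptic can be read off directly: holding the a priori ratio fixed no matter how much evidence accumulates means refusing to let this multiplication of likelihood ratios do any work.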
In the philosophy of time, the neo-positivist is focussed above all else on sustaining the view called the static theory of time, as the very foundation of their scientific metaphysics. This is the deeply held metaphysical conviction of almost all ‘modern philosophical-scientific’ writers on time. In fact it is hardly too much to say that the entire official modern 20th Century philosophy of physics rests on the assumption that the static theory of space-time is the only concept of time we can use in physics. The static theory of time prescribes the representational space for physics as being logically based on space-time, to the point that physicists are incapable of conceiving a theory without space-time any longer, and mutter superstitiously among themselves if someone suggests such a thing. By extension, this space-time provides the representation of all reality for the modern scientific materialists, since they believe that everything ultimately reduces to physics. This agenda simultaneously requires discrediting the alternative concept of time flow and the associated traditional metaphysical concepts that it supports. This Chapter explains the concept of time flow and contrasts it with the static or block universe theory, as a genuine and natural alternative for a physical ontology.
This study presents a new type of foundational model unifying quantum theory, relativity theory and gravitational physics, with a novel cosmology. It proposes a six-dimensional geometric manifold as the foundational ontology for our universe. The theoretical unification is simple and powerful, and there are a number of novel empirical predictions and theoretical reductions that are strikingly accurate. It subsequently addresses a variety of current anomalies in physics. It shows how incomplete modern physics is by giving an example of a theory that is genuinely unified. In doing this, it radically alters the metaphysical interpretation of the nature of time, space and matter currently drawn from modern physics. It also profoundly challenges materialist expectations about a naturalistic account of our own existence. I contend here that there is sufficient evidence to support this theory as a leading paradigm for a unified foundational theory.
‘Philosophy’ today has come to mean the academic ideological disputes between various grandiose ‘meta-philosophies’, rather than the content or explanation of the real problems and issues. I illustrate typical expressions of the conventional ‘scientific’ anti-realist philosophy of time here, and how far it has infiltrated the scientific world view.
This chapter starts with a simple conventional presentation of time reversal in physics, and then returns to analyse it, rejects the conventional analysis, and establishes correct principles in their place.
The conventional claims and concepts of 5* - 8* are a hangover from the classical theory of thermodynamics – i.e. thermodynamics based on a fully deterministic micro-theory, developed in the time of Boltzmann, Loschmidt and Gibbs in the late C19th. The classical theory has well-known ‘reversibility paradoxes’ when applied to the universe as a whole. But the introduction of intrinsic probabilities in quantum mechanics, and its consequent time asymmetry, fundamentally changes the picture.
These are the first two chapters from a monograph (The Time Flow Manifesto, Holster, 2013-14; unpublished), defending the concepts of time directionality and time flow in physics and naturalistic metaphysics, against long-standing attacks from the ‘conventional philosophy of physical time’. This monograph sets out to disprove twelve specific “fallacies of the conventional philosophy”, stated in the first section below. These are the foundational principles of the conventional philosophy, which developed in the mid-C20th from positivist-inspired studies. The first chapter begins by re-presenting the basic analysis of time reversal symmetry in the context of probabilistic or non-deterministic processes, removing the first critical error in the conventional account. The second chapter argues for a law-like explanation of physical time asymmetry and irreversibility, and shows how the ‘reversibility paradoxes’ are explained.
In this chapter, we see one way that time flow may force us to develop our physical theory if we add it back into physics proper. Now of course this is speculative in this context, and should be thought of as a model. The two following extracts are from introductions to a more complete unified theory. They explain the basic mathematical models that are required to illustrate the point that such models may be plausible. The second extract, ‘the parable of the ants’, introduces us to the ideological-philosophical conflict that prevents such a development being considered in the present generation, which we will go on to next.
This study analyses the predictions of the General Theory of Relativity (GTR) against a slightly modified version of the standard central mass solution (the Schwarzschild solution). It is applied to central gravity in the solar system, the Pioneer spacecraft anomalies (which GTR fails to predict correctly), and planetary orbit distances and times, etc. (where GTR is thought consistent).

The modified gravity equation was motivated by a theory originally called ‘TFP’ (Time Flow Physics, 2004). This is now replaced by the ‘Geometric Model’, 2014, which retains the same theory of gravity. This analysis is offered partially as supporting detail for the claim in  that the theory is realistic in the solar system and explains the Pioneer anomalies. The overall conclusion is that the model can claim to explain the Pioneer anomalies, contingent of course on the analysis being independently verified and duplicated.

However the interest lies beyond testing this theory. To start with, it gives us a realistic scale on which gravity might vary from the accepted theory while remaining consistent with most solar-scale astronomical observations. It is found here that the modified gravity equation would appear consistent with GTR for most phenomena, but it would retard the Pioneer spacecraft by about the observed amount (15 seconds or so at the time). Hence it is a possible explanation of this anomaly, which as far as I know has remained unexplained for 20 years.

It also shows what many philosophers of science have emphasized: the pivotal role of counterfactual reasoning. By putting forward an exact alternative solution, and working through the full explanation, we discover a surprising ‘counterfactual paradox’: the modified theory slightly weakens GTR gravity, and yet the effect is to slow down the Pioneer trajectory, making it appear as if gravity is stronger than GTR.
The inference that “there must be some tiny extra force…” (Musser, 1998) is wrong: there is a second option: “…or there may be a slightly weaker form of gravity than GTR.”
This paper argues that ordinary object languages for fundamental physics are incomplete, essentially because they are extensional, and consequently lack any adequate formal representation of contingency. It is shown that it is impossible to formulate adequate deduction systems for general transformations in such languages. This is argued in detail for the time reversal transformation. Two important controversies about the application of time reversal in quantum mechanics are summarized at the start, to provide the context of this problem, and show its serious implications, but the aim here is not to solve these problems. The flaw is not special to quantum mechanics: it is a general feature traced to extensionality, and demonstrated through a simple example in classical physics. It is proposed that this defect can be overcome by extending to an intensional semantics, but this involves extending the usual formalism of physics. A detailed proposal for such an extension is given in Part 2.
This continues from Part 1. It is shown how an intensional interpretation of physics object languages can be formalised, and how a syntactic compositional time reversal operator can subsequently be defined. This is applied to solve the problems used as examples in Part 1. A proof of a general theorem that such an operator must be definable is sketched. A number of related issues about the interpretation of theories of physics, including classical and quantum mechanics and classical EM theory, are discussed.
Pavel Tichy (1936-1994) was a Czech philosopher who originally studied and worked at Charles University in Prague, and spent the second half of his life in New Zealand as a political refugee. Early in his career he invented intensional logic, simultaneously with Richard Montague, but published his version in 1971, slightly after Montague's 1970 papers, and has never been recognised for this achievement. But this was only the beginning of his work. He developed a highly original theory of semantics called Transparent Intensional Logic, which is the basis of an important research program based in the Czech Republic and Slovakia. He published a wide range of original work in semantics, philosophy of logic and language, philosophy of science, and metaphysics; but despite the indisputable quality of this work, he has gained little contemporary recognition. This article provides a brief introduction to his work, focussing mainly on basic ideas of his intensional semantics and his theory of 'constructions'.
David Albert has recently argued that classical electromagnetic theory (EM) is not time reversal invariant (non-TRI), while David Malament rejects this argument and maintains the orthodox result, that EM is TRI. Both Albert's and Malament's arguments are analysed, and both are found wanting in certain respects. It is argued here that the result really depends on the theoretical ontology chosen to interpret EM theory, and there is more than one plausible choice. Albert and Malament have chosen different plausible ontologies; but neither shows that their choice of interpretation is definitive. Deeper principles about this choice are examined. The extension to EM theory with magnetic monopoles is also examined. It is concluded that, despite certain flaws in his account, Albert's analysis does reveal serious problems in the orthodox account, which Malament's response does not adequately address.
This paper outlines a ‘paradox’ in quantum measurement theory, illustrated with two different types of systems. If this paradox cannot be resolved in ordinary quantum mechanics (as I currently think), it is alarming. If it can be resolved, it can be added to a long list of examples that show the internal consistency of quantum mechanics, and in this case I hope the correct analysis will be an interesting example for students. The immediate paradox involves a failure of Lorentz invariance for measurements on certain types of single particle systems. It seems to show that an absolute frame of reference is required to describe wave function collapse. But if the argument is correct, quantum measurement theory is not merely dependent on the frame of collapse: it is inconsistent, and experimental tests would have to be done to verify whether the apparent non-local effects predicted are real.