An Unqualified Commentary

At the end of most scientific discussions, an attempt is often made to draw some conclusions about the work presented, albeit with a few ‘qualified’ caveats. Of course, having never claimed any authority on any of the issues discussed, it was felt that maybe this discussion should be qualified with the caveat of ‘unqualified’. Then again, the word ‘conclusion’ might also be somewhat misleading, as it may suggest a finality that rarely applies in science. For even on the opening page of this section, it was recognised that ‘this section must always be defined as work-in-progress’; although it might have been simpler to state ‘this section will always be incomplete’. For if there is any overarching conclusion in science, which is supported by history, it is that the debate about ‘this model’ versus ‘that model’ will continue long into the future.

Given such caveats, was there any purpose to the work presented?

In retrospect, we can always question any human endeavour, no matter how great or small. However, it appears to be part of the human condition, both as individuals and collectively, that we come to seek some understanding of the world and the wider universe. As such, this website was only ever a ‘personal attempt’ to learn, rather than to lecture, about the nature of existence, and this alone defines its purpose, no matter if it turns out to be wrong on every account. This said, the development of the Mysearch website was also a genuine attempt to conduct an honest ‘duty of inquiry’ into a broad spread of ideas that underpin modern science, and our various worldviews, such that any commentary would not necessarily be without some foundation.

OK, so what unqualified commentary might be made about science as a whole?

While the details of the scientific method might be debated, a framework consisting of incremental stages may not be unreasonable for general reference. Within this methodology, there is room for speculation, hypothesis and theory, which ideally only become accepted as ‘fact’ once verified via some quantifiable and repeatable process. Of course, it has to be recognised that even an accepted ‘fact’ can subsequently be proved wrong, but this is very different from the acceptance of a hypothesis or theory prior to any reasonable level of verification.

So does deviation from this scientific method take place?

History might suggest that there has always been a tendency to start accepting well-established theory as fact, prior to any formal verification. For example, Aristotle’s model of the universe was long held true, even though it was founded only on philosophical conjecture with little thought given to its verification. While the later theories associated with Newton’s laws of motion and universal gravitation might be seen as classical examples of the scientific method, it was clear that many theories were starting to outpace the ability of science to verify them. So, by the time we come to the latter part of the 19th and early 20th centuries, it might be argued that scientific verification was becoming increasingly dependent on the construction of mathematical models, which were often only supported by a few inconclusive experiments. In this context, we might perceive the development of ‘theoretical science’ as opposed to the ‘applied sciences’, to the point that there may now be a perception that theoretical science is somehow intellectually superior to the applied sciences.

If true, why might some hold to this elitist ‘segregation’ of science?

Well, at one level, many of the issues addressed within the general classification of theoretical science appeared to have no obvious or definitive answer at the time of publication. Therefore, it started to be seen as the ‘leading-edge’ of advanced scientific thinking, even though the ‘applied sciences’ still reflected the actual bedrock of verified knowledge. However, in being unshackled from the need to verify any given theory before publication, it would appear that theoretical science gained the ability to capture the public imagination, and research funding, such that we, the general public, also started to believe that science was moving ever-closer towards some elusive ‘Theory of Everything’.

But why the apparent criticism of such optimism?

While it is both difficult and time-consuming for the lay-person to understand many of the details being discussed within the remit of theoretical science, there is a ‘feeling’ that some of the subjects reviewed in this section are not necessarily as ‘cut-and-dried’ as mainstream science would like us to believe. One might go as far as to say that the accepted worldview of science is just as capable of indoctrination and hype as any other form of worldview. While this may be too strong, ‘big science’ has certainly learnt the value of marketing PR in its need to raise research funding, and this is not always a ‘good thing’.

So is this just the pretext for another conspiracy theory?

Absolutely not. There are many discussions within this website that argue strongly in support of the scientific worldview, but this does not mean that every claim of science should simply be accepted as ‘gospel truth’. For this would not only be to abandon the scientific method, but to belittle the very ‘ethos of science’. So while still strongly supporting the needs, goals and necessity of science, it is naïve to believe that the institutions of science are immune to the pressures and temptations of the world, where success brings not only esteem and influence, but financial security. As a consequence, many young scientists may have come to perceive the dangers in challenging too many of the accepted ‘pillars of science’. However, putting such concerns to one side, it has been repeatedly argued that the following principles, laid down by William Clifford over 100 years ago, are still a valid ‘yardstick’ by which to examine any worldview, including the scientific one.

  1. The Duty of Inquiry: It is wrong to believe, or accept, as fact anything based on insufficient evidence. As such, doubt and investigation should take precedence over acceptance and belief.

  2. The Weight of Authority: We should only accept the statements of another when it is reasonable to assume that this person has adhered to the duty of inquiry and knowledgeably speaks the truth, so far as it may be known.

  3. The Limits of Inference: We may infer beyond our experience only when what we do not know is like what we do know, while continuing to accept that any inference must be subject to verification.

But are these principles really that problematic to theoretical science?

While the first two principles may not present an obvious problem, the segregation of science into different specialised fields of research, especially over the last hundred years or so, means that the knowledge of many scientists is defined by depth within a single field of science rather than breadth across the spectrum of science. Therefore, when the problem space extends across multiple fields of expertise, it is often necessary for one specialist to simply rely on the work of another. As such, it may not always be obvious whether any given assumption, in a hierarchy of other assumptions, is based on ‘insufficient evidence’. However, it seems clear that the ‘limits of inference’ principle could be a particular problem for theoretical science, if seen as a barrier to further speculation. So while speculation is a legitimate part of the scientific method, the problems start when speculation becomes hypothesis, hypothesis is established as theory, and theory is finally accepted as fact before any reasonable level of verification is attained. We might outline a few of the issues previously discussed as examples:

What is a black hole and do they exist?

The earliest ideas of a black hole can be dated back to 1783, while the basic idea of an escape velocity exceeding the speed of light can still be discussed in terms of Newtonian mechanics. However, the later definition of the Schwarzschild radius suggested the idea of an event horizon, which in connection with relativistic theory implied a breakdown of all normal ideas about spacetime, after which all the mass contained within the event horizon would collapse into a singularity. However, the idea of the singularity preceded any in-depth understanding of the classical atomic structure and the possible limits of sub-atomic degeneracy. So, over the years, ever more complex models of a black hole have been developed in an attempt to reconcile the description within the accepted understanding of both relativity and quantum theory. However, the basic idea is now so well established that few ever question their physical existence, in some form or another, even though we might still have to question the mathematical models that describe them and the actual role they play in the large-scale structure of the cosmos.
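
As an unqualified aside, the 1783 ‘dark star’ reasoning can be reproduced with nothing more than Newtonian mechanics, where setting the escape velocity equal to the speed of light recovers the same radius later formalised by Schwarzschild; a minimal sketch of the arithmetic:

```python
# A minimal sketch (not from the original text): the Newtonian escape
# velocity v = sqrt(2GM/r) reaches the speed of light at exactly the
# radius r = 2GM/c^2 later formalised as the Schwarzschild radius.
import math

G = 6.674e-11   # gravitational constant [m^3 kg^-1 s^-2]
c = 2.998e8     # speed of light [m/s]

def schwarzschild_radius(mass_kg):
    """Radius at which the Newtonian escape velocity equals c."""
    return 2.0 * G * mass_kg / c**2

def escape_velocity(mass_kg, radius_m):
    """Newtonian escape velocity [m/s] at a given radius."""
    return math.sqrt(2.0 * G * mass_kg / radius_m)

M_sun = 1.989e30  # solar mass [kg]
r_s = schwarzschild_radius(M_sun)
print(f"Schwarzschild radius of the Sun: {r_s / 1000:.2f} km")   # ~2.95 km
print(f"Escape velocity at that radius: {escape_velocity(M_sun, r_s) / c:.3f} c")
```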

What about quantum theory and its many interpretations?

While the details of a black hole might be questioned, its description can still be outlined to a wider audience in terms of Newtonian mechanics. However, this was never the case with quantum theory, which appeared to cross the limits of inference from the outset, as it is ‘nothing like what we know’, or possibly understand even today, in terms of its requirements for wave-particle duality and its probability description of the wave function collapse. While such reservations cannot simply refute the idea, ‘causal logic’ and Occam’s Razor might suggest that verification was even more essential in this case. However, today, every young physicist is made well aware of Einstein’s failed attempts to discredit aspects of quantum theory, e.g. EPR. So, over time, quantum theory has become established as ‘fact’, while morphing from quantum mechanics into Quantum Field Theory (QFT), which now encompasses three different models of the quantum world.

While some may still question the actual level of verification associated with all these models, it does not seem to have stopped theoretical science from using them to underpin the standard particle model. However, despite all the reservations associated with these theories, both in terms of mathematical consistency and as a description of physical cause and effect, the last 60 years of experimentation appears to have only tested specific ‘data-points’ in isolation without necessarily challenging or proving some of the more fundamental concepts underpinning the standard model. While this website has simply made an attempt to understand some of the issues associated with these models, in truth, it is a specialist area of research that few have the authority or resources to question. Of course, over the years, some have tried to question the details of these quantum models and a few have even attempted to propose alternative theories. In different times, these individuals might be described as ‘maverick thinkers’ or kindly dismissed as ‘Don Quixote’ characters charging the established ‘windmills’ of accepted science. However, in today’s world, the trolls of the scientific community seem to quickly pile in and form a ‘majority’, who then dismiss these individuals out-of-hand as ‘cranks’ or ‘nuts’ with dire warnings that their work should simply be discarded without review. Of course, they need not worry too much on the last point, as ‘cranks and nuts’ never get their work past peer-review. Again, it is possible that young scientists quickly learn of the dangers of straying too far off the mainstream path of science, at least if they want a career in theoretical science. If you think this is unfair, try opening a thread of discussion in the Physics Forum about any non-mainstream idea and monitor the reaction. Anyway, moving on:

What about the state-of-play of the theoretical science of cosmology?

Possibly, given its scope to cover all of time and space, it seems that the field of cosmology has been allowed to disregard almost all consideration of the ‘limits of inference’ principle. While the lay-person can still review the ideas associated with the Friedmann equations and run their own energy-density models, it is not clear that they will be any the wiser as to what is really going on. For this model primarily uses gravity-only calculations, which take little account of other forces and energy sources at work in the universe. However, today, even the most basic energy-density model powered by gravitation requires the inclusion of ‘dark energy’, the attributes of which are like ‘nothing we know’. Of course, it has long been recognised that the energy-density model would also break down as the moment of the ‘Big Bang’ approaches a ‘quantum singularity’. As such, the energy-density model is now preceded by the idea of quantum inflation, which in turn can make reference to even more speculative ideas, such as string theory, quantum gravity and colliding M-branes etc. However, despite the degree of speculation associated with all these ideas, confidence still appears high about our understanding right back beyond the first microsecond of the existence of the universe, although increasing qualification may now surround the full scope of the quantum universe. Even if we return to the comparatively mundane examination of galactic rotation, it would seem that basic Newtonian mechanics has to be jettisoned in favour of another unverified idea in the form of ‘dark matter’, which also has no description in the current particle model. Nevertheless, cosmologists appear increasingly confident based on ‘evidence’ in the form of various computer simulations of the current standard model.
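
For the curious lay-person, the sort of energy-density model mentioned above can be run in a few lines; the sketch below assumes the standard Friedmann expansion rate and purely illustrative density parameters:

```python
# A minimal sketch of a gravity-only energy-density model, assuming the
# standard Friedmann expansion rate:
#   H(a)^2 = H0^2 * (Om_r/a^4 + Om_m/a^3 + Om_k/a^2 + Om_L)
# The density parameters below are illustrative assumptions; nothing in
# the calculation explains what 'dark matter' or 'dark energy' are.
import math

H0   = 70.0     # Hubble constant [km/s/Mpc] (assumed value)
Om_m = 0.30     # all matter, including 'dark matter' (assumed)
Om_r = 8.4e-5   # radiation (assumed)
Om_L = 0.70     # 'dark energy' as a cosmological constant (assumed)
Om_k = 1.0 - (Om_m + Om_r + Om_L)   # curvature closes the energy budget

def hubble(a):
    """Expansion rate H(a) for scale factor a, where a=1 is today."""
    return H0 * math.sqrt(Om_r / a**4 + Om_m / a**3 + Om_k / a**2 + Om_L)

for a in (0.001, 0.1, 0.5, 1.0):
    print(f"a = {a:5.3f}   H = {hubble(a):14.1f} km/s/Mpc")
```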

Note: The model, called Illustris, creates a 3D space of some 12 billion pixels using the ‘accepted’ equations of cosmology, but with the added assumption of normal and dark matter plus dark energy. The simulation starts just 12 million years after the ‘Big Bang’ and simulates some 41,000 galaxies condensing into galactic clusters from a seemingly chaotic churning of matter.

You are left to consider whether a simulation leading to a known end-point, i.e. today’s observed universe, is sufficient verification of the cosmological model. However, just in case there are still some doubts, some $8.8 billion is currently being earmarked for the James Webb Space Telescope (JWST), the successor to the Hubble Space Telescope (HST). While this ‘big science’ project will surely help theoretical science put paid to any remaining cynicism surrounding the current model, the following question is tabled nonetheless:

Can theoretical science really talk with authority when verification is questionable?

As the issues touched on above have already been discussed, we might turn our attention to one final topic that has been in the news lately, linked to the ‘discovery’ of the Higgs boson. The search for the Higgs particle is now typically acclaimed as one of the great success stories of ‘big science’ and another verification of the standard particle model. While not an easy subject for the lay-person to understand, we may still all be interested in what $13.25 billion of research funding has achieved.

Note: The reference to ‘big science’ does not necessarily mean fundamental science. Rather, it seems to reflect the ‘big’ money required to do anything that remotely comes near to addressing some of the fundamental questions raised over the last 100 years.

The search for the Higgs boson has occupied theoretical science for the last 50 years or so. While the name of Peter Higgs is now firmly linked to the recent CERN ‘discovery’, there is some historical debate surrounding the initial theoretical prediction of the Higgs boson and Higgs field, as there were a number of theoretical scientists working on a mass-generating mechanism back in the early 1960s. However, possibly as a means of confirming the success and validity of CERN’s discovery, Peter Higgs has since become a Nobel Prize Laureate and been granted the freedom of the city of Edinburgh, although for the purposes of this discussion, we shall focus on a more important issue:

What the hell is a Higgs boson and the Higgs field?

If you know nothing about the particle model, it is suggested that you follow the previous link to review the outline description of the terminology frequently used. However, for simplicity, we can reduce the complexity of the particle model to just three particles, where an atomic nucleus consists of ‘protons’ and ‘neutrons’ surrounded by ‘electrons’.

Note: In order to cross-reference the jargon of the particle model, the protons and neutrons are associated with a grouping called ‘baryons’, which are thought to have a sub-structure defined in terms of ‘quarks’. The electron, as a particle, has no known sub-structure and is associated with a group called ‘leptons’. The groupings of ‘baryons’ and ‘leptons’ also sit under the overall group name of ‘fermions’.

For the moment, these group names do not help and the more important point is that we have three types of particles, i.e. protons, neutrons and electrons, to which we might attach the concept of mass. As such, we might now introduce the idea of another group called ‘bosons’, which are often confusingly described using the semantics of a ‘particle’, but which act as ‘force-carriers’. In this context, we might equate the idea of different bosons to the basic description of the fundamental forces plus an extended idea of mass:

Boson      Force
--------   ---------------
Photon     electromagnetic
Gluon      strong
W-Z        weak
Graviton   gravity

Higgs      ‘mass’

The Higgs boson is separated from the other four, because mass is not classically defined as a force. However, it might be argued that mass and gravity are distinct concepts, such that mass might still exist without gravity, where mass would be left to explain inertia.

Note: The idea of a ‘force-carrier’ may appear quite strange to a lay-person familiar with the description of force as the change in energy [E] with distance [x], i.e. [dE/dx]. As such, we might wish to consider what scalar energy is to be transported by the ‘force-carrier’, e.g. kinetic energy [½mv²], apparently requiring the force-carrier to have mass, or potential energy, possibly being transported as a wave in the field? If the latter, what is the source and nature of this potential energy?

So, as a generalisation, the idea of the Higgs boson appears to describe a ‘particle’ that acts as the ‘force-carrier’ responsible for the mass of ‘real’ particles, i.e. the protons, neutrons and electrons. At this point, it is worth remembering that the universe is not responsible for the semantics of its description; this is purely the work of theoretical scientists.

OK, but where does the Higgs field fit into this?

In the confines of quantum theory, the idea of a particle gets replaced by the discussion of fields, where the field represents some quantity at each point in space. Some fields are described as scalar fields, only having magnitude, while others are described as vector fields having both magnitude and direction.
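
As a minimal sketch of this distinction, in terms of the information each type of field carries at a point in space (the field shapes below are purely illustrative):

```python
# A minimal sketch of the distinction just described: a scalar field
# returns a single magnitude at each point, while a vector field returns
# a magnitude and direction (components). The field shapes are illustrative.
import math

def scalar_field(x, y, z):
    """One number per point, e.g. falling off with distance from the origin."""
    r = math.sqrt(x*x + y*y + z*z)
    return 1.0 / r if r else float("inf")

def vector_field(x, y, z):
    """Three components per point, e.g. inverse-square, pointing inward."""
    r = math.sqrt(x*x + y*y + z*z)
    return (-x / r**3, -y / r**3, -z / r**3) if r else (0.0, 0.0, 0.0)

print(scalar_field(3.0, 4.0, 0.0))   # magnitude only
print(vector_field(3.0, 4.0, 0.0))   # magnitude plus direction
```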

Note: Given the level of physical verification, it may not be unfair to describe Quantum Field Theory (QFT) as a mathematical and conceptual model that underpins the standard particle model. Therefore, the physical reality of the many quantum fields discussed in this context may still be questionable, such that we might also have to question the physical reality of the Higgs field.

So while the semantics of a particle is often retained, quantum theory is essentially describing a particle as something more like a ‘ripple’ in a field. Using this analogy, the Higgs boson might be described as the smallest possible ripple in the Higgs field. However, it is the Higgs scalar field that is thought to give a ‘particle’ its mass through an ‘interaction’, in which particles that interact more strongly with the Higgs field have more mass and vice versa.

Note: Whether the idea of mass, either inertial or gravitational, is actually explained by this conceptual description of field interaction may also have to be questioned and examined in more detail. For it appears to be more of an analogy than an explanation. For example, Q: why is the mass of a proton 1836 times greater than that of an electron? A: because it interacts 1836 times more strongly with the Higgs field. Of course, we might want to know how mass as an energy-density and a scalar quantity interacts with anything without some description of the kinematics at work.
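
For reference, the standard model does quantify this ‘interaction strength’ for elementary fermions through the Yukawa relation [m = yv/√2], where [v ≈ 246 GeV] is the assumed vacuum value of the Higgs field; the following sketch simply inverts this relation, with the caveat noted in the comments:

```python
# A minimal sketch, assuming the standard-model Yukawa relation for
# *elementary* fermions: m = y * v / sqrt(2), with v ~ 246 GeV. Note the
# caveat: most of a proton's mass is attributed to strong-force binding
# energy, not to the Higgs couplings of its quarks.
import math

v = 246.0  # Higgs field vacuum expectation value [GeV]

def yukawa_coupling(mass_gev):
    """Dimensionless coupling implied by a given fermion mass."""
    return mass_gev * math.sqrt(2.0) / v

for name, m in [("electron", 0.000511), ("muon", 0.1057), ("top quark", 173.0)]:
    print(f"{name:10s}  m = {m:10.6f} GeV   y = {yukawa_coupling(m):.2e}")
```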

So what level of verification now supports this theory?

Well, as is normal in theoretical science, the ‘devil is in the detail’, which far exceeds the lay-person description given above. However, a lay-person might still recognise that the Higgs boson and associated field is not open to direct observation, such that it has to be inferred through the complexity of computer analysis of high-energy collisions, which may involve billions of particles.

Note: As the Higgs boson cannot be directly detected, its existence is inferred by following a decay trail of the boson into other particles. However, the interpretation of this decay trail may be very dependent on the actual particle model assumed to underpin the analysis. While there are apparently a number of ways the Higgs boson might decay, the CERN analysis has focused on two: the decay into two photons and the decay into four leptons.


So what was actually detected by CERN?

Reading between the lines of the many, many, many articles written about the Higgs discovery, it would seem that CERN have only actually claimed to have found ‘a’ Higgs boson, but not necessarily ‘the’ Higgs boson assumed by the current particle model. Therefore, as indicated, the devil is in the detail, not only when it comes to the theory, but also the actual verification process itself. Clearly, the verification process for the Higgs boson is very complex in all its details, such that we might only characterise this complexity. It would seem that the verification process is anchored in a computer simulation of the Higgs boson, which is designed to analyse the decay processes according to calculations accepted by the standard model. Within the simulation, an algorithm is required to select all the particles that might be produced within the ‘observed’ decay trails and all the many other processes or ‘background noise’ that might also produce similar results. A method is then needed to filter the possible ‘genuine’ data from all the background noise by comparing energy, momentum, angles and mass plus many other quantities, which might affect the interpretations of the results.

Note: Presumably, these ‘particles’ are moving with relativistic velocities, such that relativistic effects would also have to be accounted for, not only in terms of energy and momentum, but also the effective kinetic mass and decay times affected by time-dilation. Just more devil in the detail?

Having established the simulation, the raw data collected by the actual detection technology, which represents another level of complexity that we shall ignore, is then fed into the computer simulation model which, if correct, will then filter and calculate the required strength of the energy ‘signal’, which is then interpreted as evidence of the existence of the Higgs boson.
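
While CERN’s actual pipeline is vastly more sophisticated, the basic idea of filtering a decay trail might be caricatured in a few lines; the sketch below, using purely illustrative numbers, computes the invariant mass of two photon candidates and keeps only those near the hypothesised 125 GeV:

```python
# A minimal sketch, not CERN's actual pipeline: for two (massless)
# photon candidates, the invariant mass follows from
#   m^2 = 2 * E1 * E2 * (1 - cos(theta))
# and a crude 'filter' keeps only candidates near the hypothesised mass.
# All event numbers below are purely illustrative.
import math

def diphoton_mass(e1, e2, theta):
    """Invariant mass [GeV] of two photons with opening angle theta [rad]."""
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(theta)))

def select(events, centre=125.0, window=5.0):
    """Keep events whose diphoton mass falls within the search window."""
    return [ev for ev in events if abs(diphoton_mass(*ev) - centre) < window]

# hypothetical candidates as (E1 [GeV], E2 [GeV], opening angle [rad])
events = [(62.0, 64.0, 3.0), (40.0, 45.0, 1.2), (70.0, 60.0, 2.8)]
for ev in select(events):
    print(f"candidate kept: m = {diphoton_mass(*ev):.1f} GeV")
```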

So what ‘signal’ was detected?

Based on theory, the mass of the Higgs boson is now assumed to correspond to an energy in the region of 125 GeV, which might be compared to that of a proton in the region of ~1 GeV. However, these units do not really convey much information on an intuitive level, such that we might convert 125 GeV to 2×10⁻⁸ joules, where 1 joule might be visualised in terms of an apple falling 1 metre. So while the mass of the Higgs boson may appear large in comparison to other sub-atomic ‘particles’, its energy is still incredibly small. In addition, the Higgs boson is highly unstable, with a decay lifetime in the order of one tenth of one billionth of one trillionth of a second, i.e. 10⁻²² seconds.
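
The conversion quoted above is simple enough for anybody to check; a minimal sketch, assuming a ~100 g apple:

```python
# A minimal sketch of the quoted conversion, assuming a ~100 g apple:
eV = 1.602e-19                      # joules per electron-volt
higgs_energy = 125e9 * eV           # 125 GeV in joules
apple_energy = 0.102 * 9.81 * 1.0   # m*g*h for ~102 g falling 1 metre

print(f"125 GeV            = {higgs_energy:.1e} J")   # ~2.0e-8 J
print(f"apple falling 1 m  = {apple_energy:.2f} J")   # ~1 J
print(f"ratio              = {higgs_energy / apple_energy:.1e}")
```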

Note: Does ‘something’ that only has a theoretical lifetime of 10⁻²² seconds actually constitute a ‘particle’? If not, might it be interpreted as some, as yet, unexplained transition or fluctuation in the minuscule energy states being modelled within a computer simulation predicated on an essentially unverified particle model?

However, while the Higgs boson only represents some 20 billionths of the energy of a falling apple and only exists for 1/10th of one billionth of one trillionth of a second, it requires very high energy conditions to come into existence within the Higgs field, which is assumed to exist throughout space-time. So far, the only place on Earth where these conditions are thought to have been recreated is within the Large Hadron Collider (LHC) at CERN. As outlined, the Higgs boson cannot be detected directly, only inferred via the detection, analysis and interpretation of many decay products, e.g.

While the decay paths into two photons or two Z-bosons are low-percentage outcomes, they apparently have the advantage of producing much less background noise. However, it is not possible for a 125 GeV Higgs boson to decay into two real Z-bosons, because each has an assumed mass of 91 GeV, i.e. a combined mass of 182 GeV, which is more than 125 GeV. This seems to be explained away in terms of a Z-boson and a virtual Z-boson, whose effective mass is much less. However, this ambiguity might explain why the Higgs boson signal was predicated on finding a photon decay trail, as it was thought the best way to find a ‘clean’ signal.

Note: The statement that the photon decay produces the ‘cleanest’ signal is not really understood, as most particle-antiparticle pairs appear capable of decaying into two photons. Presumably more devil in the detail?

What confidence is there in the CERN results?

Despite all the obvious complexity and reliance on a hierarchy of assumptions embedded in the particle model and the mathematics of quantum theory, scientists at CERN have declared a statistical confidence of [5σ], which might be translated as a 1 in 3.5 million chance that the signal detected was a false-positive resulting from background noise. However, whether this actually confirms that the ‘signal’ found at 125 GeV is the Higgs boson, as required by the standard particle model, might still be debated, as it is lighter than theory initially suggested.
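
For reference, the quoted figure follows from the one-sided tail probability of a normal distribution beyond five standard deviations; a minimal sketch:

```python
# A minimal sketch of what a [5-sigma] confidence means: the one-sided
# tail probability of a normal distribution beyond five standard
# deviations. This only bounds the chance of a background fluctuation;
# it says nothing about the correctness of the wider model.
import math

def one_sided_p(sigma):
    """P(Z > sigma) for a standard normal variable Z."""
    return 0.5 * math.erfc(sigma / math.sqrt(2.0))

p = one_sided_p(5.0)
print(f"p = {p:.2e}")          # ~2.9e-7
print(f"1 in {1.0 / p:,.0f}")  # ~1 in 3.5 million
```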

Does the notion of scientific investigative journalism even exist?

This is a rhetorical question, as the discovery of the Higgs boson now appears to be supported by mainstream particle physics and the many thousands of websites and publications, which obviously feel ‘qualified’ to support the evidence presented. However, a degree of caution may still be necessary when it comes to separating out those who are simply ‘affirming’ their belief in the evidence, those who are simply ‘reporting’ what they have been told, those who may simply have been ‘involved’ in the CERN project and those who are truly ‘qualified’ to ‘confirm’ and ‘explain’ the findings.

Note: At this point, it would be a public service if CERN could identify exactly who is ‘qualified’ to talk authoritatively on all the technical details of discovering ‘a’ Higgs boson, albeit not necessarily ‘the’ Higgs boson, and their overall confidence in the theoretical quantum-particle model used to support it. For does a [5σ] confidence really apply to the whole theory?

Even so, it would appear that the majority have already decided that the standard particle model has been re-affirmed by the CERN experiment such that ‘big science’ might now be well positioned to seek more funding for its next experimental program.

However, despite the apparent growing evidence, might the particle model still be wrong?

One overarching concern that might still be questioned is the apparent lack of consistency within the standard particle model, which appears self-evident in the continued use of ‘particle’ semantics. For it seems that the actual theory is predicated on a mathematical description of quantum fields, many of which may still be questionable in terms of any physical reality. So while it may have to be left to time to ultimately answer some of the questions raised, might we not still try to clarify some of the semantics of a ‘particle’ model, which in reality appears to have been jettisoned in preference to the underlying quantum model. Today, most theoretical scientists appear to be comfortable with the idea of the duality of ‘particle-fields’, which presumably stretches back to the ‘wave-particle’ duality of quantum mechanics. However, others may still ask for clarification in terms of the ‘cause and effect’ description that might then explain this model, e.g.

  • The current model seems to describe the Higgs boson as a ‘ripple’ in the Higgs field, which presumably may have to be quantized in the same way as the photon associated with an EM field. However, there is still considerable ambiguity about the actual existence of a photon across the entire EM spectrum, especially in the radio spectrum, where the wavelength would be measured in kilometres. Does the Higgs boson ripple have any frequency spectrum in terms of Planck’s energy equation, i.e. [E=hf]? See the numerical sketch after this list.

  • As a generalisation, fields of force associated with electric and gravitational potential are normally vector fields, so what mechanism explains the Higgs boson ripple ‘propagating’ through the Higgs scalar field, i.e. what causes the Higgs boson to move in the Higgs field in order to fulfil its role as a force-carrier?

  • It is assumed that the Higgs field permeates all of space-time, but what evidence supports this assumption and what relationship does this quantum field have with the spacetime of relativity or all the other quantum fields implied by quantum theory? Again, is there a physical ‘cause and effect’ mechanism that helps explain the existence and purpose of so many conceptual fields?

  • At this time, there appears to be no experimental evidence supporting the existence of the ‘graviton’ boson within the particle model. As such, the assumed existence of the ‘Higgs boson’ would only start to partially address the issue of unifying the standard model in respect of both mass and gravitation. Equally, is it possible that the addition of mass as a fundamental force-carrier may have some serious implications for Einstein’s field equations of general relativity?

  • Finally, we might return to the idea of mass having any substance beyond its description as an energy-density confined within some given volume of space-time. For it has been argued that ‘mass’ has no more substance than a ‘particle’ at the quantum level and both may have to be replaced by the more fundamental concept of field-energy. If so, mass (kg) cannot be used as a fundamental unit in the definition of energy; rather, it would appear more logical for the reverse to apply. However, the idea of energy as a scalar quantity then requires a mechanism to explain its movement in space-time. Does the Higgs mechanism help in this regard?
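
As referenced in the first bullet above, the photon energy implied by [E=hf] can be compared across the spectrum; a minimal sketch:

```python
# A minimal sketch comparing photon energies via E = h*f across the
# electromagnetic spectrum, from a 1 km radio wave to visible light.
h  = 6.626e-34   # Planck constant [J s]
c  = 2.998e8     # speed of light [m/s]
eV = 1.602e-19   # joules per electron-volt

for name, wavelength_m in [("1 km radio", 1000.0), ("500 nm visible", 500e-9)]:
    f = c / wavelength_m
    E = h * f
    print(f"{name:15s} f = {f:.2e} Hz   E = {E:.2e} J ({E / eV:.2e} eV)")
```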

While the concerns above may easily be addressed by a ‘qualified’ expert, or more likely simply dismissed as ‘unqualified commentary’, the issue of verification still appears to be a ‘thorn in the side’ of theoretical science for a number of reasons.

  • The sheer complexity of the verification process and the reliance on data analysis predicated on so many theoretical assumptions might appear to leave considerable room for further questioning.

  • Ideally, verification requires the result data to be reproduced by another independent organisation. However, in this case, the costs and expertise required to build the LHC appear so prohibitive that any truly independent secondary verification may be unlikely in the near future. This appears equally true in the theoretical ‘big science’ of cosmology, which seems to be adding to the complexity of the standard particle model by ‘discovering’ new particle-energy concepts in the form of ‘dark matter’ and ‘dark energy’.

  • It is unclear whether the current particle-quantum model has ever been shown to provide a ‘cause and effect’ description of sub-atomic reality. While possibly accepting its value as a means of calculation, it might be asked whether this fact has simply been exploited by ‘big science’ to justify the search parameters of its next expensive experiment. So while these models may be the best we have, it is unclear whether they truly reflect or provide an accurate description of what might be called quantum reality.

But why should we, the general public, care about such concerns?

From my own perspective, the future of humanity may ultimately depend on the ‘scientific worldview’ being built on solid foundations. In this respect, history shows us that the ‘worldview’ of the majority, i.e. mainstream opinion, is usually, if not always, wrong or at the very least incomplete. As a consequence, it has often required the courage of possibly ‘unqualified’ individuals, i.e. the mavericks, dreamers, cranks and nuts, to challenge the accepted worldview. However, the 20th century has seen a ‘paradigm shift’ away from the work of essentially independent scientists towards scientific research teams, funded by the large institutions of modern society. This is not necessarily a criticism, as this larger-scale approach and field specialisation has brought about many benefits and should certainly not be judged in terms of a subliminal conspiracy theory. However, this is not to say that the development of ‘big science’ cannot cause problems, if aspects of its operation give more priority to the reality of its ongoing research funding than to the reality of the scientific model being pursued.

Note: Forbes has put the total cost of the CERN project in the order of $13.25 billion to date. CERN also employs some 2,400 full-time employees and 1,500 part-time employees, while hosting some 10,000 visiting scientists and engineers from 600+ universities representing 113 nations. As such, it qualifies as big science that requires big money to maintain its existence.

However, the sort of money required to fund ‘big science’ in today’s world of economic recession does not come easily. Therefore, the many institutions of science, both applied and theoretical, are competing for limited pots of money. As such, the justification for funding theoretical science, which may have no obvious payback to society, may have come to rely heavily on the general perception and acceptance of a theoretical model, i.e. its brand image. For we might well ask:

Who would fund a multi-billion dollar research program suspected of being founded on false assumptions and suspect theory?

While, as repeatedly stated, this is not a thinly disguised attempt to forward a conspiracy theory involving ‘big science’, it is possibly naïve to assume that the various institutions of ‘big science’ associated with governments, academia and commerce do not spend a considerable amount of time, money and influence in promoting their own specific ‘worldview’ of science. In this respect, the larger institutions of theoretical science may effectively create a ‘cartel’ that is stifling the competition in terms of any alternative models, which in any other field of free-enterprise would probably be subject to investigation by a monopolies commission. Of course, the counter-argument is that theoretical science is simply reflecting a convergence of consensus within the scientific community towards a given theoretical model based on the available evidence. Either way, it might still be wise to remember the words of Albert Einstein:

‘Concepts that have proven useful in ordering things easily achieve such an authority that we forget their mundane origin and accept them as unalterable facts. Scientific progress is often stalled for a long time by such errors.’