New Electroweak precision measurements

CMS at the LHC (CERN) has just reported a new electroweak precision measurement, sin²θ_eff^lept = 0.23101 ± 0.00052, on November 14, 2017, see .

In the Standard Model, the Weinberg angle is a function of two more fundamental physical constants, the weak isospin coupling g and the weak hypercharge coupling g′, and all of these are ‘free parameters’ (not derived theoretically).

On the other hand, the Weinberg angle was calculated theoretically in G-theory, see , or page 36 of ‘Super Unified Theory’.


In fact, the Weinberg angle (θ) is precisely defined by equation (10), page 37 of ‘Super Unified Theory’, as follows.

sin(Δθ1) = sin²(Δθ2) = [sin²(Δθ3)]² = [sin³(Δθ4)]³

= [sin⁶(Δθ5)]⁶ = [sin⁶⁴(Δθ6)]⁶⁴ ……. Equation (10)


sin(Δθ1) = sin{A(1) − 3A(0)/24}

= sin{Cabibbo angle (θc) − 3A(0)/24} = 0.23067


sin²(Δθ2) = sin²(28.75°) = 0.2313502

Δθ2 = 28.75° (Weinberg angle, θW)


{sin(Δθ1) + sin²(Δθ2)}/2 = 0.23101

All Δ θn are mixing angles.
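The arithmetic quoted above can be checked with a short sketch, using only the numbers appearing in the text (the definitions of A(0) and A(1) are not reproduced here, so sin(Δθ1) is taken as the quoted 0.23067):

```python
import math

# sin^2 of the quoted mixing angle Δθ2 = 28.75° (the Weinberg angle)
sin2_theta2 = math.sin(math.radians(28.75)) ** 2
print(f"sin^2(28.75°) ≈ {sin2_theta2:.5f}")   # ≈ 0.23135

# quoted value of sin(Δθ1) from the Cabibbo-angle expression
sin_theta1 = 0.23067

# average of the two, to compare with the CMS value 0.23101 ± 0.00052
average = (sin_theta1 + sin2_theta2) / 2
print(f"average ≈ {average:.5f}")             # ≈ 0.23101
```

The average indeed lands within the quoted CMS uncertainty band.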


The angels and demons in the 100-year physics nightmare

Nature has been moving along nicely, minute by minute, for the past 14 billion years, playing its predetermined dance to its predetermined destiny with grace and joy.


On the contrary, mainstream human physics is now in a hellfire nightmare after the discovery of a new boson in 2012. Did it suddenly and unexpectedly fall into this hellfire nightmare? Or had many hellfire demons already plagued mainstream physics since its beginning 100 years ago? Logically, the latter must be the case. That is, the cause of today’s nightmare can be traced through its history.

The brief history

One, in 1925–1927, the Copenhagen doctrine DECLARED that ‘quantum uncertainty’ is an intrinsic attribute of nature and cannot, even in principle, be removed by improved measurement; this led to the ‘measurement mystery’.

Soon after, Schrödinger came up with his cat riddle, and it CREATED the ‘superposition mystery’, the omnipresence of the ‘Quantum God’.


Two, in early 1954, a general gauge symmetry theory was developed by Chen Ning Yang and Robert Mills. Then, in the early 1960s, Murray Gell-Mann discovered the “Eightfold Way representation” from the experimental data. The Yang–Mills theory is a mathematically beautiful tool for describing some symmetries, while the ‘Eightfold Way’ obviously encompasses some beautiful symmetry. However, the Yang–Mills field must be massless in order to maintain gauge invariance.


Three, in order for the Yang–Mills gauge theory to make contact with the real world (the Eightfold Way), its symmetry must be spontaneously broken. In 1964, Higgs et al. came up with a ‘tar-lake-like field’ (the Higgs mechanism) to break the SU gauge symmetry spontaneously.


Four, in 1967, Steven Weinberg and others combined an SU(2) gauge theory (a special Yang–Mills gauge) with the Higgs mechanism to construct the EWT (Electroweak Theory). And this EWT works beautifully for a two-quark model (with up and down quarks).


Five, in the November Revolution of 1974, Samuel Ting (and, independently, Burton Richter) discovered the charm quark via the J/ψ meson; the original two-quark model was thus expanded into a four-quark model.


Six, in 1973, Kobayashi and Maskawa introduced “CP Violation in the Renormalizable Theory of Weak Interaction”. Together with the idea of the Cabibbo angle (θc), introduced by Nicola Cabibbo in 1963, the ‘Cabibbo–Kobayashi–Maskawa (CKM) matrix’ was constructed. As this CKM matrix demands AT LEAST three generations of quarks, a six-quark model was constructed: the SM (Standard Model). The SM further predicts the charged weak currents (W±) and the neutral current (Z). The tau (τ) lepton was discovered in 1975.


Seven, in 1983, the W± bosons were discovered, and the Z soon after. Then the top quark was finally discovered in 1995.


At this point, the SM was basically confirmed. However, the Higgs mechanism also predicted a field boson. As the Higgs mechanism is the KEY cornerstone of the SM, the SM would not be complete until the Higgs field boson was discovered.


The brief history of BSMs

With the great success of the SM, a few BSMs (Beyond-Standard-Model theories) quickly emerged.


One, the GUT (Grand Unified Theory), with a higher symmetry: {SU(5) ⊃ SU(3) × SU(2) × U(1), at about the 10^16 GeV energy scale}. This work was mainly done by Glashow (with Georgi) in 1974. The key prediction of GUT is proton decay. From the early 1980s, a major effort was launched to detect proton decay. But the proton’s half-life is now firmly set at over 10^33 years, much longer than the age of this universe. To date, all attempts to observe new phenomena predicted by GUTs (such as proton decay or the existence of magnetic monopoles) have failed. With these results, Glashow basically went into hibernation, hoping that the ‘sterile neutrino’ would come to his rescue.


Two, the Preon model (by Abdus Salam), which was expanded into the Rishon model (mainly by Haim Harari). It has sub-quarks (T, V): {T (Tohu, which means “unformed” in the Hebrew Genesis) and V (Vohu, which means “void” in the Hebrew Genesis)}.

Rishons (T or V) carry hypercolor to reproduce the quark colors, but this setup renders the model non-renormalizable. So it was almost abandoned on day one.


Three, M-string theory began as a bosonic string theory. In order to produce fermions, it must incorporate the idea of SUSY. That is, M-string theory and SUSY are dicephalic parapagus twins.


In the 1960s–1970s, Vera Rubin and Kent Ford confirmed the existence of dark mass (not dark matter). SUSY was claimed to be the best candidate for providing this dark mass. Thus, M-string theory has dominated BSM physics for the past 40 years.


The awakening of the demons

In 2012, a Higgs-boson-like particle was discovered, with a measured mass of 125.26 GeV, which is trillions and trillions of times smaller than the naively expected (Planck-scale) value.


The only way out of this predicament is to have a hidden massive partner to cancel (balance) out the huge mass. This massive partner could be a SUSY particle or a twin-Higgs. By March 2017, no twin-Higgs nor any SUSY particle had been discovered below the 2 TeV range. Even if SUSY existed at some higher energy scale, it would no longer be a solution to this Higgs-naturalness issue.


Furthermore, the b/b-bar channel should account for over 60% of the Higgs boson’s decays. But by now (November 2017), this channel is still not confirmed. The best number was 4.5 sigma, from a report a year ago, which is not enough to make a confirmation. Most importantly, even if the channel were confirmed, it could not meet this 60% mark.
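For context on why 4.5 sigma falls short: by the standard particle-physics convention, a discovery claim requires 5 sigma, i.e. a one-sided Gaussian tail probability of about 3×10⁻⁷. The conversion is a textbook calculation:

```python
import math

def p_value(z):
    """One-sided Gaussian tail probability for a z-sigma excess."""
    return 0.5 * math.erfc(z / math.sqrt(2))

print(f"4.5 sigma -> p ≈ {p_value(4.5):.1e}")  # ≈ 3.4e-06
print(f"5.0 sigma -> p ≈ {p_value(5.0):.1e}")  # ≈ 2.9e-07 (discovery threshold)
```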


Thus, many physicists are now open to the possibility that this 2012 boson might not be the Higgs boson per se.


Yet, this Higgs demon does not stop its dance with the above issues.

The neutrino’s mass, by definition, cannot be accounted for by the Higgs mechanism (a tar-lake-like field that slows down massless particles so that they gain an apparent mass), as neutrinos do not slow down in the Higgs field at all. Thus, neutrinos must be Majorana fermions.


Yet, the Majorana angel has never been observed.

One, by definition, a Majorana particle must be its own antiparticle. But much data now shows that the neutrino is different from its antiparticle.

Two, a Majorana neutrino should induce ‘neutrinoless double beta decay’, but that process’s half-life is now set at over 10^25 years, much longer than the age of this universe.

Three, by definition again, a Majorana particle’s mass must come from the ‘seesaw’ mechanism, that is, be balanced by a massive partner, such as a sterile neutrino or something else (SUSY or whatnot). But the ‘sterile neutrino’ is now almost completely ruled out by much data (IceCube, etc.).

Four, the most recent analysis of Big Bang Nucleosynthesis fits well if the neutrino is a Dirac fermion (without a massive partner). If the neutrino is viewed as a Majorana particle (with a hidden massive partner), Big Bang Nucleosynthesis can no longer fit the observational data.


Without a Majorana neutrino, the Higgs mechanism is DEAD. With a dead Higgs mechanism, the SM is fundamentally wrong as a correct model, although it remains an effective theory.


This Higgs demon is now killing the SM, pushing the mainstream physics into the hellfire dungeon.


Of course, Weinberg and many prominent physicists still hope for a rescue from one of the BSMs, especially from M-string theory. But SUSY (a major component of M-string theory) is now totally ruled out as an EFFECTIVE rescue. And many of the most prominent string theorists are now abandoning M-string theory; see Steven Weinberg’s video presentation for the ‘Int’l Centre for Theoretical Physics’ on Oct 17, 2017, at the 1:32 (one hour and 32 minutes) mark. The video is available at . A brief quote of his remarks is available at


The rescuing angels

While theoretical physics is falling into the hellfire dungeon step by step, the experimental-physics angels are descending on Earth with sincerity and kindness.

One, dark mass (not dark matter) was firmly confirmed by the 1970s.

Two, the accelerating expansion of the universe was discovered in 1998.

Three, a good estimate of the CC (Cosmological Constant), ~3×10^−122 in Planck units, was reached in the 2000s.
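The magnitude of that dimensionless number can be checked with a short sketch. The ΛCDM value Λ ≈ 1.1×10⁻⁵² m⁻² used below is an assumption (it is not quoted in the text); expressing the CC in Planck units means multiplying by the Planck length squared:

```python
# physical constants (SI)
hbar = 1.0545718e-34   # J·s
G    = 6.67430e-11     # m^3 kg^-1 s^-2
c    = 2.99792458e8    # m/s

# assumed ΛCDM value of the cosmological constant (not from the text)
Lambda_si = 1.1e-52    # m^-2

# Planck length squared: l_P^2 = ħG/c^3
lp2 = hbar * G / c**3

# dimensionless CC in Planck units
cc_planck = Lambda_si * lp2
print(f"Lambda in Planck units ≈ {cc_planck:.2e}")  # ≈ 3e-122
```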

Four, a new boson with a mass of 125.26 GeV was discovered in 2012.

Five, the Planck CMB data (2013 and 2015) provided the following:

(dark energy = 69.2%; dark matter = 25.8%; visible matter = 4.82%)

Neff = 3.04 +/- …

Hubble constant: H0 (early universe) = 66.93 ± 0.62 km s−1 Mpc−1 (using ΛCDM with Neff = 3)

These were further supported by the ‘Dark Energy Survey’.


Six, the local value of the Hubble constant: H0 (now, late universe) = 73.24 ± 1.74 km s−1 Mpc−1. The difference between this measurement and the Planck CMB value shows a dark-flow rate, w = 9%.
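One way to see where a figure of roughly 9% comes from is a one-line sketch. An assumption here: the dark-flow rate is read as the fractional excess of the local H0 over the early-universe value (the text does not spell out its definition):

```python
h0_early = 66.93   # km/s/Mpc, Planck CMB (ΛCDM, Neff = 3)
h0_local = 73.24   # km/s/Mpc, local (late-universe) measurement

# fractional excess of the local value over the early-universe value
w = (h0_local - h0_early) / h0_early
print(f"w ≈ {w:.1%}")   # ≈ 9.4%
```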

Seven, the LIGO twin-neutron-star coalescence ruled out most of the MOND models in October 2017.

Eight, there is no difference between matter and antimatter other than their opposite electric charges.


The failed Inter-Universes Escape

Under total siege by the data angels, the Higgs-mechanism-led army planned an ‘Inter-Universes Escape’. Its war plan was very simple, with two tactics.

One, blind its own eyes and yell super loudly: {We are the only game in town.} For this, they organized a Munich conference: {Why Trust a Theory? Reconsidering Scientific Methodology in Light of Modern Physics, 7–9 December 2015, see }.


Two, INVENT almost unlimited ghost universes by using the dominant cosmological theory, ‘inflation cosmology’.

“Inflation” was a reverse-engineering effort to resolve certain cosmological observations, such as the flatness, horizon, and homogeneity facts. As reverse-engineering, inflation of course fits almost all the old data and many NEW observations. But almost all reverse-engineering is constrained only by the THEN-observed data, without any ‘guiding principle’.

That is, the ‘initial condition’ of ‘inflation’ cannot be specified or determined. This guidance-less fact allows unlimited ‘inflation models’ to be invented. Of course, it leads to ‘eternal inflation’, with unlimited bubble universes.

At the same time, M-string theory also reached its final destination, the ‘String Landscape’, with equally unlimited string vacua, again giving unlimited bubble universes (the Multiverse). That is,

“Eternal inflation” = ‘string landscape’ = multiverse

Now there is a CONVERGENCE coming from two independent pathways, and this could be claimed as a great justification of its validity.


With the super-weapon of the Multiverse, the ‘Higgs-mechanism-led army’ is no longer besieged by the angels of facts. Those facts (nature’s constants, etc.) of this universe are just random happenstance, and even Nature does not know how to calculate them.


The only way to kill this Multiverse escape is by showing:

One, ALL the angel facts of THIS universe can be calculated.

Two, ALL the angel facts of THIS universe are bubble-independent, see .




More discussion of M-string theory is available at .


The Arch-Demons

In addition to ruling out the Multiverse nonsense, there are some other major issues:

One, BaryonGenesis

Two, the dark energy/dark mass

Three, the gravity/spacetime

Four, is ‘Quantum-ness’ fundamental? (Including its measurement and superposition issues).


In G-theory, the ‘quantum-ness’ is not fundamental but emerges from the dark energy, see .





Furthermore, the G-theory universe is all about ‘computation’; that is, there must be a computing device in the laws of physics. And, of course, there is. In G-theory, both the proton and the neutron are the base of a Turing computer, see .


These two points show that ‘quantum-ness’ is not about ‘uncertainty’ but is all about ‘Cosmo-certainty’, see . That is, the Copenhagen doctrine is in fact one of the Arch-Demons.


In addition to ‘computation’, THIS (not some other-verse) universe is all about energy and mass. So the Structure Function of THIS universe can be defined as:

S (universe) = S (energy, mass)

= S (dark energy, dark mass, visible relativistic mass/energy)

As both Newtonian and GR are related to the structure of this universe, Gravity can be defined by the S-function, as:

Gravity = G (S) = G (dark energy, dark mass, visible mass)

= G (dark energy) @ G (mass)

For G (mass), there is only one parameter, mass. This FACT shows that every ‘mass’ must interact with ALL other masses in THIS universe. That is, the Simultaneity Function can be defined by G (mass), that is,

G (mass) = Si (mass); G (mass) is a simultaneity function.

This Si function can be renormalized only if the gravitational interaction is transmitted instantaneously. In fact, if the gravity of the Sun reached Earth at light speed, it would not fit reality. The Sun–Earth gravitational interaction is precisely described by Newton’s law of gravity, which encompasses instantaneity.


So, for Sun–Earth gravity at least (if not for other cases), G (mass) should be a function of both {simultaneity and instantaneity}. Thus we can define:

G (Sun/Earth) = G (mass, simultaneity, instantaneity)


For Newtonian gravity, the ‘masses’ are wrapped into two points, the ‘centers of mass’, while simultaneity and instantaneity are an innate part of the equation.


For GR, simultaneity and instantaneity are wrapped into the ‘spacetime sheet’. When mass interacts with the GR spacetime sheet, the interaction is transmitted both simultaneously and instantaneously.


This kind of wrapping makes both gravity theories automatically incomplete, effective theories at best. Newtonian gravity is now viewed as wrong in terms of Occam’s razor, and thus it does modern physics no harm. On the other hand, GR is still viewed as the Gospel on gravity, and it has become the greatest hindrance to getting a correct gravity theory.


If GR did provide us some insights before, that is long-ago past tense. The recent promotion of the greatness of the LIGO discovery will further drag us down into the hellfire dungeon. LIGO might indeed provide some additional data confirming what we already know, but it cannot rescue GR’s fate as total trash. The following is just a short list of GR’s shortcomings.

One, GR plays zero role in the construction of quarks/leptons.

Two, GR plays zero role in calculating the nature constants, such as Alpha or the Cabibbo/Weinberg angles, etc.

Three, GR fails to account for dark mass and dark energy, and is unable to derive the Planck CMB data:

(dark energy = 69.2%; dark matter = 25.8%; visible matter = 4.82%)

Neff = 3.04 +/- …

Hubble constant: H0 (early universe) = 66.93 ± 0.62 km s−1 Mpc−1 (using ΛCDM with Neff = 3)

Four, GR provides no hint of any kind about BaryonGenesis, which is definitely a cosmological issue; this alone should give GR the death sentence.

Five, last but not least, GR is not compatible with QM (quantum mechanics).

More details on this, see .


Yes, GR is of course a very EFFECTIVE gravity theory (a great piece of reverse-engineering) but is definitely the wrong one as a correct theory. The GR wrapping, which hides the essences of gravity (simultaneity and instantaneity), renders it unsalvageable and unamendable. That is, it is in fact the greatest hindrance to getting a correct gravity theory. So GR is the other Arch-Demon of modern physics.


Here is the ArchAngel

All the calculations of those angel facts (of the ‘rescuing angels’ section above) are done in G-theory (Prequark Chromodynamics).

Superficially, the Prequark model is similar to the Preon (Rishon) model, but there are at least four major differences between them.

One, the Rishon model has sub-quarks (T, V): {T (Tohu, which means “unformed” in the Hebrew Genesis) and V (Vohu, which means “void” in the Hebrew Genesis)}. But Harari did not know what T is (just ‘unformed’). On the other hand, the A (Angultron) is an innate angle, a base for calculating the Weinberg angle and Alpha, see .


Two, the choice of (T, V) as the bottom in the Rishon model was ad hoc, a result of reverse-engineering. On the contrary, there is a very strong theoretical reason for where the BOTTOM is in G-theory.

In G-theory, the universe is ALL about computation, computable or non-computable. For the computable, there is a TWO-code theorem. For the non-computable, there are the 4-color and 7-color theorems.

That is, the BOTTOM must be made of two codes. Any level lower than the two codes becomes TAUTOLOGY, just repeating itself.

Anything with more than two codes (such as 6 quarks + 6 leptons) cannot be the BOTTOM.


Three, rishons (T or V) carry hypercolor to reproduce the quark colors, but this setup renders the model non-renormalizable, quickly turning into a big mess. So it was abandoned almost on day one. On the other hand, prequarks (V or A) carry no color, and the quark colors arise from the “prequark SEATs”. In short, the Rishon model cannot work out a {neutron decay process} different from the SM process.




This is one of the key differences between the prequark model and both the Rishon model and the SM.


Four, the Preon/Rishon model does not have Gene-colors, which are the key drivers of neutrino oscillations.


More details on these differences, see .


In addition to being a theory describing particles, G-theory also resolves ALL cosmological issues, which consist of only three:

One, the initial condition of THIS universe

Two, the final fate of THIS universe

Three, the BaryonGenesis mystery


BaryonGenesis determines the STRUCTURE of THIS universe, that is,

G (S) = G (dark energy, dark mass, visible mass)

= G (dark energy) @ G (mass)

So BaryonGenesis must be a function of G (S), which is described as:

(dark energy = 69.2%; dark matter = 25.8%; visible matter = 4.82%)

The calculation of this Planck CMB data in G-theory uses the ‘mass-LAND-charge’: all 48 fermions (24 matter and 24 antimatter) carry the same mass-land-charge while their apparent masses differ. And the MASS-pie of THIS universe is evenly divided among those 48 fermions. That is, the antimatter did not in fact disappear (was not annihilated); it is merely invisible. See the calculation below. More details, see .

This BaryonGenesis of G-theory rules out the entire sterile dark sector (WIMPs, SUSY, sterile neutrinos, axions, MOND, etc.) completely.

On November 8, 2017, Nature (the magazine) announced the death of the WIMP, see .


This BaryonGenesis calculation must also link to the issues of {the initial condition and the final fate}. And indeed it does.

BaryonGenesis in fact has two issues.

One, where is the antimatter in THIS universe?

Two, why is THIS universe dominated by matter while not by antimatter?


The ‘One’ was answered with the above calculation.

The ‘Two’ can only be answered by ‘Cyclic Multiverse’.

However, for THIS universe to go into a ‘big crunch’ state, omega (Ω) must be larger than 1, while it is currently smaller than 1. That is, there must be a mechanism to move (evolve) Ω from less than 1 to greater than 1.
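For reference, Ω is the ratio of the actual density to the critical density, Ω = ρ/ρ_crit with ρ_crit = 3H²/8πG; in a matter-only Friedmann model, Ω > 1 (positive curvature) is the recollapse condition. A minimal sketch of the critical density, using an assumed round value H0 ≈ 67.8 km/s/Mpc (not a figure from the text):

```python
import math

G = 6.67430e-11        # m^3 kg^-1 s^-2
H0_kms_mpc = 67.8      # assumed Hubble constant, km/s/Mpc
MPC_M = 3.0857e22      # meters per megaparsec

# convert H0 to SI units (1/s)
H0 = H0_kms_mpc * 1000.0 / MPC_M

# critical density: rho_crit = 3 H0^2 / (8 pi G)
rho_crit = 3 * H0**2 / (8 * math.pi * G)
print(f"rho_crit ≈ {rho_crit:.2e} kg/m^3")   # ≈ 8.6e-27 kg/m^3

# Omega = rho / rho_crit; Omega > 1 permits recollapse
# in a matter-only Friedmann universe.
```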

Again, only G-theory has such a mechanism, and it is not separately invented but is part of the BaryonGenesis calculation: the ‘Dark Flow, W’.

This dark-flow (W) prediction of G-theory was confirmed in 2016, see .


G-theory of course accounts for the ‘initial condition’, see .


Army of the Archangel

Weinberg has complained about the Arch-Demon (the Copenhagen doctrine) many times, but without making any new proposal, see .


On the other hand, ‘t Hooft (a Nobel Laureate) did embrace the G-theory viewpoint through Cellular Automaton quantum-ness, see . In 2016, he even published a book on it.


More details, see .


Sabine Hossenfelder has just issued a death sentence for Naturalness (see ).



The death of Naturalness is a precursor to the death of the Higgs mechanism.



Steven Weinberg just revealed the death of M-string theory in his October 2017 video lecture.



Paul J. Steinhardt announced the death of ‘inflation cosmology’ in 2016.




The current hellfire nightmare of mainstream physics did not start in 2012 but is the result of three demons {the Copenhagen doctrine, GR, and the Higgs mechanism} which began 100 years ago. Fortunately, many angel facts (experimental data) have revealed their demon faces. Finally, the ArchAngel (the G-theory) has come to the rescue. With the growing army of the ArchAngel, human physics’ salvation is now secured.







Science is not some eye-catching headline

Cosmos Magazine reported on 23 October 2017: {Universe shouldn’t exist, CERN physicists conclude (see )}.

This title is truly eye-catching, and it indeed went viral in the public media, see also .

Beneath this eye-catching hype there is very good and solid science: the finding that there is essentially no difference between the proton and its antiparticle (see–proton-and-antiproton-share-fundamental-properties.html , and, ).


Instead of making such eye-catching hype, science should do some soul-searching: {What has gone wrong?}

A) What went wrong?

This obviously WRONG conclusion is based on two speculations.
One, matter (especially protons) and antimatter (antiprotons) were created in equal amounts at the Big Bang.

Two, the FACT that THIS universe today is dominated by matter is because the antimatter has almost ALL been annihilated.

These two speculations lead to a new, speculated conclusion: there must be a process which annihilates antimatter while preserving matter.

Then this speculated conclusion leads to a fourth speculation: there must be some difference between matter and antimatter beyond their definition of having opposite electric charges.

Yet the recent data show that there is virtually NO difference between the two.


B) Righting the wrong

Instead of making an eye-catching joke, science must conclude that at least one of the two original speculations is wrong.

In G-theory, matter and antimatter are not mirror counterparts but are woven together by one string and one anti-string. That is, antimatter is the necessary partner co-existing with matter simultaneously, and there was no antimatter-annihilation massacre right after the Big Bang.




See {BaryonGenesis, the master-key of all mysteries; }


C) The supporting facts

This G-theory is supported by two facts:

One, as antimatter is a co-existing partner of matter, the dark-mass calculation must count the antimatter together with the matter in the equation, and that calculation fits the Planck data perfectly.


Two, there are zillions of anti-quarks inside every proton and neutron; the antimatter has not disappeared from this matter-dominated universe.



D) Additional issues

Yet the two facts above cannot escape the fact that matter (such as the proton, neutron, and electron) is, after all, DIFFERENT from its anti-partners (the antiproton, antineutron, and positron). That is, why is THIS universe dominated by matter, not antimatter?

This last question was addressed in G-theory long ago in terms of the “Cyclic Multiverse”.

That is, the matter universe and the antimatter universe appear alternately in a cyclic multiverse.


Unfortunately, the “inflation paradigm” has misled the entire world for decades. Fortunately, ‘inflation’ is now killed (see ).


However, the ‘cyclic multiverse’ model of Paul J. Steinhardt et al. did not address two important issues.

One, why is THIS universe dominated by matter, not antimatter?

Two, what is the detailed mechanism for pushing Ω over 1 from its current value of less than 1?


Again, G-theory provides the answer.


E) The conclusion

Regardless of G-theory, the mainstream physics community should reexamine its two original speculations. No eye-catching headline will advance science a single bit.

The Mickey Mouse principle

After the EPR argument, ‘quantum mechanics (QM)’ was argued to be incomplete. In recent years, Steven Weinberg has repeatedly voiced his complaints about the incompleteness of QM, see , without giving a precise new proposal.

On the other hand, Gerard ‘t Hooft (a physics Nobel Laureate) published a book {The Cellular Automaton Interpretation of Quantum Mechanics (Springer, 2016)} and followed up with a new article {Free Will in the Theory of Everything (September 2017)} to propose a completely new FRAMEWORK for QM.

A) The ‘t Hooft/ Maudlin debate

However, ‘t Hooft’s new QM has been attacked vigorously by many, such as Tim Maudlin. The center of the battlefield is still the EPR argument, especially its derivative, Bell’s theorem.

Bell’s theorem: {No physical theory of local hidden variables can ever reproduce all of the predictions of quantum mechanics}; rules out local hidden variables as a viable explanation of quantum mechanics (though it still leaves the door open for non-local hidden variables).

In the general consensus, Bell’s theorem has now been verified by the Alain Aspect (1981) and Hensen (2015) experiments.

However, even John Stewart Bell admitted that Bell’s theorem can be invalidated under the condition of superdeterminism.

Superdeterminism: the apparent freedom of choice of an agent (Alice or Bob) is in fact the reenactment of a predetermined screenplay; that is, there is no true free will. Thus, Bell’s theorem depends on the assumption of “free will”, which does not apply to deterministic theories.
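The content of Bell’s theorem can be made concrete with the standard CHSH inequality: any local hidden-variable model obeys |S| ≤ 2, while the quantum singlet correlation E(a,b) = −cos(a−b) reaches |S| = 2√2. A minimal sketch (the simple sign-based hidden-variable model below is illustrative only, not any particular model from the debate):

```python
import math

def chsh(E):
    """CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b')."""
    a, ap, b, bp = 0.0, math.pi / 2, math.pi / 4, -math.pi / 4
    return E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)

# quantum singlet-state correlation
E_qm = lambda a, b: -math.cos(a - b)

def E_lhv(a, b, n=20_000):
    """Toy local hidden-variable model: a hidden angle lam fixes both
    outcomes deterministically; correlation averaged over lam."""
    total = 0
    for k in range(n):
        lam = 2 * math.pi * k / n
        A = 1 if math.cos(a - lam) >= 0 else -1
        B = -1 if math.cos(b - lam) >= 0 else 1
        total += A * B
    return total / n

S_qm = chsh(E_qm)
S_lhv = chsh(E_lhv)
print(f"quantum |S| ≈ {abs(S_qm):.3f}")   # ≈ 2.828 (= 2√2), violating 2
print(f"LHV     |S| ≈ {abs(S_lhv):.3f}")  # stays at the classical bound 2
```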


Now, the battle line is very clear:

For Maudlin:

One, Bell’s theorem has been verified.

Two, the automata are 1) following deterministic rules and 2) reacting at any time only to local inputs. That is, cellular automata lying on a grid are updated according to laws that involve only nearest neighbors, nothing else, so that deserves to be called “local”.

Three: “so I hope we agree that neither the local indeterministic automata nor the local deterministic automata of this sort could be used in an empirically acceptable theory, even though producing the right empirical results is logically possible in each case.”

Four (conclusion): cellular automaton QM is totally wrong.


For   ‘t Hooft:

One, my findings are very different from Bell’s. The core ingredient of my views is the existence of mappings of the states of a local, deterministic system onto orthonormal sets of basis elements of Hilbert space. QFT is a local indeterministic theory that obviously predicts violations of Bell’s inequality, and it was described by Bell himself as “not just inanimate nature running on behind-the-scenes clockwork, but with our behaviour, including our belief that we are free to choose to do one experiment rather than another, absolutely predetermined”.

Two, ‘t Hooft’s CA is a *quantum* cellular automaton: “the local indeterministic automata should produce behavior that is indistinguishable from local deterministic automata that are all running different deterministic pseudo-random number generators”; that is, there exists an automaton-like theory with quantum evolution laws, mimicking the Standard Model at large distances, that yields the same predictions as a deterministic automaton.

With the superdeterminism loophole remaining open, the above exchange is identical to a chicken talking to a duck, each singing its own song without any meaningful conversation.

B) The verdict

So, ‘t Hooft concluded: {I still feel the burden of producing more precise models, ones that generate more precisely systems of particles resembling the SM. As long as that hasn’t been completed, you can continue shouting at me.}

Fortunately, there is a (the) precise model which generates particles exactly resembling (identical to) the SM zoo.


In Prequark Chromodynamics, both the proton and the neutron have cellular-automaton descriptions (as gliders of Conway’s Game of Life, the base for a Turing computer), see . And this is now widely known via Twitter.
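For readers unfamiliar with the glider the text refers to: in Conway’s Game of Life, a glider is a 5-cell pattern that reproduces itself shifted one cell diagonally every 4 generations. This sketch only illustrates the standard glider, not the prequark mapping itself:

```python
from collections import Counter

def step(cells):
    """One Game of Life generation on a set of live (row, col) cells."""
    counts = Counter(
        (r + dr, c + dc)
        for (r, c) in cells
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # a cell is live next generation with exactly 3 neighbors,
    # or 2 neighbors if it is already live
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in cells)}

# the standard 5-cell glider
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}

state = glider
for _ in range(4):
    state = step(state)

# after 4 generations the glider reappears, shifted one cell
# diagonally (down-right in this orientation)
print(state == {(r + 1, c + 1) for (r, c) in glider})  # True
```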







C) Bell’s theorem revisited

With Prequark Chromodynamics, the ‘t Hooft/Maudlin debate can now be settled. But I do not agree with the view that superdeterminism plays a major role in QM. Thus, I will revisit this ‘Bell’s theorem’ issue.

In addition to the superdeterminism loophole, there are two issues with the experimental verification of the theorem.

One, there are loopholes in the experiments, and some of them are intrinsic, spawning new loopholes ad infinitum.

Two, all experiments are theory-based (biased). That is, experimental verification will not guarantee that the intended theory is CORRECT. The two best examples are GR (general relativity) and the SM (standard model of particle physics). GR has passed ALL the experimental tests we humans can throw at it, but it is now known to be an ‘effective theory’ at best, if not wrong all the way down (as a gravity theory). The SM has also passed all the tests we humans can throw at it, but no one in the whole world believes that it is a complete theory.

On the other hand, a theorem (not a law) can be disproved logically or linguistically.

Bell’s theorem: {No physical theory of local hidden variables can ever reproduce all of the predictions of quantum mechanics}; rules out local hidden variables as a viable explanation of quantum mechanics (though it still leaves the door open for non-local hidden variables).


Is this theorem logically and linguistically sound?

It consists of only two linguistic (logic) terms: {local hidden variables theory} and {quantum mechanics}.

“Local hidden variables” = “local realism”

Locality: means that reality in one location is not influenced by measurements performed simultaneously at a distant location; that is, no instantaneous (“spooky”) action at a distance.

Realism: means that the moon is there even when not being observed; that is, microscopic objects have real properties determining the outcomes of quantum mechanical measurements.

Yet, violation of Bell’s inequality implies that at least one of the two assumptions (locality or realism) must be false.

Determinism must be confined in the domain of {locality + realism}.

Superdeterminism (without free will) can roam outside of the deterministic domain.

Freedom refers to the physical possibility of determining settings on measurement devices independently of the internal state of the physical system being measured.

Non-locality: the signal involved must propagate instantaneously (or superluminally), so that such a theory could not be Lorentz invariant.

If we can show that QM is totally local and real, then Bell’s theorem is invalid or simply moot.


QM differs from a local/real theory in only two major attributes: quantum uncertainty and superposition (Schrödinger’s cat).

One, quantum uncertainty: two noncommuting observables (such as position/momentum or time/energy) can never have completely well-defined values simultaneously, and this uncertainty is intrinsic, irremovable by any improvement of the measurements.

Two, superposition: the fate of Schrödinger’s cat.
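The uncertainty in item One is quantitatively the textbook relation Δx·Δp ≥ ħ/2, saturated by a Gaussian wavepacket. A minimal numerical check (ħ set to 1; this illustrates only the standard relation, not the G-theory derivation):

```python
import math

# grid for a Gaussian wavepacket psi(x) ~ exp(-x^2 / (4 sigma^2))
sigma, hbar = 1.0, 1.0
N, L = 4000, 20.0
dx = 2 * L / N
xs = [-L + i * dx for i in range(N + 1)]
psi = [math.exp(-x * x / (4 * sigma**2)) for x in xs]

# normalize so that the integral of |psi|^2 dx equals 1
norm = math.sqrt(sum(p * p for p in psi) * dx)
psi = [p / norm for p in psi]

# position spread: <x> = 0 by symmetry, so (Δx)^2 = <x^2>
var_x = sum(x * x * p * p for x, p in zip(xs, psi)) * dx

# momentum spread via <p^2> = ħ^2 ∫ |psi'(x)|^2 dx (central differences)
dpsi = [(psi[i + 1] - psi[i - 1]) / (2 * dx) for i in range(1, N)]
var_p = hbar**2 * sum(d * d for d in dpsi) * dx

product = math.sqrt(var_x) * math.sqrt(var_p)
print(f"Δx·Δp ≈ {product:.4f}")   # ≈ 0.5 = ħ/2 (the minimum-uncertainty case)
```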


In G-theory, these two mysterious QM wonders are totally deterministic.

First, QM is emergent, not fundamental. The QM uncertainty equation is the result of dark energy (the expansion of the universe).




See and see Note 2 below.


Second, there is a {deterministic attractor}: all superposed states converge to deterministic macro-states, see


Third, quantum-ness cannot avoid the deterministic outcome.


In fact, all the Alain Aspect type experiments show only that quantum particles have a special attribute, entanglement, while entanglement itself is 100% deterministic. There is no superluminal signal between the entangled particles, as their states are superdetermined.


However, the superdeterministic feature of entanglement does not imply that the entire QM is superdeterministic. QM is completely deterministic (local and real) for three reasons.

One, the QM uncertainty is only the apparent effect of the expansion of the universe.

Two, the superposition is erased by the deterministic attractor.

Three, the entanglement is superdetermined.


Now, Bell’s theorem can be mooted for three reasons.

One, there is a loophole (superdeterminism).

Two, all the experimental tests which support Bell’s theorem cannot and will not guarantee its validity (the same fate as GR and SM).

Three, G-theory shows that 1) proton and neutron are Gliders (cellular automata), 2) the expansion of the universe is 100% deterministic while QM uncertainty is emergent from it, 3) the superposition is erased by the deterministic attractor.


D) Clarifying the differences

I do agree with ‘t Hooft’s Cellular Automaton QM in principle, as G-theory (with proton/neutron as Gliders) was developed 30 years before ‘t Hooft’s book (published by Springer in 2016). However, I do not agree with him that ‘superdeterminism’ plays a MAJOR role to the point of completely excluding ‘free will’.

Here, I would like to introduce the “Mickey Mouse principle”.

Mickey Mouse principle: Mickey Mouse and all Mickey Mouse-like entities are real.

Here, Mickey Mouse is an undefined term, understood in a sociological sense. However, it has at least two attributes.

One, Mickey Mouse has no biological correspondent for the word ‘mouse’. That is, it is not real as a biological mouse.

Two, Mickey Mouse is observable as it is.


So, anything which encompasses the two attributes above will be a Mickey Mouse-like entity.

Example: if rhinoceros (or Saola, Narwhal, Unicornfish, Texas unicorn mantis, Okapi, Goblin spiders, Helmeted curassows, Unicorn shrimp, Arabian oryx, etc.) is clearly defined as not Unicorn, then Unicorn has no biological base, similar to Mickey Mouse, and it is a Mickey Mouse-like entity.

Yet, Unicorn is of course REAL in accordance with the “Mickey Mouse principle”, as it is observable in many places, in the arts (paintings, sculptures, animations, etc.).

The ‘free will’ is the backbone of the legal system (a subsystem of nature). Without IT, the entire legal system collapses. So, the ‘free will’ is at least a Mickey Mouse-like entity, and thus no law can exclude it.

By the same token, ‘superdeterminism’ cannot be excluded, as it is the backbone for entanglement.

Of course, we cannot exclude Bell’s theorem, although it is totally useless in the REAL world.


Note 1: the Mickey Mouse principle was first introduced here,


Note 2: in addition to changing QM uncertainty to QM certainty, the EMERGENT QM equation also controls the evolution of the universe, also see .




This PREDICTED dark flow (by G-theory) is now verified,





And, all of these were predicted in the book (Super Unified Theory)



Note 3: I discussed this issue in August 2012 already, at



Chances of Redemption for LIGO

LIGO has thus far claimed three DISCOVERIES of GW (gravitational waves). Without any independent verification, no one (including LIGO) can make any claim of DISCOVERY, by definition. LIGO’s three claims are not only wrong but are crap, and I made this point clear over one year ago, see and,


In addition to the scientific practice (moral) above, LIGO’s claims were logically wrong, as it did not detect any 2nd crest for any of the events. See,


The {spin and the masses} of those black holes are, at this point, just speculation crap. LIGO should just tell us the wavelength and amplitude of those GWs.


On August 18, 2017, J Craig Wheeler (a member of the LIGO collaboration) tweeted that LIGO was verified with an optical counterpart, see


There are two concerning points.

One, a neutron star by definition is a failed black hole; that is, its mass is smaller than an average black hole’s. So, the GW of twin neutron stars merging should be much smaller than in the black hole cases, by two to three orders of magnitude.

Two, the Hubble telescope image of the neutron star event on August 22 shows that the event is still in the merging process (not yet complete). So, any LIGO detection (on or before August 18) would be the pre-final waves of the event. Yet, why did LIGO not detect such pre-final waves for the so-called discovered twin black hole cases, when those BH pre-final waves should be much stronger than in the neutron star case?


If LIGO can detect this neutron star event, it must be able to detect the other crests of the previous BH events. I truly hope that it could, as it has spent too much of my money on this.

This neutron star event can be the only chance for LIGO’s redemption. I am giving it my best wish.

Finally, I do want to compliment LIGO’s manner on this occasion, as it did not make a crap claim right away as it did before, and this alone is a redemption for LIGO; see its press release below.


25 August 2017 — The Virgo and LIGO Scientific Collaborations have been observing since November 30, 2016 in the second Advanced Detector Observing Run ‘O2’ , searching for gravitational-wave signals, first with the two LIGO detectors, then with both LIGO and Virgo instruments operating together since August 1, 2017. Some promising gravitational-wave candidates have been identified in data from both LIGO and Virgo during our preliminary analysis, and we have shared what we currently know with astronomical observing partners. We are working hard to assure that the candidates are valid gravitational-wave events, and it will require time to establish the level of confidence needed to bring any results to the scientific community and the greater public. We will let you know as soon we have information ready to share. (See )}


I made the following statement in my book “Nature’s Manifesto”:

{Yes, GW (gravitational wave) is real, and it will be detected one day. It is very possible that LIGO will be the one to accomplish this. But, LIGO announcement this year (2016) is definitely a bullcrap, (page 269, Nature’s Manifesto, see or ).}

For a whole redemption, LIGO must formally renounce its {three discovery claims}, as they were not independently verified. They should be renamed as {promising gravitational-wave candidates}.


Note (added on October 3, 2017): LIGO received the Nobel physics prize 2017 today. But two points must be pointed out.

One, LIGO failed to detect a KNOWN GW event (see the Hubble image of the binary neutron star merger of NGC 4993 on August 22, 2017).


Two, LIGO’s new claim GW170814 has a small enough sky patch for some non-LIGO verifications, but they were not done.


That is, all of LIGO’s discoveries thus far (on this day of receiving the Nobel physics prize) are self-claims, without any verification from other astrophysical measurements.


God did, you say

Who created this universe?

God did, you say.

I have no way to argue with you, as I know 100% for sure that I did not do it. By all means, I am not interested in the issue of who did it. I am only interested in two issues.

Issue one (I1), the ACT of creation: how (not who did it)?

Issue two (I2), its product: the structure of this created product (again, not who did it).


The mainstream physics does not and cannot address issue one (I1). However, it has done some great work on issue two (I2), with at least three great pillars.

P1, Standard Model (SM) + quantum principle + some measured nature-constants (such as, Alpha, CC, Cabibbo /Weinberg angles, etc.)

P2, Planck CMB data + Hubble (Big Bang) cosmology

P3, Newtonian gravity + GR (General Relativity)


These three pillars are wholly established without any ambiguity or disagreement. But, there are at least three unresolved issues (UI) from these three pillars.

UI1, many of those measured nature-constants cannot be derived (calculated) with these three pillars.

UI2, SM is incomplete and unstable, not including the gravity, the dark sector and the fine-tuning of Higgs mass, etc.

UI3, quantumness and gravity are incompatible; P1 and P3 do not jibe.


It is easy to show that when one UI is resolved, all will be resolved. On the other hand, if a pathway is definitely a wrong track for one UI, it will be wrong for all.


So, I will discuss the UI issues beginning with the UI2, as the mainstream physics community has spent most of its energy on BSM (Beyond SM) which takes the SUSY as the paradigm.


But, no SUSY at {LHC, dark matter searches (such as LUX), astrophysical sources (such as AMS-02, IceCube, etc.)} thus far (July 7, 2017).



In fact, there are two ways to address this UI2; horizontally like SUSY or vertically like Prequark.



With Prequark Chromodynamics (see ):

One, UI1 is resolved.




Two, UI2 is resolved.

Planck CMB data is derived (calculated)

In this calculation, dark mass/visible mass/dark energy are related with a precise dynamics (including the dark flow). That is, dark mass/visible mass CANNOT be related only via gravity (their masses). So, WIMP (or any DARK PARTICLE scenario) by definition (with the above equations) is wrong conceptually.


Three, UI3 is resolved.

Quantumness is an emergent of gravity.


While the mainstream physics is unable to address issue one (the ACT of creating this universe), it is able to reverse-engineer the Big Bang state, via the so-called ‘inflation-scenario’; that is, there must have been a period of ‘exponential expansion’ at the Big Bang.


As only reverse-engineering, it can and must fit the currently observable universe. But, its shortcomings are now wholly denounced by Anna Ijjas, Paul J. Steinhardt and Abraham Loeb (see ). However, there are two issues about this ‘inflation-war’.

One, the pro-inflation camp is now claiming: {You can create a universe from nothing—you can create infinite universes from nothing—as long as they all add up to nothing.} This is plagiarism, as everyone knows that ‘inflation’ is not about {creating something from nothing} but about the manifestation of this universe from something very SMALL (definitely a something), while the creation-cosmology is my work, see . Furthermore, I politely informed Guth about this in 1993.



Also see

“Inflation” is now totally discredited, not even a science, see


Two, while any Bounce-cosmology can account for the ‘exponential expansion’ phase,


All other bounce-cosmologies have no mechanism to change Ω from the current value of (< 1) to (> 1). Only G-theory (Prequark Chromodynamics) has a ‘Dark Flow’ mechanism to accomplish this task.


The PREDICTION of the current dark flow of 9% is now verified by the new Hubble Constant measurement, see .



The G-theory (Prequark Chromodynamics) has not only resolved all the UI (unresolved) issues, but is also able to address the impossible, the ACT of creation, with a ‘First Principle’.



All the G-theory (Prequark Chromodynamics) predictions are the consequences of this ‘First Principle’. Furthermore, this G-theory (Prequark Chromodynamics) is now saving the soul of the mainstream physics.

First, the Higgs fiasco:

A new boson (with a mass of 125.26 GeV) was declared the Higgs boson in 2012, and Peter Higgs won the Nobel in 2013. But,

One, the Higgs mechanism was not verified five (5) years after that discovery.



{Note (added on July 11, 2017): this week CERN reported that evidence (a signal of 3.6 sigma, not a confirmation) of the H → bb channel was recorded after analyzing 50 fb⁻¹ of data (Run I and the 2016 Run II). End Note.}

A 3.6 sigma signal from 50 fb⁻¹ of data is by no means a success for Higgs; it is in fact a major problem for it. Furthermore, the life of the Higgs mechanism hinges on the neutrino being a Majorana fermion, but recent evidence has shown otherwise. Without a Majorana neutrino, the Higgs mechanism is definitely wrong. Without confirming the Higgs mechanism, the new boson is definitely not a Higgs.
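For context on why 3.6 sigma counts only as ‘evidence’ and not ‘discovery’, here is a rough sketch of the usual conversion from significance to a one-sided Gaussian tail probability (the 5-sigma discovery threshold is particle-physics convention, not anything from the CERN report itself):

```python
import math

def one_sided_p(sigma):
    # One-sided Gaussian tail probability for a given significance level
    return 0.5 * math.erfc(sigma / math.sqrt(2))

print(one_sided_p(3.6))  # ~1.6e-4  ("evidence" level)
print(one_sided_p(5.0))  # ~2.9e-7  (conventional "discovery" threshold)
```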


And, the mass of Higgs boson is still not calculable (derivable) via Higgs mechanism (regardless of whether it is right or wrong).





But G-theory (Prequark Chromodynamics) is able to calculate this new boson’s (vacuum boson’s) mass.


My soul-saving call on this Higgs nonsense was issued in 2015.


It is very nice to see that the mainstream physics is now admitting its Higgs nonsense.


Second, the LIGO nonsense.

LIGO is THUS FAR another OPERA joke and BICEP2 fiasco.

I made this point one year before the work of Creswell et al, see .

LIGO’s claim is conceptually wrong.

It has two points.

P1: its detection has an astrophysical (not terrestrial) origin.

P2: its interpretation is that the signal is the result of two massive black holes coalescing.


LIGO’s argument for P1 is based on two points.

One, the signals (after subtracting all noise) from each detector have the same (or similar) waveform.

Two, the time lag between the two signals is less than 10 milliseconds.
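The ~10-millisecond figure is simply the light travel time between the two sites; a quick back-of-the-envelope check (the ~3,000 km straight-line separation between the Hanford and Livingston detectors is an assumed round number):

```python
c = 299_792_458        # speed of light, m/s
d = 3.0e6              # approximate detector separation, ~3,000 km
t_ms = d / c * 1e3     # maximum physical time lag for a light-speed signal
print(round(t_ms, 1))  # ~10.0 ms
```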


These two points can at best make the matching signals a candidate for GW (gravitational waves).

As the two detectors (separated by about 1,900 miles, or 3,000 km) have similar designs and similar apparatus, they could have similar inherited system noise (ISN). This ISN could be very strong in the turn-on phase (before reaching a steady state). If the two detectors are turned on at about the same time, these ISNs can easily be matched within the 10-millisecond time lag. When the two similar systems go into steady states, the ISNs will become weaker, but the matching can still happen. With this analysis, it is easy to PREDICT that the STRONG signals should always happen in the turn-on phase.


But, most important of all is that without detecting the 2nd crest of the same event, a matching cannot be confirmed as GW. I made this point very clear one year ago.


LIGO’s claim of P2 is simply not science, as it is at best just speculation. The P2 claim thus far has identified 9 black holes (6 pre-coalescing, 3 now existing).

A black hole, by all means, is not invisible, especially when there are INTERACTIONS. A black hole can be seen indirectly via ‘gravitational lensing’, or via the behavior of nearby stars. The interaction can of course be detected by some other signatures, such as gamma-ray bursts or neutrinos (from the collision of the ‘event horizons’). But a year went by with no sign of those from the following surveillant eyes:

Fermi Gamma-ray Space Telescope

Fermi Large Area Telescope

Dark Energy Camera, a 570-Megapixel digital camera mounted on a telescope in the Chilean Andes


How can LIGO claim P2 without any 2nd party verification? This is not science.


But most important of all, LIGO’s claims imply a {twin massive black holes} population density way, way above the current observations (data), and there is no observed ‘PROCESS’ which can produce the LIGO twin black holes. I made this point very clear, again one year ago.



For P1, LIGO very much did not clean out all the noise. Without the detection of the 2nd crest of the same event, the LIGO signal is very much a piece of trash caught between two detectors. The following graph is a very good description of LIGO’s work thus far.


The following are the facts about LIGO thus far.

One, it has no proof that its so-called signals are GW signals.

Two, it has no idea of any astrophysical process which can produce the GW150914 type of twin black holes.

Three, its detections imply a totally different cosmological structure which is in conflict with all the current observational data, especially on the issue of the population density of the LIGO twin black holes.

Four, it has no support from any other surveillant eyes and ears.


Third, “Inflation” is now totally discredited, not even a science.

There are a few facts about ‘inflation’.

One, it is just a reverse-engineering to produce a ‘Big Bang’ state: that is, a period of exponential expansion from something very small.

Two, it does not provide any guideline for the ‘fate’ of this universe.

Three, it does not provide any explanation for the current ‘accelerating cosmic expansion’.

Four, it is highly sensitive to its initial condition; that is, it itself is not an initial condition of THIS universe. Thus, it is an ad hoc trash, not needed for this universe.

Five, it cannot shake-off a bad consequence, the multiverse; that is, it does not even provide any explanation for THIS universe.

Six, it does not provide a solution for the baryogenesis issue.


On the other hand, the ‘Bounce-cosmology’ does provide:

One, the initial ‘exponential expansion’ via the cyclic-multiverses before THIS big bang.


The ‘matter and anti-matter alternately appearing in each bounce’ naturally resolves the baryogenesis issue.

Two, the ‘FATE’ of any universe (including this one) is clearly defined, a new bounce.

Three, the ‘current accelerating cosmic expansion’ is clearly explained with a dark flow. And, this dark flow is now confirmed by the new Hubble Constant measurement. See


And it is also confirmed by the Planck CMB data.


In fact, the entire evolution of THIS universe {from Big Bang -> CMB -> star/galaxy formation -> current accelerating cosmic expansion -> a new bounce} is explained with the dark flow (W).



Now the LINEs have been drawn very clearly.

One, SUSY vs Prequark

Two, Higgs boson vs Vacuum boson

Three, inflation-scenario vs bounce-cosmology (with dark flow of W)

Four, quantum uncertainty: fundamental vs emergent

Five, creation law vs incomprehensible














Guth and Gefter, you are welcome to quote the G-theory

On June 1, 2017, Amanda Gefter wrote an article at Nautilus defending Alan Guth in the recent ‘inflation war’, saying: {You can create a universe from nothing—you can create infinite universes from nothing—as long as they all add up to nothing.}

This statement is the KEY point of the G-theory, which I informed Guth of in 1993, when I politely told him that his ‘inflation’ was wrong. I showed him two points.

One, the neutron decay in G-theory, which is associated with a vacuum boson and the calculation of its mass.

Two, the creation law:

Law of Creation — If B is created by “creating something from nothing process,” B (the something) must remain to be “nothingness” in essence.



This creation law was stated on page 45 in the book ‘Super Unified theory’, US copyright © 1984 # TX 1-323-231

This creation law has also been available online in many places for over 25 years.

One, see (online since 1996)


Three, it is also the key point of the book {Nature’s Manifesto — Nature vs Bullcraps}, which has been available to the ‘Department of Physics, MIT’ since January 2017; also see

Guth and Gefter, welcome to the G-theory. Everyone knows that ‘inflation’ is not about {creating something from nothing} but about the manifestation of this universe from something very SMALL (definitely a something). When you or anyone else changes position by borrowing another’s idea, please state the ‘source’ the next time you use an idea from G-theory.

There are two more differences between ‘inflation’ and ‘cyclic multiverse (CM)’:

One, the exponential expansion (EE) of CM happened before THIS big bang, while the EE happened after this big bang for ‘inflation’.

Two, the expansion (exponential, or after the big bang) is an innate property of the equation-zero, not a ‘gravitationally self-repulsive force’ as in ‘inflation’. The exponential expansion is caused by the ‘bounces’, see graph above. The ‘after big bang expansion’ is caused by the ‘dark flow’, see graph below.

Note, Gefter’s article is available at 

In addition to this post, I also commented at Gefter’s article, available at