Superdeterminism, Causality, and the Hidden Branching: An Engineering Graduate’s Quick Insight
- Gabriel Boboc
- May 22
- 10 min read
Introduction
Quantum mechanics challenges our deepest intuitions about reality, randomness, and causality. Among the various interpretations, superdeterminism proposes a radical deterministic alternative, asserting that all quantum measurement outcomes—and even the experimenter’s choices—are fixed by hidden variables predetermined since the universe’s origin.
While this interpretation seeks to “save” Einstein’s dream of a deterministic universe, a brief reflection reveals an unexpected and paradoxical consequence: superdeterminism actually undermines Einstein’s causality more severely than quantum mechanics itself ever does. It introduces a level of causality breaking and conspiratorial correlation that has no precedent in physics.
Remarkably, this insight—that superdeterminism is essentially a “one-branch Many-Worlds Interpretation (MWI)” hidden in initial conditions—took only a few minutes of pondering, sparked while drinking coffee and listening to Sabine Hossenfelder discuss the topic.
What is Superdeterminism?
Superdeterminism rejects the assumption of free choice in measurement settings (measurement independence). It claims that all events—including seemingly free experimental choices—are encoded in a predetermined state of the universe. In this way, it tries to restore both locality and determinism to quantum physics, defying Bell’s theorem by rejecting its foundational assumptions.
The Paradox: Superdeterminism Undermines Einstein’s Causality
Ironically, while superdeterminism aims to salvage Einstein’s vision of a causal and deterministic universe, it does so at the cost of introducing a deeper causality violation than Einstein ever accepted.
By pre-coordinating experimenter choices with particle properties across arbitrary spacetime separations—sometimes billions of light years apart—it demands a conspiratorial “cosmic coordination” that breaks the intuitive temporal order of cause and effect.
This kind of causality breaking is not just unusual; it is fundamentally more severe and counterintuitive than the nonlocal correlations of orthodox quantum mechanics.
Wigner’s Friend Paradox and the Need to Branch
A further challenge for superdeterminism arises from the Wigner’s friend paradox, where an observer (the friend) inside a closed lab measures a quantum system, but another observer (Wigner) outside assigns a superposition to the entire lab plus friend system.
Superdeterminism must somehow reconcile the friend’s definite measurement outcome with Wigner’s superposed description. To maintain determinism and consistency, superdeterminism cannot avoid effectively branching at some point:
The friend’s observation collapses the wavefunction from their perspective, implying a definite outcome.
Wigner’s perspective maintains superposition until he interacts.
To reproduce this consistent experience across observers, superdeterminism must encode all possible branches and outcomes in the universe’s initial state—reintroducing hidden branching despite claiming a single deterministic path.
Thus, the Wigner’s friend paradox forces superdeterminism to embrace a hidden “branching” structure akin to Many-Worlds, but buried in initial conditions rather than explicit world splitting.
The Hidden “Branching” Problem: A One-Branch Many-Worlds in Disguise
To reproduce quantum statistics exactly, superdeterminism must encode all possible measurement settings and their outcomes in the universe’s initial conditions. This is a form of hidden branching: instead of many worlds existing explicitly, the single universe must simulate all outcomes in its finely tuned initial state.
Thus, superdeterminism is effectively a one-branch Many-Worlds Interpretation—branching has not disappeared, but is buried invisibly in the past, embedded in conspiratorial correlations.
This paradox highlights that superdeterminism neither simplifies quantum reality nor fully restores classical causality. Instead, it hides the complexity and branching behind a deterministic facade.
Statistical Reproduction Problem
To match quantum statistics without randomness, superdeterminism must "pre-load" the universe with exact conditions to:
Mimic Bell violations,
Preserve Born rule probabilities over infinite repetitions,
Ensure every detector choice is coordinated with the hidden variables of the particle.
This is known as the fine-tuning problem, and it is one of the strongest objections to superdeterminism.
Physicist Tim Maudlin and others have argued that this makes the theory either:
Unfalsifiable (like astrology),
Or so convoluted it reintroduces all the complexity it tried to avoid.
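To make the statistical gap concrete, here is a small stdlib-Python sketch (a toy of my own construction, not drawn from any cited source): it compares the CHSH value quantum mechanics predicts for a singlet pair against what a simple deterministic local hidden-variable model can achieve. The difference between the two numbers is exactly what a superdeterministic model must close by fine-tuning.

```python
import math, random

def E_quantum(a, b):
    # Singlet-state correlation predicted by QM: E(a, b) = -cos(a - b)
    return -math.cos(a - b)

def E_local(a, b, n=200_000, seed=1):
    # A deterministic local hidden-variable model: each pair carries a
    # shared angle lam fixed at the source; each outcome depends only on
    # the local setting and lam.
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        lam = rng.uniform(0.0, 2.0 * math.pi)
        A = 1 if math.cos(a - lam) >= 0 else -1
        B = -1 if math.cos(b - lam) >= 0 else 1
        total += A * B
    return total / n

def chsh(E):
    # Standard CHSH combination of four setting pairs
    a, ap, b, bp = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
    return abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))

print(f"quantum CHSH value: {chsh(E_quantum):.3f}")  # 2*sqrt(2) ~ 2.828
print(f"local-model CHSH:   {chsh(E_local):.3f}")    # ~ 2, the classical bound
```

The local model saturates the classical bound of 2; no strategy based on a shared variable independent of the settings can reach 2√2, which is why superdeterminism must instead correlate the hidden variable with the detector settings themselves.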
Superdeterminism and Massive Black Holes
Another intriguing question arises when considering massive black holes and their event horizons. If superdeterminism relies on a globally fixed initial state encoding all measurement outcomes, how does it handle regions causally disconnected by event horizons?
Inside black holes, information paradoxes and potential breakdowns of classical causality pose severe challenges. If parts of the universe become causally isolated, maintaining a superdeterministic “cosmic plan” coordinating measurement settings and outcomes across all regions becomes conceptually problematic.
This tension suggests that superdeterminism might face serious difficulties accounting for quantum measurements inside or near event horizons without invoking further exotic assumptions—potentially pushing the theory into even deeper layers of complexity and fine-tuning.
Implications for the Universe’s Initial Conditions and Infinity
A striking consequence of superdeterminism is that the universe’s initial state—often thought of as the state at or before the Big Bang—must already encode, with absolute precision, all future experimental settings and their outcomes, no matter how far in the future or distant in space.
If the universe is spatially or temporally infinite, this suggests an infinite, hyper-detailed superdeterministic “plan”, coordinating everything across all scales of space and time. This pushes the interpretation into an even more extraordinary realm of complexity and fine-tuning, far beyond what most physical theories posit.
Such an idea strains credulity and raises deep philosophical questions about the nature of time, causality, and whether any meaningful “prediction” or “freedom” exists within such a rigidly pre-scripted cosmos.
The Challenge of Verifying Determinism at Cosmic Scales
Superdeterminism hinges on the idea that all quantum events—even those occurring at the edge of the observable universe—are predetermined and correlated with our measurement choices. But how can we ever guarantee or test that a tiny quantum event billions of light years away is truly deterministic and conspiratorially coordinated with us here and now?
This leads to a practical and philosophical dead end:
Empirically unverifiable: There is no conceivable experiment to confirm or falsify such universal predetermination, especially for events so remote in space and time.
Loss of predictive power: If all is fixed in advance, and no independent “free” variation exists, then the notion of prediction loses meaning. Instead, all outcomes are effectively encoded and hidden in initial conditions beyond reach.
Paradoxical observer role: It forces us to accept that the observer and the observed form one inseparable, predetermined system, collapsing the usual distinction between cause and measurement.
Thus, the superdeterministic worldview, while logically consistent, appears to close off empirical inquiry into the fundamental randomness or determinism of the universe.
The Observerless Universe
In Gerard ’t Hooft’s flavor of superdeterminism, the role of the observer is not just minimized; it's effectively eliminated as a meaningful physical entity. If all states and their evolutions are determined from the start—including what you think, feel, decide, and measure—then:
Conscious experience becomes a kind of byproduct of deterministic evolution, not something that can influence or reflect back on reality.
Observation ceases to have causal significance; it’s not that "you observe and then the system responds," but rather "both your act of observation and the system's response were pre-encoded together."
This leads to the bizarre implication that there’s nothing left to “observe” in any meaningful sense, because all outcomes, thoughts, and correlations are merely playing out like the frames of a pre-written movie.
And worse: even the illusion of making a decision or becoming aware of an event is just another gear turning in the deterministic machinery — offering no causal influence and no epistemic access to why it turns that way.
It becomes a theory without observers, consciousness, or agency — stripping physics of its operational core. No room for "you" to witness a quantum event and no role for measurement to reveal anything. Just pre-scripted correlations.
In a way, this turns physics into cosmic bookkeeping rather than a science of observation, prediction, and understanding.
The Need for an Absolute Time
’t Hooft’s deterministic approach (particularly his cellular automaton interpretation and associated superdeterministic ideas) tends to implicitly require an absolute space-time backdrop for its logic to work. Here's why:
1. Predetermined Initial Conditions
To encode all measurement outcomes, observer decisions, and their correlations into the universe’s initial conditions, you need:
A global ordering of events.
A definite “initial” moment (often conceptually like a t = 0).
A universal clock ticking deterministically.
But relativity forbids this. In both Special and General Relativity:
There is no privileged global time.
Simultaneity is relative.
The structure of space-time itself is dynamic, not a static backdrop.
2. Hidden Variable Coordination
In superdeterminism, the "hidden variables" must perfectly coordinate events that are spacelike separated, e.g., the setting of a detector on Earth and the state of a photon 1 billion light years away.
To do this without violating relativistic causality, you would need:
An absolute reference frame to define the coordination.
Some non-relativistic substrate (like a preferred foliation or lattice).
Again, this reintroduces something like Newtonian absolute space and time—which modern physics has spent a century moving beyond.
3. Observer-Free Determinism
If everything—including conscious choice—is part of the deterministic unfolding, then time itself must be objectively flowing along a preset path. But quantum mechanics and relativity both challenge that idea:
Quantum time evolution is unitary and time-symmetric.
Relativity allows for radically different perceptions of time, depending on reference frame.
’t Hooft’s framework tends to smuggle in a global Newtonian clock ticking away in the background—a hidden absolute space-time—which is philosophically and physically regressive. So instead of defending General Relativity and causality, it ends up undermining them.
Superdeterminism vs. Quantum Computing
Quantum computing relies on superposition, interference, and probabilistic outcomes that depend on measurement collapse. A quantum algorithm explores multiple paths in parallel and amplifies the correct answer through constructive interference, while destructive interference suppresses wrong ones. But crucially: the result isn't predetermined. It's statistical. We run the algorithm multiple times to get a probability distribution.
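As a minimal illustration of the interference at stake, here is a single-qubit sketch (toy code, assuming ideal gates and no noise): a Hadamard, a relative phase, and a second Hadamard, with the Born rule turning amplitudes into the measurement statistics described above.

```python
import cmath, math

def hadamard(a0, a1):
    # 2x2 Hadamard acting on the amplitude pair (a0, a1)
    s = 1 / math.sqrt(2)
    return s * (a0 + a1), s * (a0 - a1)

def interference_prob(phi):
    # |0> -> H -> relative phase phi on |1> -> H; return P(measure 0)
    a0, a1 = hadamard(1, 0)            # create the superposition
    a1 *= cmath.exp(1j * phi)          # accumulate a relative phase
    a0, a1 = hadamard(a0, a1)          # recombine: the two paths interfere
    return abs(a0) ** 2                # Born rule

print(f"{interference_prob(0):.3f}")            # 1.000: constructive, deterministic
print(f"{interference_prob(math.pi):.3f}")      # 0.000: destructive
print(f"{interference_prob(math.pi / 2):.3f}")  # 0.500: genuinely statistical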
Now enter superdeterminism:
In superdeterminism, there is no true randomness.
All measurement outcomes, including the intermediate states and errors, are pre-scripted by the initial conditions of the universe.
That means: the outcome of a quantum computation is fixed before the machine is even built.
So, here's the paradox:
If the quantum computer works perfectly, it wasn’t due to clever engineering or quantum principles—it simply had to work, because the universe’s script said so.
If the quantum computer fails, again, it’s not due to decoherence or noise—it was just baked into the initial cosmic plan.
The meaning of a quantum algorithm becomes empty. We’re not learning or processing information—we're watching a deterministic puppet show.
And worse: error correction in quantum computers is built on the statistical behavior of entangled qubits. But if errors are not truly probabilistic, then correcting them is just shadowboxing with fate. There's no real "noise" to fix—just an illusion of unpredictability.
Under strict superdeterminism, quantum error correction becomes one of the most unintentionally ironic constructs in physics:
We build complex codes (like Shor or surface codes) to correct errors caused by decoherence and noise.
But in a superdeterministic universe, those "errors" are not truly errors — they were always destined to happen.
Therefore, any correction that works was also predetermined.
So we're not correcting anything — we're just enacting a prewritten drama.
It becomes a kind of ritual, not a response to real uncertainty. The quantum error correction code didn't save your qubit — it was already written in the stars that you'd run the correction and it would succeed (or fail).
This makes the very notion of error lose meaning. An "error" implies something that wasn't supposed to happen. But superdeterminism forbids that: everything was supposed to happen — even the mistakes.
So, in this view, quantum error correction is the cosmic equivalent of sweeping a floor that was always clean or always dirty — you never changed anything.
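For concreteness, here is a classical toy of the 3-bit repetition code that underlies quantum bit-flip correction (a deliberate simplification: real quantum codes extract syndromes without measuring the data qubits). It shows what the statistical notion of an error rate, the very notion superdeterminism hollows out, actually buys:

```python
import random

def encode(bit):
    # 3-bit repetition code: logical 0 -> 000, logical 1 -> 111
    return [bit] * 3

def noisy_channel(codeword, p, rng):
    # Flip each physical bit independently with probability p
    return [b ^ 1 if rng.random() < p else b for b in codeword]

def decode(codeword):
    # Majority vote: corrects any single flip, fails on two or three
    return 1 if sum(codeword) >= 2 else 0

def logical_error_rate(p, trials=100_000, seed=0):
    rng = random.Random(seed)
    failures = sum(decode(noisy_channel(encode(0), p, rng))
                   for _ in range(trials))
    return failures / trials

p = 0.1
print(f"physical error rate: {p}")
print(f"logical error rate:  {logical_error_rate(p):.4f}")  # ~ 3*p**2 - 2*p**3 = 0.028
```

The whole point of the construction is that errors are independent random events, so redundancy suppresses them; under strict superdeterminism that independence is an illusion, and the suppression itself was scripted.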
Feynman Path Integrals in Superdeterminism
In standard QM, the path integral formulation says:
A particle doesn't take just one path — it takes all possible paths from point A to point B, and each path contributes an amplitude. The interference of all these paths gives you the probability of an outcome.
This beautifully encodes quantum uncertainty and non-classical behavior, like tunneling and interference.
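The sum-over-histories can be illustrated numerically. The toy below (my own sketch, in assumed units where hbar = m = dt = 1) sums e^{iS} over the intermediate position of a free particle and shows the stationary-phase effect: paths near the classical one add coherently, while wild detours cancel out.

```python
import cmath

# Free particle from x=0 at t=0 back to x=0 at t=2, parameterized by its
# intermediate position x at t=1; the two straight segments give an
# action S(x) = alpha * x**2 in these toy units.
alpha = 50.0
dx = 0.001

def amp(x):
    # Each history contributes a pure phase e^{iS(x)}
    return cmath.exp(1j * alpha * x * x)

near = [amp(i * dx) for i in range(-100, 101)]  # |x| <= 0.1: near-classical paths
far = [amp(i * dx) for i in range(101, 3001)]   # x in (0.1, 3]: wild detours

def coherence(amps):
    # 1.0 means all phases aligned; ~0 means the phases cancel
    return abs(sum(amps)) / len(amps)

print(f"near-classical coherence: {coherence(near):.3f}")  # close to 1: adds up
print(f"far-path coherence:       {coherence(far):.3f}")   # close to 0: washes out
```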
Now Enter Superdeterminism
In superdeterminism, there are no true probabilities, no free variable choices, no real superpositions — everything is determined by hidden variables from the beginning of time.
So what happens to the Feynman path integral?
The particle doesn’t "consider" all paths. It was always going to take one.
The sum-over-histories becomes a decorative illusion — a mathematical artifact, not physical reality.
The path integral still “works” mathematically, but under superdeterminism, only one path ever mattered, and the rest are meaningless scribbles.
This turns Feynman’s beautiful vision — where the universe explores all possibilities — into a lie told by mathematics, purely for human convenience.
Superdeterminism and the Ultraviolet (UV) Problem in Quantum Gravity
At high energies or short distances — in the ultraviolet (UV) regime — physics becomes increasingly dominated by fluctuations. In quantum field theory and especially quantum gravity, this means:
A vast number of configurations contribute significantly to the path integral.
There is no clear “preferred” history because amplitudes are nearly uniformly spread — many paths or geometries become equally probable.
The semi-classical approximation fails, and no single trajectory or state dominates.
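This flattening of weights can be caricatured numerically. The sketch below is a deliberate simplification (Boltzmann-style real weights e^{-S} rather than oscillatory amplitudes, and evenly spaced toy actions), but it shows how the dominance of a single history disappears as the actions become nearly degenerate:

```python
import math

def max_weight_share(gap, n=1000):
    # n candidate histories with evenly spaced actions S_i = i * gap
    # (in units of hbar), weighted by e^{-S_i}; return the share of the
    # total weight held by the single least-action history.
    weights = [math.exp(-gap * i) for i in range(n)]
    return max(weights) / sum(weights)

print(f"well-separated actions (gap=5):       {max_weight_share(5.0):.3f}")   # one history dominates
print(f"nearly degenerate actions (gap=1e-4): {max_weight_share(1e-4):.5f}")  # nearly uniform spread
```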
Now consider this in superdeterminism:
If there’s only one real outcome, predetermined by hidden variables, how does the theory "choose" among a near-infinite set of UV-dominant contributions?
Here’s the problem:
Path degeneracy: At the UV scale, many Feynman paths (or spacetime geometries) are nearly equally weighted.
Superdeterminism requires only one of them to be “real.”
But without probabilistic weighting, the “choice” of this one path becomes arbitrary — or worse, undefined.
The theory has no mechanism to explain why that one configuration gets picked.
This Leads to a Contradiction:
If you admit that multiple UV paths contribute significantly, you're back to a probabilistic framework (which superdeterminism rejects).
If you say only one path was real all along, you must explain how that path was selected in a UV-dominated, high-degeneracy regime — without relying on probability or symmetry breaking.
In Short:
Superdeterminism breaks down exactly where quantum gravity becomes most wild — in the UV, where “many paths” is not a feature, but a necessity.
Reflections and Surprise
As an engineering graduate without formal training in quantum foundations, I found this contradiction striking. It took just a few minutes of reflection—listening to Sabine Hossenfelder talk about superdeterminism over coffee—to articulate these points clearly.
I am genuinely amazed that this fundamental paradox—that superdeterminism introduces an even stronger causality-breaking and a hidden branching—does not seem to be widely acknowledged or debated more vigorously.
Support from the Literature
This critique aligns with several prominent voices in quantum foundations:
Fine-tuning and conspiracies: Tim Maudlin and others have noted the extreme “fine-tuning” required to coordinate hidden variables with measurement settings, labeling it conspiratorial and problematic.
Sabine Hossenfelder, who defends superdeterminism cautiously, openly admits the “conspiracy problem” remains a key obstacle.
Philosophical discussions note that the need to simulate counterfactual outcomes effectively amounts to branching, undermining the claim that superdeterminism avoids Many-Worlds.
Conclusion
Superdeterminism promises a deterministic universe, but at the cost of introducing hidden conspiracies and causality violations deeper than those of standard quantum mechanics. Its claim to restore Einstein’s causality is paradoxically undermined by its own demands.
That these insights emerged so quickly in a casual moment underscores the value of fresh perspectives outside traditional academic silos. It is my hope that such reflections stimulate deeper discussions on the foundations of quantum mechanics and the true nature of causality.
References
Maudlin, T. (2011). Quantum Non-Locality and Relativity. Wiley-Blackwell.
Hossenfelder, S. (2020). “Superdeterminism: A Guide for the Perplexed.” arXiv:2010.01324 [quant-ph].
Bell, J. S. (1964). “On the Einstein Podolsky Rosen Paradox.” Physics 1, 195.
Wigner, E. P. (1961). “Remarks on the Mind-Body Question.” In The Scientist Speculates, edited by I. J. Good.