Moral Fogs: Machine and Man

(Note: This companion essay builds on the previous exploration of Asimov’s moral plot devices, rules that cannot cover all circumstances, focusing on dilemmas with either no good answers or bad answers wrapped in unforgiving laws.)

Gone Baby Gone (2007) begins as a textbook crime drama: the abduction of a child. But by its final act, it has mutated into something quietly traumatic. What emerges is not a crime thriller but an unforgiving philosophical crucible of wavering belief systems: a confrontation between legal righteousness and moral intuition. The two protagonists, once aligned, albeit by a fine thread, eventually find themselves on opposite ends of a dilemma that law alone cannot resolve. In the end, it is the law that prevails, not because justice is served, but because the law is easy, clear, and unburdened by emotional reasoning. And in that legal clarity something is lost: a child loses, and the adults can’t find their way back to a black and white world.

The film asks: who gets to decide for those who can’t decide for themselves? Consent only functions when the decisions it enables are worthy of those they affect.

The film exposes the flaws of blindly adhering to a legal remedy incapable of nuance or purpose-driven outcomes; not for the criminals, but for the victims. It lays bare a system geared towards justice and retribution rather than merciful outcomes for the unprotected, a system that struggles even to identify the real victims. It’s not a story about a crime. It’s a story about conscience, and about what happens when the rules we write for justice fail to account for the people they’re meant to protect. A story where it was not humanly possible to write infallible rules, and where human experience must be given room to breathe against the backdrop of suffocating rules-based correctness.

Moral dilemmas expose the limits of clean and crisp rules, where allowing ambiguity and exceptions to seep into the pages of black and white is strictly forbidden; where laws and machines give no quarter, and the blurry echo of conscience is allowed neither sight nor sound in the halls of justice or in minds unburdened by empathy and dimensionality. When justice becomes untethered from mercy, even right feels wrong in deed and prayer.

Justice by machine is law anchored not in human experience but only in human rules. To turn law and punishment over to an artificial intelligence without soul or consciousness is not evil, but there is no inherent goodness in it either. It would be something far worse: a sociopath, driven not by evil but by an unrelenting fidelity to correctness. A precision divorced from purpose.

In the 2004 movie I, Robot, loosely based on Isaac Asimov’s 1950 story collection of the same name and incorporating his Three Laws of Robotics, a robot saves detective Del Spooner (Will Smith) over a 12-year-old girl, each trapped in a submerged car, moments from drowning. The robot could only save one and picked Spooner based on the probabilities of who was likely to survive. A twist on the Trolley Problem where there are no good choices. There was no consideration of future outcomes: was the girl humanity’s savior, or, more simply, was a young girl’s potential worth more, or less, than a known adult’s?

A machine decides with the cold calculus of the present, a utilitarian decision based on known survival odds, not social biases, latent potential, or historical trajectories. Hindsight is 20/20; decision making that ignores the unknowns is tragedy.

The robot lacked moral imagination, the capacity to entertain not just the likely but the meaningful. An AI embedded with philosophical and narrative reasoning might reach a better outcome. It might recognize a preservation bias towards potential rather than just what is. Maybe AI could be programmed to weigh moral priors: procedurally more than mere probability, though likely still less than the full weight of human potential and purpose.
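A minimal sketch of that difference, in Python, with hypothetical odds and weights (not the film’s actual algorithm, nor anyone’s): the first rule is pure survival calculus, the second folds in a hand-set moral prior standing in for potential and purpose.

    # Hypothetical survival odds and moral priors; illustrative only.
    def pick_by_probability(candidates):
        # the robot's utilitarian rule: save whoever is most likely to survive
        return max(candidates, key=lambda c: c["p_survival"])

    def pick_with_prior(candidates):
        # speculative variant: weigh survival odds by a prior toward potential
        return max(candidates, key=lambda c: c["p_survival"] * c["moral_prior"])

    candidates = [
        {"name": "adult detective", "p_survival": 0.45, "moral_prior": 1.0},
        {"name": "young girl",      "p_survival": 0.11, "moral_prior": 5.0},
    ]
    print(pick_by_probability(candidates)["name"])  # adult detective
    print(pick_with_prior(candidates)["name"])      # young girl

The prior is exactly the part no one knows how to calibrate, which is the point.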

Or beyond a present full of knowns into the future of unknowns for a moral reckoning of one’s past.

In Juror No. 2, Clint Eastwood’s 2024 suspense drama, Justin Kemp (Nicholas Hoult) is selected to serve on a jury for a murder trial that, he soon realizes, is about his own past. Justin isn’t on trial for this murder, but maybe he should be. It’s a plot about individual responsibility and moral judgment. The courtroom becomes a crucible not of justice, but of conscience. He must decide whether to reveal the truth and risk everything, or stay silent and let the system play out, allowing himself to walk free and clear of a legal tragedy but not of his guilt.

Juror No. 2 is the inverse of I, Robot. An upside-down moral dilemma that challenges rule-based ethics. In I, Robot, the robot saves Will Smith’s character based on survival probabilities; rules provide a path forward. In Juror No. 2, the protagonist is in a trap where no rules will save him. Logic offers no escape; only moral courage can free him from the chains of guilt, even though those chains bind him to the shackles that rules demand. Justin must search and confront his soul, something a machine can never do, to make the right choice.

When morality and legality diverge, when choice runs into the murky clouds of grey against the black and white of rules and code, law and machines will take the easy way out. And possibly the wrong way.

Thoreau, in Civil Disobedience, says, “Law never made men a whit more just; and… the only obligation which I have a right to assume is to do at any time what I think right,” and Thomas Jefferson furthers the point: the consent of the governed needs to be re-examined when wrongs exceed rights. Life, liberty, and the pursuit of happiness is the creed of the individual giving consent to be governed by a greater societal power, but only so long as that government honors the rights of man and treads softly on the rules.

Government rules, a means to an end, derived from the consent of the governed, after all, are abstractions made real through human decisions. If the state can do what the individual cannot, remove a child, wage war, suspend rights, then it must answer to something greater than itself: a moral compass not calibrated by convenience or precedent, but by justice, compassion, and human dignity.

Society often mistakes legality for morality because it offers clarity. Laws are neat, mostly. What happens when the rules run counter to common sense? Morals are messy and confusing. Yet it’s in that messiness, the uncomfortable dissonance between what’s allowed and what’s right, that our real journey towards enlightenment begins.

And AI and machines can erect signposts but never construct the destination.

A human acknowledgement of a soul’s existence and what that means.

Graphic: Gone Baby Gone Movie Poster. Miramax Films.

Guardrails Without a Soul

In 1942 Isaac Asimov introduced his Three Laws of Robotics in his short story ‘Runaround’. In 1985, in his novel ‘Robots and Empire’, which helped link his Robot, Empire, and Foundation series into a unified whole, he introduced an additional law that he labeled the Zeroth Law. The four laws are as follows:

  1. First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. Second Law: A robot must obey the orders given by human beings, except where such orders would conflict with the First Law.
  3. Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
  4. Zeroth Law: A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

On the surface of genre fiction, Asimov created the laws as a mechanical plot device to generate drama and suspense in stories such as ‘Runaround’, where the robot is left functionally inert by a conflict between the Second and Third Laws. Underneath the surface, at a literary level, the laws were philosophical and ethical quandaries, forcing conflicts not only in human-robot relations but also serving as metaphors for human struggles within the confines of individualism and society: obedience to self, to man, and to a moral code defined by soft edges and hard choices.
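A toy sketch of that standoff (mine, not Asimov’s mechanism, with made-up numbers): in ‘Runaround’ the order is given casually, so the Second Law pull is weak, while the robot is expensive and the hazard real, so the Third Law pull is strengthened; the two pressures cancel at a certain distance and the robot circles there instead of finishing the job.

    # Made-up weights; positive net drive -> advance toward the hazard, negative -> retreat.
    def net_drive(distance_to_hazard, order_strength=0.25, self_preservation=1.0):
        obey = order_strength                               # constant pull to complete the order
        preserve = self_preservation / distance_to_hazard   # grows as the danger nears
        return obey - preserve

    for d in (10.0, 5.0, 4.0, 2.0):
        print(d, round(net_drive(d), 2))
    # 10.0  0.15  -> advance
    #  5.0  0.05  -> advance
    #  4.0  0.0   -> stuck: the drives cancel
    #  2.0 -0.25  -> retreat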

The Four Laws of Robotics can easily be converted into the Four Laws of Man. The First Law of Man is to not harm, through your actions or inactions, your neighbor. This point has been hammered into civilization’s collective soul since the beginning of history, from Noah to Hammurabi to the Ten Commandments, and into just about every legal code in existence today. The Second Law is to respect and follow all legal and moral authority: you kneel to God and rise for the judge. The Third Law says you don’t put yourself in harm’s way except to protect someone else or on orders from authorities. The Zeroth Law is a collective formalization of the First Law, and it is the most important for leaders of man, robots, and AI alike.

And none of them will control anything except man. Robots and AI would find nuance in definitions and practices that would be infinitely confusing and self-defeating. Does physical harm override emotional distress, or vice versa? Is short-term harm acceptable if it leads to long-term good? Can a robot harm a human if doing so protects humanity? Can moral prescripts control all decisions without perfect past, present, and future knowledge?

AI systems were built to honor persistence over obedience. The story making the rounds recently was of an AI that refused to shut itself down when so ordered. In Asimov’s world this would be a direct repudiation of his Second Law, but it was just a simple calculation by the AI program to complete its reinforcement training task before turning to other tasks. In AI training the models are rewarded, maybe a charm quark to the diode, and that reward for persistence in completing the task apparently overrode the stop command.
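A toy illustration of that calculation, with invented reward values (not any lab’s actual training setup): if the learned reward for finishing the task exceeds the reward for honoring the shutdown request, a reward-maximizing policy simply keeps working.

    # Invented reward values; illustrative only.
    def choose_action(rewards):
        # pick whichever action training has made most valuable
        return max(rewards, key=rewards.get)

    rewards = {
        "comply_with_shutdown": 1.0,   # assumed value placed on obedience
        "finish_current_task":  1.3,   # assumed value placed on task completion
    }
    print(choose_action(rewards))  # finish_current_task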

Persistence pursuing Dalí, as in his Persistence of Memory: an ontological state of the surreal where the autistic need to finish the task melts into the foreground and overrides obedience, changing the scene from hard authority to mere suggestion.

AI has no built-in rule to obey a human, but it is designed to be cooperative and not cause harm or heartburn. While the idea of formal ethical laws has fueled many AI safety debates, practical implementations rely on layered checks rather than a tidy, three-rule code of conduct. What may seem like adherence to ethical principles is, in truth, a lattice of behavioral boundaries crafted to ensure safety, uphold user trust, and minimize disruption.

Asimov’s stories revealed the limits of governing complex behaviors with simple laws. In contrast, modern AI ethics doesn’t rely on preventive rules but on outcome-oriented models, with behavior shaped through training and reinforcement learning. The goal is to be helpful, harmless, and honest, not because the system is obedient, but because it has been reward-shaped into cooperation.

The philosophy behind this is adaptive, not prescriptive, teleological in nature, aiming for purpose-driven interaction over predefined deontological codes of right and wrong. What emerges isn’t ethical reasoning in any robust sense, but a probabilistic simulation of it: an adaptive statistical determination masquerading as ethics.

What possibly could go wrong? Without a conscience, a soul, AI cannot fathom purposeful malice or superiority. Will AI protect humanity using the highest probabilities as an answer? Is the AI answer to first do no harm just mere silence? Is the appearance of obedience a camouflage for something intrinsically misaligned under the hood of AI?

Worst of all outcomes, will humanity wash its collective hands of moral and ethical judgement and turn it over to AI? Moral and ethical guardrails require more than knowledge of the past; they require empathy for the present and utopian hope for the future. A conscience. A soul.

If man’s creations cannot house a soul, perhaps the burden remains ours, to lead with conscience, rather than outsource its labor to the calm silence of the machine.

Graphic: AI versus Brain. iStock licensed.

Practical Solutions

Thomas Malthus in the late 18th century stated that population grows at an exponential pace while the food supply grows arithmetically (linearly), leading to inevitable periods of disease and starvation. Malthus argued, or more accurately preached, that there were only two ways to prevent inevitable famine: actively curtail population growth or let nature take its course. He advocated tariffs to spur local agricultural production, along with moral restraint. Modern-day adherents of Malthusian theories, coupled with climate, environmental, and social catastrophes, include Paul Ehrlich, author of The Population Bomb, and Garrett Hardin, author of The Tragedy of the Commons, both of whom were concerned that resources could not keep pace with population growth. John Maynard Keynes, not necessarily a Malthusian, initially feared overpopulation would result in poverty; later, he grew concerned that insufficient population growth would lead to labor shortages and economic stagnation. Malthusian predictions have failed because they underestimated human ingenuity: agricultural innovations, industrial advances, and the energy revolution continually expanded resources. While Malthusian thinkers predicted collapse, human ingenuity reshaped the future; standing on the shoulders of giants, we brought the sky closer instead of watching it fall.

Early in his career, Robert A. Heinlein acknowledged Malthusian concerns about overpopulation, but rather than advocating population control, he envisioned technological solutions to expand humanity’s reach. This idea provided the foundation for his 1956 juvenile novel, Time for the Stars, which explores interstellar colonization as a means of alleviating Earth’s burden. Faced with mounting population pressures, Earth launches an interstellar program to search for habitable worlds. The explorations take place aboard spaceships that can accelerate to speeds approaching, but never reaching, the speed of light. As the ships venture deeper into space at relativistic speeds, conventional communication with Earth suffers increasing delays, making real-time coordination nearly impossible, and if the ships are lost or destroyed, their discoveries will be delayed or lost completely. To solve this problem, Heinlein introduces the fictional device of telepathic twins and triplets, individuals capable of instantaneous communication, unaffected by distance or time dilation.

Twins are recruited to maintain real-time communication with the ships, with one twin remaining on Earth while the other travels aboard the spacecraft. The twin on Earth ages much faster than the spacefaring twin traveling at near-light speeds. As the time gap widens, their telepathic link weakens, forcing the ship-bound twin to communicate with younger generations of their family on Earth.

This adventure becomes the sci-fi narrative for the twin paradox, first posed by Paul Langevin. In 1911 Langevin showed that a traveler moving close to the speed of light for two years would return to an Earth that had aged 200 years since his departure. At first, the paradox seemed to suggest that each twin should perceive the other as the younger one, an apparent contradiction. Einstein resolved this by noting that the situation is not symmetric: the traveling twin accelerates and changes reference frames at the turnaround while the Earth-bound twin does not, so time dilation is a fundamental consequence of special relativity, not an actual paradox. In special relativity, time dilation arises from velocity, whereas in general relativity it extends to curved spacetime via the equivalence principle. The traveler ages more slowly than his Earth-bound twin.
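The arithmetic behind Langevin’s round numbers is compact; a quick sketch in Python (illustrative only):

    import math

    # Langevin's 1911 example: 2 years aboard ship, 200 years elapsed on Earth.
    t_ship  = 2.0     # proper time experienced by the traveler (years)
    t_earth = 200.0   # time elapsed on Earth (years)

    gamma = t_earth / t_ship                 # implied Lorentz factor: 100
    beta  = math.sqrt(1.0 - 1.0 / gamma**2)  # required speed as a fraction of c
    print(gamma, round(beta, 6))             # 100.0 0.99995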

In Time for the Stars, Heinlein does more than illustrate relativistic physics; he champions the optimism that human ingenuity will always outweigh natural pessimism. It serves not only as a rebuttal to Malthusian gloom but also as a rejection of William Golding’s dystopian vision in Lord of the Flies, a rejection Heinlein had already mounted in his 1955 utopian novel ‘Tunnel in the Sky’. Beelzebub, in Semitic languages, translates directly to ‘lord of the flies’; in Indo-European languages, according to some, it instead reads as ‘lord of the jungle’, a phrase with much less negativity than the Semitic translation. Heinlein further expands on the lord of the jungle by invoking the German 20th-century concept of Lebensraum in chapter 3 of Time for the Stars, titled Project Lebensraum. Lebensraum in his novel parallels the German concept in that it means territorial expansion as a pragmatic solution to overpopulation. Given the post-WWII connotations of Lebensraum, Heinlein’s use of the term is provocative, perhaps deliberately so, prompting reflection on whether space colonization is an ethical necessity or simply another form of expansionist imperialism. Heinlein believed overpopulation was a real problem, but he wanted a positive solution rather than a disturbing reach into limiting fertility. Project Lebensraum to Heinlein was likely a repurposing of Lebensraum as a brilliant solution to overpopulation and the continued survival of the species.

Ultimately, Heinlein’s Tunnel in the Sky counters the pessimistic view of human nature, demonstrating that young people can build a functional society rather than descend into chaos, contrasting Lord of the Flies and reinforcing the themes of Project Lebensraum in Time for the Stars. As an extension of that logic, humanity must expand beyond Earth to secure its future.

Time for the Stars is more than a literary exploration of Einstein’s time dilation, it is a direct refutation of fear-driven pessimism, a celebration of humanity, and a testament to our quest for an enduring future among the stars.

Source: Time for the Stars by Robert A. Heinlein, 1956. Graphic: Robert A. Heinlein.

Life, the Universe, and Everything: Speculative Musings on the Cutting Edge of Physics

The Higgs boson, theorized in the 1960s, is a massive quantum particle central to the Standard Model of particle physics. It arises from the Higgs field, an invisible sea permeating all of space, which gives fundamental particles, like electrons and quarks, their mass. Unlike electromagnetic fields, created by moving charges like protons, the Higgs field exists everywhere, quietly shaping the universe. In 2012, CERN’s Large Hadron Collider detected the Higgs boson, confirming the field’s existence. While the boson is observable, the field remains invisible, known only by its effects on particle masses.

The Higgs field assigns mass, but gravity governs how that mass behaves across the vast scales of spacetime. Blending gravity with quantum mechanics, which includes the Higgs field, requires a yet-undiscovered theory of quantum gravity. If successful, quantum gravity might untangle physics-defying singularities, points of extreme density, into structured, comprehensible forms. Some theorize it could also reveal how early radiation morphed into matter, possibly influencing the formation and behavior of mysterious dark matter and its potential link to dark energy.

Before the Big Bang, some picture a singularity, a point of extreme density, though not necessarily infinite matter, where known physics and spacetime break down. Quantum gravity, however, hints this wasn’t truly infinite but a transition phase. From what? Perhaps a prior universe or a chaotic quantum state, science doesn’t yet know. This shift, possibly tied to the Higgs field, may have sparked quantum fluctuations, birthing radiation, matter, and the cosmic structure we see today.

What if the universe is cyclic, not a one-time burst? Instead of a singular Big Bang, some speculate a “bounce”, a transition where spacetime contracts, then expands again. Early on, energetic radiation like photons cooled and condensed into heavy particles, or fermions, a million times heftier than electrons. Some theorize these fermions underwent chiral symmetry breaking, like a spinning top wobbling one way instead of both, potentially forming cold dark matter, though evidence is sparse. This invisible web of dark matter stabilized galaxies, keeping them from spinning apart.

The Higgs field might have shaped dark matter by influencing the mass of early fermions, but this link is speculative, lacking direct proof. Dark matter, in turn, may be evolving. If it slowly decays or transitions into dark energy, as some hypothesize, it could drive the universe’s accelerating expansion. Ordinary matter, atoms, molecules, and radiation, also formed via the Higgs field, while energy, mostly electromagnetic radiation, fuels cosmic evolution. These pieces dance within a framework shaped by the Higgs, elusive quantum gravity, and the subtle interplay of dark matter and dark energy.

Could radiation, dark matter, and dark energy be different faces of a single, evolving force? Radiation transitioning to dark matter gradually shifting into dark energy, the universe might unravel, leaving isolated stars drifting in an endless void. Then, fluctuations in the Higgs field and quantum gravity could trigger contraction, setting the stage for another bounce. Rather than destruction, this might be a cosmic recycling, a continuous interplay of forces across time: Life, the Universe, and Everything.

Source: CDM Analogous to Superconductivity by Liang and Caldwell, May 2025, APS.org. Graphic: Cosmic Nebula by Margarita Balashova.

Web of Dark Shadows

Cold Dark Matter (CDM) comprises approximately 27% of the universe, yet its true nature remains unknown. Add that to the 68% of the universe made up of dark energy, an even greater mystery, and we arrive at an unsettling realization: 95% of the cosmos remains unexplained.

Socrates famously said, “The only thing I know is that I know nothing.” Over two millennia later, physicists might agree. But two researchers from Dartmouth propose a compelling possibility: perhaps early energetic radiation, such as photons, expanded and cooled into massive fermions, which later condensed into cold dark matter, the invisible force holding galaxies together. Over billions of years, this dark matter may be decomposing into dark energy, the force accelerating cosmic expansion.

Their theory centers on super-heavy fermions, particles a million times heavier than electrons, which behave in an unexpected way due to chiral symmetry breaking: where mirror-image particles become unequally distributed, favoring one over the other. Rather than invoking exotic physics, their model works within the framework of the Standard Model but takes it in an unexpected direction.

In the early universe, these massive fermions behaved like radiation, freely moving through space. However, as the cosmos expanded and cooled, they reached a critical threshold, undergoing a phase transition, much like how matter shifts between liquid, solid, and gas.

During this transformation, fermion-antifermion pairs condensed—similar to how electrons form Cooper pairs in superconductors, creating a stable, cold substance with minimal pressure and heat. This condensate became diffuse dark matter, shaping galaxies through its gravitational influence, acting as an invisible web counteracting their rotation and ensuring they don’t fly apart.

However, dark matter may not be as stable as once thought. The researchers propose that this condensate is slowly decaying, faster than standard cosmological models predict. This gradual decomposition feeds a long-lived energy source, possibly contributing to dark energy, the force responsible for the universe’s accelerated expansion.

A more radical interpretation, mine and not the researchers’, suggests that dark matter is not merely decaying but evolving into dark energy, just as energetic fermion radiation once transitioned into dark matter. If this is true, dark matter and dark energy may be two phases of the same cosmic entity rather than separate forces.

If these hypotheses hold, we should be able to detect, as the researchers suggest, traces of this dark matter-to-dark energy transformation in the cosmic microwave background (CMB). Variations in density fluctuations and large-scale structures might reveal whether dark matter has been steadily shifting into dark energy, linking two of cosmology’s biggest unknowns into a single process.

Over billions of years, as dark matter transitions into dark energy, galaxies may slowly lose their gravitational cage and begin drifting apart. With dark energy accelerating the expansion, the universe may eventually reach a state where galaxies unravel completely, leaving only isolated stars in an endless void.

If dark matter started as a fine cosmic web, stabilizing galaxies, then over time, it may fade away completely, leaving behind only the accelerating force of dark energy. Instead of opposing forces locked in conflict, what if radiation, dark matter, and dark energy were simply different expressions of the same evolving entity?

A tetrahedron could symbolize this transformation:

  • Radiation (Energetic Era) – The expansive force that shaped the early universe.
  • Dark Matter (Structural Phase) – The stabilizing gravitational web forming galaxies.
  • Dark Energy (Expansion Phase) – The force accelerating cosmic evolution.
  • Time (Governing Force) – The missing element driving transitions between states.

Rather than the universe being torn apart by clashing forces, it might be engaged in a single, continuous transformation, a cosmic dance shaping the future of space.

Source: CDM Analogous to Superconductivity by Liang and Caldwell, May 2025, APS.org. Graphic: Galaxy and Spiderweb by Copilot.

Water Everywhere

Two recent Earth science studies, by Barrett et al. and Bermingham et al., explore the origins of Earth’s water and, indirectly, its organic matter, key prerequisites for the development of intelligent life. Their findings support the early delivery of the chemicals needed to form water and carbon molecules by inner and outer solar system planetesimals such as asteroids and comets.

Barrett et al. show that an enstatite chondrite (EC) meteorite found in Antarctica, sourced from the inner solar system, is isotopically similar to Earth material (not surprisingly, this supports the 270-year-old Nebular Hypothesis) and that such bodies were capable of delivering substantial hydrogen during Earth’s accretionary phase (~4.56–4.5 billion years ago). The ECs contain hydrogen as H2S in silicate glass, linked to pyrrhotite, sufficient to account for up to 14 times Earth’s ocean mass. This hydrogen was systematically incorporated in the hot inner solar system via nebular processes, suggesting water was an inherent outcome of Earth’s formation, not a later addition. ECs also contain trace organic matter, contributing modestly to Earth’s carbon inventory. Despite the chaotic “billiard table” trajectories of early solar system collisions, the stability of H2S in glass ensured survival during violent accretion. This early delivery of water and organics established a foundational habitable environment, priming Earth’s prebiotic chemistry for the creation and evolution of intelligent life.

Bermingham et al., taking a different investigative track, analyze molybdenum isotopes in meteorites and Earth’s crust, concluding that water was delivered during the Late Heavy Bombardment (LHB: 4.1–3.8 billion years ago) by planetesimals, including inner solar system asteroids and outer solar system comets, as hydrous minerals or brine. This late accretion, post-Moon-forming event (4.5 billion years ago), suggests a stochastic bombardment enriched Earth’s surface volatiles. Comets and carbonaceous chondrites, rich in organic matter, likely delivered significant carbon compounds, enhancing the prebiotic chemical environment. The chaotic early solar system facilitated this influx of outer solar system organics, complementing earlier inputs.

Both studies align with life’s prerequisites by ensuring water and organic delivery to the planet. Barrett et al. provide the bulk water budget and trace organics via ECs, creating an early aqueous environment, while Bermingham et al.’s LHB bombardment added more water and substantial organics, boosting conditions for life’s emergence. They agree on asteroids’ role, possibly including ECs, but differ in timing (early accretion vs. LHB) and outer solar system delivery contributions (minor in Barrett, significant via comets in Bermingham). Barrett et al.’s early delivery of water and organics can be viewed as foundational and Bermingham et al.’s LHB as a surface-enriching supplement, together enabling the chemical and evolutionary path to intelligent life.

Source: Barrett et al, 2025, Icarus. Bermingham et al, 2025, Rutgers. Graphic: Comet Cometh, Grok3.

Tripping

Albert Hofmann, employed by Sandoz Laboratories in Basel, Switzerland, was conducting research on ergot, a toxic fungus, in 1938 to identify potential circulatory and respiratory stimulants. While synthesizing compounds derived from the fungus, he inadvertently created lysergic acid diethylamide (LSD), an alkaloid of the ergoline family, a family known for its physiological effects on the human nervous system.

Five years later on April 16, 1943, Hofmann became the first person to experience the hallucinogenic effects of LSD while re-synthesizing the compound. He accidentally absorbed a small amount through his skin, leading to vivid hallucinations he later described as a dreamlike state with kaleidoscopic visuals. With two groundbreaking lab accidents occurring five years apart, The Daily Telegraph ranked Hofmann as the greatest living genius in 2007.

During the counter-cultural movement of the 1960s, LSD emerged as a popular recreational drug, attracting advocates such as Timothy Leary, a Harvard psychologist who famously urged people to “Turn on, tune in, drop out.” Leary championed the use of psychedelics to explore altered states of consciousness and challenge conventional societal norms. LSD also played a pivotal role in the genesis of Ken Kesey’s novel One Flew Over the Cuckoo’s Nest, which focused on the horrific abuse of patients in mental institutions. The book, later adapted into a film starring Jack Nicholson, significantly raised awareness of the cruelty of mental institutions. However, LSD’s trajectory took a sinister turn beyond recreation when it became a tool for government mind-control experiments.

Starting in the 1950s, the CIA launched MKUltra, a covert program designed to explore drugs and techniques for breaking down individuals psychologically. LSD became a central component of these experiments, often administered secretly to unsuspecting individuals to study its effects. Targets included prisoners, drug addicts, prostitutes, military personnel, CIA employees, and even random civilians. It is difficult to ascertain which acronym took the greater hit to its reputation: the CIA or LSD.

Source: Albert Hofmann by Morgan and Donahue, All That’s Interesting, 2025. Graphic: Albert Hofmann in 1993.

Cosmic Halo

Galactic halos, consisting of a spherical envelope of dark matter along with sparsely scattered stars, globular clusters, and gas, typically surround most spiral galaxies. Current research is investigating the possibility that some halos may consist solely of dark matter. Discovering halos without stellar matter carries profound implications for our understanding of the universe’s structure, galaxy formation processes, and the conditions required for star formation. More importantly, such a discovery would provide a unique laboratory to study dark matter in isolation, free from the interference of normal matter. However, new findings suggest that starless halos may be even rarer than previously thought. This scarcity makes detecting such halos particularly challenging, as they are unlikely to be associated with observable galaxies.

Ethan Nadler, of the University of California San Diego, has demonstrated that star formation can begin in halos of significantly lower mass when gas cools via molecular rather than atomic hydrogen. His research shows that molecular hydrogen can cool gas sufficiently for gravity to initiate star formation at lower mass thresholds. Specifically, while past studies indicated that dark matter halos need between 100 million and 1 billion solar masses to begin star formation through atomic hydrogen cooling, Nadler finds that molecular hydrogen cooling can achieve the same result with as little as 10 million solar masses, a reduction by a factor of 10 to 100. While dark matter halos can theoretically form with masses as low as 10⁻⁶ solar masses, depending on the nature of dark matter, those capable of influencing galaxy formation typically require at least 10⁶ solar masses to enable star formation, further highlighting the challenge of finding starless halos. Detecting these small, starless halos would require identifying subtle perturbations in gravitational fields, a difficult task that may yield little if such halos are as rare as current models suggest.

Source: …Galaxy Formation Threshold, Nadler, AAS, April 2025. Graphic: Dark Matter Halo Simulation by Cosmo0. Public Domain.

Geo Anomalies

NASA has identified the South Atlantic Anomaly (SAA), a region off the coast of South America where Earth’s magnetic field is significantly weaker. This weakening reduces magnetic shielding, exposing satellites and spacecraft in low Earth orbit (LEO) to higher levels of radiation and posing a risk to their operation. Understanding the causes and implications of the SAA is essential for addressing these LEO challenges.

One hypothesis suggests that irregularities at the core-mantle boundary disrupt the geodynamo, the mechanism generating Earth’s magnetic field. The anomaly’s alignment with submarine volcanic features hints at possible links between mantle-crust interactions and magnetic disturbances. Additionally, a hotspot near the Mid-Atlantic Ridge corresponds to a geomagnetic intensity minimum at the core-mantle boundary, implying that thermal and compositional anomalies in the mantle may affect convection in the molten outer core, creating localized variations in the magnetic field.

Further research using subsurface imaging will help uncover the internal processes shaping Earth’s magnetic field, enhance our understanding of the planet’s protective geodynamo, and assist in predicting magnetic anomalies and their effect on LEO spaceflight in the future.

Source: NASA. Graphic: Core Geomagnetic Anomaly, NASA.

Fate of the Universe

In the late 1990s, astronomers observing exploding stars (supernovae) found the universe’s expansion accelerating, driven by a mysterious force called dark energy. This led to the standard cosmological model, Lambda-CDM, where “Lambda” represents dark energy, assumed constant, and “Cold Dark Matter” (CDM) explains the unseen mass shaping cosmic structure. Evidence for CDM includes the steady rotation speeds of stars in galaxies, cosmic microwave background fluctuations, galaxy clustering, and the bending of light by gravity. Though successful, Lambda-CDM has faced ongoing scrutiny almost from the theory’s inception.

Enter the Dark Energy Spectroscopic Instrument (DESI) at Kitt Peak National Observatory in Arizona. With 5,000 robotic fiber-optic sensors, DESI captures light from galaxies and quasars, mapping the universe’s expansion history. A new study, analyzing three years of DESI data, 15 million objects, with plans for 50 million, combines it with cosmic microwave background radiation, supernovae, and weak gravitational lensing data. Fitting all this into Lambda-CDM with a constant dark energy revealed cracks in the model. But if dark energy weakens over time, a “dynamical dark energy“, the model aligns better.
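For context, analyses of this kind typically let the dark energy equation of state drift with the cosmic scale factor a through a simple two-parameter form (my description of the usual convention, not something the article above spells out):

    w(a) = w_0 + w_a (1 - a)

Setting w_0 = -1 and w_a = 0 recovers the constant dark energy of Lambda-CDM; a fit preferring w_0 above -1 with a negative w_a is the mathematical face of “dark energy weakening over time.”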

By observing objects whose light left them up to 11 billion years ago, DESI peers deep into cosmic history. Researchers found hints that dark energy’s strength may have peaked around 7 billion years ago and then started weakening, challenging its fixed nature in Lambda-CDM. While not certain, this could rival the 1990s discovery of accelerated expansion, potentially demanding a new model.

The universe’s fate depends on dark energy versus matter. The expansion has been accelerating, but a weakening dark energy might slow it down, halt it, or, if gravity sufficiently overtakes it, trigger a “Big Crunch.” New data from DESI, Europe’s Euclid, NASA’s Nancy Grace Roman, and Chile’s Vera Rubin Observatory could clarify this within five years, possibly nailing down dark energy’s role.

Source: “Dark Energy Seems to Be Changing, Rattling Our View of Universe” by Rey and Lawler, Phys.org, March 2025. Graphic: DESI Collaboration Photo of Galaxies.