Galactic Emptiness

I like the quiet.

From the dark, an enigmatic mass of rock and gas streaks inward. Discovered by the ATLAS telescope in Chile on 1 July 2025 and catalogued as 3I/ATLAS, it moves at 58 km/s (~130,000 mi/hr), a billion-year exile from some forgotten, possibly exploded star. The press immediately fact-checks, then shrieks alien mothership anyway. Harvard’s Avi Loeb suggests it could be artificial, citing its size, its speed, a “non-gravitational acceleration”, and a “leading glow” ahead of the nucleus. Social media lights up with mothership memes, AI-generated images, and recycled ‘Oumuamua panic.

Remaining skeptical but trying to retain objectivity, I ask: is it anything other than a traveler of ice and dust obeying celestial mechanics? And it is very difficult to come up with any answer other than no.

Spectra from NASA’s flagship infrared observatory, the James Webb Space Telescope (JWST), show amorphous water ice sublimating 10,000 km from the nucleus. The Hubble telescope resolves a 13,000-km coma, the envelope of gas and dust around the nucleus, later stretching to 18,000 km, rich in radiation-forged organics: tholins and fine dust.

The “leading glow” is sunlight scattering off ice grains ejected forward by outgassing. The “non-gravitational acceleration” is gas jets, not engines. Loeb swings and misses again: ‘Oumuamua in 2017, IM1 in 2014, now this. Three strikes. The boy who cried alien is beginning to resemble the lead character in an Aesop fable.

Not that I’m keeping score…well I am…sort of. Since Area 51 seeped into public lore, alien conspiracies have multiplied beyond count, but I still haven’t shaken E.T.’s or Stitch’s hand. No green neighbors have moved next door, no embarrassing probes, just the Milky Way in all its immense, ancient glory remaining quiet. A 13.6-billion-year-old galaxy 100,000 light-years across, 100–400 billion stars, most of them likely hosting planets, and us, alone on a blue dot warmed by a middle-aged G2V star, 4.6 billion years old, quietly fusing hydrogen in the Orion Spur, between the galaxy’s Sagittarius and Perseus spiral arms.

No one knocking. But still, I like the quiet.

An immense galaxy of staggering possibilities, where the mind fails to comprehend the vastness of space and physics provides few answers. The Drake Equation, a probabilistic seven-term formula used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way, yields an answer of less than one (0.04, to be exact), which is less than the current empirical answer of 1: us, on the blue dot.

For the show-me crowd, here’s the Drake Equation: N = R* × f_p × n_e × f_l × f_i × f_c × L. Inserting 2025 consensus values for the parameters: two stars born each year. Nearly all with planets. One in five with Earth-like worlds. One in ten with life. One in a hundred with intelligence. One in ten with radio. A thousand years of signal. And the product is: less than one.
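
For readers who prefer their arithmetic executable, here is the same back-of-the-envelope as a minimal Python sketch, using exactly the parameter values quoted above:

```python
# Back-of-the-envelope Drake Equation with the consensus values quoted above.
R_star = 2.0    # stars born per year in the Milky Way
f_p    = 1.0    # fraction of stars with planets (nearly all)
n_e    = 0.2    # Earth-like worlds per planetary system (one in five)
f_l    = 0.1    # fraction of those that develop life (one in ten)
f_i    = 0.01   # fraction of those that develop intelligence (one in a hundred)
f_c    = 0.1    # fraction of those that develop radio (one in ten)
L      = 1000   # years a civilization broadcasts

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"N = {N:.2f} communicative civilizations")   # N = 0.04
```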

For the true optimist, let’s bump N up to 100. Not really a loud party, but enough noise that someone should have called the police by now.

No sirens. I like the quiet.

But now add von Neumann self-replicating probes traveling at relativistic speeds: one advanced civilization could explore the galaxy in 240 ship-years (5,400 Earth years). A civilization lasting 1 million years could do this nearly 200 times over. Yet we see zero Dyson swarms, zero waste heat, zero signals. Conclusion: either N = 0, or every civilization dies before it advances to the point where others can see it. That leaves a galaxy in a permanent civilizational nursery state, or civilizations that all died off before we had the ability to look for them, or we are alone and always have been.

Maybe then, but not now. Or here but sleeping in the nursery. I like the quiet.

But then I remember Isaac Asimov’s seven‑novel Foundation saga. The Galactic Empire crumbles. Hari Seldon’s psychohistory predicts collapse and rebirth. The Second Foundation manipulates from the shadows. Gaia emerges as a planet‑wide mind. Robots reveal they kept it going: Daneel Olivaw, 20,000 years old, guiding humanity. And the final page (Foundation and Earth, 1986) exposes the beginning: Everything traces back to Earth. A radioactive cradle that forced primates to evolve repair genes, curiosity, and restlessness. We are radiation’s children. We didn’t find aliens. We are the aliens.

We are the cradle. We are the travelers. I still like the quiet.

Beginnings

A recent ScienceDaily write‑up titled “Scientists just found the hidden cosmic fingerprints of dark matter” suggests a breakthrough in the elusive substance that binds galaxies together. In reality, the study reports that Lyman‑Alpha emitters are a transient phenomenon: interesting, but nowhere near the revolutionary advance implied by the headline.

For readers uninitiated in cosmology and astrophysics, that’s a lot of jargon at once. So let’s bring it down a notch with some plain definitions.

Dark matter is the invisible mass that holds galaxies together through gravity. Without it, galaxies would fly apart. We infer its existence only because galaxies behave as they do. It makes up about 27% of the universe’s total energy density. By comparison, ordinary matter, the stuff we can see and measure, accounts for a measly 5%. Dark energy, the mysterious driver of cosmic acceleration, contributes about 68%. But that’s a story for another day.

Lyman‑Alpha emitters (LAEs) are distant, generally low‑mass galaxies that shine in Lyman‑alpha radiation: ultraviolet light produced when a hydrogen electron drops from the second energy level to the ground state (n=2 → n=1). Because this light is strongly redshifted by cosmic expansion, LAEs act as beacons of the early universe. Observing the ones behind the ScienceDaily headline above means looking back to a time when the cosmos was less than a billion years old.
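
The arithmetic is compact enough to check yourself. A minimal Python sketch, deriving the rest wavelength from the Rydberg formula and then redshifting it for a z = 6 emitter:

```python
# Rest wavelength of Lyman-alpha (n=2 -> n=1) from the Rydberg formula,
# then where cosmic expansion shifts it for a z = 6 emitter.
R_H = 1.0968e7                             # Rydberg constant for hydrogen, 1/m
lam_rest = 1 / (R_H * (1/1**2 - 1/2**2))   # metres
z = 6
lam_obs = lam_rest * (1 + z)               # observed wavelength, stretched by expansion

print(f"rest:     {lam_rest*1e9:.1f} nm (far ultraviolet)")   # ~121.6 nm
print(f"observed: {lam_obs*1e9:.0f} nm (near-infrared)")      # ~851 nm
```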

Scientists examine the clustering of LAEs across three epochs, each marking a milestone in cosmic evolution, a page from the manuscript of creation. At a redshift of 6, when the universe was about 0.9 to 1.0 billion years old, roughly 12.8 billion years ago, the first galaxies and stars were re‑ionizing neutral hydrogen, lifting the primordial fog and making the universe transparent. This period is known as the Epoch of Reionization.

The next epoch, at a redshift of 5.7 (about 100 million years later, or 12.7 billion years ago), is called the Late Reionization / Transition Epoch. Here, scientists measure how quickly the fog of neutral hydrogen dissipated and how galaxies began to cluster. Clustering serves as a proxy for the gravitational wells of dark matter, which drew in and anchored ordinary matter.

Finally, at a redshift of 3, around 11.8 billion years ago, the Post‑Reionization Epoch reveals a more mature universe with large‑scale structures taking shape. LAEs in this era trace galaxy clustering and help infer the masses of the dark matter halos they inhabit. These halos are vast, spherical envelopes of unseen matter surrounding galaxies and clusters.
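
If you want to translate these redshifts into ages yourself, astropy ships a ready-made Planck 2018 cosmology. A minimal sketch (the results land close to, but not exactly on, the rounded figures above):

```python
# Cosmic age and lookback time at the three epochs, using astropy's built-in
# Planck 2018 cosmology.
from astropy.cosmology import Planck18

for z in (6.0, 5.7, 3.0):
    age = Planck18.age(z)                  # age of the universe at redshift z
    lookback = Planck18.lookback_time(z)   # how long ago that light left
    print(f"z = {z}: age ≈ {age:.2f}, lookback ≈ {lookback:.2f}")
# z = 6.0: age ≈ 0.93 Gyr, lookback ≈ 12.85 Gyr
# z = 5.7: age ≈ 1.00 Gyr, lookback ≈ 12.79 Gyr
# z = 3.0: age ≈ 2.15 Gyr, lookback ≈ 11.64 Gyr
```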

With this groundwork, we return to the science press claim that researchers have found the “fingerprints” of dark matter itself. In truth, the fingerprints show no loops or swirls, no identification of what dark matter is or how it is distributed, only confirmation of what is already established. Without dark matter, galaxies would not exist. It is, in essence, a Cartesian maxim: I gather, therefore I am. Nothing more. Nothing less.

There was, however, a genuine insight. Lyman‑alpha emitters are transient, short‑lived luminous phases in galaxies that trace the framework of dark matter. The clustering function does not reveal dark matter’s nature; it just shows how rarely baryonic light, the real stuff of frogs, men, and Cybertrucks, aligns with gravitational tugs.

This raises a deeper question: why does dark matter clump at all, instead of remaining uniform across the cosmos? The answer lies in gravitational instability. Minute quantum fluctuations in the infant universe were stretched to cosmic scales by inflation, imprinting faint density variations, ripples in spacetime itself (whether time exists is another question for a different day). Cold, non‑interacting dark matter streamed into these wells, not merely seeking density but becoming it, deepening the imprints and laying the invisible scaffolding upon which galaxies and clusters would later rise. In turn, the growing clumps reinforced the very variations that seeded them, a feedback loop that sculpted the universe’s large‑scale structure. Quantum fractures first, dark matter responding.

And yet another knot: where did dark matter come from? If it does not interact, how could it be born from interaction? Perhaps it is not a product of the Big Bang at all. Did it exist outside the Bang, or was it a transformation from an earlier state?

Unto the spirit of dark energy, the expansive gust that stretches spacetime, accelerating the universe’s drift into an ever‑expanding horizon. If dark matter is transformation, is dark energy its continuation, or merely a phase toward dissolution?

Together they form a cosmic tension: cohesion and dispersal, gathering and vanishing. The Big Bang may not be the beginning, but only the first visible flare in a manuscript already dictated eons before the dawn.

In this reframing, baryonic matter: atoms, stars, flesh, machines, is a late arrival. Bone, blood, and silicon are ritual sparks, flaring briefly in the gravitational wells carved by dark matter and stretched by dark energy. We are not the fathers of the universe, but the children of a violent past.

Dark matter is the glue. Dark energy erases the image. We are but the punctuation; marks in a manuscript whose lines were written long before our arrival.

Source: …Fingerprints of Dark Matter, ScienceDaily, Sept. 2025. ODIN: Clustering Analysis… by Herrera et al., Astrophysical Journal Letters, 2025. Graphic: Lyman-Alpha Galaxy Up Close Illustration by M. Wiss, 2009. Public Domain.

Shot in the Dark

The Earth orbits the Sun at a brisk 107,000 km/hr (~66,500 mi/hr). The Sun, in turn, circles the Milky Way at a staggering 828,000 km/hr (~514,000 mi/hr). And deep in the galactic core, stars whirl around the supermassive black hole at relativistic speeds, up to 36 million km/hr (~22.4 million mi/hr). Gravity is the architect and master of this motion: the invisible hand that not only initiates these velocities but binds our galaxy into a luminous spiral of unity.

Except it shouldn’t. Not with the piddling amount of mass that we can see.

The Milky Way contains 60-100 billion solar masses, an impressive sum, but a puny, gravitationally insufficient amount. With only that amount of ordinary matter, the galaxy would disperse like dry leaves in a breeze. Its stars would drift apart, its spiral arms dissolve, and the universe itself would remain a diffuse fog of light and entropy, never coalescing into structure or verse. No Halley’s Comet. No seasons. No Vivaldi.

To hold the Milky Way together at its observed rotation speeds requires about 1.4 trillion solar masses, seven times the visible amount. And we know this mass is there not because we’ve seen it, but because the galaxy exists. Much like Descartes’ Cogito, ergo sum (“I think, therefore I am”), we reason: The Milky Way is; therefore, it must possess sufficient mass.
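
The inference itself is one line of Newtonian algebra: for a circular orbit, v² = GM/r. A minimal sketch using the Sun’s own orbit:

```python
import math

G     = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30           # solar mass, kg
KPC   = 3.086e19           # one kiloparsec in metres

v = 230e3                  # Sun's orbital speed, m/s (~828,000 km/hr)
r = 8.2 * KPC              # Sun's distance from the galactic centre

# For a circular orbit, v^2 = G*M/r, so the mass enclosed within the orbit is:
M_enclosed = v**2 * r / G
print(f"Mass inside the Sun's orbit ≈ {M_enclosed / M_SUN:.1e} solar masses")
# ≈ 1e11 solar masses -- and rotation speeds stay high far beyond this radius,
# which is exactly the signature of mass we cannot see.
```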

The problem is that 85% of that mass is missing: from view, from touch, from detection. Enter stage right: Dark Matter. It does not emit, absorb, or reflect light. It does not interact with ordinary matter in any known way. It is invisible, intangible, a Platonic ether of shadow reality. Without it, the sacrament of gravity and being floats away like a balloon on a huff and puff day. And the universe loses its meaning.

Much like the neutrino, a particle once postulated to preserve the sanctity of conservation laws, a piece of the quantum world long before it was ever seen, Dark Matter is an elusive phantom, inferred by effect but physically undetected. Dark Matter bends light, sculpts galaxies, and governs gravitational dynamics, yet it inhabits a metaphysical realm that requires faith to make it real. Unlike the neutrino, it lacks a theoretical platform. The General Theory of Relativity insists it must have mass; the Standard Model offers it no space. It is an effect without a cause: a gravitational fingerprint without a hand.

Yet, physicists are trying to tease it out, not so much to grasp a formless ghost, but rather to catch a glimpse of a wisp, a figment, without knowing how or where to look. To bring light to the dark one must grope around for a switch that may or may not exist.

Researchers at the University of Zurich and the Hebrew University of Jerusalem have devised an experiment called QROCODILE: Quantum Resolution-Optimized Cryogenic Observatory for Dark matter Incident at Low Energy (One can only guess at the amount of time and gin the Docs spent on that acronym 😊) to help tease out the existence of Dark Matter.

The experiment is designed to detect postulated ultralight dark matter particles that may interact with ordinary matter in currently unfathomable ways. To find these particles they have built a detector of superconducting nanowire sensors, cooled to near absolute zero, that achieves an astounding sensitivity: it can register an energy deposit as small as 0.11 electron-volts (eV).

0.11 eV is roughly the energy difference between two quantum states in a molecule. An imperceptible shiver in the bond between two hydrogen atoms: an energy so slight, it might register a murmur of dark matter itself.
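
To put that threshold on more familiar scales, a quick unit-conversion sketch:

```python
EV = 1.602e-19       # one electron-volt in joules
C  = 2.998e8         # speed of light, m/s

threshold = 0.11 * EV                  # QROCODILE's quoted energy threshold
print(f"energy: {threshold:.2e} J")    # ~1.8e-20 joules

# If that whole energy budget were rest mass (E = m c^2):
m = threshold / C**2
print(f"mass equivalent: {m:.2e} kg")  # ~2e-37 kg, far below an electron's 9.1e-31 kg
```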

Using this detector over a 400-hour run (16.7 days), the team recorded a handful of unexplained signals that are real but not necessarily dark matter. Eventually they hope to achieve detections that resolve directionality, helping distinguish dark matter from background noise. The next phase of the experiment, NILE QROCODILE (groan*), will move the detectors underground to reduce cosmic interference.

QROCODILE is a shot in the dark. It’s an epistemological paradox: how do you build a detector for something you don’t understand? How, or why, do you build an energy detector for a substance, if it is indeed a substance, that doesn’t emit or absorb energy?

While dark matter is known through its gravitational pull, gravitational detection at the particle level is infeasible. Energy detectors, then, are a complementary strategy, betting on weak or exotic interactions beyond gravity.

Whether it finds Dark Matter or not, QROCODILE reminds us that science begins not with certainty, but with the courage to ask questions in the dark, and the craftsmanship to build instruments that honor the unknown.

* NILE QROCODILE: an acronym that evokes remembrance of the socially awkward Dr. Brackish Okun, a secluded researcher of aliens and their tech at Area 51 in the 1996 movie Independence Day.

Source: …Dark Matter Search with QROCODILE… by Laura Baudis et al., Physical Review Letters, 2025. Graphic: Nile Crocodile Head by Leigh Bedford, 2009. Public Domain.

Color in the Eye of the Beholder

Ansel Adams (1902-1984), photographer of the majestic, was exceptionally elusive when it came to why he preferred black-and-white photographs over color, offering only a few comments on his medium of choice. He believed that black-and-white photography was a “departure from reality,” which is true on many levels, but that is also true of most artistic efforts and products. He also held the elementary belief that “one sees differently with color photography than black-and-white.” Some have even suggested that Adams said, “…when you photograph them in black and white, you photograph their souls,” but this seems apocryphal since most of his oeuvre was landscape photography.

Adams’s black-and-white photography framed the grandeur of the mountainous West in stark, unembellished terms. Yet without color, a coolness loiters, untouched by human sentiment or warmth. As an unabashed environmentalist, maybe that was his point: the majesty of the outdoors was diminished by human presence. In black-and-white, the wilderness remained unsullied and alone.

But to Claude Monet (1840-1926), founding French Impressionist, color and light were everything. Color defined his paintings. “Color is my day-long obsession, (my) joy…,” he confessed. It was also a constant burden that he carried with him throughout the day and into the night, lamenting, “Colors pursue me like a constant worry. They even worry me in my sleep.” He lived his aphorism: “Paint what you really see, not what you think you ought to see…but the object enveloped in sunlight and atmosphere, with the blue dome of Heaven reflected in the shadows.” His reality was light and color with a human warming touch.

Adams’s and Monet’s genius lay partly in their ability to use light to capture the essence of the landscape, but Monet brought the soul along in living color. Monet’s creed: “I want the unobtainable. Other artists paint a bridge, a house, a boat, and that’s the end…. I want to paint the air which surrounds the bridge, the house, the boat, the beauty of the air in which these objects are located…”

Color is a defining quality of humanity. Without color, life would be as impersonal as Adams’s landscapes, beautiful, majestic even, but without passion or pulse. A sharp, stark visual with little nuance, no emotional gradations from torment to ecstasy, just shadows and form.

Understanding color was not just a technical revelation for 19th-century French artists, it was a revolutionary awakening, a new approach to how the eye viewed color and light. The Impressionists and Pointillists brought a new perception to their canvases. And the catalyst for this leap away from the tired styles of Academic Art and Realism was Michel Eugène Chevreul, a chemist whose insight into color harmony and contrast inspired the Monets and Seurats to pursue something radically different in the world of art. His chromatic studies inspired them to paint not for the viewer’s eye, but with it, transforming perception from passive witness into an active collaboration between painter, subject, and observer.

Chevreul’s breakthrough was deceptively simple. Colors are not static blots on a canvas but relational objects that come alive when surrounded by other hues of the spectrum. A hue in isolation is perceived differently than when seen next to another. Red deepens next to green; blue pulsates with enthusiasm against orange. This principle, simultaneous contrast, revealed that the eye does not just passively accept what it sees but synthesizes it into a new reality.

Chevreul’s theories on complementary colors and optical mixing laid the foundation for painters to forsake rigid outlines, often rendered in the non-color of black, and embrace Impressionism: not merely an art style, but a promise of perception, a collaboration between painter and viewer. Rather than blending pigments on a palette, artists like Monet and Seurat placed discrete strokes side by side, allowing the viewer’s mind to complete the image.

This optical mixing is a product of the way the eye and the brain process the various wavelengths of white light. When complementary colors are adjacent to one another, the brain amplifies the differences. Neurons in the eye are selfish: when a photoreceptor is stimulated by a color, it suppresses adjacent receptors, sharpening boundaries and contrast. And the brain interprets what it sees based on context, which is why we sometimes see what is not there or misinterpret what is, such as faces on the surface of Mars or UFOs streaking through the sky. There is also a theory, opponent processing, that the brain encodes color in opposing pairs: when it sees red it suppresses green, creating a vibrancy when complementary colors are placed together.
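
The optical-mixing idea is easy to demonstrate numerically. A toy sketch: build a “canvas” of alternating pure red and green strokes, then blur them together as viewing distance would, and a color appears that exists nowhere on the canvas:

```python
import numpy as np

# A one-row "canvas" of alternating pure red and pure green strokes (RGB).
red, green = np.array([255.0, 0, 0]), np.array([0, 255.0, 0])
canvas = np.array([red if i % 2 == 0 else green for i in range(100)])

# Up close, every stroke is pure:
print(canvas[50])            # [255. 0. 0.]

# From across the room the eye can no longer separate the strokes; model that
# blur as a simple average. The result is a yellow no stroke contains.
print(canvas.mean(axis=0))   # [127.5 127.5 0.]
```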

The Impressionists intensely debated Chevreul’s concepts, then brushed them to life with paint. They painted not concrete objects, but forms shaped by light and color. Haystacks and parasols within a changing mood of contrasting color. Interpretation by the eye of the beholder.

Chevreul’s collected research, The Principles of Harmony and Contrast of Colors and Their Applications to the Arts, originally published in 1839, remains in print nearly two centuries later.

Source: The Principles of Harmony and Contrast of Colors and Their Applications to the Arts by Michel Eugène Chevreul, 1997 (English Translation). Graphic: Woman with a Parasol by Monet, 1875. National Gallery of Art, Washington, DC. Public Domain.

Cosmos of the Lonely

The universe keeps expanding. When researchers analyze data from the Hubble and James Webb telescopes, alongside a suite of other astronomical tools, they find that the recessional velocity of galaxies, the speed at which they appear to move away from the Earth, varies depending on what they measure.

If they calibrate distances deep into the cosmos using Cepheid variable stars, the expansion rate appears faster than when they use red giant stars or the Cosmic Microwave Background (CMB). This discrepancy, known as the Hubble tension, reveals a deeper mystery: different cosmic yardsticks yield different rates of expansion.

Yet despite the disagreement in values, all methods affirm the same truth: space is stretching…a lot…like a sheet pulled and stretched taut between Atlas’s burden and Hermes’ flight: a cosmos caught between gravitational pull and a mysterious push: Pushmi-Pullyu on a cosmic scale.

To understand why the cosmos resembles a sheet of rubber we need to travel back about 110 years and peer into the minds of those who first saw increasing separation as a universal law. These new architects of reality, Einstein, Friedmann, Lemaître, replaced Newton’s planetary, static models of the cosmos with a dynamic spacetime of bends, ripples, and persistent expansion.

After Einstein published his General Theory of Relativity in 1915, Russian physicist Alexander Friedmann’s analysis of his work showed that the universe could be expanding, and that Einstein’s equations could be used to calculate the rate. In 1927 Belgian priest and physicist Georges Lemaître proposed that a galaxy’s recessional velocity might be proportional to its distance from Earth. By 1929, American astronomer Edwin Hubble expanded on Lemaître’s work and published what became known as the Hubble-Lemaître law: galaxies are moving away from us at speeds proportional to their distance. The greater the distance, the faster the speed.

A key feature of this law is the Hubble constant, the proportionality that links velocity and distance. Hubble’s initial estimate for this constant was a whopping, and egregiously off, 500 kilometers per second per megaparsec (km/s/Mpc), but as measurements improved, it coalesced around a range between 67 and 73, with the most recent value at 70.4 km/s/Mpc, published by Freedman et al. in May 2025.

The Hubble constant is expressed in kilometers per second per megaparsec. The scale of these units is beyond human comprehension, so let’s ground it in something manageable. A megaparsec is about 3.26 million light-years, and the observable universe, though only 13.8 billion years old, has stretched to 46 billion light-years in radius, or 93 billion light-years in diameter, due to the expansion of space (see mind-warping explanation below).

To calculate the recessional velocity across this vast distance, we first convert 46 billion light-years into megaparsecs, which equates to about 14,110 megaparsecs. Applying Hubble’s Law: 70 km/s/Mpc times 14,110 Mpc equals 987,700 km/s. This is the rate at which a galaxy 46 billion light-years away would be receding relative to us.
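
The same calculation, executable:

```python
H0 = 70.0            # Hubble constant, km/s per Mpc
LY_PER_MPC = 3.26e6  # light-years in one megaparsec
C_KM_S = 299_792     # speed of light, km/s

d_ly  = 46e9                  # radius of the observable universe, light-years
d_mpc = d_ly / LY_PER_MPC     # ~14,110 Mpc
v = H0 * d_mpc                # Hubble's Law: v = H0 * d

print(f"distance: {d_mpc:,.0f} Mpc")
print(f"recessional velocity: {v:,.0f} km/s ({v / C_KM_S:.1f}x the speed of light)")
# ~987,700 km/s, about 3.3 times c -- space itself stretching, not motion through it
```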

That’s more than three times the speed of light (299,792 km/s), or Warp 3-plus in Star Trek parlance. Einstein said this was impossible, but fortunately there is some nuance that keeps us in compliance with Special Relativity (or else the fines would be astronomical). This isn’t the speed of a galaxy moving through space, but the speed at which the space between galaxies is expanding. Which, admittedly, is terribly confusing.

The speed of a galaxy, composed of matter, energy, and dark matter, must obey Einstein’s rules: gravity and Special Relativity. And one of the rules is that the speed of light is the cosmic speed limit: no one shall pass beyond it.

But space between the galaxies decides to emphasize the rules in a different order. The expansion of space is still governed by Einstein’s equations, just interpreted through the lens of spacetime geometry rather than the motion of objects. This geometry is shaped by, yet not reducible to, matter, energy, and dark matter.

Expansion is a feature of spacetime’s structure, not velocity in the usual sense, and thus isn’t bound by the speed of light. If space wants to expand, stretch, faster than a photon can travel, well so be it.

The space between galaxies is governed by dark energy and its enigmatic rules of geometry. Within galaxies, the rules are set by dark matter and, to a lesser extent, by matter and energy; dark energy is likely present too, but its influence at galactic scales is minimal.

Note the use of the word scale here. Galaxies are gigantic, the Milky Way is 100,000-120,000 light-years in diameter. But compared to the universe at 93,000,000,000 light-years across, they’re puny. You would need 845,000 Milky Ways lined up edge-to-edge to span the known universe.

Estimates of the number of galaxies in the universe range from 100 billion to 2 trillion. So, at the scale of the universe, galaxies are mere pinpoints of light; blips of energy scattered across the ever-expanding heavens.

This brings us to dark energy, the mysterious force driving cosmic expansion. No one knows what it is, but perhaps empty space and dark energy are the same. There’s even some speculation, mostly mine, that dark energy is a phase shift of dark matter. A shift in state. A triptych move from Newtonian physics to Quantum Mechanics to…Space Truckin’.

In the opening moments after the Big Bang, the universe was dominated by radiation composed of high-energy particles and photons. As the universe cooled, the radiation gave way to matter and dark matter. As time allowed gravity to create structures, black holes emerged and a new force began to dominate: dark energy. But where did the dark energy come from? Was it always part of the universe, or did it evolve from other building blocks? Below are a few speculative ideas floating around the cosmic playroom.

J.S. Farnes proposed a unifying theory where dark matter and dark energy are aspects of a single negative mass fluid. This fluid could flatten galaxy rotation curves and drive cosmic expansion, mimicking both phenomena simultaneously.

Mathematicians Tian Ma and Shouhong Wang developed a unified theory that alters Einstein’s field equations to account for a new scalar potential field. Their model suggests that energy and momentum conservation only holds when normal matter, dark matter, and dark energy are considered together.

Ding-Yu Chung proposed a model where dark energy, dark matter, and baryonic matter emerge from a dual universe structure involving positive and negative mass domains. These domains oscillate and transmute across dimensions.

These ideas all orbit a single notion: everything evolves, and matter and energy, in all their forms, flicker in and out of existence depending on the dimensional scaffolding of space and the strength of gravity and radiation fields. Rather than being separate entities, radiation, energy, matter, dark matter, and dark energy may be expressions of a single evolving field, shaped by phase transitions, scalar dynamics, or symmetry breaking.

Now back to my regularly scheduled program. In August 2024, Quanta Magazine reported on a study led by Nobel laureate Adam Riess using the James Webb Space Telescope (JWST) to measure over 1,000 Cepheid variable stars with unprecedented precision. Cepheid stars pulsate in brightness with a highly predictable rhythm, making them ideal cosmic yardsticks. Riess’s team found a Hubble constant of ~73.4 km/s/Mpc, consistent with previous Hubble Space Telescope measurements of Cepheid stars but still significantly higher than what theory predicts.

That theory comes from the standard model of cosmology: Lambda Cold Dark Matter. According to this framework, photons decoupled from the hot, opaque electron-proton soup about 380,000 years after the Big Bang went boom, allowing light to travel freely for the first time and space to become somewhat transparent and visible. This event produced the Cosmic Microwave Background (CMB).

This CMB permeates the universe to this day. It was discovered in 1964 by Bell Lab physicists Arno Penzias and Robert Wilson, who were trying to eliminate background noise from their radio antenna. The noise turned out to be the faint afterglow from the Big Bang, cooled down from its original 3000 Kelvin to a frosty 2.7 Kelvin. They received the Nobel Prize in Physics for this discovery in 1978.

Light from the CMB, as measured by the European Space Agency’s Planck satellite, has a redshift of approximately 1100, meaning the universe has expanded by a factor of about 1100 since that light was released, 13.8 billion years ago. By analyzing the minute temperature fluctuations in the CMB, Planck can infer the density of matter, dark energy, and the curvature of the universe. Inserting these parameters into the Lambda Cold Dark Matter model yields a Hubble constant of 67.4 ± 1.71 km/s/Mpc (65.69–69.11). This value is considered the gold standard. Values beyond the Planck measurement are not necessarily wrong, just not understood.
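
That factor-of-1100 stretch is also why the fireball’s 3000 Kelvin now reads 2.7: the CMB temperature simply scales with expansion, T_then = T_now × (1 + z). A one-line check:

```python
T_NOW = 2.725      # CMB temperature today, Kelvin
z = 1100           # redshift of the CMB measured by Planck

T_then = T_NOW * (1 + z)                     # temperature when the light was released
print(f"T at decoupling ≈ {T_then:.0f} K")   # ≈ 3000 K, matching the figure above
```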

At first glance, the difference between Planck’s 67.4 and Riess’s 73.4 may seem small. But it is cosmically significant. For two galaxies 43 billion light-years away and 3.26 billion light-years (1000 Mpc) apart, the two constants predict velocities differing by 6000 km/s, about 189 billion kilometers of additional separation per year. That’s the scale of what small differences in the value add up to, and it is what is referred to as the Hubble tension.
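
The arithmetic behind those figures, as a sketch:

```python
SECONDS_PER_YEAR = 3.156e7

dH = 73.4 - 67.4   # gap between the Cepheid and CMB values, km/s per Mpc
d  = 1000          # separation between the two galaxies, Mpc

dv = dH * d        # velocity difference implied by the two constants
print(f"velocity gap: {dv:,.0f} km/s")                           # 6,000 km/s
print(f"extra separation: {dv * SECONDS_PER_YEAR:.2e} km/year")  # ~1.89e11 km
```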

Meanwhile, a competing team of researchers studying stars at the tip of the red giant branch consistently scored the Hubble constant closer to the theoretical prediction of 67.4. This team, led by Wendy Freedman, believes that the Hubble tension, the inability of various methods of measuring the Hubble constant to collapse to a single value, is a result of measurement errors.

While some researchers, Wendy Freedman and others, suggest lingering systematic errors may still be at play, the persistence of this discrepancy across instruments, methods, and teams has led others to speculate about new physics. Among the most provocative ideas: the possibility that the universe’s expansion rate may vary depending on direction, hinting at anisotropic expansion and challenging the long-held assumption of cosmic isotropy. But this seems far-fetched, and if true it would likely break the Lambda Cold Dark Matter model into pieces.

And so, the cosmos grows lonelier. Not because the galaxies are fleeing, but because space itself is stretching, a wedge governed by the geometry of expansion. The further they drift apart, the less they interact, a divorce from neglect rather than malice. In time, entire galaxies will slip beyond our cosmic horizon, receding faster than light, unreachable even in principle. A cosmos of the lonely.

Source: The Webb Telescope Further Deepens the Biggest Controversy in Cosmology by Liz Kruesi, Quanta Magazine, 13 August 2024. JWST Observations Reject Unrecognized Crowding of Cepheid Photometry as an Explanation for the Hubble Tension at 8σ Confidence by Riess et al., The Astrophysical Journal Letters, 6 February 2024. Graphic: Cosmic Nebula by Margarita Balashova.

Women and Glass: The Starlight Calculators of Harvard

In the halcyon days of yore before digital ubiquity and tonal exactitude, computers were made of flesh and blood, fallibility crossed with imaginative leaps of genius. Photographs etched starlight’s past onto glistening glass and preserved silver. Solid archives where memory endures and future discoveries shimmer with potential, encoded in celestial light of the heavens awaiting the discerning caress of curiosity, intuition, and reason.

In 1613, English poet Richard Brathwait, best remembered for his semi-autobiographical Drunken Barnaby’s Four Journeys, enshrined the word computer into written English while contemplating the divine order of the heavens, calling God the “Truest computer of Times.” Rooted in the Latin computare, meaning “to reckon together,” the term evolved over the next three centuries to describe human minds inimitably attuned to the interpretation of visual data: star fields, spectral lines, geologic cross-sections, meteorological charts, and other cognitive terranes steeped in mystery, teasing initiates with hints of vision and translation. These were not mere calculators nor unimaginative computers, but perceptive analysts, tracing patterns, exposing truths, and coaxing insights from fluid shapes etched into the fabric of nature.

By the time of the Enlightenment and the scientific revolution, human computers had become the invisible deciphering force behind truth seeking laboratories, the unsung partners in progress, cataloging, interpreting, and taming the flood of empirical but seemingly nonsensical data that overwhelmed those without insight. Harvard College Observatory was no exception. With photography now harnessed to astronomy’s telescopes, the observatory could suddenly capture and archive starlight onto glass plates of coated silver, forever changing astronomy from the sketches of Galileo to silver etches of eternal starlight.

But these glass plates, resplendent with cosmic information, remained galleries of dusty, exposed negatives, inert until absorbed and guided by human curiosity and insight.

Enter the women computers of Harvard. Beginning in 1875, over 140 women, many recruited by Edward Charles Pickering, processed more than 550,000 photographic plates, the last collected in 1992, bringing much-needed coherence and linearity to the chaos of too much. They sorted signal from celestial noise, revealing the hidden order of the universe inscribed in silver, preserved in silica.

In 1875 the initial cohort, the pioneers, the first Harvard women computers, although not yet given that moniker, began appearing by name on the glass plates: Rebecca Titsworth Rogers, Rhoda G. Saunders, and Anna Winlock, assisting in the absolutely essential process of what we would now call cross-referencing the glass plates’ ‘metadata’ with the astronomical data, ascertaining that the time and space of the data match the time and space of the metadata. In 1881 Pickering, the observatory’s fourth director, began hiring women specifically as Astronomical Computers, a formal role focused on analyzing and deciphering the growing collection of glass plate photographs.

This shift in 1881 was more than semantic, more than a fancy title for drudge work and tedious plate cataloging: it was a structured program in which women like Williamina Fleming, Annie Jump Cannon, Henrietta Swan Leavitt, and Cecilia Payne-Gaposchkin were tasked not just with cataloging stars, but with studying stellar spectra, the lights powering life and imagination throughout the universe. These indispensable efforts led to the Henry Draper Catalogue, eventually built from the half-million-plus glass plates, and to the foundations of modern stellar classification systems and 21st-century astronomy. Their stories are worthy of a Horatio Alger novel, maybe not exactly rags to riches, but certainly humble beginnings to astronomical fame. They were paid peanuts, but they were the elephants in the observatory.

Williamina Fleming arrived in Boston in 1879, penniless and abandoned by her husband, and secured a job as a domestic in the home of Edward Pickering, yes, that guy. She so impressed Pickering’s wife, Elizabeth, with her intelligence that Elizabeth recommended her for work in the observatory. She quickly outpaced her male counterparts, and in 1881 was officially hired as one of the first Harvard Computers.

Studying the photographed spectra of stars, she developed a classification system, the natural human desire to find order in apparent chaos, based on the abundance of hydrogen at the surface of a star, or, more exactly, the strength of the hydrogen absorption lines in the spectral data. Stars with the strongest lines were classed as A stars, the next strongest as B stars, and so on down the alphabet.

In 1896 Pickering hired Annie Jump Cannon, who held a physics degree from Wellesley and was an amateur photographer. She modified Fleming’s stellar classification system to key on the surface temperature of a star rather than hydrogen abundance. Her method used the strength of the Balmer absorption lines: electrons excited within hydrogen atoms, like dancers at different tempos, reveal themselves through subtle spectral lines, now understood to reflect differing ionization states of the atom tied directly to the surface temperature of the star.

Her system used the same letters to avoid redoing the entire Harvard catalogue, but she reduced the list to seven and reordered them from hottest to coolest: O, B, A, F, G, K, M. Her classification is still in use today. Earth revolves around a G-class star with a medium surface temperature of about 5800 K (9980 F or 5527 C).
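
Cannon’s scheme is, at heart, a lookup from temperature to letter. A toy sketch, using approximate modern temperature boundaries (the cutoffs here are conventions I’ve assumed, not her original spectral criteria):

```python
# Cannon's seven classes, hottest to coolest, with approximate lower temperature
# boundaries in Kelvin.
SPECTRAL_CLASSES = [
    ("O", 30_000), ("B", 10_000), ("A", 7_500),
    ("F", 6_000),  ("G", 5_200),  ("K", 3_700), ("M", 0),
]

def spectral_class(teff_kelvin: float) -> str:
    """Return the OBAFGKM class for a star's effective surface temperature."""
    for letter, lower_bound in SPECTRAL_CLASSES:
        if teff_kelvin >= lower_bound:
            return letter
    return "M"

print(spectral_class(5_800))   # "G" -- the Sun
print(spectral_class(25_000))  # "B"
```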

Henrietta Swan Leavitt graduated from Harvard’s women’s college in 1892 with what we might now call a liberal arts degree. A year later, she began graduate work in astronomy, a foundation for employment at the Harvard Observatory. After several extended detours tucked under her petticoats, Edward Charles Pickering brought her back to the Observatory in 1903. She worked initially without pay, later earning an unfathomable 30 cents an hour.

There, Leavitt collaborated with Annie Jump Cannon; in a coincidence of some note, both women were deaf, though one is left with the feeling that the absence of sound may have amplified the remaining sensory inputs to their fertile minds. In time, Leavitt uncovered a linear relationship between the logarithm of a Cepheid variable star’s period and its luminosity, a revelation that became an integral part of the cosmic yardstick for measuring galactic distances. The Period-Luminosity relation is now enshrined as Leavitt’s Law.

Cepheid variables form the second rung of the Cosmic Distance Ladder, after parallax and before Type Ia supernovae, galaxy rotation curves, surface brightness fluctuations, and, finally, the ripples of Einsteinian gravitational waves. Leavitt’s metric would prove essential to Edwin Hubble’s demonstration that the universe is expanding.
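
Leavitt’s Law turns a stopwatch into a tape measure: time the pulsation, infer the true brightness, compare it to the apparent brightness, and the distance falls out. A minimal sketch, using one published V-band calibration of the relation (coefficients vary slightly between studies):

```python
import math

def cepheid_distance_parsecs(period_days: float, apparent_mag: float) -> float:
    """Distance from a Cepheid's pulsation period and apparent brightness."""
    # Leavitt's Law: absolute magnitude is linear in log(period).
    # These coefficients are one published V-band calibration; others differ slightly.
    M = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    mu = apparent_mag - M               # distance modulus
    return 10 ** (mu / 5.0 + 1.0)       # parsecs

# A 10-day Cepheid observed at apparent magnitude 14:
d = cepheid_distance_parsecs(10.0, 14.0)
print(f"{d:,.0f} pc (~{d * 3.26 / 1e6:.2f} million light-years)")
```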

Swedish mathematician Gösta Mittag-Leffler considered nominating her for the Nobel Prize in Physics, but his plans stalled upon learning she had died in 1921. The Nobel, then as now, is non-awardable to the dead.

Cecilia Payne-Gaposchkin, a transplanted Brit, joined the Harvard Observatory as an unpaid graduate fellow while working towards her PhD in astronomy at Radcliffe. Upon earning her doctorate, she continued at the Observatory with no title and little pay. By 1938 she was awarded the title of Astronomer, and by 1956 she was made a full professor on Harvard’s faculty.

In her dissertation she accurately showed for the first time that stars are composed primarily of hydrogen and helium, proving that hydrogen was the most abundant element in the universe, overturning long held but erroneous assumptions. But in a twist of fate, astronomer Henry Norris Russell persuaded her to label her conclusions of hydrogen abundance as spurious. Four years later Russell’s research reached the same conclusion, but he barely gave her an honorable mention when he published his results.

She wasn’t the first nor will she be the last to suffer at the hands of egotistical professors, more enamored of self rather than truth, but her elemental abundance contribution to astronomy brushed away the conceit that stars must mimic rocky planets in their composition, much like Galileo ended Earth’s reign as a center of everything. Twentieth century astronomer Otto Struve hailed her dissertation as “the most brilliant PhD thesis ever written in astronomy.”

Undeterred, and building on her studies of the spectral emissions of stars, she turned her gaze to high-luminosity and variable stars with her husband, astronomer Sergei Illarionovich Gaposchkin. After 2 million observations of variable stars, their efforts laid the groundwork for the study of stellar evolution: how stars change over the course of time. From hints of dispersed stardust to starlight and back again. Cycles of stellar life repeated billions of times over billions of years.

Harvard’s astronomical female human computers, initially mere clerks transcribing stars from silver and glass, evolved into interpreters of light, shaping the very foundations of astronomy. Through logic, imagination, and an unyielding devotion to truth, they charted the heavens and opened lighted pathways for generations to follow.

Graphic: The Harvard Computers standing in front of Building C at the Harvard College Observatory, 13 May 1913, Unknown author. Public Domain

To Boldly Go

On 23 June 2025, after more than three decades of evolution, from the gleam of an idea to detailed planning, exacting execution, and the physical realization of the world’s largest astronomical camera, the Vera C. Rubin Observatory’s Legacy Survey of Space and Time (LSST) in Chile unveiled to the public its first breathtaking images. Among them: razor-sharp mosaics of the Trifid and Lagoon Nebulae, and the sprawling Virgo Cluster in a field teeming with millions of galaxies. Captured with world-class light-collecting mirrors, these images marked the beginning of a spectacular ten-year quest to map the known universe and illuminate the 95% we still don’t understand: dark matter and dark energy. An exciting, albeit Herculean, future awaits, built on an equally stunning past where dreams and science converged into one of the most staggering feats of technological achievement in modern astronomy.

Let the future map of the universe tell its own story in due time. The path to the map deserves a chapter all its own.

In 1969 Willard Boyle and George Smith of Bell Labs invented a device capable of detecting and measuring the intensity of light, which they named the CCD, or Charge-Coupled Device: a breakthrough that earned them the 2009 Nobel Prize in Physics. A CCD converts incoming photons into electrical signals, creating a voltage map of light intensity, a digital proxy for the number of photons striking its surface. Initially constructed as a semiconductor chip, it quickly evolved into a pixelated imaging sensor. These sensors became the gold standard for consumer and scientific digital imaging, though consumer applications such as phone cameras later switched to cheaper CMOS sensors. Scientific and surveillance systems, such as the Hubble Telescope, SOAR, and SNAP, still employ CCDs because of their superior image fidelity.

In the late 1980s J. Anthony ‘Tony’ Tyson, an experimental physicist at Bell Labs, focused on developing instrumentation to detect faint optical signals using CCDs. His inspired contribution was to recognize the CCD’s potential for imaging the heavens, laying the groundwork for digital deep-sky surveys. He quickly discovered faint blue galaxies and gravitational lensing using modified CCDs that he helped develop. He also helped build the Big Throughput Camera that was instrumental in the 1998 discovery of dark energy.

Tyson never thought small. His CCDs were instruments of the infinitesimal, but his dreams were as gargantuan as the universe itself. In fact, his dream was the universe. In 1994 he proposed his “Deep Wide Fast” telescope, a scale-up of his Big Throughput Camera and the forerunner of the LSST. The concept would combine deep imaging, rapid cadence, and broad coverage simultaneously. In other words, a synoptic realization of the universe in near real time.

Throughout the 1990s, Tyson rallied minds and resources to shape his cosmic vision. John Schaefer of the Research Corporation helped secure early funding. Roger Angel proposed the use of the innovative Paul-Baker three-mirror telescope design. Institutions like the Universities of Arizona and Washington, along with the National Optical Astronomy Observatory, hitched their wagons to Tyson’s star-filled dream of mapping the universe.

In 1998 Tyson presented designs for a Dark Matter Telescope and in 1999 the science case was submitted to the Astronomy and Astrophysics Decadal Survey. In 2003 the first formal proposal was sent to the Experimental Program Advisory Committee at SLAC (Stanford Linear Accelerator Center). It consisted of an 8.4-meter mirror with a 2.3-billion-pixel camera capable of surveying the entire visible sky every few nights. The proposal also laid out the NSF–DOE partnership, with SLAC leading the camera development and other institutions handling optics, data systems, and site operations.

In 2004 Tyson left Bell Labs and joined the University of California at Davis as a cosmologist and continued to shepherd the LSST project from there.

In 2007 the project received $30 million in private funding from Charles Simonyi, Bill Gates, and others; the telescope is named the Simonyi Survey Telescope. In 2010 the U.S. National Science Foundation (NSF) and the Department of Energy (DOE) joined the quest to view the universe through the sharp eyes of the LSST.

The telescope’s primary 8.4-meter and the 5.0-meter tertiary mirrors were built at the University of Arizona, beginning in 2008, completed in 2015, and stored on-site in Chile since 2019. Fabricated in the U.S., the 3.4-meter secondary was later coated in Germany with nickel-chromium, silver, and silicon nitride, materials chosen to enhance reflectivity, durability, and long-term performance.

In 2015 SLAC, which oversaw the design, fabrication, and integration of the camera, began building the components with assistance from Brookhaven National Laboratory, Lawrence Livermore National Laboratory, and IN2P3/CNRS in France. By 2024 the camera was finished and shipped to Chile. In 2025 the camera was installed and integrated with the telescope. In June of 2025 the first light images were released to the public.

The camera measures roughly 3 meters in length, 1.65 meters in diameter, and weighs 3 metric tons, an imposing instrument, rivaling the bulk of a small car. Its imaging surface, a 64-centimeter focal plane, contains 3.2 billion pixels, each a 10-micron square, roughly one-tenth the width of a human hair. These pixels, etched across 189 custom CCD sensors arranged into 21 modular “rafts,” are laid flat to within 10 microns, ensuring near-perfect focus. The entire array is chilled to –100°C to suppress electronic and thermal noise, enhancing signal fidelity.

Before photons reach the sensor, they pass through three precision-crafted corrective lenses, including the largest ever installed in an astronomical camera, and up to six interchangeable filters spanning ultraviolet to near-infrared. The filter exchange system enables the observatory to target specific wavelength bands, tailored to sky conditions and science goals.

The integrated LSST system is engineered to capture a 15-second exposure every 20 seconds, producing thousands of images per night and tallying approximately 15 terabytes of new data. Each image covers 9.6 square degrees of sky, roughly the area of 45 full moons, allowing the system to survey the entire visible southern sky every 3–4 nights. Imaging a single field across all six filters can take up to 5–6 minutes, though filters are selected dynamically based on science goals and atmospheric conditions.
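
The cadence arithmetic is easy to sanity-check. A rough sketch, assuming eight usable observing hours per night and an ~18,000-square-degree survey footprint (both are my round-number assumptions, not official survey parameters):

```python
EXPOSURE_CADENCE_S = 20       # one 15-second exposure every 20 seconds
FIELD_DEG2 = 9.6              # sky area per image, square degrees
NIGHT_HOURS = 8               # assumed usable observing hours per night
VISIBLE_SKY_DEG2 = 18_000     # assumed southern-sky survey footprint

images_per_night = NIGHT_HOURS * 3600 // EXPOSURE_CADENCE_S
fields_to_cover  = VISIBLE_SKY_DEG2 / FIELD_DEG2
nights_per_pass  = fields_to_cover / images_per_night

print(f"images per night: {images_per_night}")           # 1,440
print(f"fields to tile the sky: {fields_to_cover:.0f}")  # ~1,875
print(f"nights per full pass: {nights_per_pass:.1f}")    # ~1.3 for a single visit
# Repeat visits per field, in different filters, stretch this to the quoted 3-4 nights.
```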

The system’s angular resolution is sharp enough to resolve a golf ball from 15 miles away; at the edge of the observable universe, this scales to structures no smaller than a large galaxy: certainly not stars, not planets, nor restaurants. Over its decade-long campaign, LSST is projected to catalogue more than 17 billion stars and 20 billion galaxies, a composite digital universe stitched together from the photons of 3 million images snapped over the clear night skies of Chile. The LSST will not simply map what’s visible but illuminate the unknown. Beneath the sophisticated hardware and software lies a deeper purpose: to shine the light of curiosity on the 95% of the universe that remains in the shadows of time and space: dark matter and dark energy, the known unknown dynamic forces behind galactic formation and cosmic expansion. The LSST is more than a camera. It is a reckoning with the vast unknown, a testament to humanity’s refusal to let mystery remain unexplored and uncharted: to find God.

In 2013 Tyson was named chief scientist of the LSST, and he still actively contributes to the intellectual vision of the project while mentoring the next generation of cosmologists and engineers.

Graphic: LSST Camera Schematic and Trifid Nebula by SLAC-DOE-NSF.

Life, the Universe, and Everything: Speculative Musings on the Cutting Edge of Physics

The Higgs boson, theorized in the 1960s, is a massive quantum particle central to the Standard Model of particle physics. It arises from the Higgs field, an invisible sea permeating all of space, which gives fundamental particles, like electrons and quarks, their mass. Unlike electromagnetic fields, created by charges such as electrons and protons, the Higgs field exists everywhere, quietly shaping the universe. In 2012, CERN’s Large Hadron Collider detected the Higgs boson, confirming the field’s existence. While the boson is observable, the field remains invisible, known only by its effects on particle masses.

The Higgs field assigns mass, but gravity governs how that mass behaves across the vast scales of spacetime. Blending gravity with quantum mechanics, which includes the Higgs field, requires a yet-undiscovered theory of quantum gravity. If successful, quantum gravity might untangle physics-defying singularities, points of extreme density, into structured, comprehensible forms. Some theorize it could also reveal how early radiation morphed into matter, possibly influencing the formation and behavior of mysterious dark matter and its potential link to dark energy.

Before the Big Bang, some picture a singularity, a point of extreme density, though not necessarily infinite matter, where known physics and spacetime break down. Quantum gravity, however, hints this wasn’t truly infinite but a transition phase. From what? Perhaps a prior universe or a chaotic quantum state, science doesn’t yet know. This shift, possibly tied to the Higgs field, may have sparked quantum fluctuations, birthing radiation, matter, and the cosmic structure we see today.

What if the universe is cyclic, not a one-time burst? Instead of a singular Big Bang, some speculate a “bounce”, a transition where spacetime contracts, then expands again. Early on, energetic radiation like photons cooled and condensed into heavy fermions, particles a million times heftier than electrons. Some theorize these fermions underwent chiral symmetry breaking, like a spinning top wobbling one way instead of both, potentially forming cold dark matter, though evidence is sparse. This invisible web of dark matter stabilized galaxies, keeping them from spinning apart.

The Higgs field might have shaped dark matter by influencing the mass of early fermions, but this link is speculative, lacking direct proof. Dark matter, in turn, may be evolving. If it slowly decays or transitions into dark energy, as some hypothesize, it could drive the universe’s accelerating expansion. Ordinary matter, atoms, molecules, and radiation, also formed via the Higgs field, while energy, mostly electromagnetic radiation, fuels cosmic evolution. These pieces dance within a framework shaped by the Higgs, elusive quantum gravity, and the subtle interplay of dark matter and dark energy.

Could radiation, dark matter, and dark energy be different faces of a single, evolving force? With radiation transitioning to dark matter, and dark matter gradually shifting into dark energy, the universe might unravel, leaving isolated stars drifting in an endless void. Then, fluctuations in the Higgs field and quantum gravity could trigger contraction, setting the stage for another bounce. Rather than destruction, this might be a cosmic recycling, a continuous interplay of forces across time: Life, the Universe, and Everything.

Source: CDM Analogous to Superconductivity by Liang and Caldwell, May 2025, APS.org. Graphic: Cosmic Nebula by Margarita Balashova.

Web of Dark Shadows

Cold Dark Matter (CDM) comprises approximately 27% of the universe, yet its true nature remains unknown. Add that to the 68% of the universe made up of dark energy, an even greater mystery, and we arrive at an unsettling realization: 95% of the cosmos remains unexplained.

Socrates famously said, “The only thing I know is that I know nothing.” Over two millennia later, physicists might agree. But two researchers from Dartmouth propose a compelling possibility: perhaps early energetic radiation, such as photons, expanded and cooled into massive fermions, which later condensed into cold dark matter, the invisible force holding galaxies together. Over billions of years, this dark matter may be decomposing into dark energy, the force accelerating cosmic expansion.

Their theory centers on super-heavy fermions, particles a million times heavier than electrons, which behave in an unexpected way due to chiral symmetry breaking, in which mirror-image particles become unequally distributed, favoring one over the other. Rather than invoking exotic physics, their model works within the framework of the Standard Model but takes it in an unexpected direction.

In the early universe, these massive fermions behaved like radiation, freely moving through space. However, as the cosmos expanded and cooled, they reached a critical threshold, undergoing a phase transition, much like how matter shifts between liquid, solid, and gas.

During this transformation, fermion-antifermion pairs condensed, similar to how electrons form Cooper pairs in superconductors, creating a stable, cold substance with minimal pressure and heat. This condensate became diffuse dark matter, shaping galaxies through its gravitational influence, acting as an invisible web counteracting their rotation and ensuring they don’t fly apart.

However, dark matter may not be as stable as once thought. The researchers propose that this condensate is slowly decaying, faster than standard cosmological models predict. This gradual decomposition feeds a long-lived energy source, possibly contributing to dark energy, the force responsible for the universe’s accelerated expansion.

A more radical interpretation, mine, not the researchers’, suggests that dark matter is not merely decaying, but evolving into dark energy, just as energetic fermion radiation once transitioned into dark matter. If this is true, dark matter and dark energy may be two phases of the same cosmic entity rather than separate forces.

If these hypotheses hold, we should be able to detect, as the researchers suggest, traces of this dark matter-to-dark energy transformation in the cosmic microwave background (CMB). Variations in density fluctuations and large-scale structures might reveal whether dark matter has been steadily shifting into dark energy, linking two of cosmology’s biggest unknowns into a single process.

Over billions of years, as dark matter transitions into dark energy, galaxies may slowly lose their gravitational cage and begin drifting apart. With dark energy accelerating the expansion, the universe may eventually reach a state where galaxies unravel completely, leaving only isolated stars in an endless void.

If dark matter started as a fine cosmic web, stabilizing galaxies, then over time, it may fade away completely, leaving behind only the accelerating force of dark energy. Instead of opposing forces locked in conflict, what if radiation, dark matter, and dark energy were simply different expressions of the same evolving entity?

A tetrahedron could symbolize this transformation:

  • Radiation (Energetic Era) – The expansive force that shaped the early universe.
  • Dark Matter (Structural Phase) – The stabilizing gravitational web forming galaxies.
  • Dark Energy (Expansion Phase) – The force accelerating cosmic evolution.
  • Time (Governing Force) – The missing element driving transitions between states.

Rather than the universe being torn apart by clashing forces, it might be engaged in a single, continuous transformation, a cosmic dance shaping the future of space.

Source: CDM Analogous to Superconductivity by Liang and Caldwell, May 2025, APS.org. Graphic: Galaxy and Spiderweb by Copilot.

Water Everywhere

Two recent Earth science studies by Barrett et al. and Bermingham et al. explore the origins of Earth’s water and, indirectly, its organic matter, key prerequisites for the development of intelligent life. Their findings support the early delivery of the chemicals needed to form water and carbon molecules by inner and outer solar system planetesimals such as asteroids and comets.

Barrett et al. show that an enstatite chondrite (EC) meteorite, sourced from the inner solar system and found in Antarctica, is isotopically similar to Earth material (not surprisingly, this supports the 270-year-old Nebular Hypothesis) and capable of delivering substantial hydrogen during Earth’s accretionary phase (~4.56–4.5 billion years ago). The ECs contain hydrogen as H2S in silicate glass, linked to pyrrhotite, sufficient to account for up to 14 times Earth’s ocean mass. This hydrogen was systematically incorporated in the hot inner solar system via nebular processes, suggesting water was an inherent outcome of Earth’s formation, not a later addition. ECs also contain trace organic matter, contributing modestly to Earth’s carbon inventory. Despite the chaotic “billiard table” trajectories of early solar system collisions, the stability of H2S in glass ensured survival during violent accretion. This early delivery of water and organics established a foundational habitable environment, priming the Earth’s prebiotic chemistry for the creation and evolution of intelligent life.
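
For scale, a rough sketch of the hydrogen bookkeeping, taking the commonly quoted ~1.4 × 10²¹ kg for Earth’s ocean mass as an assumption:

```python
OCEAN_MASS_KG = 1.4e21                       # approximate mass of Earth's oceans
H_FRACTION_OF_WATER = 2 * 1.008 / 18.015     # hydrogen's mass share of H2O, ~11%

h_in_oceans = OCEAN_MASS_KG * H_FRACTION_OF_WATER
print(f"hydrogen locked in the oceans: {h_in_oceans:.1e} kg")   # ~1.6e20 kg
print(f"14 ocean masses' worth:        {14 * h_in_oceans:.1e} kg")
```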

Bermingham et al., taking a different investigative track, analyze molybdenum isotopes in meteorites and Earth’s crust, concluding that water was delivered during the Late Heavy Bombardment (LHB: 4.1–3.8 billion years ago) by planetesimals, including inner solar system asteroids and outer solar system comets, as hydrous minerals or brine. This late accretion, post-dating the Moon-forming impact (4.5 billion years ago), suggests a stochastic bombardment enriched Earth’s surface volatiles. Comets and carbonaceous chondrites, rich in organic matter, likely delivered significant carbon compounds, enhancing the prebiotic chemical environment. The chaotic early solar system facilitated this influx of outer solar system organics, complementing earlier inputs.

Both studies align with life’s prerequisites by ensuring water and organic delivery to the planet. Barrett et al. provide the bulk water budget and trace organics via ECs, creating an early aqueous environment, while Bermingham et al.’s LHB bombardment added more water and substantial organics, boosting conditions for life’s emergence. They agree on asteroids’ role, possibly including ECs, but differ in timing (early accretion vs. LHB) and outer solar system delivery contributions (minor in Barrett, significant via comets in Bermingham). Barrett et al.’s early delivery of water and organics can be viewed as foundational and Bermingham et al.’s LHB as a surface-enriching supplement, together enabling the chemical and evolutionary path to intelligent life.

Source: Barrett et al., 2025, Icarus. Bermingham et al., 2025, Rutgers. Graphic: Comet Cometh, Grok3.