Cosmos of the Lonely

The universe keeps expanding. When researchers analyze data from the Hubble and James Webb space telescopes, alongside a suite of other astronomical tools, they find that the recessional velocity of galaxies, the speed at which they appear to move away from Earth, varies depending on how they measure it.

If they calibrate distances deep into the cosmos using Cepheid variable stars, the expansion rate appears faster than when they use red giant stars or the Cosmic Microwave Background (CMB). This discrepancy, known as the Hubble tension, reveals a deeper mystery: different cosmic yardsticks yield different rates of expansion.

Yet despite the disagreement in values, all methods affirm the same truth: space is stretching…a lot…like a sheet pulled taut between Atlas’s burden and Hermes’s flight: a cosmos caught between gravitational pull and a mysterious push: Pushmi-Pullyu on a cosmic scale.

To understand why the cosmos resembles a sheet of rubber, we need to travel back about 110 years and peer into the minds of those who first saw increasing separation as a universal law. These new architects of reality, Einstein, Friedmann, and Lemaître, replaced Newton’s static, planetary models of the cosmos with a dynamic spacetime of bends, ripples, and persistent expansion.

After Einstein published his General Theory of Relativity in 1915, Russian physicist Alexander Friedmann showed that Einstein’s equations allowed for an expanding universe and could be used to calculate the rate. In 1927 Belgian priest and physicist Georges Lemaître proposed that a galaxy’s recessional velocity might be proportional to its distance from Earth. By 1929, American astronomer Edwin Hubble had expanded on Lemaître’s work and published what became known as the Hubble-Lemaître law: galaxies are moving away from us at speeds proportional to their distance. The greater the distance, the faster the speed.

A key feature of this law is the Hubble constant, the proportionality that links velocity and distance. Hubble’s initial estimate for this constant was a whopping, and egregiously off, 500 kilometers per second per megaparsec (km/s/Mpc), but as measurements improved, it coalesced around a range between 67 and 73, with the most recent value of 70.4 km/s/Mpc published by Freedman et al. in May 2025.

The Hubble constant is expressed in kilometers per second per megaparsec. The scale of these units is beyond human comprehension, but let’s ground it in something manageable. A megaparsec is about 3.26 million light-years, and the observable universe, though only 13.8 billion years old, has stretched to 46 billion light-years in radius, or 93 billion light-years in diameter, due to the expansion of space (see mind-warping explanation below).

To calculate the recessional velocity across this vast distance, we first convert 46 billion light-years into megaparsecs, which equates to about 14,110 megaparsecs. Applying Hubble’s Law: 70 km/s/Mpc times 14,110 Mpc equals roughly 987,700 km/s. This is the rate at which a galaxy 46 billion light-years away would be receding from Earth.
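The arithmetic above can be checked in a few lines of Python, a minimal sketch using the same round numbers as the text: 3.26 million light-years per megaparsec and a Hubble constant of 70 km/s/Mpc:

```python
# Recessional velocity at the edge of the observable universe via Hubble's Law.
LY_PER_MPC = 3.26e6   # light-years per megaparsec (rounded, as in the text)
H0 = 70.0             # Hubble constant in km/s/Mpc (rounded, as in the text)

distance_ly = 46e9                         # radius of observable universe, ly
distance_mpc = distance_ly / LY_PER_MPC    # ~14,110 Mpc
velocity_km_s = H0 * distance_mpc          # v = H0 * d, ~987,700 km/s

print(f"{distance_mpc:,.0f} Mpc -> {velocity_km_s:,.0f} km/s")
```

That recession rate comes out more than three times the 299,792 km/s speed of light, which is where the nuance below comes in.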

That’s more than three times the speed of light (299,792 km/s), or Warp 3-plus in Star Trek parlance. Einstein said this was impossible, but fortunately there is some nuance that keeps us in compliance with Special Relativity (or else the fines would be astronomical). This isn’t the speed of a galaxy moving through space, but the speed at which the space between galaxies is expanding. Which, admittedly, is terribly confusing.

The speed of a galaxy, composed of matter, energy, and dark matter, must obey Einstein’s rules: gravity and Special Relativity. And one of the rules is that the speed of light is the cosmic speed limit, no one shall pass beyond this.

But the space between the galaxies reads the rules in a different order. The expansion of space is still governed by Einstein’s equations, just interpreted through the lens of spacetime geometry rather than the motion of objects. This geometry is shaped by, yet not reducible to, matter, energy, and dark matter.

Expansion is a feature of spacetime’s structure, not velocity in the usual sense, and thus isn’t bound by the speed of light. If space wants to expand, stretch, faster than a photon can travel, well so be it.

The space between galaxies is governed by dark energy and its enigmatic rules of geometry. Within galaxies, the rules are set by dark matter and, to a lesser extent, by matter and energy; dark energy is likely present, but its influence at galactic scales is minimal.

Note the use of the word scale here. Galaxies are gigantic; the Milky Way is 100,000-120,000 light-years in diameter. But compared to the universe at 93,000,000,000 light-years across, they’re puny. You would need roughly 845,000 Milky Ways lined up edge-to-edge to span the known universe.
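A one-line sanity check of that edge-to-edge count, taking the Milky Way at the midpoint of the quoted 100,000-120,000 light-year range:

```python
universe_diameter_ly = 93e9   # observable universe diameter, light-years
milky_way_ly = 110_000        # Milky Way diameter, midpoint of quoted range

count = universe_diameter_ly / milky_way_ly
print(f"{count:,.0f} Milky Ways edge-to-edge")   # roughly 845,000
```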

Estimates of the number of galaxies in the universe range from 100 billion to 2 trillion. So, at the scale of the universe, galaxies are mere pinpoints of light; blips of energy scattered across the ever-expanding heavens.

This brings us to dark energy, the mysterious force driving cosmic expansion. No one knows what it is, but perhaps empty space and dark energy are the same. There’s even some speculation, mostly mine, that dark energy is a phase shift of dark matter. A shift in state. A triptych move from Newtonian physics to Quantum Mechanics to…Space Truckin’.

In the first moments after the Big Bang, the universe was dominated by radiation composed of high-energy particles and photons. As the universe cooled, the radiation gave way to matter and dark matter. As time allowed gravity to create structures, black holes emerged and a new force began to dominate: dark energy. But where did the dark energy come from? Was it always part of the universe, or did it evolve from other building blocks? Below are a few speculative ideas floating around the cosmic playroom.

J.S. Farnes proposed a unifying theory where dark matter and dark energy are aspects of a single negative mass fluid. This fluid could flatten galaxy rotation curves and drive cosmic expansion, mimicking both phenomena simultaneously.

Mathematicians Tian Ma and Shouhong Wang developed a unified theory that alters Einstein’s field equations to account for a new scalar potential field. Their model suggests that energy and momentum conservation only holds when normal matter, dark matter, and dark energy are considered together.

Ding-Yu Chung proposed a model where dark energy, dark matter, and baryonic matter emerge from a dual universe structure involving positive and negative mass domains. These domains oscillate and transmute across dimensions.

These ideas all revolve around a common theme: everything evolves, and matter and energy of all forms flicker in and out of existence depending on the dimensional scaffolding of space and the strength of gravity and radiation fields. Rather than treating radiation, energy, matter, dark matter, and dark energy as separate entities, these models suggest they may be expressions of a single evolving field, shaped by phase transitions, scalar dynamics, or symmetry breaking.

Now back to my regularly scheduled program. In August 2024, Quanta Magazine reported on a study led by Nobel laureate Adam Riess using the James Webb Space Telescope (JWST) to measure over 1,000 Cepheid variable stars with unprecedented precision. Cepheid stars pulsate in brightness with a highly predictable rhythm, making them ideal cosmic yardsticks. Riess’s team found a Hubble constant of ~73.4 km/s/Mpc, consistent with previous Hubble Space Telescope measurements of Cepheids but still significantly higher than what theory predicts.

That theory comes from the standard model of cosmology: Lambda Cold Dark Matter. According to this framework, photons decoupled from the hot, opaque electron-proton soup about 380,000 years after the Big Bang went boom, allowing light to travel freely for the first time and making space somewhat transparent and visible. This event produced the Cosmic Microwave Background (CMB).

This CMB permeates the universe to this day. It was discovered in 1964 by Bell Labs physicists Arno Penzias and Robert Wilson, who were trying to eliminate background noise from their radio antenna. The noise turned out to be the faint afterglow of the Big Bang, cooled from its original 3000 Kelvin to a frosty 2.7 Kelvin. They received the 1978 Nobel Prize in Physics for this discovery.

Light from the CMB, as measured by the European Space Agency’s Planck satellite, has a redshift of approximately 1100, meaning the universe has expanded by a factor of about 1100 since that light was emitted, roughly 13.8 billion years ago. By analyzing the minute temperature fluctuations in the CMB, Planck can infer the density of matter, dark energy, and the curvature of the universe. Inserting these parameters into the Lambda Cold Dark Matter model yields a Hubble constant of 67.4 ± 1.71 km/s/Mpc (65.69-69.11). This value is considered the gold standard. Values beyond the Planck measurement are not necessarily wrong, just not understood.
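The redshift of 1100 ties neatly to the temperatures mentioned earlier: wavelengths, and hence the CMB’s temperature, scale with the universe’s expansion factor (1 + z). A quick Python check:

```python
z_cmb = 1100         # CMB redshift measured by Planck
T_today_K = 2.725    # CMB temperature today, in kelvin

expansion_factor = 1 + z_cmb          # how much the universe has stretched
T_emission_K = T_today_K * expansion_factor

print(f"Stretched ~{expansion_factor}x; emitted at ~{T_emission_K:,.0f} K")
```

The result lands on the ~3000 Kelvin decoupling temperature quoted above, cooled to today’s 2.7 Kelvin by that same factor of ~1100.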

At first glance, the difference between Planck’s 67.4 and Riess’s 73.4 may seem small. But it is cosmically significant. Two galaxies 3.26 billion light-years (1000 Mpc) apart would recede from each other 6000 km/s faster under Riess’s value than under Planck’s, about 189 billion kilometers of additional separation per year. That is the scale of what small differences in the value add up to, and it is referred to as the Hubble tension.
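The tension arithmetic, sketched in Python with the two published values:

```python
# How much the Planck and Riess values disagree over a 1000 Mpc separation.
H_planck = 67.4    # km/s/Mpc, inferred from the CMB
H_cepheid = 73.4   # km/s/Mpc, from JWST Cepheid measurements

distance_mpc = 1000                               # 3.26 billion light-years
dv_km_s = (H_cepheid - H_planck) * distance_mpc   # ~6000 km/s

SECONDS_PER_YEAR = 3.156e7                        # approximate seconds per year
extra_km_per_year = dv_km_s * SECONDS_PER_YEAR    # ~1.9e11 km

print(f"{dv_km_s:,.0f} km/s -> {extra_km_per_year:.3g} km extra separation/year")
```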

Meanwhile, a competing team of researchers studying red giant branch stars consistently pegged the Hubble constant closer to the theoretical prediction of 67.4. This team, led by Wendy Freedman, believes that the Hubble tension, the inability of various methods of measuring the Hubble constant to converge on a single value, is a result of measurement errors.

While some researchers, Wendy Freedman among them, suggest lingering systematic errors may still be at play, the persistence of this discrepancy across instruments, methods, and teams has led others to speculate about new physics. Among the most provocative ideas is the possibility that the universe’s expansion rate varies depending on direction, hinting at anisotropic expansion and challenging the long-held assumption of cosmic isotropy. This seems far-fetched, and if true it would likely break the Lambda Cold Dark Matter model into pieces.

And so, the cosmos grows lonelier. Not because the galaxies are fleeing, but because space itself is stretching, a wedge governed by the geometry of expansion. The further they drift apart, the less they interact, a divorce from neglect rather than malice. In time, entire galaxies will slip beyond our cosmic horizon, receding faster than light, unreachable even in principle. A cosmos of the lonely.

Sources: The Webb Telescope Further Deepens the Biggest Controversy in Cosmology by Liz Kruesi, Quanta Magazine, 13 August 2024; JWST Observations Reject Unrecognized Crowding of Cepheid Photometry as an Explanation for the Hubble Tension at 8σ Confidence by Riess et al., The Astrophysical Journal Letters, 6 February 2024. Graphic: Cosmic Nebula by Margarita Balashova.

To Boldly Go

On 23 June 2025, after more than three decades of evolution, from a gleam of an idea to detailed planning, exacting execution, and the physical realization of the world’s largest astronomical camera, the Vera C. Rubin Observatory’s Legacy Survey of Space and Time (LSST) in Chile unveiled to the public its first breathtaking images. Among them: razor-sharp mosaics of the Trifid and Lagoon Nebulae, and the sprawling Virgo Cluster, framed by millions of more distant galaxies. Captured with world-class light-collecting mirrors, these images marked the beginning of a spectacular ten-year quest to map the known universe and illuminate the 95% we still don’t understand: dark matter and dark energy. An exciting, albeit Herculean, future awaits, built on an equally stunning past where dreams and science converged into one of the most staggering feats of technological achievement in modern astronomy.

Let the future map of the universe tell its own story in due time. The path to the map deserves a chapter all its own.

In 1969 Willard Boyle and George Smith of Bell Labs invented a device capable of detecting and measuring the intensity of light, which they named the Charge-Coupled Device, or CCD: a breakthrough that earned them the 2009 Nobel Prize in Physics. A CCD converts incoming photons into electrical signals, creating a voltage map of light intensity, a digital proxy for the number of photons striking its surface. Initially constructed as a semiconductor chip, it quickly evolved into a pixelated imaging sensor. These sensors became the gold standard for consumer and scientific digital imaging, though consumer applications such as phone cameras later switched to cheaper CMOS sensors. Scientific and surveillance systems, such as the Hubble Telescope, SOAR, and SNAP, still employ CCDs because of their superior image fidelity.

In the late 1980s J. Anthony ‘Tony’ Tyson, an experimental physicist at Bell Labs, focused on developing instrumentation to detect faint optical signals using CCDs. His inspired contribution was to recognize the CCD’s potential for imaging the heavens, laying the groundwork for digital deep-sky surveys. He quickly discovered faint blue galaxies and gravitational lensing using modified CCDs that he helped develop. He also helped build the Big Throughput Camera, which was instrumental in the 1998 discovery of dark energy.

Tyson never thought small. His CCDs were instruments of the infinitesimal, but his dreams were as gargantuan as the universe itself. In fact, his dream was the universe. In 1994 he proposed his “Deep Wide Fast” telescope, a scale-up of his Big Throughput Camera and the forerunner of the LSST. Deep Wide Fast would combine deep imaging, rapid cadence, and broad coverage simultaneously: in other words, a synoptic view of the universe in near real time.

Throughout the 1990s, Tyson rallied minds and resources to shape his cosmic vision. John Schaefer of the Research Corporation helped secure early funding. Roger Angel proposed the innovative Paul-Baker three-mirror telescope design. Institutions like the Universities of Arizona and Washington, along with the National Optical Astronomy Observatory, hitched their wagons to Tyson’s star-filled dream of mapping the universe.

In 1998 Tyson presented designs for a Dark Matter Telescope, and in 1999 the science case was submitted to the Astronomy and Astrophysics Decadal Survey. In 2003 the first formal proposal was sent to the Experimental Program Advisory Committee at SLAC (Stanford Linear Accelerator Center). It described an 8.4-meter mirror with a 2.3-billion-pixel camera capable of surveying the entire visible sky every few nights. The proposal also laid out the NSF–DOE partnership, with SLAC leading camera development and other institutions handling optics, data systems, and site operations.

In 2004 Tyson left Bell Labs for the University of California, Davis, as a cosmologist and continued to shepherd the LSST project from there.

In 2007 the project received $30 million in private funding from Charles Simonyi, Bill Gates, and others; in recognition, the telescope is named the Simonyi Survey Telescope. In 2010 the U.S. National Science Foundation (NSF) and Department of Energy (DOE) joined the quest to view the universe through the sharp eyes of the LSST.

The telescope’s 8.4-meter primary and 5.0-meter tertiary mirrors were built at the University of Arizona, beginning in 2008 and completed in 2015, and have been stored on-site in Chile since 2019. Fabricated in the U.S., the 3.4-meter secondary was later coated in Germany with nickel-chromium, silver, and silicon nitride, materials chosen to enhance reflectivity, durability, and long-term performance.

In 2015 SLAC, which oversaw the design, fabrication, and integration of the camera, began building the components with assistance from Brookhaven National Laboratory, Lawrence Livermore National Laboratory, and IN2P3/CNRS in France. By 2024 the camera was finished and shipped to Chile. In 2025 it was installed and integrated with the telescope, and in June of that year the first-light images were released to the public.

The camera measures roughly 3 meters in length, 1.65 meters in diameter, and weighs 3 metric tons, an imposing instrument, rivaling the bulk of a small car. Its imaging surface, a 64-centimeter focal plane, contains 3.2 billion pixels, each a 10-micron square, roughly one-tenth the width of a human hair. These pixels, etched across 189 custom CCD sensors arranged into 21 modular “rafts,” are laid flat to within 10 microns, ensuring near-perfect focus. The entire array is chilled to –100°C to suppress electronic and thermal noise, enhancing signal fidelity.
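The 3.2-billion-pixel figure follows from the sensor count. The per-sensor geometry below, a 4096 x 4096 pixel array per CCD, is an assumption consistent with the numbers in the text, not a spec quoted from it:

```python
sensors = 189                       # CCDs in the focal plane
pixels_per_sensor = 4096 * 4096     # assumed array size, ~16.8 megapixels

total_pixels = sensors * pixels_per_sensor
print(f"{total_pixels / 1e9:.1f} gigapixels")   # ~3.2
```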

Before photons reach the sensor, they pass through three precision-crafted corrective lenses, including the largest ever installed in an astronomical camera, and up to six interchangeable filters spanning ultraviolet to near-infrared. The filter exchange system enables the observatory to target specific wavelength bands, tailored to sky conditions and science goals.

The integrated LSST system is engineered to capture a 15-second exposure every 20 seconds, producing thousands of images per night and tallying approximately 15 terabytes of new data. Each image covers 9.6 square degrees of sky, an area equivalent to about 45 full moons, allowing the system to survey the entire visible southern sky every 3-4 nights. Imaging a single field across all six filters can take up to 5-6 minutes, though filters are selected dynamically based on science goals and atmospheric conditions.
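Those cadence numbers can be turned into a rough nightly tally. The 10-hour observing night and the 2-bytes-per-pixel raw image size below are illustrative assumptions, not figures from the text:

```python
night_hours = 10                      # assumed usable observing night
cadence_s = 20                        # one exposure every 20 seconds
images_per_night = night_hours * 3600 // cadence_s   # 1,800 images

bytes_per_image = 3.2e9 * 2           # 3.2 Gpix at 2 bytes/pixel, uncompressed
tb_per_night = images_per_night * bytes_per_image / 1e12

print(f"~{images_per_night} images, ~{tb_per_night:.0f} TB per night")
```

That lands in the low double digits of terabytes per night, the same order as the roughly 15 terabytes quoted above.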

The system’s angular resolution is sharp enough to resolve a golf ball from 15 miles away. At the edge of the observable universe, this scales to structures no smaller than a large galaxy; certainly not stars, not planets, nor restaurants. Over its decade-long campaign, LSST is projected to catalogue more than 17 billion stars and 20 billion galaxies, a composite digital universe stitched together from photons captured in some 3 million images snapped over the clear night skies of Chile. The LSST will not simply map what’s visible but illuminate the unknown. Beneath the sophisticated hardware and software lies a deeper purpose: to shine the light of curiosity on the 95% of the universe that remains in the shadows of time and space, dark matter and dark energy, the known unknowns behind galactic formation and cosmic expansion. The LSST is more than a camera. It is a reckoning with the vast unknown, a testament to humanity’s refusal to let mystery remain unexplored and uncharted: to find God.

In 2013 Tyson was named chief scientist of the LSST; he still actively contributes to the intellectual vision of the project and mentors the next generation of cosmologists and engineers.

Graphic: LSST Camera Schematic and Trifid Nebula by SLAC-DOE-NSF.