The Jellyfish of Mind and Being

This essay began as a passing thought about jellyfish, those umbrellas of the sea drifting in blooms, fluthers, smacks, and swarms. They have no brain, no central command, only a diffuse matrix of neurons spread across their bodies. Yet they pulse, sting, drift, eat, and spawn, all without any trace of self-awareness.

This decentralized nerve net exposes the brittleness of Descartes’ dictum, cogito ergo sum: “I think, therefore I am.” Descartes, as did Socrates before him, equated thinking with consciousness.

For Socrates, thinking was the essence of the soul, inseparable from awareness and virtue. For Descartes, thinking was the proof of existence: the cogito. For philosophers today, consciousness reaches beyond thought, defined by the raw fact of experience; the sheer presence of what is.

Philosophers and neuroscientists now separate thinking (reasoning, problem-solving, language, though language is at minimum a bridge from brain to mind) from consciousness (the subjective “what it’s like” experience). Yet separating the two only deepens the fog, the mystery of being. A newborn may have consciousness without thought. A computer may “think” without consciousness. A jellyfish reacts but does not reflect; its life is sensation without self-awareness.

Consciousness is more than biology or electronics, a core of being rising above life, thought, and reaction. Living is not the same as consciousness. Living is metabolism, reaction, survival. Consciousness is the something extra, the lagniappe, the “what it’s like” to be. A dog feels pain without philosophizing. A newborn hungers without reflection. A jellyfish recoils from harm, detects light, adapts its behavior. Is that sentient? Perhaps. But self-aware thought? Almost certainly not.

The spectrum of awareness occupies a wide corridor of argument and reality. On one end, the jellyfish: life without thought, existence without awareness. On the other, humans: tangled in language, reflection, and self-modeling cognition. Between them lies the mystery. Anesthesia, coma, or dreamless sleep show that thought can vanish while consciousness flickers on, or vice versa. The two are not bound in necessity; reality shows they can drift apart.

Neuroscience maps the machinery, hippocampus for memory, thalamus for awareness, but cannot settle the duality. Neurons may spark and signals flow, yet consciousness remains more than electrical activity. It is not reducible to living. It is not guaranteed by thought. It is the specter of being that transcends living biology.

The jellyfish reminds us that being does not require thinking. Humans remind us that thinking does not explain consciousness. Between them, philosophy persists, not by closure, but by continuing to ask.

Perhaps the jellyfish is not a primitive creature but a reflecting pool of possibilities: showing us that being does not require thinking, and that consciousness may be more elemental than the cogito admits. The question is not whether we think, but whether we experience. And experience, unlike thought, resists definition even as it defines who we are.

In the end, Scarecrow, like the jellyfish, had no brain but was deemed the wisest man in Oz.

Graphic: A Pacific sea nettle (Chrysaora fuscescens) at the Monterey Bay Aquarium in California, USA. 2005. Public Domain.

Galactic Emptiness

I like the quiet.

From the dark, an enigmatic mass of rock and gas streaks inward. Discovered by the ATLAS telescope in Chile on 1 July 2025, it moves at 58 km/s (~130,000 mi/hr), a billion-year exile from some forgotten, possibly exploded star, catalogued as 3I/ATLAS. The press immediately fact-checks, then shrieks alien mothership. Harvard’s Avi Loeb suggests it could be artificial, citing its size, its speed, its “non-gravitational acceleration,” and a “leading glow” ahead of the nucleus. Social media lights up with mothership memes, AI-generated images, and recycled ‘Oumuamua panic.

Remaining skeptical but trying to retain objectivity, I ask: is it anything other than a traveler of ice and dust obeying celestial mechanics? And it is very difficult to come up with any answer other than no.

Spectra from NASA’s flagship infrared observatory, the James Webb Space Telescope (JWST), show amorphous water ice sublimating 10,000 km from the nucleus. The Hubble telescope resolves a 13,000-km coma, later stretching to 18,000 km, rich in radiation-forged organics, tholins, and fine dust.

The “leading glow” is sunlight scattering off ice grains ejected forward by outgassing. The “non-gravitational acceleration” is gas jets, not engines. Loeb swings and misses again: ‘Oumuamua in 2017, IM1 in 2014, now this. Three strikes. The boy who cried alien is beginning to resemble the lead character in an Aesop fable.

Not that I’m keeping score…well I am…sort of. Since Area 51 seeped into public lore, alien conspiracies have multiplied beyond count, but I still haven’t shaken E.T.’s or Stitch’s hand. No green neighbors have moved next door, no embarrassing probes, just the Milky Way in all its immense, ancient glory remaining quiet. A 13.6-billion-year-old galaxy 100,000 light-years across, 100–400 billion stars, most of them likely hosting planets, and us, alone on a blue dot warmed by a middle-aged G2V star, 4.6 billion years old, quietly fusing hydrogen in the Orion Spur, between the galaxy’s Sagittarius and Perseus spiral arms.

No one knocking. But still, I like the quiet.

An immense galaxy of staggering possibilities, where the mind fails to comprehend the vastness of space and physics provides few answers. The Drake Equation, a probabilistic seven-term formula used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way, yields an answer of less than one (0.04, to be exact), which is less than the current empirical answer of 1: us, on the blue dot.

For the show-me crowd, here’s the Drake Equation, N = R* × f_p × n_e × f_l × f_i × f_c × L. Inserting a 2025 consensus for the parameters: Two stars born each year. Nearly all with planets. One in five with Earth‑like worlds. One in ten with life. One in a hundred with intelligence. One in ten with radio. A thousand years of signal. And the product is: less than one.
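For anyone who wants to check the arithmetic, here is a minimal sketch in Python. The parameter values are simply the illustrative figures quoted above, not settled measurements; estimates for most of these terms vary by orders of magnitude.

```python
# Drake Equation: N = R* * f_p * n_e * f_l * f_i * f_c * L
# Parameter values are the illustrative figures from the essay, not settled science.
R_star = 2.0     # stars formed per year in the Milky Way
f_p    = 1.0     # fraction of stars with planets ("nearly all")
n_e    = 0.2     # Earth-like worlds per planetary system ("one in five")
f_l    = 0.1     # fraction of those that develop life ("one in ten")
f_i    = 0.01    # fraction of those that develop intelligence ("one in a hundred")
f_c    = 0.1     # fraction that broadcast detectable signals ("one in ten")
L      = 1000.0  # years a civilization remains detectable

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"Estimated communicative civilizations in the galaxy: N = {N:.2f}")  # 0.04
```

Stretch L from a thousand years to 2.5 million and N climbs to 100; the equation mostly measures our ignorance of its own terms.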

For the true optimist, let’s bump N up to 100. Not really a loud party, but enough noise that someone should have called the police by now.

No sirens. I like the quiet.

But now add von Neumann self-replicating probes traveling at relativistic speeds: one advanced civilization could explore the galaxy in 240 ship-years (5,400 Earth years). A civilization lasting 1 million years could do this 3000 times over. Yet we see zero Dyson swarms, zero waste heat, zero signals. Conclusion: either N = 0, or every civilization dies before it advances to the point of being seen by others. That leaves us with a galaxy in a permanent civilizational nursery state, or existing civilizations that all died off before we had the ability to look for them, or we are alone and always have been.

Maybe then, but not now. Or here but sleeping in the nursery. I like the quiet.

But then I remember Isaac Asimov’s seven‑novel Foundation saga. The Galactic Empire crumbles. Hari Seldon’s psychohistory predicts collapse and rebirth. The Second Foundation manipulates from the shadows. Gaia emerges as a planet‑wide mind. Robots reveal they kept it going: Daneel Olivaw, 20,000 years old, guiding humanity. And the final page (Foundation and Earth, 1986) exposes the beginning: Everything traces back to Earth. A radioactive cradle that forced primates to evolve repair genes, curiosity, and restlessness. We are radiation’s children. We didn’t find aliens. We are the aliens.

We are the cradle. We are the travelers. I still like the quiet.

The Long Way

By 1881, literature was shifting, Realism’s clarity giving way to Modernism’s psychological fog. Henry James pioneered the transformation, publishing what many hailed as his masterpiece and others found nearly unreadable. He moved from the crisp windows of Daisy Miller and Washington Square, where social dilemmas are transparent, into the labyrinth of The Portrait of a Lady, a slow, meandering narrative that tested patience to the point of exasperation. James stretched his scenes into long psychological dramas, shadowed by melancholy, lingering on minutiae rather than decisive events. To admirers, this was a profound exploration of consciousness; to detractors, a soporific feast of abstraction.

Where James’s Portrait is a punishing fugue of memory and angst, a darkness at the edge of noon, Proust’s Swann’s Way (1913) offers a sensual slow dance of lush detail, playful childhood games, and adult desire. In Combray, the family had two ways to take their walks: the short way and the long way. The short way was familiar, contained, offering scenery but little transformation. The long way was expansive, expressive, full of detours and revelations. In Swann in Love, the same pattern unfolds: the first half is Swann’s descent into desire, the short way of immediacy; the second half is his struggle to free himself, the long way of disillusionment and reflection. For Proust, the long way is where life’s lessons are held. Meaning is not found in shortcuts but in detours, delays, and the endurance of memory. The long way is the design of his art: winding detours that illuminate the search for lost time.

Wilde enters here as counterpoint. Where Proust lingers in digressive glow, Wilde sharpens language into bite. His wit distills the same metaphysical concerns: beauty, desire, memory, decay, into crystalline aphorisms. Wilde’s sentences are daggers wrapped in velvet, each polished to a point. If Proust is the cathedral of memory, Wilde is the mirror that cuts as it reflects. The Picture of Dorian Gray dramatizes the peril of desire and the corruption of beauty; themes Proust refracts through memory and longing. But Wilde compresses the ineffable into epigram: glow against bite, long way against short.

Cinema, now, becomes the continuance of these styles. Wilde’s paradox and Proust’s memory echo in films as diverse as Spectre (2015), No Time to Die (2021), and Gosford Park (2001). In Spectre, Madeleine Swann, a psychologist whose very name invokes Madeleine tea cakes and Swann’s Way, probes Bond’s past like Proust probing consciousness, turning trauma into narrative. In No Time to Die, desire and mortality entwine, echoing Proust’s meditation that “life has taken us round it, led us beyond it.” And in Gosford Park, Sir William McCordle brushing crumbs from a breast, Swann brushing flowers from a bosom, gestures lifted from Proust’s sensual triggers, collapse time into desire, while Altman’s upstairs-downstairs satire mirrors Wilde’s social wit. These films remind us that both the glow and the bite, the long way and the short, remain inexhaustible. The short as overture, the long as movement. One as a flash of life, the other as the light of experience.

James stretches narrative into labyrinthine difficulty. Proust redeems patience with memory’s illumination. Wilde polishes language into paradoxical brilliance. Chaplin, in Modern Times (1936), adds another metaphor: the gears of industry grinding human life into repetition. Yet even here, the Tramp and the Gamin walk off together, the long way, not the shortcut; suggesting resilience and hope. Between them, Modernism oscillates: fog and clarity, glow and bite, labyrinth and mirror, machine and memory. Meaning is elusive but never absent. It waits in the folds of memory, in the flash of wit, in the shadows of desire, in the detours of the long way, ready to be revealed.

Through memory’s fragments, along the winding road of joy and grace, we taste again the sweetness of love, the timelessness of innocence, and life’s inexhaustible richness.

Graphic: Marcel Proust, Hulton Archive/Getty Images.

Michel de Montaigne Bergerac 2019

Bordeaux Red Blends from Southwest France

Merlot 60%, Cabernet Franc 20%, Cabernet Sauvignon 20%

Purchase Price $16.99

Wine Enthusiast 90, Wilfred Wong 90, ElsBob 90

ABV 14%

A clear ruby-to-purple color. A medium- to full-bodied wine with aromas of red and black fruits and spice. On the palate, plums and cherries predominate, with oak derivatives. The tannins are meaty and balanced with crisp acidity. A beautiful finish that will complement most beef dishes.

An excellent fine wine at a very attractive price. Current prices range from $13.50-18.00.

Trivia: Michel de Montaigne was likely the most influential philosopher of the 16th-century French Renaissance. A dyed-in-the-wool skeptic, a cantankerous crank whose motto Que sais-je? (“What do I know?”) enshrined his worldview, much like Socrates, who also claimed to know nothing. Montaigne questioned everything and taught that doubt was the only path to wisdom.

But he carried it too far: intellectually thin and logically obtuse. He believed that customs and morals were cultural artifacts, lacking any universal tether. Truth, for Montaigne, was a matter of perspective; malleable, contingent, shaped by accepted practice. One man’s cannibal was another man’s epicurean.

To anchor this relativism, he wrote: “We are, I know not how, double in ourselves, so that what we believe we disbelieve, and cannot rid ourselves of what we condemn.” A long-winded version of c’est la vie (“that’s life”), or more precisely, à chacun son goût (“to each his own”).

Experience was his shrine, but it lacked a foundation. No base of knowledge to anchor belief. A man easily swayed by his own prejudices, lacking a black-and-white moral code.

His philosophy of go-along-to-get-along, born of tolerance and introspection, risked becoming a prescription for annihilation, not of others, but of moral clarity and oneself. A path to accepting everything and believing nothing. A philosophy polished so smooth it reflects everything and reveals nothing.

Color in the Eye of the Beholder

Ansel Adams (1902-1984), photographer of the majestic, was exceptionally elusive when it came to why he preferred black-and-white photographs over color, offering only a few comments on his medium of choice. He believed that black-and-white photography was a “departure from reality,” which is true on many levels, but the same is true of most artistic efforts and products. He also held the elementary belief that “one sees differently with color photography than black-and-white.” Some have even suggested that Adams said, “…when you photograph them in black and white, you photograph their souls,” but this seems apocryphal since most of his oeuvre was landscape photography.

Adams’s black-and-white photography framed the grandeur of the mountainous West in stark, unembellished terms. Yet without color, a coolness loiters, untouched by human sentiment or warmth. As an unabashed environmentalist, maybe that was his point, the majesty of the outdoors was diminished by human presence. In black-and-white, the wilderness remained unsullied and alone.

But to Claude Monet (1840-1926), founding French Impressionist, color and light were everything. Color defined his paintings. “Color is my day-long obsession, (my) joy…,” he confessed. Color was also a constant burden that he carried with him throughout the day and into the night, lamenting, “Colors pursue me like a constant worry. They even worry me in my sleep.” He lived his aphorism: “Paint what you really see, not what you think you ought to see…but the object enveloped in sunlight and atmosphere, with the blue dome of Heaven reflected in the shadows.” His reality was light and color with a human warming touch.

Adams’s and Monet’s genius was partially contained in their ability to use light to capture the essence of the landscape, but Monet brought the soul along in living color. Monet’s creed: “I want the unobtainable. Other artists paint a bridge, a house, a boat, and that’s the end…. I want to paint the air which surrounds the bridge, the house, the boat, the beauty of the air in which these objects are located…”

Color is a defining quality of humanity. Without color life would be as impersonal as Adams’s landscapes, beautiful, majestic even, but without passion or pulse. A sharp, stark visual with little nuance, no emotional gradations from torment to ecstasy, just shadows and form.

Understanding color was not just a technical revelation for 19th-century French artists, it was a revolutionary awakening, a new approach to how the eye viewed color and light. The Impressionists and Pointillists brought a new perception to their canvases. And the catalyst for this leap away from the tired styles of Academic Art and Realism was Michel Eugène Chevreul, a chemist whose insight into color harmony and contrast inspired the Monets and Seurats to pursue something radically different in the world of art. His chromatic studies inspired them to paint not for the viewer’s eye, but with it, transforming perception from passive witness into an active collaboration between painter, subject, and observer.

Chevreul’s breakthrough was deceptively simple. Colors are not static blots on a canvas but relational objects that come alive when surrounded by other hues of the spectrum. A hue in isolation is perceived differently than when seen next to another. Red deepens next to green; blue pulsates with enthusiasm against orange. This principle, simultaneous contrast, revealed that the eye does not just passively accept what it sees but synthesizes it into a new reality.

Chevreul’s theories on complementary colors and optical mixing laid the foundation for painters to forsake rigid outlines, often rendered in the non-color of black, and embrace Impressionism: not merely an art style, but a promise of perception, a collaboration between painter and viewer. Rather than blending pigments on a palette, artists like Monet and Seurat placed discrete strokes side by side, allowing the viewer’s mind to complete the image.

This optical mixing is a product of the way the eye and the brain process the various wavelengths of white light. When complementary colors are adjacent to one another, the brain amplifies the differences. Neurons in the eye are selfish: when a photoreceptor is stimulated by a color, it suppresses adjacent receptors, sharpening boundaries and contrast. And the brain interprets what it sees based on context, which is why we sometimes see what is not there or misinterpret what is there, such as faces on the surface of Mars or UFOs streaking through the sky. There is also a theory that the brain processes color in opposing pairs: when it sees red it suppresses green, creating a vibrancy of complementary colors when they are placed together.
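A toy sketch, in Python, of the “selfish neuron” idea described above. It is a deliberately crude model of lateral inhibition, not retinal physiology: each receptor simply reports its own stimulus minus a fraction of its neighbors’ input, which exaggerates the response difference at a boundary, the mechanism usually invoked to explain simultaneous contrast.

```python
# Crude lateral-inhibition sketch: each "receptor" responds with its own stimulus
# minus a fraction (k) of its neighbors' stimuli, exaggerating differences at edges.

def lateral_inhibition(stimulus, k=0.2):
    """Return each receptor's response after subtracting k times its neighbors' input."""
    response = []
    for i, s in enumerate(stimulus):
        left = stimulus[i - 1] if i > 0 else s
        right = stimulus[i + 1] if i < len(stimulus) - 1 else s
        response.append(s - k * (left + right))
    return response

# A step edge: a dim field (0.3) meeting a bright field (0.7).
edge = [0.3] * 5 + [0.7] * 5
print(lateral_inhibition(edge))
# The receptor just inside the bright field responds more strongly, and the one just
# inside the dim field more weakly, than their neighbors: the edge is exaggerated,
# much as complementary strokes placed side by side intensify each other.
```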

The Impressionists intensely debated Chevreul’s concepts, then brushed them to life with paint. They painted not concrete objects, but forms shaped by light and color. Haystacks and parasols within a changing mood of contrasting color. Interpretation by the eye of the beholder.

Chevreul’s collected research, The Principles of Harmony and Contrast of Colors and Their Applications to the Arts, originally published in 1839, remains in print nearly two centuries later.

Source: The Principles of Harmony and Contrast of Colors and Their Applications to the Arts by Michel Eugène Chevreul, 1997 (English Translation). Graphic: Woman with a Parasol by Monet, 1875. National Gallery of Art, Washington, DC. Public Domain.

The Lost Boys

The end of the Peloponnesian War in 404 BC marked the end of Athens’ Golden Age. Most historians agree that the halcyon days of Athens were behind her. Some, however, such as Victor Davis Hanson in his multi-genre meditation A War Like No Other, a discourse on military history, cultural decay, and philosophical framing, offer a more nuanced view, suggesting that Athens was still capable of greatness, but the lights were dimming.

During the six decades after the war, Athens rebuilt. Its navy reached new heights. Its long walls were rebuilt within a decade. Aristophanes retained his satirical edge even if it was a bit more reflective. Agriculture returned in force. Even Sparta reconciled with Athens, or vice versa, recognizing once again that the true enemy was Persia.

Athens brought back its material greatness, but its soul was lost. What ended the Golden Age of Athens wasn’t crumbled walls or sunken ships. It was the loss of the lives that carried the memory, the virtuosity of greatness, with them. With them vanished generational continuity, civic pride, and a religious belief in the polis. The meaning, truth, and myth of Athenian exceptionalism died with their passing. The architects of how to lead a successful, purpose-driven civilization had disappeared, mostly through death by war or state but also by plague.

Victor Davis Hanson, in his A War Like No Other, lists many of the lives lost to and during the war, men who took much of Athens’ exceptionalism with them to their graves. Below is a partial listing of Hanson’s more complete rendering, with some presumptuous additions.

Alcibiades was an overtly ambitious Athenian strategist; brilliant, erratic, and ultimately treasonous. He championed the disastrous Sicilian expedition, Athens’ greatest defeat. Over the course of the war, he defected multiple times: serving Athens, then Sparta, then Persia, before returning to Athens. He was assassinated in Phrygia around 404 BC while under Persian protection, at, many believe, the instigation of the Spartan general Lysander.

Euripides, though he did not fight in the war, exposed its brutality and hypocrisy in plays such as The Trojan Women and Helen. The Athenians were not sufficiently appreciative of his war opinions or his plays; he won only four firsts at the Dionysia, compared to 24 and 13 for Sophocles and Aeschylus, respectively. Disillusioned, he went into self-imposed exile in Macedonia and died there around 406 BC by circumstances unknown.

The execution of the Generals of Arginusae remains a legendary example of Athenian arbitrary retribution; proof that a city obsessed with ritualized honor could nullify military genius, and its future, in a single stroke. The naval Battle of Arginusae, fought in 406 BC east of the Greek island of Lesbos, was the last major Athenian victory over the Spartans in the Peloponnesian War. Athenian command of the battle was split among eight generals: Aristocrates, Aristogenes, Diomedon, Erasinides, Lysias, Pericles the Younger (son of Pericles), Protomachus, and Thrasyllus. After their victory over the Spartan fleet, a storm prevented the Athenians from recovering the survivors, and the dead, from their sunken ships. The six generals who returned to Athens were all executed for their negligence. Protomachus and Aristogenes, likely knowing their fate, chose not to return and went into exile.

Pericles, the flesh-and-blood representation of Athens’ greatness, was the statesman and general who led the city-state during its golden age. He died of the plague in 429 BC during the war’s early years, taking with him the vision of democratic governance and Athens’ exceptionalism. His three legitimate sons all died during the war. His two oldest boys likely died of the plague around 429 BC, and Pericles the Younger was executed for his part in the Battle of Arginusae.

Socrates, the world’s greatest philosopher (yes, greater than Plato or Aristotle), fought bravely in the war, but he was directly linked to the traitor Alcibiades. He was tried and executed in 399 BC for corrupting the youth and not giving the gods their due. That was all pretense. Athens desired to wash its collective hands of the war, and Socrates was a very visible reminder of it. He became a ritual scapegoat swept up in the collective expurgation of the war’s memory.

Sophocles, already a man of many years by the beginning of the war, died in 406 BC at the age of 90 or 91, a few years before Athens’ final collapse. His tragedies embodied the ethical and civic pressures of a society unraveling. With the deaths of Aeschylus in 456 BC, Euripides in 406 BC, and Sophocles soon after, the golden age of Greek tragedy came to a close.

Thucydides, author of the scholarly standard for the Peloponnesian War, was exiled after ‘allowing’ the Spartans to capture Amphipolis. He survived the war, and the plague, but never returned to Athens. His History breaks off in mid-narrative in 411 BC, and no one really knows why he never finished his account of the war. He lived until about 400 BC. Xenophon picked up where Thucydides left off and finished the war in the first two books of his Hellenica, composed sometime in the 380s BC.

The Peloponnesian War ended Athens’ greatest days. The men who kept its lights bright were gone. Its material greatness returned, glowing briefly, but its civic greatness, its soul, slowly dimmed. It was a candle in the wind of time that would be rekindled elsewhere. The world would fondly remember its glory, but Athens had lost its spark.

Source: A War Like No Other by Victor Davis Hanson, 2005. Graphic: Alcibiades Being Taught by Socrates, Francois-Andre Vincent, 1776. Musee Fabre, France. Public Domain.

The Sum of All Fears–Real and Imagined

The Peloponnesian War, fought over 27 years (431-404 BC), cost the ancient Greek world nearly everything. War deaths alone approached 8-10 percent of their population: up to 200,000 deaths from battle and plague. The conflict engulfed nearly all of Greece, from the mainland to the Aegean islands, Asia Minor and Sicily. Though Sparta and its allies, in the end, claimed a tactical victory, the war left Greece as a shadow of its former self.

The Golden Age of Athens came to an end. Athenian democracy was replaced, briefly, by the Thirty Tyrants. Sparta, unwilling to jettison its insular oligarchy, failed to adapt to imperial governance, naval power, or diplomatic nuance. Within a generation Sparta was a relic of history. First challenged by former allies in the Corinthian War, then shattered by Thebes, which stripped the martial city-state of its aura of invincibility along with its helot slave labor base: the economic foundation of Sparta. Another generation later, Macedon under Philip II and Alexander the Great finished off Greek dominance of the Mediterranean. After Alexander’s death in 323 BC, Rome gradually absorbed all the fractured pieces, proving again that building an empire is easier than keeping one.

Thucydides, heir to the world’s first historian, Herodotus, reduced the origins of the Peloponnesian War to a primal emotion: fear. In Book I of his History of the Peloponnesian War he writes: “The growth of the power of Athens, and the alarm which this inspired in Sparta, made war inevitable.” Athens had barred a minor Spartan ally from its markets under the Megarian Decree, but that was pretext, not cause. Sparta did not go to war over market access. It went to war over fear. Fear of what Athens had become and of a future that armies and treaties might not contain.

War and fear go together like flame to fuse. Sparta went to war not for fear of a foe, Sparta knew no such people. It was not fear of an unknown warrior, nor fear of battlefields yet to be choreographed, but fear of an idea: democracy maintained and backed by Athenian power. And perhaps, more hauntingly, fear of itself. Not that it feared it was weak, but fear of what it might become. They feared no sword or spear; their discipline reigned supreme against flesh and blood. Yet no formation, no stratagem, no tactic of war could bring down a simple Athenian belief: the rule of the many, an idea anathema, heretical even, to the Spartan way of life.

So, they marched to war, not to defeat an idea but to silence the source. Not to avenge past aggression but to stop a future annexation. They won battles, small and large. They razed cities. But they only destroyed men. The idea survived. It survived in fragments, bits here, bits there, across time and memory. What it did kill, though, was the spirit of Athens, the Golden Age of Athens. But the idea that was Athens lived on across space and time: chiseled into republics that rose from its ashes and ruins.

The radiance of Athens dimmed to shadow. Socrates became inconvenient. Theater became therapy, a palliative smothering of a cultural surrender. And so, civilization moved to Rome.

Source: A War Like No Other by Victor Davis Hanson, 2005. History of the Peloponnesian War by Thucydides, Translated by Richard Crawley, 2021. Graphic: Syracuse vs Athens Naval Battle. CoPilot.

Shadows of Reality — Existence Beyond Nothingness

From the dawn of sentient thought, humanity has wrestled with a single, haunting, and ultimately unanswerable question: Is this all there is? Across the march of time, culture, and science, this question has echoed in the minds of prophets, philosophers, mystics, and skeptics alike. It arises not from curiosity alone, but from something deeper, an inner awareness, a presence within all of us that resists the idea of the inevitable, permanent end. In every age, whether zealot or atheist, this consciousness, a soul, if you will, refuses to accept mortality. Not out of fear, but from an intuition that there must be more. This inner consciousness will not be denied, even to non-believers.

One needs to believe that death is not an end, a descent into nothingness, but a threshold: a rebirth into a new journey, shaped by the echoes of a life already lived. Not logic, but longing. Not reason, but resonance. A consciousness, a soul, that seeks not only to understand, but to fulfill, to carry forward the goodness of a life into something greater still. Faith in immortality beyond sight. A purpose beyond meaning. Telos over logos.

While modern thinkers reduce existence to probability and simulation, the enduring human experience, expressed through ancient wisdom, points to a consciousness, a soul, that transcends death and defies reduction. Moderns confuse intellect or brain with consciousness.

Contemporary thinkers and writers like Philip K. Dick, Elon Musk, and Nick Bostrom have reimagined this ancient question through the lens of technology, probability, and a distinctly modern myopia. Their visions, whether paranoid, mathematical, or speculative, suggest that reality may be a simulation, a construct, or a deception. In each case, there is a higher intelligence behind the curtain, but one that is cold, indifferent, impersonal. They offer not a divine comedy of despair transcending into salvation, but a knowable unknown: a system of ones and zeros marching to the beat of an intelligence beyond our comprehension. Not a presence that draws us like a child to its mother, a moth to a flame, but a mechanism that simply runs, unfeeling, unyielding, and uninviting. Incapable of malice or altruism. Yielding nothing beyond a synthetic life.

Dick feared that reality was a layered illusion, a cosmic deception. His fiction is filled with characters who suspect they’re being lied to by the universe itself, yet they keep searching, keep hoping, keep loving. Beneath the paranoia lies a desperate longing for a divine rupture, a breakthrough of truth, a light in the darkness. His work is less a rejection of the soul than a plea for its revelation in a world that keeps glitching. If life is suffering, are we to blame?

Musk posits that we’re likely living in a simulation but offers no moral or spiritual grounding. His vision is alluring but sterile, an infinite loop of code without communion. Even his fascination with Mars, AI, and the future of consciousness hints at something deeper: not just a will to survive, but a yearning to transcend. Yet transcendence, in his world, is technological, not spiritual. To twist the spirit of Camus’ “Should I kill myself or have a cup of coffee?”: without transcendence, life is barren of meaning.

Bostrom presents a trilemma in his simulation hypothesis: either humanity goes extinct before reaching a posthuman stage, posthumans choose not to simulate their ancestors, perhaps out of ethical restraint or philosophical humility, or we are almost certainly living in a simulation. At first glance, the argument appears logically airtight. But on closer inspection, it rests on a speculative foundation of quivering philosophical sand: that consciousness is computational rather than irreducibly organic, that future civilizations will have both the means and the will to simulate entire worlds, and that such simulations would be indistinguishable from reality. These assumptions bypass profound questions about the nature of consciousness, the ethics of creation, and the limits of simulated knowledge. Bostrom’s trilemma appears rigorous only because it avoids the deeper question of what it means to live and die.

These views, while intellectually stimulating, shed little light on a worthwhile future. We are consigned to existence as automatons, soulless, simulated, and suspended in probability curves of resignation. They offer models, not meaning. Equations, not essence. A presence in the shadows of greater reality.

Even the guardians of spiritual tradition have begun to echo this hollow refrain. When asked about hell, a recently deceased Pope dismissed it not as fire and brimstone, but as “nothingness,” a state of absence, not punishment. Many were stunned. A civilizational lifetime of moral instruction undone in a breath. And yet, this vision is not far from where Bostrom’s simulation hypothesis lands: a world without soul, without consequence, without continuity. Whether cloaked in theology or technology, the message is the same, there is nothing beyond. The Seven Virtues and the Seven Deadly Sins have lost their traction, reduced to relics in a world without effect.

But the soul knows better. It was not made for fire, nor for oblivion. It was made to transcend, to rise beyond suffering and angst toward a higher plane of being. What it fears is not judgment, but erasure. Not torment, but the silence of meaning undone. Immortality insists on prudent upkeep.

What they overlook, or perhaps refuse to embrace, is a consciousness that exists beyond intellect, a soul that surrounds our entire being and resists a reduction to circuitry or biology. A soul that transcends blood and breath. Meaning beyond death.

This is not a new idea. Socrates understood something that modern thinkers like Musk and Bostrom have bypassed: that consciousness is not a byproduct of the body, but something prior to it, something eternal. For Socrates, the care of the soul was the highest human calling. He faced death not with fear, but with calm, believing it to be a transition, not an end or a nothingness, but a new beginning. His final words were not a lament, but a gesture of reverence: a sacrifice to Asclepius, the god of healing, as if death itself were a cure.

Plato, his student, tried to give this insight form. In his allegory of the cave, he imagined humanity as prisoners mistaking shadows for reality. The journey of the soul, for Plato, was the ascent from illusion to truth, from darkness to light. But the metaphor, while powerful, is also clumsy. It implies a linear escape, a single ladder out of ignorance. In truth, the cave is not just a place, it is a condition. We carry it with us. The shadows are not only cast by walls, but by our own minds, our fears. And the light we seek is not outside us, but within.

Still, Plato’s intuition remains vital: we are not meant to stay in the cave. The soul does not long merely for survival, it is immortal, but it needs growth, nourished by goodness and beauty, to transcend to heights unknown. A transcendence as proof, the glow of the real beyond the shadow and the veil.

In the end, the soul reverberates from within: we are not boxed inside a simulation, nor trapped in a reality that leads nowhere. Whether through reason, compassion, or spiritual awakening, the voice of wisdom has always whispered the same truth: Keep the soul bright and shiny. For beyond the shadows, beyond the veil of death, there is more. There is always more.

Drunken Monkey Hypothesis–Good Times, Bad Times

In 2004, biologist Robert Dudley of UC Berkeley proposed the Drunken Monkey Hypothesis, a theory suggesting that our attraction to alcohol is not a cultural accident but an evolutionary inheritance. According to Dudley, our primate ancestors evolved a taste for ethanol (grain alcohol) because it signaled ripe, energy-rich, fermenting fruit, a valuable resource in dense tropical forests. Those who could tolerate small amounts of naturally occurring ethanol had a foraging advantage, and thus a caloric advantage. Over time, this preference was passed down the evolutionary tree to us.

But alcohol’s effects have always been double-edged: mildly advantageous in small doses, dangerous in excess. What changed wasn’t the molecule, it was our ability to concentrate, store, and culturally amplify its effects. Good times, bad times…

Dudley argues that this trait was “natural and adaptive,” but only because we didn’t die from it as easily as other species. Ethanol is a toxin, and its effects, loss of inhibition, impaired judgment, and aggression, are as ancient as they are dangerous. What may once have helped a shy, dorky monkey approach a mate or summon the courage to defend his troop with uncharacteristic boldness now fuels everything from awkward first dates and daring athletic feats to bar fights and the kind of stunts or mindless elocutions no sober mind would attempt.

Interestingly, alcohol affects most animals differently. Some life forms can handle large concentrations of ethanol without impairment, such as Oriental hornets, which are just naturally nasty, no chemical enhancements needed, and yeasts, which produce alcohol from sugars. Others, like elephants, become particularly belligerent when consuming fermented fruit. Bears have been known to steal beer from campsites, party hard, and pass out. A 2022 study of black-handed spider monkeys in Panama found that they actively seek out and consume fermented fruit with ethanol levels of 1–2%. But for most animals, plants, and bacteria, alcohol is toxic and often lethal.

Roughly 100 million years ago in the Cretaceous, flowering plants evolved to produce sugar-rich fruits, nectars, and saps, highly prized by primates, fruit bats, birds, and microbes. Yeasts evolved to ferment these sugars into ethanol as a defensive strategy: by converting sugars into alcohol, they created a chemical wasteland that discouraged other organisms from sharing in the feast.

Fermented fruits can contain 10–400% more calories than their fresh counterparts. Plums (used in Slivovitz brandy) show some of the highest increases. For grapes, fermentation can boost calorie content by 20–30%, depending on original sugar levels. These sugar levels are influenced by climate, warm, dry growing seasons with abundant sun and little rainfall produce sweeter grapes, which in turn yield more potent wines. This is one reason why Mediterranean regions have long been ideal for viticulture and winemaking, from ancient Phoenicia to modern-day Tuscany, Rioja, and Napa.

The story of alcohol is as ancient as civilization itself. The earliest known fermented beverage dates to 7000 BC in Jiahu, China, a mixture of rice, honey, and fruit. True grape wine appears around 6000 BC in the Caucasus region (modern-day Georgia), where post-glacial soils proved ideal for vine cultivation. Chemical residues in Egyptian burial urns and Canaanite amphorae prove that fermentation stayed with civilization as time marched on.

Yet for all its sacred and secular symbolism, Jesus turning water into wine, wine sanctifying Jewish weddings, or simply easing the awkwardness of a first date, alcohol has always walked a fine line between celebration and bedlam. It is a substance that amplifies human behavior, for better or worse. Professor Dudley argues that our attraction to the alcohol buzz is evolutionary: first as a reward for seeking out high-calorie fruit and modulating fear in risky situations, but it eventually became a dopamine high that developed as an end in itself.

Source: The Drunken Monkey by Robert Dudley, 2014.

Moral Fogs: Machine and Man

(Note: This companion essay builds on the previous exploration of Asimov’s moral plot devices, rules that cannot cover all circumstances, focusing on dilemmas with either no good answers or bad answers wrapped in unforgiving laws.)

Gone Baby Gone (2007) begins as a textbook crime drama, the abduction of a child, but by its final act, it has mutated into something quietly traumatic. What emerges is not a crime thriller, but an unforgiving philosophical crucible of wavering belief systems: a confrontation between legal righteousness and moral intuition. The two protagonists, once aligned, albeit by a fine thread, eventually find themselves on opposite ends of a dilemma that law alone cannot resolve. In the end, it is the law that prevails, not because justice is served, but because it is easy, clear, and lacking in emotional reasoning. And in that legal clarity, something is lost, a child loses, and the adults can’t find their way back to a black and white world.

The film asks: who gets to decide for those who can’t decide for themselves? Consent only functions when the decisions it enables are worthy of those they affect.

The film exposes the flaws of blindly adhering to a legal remedy that is incapable of nuance or purpose-driven outcomes; not for the criminals, but for the victims. It lays bare a system geared towards justice and retribution rather than merciful outcomes for the unprotected victims, or even identifying the real victims. It’s not a story about a crime. It’s a story about conscience. And what happens when the rules we write for justice fail to account for the people they’re meant to protect, if they account for them at all. A story where it was not humanly possible to write infallible rules and where human experience must be given room to breathe, all against the backdrop of suffocating rules-based correctness.

Moral dilemmas expose the limits of clean and crisp rules, where allowing ambiguity and exceptions to seep into the pages of black and white is strictly forbidden. Where laws and machines give no quarter, and the blurry echoing of conscience is allowed neither sight nor sound in the halls of justice or in those unburdened by empathy and dimensionality. When justice becomes untethered from mercy, even right feels wrong in deed and prayer.

Justice by machine is law anchored not in human experience but only in human rules. To turn law and punishment over to an artificial intelligence without soul or consciousness is not evil, but there is no inherent goodness in it either. It will be something far worse: a sociopath, driven not by evil, but by an unrelenting fidelity to correctness. A precision divorced from purpose.

In the 2004 movie I, Robot, loosely based on Isaac Asimov’s 1950 story collection of the same name and incorporating his Three Laws of Robotics, a robot saves detective Del Spooner (Will Smith) instead of a 12-year-old girl, both trapped in submerged cars, moments from drowning. The robot could save only one and picked Spooner because of the probabilities of who was likely to survive. A twist on the Trolley Problem, where there are no good choices. There was no consideration of future outcomes: was the girl humanity’s savior, or, more simply, was a young girl’s potential worth more, or less, than a known adult’s?

A machine decides with the cold calculus of the present, a utilitarian decision based on known survival odds, not social biases, latent potential, or historical trajectories. Hindsight is 20-20; decision-making without considering the unknowns is tragedy.

The robot lacked moral imagination, the capacity to entertain not just the likely, but the meaningful. An AI embedded with philosophical and narrative reasoning may ameliorate an outcome. It may recognize a preservation bias towards potential rather than just what is. Maybe AI could be programmed to weigh moral priors, procedurally more than mere probability but likely less than the full impact of human potential and purpose.
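As a thought experiment only, here is a minimal Python sketch of that idea. It is not the film’s algorithm or anyone’s proposal for real triage: the survival odds and the “potential” weight are invented numbers, and the point is simply that adding even a crude moral prior can flip the decision a probability-only rule would make.

```python
# A toy contrast between a probability-only rescue rule and one that also weighs
# a crude "moral prior" for future potential. All numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    survival_odds: float     # estimated chance the rescue succeeds
    potential_weight: float  # debatable stand-in for unknown future potential

adult = Person("adult detective", survival_odds=0.45, potential_weight=1.0)
child = Person("12-year-old girl", survival_odds=0.11, potential_weight=5.0)

def probability_only(candidates):
    # The robot's cold calculus: maximize the odds that the rescue succeeds.
    return max(candidates, key=lambda p: p.survival_odds)

def weighted_by_prior(candidates):
    # The same calculus, but survival odds are scaled by the moral prior.
    return max(candidates, key=lambda p: p.survival_odds * p.potential_weight)

print(probability_only([adult, child]).name)   # adult detective
print(weighted_by_prior([adult, child]).name)  # 12-year-old girl
```

The sketch also exposes the obvious objection: whoever sets potential_weight smuggles a value judgment into the machine, which is the essay’s point, not a refutation of it.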

Or beyond a present full of knowns into the future of unknowns for a moral reckoning of one’s past.

In Juror No. 2, the suspenseful 2024 drama directed by Clint Eastwood, Justin Kemp (Nicholas Hoult) is selected to serve on a jury for a murder trial that, he soon realizes, is about his own past. Justin isn’t on trial for this murder, but maybe he should be. It’s a plot about individual responsibility and moral judgment. The courtroom becomes a crucible not of justice, but of conscience. He must decide whether to reveal the truth and risk everything, or stay silent and let the system play out, allowing himself to walk free and clear of a legal tragedy but not of his guilt.

Juror No. 2 is the inverse of I, Robot. An upside-down moral dilemma that challenges rule-based ethics. In I, Robot, the robot saves Will Smith’s character based on survival probabilities. Rules provide a path forward, but in Juror No. 2 the protagonist is in a trap where no rules will save him. Logic offers no escape; only moral courage can break him free from the chains of guilt, even though they bind him to the shackles that rules demand. Justin must seek and confront his soul, something a machine can never do, to make the right choice.

When morality and legality diverge, when choice runs into the murky clouds of grey against the black and white of rules and code, law and machines will take the easy way out. And possibly the wrong way.

Thoreau in Civil Disobedience says, “Law never made men a whit more just; and… the only obligation which I have a right to assume is to do at any time what I think right,” and Thomas Jefferson goes further: the consent of the governed must be re-examined when wrongs exceed rights. Life, liberty, and the pursuit of happiness is the creed of the individual giving consent to be governed by a greater societal power, but only when the government honors the rights of man and treads softly on the rules.

Government rules, a means to an end, derived from the consent of the governed, after all, are abstractions made real through human decisions. If the state can do what the individual cannot, remove a child, wage war, suspend rights, then it must answer to something greater than itself: a moral compass not calibrated by convenience or precedent, but by justice, compassion, and human dignity.

Society often mistakes legality for morality because it offers clarity. Laws are neat, mostly. What happens when the rules run counter to common sense? Morals are messy and confusing. Yet it’s in that messiness, the uncomfortable dissonance between what’s allowed and what’s right, that our real journey towards enlightenment begins.

And AI and machines can erect signposts but never construct the destination.

A human acknowledgement of a soul’s existence and what that means.

Graphic: Gone Baby Gone Movie Poster. Miramax Films.