Tripping

Albert Hofmann, employed by Sandoz Laboratories in Basel, Switzerland, was conducting research on ergot, a toxic fungus, in 1938 to identify potential circulatory and respiratory stimulants. While synthesizing compounds derived from the fungus, he inadvertently created lysergic acid diethylamide (LSD), an alkaloid of the ergoline family, a class known for its physiological effects on the human nervous system.

Five years later, on April 16, 1943, Hofmann became the first person to experience the hallucinogenic effects of LSD while re-synthesizing the compound. He accidentally absorbed a small amount through his skin, leading to vivid hallucinations he later described as a dreamlike state with kaleidoscopic visuals. On the strength of two groundbreaking lab accidents five years apart, Hofmann was ranked by The Daily Telegraph as the greatest living genius in 2007.

During the countercultural movement of the 1960s, LSD emerged as a popular recreational drug, attracting advocates such as Timothy Leary, a Harvard psychologist who famously urged people to “Turn on, tune in, drop out.” Leary championed the use of psychedelics to explore altered states of consciousness and challenge conventional societal norms. LSD also played a pivotal role in the genesis of Ken Kesey’s novel One Flew Over the Cuckoo’s Nest, which focused on the horrific abuse of patients in mental institutions. The book, later adapted into a film starring Jack Nicholson, significantly raised awareness of the cruelty of mental institutions. However, LSD’s trajectory took a sinister turn beyond recreation when it became a tool for government mind-control experiments.

Starting in the 1950s, the CIA launched MKUltra, a covert program designed to explore drugs and techniques for breaking down individuals psychologically. LSD became a central component of these experiments, often administered secretly to unsuspecting individuals to study its effects. Targets included prisoners, drug addicts, prostitutes, military personnel, CIA employees, and even random civilians. It is difficult to ascertain which acronym took the greater hit to its reputation: the CIA or LSD.

Source: Albert Hofmann by Morgan and Donahue, All That’s Interesting, 2025. Graphic: Albert Hofmann in 1993.

Real Not Real

“Have no fear of perfection; you’ll never reach it.” – Dalí

Salvador Dalí was the entertaining, surrealist voice of the masses. His dreamlike spectacle of melting clocks and flamboyant persona captivated popular culture, injecting eccentric brushstrokes into the lives of the disengaged and disinterested. Dalí spoke directly to the public’s fascination with dreams and absurdity, transforming art into a theatrical experience and a giggly poke at the eminent egos on high altars.

Dalí was a 20th-century Spanish artist who drew from influences such as Renaissance art, Impressionism, and Cubism, but by his mid-twenties, he had fully embraced Surrealism. He spent most of his life in Spain, with notable excursions to Paris during the 1920s and 1930s and to the United States during the World War II years. In 1934, he married the love of his life, Gala. Without her, Dalí might never have achieved his fame. She was not just his muse but also his agent and model, a true partner in both his art and life. Together, they rode a rollercoaster of passion and creativity, thrills and dales, until her death in 1982.

Dalí had strong opinions on art, famously critiquing abstract art as “inconsequential.” He once said, “We are all hungry and thirsty for concrete images. Abstract art will have been good for one thing: to restore its exact virginity to figurative art.” He painted images that were real and with context that bordered on the not real, the surreal. For those who believed that modern abstract art had no life, no beauty, no appeal, he provided a bridge back to a coherent emotional foundation with a dreamlike veneer, incorporating spirituality and innovative perspectives into his dreams and visions of life.

The Persistence of Memory (1931) is Dalí’s most recognizable and famous painting, but his 1951 work Christ of Saint John of the Cross is arguably his most autobiographical and accessible piece. A painting dripping with meaning and perspective, Dalí claimed it came to him in a dream inspired by Saint John of the Cross’s 16th-century sketch of Christ’s crucifixion. The perspective is indirectly informed by Saint John’s vision, while the boat and figures at the bottom reflect influences from Le Nain and Velázquez. The triangular shape created by Christ’s body and the cross represents the Holy Trinity, while Christ’s head, a circular nucleus, signifies unity and eternity: “the universe, the Christ!” Dalí ties himself personally to the crucifixion by placing Port Lligat, his home, in the background. He considered this painting a singular and unique piece of existence, one he likely could never reproduce because the part of him that went into the painting was gone forever. That part is shared with his viewers, offering a glimpse into Christ’s pain, Dalí’s anguish, and his compassion: an emotional complexity that transcends mortal comprehension.

Source: Salvador Dalí by Robert Descharnes, 1984. Graphic: Christ of Saint John of the Cross, Dalí, 1951. Low res. Copyright Glasgow Corporation.

April Fools

April Fool’s Day brings pranks, jokes, and ‘kick-me’ notes, to the consternation of almost all. The silliness has roots reaching back nearly 450 years by some estimates, and it is still going strong.

While its actual roots are debated, one popular explanation traces the origins of April Fool’s Day to the change from the Julian to the Gregorian calendar, which occurred on 15 October 1582. Under the Julian calendar the new year began on April 1; under the new calendar it moved to January 1.

The transition to the new calendar was announced in Rome by Pope Gregory XIII and was quickly adopted in Catholic countries such as Spain and Portugal, but adoption elsewhere was slower, whether through slow communications, religious differences, or simple resistance to change.

Since the changeover to the new calendar was slow, some people and communities continued to celebrate the New Year on April 1 and were roundly mocked as April Fools. This mockery quickly morphed into jokes, both verbal and practical, which continue to this day.

The BBC has become known for its irreverent April Fool’s jokes. In 1957 it showed viewers a clip of Swiss farmers harvesting spaghetti from trees, and many in the audience called in to ask how they could grow their own spaghetti trees. In 1980 the BBC told its listeners that Big Ben was being converted to a digital readout and followed that up with an announcement that the clock hands would be sold off to the first four callers.

Black Swans Part II

Last week, we introduced Taleb’s definition of black swans: rare, unpredictable ‘unknown unknowns,’ in military terms, with major impacts, and explored historical examples that reshaped society after the event. This week I’m going to introduce a fictional black swan and how to react to it, but before that, the unpredictable part of Taleb’s definition needs some modification. True black swans by Taleb’s definition are not only rare but practically non-existent outside of natural disasters such as earthquakes. To discuss a black swan, I am going to change the definition a bit and say these events are unpredictable to most observers but predictable, or at least imaginable, to some. Taleb would likely call them grey swans. For instance, Sputnik was known to the Soviets but was an intelligence failure and a complete surprise to the rest of the world. Nikola Tesla anticipated the iPhone 81 years ahead of time. 9/11 was known to the perpetrators and was an intelligence failure. Staging a significant part of your naval fleet at Pearl Harbor during a world war and forgetting to surveil the surrounding area is not a black swan, just incompetence.

With that tweak out of the way, we’ll explore the strategies Taleb discusses in Part II for mitigating a black (grey) swan’s major impacts, using a fictional example. His strategies can be applied to pre-swan events as well as post-swan recovery. Pre-swan planning in business is called contingency planning, risk management, or, you guessed it, black swan planning. The strategies include prioritizing redundancy, flexibility, robustness, and simplicity, as well as preparing for extremes, fostering experimentation, and embracing antifragility.

Imagine a modern black swan: a relentless AI-generated cyberattack cripples the Federal Reserve and the banking system, wiping out reserves and assets. Industry and services collapse nationwide and globally as capital evaporates, straining essentials, with recovery decades away, if it comes at all. After the shock come analysis and damage reports; then the rebuilding begins.

The Treasury, with no liquid assets, must renegotiate debt to preserve global trust. Defense capabilities are maintained at a sufficient level, hopefully hardened, to protect national security, while the State Department reimagines foreign policy to bolster domestic production and resource independence while keeping the wolves at bay.

Non-essential programs and services, from expansive infrastructure projects to research and federal education initiatives, are shelved, shifting priorities and remaining resources to core social and population safety nets like Social Security and defense. Emergency measures kick in: targeted taxes on luxury goods and wealth are imposed to boost revenue and redirect resources. Tariffs encourage domestic production and independence.

Federal funding to states and localities is reduced to a trickle. States and municipalities must take ownership of essential public services such as education, water, roads, and public safety. The states are forced to retrench and innovate, turning federal scarcity into local progress.

Looking ahead, resilience becomes the first principle. Diversification takes center stage, with the creation of a sovereign wealth fund based on assets like gold, bitcoin, and commodities, bolstered by states that had stockpiled reserves such as rainy-day funds, ensuring financial stability. Local agriculture, leaner industries, and a realigned electrical grid, freed from federal oversight, innovate under pressure, strengthening the recovery. Resilience becomes antifragility, the drive to build stronger and better in the face of adversity. And finally, the government must revert to its Lockean and Jeffersonian roots, favoring liberty and growth over control, safety, and stagnation: antifragility.

Source: The Black Swan by Nassim Nicholas Taleb, 2007. Graphic: The Black Swan hardback cover.

Ship Rams Bridge—Bridge at Fault

An hour and a half after midnight on 26 March 2024, the main spans of the Francis Scott Key Bridge collapsed when the Singapore-registered container ship MV Dali lost power and collided with a supporting pier of the main truss section. The NTSB blamed the bridge for being old and not up to snuff with the latest safety features. If the bridge had been built a day after the collision, with adequate safety redundancies in place, it certainly would not have collapsed.

The bridge collapse blocked the shipping channel to the Port of Baltimore, causing daily economic losses of $15 million and billions in total from damages, lost business, and liability.

The Maryland Department of Transportation stated that construction of a replacement bridge, which began in early 2025, is projected to take four years to complete, with an estimated cost of $2 billion.

The U.S. government and Maryland sued the ship’s owners and operators for full liability, but maritime law limits catastrophic losses for shipping companies. The owners and operators will be limited to $100 million in damages, mainly for cleanup of the collapsed bridge.

The ship’s owners not only argued for limited liability under maritime law but also that the bridge was ill-prepared for a sneaky late-night naval attack on its structure. Of course, the fact that the ship was ill-prepared to sail the seven seas is immaterial. Legal arguments are interesting features of modern life, providing unlimited opportunities for head-scratching cognitive dissonance and wonderment.

Graphic: Francis Scott Key Bridge collapse by NTSB.

14th Amendment

The 14th Amendment, introduced during the Reconstruction era, was crafted to address legal and constitutional deficiencies exposed after the U.S. Civil War. Its first sentence, “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,” has become a focal point for competing interpretations. Much like the Second Amendment, its wording has sparked legal and grammatical debates, particularly surrounding the clause “and subject to the jurisdiction thereof.”

The Second Amendment faced similar scrutiny for over 200 years, particularly its prefatory clause, “A well-regulated Militia.” This ambiguity was finally addressed in District of Columbia v. Heller (2008), where the Supreme Court clarified that the historical record and documents like the Federalist Papers supported the right of private citizens to own firearms. The Court also ruled that the prefatory clause did not limit or expand the operative clause, “the right of the people to keep and bear Arms, shall not be infringed.”

Likewise, the 14th Amendment’s clause “and subject to the jurisdiction thereof” remains unsettled, awaiting similar historical and grammatical scrutiny to solidify its interpretation. Initially aimed at protecting freed slaves and securing their citizenship, this provision has since invited broader interpretations in response to modern challenges like immigration.

The framers’ intent during Reconstruction was to ensure equality and citizenship for freed slaves and their descendants, shielding them from exclusionary laws. At the time, the inclusive principle of jus soli (birthright citizenship) aligned with the nation’s need to address the injustices of slavery and foster unity among the country’s existing population. However, changing migration patterns and modern cultural dynamics have shifted the debate. The ambiguity of “subject to the jurisdiction thereof” now raises questions about its application in a globalized world, such as how jurisdiction applies to illegal immigrants or to the children of foreign diplomats.

Legal precedents such as United States v. Wong Kim Ark (1898) affirmed that nearly all individuals born on U.S. soil are citizens, regardless of whether their parents’ immigration status is legal or illegal. While this aligns with the practical interpretation of jurisdiction, it has spurred debates about the fairness and implications of modern birthright citizenship practices.

Immigration today involves a broader spectrum of cultures and traditions than during earlier waves, when newcomers often shared cultural similarities with the existing population. Assimilation, once relatively seamless, now faces greater challenges. Nations like Britain and Germany have recently revised their jus soli policies to prioritize the preservation of societal norms. The unresolved question of how to address declining populations further complicates the debate, a debate with the citizens that has not occurred, much less been resolved.

While originally crafted to address the systemic exclusion of freed slaves, the 14th Amendment’s principle of birthright citizenship continues to evolve in its application.

Graphic: 14th Amendment Harper’s Weekly.

White Guard

Mikhail Bulgakov’s White Guard, set during the Ukrainian War of Independence (1917–1921) amid the Russian Civil War, captures Kyiv in an existential power struggle against varied forces: Ukrainian nationalists allied with German troops, the White Guard clinging to Tsarist dreams, Lenin’s Bolsheviks closing in, plus Poles and Romanians. Against this bloody backdrop, Bulgakov crafts a semi-autobiographical tale of loss and fatalism, culminating in a nihilistic realization of humanity’s purpose: “But this isn’t frightening. All this will pass. The sufferings, agonies, blood, hunger, and wholesale death. The sword will go away, but these stars will remain… So why are we reluctant to turn our gaze to them? Why?”

Bulgakov, a doctor of venereal diseases like the book’s protagonist Alexei Turbin, knew hopelessness. In 1918, syphilis was a scourge, often incurable, leading to madness, mirroring the war’s societal decay. Alexei volunteers for the White Guard, tending to horrors he can’t heal, his efforts dissolving in a dream: “shadows galloped past…Turbin was dying in his sleep.” War becomes a disease, resistance futile. Yet Bulgakov’s lens widens. Sergeant Zhilin dreams of Revelation, “And God shall wipe away all tears…and there shall be no more death,” finding humility in cosmic indifference. Petka, an innocent, dreams simply of a sunlit ball, untouched by great powers. “Blessed are the pure in heart, for they shall see God” (Matthew 5:8).

Then, out of dreamland into the light: “All this will pass.” The stars endure, wars fade. Writing in the 1920s after the White defeat, Bulgakov channels Russian fatalism—Dostoevsky’s inescapable will, Chekhov’s quiet surrender. But he’s not fully broken. His “Why?” pleads, mocks, resists. Why not look up? Survival is luck, death equalizes, yet fighting a losing battle confronts our nothingness. Kyiv falls, the Bolsheviks threaten, the White Guard vanishes, still, Bulgakov continues to ask. Why?

He blends despair with irony, a doctor mocking death as the stars watch. The German expulsion of the Reds in 1918 briefly eased bloodshed, but 1919 brought worse, “Great was the year and terrible the Year of Our Lord 1918, but more terrible still was 1919.” History moves on; stars don’t care. Bulgakov’s question lingers: Why? To fight is to live, fate be damned.

Source: White Guard, Mikhail Bulgakov, trans. Marian Schwartz. Graphic: Ukrainian Soldiers circa 1918.

Howling Through Europe

During the Italian Renaissance, a cultural rebirth fueled optimism and propelled civilization to new heights. Yet, in stark contrast, werewolf and witch trials, often culminating in gruesome executions, cast a superstitious shadow over the cold, rugged rural regions of France and Germany from the 14th to 17th centuries.

One notorious case revolves around Peter Stumpp, dubbed the “Werewolf of Bedburg.” In 1589, near Cologne in what is now Germany, Stumpp faced accusations of lycanthropy during a spree of brutal murders and livestock killings. Under torture, he confessed to striking a pact with the devil, who allegedly provided a magical wolf-skin belt that allowed him to transform into a wolf. He admitted murdering and cannibalizing numerous victims, including children. Stumpp’s punishment was as horrifying as his alleged crimes: he was beheaded and burned, alongside his daughter and mistress, who were also implicated. His severed head was later mounted on a pole as a grim public warning.

In a striking counterexample, a case from 1692 in present-day Latvia and Estonia challenges the typical narrative. An 80-year-old man named Thiess of Kaltenbrun confessed to being a werewolf, not to wreak havoc, but to protect his community. He claimed he and other “werewolves” battled witches, even journeying to Hell and back to secure the region’s grain supply. The court, skeptical of his tale and possibly viewing him as delusional (perhaps an early case of clinical lycanthropy), rejected the death penalty and sentenced him to flogging instead.

These trials often hinged on confessions extracted through torture, blurring the lines between truth, projection, vengeance, and superstition. Historians still debate the reality behind the accused: Were they serial killers, convenient scapegoats for unsolved crimes, or individuals afflicted by psychological conditions like clinical lycanthropy—a disorder marked by the delusion of transforming into an animal? In his 1865 work, The Book of Were-Wolves, Sabine Baring-Gould analyzed cases like Stumpp’s, arguing that while superstition inflated their legend, they may have been rooted in real incidents: gruesome murders or societal fears run amok.

Source: The Book of Were-Wolves by Baring-Gould. Werewolf Trials by Beck, History, 2021. Graphic: Werewolf Groc 3.

American Colony’s First Naturalization Act of 1664

On 12 March 1664, King Charles II, eager to expand the English empire and reward his loyal brother James, Duke of York, granted him a vast swath of North American territory. This prize included New Netherland—stretching roughly from the Delaware River to the Connecticut River—plus scraps of modern Maine and islands like Long Island, Martha’s Vineyard, and Nantucket along the Atlantic coast. Whether Charles saw the Dutch, who then claimed and occupied most of this land, as a mere obstacle to be swept aside or a challenge for a later Machiavellian showdown isn’t entirely clear in today’s histories. But that’s a story for a later post. What’s certain is that on September 8, 1664, some 300 British troops under Colonel Richard Nicolls peacefully seized New Netherland, renaming its heart, New Amsterdam, as New York.

The conquest raised a question: what to do with the Dutch, French, Walloons, and other non-English settlers now under British rule? The Articles of Capitulation, signed that day, were generous: these residents could keep their property, trade rights, and personal liberties. They weren’t forced out or stripped of their livelihoods—a pragmatic move to avoid rebellion in a colony where the Dutch outnumbered their new overlords. But this deal had limits. Without British citizenship, they owed no loyalty to the Crown, couldn’t pass property seamlessly under English law, and lacked full access to British markets. Enter the curiously dated Naturalization Act of 12 March 1664—more on that head-scratching date in a moment.

This act offered foreign-born settlers a path to English subjecthood. By swearing allegiance to the Crown and paying a fee—described by some as modest, by others as steep—they could gain the rights and privileges of English subjects. The exact fee is lost to time, but it was likely hefty enough to filter out the poor while drawing in merchants and landowners eager for legal and economic benefits. The act aimed to stabilize the colony’s economy, secure political control, encourage growth, and align local realities with British common law.

In mid-17th-century England, citizenship hinged on jus sanguinis—citizenship by blood. Only children of British parents were natural subjects; foreign-born adults, like New York’s Dutch settlers, needed a legal workaround to join the fold and fully participate in colonial life. The act filled that gap, promising a unified British colony over time.

History pegs this Naturalization Act to 12 March 1664—the same day Charles granted James the land—yet the English didn’t hold New Netherland until September. A citizenship act before possession seems nonsensical. One plausible explanation? It’s a backdated fiction. The real policy likely emerged post-conquest, perhaps in late 1664 or 1665, as Nicolls integrated the Dutch population. Linking it to March 12 could’ve been a deliberate move to dress up the original grant as lawful and inevitable—a tidy origin story for English New York. The Articles handled the surrender’s chaos; naturalization was the long-term glue, and pinning it to the charter’s date cast the shift from Dutch to British rule as seamless and legitimate.

Addendum: The Evolution of British Citizenship

British subjecthood didn’t stay rooted in jus sanguinis. Between 1664 and the mid-18th century, it gradually shifted toward jus soli—citizenship by soil. By the time Sir William Blackstone penned his Commentaries on the Laws of England (1765–1769), anyone born in the Crown’s dominions was a natural-born subject, regardless of parentage. This held until the British Nationality Act of 1981, which dialed back unconditional jus soli. Now, a child born in the UK needs at least one parent to be a British citizen or a settled legal resident to claim citizenship—leaving others out of the fold.

Sources: America’s Best History; discussions with Grok 3. Graphic: Landing of the English at New Amsterdam, 1664, produced 1899, public domain.

Temperance in Early Virginia

The Virginia Colony, established by the Virginia Company of London, was not a cradle of temperance in its early years. Founded in 1607 at Jamestown under a 1606 charter from King James I, this joint-stock venture aimed for profit—gold, trade, and later tobacco—not moral reform. Its settlers, a mix of Anglican adventurers, merchants, and laborers, relied on alcohol (beer and spirits) as a staple, given the assumed scarcity of safe water. Yet, a supposed temperance law dated 5 March 1623 is often cited as America’s first, though no documentary evidence from the Virginia Company’s records supports this claim.

The context for such a measure lies in the colony’s struggles. By 1623, Virginia was a fragile outpost under company control, reeling from the Powhatan Uprising of 1622. This surprise attack by the Powhatan Confederacy killed about 347 settlers—over a quarter of the population—likely in response to English land grabs for growing tobacco. The massacre disrupted food supplies, leaving grain scarce. If a 1623 law restricted alcohol production, it may have been a pragmatic response to conserve resources, not a temperance crusade. Virginia Company records don’t mention such a law, but they do show earlier alcohol regulations for practical ends—economic control, public order, or resource management—rather than moral prohibitions.

Earlier codes hint at this pattern: vigilance in the face of pioneering hardships. The Laws Divine, Moral and Martial, enacted around 1610–1611 under Sir Thomas Dale, imposed strict discipline in the struggling settlement, including penalties for drunkenness to curb idleness. In 1619, Governor George Yeardley’s assembly banned “drunkenness” and excessive gaming, possibly reflecting mild Puritan influence from England’s religious debates. However, these rules targeted abuse, not alcohol itself, and didn’t amount to temperance as later understood. The absence of a Puritan majority—unlike in New England—underscores this distinction. Virginia’s settlers were commerce-driven subjects of the Crown, not the religious reformers who arrived later with the Plymouth Colony (1620) or Massachusetts Bay (1630).

The 1623 claim might stem from a misinterpretation of these regulatory measures, exaggerated by later historians or temperance advocates seeking an early precedent. For comparison, in 1623, the Virginia Company of Plymouth’s minister William Blackstone distributed apples (later tied to cider), but no temperance law emerged there either. Both companies, focused on survival and profit, bore little resemblance to the Puritan ethos that shaped later American temperance movements. Without primary evidence, the 1623 Virginia temperance law remains a historical ghost—possibly a practical rule born of crisis, not a moral milestone.

Source: Initial claim from Encyclopedia of Trivia, elaborated by Grok 3. Graphic: Indian Massacre of 1622, Woodcut by Matthaus Merian, 1628. Public Domain.