Monroe Doctrine

In 1823, President James Monroe issued what became known as the Monroe Doctrine, warning European powers against further colonization or interference in the New World. Though never codified into law or treaty, the doctrine became a guiding principle of U.S. foreign policy, invoked and reinterpreted by successive administrations to assert American influence in the hemisphere. Theodore Roosevelt expanded it, Barack Obama’s administration declared it obsolete, and Donald Trump revived its assertive tone. Its malleability is hailed by some as its strength, denounced by others as its greatest flaw.

The Monroe Doctrine became a symbolic fence around the Western Hemisphere, a firewall against nineteenth‑century imperial powers. Over the next two centuries, it evolved through corollaries, confrontations, and periods of dormancy. Today, in the shadow of Chinese expansion, mainly through its Belt and Road Initiative, Latin American states are drawn to twenty‑first‑century infrastructure with age‑old colonialism lurking in the background. But China’s influence-buying in the hemisphere is aimed directly at the United States, seeking to erode its traditional dominance and reshape regional loyalties.

The Monroe Doctrine was intended to thwart enemies, potential and real, at the gate. With the exception of Cuba, it largely succeeded through the twentieth century. The twenty-first century now poses a test of whether the doctrine still has teeth.

If conflict with China is fated, then the United States must first secure its own backyard. The Western Hemisphere cannot be a distraction or a liability, a source of angst and trouble. Before turning its full strategic gaze toward the Middle Kingdom, the U.S. must seal the gates of the New World.

The Monroe Doctrine was written mainly by President Monroe’s Secretary of State, John Quincy Adams. It aimed to support Latin American independence movements from Spain and Portugal, while discouraging Russian influence in the Pacific Northwest and preventing the Holy Alliance (Russia, Austria, and Prussia), together with France, from restoring monarchical rule in the Americas. But the doctrine was not all sword: the United States also pledged not to interfere in Europe’s internal affairs or its colonies.

In the early 1800s, the United States lacked the ability to enforce such a bargain militarily. Britain, however, was more than willing to use its naval fleet to guarantee access to New World markets and discourage competition.

At the dawn of the twentieth century, Theodore Roosevelt invoked and expanded the doctrine, effectively making the United States the policeman of the Western Hemisphere. During the Cold War, it was used to counter Soviet influence in Cuba, Nicaragua, and Grenada.

By the 1970s, the South American drug trade had been declared a national security threat, and the War on Drugs began, with Colombia the epicenter of hostilities. In 1981, the U.S. Congress amended the Posse Comitatus Act to allow military involvement in domestic drug enforcement, an authority that soon extended to Latin America. President Ronald Reagan’s 1986 National Security Decision Directive 221 declared drug trafficking a U.S. national security threat, authorizing military operations abroad, including in Colombia.

After the Cold War, the doctrine faded from explicit policy. In November 2013, Secretary of State John Kerry declared at the Organization of American States that “the era of the Monroe Doctrine is over,” framing a shift toward partnership and mutual respect with Latin America rather than unilateral dominance. By 2020 Colombia’s coca production had hit a new high.

Today, China’s Belt and Road Initiative, port construction and acquisitions, telecom infrastructure, and rare-earth diplomacy have carved influence into Latin America and the Caribbean. In this context, the Monroe Doctrine was not asleep but in a coma, its toes occasionally twitching.

Reinvigorating the Monroe Doctrine is not about making true allies and friends but about removing vulnerabilities. The goal is not to bring these nations into the fold but to remove them from Beijing’s orbit.

By mid-2025, official statements claim that roughly 10% of the U.S. Navy is deployed to counter drug threats, ostensibly from Venezuela and Colombia. But fleet positioning hints at a different story. Most assets are stationed near Puerto Rico, the Virgin Islands, and Guantánamo Bay, closer to Cuba than Caracas. Surveillance flights, submarine patrols, and chokepoint monitoring center on the Florida Straits, Windward Passage, and Yucatán Channel.

This may suggest strategic misdirection. Venezuela is the declared theater, but Cuba is the operational keystone. The U.S. may be deflecting attention from its true concern: Chinese or Russian entrenchment in Cuba and the northern Caribbean.

The Monroe Doctrine began as a warning to monarchs across the Atlantic. In the late twentieth century, it morphed into a war on drugs. Today it reappears as a repurposed drug war, flickering as a warning to Beijing across the Pacific. Whether it awakens as policy or remains sleight of hand, its enduring role is to remind the world that the Western Hemisphere is not a theater for distraction but a stage the United States will guard against intrusion. In the twenty‑first century, its test is not whether it can inspire allies, but whether it can deny adversaries a foothold in America’s backyard.

Graphic: Monroe Doctrine by Victor Gillam, 1896. Public Domain.

Hamlet Goes to Milwaukee—A Tragicomedy in Five Acts

Prolepsis’ Prologue:

The Chorus enters. A single spotlight. A single Damocles’ bullet hangs in the air like a haunted ghost spinning to history’s rhythms and trajectories.

CHORUS:

John Schrank shoots Theodore Roosevelt, 113 long and mostly forgotten years ago, in Milwaukee, Wisconsin, on a sharp and chilled Monday, coats pulled tight, 14 October 1912.

That’s the end, my friend, or so it seems. But tragedy demands context, and context demands sacramental passings. Let us reset and reconfigure the scene, with a sentimental barbershop-quartet interlude of ‘Moonlight Bay’ drifting in the background, and summon the ghosts of campaigns past and the raving refrains of the mad, all served with a bullet.

Act I: The Bull Rising

Before the Bull Moose and the bullet there was tradition and restraint. Before Roosevelt charged up the hill and across the plains, there was McKinley’s calm firmament.

William McKinley, 25th President of the United States, governed with a philosophy of calculated prosperity and protective nationalism, fittingly called the Ohio Napoleon, holding folksy court on America’s front porch. He was deliberate and firm but never rash; a Republican loyalist second, a leader first, and a quiet expansionist. A Civil War veteran and devout Methodist, McKinley championed high tariffs, the gold standard, and industrial growth as the pillars of American strength.

His first term (1897–1901) unfolded as an economic recovery from Grover Cleveland’s faltering presidency and the Panic of 1893. It was marked by economic stabilization, the Spanish-American War, and the acquisition of overseas territories: Puerto Rico, Guam, the Philippines, and Hawaii, all additions to America’s imperial structure.

His vice president, Garret Hobart, died of heart failure in 1899 at the age of 55. With no constitutional mechanism to fill the vacancy, the office remained vacant until McKinley’s re-election. It wasn’t until the ratification of the 25th Amendment in 1967 that a formal process was established to replace a vice president.

In 1900, Theodore Roosevelt, then Governor of New York and war hero of San Juan Hill, was chosen as McKinley’s running mate. His nomination was largely a strategy of containment: an attempt to temper his reformist zeal beneath the inconsequential and ceremonial weight of the vice presidency.

Act II: Bull Cometh

The Bull Moose was buried beneath ceremony, but symbols cannot contain momentum. The front porch would give way to the lists and charging steeds.

On September 6, 1901, President William McKinley stood beneath the vaulted glass of the Temple of Music at the Pan-American Exposition in Buffalo, New York, an American shrine to progress, electricity, and imperial optimism. There, in the charged glow of modernity, he was shot twice in the abdomen by Leon Czolgosz, a Polish American and self-declared anarchist, a bitter subject of the Panic of 1893, its mill closures, strikes, and wage collapse etched into his disillusioned psyche.

Czolgosz had been baptized in the radical writings of Emma Goldman, a Lithuanian émigrée and firebrand of the American radical left. Goldman championed anarchism, women’s rights, and sexual liberation. She founded Mother Earth, a journal that became an infamous intellectual hearth for dissent and revolutionary analysis.

To Czolgosz, McKinley was the embodiment of oppression: capitalism, imperialism, and state violence. His answer to these perceived provocations was violence. Concealing a revolver wrapped in a handkerchief, he fired at close range during a public reception, just as McKinley extended his hand in welcome.

Initially, doctors believed McKinley would recover. But gangrene developed around the damaged pancreas, and he died on September 14. His death was slow and tragic, a symbolic collapse of the front-porch presidency.

Roosevelt, just 42, stepped up and became the youngest president in U.S. history (JFK, the youngest ever elected, was 43). With containment at an end, the Bull broke loose. And he mounted the stage with an agenda.

Act III: The Charge of the Bull

The Bull builds a protective legacy of words and stick, sweat and blood.

Roosevelt’s early presidency honored McKinley’s legacy: trust-busting, tariff moderation, and economic expansion. But he soon added his own signature: conservationism, progressive reform, and a bold, moralistic foreign policy.

He preserved 230 million acres of public land and established the U.S. Forest Service, 5 national parks, 18 national monuments, 150 national forests, and a constellation of wildlife refuges. Stewardship of the land became a sacred ideal that continues to the present day.

In foreign affairs, Roosevelt extended the Monroe Doctrine with his Roosevelt Corollary (1904), asserting that the U.S. had the right to intervene in Latin America to prevent “chronic wrongdoing.” It was a doctrinal pivot from passive hemispheric defense against European imperialism to active imperial stewardship, cloaked in the language of civilization and order. America became the self-appointed policeman of the Western Hemisphere.

The corollary was a response to incidents like the 1902 Venezuelan debt crisis, when European navies blockaded ports to force repayment. In Cuba, unrest was quelled with U.S. troops in 1906. Nicaragua, Haiti, and Honduras saw repeated interventions to protect U.S. interests and suppress revolutions. If Latin American states failed to maintain order or financial solvency, the U.S. would intervene to stabilize rather than colonize.

The doctrine justified U.S. dominance of the Panama Canal and set the precedent for Cold War interventions, neutralizing the American backyard while containing Soviet expansion in the east.

Act IV: Hamlet in Milwaukee

Heads of kings rest uneasy. Ghosts of injustice haunt. Princes fall prey.

After winning a full term in 1904, Roosevelt honored his promise not to seek reelection in 1908. But disillusioned with his successor, William Howard Taft, Roosevelt returned to politics in 1912, forming the Progressive Party, nicknamed the Bull Moose Party.

Enter stage left, John Schrank, a former barkeep plagued by visions and imagined slights. In the early morning hours of 15 September 1901, the day after McKinley died, the barkeep dreamt that the slain President rose from his casket and pointed to a shrouded figure in the corner: Roosevelt. “Avenge my death,” the ghost spoke. Schrank claimed to have forgotten the dream for over a decade, until Roosevelt’s bid for a third term in 1912 reawakened the vision, which he now interpreted as a divine command.

Schrank believed Roosevelt’s third-term ambition was a betrayal of the American tradition set forth in Washington’s Farewell Address. He hated Roosevelt and feared that he would win the election, seize dictatorial power, and betray the constitutional republic. In his delusional state, he believed Roosevelt was backed by foreign powers and was planning to take over the Panama Canal, an anachronistic fear, given total U.S. control of the canal since 1904. Schrank interpreted the ghost’s voice as God’s will: “Let no murderer occupy the presidential chair for a third term. Avenge my death.”

At his trial for the attempted assassination of Roosevelt, Schrank was referred to a panel of experts to determine his mental competency. They deemed him insane, a “paranoid schizophrenic” in the language of the time. He was committed to an asylum, where he remained until his death 31 years later.

Schrank’s madness parallels the haunted introspection of Hamlet, Prince of Denmark, in Shakespeare’s longest and most psychologically complex tragedy, which revolves around a ghost’s command: “Revenge my foul and most unnatural murder.” Hamlet, driven by the specter’s charge, spirals into feigned (and perhaps real) madness, wrestling with betrayal, duty, mortality, and metaphysical doubt. His uncle, the murderer, has married his mother: an Oedipal inversion within the world’s most enduring tragedy.

On 14 October 1912, as Roosevelt stood outside Milwaukee’s Gilpatrick Hotel, Schrank stepped forward and fired. The bullet pierced Roosevelt’s steel glasses case and a folded 50-page tome of a speech, slowing its path. Bleeding, with the bullet lodged in his chest, Roosevelt refused medical attention. He stepped onto the stage and spoke for 90 minutes, though it is said that, owing to loss of blood, he shortened his speech out of necessity. Whether for himself or the audience is lost to history.

Unlike Hamlet, who dithers and soliloquizes his way toward a graveyard of corpses, Schrank shoots, hits, and leaves Roosevelt standing. Hamlet’s tragedy ends in death and metaphysical rupture. Schrank’s farce begins with the demands of a ghost and ends with a 90-minute speech. One prince takes his world with him into death. The other absorbs a bullet and keeps talking.

Act V: Ghosts and Republics

Ghosts and republics are ephemeral. At the end of time, those fleeting moments, short and long, some, as Proust says, recurring more and more seldom, are best treated with humor and grace.

In tragedy and near calamity, a man’s soul becomes visible. Some are seen darkly; others bright, clear, unshaken and unafraid of new beginnings, even if that beginning is death.

Roosevelt had already charged up San Juan Hill, bullets and fragments whistling past like invitations to a funeral ball. Each a death marker. So, when a solitary bullet from a madman struck him in Milwaukee, it was merely an inconvenience. He quipped: “Friends, I shall ask you to be as quiet as possible. I don’t know whether you fully understand that I have just been shot, but it takes more than that to kill a Bull Moose.”

Sixty-eight years later, Reagan too survived a bullet to the chest. As he was wheeled into the emergency room at George Washington University Hospital, he said he’d “rather be in Philadelphia,” a throwback to an old vaudeville gag line used on fake tombstones: “Here lies Bob: he’d rather be in Philadelphia.” W.C. Fields once requested it as his epitaph. He’s buried in California. To the surgeons, Reagan added: “I hope you’re all Republicans.”

Where Roosevelt offered mettle, Reagan offered mirth. Both answered violence with theatrical defiance: natural-born and unshakable leaders, unbothered by the ghosts that tracked them.

They were not alone. Jackson beat his would-be assassin with a cane. Truman kept his appointments after gunfire at Blair House. Ford faced two attempts in seventeen days and kept walking. Bush stood unfazed after a grenade failed to detonate. They met their specters with grace, a joke, and a shrug.

The assassins and would-be assassins vanished into the diffusing wisps of history. The leaders of men left a republic haunted not by ghosts, but by a living memory: charged with the courage to endure and to imagine greatness.

Graphic: Assassination of President McKinley by Achille Beltrame, 1901. Public Domain.

Gold in the Middle Kingdom

In November 2024, China’s state media announced the discovery of a “supergiant” gold deposit in the Wangu Gold Field, Hunan Province. Initial exploration and delineation drilling confirmed approximately 300 metric tons (9,645,225 troy ounces) in place. Subsequent geologic modeling suggests that the total resource may exceed 1,000 metric tons (32,150,750 troy ounces), potentially making it the largest known gold deposit in the world.

At the current October 2025 gold price of $4,267.30 per ounce, that equates to about $137.2 billion in gross value, assuming an unrealistic 100% recovery.
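For readers who want to check the arithmetic, here is a minimal Python sketch of the tonnage-to-value conversion; the troy-ounce constant is standard, and the tonnage and price are simply the published estimates quoted above.

```python
# Gross in-place value of the Wangu resource, using the figures quoted above.
TROY_OZ_PER_TONNE = 32_150.7466   # troy ounces in one metric ton

tonnes = 1_000        # upper-end resource estimate, metric tons
price = 4_267.30      # October 2025 gold price, USD per troy ounce

ounces = tonnes * TROY_OZ_PER_TONNE
gross = ounces * price              # assumes an unrealistic 100% recovery

print(f"Resource:    {ounces:,.0f} troy oz")       # ~32,150,747 troy oz
print(f"Gross value: ${gross / 1e9:,.1f} billion") # ~$137.2 billion
```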

But is all this gold recoverable without sinking vast capital only to lose more in the process? Public data remains limited, yet a ballpark estimate is possible.

Benchmarked against global subsurface mining economics, the project, assuming a capital expenditure of $12.5 billion and operating costs of $2,100 per ounce, would be profitable. Its projected return of 17% is respectable but far from spectacular (more on this below). Not the proverbial gold mine, but a respectable sovereign nest egg nonetheless.

However, when factoring in a 40% chance of technical success, the project’s risk-adjusted return drops below 7%, falling short of the industry’s typical 10% threshold. In economic terms, the project fails, at least under current conditions and postulated costs.

The deposit is hosted in Neoproterozoic (1 billion to 538 million years old) sandy and silty slates within the Jiangnan orogenic belt. It comprises over 40 quartz-sulfide veins, located at depths of 2,000-3,000 meters (6,500-9,850 feet) and associated with northwest-trending faults.

The main ore body, V2, averages 1.76 meters in thickness; the other veins range from 0.5 to 5 meters, with a maximum of 14 meters, and collectively span several square kilometers (the exact areal extent remains unpublished). The published average gold grade is 6-8 grams of gold per ton, with exceptionally rich veins reaching a world-class 138 grams per ton.

At depths of 2,000-3,000 meters, Wangu enters the realm of ultra-deep mining. Compounding that depth challenge is a blistering geothermal gradient, placing the gold-bearing rock in a roasting 110-200 degrees Celsius (230-392 degrees Fahrenheit), temperatures far beyond human endurance without extreme and prohibitively expensive cooling. Robotic retrieval of the resource becomes essential.

To reduce human risk in high-temperature zones, autonomous mining systems will be the default standard. These will include robotic cutters and remote rock loaders, guided by AI software to navigate the narrow veins. Engineering challenges abound: thermal degradation of electronics, lubricant breakdown, sensor failures, and a multitude of other factors. Even in a robotic environment, cooling infrastructure, such as ice-slurry plants and high-capacity ventilation, will likely be required, adding significantly to overall operating costs.

At these depths in a highly faulted regime, rock plasticity and instability add to the risk and costs of recovery.

Wangu’s extreme technical demands evoke parallels with deepwater oil exploration and spaceflight, domains where success has come only through phased engineering, initial high costs, and extensive testing. The project may draw on space-grade alloys and ceramics, deepwater telemetry and control, thermal shielding from reentry vehicles, and autonomous navigation from off-Earth rovers.

China’s mining expertise and Hunan’s infrastructure (power grids, skilled labor, automated systems) may mitigate some of these challenges. Still, the scale and depth of the deposit suggest a complex, phased engineering operation. Development will likely proceed vein-by-vein, shallow to deep, prioritizing high-grade zones to maximize early returns and to refine the learning curve.

Estimating a timeline for this project involves multiple phases. Feasibility studies, including geotechnical, thermal, and remote-sensing analysis, could run from 2028 to 2030. With state support, permitting and financing may be expedited, taking only 1 or 2 years. Construction of shafts, cooling systems, and robotic infrastructure may take another 5-8 years. Commissioning, de-bottlenecking, and problem-solving would add another 1-2 years before peak capacity is reached.

If all proceeds smoothly, first gold may be achieved in 12-15 years. However, given the extreme technical challenges, a more realistic horizon is 15-20 years, putting first gold somewhere between 2040 and 2045.

Achieving first gold will likely require $10-15 billion in capital expenditure, with operating costs estimated at $1,800-2,400 per ounce over a 20-year life of mine and 90% resource recovery. Assuming a starting gold price of $4,270 per ounce and 5% annual growth, the project yields an initial IRR of about 17%. But when factoring in the 40% chance of technical success across geotechnical, thermal, and robotic domains, the risk-adjusted IRR drops below 7%, rendering the project uneconomic under current conditions. Expect years of recycling before this project is formally sanctioned.
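The mechanics of that risk adjustment can be sketched in a few lines of Python. To be clear, this is not the essay’s model: the five-year build schedule, flat annual production, tax-free cash flows, and the expected-cash-flow treatment of the 40% chance of success are illustrative assumptions of mine, so the printed rates show how risk-weighting compresses returns rather than reproduce the 17% and sub-7% figures exactly.

```python
def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection; cashflows[t] is the year-t cash flow."""
    def npv(r):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:   # NPV still positive: the rate can go higher
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Inputs quoted in the essay; the timing spread is my own guess.
capex_total = 12.5e9            # USD, spread evenly over a 5-year build
build_years = 5
life_years = 20
resource_oz = 32_150_750        # 1,000-ton resource in troy ounces
recovery = 0.90
annual_oz = resource_oz * recovery / life_years
opex = 2_100.0                  # USD per ounce
price0 = 4_270.0                # starting gold price, USD per ounce
growth = 0.05                   # assumed annual gold price growth
p_success = 0.40                # chance of technical success

build = [-capex_total / build_years] * build_years
ops = [annual_oz * (price0 * (1 + growth) ** (build_years + t) - opex)
       for t in range(life_years)]

unrisked = build + ops
# Crude risk weighting: capex is spent either way; revenue arrives with p = 0.4.
risked = build + [p_success * cf for cf in ops]

print(f"Unrisked IRR: {irr(unrisked):.1%}")
print(f"Risked IRR:   {irr(risked):.1%}")
```

Swapping in longer pre-production timelines, taxes, or sustaining capital, none of which are published for Wangu, pushes both rates down quickly, which is presumably how the fuller model lands at 17% unrisked and below 7% risked.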

Still, in a world increasingly skeptical of fiat currencies, Wangu is more than a source of gold: it is a sovereign hedge, a deep Chinese vault of wealth to anchor a post-fiat strategy.

By way of comparison, Fort Knox reportedly holds 147.3 million troy ounces of gold. Additional U.S. government holdings in Denver, New York, West Point, and other sites bring the total to 261.5 million troy ounces, worth roughly $1.1 trillion at today’s prices. The Chinese government officially holds about 74 million troy ounces, worth about $315.6 billion. Wangu could theoretically increase China’s gold holdings by 43%.
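Checking that comparison is a one-liner per figure; the holdings and price are the numbers quoted above, and the uplift assumes the full 1,000-ton resource proves out.

```python
price = 4_267.30          # USD per troy ounce
us_oz = 261.5e6           # total U.S. government holdings, troy oz
china_oz = 74e6           # official Chinese government holdings, troy oz
wangu_oz = 32_150_750     # full Wangu resource, troy oz

print(f"U.S. holdings:  ${us_oz * price / 1e12:.1f} trillion")      # ~$1.1 trillion
print(f"China holdings: ${china_oz * price / 1e9:.0f} billion")     # ~$316 billion
print(f"Wangu uplift:   {wangu_oz / china_oz:.0%} of China's gold") # ~43%
```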

Graphic: Gold veins in a host rock.

Thrilla in Manila: 50 Years On

The greatest heavyweight fight ever, likely the greatest fight, period, and certainly the most brutal, was slugged out 50 years ago today in the Philippines. The Thrilla in Manila, as it came to be known, between Muhammad Ali, age 33, and Joe Frazier, age 31, was their third and final meeting, the rubber match: Frazier took the first, Ali the second, and this was for the belt. The scheduled 15-round contest was held at the Araneta Coliseum in Cubao, Quezon City, with temperatures reaching a roasting, exhausting, debilitating 100 degrees Fahrenheit (38 degrees Celsius).

Ali won by corner retirement (RTD), also known as a corner stoppage, after Frazier’s chief second, Eddie Futch, asked the referee to stop the fight between the 14th and 15th rounds. This bout is almost universally regarded as one of the greatest and most punishing in boxing history.

With the series tied, the third encounter in the sweltering Philippine heat drove both men to the brink of collapse. They exchanged unremitting punishment, refusing to yield, superhuman fury and drive overriding physical endurance.

Ali dazzled early and late: his patented “rope-a-dope” strategy, occasional dancing, but mostly flat-footed work, measuring with the left and delivering crunching right leads in an insolent rhythm. Frazier’s aggressive left hooks, and especially his punishing body blows, found their mark in the middle rounds, battering Ali with relentless force. By the 14th, Frazier’s eyes were nearly swollen shut, and Ali was exhausted, yet still able to summon his signature dance: bobbing, weaving, taunting with energy he no longer possessed.

After that round, Ali slumped in his corner, exhaustion permeating his entire being. He asked his trainer Angelo Dundee to cut off his gloves: a submission to body over mind. But before Dundee could act, Eddie Futch stepped in and stopped the fight. Mercifully recognizing the unprecedented brutality of the contest, Futch told Frazier, “No one will ever forget what you did here today.”  Ali later admitted, “Frazier quit just before I did. I didn’t think I could fight any more.”

Howard Cosell, ABC’s sportscaster for the fight, commented: “A brutal, unrelenting war between two men who gave everything they had—and then some.”

The official scorecards at the end of the 14th round:

Referee Carlos Padilla: 66-60 for Ali

Judge Larry Nadayag: 66-62 for Ali

Judge Alfredo Quiazon: 67-62 for Ali

Ali summed up the fight with poignant clarity: “It was the closest thing to dying that I know of.”

The final bell rang. The two fighters went down in history as titans of the ring: survivors of the most brutal fight ever fought.

(Postscript: For the fight, Ali earned about $9 million and Frazier about $5 million in 1975: a pittance by today’s standards.)

Source: Thrilla in Manila, ABC TV fight broadcast with Howard Cosell. Watch here: https://youtu.be/MaRNsNzqsJk  Graphic: Ali-Frazier Fight, PLN Media.

The Lost Boys

The end of the Peloponnesian War in 404 BC marked the end of Athens’ Golden Age. Most historians agree that the halcyon days of Athens were behind her. Some, however, such as Victor Davis Hanson in his multi-genre meditation A War Like No Other, a discourse on military history, cultural decay, and philosophical framing, offer a more nuanced view: Athens was still capable of greatness, but the lights were dimming.

During the six decades after the war, Athens rebuilt. Its navy reached new heights. Its long walls were rebuilt within a decade. Aristophanes retained his satirical edge, even if it was a bit more reflective. Agriculture returned in force. Even Sparta reconciled with Athens, or vice versa, recognizing once again that the true enemy was Persia.

Athens brought back its material greatness, but its soul was lost. What ended the Golden Age of Athens wasn’t crumbled walls or sunken ships. It was the loss of the lives that carried the memory, the virtuosity, of greatness. With them vanished generational continuity, civic pride, and a religious belief in the polis. The meaning, truth, and myth of Athenian exceptionalism died with their passing. The architects of how to lead a successful, purpose-driven civilization had disappeared, mostly through death by war or state, but also by plague.

In A War Like No Other, Hanson lists many of the lives lost to and during the war, men who took much of Athens’ exceptionalism with them to their graves. Below is a partial listing of Hanson’s more complete rendering, with some presumptuous additions.

Alcibiades was an overtly ambitious Athenian strategist: brilliant, erratic, and ultimately treasonous. He championed the disastrous Sicilian expedition, Athens’ greatest defeat. Over the course of the war, he defected multiple times, serving Athens, then Sparta, then Persia, before returning to Athens. He was assassinated in Phrygia around 404 BC while under Persian protection, many believe at the instigation of the Spartan general Lysander.

Though he did not fight in the war, Euripides exposed its brutality and hypocrisy in plays such as The Trojan Women and Helen. The people were not sufficiently appreciative of his war opinions or his plays: he won only four firsts at the Dionysia, compared to 24 for Sophocles and 13 for Aeschylus. Disillusioned, he went into self-imposed exile in Macedonia and died there around 406 BC, by circumstances unknown.

The execution of the generals of Arginusae remains a legendary example of Athenian arbitrary retribution: proof that a city obsessed with ritualized honor could nullify military genius, and its future, in a single stroke. The naval Battle of Arginusae, fought in 406 BC east of the Greek island of Lesbos, was the last major Athenian victory over the Spartans in the Peloponnesian War. Athenian command of the battle was split among eight generals: Aristocrates, Aristogenes, Diomedon, Erasinides, Lysias, Pericles the Younger (son of Pericles), Protomachus, and Thrasyllus. After their victory over the Spartan fleet, a storm prevented the Athenians from recovering the survivors, and the dead, from their sunken ships. The six generals who returned to Athens were all executed for their negligence. Protomachus and Aristogenes, likely knowing their fate, chose not to return and went into exile.

Pericles, the flesh-and-blood representation of Athens’ greatness, was the statesman and general who led the city-state during its golden age. He died of the plague in 429 BC during the war’s early years, taking with him the vision of democratic governance and Athens’ exceptionalism. His three legitimate sons all died during the war: the two oldest likely of the plague around 429 BC, while Pericles the Younger was executed for his part in the Battle of Arginusae.

Socrates, the world’s greatest philosopher (yes, greater than Plato or Aristotle), fought bravely in the war, but he was directly linked to the traitor Alcibiades. He was tried and executed in 399 BC for corrupting the youth and not giving the gods their due. That was all pretense. Athens desired to wash its collective hands of the war, and Socrates was a very visible reminder of it. He became a ritual scapegoat swept up in the collective expurgation of the war’s memory.

Sophocles, already a man of many years by the beginning of the war, died in 406 BC at the age of 90 or 91, a few years before Athens’ final collapse. His tragedies embodied the ethical and civic pressures of a society unraveling. With the deaths of Aeschylus in 456 BC, Euripides in 406 BC, and Sophocles soon after, the golden age of Greek tragedy came to a close.

Thucydides, author of the scholarly standard on the Peloponnesian War, was exiled after ‘allowing’ the Spartans to capture Amphipolis. He survived the war, and the plague, but never returned to Athens. His History breaks off in mid-sentence in 411 BC. He lived until around 400 BC, and no one really knows why he didn’t finish his account of the war. Xenophon picked up where Thucydides left off and finished the war in the first two books of his Hellenica, composed sometime in the 380s BC.

The Peloponnesian War ended Athens’ greatest days. The men who kept its lights bright were gone. Its material greatness returned, glowing briefly, but its civic greatness, its soul, slowly dimmed. It was a candle in the wind of time that would be rekindled elsewhere. The world would fondly remember its glory, but Athens had lost its spark.

Source: A War Like No Other by Victor Davis Hanson, 2005. Graphic: Alcibiades Being Taught by Socrates, François-André Vincent, 1776. Musée Fabre, France. Public Domain.

Phalanx: Discipline in Geometry

Near the ancient Sumerian city of Girsu, midway between present-day Baghdad and Kuwait City, stood a battle marker: the Stele of Vultures, now housed in the Louvre. It commemorates Lagash’s 3rd-millennium BC victory over Umma. The stele derives its name from the monument’s carved vultures flying away with the heads of the dead. It also depicts soldiers of Lagash marching in a dense, shield-to-shield formation, holding spears chest-high and horizontal, led by their ruler, Eannatum, who commissioned the stele in 2460 BC. The importance of the stele, though, is that it is the first known visual depiction of a phalanx in battle. The phalanx as a military tactic is believed to be much older.

The phalanx was more than a combat formation; it was a battlefield philosophy enshrining discipline and courage over strength, the unity of the team over the individual. A dense, rectangular wall of men, generally 8 deep, stretched across the battlefield to protect against flanking maneuvers. Each man wore heavy armor of leather and bronze, helmet, cuirass, and greaves, and was armed with a spear and a short sword. But the breakthrough that brought the phalanx great renown was the aspis, a round shield invented for the Greek hoplite in the 8th or 7th century BC. With its dual grip, a forearm strap and central handhold, it allowed the infantryman precise control of his shield, helping create an impenetrable barrier of bronze and bone against the oncoming enemy’s spears and swords. It transformed the phalanx from a purely offensive wall of attack into a defensive engine of defiance as well.

The phalanx succeeded only through cohesion. When courage and discipline held, the formation, with the aspis as its core defense, was practically unbeatable on confined terrain. It overcame the enemy as a seamless, tight mass executing a relentless forward march into the belly of the opposing beast. But it was only as strong as its weakest link. Once discipline faltered and cohesion broke, the formation collapsed, and the opposing army ran it to ground. Victory belonged not to brute force, but to the combined strength of the military unit. Teams won, individuals lost.

From the late 8th century BC onward, Greek phalanxes were manned by hoplites: citizen soldiers, generally landowners and farmers. Whether the phalanx emerged in Sparta or Argos, was imported from Sumeria, or was born of parallel discovery in Greece, phalanx battles initially were confined, blunt, and deadly affairs. They devolved into fierce pushing masses of brawn, bone, and metal until one side broke. Heavy casualties came when the enemy lines broke and soldiers fled helter-skelter in shock and chaos, pursued by the victors for plunder, unless the victors were restrained by honor.

The phalanx became the standard that destroyed the mighty Persian army at Marathon and made the stand at Thermopylae possible early in the 5th century BC. At Marathon in 490 BC, 10,000 Athenians and 1,000 Plataeans stretched out their formation to match the breadth of 26,000 Persians, filling the Marathon plain and denying both armies any room for flanking movements.

The Greeks stacked their wings with additional rows of hoplites and thinned them progressively toward the center, creating a convex crescent. The Greek wings advanced faster than the center, generating a pincer movement that collapsed on the Persian center. When the dust settled, 192 Athenians and 11 Plataeans were lost, while Persian losses were put at approximately 6,400.

In the 19th century, Napoleon, possibly improvising on the encircling tactics developed at Marathon, would invert the formation: a concave arrangement with a strong center and weaker wings. His strategy was to split the enemy’s center with strength and attack the divided ranks on the flanks. The tactic worked until Wellington at Waterloo.

At Marathon, unity triumphed through geometric discipline. At Thermopylae, the formation bought time and ended with a sacrifice that presaged the end of Persian hubris.

During the second Persian invasion in 480 BC, Darius’s son Xerxes, with 120,000-300,000 men, attacked a contingent of 7,000 Greeks at Thermopylae. The Greeks held back the Persian advance like a cork in a bottle, using a rotating phalanx of roughly 200 men to defend a narrow pass for two days, until betrayal by Ephialtes exposed their flank and they were destroyed in an inescapable Persian barrage of arrows. Greek losses were estimated at 4,000 men, including Leonidas’ 300 Spartans, against 2,000-4,000 Persians (beginning and ending estimates for manpower strength vary widely).

The Greeks’ defiant stand at Thermopylae allowed the Greek navy to regroup at Salamis, where it won a decisive victory over the Persian fleet. A year later, the Greeks at Plataea crushed the Persians’ quest for a Hellenic satrapy.

The phalanx endured for another century, including service in the Peloponnesian War, where it remained lethal but of limited effect. Then came Epaminondas at Leuctra in 371 BC, transforming the phalanx into a machine that erased Sparta’s mighty reputation. Typically, each army’s phalanx strength was concentrated on its right wing, so that the strongest part of a force always faced off against the weaker wing of the opposition. What Epaminondas did was say nuts to that.

He reversed the order and created an oblique formation, more triangular than rectangular, with his strongest troops on the left wing. His left wing was stacked 50 deep while his center and right wing were kept thin. The 50-deep column was aimed directly at Sparta’s best under the command of King Cleombrotus (in those days officers and kings fought in the front rows of the phalanx). As the phalanxes advanced, Epaminondas kept his right wing stationary, creating an asymmetrical front. The left wing broke easily through Sparta’s right wing, killing Cleombrotus and collapsing the superior flank. At that point Epaminondas’s wing pivoted inward, creating an enveloping arc around the remnants of Sparta’s phalanx and effectively ending the Spartan myth of invincibility.

Epaminondas’ tactics shortened battles and reduced casualties. His innovations proved that properly trained and equipped citizen soldiers could defeat professional warriors, while instilling a new civic honor through restraint and discipline. His oblique formation allowed landowners and farmers to settle their disputes, usually in a few hours or less, with minimal loss, and return to their farms in time for the harvest. Epaminondas not only brought asymmetrical tactics to the battlefield but shattered claims of superiority by employing the unexpected.

As the Golden Age of Athens and western civilization’s Greek center waned and Roman hegemony rose, the phalanx evolved again. The Greek phalanx gave way to the Roman manipular system, a staggered checkerboard pattern enabling units to rotate, reinforce, or retreat as needed. It was a needed refinement and improvement to the phalanx, more effective on open plains and less susceptible to cavalry and arrows.

Then came Hannibal to Cannae in 216 BC. During the 2nd Punic War, he upended the war cart of tactics once again and ruthlessly exploited Rome’s refinements.

Hannibal’s improvisation on the phalanx’s maneuvering tactics, though not the formation itself, showed that he had studied Marathon. Instead of a convex line with strong wings and a weak center, he developed a concave line with strong wings and a weak center. He allowed the center to fall back, and the Romans unwittingly obliged by surging into it. With the Romans committed, Hannibal’s deception encircled them with precision and brutal lethality. The Romans were annihilated on the field, losing somewhere between 50,000 and 70,000 killed and another 10,000 captured. Hannibal lost 6,000-8,000 men (again, estimates vary). Then came the 3rd Punic War.

The phalanx began as a wall of spears and shields, a bulwark of bronze and bone. Its stunning victories echo through history’s scholarly halls and hallowed plains of death and destruction. Yet its Achilles’ heel, vulnerable flanks and precise terrain requirements, proved incompatible with horses and gunpowder.

Still, its legacy of discipline and unity endures. Born of necessity, refined through rigor, and studied for centuries, the phalanx stands as a testament to Aristotle’s enduring insight, slightly abridged but still profound: ‘The whole is greater than the parts.’ And perhaps the Romans said it best: ‘E pluribus unum’, ‘out of many, one.’

Source: A War Like No Other by Victor Davis Hanson, 2005. Et al. Graphic: Stele of Vultures.

Drunken Monkey Hypothesis–Good Times, Bad Times

In 2004, biologist Robert Dudley of UC Berkeley proposed the Drunken Monkey Hypothesis, a theory suggesting that our attraction to alcohol is not a cultural accident but an evolutionary inheritance. According to Dudley, our primate ancestors evolved a taste for ethanol (grain alcohol) because it signaled ripe, energy-rich, fermenting fruit, a valuable resource in dense tropical forests. Those who could tolerate small amounts of naturally occurring ethanol had a foraging advantage, and thus a caloric advantage. Over time, this preference was passed down the evolutionary tree to us.

But alcohol’s effects have always been double-edged: mildly advantageous in small doses, dangerous in excess. What changed wasn’t the molecule, it was our ability to concentrate, store, and culturally amplify its effects. Good times, bad times…

Dudley argues that this trait was “natural and adaptive,” but only because we didn’t die from it as easily as other species. Ethanol is a toxin, and its effects, loss of inhibition, impaired judgment, and aggression, are as ancient as they are dangerous. What may once have helped a shy, dorky monkey approach a mate or summon the courage to defend his troop with uncharacteristic boldness now fuels everything from awkward first dates and daring athletic feats to bar fights and the kind of stunts or mindless elocutions no sober mind would attempt.

Interestingly, alcohol affects most animals differently. Some life forms can handle large concentrations of ethanol without impairment, such as Oriental hornets, which are just naturally nasty, no chemical enhancements needed, and yeasts, which produce alcohol from sugars. Others, like elephants, become particularly belligerent when consuming fermented fruit. Bears have been known to steal beer from campsites, party hard, and pass out. A 2022 study of black-handed spider monkeys in Panama found that they actively seek out and consume fermented fruit with ethanol levels of 1–2%. But for most animals, plants, and bacteria, alcohol is toxic and often lethal.

Roughly 100 million years ago in the Cretaceous, flowering plants evolved to produce sugar-rich fruits, nectars, and saps, highly prized by primates, fruit bats, birds, and microbes. Yeasts evolved to ferment these sugars into ethanol as a defensive strategy: by converting sugars into alcohol, they created a chemical wasteland that discouraged other organisms from sharing in the feast.

Fermented fruits can contain 10–400% more calories than their fresh counterparts. Plums (used in Slivovitz brandy) show some of the highest increases. For grapes, fermentation can boost calorie content by 20–30%, depending on original sugar levels. Those sugar levels are influenced by climate: warm, dry growing seasons with abundant sun and little rainfall produce sweeter grapes, which in turn yield more potent wines. This is one reason why Mediterranean regions have long been ideal for viticulture and winemaking, from ancient Phoenicia to modern-day Tuscany, Rioja, and Napa.

The story of alcohol is as ancient as civilization itself. The earliest known fermented beverage dates to 7000 BC in Jiahu, China, a mixture of rice, honey, and fruit. True grape wine appears around 6000 BC in the Caucasus region (modern-day Georgia), where post-glacial soils proved ideal for vine cultivation. Chemical residues in Egyptian burial urns and Canaanite amphorae prove that fermentation stayed with civilization as time marched on.

Yet for all its sacred and secular symbolism, Jesus turning water into wine, wine sanctifying Jewish weddings, or simply easing the awkwardness of a first date, alcohol has always walked a fine line between celebration and bedlam. It is a substance that amplifies human behavior, for better or worse. Professor Dudley argues that our attraction to the alcohol buzz is evolutionary: first a reward for seeking out high-calorie fruit and modulating fear in risky situations, it eventually became a dopamine high pursued as an end in itself.

Source: The Drunken Monkey by Robert Dudley, 2014.

My Name is Legion

Beelzebub has been wandering through western civilization since the Philistines appeared on the scene in the 12th century BC. The polytheistic Philistines of Ekron, one of their five cities within Canaan, worshiped Beelzebub, Baal-Zebub in the Philistine language, as a minor god of healing and protection from diseases, mainly those carried by flies. In the Semitic languages Beelzebub was literally known as the “Lord of the Flies”. (Some interpretations in Indo-European languages render Beelzebub as a friendlier Lord of the Jungle.)

As monotheistic traditions took root in Canaan, Beelzebub shifted from a protective deity to a purveyor of evil, demonized within emerging Jewish thought. By the 9th century BC, the prophet Elijah condemned Israel’s King Ahab and the prophets of Baal for worshiping this god rather than the true God of the Jews. By the time of the New Testament, which mentions him seven times across Matthew, Mark, and Luke, he was associated with Satan, the emperor of Hell.

In Matthew and Mark, the Pharisees accused Jesus of casting out demons by the power of Beelzebub, the “Prince of the Demons”. Jesus countered by exclaiming that “Every kingdom divided against itself is brought to desolation, and every city or house divided against itself will not stand.” His response backed the Pharisees into a corner: if they admitted that Jesus was casting out demons by God’s power, they would have to acknowledge his divine authority. But if they insisted he was working with Satan, they would have to explain why Satan would undermine his own influence: a house divided will not stand. (Lincoln, in an 1858 speech, used the same words with a moral rather than religious meaning, granted that is a very fine line: “A house divided against itself cannot stand,” suggesting that the evils of slavery would lead to the collapse of the country.)

Between the 15th and 17th centuries Beelzebub was transformed into one of the seven princes of Hell: Lucifer the Emperor, Satan, Leviathan, Belphegor, Mammon, Asmodeus, and Beelzebub. Beelzebub represented the deadly sins of gluttony and envy.

In modern times Beelzebub remains a symbol of evil in literature and culture. John Milton’s Paradise Lost casts him as a chief demon, and William Golding’s Lord of the Flies reaches back to the more ancient meaning, associating him with corruption and destruction.

From an ancient minor Philistine god, to Satan in the time of Jesus, to a major Christian demon in medieval times, and back to Satan himself in modern times: Beelzebub’s transformation reflects the shifting religious and cultural landscapes of millennia, but demons will always have a name. In Mark 5:9, Jesus asks a possessed man, “What is your name?” The demon responds, “My name is Legion, for we are many.”

Graphic: Satan and Beelzebub by William Hayley, Jean Pierre Simon, Richard Westall: Paradise Lost. Public Domain.

Tripping

Albert Hofmann, employed by Sandoz Laboratories in Basel, Switzerland, was conducting research on ergot, a toxic fungus, in 1938 to identify potential circulatory and respiratory stimulants. While synthesizing compounds derived from the fungus, he inadvertently created lysergic acid diethylamide (LSD), an alkaloid of the ergoline family, compounds known for their physiological effects on the human nervous system.

Five years later, on April 16, 1943, Hofmann became the first person to experience the hallucinogenic effects of LSD while re-synthesizing the compound. He accidentally absorbed a small amount through his skin, leading to vivid hallucinations he later described as a dreamlike state with kaleidoscopic visuals. With two groundbreaking lab accidents occurring five years apart, The Daily Telegraph ranked Hofmann the greatest living genius in 2007.

During the counter-cultural movement of the 1960s, LSD emerged as a popular recreational drug, attracting advocates such as Timothy Leary, a Harvard psychologist who famously urged people to “Turn on, tune in, drop out.” Leary championed the use of psychedelics to explore altered states of consciousness and challenge conventional societal norms. LSD also played a pivotal role in Ken Kesey’s novel One Flew Over the Cuckoo’s Nest, which focused on the horrific abuse of patients in mental institutions. The book, later adapted into a film starring Jack Nicholson, significantly raised public awareness of that cruelty. However, LSD’s trajectory took a sinister turn beyond recreation when it became a tool for government mind-control experiments.

Starting in the 1950s, the CIA launched MKUltra, a covert program designed to explore drugs and techniques for breaking down individuals psychologically. LSD became a central component of these experiments, often administered secretly to unsuspecting individuals to study its effects. Targets included prisoners, drug addicts, prostitutes, military personnel, CIA employees, and even random civilians. It is difficult to ascertain which acronym took the greater hit to its reputation: the CIA or LSD.

Source: Albert Hofmann by Morgan and Donahue, All That’s Interesting, 2025. Graphic: Albert Hofmann in 1993.

Real Not Real

“Have no fear of perfection; you’ll never reach it.” – Dalí.

Salvador Dalí was the entertaining, surrealist voice of the masses. His dreamlike spectacle of melting clocks and flamboyant persona captivated popular culture, injecting eccentric brushstrokes into the lives of the disengaged and disinterested. Dalí spoke directly to the public’s fascination with dreams and absurdity, transforming art into a theatrical experience and a giggly poke at the eminent egos on high altars.

Dalí was a 20th-century Spanish artist who drew from influences such as Renaissance art, Impressionism, and Cubism, but by his mid-twenties, he had fully embraced Surrealism. He spent most of his life in Spain, with notable excursions to Paris during the 1920s and 1930s and to the United States during the World War II years. In 1934, he married the love of his life, Gala. Without her, Dalí might never have achieved his fame. She was not just his muse but also his agent and model: a true partner in both his art and life. Together, they rode a rollercoaster of passion and creativity, thrills and dales, until her death in 1982.

Dalí had strong opinions on art, famously critiquing abstract art as “inconsequential.” He once said, “We are all hungry and thirsty for concrete images. Abstract art will have been good for one thing: to restore its exact virginity to figurative art.” He painted images that were real, with context that bordered on the not real, the surreal. For those who believed that modern abstract art had no life, no beauty, no appeal, he provided a bridge back to a coherent emotional foundation with a dreamlike veneer, incorporating spirituality and innovative perspectives into his dreams and visions of life.

The Persistence of Memory (1931) is Dalí’s most recognizable and famous painting, but his 1951 work Christ of Saint John of the Cross is arguably his most autobiographical and accessible piece. It is a painting dripping with meaning and perspective; Dalí claimed it came to him in a dream inspired by Saint John of the Cross’s 16th-century sketch of Christ’s crucifixion. The perspective is indirectly informed by Saint John’s vision, while the boat and figures at the bottom reflect influences from Le Nain and Velázquez. The triangular shape created by Christ’s body and the cross represents the Holy Trinity, while Christ’s head, a circular nucleus, signifies unity and eternity: “the universe, the Christ!” Dalí ties himself personally to the crucifixion by placing Port Lligat, his home, in the background. He considered this painting a singular and unique piece of existence, one he likely could never reproduce, because the part of him that went into the painting was gone forever. That part is shared with his viewers, offering a glimpse into Christ’s pain, Dalí’s anguish, and his compassion: an emotional complexity that transcends mortal comprehension.

Source: Salvador Dali by Robert Descharnes, 1984. Graphic: Christ of Saint John of the Cross, Dali, 1951. Low Res. Copyright Glasgow Corporation.