Monroe Doctrine

In 1823, President James Monroe issued what became known as the Monroe Doctrine, warning European powers against further colonization or interference in the New World. Though never codified into law or treaty, the doctrine became a guiding principle of U.S. foreign policy, invoked and reinterpreted by successive administrations to assert American influence in the hemisphere. Theodore Roosevelt expanded it, Barack Obama’s administration declared it obsolete, and Donald Trump revived its assertive tone. Its malleability is hailed by some as its strength, denounced by others as its greatest flaw.

The Monroe Doctrine became a symbolic fence around the Western Hemisphere, a firewall against nineteenth‑century imperial powers. Over the next two centuries, it evolved through corollaries, confrontations, and periods of dormancy. Today, in the shadow of Chinese expansion, mainly through its Belt and Road Initiative, Latin American states are drawn to twenty‑first‑century infrastructure with age‑old colonialism lurking in the background. But China’s influence-buying in the hemisphere is aimed directly at the United States, seeking to erode its traditional dominance and reshape regional loyalties.

The Monroe Doctrine was intended to thwart enemies, potential and real, at the gate. With the exception of Cuba, it largely succeeded through the twentieth century. The 21st century now poses a test of whether the doctrine still has teeth.

If conflict with China is fated, then the United States must first secure its own backyard. The Western Hemisphere cannot be a distraction or a liability, a source of angst and trouble. Before turning its full strategic gaze toward the Middle Kingdom, the U.S. must seal the gates of the New World.

The Monroe Doctrine was written mainly by President Monroe’s Secretary of State, John Quincy Adams. It aimed to support Latin American independence movements from Spain and Portugal, while discouraging Russian influence in the Pacific Northwest and preventing the Holy Alliance (Russia, Austria, Prussia, and France) from restoring monarchies in the Americas. But the doctrine was not all sword: the United States also pledged not to interfere in Europe’s internal affairs or its colonies.

In the early 1800s, the United States lacked the ability to enforce such a bargain militarily. Britain, however, was more than willing to use its naval fleet to guarantee access to New World markets and discourage competition.

By the beginning of the twentieth century, Theodore Roosevelt invoked and expanded the doctrine, effectively making the United States the policeman of the Western Hemisphere. During the Cold War, it was used to counter Soviet influence in Cuba, Nicaragua, and Grenada.

By the 1970s the South American drug trade had been declared a national security threat, and the War on Drugs began, with Colombia as the epicenter of hostilities. In 1981, the U.S. Congress amended the Posse Comitatus Act to allow military involvement in domestic drug enforcement, an authority that later extended to Latin America. President Ronald Reagan’s 1986 National Security Decision Directive 221 declared drug trafficking a U.S. national security threat, authorizing military operations abroad, including in Colombia.

After the Cold War, the doctrine faded from explicit policy. In November 2013, Secretary of State John Kerry declared at the Organization of American States that “the era of the Monroe Doctrine is over,” framing a shift toward partnership and mutual respect with Latin America rather than unilateral dominance. By 2020 Colombia’s coca production had hit a new high.

Today, China’s Belt and Road Initiative, port construction and acquisitions, telecom infrastructure, and rare earth diplomacy have carved influence into Latin America and the Caribbean. In this context, the Monroe Doctrine was not asleep but in a coma, its toes occasionally twitching.

Re-invigorating the Monroe Doctrine is not about making true allies and friends but removing vulnerabilities. The goal is not to bring these nations into the fold but to remove them from Beijing’s orbit.

By mid-2025, official statements claim that ~10% of the U.S. Navy is deployed to counter drug threats, ostensibly from Venezuela and Colombia. But fleet positioning hints at a different story. Most assets are stationed near Puerto Rico, the Virgin Islands, and Guantánamo Bay, closer to Cuba than Caracas. Surveillance flights, submarine patrols, and chokepoint monitoring center on the Florida Straits, Windward Passage, and Yucatán Channel.

This may suggest strategic misdirection. Venezuela is the declared theater, but Cuba is the operational keystone. The U.S. may be deflecting attention from its true concern: Chinese or Russian entrenchment in Cuba and the northern Caribbean.

The Monroe Doctrine began as a warning to monarchs across the Atlantic. In the late twentieth century, it morphed into a war on drugs. Today it reappears as a repurposed drug war, flickering as a warning to Beijing across the Pacific. Whether it awakens as policy or remains sleight of hand, its enduring role is to remind the world that the Western Hemisphere is not a theater for distraction but a stage the United States will guard against intrusion. In the twenty‑first century, its test is not whether it can inspire allies, but whether it can deny adversaries a foothold in America’s backyard.

Graphic: Monroe Doctrine by Victor Gillam, 1896. Public Domain.

Hamlet Goes to Milwaukee—A Tragicomedy in Five Acts

Prolepsis’ Prologue:

The Chorus enters. A single spotlight. A single Damocles’ bullet hangs in the air like a haunted ghost spinning to history’s rhythms and trajectories.

CHORUS:

John Schrank shoots Theodore Roosevelt, 113 long and mostly forgotten years ago, in Milwaukee, Wisconsin, on a sharp and chilled Monday, coats pulled tight, 14 October 1912.

That’s the end, my friend, or so it seems. But tragedy demands context, and context demands sacramental passings. Let us reset and reconfigure the scene, with a sentimental barbershop-quartet interlude of ‘Moonlight Bay’ drifting in the background, and summon the ghosts of campaigns past and the raving refrains of the mad, all served with a bullet.

Act I: The Bull Rising

Before the Bull Moose and the bullet there was tradition and restraint. Before Roosevelt charged up the hill and across the plains, there was McKinley’s calm firmament.

William McKinley, 25th President of the United States, governed with a philosophy of calculated prosperity and protective nationalism, fittingly called the Ohio Napoleon, holding folksy court on America’s front porch. He was deliberate and firm but never rash: a leader first, a Republican loyalist second, and a quiet expansionist. A Civil War veteran and devout Methodist, McKinley championed high tariffs, the gold standard, and industrial growth as the pillars of American strength.

His first term (1897–1901) unfolded as an economic recovery from Grover Cleveland’s faltering presidency and the Panic of 1893. It was marked by economic stabilization, the Spanish-American War, and the acquisition of overseas territories: Puerto Rico, Guam, the Philippines, and Hawaii, all additions to America’s imperial structure.

His vice president, Garret Hobart, died of heart failure in 1899 at the age of 55. With no constitutional mechanism to fill the vacancy, the office remained vacant until McKinley’s re-election. It wasn’t until the ratification of the 25th Amendment in 1967 that a formal process was established to replace a vice president.

In 1900, Theodore Roosevelt, then Governor of New York and war hero of San Juan Hill, was chosen as McKinley’s running mate. His nomination was largely a strategy of containment: an attempt to temper Roosevelt’s reformist zeal beneath the inconsequential and ceremonial weight of the vice-presidency.

Act II: Bull Cometh

The Bull Moose was buried beneath ceremony, but symbols cannot contain momentum. The front porch would give way to the lists and charging steeds.

On September 6, 1901, President William McKinley stood beneath the vaulted glass of the Temple of Music at the Pan-American Exposition in Buffalo, New York, an American shrine to progress, electricity, and imperial optimism. There, in the charged glow of modernity, he was shot twice in the abdomen by Leon Czolgosz, a Polish American self-declared anarchist and bitter casualty of the Panic of 1893, its mill closures, strikes, and wage collapses etched into his disillusioned psyche.

Czolgosz had been baptized in the radical writings of Emma Goldman, a Lithuanian émigrée and firebrand of the American radical left. Goldman championed anarchism, women’s rights, and sexual liberation. She founded Mother Earth, a journal that became an infamous intellectual hearth for dissent and revolutionary analysis.

To Czolgosz, McKinley was the embodiment of oppression: capitalism, imperialism, and state violence. His answer to these perceived provocations was violence. Concealing a revolver wrapped in a handkerchief, he fired at close range during a public reception, just as McKinley extended his hand in welcome.

Initially, doctors believed McKinley would recover. But gangrene developed around the damaged pancreas, and he died on September 14. His death was slow and tragic, a symbolic collapse of the front porch presidency.

Roosevelt, just 42, stepped up and became the youngest president in U.S. history (JFK, the youngest elected, was 43). With containment at an end, the Bull broke loose. And he mounted the stage with an agenda.

Act III: The Charge of the Bull

The Bull builds a protective legacy of words and stick, sweat and blood.

Roosevelt’s early presidency honored McKinley’s legacy: trust-busting, tariff moderation, and economic expansion. But he soon added his own signature: conservationism, progressive reform, and a bold, moralistic foreign policy.

He preserved 230 million acres of public land and established the U.S. Forest Service, 5 national parks, 18 national monuments, 150 national forests, and a constellation of wildlife refuges. Stewardship of the land became a sacred ideal that continues to the present day.

In foreign affairs, Roosevelt extended the Monroe Doctrine with his Roosevelt Corollary (1904), asserting that the U.S. had the right to intervene in Latin America to prevent “chronic wrongdoing.” It was a doctrinal pivot from passive hemispheric defense against European imperialism to active imperial stewardship, cloaked in the language of civilization and order. America became the self-appointed policeman of the Western Hemisphere.

The corollary was a response to incidents like the 1902 Venezuelan debt crisis, where European navies blockaded ports to force repayment. In Cuba, unrest was quelled with U.S. troops in 1906. Nicaragua, Haiti, and Honduras saw repeated interventions to protect U.S. interests and suppress revolutions. If Latin American states failed to maintain order or financial solvency, the U.S. would intervene to stabilize rather than colonize.

The doctrine justified U.S. dominance of the Panama Canal and set the precedent for Cold War interventions, neutralizing the American backyard while containing Soviet expansion in the east.

Act IV: Hamlet in Milwaukee

Heads of kings rest uneasy. Ghosts of injustice haunt. Princes fall prey.

After winning a full term in 1904, Roosevelt honored his promise not to seek reelection in 1908. But disillusioned with his successor, William Howard Taft, Roosevelt returned to politics in 1912, forming the Progressive Party, nicknamed the Bull Moose Party.

Enter stage left, John Schrank, a former barkeep plagued by visions and imagined slights. In the early morning hours of 15 September 1901, nine days after McKinley was shot and the day after he died, the barkeep dreamt that the slain President rose from his casket and pointed to a shrouded figure in the corner: Roosevelt. “Avenge my death,” the ghost spoke. Schrank claimed to have forgotten the dream for over a decade, until Roosevelt’s bid for a third term in 1912 reawakened the vision, which he now interpreted as a divine command.

Schrank believed Roosevelt’s third-term ambition was a betrayal of the American tradition set forth in Washington’s Farewell Address. He hated Roosevelt and feared that he would win the election, seize dictatorial power, and betray the constitutional republic. In his delusional state, he believed Roosevelt was backed by foreign powers and was planning to take over the Panama Canal; an anachronistic fear, given U.S. control of the Canal Zone since 1904. Schrank interpreted the ghost’s voice as God’s will: “Let no murderer occupy the presidential chair for a third term. Avenge my death.”

At his trial for the attempted assassination of Roosevelt, Schrank was remanded to a panel of experts to determine his mental competency. They deemed him insane, a “paranoid schizophrenic” in the language of the time. He was committed to an asylum, where he remained until his death 31 years later.

Schrank’s madness parallels the haunted introspection of Hamlet, Prince of Denmark, Shakespeare’s longest and most psychologically complex tragedy, which revolves around a ghost’s command: “Revenge my foul and most unnatural murder.” Hamlet, driven by the specter’s charge, spirals into feigned (and perhaps real) madness, wrestling with betrayal, duty, mortality, and metaphysical doubt. His uncle, the murderer, has married his mother; an Oedipal inversion within the world’s most enduring tragedy.

On 14 October 1912, as Roosevelt stood outside Milwaukee’s Gilpatrick Hotel, Schrank stepped forward and fired. The bullet pierced Roosevelt’s steel glasses case and a folded 50-page tome of a speech, slowing its path. Bleeding, with a bullet lodged in his chest, Roosevelt refused medical attention. He stepped onto the stage and spoke for 90 minutes, although it is said that, due to his loss of blood, he shortened his speech out of necessity. Whether for himself or the audience is lost to history.

Unlike Hamlet, who dithers and soliloquizes his way toward a graveyard of corpses, Schrank shoots, hits, and leaves Roosevelt standing. Hamlet’s tragedy ends in death and metaphysical rupture. Schrank’s farce begins with the demands of a ghost and ends with a 90-minute speech. One prince takes his world with him into death. The other absorbs a bullet and keeps talking.

Act V: Ghosts and Republics

Ghosts and Republics are ephemeral. At the end of time, those fleeting moments, short and long, some, as Proust says, more and more seldom, are best treated with humor and grace.

In tragedy and near calamity, a man’s soul becomes visible. Some are seen darkly, others, bright, clear, unshaken and unafraid of new beginnings even if that beginning is death.

Roosevelt had already charged up San Juan Hill, bullets and fragments whistling past like invitations to a funeral ball. Each a death marker. So, when a solitary bullet from a madman struck him in Milwaukee, it was merely an inconvenience. He quipped: “Friends, I shall ask you to be as quiet as possible. I don’t know whether you fully understand that I have just been shot, but it takes more than that to kill a Bull Moose.”

Sixty-eight years later, Reagan too survived a bullet to the chest. As he was wheeled into the emergency room at George Washington University Hospital, he said he’d “rather be in Philadelphia,” a throwback to vaudeville, where the gag line appeared on fake tombstones: “Here lies Bob: he’d rather be in Philadelphia.” W.C. Fields once requested it as his epitaph. He’s buried in California. To the surgeons, Reagan added: “I hope you’re all Republicans.”

Where Roosevelt offered mettle, Reagan offered mirth. Both answered violence with theatrical defiance: natural-born and unshakable leaders, unbothered by the ghosts that tracked them.

They were not alone. Jackson beat his would-be assassin with a cane. Truman kept his appointments after gunfire at Blair House. Ford faced two attempts in seventeen days and kept walking. Bush stood unfazed after a grenade failed to detonate. They met their specters with grace, a joke, and a shrug.

The assassins and would-be assassins vanished into the diffusing wisps of history. The leaders of men left a republic haunted not by ghosts, but by a living memory: charged with the courage to endure and to imagine greatness.

Graphic: Assassination of President McKinley by Achille Beltrame, 1901. Public Domain.

The Lost Boys

The end of the Peloponnesian War in 404 BC marked the end of Athens’ Golden Age. Most historians agree that the halcyon days of Athens were behind her. Some, however, such as Victor Davis Hanson in his multi-genre meditation A War Like No Other, a discourse on military history, cultural decay, and philosophical framing, offer a more nuanced view, suggesting that Athens was still capable of greatness, but the lights were dimming.

During the six decades after the war, Athens rebuilt. Its navy reached new heights. Its long walls were rebuilt within a decade. Aristophanes retained his satirical edge, even if it grew a bit more reflective. Agriculture returned in force. Even Sparta reconciled with Athens, or vice versa, recognizing once again that the true enemy was Persia.

Athens brought back its material greatness, but its soul was lost. What ended the Golden Age of Athens wasn’t crumbled walls or sunken ships. It was the loss of the lives that took the memory, the virtuosity of greatness, with them. With them, generational continuity, civic pride, and a religious belief in the polis vanished. The meaning, truth, and myth of Athenian exceptionalism died with their passing. The architects of how to lead a successful, purpose-driven civilization had disappeared, mostly through death by war or state, but also by plague.

Victor Davis Hanson, in A War Like No Other, lists many of the lives lost to and during the war, lives that took much of Athens’ exceptionalism with them to their graves. Below is a partial listing of Hanson’s more complete rendering, with some presumptuous additions.

Alcibiades was an overtly ambitious Athenian strategist; brilliant, erratic, and ultimately treasonous. He championed the disastrous Sicilian expedition, Athens’ greatest defeat. Over the course of the war, he defected multiple times: serving Athens, then Sparta, then Persia, before returning to Athens. He was assassinated in Phrygia around 404 BC while under Persian protection, at, many believe, the instigation of the Spartan general Lysander.

Euripides, though he did not fight in the war, exposed its brutality and hypocrisy in plays such as The Trojan Women and Helen. The people were not sufficiently appreciative of his war opinions or his plays: he won only four firsts at the Dionysia, compared to 24 and 13 for Sophocles and Aeschylus, respectively. Disillusioned, he went into self-imposed exile in Macedonia and died there around 406 BC, by circumstances unknown.

The execution of the generals of Arginusae remains a legendary example of Athenian arbitrary retribution; proof that a city obsessed with ritualized honor could nullify military genius, and its future, in a single stroke. The naval Battle of Arginusae, fought in 406 BC east of the Greek island of Lesbos, was the last major Athenian victory over the Spartans in the Peloponnesian War. Athenian command of the battle was split among eight generals: Aristocrates, Aristogenes, Diomedon, Erasinides, Lysias, Pericles the Younger (son of Pericles), Protomachus, and Thrasyllus. After their victory over the Spartan fleet, a storm prevented the Athenians from recovering the survivors, and the dead, from their sunken ships. The six generals who returned to Athens were all executed for their negligence. Protomachus and Aristogenes, likely knowing their fate, chose not to return and went into exile.

Pericles, the flesh and blood representation of Athens’ greatness, was the statesman and general who led the city-state during its golden age. He died of the plague in 429 BC during the war’s early years, taking with him the vision of democratic governance and Athenian exceptionalism. His three sons all died during the war: his two oldest boys likely of the plague around 429 BC, and Pericles the Younger executed for his part in the Battle of Arginusae.

Socrates, the world’s greatest philosopher (yes, greater than Plato or Aristotle), fought bravely in the war, but he was directly linked to the traitor Alcibiades. He was tried and killed in 399 BC for corrupting the youth and not giving the gods their due. That was all pretense. Athens desired to wash its collective hands of the war, and Socrates was a very visible reminder of it. He became a ritual scapegoat swept up into the collective expurgation of the war’s memory.

Sophocles, already a man of many years by the beginning of the war, died in 406 BC at the age of 90 or 91, a few years before Athens’ final collapse. His tragedies embodied the ethical and civic pressures of a society unraveling. With the deaths of Aeschylus in 456 BC, Euripides in 406 BC, and Sophocles soon after, the golden age of Greek tragedy came to a close.

Thucydides, author of the scholarly standard for the Peloponnesian War, was exiled after ‘allowing’ the Spartans to capture Amphipolis. He survived the war, and the plague, but never returned to Athens. His History breaks off in mid-sentence, covering events only up to 411 BC. He lived until about 400 BC, and no one really knows why he didn’t finish his account of the war. Xenophon picked up where Thucydides left off and finished the war in the first two books of his Hellenica, which he composed sometime in the 380s BC.

The Peloponnesian War ended Athens’ greatest days. The men who kept its lights bright were gone. Its material greatness returned, glowing briefly, but its civic greatness, its soul, slowly dimmed. It was a candle in the wind of time that would be rekindled elsewhere. The world would fondly remember its glory, but Athens had lost its spark.

Source: A War Like No Other by Victor Davis Hanson, 2005. Graphic: Alcibiades Being Taught by Socrates, François-André Vincent, 1776. Musée Fabre, France. Public Domain.

The Sum of All Fears–Real and Imagined

The Peloponnesian War, fought over 27 years (431–404 BC), cost the ancient Greek world nearly everything. Deaths from battle and plague approached 8–10 percent of the population: up to 200,000 lives. The conflict engulfed nearly all of Greece, from the mainland to the Aegean islands, Asia Minor, and Sicily. Though Sparta and its allies, in the end, claimed a tactical victory, the war left Greece a shadow of its former self.

The Golden Age of Athens came to an end. Athenian democracy was replaced, briefly, by the Thirty Tyrants. Sparta, unwilling to jettison its insular oligarchy, failed to adapt to imperial governance, naval power, or diplomatic nuance. Within a generation Sparta was a relic of history: first challenged by former allies in the Corinthian War, then shattered by Thebes, which stripped the martial city-state of its aura of invincibility along with its helot slave labor base, the economic foundation of Sparta. Another generation later, Macedon under Philip II and Alexander the Great finished off Greek dominance of the Mediterranean. After Alexander’s death in 323 BC, Rome gradually absorbed the fractured pieces, proving again that building an empire is easier than keeping one.

Thucydides, heir to the world’s first historian, Herodotus, reduced the origins of the Peloponnesian War to a primal emotion: fear. In Book I of his History of the Peloponnesian War he writes: “The growth of the power of Athens, and the alarm which this inspired in Sparta, made war inevitable.” Athens had barred Megara, a minor Spartan ally, from its markets under the Megarian Decree, but that was pretext, not cause. Sparta did not go to war over market access. It went to war over fear. Fear of what Athens had become, and of a future that armies and treaties might not contain.

War and fear go together like flame to fuse. Sparta went to war not for fear of a foe; Sparta knew no such people. It was not fear of an unknown warrior, nor fear of battlefields yet to be choreographed, but fear of an idea: democracy maintained and backed by Athenian power. And perhaps, more hauntingly, fear of itself. Not that it was weak, but of what it might become. The Spartans feared no sword or spear; their discipline reigned supreme against flesh and blood. Yet no formation, no stratagem, no tactic of war could bring down a simple Athenian belief: the rule of the many, an idea anathema, heretical even, to the Spartan way of life.

So, they marched to war, not to defeat an idea but to silence the source. Not to avenge past aggression but to stop a future annexation. They won battles, small and large. They razed cities. But they only destroyed men. The idea survived. It survived in fragments, bits here, bits there, across time and memory. What the war did kill, though, was the spirit of Athens, the Golden Age of Athens. But the idea that was Athens lived on across space and time: chiseled into republics that rose from its ashes and ruins.

The radiance of Athens dimmed to shadow. Socrates became inconvenient. Theater became therapy, a palliative smothering of cultural surrender. And so, civilization moved to Rome.

Source: A War Like No Other by Victor Davis Hanson, 2005. History of the Peloponnesian War by Thucydides, Translated by Richard Crawley, 2021. Graphic: Syracuse vs Athens Naval Battle. CoPilot.

Phalanx: Discipline in Geometry

Near the ancient Sumerian city of Girsu, midway between present-day Baghdad and Kuwait City, stood a battle marker: the Stele of Vultures, now housed in the Louvre. It commemorates Lagash’s 3rd millennium BC victory over Umma. The stele derives its name from the monument’s carved vultures flying away with the heads of the dead. It also depicts soldiers of Lagash marching in a dense, shield-to-shield formation, holding spears chest-high and horizontal, led by their ruler, Eannatum, who commissioned the stele around 2460 BC. The importance of the stele, though, is that it is the first visual depiction of a phalanx in battle. The phalanx as a military tactic is believed to be much older.

The phalanx was more than a combat formation; it was a battlefield philosophy enshrining discipline and courage over strength, the unity of the team over the individual. A dense, rectangular wall of men, generally 8 deep, stretched across the battlefield to protect against flanking maneuvers. Each man wore heavy armor of leather and bronze: helmet, cuirass, and greaves, and was armed with a spear and a short sword. But the breakthrough that brought the phalanx great renown was the aspis, a round shield invented for the Greek hoplite in the 8th or 7th century BC. With its dual grip, a forearm strap and central handhold, it allowed the infantryman precise control of his shield, helping create an impenetrable barrier of bronze and bone against the oncoming enemy’s spears and swords. It transformed the phalanx from an offensive wall of attack into a defensive engine of defiance as well.

The phalanx succeeded only through cohesion. When courage and discipline held, the formation, with the aspis as its core defense, was practically unbeatable on confined terrain. It overcame the enemy as a seamless, tight mass executing a relentless forward march into the belly of the opposing beast. But it was only as strong as its weakest link. Once discipline faltered and cohesion broke, the formation collapsed, and the opposing army ran it to ground. Victory belonged not to brute force, but to the combined strength of the military unit. Teams won, individuals lost.

From the late 8th century BC onward, Greek phalanxes were manned by hoplites: citizen soldiers, generally landowners and farmers. Emerging in Sparta or Argos, possibly imported from Sumeria or born of parallel discovery in Greece, phalanx battles were initially confined, blunt, and deadly affairs. They devolved into fierce pushing masses of brawn, bone, and metal until one side broke. Heavy casualties occurred when the enemy lines broke and soldiers fled helter-skelter in shock and chaos, pursued by the victors for plunder, unless the victors were restrained by honor.

The phalanx became the standard that destroyed the mighty Persian armies at Marathon and Thermopylae early in the 5th century BC. At Marathon in 490 BC, 10,000 Athenians and 1,000 Plataeans stretched out their formation to match the breadth of 26,000 Persians, filling the Marathon plain and denying the armies any room for flanking movements.

The Greeks stacked their wings with additional rows of hoplites and thinned them progressively toward the center, creating a convex crescent. The Greek wings advanced faster than the center, generating a pincer movement that collapsed on the Persian center. When the dust settled, 192 Athenians and 11 Plataeans were lost, while Persian losses were approximated at 6,400.

In the 19th century, Napoleon, possibly improvising on the phalanx-encircling tactics developed at Marathon, would invert the scheme, attacking with a strong center and weaker wings. His strategy was to split the enemy’s center with strength and attack the divided ranks on the flanks. The tactic worked until Wellington at Waterloo.

At Marathon, unity triumphed through geometric discipline. At Thermopylae, the formation bought time and ended with a sacrifice that punctured Persian hubris.

During the second Persian invasion in 480 BC, Darius’s son Xerxes, with 120,000–300,000 men, attacked a contingent of 7,000 Greeks at Thermopylae. The Greeks held back the Persian advance like a cork in a bottle, using a rotating phalanx of roughly 200 men to defend a narrow pass for two days, until betrayal by Ephialtes exposed their flank and they were destroyed in an inescapable Persian barrage of arrows. Greek losses were estimated at 4,000 men, including Leonidas’ 300 Spartans, against 2,000–4,000 Persians (estimates of both sides’ strength and losses vary widely).

The Greeks’ defiant stand at Thermopylae allowed the Greek navy to regroup at Salamis, where it won a decisive victory against the Persian fleet. A year later, the Greeks at Plataea crushed the Persians’ quest for a Hellenic satrapy.

The phalanx endured for another century, including use in the Peloponnesian War, where it remained lethal but of limited use. Then came Epaminondas at Leuctra in 371 BC, transforming the phalanx into a machine that erased Sparta’s mighty reputation. Typically, each army’s phalanx strength was concentrated on its right wing, so that the strongest part of a force always faced the weaker wing of the opposition. What Epaminondas did was say nuts to that.

He reversed the order and created an oblique formation, more triangular than rectangular, with his strongest troops on the left wing. His left wing was stacked 50 deep while his center and right wing were kept thin. His 50-deep column was aimed directly at Sparta’s best under the command of King Cleombrotus (in those days officers and kings fought in the front rows of the phalanx). As the phalanxes began to engage, Epaminondas kept his right wing stationary, creating an asymmetrical front. The left wing easily broke through Sparta’s right wing, killing Cleombrotus and collapsing the superior flank. At that point Epaminondas’s wing pivoted inward, creating an enveloping arc around the remaining parts of Sparta’s phalanx, effectively ending the Spartan myth of invincibility.

Epaminondas’s tactics shortened battles and reduced casualties. His innovations proved that properly trained and equipped citizen soldiers could defeat professional warriors, while instilling a new civic honor through restraint and discipline. His oblique formation allowed landowners and farmers to settle their disputes, usually in a few hours or less, with minimal loss, and return to their farms in time for the harvest. Epaminondas not only brought asymmetrical tactics to the battlefield but shattered claims of superiority by employing the unexpected.

As the Golden Age of Athens and western civilization’s Greek center waned and Roman hegemony rose, the phalanx evolved again. The Greek phalanx gave way to the Roman manipular system, a staggered checkerboard pattern enabling units to rotate, reinforce, or retreat as needed. It was a needed refinement of the phalanx, more effective on open plains and less susceptible to cavalry and arrows.

Then came Hannibal at Cannae in 216 BC. During the 2nd Punic War, he upended the war cart of tactics once again and ruthlessly exploited Rome’s refinements.

Hannibal’s improvisation on the phalanx’s maneuvering tactics, though not the formation itself, showed that he had studied Marathon. Instead of a convex line with strong wings and a weak center, he deployed a concave line with strong wings and a weak center. He allowed the center to fall back, and the Romans unwittingly obliged by surging into it. With the Romans committed, Hannibal’s deception encircled them with precision and brutal lethality. The Romans were annihilated on the field, losing somewhere between 50,000 and 70,000 killed and another 10,000 captured. Hannibal lost 6,000–8,000 men (again, estimates vary). Then came the 3rd Punic War.

The phalanx began as a wall of spears and shields, a bulwark of bronze and bone. Its stunning victories echo through history’s scholarly halls and hallowed plains of death and destruction. Yet its Achilles’ heel, vulnerable flanks and precise terrain requirements, proved no match for horses and gunpowder.

Still, its legacy of discipline and unity endures. Born of necessity, refined through rigor, and studied for centuries, the phalanx stands as a testament to Aristotle’s enduring insight, slightly abridged but still profound: ‘The whole is greater than the parts.’ And perhaps the Romans said it best: ‘E pluribus unum’, ‘out of many, one.’

Source: A War Like No Other by Victor Davis Hanson, 2005. Et al. Graphic: Stele of Vultures.

Women and Glass: The Starlight Calculators of Harvard

In the halcyon days of yore before digital ubiquity and tonal exactitude, computers were made of flesh and blood, fallibility crossed with imaginative leaps of genius. Photographs etched starlight’s past onto glistening glass and preserved silver. Solid archives where memory endures and future discoveries shimmer with potential, encoded in celestial light of the heavens awaiting the discerning caress of curiosity, intuition, and reason.

In 1613, English poet Richard Brathwait, best remembered for his semi-autobiographical Drunken Barnaby’s Four Journeys, enshrined the word computer in written English while contemplating the divine order of the heavens, calling God the “Truest computer of Times.” Rooted in the Latin computare, meaning “to reckon together,” the term evolved over the next three centuries to describe human minds inimitably attuned to the interpretation of visual data: star fields, spectral lines, geologic cross-sections, meteorological charts, and other cognitive terrains steeped in mystery, teasing initiates with hints of vision and translation. These were not mere calculators nor unimaginative computers, but perceptive analysts, tracing patterns, exposing truths, and coaxing insights from fluid shapes etched into the fabric of nature.

By the time of the Enlightenment and the scientific revolution, human computers had become the invisible deciphering force behind truth-seeking laboratories, the unsung partners in progress, cataloging, interpreting, and taming the flood of empirical but seemingly nonsensical data that overwhelmed those without insight. Harvard College Observatory was no exception. With photography now harnessed to astronomy’s telescopes, the observatory could suddenly capture and archive starlight onto silver-coated glass plates, forever changing astronomy from the sketches of Galileo to silver etches of eternal starlight.

But these glass plates, resplendent with cosmic information, remained galleries of dusty, exposed negatives, inert until absorbed and guided by human curiosity and insight.

Enter the women computers of Harvard. Beginning in 1875, over 140 women, many recruited by Edward Charles Pickering, processed more than 550,000 photographic plates, the last collected in 1992, bringing much-needed coherence and linearity to the chaos of too much. They sorted signal from celestial noise, revealing the hidden order of the universe inscribed in silver, preserved in silica.

In 1875 the initial cohort, the pioneers, the first Harvard women computers (though not yet given that moniker) to appear on the glass plates bore names like Rebecca Titsworth Rogers, Rhoda G. Saunders, and Anna Winlock, assisting in the absolutely essential process of what we would now call cross-referencing each glass plate’s ‘metadata’ with its astronomical data: ascertaining that the time and place of the data matched the time and place of the metadata. In 1881 Pickering, the observatory’s fourth director, began hiring women specifically as Astronomical Computers, a formal role focused on analyzing and deciphering the growing collection of glass plate photographs.

This shift in 1881 was more than semantic: not a fancy title for drudge work and tedious plate cataloging, but a structured program in which women like Williamina Fleming, Annie Jump Cannon, Henrietta Swan Leavitt, and Cecilia Payne-Gaposchkin were tasked not just with cataloging stars, but with studying stellar spectra, the lights powering life and imagination throughout the universe. These indispensable efforts led to the Henry Draper Catalogue, eventually drawing on the half-million-plus glass plates, and to the foundations of modern stellar classification systems and 21st-century astronomy. Their stories are worthy of a Horatio Alger novel, maybe not exactly rags to riches, but certainly humble beginnings to astronomical fame. They were paid peanuts, but they were the elephants in the observatory.

Williamina Fleming arrived in Boston in 1879, penniless and abandoned by her husband, and secured a job as a domestic in the home of Edward Pickering, yes, that guy. Her intelligence so impressed Pickering’s wife, Elizabeth, that she recommended Fleming for work in the observatory. Fleming quickly outpaced her male counterparts and in 1881 was officially hired as one of the first Harvard Computers.

Studying the photographed spectra of stars, she developed a classification system, the natural human desire to find order in apparent chaos, based on the abundance of hydrogen at the surface of a star, or, more exactly, the strength of the hydrogen absorption lines in the spectral data. Stars with the strongest lines were classed as A stars, the next strongest as B stars, and on down through the alphabet.

In 1896 Pickering hired Annie Jump Cannon, a physics graduate of Wellesley and an amateur photographer, who modified Fleming’s stellar classification system to rest on the surface temperature of a star rather than hydrogen abundance. Her method used the strength of the Balmer absorption lines: electrons excited within hydrogen atoms, like dancers at different tempos, reveal themselves through subtle spectral lines, now understood to reflect differing excitation and ionization states of the atom, directly tied to the surface temperature of the star.
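
For the physics beneath that metaphor, the Balmer lines have a compact closed form. The sketch below is standard textbook material, not drawn from the essay’s sources; it shows why hydrogen lines land at fixed wavelengths while their strength tracks temperature:

```latex
% Rydberg formula for the hydrogen Balmer series: transitions ending on
% the second energy level (n = 2) produce the visible hydrogen lines.
% R_H is the Rydberg constant, roughly 1.097 x 10^7 per meter.
\[
  \frac{1}{\lambda} = R_H \left( \frac{1}{2^{2}} - \frac{1}{n^{2}} \right),
  \qquad n = 3, 4, 5, \dots
\]
% Example: n = 3 gives the H-alpha line at roughly 656 nm; the relative
% strength of these lines across a spectrum tracks a star's temperature.
```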

Her system used the same letters to avoid redoing the entire Harvard catalogue, but she reduced the list to 7 classes and reordered them from hottest to coolest: O, B, A, F, G, K, M. Her classification is still in use today. Earth revolves around a G-class star with a medium surface temperature of about 5800 K (9980 F or 5527 C).
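
Those temperature figures are a one-line conversion; here is a minimal sketch as a sanity check (the function names and example value are illustrative, not from the Harvard catalogue):

```python
# Minimal sketch: convert a stellar surface temperature in Kelvin to
# Celsius and Fahrenheit, and list Cannon's hottest-to-coolest ordering.

SPECTRAL_CLASSES = ["O", "B", "A", "F", "G", "K", "M"]  # hottest -> coolest

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32

sun_k = 5800.0  # approximate surface temperature of our G-class star
c = kelvin_to_celsius(sun_k)      # ~5527 C
f = celsius_to_fahrenheit(c)      # ~9980 F
print(f"{sun_k:.0f} K = {c:.0f} C = {f:.0f} F")
print("Classes, hottest to coolest:", " ".join(SPECTRAL_CLASSES))
```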

Henrietta Swan Leavitt graduated from Harvard’s women’s college in 1892 with what we might now call a liberal arts degree. A year later, she began graduate work in astronomy, the foundation for her employment at the Harvard Observatory. After several extended detours tucked under her petticoats, Edward Charles Pickering brought her back to the Observatory in 1903. She worked initially without pay, later earning an unfathomable 30 cents an hour.

There, Leavitt collaborated with Annie Jump Cannon. In a coincidence of some note, both women were deaf, though one is left with the feeling that the absence of sound may have amplified the remaining sensory inputs to their fertile minds. In time, Leavitt uncovered a linear relationship between the logarithm of the period of Cepheid variable stars and their luminosity, a revelation that became an integral part of the cosmic yardstick for measuring galactic distances. The Period-Luminosity relation is now enshrined as Leavitt’s Law.

Cepheid variables form the second rung of the Cosmic Distance Ladder, after parallax and before Type Ia supernovae, galaxy rotation curves, surface brightness fluctuations, and, finally, the ripples of Einsteinian gravitational waves. Leavitt’s metric would prove essential to Edwin Hubble’s demonstration that the universe is expanding.
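
Schematically, and with placeholder coefficients rather than Leavitt’s published calibration, the two steps of that yardstick look like this:

```latex
% Leavitt's Law (schematic): a Cepheid's absolute magnitude M is a
% linear function of the log of its pulsation period P in days;
% a and b are empirically calibrated constants (placeholders here).
\[
  M = a \,\log_{10} P + b
\]
% Distance then follows from the distance modulus, where m is the
% apparent magnitude measured from the plates and d is in parsecs:
\[
  m - M = 5 \,\log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right)
\]
```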

Swedish mathematician Gösta Mittag-Leffler considered nominating her for the Nobel Prize in Physics, but his plans stalled upon learning she had died in 1921. The Nobel, then as now, is non-awardable to the dead.

Cecilia Payne-Gaposchkin, a transplanted Brit, joined the Harvard Observatory as an unpaid graduate fellow while working toward her PhD in astronomy at Radcliffe. Upon earning her doctorate, she continued at the Observatory with no title and little pay. By 1938 she was awarded the title of Astronomer, and by 1956 she was made a full professor on Harvard’s faculty.

In her dissertation she showed accurately, and for the first time, that stars are composed primarily of hydrogen and helium, proving that hydrogen is the most abundant element in the universe and overturning long-held but erroneous assumptions. But in a twist of fate, astronomer Henry Norris Russell persuaded her to label her conclusions on hydrogen abundance as spurious. Four years later Russell’s own research reached the same conclusion, but he gave her barely an honorable mention when he published his results.

She wasn’t the first, nor will she be the last, to suffer at the hands of egotistical professors more enamored of self than of truth, but her elemental abundance contribution to astronomy brushed away the conceit that stars must mimic rocky planets in their composition, much as Galileo ended Earth’s reign as the center of everything. Twentieth-century astronomer Otto Struve hailed her dissertation as “the most brilliant PhD thesis ever written in astronomy.”

Undeterred, and building on her studies of the spectral emissions of stars, she turned her gaze to high-luminosity and variable stars with her husband, astronomer Sergei Illarionovich Gaposchkin. After 2 million observations of variable stars, their efforts laid the groundwork for the study of stellar evolution: how stars change over the course of time. From hints of dispersed stardust to starlight and back again. Cycles of stellar life repeated billions of times over billions of years.

Harvard’s astronomical female human computers, initially mere clerks transcribing stars from silver and glass, evolved into interpreters of light, shaping the very foundations of astronomy. Through logic, imagination, and an unyielding devotion to truth, they charted the heavens and opened lighted pathways for generations to follow.

Graphic: The Harvard Computers standing in front of Building C at the Harvard College Observatory, 13 May 1913, Unknown author. Public Domain

Tripping

Albert Hofmann, employed by Sandoz Laboratories in Basel, Switzerland, was conducting research on ergot, a toxic fungus, in 1938 to identify potential circulatory and respiratory stimulants. While synthesizing compounds derived from the fungus, he inadvertently created lysergic acid diethylamide (LSD), an alkaloid of the ergoline family, known for its physiological effects on the human nervous system.

Five years later, on April 16, 1943, Hofmann became the first person to experience the hallucinogenic effects of LSD while re-synthesizing the compound. He accidentally absorbed a small amount through his skin, leading to vivid hallucinations he later described as a dreamlike state with kaleidoscopic visuals. With two groundbreaking lab accidents occurring five years apart, The Daily Telegraph ranked Hofmann as the greatest living genius in 2007.

During the counter-cultural movement of the 1960s, LSD emerged as a popular recreational drug, attracting advocates such as Timothy Leary, a Harvard psychologist who famously urged people to “Turn on, tune in, drop out.” Leary championed the use of psychedelics to explore altered states of consciousness and challenge conventional societal norms. LSD also played a pivotal role in Ken Kesey’s novel One Flew Over the Cuckoo’s Nest, which focused on the horrific abuse of patients in mental institutions. The book, later adapted into a film starring Jack Nicholson, significantly influenced awareness of the cruelty of mental institutions. However, LSD’s trajectory took a sinister turn beyond recreation when it became a tool for government mind-control experiments.

Starting in the 1950s, the CIA launched MKUltra, a covert program designed to explore drugs and techniques for breaking down individuals psychologically. LSD became a central component of these experiments, often administered secretly to unsuspecting individuals to study its effects. Targets included prisoners, drug addicts, prostitutes, military personnel, CIA employees, and even random civilians. It is difficult to ascertain which acronym took the greater hit to its reputation: the CIA or LSD.

Source: Albert Hofmann by Morgan and Donahue, All That’s Interesting, 2025. Graphic: Albert Hofmann in 1993.

Black Swans Part I

Black swans are rare and unpredictable events, what the military calls “unknown unknowns,” that often have significant, domain-specific impacts, such as in economics or climate. Despite their unpredictability, societies tend to rationalize these occurrences after the fact, crafting false narratives about their inevitability. COVID-19, for instance, ripples across multiple domains, beginning as a health crisis but expanding to influence the economy, legal systems, and societal tensions. As a human-made pathogen, its risks should have been anticipated.

Black swans throughout history are legendary. Examples include the advent of language and agriculture, the rise of Christianity (predicted yet world-changing), and the fall of Rome, which plunged the Western world into centuries of stagnation. Islam (also predicted), the Mongol conquests, the Black Death, and the Great Fire of London shaped and disrupted societies in profound ways. The fall of Constantinople, the Renaissance, the discovery of America, the printing press, and Martin Luther’s Reformation brought new paradigms. More recently, the Tambora eruption (“the year without a summer”), the Great Depression, and WWII brought unforeseen disruptions to economies and geopolitics, while the Manhattan Project, Sputnik, the fall of the Berlin Wall, and the rise of PCs and the internet altered the trajectory of human progress. Events like 9/11 and the iPhone have similarly reshaped the modern world. While black swans may be individually rare and unpredictable, in the aggregate they are inevitable. We should expect moments of dramatic collapse or unanticipated brilliance to recur throughout history.

Nassim Taleb, author of the 2007 book The Black Swan, suggests several approaches to mitigate the effects of such events without needing to predict them. His recommendations include prioritizing redundancy, flexibility, robustness, and simplicity, as well as preparing for extremes, fostering experimentation, and embracing antifragility: a concept where systems not only withstand shocks but emerge stronger.

Through the lens of history, black swans appear as a mix of good and bad, bringing societal changes that were largely unanticipated before their emergence. As history has shown, predicting the impossible is just that: impossible. What might the next frontier be, the next black swan to transform humanity? Could it be organic AI, a fusion of human ingenuity and machine intelligence, unlocking potential but posing profound risks to free will, societal equilibrium, and humanity’s very essence? (Next week—preparing for a black swan: an example.)

Natural Law—Point Counterpoint

“The notions of right and wrong, justice and injustice, have there no place. Where there is no common power, there is no law; where no law, no injustice.” — Leviathan by Thomas Hobbes

Thomas Hobbes saw human nature as a cauldron of chaos. In his state of nature, life is “nasty, brutish, and short,” a “war of all against all” where self-preservation is the only natural law. Shaped by Thucydides’ tales of strife and Machiavelli’s ruthless pragmatism, Hobbes cast man’s self-interest as a destructive force that casts morality aside. His remedy to avert chaos: a towering sovereign, ideally a monarch, to crush anarchy with an iron fist. The social contract trades liberty for security, forging laws as human tools to bind the beast within. Yet Hobbes stumbled: he failed to grasp power’s seductive pull. He assumed his Leviathan, though human, would rise above the self-interest he despised, wielding authority without buckling to its corruption.

“Reason, which is that law, teaches all mankind…that being all equal and independent, no one ought to harm another in his life, health, liberty, or possessions.” — Second Treatise of Government by John Locke

John Locke painted a gentler portrait of man than did Hobbes. He rooted natural law in reason and divine will, granting all people inherent rights to life, liberty, and property. His state of nature is peaceful yet imperfect, marred by the “want of a common judge with authority,” leaving it vulnerable to human bias and external threats. Optimistic, Locke envisioned a social contract built on the consent of the governed, protecting these rights through mutual respect and laying the groundwork for constitutional rule. Where Hobbes saw a void to be filled with control, Locke trusted reason to elevate humanity, crafting government as a shield, not a shackle.

Hobbes and Locke clash at the fault line of power. Hobbes’s sovereign, meant to tame chaos, reflects the rulers’ thirst for dominance, but his naivety about power’s effects cracks his foundation. Locke’s ideals of morality, reason, and rights empower the ruled, who yearn for liberty after security sours. Hobbes missed the flaw: rulers, driven by the same self-interest he feared, bend laws to their will, spawning a dual reality: one code for the governed, another for the governors. Locke’s vision of freedom and limited government inspires the republic’s soul, while Hobbes’s call for order fortifies its bones with courts, police, and laws of men. The U.S. Constitution marries both, yet scandals tip the scales: power corrupts, and liberty frays as safeguards buckle under the rulers’ grip.

Hobbes and Locke both accept the imperfection of man but take different paths to mitigate that imperfection with workable safeguards. Hobbes insists on rule by law, but law drafted by imperfect man and applied with Machiavellian indifference, with no solution for absolute power’s corrupting influence. Locke also chooses rule by law, but law guided by morality, God, and the will to depose despots.

Sources: Leviathan, Thomas Hobbes; Second Treatise of Government, John Locke. Graphic: Original Leviathan frontispiece, a king composed of subjects, designed with Hobbes’s input.

White Guard

Mikhail Bulgakov’s White Guard, set during the Ukrainian War of Independence (1917–1921) amid the Russian Civil War, captures Kyiv in an existential power struggle against varied forces: Ukrainian nationalists allied with German troops, the White Guard clinging to Tsarist dreams, Lenin’s Bolsheviks closing in, plus Poles and Romanians. Against this bloody backdrop, Bulgakov crafts a semi-autobiographical tale of loss and fatalism, culminating in a nihilistic realization of humanity’s purpose: “But this isn’t frightening. All this will pass. The sufferings, agonies, blood, hunger, and wholesale death. The sword will go away, but these stars will remain… So why are we reluctant to turn our gaze to them? Why?”

Bulgakov, a doctor of venereal diseases like the book’s protagonist Alexei Turbin, knew hopelessness. In 1918, syphilis was a scourge, often incurable, leading to madness, mirroring the war’s societal decay. Alexei volunteers for the White Guard, tending to horrors he can’t heal, his efforts dissolving in a dream: “shadows galloped past…Turbin was dying in his sleep.” War becomes a disease, resistance futile. Yet Bulgakov’s lens widens. Sergeant Zhilin dreams of Revelation, “And God shall wipe away all tears…and there shall be no more death,” finding humility in cosmic indifference. Petka, an innocent, dreams simply of a sunlit ball, untouched by great powers. “Blessed are the pure in heart, for they shall see God” (Matthew 5:8).

Then, out of dreamland into the light: “All this will pass.” The stars endure, wars fade. Writing in the 1920s after the White defeat, Bulgakov channels Russian fatalism—Dostoevsky’s inescapable will, Chekhov’s quiet surrender. But he’s not fully broken. His “Why?” pleads, mocks, resists. Why not look up? Survival is luck, death equalizes, yet fighting a losing battle confronts our nothingness. Kyiv falls, the Bolsheviks threaten, the White Guard vanishes, still, Bulgakov continues to ask. Why?

He blends despair with irony, a doctor mocking death as the stars watch. The German expulsion of the Reds in 1918 briefly eased bloodshed, but 1919 brought worse, “Great was the year and terrible the Year of Our Lord 1918, but more terrible still was 1919.” History moves on; stars don’t care. Bulgakov’s question lingers: Why? To fight is to live, fate be damned.

Source: White Guard, Mikhail Bulgakov, trans. Marian Schwartz. Graphic: Ukrainian Soldiers circa 1918.