Banana Wars.

            “Yankee ingenuity… is inventive improvisation, adaptation and overcoming of shortages of materials.”[1]  Basically, opportunity-seeking and -seizing.  Lorenzo Baker (1840-1908) typified Yankee ingenuity.  Raised on pre-resort, hard-scrabble Cape Cod, by age 30 he was master of a small schooner trading between New England and the Caribbean.  In 1870, he brought back a load of bananas to see if anyone would buy.  They would, in quantity and at a 1,000 percent profit.  Henry Meiggs (1811-1877), another Yankee, and his nephew Minor Keith (1848-1929) came at the banana trade from another direction.  They were building railroads in Central and South America, receiving vast tracts of land from governments and employing lots of laborers.  They turned their land grants into banana plantations, mostly for export.  In 1899, Baker’s company joined with Keith’s company to form the United Fruit Company.[2]  How did this lead to wars? 

            Bananas contain a lot of potassium.  “Potassium fends off a sense of existential dread.”    Hence the American interest in Central America and the Caribbean.[3]  Alternatively, for decades, American business had a lot of drag with the United States government.  In the Caribbean and Central America, the Marines supported the United Fruit Company and other businesses when troubled by local unrest.[4]  “Which will you have?” 

            Chronology. 

            1898: The Spanish-American War left Cuba under American control and Puerto Rico and the Philippines as American possessions.  The US occupied Cuba from 1898 to 1902, from 1906 to 1909, in 1912, and from 1917 to 1922. 

            1899-1902: Philippine-American War, a bloody “counter-insurgency.” 

1903: Panama “seceded” from Colombia,[5] then signed a treaty with the United States.  The treaty allowed the US to build the Panama Canal and gave the US control of the Canal Zone “as if it were sovereign.” 

1903: US troops landed in Honduras to protect American lives and property.  They came back in 1907, 1911, 1912, 1919, and 1924-1925. 

1912: US troops landed in Nicaragua to protect American lives and property. 

1914: US troops landed in the Mexican port of Vera Cruz. 

1915-1934: US troops occupied Haiti, waging small wars against anti-American forces. 

1916-1917: US troops entered northern Mexico in a fruitless[6] hunt for Pancho Villa. 

1916-1924: “disorder” in Santo Domingo led the US to occupy the country for eight years, fighting anti-American forces out in the bush for much of the time. 

1927-1932: US troops intervened in a civil war in Nicaragua in 1927, then hung around until 1932.  From 1930, President Herbert Hoover wound down such interventions. 

            “US troops” mostly meant the Marines.  Operations involved much improvisation and adaptation.  Rich experience led to the USMC’s “Small Wars Manual” (1940).  Presciently, Jim Mattis urged his officers to read it before invading Iraq. 


[1] Yankee ingenuity – Wikipedia 

[2] See: United Fruit Company – Wikipedia.  Now Chiquita.  The life of Henry Meiggs offers a fascinating view of the “Well, it warn’t illegal when I done it” phase of American business history. 

[3] I borrow the syllogism from Robert Stone, A Flag for Sunrise (1981). 

[4] Lester D. Langley, The Banana Wars: An Inner History of American Empire, 1900–1934 (1983). 

[5] Wanna buy a bridge? 

[6] HA!  Eeez pun, yes? 

Populism.

            Reading the newspapers, it might be possible to formulate a rough explanation of the term “populism.”  It begins with the observation that the problems of the modern world are highly complicated, long-term, often inter-connected, and—in the eyes of some–materialist. 

The highly complicated nature of problems means that they are best understood by experts, rather than by the common person.  Examples include the federal judiciary, which interprets national law; the Federal Reserve Board, which broadly regulates the level of economic activity; and the Joint Chiefs of Staff, who represent the professional opinion of the United States military.  The bench is full of lawyers, the Fed is full of economists, and the Joint Chiefs are full of warriors.  None are elected; all are insulated from political pressures.[1] 

Many problems are not only complicated; they are also of long duration.  In contrast, elected officials tend to have a two-year, four-year, or—at most—six-year existential time horizon.  Climate change offers a good example of this effect.  Faced with stiff resistance in the Senate, President Barack Obama sought to use executive agreements and Executive Branch rule-making to enshrine carbon reduction policies that reach out as far as 2050.  Within a few years, President Donald Trump could just withdraw the United States from the Paris Climate Accords and tell his own cabinet Secretaries to start undoing the Obama rule-writing. 

Inter-connections abound.  To take only one example, climate change is likely to set off large-scale population movements across national borders.  It’s likely to increase the frequency, size, and economic impact of both wildland fires and cyclonic storms.  And it makes the case for an urgent transition away from carbon-burning even though no sustainable energy technology—except new-generation nuclear reactors—offers a swift and scalable replacement. 

American politics (but not only American politics) has been dominated for at least a century by the bi-partisan belief that, more than anything, voters want money so that they can buy stuff.  During the Depression, the Roosevelt administration adopted a policy of “tax-spend-elect.”  Using this policy, Democrats held the White House from 1932-1952, 1960-1968, 1976-1980, 1992-2000, and 2008-2016.  Fed up with losing, Republicans eventually adopted the policy of “tax cut-spend-elect.”  It worked.  Using this policy, Republicans went from winning the White House 45 percent of the time to winning it 60 percent of the time. 

            Insisting that complicated problems are best understood and managed by highly educated professionals dedicated to “public service”[2] inevitably discounts the value and views of the common person.[3]  Voters can be easily distracted by controversies over things like transgender monuments and Confederate bathrooms, but elites can claim to govern in the common interest. 

However, a long string of failures can undermine deference to elite guidance.  So can non-materialist values or goals among common people.  The result can be an upwelling of wrath on the part of at least some of the common people.  This is what is labeled “populism.”[4]   


[1] Currently, none of these is in good repute, what with the excesses of judicial activism, the failure to fend off inflation, and the flunked wars.  All have deep reservoirs of previous good conduct to help see them through choppy waters. 

[2] It’s public employment, not “service.”  Very often it leads to highly remunerative private employment.  Revolving door (politics) – Wikipedia and Goldman Sachs – Wikipedia 

[3] Blazing Saddles – Simple Farmers You Know Morons – sub esp – YouTube  It wasn’t always so.  “Freedom of Speech” – NARA – 513536 – Freedom of Speech (painting) – Wikipedia 

[4] The basic conflict between “elites” and “populists” is portrayed in The Bloodening – YouTube

Refugees 2.

            The last decades of the Twentieth Century and the first decades of the Twenty-First Century witnessed a sharp increase in international migration from Developing countries toward Developed countries.[1]  The conditions greatly differed from the post-1945 refugee situation. 

In a first round of resistance to mass migration, target or destination countries acted in a reasonably civilized way.  Passengers on international flights were required to present a visa for the country to which they were traveling before they were allowed to board.  The United Nations High Commissioner for Refugees (UNHCR) established the principle that refugees and asylum-seekers should remain in the country of “first arrival” while their application to enter a Developed country was reviewed by the government of the destination country.  So, someone fleeing Rwanda would first arrive in the neighboring Congo, then apply to become a Baptist preacher or radio announcer in the United States.  The countries of first arrival held the refugees while their applications were reviewed (and rarely approved).  In return, the Developed countries paid for the maintenance of the refugees in big camps in the country of first arrival. 

When that didn’t work, the gloves came off.  The insoluble disaster that is Haiti provided a taste of harsher measures to come.  Beginning in the 1980s, large numbers of Haitians made perilous small-boat journeys toward the United States.  American law holds that anyone setting foot on American soil has the right to apply for asylum as a refugee.  The Reagan administration ordered the Coast Guard to intercept the refugees at sea and turn them back.  Beached in Haiti, many of them just began the voyage anew.  In 1991, tiring of this game of whack-a-mole, the George H. W. Bush administration began diverting the refugees to Guantanamo.  More recently, huge numbers of people from Central and South America have tried to enter the United States through Mexico.  This has created a very real crisis at the Southern border. 

            The tide of refuge-seekers rose around Europe and Australia as well.  Refugees from the Syrian civil war provided a spearhead.  Angry with the European Union, in 2015 Turkey’s Recep Tayyip Erdogan got them moving toward the Aegean Sea and EU member-state Greece.  From there long caravans walked north toward the heart of the EU.  People from many other places joined the flow.  Then came the collapse of order in Libya after the American-led air assault.  First, crime gangs took over the ports and began exporting African migrants.  European navies began picking up survivors of ship-wrecks, then started pre-empting the ship-wrecks by working closer to the Libyan shore.  The EU eventually found a solution in paying the Turks to stop sending refugees and hiring the governments of African countries along the overland routes to Libya to block passage.  For their part, the Australians have used their navy to intercept migrant craft at sea, then sent the migrants to places like Christmas Island. 

Now Britain and Denmark have struck bargains with Rwanda to send asylum-seekers there while their applications wend their way.[2]  Paul Kagame, Rwanda’s ruler, is shining up his country’s international reputation.  Rwanda already receives refugees from Burundi, the Democratic Republic of Congo, Eritrea, Somalia, and Sudan.  On the surface, at least, the refugee center at Gashora, Rwanda, offers good value for the money: clean, orderly, and fenced.  Doubtless he will have imitators since the problem of unwanted migrants isn’t going away soon. 


[1] Max Fisher, “How Domestic Politics Unravel The World’s Pledge to Refugees,” NYT, 18 April 2022. 

[2] Abdi Latif Dahir, “Rwanda Offers Refuge, But Critics Are Skeptical,” NYT, 10 October 2022. 

Refugees 1.

            The Twentieth Century could be called the “Century of Refugees.”[1]  The two World Wars unleashed immense floods of people in flight.  Then the break-up of the multi-ethnic colonial empires and the painful process of nation-making added millions more people.  Now the Twenty-First Century seems likely to receive the title. 

            The current regime for refugees arose in the wake of the Second World War.[2]  Having turned a cold shoulder to the victims of Nazism before the war, many governments agreed to absorb and care for the victims of Stalinism after the war.  In 1951, governments agreed that those who could not remain safely in their own country could seek safety (refuge) in another country.  They could remain there if they met the receiving country’s standards for permanent residence.  National laws were written to comply with international agreement.  Poles, Ukrainians, Germans, Czechs, Hungarians, Rumanians, Bulgarians, and people from the Baltic countries moved westward out of fear of the Red Army or later to escape from the dictatorships put in place by the Red Army.[3]  The vast majority of these refugees moved within a few years of the end of the war in Europe; relatively small numbers arrived in later years.  The problem receded.[4] 

            The break-up of the colonial empires spawned further refugee movements.  European settlers in British Kenya and Rhodesia, French Algeria, Dutch Indonesia, the Belgian Congo, and Portuguese Mozambique and Angola bolted for “Home” when nationalist movements took power.  These people of European ancestry, too, were relatively easily taken in.  Other aspects of de-colonization had a less satisfactory outcome.  The partition of British India in 1947 sent Hindus and Muslims trekking in search of safety.[5]  India and Pakistan absorbed the survivors. 

            There soon appeared an entirely unanticipated and much more complicated issue.  It is with us yet.  Many of the post-colonial “new nations” failed to develop as real countries.  They benefitted from Western tropical medicine and agricultural improvements, setting off rapid population increases.[6]  Few benefitted from a reasonable strategy for economic development.  Worse, many were fractured by ethnic or tribal conflicts.[7]  Authoritarian kleptocrats stole everything that wasn’t nailed-down or red hot.  Growing populations ate up whatever was left.  Complaining did no good.[8]  So, for many years now, all sorts of non-European people have been trying to find refuge in a Developed country.  They want work and safety.  Who can blame them? 


[1] Michael Marrus, The Unwanted: European Refugees in the 20th Century (1985).   

[2] Max Fisher, “How Domestic Politics Unravel The World’s Pledge to Refugees,” NYT, 18 April 2022. 

[3] Although in 1945 the Anglo-Americans shipped back 2.3 million of them because of a wartime agreement with the Soviets.  See: Nikolai Tolstoy, Victims of Yalta (1977).  During the war, many East Europeans had become entangled with the Nazis out of their hatred of the Soviets and—often–Jews.  Certain embarrassments later arose. 

[4] Although into the 1970s West German television gave nightly weather reports on lost German lands in the East. 

[5] Google Margaret Bourke-White’s photography for Life magazine. 

[6] See: The Man Who Saved a Billion Lives. | waroftheworldblog 

[7] For example, see: Indonesian mass killings of 1965–66 – Wikipedia; Nigerian Civil War – Wikipedia (1967-70); and Expulsion of Asians from Uganda – Wikipedia (1972). 

[8] Once upon a time, the dictator of Guinea-Bissau had all his political prisoners brought to a soccer stadium; the PA system began to play “Those Were the Days” Mary Hopkin – Those Were The Days – 1968 – YouTube; then the machine guns opened fire.  On another occasion, the President of France suffered embarrassment when it was revealed that he had accepted a private gift of jewelry from a dictator who had eaten some school children. 

When the CHIPS are Down.

            In the wake of the Great Depression and the Second World War, there emerged a “neo-capitalism” in Western countries.  On the one hand, international bodies would create the framework for global trade and payments in order to encourage economic growth and respond to crises.[1]  On the other hand, national governments would create the framework for prosperity in individual countries through monetary policy, a favorable regulatory environment, infrastructure investment, and an expanded social safety net.   

The American rivalry with China has brought into high relief one interesting aspect of post-1945 business-government relations.  As part of creating the framework for prosperity, the U.S. government once led the world in support for research and development as a share of GDP.  Examples of government investment in basic research include energy, automobiles, aircraft, pharmaceuticals, the internet,[2] and semiconductors.  One could also include more historical examples like the Springfield Armory[3] and support for the transcontinental railroad. 

By 2017, at 0.65 percent of GDP, the U.S. had fallen to seventh place, lagging behind China, South Korea, Norway, Germany, Sweden, and Japan.  France spends almost as large a share as does the U.S., with Britain and Russia at or below 0.5.[4] 

Why does this matter?  According to a common theory, basic research can yield new technologies whose practical applications cannot be foreseen.  Moreover, technology development (like any new product) is not guaranteed to work out.  Individual private businesses therefore can’t afford to spend money on basic research which may never produce any useful outcome, and they find it difficult to obtain financing to develop new technology.  Forward-looking governments have often paid for some of the costs of basic research on emerging technologies.  Once developed, these technologies can be adopted by private businesses who understand the applications.  The resulting economic growth, and the tax revenue that flows from it, more than repay the government investment.
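The market-failure logic behind that theory can be sketched with a few lines of arithmetic.  All numbers below are hypothetical, chosen only to show the gap between private and social returns on basic research:

```python
# Hypothetical numbers: why a private firm may rationally pass on basic
# research that would still pay off for the economy as a whole.

cost = 100          # up-front cost of a research program
p_success = 0.25    # chance the research yields a usable technology
payoff = 1000       # total economic value created if it succeeds
captured = 0.25     # fraction of that value the funding firm can capture
                    # (the rest "spills over" to other adopters)

# The firm weighs only the value it can capture.
private_value = p_success * captured * payoff - cost

# Society gains the whole payoff, spillovers included.
social_value = p_success * payoff - cost

print(private_value)   # -37.5: a losing bet for the firm, so it passes
print(social_value)    # 150.0: still a winning bet for the economy
```

On these assumptions the research never gets funded privately, which is exactly the opening that a DARPA-style government research budget is meant to fill.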

Why did this happen?  American government spending on research and development rose sharply during the 1940s and 1950s, peaked at 1.8 percent of Gross Domestic Product (GDP) in 1964, then fell sharply through the 1970s, rose again during the 1980s, fell again from the late 1980s to 2000, and has bumped along at about 0.65 percent of GDP since then.  This roughly tracks the movements of the defense budget.[5]  Here it is worth knowing something about the Defense Advanced Research Projects Agency (DARPA).[6]  It may be that an unforeseen consequence of various “peace dividends” was a unilateral basic-research disarmament.[7] 

Recently, an effort has been made to revive government investment in technology.  At the same time, the spending authority seems to be shifting from the Department of Defense toward the Department of Commerce.  Will this industrial policy become ordinary industrial politics?  


[1] See: Bretton Woods system – Wikipedia 

[2] Look up J.C.L. Licklider (1915-1990) some time.  Where’s his statue? 

[3] See: Springfield Armory – Wikipedia 

[4] David Leonhardt, “U.S. Doesn’t Invest in Innovation Like It Used To,” NYT, 10 December 2021. 

[5] See: US Government Defense Spending History with Charts – a www.usgovernmentspending.com briefing 

[6] DARPA – Wikipedia

[7] That isn’t the only area of depletion.  Fact check: Trump exaggerates on munitions shortage | CNN Politics.  Except that the current world-wide scramble for munitions to send to Ukraine suggests that he did not exaggerate. 

Climate of Fear XXV.

            In a world organized into competing nation-states, transnational problems can be difficult to resolve.[1]  On some level, all governments depend upon the consent of the governed.  Even autocratic governments endowed with powerful security forces can find themselves in an awkward spot when mass dissent bubbles up.  So preserving and enhancing the national welfare is a common goal of national governments.  Climate diplomacy illustrates this principle. 

            In 2015, the United Nations-sponsored Paris Climate Accord embraced the “Green Climate Fund,” created in 2010.  The Fund’s goal was to have Developed economies pay $100 billion a year to help Developing economies deal with the effects of climate change and make the shift away from fossil fuels.[2]  President Barack Obama pledged the United States to contribute $3 billion to the fund and did transfer $1 billion before the end of his term. 

            The United States signed the Paris Climate Accord as an Executive Agreement, not as a legally binding treaty.  President Obama took this route because treaty ratification requires a two-thirds majority vote in the Senate, and he knew that he could not win ratification.  Hence, no subsequent government, Republican or Democratic, had any legal obligation to fulfill its terms.[3]  Many Americans (chiefly Republicans, but also many Democrats) feel no obligation to pay “climate reparations.”[4]  President Donald Trump withdrew the United States from the Paris Accords with the United States having met only 5 percent of its pledge under the 2015 agreement.  President Joe Biden rejoined the Paris Accords, pledging to give $11.4 billion a year by 2024.  A Democrat-controlled Congress agreed to provide $1 billion.[5] 

            Foreign leaders, and not just kleptocrats eager to get their snouts in the trough, find American obeisance to voters frustrating.  French President Emmanuel Macron complained that “Europeans are paying; we are the only ones paying.”  UN climate official Nigel Topping insisted that the election cycle should have no bearing on international commitments.  Meanwhile, major carbon-consumers China, Russia, and India skipped the conference in Egypt. 

            To take another angle, oil reserves are what might be thought of as a limited or unsustainable asset.  As they get used up, more is not created.  For owners of oil reserves, the question becomes one of how to maximize the value derived from the asset.  For example, Saudi Arabia possesses about a century’s worth of oil.  If people are going to go on burning carbon for energy for a century, then Saudi Arabia’s interest could lie in supporting a stable global oil market with prices low enough to prevent a shift to alternative energy sources.  If people are going to stop burning carbon by, say, 2050, then Saudi Arabia’s interest could lie in getting as high a price as possible for oil right now and in the near future.[6]  That could be achieved by reducing the amount of oil being pumped out of the ground in order to keep prices high.[7] 
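The timing trade-off sketched above can be illustrated with a toy discounted-revenue calculation.  The quantities, prices, and 5 percent discount rate are all assumptions for illustration, not Saudi data:

```python
# Toy model: present value of an oil reserve sold two different ways.

def discounted_revenue(barrels_per_year, price, years, rate=0.05):
    """Sum each year's revenue, discounted back to the present."""
    return sum(
        barrels_per_year * price / (1 + rate) ** t
        for t in range(years)
    )

# Scenario 1: carbon burning lasts a century, so sell steadily at a
# moderate, substitution-discouraging price.
century_demand = discounted_revenue(barrels_per_year=1.0, price=60, years=100)

# Scenario 2: demand ends around 2050, so restrict supply and sell
# dearly while buyers remain.
early_exit = discounted_revenue(barrels_per_year=2.0, price=100, years=28)

# With these assumptions, the sell-dear-now strategy is worth more.
print(early_exit > century_demand)   # True
```

The point is not the particular numbers but the mechanism: the shorter the expected life of demand, the more an owner gains by pumping less and charging more today.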

            History is full of conflicts between Cosmopolitanism and Parochialism.  This is one. 


[1] See: The origins of the First and Second World Wars. 

[2] See: Green Climate Fund – Wikipedia 

[3] The nuclear agreement with Iran took the same form for the same reason and met the same fate. 

[4] For parallels, see: World War I reparations – Wikipedia and War Debt Issue (u-s-history.com) 

[5] Lisa Friedman, “Biden Will Face a World Demanding Reparations,” NYT, 11 November 2022. 

[6] Walter Russell Mead, “The Quagmire of Climate Diplomacy,” WSJ, 11 October 2022. 

[7] If that coldly rational strategy has the added benefit of sticking a thumb in the eye of foreign critics of Saudi Arabia’s ruler, all the better. 

The War of Symbols in Iran.

            In the aftermath of the Era of the French Revolution and Napoleon (1789-1815), defenders of the established order tried to work out a philosophical rationale for Conservatism.  They argued that societies are “organic” (like an orange), rather than “mechanical” (like a clock).[1]  They develop over long periods of time according to specific historical experiences.  Each society is different, even if there exist broad similarities in some areas.  People in one era could not just remake their society according to some plan for the betterment of all mankind which had just arrived in the mail from the Jesuits or the Freemasons or the son of some imprisoned banker in Nigeria.  Thus, Great Britain had become a constitutional monarchy through hundreds of years of incremental change.  Russia had become an autocracy through hundreds of years of different incremental change.  Change didn’t stop, so societies would creep along toward the future, altering by incremental change.[2] 

            This theory might be tested in contemporary Iran.  In the early 1920s, a Persian soldier named Reza Khan (1878-1944, r. 1925-1941) won the backing of the British for the overthrow of the Persian Shah.  He modeled himself on his modernizing neighbor in Turkey, Mustafa Kemal “Ataturk.”  Reza Shah built roads, railroads, factories, and schools; he struggled with a conservative clergy; he banned the photographing of camels; and he sought to encourage both Westernization in dress and the relative emancipation of women.  The latter two came together in his effort to ban the chador.  Neither one appealed to the conservative mainstream.  Still, he was ruling over an agrarian and rural society in which tribes and the clergy were very powerful.  The shrewd old bugger never pushed change so far or fast as to trigger the same sort of revolution by which he had overthrown the previous shah.[3] 

            His son, Mohammad Reza Pahlavi (1919-1980, r. 1941-1979), drove ahead maniacally with Westernization and modernization.  Conservative Islam rallied many of his rural, working class, and clerical opponents, with the hijab as a visible symbol of opposition.  In 1979, revolution toppled the Shah.  The new Islamic Republic required that women cover their hair.  The long war with a secular Iraq (1980-1988) then consolidated the veil as a patriotic symbol. 

            Now, more than forty years have passed.  Iran has continued along its path from rural to urban.  Women have gained much more access to both education and careers, albeit within a framework of conservative Islamic belief.  Young people, especially young women, are thinking in a different way than did their mothers and grandmothers.[4]  Yet the old order remains in place.  It has not made incremental adjustments.  The morality police continue to patrol the streets, hunting evil-doers without enough bobby-pins.  The government is made up of old men.  They are both tightly bound to the ideas of their youth and incompetent to handle many normal tasks of governing. 

            Now it has come to street demonstrations, violence from the forces of order, and accusations of foreign meddling.  The hijab was popular when Iranians wanted to resist change being forced on them.  Now it’s the symbol of resistance to change that many Iranians desire. 


[1] Hence the origins of the expression “queer [which meant “odd” in those days and was not a term of abuse] as a clockwork orange.”  From which came Anthony Burgess, A Clockwork Orange (1962). 

[2] Joseph de Maistre, The Divine Origins of Constitutions (1810). 

[3] Robert Byron, The Road to Oxiana (1937) recounts a motoring excursion with friends in Persia and Afghanistan. 

[4] Amanda Taub, “Hijab Protests in Iran Expose Deep Divide In Visions of Future,” NYT, 7 October 2022. 

Seditious Conspiracy 2.

            Five members of the right-wing militia group the “Oath Keepers” are on trial for seditious conspiracy.  From the first, the government acknowledged the difficulties it faced.  “It is rare that a conspiracy can be proven by direct evidence of an explicit agreement.”[1] 

            Instead, in the seditious conspiracy trial of Stewart Rhodes and four other leaders, the government has presented much evidence on the “mind-set and motives”[2] of Rhodes “in the post-election period.”  They have also demonstrated the close coordination between Rhodes and other Oath Keeper leaders.  Finally, they have shown the existence of two armed “quick reaction forces” at Arlington National Cemetery and a Virginia hotel. 

            However, a critic might argue that the government should have had direct evidence of a conspiracy.  The FBI had informants among the Oath Keepers.  In November 2020, an Oath Keeper contacted the FBI after listening to a talk by Stewart Rhodes that he found alarming.[3]  A low-level member who eventually participated in the attack on the Capitol told FBI agents beforehand that the group had no plans to attack the Capitol or interfere with the election’s certification.  Most importantly, a third source was Greg McWhirter, vice president of the Oath Keepers.[4]  From early in 2020, McWhirter had been reporting to the FBI on the Oath Keepers.[5] 

            The government chose not to call any of its informants as witnesses.  It’s easy to understand why not when some of the witnesses they did call said that the invasion of the Capitol had been a “spontaneous” act by the Oath Keepers on the spot, rather than the implementation of a formal plan by Rhodes and the others.  One Oath Keeper who has struck a plea deal with the government could offer no information on the planning or intent of the Oath Keeper leaders for 6 January.[6]  Rather, “when the crowd got over the barricade and they went into the building, an opportunity presented itself to do something.” 

            While the government didn’t put up its informants, Stewart Rhodes did take the stand in his own defense.  He resolutely denied the existence of any plot to attack the Capitol or interfere with certification of the election.  For his part, McWhirter was called by the defense, not the prosecution.  He hasn’t yet testified because he suffered a heart attack on the way to Washington.

If this dog won’t hunt, the prosecutor will say sententious things about “accepting the decision of the jury.”  Meanwhile, the accused will have been held up to (well-deserved) public shaming and been loaded with legal bills.  Was that the point all along? 


[1] Alan Feuer, “Key to Jan. 6 Trial: Did Oath Keepers Plan Their Role?” NYT, 11 November 2020.  This sounds a bit like arguing that the absence of evidence is itself proof of the conspiracy.  “That’s how a conspiracy works.” 

[2] Like other people on the far right, Rhodes is reported to believe that the Chinese government has a grip on Joe Biden.  Many of the allegations made in conversation bear a marked resemblance to the contents of the Steele Dossier concocted against Donald Trump. 

[3] Apparently, the report fell through the cracks because the FBI only contacted him after 6 January. 

[4] Alan Feuer and Adam Goldman, “Informant Likely to Testify for Defense in Oath Keepers Trial,” NYT, 9 November 2022; Alan Feuer, “Key to Jan. 6 Trial: Did Oath Keepers Plan Their Role?” NYT, 11 November 2020. 

[5] Among other things, McWhirter reported that after the right-wing activist Aaron Danielson was ambushed and killed on 29 August 2020, allegedly by “antifa” activist Michael Reinoehl, Rhodes had talked about attacking “antifa” members in the Portland, Oregon area.  Apparently the FBI could not obtain a warrant for more intrusive measures to investigate McWhirter’s claim. 

[6] ‘Sorry for what I did’: Oath Keeper who pleaded guilty for Jan. 6 breach breaks down on the stand – POLITICO  His plea agreement required him to testify that there had been a plot.  The best he could manage on the stand was that he believed, based on what happened, that there had been an “implicit plot” to which he was not privy. 

Seditious Conspiracy 1.

            In 1798, faced with the unexpected emergence of a two-party political system during an age of rising authoritarianism, the Federalist Party passed the Sedition Act.  The law allowed the Federalists to prosecute their critics among the Democratic-Republicans.  The widely unpopular act was allowed to lapse in 1800.[1]  In 1859, after John Brown’s private militia attacked Harper’s Ferry, Senator Stephen A. Douglas proposed a new law against seditious conspiracy.  In 1861, after Southern states seceded, Congress barred the barn door by passing another Sedition Act, subsequently revised.[2]  At its core is a definition of “Seditious conspiracy.” 

            “If two or more persons in any State or Territory, or in any place subject to the jurisdiction of the United States, conspire to overthrow, put down, or to destroy by force the Government of the United States, or to levy war against them, or to oppose by force the authority thereof, or by force to prevent, hinder, or delay the execution of any law of the United States, or by force to seize, take, or possess any property of the United States contrary to the authority thereof, they shall each be fined or imprisoned not more than 20 years, or both.”[3] 

            “Conspire,” in turn means “to join in a secret agreement to do an unlawful or wrongful act or an act which becomes unlawful as a result of the secret agreement” or “to act in harmony toward a common end.”[4] 

            On those rare occasions when the government has prosecuted people for seditious conspiracy, the Justice Department hasn’t often carried away the laurels.  In 1920, three Communists were tried, but the judge ruled that the government hadn’t shown any connection between the evidence of sedition and the people charged, or that the accused had conspired.  In 1936, a bunch of Puerto Rican nationalists were convicted, although it took two trials (along with what supporters of the Nationalists called jury-rigging) to get there.  In 1940, members of the anti-Semitic “Christian Front” were prosecuted, but the jury refused to convict.  In 1988, a bunch of white supremacists were prosecuted for sedition and other crimes.  The case fell apart when the two witnesses—group members “flipped” by the government—turned out to lack credibility.  All the men were acquitted.  In 1995, the government did manage to convict Sheikh Omar Abdel-Rahman for his role in the 1993 World Trade Center truck-bombing.  In 1996, the Justice Department sicced the FBI on Osama bin Laden after his declaration of war on the United States.  Apparently, the investigation didn’t get anywhere before 9/11.  In 2010, things regressed to the mean when the government prosecuted some “Christian Patriots.”  A month into the case, the judge dismissed the most serious charges. 

Now a bunch of Oath Keepers and Proud Boys are on trial for seditious conspiracy.  The government may have better luck here, what with the whole thing having been on television.[5] 


[1] John C. Miller, Crisis in Freedom: The Alien and Sedition Acts (1951).  Probably no coincidence that he wrote at the start of the “Red Scare.” 

[2] Imagine a bunch of U.S. Marshals showing up at the various legislatures debating Ordinances of Secession and going “You’re all under arrest!” 

[3] Seditious conspiracy – Wikipedia 

[4] The latter isn’t a legal definition.  If it were, then the League of Women Voters and wedding planners would be in jeopardy. 

[5] See: Karen Tumulty on Twitter: “Full-page ad on the back of the A-section of today’s ⁦@washingtonpost⁩ — https://t.co/LaivusLlrn” / Twitter

Tension Ball Politics.

            With the international climate conference now underway, it is easy to see the problems arising from Nationalism.  There are no national boundaries in the environment, but every country wants what is best for itself, devil take the hindmost.  What if that truth nevertheless misses the very real forces encouraging transnationalism in important areas? 

            Inflation offers one way of approaching this issue.  For example, there seems to have been a broad intellectual consensus in the West on the appropriate response to Covid.  While the world awaited the fast-track development of vaccines, most governments responded in a roughly uniform manner.  First, they locked down to various degrees.  The lockdowns disrupted the normal pattern of economic activity, both on the supply side and on the demand side.  The lockdowns reflected an international consensus among medical and scientific experts, rather than just imitative behavior by public officials.[1] 

Second, they generally spent a lot of money on stimulus payments to states and localities, private business, and individuals.  Governments all seem to have overshot the mark because private savings boomed during the pandemic.[2]  Stimulus programs reflected an international expert consensus, rather than accidentally similar national responses.  Now the “mea culpas, but not really” of the heads of central banks closely resemble each other. 

Third, the need for action on climate change and global warming represents another area of consensus among government officials, scientists, and many publics.  There is certainly disagreement over what action, how fast, and at whose cost.  Nevertheless, the need to transition away from dead dinosaurs and toward some alternative energy source is widely accepted.  The trouble is that countries have started to shrink carbon sources of energy before alternative sources are up and running.  “Underinvestment in both fossil fuels and renewable energy infrastructure exposed everyone to crippling supply interruptions.”[3]  Those interruptions also pushed up prices. 

Fourth, demography presents a complex problem for the world, but one aspect of it is common in the Developed economies.  The populations of these countries are aging.  People are moving out of working age without an adequate number of younger people to fill the gaps.[4]  Covid led to early retirements and medical disabilities.  The resulting tight labor markets are part of what is fueling inflation.  However, Covid merely highlighted a much larger problem.  In the absence of immigration from Developing countries, the Developed world could face long-term tight labor markets.  That, in turn, might lead to many more robots of one sort or another. 

            As the Developed world becomes more aligned on policy, populist revolts against experts and the administrative state have surged.[5]  There is a tension here, but no clear solution. 


[1] On the role of the World Health Organization (WHO) see: World Health Organization’s response to the COVID-19 pandemic – Wikipedia 

[2] For the American case, see: The Fed – Excess Savings during the COVID-19 Pandemic (federalreserve.gov)  The pent-up demand represented by these “excess savings” is being spent, in part, because inflation will erode their value.  “Use it or lose it.”  Governments could claw back what’s left of this money to reduce inflationary forces.  That isn’t likely to happen in any democracy. 

[3] Greg Ip, “Inflation a Headache for Leaders Everywhere,” WSJ, 10 November 2022. 

[4] See the remnants of the educational web-site created to support a PBS “Nova” series: NOVA | World in the Balance | PBS 

[5] See: Yellow vests protests – Wikipedia