My Weekly Reader 21 October 2020.

Featured

            The Constitution rose from a foundation of compromises.  Among these compromises was the toleration of slavery by states where it had little to no importance.  Article IV, Section 2, Clause 3 of the Constitution required that any fugitive from “service or labor” who had fled into another state be returned to her/his master.  In sum, the Union mattered more than did slavery or the enslaved people.  A law, the Fugitive Slave Act of 1793, defined the legal mechanisms for returning fugitives.  However, as opposition to slavery increased in the North, local governments and private citizens often refused to co-operate with, or even obstructed, the slave-catchers operating among them.  Therefore, another compromise, the Compromise of 1850, introduced a much more rigorous Fugitive Slave Act.  The new Act further inflamed Northern opinion.

            Northern opinion divided more than this brief sketch suggests.  Anti-Black racism ran neck and neck with abolitionism in many places.  Many parts of the North valued their economic connections to the South and to slavery.[1]  Competition between political parties sometimes diverged from principled stands on issues.  All these forces came together in New York City before the Civil War.[2] 

            The city’s government dangled as a puppet of Tammany Hall, the Democratic Party organization.  Tammany pols played on the hostility to Blacks felt by the (predominantly Irish) immigrants they were organizing to vote early and often.  Judges and prosecutors had often crawled out of the same swamp.  New York City policemen sometimes moonlighted as slave-catchers.  Businessmen who wanted to accommodate Southern customers turned a blind eye to it all.

            Slave-owners would pay rewards for the return of run-aways, so Blacks in New York—people of color in an overwhelmingly White city—were deer in the jacklights[3] of slave-catchers.  This hunt only intensified with the passage of the Fugitive Slave Act of 1850, which offered handsome fees to both slave-catchers and the judges who approved their transfer Southward.  Lured by the money, the slave-catchers sometimes kidnapped—and judges regularly approved the transfer of—free Blacks who were knowingly misidentified as fugitives.  Applying the term not just to New York, but to the whole of the North, one historian has labeled this the “Reverse Underground Railroad.”[4]

            Highly publicized stories of free Blacks kidnapped into slavery appalled a growing audience of Northern Whites.  Five Black boys were kidnapped from Philadelphia in 1825; the four survivors providentially returned to tell their story of the Black “Trail of Tears” that ran from the Upper South to the new cotton lands of the Southwest.  In 1853, Solomon Northup published his account, Twelve Years a Slave.  Not for nothing has Elizabeth Varon called her book on the Union troops Armies of Deliverance.


[1] Banks financed the cotton trade and its spendthrift planters; insurers and ship-owners profited from the massive cotton exports. 

[2] Jonathan Daniel Wells, The Kidnapping Club: Wall Street, Slavery, and Resistance on the Eve of the Civil War (2020), reviewed by Harold Holzer, WSJ, 19 October 2020. 

[3] See: https://en.wikipedia.org/wiki/Spotlighting 

[4] Richard Bell, Stolen: Five Free Boys Kidnapped into Slavery and Their Astonishing Odyssey (2019), reviewed by David S. Reynolds, WSJ, 17 October 2019.

Reckoning with Racism.

Speaker of the House Nancy Pelosi has ordered the removal of the portraits of four previous Speakers on the grounds that they had supported the Confederacy, either before or after serving in the office she now holds.  “There is no room in the hallowed halls of Congress or in any place of honor for memorializing men who embody the violent bigotry and grotesque racism of the Confederacy.”[1]  This may seem to some to be more like virtue-signaling than substantive change, but it’s a first step.  The United States does need to consider the place of racism in its past and present.  One question is how much truth-telling people want or can stand.

In almost every presidential election from 1852 to 1860 and from 1880 to 1976, the states of the Confederacy and then the former Confederacy voted Democratic.  What is true of presidential elections is even more true of Congressional, state, and local elections.[2]  For most of this period, the Democratic Party was a Southern-dominated party.  Only under unusual circumstances did the Democratic party manage to break out of its geographic and cultural isolation to win large numbers of states in other regions.[3]

The point is that for a hundred years the Democratic Party anchored its electoral base in the old Confederacy.  At times and in terms of political representation, it existed almost entirely as a regional party.  After 1876, the federal government conceded virtual “Home Rule” to the South.  Southern Democrats imposed “Jim Crow” laws,[4] disfranchised African-Americans,[5] created and celebrated the mythology of the “Lost Cause,”[6] put up statues to “Johnny Reb” and to Confederate generals, and lynched with abandon.[7]  Prominent Southern Democrats included Nathan Bedford Forrest, the first Grand Wizard of the Ku Klux Klan, and “Pitchfork” Ben Tillman, who had proudly led a bloody attack on freedmen before representing South Carolina in the Senate.[8]  At the Versailles peace conference, Woodrow Wilson vetoed a Japanese proposal for a “racial equality” statement in the Treaty.  During the Great Depression, much of the New Deal’s aid to Southerners either tacitly or explicitly excluded African-Americans.  Later, the men who murdered Emmett Till and the jury that acquitted them were Democrats.  These examples barely scratch the surface.

In short, and to put it mildly, the Democratic party long resisted racial equality.  Indeed, until within human memory, it formed one of the chief institutional exponents of race hatred in the United States.  How to address this issue?

[1] Emily Cochrane, “Pelosi Removes Portraits Tied to Confederacy From Capitol,” NYT, 19 June 2020.

[2] For presidential elections, see: https://en.wikipedia.org/wiki/Solid_South#Solid_South_in_presidential_elections For gubernatorial elections, see: https://en.wikipedia.org/wiki/Solid_South#South_in_gubernatorial_elections

[3] Notably in 1912, when Theodore Roosevelt’s insurgency split the Republican party, and between 1932 and 1948 when the Great Depression and the Second World War created a national emergency.

[4] See: https://en.wikipedia.org/wiki/Jim_Crow_laws

[5] See: https://en.wikipedia.org/wiki/Disenfranchisement_after_the_Reconstruction_Era

[6] See: https://en.wikipedia.org/wiki/Lost_Cause_of_the_Confederacy

[7] See, if you’ve got a strong stomach: https://en.wikipedia.org/wiki/Lynching_in_the_United_States

[8] Maybe Speaker Pelosi could try to repeal the Tillman Act (1907).

The Logan Act.

Early modern European politics focused on the competition of “factions” organized around powerful individuals, rather than on “parties” organized around competing ideologies.  Hence, the Founding Fathers did not expect political parties to occupy the political system created by the Constitution.  Things didn’t work out as expected.[1]

Early National American politics quickly polarized into Federalists and Democratic-Republicans.  The two parties reflected different strands of the American Revolution in terms of attitudes to the power of the central government and of social groups.  However, the party competition also incited a degree of personal animus that challenged the generally desired rationality of the time.

In a further surprise, although the Patriots had sought to separate themselves from Europe, foreign affairs repeatedly intruded into the political life of the Early Republic.  An anti-monarchical revolution in France initially won broad support in America, then took a radical turn that divided Americans.  Opinion quickly swung from “Oh, they’re like us” to “What if that could happen here?”  Then war broke out between the new French Republic and, well, almost everyone.  Of all the combatants, Britain mattered most to the United States because of the Royal Navy’s control of the seas and of the trade that used those seas.  Federalists came to loathe the French Revolution and to see a natural alignment with Britain, while the Democratic-Republicans sympathized with the aspirations of the sister-republic and put down its excesses to a temporary war expedient.

The debates on these matters between the two parties quickly soured.  Since the Federalists held the White House from 1789 to 1801, they had charge of American foreign policy.  Three things then happened: the Americans and the French fell into a naval Quasi-War (1798-1800)[2]; the Democratic-Republican press made the Federalists the butt of withering attacks[3]; and the Federalists rammed through a battery of “Alien and Sedition Acts” intended to stifle opposition voices.[4]  The Alien and Sedition Acts contributed to a revulsion against the Federalists that brought the Democratic-Republicans into power in 1801.

Although not normally lumped with the Alien and Sedition Acts, the Logan Act (1799) fits well with the general Federalist effort to squash political opposition.  George Logan,[5] a Democratic-Republican with a lifetime of poor judgement behind him, had made a purely personal visit to France during the Quasi-War in hopes of patching things up.  The Logan Act criminalized interference by private individuals in relations between the United States and other countries.  Unlike the Alien and Sedition Acts, the Logan Act was neither sunsetted nor repealed.

 

How has the Logan Act been applied?  Citing the Logan Act has usually amounted to wrapping oneself in the flag in order to harass political opponents.

In 1803, France controlled the mouth of the Mississippi river and obstructed American trade through the port of New Orleans.  Western farmers became exasperated with the barrier to getting their crops to market.  A farmer gave public voice to what must have been common tavern conversation after the second mug of rye.  He wrote a letter to a newspaper arguing that Kentucky should secede from the Union and form an alliance with France.  The newspaper published it.  The outraged United States Attorney in Kentucky—a Federalist hold-over from the Adams administration—got a grand jury to indict the farmer.  It never went any further than that: prosecuting him would have meant putting him in front of a jury of locals.[6]

After service with the U.S. Navy in the Mexican-American War (1846-1848), the Philadelphia-born Jewish-American merchant and sea captain Jonas Levy (1807-1883) stayed on in Mexico.  He came from a family of enterprising people and the apple didn’t fall far from the tree.  Looking for business opportunities, he developed a plan to build a railroad from the Gulf of Mexico to the Pacific across the Isthmus of Tehuantepec.  The isthmus is the narrowest point of Mexico and the Pacific end of the route passes through a gap in the Sierra Madre.  Lots of people wanted to build a railroad there, but one group was in Washington and had the ear of Secretary of State Daniel Webster.  Levy, however, was on the ground, spoke Spanish, and was very go-ahead.  In 1852, he pitched his plan to the Mexican government.  The Mexicans seemed inclined to go with Levy’s plan, so Webster exerted pressure on behalf of the “American” plan.  Levy wrote to the president of Mexico arguing for his own proposal.  When American diplomats reported this to Washington, Webster got Levy indicted under the Logan Act.  The trouble was that Webster didn’t have any evidence, just hearsay.  The indictment went nowhere.

Herbert Hoover (1874-1964) made a fortune as a mining engineer, did a fantastic job organizing American civilian relief aid for Europe in the First World War, served as Secretary of Commerce, and won election as President in 1928.  Then the Depression hit and he got creamed in the 1932 election.  Hoover stomped off into retirement to sulk and fulminate against Franklin D. Roosevelt and all his works.  When the Second World War first broke out, Roosevelt offered Hoover an olive branch in the form of co-ordinating American relief for European civilians.  Hoover turned down the offer, if only because he hated Roosevelt.  He got busy organizing his own program of relief for Poland and then for Belgium.  By mid-1940, the situation had changed.  Poland and France had been defeated, Britain stood alone, and Britain’s naval blockade of Continental Europe offered an important source of pressure on Germany.  Nevertheless, Hoover pressed ahead.  In February 1941, Under-Secretary of State Sumner Welles publicly warned that Hoover might be in violation of the Logan Act.

More recently, a string of people has been accused of violating the Logan Act.  Some were Democrats: George McGovern, Jesse Jackson, Nancy Pelosi, and John Kerry.  Some were Republicans: 47 Senators, candidate Donald Trump, and Rudy Giuliani.  None has been prosecuted.

In January 2017, as they prepared to interview National Security Advisor-designate Michael Flynn, FBI agents discussed trying to get Flynn to admit he had violated the Logan Act.

[1] Richard Hofstadter, The Idea of a Party System: The Rise of Legitimate Opposition in the United States, 1780–1840 (1969).

[2] See: Alexander De Conde, The Quasi-War: The Politics and Diplomacy of the Undeclared War with France, 1797-1801 (1966).  An oldie, but a goody.

[3] See, for example, James Callender.  Michael Durey, With the Hammer of Truth: James Thomson Callender (1990).

[4] John Miller, Crisis in Freedom: The Alien and Sedition Acts (1951).  Same as De Conde.  The point here is that if the necessary documents are available and a good historian gives them a careful analysis, then most of the subsequent scholarly literature is just an elaboration.

[5] Frederick B. Tolles, George Logan of Philadelphia (1953).

[6] In much political theory, Senators serving six-year terms and appointed officials are supposed to have a braking effect on “populist” impulsiveness and passion.  However, the opposite may also be true.

Chronology of a Tragedy.

By 20 April 2020, 773,000 people in the United States had tested positive for the coronavirus.  Of these, 247,543 were in New York, mostly in New York City and its suburbs.  New Jersey had 88,806 confirmed cases.  That works out to about 32 percent of the cases being located in New York City and its immediate area.  If you include New Jersey’s 88,000, then the New York area accounts for about 43 percent of the cases.[1]
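For anyone who wants to check the arithmetic, here is a minimal sketch in Python (mine, not the Times’s).  The case counts are the ones quoted above; the percentages in the text follow from simple division, rounded to the nearest whole percent.

us_cases = 773_000        # confirmed U.S. cases by 20 April 2020, as cited above
ny_cases = 247_543        # New York State
nj_cases = 88_806         # New Jersey

ny_share = ny_cases / us_cases                    # about 0.32, i.e. roughly 32 percent
ny_nj_share = (ny_cases + nj_cases) / us_cases    # about 0.435, i.e. roughly 43 percent

print(f"New York share of U.S. cases: {ny_share:.1%}")
print(f"New York plus New Jersey share: {ny_nj_share:.1%}")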

How did New York City come to be the present American epicenter of the coronavirus pandemic?[2]

“From the earliest days of the crisis, state and city officials were also hampered by a chaotic and often dysfunctional federal response, including significant problems with the expansion of testing, which made it far harder to gauge the scope of the crisis.”  The same was true of every part of the country, so that doesn’t explain why New York got hit hardest by far.

“Epidemiologists have pointed to New York City’s [population] density and its role as an international hub of commerce and tourism to explain why the coronavirus has spread so rapidly.  And it seems highly unlikely that any response by the state or city could have fully stopped it.”  The same seems likely to be true of the national government.  The question is how much government action could have limited the damage.

Nevertheless, in the view of Dr. Thomas Frieden, former head of the Centers for Disease Control and Prevention, closing the schools, stores, restaurants, and other public venues one to two weeks earlier could have reduced the death toll in New York by 50 to 80 percent.

 

January-February 2020: coronavirus “devastates” China and Europe.

 

21 January 2020: first confirmed case in the United States, in Seattle, Washington.

 

23 January 2020: Chinese government seals off Wuhan.

 

30 January 2020: WHO declares a global health emergency.

 

31 January 2020: US bars entry for any foreign national who had traveled to China in the previous 14 days.

 

It now appears that coronavirus was present in New York City before the first person tested positive for it.  Infectious disease specialists had known for weeks that the federal tests were defective and that infected people were almost certainly present and circulating.  One specialist in infectious diseases for a New York hospital group said later that it was apparent by late January 2020 that cases would soon appear in the United States.

 

2 February 2020: first coronavirus death outside China—in the Philippines.

 

5 February 2020: Japanese government quarantines a cruise ship which carried passengers infected during the trip.

 

7 February 2020: Infectious disease specialists and other doctors confer on federal criteria from the CDC for testing.  The guidelines were too strict, sharply limiting who could be tested.  According to one of those present, “It was at that moment that I think everybody in the room realized, we’re dead.”

 

Early February 2020: Dr. Oxiris Barbot, NYC Health Commissioner, states that “this is not something you’re going to contract in the subway or the bus.”

 

14 February 2020: France announces first coronavirus death.

 

19 February 2020: first two cases in Iran announced.

 

23 February 2020: Italy sees surge in cases in Lombardy.

 

24 February 2020: passenger already infected by coronavirus arrives at JFK on a flight that originated in Iran.

 

24 February 2020: Trump administration asks Congress for $1.25 billion for coronavirus response.  US has 35 cases and no deaths.

 

28 February 2020: number of cases in Europe rises sharply.

 

Late February 2020: Mayor Bill de Blasio tells a news conference that “We can really keep this thing [coronavirus] contained.”

 

29 February 2020: first US death, in Seattle.

 

1 March 2020: the passenger from Iran tests positive for the coronavirus, making her the first identified case in New York City.

 

2 March 2020: Governor Andrew Cuomo and Mayor de Blasio address a news conference.  Cuomo says “Everybody is doing exactly what we need to do.  We have been ahead of this from Day 1.”  Cuomo told the conference that “Out of an abundance of caution we will be contacting the people who were on the flight with her from Iran to New York.”  Then everyone would be traced and isolated.  According to the NYT, this didn’t happen because the CDC would not authorize an investigation.

 

3 March 2020: lawyer in New Rochelle tests positive.  He had not travelled to any affected country, so there was reason to suspect he had contracted the virus in New York.  City health investigators traced his travels and contacts to Manhattan, but the state of New York put a “porous” containment line around New Rochelle.

 

3 March 2020: US government approves widespread testing.

 

5 March 2020: New York City mayor Bill de Blasio said that “You have to assume that it could be anywhere in the city.”  However, he also said that “We’ll tell you the second we think you should change your behavior.”

 

If Dr. Frieden is correct that the city should have shut down one to two weeks before it did, then that date would have been sometime between 8 and 15 March 2020.

 

About 7 March 2020: city hospitals start reporting a sharp increase in influenza-like cases and the NYPD reports increased numbers of officers calling in sick and of 911 calls for coughs and fevers.

 

Second week in March 2020: De Blasio wanted widespread testing, but the city’s Health Department urged a public information campaign to tell those with mild symptoms to self-isolate at home, rather than infect others at testing centers.  De Blasio blocked the public information campaign for about a week.

 

At some point not stated by the NYT, de Blasio did urge New Yorkers to practice social distancing and working from home where possible; and de Blasio and Cuomo had both ordered occupancy limits on bars and restaurants.  These limits were broadly ignored.

 

Moreover, de Blasio resisted closing the schools.  The schools provide nutritious meals and a safe space for their students, and not in some touchy-liberal sort of way either.[3]

 

11 March 2020: US bars most travelers from Europe.

 

12 March 2020: San Francisco closes its schools when 18 cases had been confirmed; Ohio closes its schools when 5 cases had been confirmed.

 

12 March 2020: At a meeting of business executives chaired by de Blasio, City Health Commissioner Barbot told the group that 70 percent of the city’s population could become infected.  De Blasio “stared daggers at her.”

According to one person present at the meeting, de Blasio rejected closing restaurants.  “I’m really concerned about restaurants; I’m really concerned about jobs.”  It was a legitimate concern from one perspective.  According to one estimate, tourism accounts for 300,000 jobs in New York City.  This is twice as many as the tech sector provides and vastly more than the jobs linked to the financial services industry.[4]  Closing down restaurants, bars, tourist activities, hotels, and sporting events would hammer the incomes of poor people much more than the incomes of rich people.  He appears to have thought that New York City would never have to close.  In reality, it was a choice between closing the city earlier or later.  However, in the event, the virus spread rapidly.  The health burden has not been shared equally between different social groups.[5]

 

13 March 2020: Trump declares national emergency.

 

13 March 2020: Los Angeles closes its schools after 40 cases had been confirmed.  New York City had almost 160 confirmed cases.

 

15 March 2020: City health officials give de Blasio a grim warning about the number of infections and deaths if the schools—and most businesses—weren’t closed immediately.

 

15 March 2020: De Blasio closes the schools when 329 cases had been confirmed.

 

15 March 2020: CDC recommends no gatherings of more than 50 people.

 

17 March 2020: seven California counties around San Francisco issued stay at home orders.

 

17 March 2020: France orders national lock-down.

 

19 March 2020: California issues state-wide stay at home order with 675 confirmed cases.  New York then had 4,152 cases.

 

20 March 2020: New York State issues state-wide stay at home order, effective 22 March 2020.  On 20 March, the state had more than 7,000 confirmed cases.

 

Recently, the New York Times ran a piece considering the long-term consequences of the pandemic’s impact on New York.[6]  Much of the economic basis of the city may be hollowed out.  This is particularly true if a vaccine is not developed and mass-produced very soon.  Tourists may shrink from visiting a densely-crowded city.  Tourist amenities from theaters to museums to restaurants to public transportation systems may impose social-distancing regimes that capsize the business model of the industry.  Both the financial services and technology sectors may extend their work-from-home adaptations, while many workers may decide that the home from which they are working might as well be somewhere other than high-priced New York.  Demand for office and residential space could fall, clobbering the construction industry.  The city’s budget would have to absorb a huge drop in revenue.  Services to the poor would shrink.

Sometimes Tragedy is born of the collision of two Goods.

 

[1] “Tracking an Outbreak,” NYT, 21 April 2020, p. A4.

[2] J. David Goodman, “How Outbreak Kept New York A Step Behind,” NYT, 8 April 2020.

[3] See: Andrea Elliott, “Invisible Child.  Girl in the Shadows: Dasani’s Homeless Life,” NYT, 9 December 2013.  http://www.nytimes.com/projects/2013/invisible-child/index.html#/?chapt=1

[4] J. David Goodman, “It Could Be Years Before New York Regains Its Glory,” NYT, 21 April 2020.

[5] For one example, see: John Eligon et al, “Black Americans Bear The Brunt As Virus Spreads,” NYT, 8 April 2020.

[6] J. David Goodman, “It Could Be Years Before New York Regains Its Glory,” NYT, 21 April 2020.

The Exhaustion of Liberalism?

Barton Swaim[1] describes modern liberal democracy in North America and Western Europe:

“Liberal democracies value divided governmental institutions, a regulated market economy, a generous welfare state, personal autonomy and the expansion of political rights to formerly excluded classes.”[2]

Both “conservatives” and “liberals” share these beliefs.  Where they differ is that “liberals” have a deep faith in the ability of government to improve the human condition, while “conservatives” harbor profound doubts.

The “liberal” achievement in Twentieth Century America has been immense: the Pure Food and Drug Act (1906); the enfranchisement of women (1920); the Social Security Act (1935); the Civil Rights Act (1964); the Food Stamp Act (1964); the Voting Rights Act (1965); and the amendment of the Social Security Act to create Medicare and Medicaid (1965).  Most of these laws passed during brief periods when a fundamentally conservative country favored dramatic change.

Swaim sees the historical record as demonstrating the exhaustion of liberalism, although not of liberal democracy.  Much of the liberal agenda has been fulfilled.  There aren’t any more disfranchised people to enfranchise—except for criminals and non-citizens.  Liberals have turned from defending free speech to curtailing it through campus speech codes, demands that social media censor speech that they characterize as “false,” and demands that the Supreme Court’s “Citizens United” decision be overturned.  Increasingly, they place their trust in unelected experts and bureaucrats to know better than do elected officials.  President Obama extended government regulation of business through federal agency rule-writing because he couldn’t get it through Congress, and President Trump is rolling it back in the same way.

Furthermore, he says, liberals haven’t passed any transformative legislation since the mid-Sixties.  The popular support among voters just isn’t there.  Instead, Swaim argues, liberal reforms have advanced along two lines since the Sixties.  On the one hand, liberal legislative reforms have become increasingly small-scale: the Clean Air Act (1970); the Clean Water Act (1972); and the Affordable Care Act (“Obama Care,” 2010).  On the other hand, and far more importantly, the Supreme Court has approved policies that would not have passed Congress: abortion (1973) and marriage equality (2015).

To the extent that the Democrats have “big ideas,” he says, they are not traditionally “liberal” but “radical.”  The “Green New Deal,” “Medicare for All,” and Senator Elizabeth Warren’s Plans-for-That all run well beyond conventional liberal policies.  Hence, the nomination of Joe Biden as the Democratic candidate for president in 2020 is the victory of the backward-looking “liberal” majority over the forward-looking “radical” minority.

Or perhaps not.

[1] South Carolinian (state flag has a half-moon on it that some people have interpreted as a closet endorsement of Islam); BA, University of South Carolina plus some study at the University of Edinburgh; speech-writer for the “intriguing” (HA!) governor, Mark Sanford; and now an opinion writer and book reviewer for the Wall Street Journal.

[2] Barton Swaim, “Joe Biden and the Slow Death of Liberalism,” WSJ, 11-12 April 2020.

The Attack on Iran 9 January 2020.

“Trump did it, so it must be the wrong thing.”  Fair rule of thumb/heuristic device.  However, seen in historical perspective, the decision may merit some further thought.

First, the military historian John Keegan dissected the liberal mindset with regard to international order on the eve of the Second Iraq War in 2003.  He called this mindset “Olympianism.”  According to Keegan, it “seeks to influence and eventually control the behavior of states not by the traditional means of resorting to force as a last resort but by supplanting force by rational procedures, exercised through a supranational bureaucracy and supranational legal systems and institutions.”  Keegan regarded this view as delusional, but widespread.  He described the “Olympian ethic” as “opposition to any form of international action lying outside the now commonly approved limits of legal disapproval and treaty condemnation.”[1]

European states weren’t the only ones touched by “Olympianism.”  The Report of the 9/11 Commission tells readers that the US Government struggled to respond to the early attacks by Al Qaeda.  These early attacks included the bombing of two embassies in East Africa, and the attack on the USS “Cole” during a port call in Yemen.  The Director of the Central Intelligence Agency doubted he had the authority to kill some foreign terrorist just because the terrorist was trying to kill Americans.  Much thought went into how to capture Osama bin Laden.  Many Republicans, but also Democrats, belabored President Bill Clinton over the missile attack on a suspected Al Qaeda site in Khartoum, Sudan.  The evidence in the 9/11 Report suggests that the Clinton administration then slow-walked the investigation of the “Cole” bombing so that it wouldn’t be forced to do something that would lead to a further tide of abuse.  Attempts to kill Bin Laden in Afghanistan with cruise missiles failed because the diplomatic proprieties required the US Government to inform the government of Pakistan that the US would be flying cruise missiles across its territory.  This in spite of the fact that Pakistani intelligence had close ties to the Taliban government that was sheltering Bin Laden.

The response to the killing of Qassim Soleimani suggests that “Olympianism” has taken hold elsewhere.

Second, the war correspondent-turned-historian Thomas Ricks has sought to explain the poor performance of the US Army in recent wars.  In his explanation, during the Second World War, Chief of Staff George Marshall and ruthless subordinates like Dwight Eisenhower and Omar Bradley transformed a sleepy, gerontocratic peacetime army into a devastatingly effective instrument of war.  They did so, in part, by getting rid of any commander who didn’t cut the mustard.  After George Marshall and his followers had passed on, the Army reverted to a cautious bureaucracy, self-protective rather than self-critical.[2]  Generals don’t get fired, except for egregious personal misconduct—when it comes to public attention.

If Ricks is correct in his analysis, how should we understand the apparent lack of enthusiasm in the Pentagon for the strike at an Iranian leader who had been asserting his country’s influence throughout the Middle East at the expense of the United States?

Third, it seems unlikely that President Trump’s order to kill General Soleimani is going to have a worse outcome than the decision by the Bush II administration to invade Iraq or the decision by the Obama administration to overthrow the government of Libya.

[1] John Keegan, The Iraq War (2005), pp. 109, 115.

[2] Thomas E. Ricks, The Generals: American Military Command From World War II to Today (2012).  See also: https://waroftheworldblog.com/2015/08/10/command-crisis/

Get Carter.

The Report of the Inspector-General of the Department of Justice on the beginnings of the Russia investigation makes fascinating reading.  There’s a lot of information in it, even in the Executive Summary alone.  So, like the Mueller Report, it will take some time to digest.  However, little bits and pieces are worth a quick look.

How did the “Crossfire Hurricane” team select targets?  A “consensus among the “Crossfire Hurricane” agents and analysts … identified individuals associated with the Trump campaign who had recently traveled to Russia or had other alleged ties to Russia.” (p. iv.)  These individuals were George Papadopoulos, Carter Page, Paul Manafort, and Michael Flynn.

“[I]mmediately after opening the investigation [31 July 2016], the Crossfire Hurricane team submitted name trace requests to other U.S. government agencies and a foreign intelligence agency, and conducted law enforcement database and open source searches, to identify individuals associated with the Trump campaign in a position to have received the alleged offer of assistance from Russia.”  (p. iv.)

In August 2016, the other agency [apparently the CIA] had informed the FBI that Page was approved as an “operational contact” of the other agency from 2008 to 2013; that Page had provided information about his past contacts with a Russian Intelligence Officer, and that an employee of the other agency had judged that Page had “candidly described his contact with” the Russian intelligence officer.  (p. ix.)

In late September 2016 the OI Attorney had specifically asked the case agent whether Carter Page had a current or prior relationship with the other agency. In response to that inquiry, the case agent advised the OI Attorney that Page’s relationship was “dated” (claiming it was when Page lived in Moscow in 2004-2007) and “outside scope.” This representation, however, was contrary to information that the other agency had provided to the FBI in August 2016, which stated that Page was approved as an “operational contact” of the other agency from 2008 to 2013 (after Page had left Moscow). Moreover, rather than being “outside scope,” Page’s status with the other agency overlapped in time with some of the interactions between Page and known Russian intelligence officers that were relied upon in the FISA applications to establish probable cause. Indeed, Page had provided information to the other agency about his past contacts with a Russian Intelligence Officer (Intelligence Officer 1), which were among the historical connections to Russian intelligence officers that the FBI relied upon in the first FISA application (and subsequent renewal applications)…. Thus, the FBI relied upon Page’s contacts with Intelligence Officer 1, among others, in support of its probable cause statement in the FISA application, while failing to disclose to OI or the FISC that Page was candidly reporting on these contacts to the other agency.  (p. ix.)

Thus the October 2016 FISA warrant application “Omitted information the FBI had obtained from another U.S. government agency detailing its prior relationship with Page, …” (p. viii.)

So, I don’t understand why Attorney General William Barr is so upset.  I can certainly see that the FBI and Department of Justice need to update their policies and procedures to prevent unintended errors like these from occurring again.

Father Rale.

By the middle of the 17th Century the fires of the Counter-Reformation had begun to cool.  New ways of thinking emphasized skepticism and tolerance and not fighting over religious issues.  Father Sebastien Rale (1657-1724) belonged to another era than the one in which he lived.  He grew up on the eastern fringe of France, then joined the Jesuits when young.  He taught for a stretch in southern France, but reciting “amo, amas, amat” to blubbering school-boys didn’t hold his attention.  So he volunteered for the New World and the Jesuits shipped him off to a place better suited to his commitments.  In 1689 he went to Canada.  The Jesuit Superior in New France sent him to an Abenaki village near Quebec to learn the language, then to a mission in Kaskaskia in the Illinois country for two years, and then (1694) to Norridgewock on the Kennebec River.  Today, that’s in central Maine; then it was the frontier between Catholic New France and Protestant New England.

In Norridgewock, Father Rale both served the spiritual needs of his parishioners and wound up the local Indians against the English-speaking Protestants moving up relentlessly from the southwest.  When Queen Anne’s War (1703-1713) broke out, Father Rale’s parishioners joined in a Fall 1703 raid that killed 150 English settlers.  This raid fell within a larger pattern.  For example, a raid on York, Maine in 1692 had left 100 people—men, women, and children—dead and many others taken captive.  Among the captives carried off to Canada and later ransomed was Jeremiah Moulton (1688-1765).  English settlers—understandably—became obsessed with the danger.[1]  The governor of Massachusetts put a price on Rale’s head and New England militia were inclined to a literal interpretation.  Ten years of unsuccessful man-hunting and border war followed.  In 1713 “peace” broke out.

It wasn’t much of a peace in Maine, whatever it was in Europe.  The exact border between New England and “Acadia” hadn’t been defined in the peace treaty.  The French said it ran along the Kennebec.  The Indians—the Wabanaki Confederation—didn’t agree that they were under British authority.  The government of Massachusetts (which then owned Maine) built some forts on Wabanaki land and settlers moved north and east.  Father Rale urged the Indians to attack the English settlers, although they didn’t need any encouragement to defend their lands from outsiders.  Small raids went on until, in January 1722, the governor of Massachusetts launched an Indian war on the frontier of the province.

Massachusetts militia troops just missed capturing Father Rale, but did get a strong-box full of papers that seemed to show that he acted on behalf of France.  “Father Rale’s War” then began in earnest.  The Wabanaki retaliated with attacks on the frontier forts and settlements.

During 1723, Indian attacks had a devastating effect.  Spring 1724 began as 1723 had ended.  Wabanaki raiders killed farmers and loggers, fishermen (they captured a bunch of fishing boats), and soldiers sent to fight them.  The governor of Massachusetts ordered all settlers to move to the forts or to fortified houses.[2]

In August 1724, a group of militia—now much experienced at Indian fighting—surprised the Indians at Norridgewock.  Afterwards, a scalped Father Rale lay among the dead.  The English burned the village and the crops in the field.  The Indians then moved north out of reach of the English.[3]  The commander of the English attack was Jeremiah Moulton, who had been kidnapped in York many years before.  There is something Biblical in that.

[1] See: https://www.youtube.com/watch?v=8Wgkpfa5HMw  and https://www.youtube.com/watch?v=pV2JPv1EFww

[2] See: https://en.wikipedia.org/wiki/Garrison_(architecture) for the architectural style.

[3] British colonists settled the now-empty site of the village only in 1773.

The Devil’s Backbone.

Who made the first roads in America?  Animals did, mostly bison and deer.  They migrated from place to place and then returned.  Often, they preferred to travel on ridge-lines.  Vegetation was less dense there and height gave them what soldiers today call “observation”: they could see danger coming.  Native Americans then followed these paths for many centuries, either migrating or hunting or bound for war.  The trails became more distinct.  Then came the European-Americans.  These travelers had horses and cattle, vehicles and tools.  The pathways became rough-and-ready roads.  European-Americans called any such path-to-road a “trace.”

The “Natchez Trace” was a somewhat improved dirt road connecting Nashville, on the Cumberland River, with Natchez, on the Mississippi River.  The lands between Nashville and Natchez remained thinly-settled for a long time.  Weary travelers looked forward to the sight of isolated inns, called “stands,” where they could eat and sleep.[1]  It being only “somewhat improved,” 450 miles long, and lawless, most travelers referred to it as “the Devil’s Backbone.”

All sorts of people flowed along the Natchez Trace in the early 1800s.  Presbyterian and Methodist preachers of the “Second Great Awakening,” an emotionally powerful revival movement, were all over the place like a duck on a June-bug.[2]  Westward migrants hoped for better cotton lands in the Mississippi valley.  With the white planters went their African-American slaves.  Merchants from Nashville and elsewhere used the Trace as a river of commerce.  The Mississippi Valley blossomed from the combination of cotton and the north-south trade between New Orleans and the “Old Northwest.”  “Kaintucks” manned the flatboats that carried the river’s trade.  They walked home along the Trace.

Because money flowed in both directions along the “Trace,” so did crime.[3]  The little U.S. Army was stretched thin, so there weren’t many soldiers to provide protection.  Sheriffs were few and far between.  On the Western end of the Trace, merchants, “Kaintucks,” and slaves all congregated in the wide-open town of Natchez-under-the-Hill, where gambling, girls, and drink abounded.  So did fights.  When crime got bad enough, a posse of “Regulators” would go hunting outlaws.  Court trials did not always follow captures.

For example, Samuel Mason (1739-1803) served on the frontier in the American Revolution, then he turned to river piracy in Ohio, Illinois, and Arkansas (which then belonged to Spanish America).  (This isn’t the sort of thing that the Daughters of the American Revolution like to play up.)  He fell in with a family of serial killers named Harpe until the Spanish arrested him in 1803 in what would later become Missouri.[4]  He didn’t have any good explanation for the twenty scalps found in his luggage (but really, who could?), so the Spanish turned him over to the Americans.  They would have hanged him, but he escaped just long enough for two of his confederates to kill him in hopes of collecting a reward.  Instead, the confederates met their own grim fates on a tree limb.

In the 1820s, the steamboat (which could carry goods and people upstream against the river currents) and newer roads made the Trace irrelevant.

[1] In one of these inns, Meriwether Lewis—burdened by debts, drinking hard, and depressed—shot himself in 1809.

[2] Revivalist preachers stressed that individuals had to repent their sins to be saved.  Thousands of enthusiasts attended camp meetings like the one at Cane Ridge, Kentucky, in 1801.  The emotional, salvation-is-at-hand message of the revivalist movement had a profound effect on slaves, perhaps helping to inspire Nat Turner’s rebellion in Virginia in 1831.

[3] There’s a B-movie called “The Natchez Trace” (dir. Alan Crosland, 1960).

[4] The Louisiana Purchase was at hand, but had not yet taken place.  So, Missouri remained part of the Spanish empire.

Squanto.

The Native Americans of New England had been in contact with Europeans—French, Dutch, and English—since the early 1500s.  This contact began to transform Native American society.  On the one hand, the Europeans unintentionally introduced Old World diseases to which the Native Americans had no resistance.  Native American tribes did not live in isolation from other tribes.  The diseases spread like wild-fire from people near the coast to places much farther inland.  The toll could be horrific: 90 percent mortality in some cases, often as much as two-thirds.  On the other hand, the Native Americans were a Stone Age people.  The Iron Age Europeans had things—knives, axes, cooking pots, muskets—that would make the lives of the Native Americans much easier.  The Europeans would trade these things, and alcohol, for furs.

Beginning in 1605, English explorers—at the least—began occasional kidnappings of Native Americans.  Sometimes they sold them as slaves.  Sometimes they took them home to England and later returned them.  The catch-and-release effort may have been a crude attempt to create future intermediaries between the English and the Native Americans.  The English aimed at eventual settlement of colonies.  In 1614, an English explorer named Thomas Hunt grabbed 27 Native Americans from the shores of Cape Cod Bay.  He then sailed for the Spanish port of Malaga, where he sold them as slaves.

One captive called himself Tisquantum.  The Pilgrims later came to call him “Squanto.”  At a reasonable guess, “Squanto” was born about 1585 on the western shore of Cape Cod Bay.   His tribe, the Patuxet, were farmers, not hunters-and-gatherers.  Most of his life story is lost, with only occasional known facts.  He spent some time (probably years) in Spain (and probably at Malaga).  Somehow, he reached England.  He may have escaped to an English ship in the harbor.  He may have been bought or stolen by an English ship captain who knew of his employer’s interest in American colonization.  In any event, he spent enough time in London to learn English and see something of English society.

In 1618, the English merchant and colonizer John Slaney sent Squanto with an expedition to Newfoundland.  In 1619, Squanto talked an English captain into making an exploring voyage to Cape Cod Bay.  Home again, Squanto found himself virtually the “last of the Patuxets”: disease had destroyed his tribe.  Homeless and rootless, he declined to return with the captain.  However, he served as a translator and honest intermediary between his own people and the English.[1]

Then, in December 1620, the “Mayflower,” with the Pilgrims aboard, hove into sight on the western shore of Cape Cod Bay.  Having lost tribe and family, having learned English and met many Englishmen, Squanto soon moved into the Plymouth colony itself for almost two years.  He taught the colonists the rudiments of the fur trade.  This helped repay the debt to the company that had paid their passage—Plymouth was an “indentured colony.”  He taught them about Native American farming and crops.  Many of the seeds brought from England didn’t thrive in American soil.  He helped negotiate peace with surrounding tribes.  This minimized—for a time—“unfortunate incidents.”

Squanto died of what William Bradford described as an “Indian fever” in 1622.

[1] Some days later, a different group of Native Americans captured the English captain.  Eventually, he managed to escape and return home.  HA!