Oliver Wiswell.

My Dad was the finest man I’ve ever known, but he didn’t have a lot of formal education or refined taste in literature.   He read the novels of John D. MacDonald, C.S. Forester and Kenneth Roberts.  Cheap paperbacks you could buy in the Rexall drugstore on 45th in Seattle.  So I read them as well.  It was a productive use of my time.  Kenneth Roberts (1885–1957) started as a journalist, tried his hand at soldiering in the First World War (Siberia expedition), went back to journalism (Saturday Evening Post), and ended up as a historical novelist.

Roberts was a cross-grained guy.  Arundel (1930) and Rabble in Arms (1933) celebrate Benedict Arnold—before the treason.  Northwest Passage (1937) centers on Robert Rogers, the subsequently disgraced American hero of the French and Indian War.[1]  Oliver Wiswell (1940) is a view of the American Revolution from the perspective of a Tory.  After Arundel,[2] it is his best book.

In Oliver Wiswell the hero instinctively helps a man who is being tarred and feathered and ridden on a fence rail for dissenting from common opinion; helps treat the British wounded after Bunker Hill (one guy is gut-shot by a musket ball with a nail pounded through it); interviews New York Loyalists who have been driven into hiding in a swamp to escape their tormentors; hears of other Loyalists who have been imprisoned in the depths of Connecticut’s Simsbury mines; investigates the mass murder of American prisoners of war by their British guards in New York; wanders in disguise through the back-country in search of the troops that General John Burgoyne surrendered at Saratoga[3]; travels for a while with the many people migrating West of the Appalachians to escape the war and the “Land of Liberty”; arrives in South Carolina in time to hear of the bloody civil war underway in the South and to participate in the Loyalist defense of Ninety-Six; learns of the American assault upon the civilized Cherokee; returns to New York to share in the whale-boat fights on Long Island Sound as Loyalists sought to escape the United States; and ends by helping found new colonies in Canada for the Loyalists.  So, reading this book could give you the idea that the American Revolution involved a lot of informal violence on both sides, but especially against the opponents of the “Empire of Liberty.”

While not an “academic” historian, Roberts did a lot of research for his books.  He consulted both published primary sources and the “literary” histories of an earlier time.  Like any journalist, he sought out dramatic human stories that illustrated larger patterns.

In recent years, academic historians have systematically exploited many more sources than were used by Roberts.  However, their books largely confirm what Roberts intuited.[4]  There was nothing gentlemanly or moderate about the Patriots’ war with the allies of the British Army.  Robert Parkinson, The Common Cause, examines how Patriots nurtured white fear and hatred of blacks and Indians as a way to bind people to the Revolution.  Kathleen DuVal, Independence Lost, shows how on the Southwestern frontier many hopes for the future—especially among the Indians—failed when the United States succeeded.  Maya Jasanoff, Liberty’s Exiles, tracks the fate of losers red, white, and black.  Holger Hoock, Scars of Independence, speaks directly to Roberts’s theme of ruthless liberty.

All this emphasizes the achievement of the Founders in calming America after 1783.

[1] The first third of the book, on Rogers’ raid on the St. Francis Indians, is marvelous.  The rest is a dog.

[2] In one of the novels, an English noblewoman says “You live in a rundle?  Oh, you mean Arun-dell.”

[3] They were promised parole, but the Americans declined to fulfill this promise—or adequately care for their captives.

[4] See the discussion of books in Jane Kamensky, “Red, White, Black and Blue,” NYT Book Review, 21 May 2017.

My Weekly Reader 14 July 2017.

As contemporary Americans ponder whether the federal government has grown too strong or is not yet strong enough, it is worth revisiting the first years of the Republic.  Then even the survival of the United States lay open to question.  Many Americans (Anti-Federalists) had opposed the Constitution.  Britain and Spain, which possessed important territories bordering on the new nation, were little inclined to believe that the United States would become a behemoth.  In military affairs, interested people debated whether the United States should even have a permanent and professional military (a “standing army” of the sort decried in the Declaration of Independence).  Opponents of an excessively strong central government argued for a reliance on the citizen-soldiers of a militia.  Men like George Washington and Alexander Hamilton argued for the utility of a professional army.  What professional historians call “contingency” (specific things happen to determine an outcome) played an important role in deciding how things worked out.[1]

Take the example of the struggle for the “Northwest Territory.”  In 1781 a Franco-American army defeated a British army at Yorktown.  In 1783, Britain and America made peace.  In what was then the “West,” Britain granted to the Americans sovereignty over the territory north of the Ohio and east of the Mississippi.  The problem for the Americans lay in making good that claim.  The Indians living there declined to surrender their lands to white settlers.  Local British officials encouraged this resistance.  So, Indian wars became one feature of the presidential administration of George Washington.

In October 1790, the Indians defeated a U.S. Army force led by Josiah Harmar.  A year later, in November 1791, the governor of the Northwest Territory, Arthur St. Clair, led a new army into the wilds of northwestern Ohio.  Near the Wabash River, a smaller Indian force destroyed St. Clair’s little army in less than three hours.  They inflicted well over 90 percent casualties (including two-thirds killed), while suffering only minor losses themselves.[2]

This “signal catastrophe”[3] tipped the balance in favor of a stronger national army.  In 1792, Washington managed to ram through Congress an increased military budget, then appointed Anthony Wayne to replace St. Clair.  Wayne worked hard to revive the morale of the defeated troops, then marched deep into Indian territory.  This took a while: the decisive contest only came in August 1794.  Then, at a place called Fallen Timbers, Wayne inflicted a devastating defeat on the Indians.  Faced with a victorious, ill-disposed-toward-Britain American army, local British commanders abandoned their Indian pawns.[4]

The following year, Wayne negotiated the Treaty of Greenville (1795) with the defeated Indians.  The treaty—as decisive a surrender as that of the Japanese in 1945—opened the Northwest to white settlement.  Later, writers would cite America’s “manifest destiny” to conquer the West.  In the 1790s, this destiny was far from manifest.  It began to become so with the work of Anthony Wayne and his army.

[1] William Hogeland, Autumn of the Black Snake: the Creation of the U.S. Army and the Invasion That Opened the West (2017).

[2] In comparison, George Custer got 268 men killed at the Little Big Horn.  “St. Clair’s Defeat” was commemorated in folk culture.  See: https://www.youtube.com/watch?v=nn2rEoRmh1M

[3] I stole the phrase from the title of Patrick Macrory’s superb book on the retreat of a British army from Kabul, Afghanistan in 1842.  I read his book in my youth and it conditioned me to think that people should be cautious about vexing Pashtuns.

[4] The episode nicely illustrates the difficulties of coalition warfare.  The British and the Indians were often in disagreement and the Indians themselves were not a solid bloc.

My Weekly Reader 29 June 2017.

A pessimist’s analysis of the American position in the world might run something like the following.  The United States is the world’s only global power.  (As such, it performs many of the vital military, political, and economic functions of a world government.)  It faces a host of regional powers bent on disrupting the global order created through American leadership after the Second World War.  Iraq, Afghanistan, Libya, and radical Islamist jihad all offer examples of the failure of military power as a solution to challenges.[1]  Moreover, the foundations of American power have been cracked by changes in America’s society and economy.  Liberal internationalist elites ignored the human costs of their policies until they inspired a backlash under the last three presidential administrations.  Domestic politics have come to center on divisive identity politics and the expansion of entitlements (including the entitlement to not be taxed) beyond what the traditional economy can support.  In light of these grim facts, America should shift from “hard” (lawyers, guns, and money) power to “soft” power (diplomacy, humanitarianism); America should seek to lead from behind by encouraging allies to assume their responsibilities; and America should do its nation building at home.

Eliot A. Cohen takes sharp issue with this point of view.[2]  “The chances are growing that the United States will find itself using military power chronically and at varying levels of intensity, throughout the early decades of the 21st century.”  Even over the short run, the United States faces complex challenges: China’s rise as an economic and military power in a key region for American interests; an aggrieved Russia trying to punch above its weight while it still can; and a transnational radical Islam that will continue to inspire local insurgencies.  These quarrels may have to be resolved in places as different as the high seas, the anarchic peripheries around or between failing states, and even outer space.  So far as he’s concerned, micro-lending isn’t going to cut it.  “Hard” power will have to be at least part of the response.

Cohen is equally persuasive, alarming, and rough-edged in the rest of the book.  Asking whether America possesses the means to use force where needed, Cohen answers with a qualified “Yes.”  His deepest concern lies in the nature and quality of thinking about the use of the instruments of power, rather than about the quality and quantity of those instruments.  One danger springs from what he sees as the capture of strategic thinking by process-oriented bureaucrats.  Plans, working papers, studies, and a deep dive into minutiae introduce rigidity and myopia into thinking about the long-term strategic environment.  In short, dopes have a large voice in the use of military power.  Another concern arises from our public discourse on these issues.  The United States, says Cohen, needs to do some serious thinking and debating on its relationship to the outside world and on how and when to use military force.  Not only must Americans recognize the need for force, they will have to accept that the country is in for a series of long wars with no easy resolution, let alone parades.  In the White House, in Congress, and in the Pentagon, decision-makers are too much concerned to define the “end state” of any military action.  “Get in, wreck stuff, get out” defined the 1991 and 2003 wars with Iraq.  Neither resolved the basic problem.  Here Cohen could profit from a review of the post-WWII experience.[3]

Left largely unaddressed is the problem of paying for all this power.  It seems presumptuous to believe that Americans will prefer national security to Social Security.

[1] Hence, the Obama administration recognized that the American people opposed any new war in the Middle East.  From this perspective, a deal to slow down Iran’s acquisition of nuclear weapons made a lot of sense.

[2] Eliot A. Cohen, The Big Stick: The Limits of Soft Power and the Necessity of Military Force (2016).

[3] See: https://waroftheworldblog.com/2017/06/29/soldiers-become-governors/

Soldiers Become Governors.

Modern war is about destroying the enemy’s army and seizing control of his territory.  Even when it can be achieved, victory still brings problems.  If Army officers wanted to be civilian bureaucrats they wouldn’t have gone into the military.  Yet civilian bureaucrats lack the means to effectively govern conquered territory.  Both civilians and soldiers agree to ignore this reality in what one scholar labels the “denial syndrome.”  Unfortunately, scholars have a lot of evidence with which to work in sorting out good practice from bad.[1]  People can’t help but compare the successful occupations of Germany and Japan after the Second World War with the disastrous aftermath of the 2003 invasion of Iraq.  What went right with the earlier occupations?  What went wrong with the later occupation?

After victory in the Second World War, the American military occupied huge territories of the defeated enemies.  Those countries acknowledged that they were beaten and that the war was ended.  The military had created immense global logistical systems that enabled it to move supplies to the conquered areas.  It had very large military forces available to support and enforce American military government.  The desire to avoid any renewed military danger from Germany or Japan inclined Generals Lucius Clay (Germany) and Douglas MacArthur (Japan) to sort out the conquered people, not just to punish them.  The suddenly developing Cold War with the Soviet Union motivated Americans (and the Germans especially) not to want a breakdown of civil affairs.

Very different conditions prevailed in Iraq.  The war plan assigned far too few soldiers to occupation duty, then American forces were further drawn down.  Very quickly, the George W. Bush administration transferred authority in Iraq to what proved to be an inadequate Coalition Provisional Authority.  Iraqis did not acknowledge that they were beaten and that war had ended.  Instead, Sunni-Shi’ite-Kurdish conflicts broke out into the open.  Shi’ites looked to neighboring Iran for support, while Iran sought to undermine the American and Sunni positions.  While Germans had feared the Soviet Union, many Sunni embraced the insurgency that quickly became associated with the radical Islamists of Al Qaeda.

One—depressing—“lesson of History” might be that people fail to learn from History.  The George W. Bush administration failed to study the “good occupations” of Germany and Japan.  The Obama administration continued the same chaotic occupation policies launched by the Bush administration.  One reason for this failure may lie in the clash between any “lessons” History teaches and what people want to believe.  Lost in the adulation of the occupations of Germany and Japan is the reality that Americans raised in an environment of inter-war isolationism were only constrained to embark on internationalism by harsh necessity.

Also lost in recent accounts is the reality that Rome wasn’t built in a day.  By focusing tightly on the brief periods of military administration, then jumping ahead to the long-term outcomes, it is easy to attribute change to military government.  This analysis falls short of a real explanation.  On the one hand, civilian governments by the defeated peoples took decades to create democratic political cultures.  They wanted to avoid repeating the errors of the past.  On the other hand, Germany became a democracy because the victors in the Second World War partitioned the country, then parked 20,000 tanks on top of the place for almost half a century.

[1] Susan L. Carruthers, The Good Occupation: American Soldiers and the Hazards of Peace (2016); Nadia Schadlow, War and the Art of Governance: Consolidating Combat Success into Political Victory (2017).  In a typically American solipsism, the authors ignore the contemporaneous British experience with the government of conquered territories.

My Weekly Reader 3 June 2017.

It is characteristic of the long-running funk into which many Western societies have fallen that there have been many “decline of the West” books published in recent decades.  They offer varying analyses shaded by varying clouds of pessimism.  Some focus on economic issues, some on misguided international policies, and some on cultural factors (with rotten schools in the forefront).  Many are inspired by China’s challenge to societies that otherwise could remain complacent.  Some are compelling, many are not.  One recent example comes from the former editor of the Economist, Bill Emmott.[1]

Thirty years ago Mancur Olson investigated the rapid revival of the devastated German and Japanese economies after the Second World War and the slower growth of the Western victors in that war.[2]  He found the answer in the role of intermediate groups—political as well as economic—in the different societies.  By intermediate groups he meant both labor unions and businessmen’s associations, but also intrusive government regulators.  These groups entrenched established organizations at the expense of newcomers.  They entrenched established procedures at the expense of innovation.  Dictatorship, war, defeat, and foreign occupation had destroyed these intermediate groups in Germany and Japan.  This left individual entrepreneurs free to do what they wanted in a dynamic fashion.  (“And all that implies.”—“The Iron Giant.”)  Elsewhere, the intermediate groups survived the war and sometimes even tightened their grip.

It’s possible to find many examples of dysfunction in Western societies.  Take both the Republican and Democratic parties in the United States for example.  Or the low labor participation rate in the United States as men have fled to disability programs as an alternative to lost familiar work.  Or Japan’s descent from Olsonian prime example of success into a barnacle-encrusted sampan.  Or the domination of the American—and perhaps “Western”–political economies by the banks.  In Japan that has meant a “lack of entrepreneurship or corporate investment” needed for growth.  In the United States, it has meant exploiting a public safety-net to cover imprudent risk.  This has resulted in “rising inequality, distortion of public policy, and [the] generation of collective economic pain and anger.”  And now the dreaded “populism.”

Much later on, several different countries sought to scrape these “barnacles” off the hull.  Sweden “reduc[ed] taxation and deregulat[ed] all manner of industries” in pursuit of “more freedom of choice and creativity.”  Switzerland adopted an openness to immigration and also deregulated its labor market to get the right mix of workers to the right industries.  Britain’s embrace of the “Thatcher Revolution,” joined with membership in the European Union, allowed Britain to reap both a “brain gain” and a “brawn gain.”[3]

What does Emmott offer by way of possible solutions?  Refreshingly, he does not glom every unpleasant surprise into one whole.  Thus, Putin’s Russia and Islamist terrorism pose no existential threats to Western civilization.  They can be mastered with a coherent effort.  Similarly, “Brexit” is a bad idea but not a rejection of Western values or most Western institutions.  In contrast, he over-states the real danger posed by the Donald Trump administration.  Trump speaks neither for mainstream Republicans nor for Democrats, and his administration will not last beyond his first term.  Then it will be back to business as usual.

Emmott has less to say about solving the real danger: Olson’s intermediate groups.  Appeals for “openness” in discussion aren’t likely to suffice.  It may take a real crisis, alas.

[1] Bill Emmott, The Fate of the West: The Battle to Save the World’s Most Successful Political Idea (2017).

[2] See: https://waroftheworldblog.com/2016/06/18/the-rise-and-decline-of-nations/

[3] See: https://en.wikipedia.org/wiki/Polish_Plumber.

My Weekly Reader 30 May 2017.

Ali Soufan was born in Lebanon in 1971, but ended up living in the United States and became an American citizen.[1]  “Education’s the thing, don’t you know.”[2]  In 1995 he got a BA in Political Science from Mansfield University.[3]  Later on he got an MA in International Relations from Vanillanova.  Then he went into the EffaBeeEye.

No chasing bank-robbers or goombas for him.  The harps had those jobs sewn up.[4]  He spoke Arabic and the Bureau only had eight Arabic speakers, so he went into counter-terrorism.  In 1999 he went to Jordan to liaise with the Jordanian intelligence service, which had uncovered leads to what would be called the “Millennium bomb plot.”  Here began another theme in his career.  He found a box of files in the CIA station, allegedly ignored by the over-worked agents, containing maps of the targets.  The CIA seemed more vexed than grateful.  In 2000 he went to Yemen as part of the team investigating the bombing of the USS “Cole.”  Here he made important discoveries.  He went back to Yemen after 9/11 to pursue leads.  Here he figured out that the CIA had held back information from the FBI that might have allowed him to connect the “Cole” attack with the 9/11 plot.[5]  The CIA seemed more vexed than grateful.  Then he interrogated captured Al Qaeda terrorists.  Subsequently, some of his subjects were transferred to CIA control and were subjected to enhanced interrogation techniques.[6]

By 2005 Soufan had become fed up or burned out.  He resigned from the Bureau to start a consultancy.  In 2011 he published The Black Banners: The Inside Story of 9/11 and the War Against al-Qaeda.[7]  Here he tracked the campaign against Al Qaeda from 9/11 to the killing of Osama bin Laden.  Now Soufan has published Anatomy of Terror: From the Death of Bin Laden to the Rise of the Islamic State (2017).[8]  The American invasion of Iraq (2003) triggered a disaster.  Partisan observers—Soufan included—put too much emphasis on the botched occupation.  Iraq was a social IED waiting to be tripped.  The invasion itself lit the fuse.

Even before OBL died, Al Qaeda had transformed into something else, something worse.  It had become Zarqawi’s Al Qaeda in Mesopotamia.  The remnants of that group fell back to Syria and became the Islamic State (ISIS).  More importantly (unless you’re stuck inside the Caliphate), ISIS called for the “lone wolf” attacks that have wreaked havoc in Europe and the United States.  Boko Haram (Nigeria), Al Shabab (Somalia), Jamaat-ul-Mujahideen (Bangladesh), and Abu Sayyaf (Philippines) all align themselves with the ideology of Al Qaeda.  We live with the results.

[1] I conjecture that his parents fled the awful Lebanese civil war of 1975-1990.  See: https://en.wikipedia.org/wiki/Lebanese_Civil_War  So, that’s one anecdotal argument against President Trump’s “Muslim ban.”  The recent suicide bombing in Manchester, England, offers an equally compelling anecdotal argument on the other side.  So, we probably shouldn’t rely upon anecdotal evidence.  “Well, d’uh,”–my sons.

[2] I think that’s from one volume of the trilogy U.S.A. by John Dos Passos, but I can’t find the exact reference.

[3] Mansfield is a former teachers college in the middle of nowhere in north-central Pennsylvania.   He got his BA when he was 24, so he lost some time somewhere doing something.

[4] See: https://en.wikipedia.org/wiki/Whitey_Bulger

[5] Before people start jumping all over the CIA, read the Report of the 9/11 Commission.  Not just the executive summary, but the whole thing.  Then look at the list of Commission members and run down their career tracks.

[6] Soufan subsequently made public comments on the results obtained by the different approaches.  The CIA seemed more vexed than grateful.

[7] In Western culture, black flags usually denote pirates.  Until the 18th Century, captured pirates rarely got a trial.  You just hanged them at the yard-arm or threw them overboard if there were some sharks handy.  This is a plea for cultural sensitivity on the part of radical Islamists.  Falls under the heading of “enlightened self-interest.”

[8] At least he didn’t call it Al Qaeda: Covenant or Al Qaeda: Dead Men Tell No Tales.

My Weekly Reader 29 April 2017.

In the “Roaring Twenties” the automobile was the “new thing.”  Henry Ford pioneered the mass production of cars and trucks.  He applied Frederick W. Taylor’s simplification of production into single successive tasks.  He created assembly lines to move the parts to workers in a carefully sequenced order.  Production soared while the price of cars to consumers dropped off the edge of a cliff.  Others rushed to copy the “flivver king.”  So, in 1923, General Motors opened a car plant in Janesville, Wisconsin.  It was a good bet: thanks to the previous establishment of Parker Pens and a tractor factory, the town had a pool of suitably skilled workers.  For almost fifty years, GM employed a lot of workers at decent wages.

The trouble was that the work itself would bore the balls off a pool table.  By the time of the New Deal, disgruntled workers welcomed unionization with open arms.[1]  In 1936-1937 the United Auto Workers (UAW) staged a strike campaign that often turned violent.  For the first time, the government backed the right of the workers to unionize.

However, all the UAW could get its members was better pay, better benefits, and some pass-blocking between the workers and their foremen.  They couldn’t make the work itself any less soul-killing.  What workers wanted was out as soon as possible.  In 1970 the UAW launched a national strike that ran on for better than two months.  What the union won for its members was a “thirty [years] and out” rule that allowed workers to collect a full pension after thirty years on the job, and full health coverage between retirement and Medicare.

The cost of pensions and health care for people who retired when they were about 50 years old heavily freighted the books of companies that already had a hard time adapting to unexpected change.  Many of those companies—and other industries—began shifting production to Southern “right to work” states or abroad.[2]  Furthermore, workers still had to gut out 30 years at a job they hated from Day One.  On the other hand, places like Janesville were tight communities that had real emotional attractions for the successive generations that grew up in them.[3]  Moreover, for decades American culture—and the Democratic Party in particular—celebrated the industrial working class.  Like combat troops, people could feel a sense of pride in what they had to endure.  It would be hard to cut loose, move to Los Angeles, and become a Chippendale dancer.  (If, you know, that’s how you roll.)

By the dawn of the 21st Century, automobiles—at least union-made, American-company automobiles–no longer were the “new thing.”  The financial crisis of 2008 pushed the remaining “rust belt” car companies to the edge of bankruptcy.  They responded with a desperate effort to cut costs and streamline production.  In October 2008, General Motors announced that it would close its Janesville plant in December.[4]  Merry Christmas!

In America, the human costs of global trade agreements, foreign competition, management errors, and union stupidity have been enormous.  The Janesville unemployment rate hit 13 percent, before falling sharply as people pulled up stakes to search for better chances.  Displaced workers in Janesville didn’t have any better luck with the vaunted government retraining schemes than have other people.  Women have adapted more easily than men, which can’t be good for the men’s sense of identity.  In a generation, no one will remember or care.

[1] It’s not a car plant, but see: https://www.youtube.com/watch?v=UGXW-PpVL7I

[2] See: https://waroftheworldblog.com/2015/03/02/american-union-stay-away-from-me-uh/

[3] Elements of this appear in “Gran Torino” (dir. Clint Eastwood, 2008): https://www.youtube.com/watch?v=JmqV-LGbqkw

[4] Amy Goldstein, Janesville: An American Story (2017).