My Weekly Reader 29 June 2017.

A pessimist’s analysis of the American position in the world might run something like the following.  The United States is the world’s only global power.  (As such, it performs many of the vital military, political, and economic functions of a world government.)  It faces a host of regional powers bent on disrupting the global order created through American leadership after the Second World War.  Iraq, Afghanistan, Libya, and radical Islamist jihad all offer examples of the failure of military power as a solution to challenges.[1]  Moreover, the foundations of American power have been cracked by changes in America’s society and economy.  Liberal internationalist elites ignored the human costs of their policies until they inspired a backlash under the last three presidential administrations.  Domestic politics have come to center on divisive identity politics and the expansion of entitlements (including the entitlement to not be taxed) beyond what the traditional economy can support.  In light of these grim facts, America should shift from “hard” (lawyers, guns, and money) power to “soft” power (diplomacy, humanitarianism); America should seek to lead from behind by encouraging allies to assume their responsibilities; and America should do its nation building at home.

Eliot A. Cohen takes sharp issue with this point of view.[2]  “The chances are growing that the United States will find itself using military power chronically and at varying levels of intensity, throughout the early decades of the 21st century.”  Even over the short run, the United States faces complex challenges: China’s rise as an economic and military power in a key region for American interests; an aggrieved Russia trying to punch above its weight while it still can; and a transnational radical Islam that will continue to inspire local insurgencies.  These quarrels may have to be resolved in places as different as the high seas, the anarchic peripheries around or between failing states, and even outer space.  So far as he’s concerned, micro-lending isn’t going to cut it.  “Hard” power will have to be at least part of the response.

Cohen is equally persuasive, alarming, and rough-edged in the rest of the book.  Asking whether America possesses the means to use force where needed, Cohen answers with a qualified “Yes.”  His deepest concern lies in the nature and quality of thinking about the use of the instruments of power, rather than about the quality and quantity of those instruments.  One danger springs from what he sees as the capture of strategic thinking by process-oriented bureaucrats.  Plans, working papers, studies, and a deep dive into minutiae introduce rigidity and myopia into thinking about the long-term strategic environment.  In short, dopes have a large voice in the use of military power.  Another concern arises from our public discourse on these issues.  The United States, says Cohen, needs to do some serious thinking and debating on its relationship to the outside world and on how and when to use military force.  Not only must Americans recognize the need for force, they will have to accept that the country is in for a series of long wars with no easy resolution, let alone parades.  In the White House, in Congress, and in the Pentagon, decision-makers are too concerned with defining the “end state” of any military action.  “Get in, wreck stuff, get out” defined the 1991 and 2003 wars with Iraq.  Neither resolved the basic problem.  Here Cohen could profit from a review of the post-WWII experience.[3]

Left largely unaddressed is the problem of paying for all this power.  It seems presumptuous to believe that Americans will prefer national security to Social Security.

[1] Hence, the Obama administration recognized that the American people opposed any new war in the Middle East.  From this perspective, a deal to slow down Iran’s acquisition of nuclear weapons made a lot of sense.

[2] Eliot A. Cohen, The Big Stick: The Limits of Soft Power and the Necessity of Military Force (2016).

[3] See: https://waroftheworldblog.com/2017/06/29/soldiers-become-governors/

Soldiers Become Governors.

Modern war is about destroying the enemy’s army and seizing control of his territory.  Even when it can be achieved, victory still brings problems.  If Army officers wanted to be civilian bureaucrats, they wouldn’t have gone into the military.  Yet civilian bureaucrats lack the means to effectively govern conquered territory.  Both civilians and soldiers agree to ignore this reality in what one scholar labels the “denial syndrome.”  Unfortunately, scholars have a lot of evidence with which to work in sorting out good practice from bad.[1]  People can’t help but compare the successful occupations of Germany and Japan after the Second World War with the disastrous aftermath of the 2003 invasion of Iraq.  What went right with the earlier occupations?  What went wrong with the later occupation?

After victory in the Second World War, the American military occupied huge territories of the defeated enemies.  Those countries acknowledged that they were beaten and that the war had ended.  The military had created immense global logistical systems that enabled it to move supplies to the conquered areas.  It had very large military forces available to support and enforce American military government.  The desire to avoid any renewed military danger from Germany or Japan inclined Generals Lucius Clay (Germany) and Douglas MacArthur (Japan) to sort out the conquered people, not just to punish them.  The suddenly developing Cold War with the Soviet Union motivated Americans (and the Germans especially) not to want a breakdown of civil affairs.

Very different conditions prevailed in Iraq.  The war plan assigned far too few soldiers to occupation duty, then American forces were further drawn down.  Very quickly, the George W. Bush administration transferred authority in Iraq to what proved to be an inadequate Coalition Provisional Authority.  Iraqis did not acknowledge that they were beaten and that the war had ended.  Instead, Sunni-Shi’ite-Kurdish conflicts broke out into the open.  Shi’ites looked to neighboring Iran for support, while Iran sought to undermine the American and Sunni positions.  While Germans had feared the Soviet Union, many Sunnis embraced the insurgency that quickly became associated with the radical Islamists of Al Qaeda.

One—depressing—“lesson of History” might be that people fail to learn from History.  The George W. Bush administration failed to study the “good occupations” of Germany and Japan.  The Obama administration continued the same chaotic occupation policies launched by the Bush administration.  One reason for this failure may lie in the clash between any “lessons” History teaches and what people want to believe.  Lost in the adulation of the occupations of Germany and Japan is the reality that Americans raised in an environment of inter-war isolationism were only constrained to embark on internationalism by harsh necessity.

Also lost in recent accounts is the reality that Rome wasn’t built in a day.  By focusing tightly on the brief periods of military administration, then jumping ahead to the long-term outcomes, it is easy to attribute change to military government.  This analysis falls short of a real explanation.  On the one hand, the civilian governments formed by the defeated peoples took decades to create democratic political cultures.  They wanted to avoid repeating the errors of the past.  On the other hand, Germany became a democracy because the victors in the Second World War partitioned the country, then parked 20,000 tanks on top of the place for almost half a century.

[1] Susan L. Carruthers, The Good Occupation: American Soldiers and the Hazards of Peace (2016); Nadia Schadlow, War and the Art of Governance: Consolidating Combat Success into Political Victory (2017).  In a typically American solipsism, the authors ignore the contemporaneous British experience with the government of conquered territories.

My Weekly Reader 3 June 2017.

It is characteristic of the long-running funk into which many Western societies have fallen that there have been many “decline of the West” books published in recent decades.  They offer varying analyses shaded by varying clouds of pessimism.  Some focus on economic issues, some on misguided international policies, and some on cultural factors (with rotten schools in the forefront).  Many are inspired by China’s challenge to societies that otherwise could remain complacent.  Some are compelling, many are not.  One recent example comes from the former editor of the Economist, Bill Emmott.[1]

Thirty years ago, Mancur Olson investigated the rapid revival of the devastated German and Japanese economies after the Second World War and the slower growth of the Western victors in that war.[2]  He found the answer in the role of intermediate groups—political as well as economic—in the different societies.  By intermediate groups he meant both labor unions and businessmen’s associations, but also intrusive government regulators.  These groups entrenched established organizations at the expense of newcomers.  They entrenched established procedures at the expense of innovation.  Dictatorship, war, defeat, and foreign occupation had destroyed these intermediate groups in Germany and Japan.  This left individual entrepreneurs free to do what they wanted in a dynamic fashion.  (“And all that implies.”—“The Iron Giant.”)  Elsewhere, the intermediate groups survived the war and sometimes even tightened their grip.

It’s possible to find many examples of dysfunction in Western societies.  Take both the Republican and Democratic parties in the United States, for example.  Or the low labor participation rate in the United States as men have fled to disability programs as an alternative to lost familiar work.  Or Japan’s descent from Olsonian prime example of success into a barnacle-encrusted sampan.  Or the domination of the American—and perhaps “Western”—political economies by the banks.  In Japan that has meant a “lack of entrepreneurship or corporate investment” needed for growth.  In the United States, it has meant exploiting a public safety-net to cover imprudent risk.  This has resulted in “rising inequality, distortion of public policy, and [the] generation of collective economic pain and anger.”  And now the dreaded “populism.”

Much later on, several different countries sought to scrape these “barnacles” off the hull.  Sweden “reduc[ed] taxation and deregulat[ed] all manner of industries” in pursuit of “more freedom of choice and creativity.”  Switzerland adopted an openness to immigration and also deregulated its labor market to get the right mix of workers to the right industries.  Britain’s embrace of the “Thatcher Revolution,” joined with membership in the European Union, allowed Britain to reap both a “brain gain” and a “brawn gain.”[3]

What does Emmott offer by way of possible solutions?  Refreshingly, he does not glom every unpleasant surprise into one whole.  Thus, Putin’s Russia and Islamist terrorism pose no existential threats to Western civilization.  They can be mastered with a coherent effort.  Similarly, “Brexit” is a bad idea but not a rejection of Western values or most Western institutions.  In contrast, he overstates the real danger posed by the Donald Trump administration.  Trump speaks neither for mainstream Republicans nor for Democrats, and his administration will not last beyond his first term.  Then it will be back to business as usual.

Emmott has less to say about solving the real danger: Olson’s intermediate groups.  Appeals for “openness” in discussion aren’t likely to suffice.  It may take a real crisis, alas.

[1] Bill Emmott, The Fate of the West: The Battle to Save the World’s Most Successful Political Idea (2017).

[2] See: https://waroftheworldblog.com/2016/06/18/the-rise-and-decline-of-nations/

[3] See: https://en.wikipedia.org/wiki/Polish_Plumber

My Weekly Reader 30 May 2017.

Ali Soufan was born in Lebanon in 1971, but ended up living in the United States and became an American citizen.[1]  “Education’s the thing, don’t you know.”[2]  In 1995 he got a BA in Political Science from Mansfield University.[3]  Later on he got an MA in International Relations from Vanillanova.  Then he went into the EffaBeeEye.

No chasing bank-robbers or goombas for him.  The harps had those jobs sewn up.[4]  He spoke Arabic and the Bureau only had eight Arabic speakers, so he went into counter-terrorism.  In 1999 he went to Jordan to liaise with the Jordanian intelligence service, which had uncovered leads to what would be called the “Millennium bomb plot.”  Here began another theme in his career.  He found a box of files in the CIA station, allegedly ignored by the over-worked agents, containing maps of the targets.  The CIA seemed more vexed than grateful.  In 2000 he went to Yemen as part of the team investigating the bombing of the USS “Cole.”  Here he made important discoveries.  He went back to Yemen after 9/11 to pursue leads.  Here he figured out that the CIA had held back information from the FBI that might have allowed him to connect the “Cole” attack with the 9/11 plot.[5]  The CIA seemed more vexed than grateful.  Then he interrogated captured Al Qaeda terrorists.  Subsequently, some of his subjects were transferred to CIA control and were subjected to enhanced interrogation techniques.[6]

By 2005 Soufan had become fed up or burned out.  He resigned from the Bureau to start a consultancy.  In 2011 he published The Black Banners: The Inside Story of 9/11 and the War Against al-Qaeda.[7]  Here he tracked the campaign against Al Qaeda from 9/11 to the killing of Osama bin Laden.  Now Soufan has published Anatomy of Terror: From the Death of Bin Laden to the Rise of the Islamic State (2017).[8]  The American invasion of Iraq (2003) triggered a disaster.  Partisan observers—Soufan included—put too much emphasis on the botched occupation.  Iraq was a social IED waiting to be tripped.  The invasion itself lit the fuse.

Even before OBL died, Al Qaeda had transformed into something else, something worse.  It had become Zarqawi’s Al Qaeda in Mesopotamia.  The remnants of that group fell back to Syria and became the Islamic State (ISIS).  More importantly (unless you’re stuck inside the Caliphate), ISIS called for the “lone wolf” attacks that have wreaked havoc in Europe and the United States.  Boko Haram (Nigeria), Al Shabab (Somalia), Jamaat-ul-Mujahideen (Bangladesh), and Abu Sayyaf (Philippines) all align themselves with the ideology of Al Qaeda.  We live with the results.

[1] I conjecture that his parents fled the awful Lebanese civil war of 1975-1990.  See: https://en.wikipedia.org/wiki/Lebanese_Civil_War  So, that’s one anecdotal argument against President Trump’s “Muslim ban.”  The recent suicide bombing in Manchester, England, offers an equally compelling anecdotal argument on the other side.  So, we probably shouldn’t rely upon anecdotal evidence.  “Well, d’uh,”–my sons.

[2] I think that’s from one volume of the trilogy U.S.A. by John Dos Passos, but I can’t find the exact reference.

[3] Mansfield is a former teachers college in the middle of nowhere in north-central Pennsylvania.   He got his BA when he was 24, so he lost some time somewhere doing something.

[4] See: https://en.wikipedia.org/wiki/Whitey_Bulger

[5] Before people start jumping all over the CIA, read the Report of the 9/11 Commission.  Not just the executive summary, but the whole thing.  Then look at the list of Commission members and run down their career tracks.

[6] Soufan subsequently made public comments on the results obtained by the different approaches.  The CIA seemed more vexed than grateful.

[7] In Western culture, black flags usually denote pirates.  Until the 18th Century, captured pirates rarely got a trial.  You just hanged them at the yard-arm or threw them overboard if there were some sharks handy.  This is a plea for cultural sensitivity on the part of radical Islamists.  Falls under the heading of “enlightened self-interest.”

[8] At least he didn’t call it Al Qaeda: Covenant or Al Qaeda: Dead Men Tell No Tales.

My Weekly Reader 29 April 2017.

In the “Roaring Twenties” the automobile was the “new thing.”  Henry Ford pioneered the mass production of cars and trucks.  He applied Frederick W. Taylor’s simplification of production into single successive tasks.  He created assembly lines to move the parts to workers in a carefully sequenced order.  Production soared while the price of cars to consumers dropped off the edge of a cliff.  Others rushed to copy the “flivver king.”  So, in 1923, General Motors opened a car plant in Janesville, Wisconsin.  It was a good bet: thanks to the previous establishment of Parker Pens and a tractor factory, the town had a pool of suitably skilled workers.  For almost fifty years, GM employed a lot of workers at decent wages.

The trouble was that the work itself would bore the balls off a pool table.  By the time of the New Deal disgruntled workers welcomed unionization with open arms.[1]  In 1936-1937 the United Auto Workers (UAW) staged a strike campaign that often turned violent.  For the first time, the government backed the right of the workers to unionize.

However, all the UAW could get its members was better pay, better benefits, and some pass-blocking between the workers and their foremen.  They couldn’t make the work itself any less soul-killing.  What workers wanted was out as soon as possible.  In 1970 the UAW launched a national strike that ran on for better than two months.  What the union won for its members was a “thirty [years] and out” rule that allowed workers to collect a full pension after thirty years on the job, and full health coverage between retirement and Medicare.

The cost of pensions and health care for people who retired when they were about 50 years old heavily freighted the books of companies that already had a hard time adapting to unexpected change.  Many of those companies—and other industries—began shifting production to Southern “right to work” states or abroad.[2]  Furthermore, workers still had to gut out 30 years at a job they hated from Day One.  On the other hand, places like Janesville were tight communities that had real emotional attractions for the successive generations that grew up in them.[3]  Moreover, for decades American culture—and the Democratic Party in particular—celebrated the industrial working class.  Like combat troops, people could feel a sense of pride in what they had to endure.  It would be hard to cut loose, move to Los Angeles, and become a Chippendale dancer.  (If, you know, that’s how you roll.)

By the dawn of the 21st Century, automobiles—at least union-made, American-company automobiles–no longer were the “new thing.”  The financial crisis of 2008 pushed the remaining “rust belt” car companies to the edge of bankruptcy.  They responded with a desperate effort to cut costs and streamline production.  In October 2008, General Motors announced that it would close its Janesville plant in December.[4]  Merry Christmas!

In America, the human costs of global trade agreements, foreign competition, management errors, and union stupidity have been enormous.  The Janesville unemployment rate hit 13 percent, before falling sharply as people pulled up stakes to search for better chances.  Displaced workers in Janesville didn’t have any better luck with the vaunted government retraining schemes than have other people.  Women have adapted more easily than men, which can’t be good for the men’s sense of identity.  In a generation, no one will remember or care.

[1] It’s not a car plant, but see: https://www.youtube.com/watch?v=UGXW-PpVL7I

[2] See: https://waroftheworldblog.com/2015/03/02/american-union-stay-away-from-me-uh/

[3] Elements of this appear in “Gran Torino” (dir. Clint Eastwood, 2008): https://www.youtube.com/watch?v=JmqV-LGbqkw

[4] Amy Goldstein, Janesville: An American Story (2017).

My Weekly Reader 1 April 2017.

Since 9/11 the imperatives of the war against radical Islamism have imposed an untrue interpretation of the enemy.  The radicals (Al-Qaeda, Boko Haram, Al Shabab, ISIS) form a minority within Islam, their most common targets are fellow Muslims, and the assistance of Muslim states is essential to victory over the Islamists.  Hence, it has become commonplace to describe the radicals as not truly Muslim, as heretics at best.

To argue differently is to open oneself to charges of Islamophobia.[1]  Nevertheless, radical Islamism diverges from contemporary Islam much more than it diverges from foundational Islam.  Originally, the Prophet Muhammad preached a single community of Believers (the “umma”), led by puritanical religious figures (a theocracy), and living in permanent hostility to Unbelievers (the conflict between the dar al-Islam, the House of Islam, and the dar al-Harb, the House of War of the Unbelievers).  Jews and Christians, the “Peoples of the Book,” were tolerated in return for payment of a tax, but barred from proselytizing.  Slavery remained a hallmark of Muslim societies from the time of the Prophet through the 19th Century.  Subsequently, mainstream Islam moved toward what Western observers see today: fractured into nation states too weak to pull a hobo off their sister; economically stagnant in the face of swiftly rising populations; ruled by tyrannical soldiers and monarchs, and struggling to reconcile “modernization” in all its forms with core cultural values.

Gilles Kepel and others have argued that dissatisfaction with these governments sent people streaming toward a renewed religious commitment in the last decades of the 20th Century.  Some of those people turned back to a fundamentalist version of Islam.  The fruit of this commitment has been harvested in Syria, Iraq, Nigeria, France, Germany, Spain, and Britain.

It’s not difficult to narrate the rise and fall of the Islamic State.  It’s just difficult to explain—comprehend really—why people are willing to give their lives in support of it.  Graeme Wood argues that the “foot soldiers [of ISIS] view their mission in religious terms and spend great energy on piety and devotion.”[2]  They are filled with religious passion.  Dexter Filkins isn’t sure this is actually the case.  His own experience as a war correspondent in Afghanistan and Iraq for the New York Times leads him to believe that “the motives for joining a militant organization were varied and complex.”[3]  Psychopaths and sociopaths found a justification, not a motivation, in religion.  Possibly Wood’s response would be to point again to the identity between the theology of ISIS and the theology of early Islam.  In the 7th and 8th Centuries the Arabs overran vast tracts of the Byzantine and Sassanian Empires.  Historians conventionally describe these armies as fired by a passionate religious enthusiasm.  Would Filkins argue that they actually were madmen and criminals?

The two different strands of interpretation can be reconciled if one understands that religious faith is intended to redeem those who feel themselves to be ruined by sin.[4]  Religion may become a tired and stifling bourgeois convention that upholds the established order.  It doesn’t normally start out that way.  So, perhaps ISIS recruits a wide range of troubled people who are self-aware enough to embrace beliefs that may heal or channel their flaws.

[1] It isn’t immediately apparent why mouthing ignorance-based platitudes favorable to Islam is less Islamophobic than is mouthing ignorance-based platitudes hostile to Islam.  Both approaches seem to be based on an indifference to learning about Islam.

[2] Graeme Wood, The Way of the Strangers: Encounters With the Islamic State (2017).

[3] Dexter Filkins, “On the Fringes of ISIS,” NYT Book Review, 22 January 2017.

[4] See, for one example: https://waroftheworldblog.com/2015/02/24/the-islamic-brigades-1/

My Weekly Reader 21 March 2017.

During the Twenties, the Soviet-controlled Communist International (Comintern) adopted a policy called “class against class.”  The Communist Parties of Europe and America excoriated democratic Socialists and bourgeois liberal parties as “social fascists” with which there could be no co-operation.  Then Hitler came to power in Germany.  The Comintern soon espoused creation of “Popular Front” alliances to save democracy.   This change of course often aroused deep suspicion among more-than-once-burned Socialists and anti-Marxist bourgeois liberals.  “Progressive” western intellectuals were a different matter.   They rallied in droves to the idea of a “Popular Front.”

The Soviet intelligence services trolled these waters, recruiting agents and agents-of-influence.  Ernest Hemingway counted among those wiggling in the net.[1]  Like many other people, Hemingway became convinced that only the Soviet Union and the foreign communist parties in its service really opposed a Nazi take-over of Europe.  Hemingway joined a Communist front organization, the League of American Writers.  In 1936 he made the first of several trips to Spain to report on the Republican resistance to the right-wing military coup that had won the support of Fascist Italy and Nazi Germany.

On these reporting and material-gathering trips, Hemingway came to know Alexander Orlov, the Soviet secret intelligence service (NKVD) chief in Spain.  Orlov marked Hemingway for possible recruitment by the NKVD.  After the publication of “For Whom the Bell Tolls” (1940), set in the Spanish Civil War, Hemingway began preparing for a trip to China to report on another preface to the Second World War.  Jacob Golos (1889-1943), a Soviet intelligence officer operating in the United States, recruited Hemingway.[2]  On that trip and afterward, Hemingway somehow always managed to miss connections with his assigned NKVD contacts.

American intelligence suffered a similarly unproductive relationship with the writer.  During the Second World War he filed reports with the F.B.I. on suspicious doings in Cuba, while rigging out his fishing boat as a sub-chaser.  During the Liberation of France, he crossed the line from war correspondent to combatant.  Less than a year later, the struggle against “fascism” ended with the complete victory of the “Grand Alliance.”  Most of the heavy lifting in that struggle had been done by the Red Army; the rest had been done by Western social fascists and bourgeois liberals.  This unpopular front died soon after victory.

Then the onset of the Cold War led to a hunt for Soviet agents in America.  Hemingway feared that his own sterile contacts would lead to his public disgrace, if not something worse.  He became paranoid about the F.B.I.  All the same, although a “premature anti-fascist,” Hemingway proved a laggard at dropping Communism even as its crimes became ever more obvious.   This reluctance is all the more remarkable because so many post-war events laid bare the realities Hemingway had chosen to ignore.  In 1948, Elizabeth Bentley, the lover of Jacob Golos and herself a Soviet agent, made spectacular revelations about Soviet espionage against the United States during the Second World War.  In 1953, Alexander Orlov, the senior NKVD officer he had met in Spain, published The Secret History of Stalin’s Crimes.  In 1956 the Soviet crushing of the Hungarian uprising caused many of the remaining Western Communist intellectuals to flee the party.   Nevertheless, Hemingway celebrated the initial victory of Fidel Castro’s revolution in Cuba.  A great writer, Hemingway was sometimes a fool.

[1] Nicholas Reynolds, Writer, Sailor, Soldier, Spy: Ernest Hemingway’s Secret Adventures, 1935-1961 (2017).  Reviewed by Harvey Klehr, WSJ, 14 March 2017, p. A15.

[2] Golos had been handling the cell centered on Julius Rosenberg.

My Weekly Reader 9 March 2017.

In the bad old days,[1] individual nation-states pursued the welfare of their citizens—political, economic, psychic—through nationalism, protectionism, and war.  The “Devil’s Decades” from 1914 to 1945 thoroughly discredited this approach.  In place of this disgraced “realist” world-view arose two rival systems.  The Soviet model of centrally-planned economies and Big Brother-little brother domination of surrounding countries came to dominate one half of the world.  The Western model of a market economy based on borders open to the flows of capital and people, and regulated by rules and laws came to dominate the other half of the world.  Both systems seemed to depend on international political stability.  Thus, the “Cold War” was, as John Lewis Gaddis put it, “The Long Peace.”  However, the Soviet model also rested upon a set of beliefs about human beings that were completely false.[2]  Since 1990, former followers of the Soviet model have been in flight toward the Western model.  Intellectuals declared “the end of history” since all the ideological rivals to the Western model had been defeated.

The financial crisis of 2008-2009 and the adjustment problems of the Eurozone posed huge problems of economic management for experts and politicians.  However, they hardly dented the belief in the one best way.  Hence, it is fascinating to encounter a restatement of the Western model[3] made just before the Brexit referendum, the election of Donald Trump as president of the United States, and the arrival of Marine Le Pen as a sort of Snow White to a host of populist dwarf parties.

Michael Mandelbaum understands the substance of international relations and domestic politics almost entirely in material terms.  A stable international order has allowed governments to focus on the promotion of economic growth and the distribution of its benefits.  (Indeed, the pacification of international relations and the de-legitimization of most ideologies have left them nothing else to pursue.)  Mandelbaum carefully explains the main components of the system.  He considers the changes that may be necessary to respond to the rise of the BRIC (Brazil, Russia, India, China) economies.  He calmly contemplates the teeter-totter shift in power as the United States experiences a relative decline and other countries develop economically.

Two points are worth noting.  First, Mandelbaum says little about the impact of the disruptive changes in the old industrial countries brought by globalization.  The adjustment costs of globalization have chiefly been borne by common people in sectors of the economy swept by the winds of change.  Currently, Western populism is being fueled by the anger of these people at the elites who have promoted globalization without devising any adequate devices for helping the losers.  Attention-grabbing though these movements have been, what will happen if the Chinese, Indian, and Brazilian people disrupted by globalization launch their own populist movements?  At least the Western countries have political systems designed—however grumpily and disdainfully—to accommodate grievances.

Second, writing in 2014, Mandelbaum foresaw that “it is reasonable to expect that the United States will do less global policing in the future than it has in the past…making the world a politically and militarily more turbulent place.”  Donald Trump may make this long-term trend worse, but he didn’t cause it.

[1] Admittedly, days beloved by history students.

[2] As one fictional character remarked, “All you had to do was keep them penned in and wait for the food riots to start.”  See William Gibson, Pattern Recognition.

[3] Michael Mandelbaum, The Road to Global Prosperity (2014).  See Tod Lindberg, “An Elite Guide to Globalization,” WSJ, 3 April 2014, p. A15.

My Weekly Reader 3 March 2017.

In the last third of the 19th Century, America sprinted through massive industrialization.  Sweeping development ended in “consolidation.”[1]  Big business sought to control the dangers to the immense investments it had made.  By organizing production and markets it sought to avoid destructive competition and reduce waste.  “Trusts” and holding-companies, vertical and horizontal integration became hallmarks of industry.  Under the sponsorship of J. P. Morgan, this consolidation movement spread to financial services and capital markets.  Bankers and stock-brokers became central figures in the private management of the American economy.

“Old money” seems out of place in a “New Land” built by such hustlers.  Still, it has been an American reality since the time of Edith Wharton.  Prep schools, Ivy League universities, clubs, churches, and marriage bound families with a lineage into one important element of the American elites.  By the early 20th Century, banks and the stock-exchange opened an appealing career avenue to the young men of this group.  Like many others of his group, Richard W. Whitney (1888-1974) went down it.[2]

From 1919 on, Whitney and Company brokered most of the bonds for the great Morgan bank.[3]  This burnished the already-impressive respectability that came with an education at Groton and Harvard, followed by marriage into a family with excellent ties to the Republican party.  Respectability isn’t the same thing as admiration.  Other men in the market didn’t think much of Whitney’s abilities.  The client base of his firm failed to grow.  However, bonds aren’t flashy (or mercurial) in the way that stocks may be, so they appeared to be a perfect match with Dick Whitney.  Appearances can be deceiving.  Whitney lived beyond his means.  Rather than retrench, he borrowed from an ever-widening pool of banks, family, friends, and acquaintances.  The personal loans often were unsecured by collateral, other than his respectability, family ties, and friendships.

When the Stock Market crashed in October 1929, Whitney became highly esteemed for his ineffectual steadiness in the face of disaster.[4]  Soon, his colleagues on the Stock Exchange elected him president, then re-elected him four times.

For a time, his respectability allowed him to go on borrowing.  Eventually, even the president of the Stock Exchange couldn’t get an unsecured loan.  So, in 1936, he misappropriated resources placed in his trust, then did it again and again.  In March 1938, the roof fell in.  Convicted of embezzlement, Whitney did three years in prison.

One question raised by this little immorality tale is how idiots—honest and dishonest—come to have big chunks of money.[5]  Richard Whitney didn’t strike people as sharp.  Anyone from the world of the monied must have been able to set a price on his life-style, then match it with what they knew of his income.  Yet they went on lending him money or trusted him to manage the resources of other people.  Why?

When Whitney lost everything and went to prison, friends gave his now-homeless wife a house to live in.  Later, Whitney’s more successful older brother paid all of his huge personal debts.  The bonds of family and friendship and respectability hold fast, even in the face of logic.

[1] See Naomi Lamoreaux, The Great Merger Movement in American Business, 1895-1904 (1988).

[2] Malcolm MacKay, Impeccable Connections: The Rise and Fall of Richard Whitney (2017).  Reviewed by Malcolm Carter, “Blue Blood, Red Ink,” WSJ, 28 April 2017, p. A11.

[3] Both his uncle and brother had made partner at Morgan.

[4] That’s not the worst quality a person can possess.

[5] For another, later, tale, see: https://waroftheworldblog.com/2015/01/19/by-the-waters-of-babylon/

My Weekly Reader 1 March 2017.

In recent years, I have noticed—and lost track of—how many times a head-line in the New York Times describes something as “risky.”[1]  The word is meant to deprecate, rather than to laud, the thing being described.  Clearly, both the editors and the typical Times reader are risk averse.  OK, so what?  So this.

Tyler Cowen, The Complacent Class: The Self-Defeating Quest for the American Dream (2017), argues that risk-taking and an openness to change “made America the world’s most productive and innovative economy.”  Now, however, the American economy appears much less innovative and aspirational.[2]  Start-ups as a share of American companies have fallen from 12-13 percent in the 1980s to 7-8 percent today; over the last eight years productivity has grown at about half its average rate for the period since 1945.  The former may be taken as a rough measure of entrepreneurialism; the latter may be taken as a rough measure of technological innovation.  Then there is the widespread prescription of mood-leveling anti-depressants.  Perhaps these blunt enthusiasm and engagement (as well as the impulse to drive your car into a telephone pole)?  Even Americans’ recreational drugs-of-choice disappoint.  We’ve gone from dropping acid and snorting coke to drifting away on an opioid cloud.

Cowen believes that Americans have shifted from “building a new and freer world” to “rearranging the pieces in the world we already have.”  To steal a metaphor from demography, Americans are becoming increasingly “endogamous” (marrying people inside the familiar social group) and decreasingly “exogamous” (marrying people outside the familiar social group).  That is, people are increasingly “matching” with others who share their own identity, whether it is politics, or residential location, or interior design.[3]  One way or another, risk-aversion and a fear of change have seized hold of the hearts of a broad swathe of Americans.

While Cowen offers some cautious suggestions about the future, the book may incite a closer examination of the past.  If Cowen is correct, then how did this risk-aversion come about?  In a sense, “morning in America” or no, the last forty years have been trying times.[4]  The oil shocks of the Seventies announced a long era of disruption for the settled economy of the post-war period.  A whole set of important social relationships and institutional arrangements rested upon the prosperity yielded by that settled economy.

A whole string of unforeseen disasters revealed errors in human judgement.  Take a few recent examples.  The invasion of Iraq in 2003 seemed like a good idea to some people and a bad idea to other people, but no one anticipated that the Pentagon would mess up the subsequent occupation.  In the first decade of the 21st Century, a housing bubble existed and banks were badly compromised, but only a few people perceived the danger and government regulators were not among them.  The “Deepwater Horizon” blow-out left some people agape at the realization that no one in the oil industry had ever asked what would happen if the blow-out preventer failed, as technology does with surprising frequency.[5]

Seen in this light, it is possible to understand why many people have come to adopt a stance of “first do no harm.”  However, it may be that such a stance does a different kind of harm.

[1] For a sampling, see: https://www.google.com/search?q=New+York+times+%22risky%22&ie=utf-8&oe=utf-8

[2] Matthew Rees, “Lazy Does It,” WSJ, 1 March 2017, p. A15.

[3] For another take on this issue, see: https://waroftheworldblog.com/2017/02/08/pret-a-penser/

[4] “Make America Great Again” is a frank acknowledgement of a feeling shared by many Americans.

[5] For example, Samsung phones catching fire or Takata air-bags deploying when they weren’t supposed to.