The Lafarge Affair.

If you read the papers, it is easy to get the idea that the post-WWII order is breaking down.[1]  However, elements of one era can live on, for a time at least, in a new era.[2]  One part of the post-war order took the form of multi-national corporations operating in the developing world.

What happens when civil war or terrorism breaks out in those countries?  Do companies abandon their often-substantial investments and call the insurance company?  Do they pull out their Western leadership staff and abandon their local employees to their fates?  Alternatively, should they stay and try to continue operating?  In many developing countries, both the regime in power and the opponents willing to take up arms against it are unsavory.  In peacetime, the government can hide a lot of its brutality and oppression.  Once war breaks out, both sides come out into the open with unchecked violence.  If the companies remain, what kind of adaptations might they have to make as war drags on?

As anyone who has read Nevil Shute’s Most Secret (1945) or just walked around Paris knows, the French have long been pioneers in the use of reinforced concrete for construction.  (They call it “béton.”)  Cement is a major component of concrete.[3]  Currently, Lafarge SA is a major force in the business.[4]  It made large investments in Syria before the civil war began in 2011.[5]

The initial stage of the war raised the troubling question of “should I stay or should I go.”[6]  Lafarge decided to stay.  Then the initial war, the basis of the company’s calculations, went sideways.  In eastern Syria from 2013 to 2015, the Islamic State (ISIS) seized control of territory and proclaimed a caliphate.  (They also burned to death a captured Jordanian pilot, videotaped it, and posted the video to the internet, among other indications of their mind-set.)  ISIS exploited all the economic resources available within its domain.  This included extorting Western companies, as well as selling oil and trafficking in non-iconic antiquities.

Mired in this situation, Lafarge may have made some questionable choices.  Lafarge allegedly paid ISIS and other groups $5 million to ensure the safe passage of employees and goods through territory controlled by the caliphate.  Local managers pressed local employees to keep working while the security situation deteriorated.  Critics also cite “lax security” at the Lafarge properties.[7]

Confusing the effect with the cause, a French court has “indicted” Lafarge SA.

[1] And not just because Donald Trump got elected president.  Doesn’t matter what the daily edition of the New Republic (i.e. the New York Times) thinks.

[2] See, for a highly readable example, R. F. Arragon, The Transition from the Ancient to the Medieval World (1936).

[3] On the deeply fascinating subject of Portland cement, see:

[4] Liz Alderman, “France Indicts Cement Giant on Charge of Aiding Terror Groups in Syria,” NYT, 29 June 2018.

[5] If you look at news photographs of Syrian cities during the war, you will see that a huge market existed for concrete and cement before the war.  Commonly, one sees that artillery fire and aerial bombing blow out the front walls of apartment buildings.  The poured-concrete floors then fall downward like the pages of a book, rather than disintegrating or collapsing straight down.  The back walls and staircases serve as the hinge or binding.  So, the concrete appears to be generally of high quality to the eye of a non-expert.

[6] See

[7] Without seeking to exculpate the company, it is fair to ask just how Lafarge could have provided adequate security against ISIS when the governments of Syria and Iraq could not defend themselves without foreign military aid.


Migrants 1.

Social scientists posit that people experiencing disturbing social change can seize on particularist identities like ethnicity or nationality.  Demographic change, economic change, and shifting social values can all trigger such a response.  On the other hand, cultural and economic elites in Western countries celebrate the free flow of goods and labor.  They also have developed more cosmopolitan views than have many fellow citizens.[1]

Illegal immigration provides a good example of the particularist-cosmopolitan tension.  In recent times, illegal migration has become easier than ever before in history.  In both Europe and America, bitter quarrels over immigration rack politics.[2]  These controversies arise not from heavy current immigration, but from heavy prior immigration.  More importantly, the general backlash against elites—who led us to war in Iraq and then into the financial crisis—has ensnared migrants.

Illegal migration to the United States dropped sharply during the Great Recession.  It has not picked up dramatically in the past year.  However, that still leaves 10-12 million illegal immigrants in the United States.  They are human symbols of elite failure.  Liberals’ insistence on calling them “undocumented immigrants”—as if there is just some bureaucratic foul-up in Washington—adds fuel to the fire.  President Obama’s skirting of the law angered many people.  Illegal immigration in the European Union is more recent.  There the flood of migrants from various failed states mixes with refugees from war-torn Muslim states.

People leave their “shithole” countries for good reasons and not just on a whim.  Until conditions in those countries improve, there is not likely to be a significant drop in attempts at illegal immigration.  To complicate matters further, while many of the migrants are economic migrants, the law allows them to request asylum as victims of persecution.  This clogs the immigration system and delays repatriation.

In light of this reality, attention has turned to deterring them from reaching American or European soil in the first place.  Europeans have negotiated with pathway countries—Libya, Sudan, and Turkey—to stem the departures for Europe.  The implementation of those agreements involves a good deal of brutality that is much worse than anything suffered by Central American migrants to the United States.  Mexico is unwilling to play that sort of role for the United States.  The “zero tolerance” policy attempted by a Trump administration grown tired of waiting for Congressional approval of a border wall offers another form of deterrence.

Cosmopolitans sometimes phrase the choice in a misleading way: “What sort of society do they wish to be?  Do they wish to be immigrant nations with continual demographic and cultural change?”  First, both the European Union and the United States have long had substantial legal immigration.  Second, it is legitimate to debate what kinds of immigrants best serve the interests of the community.

[1] Benjamin Barber, Jihad vs. McWorld: How Globalism and Tribalism Are Reshaping the World (1995).  Barber’s analysis remains engaging, but it wasn’t new.  Late-nineteenth-century sociologists had identified the problem of anomie.  For that matter, historians long ago diagnosed the rise of “mystery” religions as a response to the cosmopolitanism of the Hellenistic kingdoms.

[2] Amanda Taub and Max Fisher, “In U.S. and Europe, Conflict Over Migration Points to Political Problems,” NYT, 30 June 2018.

My Weekly Reader, 10 July 2018.

Russo-American relations had deteriorated under the simultaneous presidencies (2000-2008) of George W. Bush and Vladimir Putin.[1]  However, constitutional term limits meant that Putin could not run for a third consecutive term.  So, he became prime minister while his client, Dmitri Medvedev, became president.  However, all power remained in Putin’s hands.

Barack Obama also became president in 2009.  Obama made one of his campaign advisers on foreign policy, Michael McFaul, head of Russian affairs on the National Security Council.  McFaul then became a principal architect of the Obama administration’s attempt at a “reset” of the relationship with Russia.  The administration hoped to draw Russia toward the American-led international system.

The “reset” began well.  In July 2009, the Russians began allowing the United States to use Russian airspace to airlift supplies to Afghanistan.  In September 2009, the U.S. dropped its plan to build anti-missile defenses in Eastern Europe.  In March 2010, the two countries agreed to reduce their nuclear arsenals.  In May 2010, the Russians agreed to impose sanctions on Iran in an effort to get it to end its nuclear weapons program.  The U.S. then lifted sanctions on Russia.

Then things went sour in a hurry.  Why?  There are two answers here.  One answer is that the Libyan Revolution from March to August 2011 began the breakdown.  In this account, the “Arab Spring” spread to Libya; the Gaddafi government set out to suppress it; Libya was a Russian client and Russia had a veto on any Security Council authorization; the Americans got Russia to abstain by limiting the resolution to “protecting civilians,” rather than overthrowing the regime; and then they went ahead and overthrew the regime.[2]

To make matters worse, in Fall 2011, Putin and Medvedev again switched jobs.  This infuriated many Russians.  Demonstrators filled the streets and the unrest continued during the run-up to the March 2012 presidential elections.  It doesn’t seem to have sat too well with Washington either.  In December 2011, Secretary of State Hillary Clinton declared that “The Russian people, like people everywhere, deserve the right to have their voices heard and their votes counted.  And that means they deserve free, fair, transparent elections and leaders who are accountable to them.”[3]  This amounted to taking sides against Putin.

Michael McFaul, the American ambassador to Russia from 2012 to 2014, prefers another explanation.  He thinks that Putin is “paranoid” and sees the U.S. as “the enemy.”  He is possessed of “fixed and flawed views.”  The Russian people themselves follow Putin because of “a deep societal demand for this kind of autocratic leadership, and this kind of antagonistic relationship with the United States and the West.”

When Secretary of State Clinton made her statement on the Russian elections, the United States had already overthrown the autocratic governments of Afghanistan, Iraq, and Libya, and leaned on the Egyptian military to topple Hosni Mubarak.  The American government-funded National Endowment for Democracy was at work in Russia.  Is it a surprise that Putin is paranoid?  McFaul should have re-read Kennan before he entered government.

[1] Daniel Beer, “Does Vladimir Putin Speak for the Russian People?” NYTBR, 8 July 2018, reviewing Michael McFaul, From Cold War to Hot Peace (2018).

[2] See:

[3] See:

Annals of the Great Recession XVI, Legacies.

In theory, the American economy is doing well.  Unemployment is at the lowest level in this century; corporations are investing, and there are signs of increasing consumer spending.  Fine.  However, there are also reasons to be concerned.  One is the “flattening of the yield curve.”[1]

The United States government borrows money by selling bonds (Treasury notes).  Basically, bonds are IOUs + interest.  These Treasury notes run for different periods of time and pay different rates of interest.  Long-term bonds run for something like 10 years, while short-term bonds run for something like 2 years.  The long-term bonds pay higher interest (called “yield”) than do short-term bonds to account for inflation.  When the economy is growing strongly, prices will tend to rise.  The pattern of yields across these different maturities is called the “yield curve”; the gap between the long-term yield and the short-term yield measures its slope.

If people think the economy will grow, then they will put their money in stocks and the Treasury will have to pay higher interest on its long-term bonds.  If a lot of people want the security of long-term bonds, rather than the risk of stocks and don’t fear inflation, then the Treasury won’t have to pay as much interest.

Then there are the banks.  They borrow money at low short-term rates and lend it at higher long-term rates.  That’s how they make a profit.  If short-term rates approach long-term rates, it pinches their profits.  If short-term rates exceed long-term rates, they actually lose money.  So, they stop borrowing and lending.
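The borrow-short, lend-long arithmetic can be sketched in a few lines.  This is a toy illustration only; the rates are hypothetical, not taken from any source cited here.

```python
# A bank's gross margin on new lending is roughly the long-term rate it
# charges borrowers minus the short-term rate it pays to fund itself.
# All rates below are hypothetical illustrations.

def lending_margin(short_rate, long_rate):
    """Gross spread the bank earns per dollar lent, before costs."""
    return long_rate - short_rate

normal = lending_margin(0.020, 0.030)    # a full point of margin
pinched = lending_margin(0.028, 0.030)   # flat curve: profits pinched
inverted = lending_margin(0.032, 0.030)  # inverted curve: a loss on new loans

print(normal, pinched, inverted)
```

When the margin goes negative, new lending destroys money for the bank, which is why an inverted curve chokes off credit.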

Here’s the thing.  The gap between long-term and short-term bonds has been closing.  This is called “the yield curve flattening.”  A year ago the gap was 1 percentage point; three months ago it was 0.5; in early July it fell below 0.3.  Interest rates for long-term bonds have not been rising much, while the rates for short-term bonds have continued to rise.  This suggests that bond-traders do not expect a lot of inflation, which suggests that they have doubts about future economic growth.  At some point, the yield for short-term bonds could rise above the yield for long-term bonds.  When this happens, the yield curve is said to be “inverted.”  Economists interpret an inverted yield curve as “a powerful signal of recession.”  Inversions have come before every recession and one near-recession since 1955.  However, the time lag between an inversion and a recession can stretch from six months to two years.  So, we aren’t there yet.
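The flattening can be made concrete with a short sketch using the spread figures quoted above (long-term yield minus short-term yield, in percentage points).  Note that the 0.5-point cutoff for calling the curve “flattening” is an arbitrary illustrative threshold of my own, not anything from the reporting.

```python
# Spread = long-term yield minus short-term yield, in percentage points.
# The figures come from the text; the 0.5-point "flattening" cutoff is
# an arbitrary illustrative threshold.

def curve_state(spread):
    """Classify the yield curve by its long-minus-short spread."""
    if spread < 0:
        return "inverted"    # short yields above long: the recession signal
    elif spread <= 0.5:
        return "flattening"
    return "normal"

for when, spread in [("a year ago", 1.0),
                     ("three months ago", 0.5),
                     ("early July", 0.3)]:
    print(when, spread, curve_state(spread))
```

Run on the article’s numbers, the classification slides from “normal” toward “flattening” over the year, with “inverted” still a few tenths of a point away.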

The huge number of bonds that central banks acquired to push down long-term rates during the period of “quantitative easing” continues to weigh on long-term rates.  Now the Federal Reserve is raising short-term rates to prevent excessive price rises in a strong economy.  There is mounting concern that the policies being pursued by the Federal Reserve could harm the economy by pinching off lending or by pushing banks to pursue riskier strategies.[2]  On the other hand, there is evidence that, in the wake of the “Great Recession,” the yield curve has lost some of its predictive power.  Moreover, a strong American economy coupled with a slowing world economy could push foreigners to buy long-term bonds.  The issue at hand is whether the Fed should continue to raise short-term interest rates as planned.  The stakes are high.

[1] Matt Phillips, “A Recession Signal Is Flashing Yellow,” NYT, 27 June 2018.

[2] Nick Timiraos, “Fed Debates Signal From Yield Curve,” WSJ, 9 July 2018.


Ten years ago, 32 percent of graduating seniors received some form of “Latin honors” from the University of Southern California.[1]  This year, 44 percent received “Latin honors.”  Way to go Southern Cal!  Recruiting all those extra smart kids!  I bet the Ivy League schools will be taking their meals standing up after that spanking.  Oh, wait.  Turns out Harvard granted “Latin Honors” to more than half its graduating seniors.[2]

Granting “Latin honors” isn’t based on the subjective direct judgement of individual merit by faculty members.  It’s based on the more objective, quantifiable judgement of Grade Point Average.  So, the practice at Southern Cal and many other schools of granting “Latin honors” to a growing share of graduates is just an artifact of long-term grade inflation.  According to one expert, a student with a 3.7 GPA (on a scale of 4.0) “is just a run-of-the-mill student.”[3]

It starts in the schools.  In 1998, 39 percent of high-school seniors graduated with an “A” average.  In 2016, 47 percent did.  Over the same span, average SAT Critical Reading scores fell from 505 to 494; average Math scores fell from 512 to 508.[4]  Students expect to continue their high-school experience in college.  Elite schools claim that they haven’t studied the trend and don’t know how to explain it.[5]  The situation probably differs at tuition-driven, non-selective schools.  Too many schools pursuing too few students has made the recruiting effort look like feeding time at the shark tank: “Throw in another goat.”  After the admissions office has done what it can, the faculty face a heavy emphasis by their employers on retaining the students who have been admitted.

Grade inflation is like monetary inflation.

It is fueled by a weak authority in charge of controlling the volume of the unit of exchange.   In the case of the schools this could be parental pressure applied through the influence of a school’s reputation on housing prices.  In the case of colleges and universities, it is the desire to attract student dollars.  A strong authority might tell students that they aren’t particularly distinguished, or well-prepared, or hard-working.

It distorts incentives.  Thus, if you can get the same or more money for less work, then you’ll do less work.  If you can’t trust the money to have real value, then you’ll pursue other stores of value.  One form of this could be a flight to non-public schools with a reputation for greater rigor, or to home-schooling.

It favors people better positioned to exploit the nominal value of a unit of exchange/measure and disfavors people poorly positioned to do so.  Employers, for example, lack any reliable means to evaluate the educational attainment of potential employees.  High GPAs fog over individual differences in both ability and work ethic.

The historical record shows that breaking an inflation is very painful and politically difficult.  People are willing to try this only after conditions have become intolerable.  We aren’t there yet.

[1] That is, “cum laude,” “magna cum laude,” and “summa cum laude.”

[2] Down from 91 percent in 2001.

[3] Melissa Korn, “You Graduated Cum Laude?  So Did Everyone Else,” WSJ, 3 July 2018.

[4] See:

[5] See Captain Renault in “Casablanca.”

Indonesian Islam.

Back in the day, Seymour Martin Lipset wrote The First New Nation: The United States in Historical and Comparative Perspective (1963).  Like most of Lipset’s work, it was about several things at once.  For one thing, it was about the United States as the first colonial territory to gain its independence from a colonial overlord.  Therefore, America could serve as a model for all the Asian and African countries recently or about-to-be liberated from European empires.  For another thing, it was about the related issue of how to create a stable democracy.  (That’s what most of the leaders of new nations said that they wanted, although the historical record now suggests other ambitions.[1])  According to Lipset, democracy is intimately connected with economic growth: “[t]he more well-to-do a nation, the greater the chances that it will sustain democracy.”  This idea lay behind both the Marshall Plan to aid Western European economic recovery after the Second World War and the First Gulf War (1990-1991).[2]

Time hasn’t fully borne out Lipset’s ideas–so far.  China, for example, is an increasingly prosperous autocracy.  In many Muslim countries, oligarchies have gobbled up national wealth, while the vast majority of people have little opportunity.  More importantly, religious belief can outweigh political theory.  It isn’t clear that the beliefs of Islam are compatible with Western conceptions of democracy.  Traditional Islam rejects any separation of church and state; it rejects law derived from legislatures rather than the Word of Allah; and it rejects the very idea of nation-states in favor of the “umma” of all Believers.[3]  Moreover, Islam is socially conservative in ways that Western liberals find repugnant.  Women’s rights and gay rights antagonize social conservatives.

Indonesia provides an interesting case.  It is the most populous Muslim country in the world.[4]  Piled on top of religious conservatism are hostilities related to ethnic or religious minorities.[5]  People of Chinese ancestry, a very small share of the population, play an out-sized role in the economy and have long been the target of Muslim hostility.  Women’s rights and gay rights loom large in Muslim concerns because of Indonesia’s popularity with Western tourists.

Like Turkey, Indonesia has a democratic system.  Can democratic politics be used to impose an Islamist agenda?  In 2002, Jemaah Islamiyah—an Islamist group linked to Al Qaeda—killed 200 people in bomb attacks on the Indonesian island of Bali.  Repression followed.  Recently, however, there have been both a mass mobilization of Muslims against the Christian governor of Jakarta and renewed terrorist attacks.  There is also legislation pending to criminalize public displays of affection by gay people.

Will Southeast Asia become the next front in the war against radical Islamism?

[1] A friend insists that there is a scene from one of Ionesco’s plays in which a character says “We will drink wine under the willow trees.  AND YOU WILL BE MY SLAVES!”  I haven’t been able to run it down.

[2] It was a war for oil prices and oil markets, not a war for oil companies.  The historically-minded men and women behind the war were aware that the 1970s “oil shocks” had pitched the world close to the edge of depression and that the Great Depression of the Thirties had been the principal cause of the Second World War.  They didn’t want that to happen again.

[3] We’ll probably hear complaints that the University of Michigan Museum of Art is a sign of creeping Islamization.

[4] Indonesia’s population is 270 million: 87.2 percent Muslim, 9.9 percent Christian, 1.7 percent Hindu, 0.7 percent Buddhist, and 0.2 percent Confucian.

[5] Yaroslav Trofimov, “Islamist Shift Unsettles Indonesia’s Democracy,” WSJ, 29 June 2018.

The Old Way and the New Way.

Once upon a time, the United States briefly (1945-1965) stood unchallenged atop the world economy.  “What America makes, the world takes.”  A handful of giant companies dominated the American economy.  They were capital-intensive mass production and mass employment manufacturers.  They paid good wages and many offered generous defined-benefit pension plans.[1]  The companies had been created by ruthless, visionary entrepreneurs.  By the Forties, Fifties, and Sixties, they were owned by mere heirs and by a great many upper middle-class stockholders.  Salaried managers with B-School degrees actually ran the increasingly bureaucratized companies.  No one much objected to punitive taxation of the well-off.  This is today’s Democratic Party idea of a “normal” economy.  It has been in decline for 50 years.[2]

Then change happened.  Part of the change came from abroad.  Foreign countries became serious competitors with American industry.  Then the “oil shocks” of the Seventies set off an inflation that disordered many areas of the American economy.  Part of the change was domestic.  New generations of ruthless entrepreneurs pushing new products rose up.  These people weren’t heirs to someone else’s work.  They had built their own businesses and fortunes.  Many of these people got rich without getting stupendously rich.  Therefore, many of them rejected the existing social consensus on soaking the rich.[3]  Reaganism followed and continues to this day.[4]  These changes sent shock waves through America’s economy, society, and politics.

For example, dying old industries and growing new industries faced the same problem of employee compensation.  (For that matter, so did many states and cities that had fobbed off public employee unions by promising them generous benefits in what the Brits call the “Never-Never.”)  Neither corporate profits nor the stock market could guarantee adequate returns to support the defined-benefit promises.  First, beginning in 1978, the private sector began to shift from “defined benefit” to “defined contribution” retirement plans.  Second, employers shifted a large share of medical insurance costs to employees as a way of holding down labor costs.  Since 1999, inflation has raised prices by 47 percent, but average contributions by workers to individual health insurance premiums have risen 281 percent.

The future well-being of employees came to depend upon their wisdom in choosing suitable retirement plans and on their willingness to divert income into savings.  Other factors also shaped their behavior.  First, we’ve been living with low interest rates for quite a while now.  This both encouraged people to pick up “cheap debt” and—through the magic of compound interest—slowed the rise in value of what people did save.  Second, many people had never thought much about saving and investing because the company’s pension and Social Security allowed them to not learn about it.  People often opted out of savings plans or made poor investment decisions when they opted in.
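The compound-interest point can be illustrated with a toy calculation.  The $5,000 annual contribution and the 2 percent versus 6 percent rates of return below are hypothetical, chosen only to show the scale of the effect over a working life.

```python
# Toy illustration of compounding: save a fixed amount each year and let
# it grow.  The $5,000 contribution and the 2%/6% rates are hypothetical.

def future_value(annual_saving, rate, years):
    """Value of saving `annual_saving` per year for `years` years at `rate`."""
    total = 0.0
    for _ in range(years):
        total = (total + annual_saving) * (1 + rate)
    return total

low_rate_nest_egg = future_value(5_000, 0.02, 30)   # a low-rate era
high_rate_nest_egg = future_value(5_000, 0.06, 30)  # a higher-rate era

# The same saving effort yields roughly twice as much at the higher rate.
print(round(low_rate_nest_egg), round(high_rate_nest_egg))
```

Identical thrift, very different nest eggs: the low-rate saver ends up with roughly half what the higher-rate saver does, which is the sense in which cheap money punishes those who did save.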

The median personal income of people aged 55 to 69 leveled off from 2000 (before the Great Recession) to the present.  This did not stop people from spending more.  On average, people approaching retirement these days have heavy debts (some for college for their kids, but also for other stuff).[5]  They also have been mining their savings, rather than building them.  The Great Recession both reduced contributions to 401k plans and caused many people to withdraw from them to make ends meet.

The long-term results of this huge change in the social contract are just now beginning to be felt.[6]  More than 40 percent of households headed by people aged 55 to 70 will not have the resources to maintain, in retirement, the standard of living they enjoyed while working.  Households with at least one worker aged 55 to 64 had median savings of $135,000 in their 401(k) plans.  The median annual income from those plans is $8,000, which works out to a paltry $667 a month.

Worse still, the Social Security Trust Fund will have to reduce payments at some point in the future as it is depleted or exhausted.

Undoubtedly, the disaster that is emerging renders a severe judgement on many of the “Baby Boomers.”  Not all of the human-interest cases included in journalists’ stories arouse the same degree of sympathy.  Faced with the need to save for the future and to be self-reliant, many Boomers delayed saving, stinted saving in favor of consumption[7] until too late, and then did too little.

Still, as a matter of public policy, there are going to be powerful and compelling arguments made in favor of a government response.  If the government expands benefits to the worst off retirees, then either taxes or deficits will rise or benefits for the better-off will be decreased.  Perhaps all three will form the basis of a compromise.

[1] By the 1980s, almost half (46 percent) of workers belonged to an employer pension plan.

[2] Without Democrats being willing to notice the changes.  JMO.

[3] Warren Buffett is in no sense a representative figure among this group.

[4] To the Democratic slogan of “tax, spend, elect,” the Republican learned to reply “tax-cut, spend, elect.”  See: William Shakespeare, Romeo and Juliet, Act 3, Scene 1.

[5] The per capita student loan debt of people aged 60 to 69 rose from about $300 to about $1,800 between 2004 and 2017.  Per capita debt for cars for the same group of people rose from about $3,000 to about $4,000 between 2004 and 2017.  It looks like people chose not to choose between guns and butter.

[6] Heather Gillers, Anne Tergesen, and Leslie Scism, “Time Bomb Looms for Aging America,” WSJ, 23-24 June 2018.

[7] Sales of HD televisions soared during the Great Recession.  The graph is for global sales, but may offer an approximation of American behavior.  See: