My Weekly Reader 23 July 2018.

“Globalization” means the trade in goods and services, the flow of capital, and the movement of workers across national boundaries with few or no national constraints.  This is an old story in human history, but it accelerated dramatically after 1945[1] and it has moved at astonishing speed since 1990.[2]  Globalization has spawned disruptive costs that accompany its immense benefits.  Much attention has focused more on those costs than on the benefits.

The political reaction against globalization commands the headlines.[3]  Examples include President Trump’s “America First” policies of tariffs and limits on migration; the British vote to leave the European Union (“Brexit”); and Angela Merkel’s suddenly precarious leadership of Germany.  The most persuasive interpretations see this reaction as arising from two sources.  One is the unequal distribution of both the benefits and the costs of globalization.  The other is the resulting discrediting of the elites, as leaders, in the eyes of everyone else, the followers.

One can point to many flaws in democratic governance.  However, part of the current problem is that democracy actually works.  Donald Trump won the 2016 election; a narrow, but real, majority of British voters chose “Brexit”; Italian voters supported the current coalition of anti-immigrant, anti-EU parties that governs the country.  Many of the proposed reforms seem intended to blunt the responsiveness of politicians to the popular will.  These include giving the president of the United States more authority to commit the country to treaties that could not pass the Senate; extending the time between elections to buffer politicians from public moods; raising the pay of politicians so that a better class of person will go into politics; and instituting civic-literacy tests for voters.

Trends that have nothing to do with globalization, but which will rock a globalized world economy, get lost in the shuffle.[4]  For example, in Western countries, robots look like a mechanical version of China: low-cost, high-productivity workers.  In developing countries, however, robots pose just as great a challenge.  Hundreds of millions of people in China, India, and elsewhere have been pulled out of abject poverty by industrialization.  Their jobs, too, are at risk.  Developed countries will have less incentive to off-shore production, and workers in developing countries will have to compete with robots in their own factories.

Then soon, but possibly not soon enough, a demographic shift will occur from low birth-low death to low birth-high death.  The United States already depends upon immigration for its population growth (and the financial stability of Social Security).  Japan and many European countries (Germany and Italy, for example) are in much worse shape in terms of their ratios of young workers to elderly retirees.  China will soon enter the ranks of countries with this imbalance.  How will different societies pay for their aged, non-working populations?

[1] After the Second World War, the United States led the construction of an open “Free World” economy through institutions like the World Bank (International Bank for Reconstruction and Development), the International Monetary Fund (IMF), and the General Agreement on Tariffs and Trade (GATT).

[2] The collapse of the Soviet Union discredited centrally-planned, non-market economies in the eyes of previous true believers.  Russia, the former “captive nations” of the Soviet Empire, and the People’s Republic of China all adopted capitalist market economies.  Many other leftist economies in the developing world (notably India) did the same thing.

[3] Dambisa Moyo, Edge of Chaos (2018).

[4] Ian Bremmer, Us vs. Them: The Failure of Globalism (2018).


Annals of the Great Recession XVI, Legacies.

In theory, the American economy is doing well.  Unemployment is at the lowest level in this century; corporations are investing, and there are signs of increasing consumer spending.  Fine.  However, there are also reasons to be concerned.  One is the “flattening of the yield curve.”[1]

The United States government borrows money by selling bonds (Treasury securities).  Basically, bonds are IOUs plus interest.  These Treasury securities run for different periods of time and pay different rates of interest.  Long-term bonds run for, say, ten years, while short-term notes run for, say, two years.  Long-term bonds normally pay higher interest (called the “yield”) than short-term ones, in part to compensate lenders for inflation.  When the economy is growing strongly, prices tend to rise.  The plot of yields across maturities is the “yield curve,” and the gap between long-term and short-term yields is its slope.
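For anyone who wants the mechanics spelled out, here is a minimal sketch in Python; the 2.85 and 2.55 percent yields are made-up numbers, not actual Treasury quotes.

    # A minimal sketch of the long-minus-short yield gap; the yields are illustrative only.
    def yield_spread(long_yield_pct: float, short_yield_pct: float) -> float:
        """Return the gap between long-term and short-term yields, in percentage points."""
        return long_yield_pct - short_yield_pct

    # Hypothetical example: a 10-year bond yielding 2.85% and a 2-year note yielding 2.55%.
    print(f"{yield_spread(2.85, 2.55):.2f}")  # 0.30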

If people think the economy will grow, then they will put their money in stocks, and the Treasury will have to pay higher interest on its long-term bonds to attract buyers.  If a lot of people want the security of long-term bonds rather than the risk of stocks, and don’t fear inflation, then the Treasury won’t have to pay as much interest.

Then there are the banks.  They borrow money at low short-term rates and lend it at higher long-term rates.  That’s how they make a profit.  If short-term rates approach long-term rates, it pinches their profits.  If short-term rates exceed long-term rates, they actually lose money.  So, they stop borrowing and lending.

Here’s the thing.  The gap between long-term and short-term yields has been closing.  This is called “the yield curve flattening.”  A year ago the gap was 1 percentage point; three months ago it was 0.5; in early July it fell below 0.3.  Interest rates for long-term bonds have not been rising much, while rates for short-term bonds have continued to rise.  This suggests that bond traders do not expect a lot of inflation, which suggests that they have doubts about future economic growth.  At some point, the yield for short-term bonds could rise above the yield for long-term bonds.  When this happens, the yield curve is said to be “inverted.”  Economists interpret an inverted yield curve as “a powerful signal of recession.”  Inversions have come before every recession and one near-recession since 1955.  However, the time lag between an inversion and a recession can stretch from six months to two years.  So, we aren’t there yet.
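Here is a toy sketch of watching that signal, using the rounded figures above (the dates are approximate):

    # Toy recession-signal check; spreads are the rounded figures quoted above, in percentage points.
    observations = [("mid-2017", 1.0), ("spring 2018", 0.5), ("early July 2018", 0.3)]
    for date, spread in observations:
        signal = "inverted (historically a recession signal)" if spread < 0 else "not inverted yet"
        print(f"{date}: long-short gap {spread:.1f} points ({signal})")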

The huge volume of bonds that central banks acquired to push down long-term rates during the period of “quantitative easing” continues to weigh on long-term rates.  Now the Federal Reserve is raising short-term rates to prevent excessive price rises in a strong economy.  There is mounting concern that the policies being pursued by the Federal Reserve could harm the economy by pinching off lending or by pushing banks to pursue riskier strategies.[2]  On the other hand, there is evidence that, in the wake of the “Great Recession,” the yield curve has lost some of its predictive power.  Moreover, a strong American economy coupled with a slowing world economy could push foreigners to buy long-term bonds.  The issue at hand is whether the Fed should continue to raise short-term interest rates as planned.  The stakes are high.

[1] Matt Philips, “A Recession Signal Is Flashing Yellow,” NYT, 27 June 2018.

[2] Nick Timiraos, “Fed Debates Signal From Yield Curve,” WSJ, 9 July 2018.

GPA+.

Ten years ago, 32 percent of graduating seniors received some form of “Latin honors” from the University of Southern California.[1]  This year, 44 percent received “Latin honors.”  Way to go Southern Cal!  Recruiting all those extra smart kids!  I bet the Ivy League schools will be taking their meals standing up after that spanking.  Oh, wait.  Turns out Harvard granted “Latin Honors” to more than half its graduating seniors.[2]

Granting “Latin honors” isn’t based on the subjective, direct judgement of individual merit by faculty members.  It’s based on the more objective, quantifiable measure of Grade Point Average.  So the growing share of graduates receiving “Latin honors” at Southern Cal and many other schools is just an artifact of long-term grade inflation.  According to one expert, a student with a 3.7 GPA (on a scale of 4.0) “is just a run-of-the-mill student.”[3]

It starts in the schools.  In 1998, 39 percent of high-school seniors graduated with an “A” average.  In 2016, 47 percent did.  Over the same span, average SAT Critical Reading scores fell from 505 to 494; average Math scores fell from 512 to 508.[4]  Students expect to continue their high-school experience in college.  Elite schools claim that they haven’t studied the trend and don’t know how to explain it.[5]  The situation probably differs at tuition-driven, non-selective schools.  Too many schools pursuing too few students has made the recruiting effort look like feeding time at the shark tank: “Throw in another goat.”  After the admissions office has done what it can, the faculty face heavy pressure from their employers to retain the students who have been admitted.

Grade inflation is like monetary inflation.

It is fueled by a weak authority in charge of controlling the volume of the unit of exchange.   In the case of the schools this could be parental pressure applied through the influence of a school’s reputation on housing prices.  In the case of colleges and universities, it is the desire to attract student dollars.  A strong authority might tell students that they aren’t particularly distinguished, or well-prepared, or hard-working.

It distorts incentives.  Thus, if you can get the same or more money for less work, then you’ll do less work.  If you can’t trust the money to have real value, then you’ll pursue other stores of value.  One form of this could be a flight to non-public schools with a reputation for greater rigor, or to home-schooling.

It favors people better positioned to exploit the nominal value of a unit of exchange or measure and disfavors people poorly positioned to do so.  Employers, for example, lack any reliable means to evaluate the educational attainment of potential employees.  High GPAs fog over individual differences in both ability and work ethic.

The historical record shows that breaking an inflation is very painful and politically difficult.  People are willing to try this only after conditions have become intolerable.  We aren’t there yet.

[1] That is, “cum laude,” “magna cum laude,” and “summa cum laude.”

[2] Down from 91 percent in 2001.

[3] Melissa Korn, “You Graduated Cum Laude?  So Did Everyone Else,” WSJ, 3 July 2018.

[4] See: https://blog.prepscholar.com/average-sat-scores-over-time

[5] See Captain Renault in “Casablanca”: “I’m shocked, shocked to find that gambling is going on in here!”

The Old Way and the New Way.

Once upon a time, the United States briefly (1945-1965) stood unchallenged atop the world economy.  “What America makes, the world takes.”  A handful of giant companies dominated the American economy.  They were capital-intensive, mass-production, mass-employment manufacturers.  They paid good wages and many offered generous defined-benefit pension plans.[1]  The companies had been created by ruthless, visionary entrepreneurs.  By the Forties, Fifties, and Sixties, they were owned by mere heirs and by a great many upper-middle-class stockholders.  Salaried managers with B-school degrees actually ran the increasingly bureaucratized companies.  No one much objected to punitive taxation of the well-off.  This is today’s Democratic Party idea of a “normal” economy.  It has been in decline for 50 years.[2]

Then change happened.  Part of the change came from abroad.  Foreign countries became serious competitors with American industry.  Then the “oil shocks” of the Seventies set off an inflation that disordered many areas of the American economy.  Part of the change was domestic.  New generations of ruthless entrepreneurs pushing new products rose up.  These people weren’t heirs to someone else’s work.  They had built their own businesses and fortunes.  Many of these people got rich without getting stupendously rich.  Therefore, many of them rejected the existing social consensus on soaking the rich.[3]  Reaganism followed and continues to this day.[4]  These changes sent shock waves through America’s economy, society, and politics.

For example, dying old industries and growing new industries faced the same problem of employee compensation.  (For that matter, so did many states and cities that had fobbed off public employee unions by promising them generous benefits in what the Brits call the “Never-Never”).  Neither corporate profits nor the stock market could guarantee adequate returns to support the defined benefit promises.  First, beginning in 1978, the private sector began to shift from “defined benefit” to “defined contribution” retirement plans.  Second, employers shifted a large share of medical insurance costs to employees as a way of holding down labor costs.  Since 1999, inflation has raised prices by 47 percent, but average contributions by workers to individual health insurance premiums have risen 281 percent.

The future well-being of employees came to depend upon their wisdom in choosing suitable retirement plans and on their willingness to divert income into savings.  Other factors also shaped their behavior.  First, we’ve been living with low interest rates for quite a while now.  This both encouraged people to pick up “cheap debt” and, through the workings of compound interest, slowed the growth in value of what people did save.  Second, many people had never thought much about saving and investing because the company’s pension and Social Security allowed them not to learn about it.  People often opted out of savings plans or made poor investment decisions when they opted in.
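A back-of-the-envelope Python sketch of that compounding effect; the $10,000 balance and the 2 and 5 percent rates are purely illustrative.

    # How the interest rate changes what a fixed sum grows into over 30 years.
    def future_value(principal: float, annual_rate: float, years: int) -> float:
        """Compound interest: principal * (1 + r) ** n."""
        return principal * (1 + annual_rate) ** years

    for rate in (0.02, 0.05):
        print(f"$10,000 at {rate:.0%} for 30 years -> ${future_value(10_000, rate, 30):,.0f}")
    # At 2% the money grows to about $18,000; at 5% it grows to about $43,000.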

The median personal income of people aged 55 to 69 leveled off from 2000 (before the Great Recession) to the present.  This did not stop people from spending more.  On average, people approaching retirement these days have heavy debts (some for college for their kids, but also for other stuff).[5]  They also have been mining their savings, rather than building them.  The Great Recession both reduced contributions to 401k plans and caused many people to withdraw from them to make ends meet.

The long-term results of this huge change in the social contract are just now beginning to be felt.[6]  More than 40 percent of households headed by people aged 55 to 70 will not have the resources to maintain, in retirement, the standard of living they enjoyed while working.  Households with at least one worker aged 55 to 64 had median savings of $135,000 in their 401(k) plans.  A nest egg that size generates roughly $8,000 a year, a paltry $675 or so a month in income.
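A rough check of how those two figures fit together; the 6 percent annual draw is an assumption chosen only for illustration, not a withdrawal recommendation.

    # Illustrative arithmetic only: the draw rate is assumed, not taken from the article.
    balance = 135_000
    annual_draw_rate = 0.06
    annual_income = balance * annual_draw_rate
    print(f"annual: ${annual_income:,.0f}")        # $8,100
    print(f"monthly: ${annual_income / 12:,.0f}")  # $675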

Worse still, the Social Security Trust Fund will have to reduce payments at some point in the future as it is depleted or exhausted.

Undoubtedly, the disaster that is emerging renders a severe judgement on many of the “Baby Boomers.”  Not all of the human-interest anecdotes included in journalists’ accounts arouse the same degree of sympathy.  Faced with the need to save for the future and to be self-reliant, many people delayed saving, stinted on saving in favor of consumption[7] until it was too late, and then did too little.

Still, as a matter of public policy, there are going to be powerful and compelling arguments made in favor of a government response.  If the government expands benefits for the worst-off retirees, then either taxes or deficits will rise, or benefits for the better-off will be cut.  Perhaps all three will form the basis of a compromise.

[1] By the 1980s, almost half (46 percent) of workers belonged to an employer pension plan.

[2] Without Democrats being willing to notice the changes.  JMO.

[3] Warren Buffett is in no sense a representative figure among this group.

[4] To the Democratic slogan of “tax, spend, elect,” Republicans learned to reply “tax-cut, spend, elect.”  See: William Shakespeare, Romeo and Juliet, Act 3, Scene 1.

[5] The per capita student loan debt of people aged 60 to 69 rose from about $300 to about $1,800 between 2004 and 2017.  Per capita debt for cars for the same group of people rose from about $3,000 to about $4,000 between 2004 and 2017.  It looks like people chose not to choose between guns and butter.

[6] Heather Gillers, Anne Tergesen, and Leslie Scism, “Time Bomb Looms for Aging America,” WSJ, 23-24 June 2018.

[7] Sales of HD televisions soared during the Great Recession.  The graph is for global sales, but may offer an approximation of American behavior.  See: https://www.statista.com/statistics/461114/full-hd-tv-shipments-worldwide/

Default Setting.

I’m not sure that History weighs on us, but Memory certainly does.[1]  For example, inflation and deflation are subjects of learning and memory for those who experience them.  Deflation (falling prices) plagued American borrowers and benefitted American lenders in the last quarter of the 19th Century.  People looked at inflation (rising prices) with longing or loathing.  If you were, say, 64 in 1934, then you were born in 1870.[2]  Growing up, you would probably have heard about the reams of paper money printed without any fixed relationship to gold in order to finance your particular country’s search for victory in the Civil War.  As an adult, you would have read with exultation or dread, depending on your social class, William Jennings Bryan’s “Cross of Gold” speech and the Populist calls for the free coinage of silver at a ratio of 16:1.  That is, you would have been familiar with inflation as a good thing (for debtors) or a bad thing (for creditors), rather than as just a normal thing.

In the wake of the election of 1896, a conservative victory, Congress enacted American adherence to the gold standard.  However, that was just Congress, a bunch of gutless poltroons (why else would you bribe them?) who might change their minds with the wind.  As a result, many lenders inserted “gold clauses” in contracts.  These obligated borrowers to repay in gold coins of “present weight and fineness” or in paper of equivalent value.  Basically, “gold clauses” were inflation-proofing insisted upon by lenders.  They applied to various contracts, but especially to bonds, that is, government and corporate IOUs.

OK, skip ahead to the Great Depression of the 1930s.  Taking the leadership of a country sunk in the slough of despond, Franklin D. Roosevelt opted for inflation over deflation.  He severed the United States from the gold standard, which had kept currencies fixed at specific rates of exchange, and then devalued the dollar.  This allowed Roosevelt to “raise” the price of gold held by the United States and to print more dollars to accommodate the higher price.  The “price” of gold rose from about $21/ounce to $35/ounce, an increase of about two-thirds.  This inflated prices and devalued debts.  Great!  For anyone whose debts were not inflation-proofed.
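The arithmetic behind “about two-thirds,” sketched in Python (using the rounded $21 figure; the statutory pre-1933 price was $20.67):

    # Devaluation arithmetic with the rounded prices used above.
    old_price = 21.0   # dollars per ounce, rounded pre-1933 price
    new_price = 35.0   # dollars per ounce after revaluation
    rise = (new_price - old_price) / old_price
    gold_per_dollar = old_price / new_price
    print(f"gold price rise: {rise:.0%}")                                       # 67%, i.e. about two-thirds
    print(f"dollar's gold content: {gold_per_dollar:.0%} of its former value")  # 60%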

At this point, Roosevelt’s policy slammed into the “gold clauses” on many bonds.  Because of the two-thirds rise in the price of gold, debtors would have had to pay lenders about two-thirds more than they had borrowed.  One of those debtors was the United States government, which owed about $20 billion in gold-clause bonds.[3]  In 1935, the Supreme Court, in the “gold clause cases,” held that the government could abrogate public and private gold clauses.  That is, the U.S. government is not obligated to pay its debts as written, and in this case it did not.

Still, it is a commonplace that the United States has never defaulted on its debts.  That reassuring belief keeps people buying Treasury bonds even as the deficit and the national debt grow to extraordinary levels.  Except, maybe, Bill Gross when he was at PIMCO.[4]

[1] That’s probably why “we” never learn from the past, even though individuals often do.  There is no way to transmit the acquired knowledge.  Then why study History at all?  Because smart people will be among the few who learn its lessons, and for everyone else, it’s pretty entertaining.

[2] Sebastian Edwards, American Default: The Untold Story of FDR, the Supreme Court, and the Battle over Gold (2018).

[3] Worth about $380 billion in 2018 dollars.

[4] https://www.theatlantic.com/business/archive/2011/03/pimcos-gross-asks-who-will-buy-treasuries-when-the-fed-doesnt/72276/ ; https://www.theatlantic.com/business/archive/2011/05/bill-gross-on-deficits-and-the-fed/238682/

Chain Migration.

From 1789 to 1808 the United States had a policy of unrestricted immigration; from 1808 to the 1920s the United States had a policy of unrestricted immigration for people of European origins; and from the 1920s to the 1960s the United States had a policy of restricted immigration that favored people from Northwestern Europe.[1]  These changes reflected struggles between economic necessity and national identity.

In 1960, 70 percent of immigrants came from Europe.[2]  Early in 1964, in a little-noticed part of his campaign for a “Great Society,” President Lyndon B. Johnson proclaimed that “a nation that was built by immigrants of all lands can ask those who now seek admission ‘What can you do for our country?’  But we should not be asking ‘In what country were you born?’”  The election of a liberal Congress in November 1964 opened the flood-gates for a host of long-stalled reforms.[3]

A new immigration law compromised between the traditional policy, which prioritized immigration from northwestern Europe, and a new policy that prioritized candidates with skills and education needed by the United States.  Conservatives chose family re-unification as the device for defending the traditional sources of immigration.  The new Immigration and Nationality Act of 1965 capped annual immigration at about a million people and assigned about 80 percent of the slots to “family reunification” candidates and only about 20 percent to “needed skills” candidates.  Moreover, the definition of eligible family members expanded beyond spouses and minor children to include adult children, brothers and sisters, and parents.

What looked to be a resounding victory for conservatives turned out to be something else entirely.  While the Irish and Italians continued to migrate in droves from desperately broken societies, the rest of Europe dried up as a major source of migration to America.  Britain, France, and Germany were both short of labor themselves and building “social” states that offered steadily rising standards of living for most people.  Eastern Europe lay within the Soviet empire, from which few could escape.  As a result, the large share of family reunification slots increasingly flowed toward the previous minority sources of Asia, Latin America, and Africa.  By 2010, 90 percent of immigrants were from non-European sources.

Is there anything wrong with this approach?  From the economic point of view, there is—at least in some eyes and some ways.  On the one hand, traditionally, most immigrants came to America as young people seeking economic opportunity and political freedom.  They found a hard and demanding land that gave nothing away and insisted that immigrants assimilate to an “Anglo-Saxon” culture.  America ended up with lots of adaptable strivers.  An Organization for Economic Cooperation and Development (OECD) study has reported that skill-based immigrants are more likely to be younger, better educated, more fluent in English, and quicker to get work than are the family-based immigrants.  Thus, American immigration policy misses the opportunity to fully enrich the country’s human capital.  On the other hand, a battle over limiting or reducing immigration is counter-productive for a country that is short of skilled labor and likely to suffer slower economic growth as a result.

So there is a case for immigration reform.  However, it should involve shifting (even reversing) the distribution of slots between “family” and “skill” immigrants.  Of course, even this solution dodges the question of whether the United States should be aggressively recruiting from countries with a dim future—like Taiwan.

[1] From 1808 the importation of African slaves was prohibited; from the 1880s Asian immigration to the West Coast was restricted; and from 1924 the immigration of people from southern and eastern Europe was restricted.

[2] Greg Ip, “Kinship Emerges as Immigration Flashpoint,” WSJ, 18 January 2018; Tom Gjelten, “The Curious History of ‘Chain Migration’,” WSJ, 20-21 January 2018

[3] See: Julian Zelizer, The Fierce Urgency of Now (2015).  Greg Ip argues that Johnson saw immigrants as deserving the same right to equal treatment without regard to race that he wished to ensure for American citizens.

Annals of the Great Recession XV.

The TARP and the stimulus bill were intended to foster recovery from the financial crisis of 2008-2009.  What about preventing a re-run in the future?  The Dodd-Frank Act required banks to hold larger capital reserves and to submit to “stress tests” evaluating how well they could deal with a future financial crisis on the scale of 2008.  Curiously, the law also limited the trade in “credit default swaps.”  Admittedly, the wholesale selling of these insurance policies against a collapse of the bubble seems to have been what sank the AIG insurance group.  On the other hand, buying them was an investment by people who saw the bubble for what it was rather than blindly believing what they were told.

One effect of the new legislation appears to be that it has encouraged the consolidation of the banking system.  It has been argued that the costs of complying with the new regulations are more than smaller banks can bear, so they have sold out to already big banks that are better able to shoulder the burden.

It is said that generals are always preparing to fight the last war.  Banks and investors are on guard against sub-prime mortgages.  However, “bubbles” can develop in any asset.[1]  So, some kind of new crisis is always possible.  Can the government and the financial system respond effectively to a new crisis?  The answers are not encouraging.

First, a flight from Keynesian demand-management policies followed quickly on the financial crisis.  President Bush encountered considerable difficulty in getting Republicans to accept the TARP.  President Obama opted for a stimulus bill that Paul Krugman warned was half as big as it needed to be, spread over two years instead of front-loaded into one year, and padded with tax cuts that recipients would use to pay down debt rather than to fund new spending.  Both Republicans and Democrats have since proved critical of deficit-spending plans.

Second, in the absence of a Keynesian policy on the part of Congress and the President, the Federal Reserve launched a long program of “quantitative easing.”  It bought huge amounts of both mortgage-backed securities (MBSs) and U.S. Treasury debt as a way of pumping money into a slowly recovering economy.  It has only recently begun to unwind this position and to raise interest rates.  Because rates remain low, it would be difficult to counter a new recession by cutting them much further.

There may also be a deep hostility to government intervention on the part of many voters.  The policies that saved the American economy (and the world economy) from a new Depression looked very much like a privatization of gains and a socialization of losses.[2]  Thus, in 2007, the top 10 percent of income-earners held 71 percent of the nation’s wealth; now the top 10 percent hold 77 percent.  That is a relative increase of about 8 percent.  The Fed’s quantitative easing pushed up asset prices at a time when ownership of stocks and bonds was concentrated in the upper income groups.

In 2007, the bottom 90 percent of earners held 29 percent of the nation’s wealth; today the bottom 90 percent hold 23 percent.  That is a relative drop of about 20 percent for the vast majority of Americans.  Even so, it is worse for some than for others.  Back in 2007, the median lower-income family had about $18,000 in assets.  Today it has about $11,000.  Doubtless that fall largely represents the loss of the houses people bought without being able to pay for them.  Would Congress tolerate a new TARP or a new stimulus bill?
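Those figures are relative changes in wealth shares, not percentage-point changes; a quick Python check:

    # Relative change in each group's share of national wealth, 2007 vs. today.
    def relative_change(before: float, after: float) -> float:
        return (after - before) / before

    print(f"top 10 percent:    {relative_change(71, 77):+.1%}")   # about +8.5%
    print(f"bottom 90 percent: {relative_change(29, 23):+.1%}")   # about -20.7%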

Maybe.  The combination of the recent tax revisions and the huge spending bill that enjoyed bipartisan support seems likely to expand the deficit massively.  Maybe stimulus is back in style if you put in enough treats for everyone.  Locking up a bunch of bankers might have to be one of those treats.

[1] See: Alexandre Dumas, The Black Tulip (1850).

[2] President Obama may have contributed to this with his denunciation of the rich as “the people who tanked the economy.”  Bill Gates and Warren Buffett tanked the economy?