The Old Way and the New Way.

Once upon a time, the United States briefly (1945-1965) stood unchallenged atop the world economy.  “What America makes, the world takes.”  A handful of giant companies dominated the American economy.  They were capital-intensive, mass-production, mass-employment manufacturers.  They paid good wages and many offered generous defined-benefit pension plans.[1]  The companies had been created by ruthless, visionary entrepreneurs.  By the Forties, Fifties, and Sixties, they were owned by mere heirs and by a great many upper middle-class stockholders.  Salaried managers with B-School degrees actually ran the increasingly bureaucratized companies.  No one much objected to punitive taxation of the well-off.  This is today’s Democratic Party idea of a “normal” economy.  That economy has been in decline for 50 years.[2]

Then change happened.  Part of the change came from abroad.  Foreign countries became serious competitors with American industry.  Then the “oil shocks” of the Seventies set off an inflation that disordered many areas of the American economy.  Part of the change was domestic.  New generations of ruthless entrepreneurs pushing new products rose up.  These people weren’t heirs to someone else’s work.  They had built their own businesses and fortunes.  Many of these people got rich without getting stupendously rich.  Therefore, many of them rejected the existing social consensus on soaking the rich.[3]  Reaganism followed and continues to this day.[4]  These changes sent shock waves through America’s economy, society, and politics.

For example, dying old industries and growing new industries faced the same problem of employee compensation.  (For that matter, so did many states and cities that had fobbed off public employee unions by promising them generous benefits in what the Brits call the “Never-Never.”)  Neither corporate profits nor the stock market could guarantee adequate returns to support the defined-benefit promises.  First, beginning in 1978, the private sector began to shift from “defined benefit” to “defined contribution” retirement plans.  Second, employers shifted a large share of medical insurance costs to employees as a way of holding down labor costs.  Since 1999, inflation has raised prices by 47 percent, but average contributions by workers to individual health insurance premiums have risen 281 percent.

The future well-being of employees came to depend upon their wisdom in choosing suitable retirement plans and on their willingness to divert income into savings.  Other factors also shaped their behavior.  First, we’ve been living with low interest rates for quite a while now.  This both encouraged people to pick up “cheap debt” and—by blunting the magic of compound interest—slowed the growth of what people did save.  Second, many people had never thought much about saving and investing because the company’s pension and Social Security spared them the need to learn.  People often opted out of savings plans or made poor investment decisions when they opted in.
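To see how much low interest rates matter for savers, consider an illustrative calculation (hypothetical numbers of my own, not drawn from any source): $10,000 left to compound for twenty years grows very differently at 2 percent than at 6 percent.

\[ \$10{,}000 \times (1.02)^{20} \approx \$14{,}900 \qquad \text{versus} \qquad \$10{,}000 \times (1.06)^{20} \approx \$32{,}100. \]

Roughly speaking, the low-rate saver ends up with less than half the nest egg that the same discipline would have produced at the higher rate.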

The median personal income of people aged 55 to 69 leveled off from 2000 (before the Great Recession) to the present.  This did not stop people from spending more.  On average, people approaching retirement these days have heavy debts (some for college for their kids, but also for other stuff).[5]  They also have been mining their savings, rather than building them.  The Great Recession both reduced contributions to 401k plans and caused many people to withdraw from them to make ends meet.

The long-term results of this huge change in the social contract are just now beginning to be felt.[6]  More than 40 percent of households headed by people aged 55 to 70 will not have the resources to maintain, once they hit retirement, the standard of living they enjoyed while working.  Households with at least one worker aged 55 to 64 had a median savings of $135,000 in their 401(k) plans.  The median annual income from those 401(k) plans is $8,000—a paltry $670 or so a month.
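As a rough check on those figures (my arithmetic, not the WSJ’s), spreading that median annual income evenly across the year gives

\[ \frac{\$8{,}000}{12} \approx \$667 \text{ per month}, \]

which squares with the paltry monthly figure above.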

Worse still, Social Security will have to reduce payments at some point in the future as its Trust Fund is depleted or exhausted.

Undoubtedly, the disaster that is emerging renders a severe judgment on many of the “Baby Boomers.”  Not all of the human-interest cases included in journalists’ stories arouse the same degree of sympathy.  Faced with the need to save for the future and to be self-reliant, many of them delayed saving, stinted on saving in favor of consumption[7] until too late, and then did too little.

Still, as a matter of public policy, there are going to be powerful and compelling arguments made in favor of a government response.  If the government expands benefits for the worst-off retirees, then either taxes or deficits will rise or benefits for the better-off will be decreased.  Perhaps all three will form the basis of a compromise.

[1] By the 1980s, almost half (46 percent) of workers belonged to an employer pension plan.

[2] Without Democrats being willing to notice the changes.  JMO.

[3] Warren Buffett is in no sense a representative figure among this group.

[4] To the Democratic slogan of “tax, spend, elect,” the Republicans learned to reply “tax-cut, spend, elect.”  See: William Shakespeare, Romeo and Juliet, Act 3, Scene 1.

[5] The per capita student loan debt of people aged 60 to 69 rose from about $300 to about $1,800 between 2004 and 2017.  Per capita debt for cars for the same group of people rose from about $3,000 to about $4,000 between 2004 and 2017.  It looks like people chose not to choose between guns and butter.

[6] Heather Gillers, Anne Tergesen, and Leslie Scism, “Time Bomb Looms for Aging America,” WSJ, 23-24 June 2018.

[7] Sales of HD televisions soared during the Great Recession.  The graph is for global sales, but may offer an approximation of American behavior.  See: https://www.statista.com/statistics/461114/full-hd-tv-shipments-worldwide/


Default Setting.

I’m not sure that History weighs on us, but Memory certainly does.[1]  For example, inflation and deflation are subjects of learning and memory for those who experience them.  Deflation (falling prices) plagued American borrowers and benefitted American lenders in the last quarter of the 19th Century.  People looked at inflation (rising prices) with longing or loathing.  If you were, say, 64 in 1934, then you were born in 1870.[2]  Growing up, you would probably have heard about reams of paper money printed without any fixed relationship to gold in order to finance your particular country’s search for victory in the Civil War.  As an adult, you would have read with exultation or dread, depending on your social class, William Jennings Bryan’s “Cross of Gold” speech and the Populist calls for the free coinage of silver at a ratio of 16:1.  That is, you would have been familiar with inflation as a good thing (for debtors) or a bad thing (for creditors), rather than as just a normal thing.

In the wake of the election of 1896, a conservative victory, Congress enacted American adherence to the gold standard.  However, that was just Congress, a bunch of gutless poltroons (why else would you bribe them?) who might change their minds with the wind.  As a result, many lenders inserted “gold clauses” in contracts.  These obligated borrowers to repay in gold coins of “present weight and fineness” or in paper of equivalent value.  Basically, “gold clauses” were inflation-proofing insisted upon by lenders.  They applied to various contracts, but especially to bonds—government and corporate IOUs.

OK, skip ahead to the Great Depression of the 1930s.  Taking the leadership of a country sunk in the slough of despond, Franklin D. Roosevelt opted for inflation over deflation.  He severed the United States from the Gold Standard, which kept currencies fixed at specific rates of exchange, and then revalued the dollar.  This allowed Roosevelt to “raise” the price of gold held by the United States and print more dollars to accommodate its higher price.  The “price” of gold rose from about $21/ounce to $35/ounce.  So, by about two-thirds.  This inflated prices and devalued debts.  Great!  For anyone who had debts not inflation-proofed.

At this point, Roosevelt’s policy slammed into the “gold clauses” on many bonds.  Because of the two-thirds rise in the price of gold, debtors had to pay lenders about two-thirds more than they had borrowed.  One of those debtors was the United States government, which owed about $20 billion in gold-clause bonds.[3]  In 1935, the Supreme Court—in the “Gold Clause Cases”—held that the government could abrogate public and private gold clauses.  That is, the U.S. government was not obligated to pay its debts as written, and it did not pay them in this case.
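To make the arithmetic behind that burden explicit (a back-of-the-envelope sketch using the round numbers above): a gold clause tied repayment to gold at its old dollar price, so after revaluation each dollar borrowed had to be repaid with roughly

\[ \frac{\$35}{\$21} \approx 1.67 \text{ dollars}, \]

that is, about two-thirds more than the sum originally lent.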

Still, it is a commonplace that the United States has never defaulted on its debts.  That reassuring belief keeps people buying Treasury bonds when the deficit and national debt keep growing to extraordinary levels.  Except, maybe Bill Gross when he was at PIMCO.[4]

[1] That’s probably why “we” never learn from the past, even though individuals often do.  There is no way to transmit the acquired knowledge.  Then why study History at all?  Because smart people will be among the few who learn lessons and, for everyone else, it’s pretty entertaining.

[2] Sebastian Edwards, American Default: The Untold Story of FDR, the Supreme Court, and the Battle over Gold (2018).

[3] Worth about $380 billion in 2018 dollars.

[4] https://www.theatlantic.com/business/archive/2011/03/pimcos-gross-asks-who-will-buy-treasuries-when-the-fed-doesnt/72276/ ; https://www.theatlantic.com/business/archive/2011/05/bill-gross-on-deficits-and-the-fed/238682/

Chain Migration.

From 1789 to 1808 the United States had a policy of unrestricted immigration; from 1808 to the 1920s the United States had a policy of unrestricted immigration for people of European origins; and from the 1920s to the 1960s the United States had a policy of restricted immigration that favored people from Northwestern Europe.[1]  These changes reflected struggles between economic necessity and national identity.

In 1960, 70 percent of immigrants came from Europe.[2]  Early in 1964, in a little noticed part of his campaign for a “Great Society,” President Lyndon B. Johnson proclaimed that “a nation that was built by immigrants of all lands can ask those who now seek admission ‘What can you do for our country?’  But we should not be asking ‘In what country were you born?’”  The election of a liberal Congress in November 1964 opened the flood-gates for a host of long-stalled reforms.[3]

The new immigration law struck a compromise between the traditional policy that prioritized immigration from northwestern Europe and a new policy that prioritized candidates with skills and education needed by the United States.  Conservatives chose family re-unification as the device for defending the traditional sources of immigration.  The resulting “Immigration and Nationality Act” of 1965 capped annual immigration at about a million people and assigned about 80 percent of the slots to “family reunification” candidates, but only about 20 percent to “needed” candidates.  Moreover, the definition of eligible family members expanded from spouses and minor children to include adult children, brothers and sisters, and parents.

What looked to be a resounding victory for conservatives turned out to be something else entirely.  While the Irish and Italians continued to migrate in droves from desperately broken societies, the rest of Europe dried up as a major source of migration to America.  Britain, France, and Germany were both short of labor themselves and building “social” states that offered steadily rising standards of living for most people.  Eastern Europe lay within the Soviet empire, from which few could escape.  As a result, the large share of family reunification slots increasingly flowed toward the previous minority sources of Asia, Latin America, and Africa.  By 2010, 90 percent of immigrants were from non-European sources.

Is there anything wrong with this approach?  From the economic point of view, there is—at least in some eyes and some ways.  On the one hand, traditionally, most immigrants came to America as young people seeking economic opportunity and political freedom.  They found a hard and demanding land that gave nothing away and insisted that immigrants assimilate to an “Anglo-Saxon” culture.  America ended up with lots of adaptable strivers.  An Organization for Economic Cooperation and Development (OECD) study has reported that skill-based immigrants are more likely to be younger, better educated, more fluent in English, and quicker to get work than are the family-based immigrants.  Thus, American immigration policy misses the opportunity to fully enrich the country’s human capital.  On the other hand, a battle over limiting or reducing immigration is counter-productive for a country that is short of skilled labor and likely to suffer slower economic growth as a result.

So there is a case for immigration reform.  However, it should involve shifting (even reversing) the distribution of slots between “family” and “skill” immigrants.  Of course, even this solution dodges the question of whether the United States should be aggressively recruiting from countries with a dim future—like Taiwan.

[1] From 1808 the involuntary immigration of African slaves was restricted; from the 1880s Asian immigration to the West Coast was restricted; and from 1924 the immigration of people from southern and eastern Europe was restricted.

[2] Greg Ip, “Kinship Emerges as Immigration Flashpoint,” WSJ, 18 January 2018; Tom Gjelten, “The Curious History of ‘Chain Migration’,” WSJ, 20-21 January 2018

[3] See: Julian Zelizer, “The Fierce Urgency of Now.”  Greg Ip argues that Johnson saw immigrants as deserving the same right to equal treatment without regard to race that he wished to ensure for American citizens.

Annals of the Great Recession XV.

The TARP and the stimulus bill were intended to promote recovery from the financial crisis of 2008-2009.  What about preventing a re-run in the future?  The Dodd-Frank Act required banks to hold larger capital reserves and to submit to “stress tests” to evaluate how well they could deal with a future financial crisis on the scale of 2008.  Curiously, the law also limited the trade in “credit default swaps.”  Admittedly, the wholesale trade in these insurance policies against a collapse of the bubble seems to have been what sank the AIG insurance group.  On the other hand, they were an investment by people who saw the bubble for what it was rather than blindly believing what they were told.

One effect of the new legislation appears to be that it has encouraged the consolidation of the banking system.  It has been argued that the costs of complying with the new regulations are more than smaller banks can bear, so they have sold out to already big banks that are better able to shoulder the burden.

It is said that generals are always preparing to fight the last war.  Banks and investors are on guard against sub-prime mortgages.  However, “bubbles” can develop in any asset.[1]  So, some kind of new crisis is always possible.  Can the government and the financial system respond effectively to a new crisis?  The answers are not encouraging.

First, a flight from Keynesian demand-management policies followed quickly on the financial crisis.  President Bush encountered considerable difficulty in getting Republicans to accept the TARP.  President Obama opted for a stimulus bill that Paul Krugman warned was half as big as it needed to be, was spread over two years instead of front-loaded into one, and contained a bunch of tax cuts that recipients would use to reduce debt instead of engaging in new spending.  Both Republicans and Democrats have proved critical of deficit spending plans.

Second, in the absence of a Keynesian policy on the part of Congress and the President, the Federal Reserve launched a long program of “quantitative easing.”  It bought huge amounts of both mortgage-backed securities (MBSs) and U.S. Treasury debt as a way of pumping money into a slow-recovering economy.  It has only recently begun to unwind this position and to raise interest rates.  With rates still so low, it would be difficult to counter a new recession by cutting them.

There may also be a deep hostility to government intervention on the part of many voters.  The policies that saved the American—and world—economy from a new Depression looked very much like a privatization of gains and a socialization of losses.[2]  Thus, in 2007, the top 10 percent of income-earners held 71 percent of the nation’s wealth; now the top 10 percent hold 77 percent.  That is a relative increase of about 8 percent.  The Fed’s quantitative easing pushed up asset prices at a time when ownership of stocks and bonds is concentrated in the upper income groups.
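For clarity about what that increase means (my arithmetic, using the shares just cited): the top 10 percent’s share rose by six percentage points, which is a relative increase of roughly 8.5 percent.

\[ 77 - 71 = 6 \text{ percentage points}, \qquad \frac{77 - 71}{71} \approx 8.5\%. \]

The same six-point shift, measured against the bottom 90 percent’s smaller starting share, produces the roughly 20 percent relative drop discussed below.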

In 2007, the bottom 90 percent of earners held 29 percent of the nation’s wealth; today the bottom 90 percent hold 23 percent.  That is a relative drop of roughly 20 percent in the share held by the vast majority of Americans.  Even so, it is worse for some than for others.  Back in 2007, the median lower-income family had about $18,000 in assets.  Today they have about $11,000 in assets.  Doubtless that fall largely represents the loss of the houses they bought without being able to pay for them.  Would Congress tolerate a new TARP or a new stimulus bill?

Maybe.  The combination of the recent tax revisions and the huge spending bill that enjoyed bipartisan support seems likely to massively expand the deficit.  Maybe stimulus is back in style if you put in enough treats for everyone.  Locking up a bunch of bankers might have to be one of those treats.

[1] See: Alexandre Dumas, The Black Tulip (1850).

[2] President Obama may have contributed to this with his denunciation of the rich as “the people who tanked the economy.”  Bill Gates and Warren Buffett tanked the economy?

Annals of the Great Recession XIV.

To review, the presidents from 1981 to 2017 were Ronald Reagan (1981-1989), George H.W. Bush (1989-1993), Bill Clinton (1993-2001), George W. Bush (2001-2009), and Barack Obama (2009-2017).  The chairs of the Federal Reserve were Alan Greenspan (1987-2006), Ben Bernanke (2006-2014), and Janet Yellen (2014-2018).  So, those are the people upon whose watch various things happened.[1]

Between 1997 and 2006 the government eased regulations on lending and encouraged home-ownership among new groups.[2]  Mortgage originators—banks or mortgage companies—did what they were allowed and even encouraged to do: they issued mortgages (loans) to “sub-prime” borrowers.[3]  These amounted to hundreds of billions of dollars of risky loans.  Rather than hold these dangerous loans on their own books, the loan originators re-packaged the mortgages as collateralized debt obligations (CDOs) and mortgage-backed securities (MBS), then sold these packages to investors.[4]  With many previously-excluded buyers seeking a limited stock of housing, housing prices rose by a national average of 124 percent.  The value of the CDOs and MBSs also rose.  Prices for both exceeded their real value.[5]

Then, in 2007 and 2008, it became apparent why sub-prime borrowers had previously had trouble getting loans.  The number of defaults started to rise sharply.  The MBSs and CDOs dropped toward their real value.  Financial institutions that had purchased these “instruments” suddenly found immense sums wiped off the asset side of their ledgers without their liabilities (what they owed other people) being reduced.  Bankruptcy loomed for the banks unless they could get rid of these dogs in a hurry and replace them with more valuable assets.  First Bear Stearns, and then Lehman Brothers, failed.  Seeking to stop the bleeding, banks pulled in the reins on all lending, including for productive investment.  The whole economy rapidly slowed during 2008.  The Dow Jones Industrial Average fell by 50 percent.  This reduced the value of many assets held by the upper and middle classes, causing them to cut spending in order to reduce their own debts.  With consumption spending and investment both falling, the unemployment rate jumped to 10 percent by late 2009.
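A stylized balance sheet shows why those write-downs were so dangerous (purely illustrative numbers of my own, not drawn from any actual bank).  Suppose a bank holds $100 in assets, funded by $90 in borrowing and $10 in shareholders’ equity.  A 15 percent fall in the value of its assets leaves

\[ \$100 \times (1 - 0.15) = \$85 \text{ in assets against } \$90 \text{ in liabilities}, \]

so the equity cushion is gone and the bank is insolvent unless it can quickly raise new capital or unload the damaged assets.  That is why even a moderate drop in MBS and CDO prices pushed highly leveraged institutions to the brink.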

Acting quickly, the George W. Bush administration pushed through the Troubled Asset Relief Program (TARP), which authorized the purchase of up to $700 billion worth of bad debt from the banks.  The Obama administration launched a mini-Keynes stimulus program of $787 billion.  The Federal Reserve cut interest rates to near zero and held them there for a long time.

[1] “The long shadow of the financial crisis,” The Week, 13 April 2018, p. 11.

[2] In part, this seems to have had a worthy purpose.  Houses are a key middle-class asset, but “red-lining” by banks had long restricted access to home purchases by African-Americans and other groups.  See: https://en.wikipedia.org/wiki/Redlining

[3] Sub-prime borrowers are ones with poor credit-worthiness.  For an explanation of how credit-worthiness is determined, see: https://www.investopedia.com/terms/f/ficoscore.asp  Very often, these are referred to in public discourse as “sub-prime loans,” as if the problem existed only with “predatory” lenders.  This seems to me to resemble referring to illegal immigrants as “un-documented immigrants,” as if the only problem is a bureaucratic foul-up with issuing them some documents.

[4] Apparently, it was possible for purchasers to discern that the CDOs and MBSs were very risky—and possibly worthless—investments.  Most people did not do so.  A few did.  See: Gregory Zuckerman, The Greatest Trade Ever (2009) and Michael Lewis, The Big Short (2010).  The bets against the housing bubble were called credit default swaps.

[5] This is called a “bubble.”

Memoirs of the Addams Administration 31.

After the latest (but perhaps not last) attempt to “repeal and replace” the Affordable Care Act (ACA), some Republicans have fallen back.  Lamar Alexander suggested a bipartisan effort to “stabilize and strengthen” the ACA.[1]  Will President Trump accommodate himself to this inconvenient truth?  The president could scuttle the ACA’s healthcare marketplaces by refusing to authorize the payment of the subsidies that enable “Cost Sharing Reductions” in premiums.  Under the Obama administration a federal judge held that payment of the subsidies without a Congressional appropriation is illegal.  The case awaits final resolution on appeal, but the Trump administration has continued to make the payments in the meantime.  Halting the payments would lead to an estimated 19 percent jump in premiums nation-wide.  Does Donald Trump want to shove millions of Americans off medical insurance?

Six months into his administration, President Trump has begun to encounter resistance from fellow Republicans.[2]  They are eager to embrace a strong line against Russia, they can’t do anything to bring a resolution to the “collusion” story, and they’re angry about his verbal assault on Attorney General Jeff Sessions (so recently one of their own).  If Republicans break from the president, he will have little choice but to abandon a legislative agenda in favor of issuing a blizzard of executive orders and vetoing Republican legislation out of spite.  Those will be contested in the courts.  On the other hand, if Republicans break from the president, they will have little chance of advancing their own legislative agenda unless they can unite with Democrats to over-ride a presidential veto.  Of course, cooler heads may prevail.

Playing to his base, the president announced that transgender troops would be barred from further service in the military, and the Justice Department launched an investigation of affirmative action admissions policies at universities.[3]  There may be legitimate reasons for limiting transgender troops in the military.  It isn’t clear that the president knows any of them.  Rather, he seems to have been over-responding to pressure from Christian conservative Republicans.  In any event, the Pentagon said that a tweet is not the same thing as a formal order, that a formal review of transgender service people is under way, and that all troops will continue to be treated with respect.  In terms of affirmative action, there is a sense in some quarters that it has been turned into a system of “set asides” for African-Americans and, to some extent, for Hispanic-Americans.  Given the over-supply of colleges, it doesn’t have much effect.

Under these adverse conditions, a steadier hand in the White House became vital.  President Trump’s churning of his White House staff reached a new stage.[4]   Chief of Staff Reince Priebus and Press Secretary Sean Spicer left, while Anthony Scaramucci became director of communications.  Then Secretary of Homeland Security John Kelly took over as chief of staff.  Next thing you know, Scaramucci got booted out of the White House.  Kelly, a retired Marine Corps general often portrayed in the media as a Drill Instructor screaming orders up the noses of staffers, tried to impose some order.  (No such option appears available to the Congressional Republicans.)  One key task for Kelly will be dealing with leaks from the sieve-like White House.  It will fall to the Justice Department to stanch the leaks from the Trump Resistance within the federal bureaucracy.  Editorials and columnists generally agreed that a far more challenging task lay in the need to wrangle an undisciplined president.

[1] “Health care: What happens now?” The Week, 11 August 2017, p. 6.

[2] “The GOP: Rebelling against Trump,” The Week, 11 August 2017, p. 16.

[3] “Justice Department to target affirmative action,” The Week, 11 August 2017, p. 6.

[4] “Embattled Trump turns to Kelly,” The Week, 11 August 2017, p. 4.

Memoirs of the Addams Administration 30.

“The great thing about hitting yourself in the head with a hammer is that it feels so good when you stop.”  Recently, almost two-thirds (64 percent) of Americans desired the preservation of the Affordable Care Act (ACA) as it currently exists or with reforms of “problem areas.”[1]  Are there “problem areas”?  Yes.  Here are a few examples.  First, there are many people who are caught in a tight spot by earning too much to qualify for subsidies, but too little to be able to afford health insurance.  Second, only a few insurance companies had any experience at providing/pricing health insurance for poor people.  The Obama administration lured many other health insurers into participating in the healthcare marketplaces by promising that all sorts of healthy rubes would pay premiums without needing much care.  Then, the Obama administration failed to enforce the “mandate.”  Many people did not even bother to get health insurance.[2]  The lack of young, healthy fools ready to be gouged for the benefit of older, wealthier people lies at the root of the instability in the healthcare marketplaces.  Third, the survival of the system depends upon continuing subsidies from appropriations passed by Congress.  The Republicans have declined to pass such appropriations and a federal court has held that spending without an appropriation is unconstitutional.  This case has not yet been finally resolved on appeal.  When it is, the courts seem likely to support the initial decision.

Then, there are all the bad-press issues.  President Barack Obama said that “If you like your insurance, you can keep it” (or words to that effect).  Then a lot of insurance policies that did not meet the new standards were cancelled as “garbage policies,” even though many policy-holders really liked those policies.  The “roll-out” of the healthcare.gov web-site was a humiliating mess.  The Supreme Court held that the extension of Medicaid could not be forced, by the threat of withholding other Medicaid funding, on states that didn’t wish to participate.[3]  Naturally, these colossal screw-ups colored the perception of the ACA for a time.  Now, however, with the ACA an established—if imperfect—reality,[4] Republicans might do well to concentrate on remediation.

Such remediation might consist of getting rid of the ACA mandate on what must be covered; getting rid of the “mandate” that everyone must be covered; allowing/encouraging a few experienced companies to provide insurance for previously uninsured Americans[5]; expanding the range of those people eligible for subsidies; appropriating the moneys needed to make the system work; and not trying to coerce states that don’t want to expand Medicaid.

This will not be easy for Republican law-makers to do.  It abandons ideas of personal responsibility, to which many Republican voters are committed.  It expands spending, when we are already neck-deep in red ink.  On the other hand, it will not be easy for Democratic law-makers to do either.  It abandons the idea of “equal access” to health care, and it gives up yet another exercise of federal power over state autonomy.

Then we can argue about how to close the budget deficit.  Another difficult task.

[1] “Poll Watch,” The Week, 11 August 2017, p. 17.

[2] About 15 million people did get health insurance solely out of fear of the Internal Revenue Service (IRS).  These 15 million resent having to buy something they don’t need and constitute the core of those people who would “lose” this insurance under various Republican plans.

[3] This class-based program of medical insurance covers many Trump voters as well as the voters whom the Trump voters despise for other reasons.

[4] Like Social Security, Medicare, and the Espionage Act of 1918.

[5] Yes, I understand that this will create a two-tier medical care system.  “What are you, fresh offa da boat?  Expect that the streets are paved with gold?  ‘Merica is hard place to live.  Still, is better than the Old Country.”  NB: Imagined monologue, not a quote from the text.