The New Economy.

Once upon a time, most American workers were essentially independent contractors: small farmers selling to the local market or craftsmen with their own shops. Then came the Industrial Revolution and massive immigration. Armies of semi-skilled employees replaced the independent contractors and petty entrepreneurs. Giant corporations arose to manage the mass-production industries. Much hand-wringing and teeth gnashing followed. Unions and government both stepped in to regulate the working time, working conditions, and pay of the industrial armies. Much hand-wringing and teeth gnashing followed. This economy flourished through the 1970s.

Then began the great disruption of the American economy. Foreign competition returned to the global market long dominated by Americans (1945-1975). The “oil shocks” (1973, 1979) set off severe inflation and pushed foreign car-makers toward fuel efficiency. American labor unions not only refused to adapt; they went on the offensive, launching a tidal wave of strikes intended to defend and expand their existing benefits. Companies responded by moving jobs to “right-to-work” states and overseas. Much hand-wringing and teeth gnashing followed.

Then, by 1991, Communism and the centrally planned economy had been defeated. China and other socialist countries began a rapid shift toward open markets. Many American jobs disappeared overseas (although Americans were, short-sightedly, prone to blame NAFTA). Much hand-wringing and teeth gnashing followed. Thereafter, Americans struggled to find some new way of making an adequate living.

Then came the “Great Recession.” Today, about one-third of American workers work part-time, as temps, or day by day. This shift, to my mind, has been one of the great economic and political preoccupations of the last twenty years.[1]

Uber, the ride-sharing service, and Airbnb, the home-sharing service, are often cited as the forerunners of a new “sharing” economy. One element of Uber’s business plan has been to define Uber drivers as “independent contractors,” rather than as employees. The upside of this is great efficiency and flexibility for both Uber and its drivers, not to mention the savings on labor costs like benefits. For Uber, the drivers are doing piece-work; the drivers, for their part, get to structure work around other aspects of their lives by choosing when and how much they work.

On the other hand, it drives Democrats and their clients in the “old” industries crazy. Independent contractors have no right to unionize; they have no right to benefits; they aren’t subject to government regulation; they don’t get compensated for wait-time; they can work for two different companies; they are all profit-oriented, rather than submissive to the moral strictures of Democratic voters; and they’re entrepreneurial, rather than locked into a known and established institution.[2]

Probably, the goal should be to prevent the exploitation of independent contract labor, rather than to stifle economic change and innovation. That would require treating these workers as occupying some sort of middle ground: Social Security and Medicare withholding should apply, and they should be part of pools for health insurance. The “gig economy” should have to succeed on the strength of its business model, rather than by “screwing labor down to the lowest peg,” as was so often the case in early industrialization. At the same time, Washington shouldn’t try to create a Greek economy.

[1] Greg Ip, “New Rules for the Gig Economy,” WSJ, 10 December 2015.

[2] Alas, this litany of differences suggests that the “normal” American working conditions are unsustainable in a competitive global economy.

Just another BRIC in the wall.

The surge in economic globalization since 1990 raised up some countries as potential rivals to the long-dominant Western industrial nations. In particular, Brazil, Russia, India, and China seemed poised to equal or surpass the economic power of the old leaders before the middle of the 21st Century. They have been labeled the “BRIC” countries.[1] Unfortunately, “many’s the slip between the cup and the lip.”

As late as 2010, Brazil possessed an apparently dynamic economy.[2] Great results were expected from the oil reserves discovered off-shore,[3] and the economy was growing at a rate of 7 percent a year. The government took advantage of the boom to embark on a generous social policy: cash transfers and easy credit raised 40 million Brazilians into the middle class and expanded consumption. The government financed these policies through deficits rather than through higher taxation on the wealthy. So long as the economy continued to expand, there seemed little danger from these please-everyone policies.

However, aspirations aren’t the same as achievement. Brazil remained a chiefly primary-product (agriculture, mining, forestry) economy. Sugar, soybeans, coffee, and oil were all major exports. Therefore, the Great Recession hit Brazil’s export sector very hard: Chinese and German imports of Brazilian goods slumped; the prices for its main crops sank, some by as much as a third.

By 2015, the whole process seemed to have gone into reverse: economic growth has stalled and the country teeters on the edge of recession; government debt has expanded at an alarming rate as the government tries to keep promises made in happier times, and bond-rating agencies have downgraded Brazilian government debt; the nominal inflation rate is 8 percent a year, but interest rates are at 13 percent (so people may believe that inflation is higher than official statistics claim). The standard solution to such problems is “austerity”: cut government spending and increase tax receipts to reduce borrowing. Cutting government spending means, in large part, reducing the pay and benefits of public-sector employees. They aren’t happy with this reversal of course.

To make matters worse, corruption is endemic in many developing countries, and that includes Brazil. Soon after the Workers Party, under President Luiz Inácio Lula da Silva, took power in 2003, an official investigation began into accusations that the party had bribed legislators to get the government’s policies through the legislature. In 2013, Brazilian police began an investigation into the giant state-owned oil company, Petrobras. The reports so far indicate that Petrobras had been over-paying contractors, who then kicked back part of the profits to the Workers Party. Police arrested the party’s treasurer, along with many other politicians and businessmen. A proposal by the current president, Dilma Rousseff, Lula da Silva’s successor, to limit the ability of prosecutors to investigate charges of corruption against politicians stinks to high Heaven. Many Brazilians are enraged over austerity and corruption.

The recent performance of the BRIC economies has been uneven, but long-term performance is what matters for shifting the world balance of power and prosperity.

[1] See: http://en.wikipedia.org/wiki/BRIC

[2] “Brazil’s economic catastrophe,” The Week, 5 June 2015, p. 11.

[3] Early (and apparently exaggerated) reports on exploration of the off-shore reserves suggested that there was $1 trillion worth of oil and gas. However, off-shore drilling almost forty miles out in the Atlantic has its challenges, and much more modest estimates of the extent of the reserves have begun to come in. The recent decline in world oil prices has also reduced the value of whatever reserves do exist.

The Opinionated American Public.

American religious affiliation:

70.6 percent: Christian in some way, shape, or form.

23 percent: None. I’m not sure that this tells us very much about their social views.

21 percent: Catholic.

15 percent: “mainstream” Protestants.

If 93.6 percent of Americans are Christians or “nones,” then the remaining 6.4 percent are Jews, Muslims, Buddhists, Hindus, and other faiths that don’t come to mind at the moment.

If 70.6 percent of Americans are Christian and 36 percent are either Catholic or “mainstream” Protestant, then about 34 percent belong to some other variant of Christianity. This suggests that something between 30 and 34 percent of Americans belong to non-mainstream Protestant churches. OK, there is a small bunch of Seventh-day Adventists and Jehovah’s Witnesses, and I don’t know if Mormons get counted as “mainstream” Protestants. However, the majority probably belong to the free churches that dot the suburbs.[1] (Probably no role for Bing Crosby as the minister of one of these churches.) However, if 15 percent of Americans are in “mainstream” Protestant churches and 30 percent are in non-mainstream Protestant churches, then the non-mainstream Protestants would appear to be the mainstream and the mainstream Protestants the non-mainstream. If you see what I mean. The media just haven’t caught up to this reality. It’s a “Christian” country and, within that, a “Protestant” country.

The decade and a half since 9/11 has been hard on American views of Islam. More than half (55 percent) of Americans have an unfavorable view of Islam, while 21 percent have a favorable view. Almost a quarter of Americans aren’t sure.[2] The math says that a lot of the “favorables” and “not sures” must come from the 70.6 percent who self-identify as Christians.
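For the record, here is the back-of-the-envelope arithmetic behind the last two paragraphs, written out as a minimal sketch. It uses only the poll figures quoted above; the rounding is mine.

```python
# Back-of-the-envelope arithmetic using the poll figures quoted in the text.

christian = 70.6            # percent identifying as Christian in some form
catholic = 21.0             # percent Catholic
mainline_protestant = 15.0  # percent "mainstream" Protestant
nones = 23.0                # percent with no religious affiliation

# Christians outside the Catholic and mainline-Protestant blocs
other_christian = christian - (catholic + mainline_protestant)
print(f"Non-mainstream Christians: about {other_christian:.1f} percent")   # ~34.6

# Everyone who is neither Christian nor a "none"
other_faiths = 100.0 - (christian + nones)
print(f"Other faiths: about {other_faiths:.1f} percent")                   # ~6.4

# Views of Islam: "favorable" plus "not sure" exceeds the non-Christian
# share of the population, so some of those respondents must be Christians.
favorable, unfavorable = 21.0, 55.0
not_sure = 100.0 - favorable - unfavorable
non_christian = 100.0 - christian
print(f"Favorable or unsure: {favorable + not_sure:.1f} percent; "
      f"non-Christians: {non_christian:.1f} percent")
```

Nothing fancy: 21 plus 24 is 45, and only about 29 percent of Americans are neither Christians nor “nones,” so the favorable and unsure respondents cannot all come from outside the churches.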

The Republicans opposed gay marriage. How did that work out for them? The Republicans are opposed to illegal immigration.[3] A recent poll showed that 29 percent of Americans want to round up and ship home all the illegal immigrants now in the United States. In contrast, 57 percent of Americans, essentially twice as many, want to let them stay and grant them the right to apply for citizenship. Only 11 percent favor granting the illegals “green cards” to stay in the United States, but barring them from pursuing citizenship.[4]

Savings patterns by income group are a sort of opinion poll.[5] “How important is it to save for a rainy day or the monsoon of old age?” Eight percent of lower-income households save more than 15 percent of their income; 25 percent of households earning between $50K and $75K do; and 17 percent of higher-income households do.

[1] “Poll Watch,” The Week, 22 May 2015, p. 15.

[2] “Poll Watch,” The Week, 24 April 2015, p. 17.

[3] “Poll Watch,” The Week, 22 May 2015, p. 15.

[4] Disclosure: this is my own position. The illegals came to the United States illegally. They can’t be allowed to crowd in ahead of people who took their turn. To do so would be to privilege those immigrants who have the easiest access to the United States across a land border over those who would have to cross the Pacific or the Atlantic. I admit that this is an argument that will resonate more in Britain than in France or Italy. On the other hand, I’m also in favor of open borders. Massive immigration of ambitious people would do the USA a lot of good. However, I’m also in favor of democracy and the rule of law. The fact that a lot of Republican businessmen want cheap labor and a lot of Democratic politicians imagine that Hispanics will vote Democratic doesn’t mean that the laws should just be ignored.

[5] The Week, 10 April 2015, p. 30.

Toward the cliff.

In brief compass, the “supply side” theories of the Reagan administration de-stabilized the traditional budget by cutting taxes without cutting expenditures.[1] Deficits expanded. However, observers were more concerned about the budget deficits that would be driven by the cost of entitlements—Medicare, Medicaid, and Social Security—for Baby Boomers. If one takes as a given that government can only account for some fixed share of GDP, then the growth of entitlements will crowd out spending on other areas: defense and the wide range of government functions labeled as “discretionary spending.”[2] These entitlements are so popular and the mythology surrounding them so powerful that the elected representatives in a democratic polity were unwilling to address them. The problem festered.

Then came the Great Recession. In 2009, the government’s deficit peaked at over 10 percent of Gross Domestic Product (GDP). By 2013, the Congressional Budget Office (CBO) was projecting that the deficit would soon fall to 2.1 percent of GDP. Indeed, in September 2013 the CBO projected that short-term government deficits would keep shrinking, thanks to the economy’s recovery from the Great Recession and the cuts enforced by “zee zequester.” So, the deficit has been mastered. We’re good, right?

Well, no. The deficit arising from the Great Recession has been mastered. However, that was a matter of course. Counter-cyclical deficit spending has been the normal response to recessions for half a century. Economic activity revives, spending falls, and tax revenues increase, so the deficit goes away.[3]

However, the deficit arising from entitlement programs has not been mastered. Or addressed. Or even acknowledged. Social Security, Medicare, and Medicaid are about to rise sharply in total cost as the fabled Baby Boomers begin to retire in droves. From 1973 to 2013, spending on Medicare, Medicaid, and Social Security averaged 7 percent of GDP a year; the CBO projects that it will rise to 14 percent of GDP by 2038. By 2023, the deficit is projected to rise to 3.5 percent of GDP; by 2038 it would reach 6.5 percent. The trouble is that the economy will not grow as fast as government spending does. Moreover, federal revenues are projected to rise by only 2 percent of GDP over the same period. Hence, deficits on this scale will drive the federal debt up from the 38 percent of GDP that was the average from 1968 to 2008 to 100 percent of GDP in 2038.
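The jump from annual deficits of a few percent of GDP to a debt of 100 percent of GDP is just compounding. The sketch below is my own rough arithmetic, not the CBO’s model: the deficit path and the 4 percent nominal growth rate are assumptions chosen to match the figures quoted above.

```python
# A rough, back-of-the-envelope debt path -- not the CBO's model.
# Assumptions (mine, for illustration): debt held by the public starts
# near 72 percent of GDP at the end of 2014; the deficit rises roughly
# linearly to 3.5 percent of GDP in 2023 and 6.5 percent in 2038; and
# nominal GDP grows about 4 percent a year.

growth = 0.04
debt_ratio = 72.0          # percent of GDP, end of 2014

def deficit(year):
    """Assumed deficit path, percent of GDP (linear interpolation)."""
    if year <= 2023:
        return 2.1 + (3.5 - 2.1) * (year - 2015) / (2023 - 2015)
    return 3.5 + (6.5 - 3.5) * (year - 2023) / (2038 - 2023)

for year in range(2015, 2039):
    # Debt/GDP evolves as: last year's ratio shrinks with nominal growth,
    # and this year's deficit is added on top.
    debt_ratio = debt_ratio / (1 + growth) + deficit(year)

print(f"Debt held by the public in 2038: about {debt_ratio:.0f} percent of GDP")
```

With those assumed inputs the ratio lands right around 100 percent of GDP, which is the point: deficits of that size, growing faster than the economy, do the damage on their own.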

Neither Republicans nor Democrats have shown any willingness during the Obama Administration to address this important long-term problem. The administration has concentrated its efforts on raising taxes on the higher income groups, rather than on trying to contain or reduce costs. The Republicans have concentrated on trying to repeal the Affordable Care Act, rather than on trying to address the ballooning costs of entitlement programs.

Nor is it likely to emerge as a major issue in the 2016 presidential election. Older people vote in larger percentages than do younger people. No one has yet formulated a way to deliver the same quality of medical care or retirement income at a much lower cost. No one yet has formulated a way to raise substantially larger tax revenues from all Americans.

[1] Jackie Calmes, “Budget Office Warns That Deficits Will Rise Again Because Cuts Are Misdirected,” NYT, 18 September 2013.

[2] Obviously, one does not have to agree that some fixed share of the economy should be devoted to government spending. Certainly Senator Sanders does not.

[3] See Paul Krugman’s terse, withering evaluation of President Obama’s performance in this regard in NYT, 8 May 2015.

Rise of the Machine Minders.

Back in August 2014, Tyler Cowen, an economics professor at George Mason University, got the idea that students returning to college for Fall Term needed to hear his advice on the keys to success.[1] He wrote a column for the New York Times. Here, in a nut-shell, is what he said.[2]

He took as his point of departure the proliferation of “thinking” machines in the economy. Industrial robots threaten factory workers; self-driving vehicles threaten truck drivers, if not other motorists; drones threaten airline pilots, did they but know it; e-discovery research software threatens lawyers and college professors. What are we non-mechanicals supposed to do for jobs in this dawning era? What are the keys to success?

Being conscientious is one key. The opportunity to do something and actually doing it are two different things.[3] If students or workers need something more corporeal than a gnawing anxiety to keep them relentlessly on task, they’re in for a bad time.

Listening to the machines is a second key. GPS-based driving instructions delivered by an Oxbridge-educated woman of a certain age are just the beginning.[4] Pretty soon our smart phones will be tuning-up our decision-making in many areas. Deciding, like Doc Boone in “Stagecoach,” to wave off the warning and have another drink will have a cost.

Remembering that Price reflects the ratio between Supply and Demand is a third key. Stuff that is rare will command a higher price than stuff that is common. In human terms, people need to work on how they present themselves and how they interact with other people.

The Return of Calvinism is a fourth key. Either people are internally motivated or they are externally motivated. If an employer wants to keep down labor costs, then an effective “Atta boy” or “You slime” can replace a bonus as a motivator. People who know which incentive to choose will thrive in the new economy.

Developing a Thick Hide is a fifth key. Computer programs will be able to measure productivity and some other aspects of employee performance.[5] (Not all aspects, just some other aspects.) Workers at every level will get turned into a mathematical formula.[6] You need to be able to learn from a harsh performance review, get up, and move on.

Remembering Harold Lamb is a sixth key. One of his characters was a 17th Century Cossack who bought a pair of expensive leather boots to demonstrate that he had money and then spilled tar on them to demonstrate that he didn’t care about money. Lots of young workers are libertarian-subversive in this way.

Recognizing that machines under-cut the price advantage of cheap (Asian) human labor is a seventh key. A higher class of Nineteenth Century “machine minders” is on the horizon. That will be good for the right American workers.

Your quick conceiving discontent will have noticed that none of this has anything to do with which major a student chooses. It is all about what kind of person chooses that major.

[1] My guess would be that he was fed up with the stuff that his teaching assistants had been telling him and with the results of the Generals Examinations of the graduate students.

[2] Tyler Cowen, “Who Will Prosper in the New World,” NYT, 1 September 2014.

[3] See: Dorothy Parker.

[4] You ever wonder if people into BDSM choose some German dominatrix’s voice? Just asking.

[5] The SEC wants to track executive compensation against company performance. The first rule for plumbers is that shit runs downhill. The second is to not bite your fingernails.

[6] This will lead to its own disasters. Both life experience and literature indicate that there are non-performers who are vital to the functioning of an organization. So, judgment and experience will be vital.

Annals of the Great Recession VIII.

When we say “investors” we naturally think of Thurston Howell III from “Gilligan’s Island.” Nothing could be further from the truth in contemporary America. Now “investors” means banks, insurance companies, hedge funds, and pension funds. Many of these investors are, in turn, owned by mutual funds. These investors had a lot of money to throw around and they wanted safe investments.[1] The banks addressed this dual problem by creating Collateralized Debt Obligations (CDOs). Essentially, a CDO is a super-bond that groups together many smaller loans. So, a CDO is a big financial instrument appropriate for a big investor. At the same time, the CDO addressed the safety problem by bundling the few loans anticipated to default with the many that were expected not to default. These CDOs proved wildly popular with investors: $550 billion worth were issued in 2006 alone.

For a combination of reasons, the risky, or “sub-prime,” share of mortgages greatly expanded. Rather than trying to rein in the “sub-prime” risk, lenders relied on the safety features of the CDO (many presumably sound mortgages bundled together with a handful of presumably bad ones). Furthermore, other companies sold insurance on the derivatives, so they seemed very safe. The market in these “financial derivatives” just exploded. Less noticed, many of the loans were also adjustable-rate mortgages (ARMs), which allowed the lender to increase the interest rate charged the borrower if interest rates in general began to rise.
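Why the bundling logic seemed sound, and why it wasn’t, comes down to whether defaults strike independently or all at once. The toy simulation below is my own illustration, not any bank’s actual model; the default rates and the 8 percent loss threshold are invented numbers.

```python
import numpy as np

rng = np.random.default_rng(0)
N_LOANS, N_TRIALS = 1000, 50_000

def share_of_bad_pools(default_probs):
    """Fraction of simulated pools in which more than 8% of the loans default."""
    defaults = rng.binomial(N_LOANS, default_probs)   # defaults per pool
    return float(np.mean(defaults / N_LOANS > 0.08))

# Benign assumption: every loan defaults independently with 5% probability.
independent = share_of_bad_pools(np.full(N_TRIALS, 0.05))

# Correlated assumption: one pool in five sits in a "bad year" (say, a
# nationwide house-price decline) in which every loan's default
# probability jumps to 15% at the same time.
bad_year = rng.random(N_TRIALS) < 0.20
correlated = share_of_bad_pools(np.where(bad_year, 0.15, 0.05))

print(f"Share of pools losing badly, independent defaults: {independent:.3f}")  # ~0.000
print(f"Share of pools losing badly, correlated defaults:  {correlated:.3f}")   # ~0.200
```

The same pool of a thousand mortgages looks rock-solid under the first assumption and goes bad almost every time the common shock hits under the second. Roughly speaking, that is the difference the rating models missed.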

Then, in the second half of the 2000s, the whole process went into reverse.[2] The Federal Reserve raised the Federal Funds Rate from 1 percent in Summer 2004 to 5.25 percent in 2006, then left it there until Summer 2007.[3] Interest rates began to rise and housing prices began to drop. The adjustable-rate mortgages followed the track of interest rates in general, squeezing many marginal home-owners to the point where they could not service the mortgage at all. Defaults suddenly began to mount, leading to foreclosures, leading to a glut of homes on an already falling market, leading to a further decline in the value of all homes.
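The squeeze on adjustable-rate borrowers is easy to see with the standard amortization formula. The figures below are illustrative numbers of my own, not from the articles cited, and the comparison simplifies by re-amortizing the full balance at the new rate.

```python
# Illustrative numbers (mine): how a rate reset squeezes an ARM borrower.
# Standard fixed-payment amortization: payment = P*r / (1 - (1+r)^-n)

def monthly_payment(principal, annual_rate, years=30):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

before = monthly_payment(200_000, 0.04)   # initial, low-rate period
after = monthly_payment(200_000, 0.07)    # after the reset tracks higher rates

print(f"Payment at 4%: ${before:,.0f}/month")   # about $955
print(f"Payment at 7%: ${after:,.0f}/month")    # about $1,331
```

A payment that jumps by roughly 40 percent is manageable for some households and ruinous for the marginal borrowers to whom many of these loans had been made.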

The real trouble lay in the opacity of the CDOs. Once the defaults started to mount, it proved impossible to tell with any certainty how solid any one CDO was. It might be made up of mostly good loans with a few dogs mixed in. It might be a veritable animal-rescue society with a few good loans mixed in. As Peter Peterson put it, “you’ve got ten bottles of water and one of them is poisoned; which one do you drink?” There was no way to tell, so people did the safe thing by distrusting all of them.

As the number of worthless mortgages inside the “bundles” of mortgages bought by investors rose sharply, the value of the securities plunged. Banks that had bought these securities as part of their capital suddenly found their balance sheets showing huge losses. Worse still, the companies which had sold insurance on the derivatives found that they had misunderstood the degree of risk of default and did not have the resources necessary to cover their own losses. Banks started refusing to lend to other banks out of fear that the loans would not be repaid. Suddenly, the whole financial system seemed to be on the verge of collapse.

The United States had been through this once before, in the early stages of the Great Depression of the Thirties. Inadequate government action then had led to more than a decade of hardship, misery, and political upheaval. This time would be different. Sort of.

[1] “The ‘toxic debt’ tsunami,” The Week, 20 March 2009, p. 13.

[2] “Wall Street’s hidden time bombs,” The Week, 10 October 2008, p. 11.

[3] http://fpc.state.gov/documents/organization/112465.pdf

Annals of the Great Recession VII.

All business decisions are bets on an unknowable future.[1] Faced with uncertainty about the future and the risk that some bets will go bad, businessmen have long sought to build in certainty through contracts and to off-set possible losses through hedging. Commodities futures, promises to deliver a set amount at a set price at some future date, have been contracted for and traded for a long time. Commodities futures guarantee sellers a buyer and an income, while guaranteeing purchasers a product at a fixed price.
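A toy example of the hedge just described, with invented numbers: whatever the spot price turns out to be at delivery, the futures contract fixes the grower’s revenue and the buyer’s cost in advance.

```python
# Invented numbers for illustration: a grower sells a futures contract on
# 10,000 bushels at $5.00; the buyer takes the other side of the contract.
bushels = 10_000
futures_price = 5.00   # dollars per bushel, agreed today for future delivery

for spot_at_delivery in (3.50, 5.00, 6.50):
    hedged_revenue = bushels * futures_price        # fixed by the contract
    unhedged_revenue = bushels * spot_at_delivery   # what selling at the
                                                    # spot price would bring
    print(f"spot ${spot_at_delivery:.2f}: hedged revenue ${hedged_revenue:,.0f}, "
          f"unhedged ${unhedged_revenue:,.0f}")
```

The certainty runs both ways: the grower gives up the upside of a high spot price in exchange for protection against a low one, and the buyer does the reverse.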

If uncertainty is one fixture of business, so is innovation. In the 1990s lenders developed a new form of betting on the future. Housing prices had risen steadily in the United States since the 1970s. Believing that housing prices were on a long-term or even permanent upward track, some lenders perceived mortgages issued today as a promise of secure returns tomorrow. Large numbers of newly-issued mortgages were bundled together into securities which were then sold to investors seeking the promise of above-market rates of return. In all lending there is the danger that the borrower cannot or will not repay the loan. The theory appears to have been that a few bad mortgages in any one bundle would not impair the value of all the other sound mortgages in the security.

Democrats wanted to bring these new financial instruments and markets under federal regulation in the same way that the Securities and Exchange Commission oversees the stock market. Republicans defeated this effort. Indeed, Senator Phil Gramm pushed through a law which exempted such “financial derivatives” from federal regulation. Potentially, the derivatives market had become the Wild West. On the other hand, it was a pretty small market in the later 1990s. What’s the worst that could happen?

The “dot-com boom” was one of the hallmarks of the late 1990s.[2] It turned out to be a bubble, and the bubble popped in 2001. Then the 9/11 attacks administered a second shock to the system. Rather than put the United States through a financial crisis and recession, the Federal Reserve pumped a lot of money into the economy and cut the short-term interest rate from 6.5 percent to 1 percent. Banks borrowed money cheaply, then re-lent it to others at a somewhat higher rate of interest. Pretty soon all the reasonable loans had been made, but there was still a lot of money to lend. What to do?

Make unreasonable loans, that’s what. Mathematical risk models for these loans, based on an extremely shallow historical record, predicted only a few defaults and constantly rising house prices. The usual standards for making a loan to someone were diluted. This allowed banks to lend to people and for purposes that normally would not have been acceptable. Some of it went to home loans that were labeled “sub-prime”; some of it went for auto loans, credit card debt, student loans, and commercial mortgages. In short, it financed a lot of consumption by ordinary Americans that otherwise would not have been possible.

So, the banks and non-bank financial institutions (mortgage originators) made all these loans. What to do with them? One answer would be “sit on them and collect the interest and principal until the loan is repaid, then make another loan.” Another answer would be “sell the loans (i.e. the right to be repaid by the original borrower) to investors looking for a steady income stream.” Mostly the banks chose the latter course. Selling the loans brought in cash immediately and earned fees for the banks. It also transferred the assets, and the risk of default, to the “investors.”

[1] “Wall Street’s hidden time bombs,” The Week, 10 October 2008, p. 11.

[2] “The ‘toxic debt’ tsunami,” The Week, 20 March 2009, p. 13.

Long Term Trends 3.

In 2010 and 2011, New York Times correspondent David Leonhardt turned to the problem of the long-term deficit.[1] In his analysis, that deficit arises from “the world’s highest health costs (by far), the world’s largest military (by far), a Social Security program built when most people died by 70—and to pay for it all, the lowest tax rates in decades.” By Leonhardt’s estimate, we will need about $500 billion a year in annual deficit reduction for the next decade. The money will have to come out of the three big spending categories and from more revenue.

The Medicare budget is the “linchpin of deficit reduction.” Leonhardt recommended introducing incentives for people to choose cost-effective health-care. In practice, that will mean taxing employer-provided health-care benefits like the income they really are. The cost of this tax break has risen massively since 1975. It encourages spending on health-care, rather than using the market to restrain price increases. Also, it is a benefit available only to people with employer-provided health-care, so Americans get taxed differently for no good reason. The exclusion cost the government $264 billion in revenue in 2010, and the subsidy it creates benefits drug makers, hospitals, and insurers.[2] He also wanted Medicare to refuse to pay for health care that cannot demonstrate that it makes people healthier.
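The mechanics of that tax break are simple enough to write out. The salary, premium, and marginal rate below are my own illustrative assumptions, not Leonhardt’s figures.

```python
# Illustrative arithmetic (my own numbers): why excluding employer-paid
# health premiums from taxable income acts as a subsidy.

salary = 60_000
employer_premium = 15_000     # employer-paid health insurance premium
marginal_rate = 0.25          # combined marginal income-tax rate (assumed)

tax_if_excluded = salary * marginal_rate
tax_if_counted = (salary + employer_premium) * marginal_rate

subsidy = tax_if_counted - tax_if_excluded
print(f"Tax saved by the exclusion: ${subsidy:,.0f} per worker per year")  # $3,750
```

Multiplied across the tens of millions of workers with employer-provided coverage, amounts on this order are how the exclusion reaches the hundreds of billions in forgone revenue cited above.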

Test social programs for actual effectiveness. Lots of them just exist, rather than achieving anything. Doubtless, many Republicans would say the same thing about the encrustations of regulations on business and industry that have grown up over the decades without ever being sunsetted.

Last, cut military spending by $100 billion a year to reverse the post-9/11 expansion.

Some leading conservatives (Mitch Daniels, Glenn Hubbard, Gregory Mankiw) have endorsed means-testing Social Security and Medicare benefits. That is, shoving more of the cost of their own care and retirement off on people who can afford to pay it.

Leonhardt also favored higher taxes on the middle and upper classes. The mortgage-interest deduction chiefly benefits people in the higher income brackets. Tax breaks for investors (IRA exclusion, $12.6 billion; lower tax rate for dividends, $31.1 billion; lower tax rate for capital gains, $36.3 billion; not taxing capital gains on items left to people in wills, $39.5 billion; and the 401(k) exclusion, $52.2 billion) total $171.7 billion in “lost” revenue.

Social Security and Medicare are essentially about paying for the past. (Often an improvident past.) Spending on education, research, and infrastructure are about paying for the future we desire. Leonhardt argues that we should actually be spending more on the future.

However, it isn’t clear that American democracy has the ability to confront powerful and well-entrenched interest groups. Reforming Medicare would involve taking money away from doctors, insurance companies, hospitals, and pharmaceutical companies. Stabilizing Social Security will involve raising the cap on withholding, which gores upper-income groups, and means-testing benefits, which also comes at the cost of upper-income groups. Raising taxes on the upper income groups to sustain benefits for lower income groups invited push-back in the past and will do so again. Cutting defense spending has never been as easy as increasing it, especially in the midst of dangers. The refusal of Democrats to define “fair share” makes rational discussion difficult.

Still, laying out the problems and some possible solutions makes it possible to think about the implications.

[1] David Leonhardt, “The Deficit: Real vs. Imagined,” NYT, 22 June 2011.

[2] David Leonhardt, “Health care’s obstacle: no will to cut,” NYT, 10 March 2010; David Leonhardt, “The 3 Biggest Tax Breaks—and What They Cost Us,” NYT Magazine, 17 April 2011.

Annals of the Great Recession VI.

The “Great Recession” led to much head-scratching. How could this disaster have come about? Who or what was responsible? How should we proceed going forward? Many books have poured forth in an attempt to answer these questions. Four of them take a critical look at the materialism that drives modern life. The books raise more questions than they answer. The questions are worth some thought.

People used to live with scarcity we can’t imagine. They had to work very hard just to have enough to survive on. Then industrialization and the agricultural revolution created abundance. People still had to work very hard, first to have enough, then later to have a lot of stuff. Now people ask what to do with this abundance. Considering that question can lead into territory unfamiliar to most people.

The philosopher Michael Sandel argues that people start to think of money as the solution to everything, instead of just as a situationally-appropriate tool.[1]  Moreover, “the more things that money can buy, the more the lack of it hurts.”  Economic inequality leads to experiential inequality.

The writers Robert and Edward Skidelsky identify certain “basic goods”: health, security, respect, “personality,” harmony with nature, and leisure.[2]  How much is “enough”?  Should people continue to produce and consume ever more?  What do people get out of having more stuff or more activities?  Should they make do with less to have more time?  What would they do with more time?   Among their solutions: tax consumption, not income; tax spending on advertising.  What if the prescription for the good life offered by the Skidelskys conflicts with what most people want?

Luigi Zingales argues that Americans used to think that capitalism was a “fair enough” system, even if it wasn’t perfectly fair.[3] The economist Allan Meltzer argues that capitalism is the only system that produces freedom and prosperity.[4] It makes no claim to produce virtue or stability. This means that freedom and prosperity have an ugly reverse face. However, “corruption, fraud, and greed” are common in all other systems, and less likely to be corrected than under capitalism. What concerns Meltzer and Zingales is the government response.

American government tends to provide benefits to selected businesses without thinking about actually helping society as a whole. For example, the mortgage-interest deduction helps the construction industry, but it amounts to a tax on renting that makes renting comparatively less attractive. Similarly, subsidies for ethanol amount to a tax on regular gasoline. Zingales thinks that both education and health-care are “protected” industries. They need to be exposed to competition.

It also tries to control the flaws.  Very often these efforts at “reform” miscarry.  For example, government seeks to correct for inequality of result by means of paying benefits funded by debt (the obligation of future generations to pay for benefits enjoyed by the present generation) or by regulatory systems that favor present established interests over newcomers.

Similarly, people who behave immorally need to be publicly shamed even when they cannot be prosecuted. Bankers have taken their share of this, but someone needs to go after the “jingle-mail” borrowers. They decided “enough is enough” and abandoned their obligations and assets. Probably not the answer the Skidelskys and Sandel had in mind.

[1] Michael Sandel, What Money Can’t Buy: The Moral Limits of Markets (2012).

[2] Robert Skidelsky and Edward Skidelsky, How Much Is Enough?  Money and the Good Life (2012).

[3] Luigi Zingales, A Capitalism for the People: Recapturing the Lost Genius of American Prosperity (2012).

[4] Allan Meltzer, Why Capitalism? (2012).

Which Sides Are You On?

Americans are ambivalent about public unions.  In early industrial capitalism, all the power lay with employers. There were always more people seeking work than there were jobs, while state and local governments were there for the buying. As a result, wages were low, hours were long, working conditions were abominable, and job security was non-existent. Only unions offered any chance at improving the lives of workers. Union-organizing, however, proved to be hard and dangerous work. Employers resisted with every means possible and often did not stop at the edge of legality. Moreover, the very idea of a union clashed with the individualistic values upheld by most Americans. Only with the Depression and the New Deal did mass unionization sweep over heavy industry.

Public-sector unionization did not amount to much for a very long time. For one thing, the large American state is a fairly recent creation. More importantly, most people distinguished between public and private unions. On the one hand, public employment seemed far more secure than did private sector work and often seemed subject to various kinds of patronage. On the other hand, government provided services for which there was no alternative. While breaking a police strike in Boston, Calvin Coolidge declared that “there is no right to strike against the public safety.” Most people agreed with the sentiment for half a century. However, in 1962 President John Kennedy issued an executive order allowing many federal employees to unionize. The movement then spread to the state and local levels. Membership in public-sector unions now outnumbers membership in private-sector unions. Because the courts have upheld the right of unions to collect dues, or their equivalent, from all the employees they represent, unions have deep pockets for political action.[1]

Amity Shlaes argues that there is an important emotional component to public attitudes toward unions. People have a positive view of Franklin D. Roosevelt, and Roosevelt’s New Deal promoted mass unionization. Most people wouldn’t run into a burning building, or pull over a car on a dark night, or try to wrangle a room full of 14-year-olds, so they admire those who will do those things. So, public-sector unions are approved of on an emotional level.[2]

While the national media are interested in labor’s role in national politics, the unions actually focus most of their efforts lower down the food-chain. Local government elections often run in the “off” years between national elections. Turnout is about a third lower in these local elections. When unions can turn out voters and supply campaign funds, they can have a disproportionate impact on the very governments with which they will then negotiate contracts.

Since they depend on union support in elections, Democrats tend to fold up under pressure. Since Americans don’t want to pay more taxes, local governments find their way out of the immediate dilemma by granting generous pension benefits that someone else, in the years ahead, will have to figure out how to pay. We can see the consequences in the balance sheets of some American cities. Dallas, a non-union town if ever I saw one, pays $74 a ton for garbage collection and disposal. Chicago, the union city par excellence now that Detroit has cratered, pays $231 a ton. Speaking of Detroit, in 2013 the city sank under more than $18 billion in long-term debt. Half of that debt was for employee pension and health-care benefits that could not be supported from the shrinking tax base.

Exasperated Republicans just want to cut government services to get rid of the burden of the unions. It’s difficult to see this as anything except a different kind of “strike against the public safety.” As with many things in contemporary America, some fresh thinking is needed.

[1] Daniel DiSalvo, Government Against Itself: Public Union Power and Its Consequences (2015).

[2] Her own sentimental attachments lie elsewhere. See: Amity Shlaes, Coolidge (2013).