The New Economy.

Once upon a time, most American workers were essentially independent contractors: small farmers selling to the local market or craftsmen with their own shops. Then came the Industrial Revolution and massive immigration. Armies of semi-skilled employees replaced the independent contractors and petty entrepreneurs. Giant corporations arose to manage the mass-production industries. Much hand-wringing and teeth gnashing followed. Unions and government both stepped in to regulate the working time, working conditions, and pay of the industrial armies. Much hand-wringing and teeth gnashing followed. This economy flourished through the 1970s.

Then began the great disruption of the American economy. Foreign competition returned to a global market long dominated by Americans (1945-1975). The “oil shocks” (1973, 1979) set off severe inflation and pushed foreign car-makers toward fuel efficiency. American labor unions not only refused to adapt; they went on the offensive, launching a tidal wave of strikes intended to defend and expand their existing benefits. Companies responded by moving jobs to “right-to-work” states and overseas. Much hand-wringing and teeth gnashing followed.

Then, by 1991, Communism and the centrally-planned economy had been defeated. China and other socialist countries began a rapid shift toward open markets. Many American jobs disappeared overseas (although Americans were, short-sightedly, prone to blame NAFTA). Much hand-wringing and teeth gnashing followed. Thereafter, Americans struggled to find some new way of making an adequate living.

Then came the “Great Recession.” Today, about one-third of American workers work part-time, as temps, or as day laborers. This precariousness has, to my mind, been one of the great economic and political preoccupations of the last twenty years.[1]

Uber, the ride-sharing service, and Airbnb, the home-sharing service, are often cited as the forerunners of a new “sharing” economy. One element of Uber’s business plan has been to define Uber drivers as “independent contractors,” rather than as employees. The upside is great efficiency and flexibility for both Uber and its drivers, not to mention the savings on labor costs like benefits. For Uber, the drivers are doing piece-work; the drivers, for their part, get to structure work around the other aspects of their lives, choosing when and how much they work.

On the other hand, the independent-contractor model drives Democrats and their clients in the “old” industries crazy. Independent contractors have no right to unionize; they have no right to benefits; they aren’t subject to government regulation; they don’t get compensated for wait-time; they can work for two different companies; they are all profit-oriented, rather than submissive to the moral strictures of Democratic voters; and they’re entrepreneurial, rather than locked into a known and established institution.[2]

Probably, the goal should be to prevent the exploitation of independent contract labor, rather than to stifle economic change and innovation. This would mean treating these workers as occupying some sort of middle ground. Social Security and Medicare withholding should apply, and the workers should be part of pools for health insurance. The “gig economy” should have to succeed on the strength of its business model, rather than by “screwing labor down to the lowest peg,” as was so often the case in early industrialization. At the same time, Washington shouldn’t try to create a Greek economy.

[1] Greg Ip, “New Rules for the Gig Economy,” WSJ, 10 December 2015.

[2] Alas, this litany of differences suggests that “normal” American working conditions are unsustainable in a competitive global economy.

Rant.

In terms of GDP, the United States has the largest economy in the world: $17 trillion. China’s GDP is $10 trillion, spread across more than four times as many people, so China’s per-capita GDP is pathetic compared to that of the US.[1]

In terms of after-tax household income, the US just wipes the floor with other countries: the US average is $41,355, while the median for Organization for Economic Cooperation and Development (OECD) countries is $27,630, putting the US roughly 50 percent above the median. And since the median is the midpoint, half of the OECD countries fall below even that $27,630 figure, further still from the American number.
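For the arithmetically inclined, here is a minimal back-of-the-envelope check of that “roughly 50 percent” figure, using only the two numbers quoted above (the variable names are mine):

```python
# Sanity-check the "roughly 50 percent higher" claim using the quoted figures.
us_average = 41_355   # US average after-tax household income (USD)
oecd_median = 27_630  # median across OECD countries (USD)

premium = (us_average / oecd_median - 1) * 100
print(f"US premium over the OECD median: {premium:.1f}%")  # ~49.7%
```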

People are quick to point out that American success is all quantitative, rather than qualitative. Americans work far more than “normal” people do: an average of 46.7 hours a week. They get less sleep, have less family time with their ingrate kids, have wives who have let themselves go, and about 35 percent of them are obese.[2] American society may be rich, but it is very unequal, which may be a factor in high poverty rates. Also, there are signs of “moral decay”: Americans trail only Mexico and Chile in pregnancy rates among 15-19 year-olds. Then there is all the violence: only Mexico has a higher murder rate per 100,000 people, and Mexico has drug gangs run amok. And only 74 percent of Americans feel safe walking alone at night, putting them on par with Serbs and Egyptians.

Is there an alternative model? Yes, either Scandinavia or Central America. Panama, Belize, and Costa Rica all outpace the US in reports of “daily positive experiences such as smiling and laughing, feeling enjoyment, and feeling treated with respect each day.”[3] More concretely, Scandinavians (Danes, Norwegians, Swedes) accept paying much higher taxes than Americans do in return for a comprehensive social safety net.[4] Top earners pay 57 percent, but (and this will freak out Democrats) middle-income earners pay up to 48 percent of their income in taxes.[5] Consequently, consumer prices are higher and consumer purchases are lower. On the other hand, if you’re playing by the rule that “the one with the most toys when he dies wins,” or if you listen to economists who argue that the great American demand for consumer goods is what drives the world economy, then you have to hate the European approach.

Regardless of what European leftists insist, the American definition of happiness isn’t just a preference for quantitative measures over qualitative ones. Americans value individual freedom and choice more than people elsewhere do, and this makes them insist all the more on individual self-reliance and accountability. Americans believe that progress in life, measured in economic terms, validates an open society and a competitive economy. This is why the “recent [economic] unpleasantness” has been such a trial for Americans. It is astonishing that most of the Republican presidential candidates can’t see this.

See: https://www.youtube.com/watch?v=q49NOyJ8fNA

[1] “How America rates,” The Week, 27 November 2015, p. 11.

[2] Only 4 percent of Japanese are obese and that’s including all those sumo wrestlers.

[3] This explains all the reports of Yankees getting caught trying to cross into Mexico. See: https://www.youtube.com/watch?v=0-N9L3ZXWPA

[4] Actually, that term is deceptive. Americans mean that they have to catch and carry the screw-ups. Scandinavians mean a system for enabling each person to live a productive and socially useful life. These different meanings reflect different beliefs about human character. The jury is out, in my view.

[5] So, good-bye to the “middle-class tax cuts” beloved of both parties. And the Obama confirmation of 98 percent of the George W. Bush tax cuts looks politically expedient without being fiscally prudent.

Machine minders.

Employers have always wanted a productive and committed labor force, especially if getting one doesn’t involve paying higher wages. Modern technology is supposed to improve productivity in ways that are well understood. However, technology also allows employers to measure productivity in new ways.[1] Software has been developed to track, measure, and analyze all sorts of employee actions. Today, 66 percent of American companies track employees’ internet use; 45 percent use key-stroke logging to track productivity; and 43 percent track e-mail. Employer-provided cell phones allow tracking through their GPS chips. In addition, computer systems connected to cash registers can track how quickly employees process customer purchases. UPS has fitted its trucks with a host of sensors that do the same thing for its delivery drivers. The information measures how hard an employee is working and, thus, his or her marginal value to the employer.

Beyond that, new software allows companies to engage in “sentiment analysis” of their workers.[2] Companies have long used annual surveys and internal blogs to gain insight into the expressed beliefs of workers. New software purports to measure the emotional content as well. (One study revealed that, in spite of the positive terms used to describe a diversity-enhancement initiative, workers felt threatened and fearful for their own jobs.) Other programs assess the salience of issues in the minds of workers.

People who value a degree of privacy might also be alarmed by the recent development of an employee badge that contains a microphone, a location sensor, and an accelerometer. For the moment, the company that produces the badge claims not to record conversations, but only to use the data to discover valuable patterns among workers.

All this undoubtedly spurs productivity. For example, after four years using the sensor system, UPS delivered 1.4 million packages using 1,000 fewer drivers. Using another technology, Bank of America call centers found that tightly knit groups of workers were less likely to quit and more productive on the job. The bank introduced common coffee-break times. Turnover fell by 70 percent and productivity increased by 10 percent.

It also violates certain unspoken assumptions about work held by many employees. Partly this has to do with how much work one should do for how much pay. “People get intimidated and they work faster,” complained one UPS driver. This isn’t really different from the “speed-up” on the assembly lines of the old days. Similarly, work life and non-work life are increasingly interpenetrated. Sometimes people have to take care of personal business while at work, just as they sometimes have to bring work home. They expect the employer to understand this reality. When employers complain about time use, employees resent it.

Partly this has to do with what monitoring reveals about employer attitudes toward employees. “Right at the heart of all of this [monitoring] is trust,” confessed one management consultant. All the modern human-resources talk about creating a sense of community or teamwork in the workplace is revealed to be so much drivel. Hence, Twitter has explicitly forsworn analyzing the e-mail of workers and focused on internal blogs (where workers can have no expectation of privacy). Will the information then be used to cull employees seen as having a bad attitude?

All this is compounded by the fact that good supervisory help is just as hard to find as any other kind of employee. One supervisor told an employee that the GPS chip in her company-issued cell phone allowed him to track her location 24/7. That could sound a lot like stalking.

[1] “The rise of workplace spying,” The Week, 10 July 2015, p. 11.

[2] Rachel King, “Companies Want to Know: How Do Workers Feel?” WSJ, 14 October 2015.

Annals of the Great Recession XII.

Does History teach “lessons”? Amity Shlaes certainly thinks so. Her book on the Great Depression of the 1930s is both history and prophecy.[1]

Standard histories of the Great Depression focus on all those millions of people whose lives were destroyed by the economic collapse of 1929-1932, and who were rescued by the policies of the New Deal of 1933-1940. Shlaes takes a different approach. She focuses on the people who found no solution to their problems in the New Deal or who found themselves stifled by it. Some of her cases are fascinating but ridiculous. “Bill W,” the founder of Alcoholics Anonymous, and Father Divine, a now-forgotten campaigner against racism, undoubtedly pursued solutions rooted in individual behavior rather than in collectivist action. But the New Deal wasn’t trying to deal with alcoholism or racism.[2] It was trying to deal with a mind-bending economic collapse.

Shlaes is on more solid ground when she deals with political and economic issues. On the one hand, she is undoubtedly correct that the New Deal utterly failed to revive the American economy. Unemployment remained high throughout the decade, while the stock market (a barometer of activity in the real economy, regardless of what one thinks of brokers) remained low. Only the massive deficit spending of the Second World War, and the sequestering of much of the resulting earnings for later consumer spending, restored prosperity. Still, the New Deal put a safety net under a collapsing economy. Both this achievement and the role of deficit spending in long-term prosperity are ignored or under-played by Shlaes.

On the other hand, she brings out the essential pessimism of the New Deal, FDR’s smile notwithstanding. Shlaes argues that many New Deal figures had been influenced by foreign authoritarian and collectivist models in the Twenties. Mussolini’s Italy and Bolshevik Russia had impressed intellectuals who went on to shape the debates of the Thirties.[3] These people tended to be repelled by the supposed chaos and injustice of the market economy. The National Recovery Administration tried to regulate prices, wages, hours, and even processes.[4] Shlaes insists upon the New Deal’s emphasis on redistribution over economic growth; its creation of a regulatory state with bureaucrats run amok; its early commitment to creating a planned economy; its creation of constituencies tied to the government by economic interest; and its attempt to punish, through the courts, the representatives of an alternative vision.[5]

Curiously, the book came out in 2007, before the Great Recession and the election of Barack Obama as President. Since 2008, Americans have witnessed, cheering or hissing, the flight from Keynesianism by both Republicans and Democrats; the President telling Americans that the person who owns a business “didn’t make that” business; and the attack on “millionaires and billionaires” who “tanked the economy.” Seems like old times.

[1] Amity Shlaes, The Forgotten Man: A New History of the Great Depression (New York: HarperCollins, 2007).

[2] Indeed, the New Deal was founded on racism. Much of its electoral base was in the South, where Democrats both excluded blacks from voting and counted blacks for purposes of representation. Hugo Black, appointed to the Supreme Court by FDR, had been a Klansman. Richard Nixon’s “Southern Strategy,” much decried by all right-thinking progressive people, amounted to catching the Democrats skinny-dipping and running away with their clothes.

[3] Shlaes is hardly alone in doing this. See: Paul Hollander, Political Pilgrims: Travels of Western Intellectuals to the Soviet Union, China, and Cuba, 1928-1978 (1981) for many funny or revolting stories.

[4] Like the justices of the Supreme Court at the time, Shlaes has a good deal of fun with the “straight killing” of chickens in the Schechter case.

[5] Examples include the “show trials” of Samuel Insull and Andrew Mellon and the disparaging of Herbert Hoover.

Immigration Politics.

After the Civil War, the stream of European immigrants to the United States turned into a flood. By 1890, 14.8 percent of the people living in the United States had been born abroad. Many “old-stock” Americans found this deeply disturbing. The First World War temporarily choked off emigration from Europe, but by then a powerful movement for immigration restriction had sprung up. In the early Twenties, new laws imposed a system of quotas on future immigrants. Decades later, various new laws eased restrictions on legal immigration, while a large number of Mexican and Central American immigrants entered the country illegally. By 2015, 13.7 percent of the population had been born abroad. Demographers now project that this share of the population will grow: by 2025, 14.9 percent of the population may be foreign-born.[1] Is there some kind of “saturation point”?

Today, Americans aren’t opposed to immigration. OK, I have to qualify that a bit. As recently as 2013, a huge majority of Americans (73 percent) thought that immigration was good for America, while only 24 percent thought that it was bad.[2] However, one recent Pew poll found that only 45 percent of Americans believe that immigrants improve America, at least over the long run.[3] A majority (55 percent) of Democrats and a minority (31 percent) of Republicans believe that immigrants improve America. On the other hand, that means that 45 percent of Democrats either don’t think immigrants make the country better or aren’t sure. In addition, 34 percent of Democrats think that immigrants are making the economy worse. Hillary Clinton, Bernie Sanders, and Tommy Carcetti should think about this. (See: Donald Trump in the general election.) Meanwhile, the great majority of Republicans either think that immigrants don’t make the country better or aren’t sure. This is pretty bizarre within my own notion of what the Republican Party should be: an opportunity society that creams off the best and the brightest from all those sweat-soaked hell-holes around the globe. Of which there are a great many.

Now for a discombobulating result. At most 69 percent of Republicans (100 - 31 = 69) think immigrants do not make the country better, yet 71 percent of Republicans think that immigrants are making the economy worse. Apparently, at least 2 percent of Republicans think that immigrants are making the economy worse while also believing that they make the country better. Probably some kind of sampling error. As in: Pew interviewed a bunch of idiots. Well, they get to vote, so I suppose they deserve to be polled.
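For anyone who wants that overlap arithmetic spelled out, here is a minimal sketch (the function name is mine, and it assumes both Pew percentages describe the same pool of Republican respondents):

```python
# Inclusion-exclusion lower bound behind the "at least 2 percent" claim.
# Assumes both poll percentages come from the same pool of respondents.
def min_overlap(pct_a: float, pct_b: float) -> float:
    """Smallest share of respondents who must hold both views at once."""
    return max(0.0, pct_a + pct_b - 100.0)

# 31% say immigrants make the country better; 71% say immigrants make
# the economy worse. The two groups cannot be fully disjoint:
print(min_overlap(31, 71))  # -> 2.0, i.e. at least 2 percent hold both views
```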

Still, there are intricacies to the issue that don’t always receive adequate discussion. For example, one tricky bit appears to be the difference between legal and illegal immigration. In November 2013, 63 percent of Americans favored a “pathway to citizenship” for illegal immigrants. In contrast, 18 percent want all the illegals rounded up and shipped home.[4] In June 2014, the great majority (62 percent) of Americans favored granting full citizenship to illegal immigrants who meet certain requirements; 17 percent favored granting “green cards,” but not full citizenship; and 19 percent wanted them all deported.[5]

Also, the composition of immigration has been changing. In 2010, Mexicans amounted to 45 percent of the immigrants to the US. In 2012, this fell to 14 percent. Who picked up the slack? India sent 12 percent and China 10 percent, while other Asian countries sent 23 percent. That makes Asia, at a total of 45 percent, the current chief source of immigrants to the United States.[6] According to the Census Bureau, in 2013 alone, 147,000 people of Chinese origin migrated to the United States. This puts China in first place on the list of countries sending migrants to the United States. In 2013, Mexico sent 125,000.[7]

Liberals are counting on Hispanics to vote en masse for the Democrats. It may not happen. About one-sixth of Hispanics (16 percent) now identify as evangelical Christians (who lean Republican). Another 18 percent express no religious affiliation. Hispanics remain majority Catholic (55 percent), but that number is noticeably down from where it was in 2010 (67 percent).[8]

In one sense, Republicans have little to gain from courting the Hispanic vote. Only about 16 percent of Republican-held Congressional districts have populations that are at least 20 percent Hispanic.[9] However, would swinging the Hispanic vote allow Republicans to make further inroads in currently Democratic districts?

Then, if one is to judge by the attacks on Asian shopkeepers during the Rodney King riots in Los Angeles, or by the off-hand comments of people I know, African-Americans don’t much like Asians or Hispanics. Much of the traditional Democratic base is concentrated in a handful of major cities and in the South. The Democratic obsession with affirmative action is going to alienate Asian and Hispanic voters. In sum, the Democrats have some long-term problems cooking.

[1] “Noted,” The Week, 9 October 2015, p. 18.

[2] “Poll Watch,” The Week,

[3] “Poll Watch,” The Week, 16 October 2015, p. 17.

[4] “Poll Watch,” The Week, 6 December 2013, p. 17.

[5] “Poll Watch,” The Week, 20 June 2014, p. 17.

[6] “Noted,” The Week, 25 July 2014, p. 14.

[7] “Noted,” The Week, 15 May 2015, p. 16.

[8] “Noted,” The Week, 23 May 2014, p. 14.

[9] “Noted,” The Week, 19 July 2013, p. 14.

On your own.

Once upon a time, each individual in American society and the American economy bore all sorts of risks associated with his or her own life.[1] Then came the Great Depression of the Thirties.[2] Under Democratic auspices, in the New Deal and the Great Society, mass-unionized workers got defined-benefit pension systems, “Cadillac” health insurance plans, unemployment insurance, near-full employment, and ever more generous Social Security. In essence, risk came to be shared, as in an insurance model.

Then, beginning in the Seventies, international competition eroded the complacency of the post-war decades. At the same time, the mythic “American work ethic” eroded to the point where, for example, American-manufactured cars ceased to be stolen. OK, somebody might want to steal German cars or Japanese cars, but American cars? Who would steal a K-Car or a Gremlin? The car companies and the UAW pressured Washington into imposing quantitative limits on the number of Japanese cars imported into the United States. Again, the assumption of risk fell on the group or community rather than on the individual.

In the Eighties, risk began to shift back toward the individual. Both corporations and governments “de-leveraged” by cutting their formal obligations. Defined-benefit pension systems gave way to defined-contribution systems; health insurance slid toward high-deductible plans; a minimum of 5 percent unemployment became the definition of “full employment.”[3] Rather than tolerate poor workmanship at high labor costs, companies began to shift their production overseas. American consumers got better products at a lower price.

All the same, those consumers were also producers. The new systems eroded both job-security and labor compensation. Several aspects of contemporary political radicalism (both Bernie Sanders and the Tea Party) may arise from this disorder.[4]

At the core of Hacker’s work is a life-cycle interpretation of political success and failure.[5] The 45-year-old Hacker believes that victory goes to the young, energetic, and imaginative. (People like him or Paul Ryan.) The Democrats were young and vigorous once. Then, over time, they turned into a party of old buffers. Meanwhile, licking their wounds in exile, the Republicans became a party with “that lean and hungry look.” They figured out how to market their ideas and developed an acute understanding of how the political system worked. Democrats fell in droves before the sword of Ronald Reagan. According to this narrative, old-guy Democrats thought that they could get by splitting the difference with fine young conservatives. Alligator Republicans just ate their lunch. What was needed, in the mind of Jacob Hacker, was a younger, more dynamic Democratic Party.[6]

The possibility that labor costs (wages + benefits), relative to the price and quality of the goods produced, have gone beyond what is sustainable in a competitive global economy is not something that Democrats desire to discuss. Nor Republicans either.

[1] Jacob Hacker, The Great Risk Shift (New York: Oxford UP, 2006).

[2] The New Deal is what the Left has in place of a revolution. Polemicists will debate whether it was a new American Revolution or a watered-down Russian Revolution.

[3] Unemployment had sunk below 3 percent in the Twenties, before government management of the economy.

[4] Democrats are inclined to regard one (Sanders-ism) as legitimate, if misguided, while they regard the other (what, evangelical Republicanism?) as illegitimate as well as unhinged. I’m not sure I see a real difference.

[5] It isn’t much different from Ibn Khaldun or Ma Joad.

[6] In the world of ideas, this meant people like Paul Krugman and Joseph Stiglitz; in the world of the communication of ideas, or at least of notions and punch lines, it meant people like Jon Stewart and Stephen Colbert.

Inequality 7.

According to the CIA, income inequality in the United States now is more extreme than in Red China.[1] So what? What matters is that a “rising tide lifts all boats,” as JFK said when arguing for a tax cut. However, some economists argue that the evidence for this “true that” statement is sketchy (as young people used to say). President Clinton got Congress to raise the top tax rate from 31 percent to 39.6 percent and the economy boomed (admittedly with the “Tech Bubble” that collapsed after he left office); President George W. Bush got Congress to cut taxes on high earners to 35 percent, but the economy floundered (admittedly with the “Housing Bubble” that collapsed before he left office). In this analysis, what really matters is the amount of demand for goods in the economy. That is an argument for shifting resources to consumers.

The “Great Bull Market” of the Twenties (and other stuff that pundits don’t want to know about) led to the Great Depression. The Great Depression led to the New Deal and 20 years of Democratic dominance in Congress. The Depression discredited businessmen as prophets of the New Era. The New Deal imposed all sorts of restrictions on business and raised taxes on the rich swells (who were in some vague way blamed for the Depression). By the 1950s, the top marginal income tax rate had risen to 91 percent, essentially a confiscatory tax on high incomes. Proponents of relative income equality point to this period as the ideal society because it coincided with the era of American economic ease. Good-paying working-class jobs allowed many people with only a high-school diploma to enter some version of the middle class.

However, the Great Depression ended in 1940. By the 1970s, a whole new generation of businessmen had come on the scene. They were unburdened by the sins of their elders. They campaigned for a reduction in the punitive tax rates of the New Deal era. One can see this as Republicans responding to the Democratic strategy of “tax, spend, elect” with their own mantra of “tax-cut, spend, elect.” In theory, savings create investment capital and investment capital creates jobs. The top marginal income tax rate, still 70 percent in the 1970s, accordingly fell to 50 percent in the first Reagan administration, and then to 28 percent in the second Reagan administration. Bill Clinton pushed for and won a reduction in the tax on capital gains from 28 percent to 20 percent. George W. Bush pushed for and won a reduction in the tax on capital gains from 20 percent to 15 percent. However, President Bush also pushed for massive cuts in taxes paid by lower income groups.

Two things resulted from the Bush tax cuts. First, the US government lost $400 billion a year in revenue. Of this lost revenue, “only” $87 billion came from people earning $250,000 a year or more. The other $313 billion came from people earning less than $250,000 a year.[2]

Second, taxation became much more progressive. While cutting taxes overall, Bush shifted the burden of taxation onto upper income earners. After the Bush tax cuts, the top 1 percent of income-earners now pays 40 percent of the income tax bill (and 21 percent of all taxes), while 47 percent of Americans now pay no income tax at all.[3] Despite his bitter condemnation of the Bush Administration on many scores, President Obama fought hard to confirm 98 percent of the cuts.

There are three observations worth making. One is that there are big long-term trends or swings in tax policy. The huge deficits looming as the “Baby Boom” ages may herald an end to low taxation for everyone.

A second is that President Obama has loudly condemned the plutocrats “who tanked the economy” in the financial crisis. How did Bill Gates or Steve Jobs or Warren Buffett or the idiots who ran American car companies “tank the economy”? They didn’t. In fact, only about 14 percent of the richest Americans work in finance. Yet Gates, Jobs, Buffett and a lot of other ordinary, successful entrepreneurs were hammered by the Obama tax increases.[4] They have also been subject to his frequent dispensation of moral opprobrium.[5]

A third is that the Democrats need to define what they mean by “fair.” As in, “the rich should pay their fair share.” The rich are already carrying a disproportionate share of the fiscal weight while almost half of Americans pay nothing at all for the programs that benefit them. As Woody Guthrie might have said (had he been an entirely different person), “A poor man with a ballot-box can rob you just as easily as can a rich man with a pen.”

[1] “Taxing the rich,” The Week, 4 November 2011, p. 11.

[2] Can you impeach a former President?

[3] If “taxation without representation is tyranny,” then what is representation without taxation?

[4] Perhaps it is worth pointing out that of the “one percent,” about 16 percent are in medicine; about 12 percent are lawyers, and about 50 percent of the members of the House of Representatives and the Senate belong to the “one percent.”

[5] See: “Stuff my president says.”

Climate of Fear XVIII.

Environmental record-keeping is really pretty new. It’s a function of the rise of both Science and the State during the 19th Century. In the case of California, systematic record-keeping only began in 1895. Beyond the formal records, policy-makers are forced to rely upon scientific studies and interpretations. Some geological studies indicate that “megadroughts” lasting a decade or more occur in the Western United States every 400 to 600 years.[1] We may be at the beginning of such a megadrought. It may last thirty years or more.

Since 2000, the whole of the West has been suffering from drought. This has created many different kinds of problems, from crop failures to massive wildfires.[2] California offers a study of a particularly acute case. During the winters of 2012-2013 and 2013-2014, a large high-pressure area prevented winter storms in the Pacific from blowing inland to the great Sierra Nevada mountains along California’s eastern edge.[3] Three-quarters of California’s water comes from the snowfall in the Sierra Nevada. Come spring each year, the snow melts and runs down streams and rivers into the rest of the state. Much of the water also seeps down into the aquifer of the great Central Valley. The high-pressure area cut rainfall in California to 42 to 75 percent of normal.[4] In addition, the absence of the cooling effect of onshore breezes and storms helped bake California. That evaporated much of the water that did reach the ground.

By September 2014, 82 percent of California had been designated as being in either an “extreme” or an “exceptional” drought. How to respond? Well, California is both a cluster of major cities and suburbs and a major agricultural state: it produces about 70 percent of the top 25 fruits, nuts, and vegetables. Accordingly, 80 percent of California’s water goes to irrigating its farmers’ crops. To cut water to farmers is to cut the legs off a major industry. Instead of hitting agriculture, governments tried to limit non-agricultural water use. To begin with, the California Water Resources Control Board began fining people who watered their lawns or washed their cars without using a water-saving nozzle on the hose; Los Angeles (harking back to the oil shock of 1973) limited people to watering their yards on alternate days. That didn’t have much effect. Huge numbers of urban consumers pushed back against such restrictions.

Faced with limits on taking water from rivers, farmers turned to drilling into the aquifer. That’s a short-term—and destructive—response. There is a limited amount of water in the aquifer. A “water rush” equivalent to the Oklahoma “land rush” will privilege those with the most money for drilling operations and force smaller farmers to the wall.

In 2014, the state legislature passed a bill to regulate the use of groundwater (i.e., the drilling). This enraged farmers, who saw the groundwater as their own property. The basic question is whether a lot of people who use relatively little water (at most 20 percent of the total) should suffer hardships for the sake of the relatively few people who use a great deal of water to produce valuable products. What if it were a question of electricity use, where urban areas consume far more than rural areas do? These questions aren’t just about California. They go to how we think about the environment and the economy in general.

[1] This cuts across the argument of supporters of anthropogenic climate change, without invalidating their arguments.

[2] Disclaimer: my son is a U.S. Forest Service wildland firefighter. Actually, it isn’t a disclaimer. I just want everyone to know that I’m proud of my boy for doing a hard, demanding, and dangerous job when most kids want careers with Wall Street or Disney World or the US Gummint.

[3] “California’s epic drought,” The Week, 26 September 2014, p. 9.

[4] Apparently, no one attributes the high-pressure ridge to global warming. Inevitably, people make do with claims that “global warming” is intensifying the effects of the drought. Arguably, this is what climate-change denial on one side elicits from the other side: potential overstatement.

The Roosevelts versus Ronald Reagan.

Back at the start of the Twentieth Century, Theodore Roosevelt had posited that big business and a foreseeably big labor movement would require a big government to balance their power and solve complex new problems. For a long time, it appeared that “the Republican Roosevelt”[1] had been prescient. The New Deal, launched by his cousin, the Democrat Franklin D. Roosevelt, greatly expanded the government’s role in the economy. That trajectory continued until the election of Ronald Reagan in 1980. Since then, Republicans have inveighed against the expansion of state power (unless national security can be invoked). What do Americans think about this issue in the early Twenty-First Century? A January 2014 opinion poll captured a fundamental division of opinion.[2] A majority (57 percent) agreed with the statement that “we need a strong government to handle today’s complex economic problems.” However, a very substantial minority (41 percent) rejected that idea in favor of letting a free market operate without “the government being involved.” To belabor the obvious, 57 + 41 = 98 percent of Americans, leaving almost no one undecided. There is no uncertainty in the minds of Americans about this issue, no mushy middle ground on which compromise is possible. Two tribes confront each other. In Europe, on the other hand, there is a broad consensus on the role of government in the economy.

This has important implications for the economically-battered ordinary American. In 2010, the median wage was $26,364. After adjusting for inflation, this was the lowest real median wage since 1999.[3] In 2014, American median net worth per adult hit $44,900. Japan, Canada, Australia, and many Western European countries ranked ahead of the United States, which came in 19th.[4] Apparently, if Americans are offered a choice between earning another $20,000 a year and getting another month of vacation, they will take the pay.[5] One could interpret this as Americans being workaholics. One could also interpret it as a sign of the economic stress under which many Americans are operating.

The question is what to do about this pathetic performance. The opposing positions generally pit redistribution through taxation policies (i.e. “strong government”) against pro-growth and social mobility policies (i.e. “let the market operate”).

If you combine federal, state, and local taxes, Americans are among the lowest-taxed people in the developed world. Here the US ranks 31st, trailing most of the countries with higher median net worth.[6] Where does American federal spending go? Almost two-thirds of it (65 percent) goes to three categories: Social Security (24 percent); Medicare/Medicaid/CHIP (22 percent); and defense (19 percent).[7]

None of this goes to the question of which group is correct. Perhaps neither one is entirely correct. Europeans are laboring under an “austerity” that would never be tolerated in the US. It does suggest that there is a core dispute that is more powerful—and important—than the “culture wars” that obsess the media and Democratic activists. Hence, Bernie Sanders.

[1] As Yale historian John Morton Blum called one of his books.

[2] “Poll Watch,” The Week, 17 January 2014, p. 17.

[3] “Noted,” The Week, 4 November 2011, p. 18.

[4] “The bottom line,” The Week, 20 June 2014, p. 34.

[5] “Poll Watch,” The Week, 24 July 2015, p. 15.

[6] “Noted,” The Week, 25 April 2014, p. 16.

[7] “Noted,” The Week, 25 April 2014, p. 16. It is worth pointing out that most countries don’t spend anything like the share of the budget on defense as does the US. Instead, they rely on the US in an emergency. That frees up a lot of resources for social programs. Then the federal nature of American government means that much spending is done by state and local authorities. Some European countries, in particular, have a more centralized system.

Annals of the Great Recession XI.

I saw the Iraq War as an obvious act of stupidity even before we attacked in spring 2003. So, in 2008, I voted for the candidate who had opposed it, Barack Obama. I voted for him in spite of his obvious weaknesses: he was as green as grass in politics, he had never run anything, he hardly knew anyone in Washington, and he had some dopey ideas. My assessment of President Obama’s failings is amply borne out by Ron Suskind’s scathing account of how the President and his advisers made economic policy in the first two years after he reached the White House.[1]

Undoubtedly, Obama inherited an economic disaster from the George W. Bush Administration. However, his background and range of contacts left him ill-positioned to deal with the immense problems on his plate. First, the president believed in the power of rhetoric; he almost seemed to believe that talk and action were identical. Supporters have argued that he’s the first president in a while to speak in full sentences and paragraphs, and that this doesn’t mesh well with sound-bites. In reality, the trouble was that much of his discourse seemed to have been picked up in Chicago rec-league basketball. He disses people who disagree with him.[2]

Second, the president turned out to be a poor judge of people and had few close advisers to keep him from going into the ditch at the first opportunity. Rahm Emanuel, who served as his first chief of staff (and who recently squeaked through to re-election as mayor of Chicago), and Lawrence Summers, who headed his National Economic Council (after a stint as President of Harvard that ended when he vexed the faculty), were imperious, abrasive men who rubbed people the wrong way as a first order of business in any meeting. Tim Geithner, his first Secretary of the Treasury, was consistently suspected of mouthing the Wall Street view.

Third, unlike his immediate predecessor, President Obama could not pull the trigger on any issue. Instead of deciding, he sought consensus. Endless debates went on, but the President refused to choose one option and then to say “it’s my way or the highway.” Who ever crossed Richard Nixon without landing on the sidewalk with his suit in tatters? It’s a short list.

Many of his own subordinates saw through him from the start. Famously, Summers told another official: “We’re home alone. There’s no adult in charge. Clinton would never have made these mistakes.” Geithner has been accused of outright insubordination, but stayed at Treasury as long as he chose.

The “friendly opposition” within the Democratic Party would argue that, after the rough ride of his first two years, Obama began to understand how things should operate. He got rid of his early hires and started to make decisions. So they say. With a year and change to run in his second term, it isn’t clear that much has changed.

Still, what was the bigger disaster for America: Obama’s mismanagement of the economy or the Iraq War? Somebody in Washington needed to get drilled for the Iraq War, not just the men and women who fought there. John McCain and Hillary Clinton had to pay a price at the voting booth. What are we supposed to do? Let bygones be bygones after each new train-wreck engineered by the usual suspects who populate American politics?

Finally, has Obama learned anything? The answer to that question goes to the credibility of the Iran deal.

[1] Ron Suskind, Confidence Men: Wall Street, Washington, and the Education of a President (New York: HarperCollins, 2011).

[2] See: “Stuff my president says.”