Snow on the roof.

In the Nevada caucuses, with 95.3 percent of the counties reporting, Hillary Clinton picked up 52.7 percent and Bernie Sanders picked up 47.2 percent of the vote.[1] This is an important victory for Hillary Clinton after Sanders tied her in Iowa and thrashed her in New Hampshire.

That isn’t the same as saying that it was a total loss for Sanders. A year ago, in February 2015, 58 percent of self-identified Democratic voters in Nevada favored Hillary Clinton, while 4 percent favored Bernie Sanders. In March 2015, 61 percent favored Clinton, while 7 percent favored Sanders. In July 2015, 55 percent favored Clinton, while 18 percent favored Sanders. In October 2015, 50 percent favored Clinton, while 34 percent favored Sanders. In December 2015, 51 percent favored Clinton, while 39 percent favored Sanders. In January 2016, 47 percent favored Clinton, while 43 percent favored Sanders. In early February 2016, they were tied at 45 percent each. In mid-February 2016 they were pretty much where they ended up, with 53 percent favoring Clinton and 47 percent favoring Sanders.[2]

Clinton’s numbers were pretty steady for a year, although there was a certain amount of erosion. Sanders’ numbers, however, shot up. Where did he get these voters? Mostly, they came from people who had previously favored Elizabeth Warren or Joe Biden or Tommy Carchetti, or who had been undecided. Thus, Clinton has a hard core of steady support. There also appears to be a substantial Anyone-But-Clinton (ABC) group among Democratic voters.

Nevada actually is a big blank space on the map. Three-quarters of the state’s population lives in or around Las Vegas, the county seat of Clark County. In Clark County, Clinton won 54.9 percent and Sanders won 45.1 percent. According to the 2010 census, Clark County’s racial makeup was roughly 61 percent white, 29 percent Hispanic, 10.5 percent African American.[3]

Although African Americans made up 10.5 percent of the Clark County population in 2010, they turned out at a higher rate than did other groups, totaling 13 percent of the people at the caucuses. Then they voted overwhelmingly for Clinton (76 percent) over Sanders (22 percent). Clinton also did better among older voters than did Sanders.

The ABC movement is centered among younger people and Hispanics. Sanders crushed Clinton among under-30 voters (82 percent-14 percent) and among under-45 voters (62 percent-35 percent).[4] Among Hispanics, Sanders beat Clinton by 8 percentage points. Although 29 percent of the population is Hispanic, Hispanics turned out at a much lower rate, representing only 19 percent of the people at the caucuses. Perhaps this reflects the Clinton campaign’s heavy use of Hispanic surrogates in the final stage, which may have suppressed part of the Democratic vote. Had Sanders found a way to fully mobilize the Hispanic vote, he might have won. Whites made up 59 percent of caucus-goers, a hair below their share of the population. Clinton and Sanders essentially split this group.

Probably, the ABC movement will not block Clinton from winning the nomination. Will it affect Democratic turn-out in November? Does Clinton speak only for older people and African Americans?

[1] See: http://graphics.latimes.com/election-2016-nevada-results/

[2] https://en.wikipedia.org/wiki/Statewide_opinion_polling_for_the_Democratic_Party_presidential_primaries,_2016#Nevada

[3] Yes, I know it doesn’t quite add up and leaves out Asians, etc. It’s the effect of the White, non-Hispanic versus White Hispanic mishegoss.

[4] Abby Philip, John Wagner, and Anne Gearan, “Black vote key in Democratic caucus in Nev.,” Philadelphia Inquirer, 21 February 2016.

Explaining Bernie Sanders—and Perhaps Donald Trump.

Two-thirds of Americans believe that at least one candidate in the current crop would make a good president. Three-quarters (75 percent) of Republicans believe that Donald Trump could win a general election—even though only about half of Republicans want him as their candidate. The vast majority (83 percent) of Democrats believed that Hillary Clinton could win election—before Bernie Sanders ran even with Clinton in Iowa and then torched her in New Hampshire. Among the less-favored candidates are Ted Cruz (60 percent of Republicans); Marco Rubio (55 percent of Republicans); and Bernie Sanders (54 percent of Democrats).[1]

In theory, Hillary Clinton wipes the floor with the leading Republican candidates when it comes to dealing with terrorism. Americans preferred her to Donald Trump (50-42), Marco Rubio (47-43), and even Jeb Bush (46-43).[2] On the other hand, that means that roughly 42-43 percent of Americans want anyone-but-Hillary, no matter how clownish or inexperienced, to deal with terrorism. Is it the same for other issues? If it is, then she has remarkably high negatives for someone running for president. Still, so did Richard Nixon. Oh. Wait.

On the other hand, Independents fail to share this enthusiasm. Only 58 percent of them believe that there is anyone who would make a good president. (If Independents sit out in large numbers, then that might leave the November 2016 election in the hands of party regulars.)

Why are Americans so rabid for anti-establishment candidates?

In 2003, the net worth of the average American was $87,992. In 2013, it was $56,335. That amounts to a 36 percent fall in net worth, before even allowing for inflation.[3] Over the same period (2003-2014), the net worth of the top five percent of earners increased by 14 percent.[4]
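The arithmetic behind the 36 percent figure checks out; a minimal sketch, using only the dollar figures cited above (no inflation adjustment):

```python
# Net worth of the average American, per the figures cited above
worth_2003 = 87_992
worth_2013 = 56_335

# Percentage fall over the decade
fall = (worth_2003 - worth_2013) / worth_2003 * 100
print(f"{fall:.0f} percent fall")  # prints "36 percent fall"
```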

About one-third of Americans have no savings accounts at all.[5] Twenty percent of people aged 55 to 64 have no retirement savings. Almost half (45 percent) of people surveyed expected to live on whatever Social Security paid them.[6] Almost half (44 percent) of Americans don’t have an “emergency fund” to cover basic expenses for three months. Almost half (43 percent) of American workers would be willing to take a pay cut IF their employer would increase the contribution to their 401(k) retirement savings plan.[7] In August 2014, about 77 million Americans had a debt “in collection.” The median amount owed was $1,350.[8] That’s not a lot of money. Unless you don’t have it.

If the “Great Recession” had not occurred, then college graduates entering the job market might have expected salaries 19 percent higher. The “normal” penalty for graduating in a recession is about 10 percent.[9] The recent unpleasantness has been unusually unpleasant. Also, state aid to public colleges has fallen during the recession. That means that students have been graduating with much larger debt loads than previously. They have to service those debts out of smaller starting salaries.

People hiring employees tend to favor those who are narcissistic over the humble.[10] Apparently, they are right to do so. “Narcissistic” CEOs make an average of $512 million more over their careers than do those who are not.[11] Will it be the same for voters? Hard to think of anyone more narcissistic than the Clintons. Unless it is Donald Trump.

[1] “Poll Watch,” The Week, 5 February 2016, p. 19.

[2] “Poll Watch,” The Week, 4 December 2015, p. 19.

[3] “Noted,” The Week, 8 August 2014, p. 14.

[4] “Noted,” The Week, 8 August 2014, p. 14.

[5] “The bottom line,” The Week, 15 February 2013, p. 32.

[6] “Noted,” The Week, 22 August 2014, p. 16.

[7] “The bottom line,” The Week, 22 August 2014, p. 32.

[8] “The bottom line,” The Week, 15 August 2014, p. 31.

[9] “The bottom line,” The Week, 1 August 2014, p. 31.

[10] “The bottom line,” The Week, 27 June 2014, p. 32.

[11] “The bottom line,” The Week, 1 August 2014, p. 31.

Public opinion and foreign policy.

Back in April 2014, almost half of Americans (47 percent) thought that the United States should be “less active” abroad.[1] That included both Republicans and Democrats (45 percent each, which suggests that Independents were still more likely to favor caution). However, markedly more Republicans (29 percent) than Democrats (12 percent) or all Americans (19 percent) thought that the US should be “more active” abroad. The Republican “don’t knows” amounted to 26 percent, compared to 43 percent for Democrats and 34 percent for all Americans. Thus, there was a more intense division of opinion among Republicans than among Democrats, while Democrats were more uncertain about the right course of action.

By August 2014, Americans were generally feeling surly about the country’s situation. The vast majority (71 percent) felt the country to be “on the wrong track,” and well over half (60 percent) felt it to be “in decline.”[2] A lot of this had to do with the still-unsatisfactory economic recovery and with the continuing deadlock between the legislative and the executive branches, but some of it probably arose from foreign policy issues as well. In the wake of the rapid advance of ISIS in western Iraq, as well as in light of other domestic reverses (like the ObamaCare roll-out fiasco in Fall 2013), only 42 percent of Americans believed that President Obama could “manage the government effectively,” while a stinging 57 percent thought that he could not. That left only 1 percent who weren’t sure.[3]

A year and a half later, the course of events had shifted opinion among both Republicans and Democrats.  The rise of ISIS from Summer 2014 on, the terrorist attacks in Western countries, and the controversial Iran deal all worked to polarize opinion. The events sent many Republicans back toward a traditional policy of engagement. By December 2015, only 32 percent of Republicans wanted to “focus more at home,” while 62 percent favored being “stronger abroad.” That left only 6 percent saying that they “didn’t know.” The same events sent many Democrats toward a policy of disengagement. Among Democrats, 69 percent now said that the US should “focus more at home,” while only 23 percent favored being “stronger abroad.” That left only 8 percent saying that they “didn’t know.”

Partly, this may be a reflection of the dissolution of established verities. Only 44 percent of Democrats sympathized with Israel in its war with Hamas in the Gaza Strip in Summer 2014, while only 51 percent of Americans overall sympathized more with Israel than with the Palestinians. In contrast, 73 percent of Republicans sympathized with Israel. Whatever the merits of Israel’s policy, the actual implementation of blockade, bombings, and artillery fire in an urban area crowded with women and children as well as missile-firing militants made for gruesome television viewing.

Or perhaps it was just the return to a presidential election campaign that caused many Democrats and Republicans to adopt policies in knee-jerk opposition to their rivals’ policies. For example, in March 2015, 53 percent of Republicans supported automatic registration of all eligible voters. Then, Hillary Clinton endorsed this proposal. Soon, only 28 percent of Republicans supported automatic registration of all eligible voters.[4]

In any event, American voters will get a clear choice in November 2016.

[1] “Behind Shifting GOP Mindset,” WSJ, 4 February 2016.

[2] “Poll Watch,” The Week, 22 August 2014, p. 17.

[3] “Poll Watch,” The Week, 8 August 2014, p. 15.

[4] “Poll Watch,” The Week, 19 June 2015, p. 15. Still, only a minority (48 percent) of Americans supported the idea, while 36 percent were opposed.

The Gun Show.

Since 2009, when President Obama first began talking about gun control, gun sales have increased. The stock market value of gun manufacturers like Smith and Wesson and Ruger rose by 900 percent.[1] Now the president has begun taking executive action to extend federal control.

How big is the problem of unlicensed gun sales? A study of “Armslist.com” found that 600,000 guns were offered for sale on-line by unlicensed dealers. Of these, 4.5 percent were sales by “high volume dealers”—people who sold 25 to 150 weapons a year.[2] So there are a small number of people knowingly skirting the law in much the same way, perhaps, as many people ignore the speed limits[3] or sell marijuana. Guns purchased on-line will not be sent directly to the purchaser. They will be sent to a licensed gun-dealer who can carry out an on-line background check before turning over the gun to the purchaser. Many, if not most, gun show sales also require a background check. (Lots of people have a Wi-Fi connection.)

When President Obama issued his executive order on gun-sales, he sought to bring all those who sell or trade guns under federal control.[4] Specifically, anyone who sells guns could be considered a “gun dealer.” Any of them who do not have a federal license—which may simply not be issued to applicants, much as federal lands are being closed to coal mining by executive order—could be subject to heavy fines. White House spokesperson Josh Earnest[5] claimed that the penalties to be levied on people “hiding behind the hobbyist exemption” would force many people to seek federal gun-dealer licenses. So, that’s the end of that. These dealers are thought to sell hundreds of weapons a year. Some of these hundreds of weapons may be used in the thousands of gun homicides that happen each year. Both small gun dealers and knowledgeable federal officials doubt that the new order will have any effect.

How does it play in Peoria? As of early January 2016, 51 percent of Americans were opposed to tighter gun laws, while 48 percent supported them.[6] As of mid-January 2016, more than half (54 percent) of Americans opposed President Obama’s use of executive orders to alter the gun laws [relating to who is a gun-dealer], while 44 percent approved of it. So, Americans are clear in their own minds about what they believe on this matter. At the same time, however, two-thirds (67 percent) of Americans supported President Obama’s directive for expanded background checks for gun-buyers.[7] What about the party-affiliation breakdown? Well, the vast majority of Democrats (85 percent), two-thirds (65 percent) of Independents, and just over half of Republicans (51 percent) support expanded background checks.

What’s the difference? Well, we have an existing system of background checks, and anyone can see that the system doesn’t catch enough of the killers. So extending it makes sense without changing the law by presidential ukase. Changing the legal definition of who is a gun-dealer smacks of President Obama’s all-too-evident belief that he is the ruler of the French Second Empire, rather than president of the United States. The former adjunct professor of law appears to differ with many Americans over how he understands the Constitution.

[1] Compared to 800 percent for Apple and 147 percent for the benchmark Standard and Poor’s 500 index.

[2] The NYT did not report on the hand gun versus long gun balance of this trade. Hand guns are the chief killers.

[3] See: Route 202 southbound at 6:30 AM. Just reporting on what I have seen.

[4] Hiroko Tabuchi and Rachel Abrams, “Obama’s Gun Initiative Seen as Having Limited Effect on Unlicensed Dealers,” NYT, 8 January 2016.

[5] “Josh” is an old term for “joke.” “Earnest” is an old term for “I’m serious.” So, which is it?

[6] “Poll Watch,” The Week, 15 January 2016, p. 17.

[7] “Poll Watch,” The Week, 22 January 2016, p. 17.

The Gracchi.

According to the guiding theory of the Democratic Party, a big government that robs the rich to give to the poor should be a permanent winner in electoral politics.[1] It isn’t, in spite of continual Democratic efforts to paint the Republicans as Rich Swells and the mere creatures of Big Business. Eduardo Porter conjectures that working-class whites assign greater importance to “racial, ethnic, and cultural identity,” than to “economic status.”[2]

In December 2015, one poll reported that Hillary Clinton would beat Donald Trump in a general election—if only college-educated people voted. On the other hand, Trump would beat Clinton if only people without a college education voted.

Porter is at pains to argue that, while Trump also polls ahead of his Republican rivals with women and upper-income voters, his main base is “less-educated, lower-income white men.”[3] He argues that white, working-class voters (especially men) are “nostalgic for the country they lived in 50 years ago.”[4] These people—he doesn’t quite say “those people”—“would rather not have a robust government if it largely seems to serve people who do not look like them.” While 62 percent of white Americans would prefer a smaller, less providential government[5], only 32 percent of blacks and 26 percent of Hispanics desire that end. As a result, America could experience “an outright political war along racial and ethnic lines over the distribution of resources and opportunities.” Actually, it isn’t that clear. Whites account for 62 percent of the population, while African-Americans account for 13.2 percent and Hispanics account for 17.1 percent. Taken together, the supporters of a smaller government total better than 46 percent of the population. That’s a big constituency that spans racial lines.
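The 46-percent claim is easy to reproduce; a quick sketch, weighting each group’s smaller-government preference by its population share (only the three groups listed above, so the remainder of the population is omitted):

```python
# (population share, share preferring a smaller government), per the figures above
groups = {
    "white":    (0.62,  0.62),
    "black":    (0.132, 0.32),
    "hispanic": (0.171, 0.26),
}

# Each group's contribution to the whole population, summed
total = sum(pop * pref for pop, pref in groups.values())
print(f"{total:.1%}")  # prints "47.1%", i.e. better than 46 percent
```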

Porter confuses other issues as well. He approvingly quotes one scholarly paper that argues that “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” For one thing, Trump has not attacked blacks, to the best of my knowledge. Indeed, he has sought the support of traditional leadership figures in black communities. Trump’s white, working-class base agrees with the candidate’s policies on building a wall along the border with Mexico; deporting illegal immigrants, virtually all of whom are Hispanic-Mexicans; and registering Muslims as potential terrorists. All these can be read as expressions of concern about the loss of jobs to foreign competition, the open flouting of the rule of law, and security in an age of terrorism. For another thing, while Porter accepts that people can have predominant non-economic concerns, he ignores the chance that people are ideologically opposed to welfare dependency. In his telling, something else must be driving them, and that “something” appears to be race, as in “racism.” Here again, Porter turns a blind eye to long-standing traditions of self-reliance as an American virtue.

America’s economy, society, and place in the world have all changed in ways that most people do not like. Democrats and Republicans are both nostalgic and they offer policies aiming at “restoration.” We need something better.

[1] “Tax, spend, elect” is one version of a motto attributed to President Franklin D. Roosevelt’s advisor Harry Hopkins. The authenticity of the phrase is disputed. See: https://en.wikipedia.org/wiki/Tax_and_spend

[2] Eduardo Porter, “Racial Identity Returns to American Politics,” NYT, 6 January 2016. This is a variant of what Marxists term “false consciousness.” People think that they belong to a different social class than they actually do, so they behave in the wrong fashion. In this case, people assign less importance to “economic status” than liberals think that they should.

[3] That is, the foundation of the New Deal coalition and of its successors until the Seventies. Now much despised.

[4] President Obama and other Democrats have been talking about restoring the middle class to its former prosperity. Why isn’t that “nostalgia”?

[5] This group spills well outside of the Republican constituency, let alone the Trump constituency.

Affirmative Action.

Between 1940 and 1965 the Democratic Party slowly shifted from relying on anti-black racism to a forthright advocacy of “equality as a fact and as a result.”[1] Since the end of the Civil War, opportunities for African-Americans within the modest federal government had bounced around, with the appalling Woodrow Wilson doing much to roll back advances made under his Republican predecessors. However, after 1932 the dramatic expansion of the size of the federal government and the turn to employing private contractors to execute its will created new conditions. In 1965, President Lyndon Johnson signed an executive order that required government contractors to identify and eliminate obstacles to the employment of minorities (by which Johnson meant African-Americans). This basic commitment to justice swiftly became the consensus in American politics. In 1969, President Richard Nixon issued his own executive order that required contractors to hire so as to reflect the racial composition of their area. Many states followed the lead given by the federal government. Ten years of expanding affirmative action initiatives followed.

For every action, there is an equal and opposite reaction.[2] For one thing, there were people who saw “affirmative action” as “reverse discrimination.” If one kind of discrimination is wrong, then all kinds are wrong. So, there was a principled opposition to affirmative action. For another thing, affirmative action disrupted and devalued a well-established system of apportioning opportunity.[3] At all levels of American society, some people get things because of patronage or connections. That’s true of “legacy” admissions to Ivy League universities; it’s true of family firms; and it’s true of union hiring halls. Increasing minority representation gored somebody’s ox in many of these cases. For yet another thing, some people are racists. They assumed that African-Americans were innately less capable than were whites. For liberals of this stripe, inferiority meant that African-Americans needed to be protected and guided by an expanded state, rather than left to their own devices. For conservatives of this stripe, inferiority meant that nothing achieved by any African-American came by way of merit, but only by manipulation.

In 1975 Allan Bakke, denied admission to medical school at the University of California at Davis, sued. In 1978 the US Supreme Court found for Bakke, rejecting the use of quotas to apportion opportunity. The case touched a nerve among conservatives in particular. In 1980, former California governor Ronald Reagan won the presidential election. He issued his own executive order ending the affirmative action requirement for federal contractors.

The American system of federalism means that the policy of the federal government is not necessarily the policy of the individual states. Hence, a sustained effort has been made to persuade the Supreme Court that affirmative action is un-constitutional. In 2003, without much enthusiasm, the Supreme Court upheld the basic constitutionality of affirmative action. It’s easy to find people who feel wronged by affirmative action. So, it’s still on the docket.

[1] “The origins of affirmative action,” The Week, 28 June 2013, p. 9. Between 1865 and 1965 much of the Democratic voter base consisted of Southern whites, who upheld the system of “Jim Crow.” Indeed, it seems likely that at various points in its history, every single member of the Ku Klux Klan was a Democrat. Disfranchised Southern blacks were nevertheless counted for the purposes of apportioning representatives just as if they had the right to vote. This inflated Democratic numbers in the House of Representatives and in the Electoral College.

[2] Isaac Newton, Third Law of Motion. But maybe not, at least not in politics, society, and the economy. Otherwise we’d be stuck in the same place for millennia. This shows the perils of applying the lessons of physics to the less reliable world of human activity. So does the Reign of Terror in the French Revolution. But I digress.

[3] Indeed, that was the idea.

American opinion on gun control.

Americans are divided on the utility of stricter gun laws to stop shootings. In September 2015, 46 percent of Americans thought that stricter gun-laws were the best way to reduce the number of shootings, while 36 percent thought that the best way would be for more Americans to carry guns for their own protection, and 18 percent weren’t sure.[1] By late-October/early-November 2015, about one-third (35 percent) thought that tighter laws would reduce all forms of shootings, while another third (35 percent) thought that tighter laws would have no effect, and almost a third (30 percent) weren’t sure. On the subject of “mass shootings,” however, Americans were clearer in their minds. Almost half (48 percent) thought that mass shootings can be stopped, while one-third (35 percent) thought that these events are “just a fact of life in America today.” That means that only one-sixth (17 percent) weren’t sure.[2] However, that was before the San Bernardino shootings[3] and President Obama’s ill-received speech seeking to reassure Americans. By mid-December 2015, 71 percent of Americans believed that both mass shootings and terrorist attacks have become a permanent part of American life.[4]

That is, the share of Americans who believe that mass shootings are just a fact of life more than doubled, moving from a minority to a majority position in about a month. It’s easy to see why they think so. About twice a day for the last twenty years, somebody has been killed in an act of workplace violence: 14,770 people between 1992 and 2012. Mostly, they were shot.[5] Between 2007 and the end of 2015, 29 people legally entitled to carry a concealed weapon committed “mass shootings.”[6] In the wake of the shooting incident at the Planned Parenthood site in Colorado Springs, CO, people started doing the math for the umpteenth time. Using the expansive definition of “mass shootings” (at least four people, including the gunman, are killed or wounded), there were 351 mass shootings from 1 January to 30 November 2015.[7] However, this isn’t what most people mean by “mass shootings.” Most people mean “somebody goes postal.” The expansive definition includes criminals who shot up everyone inside of or in front of a row-house in Bal’mer.[8]
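Two of the claims above reduce to simple arithmetic; a quick check, treating 1992-2012 as 21 calendar years (an assumption, since the source does not say whether the span is inclusive):

```python
# Workplace-violence deaths cited above
deaths = 14_770
years = 2012 - 1992 + 1           # 21 years, assuming an inclusive span
per_day = deaths / (years * 365)
print(f"{per_day:.1f} per day")   # prints "1.9 per day": about twice a day

# "Fact of life" share: 35 percent in early November, 71 percent by mid-December
print(71 / 35)                    # just over 2: "more than doubled"
```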

Similarly, in Fall 2015, almost half of Americans (46-48 percent) thought that stricter regulation of who could own a gun would reduce shootings by some uncertain amount, while just over a third (35-36 percent) thought that such restrictions wouldn’t be effective. The size of the uncertain group bounced around from 18 to 30 percent. However, the number of the uncertain rose as the issue was discussed in public. The increased size of the uncertain group came at the expense of the supporters of stricter gun laws.

In contrast, the numbers for those who favor carrying personal weapons for protection, who doubt the effectiveness of stricter gun control laws, and who believe mass shootings are just a fact of life are all the same at 35 percent. This matches up with the one-third of Americans who are estimated to own guns.

Gun control advocates are losing the debate. The more they talk, the more they lose. Is it time to re-think strategy and discourse?

[1] “Poll Watch,” The Week, 11 September 2015, p. 19.

[2] “Poll Watch,” The Week, 6 November 2015, p. 21.

[3] So far as I can tell, the NYT never referred to the recent attack in Paris as a “mass shooting.”

[4] “Poll Watch,” The Week, 25 December 2015, p. 21.

[5] “Noted,” The Week, 11 September 2015, p. 18.

[6] “Noted,” The Week, 6 November 2015, p. 20.

[7] “Noted,” The Week, 11 December 2015, p. 16.

[8] See: https://www.youtube.com/watch?v=R7DhFhzkjcA

The Golden Years.

Can “social progress” have negative consequences? The social security systems established after the Second World War rested on the assumption that many workers would pay a small tax to support a few retirees for a few years.[1] In Western countries, the number of retirees has risen from 14 per 100 people to 29 per 100 people, shifting the balance between tax-paying workers and benefit-receiving retirees. Furthermore, people are living longer. Just since 1970, the average period that people spend in retirement has increased by seven years. This has increased the costs of retirement borne by advanced societies. Between 1990 and 2011, public spending in this area increased from 6.2 percent of GDP to 7.9 percent. An aging population has more and more retirees and fewer and fewer workers to support them.
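The shift is easier to feel as potential workers per retiree; a rough sketch from the per-100 figures above (it assumes everyone who is not a retiree is a worker, which overstates the workforce but shows the trend):

```python
# Retirees per 100 people, before and after the shift cited above
before, after = 14, 29

# Non-retirees available to support each retiree
support_before = (100 - before) / before   # 86 / 14
support_after = (100 - after) / after      # 71 / 29
print(f"{support_before:.1f} -> {support_after:.1f}")  # prints "6.1 -> 2.4"
```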

There’s Social Security and then there are your personal savings. These are the two chief components of retirement income for most American workers. Social Security originally was not meant to be a national pension system. It was meant to insure the aged against a steady diet of cat-food noodle casseroles. Today, Social Security pays out 39 percent of the career-average earnings of a middle-income worker and 54 percent of the career-average earnings of a low-wage worker. If projected personal savings are added to projected Social Security benefits, then a low-wage worker could anticipate receiving 90 percent of his/her average lifetime wage. However, most low-wage workers don’t manage to save much. One study estimated that less than ten percent of the bottom 20 percent of retirees have any personal savings. Social Security pays only about 54 percent of these people’s average lifetime wage. Old age means a big fall in income.

The problems will get worse. Nominally, Social Security recipients are buffered by the Social Security “trust fund.” Even if we accept this fiction, the trust fund will eventually be exhausted. By 2035, Social Security will be paying only 27.5 percent of average career wages.

The slow-growth American economy will not make it easier to resolve these problems. It isn’t generating higher wages for most workers. The retirement of the “Baby Boom” generation is likely to create labor shortages that will drag on economic growth. While it seems to be accepted that many Americans who are doing some kind of physical labor will have a hard time adding more years to their careers, it is also likely that many people doing some kind of office work will see their intellectual abilities degrade in the same way.

So, what is to be done? One solution—popular among liberals, but poison among conservatives—is to raise the cap on Social Security withholding for higher income groups in order to re-distribute the income (and reduce the savings) of the well-paid and the provident. Another solution—popular among the “serious people” often derided by Paul Krugman—is to raise the retirement age in order to reduce spending while raising contributions. The discussion of these options is likely to be messy. Both sides are likely to frame the debate in moral, rather than practical, terms. The well-off will be portrayed as “greedy,” as “selfish,” and as not “needing” all that they have. The needy will be portrayed as “takers,” as “slackers” and as people wanting to manipulate the political system to escape the consequences of their own bad choices.

Less than a year out from a presidential election, it would be nice if the issue came up in a debate.

[1] Eduardo Porter, “An Aging Society Changes the Story About a Decline of Poverty for Retirees,” NYT, 23 December 2015.

Edjumication.

Is college worth the price? Oh absolutely! In the late 1970s, a college degree earned you about 25 percent more than did just a high-school diploma. In the late 1980s, a college degree earned you about 50 percent more than did just a high-school diploma. In 2000, a college degree earned you 70 percent more than did just a high-school diploma.[1]

On the other hand, if a student attends a college or university that is ranked in the bottom 25 percent of all colleges and universities, then they are likely—on average—to earn less than a high-school graduate who did not attend a college or university. But which are these schools? Google “ranking of colleges and universities” and the next thing you know, you’re in a morass. What I’m—defensively—guessing is that the list includes a lot of for-profit schools which the Federal government is intermittently dragging down a dirt road chained behind a pick-up truck.  Still, if you think about it, there are a bunch of schools where a BA earns you just as much as not having gone to college, and a bunch more schools where a BA earns you somewhat more than not having gone to college in the first place. All of this costs students and/or parents money.

Not everyone is a loser in this stupid game, but the success of some disguises the relative failure of many. Currently, the average annual income for college graduates ten years after crossing the stage to the cheers of family members is $35,000.[2] On the other hand, after the same span of time, Ivy League graduates average $70,000.[3] So, all you’ve got to do is work real hard in high-school, get into one of the Ivies, and you’re on Easy Street, right? Well, it turns out that “no, man, there’s games beyond the game,” as “Stringer” Bell advised Avon Barksdale on “The Wire.” For the top ten percent of Ivy League graduates, the average income ten years out is $200,000 a year. We’re talking about 31-32 year-olds here.

Why did the gap between a high school diploma and a BA open? Did the economy develop in a way that created an increased demand for whatever higher order intellectual skills and contextual knowledge one acquires in college? Did the economy develop in a way that eliminated well-paying jobs that did not require a college education? Did the high-schools decline as institutions of foundational learning, shifting the burden to colleges?

Well, yes. Back in April 2008, American high-schools trailed many other countries in their graduation rate. Norway (100 percent), Germany (99 percent), South Korea (96 percent), Russia (87 percent), and Italy (81 percent), among others, all out-performed the United States (75 percent).[4] In June 2015, many young Americans graduated from high-school. Their average composite SAT score was 1490 out of a possible 2400. That is the lowest level since 2005. The country has been pursuing essentially the same educational reform policy under different names (No Child Left Behind, Race to the Top) all those years. It has achieved nothing. Furthermore, the commonly-accepted bar for college readiness is 1550. OK, not everyone needs to go to college (regardless of what President Obama once hoped for) and it’s only an average. So, how many high-school graduates were ready for college? Of all students, 42 percent scored at least 1550. However, only 16 percent of African-Americans scored at least 1550.[5]

The problems are with the schools and with parenting. Sad—and rare—to say. Read to your kids. Let them see you reading. Praise hard work. You know, Puritanism.

[1] “Noted,” The Week, 14 January 2005, p. 14.

[2] When welders are making $100K a year.

[3] “Noted,” The Week, 25 September 2015, p. 16.

[4] “Noted,” The Week, 4 April 2008, p. 16.

[5] “Noted,” The Week, 18 September 2015, p. 16.

The New Economy.

Once upon a time, most American workers were essentially independent contractors: small farmers selling to the local market or craftsmen with their own shops. Then came the Industrial Revolution and massive immigration. Armies of semi-skilled employees replaced the independent contractors and petty entrepreneurs. Giant corporations arose to manage the mass-production industries. Much hand-wringing and teeth gnashing followed. Unions and government both stepped in to regulate the working time, working conditions, and pay of the industrial armies. Much hand-wringing and teeth gnashing followed. This economy flourished through the 1970s.

Then began the great disruption of the American economy. Foreign competition returned to the global market long dominated by Americans (1945-1975). The “oil shocks” (1973, 1979) set off severe inflation and pushed foreign car-makers toward fuel efficiency. American labor unions not only refused to adapt: they went on the offensive by launching a tidal wave of strikes intended to defend and expand their existing benefits. Companies responded by moving jobs to “right-to-work” states and overseas. Much hand-wringing and teeth gnashing followed.

Then, by 1991, Communism and the centrally-planned economy had been defeated. China and other socialist countries began a rapid shift toward open markets. Many American jobs disappeared overseas (although Americans were—short-sightedly—prone to blame NAFTA). Much hand-wringing and teeth gnashing followed. Thereafter, Americans struggled to find some new way of making an adequate living.

Then came the “Great Recession.” Today, about one-third of American workers work part-time, or as temp workers, or day by day. This, in my mind, has been one of the great economic and political preoccupations of the last twenty years.[1]

Uber, the ride-sharing service, and Airbnb, the home-sharing service, are often cited as the fore-runners of a new “sharing” economy. One element of Uber’s business plan has been to define Uber drivers as “independent contractors,” rather than as employees. The upside of this is great efficiency and flexibility for both Uber and its drivers, not to mention the savings on labor costs like benefits. For Uber, the drivers are doing piece-work; for the drivers, they get to structure work around other aspects of their lives by choosing when and how much they work.

On the other hand, it drives Democrats and their clients in the “old” industries crazy. Independent contractors have no right to unionize; they have no right to benefits; they aren’t subject to government regulation; they don’t get compensated for wait-time; they can work for two different companies; they are all profit-oriented, rather than submissive to the moral strictures of Democratic voters; and they’re entrepreneurial, rather than locked into a known and established institution.[2]

Probably, the goal should be to prevent the exploitation of independent contract labor, rather than to stifle economic change and innovation. This would require treating these workers as occupying some sort of middle ground. Social Security and Medicare withholding should apply, and they should be part of pools for health insurance. The “gig economy” should have to succeed on the strength of its business model, rather than by “screwing labor down to the lowest peg,” as was so often the case in early industrialization. At the same time, Washington shouldn’t try to create a Greek economy.

[1] Greg Ip, “New Rules for the Gig Economy,” WSJ, 10 December 2015.

[2] Alas, this litany of differences suggests that the “normal” American working conditions are unsustainable in a competitive global economy.