Single Payer.

The great achievement of the Affordable Care Act (ACA) has been to extend medical insurance to a large share of the previously uninsured.  The great failing of the ACA has been its inability to address the comparatively high cost of medical care in the United States.

Many foreign countries have single-payer systems.  They also pay a much smaller share of GDP for medical care.  The real issue here is that single-payer countries pay doctors and other medical care or service providers a lot less than does the United States.[1]  American physicians, nurses, hospitals, and pharmaceutical companies all make more than do their European or Japanese counterparts.  For example, British general practitioners make between $81,000 and $122,000 a year, while the average GP in America makes $208,000 a year.  Similarly, the pay differentials between GPs and specialists are much greater in the United States than elsewhere.  Then, some kinds of care are less available or unavailable elsewhere.  (For example, apparently European doctors don’t believe in allergies or depression.  Or try getting a single-patient room in a European hospital.)  One recent estimate holds that a single-payer system in the United States could bargain down prescription-drug prices to about 25 percent below the level that Medicare currently pays, itself already lower than what private insurers pay.  That suggests the scale of pharmaceutical company profits in America compared to other countries.  To get the cost of American medicine down to Western European or Japanese levels, the incomes of all these providers would have to be compressed.

One economic argument against the Sanders plan is that he proposes to extend coverage beyond the core insurance provided by the Affordable Care Act to include dental care and long-term nursing care.  These would run up the overall cost of the program.  At the same time, Sanders would do away with premiums and out-of-pocket costs.  Together, a huge cost increase.

However, the “real” economic argument against the Sanders plan is that it would require a massive tax increase to pay for government-provided universal health care.  This is a misleading argument.  Americans already pay for medical care through premiums, out-of-pocket costs, and lower wages in exchange for health insurance.  They already pay for dental care and for long-term nursing care.  They just do it as private individuals.  (Employer-provided insurance is just an employment benefit that should be taxed like other forms of income.)

The Sanders plan would eliminate private health insurance companies; it would force down the incomes of hospitals and drug manufacturers; it would compress the incomes of doctors and nurses.  None of these institutions or individuals is much inclined to give up the current income structure.  There would be enormous push-back.  Along these lines, Paul Krugman has argued that the U.S. should not try for a single-payer system.  A Medicare-for-all system would require tax increases on the middle class, rather than just on the wealthy.  People with employer-provided health-care would be forced into an inferior system, to their distress.  Republicans would never go for it.  So it is politically impossible.[2]  However, that’s a political argument against the Sanders plan.  It isn’t an economic argument.

There is a lot to be said for this argument from a political realist perspective. Real power doesn’t just reside on Wall Street.  It also resides in suits, white coats over scrubs, and in flowery smocks and Crocs.  Neither Democrats nor Republicans want to bell the cat.  Bernie Sanders’s pull with young people means that the issue isn’t going away when he does.

[1] Margot Sanger-Katz, “Why a Single-Payer Plan Would Still Be Really Costly,” NYT, 17 May 2016.

[2] “Single-payer health care is a pipe dream,” The Week, 29 January 2016, p. 12.

It ain’t necessarily so 3.

Poverty-induced hunger used to be a grave problem in America.  Michael Harrington’s The Other America: Poverty in the United States (1962) documented hunger among America’s large population of poor people, as well as many other ills.  One response appeared in the Food Stamp Act (1964).  Over the last fifty years, this program has greatly reduced hunger among poor Americans.  Today, less than 1 percent of households worry about having enough to eat or go without adequate food on a daily basis.  It is a remarkable success story about government’s ability to solve problems.

However, bureaucracies and advocacy groups foster mission-creep.  Backed by advocacy groups created at an earlier time, the Department of Agriculture (USDA) has moved from reducing widespread real hunger to grappling with “food insecurity” and poor diet choices.

The USDA asserts that 14 percent of households are “food insecure” and another 5.6 percent have “very low food security.”  Not so fast, say critics.  It has been demonstrated for several decades now that average intakes of nutrients are similar for children living in poverty and children not living in poverty, and between black and white children.[1]  The real problem is obesity, not “food insecurity.”  Currently, 38 percent of all Americans, 42 percent of Hispanic-Americans, and 48 percent of African-Americans are obese.[2]

Similarly, “food desert” became a term much in vogue five years ago.[3]  According to the USDA, in urban areas it is a place where at least 33 percent of the population lives at least a mile from a supermarket and at least 20 percent live below the poverty line.  In rural areas, the distance to qualify is ten miles.  This isn’t just a convenience issue in the eyes of the Obama Administration.  It is also a health issue.  Lack of access to fresh foods drives people to rely on processed foods and fast foods.  A steady diet of Big Macs and soda leads to obesity, with a host of medical complications.

Not so fast, say critics.  First of all, 93 percent of the people who live in these supposed “food deserts” have access to a car.  In addition, in cities there is public transportation.  Saying a grocery store is a mile or more away is meaningless.

Second, there are five fast-food outlets for every supermarket for a reason.  That reason is market demand.  Many people prefer fast foods and processed foods to fresh foods even when they have a choice.  Fast foods cost less than do fresh foods.  Fast foods are full of fat, salt, and sugar, so they taste better than do fresh foods.  Fast foods and a lot of the stuff sold in convenience stores don’t require any higher-order cooking skills than the ability to work a microwave.

Well, is there a way to stop people from doing what they want, when what they want is bad for them?  It’s difficult.  In 2010, New York mayor Michael Bloomberg tried to have soda dropped from the “foods” eligible for purchase with food stamps.  Advocacy groups and minority communities pushed back.  The USDA rejected Bloomberg’s idea.  Los Angeles gave it a try by using zoning laws to restrict the number of fast-food outlets in poorer parts of the city.  The question is whether the restrictions have just displaced fast-food consumers from their home “food desert” to some other area where no such restrictions apply or if they have just led to longer lines at existing fast-food outlets.

Sometimes solving one problem can lead to other problems, even imaginary ones.

[1] Robert Paarlberg, “Obesity: The New Hunger,” WSJ, 11 May 2016.

[2] Paarlberg, “Obesity.”

[3] “America’s ‘food deserts’,” The Week, 19-26 August 2011, p. 11.

It ain’t necessarily so 2.

The current presidential candidates are selling snake oil when it comes to the economy.[1]

“The economy is rigged”—Bernie Sanders, with Hillary Clinton yapping along behind.  The economy is “rigged” only in the sense that economic change has assigned a lot of value to skills and education, and virtually no value to just showing up.  In a period of economic transformation, a modern economy shifts resources from low-productivity sectors to higher-productivity sectors.  All those in the skilled and educated sectors profit, while those in the less-educated and less-skilled sectors lose.  That isn’t the same as saying—as Sanders and Clinton imply—that a cabal of Wall Street bankers is making all the decisions for the nation at large.

“We don’t make things anymore”—Donald Trump.  In fact, it depends on what the meaning of “we” is.  On the one hand, the total value of goods manufactured in the United States is at its highest level ever, almost 50 percent higher than in the late 1990s.[2]  On the other hand, employment in manufacturing has declined by 29 percent over the same period.  Under the pressure of foreign competition, productivity in manufacturing has increased through new technological innovations.  Indeed, not so long ago, it was the competition from highly productive American manufacturing that forced adaptation on foreign countries.

“I do not believe in unfettered free trade”—Bernie Sanders.  The North American Free Trade Agreement (NAFTA) and the proposed Trans-Pacific Partnership (TPP) have been condemned for exporting jobs to developing countries.  In fact, most academic economists—highly astute people on the left—believe that the evidence shows that free trade has been good for the United States.  It has destroyed some jobs, but it has created many others.  Job loss at big, old-fashioned firms is easier for the media to document than is job-creation at many small firms.

“I want to make sure the wealthy pay their fair share, which they have not been doing”—Hillary Clinton.[3]  While exceptions exist, the Congressional Budget Office (CBO) reports that the fabled top “one percent” of earners pays at a rate of 33 percent, while the middle three-fifths of earners pay at an average rate of 13 percent.[4]

“The Laffer curve.  HA!”—me.  Republicans promise that big tax cuts will lead to robust economic growth.  The Mellon Plan of the 1920s and JFK’s tax cut of 1963 seem to bear out this claim.  However, the Reagan and Bush II tax cuts did not stimulate much economic growth.[5]  Still, “tax cuts lead to growth” has become a Republican mantra.  Actually, the amount of growth from tax cuts is very uncertain.  What is certain is the impact of further tax cuts on the deficit.  Tax cuts will produce bigger deficits.  According to one estimate, Donald Trump’s tax plan would reduce federal revenue by 29 percent.

What Bernie Sanders, Donald Trump, and Hillary Clinton are trying to say is that Americans have become uncomfortable with adapting to change and competition.  That is easy to understand.  From 1945 to the 1970s, the American economy led the world.  Americans got used to high incomes from less work.  Then, the rest of the world caught up.  Sometimes this came in the form of better quality goods; sometimes in the form of lower prices.  Now it’s up to us to learn how to compete again.

[1] Gregory Mankiw, “The Economy Is Rigged, And Other Campaign Myths,” NYT, 8 May 2016.

[2] That is, during the now longed-for “golden years” of the Clinton administration.

[3] Asked to define “fair” taxes on the upper 40 percent of earners, my beloved sister-in-law says, “well, more.”

[4] http://www.pewresearch.org/fact-tank/2016/04/13/high-income-americans-pay-most-income-taxes-but-enough-to-be-fair/

[5] To be fair, the Reagan administration also had to wring out a lot of inflation by slamming the brakes on money creation.  This led to high interest rates, slow growth, and high unemployment.

A lovely day in the neighborhood.

Social scientists contend that the location in which a child grows up correlates with their adult fate.[1]  On the one hand, there is adult income.[2]  One experiment that ran from 1994 to 1998 offered people living in public housing the opportunity to enter a lottery.[3]  Winners in the lottery received vouchers to help pay the rent if they moved to other areas.  The children of lottery winners (if they moved early enough) far outpaced the children of losers in subsequent earnings.[4]

The sequential demolition of the vast Robert Taylor Homes in Chicago between 1995 and 1998 displaced both those who did want to move and those who did not want to move.  All had to go and all received housing vouchers.  Comparing those who moved—willingly or unwillingly—with those who remained behind, economists have found that a) those who moved were 9 percent more likely to be employed than those who remained behind; and b) they earned 16 percent more than those who remained behind.

Then there is life-span.[5]  Rich people have lived longer than poor people for quite a while.  At the start of this century the average billionaire lived 12 years longer than the average street-person.  Today the gap has widened to 15 years.  Social scientists (and, for all I know, anti-social scientists or just the John Frink, Jr.s of this world) have documented a very uneven distribution of those extra years among poor people.  The poor in some places live almost as long as the rich, but they die young in other places.  On average, poor men in New York City live for 79.5 years; poor men in Gary, Indiana live for only 74.2 years.

The studies suggest that altering the habits and attitudes of poor people in the blighted areas could extend lives.  First of all, in the housing-voucher lottery, only one-fourth of the people who were offered the chance to join the lottery did so.  Those who did apply have been characterized as “particularly motivated to protect their children from the negative effects of a bad neighborhood.”  This means that three-quarters of the people offered the chance to join the lottery were not “particularly motivated to protect their children.”

Then, moving to a better neighborhood increased the likelihood of being employed by only 9 percent.  That’s better than nothing, but it isn’t much of a bump.  Moving to a better neighborhood increased lifetime earnings by 16 percent.  How much is that in dollar figures?  It’s $45,000.  Spread over a possible 40-year working life, that’s $1,125 a year and about $0.55 per hour.  Is it worthwhile for a family to leave behind everyone they know, and a “system” that they know how to navigate, for this kind of money?
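For readers who want to check the per-year and per-hour figures, a quick back-of-the-envelope calculation reproduces them; the 2,000-hour work year (50 weeks at 40 hours) is an assumption, not a figure from the studies.

```python
# Back-of-the-envelope check of the lifetime-earnings gain cited above.
lifetime_gain = 45_000    # estimated extra lifetime earnings, in dollars
working_years = 40        # possible working life assumed in the text
hours_per_year = 2_000    # assumption: 50 weeks x 40 hours

per_year = lifetime_gain / working_years
per_hour = per_year / hours_per_year

print(f"${per_year:,.0f} per year, ${per_hour:.2f} per hour")
```

The result, $1,125 a year and roughly $0.56 an hour, matches the “about $0.55 per hour” figure in the text.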

Second, the rich live in healthier ways than do some poor people.  They eat better, they exercise more, they are less likely to be obese, they usually don’t smoke, and they are unlikely to use opioids.  Even demanding, stressful jobs don’t leave them feeling more stressed than poor people feel.  Poor people often eat a poor diet, smoke, and don’t exercise (it’s hard running 5 miles if you’re a smoker).  Diet propaganda, parenting education, anti-smoking campaigns, and adult exercise programs could make a big difference.

To an uncertain extent then, poverty is volitional, a choice.  See: Juan Williams.

[1] That raises a question: does the neighborhood itself cause this effect or do people with other characteristics and experiences just end up in certain kinds of neighborhoods?

[2] Given social class segregation, it isn’t readily apparent why this isn’t the same as saying that the social class in which a child grows up has a large effect on their adult income.  Maybe it’s just NewSpeak.

[3] Justin Wolfers, “Bad Neighborhoods Do More Harm Than We Thought,” NYT, 27 March 2016.

[4] However, another experiment found virtually no difference in outcomes between winners and losers.

[5] Neil Irwin and Quoctrung Bui, “Where the Poor Live in America May Help Determine Life Span,” NYT, 11 April 2016.

Edjumication 2.

The Wall Street Journal ran this interesting—and terrifying if you give a rip about our country—story.[1]  Back in 2012, the Organization for Economic Cooperation and Development (OECD) ran a big survey of a lot of member states.  The International Assessment of Adult Competency tried to figure out how well different advanced countries do at “problem-solving in technology rich environments” (AKA “using digital technology to perform practical tasks”).  The U.S. came last among 18 advanced countries.

Japan, Finland, Sweden, and Norway headed the list.[2]  Poland came 17th, just ahead of the U.S.  (On the other hand, Poles have a tremendous work ethic that has made them deeply unpopular in much of Western Europe.[3]  In contrast, car thieves in the U.S. will not steal American cars made in the 1970s and 1980s because the cars are garbage as the result of poor workmanship.  Foreign cars, like a Honda or a Mercedes?  That’s a different matter.[4])

Why is this?  A Harvard B-School professor opined that “when you look at this data it suggests the trends we’ve discerned over the last twenty years are continuing and if anything they are gaining momentum.”  What are those trends? American workers demonstrate “flagging literacy and numeracy skills, which are the fundamental skills needed to score well on the survey.”  Many Americans have a lot of trouble with any kind of math problem.

Why does this matter?  It matters because most middle-class jobs in the future will require numeracy and literacy skills.  What we think of as “manufacturing” jobs, for example, are simple, repetitive, boring jobs on an assembly line.  The substitution of machines for manpower, through management and investment, allowed both high wages and high profits.  The rise of cheap labor in Asian economies entering the global market since the collapse of Communism has destroyed those jobs.  American manufacturers have adapted by introducing far more mechanization and computers.  Future manufacturing in the U.S. will involve far fewer workers with far greater skills.[5]

It isn’t just blue-collar workers who are “in a queer street.”[6]  For those aged 16 to 34, the study found that “even workers with college degrees and graduate or professional degrees don’t stack up favorably against their international peers.”  So, taking on a lot of debt to get a college degree in order to gain some safety isn’t necessarily a wise move.

What are the sources of our malaise?  Without any doubt, they are many.  However, perhaps one of them is “cultural,” rather than institutional.  “This is the only country in the world where it is acceptable to say ‘I’m not good at math,’” said one observer.  The same is probably true for reading.[7]  One measure: is there a “no shush” rule posted in your local library?

Perhaps there is something to be said for a reassertion of traditional values.

[1] Douglas Belkin, “U.S. Ranks Last in Tech Problem Solving,” WSJ, 10 March 2016.

[2] OK, but when is the last time you saw a Scandinavian block-buster movie about a crime-stopping hero in a spandex suit?  No, Scandinavian crime-stopper movies are full of aging, morose alcoholics and enraged victims of sexual abuse.  So there!

[3] See: https://en.wikipedia.org/wiki/Polish_Plumber  Or talk to people who prefer cheap, high-quality, readily-available Polish workers to the lay-abouts who make up much of the French and British labor force.

[4] See: http://www.nytimes.com/2014/08/12/upshot/heres-why-stealing-cars-went-out-of-fashion.html?_r=0

[5] See: http://www.nytimes.com/2013/09/20/business/us-textile-factories-return.html?pagewanted=all

[6] It isn’t a sexual-orientation reference.  In Evelyn Waugh’s Brideshead Revisited, the protagonist’s father recalls—during a dinner-table conversation about the son’s poor job prospects—that his Uncle Malachi got into “a queer street.”  As a result, “He had to go to Australia.  Before the mast.”

[7] It is difficult to nail down just how many books the average American reads in a year or owns.  However, some research backs up intuition.  See: https://www.sciencedaily.com/releases/2010/05/100520213116.htm

Explaining Bernie Sanders—and Perhaps Donald Trump.

Two-thirds of Americans believe that at least one presidential candidate in the current crop would make a good president. Most (75 percent) Republicans believe that Donald Trump could win a general election—even though only about half of Republicans want him as their candidate. Virtually all (83 percent) Democrats believed that Hillary Clinton could win election—before Bernie Sanders ran even with Clinton in Iowa and then torched her in New Hampshire. Among the less-favored candidates are Ted Cruz (60 percent of Republicans); Marco Rubio (55 percent of Republicans); and Bernie Sanders (54 percent of Democrats).[1]

In theory, Hillary Clinton wipes the floor with the leading Republican candidates when it comes to dealing with terrorism. Americans preferred her to Donald Trump (50-42), Marco Rubio (47-43), and even Jeb Bush (46-43).[2] On the other hand, that means that 43 percent of Americans want anyone-but-Hillary, no matter how clownish or inexperienced, to deal with terrorism. Is it the same for other issues? If it is, then she has remarkably high negatives for someone running for president. Still, so did Richard Nixon. Oh. Wait.

On the other hand, Independents fail to share this enthusiasm. Only 58 percent of them believe that there is anyone who would make a good president. (If Independents sit out in large numbers, then that might leave the November 2016 election in the hands of party regulars.)

Why are Americans so rabid for anti-establishment candidates?

In 2003, the net worth of the average American was $87,992. In 2013, it was $56,335. That amounts to a 36 percent fall in net worth, before allowing for nugatory inflation.[3] On the other hand, the net worth of the top five percent of earners increased by 14 percent over roughly the same period (2003-2014).[4]
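The 36 percent figure follows directly from the two net-worth numbers quoted above:

```python
# Check of the decline in average net worth cited above (nominal dollars).
worth_2003 = 87_992
worth_2013 = 56_335

decline = (worth_2003 - worth_2013) / worth_2003
print(f"{decline:.0%} fall in net worth")
```

The computed decline is just under 36 percent (about 35.98 percent), so the rounded figure in the text checks out.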

About one-third of Americans have no savings accounts at all.[5] Twenty percent of people aged 55 to 64 have no retirement savings. Almost half (45 percent) of people surveyed expected to live on whatever Social Security paid them.[6] Almost half (44 percent) of Americans don’t have an “emergency fund” to cover basic expenses for three months. Almost half (43 percent) of American workers would be willing to take a pay cut IF their employer would increase the contribution to the 401(k) retirement savings plan.[7] In August 2014, about 77 million Americans had a debt “in collection.” The median amount owed is $1,350.[8] That’s not a lot of money. Unless you don’t have it.

If the “Great Recession” had not occurred, then college graduates entering the job market might have expected salaries 19 percent higher. The “normal” penalty for graduating in a recession is about 10 percent.[9] The recent unpleasantness has been unusually unpleasant. Also, state aid to public colleges has fallen during the recession. That means that students have been graduating with much larger debt loads than previously. They have to service those debts out of smaller starting salaries.

People hiring employees tend to favor those who are narcissistic over the humble.[10] Apparently, they are right to do so. “Narcissistic” CEOs make an average of $512 million more over their careers than do those who are not.[11] Will it be the same for voters? Hard to think of anyone more narcissistic than the Clintons. Unless it is Donald Trump.

[1] “Poll Watch,” The Week, 5 February 2016, p. 19.

[2] “Poll Watch,” The Week, 4 December 2015, p. 19.

[3] “Noted,” The Week, 8 August 2014, p. 14.

[4] “Noted,” The Week, 8 August 2014, p. 14.

[5] “The bottom line,” The Week, 15 February 2013, p. 32.

[6] “Noted,” The Week, 22 August 2014, p. 16.

[7] “The bottom line,” The Week, 22 August 2014, p. 32.

[8] “The bottom line,” The Week, 15 August 2014, p. 31.

[9] “The bottom line,” The Week, 1 August 2014, p. 31.

[10] “The bottom line,” The Week, 27 June 2014, p. 32.

[11] “The bottom line,” The Week, 1 August 2014, p. 31.

The Gracchi.

According to the guiding theory of the Democratic Party, a big government that robs the rich to give to the poor should be a permanent winner in electoral politics.[1] It isn’t, in spite of continual Democratic efforts to paint the Republicans as Rich Swells and the mere creatures of Big Business. Eduardo Porter conjectures that working-class whites assign greater importance to “racial, ethnic, and cultural identity,” than to “economic status.”[2]

In December 2015, one poll reported that Hillary Clinton would beat Donald Trump in a general election—if only college-educated people voted. On the other hand, Trump would beat Clinton if only people without a college education voted.

Porter is at pains to argue that, while Trump also polls ahead of his Republican rivals with women and upper-income voters, his main base is “less-educated, lower-income white men.”[3] He argues that white, working-class voters (especially men) are “nostalgic for the country they lived in 50 years ago.”[4] These people—he doesn’t quite say “those people”—“would rather not have a robust government if it largely seems to serve people who do not look like them.” While 62 percent of white Americans would prefer a smaller, less providential government,[5] only 32 percent of blacks and 26 percent of Hispanics desire that end. As a result, America could experience “an outright political war along racial and ethnic lines over the distribution of resources and opportunities.” Actually, it isn’t that clear. Whites account for 62 percent of the population, while African-Americans account for 13.2 percent and Hispanics account for 17.1 percent. Taken together, the supporters of a smaller government total better than 46 percent of the population. That’s a big constituency that spans racial lines.
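The “better than 46 percent” claim can be verified by weighting each group’s stated preference by its population share, using only the percentages quoted above (the small remainder of the population in other groups is left out of the count):

```python
# Weighted share of the population preferring a smaller government,
# computed from the poll and census percentages cited above.
groups = {
    # group: (population share, share preferring smaller government)
    "white":    (0.620, 0.62),
    "black":    (0.132, 0.32),
    "hispanic": (0.171, 0.26),
}

total = sum(pop * pref for pop, pref in groups.values())
print(f"{total:.1%} of the population")
```

The weighted total comes to about 47.1 percent, which is indeed “better than 46 percent.”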

Porter confuses other issues as well. He approvingly quotes one scholarly paper that argues that “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” For one thing, Trump has not attacked blacks, to the best of my knowledge. Indeed, he has sought the support of traditional leadership figures in black communities. Trump’s white, working-class base agrees with the candidate’s policies on building a wall along the border with Mexico; deporting illegal immigrants, virtually all of whom are Hispanic-Mexicans; and registering Muslims as potential terrorists. All these can be read as expressions of concern about the loss of jobs to foreign competition, the open flouting of the rule of law, and security in an age of terrorism. For another thing, while Porter accepts that people can have predominant non-economic concerns, he ignores the chance that people are ideologically opposed to welfare dependency. In his account, something else must be driving them, and that “something” appears to be race, as in “racism.” Again, however, Porter turns a blind eye to long-standing traditions of self-reliance as an American virtue.

America’s economy, society, and place in the world have all changed in ways that most people do not like. Democrats and Republicans are both nostalgic and they offer policies aiming at “restoration.” We need something better.

[1] “Tax, spend, elect” is one version of a motto attributed to President Franklin D. Roosevelt’s advisor Harry Hopkins. The authenticity of the phrase is disputed. See: https://en.wikipedia.org/wiki/Tax_and_spend

[2] Eduardo Porter, “Racial Identity Returns to American Politics,” NYT, 6 January 2016. This is a variant of what Marxists term “false consciousness.” People think that they belong to a different social class than they actually do, so they behave in the wrong fashion. In this case, people assign less importance to “economic status” than liberals think that they should.

[3] That is, the foundation of the New Deal coalition and of its successors until the Seventies. Now much despised.

[4] President Obama and other Democrats have been talking about restoring the middle class to its former prosperity. Why isn’t that “nostalgia”?

[5] This group spills well outside of the Republican constituency, let alone the Trump constituency.

Affirmative Action.

Between 1940 and 1965 the Democratic Party slowly shifted from relying on anti-black racism to a forthright advocacy of “equality as a fact and as a result.”[1] Since the end of the Civil War, opportunities for African-Americans within the modest federal government had bounced around, with the appalling Woodrow Wilson doing much to roll back advances made under his Republican predecessors. However, after 1932 the dramatic expansion of the size of the federal government and the turn to employing private contractors to execute its will created new conditions. In 1965, President Lyndon Johnson signed an executive order that required government contractors to identify and eliminate obstacles to the employment of minorities (by which Johnson meant African-Americans). This basic commitment to justice swiftly became the consensus in American politics. In 1969, President Richard Nixon issued his own executive order that required contractors to hire so as to reflect the racial composition of their area. Many states followed the lead given by the federal government. Ten years of expanding affirmative action initiatives followed.

For every action, there is an equal and opposite reaction.[2] For one thing, there were people who saw “affirmative action” as “reverse discrimination.” If one kind of discrimination is wrong, then all kinds are wrong. So, there was a principled opposition to affirmative action. For another thing, affirmative action disrupted and devalued a well-established system of apportioning opportunity.[3] At all levels of American society, some people get things because of patronage or connections. That’s true of “legacy” admissions to Ivy League universities; it’s true of family firms; and it’s true of union hiring halls. Increasing minority representation gored somebody’s ox in many of these cases. For yet another thing, some people are racists. They assumed that African-Americans were innately less capable than were whites. For liberals of this stripe, inferiority meant that African-Americans needed to be protected and guided by an expanded state, rather than left to their own devices. For conservatives of this stripe, inferiority meant that nothing achieved by any African-American came by way of merit, but only by manipulation.

In 1975 Allan Bakke, denied admission to medical school at the University of California at Davis, sued. In 1978 the US Supreme Court found for Bakke, rejecting the use of quotas to apportion opportunity. The case touched a nerve among conservatives in particular. In 1980, former California governor Ronald Reagan won the presidential election. His administration then moved to curtail the affirmative action requirements imposed on federal contractors.

The American system of federalism means that the policy of the federal government is not necessarily the policy of the individual states. Hence, a sustained effort has been made to persuade the Supreme Court that affirmative action is unconstitutional. In 2003, without much enthusiasm, the Supreme Court upheld the basic constitutionality of affirmative action. It’s easy to find people who feel wronged by affirmative action. So, it’s still on the docket.

[1] “The origins of affirmative action,” The Week, 28 June 2013, p. 9. Between 1865 and 1965 much of the Democratic voter base consisted of Southern whites, who upheld the system of “Jim Crow.” Indeed, it seems likely that at various points in its history, every single member of the Ku Klux Klan was a Democrat. Dis-franchised Southern blacks were nevertheless counted for the purposes of apportioning representatives just as if they had the right to vote. This inflated Democratic numbers in the House of Representatives and in the Electoral College.

[2] Isaac Newton, Third Law of Motion. But maybe not, at least not in politics, society, and the economy. Otherwise we’d be stuck in the same place for millennia. This shows the perils of applying the lessons of physics to the less reliable world of human activity. So does the Reign of Terror in the French Revolution. But I digress.

[3] Indeed, that was the idea.

The Golden Years.

Can “social progress” have negative consequences? The social security systems established after the Second World War rested on the assumption that many workers would pay a small tax to support a few retirees for a few years.[1] In Western countries, the number of retirees per 100 working-age people has roughly doubled, from 14 to 29. Put differently, the balance has shifted from about seven tax-paying workers for every benefit-receiving retiree to fewer than four. Furthermore, people are living longer. Just since 1970, the average period that people spend in retirement has increased by seven years. This has increased the costs of retirement borne by advanced societies. Between 1990 and 2011, public spending in this area increased from 6.2 percent of GDP to 7.9 percent. An aging population has more and more retirees and fewer and fewer workers to support them.

There’s Social Security and then there are your personal savings. These are the two chief components of retirement income for most American workers. Social Security originally was not meant to be a national pension system. It was meant to insure the aged against a steady diet of cat-food noodle casseroles. Today, Social Security pays out 39 percent of the career-average earnings of middle-income workers and 54 percent of the career-average earnings of low-wage workers. If projected personal savings are added to projected Social Security benefits, then a low-wage worker could anticipate receiving 90 percent of his/her average lifetime wage. However, most low-wage workers don’t manage to save much. One study estimated that less than ten percent of the bottom 20 percent of retirees have any personal savings. For them, Social Security alone pays only about 54 percent of their average lifetime wage. Old age means a big fall in income.
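The replacement-rate arithmetic above can be sketched in a few lines. The percentages (54 percent from Social Security, 90 percent with projected savings) come from the paragraph; the $30,000 career-average wage is a purely illustrative assumption, not a figure from the source.

```python
def replacement_income(career_avg_wage, ss_pct, savings_pct=0):
    """Annual retirement income: Social Security plus personal savings,
    each expressed as a whole-number percentage of career-average earnings.
    Integer math keeps the illustration exact."""
    return career_avg_wage * (ss_pct + savings_pct) // 100

low_wage = 30_000  # hypothetical career-average wage for a low-wage worker

# With projected savings: 54% Social Security + 36% savings = 90% replacement.
print(replacement_income(low_wage, 54, 36))  # 27000

# The common case for the bottom quintile: no savings, Social Security only.
print(replacement_income(low_wage, 54))      # 16200
```

The gap between those two numbers is the "big fall in income" the paragraph describes: the worker who cannot save retires on barely half of an already low wage.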

The problems will get worse. Nominally, Social Security recipients are buffered by the Social Security “trust fund.” Even if we accept this fiction, the trust fund will eventually be exhausted. By 2035, Social Security will be paying only 27.5 percent of average career wages.

The slow-growth American economy will not make it easier to resolve these problems. It isn’t generating higher wages for most workers. The retirement of the “Baby Boom” generation is likely to create labor shortages that will drag on economic growth. It seems widely accepted that many Americans doing physical labor will have a hard time adding more years to their careers; it is just as likely that many people doing office work will see their mental abilities decline in the same way.

So, what is to be done? One solution—popular among liberals, but poison among conservatives—is to raise the cap on Social Security withholding for higher income groups in order to re-distribute the income (and reduce the savings) of the well-paid and the provident. Another solution—popular among the “serious people” often derided by Paul Krugman—is to raise the retirement age in order to reduce spending while raising contributions. The discussion of these options is likely to be messy. Both sides are likely to frame the debate in moral, rather than practical, terms. The well-off will be portrayed as “greedy,” as “selfish,” and as not “needing” all that they have. The needy will be portrayed as “takers,” as “slackers” and as people wanting to manipulate the political system to escape the consequences of their own bad choices.

Less than a year out from a presidential election, it would be nice if the issue came up in a debate.

[1] Eduardo Porter, “An Aging Society Changes the Story About a Decline of Poverty for Retirees,” NYT, 23 December 2015.

Edjumication.

Is college worth the price? Oh absolutely! In the late 1970s, a college degree earned you about 25 percent more than did just a high-school diploma. In the late 1980s, a college degree earned you about 50 percent more than did just a high-school diploma. In 2000, a college degree earned you 70 percent more than did just a high-school diploma.[1]

On the other hand, if a student attends a college or university that is ranked in the bottom 25 percent of all colleges and universities, then they are likely—on average—to earn less than a high-school graduate who did not attend a college or university. But which are these schools? Google “ranking of colleges and universities” and the next thing you know, you’re in a morass. What I’m—defensively—guessing is that the list includes a lot of for-profit schools which the Federal government is intermittently dragging down a dirt road chained behind a pick-up truck.  Still, if you think about it, there are a bunch of schools where a BA earns you just as much as not having gone to college, and a bunch more schools where a BA earns you somewhat more than not having gone to college in the first place. All of this costs students and/or parents money.

Not everyone is a loser in this stupid game, but the success of some disguises the relative failure of many. Currently, the average annual income for college graduates ten years after crossing the stage to the cheers of family members is $35,000.[2] On the other hand, after the same span of time, Ivy League graduates average $70,000.[3] So, all you’ve got to do is work real hard in high-school, get into one of the Ivies, and you’re on Easy Street, right? Well, it turns out that “no, man, there’s games beyond the game,” as “Stringer” Bell advised Avon Barksdale on “The Wire.” For the top ten percent of Ivy League graduates, the average income ten years out is $200,000 a year. We’re talking about 31-32 year-olds here.

Why did the gap between a high school diploma and a BA open? Did the economy develop in a way that created an increased demand for whatever higher order intellectual skills and contextual knowledge one acquires in college? Did the economy develop in a way that eliminated well-paying jobs that did not require a college education? Did the high-schools decline as institutions of foundational learning, shifting the burden to colleges?

Well, yes. Back in April 2008, American high-schools trailed many other countries in their graduation rate. Norway (100 percent), Germany (99 percent), South Korea (96 percent), Russia (87 percent), and Italy (81 percent), among others, all out-performed the United States (75 percent).[4] The high-school class of 2015 posted an average composite SAT score of 1490 out of a possible 2400, the lowest level since 2005. The country has been pursuing essentially the same educational reform policy under different names (No Child Left Behind, Race to the Top) all those years. It has achieved nothing. Furthermore, the commonly-accepted bar for college readiness is 1550. OK, not everyone needs to go to college (regardless of what President Obama once hoped for) and it’s only an average. So, how many high-school graduates were ready for college? Of all students, 42 percent scored at least 1550. However, only 16 percent of African-Americans scored at least 1550.[5]

The problems are with the schools and with parenting. Sad—and rare—to say. Read to your kids. Let them see you reading. Praise hard work. You know, Puritanism.

[1] “Noted,” The Week, 14 January 2005, p. 14.

[2] When welders are making $100K a year.

[3] “Noted,” The Week, 25 September 2015, p. 16.

[4] “Noted,” The Week, 4 April 2008, p. 16.

[5] “Noted,” The Week, 18 September 2015, p. 16.