Inequality 1.

In December 2013 President Obama called income inequality “the defining issue of our time.” He’s agin it. Soon afterward, Thomas Piketty’s Capital in the Twenty-First Century (2014) garnered many accolades and some readers. This added academic fuel to the populist fires.

Already by the summer of 2014, however, there were reasons to doubt the substance behind the passions aroused by the issue. Eduardo Porter raised two points.[1] First, the problem of income inequality isn’t that important compared to other problems facing the United States. Social scientists have been trying to demonstrate that the rise of the “One Percent” has harmed society. They haven’t been able to prove it. Second, it may not be a problem with a practical solution.

What we think of as “globalization” (technology + open world markets) has polarized people toward the extremes of income: high earners and low earners, but fewer and fewer people in between. The relationship between one’s job and technology is key. Someone who has a job that is not easily replaced by a machine, but which requires the manipulation of technology, is in a good place. In contrast, anyone with a job that can—or one day could—be done by a machine is in a bad place.[2] Generally, higher incomes are flowing toward people with higher education.[3] That’s true both within the United States and within the global economy. From this perspective, the “defining issue” is how to promote enough economic growth to ensure a rising standard of living for the low earners. Gregory Mankiw, a Harvard economist who served as an economic advisor to both George W. Bush and Mitt Romney, argues that raising the educational level of American workers offers the best path to higher incomes.

Seen dispassionately, the best solution would be to help the people at the bottom of the income ladder without preventing the people at the top of the income ladder from doing the stuff that generates income for all. Allowing the gains from growth to flow only to those at the top of the income pyramid will not head off political trouble. More could be done to take the rough edges off the state in which we find ourselves. For one thing, cuts in public aid to state colleges and universities have shifted a heavier burden onto parents and students seeking the higher education that is supposed to allow them to climb out of the pit. Restoring that aid would be a valuable step. Increasing the Earned Income Tax Credit is another. Developing policies to make the urban cores of dynamic cities affordable to low-income workers is a third. Still, reducing inequality by higher taxes on the well-off and an ever more generous social welfare system[4] cannot turn back the tsunami of change.

However, most engaged people aren’t seeing the issue dispassionately. Both political parties have a stake in stirring up passions by misrepresenting the realities. The Right sees President Obama as an anti-business zealot. The Left sees Republicans as pawns of corporations.

How long will it take to build a better-educated workforce? In the meantime, how does the country manage the social costs of the transition?

[1] Eduardo Porter, “Income Inequality And the Ills Behind It,” NYT, 30 July 2014.

[2] A 2012 poll of economists showed that the great majority believed that the uneven impact of technological change best explained the rise of income inequality. Reagan, Bush II, “deregulation,” and the other usual suspects didn’t figure.

[3] Scholars have compared the college graduation rates for those born in the early 1960s with those for people born in 1979-1982. The rate for people in the top 20 percent of incomes rose from 36 percent to 54 percent, while the rate for those in the bottom 20 percent rose from 5 percent to 9 percent. Furthermore, even the real incomes of people with a BA have hardly risen since the mid-1970s. NB: There is a lot you can do with this basic set of statistics.

[4] “Free sandals for foot fetishists,” as the Democratic columnist Mark Shields once described the policy prescriptions of the Democratic Party of the 1980s.

Annals of the Great Recession IV.

Recession and recovery are supposed to follow a pattern.[1] Recessions lead to higher unemployment; recovery leads to higher employment. Thus, during the 1990 recession, the share of the working-age population with a job or looking for one fell from just under 67 percent to just over 66 percent. Labor force participation then rose to over 67 percent by 2000. However, since 2000 the pattern has changed. Between the recession of 2000 and the recession of 2007-2008, labor force participation trended downward from 67 percent to just over 66 percent. Since the latter recession began, the labor participation rate has declined even more rapidly. By September 2014, the rate had fallen to 62.7 percent. It had not recovered by the end of the year. New entrants to the job market get absorbed, but many of the long-term unemployed remain out of the labor market.

Where did the missing 3-4 percent of the potential labor force go? Many of them retired permanently. We can see here the leading edge of the “baby boom” taking up the rocker on the front porch. For anyone born between 1950 and 1954, getting laid off in the recession just meant a slightly early retirement. It probably doesn’t make sense to these people to fight their way back into a job so that they can work for another year or three. Fewer than 20 percent of those over 65 are still in the work force.

In addition, psychological fragility has replaced resilience as an American character trait. At least, that’s the idea you could get from some economists’ explanations. “Labor market scarring” of workers seems to reflect a belief that job-hunting is a traumatic experience. The unemployed would rather adapt by other means. They move in with aged parents to provide care; they file for disability under the currently easy conditions for gaining it; they probably do a bunch of work off-the-books; and they’re not going to leave anything to their kids.

What are the effects of their not working? The Federal Reserve wants to sustain low interest rates until the labor participation rate rises to “normal.” What if the current rate is the “new normal”? It’s an awful lot of productive labor going to waste. It sets a ceiling on the growth of the economy. Fewer workers paying taxes tightens the screws on federal revenue.

The trend toward a lower over-all labor market participation rate masks other changes.[2] The female labor participation rate has fallen from about 74 percent in 1999 to about 70 percent today. One could conjecture that if over-all labor participation was about 67 percent in 2000 and the female rate about 74 percent, then the male participation rate would have been about 60 percent. Similarly, if the over-all rate is about 63 percent today and the female rate is about 70 percent, then the male participation rate would be about 56 percent.
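The arithmetic behind that conjecture, spelled out (a back-of-envelope sketch that assumes men and women each make up roughly half of the working-age population, which is what the conjecture requires):

\[
\text{overall rate} = \tfrac{1}{2}\,(\text{male rate} + \text{female rate})
\quad\Longrightarrow\quad
\text{male rate} = 2 \times \text{overall rate} - \text{female rate}
\]

\[
2(67) - 74 = 60 \text{ percent (circa 2000)}; \qquad 2(63) - 70 = 56 \text{ percent (today)}
\]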

Certainly, the labor participation rate for men has been trending downward since the 1970s.[3] Back in the 1950s and 1960s, only 10 percent of men of working age were not in the labor force. Another trend masked by the over-all data is the shift of better jobs toward women. That trend springs from the shift away from manufacturing (traditionally male work) toward a knowledge and service economy which requires more education. Men are less likely, women more likely, to stick with school. The quality of jobs held by women has steadily improved.

There’s an old joke about a guy in Maine who lost his job. A friend asked him how he was going to get by. The man replied “Well, the t.v. works and the wife works.”

[1] Josh Zumbrun, “Labor-Market Dropouts Stay on the Sidelines,” WSJ, 29 December 2014.

[2] David Leonhardt, “The Distinct Geography of Female Employment,” NYT, 6 January 2015.

[3] In the 1970s the “oil shocks” disorganized the economy and foreign economic competition first became a serious challenge.

War Movies 8: “American Sniper.”

Chris Kyle (1974-2013) had a rare talent for shooting, joined the Navy SEALs at the beginning of global terror’s war on us, did four tours in Iraq as a sniper, wrote a book about his experiences, and was killed by a disturbed military veteran he was trying to help.

Warner Brothers bought the movie rights to the book and signed Bradley Cooper to star. First, David O. Russell (“The Fighter” (2010), “Silver Linings Playbook” (2012), “American Hustle” (2013)) was going to direct; then Steven Spielberg; and finally Clint Eastwood.[1]

Kyle’s father instructs his son in shooting and in manly conduct: “there are three kinds of people: sheep, wolves, and sheep dogs.” Chris Kyle (played by Bradley Cooper) takes the message to heart. He is determined to use his skill to save the lives of endangered American troops in Iraq. A chance encounter with his younger brother, who had enlisted after 9/11, drives home the importance of this mission: the younger man is skittish and eager to be gone from Iraq. Kyle’s sense of duty leads him to serve four tours. He becomes a legend among the common soldiers and Marines. A dead insurgent plunges off a rooftop into the midst of an American patrol; an officer casually remarks, “that’s the over-watch; you can thank him later.” Increasingly, Kyle becomes obsessed with an insurgent master sniper called “Mustafa.”[2] He returns for his final tour in hopes of killing Mustafa. He succeeds and comes home.

The price is very high: Cooper plays Kyle as “calm and confident,” so he doesn’t emote much about stress. He’s just increasingly distant, uncomfortable with the emotions of other people (both his wife’s and those of grateful veterans), with flashes of rage. Eventually, this self-contained man makes his way home by finding a new means to “save” fellow soldiers.

The movie has been criticized from the Left for de-contextualizing Kyle’s story. Eastwood portrays Kyle as motivated by the Al Qaeda attacks on the American embassies in East Africa and by 9/11; then the events in Iraq focus on the effort to kill Al Qaeda in Iraq leader Abu Musab al-Zarqawi. How the United States came to invade Iraq is scrupulously left out. The critics are mad that this wasn’t a movie about the lies that led us to war. That would be a different movie. Indeed, it has been made. Several times. All flops. “Rendition” (2007, dir. Gavin Hood), “Lions for Lambs” (2007, dir. Robert Redford), “Redacted” (2007, dir. Brian De Palma), and “Green Zone” (2010, dir. Paul Greengrass) all lost money or fell short of earnings expectations. That says something about audiences and what they’re willing to acknowledge. In contrast, “American Sniper” is well over $200 million in the black.

“American Sniper” falls into a different category of war movie from the ones that haven’t succeeded with American audiences. “The Hurt Locker” (2008, dir. Kathryn Bigelow) and “Zero Dark Thirty” (2012, dir. Kathryn Bigelow) became huge hits by focusing on driven individuals, the personal price they pay, and the shameful American indifference to the human costs of wars waged by their country. However, “American Sniper” ends on a different note than do Bigelow’s two movies. In her work, the protagonists (played by Jeremy Renner and Jessica Chastain) are lonely souls, estranged from their less-driven colleagues, cut off from home, and unknown to their fellow Americans. “American Sniper” ends with Kyle’s funeral procession across Texas. On a rainy day masses of people line the highway and the overpasses, fire-engine ladder trucks hoist huge American flags, Stetsons and baseball caps come off as the cortege passes. Eastwood is in his eighties. This may be his last movie. Hell of a way to go out.

[1] “American Sniper” (2014, dir. Clint Eastwood).

[2] It’s worth noting that the film portrays Mustafa (played by Sammy Sheikh—who has portraying evil Muslims down to a fine art) as an insurgent version of Kyle: skilled, committed, and with a family that is shut out of his work.

The Plagues Next Time.

Somebody (Stephen Colbert?) once joked that “reality has a well-known liberal bias.” Actually, reality has a well-known bias in favor of human reason. Reason, in turn, is pretty much non-partisan and available to anyone who cares to develop it. Of course, one problem is that not everyone is a willing consumer.

Antibiotics.[1] Bacteria cause infections and spread infectious diseases. Infections and infectious diseases used to kill many people. Even with sterile operating rooms, for example, the danger of post-operative infection made even an appendectomy a hazardous procedure. In the first half of the 20th Century, scientists and doctors combined to launch a medical revolution. They developed antibiotics like penicillin to fight infections. All sorts of perils were suddenly conquered. Antibiotics made a vital contribution to the dramatic rise in life expectancy during the 20th Century.

Now we face a potentially devastating return of infectious diseases. The origins of this menace are complex, rather than simple and easily addressed. First, bacteria are living things that adapt to their environment. Some bacteria are hardier than others when it comes to resisting antibiotics. These hardy bacteria can develop mutations that make them more resistant to antibiotics, so they multiply while the less-resistant strains are wiped out. (See: Darwin and his “theory” of Evolution.) Two factors have greatly facilitated this development. On the one hand, idiot doctors prescribe antibiotics in the wrong circumstances, and idiot patients who get prescribed antibiotics often stop taking them before they have completed the full course. This wipes out weaker bacteria while leaving stronger bacteria to multiply. Once there are enough of the resistant bacteria in the system, the existing antibiotics no longer work. On the other hand, “factory farming” of livestock involves massive use of antibiotics in the feed for these animals. Eighty percent of antibiotics are used on “factory farms.” This creates a hot-house environment for the mutation of bacteria. Oops.

Second, pharmaceutical companies lose money on new antibiotics to fight the emerging “superbugs.” People only take antibiotics when they have a bacterial infection. That is a rare occurrence compared to what it was before antibiotics were developed. Moreover, the sales price of antibiotics is low. Taken together, these factors make for a thin revenue stream from antibiotics. However, antibiotics are very expensive to develop. The average antibiotic loses $50 million for the company that develops it. In contrast, drugs to treat chronic conditions (diabetes, high blood pressure, can’t-get-it-up-with-a-crane) are taken on a constant basis over a long period of time. They are money-spinners. So, no important new antibiotics have been created since 1987.

How do we avoid this train wreck? Give the pharmaceutical companies a reason to create new antibiotics. (I know: “They make enormous profits! They should do this out of the goodness of their souls!” They won’t, and the “public option” beloved of “progressive people” = the Veterans’ Administration + Solyndra.) First, extend the length of time that companies have patent protection for their antibiotics. This will keep low-cost producers from churning out generics. Second, subsidize the companies with tax credits when they develop antibiotics. Third, put a stop to the abuse of antibiotics by idiot doctors and patients, and by factory farms.


Vaccination.[2] One idea behind vaccination is to wipe out diseases among young people. As the diseases are wiped out among the young, they cease to pose a threat to older people, whose childhood vaccinations wear off with time. Fine, so long as hardly anyone misses out on vaccinations. However, that is just what is starting to happen.

In 1998 Dr. Andrew Wakefield published a scientific study claiming to show that the development of autism in twelve children could be linked to the standard vaccination against measles, mumps, and rubella. Naturally, many parents became alarmed. A subsequent inquiry demonstrated that the study was a fraud. Many later studies have demonstrated that there is no connection between vaccination and autism. Too late! The suspicion/belief that vaccination is dangerous had become entrenched among a large and growing segment of parents. Why did this happen? In part, because of a 300 percent increase in the number of cases of autism diagnosed between 2002 and 2013. Although scientists suspect that autism arises from a mixture of genetic and environmental factors, the “anti-vaxxers” aren’t buying this explanation. Today, about ten percent of parents either postpone scheduled vaccines or claim a “personal belief” exemption to prevent their children from receiving vaccinations.

Who are the “anti-vaxxers”? Their ranks include pure-life progressives who reject both vaccines and genetically-modified foods; libertarians who see good health as just one more federal intrusion on their God-given right to watch their children cough their lungs out; and the descendants of the Scopes “monkey trial” rural conservatives.

What do “anti-vaxxers” believe? They believe that immunization can cause disorders and/or that so many vaccinations—16 is common—can “overwhelm” the body’s natural resistance to disease and expose children to diseases. There is NO evidence for any of this.

There is abundant evidence that reducing the number of vaccinated children exposes adults to diseases from which they had thought themselves safe. In 2012, 50,000 Americans came down with whooping cough, by far the largest number in fifty years. Eighteen people died. In 2013 the number of cases of measles (OK, 190) was three times higher than in 2012.

Where do I go to get away from the people who want to get away from the Federal government? Idaho?

[1] “The antibiotic crisis,” The Week, 22 November 2013, p. 9.

[2] “The return of childhood diseases,” The Week, 7 March 2014, p. 9.


Getting a fat lady into a girdle.

It is way too early to tell how the Affordable Care Act (ACA) is going to shake out. Neither Republican doom-saying nor Democratic triumphalism seems warranted at this moment. There are signs of gains that need to be consolidated and issues that may need to be addressed.

During the first year of the ACA the uninsured rate fell by thirty percent, or about 10 million people.[1] That means that roughly seventy percent of the previously uninsured (some 20 million people) are still uninsured. Between 2002 and 2012 a rising number of Americans told Commonwealth Fund pollsters that medical bills caused them financial troubles.[2] Medical debt became one of the leading causes of personal bankruptcy filings. Many people (43 percent in 2012) decided against seeking some sort of medical care because of the cost. The Affordable Care Act intended to address this problem as one part of its effort to make health care more broadly available. The number of Americans reporting trouble with medical debt peaked at 41 percent in 2012. Then the number began to fall, hitting 35 percent in 2014. The number of those who did not seek medical care because of cost also fell, to 36 percent. So, is the glass full, half-full, or empty?

The big problem is health-care costs and, thus, health-insurance costs.

Between 2003 and 2013, insurance premiums rose faster than median incomes did.[3] Between 2003 and 2010 insurance premiums rose by an average of 5.1 percent per year. In thirty-seven states the combined employer and employee contributions equaled at least 20 percent of median income. Thus employers’ labor costs also rose. From 2011 to 2013 the pace of increase slowed, but premiums continued to rise at 4.1 percent per year. By 2013 the national average premium had reached $16,000. Employers started looking for a way to limit the rise in their labor costs.
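Those growth rates compound. As a rough, back-of-envelope check (my arithmetic, using only the figures quoted above): seven years at 5.1 percent and three years at 4.1 percent multiply out to roughly a 60 percent increase, which would put the 2003 average premium somewhere near $10,000:

\[
\frac{\$16{,}000}{(1.051)^{7} \times (1.041)^{3}} \approx \frac{\$16{,}000}{1.42 \times 1.13} \approx \$10{,}000
\]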

What they have hit on, in many cases, is shifting costs to employees. In 2003, 52 percent of workers had employer-provided insurance with a deductible. By 2013 the number had risen to 81 percent. Furthermore, the deductibles themselves rose by an average of 146 percent; they now average $1,000 per person in most states. According to a Commonwealth Fund study, the out-of-pocket costs for employees (insurance premiums + deductibles) rose from 5.3 percent of median household income in 2003 to 9.6 percent in 2013.

On the one hand, according to one report, 58 percent of Americans polled want ObamaCare repealed.[4] Why? Job creation and wage increases have both been lagging for several years. This has left people feeling like the Great Recession never ended. Perhaps the shifting of medical costs onto consumers makes people feel like ObamaCare never happened.[5]

On the other hand, although health-care costs have risen more slowly since passage of the ACA, most economists—as opposed to political spokesmen—attribute this to the recession. Costs are likely to start back upward as the economy recovers. This will increase the pressure on employees from out-of-pocket expenses and premiums.

In short, we’re not yet done with health insurance reform. Maybe we’ll get it all the way right the next time.

[1] “Obamacare: Why, in Year Two, it’s still so unpopular,” The Week, 16 January 2015, p. 6.

[2] Margot Sanger-Katz, “Distress Appears to Ease Over Cost of Health Care,” NYT, 15 January 2015.

[3] Tara Siegel Bernard, “Health Premiums Rise More Slowly, but Workers Shoulder More of Cost,” NYT, 8 January 2015.

[4] “Obamacare: Why, in Year Two, it’s still so unpopular,” The Week, 16 January 2015, p. 6.

[5] However, it is possible that what they don’t like is Obama, rather than the Care. People often disapprove of a President in his lame-duck years.


By the waters of Babylon.

There was a weird and grim story in the New York Times on Sunday.[1] The story starts with two “old money” brothers: George Seymour Beckwith Gilbert (born 1942) and Thomas Gilbert (born circa 1944).[2] Their father ran a company that made textile machinery, back when America still had a textile industry. The parents sent the boys to Phillips Andover and then to Princeton (Beckwith ’63, Tom ’66). Both went on to get MBAs (Beckwith from NYU, Tom from Harvard). Both went into finance. Thereafter their career tracks diverged. The older brother worked for firms that bought and turned around poorly performing companies. There were a lot of these in the America of the Seventies and Eighties. Eventually he founded Field Point Capital Management Company. Later, he got interested in science and medicine. This led him to get an MS in Immunology from Rockefeller University (2006).[3] He’s on a bunch of boards, corporate and academic. You could read this as an example of how people get to the top of American society and how subsequent generations stay there: a combination of brains, hard work, and the opportunities that come from social networks.

Tom Gilbert’s career seems to have run along a different course. People from Princeton remember him as affable and athletic, rather than as highly intelligent. He worked in a bunch of jobs on Wall Street, including a seven-year stint at Loeb Partners that ended in 1991. In 1998 he founded Knowledge Delivery Systems (KDS) to provide on-line education materials.[4] In 2010 he co-founded Syzygy Therapeutics LLC. He stuck with that for a little over a year, and then founded his own private equity firm, Wainscott Capital Partners LLC, in April 2011. He was sixty-seven years old and starting a new venture.

Should we see this new venture as admirably lively or as desperate? Possibly the latter. Tom Gilbert, Sr. left an estate worth $1.6 million. Oddly, and my saying this will infuriate most people, that isn’t a lot of money.[5] About a third of his assets were his stake in his new fund. He had a house (not a “mansion”) in the Hamptons, which he was selling; he belonged to a couple of clubs (River in New York, Maidstone in East Hampton); he and his wife had given up a brownstone on the Upper East Side for a smaller apartment on Beekman Place. He put in twelve-hour days at his new business and never took a vacation. You could read this as an example of how people get to the top of American society and how subsequent generations struggle desperately to stay there: more social than smart; hard work; and social networks that count for less as the economy goes through revolutionary changes.

Beneath the surface of this little bit of social history à la Louis Auchincloss is a sadder tale that also speaks to other contemporary concerns. Tom Gilbert was (apparently) a loving, doting father who had a troubled child. Tom Gilbert, Jr. (born 1984) had followed in his father’s footsteps: he graduated from Deerfield and then from Princeton, with a degree in economics. He loved sports and had a wide circle of friends. However, something was wrong. He graduated from Princeton in 2009, when he was twenty-five. Something had slowed him down. He never managed to start a career. Instead, he lived off his father: the $2,400 monthly rent on an apartment and an allowance of $600 a week.[6] Meanwhile, his friends from Deerfield and Princeton pressed on with the usual careers in business, law, and government. He went to parties, saw them, and what could he say when they asked what he was doing?

About a year ago, perhaps in late 2013, things started to get dramatically worse for the Gilberts. Tom Jr. got barred from the Maidstone Club for giving one of the employees a bad time.[7] He had a fight with a friend (possibly over a woman); the friend got a restraining order; Tom Jr. violated the restraining order and got arrested; somebody burned down the family summer home of the friend; the police wanted to talk to Tom Jr. about this episode, but never charged anyone with setting the fire. More and more friends stopped seeing him. Tom Jr. got a gun and started spending time at a range.[8] Some of Tom Jr.’s friends told Tom Sr. that they were worried about his son. Undoubtedly, Tom Sr. was worried as well. He had paid for a lawyer to resolve some “minor matter.” He may have persuaded his son to get medical help and paid for that. Tom Jr. doesn’t seem to have appreciated the help. He became critical, even mocking, of his father.

The two strands of Tom Gilbert Sr.’s life came together in early January 2015. He was making sacrifices to get his fund up and running by downsizing his own life-style. Truth be told, he wasn’t getting any younger, and there was no guarantee that he would be able to build his fund into a real fortune. He probably wasn’t going to be able to leave a huge inheritance to his family. Tom Jr. may have seemed stuck in a life going nowhere and in need of some kind of help. Either because the financial pressures he faced were becoming grave or because he hoped to nudge his son toward becoming self-sustaining, Tom Sr. told his son that he would have to reduce his allowance. On Sunday, 4 January 2015, Tom Sr. was shot once in the head in his apartment. Police arrested Tom Jr. later that day.

Some in the media want to make the story about the harmful effects of “privilege.” That isn’t what it’s about. Instead, the story is about two things. One is that inherited “privilege” is nowhere near as reliable as it once may have been. The differential fortunes of the two older Gilbert brothers illustrate that point. The structure of the American economy has been changing fast. The decline of old industries has wreaked havoc with blue-collar and middle-class incomes. Did it do the same with upper-class inheritances, forcing a whole generation to seek opportunities to restore or shore up their assets? The composition of the American financial elite also appears to be changing in response to the rise of new industries. Adapt or disappear.

The second is that a troubled adult is hard for anyone to assess, help, or control. It’s hard to tell how far a person will fall. It’s difficult to get anyone institutionalized after they hit fourteen unless they do something that makes people say that they should have been institutionalized before they did it. It’s easy to say that someone needs help, but harder to find help that works. It’s easy for people to get their hands on firearms, even when there is a restraining order against them for one thing and they’re suspects in a crime for something else.

Of the two themes, the second seems far the more important, its outcome the more tragic. Parents of all social classes and races have struggled with troubled children. Sometimes things work out, and life for everyone gets progressively better. Sometimes they don’t, and there flows a river of tears.

[1] Landon Thomas, Jr., “The Price of Privilege,” NYT, 18 January 2015.

[2] Are they related to the Seymour Parker Gilbert who was Agent-General for Reparations in the Twenties and later a New York investment banker?

[3] http://www.pa59ers.com/potpourri/folders/g05-Gilbe/g05.html

[4] If you look at the current leadership team at KDS you can’t help but get the feeling that they are not “old money” or “old school.” BAs from Tufts, Yeshiva, Wake Forest, Gettysburg, UCLA, Kenyon, George Washington, North Texas State, and Howard. Blacks, women, and Jews. http://www1.kdsi.org/about-kds/kds-team.htm

[5] Well, it isn’t a lot of money for a 70-year-old guy who came from money, got a first-rate education, and spent his working life on Wall Street.

[6] So he was costing his father $60K a year. Do all rich families subsidize their children in this fashion while the kids find their feet, or is this an exception to the rule?

[7] The incident must have been really egregious, or not the first, if it got him banned.

[8] Glock 22: .40-cal. pistol favored by big-city police departments and the DEA. Ugly piece of work.

Nothing to CLAP about.

There is an exam called the Collegiate Learning Assessment Plus.[1] The exam measures how much college students gain between the freshman year and the senior year. It assesses communications skills (reading, writing), analytical reasoning, and critical thinking. Thus, it is applicable across disciplines and measures the “transferable skills” that have long been touted as the real value of a college education.

The results of the CLA+ for 2013-2014 give cause for both hope and fear.[2] Of freshmen who took the test, 63 percent scored below the Proficient level and 37 percent scored Proficient or higher. Of seniors, 40 percent scored below the Proficient level and 60 percent scored Proficient or higher. Some 31 percent of freshmen enter college at a Below Basic level, but by the senior year this share has been reduced to 14 percent. Similarly, 32 percent of freshmen score at the Basic level; by the senior year this has been reduced to 26 percent, even as 17 percent have moved up from Below Basic to at least Basic.

So, the good news is that colleges take the 37 percent who are already proficient and make them more proficient, and they take another 23 percent who are not proficient and raise them to proficiency. So, sixty percent of college students demonstrably benefit from attending college.[3]

What’s the bad news? Well, 14 percent of seniors graduate with a Below Basic score and another 26 percent graduate with a Basic, but below Proficient, score. That’s 40 percent who come out of college deficient in the intellectual skills assessed by the CLA+ exam. That is a huge wastage of resources. Of late, much attention has focused on graduation rates and time-to-graduation. Here, the United States has lost its world-leading position and has fallen behind some other countries. The results of the CLA+ exam suggest that the problem is actually worse than it appears, because 40 percent of college graduates don’t actually function at a BA level.

There’s a part I don’t understand, but which I will report. Test scores fall in a range between 400 and 1600. The average freshman score is 1039; the average senior score is 1128. The average improvement is 89 points. If, for the sake of argument, you subtract the 400 points you get for being able to sign your own name, then the average freshman score is 639 and the average senior score is 728. An 89-point increase then amounts to just under 14 percent.
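Spelled out (taking the paragraph’s own assumption that 400 is the effective floor of the scale):

\[
\frac{1128 - 1039}{1039 - 400} = \frac{89}{639} \approx 0.139 \approx 14 \text{ percent}
\]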

Still, these reports raise several questions. Why do almost two-thirds of freshmen start college below the level of proficiency for their group? Furthermore, many students do not go on to college at all. This suggests that K-12 education is failing many students. It also suggests that an increasingly remedial function is being forced on colleges. (At the same time, colleges are being criticized for loading students and parents with debt and for not graduating students in a timely fashion.)

Is a 14 percent average improvement enough to justify the cost of four years of college? Does the 14 percent improvement push students over some undefined threshold between incompetence and competence? If it does, then the money probably is well spent.

It’s just my opinion, but professors are the least qualified to understand the nature of the problem. Their children grow up with books, pictures on the walls, many kinds of music playing, trips to cultural events rather than Disney World, experiences valued over possessions, and parents who work all the time. So, their children are usually successful in school and in life.

[1] This is abbreviated as CLA+ so that anxious parents will not be overheard asking other parents “So, how did your kid do with the CLAP?”

[2] Douglas Belkin, “Skills Gap Found in College Students,” WSJ, 17-18 January 2015.

[3] Maybe all of them do, without that showing up in the test scores. Maybe they are marginally more attuned to key skills without quite getting out of the bottom category.

Legacies of the Violent Decades.

The 1970s and 1980s were violent decades.[1] The rate for all violent crime rose from about 500/100,000 people to almost 800 between 1975 and 1991. The robbery rate rose from about 200/100,000 people in 1975 to about 270 in 1991. The rate for aggravated assault rose from about 230/100,000 people to about 450 in 1992. From 1975 through 1991 the murder rate bounced around between 8 and 10/100,000 people. In 1990 there were 2,245 homicides in New York City (more than six a day) and 474 homicides in Washington, DC (more than one a day).

State and federal governments lashed out against this spike in crime with the weapons at hand. The federal government directed billions of dollars to the states to increase the number of police and to build prisons to house the people the police caught. Sentences were lengthened for some crimes and mandatory minimums were imposed to limit the freedom of judges. Between the early 1970s and 2009 the number of people in state or federal prisons quadrupled to about 1.5 million people.

Then the rates of violent crime began to drop. The rate for all violent crime fell by 51 percent, to a level 25 percent below the 1975 rate. The rate for aggravated assault fell from its 1992 peak by 48 percent, roughly back to where it had been in 1975. The rate for robbery fell from its 1991 peak by 60 percent, to a level 51 percent below the 1975 rate. The murder rate fell from its 1992 peak by 41 percent, to a level slightly below its 1975 rate. In 2014 there were 328 homicides in New York City (fewer than one a day) and 104 homicides in Washington, DC (two a week).
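As a rough consistency check on the headline figure (my arithmetic, using the rounded rates quoted above): a 51 percent fall from the peak of roughly 800 per 100,000 lands close to the level implied by “25 percent below the 1975 rate” of roughly 500 per 100,000, with the small gap attributable to rounding in the quoted rates:

\[
800 \times (1 - 0.51) \approx 392
\qquad\text{vs.}\qquad
500 \times (1 - 0.25) = 375
\]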

This remarkable change has begun to spark debate, just as did the remarkable spike in violence in America before 1990. One question is what has happened since 1990 to bring down the rate of violent crime. Experts are not entirely sure how to answer this question. They do agree on some things. First, targeted policing is a big part of the answer. New York City Police Commissioner William J. Bratton introduced the use of computer data and crime mapping (“CompStat”) to identify targets for police efforts. Police began to concentrate their efforts on these identifiable trouble spots. Drugs used to be sold right out on the street; aggressive policing pushed the sales indoors. That didn’t do much to cut down on drug use, but it did make drive-by shootings a lot less lethal. The “broken windows” strategy came to be widely adopted. Second, tougher sentencing and mass incarceration played a lesser role than advocates expected.[2]

A second question is what to do going forward. On the one hand, what is to be done with the large numbers of people still locked up from the previous decades? If they are released, will they just return to their old ways? Can people convicted of non-violent crimes be safely released and better served with drug-treatment programs? Should the length of sentences be reduced?

On the other hand, should the aggressive policing that accompanied the reduction in crime be scaled back? When crime rates are high and people are afraid, they are willing to tolerate aggressive forms of policing that they will not tolerate when crime rates are low and people feel secure. “Stop and frisk” has come under heavy fire. It has been argued that this kind of policing—which may have created the situation in which Eric Garner died—has begun to alienate law-abiding people in the communities on which the police focus. Can the police operate in an environment in which they are widely viewed as the enemy?

See: “The Senator from San Quentin”; “Military Police”; “Death Wish.”

[1] Erik Eckholm, “With Crime Down, U.S. Faces Legacy of a Violent Age,” NYT, 14 January 2015.

[2] Which is not the same as saying that they played no role.

Can’t buy me love–or happiness.

Does money buy happiness? Yes—up to a point.[1] All sorts of other factors also play in, but nothing is as important as national income in determining responses to “life satisfaction” surveys. A decade of surveys organized by a Dutch social scientist has found that “most people worldwide say they are fairly happy” and that people in more developed countries are happier than people in less developed countries (i.e., more development would increase happiness). However, once you get to the $20,000 per capita income level, advances in national income cease to produce much gain in life satisfaction or happiness. Thus, “happiness” or “life satisfaction” has not increased in the United States since the mid-Fifties, although there has been an 85 percent increase in the real value of family incomes (from $24K in 1953 to $51K in 2001). About 53 percent of Americans described themselves as “very happy” in 1957; about 47 percent did so in 2000. Curiously, the material ambitions of Americans seem to have skyrocketed in recent years. In 1987 surveyed adults estimated that an income of $50K/year would be enough to “fulfill all your dreams”; by 1994 that figure had shot up to $102K, although prices had not doubled. (NB: All of a sudden Americans wanted things that were really expensive? Or had college tuition sticker-shock hit?)

What is “happiness”? One Yale political scientist (Robert Lane) argues that “happiness is derived largely from two sources—material comfort, and social and familial intimacy…” These needs tend to be out of whack. In “less developed countries…social ties are often strong and money is scarce…” People have social intimacy, but no material comfort. “Economic development increases material comfort, but it systematically weakens social and familial ties by encouraging mobility, commercializing relationships, and attenuating the bonds of both the extended and the nuclear family.” Initially, “the gains in material comfort more than outweigh the slight declines in social connectedness.” At some point the competing needs for comfort and intimacy balance, leaving people at their maximum point of “life satisfaction” or “happiness.” Western culture has a deeply entrenched need to produce and consume, to generate prosperity. It is what made the West the leader in economic development and it continues to hold sway long after the real need to produce has passed. Eventually, therefore, “the balance tips and the happiness-reducing effects of reduced social stability begin to outweigh the happiness-increasing effects of material gain.”

Still, there are places that are poor and unhappy, less poor and happy, and rich and happy, but there are no places that are rich and unhappy. The places that were poor and unhappy ten years ago were Ukraine, Russia, Belarus, Armenia, Azerbaijan, Bulgaria, and Latvia. Estonia and Lithuania are pretty close to falling into this category. In short, people were really miserable in the ruins of the old Soviet empire. Conversely, people who lived in the old American empire (the US, Canada, Western Europe, Japan, Australia) tended to be pretty happy. (Hence the outcome of the Cold War.) The highest levels of “life satisfaction” seemed to be found in politically insignificant countries with per capita incomes between $17,000 and $25,000, and located in more northern climates (Finland, Sweden, Denmark, Iceland, Switzerland, Netherlands, Luxembourg, Ireland, Canada). However, that doesn’t prove that moderate income and moderate social stability is the real key to happiness. Perhaps the cold climate just keeps people indoors all the time and they make love a lot. For lack of anything better to do.

[1] Don Peck and Ross Douthat, “The World in Numbers: Does Money Buy Happiness?” Atlantic, January-February 2003, pp. 42-43.

Death Wish.

As anyone knows who has ever watched the “Death Wish” movies starring Charles Bronson, New York City is full of crazy people. Recognition of that truth helps us to understand the current conflict between Mayor Bill de Blasio and the NYPD.

First of all, in spite of the concatenation of questionable police killings nation-wide in the past year, and in spite of Mayor de Blasio’s warning to his son, NYPD officers shot to death three people during 2014. That is down from eight in 2013 (and 91 in 1971). Police shootings fell by more than half in the later 1970s, then trended downward to one-sixth of the 1971 level through the first decade of the 21st Century. New York is a less violent city than in the past, and the NYPD is less inclined to use lethal force.[1]

Second, it is dangerous to be a police officer, but much less dangerous than it used to be. In the “Bloody Seventies,” an average of 127 law enforcement officers a year were killed in the line of duty nationwide. Then the death toll began to fall. In 2013, 32 police officers were shot to death in the line of duty; in 2014 the number rose to 50.[2]

Third, Eric Garner was not an “unarmed black man” who died from an illegal choke-hold. He was a 6’3”, 350-pound career petty criminal[3] who suffered from asthma, heart disease, and obesity. When police attempted to arrest him on 17 July 2014 for the minor offense of allegedly selling untaxed cigarettes, Garner resisted arrest. Officer Daniel Pantaleo put his arm around Garner’s neck and dragged him backward to the ground. Garner fell hard. However, the medical examiner found no damage to either Garner’s windpipe or neck-bones. So, he wasn’t killed by the “chokehold.” He may have died of either a heart attack or a severe asthma attack brought on by the arm around his neck, a high level of stress, and the slamming to the ground of a fat man with a bad pump. After Garner hit the ground, the police did nothing to assist him beyond calling for an ambulance. Garner died in the ambulance on the way to the hospital.

Fourth, there is absolutely nothing to connect the liberal posturing of the mayor to the murder of the two New York police officers, Rafael Ramos and Wenjian Liu. Their murderer, Ismaaiyl Brinsley, was a lifelong failure and malcontent who shot the officers after having shot and wounded the ex-girlfriend who had dumped him. It is obvious that he seized upon the Garner death as a way to go out in a blaze of gunfire that would make his otherwise forgettable life ring out.

Fifth, the hostility to Mayor de Blasio arises from two sources. On the one hand, the unions representing NYPD officers are engaged in contract talks with the city. Anything that gives the unions the moral bulge on the city is fine with them. On the other hand, Mayor de Blasio is a fool—as a recent in-depth story by the New York Times makes clear.[4] He’s a racist and a classist. He ignored the reality of shared values and shared experiences among cops and assumed that a “more diverse” police force would naturally agree with him. Worse, he dumped off responsibility for his own errors onto cops in his security detail, blaming them for speeding by the mayoral entourage and for his late arrival at a ceremony when he had in fact over-slept. Well, the demonstrations by the cops may be seen as a wake-up call.

[1] http://reason.com/blog/2014/12/15/the-nypd-shoots-and-kills-fewer-people-t

[2] The Week, 16 January 2015, p. 16.

[3] Garner’s arrests included assault, grand larceny, and—most often—the selling of black-market untaxed cigarettes.

[4] Article summarized in Leon Neyfakh, “Bill de Blasio’s Bad Bet,” http://www.slate.com/articles/news_and_politics/crime/2015/01/nypd_and_bill_de_blasio_why_new_york_s_mayor_was_wrong_to_count_on_police.html