The Plagues Next Time.

Somebody (Stephen Colbert?) once joked that “Reality has a well-known liberal bias.” Actually, reality has a well-known bias in favor of human reason. Reason, in turn, is pretty much non-partisan and available to anyone who cares to develop it. Of course, one problem is that not everyone is a willing consumer.

Antibiotics.[1] Bacteria cause infections and spread infectious diseases. Infections and infectious diseases used to kill many people. Even with a sterile operating room, for example, the danger of post-operative infection made an appendectomy a hazardous procedure. During the first half of the 20th Century, scientists and doctors combined to launch a medical revolution. They developed antibiotics like penicillin to fight infections. All sorts of perils were suddenly conquered. Antibiotics made a vital contribution to the dramatic rise in life expectancy during the 20th Century.

Now we face a potentially devastating return of infectious diseases. The origins of this menace are complex, rather than simple and easily addressed. First, bacteria are living things that adapt to their environment. Some bacteria are hardier than others when it comes to resisting antibiotics. These hardy bacteria can develop mutations that make them more resistant to antibiotics, so they multiply while the less-resistant strains are wiped out. (See: Darwin and his “theory” of Evolution.) Two factors have greatly facilitated this development. On the one hand, idiot doctors prescribe antibiotics in the wrong circumstances and idiot patients who get prescribed antibiotics often stop taking them before they have completed the full course. This wipes out weaker bacteria while leaving stronger bacteria to multiply. Once there are enough of the resistant bacteria in the system, the existing antibiotics no longer work. On the other hand, “factory farming” of livestock involves massive use of antibiotics in the feed for these animals. Eighty percent of antibiotics are used on “factory farms.” This creates a hot-house environment for the mutation of bacteria. Oops.

Second, pharmaceutical companies lose money on new antibiotics to fight the new “superbugs” that are developing. People only take antibiotics when they have a bacterial infection. That is a rare occurrence compared to what it was before antibiotics were developed. Moreover, the sales price of antibiotics is low. Taken together, these factors make for a thin revenue stream from antibiotics. However, antibiotics are very expensive to develop. The average antibiotic loses $50 million for the company that develops it. In contrast, drugs to treat chronic conditions (diabetes, high blood pressure, can’t-get-it-up-with-a-crane) are taken on a constant basis over a long period of time. They are money-spinners. So, no important new antibiotics have been created since 1987.

How do we avoid this train wreck? First, give the pharmaceutical companies a reason to create new antibiotics. (I know: “They make enormous profits! They should do this out of the goodness of their souls!” They won’t, and the “public option” beloved of “progressive people” = the Veterans’ Administration + Solyndra.) Extend the length of time that companies have patent protection for their antibiotics. This will keep low-cost producers from churning out generics. Second, subsidize the companies with tax credits when they develop antibiotics. Third, put a stop to the abuse of antibiotics by idiot doctors and patients, and by factory farms.


Vaccination.[2] One idea behind vaccination is to wipe out diseases among young people. As the diseases are wiped out, they cease to pose a threat to older people, whose protection from their own childhood vaccinations has worn off with time. Fine, so long as hardly anyone misses out on vaccinations. However, that is just what is starting to happen.

In 1998 Dr. Andrew Wakefield published a study purporting to show that the development of autism in twelve children could be linked to the standard vaccination against measles, mumps, and rubella. Naturally, many parents became alarmed. A subsequent inquiry demonstrated that the study was a fraud. Many subsequent studies have demonstrated that there is no connection between vaccination and autism. Too late! The suspicion/belief that vaccination is dangerous had become entrenched among a large and growing segment of parents. Why did this happen? In part, because of a 300 percent increase in the number of cases of autism diagnosed between 2002 and 2013. Although scientists suspect that autism arises from a mixture of genetic and environmental factors, the “anti-vaxxers” aren’t buying this explanation. Today, about ten percent of parents either postpone scheduled vaccines or claim a “personal belief” exemption to prevent their children from receiving vaccinations.

Who are the “anti-vaxxers”? Their ranks include pure-life progressives who reject both vaccines and genetically-modified foods; libertarians who see good health as just one more federal intrusion on their God-given right to watch their children cough their lungs out; and rural conservatives descended from the Scopes “monkey trial” crowd.

What do “anti-vaxxers” believe? They believe that immunization can cause disorders and/or that so many vaccinations—16 is common—can “overwhelm” the body’s natural resistance to disease and expose children to diseases. There is NO evidence for any of this.

There is abundant evidence that reducing the number of vaccinated children then exposes adults to diseases from which they had thought themselves safe. In 2012, 50,000 Americans came down with whooping cough, by far the largest number in fifty years. Eighteen people died. In 2013 the number of cases of measles (OK, 190) was three times higher than in 2012.

Where do I go to get away from the people who want to get away from the Federal government? Idaho?

[1] “The antibiotic crisis,” The Week, 22 November 2013, p. 9.

[2] “The return of childhood diseases,” The Week, 7 March 2014, p. 9.


Getting a fat lady into a girdle.

It is way too early to tell how the Affordable Care Act (ACA) is going to shake out. Neither Republican doom-saying nor Democratic triumphalism seems warranted at this moment. There are signs of gains that need to be consolidated and issues that may need to be addressed.

During the first year of the ACA the uninsured rate fell by thirty percent, or about 10 million people.[1] That means that the other seventy percent of the previously uninsured, roughly 20 million people, are still uninsured. Between 2002 and 2012 a rising number of Americans told Commonwealth Fund pollsters that medical bills caused them financial trouble.[2] Medical debt became one of the leading causes of personal bankruptcy filings. Many people (43 percent in 2012) decided against seeking some sort of medical care because of the cost. The Affordable Care Act intended to address this problem as one part of its effort to make health care more broadly available. The number of Americans reporting trouble with medical debt peaked at 41 percent in 2012, then began to fall, hitting 35 percent in 2014. The number of those who did not seek medical care because of cost also fell, to 36 percent. So, is the glass full, half-full, or empty?

The big problem is health-care costs and, thus, health-insurance costs.

Between 2003 and 2013, insurance premiums rose faster than median incomes did.[3] Between 2003 and 2010 premiums rose by an average of 5.1 percent per year. In thirty-seven states the combined employer and employee contributions equaled at least 20 percent of median income. Thus employers’ labor costs also rose. From 2011 to 2013 the pace of increase slowed, but premiums still rose at 4.1 percent per year. By 2013 the average premium had reached $16,000 nationally. Employers started looking for a way to limit the rise in their labor costs.

What they have hit on, in many cases, is shifting costs to employees. In 2003, 52 percent of workers had employer-provided insurance with a deductible. By 2013 the number had risen to 81 percent. Furthermore, the deductibles themselves have risen by an average of 146 percent; they now average $1,000 per person in most states. According to a Commonwealth Fund study, the out-of-pocket costs for employees (insurance premiums plus deductibles) rose from 5.3 percent of median household income in 2003 to 9.6 percent in 2013.

On the one hand, according to one report, 58 percent of Americans polled want ObamaCare repealed.[4] Why? Job creation and wage increases have both been lagging for several years. This has left people feeling like the Great Recession never ended. Perhaps the shifting of medical costs onto consumers makes people feel like ObamaCare never happened.[5]

On the other hand, although health-care costs have risen more slowly since passage of the ACA, most economists—as opposed to political spokesmen—attribute this to the recession. Costs are likely to start back upward as the economy recovers. This will increase the pressure on employees from premiums and out-of-pocket expenses.

In short, we’re not yet done with health insurance reform. Maybe we’ll get it all the way right the next time.

[1] “Obamacare: Why, in Year Two, it’s still so unpopular,” The Week, 16 January 2015, p. 6.

[2] Margot Sanger-Katz, “Distress Appears to Ease Over Cost of Health Care,” NYT, 15 January 2015.

[3] Tara Siegel Bernard, “Health Premiums Rise More Slowly, but Workers Shoulder More of Cost,” NYT, 8 January 2015.

[4] “Obamacare: Why, in Year Two, it’s still so unpopular,” The Week, 16 January 2015, p. 6.

[5] However, it is possible that what they don’t like is Obama, rather than the Care. People often disapprove of a President in his lame-duck years.


Looking Backward and Forward.

Expert predictions for economic developments during 2014 turned out to be off-target in several important areas.[1] Try, try again. What do they say 2015 will look like?

The rest of the world had a lousy year in 2014, so the American economy looked good in comparison. Unemployment fell from 7 percent at the end of 2013 to 5.8 percent at the end of 2014.[2] US employers added 2.7 million jobs during the year. Job creation is running near its highest level since 2001. However, there are still about 2 million fewer workers with full-time jobs.[3] So, it will be a while before there is much upward pressure on wages.[4] As a result, the American stock market had a much better year in 2014[5] than did foreign markets.[6]

Why did the American economy do better in 2014 than most other places? A combination of factors was at work, but one mattered more than all the others: during the second half of 2014 the price of oil dropped by fifty percent. Partly, this reflected a long-developing increase in American production of oil and gas. Partly, it reflected a decision by the Gulf countries not to reduce their own production in response to falling prices.

Low oil prices should encourage world economic growth in 2015. Low energy prices also are a powerful reason to expect low inflation for a long time; expectations of low inflation may add to this dynamic.[7] Low inflation for a long time means that the Fed will not be under heavy pressure to raise interest rates.

Long-term interest rates have fallen[8] in spite of a strengthening American economy and the end of “quantitative easing.” The falling cost of borrowing will lead to lower rates for mortgages and for borrowing by business. These too should stimulate the American economy.

So, what’s the down-side of all this good news? First, countries that depend on oil for their export earnings (not just Middle Eastern countries, but also Russia and Venezuela) are going to be pinched.

Second, the Fed’s ending of quantitative easing and its expressed willingness to raise interest rates in the future have combined with economic stagnation elsewhere to raise the value of the dollar against other currencies.[9] The dollar is so central to the world economy that its rising value is likely to slow growth elsewhere.

Third, the Asian economies started to slow down in 2014. There is a certain contradiction there: China, Indonesia, and India all tightened the money supply to rein in developing bubbles, while Japan has been trying to stimulate its economy after a long period of stagnation.

The point here is that we are still walking on a knife’s edge. The Europeans are pursuing a foolhardy policy that will prolong stagnation. China is trying to walk back some of its heady growth. The US recovery remains vulnerable to unexpected problems at home and abroad.

[1] Neil Irwin, “Market Trends of 2014: What They Mean for 2015,” NYT, 1 January 2015.

[2] Economists regard 5.2-5.5 percent unemployment as the normal “full employment” rate. You may guffaw, but you haven’t met my brother-in-law. Hire him? HA!

[3] Even that disguises the situation. About seven million people have part-time jobs, but would prefer full-time jobs.

[4] “The Year in Review,” WSJ, 31 December 2014.

[5] The S&P rose 30 percent in 2013 and 11.4 percent in 2014.

[6] However, stock prices are rising faster than corporate profits, so a correction is likely.

[7] At the start of 2014 inflation was running at 2.65 percent per year; at the start of 2015 it is down to 2.14 percent per year.

[8] Interest on 30-year Treasury bonds has fallen from 3 percent in late 2013 to 2.8 percent in late 2014.

[9] The euro lost 12 percent against the dollar and the yen lost 14 percent.

By the waters of Babylon.

There was a weird and grim story in the New York Times on Sunday.[1] The story starts with two “old money” brothers: George Seymour Beckwith Gilbert (born 1942) and Thomas Gilbert (born circa 1944).[2] Their father ran a company that made textile machinery, back when America still had a textile industry. The parents sent the boys to Phillips Andover and then to Princeton (Beckwith ’63, Tom ’66). Both went on to get MBAs (Beckwith from NYU, Tom from Harvard). Both went into finance. Thereafter their career tracks diverged. The older brother worked for firms that bought and turned around poorly performing companies. There were a lot of these in the America of the Seventies and Eighties. Eventually he founded Field Point Capital Management Company. Later, he got interested in science and medicine. This led him to get an MS in Immunology from Rockefeller University (2006).[3] He’s on a bunch of boards, corporate and academic. You could read this as an example of how people get to the top of American society and how subsequent generations stay there: a combination of brains, hard work, and the opportunities that come from social networks.

Tom Gilbert’s career seems to have run down a different course. People from Princeton remember him as affable and athletic, rather than as highly intelligent. He worked in a bunch of jobs on Wall Street, including a seven-year stint at Loeb Partners that ended in 1991. In 1998 he founded Knowledge Delivery Systems (KDS) to provide on-line education materials.[4] In 2010 he co-founded Syzygy Therapeutics LLC. He stuck with that for a little over a year, and then founded his own private equity firm, Wainscott Capital Partners LLC, in April 2011. He was sixty-seven years old and starting a new venture.
Should we see this new venture as admirably lively or as desperate? Possibly the latter. Tom Gilbert, Sr. left an estate worth $1.6 million. Oddly, and my saying this will infuriate most people, that isn’t a lot of money.[5] About a third of his assets were his stake in his new fund. He had a house (not a “mansion”) in the Hamptons; he belonged to a couple of clubs (River in New York, Maidstone in East Hampton); he was selling the house in the Hamptons; he and his wife had given up a brownstone on the Upper East Side for a smaller apartment on Beekman Place. He put in twelve-hour days at his new business and never took vacations. You could read this as an example of how people get to the top of American society and how subsequent generations struggle desperately to stay there: more social than smart; hard work; and the lack of social networks as the economy goes through revolutionary changes.
Beneath the surface of this little bit of social history a la Louis Auchincloss is a sadder tale that also speaks to other contemporary concerns. Tom Gilbert was (apparently) a loving, doting father who had a troubled child. Tom Gilbert, Jr. (born 1984) had followed in his father’s footsteps: he graduated from Deerfield and then from Princeton, with a degree in economics. He loved sports and had a wide circle of friends. However, something was wrong. He graduated from Princeton in 2009, when he was twenty-five. Something slowed him down. He never managed to start a career. Instead, he lived off his father: the $2,400 monthly rent on an apartment and an allowance of $600 a week.[6] Meanwhile, his friends from Deerfield and Princeton pressed on with the usual careers in business, law, and government. He went to parties, saw them, and what could he say when they asked what he was doing?
About a year ago, perhaps in late 2013, things started to get dramatically worse for the Gilberts. Tom Jr. got barred from the Maidstone Club for giving one of the employees a bad time.[7] He had a fight with a friend (possibly over a woman); the friend got a restraining order; Tom Jr. violated the restraining order and got arrested; somebody burned down the family summer home of the friend; the police wanted to talk to Tom Jr. about this episode, but never charged anyone with setting the fire. More and more friends stopped seeing him. Tom Jr. got a gun and started spending time at a range.[8] Some of Tom Jr.’s friends told Tom Sr. that they were worried about his son. Undoubtedly, Tom Sr. was worried as well. He had paid for a lawyer to resolve some “minor matter.” He may have persuaded his son to get medical help and paid for that. Tom Jr. doesn’t seem to have appreciated the help. He became critical, even mocking, of his father.
The two strands of Tom Gilbert Sr.’s life came together in early January 2015. He was making sacrifices to get his fund up and running by downsizing his own life-style. Truth be told, he wasn’t getting any younger and there was no guarantee that he would be able to build his fund into a real fortune. He probably wasn’t going to be able to leave a huge inheritance to his family. Tom Jr. may have seemed stuck in a life going nowhere and in need of some kind of help. Either because the financial pressures he faced were becoming grave or because he hoped to nudge his son toward becoming self-sustaining, Tom Sr. told his son that he would have to reduce his allowance. On Sunday, 4 January 2015, Tom Sr. was shot once in the head in his apartment. Police arrested Tom Jr. later that day.
Some in the media want to make the story about the harmful effects of “privilege.” That isn’t what it’s about. Instead, the story is about two things. One is that inherited “privilege” is nowhere near as reliable as it once may have been. The differential fortunes of the two older Gilbert brothers illustrate that point. The structure of the American economy has been changing fast. The decline of old industries has wreaked havoc with blue-collar and middle class incomes. Did it do the same with upper-class inheritances, forcing a whole generation to seek opportunities to restore or shore-up their assets? The composition of the American financial elite also appears to be changing in response to the rise of new industries. Adapt or disappear.
A second thing is that a troubled adult is hard for anyone to assess, help, or control. It’s hard to tell how far a person will fall. It’s difficult to get anyone institutionalized after they hit fourteen unless they do something that makes people say that they should have been institutionalized before they did it. It’s easy to say that someone needs help, but harder to find help that works. It’s easy for people to get their hands on firearms, even when there is a restraining order against them for one thing and they’re suspects in a crime for something else.
Of the two themes, the second seems far the more important, the outcome the most tragic. Parents of all social classes and races have struggled with troubled children. Sometimes things work out. Life for everyone gets progressively better. Sometimes they don’t and there flows a river of tears.

[1] Landon Thomas, Jr., “The Price of Privilege,” NYT, 18 January 2015.

[2] Are they related to the Seymour Parker Gilbert who was a New York investment banker and later was Agent-General for Reparations in the Twenties?

[3] http://www.pa59ers.com/potpourri/folders/g05-Gilbe/g05.html

[4] If you look at the current leadership team at KDS you can’t help but get the feeling that they are not “old money” or “old school.” BAs from Tufts, Yeshiva, Wake Forest, Gettysburg, UCLA, Kenyon, George Washington, North Texas State, and Howard. Blacks, women, and Jews. http://www1.kdsi.org/about-kds/kds-team.htm

[5] Well, it isn’t a lot of money for a 70 year-old guy who came from money, got a first-rate education, and spent his working life on Wall Street.

[6] So he’s costing the father $60K a year. Do all rich families subsidize their children in this fashion while the kids get on their feet, or is this an exception to the rule?

[7] The incident must have been either really egregious or not the first one if it got him banned.

[8] Glock 22: .40-cal. pistol favored by big-city police departments and the DEA. Ugly piece of work.

Rivers of Blood I.

Muslim immigration to Western Europe began by stages after the Second World War as labor-short economies and the end of empires combined to draw non-Europeans toward the “mother country.” A great deal of thoughtlessness went into these migrations. All host countries were ill-prepared to deal with the immigrants.

In January 2015 there are an estimated 20 million Muslims in Europe. About 5 million are in France, where they make up 8 percent of the population. (See: “The other land of liberty and opportunity.”) In Britain and Germany they make up 5 percent of the population. One of the things that eats at European countries is the feeling that immigrants have come to their countries to prey on the generous social welfare provision of enlightened countries. In the 1970s, two-thirds of the immigrants in Germany were in the labor force, while one-third were not. Thirty years later scarcely more than a quarter of immigrants were in the labor force.[1] Another problem, revealed by a poll in L’Express in January 2013, is that 74 percent of those polled said that Islam “is not compatible with French society.” Yet this feeling finds no expression in the “mainstream” or “respectable” French political parties. Why not?

Christopher Caldwell argues that the European left has made discussion of the problems raised by immigration almost impossible.[2] On the one hand, they have evoked European historical crimes—the Holocaust above all—to justify repressing unwelcome speech. He implies that they have undermined the foundations of democracy in the process. In France, he sees the “SOS Racisme” group created in the 1980s as a puppet of the Socialist Party intended to shout down conservative voices, and the 1990 Gayssot Law against Holocaust-denial as an entering wedge for people who want to stifle discussion of other historical events—many of them highly unpleasant and non-Western. Most recently, the anti-immigrant Front National party got left out of the post-Charlie Hebdo parade on the grounds that it was not “republican” enough.

On the other hand, people on the left have failed to understand that, whatever was done to European Jews, it wasn’t done by Muslims. Just as Palestinians have felt free to reject the State of Israel as European expiation of European crimes at the expense of Arabs, so too have Euro-Muslims felt free to reject European progressive thought as an alien set of values intended to curb their own beliefs. If one adds these forces to the failure to integrate the immigrants and their French-born descendants into French society, one can begin to understand some of the impulses that set the Kouachi brothers and Amedy Coulibaly on the path to terrorism. They are not alone in their alienation, hostility, and religious fervor.

Caldwell understands that Europe’s aging and declining non-Muslim population makes immigration essential. He is less quick to say that the anti-immigrant and anti-Muslim sentiments expressed by parties like the Front National can have no practical expression in public policy. Yes, the Jews were expelled from Spain in 1492. Is that what the Front National or other parties expect to repeat with Muslims? Some of the Front National voters undoubtedly do want that, but the program of the party calls for a halt to further immigration and a defense of “secular values.” France already has more police per capita than any other European country. Even so, security lapses allowed the Kouachis and Coulibaly to escape detection of their plans to kill.

There is going to have to be a third way between “political correctness” and stupidity.

[1] Two million out of three million in the early 1970s versus two million out of seven-and-a-half million in the early 2000s.

[2] Christopher Caldwell, “Europe’s Crisis of Faith,” WSJ, 17-18 January 2015.


Nothing to CLAP about.

There is an exam called the Collegiate Learning Assessment Plus.[1] The exam measures how much college students gain between the freshman year and the senior year. It assesses communications skills (reading, writing), analytical reasoning, and critical thinking. Thus, it is applicable across disciplines and measures the “transferable skills” that have long been touted as the real value of a college education.

The results of the CLA+ for 2013-2014 give cause for hope and fear.[2] Of Freshmen who took the test, 63 percent scored below the Proficient level and 37 percent scored Proficient or higher. Of Seniors who took the test, 40 percent scored below the Proficient level and 60 percent scored Proficient or higher. Of Freshmen, 31 percent enter college at the Below Basic level, but by the Senior year this share has been reduced to 14 percent. Similarly, 32 percent of Freshmen score at the Basic level, but by the Senior year this has been reduced to 26 percent, even as 17 percent have moved up from Below Basic to at least Basic.

So, the good news is that colleges take the 37 percent who are already proficient and make them more proficient, and they take another 23 percent who are not proficient and raise them to proficiency. In short, sixty percent of college students benefit from attending college.[3]
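That 23 percent is simply the difference between the Senior and Freshman proficiency shares. Here is a back-of-the-envelope check, written as a small Python sketch; the variable names are mine, and it assumes no student who enters college proficient slips back below Proficient by the Senior year.

    # Shares of test-takers at or above the Proficient level, from the CLA+ figures above.
    freshman_proficient = 0.37
    senior_proficient = 0.60

    # Share raised to proficiency during college, assuming no one already proficient slips back.
    raised_to_proficiency = senior_proficient - freshman_proficient
    print(f"Raised to proficiency: {raised_to_proficiency:.0%}")  # 23%

    # Already proficient plus newly proficient = share who finish college at the Proficient level.
    proficient_by_graduation = freshman_proficient + raised_to_proficiency
    print(f"Proficient by the Senior year: {proficient_by_graduation:.0%}")  # 60%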

What’s the bad news? Well, 14 percent of Seniors graduate with a Below Basic score and another 26 percent graduate with a Basic, but below Proficient, score. That’s 40 percent who come out of college deficient in the intellectual skills assessed by the CLA+ exam. That is a huge waste of resources. Of late, much attention has focused on graduation rates and time-to-graduation. Here, the United States has lost its world-leading position and has fallen behind some other countries. The results of the CLA+ exam suggest that the problem is actually worse than it appears, because 40 percent of college graduates don’t actually function at a BA level.

There’s a part I don’t understand, but which I will report. Test scores fall in a range between 400 and 1600. The average Freshman score is 1039; the average Senior score is 1128. The average improvement is 89 points. If, for the sake of argument, you subtract the 400 points you get for being able to sign your own name, then the average Freshman score is 639 and the average Senior score is 728. An 89-point increase amounts to just under 14 percent.
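Spelled out, the arithmetic looks like this (a small Python sketch; the 400-point floor adjustment is the for-the-sake-of-argument assumption from the paragraph above, not anything the report itself does):

    # CLA+ scores run from 400 to 1600; the averages are the figures cited above.
    freshman_avg = 1039
    senior_avg = 1128
    score_floor = 400  # subtracted only for the sake of argument

    gain = senior_avg - freshman_avg                      # 89 points
    relative_gain = gain / (freshman_avg - score_floor)   # 89 / 639
    print(f"Average gain: {gain} points, or {relative_gain:.1%} of the floor-adjusted Freshman average")
    # Prints roughly 13.9%, i.e. "just under 14 percent"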

Still, these reports raise several questions. Why do almost two-thirds of Freshmen start college below the level of proficiency for their group? Furthermore, many students do not go on to college at all. This suggests that K-12 education is failing many students. It also suggests that an increasingly remedial function is being forced on colleges. (At the same time, they are being criticized for loading students and parents with debt and for not graduating students in a timely fashion.)

Is a 14 percent average improvement enough to justify the cost of four years of college? Does the 14 percent improvement push students over some undefined threshold between incompetence and competence? If it does, then the money probably is well spent.

It’s just my opinion, but professors are the least qualified to understand the nature of the problem. Their children grow up with books, pictures on the walls, a variety of kinds of music playing, trips to cultural events rather than Disney World, experiences valued over possessions, and parents who work all the time. So, their children are usually successful in school and in life.

[1] This is abbreviated as CLA+ so that anxious parents will not be overheard asking other parents “So, how did your kid do with the CLAP?”

[2] Douglas Belkin, “Skills Gap Found in College Students,” WSJ, 17-18 January 2015.

[3] Maybe all of them do, without that showing up in the test scores. Maybe they are marginally more attuned to key skills without quite getting out of the bottom category.

Legacies of the Violent Decades.

The 1970s and 1980s were violent decades.[1] The rate for all violent crime rose from about 500/100,000 people to almost 800 between 1975 and 1991. The robbery rate rose from about 200/100,000 people in 1975 to about 270 in 1991. The rate for aggravated assaults rose from about 230/100,000 people to about 450 in 1992. From 1975 through 1991 the murder rate bounced around between 8 and 10/100,000 people. In 1990 there were 2,245 homicides in New York City (five a day), and 474 homicides in Washington, DC (more than one a day).

State and federal governments lashed out against this spike in crime with the weapons at hand. The federal government directed billions of dollars to the states to increase the number of police and to build prisons to house the people the police caught. Sentences were lengthened for some crimes and mandatory minimums were imposed to limit the freedom of judges. Between the early 1970s and 2009 the number of people in state or federal prisons quadrupled to about 1.5 million people.

Then the rates of violent crime began to drop. The rate for all violent crime fell by 51 percent from its peak, to a level 25 percent below the 1975 rate. The rate for aggravated assault fell from its 1992 peak by 48 percent, roughly back to where it had been in 1975. The rate for robbery fell from its 1991 peak by 60 percent, to a level 51 percent below the 1975 rate. The murder rate fell from its 1992 peak by 41 percent, to a level slightly below its 1975 rate. In 2014 there were 328 homicides in New York City (less than one a day) and 104 homicides in Washington, DC (two a week).

This remarkable change has begun to spark debate, just as did the remarkable spike in violence in America before 1990. One question is what has happened since 1990 to bring down the rate of violent crime. Experts are not entirely sure how to answer this question, but they do agree on some things. First, targeted policing is a big part of the answer. New York City Police Commissioner William J. Bratton introduced the use of computer data and crime mapping (“CompStat”) to identify targets for police efforts. Police began to concentrate their efforts on these identifiable trouble spots. Drugs used to be sold right out on the street. Aggressive policing pushed the sales indoors. That didn’t do much to cut down on drug use, but it did make drive-by shootings a lot less lethal. The “broken windows” strategy came to be widely adopted. Second, tougher sentencing and mass incarceration played a lesser role than advocates expected.[2]

A second question is what to do going forward. On the one hand, what is to be done with the large numbers of people still locked up from the previous decades? If they are released, will they just return to their old ways? Can people convicted of non-violent crimes be safely released and better served with drug-treatment programs? Going forward, should the length of sentences be reduced?

On the other hand, should the aggressive policing that accompanied the reduction in crime be scaled back? When crime rates are high and people are afraid, they are willing to tolerate aggressive forms of policing that they will not tolerate when crime rates are low and people feel secure. “Stop and frisk” has come under heavy fire. It has been argued that this kind of policing—which may have created the situation in which Eric Garner died—has begun to alienate law-abiding people in the communities on which the police focus. Can the police operate in an environment in which they are widely viewed as the enemy?

See: “The Senator from San Quentin”; “Military Police”; “Death Wish.”

[1] Erik Eckholm, “With Crime Down, U.S. Faces Legacy of a Violent Age,” NYT, 14 January 2015.

[2] Which is not the same as saying that they played no role.

The other land of liberty and opportunity.

The terrible events in Paris in early January 2015 have inspired all sorts of questions. What are the limits of “free speech”? Why did the security services fail to discern the threat? Perhaps most importantly, why do some French Muslims become radicalized?

During the 19th Century the French population grew at a pace (40 percent) much below that of the rest of Europe (100+ percent). This population gap began to have an effect on the supply of workers. In the late 19th and early 20th Centuries the French began to make up the difference by encouraging immigration from countries like Italy, Poland, and Spain. By the eve of the Great Depression, immigrants had increased from 1 percent of the population to 3 percent. The Depression caused the French to seek to reduce the number of immigrants in the country. In the aftermath of the Second World War, however, France turned to encouraging the immigration of guest workers from its colonial empire as a national policy. The collapse of the French position in Algeria in the early Sixties then brought a flood of refugees (both Algerians of European descent and Algerian Muslims who had been loyal to France in the Algerian war). This population movement totaled well over a million people in the space of a few years.

From this point onward the question of immigration became politicized and tense. For one thing, the “pied noir” immigrants from Algeria and the “harkis” competed for the same jobs at the bottom of the French economy, spawning a bitter hostility. For another thing, the great economic slump of the Seventies intensified the competition for jobs. France put a stop to immigration in 1974, but the immigrants already in the country put down roots rather than going “home.” They sent for their families before French laws could prohibit this. Consequently, the immigrant population actually increased in size at a time when France sought to limit it. For a third thing, the French accepted the sociological theory of a “threshold of tolerance,” beyond which the number of unassimilated immigrants worked to disintegrate society. This latter theory had a particular resonance because of the “French social model.”

That model holds that there is a single French national culture and everyone has to assimilate to it to be French. Anyone who is not French is “foreign” (étranger). Formally, “étranger” refers to anyone without French citizenship, but informally it includes anyone who refuses to become “French.” The French reject the Anglo-American model of multi-culturalism. They carry this to the point of refusing to gather statistical data on the ethnic or national origins of French citizens. Rough estimates, done on the basis of the number of “étrangers” and their descendants living in France, put the number of non-French within the Hexagon at 14 million, or 25 percent of the population. Of these, it has been estimated that 5-6 million are Muslims.

It is open to question whether the Muslim immigrants have assimilated to French culture. On the one hand, they undoubtedly have: they eat pork, smoke, drink, and have premarital sex, just like ordinary “French” people of their generation. On the other hand, they are walled off in ethnic ghettoes on the outskirts of the major cities (especially Paris). These areas are marked by very high unemployment (40-50 percent), crime, and drug-use. At the same time, one can wonder whether the French have made much of an effort to assimilate the immigrants. The inhabitants of these ghettoes are often third-generation residents of France with little knowledge of or interest in their “homelands”; there is a good deal of evidence that French employers prefer to hire people with lighter skins and French-sounding names; and former President Nicolas Sarkozy may have been expressing a common sentiment when he referred to the rioters at the end of 2005 as “racaille” (scum). See: The Week, 2 December 2005, p. 15.

Can’t buy me love–or happiness.

Does money buy happiness? Yes—up to a point.[1] All sorts of other factors also play in, but nothing is as important as national income in determining responses to “life satisfaction” surveys. A decade of surveys organized by a Dutch social scientist has found that “most people worldwide say they are fairly happy” and that people in more developed countries are happier than people in less developed countries (i.e., more development would increase happiness). However, once you get to the $20,000 per capita income level, advances in national income cease to produce much gain in life satisfaction or happiness. Thus, “happiness” or “life satisfaction” has not increased in the United States since the mid-Fifties, although there has been an 85 percent increase in the real value of family incomes (from $24K in 1953 to $51K in 2001). About 53 percent of Americans described themselves as “very happy” in 1957; about 47 percent did so in 2000. Curiously, the material ambitions of Americans seem to have sky-rocketed in recent years. In 1987 surveyed adults estimated that an income of $50K/year would be enough to “fulfill all your dreams”; by 1994 that figure had shot up to $102K, although prices had not doubled. (NB: All of a sudden Americans wanted things that were really expensive? Or college tuition sticker-shock had hit?)

What is “happiness”? One Yale political scientist (Robert Lane) argues that “happiness is derived largely from two sources—material comfort, and social and familial intimacy…” These needs tend to be out of whack. In “less developed countries…social ties are often strong and money is scarce…” People have social intimacy, but no material comfort. “Economic development increases material comfort, but it systematically weakens social and familial ties by encouraging mobility, commercializing relationships, and attenuating the bonds of both the extended and the nuclear family.” Initially, “the gains in material comfort more than outweigh the slight declines in social connectedness.” At some point the competing needs for comfort and intimacy balance, leaving people at their maximum point of “life satisfaction” or “happiness.” Western culture has a deeply entrenched need to produce and consume, to generate prosperity. It is what made the West the leader in economic development and it continues to hold sway long after the real need to produce has passed. Eventually, therefore, “the balance tips and the happiness-reducing effects of reduced social stability begin to outweigh the happiness-increasing effects of material gain.”

Still, there are places that are poor and unhappy, less poor and happy, and rich and happy, but there are no places that are rich and unhappy. The places that were poor and unhappy ten years ago were Ukraine, Russia, Belarus, Armenia, Azerbaijan, Bulgaria, and Latvia. Estonia and Lithuania are pretty close to falling into this category. In short, people were really miserable in the ruins of the old Soviet empire. Conversely, people who lived in the old American empire (the US, Canada, Western Europe, Japan, Australia) tended to be pretty happy. (Hence the outcome of the Cold War.) The highest levels of “life satisfaction” seemed to be found in politically insignificant countries with per capita incomes between $17,000 and $25,000, and located in more northern climates (Finland, Sweden, Denmark, Iceland, Switzerland, Netherlands, Luxembourg, Ireland, Canada). However, that doesn’t prove that moderate income and moderate social stability is the real key to happiness. Perhaps the cold climate just keeps people indoors all the time and they make love a lot. For lack of anything better to do.

[1] Don Peck and Ross Douthat, “The World in Numbers: Does Money Buy Happiness?” Atlantic, January-February 2003, pp. 42-43.

Death Wish.

As anyone knows who has ever watched the “Death Wish” movies starring Charles Bronson, New York City is full of crazy people. Recognition of that truth helps us to understand the current conflict between Mayor Bill de Blasio and the NYPD.

First of all, in spite of the concatenation of questionable police killings nation-wide in the past year and in spite of Mayor de Blasio’s warning to his son, NYPD officers shot to death three people during 2014. That is down from eight in 2013 (and 91 in 1971). Police department shootings fell by more than half in the later 1970s, then trended downward to one-sixth of the 1971 level through the first decade of the 21st Century. New York is a less violent city than in the past and the NYPD is less inclined to use lethal force.[1]

Second, it is dangerous to be a police officer, but much less dangerous than it used to be. In the “Bloody Seventies,” an average of 127 law enforcement officers a year were killed in the line of duty nationwide. Then the death-toll began to fall. In 2013, 32 police officers were shot to death in the line of duty; in 2014 the number rose to 50 officers killed.[2]

Third, Eric Garner was not an “unarmed black man” who died from an illegal choke-hold. He was a 6’3”, 350-pound career petty criminal[3] who suffered from asthma, heart disease, and obesity. When police attempted to arrest him for the minor crime of allegedly selling untaxed cigarettes on 17 July 2014, Garner resisted arrest. Officer Daniel Pantaleo put his arm around Garner’s neck and dragged him backward to the ground. Garner fell hard. However, the medical examiner found that there was no damage to either Garner’s windpipe or neck-bones. So, he wasn’t killed by the “chokehold.” He may have died of either a heart attack or a severe asthma attack brought on by the arm around his neck, a high level of stress, and the slamming to the ground of a fat man with a bad pump. After Garner hit the ground, the police did nothing to assist him beyond calling for an ambulance. Garner died in the ambulance on the way to the hospital.

Fourth, there is absolutely nothing to connect the liberal posturing of the mayor to the murder of the two New York police officers, Rafael Ramos and Wenjian Liu. Their murderer, Ismaaiyl Brinsley, was a lifelong failure and malcontent who shot the officers after having shot and wounded the girlfriend who had dumped him. It is obvious that he seized upon the Garner death as a way to go out in a blaze of gunfire that would lift his otherwise forgettable life out of obscurity.

Fifth, the hostility to Mayor de Blasio arises from two sources. On the one hand, the unions representing NYPD officers are engaged in contract talks with the city. Anything that gives them the moral bulge on the city is fine with the unions. On the other hand, Mayor de Blasio is a fool—as a recent in-depth story by the New York Times makes clear.[4] He’s a racist and a classist. He ignored the reality of shared values and shared experiences among cops and assumed that a “more diverse” police force would naturally agree with him. Worse, he dumped off responsibility for his own errors onto the cops in his security detail, blaming them for speeding by the mayoral entourage and for his late arrival at a ceremony when he had in fact over-slept. Well, the demonstrations by the cops may be seen as a wake-up call.

[1] http://reason.com/blog/2014/12/15/the-nypd-shoots-and-kills-fewer-people-t

[2] The Week, 16 January 2015, p. 16.

[3] Garner’s arrests included assault, grand larceny, and—most often—the selling of black-market untaxed cigarettes.

[4] Article summarized in Leon Neyfakh, “Bill de Blasio’s Bad Bet,” http://www.slate.com/articles/news_and_politics/crime/2015/01/nypd_and_bill_de_blasio_why_new_york_s_mayor_was_wrong_to_count_on_police.html