QE by the ECB.

The United States, Britain, and Japan all eventually responded to the “Great Recession” with “Quantitative Easing”—central bank purchases of public and private bonds in order to pump money into the economy.[1] Europe resisted this policy[2] and its economic recovery has trailed most other places. The European Central Bank (ECB) has aimed for an annual inflation rate of 2 percent. It hasn’t worked. In December 2014 the inflation rate hit minus 0.2 percent. Economists feared that Europe would descend into a deflationary spiral. Therefore, on 22 January 2015, the ECB announced an initiative to buy 60 billion Euros worth of public and private bonds every month “until we see a sustained adjustment in the path of inflation.”[3]

Will the ECB action be sufficient to propel the European Community on the road to a solid recovery? When combined with the unanticipated fall in world oil prices and a depreciation in the exchange value of the Euro, Quantitative Easing might get the European economy moving. Still, there is a great deal of uncertainty going forward.

At the same time that he announced the program of bond-buying, ECB chief Mario Draghi urged the need for “structural reforms” to create the basis for the “confidence” among investors that is needed to encourage investment.[4] Will European governments be willing to implement such reforms after resisting them for so many years in crisis conditions? Or will they hope that QE can provide enough stimulus to allow them to ignore unpleasant choices? Jens Weidmann, president of the German central bank, worried in public that this might be the case.

How will the ECB initiative affect exchange rates between the Euro and other currencies? The dollar has been rising against the euro and gained another 2 percent after the ECB policy announcement; the Swiss ended their “peg” of the franc against the Euro and the franc rose 17 percent in value in one day. The change in exchange rates will make Euro-zone goods more competitive in foreign markets, but it will make Swiss and American goods less competitive in those same markets. Countries that borrowed in dollars will find it more difficult to repay those loans now that the value of the dollar is rising. In short, the currency turmoil creates a drag on the world economy at a critical time for the recovery.

One thing that now seems impossible to foretell is the effect of important central banks creating so much liquidity. Will it affect the basic stability of the world economy over time? No one is talking about this problem at the moment. They have more pressing business at hand.

[1] In the United States, “QE” pushed up asset prices like those for stocks and homes. This had the unintended effect of adding to the sense of an unequal sharing of the economic recovery.

[2] In large part the resistance stems from people in the northern “creditor” countries who feel that they “were had” by the Greeks and fear that southern “debtor” countries may try to stick them with the real costs of the bail-outs. This feeling comes on top of a long-standing belief that weaker economies suffer from the self-inflicted wounds of overly generous welfare states and a hostility to business. The complicated governing system for a central bank serving nineteen sovereign states, each with its own central bank, allowed the “creditor” countries to hold the “debtor” countries at bay for years. Both the emotional and the institutional components of economic policy-making seem incomprehensible to some leading American academic economists.

[3] Neil Irwin, “Fear That Eurozone Stimulus May Be Too Little or Too Late,” NYT, 23 January 2015.

[4] German Chancellor Angela Merkel immediately emphasized this point to the Italians. Dutch prime minister Mark Rutte then piled-on to the same effect. See: Jack Ewing, “Compromise and Persuasion Won Grudging Support for Bond Buying,” NYT, 24 January 2015.

Bozo Haram.

Religion can have internal (esoteric) and external (exoteric) components. The esoteric approach is essentially mystical. The exoteric is essentially about adherence to the law. As institutions, churches are usually satisfied with the exoteric. Sometimes true believers want the esoteric in order to achieve union with God. In Islam, those who pursue the esoteric are often called “sufi.”

Nigeria gained its independence from Britain in 1960. The new nation divided between a Christian South, with access to rich oil resources, and a Muslim North, which suffered from poverty. Bitterness arose in the North, where people complained of both the evils of the Christian government and the failings of their own clergy and traditional leaders to obtain justice. A religious protest movement arose around Mohammed Marwa (c. 1920-1980, nicknamed “Maitatsine”) that led to violence and deaths. The government never entirely managed to suppress support for it. Then, during the 1960s and 1970s, Sufism began to make inroads among Muslims in northern Nigeria. Inspired by the Saudi Arabian Wahhabist-funded World Muslim League, Sheikh Ismaila Idris (1937-2000) began to push back. In 1978 he founded the Izala Society to advocate a traditional form of Islam. One of the bright lights of this movement was Ja’afar Mahmud Adam (1960-2007). He was trained as a teacher in Nigeria, then studied at the Islamic University in Saudi Arabia. From 1993 to 2007 he preached in a mosque in Kano, Nigeria. One of his followers was Mohammed Yusuf (1970–2009). About 2002, Adam and Yusuf fell out.[1]

Yusuf went his own way to found Boko Haram. He seems to have recruited many of the same sorts of people with the same sorts of grievances who had followed Marwa twenty years before. Yusuf concentrated his mission on building support in the far northeast of Nigeria, near the borders with Chad and Niger. Yusuf may have aimed at the creation of an Islamist state. Certainly, he gathered arms and young men with nothing to lose. One of these was Abubakar Shekau.[2] Shekau became Yusuf’s second-in-command.

In July 2009 Boko Haram clashed with Nigerian security forces and Yusuf was killed “while trying to escape.” Shekau took command of Boko Haram. In September 2010 he opened war against the government with a prison break that freed over a hundred members of the group. Beginning in 2011, Boko Haram used bombings (suicide and IED) and shootings to drive the police off the street and then out of towns. As a result, general lawlessness spread throughout the north. The Army and police reacted violently, but usually against civilian targets that came to hand rather than against the Boko Haram militants. Reports of massacres, rapes, and pillaging carried out by the “forces of order” became common. During 2013 the conflict spilled over into Chad, Niger, and Cameroon. In 2014, Boko Haram transiently caught the attention of the world when it kidnapped several hundred girls from a school at Chibok.[3] The gory fight goes on.

As is the case in Syria and Iraq, the Islamists are up against corrupt or incompetent or non-existent governments. They aren’t fighting real soldiers: they’re fighting men with guns hired to prop up the government. They’re “filled with a terrible certainty,” while their opponents “lack all conviction.” Probably because the courts are rigged.

See: http://en.wikipedia.org/wiki/Boko_Haram for a more detailed account.

[1] In 2007 someone shot Adam to death in his mosque. The whole area was too violent to pin the blame on Yusuf.

[2] He may be anywhere from his mid-30s to his mid-40s.

[3] I haven’t seen a lot of “Bring back our girls” posts of late on my FB feed. First there was the “ice bucket challenge,” then all the memes sent out by groups like AddictingInfo to denounce the enormities of the Republicans.

Annals of the Great Recession III.

Years ago, back before the world economic slowdown, Germany overhauled its economy to make it more competitive and flexible. This overhaul built on earlier German strengths: an excellent educational system, a commitment to quality production, and a cultural predisposition to sound finances. The successful reforms put Germany in a strong position to first weather the initial storm and then exploit the inflationary policies pursued by other countries.

Not everyone pursued similar policies. Many European countries opted for social protection over economic growth. Their labor and management systems are encrusted with regulatory barnacles that slow growth and hinder employment; they run high levels of debt that become increasingly difficult to support with stagnant economies; and they are broadly change-averse. In the worst case, the Greeks spent years living off grants and loans from the European Community while cooking their books to disguise the fact that the money was being consumed rather than invested. The demographic crisis of an aging population across much of Europe bodes ill for the survival of the welfare states. Reforms to increase innovation, productivity and competitiveness are essential for the long-term future.

With the onset of economic crisis in 2009, the Germans seized upon it as a device to force other countries to make fundamental reforms to improve the long-term position of the whole group.[1] Germany rejected expansionary policies at home while leading the imposition of severe conditions upon Greece in exchange for further aid. Behind the disreputable Greeks stood the more reputable Spaniards, Italians, and Frenchmen. Many countries didn’t want anyone saying that they resembled the Greeks, so they went along with the German policies.

However, even under pressure most countries have not made the kinds of reforms to entitlements, labor market regulations, and budgeting needed to create dynamic economies. Europe continues to limp along behind the United States in recovering from the “Great Recession.” Indeed, the danger that Europe will slide into a deflationary spiral is very real.

From a dispassionately economic perspective, the best solution appears to be a combination of monetary stimulus by the European Central Bank (ECB), higher public spending by Germany and other creditor countries, deficit-reduction in the debtor countries, and a wide application of the reforms that the Germans have been pushing. The rival policy to that of Germany has been inflation by the ECB and higher spending by the creditor countries in order to ease conditions in the debtor countries. The hard times have led to the rise of “anti-austerity” parties, like the Syriza party in Greece and the Podemos party in Spain. Commentators can’t prove it, but they suggest that the growth of anti-European parties like the French Front National and the British United Kingdom Independence Party (UKIP) and of anti-immigrant feeling are all tied to “austerity.” Until recently, Germany managed to fend off calls for inflation.

The German strategy is founded on a misconception. The Germans have assumed that other countries could alter their politics and culture to become German-like. Most countries are not like the Germans and do not want to pay the costs of becoming more German-like. They have aging populations that are set in their ways. They have lived for decades with public discourse that disparages entrepreneurs and American-style capitalism. The costs of transition will be paid by entrenched interests and will benefit chiefly their descendants.[2]

Will the Greeks be forced out of the European Community? Or will the Germans?

[1] Marcus Walker, “Analysis: Double Blow to Germany’s Leadership,” WSJ, 26 January 2015.

[2] As Groucho Marx once asked, “What’s the future ever done for me?” The United States faces something of the same dilemma. See: “College costs: the old eat the young.”

The Plagues Next Time.

Somebody (Stephen Colbert?) once joked that “Reality has a well-known liberal bias.” Actually, reality has a well-known bias in favor of human reason. Reason, in turn, is pretty much non-partisan and available to anyone who cares to develop it. Of course, one problem is that not everyone is a willing consumer.

Antibiotics.[1] Bacteria cause infections and spread infectious diseases. Infections and infectious diseases used to kill many people. Even with sterile operating rooms, for example, the danger of post-operative infection made even an appendectomy a hazardous procedure. In the first half of the 20th Century, scientists and doctors combined to launch a medical revolution. They developed antibiotics like penicillin to fight infections. All sorts of perils were suddenly conquered. Antibiotics made a vital contribution to the dramatic rise in life expectancy during the 20th Century.

Now we face a potentially devastating return of infectious diseases. The origins of this menace are complex, rather than simple and easily addressed. First, bacteria are living things that adapt to their environment. Some bacteria are hardier than other bacteria when it comes to resisting antibiotics. These hardy bacteria can develop mutations that make them more resistant to antibiotics, so they multiply while the less-resistant strains of bacteria are wiped out. (See: Darwin and his “theory” of Evolution.) Two factors have greatly facilitated this development. On the one hand, idiot doctors prescribe antibiotics in the wrong circumstances and idiot patients who are prescribed antibiotics often stop taking them before they have completed the full course. This wipes out weaker bacteria while leaving stronger bacteria to multiply. Once there are enough of the resistant bacteria in the system, existing antibiotics no longer work. On the other hand, “factory farming” of livestock involves massive use of antibiotics in the feed for these animals. Eighty percent of antibiotics are used on “factory farms.” This creates a hot-house environment for the mutation of bacteria. Ooops.

Second, pharmaceutical companies lose money on new antibiotics to fight the new “superbugs” that are developing. People only take antibiotics when they have a bacterial infection. That is a rare occurrence compared to what it was before antibiotics were developed. Moreover, the sales price of antibiotics is low. Taken together, these factors make for a thin revenue stream from antibiotics. However, antibiotics are very expensive to develop. The average antibiotic loses $50 million for the company that develops it. In contrast, drugs to treat chronic conditions (diabetes, high blood pressure, can’t-get-it-up-with-a-crane) are taken on a constant basis over a long period of time. They are money-spinners. So, no important new antibiotics have been created since 1987.

How do we avoid this train wreck? First, give the pharmaceutical companies a reason to create new antibiotics. (I know: “They make enormous profits! They should do this out of the goodness of their souls!” They won’t and the “public option” beloved of “progressive people” = the Veterans’ Administration + Solyndra.) Extend the length of time that companies have patent protection for their antibiotics. This will keep low-cost producers from churning out generics. Second, subsidize the companies with tax-credits when they develop antibiotics. Third, put a stop to the abuse of antibiotics by idiot doctors and patients, and by factory farms.

 

Vaccination.[2] One idea behind vaccination is to wipe out diseases among young people. As the diseases are wiped out, they cease to pose a threat to older people as the effects of the childhood vaccinations wear off with time. Fine, so long as hardly anyone misses out on vaccinations. However, that is just what is starting to happen.

In 1998 Dr. Andrew Wakefield published a study claiming that the development of autism in twelve children could be linked to the standard vaccination against measles, mumps, and rubella. Naturally, many parents became alarmed. A subsequent inquiry demonstrated that the study was a fraud. Many subsequent studies have demonstrated that there is no connection between vaccination and autism. Too late! The suspicion/belief that vaccination is dangerous had become entrenched among a large and growing segment of parents. Why did this happen? In part, because of a 300 percent increase in the number of cases of autism diagnosed between 2002 and 2013. Although scientists suspect that autism arises from a mixture of genetic and environmental factors, the “anti-vaxxers” aren’t buying this explanation. Today, about ten percent of parents either postpone scheduled vaccines or claim a “personal belief” exemption to prevent their children from receiving vaccinations.

Who are the “anti-vaxxers”? Their ranks include pure-life progressives who reject both vaccines and genetically-modified foods; libertarians who see good health as just one more federal intrusion on their God-given right to watch their children cough their lungs out; and the descendants of the Scopes “monkey trial” rural conservatives.

What do “anti-vaxxers” believe? They believe that immunization can cause disorders and/or that so many vaccinations—16 is common—can “overwhelm” the body’s natural resistance to disease and expose children to diseases. There is NO evidence for any of this.

There is abundant evidence that reducing the number of vaccinated children exposes adults to diseases from which they had thought themselves safe. In 2012, 50,000 Americans came down with whooping cough, by far the largest number in fifty years. Eighteen people died. In 2013 the number of cases of measles (OK, 190) was three times higher than in 2012.

Where do I go to get away from the people who want to get away from the Federal government? Idaho?

[1] “The antibiotic crisis,” The Week, 22 November 2013, p. 9.

[2] “The return of childhood diseases,” The Week, 7 March 2014, p. 9.

 

Getting a fat lady into a girdle.

It is way too early to tell how the Affordable Care Act (ACA) is going to shake out. Neither Republican doom-saying nor Democrat triumphalism seems warranted at this moment. There are signs of gains that need to be consolidated and issues that may need to be addressed.

During the first year of the ACA the uninsured rate fell by thirty percent, or 10 million people.[1] That means that seventy percent (some 20 million) of the previously uninsured are still uninsured. Between 2002 and 2012 a rising number of Americans told Commonwealth Fund pollsters that medical bills caused them financial troubles.[2] Medical debt became one of the leading causes of people filing for bankruptcy. Many people (43 percent in 2012) decided against seeking some sort of medical care because of the cost. The Affordable Care Act was intended to address this problem as one part of its effort to make health care more broadly available. The number of Americans reporting trouble with medical debt peaked at 41 percent in 2012, then began to fall, hitting 35 percent in 2014. The number of those who did not seek medical care because of cost also fell, to 36 percent. So, is the glass full, half-full, or empty?

The big problem is health-care costs and, thus, health-insurance costs.

Between 2003 and 2013, insurance premiums rose faster than did median incomes.[3] Between 2003 and 2010 insurance premiums rose by an average of 5.1 percent per year. In thirty-seven states the combined employer and employee contributions equaled at least 20 percent of median income. Thus employers’ labor costs also rose. From 2011 to 2013 the pace of increase slowed, but premiums still rose at a rate of 4.1 percent per year. By 2013 the average insurance premium had reached $16,000 nationally. Employers started looking for a way to limit the rise in their labor costs.

What they have hit on, in many cases, is shifting costs to employees. In 2003, 52 percent of workers had employer-provided insurance with a deductible. By 2013 the number had risen to 81 percent. Furthermore, the deductibles themselves rose by an average of 146 percent. They now average $1,000 per person in most states. According to a Commonwealth Fund study, the out-of-pocket costs for employees (insurance premiums + deductibles) rose from 5.3 percent of median household income in 2003 to 9.6 percent in 2013.

On the one hand, according to one report, 58 percent of Americans polled want ObamaCare repealed.[4] Why? Job-creation and wage increases have both been lagging for several years. This has left people feeling like the Great Recession never ended. Perhaps the shifting of medical costs to their consumers makes people feel like ObamaCare never happened.[5]

On the other hand, although health-care costs have risen more slowly since passage of the ACA, most economists—as opposed to political spokesmen—attribute this to the recession. They are likely to start back upward as the economy recovers. This will increase the pressure on employees for out-of-pocket expenses and premiums.

In short, we’re not yet done with health insurance reform. Maybe we’ll get it all the way right the next time.

[1] “Obamacare: Why, in Year Two, it’s still so unpopular,” The Week, 16 January 2015, p. 6.

[2] Margot Sanger-Katz, “Distress Appears to Ease Over Cost of Health Care,” NYT, 15 January 2015.

[3] Tara Siegel Bernard, “Health Premiums Rise More Slowly, but Workers Shoulder More of Cost,” NYT, 8 January 2015.

[4] “Obamacare: Why, in Year Two, it’s still so unpopular,” The Week, 16 January 2015, p. 6.

[5] However, it is possible that what they don’t like is Obama, rather than the Care. People often disapprove of a President in his lame-duck years.

 

Looking Backward and Forward.

Expert predictions for economic developments during 2014 turned out to be off-target in several important areas.[1] Try, try again. What do they say 2015 will look like?

The rest of the world had a lousy year in 2014, so the American economy looked good in comparison. Unemployment fell from 7 percent at the end of 2013 to 5.8 percent at the end of 2014.[2] US employers added 2.7 million jobs during the year. New jobs are running near the highest point since 2001. However, there are still about 2 million fewer workers with full-time jobs than before the recession.[3] So, it will be a while before there is much upward pressure on wages.[4] As a result, the American stock market had a much better year in 2014[5] than did foreign markets.[6]

Why did the American economy do better in 2014 than most other places? A combination of factors were at work, but one thing was more important than all other things. During the second half of 2014 the price of oil dropped by fifty percent. Partly, this reflected a long-developing increase in American production of oil and gas. Partly, it reflected a decision by the Gulf countries not to reduce their own production in response to falling prices.

Low oil prices should encourage world economic growth in 2015. Low energy prices also are a powerful reason to expect low inflation for a long time; expectations of low inflation may add to this dynamic.[7] Low inflation for a long time means that the Fed will not be under heavy pressure to raise interest rates.

Long-term interest rates have fallen[8] in spite of a strengthening American economy and the end of “quantitative easing.” The falling cost of borrowing will lead to lower rates for mortgages and for borrowing by business. These too should stimulate the American economy.

So, what’s the down-side of all this good news? First, countries that depend on oil for their export earnings (not just Middle Eastern countries, but also Russia and Venezuela) are going to be pinched.

Second, the Fed’s ending of quantitative easing and its expressed willingness to raise interest rates in the future have combined with economic stagnation elsewhere to raise the value of the dollar against other currencies.[9] The dollar is so central to the world economy that its rising value is likely to slow growth elsewhere.

Third, the Asian economies started to slow down in 2014. There is a certain contradiction there: China, Indonesia, and India all tightened the money supply to rein in the development of bubbles, while Japan has been trying to stimulate its economy after a long period of stagnation.

The point here is that we are still walking on a knife’s edge. The Europeans are pursuing a foolhardy policy that will prolong stagnation. China is trying to walk back some of its heady growth. The US recovery remains vulnerable to unexpected problems at home and abroad.

[1] Neil Irwin, “Market Trends of 2014: What They Mean for 2015,” NYT, 1 January 2015.

[2] Economists regard 5.2-5.5 percent unemployment as the normal “full employment” rate. You may guffaw, but you haven’t met my brother-in-law. Hire him? HA!

[3] Even that disguises the situation. About seven million people have part-time jobs, but would prefer full-time jobs.

[4] “The Year in Review,” WSJ, 31 December 2014.

[5] The S&P rose 30 percent in 2013 and 11.4 percent in 2014.

[6] However, stock prices are rising faster than corporate profits, so a correction is likely.

[7] At the start of 2014 inflation was running at 2.65 percent per year; at the start of 2015 it is down to 2.14 percent per year.

[8] Interest on 30-year Treasury notes has fallen from 3 percent in late 2013 to 2.8 percent in late 2014.

[9] The euro lost 12 percent against the dollar and the yen lost 14 percent.

By the waters of Babylon.

There was a weird and grim story in the New York Times on Sunday.[1] The story starts with two “old money” brothers: George Seymour Beckwith Gilbert (born 1942) and Thomas Gilbert (born circa 1944).[2] Their father ran a company that made textile machinery, back when America still had a textiles industry. The parents sent the boys to Phillips Andover and then to Princeton (Beckwith ’63, Tom ’66). Both went on to get MBAs (Beckwith from NYU, Tom from Harvard). Both went into finance. Thereafter their career tracks diverged. The older brother worked for firms that bought and turned around poorly performing companies. There were a lot of these in the America of the Seventies and Eighties. Eventually he founded Field Point Capital Management Company. Later, he got interested in science and medicine. This led him to get an MS in Immunology from Rockefeller University (2006).[3] He’s on a bunch of boards, corporate and academic. You could read this as an example of how people get to the top of American society and how subsequent generations stay there: a combination of brains, hard work, and the opportunities that come from social networks.

            Tom Gilbert’s career seems to have run down a different course. People from Princeton remember him as affable and athletic, rather than as highly intelligent. He worked in a bunch of jobs at Wall Street, including a seven-year stint at Loeb Partners that ended in 1991. In 1998 he founded Knowledge Delivery Systems (KDS) to provide on-line education materials.[4] In 2010 he co-founded Syzygy Therapeutics LLC. He stuck with that for a little over a year, and then founded his own private equity firm, Wainscott Capital Partners LLC, in April 2011. He was sixty-seven years old and starting a new venture.
Should we see this new venture as admirably lively or as desperate? Possibly the latter. Tom Gilbert, Sr. left an estate worth $1.6 million. Oddly, and my saying this will infuriate most people, that isn’t a lot of money.[5] About a third of his assets were his stake in his new fund. He had a house (not a “mansion”) in the Hamptons; he belonged to a couple of clubs (River in New York, Maidstone in East Hamptons); he was selling the house in the Hamptons; he and his wife had given up a brownstone on the Upper East Side for a smaller apartment on Beekman Place. He put in twelve-hour days at his new business and never took vacation. You could read this as an example of how people get to the top of American society and how subsequent generations struggle desperately to stay there: more social than smart; hard work; and the lack of social networks as the economy goes through revolutionary changes.
Beneath the surface of this little bit of social history a la Louis Auchincloss is a sadder tale that also speaks to other contemporary concerns. Tom Gilbert was (apparently) a loving, doting father who had a troubled child. Tom Gilbert, Jr. (born 1984) had followed in his father’s footsteps: he graduated from Deerfield and then from Princeton, with a degree in economics. He loved sports and had a wide circle of friends. However, something was wrong. He graduated from Princeton in 2009, when he was twenty-five. Something slowed him down. He never managed to start a career. Instead, he lived off his father: the $2,400 monthly rent on an apartment and an allowance of $600 a week.[6] Meanwhile, his friends from Deerfield and Princeton pressed on with the usual careers in business, law, and government. He went to parties, saw them, and what could he say when they asked what he was doing?
About a year ago, perhaps in late 2013, things started to get dramatically worse for the Gilberts. Tom, Jr. got barred from the Maidstone Club for giving one of the employees a bad time.[7] He had a fight with a friend (possibly over a woman); the friend got a restraining order; Tom Jr., violated the restraining order and got arrested; somebody burned down the family summer home of the friend; the police wanted to talk to Tom Jr. about this episode, but never charged anyone with setting the fire. More and more friends stopped seeing him. Tom Jr. got a gun and started spending time at a range.[8] Some of Tom Jr.’s friends told Tom Sr. that they were worried about his son. Undoubtedly, Tom Sr. was worried as well. He had paid for a lawyer to resolve some “minor matter.” He may have persuaded his son to get medical help and paid for that. Tom Jr. doesn’t seem to have appreciated the help. He became critical, even mocking, of his father.
The two strands of Tom Gilbert Sr.’s life came together in early January 2015. He was making sacrifices to get his fund up and running by downsizing his own life-style. Truth be told, he wasn’t getting any younger and there was no guarantee that he would be able to build his fund into a real fortune. He probably wasn’t going to be able to leave a huge inheritance to his family. Tom Jr. may have seemed stuck in a life going nowhere and in need of some kind of help. Either because the financial pressures he faced were becoming grave or because he hoped to nudge his son toward becoming self-sustaining, Tom Sr. told his son that he would have to reduce his allowance. On Sunday, 4 January 2015, Tom Sr. was shot once in the head in his apartment. Police arrested Tom Jr. later that day.
Some in the media want to make the story about the harmful effects of “privilege.” That isn’t what it’s about. Instead, the story is about two things. One is that inherited “privilege” is nowhere near as reliable as it once may have been. The differential fortunes of the two older Gilbert brothers illustrate that point. The structure of the American economy has been changing fast. The decline of old industries has wreaked havoc with blue-collar and middle class incomes. Did it do the same with upper-class inheritances, forcing a whole generation to seek opportunities to restore or shore-up their assets? The composition of the American financial elite also appears to be changing in response to the rise of new industries. Adapt or disappear.
A second thing is that a troubled adult is hard for anyone to assess, help, or control. It’s hard to tell how far a person will fall. It’s difficult to get anyone institutionalized after they hit fourteen unless they do something that makes people say that they should have been institutionalized before they did it. It’s easy to say that someone needs help, but harder to find help that works. It’s easy for people to get their hands on firearms, even when there is a restraining order against them for one thing and they’re suspects in a crime for something else.
Of the two themes, the second seems by far the more important, and its outcome the more tragic. Parents of all social classes and races have struggled with troubled children. Sometimes things work out and life for everyone gets progressively better. Sometimes they don’t, and there flows a river of tears.

[1] Landon Thomas, Jr., “The Price of Privilege,” NYT, 18 January 2015.

[2] Are they related to the Seymour Parker Gilbert who was a New York investment banker and later was Agent-General for Reparations in the Twenties?

[3] http://www.pa59ers.com/potpourri/folders/g05-Gilbe/g05.html

[4] If you look at the current leadership team at KDS you can’t help but get the feeling that they are not “old money” or “old school.” BAs from Tufts, Yeshiva, Wake Forest, Gettysburg, UCLA, Kenyon, George Washington, North Texas State, and Howard. Blacks, women, and Jews. http://www1.kdsi.org/about-kds/kds-team.htm

[5] Well, it isn’t a lot of money for a 70-year-old guy who came from money, got a first-rate education, and spent his working life on Wall Street.

[6] So he’s costing the father $60K a year. Do all rich families subsidize their children in this fashion while the kids find their feet, or is this an exception to the rule?

[7] The incident may have been really egregious or not the first time if it got him banned.

[8] Glock 22: .40-cal. pistol favored by big-city police departments and the DEA. Ugly piece of work.

Rivers of Blood I.

Muslim immigration to Western Europe began by stages after the Second World War as labor-short economies and the end of empires combined to draw non-Europeans toward the “mother country.” A great deal of thoughtlessness went into these migrations. All host countries were ill-prepared to deal with the immigrants.

In January 2015 there are an estimated 20 million Muslims in Europe. About 5 million are in France, where they make up 8 percent of the population. (See: “The other land of liberty and opportunity.”) In Britain and Germany they make up 5 percent of the population. One of the things that eats at European countries is the feeling that immigrants have come to their countries to prey on the generous social welfare provision of enlightened countries. In the 1970s, two-thirds of the immigrants in Germany were in the labor force, while one-third were not. Thirty years later, scarcely more than a quarter of immigrants were in the labor force.[1] Another problem, revealed by a poll in L’Express in January 2013, is that 74 percent of those polled said that Islam “is not compatible with French society.” Yet this feeling finds no expression in the “mainstream” or “respectable” French political parties. Why not?

Christopher Caldwell argues that the European left has made discussion of the problems raised by immigration almost impossible.[2] On the one hand, they have evoked European historical crimes—the Holocaust above all—to justify repressing unwelcome speech. He implies that they have undermined the foundations of democracy in the process. In France, he sees the “SOS Racisme” group created in the 1980s as a puppet of the Socialist Party intended to shout down conservative voices and the 1990 Gayssot Law against Holocaust-denial as an entering wedge for people who want to stifle discussion of other historical events—many of them highly unpleasant and non-Western. Most recently, the anti-immigrant Front National Party got left out of the post-Charlie Hebdo parade on the grounds that it was not “republican” enough.

On the other hand, people on the left have failed to understand that, whatever was done to European Jews, it wasn’t done by Muslims. Just as Palestinians have felt free to reject the State of Israel as European expiation of European crimes at the expense of Arabs, so too have Euro-Muslims felt free to reject European progressive thought as an alien set of values intended to curb their own beliefs. If one adds these forces to the failure to integrate the immigrants and their French-born descendants into French society, one can begin to understand some of the impulses that set the Kouachi brothers and Amedy Coulibaly on the path to terrorism. They are not alone in their alienation, hostility, and religious fervor.

Caldwell understands that Europe’s aging and declining non-Muslim population makes immigration essential. He is less quick to say that the anti-immigrant and anti-Muslim sentiments expressed by parties like the Front National can have no practical expression in public policy. Yes, the Jews were expelled from Spain in 1492. Is that what the Front National or other parties expect to repeat with Muslims? Some of the Front National voters undoubtedly do want that, but the program of the party calls for a halt to further immigration and a defense of “secular values.” France already has more police per capita than any other European country. Even so, security lapses allowed the Kouachis and Coulibaly to escape detection of their plans to kill.

There is going to have to be a third way between “political correctness” and stupidity.

[1] Two million out of three million in the early 1970s versus two million out of seven-and-a-half million in the early 2000s.

[2] Christopher Caldwell, “Europe’s Crisis of Faith,” WSJ, 17-18 January 2015.

 

Nothing to CLAP about.

There is an exam called the Collegiate Learning Assessment Plus.[1] The exam measures how much college students gain between the freshman year and the senior year. It assesses communications skills (reading, writing); analytical reasoning; and critical thinking. Thus, it is applicable across disciplines and measures the “transferable skills” that have long been touted as the real value of a college education.

The results of the CLA+ for 2013-2014 give cause for hope and fear.[2] Of Freshmen who took the test, 63 percent scored below the Proficient level and 37 percent scored Proficient or higher. Of Seniors who took the test, 40 percent scored below the Proficient level and 60 percent scored Proficient or higher. Of Freshmen, 31 percent enter college at a Below Basic level, but by the Senior year this share has been reduced to 14 percent. Similarly, 32 percent of Freshmen score at the Basic level, but by the Senior year this has been reduced to 26 percent, even as 17 percent have moved up from Below Basic to at least Basic.

So, the good news is that colleges take the 37 percent who are already proficient and make them more proficient, and they take 23 percent who are not proficient and raise them to proficiency. Thus, 60 percent of college students benefit from attending college.[3]

What’s the bad news? Well, 14 percent of seniors graduate with a Below Basic score and another 26 percent graduate with a Basic, but Below Proficient score. That’s 40 percent who come out of college deficient in the intellectual skills assessed by the CLA+ exam. That is a huge wastage of resources. Of late, much attention has focused on graduation rates and time-to-graduation. Here, the United States has lost its world-leading position and has fallen behind some other countries. The results of the CLA+ exam suggest that the problem is actually worse than it appears because 40 percent of college graduates don’t actually function at a BA level.

There’s a part I don’t understand, but which I will report. Test scores fall in a range between 400 and 1600. The average Freshman score is 1039; the average Senior score is 1128. The average improvement is 89 points. If, for the sake of argument, you subtract the 400 points you get for being able to sign your own name, then the average Freshman score is 639 and the average Senior score is 728. An 89-point increase amounts to a gain of just under 14 percent.
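The arithmetic in that paragraph can be checked with a quick sketch (the figures are the ones reported above; the variable names are mine):

```python
# CLA+ scores reportedly range from 400 to 1600, so 400 is the floor.
FLOOR = 400
freshman_avg = 1039
senior_avg = 1128

raw_gain = senior_avg - freshman_avg            # 89 points
effective_freshman = freshman_avg - FLOOR       # 639
effective_senior = senior_avg - FLOOR           # 728
pct_gain = raw_gain / effective_freshman * 100  # gain relative to the freshman base

print(raw_gain, effective_freshman, effective_senior, round(pct_gain, 1))
# → 89 639 728 13.9
```

Note that measuring the gain against the floor-adjusted base (639) rather than the raw average (1039) is what makes it come out near 14 percent instead of under 9.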

Still, these reports raise several questions. Why do almost two-thirds of Freshmen start college below the level of proficiency for their group? Furthermore, many students do not go on to college at all. This suggests that K-12 education is failing many students. It also suggests that an increasingly remedial function is being forced on colleges. (At the same time, they are being criticized for loading students and parents with debt and for not graduating students in a timely fashion.)

Is a 14 percent average improvement enough to justify the cost of four years of college? Does the 14 percent improvement push students over some undefined threshold between incompetence and competence? If it does, then the money probably is well spent.

It’s just my opinion, but professors are the least qualified to understand the nature of the problem. Their children grow up with books, pictures on the walls, a variety of kinds of music playing, trips to cultural events rather than Disney World, experiences valued over possessions, and parents who work all the time. So, their children are usually successful in school and in life.

[1] This is abbreviated as CLA+ so that anxious parents will not be overheard asking other parents “So, how did your kid do with the CLAP?”

[2] Douglas Belkin, “Skills Gap Found in College Students,” WSJ, 17-18 January 2015.

[3] Maybe all of them do, without that showing up in the test scores. Maybe they are marginally more attuned to key skills without quite getting out of the bottom category.

Legacies of the Violent Decades.

The 1970s and 1980s were violent decades.[1] The rate for all violent crime rose from about 500/100,000 people to almost 800 between 1975 and 1991. The robbery rate rose from about 200/100,000 people in 1975 to about 270 in 1991. The rate for aggravated assaults rose from about 230/100,000 people to about 450 in 1992. From 1975 through 1991 the murder rate bounced around between 8 and 10/100,000 people. In 1990 there were 2,245 homicides in New York City (five a day) and 474 homicides in Washington, DC (more than one a day).

State and federal governments lashed out against this spike in crime with the weapons at hand. The federal government directed billions of dollars to the states to increase the number of police and to build prisons to house the people the police caught. Sentences were lengthened for some crimes and mandatory minimums were imposed to limit the freedom of judges. Between the early 1970s and 2009 the number of people in state or federal prisons quadrupled to about 1.5 million people.

Then the rates of violent crime began to drop. The rate for all violent crime fell by 51 percent, to a level 25 percent below the 1975 rate. The rate for aggravated assault fell from its 1992 peak by 48 percent, roughly back to where it had been in 1975. The rate for robbery fell from its 1991 peak by 60 percent, to a level 51 percent below the 1975 rate. The murder rate fell from its 1992 peak by 41 percent, to a level slightly below its 1975 rate. In 2014 there were 328 homicides in New York City (less than one a day) and 104 homicides in Washington, DC (two a week).

This remarkable change has begun to spark debate, just as did the remarkable spike in violence in America before 1990. One question is what has happened since 1990 to bring down the rate of violent crime. Experts are not entirely sure how to answer this question, but they do agree on some things. First, targeted policing is a big part of the answer. New York City Police Commissioner William J. Bratton introduced the use of computer data and crime mapping (“CompStat”) to identify targets for police efforts. Police began to concentrate their efforts on these identifiable trouble spots. Drugs used to be sold right out on the street; aggressive policing pushed the sales indoors. That didn’t do much to cut down on drug use, but it did make drive-by shootings a lot less lethal. The “broken windows” strategy came to be widely adopted. Second, tougher sentencing and mass incarceration played a lesser role than advocates expected.[2]

A second question is what to do going forward. On the one hand, what is to be done with the large numbers of people still locked up from the previous decades? If they are released, will they just return to their old ways? Can people convicted of non-violent crimes be safely released and better served with drug-treatment programs? Should the length of sentences be reduced?

On the other hand, should the aggressive policing that accompanied the reduction in crime be scaled back? When crime rates are high and people are afraid, they are willing to tolerate aggressive forms of policing that they will not tolerate when crime rates are low and people feel secure. “Stop and frisk” has come under heavy fire. It has been argued that this kind of policing—which may have created the situation in which Eric Garner died—has begun to alienate law-abiding people in the communities on which the police focus. Can the police operate in an environment in which they are widely viewed as the enemy?

See: “The Senator from San Quentin”; “Military Police”; “Death Wish.”

[1] Erik Eckholm, “With Crime Down, U.S. Faces Legacy of a Violent Age,” NYT, 14 January 2015.

[2] Which is not the same as saying that they played no role.