Mass shootings.

What is a “mass shooting”? Answers differ: at least four people shot dead in a public place; or at least four people shot dead anywhere; or at least four people shot—wounded or killed—anywhere.

In liberal discourse, the US leads the world in mass shootings. By one count, 31 percent of all mass shootings occur in the United States.[1] Proponents of this view are quick to slide in the “developed country” qualifier because, without it, the claim is false: the United States does not lead the world outright. Why not drive into a Mexican border town to check it out? Still, saying “well it’s worse in Guatemala” doesn’t help.

The most expansive totals for “mass shootings” appear to be arrived at by rolling in all the shootings associated with a sub-culture of violence among poor people. Drive-by shootings get counted just like massacres in fast-food restaurants. “Would you like death with that?”

One trope, less noticed and less publicized than others, holds that mass killings have a copy-cat element to them. Mass media attention devoted to one killer then helps put the idea into the pointy little heads of others. So, one solution would be to regulate the press to reduce the “if it bleeds, it leads” mentality.[2] This would involve curtailing the First Amendment.

Another trope, much more widely noticed, holds that these appalling crimes arise from American “gun culture.” Widespread gun-ownership and feeble limits on access to guns by evil-doers lead to slaughter. Leaving aside the people who beef with someone at an after-hours party in a rotting former greenhouse on a Saturday night, who are the shooters? Almost all are men; almost two-thirds (64 percent) are white.[3] Working backward after mass shootings, scholars have found in about half of the killers some earlier sign of “mental illness.” The trouble is that this runs the gamut from depression to paranoia to full-blown schizophrenia. Moreover, “there is no one diagnosis that’s linked to mass shootings.” Many different diagnoses have been offered. Hence, “We can’t go out and lock up all the socially awkward young men in the world.”[4] Of course not: they often become college professors. (I can hear the gears turning in Lynne Cheney’s head already.) Furthermore, millions of men suffer from some kind of mental illness without ever becoming violent. In our current state of knowledge, it is impossible to predict who will be a killer (perhaps 20 a year) and who will not (millions).[5] Like convicted felons, people who have been involuntarily committed to a mental health facility are barred from purchasing guns. However, less than a quarter of the killers in mass shootings (23 percent) have been treated for a prior mental illness.

So, if you can’t invade First Amendment freedoms because the right of businesses to sell faulty products is sacred, and you can’t predict which mentally ill person will turn into a mass killer, and you don’t like a high murder rate, and the regulation of sales of guns is flawed by human error, then the only logical solution to the problem would be to disarm Americans in general. This is where a lot of the push-back originates.

About 100 people a year are killed in mass shootings, out of 11,000-12,000 murder victims. That is both a drop in the bucket and a sign of the malign influence of media.

[1] “The killing contagion,” The Week, 11 September 2015, p. 11.

[2] Hillary Clinton has recently endorsed proposals to try to deter the “short term” obsession of the stock market traders, so it isn’t much of a jump to deterring the obsessions of reporters and advertising managers.

[3] Compared with African-Americans (16%) and Asians (9%). The white share of mass shooters matches the white share of the over-all population, while the African-American share somewhat exceeds its share of the over-all population (12.2%) and the Asian share is dramatically higher than its share of the over-all population (4.7%).

[4] Jeffrey Swanson, Duke University, quoted in “The killing contagion,” The Week, 11 September 2015, p. 11.

[5] See: “Minority Report” (dir. Steven Spielberg, 2002).

Inequality 7.

According to the CIA, income inequality in the United States now is more extreme than in Red China.[1] So what? What matters is that a “rising tide lifts all boats,” as JFK said when arguing for a tax cut. However, some economists argue that the evidence for this “true that” statement is sketchy (as young people used to say). President Clinton got Congress to raise the top tax rate from 31 percent to 39.6 percent and the economy boomed (admittedly with the “Tech Bubble” that collapsed after he left office); President George W. Bush got Congress to cut taxes on high earners to 35 percent, but the economy floundered (admittedly with the “Housing Bubble” that collapsed before he left office). In this analysis, what really matters is the amount of demand for goods in the economy. That is an argument for shifting resources to consumers.

The “Great Bull Market” of the Twenties (and other stuff that pundits don’t want to know about) led to the Great Depression. The Great Depression led to the New Deal and 20 years of Democratic dominance in Congress. The Depression discredited businessmen as prophets of the New Era. The New Deal imposed all sorts of restrictions on business and raised taxes on the rich swells (who were in some vague way blamed for the Depression). By the 1950s the top rate on marginal incomes had risen to 91 percent, essentially a confiscatory tax on high incomes. Proponents of relative income equality point to this period as the ideal society because it coincided with the period of American economic ease. Good-paying working-class jobs allowed many people with only a high-school diploma to enter some version of the middle class.

However, the Great Depression ended in 1940. By the 1970s a whole new generation of businessmen had come on the scene. They were unburdened by the sins of their elders. They campaigned for a reduction in the punitive tax rates of the New Deal era. One can see this as Republicans responding to the Democratic strategy of “tax, spend, elect” with their own mantra of “tax-cut, spend, elect.” In theory, savings create investment capital and investment capital creates jobs. Therefore, the top marginal income-tax rate fell to 70 percent in the 1970s, then to 50 percent in the first Reagan administration, and then to 28 percent in the second Reagan administration. Bill Clinton pushed for and won a reduction in the tax on capital gains from 28 percent to 20 percent. George W. Bush pushed for and won a reduction in the tax on capital gains from 20 percent to 15 percent. However, President Bush also pushed for massive cuts on taxes paid by lower income groups.

Two things resulted from the Bush tax cuts. First, the US government lost $400 billion a year in revenue. Of this lost revenue, “only” $87 billion came from people earning $250,000 a year or more. The other $313 billion came from people earning less than $250,000 a year.[2]

Second, taxation became much more progressive. While cutting taxes overall, Bush shifted the burden of taxation onto upper income earners. After the Bush tax cuts, the top 1 percent of income-earners now pays 40 percent of the income tax bill (and 21 percent of all taxes), while 47 percent of Americans now pay no income tax at all.[3] Despite his bitter condemnation of the Bush Administration on many scores, President Obama fought hard to preserve 98 percent of the cuts.

There are three observations worth making. One is that there are big long-term trends or swings in tax policy. The huge deficits looming as the “Baby Boom” ages may herald an end to low taxation for everyone.

A second is that President Obama has loudly condemned the plutocrats “who tanked the economy” in the financial crisis. How did Bill Gates or Steve Jobs or Warren Buffett or the idiots who ran American car companies “tank the economy”? They didn’t. In fact, only about 14 percent of the richest Americans work in finance. Yet Gates, Jobs, Buffett and a lot of other ordinary, successful entrepreneurs were hammered by the Obama tax increases.[4] They have also been subject to his frequent dispensation of moral opprobrium.[5]

A third is that the Democrats need to define what they mean by “fair.” As in, “the rich should pay their fair share.” The rich are already carrying a disproportionate share of the fiscal weight while almost half of Americans pay nothing at all for the programs that benefit them. As Woody Guthrie might have said (had he been an entirely different person), “A poor man with a ballot-box can rob you just as easily as can a rich man with a pen.”

[1] “Taxing the rich,” The Week, 4 November 2011, p. 11.

[2] Can you impeach a former President?

[3] If “taxation without representation is tyranny,” then what is representation without taxation?

[4] Perhaps it is worth pointing out that of the “one percent,” about 16 percent are in medicine, about 12 percent are lawyers, and about 50 percent of the members of the House of Representatives and the Senate belong to the “one percent.”

[5] See: “Stuff my president says.”

The Clin-tons. See: Theme for “The Simpsons.”

After leaving the White House in 2001, Bill Clinton found himself at loose ends. He didn’t have a ranch with brush to clear, so he started a little foundation to help children in Harlem. In 2002 he added an effort to raise money to lower the cost of AIDS drugs in Africa. In 2005 he launched the Clinton Global Initiative: an annual meeting of the smart, rich, and “concerned.” This mini-Davos still runs, providing an opportunity for powerful people from many domains to hob-nob. However, the Clinton Foundation soon saw itself awash in donations ($2 billion and counting) from big business and foreign governments. In addition, Bill Clinton found himself much in demand as a speaker: he has earned $26 million in fees.[1] It is, or should be, hard for any American to carp about this tale of a poor country boy who made good.

One fly in the ointment is that examination of the Clinton Foundation’s tax records for 2011-2013 shows that only 10 percent of the donations it received went to actual charitable projects. The rest went to administrative expenses.[2] Those administrative expenses include a staff of 2,000 that is packed with Clinton loyalists.

A second fly in the ointment is that Hillary Clinton launched her own political career at the same moment that Bill Clinton launched his profitable post-presidency. She won election to the Senate, ran for the Democratic nomination for President in a year when the Democrats actually did win the White House, served for four years as Secretary of State as a consolation prize from Barack Obama, and is now the front-runner for the Democratic nomination for President. The millions of dollars pouring into their joint account began to look very much like a slush fund and an exercise in influence-peddling.

In the second term of the George W. Bush Administration, the US sold about $85 billion in weapons to twenty State Department-approved countries. In the first term of the Obama Administration, the US sold about $165 billion in weapons to twenty State Department-approved countries. Those twenty countries had made millions of dollars in donations to the Clinton Foundation.[3] For example, the government of Algeria donated $500,000 to the foundation, then received State Department approval for a 70 percent increase in authorized military purchases from the United States. That looks bad, to my eye, but it gets worse. The Obama administration had extracted a promise from the Clintons that all foreign donations to the foundation would be fully reported. Somehow, the foundation forgot to report this one and others as well.

When the Hillary Clinton e-mail “scandal” first broke, 44 percent of Republicans thought it was a “very serious problem,” while 17 percent of Democrats thought it was a “very serious problem.” After a week of both parties spinning the issue for all it was worth, the divergence had increased. By late March 2015, 68 percent of Republicans thought that it was a “very serious problem,” while 8 percent of Democrats thought that it was a “very serious problem.”[4] That “scandal” centers on Hillary Clinton’s use of a potentially insecure private e-mail server located in the Clinton family home in New York. Under pressure, she turned over 30,000 e-mail messages that bore on State Department business. Some Republican inquisitors may hope to find a smoking gun with regard to Benghazi. However, the real issue may be in the many other “personal” messages that she deleted. Worming around in the minds of many people is the suspicion that “If it walks like a duck and it quacks like a duck, then it’s a duck.”

[1] “The Clintons’ controversial foundation,” The Week, 3 July 2015, p. 11.

[2] “Clinton Foundation: Is it a true charity?” The Week, 15 May 2015, p. 16.

[3] “Noted,” The Week, 12 June 2015, p. 16.

[4] “Poll Watch,” The Week, 27 March 2015, p. 17.

Wahhabism?

Back in the many-days-ago, immediately following the death of the Prophet Muhammad, Muslims divided over the question of who should lead the “Umma” (the Faithful). Should it be some prominent person who enjoyed wide deference among Arabs or should it be a blood relative? The prominent (and rich) men who argued that one of them should lead tended to be “late adopters” of Islam. This opened them to the suspicion that they were what the Nazis would call “March violets”—opportunists who joined the movement once it came to power. The men who thought that a blood relative should lead tended to be, well, blood relatives, but also essentially lower-ranking figures committed to tribal loyalties. Islam divided between those who supported an eminent figure (Sunnis, the vast majority) and those who favored a blood relative (Shi’ites, a minority overall, but the clear majority in Iran and Iraq). The two sects of Islam did battle for hundreds of years. Today, the Islamic Republic of Iran espouses the cause of the Shi’ites, while the Kingdom of Saudi Arabia espouses the cause of the Sunnis.

For many years, the United States fostered warm relations with Iran. Then came the Iranian Islamist Revolution of 1979. The Americans shifted their support to Sunni rulers, like the kings of Saudi Arabia, but also to more “secular” Arab leaders like Saddam Hussein.[1] This makes it sound like the US is backing “moderate” Islam against “radical” Islam. Nothing could be further from the truth. The Saudis have their own brand of religious radicalism, Wahhabism.

Wahhabism began in the 18th Century as a puritanical sect of Sunni Islam. The founder, the sheik Abdul-Wahhab, forged an alliance with the leader of the Saud family, an alliance sealed by the marriage of their children. Almost two centuries later, the Saud family completed the conquest of Arabia. Later still, Saudi Arabia became a major oil exporter. The oil wealth led to a loosening of the strict moral standards that had run in parallel with the rise of the Sauds. In 1979, Wahhabist enthusiasts administered a very public rebuke to the nation’s leadership by seizing the Great Mosque in Mecca. Taking the message to heart, the Saudi leadership changed course. Saudi Arabia has long tried to spread Wahhabism while checking the spread of Shi’ite doctrines.[2] Saudi money pays for mosques, schools, and cultural centers abroad.

In failed or failing states like Pakistan and Afghanistan during the war against the Soviets, Saudi-funded religious schools (“madrasas”) offered the only schooling available to children in border regions and in refugee camps. The Wahhabist doctrines spread to many boys who would later take up arms as part of the Taliban. The schools continue to teach students drawn from Muslim populations in Indonesia and Malaysia.

In exchange for this largesse for the cause, Wahhabist militants operate only outside Saudi Arabia. The “Arab Afghans” who went to fight the Soviet Union were Wahhabists. Others went to fight in Bosnia or in Chechnya. Most of the 9/11 plane hijackers were Wahhabists.[3] The Nigerian group Boko Haram grew out of Saudi-funded efforts to counter the spread of Sufism in the Sahel. ISIS can be seen as an extension of Wahhabism. Certainly, the Saudis have shown no interest in fighting it in Syria and Iraq, even as their planes pound Shi’ites in Yemen.

In short, victory over Iranian-backed Islamism might just reveal a greater danger still. Little in either the media or government pronouncements is preparing Americans for that shock.

[1] Clients of Iran had a hand in bombing the Marine barracks in Beirut, so it isn’t like this was done at the whim of the oil companies. Regardless of the last sermon in the New York Times.

[2] “Exporting radical Islam,” The Week, 14 August 2015, p. 11.

[3] A portion of the 9/11 Commission’s report that deals with Saudi involvement remains classified.

Vacation dream spot.

Back in April 2008 a New York Times writer sang the praises of an as-yet under-touristed destination. There one could find an “ancient way of life that is still largely intact.” It was but the latest of the-next-place-to-be-discovered.[1] Contemporary society—or some sub-set of it—places a premium on rare and new experiences. Probably they are a form of status possession. Globalization in all its forms (standardization of products world-wide; cheap jet fares; the idea of taking a gap year or sabbatical at some point in your life; wealthy leisure-based societies) has created a huge market for experiences that once were the realm of misfits.[2] Now college graduates with Business degrees fight forest fires and work at ski resorts; academically-inclined college students seek berths on merchant ships, future school teachers spend a few years trying to surf all the best breaks in the Pacific; and bed-and-breakfast inn-keepers in New England spend the off-season buying textiles in Bali.[3] What are they after? Something different from the Burberry-Ralph Lauren-Tommy Hilfiger knock-off possessions that jam the stores? Some contact with challenging and “authentic” experience? Hence the search for new places.

Where was this wonderland? Yemen.[4] There, “every prospect pleases”[5]: remarkable traditional architecture un-sullied by the golden arches of McDonald’s and a combination of mountain with desert. In the Old City section of the capital, visitors are literally walking back into the Middle Ages in a way that is not true of the hordes trudging around Notre Dame in Paris. There are street markets that look and smell (of khat and persimmons) much as they must have when Mohammed was contemplating a career change. Striking out from the capital, visitors could explore the mountain-top village of Al Hajjara,[6] a sort of cactus-strewn Muslim Orvieto, which is not much changed from the time of its original construction in the 11th Century. Then there is the Wadi Hadhramaut, an Arabian valley in which things will actually grow. Frankincense first of all, but also senna and coconut.

Well, understandably, things have deteriorated since that description of actual adventure tourism.[7] “Only man is vile.”[8] Even in 2008 the US Department of State issued scary “travel advisories” for those thinking of a trip to Yemen. Now the country is home to a lot of al Qaeda people, there’s a savage civil war going on, and Saudi Arabia and Iran are using it as a proxy battlefield in the same way that Nazi Germany and Stalinist Russia used Spain in the late Thirties.

That doesn’t mean that things will stay this bad forever. Yahya Muhammad Hamid ed-Din (or Imam Yahya) (1869-1948) ruled the country after the First World War. He reined in, if he could not entirely put a stop to, the endemic feuds and banditry. So, perhaps one day trekkers will return to Al Hajjara and the Hadhramaut valley.

[1] See: Alex Garland, The Beach (1996); William Gibson, Pattern Recognition (2003); and David Simon, producer, “The Wire” (2002-2008) for various observations on modern society’s relentless drive to “step on the package.” Anyway, that’s how I read them.

[2] See, for one example: https://en.wikipedia.org/wiki/Henry_de_Monfreid

[3] Just to list some people I know.

[4] “This week’s dream: Yemen’s secret world.” The Week, 4 April 2008, p. 30.

[5] Reginald Heber, “From Greenland’s Icy Mountains” (1819). http://www.hymnsandcarolsofchristmas.com/Hymns_and_Carols/from_greenlands_icy_mountains.htm

[6] Lonely Planet used to publish a guide-book to Yemen. It noted that Al Hajjara served as the jumping-off point for people hiking into the wilderness. I wonder if Anwar al-Awlaki had a copy?

[7] As opposed to merely working up a sweat being led around places by NOLS teams or having a five-star dinner in the open on a dude ranch.

[8] Heber again.

Ammo 2.

Back in 2007, at the height of the wars in Iraq and Afghanistan, American soldiers were firing a billion rounds a year. That’s a lot of bullets, even by my standards. About 1,500 Iraqis were killed by US and coalition forces in 2007.[1] About 4,500 enemy fighters were killed in Afghanistan in 2007.[2] So, that would end up totaling about 6,000 enemy combatants killed in 2007 in the two wars taken together. In theory, that means that American soldiers fired a billion rounds to kill 6,000 enemies. That makes it sound like they’re just spraying around on full-auto at the first sign of trouble. Except that it isn’t true. American soldiers and Marines fire a lot, probably most, of their rounds in training. Still, that leaves us with the question of how many rounds American soldiers did fire in combat. I haven’t figured out how to track that yet. It is worth doing because it is one way of measuring what may have been the experience of Afghans and Iraqis with American soldiers. Do they just shoot up any place that gives them guff or are they obviously discriminating in their use of force? This has implications for our relationships with these people going forward.
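The back-of-the-envelope arithmetic above can be made explicit. A minimal sketch, using only the figures quoted in this paragraph:

```python
# Rough figures quoted above (2007, both wars combined)
rounds_fired_per_year = 1_000_000_000   # "a billion rounds a year"
enemy_killed_iraq = 1_500
enemy_killed_afghanistan = 4_500

total_killed = enemy_killed_iraq + enemy_killed_afghanistan
rounds_per_kill = rounds_fired_per_year // total_killed

print(total_killed)      # 6000
print(rounds_per_kill)   # 166666 -- if every round had been fired in combat
```

That last number is exactly why the caveat matters: since most rounds were fired in training, it says nothing directly about fire discipline in combat.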

Then, bullets are a commodity just like, say, eggs. At any given moment, production is limited to some level. When demand goes up, prices rise until production expands. The federal government can always run a deficit and just print the money it needs. In contrast, state and local governments are required to live within their means. What this meant was that the federal government, able to pay whatever the market asked, came to dominate the bullet market at the expense of both hunters and police departments. I don’t know what hunters did. Maybe there are a lot more deer and bear wandering around as a result of our wars. However, faced with a shortage of bullets, police departments responded by reducing the amount of live-fire target practice and training.[3] Apparently, this began back in 2007 at the latest. How long did the training reduction continue? For that matter, is it still in effect? Administrative systems develop a certain momentum that can be difficult to change. The point here is to ask whether that training reduction is in any way connected to the recent high-profile cases of police officers shooting unarmed people. Or perhaps this is just an example of “apophenia” (seeing patterns where none actually exist).[4]

Overlapping this ammunition shortage was another one, associated with events of the first Obama administration. Many gun-owners were deeply suspicious of the new president on the matter of the Second Amendment.[5] This led to the buying of guns and ammunition, just as my father-in-law’s own father bought several casks of brandy as Prohibition approached. In December 2012, the massacre at Sandy Hook school led to calls[6] for much tighter regulations of guns. Lots of people bought ammunition (and probably receivers). For example, the FBI reported 2.8 million background checks in December 2012, most coming after the Sandy Hook shootings. The price of .22-cal. Long Rifle went from 5 cents a round to 12 cents a round.[7]

Little things can be made to tell you a lot. Or, at least, raise questions.

[1] See: https://www.iraqbodycount.org/analysis/numbers/2011/

[2] See: https://en.wikipedia.org/wiki/Civilian_casualties_in_the_war_in_Afghanistan_%282001%E2%80%93present%29#Civilian_and_overall_casualties_.282006.29

[3] “Noted,” The Week, 7 September 2007, p. 20.

[4] See: William Gibson, Pattern Recognition (2003). Amazing book. My students hate it.

[5] His derisive comments about people “holding on to their God and their guns” didn’t win him any friends among gun-owners. See: “Stuff My President Says.”

[6] Including my own e-mailed appeal to one of my two idiot Senators.

[7] This is the ammunition fired by the very popular Ruger 10/22 semi-automatic rifle. Really sweet piece of work.

TGIF.

In mid-July 2013, 56 percent of Americans favored the Supreme Court’s decision to strike down the Defense of Marriage Act (DoMA), which defined marriage, for federal purposes, as the union of a man and a woman. Still, a large minority (41 percent) opposed the Court’s decision.[1]

In summer 2015, with gay marriage equality close to a done deal, transgenderism emerged as a hot topic. In June 2015, 45 percent of Americans regarded transgenderism as a moral issue, 39 percent regarded it as not a moral issue, and 16 percent didn’t know if it was a moral issue. However, of the 45 percent who regarded it as a moral issue, 14 percent regarded it as morally acceptable and 31 percent regarded it as morally wrong.[2]

Do people understand the difference between transgenderism and transvestitism? Are they assuming (incorrectly) that all or most transvestites are homosexual? Are they using this as a proxy for their feelings about homosexuality?

According to one recent poll, 54 percent of Americans aged 18 to 29 think that transgender people should be allowed to use the public restroom that corresponds to their gender identification. In contrast, only 31 percent of Americans aged 45 and older think that transgender people should be allowed to use the public restroom that corresponds to their gender identification.[3]

In one sense, this is the standard story of progress across the generations. The older generation is less comfortable acknowledging change that seems more normal to younger generations. As it was with race and gender, so now it is with both sexual orientation and gender identification.

Still, there is something bizarre about this poll. There were almost 320 million Americans in 2014. There were maybe 700,000 people who identify as transgender. Most of them are in the closet. Unless you live in a major city, your chance of encountering a transgender person in a public restroom is virtually nil. Then, the majority of them are male-to-female identifiers.[4] So, they are guys dressed as women who have a reasonable belief that they can “pass.” (Otherwise they wouldn’t run the risk of going out “en femme.”) Women’s restrooms are all stalls with doors. (Why do you think the lines outside women’s restrooms are so long? Are the lines really for the mirrors? Aside from the fact that the idiot architects put the same number of toilets in both restrooms, but then put in a lot of urinals in men’s restrooms.) So how is any woman going to identify the person sitting in the stall next to them in women’s shoes as really a guy?

So, I conjecture that most of the people—older and younger—who object to transgender people using the public restroom of their choice are probably guys who are worried that some good-looking babe will walk in to the men’s room, stand at the next urinal, and lift the hem of her mini-skirt.

[1] “Poll Watch,” The Week, 19 July 2013, p. 15.

[2] “Poll Watch,” The Week, 19 June 2015, p. 15.

[3] “Poll Watch,” The Week, 13 March 2015, p. 17.

[4] http://www.examiner.com/article/transgender-transsexual-issues-101-are-there-more-trans-women-than-trans-men-part-one

Love and Marriage.

Twenty years ago, about 21 percent of married men and 7.5 percent of married women would admit to having had an extra-marital relationship.[1] Today, the rate for men has stayed the same, but the percentage of married women admitting to having had an extra-marital relationship has climbed to 14.7 percent.[2] I suppose that counts as some kind of victory of feminism.[3] At the same time, a slightly larger share of Republicans (67 percent) than Democrats (60 percent) report being “very happy” in their marriage.[4]

Scholars have commented on both issues. On the one hand, some suppose that the growing equality of women in the work-place has made married women more financially independent and less likely to fear the consequences of discovery. That is, getting tossed out on their ear, and losing their children, and late-model used car, and Kohl’s charge card. On the other hand, some scholars have suggested that there is more social support for marriage in conservative areas. Religion, family values, and blah-blah-blah. However, a 7-percentage-point difference doesn’t seem that significant.

If we conjecture that the continuing economic inequality between men and women (where women earn two-thirds of what a man earns) means that the same share of women as men are unhappy in their marriage, but only two-thirds of them are able to enter the infidelity market-place, then 21 percent of women are also unhappy in their marriages. If 21 percent of husbands have and 21 percent of wives either have or would like to trespass beyond the bounds of Holy Deadlock, then 21 percent of Americans are in unhappy marriages.[5] If you average Republican and Democratic “very happy in marriage” rates, you end up with 63 percent. If 63 percent of Americans are “very happy” and 21 percent are very unhappy, then 16 percent are in the middle. (Or they “don’t know” if they are very happy or unhappy. Probably in the first couple years of marriage when such disorientation is common, what with discussions of thread-count versus Sawzalls, how to allocate time between families during the holidays, and whether tuna noodle casserole is better with or without crushed potato chips.)
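The chain of conjectures above can be written out as a quick sketch. Note that the two-thirds “participation” rate is the author’s assumption, not survey data:

```python
wives_admitting = 0.147      # share of wives admitting infidelity
participation_rate = 2 / 3   # assumption: women enter the "infidelity market-place" at 2/3 the male rate

# Implied share of wives who are unhappy, if only two-thirds act on it
wives_unhappy = wives_admitting / participation_rate   # ~0.22, close to the 21 percent of husbands

# Averaging the Republican (67%) and Democratic (60%) "very happy" rates
very_happy = (0.67 + 0.60) / 2    # 0.635, rounded to 63 percent in the text

# Whatever is left over is the muddled middle
middle = 1.0 - 0.63 - 0.21        # ~0.16

print(f"{wives_unhappy:.2f} {very_happy:.2f} {middle:.2f}")
```

The sketch mostly confirms the paragraph’s arithmetic, though the implied 22 percent for wives is a shade above the 21 percent the text settles on.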

So, broadly speaking, either you get marriage right or you mess it up.[6] There isn’t much of a middle ground. Generally, almost two-thirds of people get it right and one-fifth gets it badly wrong. Some of those go on to get it right the second time. How does this match up against law school admission or the stock market or going to the dog track? I dunno. Be worth finding out.

[1] This was long before the whole “Ashley Madison” thing. See: https://en.wikipedia.org/wiki/Ashley_Madison You don’t get points for boasting in an anonymous survey, so, either all respondents were being honest in return for a promise of anonymity or a bunch of people still decided that telling your secrets is a dumb idea. If anything, then, the number of unhappy marriages can only have been equal to or higher than the number reported.

[2] “Noted,” The Week, 19 July 2013, p. 14.

[3] Kind of like poor Al Gore’s people going “We would have won Florida if convicted felons had the right to vote,” until somebody told them to shut up.

[4] “Noted,” The Week, 28 August 2015, p. 14.

[5] Obviously, gay men and lesbians off-set in this calculation.

[6] Thus, marriage is “protopathic” (all or nothing), rather than “epicritic” (recognizing fine distinctions). See: Pat Barker, Regeneration (1991). No, really, go read it. Easily one of the finest novels of the 20th Century. The movie is not as good, in my judgment.

Climate of Fear XVIII.

Environmental record-keeping is really pretty new. It’s a function of the rise of both Science and the State during the 19th Century. In the case of California, systematic record-keeping only began in 1895. Beyond the formal records, policy-makers are forced to rely upon scientific studies and interpretations. Some geological studies indicate that “megadroughts” lasting a decade or more occur in the Western United States every 400 to 600 years.[1] We may be at the beginning of such a megadrought. It may last thirty years or more.

Since 2000, the whole of the West has been suffering from drought. This has created many different kinds of problems from crop failures to massive wildfires.[2] California offers a study of a particularly acute case. During the winters of 2012-2013 and 2013-2014, a large high pressure area prevented winter storms in the Pacific from blowing in-land to the great Sierra Nevada mountains along California’s eastern edge.[3] Three-quarters of California’s water comes from the snowfall in the Sierra Nevada. Come Spring each year, the snow melts and runs down streams and rivers into the rest of the state. Much of the water also seeps down into the aquifer of the great Central Valley. The high-pressure area cut rainfall in California to 42 to 75 percent of normal.[4] In addition, the absence of the cooling effect of on-shore breezes and storms helped bake California. That evaporated much of the water that did reach the ground.

By September 2014, 82 percent of California had been designated as either in an “extreme” or an “exceptional” drought. How to respond? Well, California is both a cluster of major cities and suburbs and a major agricultural state: it produces about 70 percent of the top 25 fruits, nuts, and vegetables. So, 80 percent of California’s water is used for irrigation of its farmers’ crops. To cut water to farmers is to cut the legs off a major industry. Instead of hitting agriculture, governments tried to limit non-agricultural water use. To begin with, the California Water Resources Control Board began fining people who watered their lawns or washed their cars without using a water-saving nozzle on the hose; Los Angeles—harking back to the oil shock of 1973—limited people to watering their yards on alternate days. That didn’t have much effect. Huge numbers of urban consumers pushed back against such restrictions.

Faced with limits on taking water from rivers, farmers turned to drilling into the aquifer. That’s a short-term—and destructive—response. There is a limited amount of water in the aquifer. A “water rush” equivalent to the Oklahoma “land rush” will privilege those with the most money for drilling operations and force smaller farmers to the wall.

In 2014 the state legislature passed a bill to regulate the use of groundwater (i.e. the drilling). This enraged farmers, who saw the groundwater as their own property. The basic question is whether a lot of people who use relatively little water (at most 20 percent of the total) should suffer hardships for the sake of relatively few people who use relatively a lot of water in order to produce valuable products. What if it was a question of electricity use, where urban areas consume far more than do rural areas? These questions aren’t just about California. They go to how we think about the environment and the economy in general.

[1] This cuts across the argument of supporters of anthropogenic climate change, without invalidating their arguments.

[2] Disclaimer: my son is a National Forest Service wildlands fire fighter. Actually, it isn’t a disclaimer. I just want everyone to know that I’m proud of my boy for doing a hard, demanding, and dangerous job when most kids want careers with Wall Street or Disney World or the US Gummint.

[3] “California’s epic drought,” The Week, 26 September 2014, p. 9.

[4] Apparently, no one attributes the high-pressure ridge to global warming. Inevitably, people make do with claims that “global warming” is intensifying the effects of the drought. Arguably, this is what climate-change denial on one side elicits from the other side: potential overstatement.

Query.

Why does my post called “White flight from Baltimore” draw so many hits/visitors?  Is it circulating on some kind of subterranean network?  No one comments.  No one “likes.”  But it keeps popping up on my list of views, right after “Archives.”  So, I’m puzzled.