Ammo 2.

Back in 2007, at the height of the wars in Iraq and Afghanistan, American soldiers were firing a billion rounds a year. That’s a lot of bullets, even by my standards. About 1,500 Iraqis were killed by US and coalition forces in 2007,[1] and about 4,500 enemy fighters were killed in Afghanistan that year.[2] Taken together, then, the two wars come to about 6,000 enemy combatants killed in 2007. In theory, that means American soldiers fired a billion rounds to kill 6,000 enemies. That makes it sound like they’re just spraying around on full-auto at the first sign of trouble. Except that it isn’t true. American soldiers and Marines fire a lot, probably most, of their rounds in training. Still, that leaves the question of how many rounds American soldiers did fire in combat. I haven’t figured out how to track that yet. It is worth doing, because it is one way of measuring what the experience of Afghans and Iraqis with American soldiers may have been. Do they just shoot up any place that gives them guff, or are they obviously discriminating in their use of force? This has implications for our relationships with these people going forward.
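For what it’s worth, the implied ratio is easy to check. A quick sketch, taking the billion-round figure at face value (training rounds included, which is exactly why the number misleads):

```python
# Rough check of the implied rounds-per-kill ratio from the figures above.
# Note: the billion rounds include training, so this ratio overstates
# combat expenditure -- that is the point of the paragraph.
rounds_fired = 1_000_000_000
enemy_killed = 1_500 + 4_500   # Iraq + Afghanistan, 2007

rounds_per_kill = rounds_fired / enemy_killed
print(f"{rounds_per_kill:,.0f} rounds per enemy combatant killed")
# 166,667 rounds per enemy combatant killed
```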

Then, bullets are a commodity just like, say, eggs. At any given moment, production is limited to some level. When demand goes up, prices rise until production expands. The federal government can always run a deficit and just print the money it needs. In contrast, state and local governments are required to live within their means. As a result, the federal government came to dominate the bullet market at the expense of both hunters and police departments. I don’t know what hunters did. Maybe there are a lot more deer and bear wandering around as a result of our wars. Faced with a shortage of bullets, however, police departments responded by reducing the amount of live-fire target practice and training.[3] Apparently, this began in 2007 at the latest. How long did the training reduction continue? For that matter, is it still in effect? Administrative systems develop a certain momentum that can be difficult to change. The point here is to ask whether that training reduction is in any way connected to the recent high-profile cases of police officers shooting unarmed people. Or perhaps this is just an example of “apophenia” (seeing patterns where none actually exist).[4]

Overlapping this ammunition shortage was another, associated with events of the first Obama administration. Many gun-owners were deeply suspicious of the new president on the matter of the Second Amendment.[5] This led to the buying of guns and ammunition, just as my father-in-law’s own father bought several casks of brandy as Prohibition approached. In December 2012, the massacre at Sandy Hook school led to calls[6] for much tighter regulation of guns. Lots of people bought ammunition (and probably receivers). For example, the FBI reported 2.8 million background checks in December 2012, most coming after the Sandy Hook shootings. The price of .22-cal. Long Rifle went from 5 cents a round to 12 cents a round.[7]

Little things can be made to tell you a lot. Or, at least, raise questions.

[1] See:


[3] “Noted,” The Week, 7 September 2007, p. 20.

[4] See: William Gibson, Pattern Recognition (2003). Amazing book. My students hate it.

[5] His derisive comments about people “holding on to their God and their guns” didn’t win him any friends among gun-owners. See: “Stuff My President Says.”

[6] Including my own e-mailed appeal to one of my two idiot Senators.

[7] This is the ammunition fired by the very popular Ruger 10/22 semi-automatic rifle. Really sweet piece of work.


In mid-July 2013, 56 percent of Americans favored the Supreme Court’s decision to strike down the Defense of Marriage Act (DOMA), which had defined marriage as the union of a man and a woman for federal purposes. Still, a large minority (41 percent) opposed the Court’s decision.[1]

In summer 2015, with gay marriage equality close to a done deal, transgenderism emerged as a hot topic. In June 2015, 45 percent of Americans regarded transgenderism as a moral issue, 39 percent regarded it as not a moral issue, and 16 percent didn’t know. That 45 percent, in turn, split into 14 percent of all respondents who regarded it as morally acceptable and 31 percent who regarded it as morally wrong.[2]

Do people understand the difference between transgenderism and transvestitism? Are they assuming (incorrectly) that all or most transvestites are homosexual? Are they using this as a proxy for their feelings about homosexuality?

According to one recent poll, 54 percent of Americans aged 18 to 29 think that transgender people should be allowed to use the public restroom that corresponds to their gender identification. In contrast, only 31 percent of Americans aged 45 and older agree.[3]

In one sense, this is the standard story of progress across the generations. The older generation is less comfortable acknowledging change that seems more normal to younger generations. As it was with race and gender, so now it is with both sexual orientation and gender identification.

Still, there is something bizarre about this poll. There were almost 320 million Americans in 2014. There were maybe 700,000 people who identify as transgender. Most of them are in the closet. Unless you live in a major city, your chance of encountering a transgender person in a public restroom is virtually nil. Then, the majority of them are male-to-female identifiers.[4] So, they are guys dressed as women who have a reasonable belief that they can “pass.” (Otherwise they wouldn’t run the risk of going out “en femme.”) Women’s restrooms are all stalls with doors. (Why do you think the lines outside women’s restrooms are so long? Are the lines really for the mirrors? Aside from the fact that the idiot architects put the same number of toilets in both restrooms, but then put in a lot of urinals in men’s restrooms.) So how is any woman going to identify the person sitting in the stall next to them in women’s shoes as really a guy?

So, I conjecture that most of the people—older and younger—who object to transgender people using the public restroom of their choice are probably guys who are worried that some good-looking babe will walk in to the men’s room, stand at the next urinal, and lift the hem of her mini-skirt.

[1] “Poll Watch,” The Week, 19 July 2013, p. 15.

[2] “Poll Watch,” The Week, 19 June 2015, p. 15.

[3] “Poll Watch,” The Week, 13 March 2015, p. 17.


Love and Marriage.

Twenty years ago, about 21 percent of married men and 7.5 percent of married women would admit to having had an extra-marital relationship.[1] Today, the rate for men has stayed the same, but the percentage of married women admitting to having had an extra-marital relationship has climbed to 14.7 percent.[2] I suppose that counts as some kind of victory of feminism.[3] At the same time, a slightly larger share of Republicans (67 percent) than Democrats (60 percent) report being “very happy” in their marriage.[4]

Scholars have commented on both issues. On the one hand, some suppose that the growing equality of women in the work-place has made married women more financially independent and less likely to fear the consequences of discovery. That is, getting tossed out on their ear, and losing their children, their late-model used car, and their Kohl’s charge card. On the other hand, some scholars have suggested that there is more social support for marriage in conservative areas. Religion, family values, and blah-blah-blah. However, a 7-percentage-point difference doesn’t seem that significant.

If we conjecture that the continuing economic inequality between men and women (where women earn two-thirds of what a man earns) means that the same share of women as men are unhappy in their marriage, but only two-thirds of them are able to enter the infidelity market-place, then the 14.7 percent of women who stray implies that about 21 percent of women are also unhappy in their marriages. If 21 percent of husbands have, and 21 percent of wives either have or would like to, trespass beyond the bounds of Holy Deadlock, then 21 percent of Americans are in unhappy marriages.[5] If you average the Republican and Democratic “very happy in marriage” rates, you end up with 63.5 percent. If 63.5 percent of Americans are “very happy” and 21 percent are very unhappy, then about 15.5 percent are in the middle. (Or they “don’t know” if they are very happy or unhappy. Probably in the first couple of years of marriage, when such disorientation is common, what with discussions of thread-count versus Sawzalls, how to allocate time between families during the holidays, and whether tuna noodle casserole is better with or without crushed potato chips.)
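A minimal sketch of that back-of-envelope arithmetic, for anyone who wants to poke at it. The two-thirds earnings ratio and the equal-unhappiness conjecture are assumptions from the paragraph above, not established facts:

```python
# Back-of-envelope sketch of the marriage arithmetic above.
# Assumptions (the author's, not data): women are as likely as men to be
# unhappy in marriage, but the earnings gap (two-thirds) limits who acts on it.

men_unhappy = 0.21          # share of married men admitting an affair
earnings_ratio = 2 / 3      # women's earnings relative to men's (assumed)

# Implied share of women admitting an affair under those assumptions:
women_admitting = men_unhappy * earnings_ratio
print(round(women_admitting * 100, 1))   # 14.0 -- close to the reported 14.7

# Average the Republican (67%) and Democratic (60%) "very happy" rates,
# then see what share is left in the middle:
very_happy = (0.67 + 0.60) / 2
middle = 1 - very_happy - men_unhappy
print(round(very_happy * 100, 1))        # 63.5
print(round(middle * 100, 1))            # 15.5 -- roughly one in six
```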

So, broadly speaking, either you get marriage right or you mess it up.[6] There isn’t much of a middle ground. Generally, almost two-thirds of people get it right and about one-fifth get it badly wrong. Some of those go on to get it right the second time. How does this match up against law school admission or the stock market or going to the dog track? I dunno. It would be worth finding out.

[1] This was long before the whole “Ashley Madison” thing. See: You don’t get points for boasting in an anonymous survey, so, either all respondents were being honest in return for a promise of anonymity or a bunch of people still decided that telling your secrets is a dumb idea. If anything, then, the number of unhappy marriages can only have been equal to or higher than the number reported.

[2] “Noted,” The Week, 19 July 2013, p. 14.

[3] Kind of like poor Al Gore’s people going “We would have won Florida if convicted felons had the right to vote,” until somebody told them to shut up.

[4] “Noted,” The Week, 28 August 2015, p. 14.

[5] Obviously, gay men and lesbians off-set in this calculation.

[6] Thus, marriage is “protopathic” (all or nothing), rather than “epicritic” (recognizing fine distinctions). See: Pat Barker, Regeneration (1991). No, really, go read it. Easily one of the finest novels of the 20th Century. The movie is not as good, in my judgment.

Climate of Fear XVIII.

Environmental record-keeping is really pretty new. It’s a function of the rise of both Science and the State during the 19th Century. In the case of California, systematic record-keeping only began in 1895. Beyond the formal records, policy-makers are forced to rely upon scientific studies and interpretations. Some geological studies indicate that “megadroughts” lasting a decade or more occur in the Western United States every 400 to 600 years.[1] We may be at the beginning of such a megadrought. It may last thirty years or more.

Since 2000, the whole of the West has been suffering from drought. This has created many different kinds of problems, from crop failures to massive wildfires.[2] California offers a study of a particularly acute case. During the winters of 2012-2013 and 2013-2014, a large high-pressure area prevented winter storms in the Pacific from blowing inland to the great Sierra Nevada mountains along California’s eastern edge.[3] Three-quarters of California’s water comes from the snowfall in the Sierra Nevada. Come spring each year, the snow melts and runs down streams and rivers into the rest of the state. Much of the water also seeps down into the aquifer of the great Central Valley. The high-pressure area cut rainfall in California to 42 to 75 percent of normal.[4] In addition, the absence of the cooling effect of on-shore breezes and storms helped bake California, evaporating much of the water that did reach the ground.

By September 2014, 82 percent of California had been designated as being in either an “extreme” or an “exceptional” drought. How to respond? Well, California is both a cluster of major cities and suburbs and a major agricultural state: it produces about 70 percent of the nation’s top 25 fruits, nuts, and vegetables. So, 80 percent of California’s water is used to irrigate its farmers’ crops. To cut water to farmers is to cut the legs off a major industry. Instead of hitting agriculture, governments tried to limit non-agricultural water use. To begin with, the California Water Resources Control Board began fining people who watered their lawns or washed their cars without using a water-saving nozzle on the hose; Los Angeles—harking back to the oil shock of 1973—limited people to watering their yards on alternate days. That didn’t have much effect. Huge numbers of urban consumers pushed back against such restrictions.

Faced with limits on taking water from rivers, farmers turned to drilling into the aquifer. That’s a short-term—and destructive—response. There is a limited amount of water in the aquifer. A “water rush” equivalent to the Oklahoma “land rush” will privilege those with the most money for drilling operations and force smaller farmers to the wall.

In 2014, the state legislature passed a bill to regulate the use of groundwater (i.e., the drilling). This enraged farmers, who saw the groundwater as their own property. The basic question is whether a lot of people who use relatively little water (at most 20 percent of the total) should suffer hardships for the sake of relatively few people who use a great deal of water in order to produce valuable products. What if it were a question of electricity use, where urban areas consume far more than do rural areas? These questions aren’t just about California. They go to how we think about the environment and the economy in general.

[1] This cuts across the argument of supporters of anthropogenic climate change, without invalidating their arguments.

[2] Disclaimer: my son is a National Forest Service wildlands fire fighter. Actually, it isn’t a disclaimer. I just want everyone to know that I’m proud of my boy for doing a hard, demanding, and dangerous job when most kids want careers with Wall Street or Disney World or the US Gummint.

[3] “California’s epic drought,” The Week, 26 September 2014, p. 9.

[4] Apparently, no one attributes the high-pressure ridge to global warming. Inevitably, people make do with claims that “global warming” is intensifying the effects of the drought. Arguably, this is what climate-change denial on one side elicits from the other side: potential overstatement.


Why does my post called “White flight from Baltimore” draw so many hits/visitors?  Is it circulating on some kind of subterranean network?  No one comments.  No one “likes.”  But it keeps popping up on my list of views, right after “Archives.”  So, I’m puzzled.

The Roosevelts versus Ronald Reagan.

Back at the start of the Twentieth Century, Theodore Roosevelt posited that big business and a foreseeably big labor movement would require a big government to balance their power and solve complex new problems. For a long time, it appeared that “the Republican Roosevelt”[1] had been prescient. The New Deal, launched by his cousin, the Democrat Franklin D. Roosevelt, greatly expanded the government’s role in the economy. That trajectory continued until the election of Ronald Reagan in 1980. Since then, Republicans have inveighed against the expansion of state power (unless national security can be invoked). What do Americans think about this issue in the early Twenty-First Century? A January 2014 opinion poll captured a fundamental division of opinion.[2] A majority (57 percent) agreed with the statement that “we need a strong government to handle today’s complex economic problems.” However, a very substantial minority (41 percent) rejected that idea in favor of letting a free market operate without “the government being involved.” To belabor the obvious, 57 + 41 = 98 percent of Americans. There is no uncertainty in the minds of Americans about this issue, no mushy middle ground on which compromise is possible. Two tribes confront each other. In Europe, on the other hand, there is a broad consensus on the role of government in the economy.

This has important implications for the economically battered ordinary American. In 2010, the median wage was $26,364. After adjusting for inflation, this was the lowest real median wage since 1999.[3] In 2014, American median net worth per adult hit $44,900. Japan, Canada, Australia, and many Western European countries ranked ahead of the United States, which came in 19th.[4] Apparently, if Americans are offered a choice between earning another $20,000 a year and getting another month of vacation, they would take the pay.[5] One could interpret this as Americans being workaholics. One could also interpret it as a sign of the economic stress under which many Americans are operating.

The question is what to do about this pathetic performance. The opposing positions generally pit redistribution through taxation policies (i.e. “strong government”) against pro-growth and social mobility policies (i.e. “let the market operate”).

If you combine federal, state, and local taxes, Americans are among the lowest taxed people in the developed world. Here the US ranks 31st, trailing most of the countries with higher median net worth.[6] Where does American federal spending go? Almost two thirds of it (65 percent) goes to three categories: Social Security (24 percent); Medicare/Medicaid/CHIP (22 percent); and defense (19 percent).[7]

None of this goes to the question of which group is correct. Perhaps neither one is entirely correct. Europeans are laboring under an “austerity” that would never be tolerated in the US. It does suggest that there is a core dispute that is more powerful—and important—than the “culture wars” that obsess the media and Democratic activists. Hence, Bernie Sanders.

[1] As Yale historian John Morton Blum called one of his books.

[2] “Poll Watch,” The Week, 17 January 2014, p. 17.

[3] “Noted,” The Week, 4 November 2011, p. 18.

[4] “The bottom line,” The Week, 20 June 2014, p. 34.

[5] “Poll Watch,” The Week, 24 July 2015, p. 15.

[6] “Noted,” The Week, 25 April 2014, p. 16.

[7] “Noted,” The Week, 25 April 2014, p. 16. It is worth pointing out that most countries don’t spend anything like the share of the budget on defense as does the US. Instead, they rely on the US in an emergency. That frees up a lot of resources for social programs. Then the federal nature of American government means that much spending is done by state and local authorities. Some European countries, in particular, have a more centralized system.

Command Crisis.

When George C. Marshall became Chief of Staff of the United States Army in 1939, he perceived a striking disparity between the officer corps and the grim international situation. The Army had been reduced to a small size after the First World War; America had been at peace for twenty years; promotion had been glacially slow; and the upper ranks of commanders were clogged with elderly men who lacked energy and imagination. America would be endangered, at the very least, by the looming European war, and might well be drawn into the fighting. To revitalize the Army, Marshall ruthlessly purged the officer corps. Six hundred senior officers were removed from command or nudged into retirement before Pearl Harbor. Since the world crisis led to a dramatic expansion of the armed forces, many more than six hundred younger men rapidly rose to high command. (The most dramatic example of this is Dwight Eisenhower, who went from colonel to lieutenant-general in just over a year.) Marshall didn’t demand just energy and imagination. He also demanded ruthless effectiveness. During the Second World War, sixteen division commanders and five corps commanders were relieved of command when they failed to perform up to standard. The rise, fall, and resurrection of George Patton might be offered as a bookend to that of Eisenhower.[1]

Thus, it can be argued that one determined man took advantage of a grave crisis to re-make a hide-bound bureaucratic institution.[2] The failures in Vietnam and in the second Iraq War seem to suggest that something went awry after Marshall and his ruthless followers had faded away. Slowly and in stages[3], the Army reverted to a cautious, self-protective rather than self-critical, bureaucracy. One sign of this change is the reluctance to remove failed commanders. Relief is taken as a sign of institutional failure because it suggests to critics that senior commanders had made a poor choice in the first place. Short command tours reinforce this trend. A duff leader will cycle out in a couple of years anyway, so why rock the boat? Now it takes really egregious personal misconduct, rather than professional incompetence, to bring relief.[4] Will it take another existential crisis to bring new life to the Army?

This analysis strikes a chord with many observers.[5] What it ignores is the malign effect of civilian political meddling and incompetence. Army Chief of Staff Eric Shinseki didn’t underestimate the number of troops needed to occupy a defeated Iraq; Secretary of Defense Donald Rumsfeld did. Tommy Franks didn’t disband the army of Iraq and order the purge of Baath Party members from public institutions; Paul Bremer did. A critical examination of the failings of the military can’t stand alone in the effort to better defend America. We have to be equally honest and critical in examining the political institutions to which the military is subordinate. Nor should the examination be a partisan witch-hunt. President Obama prolonged a war in Afghanistan in which he plainly did not believe. There is a lot of blame to go around.

[1] See: Forrest Pogue, George C. Marshall, 4 vols. (New York: Viking, 1963-1987); Stephen Ambrose, Eisenhower, vol. 1 (New York: Simon and Schuster, 1983); Carlo D’Este, Patton: A Genius for War (New York: HarperCollins, 1995).

[2] Thomas E. Ricks, The Generals: American Military Command From World War II to Today (New York: Penguin, 2012). See also: Anton Myrer, Once An Eagle (1968).

[3] Except for Vietnam, from the end of the Korean War to the first Iraq War, the Army was “at peace.” Even Korea, Vietnam, and Iraq did not amount to existential struggles on a par with the Civil War or the Second World War.

[4] See, for example, Stanley McChrystal’s ill-considered statements in front of a reporter.

[5] See, for example, Max Boot, “Bureaucrats in Uniform,” NYT Book Review, 9 December 2012; Thanassis Cambanis, review of Fred Kaplan, The Insurgents, NYT Book Review, 27 January 2013.