Gun culture.

My reader(s) is (are) probably fed up with recent posts on gun-related issues. Perhaps I should explain.

My father was a Washington state country boy who had hunted for much of his early life and who had been in the Army during the Second World War. I grew up in a house full of guns, learned to shoot (badly) at a young age, received thorough training in gun safety from a highly-regarded authority figure (I'd still be paying for the dental work if I had ever "played" with a gun), and got my first gun as a birthday present when I turned 12. Guns hold no terrors for me. (Gun-owners, well that's a different issue.)

I had guns myself (shotgun, rifle, revolver) in my house for many years. I taught my sons about gun safety along the same lines as my father taught me. "Guns are always loaded until you know different; you can tell if a gun is loaded by checking the breech; never point a gun at another person except to save a life." (These rules caused much alarm among other parents during a gun safety lesson in Cub Scouts.) I got rid of my guns when one of my children was diagnosed with depression. A hard choice to make at the time, but I haven't regretted it for a moment.

Recently, I was asked to give the annual Constitution Day address at the school where I am employed.  With some trepidation, I told my boss that I would like to talk about the Second Amendment.  Recent events have made it a “hot topic.”  More than that, I wanted to try to sort out some of the confusions in my own mind about guns in America today.  The stuff I have posted on guns and killings is a part of the reading and thinking that I have been doing.  Hence, all the tedious posts.

The fundamental issues–as I understand them–are the following: The Supreme Court has held that the right to keep and bear arms is an individual right guaranteed by the Bill of Rights; guns are a plague on at least a segment of the American population; a vast majority of gun-owners are going to be asked to sacrifice a "right" because a small share of Americans abuse that right; what are we to do?


Assault weapons.

In America, a lot of people own guns, but most people don't own any. Therefore, most people get confused by the terminology bandied about in public discourse. Government estimates are that Americans own 310 million guns: 196 million "long guns" (110 million rifles; 86 million shotguns) and 114 million hand guns (pistols). Perhaps 4 million of the "long guns" are what might be called "assault weapons."[2] A semi-automatic weapon fires one round each time the trigger is pulled. Semi-automatic weapons are fully legal, whether pistols, rifles, or shotguns. In contrast, an automatic weapon fires continuously as long as the trigger is depressed. So, an automatic weapon is a machine gun. These have been tightly restricted since 1934.

There is nothing like war to encourage innovation. One of the weapons that made the First World War so appalling was the machine gun, a heavy weapon served by a crew of three or four. Toward the end of the war, weapons designers invented one-man-portable machine guns: the Thompson sub-machine gun and the Bergmann machine pistol. Other countries soon followed. Toward the end of the Second World War, the German weapons designer Hugo Schmeisser (yes, that one) produced the "Sturmgewehr" ("storm rifle"). The Russkies soon adapted this into the AK-47.[3] The US countered with the M-16. Both weapons are "selective fire": they can fire on either automatic or semi-automatic.

There is a semi-automatic version of the M-16 that is known under the generic label of the AR-15. The civilian versions of these weapons still fire at a high rate (up to 50 rounds per minute) and they have little recoil. The latter facilitates a different kind of "gun control" than what liberals have in mind.

Sales of semi-automatic, civilian versions of “assault rifles” have been booming. One type rose from 4,600 sold in 2006 to 100,000 sold in 2010. Part of this reflects a deep distrust of the federal government.[4] “The weapons that would be most suited to overthrow a dictatorial federal government would, of course, be weapons of war, and not sports equipment.”[5]

Homicides rarely involve "assault weapons." In 2011, there were 323 murders committed with any kind of rifle, but there were 6,220 committed with hand-guns. "Assault weapons" were used in less than half of the "mass shootings" of the last thirty years. On the other hand, some of the most eye-catching mass killings involved "assault weapons": the killers in both the movie theater massacre in Aurora, Colorado, and the Sandy Hook School massacre in Newtown, Connecticut, used civilian versions of "assault weapons." One estimate suggests that banning assault weapons[6] would reduce the death toll from shootings by as much as 100 victims per year. That isn't much in comparison to the 11,000 gun homicides a year in the United States. Unless you're one of the dead or the bereaved.

On one level, the question is how did James Holmes (Aurora, CO), or Adam Lanza (Newtown, CT), or Jared Loughner (Tucson, AZ) get a gun in the first place? On another level, the question is why are people obsessed with 4 million weapons that caused 300-odd deaths?

The real issue is hand-guns. Who owns them? Why? Would regulation work?

[1] If you just “don’t like guns,” then my tedious explanation is not for you. I understand your emotions, but do not share them.

[2] “The assault weapon,” The Week, 15 February 2013, p. 11.

[3] See: “The Gun That Made the Nineties Roar.”

[4] Which criminalized Japanese ancestry in 1942.

[5] David Kopel, Cato Institute, quoted in “The assault weapon,” The Week, 15 February 2013, p. 11.

[6] As Australia did after one terrible massacre in 1996.

Mass shootings.

What is a "mass shooting"? Answers differ: at least four people shot dead in a public place; or at least four people shot dead anywhere; or at least four people shot (wounded or killed) anywhere.

In liberal discourse, the US leads the world in mass shootings. By one count, 31 percent of all mass shootings occur in the United States.[1] Proponents of this view are quick to slide in the "developed country" qualifier because, without that qualifier, the claim doesn't hold. Why not drive into a Mexican border town to check it out? Still, saying "well it's worse in Guatemala" doesn't help.

The most expansive totals for “mass shootings” appear to be arrived at by rolling in all the shootings associated with a sub-culture of violence among poor people. Drive-by shootings get counted just like massacres in fast-food restaurants. “Would you like death with that?”

One trope, less noticed and less publicized than others, holds that mass killings have a copy-cat element to them. Mass media attention devoted to one killer then helps put the idea into the pointy little heads of others. So, one solution would be to regulate the press to reduce the “if it bleeds, it leads” mentality.[2] This would involve curtailing the First Amendment.

Another trope, much more widely noticed, holds that these appalling crimes arise from American "gun culture." Widespread gun-ownership and feeble limits on access to guns by evil-doers lead to slaughter. Leaving aside the people who beef with someone at an after-hours party in a rotting former greenhouse on a Saturday night, who are the shooters? Almost all are men; almost two thirds (64 percent) are white.[3] Working backward after mass shootings, scholars have found some earlier sign of "mental illness" in about half of the killers. The trouble is that this runs the gamut from depression to paranoia to full-blown schizophrenia. Moreover, "there is no one diagnosis that's linked to mass shootings." Many different diagnoses have been offered. Hence, "We can't go out and lock up all the socially awkward young men in the world."[4] Of course not: they often become college professors. (I can hear the gears turning in Lynne Cheney's head already.) Furthermore, millions of men suffer from some kind of mental illness without ever becoming violent. In our current state of knowledge, it is impossible to predict who will be a killer (perhaps 20 a year) and who will not (millions).[5] Like convicted felons, people who have been involuntarily committed to a mental health facility are barred from purchasing guns. However, less than a quarter of the killers in mass shootings (23 percent) had been treated for a prior mental illness.

So, if you can’t invade First Amendment freedoms because the right of businesses to sell faulty products is sacred, and you can’t predict which mentally ill person will turn into a mass killer, and you don’t like a high murder rate, and the regulation of sales of guns is flawed by human error, then the only logical solution to the problem would be to disarm Americans in general. This is where a lot of the push-back originates.

About 100 people a year are killed in mass shootings, out of 11,000-12,000 murder victims. That is both a drop in the bucket and a sign of the malign influence of the media.

[1] "The killing contagion," The Week, 11 September 2015, p. 11.

[2] Hillary Clinton has recently endorsed proposals to try to deter the “short term” obsession of the stock market traders, so it isn’t much of a jump to deterring the obsessions of reporters and advertising managers.

[3] Compared with African-Americans (16%) and Asians (9%). The white share of mass shooters roughly matches the white share of the over-all population, while the African-American share somewhat exceeds that group's share of the over-all population (12.2%) and the Asian share is dramatically higher than that group's share (4.7%).

[4] Jeffrey Swanson, Duke University, quoted in “The killing contagion,” The Week, 11 September 2015, p. 11.

[5] See: "Minority Report" (dir. Steven Spielberg, 2002).

Inequality 7.

According to the CIA, income inequality in the United States now is more extreme than in Red China.[1] So what? What matters is that a “rising tide lifts all boats,” as JFK said when arguing for a tax cut. However, some economists argue that the evidence for this “true that” statement is sketchy (as young people used to say). President Clinton got Congress to raise the top tax rate from 31 percent to 39.6 percent and the economy boomed (admittedly with the “Tech Bubble” that collapsed after he left office); President George W. Bush got Congress to cut taxes on high earners to 35 percent, but the economy floundered (admittedly with the “Housing Bubble” that collapsed before he left office). In this analysis, what really matters is the amount of demand for goods in the economy. That is an argument for shifting resources to consumers.

The "Great Bull Market" of the Twenties (and other stuff that pundits don't want to know about) led to the Great Depression. The Great Depression led to the New Deal and 20 years of Democratic dominance in Congress. The Depression discredited businessmen as prophets of the New Era. The New Deal imposed all sorts of restrictions on business and raised taxes on the rich swells (who were in some vague way blamed for the Depression). By the 1950s the top rate on marginal incomes had risen to 91 percent, essentially a confiscatory tax on high incomes. Proponents of relative income equality point to this period as an ideal because it coincided with a period of American economic ease. Good-paying working-class jobs allowed many people with only a high-school diploma to enter some version of the middle class.

However, the Great Depression ended in 1940. By the 1970s a whole new generation of businessmen had come on the scene. They were unburdened by the sins of their elders. They campaigned for a reduction in the punitive tax rates of the New Deal era. One can see this as Republicans responding to the Democratic strategy of "tax, spend, elect" with their own mantra of "tax-cut, spend, elect." In theory, savings create investment capital and investment capital creates jobs. Therefore, the top tax rate on incomes fell to 70 percent by the 1970s, then to 50 percent in the first Reagan administration, and then to 28 percent in the second Reagan administration. Bill Clinton pushed for and won a reduction in the tax on capital gains from 28 percent to 20 percent. George W. Bush pushed for and won a reduction in the tax on capital gains from 20 percent to 15 percent. However, President Bush also pushed for massive cuts in taxes paid by lower income groups.

Two things resulted from the Bush tax cuts. First, the US government lost $400 billion a year in revenue. Of this lost revenue, “only” $87 billion came from people earning $250,000 a year or more. The other $313 billion came from people earning less than $250,000 a year.[2]

Second, taxation became much more progressive. While cutting taxes overall, Bush shifted the burden of taxation onto upper income earners. After the Bush tax cuts, the top 1 percent of income-earners now pays 40 percent of the income tax bill (and 21 percent of all taxes), while 47 percent of Americans pay no income tax at all.[3] Despite his bitter condemnation of the Bush Administration on many scores, President Obama fought hard to preserve 98 percent of the cuts.

There are three observations worth making. One is that there are big long-term trends or swings in tax policy. The huge deficits looming as the “Baby Boom” ages may herald an end to low taxation for everyone.

A second is that President Obama has loudly condemned the plutocrats “who tanked the economy” in the financial crisis. How did Bill Gates or Steve Jobs or Warren Buffett or the idiots who ran American car companies “tank the economy”? They didn’t. In fact, only about 14 percent of the richest Americans work in finance. Yet Gates, Jobs, Buffett and a lot of other ordinary, successful entrepreneurs were hammered by the Obama tax increases.[4] They have also been subject to his frequent dispensation of moral opprobrium.[5]

A third is that the Democrats need to define what they mean by “fair.” As in, “the rich should pay their fair share.” The rich are already carrying a disproportionate share of the fiscal weight while almost half of Americans pay nothing at all for the programs that benefit them. As Woody Guthrie might have said (had he been an entirely different person), “A poor man with a ballot-box can rob you just as easily as can a rich man with a pen.”

[1] “Taxing the rich,” The Week, 4 November 2011, p. 11.

[2] Can you impeach a former President?

[3] If “taxation without representation is tyranny,” then what is representation without taxation?

[4] Perhaps it is worth pointing out that of the "one percent," about 16 percent are in medicine and about 12 percent are lawyers, while about 50 percent of the members of the House of Representatives and the Senate belong to the "one percent."

[5] See: “Stuff my president says.”

The Clin-tons. See: Theme for “The Simpsons.”

After leaving the White House in 2001, Bill Clinton found himself at loose ends. He didn't have a ranch with brush to clear, so he started a little foundation to help children in Harlem. In 2002 he added an effort to raise money to lower the cost of AIDS drugs in Africa. In 2005 he launched the Clinton Global Initiative: an annual meeting of the smart, rich, and "concerned." This mini-Davos still runs, providing an opportunity for powerful people from many domains to hob-nob. However, the Clinton Foundation soon found itself awash in donations ($2 billion and counting) from big business and foreign governments. In addition, Bill Clinton found himself much in demand as a speaker: he has earned $26 million in fees.[1] It is, or should be, hard for any American to carp about this tale of a poor country boy who made good.

One fly in the ointment is that examination of the tax records of the Clinton Foundation for 2011-2013 shows that only 10 percent of the donations it received went to actual charitable projects. The rest went to administrative expenses.[2] Those administrative expenses include a staff of 2,000 that is packed with Clinton loyalists.

A second fly in the ointment is that Hillary Clinton launched her own political career at the same moment that Bill Clinton launched his profitable post-presidency. She won election to the Senate, ran for the Democratic nomination for President in a year when the Democrats actually did win the White House, served for four years as Secretary of State as a consolation prize from Barack Obama, and is now the front-runner for the Democratic nomination for President. The millions of dollars pouring into their joint account began to look very much like a slush fund and like influence-peddling.

In the second term of the George W. Bush Administration, the US sold about $85 billion in weapons to twenty State Department-approved countries. In the first term of the Obama Administration, the US sold about $165 billion in weapons to twenty State Department-approved countries. Those twenty countries had made millions of dollars in donations to the Clinton Foundation.[3] For example, the government of Algeria donated $500,000 to the foundation, then received State Department approval for a 70 percent increase in authorized military purchases from the United States. That looks bad, to my eye, but it gets worse. The Obama administration had extracted a promise from the Clintons that all foreign donations to the foundation would be fully reported. Somehow, the foundation forgot to report this one and others as well.

When the Hillary Clinton e-mail "scandal" first broke, 44 percent of Republicans thought it was a "very serious problem," while 17 percent of Democrats thought it was a "very serious problem." After a week of both parties spinning the issue for all it was worth, the divergence had increased. By late March 2015, 68 percent of Republicans thought that it was a "very serious problem," while 8 percent of Democrats did.[4] That "scandal" centers on Hillary Clinton's use of a potentially insecure private e-mail server located in the Clinton family home in New York. Under pressure, she turned over 30,000 e-mail messages that bore on State Department business. Some Republican inquisitors may hope to find a smoking gun with regard to Benghazi. However, the real issue may be in the many other "personal" messages that she deleted. Worming around in the minds of many people is the suspicion that "If it walks like a duck and it quacks like a duck, then it's a duck."

[1] “The Clintons’ controversial foundation,” The Week, 3 July 2015, p. 11.

[2] “Clinton Foundation: Is it a true charity?” The Week, 15 May 2015, p. 16.

[3] “Noted,” The Week, 12 June 2015, p. 16.

[4] “Poll Watch,” The Week, 27 March 2015, p. 17.

Wahhabism?

Back in the many-days-ago, immediately following the death of the Prophet Muhammad, Muslims divided over the question of who should lead the "Umma" (the Faithful). Should it be some prominent person who enjoyed wide deference among Arabs or should it be a blood relative? The prominent (and rich) men who argued that one of them should lead tended to be "late adopters" of Islam. This opened them to the suspicion that they were what the Nazis would call "March violets"—opportunists who joined the movement once it came to power. The men who thought that a blood relative should lead tended to be, well, blood relatives, but also essentially lower-ranking figures committed to tribal loyalties. Islam divided between those who supported an eminent figure (Sunnis, the vast majority) and those who favored a blood relative (Shi'ites, a minority overall, but the clear majority in Iran and Iraq). The two sects of Islam did battle for hundreds of years. Today, the Islamic Republic of Iran espouses the cause of the Shi'ites, while the Kingdom of Saudi Arabia espouses the cause of the Sunnis.

For many years, the United States fostered warm relations with Iran. Then came the Iranian Islamist Revolution of 1979. The Americans shifted their support to Sunni rulers, like the kings of Saudi Arabia, but also to more “secular” Arab leaders like Saddam Hussein.[1] This makes it sound like the US is backing “moderate” Islam against “radical” Islam. Nothing could be further from the truth. The Saudis have their own brand of religious radicalism, Wahhabism.

Wahhabism began in the 18th Century as a puritanical sect of Sunni Islam. The founder, Sheik Abdul-Wahhab, forged an alliance with the leader of the Saud family, an alliance sealed by the marriage of their children. Almost two centuries later, the Saud family completed the conquest of Arabia. Later still, the kingdom became a major oil exporter. The oil wealth led to a loosening of the strict moral standards that had run in parallel with the rise of the Sauds. In 1979, Wahhabist enthusiasts administered a very public rebuke to the nation's leadership by seizing the Great Mosque in Mecca. Taking the message to heart, the Saudi leadership changed course. Saudi Arabia has long tried to spread Wahhabism while checking the spread of Shi'ite doctrines.[2] Saudi money pays for mosques, schools, and cultural centers abroad.

In failed or failing states like Pakistan and Afghanistan during the war against the Soviets, Saudi-funded religious schools ("madrasas") offered the only schooling available to children in border regions and refugee camps. The Wahhabist doctrines spread to many boys who would later take up arms as part of the Taliban. The schools continue to teach students drawn from Muslim populations in Indonesia and Malaysia.

In exchange for this largesse for the cause, Wahhabist militants operate only outside Saudi Arabia. The “Arab Afghans” who went to fight the Soviet Union were Wahhabists. Others went to fight in Bosnia or in Chechnya. Most of the 9/11 plane hijackers were Wahhabists.[3] The Nigerian group Boko Haram grew out of Saudi-funded efforts to counter the spread of Sufism in the Sahel. ISIS can be seen as an extension of Wahhabism. Certainly, the Saudis have shown no interest in fighting it in Syria and Iraq, even as their planes pound Shi’ites in Yemen.

In short, victory over Iranian-backed Islamism might just reveal a greater danger still. Little in either the media or government pronouncements is preparing Americans for that shock.

[1] Clients of Iran had a hand in bombing the Marine barracks in Beirut, so it isn’t like this was done at the whim of the oil companies. Regardless of the last sermon in the New York Times.

[2] “Exporting radical Islam,” The Week, 14 August 2015, p. 11.

[3] A portion of the 9/11 Commission’s report that deals with Saudi involvement remains classified.

Vacation dream spot.

Back in April 2008 a New York Times writer sang the praises of an as-yet under-touristed destination. There one could find an "ancient way of life that is still largely intact." It was but the latest of the-next-place-to-be-discovered.[1] Contemporary society—or some sub-set of it—places a premium on rare and new experiences. Probably they are a form of status possession. Globalization in all its forms (standardization of products world-wide; cheap jet fares; the idea of taking a gap year or sabbatical at some point in your life; wealthy leisure-based societies) has created a huge market for experiences that once were the realm of misfits.[2] Now college graduates with Business degrees fight forest fires and work at ski resorts; academically-inclined college students seek berths on merchant ships; future school teachers spend a few years trying to surf all the best breaks in the Pacific; and bed-and-breakfast inn-keepers in New England spend the off-season buying textiles in Bali.[3] What are they after? Something different from the Burberry-Ralph Lauren-Tommy Hilfiger knock-off possessions that jam the stores? Some contact with challenging and "authentic" experience? Hence the search for new places.

Where was this wonderland? Yemen.[4] There, "every prospect pleases"[5]: remarkable traditional architecture unsullied by the golden arches of McDonald's and a combination of mountain and desert. In the Old City section of the capital, visitors walk back into the Middle Ages in a way that is not true of the hordes trudging around Notre Dame in Paris. There are street markets that look and smell (of khat and persimmons) much as they must have when Mohammed was contemplating a career change. Striking out from the capital, visitors could explore the mountain-top village of Al Hajjara,[6] a sort of cactus-strewn Muslim Orvieto, not much changed from the time of its original construction in the 11th Century. Then there is the Wadi Hadhramaut, an Arabian valley in which things will actually grow. Frankincense first of all, but also senna and coconut.

Well, understandably, things have deteriorated since that description of actual adventure tourism.[7] “Only man is vile.”[8] Even in 2008 the US Department of State issued scary “travel advisories” for those thinking of a trip to Yemen. Now the country is home to a lot of al Qaeda people, there’s a savage civil war going on, and Saudi Arabia and Iran are using it as a proxy battlefield in the same way that Nazi Germany and Stalinist Russia used Spain in the late Thirties.

That doesn't mean that things will stay this bad forever. Yahya Muhammad Hamid ed-Din (or Imam Yahya) (1869-1948) ruled the country after the First World War. He reined in, if he could not entirely put a stop to, the endemic feuds and banditry. So, perhaps one day trekkers will return to Al Hajjara and the Hadhramaut valley.

[1] See: Alex Garland, The Beach (1996); William Gibson, Pattern Recognition (2003); and David Simon, producer, “The Wire” (2002-2008) for various observations on modern society’s relentless drive to “step on the package.” Anyway, that’s how I read them.

[2] See, for one example: https://en.wikipedia.org/wiki/Henry_de_Monfreid

[3] Just to list some people I know.

[4] “This week’s dream: Yemen’s secret world.” The Week, 4 April 2008, p. 30.

[5] Reginald Heber, "From Greenland's Icy Mountains" (1819). http://www.hymnsandcarolsofchristmas.com/Hymns_and_Carols/from_greenlands_icy_mountains.htm

[6] Lonely Planet used to publish a guide-book to Yemen. It noted that Al Hajjara served as the jumping-off point for people hiking into the wilderness. I wonder if Anwar al-Awlaki had a copy?

[7] As opposed to merely working up a sweat being led around places by NOLS teams or having a five-star dinner in the open on a dude ranch.

[8] Heber again.