Runnin’ all ’round my brain.

Cocaine prices per gram in selected American cities, 1999 and 2005.

City            1999        2005        Change in base price

Seattle         $80-100     $30-100     -62%
Denver          $100-125    $100-125    0%
Los Angeles     $50-100     $30-100     -40%
Dallas          $90-125     $50-80      -44%
Chicago         $75-100     $75-100     0%
Detroit         $75-100     $50-120     -33%
Atlanta         $100        $80-100     -20%
Miami           $40-60      $20-110     -50%
New York        $21-40      $20-25      -5%

 

There are a bunch of ways of cutting up this data, so to speak.

First, in 1999 cocaine glutted the market in New York, Miami, and Los Angeles. These were major cities with a large over-all market, ports of entry, and centers of a counter-culture. In contrast, it was hard to come by in Atlanta, Denver, Dallas, and Seattle. These were chief cities of “the provinces,” as the Romans would have put it. Six years later Seattle had joined New York, Miami, and Los Angeles among the capital cities of cocaine. This probably has something to do with the explosion of the computer and software industries in Seattle. Maybe writing software allows for blow in a way that designing airplanes for Boeing does not. Still, the “cocaine revolution” hadn’t reached Denver, Atlanta, and Chicago. These remained the cities with the highest-priced (and thus least available) cocaine.

Second, even in two of the original core cities of cocaine consumption, Miami and Los Angeles, prices fell sharply. New York began with the lowest price and pretty much stayed there. Perhaps $20 a gram was the rock-bottom price for cocaine. Lots of people hustling on a big, but limited, market, all of them competing to deliver the best product to the most people at the lowest price. Adam Smith take note. Labor costs driven down to the subsistence minimum. David Ricardo take note.

Third, prices fell while the Drug Enforcement Administration was spending billions of dollars to drive up the price (and thus reduce consumption) through interdiction and eradication. Why didn’t this effort produce better results?

One reason is that cocaine producers in Colombia dispersed their coca-growing operations into more remote areas and spread into Peru and Bolivia as well. These areas lie outside the range of US-sponsored eradication efforts. Production went up, not down.

Another reason is that, since the signing of the North American Free Trade Agreement (NAFTA) in 1994, there has been a huge increase in trans-border truck and vehicle traffic between Mexico and the United States. This made it much easier to move cocaine into the United States. One government policy warred with another government policy. The thing is that people trying to make money won in both cases. What’s more American than that?

Final thing to think about: 88 percent of cocaine moved through Mexico. Eventually, the Mexican intermediaries for the Colombians wanted a better deal. Much violence followed. (See: Narcostate with a State.)

 

Ken Dermota, “The World in Numbers: Snow Fall,” Atlantic, July/August 2007, pp. 24-25.

Opium War.

Opium was a familiar plague in Asia before the 20th Century. Chinese efforts to ban the import of opium from British India led to the Opium Wars, which China lost. For the Chinese, conquering opium thus became associated with recovering sovereignty. When the Chinese Communists won the civil war in 1949, they launched a campaign against drug use and against opium production within China. Chinese producers fled to Laos and Burma (today’s Myanmar). Anti-drug campaigns in other Middle Eastern and Asian countries pushed the heart of production into increasingly remote areas: Burma, Laos, and most of all, Afghanistan. Once the long war against the Soviet Union and its Afghan puppets (1979-1989) wrecked traditional wheat and grape farming, Afghan peasants moved into growing opium poppies.

Since the Iranian Revolution (1979) the government has tried to end drug abuse, drug production, and Iran’s role as a transit corridor for Afghan production. Afghan producers shifted their routes to the successor states created by the collapse of the Soviet Union (1991). The hall-marks of these successor states were poverty, corruption, and badly secured nuclear stockpiles left over from the Soviet Union. For criminals—or for Islamists—conditions were perfect. (There’s a movie in this, if only Hollywood will listen.)

The Taliban, like the Iranian regime, tried hard to suppress the opium trade and opium use in Afghanistan after they came to power. In 2000 the Taliban ordered an end to poppy farming and to the opium trade. Partly, they wanted to end a social evil; partly, they wanted to destroy the financial base of the regional warlords who opposed them. Whatever their motive, opium production came to a near halt. The American invasion in 2001 toppled the Taliban, freed the warlords to resume their traditional pursuits, and caused the Taliban itself to turn to opium dealing as a way of financing its war to return to power. Within a few years of the American invasion, almost 90 percent of the world’s opium again came from Afghanistan. Myanmar and Laos came in distant second and third places.

Afghanistan is hardly the only weak state caught up in the international narcotics trade. In 1998 the North Korean dictator Kim Jong-Il launched his government into the opium trade, producing it on collective farms and transporting the product through North Korea’s embassies. Nigerian drug dealers have set up business in Bangkok to buy Pakistani and Iranian heroin for re-sale everywhere the Nigerian diaspora has settled. (There’s a movie in this, if only Hollywood will listen.) The cocaine cartels fighting against the Colombian government broadened their own product-line to include opium poppies and then heroin.

In the eyes of American officials, putting a stop once again to the opium trade appeared to be essential to building a viable Afghan state by taming both the warlords and the Taliban. A viable state, in turn, formed a prerequisite to an American escape from Afghanistan. In early 2005 the Americans and the Afghan government launched “Plan Afghanistan,” which was modeled on the “Plan Colombia” anti-cocaine campaign begun in 1999.[1] The plan combined assistance to farmers to help them shift to other crops with efforts to eradicate opium poppies and interrupt the movement of opium out of the country. So far, neither “Plan” appears to have made a serious dent in the trade. Drugs give weak states a kind of strength, just not the kind we want.

Matthew Quirk, “The World in Numbers: The New Opium War,” Atlantic, March 2005, pp. 52-53.

[1] This offers an interesting example of analogical thinking as a guide to action. See: Yuen Foong Khong, Analogies at War: Korea, Munich, Dien Bien Phu, and the Vietnam Decisions of 1965 (Princeton UP, 1992); and Richard Neustadt and Ernest May, Thinking in Time: The Uses of History for Decision Makers (Free Press, 1988).

Fries with that?

What do we talk about when we talk about “Americanization”? Are we talking about the spread of the American model through compulsion or seduction? Or are we talking about the Americans getting someplace first when everyone wants to go there? The global spread of obesity offers an example. In the United States the daily per capita consumption of calories has increased by 600 calories since 1980. Correspondingly, the share of overweight adults in the population has increased from 47 percent in 1980 to 64 percent in 2003. Since 1980 Americans have taken an increasing share of their meals from restaurants and take-out food. These meals tend to have about twice as many calories as does the typical meal Mom puts on the table. Nutritionists estimate that this accounts for about two-thirds of the additional weight gained by Americans in the last quarter century. As late as 1991, generalized obesity was narrowly restricted geographically: only Michigan, West Virginia, Mississippi, and Louisiana had 15-20 percent of their populations classified as obese. By 1995, 24 states had 15-20 percent of their adult populations classed as obese. By 2000, 22 states had at least 20 percent of their populations classed as obese, and every other state except Colorado had at least 15 percent of its population classed as obese. “Obesity is often discussed as an American cultural phenomenon, closely intertwined with a taste for fast food, soft drinks, television, and video games.” This is probably what “Americanization” means in the eyes of Frenchmen and Islamist jihadis.

There have always been more underweight people than overweight people in the world. That gap closed over time, however, and in 2000, for the first time, the overweight outnumbered the underweight. Falling food prices for consumers, the shift from rural to city life for many people, the substitution of desk-work for field-work, and the purchase of processed foods are world-wide trends. Obesity has emerged as a social characteristic in developing countries: 15 percent of adult Kenyan women are overweight, compared to 12 percent who are underweight; 26 percent of adult Zimbabwean women are overweight, compared to 5 percent who are underweight; 71 percent of adult Egyptian women are overweight, compared to 1 percent who are underweight; and 29 percent of children in urban areas of China are obese.

This is largely attributable to the end of the other “oil crisis”: in recent years cheap, high-quality cooking oil has become available in developing countries for the first time. The oil contains dietary fat that has raised per capita caloric intake by 400 calories per day since 1980. The increased use of cooking oil also reflects the increasing availability of meat as incomes rise around the world. For example, the ordinary Chinese diet used to rely very heavily on starchy roots, rice, and salted vegetables. Since 1980 the Chinese diet has added a lot of meat fried in oil. Most Chinese now consume at least 2,500 calories per day.

There are still some places with “old” nutrition problems: 6 percent of the adult women in Cambodia are overweight compared to 21 percent who are underweight; 4 percent of adult Bangladeshi women are overweight compared to 45 percent who are underweight.

This has some large implications for public health. Excess weight has been associated with illnesses like diabetes and heart disease. Poor countries lack the medical systems to deal with these sorts of problems, which are new to them. An obesity epidemic is on the way. Does that mean that Weight Watchers will become an international phenomenon?

Don Peck, “The World in Numbers: The Weight of the World,” Atlantic, June 2003, pp. 38-39.

Freedom from Farmers.

Back in the 1920s and 1930s almost half of Americans lived in communities of fewer than 2,000 people, and a full quarter of them lived in rural areas. Massive over-production of basic crops led to an agricultural depression long before the onset of the Great Depression. The larger collapse of the American economy in 1929 eventually led to an effort to address the agricultural problems. The New Deal’s Agricultural Adjustment Act tried to push up farm incomes. The Act pegged target prices to their highest recorded levels, then combined subsidies with payments not to grow crops as a way to reach those incomes. Generally, it worked. The program had been intended as a temporary “emergency” measure, but Congress made it permanent in 1949.

Since then the program has grown while the number of farmers has been reduced. Until recently, the government made direct payments to farmers and picked up almost two-thirds of the cost of insurance against weather-related problems. All farmers, great and small, have benefitted from this program: the average farmer made $87,000 a year in 2011, largely thanks to federal welfare, compared to the national average income of $67,000. At the same time, the “family farm” has become largely imaginary. American farming has become concentrated in the hands of a few giant “agribusinesses.” Since most of the beneficiaries of these programs are in a minority of “Red” states, Republicans bought off the Democrats by including the food-stamp program in the Farm Bill.[1] Probably not what Thomas Jefferson had in mind. Or maybe it was.

In 1973 and again in 1979 oil supplies from the Middle East were interrupted and gasoline prices soared. People eager to insulate the American economy from such price shocks urged the development of alternative fuels. One of the most prominent alternatives was ethanol—alcohol derived from plants. In particular, Middle Western farm states pushed for the conversion of corn into ethanol. However, other adaptations provided a first response. Not until 1995 did the United States government begin to subsidize the production of corn-based ethanol. The program grew tremendously over the next decade as Congress repeatedly expanded it. In 2007 the United States produced about 5 billion gallons of ethanol from corn. It seems likely to grow even larger: in 2007 Barack Obama told an Iowa audience that he favored raising ethanol production to 65 billion gallons by 2030.

So far, so good. Are there down-sides to this pursuit of ethanol as an alternative fuel? Yes, several. First, corn-based ethanol is incredibly inefficient compared to other fuels. The “energy balance” of any fuel is the ratio between the amount of energy it yields and the energy consumed to produce it. Gasoline produces five times the amount of energy needed to produce it. Sugar-cane-based ethanol yields eight times as much energy as is needed to produce it. Corn-based ethanol, however, produces only about 1.3 times as much energy as is needed to produce it. In short, you get very little benefit for the energy expenditure. Second, ethanol absorbs water. As a result, it cannot be shipped by existing gasoline pipelines, and it cannot be mixed with gasoline at more than a 1:9 ratio because it would corrode engine parts. In turn, this means that ethanol has to be shipped by less energy-efficient tanker trucks and that it can reduce oil-based gasoline consumption by at most 10 percent. Third, because the energy balance of corn-based ethanol is so low, it takes huge amounts of corn to produce much ethanol. About one-fifth of the existing corn crop is devoted to ethanol. (To reach President Obama’s goal of 65 billion gallons of ethanol by 2030 would require using thirteen times as much corn as is used currently—or about 260 percent of current total corn production. Since corn is used for many other things, the 80 percent of the crop now devoted to those purposes would have to remain in cultivation. The real level of corn production would thus have to rise well above triple the present level.) Devoting corn to ethanol drives up the price of all other corn-derived products: Mexican tortillas, corn-fed beef, anything sweetened with corn-syrup or fried in corn-oil. Shifting land from producing something else to producing subsidized corn then drives up the price of other goods as well.
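The corn arithmetic above can be made explicit. Here is a back-of-the-envelope check using only the figures already given (treating “one-fifth of the crop” as exactly 20 percent, an assumption made for the sake of round numbers):

```latex
% Scale factor: proposed output over current output.
%   65 billion gallons / 5 billion gallons = 13
% Corn needed for ethanol alone, as a multiple of today's total crop:
%   13 x 0.20 = 2.6  (about 260 percent of the current crop)
% Other uses of corn still claim 80 percent of today's crop, so:
%   2.6 + 0.8 = 3.4  (about 3.4 times current total production)
\[
\frac{65}{5} = 13, \qquad 13 \times 0.20 = 2.6, \qquad 2.6 + 0.8 = 3.4
\]
```

That 3.4 is where “well above triple the present level” comes from.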

If the energy balance of ethanol is poor, that of campaign contributions is not. One agribusiness giant made $3 million in campaign contributions between 2000 and 2013, but received subsidies for producing ethanol worth $10 billion.

[1] Although, in 2013, in one of those fits of insanity that have become their hall-mark, Republicans decided to shred the food-stamp program. President Obama threatened to veto any bill that didn’t fund food-stamps. “A welfare program for agribusiness,” The Week, 16-23 August 2013, p. 13.

 

Ammo.

C.J. Chivers came to reporting for the New York Times by an unusual route. He graduated from Cornell in 1987, then went into the Marines as an officer. He served in the First Gulf War, then in peace-keeping operations in Los Angeles after the Rodney King riots. He left the Marines as a captain in 1994. Graduate school in journalism at Columbia followed. His first reporting job came with the Providence Journal in Rhode Island, where he worked from 1995 to 1999. In 1999 he moved to the Times, where he had the police beat until 2001. Thereafter he became a foreign correspondent covering the wars with radical Islam. He has covered the American war in Afghanistan, the Russian war in Chechnya, and the American war in Iraq. Lately, he has been covering the wars in Ukraine and Syria.

As a former Marine, Chivers knows more than does the usual reporter about military weapons. As a war correspondent in the Greater Islamic Area, he’s run into a lot of AK-47s. These qualifications give his reporting a certain cast. He can make firearms themselves tell an interesting story about the conflicts in which they are used. For example, he wrote The Gun (2010), a history of the AK-47. (See: The Gun That Made the Nineties Roar; The Arms Barometer).

Recently, he published a story about the ammunition recovered on battlefields where troops have engaged ISIS. It turns out that ISIS captures much of its ammunition from defeated foes. Indeed, it appears to select some targets for attack by the prospect of capturing important stocks of weapons. This isn’t hard to do, because a lot of the opponents of ISIS don’t put up much of a fight. Sometimes, anti-Assad groups of Syrian rebels or the Syrian troops they are supposed to be fighting just sell ISIS the arms that they have been given by foreign patrons.

About 80 percent of the ammunition examined came from the Soviet Union before its collapse, post-Soviet Russia, the United States, China, or Serbia (the perpetual bad-boy of international morality). A lot of the ISIS ammo came out of captured Syrian warehouses—or off dead Syrian troops. The Soviet Union/Putinia were long-term sponsors of Syria, so about 18-19 percent of the ammo was manufactured in some version of whatever we’re calling Russia this week. Most of this was produced between 1970 and 1990. So, did the Russkies stop selling to the Syrians from 1990 on? Or was more recently supplied ammo stored in warehouses closer to the center of power? Or was this AK-47 ammunition purchased by the US government from an American re-seller of ammo to fit the AK-47 and other Russian weapons, and then given either to Iraqi security forces before they were supplied with American M-16s or to Syrian “moderates”? About 26 percent was manufactured in China during the 1980s, but it is impossible to tell when it was shipped to Syria. About 18 percent of it was manufactured in the United States during the 2000s, so this is ammo supplied to the Iraqi security forces after the American invasion of Iraq. Probably, most of this ammo came into the possession of ISIS after the collapse of the Iraqi army in Spring-Summer 2014.[1]

The story by Chivers complicates the Obama administration’s idea of building up “moderate” alternatives to ISIS. For one thing, why is it necessary to train and arm “moderate” fighters when the solution that occurred to ISIS was to go get the weapons it needed by brute force? Why didn’t the “moderates” seize the arms they needed from Syrian forces? For another thing, the “moderates” appear to have sold some of the weapons that they have received to ISIS to avoid trouble. Won’t they do that with any new weapons that they receive?

[1] C.J. Chivers, “ISIS’ Ammunition Is Shown to Have Origins in U.S. and China,” NYT, 6 October 2014.

Climate of Fear V.

A recent poll about global warming suggests that Americans are fractured in their beliefs about climate change. The divide chiefly is partisan. Among Democrats, 61% believe that it is having a serious impact right now, and 67% believe that it is caused mostly by human actions. Among Republicans, 26% believe that it is having a serious impact now, and 35% believe that it is caused mostly by human action. Typically, Independents are firmly in the middle: 47% believe that it is having a serious impact right now, and 53% believe that it is caused by human action.

One thing that this suggests is that Republicans and Democrats will be competing to persuade Independents of the correctness of their own analysis. Democrats have a basic advantage in this struggle at the moment. While 27% of Democrats and 30% of Independents believe that global warming results from natural patterns in the environment (rather than from human actions), 42% of Republicans believe that it results from natural patterns.

Another thing that it suggests is that American politics will veer between Republicans (57% believe it will have a serious impact sometime, 77% believe that something is causing it) and Democrats (84% believe it will have a serious impact sometime, 94% believe that something is causing it). The policies will swing between Republican efforts at palliation/adaptation and Democratic efforts at palliation/adaptation + reducing emissions.

 

One group that does believe in climate change is the national security establishment. Previously, the Pentagon and CIA saw climate change as a rising but distant threat. A report issued on 13 October 2014 portrays a more immediate danger.

Water shortages are at the center of Pentagon concerns: shortages of drinking water and drought-related crop failures can stir migrations that will stress governments in vulnerable areas. Those areas mostly are in Northern and Southern Africa and across the Middle East. (See: Climate of Fear III)

Marcus King, who studies the political and security implications of climate change, has suggested one possible scenario: in recent years drought in Syria has forced many farmers off their land and into cities; their children were then exposed to the appeals of Islamist preachers; when civil war broke out in 2011, many of these rootless and radicalized young men flocked to ISIS; and now ISIS uses control of the water supply in its territory as a lever of power over the people who live there.

The announcement of the new Pentagon stance on the dangers posed by climate change may be politically inspired to a degree. People—both reasonable and unreasonable—may suspect that it springs from the efforts of the Obama administration to build pressure for new American commitments in the next international climate agreement, to be reached in 2015. Secretary of Defense Chuck Hagel announced the main points of the report at a conference of defense ministers of the Americas held in Peru. International delegations will meet in Peru in December 2014 to draft the new agreement. Reasonable people may conclude that the danger is real (if hyped a little for the moment), rather than manufactured.

Marjorie Connelly, “Global Warming Concerns Grow,” NYT, 23 September 2014.

Coral Davenport, “Pentagon Signals Security Risks of Climate Change,” NYT, 14 October 2014.

A fine kettle of fish.

Wage increases haven’t kept pace with inflation for at least a decade. Generally, American families earn less than they did in 1999. A host of factors lie behind this depressing trend. There is intensifying competition from overseas (globalization); there is the difficulty of workers adapting to technological changes that wipe out lower skill/lower wage jobs while creating higher skill/higher wage jobs; and there is a government that is managing the past more than helping create the future. Still, there are a couple of factors that capture the attention.

First, America has been suffering from slow economic growth for quite a while. Why have we suffered slow growth? One answer is that high energy costs exert a drag on the economy. Beginning with the oil shocks of the 1970s, energy costs rose until the 1990s. They dropped for most of that decade, but have returned to the post-1970s “normal” in this century. Energy costs work like a regressive tax: everybody drives, so everybody pays the same gas tax; high energy costs for employers drive them to hold down other costs, like wages, or to pass them on to consumers. Another answer is that American workers used to have an enormous education advantage over most foreign workers. Now other countries have moved forward, while Americans have remained stuck in neutral. This affects productivity in a competitive economy.

Second, what growth has occurred has flowed toward those already at the top of the pyramid. Health-care costs reduce real incomes: either employers resist wage increases in order to provide health insurance, or employees without work-provided health insurance have to pay their own costs. The long rise in health-care costs cut into the rise in pay for most people. It took a proportionately smaller share from the incomes of the well-off. They plowed the difference back into investments.

Are there any grounds for even a modest optimism? Yes. First, “fracking” has greatly increased the supply of cheaper energy in America. Second, the incessant talk about the importance of education for getting a decent job has led to an increase in the number of high-school and college graduates. In 2000, 29.1 percent of 25-to-29-year-olds had a college BA; in 2008, 30.8 percent did; and in 2013, 33.6 percent did. Third, for reasons that are much debated, health-care costs have stopped rising in the last few years. This should allow pay to rise as well.

None of this means that we’re home free. The way forward is shrouded in fog. Short-term results haven’t been very satisfying. American voters clang back and forth between “Hope” and the “Tea Party.” The partisan “grid-lock” in Washington may be less of a cause of our troubles than a symptom of those troubles.[1]

This analysis raises a couple of questions.

First, how do we improve the educational preparation of American workers? Shove 50 percent or more of Americans through college? Create a trades-oriented alternative to college?

Second, how do we get health-care costs down? Western Europe and Japan spend two-thirds as large a share of GDP on health care as the United States does and get better results, so it can be done.

Third, where do we stand on the cheap-energy-versus-the-environment issue? Global warming argues for alternatives to burning carbon; jobs and economic growth argue for cheap energy.

Fourth, what is a government supposed to do in a highly complex society and economy? After the “London whale” and the Chrysler recalls, the “regulatory state” has a black eye. That’s hardly a reason to believe in the pure rationality of the market economy.

[1] David Leonhardt, “The Great Wage Slowdown of the 21st Century,” NYT, 7 October 2014.

 

 

The Secret History of Columbus Day.

The vast majority of the early settlers of British North America were Protestants. They brought with them a folk memory of how English Catholics had been seen—often correctly—as disloyal to the British government and in the service of foreign princes who wished to establish absolute monarchies that would force people to abandon their own faith and become Catholics. Protestantism and Catholicism regarded each other as defective faiths, rather than legitimate religions. From the late 18th Century on, the Catholic Church sided with autocratic governments and systematic ignorance. The Church opposed everything desired by progressive people of the day: representative governments, elections, freedom of speech, freedom of opinion, freedom of the press, individual civil rights, and modern science. The Church maintained an Inquisition to repress heresy (wrong belief) and an Index of Banned Books that no Catholic should read. Occasionally, the Church kidnapped Jewish children who had been secretly baptized by Christians and raised them as Catholics.[1] Moreover, in theory, Catholics owed their first loyalty to the Pope, rather than to the government of whatever country they happened to live in. Protestants in all countries despised Catholics as a primitive people who were slaves to the orders of their priests.

Catholic immigrants—from Ireland, Italy, and Germany—got a hostile reception from Protestant America. To make matters worse, the Irish and Italians were poor country people. Usually they were illiterate, and generally they had no technical skills. Hence, they took the lowest-paying and least-regarded jobs when they first arrived in America. Their desperation for work dragged down wages for the native-born population. During the 1830s and 1840s, anti-Catholic sentiment boiled over in brawls, riots, press campaigns, and “Nativist” political parties.

The problem for Catholics lay in how to make themselves acceptable in a hostile foreign society. One solution came through associating themselves with the history of America from its earliest times. Italian-Americans first celebrated Columbus Day in New York City’s “Little Italy” in 1866. In 1882 Catholic Americans led by an Irish-American priest founded the “Knights of Columbus” as a device to help impoverished immigrants and promote Catholic education. The organization grew like wild-fire among Irish and Italian immigrants and their descendants. It emphasized the union of Americanism and Catholicism.

In 1892 President Benjamin Harrison proposed that Americans celebrate the 400th anniversary of the arrival of Christopher Columbus in the New World. Various dignitaries and un-dignitaries used the occasion to laud such ideals as patriotism and social progress. School-children recited the “Pledge of Allegiance” for the first time as part of the celebration.

Angelo Noce, an Italian immigrant who had become a citizen and who lived in Denver, Colorado, took it into his head to press to make Columbus Day a Colorado state holiday. In 1905 the governor of Colorado decreed 12 October to be a state holiday.

In 1934, the Knights of Columbus and an Italian-American leader in New York City named Generoso Pope (whose son later founded the National Enquirer) got newly-elected President Franklin D. Roosevelt to proclaim Columbus Day a national holiday. Roosevelt needed the Italian vote, so he agreed.

Now “progressive” people want to use the date to validate the long-neglected Native Americans. Why not? Catholics now are fully integrated into American society. They don’t need it. And it isn’t as much fun as Saint Patrick’s Day. Still, that leaves Asian-Americans.

[1] See David Kertzer, The Kidnapping of Edgardo Mortara (New York: Random House, 1997) for one example that attracted much attention.

Bomb ’em till the mullahs bounce.

Iran has spent thirty years and $100 billion pursuing atomic weapons. Iran is deeply hostile to the West in general and to the United States and its allies in particular. So, that’s a problem. What to do?

Either we attack Iran’s nuclear resources to forestall the development of weapons or we accept Iran as a nuclear power and then seek to contain it. The choice will be shaped by how outsiders, the Americans in particular, perceive the Iranian leadership. If it is a rational, dispassionate leadership pursuing national security, rather than expanded power, then containment might well work. If it is an irrational, hatred-driven leadership seeking to expand Iranian power by toppling the established regional order, then an attack may be the only solution.

Kenneth Pollack[1] has concluded that Iran is driven either by “the Iranian leadership’s pathological perceptions of the United States or its own aggressive ambitions.” Nevertheless, he favors containment over the short to mid-term. Over the longer term, he argues, it would be better to engineer a change of regime through keeping the economic sanctions on Iran, reducing the diplomatic support it receives from Russia and China, and supporting dissidents within the country. Anybody, he thinks, would be better than the current rulers, both for America and for the Iranians themselves.

Matthew Kroenig[2] shares the conviction of Pollack and every other informed observer that Iran is pursuing nuclear weapons, not a peaceful nuclear program. He bolsters the standard arguments by noting that Iran is also developing Intercontinental Ballistic Missiles (ICBMs), the standard delivery vehicle for nuclear warheads. Kroenig derides the “containment” of a nuclear Iran. If the United States won’t fight a pre-nuclear Iran today, why would it risk fighting a nuclear Iran in the future? He also doubts that Pollack’s dream of regime change will become a reality. He sees the government in Tehran as too deeply entrenched and too ruthless in crushing its opponents, as it did with the so-called “Green Revolution” in 2009.[3]

 

Either containment or attack will leave the future uncertain. Might a “contained” nuclear Iran later tip toward expansionism when conditions become favorable? Would a successful attack stop Iran’s pursuit of nuclear weapons in its tracks for all time or would it just lead Iran to renew the effort after the dust had settled? Destroying a few key sites would still leave the country with scientists, engineers, and oil revenues—the real building blocks of a nuclear effort.

A creeping, largely unspoken fear is that the religious fundamentalists in Tehran share a basic mind-set with the religious fundamentalist suicide bombers of Al Qaeda and ISIS: death is to be welcomed in the service of a higher cause. That makes it hard to believe that Mutual Assured Destruction would dissuade Iran from waging nuclear war.

Finally, can the United States coerce Iran while seeking its support against ISIS? Or will the United States have to send troops to Iraq and Syria to defeat ISIS if it wants to coerce Iran?

If the United States agonizes too long, will Israel attack to degrade, even if it cannot destroy, the Iranian nuclear program?

[1] Kenneth Pollack, Unthinkable: Iran, the Bomb, and American Strategy (New York: Simon and Schuster, 2013).

[2] Matthew Kroenig, A Time to Attack: The Looming Iranian Nuclear Threat (New York: Palgrave Macmillan, 2014).

[3] The defeat of both the “Green Revolution” in Iran and the Tahrir Square movement in Egypt suggest the staying power of authoritarian governments in the Middle East.

Shi’a pets.

The Prophet Muhammad died in 632 AD. Who should succeed him as “caliph,” the leader of the Faithful? Should the succession be “elective” in the sense of someone chosen from among Muhammad’s chief followers? If so, then the leading candidate was Abu Bakr, Muhammad’s father-in-law and a powerful prop of Islam. Or should the succession be “hereditary” in the sense of someone chosen from among Muhammad’s sons-in-law so that the blood of the Prophet would run in the veins of future caliphs? If so, the leading candidate was Ali, the favored son-in-law. The majority supported the “elective” solution: Abu Bakr became the caliph. Ali and his followers sulked and schemed. Eventually Ali seized power as the fourth caliph, only to be assassinated. Since the debate over the succession, Islam has been split between a majority which sprang out of the supporters of Abu Bakr, the Sunni, and a minority that sprang from the “party of Ali,” the Shi’a[t Ali].[1] Eventually, the caliphate passed to the Ottoman sultan. The majority of Ottoman subjects were Sunni Muslims, with Shi’ites a minority located in what would become Syria and what would become Iraq. The great majority of Shi’ites were found in Persia/Iran.

Events in the 1980s turned up the flame under this conflict. The Iranian Revolution led to the creation of a revolutionary theocratic republic. Saddam Hussein’s attack on Iran led to a long war in which other Sunni states supported Iraq. Iran largely created the Hezbollah movement in Lebanon.

At the start of the Twenty-First Century, Syria under the Assad dictatorship offered a mirror-image of Iraq under the Hussein dictatorship: in the former, a Shi’a minority ruled a Sunni majority; in the latter, a Sunni minority ruled a Shi’a majority.[2] The overthrow of these regimes then opened the door for the oppressed majorities to seek revenge.[3] Since the beginning of the Syrian civil war in March 2011, the Assad government has seen half the country secede from its control. In Iraq, the Maliki government got right down to business as soon as it had waved good-bye to the all-too-willing Americans in 2011.

Both sides in the Syrian civil war have found supporters among their co-religionists abroad. Shi’ite Iran and the Shi’ite government of Iraq have aided the Shi’ite Assad government. Sunni Qatar, Sunni Saudi Arabia, and Sunni foreign fighters have supported the Sunni Islamists who are doing most of the heavy lifting against the Assad government in Syria and who have attacked the Shi’ite government in Iraq.[4] (See: “A Dog in This Fight?”)

“The Sunni-Shi’ite War,” The Week, 1 November 2013, p. 9.

[1] Wait. They’re fighting a gory war over something that happened 1400 years ago? Well, not exactly. During the 1400 years the two sects developed different religious practices which divide them. They also developed a history of conflict, oppression, and resistance linked to these two different faith traditions. So, they’re fighting a gory war over stuff that began 1400 years ago and continued—in widely varying degrees of intensity—down to the present. It probably isn’t helpful to try to analogize it to history-based conflicts in Western culture, like Protestant versus Catholic in Northern Ireland or the struggle for African-American civil rights.

[2] Do minorities create dictatorships as a defensive response to past or potential threats from the majority? That’s a political science question, rather than a historical question.

[3] While effete Italians assert that “revenge is a dish best tasted cold,” Arabs appear to prefer take-out.

[4] Is it possible to compare the Syrian Civil War to the Spanish Civil War? Or aren’t young Muslims entitled to a romantic commitment to an idealistic cause that subsequently turns out to be soiled by Great Power scheming?