The Secret History of Veterans Day.

Fighting in the First World War stopped at 11:00 AM on 11 November 1918. In 1919, President Woodrow Wilson proclaimed 11 November of that year to be a national holiday, “Armistice Day.” It was supposed to be a one-off. The next year, Wilson proclaimed the Sunday nearest 11 November to be Armistice Sunday so that churches could devote a day to recalling the lost and pondering the difficulties of peace. In 1921 Congress declared a national holiday on 11 November to coincide with the dedication of the Tomb of the Unknown Soldier at Arlington National Cemetery. Thereafter most states made 11 November a state holiday.

The American Legion campaigned for additional payments to military veterans on the grounds that wartime inflation had eroded the value of their pay. Civilian employees of the federal government had received pay adjustments, so veterans should receive them as well to “restore the faith of men sorely tried by what they feel to be National ingratitude and injustice.” There were a lot of veterans: 3,662,374 of them. All were voters, so Congress took up adjusted compensation legislation in 1921 that promised immediate payments to veterans. This would amount to about $2.24 billion. That was a lot of money, especially since Congress didn’t propose a means to pay for it. President Warren Harding initially opposed the legislation unless it was paired with new revenue, then came to favor a pension system. Harding managed to block the legislation in 1921 and again in 1922. President Calvin Coolidge vetoed a new bill in 1924, saying that “patriotism…bought and paid for is not patriotism.” Congress overrode the veto.

The World War Adjusted Compensation Act, also known as the Bonus Act, applied to veterans who had served between 5 April 1917 and 1 July 1919. They would receive $1.00 for each day served in the United States and $1.25 for each day served outside the United States. The maximum pay-out was capped at $625. The ultimate payment date was set for the recipient’s birthday in 1945. Thus, it functioned as a deferred savings or insurance plan. However, a provision of the law allowed veterans to borrow against their eventual payment.
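The Act’s per-day arithmetic can be sketched in a few lines. This is a simplified illustration that uses only the rates and cap quoted above; the actual law’s certificate and interest provisions are ignored.

```python
# The Bonus Act's per-day service credit, simplified. The real law also
# converted this credit into a certificate maturing in 1945; this sketch
# covers only the rates and the cap quoted in the text.

DOMESTIC_RATE = 1.00   # dollars per day served in the United States
OVERSEAS_RATE = 1.25   # dollars per day served outside the United States
CAP = 625.00           # maximum credit under the Act

def adjusted_compensation(days_domestic: int, days_overseas: int) -> float:
    """Return a veteran's capped service credit in dollars."""
    credit = DOMESTIC_RATE * days_domestic + OVERSEAS_RATE * days_overseas
    return min(credit, CAP)

# One year stateside plus one year in France hits the cap:
print(adjusted_compensation(365, 365))  # 821.25 before the cap; prints 625.0
```

A veteran with a year overseas and a year at home thus maxed out the benefit, which helps explain why the cap, not the daily rate, dominated the politics.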

In 1926 Congress urged the President to issue a proclamation each year on the commemoration of Armistice Day. It also ordered creation of a new and grander Tomb of the Unknown Soldier.

In 1929 the Great Depression began. Veterans suffered just like everyone else. Many of them began to borrow against the deferred compensation. By the middle of 1932, 2.5 million veterans had borrowed $1.369 billion.

In April 1932 the new Tomb of the Unknown Soldier at Arlington was completed. In Spring and Summer 1932 about 17,000 veterans gathered in Washington, DC, to demand immediate payment of their compensation. Accompanied by thousands of family members, they camped out in shacks on Anacostia Flats. The papers called them the “Bonus Army.” In mid-June 1932, the House of Representatives passed a bill for immediate repayment, but the Senate rejected it. At the end of July 1932 the Washington police tried to evict the “Bonus Marchers,” but failed. President Herbert Hoover then had the Army toss them out.

In 1936 the Democratic majorities in Congress passed a bill to allow immediate payment of the veterans’ compensation, overriding President Franklin D. Roosevelt’s veto. A bunch of rich-kid jokers at Princeton soon formed the “Veterans of Future Wars” to demand immediate payment of a bonus to them since they were likely to get killed in the next war, before they had a chance to spend a post-war bonus.

In May 1938 Congress passed a law making 11 November an annual holiday for federal employees. In 1954 Congress changed the name to Veterans Day.

Climate of Fear VI.

Burning carbon emits carbon dioxide and other greenhouse gases into the atmosphere. Greenhouse gases then trap heat in the atmosphere, preventing it from escaping out into space. This effect is responsible for global warming. Since the late 18th Century, burning carbon has fueled the Industrial Revolution. In the 1980s and 1990s, the surface temperature of the Earth rose by 1.2 degrees. This rise then caused substantial melting of the polar ice caps and extreme weather events.

How much worse, then, would be the effects of the spread of industrialization into the non-Western world in the 21st Century? This has greatly increased the burning of carbon. Between 2000 and 2010, 110 billion tons of carbon dioxide were released into the atmosphere. This amounts to an estimated one-fourth of all the greenhouse gases ever emitted. At this rate, the volume of carbon dioxide concentrations in the atmosphere compared to pre-industrial times will double by 2050. In 2007 the UN’s Intergovernmental Panel on Climate Change (IPCC) predicted that such a doubling could lead to a temperature rise of 5.4 degrees, with increases of 0.2 degrees Celsius per decade. (Which I think, but I’m a dumb American, works out to be 0.36 degrees Fahrenheit.) So, the temperature of the Earth should be rising even faster than before.
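The parenthetical conversion checks out: a temperature *change* converts from Celsius to Fahrenheit by multiplying by 9/5, with no 32-degree offset.

```python
# Converting a temperature *change* from Celsius to Fahrenheit:
# multiply by 9/5. The +32 offset applies only to absolute
# temperatures, not to differences.

def delta_c_to_f(delta_c: float) -> float:
    return delta_c * 9 / 5

print(round(delta_c_to_f(0.2), 2))  # 0.36 degrees Fahrenheit per decade
```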

It isn’t. Since 1998 the surface temperature of the Earth has risen by 0.2 degrees. However, this is much less of a rise than climate scientists had projected by extrapolating the temperature increases that were recorded in the 1980s and 1990s. (I think that we should be about 0.5 degrees warmer, but see my earlier disclaimer.) “Baby, Baby, where did the heat go?”

Some climate change skeptics love this: “There is no problem with global warming. It stopped in 1998.” OK, but why did it stop? Will it restart? Another stripe of skeptics takes issue with the accuracy of the models used to estimate the effects of greenhouse gas emissions. They argue that the climate is not as sensitive to increases in greenhouse gases as many models assume. We have more time to adapt and at a lower cost than “alarmists” predict.

Climate scientists offer a number of possible explanations for the “missing heat.”

The deep seas absorbed the extra heat, the way they did the “Titanic.” While surface sea temperatures have remained stable, temperatures below 2,300 feet have been rising since 2000.

The rhythms in the heat radiated by the Sun are responsible. The highs and lows of this rhythm are called solar maximums and solar minimums. One solar maximum ended in 2000 and we are in the midst of a solar minimum.

The pollution emitted by major carbon-burners like China actually reflects away some of the Sun’s heat before it becomes trapped in the atmosphere. (You can see how this answer would alarm proponents of responding to climate change. “The real problem with air pollution is that we don’t have enough of it.”)

Climate scientists have also scaled back their predictions from a possible 5.4 degree rise in surface temperatures to projections between 1.6 and 3.6 degrees. These less-warm decades will then be followed by the roof falling in. The sun will move toward the next solar maximum; the heat trapped in the deep sea will rise toward the surface to boost temperatures; and the “pollution umbrella” will go back to trapping heat in the atmosphere. We’ll fry like eggs. Or perhaps just get poached. Depends on which scientists you believe.

“The missing heat,” The Week, 30 August 2013, p. 11.

Judith Curry, “The Global Warming Statistical Meltdown,” Wall Street Journal, 10 October 2014.

The Senator from San Quentin.

During the 1980s violent crime rose to new peaks. The murder rate in 1991 reached 9.8/100,000, about four times the rate in, say, France. A criminologist named George Kelling argued that the toleration of all sorts of little crimes or acts of indecency—even broken windows or vandalism or those homeless goofs at intersections trying to extort pocket change for cleaning your windows—created an atmosphere of disrespect for the law. From little things, people went on to feel less restrained about bigger things. Kelling sold this idea to New York City Police Commissioner William Bratton. New York cops started pushing the homeless into shelters, clearing the intersections of squeegee men, and stopping kids from hanging out on street corners.

However, Bratton also embraced the idea that a lot of crime is committed by a few people, and a little crime is committed by a lot of people. You want a big drop in crime? Concentrate on the few career criminals and put them away for a long time. Bratton concentrated on a statistical analysis of crime in each police precinct, then drove his precinct captains to find and arrest habitual criminals. This seemed to work, so lots of police departments adopted the New York approach. Bratton’s approach coincided with a get-tough policy adopted by legislatures in the Nineties. Mandatory minimum sentences and three-strikes-and-you’re-out sentencing kept criminals in prison for longer. The war on drugs, especially the crack cocaine epidemic, sent a lot more people to prison. Guys who are locked up can’t commit crimes, at least not against ordinary citizens. (Fellow prisoners or guards? That’s another story.)

Inevitably, there is a down-side. First, the United States has one-twentieth of the world’s population, but one-fourth of the world’s prisoners. That is more than either Russia or China holds. There are more people currently in prison in the United States (2.3 million) than there are residents in any one of the fifteen least-populated states, and more than in the four least-populated states put together. The rate of imprisonment in the United States is the highest in the world.
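The “highest in the world” claim can be spot-checked from the numbers given. This quick sketch assumes a 2009 US population of roughly 307 million, a figure not stated in the text.

```python
# Spot-check of the imprisonment figures. The 2.3 million prisoner
# count is from the text; the US population is an assumption
# (roughly the 2009 estimate).

prisoners = 2_300_000
us_population = 307_000_000  # assumed, not in the text

rate_per_100k = prisoners / us_population * 100_000
print(round(rate_per_100k))  # roughly 749 prisoners per 100,000 residents
```

A rate in the neighborhood of 750 per 100,000 is indeed, as the text says, the highest national rate in the world.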

Second, black communities have been particularly hard hit by both crime and punishment. One in nine black men between the ages of 20 and 34 is in jail. (The overall ratio of imprisoned to paroled/probationed is about 1:3, so that would suggest that another three in nine black men are under some other form of judicial supervision.) Since felons lose the right to vote, large numbers of blacks have been disenfranchised in what one law professor has labeled “the new Jim Crow.” Since most prisons are located in rural areas, and the census counts prisoners where they are held, this leads to the over-representation of rural areas unsympathetic to city problems.

Third, keeping huge numbers of prisoners locked up is really expensive. Americans don’t like to pay taxes, so prison budgets have been held down for decades. The result is massive over-crowding. Courts have repeatedly held this over-crowding to amount to cruel and unusual punishment.

Fourth, imprisonment doesn’t seem to do anything to change behavior. Says one criminologist, “two-thirds of those who leave prison will be back within three years.”

What have changed are the crime rates. Between 1991 and 2009, the number of murders fell by 45 percent. From its peak of 9.8/100,000 in 1991, the murder rate fell to 5.0/100,000 in 2009. The same decline has been found in most other categories of crime over the same period. At least for now.

Prisoners are so numerous that, if grouped together and represented in the Congress, they would be a formidable voting bloc.

“The prison nation,” The Week, 13 February 2009, p. 13; “The mystery of falling crime rates,” The Week, 16 July 2010, p. 13.

Eye in the Sky.

Some time ago the courts decided that no one has a right to privacy when they are on the streets or in public places. Initially, this applied, in part, to the many surveillance cameras installed by banks and stores and apartment buildings. Then the development of digital cameras made surveillance video available to watchers in real time and it made it simple to transfer the images between widely separated computers. Then computer geeks developed face-recognition software and programs that detected “anomalous behavior.” All of these were great crime-fighting tools, at least according to the police who sing the non-specific praises of the cameras as deterrents and crime-solving aids.

With this doorway open, since 9-11 the Department of Homeland Security has been making grants to cities to fund the installation of security cameras targeting public places. These cameras supplement the already existing security cameras installed by banks, stores, and office buildings. Madison, Wisconsin—a bastion of Mid-Western liberalism–is putting in 32 cameras; Chicago and Baltimore—hotbeds of urban crime which actually don’t give a rip about Islamic terrorism—are installing thousands of cameras and are linking them to the existing systems of private cameras. The most elaborate system is that of the Lower Manhattan Security Initiative: by 2010, 3,000 cameras will be in place throughout Wall Street and the World Trade Center area. In addition, the system includes license plate readers connected to computers that cross reference the numbers of suspect vehicles and which share images with the Department of Homeland Security and the EffaBeeEye.

Now there is a new layer of observation: police, government, and private drones. The police are hot to use drones. In the 1980s the Supreme Court held that the police don’t need a warrant to observe private property from public airspace. [NB: What is “public airspace”? So far as I can tell, anything at a height of 500 feet or above is clearly public airspace; anything 83 feet or below is private airspace; and what is in-between is a little murky. Are you allowed to shoot drones under 83 feet like skeet?] Drones can be fitted with high-resolution cameras, infra-red sensors, license plate-readers, and directional microphones. They are quieter and smaller than helicopters, reducing the chance that people will know that they are being observed without a warrant. If you keep your shades pulled down, can they “assume” you’re running a grow house?

Are there problems with this program? In the eyes of individual rights advocates on the left and right, the answer is definitely yes. While government agencies will watch millions of people in public places in hopes of catching a few terrorists before an attack, it is more likely that they only will be able to figure out what happened after the attack. Will people just become habituated to being watched in public places? In a generation, will they accept the possibility of being watched in semi-public places? What happens when surveillance images leak from the government agency to the public sphere? See: http://www.youtube.com/watch?v=8zYRYh6cQ2g The clip is fun to watch, except that it is a public traffic camera with the film leaked to provide private entertainment. What if a mini-drone lands on your bathroom window sill one morning and catches you in the shower? Some Peeping Tom at home or cops finding a fun use for the technology paid for by the DEA or property seizures from teen-age druggies driving their Dad’s BMW? In the eyes of most Americans, however, more surveillance cameras are just fine. (“The drone over your backyard,” The Week, 15 June 2012, p. 11.)

Runnin’ all ’round my brain.

Cocaine prices per gram in selected American cities, 1999 and 2005.

City            1999        2005        Change in base price

Seattle         $80-100     $30-100     -62%

Denver          $100-125    $100-125      0%

Los Angeles     $50-100     $30-100     -40%

Dallas          $90-125     $50-80      -44%

Chicago         $75-100     $75-100       0%

Detroit         $75-100     $50-120     -33%

Atlanta         $100        $80-100     -20%

Miami           $40-60      $20-110     -50%

New York        $21-40      $20-25       -0%
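The “Change in base price” column can be spot-checked in a few lines. This is a sketch assuming that “base price” means the low end of each city’s range, which the table itself does not say.

```python
# Spot-check of the "Change in base price" column, on the assumption
# (not stated in the table) that "base price" means the low end of
# each city's price range.

prices_1999 = {"Seattle": (80, 100), "Dallas": (90, 125), "Miami": (40, 60)}
prices_2005 = {"Seattle": (30, 100), "Dallas": (50, 80), "Miami": (20, 110)}

def base_price_change(city: str) -> int:
    """Percent change in the low end of the price range, 1999 to 2005."""
    old_low = prices_1999[city][0]
    new_low = prices_2005[city][0]
    return round((new_low - old_low) / old_low * 100)

for city in prices_1999:
    print(city, f"{base_price_change(city)}%")
# Seattle -62%, Dallas -44%, Miami -50%, matching the table
```

On this assumption the computed figures match the table for most cities; Denver and Chicago, whose low-end prices did not move, come out at 0 percent as listed.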

 

There are a bunch of ways of cutting up this data, so to speak.

First, in 1999, cocaine was a glut on the market in New York, Miami, and Los Angeles. These were major cities with a large over-all market, ports of entry, and centers of a counter-culture. In contrast, it was hard to come by in Atlanta, Denver, Dallas, and Seattle. These were chief cities of “the provinces,” as the Romans would have put it. Six years later Seattle had joined New York, Miami, and Los Angeles as the capital cities of cocaine. This probably has something to do with the explosion of the computer and software industries in Seattle. Maybe writing software allows for blow in a way that designing airplanes for Boeing does not. Still, the “cocaine revolution” hadn’t reached Denver, Atlanta, and Chicago. These cities remained the ones with the highest priced (and thus least available) cocaine.

Second, even in two of the original core cities of cocaine consumption, Miami and Los Angeles, prices fell sharply. New York began with the lowest price and pretty much stayed there. Perhaps $20 a gram was the rock-bottom price for cocaine. Lots of people hustling on a big, but limited, market, all of them competing to deliver the best product to the most people at the lowest price. Adam Smith take note. Labor costs driven down to the subsistence minimum. David Ricardo take note.

Third, prices fell while the Drug Enforcement Administration was spending billions of dollars to drive up the price (and thus reduce consumption) through interdiction and eradication. Why didn’t this effort produce better results?

One reason is that cocaine producers in Colombia dispersed their coca-growing operations into more remote areas and spread into Peru and Bolivia as well. These are outside the range of US-sponsored eradication efforts. Production went up, not down.

Another reason is that, since the signing of the North American Free Trade Agreement (NAFTA) in 1994, there has been a huge increase in trans-border truck and vehicle traffic between Mexico and the United States. This made it much easier to move cocaine into the United States. One government policy warred with another government policy. The thing is that people trying to make money won in both cases. What’s more American than that?

Final thing to think about: 88 percent of cocaine moved through Mexico. Eventually, the Mexican intermediaries for the Colombians wanted a better deal. Much violence followed. (See: Narcostate with a State.)

 

Ken Dermota, “The World in Numbers: Snow Fall,” Atlantic, July/August 2007, pp. 24-25.

Opium War.

Opium was a familiar plague in Asia before the 20th Century. Chinese efforts to ban the import of opium from British India led to the Opium Wars, which China lost. For the Chinese, conquering opium thus became associated with recovering national sovereignty. When the Chinese Communists won the civil war in 1949, they launched a campaign against drug use and against opium production within China. Chinese producers fled to Laos and Burma (today’s Myanmar). Anti-drug campaigns in other Middle Eastern and Asian countries pushed the heart of production into increasingly remote areas: Burma, Laos, and most of all, Afghanistan. Once the long war against the Soviet Union and its Afghan puppets (1979-1989) wrecked traditional wheat and grape farming, Afghan peasants moved into growing opium poppies.

Since the Iranian Revolution (1979), the Iranian government has tried to end drug abuse and production, and to end the country’s role as a transit corridor for Afghan opium. Afghan producers shifted their routes to the successor states created by the collapse of the Soviet Union (1991). The hall-marks of these successor states were poverty, corruption, and badly secured nuclear stockpiles left over from the Soviet Union. For criminals—or for Islamists—conditions were perfect. (There’s a movie in this, if only Hollywood will listen.)

The Taliban, like the Iranian regime, tried hard to suppress the opium trade and opium use in Afghanistan after they came to power. In 2000 the Taliban ordered an end to poppy farming and to the opium trade. Partly, they wanted to end a social evil; partly they wanted to destroy the financial base of the regional warlords who opposed them. Whatever their motive, opium production came to a near halt. The American invasion in 2001 toppled the Taliban, freed the warlords to pursue their traditional actions, and caused the Taliban itself to turn to opium dealing as a way of financing its war to return to power. Within a few years of the American invasion, almost 90 percent of the world’s opium again came from Afghanistan. Myanmar and Laos came in distant second and third places.

Afghanistan is hardly the only weak state that is caught up in the international narcotics trade. In 1998 the North Korean dictator Kim Jong-Il launched his government into the opium trade, producing it on collective farms and transporting the product through North Korea’s embassies. Nigerian drug dealers have set up business in Bangkok to buy Pakistani and Iranian heroin for re-sale wherever the Nigerian diaspora has settled. (There’s a movie in this, if only Hollywood will listen.) The cocaine cartels fighting against the Colombian government broadened their own product-line to include opium poppies and then heroin.

In the eyes of American officials, putting a stop once again to the opium trade appeared to be essential to building a viable Afghan state by taming both the warlords and the Taliban. A viable state, in turn, formed a prerequisite to an American escape from Afghanistan. In early 2005 the Americans and the Afghan government launched “Plan Afghanistan,” which was modeled on the “Plan Colombia” anti-cocaine campaign begun in 1999.[1] The plan combined assistance to farmers to help them shift to other crops with efforts to eradicate opium poppies and interrupt the movement of opium out of the country. So far, neither “Plan” appears to have made a serious dent in the trade. Drugs give weak states a kind of strength, just not the kind we want.

Matthew Quirk, “The World in Numbers: The New Opium War,” Atlantic, March 2005, pp. 52-53.

[1] This offers an interesting example of analogical thinking as a guide to action. See: Yuen Foong Khong, Analogies at War: Korea, Munich, Dien Bien Phu, and the Vietnam Decisions of 1965 (Princeton UP, 1992); and Richard Neustadt and Ernest May, Thinking In Time: The Uses of History for Decision Makers (Free Press, 1988).

Fries with that?

What do we talk about when we talk about “Americanization”?  Are we talking about the spread of the American model through compulsion or seduction? Or are we talking about the Americans getting someplace first when everyone wants to go there? The global spread of obesity offers an example. In the United States the daily per capita consumption of calories has increased by 600 calories since 1980. Correspondingly, the share of overweight adults in the population has increased from 47 percent in 1980 to 64 percent in 2003. Since 1980 Americans have taken an increasing share of their meals from restaurants and take-out food. These meals tend to have about twice as many calories as does the typical meal Mom puts on the table. Nutritionists estimate that this accounts for about two thirds of the additional weight gained by Americans in the last quarter century. As late as 1991, generalized obesity was narrowly restricted geographically: only Michigan, West Virginia, Mississippi, and Louisiana had 15-20 percent of their populations classified as obese. By 1995 24 states had 15-20 percent of their adult population classed as overweight. By 2000 22 states had at least 20 percent of their populations classed as obese, and every other state except Colorado had at least 15 percent of its population classed as obese. “Obesity is often discussed as an American cultural phenomenon, closely intertwined with a taste for fast food, soft drinks, television, and video games.” This is probably what “Americanization” means in the eyes of Frenchmen and Islamist jihadis.

There have always been more underweight people than overweight people in the world. That gap has closed over time, however, and in 2000 it ceased to be true. Falling food prices for consumers, the shift from rural to city life for many people, the substitution of desk-work for field-work, and the purchase of processed foods are world-wide trends. Obesity has emerged as a social characteristic in developing countries: 15 percent of adult Kenyan women are overweight compared to 12 percent who are underweight; 26 percent of adult Zimbabwean women are overweight compared to 5 percent who are underweight; 71 percent of adult Egyptian women are overweight compared to 1 percent who are underweight; and 29 percent of children in urban areas of China are obese.

This is largely attributable to the end of the other “oil crisis”: in recent years cheap, high-quality cooking oil has become available in developing countries for the first time. The oil contains dietary fat that has raised the caloric intake of individuals by 400 calories per day since 1980. But the increased use of cooking oil also reflects the increasing availability of meat as incomes rise around the world. For example, the ordinary Chinese diet used to rely very heavily on starchy roots, rice, and salted vegetables. Since 1980 the Chinese diet has added a lot of meat fried in oil. Most people now consume at least 2,500 calories per day.

There are still some places with “old” nutrition problems: 6 percent of the adult women in Cambodia are overweight compared to 21 percent who are underweight; 4 percent of adult Bangladeshi women are overweight compared to 45 percent who are underweight.

This has some large implications for public health. Excess weight has been associated with illnesses like diabetes and heart disease. Poor countries lack the medical systems to deal with these sorts of problems, which are new to them. An obesity epidemic is on the way. Does that mean that Weight Watchers will become an international phenomenon?

Don Peck, “The World in Numbers: The Weight of the World,” Atlantic, June 2003, pp. 38-39.

Freedom from Farmers.

Back in the 1920s and 1930s almost half of Americans lived in communities of fewer than 2,000 people and a full quarter of them lived in rural areas. Massive over-production of basic crops led to an agricultural depression long before the onset of the Great Depression. The larger collapse of the American economy in 1929 eventually led to an effort to address the agricultural problems. The New Deal’s Agricultural Adjustment Act tried to push up farm incomes. The Act linked desirable prices to their highest recorded level, then combined subsidies with payments to not grow crops as a way to meet desirable incomes for farmers. Generally, it worked. The program had been intended as a temporary “emergency” measure, but Congress made it permanent in 1949.

Since then the program has grown while the number of farmers has been reduced. Until recently, the government made direct payments to farmers and picked up almost two-thirds of the cost of insurance against weather-related problems. All farmers, great and small, have benefitted from this program: the average farmer made $87,000 a year in 2011, largely thanks to federal welfare, compared to the national average income of $67,000. At the same time, the “family farm” has become largely imaginary. American farming has become concentrated in the hands of a few giant “agribusinesses.” Since most of the beneficiaries of these programs are in a minority of “Red” states, Republicans bought off the Democrats by including the food-stamp program in the Farm Bill.[1] Probably not what Thomas Jefferson had in mind. Or maybe it was.

In 1973 and again in 1979 oil supplies from the Middle East were interrupted and gasoline prices soared. People eager to insulate the American economy from such price shocks urged the development of alternative fuels. One of the most prominent alternatives was ethanol—alcohol derived from plants. In particular, Middle Western farm states pushed for the conversion of corn into ethanol. However, other adaptations provided a first response. Not until 1995 did the United States government begin to subsidize the production of corn-based ethanol. This program grew tremendously over the next decade as Congress repeatedly expanded the subsidies. In 2007 the United States produced about 5 billion gallons of ethanol from corn. It seems likely to grow even larger: in 2007 Barack Obama told an Iowa audience that he favored raising ethanol production to 65 billion gallons by 2030.

So far, so good. Is there a down-side to this pursuit of ethanol as an alternative fuel? Yes, there are several. First of all, corn-based ethanol is incredibly inefficient compared to other forms of fuel. The “energy balance” of any fuel is the ratio between the amount of energy produced and the energy consumed to produce it. Gasoline produces five times the amount of energy needed to produce it. Sugar cane-based ethanol yields eight times as much energy as is needed to produce it. Corn-based ethanol, however, produces only about 1.3 times as much energy as is needed to produce it. In short, you get virtually no benefit for the energy expenditure. Second, ethanol absorbs water. As a result, it cannot be shipped by existing gasoline pipelines, and it cannot be blended with gasoline beyond a 1:9 ratio because higher concentrations would corrode engine parts. In turn, this means that ethanol has to be shipped by less energy-efficient tanker trucks and that it can only reduce oil-based gasoline consumption by 10 percent. Third, because the energy balance of corn-based ethanol is so low, it takes huge amounts of corn to produce much ethanol. About one-fifth of the existing corn crop is devoted to ethanol. (Reaching President Obama’s goal of 65 billion gallons of ethanol by 2030 would require using thirteen times as much corn as is used currently—or about 250 percent of current total corn production. Since corn is used for many different things, the 80 percent of the crop now devoted to those other uses would have to remain in cultivation. This means that the real level of corn production would have to go well above triple the present level.) Devoting corn to ethanol drives up the price of all other corn-derived products: Mexican tortillas, corn-fed beef, anything sweetened with corn syrup or fried in corn oil. Shifting land from producing something else to producing subsidized corn then drives up the price of those other goods.
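The corn arithmetic in that parenthetical can be spelled out. This sketch uses only the figures quoted in the text: 5 billion gallons produced now, a 65-billion-gallon goal, and one-fifth of the current crop devoted to ethanol.

```python
# The corn arithmetic from the parenthetical, using only the figures
# quoted in the text.

current_gallons = 5e9        # 2007 corn-ethanol production
target_gallons = 65e9        # the 2030 goal cited
ethanol_share = 0.20         # one-fifth of today's corn crop

scale_up = target_gallons / current_gallons   # 13x more ethanol needed
ethanol_corn = scale_up * ethanol_share       # ethanol's corn, as a multiple of today's whole crop
other_uses = 1 - ethanol_share                # the 80% of the crop kept for food, feed, etc.

total_corn = ethanol_corn + other_uses        # total production, as a multiple of today's
print(round(scale_up), round(ethanol_corn, 1), round(total_corn, 1))
```

The ethanol corn alone comes to roughly 2.6 times today’s entire crop, close to the “about 250 percent” in the text; keeping the other uses whole pushes total production to about 3.4 times the present level, consistent with “well above triple.”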

If the energy balance of ethanol is poor, that of campaign contributions is not. One agribusiness giant made $3 million in campaign contributions between 2000 and 2013, but received subsidies for producing ethanol worth $10 billion.

[1] Although, in 2013, in one of those fits of insanity that have become their hall-mark, Republicans decided to shred the food-stamp program. President Obama threatened to veto any bill that didn’t fund food-stamps. “A welfare program for agribusiness,” The Week, 16-23 August 2013, p. 13.

 

Ammo.

C.J. Chivers came to reporting for the New York Times by an unusual route. He graduated from Cornell in 1987, then went into the Marines as an officer. He served in the First Gulf War, then in peace-keeping operations in Los Angeles after the Rodney King riots. He left the Marines as a captain in 1994. Graduate school in journalism at Columbia followed. His first reporting job came with the Providence Journal in Rhode Island. He worked there from 1995 to 1999. In 1999 he moved to the Times, where he had the police beat until 2001. Thereafter he became a foreign correspondent covering the wars with radical Islam. He’s covered the American war in Afghanistan, the Russian war in Chechnya, and the American war in Iraq. Lately, he’s been covering the wars in Ukraine and Syria.

As a former Marine, Chivers knows more than does the usual reporter about military weapons. As a war correspondent in the Greater Islamic Area, he’s run into a lot of AK-47s. These qualifications give his reporting a certain cast. He can make firearms themselves tell an interesting story about the conflicts in which they are used. For example, he wrote The Gun (2010), a history of the AK-47. (See: The Gun That Made the Nineties Roar; The Arms Barometer).

Recently, he published a story about the ammunition that has been recovered on the battlefields where troops have engaged ISIS. It turns out that ISIS captures much of its ammunition from defeated foes. Indeed, it appears to select targets for attack, at least in some cases, based on the prospect of capturing important stocks of weapons. This isn’t hard to do because many of the opponents of ISIS don’t put up much of a fight. Sometimes, anti-Assad Syrian rebel groups, or the Syrian troops they are supposed to be fighting, simply sell to ISIS the arms that they have been given by foreign patrons.

About 80 percent of the ammunition examined came from the Soviet Union before its collapse, post-Soviet Russia, the United States, China, or Serbia (the perpetual bad boy of international morality). A lot of the ISIS ammo came out of captured Syrian warehouses, or off dead Syrian troops. The Soviet Union/Putinia were long-term sponsors of Syria, so about 18-19 percent of the ammo was manufactured in some version of whatever we’re calling Russia this week. Most of this was produced between 1970 and 1990. So, did the Russkies stop selling to the Syrians from 1990 on? Or was more recently supplied ammo stored in warehouses closer to the center of power? Or was this AK-47 ammunition purchased by the US government from an American re-seller of ammunition for the AK-47 and other Russian weapons, then given either to Iraqi security forces before they were supplied with American M-16s or to Syrian “moderates”? About 26 percent was manufactured in China during the 1980s, but it is impossible to tell when it was shipped to Syria. About 18 percent was manufactured in the United States during the 2000s, so this is ammo supplied to the Iraqi security forces after the American invasion of Iraq. Probably, most of this ammo came into the possession of ISIS after the collapse of the Iraqi army in the spring and summer of 2014.[1]

The story by Chivers complicates the Obama administration’s idea of building up “moderate” alternatives to ISIS. For one thing, why is it necessary to train and arm “moderate” fighters when the solution that occurred to ISIS was to go get the weapons that it needed by brute force? Why didn’t the “moderates” seize the arms they needed from Syrian forces? For another thing, “moderates” appear to have sold some of the weapons that they received to ISIS to avoid trouble. Won’t they do that with any new weapons that they receive?

[1] C.J. Chivers, “ISIS’ Ammunition Is Shown to Have Origins in U.S. and China,” NYT, 6 October 2014.

Climate of Fear V.

A recent poll about global warming suggests that Americans are fractured in their beliefs about climate change. The divide chiefly is partisan. Among Democrats, 61% believe that it is having a serious impact right now, and 67% believe that it is caused mostly by human actions. Among Republicans, 26% believe that it is having a serious impact now, and 35% believe that it is caused mostly by human action. Typically, Independents are firmly in the middle: 47% believe that it is having a serious impact right now, and 53% believe that it is caused by human action.

One thing that this suggests is that Republicans and Democrats will be competing to persuade Independents of the correctness of their own analysis. Democrats have a basic advantage in this struggle at the moment. While 27% of Democrats and 30% of Independents believe that global warming results from natural patterns in the environment (rather than from human actions), 42% of Republicans believe that it results from natural patterns.

Another thing that it suggests is that American politics will veer between Republicans (57% believe it will have a serious impact sometime, 77% believe that something is causing it) and Democrats (84% believe it will have a serious impact sometime, 94% believe that something is causing it). The policies will swing between Republican efforts at palliation/adaptation and Democratic efforts at palliation/adaptation plus reducing emissions.

 

One group that does believe in climate change is the national security establishment. Previously, the Pentagon and CIA saw climate change as a rising but distant threat. A report issued on 13 October 2014 portrays a more immediate danger.

Water shortages are at the center of Pentagon concerns: shortages of drinking water and drought-related crop failures can stir migrations that will stress governments in vulnerable areas. Those areas mostly are in Northern and Southern Africa and across the Middle East. (See: Climate of Fear III)

Marcus King, who studies the political security implications of climate change, has suggested one possible scenario: in recent years drought in Syria has forced many farmers off their land and into cities; their children were then exposed to the appeals of Islamist preachers; when civil war broke out in 2011, many of these rootless and radicalized young men flocked to ISIS; and now ISIS uses control of the water supply in its territory as a lever of power over the people who live there.

The announcement of the new Pentagon stance on the dangers posed by climate change may be politically inspired to a degree. People, both reasonable and unreasonable, may suspect that it springs from the efforts of the Obama administration to build pressure for new American commitments in the next international climate agreement, to be reached in 2015. Secretary of Defense Chuck Hagel announced the main points of the report at a conference of American defense ministers being held in Peru. International delegations will meet in Peru in December 2014 to draft the new agreement. Reasonable people may conclude that the danger is real (if hyped a little for the moment), rather than manufactured.

Marjorie Connelly, “Global Warming Concerns Grow,” NYT, 23 September 2014.

Coral Davenport, “Pentagon Signals Security Risks of Climate Change,” NYT, 14 October 2014.