The GWOT if Israel were in charge.

What if Israel ran the Global War on Terror (GWOT)?

On the wall of his office Meir Dagan had an old black-and-white photograph of his grandfather about to be shot by a German in Russia during the Second World War. Must be some German soldier’s snap-shot, something he could keep as a trophy or send home to his girlfriend. I don’t know where Dagan got it. Probably did a lot of looking through the picture collection at Yad Vashem. This may not be psychologically healthy. Perhaps he should have considered grief counseling. On the other hand, Dagan was the head of the Israeli foreign intelligence service, the Mossad. He could look at it anytime he wanted during the day while he tried to figure out how to deal with Israel’s enemies.

One of the units under Dagan’s command was called “Kidon.” That’s the Hebrew word for bayonet. (Actually, it probably means “dagger” or “six inches of honed bronze” because Hebrew is a language from the many days ago before Bayonne even existed.) You go to Barnes and Noble, you’ll find a bunch of books about American snipers with 500 “kills” or some shit like that. Kind of FPSy if you ask me. I don’t think I’ve run across books about sticking a blade in somebody, feeling it grate on a rib, inhaling the coppery smell of blood, hearing the guy gasping for breath like it’s sex. Nothing FPS about that. Kidon typifies Israel’s response to terrorism.

After the 1972 Munich Olympics, Kidon launched “Operation Wrath of God.” (See: “Munich.”) The Israelis killed eleven PLO terrorists believed to have been involved in the attack. It took seven years. Apparently, they’re tenacious and patient.

At least once, in Lillehammer, Norway, they killed a complete innocent. In front of his pregnant wife. Apparently, they don’t get thrown off-track by remorse over errors.

After Hamas rose to prominence in the Gaza Strip in the early 1990s, it sent many suicide bombers into Israel. The Israelis didn’t take this lying down. In 1996 they palmed off a “burner” filled with explosives on Yahya Ayyash, the really talented chief bomb maker for Hamas; in 1997 they tried to kill Khaled Meshal, a Hamas leader, by injecting poison into his ear; in 2004 they killed the founder of Hamas, Sheikh Ahmed Yassin, with an Apache gunship; in 2008 they put a bomb in the headrest of a Hamas leader’s car in Damascus. In January 2010 they suffocated the chief contact between Hamas and Iran in his luxury hotel room in Dubai. Apparently, they focus on the enemy leadership. Just keep mowing the lawn.

When Hamas took full control of Gaza in 2007, it fired thousands of rockets into Israel. Israel responded by blockading Gaza: it will not allow in cement, steel, cars, computers, and lots of ordinary food; its navy will not let fishing boats proceed more than three miles from shore; it will not allow any Palestinians out of Gaza. From December 2008 to January 2009 Israeli forces bombarded the Gaza Strip. Anything big (police stations, factories, government buildings, schools, hospitals) got blown up; 1,300 people got killed; tens of thousands got “dishoused”—as the RAF used to describe the result of the area bombing of German cities. Apparently, they don’t care much about making a bad impression on world opinion.

At the same time, Israeli leaders began to talk about doing a deal with Syria for the return of the Golan Heights. Syria is the chief supporter of Hamas. Probably, the price of the Golan for Syria would include helping eliminate the ability of Hamas to engage in attacks on Israel—before the Syrians get back the Golan. (See: “Michael Collins.”) Apparently, they adapt to changing circumstances and will talk to their enemies.

So, tenacity, patience, focus, a thick hide to criticism, and adaptability are key traits. The enemy hasn’t gone away, but neither have the Israelis. They live with a long struggle.

What we learned from the report of the 9/11 Commission XII

On 12 October 2000, an al Qaeda team staged a suicide bombing against the American warship USS Cole while it was at anchor in the Yemeni port of Aden. The attack killed 17 American sailors.

Although the CIA “described initial Yemeni support after the Cole [bombing] as ‘slow and inadequate,’…the Yemenis provided strong evidence connecting the Cole attack to al Qaeda during the second half of November, identifying individual operatives whom the United States knew were part of al Qaeda. During December the United States was able to corroborate this evidence. But the United States did not have evidence about Bin Laden’s personal involvement in the attacks until Nashiri[1] and Khallad[2] were captured in 2002 and 2003.” (p. 278.)

The Yemenis arrested two of the surviving members of the Cole team; extracted from them the names and descriptions of Nashiri, their immediate commander, and Khallad, the liaison who came from Afghanistan; and suggested to the Americans (correctly) that Khallad was actually Tawfiq bin Attash. (p. 277.) Both Nashiri and Khallad were known to the Americans to have been involved in the 1998 embassy bombings, for which al Qaeda had claimed credit, and to be linked to al Qaeda. (p. 278.) An FBI special agent participating in the investigation recognized the name Khallad as someone described by an al Qaeda source as Bin Laden’s “run boy.” In mid-December 2000 the Americans’ al Qaeda source identified a photograph of Khallad obtained from the Yemenis as Bin Laden’s agent. (pp. 277-278.)

Moreover, the 12 October 2000 “attack on the USS Cole galvanized al Qaeda’s recruitment efforts.” [OBL ordered production of a propaganda video that highlighted the attack on the Cole.] “Al Qaeda’s image was very important to Bin Laden, and the video was widely disseminated… and caused many extremists to travel to Afghanistan for training and jihad. Al Qaeda members considered the video an effective tool in their struggle for pre-eminence among other Islamist and jihadist movements.” (p. 276.) [NB: Al Qaeda appeared to be claiming responsibility for the attack. How could the CIA still waver over identifying OBL as the originator of the attack on the Cole?]

In mid-November 2000 Sandy Berger asked Hugh Shelton to review plans for military action against Bin Laden. On 25 November 2000 Berger and Clarke wrote to President Clinton to inform him that the investigation would soon show that the Cole attack had been launched by a terrorist cell whose leaders belonged to al Qaeda and whose members had trained in al Qaeda facilities; the memo also sketched out a “final ultimatum” to the Taliban being pushed by Clarke. (pp. 280-281.)


[1] Abd al-Rahim al-Nashiri (1965- ). Saudi Arabian. One of the “Arab Afghans” who fought the Soviet Union in Afghanistan. Eventually aligned with Osama bin Laden. Captured by the CIA in 2002. Reportedly “waterboarded” during interrogation. Currently being held at Guantanamo.

[2] Walid Muhammad Salih bin Roshayed bin Attash (1979- ). Yemeni immigrant to Saudi Arabia. Another “Arab Afghan.” Became very close to Osama bin Laden. Captured in 2003.

What we learned from the report of the 9/11 Commission XI

Post-Crisis Reflection: Agenda for 2000.

In January, February, and March 2000 the NSC and others reviewed what lessons might be learned from the “millennium crisis.” They concluded that any effort at disrupting al Qaeda operations had to be undertaken in a more determined way henceforth and that domestic security had already been penetrated by “sleeper cells.” Action to deal with these problems was approved in a general way. (pp. 262-263.)

Various American delegations (including one by President Clinton which the security-conscious Secret Service loudly opposed) went to Pakistan in January, March, May, June, and September. The trouble is that the US had nothing to offer the Pakistanis as a reward for their co-operation: Congressionally-imposed sanctions prevented the government from offering anything of substance [and apparently the Clinton Administration did not want to brave the wrath of Congress by requesting a revision of relations with Pakistan]. (pp. 263-265.)

Richard Clarke seems to have been so focused on al Qaeda that he could not see the need for CIA assets to deal with other forms of terrorism, still less for a robust general intelligence capability. This led to bitter disputes between Clarke and the CIA leaders, who may have played the terrorism card as a budget ploy without fully appreciating how grave a danger America faced. (pp. 265-266.)

The executive branch didn’t get very far trying to tighten up border security, especially with regard to Canada.

By the end of 1999 or the start of 2000 the leader of the Northern Alliance, Ahmed Shah Massoud, wanted the US to line up as his ally in the struggle to overthrow the Taliban. Both Cofer Black and Richard Clarke wanted to do then what the US did anyway after 9/11. At the minimum, this would allow the CIA to put its agents into Afghanistan on a long-term basis, rather than relying on hearsay from the Northern Alliance and the “tribals.” The Clinton administration declined to forge such an alliance: the Tajik-dominated Northern Alliance represented the minority within Afghanistan and many of its people had very shady pasts. (p. 271.)

Meanwhile, CIA agents in Malaysia took the group of suspects identified by the NSA intercepts under surveillance, but failed to communicate departure information in a timely fashion when some of the men moved on to Bangkok, Thailand. CIA agents in Bangkok not only failed to arrive at the airport in time to tail the arriving suspects, they failed to learn that two of the suspects had left for the United States on 15 January 2000 until March 2000. CIA’s Counterterrorist Center did not inform anyone else (neither the State Department nor the FBI) of the arrival of the two suspects in the United States until January 2001, after the bombing of the USS Cole. (pp. 261-262.) As a result, the first two members of the 9/11 team arrived in Los Angeles on 15 January 2000, at the height of the “millennium crisis.” Although the two Arabs spoke no English, they attracted no recorded attention from Customs.

“Heading South” (2005, dir. Laurent Cantet)

 

International Tourism.

Romans used to go to Greece to acquire some “polish.” English noblemen used to send their sons on the “Grand Tour” for the same purpose. Between the wars Americans used to visit European war cemeteries to see where their “gallant Willy fell.” Today, tourism is big business: in 2010 there were 940 million international tourist “arrivals” someplace and the industry earned $919 billion. The USA earned over $100 billion from foreign tourists that year. Airlines, hotels, taxis, restaurants, tour-guides, museums, and sellers of hand-woven guitars all profited. (Unless you’re willing to rough it: learn to recognize foreign traffic signs, pick up some phrases from a guide book, and eat what you ordered by mistake even though it turned out to be a psychotropic carcinogen, the way the missus and I do. See: Mark Twain, The Innocents Abroad.)

 

International sex tourism.

Once upon a time, guys went to “the big city” for these purposes (see: Patricia Cohen, The Murder of Helen Jewett) or Mexico (see: JFK). Now, air travel allows people to zoom all over the earth for the same purpose. Try getting through the streets around the Rijksmuseum in Amsterdam to see the porcelain violins when a ferry-load of Brits shows up on a cheap-beer-and-expensive-sex outing and starts ogling the girls in lingerie sitting in the shop windows, many of whom are petting a cat in a bit of symbolic advertising.

Of course, most people aren’t beer-sodden British soccer hooligans, so there is a market in other parts of the world. Most of the destinations are naturally hot and sweaty places: Tunisia (where the “recent unpleasantness” has left a whole class of service workers on the beach in Speedos), Gambia, Kenya, Bali in Indonesia, Thailand, Brazil, and the Caribbean.

 

Female sex tourism.

Women started traveling for “romance” in the mid-19th century. (See: the novels of Henry James and E. M. Forster). In the first half of the 20th Century there are some pretty interesting stories of women charting their own course, although this often involves highly repressed Northern women falling for highly unrepressed (to put it mildly) Southern men. There’s probably some kind of message about life there. You never see books or movies about some Greek having an epiphany and deciding to pay his bills or go to work on time.

So, skip ahead to the aftermath of the Feminist and Sexual Revolutions of the 1960s and 1970s. Women got careers outside that of “homemaker”; women had a difficult time finding men who would accept them in their new roles (or do the dishes); marriages broke down at high rates or never got formed; and women had money. This meant that some women had one sort of success, but no significant other in their life. Result: female sex tourism blossomed (although hardly to the scale of male sex tourism). Anyway, that’s the belief. It is hard to find women who will own up to this. This makes me think that there may be a certain prurient motive behind the “exposés.” Like that “I can’t believe it’s not butter” guy on the cover of romance novels.

There are a few scholarly studies in The Annals of Tourism Research and a UC-Santa Barbara Ph.D. dissertation by April Gorry. Popular culture books and movies dealing with this supposed phenomenon include: the movie “Shirley Valentine” (1989); the novel by Terry McMillan and the movie made from it, “How Stella Got Her Groove Back” (1998); creepy Michel Houellebecq’s novel Platform (2002); and the movie “Heading South” (2005).

In voodoo, Legba is the “master of the crossroads” who controls access to the spirits.

Big discounts at the Organ Loft!

Popular culture side-swipes reality when it comes to organ theft. Organ theft is a “trope” (a recurring motif, AKA cliché) in many Japanese anime and manga, and in American comic books, video games, and television (C.S.I., Law and Order, Justified, Futurama). Examples:

Robin Cook, Coma (1977). The recent development of successful transplantation techniques suddenly creates an imbalance between the supply of and demand for organs, so a black-market arises. A deranged doctor in a Boston hospital induces comas in healthy patients undergoing minor procedures, then harvests the organs.

“Coma” (1978). The movie version of the book, directed by Michael Crichton.

1989: a Turk came to Britain, sold a kidney, got stiffed on the payment, and lied to the police that he had been robbed of a kidney. This is the origin of the urban legend about “I woke up in a bath tub full of ice…”

“Death Warrant” (1990). No one cares what happens to the inmates in maximum security prisons. An evil warden, corrupt guards, and a greedy doctor kill inmates to harvest organs for sale on the black market. The very institutions that guard us are actually criminal.

“The Harvest” (1993). Writer goes to Mexico, gets robbed of a kidney, tries to find the people responsible, partially succeeds, and then finds out that his boss has just had a transplant.

Christopher Moore, Island of the Sequined Love Nun (1997). Predatory missionaries.

“Dirty Pretty Things” (2002). Hard-pressed illegal immigrants in Britain sell organs.

“Sympathy for Mr. Vengeance” (2002). Hard-pressed South Korean factory worker sells a kidney to save his sister’s life, gets cheated, she dies, and he wreaks a bloody vengeance.

“Shichinin no Tomurai (The Innocent Seven)” (2005). Seven groups of abusive parents get an offer from a mysterious figure. They’re likely to either kill their kids or lose them to the child welfare people. Why not make a different kind of “killing” by selling the children so that their organs can be harvested? A week at a mountain vacation camp will close the deal. This may reflect Japanese discomfort with transplants, plus the Aum Shinrikyo terrorist cult.

Kazuo Ishiguro, Never Let Me Go (2005). Test tube babies + cloning = human spare tires for when you come down with some life-threatening disease. Your liver goes? Just pop one out of the “donor” you paid to have created many years ago. Now everyone can live to be 100! In the meantime, the future donors are raised in ignorance of their intended function.

“Turistas” (2006). The developed world has exploited the developing world for centuries. (See: Andre Gunder Frank.) Now it is time for reparations. A deranged doctor abducts gringo tourists who visit a remote beach resort. He harvests their organs, which are donated to the poor in a Brazilian hospital.

“Repo! The Genetic Opera” (2008). In the sinister future a big corporation supplies organs for transplant on credit. Transplant technology has progressed so far that you can get replacement intestines and spines. If you fall behind on your payments, however, the company sends around some guys to re-possess your implanted organ, just like your car or washing machine. The consequences aren’t the same as having your car or washing machine repoed, however. You die. The movie is a musical.

Eric Garcia, The Repossession Mambo (2009). Uses the same sinister future/big corporation/buy on credit/get repoed premise as above. Adds twists (bio-mechanical organs, people hiding from their creditors and being hunted by repo men) for product differentiation.

“Repo Men” (2010). The chop-socky movie version of Garcia’s novel.

“Never Let Me Go” (2010). The excellent movie version of Ishiguro’s novel.

Give my knees to the needy.

Organ transplantation.

In the 7th Century BC,[1] a Chinese physician named Bian Que tried transplanting the heart of a strong-willed commoner into the body of a weak-willed emperor.

During the late 19th Century surgeons finally developed the technical ability to conduct operations (knowledge of how the body functioned, anesthesia, antiseptics) and this made transplants possible. However, it took much longer to develop the ability to prevent rejection of the implanted organ by the body’s immune system. Thus, the transplanted hands in “The Hands of Orlac” (1924) weren’t. Lung (1963), liver (1967), and heart (1967-1968) transplants were “successful” in the sense that the patients lived for weeks to months after the operation. In the 1970s the development of the immunosuppressive drug cyclosporine finally permitted successful transplantation to begin. Since then transplants have become common: hearts, lungs, kidneys, livers, pancreases, hands, facial tissue, and bones have all been transplanted. No brains, yet.

The mismatch between donors and recipients.

Generally, there are more sick people in need of an organ than there are dead people with healthy organs for “harvesting.” While the growth of organ transplantation has extended many lives, people often die waiting for an available organ. National medical systems have developed ways of determining who gets priority.

However, there are two issues to bear in mind. First, national boundaries create barriers between donors and recipients. Second, as we have seen in so many other areas, great differences of wealth and income between different parts of the world lets buyers in rich countries get what they want in poor countries. People with money who want to jump the line can seek organ transplants abroad. One outcome of globalization has been to create a market in organs for transplant.

The global trade in organs.

Some Asian countries used to have a legal market in organs: India (until 1994), the Philippines (until 2008), and China (to this day) all allowed the legal sale of organs. Sometimes governments participate in this trade. An estimated 90 percent of the organs from China are taken from criminals executed in prisons. (They used to shoot them in train stations.)

There is also a thriving black-market in organs. The average price paid to a donor for a kidney is $5,000, while the average cost to the recipient is $150,000. When the Indian Ocean tsunami wrecked many fishing villages, about 100 villagers—almost all of them women—sold kidneys. According to one report, 40-50 percent of the people in some Pakistani villages have only one kidney. “It’s a poverty thing. You wouldn’t understand.”

Both the desire to circumvent the laws at home and the need to be close-by when an organ becomes “available” have stimulated “medical tourism.”

Finally, there is the alleged problem of “organ theft.” Given a shortage of voluntary donors, it has been suggested that some middle-men may turn to theft or murder. This is a common theme in horror movies and urban legend. It doesn’t have much truth behind it. Which isn’t the same as saying it doesn’t happen at all. “Hey buddy, can you give me a hand?”

[1] I can just see the Three Wise Men—one of them played by Buscemi—impatiently flipping through the calendar in 1 BC, marking off the days until Jesus would be born, trying to get a cheap flight, then getting told that Bethlehem’s inns are all booked solid: “Zoro-H-Aster! What are we supposed to do, stay in a manger?”

Climate of Fear X November 2014.

For twenty years China has been driving hard for industrialization. About 70 percent of all Chinese energy comes from coal. Chinese industry burns coal for fuel and Chinese apartment buildings are heated by coal-burning generators. China burns about as much coal as every other country in the world combined. The newly-affluent Chinese middle class buys cars. There are already 120 million cars and as many other motor vehicles spewing out exhaust.

Of the twenty most-polluted cities in the world, sixteen are in China. All sorts of ludicrous examples of the “How bad is…?” variety can be cited. During one recent bout of smog in Beijing, for example, a factory caught on fire and burned for three hours before anyone noticed the flames. This is at least as bad as that time the river that runs through Cleveland caught fire.

The health effects are awful. Over the last thirty years, Chinese lung cancer rates have risen by 465 percent. Thousands of people stream into hospitals complaining of breathing problems whenever air pollution becomes particularly bad.

The Chinese government turned a blind eye to this problem for a long time. Recently, they have found it much harder to pretend that killer smogs are just “heavy fog.” For one thing, foreigners don’t want to visit China if it just means that they’re going to feel like they’ve been working through two packs of Camels a day for twenty years. Tourism has fallen off and foreign businessmen don’t want to base themselves in China. For another thing, ordinary Chinese people are starting to complain. Since Tiananmen Square back in 1989 most Chinese have been cautious about demonstrating for democratic government. However, the environmental problems are pushing people into the streets for reasons other than a stroll in the park. One count estimates that there are 30,000 to 50,000 protests a year over clean air, clean water, and clean food.

The pollution problems have become so severe, and have generated such a measure of public unrest, that the government began preparing for a shift to nuclear power and renewable energy sources. Looking down-range fifteen to twenty years, they seem to have concluded that they would have to continue expanding the generation of electricity through carbon-burning while preparing for a transition to other forms of energy. Hence China’s commitment in November 2014 to reach peak carbon burning by 2030 and to draw 20 percent of its energy from non-carbon sources merely formalized existing policy.

Still, this commitment leaves a bunch of stuff (aside from ash particles) up in the air. How much energy will China require in 2030? Are they close to meeting their projected needs already? If so, then reaching peak could be a simple matter. What if they’re only at the half-way mark? Is there any quantitative value assigned as the Chinese peak? Or do the Chinese just get to expand carbon burning as fast as they can until 2030, while also expanding non-carbon energy sources to 20 percent of whatever the total peak turns out to be? Will China be building nuclear power plants and solar collectors at a rapid pace for decades to come? If the Chinese government is responding now to public unhappiness with pollution, how will it respond in the future to public unhappiness with either slowing economic growth or trying to transition away from a major industry?

 

“The face-mask nation,” The Week, 15 November 2013, p. 9.

Henry Fountain and John Schwartz, “Climate Pact by U.S. and China Relies on Policies Now Largely in Place,” NYT, 13 November 2014.

Climate of Fear IX November 2014.

India is bound to be a big loser from global climate change. The air pollution in Delhi is worse than that in Beijing; sea-level rise could forcibly displace 37 million Indians by 2050; and water for farmers could be affected by accelerated melting of glaciers in the Himalayas or disruption of the monsoons. So, India has a deep interest in limiting climate change. However, India is also one of the principal forces causing climate change.[1]

Burning coal for generating electricity is central to India’s strategy for economic development. The country has huge coal deposits (the fifth largest in the world), but little oil or natural gas. Consequently, India launched a ten-year plan for building coal-burning generating plants back in 2009. Generating capacity has already expanded by 73 percent. In 2013 India burned 565 million tons of coal. Most Indian coal has a high ash-content, so it pollutes more than do some other commonly used types of coal. This makes India the third-largest emitter of greenhouse gases. By 2019 the government plans to burn more than a billion tons a year. “India’s development imperatives cannot be sacrificed at the altar of potential climate changes many years in the future,” the government’s Minister of Power has asserted.

It will be difficult to argue that India should adjourn its plans for development. Three hundred million Indians have no electricity at all, and many more have it only in fits and starts. On a per-capita basis, Indians consume only one-fourteenth as much electricity as do Americans. In a country with hundreds of millions of people living in grotesque poverty, making do with less isn’t much of an option. Electricity powers industry and industry raises incomes.

India’s coal-fired industrialization effort alarms environmentalists elsewhere. “If India goes deeper and deeper into coal, we’re all doomed,” said one climate scientist at the Scripps Institution of Oceanography. There isn’t much ground for expecting push-back by Indian environmentalists. For the most part, Indians seem to accept both air pollution and the physical displacement of populations in the countryside to make space for more coal mines. The environmental movement in China seems to have more support behind it and, therefore, more influence with the government than is the case in India.

Nuclear power and solar generation offer alternative energy sources. A lot of Western India is cloudless for much of the year, so a lot of solar energy reaches the ground. The government of Narendra Modi has said that it will launch a program of constructing solar-energy plants. Whether this can be carried forward fast enough and on a large enough scale to replace India’s reliance on coal is hard to tell.

So, that’s a problem. Still, China currently burns as much coal as every other country in the world combined. Can India’s coal-burning really pose more of a problem than does that of China?[2] The recent agreement between the United States and China called for China to cap its greenhouse gas emissions before 2030. The Chinese may continue to shovel on the coal until then, but they also might begin to shift from a reliance on coal to other energy sources. If that comes true, it will be a lot more significant for the climate than is India’s continuing development of coal. If the rest of the world moves in one direction, then India might find a way to follow. There’s a couple of big “Ifs” there. Still, the prospects look better than they did a little while ago.

[1] Gardiner Harris, “Coal Rush in India Could Tip Balance of Climate Change,” NYT, 18 November 2014.

 

[2] China produces 46 percent of the world’s coal and imports more; India produces 7.7 percent of the world’s coal, but has been developing its own reserves because of the cost of imports. See: “Climate of Fear IX.”

The economic mess and policy.

Median income, adjusted for inflation, is about $3,600 less than when President George W. Bush entered the White House and about $2,100 less than when President Obama entered the White House. America has not recovered from the “Great Recession.” We are rolling up on fifteen years of falling incomes after a long period of rising incomes. In contrast, upper income groups are seeing their wealth and incomes rise. Something is wrong.

What do economists suggest about reviving economic growth? They suggest improving education because America has lost its one-time enormous lead over other nations in terms of human capital. They suggest improving our crumbling infrastructure because roads, bridges, airports, and telecommunications are all falling behind needs. They suggest sorting out the messy tax code to reduce distortions in economic activity. They suggest cutting the cost of health care, which drags on the economy and cuts down money wages.[1]

The problem with these sorts of policies is that they will take a long time to play out, have an uncertain effect, and are complicated to understand. Hence, both sides look for nostrums that look good on a bumper sticker. For Republicans, the solution tends to be cuts in taxes on high income-earners and corporations. These are the “job creators.”

What do the Democrats want to do to raise stagnant incomes among middle-class “workers”?[2] Well, they haven’t done much for quite a stretch so far as voters can tell. It should surprise no one if lots of them sit out an election. To counteract this trend, Democrats have adopted the cause of a higher minimum wage. In the near future they may turn to a “middle-class tax cut.” It seems most likely that this “cut” would actually take the form of “tax-credits.” These could be presented as tax incentives to save for retirement or for college education. Democrats favor paying for these cuts through higher taxes on upper-incomes. This would be popular with most Americans, who want more money for themselves and resent wealthy people.

How likely is this to happen? In one sense, very likely. The anti-tax frenzy that has gripped America for several decades has led to all Americans paying lower taxes than the historical trend since the Second World War. President Obama was happy to make most of the Bush-era tax cuts permanent.

In another sense, very unlikely. Such policies would have to pass through the House of Representatives. According to one analysis, the House is almost certain to remain in the hands of Republicans for the next decade. Only 28 of the Republicans’ 244 House seats are in districts that voted for President Obama in 2012. The Democrats now hold 188 seats. If all of those seats were moved from Republican to Democrat candidates, then the two parties would tie in the House. Such a shift is very unlikely, given the advantages of incumbents and the unreliable turn-out among Democratic voters. For the last decade American politics has see-sawed between Republicans and Democrats, but what Americans seem to like is a divided government that can’t accomplish anything.

David Leonhardt, “The Great Wage Slowdown, Looming Over Politics,” NYT, 11 November 2014.

Nate Cohn, “The Enduring Republican Grip on the House,” NYT, 11 November 2014.

[1] In fact, health care costs have stopped rising and in some cases have fallen. The reasons for this are subject to debate. It seems unlikely that the Affordable Care Act has anything to do with this—yet.

[2] OK, I’ll leave aside the whole issue of how “workers” used to mean “blue-collar.” Don’t want to suggest that America is really confused about the whole issue of social class.

 

Islamism as a story.

The current theater of operations for ISIS lies in the midst of ancient and modern historical places. On the one hand, Tel Megiddo, in northern Israel, is the place identified with Armageddon in the Bible’s Book of Revelation. Farther north, in Syria, Dabiq appears in the Hadith as the village where the armies of Islam and Christendom will fight a final, decisive battle. Dabiq is near the Syrian-Turkish border. In the summer of 2014 it fell to ISIS forces. In July 2014, during its own “surge” in Iraq, ISIS began publishing an on-line magazine called “Dabiq.”

On the other hand, it is commonplace for people in the Arab states to explain the decline from earlier Muslim power and prosperity by blaming Western intervention and exploitation.[1] Islamists extend this narrative. Islamists celebrate the breaking of the grip of the Byzantine Empire on Syria and Palestine, and the conquest of “al-Andalus” in the 7th and 8th Centuries. The Abbasid and Umayyad caliphates are held up as the ideal for what the Islamists hope to create. Similarly, the Medieval Crusaders are analogized to contemporary Western states.

The American invasions of Afghanistan in 2001 and of Iraq in 2003 certainly gave the proponents of this view a lot of material with which to work. Young Islamists have mastered modern social media just as well as young people everywhere else. Al Qaeda led the way by launching a media campaign: audio cassettes, DVDs, and Internet forums preached the Islamist interpretation.

Recognizing that people like Anwar al-Awlaki[2] had played a role in fomenting and recruiting for terrorism, in 2011 the United States Department of State created a Center for Strategic Counter-Terrorism Communications (CSCC). One chief function of the CSCC is to engage in on-line debate with Islamists. The goal here is to dissuade young people from supporting or joining Islamist groups.[3] The CSCC has a Digital Outreach Team with members working in Arabic, Urdu, Punjabi, Somali, and English.

The means to the goal is to propose a different narrative of history than the one upheld by many Muslims. The CSCC’s counter-narrative focuses on recent history, rather than on a more remote past. It emphasizes the tolerance of pre-Islamist Muslim society. This view clashes with both the restrictions imposed under the Islamists’ version of sharia and the brutality with which they are enforced.

The question, not much addressed by Western scholars or journalists or counter-propagandists, is why the messages of either an “End of Days” or a revival of the Caliphate appeal so strongly to thousands of young Muslims. What are they missing about motivation?

 

Shatha Almutawa, “Historical Narrative in American Counterterrorism Operations,” American Historical Association, Perspectives, September 2014, pp. 12-13.

Noor Malas, “Ancient Prophecies Motivate Islamic State,” WSJ, 19 November 2014.

[1] This explanation ignores the pervasive weaknesses of Medieval Arab society that exposed the region to conquest by successive waves of Muslim Turkish tribesmen, followed by the long decline caused by the decay of the Ottoman Empire. Western imperialism had a much briefer period of influence. Not all of those influences were negative. However, the performance of the post-independence Arab states contrasts badly with those of other “developing” societies.

[2] See: “Just like imam used to make.”

[3] One might be forgiven for believing that another purpose is to draw them out so that their other communications can be tracked by the NSA. I’m all for it, but it could lead to “getting flamed” for some hasty remark—by a drone.