Annals of the Great Recession XI.

I saw the Iraq War as an obvious act of stupidity even before we attacked in the spring of 2003. So, in 2008, I voted for the candidate who had opposed it, Barack Obama. I voted for him in spite of his obvious weaknesses: he was as green as grass in politics, he had never run anything, he hardly knew anyone in Washington, and he had some dopey ideas. My assessment of President Obama’s failings is amply borne out by Ron Suskind’s scathing account of how the President and his advisers made economic policy in the first two years after he reached the White House.[1]

Undoubtedly, Obama inherited an economic disaster from the George W. Bush Administration. However, his background and range of contacts left him ill-positioned to deal with the immense problems on his plate. First, the president believed in the power of rhetoric; he almost seems to have believed that talk and action were identical. Supporters have argued that he’s the first president in a while to speak in full sentences and paragraphs, and that such speech doesn’t mesh well with sound-bites. In reality, the trouble was that much of his discourse seemed to have been picked up in Chicago rec-league basketball. He disses people who disagree with him.[2]

Second, the president turned out to be a poor judge of people and had few close advisers to keep him from going into the ditch at the first opportunity. Rahm Emanuel, who served as his first chief of staff (and who recently squeaked through to re-election as mayor of Chicago), and Lawrence Summers, who headed his National Economic Council (after a stint as President of Harvard that ended when he vexed the faculty), were imperious, abrasive men who rubbed people the wrong way as a first order of business in any meeting. Tim Geithner, his first Secretary of the Treasury, was consistently suspected of mouthing the Wall Street view.

Third, unlike his immediate predecessor, President Obama could not pull the trigger on any issue. Instead of deciding, he sought consensus. Endless debates went on, but the President refused to choose one option and then to say “it’s my way or the highway.” Who ever crossed Richard Nixon without landing on the sidewalk with his suit in tatters? It’s a short list.

Many of his own subordinates saw through him from the start. Famously, Lawrence Summers, the head of Obama’s National Economic Council, told another official: “We’re home alone. There’s no adult in charge. Clinton would never have made these mistakes.” Geithner has been accused of outright insubordination, but stayed at Treasury as long as he chose.

The “friendly opposition” within the Democratic Party would argue that, after the rough ride of his first two years, Obama began to understand how things should operate. He got rid of his early hires and started to make decisions. So they say. With a year and change left in his second term, it isn’t clear that much has changed.

Still, what was the bigger disaster for America: Obama’s mismanagement of the economy or the Iraq War? Somebody in Washington needed to get drilled for the Iraq War, not just the men and women who fought there. John McCain and Hillary Clinton had to pay a price at the voting booth. What are we supposed to do? Let bygones be bygones after each new train-wreck engineered by the usual suspects who populate American politics?

Finally, has Obama learned anything? The answer to that question goes to the credibility of the Iran deal.

[1] Ron Suskind, Confidence Men: Wall Street, Washington, and the Education of a President (New York: HarperCollins, 2011).

[2] See: “Stuff my president says.”

Annals of the Great Recession X.

Jeff Madrick, once a New York Times economics columnist, argues that America’s economic decline can be traced to the 1970s.[1] Beginning with the Great Depression, governments had imposed tight controls on American financial markets. These controls had made banking boring. That was a good thing for anyone who had ever lived through a financial panic (or watched that scene in “It’s a Wonderful Life” where Jimmy Stewart abandons his honeymoon to save the savings-and-loan when the town bank collapses). However, it also restricted the opportunities for profit in one sector of the economy. Economists at the University of Chicago, inspired by the writings of Milton Friedman, pushed an “extreme free-market ideology.” Embraced by greedy financial industry leaders, then by the Republican Party in the era of Ronald Reagan and later by Democrats as well, these ideas led to the de-regulation of the American financial industry. “And Hell followed with him.”[2]

Reckless lending became ever more deeply entrenched among bankers. Successive crises of ever greater severity sprang from these practices: wild real estate speculation in the 1980s; the Latin American lending binge; the “dot.com” bubble; and then the nightmarishly complicated real estate investments that ended in the financial crisis of 2007-2008. The government repeatedly had to step in to bail out reckless bankers in order to avert an even worse disaster for the whole economy. Thus, profits were privatized while losses were socialized. Not exactly what Milton Friedman had in mind.

To a historian, some of Madrick’s argument appears kind of rickety. He, among others, appears to believe that the American prosperity and global economic domination of the period from 1945 to 1975 was somehow “normal.” However, it is at least equally possible to regard this situation as “abnormal” and bound to end. The financial industry (or just “greed”) can hardly be blamed alone for the complex changes that have undermined America’s position.

Then, for purposes of analysis and argument, he separates out free-market thinkers and financial industry leaders. However, in real life they existed within and responded to an evolving context of beliefs and influences. Thus, the inflation of the Vietnam years intersected with government regulations on the interest banks could pay depositors. People wouldn’t keep money in banks unless they could get a higher rate of interest. So, the interest rate regulations had to be relaxed.

Then, he appears to believe that “greed” drives the elite, but that the same behaviors by people lower on the income pyramid are unexceptionable. In 1970, 381 major strikes hit American companies as workers drove for higher pay and benefits at a time when foreign competition had begun to exert heavy pressure.[3] Why is one act greed, the other not?

Then, a lot of the American economy was deregulated from the 1970s on. Take airlines as an example. Between 1980 and 2009, inflation-adjusted air fares fell by fifty percent.[4] Air travel increased, but crashes did not. (I’ll grant you that the air travel experience now reminds me of a trip I once took on a Mexican bus.)

So, the part is not the whole. Blanket statements about regulation don’t get us very far.

[1] Jeff Madrick, Age of Greed: The Triumph of Finance and the Decline of America, 1970 to the Present (New York: Alfred A. Knopf, 2011).

[2] Revelation 6:8.

[3] See: “American union, stay away from me uh.” March 2015.

[4] See: http://www.theatlantic.com/business/archive/2013/02/how-airline-ticket-prices-fell-50-in-30-years-and-why-nobody-noticed/273506/

An Israeli Dilemma.

In 1958, Leon Uris wrote Exodus. While portraying the birth of the state of Israel, he imagined an Israel-yet-to-be: a secular, socialist-inspired Jewish state living on terms of amity with the Arabs. Today, Leon Uris’s vision seems far-fetched.

In the Six-Day War of 1967, Israel overran the Gaza Strip, the Sinai Peninsula, the West Bank, and the Golan Heights.

What to do with the conquered territories? The Sinai was traded away in exchange for peace with Egypt. The Syrians lacked the strength to take back the Golan, even before the current massive uprising against the Assad regime. Gaza and the West Bank, however, were chock-full of Palestinian refugees from the creation of Israel in 1948. One of the founding illusions of Zionism had been that Palestine was “a land without people for a people without land.” In 1948, many Muslims had fled the fighting, or had answered an appeal from Arab leaders to clear the path for Arab armies, or had been driven out by Israelis by means of exemplary massacres. They had never been allowed to return. Now Israel had overrun the places where the Palestinians had taken refuge.

What course would Israel follow? One option would have been to create a Palestinian state consisting of the West Bank and the Gaza Strip.[1] Another option would have been to extend Israel’s territory into the newly conquered lands. The loudest exponents of this second policy were ultra-Orthodox religious zealots, and it was the course Israel pursued through the creation of settlements. Settlements—both “legal” and illegal under Israeli law—began to proliferate. Many successive governments turned a blind eye to the settlements and to their disastrous impact on Israel’s international situation. While Israel has not—yet—annexed the West Bank and has withdrawn from Gaza, both the settlements and the inferior legal status of the Palestinians living under what is effectively Israel’s rule give the country something of the appearance of Prussia on the Jordan.

Why? Leftist critics argue that the country has come to be dominated by right-wing voters who pander to religious parties and are deeply hostile to the Arabs, both Israeli-Arabs and the Palestinian Arabs; and that military officers lean ever more toward Orthodox Jews who have a right-wing political bent.[2] Implicitly, a return to Israel’s leftist roots would facilitate a solution to the problems facing the country.

Attractive though it is, this interpretation ignores some realities. A return to its roots by Israel will not undo the radicalization of opinion among many Muslims, whether Palestinian or not. Hamas displaced the elected Palestinian Authority from control of Gaza, then turned the enclave into a base for attacks on Israel.[3] Hamas does not accept Israel’s right to survive, regardless of where its borders are drawn. In the future, Hamas could come to dominate the West Bank, or at least win toleration from a sovereign Palestinian government there. In Lebanon, Hezbollah is an Iranian client. Egypt is teetering on the brink of a civil war between Islamists and authoritarians. It isn’t entirely clear what kind of inroads ISIS could make in Jordan. It would be hard for any military adviser to argue that Israel should surrender strategic depth in return for promises of future peace. Not all problems have solutions.

[1] Between 1948 and 1967, this had been an option available to Egypt and Jordan. However, it appears that Egypt and Jordan were more interested in maintaining the Palestinians in misery as a stick with which to beat Israel in the square of international opinion than they were in actually creating a Palestinian homeland. Now the ball was in Israel’s court.

[2] Gershom Gorenberg, The Unmaking of Israel (New York: HarperCollins, 2011).

[3] Last summer’s war began over rockets fired into Israel, but Israel’s response soon uncovered a network of tunnels driven into Israel for what could only be offensive purposes.

American Opinion and the Confederate Battle Flag.

In the 1950s and 1960s the Civil Rights movement reached one of its peaks. American public opinion turned against segregation, overt racism, and the violent defense of white dominance. This peak also coincided with the centennial of the Civil War. I haven’t seen (but maybe I haven’t looked hard enough) much scholarly work on how white Southerners sought to commemorate the “American Iliad.”[1] Were little Confederate flags placed on the graves of veterans in cemeteries? Were there speeches on “Confederate Memorial Day”? Were more streets and highways named for Confederate generals? In any event, I conjecture that a Civil War Movement arose to counter the Civil Rights Movement. One aspect of that appeared in laws incorporating the Confederate battle flag into the state flags of some Southern states or mandating the display of the flag on government buildings.

Fifty years later, much had changed. In late June and early July 2015, the vast majority of Americans (64 percent) opposed having the Confederate flag fly over public buildings, while 21 percent thought that the flag should be allowed to fly over public buildings and 21 percent weren’t sure.[2] Most of those who favored flying the Confederate flag over public buildings, however, lived in Southern states. Two weeks later, a majority (57 percent) of Americans accepted that the Confederate battle flag is a symbolic expression of “Southern pride” rather than a racist affirmation. Even so, a majority of Americans still supported hauling down the flag on public property. Among the majority viewing the flag as a symbol of Southern pride were 75 percent of Southern whites; by contrast, 75 percent of Southern blacks saw it as chiefly a racist statement. Deep divisions exist in the South over the Confederate flag.[3] Still, lots of Southern whites appear to recognize that what is a symbol of pride to them is also deeply offensive to African-Americans. (See the statement by South Carolina governor Nikki Haley.) This might suggest an important, but hard to define, psychological shift among Southern whites. Opinion polls, though, don’t always dig very deep. What did the other 25 percent of Southern whites believe about the flag, that it was a racist affirmation? If so, did they like that or did they hate it?

Why does “Southern” appear to mean “Southern and white”? Is there a regional culture shared by whites and blacks? Looking at Farm Security Administration photographs from the Thirties and Forties might lead you to think so. See the remarkable on-line exhibition at: http://www.loc.gov/exhibits/bound-for-glory/ So might the history of Zydeco.[4] See: https://www.youtube.com/watch?v=fa8vyTfugcI Shooting people in church might be beyond the pale in such a shared culture. Or perhaps it awakens memories of a bombed church in Birmingham, Alabama, many years ago.

There is no question of the Confederate flag flying over federal buildings, but each state has the right to choose what flags fly on state government grounds. Another problem left to later generations by the Founding Fathers. What did they expect us to do, figure it out for ourselves?

[1] Charles Roland, An American Iliad: The Story of the Civil War, 2nd edition (Lexington: University Press of Kentucky, 2004).

[2] “Poll Watch,” The Week, 3 July 2015, p. 17.

[3] “Poll Watch,” The Week, 17 July 2015, p. 17.

[4] See: https://en.wikipedia.org/wiki/Zydeco

Power Surge.

In 2003 the United States attacked Iraq. Swift defeat of Iraq’s conventional forces then gave way to misstep after misstep. An insurgency arose among the minority Sunnis deposed from their long dominance by the American invasion. The Shi’ite majority demanded that the Americans leave as soon as possible so that they could get to the business of governing the country and settling scores. Al Qaeda in Iraq sought to foment a civil war that would make Iraq ungovernable and force an American evacuation. Foreign fighters poured in to serve with Al Qaeda. By 2007 a disaster of epic proportions loomed before the Americans.

Then things began to turn around. Lower-level American commanders began buying off Sunni insurgents in their areas of operation. Many of the Sunni insurgents got fed up with the Al Qaeda fanatics. Together, these forces led to the “Sunni Awakening” that markedly reduced the level of violence from early 2007 onward. American special operations troops focused their efforts on killing the Al Qaeda fanatics. Civilian and military casualties began to fall. If only in comparison to the chaos visited on the country between 2003 and 2007, Iraq began to move toward something like a functional state.

Then and later, considerable energy has gone into myth-making about the turn-around. The commonly accepted—because commonly told—narrative is that the Bush administration belatedly developed a coherent and workable plan for victory in Iraq; General David Petraeus helped develop and then implemented an effective counter-insurgency strategy; and the “surge” of troops greatly enhanced security so that some form of national reconciliation could take place.

In fact, the Bush White House’s “National Strategy for Victory in Iraq” had neither substance nor application. “Victory” had been redefined to mean bringing down the level of violence to a point where responsibility could be handed off to the government of Iraq with something approaching a straight face.

In fact, the administration and the Pentagon were in search of a “hero” to revive American morale. They found that hero in General Petraeus, adept at both war and image-management. The buying-off of Sunni insurgents and the increasingly effective work of the special operations forces were well underway before General Petraeus arrived in Iraq. He endorsed and broadly applied the methods already developed.

In fact, although Chelsea Manning has been sentenced to 35 years in prison for giving classified documents to WikiLeaks and Edward Snowden has been hunted across the globe for having revealed NSA spying on Americans, officials in the Defense Department appear to have provided favored authors with a trove of classified documents.

In fact, what the United States achieved in Iraq was not victory, but avoiding defeat. By avoiding defeat the United States has also avoided any honest reckoning with the causes and consequences of a disastrous adventure that spanned two different presidential administrations. “The fraud is that a 20 year military effort to determine the fate of Iraq yielded something approximating a positive outcome.”

So says Andrew Bacevich, a professor of political science at Boston University, a combat veteran of Vietnam, a retired Army colonel, a grieving father to a son killed in action in Iraq, and a bitter and clear-eyed critic of recent American foreign and military policies.[1]

[1] Andrew Bacevich, “Avoiding Defeat,” New York Times Book Review, 10 February 2013, pp. 20-21. Professor Bacevich reviewed Michael R. Gordon and Bernard Trainor, The Endgame: The Inside Story of the Struggle for Iraq, From George W. Bush to Barack Obama (New York: Pantheon, 2012); and Stanley McChrystal, My Share of the Task: A Memoir (New York: Penguin, 2013).

Terrorism 1.

How long will the current war against radical Islamism continue? Can we win? How will we know when/if we have won? These questions don’t get much discussion, so preoccupied are we with each surprising outbreak of insurgency and atrocity. Probably, government officials in democracies are not eager to tell the public that this could go on for a lot longer than the next election cycle. Back in 2009, two books offered counsel that still deserves attention.[1]

David Kilcullen saw a core struggle between radical Islam, on the one hand, and the Unbelievers in the West and Incorrect Believers in many Muslim countries, on the other hand. Swirling around both parties to the core struggle were many local movements that associate themselves in name with radical Islam (Al Qaeda then, ISIS now, something else in the future). The strength and the staying power of the local insurgencies vary greatly. Kilcullen thought that the Western countries had a pretty good sense of how to wage the core struggle against radical Islam, even if they botched the execution from time to time. Where they came up short was in managing the peripheral small wars. Indeed, having the local insurgencies pop up seemingly out of nowhere is one of the things disturbing the public in the West. More recently, the “lone wolf” attacks in Britain, Canada, France, and the United States add to this unease.

According to Michael Burleigh, history tells us that we can and–almost certainly will—win. Terrorism has come and gone in waves: in the 19th Century, they were Irish Fenians, Russian revolutionaries, and European anarchists; in the later 20th Century, they were malcontent leftists in advanced countries (Weathermen, Red Brigades, Red Army Faction, IRA, ETA) and Third World rebels (PLO, South Africa); today they are radical Islamists (Chechens, Al Qaeda, ISIS). Wherever they go, the terrorists have left a trail of dead, maimed, and traumatized victims. In most cases, however, they had little in the way of concrete political achievements to show for their work.

How to defeat these threats? Focusing on the peripheral wars and insurgencies, Kilcullen recommends policies that protect local communities in remote areas from being penetrated by radical movements. This, rather than heavy hammer blows from the military, is most likely to stop an insurgency in its tracks. Problems abound with this solution. A lot of the world’s people live in small communities remote from central government authority. Who can tell where the next danger will arise? Is every Middlesex village and farm to be garrisoned “just in case”? Then, most armies train for conventional war against foreign states or for repression of dissent in unjust societies, not for policing or community protection.

Here, Michael Burleigh has some equally useful suggestions. Focusing on the core struggle, Burleigh argues that experience points to the best path forward: winning the ideological debate through public diplomacy, promoting economic development to drain the swamp of poverty that contributes to radicalization, and developing intelligence capabilities before relying on brute force. Burleigh’s strategy provides the framework for Kilcullen’s tactics. However, long debates in many languages on social media, nudging countries toward social justice and economic modernization, nurturing good governance in countries suspicious of Western meddling, and building language skills and cultural competence in intelligence agencies are all going to take time. We’re in for a long war. People need to know this harsh truth.

[1] David Kilcullen, The Accidental Guerrilla: Fighting Small Wars in the Midst of a Big One (New York: Oxford University Press, 2009); Michael Burleigh, Blood and Rage: A Cultural History of Terrorism (New York: HarperCollins, 2009).

Man Hunters.

Before the Second World War the United States possessed intelligence-gathering organizations that were derisory in comparison to those of the great powers. The War Department gathered information on the military capabilities of foreign states from military attachés; the State Department reported on political and economic developments; both War and State maintained signals intelligence (code-breaking) offices. However, the US possessed no “secret intelligence service” equivalent to the British MI-6 or the action services of other countries. During the Second World War, the US sought to make good this deficiency with the temporary Office of Strategic Services (OSS). After the Second World War, America’s new global role and the Cold War demanded an enhanced intelligence-gathering capability. In 1947, Congress created the Central Intelligence Agency (CIA) to fill this role. Filled with wartime OSS veterans, the new agency had a predisposition to clandestine action, not just to intelligence gathering. Confronting the brutal Soviet KGB around the globe, CIA played a rough game. Eventually, CIA fell afoul of changed national values. The Church Committee hearings led to restrictions on CIA actions such as assassination. From the mid-Seventies onward, CIA concentrated on conventional intelligence-gathering and analysis.

Then came 9/11.[1] The scales fell from their eyes, or they had a Road to Damascus experience, or whatever other Biblical reference occurs to you. An executive order from President George W. Bush overturned the limits on action. CIA agents lashed out at Al Qaeda operatives wherever they came within reach. Some were killed, either by a rapidly expanded paramilitary arm of CIA or by drone strikes. Some were captured and subjected to “enhanced interrogation.” In 2003, the US attacked Iraq, only to see early triumph turn into a gory insurgency that seemed to have no end. Soon, there came a backlash against both big wars and the use of torture.[2] A new consensus emerged: killing terrorists is acceptable, but torturing them is not. Certainly, it is less likely to get people keelhauled by a Congressional committee. According to Mark Mazzetti, CIA “went on a killing spree.” Drones and commandos struck Islamists[3] in Afghanistan, Pakistan, Yemen, and Somalia. While banning the use of torture, President Barack Obama has continued all the other programs begun by the Bush administration.

Arguably, the results have been as disastrous, if not quite so dramatic, for American intelligence as for the Islamists hit by Hellfire missiles launched from Predator drones. In an Econ 101 analysis, multiple needs compete for finite resources. Resources (money, manpower, attention) spent “man-hunting” can’t be devoted to other needs. Yet the US faces multiple current, latent, and potential threats.

The CIA already suffered from maladaptation between the end of the Cold War and 9/11. Its budget fell as part of the “peace dividend”; spending on new technologies further reduced the resources for human intelligence-gathering and analysis; and its former strengths in Soviet and East European issues could not easily be shifted to new areas. (Pashto and Polish both begin with a P, but there the similarity ends.)

America’s political culture is having a hard time discussing the choice between long-term trends and immediate action. The recent murder of five servicemen by what looks like an Islamist “lone wolf” will only make “man-hunting” seem more vital than ever.

[1] Mark Mazzetti, The Way of the Knife: The CIA, a Secret Army, and a War at the Ends of the Earth (New York: Penguin, 2013).

[2] In 2004, CIA’s Inspector General condemned some of the practices as “unauthorized” and “inhumane.”

[3] Including the occasional American renegade who declined to surrender himself to more formal American justice.