Annals of the Great Recession IV.

Recession and recovery are supposed to follow a pattern.[1] Recessions lead to higher unemployment; recovery leads to higher employment. Thus, during the 1990-91 recession, the share of the working-age population with a job or looking for one fell from just under 67 percent to just over 66 percent. Labor force participation then rose to over 67 percent by 2000. Since 2000, however, the pattern has changed. Between the 2001 recession and the recession of 2007-2009, labor force participation trended downward from 67 percent to just over 66 percent. Since that recession began, the labor participation rate has declined even more rapidly. By September 2014 the rate had fallen to 62.7 percent, and it had not recovered by the end of the year. New entrants to the job market get absorbed, but many of the long-term unemployed remain out of the labor force.

Where did the missing 3-4 percent of the potential labor force go? Many of them retired permanently. We can see here the leading edge of the “baby boom” taking up the rocker on the front porch. For people born between 1950 and 1954, getting laid off in the recession just meant a slightly early retirement. It probably doesn’t make sense for them to fight their way back into a job in order to work for another year or three. Fewer than 20 percent of those over 65 are still in the work force.

In addition, psychological fragility has replaced resilience as an American character trait. At least, that’s the idea you could get from some economists’ explanations. “Labor market scarring” of workers seems to reflect a belief that job-hunting is a traumatic experience. The unemployed would rather adapt by other means. They move in with aged parents to provide care; they file for disability under the currently easy conditions for gaining it; they probably do a good deal of work off the books; and they’re not going to leave anything to their kids.

What are the effects of their not working? The Federal Reserve wants to sustain low interest rates until the labor participation rate rises back to “normal.” What if the current rate is the “new normal”? Either way, an awful lot of productive labor is going to waste. It sets a ceiling on the growth of the economy. Fewer workers paying taxes tightens the screws on federal revenue.

The trend toward a lower over-all labor market participation rate masks other changes.[2] The female labor participation rate has fallen from about 74 percent in 1999 to about 70 percent today. Assuming roughly equal numbers of working-age men and women, one could conjecture that if over-all labor participation was about 67 percent in 2000 and the female rate about 74 percent, then the male participation rate would have been about 60 percent. Similarly, if the over-all rate is about 63 percent today and the female rate is about 70 percent, then the male rate would be about 56 percent.
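A minimal sketch of the arithmetic behind that conjecture, assuming (as the estimate implicitly does) that working-age men and women are present in roughly equal numbers:

\[
\text{overall} \approx \tfrac{1}{2}(\text{male}) + \tfrac{1}{2}(\text{female})
\quad\Longrightarrow\quad
\text{male} \approx 2(\text{overall}) - \text{female}
\]
\[
2000:\ 2(67\%) - 74\% = 60\%, \qquad \text{today}:\ 2(63\%) - 70\% = 56\%.
\]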

Certainly, the labor participation rate for men has been trending downward since the 1970s.[3] Back in the 1950s and 1960s, only 10 percent of men of working age were not in the labor force. Another trend masked by the over-all data is the shift of better jobs toward women. That trend springs from the shift away from manufacturing (traditionally male work) toward a knowledge and service economy which requires more education. Men are less likely, women more likely, to stick with school. The quality of jobs held by women has steadily improved.

There’s an old joke about a guy in Maine who lost his job. A friend asked him how he was going to get by. The man replied, “Well, the TV works and the wife works.”

[1] Josh Zumbrun, “Labor-Market Dropouts Stay on the Sidelines,” WSJ, 29 December 2014.

[2] David Leonhardt, “The Distinct Geography of Female Employment,” NYT, 6 January 2015.

[3] In the 1970s the “oil shocks” disorganized the economy and foreign economic competition first became a serious challenge.

War Movies 8: “American Sniper.”

Chris Kyle (1974-2013) had a rare talent for shooting, joined the Navy SEALs at the beginning of global terror’s war on us, did four tours in Iraq as a sniper, wrote a book about his experiences, and was killed by a disturbed military veteran he was trying to help.

Warner Brothers bought the movie rights to the book and signed Bradley Cooper to star. First, David O. Russell (“The Fighter” (2010), “Silver Linings Playbook” (2012), “American Hustle” (2013)) was going to direct; then Steven Spielberg; and finally Clint Eastwood.[1]

Kyle’s father instructs his son on shooting and in manly conduct: “There are three kinds of people: sheep, wolves, and sheep dogs.” Chris Kyle (played by Bradley Cooper) takes the message to heart. He is determined to use his skill to save the lives of endangered American troops in Iraq. A chance encounter with his younger brother, who had enlisted after 9/11 but is now skittish and eager to be gone from Iraq, drives home the importance of this mission. That sense of duty leads Kyle to serve four tours in Iraq, and he becomes a legend among the common soldiers and Marines. In one scene, a dead insurgent plunges off a rooftop into the midst of an American patrol; an officer casually remarks, “that’s the over-watch; you can thank him later.” Increasingly, Kyle becomes obsessed with an insurgent master sniper called “Mustafa.”[2] He returns for his final tour in hopes of killing Mustafa. He succeeds and comes home.

The price is very high: Cooper plays Kyle as “calm and confident,” so he doesn’t emote much about stress. He’s just increasingly distant, uncomfortable with the emotions of other people (both his wife’s and those of grateful veterans), with flashes of rage. Eventually, this self-contained man makes his way home by finding a new means to “save” fellow soldiers.

The movie has been criticized from the Left for de-contextualizing Kyle’s story. Eastwood portrays Kyle as motivated by the Al Qaeda attacks on the American embassies in East Africa and by 9/11; then the events in Iraq focus on the effort to kill Al Qaeda in Iraq leader Abu Musab al-Zarqawi. How the United States came to invade Iraq is scrupulously left out. The critics are mad that this wasn’t about the lies that led us to war. That would be a different movie. Indeed, it has been made. Several times. “Rendition” (2007, dir. Gavin Hood); “Lions for Lambs” (2007, dir. Robert Redford); “Redacted” (2007, dir. Brian De Palma); and “Green Zone” (2010, dir. Paul Greengrass) all lost money or fell short of earnings expectations. That says something about audiences and what they’re willing to acknowledge. In contrast, “American Sniper” is well over $200m in the black.

“American Sniper” falls into a different category of war movie from the ones that haven’t succeeded with American audiences. “The Hurt Locker” (2008, dir. Kathryn Bigelow) and “Zero Dark Thirty” (2012, dir. Kathryn Bigelow) became huge hits by focusing on driven individuals, the personal price they pay, and the shameful American indifference to the human costs of wars waged by their country. However, “American Sniper” ends on a different note than do Bigelow’s two movies. In her work, the protagonists (played by Jeremy Renner and Jessica Chastain) are lonely souls, estranged from their less-driven colleagues, cut off from home, and unknown to their fellow Americans. “American Sniper” ends with Kyle’s funeral procession across Texas. On a rainy day, masses of people line the highway and the overpasses, fire-engine ladder trucks hoist huge American flags, and Stetsons and baseball caps come off as the cortege passes. Eastwood is in his eighties. This may be his last movie. Hell of a way to go out.

[1] “American Sniper” (2014, dir. Clint Eastwood).

[2] It’s worth noting that the film portrays Mustafa (played by Sammy Sheik, who has portraying evil Muslims down to a fine art) as an insurgent version of Kyle: skilled, committed, and with a family that is shut out of his work.