The Secret History of Columbus Day.

The vast majority of the early settlers of British North America were Protestants. They brought with them a folk memory of how English Catholics had been seen—often correctly—as disloyal to the British government and in the service of foreign princes who wished to establish absolute monarchies that would force people to abandon their own faith and become Catholics. Protestants and Catholics each regarded the other’s faith as defective, rather than as a legitimate religion. From the late 18th Century on, the Catholic Church sided with autocratic governments and systematic ignorance. The Church opposed everything desired by progressive people of the day: representative governments, elections, freedom of speech, freedom of opinion, freedom of the press, individual civil rights, and modern science. The Church had maintained an Inquisition to repress heresy (wrong belief) and an Index of Banned Books that no Catholic should read. Occasionally, the Church kidnapped Jewish children who had been secretly baptized by Christians, and raised them as Catholics.[1] Moreover, in theory, Catholics owed their first loyalty to the Pope, rather than to the government of whatever country they happened to live in. Protestants in all countries despised Catholics as a primitive people who were slaves to the orders of their priests.

Catholic immigrants—from Ireland, Italy, and Germany—got a hostile reception from Protestant America. To make matters worse, the Irish and Italians were poor country people, usually illiterate and generally without technical skills. Hence, they took the lowest-paying and least-regarded jobs when they first arrived in America. Their desperation for work dragged down wages for the native-born population. During the 1830s and 1840s, anti-Catholic sentiment boiled over in brawls, riots, press campaigns, and “Nativist” political parties.

The problem for Catholics lay in how to make themselves acceptable in a hostile foreign society. One solution came through associating themselves with the history of America from its earliest times. Italian-Americans first celebrated Columbus Day in New York City’s “Little Italy” in 1866. In 1882 Catholic Americans led by an Irish-American priest founded the “Knights of Columbus” as a device to help impoverished immigrants and promote Catholic education. The organization grew like wild-fire among Irish and Italian immigrants and their descendants. It emphasized the union of Americanism and Catholicism.

In 1892 President Benjamin Harrison proposed that Americans celebrate the 400th anniversary of the arrival of Christopher Columbus in the New World. Various dignitaries and un-dignitaries used the occasion to laud such ideals as patriotism and social progress. School-children recited the “Pledge of Allegiance” for the first time as part of the celebration.

Angelo Noce, an Italian immigrant who had become a citizen and settled in Denver, Colorado, took it into his head to press for making Columbus Day a Colorado state holiday. In 1905 the governor of Colorado decreed 12 October a state holiday.

In 1934, the Knights of Columbus and an Italian-American leader in New York City named Generoso Pope (publisher of the Italian-language daily Il Progresso Italo-Americano; his son later founded the National Enquirer) got President Franklin D. Roosevelt to proclaim Columbus Day as a national holiday. Roosevelt needed the Italian vote, so he agreed.

Now “progressive” people want to use the date to validate the long-neglected Native Americans. Why not? Catholics now are fully-integrated into American society. They don’t need it. And it isn’t as much fun as Saint Patrick’s Day. Still, that leaves Asian-Americans.

[1] See David Kertzer, The Kidnapping of Edgardo Mortara (New York: Random House, 1997) for one example that attracted much attention.

Pop. 2050.

People from Thomas Malthus to Paul Ehrlich used to fear that population growth would outrun resources. These fears proved groundless by the end of the 20th century. Projecting from current trends, the United Nations foresees a world population of 9.3 billion by 2050, with growth slowing to stability at 11 billion by 2200. Other reliable estimates set the “carrying capacity” of the earth (its resource base) at something better than ten billion people. Many estimates hold that the earth could support 11 to 14 billion people. In short, a huge crush on resources seems unlikely to imperil human survival.

Instead, by the start of the 21st Century it was being predicted that “the most important changes in world population over the next fifty years are less likely to be in the total number of people than in their age and geographic distribution.”

For example, the anticipated overall slowing of population growth means that populations will age. In 2002 the median age of the world’s population was 26.5 years; by 2050 it will be something like 36.5 years. In the more-developed regions, long life-spans combined with a drop in fertility below the replacement level (2.2 children per family) will create very distinctly aged population patterns. The absolute and relative size of the working populations will shrink. Fewer working people will have to support more elderly dependents, but fewer children. Unless there is substantial immigration from non-European areas, Europe’s 2050 population will be smaller than its 2000 population, and only 57 percent of it will be of working age (15-65). Italy may be regarded as an extreme case: by 2050 the Italian population will have shrunk by 25 percent, and there will be only three Italians of working age for every two over 65. In Russia and the former Soviet-bloc territories, population is plunging as people have fewer children, many die younger than one would expect, and others emigrate.
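For readers who like the arithmetic spelled out, here is a minimal sketch (my own illustration in Python, using only the rounded figures quoted above; the function name and the comparison ratios are mine) of how an age breakdown translates into an old-age support ratio.

```python
# Old-age support ratio: people of working age (15-65) per person over 65.
# Purely illustrative; the inputs are the rounded figures quoted in the text.

def support_ratio(working_age: float, over_65: float) -> float:
    """Workers available to support each person over 65."""
    return working_age / over_65

# Italy, 2050 projection as quoted: three working-age Italians for every two over 65.
print(support_ratio(3, 2))  # 1.5

# How the burden per worker grows as the ratio falls (ratios chosen for illustration).
for workers in (4, 3, 2, 1.5):
    print(f"{workers} workers per retiree -> each supports {1 / workers:.2f} retirees")
```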

Other areas of the world still face surging population growth: in China the birth rate is double the death rate; in India and Nigeria the birth rate is almost triple the death rate; in Pakistan the birth rate is more than triple the death rate. In general, almost all of Africa, the Arab world, and South Asia can anticipate population growth by 2050 ranging from at least 50 to over 100 percent. Eight of the ten fastest-growing countries are Islamic-majority countries. Afghan women bear on average 6.8 children, while the population of the Gaza Strip is projected to quadruple by 2050. But it is not just Islam that reports rapid population growth: sixteen million more Indians were born than died in 2002 (20 percent of the world’s population growth), and the population of Africa is projected to increase by 150 percent between 2000 and 2050. This is in spite of the AIDS epidemic, which cut life expectancy in the hardest-hit African countries from about 60 years (early 1990s) to 36 years (2002).
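To get a feel for what projections like “150 percent growth by 2050” imply year to year, here is a small back-of-the-envelope sketch (my own arithmetic, not drawn from the Peck article) that converts a total projected increase over a fixed horizon into an average annual growth rate.

```python
# Convert a projected total increase over a horizon into the implied average
# annual growth rate, assuming steady compound growth. Illustrative only;
# the inputs are the rounded projections quoted in the text.

def annual_rate(total_increase_pct: float, years: int) -> float:
    """Average annual growth rate implied by a total percentage increase."""
    return (1 + total_increase_pct / 100) ** (1 / years) - 1

# Africa: a 150 percent increase between 2000 and 2050 implies roughly 1.85% a year.
print(f"{annual_rate(150, 50):.2%}")

# Gaza Strip: quadrupling by 2050 (a 300 percent increase, taking 2000 as a rough
# base year -- my assumption) implies roughly 2.8% a year.
print(f"{annual_rate(300, 50):.2%}")
```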

In contrast to the developed Western countries (including Japan), the less-developed regions’ continuing, comparatively high numbers of children will create distinctly youthful population patterns. The absolute and relative size of the working populations will grow. More working people will have to support more children, but not as many aged people. (Retirement homes and elementary schools may become the key institutions in two different societies.)

More importantly, it is difficult to see how “developed” societies are going to do without a large influx of workers from “developing” countries. What school-teachers call “cultural competencies” are going to start to count more and more. “Controlling the border” will take on a different meaning.

 

Don Peck, “The World in Numbers: Population 2050,” Atlantic, October 2002, pp. 40-41.

The economic mess.

Every bored-to-tears schoolboy knows who propounded the idea of a “social contract”: Thomas Hobbes and John Locke. The idea of a social contract on the distribution of income has formed one of the pillars of “neo-capitalism” since 1945. However, that basic idea has gone through several successive versions. From 1945 to the Reagan Administration in the 1980s, the US combined high tax rates on the wealthy with the channeling of gains in productivity to employees. Eventually, business people pushed back against what they saw as an unfair deal. A new social contract emerged in which much higher incomes for the wealthy were accepted so long as the real incomes of the middle class continued to rise. (All this is just my opinion. In all likelihood, many of my historian friends would rain down good-humored abuse on this interpretation.) The financial crisis and the “Great Recession” then ruptured this second version of the social contract.

In 2007-2008 we had the financial crisis and the “Great Recession.”  In 2009 we started back up the road to prosperity.  American Gross Domestic Product (GDP, OK, cue Mort Sahl here) is up 6.7% over 2007.  Per-capita disposable income rose 4.2% between June 2009 and June 2014.  Well, some of us started back toward prosperity, but not all of us did.  In June 2009 the median family income was $55,589; in June 2014 it was $53,891 (in inflation-adjusted dollars).  That’s a 3.1% decline.
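The arithmetic behind that last figure is simple enough to check (my own quick calculation, using only the two medians quoted above):

```python
# Change in real median family income, June 2009 to June 2014,
# using the inflation-adjusted figures quoted above.
median_2009 = 55_589
median_2014 = 53_891

change = (median_2014 - median_2009) / median_2009
print(f"{change:.1%}")  # -3.1%, the decline cited in the text
```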

How can that be?  Well, the stock market is doing very well.  If you’re the kind of person who puts their savings into Vanguard accounts, then you’re the kind of person who probably has profited from the recovery.  (On the other hand, you’re also the kind of person who took a bath in the recession.  Not that the people at the New York Times give a rip about your experience.)  If you’re the kind of person who depends on wages or salary and your home is your chief investment, there is good reason to feel like the “recovery” is a joke.  (Like a bucket of water propped on top of a partly-open door.  “Hey, can you come in here for a minute?”)  Worse still, the 1999 peak in real household income was a little higher than the 2007 (pre-recession) peak.  Five years into the “recovery,” we aren’t even back to the 2007 level, and the 2007 level wasn’t as high as the 1999 level.  In sum, we’ve actually had fifteen years of things not working right, rather than five or seven.  There’s probably something in the Bible about this.

One great challenge of the day is to figure out a new version of the social contract.  There has to be a way of achieving broadly-shared economic growth.  There isn’t much political consensus about what to do.  George W. Bush and Barack Obama, Republicans and Democrats alike, have all had high disapproval levels in public opinion polls.  A big chunk of voters seems to have swung from supporting Obama and the Democrats in 2008 to supporting the Tea Party faction of Republicans in 2010.  The 2014 mid-terms loom next month with no certain outcome.

Saying that there is no political consensus on action isn’t quite the same as saying that professional economists couldn’t come up with some solutions.  It’s just that neither the right nor the left seems much interested in listening to what they have to say.  The flight from Keynesian responses to the recession was widely shared.  In rational terms it is inexplicable, especially on the part of the Democrats, who were going to be left holding the bag in future elections.  Yet it happened.  Probably the same goes for constructive policies aimed at building a better American future.

Paul Krugman, “How to Get it Wrong,” NYT, 15 September 2014.

Neil Irwin, “A Crisis of Faith in the Global Elite,” NYT, September 2014.

Neil Irwin, “Why the Middle Class Isn’t Buying the Talk About a Strong Recovery,” NYT, 22 August 2014.

Why don’t Americans trust their Government? I

“Enemy of the State” (1998, dir. Tony Scott[1]). The NSA has been pressing for legislation that will allow it to slip the leash, but a Congressman is in the way. A top NSA official organizes his killing—meant to look like a suicide—only to discover that a remote camera, set up for another, entirely innocuous purpose, recorded the killing. HA! The hunt for the video record is on. The wildlife observer who had set up the remote camera—this is hilarious: he is astonished to find that government officials in a democracy are just as savage as wolves in the wild—ends up dead in an “accident.” He had passed a disc of the scene to an unwitting acquaintance (played by Will Smith). So the full weight of the government’s information apparatus—all the CCTV cameras, phone taps, internet intercepts—falls on the acquaintance. It turns out that the government not only can listen to what you say and watch what you do, it can also plant information in the computer records of your life. Soon, the guy played by Smith has been fired from his job, had his bank account frozen, and been tossed out of the house by his outraged wife. Eventually, a former government tech surveillance guy turned outlaw (played by Gene Hackman) saves the day by using the techniques of the NSA against the bad guys.

“Shooter” (2007, dir. Antoine Fuqua). Government agents get former Marine sniper Bob Lee Swagger (played by Mark Wahlberg) to come out of retirement to consult on a supposed plot to kill an important figure in Philadelphia. Turns out that they are setting up Swagger as the fall guy for a government-sponsored assassination done at the behest of big corporations—which own the US government. (See: “Citizens United” in the mythology of Democrats.) Swagger turns out to be hard to kill and hard to catch—Semper Fi—and he hunts for answers. A newbie FBI agent (played by Michael Peña) gets staked out as a sacrificial goat because he didn’t believe the stuff the bosses were saying, but Swagger turns him into an ally and they find the truth. After much shooting, the Truth comes out—within a restricted circle in the know. The rest of us are left in the dark, although it is implied that Survivalism isn’t as crazy as it sounds.

“The Bourne Legacy” (2012, dir. Tony Gilroy). As anyone who has seen the earlier installments in the series knows, the US government created a bunch of psychologically-enhanced assassins to put a sharp edge on American action in the world. In this installment, a new generation of agents has also been chemically-enhanced into near-Marvel Comics characters. Scandal forces the government to burn down the program, but one of the agents, Aaron Cross, escapes and goes in search of answers. In pursuit of him, the US commandeers all sorts of surveillance systems, from weather satellites to toll-booth cameras to CCTV security cameras in airports to credit card records to airline seating charts. In the end Cross (played by Jeremy Renner) and a rogue scientist from the program (played by Rachel Weisz) sail away, sail away, sail away on a fishing boat bound for the southern islands of the Philippines. Still, they’re careful to stay under an awning all the time, just in case of, you know, drones.

The conventions of these paranoid fantasies require a renegade product of evil covert government actions, a basically decent participant in those actions who is appalled to discover what s/he has been doing, and government officials who have been carried away in their pursuit of their duty to protect the dough-headed citizens of a fat, lazy America. (See: “Margin Call”; see: Edward Snowden—I mean “Edward Scissorhands”!)

[1] He purportedly committed suicide in 2012.

College costs: the old eat the young.

It is always worth asking whether a consumer is getting value for money. Is a college education today worth its higher price compared with what earlier generations paid?

Everyone knows that inflation-adjusted college tuition has more than doubled since 1992. Except that it hasn’t. Everyone knows that it can cost $60,000 a year for college. Except that it hardly ever does.

The real price of college has to include the financial aid (other than loans) supplied to the student. This gives the net price. Since 1992, the net price for community college has fallen; the net price at a private four-year college has risen 22 percent; and the net price at a four-year public college has risen 60 percent. Averaged out, the increase in net tuition falls somewhere between 40 and 50 percent. That puts the rise in college tuition in the same ball-park as the rise in medical costs (35 percent) or day care (44 percent).

The “sticker shock” tuitions beloved of the media and the politicians apply only to students from affluent families, not eligible for financial aid, who attend elite schools that can charge what the market will bear for a prestigious degree.

Taking lower costs and higher aid into account, the average price for a student attending a four-year public college was $3,120 a year in 2013; the average price for a student attending a four-year private college was $12,460.[1]

Why has the net cost of a four-year public college risen so much more than the cost of a four-year private college? In the United States, about eighty percent of college students attend public colleges. Between 1988 and 2013, tuition at these institutions more than doubled. This has created a terrible problem of debt for parents and students at a time when most incomes have been stagnant. However, the revenue earned by these colleges stayed flat: in 1988 colleges earned an average of $11,300 per student; in 2013 they earned an average of $11,500 per student. If colleges aren’t getting rich, then where did the additional tuition go? To tax-payers, that’s where: the extra tuition replaced state support that legislatures cut.

Traditionally, public colleges were subsidized by state legislatures. In 1988, each student at a public college received an average of $8,600 a year to subsidize his/her studies. The student and his/her family kicked in the additional $2,700 a year. In 2013, each student at a public college received an average of $6,100 a year to subsidize his/her studies. The student and his/her family now have to kick in $5,400 a year. A four-year BA went from costing the state $34,600 to costing it $24,400. That same four-year BA went from costing students and parents $10,800 to costing them $20,800. People who got a cheap BA paid for by others now want to pay lower taxes.
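The cost shift is easy to follow with a little arithmetic. The sketch below (my own illustration; the function name is mine) simply multiplies the rounded per-year figures quoted above by four, so its totals differ slightly from the four-year totals in the paragraph, which presumably come from unrounded data.

```python
# Who pays for a four-year public-college BA: the state subsidy versus the family,
# using the rounded per-student, per-year figures quoted above (1988 and 2013).
# Illustrative arithmetic only.

YEARS = 4

def four_year_split(state_per_year: int, family_per_year: int):
    state = state_per_year * YEARS
    family = family_per_year * YEARS
    return state, family, family / (state + family)  # family's share of total cost

for year, subsidy, contribution in [(1988, 8_600, 2_700), (2013, 6_100, 5_400)]:
    state, family, share = four_year_split(subsidy, contribution)
    print(f"{year}: state ${state:,}, family ${family:,}, family share {share:.0%}")

# 1988: state $34,400, family $10,800, family share 24%
# 2013: state $24,400, family $21,600, family share 47%
```

In other words, under these figures the family’s share of the total cost of a public-college BA roughly doubled, from about a quarter to nearly half.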

The Obama administration has the idea that introducing ratings for colleges will help “education consumers.” They want to consider factors like affordability, drop-out rates, and the earnings of graduates. Federal subsidies—“Jump, boy, jump” versus “Bad dog, no biscuit”—would reward colleges that score well on the standardized test.[2] People push back, saying that there is too much difference among students to make a single standard meaningful. The economist Susan Dynarski has suggested that the “risk-adjusted” rating system used for hospitals might offer a useful model for any college rating system.[3] Better still, restore the state aid.

[1] David Leonhardt, “How Government Exaggerates College’s Cost,” New York Times, 29 July 2014.

[2] I can foresee the criticism that this will lead colleges to “admit to the test” just as schools “teach to the test.”

[3] Susan Dynarski, “Where College Ratings Hit the Wall,” New York Times, 21 September 2014.

MPs–Militarized Police.

The police response to civil unrest in Ferguson, MO, brought the “militarization of the police” to public attention. The irony is that rioting and looting of the sort that took place around the edges of the legitimate protests in Ferguson is exactly the kind of situation in which a more robust police presence is appropriate. What got lost in the discussion was the far more common use of “militarized” police.

Once upon a time, the local authorities responded to trouble by calling the National Guard. https://www.youtube.com/watch?v=VwcJ5WQSamQ. Then a lot of “civil unrest” hit American cities in the 1960s-1970s: riots, rock concerts run amok like Altamont https://www.youtube.com/watch?v=0qTKsylrpsg, holed-up radicals http://www.nbcnews.com/video/dateline/32129377#32129377, hostage situations, and just crazy people packing a lot of fire-power.

Police department Special Weapons and Tactics (SWAT) teams became common after the Los Angeles Police Department (LAPD) pioneered them in 1967. Subsequent events multiplied the demand for them and the occasions on which they might be used. The Columbine school massacre in 1999, the 9/11 terrorist attacks in 2001, and the escalating violence of the war on drugs all argued for a more militarized police force as the first responders to unimaginable dangers. The Department of Homeland Security, which is now headquartered at the old St. Elizabeth’s Hospital in Washington, DC, has poured $35 billion into up-arming local police forces. Some of the results are so ludicrous that they attract even the media’s attention: armored vehicles (MRAPs) of the sort produced to respond to IEDs in Iraq, and helicopter gun-ships.

Far more important, however, has been the militarization of attitudes among police officers. Incessant talk of policemen as the front-line soldiers in “wars” on crime, drugs, or terror combines with training in military tactics to create the ideal of a “warrior cop.” One can’t help but wonder if police officers start to think of citizens as a foreign enemy. This is far removed from the boiler-plate slogan “to serve and protect.” Radley Balko, author of Overkill: The Rise of Paramilitary Police Raids in America (Cato Institute, 2006) and Rise of the Warrior Cop: The Militarization of America’s Police Forces (PublicAffairs, 2013), warned of the danger long before it began to make headlines.

One thing that alarms observers is the conversion of SWAT teams from a sensible “extreme case” response into a common feature of policing. The American Civil Liberties Union (ACLU) estimates that SWAT team raids take place an average of 124 times a day. Some observers wonder if having the capacity to conduct such operations creates a pressure to use it. The vast majority of SWAT operations now occur in support of drug searches. Apparently, the theory holds that when policemen looking like big fighting insects bust down the door, suspected criminals will be too petrified to destroy the evidence by flushing it down the toilet. Carried to an extreme, these operations have included raids on illegal barbershops, cockfights, Tibetan monks on a peace pilgrimage in Iowa, and a guy who bet too much on a college football game. Several dozen people have been killed during these raids, although the most common victim is a family dog shot by amped-up policemen responding to incessant barking. (But who hasn’t felt that impulse in the middle of a summer night?) “The militarization of America’s police,” The Week, 1 August 2014, p. 11.

The well-known gap between American citizens and the military that protects them from foreign dangers, the broad reach of NSA communication searches, and the militarization of policing all raise questions about the direction in which American democracy is headed.