Save the Pagan Babies!

Poor countries cannot run what contemporary Americans would regard as “adequate” orphanages. They don’t have the surplus economic resources to provide robust social welfare institutions. Furthermore, as political scientists say, the state institutions lack the capacity to achieve their goals. At best, they’re something out of Dickens. At worst, they’re warehouses in Hell. This is probably going to have some kind of long-term psychological impact.

Long wars, especially civil wars, fought under barbarous conditions produce lots of orphans. The process of getting orphaned may involve something like watching your father have his arms chopped off with a machete. This, too, may have a lasting impact.

One report states that in Azerbaijan, “Many children are abandoned due to extreme poverty and harsh living conditions. Family members or neighbors may raise some of these children but the majority live in crowded orphanages until the age of fifteen when they are sent into the community to make a living for themselves.”

Finally, as in America not all that long ago, people use mental institutions and orphanages as receptacles for family members who are permanently disabled in some way. (One problem with tenement living was that you lacked an attic in which to confine Great-Aunt Grace who spent all her time talking about Charlotte Perkins Gilman’s “The Yellow Wallpaper.” Putting her in the storage locker in the basement just got the neighbors talking.)

Promoting international adoption can be one way of reducing the burden on taxpayers.

Still, there can be problems.

“Child laundering.” No, really, that’s what it’s called. Basically, “gringos” and “farangs” spend so much time with their cell phones that the radiation fries their little swimmers. So, no kids. So, they come to some developing country to buy a kid from an orphanage or some helpful soul who knows a starving child and would like to set him/her up in an American suburban home with a swing set in the backyard and 999 television channels. They’re rich, so there’s money to be made if you have a spare kid to sell. What if you do not have such a kid? Well, that’s what shopping malls are for in the United States. In developing countries you probably have to snatch them in a market-place or on their way home from school. Then, sell to “gringo” or “farang.” It helps if you know a “poor, corrupt policeman” who can help you with fake identity papers. (The US government has been prosecuting an American woman for her part in the fraudulent adoption of 800 Cambodian children.)

UNICEF estimates that there are 700,000 orphans in Russia. The number increases by over 100,000 a year. The striking thing is that these are “social orphans.” They have at least one living parent. The parent feels unable to care for the child, so they abandon the child to the care of someone else. Most go to other relatives or to foster homes. About a third are in the care of the state. Same thing is true in Haiti, where poor parents “hoped to increase their children’s opportunities by sending them to orphanages.” After the Haitian earthquake, the number of orphans sky-rocketed (although so did the number of suddenly-childless adults). American aid agencies descended on Haiti. One impulse was to promote the adoption of children from the orphanages to American homes. The obvious problem was that the Americans completely misunderstood the nature of Haitian orphanages. (On the other hand, they perfectly understood the motives of Haitian government officials who objected to the adoptions: they hadn’t got their cut.)

Little of this kind of “news” makes the headlines.

Value for Money in College Education

A Pew Research Center report from 2011 made two interesting points. First, “less than half of members of the general public agrees [that students should pay for their own college education], with a majority saying either the federal or state government, private donors, or a combination of those should pick up the largest share of a student’s college tab.” Second, “nearly 60 percent of Americans say the U.S. higher education system is not providing students with a good value.” These attitudes put average Americans sharply at odds with college presidents and faculty, who feel themselves beset by Yahoos.

It’s time for some plain speaking. First, college does cost more than most families care to afford. Second, most colleges don’t give good value for what they charge, at least not in educational terms. Third, it is the same general public that complains about low value for a high price that is the cause of these problems. An examination of the historical record makes this clear.

One part of the explanation comes from demography.  The Baby Boom (from 1946 to 1964 approximately) went through American society like a mouse through the rattlesnake my college room-mate used to keep.  In the Forties and Fifties a tsunami of students hit the schools.  In the Sixties and Seventies the same tsunami hit the colleges.  The result of massive demand was a huge increase in the size of colleges and college faculties.

Then the Baby Boom gave way to the Baby Bust.  This brought a decline in the number of 18 year-olds in 1982 and for years to come. The number of students no longer matched up with the size of colleges and faculties.  What to do?  In business, of course, lots of places would just have gone under, like nail or tanning salons. Supply would have returned to balance with demand.  Not in higher education however.  Colleges fought for survival. First of all, they molted into country-clubs attached to classrooms.  Sports facilities, luxury dorms, and improved food services became the hall-marks of a good college. Second, adult education and degree-completion programs multiplied. Third, they played to the American reverence for diplomas, if not for learning as an abstract concept.  Everyone emphasized the economic value of more education.  Everyone celebrated a liberal arts education for all as a form of democratization.  Graduate programs to confer credentials sprang up like mushrooms.

The end result was that not enough colleges were down-sized.  Instead, they passed the rising costs along to others: to parents (through tuition increases), to students (through larger student loan debt), and to taxpayers (through government aid to higher education).

A second part of the explanation is cultural.  On the one hand, we are living with the consequences of a regulatory society created to pursue well-intended, but ill-defined goals like “justice” and “well-being” for citizens.  The outcome of this has been the growth of a massive apparatus of administrative staff at every college.  If you compare a college phone directory of twenty years ago with one of today, you will be able to measure the scale of the growth of administrators, new offices, assistants, and secretaries.  These people largely respond to mandates imposed by the federal and state governments, and accrediting agencies.  The costs of those mandates, however, are carried by the colleges and passed on to the consumer.

On the other hand, we are living with the consequences of the “de-bourgeoisification” of the American middle class.  Being bourgeois used to mean valuing hard work, self-restraint, living on less than you earn in order to have savings and–in old age–to be able to leave “an estate” to one’s children to help them get started in life.  It did not mean being happy or “fulfilled.”  Even so, bourgeois used to have a positive association.  Since the 1960s being bourgeois has gone the way of fedoras and torpedo bras.  Increasingly, the cultural emphasis has been on individual fulfillment and happiness.  There isn’t much that is fulfilling or happy about hard work, so it is de-valued.

The average American home now has five books in it.  The average home also has a big screen TV and a huge range of channels on its cable package.  You can’t get literacy or analytical skills from reality shows or video games.

Furthermore, in 1950 about 40 percent of students never finished high-school.  They didn’t need much education to drive a truck or work a drill-press or dig a ditch.  THEN high school and college were for people willing to do the work and to respect authority in the form of unreasonable teachers and parents angry about report cards.  NOW the schools have shifted toward keeping kids in school regardless of the cost to the state of education.  The quality of education has suffered because it isn’t fashionable to do the work required for learning and almost impossible to coerce kids with threats of flunking out.  Parental authority also has declined.  (You try involuntarily institutionalizing somebody over the age of 14 in any state except Utah.)

The outcome of all this is that many students come to college without the intellectual or cultural or psychological capital that their predecessors brought.  They struggle in their classes, or don’t bother to try.  They require remedial course work and second chances.  The survival imperative driving many colleges leads to a dilution of course work and grading standards.  The colleges need the tuition, so they need the students.

For many students, college is a rite of passage, not an education.  They get to live away from their parents for the first time.  They’re semi-adults on their way to being minor-league adults on their way to being full-scale adults on their way to being safely dead where nothing can go wrong now so they’re Winners!  (The movie “Trainspotting” may have been repellent, but it wasn’t wrong.)  The country-club-with-classrooms environment reinforces this feeling.

Public attention has focused on the real-estate bubble and all the evidence of corporate misbehavior.  Much less noticed was the explosion in ill-considered consumer debt and use of home equity loans to finance consumption.  Basically, most people don’t save a ton of money to pay for their kids’ college education.  The attitudes reflected in the Pew survey are unrealistic on several grounds.  First, it would be one thing publicly financing the higher education of some sort of elite.  In fact, most students in college are not part of some intellectual elite.  Second, the money just isn’t there.  The federal deficit is going to be cut through some combination of tax increases on most people and spending cuts for all.  How we would expand public aid to everyone seeking a college education in that environment is beyond me.  Certainly, Princeton could buy the moon if it were for sale.  However, most colleges do not have large endowments to provide additional income.  Public colleges live off direct state aid and tuition.  Many private colleges are in the same leaky boat.  That means that the “someone” who will pay for college if parents and students don’t pay will be—parents and students in their capacity as taxpayers and tuition-payers.

Is there a solution to this problem? Sure. Shape up. Turn off the television. Get rid of the xBox. Take the kid to the library once a week. Ground the kid if the grade report isn’t good. Paint the house during your summer vacation or drive out to Gettysburg, but forget about going to Disney World or down the Shore. I hate having to quote Chris Christie, but “why are you mad at the first person who told you the truth?”

 

The Tax Wars

Should the rich pay their “fair share”? In 1992 there were three tax brackets: 15%, 28%, and 31%. In 1993 the Democrats created two additional tax brackets on higher incomes: 36% and 39.6%. Thus, the Democrats imposed higher tax rates on high incomes.[1]

In 2001 the Republicans cut federal income taxes on all Americans.[2] Single tax-payers with taxable income up to $6,000, heads of households with taxable income up to $10,000 and people filing jointly with taxable incomes up to $12,000 had their tax rate reduced from 15% to 10%. Those in the 15% bracket had the lower threshold indexed to the new 10% bracket. The tax rate on people in the next bracket was reduced from 28% to 25% by 2006. The rate on the next bracket would be lowered from 31% to 28% by 2006. The rate on the next bracket was reduced from 36% bracket to 33% by 2006. The rate on the highest bracket was reduced from 39.6% to 35% by 2006. The biggest percentage cuts in the tax rates were at the bottom end of the tax brackets, the smaller cuts at the high end. The two highest brackets still were taxed at a higher rate than in 1992.

These rates continued through 2012, when the 2001 cut on the top bracket was allowed to expire for incomes over $400,000, while the rates on the other brackets were made permanent. To illustrate, in 2013 the rate for single filers making up to $8,925 is 10%; on $8,925 to $36,250 is 15%; on $36,250 to $87,850 is 25%; on $87,850 to $183,250 is 28%; on $183,250 to $398,350 is 33%; on $398,350 to $400,000 is 35%; and on $400,000+ is 39.6%. So, most Americans live under the Bush Administration tax cuts, while the wealthiest Americans live under the Clinton Administration tax increases.
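A point worth making explicit: these are marginal rates, so each slice of income is taxed at its own bracket’s rate, not the whole income at the top rate. Here is a minimal sketch of that arithmetic, using only the 2013 single-filer bounds quoted above (real tax law adds deductions, exemptions, and credits that this ignores):

```python
# Marginal-rate arithmetic for the 2013 single-filer brackets quoted in
# the text. Each tuple is (upper bound of bracket, rate). Simplified
# sketch: no deductions, exemptions, or credits.
BRACKETS = [
    (8_925, 0.10),
    (36_250, 0.15),
    (87_850, 0.25),
    (183_250, 0.28),
    (398_350, 0.33),
    (400_000, 0.35),
    (float("inf"), 0.396),
]

def income_tax(taxable_income):
    """Tax owed: each slice of income between bracket bounds is taxed
    at that bracket's rate."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if taxable_income <= lower:
            break
        tax += (min(taxable_income, upper) - lower) * rate
        lower = upper
    return tax
```

For example, a taxable income of $50,000 pays 10% on the first $8,925, 15% on the slice up to $36,250, and 25% only on the remainder, which is why nobody’s whole income is taxed at their “bracket” rate.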

Under these systems, what do different income groups pay as a percentage of federal income taxes?[3] In 1991, before the Clinton tax increases on high incomes, the top one percent of income earners paid 24.82% of the income tax bill; the bottom 50% paid 5.48%. In 2000, before the Bush tax cuts, the top 1% of income earners paid 37.42% of the income tax bill; the bottom 50% paid 3.91%. In 2011, under the Bush tax cuts, the top 1% of tax payers paid 35.1%; the bottom 50% of tax-payers paid 2.89% of taxes. (The top 50% paid 97.1%; the top 25% paid 85.6%; and the top 10% paid 68.3%.)

Across three very different administrations and under very different economic situations, the tax burden has been continually shifted from the bottom 50 percent of taxpayers onto the top one percent of tax payers. The Democratic mantra that the Bush tax cuts “favored the rich” is absolutely untrue. (In all likelihood, the Republican mantra that tax cuts will stimulate economic growth is equally untrue. That needs to be the subject of a different jeremiad.)

If tax rates favor the bottom 50%, income distribution favors the top 50%.

The “hard times” experienced by many Americans don’t have anything to do with tax-dodging by the rich. They are more likely to be the product of big shifts in the American economy within a globalized world economy since the 1970s. Fighting over shares of a shrinking pie isn’t going to fix the problem. We need broadly shared economic growth.

[1] For the sake of comparison, in Canada the highest rate of national taxation—on incomes over $132,000—is 29%.

 

[2] Economic Growth and Tax Relief Reconciliation Act of 2001 (EGTRRA).

 

[3] Kyle Pomerleau, “Summary of Latest Federal Tax Data,” Table 6. http://taxfoundation.org/article/summary-latest-federal-income-tax-data

Exporting Jobs

Companies are owned by private investors seeking the maximum return on their investment. In the decades after the Second World War, the United States slowly became a high-cost place to do business. Labor costs (wages and benefits), and environmental and workplace safety regulations played an important role in this process. The weakening quality of the American workforce in terms of science and math also played a role.

Beginning in the late 1970s, major American firms began seeking higher profits through the lower production costs that could be attained by moving operations outside the United States. General Electric, under Jack Welch, figures as one of the leaders in this movement, and GE was not shy about encouraging its own suppliers to do the same. Many other manufacturers followed the example of GE. For example, in 1992 the Ford Motor Company’s overseas manufacturing operations employed 47 percent of its workforce, while 53 percent still worked in the US and Canada. In 2009 the overseas operations employed 63 percent of its workforce, while 37 percent were employed in the US, Canada, and Mexico.

Then, in the 1990s, the growth of the Internet exposed service industries to globalization as well. Computer programmers have seen 13 percent of their jobs exported to foreign countries.

Furthermore, the United States has the second highest rate of taxation on corporations in the world. The nominal tax rate on corporate profits is 35 percent. Companies have spent decades lobbying Congress in successful efforts to create tax loopholes, so the average effective rate is 25 percent. In Canada the corporate tax rate is 16.5 percent; in Germany it is 15.8 percent; in Ireland it is 12.5 percent. Thus, the tax rate on corporate profits in the United States remains higher than the rates in many foreign countries.

Many American companies have created foreign branches to take advantage of lower labor and regulatory costs, and lower rates of taxation. Moreover, American corporations with operations abroad must pay the difference between the tax rate in the country where they earned the profit and the tax rate in the United States when they repatriate those profits. Rather than do so, the companies reinvest the foreign profits in expanding production overseas rather than in the United States. America is almost alone in double-taxing profits earned abroad.
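The incentive is easy to see in numbers. A rough sketch, using the statutory rates quoted in the text (35% US, 12.5% Ireland) and a hypothetical $1 million in profit; the real rules on foreign tax credits and deferral are far more involved:

```python
# Rough sketch of the "residual" US tax on repatriated foreign profits,
# using the statutory rates quoted in the text. Hypothetical profit
# figure; ignores foreign tax credit mechanics.
US_RATE = 0.35
IRELAND_RATE = 0.125

profit = 1_000_000                     # earned by an Irish subsidiary
irish_tax = profit * IRELAND_RATE      # paid to Ireland up front
residual_us_tax = profit * (US_RATE - IRELAND_RATE)  # owed only on repatriation
```

Bringing the million home costs an extra $225,000 in US tax; leaving it abroad defers that bill indefinitely. That asymmetry, not villainy, is what keeps the profits overseas.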

In 2010, the Simpson-Bowles commission appointed by President Obama recommended that the US corporate tax rate be lowered to 23 percent and most loopholes closed. American business leaders generally accepted this proposal. However, the proposal also encountered opposition from the left to any reduction of taxes on business. According to one critic of corporate tax avoidance, “It’s unpatriotic, it’s unfair, and we can’t afford it.”

Who profits from this strategy? American corporations profit: in 2009, 47 percent of the revenues of the five hundred leading American corporations came from their overseas operations. Developing economies that host American corporations profit: between 1995 and 2008, China’s GDP grew an average of 9.6 percent a year and India’s GDP grew an average of 6.9 percent a year. It’s harder to say that America itself benefits. Between 1995 and 2008, the GDP of the United States grew an average of 2.9 percent a year. In 1950, the United States Government pulled in thirty percent of its revenue from taxes on corporations. In 2010, it pulled in nine percent.

Corporate “inversions” are just the latest example of these problems.

“Where America’s Jobs Went,” The Week, 25 March 2011, p. 13; “Taxing corporations,” The Week, 2 September 2011, p. 13.

Keynesianism and Monetarism

Accepted truth from 1776 to 1929: the “invisible hand” of the unrestricted free market is the best regulator of the economy.  The economy expands, contracts, expands in natural cycles.  Government should stay out of the way, balance its budget (no deficits), and keep taxes low.  This is called “laissez faire” (pronounced lay-zay fare).  These ideas are most associated with the British economist Adam Smith who wrote a book called The Wealth of Nations (1776).

Then came the Great Depression from 1929 to various points in the Thirties.  Thousands of bank failures, tens of thousands of bankrupt businesses, millions of unemployed people, and year after year of hardship with no hope in sight.  The “invisible hand” seemed too invisible for most people’s liking.

Accepted truth from 1933 to 1973: Recession (bad) and Depression (worse) result from a shortfall in Demand (people wanting to buy stuff) compared with what the economy can actually produce.  Government should make up the difference by spending money to buy stuff.  Also, if a government ran a budget deficit in the process of reviving the economy, it was all right and not the end of the world.  So, the government could manage the economy, do lots of things for citizens, and let the politicians decide how much to spend on what.  These ideas are most associated with the British economist John Maynard Keynes (pronounced Kanes) who wrote a book called The General Theory of Employment, Interest, and Money (1936).  So this is called “Keynesianism.”  By the mid-Sixties everybody was a “Keynesian.”

Then came the Seventies.  The “oil shocks” of 1973 and 1979 caused world-wide “stagflation”: a combination of high inflation and high unemployment.  In economic theory, this could not happen.  In economic reality, it could happen.  People observed that big government deficits dumped gasoline on the fire of inflation, while lots of government control of the economy blocked adapting to new conditions.

Accepted truth from 1973 to 2008: the money supply and interest rates really govern the economy.  Deficits are bad because they either dump excess money into the economy (fueling inflation) or “crowd out” businesses that want to borrow money (smothering economic progress like a wicked step-mother).  The government should balance budgets, cut spending, cut taxes, keep interest rates low, and let the natural economy function.  These ideas are most associated with the American economist Milton Friedman, who (with Anna Schwartz) wrote a book called A Monetary History of the United States, 1867-1960 (1963).  So this is called “Monetarism” (rather than “Friedmanism”).

Then came the “Great Recession” of 2008-201_ (fill in blank when you get a job).  Unregulated bankers did a lot of, you know, silly things.  The world financial system almost collapsed.  We’ve got seven percent unemployment.  Monetary policy isn’t working: the interest rate is at about zero, but the banks still aren’t lending; my 401(k) is only now back to where it was in 2008.  So, what is to be done?  Ask Keynes.