Chronology of a Tragedy.

By 20 April 2020, 773,000 people in the United States had tested positive for the coronavirus.  Of these, 247,543 were in New York, mostly in New York City and its suburbs.  New Jersey had 88,806 confirmed cases.  That works out to about 32 percent of the country’s cases being in New York City and its immediate area.  Include New Jersey’s 88,000 and the New York area accounts for about 43 percent of the cases.[1]
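Those shares are simple arithmetic on the counts above.  Here is a minimal back-of-the-envelope sketch (in Python, purely illustrative) using the figures cited from the NYT tracker:

```python
# Back-of-the-envelope check of the shares cited above, using the
# 20 April 2020 case counts reported in the NYT tracker.
us_cases = 773_000   # United States total
ny_cases = 247_543   # New York State, mostly NYC and its suburbs
nj_cases = 88_806    # New Jersey

ny_share = ny_cases / us_cases
ny_nj_share = (ny_cases + nj_cases) / us_cases

print(f"New York share of US cases: {ny_share:.1%}")      # ~32.0%
print(f"New York + New Jersey share: {ny_nj_share:.1%}")  # ~43.5%
```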

How did New York City come to be the present American epicenter of the coronavirus pandemic?[2]

“From the earliest days of the crisis, state and city officials were also hampered by a chaotic and often dysfunctional federal response, including significant problems with the expansion of testing, which made it far harder to gauge the scope of the crisis.”  The same was true of every part of the country, so that doesn’t explain why New York got hit hardest by far.

“Epidemiologists have pointed to New York City’s [population] density and its role as an international hub of commerce and tourism to explain why the coronavirus has spread so rapidly.  And it seems highly unlikely that any response by the state or city could have fully stopped it.”  The same seems likely to be true of the national government.  The question is how much government action could have limited the damage.

Nevertheless, in the view of Dr. Thomas Frieden, former head of the Centers for Disease Control and Prevention, closing the schools, stores, restaurants, and other public venues one to two weeks earlier could have reduced the death toll in New York by 50 to 80 percent.

 

January-February 2020: coronavirus “devastates” China and Europe.

 

21 January 2020: first confirmed case in the United States, in Seattle, Washington.

 

23 January 2020: Chinese government seals off Wuhan.

 

30 January 2020: WHO declares a global health emergency.

 

31 January 2020: US bars entry for any foreign national who had traveled to China in the previous 14 days.

 

It now appears that coronavirus was present in New York City before the first person tested positive for it.  Infectious disease specialists had known for weeks that the federal tests were defective and that infected people were almost certainly present and circulating.  One specialist in infectious diseases for a New York hospital group said later that it was apparent by late January 2020 that cases would soon appear in the United States.

 

2 February 2020: first coronavirus death outside China—in the Philippines.

 

5 February 2020: Japanese government quarantines a cruise ship which carried passengers infected during the trip.

 

7 February 2020: Infectious disease specialists and other doctors confer on the CDC’s federal criteria for testing.  The guidelines were too strict, sharply limiting who could be tested.  According to one of those present, “It was at that moment that I think everybody in the room realized, we’re dead.”

 

Early February 2020: Dr. Oxiris Barbot, the NYC Health Commissioner, states that “this is not something you’re going to contract in the subway or the bus.”

 

14 February 2020: France announces first coronavirus death.

 

19 February 2020: first two cases in Iran announced.

 

23 February 2020: Italy sees surge in cases in Lombardy.

 

24 February 2020: passenger already infected by coronavirus arrives at JFK on a flight that originated in Iran.

 

24 February 2020: Trump administration asks Congress for $1.25 billion for coronavirus response.  US has 35 cases and no deaths.

 

28 February 2020: number of cases in Europe rises sharply.

 

Late February 2020: Mayor Bill de Blasio tells a news conference that “We can really keep this thing [coronavirus] contained.”

 

29 February 2020: first US death, in Seattle.

 

1 March 2020: the passenger from Iran tests positive for the coronavirus, making her the first identified case in New York City.

 

2 March 2020: Governor Andrew Cuomo and Mayor de Blasio address a news conference.  Cuomo says “Everybody is doing exactly what we need to do.  We have been ahead of this from Day 1.”  Cuomo told the conference that “Out of an abundance of caution we will be contacting the people who were on the flight with her from Iran to New York.”  Then everyone would be traced and isolated.  According to the NYT, this didn’t happen because the CDC would not authorize an investigation.

 

3 March 2020: lawyer in New Rochelle tests positive.  He had not traveled to any affected country, so there was reason to suspect he had contracted the virus in New York.  City health investigators traced his travels and contacts to Manhattan, but the state of New York put a “porous” containment line around New Rochelle.

 

3 March 2020: US government approves widespread testing.

 

5 March 2020: New York City mayor Bill de Blasio said that “You have to assume that it could be anywhere in the city.”  However, he also said that “We’ll tell you the second we think you should change your behavior.”

 

If Dr. Frieden is correct that the city should have shut down one to two weeks before it did, then that date would have been sometime between 8 and 15 March 2020.

 

About 7 March 2020: city hospitals start reporting a sharp increase in influenza-like cases, and the NYPD reports rising numbers of officers calling in sick and of 911 calls for coughs and fevers.

 

Second week in March 2020: De Blasio wanted widespread testing, but the city’s Health Department urged a public information campaign to tell those with mild symptoms to self-isolate at home, rather than infect others at testing centers.  De Blasio blocked the public information campaign for about a week.

 

At some point not stated by the NYT, de Blasio did urge New Yorkers to practice social distancing and to work from home where possible; and de Blasio and Cuomo had both ordered occupancy limits on bars and restaurants.  These limits were broadly ignored.

 

Moreover, de Blasio resisted closing the schools.  The schools provide nutritious meals and a safe space for their students, and not in some touchy-feely liberal sort of way either.[3]

 

11 March 2020: US bars most travelers from Europe.

 

12 March 2020: San Francisco closes the schools when 18 cases had been confirmed; Ohio closes the schools when 5 cases had been confirmed.

 

12 March 2020: At a meeting of business executives chaired by de Blasio, City Health Commissioner Barbot said that 70 percent of the city’s population could become infected.  De Blasio “stared daggers at her.”

According to one person present at the meeting, de Blasio rejected closing restaurants: “I’m really concerned about restaurants; I’m really concerned about jobs.”  It was a legitimate concern from one perspective.  According to one estimate, tourism accounts for 300,000 jobs in New York City.  That is twice as many as the tech sector provides and vastly more than the jobs linked to the financial services industry.[4]  Closing down restaurants, bars, tourist activities, hotels, and sporting events would hammer the incomes of poor people much harder than the incomes of rich people.  De Blasio appears to have thought that New York City would never have to close.  In reality, the choice was between closing the city earlier or later.  In the event, the virus spread rapidly.  The health burden has not been shared equally between different social groups.[5]

 

13 March 2020: Trump declares national emergency.

 

13 March 2020: Los Angeles closes its schools after 40 cases had been confirmed.  New York City had almost 160 confirmed cases.

 

15 March 2020: City health officials give de Blasio a grim warning about the number of infections and deaths if the schools—and most businesses—weren’t closed immediately.

 

15 March 2020: De Blasio closes the schools when 329 cases had been confirmed.

 

15 March 2020: CDC recommends no gatherings of more than 50 people.

 

17 March 2020: seven California counties around San Francisco issue stay-at-home orders.

 

17 March 2020: France orders national lock-down.

 

19 March 2020: California issues a statewide stay-at-home order with 675 confirmed cases.  New York then had 4,152 cases.

 

20 March 2020: New York State issues a statewide stay-at-home order, effective 22 March 2020.  On 20 March, the state had more than 7,000 confirmed cases.

 

Recently, the New York Times ran a piece considering the long-term consequences of the pandemic’s impact on New York.[6]  Much of the economic basis of the city may be hollowed out.  This is particularly true if a vaccine is not developed and mass-produced very soon.  Tourists may shrink from visiting a densely-crowded city.  Tourist amenities from theaters to museums to restaurants to public transportation systems may impose social-distancing regimes that capsize the business model of the industry.  Both the financial services and technology sectors may extend their work-from-home adaptations, while many workers may decide that the home from which they are working might as well be somewhere other than high-priced New York.  Demand for office and residential space could fall, clobbering the construction industry.  The city’s budget would have to absorb a huge drop in revenue.  Services to the poor would be cut.

Sometimes Tragedy is born of the collision of two Goods.

 

[1] “Tracking an Outbreak,” NYT, 21 April 2020, p. A4.

[2] J. David Goodman, “How Outbreak Kept New York A Step Behind,” NYT, 8 April 2020.

[3] See: Andrea Elliott, “Invisible Child.  Girl in the Shadows: Dasani’s Homeless Life,” NYT, 9 December 2013.  http://www.nytimes.com/projects/2013/invisible-child/index.html#/?chapt=1

[4] J. David Goodman, “It Could Be Years Before New York Regains Its Glory,” NYT, 21 April 2020.

[5] For one example, see: John Eligon et al, “Black Americans Bear The Brunt As Virus Spreads,” NYT, 8 April 2020.

[6] J. David Goodman, “It Could Be Years Before New York Regains Its Glory,” NYT, 21 April 2020.

The Man Who Saved a Billion Lives.

In the 19th Century, a lot of Norwegians migrated to places like Minnesota, Iowa, and the Dakotas to make a living as farmers. Tough, hard-working, close-mouthed, decent people. Norman Borlaug (1914-2009) fit the stereotype. He grew up during the Depression and worked his way through the University of Minnesota, earning a B.S. in forestry (1937). Along the way he got interested in plant diseases, so he went on to earn a Ph.D. in plant pathology and genetics (1942).

Borlaug spent most of the Second World War on research work for DuPont down in Wilmington. In 1944 his old Ph.D. adviser recruited him to work on improving wheat harvests in Mexico. Borlaug spent sixteen years in Mexico developing disease-resistant strains of wheat. Along the way he had to overcome resistance from incompetent, lazy, or anti-foreign bureaucrats. He also had to persuade farmers to try something new when they were both wedded to tradition and fearful that a failed experiment would leave them to starve. He persevered. The seeds developed by Borlaug both yielded high returns of grain and resisted disease. A bunch of his developments were impossible in theory, but possible in practice. (So much for Rene Descartes.) Largely as a result of Borlaug’s work, the yield of Mexican wheat rose five-fold between 1950 and 2000. Mexico went from being a wheat-importer in the 1940s to being a wheat-exporter by the 1960s while feeding a much larger population.

In the early 1960s developing countries all over the world were struggling with rapid population growth. (See: The Population Bomb.) How were they to feed their people? Agricultural scientists in India and Pakistan got their governments to call in Borlaug. Borlaug had to overcome all the same difficulties that he had encountered in Mexico, with the added problem that India and Pakistan were at war with each other for part of the time. He persevered. As a result of Borlaug’s work, the yield of Indian and Pakistani wheat quadrupled between 1960 and 2000. Other countries in Latin America, the Middle East, and Africa then copied the Borlaug seeds. Then Asian governments applied his basic approach to producing high-yield, disease resistant rice instead of wheat. The huge increase in food production in countries that once faced the certainty of mass-death from famine has come to be called the “Green Revolution.”

In 1970 Norman Borlaug won the Nobel Peace Prize. When the committee called Borlaug at home to inform him, his wife said that he had already gone to work. It was 4:00 AM.

Later on, from 1984 on, Borlaug taught at Texas A&M University.[1]

Critics have found much to dislike in the effects of Borlaug’s work. They denounce the shift from subsistence farming to single-crop agriculture because it makes people dependent on the capitalist market. They denounce the reliance on scientifically-bred seeds and fertilizers and tractors and irrigation systems because it creates profits for American corporations. They dislike genetically-modified foods because they seem unnatural.

Borlaug replied: “They’ve never experienced the physical sensation of hunger. They do their lobbying from comfortable office suites in Washington or Brussels. If they lived just one month amid the misery of the developing world, as I have for fifty years, they’d be crying out for tractors and fertilizer and irrigation canals and be outraged that fashionable elitists back home were trying to deny them these things.”

Borlaug was a tough, hard-working, close-mouthed, decent man. It has been estimated that about a billion people didn’t starve to death because of his work.

[1] The “A&M” stands for “Agricultural and Mechanical.”   Once upon a time, we had a different vision of education.

The Plagues Next Time.

Somebody (Stephen Colbert?) once joked that “Reality has a well-known liberal bias.” Actually, reality has a well-known bias in favor of human reason. Reason, in turn, is pretty much non-partisan and available to anyone who cares to develop it. Of course, one problem is that not everyone is a willing consumer.

Antibiotics.[1] Bacteria cause infections and spread infectious diseases. Infections and infectious diseases used to kill many people. Even with sterile operating rooms, for example, the danger of post-operative infection made even an appendectomy a hazardous procedure. At the dawn of the 20th Century, scientists and doctors combined to launch a medical revolution. They developed antibiotics like penicillin to fight infections. All sorts of perils were suddenly conquered. Antibiotics made a vital contribution to the dramatic rise in life expectancy during the 20th Century.

Now we face a potentially devastating return of infectious diseases. The origins of this menace are complex, rather than simple and easily addressed. First, bacteria are living things that adapt to their environment. Some bacteria are hardier than others when it comes to resisting antibiotics. These hardy bacteria can develop mutations that make them more resistant to antibiotics, so they multiply while the less-resistant strains of bacteria are wiped out. (See: Darwin and his “theory” of Evolution.) Two factors have greatly facilitated this development. On the one hand, idiot doctors prescribe antibiotics in the wrong circumstances and idiot patients who get prescribed antibiotics often stop taking them before they have completed the full course. This wipes out weaker bacteria while leaving stronger bacteria to multiply. Once there are enough of the resistant bacteria in the system, the existing types of antibiotics no longer work. On the other hand, “factory farming” of livestock involves massive use of antibiotics in the feed for these animals. Eighty percent of antibiotics are used on “factory farms.” This creates a hot-house environment for the mutation of bacteria. Oops.

Second, pharmaceutical companies lose money on new antibiotics to fight the new “superbugs” that are developing. People only take antibiotics when they have a bacterial infection. That is a rare occurrence compared to what it was before antibiotics were developed. Moreover, the sales price of antibiotics is low. Taken together, these factors make for a thin revenue stream from antibiotics. However, antibiotics are very expensive to develop. The average antibiotic loses $50 million for the company that develops it. In contrast, drugs to treat chronic conditions (diabetes, high blood pressure, can’t-get-it-up-with-a-crane) are taken on a constant basis over a long period of time. They are money-spinners. So, no important new antibiotics have been created since 1987.

How do we avoid this train wreck? First, give the pharmaceutical companies a reason to create new antibiotics. (I know: “They make enormous profits! They should do this out of the goodness of their souls!” They won’t and the “public option” beloved of “progressive people” = the Veterans’ Administration + Solyndra.) Extend the length of time that companies have patent protection for their antibiotics. This will keep low-cost producers from churning out generics. Second, subsidize the companies with tax-credits when they develop antibiotics. Third, put a stop to the abuse of antibiotics by idiot doctors and patients, and by factory farms.

 

Vaccination.[2] One idea behind vaccination is to wipe out diseases among young people. As the diseases are wiped out, they cease to pose a threat to older people even as the effects of the childhood vaccinations wear off with time. Fine, so long as hardly anyone misses out on vaccinations. However, that is just what is starting to happen.

In 1998 Dr. Andrew Wakefield published a scientific study claiming that the development of autism in twelve children could be linked to the standard vaccination against measles, mumps, and rubella. Naturally, many parents became alarmed. A subsequent inquiry demonstrated that the study was a fraud. Many subsequent studies have demonstrated that there is no connection between vaccination and autism. Too late! The suspicion/belief that vaccination is dangerous had become entrenched among a large and growing segment of parents. Why did this happen? In part, because of a 300 percent increase in the number of cases of autism that were diagnosed between 2002 and 2013. Although scientists suspect that autism arises from a mixture of genetic and environmental factors, the “anti-vaxxers” aren’t buying this explanation. Today, about ten percent of parents either postpone scheduled vaccines or claim a “personal belief” exemption to prevent their children from receiving vaccinations.

Who are the “anti-vaxxers”? Their ranks include pure-life progressives who reject both vaccines and genetically-modified foods; libertarians who see good health as just one more federal intrusion on their God-given right to watch their children cough their lungs out; and the descendants of the Scopes “monkey trial” rural conservatives.

What do “anti-vaxxers” believe? They believe that immunization can cause disorders and/or that so many vaccinations—16 is common—can “overwhelm” the body’s natural resistance to disease and expose children to diseases. There is NO evidence for any of this.

There is abundant evidence that reducing the number of vaccinated children then exposes adults to diseases from which they had thought themselves safe. In 2012, 50,000 Americans came down with whooping cough, by far the largest number in fifty years. Eighteen people died. In 2013 the number of cases of measles (OK, 190) was three times higher than in 2012.

Where do I go to get away from the people who want to get away from the Federal government? Idaho?

[1] “The antibiotic crisis,” The Week, 22 November 2013, p. 9.

[2] “The return of childhood diseases,” The Week, 7 March 2014, p. 9.

 

The History of AIDS.

For tens of thousands of years, some types of monkeys in West Africa have lived with a virus that attacks the immune system. Two different strains of it developed, one in Cameroon and one in the Senegal-Ivory Coast region. It wasn’t fatal because it was a “weak” virus. The immune system counter-attacked and fended off the virus. To oversimplify, it couldn’t evolve because the immune system jumped on it before the virus could make genetic changes. For the virus to evolve fast enough to become dangerous, it would have to be passed rapidly from one host to another, making changes along the way, before the immune system hammered it flat.

So, no big deal even for diseased monkeys. However, American Indians hunted buffalo for food and Africans hunted “bushmeat” for food. Monkeys, tapirs, stuff like that. Then you had to skin them out to make steaks, chops, Kentucky Fried Tapir, etc. Butchers got nicked by knives and covered in (infected) blood along the way. So, the monkey immunodeficiency virus got into humans through open cuts. Still, this was not a big deal for most people because most Africans lived in isolated and not-very-large communities. That is, they lived in villages in the middle of forest clearings. The human version of the monkey immunodeficiency virus, HIV, didn’t transmit very well because it needs to enter the bloodstream in some fashion. Even normal sex won’t do it most of the time. So, it settled into humans like it had settled into monkeys, and the human immune system fell on it.

So, no big deal even for diseased humans. However, European imperialists took over West and Central Africa in the late 19th Century. They tried to turn the place into a paying concern by starting plantations, building ports and railroads and warehouses, and corralling a lot of African labor. Port cities, railroad towns, and mining camps sprang up. All had large African working populations and small European ruling populations. People kept pouring in from the countryside. Most were men, but a minority were women.

The truth of the matter is that guys will pay for sex. Prostitution thrived in the towns of Central Africa. So did venereal diseases like syphilis. These diseases cause genital ulcers.[1] Such ulcers greatly facilitated the rapid spread of HIV into the human bloodstream. Rapid transmission allowed rapid genetic evolution into the fearsome plague we know today. Widespread vaccination of workers against smallpox compounded the problem because the colonial medical authorities did not sterilize needles between injections in order to save money.[2]

So, a big deal for African workers, but no big deal for anyone else. However, in the mid-1960s, HIV reached the Western Hemisphere and Europe. In all likelihood, a merchant seaman carried it from one port to another. In the early 1960s a young Norwegian merchant sailor spent time in West Africa. He came down with gonorrhea and was infected with HIV. By 1968 he had abandoned the sea to work as a long-haul truck-driver in Europe. He often had sex with prostitutes while on trips. He died in 1976. In 1966 a boy in St. Louis contracted HIV by some unknown means and died of it in 1969. Was there a closeted gay man working in tropical medicine at one of the St. Louis universities? HIV/AIDS was still unknown outside Africa at the time and American and European doctors were stumped. About the same time some still-unidentified person traveled from the newly-independent Democratic Republic of the Congo to Haiti. S/he carried HIV. An epidemic soon began in Haiti. From there it was communicated in much greater numbers to the United States. By 1981 doctors had begun to identify an HIV/AIDS epidemic.

 

[1] See: US Army training films during the Second World War.

[2] I’m not making this up. Alas.

Big Pharma.

What we think of as medicine is a fairly new development. Doctors used to be able to set broken bones, sew up cuts, lop off limbs, and give you an emetic. This changed in the later 19th Century, thanks to the addition of chemistry to medicine. Anesthesia and disinfectants made invasive surgery possible. No screaming, no gangrene. Then insulin (1921) and penicillin (1928) were discovered. Direct chemical treatment of disorders became possible. After the Second World War, scientific research was applied in a systematic way to expanding knowledge of biology, and techniques for producing drugs improved. The results of this combination appeared in a flood of new drugs and the growth of huge pharmaceutical companies. The new products included oral contraceptives, blood-pressure medicines, and psychiatric drugs. Cancer drugs began to come on-line in the 1970s. More recently, there have been drugs to treat cholesterol, acid-reflux, and asthma, as well as Viagra, and anti-depressants for when that doesn’t work. Then there is the terrible plague of male pattern baldness.

There have been several important developments in my life-time.

First, rules for medical trials became more elaborate and restrictive. Between 1957 and 1961 doctors prescribed a new tranquilizer, thalidomide, to pregnant women to counter morning sickness. Unfortunately, thalidomide caused terrible birth defects. In 1962 Congress amended the law governing the Food and Drug Administration to require that new pharmaceuticals prove not only safety, but also “efficacy”: the ability to produce a specific desired effect (and not some other effect or no effect) before a drug could be released. In 1964 the World Medical Association established rules requiring testing before the release of any new drug. Again, pharmaceutical companies were required to prove “efficacy.” These reforms greatly extended the time and cost invested before drugs were released. In recent years, medical crises—like heart disease and AIDS—have created a countervailing pressure for accelerated testing and approval.

Second, the pharmaceutical business became highly concentrated and vertically integrated. (These are business terms, but it is a business. You should learn what they mean, although I didn’t when I was your age.) As pharmaceutical research sought treatments for complicated illnesses, research and development became more expensive. As research and development became more expensive, companies faced a greater risk that they would not be able to cover their costs before any patent ran out. Therefore, during the 1970s many countries passed laws strengthening the patents issued to pharmaceutical companies and extending their time limits. These were intended to prevent generic producers from just figuring out the chemical basis of a drug, then producing it without having to bear the high costs of research. During the 1980s a wave of “buy-outs” of small bio-tech firms by big pharmaceutical companies took place. Today most pharmaceutical research, production, and sales are concentrated in fewer than twenty large companies. These companies are based in the United States, Britain, France, Germany, and Switzerland, although they operate internationally. This is called “concentration.” Each of these companies researches, develops, manufactures, and markets its products. This is called “vertical integration.” Critics refer to this complex as “Big Pharma.”

Most prescription drug use takes place in a few rich countries (US, EU, Japan). Don’t get sick somewhere else. (My son’s room-mate got bitten by a rabid dog while in Bolivia one summer. He had to fly home to get the injections to save his life. What if he had been Bolivian?) However, China, Russia, and South Korea expanded sales by 81 percent in 2006. Pharmaceuticals are already the most profitable business in America. Now a big money harvest looms in the developing world.

Give my knees to the needy.

Organ transplantation.

In the 7th Century BC,[1] a Chinese physician named Bian Que tried transplanting the heart of a strong-willed commoner into the body of a weak-willed emperor.

During the late 19th Century surgeons finally developed the technical ability to conduct operations (knowledge of how the body functioned, anesthesia, antiseptics) and this made transplants possible. However, it took much longer to develop the ability to prevent rejection of the implanted organ by the body’s immune system. Thus, the transplanted hands of “The Hands of Orlac” (1924) weren’t. Lung (1963), liver (1967), and heart (1967-1968) transplants were “successful” in the sense that the patients lived for weeks to months after the operation. In the early 1980s the introduction of the immuno-suppressive drug cyclosporine finally permitted successful transplantation to begin. Since then transplants have become common: hearts, lungs, kidneys, livers, pancreases, hands, facial tissue, and bones have all been transplanted. No brains, yet.

The mismatch between donors and recipients.

Generally, there are more sick people in need of an organ than there are dead people with healthy organs for “harvesting.” While the growth of organ transplantation has extended many lives, people often die waiting for an available organ. National medical systems have developed ways of determining who gets priority.

However, there are two issues to bear in mind. First, national boundaries create barriers between donors and recipients. Second, as we have seen in so many other areas, great differences of wealth and income between different parts of the world lets buyers in rich countries get what they want in poor countries. People with money who want to jump the line can seek organ transplants abroad. One outcome of globalization has been to create a market in organs for transplant.

The global trade in organs.

Some Asian countries used to have a legal market in organs: India (until 1994), the Philippines (until 2008), and China (to this day) all allowed the legal sale of organs. Sometimes governments participate in this trade. An estimated 90 percent of the organs from China are taken from criminals executed in prisons. (They used to shoot them in train stations.)

There is also a thriving black-market in organs. The average price paid to a donor for a kidney is $5,000, while the average cost to the recipient is $150,000. When the Indian Ocean tsunami wrecked many fishing villages, about 100 villagers—almost all of them women—sold kidneys. According to one report, 40-50 percent of the people in some Pakistani villages have only one kidney. “It’s a poverty thing. You wouldn’t understand.”

Both the desire to circumvent the laws at home and the need to be close-by when an organ becomes “available” have stimulated “medical tourism.”

Finally, there is the alleged problem of “organ theft.” Given a shortage of voluntary donors, it has been suggested that some middle-men may turn to theft or murder. This is a common theme in horror movies and urban legend. It doesn’t have much truth behind it. Which isn’t the same as saying it doesn’t happen at all. “Hey buddy, can you give me a hand?”

[1] I can just see the Three Wise Men—one of them played by Buscemi—impatiently flipping through the calendar in 1 BC, marking off the days until Jesus would be born, trying to get a cheap flight, then getting told that Bethlehem’s inns are all booked solid: “Zoro-H-Aster! What are we supposed to do, stay in a manger?”

Fries with that?

What do we talk about when we talk about “Americanization”?  Are we talking about the spread of the American model through compulsion or seduction? Or are we talking about the Americans getting someplace first when everyone wants to go there? The global spread of obesity offers an example. In the United States the daily per capita consumption of calories has increased by 600 calories since 1980. Correspondingly, the share of overweight adults in the population has increased from 47 percent in 1980 to 64 percent in 2003. Since 1980 Americans have taken an increasing share of their meals from restaurants and take-out food. These meals tend to have about twice as many calories as does the typical meal Mom puts on the table. Nutritionists estimate that this accounts for about two thirds of the additional weight gained by Americans in the last quarter century. As late as 1991, generalized obesity was narrowly restricted geographically: only Michigan, West Virginia, Mississippi, and Louisiana had 15-20 percent of their populations classified as obese. By 1995 24 states had 15-20 percent of their adult population classed as overweight. By 2000 22 states had at least 20 percent of their populations classed as obese, and every other state except Colorado had at least 15 percent of its population classed as obese. “Obesity is often discussed as an American cultural phenomenon, closely intertwined with a taste for fast food, soft drinks, television, and video games.” This is probably what “Americanization” means in the eyes of Frenchmen and Islamist jihadis.

There have always been more underweight people than overweight people in the world. That gap has closed over time, however, and in 2000 it ceased to be true.   Falling food prices for consumers, the shift from rural to city life for many people, the substitution of desk-work for field-work, and the purchase of processed foods are world-wide trends. Obesity has emerged as a social characteristic in developing countries: 15 percent of adult Kenyan women are overweight compared to 12 percent who are underweight; 26 percent of adult Zimbabwean women are overweight compared to 5 percent who are underweight; 71 percent of adult Egyptian women are overweight compared to 1 percent who are underweight; and 29 percent of children in urban areas of China are obese.

This is largely attributable to the end of the other “oil crisis”: in recent years cheap, high-quality cooking oil has become available in developing countries for the first time. The oil contains dietary fat that has raised the caloric intake of individuals by 400 calories per day since 1980. But the increased use of cooking oil also reflects the increasing availability of meat as incomes rise around the world. For example, the ordinary Chinese diet used to rely very heavily on starchy roots, rice, and salted vegetables. Since 1980 the Chinese diet has added a lot of meat fried in oil. Most people now consume at least 2,500 calories per day.

There are still some places with “old” nutrition problems: 6 percent of the adult women in Cambodia are overweight compared to 21 percent who are underweight; 4 percent of adult Bangladeshi women are overweight compared to 45 percent who are underweight.

This has some large implications for public health. Excess weight has been associated with illnesses like diabetes and heart disease. Poor countries lack the medical systems to deal with these sorts of problems, which are new to them. An obesity epidemic is on the way. Does that mean that Weight Watchers will become an international phenomenon?

Don Peck, “The World in Numbers: The Weight of the World,” Atlantic, June 2003, pp. 38-39.