2006 Dodge Ram SRT10: Overpriced, Overpowered and Proud of It
By EZRA DYER
This truck doesn’t care about mileage, or about refinement, or about not scaring children. This truck could be more politically incorrect only if it ran on whale oil and panda tears. And that’s why I like it, sort of. The SRT10 is not a truck of half-measures — it’s gleefully over-the-top in just about every way. Its 510-horsepower engine, borrowed from the Dodge Viper, is a callous brute, rocking the truck on its suspension even at idle.
At full throttle, the four-speed automatic shifts so hard that you expect to see it in the rearview mirror, scattered about the pavement. Twin tailpipes emit a guttural roar pretty much constantly, but you can drown them out with the 508-watt Infinity stereo. There’s a deep air dam in the front, a hood scoop that wears a “Viper Powered” badge and a spoiler perched atop the tailgate. The tailgate spoiler gets my vote as the new reference to complete the phrase “As American as ...”
We are a people who create downforce-producing aerodynamic devices for pickup trucks, because our pickup trucks go so fast that they’d otherwise fly right into the air like the magic car in “Harry Potter and the Chamber of Secrets.” You listening, Al Qaeda? You may as well just give up right now.
From a practical standpoint, this thing is a tough proposition. On one hand, it has four doors and can tow up to 8,150 pounds. Wonderfully unburdened by a speed limiter, the two-door version does 150 miles an hour, and this one probably approaches that distinctly untrucklike velocity too. So it’s potentially useful if you like to tow boats down the autobahn.
On the other hand, the sticker on my test truck totaled $57,460, which included a $1,595 navigation system and a $1,200 rear-seat DVD entertainment system, among other niceties. For that stack of cash, you could have a Hemi-powered regular-cab Ram 1500 and a Mercedes-Benz C230 sedan.
Or — I just looked this up — you could buy 20 acres of ranch land in Montalba, Tex. But I have a feeling that if you’re really interested in buying a $57,000 pickup truck, you might already own Montalba, Tex.
The SRT10’s ride-handling balance is tilted in favor of comfort, probably in deference to the fact that an object this immense is never going to give a Lotus a run for its money in the twisties, no matter how stiff the suspension. Like a classic muscle car, it gets confused by corners.
I’ll wager that if the truck had merely gargantuan wheels (say, 20-inchers), it would keep its tires in better contact with the pavement. As it is, the springs and dampers struggle to control the weight of the huge 22-inch wheels at each corner. Trying to make a suspension work properly with 22-inch wheels is like fashioning a yo-yo out of a bowling ball and some string. They do look nice, though.
Now, I know that thrifty fuel economy isn’t a priority here. I also know that your garden-variety 4x4 pickup will probably never see 20 m.p.g. But the Ram SRT10 rivals a torched oil well for sheer profligacy — and, oh yes, it demands premium.
I managed mileage in the double digits only because I made a highway-heavy road trip. Around town, I was getting 7 or 8 m.p.g. With more than a quarter-tank of fuel remaining, I gassed up to the tune of $74.28. If I’d come close to running the 34-gallon tank dry, the big red truck would have had the dubious distinction of being my first $100 fill-up.
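A quick back-of-the-envelope check bears out that math (a rough sketch, assuming just over a quarter-tank, call it 8.5 of the 34 gallons, remained): the pump took at most about 25.5 gallons, so $74.28 works out to roughly $2.91 a gallon, and draining the tank dry would have meant 34 gallons at that price, or about $99.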
One advantage of the Ram’s pavement-crushing weight and cinder-block aerodynamics is that those are two identifying characteristics of a truck, and trucks are not subject to the federal gas-guzzler tax. So while the Viper and its 20 m.p.g. highway rating get hit with the guzzler label, a Ram SRT10 with the same engine but much worse fuel economy does not. But nobody ever said life was fair for Viper owners.
The Ram SRT10 Quad Cab may not make fiscal or ecological sense, but I appreciate the chutzpah it represents. Nobody else was making a four-door truck with 510 horsepower, so the Dodge people took it upon themselves to fill the void.
Mission accomplished, it seems: after a three-year run, 2006 is the last year for the Viper-powered pickup. The SRT10 Quad Cab is the truck to end all trucks, including itself.
Sunday, August 27
How Do You Take a Gun Away?
By JAMES TRAUB
Can Hezbollah be disarmed? The United Nations Security Council, the major Western powers and the government of Lebanon have all called for the Shiite militia to be shorn of its weapons. But how? And by whom? When it approved the terms of the cease-fire on Aug. 16, the Lebanese cabinet stipulated that its army would not take Hezbollah’s weapons away. United Nations officials have said that the international force that is to join the Lebanese Army in southern Lebanon would not do so, either. And militia leaders insist that they will not voluntarily lay down their arms. That doesn’t leave too many options, does it? And yet if Hezbollah is not disarmed, all of the appalling destruction that Israel visited upon Lebanon and suffered in its own territory may have accomplished nothing, and the bloodshed just concluded may be only the prelude to something yet worse.
“Disarmament,” like “peacekeeping,” is a confident-sounding coinage for an improbable activity. The murkiness of the language governing the conflict in Lebanon is, in fact, endemic to the activity itself. What does it mean to disarm? Is it a reflexive verb — a thing you agree to do to yourself? Or is it a thing done to you?
Victors in war, of course, forcibly disarm the losers — as the Allies did to the Germans and Japanese after World War II and as the United States did to the defeated Iraqi Army in 2003. But in a war that ends without decisive victory, or in civil conflicts, forcible disarmament is often impossible. The fighting force must more or less agree to disarm itself.
And disarming is the easiest part. Fighters who yield up their weapons must then be demobilized, meaning not only that they have to be mustered out but also that the organization’s command-and-control structure must be eliminated. And then, perhaps most crucially of all, as the Bush administration discovered to its pain in Iraq, those soldiers must be reintegrated into civilian society, or into the national army, so that the rewards, or at least potential rewards, of peace outweigh those of violence. Professionals thus refer to the entire activity as disarmament, demobilization and reintegration, or D.D.R.
Disarmament, like peacekeeping itself, offers a set of time-tested, codified practices that are quite effective under certain political conditions and futile in their absence. In 2000, I visited the dusty town of Port Loko, in Sierra Leone, to see a “disarmament camp,” a desultory affair in which a knot of surly ex-rebels from a murderous force known as the Revolutionary United Front hung around waiting for $300 payments meant to enable their fresh start as farmers. Most of them still wanted to fight, and many probably returned to the bush. But then their leader was arrested, U.N. peacekeepers equipped with heavy weapons were deployed in the countryside and the R.U.F. signed a peace deal. D.D.R. resumed in earnest in 2001, the R.U.F. disbanded the following year, and by 2004 the rebels had been fully disarmed. U.N. peacekeepers were able to leave. Sierra Leone is now patrolled by its own army and police force, though the country’s desperate poverty and political fragility could tip it back into warfare at any time.
Kosovo provides another more-or-less-happy disarmament situation. After a relentless NATO bombing campaign in 1999 compelled Serbian troops to withdraw from the province, a NATO force filled the vacuum. But the home-grown militia, the Kosovo Liberation Army, viewed itself as the true author of the victory and thus was in no mood to surrender its weapons. With the K.L.A., which was itself guilty of widespread ethnic cleansing, prepared to become a resistance force, or possibly a national mafia, peacekeeping officials made the audacious decision to enroll its members in an unarmed national guard, the Kosovo Protection Corps. And though in its early years the K.P.C. was found to be secretly stockpiling arms and was accused of serious human rights violations, the experiment has largely worked. The chief reason for its success is that K.P.C. members have not truly been demobilized; they have been permitted to keep their command structure intact and fully expect to become the nucleus of a national army when Kosovo gains its independence. There has been some talk of applying the Kosovo model to Hezbollah, by absorbing the militia either into the Lebanese Army or into a new national guard.
Kosovo and Sierra Leone worked not because peacekeepers got disarmament right but because the politics were right, or because the balance of force was favorable to peacekeepers. Otherwise, disarmament fails. In Congo, for example, aggressive and well-armed U.N. peacekeepers largely disarmed the ragtag militias in the northeastern region of Ituri (though owing to the government’s almost complete failure to prepare the ex-rebels for civilian life, violence has returned to the area). But equally determined peacekeeping troops made very little headway against the tougher and better-equipped force of Rwandan Hutus who have been wreaking havoc in eastern Congo since they fled across the border after the 1994 genocide. The Hutus agreed to return to Rwanda if they were allowed to organize as a political party, but President Paul Kagame flatly rejected the demand. The United Nations could thus neither intimidate the rebels nor offer them a better deal than the one they already had, pillaging the Congolese countryside.
What is true of the Rwandan force is true yet more of Hezbollah. Israel launched its air, land and sea attack on Lebanon with the goal, as Prime Minister Ehud Olmert put it, of “disarming this murderous organization”; in that regard, the campaign failed. How, then, could any lesser force succeed? Lebanon’s defense minister, Elias Murr, has defended Hezbollah and flatly asserted that the Lebanese Army “is not going to the south to strip Hezbollah of its weapons and do the work that Israel did not.” Neither will a U.N. peacekeeping force, however large. “You cannot impose peace on these people if they’re ready to fight you,” as a D.D.R. expert in the U.N.’s peacekeeping department puts it. “You need to be able to annihilate them, because they’re not going to lay down their arms voluntarily.” Even robust United Nations forces do not seek to annihilate their adversaries.
If Hezbollah cannot be forcibly disarmed, can some political arrangement induce the militia to disarm itself? This, of course, raises a question about Hezbollah’s aspirations: is it seeking to achieve through force a goal that can be attained through diplomacy, or through political activity? That this is in fact the case is the unspoken premise of United Nations Resolution 1559, passed in 2004, which sought to release Lebanon from the suffocating grip of Syria, and thus to begin a national dialogue that would ultimately lead to the incorporation of Hezbollah into Lebanese affairs.
Here you can look to a very different precedent: the voluntary disarmament in 2005 of the Irish Republican Army. Like Hezbollah, which has legislators and ministers in the Lebanese government, the hard-core Catholic resistance to British rule in Northern Ireland had a military wing, the I.R.A., and a civilian one, known as Sinn Fein. This was, for decades, a distinction without a difference, for the movement as a whole was committed to forcing out the British by calculated acts of violence. Starting in the early 1990’s, and then with increasing intensity with the election of Tony Blair as prime minister in 1997, the British government tried to induce the I.R.A. to lay down its arms by offering a political path to greater self-determination. Great attention was devoted to the mechanics of disarmament. In 1998, the British and Irish governments established the Independent International Commission on Decommissioning to oversee and verify the disarmament process. But the I.I.C.D. was able to do little so long as the tortuous negotiations over power sharing kept collapsing into acrimony and violence. The I.R.A. would declare a cease-fire amid great ceremony and optimism, then pull the plug with a spectacular act of violence.
Finally, after the terrorist attacks carried out by Islamic extremists on the London subway in July 2005, Blair made a series of gestures to the I.R.A., and the group responded by definitively vowing to cease all military activity. Fighters deposited rifles, machine guns, chemical explosives and even surface-to-air missiles at secret locations in the Republic of Ireland, with Catholic and Protestant clergymen brought in as witnesses. In September, the I.I.C.D. certified that “the I.R.A. has met its commitment to put all its arms beyond use.” (The group has, however, been accused of continuing to use violence for criminal, rather than political, ends.)
At the time, a columnist in The Times of London explained how the underlying dynamic had changed: “Sinn Fein was once the political wing of the I.R.A.; in the course of the past decade, the I.R.A. has become the paramilitary branch of Sinn Fein. A paramilitary organization can choose whether or not it has a political manifestation. A political organization in a Western democracy cannot, ultimately, choose whether or not it has a paramilitary offshoot.”
Should the parties to the violence in Lebanon work toward a similar demilitarization of the struggle with similar disarmament mechanics? Ben Zogby, the son of the Lebanese-American pollster John Zogby, recently made just this suggestion in The Huffington Post. Zogby proposed, as many others have, a political deal to grant Hezbollah its demands — a swap of prisoners, the withdrawal of Israeli troops from the disputed Shebaa Farms, “adequate representation of the politically disenfranchised Shia community” in Lebanon — all overseen by a new Commission on Decommissioning.
Certainly the I.R.A. precedent shows that even brutal paramilitary groups can ultimately be persuaded to lay down their arms. But it will prove relevant only if Hezbollah has demands that can be satisfied by a political process, so that over time its fighting force will dwindle into “the paramilitary branch” of its political wing, and former soldiers will accept reintegration into civilian life. Hezbollah does, in fact, aspire to gain “adequate representation” for Shiites inside Lebanon, as the I.R.A. did for Catholics in Northern Ireland. But this is scarcely its raison d’être. Hezbollah has used its weapons on Israel, not on the government of Lebanon; and it fights Israel with the professed goal of destroying it. If we take Hezbollah at its word, disarmament can come only in the wake of apocalyptic triumph.
Of course, just because you can’t see your way to a long-term solution doesn’t mean you dispense with short-term palliatives: what can’t be solved can often be postponed (a nostrum the Bush administration might wish it had invoked in the case of Iraq). The thousands of Lebanese and international troops who will be inserted between the combatants should provide at least an interval of calm. While the peacekeepers cannot disarm Hezbollah, their mandate requires them to prevent rearmament by blocking the militia’s Syrian supply routes. This, in turn, could persuade Israel to stay its hand. And diplomacy could then have time to lay solid foundations before the whole rickety structure gives way.
James Traub, a contributing writer, is the author of “The Best Intentions: Kofi Annan and the U.N. in the Era of American World Power,” due out in November.
Saturday, August 26
Images of War
The Power Joe Rosenthal Knew
By Susan D. Moeller for the Washington Post
Saturday, August 26, 2006; Page A21
"Correspondents have a job in war as essential as the military personnel," wrote Gen. Dwight D. Eisenhower in a memorandum drafted in the worrisome days before the Normandy invasion. "Fundamentally, public opinion wins wars." One of the greatest weapons in the World War II arsenal turned out to be a photograph -- the image taken by Associated Press photographer Joe Rosenthal of five Marines and one Navy corpsman raising the flag over Iwo Jima.
That image told of men, in the midst of cataclysm, together planting a symbol of America on contested ground. At a time when images of dead and wounded Americans were being published with regularity in the U.S. press, the photograph from Mount Suribachi celebrated a heroic moment on the front lines. It became the signature icon of the war, a photograph fortuitously taken, as Joe Rosenthal has often described it, and immediately seized upon by those leading the war effort back in the United States.
Countless publications duplicated the image. It was reproduced on a postage stamp, made into a statue, copied on untold numbers of commemorative items and turned into a Hollywood movie plot. Joe Rosenthal's photograph not only gave Americans back home an image of what was happening on the front lines, it persuasively argued that Americans were winning.
Rosenthal died last Sunday at the age of 94. When I interviewed him in the mid-1980s for a book I wrote on American war photography, he argued that he had no problem with his photograph being adopted as the icon of the war. What mattered, he said, was that the essential truth his image captured had not been altered. World War II was the "good war." And Americans were the liberators.
Managing images to elicit supportive public opinion in wartime was understood as essential long before World War II -- it's simply the method of management that has changed. Napoleon III, during his mid-19th-century reign in France, censored caricature more harshly than the written word -- in a time of low literacy, political cartoons were intelligible to all. The famed World War I photographer Jimmy Hare, who took pictures of the dead on the Italian front, wrote about being more stymied by the censors than his reporter colleagues were, and noted that "to so much as make a snapshot without official permission in writing means arrest."
In 1965 CBS correspondent Morley Safer enraged the military and the Johnson administration by showing footage of Marines burning the thatched roofs of the village of Cam Ne with Zippo cigarette lighters. Although similar reports had been routinely documented in the print media, the visual effect of the television coverage so irritated President Lyndon Johnson that he is said to have awakened Frank Stanton, the president of CBS, with the demand "Are you trying to [expletive] me?"
In June 1986 the South African government tightened existing press restrictions with new guidelines cannily calculated to frustrate photographic coverage of disturbances throughout the country. Although reporters could still write about the violence in the townships and elsewhere, the apartheid story disappeared from the air when the only images available became file footage.
President George H.W. Bush's method for controlling and retaining public support during the Persian Gulf War was to put a moratorium on journalists filing from the front lines and to filter the theater's information through official news conferences. Only a handful of "combat" images ever made it past the censors. Since then the spinning of images has continued to accelerate. The climactic event of the taking of Baghdad in April 2003, the bringing down of the Saddam Hussein statue, turned out to be an elaborate photo op. So too did the rescue of Pfc. Jessica Lynch.
Images are powerful indicators of victory and defeat. The war on terrorism and the shooting wars in Iraq and Lebanon are increasingly being played out through images in print, on television and online. Blogs post photos of an angry President Bush and juxtapose them with those of a smiling Hasan Nasrallah, Hezbollah's leader. Cable news programs show pictures of bleeding civilians in the streets of Iraq, which reverberate ominously after video images of British police patrolling Heathrow airport.
It's tempting to think that it's only in our brave new age of digital cameras and video phones, of 24-hour news channels and satellite uplinks, that images have mattered as much as they do -- that because we can see more images from literally anywhere in real time, images somehow have gained in power relative to the humble word. It's not true.
What is true is that images are no longer appropriated only after they are taken; they have become an intrinsic part of military strategy. One indication? In last month's fight with Israel, Nasrallah coordinated the timing of Hezbollah's missile attack on an Israeli warship with his on-air speech to the Lebanese public announcing the attack. Maybe it is a brave new world after all.
The writer is director of the International Center for Media and the Public Agenda at the University of Maryland at College Park and the author of "Shooting War: Photography and the American Experience of Combat."
Monday, August 21
When the Hacks Run the Show (from Rahm Emanuel and Bruce Reed's THE PLAN)
Strip away the job titles and party labels, and you will find two tribes of people in Washington: political Hacks and policy Wonks. Hacks come to Washington because anywhere else they'd be bored to death. Wonks come here because nowhere else could they bore so many to death.
After two decades in Washington, we have come to the conclusion that the gap between Republicans and Democrats is as nothing compared to the one between these two tribes. We should know. When we began working together in the Clinton White House, we came from different tribes—one of us a Hack, the other a Wonk. (We're not telling which.) We made a deal to teach each other the secrets, quirks, and idioms of our respective sects.
Throughout history, Hacks and Wonks have been the yin and yang of politics. But in the last few years, something terrible has destroyed our political equilibrium. The political world suffered a devastating outbreak of what might be called Rove Flu—a virus that destroys any part of the brain not dedicated to partisan political manipulation. Now, Hacks are everywhere. Like woolly mammoths on the run from Neanderthals, Wonks are all but extinct.
Although Hacks have never been in short supply in our nation's capital, the rise of one-party rule in Washington over the past four years unleashed an all-out Hack attack. Every issue, every debate, every job opening was seen as an opportunity to gain partisan advantage. Internal disagreement was stifled, independent thought discouraged, party discipline strictly enforced—and that's just how they treated their friends.
The Bush White House was so obsessed with how to profit politically from its agenda that it never even asked whether its policies would actually work. It should come as no surprise that they didn't.
Perhaps the best recent example of paint-by-number politics was the Medicare prescription drug bill. One prominent Hack, Tom Scully—then an assistant secretary at HHS, later a health care lobbyist—allegedly threatened to fire Richard Foster, a career government actuary, if he revealed how much the prescription drug bill would explode Medicare spending.
Remember the good old days when Republicans went to jail for covering up burglaries and conducting covert wars against communism? Now they're under fire for covering up massive social spending. No wonder conservatives are unhappy. It's as if Oliver North were running a secret Head Start program in the White House basement.
President Bush served as Hack-in-Chief even when he studiously pretended not to be doing so. He came into office promising to be a compassionate conservative, soon left us yearning for a competent conservative, and seems destined to be remembered for presiding over the heyday of the corrupt conservative.
Republicans have learned the hard way that the American people are a lot smarter than either the Hacks or the Wonks imagine. For all the talk in both parties about the urgent need to win one constituency or another, most Americans apply the same political yardstick: They vote for what works. There aren't enough Hacks, even in Washington, to sell policies that don't work—although that never stopped Bush from trying.
Yet as Americans survey the damage from six years of Hacks Gone Wild, bad policy is only the beginning. In his Farewell Address, another Republican president, Dwight Eisenhower, warned of an "iron triangle" of legislators, bureaucrats, and private contractors eager to increase arms production. Today's Republicans have created a kind of Hack triangle from the White House to Congress to K Street lobbyists.
Tom DeLay may be gone, but those in office will still do anything to stay there; those who make their living off those in office stop at nothing to keep them there. And with so many private interests at stake, the country's problems have had to wait in line.
In the old days, a popular American business model was planned obsolescence: making products that wouldn't last long so that consumers had to buy a replacement. The Republican political model is planned incompetence: When bureaucrats screw up or government programs don't work, that only reinforces public skepticism about government.
Hack government could get by in the old era, when one party's Hacks simply had to outwit the other's. Now the challenges government faces are too hard to fake your way through, and the consequences of failure too dire.
We knew Hack fever had gotten out of hand when the producers of Fear Factor proposed a reality show called Red/Blue, modeled after American Idol, to find the next Karl Rove. But we've known enough Hacks to realize how little the nation stands to gain from churning out more of them.
Returning Government to the 16th Century
Tax Farmers, Mercenaries and Viceroys
Paul Krugman
Yesterday The New York Times reported that the Internal Revenue Service would outsource collection of unpaid back taxes to private debt collectors, who would receive a share of the proceeds.
It’s an awful idea. Privatizing tax collection will cost far more than hiring additional I.R.S. agents, raise less revenue and pose obvious risks of abuse. But what’s really amazing is the extent to which this plan is a retreat from modern principles of government. I used to say that conservatives want to take us back to the 1920’s, but the Bush administration seemingly wants to go back to the 16th century.
And privatized tax collection is only part of the great march backward.
In the bad old days, government was a haphazard affair. There was no bureaucracy to collect taxes, so the king subcontracted the job to private “tax farmers,” who often engaged in extortion. There was no regular army, so the king hired mercenaries, who tended to wander off and pillage the nearest village. There was no regular system of administration, so the king assigned the task to favored courtiers, who tended to be corrupt, incompetent or both.
Modern governments solved these problems by creating a professional revenue department to collect taxes, a professional officer corps to enforce military discipline, and a professional civil service. But President Bush apparently doesn’t like these innovations, preferring to govern as if he were King Louis XII.
So the tax farmers are coming back, and the mercenaries already have. There are about 20,000 armed “security contractors” in Iraq, and they have been assigned critical tasks, from guarding top officials to training the Iraqi Army.
Like the mercenaries of old, today’s corporate mercenaries have discipline problems. “They shoot people, and someone else has to deal with the aftermath,” declared a U.S. officer last year.
And armed men operating outside the military chain of command have caused at least one catastrophe. Remember the four Americans hung from a bridge? They were security contractors from Blackwater USA who blundered into Falluja — bypassing a Marine checkpoint — while the Marines were trying to pursue a methodical strategy of pacifying the city. The killing of the four, and the knee-jerk reaction of the White House — which ordered an all-out assault, then called it off as casualties mounted — may have ended the last chance of containing the insurgency.
Yet Blackwater, whose chief executive is a major contributor to the Republican Party, continues to thrive. The Department of Homeland Security sent heavily armed Blackwater employees into New Orleans immediately after Katrina.
To whom are such contractors accountable? Last week a judge threw out a jury’s $10 million verdict against Custer Battles, a private contractor that was hired, among other things, to provide security at Baghdad’s airport. Custer Battles has become a symbol of the mix of cronyism, corruption and sheer amateurishness that doomed the Iraq adventure — and the judge didn’t challenge the jury’s finding that the company engaged in blatant fraud.
But he ruled that the civil fraud suit against the company lacked a legal basis, because as far as he could tell, the Coalition Provisional Authority, which ran Iraq’s government from April 2003 to June 2004, wasn’t “an instrumentality of the U.S. government.” It wasn’t created by an act of Congress; it wasn’t a branch of the State Department or any other established agency.
So what was it? Any premodern monarch would have recognized the arrangement: in effect, the authority was a personal fief run by a viceroy answering only to the ruler. And since the fief operated outside all the usual rules of government, the viceroy was free to hire a staff of political loyalists lacking any relevant qualifications for their jobs, and to hand out duffel bags filled with $100 bills to contractors with the right connections.
Tax farmers, mercenaries and viceroys: why does the Bush administration want to run a modern superpower as if it were a 16th-century monarchy? Maybe people who’ve spent their political careers denouncing government as the root of all evil can’t grasp the idea of governing well. Or maybe it’s cynical politics: privatization provides both an opportunity to evade accountability and a vast source of patronage.
But the price is enormous. This administration has thrown away centuries of lessons about how to make government work. No wonder it has failed at everything except fearmongering.
But the price is enormous. This administration has thrown away centuries of lessons about how to make government work. No wonder it has failed at everything except fearmongering.
Saturday, August 19
The uphill battle to change children’s diets (when their parents are rarely role models)
The School-Lunch Test
By LISA BELKIN
It was not yet 11 a.m. at the Partin Settlement Elementary School in Kissimmee, Fla., on a sunny day last October. But lunch service necessarily begins early when there are 838 children to feed, and the meal was already well under way. Danielle Hollar walked calmly amid the lunchroom chaos, holding a large, raw, uncut sweet potato in one hand and a tray filled with tiny cups of puréed sweet potatoes in the other. That Hollar does not get frazzled even among hundreds of jabbering children is one of the talents she brings to her job. That she is tall and blond and slim, and many of the students seem to have school-kid crushes on her, is another.
“This is a sweet potato,” she said, as she stopped at each table and gave each child a purée sample. “It has a lot of vitamin A and C and B6. Have you ever seen one of these?” Most of the children had not. “It’s like a regular potato, but it’s orange inside,” she said, which got their attention. “It has a lot more vitamins than a white potato.”
Angelina wanted to know why there were no marshmallows on top. Angel put down his lemon pie, tried the sweet potatoes and announced he preferred the pie. Mateo, however, was making a meal of what his classmates eyed so suspiciously. Collecting the cups of everyone around him, he ate a dozen of the tablespoon-size portions before a teacher cut him off.
“It’s sweet, which is why it’s called a sweet potato, but it’s also good for you,” Hollar said, as she moved from table to table, sounding one exclamation point short of a sales pitch. Eager as she was to get the children to taste what she offered, though, she pointedly refused to compare this vegetable to candy.
“I don’t like sweet potatoes,” said Jessica, who is in the second grade.
“Have you ever tried one?” Hollar asked.
“No,” Jessica said, making it clear she would stick with her Lunchables nachos and her Capri Sun drink, which she took out of her Mickey Mouse lunchbox.
Hollar and her sweet potatoes were wandering the lunchroom courtesy of the Agatston Research Foundation, founded in 2004 by Dr. Arthur Agatston, creator of the South Beach Diet. Having tackled the eating habits of obese adults, Agatston has turned his attention to children. A cardiologist who considers himself a scientist and who just happened to become a wealthy minicelebrity, Agatston is using the cafeterias of the Osceola County School District as a clinical laboratory. There are 19 elementary schools in the district, and the Agatston Foundation started by taking control of the menus at 4 of them, all within Kissimmee. They are testing whether a plan he calls HOPS — Healthier Options for Public Schoolchildren — can measurably affect children’s health.
“The success of the book gave me a bully pulpit and an opportunity to change the way Americans eat,” Agatston told me not long ago. “One of the obvious places to start is with children. And that means schools.”
By any health measure, today’s children are in crisis. Seventeen percent of American children are overweight, and increasing numbers of children are developing high blood pressure, high cholesterol and Type 2 diabetes, which, until a few years ago, was a condition seen almost only in adults. The obesity rate of adolescents has tripled since 1980 and shows no sign of slowing down. Today’s children have the dubious honor of belonging to the first cohort in history that may have a lower life expectancy than their parents. The Centers for Disease Control and Prevention has predicted that 30 to 40 percent of today’s children will have diabetes in their lifetimes if current trends continue.
The only good news is that as these stark statistics have piled up, so have the resources being spent to improve school food. Throw a dart at a map and you will find a school district scrambling to fill its students with things that are low fat and high fiber.
In rural Arkansas, a program known as HOPE (Healthy Options for People through Extension) seeks to make nutrition a part of the math, science and reading curriculums. At the Promise Academy in Harlem, all meals served in the cafeteria are cooked from scratch, and the menu (heavily subsidized by private donations) now includes dishes like turkey lasagna with a side of fresh zucchini. In Santa Monica, Calif., there is a salad bar at every school in the district, with produce brought in from the local farmers’ market. At Grady High School, outside Atlanta, the student body president, a vegetarian, persuaded the company that runs the cafeteria to provide tofu stir fry, veggie burgers and hummus. In Irvington, N.Y., a group of committed parents established No Junk Food Week last March, during which all unhealthy food was removed from the cafeteria and replaced with offerings from a local chef called Sushi Mike and donations from a nearby Trader Joe’s. At the Hatch Elementary School in Half Moon Bay, Calif., children learn songs like “Dirt Made My Lunch” and then taste fruits and vegetables they have grown in their own garden.
School lunch (and actually, breakfast, because schools that provide free and reduced-cost lunches must also provide breakfast) is now a most popular cause. Any number of groups, from the W.K. Kellogg Foundation and Kaiser Permanente (they both underwrite many of the above programs) to the William J. Clinton Foundation (it brokered an agreement among soft-drink manufacturers to stop selling soda in elementary and middle schools) have gotten in on the act.
But there is one big shadow over all this healthy enthusiasm: no one can prove that it works. For all the menus being defatted, salad bars made organic and vending machines being banned, no one can prove that changes in school lunches will make our children lose weight. True, studies show that students who exercise more and have healthier diets learn better and fidget less, and that alone would be a worthwhile goal. But if the main reason for overhauling the cafeteria is to reverse the epidemic of obesity and the lifelong health problems that result, then shouldn’t we be able to prove we are doing what we set out to do?
The smattering of controlled prevention studies in the scientific literature has decidedly mixed findings. “There just isn’t definitive proof,” says Benjamin Caballero, the principal investigator on the largest study, of 1,704 students over three years in the 1990’s, which showed no change in the body-mass index of those whose schools had spent $20 million changing their menus, exercise programs and nutritional education. A second study, of more than 5,000 students undertaken at about the same time, came to similar conclusions. “There are a few smaller studies with more promising results,” Caballero went on to say, “but right now we can’t scientifically say that all the things that should work — by that I mean improving diet, classroom nutrition education, physical activity, parental involvement — actually do work.”
And yet districts keep trying. Until recently, most were spurred by determined parents or energetic administrators, but now they have a Congressional incentive, too. This coming school year is the first when schools receiving federal lunch subsidies will have to create a wellness plan — a detailed strategy for how nutrition will be provided and taught. In addition, the actual nutrition requirements set by the government for school meals are expected to become more rigorous this coming spring, on the heels of the revised “food pyramid.”
Agatston’s HOPS program is but one example of the scramble to create systems that are replicable and economical enough to meet these demands and to prove, while doing so, that they have a measurable effect on children’s health. As such, the year I spent observing the successes and the setbacks of this particular experiment is a window into why it is so hard to do something that seems so straightforward and simple — feed schoolchildren better food.
In taking on this challenge, the Agatston team weighed and measured thousands of children at the start of the last school year, then weighed and measured them again in June. In the months in between, they wrestled with finicky eaters, reluctant administrators, hostile parents and uncooperative suppliers. So there was a lot riding on Hollar every time she presented a tray of sweet potatoes, or broccoli served with dabs of reduced-fat ranch dressing, or tiny cups of salsa. Would this bring schools closer to a solution? Or was it just another false start?
The reason that children are currently too fat is, in part, because they used to be too thin. During World War II, potential enlistees were regularly turned away because they were undernourished, and after the war the director of the Selective Service System declared malnutrition to be a national emergency. One result was the National School Lunch Act, signed by President Harry S. Truman in 1946, guaranteeing a hot lunch for every schoolchild who could not afford one.
Another result was a complex web of regulations and restrictions, overseen by the United States Department of Agriculture. These rules have morphed and grown over the decades, adding free and reduced-cost breakfast during the Lyndon Johnson years; being pared back during the Reagan Administration, which, memorably, was lampooned for proposing that ketchup be declared a vegetable; and stressing nutrition under the Clinton Administration, which set a limit on fat at 30 percent of calories in a weekly menu (though, rules or no, the national average has never fallen below 34 percent for lunches).
Tweaks aside, the twofold effects of the School Lunch Act are much the same now as 60 years ago. First, the act put the government in the school-food-supply business, buying surplus product from farmers and sending it along to the schools. Twenty percent of the foods served in school cafeterias today are Agriculture Department commodities, which include everything the federal government buys a lot of and needs to pass along, from flour and sugar to fruits and vegetables. While the quality has improved somewhat in recent years, terms like farm-fresh and organic rarely apply. At the same time, the act put schools in the restaurant business, requiring that their lunchrooms manage to at least break even, reimbursing them between 23 cents and $2.40 a meal. It is a system in which pennies are necessarily looked at as closely as sodium content, perhaps even more.
Agatston knew next to nothing about the arcane intricacies of the system two years ago, other than that he wanted to do something about school lunch. As it happened, a lot of his cardiac patients worked as teachers, and for years he had heard about classroomwide sugar highs after lunch and children who seem to expand from year to year. He hired Hollar, whose background was not in nutrition or school food systems — she has a Ph.D. in public administration and policy — and teamed her with Marie Almon, a registered dietician who had worked with him for 20 years and created nearly all of the recipes for the original South Beach Diet book. She also had no experience in schools.
“Looking back, we were unprepared for the complexities,” Almon said this summer, reflecting on the prior two years. “But maybe that turned out to be best, because I might have been overwhelmed if I had known.”
In the spring of 2004, Almon, Hollar and Agatston set out to find a school district that would welcome their experiment. They wanted one with a relatively poor population, where a good portion of the children receive free or reduced-cost lunches (and breakfasts), where the food provided at school served the purpose originally intended under the law — to be the most nutritious meals a child had all day. And they wanted a place where the parents were less likely to have the economic or organizational clout to make change happen on their own.
The Osceola County School District met many of these requirements. The school-age population in the four chosen elementary schools — the first stage of the program would include only kindergarten through sixth grade, on the theory that, as with language, nutrition taught to younger children would have a higher “stick” rate — is 42.6 percent Hispanic, 41.3 percent white but not Hispanic and the rest divided among other ethnic groups. At these four schools — Partin Settlement Elementary, Mill Creek Elementary, Kissimmee Charter Academy and P.M. Wells Charter Academy — many students are from homeless families. Fifty-five percent qualify for free or reduced-cost meals. “We have kids who go shopping in the gas station across from the homeless shelter or who live in the Howard Johnson’s,” says Eileen Smith, Partin Settlement’s principal. “They are not going to get anything fresh.”
The Agatston group presented its proposal to the leadership of the Osceola district during the summer of 2004. The plan included changing the food served in the cafeteria; creating small gardens at each school to allow children to get their hands dirty; providing teachers with guides for incorporating nutrition lessons into Florida’s existing curriculum — inserting them into math class or social studies, for instance — so that the schools would stay on track in terms of their teaching schedule; and providing special programs, whether food tastings or creative assemblies, to reinforce the message.
Jean Palmore, the director of food services in Osceola, was at that meeting and was impressed by what she heard. “It sounded like it was workable; it sounded simple,” Palmore says. She liked the fact that two other Kissimmee elementary schools would be set aside as control schools, so that there would be a way to compare her standard menu, the one that would remain in effect at all the other schools in the district, with the HOPS revisions. She was particularly pleased that the foundation would reimburse the district for any costs over what was already budgeted for food. And she added a requirement — that HOPS also reimburse Osceola if the “participation rate” of students decreased to the point that the cafeterias could not break even. In other words, if the students refused the healthier food, Palmore would still meet her $19.5 million budget.
A contract was signed in July 2004. School opened in August. Instead of spending the first year learning and planning — which, in retrospect, might have been a good idea — the team jumped in and ran right into the realities of school nutrition.
For instance, nearly all the food for the coming school year had been ordered months earlier. Commodities, which can be had free from the government, must be requested as early as March, and those orders, once approved, cannot be changed. (Which does not mean, however, that the government cannot change its mind about what it sends, just that schools cannot change their requests. The summer before Hollar and Almon arrived, for example, a year’s supply of puréed prunes simply showed up at the Osceola warehouse. Federal law says that commodities must be used — they cannot be sold or thrown out or even given away — and Palmore’s staff spent a few months experimenting with baking recipes that used prunes in place of butter or oil.)
In turn, all noncommodity orders, both to huge companies like U.S. Foodservice and Sysco and to smaller regional producers, had been finalized the previous May. A year of menus based on those orders had already been set by Palmore. And because Osceola is part of a 24-county consortium of school districts, which join together to negotiate better prices, there was even less flexibility than there might otherwise have been.
All this would have to be undone, worked around or tweaked by the Agatston team. It declared that the first year — August 2004 through June 2005 — would be a trial year to see whether healthier food could actually be identified and served. The first step was to ban white bread and Tater Tots, replacing them with whole-wheat bread and sweet-potato fries. Other favorites, like turkey with gravy or pork with gravy, went too. There was “almost a mutiny,” Almon says, when she took away Lucky Charms and Froot Loops at breakfast, replacing them with Total and Raisin Bran.
At first the children responded as Palmore predicted they would — they threw out their school-supplied food and started to bring lunch from home. For a brief time, the participation rate went down by 50 percent, but it did not stay there long enough to activate the reimbursement clause Palmore put in the contract.
Over the course of the rest of the trial year, Almon and Hollar kept replacing and limiting things. No more ketchup. Lower-fat hot dogs. Unbreaded chicken patties. Some of these changes were made possible through shuffling — changing those orders that could be changed at the last minute and moving cans and boxes of nonreturnable food around. “Since the HOPS schools weren’t allowed the Tater Tots, we sent them to the other schools, who were more than happy to trade them for their commodity cans of sweet potatoes,” says Palmore, who admits to a personal dislike of sweet potatoes.
Some changes were made by spending Agatston foundation money. During the first year, the supplemental costs paid by Agatston were about $2,500 a month, and they reflected facts like these: a white hamburger bun costs 7 cents, while a whole-wheat hamburger bun costs 11 cents; pizza with a white refined-flour crust costs 31 cents a serving, while pizza with a whole-wheat crust costs 35 cents; a white sandwich wrap costs 23 cents, while a whole-wheat sandwich wrap costs 26 cents; breaded chicken strips are a mere 18 cents a serving, while grilled chicken strips are a whopping 65 cents.
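To make the compounding concrete, here is a minimal back-of-the-envelope sketch in Python. The per-serving prices are the ones quoted above; the monthly serving count is a hypothetical placeholder, since the article does not report one.

```python
# Back-of-the-envelope sketch of the healthier-swap premiums quoted above.
# Per-serving prices come from the article; the serving count is a
# hypothetical placeholder, not a reported figure.
premiums_per_serving = {
    "hamburger bun (white -> whole wheat)": 0.11 - 0.07,
    "pizza (refined -> whole-wheat crust)": 0.35 - 0.31,
    "sandwich wrap (white -> whole wheat)": 0.26 - 0.23,
    "chicken strips (breaded -> grilled)": 0.65 - 0.18,
}

ASSUMED_SERVINGS_PER_MONTH = 1_000  # illustration only

for item, delta in premiums_per_serving.items():
    monthly = delta * ASSUMED_SERVINGS_PER_MONTH
    print(f"{item}: +{delta * 100:.0f} cents a serving, about ${monthly:,.0f} a month")
```

Even at these assumed volumes, the 47-cent premium on grilled chicken dwarfs the rest, and premiums of a few cents per serving, multiplied across thousands of meals, are consistent with a monthly supplement in the low thousands of dollars.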
In addition to shuffling and spending, there was compromising. Those sweet-potato fries that replaced the Tater Tots? They were commercially cut and frozen, then baked in school ovens, rather than cut fresh from actual potatoes, a step the kitchen staff was simply not set up to do. And the sweet-potato purée that Hollar handed out in the lunchroom? The vanilla was artificial, because “that’s what was stocked on the shelves,” Almon says. And liquid margarine — made with soybean oil — and some sugar were used, because, Almon says, “if they don’t eat it, what have we accomplished?”
In the end, there were also changes that simply weren’t made, particularly during that first year. “We started out not adhering to the dictate of the HOPS programs to the letter,” Palmore says. The most striking (and contentious) example of this compromise was cheese.
Almon modified existing recipes with low-fat cheese. Palmore and her school kitchen managers were adamant that they could not get a low-fat version from any of their usual suppliers. Almon kept insisting and eventually was told that in fact a low-fat product had been found and was being used, though at added cost — the lower-fat version costs 5.6 cents a slice while the higher-fat one is 1.6 cents. But Hollar paid an unannounced visit to the kitchens one day in spring 2005 and found that at least on that occasion the old cheese product — something Almon calls “full fat” and Palmore calls “whole milk” — was being used instead.
An assumption is something you don’t realize you’ve made until someone else states a conflicting one. The HOPS vision of a healthy school lunch is based on an assumption that became clearer as the trial year gave way to the full program. Specifically, it is a vision based on the system as it exists, with large vendors supplying packaged items that are essentially assembled and reheated (rather than created or cooked).
Listening to Almon talk about evolving food products at the yearly School Nutrition Association meeting in Los Angeles this year makes that clear: “The difference between what was available two years ago and what is available this year is a world of difference,” she says. “Everybody is making cookies with whole grains. Pretzels with whole grains. The breakfast burritos, the tortillas, everything is whole grain. Even the French toast.” She and Hollar were particularly pleased that Smucker’s, which has long made something called Uncrustables — premade peanut-butter-and-jelly sandwiches with the crust already removed — now markets a version with transfat-free peanut butter (though Almon wishes it came on whole-wheat bread).
This comfort with premade food products is a legacy of the South Beach Diet, which, though full of recipes that start from scratch, is also not shy about steering dieters to Paul Newman’s Own Lighten Up Italian salad dressing or Kraft’s entire line of South Beach branded snacks. “Things can be nutritious and come from a package,” Almon says. “It depends what’s in the package, not the fact that there is a package.”
Part of the decision to rely on such foods is simply logistical. School lunchrooms are no longer set up to actually cook but rather to reheat — hence the Kissimmee staff’s inability to slice sweet potatoes by hand.
Just as big a reason for this reliance on packaged foods, however, is what Palmore calls “the acceptance question.” In other words, what are children willing to eat? It is no coincidence that school cafeteria menus (and the children’s menus at restaurants, for that matter) are virtually identical. Pasta. Chicken nuggets. Pizza. Hamburger. French fries. The task of tackling those expectations can feel overwhelming at best.
“Children are so conditioned to these items — the hamburgers, the cheeseburgers, the pizza,” Almon says. “To make a healthier version of familiar things makes sense.”
Not everyone agrees, however. Across the country, in Berkeley, the chef Ann Cooper questions the idea of making healthier versions of flawed foods. In her book “Lunch Lessons: Changing the Way We Feed Our Children,” she asks whether healthy food should simply mirror existing unhealthy patterns and concludes: “We just don’t need an organic Twinkie. We don’t!”
Cooper, who spent years impressively overhauling the menu at the select Ross School in East Hampton, N.Y., began trying to do the same thing at the 16 schools in the Berkeley public school district starting last October. Her six-figure salary is being paid by the Chez Panisse Foundation, which also finances, in Berkeley, Martin Luther King Jr. Middle School’s Edible Schoolyard kitchen garden, a creation of Alice Waters, who all but started the organic food movement in the United States 30 years ago.
It is a common assumption that the existence of programs like the Edible Schoolyard means that Berkeley students already eat well, but when Cooper arrived last fall, the district’s menu looked like menus everywhere with their fried and fatty foods. One item that Cooper makes particularly merciless fun of is the Uncrustables sandwich — the same one that caught Almon’s eye. She thawed one and kept it on display on a desk where, because of its preservatives, “it looked exactly the same months later,” she said while giving a tour of a high-school lunchroom.
In the time since she came aboard, a salad bar has been added to every school, with ingredients that include strawberries, organic chicken or turkey, sunflower seeds, fresh avocado and other eclectic in-season items in addition to the usual lettuce, tomato and cucumber. Ninety-five percent of the food was processed when she arrived, she says, and now 90 percent is fresh and cooked from scratch. And those foods are not what one would expect on a school menu, including choices like chicken cacciatore, organic sushi and organic chicken raised on a nearby farm. The foods she does not make on the premises, foods like fresh tamales and muffins and vegetable calzones, are brought in from small local businesses.
Even here, however, the “acceptance question” arises. When Cooper first removed nachos from the middle-school menu, the percentage of students buying lunch in the cafeteria dropped significantly. Cooper quickly restored the nachos, using transfat-free chips and Cheddar cheese — from an area cheesemaker, not an industrial processor — the equivalent, she concedes, of an organic Twinkie. And she did not even try to change the pizza her first year. “I just can’t take everything away,” she says. “Or they will walk out.
“Change is never easy. And if it’s hard for us, imagine how hard it would be in Oklahoma or Omaha.”
Or Osceola. The Agatston team is fully aware that its goals are less ambitious than those in Berkeley, but that is an inevitable difference between the two districts, Hollar says. Whereas 55 percent of Osceola’s students receive free or reduced-cost lunches, only 41 percent of Berkeley students do. And while Osceola charges $1.50 to those who pay full price at the elementary school, Berkeley charges $2.50. And then there is another, immeasurable but distinct difference — the parents. Children are not the only ones who bring expectations to food, and Almon says, “I just think there is a different culture around eating healthy in California than there is here, and we have to account for that.”
In fact, she and Hollar have come to believe that the greatest resistance to nutritional change comes not from the children but from the grown-ups, starting with the very administrators who invited HOPS in. Palmore, for instance, was ambivalent from the start about much of the suggested change. She told a CBS News reporter, on the air, that she would prefer the whole-wheat rolls if they had gravy. She made a notable “yuck” face in one conversation she and I had about sweet potatoes, and she expected rebellion or worse if they were served to the students. “I was surprised that they were eating the sweet-potato fries,” she said a few months after they appeared in place of Tater Tots on the menu. “That is not a child-friendly food. I was surprised they ate the brown bread.” And she made that same face again.
Many among the rest of the staff (and among the general population, one might add) also seem to have complex relationships with food. To walk through any of the HOPS schools is to be struck by the fact that there are few adult role models when it comes to good nutrition and exercise. Several teachers approached Almon during the year and asked her if she would lead a group that wanted to start on the South Beach Diet. But while many attended the first meeting, far fewer made it to the second or any of the monthly meetings after that.
Nor are staff members the only ones with food issues. Some parents wondered why their children were being put on the South Beach Diet. (“It’s not a diet,” Agatston says of HOPS. “It’s just healthy food.”) Others expressed concern that the new way of eating would be liked too much by their children. After the schoolwide assembly to introduce the full program last September, one frantic parent called to report that her child was refusing to eat anything in the house that was not healthy. “I can’t afford to throw everything away,” the mother said. “Please tell her to eat.”
And even parents who say they enthusiastically endorse better food in schools often play the role of saboteurs. One January afternoon, two girls, both in the fourth grade, sat outside at Kissimmee Charter, each having a McDonald’s hamburger, French fries and a shake. Inside, the rest of the students were eating turkey burgers on whole-wheat buns. The girls had to dine in the garden because junk food is banned from the school. Sitting with them while they ate was the person who supplied the lunch — the mother of one of the girls.
“This is a treat,” she said. Her daughter had made honor roll, and this was the reward. “I don’t see why they should have to be out here away from everyone,” she continued. “What’s the harm in a treat now and then?”
Schools have internal cultures, and the ratio of enthusiasm to resistance varied from one to another. In some, the teachers used the materials and nutrition books Hollar sent; in others, they remained in boxes. In some, children actually stood up to move and stretch in the middle of a lesson; in others, the punishment for being noisy during gym class was sitting absolutely still on the floor. At Partin Settlement, there was usually a dish of candy for sale for some fund-raiser or another on the counter in the school office. At Kissimmee Charter, parents still sold Chick-fil-A biscuits (with gravy) in the parking lot every Wednesday to raise money for the Parent Teacher Organization. And once each marking period, honor-roll students are still given a coupon for a free hamburger, soda and small fries at McDonald’s — which is where the fourth grader’s mother got the idea of bringing in that lunch in the first place.
JoAnn Kandrac, who was the Kissimmee Charter principal during the last school year, is conflicted about the mixed messages. After all, she was the one who exiled fast food to the garden where the two girls were eating. But she has found tidy ways of rationalizing both. “You have to pick your battles,” she told me last winter. The biscuits “make quite a bit of money for the P.T.O., and they aren’t selling them to the children as much as to other parents who don’t have time for breakfast.”
As for the coupons, she says, sounding a lot like Chef Cooper defending pizza’s place on the menu: “I can’t pull everything away from the children. McDonald’s treats have been a tradition here forever. It’s naïve to think children don’t know about treats at a fast-food restaurant. Modern times call for modern methods. This is educating our children that they can make smart choices at places like McDonald’s.”
On a balmy day in January, the cafeteria was filled at 8:45 a.m. On the stage in front of the room, Michelle Lombardo, as energetic as any of the kindergartners in the front rows, was telling a story about characters known as the Organwise Guys, with names like Sir Rebrum and Hardy Heart. Lombardo, an independent consultant whose appearances are frequently financed by a grant from the Kellogg foundation, takes this act on the road throughout the country and was spending two days in Kissimmee. Soon she had the children chanting, “Low fat, high fiber, lots of water, exercise,” accompanied by moves that looked a lot like the Macarena. The floor was shaking so much that the LCD projector wobbled.
“This is a free radical,” she continued, clicking a drawing of an obvious bad guy onto the screen. “When you put a lot of fruits and vegetables in your body, they gang up on the bad guy and kick him out.” Then she introduced Peri Stolic, a smiling, animated bit of intestine. “She likes to be big and fluffy and filled with fruits and vegetables,” she said, “because she’s like a tube of toothpaste, and when she’s fluffy, it’s easier to squeeze the garbage out.”
During the trial year, it became clear that new food cannot simply appear on the lunch tray. Children must be taught about nutrition outside the lunchroom if they are to eat what is offered inside. That is why Lombardo and her Organwise characters began making regular visits to the HOPS schools and why Hollar circulated among tables introducing the Food of the Month.
In September it was broccoli. In October, sweet potatoes and apples. November, corn and cranberries. December meant tomatoes, and Almon asked the kitchen managers to order salsa, not realizing it was already part of the commodities stockpile. The managers, in turn, thought Almon’s request meant she did not approve of the fructose in the salsa that was already on hand and scrambled to purchase a different salsa. Almon, when she finally realized the confusion and read the labels, felt the commodity salsa would do, because fructose was the fifth ingredient, not one of the first four. But then she tasted the stuff, decided the commodity salsa was far too spicy for children and asked that it be cut with commodity canned tomatoes.
Small glitches and refinements continued into spring. The team and the district were still squabbling, for instance, over the price of cheese, and Agatston was still paying the difference between the higher-fat version that the school would have purchased and the lower-fat version that the HOPS menu required. The foundation was paying for other items, too (whole-wheat versions of standard refined-wheat products, for instance), but the cheese was particularly expensive and one of the primary reasons the HOPS reimbursement had jumped to $3,700 a month (from $2,500 during the trial year). Hollar had researched a method, known as “processing,” that she felt could reduce the cost of the cheese. The term is somewhat misleading, because it is not the cheese that is being processed. It is the bookkeeping. The school gives its credit for its allotted amount of whole-milk cheese to a producer, in this case Land O’Lakes, which in turn ships the low-fat kind to Osceola. But a change of this magnitude requires approval, which by March Hollar had still not been able to get.
Also in the spring, Hollar decided not to send new materials to many of the teachers who had received the original educational packets — things like curriculum suggestions, posters for the students to color. Too many were never used, she learned, and when she sent a questionnaire to the staff asking why, she was told that “they did not have time, did not want to take on additional teaching requirements, needed to focus on insuring that their kids passed the mandated state tests,” she says.
There was also the complication that state regulators at the Florida Education Department had questions about the program. A parent complained to that department that a child was being put on the South Beach Diet at school, leading to an audit of the school menus. The breakfast being served at the HOPS-intervention schools (and interestingly, at the control schools) was found to be under the state’s requirement of 554 calories. “It is almost impossible to get that many calories into a meal without too much fat and sugar,” Almon says. “The regs need to change, not the food.” The issue is still under discussion.
Despite the mixed results, the HOPS team is hopeful that real progress has been made and that the biggest fight — to make it a given that school lunch should be healthy — has been won. This is an increasingly common theme in conversations with healthy-lunch advocates throughout the country, who compare changing views of school lunch to public opinion on smoking: the same act has taken on different social meanings over the past few decades as the context around it has changed.
Cooper, too, is cautiously optimistic. “I think we are starting to see a movement,” she says. “We’re on the cusp of something.” The spate of obesity studies plus the diabetes data plus the new Congressional wellness plan requirements just might mean that a healthy school lunch will finally become the norm.
Lombardo ended her January program by asking the children to recite a pledge. “I do solemnly swear,” they repeated, “to be healthier, to eat low fat, to eat high fiber, to drink lots of water and get lots of exercise.”
Hollar added, under her breath, “And to stop bringing Lunchables to school.”
For a solid week last May, children lined up in the lobby of the Partin Settlement Elementary School, as they did at all the HOPS schools, waiting their turn to be weighed and measured. The youngest children stepped onto the scale happily and shared the number they saw with their classmates. The older children already knew that the number was loaded. They took off their shoes and their jewelry and excess clothing. They told their friends to stand away.
As they went from station to station, a tape measure was wrapped around their waists and their blood pressure was taken, and they were asked whether they had played outside the day before, whether they played video games after school, and whether they brought their lunches or bought them.
Despite years of lunchroom changes at schools throughout the country, only now have advocates realized that they need to buckle down and get the facts. It just may not be enough to say, as Alice Waters does: “This is something right as rain. These kids like this; they are engaged in this. Why can’t every child have this? Anybody who sees it, gets it.”
It is not even enough to prove, as at the Ross School, that changing the menus means children eat better lunches. (Cooper’s menu doubled the consumption of fruits and vegetables compared with both the national average and control groups. And 80 percent of the parents changed the way they shopped, cooked or ate, thanks to the input from their children.)
What is needed — to persuade donors and school boards and government entities that better food is worth the cost — is hard proof that improving the lunchroom actually improves children’s health. “We have to measure, to document what we’re doing and evaluate its results,” says Carina Wong, executive director of the Chez Panisse Foundation, which, with financing from the Rodale Institute, is embarking on a three-year study of the health of children in the Berkeley programs. The Center for Ecoliteracy, together with the Children’s Hospital Oakland Research Institute, is studying them too, pricking their fingers and measuring their blood-sugar levels.
Agatston understands both the need for and the risk of measurement. There is always the chance that the results will not confirm what practitioners are certain is true. After all, studies have so far failed to make a definitive link. “We don’t have these children 24 hours a day,” says Caballero, who did some of the studies. “They go home, they go out with friends, they are off all summer and everything about the world — fast food, video games, television ads, everything — conspires to undo even the best things that happen in schools.”
Similarly, the data from the first year of HOPS was inconclusive. The program did reduce the fat served in the intervention schools by 20 percent, reduced saturated fat by 26 percent and increased fiber twofold. But after that first year, there was no sign that the “overweight” rate among the children had fallen.
Those results did not keep the Agatston team from proceeding with a second year of HOPS or from expanding the program at the start of the school year that began in Florida last week. HOPS is still in place in Kissimmee — but this year it is being run and paid for by the school district directly. Almon and Hollar are merely advisers. The cheese will be “processed” through the commodities program and Land O’Lakes. Two more schools will adopt the HOPS menu. Palmore has increased the price of lunch — to $1.75 from $1.50 in the elementary schools — and expects to cut some corners. Last year’s whole-wheat chicken nuggets, she says, were, at 41 cents a serving, simply too expensive, so the schools will instead serve a compromise like grilled chicken patties (about 34 cents). The Agatston team is also bringing HOPS to 9,000 more students in the Miami-Dade County School District and intends to expand beyond Florida next year.
Just this month, Hollar received an analysis of the data from the past school year that hints that HOPS is really working. The overweight rate in the HOPS schools in fact declined during the 2005-6 school year: specifically, 23 of the 486 children who had been characterized as overweight when school began were characterized as merely “at risk” or “normal” when school ended. In the control schools, by contrast, there was no decline and three children actually gained enough weight that they were added to the overweight category.
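For a rough sense of the scale involved (my arithmetic, not the researchers’ analysis), the reported shift works out to just under 5 percent of the overweight group:

```python
# Rough scale of the first full-year result reported above.
overweight_at_start = 486  # HOPS-school children classified overweight in the fall
reclassified = 23          # moved to "at risk" or "normal" by June

share = reclassified / overweight_at_start
print(f"{share:.1%} of the overweight group improved")  # prints 4.7%
```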
Hollar describes these results as “cautiously exciting” and warns that the sample size is small and that a true trend cannot be determined for at least five years. Agatston, in turn, says that even if the results had been otherwise, that would not have been a reason to abandon the program.
“If the data don’t show what we want to see,” he says, “we aren’t going to throw up our hands and say, ‘Let them eat what they want.’ All that will mean is that we aren’t doing this as well as we can, so we will have to find a way to do it better.”
Lisa Belkin, a contributing writer for the magazine, wrote about Meredith Vieira in last week’s issue.
By LISA BELKIN
It was not yet 11 a.m. at the Partin Settlement Elementary School in Kissimmee, Fla., on a sunny day last October. But lunch service necessarily begins early when there are 838 children to feed, and the meal was already well under way. Danielle Hollar walked calmly amid the lunchroom chaos, holding a large, raw, uncut sweet potato in one hand and a tray filled with tiny cups of puréed sweet potatoes in the other. That Hollar does not get frazzled even among hundreds of jabbering children is one of the talents she brings to her job. That she is tall and blond and slim, and many of the students seem to have school-kid crushes on her, is another.
“This is a sweet potato,” she said, as she stopped at each table and gave each child a purée sample. “It has a lot of vitamin A and C and B6. Have you ever seen one of these?” Most of the children had not. “It’s like a regular potato, but it’s orange inside,” she said, which got their attention. “It has a lot more vitamins than a white potato.”
Angelina wanted to know why there were no marshmallows on top. Angel put down his lemon pie, tried the sweet potatoes and announced he preferred the pie. Mateo, however, was making a meal of what his classmates eyed so suspiciously. Collecting the cups of everyone around him, he ate a dozen of the tablespoon-size portions before a teacher cut him off.
“It’s sweet, which is why it’s called a sweet potato, but it’s also good for you,” Hollar said, as she moved from table to table, sounding one exclamation point short of a sales pitch. Eager as she was to get the children to taste what she offered, though, she pointedly refused to compare this vegetable to candy.
“I don’t like sweet potatoes,” said Jessica, who is in the second grade.
“Have you ever tried one?” Hollar asked.
“No,” Jessica said, making it clear she would stick with her Lunchables nachos and her Capri Sun drink, which she took out of her Mickey Mouse lunchbox.
Hollar and her sweet potatoes were wandering the lunchroom courtesy of the Agatston Research Foundation, founded in 2004 by Dr. Arthur Agatston, creator of the South Beach Diet. Having tackled the eating habits of obese adults, Agatston has turned his attention to children. A cardiologist who considers himself a scientist and who just happened to become a wealthy minicelebrity, Agatston is using the cafeterias of the Osceola County School District as a clinical laboratory. There are 19 elementary schools in the district, and the Agatston Foundation started by taking control of the menus at 4 of them, all within Kissimmee. They are testing whether a plan he calls HOPS — Healthier Options for Public Schoolchildren — can measurably affect children’s health.
“The success of the book gave me a bully pulpit and an opportunity to change the way Americans eat,” Agatston told me not long ago. “One of the obvious places to start is with children. And that means schools.”
By any health measure, today’s children are in crisis. Seventeen percent of American children are overweight, and increasing numbers of children are developing high blood pressure, high cholesterol and Type 2 diabetes, which, until a few years ago, was a condition seen almost only in adults. The obesity rate of adolescents has tripled since 1980 and shows no sign of slowing down. Today’s children have the dubious honor of belonging to the first cohort in history that may have a lower life expectancy than their parents. The Centers for Disease Control and Prevention has predicted that 30 to 40 percent of today’s children will have diabetes in their lifetimes if current trends continue.
The only good news is that as these stark statistics have piled up, so have the resources being spent to improve school food. Throw a dart at a map and you will find a school district scrambling to fill its students with things that are low fat and high fiber.
In rural Arkansas, a program known as HOPE (Healthy Options for People through Extension) seeks to make nutrition a part of the math, science and reading curriculums. At the Promise Academy in Harlem, all meals served in the cafeteria are cooked from scratch, and the menu (heavily subsidized by private donations) now includes dishes like turkey lasagna with a side of fresh zucchini. In Santa Monica, Calif., there is a salad bar at every school in the district, with produce brought in from the local farmer’s market. At Grady High School, outside Atlanta, the student body president, a vegetarian, persuaded the company that runs the cafeteria to provide tofu stir fry, veggie burgers and hummus. In Irvington, N.Y., a group of committed parents established No Junk Food Week last March, where all unhealthy food was removed from the cafeteria and replaced with offerings from a local chef called Sushi Mike and donations from a nearby Trader Joe’s. At the Hatch Elementary School in Half Moon Bay, Calif., children learn songs like “Dirt Made My Lunch” and then taste fruits and vegetables they have grown in their own garden.
School lunch (and actually, breakfast, because schools that provide free and reduced-cost lunches must also provide breakfast) is now a most popular cause. Any number of groups, from the W.K. Kellogg Foundation and Kaiser Permanente (they both underwrite many of the above programs) to the William J. Clinton Foundation (it brokered an agreement among soft-drink manufacturers to stop selling soda in elementary and middle schools) have gotten in on the act.
But there is one big shadow over all this healthy enthusiasm: no one can prove that it works. For all the menus being defatted, salad bars made organic and vending machines being banned, no one can prove that changes in school lunches will make our children lose weight. True, studies show that students who exercise more and have healthier diets learn better and fidget less, and that alone would be a worthwhile goal. But if the main reason for overhauling the cafeteria is to reverse the epidemic of obesity and the lifelong health problems that result, then shouldn’t we be able to prove we are doing what we set out to do?
The smattering of controlled prevention studies in the scientific literature have decidedly mixed findings. “There just isn’t definitive proof,” says Benjamin Caballero, the principal investigator on the largest study, of 1,704 students over three years in the 1990’s, which showed no change in the body-mass index of those whose schools had spent $20 million changing their menus, exercise programs and nutritional education. A second study, of more than 5,000 students undertaken at about the same time, came to similar conclusions. “There are a few smaller studies with more promising results,” Caballero went on to say, “but right now we can’t scientifically say that all the things that should work — by that I mean improving diet, classroom nutrition education, physical activity, parental involvement — actually do work.”
And yet districts keep trying. Until recently, most were spurred by determined parents or energetic administrators, but now they have a Congressional incentive, too. This coming school year is the first when schools receiving federal lunch subsidies will have to create a wellness plan — a detailed strategy for how nutrition will be provided and taught. In addition, the actual nutrition requirements set by the government for school meals are expected to become more rigorous this coming spring, on the heels of the revised “food pyramid.”
Agatston’s HOPS program is but one example of the scramble to create systems that are replicable and economical enough to meet these demands and to prove, while doing so, that they have a measurable effect on children’s health. As such, the year I spent observing the successes and the setbacks of this particular experiment is a window into why it is so hard to do something that seems so straightforward and simple — feed school children better food.
In taking on this challenge, the Agatston team weighed and measured thousands of children at the start of the last school year, then weighed and measured them again in June. In the months in between, they wrestled with finicky eaters, reluctant administrators, hostile parents and uncooperative suppliers. So there was a lot riding on Hollar every time she presented a tray of sweet potatoes, or broccoli served with dabs of reduced-fat ranch dressing, or tiny cups of salsa. Would this bring schools closer to a solution? Or was it just another false start?
The reason that children are currently too fat is, in part, that they used to be too thin. During World War II, potential enlistees were regularly turned away because they were undernourished, and after the war the director of the Selective Service System declared malnutrition to be a national emergency. One result was the National School Lunch Act, signed by President Harry S. Truman in 1946, guaranteeing a hot lunch for every schoolchild who could not afford one.
Another result was a complex web of regulations and restrictions, overseen by the United States Department of Agriculture. These rules have morphed and grown over the decades, adding free and reduced-cost breakfast during the Lyndon Johnson years; being pared back during the Reagan Administration, which, memorably, was lampooned for proposing that ketchup be declared a vegetable; and stressing nutrition under the Clinton Administration, which set a limit on fat at 30 percent of calories in a weekly menu (though, rules or no, the national average has never fallen below 34 percent for lunches).
Tweaks aside, the twofold effects of the School Lunch Act are much the same now as 60 years ago. First, the act put the government in the school-food-supply business, buying surplus product from farmers and sending it along to the schools. Twenty percent of the foods served in school cafeterias today are Agriculture Department commodities, which include everything the federal government buys a lot of and needs to pass along, from flour and sugar to fruits and vegetables. While the quality has improved somewhat in recent years, terms like farm-fresh and organic rarely apply. At the same time, the act put schools in the restaurant business, requiring that their lunchrooms manage to at least break even, reimbursing them between 23 cents and $2.40 a meal. It is a system in which pennies are necessarily looked at as closely as sodium content, perhaps even more closely.
Agatston knew next to nothing about the arcane intricacies of the system two years ago, other than that he wanted to do something about school lunch. As it happened, a lot of his cardiac patients worked as teachers, and for years he had heard about classroomwide sugar highs after lunch and children who seemed to expand from year to year. He hired Hollar, whose background was not in nutrition or school food systems — she has a Ph.D. in public administration and policy — and teamed her with Marie Almon, a registered dietitian who had worked with him for 20 years and created nearly all of the recipes for the original South Beach Diet book. She also had no experience in schools.
“Looking back, we were unprepared for the complexities,” Almon said this summer, reflecting on the prior two years. “But maybe that turned out to be best, because I might have been overwhelmed if I had known.”
In the spring of 2004, Almon, Hollar and Agatston set out to find a school district that would welcome their experiment. They wanted one with a relatively poor population, where a good portion of the children received free or reduced-cost lunches (and breakfasts), where the food provided at school served the purpose originally intended under the law — to be the most nutritious meals a child had all day. And they wanted a place where the parents were less likely to have the economic or organizational clout to make change happen on their own.
The Osceola County School District met many of these requirements. The school-age population in the four chosen elementary schools — the first stage of the program would include only kindergarten through sixth grade, on the theory that, as with language, nutrition taught to younger children would have a higher “stick” rate — is 42.6 percent Hispanic, 41.3 percent white but not Hispanic and the rest divided among other ethnic groups. At these four schools — Partin Settlement Elementary, Mill Creek Elementary, Kissimmee Charter Academy and P.M. Wells Charter Academy — many students are from homeless families. Fifty-five percent qualify for free or reduced-cost meals. “We have kids who go shopping in the gas station across from the homeless shelter or who live in the Howard Johnson’s,” says Eileen Smith, Partin Settlement’s principal. “They are not going to get anything fresh.”
The Agatston group presented its proposal to the leadership of the Osceola district during the summer of 2004. The plan included changing the food served in the cafeteria; creating small gardens at each school to allow children to get their hands dirty; providing teachers with guides for incorporating nutrition lessons into Florida’s existing curriculum — inserting them into math class or social studies, for instance — so that the schools would stay on track in terms of their teaching schedule; and providing special programs, whether food tastings or creative assemblies, to reinforce the message.
Jean Palmore, the director of food services in Osceola, was at that meeting and was impressed by what she heard. “It sounded like it was workable; it sounded simple,” Palmore says. She liked the fact that two other Kissimmee elementary schools would be set aside as control schools, so that there would be a way to compare her standard menu, the one that would remain in effect at all the other schools in the district, with the HOPS revisions. She was particularly pleased that the foundation would reimburse the district for any costs over what was already budgeted for food. And she added a requirement — that HOPS also reimburse Osceola if the “participation rate” of students decreased to the point that the cafeterias could not break even. In other words, if the students refused the healthier food, Palmore would still meet her $19.5 million budget.
A contract was signed in July 2004. School opened in August. Instead of spending the first year learning and planning — which, in retrospect, might have been a good idea — the team jumped in and ran right into the realities of school nutrition.
For instance, nearly all the food for the coming school year had been ordered months earlier. Commodities, which can be had free from the government, must be requested as early as March, and those orders, once approved, cannot be changed. (Which does not mean, however, that the government cannot change its mind about what it sends, just that schools cannot change their requests. The summer before Hollar and Almon arrived, for example, a year’s supply of puréed prunes simply showed up at the Osceola warehouse. Federal law says that commodities must be used — they cannot be sold or thrown out or even given away — and Palmore’s staff spent a few months experimenting with baking recipes that used prunes in place of butter or oil.)
In turn, all noncommodity orders, both to huge companies like U.S. Foodservice and Sysco and to smaller regional producers, had been finalized the previous May. A year of menus based on those orders had already been set by Palmore. And because Osceola is part of a 24-county consortium of school districts, which join together to negotiate better prices, there was even less flexibility than there might otherwise have been.
All this would have to be undone, worked around or tweaked by the Agatston team. It declared that the first year — August 2004 through June 2005 — would be a trial year to see whether healthier food could actually be identified and served. The first step was to ban white bread and Tater Tots, replacing them with whole-wheat bread and sweet-potato fries. Other favorites, like turkey with gravy or pork with gravy, went too. There was “almost a mutiny,” Almon says, when she took away Lucky Charms and Froot Loops at breakfast, replacing them with Total and Raisin Bran.
At first the children responded as Palmore predicted they would — they threw out their school-supplied food and started to bring lunch from home. For a brief time, the participation rate went down by 50 percent, but it did not stay there long enough to activate the reimbursement clause Palmore put in the contract.
Over the course of the rest of the trial year, Almon and Hollar kept replacing and limiting things. No more ketchup. Lower-fat hot dogs. Unbreaded chicken patties. Some of these changes were made possible through shuffling — changing those orders that could be changed at the last minute and moving cans and boxes of nonreturnable food around. “Since the HOPS schools weren’t allowed the Tater Tots, we sent them to the other schools, who were more than happy to trade them for their commodity cans of sweet potatoes,” says Palmore, who admits to a personal dislike of sweet potatoes.
Some changes were made by spending Agatston foundation money. During the first year, the supplemental costs paid by Agatston were about $2,500 a month, and they reflected facts like these: a white hamburger bun costs 7 cents, while a whole-wheat hamburger bun costs 11 cents; pizza with a white refined-flour crust costs 31 cents a serving, while pizza with a whole-wheat crust costs 35 cents; a white sandwich wrap costs 23 cents, while a whole-wheat sandwich wrap costs 26 cents; breaded chicken strips are a mere 18 cents a serving, while grilled chicken strips are a whopping 65 cents.
In addition to shuffling and spending, there was compromising. Those sweet-potato fries that replaced the Tater Tots? They were commercially cut and frozen, then baked in school ovens, rather than cut fresh from actual potatoes, a step the kitchen staff was simply not set up to do. And the sweet-potato purée that Hollar handed out in the lunchroom? The vanilla was artificial, because “that’s what was stocked on the shelves,” Almon says. And liquid margarine — made with soybean oil — and some sugar were used, because, Almon says, “if they don’t eat it, what have we accomplished?”
In the end, there were also changes that simply weren’t made, particularly during that first year. “We started out not adhering to the dictate of the HOPS programs to the letter,” Palmore says. The most striking (and contentious) example of this compromise was cheese.
Almon modified existing recipes to use low-fat cheese. Palmore and her school kitchen managers were adamant that they could not get a low-fat version from any of their usual suppliers. Almon kept insisting and eventually was told that in fact a low-fat product had been found and was being used, though at added cost — the lower-fat version costs 5.6 cents a slice while the higher-fat one is 1.6 cents. But Hollar paid an unannounced visit to the kitchens one day in spring 2005 and found that at least on that occasion the old cheese product — something Almon calls “full fat” and Palmore calls “whole milk” — was being used instead.
An assumption is something you don’t realize you’ve made until someone else states a conflicting one. The HOPS vision of a healthy school lunch is based on an assumption that became clearer as the trial year gave way to the full program. Specifically, it is a vision based on the system as it exists, with large vendors supplying packaged items that are essentially assembled and reheated (rather than created or cooked).
Listening to Almon talk about evolving food products at the yearly School Nutrition Association meeting in Los Angeles this year makes that clear: “The difference between what was available two years ago and what is available this year is a world of difference,” she says. “Everybody is making cookies with whole grains. Pretzels with whole grains. The breakfast burritos, the tortillas, everything is whole grain. Even the French toast.” She and Hollar were particularly pleased that Smucker’s, which has long made something called Uncrustables — premade peanut-butter-and-jelly sandwiches with the crust already removed — now markets a version with trans-fat-free peanut butter (though Almon wishes it came on whole-wheat bread).
This comfort with premade food products is a legacy of the South Beach Diet, which, though full of recipes that start from scratch, is also not shy about steering dieters to Paul Newman’s Own Lighten Up Italian salad dressing or Kraft’s entire line of South Beach branded snacks. “Things can be nutritious and come from a package,” Almon says. “It depends what’s in the package, not the fact that there is a package.”
Part of the decision to rely on such foods is simply logistical. School lunchrooms are no longer set up to actually cook but rather to reheat — hence the Kissimmee staff’s inability to slice sweet potatoes by hand.
Just as big a reason for this reliance on packaged foods, however, is what Palmore calls “the acceptance question.” In other words, what are children willing to eat? It is no coincidence that school cafeteria menus (and the children’s menus at restaurants, for that matter) are virtually identical. Pasta. Chicken nuggets. Pizza. Hamburger. French fries. The task of tackling those expectations can feel overwhelming.
“Children are so conditioned to these items — the hamburgers, the cheeseburgers, the pizza,” Almon says. “To make a healthier version of familiar things makes sense.”
Not everyone agrees, however. Across the country, in Berkeley, the chef Ann Cooper questions the idea of making healthier versions of flawed foods. In her book “Lunch Lessons: Changing the Way We Feed Our Children,” she asks whether healthy food should simply mirror existing unhealthy patterns and concludes: “We just don’t need an organic Twinkie. We don’t!”
Cooper, who spent years impressively overhauling the menu at the select Ross School in East Hampton, N.Y., began trying to do the same thing at the 16 schools in the Berkeley public school district last October. Her six-figure salary is being paid by the Chez Panisse Foundation, which also finances, in Berkeley, Martin Luther King Jr. Middle School’s Edible Schoolyard kitchen garden, a creation of Alice Waters, who all but started the organic food movement in the United States 30 years ago.
It is a common assumption that the existence of programs like the Edible Schoolyard means that Berkeley students already eat well, but when Cooper arrived last fall, the district’s menu looked like menus everywhere, full of fried and fatty foods. One item that Cooper makes particularly merciless fun of is the Uncrustables sandwich — the same one that caught Almon’s eye. She thawed one and kept it on display on a desk where, because of its preservatives, “it looked exactly the same months later,” she said while giving a tour of a high-school lunchroom.
In the time since she came aboard, a salad bar has been added to every school, with ingredients that include strawberries, organic chicken or turkey, sunflower seeds, fresh avocado and other eclectic in-season items in addition to the usual lettuce, tomato and cucumber. Ninety-five percent of the food was processed when she arrived, she says, and now 90 percent is fresh and cooked from scratch. And the offerings are not what one would expect on a school menu: chicken cacciatore, organic sushi, organic chicken raised on a nearby farm. The foods she does not make on the premises, foods like fresh tamales and muffins and vegetable calzones, are brought in from small local businesses.
Even here, however, the “acceptance question” arises. When Cooper first removed nachos from the middle-school menu, the percentage of students buying lunch in the cafeteria dropped significantly. Cooper quickly restored the nachos, using trans-fat-free chips and Cheddar cheese — from an area cheesemaker, not an industrial processor — the equivalent, she concedes, of an organic Twinkie. And she did not even try to change the pizza her first year. “I just can’t take everything away,” she says. “Or they will walk out.
“Change is never easy. And if it’s hard for us, imagine how hard it would be in Oklahoma or Omaha.”
Or Osceola. The Agatston team is fully aware that its goals are less ambitious than those in Berkeley, but that is an inevitable difference between the two districts, Hollar says. Whereas 55 percent of Osceola’s students receive free or reduced-cost lunches, only 41 percent of Berkeley students do. And while Osceola charges $1.50 to those who pay full price at the elementary school, Berkeley charges $2.50. And then there is another, immeasurable but distinct difference — the parents. Children are not the only ones who bring expectations to food, and Almon says, “I just think there is a different culture around eating healthy in California than there is here, and we have to account for that.”
In fact, she and Hollar have come to believe that the greatest resistance to nutritional change comes not from the children but from the grown-ups, starting with the very administrators who invited HOPS in. Palmore, for instance, was ambivalent from the start about much of the suggested change. She told a CBS News reporter, on the air, that she would prefer the whole-wheat rolls if they had gravy. She made a notable “yuck” face in one conversation she and I had about sweet potatoes, and she expected rebellion or worse if they were served to the students. “I was surprised that they were eating the sweet-potato fries,” she said a few months after they appeared in place of Tater Tots on the menu. “That is not a child-friendly food. I was surprised they ate the brown bread.” And she made that same face again.
Many among the rest of the staff (and among the general population, one might add) also seem to have complex relationships with food. To walk through any of the HOPS schools is to be struck by the fact that there are few adult role models when it comes to good nutrition and exercise. Several teachers approached Almon during the year and asked her if she would lead a group that wanted to start on the South Beach Diet. But while many attended the first meeting, far fewer made it to the second or any of the monthly meetings after that.
Nor are staff members the only ones with food issues. Some parents wondered why their children were being put on the South Beach Diet. (“It’s not a diet,” Agatston says of HOPS. “It’s just healthy food.”) Others expressed concern that the new way of eating would be liked too much by their children. After the schoolwide assembly to introduce the full program last September, one frantic parent called to report that her child was refusing to eat anything in the house that was not healthy. “I can’t afford to throw everything away,” the mother said. “Please tell her to eat.”
And even parents who say they enthusiastically endorse better food in schools often play the role of saboteurs. One January afternoon, two girls, both in the fourth grade, sat outside at Kissimmee Charter, each having a McDonald’s hamburger, French fries and a shake. Inside, the rest of the students were eating turkey burgers on whole-wheat buns. The girls had to dine in the garden because junk food is banned from the school. Sitting with them while they ate was the person who supplied the lunch — the mother of one of the girls.
“This is a treat,” she said. Her daughter had made honor roll, and this was the reward. “I don’t see why they should have to be out here away from everyone,” she continued. “What’s the harm in a treat now and then?”
Schools have internal cultures, and the ratio of enthusiasm to resistance varied from one to another. In some, the teachers used the materials and nutrition books Hollar sent; in others, they remained in boxes. In some, children actually stood up to move and stretch in the middle of a lesson; in others, the punishment for being noisy during gym class was sitting absolutely still on the floor. At Partin Settlement, there was usually a dish of candy for sale for one fund-raiser or another on the counter in the school office. At Kissimmee Charter, parents still sold Chick-fil-A biscuits (with gravy) in the parking lot every Wednesday to raise money for the Parent Teacher Organization. And once each marking period, honor-roll students are still given a coupon for a free hamburger, soda and small fries at McDonald’s — which is where the fourth grader’s mother got the idea of bringing in that lunch in the first place.
JoAnn Kandrac, who was the Kissimmee Charter principal during the last school year, is conflicted about the mixed messages. After all, she was the one who exiled fast food to the garden where the two girls were eating. But she has found tidy ways of rationalizing both. “You have to pick your battles,” she told me last winter. The biscuits “make quite a bit of money for the P.T.O., and they aren’t selling them to the children as much as to other parents who don’t have time for breakfast.”
As for the coupons, she says, sounding a lot like Chef Cooper defending pizza’s place on the menu: “I can’t pull everything away from the children. McDonald’s treats have been a tradition here forever. It’s naïve to think children don’t know about treats at a fast-food restaurant. Modern times call for modern methods. This is educating our children that they can make smart choices at places like McDonald’s.”
On a balmy day in January, the cafeteria was filled at 8:45 a.m. On the stage in front of the room, Michelle Lombardo, as energetic as any of the kindergartners in the front rows, was telling a story about characters known as the Organwise Guys, with names like Sir Rebrum and Hardy Heart. Lombardo, an independent consultant whose appearances are frequently financed by a grant from the Kellogg foundation, takes this act on the road throughout the country and was spending two days in Kissimmee. Soon she had the children chanting, “Low fat, high fiber, lots of water, exercise,” accompanied by moves that looked a lot like the Macarena. The floor was shaking so much that the LCD projector wobbled.
“This is a free radical,” she continued, clicking a drawing of an obvious bad guy onto the screen. “When you put a lot of fruits and vegetables in your body, they gang up on the bad guy and kick him out.” Then she introduced Peri Stolic, a smiling, animated bit of intestine. “She likes to be big and fluffy and filled with fruits and vegetables,” she said, “because she’s like a tube of toothpaste, and when she’s fluffy, it’s easier to squeeze the garbage out.”
During the trial year, it became clear that new food cannot simply appear on the lunch tray. Children must be taught about nutrition outside the lunchroom if they are to eat what is offered inside. That is why Lombardo and her Organwise characters began making regular visits to the HOPS schools and why Hollar circulated among tables introducing the Food of the Month.
In September it was broccoli. In October, sweet potatoes and apples. November, corn and cranberries. December meant tomatoes, and Almon asked the kitchen managers to order salsa, not realizing it was already part of the commodities stockpile. The managers, in turn, thought Almon’s request meant she did not approve of the fructose in the salsa that was already on hand and scrambled to purchase a different salsa. Almon, when she finally realized the confusion and read the labels, felt the commodity salsa would do, because fructose was the fifth ingredient, not one of the first four. But then she tasted the stuff, decided the commodity salsa was far too spicy for children and asked that it be cut with commodity canned tomatoes.
Small glitches and refinements continued into spring. The team and the district were still squabbling, for instance, over the price of cheese, and Agatston was still paying the difference between the higher-fat version that the school would have purchased and the lower-fat version that the HOPS menu required. They were paying for other items, too (whole-wheat versions of standard refined-wheat products, for instance), but the cheese was particularly expensive and one of the primary reasons the HOPS reimbursement had jumped to $3,700 a month (from $2,500 during the trial year). Hollar had researched a method, known as “processing,” that she felt could reduce the costs of cheese. The term is somewhat misleading, because it is not the cheese that is being processed. It is the bookkeeping. The school gives its credit for its allotted amount of whole-milk cheese to a producer, in this case Land O’Lakes, which in turn ships the low-fat kind to Osceola. But a change of this magnitude requires approval, which by March Hollar had still not been able to get.
Also in the spring, Hollar decided not to send new materials to many of the teachers who had received the original educational packets — things like curriculum suggestions and posters for the students to color. Too many were never used, she learned, and when she sent a questionnaire to the staff asking why, she was told that “they did not have time, did not want to take on additional teaching requirements, needed to focus on insuring that their kids passed the mandated state tests,” she says.
There was also the complication that state regulators at the Florida Education Department had questions about the program. A parent complained to that department that a child was being put on the South Beach Diet at school, leading to an audit of the school menus. The breakfast being served at the HOPS-intervention schools (and interestingly, at the control schools) was found to be under the state’s requirement of 554 calories. “It is almost impossible to get that many calories into a meal without too much fat and sugar,” Almon says. “The regs need to change, not the food.” The issue is still under discussion.
Despite the mixed results, the HOPS team is hopeful that real progress has been made and that the biggest fight — to make it a given that school lunch should be healthy — has been won. This is an increasingly common theme in conversations with healthy-lunch advocates throughout the country, who compare changing views of school lunch to public opinion on smoking: the same act has taken on different social meanings over the past few decades as the context around it has changed.
Cooper, too, is cautiously optimistic. “I think we are starting to see a movement,” she says. “We’re on the cusp of something.” The spate of obesity studies plus the diabetes data plus the new Congressional wellness plan requirements just might mean that a healthy school lunch will finally become the norm.
Lombardo ended her January program by asking the children to recite a pledge. “I do solemnly swear,” they repeated, “to be healthier, to eat low fat, to eat high fiber, to drink lots of water and get lots of exercise.”
Hollar added, under her breath, “And to stop bringing Lunchables to school.”
For a solid week last May, children lined up in the lobby of the Partin Settlement Elementary School, as they did at all the HOPS schools, waiting their turn to be weighed and measured. The youngest children stepped onto the scale happily and shared the number they saw with their classmates. The older children already knew that the number was loaded. They took off their shoes and their jewelry and excess clothing. They told their friends to stand away.
As they went from station to station, they had tape measures wrapped around their waists and their blood pressure taken, and they were asked whether they had played outside the day before or played video games after school, and whether they brought their lunches or bought them.
Despite years of lunchroom changes at schools throughout the country, only now have advocates realized that they need to buckle down and get the facts. It just may not be enough to say, as Alice Waters does: “This is something right as rain. These kids like this; they are engaged in this. Why can’t every child have this? Anybody who sees it, gets it.”
It is not even enough to prove, as at the Ross School, that changing the menus means children eat better lunches. (Cooper’s menu doubled the consumption of fruits and vegetables compared with both the national average and control groups. And 80 percent of the parents changed the way they shopped, cooked or ate, thanks to the input from their children.)
What is needed — to persuade donors and school boards and government entities that better food is worth the cost — is hard proof that improving the lunchroom actually improves children’s health. “We have to measure, to document what we’re doing and evaluate its results,” says Carina Wong, executive director of the Chez Panisse Foundation, which, with financing from the Rodale Institute, is embarking on a three-year study of the health of children in the Berkeley programs. The Center for Ecoliteracy, together with the Children’s Hospital Oakland Research Institute, is studying them too, pricking their fingers and measuring their blood-sugar levels.
Agatston understands both the need for and the risk of measurement. There is always the chance that the results will not confirm what practitioners are certain is true. After all, studies have so far failed to make a definitive link. “We don’t have these children 24 hours a day,” says Caballero, who did some of the studies. “They go home, they go out with friends, they are off all summer and everything about the world — fast food, video games, television ads, everything — conspires to undo even the best things that happen in schools.”
Similarly, the data from the first year of HOPS was inconclusive. The program did reduce the fat served in the intervention schools by 20 percent, reduced saturated fat by 26 percent and increased fiber twofold. But after that first year, there was no sign that the “overweight” rate among the children had fallen.
Those results did not keep the Agatston team from proceeding with a second year of HOPS or from expanding the program at the start of the school year that began in Florida last week. HOPS is still in place in Kissimmee — but this year it is being run and paid for by the school district directly. Almon and Hollar are merely advisers. The cheese will be “processed” through the commodities program and Land O’Lakes. Two more schools will adopt the HOPS menu. Palmore has increased the price of lunch — to $1.75 from $1.50 in the elementary schools — and expects to cut some corners. Last year’s whole-wheat chicken nuggets, she says, were, at 41 cents a serving, simply too expensive, so the schools will instead serve a compromise like grilled chicken patties (about 34 cents). The Agatston team is also bringing HOPS to 9,000 more students in the Miami-Dade County School District and intends to expand beyond Florida next year.
Just this month, Hollar received an analysis of the data from the past school year that hints that HOPS is really working. The overweight rate in the HOPS schools in fact declined during the 2005-6 school year: specifically, 23 of the 486 children who had been characterized as overweight when school began were characterized as merely “at risk” or “normal” when school ended. In the control schools, by contrast, there was no decline and three children actually gained enough weight that they were added to the overweight category.
Hollar describes these results as “cautiously exciting” and warns that the sample size is small and that a true trend cannot be determined for at least five years. Agatston, in turn, says that even if the results had been otherwise, that would not have been a reason to abandon the program.
“If the data don’t show what we want to see,” he says, “we aren’t going to throw up our hands and say, ‘Let them eat what they want.’ All that will mean is that we aren’t doing this as well as we can, so we will have to find a way to do it better.”
Lisa Belkin, a contributing writer for the magazine, wrote about Meredith Vieira in last week’s issue.
Friday, August 18
Wages, Wealth and Politics
By PAUL KRUGMAN
Recently, Henry Paulson, the Treasury secretary, acknowledged that economic inequality is rising in America. In a break with previous administration pronouncements, he also conceded that this might be cause for concern.
But he quickly reverted to form, falsely implying that rising inequality is mainly a story about rising wages for the highly educated. And he argued that nothing can be done about this trend, that “it is simply an economic reality, and it is neither fair nor useful to blame any political party.”
History suggests otherwise.
I’ve been studying the long-term history of inequality in the United States. And it’s hard to avoid the sense that it matters a lot which political party, or more accurately, which political ideology rules Washington.
Since the 1920’s there have been four eras of American inequality:
• The Great Compression, 1929-1947: The birth of middle-class America. The real wages of production workers in manufacturing rose 67 percent, while the real income of the richest 1 percent of Americans actually fell 17 percent.
• The Postwar Boom, 1947-1973: An era of widely shared growth. Real wages rose 81 percent, and the income of the richest 1 percent rose 38 percent.
• Stagflation, 1973-1980: Everyone lost ground. Real wages fell 3 percent, and the income of the richest 1 percent fell 4 percent.
• The New Gilded Age, 1980-?: Big gains at the very top, stagnation below. Between 1980 and 2004, real wages in manufacturing fell 1 percent, while the real income of the richest 1 percent — people with incomes of more than $277,000 in 2004 — rose 135 percent.
What’s noticeable is that except during stagflation, when virtually all Americans were hurt by a tenfold increase in oil prices, what happened in each era was what the dominant political tendency of that era wanted to happen.
Franklin Roosevelt favored the interests of workers while declaring of plutocrats who considered him a class traitor, “I welcome their hatred.” Sure enough, under the New Deal wages surged while the rich lost ground.
What followed was an era of bipartisanship and political moderation; Dwight Eisenhower said of those who wanted to roll back the New Deal, “Their number is negligible, and they are stupid.” Sure enough, it was also an era of equable growth.
Finally, since 1980 the U.S. political scene has been dominated by a conservative movement firmly committed to the view that what’s good for the rich is good for America. Sure enough, the rich have seen their incomes soar, while working Americans have seen few if any gains.
By the way: Yes, Bill Clinton was president for eight years. But for six of those years Congress was controlled by hard-line right-wingers. Moreover, in practice Mr. Clinton governed well to the right of both Eisenhower and Nixon.
Now, this chronology doesn’t prove that politics drives changes in inequality. There were certainly other factors at work, including technological change, globalization and immigration, an issue that cuts across party lines.
But it seems likely that government policies have played a big role in America’s growing economic polarization — not just easily measured policies like tax rates for the rich and the level of the minimum wage, but things like the shift in Labor Department policy from protection of worker rights to tacit support for union-busting.
And if that’s true, it matters a lot which party is in power — and more important, which ideology. For the last few decades, even Democrats have been afraid to make an issue out of inequality, fearing that they would be accused of practicing class warfare and lose the support of wealthy campaign contributors.
That may be changing. Inequality seems to be an issue whose time has finally come, and if the growing movement to pressure Wal-Mart to treat its workers better is any indication, economic populism is making a comeback. It’s still unclear when the Democrats might regain power, or what economic policies they’ll pursue when they do. But if and when we get a government that tries to do something about rising inequality, rather than responding with a mixture of denial and fatalism, we may find that Mr. Paulson’s “economic reality” is a lot easier to change than he supposes.
Thursday, August 17
Hoping for Fear
By PAUL KRUGMAN
Just two days after 9/11, I learned from Congressional staffers that Republicans on Capitol Hill were already exploiting the atrocity, trying to use it to push through tax cuts for corporations and the wealthy. I wrote about the subject the next day, warning that “politicians who wrap themselves in the flag while relentlessly pursuing their usual partisan agenda are not true patriots.”
The response from readers was furious — fury not at the politicians but at me, for suggesting that such an outrage was even possible. “How can I say that to my young son?” demanded one angry correspondent.
I wonder what he says to his son these days.
We now know that from the very beginning, the Bush administration and its allies in Congress saw the terrorist threat not as a problem to be solved, but as a political opportunity to be exploited. The story of the latest terror plot makes the administration’s fecklessness and cynicism on terrorism clearer than ever.
Fecklessness: the administration has always pinched pennies when it comes to actually defending America against terrorist attacks. Now we learn that terrorism experts have known about the threat of liquid explosives for years, but that the Bush administration did nothing about that threat until now, and tried to divert funds from programs that might have helped protect us. “As the British terror plot was unfolding,” reports The Associated Press, “the Bush administration quietly tried to take away $6 million that was supposed to be spent this year developing new explosives detection technology.”
Cynicism: Republicans have consistently portrayed their opponents as weak on terrorism, if not actually in sympathy with the terrorists. Remember the 2002 TV ad in which Senator Max Cleland of Georgia was pictured with Osama bin Laden and Saddam Hussein? Now we have Dick Cheney suggesting that voters in the Democratic primary in Connecticut were lending aid and comfort to “Al Qaeda types.” There they go again.
More fecklessness, and maybe more cynicism, too: NBC reports that there was a dispute between the British and the Americans over when to make arrests in the latest plot. Since the alleged plotters weren’t ready to go — they hadn’t purchased airline tickets, and some didn’t even have passports yet — British officials wanted to watch and wait, hoping to gather more evidence. But according to NBC, the Americans insisted on early arrests.
Suspicions that the Bush administration might have had political motives in wanting the arrests made prematurely are fed by memories of events two years ago: the Department of Homeland Security declared a terror alert just after the Democratic National Convention, shifting the spotlight away from John Kerry — and, according to Pakistani intelligence officials, blowing the cover of a mole inside Al Qaeda.
But whether or not there was something fishy about the timing of the latest terror announcement, there’s the question of whether the administration’s scare tactics will work. If current polls are any indication, Republicans are on the verge of losing control of at least one house of Congress. And “on every issue other than terrorism and homeland security,” says Newsweek about its latest poll, “the Dems win.” Can a last-minute effort to make a big splash on terror stave off electoral disaster?
Many political analysts think it will. But even on terrorism, and even after the latest news, polls give Republicans at best a slight advantage. And Democrats are finally doing what they should have done long ago: calling foul on the administration’s attempt to take partisan advantage of the terrorist threat.
It was significant both that President Bush felt obliged to defend himself against that accusation in his Saturday radio address, and that his standard defense — attacking a straw man by declaring that “there should be no disagreement about the dangers we face” — came off sounding so weak.
Above all, many Americans now understand the extent to which Mr. Bush abused the trust the nation placed in him after 9/11. Americans no longer believe that he is someone who will keep them safe, as many did even in 2004; the pathetic response to Hurricane Katrina and the disaster in Iraq have seen to that.
All Mr. Bush and his party can do at this point is demonize their opposition. And my guess is that the public won’t go for it, that Americans are fed up with leadership that has nothing to hope for but fear itself.
Federal Judge Orders End to Warrantless Wiretapping
By DAVID STOUT
WASHINGTON, Aug. 17 — A federal judge in Detroit ruled today that the Bush administration’s eavesdropping program is illegal and unconstitutional, and she ordered that it cease at once.
District Judge Anna Diggs Taylor found that President Bush exceeded his proper authority and that the eavesdropping without warrants violated the First and Fourth Amendment protections of free speech and privacy.
“It was never the intent of the Framers to give the president such unfettered control, particularly where his actions blatantly disregard the parameters clearly enumerated in the Bill of Rights,” she wrote, in a decision that the White House and Justice Department said they would fight to overturn. A hearing will be held before Judge Taylor on Sept. 7, and her decision will not be enforced in the meantime pending the government’s appeal.
The judge’s ruling is the latest chapter in the continuing debate over the proper balance between national security and personal liberty since the attacks of Sept. 11, 2001, which inspired the eavesdropping program and other surveillance measures that the administration says are necessary and constitutional and its critics say are intrusive.
In becoming the first federal judge to declare the eavesdropping program unconstitutional, Judge Taylor rejected the administration’s assertion that to defend itself against a lawsuit would force it to divulge information that should be kept secret in the name of national security.
“Predictably, the war on terror of this administration has produced a vast number of cases, in which the states secrets privilege has been invoked,” Judge Taylor wrote. She noted that the Supreme Court has held that because the president’s power to withhold secrets is so broad, “it is not to be lightly invoked.” She also cited a finding in an earlier case by the Court of Appeals for the District of Columbia Circuit that “whenever possible, sensitive information must be disentangled from nonsensitive information to allow for the release of the latter.”
In any event, she said, she is convinced that the administration could defend itself in this case without disclosing state secrets. Judge Taylor’s ruling came in a suit filed by the American Civil Liberties Union on behalf of journalists, scholars, lawyers and various nonprofit organizations who argued that the possibility of eavesdropping by the National Security Agency interfered with their work.
Although she ordered an immediate halt to the eavesdropping program, no one who has followed the controversy expects the litigation to end quickly. The White House issued a statement saying “we couldn’t disagree more” with Judge Taylor’s decision and crediting the surveillance program with saving American lives.
Attorney General Alberto Gonzales said this afternoon that he was disappointed with the decision, and that while the stay is in place “we will continue to utilize the program to ensure that America is safer.” Mr. Gonzales said he remained confident that the program was constitutional, and that Congress had given the president all the authority he needed when it authorized the use of military force after the Sept. 11 attacks.
Earlier, the Justice Department called the surveillance program “a critical tool” against Al Qaeda and said the parties to the suit have agreed to a stay of Judge Taylor’s order until the Sept. 7 hearing. On that day, the judge will be asked to prolong the stay of her order pending further appeals, to the Court of Appeals for the Sixth Circuit or perhaps to the Supreme Court.
Some Republicans voiced disappointment over the ruling, while Democrats praised it. The starkly different reactions signaled more heated debate on Capitol Hill when Congress reconvenes.
But for the moment, the ruling by Judge Taylor caused elation among the plaintiffs.
“It’s another nail in the coffin of executive unilateralism,” said Jameel Jaffer, a lawyer for the plaintiffs with the A.C.L.U. And Anthony Romero, executive director of the A.C.L.U., said Judge Taylor’s ruling “confirms that the government has been acting illegally, in contravention of the Foreign Intelligence Surveillance Act and the Fourth Amendment.”
The surveillance act was passed by Congress in 1978 in response to disclosures of previous government improprieties in eavesdropping. The act established a secret court to handle applications for surveillance operations, and set up procedures for surveillance to take place, in some limited circumstances and for limited times, while applications for warrants are pending.
Judge Taylor said “the president has acted, undisputedly, as F.I.S.A. forbids,” thus defying the express will of Congress, and she was unpersuaded by the government’s stance that it could not defend itself in the lawsuit without doing the country harm.
“Consequently, the court finds defendants’ arguments that they cannot defend this case without the use of classified information to be disingenuous and without merit,” she wrote.
The judge, who heard arguments in the case in June, brushed aside several assertions made by lawyers for the National Security Agency. She held that, contrary to the N.S.A.’s assertions, the plaintiffs were suffering real harm, and had standing to sue the government.
“Here, plaintiffs are not asserting speculative allegations,” she said.
Judge Taylor, appointed by President Jimmy Carter in 1979, did not deal a total defeat to the administration. She dismissed a separate claim by the A.C.L.U. over data-mining of telephone records, agreeing that further litigation could indeed jeopardize state secrets.
But over all, Judge Taylor’s decision was a rebuke to the administration, as she made clear in closing by quoting Chief Justice Earl Warren’s words in a 1967 ruling: “Implicit in the term ‘national defense’ is the notion of defending those values and ideas which set this nation apart.”
Democrats said Judge Taylor saw things the right way. “Today’s district court ruling is a strong rebuke of this administration’s illegal wiretapping program,” said Senator Russell D. Feingold of Wisconsin. “The president must return to the Constitution and follow the statutes passed by Congress. We all want our government to monitor suspected terrorists, but there is no reason for it to break the law to do so.”
Representative Ed Markey of Massachusetts, a senior Democrat on the House Homeland Security Committee, said the administration should stop “poking holes in the Constitution” and concentrate on “plugging holes in homeland security.”
But Republicans lined up behind the administration. “America cannot stop terrorists while wearing blinders,” said House Speaker J. Dennis Hastert. “We stop terrorists by watching them, following them, listening in on their plans, and then arresting them before they can strike. Our terrorist surveillance programs are critical to fighting the war on terror and saved the day by foiling the London terror plot.”
Senator Bill Frist of Tennessee, the majority leader, agreed. “We need to strengthen, not weaken, our ability to foil terrorist plots before they can do us harm,” he said. “I encourage swift appeal by the government and quick reversal of this unfortunate decision.”
WASHINGTON, Aug. 17 — A federal judge in Detroit ruled today that the Bush administration’s eavesdropping program is illegal and unconstitutional, and she ordered that it cease at once.
District Judge Anna Diggs Taylor found that President Bush exceeded his proper authority and that the eavesdropping without warrants violated the First and Fourth Amendment protections of free speech and privacy.
“It was never the intent of the Framers to give the president such unfettered control, particularly where his actions blatantly disregard the parameters clearly enumerated in the Bill of Rights,” she wrote, in a decision that the White House and Justice Department said they would fight to overturn. A hearing will be held before Judge Taylor on Sept. 7, and her decision will not be enforced in the meantime pending the government’s appeal.
The judge’s ruling is the latest chapter in the continuing debate over the proper balance between national security and personal liberty since the attacks of Sept. 11, 2001, which inspired the eavesdropping program and other surveillance measures that the administration says are necessary and constitutional and its critics say are intrusive.
In becoming the first federal judge to declare the eavesdropping program unconstitutional, Judge Taylor rejected the administration’s assertion that to defend itself against a lawsuit would force it to divulge information that should be kept secret in the name of national security.
“Predictably, the war on terror of this administration has produced a vast number of cases, in which the state secrets privilege has been invoked,” Judge Taylor wrote. She noted that the Supreme Court has held that because the state secrets privilege is so powerful, “it is not to be lightly invoked.” She also cited a finding in an earlier case by the Court of Appeals for the District of Columbia Circuit that “whenever possible, sensitive information must be disentangled from nonsensitive information to allow for the release of the latter.”
In any event, she said, she is convinced that the administration could defend itself in this case without disclosing state secrets. Judge Taylor’s ruling came in a suit filed by the American Civil Liberties Union on behalf of journalists, scholars, lawyers and various nonprofit organizations who argued that the possibility of eavesdropping by the National Security Agency interfered with their work.
Although she ordered an immediate halt to the eavesdropping program, no one who has followed the controversy expects the litigation to end quickly. The White House issued a statement saying “we couldn’t disagree more” with Judge Taylor’s decision and crediting the surveillance program with saving American lives.
Attorney General Alberto Gonzales said this afternoon that he was disappointed with the decision, and that while the stay is in place “we will continue to utilize the program to ensure that America is safer.” Mr. Gonzales said he remained confident that the program was constitutional, and that Congress had given the president all the authority he needed when it authorized the use of military force after the Sept. 11 attacks.
Earlier, the Justice Department called the surveillance program “a critical tool” against Al Qaeda and said the parties to the suit have agreed to a stay of Judge Taylor’s order until the Sept. 7 hearing. On that day, the judge will be asked to prolong the stay of her order pending further appeals, to the Court of Appeals for the Sixth Circuit or perhaps to the Supreme Court.
Some Republicans voiced disappointment over the ruling, while Democrats praised it. The starkly different reactions signaled more heated debate on Capitol Hill when Congress reconvenes.
But for the moment, the ruling by Judge Taylor caused elation among the plaintiffs.
“It’s another nail in the coffin of executive unilateralism,” said Jameel Jaffer, a lawyer for the plaintiffs with the A.C.L.U. And Anthony Romero, executive director of the A.C.L.U., said Judge Taylor’s ruling “confirms that the government has been acting illegally, in contravention of the Foreign Intelligence Surveillance Act and the Fourth Amendment.”
The surveillance act was passed by Congress in 1978 in response to disclosures of previous government improprieties in eavesdropping. The act established a secret court to handle applications for surveillance operations, and set up procedures for them to take place while applications for warrants are pending in some limited circumstances and for limited times.
Judge Taylor said “the president has acted, undisputedly, as F.I.S.A. forbids,” thus defying the express will of Congress, and she was unpersuaded by the government’s stance that it could not defend itself in the lawsuit without doing the country harm.
“Consequently, the court finds defendants’ arguments that they cannot defend this case without the use of classified information to be disingenuous and without merit,” she wrote.
The judge, who heard arguments in the case in June, brushed aside several assertions made by lawyers for the National Security Agency. She held that, contrary to the N.S.A.’s assertions, the plaintiffs were suffering real harm, and had standing to sue the government.
“Here, plaintiffs are not asserting speculative allegations,” she said.
Judge Taylor, appointed by President Jimmy Carter in 1979, did not deal a total defeat to the administration. She dismissed a separate claim by the A.C.L.U. over data-mining of telephone records, agreeing that further litigation could indeed jeopardize state secrets.
But over all, Judge Taylor’s decision was a rebuke to the administration, as she made clear in closing by quoting Chief Justice Earl Warren’s words in a 1967 ruling: “Implicit in the term ‘national defense’ is the notion of defending those values and ideas which set this nation apart.”
Democrats said Judge Taylor saw things the right way. “Today’s district court ruling is a strong rebuke of this administration’s illegal wiretapping program,” said Senator Russell D. Feingold of Wisconsin. “The president must return to the Constitution and follow the statutes passed by Congress. We all want our government to monitor suspected terrorists, but there is no reason for it to break the law to do so.”
Representative Ed Markey of Massachusetts, a senior Democrat on the House Homeland Security Committee, said the administration should stop “poking holes in the Constitution” and concentrate on “plugging holes in homeland security.”
But Republicans lined up behind the administration. “America cannot stop terrorists while wearing blinders,” said House Speaker J. Dennis Hastert. “We stop terrorists by watching them, following them, listening in on their plans, and then arresting them before they can strike. Our terrorist surveillance programs are critical to fighting the war on terror and saved the day by foiling the London terror plot.”
Senator Bill Frist of Tennessee, the majority leader, agreed. “We need to strengthen, not weaken, our ability to foil terrorist plots before they can do us harm,” he said. “I encourage swift appeal by the government and quick reversal of this unfortunate decision.”
Tuesday, August 15
How to Make Sure Children Are Scientifically Illiterate
By LAWRENCE M. KRAUSS
Voters in Kansas ensured this month that noncreationist moderates will once again have a majority (6 to 4) on the state school board, keeping new standards inspired by intelligent design from taking effect.
This is a victory for public education and sends a message nationwide about the public’s ability to see through efforts by groups like the Discovery Institute to misrepresent science in the schools. But for those of us who are interested in improving science education, any celebration should be muted.
... But perhaps more worrisome than a political movement against science is plain old ignorance. The people determining the curriculum of our children in many states remain scientifically illiterate. And Kansas is a good case in point.
The chairman of the school board, Dr. Steve Abrams, a veterinarian, is not merely a strict creationist. He has openly stated that he believes that God created the universe 6,500 years ago, although he was quoted in The New York Times this month as saying that his personal faith “doesn’t have anything to do with science.”
“I can separate them,” he continued, adding, “My personal views of Scripture have no room in the science classroom.”
A key concern should not be whether Dr. Abrams’s religious views have a place in the classroom, but rather how someone whose religious views require a denial of essentially all modern scientific knowledge can be chairman of a state school board.
I have recently been criticized by some for strenuously objecting in print to what I believe are scientifically inappropriate attempts by some scientists to discredit the religious faith of others. However, the age of the earth, and the universe, is no more a matter of religious faith than is the question of whether or not the earth is flat.
It is a matter of overwhelming scientific evidence. To maintain a belief in a 6,000-year-old earth requires a denial of essentially all the results of modern physics, chemistry, astronomy, biology and geology. It is to imply that airplanes and automobiles work by divine magic, rather than by empirically testable laws.
Dr. Abrams has no choice but to separate his views from what is taught in science classes, because what he says he believes is inconsistent with the most fundamental facts the Kansas schools teach children.
Another member of the board, who unfortunately survived a primary challenge, is John Bacon. In spite of his name, Mr. Bacon is no friend of science. In a 1999 debate about the removal of evolution and the Big Bang from science standards, Mr. Bacon said he was baffled about the objections of scientists. “I can’t understand what they’re squealing about,” he is quoted as saying. “I wasn’t here, and neither were they.”
This again represents a remarkable misunderstanding of the nature of the scientific method. Many fields — including evolutionary biology, astronomy and physics — use evidence from the past in formulating hypotheses. But they do not stop there. Science is not storytelling.
These disciplines take hypotheses and subject them to further tests and experiments. This is how we distinguish theories that work, like evolution or gravitation, from those that don’t.
As we continue to work to improve the abysmal state of science education in our schools, we will continue to battle those who feel that knowledge is a threat to faith.
But when we win minor skirmishes, as we did in Kansas, we must remember that the issue is far deeper than this. We must hold our elected school officials to certain basic standards of knowledge about the world. The battle is not against faith, but against ignorance.
Lawrence M. Krauss is a professor of physics and astronomy at Case Western Reserve University.
Sunday, August 13
Sizing Up the People Who Tell Us to Take Our Shoes Off
August 13, 2006
By SEWELL CHAN and RICHARD PÉREZ-PEÑA
In exercises by undercover government workers, and experiences with real passengers, screeners repeatedly failed to prevent weapons from being taken aboard airliners. And their ranks have shrunk since 2003 even as air travel has increased.
The Transportation Security Administration, created two months after 9/11, had a rocky start — millions wasted in the rush to hire, reliance on dubious contractors, even an inability to pay people on time, according to several government reports.
Among the most serious problems that were discovered was that the agency hired hundreds of screeners with criminal records, in some cases for felonies as serious as manslaughter and rape. Reports of thefts soared as more bags than ever were inspected by hand.
The starting salary for screeners is less than $24,000, and some are hired without high school diplomas. People who do specialized work like reading X-rays are paid no better than those who ask people to take their shoes off.
Yet the Government Accountability Office reported in April that investigators slipped bomb components past checkpoints at all 21 airports where they tried. The components could be combined on board to make an explosive — the very strategy British authorities say plotters in England planned to use.
Private security experts say the airport screeners would be hard-pressed to stop sophisticated terrorists.
Fat Factors
August 13, 2006
By ROBIN MARANTZ HENIG
In the 30-plus years that Richard Atkinson has been studying obesity, he has always maintained that overeating doesn’t really explain it all. His epiphany came early in his career, when he was a medical fellow at U.C.L.A. engaged in a study of people who weighed more than 300 pounds and had come in for obesity surgery. “The general thought at the time was that fat people ate too much,” Atkinson, now at Virginia Commonwealth University, told me recently. “And we documented that fat people do eat too much — our subjects ate an average of 6,700 calories a day. But what was so impressive to me was the fact that not all fat people eat too much.”
One of Atkinson’s most memorable patients was Janet S., a bright, funny 25-year-old who weighed 348 pounds when she finally made her way to U.C.L.A. in 1975. In exchange for agreeing to be hospitalized for three months so scientists could study them, Janet and the other obese research subjects (30 in all) each received a free intestinal bypass. During the three months of presurgical study, the dietitian on the research team calculated how many calories it should take for a 5-foot-6-inch woman like Janet to maintain a weight of 348. They fed her exactly that many calories — no more, no less. She dutifully ate what she was told, and she gained 12 pounds in two weeks — almost a pound a day.
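For a sense of what the dietitian’s calculation involves, here is a hypothetical version in Python using the Mifflin-St Jeor equation — a modern estimator published in 1990, so certainly not the method the 1975 team used; it is included only to show the shape of the arithmetic, with a sedentary activity factor appropriate to hospital bed rest.

    # Hypothetical sketch of estimating daily maintenance calories.
    # Mifflin-St Jeor (1990) is used here for illustration; the
    # actual 1975 method is not described in the article.

    def maintenance_estimate(weight_lb, height_in, age, activity=1.2):
        """Rough daily calories to hold a given weight (female form)."""
        kg = weight_lb * 0.4536
        cm = height_in * 2.54
        bmr = 10 * kg + 6.25 * cm - 5 * age - 161
        return bmr * activity

    # A 5-foot-6, 25-year-old woman weighing 348 pounds:
    print(round(maintenance_estimate(348, 66, 25)))  # roughly 2,800 kcal/day

On numbers like these, Janet’s pound-a-day gain while eating only her calculated maintenance intake is exactly the anomaly that caught the researchers’ attention.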
“I don’t think I’d ever gained that much weight that quickly,” recalled Janet, who asked me not to use her full name because she didn’t want people to know how fat she had once been. The doctors accused her of sneaking snacks into the hospital. “But I told them, ‘I’m gaining weight because you’re feeding me a tremendous amount of food!’ ”
The experience with Janet was an early inkling that traditional ideas about obesity were incomplete. Researchers and public-health officials have long understood that to maintain a given weight, energy in (calories consumed) must equal energy out (calories expended). But then they learned that genes were important, too, and that for some people, like Janet, this formula was tilted in a direction that led to weight gain. Since the discovery of the first obesity gene in 1994, scientists have found about 50 genes involved in obesity. Some of them determine how individuals lay down fat and metabolize energy stores. Others regulate how much people want to eat in the first place, how they know when they’ve had enough and how likely they are to use up calories through activities ranging from fidgeting to running marathons. People like Janet, who can get fat on very little fuel, may be genetically programmed to survive in harsher environments. When the human species got its start, it was an advantage to be efficient. Today, when food is plentiful, it is a hazard.
But even as our understanding of genes and behavior has become more refined, some cases still boggle the mind, like identical twins who eat roughly the same and yet have vastly different weights. Now a third wave of obesity researchers is looking for explanations that don’t fall into the relatively easy ones of genetics, overeating or lack of exercise. They are investigating what might seem to be the unlikeliest of culprits: the microorganisms we encounter every day.
Jeffrey Gordon, whose theory is that obesity is related to intestinal microorganisms, has never had a weight problem. I went to meet him and his colleagues at the Center for Genome Sciences at Washington University, which he directs. I wanted to find out everything Gordon knows about the bugs in our guts, and how those bugs might contribute to human physiology — in particular, how they might make some people fat.
Of the trillions and trillions of cells in a typical human body — at least 10 times as many cells in a single individual as there are stars in the Milky Way — only about 1 in 10 is human. The other 90 percent are microbial. These microbes — a term that encompasses all forms of microscopic organisms, including bacteria, fungi, protozoa and a form of life called archaea — exist everywhere. They are found in the ears, nose, mouth, vagina, anus, as well as every inch of skin, especially the armpits, the groin and between the toes. The vast majority are in the gut, which harbors 10 trillion to 100 trillion of them. “Microbes colonize our body surfaces from the moment of our birth,” Gordon said. “They are with us throughout our lives, and at the moment of our death they consume us.”
Known collectively as the gut microflora (or microbiota, a term Gordon prefers because it derives from the Greek word bios, for “life”), these microbes have a Star Trek analogue, he says: the Borg Collective, a community of cybernetically enhanced humanoids with functions so intertwined that they operate as a single intelligence, sort of like an ant colony. In its Borglike way, the microflora assumes an extraordinary array of functions on our behalf — functions that we couldn’t manage on our own. It helps create the capillaries that line and nourish the intestines. It produces vitamins, in particular thiamine, pyridoxine and vitamin K. It provides the enzymes necessary to metabolize cholesterol and bile acid. It digests complex plant polysaccharides, the fiber found in grains, fruits and vegetables that would otherwise be indigestible.
And it helps extract calories from the food we eat and helps store those calories in fat cells for later use — which gives the microflora, in effect, a role in determining whether our diets will make us fat or thin.
In the womb, humans are free of microbes. Colonization begins during the journey down the birth canal, which is riddled with bacteria, some of which make their way onto the newborn’s skin. From that moment on, every mother’s kiss, every swaddling blanket, carries on it more microbes, which are introduced into the baby’s system.
By about the age of 2, most of a person’s microbial community is established, and it looks much like any other person’s microbial community. But in the same way that it takes only a small percentage of our genome to make each of us unique, modest differences in our microflora may make a big difference from one person to another. It’s not clear what accounts for individual variations. Some guts may be innately more hospitable to certain microbes, either because of genetics or because of the mix of microbes already there. Most of the colonization probably happens in the first few years, which explains why the microflora fingerprints of adult twins, who shared an intimate environment (and a mother) in childhood, more closely resemble each other than they do those of their spouses, with whom they became intimate later in life.
No one yet knows whether an individual’s microflora community tends to remain stable for a lifetime, but it is known that certain environmental changes, like taking antibiotics, can alter it at least temporarily. Stop the antibiotics, and the microflora seems to bounce back — but it might not bounce back to exactly what it was before the antibiotics.
In 2004, a group of microbiologists at Stanford University led by David Relman conducted the first census of the gut microflora. It took them a year to do an analysis of just three healthy subjects, by which time they had counted 395 species of bacteria. They stopped counting before the census was complete; Relman has said the real count might be anywhere from 500 species to a few thousand.
About a year ago, Relman joined with other scientists, including Jeffrey Gordon, to begin to sequence all the genes of the human gut microflora. In early June, they published their results in Science: some 78 million base pairs in all. But even this huge number barely scratches the surface; the total number of base pairs in the gut microflora might be 100 times that. Because there are so many trillions of microbes in the gut, the vast majority of the genes that a person carries around are microbial, not human. “Humans are superorganisms,” the scientists wrote, “whose metabolism represents an amalgamation of microbial and human attributes.” They call this amalgamation — human genes plus microbial genes — the metagenome.
Gordon first began studying the connection between the microflora and obesity when he saw what happened to mice without any microbes at all. These germ-free mice, reared in sterile isolators in Gordon’s lab, had 60 percent less fat than ordinary mice. Although they ate voraciously, usually about 30 percent more food than the others, they stayed lean. Without gut microbes, they were unable to extract calories from some of the types of food they ate, which passed through their bodies without being either used or converted to fat.
When Gordon’s postdoctoral researcher Fredrik Bäckhed transplanted gut microbes from normal mice into the germ-free mice, the germ-free mice started metabolizing their food better, extracting calories efficiently and laying down fat to store for later use. Within two weeks, they were just as fat as ordinary mice. Bäckhed and Gordon found at least one mechanism that helps explain this observation. As they reported in the Proceedings of the National Academy of Sciences in 2004, some common gut bacteria, including B. theta, suppress the protein FIAF, which ordinarily prevents the body from storing fat. By suppressing FIAF, B. theta allows fat deposition to increase. A different gut microbe, M. smithii, was later found to interact with B. theta in a way that extracts additional calories from polysaccharides in the diet, further increasing the amount of fat available to be deposited after the mouse eats a meal. Mice whose guts were colonized with both B. theta and M. smithii — as usually happens in humans in the real world — were found to have about 13 percent more body fat than mice colonized by just one or the other.
Gordon likes to explain his hypothesis of what gut microbes do by talking about Cheerios. The cereal box says that a one-cup serving contains 110 calories. But it may be that not everyone will extract 110 calories from a cup of Cheerios. Some may extract more, some less, depending on the particular combination of microbes in their guts. “A diet has a certain amount of absolute energy,” he said. “But the amount that can be extracted from that diet may vary between individuals — not in a huge way, but if the energy balance is affected by just a few calories a day, over time that can make a big difference in body weight.”
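Gordon’s “few calories a day” point is easy to quantify. Here is a back-of-the-envelope sketch in Python; the 3,500-calories-per-pound conversion is a common rule of thumb rather than a figure from Gordon’s work, and a constant daily surplus is a deliberate simplification of real energy balance.

    # Back-of-the-envelope sketch: how a small daily difference in
    # extracted calories compounds over time. Assumes the common rule
    # of thumb that ~3,500 excess calories correspond to roughly one
    # pound of stored fat (an approximation, not a study result).

    KCAL_PER_POUND = 3500.0

    def pounds_gained(extra_kcal_per_day, days):
        """Pounds of fat from a constant daily calorie surplus."""
        return extra_kcal_per_day * days / KCAL_PER_POUND

    # If one person's microbes extract just 10 more calories a day
    # from the same diet as someone else's:
    for years in (1, 5, 10):
        print(years, "yr:", round(pounds_gained(10, 365 * years), 1), "lb")
    # prints roughly 1.0, 5.2 and 10.4 pounds

Ten extra calories is a fraction of a percent of a typical diet, yet it compounds to about a pound a year — which is the sense in which a small microbial difference “can make a big difference in body weight.”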
Gordon says he is still far from understanding the relationship between gut microflora and weight gain. “I wish you were writing this article a year from now, even two years from now,” he told me. “We’re just beginning to explore this wilderness, finding out who’s there, how does that population change, which are the key players.” He says it will be a while before anyone figures out what the gut microbes do, how they interact with one another and how, or even whether, they play a role in obesity. And it will be even longer before anyone learns how to change the microflora in a deliberate way.
There’s another way that biological middlemen might be involved in obesity — in this case, not the gut microbes (mostly bacteria) with which we co-exist but the viruses and other pathogens that occasionally infect us and make us ill. This is the subspecialty that is being called infectobesity.
The idea of infectobesity dates to 1988, when Nikhil Dhurandhar was a young physician studying for his doctorate in biochemistry at the University of Bombay. He was having tea with his father, also a physician and the head of an obesity clinic, and an old family friend, S. M. Ajinkya, a pathologist at Bombay Veterinary College. Ajinkya was describing a plague that was killing thousands of chickens throughout India, caused by a new poultry virus that he had discovered and named with his own and a colleague’s initials, SMAM-1. On autopsy, the vet said, chickens infected with SMAM-1 revealed pale and enlarged livers and kidneys, an atrophied thymus and excess fat in the abdomen.
The finding of abdominal fat intrigued Dhurandhar. “If a chicken died of infection, having wasted away, it should be less fat, not more,” he remembered thinking at the time. He asked permission to conduct a small experiment at the vet school.
Working with about 20 chickens, Dhurandhar, then 28, infected half of them with SMAM-1. He fed them all the same amount of food, but only the infected chickens became obese. Strangely, despite their excess fat, the infected obese chickens had low levels of cholesterol and triglycerides in their blood — just the opposite of what was thought to happen in humans, whose cholesterol and triglyceride levels generally increase as their weight increases. After his pilot study in 1988, Dhurandhar conducted a larger one with 100 chickens. It confirmed his finding that SMAM-1 caused obesity in chickens.
But what about humans? With a built-in patient population from his clinic, Dhurandhar collected blood samples from 52 overweight patients. Ten of them, nearly 20 percent, showed antibody evidence of prior exposure to the SMAM-1 virus, which was a chicken virus not previously thought to have infected humans. Moreover, the once-infected patients weighed an average of 33 pounds more than those who were never infected and, most surprisingly, had lower cholesterol and triglyceride levels — the same paradoxical finding as in the chickens.
The findings violated three pieces of conventional wisdom, Dhurandhar said recently: “The first is that viruses don’t cause obesity. The second is that obesity leads to high cholesterol and triglycerides. The third is that avian viruses don’t infect humans.”
Dhurandhar, now 46, is a thoughtful man with a head of still-dark hair. Like Gordon, he has never been fat. But even though he is so firmly in the biological camp of obesity researchers, he ascribes his own weight control to behavior, not microbes; he says he is slim because he walks five miles a day, lifts weights and is careful about what he eats. Being overweight runs in his family; Dhurandhar’s father, who still practices medicine in India, began treating obese patients because of his own struggle to keep his weight down, from a onetime high of 220.
Slim as he is, Dhurandhar nonetheless is sensitive to the pain of being fat and the maddening frustration of trying to do anything about it. He takes to heart the anguished letters and e-mail he receives each time his research is publicized. Once, he said, he heard from a woman whose 10-year-old grandson weighed 184 pounds. The boy rode his bicycle until his feet bled, hoping to lose weight; he was so embarrassed by his body that he kept his T-shirt on when he went swimming. The grandmother told Dhurandhar that the virus research sounded like the answer to her prayers. But the scientist knew that even if a virus was to blame for this boy’s obesity, he was a long way from offering any real help.
In 1992, Dhurandhar moved his wife and 7-year-old son to the United States in search of a lab where he could continue his research. At first, because infectobesity was so far out of the mainstream, all he could find was unrelated work at North Dakota State University. “My wife and I gave ourselves two years,” he recalled. “If I didn’t find work in the field of viruses and obesity in two years, we would go back to Bombay.”
One month before his self-imposed deadline in 1994, Dhurandhar received a job offer from Richard Atkinson, who was then at the University of Wisconsin, Madison. Atkinson, always on the lookout for new biological explanations of obesity, wanted to collaborate with Dhurandhar on SMAM-1. But the virus existed only in India, and the U.S. government would not allow it to be imported. So the scientists decided to work with a closely related virus, a human adenovirus. They opened the catalogue of a laboratory-supply company to see which one of the 50 human adenoviruses they should order.
“I’d like to say we chose the virus out of some wisdom, out of some belief that it was similar in important ways to SMAM-1,” Dhurandhar said. But really, he admitted, it was dumb luck that the adenovirus they started with, Ad-36, turned out to be so fattening.
By this time, several pathogens had already been shown to cause obesity in laboratory animals. With Ad-36, Dhurandhar and Atkinson began by squirting the virus up the nostrils of a series of lab animals — chickens, rats, marmosets — and in every species the infected animals got fat.
“The marmosets were most dramatic,” Atkinson recalled. By seven months after infection, he said, 100 percent of them became obese. Subsequently, Atkinson’s group and another in England conducted similar research using other strains of human adenovirus. The British group found that one strain, Ad-5, caused obesity in mice; the Wisconsin group found the same thing with Ad-37 and chickens. Two other strains, Ad-2 and Ad-31, failed to cause obesity.
In 2004, Atkinson and Dhurandhar were ready to move to humans. All of the 50 strains of human adenoviruses cause infections that are usually mild and transient, the kind that people pass off as a cold, a stomach bug or pink eye. The symptoms are so minor that people who have been infected often don’t remember ever having been sick. Even with such an innocuous virus, it would be unethical, of course, for a scientist to infect a human deliberately just to see if the person gets fat. Human studies are, therefore, always retrospective, a hunt for antibodies that would signal the presence of an infectious agent at some point in the past. To carry out this research, Atkinson developed — and patented — a screening test to look for the presence of Ad-36 antibodies in the blood.
The scientists found 502 volunteers from Wisconsin, Florida and New York willing to be screened for antibodies, 360 of them obese and 142 of them not obese. Of the leaner subjects, 11 percent had antibodies to Ad-36, indicating an infection at some point in the past. (Ad-36 was identified relatively recently, in 1978.) Among the obese subjects, 30 percent had antibodies — a difference large enough to suggest it was not just chance. In addition, subjects who were antibody-positive weighed significantly more than subjects who were uninfected. Those who were antibody-positive also had cholesterol and triglyceride readings that were significantly lower than people who were antibody-negative — just as in the infected chickens — a finding that held true whether or not they were obese.
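The phrase “not just chance” is doing statistical work here: with groups this large, an 11 percent versus 30 percent split is wildly unlikely if infection and obesity were unrelated. A quick check in Python, with counts back-calculated from the article’s rounded percentages, so this illustrates the reasoning rather than reanalyzing the study:

    # Chi-square test on a 2x2 table: antibody-positive vs. negative,
    # obese vs. lean. Counts are reconstructed from the rounded
    # percentages reported above (30% of 360, 11% of 142), so treat
    # the exact numbers as approximate.
    from scipy.stats import chi2_contingency

    obese_pos, obese_n = 108, 360   # ~30 percent of 360
    lean_pos, lean_n = 16, 142      # ~11 percent of 142

    table = [
        [obese_pos, obese_n - obese_pos],
        [lean_pos, lean_n - lean_pos],
    ]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, p = {p:.1g}")
    # p comes out far below 0.05 -- "not just chance"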
As for the other pathogens implicated in infectobesity — nine in all — certain viruses are known to impair the brain’s appetite-control mechanism in the hypothalamus, as happens in some cases of people becoming grossly obese after meningitis. Scientists also point to a commonality between fat cells and immune-system cells, although the exact significance of the connection is unclear. Immature fat cells, for instance, have been shown to behave like macrophages, the immune cells that engulf and destroy invading pathogens. Mature fat cells secrete hormones that stimulate the production of macrophages as well as another kind of immune-system cell, T-lymphocytes.
Another line of investigation in the field of infectobesity concerns inflammation, a corollary of infection. Obese people have higher levels of two proteins related to inflammation, C-reactive protein and interleukin-6. This may suggest that an infectious agent has set off some sort of derangement in the body’s system of fat regulation, making the infected person fat. A different interpretation is not about obesity causation but about its associated risks. Some scientists, including Jeffrey Gordon’s colleagues at Washington University, are trying to see whether the ailments of obesity (especially diabetes and high blood pressure) might be caused not by the added weight per se, but by the associated inflammation.
The thrifty-genotype hypothesis holds that there was, once upon a time, an adaptive advantage to being able to get fat. Our ancestors survived unpredictable cycles of food catastrophes by laying down fat stores when food was plentiful, and using up the stores slowly when food was scarce. The ones who did this best were the ones most likely to survive and to pass on the thrifty genotype to the next generation. But this mechanism evolved to get through a difficult winter — and we’re living now in an eternal spring. With food so readily available, thriftiness is a liability, and the ability to slow down metabolism during periods of reduced eating (a k a dieting) tends to create a fatter populace, albeit a more famine-proof one.
Obesity has turned out to be a daunting foe. Many of us are tethered to bodies that sabotage us in our struggle to keep from getting fat, or to slim down when we do. Microbes might be one explanation. There might be others, as outlined in June in a paper in The International Journal of Obesity listing 10 “putative contributors” to obesity, among them sleep deprivation, the increased use of psychoactive prescription drugs and the spread of air-conditioning.
But where does this leave us, exactly? Whatever the reason for any one individual’s tendency to gain weight, the only way to lose the weight is to eat less and exercise more. Behavioral interventions are all we’ve got right now. Even the supposedly biological approach to weight loss — that is, diet drugs — still works (or, more often, fails to work) by affecting eating behavior, through chemicals instead of through willpower. If it turns out that microbes are implicated in obesity, this biological approach will become more direct, in the form of an antiviral agent or a microbial supplement. But the truth is, this isn’t going to happen any time soon.
On an individual level and for the foreseeable future, if you want to lose weight, you still have to fiddle with the energy equation. Weight still boils down to the balance between how much a particular body needs to maintain a certain weight and how much it is fed. What complicates things is that in some people, for reasons still not fully understood, what their bodies need is set unfairly low. It could be genes; it could be microbes; it could be something else entirely.
According to Rudolph Leibel, an obesity researcher at Columbia University who was involved in the discovery of the first human gene implicated in obesity, if you take two nonobese people of the same weight, they will require different amounts of food depending on whether or not they were once obese. It goes in precisely the maddening direction you might expect: formerly fat people need to eat less than never-fat people to maintain exactly the same weight. In other words, a 150-pound woman who has always weighed 150 might be able to get away with eating, say, 2,500 calories a day, but a 150-pound woman who once weighed more — 20 pounds more, 200 pounds more, the exact amount doesn’t matter — would have to consume about 15 percent fewer calories to keep from regaining the weight. The change occurs as soon as the person starts reducing, Leibel said, and it “is not proportional to amount of weight lost, and persists over time.”
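The arithmetic behind Leibel’s point is worth making explicit. A minimal sketch, assuming the 2,500-calorie baseline and the 15 percent figure from the passage above; the function and its names are my own framing, not Leibel’s model:

    # Maintenance calories for two people at the same current weight.
    # The 15 percent reduction for the formerly obese is taken from
    # the passage above; everything else is illustrative scaffolding.

    def maintenance_kcal(baseline_kcal, formerly_obese, penalty=0.15):
        """Daily calories needed to hold the same weight."""
        return baseline_kcal * (1 - penalty) if formerly_obese else baseline_kcal

    always_150 = maintenance_kcal(2500, formerly_obese=False)   # 2500.0
    once_heavier = maintenance_kcal(2500, formerly_obese=True)  # 2125.0
    print(always_150 - once_heavier)  # a ~375-calorie daily gap, indefinitely

That standing 375-calorie handicap, appearing as soon as the weight comes off and persisting over time, is why maintaining a loss is so much harder than the energy equation alone would suggest.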
By ROBIN MARANTZ HENIG
In the 30-plus years that Richard Atkinson has been studying obesity, he has always maintained that overeating doesn’t really explain it all. His epiphany came early in his career, when he was a medical fellow at U.C.L.A. engaged in a study of people who weighed more than 300 pounds and had come in for obesity surgery. “The general thought at the time was that fat people ate too much,” Atkinson, now at Virginia Commonwealth University, told me recently. “And we documented that fat people do eat too much — our subjects ate an average of 6,700 calories a day. But what was so impressive to me was the fact that not all fat people eat too much.”
One of Atkinson’s most memorable patients was Janet S., a bright, funny 25-year-old who weighed 348 pounds when she finally made her way to U.C.L.A. in 1975. In exchange for agreeing to be hospitalized for three months so scientists could study them, Janet and the other obese research subjects (30 in all) each received a free intestinal bypass. During the three months of presurgical study, the dietitian on the research team calculated how many calories it should take for a 5-foot-6-inch woman like Janet to maintain a weight of 348. They fed her exactly that many calories — no more, no less. She dutifully ate what she was told, and she gained 12 pounds in two weeks — almost a pound a day.
“I don’t think I’d ever gained that much weight that quickly,” recalled Janet, who asked me not to use her full name because she didn’t want people to know how fat she had once been. The doctors accused her of sneaking snacks into the hospital. “But I told them, ‘I’m gaining weight because you’re feeding me a tremendous amount of food!’ ”
The experience with Janet was an early inkling that traditional ideas about obesity were incomplete. Researchers and public-health officials have long understood that to maintain a given weight, energy in (calories consumed) must equal energy out (calories expended). But then they learned that genes were important, too, and that for some people, like Janet, this formula was tilted in a direction that led to weight gain. Since the discovery of the first obesity gene in 1994, scientists have found about 50 genes involved in obesity. Some of them determine how individuals lay down fat and metabolize energy stores. Others regulate how much people want to eat in the first place, how they know when they’ve had enough and how likely they are to use up calories through activities ranging from fidgeting to running marathons. People like Janet, who can get fat on very little fuel, may be genetically programmed to survive in harsher environments. When the human species got its start, it was an advantage to be efficient. Today, when food is plentiful, it is a hazard.
But even as our understanding of genes and behavior has become more refined, some cases still boggle the mind, like identical twins who eat roughly the same and yet have vastly different weights. Now a third wave of obesity researchers are looking for explanations that don’t fall into the relatively easy ones of genetics, overeating or lack of exercise. They are investigating what might seem to be the unlikeliest of culprits: the microorganisms we encounter every day.
Jeffrey Gordon, whose theory is that obesity is related to intestinal microorganisms, has never had a weight problem. I went to meet him and his colleagues at the Center for Genome Sciences at Washington University, which he directs. I wanted to find out everything Gordon knows about the bugs in our guts, and how those bugs might contribute to human physiology — in particular, how they might make some people fat.
Of the trillions and trillions of cells in a typical human body — at least 10 times as many cells in a single individual as there are stars in the Milky Way — only about 1 in 10 is human. The other 90 percent are microbial. These microbes — a term that encompasses all forms of microscopic organisms, including bacteria, fungi, protozoa and a form of life called archaea — exist everywhere. They are found in the ears, nose, mouth, vagina, anus, as well as every inch of skin, especially the armpits, the groin and between the toes. The vast majority are in the gut, which harbors 10 trillion to 100 trillion of them. “Microbes colonize our body surfaces from the moment of our birth,” Gordon said. “They are with us throughout our lives, and at the moment of our death they consume us.”
Known collectively as the gut microflora (or microbiota, a term Gordon prefers because it derives from the Greek word bios, for “life”), these microbes have a Star Trek analogue, he says: the Borg Collective, a community of cybernetically enhanced humanoids with functions so intertwined that they operate as a single intelligence, sort of like an ant colony. In its Borglike way, the microflora assumes an extraordinary array of functions on our behalf — functions that we couldn’t manage on our own. It helps create the capillaries that line and nourish the intestines. It produces vitamins, in particular thiamine, pyroxidine and vitamin K. It provides the enzymes necessary to metabolize cholesterol and bile acid. It digests complex plant polysaccharides, the fiber found in grains, fruits and vegetables that would otherwise be indigestible.
And it helps extract calories from the food we eat and helps store those calories in fat cells for later use — which gives them, in effect, a role in determining whether our diets will make us fat or thin.
In the womb, humans are free of microbes. Colonization begins during the journey down the birth canal, which is riddled with bacteria, some of which make their way onto the newborn’s skin. From that moment on, every mother’s kiss, every swaddling blanket, carries on it more microbes, which are introduced into the baby’s system.
By about the age of 2, most of a person’s microbial community is established, and it looks much like any other person’s microbial community. But in the same way that it takes only a small percentage of our genome to make each of us unique, modest differences in our microflora may make a big difference from one person to another. It’s not clear what accounts for individual variations. Some guts may be innately more hospitable to certain microbes, either because of genetics or because of the mix of microbes already there. Most of the colonization probably happens in the first few years, which explains why the microflora fingerprints of adult twins, who shared an intimate environment (and a mother) in childhood, more closely resemble each other than they do those of their spouses, with whom they became intimate later in life.
No one yet knows whether an individual’s microflora community tends to remain stable for a lifetime, but it is known that certain environmental changes, like taking antibiotics, can alter it at least temporarily. Stop the antibiotics, and the microflora seems to bounce back — but it might not bounce back to exactly what it was before the antibiotics.
In 2004, a group of microbiologists at Stanford University led by David Relman conducted the first census of the gut microflora. It took them a year to do an analysis of just three healthy subjects, by which time they had counted 395 species of bacteria. They stopped counting before the census was complete; Relman has said the real count might be anywhere from 500 species to a few thousand.
About a year ago, Relman joined with other scientists, including Jeffrey Gordon, to begin to sequence all the genes of the human gut microflora. In early June, they published their results in Science: some 78 million base pairs in all. But even this huge number barely scratches the surface; the total number of base pairs in the gut microflora might be 100 times that. Because there are so many trillions of microbes in the gut, the vast majority of the genes that a person carries around are more microbial than human. “Humans are superorganisms,” the scientists wrote, “whose metabolism represents an amalgamation of microbial and human attributes.” They call this amalgamation — human genes plus microbial genes — the metagenome.
Gordon first began studying the connection between the microflora and obesity when he saw what happened to mice without any microbes at all. These germ-free mice, reared in sterile isolators in Gordon’s lab, had 60 percent less fat than ordinary mice. Although they ate voraciously, usually about 30 percent more food than the others, they stayed lean. Without gut microbes, they were unable to extract calories from some of the types of food they ate, which passed through their bodies without being either used or converted to fat.
When Gordon’s postdoctoral researcher Fredrik Bäckhed transplanted gut microbes from normal mice into the germ-free mice, the germ-free mice started metabolizing their food better, extracting calories efficiently and laying down fat to store for later use. Within two weeks, they were just as fat as ordinary mice. Bäckhed and Gordon found at least one mechanism that helps explain this observation. As they reported in the Proceedings of the National Academy of Sciences in 2004, some common gut bacteria, including B. theta, suppress the protein FIAF, which ordinarily prevents the body from storing fat. By suppressing FIAF, B. theta allows fat deposition to increase. A different gut microbe, M. smithii, was later found to interact with B. theta in a way that extracts additional calories from polysaccharides in the diet, further increasing the amount of fat available to be deposited after the mouse eats a meal. Mice whose guts were colonized with both B. theta and M. smithii — as usually happens in humans in the real world — were found to have about 13 percent more body fat than mice colonized by just one or the other.
Gordon likes to explain his hypothesis of what gut microbes do by talking about Cheerios. The cereal box says that a one-cup serving contains 110 calories. But it may be that not everyone will extract 110 calories from a cup of Cheerios. Some may extract more, some less, depending on the particular combination of microbes in their guts. “A diet has a certain amount of absolute energy,” he said. “But the amount that can be extracted from that diet may vary between individuals — not in a huge way, but if the energy balance is affected by just a few calories a day, over time that can make a big difference in body weight.”
Gordon says he is still far from understanding the relationship between gut microflora and weight gain. “I wish you were writing this article a year from now, even two years from now,” he told me. “We’re just beginning to explore this wilderness, finding out who’s there, how does that population change, which are the key players.” He says it will be a while before anyone figures out what the gut microbes do, how they interact with one another and how, or even whether, they play a role in obesity. And it will be even longer before anyone learns how to change the microflora in a deliberate way.
There’s another way that biological middlemen might be involved in obesity — in this case, not the gut microbes (mostly bacteria) with which we co-exist but the viruses and other pathogens that occasionally infect us and make us ill. This is the subspecialty that is being called infectobesity.
The idea of infectobesity dates to 1988, when Nikhil Dhurandhar was a young physician studying for his doctorate in biochemistry at the University of Bombay. He was having tea with his father, also a physician and the head of an obesity clinic, and an old family friend, S. M. Ajinkya, a pathologist at Bombay Veterinary College. Ajinkya was describing a plague that was killing thousands of chickens throughout India, caused by a new poultry virus that he had discovered and named with his own and a colleague’s initials, SMAM-1. On autopsy, the vet said, chickens infected with SMAM-1 revealed pale and enlarged livers and kidneys, an atrophied thymus and excess fat in the abdomen.
The finding of abdominal fat intrigued Dhurandhar. “If a chicken died of infection, having wasted away, it should be less fat, not more,” he remembered thinking at the time. He asked permission to conduct a small experiment at the vet school.
Working with about 20 chickens, Dhurandhar, then 28, infected half of them with SMAM-1. He fed them all the same amount of food, but only the infected chickens became obese. Strangely, despite their excess fat, the infected obese chickens had low levels of cholesterol and triglycerides in their blood — just the opposite of what was thought to happen in humans, whose cholesterol and triglyceride levels generally increase as their weight increases. After his pilot study in 1988, Dhurandhar conducted a larger one with 100 chickens. It confirmed his finding that SMAM-1 caused obesity in chickens.
But what about humans? With a built-in patient population from his clinic, Dhurandhar collected blood samples from 52 overweight patients. Ten of them, nearly 20 percent, showed antibody evidence of prior exposure to the SMAM-1 virus, which was a chicken virus not previously thought to have infected humans. Moreover, the once-infected patients weighed an average of 33 pounds more than those who were never infected and, most surprisingly, had lower cholesterol and triglyceride levels — the same paradoxical finding as in the chickens.
The findings violated three pieces of conventional wisdom, Dhurandhar said recently: “The first is that viruses don’t cause obesity. The second is that obesity leads to high cholesterol and triglycerides. The third is that avian viruses don’t infect humans.”
Dhurandhar, now 46, is a thoughtful man with a head of still-dark hair. Like Gordon, he has never been fat. But even though he is so firmly in the biological camp of obesity researchers, he ascribes his own weight control to behavior, not microbes; he says he is slim because he walks five miles a day, lifts weights and is careful about what he eats. Being overweight runs in his family; Dhurandhar’s father, who still practices medicine in India, began treating obese patients because of his own struggle to keep his weight down, from a onetime high of 220.
Slim as he is, Dhurandhar nonetheless is sensitive to the pain of being fat and the maddening frustration of trying to do anything about it. He takes to heart the anguished letters and e-mail he receives each time his research is publicized. Once, he said, he heard from a woman whose 10-year-old grandson weighed 184 pounds. The boy rode his bicycle until his feet bled, hoping to lose weight; he was so embarrassed by his body that he kept his T-shirt on when he went swimming. The grandmother told Dhurandhar that the virus research sounded like the answer to her prayers. But the scientist knew that even if a virus was to blame for this boy’s obesity, he was a long way from offering any real help.
In 1992, Dhurandhar moved his wife and 7-year-old son to the United States in search of a lab where he could continue his research. At first, because infectobesity was so far out of the mainstream, all he could find was unrelated work at North Dakota State University. “My wife and I gave ourselves two years,” he recalled. “If I didn’t find work in the field of viruses and obesity in two years, we would go back to Bombay.”
One month before his self-imposed deadline in 1994, Dhurandhar received a job offer from Richard Atkinson, who was then at the University of Wisconsin, Madison. Atkinson, always on the lookout for new biological explanations of obesity, wanted to collaborate with Dhurandhar on SMAM-1. But the virus existed only in India, and the U.S. government would not allow it to be imported. So the scientists decided to work with a closely related virus, a human adenovirus. They opened the catalogue of a laboratory-supply company to see which one of the 50 human adenoviruses they should order.
“I’d like to say we chose the virus out of some wisdom, out of some belief that it was similar in important ways to SMAM-1,” Dhurandhar said. But really, he admitted, it was dumb luck that the adenovirus they started with, Ad-36, turned out to be so fattening.
By this time, several pathogens had already been shown to cause obesity in laboratory animals. With Ad-36, Dhurandhar and Atkinson began by squirting the virus up the nostrils of a series of lab animals — chickens, rats, marmosets — and in every species the infected animals got fat.
“The marmosets were most dramatic,” Atkinson recalled. By seven months after infection, he said, 100 percent of them became obese. Subsequently, Atkinson’s group and another in England conducted similar research using other strains of human adenovirus. The British group found that one strain, Ad-5, caused obesity in mice; the Wisconsin group found the same thing with Ad-37 and chickens. Two other strains, Ad-2 and Ad-31, failed to cause obesity.
In 2004, Atkinson and Dhurandhar were ready to move to humans. All of the 50 strains of human adenoviruses cause infections that are usually mild and transient, the kind that people pass off as a cold, a stomach bug or pink eye. The symptoms are so minor that people who have been infected often don’t remember ever having been sick. Even with such an innocuous virus, it would be unethical, of course, for a scientist to infect a human deliberately just to see if the person gets fat. Human studies are, therefore, always retrospective, a hunt for antibodies that would signal the presence of an infectious agent at some point in the past. To carry out this research, Atkinson developed — and patented — a screening test to look for the presence of Ad-36 antibodies in the blood.
The scientists found 502 volunteers from Wisconsin, Florida and New York willing to be screened for antibodies, 360 of them obese and 142 of them not obese. Of the leaner subjects, 11 percent had antibodies to Ad-36, indicating an infection at some point in the past. (Ad-36 was identified relatively recently, in 1978.) Among the obese subjects, 30 percent had antibodies — a difference large enough to suggest it was not just chance. In addition, subjects who were antibody-positive weighed significantly more than subjects who were uninfected. Those who were antibody-positive also had cholesterol and triglyceride readings that were significantly lower than those of people who were antibody-negative — just as in the infected chickens — a finding that held true whether or not they were obese.
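For readers who want to see why that 30-percent-versus-11-percent gap is unlikely to be chance, a standard chi-square test on the two-by-two table of antibody status versus obesity makes the point. The sketch below, in Python, reconstructs approximate counts from the percentages reported above; the published paper's exact figures may differ slightly, so treat this as back-of-the-envelope arithmetic, not the study's actual analysis.

```python
# A minimal sketch of the "not just chance" check, assuming counts
# reconstructed from the reported percentages (so approximate):
# 30% of 360 obese subjects and 11% of 142 lean subjects were
# antibody-positive for Ad-36.
from scipy.stats import chi2_contingency

obese_pos = round(0.30 * 360)   # ~108 antibody-positive, obese
obese_neg = 360 - obese_pos     # ~252 antibody-negative, obese
lean_pos = round(0.11 * 142)    # ~16 antibody-positive, lean
lean_neg = 142 - lean_pos       # ~126 antibody-negative, lean

table = [[obese_pos, obese_neg],
         [lean_pos, lean_neg]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p:.1e}")
# With these reconstructed counts, p lands on the order of 1e-5:
# far too small to attribute the gap to sampling luck alone.
```

None of this says the virus caused the extra weight, of course; an association this strong only justifies taking the hypothesis seriously.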
As for the other pathogens implicated in infectobesity — nine in all — certain viruses are known to impair the brain’s appetite-control mechanism in the hypothalamus, as happens in some cases in which people become grossly obese after meningitis. Scientists also point to a commonality between fat cells and immune-system cells, although the exact significance of the connection is unclear. Immature fat cells, for instance, have been shown to behave like macrophages, the immune cells that engulf and destroy invading pathogens. Mature fat cells secrete hormones that stimulate the production of macrophages as well as another kind of immune-system cell, T-lymphocytes.
Another line of investigation in the field of infectobesity concerns inflammation, a corollary of infection. Obese people have higher levels of two proteins related to inflammation, C-reactive protein and interleukin-6. This may suggest that an infectious agent has set off some sort of derangement in the body’s system of fat regulation, making the infected person fat. A different interpretation is not about obesity causation but about its associated risks. Some scientists, including Jeffrey Gordon’s colleagues at Washington University, are trying to see whether the ailments of obesity (especially diabetes and high blood pressure) might be caused not by the added weight per se, but by the associated inflammation.
The thrifty-genotype hypothesis holds that there was, once upon a time, an adaptive advantage to being able to get fat. Our ancestors survived unpredictable cycles of food catastrophes by laying down fat stores when food was plentiful, and using up the stores slowly when food was scarce. The ones who did this best were the ones most likely to survive and to pass on the thrifty genotype to the next generation. But this mechanism evolved to get through a difficult winter — and we’re living now in an eternal spring. With food so readily available, thriftiness is a liability, and the ability to slow down metabolism during periods of reduced eating (a k a dieting) tends to create a fatter populace, albeit a more famine-proof one.
Obesity has turned out to be a daunting foe. Many of us are tethered to bodies that sabotage us in our struggle to keep from getting fat, or to slim down when we do. Microbes might be one explanation. There might be others, as outlined in June in a paper in The International Journal of Obesity listing 10 “putative contributors” to obesity, among them sleep deprivation, the increased use of psychoactive prescription drugs and the spread of air-conditioning.
But where does this leave us, exactly? Whatever the reason for any one individual’s tendency to gain weight, the only way to lose the weight is to eat less and exercise more. Behavioral interventions are all we’ve got right now. Even the supposedly biological approach to weight loss — that is, diet drugs — still works (or, more often, fails to work) by affecting eating behavior, through chemicals instead of through willpower. If it turns out that microbes are implicated in obesity, this biological approach will become more direct, in the form of an antiviral agent or a microbial supplement. But the truth is, this isn’t going to happen any time soon.
On an individual level and for the foreseeable future, if you want to lose weight, you still have to fiddle with the energy equation. Weight still boils down to the balance between how much a particular body needs to maintain a certain weight and how much it is fed. What complicates things is that in some people, for reasons still not fully understood, what their bodies need is set unfairly low. It could be genes; it could be microbes; it could be something else entirely.
According to Rudolph Leibel, an obesity researcher at Columbia University who was involved in the discovery of the first human gene implicated in obesity, if you take two nonobese people of the same weight, they will require different amounts of food depending on whether or not they were once obese. It goes in precisely the maddening direction you might expect: formerly fat people need to eat less than never-fat people to maintain exactly the same weight. In other words, a 150-pound woman who has always weighed 150 might be able to get away with eating, say, 2,500 calories a day, but a 150-pound woman who once weighed more — 20 pounds more, 200 pounds more, the exact amount doesn’t matter — would have to consume about 15 percent fewer calories to keep from regaining the weight. The change occurs as soon as the person starts reducing, Leibel said, and it “is not proportional to amount of weight lost, and persists over time.”
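Leibel’s finding lends itself to simple arithmetic. Here is a minimal sketch in Python using the illustrative numbers above (a 2,500-calorie baseline and a roughly 15 percent penalty for having once been heavier); the exact percentage varies from person to person, so the function is hypothetical, not a clinical formula.

```python
# Back-of-the-envelope arithmetic for Leibel's observation, using
# the article's illustrative numbers: 2,500 kcal/day maintenance
# and a ~15% penalty for the formerly obese. Real values vary.
def maintenance_calories(baseline: float, formerly_obese: bool) -> float:
    """Daily calories needed to hold a given weight steady."""
    penalty = 0.15 if formerly_obese else 0.0
    return baseline * (1.0 - penalty)

always_150 = maintenance_calories(2500, formerly_obese=False)   # 2500.0
once_heavier = maintenance_calories(2500, formerly_obese=True)  # 2125.0
print(f"Never-obese woman: {always_150:.0f} kcal/day")
print(f"Formerly obese, same weight: {once_heavier:.0f} kcal/day")
# The ~375-calorie gap appears as soon as reducing begins and, per
# Leibel, does not shrink in proportion to how much weight was lost.
```

In other words, two women standing on the same scale at the same 150 pounds face permanently different daily budgets, which is exactly the unfairness the energy-equation framing tends to hide.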