By MICHAEL POLLAN
Eat food. Not too much. Mostly plants.
That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy. I hate to give away the game right here at the beginning of a long essay, and I confess that I’m tempted to complicate matters in the interest of keeping things going for a few thousand more words. I’ll try to resist but will go ahead and add a couple more details to flesh out the advice. Like: A little meat won’t kill you, though it’s better approached as a side dish than as a main. And you’re much better off eating whole fresh foods than processed food products. That’s what I mean by the recommendation to eat “food.” Once, food was all you could eat, but today there are lots of other edible foodlike substances in the supermarket. These novel products of food science often come in packages festooned with health claims, which brings me to a related rule of thumb: if you’re concerned about your health, you should probably avoid food products that make health claims. Why? Because a health claim on a food product is a good indication that it’s not really food, and food is what you want to eat.
Uh-oh. Things are suddenly sounding a little more complicated, aren’t they? Sorry. But that’s how it goes as soon as you try to get to the bottom of the whole vexing question of food and health. Before long, a dense cloud bank of confusion moves in. Sooner or later, everything solid you thought you knew about the links between diet and health gets blown away in the gust of the latest study.
Last winter came the news that a low-fat diet, long believed to protect against breast cancer, may do no such thing — this from the monumental, federally financed Women’s Health Initiative, which has also found no link between a low-fat diet and rates of coronary disease. The year before we learned that dietary fiber might not, as we had been confidently told, help prevent colon cancer. Just last fall two prestigious studies on omega-3 fats published at the same time presented us with strikingly different conclusions. While the Institute of Medicine stated that “it is uncertain how much these omega-3s contribute to improving health” (and they might do the opposite if you get them from mercury-contaminated fish), a Harvard study declared that simply by eating a couple of servings of fish each week (or by downing enough fish oil), you could cut your risk of dying from a heart attack by more than a third — a stunningly hopeful piece of news. It’s no wonder that omega-3 fatty acids are poised to become the oat bran of 2007, as food scientists micro-encapsulate fish oil and algae oil and blast them into such formerly all-terrestrial foods as bread and tortillas, milk and yogurt and cheese, all of which will soon, you can be sure, sprout fishy new health claims. (Remember the rule?)
By now you’re probably registering the cognitive dissonance of the supermarket shopper or science-section reader, as well as some nostalgia for the simplicity and solidity of the first few sentences of this essay. Which I’m still prepared to defend against the shifting winds of nutritional science and food-industry marketing. But before I do that, it might be useful to figure out how we arrived at our present state of nutritional confusion and anxiety.
The story of how the most basic questions about what to eat ever got so complicated reveals a great deal about the institutional imperatives of the food industry, nutritional science and — ahem — journalism, three parties that stand to gain much from widespread confusion surrounding what is, after all, the most elemental question an omnivore confronts. Humans deciding what to eat without expert help — something they have been doing with notable success since coming down out of the trees — is seriously unprofitable if you’re a food company, distinctly risky if you’re a nutritionist and just plain boring if you’re a newspaper editor or journalist. (Or, for that matter, an eater. Who wants to hear, yet again, “Eat more fruits and vegetables”?) And so, like a large gray fog, a great Conspiracy of Confusion has gathered around the simplest questions of nutrition — much to the advantage of everybody involved. Except perhaps the ostensible beneficiary of all this nutritional expertise and advice: us, and our health and happiness as eaters.
FROM FOODS TO NUTRIENTS
It was in the 1980s that food began disappearing from the American supermarket, gradually to be replaced by “nutrients,” which are not the same thing. Where once the familiar names of recognizable comestibles — things like eggs or breakfast cereal or cookies — claimed pride of place on the brightly colored packages crowding the aisles, now new terms like “fiber” and “cholesterol” and “saturated fat” rose to large-type prominence. More important than mere foods, the presence or absence of these invisible substances was now generally believed to confer health benefits on their eaters. Foods by comparison were coarse, old-fashioned and decidedly unscientific things — who could say what was in them, really? But nutrients — those chemical compounds and minerals in foods that nutritionists have deemed important to health — gleamed with the promise of scientific certainty; eat more of the right ones, fewer of the wrong, and you would live longer and avoid chronic diseases.
Nutrients themselves had been around, as a concept, since the early 19th century, when the English doctor and chemist William Prout identified what came to be called the “macronutrients”: protein, fat and carbohydrates. It was thought that that was pretty much all there was going on in food, until doctors noticed that an adequate supply of the big three did not necessarily keep people nourished. At the end of the 19th century, British doctors were puzzled by the fact that Chinese laborers in the Malay states were dying of a disease called beriberi, which didn’t seem to afflict Tamils or native Malays. The mystery was solved when someone pointed out that the Chinese ate “polished,” or white, rice, while the others ate rice that hadn’t been mechanically milled. A few years later, Casimir Funk, a Polish chemist, discovered the “essential nutrient” in rice husks that protected against beriberi and called it a “vitamine,” the first micronutrient. Vitamins brought a kind of glamour to the science of nutrition, and though certain sectors of the population began to eat by its expert lights, it really wasn’t until late in the 20th century that nutrients managed to push food aside in the popular imagination of what it means to eat.
No single event marked the shift from eating food to eating nutrients, though in retrospect a little-noticed political dust-up in Washington in 1977 seems to have helped propel American food culture down this dimly lighted path. Responding to an alarming increase in chronic diseases linked to diet — including heart disease, cancer and diabetes — a Senate Select Committee on Nutrition, headed by George McGovern, held hearings on the problem and prepared what by all rights should have been an uncontroversial document called “Dietary Goals for the United States.” The committee learned that while rates of coronary heart disease had soared in America since World War II, other cultures that consumed traditional diets based largely on plants had strikingly low rates of chronic disease. Epidemiologists also had observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease temporarily plummeted.
Naïvely putting two and two together, the committee drafted a straightforward set of dietary guidelines calling on Americans to cut down on red meat and dairy products. Within weeks a firestorm, emanating from the red-meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about food — the committee had advised Americans to actually “reduce consumption of meat” — was replaced by artful compromise: “Choose meats, poultry and fish that will reduce saturated-fat intake.”
A subtle change in emphasis, you might say, but a world of difference just the same. First, the stark message to “eat less” of a particular food has been deep-sixed; don’t look for it ever again in any official U.S. dietary pronouncement. Second, notice how distinctions between entities as different as fish and beef and chicken have collapsed; those three venerable foods, each representing an entirely different taxonomic class, are now lumped together as delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves; now the culprit is an obscure, invisible, tasteless — and politically unconnected — substance that may or may not lurk in them called “saturated fat.”
The linguistic capitulation did nothing to rescue McGovern from his blunder; the very next election, in 1980, the beef lobby helped rusticate the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein sitting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, and would instead arrive clothed in scientific euphemism and speaking of nutrients, entities that few Americans really understood but that lack powerful lobbies in Washington. This was precisely the tack taken by the National Academy of Sciences when it issued its landmark report on diet and cancer in 1982. Organized nutrient by nutrient in a way guaranteed to offend no food group, it codified the official new dietary language. Industry and media followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids and carotenes soon colonized much of the cultural space previously occupied by the tangible substance formerly known as food. The Age of Nutritionism had arrived.
THE RISE OF NUTRITIONISM
The first thing to understand about nutritionism — I first encountered the term in the work of an Australian sociologist of science named Gyorgy Scrinis — is that it is not quite the same as nutrition. As the “ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s exerting its hold on your culture. A reigning ideology is a little like the weather, all pervasive and virtually inescapable. Still, we can try.
In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. From this basic premise flow several others. Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists speak) to explain the hidden reality of foods to us. To enter a world in which you dine on unseen nutrients, you need lots of expert help.
But expert help to do what, exactly? This brings us to another unexamined assumption: that the whole point of eating is to maintain and promote bodily health. Hippocrates’s famous injunction to “let food be thy medicine” is ritually invoked to support this notion. I’ll leave the premise alone for now, except to point out that it is not shared by all cultures and that the experience of these other cultures suggests that, paradoxically, viewing food as being about things other than bodily health — like pleasure, say, or socializing — makes people no less healthy; indeed, there’s some reason to believe that it may make them more healthy. This is what we usually have in mind when we speak of the “French paradox” — the fact that a population that eats all sorts of unhealthful nutrients is in many ways healthier than we Americans are. So there is at least a question as to whether nutritionism is actually any good for you.
Another potentially serious weakness of nutritionist ideology is that it has trouble discerning qualitative distinctions between foods. So fish, beef and chicken through the nutritionists’ lens become mere delivery systems for varying quantities of fats and proteins and whatever other nutrients are on their scope. Similarly, any qualitative distinctions between processed foods and whole foods disappear when your focus is on quantifying the nutrients they contain (or, more precisely, the known nutrients).
This is a great boon for manufacturers of processed food, and it helps explain why they have been so happy to get with the nutritionism program. In the years following McGovern’s capitulation and the 1982 National Academy report, the food industry set about re-engineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and less of the bad, and by the late ’80s a golden era of food science was upon us. The Year of Eating Oat Bran — also known as 1988 — served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern had been established, and every few years since then a new oat bran has taken its turn under the marketing lights. (Here comes omega-3!)
By comparison, the typical real food has more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t easily change its nutritional stripes (though rest assured the genetic engineers are hard at work on the problem). So far, at least, you can’t put oat bran in a banana. So depending on the reigning nutritional orthodoxy, the avocado might be either a high-fat food to be avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate of each whole food rises and falls with every change in the nutritional weather, while the processed foods are simply reformulated. That’s why when the Atkins mania hit the food industry, bread and pasta were given a quick redesign (dialing back the carbs; boosting the protein), while the poor unreconstructed potatoes and carrots were left out in the cold.
Of course it’s also a lot easier to slap a health claim on a box of sugary cereal than on a potato or carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over, the Cocoa Puffs and Lucky Charms are screaming about their newfound whole-grain goodness.
EAT RIGHT, GET FATTER
So nutritionism is good for business. But is it good for us? You might think that a national fixation on nutrients would lead to measurable improvements in the public health. But for that to happen, the underlying nutritional science, as well as the policy recommendations (and the journalism) based on that science, would have to be sound. This has seldom been the case.
Consider what happened immediately after the 1977 “Dietary Goals” — McGovern’s masterpiece of politico-nutritionist compromise. In the wake of the panel’s recommendation that we cut down on saturated fat, a recommendation seconded by the 1982 National Academy report on cancer, Americans did indeed change their diets, endeavoring for a quarter-century to do what they had been told. Well, kind of. The industrial food supply was promptly reformulated to reflect the official advice, giving us low-fat pork, low-fat Snackwell’s and all the low-fat pasta and high-fructose (yet low-fat!) corn syrup we could consume. Which turned out to be quite a lot. Oddly, America got really fat on its new low-fat diet — indeed, many date the current obesity and diabetes epidemic to the late 1970s, when Americans began binging on carbohydrates, ostensibly as a way to avoid the evils of fat.
This story has been told before, notably in these pages (“What if It’s All Been a Big Fat Lie?” by Gary Taubes, July 7, 2002), but it’s a little more complicated than the official version suggests. In that version, which inspired the most recent Atkins craze, we were told that America got fat when, responding to bad scientific advice, it shifted its diet from fats to carbs, suggesting that a re-evaluation of the two nutrients is in order: fat doesn’t make you fat; carbs do. (Why this should have come as news is a mystery: as long as people have been raising animals for food, they have fattened them on carbs.)
But there are a couple of problems with this revisionist picture. First, while it is true that Americans post-1977 did begin binging on carbs, and that fat as a percentage of total calories in the American diet declined, we never did in fact cut down on our consumption of fat. Meat consumption actually climbed. We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.
How did that happen? I would submit that the ideology of nutritionism deserves as much of the blame as the carbohydrates themselves do — that and human nature. By framing dietary advice in terms of good and bad nutrients, and by burying the recommendation that we should eat less of any particular food, it was easy for the take-home message of the 1977 and 1982 dietary guidelines to be simplified as follows: Eat more low-fat foods. And that is what we did. We’re always happy to receive a dispensation to eat more of something (with the possible exception of oat bran), and one of the things nutritionism reliably gives us is some such dispensation: low-fat cookies then, low-carb beer now. It’s hard to imagine the low-fat craze taking off as it did if McGovern’s original food-based recommendations had stood: eat less meat and fewer dairy products. For how do you get from that stark counsel to the idea that another case of Snackwell’s is just what the doctor ordered?
BAD SCIENCE
But if nutritionism leads to a kind of false consciousness in the mind of the eater, the ideology can just as easily mislead the scientist. Most nutritional science involves studying one nutrient at a time, an approach that even nutritionists who do it will tell you is deeply flawed. “The problem with nutrient-by-nutrient nutrition science,” points out Marion Nestle, the New York University nutritionist, “is that it takes the nutrient out of the context of food, the food out of the context of diet and the diet out of the context of lifestyle.”
If nutritional scientists know this, why do they do it anyway? Because a nutrient bias is built into the way science is done: scientists need individual variables they can isolate. Yet even the simplest food is a hopelessly complex thing to study, a virtual wilderness of chemical compounds, many of which exist in complex and dynamic relation to one another, and all of which together are in the process of changing from one state to another. So if you’re a nutritional scientist, you do the only thing you can do, given the tools at your disposal: break the thing down into its component parts and study those one by one, even if that means ignoring complex interactions and contexts, as well as the fact that the whole may be more than, or just different from, the sum of its parts. This is what we mean by reductionist science.
Scientific reductionism is an undeniably powerful tool, but it can mislead us too, especially when applied to something as complex as, on the one side, a food, and on the other, a human eater. It encourages us to take a mechanistic view of that transaction: put in this nutrient; get out that physiological result. Yet people differ in important ways. Some populations can metabolize sugars better than others; depending on your evolutionary heritage, you may or may not be able to digest the lactose in milk. The specific ecology of your intestines helps determine how efficiently you digest what you eat, so that the same input of 100 calories may yield more or less energy depending on the proportion of Firmicutes and Bacteroidetes living in your gut. There is nothing very machinelike about the human eater, and so to think of food as simply fuel is wrong.
Also, people don’t eat nutrients, they eat foods, and foods can behave very differently than the nutrients they contain. Researchers have long believed, based on epidemiological comparisons of different populations, that a diet high in fruits and vegetables confers some protection against cancer. So naturally they ask, What nutrients in those plant foods are responsible for that effect? One hypothesis is that the antioxidants in fresh produce — compounds like beta carotene, lycopene, vitamin E, etc. — are the X factor. It makes good sense: these molecules (which plants produce to protect themselves from the highly reactive oxygen atoms produced in photosynthesis) vanquish the free radicals in our bodies, which can damage DNA and initiate cancers. At least that’s how it seems to work in the test tube. Yet as soon as you remove these useful molecules from the context of the whole foods they’re found in, as we’ve done in creating antioxidant supplements, they don’t work at all. Indeed, in the case of beta carotene ingested as a supplement, scientists have discovered that it actually increases the risk of certain cancers. Big oops.
What’s going on here? We don’t know. It could be the vagaries of human digestion. Maybe the fiber (or some other component) in a carrot protects the antioxidant molecules from destruction by stomach acids early in the digestive process. Or it could be that we isolated the wrong antioxidant. Beta is just one of a whole slew of carotenes found in common vegetables; maybe we focused on the wrong one. Or maybe beta carotene works as an antioxidant only in concert with some other plant chemical or process; under other circumstances, it may behave as a pro-oxidant.
Indeed, to look at the chemical composition of any common food plant is to realize just how much complexity lurks within it. Here’s a list of just the antioxidants that have been identified in garden-variety thyme:
4-Terpineol, alanine, anethole, apigenin, ascorbic acid, beta carotene, caffeic acid, camphene, carvacrol, chlorogenic acid, chrysoeriol, eriodictyol, eugenol, ferulic acid, gallic acid, gamma-terpinene, isochlorogenic acid, isoeugenol, isothymonin, kaempferol, labiatic acid, lauric acid, linalyl acetate, luteolin, methionine, myrcene, myristic acid, naringenin, oleanolic acid, p-coumaric acid, p-hydroxy-benzoic acid, palmitic acid, rosmarinic acid, selenium, tannin, thymol, tryptophan, ursolic acid, vanillic acid.
This is what you’re ingesting when you eat food flavored with thyme. Some of these chemicals are broken down by your digestion, but others are going on to do undetermined things to your body: turning some gene’s expression on or off, perhaps, or heading off a free radical before it disturbs a strand of DNA deep in some cell. It would be great to know how this all works, but in the meantime we can enjoy thyme in the knowledge that it probably doesn’t do any harm (since people have been eating it forever) and that it may actually do some good (since people have been eating it forever) and that even if it does nothing, we like the way it tastes.
It’s also important to remind ourselves that what reductive science can manage to perceive well enough to isolate and study is subject to change, and that we have a tendency to assume that what we can see is all there is to see. When William Prout isolated the big three macronutrients, scientists figured they now understood food and what the body needs from it; when the vitamins were isolated a few decades later, scientists thought, O.K., now we really understand food and what the body needs to be healthy; today it’s the polyphenols and carotenoids that seem all-important. But who knows what the hell else is going on deep in the soul of a carrot?
The good news is that, to the carrot eater, it doesn’t matter. That’s the great thing about eating food as compared with nutrients: you don’t need to fathom a carrot’s complexity to reap its benefits.
The case of the antioxidants points up the dangers in taking a nutrient out of the context of food; as Nestle suggests, scientists make a second, related error when they study the food out of the context of the diet. We don’t eat just one thing, and when we are eating any one thing, we’re not eating another. We also eat foods in combinations and in orders that can affect how they’re absorbed. Drink coffee with your steak, and your body won’t be able to fully absorb the iron in the meat. The trace of limestone in the corn tortilla unlocks essential amino acids in the corn that would otherwise remain unavailable. Some of those compounds in that sprig of thyme may well affect my digestion of the dish I add it to, helping to break down one compound or possibly stimulate production of an enzyme to detoxify another. We have barely begun to understand the relationships among foods in a cuisine.
But we do understand some of the simplest relationships, like the zero-sum relationship: that if you eat a lot of meat you’re probably not eating a lot of vegetables. This simple fact may explain why populations that eat diets high in meat have higher rates of coronary heart disease and cancer than those that don’t. Yet nutritionism encourages us to look elsewhere for the explanation: deep within the meat itself, to the culpable nutrient, which scientists have long assumed to be the saturated fat. So they are baffled when large-population studies, like the Women’s Health Initiative, fail to find that reducing fat intake significantly reduces the incidence of heart disease or cancer.
Of course thanks to the low-fat fad (inspired by the very same reductionist fat hypothesis), it is entirely possible to reduce your intake of saturated fat without significantly reducing your consumption of animal protein: just drink the low-fat milk and order the skinless chicken breast or the turkey bacon. So maybe the culprit nutrient in meat and dairy is the animal protein itself, as some researchers now hypothesize. (The Cornell nutritionist T. Colin Campbell argues as much in his recent book, “The China Study.”) Or, as the Harvard epidemiologist Walter C. Willett suggests, it could be the steroid hormones typically present in the milk and meat; these hormones (which occur naturally in meat and milk but are often augmented in industrial production) are known to promote certain cancers.
But people worried about their health needn’t wait for scientists to settle this question before deciding that it might be wise to eat more plants and less meat. This is of course precisely what the McGovern committee was trying to tell us.
Nestle also cautions against taking the diet out of the context of the lifestyle. The Mediterranean diet is widely believed to be one of the most healthful ways to eat, yet much of what we know about it is based on studies of people living on the island of Crete in the 1950s, who in many respects lived lives very different from our own. Yes, they ate lots of olive oil and little meat. But they also did more physical labor. They fasted regularly. They ate a lot of wild greens — weeds. And, perhaps most important, they consumed far fewer total calories than we do. Similarly, much of what we know about the health benefits of a vegetarian diet is based on studies of Seventh-day Adventists, who muddy the nutritional picture by drinking absolutely no alcohol and never smoking. These extraneous but unavoidable factors are called, aptly, “confounders.” One last example: People who take supplements are healthier than the population at large, but their health probably has nothing whatsoever to do with the supplements they take — which recent studies have suggested are worthless. Supplement-takers are better-educated, more-affluent people who, almost by definition, take a greater-than-normal interest in personal health — confounding factors that probably account for their superior health.
But if confounding factors of lifestyle bedevil comparative studies of different populations, the supposedly more rigorous “prospective” studies of large American populations suffer from their own arguably even more disabling flaws. In these studies — of which the Women’s Health Initiative is the best known — a large population is divided into two groups. The intervention group changes its diet in some prescribed manner, while the control group does not. The two groups are then tracked over many years to learn whether the intervention affects relative rates of chronic disease.
When it comes to studying nutrition, this sort of extensive, long-term clinical trial is supposed to be the gold standard. It certainly sounds sound. In the case of the Women’s Health Initiative, sponsored by the National Institutes of Health, the eating habits and health outcomes of nearly 49,000 women (ages 50 to 79 at the beginning of the study) were tracked for eight years. One group of the women was told to reduce their consumption of fat to 20 percent of total calories. The results were announced early last year, producing front-page headlines of which the one in this newspaper was typical: “Low-Fat Diet Does Not Cut Health Risks, Study Finds.” And the cloud of nutritional confusion over the country darkened.
But even a cursory analysis of the study’s methods makes you wonder why anyone would take such a finding seriously, let alone order a Quarter Pounder With Cheese to celebrate it, as many newspaper readers no doubt promptly went out and did. Even the beginner student of nutritionism will immediately spot several flaws: the focus was on “fat,” rather than on any particular food, like meat or dairy. So women could comply simply by switching to lower-fat animal products. Also, no distinctions were made between types of fat: women getting their allowable portion of fat from olive oil or fish were lumped together with women getting their fat from low-fat cheese or chicken breasts or margarine. Why? Because when the study was designed 16 years ago, the whole notion of “good fats” was not yet on the scientific scope. Scientists study what scientists can see.
But perhaps the biggest flaw in this study, and other studies like it, is that we have no idea what these women were really eating because, like most people when asked about their diet, they lied about it. How do we know this? Deduction. Consider: When the study began, the average participant weighed in at 170 pounds and claimed to be eating 1,800 calories a day. It would take an unusual metabolism to maintain that weight on so little food. And it would take an even freakier metabolism to drop only one or two pounds after getting down to a diet of 1,400 to 1,500 calories a day — as the women on the “low-fat” regimen claimed to have done. Sorry, ladies, but I just don’t buy it.
In fact, nobody buys it. Even the scientists who conduct this sort of research conduct it in the knowledge that people lie about their food intake all the time. They even have scientific figures for the magnitude of the lie. Dietary trials like the Women’s Health Initiative rely on “food-frequency questionnaires,” and studies suggest that people on average eat between a fifth and a third more than they claim to on the questionnaires. How do the researchers know that? By comparing what people report on questionnaires with interviews about their dietary intake over the previous 24 hours, thought to be somewhat more reliable. In fact, the magnitude of the lie could be much greater, judging by the huge disparity between the total number of food calories produced every day for each American (3,900 calories) and the average number of those calories Americans own up to chomping: 2,000. (Waste accounts for some of the disparity, but nowhere near all of it.) All we really know about how much people actually eat is that the real number lies somewhere between those two figures.
To try to fill out the food-frequency questionnaire used by the Women’s Health Initiative, as I recently did, is to realize just how shaky the data on which such trials rely really are. The survey, which took about 45 minutes to complete, started off with some relatively easy questions: “Did you eat chicken or turkey during the last three months?” Having answered yes, I was then asked, “When you ate chicken or turkey, how often did you eat the skin?” But the survey soon became harder, as when it asked me to think back over the past three months to recall whether when I ate okra, squash or yams, they were fried, and if so, were they fried in stick margarine, tub margarine, butter, “shortening” (in which category they inexplicably lump together hydrogenated vegetable oil and lard), olive or canola oil or nonstick spray? I honestly didn’t remember, and in the case of any okra eaten in a restaurant, even a hypnotist could not get out of me what sort of fat it was fried in. In the meat section, the portion sizes specified haven’t been seen in America since the Hoover administration. If a four-ounce portion of steak is considered “medium,” was I really going to admit that the steak I enjoyed on an unrecallable number of occasions during the past three months was probably the equivalent of two or three (or, in the case of a steakhouse steak, no less than four) of these portions? I think not. In fact, most of the “medium serving sizes” to which I was asked to compare my own consumption made me feel piggish enough to want to shave a few ounces here, a few there. (I mean, I wasn’t under oath or anything, was I?)
This is the sort of data on which the largest questions of diet and health are being decided in America today.
THE ELEPHANT IN THE ROOM
In the end, the biggest, most ambitious and widely reported studies of diet and health leave more or less undisturbed the main features of the Western diet: lots of meat and processed foods, lots of added fat and sugar, lots of everything — except fruits, vegetables and whole grains. In keeping with the nutritionism paradigm and the limits of reductionist science, the researchers fiddle with single nutrients as best they can, but the populations they recruit and study are typical American eaters doing what typical American eaters do: trying to eat a little less of this nutrient, a little more of that, depending on the latest thinking. (One problem with the control groups in these studies is that they too are exposed to nutritional fads in the culture, so over time their eating habits come to more closely resemble the habits of the intervention group.) It should not surprise us that the findings of such research would be so equivocal and confusing.
But what about the elephant in the room — the Western diet? It might be useful, in the midst of our deepening confusion about nutrition, to review what we do know about diet and health. What we know is that people who eat the way we do in America today suffer much higher rates of cancer, heart disease, diabetes and obesity than people eating more traditional diets. (Four of the 10 leading killers in America are linked to diet.) Further, we know that simply by moving to America, people from nations with low rates of these “diseases of affluence” will quickly acquire them. Nutritionism by and large takes the Western diet as a given, seeking to moderate its most deleterious effects by isolating the bad nutrients in it — things like fat, sugar, salt — and encouraging the public and the food industry to limit them. But after several decades of nutrient-based health advice, rates of cancer and heart disease in the U.S. have declined only slightly (mortality from heart disease is down since the ’50s, but this is mainly because of improved treatment), and rates of obesity and diabetes have soared.
No one likes to admit that his or her best efforts at understanding and solving a problem have actually made the problem worse, but that’s exactly what has happened in the case of nutritionism. Scientists operating with the best of intentions, using the best tools at their disposal, have taught us to look at food in a way that has diminished our pleasure in eating it while doing little or nothing to improve our health. Perhaps what we need now is a broader, less reductive view of what food is, one that is at once more ecological and cultural. What would happen, for example, if we were to start thinking about food as less of a thing and more of a relationship?
In nature, that is of course precisely what eating has always been: relationships among species in what we call food chains, or webs, that reach all the way down to the soil. Species co-evolve with the other species they eat, and very often a relationship of interdependence develops: I’ll feed you if you spread around my genes. A gradual process of mutual adaptation transforms something like an apple or a squash into a nutritious and tasty food for a hungry animal. Over time and through trial and error, the plant becomes tastier (and often more conspicuous) in order to gratify the animal’s needs and desires, while the animal gradually acquires whatever digestive tools (enzymes, etc.) are needed to make optimal use of the plant. Similarly, cow’s milk did not start out as a nutritious food for humans; in fact, it made them sick until humans who lived around cows evolved the ability to digest lactose as adults. This development proved much to the advantage of both the milk drinkers and the cows.
“Health” is, among other things, the byproduct of being involved in these sorts of relationships in a food chain — involved in a great many of them, in the case of an omnivorous creature like us. Further, when the health of one link of the food chain is disturbed, it can affect all the creatures in it. When the soil is sick or in some way deficient, so will be the grasses that grow in that soil and the cattle that eat the grasses and the people who drink the milk. Or, as the English agronomist Sir Albert Howard put it in 1945 in “The Soil and Health” (a founding text of organic agriculture), we would do well to regard “the whole problem of health in soil, plant, animal and man as one great subject.” Our personal health is inextricably bound up with the health of the entire food web.
In many cases, long familiarity between foods and their eaters leads to elaborate systems of communications up and down the food chain, so that a creature’s senses come to recognize foods as suitable by taste and smell and color, and our bodies learn what to do with these foods after they pass the test of the senses, producing in anticipation the chemicals necessary to break them down. Health depends on knowing how to read these biological signals: this smells spoiled; this looks ripe; that’s one good-looking cow. This is easier to do when a creature has long experience of a food, and much harder when a food has been designed expressly to deceive its senses — with artificial flavors, say, or synthetic sweeteners.
Note that these ecological relationships are between eaters and whole foods, not nutrients. Even though the foods in question eventually get broken down in our bodies into simple nutrients, as corn is reduced to simple sugars, the qualities of the whole food are not unimportant — they govern such things as the speed at which the sugars will be released and absorbed, which we’re coming to see as critical to insulin metabolism. Put another way, our bodies have a longstanding and sustainable relationship to corn that we do not have to high-fructose corn syrup. Such a relationship with corn syrup might develop someday (as people evolve superhuman insulin systems to cope with regular floods of fructose and glucose), but for now the relationship leads to ill health because our bodies don’t know how to handle these biological novelties. In much the same way, human bodies that can cope with chewing coca leaves — a longstanding relationship between native people and the coca plant in South America — cannot cope with cocaine or crack, even though the same “active ingredients” are present in all three. Reductionism as a way of understanding food or drugs may be harmless, even necessary, but reductionism in practice can lead to problems.
Looking at eating through this ecological lens opens a whole new perspective on exactly what the Western diet is: a radical and rapid change not just in our foodstuffs over the course of the 20th century but also in our food relationships, all the way from the soil to the meal. The ideology of nutritionism is itself part of that change. To get a firmer grip on the nature of those changes is to begin to know how we might make our relationships to food healthier. These changes have been numerous and far-reaching, but consider as a start these four large-scale ones:
From Whole Foods to Refined. The case of corn points up one of the key features of the modern diet: a shift toward increasingly refined foods, especially carbohydrates. Call it applied reductionism. Humans have been refining grains since at least the Industrial Revolution, favoring white flour (and white rice) even at the price of lost nutrients. Refining grains extends their shelf life (precisely because it renders them less nutritious to pests) and makes them easier to digest, by removing the fiber that ordinarily slows the release of their sugars. Much industrial food production involves an extension and intensification of this practice, as food processors find ways to deliver glucose — the brain’s preferred fuel — ever more swiftly and efficiently. Sometimes this is precisely the point, as when corn is refined into corn syrup; other times it is an unfortunate byproduct of food processing, as when freezing food destroys the fiber that would slow sugar absorption.
So fast food is fast in this other sense too: it is to a considerable extent predigested, in effect, and therefore more readily absorbed by the body. But while the widespread acceleration of the Western diet offers us the instant gratification of sugar, in many people (and especially those newly exposed to it) the “speediness” of this food overwhelms the insulin response and leads to Type II diabetes. As one nutrition expert put it to me, we’re in the middle of “a national experiment in mainlining glucose.” To encounter such a diet for the first time, as when people accustomed to a more traditional diet come to America, or when fast food comes to their countries, delivers a shock to the system. Public-health experts call it “the nutrition transition,” and it can be deadly.
From Complexity to Simplicity. If there is one word that covers nearly all the changes industrialization has made to the food chain, it would be simplification. Chemical fertilizers simplify the chemistry of the soil, which in turn appears to simplify the chemistry of the food grown in that soil. Since the widespread adoption of synthetic nitrogen fertilizers in the 1950s, the nutritional quality of produce in America has, according to U.S.D.A. figures, declined significantly. Some researchers blame the quality of the soil for the decline; others cite the tendency of modern plant breeding to select for industrial qualities like yield rather than nutritional quality. Whichever it is, the trend toward simplification of our food continues on up the chain. Processing foods depletes them of many nutrients, a few of which are then added back in through “fortification”: folic acid in refined flour, vitamins and minerals in breakfast cereal. But food scientists can add back only the nutrients food scientists recognize as important. What are they overlooking?
Simplification has occurred at the level of species diversity, too. The astounding variety of foods on offer in the modern supermarket obscures the fact that the actual number of species in the modern diet is shrinking. For reasons of economics, the food industry prefers to tease its myriad processed offerings from a tiny group of plant species, corn and soybeans chief among them. Today, a mere four crops account for two-thirds of the calories humans eat. When you consider that humankind has historically consumed some 80,000 edible species, and that 3,000 of these have been in widespread use, this represents a radical simplification of the food web. Why should this matter? Because humans are omnivores, requiring somewhere between 50 and 100 different chemical compounds and elements to be healthy. It’s hard to believe that we can get everything we need from a diet consisting largely of processed corn, soybeans, wheat and rice.
From Leaves to Seeds. It’s no coincidence that most of the plants we have come to rely on are grains; these crops are exceptionally efficient at transforming sunlight into macronutrients — carbs, fats and proteins. These macronutrients in turn can be profitably transformed into animal protein (by feeding them to animals) and processed foods of every description. Also, the fact that grains are durable seeds that can be stored for long periods means they can function as commodities as well as food, making these plants particularly well suited to the needs of industrial capitalism.
The needs of the human eater are another matter. An oversupply of macronutrients, as we now have, itself represents a serious threat to our health, as evidenced by soaring rates of obesity and diabetes. But the undersupply of micronutrients may constitute a threat just as serious. Put in the simplest terms, we’re eating a lot more seeds and a lot fewer leaves, a tectonic dietary shift the full implications of which we are just beginning to glimpse. If I may borrow the nutritionist’s reductionist vocabulary for a moment, there are a host of critical micronutrients that are harder to get from a diet of refined seeds than from a diet of leaves. There are the antioxidants and all the other newly discovered phytochemicals (remember that sprig of thyme?); there is the fiber, and then there are the healthy omega-3 fats found in leafy green plants, which may turn out to be the most important benefit of all.
Most people associate omega-3 fatty acids with fish, but fish get them from green plants (specifically algae), which is where they all originate. Plant leaves produce these essential fatty acids (“essential” because our bodies can’t produce them on their own) as part of photosynthesis. Seeds contain more of another essential fatty acid: omega-6. Without delving too deeply into the biochemistry, the two fats perform very different functions, in the plant as well as the plant eater. Omega-3s appear to play an important role in neurological development and processing, the permeability of cell membranes, the metabolism of glucose and the calming of inflammation. Omega-6s are involved in fat storage (which is what they do for the plant), the rigidity of cell membranes, clotting and the inflammation response. (Think of omega-3s as fleet and flexible, omega-6s as sturdy and slow.) Since the two lipids compete with each other for the attention of important enzymes, the ratio between omega-3s and omega-6s may matter more than the absolute quantity of either fat. Thus too much omega-6 may be just as much a problem as too little omega-3.
And that might well be a problem for people eating a Western diet. As we’ve shifted from leaves to seeds, the ratio of omega-6s to omega-3s in our bodies has shifted, too. At the same time, modern food-production practices have further diminished the omega-3s in our diet. Omega-3s, being less stable than omega-6s, spoil more readily, so we have selected for plants that produce fewer of them; further, when we partly hydrogenate oils to render them more stable, omega-3s are eliminated. Industrial meat, raised on seeds rather than leaves, has fewer omega-3s and more omega-6s than preindustrial meat used to have. And official dietary advice since the 1970s has promoted the consumption of polyunsaturated vegetable oils, most of which are high in omega-6s (corn and soy, especially). Thus, without realizing what we were doing, we significantly altered the ratio of these two essential fats in our diets and bodies, with the result that the ratio of omega-6 to omega-3 in the typical American today stands at more than 10 to 1; before the widespread introduction of seed oils at the turn of the last century, it was closer to 1 to 1.
The role of these lipids is not completely understood, but many researchers say that these historically low levels of omega-3 (or, conversely, high levels of omega-6) bear responsibility for many of the chronic diseases associated with the Western diet, especially heart disease and diabetes. (Some researchers implicate omega-3 deficiency in rising rates of depression and learning disabilities as well.) To remedy this deficiency, nutritionism classically argues for taking omega-3 supplements or fortifying food products, but because of the complex, competitive relationship between omega-3 and omega-6, adding more omega-3s to the diet may not do much good unless you also reduce your intake of omega-6.
From Food Culture to Food Science. The last important change wrought by the Western diet is not, strictly speaking, ecological. But the industrialization of our food that we call the Western diet is systematically destroying traditional food cultures. Before the modern food era — and before nutritionism — people relied for guidance about what to eat on their national or ethnic or regional cultures. We think of culture as a set of beliefs and practices to help mediate our relationship to other people, but of course culture (at least before the rise of science) has also played a critical role in helping mediate people’s relationship to nature. Eating being a big part of that relationship, cultures have had a great deal to say about what and how and why and when and how much we should eat. Of course when it comes to food, culture is really just a fancy word for Mom, the figure who typically passes on the food ways of the group — food ways that, although they were never “designed” to optimize health (we have many reasons to eat the way we do), would not have endured if they did not keep eaters alive and well.
The sheer novelty and glamour of the Western diet, with its 17,000 new food products introduced every year, and the marketing muscle used to sell these products, have overwhelmed the force of tradition and left us where we now find ourselves: relying on science and journalism and marketing to help us decide questions about what to eat. Nutritionism, which arose to help us better deal with the problems of the Western diet, has largely been co-opted by it, used by the industry to sell more food and to undermine the authority of traditional ways of eating. You would not have read this far into this article if your food culture were intact and healthy; you would simply eat the way your parents and grandparents and great-grandparents taught you to eat. The question is, Are we better off with these new authorities than we were with the traditional authorities they supplanted? The answer by now should be clear.
It might be argued that, at this point in history, we should simply accept that fast food is our food culture. Over time, people will get used to eating this way and our health will improve. But for natural selection to help populations adapt to the Western diet, we’d have to be prepared to let those whom it sickens die. That’s not what we’re doing. Rather, we’re turning to the health-care industry to help us “adapt.” Medicine is learning how to keep alive the people whom the Western diet is making sick. It’s gotten good at extending the lives of people with heart disease, and now it’s working on obesity and diabetes. Capitalism is itself marvelously adaptive, able to turn the problems it creates into lucrative business opportunities: diet pills, heart-bypass operations, insulin pumps, bariatric surgery. But while fast food may be good business for the health-care industry, surely the cost to society — estimated at more than $200 billion a year in diet-related health-care costs — is unsustainable.
BEYOND NUTRITIONISM
To medicalize the diet problem is of course perfectly consistent with nutritionism. So what might a more ecological or cultural approach to the problem recommend? How might we plot our escape from nutritionism and, in turn, from the deleterious effects of the modern diet? In theory nothing could be simpler — stop thinking and eating that way — but this is somewhat harder to do in practice, given the food environment we now inhabit and the loss of sharp cultural tools to guide us through it. Still, I do think escape is possible, to which end I can now revisit — and elaborate on, but just a little — the simple principles of healthy eating I proposed at the beginning of this essay, several thousand words ago. So try these few (flagrantly unscientific) rules of thumb, collected in the course of my nutritional odyssey, and see if they don’t at least point us in the right direction.
1. Eat food. Though in our current state of confusion, this is much easier said than done. So try this: Don’t eat anything your great-great-grandmother wouldn’t recognize as food. (Sorry, but at this point Moms are as confused as the rest of us, which is why we have to go back a couple of generations, to a time before the advent of modern food products.) There are a great many foodlike items in the supermarket your ancestors wouldn’t recognize as food (Go-Gurt? Breakfast-cereal bars? Nondairy creamer?); stay away from these.
2. Avoid even those food products that come bearing health claims. They’re apt to be heavily processed, and the claims are often dubious at best. Don’t forget that margarine, one of the first industrial foods to claim that it was more healthful than the traditional food it replaced, turned out to give people heart attacks. When Kellogg’s can boast about its Healthy Heart Strawberry Vanilla cereal bars, health claims have become hopelessly compromised. (The American Heart Association charges food makers for their endorsement.) Don’t take the silence of the yams as a sign that they have nothing valuable to say about health.
3. Especially avoid food products containing ingredients that are a) unfamiliar, b) unpronounceable, c) more than five in number — or that contain high-fructose corn syrup. None of these characteristics are necessarily harmful in and of themselves, but all of them are reliable markers for foods that have been highly processed.
4. Get out of the supermarket whenever possible. You won’t find any high-fructose corn syrup at the farmers’ market; you also won’t find food harvested long ago and far away. What you will find are fresh whole foods picked at the peak of nutritional quality. Precisely the kind of food your great-great-grandmother would have recognized as food.
5. Pay more, eat less. The American food system has for a century devoted its energies and policies to increasing quantity and reducing price, not to improving quality. There’s no escaping the fact that better food — measured by taste or nutritional quality (which often correspond) — costs more, because it has been grown or raised less intensively and with more care. Not everyone can afford to eat well in America, which is shameful, but most of us can: Americans spend, on average, less than 10 percent of their income on food, down from 24 percent in 1947, and less than the citizens of any other nation. And those of us who can afford to eat well should. Paying more for food well grown in good soils — whether certified organic or not — will contribute not only to your health (by reducing exposure to pesticides) but also to the health of others who might not themselves be able to afford that sort of food: the people who grow it and the people who live downstream, and downwind, of the farms where it is grown.
“Eat less” is the most unwelcome advice of all, but in fact the scientific case for eating a lot less than we currently do is compelling. “Calorie restriction” has repeatedly been shown to slow aging in animals, and many researchers (including Walter Willett, the Harvard epidemiologist) believe it offers the single strongest link between diet and cancer prevention. Food abundance is a problem, but culture has helped here, too, by promoting the idea of moderation. Once one of the longest-lived people on earth, the Okinawans practiced a principle they called “Hara Hachi Bu”: eat until you are 80 percent full. To make the “eat less” message a bit more palatable, consider that quality may have a bearing on quantity: I don’t know about you, but the better the quality of the food I eat, the less of it I need to feel satisfied. All tomatoes are not created equal.
6. Eat mostly plants, especially leaves. Scientists may disagree on what’s so good about plants — the antioxidants? Fiber? Omega-3s? — but they do agree that they’re probably really good for you and certainly can’t hurt. Also, by eating a plant-based diet, you’ll be consuming far fewer calories, since plant foods (except seeds) are typically less “energy dense” than the other things you might eat. Vegetarians are healthier than carnivores, but near vegetarians (“flexitarians”) are as healthy as vegetarians. Thomas Jefferson was on to something when he advised treating meat more as a flavoring than a food.
7. Eat more like the French. Or the Japanese. Or the Italians. Or the Greeks. Confounding factors aside, people who eat according to the rules of a traditional food culture are generally healthier than we are. Any traditional diet will do: if it weren’t a healthy diet, the people who follow it wouldn’t still be around. True, food cultures are embedded in societies and economies and ecologies, and some of them travel better than others: Inuit not so well as Italian. In borrowing from a food culture, pay attention to how a culture eats, as well as to what it eats. In the case of the French paradox, it may not be the dietary nutrients that keep the French healthy (lots of saturated fat and alcohol?!) so much as the dietary habits: small portions, no seconds or snacking, communal meals — and the serious pleasure taken in eating. (Worrying about diet can’t possibly be good for you.) Let culture be your guide, not science.
8. Cook. And if you can, plant a garden. To take part in the intricate and endlessly interesting processes of providing for our sustenance is the surest way to escape the culture of fast food and the values implicit in it: that food should be cheap and easy; that food is fuel and not communion. The culture of the kitchen, as embodied in those enduring traditions we call cuisines, contains more wisdom about diet and health than you are apt to find in any nutrition journal or journalism. Plus, the food you grow yourself contributes to your health long before you sit down to eat it. So you might want to think about putting down this article now and picking up a spatula or hoe.
9. Eat like an omnivore. Try to add new species, not just new foods, to your diet. The greater the diversity of species you eat, the more likely you are to cover all your nutritional bases. That of course is an argument from nutritionism, but there is a better one, one that takes a broader view of “health.” Biodiversity in the diet means less monoculture in the fields. What does that have to do with your health? Everything. The vast monocultures that now feed us require tremendous amounts of chemical fertilizers and pesticides to keep from collapsing. Diversifying those fields will mean fewer chemicals, healthier soils, healthier plants and animals and, in turn, healthier people. It’s all connected, which is another way of saying that your health isn’t bordered by your body and that what’s good for the soil is probably good for you, too.
Michael Pollan, a contributing writer, is the Knight professor of journalism at the University of California, Berkeley. His most recent book, “The Omnivore’s Dilemma,” was chosen by the editors of The New York Times Book Review as one of the 10 best books of 2006.
Friday, January 26
The President Is Not Our Commander in Chief
WE hear constantly now about “our commander in chief.” The word has become a synonym for “president.” It is said that we “elect a commander in chief.” It is asked whether this or that candidate is “worthy to be our commander in chief.”
But the president is not our commander in chief. He certainly is not mine. I am not in the Army.
I first cringed at the misuse in 1973, during the “Saturday Night Massacre” (as it was called). President Richard Nixon, angered at the Watergate inquiry being conducted by the special prosecutor Archibald Cox, dispatched his chief of staff, Al Haig, to arrange for Mr. Cox’s firing. Mr. Haig told the attorney general, Elliot Richardson, to dismiss Mr. Cox. Mr. Richardson refused, and resigned. Then Mr. Haig told the second in line at the Justice Department, William Ruckelshaus, to fire Cox. Mr. Ruckelshaus refused, and accepted his dismissal. The third in line, Robert Bork, finally did the deed.
What struck me was what Mr. Haig told Mr. Ruckelshaus, “You know what it means when an order comes down from the commander in chief and a member of his team cannot execute it.” This was as great a constitutional faux pas as Mr. Haig’s later claim, when President Reagan was wounded, that “Constitutionally ... I’m in control.”
President Nixon was not Mr. Ruckelshaus’s commander in chief. The president is not the commander in chief of civilians. He is not even commander in chief of National Guard troops unless and until they are federalized. The Constitution is clear on this: “The president shall be commander in chief of the Army and Navy of the United States, and of the militia of the several states, when called into the actual service of the United States.”
When Abraham Lincoln took actions based on military considerations, he gave himself the proper title, “commander in chief of the Army and Navy of the United States.” That title is rarely — more like never — heard today. It is just “commander in chief,” or even “commander in chief of the United States.” This reflects the increasing militarization of our politics. The citizenry at large is now thought of as under military discipline. In wartime, it is true, people submit to the national leadership more than in peacetime. The executive branch takes actions in secret, unaccountable to the electorate, to hide its moves from the enemy and protect national secrets. Constitutional shortcuts are taken “for the duration.” But those impositions are removed when normal life returns.
But we have not seen normal life in 66 years. The wartime discipline imposed in 1941 has never been lifted, and “the duration” has become the norm. World War II melded into the cold war, with greater secrecy than ever — more classified information, tougher security clearances. And now the cold war has modulated into the war on terrorism.
There has never been an executive branch more fetishistic about secrecy than the Bush-Cheney one. The secrecy has been used to throw a veil over detentions, “renditions,” suspension of the Geneva Conventions and of habeas corpus, torture and warrantless wiretaps. We hear again the refrain so common in the other wars — If you knew what we know, you would see how justified all our actions are.
But we can never know what they know. We do not have sufficient clearance.
When Adm. William Crowe, the former chairman of the Joint Chiefs of Staff, criticized the gulf war under the first President Bush, Secretary of State James Baker said that the admiral was not qualified to speak on the matter since he no longer had the clearance to read classified reports. If he is not qualified, then no ordinary citizen is. We must simply trust our lords and obey the commander in chief.
The glorification of the president as a war leader is registered in numerous and substantial executive aggrandizements; but it is symbolized in other ways that, while small in themselves, dispose the citizenry to accept those aggrandizements. We are reminded, for instance, of the expanded commander in chief status every time a modern president gets off the White House helicopter and returns the salute of marines.
That is an innovation that was begun by Ronald Reagan. Dwight Eisenhower, a real general, knew that the salute is for the uniform, and as president he was not wearing one. An exchange of salutes was out of order. (George Bush came as close as he could to wearing a uniform while president when he landed on the telegenic aircraft carrier in an Air Force flight jacket).
We used to take pride in civilian leadership of the military under the Constitution, a principle that George Washington embraced when he avoided military symbols at Mount Vernon. We are not led — or were not in the past — by caudillos.
Senator Daniel Patrick Moynihan’s prescient last book, “Secrecy,” traced the ever-faster-growing secrecy of our government and said that it strikes at the very essence of democracy — accountability of representatives to the people. How can the people hold their representatives to account if they are denied knowledge of what they are doing? Wartime and war analogies are embraced because these justify the secrecy. The representative is accountable to citizens. Soldiers are accountable to their officer. The dynamics are different, and to blend them is to undermine the basic principles of our Constitution.
Garry Wills, a professor emeritus of history at Northwestern, is the author, most recently, of “What Paul Meant.”
Thursday, January 25
The State of Our Energy Policy
This is what I wish President Bush would say about energy in his State of the Union message:
In 2001 I said we needed to increase energy supplies, and in my State of the Union speech in 2002 I said we needed to increase energy production at home to make America less dependent on foreign oil. That was wrong. I’m sorry. The United States has only 3 percent of the world’s oil reserves, so it’s obvious that we need to reduce demand at home. On my watch, we’ve become more dependent on imports. Six years ago, when I came into office, we were importing 11 million barrels of oil a day; last week we brought in 14 million barrels a day. Meanwhile, federal funding for energy efficiency programs has fallen by a third.
In 2003 I promised that a child born that year would have a hydrogen car as his or her first car. I’m sorry — it’s just not clear that hydrogen fuel, fueling stations and cars will be ready by the time that child is old enough to drive. Or ever, for that matter.
In 2004 I spent more time talking about the problem of steroids in baseball than about energy. But between January 2002 and today, Americans have paid nearly a trillion dollars more for energy — oil, gasoline and natural gas — than during the previous five years. Obviously, if I’d stopped to think about the implications of this in 2004, I would have realized that these rising energy prices were outstripping my tax cuts, and that they’re symptoms of what’s been called an “energy straitjacket,” in which our delivery infrastructure is stressed, world supplies are tight, and we’re vulnerable to political and weather events over which we have no control.
In 2006 I promised to make our dependence on Middle Eastern oil “a thing of the past” by 2025. Unfortunately the latest projections show that by 2030 we will actually import 40 percent more of our oil from that region. I then said that “Americans are addicted to oil,” which suggested that we need to give up petroleum cold turkey. Obviously impossible. Better, I suggested, to simply wait for labs to make the equivalent of methadone for oil — cellulosic ethanol — which I said would be a practical and competitive fuel within six years, because we are on the “threshold of incredible advances.”
In short, I made a series of promises that wasted valuable time and money, while our uncontrolled energy demand chewed away at the underpinnings of our prosperity, jeopardized the security of our environment, and limited our foreign policy and strategic vision. At the same time, I suggested that the American people should remain passive, waiting for answers from science, from the energy industry or from the government.
Now I’ve had an awakening, and I’ve decided to announce a very different energy policy. This one contains no promises, no wild breakthroughs and no passivity. Now all I have is a plan.
Starting tomorrow, we’re going to cut back on the oil we use, stimulating our economy to become more energy-efficient, and we’re going to phase in alternative fuels in a way that lets the market decide which ones are best. Of course, we’ll invest in research — billions, in fact, far more than the $29 million I proposed last year to investigate cellulosic ethanol.
On conservation:
The 2005 Energy Bill contained a $450 million fund for public education, but the money has never been spent. We’ll use it to pay for a campaign to teach American drivers to save fuel by changing the way they drive. Driving 55 and changing our habits could save more than 7 billion gallons of gasoline a year. The next thing we’ll do is change the timing on 330,000 traffic signals, which could save as much as 17 billion gallons. Then we’ll help the nation’s long-haul truckers install aerodynamic kits on their trucks to save another billion or more. If we get to the point where we’re saving just 3 percent of what we now spend on gasoline, we’re likely to see the price of gas, as well as the worldwide price of oil, fall. And that will give every household, as well as our foreign policy leaders, a bit of breathing room.
On efficiency:
For the past 30 years, efficiency has been America’s largest and cheapest new energy resource; since 1970, increasing efficiency has met three-quarters of our new energy needs. But we can do much more, and we must do more to reduce greenhouse gases and stay competitive with China and Europe. So we’re going to start an initiative to make every aspect of the American economy better at turning energy into gross domestic product.
We’ll set tough standards for motor vehicles and utilities — our two biggest energy users — so that they reduce total energy demand while delivering more services. We’ll also add a market mechanism so that corporations, cities and other organizations can buy and sell efficiency credits, so-called “white certificates” — something Europe and several American states are already doing. In the meantime, we’ll fix policies that discourage efficiency. (Now, for example, we allow landlords to write off their cost of energy yearly, but we require that they write off investments in energy efficiency over a 30-year period.) We believe this combination of tough standards and an efficiency trading system will speed up the commercialization of cutting-edge technology and, at the same time, improve the lives of people around the world. If you want proof, look at how three decades of refrigerator standards have made for better, cheaper fridges that use 70 percent less energy. With the right policies, we can reduce energy demand by more than 15 percent over the next 10 years, while making our economy more productive and more resilient.
On alternative fuels and vehicles:
In the past I’ve overestimated the potential of alternative fuels. We need to scale back our expectations, so that we develop the smartest fuels, not the most politically convenient ones. For starters, we’re going to stop subsidizing alternative fuels. At the same time, we will put a small tax on fuels that pollute, and as time goes on, we’ll increase that tax. Then we’ll copy a model for renewable energy that’s already working in 20 states. It requires that utility companies meet a “renewable portfolio standard” by making sure that a percentage of their electricity comes from renewable sources like solar, wind and biomass.
We’ll extend that to all states, and we’ll start a similar scheme requiring that a small percentage of renewable fuels be mixed in with gasoline and diesel. We won’t dictate what that fuel is, or how it’s made — that’s a decision best left to the market. Investors will build renewable fuel plants because they’ll know there’s a market, and fuel dealers will buy the best mixture of quality and price. The government’s role here isn’t to pick winners, but to foster research and collaboration between the public and private sectors. Already, the Environmental Protection Agency, Eaton Corporation and U.P.S., among others, are cooperating on the production of a new kind of hydraulic hybrid delivery truck. When it’s ready, it’ll have 70-percent better fuel economy and 40-percent fewer greenhouse gas emissions, and it will pay for itself in just three years.
As we start to experiment with new fuels and vehicles, we must remember that innovation takes time. Even petroleum didn’t fully catch on for the first 50 years of its use. Expecting ethanol in five years or hydrogen in 10 is a foolish and expensive mistake.
Sunday, January 21
Lying Like It’s 2003
FRANK RICH
Scooter Libby, the mastermind behind the White House’s bogus scenarios for ginning up the war in Iraq, is back at Washington’s center stage, proudly defending the indefensible in a perjury trial. Ahmad Chalabi, the peddler of flawed prewar intelligence hyped by Mr. Libby, is back in clover in Baghdad, where he purports to lead the government’s Shiite-Baathist reconciliation efforts in between visits to his pal Mahmoud Ahmadinejad in Iran.
Last but never least is Mr. Libby’s former boss and Mr. Chalabi’s former patron, Dick Cheney, who is back on Sunday-morning television floating fictions about Iraq and accusing administration critics of aiding Al Qaeda. When the vice president went on a tear like this in 2003, hawking Iraq’s nonexistent W.M.D. and nonexistent connections to Mohamed Atta, he set the stage for a war that now kills Iraqi civilians in rising numbers (34,000-plus last year) that are heading into the genocidal realms of Saddam. Mr. Cheney’s latest sales pitch is for a new plan for “victory” promising an even bigger bloodbath.
Mr. Cheney was honest, at least, when he said that the White House’s Iraq policy would remain “full speed ahead!” no matter what happened on Nov. 7. Now it is our patriotic duty — politicians, the press and the public alike — to apply the brakes. Our failure to check the administration when it rushed into Iraq in 2003 will look even more shameful to history if we roll over again for a reboot in 2007. For all the belated Washington scrutiny of the war since the election, and for all the heralded (if so far symbolic) Congressional efforts to challenge it, too much lip service is still being paid to the deceptive P.R. strategies used by the administration to sell its reckless policies. This time we must do what too few did the first time: call the White House on its lies. Lies should not be confused with euphemisms like “incompetence” and “denial.”
Mr. Cheney’s performance last week on “Fox News Sunday” illustrates the problem; his lying is nowhere near its last throes. Asked by Chris Wallace about the White House’s decision to overrule commanders who recommended against a troop escalation, the vice president said, “I don’t think we’ve overruled the commanders.” He claimed we’ve made “enormous progress” in Iraq. He said the administration is not “embattled.” (Well, maybe that one is denial.)
This White House gang is so practiced in lying with a straight face that it never thinks twice about recycling its greatest hits. Hours after Mr. Cheney’s Fox interview, President Bush was on “60 Minutes,” claiming that before the war “everybody was wrong on weapons of mass destruction” and that “the minute we found out” the W.M.D. didn’t exist he “was the first to say so.” Everybody, of course, was not wrong on W.M.D., starting with the United Nations weapons inspection team in Iraq. Nor was Mr. Bush the first to come clean once the truth became apparent after the invasion. On May 29, 2003 — two days after a secret Defense Intelligence Agency-sponsored mission found no biological weapons in trailers captured by American forces — Mr. Bush declared: “We found the weapons of mass destruction. We found biological laboratories.”
But that’s all W.M.D. under the bridge. The most important lies to watch for now are the new ones being reiterated daily by the administration’s top brass, from Mr. Bush and Mr. Cheney on down. You know fiasco awaits America when everyone in the White House is reading in unison from the same fictional script, as they did back in the day when “mushroom clouds” and “uranium from Africa” were the daily drumbeat.
The latest lies are custom-made to prop up the new “way forward” that is anything but. Among the emerging examples is a rewriting of the history of Iraq’s sectarian violence. The fictional version was initially laid out by Mr. Bush in his Jan. 10 prime-time speech and has since been repeated on television by both Mr. Cheney and the national security adviser, Stephen Hadley, last Sunday and by Mr. Bush again on PBS’s “NewsHour” on Tuesday. It goes like this: sectarian violence didn’t start spiraling out of control until the summer of 2006, after Sunni terrorists bombed the Golden Mosque in Samarra and forced the Shiites to take revenge.
But as Mark Seibel of McClatchy Newspapers noted last week, “the president’s account understates by at least 15 months when Shiite death squads began targeting Sunni politicians and clerics.” They were visible in embryo long before that; The Times, among others, reported as far back as September 2003 that Shiite militias were becoming more radical, dangerous and anti-American. The reasons Mr. Bush pretends that Shiite killing started only last year are obvious enough. He wants to duck culpability for failing to recognize the sectarian violence from the outset — much as he failed to recognize the Sunni insurgency before it — and to underplay the intractability of the civil war to which he will now sacrifice fresh American flesh.
An equally big lie is the administration’s constant claim that it is on the same page as Prime Minister Nuri al-Maliki as we go full speed ahead. Only last month Mr. Maliki told The Wall Street Journal that he wished he “could be done with” his role as Iraq’s leader “before the end of this term.” Now we are asked to believe not merely that he is a strongman capable of vanquishing the death squads of the anti-American cleric Moktada al-Sadr, his political ally, but also that he can be trusted to produce the troops he failed to supply in last year’s failed Baghdad crackdown. Yet as recently as November, there still wasn’t a single Iraqi battalion capable of fighting on its own.
Hardly a day passes without Mr. Maliki mocking the White House’s professed faith in him. In the past week or so alone, he has presided over a second botched hanging (despite delaying it for more than two weeks to put in place new guidelines), charged Condi Rice with giving a “morale boost to the terrorists” because she criticized him, and overruled American objections to appoint an obscure commander from deep in Shiite territory to run the Baghdad “surge.” His government doesn’t even try to hide its greater allegiance to Iran. Mr. Maliki’s foreign minister has asked for the release of the five Iranians detained in an American raid on an Iranian office in northern Iraq this month and, on Monday, called for setting up more Iranian “consulates” in Iraq.
The president’s pretense that Mr. Maliki and his inept, ill-equipped, militia-infiltrated security forces can advance American interests in this war is Neville Chamberlain-like in its naiveté and disingenuousness. An American military official in Baghdad read the writing on the wall to The Times last week: “We are implementing a strategy to embolden a government that is actually part of the problem. We are being played like a pawn.” That’s why the most destructive lie of all may be the White House’s constant refrain that its doomed strategy is the only one anyone has proposed. Administration critics, Mr. Cheney said last Sunday, “have absolutely nothing to offer in its place,” as if the Iraq Study Group, John Murtha and Joseph Biden-Leslie Gelb plans, among others, didn’t predate the White House’s own.
In reality we’re learning piece by piece that it is the White House that has no plan. Ms. Rice has now downsized the surge/escalation into an “augmentation,” inadvertently divulging how the Pentagon is improvising, juggling small deployments in fits and starts. No one can plausibly explain how a parallel chain of command sending American and Iraqi troops into urban street combat side by side will work with Iraqis in the lead (it will report to a “committee” led by Mr. Maliki!). Or how $1 billion in new American reconstruction spending will accomplish what the $30 billion thrown down the drain in previous reconstruction spending did not.
All of this replays 2003, when the White House refused to consider any plan, including existing ones in the Pentagon and State Department bureaucracies, for coping with a broken post-Saddam Iraq. Then, as at every stage of the war since, the only administration plan was for a propaganda campaign to bamboozle American voters into believing “victory” was just around the corner.
The next push on the “way forward” propaganda campaign arrives Tuesday night, with the State of the Union address. The good news is that the Democrats have chosen Jim Webb, the new Virginia senator, to give their official response. Mr. Webb, a Reagan administration Navy secretary and the father of a son serving in Iraq, has already provoked a testy exchange about the war with the president at a White House reception for freshmen in Congress. He’s the kind of guy likely to keep a scorecard of the lies on Tuesday night. But whether he does or not, it’s incumbent on all those talking heads who fell for “shock and awe” and “Mission Accomplished” in 2003 to not let history repeat itself in 2007. Facing the truth is the only way forward in Iraq.
Sex and the Single-Minded
How to get a job in Washington, that balmy, bipartisan town: Direct an organization that opposes contraception on the grounds that it is “demeaning to women.” Compare premarital sex to heroin addiction. Advertise a link between breast cancer and abortion — a link that was refuted in 1997. Rant against sex ed. And hatch a loony theory about hormones.
You’re a shoo-in, and if your name is Eric Keroack you’re in your second month as deputy assistant secretary for population affairs at the Department of Health and Human Services. Dr. Keroack, a 46-year-old Massachusetts ob-gyn, today oversees the $280 million Title X program, the only federal program “designed to provide access to contraceptive supplies and information to all who want and need them, with priority given to low-income persons.”
It’s not a job that plays to Dr. Keroack’s talents, which happen to be prodigious. In the PowerPoint presentation that has cemented his reputation, he makes the case that premarital sex suppresses the hormone oxytocin, thereby impairing one’s ability to forge a successful long-term relationship. If forced to mince words you might call this fanciful or speculative. Otherwise you’d call it wacko. “Really, really scary” and “utterly hilarious” were the first two reactions I heard from scientists.
Each of us owes a rather critical debt to oxytocin. It’s what moves a new mother to comfort and nurse a squalling baby rather than to toss it from the window, as common sense might dictate. It is — you knew your husband was missing something — the hormone of intimacy. (No, you can’t buy supplements across the border. And yes, OxyContin is something different. Rush Limbaugh was not working on his bonding instincts.)
Louann Brizendine, a neuropsychiatrist at the University of California, San Francisco, calls oxytocin the Glinda the Good Witch of her field. It is the drug of trust and partnership and attachment, commonly known by their street name: love. Oxytocin mellows, elates, and throws you into a mental fog. Rats prefer it to cocaine. While the rush at childbirth is particularly dramatic, the hormone swells with physical and emotional bonding of all kinds.
“Surge” has not always been a dirty word.
But no one has had as much good, clean fun with oxytocin as Dr. Keroack, for whom it is “God’s superglue.” Extrapolating in part from research with prairie voles, which are monogamous, he postulates that oxytocin cannot survive too much sex, at least with multiple partners, at least prior to marriage. By way of demonstration he proposes the duct tape test: you need only an adhesive and a hairy arm. The tape represents the brain. Press it down. Now reapply. See what happens? Less sticky, right? Concludes Keroack: “Basically, you will end up damaging your brain’s ability to use the oxytocin system as a chemical mechanism that serves to help you successfully bond in future relationships.” Don’t ask about his illustrations. They are offensive.
Keroack presents this as gospel truth, though the scientists on whose research he bases his theory balk. One called it a wild leap. “A bungee jump without a cord,” suggested another expert. Dr. Brizendine had a less kind word for it. She adds that while premarital sex cannot ruin your oxytocin response, it has been shown — in the absence of options — to ruin your life. Something tells me that Dr. Keroack is not planning a 34th anniversary bash on Monday for Roe v. Wade.
I know what you’re thinking: if Dr. Keroack can write stuff this outlandish he’s spelling his name wrong. As the other Kerouac said — arguably with a firmer grasp of neurochemistry — “I had nothing to offer anybody but my own confusion.” Dr. Keroack may want to borrow the disclaimer that prefaces Michael Crichton’s newest best-seller: “This novel is fiction, except for the parts that aren’t.” It takes an agenda rather than a medical degree to engage in this kind of science. Or an imagination.
In all fairness, Dr. Keroack has long been a little clumsy as an analogist. In a 2001 letter to the Massachusetts Legislature he explained the logic of performing sonograms on women considering abortion: “Even Midas lets you look at your old muffler before they advise you to change it.”
There are many ways to define demeaning.
Stacy Schiff is the author, most recently, of “A Great Improvisation: Franklin, France and the Birth of America.” She is a guest columnist.
Retreat and Cheat
NYT Editorial
President Bush’s warrantless wiretapping program was once deemed so vital to national security that it could not be subjected to judicial review. Last week, the White House said it was doing just that.
In 2005, the White House would not even comment on news reports about the C.I.A.’s prisons because Americans’ safety depended on their being kept secret. In 2006, Mr. Bush held a photo-op to announce that he was keeping them open.
The administration has repeatedly insisted that it was essential to the American way of life for Mr. Bush to be able to imprison foreigners without trial or legal counsel. Now the administration claims it was trying to bring those detainees to trial all along but was stymied by white-shoe lawyers.
By now, this is a familiar pattern: First, Mr. Bush and his aides say his actions are so vital to national security that to even report on them — let alone question them — lends comfort to the terrorists. Then, usually when his decisions face scrutiny from someone other than a compliant Republican Congress, the president seems to compromise.
Behind this behavior are at least two dynamics, both of them disturbing.
The first is that the policies Mr. Bush is trying so hard to hide have little, if anything, to do with real national security issues — and everything to do with a campaign, spearheaded by Vice President Dick Cheney, to break the restraints on presidential power imposed after Vietnam and Watergate. And there is much less than meets the eye to Mr. Bush’s supposed concessions.
Generally, they mask the fact that he either got what he wanted from Congress or found a way to add some other veneer of legitimacy to his lawless behavior. The campaign to expand presidential power goes on, at the expense of American values.
Mr. Bush’s aides don’t try very hard to hide it. The day the shift on domestic wiretapping was announced, Attorney General Alberto Gonzales gave a speech in which he sneered at the idea of allowing judges to review national security policies. The next day, he was in the Senate refusing to turn over the agreement that he said would provide judicial review for the wiretapping. And his lawyers were in court arguing that a lawsuit over the warrantless eavesdropping should be dropped because Mr. Bush said he would stop the operation.
We don’t know exactly what agreement the White House made with the Foreign Intelligence Surveillance Court about eavesdropping. But there is evidence that Mr. Bush got some broad approval for a wiretapping “program” rather than the individual warrants required by law. Because the court works in secret, the public may never know whether Mr. Bush really is complying with the law.
Nor is there likely to be an explanation of why the White House could not have sought the court’s approval in the first place. The White House’s claim that the process is too cumbersome doesn’t ring true. The law already allows the government to wiretap first and then ask for a warrant within three days. The real reason is almost certainly that the imperial presidency had no desire to share power even with the most secret part of the judiciary.
Why else would the president have turned down more than one offer from Congress to amend the 1978 wiretapping law after 9/11 to make getting warrants easier and faster than the three-day rule?
For that matter, why did the White House initially refuse to work with senior Republican lawmakers to create a legal court system for the Guantánamo detainees? Instead, Mr. Bush ordered the creation of kangaroo courts, expanding presidential authority at the expense of Congress and the judiciary, and at the expense of justice.
The Republican-led Congress (with the help of cowed Democrats) refused to hold Mr. Bush and his presidency to account on any of these issues. The Military Commissions Act, passed by Congress just before the midterm elections last year, gave Mr. Bush a pass on what he had already done with the detainees outside the law, and did not stop him from jailing non-Americans indefinitely without due process. Congress absolved American intelligence agents of past abuses of prisoners and approved future abuses, and Mr. Bush happily announced that the C.I.A. prisons would stay open.
There are signs, from people like Senator Patrick Leahy, the new chairman of the Judiciary Committee, that the Democrats will be tougher than the Republicans on these issues. The eavesdropping program and Mr. Bush’s secret deal with the surveillance court are a very good place to start.
Monday, January 15
The real reason the Bush administration won't back down on Guantanamo.
By Dahlia Lithwick
Posted Saturday, Jan. 13, 2007, at 6:52 AM ET
Why is the United States poised to try Jose Padilla as a dangerous terrorist, long after it has become perfectly clear that he was just the wrong Muslim in the wrong airport on the wrong day?
Why is the United States still holding hundreds of detainees at Guantanamo Bay, long after years of interrogation and abuse have established that few, if any, of them are the deadly terrorists they have been held out to be?
And why is President Bush still issuing grandiose and provocative signing statements, the latest of which claims that the executive branch holds the power to open mail as it sees fit?
Willing to give the benefit of the doubt, I once believed the common thread here was presidential blindness—an extreme executive-branch myopia that leads the president to believe that these futile little measures are somehow integral to combating terrorism. That this is some piece of self-delusion that prevents Bush and his advisers from recognizing that Padilla is just a chump and Guantanamo merely a holding pen for a jumble of innocent and half-guilty wretches.
But it has finally become clear that the goal of these foolish efforts isn't really to win the war against terrorism; indeed, nothing about Padilla, Guantanamo, or signing statements moves the country an inch closer to eradicating terror. The object is a larger one, and the original overarching goal of this administration: expanding executive power, for its own sake.
Two scrupulously reported pieces on the Padilla case are illuminating. On Jan. 3, Nina Totenberg of National Public Radio interviewed Mark Corallo, spokesman for then-Attorney General John Ashcroft, about the behind-the-scenes decision-making in the Padilla case—a case that's lolled through the federal courts for years. According to Totenberg, when the Supreme Court sent Padilla's case back to the lower federal courts on technical grounds in 2004, the Bush administration's sole concern was preserving its constitutional claim that it could hold citizens as enemy combatants. "Justice Department officials warned that if the case went back to the Supreme Court, the administration would almost certainly lose," she reports, which is why Padilla was hauled back to the lower courts. Her sources further confirmed that "key players in the Defense Department and Vice President Cheney's office insisted that the power to detain Americans as enemy combatants had to be preserved."
Deborah Sontag's excellent New York Times story on Padilla on Jan. 4 makes the same point: He was moved from military custody to criminal court only as "a legal maneuver that kept the issue of his detention without charges out of the Supreme Court." So this is why the White House yanked Padilla from the brig to the high court to the federal courts and back to a Florida trial court: They were only forum shopping for the best place to enshrine the right to detain him indefinitely. Their claims about Padilla's dirty bomb, known to be false, were a means of advancing their larger claims about executive power. And when confronted with the possibility of losing on those claims, they yanked him back to the criminal courts as a way to avoid losing powers they'd already won.
This need to preserve newly won legal ground also explains the continued operation of the detention center at Guantanamo Bay. Last week marked the fifth anniversary of the camp that—according to Donald Rumsfeld in 2002—houses only "the worst of the worst." Now that over half of them have been released (apparently, the best of the worst) and even though only about 80 of the rest will ever see trials, the camp remains open. Why? Civil-rights groups worldwide and even close U.S. allies like Germany, Denmark, and England clamor for its closure. And as the ever-vigilant Nat Hentoff points out, new studies reveal that only a small fraction of the detainees there are even connected to al-Qaida—according to the Defense Department's own best data.
But Guantanamo stays open for the same reason Padilla stays on trial. Having claimed the right to label enemy combatants and detain them indefinitely without charges, the Bush administration is unable to retreat from that position without ceding ground. In some sense, the president is now as much a prisoner of Guantanamo as the detainees. And having gone nose-to-nose with the Congress over his authority to craft stripped-down courts for these "enemies," courts guaranteed to produce guilty verdicts, Bush cannot just call off the trials.
The endgame in the war on terror isn't holding the line against terrorists. It's holding the line on hard-fought claims to absolutely limitless presidential authority.
Enter these signing statements. The most recent of the all-but-meaningless postscripts Bush tacks onto legislation gives him the power to "authorize a search of mail in an emergency" to "protect human life and safety" and "for foreign intelligence collection." There is some debate about whether the president has that power already, but it misses the point. The purpose of these signing statements is simply to plant a flag on the moon—one more way for the president to stake out the furthest corners in his field of constitutional dreams.
Last spring, The New Yorker's Jane Mayer profiled David Addington, Vice President Richard Cheney's chief of staff and legal adviser. Addington's worldview in brief: a single-minded devotion to something called the New Paradigm, a constitutional theory of virtually limitless executive power in which, as Mayer describes it, "the President, as Commander-in-Chief, has the authority to disregard virtually all previously known legal boundaries, if national security demands it."
Insiders in the Bush administration told Mayer that Addington and Cheney had been "laying the groundwork" for a vast expansion of presidential power long before 9/11. In 2002, the vice president told ABC News that the presidency was "weaker today as an institution because of the unwise compromises that have been made over the last 30 to 35 years." Rebuilding that presidency has been their sole goal for decades.
The image of Addington scrutinizing "every bill before President Bush signs it, searching for any language that might impinge on Presidential power," as Mayer puts it, can be amusing—like the mother of the bride obsessing over a tricky seating chart. But this zeal to restore an all-powerful presidency traps the Bush administration in its own worst legal sinkholes. This newfound authority—to maintain a disastrous Guantanamo, to stage rights-free tribunals and hold detainees forever—is the kind of power Nixon only dreamed about. It cannot be let go.
In a heartbreaking letter from Guantanamo this week, published in the Los Angeles Times, prisoner Jumah Al Dossari writes: "The purpose of Guantanamo is to destroy people, and I have been destroyed." I fear he is wrong. The destruction of Al Dossari, Jose Padilla, Zacarias Moussaoui, and some of our most basic civil liberties was never a purpose or a goal—it was a mere byproduct. The true purpose is more abstract and more tragic: To establish a clunky post-Watergate dream of an imperial presidency, whatever the human cost may be.