In a finding that is likely to intensify the debate over what to teach students about the origins of life, a poll released yesterday found that nearly two-thirds of Americans say that creationism should be taught alongside evolution in public schools.
The poll found that 42 percent of respondents held strict creationist views, agreeing that "living things have existed in their present form since the beginning of time."
In contrast, 48 percent said they believed that humans had evolved over time. But of those, 18 percent said that evolution was "guided by a supreme being," and 26 percent said that evolution occurred through natural selection. In all, 64 percent said they were open to the idea of teaching creationism in addition to evolution, while 38 percent favored replacing evolution with creationism.
The poll was conducted July 7-17 by the Pew Forum on Religion and Public Life and the Pew Research Center for the People and the Press. The questions about evolution were asked of 2,000 people. The margin of error was 3.5 percentage points.
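As a quick sanity check (our own arithmetic, not part of Pew's published methodology), the textbook margin of error for a simple random sample of 2,000 comes out smaller than the reported 3.5 points; the gap is roughly what a survey's design effect typically accounts for. A minimal Python sketch:

```python
# Rough sanity check of the reported margin of error. This is our own
# back-of-the-envelope calculation, not the Pew methodology statement.
# For a simple random sample, the 95% margin of error on a proportion p is
# about 1.96 * sqrt(p * (1 - p) / n), and it is largest at p = 0.5.
import math

n = 2000                     # respondents asked the evolution questions
p = 0.5                      # worst case for the margin of error
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"Simple-random-sample MOE: +/- {moe * 100:.1f} points")   # ~2.2 points

# The reported +/- 3.5 points is larger, which is typical once a "design
# effect" for weighting and clustering in real telephone samples is applied.
design_effect = (3.5 / (moe * 100)) ** 2
print(f"Implied design effect: {design_effect:.1f}")              # ~2.6
```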
President Bush joined the debate on Aug. 2, telling reporters that both evolution and the theory of intelligent design should be taught in schools "so people can understand what the debate is about."
The poll showed 41 percent of respondents wanted parents to have the primary say over how evolution is taught, compared with 28 percent who said teachers and scientists should decide and 21 percent who said school boards should. Asked whether they believed creationism should be taught instead of evolution, 38 percent were in favor, and 49 percent were opposed.
More of those who believe in creationism said they were "very certain" of their views (63 percent), compared with those who believe in evolution (32 percent).
Fully 70% of white evangelical Protestants say that life has existed in its present form since the beginning of time.
College graduates are twice as likely as people who did not attend college to accept the natural selection theory of evolution (40% vs. 18%). Nearly two-thirds (63%) of those who take a creationist point of view say they are very certain about how life developed. By contrast, those who believe in evolution are less certain of their views; just 32% say they are very certain.
White evangelicals and black Protestants are the only religious groups expressing majority support for teaching creationism instead of evolution in public schools. Majorities of mainline Protestants, Catholics and seculars oppose this idea. Politically, a majority of conservative Republicans favor replacing evolution with creationism in the classroom, but support for this proposal falls below 40% for all other political groups, including moderate and liberal Republicans. Regionally, only among Southerners does a plurality (45%) support replacing evolution with creationism in the schools.
Wednesday, August 31
Intelligent Design Has No Place in the Science Curriculum
By HAROLD MOROWITZ, ROBERT HAZEN, and JAMES TREFIL
Scientists who teach evolution sometimes feel as if they are trapped in an old horror film -- the kind where the monster is killed repeatedly, only to come to life in a nastier form each time. Since the Scopes trial in 1925, the battle between scientists who want to teach mainstream biology in American public schools, and creationists who want to promulgate a more religious view, has gone through several cycles.
In McLean v. Arkansas Board of Education in 1982, a federal court ruled that the introduction of creationism into public-school curricula constituted the establishment of religion, and hence was expressly forbidden by the First Amendment. That decision dealt a serious (though by no means fatal) blow to old-line creationism and its close cousin, so-called creation science. But another variant of creationism, so-called intelligent design, has cropped up. At least 19 states are now debating its use in public education, and President Bush commented in August that he thought both evolution and intelligent design "ought to be properly taught."
Many people fail to understand the subtle but important differences between the new and old forms of creationism, and the different debates those approaches engender. Like the French generals who used tactics from World War I to face the Nazis in 1939, some educators seem intent on fighting the last war.
A word about the authors of this essay: Although our areas of expertise differ, all of us have investigated aspects of life's origin and evolution. In addition, our political views span the spectrum from liberal Democrat to conservative Republican. Thus the essay does not represent any particular ideological or disciplinary viewpoint. We are united in our concern that the science curriculum, from kindergarten through university, should reflect the best and most up-to-date scholarship.
Consider, then, several different theories of life's origin and evolution. The main theories are those of miraculous creation and of sequential origins. Within the theories of sequential origins are the theories of intelligent design and of emergent complexity, and the latter can in turn be divided into the theories of frozen accident and of deterministic origins. The debate surrounding each pair focuses on a different aspect of the nature of science.
Miraculous creation versus sequential origins. Was the origin of life a miracle, or did it conform to natural law -- and how can we tell? Many different versions of the doctrine of miraculous creation exist, but the one that is most at odds with modern science is called "young Earth creationism" and is based on a literal reading of the Bible. According to the supporters of that theory, our planet and its life-forms were created more or less in their present forms in a miraculous act about 10,000 years ago.
Young Earth creationism is in direct conflict with scientific measurements of the age of rocks, the thickness of polar ice sheets, the expansion of the universe, and numerous other indicators of our planet's great antiquity.
One unusual solution to that disparity was proposed in a book by Philip Gosse, called Omphalos, which was published two years before Darwin's On the Origin of Species. The word "omphalos" means navel in Greek, and Gosse argued that Adam was created with a navel, even though he had never been inside a womb. From that insight has flowed the so-called doctrine of created antiquity (Gosse actually called it Pre-Chronism), which states that although Earth was created 10,000 years ago, it was created to look as if it were much older. Are some stars more than 10,000 light-years away? The universe was created with light from those stars already on its way to Earth. And what about those apparently ancient rocks? The universe was created with just the right mixtures of potassium-40 and argon to make the rocks appear much older than they really are.
It is impossible to conceive of any experiment or observation that could prove the doctrine of created antiquity wrong. Any result, no matter what it was, could be explained by saying "the universe was just created that way."
In fact, that property of young Earth creationism proved to be its Achilles' heel. Every scientific theory must be testable by observation or experiment -- or it cannot be considered science. In principle, it must be possible to imagine outcomes that would prove the theory wrong. In the words of Karl Popper, scientific theories must be falsifiable, even if they are not false. Popper said that a theory that cannot be overturned by experimental data is not a part of experimental science.
Created antiquity is not falsifiable. The teaching of young Earth creationism, along with any other doctrine based on a miraculous creation of life, was prohibited in public schools not because the theory was proved wrong but because it simply is not science. It is, as the court in McLean v. Arkansas Board of Education recognized, a religious doctrine, untestable by the techniques of science.
Once we discard the theories of miraculous creation, we are left with the theories of sequential origins.
Intelligent design versus emergent complexity. The theory of intelligent design, or ID, is a theory of sequential origins, but it is also the latest attack on the idea that the origin and evolution of life follow natural laws. Like created antiquity, ID has a long intellectual pedigree. The English philosopher William Paley first espoused it in 1802, arguing that if you found a watch in a field, you would conclude that it had been designed by some intelligence rather than assembled by chance. In the same way, the argument goes, the intricate universe in which we live reflects the mind of an intelligent maker.
The modern theory of intelligent design is more sophisticated than Paley's argument, although it derives from much the same kind of reasoning. It is anchored in a concept called "irreducible complexity" -- the idea that organisms possess many complicated structures, which are essential to the organism's survival but which are useless unless all the structures are present. The chance of Darwinian evolution's producing so many such structures and of their existing simultaneously, according to the theory, is so small that they must have been produced by an intelligent designer.
Intelligent design challenges the conventional wisdom in origin-of-life research that life is a prime example of so-called emergent complexity. All around us are complex systems that arise when energy flows through a collection of particles, like living cells or grains of sand. Ant colonies, slime molds, sand dunes, spiral galaxies, traffic jams, and human consciousness are examples of such systems. Although scientists have yet to produce a living system in the laboratory, most origin-of-life researchers are optimistic that one day we will be able to do so, or at least to understand how life first emerged from inorganic materials.
The supporters of intelligent design resort to the same kind of argument that creationists have used for decades, identifying some biological structure and claiming that it is irreducibly complex. Then supporters of emergent complexity have to analyze that structure and show that its complexity arises naturally. For example, 20 years ago, the predecessors of ID advocates pointed to the modern whale as an example of what would be called irreducible complexity today (that term wasn't used then). The whale, they argued, is a form so specialized that it could not possibly have been produced by Darwinian evolution.
Alan Haywood, author of Creation and Evolution, put it this way: "Darwinists rarely mention the whale because it presents them with one of their most insoluble problems. They believe that somehow a whale must have evolved from an ordinary land-dwelling animal, which took to the sea and lost its legs. ... A land mammal that was in the process of becoming a whale would fall between two stools -- it would not be fitted for life on land or at sea, and would have no hope for survival."
The power of science is that, faced with such a challenge, one can test the relevant theory. The theory of evolution predicts that whales with atrophied hind legs must have once swum in the seas. If Darwin is correct, then those whales' fossils must lie buried somewhere. Furthermore, those strange creatures must have arisen during a relatively narrow interval of geological time, after the evolution of the earliest known marine mammals (about 60 million years ago) and before the appearance of the streamlined whales of the present era (which show up in the fossil record during the past 30 million years). Armed with those conclusions, paleontologists searched shallow marine formations from 35 million to 55 million years in age. Sure enough, in the past decade the scientists have excavated dozens of those "missing links" in the development of the whale -- curious creatures that sport combinations of anatomical features characteristic of land and sea mammals.
But there's always another challenge to evolution, always another supposed example of irreducible complexity. At the present time the poster child of intelligent design is the flagellum of a bacterium. That complex structure in bacterial walls features a corkscrew-shaped fiber that rotates, propelling the bacterium through the water. Obviously, a completely functioning flagellum is very useful, but it is also obvious that all its parts have to be present for it to function. A nonmoving corkscrew, for example, would be useless and would confer no evolutionary advantage on its own. Roughly 50 molecules are involved in constructing the flagellum, so the probability of all the parts' coming together by chance seems infinitesimally small.
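As a toy illustration of why that "by chance" number comes out so vanishingly small, here is a minimal sketch under the argument's own framing: roughly 50 parts, each assumed to be independently in place with some probability (the per-part probabilities are purely illustrative, not drawn from the essay). The next paragraph explains why the framing itself, not the multiplication, is what fails.

```python
# Toy model of the "all parts at once" argument: if a structure needs k parts
# and each part independently has probability p of being present, the chance
# of the complete structure appearing in a single step is p ** k.
# The values of p below are illustrative assumptions, not data from the essay.

k = 50  # roughly the number of molecular parts cited for the flagellum

for p in (0.5, 0.1, 0.01):
    print(f"p = {p:<5} -> chance of all {k} parts at once: {p ** k:.3e}")

# Even a generous p = 0.5 gives ~8.9e-16; p = 0.01 gives 1e-100. The rebuttal
# that follows is not that these numbers are miscalculated, but that evolution
# does not assemble structures in one step: parts are recruited sequentially
# from components that already existed for other jobs.
```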
However, that intelligent-design argument contains a hidden assumption: that all parts of a complex structure must have had the same function throughout the history of the development of the organism. In fact, it is quite common for structures to have one function at one time and be adapted for quite another use later on. A land animal's legs become a whale's flippers. An insect may develop bumps on the side of its body to help it get rid of internal heat, but when the bumps get big enough, they may help the insect glide or fly, thus opening up an entirely new ecological niche for exploitation. That process is so common that evolutionary scientists have given it a name: exaptation.
No evolutionary theorist would suggest that something as complex as the flagellum appeared ab initio. Instead, it was assembled from parts that had developed for other uses. For example, some molecules produce energy by rotating, a normal procedure within cells. Other molecules have a shape that makes them ideal for moving materials across cell membranes. The flagellum's building blocks include both types of molecules. Instead of being assembled from scratch, then, the flagellum is put together from a stock of already existing parts, each of which evolved to carry out a completely different task. The flagellum may be complicated, but it is not irreducibly complex.
An important distinction between the theories of intelligent design and miraculous creation is that the former makes predictions that can be tested. The problem with ID, at least so far, is that when statements like the one claiming irreducible complexity for the flagellum are put to the test, they turn out to be wrong.
That distinction means that we should use different methods to counter intelligent design than those that defeated young Earth creationism. The more thoughtful advocates of intelligent design accept many of the tenets of Darwinism -- the idea that living things have changed over time, for example. Although the motive of some ID proponents may be to re-introduce God into the debate about the origin of life, their arguments can be met with scientific, not legal, rebuttals. That is good news: They are playing on our field.
Frozen accident versus deterministic origins. The last pair of theories are both subsets of emergent complexity, and both fall within the scientific mainstream; the debate here is about whether life had to develop the way it did, or whether it could have turned out differently. A number of distinguished scientists see the development of life on our planet as a series of accidental, perhaps improbable, events that became locked into the structures of living things -- what have been termed "frozen accidents." In the words of the most eloquent advocate for that point of view, the late Stephen Jay Gould, if you played the tape again, you would get a different set of accidents, and hence a different outcome. Therefore life may be rare in the universe, and the way it began and evolved on Earth may be unique.
Other scientists see life's chemical origin and many of its subsequent evolutionary steps as inevitable -- a cosmic imperative. Indeed, much modern research on the origin of life is devoted to showing precisely how living things arose from inanimate matter through the action of the ordinary laws of chemistry and physics. That more deterministic view of life's origin and evolution means scientists are more likely to eventually understand the details of life's emergence, and it includes the testable prediction that similar life-forms exist on many other planets throughout the universe.
It seems to us that the frozen-accident theory of life's origin is at best unsatisfying, and may be unworthy of the scientific way of approaching the world. To say that a natural process is random is, in effect, an act of surrender, something that should be done only as a last resort. If you read the frozen-accident literature carefully, you often get the feeling that what is really being said is: "My friends and I can't figure out why things happened this way, so it must have been random."
Another aspect of the frozen-accident school of thought has unfortunate consequences for the educational system. Random events are, by definition, not reproducible. That makes them disturbingly similar to the unknowable interventions posited by intelligent design. Is there really much difference between irreproducible random events and irreproducible acts of God? We should note, however, that proponents of the frozen-accident theory make no claims of divine intervention, while advocates of intelligent design do move on to theological arguments.
Although both the theories of frozen accident and deterministic origins have their supporters, virtually all scientists who work in the field believe that once living things appeared on our planet, the Darwinian process of natural selection guided their development. There is no disagreement on that point, although there is -- and should be -- vigorous debate on the details of the way natural selection has worked.
Shouldn't we just teach the debates? That is the rallying cry of intelligent-design advocates. Having learned their lesson in Arkansas in 1982, they no longer demand that schools teach the theory of miraculous creation. Instead they say that students should be told that legitimate alternatives to Darwinian evolution exist, and thus biology classes should include the theory of intelligent design.
That argument has an apparent fairness that is hard to resist, especially for academics who believe that, at least in the sciences, subjects should be approached with an open mind and critical thinking. But the idea of "teaching the debate" founders on two points.
First, there really is no debate in the mainstream literature. The vast majority of scientists who study the origin of life accept the idea of nonmiraculous origins without any reservations. Only creationists support the theory of intelligent design.
Second, American students, from kindergarten to university, spend far too little time as it is studying science. We shouldn't teach them about intelligent design for the same reason that we don't teach them that Earth is flat, or that flies are produced by spontaneous generation from rotting meat. It's bad science, and the curriculum has no room for bad science.
Our educational system produces citizens who are ill prepared to deal with a world increasingly dominated by scientific and technological advances. If we were to "teach the debate," what should we remove from the already inadequate curriculum to make room for an idea that has yet to meet even the most rudimentary scientific tests? Should we neglect the environment? Energy? Genetics? Most high-school biology courses devote a pitifully small amount of time to evolution, which is arguably the most important idea in the life sciences. Should we dilute that instruction even further?
The time to discuss altering the curriculum is when the theory of intelligent design reaches the point where it has serious arguments and data to put forward -- to the point, in other words, where there is a significant debate among scientists.
Harold Morowitz, Robert Hazen, and James Trefil are, respectively, the Clarence J. Robinson Professors of biology and natural philosophy, earth sciences, and physics at George Mason University.
Fukuyama, Prominent Neocon at JHU, Speaks Out Against Bush's War
By FRANCIS FUKUYAMA
As we mark four years since Sept. 11, 2001, one way to organize a review of what has happened in American foreign policy since that terrible day is with a question: To what extent has that policy flowed from the wellspring of American politics and culture, and to what extent has it flowed from the particularities of this president and this administration?
It is tempting to see continuity with the American character and foreign policy tradition in the Bush administration's response to 9/11, and many have done so. We have tended toward the forcefully unilateral when we have felt ourselves under duress; and we have spoken in highly idealistic cadences in such times, as well. Nevertheless, neither American political culture nor any underlying domestic pressures or constraints have determined the key decisions in American foreign policy since Sept. 11.
In the immediate aftermath of the 9/11 attacks, Americans would have allowed President Bush to lead them in any of several directions, and the nation was prepared to accept substantial risks and sacrifices. The Bush administration asked for no sacrifices from the average American, but after the quick fall of the Taliban it rolled the dice in a big way by moving to solve a longstanding problem only tangentially related to the threat from Al Qaeda - Iraq. In the process, it squandered the overwhelming public mandate it had received after Sept. 11. At the same time, it alienated most of its close allies, many of whom have since engaged in "soft balancing" against American influence, and stirred up anti-Americanism in the Middle East.
The Bush administration could instead have chosen to create a true alliance of democracies to fight the illiberal currents coming out of the Middle East. It could also have tightened economic sanctions and secured the return of arms inspectors to Iraq without going to war. It could have made a go at a new international regime to battle proliferation. All of these paths would have been in keeping with American foreign policy traditions. But Mr. Bush and his administration freely chose to do otherwise.
The administration's policy choices have not been restrained by domestic political concerns any more than by American foreign policy culture. Much has been made of the emergence of "red state" America, which supposedly constitutes the political base for President Bush's unilateralist foreign policy, and of the increased number of conservative Christians who supposedly shape the president's international agenda. But the extent and significance of these phenomena have been much exaggerated.
So much attention has been paid to these false determinants of administration policy that a different political dynamic has been underappreciated. Within the Republican Party, the Bush administration got support for the Iraq war from the neoconservatives (who lack a political base of their own but who provide considerable intellectual firepower) and from what Walter Russell Mead calls "Jacksonian America" - American nationalists whose instincts lead them toward a pugnacious isolationism.
Happenstance then magnified this unlikely alliance. Failure to find weapons of mass destruction in Iraq and the inability to prove relevant connections between Saddam Hussein and Al Qaeda left the president, by the time of his second inaugural address, justifying the war exclusively in neoconservative terms: that is, as part of an idealistic policy of political transformation of the broader Middle East. The president's Jacksonian base, which provides the bulk of the troops serving and dying in Iraq, has no natural affinity for such a policy but would not abandon the commander in chief in the middle of a war, particularly if there is clear hope of success.
This war coalition is fragile, however, and vulnerable to mishap. If Jacksonians begin to perceive the war as unwinnable or a failure, there will be little future support for an expansive foreign policy that focuses on promoting democracy. That in turn could drive the 2008 Republican presidential primaries in ways likely to affect the future of American foreign policy as a whole.
Are we failing in Iraq? That's still unclear. The United States can control the situation militarily as long as it chooses to remain there in force, but our willingness to maintain the personnel levels necessary to stay the course is limited. The all-volunteer Army was never intended to fight a prolonged insurgency, and both the Army and Marine Corps face manpower and morale problems. While public support for staying in Iraq remains stable, powerful operational reasons are likely to drive the administration to lower force levels within the next year.
With the failure to secure Sunni support for the constitution and splits within the Shiite community, it seems increasingly unlikely that a strong and cohesive Iraqi government will be in place anytime soon. Indeed, the problem now will be to prevent Iraq's constituent groups from looking to their own militias rather than to the government for protection. If the United States withdraws prematurely, Iraq will slide into greater chaos. That would set off a chain of unfortunate events that will further damage American credibility around the world and ensure that the United States remains preoccupied with the Middle East to the detriment of other important regions - Asia, for example - for years to come.
We do not know what outcome we will face in Iraq. We do know that four years after 9/11, our whole foreign policy seems destined to rise or fall on the outcome of a war only marginally related to the source of what befell us on that day. There was nothing inevitable about this. There is everything to be regretted about it.
Francis Fukuyama, a professor of international political economy at the Johns Hopkins School of Advanced International Studies, is editorial board chairman of a new magazine, The American Interest.
Tuesday, August 30
Unpopularity Contest
It's official—George Bush has crossed the Mendoza line. On Friday, Gallup announced that the president's approval has reached a new low of 40 percent, while his disapproval has soared to a new high of 56 percent.
Every second-term president has his eye on the history books—and with these numbers, Bush has secured his place in them. Of the 12 presidents who've served since Gallup started polling in the late 1930s, Bush has entered the ranks of the most unpopular. He's now more unpopular than FDR, Ike, JFK, LBJ, Ford, and Clinton ever were, and has matched the highest disapproval rating of his idol, Ronald Reagan.
Bush's disapproval rose five points in August alone. At his current pace of losing favor, he could speed past two more presidents within the next month: Jimmy Carter, who peaked at 59 percent in mid-1979, and George H.W. Bush, who hit 60 percent in the summer of 1992. That would leave the current Bush just two more men to pass on his way to the top spot: Richard Nixon, who reached 66 percent before resigning in 1974, and Harry Truman, who set Gallup's all-time record at 67 percent in January 1952.
Can Bush break the record? The experts say it's nearly impossible in a political climate so much more polarized than the one the men he's competing against faced. To increase his disapproval ratings among Republicans, Bush would have to lose a war, explode the national debt, or preside over a period of steep moral decline. Moreover, as his friends have learned, it's a lot harder breaking records when you have to do it without steroids.
Bush v. History: But don't count Bush out—he thrives on being told a goal is beyond his reach. The president is an intense competitor and stacks up well against the historical competition:
Reagan was old and amiable; Bush is young, vigorous, and has a smirk in reserve.
Both Carter and Bush 41 were one-term, rookie presidents who had no clear plan to gain disfavor and had to rely entirely on external events going south. Bush 43's chances don't depend on luck: He has a proven strategy to fail at home and abroad.
Nixon had to achieve his disapproval ratings almost entirely through scandal, with little help from the economy or world events. The Bush White House is much more versatile: They won't let scandal distract them from screwing up foreign and domestic policy. Already, 62 percent of Americans believe the country is going in the wrong direction—the highest level in a decade—even before the Bush scandals have begun to take a toll.
Truman might seem tough to beat, because Bush has no popular generals to fire. But Truman had several historic achievements under his belt that kept his unpopularity down, such as winning World War II and presiding over the postwar boom. Bush's record is free of any such ballast. In a pinch, the Bush camp can also make a good case that polling on Truman was notoriously unreliable, and that Bush deserves a share of the modern-day record if he reaches Nixon's level.
The Worst Is Not, So Long as We Can Say, This Is the Worst: Best of all, Bush has a luxury that unpopular presidents before him did not: time. His father and Jimmy Carter made the mistake of losing favor in their first term, assuring that voters would not give them a second one. Nixon committed high crimes and misdemeanors that helped him win a second term but prevented him from completing it. Truman's numbers didn't tank until his last year in office.
George W. Bush still has 41 months to turn the rest of the country against him. In the past 41 months, he has cut his popularity by 40 points—from 80 percent to 40 percent. At that rate, he's on track to set a record for presidential approval that could never be broken: zero.
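Taken at face value, the column's own numbers reduce that projection to a one-line linear extrapolation. A deliberately naive sketch, in the spirit of the joke:

```python
# Naive linear projection of Bush's approval rating, taking the column's own
# numbers at face value: 80% -> 40% over the past 41 months, 41 months to go.
start, current = 80.0, 40.0
months_elapsed = 41
months_left = 41

slope = (current - start) / months_elapsed        # about -0.98 points/month
projected = current + slope * months_left
print(f"Loss rate: {slope:.2f} points/month")
print(f"Projected approval at end of term: {max(projected, 0):.0f}%")  # ~0%
```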
Friday, August 26
Economy
For the last few months there has been a running debate about the U.S. economy, more or less like this:
American families: "We're not doing very well."
Washington officials: "You're wrong - you're doing great. Here, look at these statistics!"
The administration and some political commentators seem genuinely puzzled by polls showing that Americans are unhappy about the economy. After all, they point out, numbers like the growth rate of G.D.P. look pretty good. So why aren't people cheering? But when your numbers tell you that people should be feeling good and they aren't, you're looking at the wrong numbers.
American families don't care about G.D.P. They care about whether jobs are available, how much those jobs pay and how that pay compares with the cost of living. And recent G.D.P. growth has failed to produce exceptional gains in employment, while wages for most workers haven't kept up with inflation.
About employment: it's true that the economy finally started adding jobs two years ago. But although many people say "four million jobs in the last two years" reverently, as if it were an amazing achievement, it's actually a rise of about 3 percent, not much faster than the growth of the working-age population over the same period. And recent job growth would have been considered subpar in the past: employment grew more slowly during the best two years of the Bush administration than in any two years during the Clinton administration.
Some commentators dismiss concerns about gasoline prices, because those prices are still below previous peaks when you adjust for inflation. But that misses the point: Americans bought cars and made decisions about where to live when gas was $1.50 or less per gallon, and now suddenly find themselves paying $2.60 or more. That's a rude shock, which I estimate raises the typical family's expenses by more than $900 a year.
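The "more than $900 a year" figure is a simple price-times-volume estimate. A minimal sketch, assuming a hypothetical household consumption of about 850 gallons a year (the column does not state the figure it used):

# Back-of-the-envelope check on the extra annual gasoline cost.
# Prices are from the column; gallons_per_year is an assumed, illustrative figure.
old_price = 1.50        # dollars per gallon
new_price = 2.60        # dollars per gallon
gallons_per_year = 850  # hypothetical typical-household consumption

extra_cost = (new_price - old_price) * gallons_per_year
print(f"Extra annual gasoline cost: about ${extra_cost:,.0f}")  # roughly $935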
You may ask where economic growth is going, if it isn't showing up in wages. That's easy to answer: it's going to corporate profits, to rising health care costs and to a surge in the salaries and other compensation of executives. (Forbes reports that the combined compensation of the chief executives of America's 500 largest companies rose 54 percent last year.)
The bottom line, then, is that most Americans have good reason to feel unhappy about the economy, whatever Washington's favorite statistics may say. This is an economic expansion that hasn't trickled down; many people are worse off than they were a year ago.
Greenspan's speech reflects caution
In the first of two speeches at a Fed symposium about the "Greenspan legacy," Mr. Greenspan warned that investors as well as ordinary consumers were being too complacent in assuming that high interest rates, high inflation and sluggish growth in productivity are all things of the past.
Mr. Greenspan took particular aim at the willingness of investors to pay ever-higher prices for stocks and bonds and the torrid run-up in housing prices, as well as a greater willingness of people to spend money on the basis of increases in their apparent wealth rather than gains in their actual incomes.
Both trends reflect a willingness to accept unusually low "risk premiums," to pay top dollar for bonds or real estate and to accept a relatively low rate of return because people feel more secure and more tranquil about the long-run future of the economy.
Mr. Greenspan noted that big increases in wealth over the past decade stemmed in large part from the run-up in prices for assets ranging from bonds to houses, but he said such increases could evaporate if investors suddenly became more worried about risk.
"Such an increase in market value is too often viewed by market participants as structural and permanent," Mr. Greenspan said. But he warned that what people perceive as an abundance of new wealth "can readily disappear."
"Any onset of increased investor caution elevates risk premiums and, as a consequence, lowers asset values and promotes the liquidation of the debt that supported higher asset prices," Mr. Greenspan said. "This is the reason that history has not dealt kindly with the aftermath of protracted periods of low risk premiums."
Though the Fed chairman kept his remarks general, they harkened back to the mounting concern among policymakers and many outside economists about the possibility of a dangerous bubble in housing that could abruptly deflate and might send the economy into a tailspin.
Thursday, August 25
Justice Roberts' Scorn for Women?
From Slate.com's jurisprudence reporter Dahlia Lithwick:
What's most startling about Roberts' writings isn't always the substance. Some of the policy ideas he rejected—like that of paying "comparable worth" for traditionally female jobs—may have deserved the scorn he evinced. What's truly shocking is his dismissive tone, which seemed to surprise even ultraconservative Phyllis Schlafly, who described it yesterday as "smart alecky." Gender disparities are invariably "perceived" or "purported," in Roberts' eyes. Every effort to solve them is laughable. At a moment when serious inequities in women's wages, employment, and opportunities existed in this country, Roberts seemed to dismiss every attempt to remedy them as a knock-knock joke.
In a 1985 memo about whether a government lawyer could be nominated for an award program honoring women who changed professions after age 30, Roberts wrote and approved the nomination but added: "Some might question whether encouraging homemakers to become lawyers contributes to the common good, but I suppose that is for the judges to decide."
Another memo has Roberts blasting the proposed Equal Rights Amendment, dismissing it as an attempt to "bridge the purported 'gender gap.'"
Pile these memos onto what we already know of Roberts and gender issues: that later, as a lawyer in private practice, Roberts would argue for narrowing the scope of Title IX—the statute that bars gender discrimination at any school receiving federal funding.
Elliot Mincberg, senior vice president of People for the American Way, told the Chicago Tribune today, "You do see a real clear lack of regard for—and even it could be argued, hostility toward—laws and theories and arguments that would promote equality for women in important ways."
And Kim Gandy, president of NOW, fumed in the same paper: "I don't see Roberts' positions as conservative. ... I know a lot of conservatives who expect women to be paid fairly, who think women should become lawyers if they want to be lawyers. That is not a conservative position, that is a Neanderthal position. It's unfair to conservatives to call the positions he takes conservative."
Finally, there's the humorless-feminist tack. I vaguely remember this argument from the '80s: It's that women can't take a joke. So that is the new defense: This wasn't just a joke, it was a lawyer joke! That's evidently the White House position, too: "It's pretty clear from the more than 60,000 pages of documents that have been released that John Roberts has a great sense of humor," Steve Schmidt, a Bush spokesman, told the Washington Post. "In this [housewives] memo, he offers a lawyer joke."
Heh-heh.
I don't quite know what to make of that argument. It brings me back to Bruce Reed's giggling blondes. The problem isn't with his desperate housewives (or hideous lawyers) crack, but with his relentless "Gidget sucks" tone. Roberts honestly seemed to think that humor or disdain were the only appropriate ways to think about gender. It's not that feminists can't take a joke. It's that Roberts can't seem to take feminists seriously.
The record seems to make it quite clear that Roberts—with his "perceived/purported/alleged" discrimination trope—simply didn't believe that gender problems were worthy of his serious consideration or scrutiny.
The emerging picture of Roberts is of a man deeply skeptical about federal efforts to equalize opportunity for women or minorities, be it through busing, housing, voting rights, or affirmative-action programs. He was even more skeptical of judicial efforts to engage in the same project. And that's a legitimate, if debatable, political theory. But if, as the memos suggest, Roberts' ideological views are the result of being too smart-alecky or dismissive to accept that these disparities were of serious national concern in the first place, he doesn't just have a gender problem. He has a reality problem.
Wednesday, August 17
Unrealities
The Bush administration is significantly lowering expectations of what can be achieved in Iraq, recognizing that the United States will have to settle for far less progress than originally envisioned during the transition due to end in four months, according to U.S. officials in Washington and Baghdad.
The United States no longer expects to see a model new democracy, a self-supporting oil industry or a society in which the majority of people are free from serious security or economic challenges, U.S. officials say.
"What we expected to achieve was never realistic given the timetable or what unfolded on the ground," said a senior official involved in policy since the 2003 invasion. "We are in a process of absorbing the factors of the situation we're in and shedding the unreality that dominated at the beginning."
Saturday, August 13
Someone Tell the President the War Is Over
August 14, 2005
By FRANK RICH
LIKE the Japanese soldier marooned on an island for years after V-J Day, President Bush may be the last person in the country to learn that for Americans, if not Iraqis, the war in Iraq is over. "We will stay the course," he insistently tells us from his Texas ranch. What do you mean we, white man?
A president can't stay the course when his own citizens (let alone his own allies) won't stay with him. The approval rate for Mr. Bush's handling of Iraq plunged to 34 percent in last weekend's Newsweek poll - a match for the 32 percent that approved L.B.J.'s handling of Vietnam in early March 1968. (The two presidents' overall approval ratings have also converged: 41 percent for Johnson then, 42 percent for Bush now.) On March 31, 1968, as L.B.J.'s ratings plummeted further, he announced he wouldn't seek re-election, commencing our long extrication from that quagmire.
But our current Texas president has even outdone his predecessor; Mr. Bush has lost not only the country but also his army. Neither bonuses nor fudged standards nor the faking of high school diplomas has solved the recruitment shortfall. Now Jake Tapper of ABC News reports that the armed forces are so eager for bodies they will flout "don't ask, don't tell" and hang on to gay soldiers who tell, even if they tell the press.
The president's cable cadre is in disarray as well. At Fox News Bill O'Reilly is trashing Donald Rumsfeld for his incompetence, and Ann Coulter is chiding Mr. O'Reilly for being a defeatist. In an emblematic gesture akin to waving a white flag, Robert Novak walked off a CNN set and possibly out of a job rather than answer questions about his role in smearing the man who helped expose the administration's prewar inflation of Saddam W.M.D.'s. (On this sinking ship, it's hard to know which rat to root for.)
As if the right-wing pundit crackup isn't unsettling enough, Mr. Bush's top war strategists, starting with Mr. Rumsfeld and Gen. Richard Myers, have of late tried to rebrand the war in Iraq as what the defense secretary calls "a global struggle against violent extremism." A struggle is what you have with your landlord. When the war's über-managers start using euphemisms for a conflict this lethal, it's a clear sign that the battle to keep the Iraq war afloat with the American public is lost.
That battle crashed past the tipping point this month in Ohio. There's historical symmetry in that. It was in Cincinnati on Oct. 7, 2002, that Mr. Bush gave the fateful address that sped Congressional ratification of the war just days later. The speech was a miasma of self-delusion, half-truths and hype. The president said that "we know that Iraq and Al Qaeda have had high-level contacts that go back a decade," an exaggeration based on evidence that the Senate Intelligence Committee would later find far from conclusive. He said that Saddam "could have a nuclear weapon in less than a year" were he able to secure "an amount of highly enriched uranium a little larger than a single softball." Our own National Intelligence Estimate of Oct. 1 quoted State Department findings that claims of Iraqi pursuit of uranium in Africa were "highly dubious."
It was on these false premises - that Iraq was both a collaborator on 9/11 and about to inflict mushroom clouds on America - that honorable and brave young Americans were sent off to fight. Among them were the 19 marine reservists from a single suburban Cleveland battalion slaughtered in just three days at the start of this month. As they perished, another Ohio marine reservist who had served in Iraq came close to winning a Congressional election in southern Ohio. Paul Hackett, a Democrat who called the president a "chicken hawk," received 48 percent of the vote in exactly the kind of bedrock conservative Ohio district that decided the 2004 election for Mr. Bush.
These are the tea leaves that all Republicans, not just Chuck Hagel, are reading now. Newt Gingrich called the Hackett near-victory "a wake-up call." The resolutely pro-war New York Post editorial page begged Mr. Bush (to no avail) to "show some leadership" by showing up in Ohio to salute the fallen and their families. A Bush loyalist, Senator George Allen of Virginia, instructed the president to meet with Cindy Sheehan, the mother camping out in Crawford, as "a matter of courtesy and decency." Or, to translate his Washingtonese, as a matter of politics. Only someone as adrift from reality as Mr. Bush would need to be told that a vacationing president can't win a standoff with a grief-stricken parent commandeering TV cameras and the blogosphere 24/7.
Such political imperatives are rapidly bringing about the war's end. That's inevitable for a war of choice, not necessity, that was conceived in politics from the start. Iraq was a Bush administration idée fixe before there was a 9/11. Within hours of that horrible trauma, according to Richard Clarke's "Against All Enemies," Mr. Rumsfeld was proposing Iraq as a battlefield, not because the enemy that attacked America was there, but because it offered "better targets" than the shadowy terrorist redoubts of Afghanistan. It was easier to take out Saddam - and burnish Mr. Bush's credentials as a slam-dunk "war president," suitable for a "Top Gun" victory jig - than to shut down Al Qaeda and smoke out its leader "dead or alive."
But just as politics are a bad motive for choosing a war, so they can be a doomed engine for running a war. In an interview with Tim Russert early last year, Mr. Bush said, "The thing about the Vietnam War that troubles me, as I look back, was it was a political war," adding that the "essential" lesson he learned from Vietnam was to not have "politicians making military decisions." But by then Mr. Bush had disastrously ignored that very lesson; he had let Mr. Rumsfeld publicly rebuke the Army's chief of staff, Eric Shinseki, after the general dared tell the truth: that several hundred thousand troops would be required to secure Iraq. To this day it's our failure to provide that security that has turned the country into the terrorist haven it hadn't been before 9/11 - "the central front in the war on terror," as Mr. Bush keeps reminding us, as if that might make us forget he's the one who recklessly created it.
The endgame for American involvement in Iraq will be of a piece with the rest of this sorry history. "It makes no sense for the commander in chief to put out a timetable" for withdrawal, Mr. Bush declared on the same day that 14 of those Ohio troops were killed by a roadside bomb in Haditha. But even as he spoke, the war's actual commander, Gen. George Casey, had already publicly set a timetable for "some fairly substantial reductions" to start next spring. Officially this calendar is tied to the next round of Iraqi elections, but it's quite another election this administration has in mind. The priority now is less to save Jessica Lynch (or Iraqi democracy) than to save Rick Santorum and every other endangered Republican facing voters in November 2006.
Nothing that happens on the ground in Iraq can turn around the fate of this war in America: not a shotgun constitution rushed to meet an arbitrary deadline, not another Iraqi election, not higher terrorist body counts, not another battle for Falluja (where insurgents may again regroup, The Los Angeles Times reported last week). A citizenry that was asked to accept tax cuts, not sacrifice, at the war's inception is hardly in the mood to start sacrificing now. There will be neither the volunteers nor the money required to field the wholesale additional American troops that might bolster the security situation in Iraq.
WHAT lies ahead now in Iraq instead is not victory, which Mr. Bush has never clearly defined anyway, but an exit (or triage) strategy that may echo Johnson's March 1968 plan for retreat from Vietnam: some kind of negotiations (in this case, with Sunni elements of the insurgency), followed by more inflated claims about the readiness of the local troops-in-training, whom we'll then throw to the wolves. Such an outcome may lead to even greater disaster, but this administration long ago squandered the credibility needed to make the difficult case that more human and financial resources might prevent Iraq from continuing its descent into civil war and its devolution into jihad central.
Thus the president's claim on Thursday that "no decision has been made yet" about withdrawing troops from Iraq can be taken exactly as seriously as the vice president's preceding fantasy that the insurgency is in its "last throes." The country has already made the decision for Mr. Bush. We're outta there. Now comes the hard task of identifying the leaders who can pick up the pieces of the fiasco that has made us more vulnerable, not less, to the terrorists who struck us four years ago next month.
Thursday, August 11
The Orange Lounge
The Orange County Museum of Art's Orange Lounge at South Coast Plaza is the first museum space on the West Coast to regularly present digital art and video, and the only space of its kind in the United States located in a major retail complex.
The Orange Lounge presents solo and group exhibitions and special events featuring artist demonstrations, performances, and discussions that investigate the impact of digital media on contemporary visual culture. From real-time computer-generated moving images, to room-sized video installations, to artworks based strictly on the Web, the Orange Lounge is a dynamic site for exploring and interacting with new media art. Catalogues, CDs and other products related to video and new media are available in the Orange Lounge.
Exhibition Sponsors
The Orange Lounge's design and construction were funded by the Segerstrom Foundation. Major support for Orange Lounge programs is provided by The James Irvine Foundation, and technical support is provided by Integrated Media Systems.
Evolution vs. Religion
Quit pretending they're compatible.
By Jacob Weisberg, Slate.com
President Bush used to be content to revel in his own ignorance. Now he wants to share it with America's schoolchildren.
I refer to his recent comments in favor of teaching "intelligent design" alongside evolution. "Both sides ought to be properly taught … so people can understand what the debate is about," Bush told a group of Texas newspaper reporters who interviewed him on Aug. 1. "Part of education is to expose people to different schools of thought."
The president seems to view the conflict between evolutionary theory and intelligent design as something like the debate over Social Security reform. But this is not a disagreement with two reasonable points of view, let alone two equally valid ones. Intelligent design, which asserts that gaps in evolutionary science prove God must have had a role in creation, may be—as Bob Wright argues—creationism in camouflage. Or it may be—as William Saletan argues—a step in the creationist cave-in to evolution. But whatever it represents, intelligent design is a faith-based theory with no scientific validity or credibility.
If Bush had said schools should give equal time to the view that the Sun revolves around the Earth, or that smoking doesn't cause lung cancer, he'd have been laughed out of his office. The difference with evolution is that a large majority of Americans reject what scientists regard as equally well supported: that we're here because of random mutation and natural selection. According to the most recent Gallup poll on the subject (2004), 45 percent of Americans believe God created human beings in their present form 10,000 years ago, while another 38 percent believe that God directed the process of evolution. Only 13 percent accept the prevailing scientific view of evolution as an unguided, random process.
Being right and yet so unpopular presents an interesting problem for evolutionists. Their theory has won over the world scientific community but very few of the citizens of red-state America, who decide what gets taught in their own public schools. How can followers of Darwin prevent the propagation of ignorance in places like Kansas, whose board of education just voted to rewrite its biology curriculum to do what President Bush suggests?
Many biologists believe the answer is to present evolution as less menacing to religious belief than it really is. In much the same way that intelligent-design advocates try to assert that a creator must be compatible with evolution in order to shoehorn God into science classrooms, evolutionists claim Darwin is compatible with religion in order to keep God out. Don't worry, they insist, there's no conflict between evolution and religion—they simply belong to different realms. Evolution should be taught in the secular classroom, along with other hypotheses that can be verified or falsified. Intelligent design belongs in Sunday schools, with stuff that can't.
This was the soothing contention of the famed paleontologist Stephen Jay Gould, who argued that science and religion were separate "magisteria," or domains of teaching. The theme appears frequently in statements by major scientific organizations and wherever fundamentalists try to force creationism or its descendants on local school boards. Here, for instance, is the official position of Kansas Citizens for Science, the group opposing the inclusion of intelligent design in the state's science curricula: "People of faith do not have to choose between science and religion. Science is neither anti-Christian nor anti-God. Science denies neither God nor creation. Science merely looks for natural evidence of how the universe got to its current state. If viewed theistically, science is not commenting on whether there was a creation, but could be viewed as trying to find out how it happened."
In a state like Kansas, where public opinion remains overwhelmingly hostile to evolution, one sees the political logic of this kind of tap-dance. But let's be serious: Evolutionary theory may not be incompatible with all forms of religious belief, but it surely does undercut the basic teachings and doctrines of the world's great religions (and most of its not-so-great ones as well). Look at this 1993 NORC survey: In the United States, 63 percent of the public believed in God and 35 percent believed in evolution. In Great Britain, by comparison, 24 percent of people believed in God and 77 percent believed in evolution. You can believe in both—but not many people do.
That evolution erodes religious belief seems almost too obvious to require argument. It destroyed the faith of Darwin himself, who moved from Christianity to agnosticism as a result of his discoveries and was immediately recognized as a huge threat by his reverent contemporaries. In reviewing The Origin of Species in 1860, Samuel Wilberforce, the bishop of Oxford, wrote that the religious view of man as a creature with free will was "utterly irreconcilable with the degrading notion of the brute origin of him who was created in the image of God." (The passage is quoted in Daniel C. Dennett's superb book Darwin's Dangerous Idea.)
Cardinal Christoph Schonborn, the archbishop of Vienna, was saying nothing very different when he argued in a New York Times op-ed piece on July 7 that random evolution can't be harmonized with Catholic doctrine. To be sure, there are plenty of scientists who believe in God, and even Darwinists who call themselves Christians. But the acceptance of evolution diminishes religious belief in aggregate for a simple reason: It provides a better answer to the question of how we got here than religion does. Not a different answer, a better answer: more plausible, more logical, and supported by an enormous body of evidence. Post-Darwinian evolutionary theory, which can explain the emergence of the first bacteria, doesn't even leave much room for a deist God whose minimal role might have been to flick the first switch.
So, what should evolutionists and their supporters say to parents who don't want their children to become atheists and who may even hold firm to the virgin birth and the parting of the Red Sea? That it's time for them to finally let go of their quaint superstitions? That Darwinists aren't trying to push people away from religion but recognize that teaching their views does tend to have that effect? Dennett notes that Darwin himself avoided exploring the issue of the ultimate origins of life in part to avoid upsetting his wife Emma's religious beliefs.
One possible avenue is to focus more strongly on the practical consequences of resisting scientific reality. In a world where Koreans are cloning dogs, can the U.S. afford—ethically or economically—to raise our children on fraudulent biology? But whatever tack they take, evolutionists should quit pretending their views are no threat to believers. This insults our intelligence, and the president is doing that already.
Wednesday, August 10
Reading in America Declines Yet Again?
The number of Americans who read literature has dropped by 10 percent over the past decade, according to a survey released in 2004 by the National Endowment for the Arts. That loss of 20 million readers means that, for most Americans, "literature has no real existence," says Mark Bauerlein, editor of Forum, in a response to the arts endowment's report.
The response, in a journal published by the Association of Literary Scholars and Critics, includes comments from 12 scholars from across the country.
Of particular interest to humanities professors, writes Mr. Bauerlein, is that the drop increased to 17 percent among 18- to 24-year-olds. The dip "signifies a vast shift in youth culture," he says, and it is clear "the Internet, videogames, instant messaging, etc., have contributed to this decline."
Moreover, the decline in young readers has led to a shift in resources from literature programs to curricula dedicated to popular culture, film, and media and technology studies, writes Mr. Bauerlein.
To scholars taking part in the response, the implications of the survey's findings are far-reaching.
"It is apparent that my students do not know as much as they should know or as they need to know," writes Paul Voss, an associate professor of Renaissance literature at Georgia State University. "Specifically," he says, "they do not know how to read in the full sense of that term. More dangerously, they don't care to read. Apathy, in this case, is a far greater problem than ignorance."
The NEA report "is a powerful contribution to what is developing into a serious debate over the relation of reading to technology," writes Sharon Alusow Hart, a literature professor at East Tennessee State University.
But, while preserving readership in the face of technology may seem daunting, now is no time to back away from the challenge, writes Mr. Bauerlein. Like it or not, he says, the survival of literary expression "is no longer a departmental concern or a canon question. It is a public concern."
The article, "Reading at Risk: a Forum," is available online at http://www.bu.edu/literary/forum/onlineform.html
The NEA report, "Reading at Risk: A Survey of Literary Reading in America," is available at http://www.nea.gov/news/news04/ReadingAtRisk.html
Sunday, August 7
The Xbox Auteurs
August 7, 2005
By CLIVE THOMPSON
Like many young hipsters in Austin, Tex., Michael Burns wanted to make it big in some creative field -- perhaps writing comedy scripts in Hollywood. Instead, he wound up in a dead-end job, managing a call center. To kill time, he made friends with a group of equally clever and bored young men at the company where he worked, and they'd sit around talking about their shared passion: video games. Their favorite title was Halo, a best-selling Xbox game in which players control armor-clad soldiers as they wander through gorgeous coastal forests and grim military bunkers and fight an army of lizardlike aliens. Burns and his gang especially loved the "team versus team" mode, which is like a digital version of paint ball: instead of fighting aliens, players hook their Xboxes to the Internet, then log on together in a single game, at which point they assemble into two teams -- red-armored soldiers versus blue-armored ones. Instead of shooting aliens, they try to slaughter one another, using grenades, machine guns and death rays. On evenings and weekends, Burns and his friends would cluster around their TV's until the wee hours of the morning, gleefully blowing one another to pieces.
"Halo is like crack," Burns recalls thinking. "I could play it until I die."
Whenever a friend discovered a particularly cool stunt inside Halo -- for example, obliterating an enemy with a new type of grenade toss -- Burns would record a video of the stunt for posterity. (His friend would perform the move after Burns had run a video cord from his TV to his computer, so he could save it onto his hard drive.) Then he'd post the video on a Web site to show other gamers how the trick was done. To make the videos funnier, sometimes Burns would pull out a microphone and record a comedic voice-over, using video-editing software to make it appear as if the helmeted soldier himself were doing the talking.
Then one day he realized that the videos he was making were essentially computer-animated movies, almost like miniature emulations of "Finding Nemo" or "The Incredibles." He was using the game as a personal Pixar studio. He wondered: Could he use it to create an actual movie or TV series?
Burns's group decided to give it a shot. They gathered around the Xbox at Burns's apartment, manipulating their soldiers like tiny virtual actors, bobbing their heads to look as if they were deep in conversation. Burns wrote sharp, sardonic scripts for them to perform. He created a comedy series called ''Red vs. Blue,'' a sort of sci-fi version of ''M*A*S*H.'' In ''Red vs. Blue,'' the soldiers rarely do any fighting; they just stand around insulting one another and musing over the absurdities of war, sounding less like patriotic warriors than like bored, clever video-store clerks. The first 10-minute episode opened with a scene set in Halo's bleakest desert canyon. Two red soldiers stood on their base, peering at two blue soldiers far off in the distance, and traded quips that sounded almost like a slacker disquisition on Iraq:
Red Soldier: ''Why are we out here? Far as I can tell, it's just a box canyon in the middle of nowhere, with no way in or out. And the only reason we set up a red base here is because they have a blue base there. And the only reason they have a blue base over there is because we have a red base here.''
When they were done, they posted the episode on their Web site (surreptitiously hosted on computers at work). They figured maybe a few hundred people would see it and get a chuckle or two.
Instead, ''Red vs. Blue'' became an instant runaway hit on geek blogs, and within a single day, 20,000 people stampeded to the Web site to download the file. The avalanche of traffic crashed the company server. ''My boss came into the office and was like, 'What the hell is going on?' '' Burns recalls. ''I looked over at the server, and it was going blink, blink, blink.''
Thrilled, Burns and his crew quickly cranked out another video, then another. They kept up a weekly production schedule, and after a few months, ''Red vs. Blue'' had, like some dystopian version of ''Friends,'' become a piece of appointment viewing. Nearly a million people were downloading each episode every Friday, writing mash notes to the creators and asking if they could buy a DVD of the collected episodes. Mainstream media picked up on the phenomenon. The Village Voice described it as '' 'Clerks' meets 'Star Wars,' '' and the BBC called it ''riotously funny'' and said it was ''reminiscent of the anarchic energy of 'South Park.' '' Burns realized something strange was going on. He and his crew had created a hit comedy show -- entirely inside a video game.
Video games have not enjoyed good publicity lately. Hillary Clinton has been denouncing the violence in titles like Grand Theft Auto, which was yanked out of many stores last month amid news that players had unlocked sex scenes hidden inside. Yet when they're not bemoaning the virtual bloodshed, cultural pundits grudgingly admit that today's games have become impressively cinematic. It's not merely that the graphics are so good: the camera angles inside the games borrow literally from the visual language of film. When you're playing Halo and look up at the sun, you'll see a little ''lens flare,'' as if you were viewing the whole experience through the eyepiece of a 16-millimeter Arriflex. By using the game to actually make cinema, Burns and his crew flipped a switch that neatly closed a self-referential media loop: movies begat games that begat movies.
And Burns and his crew aren't alone. Video-game aficionados have been creating ''machinima'' -- an ungainly term mixing ''machine'' and ''cinema'' and pronounced ma-SHEEN-i-ma -- since the late 90's. ''Red vs. Blue'' is the first to break out of the underground, and now corporations like Volvo are hiring machinima artists to make short promotional films, while MTV, Spike TV and the Independent Film Channel are running comedy shorts and music videos produced inside games. By last spring, Burns and his friends were making so much money from ''Red vs. Blue'' that they left their jobs and founded Rooster Teeth Productions. Now they produce machinima full time.
It may be the most unlikely form of indie filmmaking yet -- and one of the most weirdly democratic. ''It's like 'The Blair Witch Project' all over again, except you don't even need a camera,'' says Julie Kanarowski, a product manager with Electronic Arts, the nation's largest video-game publisher. ''You don't even need actors.''
Back in college, Burns and another Rooster Teeth founder, Matt Hullum, wrote and produced a traditional live-action indie movie. It cost $9,000, required a full year to make and was seen by virtually no one. By contrast, the four Xboxes needed to make ''Red vs. Blue'' cost a mere $600. Each 10-minute episode requires a single day to perform and edit and is viewed by hordes of feverish video-game fans the planet over.
More than just a cheap way to make an animated movie, machinima allows game players to comment directly on the pop culture they so devotedly consume. Much like ''fan fiction'' (homespun tales featuring popular TV characters) or ''mash-ups'' (music fans blending two songs to create a new hybrid), machinima is a fan-created art form. It's what you get when gamers stop blasting aliens for a second and start messing with the narrative.
And God knows, there's plenty to mess with. These days, the worlds inside games are so huge and open-ended that gamers can roam anywhere they wish. Indeed, players often abandon the official goal of the game -- save the princess; vanquish the eldritch forces of evil -- in favor of merely using the virtual environment as a gigantic jungle gym. In one popular piece of Halo machinima, ''Warthog Jump,'' a player cunningly used the game to conduct a series of dazzling physics experiments. He placed grenades in precise locations beneath jeeps and troops, such that when the targets blew sky-high, they pinwheeled through the air in precise formations, like synchronized divers. Another gamer recorded a machinima movie that poked subversive fun at Grand Theft Auto. Instead of playing as a dangerous, cop-killing gangster, the player pretended he was a naive Canadian tourist -- putting down his gun, dressing in tacky clothes and simply wandering around the game's downtown environment for hours, admiring the scenery.
So what's it like to actually shoot a movie inside a game? In June, I visited the Rooster Teeth offices in Buda, Tex., a tiny Austin suburb, to observe Burns and his group as they produced a scene of ''Red vs. Blue.'' Burns, a tall, burly 32-year-old, sat in front of two huge flat-panel screens, preparing the editing software. Nearby were the two Rooster Teeth producers who would be acting on-screen: Geoff Ramsey, a scraggly-bearded 30-year-old whose arms are completely covered in tattoos of fish and skulls, and Gustavo Sorola, a gangly 27-year-old who sprawled in a beanbag chair and peered through his thick architect glasses at the day's e-mail. They were fan letters, Sorola told me, that pour in from teenagers who are as enthusiastic as they are incoherent. ''The way kids write these days,'' he said with a grimace. ''It's like someone threw up on the keyboard.''
In the script they were acting out that day, a pair of ''Red vs. Blue'' soldiers engaged in one of their typically pointless existential arguments, bickering over whether it's possible to kill someone with a toy replica of a real weapon. The Rooster Teeth crew recorded the voice-overs earlier in the day; now they were going to create the animation for the scene.
Burns picked up a controller and booted up Halo on an Xbox. He would act as the camera: whatever his character saw would be recorded, from his point of view. Then Sorola and Ramsey logged into the game, teleporting in as an orange-suited and a red-suited soldier. Burns posed them near a massive concrete bunker and frowned as he scrutinized the view on the computer screen. ''Hmmmm,'' he muttered. ''We need something to frame you guys -- some sort of prop.'' He ran his character over to a nearby alien hovercraft, jumped in and parked it next to the actors. ''Sweet!'' he said. ''I like it!''
In a ''Red vs. Blue'' shoot, the actors all must follow one important rule: Be careful not to accidentally kill another actor. ''Sometimes you'll drop your controller and it unintentionally launches a grenade. It takes, like, 20 minutes for the blood splatters to dry up,'' Ramsey said. ''Totally ruins the scene.''
Finally, Burns was ready to go. He shouted, ''Action!'' and the voice-overs began playing over loudspeakers. Sorola and Ramsey acted in time with the dialogue. Acting, in this context, was weirdly minimalist. They mashed the controller joysticks with their thumbs, bobbing the soldiers' heads back and forth roughly in time with important words in each line. ''It's puppetry, basically,'' Ramsey said, as he jiggled his controller. Of all the ''Red vs. Blue'' crew members, Ramsey is renowned for his dexterity with an Xbox. When a scene calls for more than five actors onstage, he'll put another controller on the ground and manipulate it with his right foot, allowing him to perform as two characters simultaneously.
As I watched, I was reminded of what initially cracked me up so much about ''Red vs. Blue'': the idea that faceless, anonymous soldiers in a video game have interior lives. It's a ''Rosencrantz and Guildenstern'' conceit; ''Red vs. Blue'' is what the game characters talk about when we're not around to play with them. As it turns out, they're a bunch of neurotics straight out of ''Seinfeld.'' One recruit reveals that he chain-smokes inside his airtight armor; a sergeant tells a soldier his battle instructions are to ''scream like a woman.'' And, in a sardonic gloss on the game's endless carnage, none of the soldiers have the vaguest clue why they're fighting.
Yet as I discovered, real-life soldiers are among the most ardent fans of ''Red vs. Blue.'' When I walked around the Rooster Teeth office, I found it was festooned with letters, plaques and an enormous American flag, gifts from grateful American troops, many of whom are currently stationed in Iraq. Isn't it a little astonishing, I asked Burns when the crew went out in the baking Texas sun for a break, that actual soldiers are so enamored of a show that portrays troops as inept cowards, leaders as cynical sociopaths and war itself as a supremely meaningless endeavor? Burns laughed, but said the appeal was nothing sinister.
'' 'Red vs. Blue' is about downtime,'' he said. ''There's very little action, which is precisely the way things are in real life.''
''He's right,'' Ramsey added. He himself spent five years in the army after high school. ''We'd just sit around digging ditches and threatening to kill each other all day long,'' he said. ''We were bored out of our minds.''
Perhaps the most unusual thing about machinima is that none of its creators are in jail. After all, they're gleefully plundering intellectual property at a time when the copyright wars have become particularly vicious. Yet video-game companies have been upbeat -- even exuberant -- about the legions of teenagers and artists pillaging their games. This is particularly bewildering in the case of ''Red vs. Blue,'' because Halo is made by Bungie, a subsidiary of Microsoft, a company no stranger to using a courtroom to defend its goods. What the heck is going on?
As it turns out, people at Bungie love ''Red vs. Blue.'' ''We thought it was kind of brilliant,'' says Brian Jarrard, the Bungie staff member who manages interactions with fans. ''There are people out there who would never have heard about Halo without 'Red vs. Blue.' It's getting an audience outside the hardcore gaming crowd.''
Sure, Rooster Teeth ripped off Microsoft's intellectual property. But Microsoft got something in return: ''Red vs. Blue'' gave the game a whiff of countercultural coolness, the sort of grass-roots street cred that major corporations desperately crave but can never manufacture. After talking with Rooster Teeth, Microsoft agreed, remarkably, to let them use the game without paying any licensing fees at all. In fact, the company later hired Rooster Teeth to produce ''Red vs. Blue'' videos to play as advertisements in game stores. Microsoft has been so strangely solicitous that when it was developing the sequel to Halo last year, the designers actually inserted a special command -- a joystick button that makes a soldier lower his weapon -- designed solely to make it easier for Rooster Teeth to do dialogue.
''If you're playing the game, there's no reason to lower your weapon at all,'' Burns explained. ''They put that in literally just so we can shoot machinima.''
Other game companies have gone even further. Many now include editing software with their games, specifically to encourage fans to shoot movies. When Valve Software released its hit game Half-Life 2 last year, it included ''Faceposer'' software so that machinima creators could tweak the facial expressions of characters. When the Sims 2 -- a sequel to the top-selling game of all time -- came out last year, its publisher, Electronic Arts, set up a Web site so that fans could upload their Sims 2 movies to show to the world. (About 8,000 people so far have done so.)
Still, it's one thing for gamers to produce a jokey comedy or a music video. Can machinima actually produce a work of art -- something with serious emotional depth? A few people have tried. In China, a visual artist named Feng Mengbo used the first-person-shooter game Quake III to produce Q4U, in which the screen is filled with multiple versions of himself, killing one another. Players' relationship with constant, blood-splattering violence is a common subject in game art. Last year, the 31-year-old artist Brody Condon produced an unsettling film that consisted of nothing but shots of himself committing suicide inside 50 different video games.
''I try to come to terms with what taking your life means in these games,'' Condon says. ''I'm trying to understand, spiritually, your relationship with an avatar on the screen.''
But even machinima's biggest fans admit that the vast majority of machinima is pretty amateurish. ''It's like if some friends of mine all broke into a movie set, and we all got to use all the cameras and special-effects equipment,'' says Carl Goodman, director of digital media at the American Museum of the Moving Image, which began to hold an annual machinima festival two years ago. ''We wouldn't quite know how to use it, but we'd make some pretty interesting stuff.''
Yet as Goodman points out, there's a competing proposition. Machinima does not always strive to emulate ''realistic,'' artistic movies. On the contrary, it is often explicitly devoted to celebrating the aesthetics of games -- the animations and in-jokes, the precise physics. Most machinima is probably meaningless to people who don't play games, much as ESPN is opaque to anyone who doesn't watch sports. But for those who do play Halo, it was genuinely thrilling to see something like ''Warthog Jump,'' with its meticulously synchronized explosions.
The Rooster Teeth crew has its own hilariously stringent rule for making machinima: no cheating. When they shoot ''Red vs. Blue,'' they do not use any special effects that are not organically included in the game; everything you see in an episode of ''Red vs. Blue'' could in theory have taken place during an actual game of Halo, played by a fan in his bedroom. It's a charmingly purist attitude, a sci-fi version of the ''Dogma'' school of indie film, which argues that movies are best when cinematic trickery is kept to a minimum.
One evening in New York, I visited with Ethan Vogt as he and his machinima team shot a car-chase scene for a Volvo promo. Vogt and two producers sat at computers, logged into a multiplayer game; each producer controlled a car racing through crowded city streets, while Vogt controlled a free-floating ''camera'' that followed behind, recording the visuals. The vehicles -- an enormous 1972 Chevy Impala and a Volvo V50 -- screamed along at about 60 miles an hour, fishtailing through corners while plowing into mailboxes, lampposts and, occasionally, clots of pedestrians. The lead car burst into flames. ''That's great,'' Vogt said. ''That's great.''
Though it shares with independent filmmaking a do-it-yourself aesthetic, machinima inverts the central tradition of indie film: smallness. With their skimpy budgets, indie directors tend to set movies in kitchens or living rooms -- and focus instead on providing quality acting and scripts. Machinima, in contrast, often has horribly cheesy acting and ham-fisted, purple-prose stories -- but they're set in outer space. Want massive shootouts? Howling mob scenes? Roman gladiatorial armies clashing by night? No problem. It is the rare form of amateur film in which the directors aspire to be not Wes Anderson but George Lucas.
Indeed, with video games played on computers, it is now possible to build an entire world from scratch. The core of any video game is its game engine, the software that knows how to render 3-D objects and how to realistically represent the physics of how they move, bounce or collide. But the actual objects inside the game -- the people, the cars, the guns, even the buildings -- can be altered, tweaked or replaced by modifications, or ''mods.'' Mods do not require any deep programming skills; indeed, almost any teenager with a passing acquaintance with graphic-design software can ''re-skin'' a character in a game to make it look like himself, for instance. (Xbox and PlayStation games, in comparison, are much harder to mod, because the consoles are locked boxes, designed to prevent players from tampering with the games.)
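To make that engine-versus-content split concrete, here is a minimal illustrative sketch in Python. None of it corresponds to any real game's code or API; the class names, the ''skin'' files and the toy physics are invented purely to show the idea that the engine's rendering-and-physics code stays fixed while a mod merely swaps the data it consumes.

from dataclasses import dataclass, field

@dataclass
class Character:
    # A game object: the engine simulates its motion, while the "skin"
    # is replaceable content -- the part a re-skin mod swaps out.
    name: str
    skin: str
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    velocity: list = field(default_factory=lambda: [0.0, 0.0, 0.0])

class Engine:
    # The piece that "knows how to render 3-D objects" and step the physics.
    # Modders never touch this code; they only change the content it loads.
    def __init__(self, gravity=-9.8):
        self.gravity = gravity
        self.objects = []

    def spawn(self, obj):
        self.objects.append(obj)

    def step(self, dt):
        # Toy physics: apply gravity to the vertical axis, integrate position.
        for obj in self.objects:
            obj.velocity[2] += self.gravity * dt
            for axis in range(3):
                obj.position[axis] += obj.velocity[axis] * dt

    def render(self):
        for obj in self.objects:
            print(f"drawing {obj.name} wearing '{obj.skin}' at {obj.position}")

if __name__ == "__main__":
    engine = Engine()
    soldier = Character(name="soldier", skin="red_armor.png")
    engine.spawn(soldier)
    soldier.skin = "my_own_face.png"   # the "re-skin" mod: new data, same engine
    engine.step(dt=0.016)
    engine.render()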
I was able to see modding in action one night when I visited the ILL Clan, a pioneering machinima group. Their headquarters is the kitchen table in the cramped one-bedroom Brooklyn apartment of Frank Dellario; a lanky, hyperkinetic 42-year-old, he sat on a rickety folding chair, pecking at a keyboard. The table was littered with four computer screens and laptops, the remnants of take-out sushi and a hopelessly tangled morass of computer cords and joysticks; a huge wide-screen TV lurked behind them for viewing their work. On the night I visited, they were using a game engine called Torque to shoot a short heist movie for Audi, in which two thugs beat up a concert violinist and make off with an antique violin in a van.
To quickly create a gritty-looking city, Dellario and his colleague -- ILL Clan's co-founder, Matt Dominianni -- hired a local artist to build a generic-looking urban intersection inside the game. To customize it, Dominianni went onto Google, found snapshots of a few seedy stores (an adult bookstore, a tattoo parlor and a furniture outlet) and digitally pasted them onto the front of the buildings. Then they went to a site called Turbo-Squid, a sort of Amazon for virtual in-game items, and for $45 bought a van that could be plunked down inside the game. When I arrived, they were browsing the site and contemplating buying a few women. ''My God, look at this one,'' Dellario marveled, as he clicked open a picture of an eerily realistic 3-D brunette named Masha. ''I'm going to marry this woman. They've finally broken through to total reality.''
Dellario put the van into the correct location in the scene, then logged into the game to figure out the camera angle for this shot. He frowned. It didn't look right. The lighting was all off, with shadows falling in the wrong places.
Dominianni figured out the problem: ''The sun is supposed to be at high noon. It's in the wrong place.''
''Oh, yeah,'' Dellario said. ''Let me move it.'' He pulled up a menu, clicked on the ''sun'' command, and dragged it across the sky.
Now they were finally ready to shoot. Dellario realized they needed an extra pair of hands to manipulate one of the thugs. ''Want to act in this scene?'' Dellario asked, and he handed me a joystick.
I sat down at one of the computers and took control of ''Thug1,'' a brown-haired man in a golf shirt and brown pants, carrying the stolen violin. Dominianni was playing ''Thug2.'' Our characters were supposed to look around to make sure the coast is clear, then jump in the truck and race off. Dellario gave me my motivation: ''It's like you hear a suspicious noise. You're nervous.'' I used the joystick to practice moving my virtual character, craning its neck -- my neck? -- back and forth. I have played plenty of video games, but this felt awfully odd. Usually when I am inside a game, I'm just worried about staying alive while the bullets whiz past my ears. I've never had to emote.
While Dellario and Dominianni fiddled with the camera angle, I grew impatient and wandered around, exploring the virtual set. I peered in a few shop windows -- they were strikingly photorealistic, even up close. Then I walked down an alley and suddenly arrived at the end of the set. It was like a tiny Western town in the desert: once you got beyond the few clustered buildings, there was nothing there -- just a vast, enormous plain, utterly empty and stretching off infinitely into the distance.
This spring, Electronic Arts decided to promote the Sims 2 by hiring Rooster Teeth to create a machinima show using the game. Called ''The Strangerhood,'' it would be freely available online. ''The Strangerhood'' is a parody of reality TV: a group of people wake up one day to discover that they are living in new houses, and they can't remember who they are or how they got there. In the Sims 2, the animated people are impressively Pixar-like and expressive, making ''The Strangerhood'' even more like a mainstream animated show than ''Red vs. Blue''; you could almost imagine watching it on Saturday morning.
The problem is, the Sims 2 has turned out to be incredibly difficult to shoot with. When the Rooster Teeth gang uses Halo for machinima, the characters are mere puppets and can be posed any way the creators want. But in the Sims 2, the little virtual characters have artificial intelligence and free will. When you're playing, you do not control all the action: the fun is in putting your Sims in interesting social situations, then standing back and watching what they'll do. When Rooster Teeth's Matt Hullum builds a virtual set and puts the ''Strangerhood'' characters in place for a shoot, he's never quite sure what will happen. To shoot a scene in which two men wake up in bed together, Hullum had to spend hours playing with the two characters -- who are nominally heterosexual -- forcing them into repeated conversations until they eventually became such good friends they were willing to share a bed. Shooting machinima with Sims is thus maddeningly like using actual, human stars: they're stubborn; they stage hissy fits and stomp off to their trailers.
''We'll do three or four takes of a scene, and one of the Sims will start getting tired and want to go to sleep,'' Hullum said. ''It's just like being on a real set. You're screaming: 'Quick, quick, get the shot! We're losing light!' ''
Hullum showed me a typical ''Strangerhood'' scene. He put Nikki, a young ponytailed brunette in a baseball cap, in the kitchen to interact with Wade, a slacker who looked eerily like a digital Owen Wilson. (To give Wade a mellow, San Francisco vibe, Hullum programmed him to move at a pace 50 percent slower than the other characters.) Hullum pointed to Nikki's ''mood'' bar; it was low, which meant she was in a bad mood and wouldn't want to talk. ''When they're bored, you have to lock them in a room alone for a few hours until they start to crave conversation,'' Hullum said. He tried anyway, prodding Wade to approach her and talk about food, one of Nikki's favorite subjects. It worked. The two became engrossed in a conversation, laughing and gesticulating wildly. ''See, this footage would be great if we were shooting a scene where these guys are maybe gossiping,'' Hullum mused, as he zoomed the camera in to frame a close-up on Wade. Then Nikki started to yawn. ''Oh, damn. See -- she's getting bored. Oh, no, she's walking away,'' Hullum said, as the little virtual Nikki wandered out of the room. ''Damn. You see what we have to deal with?''
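As a rough picture of why directing these characters is so hard, consider a small hypothetical sketch in Python. It is not Electronic Arts code and not how the Sims 2 actually decides anything; it only mirrors the behavior described above, in which a low mood makes a character refuse conversation, a favorite topic lowers the bar, and hours spent alone eventually rebuild the craving to talk.

class Sim:
    # Hypothetical mood-gated character: the "director" can only propose,
    # the character decides for itself.
    def __init__(self, name, favorite_topics, mood=0.5):
        self.name = name
        self.favorite_topics = set(favorite_topics)
        self.mood = mood                      # 0.0 (miserable) .. 1.0 (great)

    def will_chat(self, topic):
        if topic in self.favorite_topics:
            return self.mood > 0.2            # favorite topics lower the bar
        return self.mood > 0.6

    def pass_time_alone(self, hours):
        # Isolation slowly rebuilds the craving for conversation.
        self.mood = min(1.0, self.mood + 0.1 * hours)

if __name__ == "__main__":
    nikki = Sim("Nikki", favorite_topics={"food"}, mood=0.1)
    for take in range(1, 5):
        if nikki.will_chat("food"):
            print(f"take {take}: Nikki stays and gossips about food")
        else:
            print(f"take {take}: Nikki yawns and walks away; wait a few hours")
            nikki.pass_time_alone(hours=2)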
The audience for ''The Strangerhood'' has not exploded the way ''Red vs. Blue'' did. The project is a gamble: its creators hope it will break out of machinima's geeky subculture and vault into the mainstream.
Though in a way, Hullum said, the mainstream isn't always a fun place to be, either. Before he returned to Austin to work on ''Red vs. Blue,'' he spent six miserable years in Hollywood working on second-rate teen movies with big budgets, like ''Scooby-Doo'' and ''The Faculty.''
''So now to come to this, where we have total creative control of our own stuff, it's amazing,'' Hullum said, as he watched Nikki walk out of the house in search of a more interesting conversation. ''I just pray we can keep this going. Because if we can't, I'm in big trouble.''
Clive Thompson is a contributing writer for the magazine.
Friday, August 5
Design for Confusion
By PAUL KRUGMAN
I'd like to nominate Irving Kristol, the neoconservative former editor of The Public Interest, as the father of "intelligent design." No, he didn't play any role in developing the doctrine. But he is the father of the political strategy that lies behind the intelligent design movement - a strategy that has been used with great success by the economic right and has now been adopted by the religious right.
Back in 1978 Mr. Kristol urged corporations to make "philanthropic contributions to scholars and institutions who are likely to advocate preservation of a strong private sector." That was delicately worded, but the clear implication was that corporations that didn't like the results of academic research, however valid, should support people willing to say something more to their liking.
Mr. Kristol led by example, using The Public Interest to promote supply-side economics, a doctrine whose central claim - that tax cuts have such miraculous positive effects on the economy that they pay for themselves - has never been backed by evidence. He would later concede, or perhaps boast, that he had a "cavalier attitude toward the budget deficit."
"Political effectiveness was the priority," he wrote in 1995, "not the accounting deficiencies of government."
Corporations followed his lead, pouring a steady stream of money into think tanks that created a sort of parallel intellectual universe, a world of "scholars" whose careers are based on toeing an ideological line, rather than on doing research that stands up to scrutiny by their peers.
You might have thought that a strategy of creating doubt about inconvenient research results could work only in soft fields like economics. But it turns out that the strategy works equally well when deployed against the hard sciences.
The most spectacular example is the campaign to discredit research on global warming. Despite an overwhelming scientific consensus, many people have the impression that the issue is still unresolved. This impression reflects the assiduous work of conservative think tanks, which produce and promote skeptical reports that look like peer-reviewed research, but aren't. And behind it all lies lavish financing from the energy industry, especially ExxonMobil.
There are several reasons why fake research is so effective. One is that nonscientists sometimes find it hard to tell the difference between research and advocacy - if it's got numbers and charts in it, doesn't that make it science?
Even when reporters do know the difference, the conventions of he-said-she-said journalism get in the way of conveying that knowledge to readers. I once joked that if President Bush said that the Earth was flat, the headlines of news articles would read, "Opinions Differ on Shape of the Earth." The headlines on many articles about the intelligent design controversy come pretty close.
Finally, the self-policing nature of science - scientific truth is determined by peer review, not public opinion - can be exploited by skilled purveyors of cultural resentment. Do virtually all biologists agree that Darwin was right? Well, that just shows that they're elitists who think they're smarter than the rest of us.
Which brings us, finally, to intelligent design. Some of America's most powerful politicians have a deep hatred for Darwinism. Tom DeLay, the House majority leader, blamed the theory of evolution for the Columbine school shootings. But sheer political power hasn't been enough to get creationism into the school curriculum. The theory of evolution has overwhelming scientific support, and the country isn't ready - yet - to teach religious doctrine in public schools.
But what if creationists do to evolutionary theory what corporate interests did to global warming: create a widespread impression that the scientific consensus has shaky foundations?
Creationists failed when they pretended to be engaged in science, not religious indoctrination: "creation science" was too crude to fool anyone. But intelligent design, which spreads doubt about evolution without being too overtly religious, may succeed where creation science failed.
The important thing to remember is that like supply-side economics or global-warming skepticism, intelligent design doesn't have to attract significant support from actual researchers to be effective. All it has to do is create confusion, to make it seem as if there really is a controversy about the validity of evolutionary theory. That, together with the political muscle of the religious right, may be enough to start a process that ends with banishing Darwin from the classroom.