Tuesday, May 31

Pope Benedict determined to haul Catholics into the 19th century

Spain recently endorsed gay marriage. But just next door, Pope Benedict wants to ban infertility assistance. Italian law would ensure that infertile couples cannot receive sperm or egg donations to enable them to have children, and gay couples would be allowed no fertility assistance whatsoever. The Times also notes that Benedict rallied in support of intervening in the Terri Schiavo case, which puts his position at odds with well over three-quarters of Americans on that subject.

"Church officials, starting with Cardinal Camillo Ruini, the vicar of Rome and a close aide to Benedict, are urging Italians to boycott the referendum because any referendum that does not attract 50 percent of voters automatically fails. Therefore, critics argue, the referendum will be decided less on a discussion of the issues than on a legal quirk. Several polls show that Italian voters largely support repealing the sections in question, but also that less than 50 percent will vote."

Wednesday, May 25

China, New Land of Shoppers, Builds Malls on Gigantic Scale

DONGGUAN, China - After construction workers finish plastering a replica of the Arc de Triomphe and buffing the imitation streets of Hollywood, Paris and Amsterdam, a giant new shopping theme park here will proclaim itself the world's largest shopping mall.

The South China Mall - a jumble of Disneyland and Las Vegas, a shoppers' version of paradise and hell all wrapped in one - will be nearly three times the size of the massive Mall of America in Minnesota. It is part of yet another astonishing new consequence of the quarter-century economic boom here: the great malls of China.

Not long ago, shopping in China consisted mostly of lining up to entreat surly clerks to accept cash in exchange for ugly merchandise that did not fit. But now, Chinese have started to embrace America's modern "shop till you drop" ethos and are in the midst of a buy-at-the-mall frenzy.

Already, four shopping malls in China are larger than the Mall of America. Two, including the South China Mall, are bigger than the West Edmonton Mall in Alberta, which just surrendered its status as the world's largest to an enormous retail center in Beijing. And by 2010, China is expected to be home to at least 7 of the world's 10 largest malls.

Chinese are swarming into malls, which usually have many levels that rise up rather than out in the sprawling two-level style typical in much of the United States. Chinese consumers arrive by bus and train, and growing numbers are driving there. On busy days, one mall in the southern city of Guangzhou attracts about 600,000 shoppers.

For years, the Chinese missed out on the fruits of their labor, stitching shoes, purses or dresses that were exported around the world. Now, China's growing consumerism means that its people may be a step or two closer to buying the billion Cokes, Revlon lipsticks, Kodak cameras and the like that foreign companies have long dreamed they could sell.

"Forget the idea that consumers in China don't have enough money to spend," said David Hand, a real estate and retailing expert at Jones Lang LaSalle in Beijing. "There are people with a lot of money here. And that's driving the development of these shopping malls."

For sale is a wide range of consumer favorites - cellphones, DVD players, jeans, sofas and closets to assemble yourself. There is food from many regions of China and franchises with familiar names - KFC, McDonald's and IMAX theaters. Stores without Western pedigree sell Gucci and Louis Vuitton goods. While peasants and poor workers may only window-shop, they have joined a regular pilgrimage to the mall that has set builders and developers afire. The developers are spending billions of dollars to create these supersize shopping centers in the country's fastest-growing cities - betting that a nation of savers is on the verge of also becoming a nation of tireless shoppers.

For the moment, the world's biggest mall is the six-million-square-foot Golden Resources Mall, which opened last October in northwestern Beijing. It has already sparked envy and competitive ambition among the world's big mall builders, who outwardly scoff at the Chinese ascent to mall-dom, even as they plot their own path to build on such scale in China.

How big is six million square feet? That mall, which is expected to cost $1.3 billion when completed, spans the length of six football fields and easily exceeds the floor space of the Pentagon, which at 3.7 million square feet is the world's largest office building. It is a single, colossal five-story building - with rows and rows of shops stacked on top of more rows and rows of shops - so large that it is hard to navigate among the 1,000 stores and the thousands of shoppers.

China is still a land of disparity, though it has a growing middle class that has swelled to as many as 70 million. And as the country rapidly urbanizes and modernizes, open-air food markets and old department stores are being replaced by giant supermarkets and big-box retailers. Ikea and Carrefour, the French supermarket chain, are mobbed with customers. And China's increasingly affluent young people are adopting the American teenager's habit of hanging out at the mall.

Big enclosed shopping malls, which came of age in America in the late 1970's and Europe in the late 80's, are sprouting up all over China. According to retail analysts, more than 400 large malls have been built in China in the last six years.

And at a time when the biggest malls under construction in the United States measure about a million square feet, developers here are creating malls that are six, seven and eight million square feet.

The current titleholder, the Golden Resources Mall, where 20,000 employees work, is the creation of Huang Rulun, an entrepreneur who made a fortune selling real estate in coastal Fujian Province. Six years ago, Mr. Huang acquired a 440-acre tract of land outside Beijing to create a virtual satellite city, which will soon have 110 new apartment buildings, along with schools and offices planted like potted trees around his neon-lighted mall.

In Dongguan, the developers of the South China Mall say they traveled around the world for two years in search of the right model. The result is a $400 million fantasy land: 150 acres of palm-tree-lined shopping plazas, theme parks, hotels, water fountains, pyramids, bridges and giant windmills. Trying to exceed even some of the over-the-top casino extravaganzas in Las Vegas, it has a 1.3-mile artificial river circling the complex, which includes districts modeled on the world's seven "famous water cities," and an 85-foot replica of the Arc de Triomphe.

"We have outstanding architecture from around the world," Tong Rui, vice chief executive at Sanyuan Yinhui Investment and Development, the mall's developer, said as he toured a section modeled on Paris. "You can't see this architecture anywhere else in shopping malls."

Hu Guirong, the man behind the development, made his fortune selling noodles and biscuits in China. His aides say he built his mall in Dongguan, a fast-growing city whose population is estimated as high as eight million, with one of the highest car-to-household ratios in the country, because it is situated at a crossroads of two bustling South China metropolises, Shenzhen and Guangzhou. "We wanted to do something groundbreaking," Mr. Tong said, referring to his boss. "We wanted to leave our mark on history."

But just to keep a seven-million-square-foot shopping center from looking deserted, some retailing specialists say, requires 50,000 to 70,000 visitors a day. Officials of the South China Mall say they will easily surpass those figures.

But before the mall is fully open, the Triple Five Group is working to reclaim the world title, with three megamalls in the planning stages that will expand its operations from its base in North America into China.
Two of them, the Mall of China and the Triple Five Wenzhou Mall, are each projected to be 10 million square feet.

Saturday, May 21

Argumentation in a Culture of Discord

[Abridged] Last October the comedian-philosopher Jon Stewart did writing teachers a great service. Accosting the hosts of CNN's Crossfire, Stewart accused them of shortchanging the American public by failing to offer a forum for genuine debate, and by reducing issues to black/white, right/wrong dichotomies. CNN apparently agreed, as it canceled the show after a 23-year run. And while I certainly admit that Stewart himself argued unfairly, his point nonetheless stands: Our media do not provide a forum for actual debate. Instead they're a venue for self-promotion and squabbling, for hawking goods, for infomercials masquerading as news or serious commentary. In terms of discussing issues, they offer two sides, pick one: Either you are for gay marriage or against it, either for abortion or for life, either for pulling the feeding tube or for "life."

This failure to provide a forum for argumentative discourse has steadily eroded students' understanding of "argument" as a concept. For decades my college writing classes have stressed the need to write papers with an argumentative edge. Yet students don't get it. Either they don't understand what I mean, or they reject the whole enterprise. Students typically don't want to attempt "argument" or take a controversial position to defend, probably because they've seen or heard enough of the media's models -- Bill O'Reilly, Ann Coulter, or Al Franken, to name a few -- and are sick of them. If I were an 18-year-old college freshman assigned an argumentative essay, I'd groan in despair, either because I found the food-fight-journalism model repulsive or because I didn't feel strongly enough about anything to engage in the furious invective that I had all too often witnessed. Maybe the unanticipated consequence of the culture of contentious argument -- and this, I think, was Stewart's larger point -- is the decline in the general dissemination of intellectual, argumentative discourse more broadly construed.

I propose that we teach students more about how intellectual discourse works, about how it offers something exciting -- yet how when it succeeds, it succeeds in only approaching understanding. The philosopher Frank Plumpton Ramsey puts it bluntly but eloquently: "Meaning is mainly potential." Philosophical and, more generally, argumentative discourse presents no irrefutable proofs, no indelible answers. In fact, the best writing of this kind tends not to answer but to raise questions, ones that perhaps the audience hadn't previously considered. Or to put it in terms my college-age nephew uses, when you're writing argument, don't go for the slam-dunk.

At the same time, we should make students aware that they're not alone on the court. We need, that is, to emphasize more the need for counterarguments, which inevitably force writers to place themselves in the audience's position and to attempt to imagine what that audience values and feels -- what objections it might intelligently raise. In On Liberty, John Stuart Mill asserts that 75 percent of an argument should consist of counterarguments. And, further, writers should not merely parrot these, but must "know them in their most plausible and persuasive form ... must feel the whole force of the difficulty which the true view of the subject has to encounter and dispose of." Presenting and empathizing with counterarguments force an author to go somewhere new, to modify her initial position into one more nuanced, more complex, more problematic -- perhaps to one of greater potential, to use Ramsey's formulation.

Not surprisingly, that kind of thought and writing process is difficult to teach. It's easier to give "evaluative" writing assignments for which there are more or less clear-cut answers: Summarize this. Give a précis of that. Answer this question. Give us an outline. Fill in the blank. True or false?

Using writing only as an evaluative tool, these assignments invoke the consumerlike currency-exchange model. Think of how in the course of a semester so much of a discipline's dialectical ambiguity emerges, yet how often we will use "evaluative" writing assignments such as the aforementioned, with the express purpose of seeing if students "got" the "material," which even for us is slippery and elusive. And the transitive verb really matters here: I "got" a new iPod; I "got" a pair of Gap jeans; I "got" John Rawls's "veil of ignorance" concept; I "got" an A. This pedagogy resembles the consumer myth: There is an answer (a product, an idea, a methodology, a theory, a grade); it's this.

By offering such assignments, we unwittingly embrace what the media have led people to believe that intellectual debate and discourse consist of. People on shows such as Crossfire stake out a position, and they iterate and reiterate that position. They give examples of what they mean, and "defend" themselves by ignoring or deliberately misconstruing vicious attacks from the opposing side. But this is not intellectual discourse; it's discourse packaged as product. Academic, intellectual discourse -- true debate, the attempt to genuinely advance knowledge, the use of imaginative arguments in general -- cannot be easily captured in a half-hour television program. Such discourse requires time and labor. It requires sustained analysis and construction of an intended audience. It requires careful marshaling of evidence, organization of ideas, rewriting, rethinking. It may seem a little boring to listen to, and is often too dense to grasp at first hearing.

Most people never encounter such discourse. And most students, on entering college, have no idea of what it's like. They've come from a culture that wants answers, not nuanced problematizations, not philosophy. They've been conditioned, as have most Americans, to seek out a position where a simple choice will solve the problem. They've been conditioned to see ideas as being part of a marketplace, just like sweatshirts, snowboards, or songs, and when they are asked to produce ideas, they look to that marketplace for a model. And students do this with their research papers as much as with their arguments. How often, in fact, does a student's research paper look like an amateur journalist's report of multiple facts and views, a superficial survey of x number of sources, with no argument even implied?

Consider, for example, the "five-paragraph essay" so often taught in high schools around the country and further abetted by the new SAT exam. Paragraph one offers an introduction, including a thesis at the end of the introduction. It's best if this thesis has three points. The subsequent three paragraphs develop and explain these thesis-supporting points. The last paragraph, the conclusion, sums up the paper and restates the thesis.

It resembles the script for commercials. It inhibits, even prohibits freedom of thought. It's static -- more noise than signal. There's no real inquiry going on, no grappling with complexities. It seeks only support, and readily available support at that. It can appear to be heated, resembling the screaming-heads model. But it's one-sided, and it goes nowhere, except to its inevitable end, which resembles or reproduces its beginning.

When we try to teach argument in the classroom, we have to fight a model of discourse that, zombielike, still stalks many classrooms. At the same time, we're pressed to provide a better model for students -- the reasoned, calm approach, the one that engages and responds to counterarguments, that strives only to approach an understanding. The model for this in public discourse is as hard to find as the genre is to explain or justify. It's no surprise that we can't stick an ice pick through the five-paragraph monster's gelid heart.

The best argumentative writing expands and transforms the ideas of the writer. It questions itself, actively seeking out emergent problems along the way. And it ends not with a definitive, in-your-face "So there!" (or a "You should just read the Bible!"), but probably with more complex questions, ones that push the continuum of the subject matter. One student told me writing in the argumentative mode was "scary." It's just not something they've been taught to do -- yet its being tantamount to a transgressive act can make it much more attractive.

Frank L. Cioffi, an assistant professor of writing and director of the writing program at Scripps College, is author of The Imaginative Argument: A Practical Manifesto for Writers, published this month by Princeton University Press.

Wednesday, May 18

And you thought the _current_ military was expensive... The Weaponization of Space

The Air Force, saying it must secure space to protect the nation from attack, is seeking President Bush's approval of a national-security directive that could move the United States closer to fielding offensive and defensive space weapons, according to White House and Air Force officials.

The proposed change would be a substantial shift in American policy. It would almost certainly be opposed by many American allies and potential enemies, who have said it may create an arms race in space.

A senior administration official said that a new presidential directive would replace a 1996 Clinton administration policy that emphasized a more pacific use of space, including spy satellites' support for military operations, arms control and nonproliferation pacts.

Any deployment of space weapons would face financial, technological, political and diplomatic hurdles, although no treaty or law bans Washington from putting weapons in space, barring weapons of mass destruction.

A presidential directive is expected within weeks, said the senior administration official, who is involved with space policy and insisted that he not be identified because the directive is still under final review and the White House has not disclosed its details.

Air Force officials said yesterday that the directive, which is still in draft form, did not call for militarizing space. "The focus of the process is not putting weapons in space," said Maj. Karen Finn, an Air Force spokeswoman, who said that the White House, not the Air Force, makes national policy. "The focus is having free access in space."

With little public debate, the Pentagon has already spent billions of dollars developing space weapons and preparing plans to deploy them.

"We haven't reached the point of strafing and bombing from space," Pete Teets, who stepped down last month as the acting secretary of the Air Force, told a space warfare symposium last year. "Nonetheless, we are thinking about those possibilities."

In January 2001, a commission led by Donald H. Rumsfeld, then the newly nominated defense secretary, recommended that the military should "ensure that the president will have the option to deploy weapons in space."

It said that "explicit national security guidance and defense policy is needed to direct development of doctrine, concepts of operations and capabilities for space, including weapons systems that operate in space."

The effort to develop a new policy directive reflects three years of work prompted by the report. The White House would not say if all the report's recommendations would be adopted.

In 2002, after weighing the report of the Rumsfeld space commission, President Bush withdrew from the 30-year-old Antiballistic Missile Treaty, which banned space-based weapons.

Ever since then, the Air Force has sought a new presidential policy officially ratifying the concept of seeking American space superiority.

The Air Force believes "we must establish and maintain space superiority," Gen. Lance Lord, who leads the Air Force Space Command, told Congress recently. "Simply put, it's the American way of fighting." Air Force doctrine defines space superiority as "freedom to attack as well as freedom from attack" in space.

The mission will require new weapons, new space satellites, new ways of doing battle and, by some estimates, hundreds of billions of dollars. It faces enormous technological obstacles. And many of the nation's allies object to the idea that space is an American frontier.

Yet "there seems little doubt that space-basing of weapons is an accepted aspect of the Air Force" and its plans for the future, Capt. David C. Hardesty of the Naval War College faculty says in a new study.

A new Air Force strategy, Global Strike, calls for a military space plane carrying precision-guided weapons armed with a half-ton of munitions. General Lord told Congress last month that Global Strike would be "an incredible capability" to destroy command centers or missile bases "anywhere in the world."

Pentagon documents say the weapon, called the common aero vehicle, could strike from halfway around the world in 45 minutes. "This is the type of prompt Global Strike I have identified as a top priority for our space and missile force," General Lord said.

The Air Force's drive into space has been accelerated by the Pentagon's failure to build a missile defense on earth. After spending 22 years and nearly $100 billion, Pentagon officials say they cannot reliably detect and destroy a threat today.

"Are we out of the woods? No," Lt. Gen. Trey Obering, who directs the Missile Defense Agency, said in an interview. "We've got a long way to go, a lot of testing to do."

While the Missile Defense Agency struggles with new technology for a space-based laser, the Air Force already has a potential weapon in space.

In April, the Air Force launched the XSS-11, an experimental microsatellite with the technical ability to disrupt other nations' military reconnaissance and communications satellites.

Another Air Force space program, nicknamed Rods From God, aims to hurl cylinders of tungsten, titanium or uranium from the edge of space to destroy targets on the ground, striking at speeds of about 7,200 miles an hour with the force of a small nuclear weapon.

A third program would bounce laser beams off mirrors hung from space satellites or huge high-altitude blimps, redirecting the lethal rays down to targets around the world. A fourth seeks to turn radio waves into weapons whose powers could range "from tap on the shoulder to toast," in the words of an Air Force plan.

Captain Hardesty, in the new issue of the Naval War College Review, calls for "a thorough military analysis" of these plans, followed by "a larger public debate."

"To proceed with space-based weapons on any other foundation would be the height of folly," he concludes, warning that other nations, not necessarily allies, would follow America's lead into space.

Despite objections from members of Congress who thought "space should be sanctified and no weapons ever put in space," Mr. Teets, then the Air Force under secretary, told the space-warfare symposium last June that "that policy needs to be pushed forward."

Last month, Gen. James E. Cartwright, who leads the United States Strategic Command, told the Senate Armed Services nuclear forces subcommittee that the goal of developing space weaponry was to allow the nation to deliver an attack "very quickly, with very short time lines on the planning and delivery, any place on the face of the earth."

Senator Jeff Sessions, a Republican from Alabama who is chairman of the subcommittee, worried that the common aero vehicle might be used in ways that would "be mistaken as some sort of attack on, for example, Russia."

"They might think it would be a launch against them of maybe a nuclear warhead," Senator Sessions said. "We want to be sure that there could be no misunderstanding in that before we authorize going forward with this vehicle."

General Cartwright said that the military would "provide every opportunity to ensure that it's not misunderstood" and that Global Strike simply aimed to "expand the choices that we might be able to offer to the president in crisis."

Senior military and space officials of the European Union, Canada, China and Russia have objected publicly to the notion of American space superiority.

They think that "the United States doesn't own space - nobody owns space," said Teresa Hitchens, vice president of the Center for Defense Information, a policy analysis group in Washington that tends to be critical of the Pentagon. "Space is a global commons under international treaty and international law."

No nation will "accept the U.S. developing something they see as the death star," Ms. Hitchens told a Council on Foreign Relations meeting last month. "I don't think the United States would find it very comforting if China were to develop a death star, a 24/7 on-orbit weapon that could strike at targets on the ground anywhere in 90 minutes."

International objections aside, Randy Correll, an Air Force veteran and military consultant, told the council, "the big problem now is it's too expensive."

The Air Force does not put a price tag on space superiority. Published studies by leading weapons scientists, physicists and engineers say the cost of a space-based system that could defend the nation against an attack by a handful of missiles could be anywhere from $220 billion to $1 trillion.

Richard Garwin, widely regarded as a dean of American weapons science, and three colleagues wrote in the March issue of IEEE Spectrum, the professional journal of electrical engineering, that "a space-based laser would cost $100 million per target, compared with $600,000 for a Tomahawk missile."

"The psychological impact of such a blow might rival that of such devastating attacks as Hiroshima," they wrote. "But just as the unleashing of nuclear weapons had unforeseen consequences, so, too, would the weaponization of space."

Surveillance and reconnaissance satellites are a crucial component of space superiority. But the biggest new spy satellite program, Future Imagery Architecture, has tripled in price to about $25 billion while producing less than promised, military contractors say. A new space technology for detecting enemy launchings has risen to more than $10 billion from a promised $4 billion, Mr. Teets told Congress last month.

But General Lord said such problems should not stand in the way of the Air Force's plans to move into space.

"Space superiority is not our birthright, but it is our destiny," he told an Air Force conference in September. "Space superiority is our day-to-day mission. Space supremacy is our vision for the future."

Tuesday, May 17

Why Hollywood blockbusters lose money at the box office (but gain it on DVD sales)

Illustration by Mark Stamaty.
The media, by treating the box-office grosses released on Sunday afternoons as if they were the results of a weekly horse race, further a misunderstanding about the New Hollywood. Once upon a time, when the studios owned the theaters and carted away locked boxes of cash from them, these box-office numbers meant something. But nowadays, as dazzling as the "boffo," "socko," and "near-record" figures may seem to the media and other number fetishists, they have little real significance other than to measure the effectiveness of the studios' massive expenditures on ads.

To begin with, the Sunday numbers are not actual ticket sales but "projections" furnished by Nielsen EDI, since the Sunday evening box office cannot be counted in time to meet the deadlines of the morning papers. Variety, to its credit, corrects the guesstimates on Monday with the actual weekend take. Yet even these accurate numbers leave in place four other confusions about who earns what.

First, the reported "grosses" are not those of the studios but those of the movie houses. The movie houses take these sums and keep their share (or what they claim is their share)—which can amount to more than 50 percent of the original box-office total. Consider, for example, Touchstone's Gone in 60 Seconds, which had a $242 million box-office gross. From this impressive haul, the theaters kept $129.8 million and remitted the balance to Disney's distribution arm, Buena Vista. After paying mandatory trade dues to the MPAA, Buena Vista was left with $101.6 million. From this amount, it repaid the marketing expenses that had been advanced—$13 million for prints so the film could open in thousands of theaters; $10.2 million for the insurance, local taxes, custom clearances, and other logistical expenses; and $67.4 million for advertising. What remained of the nearly quarter-billion-dollar "gross" was a paltry $11 million. (And that figure does not account for the $103.3 million that Disney had paid to make the movie in the first place.)
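The arithmetic above can be sketched as a quick calculation. All dollar figures come from the article; the MPAA-dues amount is not stated directly, so it is inferred here as the difference between what the theaters remitted and the $101.6 million Buena Vista was left with:

```python
# Where a $242 million box-office "gross" goes (Gone in 60 Seconds
# figures from the article; all amounts in millions of dollars).
gross = 242.0
theaters_share = 129.8                 # kept by the movie houses
remitted = gross - theaters_share      # 112.2 remitted to Buena Vista
mpaa_dues = remitted - 101.6           # inferred trade dues (~10.6)
after_dues = remitted - mpaa_dues      # 101.6 left after MPAA dues
marketing = 13.0 + 10.2 + 67.4         # prints + logistics + advertising
studio_net = after_dues - marketing    # what survives of the "gross"

print(f"Studio net: ${studio_net:.1f}M")
# Note: production cost ($103.3M) is not yet subtracted.
```

Running the numbers confirms the article's figure: roughly $11 million remains before the $103.3 million production cost is even counted.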

Second, box-office results reflect neither the appeal of the actual movies nor their quality, but rather the number of screens on which they are playing and the efficacy of the marketing that drove an audience into the theaters. If a movie opens on 30 screens, like Sideways or Million Dollar Baby, there is obviously no way it can achieve the results of a movie opening on 3,000 screens. And how do studios motivate millions of moviegoers—mainly under 25—to go to the 3,000 screens on an opening weekend to see a film no one else has yet seen or recommended? With a successful advertising campaign.

Studios spend $20 million to $40 million on TV ads because their market research shows that those ads are what can draw a movie's crucial opening-weekend teenage audience. To do that, they typically blitz this audience, aiming to hit each viewer with between five and eight ads in the two weeks before a movie's opening. The studios also spend a great deal of money testing the ads on focus groups, some of whom are wired up to measure their nonverbal responses. If the ads fail to trigger the right response, the film usually "bombs" in the media's hyperbolic judgment. If the ads succeed, the film is rewarded with "boffo" box-office numbers.

Third, the "news" of the weekend grosses confuses the feat of buying an audience with that of making a profit. The cost of prints and advertising for the opening of a studio film in America in 2003 totaled, on average, $39 million. That's $18.4 million more per film than studios recovered from box-office receipts. In other words, it cost more in prints and ads—not even counting the actual costs of making the film—to lure an audience into theaters than the studio got back. So while a "boffo" box-office gross might look good in a Variety headline, it might also signify a boffo loss.

Finally, and most important, the fixation on box-office grosses obscures the much more lucrative global home-entertainment business, which is the New Hollywood's real profit center. The six major studios spoon-feed their box-office grosses to the media, but they go to great lengths to conceal the other components of their revenue streams from the public, as well as from the agents, stars, and writers who may profit from a movie.

Each of the major studios, however, supplies the real numbers to its trade association, the MPAA, including a detailed breakdown of the money they actually receive, country by country, from movie theaters, home video, network television, local television, pay television, and pay-per-view, which is then privately circulated among the six studios as the "All Media Revenue Report."

These numbers tell the story. Ticket sales from theaters provided 100 percent of the studios' revenues in 1948; in 2003, they accounted for less than 20 percent. Instead, home entertainment provided 82 percent of the 2003 revenues. In terms of profits, the studios can make an even larger proportion from home entertainment since most, if not all, of the theatrical revenues go to pay for the prints and advertising required to get audiences into theaters. (Video, DVDs, and TV have much lower marketing costs.)

This profit reality has transformed the way Hollywood operates. Theatrical releases now essentially serve as launching platforms for videos, DVDs, network TV, pay TV, games, and a host of other products. Even so, the box-office totals are losing their traditional influence. Up until a few years ago, the results from the U.S. box office largely drove secondary markets, especially video. If a film had a huge opening, the video chains would order 200,000 or more copies (at $60 or more apiece wholesale) for rentals. But this buying formula ended when consumers began buying DVDs at mass retailers. By 2004, Wal-Mart was accounting for more than one-third of the studios' revenues in video and DVD.

For merchandisers like Wal-Mart, DVDs are a means to lure consumers, who may buy other products, into the store. The box-office numbers are of little relevance (especially since it's teenagers who create huge opening weekends, and they cannot afford to buy more profitable goods like plasma TVs). Instead of box-office results, merchandisers look for movies with stars such as Tom Hanks, Julia Roberts, or Arnold Schwarzenegger, who have traction with their highly desired older customers. For example, whereas the sophisticated mind-bending love story Eternal Sunshine of the Spotless Mind had a dismal seventh-place finish in the box-office gross sweepstakes—earning a mere $8.1 million for the theaters during its opening weekend—thanks to the presence of recognizable names like Jim Carrey and Kate Winslet, it did extremely well on DVD, selling more than 1.5 million copies during its first week in the stores.

Edward Jay Epstein is the author of The Big Picture: The New Logic of Money and Power in Hollywood.

Sunday, May 15

Why strict churches are strong - a rational choice theory of religion

By Judith Shulevitz

It isn't easy to explain why some people submit enthusiastically to religious law, especially when you're talking to people who have never had the slightest desire to do so. Why limit yourself to a "theology of the body," as the late Pope John Paul II called it, when birth control and stem-cell research promise relief from two of the most painful vicissitudes of bodily existence, unwanted pregnancy and degenerative disease? Why restrict yourself to kosher food, when kashrut relies on zoological classifications that went out of date thousands of years ago?

Among the nondevout, piety of this magnitude is often dismissed as a social pathology. The mildly religious are more respectful but no more helpful; they just shake their heads and say, fine for them, but not for me. Not even the pious have figured out how to communicate to the rest of the world why strict religious observance appeals to them. They just say that they do what they do because God wants them to do it—an argument that simply isn't going to make sense to a nonbeliever. Or they lay claim to moral superiority, which, if you believe that morality derives from God, is pretty much the same as saying you're doing what God wants you to do.

You wouldn't expect an economist to do a better job than the religious at explaining religion. But one has, using the amoral language of rational choice theory, which reduces people to "rational agents" who "maximize utility," that is, act out of self-interest. (Economists assume that people are rational for methodological reasons, not because they believe it.) In his 1994 essay "Why Strict Churches Are Strong," which has become quite influential in the sociology of religion, economist Laurence Iannacone makes the counterintuitive case that people choose to be strictly religious because of the quantifiable benefits their piety affords them, not just in the afterlife but in the here and now.

Iannacone starts by asking why people join strict churches, given that doing so exacts such a high price. Eccentric customs invite ridicule and persecution; membership in a marginal church may limit chances for social and economic advancement; rules of observance bar access to apparently innocent pleasures; the entire undertaking squanders time that could have been spent amusing or improving oneself.

According to Iannacone, the devout person pays the high social price because it buys a better religious product. The rules discourage free riders, the people who undermine group efforts by taking more than they give back. The strict church is one in which members with weak commitments have been weeded out. Raising fees for membership doesn't work nearly as well as raising the opportunity cost of joining, because fees drive away the poor, who have the least to lose when they volunteer their time, and who also have the most incentive to pray. Fees also encourage the rich to substitute money for piety.

What does the pious person get in return for all of his or her time and effort? A church full of passionate members; a community of people deeply involved in one another's lives and more willing than most to come to one another's aid; a peer group of knowledgeable souls who speak the same language (or languages), are moved by the same texts, and cherish the same dreams. Religion is a " 'commodity' that people produce collectively," says Iannacone. "My religious satisfaction thus depends both on my 'inputs' and those of others." If a rich and textured spiritual experience is what you seek, then a storefront Holy Roller church or an Orthodox shtiebl is a better fit than a suburban church made up of distracted, ambitious people who can barely manage to find a morning free for Sunday services, let alone several evenings a week for text study and volunteer work.

At some point, of course, the disadvantages of zealotry outweigh the benefits. A church reaches that point when it fails to offer acceptable substitutes for everything it has asked its members to give up. Cults that lure their followers into the wilderness but provide them with no livelihood soon fade into history. All-encompassing codes of behavior that isolate people socially—such as, say, Judaism's—all but disappear unless networks are established to support their adherents. This helps to explain, among other things, why the Jews who moved to small Southern towns to open dry goods stores in the 19th and early 20th centuries and lived for decades as the only Jewish families in their towns wound up becoming some of the most assimilated Jews in the world.

The example Iannacone gives of a church whose strictness may have backfired is the Catholic Church, which has been having a hard time holding on to followers in Europe and attracting men to the priesthood in America. Traditionalists blame the church's difficulties on the reforms of Vatican II, when the Mass began to be said in the vernacular and priests and nuns shed their otherworldly clothes. Would-be reformers blame church officials' refusal to yield to popular opinion on contraception, homosexuality, and priestly celibacy. Iannacone says both are right. "The Catholic church may have managed to arrive at a remarkable, 'worst-of-both-worlds' position," he writes, "discarding cherished distinctiveness in the areas of liturgy, theology, and lifestyle, while at the same time maintaining the very demands that its members and clergy are least willing to accept."

Still, if strictness, judiciously enforced, provides an advantage in the spiritual marketplace, then it makes sense that America, one of the few countries with no state religion and a truly open market in religion, should be home to so many varieties of fundamentalism and orthodoxy. The explosive growth of conservative Christianity, Judaism, and Islam and the slow decline of more genteel denominations such as Episcopalianism may well represent not the triumph of reactionary forces, but the natural outcome of religious competition.

Does it follow from Iannacone's theory that America is destined to be dominated by the religious right, at least until its leaders overreach? Not necessarily. His observations have more to do with the way churches work than with what they espouse. The point is that worshippers want enthusiastic commitment from fellow worshippers, not that those who want commitment list to the left or the right.

Admittedly, piety and absolutist ideas tend to go together. It's easier to wrench members away from competing claims on their time when you can assert that your way of life provides exclusive access to the truth. Nonetheless, if the desire for thick connections and strong community accounts for even a small part of the allure of strict piety, Iannacone's solutions to the free-rider problem might provide helpful hints, even for less stringent churches and synagogues. Methodist ministers could allow themselves to demand more prayer and volunteer work from their congregants. Rabbis in Judaism's Conservative movement (which is less strict than Jewish Orthodoxy) could push harder for their congregations to keep kosher, study Talmud, and visit the sick. There's no reason that higher levels of religious involvement couldn't be tied to liberal, rather than conservative, theologies, to doctrines of skepticism and doubt rather than those of certitude, if that's what pastors and rabbis believed in and wanted to preach. Higher demands might yield smaller churches and synagogues, but Pope Benedict XVI may have been onto something when, as cardinal, he told a German journalist that the future of the Catholic Church lies in smaller churches made up of more dedicated followers—a Christianity "characterized more by the mustard seed," as he put it.

The biggest obstacle to such reforms by liberal religious leaders is, of course, the liberal imagination, which tends to associate traditional ritual with being backward, ignorant, and right-wing. But the world is full of painstakingly observant sects whose politics defy easy categorization. Think of the pacifism of the Quakers and the anti-death penalty activism of many Catholics. As the greatest religious leaders have understood, ritual is theater. You can use it to send any message you want.

Liberal Bible-Thumping

By NICHOLAS D. KRISTOF

Even aside from his arguments that Jesus and Mary Magdalene were married and that St. Paul was a self-hating gay, the new book by a former Episcopal bishop of Newark is explosive.

John Shelby Spong, the former bishop, tosses a hand grenade into the culture wars with "The Sins of Scripture," which examines why the Bible - for all its message of love and charity - has often been used through history to oppose democracy and women's rights, to justify slavery and even mass murder.

It's a provocative question, and Bishop Spong approaches it with gusto. His mission, he says, is "to force the Christian Church to face its own terrifying history that so often has been justified by quotations from 'the Scriptures.' "

This book is long overdue, because one of the biggest mistakes liberals have made has been to forfeit battles in which faith plays a crucial role. Religion has always been a central current of American life, and it is becoming more important in politics because of the new Great Awakening unfolding across the United States.

Yet liberals have tended to stay apart from the fray rather than engaging in it. In fact, when conservatives quote from the Bible to make moral points, they tend to quote very selectively. After all, while Leviticus bans gay sex, it also forbids touching anything made of pigskin (is playing football banned?) - and some biblical passages seem not so much morally uplifting as genocidal.

"Can we really worship the God found in the Bible who sent the angel of death across the land of Egypt to murder the firstborn males in every Egyptian household?" Bishop Spong asks. Or what about 1 Samuel 15, in which God is quoted as issuing orders to wipe out all the Amalekites: "Kill both man and woman, child and infant." Hmmm. Tough love, or war crimes? As for the New Testament, Revelation 19:17 has an angel handing out invitations to a divine dinner of "the flesh of all people."

Bishop Spong, who has also taught at Harvard Divinity School, argues that while Christianity historically tried to block advances by women, Jesus himself treated women with unusual dignity and was probably married to Mary Magdalene.

Christianity may have become unfriendly to women's rights partly because, in its early years, it absorbed an antipathy for sexuality from the Neoplatonists. That led to an emphasis on the perpetual virginity of Mary, with some early Christian thinkers even trying to preserve the Virgin Mary's honor by raising the possibility that Jesus had been born through her ear.

The squeamishness about sexuality led the church into such absurdities as a debate about "prelapsarian sex": the question of whether Adam and Eve might have slept together in the Garden of Eden, at least if they had stayed longer. St. Augustine's dour answer was: Maybe, but they wouldn't have enjoyed it. In modern times, this same discomfort with sex has led some conservative Christians to a hatred of gays and a hostility toward condoms, even to fight AIDS.

Bishop Spong particularly denounces preachers who selectively quote Scripture against homosexuality. He also cites various textual reasons for concluding (not very persuasively) that St. Paul was "a frightened gay man condemning other gay people so that he can keep his own homosexuality inside the rigid discipline of his faith."

The bishop also tries to cast doubt on the idea that Judas betrayed Jesus. He notes that the earliest New Testament writings, of Paul and the source known as Q, don't mention a betrayal by Judas. Bishop Spong contends that after the destruction of Jewish Jerusalem in A.D. 70, early Christians curried favor with Roman gentiles by blaming the Crucifixion on Jewish authorities - nurturing two millennia of anti-Semitism that bigots insisted was biblically sanctioned.

Some of the bishop's ideas strike me as more provocative than persuasive, but at least he's engaged in the debate. When liberals take on conservative Christians, it tends to be with insults - by deriding them as jihadists and fleeing the field. That's a mistake. It's entirely possible to honor Christian conservatives for their first-rate humanitarian work treating the sick in Africa or fighting sex trafficking in Asia, and still do battle with them over issues like gay rights.

Liberals can and should confront Bible-thumping preachers on their own terms, for the scriptural emphasis on justice and compassion gives the left plenty of ammunition. After all, the Bible depicts Jesus as healing lepers, not slashing Medicaid.

Monday, May 9

Larry Summers + Ward Churchill ≠ Free Speech Issue

It's fair to say, I think, that at the present moment, the two most famous academics in the country are Larry Summers and Ward Churchill.

Larry Summers is the president of Harvard University, and so his fame is no surprise: Everyone with any interest at all in higher education knows who the president of Harvard is. Ward Churchill's fame, on the other hand, is of another order. Churchill is a professor at the University of Colorado at Boulder, and he is famous, or rather infamous, because of an essay he wrote more than three years ago in which he went so far as to say that those who died in the September 11 attack on the World Trade Center were part of the military-industrialist machine that had produced the policies that had produced the hatred that eventually produced the terrible events of that day. In a phrase that has been cited endlessly, he called those who died "little Eichmanns," that is, persons who willingly served a regime while taking no responsibility for its actions and their consequences. The chickens, he said, have come home to roost.

The predictable outcry against Churchill's words has been more than matched by the outcry against Summers's speculation, offered at an academic conference held in Cambridge, Mass., that the underrepresentation of women in the sciences might have a genetic basis. In more than a few essays, online journals, and mainstream op-eds, Summers and Churchill have been bracketed together, usually as an illustration of how the predominantly left-wing faculty on American campuses plays a double and hypocritical game. Here, for example, is a statement offered by George Neumayr in The American Spectator under the headline "Professors of Stupidity": "While Ward Churchill can tell lies about differences between America and the terrorists, Larry Summers is forbidden to tell truths about differences between men and women."

Although that parallelism of words and thought is neatly formulated, it is a bit too neat and not quite accurate. First of all, Churchill is not telling lies in the sense that he affirms something he knows to be false (that's what a lie is); rather he is telling what seem to him to be truths, but truths Neumayr rejects. Nor has Summers been forbidden to express the views he may hold to be true and others regard as false: Rather he expressed them freely and is now taking considerable heat for having done so. Churchill, too, is taking considerable heat for the views he expressed, and both men are reported to be in danger of losing their employment. On the surface, the two cases seem similar. I suggest, however, first that they raise different issues, and second that the issues they raise are by and large issues of administrative judgment and not issues of academic freedom or free speech.

It will be helpful to remind ourselves of exactly what freedom of speech is, especially as it relates to college and university life. The first thing to say is that freedom of speech is not an absolute right we carry with us from situation to situation; rather its relevance will vary from situation to situation, as will the force of invoking it. My so-called free-speech rights will be very different depending, for example, on whether I am a fan at a baseball game or a nurse in an operating room. In the first context, my free-speech rights are pretty broad; I can yell any number of things, even abusive profane things, without being silenced or arrested or thrown out of the stadium. In the context of the operating room, however, my free-speech rights barely exist at all; if I decide, in the middle of a procedure, to advocate for a higher salary or better working conditions, I will have no First Amendment defense when I am hauled out of the room and later fired. The reason is that I would have been fired not because of the content of what I said, but because my words, whatever their content, were uttered in a setting that rendered them inappropriate and even dangerous.

Academic situations fall somewhere in between the baseball stadium and the medical operating theater. But here, too, one must be careful to make distinctions, for even in the context of the academy, where free speech is generally highly valued, you will have more or less free-speech rights depending on what you're doing and where you're doing it. If I am a student, and I begin to say something, and the teacher cuts me off and says that my point is beside the point he or she wishes to pursue, I have no free-speech recourse. On the other side, as an instructor I can conduct my class in any manner I like -- lecture, discussion, group presentations -- and I can assign whatever readings I judge to be relevant to the course's topic. Those are pedagogical choices, and I cannot be penalized for making them.

But if I harass students, or call them names, or make fun of their ethnicity, or if I use class time to rehearse my personal political views or attempt to win students over to them, I might well find myself in a disciplinary hearing, either because I am abusing my pedagogical authority or because I am turning the scene of instruction into a scene of indoctrination. Political persuasion is just not what is supposed to go on in the college classroom, even though it may be going on -- and going on legitimately -- at the noontime rally or in dormitory bull sessions. What you are free to say in some venues you are not free to say in all venues, and your lack of freedom is not a First Amendment matter; it is a matter, rather, of the appropriateness or inappropriateness of certain kinds of speech relative to certain contexts of employment or social interaction.

What that means is that free speech as a general category will be a very crude instrument for assessing what is going on at Harvard and at Colorado. Before yielding to the impulse to yell "Free Speech, Free Speech," we should first ask some questions. Who is doing the speaking? Where is he or she when doing it? Who is paying the freight?

Let's take Churchill first. Churchill, as I have already noted, is a professor at the University of Colorado at Boulder. He is a member of the ethnic-studies department and was its chair until his recent resignation. In his resignation letter, he explained that he was taking that action because, given recent events and the political climate surrounding his writings, he was now "a liability in terms of representing either my department, the college, or the university." Let me say, first, that that is a good reason for resigning, and, moreover, it would have been a good reason for Philip P. DiStefano, interim chancellor, to have demanded Churchill's resignation or removed him on the spot.

The good reason -- we must be clear about this -- is not that Churchill said this or that about the causes and lessons of the September 11 attacks; the good reason is that the reaction to what he said -- a reaction for which he is not responsible and one engineered to some extent by Bill O'Reilly and other conservative pundits -- got in the way of his performing his duties as a department chair. The content of what Churchill wrote is irrelevant to the assessment of his administrative performance, and, indeed, any reference to that content by a senior administrator would be a mistake.

Chancellor DiStefano made that mistake when he said, upon accepting Churchill's resignation, "While Professor Churchill has the constitutional right to express his political views, his essay on 9/11 has outraged and appalled us and the general public." One knows what DiStefano is up to: He is wrapping himself in the flag and mantra of strong First Amendment doctrine; that is, "I despise what you say, but I will defend to the death your right to say it."

But it is not the job of a senior administrator either to approve or disapprove of what a faculty member writes in a nonuniversity publication. It is his job to make the jurisdictional boundaries clear, to say something like, "Mr. Churchill's remarks to the general public about matters of general political concern do not fall under the scope of the university's jurisdiction. He is, of course, free to make them, although one should not assume that in doing so he speaks for the university." Notice that would stop short of disavowing (or embracing) Churchill's remarks. Simply by throwing in the egregious "has ... appalled us," DiStefano has the university coming down on one side of a political question, and he also creates a First Amendment issue where there was none before. If, at a later juncture, Churchill is disciplined or dismissed, he will be able to cite DiStefano's statement as evidence that he is being discriminated against because of the content of his constitutionally protected speech: "They're doing this to me," he can say, "because they don't like -- are appalled by -- my ideas."

Must the university then resign itself to living with Professor Churchill? Can he not be fired? Well, that depends on the answer to the question, Fired for what? The governor of Colorado, many state legislators, and many more outraged citizens want him fired because they intensely dislike and are offended by what he said. They reason, "Given that Colorado is a state university, and we taxpayers finance his salary, why can't we get rid of him if we don't like what he's doing?" The answer is simple: Because the charter of the university does not say (and one cannot imagine it saying), "Faculty members at this institution will express only opinions (inside or outside the classroom) with which the citizens of Colorado, or the legislators of Colorado, or the governor of Colorado agree."

Academic activities, if they really are academic activities, cannot follow, in the sense of track, political trends; unless, that is, we want to go in the direction (symbolized for many by the old Soviet Union) of an academy whose research results are known in advance because they will always support the policies and reigning values of the state. Some in Colorado do seem to want to go in that direction. Interviewed on Chris Matthews's Hardball, Kevin Lundberg, a state representative, said that Churchill should be "held accountable" to "a common sense of values shared by the culture." Only in that way could he exhibit "professional integrity." The reverse is true. If Churchill were to limit his conclusions to those already reached by the culture, he would throw his professional integrity out the window. What Lundberg really wanted him to do is to stop being an academic and instead click his heels in time with the pronouncements of the state legislature.

Nevertheless, that doesn't mean that Churchill can't be disciplined or even fired. The analysis he presented of the September 11 attacks in his controversial essay was part and parcel of an avowedly polemical set of political recommendations, a veritable call to arms. He has every right to issue that call, as long as he doesn't do it in the classroom and, as it were, on the state's dime. While Churchill cannot or should not be disciplined for the political views he urges in his role as a citizen, he can and should be disciplined for urging those views in venues designated as academic and financed as such by state revenues or by tuition.

I am not saying that political matters can never be raised in an academic setting; such a draconian requirement would mean the end of departments of political science, philosophy, sociology, English, criminal justice, and more. I am just saying that when political matters do enter an academic setting, they must do so in academic terms. A few years ago, a national conference was held at my university on an important topic. A flier advertising the conference went out before I saw it. One sentence in that flier began, "Now that we are fighting a racist war in Afghanistan ... " Because the flier carried with it the imprimatur of the University of Illinois at Chicago, it seemed to be the university that was issuing that judgment.

The case would have been entirely different if there had been a list of the conference's panels on the flier, and if one of those panels had been titled, "Are We Fighting a Racist War in Afghanistan?" That would have been perfectly appropriate because it would have identified the question as one that would be debated at the conference: Speakers would give their answers and back up what they said with evidence, and other speakers would give opposing answers and cite alternative bodies of evidence. That's what we do in the academic world, and if Churchill is doing something else (and I don't know that he is), he is taking money under false pretenses, and he should be called to account for it.

There is at least one other possible reason for firing Professor Churchill. It is said by some that he is not really an American Indian, as he makes himself out to be, and that his credentials for that status are limited to a certificate of the kind you might get (and here I date myself) out of a Cracker Jack box. If that is so (and again, I don't know), Churchill is engaged in a professional misrepresentation (he aggressively identifies himself as a Native American in his writings) little different from the misrepresentations of several football and basketball coaches who listed on their résumés degrees from colleges they never attended. The coaches I am thinking of were dismissed, and if this charge against Churchill proves to be true, so could he. Notice that, like the other questions raised by the Churchill affair, that is not a philosophical or a constitutional matter. It is a matter, simply, of falsified credentials. When such a falsification has been documented, the action taken needs no fancy theoretical backup. You just don't want people around who lie to you about who they are and what they've done.

...

It turns out, however, that there is really not much to say about Summers except that he's a public-relations disaster, a walking time bomb likely to detonate at any moment, especially if his handlers let him out of their sight. One can say something about what issues the Summers brouhaha does not raise. It does not raise issues of free speech or academic freedom.

Stanley Kurtz opined in the National Review that Summers's critics have "turned him into a free-speech martyr," but that piece of alchemy could have been performed only if the hapless president had been prevented from speaking or punished by some state authority for the content of his words. In fact, he spoke freely (perhaps too freely), and if he is now suffering the consequences, they are not consequences from which the First Amendment protects him.

The First Amendment says that, in most circumstances, you can't be stopped from saying something and that, in many (but not all) circumstances, the content of what you say cannot be a reason for imprisoning you or firing you. But that doesn't mean that you get a free pass; you are not exempt from criticism; you are not exempt from public ridicule; you are not exempt from being voted out of your country club; and if what you have said causes enough of a ruckus, you are not exempt from being removed from your position, so long as the reason given for your removal is that your words have created conditions such that you can no longer do your job (others will do for you what Churchill did for himself) and not that somebody up there doesn't like their content. There is a big difference between "I don't like what that guy said, and I'm going to fire him" and "I don't like the effects brought about by what he said, and I'm going to fire him." The first raises constitutional issues; the second doesn't. It's just a judgment on job performance.

To be sure, some on the left do want Summers fired because of the content of what he said, while some on the right want him retained (and celebrated) because of the content of what he said. Both sides, then, want, in different ways and for different reasons, to make Summers into a First Amendment martyr and turn this incident into a First Amendment test. But the content of what Summers said is irrelevant to the only question that should be asked: Is he discharging the duties and obligations of his office in a way that protects the reputation of the university and fosters its academic, political, and financial health? There is good reason to answer no, an answer that would flow not from the fact that Summers said this or that about women in science, but from the fact that, whatever he said, he said it in a way that brought Harvard weeks, and now months, of hostile publicity, led some alumni to announce that they would never give a penny to the institution, probably led many senior female scientists to cross Harvard off their lists, and gave late-night comedians and independent pundits like me a new target. That's not exactly what you want on the résumé of your chief executive officer.

Defenders of Summers usually take two (related) tacks. They say, first, that he is an intellectual pathbreaker, and that (I quote from a particularly smarmy and pious editorial in the Chicago Tribune) his "comments were in the best tradition of free intellectual inquiry." Not unless the best traditions of intellectual inquiry include opening up your big mouth to pronounce publicly on matters far from your area of expertise. Richard A. Posner, the conservative jurist and law professor and sometime Harvard University Press author, points out (on his blog) that, since Summers has no credentials in the history of science or the field of gender discrimination, the odds of his contributing anything valuable to the discussion of women and science were low, while, on the other hand, the odds that he would misstep in some way were high. On a cost/benefit analysis, then, speaking up as he did was a bad idea.

The second line of defense begins by acknowledging that Summers wasn't exactly on familiar ground and was talking off the top of his head (with a little help from some members of his faculty), but contrives to make his ignorance a virtue: He wasn't offering scholarship or long-considered arguments; he was keeping the pot boiling; he was adding to the liveliness of the occasion; he was being (and this is the word Summers himself has used in his many apologies) "provocative." But being provocative is not in the job description; being provocative may be a qualification for a classroom teacher, or the host of a talk-radio show, or a backbencher in Parliament, but it is hardly first on the list of the qualities you look for when interviewing candidates for the presidency of a university...

It is not the first time. From the early days of his tenure as president, Summers has been making the wrong kind of headlines; wrong not because of his views, but because of the lack of tact with which he has announced and deployed them. One of those he bumped up against in a flap, whose reverberations have not yet subsided, was Cornel West, then at Harvard, now at Princeton University. The events of the past months gave West the delicious opportunity to speak more in sorrow than in anger. "I was praying for the brother, hoping he would change," West said, but then added, "It's clear he hasn't changed." Still and all, West acknowledged, there's a bright side to look on, for it's "good to see the faculty wake up." I guess, West concluded, "the chickens have come home to roost."

Man, those chickens are working overtime these days.

Stanley Fish is dean emeritus of the College of Liberal Arts and Sciences at the University of Illinois at Chicago.

When Students Complain About Professors, Who Gets to Define the Controversy?

The media storm around Columbia's Middle Eastern-studies department provides one of the few cases in which students' complaints about professors' classroom conduct have made it into the news. It brings to mind a case at Harvard, more than a decade ago, that offers illuminating contrasts. Together they raise the question of how the news media frame stories about such complaints.

The Columbia story by now is familiar: After students objected to what they saw as anti-Israel bias among professors in the Middle Eastern-studies department, the university was widely criticized as a place where students were intimidated, faculty members were prejudiced, and scholarly standards were in decline. And when a faculty committee appointed by the administration concluded that there had been no serious misconduct, most of the news media rejected that conclusion and demanded additional action by the university.

The Harvard story, in contrast, has been largely forgotten, except among some conservative writers. When a few students complained in 1988 about "racial insensitivity" in a lecture by the history professor Stephan Thernstrom, news organizations rose to his defense by describing him as a victim of "political correctness." Thernstrom's high-profile outrage made him a hero in neoconservative circles, and in 2002 he was appointed a member of the National Council on the Humanities by President Bush.

Pundits on the right often complain that the left dominates American universities. Both of these stories were framed to advance that interpretation. At Harvard, the story was that the professor was a victim of left-wing students; at Columbia, the students were victims of left-wing professors. In each case, news reports said that the threats to the university were coming from the left. In each case, the story told to the public was inaccurate.

In the Columbia case, The New York Times, the Daily News, the New York Post, and The New York Sun (a conservative-minded daily newspaper) each published several articles reporting that pro-Israel students had complained they were treated abusively by some faculty members in class discussions as well as outside of class.

A Barnard College student said that in a class discussion she had asked Joseph A. Massad, a professor of Arab politics, whether Israel gave advance warning before bombing a Palestinian building, and that he had replied angrily, "If you're going to deny the atrocities being committed against Palestinians, then you can get out of my classroom!" He later denied ever telling any student to leave his class, but the faculty committee found the complaint "credible." Massad in turn complained that his classes had been infiltrated by hecklers and "monitors," and that he had received hate mail and death threats.

In the Harvard case, The New Republic, The New York Review of Books, and New York magazine featured Thernstrom's story as told in one of the key neocon books of the decade, Dinesh D'Souza's Illiberal Education (Free Press, 1991). In The New Republic, the historian Eugene D. Genovese wrote that Thernstrom had been "savaged for political incorrectness in the classroom." A cover story in New York magazine featured Thernstrom as a victim of "demagogic and fanatical" black students. In the New York Review, the Yale historian C. Vann Woodward cited the case as an example of "the attack on freedom ... led by minorities."

The story told in the news media was that three black students had accused Thernstrom, a distinguished historian, of racial insensitivity in an introductory history course, "The Peopling of America." Instead of coming to him with their complaints, Thernstrom said, they went to the administration and to the student newspaper, The Harvard Crimson. The greatest damage to Thernstrom, D'Souza said, was done not by the black students, but by the Harvard administration: "Far from coming to his defense," D'Souza wrote, the administration "appeared to give full administrative sanction to the charges against Thernstrom."

Thernstrom said he was so discouraged by the students' attack and the administration's failure to defend his academic freedom that he decided not to teach the course again. Thus the case was framed by the news media as an example of a distinguished professor's being hounded out of teaching his course by an alliance of militant black students, the campus newspaper, and the administrators who supported them. Thernstrom called it "McCarthyism of the left."

In fact, almost every element of the story Thernstrom told the news media was erroneous. The incident in question consisted of three black students' complaining about the absence of a black perspective in a lecture on slavery. Thernstrom's response focused primarily on the administration.

Because he received so little support from the administration, he told D'Souza, "I felt like a rape victim." But, in fact, the administration backed up Thernstrom. When the students took their complaint to Harvard's Committee on Race Relations, they were told that it had no jurisdiction over professors' teaching, and that they should take their complaint to Thernstrom -- which they did. "They felt the university didn't do anything to back up their concerns," the former dean, Fred Jewett, told me in a 1991 interview for The Nation.

Nevertheless the Thernstrom version of the story lives on. As recently as February 2005, Michael A. Ledeen, of the American Enterprise Institute for Public Policy Research, wrote that "'freedom of speech' on most major university campuses nowadays is a fraud. When America's greatest living historian of the antebellum South, Stephan Thernstrom [of Harvard], is prevented from teaching that course ["The Peopling of America"] because black students protest against a white man teaching it, you know that free speech is over."

The students' complaints in both cases share the same problem: As Eric Foner, a historian at Columbia, wrote in a letter last month to The New York Times, professors who do not treat students fairly should be reprimanded; but when students encounter ideas they disagree with, that does not constitute grounds for a complaint to the university.

The faculty responses in the two cases also had similarities: At Harvard, Thernstrom declared that he would give up teaching the course because the university had failed to stand up for him. At Columbia, Massad declared that he would no longer teach the course because the university would not "ensure my rights and protect me against intimidation." Both thus claimed that they were being silenced as the result of the administration's failures in the face of students' complaints.

Why, then, did the news media frame the two cases so differently? At Columbia, the complaining students had the backing of a well-financed pro-Israel organization, the David Project, which organized a sophisticated media campaign. With other pro-Israel groups, such as Campus Watch, they used the Internet to solicit student complaints in an organized national effort aimed not just at Massad but at the entire department of Middle Eastern studies at Columbia, as well as other Middle Eastern-studies programs across the United States.

The groups sent "monitors" into classrooms to report on what professors were saying. The David Project financed a video documentary on the student charges, "Columbia Unbecoming," and ran a public-relations effort to publicize it, screening the video for selected journalists. In contrast, the three students at Harvard who complained had no support from outside groups or the media: no documentary, no Web site, no national support network. They were on their own.

Thus political forces outside the two universities played key roles in shaping what the public was told about the cases. The media campaign charging "anti-Israel bias" at Columbia gained political traction, I believe, because the university has been hoping to expand its campus, which requires city approval. A mayoral race was beginning, and one of the candidates made Columbia his issue, promising to "do something" about it. Meanwhile, a new, right-wing daily newspaper had begun publication, and it fanned the flames by running dozens of stories about the "crisis" at Columbia. Columbia's president failed for months to speak out in defense of academic freedom, perhaps because he feared that the expansion project would be blocked by elected officials.

A decade earlier, in Cambridge, only a few people came to the defense of the three black students who wanted more of the slaves' perspective in a lecture on slavery. On the contrary, a sophisticated and well-financed media campaign distorted the incident mercilessly to advance the neoconservative cause. The key activist here was D'Souza, the finest flower of a vast neocon talent search supported by foundations and think tanks. After 10 years of cultivating young ideologues, the John M. Olin Foundation and the American Enterprise Institute finally got everything they could have hoped for in D'Souza and his book: a best seller attacking the campus left, and, best of all, a right-wing book written by a young person of color.

The news media, for their part, like stories that can be framed as controversies, especially when the stakes seem to be so high: nothing less than freedom in the university. Still, these controversies could have been described differently.

At Columbia the issue could have been defined, in the words of Joan W. Scott of the American Association of University Professors, as "the threat to the integrity of the university by the intervention of organized outside agitators who are disrupting classes and programs for ideological purposes." Instead the issue became professors' "anti-Israel bias."

At Harvard the issue could have been a professor's overreacting to students' disagreement with one of his lectures. But it came to be defined as the victimization of the professor by the forces of "left-wing McCarthyism." The key was not the nature or seriousness of the complaints, but rather the political forces outside the university that defined the issues at stake.

Jon Wiener is a professor of history at the University of California at Irvine and author of Historians in Trouble: Plagiarism, Fraud and Politics in the Ivory Tower (The New Press, 2005).

Friday, May 6

Learn how to learn

By THOMAS L. FRIEDMAN

There's a huge undertow of worry out in the country about how our kids are being educated and whether they'll be able to find jobs in an increasingly flat world, where more Chinese, Indians and Russians than ever can connect, collaborate and compete with us. In three different cities I had parents ask me some version of: "My daughter [or son] is studying Chinese in high school. That's the right thing to do, isn't it?"

Not being an educator, I can't give any such advice. But my own research has taught me that the most important thing you can learn in this era of heightened global competition is how to learn. Being really good at "learning how to learn," as President Bill Brody of Johns Hopkins put it, will be an enormous asset in an era of rapid change and innovation, when new jobs will be phased in and old ones phased out faster than ever.

O.K., one ninth grader in St. Paul asked me, then "what courses should I take?" How do you learn how to learn? Hmm. Maybe, I said, the best way to learn how to learn is to go ask your friends: "Who are the best teachers?" Then - no matter the subject - take their courses. When I think back on my favorite teachers, I no longer remember much of what they taught me, but I sure remember being excited about learning it.

What has stayed with me are not the facts they imparted, but the excitement about learning they inspired. To learn how to learn, you have to love learning - while some people are born with that gene, many others can develop it with the right teacher (or parent).

There was a great piece in the April 24 Education Life section of The New York Times that described Britney Schmidt, a student at the University of Arizona who was utterly bored with her courses, mostly because her professors seemed interested only in giving lectures and leaving. "I was getting A's in all my classes, but I wasn't being challenged, and I wasn't thinking about new things," she said.

She had to take a natural science course, though, and it turned out to have a great professor and teaching assistants, who inspired her. "I was lucky," she said. "I took a class from somebody who really cared." The result: a scientist was born. Ms. Schmidt has since been accepted to graduate school at U.C.L.A. in planetary physics and the University of Chicago in cosmo-chemistry.

I just interviewed Craig Barrett, the chief executive of Intel, which has invested millions of dollars in trying to improve the way science is taught in U.S. schools. (The Wall Street Journal noted yesterday that China is graduating four times the number of engineers as the U.S.; Japan, with less than half our population, graduates double the number.)

In today's flat world, Mr. Barrett said, Intel can be a totally successful company without ever hiring another American. That is not its desire or intention, he said, but the fact is that it can now hire the best brain talent "wherever it resides."

If you look at where Intel is making its new engineering investments today, he said, it is in China, India, Russia, Poland and, to a lesser extent, Malaysia and Israel. While cutting-edge talent is still being grown in America, he added, it's not enough for Intel's needs, and not enough is being done in U.S. public schools - not just to leave no child behind, but to make sure that the best students and teachers are nurtured and rewarded.

Monday, May 2

bad conscience

The ethical philosopher Hannah Arendt speculated that "the activity of thinking as such" could be "among the conditions that make men abstain from evil-doing," a hypothesis reinforced by everything we know about conscience, namely, that "a 'good conscience' is enjoyed, as a rule, only by really bad people... while only 'good people' are capable of having a bad conscience."

Why David Horowitz is Wrong about "Balance" in the Academy

I did not know about David Horowitz's "academic bill of rights" when I began teaching my courses last fall, but even if I had, I would not have thought that I had anything to fear from it.

I teach religious studies at a public university in a conservative part of the nation -- not too different from the traditionally Republican state where I grew up. When I arrived here 16 years ago, I had no trouble adapting to the conservative religious background of many of my students. In graduate school I was more religiously and socially conservative than most of my fellow students. But although I have had my differences with liberals, I never felt that they forbade me to express an informed professional opinion. The chilling effect of today's conservative watchdogs is a much more serious matter.

Last semester I had my first significant falling-out with students, inspired -- I have no doubt -- by David Horowitz and his crusade against liberal bias in academe. Some of the students in my course on "Religion in American Culture" were upset that George M. Marsden's Religion and American Culture (2nd ed., Harcourt, 2001) and Randall Balmer's Mine Eyes Have Seen the Glory: A Journey Into the Evangelical Subculture in America (3rd ed., Oxford University Press, 2000) were on the reading list. They felt that those two books were biased against evangelicals.

Marsden is a highly respected evangelical scholar, and Balmer's work on evangelicals has also been highly acclaimed. Although his religious affiliation is not as clear as Marsden's, I had never before heard complaints that he has been unfair to the evangelicals about whom he writes. I would have thought that the two scholars had impeccable credentials for inclusion in my course, but I now suspect that the objective, scholarly tone of the books upset my students.

I had also assigned some online readings about Christian Identity, a white-supremacist movement that considers Jews and anyone who is not white to belong to inferior races; believes that anything -- e.g., feminism and homosexuality -- not in accordance with traditional gender roles is sinful; and claims to be based on the Bible. Those readings were part of a series of items about Protestant, Catholic, Nation of Islam, American Indian, and other visions of America.

In the session that I had set aside for discussion of the Christian Identity readings, a student asked me if I would have included them had I known how many students believed in the movement. I had not expected many, if any, of my students to be affiliated with Christian Identity, so I had not prepared a response to that question. I think I said something to the effect that I did not fear for my life from the group because I was a white person who was neither a feminist nor a lesbian. (There have been reports of violence associated with Christian Identity.)

About two-thirds of my students did not return to class after that day, which was around the midpoint of the semester, except to take exams. Because I never had an opportunity to discuss the matter with the students who left, I don't know if they were members of Christian Identity, or if they simply believed that a movement that claimed to be based on the Bible could not be wrong. I had never had a large-scale problem with attendance before.

I had another problem with my course on the New Testament in the fall, also unprecedented in my teaching career. I had not included any discussion of homosexuality and the Bible in the syllabus, which was already crowded thanks to the requirements placed on general-education courses by the state Board of Regents, piled on top of the disciplinary imperative of explaining academic methods of studying the Bible and applying them to the New Testament. But when students requested that we take up homosexuality, I did what I normally do when students show a particular interest in something: I modified the syllabus to include it.

We read a Jewish scholar's interpretation of several passages in the Bible for a Jewish view on the subject. I invited a Reformed Church minister to speak to the students, and he explained why there is plenty of room for debate on the question of how Christians should respond to their homosexual brethren. It turned out, however, that at least one student had a specific book in mind for us to discuss: The Bible and Homosexual Practice (Abingdon Press, 2001), by Robert A.J. Gagnon.

Although the students realized they were not academically advanced enough to read the book on their own, they wanted to know what I thought about it. The semester was rapidly drawing to a close. Not having time to read the entire book, I promised to take a look at Gagnon's discussion of Romans 1:24-27, the most explicit and substantial discussion of homosexual behavior in the New Testament.

After I read that part of the book, I told the students that I thought the linguistic work was excellent, and that the linkage of Paul's views on homosexual behavior to his remarks on idolatry was brilliant. However, I did not think that Gagnon's argument would stand up for long.

I was about to explain why when I saw anger flash across several of the students' faces, and I realized that they thought my explanation was going to echo the beliefs of the minister they had already heard, whom they considered a liberal. So I simply said that anyone who wanted to know what flaws I saw in Gagnon's argument would have to come talk to me about that outside of class. No one did.

For the first time in my life, I felt as if I had to leave my commitment to the truth (which is what scholarship is all about!) at the door of the classroom. I didn't feel that I could tell my students they were wrong to avoid hearing my explanation -- in the current political climate, that would have been considered both anti-Republican and insulting to their conservative religious beliefs.

I have to believe that my students' behavior is a direct result of the new political climate on the campus that has been nurtured by the Horowitz "academic bill of rights," in cooperation with conservative media. I do not think that Horowitz intended those results. The problem is that students do not have the academic maturity to know how to use his document.

Nor do I see how they could have that maturity before completing a liberal-arts program of studies. Taking a smattering of liberal-arts courses, which is all that most students are required to do, does not give students the ability to detect bias in their professors or in what they read. Furthermore, many students take their definition of bias from conservative talk-radio shows and Fox News -- even people considered to be moderates from a liberal viewpoint seem biased from such conservative perspectives.

It seems that I must now bow to political or popular pressure because the ultimate judges of my professional expertise will not be my scholarly peers, but the public. And while members of the public and students may be able to judge many aspects of my teaching (that is why we have student evaluations of professors), they cannot judge whether I am teaching according to the best standards of the discipline.

Politics has always played a role on our campuses, but we are now experiencing a new form of political intrusion in academic life, and it is extremely dangerous. It has a direct impact on academic freedom because it threatens professors -- with the loss of the usual presumption that they are experts in their subject matter, or even with the loss of employment, if they do not agree with popular opinions.

That is too high a price for me to pay to keep my job, and I have resolved never again to bow to religious or political pressure in the classroom. In the future I will send students to the Internet to view authors' credentials. When I next teach the New Testament, I will use the disagreement between Gagnon and myself to demonstrate that scholarly debate -- unlike political debate, in which each side is expected to be partisan -- is a way of systematically testing the beliefs of both sides, and that my job is to critically assess all the arguments, from within my area of expertise.

Like many other academics, I have dedicated my life to the faithful transmission of the truth as best I can discern it. It makes me sick to my stomach to think of falsifying the truth, or even sacrificing my right to have an informed professional opinion.

Ann Marie B. Bahr is a professor of philosophy and religion at South Dakota State University. She is editor of Chelsea House's "Religions of the World" series and author of two books in the series, Christianity (2004) and Indigenous Religions (2005).

Sunday, May 1

Israel and Palestine and news

When people have strong, emotional attitudes, they project these attitudes onto other people - and particularly the news media.

To examine this phenomenon, I developed a videotape composed of actual footage of approximately equal amounts of violence committed by Israel and the Palestine Liberation Organization. Pro-Israel partisans perceived that the news was biased against Israel and in favor of the P.L.O. Palestinian activists believed the same news coverage was slanted against the P.L.O. and in favor of Israel. Colleagues at Stanford, New York University and the University of Wisconsin have reported similar results in studies of the Middle East and other controversial issues. They call this "hostile media bias."

What happens psychologically is that strong partisans "see" only those portions of the news that dispute their view of reality because this either attracts their attention, is intensely memorable or excites their emotions to a greater degree than media coverage that supports their worldview. The objective content of news remains elusive. Truth is in the eye of the media beholder.

RICHARD M. PERLOFF
Cleveland, April 27, 2005
The writer is the director of the School of Communication, Cleveland State University.