July 31, 2006, NYT
By LOUIS UCHITELLE and DAVID LEONHARDT
ROCK FALLS, Ill. — Alan Beggerow has stopped looking for work. Laid off as a steelworker at 48, he taught math for a while at a community college. But when that ended, he could not find a job that, in his view, was neither demeaning nor underpaid.
So instead of heading to work, Mr. Beggerow, now 53, fills his days with diversions: playing the piano, reading histories and biographies, writing unpublished Western potboilers in the Louis L’Amour style — all activities once relegated to spare time. He often stays up late and sleeps until 11 a.m.
“I have come to realize that my free time is worth a lot to me,” he said. To make ends meet, he has tapped the equity in his home through a $30,000 second mortgage, and he is drawing down the family’s savings, at the rate of $7,500 a year. About $60,000 is left. His wife’s income helps them scrape by. “If things really get tight,” Mr. Beggerow said, “I might have to take a low-wage job, but I don’t want to do that.”
Millions of men like Mr. Beggerow — men in the prime of their lives, between 30 and 55 — have dropped out of regular work. They are turning down jobs they think beneath them or are unable to find work for which they are qualified, even as an expanding economy offers opportunities to work.
About 13 percent of American men in this age group are not working, up from 5 percent in the late 1960’s. The difference represents 4 million men who would be working today if the employment rate had remained where it was in the 1950’s and 60’s.
Most of these missing men are, like Mr. Beggerow, former blue-collar workers with no more than a high school education. But their ranks are growing at all education and income levels. Refugees of failed Internet businesses have spent years out of work during their 30’s, while former managers in their late 40’s are trying to stretch severance packages and savings all the way to retirement.
Accumulated savings can make dropping out more affordable at the upper end than it is for Mr. Beggerow, but the dynamic is often the same — the loss of a career and of a sense that one’s work is valued.
“These are men forced to compete to get back into the work force, and even then they cannot easily reconstruct what many lost in a former job,” said Thomas A. Kochan, a labor and management expert at the Sloan School of Management at Massachusetts Institute of Technology. “So they stop trying.”
Many of these men could find work if they had to, but with lower pay and fewer benefits than they once earned, and they have decided they prefer the alternative. It is a significant cultural shift from three decades ago, when men almost invariably went back into the work force after losing a job and were more often able to find a new one that met their needs.
“To be honest, I’m kind of looking for the home run,” said Christopher Priga, who is 54 and has not had steady work since he lost a job with a six-figure income as an electrical engineer at Xerox in 2002. “There’s no point in hitting for base hits,” he explained. “I’ve been down the road where I did all the things I was supposed to do, and the end result of that is nil.”
Instead, Mr. Priga supports himself by borrowing against the rising value of his Los Angeles home. Other men fall back on wives or family members.
But the fastest growing source of help is a patchwork system of government support, the main one being federal disability insurance, which is financed by Social Security payroll taxes. The disability stipends range up to $1,000 a month and, after the first two years, Medicare kicks in, giving access to health insurance that for many missing men no longer comes with the low-wage jobs available to them.
No federal entitlement program is growing as quickly, with more than 6.5 million men and women now receiving monthly disability payments, up from 3 million in 1990. About 25 percent of the missing men are collecting this insurance.
The ailments that qualify them are usually real, like back pain, heart trouble or mental illness. But in some cases, the illnesses are not so serious that they would prevent people from working if a well-paying job with benefits were an option.
The disability program, in turn, is an obstacle to working again. Taking a job holds the risk of demonstrating that one can earn a living and is thus no longer entitled to the monthly payments. But staying out of work has consequences. Skills deteriorate, along with the desire for a paying job and the habits that it requires.
“The longer you stay on disability benefits,” said Martin H. Gerry, deputy commissioner for disability and income security at the Social Security Administration, “the longer you’re out of the work force, the less likely you are to go back to work.”
As a rule, out-of-work men are less educated than the population as a whole. Their numbers have grown sharply among black men and men who live in hard-hit industrial areas like Michigan, West Virginia and upstate New York, as well as those who live in rural states like Mississippi and Oklahoma.
The missing men are also more likely to live alone. Nearly 60 percent are divorced, separated, widowed or never married, up from 50 percent a decade earlier, the Census Bureau reports. Sometimes women who are working throw out men who are not, says Kathryn Edin, a sociologist at the University of Pennsylvania. In any case, without a household to support, there is less pressure to work, and for men who fall behind on support payments, an incentive exists to work off the books — hiding employment — so that wages cannot be garnisheed.
“What happens to a lot of guys who become unmoored from family life, they become unmoored from everything,” Ms. Edin said. “They are just living without attachments and by the time they are 40 or 50 years old, the things that kept these men from falling away — family and community life — are gone.”
Even as more men are dropping out of the work force, more women are entering it. This change has occurred partly because employment has shrunk in industries where men predominated, like manufacturing, while fields where women are far more common, like teaching, health care and retailing, have grown. Today, about 73 percent of women between 30 and 54 have a job, compared with 45 percent in the mid-1960’s, according to an analysis of Census data by researchers at Queens College. Many women without jobs are raising children at home, while men who are out of a job tend to be doing neither family work nor paid work.
Women are also making inroads in fields where they were once excluded — as lawyers and doctors, for example, and on Wall Street. Men still make significantly more money than women, but as women become more educated than men, even more men may end up out of the work force.
At the low end of the spectrum, men emerging from prison with felony records are not easily absorbed into steady employment. Hundreds of thousands of young men were jailed in the 1980’s and 1990’s, in a surge of convictions for drug-related crimes. As prisoners, they were not counted in the employment data; as ex-prisoners they are. They are now being freed in their 30’s and 40’s and are struggling to be hired. Roughly two million men in this group have prison records, according to a calculation by Richard Freeman and Harry J. Holzer, labor economists at Harvard and the Urban Institute, respectively. Many of these men do not find work because of their records.
Despite their great numbers, many of the men not working are missing from the nation’s best-known statistic on unemployment. The jobless rate is now a low 4.6 percent, yet that number excludes most of the missing men, because they have stopped looking for work and are therefore not considered officially unemployed. That makes the unemployment rate a far less useful measure of the country’s well-being than it once was.
Indeed, a larger share of working-age men are not working today than at almost any point in the last half-century, which raises the question of how they will get by as they age. They may be forced back to work after years of absence, they may fall into poverty, or they may be rescued by the government. This same trend is evident in other industrialized countries. In the European Union, 14 percent of men between 25 and 54 were not working last year, up from 7 percent in 1975, according to the Organization for Economic Cooperation and Development. Over the same period in Japan, the proportion of such men rose to 8 percent from 4 percent.
In these countries, too, decently paying blue-collar jobs are disappearing, and as they do men who held them fall back on government benefits for income. But the growth of subsidies through federal and state programs like disability insurance has happened largely without notice in this country while it is a major topic of political debate in Europe.
“We have a de facto welfare system as Europe does,” said Teresa Ghilarducci, a labor economist at the University of Notre Dame. “But we are not proud of it, as they are.”
Reading, Sleeping, Scraping By
Alan Beggerow has not worked regularly in the five years since the steel mill that employed him for three decades closed. He and his wife, Cathleen, 47, cannot really afford to live without his paycheck. Yet with her sometimes reluctant blessing, Mr. Beggerow persists in constructing a way of life that he finds as satisfying as the work he did only in the last three years of his 30-year career at the mill.
The trappings of this new life surround Mr. Beggerow in the cluttered living room of his one-story bungalow-style home in this half-rural, half-industrial prairie town west of Chicago. A bookcase covers an entire wall, and the books that Mr. Beggerow is reading are stacked on a glass coffee table in front of a comfortable sofa where he reads late into the night — consuming two or three books a week — many more than in his working years.
He also gets more sleep, regularly more than nine hours, a characteristic of men without work. As the months pass, they average almost nine-and-a-half hours a night, about 80 minutes more than working men, according to an analysis of time-use surveys by Harley Frazis and Jay Stewart, economists at the Bureau of Labor Statistics.
Very few of the books Mr. Beggerow reads are novels, and certainly not the escapist Westerns that he himself writes (two in the last five years), his hope being that someday he will interest a publisher and earn some money. His own catholic tastes range over history — currently the Bolshevik revolution and a biography of Charlemagne — as well as music and the origins of Christianity.
He often has strong views about what he has just read, which he expresses in reviews that he posts on Amazon.com: 124 so far, he said.
Always on the coffee table is a thick reference work, “Guide to the Pianist’s Repertoire” by Maurice Hinson. Mr. Beggerow is a serious pianist now that he has the time to practice, sometimes two or three hours at a stretch. He does so on an old upright in a corner of the living room, a piano he purchased as a young steelworker, when he first took lessons.
His new life began in the spring of 2001 with the closing of Northwestern Wire and Steel in Sterling, Ill., where he had worked since 1971. During the last three of those 30 years, Mr. Beggerow found himself assigned to work he really liked: as a union representative on union-management teams that assessed every aspect of the plant’s operations.
What made him valuable was his dexterity as a writer. No one could put together committee reports as articulately as he did, and he found himself on nearly every team. His salary rose to $50,000. During those years, he taught himself more math, too, to help in the analyses of the issues that the teams tackled: productivity, safety, plant layout and the like.
“I actually loved that job,” he said. “I even looked forward to going to work. The more teams they had, the more they found out what I could do and the more I found out what I could do.”
Mr. Beggerow would take another job in a heartbeat, he says, if it were like the work he did in those last three years at Northwestern. The closest he has gotten has been as an instructor at a community college, teaching plant maintenance and other useful factory skills. His students were from nearby manufacturing companies, which subsidized the courses, including his pay of $45 an hour. But factory operations in the area are shrinking, and Mr. Beggerow has not had a teaching stint since November.
Like Mr. Beggerow, the great majority of the missing men are out of the work force for months or years at a time rather than drifting in and out of jobs. There appears to have been no rise since the 1960’s in the percentage of men out of work for short periods, according to research by Chinhui Juhn, a University of Houston professor, and other economists.
Mr. Beggerow will not take a lesser job, he says, because of his bitter memories of earlier years at Northwestern Wire, particularly the 1980’s, when the industry was in turmoil. A powerful man, over 6 feet and 200 pounds, he worked then as a warehouseman.
What got to him was not the work. It was the frequent furloughs, the uncertainty whether he would be recalled, the mandatory overtime and 50-hour weeks often imposed when he did return, the schedules that forced him to work every holiday except Christmas, and then, as rising seniority finally gave him some protection, a six-month strike in 1983 followed by a wage cut. His pay shrank to $13 an hour from $17, a loss he did not fully recover until those last three years.
“I was always thinking if there was some way I could get out of this, do something else,” Mr. Beggerow said. “What made me so upset was the insecurity of it all and the humiliation. I don’t want to take a job that would put me through that again.”
Shortly after Northwestern closed, Mr. Beggerow married. It was his third marriage, and also Cathleen’s third. He has one adult child by the first wife; Cathleen has no children. For six months they lived on his $12,000 from a shrunken pension and her $28,000 as a factory worker — until severe injuries in an auto accident five months after their wedding forced her out of that job. She eventually qualified for $12,000 a year in disability insurance.
Their two incomes are not enough to cover expenses, which bothers Mrs. Beggerow, although not enough to badger her husband to take a job, any job. She respects him too much for that, she says.
Instead, she finds ways to make money herself, in activities she enjoys. She is taking in work as a seamstress, baking pastries for parties and selling merchandise for others on eBay, collecting a fee. Still, she says, she hopes to land a part-time clerical job. “The comfort of a paycheck every week would take a load off my mind,” she said.
While she is tolerant of her husband’s reluctance to work, respecting his current pursuits, she is not above looking for a job he would consider suitable. “I look at the employment ads every day,” she said, “and every so often I find one that I think might be right up his alley.”
Less Concern About the Future
Recently there was an opening for an editor-writer at a small travel magazine published in a nearby town. “I applied,” Mr. Beggerow said, “but the publisher did not seem to want someone my age.”
Meanwhile the Beggerows’ savings are shrinking. This year, for the first time, they have drawn down so much from their 401(k)’s that they have been forced to pay early-withdrawal penalties. But Mr. Beggerow resists being stampeded.
“The future is always a concern, but I no longer allow myself to dwell on it,” he said, waving aside, in his new and precarious life, the preparations for retirement and old age that were a feature of his 30 years as a steelworker.
“When you are in the mode of having money coming in,” he explained, “naturally you think about planning and saving. And then when you don’t have the money coming in, you think less about the future, at least money-wise. It is still a concern, but not a concern that keeps me up at night, not in this life that I am now leading.”
Men like Mr. Beggerow, neither working nor looking for a job, also have become more common in the popular culture, making the phenomenon more acceptable. On the television show “Seinfeld,” Cosmo Kramer, who did not work, and George Costanza, who regularly lost jobs, were beloved figures. Personal-finance magazines, whose circulations have grown rapidly over the last 25 years, also encourage not working — by telling readers how to afford retirement at 50 and by painting not working as the good life, which it apparently is for a small number of wealthy men. About 8 percent of non-working men between 30 and 54 lived in households that had more than $100,000 of income in 2004.
“Men don’t feel a need to be in a career, not as much as they once did,” said Ruth Milkman, a sociologist at the University of California at Los Angeles. “Nor do men have the incentive they once had to pursue a career, not when employers are no longer committed to them.”
Mr. Priga, the former Xerox engineer who lives in Los Angeles, has been wandering in this latter Diaspora. He is a tall, thin man with a perpetually dour expression. His dress — old jeans and a faded khaki shirt — seemed out of place in the upscale Beverly Hills restaurant where he was interviewed for this article. But his education and skill were not out of place.
Mr. Priga is an electrical engineer skilled in computer technology, and much involved, as he tells the story, in writing early versions of Internet and e-mail software for banks and other companies. A divorce in 1996 left him with custody of his three children. One of them had behavioral problems and to care for the boy he dropped out of steady work for a while, mortgaging his house to raise money and designing Web sites as a freelancer.
He re-entered the work force in 2000, joining Xerox at just over $100,000 a year as a systems designer for a new project, which did not last. In the aftermath of the dot-com bust, Xerox downsized and Mr. Priga was let go in January 2003.
“I’ve been through a lot of layoffs over the years, and there is a certain procedure you follow,” he said. “You contact the headhunters. You go looking for other work. You do all of that, and this time around it didn’t work.”
So he went back to designing Web sites as a freelancer, postponing the purchase of health insurance. No work has come his way since March, and even if people had hired him to design Web sites for them, Mr. Priga would not consider that real employment.
His father is his standard. At Mr. Priga’s age, 54, “my father was with Rockwell International designing the fiber optic backbone for U.S. Navy ships,” he said. “He got a regular paycheck. He had retirement benefits, medical benefits, all of that. I’m at that age and I don’t see that as even possible. I’ve kind of written off the idea completely. I’m more like a casual laborer.”
The Bureau of Labor Statistics determines who is working through a monthly survey of 65,000 representative households. People are asked if they did any work for pay in the week before the survey, including self-employment. For Mr. Beggerow and Mr. Priga, the answer has been no.
From Prison to Joblessness
The same goes for Rodney Bly, a 41-year-old Philadelphia man struggling with a prison record, although he has had income — from off-the-books work that he refuses to think of as employment.
Mr. Bly, a lanky, neatly dressed six-footer, was in and out of jail, mostly on drug convictions, from 1996 until 2003, but has been clean since then, he said in an interview last month. He has even been a leader of an Alcoholics Anonymous-style group of former addicts who meet regularly and do their best to stay off drugs and out of jail.
Mr. Bly has been living in a recovery shelter for addicts and shows up occasionally for meals at St. Francis Inn, a soup kitchen and health clinic in a poor North Philadelphia neighborhood that tries to help ex-convicts get work and keep it.
He has worked pretty regularly, distributing flyers. But that brings him only $270 a week, most of which goes to the shelter for rent, utilities and food. More to the point, the work is off the books, which makes Mr. Bly invisible in the national statistics as a member of the work force.
Still, he has a girlfriend, reports Karen Pushaw, a staff member at St. Francis, “and that grounds him, keeps him looking for legitimate work.”
Ms. Pushaw tries to help. At her encouragement, he applied for 25 jobs this spring but received no offers, not even an interview. The obstacle is two felony convictions, one for car theft, the other for three instances of drug possession.
“Because of the two felonies, I can’t get a job as a security guard or a sales person or a short-order cook,” Mr. Bly said. “I can be a pot washer or a dish washer, but I can’t get a job that pays more than $8 an hour, not a legitimate one. I’m excluded.”
Amanda Cox contributed reporting for this article from New York.
Monday, July 31
The "intellectual property" morass
Susan Bielstein's wonderful "Permissions: A Love Story", just out from University of Chicago Press, takes on this issue from the perspective of art history and visual culture. -ed
Writers Who Price Themselves Out of the Canon
By KEVIN J.H. DETTMAR
Arguments about the shape and nature of the literary canon — at least the interesting, thoughtful, impassioned arguments of a decade ago — seem to have quieted down somewhat of late. Though we've by no means crossed the Jordan into some kind of post-canon promised land, certain premises of the 1990s canon wars are now more or less taken for granted by both sides: for instance, that canons are always to some extent political, and in need of constant surveillance. With so many books and so little time, we will always be vetting and winnowing in our roles as teachers and scholars.
The muting of arguments about the canon now highlights a more subtle influence on what our students read: the high cost of obtaining permission to reprint copyrighted material.
If my experience is any indication, we teachers tend to open up a new anthology and do a quick survey of what's in and what's out. We may be tempted to chalk up any divergence from our ideal Table of Contents to taste, politics, or even space; all of those certainly are factors. But so is cost. Some authors — and their agents or executors and estates — are concerned about privacy and understandably limit access to certain unpublished, personal materials; but some seem to be promoting only their own financial interests, trading long-term literary reputation for short-term profit.
Privacy is the battle cry of Stephen James Joyce, the author's grandson and literary executor. In a profile of him in a recent issue of The New Yorker, Mr. Joyce makes it clear that he's still angry about the violation of privacy involved in the 1975 publication of some erotic letters his grandparents exchanged, and his oversight of Joyce's literary estate now seems largely dictated by his resentment over that breach. But vigilant defense of his grandfather's privacy has gradually slid into a crusade to burnish Joyce's reputation, a crusade that, colored by Stephen Joyce's general disdain for scholars and scholarship, has resulted in a seemingly across-the-board policy of denying permission to quote from Joyce's works even for scholarly purposes. Behind the mask of an arguably legitimate concern for family privacy, Stephen Joyce has begun to hamstring new research into the work of the 20th century's greatest novelist.
The suit filed in June on behalf of the Joyce scholar Carol Loeb Shloss by Stanford Law School's Center for Internet and Society (The Chronicle, June 23) promises to bring some clarity to the currently fuzzy concept of fair use and to provide scholars and publishers with better guidance regarding citation from copyrighted works for scholarly purposes. But the granting, or denial, of permission to quote from copyrighted material is just part of the problem, and one for the most part affecting only scholars and artists (such as DJ Danger Mouse, whose innovative The Grey Album, a remix of Jay-Z's The Black Album and the Beatles' self-titled double album — known colloquially as The White Album — was blocked by lawyers for the Beatles). More worrisome for classroom teachers, editors, and publishers are the escalating charges assessed by authors and estates for the reprinting of literary texts still under copyright protection.
I've served as co-editor of the 20th-century selections for the Longman Anthology of British Literature for the past decade and as general editor, along with the founding editor, David Damrosch, for the recently published third edition (2005). Part of my work as general editor involves overseeing permissions costs, so I am in a position to witness how those costs influence decisions about what is included and what is not (and why); and ultimately I am the one who must make those difficult decisions.
A portion of every dollar we spend on permissions to reprint is charged against the royalties of the textbook's editors, which means that we need to be prudent without being downright stingy. In effect, every dollar in permissions cost is split between the editors and the students who buy the book: We editors split permissions costs with the publisher, whose share is built into the price of the book, to the extent that the market will bear it. Because the publisher passes along its share to the book's purchasers — primarily students — the editors are in effect equal stakeholders with the students, equally interested in keeping permissions costs down. Permissions costs can mount up pretty quickly, contributing to students' complaints that their books are too expensive.
By my calculations, almost 70 percent of the permissions costs of the current edition of the Longman result from selections in the 20th-century volume. That should come as no surprise, since for most works published before 1923 the copyright has run out in this country. But the translations and scholarly editions chosen for earlier periods often are still covered by copyright protection, and payment must be made to the translators and scholars who have made older works accessible to modern readers. Those costs can be quite high: I was surprised to learn, for instance, that the volume on the Middle Ages is the next most expensive after that for the 20th century, accounting for nearly 15 percent of the permissions costs for the entire anthology. J.R.R. Tolkien's translation of Sir Gawain and the Green Knight, for example, is the second costliest item in the entire anthology (after Samuel Beckett's play Endgame). Is it worth it? Absolutely: Tolkien's is a vivid, accessible translation, and his dual credentials as a medieval scholar and fantasy author make him the perfect voice to render the tale. That said, we can still hope that the windfall of the Lord of the Rings movies will allow the Tolkien estate not to charge us as much the next time around.
In sorting through the permissions charges as they come across my desk, I have found it useful to think of four different categories of works and to deal with the invoices accordingly. The first category — "the A list" — consists of those texts that the anthology cannot be published without, and for which there is only one source. T.S. Eliot's The Waste Land is one of those must-haves. Fortunately, it moved into the public domain just before passage of the 1998 Sonny Bono Copyright Term Extension Act, which extended the term of U.S. copyright for certain older works to 95 years from publication. Also on the A list would be something from the works of Virginia Woolf, including at least an excerpt from A Room of One's Own, and from James Joyce, including a chapter from Ulysses, although it is unlikely to be taught in a survey course. The anthology's editors have no choice but to pay what's required by the copyright holders.
Ulysses was a problem for our third edition, as it has been a problem for so many scholars and publishers. Dubliners, Joyce's early short-story collection, is the Joyce text most useful in an undergraduate survey course; mercifully the stories are in the public domain and thus cost nothing to reprint. The copyright status of Ulysses, on the other hand, is a matter of some debate, though Carol Shloss's lawsuit against the Joyce estate may help to clarify it.
Robert Spoo, perhaps the leading scholar of intellectual-property law as it pertains to modernist literature, published an article in the Yale Law Journal in 1998 arguing that, because of the delayed publication of Ulysses in the United States, the 1922 first edition published in Paris has been in the public domain in this country for years — an argument that, to the best of my knowledge, has never been rebutted. The U.S. copyright law in force in 1922, Spoo argues, would have required Joyce to deposit a copy of the book at the U.S. copyright office within two months of its initial publication in France, and then, within another four months, to have the book printed on American soil by a U.S. printer. Because the book was declared obscene and not published here until 1933, Joyce was not able to satisfy those requirements, and even testified in 1926 in a sworn deposition in Paris that he had never tried to secure American copyright.
That fact notwithstanding, publishers understandably tend to be conservative in these matters: Contrary to the old saw, they find it easier, or at least cheaper, to ask permission than to ask forgiveness. And because the editors of an anthology are typically required to split the cost of permissions with the publisher, and the publisher can pass its costs along to the book's purchasers, any financial incentive to fight the good fight over public domain is considerably lessened.
Because Ulysses is commonly (if mistakenly) thought to be protected by copyright, we applied for permission to reprint the "Nausicaa" chapter, as we had done for our second edition. (From a scholar's point of view, it's maddening to pay for something I believe we should have free; but establishing the public-domain status of Ulysses with certainty would require costly litigation, and no individual publisher finds that expense worthwhile. Fortunately, Spoo's claim about the public-domain status of Ulysses may be adjudicated in the Shloss suit.) Permission was flatly denied by Joyce's grandson. The reasons for his refusal are complicated and perhaps best not recounted in detail here; in part, Mr. Joyce was unhappy with the way we had handled permissions in previous editions, and also, I was told by our permissions person, he "has problems with the introductory material on Joyce" — material I had written myself.
Any scholar who has done much work on Joyce (and anyone who has read The New Yorker piece on his grandson) knows the dim view the Joyce estate takes of literary scholarship; as a consequence, most of us in modernist studies know more than we'd care to about copyright law and the public domain. In this case, that knowledge saved us. Because Joyce published Ulysses serially in the United States before the book was published in its entirety in 1922, we had a loophole: The "Nausicaa" chapter for which we had sought permission had been published in the July/August 1920 issue of the important modernist literary magazine The Little Review. That version of the chapter is now indisputably in the public domain and thus does not require us to seek permission to use it at all. It differs somewhat from the 1922 book version, and I'm disappointed that some late stylistic embellishment is missing from the version we've published. On the other hand, the 1920 version is historic, a landmark in U.S. obscenity law: It got the magazine shut down, and its editor, Margaret Anderson, along with her partner, Jane Heap, hauled into court on obscenity charges. We now use the 1920 version of the text in the anthology to illustrate modernism's running battle with censorship, and the anthology also reprints the famous 1932 court decision ruling that Ulysses is not pornographic and can therefore be published legally in the United States. Certainly that chapter is enriched by this historical and cultural context.
A second permissions category consists of authors who must be included, but whose work might be represented in different ways: The writer is canonical, but no one text is a sine qua non. Any anthology of British literature, for instance, must include a representative selection of the poems of W.H. Auden. Quarrels continue around the borders of the canon, but Wystan is in, end of story. He has also become fantastically expensive — at an average fee of $20 per line, the modest selections we've included come to more than $8,000. Those permissions costs are a bitter pill for an editor to swallow. While Auden is in no danger of disappearing from our anthologies, and consequently from our classrooms or our cultural consciousness — "September 1, 1939," for instance, was the most visible piece of literature in the days, weeks, and months following the September 11, 2001, attacks — one begins to think about how much one can afford, and what one can afford to do without. I resent being forced to make absurd calculations, such as five lines of Auden equals one page of Salman Rushdie. Although Auden is one of the 20th century's most gifted English-language poets, he is represented in the Longman by a mere handful of poems, most of which are somewhat familiar choices.
Other 20th-century poets have suffered that same diminution, as permissions fees in some cases have actually doubled in the short time between editions. We were forced to drop Dylan Thomas's poetic radio drama "Return Journey," for example, and Eavan Boland's poem "The Journey." Poets, because their work is available in smaller discrete units than that of playwrights and fiction writers, are especially susceptible to such cuts. It's hard to feel good about that state of affairs.
A third group comprises those authors and works that one tries to include not because one must have them, but because one wants them — an interesting writer who might otherwise be overlooked, but whose work deserves attention or has become newly interesting in light of recent scholarship; a text whose inclusion helps to enrich an otherwise two-dimensional rendering of an era, problem, or movement. For the second edition of the Longman, for instance, Jennifer Wicke, my editing partner for the 20th-century volume, had the brilliant idea of adding a handful of poems by the American poet Sylvia Plath, to accompany some poetry by her (in)famous British husband, Ted Hughes. Their difficult marriage is the stuff of literary legend and now big-screen fame (the 2003 film Sylvia, starring Gwyneth Paltrow and Daniel Craig). More to the point, in 1998 Hughes's poetic narrative of their partnership, Birthday Letters, was published posthumously, bringing the couple again to the forefront of literary and public consciousness.
Well, in the third edition Plath is gone again: Whether because of the recent resurgence of interest in her life (and perhaps poetry) among the American public, or some other cause, the fees demanded for reprinting just four poems suddenly rose to more than $3,500 for the U.S. and Canadian rights. Plath thereby crossed some kind of invisible line into the realm of extravagant indulgence. Assembling an anthology is a little bit like building a home: One goes into the project with big dreams, but after meeting with the architects and builders, certain features go by the boards; even more desires are sacrificed during construction, as costs overrun estimates. In our third edition, a smattering of poems by Sylvia Plath became those skylights in the kitchen that would have to wait for later.
Sometimes texts are omitted, or added, for the most idiosyncratic of reasons: This is my fourth category, probably best labeled "Misc." In thinking about contemporary responses to Eliot's The Waste Land, I very much wanted to include a poem by the contemporary Irish poet Thomas Kinsella (who also taught here in Carbondale for many years), the long poem "Nightwalker." But Kinsella himself refused permission. Why? His refusal had to do with the title of our anthology. Because we are convinced that the phrase "English literature" suggests an Anglophilic perspective that doesn't account for the vitality and diversity of British writing, which includes works by current and former members of the British Empire (is Rushdie an "English" writer?), ours is the Longman Anthology of British Literature. Mr. Kinsella objected that he is not a British writer, and of course he's right; "I am sorry to have to refuse, as I feel the adjective 'British' does not apply to my work," he wrote. On the other hand, we thought the Longman Anthology of English, Irish, Northern Irish, Welsh, Scottish, Indian, Pakistani, South African, and West Indian Literature was a bit unwieldy.
As our first edition was being prepared for press, back in 1998, we learned that permission to reprint Caryl Churchill's play Cloud 9 could not be granted. Ownership of rights to the play was then being decided in the British courts, and no one had legal standing to grant permission. The play was simply unavailable to us. Given that immovable object, Jennifer and I quickly turned to another script that we had hoped to include but hadn't found room for, Hanif Kureishi's screenplay for the wonderful film about the complexities of multicultural London, My Beautiful Laundrette. That text proved very popular with teachers and students, and by no means felt like a compromise or second choice. But when we sat down to plan the second edition, we learned that the legal status of Cloud 9 had been established and that the play would be available to us. Although it was a difficult decision, we decided to drop Kureishi to make room for Churchill — we couldn't pay, nor had we room, for both. Such is the horse trading that goes into assembling an anthology.
When we began planning the third edition, we decided to keep Cloud 9 — the play had been at least as popular with our readers as My Beautiful Laundrette, and we saw no reason to drop it. But when we applied for permission to reprint the play in the new edition, we learned that the play was now worth $10,000 up front, plus a percentage of the gross, which would have put the cost in excess of $13,000. After a good deal of soul-searching, I suggested that we contact Mr. Kureishi's people to see what it would cost to switch back to our first-edition favorite, My Beautiful Laundrette. The answer, to my delight, was $1,000.
And there was a bonus. Just as our volume was going to press, Mr. Kureishi published an essay in The Guardian, following the London subway bombings of last summer, about the costs and limitations of multicultural London. I thought the piece would make a wonderful companion to the screenplay, suggesting some of the ways that London has changed since 1985, when the script was written. Our deadline was so short — just one week — that our permissions staff told me there was no way we could clear permissions for the essay in time. I e-mailed Mr. Kureishi directly late on a Wednesday night and delivered my plea. I never heard back from him, but the following morning an e-mail message was waiting for me from the person who handles his permissions in the United States: "The author has granted his permission." Even better, he gave it to us for a song.
From these examples it seems clear that one of the hidden forces shaping the evolving canon of modern literature, in ways not always having much to do with literary value, is the shortsightedness of copyright holders. A traditional understanding of the canon and its formation might be contributing to that unfortunate result: If writers, or their agents or estates, subscribe to the belief that the greatest works of art transcend time and tide and petty interference — that the cream will rise and quality will out — then they can afford to price their work without respect to the vagaries of the literary marketplace. If, on the other hand, one thinks of literary reputation as something like a stock market, driven by investor confidence, then one realizes that the best long-term investment strategy is to balance short- and medium-term profitability with protection of the brand, by making one's work, or the work for which one is responsible, available. We might think of these as long-term and short-term investment strategies: Mr. Kureishi is investing for the long term by letting his work be reprinted comparatively cheaply, knowing that securing a place in the canon will pay long-term dividends in both prestige and income. Ms. Churchill's handlers, on the other hand, seem to be looking only at quick returns, and in so doing are not serving her well in the long run.
Pursued to its logical extreme, the strategy embraced by Ms. Churchill's permissions people endangers her place in the canon. I understand, of course, that the canon does not reside in any one book, but is a kind of Platonic ideal, in the mind of God — or Harold Bloom. I also realize that the Longman anthology does not dictate the canon of 20th-century British literature, and that being dropped from our pages does not mean that one has fallen out of the canon. But we are an important force in canon formation, one of the two dominant classroom anthologies on the market; and for us, Kureishi's in, and Churchill's out. The situation is particularly bad for contemporary British drama — playwrights such as Harold Pinter and Tom Stoppard seem to be pricing themselves out of the market — and for some contemporary poetry: Thom Gunn, to my dismay, is gone from our third edition, since his executors, following his death in 2004, doubled their fees between publication of our second and third editions.
An author and later his or her heirs have a legitimate right to recompense for their artistic property, the more so in cases where a work is published in its entirety: Harcourt Brace will presumably sell fewer copies of Mrs. Dalloway because we've reprinted the entire novel, and so Woolf's estate is entitled to the large sum we're paying for it. A good editorial team will use every tool at its disposal to prevent its sense of the canon from being deformed by such economic considerations. But without some forward thinking by literary executors, 20th-century literature risks being distorted in ways that none of us who love that literature can easily stand by and silently watch.
Kevin J.H. Dettmar is a professor of English and cultural studies at Southern Illinois University at Carbondale and the author, most recently, of Is Rock Dead? (Routledge, 2005). He is a regular contributor to The Chronicle Review.
http://chronicle.com
Section: The Chronicle Review
Volume 52, Issue 48, Page B6
Copyright © 2006 by The Chronicle of Higher Education
And there was a bonus. Just as our volume was going to press, Mr. Kureishi published an essay in The Guardian, following the London subway bombings of last summer, about the costs and limitations of multicultural London. I thought the piece would make a wonderful companion to the screenplay, suggesting some of the ways that London has changed since 1985, when the script was written. Our deadline was so short — just one week — that our permissions staff told me there was no way we could clear permissions for the essay in time. I e-mailed Mr. Kureishi directly late on a Wednesday night and delivered my plea. I never heard back from him, but the following morning an e-mail message was waiting for me from the person who handles his permissions in the United States: "The author has granted his permission." Even better, he gave it to us for a song.
From these examples it seems clear that one of the hidden forces shaping the evolving canon of modern literature, in ways not always having much to do with literary value, is the shortsightedness of copyright holders. A traditional understanding of the canon and its formation might be contributing to that unfortunate result: If writers, or their agents or estates, subscribe to the belief that the greatest works of art transcend time and tide and petty interference — that the cream will rise and quality will out — then they can afford to price their work without respect to the vagaries of the literary marketplace. If, on the other hand, one thinks of literary reputation as something like a stock market, driven by investor confidence, then one realizes that the best long-term investment strategy is to balance short- and medium-term profitability with protection of the brand, by making one's work, or the work for which one is responsible, available. We might think of these as long-term and short-term investment strategies: Mr. Kureishi is investing for the long term by letting his work be reprinted comparatively cheaply, knowing that securing a place in the canon will pay long-term dividends in both prestige and income. Ms. Churchill's handlers, on the other hand, seem to be looking only at quick returns, and in so doing are not serving her well in the long run.
Pursued to its logical extreme, the strategy embraced by Ms. Churchill's permissions people endangers her place in the canon. I understand, of course, that the canon does not reside in any one book, but is a kind of Platonic ideal, in the mind of God — or Harold Bloom. I also realize that the Longman anthology does not dictate the canon of 20th-century British literature, and that being dropped from our pages does not mean that one has fallen out of the canon. But we are an important force in canon formation, one of the two dominant classroom anthologies on the market; and for us, Kureishi's in, and Churchill's out. The situation is particularly bad for contemporary British drama — playwrights such as Harold Pinter and Tom Stoppard seem to be pricing themselves out of the market — and for some contemporary poetry: Thom Gunn, to my dismay, is gone from our third edition, since his executors, following his death in 2004, doubled their fees between publication of our second and third editions.
An author and later his or her heirs have a legitimate right to recompense for their artistic property, the more so in cases where a work is published in its entirety: Harcourt Brace will presumably sell fewer copies of Mrs. Dalloway because we've reprinted the entire novel, and so Woolf's estate is entitled to the large sum we're paying for it. A good editorial team will use every tool at its disposal to prevent its sense of the canon from being deformed by such economic considerations. But without some forward thinking by literary executors, 20th-century literature risks being distorted in ways that none of us who love that literature can easily stand by and silently watch.
Kevin J.H. Dettmar is a professor of English and cultural studies at Southern Illinois University at Carbondale and the author, most recently, of Is Rock Dead? (Routledge, 2005). He is a regular contributor to The Chronicle Review.
http://chronicle.com
Section: The Chronicle Review
Volume 52, Issue 48, Page B6
Copyright © 2006 by The Chronicle of Higher Education
Sunday, July 30
The Peculiar Disappearance of the War in Iraq
July 30, 2006
By FRANK RICH
AS America fell into the quagmire of Vietnam, the comedian Milton Berle joked that the fastest way to end the war would be to put it on the last-place network, ABC, where it was certain to be canceled. Berle’s gallows humor lives on in the quagmire in Iraq. Americans want this war canceled too, and first- and last-place networks alike are more than happy to oblige.
CNN will surely remind us today that it is Day 19 of the Israel-Hezbollah war — now branded as Crisis in the Middle East — but you won’t catch anyone saying it’s Day 1,229 of the war in Iraq. On the Big Three networks’ evening newscasts, the time devoted to Iraq has fallen 60 percent between 2003 and this spring, as clocked by the television monitor, the Tyndall Report. On Thursday, Brian Williams of NBC read aloud a “shame on you” e-mail complaint from the parents of two military sons anguished that his broadcast had so little news about the war.
This is happening even as the casualties in Iraq, averaging more than 100 a day, easily surpass those in Israel and Lebanon combined. When Nouri al-Maliki, the latest Iraqi prime minister, visited Washington last week to address Congress, he too got short TV shrift — a mere five sentences about the speech on ABC’s “World News.” The networks know a rerun when they see it. Only 22 months earlier, one of Mr. Maliki’s short-lived predecessors, Ayad Allawi, had come to town during the 2004 campaign to give a similarly empty Congressional address laced with White House-scripted talking points about the war’s progress. Propaganda stunts, unlike “Law & Order” episodes, don’t hold up on a second viewing.
The steady falloff in Iraq coverage isn’t happenstance. It’s a barometer of the scope of the tragedy. For reporters, the already apocalyptic security situation in Baghdad keeps getting worse, simply making the war more difficult to cover than ever. The audience has its own phobia: Iraq is a bummer. “It is depressing to pay attention to this war on terror,” said Fox News’s Bill O’Reilly on July 18. “I mean, it’s summertime.” Americans don’t like to lose, whatever the season. They know defeat when they see it, no matter how many new plans for victory are trotted out to obscure that reality.
The specter of defeat is not the only reason Americans have switched off Iraq. The larger issue is that we don’t know what we — or, more specifically, 135,000 brave and vulnerable American troops — are fighting for. In contrast to the Israel-Hezbollah war, where the stakes for the combatants and American interests are clear, the war in Iraq has no rationale to keep it afloat on television or anywhere else. It’s a big, nightmarish story, all right, but one that lacks the thread of a coherent plot.
Certainly there has been no shortage of retrofitted explanations for the war in the three-plus years since the administration’s initial casus belli, to fend off Saddam’s mushroom clouds and vanquish Al Qaeda, proved to be frauds. We’ve been told that the war would promote democracy in the Arab world. And make the region safer for Israel. And secure the flow of cheap oil. If any of these justifications retained any credibility, they have been obliterated by Crisis in the Middle East. The new war is a grueling daily object lesson in just how much the American blunders in Iraq have undermined the one robust democracy that already existed in the region, Israel, while emboldening terrorists and strengthening the hand of Iran.
But it’s the collapse of the one remaining (and unassailable) motivation that still might justify staying the course in Iraq — as a humanitarian mission on behalf of the Iraqi people — that is most revealing of what a moral catastrophe this misadventure has been for our country. The sad truth is that the war’s architects always cared more about their own grandiose political and ideological ambitions than they did about the Iraqis, and they communicated that indifference from the start to Iraqis and Americans alike. The legacy of that attitude is that the American public cannot be rallied to the Iraqi cause today, as the war reaches its treacherous endgame.
The Bush administration constantly congratulates itself for liberating Iraq from Saddam’s genocidal regime. But regime change was never billed as a primary motivation for the war; the White House instead appealed to American fears and narcissism — we had to be saved from Saddam’s W.M.D. From “Shock and Awe” on, the fate of Iraqis was an afterthought. They would greet our troops with flowers and go about their business.
Donald Rumsfeld boasted that “the care” and “the humanity” that went into our precision assaults on military targets would minimize any civilian deaths. Such casualties were merely “collateral damage,” unworthy of quantification. “We don’t do body counts,” said Gen. Tommy Franks. President Bush at last started counting those Iraqi bodies publicly — with an estimate of 30,000 — some seven months ago. (More recently, The Los Angeles Times put the figure at, conservatively, 50,000.) By then, Americans had tuned out.
The contempt our government showed for Iraqis was not just to be found in our cavalier stance toward their casualties, or in the abuses at Abu Ghraib. There was a cultural condescension toward the Iraqi people from the get-go as well, as if they were schoolchildren in a compassionate-conservatism campaign ad. This attitude was epitomized by Mr. Rumsfeld’s “stuff happens” response to the looting of Baghdad at the dawn of the American occupation. In “Fiasco,” his stunning new book about the American failure in Iraq, Thomas E. Ricks, The Washington Post’s senior Pentagon correspondent, captures the meaning of that pivotal moment perfectly: “The message sent to Iraqis was far more troubling than Americans understood. It was that the U.S. government didn’t care — or, even more troubling for the future security of Iraq, that it did care but was incapable of acting effectively.”
As it turned out, it was the worst of both worlds: we didn’t care, and we were incapable of acting effectively. Nowhere is this seen more explicitly than in the subsequent American failure to follow through on our promise to reconstruct the Iraqi infrastructure we helped to smash. “There’s some little part of my brain that simply doesn’t understand how the most powerful country on earth just can’t get electricity back in Baghdad,” said Kanan Makiya, an Iraqi exile and prominent proponent of the war, in a recent Washington Post interview.
The simple answer is that the war planners didn’t care enough to provide the number of troops needed to secure the country so that reconstruction could proceed. The coalition authority isolated in its Green Zone bubble didn’t care enough to police the cronyism and corruption that squandered billions of dollars on abandoned projects. The latest monument to this humanitarian disaster was reported by James Glanz of The New York Times on Friday: a high-tech children’s hospital planned for Basra, repeatedly publicized by Laura Bush and Condi Rice, is now in serious jeopardy because of cost overruns and delays.
This history can’t be undone; there’s neither the American money nor the manpower to fulfill the mission left unaccomplished. The Iraqi people, whose collateral damage was so successfully hidden for so long by the Rumsfeld war plan, remain a sentimental abstraction to most Americans. Whether they are seen in agony after another Baghdad bombing or waving their inked fingers after an election or being used as props to frame Mrs. Bush during the State of the Union address, they have little more specificity than movie extras. Chalabi, Allawi, Jaafari, Maliki come and go, all graced with the same indistinguishable praise from the American president, all blurring into an endless loop of instability and crisis. We feel badly ... and change the channel.
Given that the violence in Iraq has only increased in the weeks since the elimination of Abu Musab al-Zarqawi, the Jordanian terrorist portrayed by the White House as the fount of Iraqi troubles, any Americans still paying attention to the war must now confront the reality that the administration is desperately trying to hide. “The enemy in Iraq is a combination of rejectionists and Saddamists and terrorists,” President Bush said in December when branding Zarqawi Public Enemy No. 1. But Iraq’s exploding sectarian warfare cannot be pinned on Al Qaeda or Baathist dead-enders.
The most dangerous figure in Iraq, the home-grown radical Shiite cleric Moqtada al-Sadr, is an acolyte of neither Osama bin Laden nor Saddam but an ally of Iran who has sworn solidarity to both Hezbollah and Hamas. He commands more than 30 seats in Mr. Maliki’s governing coalition in Parliament and 5 cabinet positions. He is also linked to death squads that have slaughtered Iraqis and Americans with impunity since the April 2004 uprising that killed, among others, Cindy Sheehan’s son, Casey. Since then, Mr. Sadr’s power has only grown, enabled by Iraqi “democracy.”
That the latest American plan for victory is to reposition our forces by putting more of them in the crossfire of Baghdad’s civil war is tantamount to treating our troops as if they were deck chairs on the Titanic. Even if the networks led with the story every night, what Americans would have the stomach to watch?
Saturday, July 29
Religious Loonies at it again
GEORGETOWN, Del. — After her family moved to this small town 30 years ago, Mona Dobrich grew up as the only Jew in school. Mrs. Dobrich, 39, married a local man, bought the house behind her parents’ home and brought up her two children as Jews.
For years, she and her daughter, Samantha, listened to Christian prayers at public school potlucks, award dinners and parent-teacher group meetings, she said. But at Samantha’s high school graduation in June 2004, a minister’s prayer proclaiming Jesus as the only way to the truth nudged Mrs. Dobrich to act.
“It was as if no matter how much hard work, no matter how good a person you are, the only way you’ll ever be anything is through Jesus Christ,” Mrs. Dobrich said. “He said those words, and I saw Sam’s head snap and her start looking around, like, ‘Where’s my mom? Where’s my mom?’ And all I wanted to do was run up and take her in my arms.”
After the graduation, Mrs. Dobrich asked the Indian River district school board to consider prayers that were more generic and, she said, less exclusionary. As news of her request spread, many local Christians saw it as an effort to limit their free exercise of religion, residents said. Anger spilled on to talk radio, in letters to the editor and at school board meetings attended by hundreds of people carrying signs praising Jesus.
“What people here are saying is, ‘Stop interfering with our traditions, stop interfering with our faith and leave our country the way we knew it to be,’ ” said Dan Gaffney, a host at WGMD, a talk radio station in Rehoboth, and a supporter of prayer in the school district.
After receiving several threats, Mrs. Dobrich took her son, Alex, to Wilmington in the fall of 2004, planning to stay until the controversy blew over. It never has.
The Dobriches eventually sued the Indian River School District, challenging what they asserted was the pervasiveness of religion in the schools and seeking financial damages. They have been joined by “the Does,” a family still in the school district who have remained anonymous because of the response against the Dobriches.
Meanwhile, a Muslim family in another school district here in Sussex County has filed suit, alleging proselytizing in the schools and the harassment of their daughters.
The move to Wilmington, the Dobriches said, wrecked them financially, leading them to sell their house and forcing their daughter to drop out of Columbia University.
The dispute here underscores the rising tensions over religion in public schools.
“We don’t have data on the number of lawsuits, but anecdotally, people think it has never been so active — the degree to which these conflicts erupt in schools and the degree to which they are litigated,” said Tom Hutton, a staff lawyer at the National School Boards Association.
More religion probably exists in schools now than in decades because of the role religious conservatives play in politics and the passage of certain education laws over the last 25 years, including the Equal Access Act in 1984, said Charles C. Haynes, senior scholar at the First Amendment Center, a research and education group.
“There are communities largely of one faith, and despite all the court rulings and Supreme Court decisions, they continue to promote one faith,” Mr. Haynes said. “They don’t much care what the minority complains about. They’re just convinced that what they are doing is good for kids and what America is all about.”
Dr. Donald G. Hattier, a member of the Indian River school board, said the district had changed many policies in response to Mrs. Dobrich’s initial complaints. But the board unanimously rejected a proposed settlement of the Dobriches’ lawsuit.
“There were a couple of provisions that were unacceptable to the board,” said Jason Gosselin, a lawyer for the board. “The parties are working in good faith to move closer to settlement.”
Until recently, it was safe to assume that everyone in the Indian River district was Christian, said the Rev. Mark Harris, an Episcopal priest at St. Peter’s Church in Lewes.
But much has changed in Sussex County over the last 30 years. The county, in southern Delaware, has resort enclaves like Rehoboth Beach, to which outsiders bring their cash and, often, liberal values. Inland, in the area of Georgetown, the county seat, the land is still a lush patchwork of corn and soybean fields, with a few poultry plants. But developers are turning more fields into tracts of rambling homes. The Hispanic population is booming. There are enough Reform Jews, Muslims and Quakers to set up their own centers and groups, Mr. Harris said.
In interviews with a dozen people here and comments on the radio by a half-dozen others, the overwhelming majority insisted, usually politely, that prayer should stay in the schools.
“We have a way of doing things here, and it’s not going to change to accommodate a very small minority,’’ said Kenneth R. Stevens, 41, a businessman sitting in the Georgetown Diner. “If they feel singled out, they should find another school or excuse themselves from those functions. It’s our way of life.”
The Dobrich and Doe legal complaint portrays a district in which children were given special privileges for being in Bible club, Bibles were distributed in 2003 at an elementary school, Christian prayer was routine at school functions and teachers evangelized.
“Because Jesus Christ is my Lord and Savior, I will speak out for him,” said the Rev. Jerry Fike of Mount Olivet Brethren Church, who gave the prayer at Samantha’s graduation. “The Bible encourages that.” Mr. Fike continued: “Ultimately, he is the one I have to please. If doing that places me at odds with the law of the land, I still have to follow him.”
Mrs. Dobrich, who is Orthodox, said that when she was a girl, Christians here had treated her faith with respectful interest. Now, she said, her son was ridiculed in school for wearing his yarmulke. She described a classmate of his drawing a picture of a pathway to heaven for everyone except “Alex the Jew.”
Mrs. Dobrich’s decision to leave her hometown and seek legal help came after a school board meeting in August 2004 on the issue of prayer. Dr. Hattier had called WGMD to discuss the issue, and Mr. Gaffney and others encouraged people to go to the meeting. Hundreds showed up.
A homemaker active in her children’s schools, Mrs. Dobrich said she had asked the board to develop policies that would leave no one feeling excluded because of faith. People booed and rattled signs that read “Jesus Saves,” she recalled. Her son had written a short statement, but he felt so intimidated that his sister read it for him. In his statement, Alex, who was 11 then, said: “I feel bad when kids in my class call me ‘Jew boy.’ I do not want to move away from the house I have lived in forever.”
Later, another speaker turned to Mrs. Dobrich and said, according to several witnesses, “If you want people to stop calling him ‘Jew boy,’ you tell him to give his heart to Jesus.”
Immediately afterward, the Dobriches got threatening phone calls. Samantha had enrolled in Columbia, and Mrs. Dobrich decided to go to Wilmington temporarily.
But the controversy simmered, keeping Mrs. Dobrich and Alex away. The cost of renting an apartment in Wilmington led the Dobriches to sell their home here. Mrs. Dobrich’s husband, Marco, a school bus driver and transportation coordinator, makes about $30,000 a year and has stayed in town to care for Mrs. Dobrich’s ailing parents. Mr. Dobrich declined to comment. Samantha left Columbia because of the financial strain.
The only thing to flourish, Mrs. Dobrich said, was her faith. Her children, she said, “have so much pride in their religion now.”
“Alex wears his yarmulke all the time. He never takes it off.”
By NEELA BANERJEE, NYT
and:
MEL GIBSON'S ANTI-SEMITIC TIRADE DURING DUI ARREST:
The sheriff’s report, carried on TMZ.com, a Web site owned by Time Warner, said Mr. Gibson had demanded to know if the officer, James Mee, was a Jew. During an obscenity-laced tirade, according to the report, Mr. Gibson also said “the Jews are responsible for all the wars in the world.”
Steve Whitmore, a spokesman for the Los Angeles County Sheriff’s Department, declined to comment on the report. But he said the department would eventually disclose details of the arrest. “Nothing will be sanitized,” Mr. Whitmore said in a statement.
People associated with the case privately acknowledged the report’s authenticity, but they agreed to speak only on condition of anonymity because of the ongoing investigation.
The report of Mr. Gibson’s outburst further disturbed some people who were already wary of what they saw as anti-Semitic overtones in his 2004 blockbuster “The Passion of the Christ,” and who believe that he has failed to disassociate himself clearly enough from remarks by his father denying the Holocaust.
Friday, July 28
How not to use a metaphor
A Reader Responds to David Brooks's column in the NY Times:
You write, “But they do hope to change the environment, and slowly begin to crowd out Hezbollah influence, the way healthy grass crowds out weeds in a lawn.”
Let me guess: you don't have a lawn, right?
The sad thing is that this very type of incorrect assumption probably has the most to do with why the world is erupting in violence and despair. Weeds never get crowded out by healthy grass because weeds are way more durable and aggressive than healthy grass. Weeds eagerly grow in cracks in concrete, they grow without watering and fertilizing, and they grow in areas where we can't get grass to grow. Efforts to kill weeds often end up killing healthy grass. Around our yard, we've decided that, rather than spraying (we have pets and we don't wish to expose them to toxins), we'll remove the weeds surgically, by hand. It works, but it's time-consuming and it never ends.
I suspect it's the same thing with terrorism. Terrorists, like weeds, are a whole different organism — much tougher and more resilient than the innocent civilians they exist among. I don't know what the answer is, but I do know that, like trying to kill weeds in healthy grass, it's going to take precise and patient efforts. It will never be over — just like weeds, every time you pluck one, another pops up, and it's just as important not to kill all the healthy life and infrastructure around it. No resident of the 'burbs would view a poisoned, parched lawn devoid of both weeds and grass as a victory. Our ultimate goal isn't to kill weeds; it's to have a nice lawn.
The civilized nations of this world would do well to start viewing the situation the way they view lawn maintenance, with just as much emphasis put on protecting and nurturing the healthy and desirable as weeding out the undesirable.
-Christine Mank, Ottawa, Ontario, Canada
Thursday, July 27
Can Wikipedia conquer expertise?
On March 1st, Wikipedia, the online interactive encyclopedia, hit the million-articles mark, with an entry on Jordanhill, a railway station in suburban Glasgow. Its author, Ewan MacDonald, posted a single sentence about the station at 11 P.M., local time; over the next twenty-four hours, the entry was edited more than four hundred times, by dozens of people. (Jordanhill happens to be the "1029th busiest station in the United Kingdom"; it "no longer has a staffed ticket counter.") The Encyclopædia Britannica, which for more than two centuries has been considered the gold standard for reference works, has only a hundred and twenty thousand entries in its most comprehensive edition. Apparently, no traditional encyclopedia has ever suspected that someone might wonder about Sudoku or about prostitution in China. Or, for that matter, about Capgras delusion (the unnerving sensation that an impostor is sitting in for a close relative), the Boston molasses disaster, the Rhinoceros Party of Canada, Bill Gates's house, the forty-five-minute Anglo-Zanzibar War, or Islam in Iceland. Wikipedia includes fine entries on Kafka and the War of the Spanish Succession, and also a complete guide to the ships of the U.S. Navy, a definition of Philadelphia cheesesteak, a masterly page on Scrabble, a list of historical cats (celebrity cats, a cat millionaire, the first feline to circumnavigate Australia), a survey of invented expletives in fiction ("bippie," "cakesniffer," "furgle"), instructions for curing hiccups, and an article that describes, with schematic diagrams, how to build a stove from a discarded soda can. The how-to entries represent territory that the encyclopedia has not claimed since the eighteenth century. You could cure a toothache or make snowshoes using the original Britannica, of 1768-71. (You could also imbibe a lot of prejudice and superstition. The entry on Woman was just six words: "The female of man. See HOMO.") If you look up "coffee preparation" on Wikipedia, you will find your way, via the entry on Espresso, to a piece on types of espresso machines, which you will want to consult before buying. There is also a page on the site dedicated to "Errors in the Encyclopædia Britannica that have been corrected in Wikipedia" (Stalin's birth date, the true inventor of the safety razor).
Because there are no physical limits on its size, Wikipedia can aspire to be all-inclusive. It is also perfectly configured to be current: there are detailed entries for each of the twelve finalists on this season’s “American Idol,” and the article on the “2006 Israel-Lebanon Conflict” has been edited more than four thousand times since it was created, on July 12th, six hours after Hezbollah militants ignited the hostilities by kidnapping two Israeli soldiers. Wikipedia, which was launched in 2001, is now the seventeenth-most-popular site on the Internet, generating more traffic daily than MSNBC.com and the online versions of the Times and the Wall Street Journal combined. The number of visitors has been doubling every four months; the site receives as many as fourteen thousand hits per second. Wikipedia functions as a filter for vast amounts of information online, and it could be said that Google owes the site for tidying up the neighborhood. But the search engine is amply repaying its debt: because Wikipedia pages contain so many links to other entries on the site, and are so frequently updated, they enjoy an enviably high page rank.
The site has achieved this prominence largely without paid staff or revenue. It has five employees in addition to Jimmy Wales, Wikipedia’s thirty-nine-year-old founder, and it carries no advertising. In 2003, Wikipedia became a nonprofit organization; it meets most of its budget, of seven hundred and fifty thousand dollars, with donations, the bulk of them contributions of twenty dollars or less. Wales says that he is on a mission to “distribute a free encyclopedia to every single person on the planet in their own language,” and to an astonishing degree he is succeeding. Anyone with Internet access can create a Wikipedia entry or edit an existing one. The site currently exists in more than two hundred languages and has hundreds of thousands of contributors around the world. Wales is at the forefront of a revolution in knowledge gathering: he has marshalled an army of volunteers who believe that, working collaboratively, they can produce an encyclopedia that is as good as any written by experts, and with an unprecedented range.
Wikipedia is an online community devoted not to last night’s party or to next season’s iPod but to a higher good. It is also no more immune to human nature than any other utopian project. Pettiness, idiocy, and vulgarity are regular features of the site. Nothing about high-minded collaboration guarantees accuracy, and open editing invites abuse. Senators and congressmen have been caught tampering with their entries; the entire House of Representatives has been banned from Wikipedia several times. (It is not subtle to change Senator Robert Byrd’s age from eighty-eight to a hundred and eighty. It is subtler to sanitize one’s voting record in order to distance oneself from an unpopular President, or to delete broken campaign promises.) Curiously, though, mob rule has not led to chaos. Wikipedia, which began as an experiment in unfettered democracy, has sprouted policies and procedures. At the same time, the site embodies our newly casual relationship to truth. When confronted with evidence of errors or bias, Wikipedians invoke a favorite excuse: look how often the mainstream media, and the traditional encyclopedia, are wrong! As defenses go, this is the epistemological equivalent of “But Johnny jumped off the bridge first.” Wikipedia, though, is only five years old. One day, it may grow up.
The encyclopedic impulse dates back more than two thousand years and has rarely balked at national borders. Among the first general reference works was Emperor’s Mirror, commissioned in 220 A.D. by a Chinese emperor, for use by civil servants. The quest to catalogue all human knowledge accelerated in the eighteenth century. In the seventeen-seventies, the Germans, champions of thoroughness, began assembling a two-hundred-and-forty-two-volume masterwork. A few decades earlier, Johann Heinrich Zedler, a Leipzig bookseller, had alarmed local competitors when he solicited articles for his Universal-Lexicon. His rivals, fearing that the work would put them out of business by rendering all other books obsolete, tried unsuccessfully to sabotage the project.
It took a devious Frenchman, Pierre Bayle, to conceive of an encyclopedia composed solely of errors. After the idea failed to generate much enthusiasm among potential readers, he instead compiled a “Dictionnaire Historique et Critique,” which consisted almost entirely of footnotes, many highlighting flaws of earlier scholarship. Bayle taught readers to doubt, a lesson in subversion that Diderot and d’Alembert, the authors of the Encyclopédie (1751-80), learned well. Their thirty-five-volume work preached rationalism at the expense of church and state. The more stolid Britannica was born of cross-channel rivalry and an Anglo-Saxon passion for utility.
Wales’s first encyclopedia was the World Book, which his parents acquired after dinner one evening in 1969, from a door-to-door salesman. Wales—who resembles a young Billy Crystal with the neuroses neatly tucked in—recalls the enchantment of pasting in update stickers that cross-referenced older entries to the annual supplements. Wales’s mother and grandmother ran a private school in Huntsville, Alabama, which he attended from the age of three. He graduated from Auburn University with a degree in finance and began a Ph.D. in the subject, enrolling first at the University of Alabama and later at Indiana University. In 1994, he decided to take a job trading options in Chicago rather than write his dissertation. Four years later, he moved to San Diego, where he used his savings to found an Internet portal. Its audience was mostly men; pornography—videos and blogs—accounted for about a tenth of its revenues. Meanwhile, Wales was cogitating. In his view, misinformation, propaganda, and ignorance are responsible for many of the world’s ills. “I’m very much an Enlightenment kind of guy,” Wales told me. The promise of the Internet is free knowledge for everyone, he recalls thinking. How do we make that happen?
As an undergraduate, he had read Friedrich Hayek’s 1945 free-market manifesto, “The Use of Knowledge in Society,” which argues that a person’s knowledge is by definition partial, and that truth is established only when people pool their wisdom. Wales thought of the essay again in the nineteen-nineties, when he began reading about the open-source movement, a group of programmers who believed that software should be free and distributed in such a way that anyone could modify the code. He was particularly impressed by “The Cathedral and the Bazaar,” an essay, later expanded into a book, by Eric Raymond, one of the movement’s founders. “It opened my eyes to the possibility of mass collaboration,” Wales said.
The first step was a misstep. In 2000, Wales hired Larry Sanger, a graduate student in philosophy he had met on a Listserv, to help him create an online general-interest encyclopedia called Nupedia. The idea was to solicit articles from scholars, subject the articles to a seven-step review process, and post them free online. Wales himself tried to compose the entry on Robert Merton and options-pricing theory; after he had written a few sentences, he remembered why he had dropped out of graduate school. “They were going to take my essay and send it to two finance professors in the field,” he recalled. “I had been out of academia for several years. It was intimidating; it felt like homework.”
After a year, Nupedia had only twenty-one articles, on such topics as atonality and Herodotus. In January, 2001, Sanger had dinner with a friend, who told him about the wiki, a simple software tool that allows for collaborative writing and editing. Sanger thought that a wiki might attract new contributors to Nupedia. (Wales says that using a wiki was his idea.) Wales agreed to try it, more or less as a lark. Under the wiki model that Sanger and Wales adopted, each entry included a history page, which preserves a record of all editing changes. They added a talk page, to allow for discussion of the editorial process—an idea Bayle would have appreciated. Sanger coined the term Wikipedia, and the site went live on January 15, 2001. Two days later, he sent an e-mail to the Nupedia mailing list—about two thousand people. “Wikipedia is up!” he wrote. “Humor me. Go there and add a little article. It will take all of five or ten minutes.”
Wales braced himself for “complete rubbish.” He figured that if he and Sanger were lucky the wiki would generate a few rough drafts for Nupedia. Within a month, Wikipedia had six hundred articles. After a year, there were twenty thousand.
Wales is fond of citing a 1962 proclamation by Charles Van Doren, who later became an editor at Britannica. Van Doren believed that the traditional encyclopedia was defunct. It had grown by accretion rather than by design; it had sacrificed artful synthesis to plodding convention; it looked backward. “Because the world is radically new, the ideal encyclopedia should be radical, too,” Van Doren wrote. “It should stop being safe—in politics, in philosophy, in science.”
In its seminal Western incarnation, the encyclopedia had been a dangerous book. The Encyclopédie muscled aside religious institutions and orthodoxies to install human reason at the center of the universe—and, for that muscling, briefly earned the book’s publisher a place in the Bastille. As the historian Robert Darnton pointed out, the entry in the Encyclopédie on cannibalism ends with the cross-reference “See Eucharist.” What Wales seems to have in mind, however, is less Van Doren’s call to arms than that of an earlier rabble-rouser. In the nineteen-thirties, H. G. Wells lamented that, while the world was becoming smaller and moving at increasing speed, the way information was distributed remained old-fashioned and ineffective. He prescribed a “world brain,” a collaborative, decentralized repository of knowledge that would be subject to continual revision. More radically—with “alma-matricidal impiety,” as he put it—Wells indicted academia; the university was itself medieval. “We want a Henry Ford today to modernize the distribution of knowledge, make good knowledge cheap and easy in this still very ignorant, ill-educated, ill-served English-speaking world of ours,” he wrote. Had the Internet existed in his lifetime, Wells might have beaten Wales to the punch.
Wales’s most radical contribution may be not to have made information free but—in his own alma-matricidal way—to have invented a system that does not favor the Ph.D. over the well-read fifteen-year-old. “To me, the key thing is getting it right,” Wales has said of Wikipedia’s contributors. “I don’t care if they’re a high-school kid or a Harvard professor.” At the beginning, there were no formal rules, though Sanger eventually posted a set of guidelines on the site. The first was “Ignore all the rules.” Two of the others have become central tenets: articles must reflect a neutral point of view (N.P.O.V., in Wikipedia lingo), and their content must be both verifiable and previously published. Among other things, the prohibition against original research heads off a great deal of material about people’s pets.
Insofar as Wikipedia has a physical existence, it is in St. Petersburg, Florida, in an executive suite that serves as the headquarters of the Wikimedia Foundation, the parent organization o Wikipedia and its lesser-known sister projects, among them Wikisource (a library of free texts), Wikinews (a current-events site) and Wikiquote (bye-bye Bartlett’s). Wales, who is married an has a five-year-old daughter, says that St. Petersburg’s attractive housing prices lured him from California. When I visited the offices in March, the walls were bare, the furniture battered. With th addition of a dead plant, the suite could pass for a graduate-student lounge
The real work at Wikipedia takes place not in Florida but on thousands of computer screens across the world. Perhaps Wikipedia’s greatest achievement—one that Wales did not fully anticipate—was the creation of a community. Wikipedians are officially anonymous, contributing to unsigned entries under screen names. They are also predominantly male—about eighty per cent, Wales says—and compulsively social, conversing with each other not only on the talk pages attached to each entry but on Wikipedia-dedicated I.R.C. channels and on user pages, which regular contributors often create and which serve as a sort of personalized office cooler. On the page of a twenty-year-old Wikipedian named Arocoun, who lists “philosophizing” among his favorite activities, messages from other users range from the reflective (“I’d argue against your claim that humans should aim to be independent/self-reliant in all aspects of their lives . . . I don’t think true independence is a realistic ideal given all the inherent intertwinings of any society”) to the geekily flirtatious (“I’m a neurotic painter from Ohio, and I guess if you consider your views radical, then I’m a radical, too. So . . . we should be friends”).
Wikipedians have evolved a distinctive vocabulary, of which “revert,” meaning “reinstate”—as in “I reverted the edit, but the user has simply rereverted it”—may be the most commonly used word. Other terms include WikiGnome (a user who keeps a low profile, fixing typos, poor grammar, and broken links) and its antithesis, WikiTroll (a user who persistently violates the site’s guidelines or otherwise engages in disruptive behavior). There are Aspergian Wikipedians (seventy-two), bipolar Wikipedians, vegetarian Wikipedians, antivegetarian Wikipedians, existential Wikipedians, pro-Luxembourg Wikipedians, and Wikipedians who don’t like to be categorized. According to a page on the site, an avid interest in Wikipedia has been known to afflict “computer programmers, academics, graduate students, game-show contestants, news junkies, the unemployed, the soon-to-be unemployed and, in general, people with multiple interests and good memories.” You may travel in more exalted circles, but this covers pretty much everyone I know.
Wikipedia may be the world’s most ambitious vanity press. There are two hundred thousand registered users on the English-language site, of whom about thirty-three hundred—fewer than two per cent—are responsible for seventy per cent of the work. The site allows you to compare contributors by the number of edits they have made, by the number of articles that have been judged by community vote to be outstanding (these “featured” articles often appear on the site’s home page), and by hourly activity, in graph form. A seventeen-year-old P. G. Wodehouse fan who specializes in British peerages leads the featured-article pack, with fifty-eight entries. A twenty-four-year-old University of Toronto graduate is the site’s premier contributor. Since composing his first piece, on the Panama Canal, in 2001, he has written or edited more than seventy-two thousand articles. “Wikipediholism” and “editcountitis” are well defined on the site; both link to an article on obsessive-compulsive disorder. (There is a Britannica entry for O.C.D., but no version of it has included Felix Unger’s name in the third sentence, a comprehensive survey of “OCD in literature and film,” or a list of celebrity O.C.D. sufferers, which unites, surely for the first time in history, Florence Nightingale with Joey Ramone.)
One regular on the site is a user known as Essjay, who holds a Ph.D. in theology and a degree in canon law and has written or contributed to sixteen thousand entries. A tenured professor of religion at a private university, Essjay made his first edit in February, 2005. Initially, he contributed to articles in his field—on the penitential rite, transubstantiation, the papal tiara. Soon he was spending fourteen hours a day on the site, though he was careful to keep his online life a secret from his colleagues and friends. (To his knowledge, he has never met another Wikipedian, and he will not be attending Wikimania, the second international gathering of the encyclopedia’s contributors, which will take place in early August in Boston.)
Gradually, Essjay found himself devoting less time to editing and more to correcting errors and removing obscenities from the site. In May, he twice removed a sentence from the entry on Justin Timberlake asserting that the pop star had lost his home in 2002 for failing to pay federal taxes—a statement that Essjay knew to be false. The incident ended there. Others involve ideological disagreements and escalate into intense edit wars. A number of the disputes on the English-language Wikipedia relate to the Israeli-Palestinian conflict and to religious issues. Almost as acrimonious are the battles waged over the entries on Macedonia, Danzig, the Armenian genocide, and Henry Ford. Ethnic feuds die hard: Was Copernicus Polish, German, or Prussian? (A nonbinding poll was conducted earlier this year to determine whether the question merited mention in the article’s lead.) Some debates may never be resolved: Was the 1812 Battle of Borodino a victory for the Russians or for the French? What is the date of Ann Coulter’s birth? Is apple pie all-American? (The answer, at least for now, is no: “Apple trees didn’t even grow in America until the Europeans brought them over,” one user railed. He was seconded by another, who added, “Apple pie is very popular in the Netherlands too. Americans did not invent or introduce it to the Netherlands. You already plagiarized Santa Claus from our Saint Nicholas. Stop it!”) Who could have guessed that “cheese” would figure among the site’s most contested entries? (The controversy entailed whether in Asia there is a cultural prohibition against eating it.) For the past nine months, Baltimore’s climate has been a subject of bitter debate. What is the average temperature in January?
At first, Wales handled the fistfights himself, but he was reluctant to ban anyone from the site. As the number of users increased, so did the editing wars and the incidence of vandalism. In October, 2001, Wales appointed a small cadre of administrators, called admins, to police the site for abuse. Admins can delete articles or protect them from further changes, block users from editing, and revert text more efficiently than can ordinary users. (There are now nearly a thousand admins on the site.) In 2004, Wales formalized the 3R rule—initially it had been merely a guideline—according to which any user who reverts the same text more than three times in a twenty-four-hour period is blocked from editing for a day. The policy grew out of a series of particularly vitriolic battles, including one over the U.S. economy—it was experiencing either high growth and low unemployment or low growth and high unemployment.
Wales also appointed an arbitration committee to rule on disputes. Before a case reaches the arbitration committee, it often passes through a mediation committee. Essjay is serving a second term as chair of the mediation committee. He is also an admin, a bureaucrat, and a checkuser, which means that he is one of fourteen Wikipedians authorized to trace I.P. addresses in cases of suspected abuse. He often takes his laptop to class, so that he can be available to Wikipedians while giving a quiz, and he keeps an eye on twenty I.R.C. chat channels, where users often trade gossip about abuses they have witnessed.
Five robots troll the site for obvious vandalism, searching for obscenities and evidence of mass deletions, reverting text as they go. More egregious violations require human intervention. Essjay recently caught a user who, under one screen name, was replacing sentences with nonsense and deleting whole entries and, under another, correcting the abuses—all in order to boost his edit count. He was banned permanently from the site. Some users who have been caught tampering threaten revenge against the admins who apprehend them. Essjay says that he routinely receives death threats. “There are people who take Wikipedia way too seriously,” he told me. (Wikipedians have acknowledged Essjay’s labors by awarding him numerous barnstars—five-pointed stars, which the community has adopted as a symbol of praise—including several Random Acts of Kindness Barnstars and the Tireless Contributor Barnstar.)
Wikipedia has become a regulatory thicket, complete with an elaborate hierarchy of users and policies about policies. Martin Wattenberg and Fernanda B. Viégas, two researchers at I.B.M. who have studied the site using computerized visual models called “history flows,” found that the talk pages and “meta pages”—those dealing with coördination and administration—have experienced the greatest growth. Whereas articles once made up about eighty-five per cent of the site’s content, as of last October they represented seventy per cent. As Wattenberg put it, “People are talking about governance, not working on content.” Wales is ambivalent about the rules and procedures but believes that they are necessary. “Things work well when a group of people know each other, and things break down when it’s a bunch of random people interacting,” he told me.
For all its protocol, Wikipedia’s bureaucracy doesn’t necessarily favor truth. In March, 2005, William Connolley, a climate modeller at the British Antarctic Survey, in Cambridge, was briefly victim of an edit war over the entry on global warming, to which he had contributed. After a particularly nasty confrontation with a skeptic, who had repeatedly watered down language pertainin to the greenhouse effect, the case went into arbitration. “User William M. Connolley strongly pushes his POV with systematic removal of any POV which does not match his own,” his accuse charged in a written deposition. “His views on climate science are singular and narrow.” A decision from the arbitration committee was three months in coming, after which Connolley was place on a humiliating one-revert-a-day parole. The punishment was later revoked, and Connolley is now an admin, with two thousand pages on his watchlist—a feature that enables users to compile list of entries and to be notified when changes are made to them. He says that Wikipedia’s entry on global warming may be the best page on the subject anywhere on the Web. Nevertheless Wales admits that in this case the system failed. It can still seem as though the user who spends the most time on the site—or who yells the loudest—wins.
Connolley believes that Wikipedia “gives no privilege to those who know what they’re talking about,” a view that is echoed by many academics and former contributors, including Larry Sanger, who argues that too many Wikipedians are fundamentally suspicious of experts and unjustly confident of their own opinions. He left Wikipedia in March, 2002, after Wales ran out of money to support the site during the dot-com bust. Sanger concluded that he had become a symbol of authority in an anti-authoritarian community. “Wikipedia has gone from a nearly perfect anarchy to an anarchy with gang rule,” he told me. (Sanger is now the director of collaborative projects at the online foundation Digital Universe, where he is helping to develop a Web-based encyclopedia, a hybrid between a wiki and a traditional reference work. He promises that it will have “the lowest error rate in history.”) Even Eric Raymond, the open-source pioneer whose work inspired Wales, argues that “ ‘disaster’ is not too strong a word” for Wikipedia. In his view, the site is “infested with moonbats.” (Think hobgoblins of little minds, varsity division.) He has found his corrections to entries on science fiction dismantled by users who evidently felt that he was trespassing on their terrain. “The more you look at what some of the Wikipedia contributors have done, the better Britannica looks,” Raymond said. He believes that the open-source model is simply inapplicable to an encyclopedia. For software, there is an objective standard: either it works or it doesn’t. There is no such test for truth.
Nor has increasing surveillance of the site by admins deterred vandals, a majority of whom seem to be inserting obscenities and absurdities into Wikipedia when they should be doing their homework. Many are committing their pranks in the classroom: the abuse tends to ebb on a Friday afternoon and resume early on a Monday. Entire schools and universities have found their I.P. addresses blocked as a result. The entry on George W. Bush has been vandalized so frequently—sometimes more than twice a minute—that it is often closed to editing for days. At any given time, a couple of hundred entries are semi-protected, which means that a user must register his I.P. address and wait several days before making changes. This group recently included not only the entries on God, Galileo, and Al Gore but also those on poodles, oranges, and Frédéric Chopin. Even Wales has been caught airbrushing his Wikipedia entry—eighteen times in the past year. He is particularly sensitive about references to the porn traffic on his Web portal. “Adult content” or “glamour photography” are the terms that he prefers, though, as one user pointed out on the site, they are perhaps not the most precise way to describe lesbian strip-poker threesomes. (In January, Wales agreed to a compromise: “erotic photography.”) He is repentant about his meddling. “People shouldn’t do it, including me,” he said. “It’s in poor taste.”
Wales recently established an “oversight” function, by which some admins (Essjay among them) can purge text from the system, so that even the history page bears no record of its ever having been there. Wales says that this measure is rarely used, and only in order to remove slanderous or private information, such as a telephone number. “It’s a perfectly reasonable power in any other situation, but completely antithetical to this project,” said Jason Scott, a longtime contributor to Wikipedia who has published several essays critical of the site.
Is Wikipedia accurate? Last year, Nature published a survey comparing forty-two entries on scientific topics on Wikipedia with their counterparts in Encyclopædia Britannica. According to the survey, Wikipedia had four errors for every three of Britannica’s, a result that, oddly, was hailed as a triumph for the upstart. Such exercises in nitpicking are relatively meaningless, as no reference work is infallible. Britannica issued a public statement refuting the survey’s findings, and took out a half-page advertisement in the Times, which said, in part, “Britannica has never claimed to be error-free. We have a reputation not for unattainable perfection but for strong scholarship, sound judgment, and disciplined editorial review.” Later, Jorge Cauz, Britannica’s president, told me in an e-mail that if Wikipedia continued without some kind of editorial oversight it would “decline into a hulking mediocre mass of uneven, unreliable, and, many times, unreadable articles.” Wales has said that he would consider Britannica a competitor, “except that I think they will be crushed out of existence within five years.”
Larry Sanger proposes a fine distinction between knowledge that is useful and knowledge that is reliable, and there is no question that Wikipedia beats every other source when it comes to breadth, efficiency, and accessibility. Yet the site’s virtues are also liabilities. Cauz scoffed at the notion of “good enough knowledge.” “I hate that,” he said, pointing out that there is no way to know which facts in an entry to trust. Or, as Robert McHenry, a veteran editor at Britannica, put it, “We can get the wrong answer to a question quicker than our fathers and mothers could find a pencil.”
Part of the problem is provenance. The bulk of Wikipedia’s content originates not in the stacks but on the Web, which offers up everything from breaking news, spin, and gossip to proof that the moon landings never took place. Glaring errors jostle quiet omissions. Wales, in his public speeches, cites the Google test: “If it isn’t on Google, it doesn’t exist.” This position poses another difficulty: on Wikipedia, the present takes precedent over the past. The (generally good) entry on St. Augustine is shorter than the one on Britney Spears. The article on Nietzsche has been modified incessantly, yielding five archived talk pages. But the debate is largely over Nietzsche’s politics; taken as a whole, the entry is inferior to the essay in the current Britannica, a model of its form. (From Wikipedia: “Nietzsche also owned a copy of Philipp Mainländer’s ‘Die Philosophie der Erlösung,’ a work which, like Schopenhauer’s philosophy, expressed pessimism.”)
Wikipedia remains a lumpy work in progress. The entries can read as though they had been written by a seventh grader: clarity and concision are lacking; the facts may be sturdy, but the connective tissue is either anemic or absent; and citation is hit or miss. Wattenberg and Viégas, of I.B.M., note that the vast majority of Wikipedia edits consist of deletions and additions rather than of attempts to reorder paragraphs or to shape an entry as a whole, and they believe that Wikipedia’s twenty-five-line editing window deserves some of the blame. It is difficult to craft an article in its entirety when reading it piecemeal, and, given Wikipedians’ obsession with racking up edits, simple fixes often take priority over more complex edits. Wattenberg and Viégas have also identified a “first-mover advantage”: the initial contributor to an article often sets the tone, and that person is rarely a Macaulay or a Johnson. The over-all effect is jittery, the textual equivalent of a film shot with a handheld camera.
What can be said for an encyclopedia that is sometimes right, sometimes wrong, and sometimes illiterate? When I showed the Harvard philosopher Hilary Putnam his entry, he was surprised to find it as good as the one in the Stanford Encyclopedia of Philosophy. He was flabbergasted when he learned how Wikipedia worked. “Obviously, this was the work of experts,” he said. In the nineteen-sixties, William F. Buckley, Jr., said that he would sooner “live in a society governed by the first two thousand names in the Boston telephone directory than in a society governed by the two thousand faculty members of Harvard University.” On Wikipedia, he might finally have his wish. How was his page? Essentially on target, he said. All the same, Buckley added, he would prefer that those anonymous two thousand souls govern, and leave the encyclopedia writing to the experts.
Over breakfast in early May, I asked Cauz for an analogy with which to compare Britannica and Wikipedia. “Wikipedia is to Britannica as ‘American Idol’ is to the Juilliard School,” he e-mailed me the next day. A few days later, Wales also chose a musical metaphor. “Wikipedia is to Britannica as rock and roll is to easy listening,” he suggested. “It may not be as smooth, but it scares the parents and is a lot smarter in the end.” He is right to emphasize the fright factor over accuracy. As was the Encyclopédie, Wikipedia is a combination of manifesto and reference work. Peer review, the mainstream media, and government agencies have landed us in a ditch. Not only are we impatient with the authorities but we are in a mood to talk back. Wikipedia offers endless opportunities for self-expression. It is the love child of reading groups and chat rooms, a second home for anyone who has written an Amazon review. This is not the first time that encyclopedia-makers have snatched control from an élite, or cast a harsh light on certitude. Jimmy Wales may or may not be the new Henry Ford, yet he has sent us tooling down the interstate, with but a squint back at the railroad. We’re on the open road now, without conductors and timetables. We’re free to chart our own course, also free to get gloriously, recklessly lost. Your truth or mine?
Because there are no physical limits on its size, Wikipedia can aspire to be all-inclusive. It is also perfectly configured to be current: there are detailed entries for each of the twelve finalists on this season’s “American Idol,” and the article on the “2006 Israel-Lebanon Conflict” has been edited more than four thousand times since it was created, on July 12th, six hours after Hezbollah militants ignited the hostilities by kidnapping two Israeli soldiers. Wikipedia, which was launched in 2001, is now the seventeenth-most-popular site on the Internet, generating more traffic daily than MSNBC.com and the online versions of the Times and the Wall Street Journal combined. The number of visitors has been doubling every four months; the site receives as many as fourteen thousand hits per second. Wikipedia functions as a filter for vast amounts of information online, and it could be said that Google owes the site for tidying up the neighborhood. But the search engine is amply repaying its debt: because Wikipedia pages contain so many links to other entries on the site, and are so frequently updated, they enjoy an enviably high page rank.
The site has achieved this prominence largely without paid staff or revenue. It has five employees in addition to Jimmy Wales, Wikipedia’s thirty-nine-year-old founder, and it carries no advertising. In 2003, Wikipedia became a nonprofit organization; it meets most of its budget, of seven hundred and fifty thousand dollars, with donations, the bulk of them contributions of twenty dollars or less. Wales says that he is on a mission to “distribute a free encyclopedia to every single person on the planet in their own language,” and to an astonishing degree he is succeeding. Anyone with Internet access can create a Wikipedia entry or edit an existing one. The site currently exists in more than two hundred languages and has hundreds of thousands of contributors around the world. Wales is at the forefront of a revolution in knowledge gathering: he has marshalled an army of volunteers who believe that, working collaboratively, they can produce an encyclopedia that is as good as any written by experts, and with an unprecedented range.
Wikipedia is an online community devoted not to last night’s party or to next season’s iPod but to a higher good. It is also no more immune to human nature than any other utopian project. Pettiness, idiocy, and vulgarity are regular features of the site. Nothing about high-minded collaboration guarantees accuracy, and open editing invites abuse. Senators and congressmen have been caught tampering with their entries; the entire House of Representatives has been banned from Wikipedia several times. (It is not subtle to change Senator Robert Byrd’s age from eighty-eight to a hundred and eighty. It is subtler to sanitize one’s voting record in order to distance oneself from an unpopular President, or to delete broken campaign promises.) Curiously, though, mob rule has not led to chaos. Wikipedia, which began as an experiment in unfettered democracy, has sprouted policies and procedures. At the same time, the site embodies our newly casual relationship to truth. When confronted with evidence of errors or bias, Wikipedians invoke a favorite excuse: look how often the mainstream media, and the traditional encyclopedia, are wrong! As defenses go, this is the epistemological equivalent of “But Johnny jumped off the bridge first.” Wikipedia, though, is only five years old. One day, it may grow up.
The encyclopedic impulse dates back more than two thousand years and has rarely balked at national borders. Among the first general reference works was Emperor’s Mirror, commissioned in 220 A.D. by a Chinese emperor, for use by civil servants. The quest to catalogue all human knowledge accelerated in the eighteenth century. In the seventeen-seventies, the Germans, champions of thoroughness, began assembling a two-hundred-and-forty-two-volume masterwork. A few decades earlier, Johann Heinrich Zedler, a Leipzig bookseller, had alarmed local competitors when he solicited articles for his Universal-Lexicon. His rivals, fearing that the work would put them out of business by rendering all other books obsolete, tried unsuccessfully to sabotage the project.
It took a devious Frenchman, Pierre Bayle, to conceive of an encyclopedia composed solely of errors. After the idea failed to generate much enthusiasm among potential readers, he instead compiled a “Dictionnaire Historique et Critique,” which consisted almost entirely of footnotes, many highlighting flaws of earlier scholarship. Bayle taught readers to doubt, a lesson in subversion that Diderot and d’Alembert, the authors of the Encyclopédie (1751-80), learned well. Their thirty-five-volume work preached rationalism at the expense of church and state. The more stolid Britannica was born of cross-channel rivalry and an Anglo-Saxon passion for utility.
Wales’s first encyclopedia was the World Book, which his parents acquired after dinner one evening in 1969, from a door-to-door salesman. Wales—who resembles a young Billy Crystal with the neuroses neatly tucked in—recalls the enchantment of pasting in update stickers that cross-referenced older entries to the annual supplements. Wales’s mother and grandmother ran a private school in Huntsville, Alabama, which he attended from the age of three. He graduated from Auburn University with a degree in finance and began a Ph.D. in the subject, enrolling first at the University of Alabama and later at Indiana University. In 1994, he decided to take a job trading options in Chicago rather than write his dissertation. Four years later, he moved to San Diego, where he used his savings to found an Internet portal. Its audience was mostly men; pornography—videos and blogs—accounted for about a tenth of its revenues. Meanwhile, Wales was cogitating. In his view, misinformation, propaganda, and ignorance are responsible for many of the world’s ills. “I’m very much an Enlightenment kind of guy,” Wales told me. The promise of the Internet is free knowledge for everyone, he recalls thinking. How do we make that happen?
As an undergraduate, he had read Friedrich Hayek’s 1945 free-market manifesto, “The Use of Knowledge in Society,” which argues that a person’s knowledge is by definition partial, and that truth is established only when people pool their wisdom. Wales thought of the essay again in the nineteen-nineties, when he began reading about the open-source movement, a group of programmers who believed that software should be free and distributed in such a way that anyone could modify the code. He was particularly impressed by “The Cathedral and the Bazaar,” an essay, later expanded into a book, by Eric Raymond, one of the movement’s founders. “It opened my eyes to the possibility of mass collaboration,” Wales said.
The first step was a misstep. In 2000, Wales hired Larry Sanger, a graduate student in philosophy he had met on a Listserv, to help him create an online general-interest encyclopedia called Nupedia. The idea was to solicit articles from scholars, subject the articles to a seven-step review process, and post them free online. Wales himself tried to compose the entry on Robert Merton and options-pricing theory; after he had written a few sentences, he remembered why he had dropped out of graduate school. “They were going to take my essay and send it to two finance professors in the field,” he recalled. “I had been out of academia for several years. It was intimidating; it felt like homework.”
After a year, Nupedia had only twenty-one articles, on such topics as atonality and Herodotus. In January, 2001, Sanger had dinner with a friend, who told him about the wiki, a simple software tool that allows for collaborative writing and editing. Sanger thought that a wiki might attract new contributors to Nupedia. (Wales says that using a wiki was his idea.) Wales agreed to try it, more or less as a lark. Under the wiki model that Sanger and Wales adopted, each entry included a history page, which preserves a record of all editing changes. They added a talk page, to allow for discussion of the editorial process—an idea Bayle would have appreciated. Sanger coined the term Wikipedia, and the site went live on January 15, 2001. Two days later, he sent an e-mail to the Nupedia mailing list—about two thousand people. “Wikipedia is up!” he wrote. “Humor me. Go there and add a little article. It will take all of five or ten minutes.”
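For readers curious about the mechanics, here is a minimal sketch, in Python, of the model the paragraph describes: every saved edit is appended to a history, so earlier versions are never lost, and each entry carries a separate talk thread for discussion. The class and method names are invented for illustration; this is not the software Wikipedia actually runs.

    # Hypothetical sketch of the wiki model described above, not Wikipedia's code.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class Revision:
        author: str
        text: str
        timestamp: datetime

    @dataclass
    class Entry:
        title: str
        history: list = field(default_factory=list)  # every version ever saved
        talk: list = field(default_factory=list)     # editorial discussion

        def edit(self, author, text):
            self.history.append(Revision(author, text, datetime.now(timezone.utc)))

        def current(self):
            return self.history[-1].text if self.history else ""

        def revert(self, author):
            # Reinstating the previous version is itself recorded as a new revision,
            # so the history still shows that the revert happened.
            if len(self.history) >= 2:
                self.edit(author, self.history[-2].text)

A revert, in other words, erases nothing; it only adds another layer to the record, which is what makes edit wars both possible and traceable.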
Wales braced himself for “complete rubbish.” He figured that if he and Sanger were lucky the wiki would generate a few rough drafts for Nupedia. Within a month, Wikipedia had six hundred articles. After a year, there were twenty thousand.
Wales is fond of citing a 1962 proclamation by Charles Van Doren, who later became an editor at Britannica. Van Doren believed that the traditional encyclopedia was defunct. It had grown by accretion rather than by design; it had sacrificed artful synthesis to plodding convention; it looked backward. “Because the world is radically new, the ideal encyclopedia should be radical, too,” Van Doren wrote. “It should stop being safe—in politics, in philosophy, in science.”
In its seminal Western incarnation, the encyclopedia had been a dangerous book. The Encyclopédie muscled aside religious institutions and orthodoxies to install human reason at the center of the universe—and, for that muscling, briefly earned the book’s publisher a place in the Bastille. As the historian Robert Darnton pointed out, the entry in the Encyclopédie on cannibalism ends with the cross-reference “See Eucharist.” What Wales seems to have in mind, however, is less Van Doren’s call to arms than that of an earlier rabble-rouser. In the nineteen-thirties, H. G. Wells lamented that, while the world was becoming smaller and moving at increasing speed, the way information was distributed remained old-fashioned and ineffective. He prescribed a “world brain,” a collaborative, decentralized repository of knowledge that would be subject to continual revision. More radically—with “alma-matricidal impiety,” as he put it—Wells indicted academia; the university was itself medieval. “We want a Henry Ford today to modernize the distribution of knowledge, make good knowledge cheap and easy in this still very ignorant, ill-educated, ill-served English-speaking world of ours,” he wrote. Had the Internet existed in his lifetime, Wells might have beaten Wales to the punch.
Wales’s most radical contribution may be not to have made information free but—in his own alma-matricidal way—to have invented a system that does not favor the Ph.D. over the well-read fifteen-year-old. “To me, the key thing is getting it right,” Wales has said of Wikipedia’s contributors. “I don’t care if they’re a high-school kid or a Harvard professor.” At the beginning, there were no formal rules, though Sanger eventually posted a set of guidelines on the site. The first was “Ignore all the rules.” Two of the others have become central tenets: articles must reflect a neutral point of view (N.P.O.V., in Wikipedia lingo), and their content must be both verifiable and previously published. Among other things, the prohibition against original research heads off a great deal of material about people’s pets.
Insofar as Wikipedia has a physical existence, it is in St. Petersburg, Florida, in an executive suite that serves as the headquarters of the Wikimedia Foundation, the parent organization of Wikipedia and its lesser-known sister projects, among them Wikisource (a library of free texts), Wikinews (a current-events site), and Wikiquote (bye-bye Bartlett’s). Wales, who is married and has a five-year-old daughter, says that St. Petersburg’s attractive housing prices lured him from California. When I visited the offices in March, the walls were bare, the furniture battered. With the addition of a dead plant, the suite could pass for a graduate-student lounge.
The real work at Wikipedia takes place not in Florida but on thousands of computer screens across the world. Perhaps Wikipedia’s greatest achievement—one that Wales did not fully anticipate—was the creation of a community. Wikipedians are officially anonymous, contributing to unsigned entries under screen names. They are also predominantly male—about eighty per cent, Wales says—and compulsively social, conversing with each other not only on the talk pages attached to each entry but on Wikipedia-dedicated I.R.C. channels and on user pages, which regular contributors often create and which serve as a sort of personalized office cooler. On the page of a twenty-year-old Wikipedian named Arocoun, who lists “philosophizing” among his favorite activities, messages from other users range from the reflective (“I’d argue against your claim that humans should aim to be independent/self-reliant in all aspects of their lives . . . I don’t think true independence is a realistic ideal given all the inherent intertwinings of any society”) to the geekily flirtatious (“I’m a neurotic painter from Ohio, and I guess if you consider your views radical, then I’m a radical, too. So . . . we should be friends”).
Wikipedians have evolved a distinctive vocabulary, of which “revert,” meaning “reinstate”—as in “I reverted the edit, but the user has simply rereverted it”—may be the most commonly used word. Other terms include WikiGnome (a user who keeps a low profile, fixing typos, poor grammar, and broken links) and its antithesis, WikiTroll (a user who persistently violates the site’s guidelines or otherwise engages in disruptive behavior). There are Aspergian Wikipedians (seventy-two), bipolar Wikipedians, vegetarian Wikipedians, antivegetarian Wikipedians, existential Wikipedians, pro-Luxembourg Wikipedians, and Wikipedians who don’t like to be categorized. According to a page on the site, an avid interest in Wikipedia has been known to afflict “computer programmers, academics, graduate students, game-show contestants, news junkies, the unemployed, the soon-to-be unemployed and, in general, people with multiple interests and good memories.” You may travel in more exalted circles, but this covers pretty much everyone I know.
Wikipedia may be the world’s most ambitious vanity press. There are two hundred thousand registered users on the English-language site, of whom about thirty-three hundred—fewer than two per cent—are responsible for seventy per cent of the work. The site allows you to compare contributors by the number of edits they have made, by the number of articles that have been judged by community vote to be outstanding (these “featured” articles often appear on the site’s home page), and by hourly activity, in graph form. A seventeen-year-old P. G. Wodehouse fan who specializes in British peerages leads the featured-article pack, with fifty-eight entries. A twenty-four-year-old University of Toronto graduate is the site’s premier contributor. Since composing his first piece, on the Panama Canal, in 2001, he has written or edited more than seventy-two thousand articles. “Wikipediholism” and “editcountitis” are well defined on the site; both link to an article on obsessive-compulsive disorder. (There is a Britannica entry for O.C.D., but no version of it has included Felix Unger’s name in the third sentence, a comprehensive survey of “OCD in literature and film,” or a list of celebrity O.C.D. sufferers, which unites, surely for the first time in history, Florence Nightingale with Joey Ramone.)
One regular on the site is a user known as Essjay, who holds a Ph.D. in theology and a degree in canon law and has written or contributed to sixteen thousand entries. A tenured professor of religion at a private university, Essjay made his first edit in February, 2005. Initially, he contributed to articles in his field—on the penitential rite, transubstantiation, the papal tiara. Soon he was spending fourteen hours a day on the site, though he was careful to keep his online life a secret from his colleagues and friends. (To his knowledge, he has never met another Wikipedian, and he will not be attending Wikimania, the second international gathering of the encyclopedia’s contributors, which will take place in early August in Boston.)
Gradually, Essjay found himself devoting less time to editing and more to correcting errors and removing obscenities from the site. In May, he twice removed a sentence from the entry on Justin Timberlake asserting that the pop star had lost his home in 2002 for failing to pay federal taxes—a statement that Essjay knew to be false. The incident ended there. Others involve ideological disagreements and escalate into intense edit wars. A number of the disputes on the English-language Wikipedia relate to the Israeli-Palestinian conflict and to religious issues. Almost as acrimonious are the battles waged over the entries on Macedonia, Danzig, the Armenian genocide, and Henry Ford. Ethnic feuds die hard: Was Copernicus Polish, German, or Prussian? (A nonbinding poll was conducted earlier this year to determine whether the question merited mention in the article’s lead.) Some debates may never be resolved: Was the 1812 Battle of Borodino a victory for the Russians or for the French? What is the date of Ann Coulter’s birth? Is apple pie all-American? (The answer, at least for now, is no: “Apple trees didn’t even grow in America until the Europeans brought them over,” one user railed. He was seconded by another, who added, “Apple pie is very popular in the Netherlands too. Americans did not invent or introduce it to the Netherlands. You already plagiarized Santa Claus from our Saint Nicholas. Stop it!”) Who could have guessed that “cheese” would figure among the site’s most contested entries? (The controversy entailed whether in Asia there is a cultural prohibition against eating it.) For the past nine months, Baltimore’s climate has been a subject of bitter debate. What is the average temperature in January?
At first, Wales handled the fistfights himself, but he was reluctant to ban anyone from the site. As the number of users increased, so did the editing wars and the incidence of vandalism. In October, 2001, Wales appointed a small cadre of administrators, called admins, to police the site for abuse. Admins can delete articles or protect them from further changes, block users from editing, and revert text more efficiently than can ordinary users. (There are now nearly a thousand admins on the site.) In 2004, Wales formalized the 3R rule—initially it had been merely a guideline—according to which any user who reverts the same text more than three times in a twenty-four-hour period is blocked from editing for a day. The policy grew out of a series of particularly vitriolic battles, including one over the U.S. economy—it was experiencing either high growth and low unemployment or low growth and high unemployment.
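The rule itself is simple enough to state in a few lines of code. The sketch below, with invented names and a simplified notion of what counts as a revert, illustrates the policy rather than Wikipedia’s actual enforcement machinery.

    # Toy check for the three-revert rule described above: a user who reverts the
    # same text more than three times within twenty-four hours is blocked for a day.
    # Names and data layout are assumptions made for illustration.
    from datetime import datetime, timedelta

    REVERT_LIMIT = 3
    WINDOW = timedelta(hours=24)
    BLOCK_LENGTH = timedelta(days=1)

    def block_expiry(revert_times, now):
        """Return when a block should expire, or None if the user is within the rule."""
        recent = [t for t in revert_times if now - t <= WINDOW]
        if len(recent) > REVERT_LIMIT:
            return now + BLOCK_LENGTH
        return None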
Wales also appointed an arbitration committee to rule on disputes. Before a case reaches the arbitration committee, it often passes through a mediation committee. Essjay is serving a second term as chair of the mediation committee. He is also an admin, a bureaucrat, and a checkuser, which means that he is one of fourteen Wikipedians authorized to trace I.P. addresses in cases of suspected abuse. He often takes his laptop to class, so that he can be available to Wikipedians while giving a quiz, and he keeps an eye on twenty I.R.C. chat channels, where users often trade gossip about abuses they have witnessed.
Five robots troll the site for obvious vandalism, searching for obscenities and evidence of mass deletions, reverting text as they go. More egregious violations require human intervention. Essjay recently caught a user who, under one screen name, was replacing sentences with nonsense and deleting whole entries and, under another, correcting the abuses—all in order to boost his edit count. He was banned permanently from the site. Some users who have been caught tampering threaten revenge against the admins who apprehend them. Essjay says that he routinely receives death threats. “There are people who take Wikipedia way too seriously,” he told me. (Wikipedians have acknowledged Essjay’s labors by awarding him numerous barnstars—five-pointed stars, which the community has adopted as a symbol of praise—including several Random Acts of Kindness Barnstars and the Tireless Contributor Barnstar.)
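A patrol bot of the kind described needs only crude heuristics to catch the most obvious abuse. The toy function below, with an invented word list and threshold, flags an edit that introduces blacklisted terms or blanks most of a page; the real bots are more elaborate.

    # Bare-bones heuristic of the sort a patrol bot might apply, per the paragraph
    # above. The word list and the one-tenth threshold are invented for illustration.
    BLACKLIST = {"obscenity", "profanity"}  # placeholder terms

    def looks_like_vandalism(old_text, new_text):
        words = set(new_text.lower().split())
        if words & BLACKLIST:
            return True
        # A mass deletion: the new text is less than a tenth the length of the old.
        if old_text and len(new_text) < 0.1 * len(old_text):
            return True
        return False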
Wikipedia has become a regulatory thicket, complete with an elaborate hierarchy of users and policies about policies. Martin Wattenberg and Fernanda B. Viégas, two researchers at I.B.M. who have studied the site using computerized visual models called “history flows,” found that the talk pages and “meta pages”—those dealing with coördination and administration—have experienced the greatest growth. Whereas articles once made up about eighty-five per cent of the site’s content, as of last October they represented seventy per cent. As Wattenberg put it, “People are talking about governance, not working on content.” Wales is ambivalent about the rules and procedures but believes that they are necessary. “Things work well when a group of people know each other, and things break down when it’s a bunch of random people interacting,” he told me.
For all its protocol, Wikipedia’s bureaucracy doesn’t necessarily favor truth. In March, 2005, William Connolley, a climate modeller at the British Antarctic Survey, in Cambridge, was briefly a victim of an edit war over the entry on global warming, to which he had contributed. After a particularly nasty confrontation with a skeptic, who had repeatedly watered down language pertaining to the greenhouse effect, the case went into arbitration. “User William M. Connolley strongly pushes his POV with systematic removal of any POV which does not match his own,” his accuser charged in a written deposition. “His views on climate science are singular and narrow.” A decision from the arbitration committee was three months in coming, after which Connolley was placed on a humiliating one-revert-a-day parole. The punishment was later revoked, and Connolley is now an admin, with two thousand pages on his watchlist—a feature that enables users to compile lists of entries and to be notified when changes are made to them. He says that Wikipedia’s entry on global warming may be the best page on the subject anywhere on the Web. Nevertheless, Wales admits that in this case the system failed. It can still seem as though the user who spends the most time on the site—or who yells the loudest—wins.
Connolley believes that Wikipedia “gives no privilege to those who know what they’re talking about,” a view that is echoed by many academics and former contributors, including Larry Sanger, who argues that too many Wikipedians are fundamentally suspicious of experts and unjustly confident of their own opinions. He left Wikipedia in March, 2002, after Wales ran out of money to support the site during the dot-com bust. Sanger concluded that he had become a symbol of authority in an anti-authoritarian community. “Wikipedia has gone from a nearly perfect anarchy to an anarchy with gang rule,” he told me. (Sanger is now the director of collaborative projects at the online foundation Digital Universe, where he is helping to develop a Web-based encyclopedia, a hybrid between a wiki and a traditional reference work. He promises that it will have “the lowest error rate in history.”) Even Eric Raymond, the open-source pioneer whose work inspired Wales, argues that “ ‘disaster’ is not too strong a word” for Wikipedia. In his view, the site is “infested with moonbats.” (Think hobgoblins of little minds, varsity division.) He has found his corrections to entries on science fiction dismantled by users who evidently felt that he was trespassing on their terrain. “The more you look at what some of the Wikipedia contributors have done, the better Britannica looks,” Raymond said. He believes that the open-source model is simply inapplicable to an encyclopedia. For software, there is an objective standard: either it works or it doesn’t. There is no such test for truth.
Nor has increasing surveillance of the site by admins deterred vandals, a majority of whom seem to be inserting obscenities and absurdities into Wikipedia when they should be doing their homework. Many are committing their pranks in the classroom: the abuse tends to ebb on a Friday afternoon and resume early on a Monday. Entire schools and universities have found their I.P. addresses blocked as a result. The entry on George W. Bush has been vandalized so frequently—sometimes more than twice a minute—that it is often closed to editing for days. At any given time, a couple of hundred entries are semi-protected, which means that a user must register his I.P. address and wait several days before making changes. This group recently included not only the entries on God, Galileo, and Al Gore but also those on poodles, oranges, and Frédéric Chopin. Even Wales has been caught airbrushing his Wikipedia entry—eighteen times in the past year. He is particularly sensitive about references to the porn traffic on his Web portal. “Adult content” or “glamour photography” are the terms that he prefers, though, as one user pointed out on the site, they are perhaps not the most precise way to describe lesbian strip-poker threesomes. (In January, Wales agreed to a compromise: “erotic photography.”) He is repentant about his meddling. “People shouldn’t do it, including me,” he said. “It’s in poor taste.”
Wales recently established an “oversight” function, by which some admins (Essjay among them) can purge text from the system, so that even the history page bears no record of its ever having been there. Wales says that this measure is rarely used, and only in order to remove slanderous or private information, such as a telephone number. “It’s a perfectly reasonable power in any other situation, but completely antithetical to this project,” said Jason Scott, a longtime contributor to Wikipedia who has published several essays critical of the site.
Is Wikipedia accurate? Last year, Nature published a survey comparing forty-two entries on scientific topics on Wikipedia with their counterparts in Encyclopædia Britannica. According to the survey, Wikipedia had four errors for every three of Britannica’s, a result that, oddly, was hailed as a triumph for the upstart. Such exercises in nitpicking are relatively meaningless, as no reference work is infallible. Britannica issued a public statement refuting the survey’s findings, and took out a half-page advertisement in the Times, which said, in part, “Britannica has never claimed to be error-free. We have a reputation not for unattainable perfection but for strong scholarship, sound judgment, and disciplined editorial review.” Later, Jorge Cauz, Britannica’s president, told me in an e-mail that if Wikipedia continued without some kind of editorial oversight it would “decline into a hulking mediocre mass of uneven, unreliable, and, many times, unreadable articles.” Wales has said that he would consider Britannica a competitor, “except that I think they will be crushed out of existence within five years.”
Larry Sanger proposes a fine distinction between knowledge that is useful and knowledge that is reliable, and there is no question that Wikipedia beats every other source when it comes to breadth, efficiency, and accessibility. Yet the site’s virtues are also liabilities. Cauz scoffed at the notion of “good enough knowledge.” “I hate that,” he said, pointing out that there is no way to know which facts in an entry to trust. Or, as Robert McHenry, a veteran editor at Britannica, put it, “We can get the wrong answer to a question quicker than our fathers and mothers could find a pencil.”
Part of the problem is provenance. The bulk of Wikipedia’s content originates not in the stacks but on the Web, which offers up everything from breaking news, spin, and gossip to proof that the moon landings never took place. Glaring errors jostle quiet omissions. Wales, in his public speeches, cites the Google test: “If it isn’t on Google, it doesn’t exist.” This position poses another difficulty: on Wikipedia, the present takes precedence over the past. The (generally good) entry on St. Augustine is shorter than the one on Britney Spears. The article on Nietzsche has been modified incessantly, yielding five archived talk pages. But the debate is largely over Nietzsche’s politics; taken as a whole, the entry is inferior to the essay in the current Britannica, a model of its form. (From Wikipedia: “Nietzsche also owned a copy of Philipp Mainländer’s ‘Die Philosophie der Erlösung,’ a work which, like Schopenhauer’s philosophy, expressed pessimism.”)
Wikipedia remains a lumpy work in progress. The entries can read as though they had been written by a seventh grader: clarity and concision are lacking; the facts may be sturdy, but the connective tissue is either anemic or absent; and citation is hit or miss. Wattenberg and Viégas, of I.B.M., note that the vast majority of Wikipedia edits consist of deletions and additions rather than of attempts to reorder paragraphs or to shape an entry as a whole, and they believe that Wikipedia’s twenty-five-line editing window deserves some of the blame. It is difficult to craft an article in its entirety when reading it piecemeal, and, given Wikipedians’ obsession with racking up edits, simple fixes often take priority over more complex edits. Wattenberg and Viégas have also identified a “first-mover advantage”: the initial contributor to an article often sets the tone, and that person is rarely a Macaulay or a Johnson. The over-all effect is jittery, the textual equivalent of a film shot with a handheld camera.
What can be said for an encyclopedia that is sometimes right, sometimes wrong, and sometimes illiterate? When I showed the Harvard philosopher Hilary Putnam his entry, he was surprised to find it as good as the one in the Stanford Encyclopedia of Philosophy. He was flabbergasted when he learned how Wikipedia worked. “Obviously, this was the work of experts,” he said. In the nineteen-sixties, William F. Buckley, Jr., said that he would sooner “live in a society governed by the first two thousand names in the Boston telephone directory than in a society governed by the two thousand faculty members of Harvard University.” On Wikipedia, he might finally have his wish. How was his page? Essentially on target, he said. All the same, Buckley added, he would prefer that those anonymous two thousand souls govern, and leave the encyclopedia writing to the experts.
Over breakfast in early May, I asked Cauz for an analogy with which to compare Britannica and Wikipedia. “Wikipedia is to Britannica as ‘American Idol’ is to the Juilliard School,” he e-mailed me the next day. A few days later, Wales also chose a musical metaphor. “Wikipedia is to Britannica as rock and roll is to easy listening,” he suggested. “It may not be as smooth, but it scares the parents and is a lot smarter in the end.” He is right to emphasize the fright factor over accuracy. As was the Encyclopédie, Wikipedia is a combination of manifesto and reference work. Peer review, the mainstream media, and government agencies have landed us in a ditch. Not only are we impatient with the authorities but we are in a mood to talk back. Wikipedia offers endless opportunities for self-expression. It is the love child of reading groups and chat rooms, a second home for anyone who has written an Amazon review. This is not the first time that encyclopedia-makers have snatched control from an élite, or cast a harsh light on certitude. Jimmy Wales may or may not be the new Henry Ford, yet he has sent us tooling down the interstate, with but a squint back at the railroad. We’re on the open road now, without conductors and timetables. We’re free to chart our own course, also free to get gloriously, recklessly lost. Your truth or mine?
Cold, Hard Facts
IN the debate on global warming, the data on the climate of Antarctica has been distorted, at different times, by both sides. As a polar researcher caught in the middle, I’d like to set the record straight.
In January 2002, a research paper about Antarctic temperatures, of which I was the lead author, appeared in the journal Nature. At the time, the Antarctic Peninsula was warming, and many people assumed that meant the climate on the entire continent was heating up, as the Arctic was. But the Antarctic Peninsula represents only about 15 percent of the continent’s land mass, so it could not tell the whole story of Antarctic climate. Our paper made the continental picture clearer.
My research colleagues and I found that from 1986 to 2000, one small, ice-free area of the Antarctic mainland had actually cooled. Our report also analyzed temperatures for the mainland in such a way as to remove the influence of the peninsula warming and found that, from 1966 to 2000, more of the continent had cooled than had warmed. Our summary statement pointed out how the cooling trend posed challenges to models of Antarctic climate and ecosystem change.
Newspaper and television reports focused on this part of the paper. And many news and opinion writers linked our study with another bit of polar research published that month, in Science, showing that part of Antarctica’s ice sheet had been thickening — and erroneously concluded that the earth was not warming at all. “Scientific findings run counter to theory of global warming,” said a headline on an editorial in The San Diego Union-Tribune. One conservative commentator wrote, “It’s ironic that two studies suggesting that a new Ice Age may be under way may end the global warming debate.”
In a rebuttal in The Providence Journal, in Rhode Island, the lead author of the Science paper and I explained that our studies offered no evidence that the earth was cooling. But the misinterpretation had already become legend, and in the four and a half years since, it has only grown.
Our results have been misused as “evidence” against global warming by Michael Crichton in his novel “State of Fear” and by Ann Coulter in her latest book, “Godless: The Church of Liberalism.” Search my name on the Web, and you will find pages of links to everything from climate discussion groups to Senate policy committee documents — all citing my 2002 study as reason to doubt that the earth is warming. One recent Web column even put words in my mouth. I have never said that “the unexpected colder climate in Antarctica may possibly be signaling a lessening of the current global warming cycle.” I have never thought such a thing either.
Our study did find that 58 percent of Antarctica cooled from 1966 to 2000. But during that period, the rest of the continent was warming. And climate models created since our paper was published have suggested a link between the lack of significant warming in Antarctica and the ozone hole over that continent. These models, conspicuously missing from the warming-skeptic literature, suggest that as the ozone hole heals — thanks to worldwide bans on ozone-destroying chemicals — all of Antarctica is likely to warm with the rest of the planet. An inconvenient truth?
Also missing from the skeptics’ arguments is the debate over our conclusions. Another group of researchers who took a different approach found no clear cooling trend in Antarctica. We still stand by our results for the period we analyzed, but unbiased reporting would acknowledge differences of scientific opinion.
The disappointing thing is that we are even debating the direction of climate change on this globally important continent. And it may not end until we have more weather stations on Antarctica and longer-term data that demonstrate a clear trend.
In the meantime, I would like to remove my name from the list of scientists who dispute global warming. I know my coauthors would as well.
Peter Doran is an associate professor of earth and environmental sciences at the University of Illinois at Chicago.