From: andrew.uroskie@lcc.gatech.edu
Subject: Supreme Court Nomination
Date: October 31, 2005 9:11:53 AM EST
To: public@nytimes.com
Dear Public Editor,
I am concerned that the developing story by Christine Hauser and David Kirkpatrick is already taking a biased stance through its framing headline, "Nomination Likely to Please G.O.P., but Not Some Democrats: The choice of Samuel A. Alito Jr. likely will mend a rift in the G.O.P. caused by his failed nomination of Harriet E. Miers." Far from being a neutral description of a situation the authors themselves admit they cannot yet foresee, this headline frames the story as one in which Judge Alito is celebrated by a unified Republican Party, with only some remaining Democrats dissenting. If anything, the facts so far show the exact opposite: this candidate replaces a moderate conservative woman who upheld Roe v. Wade with a far-right man who would not. Polls have shown repeatedly, for decades, that a majority of Americans support Roe v. Wade. It is the Republicans who now will have to worry that their moderate members - generally on record as supporting Roe - will be the ones to join Democrats in opposing Alito. A more accurate headline would be, "Nomination Likely to Please Far Right, but Not Moderate Republicans." More accurate still would be simply to refrain from judgment: "Nomination Likely to Give Rise to Heated Debate." Please refrain from this kind of biased framing, especially in the crucial early stages when the public is just being introduced to an issue.
Andrew V. Uroskie
Asst. Professor of Literature, Communication and Culture
Georgia Institute of Technology
Monday, October 31
Alito
Amazing.
With a 39% approval rating, President Bush might have tried appealing to moderates in both parties. But he decided not to consult with Democrats in the Congress this time, as he had with now-Chief Justice Roberts.
Instead, he chose to replace a moderate woman, Sandra Day O'Connor, with a radical right-wing white man. Not a Hispanic man, not an African American, not an Asian American, not even a white woman. Another white man. That would make him the second white male Italian-American Catholic from New Jersey on the court.
Everyone says he's intelligent and a nice guy. I'm sure he is. It's just that you're swinging the Supreme Court vastly toward the right for probably 30+ years, and on what mandate? The Republican Party won a wartime election by the thinnest of hairs, and it is now decidedly out of step with the American public. A new poll shows that a majority of Americans believe the ethics and integrity of the presidency have decreased, rather than improved, since the presidency of Bill Clinton.
Here's Dahlia Lithwick, chief legal correspondent for Slate:
So rededicated is President Bush to keeping his promise to elevate a Clarence Thomas or an Antonin Scalia to the high court, that he picked the guy in the Scalia costume. Alito offers no surprises to anyone. If explicit promises to reverse Roe v. Wade are in fact the only qualification now needed to be confirmed to the Supreme Court, Alito has offered that pledge in spades: In Planned Parenthood of Southeastern Pennsylvania v. Casey—which later became the case that reaffirmed Roe—Alito dissented when his 3rd Circuit colleagues struck down Pennsylvania's most restrictive abortion regulations. Alito felt that none of the provisions proved an undue burden, including a requirement that women notify their spouses of their intent to have an abortion, absent narrow exceptions. Alito wrote: "The Pennsylvania legislature could have rationally believed that some married women are initially inclined to obtain an abortion without their husbands' knowledge because of perceived problems—such as economic constraints, future plans, or the husbands' previously expressed opposition—that may be obviated by discussion prior to the abortion."
Sandra Day O'Connor rejected that analysis, and Casey reaffirmed the central holding of Roe. Then-Chief Justice Rehnquist quoted Alito's dissent in his own.
You'll hear a lot about some of Alito's other decisions in the coming days, including his vote to limit Congress' power to ban even machine-gun possession, and his ruling that broadened police search powers to include the right to strip-search a drug dealer's wife and 10-year-old daughter—although they were not mentioned in the search warrant. He upheld a Christmas display against an Establishment Clause challenge. His prior rulings show that he would raise the barriers for victims of sex discrimination to seek redress in the courts. He would change the standard for analyzing race discrimination claims to such an extent that his colleagues on the court of appeals fretted that Title VII of the Civil Rights Act, which prohibits employment discrimination based on race, color, religion, sex, or national origin, would be "eviscerated" under his view of the law. He sought to narrow the Family and Medical Leave Act such that states would be immune from suit—a position the Supreme Court later rejected. In an antitrust case involving the Scotch tape giant 3M, he took a position described by a colleague as likely to weaken a provision of the Sherman Antitrust Act to "the point of impotence."
And there's a whole lot more where that came from.
Best of all for Bush's base, Alito is the kind of "restrained" jurist who isn't above striking down acts of Congress whenever they offend him. Bush noted this morning: "He has a deep understanding of the proper role of judges in our society. He understands that judges are to interpret the laws, not to impose their preferences or priorities on the people."
Except, of course, that Alito doesn't think Congress has the power to regulate machine-gun possession, or to broadly enforce the Family and Medical Leave Act, or to enact race or gender discrimination laws that might be effective in remedying race and gender discrimination, or to tackle monopolists. Alito thus neatly joins the ranks of right-wing activists in the battle to limit the power of Congress and diminish the efficacy of the judiciary. In that sense Bush has pulled off the perfect Halloween maneuver: He's managed the trick of getting his sticky scandals off the front pages, and the treat of a right-wing activist dressed up as a constitutional minimalist.
And Bruce Reed:
No mention of the Constitution or strict constructionism. No false judicial modesty that the new guy will sit quietly and behave himself on that bench.
What happened to Bush's old mantra? First, while we may not know Alito's shoe size, we know that shoe doesn't fit. Nobody who tried to overturn the Family and Medical Leave Act can claim that his philosophy is judge-modestly-and-carry-a-blank-slate.
The other reason Bush threw his judicial activism talking points out the window is that he doesn't need them anymore. On the contrary, he wants the right wing—and the left—to know that this nominee is the conservative judicial activist they've been waiting for all along. Bush's new message: Bring it on.
Forget all that mumbo-jumbo about umpires and judicial restraint, Bush seems to be saying. The fans don't come out to watch everybody sit on the bench—they want to see a brawl that clears it.
What's a Modern Girl to Do?
October 30, 2005
By MAUREEN DOWD
When I entered college in 1969, women were bursting out of their 50's chrysalis, shedding girdles, padded bras and conventions. The Jazz Age spirit flared in the Age of Aquarius. Women were once again imitating men and acting all independent: smoking, drinking, wanting to earn money and thinking they had the right to be sexual, this time protected by the pill. I didn't fit in with the brazen new world of hard-charging feminists. I was more of a fun-loving (if chaste) type who would decades later come to life in Sarah Jessica Parker's Carrie Bradshaw. I hated the grubby, unisex jeans and no-makeup look and drugs that zoned you out, and I couldn't understand the appeal of dances that didn't involve touching your partner. In the universe of Eros, I longed for style and wit. I loved the Art Deco glamour of 30's movies. I wanted to dance the Continental like Fred and Ginger in white hotel suites; drink martinis like Myrna Loy and William Powell; live the life of a screwball heroine like Katharine Hepburn, wearing a gold lamé gown cut on the bias, cavorting with Cary Grant, strolling along Fifth Avenue with my pet leopard.
My mom would just shake her head and tell me that my idea of the 30's was wildly romanticized. "We were poor," she'd say. "We didn't dance around in white hotel suites." I took the idealism and passion of the 60's for granted, simply assuming we were sailing toward perfect equality with men, a utopian world at home and at work. I didn't listen to her when she cautioned me about the chimera of equality.
On my 31st birthday, she sent me a bankbook with a modest nest egg she had saved for me. "I always felt that the girls in a family should get a little more than the boys even though all are equally loved," she wrote in a letter. "They need a little cushion to fall back on. Women can stand on the Empire State Building and scream to the heavens that they are equal to men and liberated, but until they have the same anatomy, it's a lie. It's more of a man's world today than ever. Men can eat their cake in unlimited bakeries."
I thought she was just being Old World, like my favorite jade, Dorothy Parker, when she wrote:
By the time you swear you're his,
Shivering and sighing,
And he vows his passion is
Infinite, undying -
Lady, make a note of this:
One of you is lying.
I thought the struggle for egalitarianism was a cinch, so I could leave it to my earnest sisters in black turtlenecks and Birkenstocks. I figured there was plenty of time for me to get serious later, that America would always be full of passionate and full-throated debate about the big stuff - social issues, sexual equality, civil rights. Little did I realize that the feminist revolution would have the unexpected consequence of intensifying the confusion between the sexes, leaving women in a tangle of dependence and independence as they entered the 21st century.
Maybe we should have known that the story of women's progress would be more of a zigzag than a superhighway, that the triumph of feminism would last a nanosecond while the backlash lasted 40 years.
Despite the best efforts of philosophers, politicians, historians, novelists, screenwriters, linguists, therapists, anthropologists and facilitators, men and women are still in a muddle in the boardroom, the bedroom and the Situation Room.
Courtship
My mom gave me three essential books on the subject of men. The first, when I was 13, was "On Becoming a Woman." The second, when I was 21, was "365 Ways to Cook Hamburger." The third, when I was 25, was "How to Catch and Hold a Man," by Yvonne Antelle. ("Keep thinking of yourself as a soft, mysterious cat. . . . Men are fascinated by bright, shiny objects, by lots of curls, lots of hair on the head . . . by bows, ribbons, ruffles and bright colors. . . . Sarcasm is dangerous. Avoid it altogether.")
Because I received "How to Catch and Hold a Man" at a time when we were entering the Age of Equality, I put it aside as an anachronism. After all, sometime in the 1960's flirting went out of fashion, as did ironing boards, makeup and the idea that men needed to be "trapped" or "landed." The way to approach men, we reasoned, was forthrightly and without games, artifice or frills. Unfortunately, history has shown this to be a misguided notion.
I knew it even before the 1995 publication of "The Rules," a dating bible that encouraged women to return to prefeminist mind games by playing hard to get. ("Don't stay on the phone for more than 10 minutes. . . . Even if you are the head of your own company . . . when you're with a man you like, be quiet and mysterious, act ladylike, cross your legs and smile. . . . Wear black sheer pantyhose and hike up your skirt to entice the opposite sex!")
I knew this before fashion magazines became crowded with crinolines, bows, ruffles, leopard-skin scarves, 50's party dresses and other sartorial equivalents of flirting and with articles like "The Return of Hard to Get." ("I think it behooves us to stop offering each other these pearls of feminism, to stop saying, 'So, why don't you call him?'" a writer lectured in Mademoiselle. "Some men must have the thrill of the chase.")
I knew things were changing because a succession of my single girlfriends had called, sounding sheepish, to ask if they could borrow my out-of-print copy of "How to Catch and Hold a Man."
Decades after the feminist movement promised equality with men, it was becoming increasingly apparent that many women would have to brush up on the venerable tricks of the trade: an absurdly charming little laugh, a pert toss of the head, an air of saucy triumph, dewy eyes and a full knowledge of music, drawing, elegant note writing and geography. It would once more be considered captivating to lie on a chaise longue, pass a lacy handkerchief across the eyelids and complain of a case of springtime giddiness.
Today, women have gone back to hunting their quarry - in person and in cyberspace - with elaborate schemes designed to allow the deluded creatures to think they are the hunters. "Men like hunting, and we shouldn't deprive them of their chance to do their hunting and mating rituals," my 26-year-old friend Julie Bosman, a New York Times reporter, says. "As my mom says, 'Men don't like to be chased.'" Or as the Marvelettes sang, "The hunter gets captured by the game."
These days the key to staying cool in the courtship rituals is B. & I., girls say - Busy and Important. "As much as you're waiting for that little envelope to appear on your screen," says Carrie Foster, a 29-year-old publicist in Washington, "you happen to have a lot of stuff to do anyway." If a guy rejects you or turns out to be the essence of evil, you can ratchet up from B. & I. to C.B.B., Can't Be Bothered. In the T.M.I. - Too Much Information - digital age, there can be infinite technological foreplay.
Helen Fisher, a Rutgers anthropologist, concurs with Julie: "What our grandmothers told us about playing hard to get is true. The whole point of the game is to impress and capture. It's not about honesty. Many men and women, when they're playing the courtship game, deceive so they can win. Novelty, excitement and danger drive up dopamine in the brain. And both sexes brag."
Women might dye their hair, apply makeup and spend hours finding a hip-slimming dress, she said, while men may drive a nice car or wear a fancy suit that makes them seem richer than they are. In this retro world, a woman must play hard to get but stay soft as a kitten. And avoid sarcasm. Altogether.
Money
In those faraway, long-ago days of feminism, there was talk about equal pay for equal work. Now there's talk about "girl money."
A friend of mine in her 30's says it is a term she hears bandied about the New York dating scene. She also notes a shift in the type of gifts given at wedding showers around town, a reversion to 50's-style offerings: soup ladles and those frilly little aprons from Anthropologie and vintage stores are being unwrapped along with see-through nighties and push-up bras.
"What I find most disturbing about the 1950's-ification and retrogression of women's lives is that it has seeped into the corporate and social culture, where it can do real damage," she complains. "Otherwise intelligent men, who know women still earn less than men as a rule, say things like: 'I'll get the check. You only have girl money."'
Throughout the long, dark ages of undisputed patriarchy, women connived to trade beauty and sex for affluence and status. In the first flush of feminism, women offered to pay half the check with "woman money" as a way to show that these crass calculations - that a woman's worth in society was determined by her looks, that she was an ornament up for sale to the highest bidder - no longer applied.
Now dating etiquette has reverted. Young women no longer care about using the check to assert their equality. They care about using it to assess their sexuality. Going Dutch is an archaic feminist relic. Young women talk about it with disbelief and disdain. "It's a scuzzy 70's thing, like platform shoes on men," one told me.
"Feminists in the 70's went overboard," Anne Schroeder, a 26-year-old magazine editor in Washington, agrees. "Paying is like opening a car door. It's nice. I appreciate it. But he doesn't have to."
Unless he wants another date.
Women in their 20's think old-school feminists looked for equality in all the wrong places, that instead of fighting battles about whether women should pay for dinner or wear padded bras they should have focused only on big economic issues.
After Googling and Bikramming to get ready for a first dinner date, a modern girl will end the evening with the Offering, an insincere bid to help pay the check. "They make like they are heading into their bag after a meal, but it is a dodge," Marc Santora, a 30-year-old Metro reporter for The Times, says. "They know you will stop them before a credit card can be drawn. If you don't, they hold it against you."
One of my girlfriends, a TV producer in New York, told me much the same thing: "If you offer, and they accept, then it's over."
Jurassic feminists shudder at the retro implication of a quid profiterole. But it doesn't matter if the woman is making as much money as the man, or more; she expects him to pay, both to prove her desirability and as a way of signaling romance - something that's more confusing in a dating culture rife with casual hookups and group activities. (Once beyond the initial testing phase and settled in a relationship, of course, she can pony up more.)
"There are plenty of ways for me to find out if he's going to see me as an equal without disturbing the dating ritual," one young woman says. "Disturbing the dating ritual leads to chaos. Everybody knows that."
When I asked a young man at my gym how he and his lawyer girlfriend were going to divide the costs on a California vacation, he looked askance. "She never offers," he replied. "And I like paying for her." It is, as one guy said, "one of the few remaining ways we can demonstrate our manhood."
Power Dynamics
At a party for the Broadway opening of "Sweet Smell of Success," a top New York producer gave me a lecture on the price of female success that was anything but sweet. He confessed that he had wanted to ask me out on a date when he was between marriages but nixed the idea because my job as a Times columnist made me too intimidating. Men, he explained, prefer women who seem malleable and awed. He predicted that I would never find a mate because if there's one thing men fear, it's a woman who uses her critical faculties. Will she be critical of absolutely everything, even his manhood?
He had hit on a primal fear of single successful women: that the aroma of male power is an aphrodisiac for women, but the perfume of female power is a turnoff for men. It took women a few decades to realize that everything they were doing to advance themselves in the boardroom could be sabotaging their chances in the bedroom, that evolution was lagging behind equality.
A few years ago at a White House correspondents' dinner, I met a very beautiful and successful actress. Within minutes, she blurted out: "I can't believe I'm 46 and not married. Men only want to marry their personal assistants or P.R. women."
I'd been noticing a trend along these lines, as famous and powerful men took up with young women whose job it was to care for them and nurture them in some way: their secretaries, assistants, nannies, caterers, flight attendants, researchers and fact-checkers.
John Schwartz of The New York Times made the trend official in 2004 when he reported: "Men would rather marry their secretaries than their bosses, and evolution may be to blame." A study by psychology researchers at the University of Michigan, using college undergraduates, suggested that men going for long-term relationships would rather marry women in subordinate jobs than women who are supervisors. Men think that women with important jobs are more likely to cheat on them. There it is, right in the DNA: women get penalized by insecure men for being too independent.
"The hypothesis," Dr. Stephanie Brown, the lead author of the study, theorized, "is that there are evolutionary pressures on males to take steps to minimize the risk of raising offspring that are not their own." Women, by contrast, did not show a marked difference between their attraction to men who might work above them and their attraction to men who might work below them.
So was the feminist movement some sort of cruel hoax? Do women get less desirable as they get more successful?
After I first wrote on this subject, a Times reader named Ray Lewis e-mailed me. While we had assumed that making ourselves more professionally accomplished would make us more fascinating, it turned out, as Lewis put it, that smart women were "draining at times."
Or as Bill Maher more crudely but usefully summed it up to Craig Ferguson on the "Late Late Show" on CBS: "Women get in relationships because they want somebody to talk to. Men want women to shut up."
Women moving up still strive to marry up. Men moving up still tend to marry down. The two sexes' going in opposite directions has led to an epidemic of professional women missing out on husbands and kids.
Sylvia Ann Hewlett, an economist and the author of "Creating a Life: Professional Women and the Quest for Children," a book published in 2002, conducted a survey and found that 55 percent of 35-year-old career women were childless. And among corporate executives who earn $100,000 or more, she said, 49 percent of the women did not have children, compared with only 19 percent of the men.
Hewlett quantified, yet again, that men have an unfair advantage. "Nowadays," she said, "the rule of thumb seems to be that the more successful the woman, the less likely it is she will find a husband or bear a child. For men, the reverse is true."
A 2005 report by researchers at four British universities indicated that a high I.Q. hampers a woman's chance to marry, while it is a plus for men. The prospect for marriage increased by 35 percent for guys for each 16-point increase in I.Q.; for women, there is a 40 percent drop for each 16-point rise.
On a "60 Minutes" report on the Hewlett book, Lesley Stahl talked to two young women who went to Harvard Business School. They agreed that while they were the perfect age to start families, they didn't find it easy to meet the right mates.
Men, apparently, learn early to protect their eggshell egos from high-achieving women. The girls said they hid the fact that they went to Harvard from guys they met because it was the kiss of death. "The H-bomb," they dubbed it. "As soon as you say Harvard Business School . . . that's the end of the conversation," Ani Vartanian said. "As soon as the guys say, 'Oh, I go to Harvard Business School,' all the girls start falling into them."
Hewlett thinks that the 2005 American workplace is more macho than ever. "It's actually much more difficult now than 10 years ago to have a career and raise a family," she told me. "The trend lines continue that highly educated women in many countries are increasingly dealing with this creeping nonchoice and end up on this path of delaying finding a mate and delaying childbearing. Whether you're looking at Italy, Russia or the U.S., all of that is true." Many women continue to fear that the more they accomplish, the more they may have to sacrifice. They worry that men still veer away from "challenging" women because of a male atavistic desire to be the superior force in a relationship.
"With men and women, it's always all about control issues, isn't it?" says a guy I know, talking about his bitter divorce.
Or, as Craig Bierko, a musical comedy star and actor who played one of Carrie's boyfriends on "Sex and the City," told me, "Deep down, beneath the bluster and machismo, men are simply afraid to say that what they're truly looking for in a woman is an intelligent, confident and dependable partner in life whom they can devote themselves to unconditionally until she's 40."
Ms. Versus Mrs.
"Ms." was supposed to neutralize the stature of women, so they weren't publicly defined by their marital status. When The Times finally agreed to switch to Ms. in its news pages in 1986, after much hectoring by feminists, Gloria Steinem sent flowers to the executive editor, Abe Rosenthal. But nowadays most young brides want to take their husbands' names and brag on the moniker Mrs., a brand that proclaims you belong to him. T-shirts with "MRS." emblazoned in sequins or sparkly beads are popular wedding-shower gifts.
A Harvard economics professor, Claudia Goldin, did a study last year that found that 44 percent of women in the Harvard class of 1980 who married within 10 years of graduation kept their birth names, while in the class of '90 it was down to 32 percent. In 1990, 23 percent of college-educated women kept their own names after marriage, while a decade later the number had fallen to 17 percent.
Time magazine reported that an informal poll in the spring of 2005 by the Knot, a wedding Web site, showed similar results: 81 percent of respondents took their spouse's last name, an increase from 71 percent in 2000. The number of women with hyphenated surnames fell from 21 percent to 8 percent.
"It's a return to romance, a desire to make marriage work," Goldin told one interviewer, adding that young women might feel that by keeping their own names they were aligning themselves with tedious old-fashioned feminists, and this might be a turnoff to them.
The professor, who married in 1979 and kept her name, undertook the study after her niece, a lawyer, changed hers. "She felt that her generation of women didn't have to do the same things mine did, because of what we had already achieved," Goldin told Time.
Many women now do not think of domestic life as a "comfortable concentration camp," as Betty Friedan wrote in "The Feminine Mystique," where they are losing their identities and turning into "anonymous biological robots in a docile mass." Now they want to be Mrs. Anonymous Biological Robot in a Docile Mass. They dream of being rescued - to flirt, to shop, to stay home and be taken care of. They shop for "Stepford Fashions" - matching shoes and ladylike bags and the 50's-style satin, lace and chiffon party dresses featured in InStyle layouts - and spend their days at the gym trying for Wisteria Lane waistlines.
The Times recently ran a front-page article about young women attending Ivy League colleges, women who are being groomed to take their places in the professional and political elite, who are planning to reject careers in favor of playing traditional roles, staying home and raising children.
"My mother always told me you can't be the best career woman and the best mother at the same time," the brainy, accomplished Cynthia Liu told Louise Story, explaining why she hoped to be a stay-at-home mom a few years after she goes to law school. "You always have to choose one over the other."
Kate White, the editor of Cosmopolitan, told me that she sees a distinct shift in what her readers want these days. "Women now don't want to be in the grind," she said. "The baby boomers made the grind seem unappealing."
Cynthia Russett, a professor of American history at Yale, told Story that women today are simply more "realistic," having seen the dashed utopia of those who assumed it wouldn't be so hard to combine full-time work and child rearing.
To the extent that young women are rejecting the old idea of copying men and reshaping the world around their desires, it's exhilarating progress. But to the extent that a pampered class of females is walking away from the problem and just planning to marry rich enough to cosset themselves in a narrow world of dependence on men, it's an irritating setback. If the new ethos is "a woman needs a career like a fish needs a bicycle," it won't be healthy.
Movies
In all those Tracy-Hepburn movies more than a half-century ago, it was the snap and crackle of a romance between equals that was so exciting. You still see it onscreen occasionally - the incendiary chemistry of Brad Pitt and Angelina Jolie playing married assassins aiming for mutually assured orgasms and destruction in "Mr. and Mrs. Smith." Interestingly, that movie was described as retro because of its salty battle of wits between two peppery lovers. Moviemakers these days are more interested in exploring what Steve Martin, in his novel "Shopgirl," calls the "calm cushion" of romances between unequals.
In James Brooks's movie "Spanglish," Adam Sandler, playing a sensitive Los Angeles chef, falls for his hot Mexican maid, just as in "Maid in Manhattan," Ralph Fiennes, playing a sensitive New York pol, falls for the hot Latina maid at his hotel, played by Jennifer Lopez. Sandler's maid, who cleans up for him without being able to speak English, is presented as the ideal woman, in looks and character. His wife, played by Téa Leoni, is repellent: a jangly, yakking, overachieving, overexercised, unfaithful, shallow she-monster who has just lost her job with a commercial design firm and fears she has lost her identity.
In 2003, we had "Girl With a Pearl Earring," in which Colin Firth's Vermeer erotically paints Scarlett Johansson's Dutch maid, and Richard Curtis's "Love Actually," about the attraction of unequals. The witty and sophisticated British prime minister, played by Hugh Grant, falls for the chubby girl who wheels the tea and scones into his office. A businessman married to the substantial Emma Thompson, the sister of the prime minister, falls for his sultry secretary. A novelist played by Colin Firth falls for his maid, who speaks only Portuguese.
Art is imitating life, turning women who seek equality into selfish narcissists and objects of rejection rather than of affection.
It's funny. I come from a family of Irish domestics - statuesque, 6-foot-tall women who cooked, kept house and acted as nannies for some of America's first families. I was always so proud of achieving more - succeeding in a high-powered career that would have been closed to my great-aunts. How odd, then, to find out now that being a maid would have enhanced my chances with men.
An upstairs maid, of course.
Women's Magazines
Cosmo is still the best-selling magazine on college campuses, as it was when I was in college, and the best-selling monthly magazine on the newsstand. The June 2005 issue, with Jessica Simpson on the cover, her cleavage spilling out of an orange crocheted halter dress, could have been June 1970. The headlines are familiar: "How to turn him on in 10 words or less," "Do You Make Men M-E-L-T? Take our quiz," "Bridal Special," Cosmo's stud search and "Cosmo's Most Famous Sex Tips; the Legendary Tricks That Have Brought Countless Guys to Their Knees." (Sex Trick 4: "Place a glazed doughnut around your man's member, then gently nibble the pastry and lick the icing . . . as well as his manhood." Another favorite Cosmo trick is to yell out during sex which of your girlfriends thinks your man is hot.)
At any newsstand, you'll see the original Cosmo girl's man-crazy, sex-obsessed image endlessly, tiresomely replicated, even for the teen set. On the cover of Elle Girl: "267 Ways to Look Hot."
"There has been lots of copying - look at Glamour," Helen Gurley Brown, Cosmo's founding editor told me and sighed. "I used to have all the sex to myself."
Before it curdled into a collection of stereotypes, feminism had fleetingly held out a promise that there would be some precincts of womanly life that were not all about men. But it never quite materialized.
It took only a few decades to create a brazen new world where the highest ideal is to acknowledge your inner slut. I am woman; see me strip. Instead of peaceful havens of girl things and boy things, we have a society where women of all ages are striving to become self-actualized sex kittens. Hollywood actresses now work out by taking pole-dancing classes.
Female sexuality has been a confusing corkscrew path, not a serene progressive arc. We had decades of Victorian prudery, when women were not supposed to like sex. Then we had the pill and zipless encounters, when women were supposed to have the same animalistic drive as men. Then it was discovered - shock, horror! - that men and women are not alike in their desires. But zipless morphed into hookups, and the more one-night stands the girls on "Sex and the City" had, the grumpier they got.
Oddly enough, Felix Dennis, who created the top-selling Maxim, said he stole his "us against the world" lad-magazine attitude from women's magazines like Cosmo. Just as women didn't mind losing Cosmo's prestigious fiction as the magazine got raunchier, plenty of guys were happy to lose the literary pretensions of venerable men's magazines and embrace simple-minded gender stereotypes, like the Maxim manifesto instructing women, "If we see you in the morning and night, why call us at work?"
Jessica Simpson and Eva Longoria move seamlessly from showing their curves on the covers of Cosmo and Glamour to Maxim, which dubbed Simpson "America's favorite ball and chain!" In the summer of 2005, both British GQ and FHM featured Pamela Anderson busting out of their covers. ("I think of my breasts as props," she told FHM.)
A lot of women now want to be Maxim babes as much as men want Maxim babes. So women have moved from fighting objectification to seeking it. "I have been surprised," Maxim's editor, Ed Needham, confessed to me, "to find that a lot of women would want to be somehow validated as a Maxim girl type, that they'd like to be thought of as hot and would like their boyfriends to take pictures of them or make comments about them that mirror the Maxim representation of a woman, the Pamela Anderson sort of brand. That, to me, is kind of extraordinary."
The luscious babes on the cover of Maxim were supposed to be men's fantasy guilty pleasures, after all, not their real life-affirming girlfriends.
Beauty
While I never related to the unstyled look of the early feminists and I tangled with boyfriends who did not want me to wear makeup and heels, I always assumed that one positive result of the feminist movement would be a more flexible and capacious notion of female beauty, a release from the tyranny of the girdled, primped ideal of the 50's.
I was wrong. Forty years after the dawn of feminism, the ideal of feminine beauty is more rigid and unnatural than ever.
When Gloria Steinem wrote that "all women are Bunnies," she did not mean it as a compliment; it was a feminist call to arms. Decades later, it's just an aesthetic fact, as more and more women embrace Botox and implants and stretch and protrude to extreme proportions to satisfy male desires. Now that technology is biology, all women can look like inflatable dolls. It's clear that American narcissism has trumped American feminism.
It was naïve and misguided for the early feminists to tendentiously demonize Barbie and Cosmo girl, to disdain such female proclivities as shopping, applying makeup and hunting for sexy shoes and cute boyfriends and to prognosticate a world where men and women dressed alike and worked alike in navy suits and were equal in every way.
But it is equally naïve and misguided for young women now to fritter away all their time shopping for boudoirish clothes and text-messaging about guys while they disdainfully ignore gender politics and the seismic shifts on the Supreme Court that will affect women's rights for a generation.
What I didn't like at the start of the feminist movement was that young women were dressing alike, looking alike and thinking alike. They were supposed to be liberated, but it just seemed like stifling conformity.
What I don't like now is that the young women rejecting the feminist movement are dressing alike, looking alike and thinking alike. The plumage is more colorful, the shapes are more curvy, the look is more plastic, the message is diametrically opposite - before it was don't be a sex object; now it's be a sex object - but the conformity is just as stifling.
And the Future . . .
Having boomeranged once, will women do it again in a couple of decades? If we flash forward to 2030, will we see all those young women who thought trying to Have It All was a pointless slog, now middle-aged and stranded in suburbia, popping Ativan, struggling with rebellious teenagers, deserted by husbands for younger babes, unable to get back into a work force they never tried to be part of?
It's easy to picture a surreally familiar scene when women realize they bought into a raw deal and old trap. With no power or money or independence, they'll be mere domestic robots, lasering their legs and waxing their floors - or vice versa - and desperately seeking a new Betty Friedan.
Maureen Dowd is a columnist for The New York Times. This essay is adapted from "Are Men Necessary?: When Sexes Collide," to be published next month by G.P. Putnam's Sons.
A Harvard economics professor, Claudia Goldin, did a study last year that found that 44 percent of women in the Harvard class of 1980 who married within 10 years of graduation kept their birth names, while in the class of '90 it was down to 32 percent. In 1990, 23 percent of college-educated women kept their own names after marriage, while a decade later the number had fallen to 17 percent.
Time magazine reported that an informal poll in the spring of 2005 by the Knot, a wedding Web site, showed similar results: 81 percent of respondents took their spouse's last name, an increase from 71 percent in 2000. The number of women with hyphenated surnames fell from 21 percent to 8 percent.
"It's a return to romance, a desire to make marriage work," Goldin told one interviewer, adding that young women might feel that by keeping their own names they were aligning themselves with tedious old-fashioned feminists, and this might be a turnoff to them.
The professor, who married in 1979 and kept her name, undertook the study after her niece, a lawyer, changed hers. "She felt that her generation of women didn't have to do the same things mine did, because of what we had already achieved," Goldin told Time.
Many women now do not think of domestic life as a "comfortable concentration camp," as Betty Friedan wrote in "The Feminine Mystique," where they are losing their identities and turning into "anonymous biological robots in a docile mass." Now they want to be Mrs. Anonymous Biological Robot in a Docile Mass. They dream of being rescued - to flirt, to shop, to stay home and be taken care of. They shop for "Stepford Fashions" - matching shoes and ladylike bags and the 50's-style satin, lace and chiffon party dresses featured in InStyle layouts - and spend their days at the gym trying for Wisteria Lane waistlines.
The Times recently ran a front-page article about young women attending Ivy League colleges, women who are being groomed to take their places in the professional and political elite, who are planning to reject careers in favor of playing traditional roles, staying home and raising children.
"My mother always told me you can't be the best career woman and the best mother at the same time," the brainy, accomplished Cynthia Liu told Louise Story, explaining why she hoped to be a stay-at-home mom a few years after she goes to law school. "You always have to choose one over the other."
Kate White, the editor of Cosmopolitan, told me that she sees a distinct shift in what her readers want these days. "Women now don't want to be in the grind," she said. "The baby boomers made the grind seem unappealing."
Cynthia Russett, a professor of American history at Yale, told Story that women today are simply more "realistic," having seen the dashed utopia of those who assumed it wouldn't be so hard to combine full-time work and child rearing.
To the extent that young women are rejecting the old idea of copying men and reshaping the world around their desires, it's exhilarating progress. But to the extent that a pampered class of females is walking away from the problem and just planning to marry rich enough to cosset themselves in a narrow world of dependence on men, it's an irritating setback. If the new ethos is "a woman needs a career like a fish needs a bicycle," it won't be healthy.
Movies
In all those Tracy-Hepburn movies more than a half-century ago, it was the snap and crackle of a romance between equals that was so exciting. You still see it onscreen occasionally - the incendiary chemistry of Brad Pitt and Angelina Jolie playing married assassins aiming for mutually assured orgasms and destruction in "Mr. and Mrs. Smith." Interestingly, that movie was described as retro because of its salty battle of wits between two peppery lovers. Moviemakers these days are more interested in exploring what Steve Martin, in his novel "Shopgirl," calls the "calm cushion" of romances between unequals.
In James Brooks's movie "Spanglish," Adam Sandler, playing a sensitive Los Angeles chef, falls for his hot Mexican maid, just as in "Maid in Manhattan," Ralph Fiennes, playing a sensitive New York pol, falls for the hot Latina maid at his hotel, played by Jennifer Lopez. Sandler's maid, who cleans up for him without being able to speak English, is presented as the ideal woman, in looks and character. His wife, played by Téa Leoni, is repellent: a jangly, yakking, overachieving, overexercised, unfaithful, shallow she-monster who has just lost her job with a commercial design firm and fears she has lost her identity.
In 2003, we had "Girl With a Pearl Earring," in which Colin Firth's Vermeer erotically paints Scarlett Johansson's Dutch maid, and Richard Curtis's "Love Actually," about the attraction of unequals. The witty and sophisticated British prime minister, played by Hugh Grant, falls for the chubby girl who wheels the tea and scones into his office. A businessman married to the substantial Emma Thompson, the sister of the prime minister, falls for his sultry secretary. A novelist played by Colin Firth falls for his maid, who speaks only Portuguese.
Art is imitating life, turning women who seek equality into selfish narcissists and objects of rejection rather than of affection.
It's funny. I come from a family of Irish domestics - statuesque, 6-foot-tall women who cooked, kept house and acted as nannies for some of America's first families. I was always so proud of achieving more - succeeding in a high-powered career that would have been closed to my great-aunts. How odd, then, to find out now that being a maid would have enhanced my chances with men.
An upstairs maid, of course.
Women's Magazines
Cosmo is still the best-selling magazine on college campuses, as it was when I was in college, and the best-selling monthly magazine on the newsstand. The June 2005 issue, with Jessica Simpson on the cover, her cleavage spilling out of an orange crocheted halter dress, could have been June 1970. The headlines are familiar: "How to turn him on in 10 words or less," "Do You Make Men M-E-L-T? Take our quiz," "Bridal Special," Cosmo's stud search and "Cosmo's Most Famous Sex Tips; the Legendary Tricks That Have Brought Countless Guys to Their Knees." (Sex Trick 4: "Place a glazed doughnut around your man's member, then gently nibble the pastry and lick the icing . . . as well as his manhood." Another favorite Cosmo trick is to yell out during sex which of your girlfriends thinks your man is hot.)
At any newsstand, you'll see the original Cosmo girl's man-crazy, sex-obsessed image endlessly, tiresomely replicated, even for the teen set. On the cover of Elle Girl: "267 Ways to Look Hot."
"There has been lots of copying - look at Glamour," Helen Gurley Brown, Cosmo's founding editor told me and sighed. "I used to have all the sex to myself."
Before it curdled into a collection of stereotypes, feminism had fleetingly held out a promise that there would be some precincts of womanly life that were not all about men. But it never quite materialized.
It took only a few decades to create a brazen new world where the highest ideal is to acknowledge your inner slut. I am woman; see me strip. Instead of peaceful havens of girl things and boy things, we have a society where women of all ages are striving to become self-actualized sex kittens. Hollywood actresses now work out by taking pole-dancing classes.
Female sexuality has been a confusing corkscrew path, not a serene progressive arc. We had decades of Victorian prudery, when women were not supposed to like sex. Then we had the pill and zipless encounters, when women were supposed to have the same animalistic drive as men. Then it was discovered - shock, horror! - that men and women are not alike in their desires. But zipless morphed into hookups, and the more one-night stands the girls on "Sex and the City" had, the grumpier they got.
Oddly enough, Felix Dennis, who created the top-selling Maxim, said he stole his "us against the world" lad-magazine attitude from women's magazines like Cosmo. Just as women didn't mind losing Cosmo's prestigious fiction as the magazine got raunchier, plenty of guys were happy to lose the literary pretensions of venerable men's magazines and embrace simple-minded gender stereotypes, like the Maxim manifesto instructing women, "If we see you in the morning and night, why call us at work?"
Jessica Simpson and Eva Longoria move seamlessly from showing their curves on the covers of Cosmo and Glamour to Maxim, which dubbed Simpson "America's favorite ball and chain!" In the summer of 2005, both British GQ and FHM featured Pamela Anderson busting out of their covers. ("I think of my breasts as props," she told FHM.)
A lot of women now want to be Maxim babes as much as men want Maxim babes. So women have moved from fighting objectification to seeking it. "I have been surprised," Maxim's editor, Ed Needham, confessed to me, "to find that a lot of women would want to be somehow validated as a Maxim girl type, that they'd like to be thought of as hot and would like their boyfriends to take pictures of them or make comments about them that mirror the Maxim representation of a woman, the Pamela Anderson sort of brand. That, to me, is kind of extraordinary."
The luscious babes on the cover of Maxim were supposed to be men's fantasy guilty pleasures, after all, not their real life-affirming girlfriends.
Beauty
While I never related to the unstyled look of the early feminists and I tangled with boyfriends who did not want me to wear makeup and heels, I always assumed that one positive result of the feminist movement would be a more flexible and capacious notion of female beauty, a release from the tyranny of the girdled, primped ideal of the 50's.
I was wrong. Forty years after the dawn of feminism, the ideal of feminine beauty is more rigid and unnatural than ever.
When Gloria Steinem wrote that "all women are Bunnies," she did not mean it as a compliment; it was a feminist call to arms. Decades later, it's just an aesthetic fact, as more and more women embrace Botox and implants and stretch and protrude to extreme proportions to satisfy male desires. Now that technology is biology, all women can look like inflatable dolls. It's clear that American narcissism has trumped American feminism.
It was naïve and misguided for the early feminists to tendentiously demonize Barbie and Cosmo girl, to disdain such female proclivities as shopping, applying makeup and hunting for sexy shoes and cute boyfriends and to prognosticate a world where men and women dressed alike and worked alike in navy suits and were equal in every way.
But it is equally naïve and misguided for young women now to fritter away all their time shopping for boudoirish clothes and text-messaging about guys while they disdainfully ignore gender politics and the seismic shifts on the Supreme Court that will affect women's rights for a generation.
What I didn't like at the start of the feminist movement was that young women were dressing alike, looking alike and thinking alike. They were supposed to be liberated, but it just seemed like stifling conformity.
What I don't like now is that the young women rejecting the feminist movement are dressing alike, looking alike and thinking alike. The plumage is more colorful, the shapes are more curvy, the look is more plastic, the message is diametrically opposite - before it was don't be a sex object; now it's be a sex object - but the conformity is just as stifling.
And the Future . . .
Having boomeranged once, will women do it again in a couple of decades? If we flash forward to 2030, will we see all those young women who thought trying to Have It All was a pointless slog, now middle-aged and stranded in suburbia, popping Ativan, struggling with rebellious teenagers, deserted by husbands for younger babes, unable to get back into a work force they never tried to be part of?
It's easy to picture a surreally familiar scene when women realize they bought into a raw deal and old trap. With no power or money or independence, they'll be mere domestic robots, lasering their legs and waxing their floors - or vice versa - and desperately seeking a new Betty Friedan.
Maureen Dowd is a columnist for The New York Times. This essay is adapted from "Are Men Necessary?: When Sexes Collide," to be published next month by G.P. Putnam's Sons.
Ending the Fraudulence
By PAUL KRUGMAN
Let me be frank: it has been a long political nightmare. For some of us, daily life has remained safe and comfortable, so the nightmare has merely been intellectual: we realized early on that this administration was cynical, dishonest and incompetent, but spent a long time unable to get others to see the obvious. For others - above all, of course, those Americans risking their lives in a war whose real rationale has never been explained - the nightmare has been all too concrete.
So is the nightmare finally coming to an end? Yes, I think so. I have no idea whether Patrick Fitzgerald, the special prosecutor, will bring more indictments in the Plame affair. In any case, I don't share fantasies that Dick Cheney will be forced to resign; even Karl Rove may keep his post. One way or another, the Bush administration will stagger on for three more years. But its essential fraudulence stands exposed, and it's hard to see how that exposure can be undone.
What do I mean by essential fraudulence? Basically, I mean the way an administration with an almost unbroken record of policy failure has nonetheless achieved political dominance through a carefully cultivated set of myths.
The record of policy failure is truly remarkable. It sometimes seems as if President Bush and Mr. Cheney are Midases in reverse: everything they touch - from Iraq reconstruction to hurricane relief, from prescription drug coverage to the pursuit of Osama - turns to crud. Even the few apparent successes turn out to contain failures at their core: for example, real G.D.P. may be up, but real wages are down.
The point is that this administration's political triumphs have never been based on its real-world achievements, which are few and far between. The administration has, instead, built its power on myths: the myth of presidential leadership, the ugly myth that the administration is patriotic while its critics are not. Take away those myths, and the administration has nothing left.
Well, Katrina ended the leadership myth, which was already fading as the war dragged on. There was a time when a photo of Mr. Bush looking out the window of Air Force One on 9/11 became an iconic image of leadership. Now, a similar image of Mr. Bush looking out at a flooded New Orleans has become an iconic image of his lack of connection. Pundits may try to resurrect Mr. Bush's reputation, but his cult of personality is dead - and the inscription on the tombstone reads, "Brownie, you're doing a heck of a job."
Meanwhile, the Plame inquiry, however it winds up, has ended the myth of the administration's monopoly on patriotism, which was also fading in the face of the war.
Apologists can shout all they like that no laws were broken, that hardball politics is nothing new, or whatever. The fact remains that officials close to both Mr. Cheney and Mr. Bush leaked the identity of an undercover operative for political reasons. Whether or not that act was illegal, it was clearly unpatriotic.
And the Plame affair has also solidified the public's growing doubts about the administration's morals. By a three-to-one margin, according to a Washington Post poll, the public now believes that the level of ethics and honesty in the government has declined rather than risen under Mr. Bush.
So the Bush administration has lost the myths that sustained its mojo, and with them much of its power to do harm. But the nightmare won't be fully over until two things happen.
First, politicians will have to admit that they were misled. Second, the news media will have to face up to their role in allowing incompetents to pose as leaders and political apparatchiks to pose as patriots.
It's a sad commentary on the timidity of most Democrats that even now, with Lawrence Wilkerson, Colin Powell's former chief of staff, telling us how policy was "hijacked" by the Cheney-Rumsfeld "cabal," it's hard to get leading figures to admit that they were misled into supporting the Iraq war. Kudos to John Kerry for finally saying just that last week.
And as for the media: these days, there is much harsh, justified criticism of the failure of major news organizations, this one included, to exert due diligence on rationales for the war. But the failures that made the long nightmare possible began much earlier, during the weeks after 9/11, when the media eagerly helped our political leaders build up a completely false picture of who they were.
So the long nightmare won't really be over until journalists ask themselves: what did we know, when did we know it, and why didn't we tell the public?
Thursday, October 27
Guantanamo Case
For weeks, the justices have been avoiding one of the term's most far-reaching and explosive cases, Hamdan v. Rumsfeld. The case seemed made for review by the court. In two contentious opinions that came down in June 2004, the justices left open crucial questions about the scope of the rights of the foreign suspects whom the Bush administration is holding at Guantanamo Bay. Hamdan is an obvious vehicle for beginning to provide answers. The lower-court opinion in the case, by a panel of three judges on the D.C. Circuit in July, was breathtakingly broad. It allowed the administration to try Salim Ahmed Hamdan, the former bodyguard and driver of Osama Bin Laden, before a special military commission for crimes including murder and terrorism. Because it sets itself no limits, the opinion in theory would also allow the president to set up the same sort of commission—one that doesn't provide for basic rights afforded both in civilian court and in a military court martial—for any offense committed by any offender anywhere, including by an American on American soil. "No decision, by any court, in the wake of the September 11, 2001 attacks has gone this far," Hamdan's lawyers argue. They're right.
Yet twice in the last few weeks the Supreme Court has considered whether to hear Hamdan this winter or spring, and twice the justices have declined to say they will do so. Tomorrow, they may discuss the case for a third time. Four hundred and fifty law professors issued a statement on Wednesday urging the court to grant review. They think the military commission set up to try Hamdan should be ruled out of bounds for three reasons. First, the commission violates traditional separation-of-powers principles—the president created it, defined who and what offenses it may try, set all its rules, and controls the appointment of its members. One branch of government isn't supposed to act both as prosecutor and judge. Second, the commission is out of step with constitutional and international standards of due process. Its rules allow for unsworn statements as testimony and for evidence that may have been gathered using coercive tactics that amount to torture. The presumption of innocence can be dispensed with at any time. Hamdan also has no right to be present at his trial.
Perhaps most significant, in approving the commission, the D.C. Circuit appears to have stripped the basic protections of the Geneva Convention from all the Guantanamo detainees. In other words, no Geneva for the fight between the United States and a terrorist group like al-Qaida. This is a long-sought goal of the Bush administration—it's the position taken by Attorney General Alberto Gonzales and his Justice Department that led to the 2002 torture memo.
The D.C. Circuit threw Geneva overboard with a reading of the convention's Common Article Three that is plausible but unconvincing, as Georgetown law professor David Luban explains in this helpful post. Common Article Three protects detainees from being sentenced or punished without minimal rights and protections. It prohibits torture and "humiliating and degrading treatment." Its text states that it applies to armed conflicts that are "not of an international character." The question is what that phrase means. Two judges on the D.C. Circuit panel—one of whom was John Roberts, before he became the Supreme Court's chief justice—read "not … international" to mean "internal," as in a civil war. Wars between states are covered elsewhere in Geneva, so this cramped reading of Common Article Three doesn't remove the significant protections for prisoners of war and civilians in conflicts that pit one state against another. But detainees captured in conflicts that are neither civil nor state-against-state are out of luck. The third judge on the panel, Stephen Williams, argued that "not international" really means "not between nation states"—in which case Common Article Three would apply to the United States' fight against al-Qaida. Luban points out that Williams' reading is standard among international lawyers. If the D.C. Circuit majority's contrary interpretation is left to stand nonetheless, then Common Article Three is no longer common—it doesn't apply to everyone anymore.
Monday, October 24
The pseudo-meritocracy of the Ivy League
By James Traub
Posted Monday, Oct. 24, 2005, at 2:35 AM PT
Thanks to Jerome Karabel, author of The Chosen, I know now a great deal more about the circumstances surrounding my admission to Harvard in 1972 than I ever wanted to know. I understood even then that my unimpressive academic record would not have put me over the top had my father not attended Harvard. But I now know that in the late 1960s and early 1970s, supposedly a time when the admissions process had at last been freed of archaic bias, "legacies" were two-and-a-half to three times likelier to be admitted than was the average applicant; that admitted legacies ranked lower than average admits on everything Harvard cared about—personal attributes, extracurricular activities, academic achievement, recommendations, and so forth; and that the degree of preference granted legacies was only slightly less than that given to black candidates, who in turn received less of a thumb on the scale than did athletes. I was, in short, an affirmative-action baby.
Well, who among us isn't? Karabel notes that even today 40 percent of Princeton's freshman class consists of legacies, athletes, and under-represented minorities, the three chief beneficiaries of admissions preference. But Karabel's larger aim in this epically scaled and scrupulously rendered history of the admissions systems at Harvard, Yale, and Princeton is to call into question our confident use of words like "preference." Along with works like The Big Test, by Nicholas Lemann, and The Shape of the River, by William Bowen and Derek Bok, The Chosen constitutes a second-generation defense of affirmative action, undermining the pat narrative of critics who imagine that our great universities operated according to a consensual, unarguable definition of "merit" until racial blackmail forced them to betray their principles.
There was never any doubt in my mind as to what Harvard was selecting for in 1972— intellectual brilliance. I knew that somewhere swam shoals of crew jocks and legacies far more square-jawed than I, but my world was IQ-denominated. My dorm consisted largely of ill-bred physics geniuses, Unabombers in the making. I had one friend who could talk to the kid who could, in turn, talk to the kid who as a freshman was said to have corrected a computing error by Harvard's great mathematician Jean-Pierre Serre. Of such stuff were our legends made. But of such stuff, also, are tacit worldviews made. It took me years to figure out that life was not IQ-denominated, and that while academic intelligence was significantly correlated with success, the world defined "merit" far more variously than my little corner of Harvard had.
The task Karabel sets himself in The Chosen is to trace the evolution of tacit worldviews, each appearing fixed and immutable to its advocates, that over the last century determined who would and would not have access to America's finest universities. It turns out, ironically enough from the point of view of my family trajectory, that the admissions systems at the Big Three were built expressly to keep out people like my father—smart, driven Jewish kids from gigantic New York City public high schools. Until 1920 or so, anyone could gain admission to Harvard, Yale, or Princeton by passing a battery of subject-matter exams; the lunkheads from Andover who couldn't parse a literary paragraph could be admitted with "conditions." Of course this meant the student body was heavily salted with "the stupid sons of rich men," in the memorably pithy phrase of Charles Eliot, Harvard's great Victorian-era president. But for the Harvard man, or, even more, for that paragon known as "the Yale man," intellectual brilliance was a deeply suspect attribute, like speaking French too well. These young men had been bred for "character" and "manliness"—that ineffable mix of deeply heritable qualities prized by the WASP establishment, a mix that worthies like Endicott Peabody, the founder of Groton, the greatest of the "feeder schools," believed could best be demonstrated on the football field. They would have considered my dorm companions less than human, not more.
And then along came the Jews—lots and lots of Jews. By 1920, the Big Three presidents were looking on in horror as Columbia, the Ivy League school situated in the midst of the melting pot, became 40 percent Jewish. These men shared the anti-Semitism almost universal in their class, but because they saw themselves as custodians of ancient and indispensable institutions, they did not simply dislike these uncouth scholars; they felt a deep professional obligation to keep their numbers to a manageable minimum. Karabel unearthed a letter from Harvard president Lawrence Lowell that delineates the issue with admirable, if stomach-turning, clarity: "The summer hotel that is ruined by admitting Jews meets its fate, not because the Jews it admits are of bad character, but because they drive away the Gentiles, and then after the Gentiles have left, they leave also." The problem, in other words, was WASP flight.
The answer was selective admissions. In 1922, Lowell was reckless enough to think that he could solve "the Jew problem," as he was wont to call it, with a straightforward quota. This provoked a mighty uproar among faculty members and outsiders with more tender consciences; instead, Lowell agreed to limit the size of the entering class and to institute recommendation letters and personal interviews. Yale and Princeton followed suit; and soon came the whole panoply familiar to this day: lengthy applications, personal essays, descriptions of extracurricular activities. This cumbersome and expensive process served two central functions. It allowed the universities to select for an attribute the disfavored class was thought to lack—i.e., "character"—and it shrouded the admissions process in impenetrable layers of subjectivity and opacity, thus rendering it effectively impervious to criticism. The swift drop in admission of Jews could thus be explained as the byproduct of the application of neutral principles—just as could the increase of minority students, 60 years later, in institutions seeking greater "diversity."
The willingness of these universities to suffer real harms rather than admit more Jews is astonishing. Having long distinguished itself as a "national" and "democratic" institution, Yale by 1930 had become more insular, more parochial, and less intellectual as a consequence of the new admissions system. During World War II, with the size of the entering class seriously depleted, Yale turned away qualified Jewish students rather than increase the proportion of Jews. "Yale judged its symbolic capital to be even more precious than its academic capital," as Karabel dryly puts it. Or, to put it in more contemporary terms, Yale understood the imperative to protect its brand.
We have grown accustomed to the idea that the academic, test-driven meritocracy began to replace the old, ascriptive order in the 1940s. This is the central theme of Lemann's The Big Test. But Karabel demonstrates that the old order had a lot more staying power than is commonly thought. James Bryant Conant, Harvard's midcentury president and an outspoken foe of inherited privilege, is widely credited with democratizing Harvard's student body. But it turns out that Jews had only slightly better chances of admission under Conant, and the lunkheads of "St. Grottlesex," as the feeder prep schools were collectively known, only slightly worse, than they had in the Lowell era. This was true not so much because Conant shared Lowell's prejudices as because he operated under his constraints: Harvard needed "paying customers," and it needed to preserve an environment that would keep those Brahmin scions happy. But it is also true that great WASP patriarchs like Whitney Griswold, Yale's president in the '50s, shared the tribal prejudice against "beetle-browed" intellectuals.
The idea of merit-as-brains is really a product of the 1960s. Karabel attributes this in part to the growing power of the professoriat, whose deepest loyalties were to knowledge rather than to the institutions with which they were affiliated. Changes in the economy and Cold War competition also turned brain-power into a precious resource, thus changing the social definition of merit. And the egalitarianism of the 1960s, along with the enfeeblement of the WASP elite, made the old association of character with "breeding"—indeed, the very idea of character as a fixable commodity—seem ludicrous. As blacks, Jews, and women clambered over the ramparts, the one interest group that clung to the ancient ideals—the alumni—took up arms in defense of the walled ethnic garden of yesteryear. They were fossils, of course; but many of them were rich fossils. Karabel quotes the humiliating 1973 recantation of Yale president Kingman Brewster after many an Old Eli had committed rebellion-by-checkbook: "If Yale is going to expect her alumni to care about Yale, then she must convince her alumni that Yale cares about them." And that helps explain why you-know-who was able to enroll you-know-where.
By the time the reader arrives at Page 374 of The Chosen, where the book's affirmative action exegesis begins, he is fully persuaded of the folly of objectifying "merit" or "preference," of piercing the veil of opacity, or in any case of preventing the great private universities from doing anything they deem in their self-interest. Are the same smokescreens that were once used to exclude the underprivileged now to be used to include them? Let it be. Karabel, whose role in redesigning Berkeley's admissions policy in the late '80s in order to pass constitutional muster is described in The Big Test, and who remains one of the most thoughtful advocates of affirmative action, candidly concedes that the Big Three ramped up the admission of black students almost overnight owing not to some midnight conversion but to terror at the rising tide of black anger and violence—owing, that is, to racial blackmail. And because the elite universities began admitting large numbers of black students with sub-par academic records at precisely the moment they were becoming more "meritocratic"—i.e., more academically selective—affirmative action felt more like a violation of meritocratic principle than a recalibration of it. This painful fact continues to haunt affirmative action and is why even some advocates, like the Harvard sociologist Orlando Patterson, have called for such programs to be phased out over time. But this is unlikely ever to happen, because universities now define "diversity" as a central virtue.
Karabel's ultimate goal in deconstructing merit is not, however, to vindicate affirmative action but to expose the hollowness of the central American myth of equal opportunity. The selection process at elite universities is widely understood as the outward symbol, and in many ways the foundation, of our society's distribution of opportunities and rewards. It thus "legitimates the established order as one that rewards ability and hard work over the prerogatives of birth." But the truth, Karabel argues, is very nearly the opposite: Social mobility is diminishing, privilege is increasingly reproducing itself, and the system of higher education has become the chief means whereby well-situated parents pass on the "cultural capital" indispensable to success. "Merit" is always a political tool, always "bears the imprint of the distribution of power in the larger society." When merit was defined according to character attributes associated with the upper class, that imprint was plain for all to see, and to attack, but now that elite universities reward academic skills theoretically attainable by all, but in practice concentrated among the children of the well-to-do and the well-educated, the mark of power is, like the admissions process itself, "veiled." And it is precisely this appearance of equal opportunity that makes current-day admissions systems so effective a legitimating device.
What, then, to do? Karabel proposes that colleges extend affirmative action from race to class, as some have tentatively begun to do, and end preferences for legacies and athletes. I am on record elsewhere as having renounced the legacy privilege on behalf of my son—not that I asked him at the time—but Karabel's own narrative has persuaded me that the elite universities are unlikely to end affirmative action for the overprivileged. If anything, The Chosen demonstrates the danger of imagining great universities as miniature replicas of the social order, and their admissions policies as simulacra of the national reward system. Yes, Harvard, Yale, and Princeton are plainly open to, and in many ways driven by, our animating national ideals; but Karabel shows us that their admissions choices are profoundly shaped by cultural, political, and economic considerations that can not be wished away. If we care about equality of opportunity, perhaps we would do better to focus our attention on the public schools, on the tax system, on such social goods as housing and health care. I don't think we can prevent meritocratic privilege from reproducing itself; we can, however, increase the supply of meritocrats.
James Traub is at work on a book about Kofi Annan and the United Nations.
Posted Monday, Oct. 24, 2005, at 2:35 AM PT
Thanks to Jerome Karabel, author of The Chosen, I know now a great deal more about the circumstances surrounding my admission to Harvard in 1972 than I ever wanted to know. I understood even then that my unimpressive academic record would not have put me over the top had my father not attended Harvard. But I now know that in the late 1960s and early 1970s, supposedly a time when the admissions process had at last been freed of archaic bias, "legacies" were two-and-a-half to three times likelier to be admitted than was the average applicant; that admitted legacies ranked lower than average admits on everything Harvard cared about—personal attributes, extracurricular activities, academic achievement, recommendations, and so forth; and that the degree of preference granted legacies was only slightly less than that given to black candidates, who in turn received less of a thumb on the scale than did athletes. I was, in short, an affirmative-action baby.
Well, who among us isn't? Karabel notes that even today 40 percent of Princeton's freshman class consists of legacies, athletes, and under-represented minorities, the three chief beneficiaries of admissions preference. But Karabel's larger aim in this epically scaled and scrupulously rendered history of the admissions systems at Harvard, Yale, and Princeton is to call into question our confident use of words like "preference." Along with works like The Big Test, by Nicholas Lemann, and The Shape of the River, by William Bowen and Derek Bok, The Chosen constitutes a second-generation defense of affirmative action, undermining the pat narrative of critics who imagine that our great universities operated according to a consensual, unarguable definition of "merit" until racial blackmail forced them to betray their principles.
There was never any doubt in my mind as to what Harvard was selecting for in 1972— intellectual brilliance. I knew that somewhere swam shoals of crew jocks and legacies far more square-jawed than I, but my world was IQ-denominated. My dorm consisted largely of ill-bred physics geniuses, Unabombers in the making. I had one friend who could talk to the kid who could, in turn, talk to the kid who as a freshman was said to have corrected a computing error by Harvard's great mathematician Jean-Pierre Serre. Of such stuff were our legends made. But of such stuff, also, are tacit worldviews made. It took me years to figure out that life was not IQ-denominated, and that while academic intelligence was significantly correlated with success, the world defined "merit" far more variously than my little corner of Harvard had.
The task Karabel sets himself in The Chosen is to trace the evolution of tacit worldviews, each appearing fixed and immutable to its advocates, that over the last century determined who would and would not have access to America's finest universities. It turns out, ironically enough from the point of view of my family trajectory, that the admissions systems at the Big Three were built expressly to keep out people like my father—smart, driven Jewish kids from gigantic New York City public high schools. Until 1920 or so, anyone could gain admission to Harvard, Yale, or Princeton by passing a battery of subject-matter exams; the lunkheads from Andover who couldn't parse a literary paragraph could be admitted with "conditions." Of course this meant the student body was heavily salted with "the stupid sons of rich men," in the memorably pithy phrase of Charles Eliot, Harvard's great Victorian-era president. But for the Harvard man, or, even more, for that paragon known as "the Yale man," intellectual brilliance was a deeply suspect attribute, like speaking French too well. These young men had been bred for "character" and "manliness"—that ineffable mix of deeply heritable qualities prized by the WASP establishment, a mix that worthies like Endicott Peabody, the founder of Groton, the greatest of the "feeder schools," believed could best be demonstrated on the football field. They would have considered my dorm companions less than human, not more.
And then along came the Jews—lots and lots of Jews. By 1920, the Big Three presidents were looking on in horror as Columbia, the Ivy League school situated in the midst of the melting pot, became 40 percent Jewish. These men shared the anti-Semitism almost universal in their class, but because they saw themselves as custodians of ancient and indispensable institutions, they did not simply dislike these uncouth scholars; they felt a deep professional obligation to keep their numbers to a manageable minimum. Karabel unearthed a letter from Harvard president Lawrence Lowell that delineates the issue with admirable, if stomach-turning, clarity: "The summer hotel that is ruined by admitting Jews meets its fate, not because the Jews it admits are of bad character, but because they drive away the Gentiles, and then after the Gentiles have left, they leave also." The problem, in other words, was WASP flight.
The answer was selective admissions. In 1922, Lowell was reckless enough to think that he could solve "the Jew problem," as he was wont to call it, with a straightforward quota. This provoked a mighty uproar among faculty members and outsiders with more tender consciences; instead, Lowell agreed to limit the size of the entering class and to institute recommendation letters and personal interviews. Yale and Princeton followed suit; and soon came the whole panoply familiar to this day: lengthy applications, personal essays, descriptions of extracurricular activities. This cumbersome and expensive process served two central functions. It allowed the universities to select for an attribute the disfavored class was thought to lack—i.e., "character"—and it shrouded the admissions process in impenetrable layers of subjectivity and opacity, thus rendering it effectively impervious to criticism. The swift drop in admission of Jews could thus be explained as the byproduct of the application of neutral principles—just as could the increase of minority students, 60 years later, in institutions seeking greater "diversity."
The willingness of these universities to suffer real harms rather than admit more Jews is astonishing. Having long distinguished itself as a "national" and "democratic" institution, Yale by 1930 had become more insular, more parochial, and less intellectual as a consequence of the new admissions system. During World War II, with the size of the entering class size seriously depleted, Yale turned away qualified Jewish students rather than increase the proportion of Jews. "Yale judged its symbolic capital to be even more precious than its academic capital," as Karabel dryly puts it. Or, to put it more contemporary terms, Yale understood the imperative to protect its brand.
We have grown accustomed to the idea that the academic, test-driven meritocracy began to replace the old, ascriptive order in the 1940s. This is the central theme of Lemann's The Big Test. But Karabel demonstrates that the old order had a lot more staying power than is commonly thought. James Bryant Conant, Harvard's midcentury president and an outspoken foe of inherited privilege, is widely credited with democratizing Harvard's student body. But it turns out that Jews had only slightly better chances of admission under Conant, and the lunkheads of "St. Grottlesex," as the feeder prep schools were collectively known, only slightly worse, than they had in the Lowell era. This was true not so much because Conant shared Lowell's prejudices as because he operated under his constraints: Harvard needed "paying customers," and it needed to preserve an environment that would keep those Brahmin scions happy. But it is also true that great WASP patriarchs like Whitney Griswold, Yale's president in the '50s, shared the tribal prejudice against "beetle-browed" intellectuals.
The idea of merit-as-brains is really a product of the 1960s. Karabel attributes this in part to the growing power of the professoriat, whose deepest loyalties were to knowledge rather than to the institutions with which they were affiliated. Changes in the economy and Cold War competition also turned brain-power into a precious resource, thus changing the social definition of merit. And the egalitarianism of the 1960s, along with the enfeeblement of the WASP elite, made the old association of character with "breeding"—indeed, the very idea of character as a fixable commodity—seem ludicrous. As blacks, Jews, and women clambered over the ramparts, the one interest group that clung to the ancient ideals—the alumni—took up arms in defense of the walled ethnic garden of yesteryear. They were fossils, of course; but many of them were rich fossils. Karabel quotes the humiliating 1973 recantation of Yale president Kingman Brewster after many an Old Eli had committed rebellion-by-checkbook: "If Yale is going to expect her alumni to care about Yale, then she must convince her alumni that Yale cares about them." And that helps explain why you-know-who was able to enroll you-know-where.
By the time the reader arrives at Page 374 of The Chosen, where the book's affirmative action exegesis begins, he is fully persuaded of the folly of objectifying "merit" or "preference," of piercing the veil of opacity, or in any case of preventing the great private universities from doing anything they deem in their self-interest. Are the same smokescreens that were once used to exclude the underprivileged now to be used to include them? Let it be. Karabel, whose role in redesigning Berkeley's admissions policy in the late '80s in order to pass constitutional muster is described in The Big Test, and who remains one of the most thoughtful advocates of affirmative action, candidly concedes that the Big Three ramped up the admission of black students almost overnight owing not to some midnight conversion but to terror at the rising tide of black anger and violence—owing, that is, to racial blackmail. And because the elite universities began admitting large numbers of black students with sub-par academic records at precisely the moment they were becoming more "meritocratic"—i.e, more academically selective—affirmative action felt more like a violation of meritocratic principle than a recalibration of it. This painful fact continues to haunt affirmative action and is why even some advocates, like the Harvard sociologist Orlando Patterson, have called for such programs to be phased out over time. But this is unlikely ever to happen, because universities now define "diversity" as a central virtue.
Karabel's ultimate goal in deconstructing merit is not, however, to vindicate affirmative action but to expose the hollowness of the central American myth of equal opportunity. The selection process at elite universities is widely understood as the outward symbol, and in many ways the foundation, of our society's distribution of opportunities and rewards. It thus "legitimates the established order as one that rewards ability and hard work over the prerogatives of birth." But the truth, Karabel argues, is very nearly the opposite: Social mobility is diminishing, privilege is increasingly reproducing itself, and the system of higher education has become the chief means whereby well-situated parents pass on the "cultural capital" indispensable to success. "Merit" is always a political tool, always "bears the imprint of the distribution of power in the larger society." When merit was defined according to character attributes associated with the upper class, that imprint was plain for all to see, and to attack, but now that elite universities reward academic skills theoretically attainable by all, but in practice concentrated among the children of the well-to-do and the well-educated, the mark of power is, like the admissions process itself, "veiled." And it is precisely this appearance of equal opportunity that makes current-day admissions systems so effective a legitimating device.
What, then, to do? Karabel proposes that colleges extend affirmative action from race to class, as some have tentatively begun to do, and end preferences for legacies and athletes. I am on record elsewhere as having renounced the legacy privilege on behalf of my son—not that I asked him at the time—but Karabel's own narrative has persuaded me that the elite universities are unlikely to end affirmative action for the overprivileged. If anything, The Chosen demonstrates the danger of imagining great universities as miniature replicas of the social order, and their admissions policies as simulacra of the national reward system. Yes, Harvard, Yale, and Princeton are plainly open to, and in many ways driven by, our animating national ideals; but Karabel shows us that their admissions choices are profoundly shaped by cultural, political, and economic considerations that cannot be wished away. If we care about equality of opportunity, perhaps we would do better to focus our attention on the public schools, on the tax system, on such social goods as housing and health care. I don't think we can prevent meritocratic privilege from reproducing itself; we can, however, increase the supply of meritocrats.
James Traub is at work on a book about Kofi Annan and the United Nations.
Sunday, October 23
The Arts Administration
By JENNIFER STEINHAUER, NY TIMES
MAYOR Michael R. Bloomberg is not wildly fond of looking at art.
This should not insult anyone. Spectating is just not his thing.
In his pre-mayoral days, when he attended the opera with his friend Beverly Sills, he would fall into a slumber ended with a sharp jab of her elbow. During a private tour of the Vatican in 2000, the thing that piqued his interest was the church's collection of Bloomberg terminals, an aide who traveled with him recalled.
Television? Movies? Concerts? Happenings? Never, rarely (unless they feature Will Ferrell), reluctantly and not on purpose.
Even his own enormous personal art collection, which ranges from the Hudson River School to 20th-century masters, has not been built through the sorts of obsessions and longings that guide most collectors. He works with a dealer and a decorator, and has a hard time remembering when and where he even acquired some works. When it comes to choosing art for his corporation or the city - with the exception of the few times he has requested something huge, challenging and preferably next-to-impossible to install - he also delegates taste to the art experts around him.
But in the most striking paradox of his mayoralty, his administration has done more to promote and support the arts than any in a generation.
Under Mr. Bloomberg, public art has flourished in every corner of the city - from "Element E," a Roy Lichtenstein sculpture in the center of the former Tweed Courthouse, to a classic limestone statue in the Bronx, to "The Gates," set up by Christo and Jeanne-Claude last winter in Central Park, a project for which he personally lobbied for almost a decade. The city's art commission, once knee-capped by the Giuliani administration as an elitist irritant, has been empowered at the highest level, with a voice in every significant public-works project in the city.
Mr. Bloomberg doles out city arts awards, and holds quiet dinners for less-endowed arts institutions, where he woos potential donors and board members over cocktails and burgers. He has donated millions of dollars from his own fortune to various art groups. His administration has borrowed significant works of art for City Hall and the lawn of Gracie Mansion.
Even something as prosaic as the city government's directory - known as the Green Book - now features modern artworks on its cover. This year, instead of green, it will be saffron, in honor of "The Gates."
The mayor's arts agenda has infused policy-making throughout the municipal government. The administration has created the first public school arts curriculum in a generation, and created zoning policies to encourage the growth of the art galleries in Chelsea. In the process, it has developed a constituency that is perhaps more enamored of the mayor than any other special interest group in the city.
Mr. Bloomberg "clearly understands that the small dance company with the office in the choreographer's kitchen is as much a part of what makes New York the great vibrant city that it is as the Met," said Ruby Lerner, president of Creative Capital, a nonprofit arts foundation that does not receive city money.
"That the reason artists want to be here, to stay here, to come here," she added, "is the dynamic mix between the institutional and the individual, the commercial and nonprofit, the flow back and forth, the incredible diversity here, aesthetic, cultural - the striving aspiration of all, at every level, to set the highest standards."
Beyond that, Mr. Bloomberg knows, art brings in tourists and deep-pocketed sophisticates. World-class cities have buildings designed by famous architects, controversial public art pieces and passels of van Goghs. World-class cities in turn attract large companies, community investment and good international press.
Art, in short, is good business.
"It's not about personal aesthetics," said Patricia E. Harris, Mr. Bloomberg's closest aide and the force behind much of his art agenda. "I think it is very pragmatic."
Nothing in particular about Mr. Bloomberg's 2001 campaign suggested he would place an emphasis on the arts. Indeed, in his first budget, Mayor Bloomberg hacked away at the operating budget of the Department of Cultural Affairs, infuriating some of the city's largest arts institutions. He reneged on the Giuliani administration's promise to turn the spectacular 19th-century Tweed Courthouse into the Museum of the City of New York, opting instead to put the Department of Education there. The building has too much natural light for art to be properly viewed, he said just last week, adding: "What is more important than our children!"
Still, as one of his first acts as mayor, he dispensed with the "decency commission" that Mayor Rudolph W. Giuliani had set up to contemplate standards for museums, formed in the wake of the "Sensation" show at the Brooklyn Museum. And he mentioned to Ms. Harris that he would like to see "The Gates" - for which, as a Central Park Conservancy board member, he had lobbied unsuccessfully - finally erected.
A few months later, men were seen wheeling giant bubble-wrapped objects into City Hall, past the security desk and into the mayor's side of the building. With little fanfare, the city had borrowed nine paintings and three sculptures from the Whitney Museum of American Art. The mayor's ceremonial office, a pair of tobacco-scented rooms where predecessors hammered out deals, is now basically an art gallery. Rare works from the city's archives dot the rest of the building, a striking contrast to municipal offices that are usually painted the color of a pediatrician's restroom and adorned with ripped maps.
Later that year, he revived the Doris C. Freedman Awards, named for one of the city's first cultural affairs commissioners. Even the grating hold music at the City Hall switchboard was replaced by Wynton Marsalis performing with the Lincoln Center Jazz Orchestra.
The larger city was brought in on the change, as public artworks began popping up all over the five boroughs with the Bloomberg administration's direct help. What was going on here?
In truth, no one much noticed.
The city was lurching toward a fiscal crisis, and disconsolate over the recent terrorist attacks. Firehouses were closing in Brooklyn, and normal people were staring suspiciously into the eyes of the passengers seated next to them on the A train, looking for signs of ill intent. There was not a lot of chatter about lofty arts awards, or the growing number of wacky large-scale sculptures dotting the city's parks.
Then, word leaked out about something that gets every New Yorker's attention: large sums of money. It was disclosed that Mr. Bloomberg had given $10 million of his own money to the Carnegie Corporation to benefit 162 small and medium-size cultural institutions around the city, in awards ranging from $25,000 to $100,000. (The mayor has repeated this gesture three times; the latest gift was $20 million.)
The donation blunted criticism about his budget-cutting. But it also sealed the view that he was friendly to the arts. "The Carnegie gift was very smart," said Gabriella de Ferrari, a New York art critic and curator. "He is there for the art world. And remember, some of the things we do are not exactly uncontroversial."
That, it seems, is part of the appeal.
The mayor of New York is the consummate insider, but Mr. Bloomberg relishes his self-generated reputation as the outsider insider, the person who owes no one anything, who pushes on doors that others had fastened shut. He is fond of boasting about how he started his own company against the advice of others and only after being fired with flourish from Salomon Brothers in 1981. Even his first mayoral campaign began as something of a large citywide joke.
While his life choices have led him to wealth and power, his aides say he is endlessly intrigued with anyone who defies convention in pursuit of some large-scale goal. Mayor Giuliani denounced "Sensation," but Mr. Bloomberg attended its opening reception with Ms. Harris.
Last year, Kate D. Levin, his cultural affairs commissioner, encouraged him to see "The Hard Nut," Mark Morris's postmodern take on "The Nutcracker," at the Brooklyn Academy of Music. The off-beat, slightly twisted piece left him riveted. "He'll say, 'It never occurred to me someone could think that way,' " Ms. Levin said. "He was really intrigued."
Some argue that while Mr. Bloomberg may favor established venues, he has been openly hostile to artists plying their wares on the street, even after a court ruling in their favor. "He and his commissioners have continued to harass street artists in every park in New York City," said Robert Lederman, who is president of Artists' Response to Illegal State Tactics. "I realize that Mike Bloomberg is an arts collector that has contributed to many arts institutions but that doesn't translate into money for artists. We believed he would automatically have a great deal of respect for artists, but we have not found that to be the case."
The city's rising fortunes aren't uniformly beneficial to artists, either. As the art market has heated up, so has the real estate market, driving struggling painters, experimental dance troupes and shoestring theater groups further out of the neighborhoods they have traditionally occupied - not just the East Village and Soho in Manhattan, but now Williamsburg and Red Hook in Brooklyn, too. That has been the city's loss as well: during this administration, some artist communities have been forced into exile, reassembling in Jersey City or Poughkeepsie.
"What was so great about New York when I came here 15 years ago was it was still possible for artists to find places to work and live that were affordable," said Randy Wray, a painter and sculpture who lives in Williamsburg, and has moved four times as rents have risen. "For a number of years, graduate students I met around the country told me that they wanted to move to New York, and that seems to be diminishing. I wonder what impact that might have on the New York art world."
As for the art that does capture the mayor's attention, the unifying theme is not school or period, but rather sheer scale.
At his company, Bloomberg L.P., then on Park Avenue, he decided the open area to the right of the elevator banks looked a little bare. After looking at several proposals, he opted for a 20-foot glass work by the artist Michael Scheiner - a sprawling installation that hung from the ceiling and spanned, octopuslike, across a staircase and hovered over the conference tables. The artist worked in the middle of the night for months to install it, and Mr. Bloomberg would check the progress each morning when he arrived at 7.
As a board member of Lincoln Center in 1999, Mr. Bloomberg used his largess and lobbying to get "The Peony Pavilion," a 20-hour classic Chinese opera, to New York after officials in Shanghai refused to allow the performers to leave China.
As mayor, he saw a huge sculpture in the Seagram Building, and decided that the former Tweed Courthouse needed one too. The result is the 50-foot-tall sculpture by Roy Lichtenstein, part of the artist's series called "Five Brushstrokes."
And visiting Athens in 2002, Mayor Bloomberg was fascinated by the Acropolis, but much the same way he had been with Visy Paper, a recycling plant he toured on Staten Island. It is the mechanics, the feat of engineering, the bureaucratic obstacles that intrigue him.
As for issues that require hours of contemplation, his friends, his aides and reporters can all attest, he is simply not built for them. (He can barely even suffer through the civic duty of Mets and Yankees games, and then only with beer and ample doses of popcorn.)
Mr. Bloomberg attends Broadway shows because that is what mayors do, but has been overheard more than once complaining about it - don't get him started on "Hairspray." "He is more into participation than observation," said his spokesman, Edward Skyler. "That is just the way he is. He would rather be running than watching sports. He would rather be learning Spanish than watching a movie. He would rather be making policy than listening to speeches."
His own art collection includes 19th- and 20th-century paintings, porcelain objects, a Lichtenstein sculpture, and two preparatory drawings of "The Gates," one of them 96 inches tall. "I don't have any more wall space left and I just don't have time," said Mr. Bloomberg during a brief chat last week. "I get home at 11, I am up to run at 5, out by 7. Who has time?"
When it comes to choosing most of the art that decorates the city's landscape - like the five Isamu Noguchi sculptures now on the Gracie Mansion lawn - Mr. Bloomberg relies on the management technique he applies to the rest of governing: he delegates.
"I think he is knowledgeable about contemporary art," said Christo in an interview, "because he is surrounded by people who are knowledgeable about contemporary art."
Ms. Harris, who oversaw Mr. Bloomberg's philanthropy at his company, is the city's unofficial curator, selecting the works the mayor ultimately approves for City Hall, Gracie Mansion, city publications and elsewhere. He has allowed Commissioner Levin - who is married to the sculptor Mark di Suvero - to extend her reach to numerous areas of government.
She helped to develop a zoning policy that let mid-block building owners in Chelsea transfer their air rights to the corner of blocks, a move that left mid-block galleries insulated against rent inflation. Her development of a mandated arts curriculum for public school students is the first of its kind since the city gutted arts education during the 1970's fiscal crisis.
Ms. Levin is also working with the city agency that preserves and develops housing on a program to address the long-standing problem of artists who are being priced out of the neighborhoods they help gentrify.
City agencies have also been instructed to smooth the path for artists, to let weird stuff happen without impediment. Tom Eccles, the former director of the Public Art Fund, a nonprofit group that presents art around the city, remembered a procession from Midtown to Queens to celebrate the temporary movement of the Museum of Modern Art in 2002. "We needed the support of the police department," he said, "and that would have been unthinkable under the Giuliani administration because you weren't given that kind of platform and were not treated with that level of respect."
While posing for a photograph in front of the Lichtenstein sculpture, Mr. Bloomberg mused about when he bought his own far smaller Lichtenstein piece, "one of the pencils." He could not recall when, or from whom. "I think I saw it in a catalog," he said, "no, maybe it was in a gallery. No, it was definitely a catalog."
And then, he was off.
PERFORMA
I recently founded PERFORMA (Spring 2004) to establish a distinctive biennial for the vast array of new performance by visual artists from around the world. I was prompted by the belief that curating and contextualizing such cutting-edge material serves to build an exciting community of artists and audiences, and provides a strong basis for educational initiatives. A performance biennial also underlines the important influence of artists’ performance in the history of twentieth-century art, and its ongoing significance in the early years of the twenty-first.
PERFORMA05, the first visual art performance biennial, will take place this year from November 3-21, 2005 in New York City. An exciting three-week program of performances, exhibitions, film screenings, lectures and symposia, it has been organized in collaboration with a consortium of international curators and artists to provide audiences with an overview of contemporary visual art performance. With more than 90 participating artists at over twenty venues, PERFORMA05 will articulate a broad range of ideas and sensibilities across disciplines and media.
Looking back to the important era of downtown New York of the Seventies, when SoHo was a beacon for the newest developments in dance, film, music and visual arts, this first biennial also looks to the future, with its focus on new media, and the infinite possibilities of generating new directions for the visual and performing arts of the new century.
PERFORMA proudly announces its PERFORMA Commissions program, which originates new multi-media productions and supports artists in the development of their work, from concept to presentation. Jesper Just’s True Love is Yet to Come and Francis Alÿs’s Ensayo are the first PERFORMA Commissions to be developed through this initiative.
Please join our active investigation of performance and its history in all aspects of the visual arts, and in establishing a vital platform for public debate of this important work.
With PERFORMA05, November is the month to be in New York! We hope to see you there.
RoseLee Goldberg
Founding Director and Curator
PERFORMA
new visual art performance
327 East 18th Street
New York, New York 10003
212 533 5720
www.performa-arts.org
Karl and Scooter's Excellent Adventure
October 23, 2005
by FRANK RICH
THERE were no weapons of mass destruction. There was no collaboration between Saddam Hussein and Al Qaeda on 9/11. There was scant Pentagon planning for securing the peace should bad stuff happen after America invaded. Why, exactly, did we go to war in Iraq?
"It still isn't possible to be sure - and this remains the most remarkable thing about the Iraq war," writes the New Yorker journalist George Packer, a disenchanted liberal supporter of the invasion, in his essential new book, "The Assassins' Gate: America in Iraq." Even a former Bush administration State Department official who was present at the war's creation, Richard Haass, tells Mr. Packer that he expects to go to his grave "not knowing the answer."
Maybe. But the leak investigation now reaching its climax in Washington continues to offer big clues. We don't yet know whether Lewis (Scooter) Libby or Karl Rove has committed a crime, but the more we learn about their desperate efforts to take down a bit player like Joseph Wilson, the more we learn about the real secret they wanted to protect: the "why" of the war.
To piece that story together, you have to follow each man's history before the invasion of Iraq - before anyone had ever heard of Valerie Plame Wilson, let alone leaked her identity as a C.I.A. officer. It is not an accident that Mr. Libby's and Mr. Rove's very different trajectories - one of a Washington policy intellectual, the other of a Texas political operative - would collide before Patrick Fitzgerald's grand jury. They are very different men who play very different White House roles, but they are bound together now by the sordid shared past that the Wilson affair has exposed.
In Mr. Rove's case, let's go back to January 2002. By then the post-9/11 war in Afghanistan had succeeded in its mission to overthrow the Taliban and had done so with minimal American casualties. In a triumphalist speech to the Republican National Committee, Mr. Rove for the first time openly advanced the idea that the war on terror was the path to victory for that November's midterm elections. Candidates "can go to the country on this issue," he said, because voters "trust the Republican Party to do a better job of protecting and strengthening America's military might and thereby protecting America." It was an early taste of the rhetoric that would be used habitually to smear any war critics as unpatriotic.
But there were unspoken impediments to Mr. Rove's plan that he certainly knew about: Afghanistan was slipping off the radar screen of American voters, and the president's most grandiose objective, to capture Osama bin Laden "dead or alive," had not been achieved. How do you run on a war if the war looks as if it's shifting into neutral and the No. 1 evildoer has escaped?
Hardly had Mr. Rove given his speech than polls started to register the first erosion of the initial near-universal endorsement of the administration's response to 9/11. A USA Today/CNN/Gallup survey in March 2002 found that while 9 out of 10 Americans still backed the war on terror at the six-month anniversary of the attacks, support for an expanded, long-term war had fallen to 52 percent.
Then came a rapid barrage of unhelpful news for a political campaign founded on supposed Republican superiority in protecting America: the first report (in The Washington Post) that the Bush administration had lost Bin Laden's trail in Tora Bora in December 2001 by not committing ground troops to hunt him down; the first indications that intelligence about Bin Laden's desire to hijack airplanes barely clouded President Bush's August 2001 Crawford vacation; the public accusations by an F.B.I. whistle-blower, Coleen Rowley, that higher-ups had repeatedly shackled Minneapolis agents investigating the so-called 20th hijacker, Zacarias Moussaoui, in the days before 9/11.
These revelations took their toll. By Memorial Day 2002, a USA Today poll found that just 4 out of 10 Americans believed that the United States was winning the war on terror, a steep drop from the roughly two-thirds holding that conviction in January. Mr. Rove could see that an untelevised and largely underground war against terrorists might not nail election victories without a jolt of shock and awe. It was a propitious moment to wag the dog.
Enter Scooter, stage right. As James Mann details in his definitive group biography of the Bush war cabinet, "Rise of the Vulcans," Mr. Libby had been joined at the hip with Dick Cheney and Paul Wolfowitz since their service in the Defense Department of the Bush 41 administration, where they conceived the neoconservative manifesto for the buildup and exercise of unilateral American military power after the cold war. Well before Bush 43 took office, they had become fixated on Iraq, though for reasons having much to do with their ideas about realigning the states in the Middle East and little or nothing to do with the stateless terrorism of Al Qaeda. Mr. Bush had specifically disdained such interventionism when running against Al Gore, but he embraced the cause once in office. While others might have had cavils - American military commanders testified before Congress about their already overtaxed troops and equipment in March 2002 - the path was clear for a war in Iraq to serve as the political Viagra Mr. Rove needed for the election year.
But here, too, was an impediment: there had to be that "why" for the invasion, the very why that today can seem so elusive that Mr. Packer calls Iraq "the 'Rashomon' of wars." Abstract (and highly debatable) neocon notions of marching to Baghdad to make the Middle East safe for democracy (and more secure for Israel and uninterrupted oil production) would never fly with American voters as a trigger for war or convince them that such a war was relevant to the fight against those who attacked us on 9/11. And though Americans knew Saddam was a despot and mass murderer, that in itself was also insufficient to ignite a popular groundswell for regime change. Polls in the summer of 2002 showed steadily declining support among Americans for going to war in Iraq, especially if we were to go it alone.
For Mr. Rove and Mr. Bush to get what they wanted most, slam-dunk midterm election victories, and for Mr. Libby and Mr. Cheney to get what they wanted most, a war in Iraq for reasons predating 9/11, their real whys for going to war had to be replaced by fictional, more salable ones. We wouldn't be invading Iraq to further Rovian domestic politics or neocon ideology; we'd be doing so instead because there was a direct connection between Saddam and Al Qaeda and because Saddam was on the verge of attacking America with nuclear weapons. The facts and intelligence had to be fixed to create these whys; any contradictory evidence had to be dismissed or suppressed.
Mr. Libby and Mr. Cheney were in the boiler room of the disinformation factory. The vice president's repetitive hyping of Saddam's nuclear ambitions in the summer and fall of 2002 as well as his persistence in advertising bogus Saddam-Qaeda ties were fed by the rogue intelligence operation set up in his own office. As we know from many journalistic accounts, Mr. Cheney and Mr. Libby built their "case" by often making an end run around the C.I.A., State Department intelligence and the Defense Intelligence Agency. Their ally in cherry-picking intelligence was a similar cadre of neocon zealots led by Douglas Feith at the Pentagon.
THIS is what Col. Lawrence Wilkerson, then-Secretary of State Colin Powell's wartime chief of staff, was talking about last week when he publicly chastised the "Cheney-Rumsfeld cabal" for sowing potential disaster in Iraq, North Korea and Iran. It's this cabal that in 2002 pushed for much of the bogus W.M.D. evidence that ended up in Mr. Powell's now infamous February 2003 presentation to the U.N. It's this cabal whose propaganda was sold by the war's unannounced marketing arm, the White House Iraq Group, or WHIG, in which both Mr. Libby and Mr. Rove served in the second half of 2002. One of WHIG's goals, successfully realized, was to turn up the heat on Congress so it would rush to pass a resolution authorizing war in the politically advantageous month just before the midterm election.
Joseph Wilson wasn't a player in these exalted circles; he was a footnote who began to speak out loudly only after Saddam had been toppled and the mission in Iraq had been "accomplished." He challenged just one element of the W.M.D. "evidence," the uranium that Saddam's government had supposedly been seeking in Africa to fuel its ominous mushroom clouds.
But based on what we know about Mr. Libby's and Mr. Rove's hysterical over-response to Mr. Wilson's accusation, he scared them silly. He did so because they had something to hide. Should Mr. Libby and Mr. Rove have lied to investigators or a grand jury in their panic, Mr. Fitzgerald will bring charges. But that crime would seem a misdemeanor next to the fables that they and their bosses fed the nation and the world as the whys for invading Iraq.
Thursday, October 20
McGill U., in Canada, cancels entire football season over hazing incident
By KAREN BIRCHARD
Canada's oldest university football team will forfeit the rest of its season following an extensive investigation into a player's hazing complaint. Officials at McGill University said on Tuesday that both the team and individual students would also be disciplined for acting inappropriately during training camp, in August.
"It's reinforcement of our zero tolerance for hazing," said McGill's acting provost, Anthony C. Masi. "The same thing will happen to any team who engages in hazing in the future. ... No excuses. No exceptions."
That tough line goes beyond sports, he added, saying that if hazing is part of any activity at the Montreal university, the activity will be suspended.
The football player who complained, an 18-year-old rookie, left both the team and the university after asserting that he and other recruits had suffered humiliating and degrading hazing. The student, who remains unidentified, said they had been taken to a darkened squash court to meet "Dr. Broom," and had been asked to remove their shorts. The student said he had refused to do so. He was then forced, he said, to his hands and knees, was gagged with a dog's chew toy, and was touched anally with the end of a broomstick.
A report on the university's investigation, completed on Monday, found that "the event did involve nudity, degrading positions and behaviors, gagging, touching in inappropriate manners with a broomstick, as well as verbal and physical intimidation of rookies by a large portion of the team." The report said that no actual sodomy had taken place.
The head coach initially suspended one player indefinitely and five others for one game. But the university's report found the entire team responsible and ordered its members to perform at least two years of community service. The university said it was pursuing disciplinary actions against individuals as well, but was not seeking expulsions.
The father of the rookie who complained, himself a former professional player and McGill alumnus, said the family had decided to make no statement at this time. The player intends to resume his studies at the University of Toronto in January.
Canadian Interuniversity Sport, the organization that oversees varsity sports in Canada, said it supported the university's actions against the team, which was formed in 1872. The team's withdrawal, however, does create some logistical problems for the league -- McGill's game against top-ranked Laval University, scheduled for Saturday, was supposed to have been televised.
Wednesday, October 19
Portrait of the Writer as a Computer User, Verlyn Klinkenborg, NY Times
Readers often ask me if I write on a computer. The question comes up because I sometimes write about nature, and because readers always wonder how writers actually work. The answer is that, at some point in my writing career, I have used just about every popular technology developed for writers since the invention of the lead pencil.
I've written with cedar pencils and mechanical pencils, with ballpoint and felt-tip and fountain pens. I've written on manual and electric typewriters. I have never used a quill. And unlike my Times colleague, David Pogue, I don't write with voice-recognition software, because my prose doesn't work that way. But if the software designers ever develop thought-recognition software, I will certainly give it a try.
When I write, I compose sentences in my head. Most writers could say the same. Untold numbers of sentences - hundreds of thousands of them - have expired in my thoughts without ever taking on ink. They have left no trace except for a certain economy in the way I work, the result of years of ruthless sentence slaughter.
I have never believed in making a fetish of the tools I write with. A perfect pen, a clothbound journal, a supportive chair, a clean, well-lighted desk, the right time of day, the ideal stimulant or inebriant, a favorite mug - these are beloved of aspiring writers and eagerly touted by the aspiring-writer industry. Sooner or later they all turn into reasons you can't write, when you run out of green tea or lavender ink. I need to be able to write whenever and wherever I can. That's why I write in my head. It's always with me.
But I do write on a computer, and that's been true since 1984, when I was living in the Bronx and writing on my new Zenith PC. I can see now how primitive that machine and its software really were. And yet in some ways, the computer I'm writing these words on now - a powerful and nearly silent Apple iMac - seems far less exotic to me than that Zenith did. How I got from one to the other - and what I noticed along the way - is the story you are about to read.
1984, The Bronx, and a Zenith PC
In 21 years, I've owned a dozen personal computers, not counting the three machines The Times has provided since I joined the editorial board. There's really no excuse for so much hardware except an interest in computers and an appetite - nearly a sickness, as every computer-lover knows - for keeping up to date. For conveying the words I write into print, the newest computers work best because they seamlessly intersect with the machines that make e-mail and the Internet and the electronic editing of books and magazines and newspapers possible. But when it comes to the basic task of taking down sentences, my very first computer probably did the job almost as well as my new G5 iMac.
My first computer was a Zenith PC-compatible with two genuinely floppy 5 1/4-inch disk drives. I bought it in December 1984, a year after I finished typing my doctoral dissertation - a year too late, in other words. I remember almost nothing about the technical data of that PC, which seems appropriate for a computer that had virtually no memory itself. But I see, from the invoice, that it came with MS-DOS 2.11, Basic 2.0, Microsoft Word 1.1, and 320 kilobytes of RAM. With a faculty discount from Fordham University, where I was teaching at the time, that PC cost $1,656.67, or $3,066.45 adjusted for inflation.
It was easy to rationalize buying a good typewriter. I typed my dissertation on a topaz IBM Correcting Selectric III, a wonderful machine even though it remembered only the last key I pressed. (It cost $920.13 with tax, or $1,757.63 in today's dollars.) In fact, I wish I still had it, since I prefer to write letters, actual snail-mail letters, on a typewriter.
Rationalizing the PC took a little more self-deception. For a writer, owning a personal computer in 1984 was really only about owning a computer. The glow of green letters on a black background, the static of the cathode ray tube, the sense of glimpsing the future when I sat down to work - these things had nothing to do with writing. They had to do with a sense, deeply embedded in me in grade school, that science might someday make a better world for all of us. But there was something disappointing about that PC. The mechanics of manipulating text on it weren't very different from writing on a typewriter. It was just phosphorescent typing. If I made a mistake, I could correct it without having to throw away a sheet of paper. But I could do the same on my typewriter.
All in all, the Selectric was the better machine. It didn't run DOS. It didn't require a printer, or type in dot-matrix. There was something gratifyingly physical about the sound of the spherical typing element hitting the paper and the platen behind it. I could feel in my bones the solidity of every word. There was even something satisfying about ripping a sheet out of the carriage in disgust, crumpling it up, and tossing it into the wastebasket. I could turn the Selectric on or off in an instant, without booting up or shutting down. And when it was off, it seemed to recede into the room - just a familiar piece of office furniture.
Unlike the typewriter, the PC always wanted to be on. After I booted it up, it even pretended to have a life of its own, though it wasn't much of a life. The monitor liked to stare me in the face, cursor blinking, the command prompt quietly needling me as if there were some dim intelligence on the other side of the screen. "A" it kept saying, "A." That was all. I wanted to work in English. It wanted to work in Basic or Fortran or Pascal, languages I had never heard of before. I wanted to write prose of many moods. It preferred the imperative - short, choppy commands in a code intelligible only to itself. I wanted to think. It wanted to run Flight Simulator. And there was always the worry, no matter how often I backed up my work, that the machine would accidentally delete it - a treason no typewriter is capable of.
The Anti-Romantic Computer
There is a tired myth that writers fear the blank sheet of paper. The fact is that a sheet of paper conveys no expectation. It just lies there keeping its counsel, infinitely patient. But not a computer. The cursor winks dictatorially, pointing to the spot on the screen where you must begin. "Here," it says, "Go ahead and write. Go ahead. Write. Here. Now." That subtle electronic tyranny is often, I think, what gives real point to the question, "Do you write on a computer?"
Most of the ordinary notions about writing are essentially romantic. Readers may not be thinking specifically of Keats in the blushing stubble or Coleridge by the midnight fire, but they do tend to take it for granted that the best parts of a writer's work, especially a writer whose subject is nature, arise spontaneously, organically.
The premise of that romantic faith, as commonly misunderstood, is that writers don't really work on command. Inspiration is supposed to rise up within them like an intermittent spring. Running a newspaper along those lines would be almost impossible, of course - no end of trouble for the Public Editor. But so would writing a novel or a work of literary nonfiction or, for that matter, a lyric poem.
Writing while seated at the computer - the cursor winking expectantly as the seconds tick past - is the very antithesis of the romantic myth. It implies that you expect to be able to find what's best in you, as a writer, by looking for it instead of waiting for it to bubble up among the nightingales.
And that is exactly what I do expect, whether I'm writing about Interior Department management policy or the paw-print of a wolf in Yellowstone. Writing means allowing yourself to think while paying attention to what you think as you think it. A computer is no impediment to that, unless you have broadband.
Beginning writers (and an awful lot of readers) love to believe that writing flows naturally. The ghost of spontaneity is upon them. They assume that a computer is so inorganic, so soulless, that it somehow dams the flood of words. But in good writing there's no such thing as a natural flow of words, except as an effect in the reader's mind. The real labor - and the real art - is considering each word one by one and laying it in its proper place. (George Orwell makes a version of this point in his seminal essay, "Politics and the English Language.")
The problem, of course, is the word "natural" - a word that "motherboard," "Microsoft" and even "Macintosh" seem inherently to contradict. That word underlies most of the false assumptions about how writers work. There's nothing natural about writing, whether you're dipping ink out of a bottle with the nib of a pen or working in VoodooPad on a PowerBook. Talking is natural. Writing is not. You learn to do the one without knowing it. The other takes years of practice. For some reason, this confuses people.
All writing is revision. (Just look at William Wordsworth's manuscripts.) And revision is where computers come in handy. They make it easy to revisit any sentence, any paragraph, and to write from the middle instead of the end. And they make seeing the results of your revisions effortless. They bring the task of revising your words as close to the dexterity of thought as it is ever likely to get. And, in a certain sense, because computers make managing text so effortless - no retyping yet another draft when you thought you were finally finished - they remove any plausible excuse for not getting every single word, every single phrase, just the way you want it. (Not that there ever was a plausible excuse.)
Yet even now, after 20 years of writing on computers, I still print a hard copy of my day's work, and I still read it aloud to myself and edit it with a pen. My ear is still vastly smarter than my eye, just as it was when I used to type up in the evening what I had written in pencil during the day and edit it one more time before I went to bed. I suppose I could ask one of the personalities in the VoiceOver Utility on my iMac - Fred or Kathy or Zarvox - to read my work aloud to me. But I'm a much better reader than they are. I know what the words actually mean.
Another Zenith, Circa 1988, and the Claustrophobia of DOS
Of all the computers I've owned, the only one I've kept around for sentimental reasons - does anything have less sentimental value than an old computer? - is a Zenith laptop, or, as the label on the bottom says, "Lap Top Computer Model ZFL-181-92." I bought it in 1988. It is nearly as big as four iBooks stacked in pairs side by side. It has aged the way beige plastic ages, turning as yellow as a smoker's teeth.
My wife found the Zenith in the attic the other day. I dug out a copy of MS-DOS 3.2 and slipped it into the A drive. I turned on the switch, and abruptly the A prompt appeared on that small blue screen. I asked for the B prompt, inserted a data disk in the B drive, and asked to see the disk's contents by typing "dir/w." I was surprised to find myself speaking DOS. The Zenith was not. The screen scrolled, and there I was - just where I'd left off working 16 or 17 years ago.
What struck me - besides the residue of DOS commands cluttering my brain - was the claustrophobic quality of the actual workspace, which, for all the bulk of the laptop, was only about half the size of a sheet of typing paper. The screen was small, yes, but MS-DOS also lacked the graphical user interface pioneered by Apple and popularized by Microsoft as Windows 3.0 in 1990. The operating systems we all use now offer the illusion of looking into the screen as if it were a terrarium filled with three-dimensional objects. That was not possible in DOS 3.2, not in 1988, and not on the Zenith. What I was looking at was no deeper or more illusory than the display on a calculator.
By 1993 the Zenith was just about worthless, which was a function of time, not wear. Suddenly, there was no longer any point to a laptop running DOS on 3 1/2 inch disks. I bought an IBM Thinkpad, a Windows laptop that seemed like a miracle of miniaturization and power and graphical beauty. I never gave the Zenith another thought. It took ten years before I began to regard it fondly, as a relic of a simpler time. I looked up "Zenith laptop" on eBay the other day, and the opening bid for the machine I found - a ZFL-181-93 - was $9.99, or $6.16 in 1988 dollars. No one was bidding on it.
The Software of Getting What You See
Hardware is less than half the story, of course. It wasn't just the screen on the Zenith laptop that made me feel claustrophobic the other day - it was MS-DOS, too. I have half a dozen different versions of DOS stuck away in a closet, and piles of disks for other, long-forgotten software, all of it utterly familiar to me once upon a time.
The old elaborate protocols for buying new software have largely vanished. Installation disks have been replaced by instant downloads and e-mailed registration codes. Detailed manuals have been replaced by nothing. A major software release used to feel like a mental land-rush. There was a flurry of dust and confusion at the beginning and a race to the most promising homesteads. Everything looked fresh and fertile for a while. And then eventually you realized that this, too, had become a well-settled country and it was time to move on. All that old software in my closet describes an antique land, no longer habitable now that I've seen the present.
Nearly every computer user, writer or not, is caught up in a basic tension: learning new software as rapidly as possible while carrying old habits forward. I've been using Microsoft Word my whole computing life. I began with version 1.1 in 1984, when the basic program was about 72,000 bytes in size. (The core of the version I now use is about 13 megabytes.) I've been working in Word for nearly its entire history. I've switched platforms - from Microsoft Windows to Apple OS X - but I still use Word. The boundaries of the program have steadily pushed outward, embracing new languages (like HTML) and technologies (like e-mail and the Internet) that were only in their infancy when I began using it. But within Word I still live in the same pleasant township I used to live in, never mind how large or sophisticated the district around me has grown. Every now and then I cultivate a new patch of ground. But with every upgrade of Word, I use proportionately less and less of the program's full potential. And so I have stopped upgrading.
Since 1984, I've lived in seven different apartments or houses in New York City, the Berkshires, and now upstate New York. I've taught at six different colleges and universities and worked at The Times for eight years. I've owned a dozen different computers and written four books. But I've lived in only one writing program. I was a Word man from the get-go. Word came bundled with my original Zenith PC, and that was enough. I was lucky. It could have been XyWrite.
Information or, Alas, the Efficiency of Inefficiency
I vividly remember an article that James Fallows wrote about Lotus Agenda, a powerful "PIM" or personal information manager, in the Atlantic Monthly in 1992. It was a glimpse inside a remarkable piece of software, but it was also a tantalizing peek inside the working habits of another writer. I was struck by the flexibility and capaciousness of Agenda, as Fallows described it, but I was even more impressed by the time he had put into learning the program, which helped him organize the masses of information he was gathering as a reporter. I almost bought a copy of Agenda. What stopped me was the price - $399 - and the recollection that I am not James Fallows.
For me, writing is an inherently inefficient business. It takes a lot of staring out the window, and walking downstairs to see if the ducks and geese are getting in the road. It is often accompanied by a nap, a far more powerful organizational tool than most people realize. I have sound scholarly habits and good reporting skills. But I have yet to be conquered by the recent passion - of which Agenda is an early instance - for Getting Things Done, based on the book of that name by David Allen. (Fallows has also written in the Atlantic about Allen and his GTD gospel of efficiency.) My model is closer to Robert Benchley's "How to Get Things Done." That essay, published 75 years ago, begins, "A great many people have come up to me and asked me how I manage to get so much work done and still keep looking so dissipated."
There are plenty of software tools to help writers get organized. But to me they're useful in inverse proportion to the time it takes to learn them. I'm constantly trying them out in hopes of finding one that is both life-changing and simple. I've worked in DevonThink and Ulysses (forgive me if I'm entering Mac-only territory) and liked them well enough to buy them. (I especially like one of DevonThink's most trivial features - its ability to make an instant concordance of anything I'm writing.) I've downloaded countless other note-taking and organizational programs that went into the trash long before the trial period ended. Most of them expected me to become as systematic and energetic as James Fallows.
I don't lack system. I was trained as a scholar, and my note-taking for the books I've written has been as extensive and as scrupulous as I know how to make it. But from the narrow perspective of my own writing, what I've realized is this: I don't need a complex program I can dump all my notes into as I gather them, like Agenda or even DevonThink. I need a simple program that can search outward, across the electronic galaxy of my hard drive, and find the notes I need no matter where, or in what format, I've dumped them.
I discovered last year, for instance, that the "find" command in Adobe Reader 6.0 ("search" in Adobe Reader 7.0) allows you to find any word in any set of PDF documents you specify. Instead of finding the word you want by searching, instance by instance, through a single text, this command gathers every occurrence, in context, into a single list, essentially creating a concordance entry for that word in that set of texts. To me this seemed like a miracle. I converted all the notes I'd taken for my new book - hundreds of pages of them - into PDF format (a simple matter that took only a few minutes). Suddenly I could see every occurrence of the word "tortoise," for instance, or "curate." To someone who is utterly text-based - a writer, in other words - this is an invaluable tool. What made it work, however, was the fact that I'd already typed hundreds of pages of notes into the computer. There is no real shortcut for that, if only because typing those notes was a way of imprinting them in my memory.
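What that Adobe Reader command builds is, in effect, a keyword-in-context concordance. Here is a minimal sketch of the same idea, assuming the notes live in plain-text files rather than PDFs; the folder name and search term below are hypothetical, chosen only for illustration.

import pathlib
import re

def concordance(folder, word, context=40):
    """Collect (filename, snippet) pairs for every occurrence of `word`
    across all .txt files under `folder`, keeping a little surrounding text."""
    hits = []
    pattern = re.compile(re.escape(word), re.IGNORECASE)
    for path in pathlib.Path(folder).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for match in pattern.finditer(text):
            start = max(match.start() - context, 0)
            end = match.end() + context
            snippet = " ".join(text[start:end].split())  # collapse whitespace
            hits.append((path.name, snippet))
    return hits

# Hypothetical usage: every appearance of "tortoise" across a folder of notes.
for name, snippet in concordance("book-notes", "tortoise"):
    print(f"{name}: ...{snippet}...")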
All the formatting power of the big word-processors - and the only one left standing is Word - isn't worth as much to me as the increasingly powerful search tools that are embedded in new operating systems. The best example is Spotlight, which appeared in the Apple OS X release called Tiger (10.4) earlier this year. Like most long-time computer users, I've almost completely absorbed the hierarchical thinking - the dendritic structure of folders and files - embedded beneath the surface of the desktop metaphor. I still file documents carefully and accurately, just as if I were still working in DOS. But that doesn't mean I remember where I've put them.
With Spotlight, I can type a word or phrase into the search window and find every occurrence on my computer. I can think of almost nothing that changes the feel of using a personal computer more than this utility. Nearly everything can be filed almost anywhere - perfect chaos under the hood - and it makes practically no difference.
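Set aside the live index Spotlight maintains behind the scenes, and the query itself amounts to a flat search of every readable file under a starting folder, wherever those files happen to have been filed. Below is a toy, index-free sketch of that behavior, with a hypothetical starting path and phrase; the real Spotlight answers from its index, which is why it feels instant.

import os

def find_files_containing(root, phrase):
    """Yield paths of files under `root` whose text contains `phrase`,
    ignoring the folder hierarchy entirely."""
    phrase = phrase.lower()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    if phrase in fh.read().lower():
                        yield path
            except OSError:
                continue  # unreadable file: skip it and keep walking

# Hypothetical usage: find every document mentioning "curate".
for hit in find_files_containing(os.path.expanduser("~/Documents"), "curate"):
    print(hit)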
For most computer users these days, the big question is how to aggregate usefully all the bits of information that flow through the machine - contacts, calendars, web-pages, emails, graphics, style-sheets, RSS feeds, and every other sort of flotsam - in a way that makes them easily accessible. (One example is Onlife, which tracks your actions in various applications. Another is Scrapbook, a Firefox extension, which allows you to clip web-pages to a virtual scrapbook as you surf the Web.) The important thing to notice about these tools is that they pertain almost wholly to a computer-defined universe. They can manipulate only a certain kind of reality - what used to be called data or information and is now, oddly enough, often called content.
The metaphors in these programs are a giveaway - scrapbooks and shoeboxes. I never write on or in or with a shoebox. These tools are handy only as long as I remember that they have nothing to do with writing. My office at home, like my office at The Times, is filled with non-metaphorical aggregations of information: stacks of books, mounds of magazines, piles of paper, notebooks filled with handwritten notes, files old enough to contain, for example, the invoices for my IBM Selectric and my first PC. I often spend a half hour looking for a book I know is somewhere in the house, just as I used to spend whole afternoons pleasantly lost in the library stacks at Princeton and Harvard. Over the years, I've probably learned as much from the books I found on the way to the book I was looking for as I have from the book I wanted. That is an example of the efficiency of inefficiency.
My own brain is the information manager I'm interested in. I need to be able to listen for hints that are utterly intuitive. As I am writing, I need to be alert to a feeling I can't quite explain, a realization I haven't realized yet, an analogy that's still missing some of its parts. I'm interested in understanding the shape of what I don't know because that too is a form of knowledge.
In a sense, writing editorials for The New York Times epitomizes the problem that many writers face - or believe they face - these days. I usually have, at most, a few hundred words to work in. On any subject - the redevelopment of the World Trade Center site, the evils of factory farming, the fundamentalist attack on evolution - I can find reams of information very quickly. But good prose can carry only so much freight. In a few hundred words, there is room for only a few facts and one central idea, no matter how many interviews I've done or how many reports I've read. But all writing, really, is reticence. It is about saying one thing at a time, not everything at once. The trick isn't learning how to manage information. It's learning how to think. There is no software for that.
The Switch
I gave up in the spring of 2002. Eighteen years of Microsoft operating systems was enough. I had been through every version of DOS and nearly every version of Windows up to that point. If I knew as much about my tractor as I did about the fine points of those systems, I would be able to rebuild its hydraulics from scratch. It made no difference. As Windows grew more and more complicated (or kludgier and kludgier) - and as Microsoft tried to make it seem simpler and friendlier to novices - it also got uglier and uglier. So it seemed to me. I moved my last Windows desktop to the basement, where I hope the mold is eating its hard drive. I bought an iBook and have lived happily ever after.
It was just the right time. The iBook came with OS X 10.1. That's what I use. I never glanced at OS 9. I wanted nothing to do with the past - even Apple's past. I now no longer have to worry about crashes or screen freezes, regular occurrences in my Microsoft days. This has nothing to do with writing, I know, but it has everything to do with allowing me to keep my composure. The real reason for switching platforms, though, was to recover some of the pleasure of using a computer, which had almost vanished for me. The stability of my 12" iBook (and its successors, a 12" PowerBook and a 20" iMac) was important, and so was ease of use and a sense of inventiveness. But what has won me over is the esthetics of the Apple cosmos. It's a fine-grained universe with smooth, clean edges. The world within the screen appears to recognize, and obey, the laws of gravity. Solids appear solid, not pixelated and porous. My Apple seemed surprisingly willing to leave me alone to do my work. It never nagged me. It never panicked. It had made a clean break with the past and it let me do so too.
I still write in Microsoft Word, I know. I'm happy with Word X for Mac, which hasn't been updated in a couple of years, and I will not upgrade to Office 2004. Even the name makes me nervous. Every now and then I hit just the wrong combination of keys, and the Office Assistant pops up. I loathe the Office Assistant - even the Mac version, which is far less annoying than the Windows one. Animated assistance is the last thing I want. It's as useless as a grammar checker. I suspect that after 21 years I've nearly come to the end of the Word road. I suppose I'll miss it when I find the right replacement. And yet I haven't missed Windows - or MS-DOS - yet. I don't think I ever will.
This, too, has nothing to do with writing. And yet everything has something to do with writing. That's the nature of the job.