Saturday, May 31

On Loyalty

What George Forgot

“DISLOYAL, SICKENING AND DESPICABLE,” wrote Bernard Kerik in an e-mail that he was circulating this week. Kerik, you may remember, is the former New York City police commissioner whom George W. Bush once tried to make chief of Homeland Security. This was during Kerik’s happier, preindictment era.

Kerik’s outrage was directed at Scott McClellan, the former Bush press secretary whose much-discussed memoir, “What Happened,” reveals that the Bush White House put politics ahead of truth and openness with the American people.

I know it’s a shock, but try to be brave.

The administration’s defenders have not attacked the book’s thesis — really, what could you say? But they’ve been frothing at the mouth over McClellan’s lack of loyalty. “This will stand as the epitome, the ultimate breach of that code of honor,” said Mary Matalin.

We’ve heard a lot about loyalty this year. Remember when Bill Richardson endorsed Barack Obama and James Carville compared Richardson to Judas Iscariot? And the whole Jeremiah Wright drama was mainly about Obama’s coming to grips with the sad fact that presidents do not have the luxury of being loyal to anybody outside of their immediate gene pool.

“Having been through all I have been through in the past four years, disloyalty and betrayal seem more prevalent today than ever before in my lifetime, and that in itself, to me, is sickening,” Kerik wrote in his e-mail, which also suggested that writing unflattering memoirs about working for the president “should be a crime.”

Currently under indictment for multiple counts of fraud, conspiracy and tax evasion, Kerik is not, at this point, a person the administration calls upon when it wants to be defended. But he is a perfect example of how worthless a quality loyalty is in high government officials.

Kerik is stupendously loyal, which is what endeared him to Rudy Giuliani, his great patron. The Bush administration, which also prizes loyalty, shipped him off to Iraq with the critical job of supervising the rebuilding of the Iraqi police. Kerik stayed only three months, during which he devoted himself to giving interviews and being gregarious, the two things he does very well. Management, however, turned out not to be a strong point.

Back home, Bush was embarrassed when Kerik’s Homeland Security nomination immediately ran aground on reports of his ethics issues. His downfall was a terrible blow to Giuliani’s presidential candidacy — although given Rudy’s multitudinous deficiencies as presidential timber, it’s hard to pick the one that made the difference.

Anyway, there’s the loyalty trade-off for you: On the one hand, Kerik did a terrible job in a critical assignment in Iraq, allowed himself to be nominated to a hugely important post for which he was ill qualified and showed a stupendous lack of interest in ethical considerations when he served in New York City.

On the plus side, he will never, ever write a tell-all memoir about any of the great men he has served.

Whoever the next president is, I hope he or she picks incredibly well-qualified people who are strong enough to speak their minds and cynical enough not to assume the chief executive knows what he or she is doing. Loyalty does not tend to be a great virtue in these types, and the goal should be to wring as much accomplishment as possible out of them before the inevitable betrayal.

My favorite moment in “What Happened” was from 1999 when George W. Bush was deeply irritated about questions from the press on his past drug use. “The media won’t let go of these ridiculous cocaine rumors,” the future president said. “You know, the truth is I honestly don’t remember whether I tried it or not.”

“I remember thinking to myself, How can that be? It didn’t make a lot of sense,” McClellan wrote.

While the bracing effects of being pushed out of his job have helped McClellan face reality, clarity might have come earlier if he’d just been more canny about personal relationships. His White House career could have been so different if, when Bush started babbling about W.M.D.’s in Iraq, McClellan had reminded himself that this was coming from a guy who couldn’t remember what drugs he had ingested.

Even now, McClellan still appears to have trouble with the critical concept that deeds matter more than words.

“Waging an unnecessary war is a grave mistake,” he writes. “But in reflecting on all that happened during the Bush administration, I’ve come to believe that an even more fundamental mistake was made — a decision to turn away from candor and honesty when those qualities were most needed.”

Personally, I’m a huge fan of candor and honesty. But when it comes to fundamental mistakes, I’ll start with the unnecessary war.

Thursday, May 29

McCain's Foreign Policy Worse than Bush's?


Many foreign-policy mavens have wondered which John McCain would step to the fore once he started running for president in earnest—the McCain who consorts with such pragmatists as Richard Armitage, Colin Powell, and George Shultz; or the McCain who huddles with "neocons" like Robert Kagan, John Bolton, and William Kristol (before he started writing op-eds for the New York Times).

Last month, the Times published a story about the battle for McCain's soul that's being waged by those two factions.

On Tuesday, McCain cleared up the mystery: He's with the neocons. He is, fundamentally, in sync with the foreign policy pursued by George W. Bush for his first six years in office. The clincher is that he has now broken with the president on the one issue where Bush himself reversed course more than a year ago after realizing that his policy had failed. In two op-ed articles and a speech—all of them published or delivered on Tuesday, May 27—McCain called for a return to Bush's original, disastrous approach.

The issue is nuclear negotiations with North Korea.

First, a quick recap (taken mainly from Chapter 2 of my book, Daydream Believers: How a Few Grand Ideas Wrecked American Power):

In 1994, top officials for President Bill Clinton and North Korean dictator Kim Jong-il signed the Agreed Framework, an imperfect, interim accord that nonetheless froze Pyongyang's plutonium program, kept its nuclear fuel rods locked up and monitored by international inspectors, and thus prevented the tyrant from developing an A-bomb for the next eight years.

When Bush took office, his secretary of state, Colin Powell, wanted to pick up where Clinton left off—the two sides were on the verge of hammering out a treaty banning the production and export of long-range missiles—but Bush shut him down. The principle, as stated by Vice President Dick Cheney: "We don't negotiate with evil; we defeat it."

So, the North Koreans kicked out the inspectors, unlocked the fuel rods, reprocessed a half-dozen A-bombs' worth of plutonium—and Bush did nothing. Finally, in August 2003, Bush agreed to set up "six-party talks" on the North Korean problem—along with China, Russia, Japan, and North and South Korea—but stopped short of offering Pyongyang any incentives to reverse its course. His position was that Kim Jong-il must dismantle his nuclear program as a precondition to negotiations—an absurd stance on its face, since plutonium was Kim's only bargaining chip, and he wasn't about to cash it in before talks even began.

In October 2006, the all-but-inevitable took place: The North Koreans set off a nuclear explosion at a remote test site.

Nobody said so at the time, but what happened here was that Bush had gone eyeball-to-eyeball with the pygmy of Pyongyang—and lost. He and most of his aides had figured that all they had to do was to hold out—that Kim Jong-il's monstrous regime would collapse before it managed to set off a bomb. They were wrong.

So, at the beginning of last year, Secretary of State Condoleezza Rice convinced Bush that it was time to negotiate for real. She sent her emissary Christopher Hill to Berlin to conduct one-on-one talks with his North Korean counterpart—something Bush had said repeatedly that he would never do. Within a few days, the two struck a deal that did not require the North Koreans to dismantle their program as a prerequisite—another violation of earlier principles.

Former Bush officials hit the ceiling—especially John Bolton, who, during the first term, had tried to disrupt the six-party talks, limited as they were. (Some aides still in office also rebelled; Elliott Abrams, Bush's deputy national security adviser, sent out e-mails to his neocon comrades, rallying them to protest.)

But guess what? The deal has worked out pretty well. The North Koreans have halted their plutonium program, shut down and started to take apart their nuclear facilities, and handed over 18,000 pages of documentation on the program to date.

Things are far from perfect. There are still outstanding—and important—questions about North Korea's role in assisting Syria and perhaps Iran in developing a nuclear program. We don't yet know how complete those 18,000 pages are. And nothing has been worked out on how to verify any future North Korean claim that they have destroyed all their nuclear materials.

Then again, it was Bush who forfeited his leverage when he stood by and let North Korea build an A-bomb to begin with. Unable to take military action (the risks of North Korean retaliation against South Korea or Japan were deemed too dreadful) and unwilling to pursue diplomacy, he instead did nothing—and the consequences were inevitable. The deal that Hill worked out isn't great; it's not even as tight as Clinton's Agreed Framework; but the North Koreans hadn't reprocessed their plutonium when Clinton was president. Hill's deal might be the best that could have been negotiated under the circumstances. In any case, it's better than nothing.

McCain wants to undo the deal; he wants to go back to nothing. In an op-ed for the Asia edition of the Wall Street Journal, McCain and his co-author, Sen. Joe Lieberman, wrote, "We must use the leverage available from the United Nations Security Council resolution passed after Pyongyang's 2006 nuclear test to ensure the full and complete declaration, disablement and irreversible dismantlement of [North Korea's] nuclear facilities, in a verifiable manner."

Absent knowledge of the historical context, this sounds reasonable. (Even with such knowledge, it's desirable.) The U.N. Security Council did pass a resolution that condemned the nuclear test and called on North Korea to dismantle its facilities.

However, the members of the Security Council knew, soon enough, that the resolution was unenforceable. Even Bush realized that, contrary to McCain and Lieberman's premise, the resolution gave them no "leverage" whatever.

In a similar op-ed for the Japanese newspaper Yomiuri Shimbun, McCain and Lieberman urged using the six-party talks "to press for a full, complete, verifiable declaration, disablement and dismantlement of North Korea's nuclear weapons program."

However, the fact is, the six-party talks really don't exist anymore, except as a ratifying body for bilateral talks between the United States and North Korea. (Hill, in fact, is reportedly in Beijing today, continuing these one-on-one sessions.) Bush decided, realistically, that demanding dismantlement as a first step was a nonstarter and that a freeze followed by a gradual disabling—prodded by the delivery of free fuel oil and other economic aid—was more feasible and eminently worthwhile. He had tried cutting off economic aid before, but it had no effect in weakening Kim's hold on power.

As Daniel Sneider, assistant director of the Shorenstein Asia-Pacific Research Center at Stanford University, put it in a phone interview Tuesday night: "The policy that John McCain proposes is the policy that George W. Bush pursued—and that policy failed. There's not much to be said for going back to a policy that failed to contain North Korea's nuclear program."

Finally, in a speech at the University of Denver, delivered the same day that the op-eds were published, McCain suggested that his demand for nuclear dismantlement was contrary to the position of the Democrats' likely presidential candidate, Sen. Barack Obama. "Many believe," McCain said, "all we need to do to end the nuclear programs of hostile governments is have our president talk with leaders in Pyongyang and Tehran, as if we haven't tried talking with the governments repeatedly over the past two decades."

In effect, McCain was really criticizing George W. Bush. It was Bush who dropped the demand for North Korean dismantlement as a first step, much less as a precondition to talks. And as for McCain's snide aside—"as if we haven't tried talking with the governments repeatedly"—well, in fact, we haven't, or at least Bush hadn't, until he let Hill talk directly with the North Koreans. And, as it happened, that was all we needed to do to end (or at least to halt and start to tear down) their nuclear program.

And so, if John McCain is elected president, it's not quite true that he'll continue the policies of George W. Bush, as Sen. Obama charges. When it comes to controlling and disarming North Korea's nuclear program, McCain would set back Bush's policy several years.

Saturday, May 24

Where Breathing Is Deadly

BADUI, China

China’s biggest health disaster isn’t the terrible Sichuan earthquake this month. It’s the air.

The quake killed at least 55,000 people, generating a response that has been heartwarming and inspiring, with even schoolchildren in China donating to the victims. Yet with little notice, somewhere between 300,000 and 400,000 Chinese die prematurely every year from the effects of outdoor air pollution, according to studies by Chinese and international agencies alike.

In short, roughly as many Chinese die every two months from the air as were killed in the earthquake. And the problem is becoming international: just as Californians can find Chinese-made shoes in their stores, they can now find Chinese-made haze in their skies.

This summer’s Beijing Olympics will showcase the most remarkable economic explosion in history, and also some of the world’s thickest pollution in both air and water. So I’ve returned to the Yellow River in western China’s Gansu Province to an isolated village that has haunted me since I saw it a decade ago.

Badui is known locally as the “village of dunces.” That’s because of the large number of mentally retarded people here — as well as the profusion of birth defects, skin rashes and physical deformities. Residents are sure that the problems result from a nearby fertilizer factory dumping effluent that taints their drinking water.

“Even if you’re afraid, you have to drink,” said Zhou Genger, the mother of a 15-year-old girl who is mentally retarded and has a hunchback. The girl, Kong Dongmei, mumbled unintelligibly, and Ms. Zhou said she had never been able to speak clearly.

Ms. Zhou pulled up the back of her daughter’s shirt, revealing a twisted, disfiguring mass of bones.

A 10-year-old neighbor girl named Hong Xia watched, her eyes filled with wonder at my camera. The neighbors say she, too, is retarded.

None of this is surprising: rural China is full of “cancer villages” caused by pollution from factories. Beijing’s air sometimes has a particulate concentration that is four times the level considered safe by the World Health Organization.

Scientists have tracked clouds of Chinese pollution as they drift over the Pacific and descend on America’s West Coast. The impact on American health is uncertain.

In fairness, China has been better than most other countries in curbing pollution, paying attention to the environment at a much earlier stage of development than the United States, Europe or Japan. Most impressive, in 2004, China embraced tighter fuel economy standards than the Bush administration was willing to accept at the time.

The city of Shanghai charges up to $7,000 for a license plate, thus reducing the number of new vehicles, and China has planted millions of trees and hugely expanded the use of natural gas to reduce emissions. If you look at what China’s leaders are doing, you wish that President Bush were half as green.

But then you peer into the Chinese haze — and despair. The economic boom is raising living standards hugely in many ways, but the toll of the resulting pollution can be brutal. The filth is prompting public protests, but the government has tightly curbed the civil society organizations that could help monitor pollution and keep it in check.

An environmental activist named Wu Lihong warned for years that Lake Tai, China’s third-largest freshwater lake, was endangered by chemical factories along its banks. Mr. Wu was proved right when the lake filled with toxins last summer — shortly after the authorities had sentenced him to three years in prison.

Here in Badui, the picture is as complex as China’s development itself. The government has taken action since my previous visit: the factory supposedly is no longer dumping pollutants, and the village has been supplied with water that, in theory, is pure. The villagers don’t entirely believe this, but they acknowledge that their health problems have diminished.

Moreover, economic development has reached Badui. It is still poor, with a per-capita income of $100 a year, but there is now a rough dirt road to the village. On my last visit, there was only a footpath.

The road has increased economic opportunities. Farmers have dug ponds to raise fish that are trucked to the markets, but the fish are raised in water taken from the Yellow River just below the fertilizer factory. When I looked in one pond, the first thing I saw was a dead fish.

“We eat the fish ourselves,” said the village leader, Li Yuntang. “We worry about the chemicals, but we have to eat.” He said that as far as he knew, the fish had never been inspected for safety.

Now those fish from this dubious water are sold to unsuspecting residents in the city of Lanzhou. And the complexities and ambiguities about that progress offer a window into the shadings of China’s economic boom.

Microsoft kills books search engine because books aren't fun


Microsoft has announced that it is shutting down Live Search Books and Live Search Academic, two search engines that aimed to index scholarly works that are often difficult to find online. The company is also ceasing its ambitious effort to digitize library books, a project that it had long promoted as an alternative to Google's own such efforts.

The company says it "recognizes" that closing these services will "come as disappointing news" to publishers and Web searchers. And yet Microsoft says it must shut them down anyway, because letting people search through books and academic journals no longer fits into the company's business strategy.

What's that new strategy? Microsoft wants to help people who have "high commercial intent."

I am not making that up. Satya Nadella, the company's vice president for search, actually uses those words. Microsoft would simply prefer to build a search engine just for people looking to buy stuff.

Indeed, if you're buying stuff, Microsoft really wants to be your online friend. At Live Search cashback, the new search engine it unveiled this week, Microsoft even gives you cash for using its product. Why? Because you're giving away your money! And advertisers just love that!

On the other hand, if you are, inexplicably and ungratefully, simply looking for information, Microsoft wants no part of that. Why don't you go to Google or some kind of soup kitchen, you no-good freeloader?

This is heroically stupid. Seriously, is it any wonder that this company -- this company which has, for a decade now, flailed about in all its efforts online -- has found itself so outgunned by that Ph.D.-machine over in Mountain View?

I praised Live Search cashback as a brilliant effort born out of desperation, but make no mistake about the source of that desperation: Microsoft is giving people money to use its site because it has failed to get people interested in the vast majority of its online products.

So why is this? Why does MS have to pay people to search there? Why do we go to Google, for free, instead?

Of course, because early on, Google worked better. But why do we stick with Google? No small part of it is the company's brand, which is built upon its stated mission, "to organize the world's information and make it universally accessible and useful."

To be sure, Google wants to make money, and it, like Microsoft, has been fantastically successful at that. But on many of its products, Google makes no money at all.

It sees no cash in scanning library books or searching scholarly journals. Indeed, it's had to spend money: both to digitize the books and to defend a lawsuit by authors and publishers who object to its methods. (Microsoft had long promised authors that its books effort was better for them; guess not, eh?)

But Google derives enormous indirect benefits from these non-commercial projects. College students, for instance, spend endless hours on Google's Web search engine, as well as on Google Scholar and Google Books, as part of their research. Where do you suppose the students will be inclined to go, later on, when they're looking for sunglasses?

Google's willingness to spend on not-in-it-for-the-money projects also surely helps it recruit the best minds in tech. I've spoken to Googlers who joined the firm primarily because they believed in its mission.

If you were a young tech engineer and had to choose between signing up with one firm that wanted to set free all the world's information and another that wanted to focus on customers with "high commercial intent," which would you pick, all else being equal?

Yeah, you'd choose the one with a brighter future. These days, it's pretty clear which one that is.

Friday, May 23

In Praise of Liberal Guilt

Posted Thursday, May 22, 2008, at 5:34 PM ET

When did "liberal guilt" get such a bad reputation? You hear it all the time now from people who sneeringly dismiss whites who support Obama's candidacy as "guilty liberals." There are, of course, many reasons why whites might support Obama that have nothing to do with race. But what if redeeming our shameful racial past is one factor for some? Why delegitimize sincere excitement that his nomination and potential election would represent a historic civil rights landmark: making an abstract right a reality at last. Instead, their feeling must be disparaged as merely the result of a somehow shameful "liberal guilt."

If you Google "liberal guilt" and "Obama," among the nearly 32,000 hits you get are a syndicated Charles Krauthammer column under the headline "Obama's Speech Plays On Liberal Guilt"; a Mark Steyn post on the National Review Online that describes "a Democrat nominating process that's a self-torturing satire of upscale liberal guilt confusions"; a column by self-styled "crunchy con" Rod Dreher, who suggests the mainstream media coverage of Obama indicates that "liberal guilt will work [on them] like kryptonite." Even liberals make fun of liberal guilt. A couple of years ago, Salon coyly proposed supplementing the Oscars with the Liberal Guilt Awards and awarding political dramas with "Guilties."

Since when has guilt become shameful? Since when is shame shameful when it's shame about a four-centuries-long historical crime? Not one of us is a slave owner today, segregation is no longer enshrined in law, and there are fewer overt racists than before, but if we want to praise America's virtues, we have to concede—and feel guilty about—America's sins, else we praise a false god, a golden calf, a whited sepulcher, a Potemkin village of virtue. (I've run out of metaphors, but you get the picture.)

Guilt is good, people! The only people who don't suffer guilt are sociopaths and serial killers. Guilt means you have a conscience. You have self-awareness, you have—in the case of America's history of racism—historical awareness. Just because things have gotten better in the present doesn't mean we can erase racism from our past or ignore its enduring legacy.

Critics of Obama supporters who use the phrase "guilty liberal" or "liberal guilt" in a condescending, above-it-all manner suggest there's something weak about feeling guilt; they paint a trivializing, Woody Allen caricature of it.

Actually, I think it requires a kind of strength, not weakness, to face the ugly truths of history and to react to them in an honest way. "Liberal guilt" isn't a reason one must automatically support a black candidate, but that doesn't mean that liberal guilt—better defined as an awareness of the need to contend with, and overcome, a racist past—shouldn't be a factor in politics.

Of course, it's not enough just to feel guilty or to act on guilt alone. But guilt can often spur us to deal with the enduring consequences of the injustices of the past and force us not to pretend there are none.

It's especially surprising to hear "guilt" being disparaged by conservatives, since they present themselves as moralists; they are quick to decry liberals for seeking to abolish guilt over various practices conservatives deem immoral. But was slavery not immoral? For those conservatives who make a fetish of "values": Was not the century of institutionalized racism and segregation that followed the end of slavery a perpetuation of "flawed values" that the nation should feel an enduring guilt over? For those conservatives who are forever speaking of the way they value history and memory more than liberals: Should we abolish the history and memory of slavery and racism just because they're no longer legally institutionalized?

Do we abolish its memories and its effects? Do we abolish the very consciousness of the past and pretend we have a clear conscience? Pretend that on the question of racism, there is no problem anymore? America is impeccably virtuous? This sounds more like Jacobin "Year Zero" thinking than true conservatism.

What I don't understand is why there doesn't seem to be any conservative guilt over racism. Contemporary conservatives could learn from their revered godfather William F. Buckley Jr., who, early in his career at the National Review, wrote a pro-Jim Crow lead editorial—little remembered in liberal and other encomia to the man—titled "Why the South Must Prevail," in which he argued that segregation should persist even by illegal means because "the White community … for the time being … is the advanced race."

A valuable essay on this question by William Hogeland in the May/June issue of the Boston Review reminds us that even Buckley felt guilt—if not precisely "liberal guilt"—about this editorial, guilt that he expressed in a 2004 Time interview. "Have you taken any positions you now regret?" Time asked him. "Yes. I once believed we could evolve our way up from Jim Crow. I was wrong: federal intervention was necessary." Why can't conservative wiseguys (especially at the National Review) stop sneering at liberals long enough to learn from the admirable guilty wisdom of their sainted leader?

Shouldn't conservatives feel guilty about slavery and racism and the consequences thereof, or must they disdain such feelings, however moral, because they are associated with liberals? Do they choose their moral priorities because of their popularity among others? That doesn't seem like a conservative way of thinking about moral values. It sounds like a form of relativism. It's the kind of thinking that treats values as a brand identity. Guilt over racism is not part of the conservative brand identity. The more shame if that be the case.

(The conservative brand identity also doesn't have much room for opposition to sexism, another legitimate source of liberal guilt. But Hillary Clinton's problems, it seems to me, stem less from sexism than from Clintonism.)

Or could it be that conservatives disdain liberal guilt about race because they have historically more guilt to bear for the perpetuation of racism and segregation?

I'm not talking about Republicans per se. The fact that the GOP was the party of Lincoln and most strongly supported anti-lynching and anti-Jim Crow legislation in the first half of the 20th century is to its eternal credit, just as the "Southern strategy" was much to its discredit in the second half of the century. And, needless to say, liberal Democrats collaborated with a stone-cold racist wing of their party when they needed electoral votes for most of the century.

No, it's not a Democrat or Republican issue; it's a liberal and conservative issue. And there are those on the conservative side who understand that the first step to justice is an acknowledgment of guilt. Just not many and not very vocal.

This is what I don't understand about the conservative attacks on "the '60s." They willfully ignore, in their rote denunciations of the sex, drugs, and rock 'n' roll aspect of that decade, the great movement of moralists known as the civil rights movement. The movement that brought deserved honor and pride to America. The movement that may well have been motivated (among whites participating) by liberal guilt. But so what! The guilt was justified. The truly guilty were the ones who didn't feel guilt. Such as the conservative movement of the day that largely stood on the sidelines making carping arguments about states' rights that were a shamelessly transparent defense of institutionalized racism. Where's the conservative guilt about that? No wonder they ignore the civil rights movement, one of the great epochs in American history, when they demonize "the '60s."

The question of liberal guilt and guilty liberals often comes up in discussions of reactions to "black anger," unfortunately expressed most loudly and bitterly in this campaign by the Rev. Jeremiah Wright. But it's all too easy to dismiss the legitimacy of black anger merely on the basis of the Rev. Wright's sadly twisted version of it.

Do the people who dismiss black anger think there's nothing to be angry about? As a Jew, I think I have a right to be angry, still, about the Holocaust, even though it happened before I was born. It would be hard for me to understand an African-American not being angry about 400 years of murder, rape, and enslavement on the basis of race. Anger, like guilt, shouldn't be the endpoint, but anger at injustice is not illegitimate and can be a starting point, a spur to moral action. Where you end up is, alas, often a different matter.

But it seems to me that some people use the Rev. Wright's ugly expression of anger as a fig leaf to discredit Obama, who has clearly ended up at a different place from the Rev. Wright (largely due, one imagines, to the civil rights movement). Yes, Obama may well have an understanding of the Rev. Wright's anger, but if you can't see the difference between the two men historically, culturally, generationally, and temperamentally, then I'd say you just don't want to: It's a kind of willful blindness that seeks to find ways of discrediting Obama and his "guilty liberal" supporters by holding up the Rev. Wright as the true face of black anger. I think intelligent people are able to make these distinctions.

...

People who lack guilt also lack humility, which is another one of those virtues conservatives are always flogging (although not with a lot of humility).

I'm always amused, listening to the Sean Hannity radio show, how the host and caller frequently salute each other with the phrase: "You're a great American." (There's humility for you!) What's so great about being "great" if it depends on historical ignorance or denial? Again, to love America truly, one has to love the America that is and was, not a fantasy America free from flaws.

To be a truly "great American," one doesn't have to be a guilty liberal, but one has to know guilt.

Ron Rosenbaum is the author of The Shakespeare Wars and Explaining Hitler.

Monday, May 5

Libertarian Paternalism

The Chronicle of Higher Education

The New Paternalism

An economist and a legal scholar argue that policy makers should nudge people into making good decisions

"You see that?" Richard H. Thaler asks as we ride down picturesque Lake Shore Drive in Chicago. Thaler knows the route well. He travels it every day on his commute home from the University of Chicago Graduate School of Business, where he is a professor of behavioral science and economics. At the moment, he is excitedly jabbing his finger toward an approaching curve in the road, telling me that it is the scene of numerous accidents caused by drivers who fail to sufficiently reduce their speed. Then he directs my attention to a grid of lines that appear on the road ahead of us: Evenly spaced at first, as we near the apex of the curve, the lines begin to bunch closer together, which makes us feel like we are speeding up.

As Thaler taps the brakes and gently steers into the bend, he explains how the tightly spaced lines trigger an instinct that causes drivers to slow down. With evident glee, he notes that Chicago is effectively exploiting — to society's benefit — one of the many ways in which human perception is flawed. Or, as Thaler puts it, drivers are being "nudged" toward safety.

What does a peculiar pattern on the road have to do with fixing the nation's health-care woes, protecting the environment, resolving the thorny issue of gay marriage, and increasing donations to charity? Everything, according to Thaler and Cass R. Sunstein, a professor of law and political science at the University of Chicago. They are authors of a new book, Nudge: Improving Decisions About Health, Wealth, and Happiness (Yale University Press), in which they articulate an approach to designing social and economic policies that incorporates an understanding of people's cognitive limitations.

They call this governing philosophy "libertarian paternalism." That is not an oxymoron, they insist in their book. Rather it is a corrective to the longstanding assumption of policy makers that the average person is capable of thinking like Albert Einstein, storing as much memory as IBM's Big Blue, and exercising the willpower of Mahatma Gandhi. That is simply not how people are, they say. In reality human beings are lazy, busy, impulsive, inert, and irrational creatures highly susceptible to predictable biases and errors. That's why they can be nudged in socially desirable directions.

A nudge is thus any noncoercive alteration in the context in which people make decisions. The libertarian paternalism behind it is rooted in Thaler's lifelong fascination with the power of small, seemingly innocuous details — the arrangement of food in a cafeteria, the drawing of a small fly in the bowl of a urinal, a pattern of lines on the road — to influence people's behavior. David Laibson, a professor of economics at Harvard University, says that Thaler's ideas, once a cry in the wilderness, are so influential that "about half of the profession now believes that psychology has a useful role to play in economic modeling, and that number is growing."

"Cass is always about 10 minutes late," Thaler tells me as we slide into a well-worn booth at Noodles Etc., a modestly priced Pan-Asian restaurant on the periphery of Chicago's Gothic stone campus. As we thumb through our menus, he predicts that Sunstein will order the same thing he always orders: tofu salad. Thaler should know. He and Sunstein have maintained a standing weekly lunch date at Noodles for the past few years as they sketched out the ideas in Nudge. (The restaurant is duly thanked in the book's acknowledgments.)

Ten minutes later, Sunstein appears, loping toward the table, a sheepish grin on his face. He blames his tardiness on the dearth of parking in the neighborhood, sits down, and promptly orders the tofu salad. Thaler opts for a heaping dish of chicken, noodles, and broccoli.

Sunstein explains the appeal of libertarian paternalism: "For too long, the United States has been trapped in a debate between the laissez-faire types who believe markets will solve all our problems and the command-and-control types who believe that if there is a market failure then you need a mandate." That debate has been exhausted, he says.

"The laissez-faire types are right that … government can blunder, so opt-outs are important," he says. "The mandate types are right that people are fallible, and they make mistakes, and sometimes people who are specialists know better and can steer people in directions that will make their lives better."

Sunstein argues that understanding human irrationality can improve how public and private institutions shape policy by increasing the likelihood that people will make decisions that are in their own self-interest. Most important, he and Thaler insist, such nudges can be executed while protecting freedom of choice.

Take two examples in their book. Studies show that placing fruit at eye level in school cafeterias enhances its popularity by as much as 25 percent. Or consider this stroke of creativity by an economist in Amsterdam charged with cleaning up the restrooms at the Schiphol Airport: He had a fly etched into the wells of urinals, giving male patrons something to aim at. Spillage was reduced by 80 percent. The problems of childhood obesity and foul restrooms are remedied with very little inconvenience to people — or cost. Children remain free to grab that piece of chocolate cake, and there is nothing preventing visitors to Schiphol's restrooms from ignoring the fly and aiming elsewhere. It is merely less likely that either group will do so.

"Nudges are inevitable, so they might as well be smart," Sunstein says with a grin. The inevitability — and potential — of nudges is most clear when it comes to default options. For example, 401(k) employee-savings plans generally have an opt-in design, meaning that when employees become eligible to participate, the onus is on them to join. Many will procrastinate — even though it is usually in their best interest not to. According to Sunstein and Thaler, that inertia can be harnessed. They suggest that companies adopt automatic enrollment for 401(k) programs, pointing to studies that show how doing so significantly increases levels of employee participation. And, they stress, because there is still an opt-out, people aren't forced to do anything against their will.

Thaler has spent his career thinking about how people make decisions. As a Ph.D. candidate in economics at the University of Rochester in the early 1970s, he was struck by numerous anomalies that conflicted with the traditional assumption that people act rationally in pursuit of self-interest. He found, for instance, a strong tendency to go along with the status quo, or default option, even when it makes little sense. In addition, people are predictably overoptimistic, and they care twice as much about losing money as they do about gaining it. They are more fearful of unlikely threats like a nuclear-power accident than they are of something far more probable, like a car accident. Thaler began making a list of such anomalies, which he posted on a wall in his office. Those insights became the building blocks of behavioral economics, which he was instrumental in creating.

Thaler's anomalies were a direct affront to neoclassical economics, which holds that if people are left alone, they will efficiently maximize personal gain. His apostasy was not looked upon kindly by some economists. When he arrived at Chicago in 1995, after 17 years at Cornell University, he found that the Nobel Prize-winning economist Merton Miller refused to talk to him. (I could not help but smile when I recently attended one of Thaler's lectures and noticed a giant portrait of Miller in the hallway outside the classroom.)

"The first paper I wrote on behavioral economics was rejected by five journals," Thaler recalls in his office. "Part of the problem was that I didn't have any role models, so I wasn't sure what to do." The business school is housed in a sleek new building that is all sharp angles and glass. Thaler has a corner office with dramatic floor-to-ceiling windows. It is the first day of the semester, and there are cardboard boxes strewn about the room.

Everything changed in 1976, when Thaler came across an article in Science by two psychologists, Amos Tversky and Daniel Kahneman. The article argued that rules of thumb — or mental shortcuts — lead to systematic errors or biases. "When I read this paper I could hardly contain myself," Thaler wrote in a short autobiographical essay years later. "This concept is what was necessary to make the psychology of decision making relevant for economics." He began developing his theories and, in 1987, sharing them in a widely read column — aptly called "Anomalies" — for The Journal of Economic Perspectives. In 1992 those columns were collected in The Winner's Curse: Paradoxes and Anomalies of Economic Life (Free Press).

Not everyone was cold to Thaler's arrival in Hyde Park. Within a week, Sunstein introduced himself. The two had not previously met, but Sunstein described himself as an admirer. "I have always been interested in departure from rationality," Sunstein says, adding that he found Thaler's work "phenomenal."

The two struck up a friendship that soon evolved into a collaboration. In 1998 they published a highly influential article, "A Behavioral Approach to Law and Economics," in the Stanford Law Review. (The article was written with Christine Jolls, now a Yale Law School professor.) Brian Leiter, who will leave the University of Texas School of Law for Chicago's law faculty in the fall, credits that article with stimulating a "huge industry" of new legal research.

The field's credibility was further solidified in 2002, when Kahneman, an emeritus professor of psychology at Princeton University, won the Nobel in economic science "for having integrated insights from psychological research into economic science, especially concerning human judgment and decision making under uncertainty." Since then, behavioral economics has made a rapid march from heresy to near-orthodoxy, especially among scholars 40 years old or younger. (Thaler gleefully cites that as proof that his notoriety stems in large part from his ability to "corrupt the youth.")

Edward L. Glaeser, a professor of economics at Harvard University, who edits The Quarterly Journal of Economics, says that there is no longer a market for just another anomaly paper demonstrating that people don't act in a hyperrational manner. "We got it," he says, laughing, "which is another way of saying that Thaler has won."

Tyler Cowen, a professor of economics at George Mason University, considers Thaler's work worthy of a Nobel Prize. "While I wouldn't call him a shoo-in, I would say he is favored to win the prize someday," Cowen writes in an e-mail message.

Sunstein and Thaler make for an odd couple. Thaler, 62, is exuberant, sounds a bit like the comedian Al Franken, and possesses a seemingly endless stockpile of entertaining stories and anecdotes. Befitting his casual demeanor, he wears open-necked shirts, often with rolled-up sleeves, and he exudes a general sense of comfort in his own skin. His youthful, fleshy face is topped by a pile of tousled salt-and-pepper hair. (He remains both perplexed and amused at being described as "leonine" in The New York Times a few years ago.)

Sunstein, on the other hand, is lean, with a high, narrow forehead. And though he is wearing a suit and tie, he has the slightly disheveled appearance of a man who spends a lot of time alone in a room writing. His eyes dance around frantically while he talks, contrasting with his slow, deliberate cadence. Leiter describes Sunstein, 53, as "almost embarrassingly self-effacing," adding, "He is just an unusually nice guy, and that is not necessarily the norm among successful academics."

At lunch, when I ask Sunstein and Thaler about their collaborative process, Sunstein describes it as "incredibly fun." Shooting a smile in Thaler's direction, he says, "I, of course, worked harder and produced more." Thaler chuckles and nods his head.

To be sure, few people manage to produce more than Sunstein, who publishes at a daunting rate across numerous fields — constitutional law, environmental law, behavioral law and economics, and family law. He is also an authority on gay rights, animal rights, the Internet's impact on democracy, and more. According to the Library of Congress catalog, he is the author or editor of nearly 30 books. Elena Kagan, dean of Harvard Law School, whose faculty Sunstein will join in the fall, has described him as "the pre-eminent legal scholar of our time — the most wide-ranging, the most prolific … the most influential."

Indeed, a 2007 study by Leiter found that Sunstein is, by far, the most-cited full-time legal scholar. In 2006 two law professors at Vanderbilt University used network theory to show that Sunstein is the "central legal academic." Running calculations to determine other scholars' degrees of separation from the center of the legal academy, they memorably dubbed that figure the "Sunstein number."

"Cass can write very fast," Thaler says. "It's funny, when he gets under stress, when something goes wrong in his life, he writes. There were some stressful periods during the writing of the book when he sent me like five chapters in rapid succession."

Thaler contrasts that with his own style. "I've never even written a book out of whole cloth," he says. "I've merely stapled together four" (meaning edited volumes of his essays).

When I ask Sunstein if he ever worries that he publishes too much, he acknowledges that he has heard that criticism "secondhand," but he insists that he doesn't publish everything he writes (though he says he is tempted). "Cass has certain rules," Thaler interjects, like "longer is better." Sunstein was excited that the first draft of Nudge was more than 100,000 words; Thaler was excited that it was cut to 80,000.

In 2004, Thaler and his wife were invited to a fund-raising event that a neighbor was holding for a Democratic candidate for a U.S. Senate seat: Barack Obama. The Thalers had never heard of him. At the time, Obama was running third among Democrats in the heavily contested Illinois primary. "We were blown away by the guy," Thaler recalls. "The next day I called Cass and said, 'I met this Obama guy. You know, he seems like the real deal.' And Cass, in his typical way, said, 'The first day Obama is in the Senate, he will be the most-qualified guy in the Senate.'"

While Sunstein and Thaler are both ardent supporters of Obama's presidential campaign, it is hard to precisely decipher each man's role within the Obama orbit. But by all accounts — except their own — they are major figures. The New Republic recently dubbed Thaler the "in-house intellectual guru" of Obama's policy shop. When I ask Thaler about it, he rolls his eyes and calls the characterization "a bit of an overstatement."

He describes his role as an adviser as "very informal," saying that at the start of Obama's campaign, he served on a committee counseling the candidate on Social Security. Since then, "I occasionally get calls from his policy people," he says. Thaler chalks up whatever influence he might have to his close relationship with Austan Goolsbee, a professor of economics at Chicago's business school who is now Obama's chief economic adviser. "Cass is closer to the Obama camp than I am. He and Barack are very good friends," Thaler says.

Sunstein's relationship with Obama dates back to 1992, when Obama began teaching part time at Chicago's law school, something he continued doing until his election to the Senate in 2004. (Sunstein tried to persuade Obama to join the faculty full time on numerous occasions, but Obama declined.)

When I ask Sunstein about the candidate, however, he becomes noticeably wary. In fact, when I first ask him and Thaler for permission to record my interview, they agree, but Sunstein jokes that he has nothing but nice things to say about "a certain senator from New York." The two men then begin to heap over-the-top praise on everyone, including John McCain and President Bush. They succeed in cracking each other up. The shtick is a clear reference to the trouble that the onetime Obama adviser Samantha Power, a professor of public policy at Harvard University, got herself into when she was quoted calling Hillary Rodham Clinton a "monster." (Sunstein and Power are dating.)

So has Obama read Nudge? Would a President Obama be the first nudger in chief? Thaler and Sunstein sent advance copies to several members of Obama's inner circle, including a copy for the candidate. But they assume he is more than a little too busy to read much these days. Furthermore, they are eager to portray libertarian paternalism as a bipartisan philosophy. On many issues, including environmental protection, family law, and school choice, they argue for less government coercion. "If incentives and nudges replace requirements and bans, government will be both smaller and more modest," they write. "We are not for bigger government, just for better governance."

Sunstein is "elated" that their first joint appearance to promote the book was held at the American Enterprise Institute, a conservative think tank in Washington, where they received a very warm reception. (Thaler was in peak form, delivering an extemporaneous 30-minute talk that was one part policy lecture and one part stand-up comedy routine.) As evidence of libertarian paternalism's broad appeal, they point to the 2006 Pension Protection Act, which incorporated principles of behavioral economics. Those companies that moved to automatic enrollment for pensions were rewarded with a waiver that freed them from some burdensome paperwork. The law received support from an unlikely coalition of politicians, including conservative Republicans like Sen. Robert F. Bennett of Utah and Rick Santorum, then a senator from Pennsylvania, as well as liberal Democrats like Rep. Rahm Emanuel of Illinois.

"It made me think that these ideas really could bridge the political divide," says Thaler.

Evan R. Goldstein is a staff editor at The Chronicle Review.

http://chronicle.com
Section: The Chronicle Review
Volume 54, Issue 31, Page B8

Sunday, May 4

The All-White Elephant in the Room

By FRANK RICH

BORED by those endless replays of the Rev. Jeremiah Wright? If so, go directly to YouTube, search for “John Hagee Roman Church Hitler,” and be recharged by a fresh jolt of clerical jive.

What you’ll find is a white televangelist, the Rev. John Hagee, lecturing in front of an enormous diorama. Wielding a pointer, he pokes at the image of a woman with Pamela Anderson-sized breasts, her hand raising a golden chalice. The woman is “the Great Whore,” Mr. Hagee explains, and she is drinking “the blood of the Jewish people.” That’s because the Great Whore represents “the Roman Church,” which, in his view, has thirsted for Jewish blood throughout history, from the Crusades to the Holocaust.

Mr. Hagee is not a fringe kook but the pastor of a Texas megachurch. On Feb. 27, he stood with John McCain and endorsed him over the religious conservatives’ favorite, Mike Huckabee, who was then still in the race.

Are we really to believe that neither Mr. McCain nor his camp knew anything then about Mr. Hagee’s views? This particular YouTube video — far from the only one — was posted on Jan. 1, nearly two months before the Hagee-McCain press conference. Mr. Hagee appears on multiple religious networks, including twice daily on the largest, Trinity Broadcasting, which reaches 75 million homes. Any 12-year-old with a laptop could have vetted this preacher in 30 seconds, tops.

Since then, Mr. McCain has been shocked to learn that his clerical ally has made many other outrageous statements. Mr. Hagee, it’s true, did not blame the American government for concocting AIDS. But he did say that God created Hurricane Katrina to punish New Orleans for its sins, particularly a scheduled “homosexual parade there on the Monday that Katrina came.”

Mr. Hagee didn’t make that claim in obscure circumstances, either. He broadcast it on one of America’s most widely heard radio programs, “Fresh Air” on NPR, back in September 2006. He reaffirmed it in a radio interview less than two weeks ago. Only after a reporter asked Mr. McCain about this Katrina homily on April 24 did the candidate brand it as “nonsense” and the preacher retract it.

Mr. McCain says he does not endorse any of Mr. Hagee’s calumnies, any more than Barack Obama endorses Mr. Wright’s. But those who try to give Mr. McCain a pass for his embrace of a problematic preacher have a thin case. It boils down to this: Mr. McCain was not a parishioner for 20 years at Mr. Hagee’s church.

That defense implies, incorrectly, that Mr. McCain was a passive recipient of this bigot’s endorsement. In fact, by his own account, Mr. McCain sought out Mr. Hagee, who is perhaps best known for trying to drum up a pre-emptive “holy war” with Iran. (This preacher’s rantings may tell us more about Mr. McCain’s policy views than Mr. Wright’s tell us about Mr. Obama’s.) Even after Mr. Hagee’s Catholic bashing bubbled up in the mainstream media, Mr. McCain still did not reject and denounce him, as Mr. Obama did an unsolicited endorser, Louis Farrakhan, at the urging of Tim Russert and Hillary Clinton. Mr. McCain instead told George Stephanopoulos two Sundays ago that while he condemns any “anti-anything” remarks by Mr. Hagee, he is still “glad to have his endorsement.”

I wonder if Mr. McCain would have given the same answer had Mr. Stephanopoulos confronted him with the graphic video of the pastor in full “Great Whore” glory. But Mr. McCain didn’t have to fear so rude a transgression. Mr. Hagee’s videos have never had the same circulation on television as Mr. Wright’s. A sonorous white preacher spouting venom just doesn’t have the telegenic zing of a theatrical black man.

Perhaps that’s why virtually no one has rebroadcast the highly relevant prototype for Mr. Wright’s fiery claim that 9/11 was America’s chickens “coming home to roost.” That would be the Sept. 13, 2001, televised exchange between Pat Robertson and Jerry Falwell, who blamed the attacks on America’s abortionists, feminists, gays and A.C.L.U. lawyers. (Mr. Wright blamed the attacks on America’s foreign policy.) Had that video re-emerged in the frenzied cable-news rotation, Mr. McCain might have been asked to explain why he no longer calls these preachers “agents of intolerance” and chose to cozy up to Mr. Falwell by speaking at his Liberty University in 2006.

None of this is to say that two wacky white preachers make a Wright right. It is entirely fair for any voter to weigh Mr. Obama’s long relationship with his pastor in assessing his fitness for office. It is also fair to weigh Mr. Obama’s judgment in handling this personal and political crisis as it has repeatedly boiled over. But whatever that verdict, it is disingenuous to pretend that there isn’t a double standard operating here. If we’re to judge black candidates on their most controversial associates — and how quickly, sternly and completely they disown them — we must judge white politicians by the same yardstick.

When Rudy Giuliani, still a viable candidate, successfully courted Pat Robertson for an endorsement last year, few replayed Mr. Robertson’s greatest past insanities. Among them is his best-selling 1991 tome, “The New World Order,” which peddled some of the same old dark conspiracy theories about “European bankers” (who just happened to be named Warburg, Schiff and Rothschild) that Mr. Farrakhan has trafficked in. Nor was Mr. Giuliani ever seriously pressed to explain why his cronies on the payroll at Giuliani Partners included a priest barred from the ministry by his Long Island diocese in 2002 following allegations of sexual abuse. Much as Mr. Wright officiated at the Obamas’ wedding, so this priest officiated at (one of) Mr. Giuliani’s. Did you even hear about it?

There is not just a double standard for black and white politicians at play in too much of the news media and political establishment, but there is also a glaring double standard for our political parties. The Clintons and Mr. Obama are always held accountable for their racial stands, as they should be, but the elephant in the room of our politics is rarely acknowledged: In the 21st century, the so-called party of Lincoln does not have a single African-American among its collective 247 senators and representatives in Washington. Yes, there are appointees like Clarence Thomas and Condi Rice, but, as we learned during the Mark Foley scandal, even gay men may hold more G.O.P. positions of power than blacks.

A near half-century after the civil rights acts of the 1960s, this is quite an achievement. Yet the holier-than-thou politicians and pundits on the right passing shrill moral judgment over every Democratic racial skirmish are almost never asked to confront or even acknowledge the racial dysfunction in their own house. In our mainstream political culture, this de facto apartheid is simply accepted as an intractable given, unworthy of notice, and just too embarrassing to mention aloud in polite Beltway company. Those who dare are instantly accused of “political correctness” or “reverse racism.”

An all-white Congressional delegation doesn’t happen by accident. It’s the legacy of race cards that have been dealt since the birth of the Southern strategy in the Nixon era. No one knows this better than Mr. McCain, whose own adopted daughter of color was the subject of a vicious smear in his party’s South Carolina primary of 2000.

This year Mr. McCain has called for a respectful (i.e., non-race-baiting) campaign and has gone so far as to criticize (ineffectually) North Carolina’s Republican Party for running a Wright-demonizing ad in that state’s current primary. Mr. McCain has been posing (awkwardly) with black people in his tour of “forgotten” America. Speaking of Katrina in New Orleans, he promised that “never again” would a federal recovery effort be botched on so grand a scale.

This is all surely sincere, and a big improvement over Mitt Romney’s dreams of his father marching with the Rev. Dr. Martin Luther King Jr. Up to a point. Here, too, there’s a double standard. Mr. McCain is graded on a curve because the G.O.P. bar is set so low. But at a time when the latest Wall Street Journal-NBC News poll shows that President Bush is an even greater drag on his popularity than Mr. Wright is on Mr. Obama’s, Mr. McCain’s New Orleans visit is more about the self-interested politics of distancing himself from Mr. Bush than the recalibration of policy.

Mr. McCain took his party’s stingier line on Katrina aid and twice opposed an independent commission to investigate the failed government response. Asked on his tour what should happen to the Ninth Ward now, he called for “a conversation” about whether anyone should “rebuild it, tear it down, you know, whatever it is.” Whatever, whenever, never mind.

For all this primary season’s obsession with the single (and declining) demographic of white working-class men in Rust Belt states, America is changing rapidly across all racial, generational and ethnic lines. The Census Bureau announced last week that half the country’s population growth since 2000 is due to Hispanics, another group understandably alienated from the G.O.P.

Anyone who does the math knows that America is on track to become a white-minority nation in three to four decades. Yet if there’s any coherent message to be gleaned from the hypocrisy whipped up by Hurricane Jeremiah, it’s that this nation’s perennially promised candid conversation on race has yet to begin.

Thursday, May 1

The Opt-Out Revolution Revisited

By Joan C. Williams
The American Prospect, March 5, 2007

Women aren't forsaking careers for domestic life. The ground rules just make it impossible to have both.

"I was tired of juggling. I was tired of feeling guilty. I was tired of holding the household reins in one hand. So I quit."

On the cover of The New York Times Magazine for October 26, 2003, a classy-looking white woman with long, straight hair sits serenely with her baby, ignoring the ladder that climbs behind her. "Why Don't More Women Get to the Top?" asks the headline. "They Choose Not to."

Inside, Times columnist Lisa Belkin reported on interviews with eight women who graduated from Princeton and a handful of others, three of them with MBAs. All are "elite, successful women who can afford real choice," Belkin acknowledges, yet the Magazine does not evince any hesitation about making generalizations about "women" based on this group's decisions - to use Belkin's phrase - to "opt out."

Belkin's piece shifted the cultural frame for understanding women's workforce participation. Prior to her article, coverage typically focused on women who had "dropped out" - left the workforce altogether. A key insight of Belkin's was that many women who remain employed nonetheless step off the fast track, working part time, as independent contractors, or full time on the "mommy track." Belkin lumped these women with stay-at-home moms as evidence that many women who had not "dropped out" had, nonetheless, "opted out" of the fast track.

Belkin's success in naming and framing reshaped and refreshed a well-entrenched story line: that women are returning home as a matter of choice, the result of an internal psychological or biological "pull" rather than a workplace "push." This has been the interpretation of choice at The New York Times for more than half a century. The Times has been announcing and re-announcing the "opt-out" trend since 1953 - when it published the quote used as the epigraph above.

This article presents the highlights of a full-length report analyzing 119 news stories printed between 1980 and 2006. The report demonstrates that many mothers do not opt out, but are instead pushed out by workplace inflexibility, failures of public policy, and workplace bias. For evidence, we need look no further than Belkin's original article. The first woman discussed is Sally Sears, a former TV anchorwoman. Sears took nine years to quit, and "she did so with great regret." "I would have hung in there, except the days kept getting longer and longer," Sears explained. "My five-day, 50-hour week was becoming a 60-hour week."

So she quit, recognizing she lacked the fire in her belly, right? No, actually. Sears tried unsuccessfully to negotiate a part-time schedule. "They said it was all or nothing ... It was wrenching for me to leave Channel 2 ... I miss being the lioness in the newsroom ... [and i]t kills me that I'm not contributing to my 401(k) anymore." (This reference to the economic vulnerability of women who opt out is never followed up.)

In fact, the same all-or-nothing employer who refused to let Sears work part time later offered her part-time work - but without benefits, with no future, and at a much lower pay rate. The real message of Sears' story is that, despite her talents, she ended up doing virtually the same work she had always done, but in a low-paid, dead-end job. A more accurate headline for her story: "Talented Mother Pushed Out of a Good Job Into a Bad One; Economic Vulnerability Results."

The Opt-Out Story Line Exposed

The central thrust of the opt-out story line is that women are "getting real" about their limitations and realizing that their values were more traditional than they thought, leading them to forgo careers in favor of traditional motherhood. This story line has several major weaknesses.

In nearly three-fourths (73 percent) of the newspaper stories we analyzed, the overall tone was one of pulls rather than pushes - women following the pull toward home, with little mention of how the workplace pushes them out. Yet in a 2004 study by Pamela Stone and Meg Lovejoy, 86 percent of highly qualified women surveyed said work-related reasons, including workplace inflexibility, were key considerations in their decisions to quit. Only 6 percent of the articles we reviewed identified workplace pushes as key reasons why women left work.

The opt-out story line also discusses work/family conflict predominantly as an issue for professional women. More than half (58 percent) of the women discussed in opt-out stories in The New York Times were in high-status or other traditionally masculine white-collar jobs; the number spiked to 100 percent in The Washington Times. This picture is misleading. Only about 8 percent of U.S. women hold high-level and other traditionally masculine jobs. And data shows that highly educated mothers are more - not less - likely to remain in the labor force than other women.

Distorted newspaper coverage can distort public policy. When I was talking with a Capitol Hill staffer several years ago, she told me that her office was not interested in public policies to help Americans balance work and family. "My boss is not interested in the problems of professional women," she said - a misconception taken straight from the American press.

In addition, such stories often paint an unrealistically rosy picture about women's chances of picking up their careers where they left off. More than one-third of the articles we reviewed explicitly adopt the view that women are being "realistic" when they recognize that they cannot "have it all" (i.e., what men have always had: both families and careers). In fact, it is opt-out articles that are unrealistic about women's chances of opting back in. "My degree is my insurance policy," one of Belkin's interviewees states.

But this is an illusion. In a 2005 study for the Wharton Center for Leadership and Change, Monica McGrath and her coauthors surveyed 130 highly qualified women who had spent at least two years away from work. They found that, while 70 percent of those surveyed reported feeling positive about their decisions to leave the labor force, 50 percent felt "frustrated" when they tried to return to work, and 18 percent became "depressed." Some respondents reported that employers interviewed them as if they had no work experience at all. More than one-third (36 percent) thought they might have to take a lower-level position than they had left. One particularly frustrated respondent said she was thinking of taking her MBA off her résumé. Said another, "Be prepared for the realization that in the business world your stepping-out time counts for less than zero ... [and] may make potential employers think you are not as reliable as other applicants."

Another 2005 study by Christy Spivey published in the Industrial and Labor Relations Review found that women experienced a significant negative effect on wages even 20 years after a career interruption. This is a message that young women are not getting; the press is not telling them.

Women's Economic Jeopardy

The opt-out story line presents the economic impact of mothers leaving the workforce as a short-term picture of giving up luxuries. No major paper would cover unemployment by having a reporter interview a handful of well-heeled acquaintances and muse on a personal period of unemployment. The idea is ludicrous; unemployment is a serious economic issue - except, in U.S. papers, the unemployment of mothers.

By contrast, in its April 2006 article entitled "A Guide to Womenomics," the British magazine The Economist decried the fact that "women remain the world's most under-utilized resource." "To make full use of their national pools of female talent," the article stated, "governments need to remove obstacles that make it hard for women to combine work with having children" - for example, by providing "parental leave and child care, allowing more flexible working hours, and reforming tax and social-security systems that create disincentives for women to work."

Meanwhile, only 12 percent of the U.S. news articles we surveyed discuss the negative economic impact of that loss of talent. Instead, upbeat opt-out stories feature a steady diet of interviews with women after they opt out, and before any of them divorce, in which affluent women explain how they made ends meet by giving up expensive vacations, shopping sprees, and dining out. This story hardly describes the typical American family. Given that American women bring home an average of 28 percent of the family income, most families cannot make ends meet after a mother stops working simply by giving up luxuries.

Very few of the 119 articles we surveyed linked women's opting out with long-term economic vulnerability. In a society in which "displaced homemakers'" incomes fall very sharply after divorce, only two out of 119 articles mentioned any divorced women. Yet, if past trends are any indication, close to half of opt-out women will end up in this position, with not only their own long-term economic futures in jeopardy, but also those of their children (who are statistically less likely than other children to reach the education level or class status of their fathers).

The Great American Speed-Up

A final shortcoming of the opt-out story line is its failure to acknowledge the impact of new, sped-up ideals of mothering and of work in America. Nearly two-thirds (64 percent) of the articles in our survey refer to women's return to "traditional" roles. Yet recent studies report that much of what these "new traditionalist" mothers stay home to do is not traditional. Sociologists Annette Lareau and Sharon Hays have documented the rise of an "ideology of intensive mothering" in professional and managerial families - the belief that each child needs to be driven to countless practices, play dates, tutoring, and other enrichment activities.

Newspapers' confident assertions of "new traditionalism" also erase the newness of the all-or-nothing workplace. What many opt-out women are rejecting is not work per se, but "extreme jobs." Americans work longer hours than in virtually any other industrialized country, and American men's working hours have risen so sharply since 1980 that nearly 40 percent of college-educated men now work 50-plus hours a week.

The Great American Speed-Up at Work - like the Great American Speed-Up at Home - is not traditional. In one opt-out article we reviewed (printed in The Union Leader of Manchester, New Hampshire, in 1995), a mother who left a corporate job for freelance work told the paper that her guiding principle was, "I like to be with my family for dinner." This is a goal virtually any Company Man of the 1950s could attain.

Accurate Stories the Press Should Report

Newspapers could easily replace the eternal drone of opt-out stories with three new story lines - ones that more accurately reflect real data.

First, newspapers can describe how American women are pushed out of good jobs by workplace inflexibility. The American economy has plenty of good long-hours jobs, but part-time jobs tend to be hard to find, dead-end, and low-paid. "I felt like I threw away my career with the placenta," said one lawyer who returned to work part time after giving birth. The economic penalty associated with part-time work is much harsher in the United States than in Europe. Women who work part time here earn 21 percent less per hour than full-timers, a penalty seven times higher than in Sweden and more than twice as high as in the U.K. On average, people who work 44 hours per week in the United States earn more than twice what those working 34 hours per week earn, according to Warren Farrell in a 2005 New York Times editorial. All this tends to drive professional and managerial couples into neotraditional roles, as the "parenting vacuum" produced by husbands' absence is filled by women who opt out. "He has always said to me, 'You can do whatever you want to do,'" says one respondent in Stone and Lovejoy's 2004 study. "But he's not there to pick up any load."

In working-class families, parents tend to "tag-team": mom works one shift and dad works a different shift, minimizing the need for child care, which is expensive and hard to find. Yet, when combined with workplace inflexibility, tag-teaming can be problematic. If either parent is ordered to work overtime, the family has to choose between mom's job and dad's job - in a context in which the family usually needs both jobs to make ends meet. Similar problems arise if a child or elderly relative is sick. Many working-class families are one sick child away from being fired (the title of my report on working-class families). Not surprisingly, given that many tag-team couples rarely see each other awake, tag-team families also have sky-high divorce rates - three to six times the national average.

A second alternative story line for the press is how the failures of U.S. public policy force many women out of the workforce. Nearly one-quarter (23 percent) of the articles we surveyed mention that the high cost and/or the low quality of child care drove mothers out of work. Are these stories about women getting what they want, or stories about the systematic de-skilling of American women (many of them educated at public expense) due to a lack of supports for working families? Lack of adequate non-family child care plays a central role in driving American women out of the workforce and into economic vulnerability. That's why most other industrialized countries have been so attentive to creating systems that provide families with both good options for non-family child care and workplace flexibility so workers can care for their own children. Few, if any, articles in American newspapers mention the role of U.S. public policy in creating the unattractive choices that cause many mothers to leave work.

A third new story line is one explaining how many women leave the workforce because of workplace bias against mothers. Perhaps the most damaging part of the opt-out story line is that it excuses gender discrimination under the rhetoric of "choice." Lawsuits brought by women who hit the maternal wall offer vivid, compelling stories of women who did not opt out; they were pushed out by stereotyping and gender discrimination. Take Shireen Walsh, a top salesperson with outstanding reviews who encountered a sharp change in working conditions after she returned from maternity leave. Her first week back at work, her supervisor told her to stop disrupting the office when she showed her baby pictures. Her hours were closely scrutinized, although, as is common in off-site sales jobs, her coworkers' were not. When Walsh had to leave to take her son, who had persistent ear infections, to the doctor, she was required to sign in and out and to make up every minute of work she missed, despite a policy allowing for unlimited sick leave. Her supervisor threw a phone book at her, telling her to find a pediatrician open after business hours. In her 2003 case against her employer, National Computer Systems, Inc., a federal court upheld a jury verdict of $625,000.

The Center for WorkLife Law has identified more than 800 cases of "family responsibilities discrimination" or FRD. Such cases increased nearly 400 percent during the last decade. FRD plaintiffs are more likely to win than employment discrimination plaintiffs generally, in part because lawyers litigate these as family values cases. Potential liability is substantial: More than 75 cases have involved verdicts or settlements of $100,000 or more, with the highest individual verdict at $11.65 million and the highest class recovery at $49 million.

The argument that women opt out rests on the assumption that we are talking about mothers' choices, not systemic discrimination. Yet choice and discrimination are not mutually exclusive: Consider the "don't ask, don't tell" policy. Under it, gay soldiers do have a choice - they can remain closeted and keep their jobs, or they can come out and get fired. Yet their choices, like mothers' choices, occur within the context of discrimination. Many mothers quit when their careers stall after they are told that mothers belong at home with their children; or when they find that disabled men are offered light duty but pregnant women are not; or when the quality of their assignments declines sharply when they return from maternity leave; the examples go on and on.

News stories about family responsibilities discrimination have spiked in the months since the Center for WorkLife Law released a July 2006 report on the rise of FRD litigation. One was an article by Lisa Belkin, the very reporter who coined the language of opting out, in which she described FRD (and dubbed it "Fred"). Recent press interest may signal receptiveness to new story lines around women and employment. Let's hope so.

Joan C. Williams is Distinguished Professor of Law and director of the Center for WorkLife Law at the University of California, Hastings College of the Law. This article is based on her full-length report, "'Opt Out' or Pushed Out?: How the Press Covers Work/Family Conflict."


----

"This study is full of surprises, such as the finding that many women feel pushed out of the workplace."–Working Mother

"A deft analysis."–Phyllis Moen, Science (aaas)

"Adds some much-needed substance. Rather than arguing that mothers should or should not stay home, Stone, a sociologist at the City University of New York, describes the complex reasons that 54 women left their high-powered positions after having children and how their lives proceeded from that decision. Opting Out? shows how a mix of forces conspired to nudge women out of their careers, despite the fact that most originally intended to stay in them. . . . It's refreshing to read a balanced take on the upsides and downsides to staying home with kids."–Salon

"Fascinating and surprising . . . Stone's revealing study adds an important counterpoint to Leslie Bennetts's forthcoming The Feminine Mistake."–Publishers Weekly

"Provocative, superbly researched."–BusinessWeek
"With insight and compassion, Pamela Stone shows convincingly that, far from representing a return to tradition, the decision of some women to relinquish high-powered careers is a reluctant and conflict-ridden response to the growing mismatch between privatized families and time-demanding jobs. By charting the institutional obstacles and cultural pressures that continue to leave even the most advantaged women facing impossible options, Opting Out? gets beneath the hype and offers the real story behind the misleading headlines. This groundbreaking study is required reading for anyone who cares about the fate of families, work, and gender equality in contemporary America."–Kathleen Gerson, author of Hard Choices: How Women Decide About Work, Career, and Motherhood

"Pamela Stone's Opting Out? is a creative and beautifully written addition to the burgeoning scholarly and popular literature on work and family. Stone gives voice to those elite career women–the 'best and the brightest'–who have returned home to raise their kids. She creatively unpacks these women's 'choices,' describing both the 'pulls' of family life but also the labor market 'pushes.' Opting Out? is a fully nuanced portrait of women (and their husbands) struggling to make important life decisions in a culture that often provides only simplistic zero-sum alternatives: mom or worker, even though most women are already working moms. Women want alternative visions of working motherhood, yet are often stymied by outmoded workplace models (and firms and managers) insensitive to the concerns of working families. Stone's work challenges our organizational leaders and policy makers to do better, for women, but also more generally for working families, workplace organizations, and society as a whole."–Patricia A. Roos, Rutgers University

"Pamela Stone has listened to women with high powered careers now at home with their kids as no one else has. Bringing an open mind and equal parts sympathy and skepticism, coupled with years of training as a social scientist, Stone analyzes the opt out decision and comes to surprising conclusions. Delving beneath the superficial, media-friendly explanations, she finds the real movers in the drama: rising norms of intensive mothering, fathers ensconced in even more demanding and better-paying jobs, and inflexible workplaces that refuse to accommodate reduced hours. Brilliantly written and argued, this book reveals what's really going on in women's minds and corporate America today, and what we can do to make equal opportunity at home and on the job reality rather than rhetoric."–Heidi Hartmann, Institute for Women's Policy Research

"A fascinating, fine-grained look at the real reasons why many professional women with children leave the workplace. Stone's research and her well-written account make it clear that educated mothers aren't opting out; they are being shut out by inflexible employers. Must reading for anyone interested in understanding the 'reality of constraint' behind the 'rhetoric of choice.'"–Ann Crittenden, author of The Price of Motherhood

"Based on a study, but told through the eloquent stories of women who are at-home mothers, this seminal book goes beyond the myths, misconceptions, and even what is usually said, to reveal very important and compelling truths. Everyone who cares about work and family life in the United States today needs to read this book."–Ellen Galinsky, President, Families and Work Institute, and author of Ask the Children

"A brilliant analysis. With exquisite sensitivity, Stone unpacks the painful process by which most women who 'opt out' feel pushed out by workplace pressures from their own-and their husbands'-all-or-nothing careers. This book offers sophisticated sociology at its accessible best, in the tradition of Arlie Hochschild's pathbreaking work."–Joan Williams, author of Unbending Gender

"'Ladies, start your engines.' This exhortation concludes the illuminating analysis of vibrant, fully realized stories Pamela Stone heard in talking to women who left professional work for full-time home life. Thank you, Pamela Stone, for producing new knowledge that both individuals and business policy-makers will find essential in creating the conditions that will enable business professionals to meet this profound social and economic challenge."–Stewart Friedman, Director, Wharton Work/Life Integration Project, University of Pennsylvania


Description

Noting a phenomenon that might seem to recall a previous era, The New York Times Magazine recently portrayed women who leave their careers in order to become full-time mothers as "opting out." But are high-achieving professional women really choosing to abandon their careers in order to return home? This provocative study is the first to tackle this issue from the perspective of the women themselves. Based on a series of candid, in-depth interviews with women who returned home after working as doctors, lawyers, bankers, scientists, and other professionals, Pamela Stone explores the role that their husbands, children, and coworkers play in their decisions; how women's efforts to construct new lives and new identities unfold once they are home; and where their aspirations and plans for the future lie. What we learn–contrary to many media perceptions–is that these high-flying women are not opting out but are instead being pushed out of the workplace. Drawing on their experiences, Stone outlines concrete ideas for redesigning workplaces to make it easier for women–and men–to attain their goal of living rewarding lives that combine both families and careers.

The Feminine Mistake: "how dare you judge me!"

The Feminine Mistake

Everyone knows that authors have to be prepared for negative reviews. What I didn't anticipate was an avalanche of blistering attacks by women who hadn't read my book but couldn't wait to condemn it. Their fury says a great deal about the current debate over women's choices -- all of it alarming.

I wrote The Feminine Mistake: Are We Giving Up Too Much? because the typical reporting on the job-versus-family issue was so biased and incomplete. The media gave lots of coverage to women who quit the labor force to become full-time mothers, but they treated this decision as if it were simply a lifestyle choice. They never seemed to mention the risks of economic dependency -- or the myriad benefits of work. As a result, women were being lulled into a dangerous sense of complacency about relinquishing their financial autonomy. Why wasn't anyone telling the truth about how much they were sacrificing -- or what the consequences could be?

When I researched the subject myself, my findings made it all too clear how false that sense of security really is. Over time, most stay-at-home wives are likely to face major hardships as a result of divorce, widowhood, a spouse's unemployment or illness, or any number of other challenges. Women who abandon their careers and become financially dependent on their husbands often look back on that decision as the biggest mistake of their lives -- even women in stable, enduring marriages. I interviewed women all over the country, of every age, socio-economic level and background, but many used the exact same words to ask an angry question: "Why didn't anybody tell me what a mistake this was?"

My goal in writing The Feminine Mistake was to provide women with what I saw as one-stop shopping that would help close this information gap: a single neat package gathering all the financial, legal, sociological, psychological, medical, labor-force, child-rearing and other information necessary for them to protect themselves. My reporting revealed that the bad news is just as ominous as I'd feared: many women are unaware of practical realities that range from crucial changes in the divorce laws to the difficulties of reentering the work force and the penalties they pay for taking a time-out. I devoted two chapters to financial information alone.

But the good news is just as dramatic -- and equally neglected in much of the current debate. Work confers enormous benefits in addition to a paycheck. Despite the undeniable challenges of the juggling act, working women tend to be happier and even healthier than stay-at-home moms, in ways that have been documented by a broad range of surprising medical, psychological and social science data. Their incomes give them power in their marriages and options in the larger world, not to mention opportunities that benefit their families. Women are socialized not to brag, but it's very gratifying to make money, be successful, and get recognition for your work. Like most men, many working women wouldn't even consider giving up such rewards.

As for the children's welfare, sociologists have spent decades comparing the kids of working moms with those of full-time homemakers, consistently failing to prove that the latter do better. "The research on the impact of working mothers on kids shows that there isn't any," reported sociologist Pamela Stone. And when the kids grow up, the futures of working mothers are usually brighter than those of the homemakers, who often find themselves financially stranded and bereft of viable opportunities for employment.

And yet millions of women continue to be misled by the fairy-tale version of life, in which Prince Charming comes along and takes care of you forever. Our culture programs women to believe that they can depend on a man to support them -- the classic feminine mistake -- and fails to explain how often that alluring promise is betrayed, whether by a change of heart or a heartless fate.

Naively, I assumed that once women were offered more accurate information, they would be eager to get it. After all, women aren't stupid; it's true that they've been deserting the labor force in record numbers, but surely the problem was just that unfortunate information gap. Wouldn't they want to protect their own interests by educating themselves about the dangers that lie ahead -- and to plan accordingly?

The first warning that I had misjudged the situation popped up on my computer screen as a Google alert, months before my book was published. I was thrilled to see that bloggers were already talking about The Feminine Mistake -- until I saw what they were saying. The first woman to weigh in hadn't actually read it, but she was nonetheless certain that it would serve as "an indictment on my whole life as I currently live it." She held equally firm views about the content: "I'm sure there will be pages that make me shriek in anger on all sides of the issues Bennetts raises." At least she admitted that she might be bringing some personal baggage into her critique: "Am I bothered because I have a sneaking decision (I think she meant suspicion) that I've just been called a 'mistake'?...Sadly, I think I know."

And then the final jab: "that little jealousy thing where I'm secretly hoping this author is interviewed by Katie Couric on the nightly news with lipstick on her teeth."

Equally encouraging was the woman who, after being introduced to me at a cocktail party, made a horrible face when the hostess told her about The Feminine Mistake. "I don't think I want to read it," she said, pursing her lips as if she'd just sucked a lemon. "The last thing I need is a whole book telling me why I should feel even more guilty about my life than I already do."

These days women are so defensive about their choices that many seem to have closed their minds entirely. Unfortunately this will not serve our best interests, but apparently it's preferable to facing the facts. "The Latest Polemic Against Stay-At-Home Moms!" was the headline on one recent essay about The Feminine Mistake. If this were accurate, I wouldn't mind someone complaining about it, but my book is not a polemic; it's a painstakingly reported collection of information and interviews. If you want to disagree with my conclusions, you need to address the facts on which they're based rather than acting as if these were simply matters of opinion. They're not.

But you can't tell that to the stay-at-home brigade, who are enraged that I wrote it at all. When Glamour published a brief essay adapted from the book, the magazine was inundated with furious letters denouncing me. "I am so insulted by Leslie Bennetts!" and "I am so offended by Leslie Bennetts!" were typical openers. Of course, these women hadn't read the book either, but they weren't about to let the evidence get in the way of their preconceived biases.

It shouldn't be news that educating ourselves can help us to make smarter choices. You wouldn't buy a car without doing some comparison shopping and researching the advantages of different options, would you? So why would you make a major life choice that could jeopardize your future without informing yourself about the risks -- and the alternatives?

And yet many stay-at-home mothers seem unwilling to do so. In my interviews, most said they didn't want to think about the problems they might encounter in the future, let alone to do any contingency planning. When I asked about the dangers of economic dependency, they bristled and insisted that bad things would never happen to them, only to other people.

Wondering whether my findings were representative, I interviewed social scientists who have studied opt-out moms, and discovered that they had found the same thing: when most women quit their jobs, the long-term risks of economic dependency aren't even on their radar screens.

"None of them talked about 'What if I end up divorced?'" reported Louise Roth, a sociologist at the University of Arizona. "They never mentioned other risk factors like death or illness or unemployment."

Among full-time homemakers, this overdeveloped capacity for denial is often accompanied by a highly combative sense of indignation about views that challenge their own. In recent years, stay-at-home moms have gone on the offensive, demanding that their choices be respected and attacking those who question them. Many people have thus been intimidated into silence -- a phenomenon I encountered with increasing frequency over the last few months. Publications whose readership includes a high proportion of working women have been very enthusiastic about covering my book. But other publications catering primarily to stay-at-home mothers are terrified of offending them, and any coverage has to be tailored to accommodate their sensitivities, real or imagined. "We don't want to upset the stay-at-home mommies," more than one editor told me in a patronizing tone of voice that suggested the conspiratorial whisper of adults who are trying not to wake the cranky children.

The same thing is happening with organizations that are interested in speaking engagements. Groups of professional women are eager to hear what I have to say, but those whose membership includes many stay-at-home mothers are afraid to risk their wrath by offering potentially upsetting information. Institutions that rely on the volunteer efforts of stay-at-home moms are particularly leery of presenting any program that might challenge their assumptions and rouse their ire.

As a result, the information contained in my book is being disseminated widely among working women, but stay-at-home wives -- the ones most at risk, and therefore the ones I most wanted to reach with my findings -- are being insulated from the truth by well-meaning decision-makers who are, in my opinion, infantilizing them. Yes, it's true that women who don't work are often so defensive about their choice that they've helped to create this regrettable climate. But do they really want to be treated like children who must be shielded from distressing information?

It's as if the adult world of work and public affairs regards these self-appointed CHOs ("chief household officers," in the self-congratulatory parlance of one magazine aimed at that constituency) as somewhat dimwitted second-class citizens who aren't really up to the task of dealing with reality, which has to be left to the grown-ups. And I'm not just talking about the mommy wars; if anything, this kind of condescension about stay-at-home moms is more apparent among men than among working women.

Thus buffered from harsh realities, stay-at-home mothers can often preserve their illusions for quite a while. But over the long run, neither willful obliviousness nor a double standard that treats them like second-class citizens will save these women from the all-too-real problems I have documented in my book. The facts don't change just because you refuse to look at them.

I hope I'm wrong about this. Maybe the stay-at-home moms will devour the information in The Feminine Mistake and debate my findings in their book clubs. Maybe some of them will even reconsider their choices and start making more sensible plans for the future than relying on the blithe assumption that there will always be an obliging husband around to support them.

But judging by the opening salvos, I wouldn't bet the whole suburban Colonial on it.