By PAUL KRUGMAN
"How does one report the facts," asked Rob Corddry on "The Daily Show," "when the facts themselves are biased?" He explained to Jon Stewart, who played straight man, that "facts in Iraq have an anti-Bush agenda," and therefore can't be reported.
Mr. Corddry's parody of journalists who believe they must be "balanced" even when the truth isn't balanced continues, alas, to ring true. The most recent example is the peculiar determination of some news organizations to cast the scandal surrounding Jack Abramoff as "bipartisan."
Let's review who Mr. Abramoff is and what he did.
Here's how a 2004 Washington Post article described Mr. Abramoff's background: "Abramoff's conservative-movement credentials date back more than two decades to his days as a national leader of the College Republicans." In the 1990's, reports the article, he found his "niche" as a lobbyist "with entree to the conservatives who were taking control of Congress. He enjoys a close bond with [Tom] DeLay."
Mr. Abramoff hit the jackpot after Republicans took control of the White House as well as Congress. He persuaded several Indian tribes with gambling interests that they needed to pay vast sums for his services and those of Michael Scanlon, a former DeLay aide. From the same Washington Post article: "Under Abramoff's guidance, the four tribes ... have also become major political donors. They have loosened their traditional ties to the Democratic Party, giving Republicans two-thirds of the $2.9 million they have donated to federal candidates since 2001, records show."
So Mr. Abramoff is a movement conservative whose lobbying career was based on his connections with other movement conservatives. His big coup was persuading gullible Indian tribes to hire him as an adviser; his advice was to give less money to Democrats and more to Republicans. There's nothing bipartisan about this tale, which is all about the use and abuse of Republican connections.
Yet over the past few weeks a number of journalists, ranging from The Washington Post's ombudsman to the "Today" show's Katie Couric, have declared that Mr. Abramoff gave money to both parties. In each case the journalists or their news organization, when challenged, grudgingly conceded that Mr. Abramoff himself hasn't given a penny to Democrats. But in each case they claimed that this is only a technical point, because Mr. Abramoff's clients — those Indian tribes — gave money to Democrats as well as Republicans, money the news organizations say he "directed" to Democrats.
But the tribes were already giving money to Democrats before Mr. Abramoff entered the picture; he persuaded them to reduce those Democratic donations, while giving much more money to Republicans. A study commissioned by The American Prospect shows that the tribes' donations to Democrats fell by 9 percent after they hired Mr. Abramoff, while their contributions to Republicans more than doubled. So in any normal sense of the word "directed," Mr. Abramoff directed funds away from Democrats, not toward them.
True, some Democrats who received tribal donations before Mr. Abramoff's entrance continued to receive donations after his arrival. How, exactly, does this implicate them in Mr. Abramoff's machinations? Bear in mind that no Democrat has been indicted or is rumored to be facing indictment in the Abramoff scandal, nor has any Democrat been credibly accused of doing Mr. Abramoff questionable favors.
There have been both bipartisan and purely Democratic scandals in the past. Based on everything we know so far, however, the Abramoff affair is a purely Republican scandal.
Why does the insistence of some journalists on calling this one-party scandal bipartisan matter? For one thing, the public is led to believe that the Abramoff affair is just Washington business as usual, which it isn't. The scale of the scandals now coming to light, of which the Abramoff affair is just a part, dwarfs anything in living memory.
More important, this kind of misreporting makes the public feel helpless. Voters who are told, falsely, that both parties were drawn into Mr. Abramoff's web are likely to become passive and shrug their shoulders instead of demanding reform.
So the reluctance of some journalists to report facts that, in this case, happen to have an anti-Republican agenda is a serious matter. It's not a stretch to say that these journalists are acting as enablers for the rampant corruption that has emerged in Washington over the last decade.
Monday, January 30
Protests Grow Over Danish Cartoon of Muhammad
By THE ASSOCIATED PRESS
Filed at 5:06 p.m. ET
BEIRUT, Lebanon (AP) -- From the burning of its flag to a boycott of its brands of butter and cookies, Denmark is feeling Islamic outrage over newspaper cartoons of the Prophet Muhammad.
Angered by the drawings, masked Palestinian gunmen briefly took over a European Union office in Gaza on Monday. Islamists in Bahrain urged street demonstrations, while Syria called for the offenders to be punished. A Saudi company paid thousands of dollars for an ad thanking a business that snubbed Danish products.
The anger is reminiscent of the 1989 wrath that followed publication of ''The Satanic Verses,'' the Salman Rushdie novel that radicals said insulted Islam. Iran's Ayatollah Ruhollah Khomeini issued a death sentence against the British writer.
The cartoons originally were published nearly four months ago in Denmark and reprinted Jan. 10 by the Norwegian evangelical newspaper Magazinet in the name of defending free expression.
The spasm of vilification in newspapers and mosque sermons, by governments, citizens and radicals, appears to have tapped into pent-up Muslim anger typically reserved for the former colonial powers Britain and France, as well as the United States.
''This will be used by regimes who resent Western pressures to reform to say that the West is waging a war against Muslims and doesn't have their best interests at heart,'' Sulaiman al-Hattlan, a Dubai-based Saudi writer, told The Associated Press.
The Danish paper Jyllands-Posten first published the 12 cartoons Sept. 30. The drawings included one showing Muhammad wearing a turban shaped as a bomb with a lit fuse. Another portrayed him with a bushy gray beard and holding a sword, his eyes covered by a black rectangle. A third pictured a middle-aged prophet standing in the desert with a walking stick, in front of a donkey and a sunset.
Islamic tradition bars any depiction of the prophet, favorable or otherwise.
Saudi Arabia recalled its ambassador to Denmark and initiated a boycott of Danish goods. EU Trade Commissioner Peter Mandelson warned Saudi Arabia on Monday that the European Union would take WTO action if the boycott persisted.
The newspaper issued an apology Monday to the world's Muslims.
The cartoons ''were not in violation of Danish law but have undoubtedly offended many Muslims, which we would like to apologize for,'' the Jyllands-Posten's editor-in-chief Carsten Juste said in a statement posted on the newspaper's Web site.
On Sunday, the newspaper printed a statement in Arabic addressed to Saudis, who had initiated the boycott. It said the drawings were published as part of a Danish dialogue about freedom of expression but were misinterpreted ''as if it were a campaign against Muslims in Denmark and in the Islamic world.''
Few were swayed by the explanation.
''In (the West) it is considered freedom of speech if they insult Islam and Muslims,'' Mohammed al-Shaibani, a columnist, wrote in Kuwait's Al-Qabas daily Monday. ''But such freedom becomes racism and a breach of human rights and anti-Semitism if Arabs and Muslims criticize their religion and religious laws.''
Emirates Justice and Islamic Affairs Minister Mohammed Al Dhaheri called it ''cultural terrorism, not freedom of expression,'' according to the official WAM news agency. ''The repercussions of such irresponsible acts will have adverse impact on international relations.''
In Tunisia, the head of the Islamic world's counterpart to UNESCO called the drawings ''a form of racism and discrimination that one must counter by all available means.''
''It's regrettable to state today, as we are calling for dialogue, that other parties feed animosity and hate and attack sacred symbols of Muslims and of their prophet,'' said Abdulaziz Othman Altwaijri, president of the Islamic Organization for Education, Science and Culture.
Jordan's largest circulation daily, government-run Al-Rai, said the Danish government must apologize.
In two West Bank towns Sunday, Palestinians burned Danish flags and demanded an apology. Several Islamist groups, including the Palestinian militant Hamas party and Egypt's Muslim Brotherhood, called for a worldwide boycott of Danish products.
The same call in several Persian Gulf countries has resulted in supermarkets clearing shelves of Danish cream cheese, butter and cookies. Kuwait's Al-Jahra Cooperative Society said in an ad in the Al-Rai Al-Aam daily that all Danish products have been removed from its shelves.
In Saudi Arabia, the daily Al-Watan refused to publish an ad from Denmark-based dairy group Arla Foods, which has said the boycott of its products was almost total.
Luai Mutabakani, senior editor at the paper, told the AP the full-page ad, titled ''The Danish government respects Islam,'' did not carry any apology or reprimand the paper.
In Iraq, thousands denounced the caricatures during Friday prayers.
The Egyptian parliament's Economic Committee refused to discuss a $72.49 million loan from Denmark to Egypt.
President Emile Lahoud of Lebanon condemned the cartoon, saying his country ''cannot accept any insult to any religion.''
Danish government officials expressed regret for the furor caused by the drawings but refused to become involved, citing freedom of expression. Mindful of the outrage, the government advised its citizens to ''show extra vigilance'' in the region.
Swedish Foreign Ministry spokesman Claes Jernaeus warned against travel to Gaza and the West Bank.
2005 unreleased films - Film Comment listing
BEST UNRELEASED FILMS OF 2005
(Seen last year but not yet released in the U.S.)
1. Three Times (Hou Hsiao-hsien, Taiwan) 227 points
2. L'Enfant (Jean-Pierre and Luc Dardenne, Belgium/France) 185
3. The Death of Mr. Lazarescu (Cristi Puiu, Romania) 161
4. Regular Lovers (Philippe Garrel, France) 123
5. The Sun (Alexander Sokurov, Russia/France/Italy/Switzerland) 114
6. Police Beat (Robinson Devor, U.S.) 95
7. Tristram Shandy (Michael Winterbottom, U.K.) 76
8. Mutual Appreciation (Andrew Bujalski, U.S.) 70
9. Gabrielle (Patrice Chéreau, France) 68
10. The Wayward Cloud (Tsai Ming-liang, France/Taiwan) 59
11. Who's Camus Anyway? (Mitsuo Yanagimachi, Japan) 57
12. Tale of Cinema (Hong Sang-soo, South Korea/France) 51
13. Princess Raccoon (Seijun Suzuki, Japan) 48
14. Mary (Abel Ferrara, Italy/France/U.S.) 46
15. 4 (Ilya Khrzhanovsky, Russia) 44
16. 13 Lakes (James Benning, U.S.) 35
17. Sympathy for Lady Vengeance (Park Chan-wook, South Korea) 34
18. Le Pont des Arts (Eugene Green, France) 32
19. Clean (Olivier Assayas, Canada/France/U.K.) 30
20. Avenge But One of My Two Eyes (Avi Mograbi, France/Israel) 29
21. Dave Chappelle's Block Party (Michel Gondry, U.S.) 27
22. I Am a Sex Addict 26
23. This Charming Girl (Lee Yoon-ki, South Korea) 23
24. South of the Clouds (Zhu Wen, China) 22
25. Why We Fight (Eugene Jarecki, U.S./France/U.K.) 21.5
26. Odete (João Pedro Rodrigues, Portugal) 21
27. Los Muertos 20.5
28. Route 181 19
29. Linda Linda Linda 17
30. The Ister (David Barison & Daniel Ross, Australia) 17
31. Thank You for Smoking 16
32. The Devil and Daniel Johnston 16
33. Through the Forest 16
34. 12 and Holding 15
35. Bubble 15
Wednesday, January 25
The Inner Lives of Men
Judith Warner, NYT
Something earthshaking happened a few days ago. My husband called home and announced that he had Something to Say.
“You know how you always say I don’t tell you anything about myself?” he asked. “Well, I have something to tell you now.”
I immediately got off call-waiting, shut down my laptop, shooed the children from the kitchen and turned off the gas under whatever it was I was burning on the stove. (“That smell is zucchini. It’s not burnt, it’s lightly fried. I happen to like it like that.”)
“Are you ready?” he said.
A bit of prehistory: About a year ago, a friend of mine called and described to me how my husband’s general reticence about all things personal gave him an aura of intrigue and mystery and added, generally, to his popularity in our little circle of displaced New Yorkers and other existential misfits in Upper Northwest Washington.
“I mean, he’s really smart, of course, and really funny, and we love him for that,” she confided. “But you always feel, with Max, that if only you could get to know him better, there’s so much more that he could say.”
I knew what she meant. The disconnect between Max’s daunting intellect, his breadth of knowledge, brilliant capacity for synthesizing information and repackaging it into smart and unique ideas, and the utter paucity of what, on a personal level, he has to say for himself, has been an endless source of fascination for friends and family members for as long as I’ve known him.
In his defense, though, he’s not alone in this condition. Indeed, as Michael Gurian explains in “What Could He Be Thinking: How a Man’s Mind Really Works,” which I discovered recently while procrastinating, a relative lack of ability to make compelling personal revelations is a hard-wired feature of the male brain.
The corpus callosum, a small bundle of nerves that permits communication between the brain’s right and left hemispheres, is, Gurian notes, on average 25 percent smaller in men than in women. “Because of this,” he writes, “men don’t connect as many feelings to words, or even thoughts to words.”
He goes on:
If the feeling or thought needs to move from the right to the left hemisphere, a man has 25 percent less chance of moving it over. This is crucial because the male brain does its language in the left hemisphere, while women use six or seven cortical areas for language in both hemispheres. The end result is that men have a more difficult time making language out of experience than women do. In fact, they use, on average, about half the amount of words that women do.
(And while we’re having a good time exploring men’s natural-born deficiencies, see the biological basis of their lack of “emotional intelligence”; autism’s possible link to the “extreme male brain”; and the male sex’s propensity toward schadenfreude.)
As soon as I got off the phone with my friend, I called Max, who was, as always, at the office.
“Ah, yes,” he said, the sound of a keyboard clicking away in the background. “People think I have undiscovered depths. But they’re wrong.”
I met my husband in 1988. I married him in 1989. I considered him then, and have continued, till today, to think of him as the smartest person I’ve ever met, as well as the funniest, and, generally, the best all-around friend and companion imaginable. But everything I know about him now I learned in the first year, in those endless volleys of personal revelations that tend to accompany falling in love.
As the years, as the decades now, have passed, I have kept up a steady stream of personal information, often, indeed, sending Max screaming, newspaper clamped over his ears, from the room. But his own divulgences have slowed to what can hardly be described as a trickle.
He describes his work days as “fine” or “You don’t want to know.” What little bits of information I can glean about his out-of-home life usually come when we’re out to dinner with other people. I learn of his phone and e-mail communications because his interlocutors, too frustrated to go on dealing with him, usually end up forwarding their messages to me.
Sometimes, at dinner parties, when people ask him the usual questions about himself (Where are you from? What did your parents do? What is your background? Religion? Maternal language?), I step into the denotative silence to answer the questions myself, weaving together his past comments, testimony from his father and mother and half-siblings, even portions of his as-yet-unpublished novel, just to keep the conversation going.
This annoys him, but words must flow.
So you can imagine my supreme, extreme, show-stopping joy and pleasure when that voice at the other end of the phone line said, “I have something to share.
“You always say I don’t tell you anything,” he continued. “And that this puts distance in our marriage. And I’ve been worrying about that. And now I have something to tell you.”
He sounded proud. Clearly, this had been working on him for some time.
“Today,” he said, shy, but with growing self-confidence. “Well, today… You know how I always go out at lunchtime to buy a sandwich?”
“Yes!” My heart leapt.
“Well, today, I had my sandwich on rye bread. And do you know what?” he paused significantly. “I really like rye bread.”
He pronounced the words “rye bread” like the name of an exotic foreign city. It sounded like a newly coined psychiatric diagnosis.
“How much of it did you eat?” I asked him. (LSD, let’s remember, came originally from rye ergot.)
“Just a sandwich,” he said, daintily. “I think I might get one every day.”
I started to laugh in little nose-stuffed shrieks. My children ran into the room in alarm.
“It’s true,” I gasped. “You have no hidden depths.
“In fact,” I fairly screamed. “After all these years, I’m just discovering: You’re simply not all that smart!”
A postscript:
My husband was actually quite pleased by my reaction. For now that I’ve determined, once and for all, that he indeed does lack depth, has a rather limited inner life, and, I’m afraid to say, may well be far less intelligent than we’ve all previously imagined, there’s only one thing left for me to do: Treat him like a sex object.
Monday, January 23
domestic surveillance
The ostensible aim of the president's domestic surveillance program, conducted by the supersecret National Security Agency, is to home in on communications into and out of the United States that involve individuals or organizations suspected of some sort of terror connection. But, as The Times reported last week, F.B.I. officials have repeatedly complained that the N.S.A. has bombarded them with thousands upon thousands of unsubstantiated tips - names, telephone numbers, e-mail addresses and so forth - that have either led nowhere, or to completely innocent individuals.
Whatever its stated goals, the N.S.A. seems to be operating the greatest fishing expedition in the history of the world.
The American Civil Liberties Union, in a lawsuit seeking a halt to the spying, warned that scholars, lawyers, journalists and others who communicate with people outside the U.S. are already experiencing a chilling effect. People who are doing nothing wrong, but who feel they may become targets of the program, for whatever reasons, are curtailing their conversations and censoring their correspondence, according to the suit.
Laurence Tribe, a professor of constitutional law at Harvard, noted that people who are aware of the surveillance program, and who believe their political views may be seen as hostile by the government, may also become less candid in their telephone conversations and e-mail. Others could unwittingly become the victims of contact from individuals the government is interested in.
He gave an example:
"I recently got a series of e-mails from someone, quite without invitation, that got rather scary in the sense that they started saying positive things about Osama bin Laden. I asked the person in reply to stop e-mailing me, and I got an e-mail today saying, 'Your request is permanently granted.' But in the meantime, granted or not granted, that could easily put me on some kind of targeting list."
Speaking about the potential long-term effect of widespread domestic spying, Professor Tribe said:
"The more people grow accustomed to a listening environment in which the ear of Big Brother is assumed to be behind every wall, behind every e-mail, and invisibly present in every electronic communication, telephonic or otherwise - that is the kind of society, as people grow accustomed to it, in which you can end up being boiled to death without ever noticing that the water is getting hotter, degree by degree.
"The background assumptions of privacy will be gradually eroded to the point where we'll wake up one day, or our children will, and it will seem quaint that people at one time, long ago, thought that they could speak in candor."
Iraq's Power Vacuum
By PAUL KRUGMAN
In the State of the Union address, President Bush will surely assert, to choreographed applause, that he has a strategy for victory in Iraq. I don't believe him. In fact, I believe that three years into the conflict his administration refuses to admit defeat but has given up even trying to win.
To explain myself, let me tell you some stories about electricity.
Power shortages are a crucial issue for ordinary Iraqis, and for the credibility of their government. As Muhsin Shlash, Iraq's electricity minister, said last week, "When you lose electricity the country is destroyed, nothing works, all industry is down and terrorist activity is increased."
Mr. Shlash has reason to be strident. In today's Iraq, blackouts are the rule rather than the exception. According to Agence France-Presse, Baghdad and "much of the central regions" - in other words, the areas where the insurgency is most active and dangerous - currently get only between two and six hours of power a day.
Lack of electricity isn't just an inconvenience. It prevents businesses from operating, destroys jobs and generates a sense of demoralization and rage that feeds the insurgency.
So why is power scarcer than ever, almost three years after Saddam's fall? Sabotage by insurgents is one factor. But as an analysis of Iraq's electricity shortage in The Los Angeles Times last month showed, the blackouts are also the result of some incredible missteps by U.S. officials.
Most notably, during the period when Iraq was run by U.S. officials, they decided to base their electricity plan on natural gas: in order to boost electrical output, American companies were hired to install gas-fired generators in power plants across Iraq. But, as The Los Angeles Times explains, "pipelines needed to transport the gas" - that is, to supply gas to the new generators - "weren't built because Iraq's Oil Ministry, with U.S. encouragement, concentrated instead on boosting oil production." Whoops.
Meanwhile, in the early days of the occupation U.S. officials chose not to raise the prices of electricity and fuel, which had been kept artificially cheap under Saddam, for fear of creating unrest. But as a first step toward their dream of turning Iraq into a free-market utopia, they removed tariffs and other restrictions on the purchase of imported consumer goods.
The result was that wealthy and middle-class Iraqis rushed to buy imported refrigerators, heaters and other power-hungry products, and the demand for electricity surged - with no capacity available to meet that surge in demand. This caused even more blackouts.
In short, U.S. officials thoroughly botched their handling of Iraq's electricity sector. They did much the same in the oil sector. But the Bush administration is determined to achieve victory in Iraq, so it must have a plan to rectify its errors, right?
Um, no. Although there has been no formal declaration, all indications are that the Bush administration, which once made grand promises about a program to rebuild Iraq comparable to the Marshall Plan, doesn't plan to ask for any more money for Iraqi reconstruction.
Another Los Angeles Times report on Iraq reconstruction contains some jaw-dropping quotes from U.S. officials, who now seem to be lecturing the Iraqis on self-reliance. "The world is a competitive place," declared the economics counselor at the U.S. embassy. "No pain, no gain," said another official. "We were never intending to rebuild Iraq," said a third. We came, we saw, we conquered, we messed up your infrastructure, we're outta here.
Mr. Shlash certainly sounds as if he's given up expecting more American help. "The American donation is almost finished," he said, "and it was not that effective." Yet he also emphasized the obvious: partly because of the similar failure of reconstruction in the oil sector, Iraq's government doesn't have the funds to do much power plant construction. In fact, it will be hard pressed to maintain the capacity it has, and protect that capacity from insurgent attacks.
And if reconstruction stalls, as seems inevitable, it's hard to see how anything else in Iraq can go right.
So what does it mean that the Bush administration is apparently walking away from responsibility for Iraq's reconstruction? It means that the administration doesn't have a plan; it's entirely focused on short-term political gain. Mr. Bush is just getting by from sound bite to sound bite, while Iraq and America sink ever deeper into the quagmire.
Tuesday, January 17
Are Conservative Republicans Now America's Permanent Ruling Class? (Why what you've heard is wrong)
By JOHN J. DIIULIO JR.
Section: The Chronicle Review
Volume 52, Issue 20, Page B9
Following the 2002 midterm Congressional elections, Democrats were blue about their party's future. With the September 11, 2001, terrorist attacks still uppermost in the American public mind, and with President Bush enjoying record-high job-approval ratings, most voters favored Republican candidates and voiced conservative opinions in polls. Several pundits proclaimed that the GOP was now America's "permanent majority" at the national level, becoming so at the state level, and even resurrecting itself in some cities where Democrats had long reigned supreme.
Supposedly this political realignment was, if anything, long overdue. Since the early 1970s, public opinion had been trending conservative. By the early 1990s, lower taxes, tougher crime policies, and traditional moral values all consistently polled popular majorities. Southern voters began bolting from the Democratic Party in 1964 when President Lyndon B. Johnson signed the Civil Rights Act. Nixon's law-and-order "Southern strategy" iced the break. In 1980 and again in 1984, Reagan attracted one in four votes cast by Democrats. In 1994 Newt Gingrich-led Republicans ended Democrats' 40 years at the House's helm.
In 2002 here it supposedly was: the long-predicted shift to Republican Party dominance and conservative ideological hegemony. Two years later, Bush's bigger-than-expected win over Sen. John F. Kerry ostensibly confirmed that conservative Republicans had become America's ruling class.
Besides, the 2004 election results supposedly revealed deep culture-war differences concerning religion that sealed the Republicans' permanent majority status. About two-thirds of people who attended church regularly (weekly or more) voted for Bush. As the analysts John C. Green and Mark Silk have documented, the small plurality of Americans who chose "moral values" as "the one issue that mattered most" to their presidential vote in 2004 — so-called "moral values" voters — put Bush safely over the top in the South, the Mountain West, and the Midwest. Millions more evangelical Christians voted in 2004 than had voted in 2000.
But what a difference a year makes. According to the Washington chattering class, Bush and the Republicans' governing majority are suddenly but surely in decline. Many among the selfsame talking heads who were only recently talking Republican realignment, conservative hegemony, and Bush's lasting Reagan-like legacy, are now talking conservative crack-up, the lame-duck president's political meltdown, and the Democrats' winning back the House in 2006.
All the pundits point to much the same reasons for this apparent reversal in conservative Republicans' political fortunes: rising popular sentiment against the U.S. occupation of Iraq; news-media spotlights on the bungled federal response to Hurricane Katrina; prosecutorial probes into alleged misdeeds by high-profile Republican leaders; revolts by conservatives against the president's second pick for the Supreme Court, Harriet Miers; and retreats by the White House on Social Security privatization and several other domestic-policy priorities.
There is only one problem with this latest conventional political wisdom. It is, like the conventional political wisdom that immediately preceded it, almost completely wrong in virtually every respect.
Today's true big political picture is mostly gray shades against a purple (red mixed with blue) canvas. Conservative Republicans, beset by deep ideological divisions, are not even close to becoming the country's permanent ruling class. Neither the post-Reagan Republican Party in general, nor the present Bush White House in particular, ever actually rode so high politically.
Just the same, neither the GOP nor the president is in any definite long-term political trouble. Conservative Republicans, even without permanent-majority clout, are still more potent politically than liberal Democrats, and likely to remain so. Centrist and neoprogressive Democrats could credibly compete for power with conservative Republicans, but they must first pry their party's presidential nomination process and key leadership posts from the old-left hands that still primarily control them. Despite strenuous efforts to do so since the mid-1980s by various New Democrat groups, the party is still led mainly by its liberals. Not even the New Democrats have ever really reached out to the culturally conservative and anti-abortion Democrats who have been defecting to the Republican Party since the Reagan years.
True, Bush won over two-thirds of regular churchgoers, but Kerry won two-thirds of voters who said they never went to church. Together the "churched" (a sixth) and the "unchurched" (a seventh) constituted less than a third of the total electorate. As the political scientist James Q. Wilson, of UCLA and Pepperdine University, stated in his November Tanner Lecture at Harvard, "religion makes a difference, but very religious and very irreligious voters are only a minority of the electorate." Amen, and as studies by the Stanford political scientist Morris P. Fiorina have shown, even on most hot-button issues, the electorate is far less polarized than ideological elites on each side would like it to be.
The political pundits are wrong, but your high-school civics teacher was right: Thanks to federalism, separated powers, checks and balances, staggered elections, and myriad other constitutional contrivances, the party in power has to govern by the ABC's — forging interparty alliances, striking bargains with officials in other branches and at other levels of government, and effecting compromises that usually induce less loyalty from the winners than enmity from the losers.
Especially when, as today, national political elites are ideologically polarized into partisan camps, unified party government is constitutionally conditioned to be a splendid curse for the party in power. Once that party is "in control" in both Congressional chambers and in the White House, the ABC's rudely awaken latent intraparty divisions and spark new, high-stakes internal battles over both ideas (who believes what) and interests (who gets what).
Historically the Democrats' New Deal coalition — Southern whites, northern blacks, union members, Catholics, Jews, and disparate others — had pretty much fallen apart by the time Nixon resold himself to America in 1968. But the Republicans' grand old "Main Street and Wall Street" coalition has always been a true political witch's brew, bound to bubble and boil over whenever the GOP and its conservative base — that is, bases, plural — control both Congressional chambers plus the White House.
In an early October 2005 cover story, "What's Gone Wrong for America's Right," The Economist magazine listed the contemporary GOP's conservative cleavages: small-government conservatives versus big-government conservatives, conservatives of faith versus conservatives of doubt, insurgent conservatives versus establishment conservatives, business conservatives versus religious conservatives, and neoconservatives versus traditional conservatives.
Exhibit A is the libertarian Cato Institute's edited volume assessing what Republicans have wrought since taking back the House in 1994 and achieving unified party control under George W. Bush. As the small-government conservatives see it, 10 years after the "Republican revolution," Bush-led Washington and the Republican Party have backslid into "business as usual."
Cato's best-known analyst-activist, Stephen Moore, says it all in his chapter's subtitle, "The Triumph of Big Government." In a section headed "Republicans Break the Bank Under President Bush," Moore notes that nondefense discretionary spending rose 34 percent during Bush's first term, which is "exactly the opposite of what was promised by Republican leaders when they came to power in the 1990s." The Bush "spending spree," as Moore dubs it, started before 9/11 and "is spread across many federal agencies, whether they have a security function or not." And don't blame only the Democrats: "Bush has not vetoed a single bill. ... If Bush is displeased with big spending in Congress, he has shown no sign of it."
The libertarians lambaste more than Bush's budgets. Health-care policy, Michael F. Cannon says, has been the Republican revolution's "mitigated disaster." Republicans defeated Clinton's universal health-insurance plan, but they have yet to rein in federal spending on Medicaid; and in 2003 Bush backed the Medicare Prescription Drug Improvement and Modernization Act, affording prescription-drug coverage to qualified senior citizens starting in 2006. Bush's landmark No Child Left Behind law, argues David F. Salisbury, "greatly increased federal education spending and perpetuated funding for most of the old federal education programs, many of which are ineffective and wasteful." According to Jerry Taylor, "the Republican revolution has left virtually no footprints on the environmental code or on federal land holdings." On foreign policy and national security, avers Christopher A. Preble, there are now "serious divisions within the party." Preble charges that both the first and second Presidents Bush, Gingrich, and other Republican revolutionaries have proved unwilling "to part with the military-industrial complex that had expanded during the cold war." He labels our present national government a "warfare-welfare state."
In the second chapter, Richard K. Armey, former Republican House majority leader, offers the "Armey Axiom" that "Freedom Works," advising that "America will prosper and create unlimited opportunity if we have limited government and reward the hard work and initiative of citizens." Armey's political advice is simple: "When We Act Like Us, We Win."
Really? Cato's president, Edward H. Crane, is Armey's ideological twin, but he acknowledges that Republicans have won elections while straying far from the small-government gospel. Reagan, complains Crane, won big in 1984 by running on a platform "with no substance" and returned to office with no "mandate for cutting the government."
"Today," he writes, the GOP is led intellectually "by neoconservatives and other Republicans who are explicitly pro-big government."
Why is small-government conservatism so little honored by Republican policy makers even now that they control the Congress and the White House? Even a nonlibertarian like me can be moved by certain libertarian ideas and values (especially each April 15). The simple truth, however, is that most citizens, including most who are registered as Republicans, carp about taxes but, when push comes to shove, want, and even demand, most of what "big government" does and delivers. No national politician can stay in office long or get things done legislatively if he or she always talks or routinely votes the way a committed libertarian should.
To wit: Republicans have won seven of the last 10 presidential elections. Nixon, Ford, Reagan in 1984, and the two Presidents Bush read little from the libertarian liturgy. Only Reagan in 1980 talked a small-government line, and he received just 51 percent of the vote in a three-way race. As Crane notes, in 1980 Reagan promised to abolish the Department of Education and the Department of Energy. That promise got big applause before certain conservative audiences, but Reagan never really pushed hard to get rid of those agencies, and they are still very much with us today.
After delivering his 1981 tax cuts, Reagan did not retire his anti-big-government and bureaucracy-bashing rhetoric. During his two terms, however, federal-government spending as a percentage of gross domestic product changed little, military spending skyrocketed, and there were no big reductions in the federal civilian work force (those occurred in the mid-1990s under Clinton). When Reagan left office in 1989, the Federal Register was slimmer, but the federal government's regulatory reach was, if anything, far greater than it had been in 1980. In 1984 the less libertarian-sounding Reagan won in a landslide (59 percent to 41 percent).
In the mid-1990s, often downbeat and divisive Republican revolutionaries lost what little ground the upbeat and avuncular Reagan had gained for the small-government cause. The public liked the Contract With America, but not its policy fine print. As I predicted in more than a half-dozen lectures I gave in early 1995, once people heard a gavel-wielding Gingrich talk about cutting major social programs, they balked. When Clinton called Gingrich's bluff about "shutting down" the federal government, the only remaining question was when, not whether, the small-government moment would quickly pass into House history footnotes.
George W. Bush has never hidden his differences with libertarians. His very first campaign speech, on July 22, 1999, articulated what he believed as a "compassionate conservative." Speaking before inner-city clergymen and women in Indianapolis, "economic growth," Bush preached, "is not the solution to every problem." He labeled as "destructive" the idea that government is bad and called explicitly for increasing government support for Medicaid and other federal programs. He also rebutted the notion that government needs only to step aside for families and communities to flourish. In particular he stressed that, when it comes to addressing poverty and urban blight, it "is not enough to call for volunteerism. Without more support — public and private — we are asking" local community-serving groups, both religious and secular, "to make bricks without straw."
Bush, like Reagan before him, is a true believer in tax cuts. In 2001 he put tax cuts first on his agenda, and the administration has been quick to court groups with grass-roots networks dedicated to lowering taxes. But the president also proceeded, both before and after 9/11, to try to make good on his pledges of activist domestic government: more federal aid to Title I schools; bipartisan initiatives to expand volunteer-mobilization programs, including Clinton's AmeriCorps program; fresh federal funding for best-practices programs that benefit at-risk urban youth; and much more. To many libertarian leaders' dismay, in 2004 Bush ran mainly on Iraq, homeland security, and his record as a compassionate conservative.
Libertarians aside, the GOP's most interesting but least well-understood intraparty political schism is among its religious conservatives. On the one side are what some political scientists term the party's religious purists. Essentially the purists want to push for policies that challenge constitutional church-state limits and to nominate as federal judges those whom only an activist opposed to abortion or gay rights could love. On the other side are its religious pragmatists. Essentially the pragmatists want government to be more faith-friendly while remaining pluralistic; and, though they are mostly for restricting abortions and against same-sex marriage, they want traditional family values to be promoted less through pitched battles over federal judgeships and more through bipartisan "fatherhood" or "healthy marriage" initiatives and the like.
If Jacob S. Hacker and Paul Pierson are correct, religious pragmatists in the Republican Party don't have a prayer. In Off Center: The Republican Revolution and the Erosion of American Democracy, the two political scientists argue that the GOP is dominated by religious, libertarian, and other conservative ideological extremists who vary only according to their media savvy. The young authors are progressives, but they wage their case as public intellectuals with expert research skills. Like or agree with its thesis or not, their pithy, well-written book is certainly worth reading.
To Hacker and Pierson, the Republican religious base is synonymous with the "Christian right." They cite a study indicating that in 1994 no fewer than "31 state Republican parties" were "significantly shaped by the Christian right," led back then by the Christian Coalition. "The story of the Christian right," they argue, "is the story of many conservative activist groups." Those groups have graduated from mobilizing conservatives to take over local school boards. Rather, as "conservative activism has shifted toward national politics, it has also focused increasingly on the recruitment and certification of aspirants to elected office."
The Christian right, as depicted by Hacker and Pierson, is a conservative first cousin to libertarian Republican anti-tax lobbies (for example, the authors give Grover G. Norquist's influential Americans for Tax Reform ample treatment). The groups "share three key characteristics that increasingly define the organizational base of the GOP: They are radical; they focus on guiding and disciplining Republicans in Congress, not mobilizing large numbers of citizens; and they are effective." The third chapter, "New Rules for Radicals," concludes with broad generalizations: "Republicans are running the show in American politics. They are doing so in opposition to the moderate center of public opinion."
Off Center devotes several pages to "Fissures in the Republican Facade." Still, Hacker and Pierson arguably underplay the rifts within the party's conservative base and underestimate the gaps between far-right rhetoric and center-right Republican policies. Some GOP libertarians may be "radical," but, as the Cato chorus painfully croons, they have hardly proved highly "effective" in getting federal policies to mirror their ideological preferences. Ditto for the so-called Christian right. Reagan repeatedly promised, but did not deliver, strong action to roll back abortion. Bush in 2004 initially embraced a constitutional amendment banning same-sex marriage; then, shortly before Election Day, he declared that he supported "civil unions." In 2005 the White House did nothing to follow up on the issue one way or the other, and so it often goes.
In truth, too many leaders and activists in both parties are way "off center." It takes at least two to do the ideological polarization tango. If Hacker and Pierson ever revise the book's section on "Increasing Transparency and Accountability" in Congress, I would vote for two proposals only slightly more quixotic than several they have already embraced.
First, cut Capitol Hill staff sizes in half and require that all standing Congressional-committee staff members be nonpartisan civil servants. The most partisan and ideological Republicans — and Democrats — in Congress are not the elected members themselves but their respective culture-war-mongering, inside-the-Beltway staff members. Second, cut the number of presidential political appointees in half, following the advice that former Federal Reserve Chairman Paul A. Volcker's commission on national public service gave over a decade ago.
Given intraparty divides, White House staff members inevitably spend much time anticipating criticisms or soothing disappointments that emanate from this or that group on the right (when Republicans are in charge) or on the left (when Democrats are in office). Frustrating though it may be to a quirky, pro-life, pro-poor, Catholic, New Democrat, academic, political moderate like me, the center is a lonely place to be in Washington, and as things stand, no president, Republican or Democratic, can govern squarely from the center.
If you can't beat the pundits, join them. Here are four parting predictions: When in political trouble, Bush has a proven presidential knack for binding an intraparty conservative coalition, finding the public center, and occupying it with novel policy ideas and actions that leave Democrats either divided or nonplussed. His 2006 State of the Union Address will begin to reverse his 2005 political slide.
Unified Republican government will continue to split conservatives, but most political media mavens will continue to peddle the usual pat stories about left-right, red-blue partisan warfare and miss the more interesting intraparty stories.
A New Democrat will win the presidency in 2008, but not by much, not with coattails that carry Democrats into majority status in Congress, and not for reasons reflecting any new realities or fundamental shifts in the body politic.
And finally, the pundits will nonetheless dress the next Democratic presidential victory in some silly new conventional wisdom ("New Blue Nation"? "The Bush Backlash"?) that will be widely forgotten, save by academic nerds or curmudgeons like me, before the decade is out.
John J. DiIulio Jr., a professor of political science at the University of Pennsylvania, served as first director of the White House Office of Faith-Based and Community Initiatives. He is co-author, with Meena Bose, of Classic Ideas and Current Issues in American Government, just published by Houghton Mifflin.
Section: The Chronicle Review
Volume 52, Issue 20, Page B9
Monday, January 16
Healthcare 103
Op-Ed Columnist
First, Do More Harm
By PAUL KRUGMAN
It's widely expected that President Bush will talk a lot about health care in his State of the Union address. He probably won't boast about his prescription drug plan, whose debut has been a Katrina-like saga of confusion and incompetence. But he probably will tout proposals for so-called "consumer driven" health care.
So it's important to realize that the administration's idea of health care reform is to take what's wrong with our system and make it worse. Consider the harrowing series of articles The New York Times printed last week about the rising tide of diabetes.
Diabetes is a horrifying disease. It's also an important factor in soaring medical costs. The likely future impact of the disease on those costs terrifies health economists. And the problem of dealing with diabetes is a clear illustration of the real issues in health care.
Here's what we should be doing: since the rise in diabetes is closely linked to the rise in obesity, we should be getting Americans to lose weight and exercise more. We should also support disease management: people with diabetes have a much better quality of life and place much less burden on society if they can be induced to monitor their blood sugar carefully and control their diet.
But it turns out that the U.S. system of paying for health care doesn't let medical professionals do the right thing. There's hardly any money for prevention, partly because of the influence of food-industry lobbyists. And even disease management gets severely shortchanged. As the Times series pointed out, insurance companies "will often refuse to pay $150 for a diabetic to see a podiatrist, who can help prevent foot ailments associated with the disease. Nearly all of them, though, cover amputations, which typically cost more than $30,000."
As a result, diabetes management isn't a paying proposition. Centers that train diabetics to manage the disease have been medical successes but financial failures.
The point is that we can't deal with the diabetes epidemic in part because insurance companies don't pay for preventive medicine or disease management, focusing only on acute illness and extreme remedies. Which brings us to the Bush administration's notion of health care reform.
The administration's principles for reform were laid out in the 2004 Economic Report of the President. The first and most important of these principles is "to encourage contracts" - that is, insurance policies - "that focus on large expenditures that are truly the result of unforeseen circumstances," as opposed to small or predictable costs.
The report didn't give any specifics about what this principle might mean in practice. So let me help out by supplying a real example: the administration is saying that we need to make sure that insurance companies pay only for things like $30,000 amputations, that they don't pay for $150 visits to podiatrists that might have averted the need for amputation.
To encourage insurance companies not to pay for podiatrists, the administration has turned to its favorite tool: tax breaks. The 2003 Medicare bill, although mainly concerned with prescription drugs, also allowed people who buy high-deductible health insurance policies - policies that cover only extreme expenses - to deposit money, tax-free, into health savings accounts that can be used to pay medical bills. Since then the administration has floated proposals to make the tax breaks bigger and wider, and these proposals may resurface in the State of the Union.
Critics of health savings accounts have mostly focused on two features of the accounts Mr. Bush won't mention. First, such accounts mainly benefit people with high incomes. Second, they encourage wealthy corporate employees to opt out of company health plans, further undermining the already fraying system of employment-based health insurance.
But the case of diabetes and other evidence suggest that a third problem with health savings accounts may be even more important: in practice, people who are forced to pay for medical care out of pocket don't have the ability to make good decisions about what care to purchase. "Consumer driven" is a nice slogan, but it turns out that buying health care isn't at all like buying clothing.
The bottom line is that what the Bush administration calls reform is actually the opposite. Driven by an ideology at odds with reality, the administration wants to accentuate, not fix, what's wrong with America's health care system.
Pseudo-Familiarity and Corporate Egalitarianism
Don't Call Me Thomas
By THOMAS H. BENTON
"Hello, Thomas, did you find everything you wanted?"
Silence, five seconds.
"Thomas?"
I don't blame the cashier of course. It's not her fault if some marketing director dictates that calling customers by their first name is somehow good for business. She, too, seemed slightly embarrassed at having to use a stranger's name in this artificial fashion. Her employer must have paid a couple million dollars to a clutch of corporate consultants who constructed a data-driven vision of the average shopper's longing for the warmth and authenticity of the lost mom-and-pop grocery store.
I guess I'm an anti-egalitarian prig to bristle at being called "Thomas" by anyone besides my mother. Well, whatever. It's not like I'll stop shopping there. But I still wonder how even the most obtuse executive can fail to visualize the impropriety of a 17-year-old girl saying to a distinguished elderly lady, "Hello, Miriam, did you find everything you wanted?"
It would be wrong to chastise her, but how should I respond? It's one thing to make employees pretend to love their minimum-wage jobs, but it seems categorically different to demand feigned bonhomie from customers. More and more, life in the United States is taking on the Orwellian qualities of Disneyworld, where you must pretend to be happy or risk being harassed by fascists in bunny costumes.
That was the world of entry-level corporate culture from which I hoped to escape by going to graduate school. I tried the business path, but like Holden Caulfield armed with a B.A. in English (and familiarity with Sinclair Lewis), I couldn't quite participate in the earnest phoniness of corporate sales meetings. All around us were framed posters of people who were newly happy -- leaping into the air with joy -- because they had bought our products. There was no "reality," we were told; sales were manifested by the will to believe. If you didn't have faith in the product, then the customer wouldn't have faith in you. You were really selling yourself. Sales figures confirmed the sanctification of true believers. It was Puritanism via William James via L. Ron Hubbard.
Meanwhile, I daydreamed about professors who didn't have to wear suits and tassel loafers. They didn't have to glad-hand prospects. They didn't have to play golf and memorize sports statistics. They didn't have to pretend to believe in things that were manifestly untrue. They could perform their jobs without psychotic smiles pasted on their faces. They could have authentic relationships, and they were regarded with sincere respect by admiring students.
Or so I liked to believe.
Flash forward 15 years, and I find myself a professor of English at a small, liberal-arts college. I can work in semi-casual clothes. I don't play golf or care much about football. And I get to teach subjects using the closest approximation of complex truth I can muster. But there are still moments of awkwardness, and, like my experience buying groceries, those moments often involve forms of address.
I'm walking across campus with a female faculty member, and a student greets us as we pass, "Hello, Valerie. Hello, Professor Benton."
I immediately wondered whether the student had made an offensive distinction in perceived status based on gender. But my colleague didn't seem to take it as such. Did that young woman know my colleague in a social context? Or does my colleague simply encourage some students to call her by her first name? I don't actually tell students what to call me. But am I a stuffed shirt and a stick-in-the-mud because I think it's best for students to address professors formally -- at least until after graduation?
How would it be if a judge began his proceedings with, "Sit down, folks, and call me Bob"? What if he asked the prosecutor to call him "Bob" but made the defense address him formally?
How can I give a C to someone who is close enough to me to use my first name?
Does teacher-student informality reflect the efflorescence of liberal egalitarianism, which I like to think I support? Or, is it yet another symptom of the corporatization of academic culture, which I like to think I oppose?
The adjunct's informality with students has the air of resignation; it's subordination to student-customers rather than a reflection of an egalitarian spirit: "Hi, I'm Thomas, and I'll be serving you Nietzsche this evening. And, remember, in this class, everyone is a winner!" [big smile].
These days about two-thirds of humanities teachers -- part-time, transient, cowed -- do not have the institutional backing to command much respect from their students, even if they have prestigious doctorates and long lists of refereed publications. Their job is not to be authorities so much as it is to get high scores on performance evaluations completed by large numbers of students. And, since the majority of students take their classes with the minimum-wage cashiers of the academic world, it must seem strange to be expected to refer to one or two privileged faculty members as "Professor." It's like calling someone "your royal highness."
Are we surrendering our titles in the spirit of equality? Or are we doing this because authority is steadily diminishing, and we can at least prolong our decline by making virtues of our necessities? "I'm not a contemptible, underpaid teacher of a dying discipline. You see, I believe in student-centered education. I can make them buy my product with a shoeshine and a smile."
I suppose, on some level, my reaction against being called by my first name has origins in my working-class background. I can remember, as a young child, hearing a salesman describe my father as "that guy" instead of as "the gentleman," which he had just applied to another man. How did he make that distinction? Who was I if I was "that guy's" son?
Can upper-middle-class students see right through me? Do my mannerisms give me away? Do they instinctively think I should be laying pipe instead of teaching literature? "Thomas, would you please bring the car around?" My feelings are similar to those of minority faculty members, who also tend to maintain formality in their relations with students. The use of "Professor" has helped me to maintain a professional identity that is distinct from my inner self who sometimes feels like an imposter.
I remember I put "Ph.D." on the heading of my CV within an hour after I submitted my dissertation for binding. To this day, my framed degrees hang on the wall of my office, though perhaps a few of my colleagues regard that as a déclassé affectation, as I increasingly do myself.
And I guess I am still a little sensitive to slights, such as when a student submits a paper with just "Benton" in the heading, with no "Dr." or "Prof." before it. It's not a big deal, and I don't lower their grade for it. But little, inadvertent discourtesies -- even when they are intended as signs of friendliness -- can play on my lingering insecurities.
For the most part, I find that comfortable professional relationships are more likely to exist between people of drastically different ranks when there is no ambiguity about relative status and when the senior person makes no distinction among subordinates. I know from experience that relationships between teachers and students can easily founder on confusion over modes of address.
That was the case with me and my adviser during the first two years of graduate school. I called him "professor" with enthusiasm; he was the smartest man I had ever met. After my qualifying exams, he asked me to call him by his first name. And, paradoxically, that seemed to mark a decline in the cordiality of our relationship. Using his first name assumed a familiarity that his relative experience and ability -- to say nothing of his almost absolute power over my career -- could not allow without a feeling of dishonesty akin to my experience in the megastore.
The reality of my institutional subordination ran against the grain of his benevolent desire to signify my progress toward collegial stature. And my desire to be friends with an eminent personage conflicted with the fact that he was, ultimately, my boss.
I suppose, like my adviser, as I get older and more confident, I find it decreasingly necessary to insist on the formalities my younger self wanted -- and still, to some degree, wants -- from his students. When students who are now half my age call me "professor," it no longer has the little spin of irony -- perhaps imagined -- that it had when I was a twenty-something adjunct faculty member. I think some of my students are starting to have trouble thinking of me as "Thomas," just as I had -- and have -- trouble addressing my old adviser by his first name.
Ultimately, I am not opposed to informality as a matter of principle in every context. I want to say that courtesy in language is comparable to appropriateness in attire: Clothing should not distract from the person, and titles should not draw attention away from the content of a conversation. Forms of address should just seem natural; they should develop organically like a well-ordered society.
But reassuring nostrums such as those -- direct from Edmund Burke via Dress for Success -- do not make our choices as faculty members and students less complicated in reality. Our identities are moving targets, and we live in a time of transition in which we often find ourselves playing the academic name game according to different rules. Our common desire for honest relationships -- for the mom-and-pop era of higher education (if it ever existed) -- makes us want to feel comfortable, authentic, and not phony, even when our contradictory institutional contexts and our complicated personal histories make that impossible, or at least fleeting and fraught with confusion.
But, if you want to be safe, just don't call me "Thomas" unless I ask you to, or you are my mother.
Thomas H. Benton is the pseudonym of an assistant professor of English at a Midwestern liberal-arts college. He writes about academic culture and the tenure track and welcomes reader mail directed to his attention at careers@chronicle.com
By THOMAS H. BENTON
"Hello, Thomas, did you find everything you wanted?"
Silence, five seconds.
"Thomas?"
I don't blame the cashier of course. It's not her fault if some marketing director dictates that calling customers by their first name is somehow good for business. She, too, seemed slightly embarrassed at having to use a stranger's name in this artificial fashion. Her employer must have paid a couple million dollars to a clutch of corporate consultants who constructed a data-driven vision of the average shopper's longing for the warmth and authenticity of the lost mom-and-pop grocery store.
I guess I'm an anti-egalitarian prig to bristle at being called "Thomas" by anyone besides my mother. Well, whatever. It's not like I'll stop shopping there. But I still wonder how even the most obtuse executive can fail to visualize the impropriety of a 17-year-old girl saying to a distinguished elderly lady, "Hello, Miriam, did you find everything you wanted?"
It would be wrong to chastise her, but how should I respond? It's one thing to make employees pretend to love their minimum-wage jobs, but it seems categorically different to demand feigned bonhomie from customers. More and more, life in the United States is taking on the Orwellian qualities of Disneyworld, where you must pretend to be happy or risk being harassed by fascists in bunny costumes.
That was the world of entry-level corporate culture from which I hoped to escape by going to graduate school. I tried the business path but like Holden Caulfield armed with B.A. in English (and familiarity with Sinclair Lewis), I couldn't quite participate in the earnest phoniness of corporate sales meetings. All around us were framed posters of people who were newly happy -- leaping into the air with joy -- because they had bought our products. There was no "reality," we were told; sales were manifested by the will to believe. If you didn't have faith in the product, then the customer wouldn't have faith in you. You were really selling yourself. Sales figures confirmed the sanctification of true believers. It was Puritanism via William James via L. Ron Hubbard.
Meanwhile, I daydreamed about professors who didn't have to wear suits and tassel loafers. They didn't have to glad hand prospects. They didn't have to play golf and memorize sports statistics. They didn't have to pretend to believe in things that were manifestly untrue. They could perform their jobs without psychotic smiles pasted on their faces. They could have authentic relationships, and they were regarded with sincere respect by admiring students.
Or so I liked to believe.
Flash forward 15 years, and I find myself a professor of English at a small, liberal-arts college. I can work in semi-casual clothes. I don't play golf or care much about football. And I get to teach subjects using the closest approximation of complex truth I can muster. But there are still moments of awkwardness, and, like my experience buying groceries, those moments often involve forms of address.
I'm walking across campus with a female faculty member, and a student greets us as we pass, "Hello, Valerie. Hello, Professor Benton."
I immediately wondered whether the student had made an offensive distinction in perceived status based on gender? But my colleague didn't seem to take it as such. Did that young woman know my colleague in a social context? Or does my colleague simply encourage some students to call her by her first name? I don't actually tell students what to call me. But am I a stuffed shirt and a stick-in-the-mud because I think it's best for students to address professors formally -- at least until after graduation?
How would it be if a judge began his proceedings with, "Sit down, folks, and call me Bob"? What if he asked the prosecutor to call him "Bob" but made the defense address him formally?
How can I give a C to someone who is close enough to me to use my first name?
Does teacher-student informality reflect the efflorescence of liberal egalitarianism, which I like to think I support? Or, is it yet another symptom of the corporatization of academic culture, which I like to think I oppose?
The adjunct's informality with students has the air of resignation; it's subordination to student-customers rather than a reflection of an egalitarian spirit: "Hi, I'm Thomas, and I'll be serving you Nietzsche this evening. And, remember, in this class, everyone is a winner!" [big smile].
These days about two-thirds of humanities teachers -- part-time, transient, cowed -- do not have the institutional backing to command much respect from their students, even if they have prestigious doctorates and long lists of refereed publications. Their job is not to be authorities so much as it is to get high scores on performance evaluations completed by large number of students. And, since the majority of students take their classes with the minimum-wage cashiers of the academic world, it must seem strange to be expected to refer to one or two privileged faculty members as "Professor." It's like calling someone "your royal highness."
Are we surrendering our titles in the spirit of equality? Or are we doing this because authority is steadily diminishing, and we can at least prolong our decline by making virtues of our necessities? "I'm not a contemptible, underpaid teacher of a dying discipline. You see, I believe in student-centered education. I can make them buy my product with a shoeshine and a smile."
I suppose, on some level, my reaction against being called by my first name has origins in my working-class background. I can remember, as a young child, hearing a salesman describe my father as "that guy" instead of as "the gentleman," which he had just applied to another man. How did he make that distinction? Who was I if I was "that guy's" son?
Can upper-middle-class students see right through me? Do my mannerisms give me away? Do they instinctively think I should be laying pipe instead of teaching literature? "Thomas, would you please bring the car around?" My feelings are similar to those of minority faculty members, who also tend to maintain formality in their relations with students. The use of "Professor" has helped me to maintain a professional identity that is distinct from my inner self who sometimes feels like an imposter.
I remember I put "Ph.D." on the heading of my CV within an hour after I submitted my dissertation for binding. To this day, my framed degrees hang on the wall of my office, though perhaps a few of my colleagues regard that as a déclassé affectation, as I increasingly do myself.
And I guess I am still a little sensitive to slights, such as when a student submits a paper with just "Benton" in the heading without being preceded by "Dr." or "Prof." It's not a big deal, and I don't lower their grade for it. But little, inadvertent discourtesies -- even when they are intended as signs of friendliness -- can play on my lingering insecurities.
For the most part, I find that comfortable professional relationships are more likely to exist between people of drastically different ranks when there is no ambiguity about relative status and when the senior person makes no distinction among subordinates. I know from experience that relationships between teachers and students can easily founder on confusion over modes of address.
That was the case with me and my adviser during the first two years of graduate school. I called him "professor" with enthusiasm; he was the smartest man I had ever met. After my qualifying exams, he asked me to call him by his first name. And, paradoxically, that seemed to mark a decline in the cordiality of our relationship. Using his first name assumed a familiarity that his relative experience and ability -- to say nothing of his almost absolute power over my career -- could not allow without a feeling of dishonesty akin to my experience in the megastore.
The reality of my institutional subordination ran against the grain of his benevolent desire to signify my progress toward collegial stature. And my desire to be friends with an eminent personage conflicted with the fact that he was, ultimately, my boss.
I suppose that, like my adviser, as I get older and more confident, I find it less and less necessary to insist on the formalities my younger self wanted -- and still, to some degree, wants -- from his students. When students who are now half my age call me "professor," it no longer has the little spin of irony -- perhaps imagined -- that it had when I was a twenty-something adjunct faculty member. I think some of my students are starting to have trouble thinking of me as "Thomas," just as I had -- and have -- trouble addressing my old adviser by his first name.
Ultimately, I am not opposed to informality as a matter of principle in every context. I want to say that courtesy in language is comparable to appropriateness in attire: Clothing should not distract from the person, and titles should not draw attention away from the content of a conversation. Forms of address should just seem natural; they should develop organically like a well-ordered society.
But reassuring nostrums such as those -- direct from Edmund Burke via Dress for Success -- do not make our choices as faculty members and students less complicated in reality. Our identities are moving targets, and we live in a time of transition in which we often find ourselves playing the academic name game according to different rules. Our common desire for honest relationships -- for the mom-and-pop era of higher education (if it ever existed) -- makes us want to feel comfortable, authentic, and not phony, even when our contradictory institutional contexts and our complicated personal histories make that impossible, or at least fleeting and fraught with confusion.
But, if you want to be safe, just don't call me "Thomas" unless I ask you to, or you are my mother.
Thomas H. Benton is the pseudonym of an assistant professor of English at a Midwestern liberal-arts college. He writes about academic culture and the tenure track and welcomes reader mail directed to his attention at careers@chronicle.com.
Saturday, January 14
Covering, Assimilation, and Civil Rights
The Pressure to Cover
By KENJI YOSHINO
January 15, 2006, NYT Magazine
When I began teaching at Yale Law School in 1998, a friend spoke to me frankly. "You'll have a better chance at tenure," he said, "if you're a homosexual professional than if you're a professional homosexual." Out of the closet for six years at the time, I knew what he meant. To be a "homosexual professional" was to be a professor of constitutional law who "happened" to be gay. To be a "professional homosexual" was to be a gay professor who made gay rights his work. Others echoed the sentiment in less elegant formulations. Be gay, my world seemed to say. Be openly gay, if you want. But don't flaunt.
I didn't experience the advice as antigay. The law school is a vigorously tolerant place, embedded in a university famous for its gay student population. (As the undergraduate jingle goes: "One in four, maybe more/One in three, maybe me/One in two, maybe you.") I took my colleague's words as generic counsel to leave my personal life at home. I could see that research related to one's identity - referred to in the academy as "mesearch" - could raise legitimate questions about scholarly objectivity.
I also saw others playing down their outsider identities to blend into the mainstream. Female colleagues confided that they would avoid references to their children at work, lest they be seen as mothers first and scholars second. Conservative students asked for advice about how open they could be about their politics without suffering repercussions at some imagined future confirmation hearing. A religious student said he feared coming out as a believer, as he thought his intellect would be placed on a 25 percent discount. Many of us, it seemed, had to work our identities as well as our jobs.
It wasn't long before I found myself resisting the demand to conform. What bothered me was not that I had to engage in straight-acting behavior, much of which felt natural to me. What bothered me was the felt need to mute my passion for gay subjects, people, culture. At a time when the law was transforming gay rights, it seemed ludicrous not to suit up and get in the game.
"Mesearch" being what it is, I soon turned my scholarly attention to the pressure to conform. What puzzled me was that I felt that pressure so long after my emergence from the closet. When I stopped passing, I exulted that I could stop thinking about my sexuality. This proved naïve. Long after I came out, I still experienced the need to assimilate to straight norms. But I didn't have a word for this demand to tone down my known gayness.
Then I found my word, in the sociologist Erving Goffman's book "Stigma." Written in 1963, the book describes how various groups - including the disabled, the elderly and the obese - manage their "spoiled" identities. After discussing passing, Goffman observes that "persons who are ready to admit possession of a stigma ... may nonetheless make a great effort to keep the stigma from looming large." He calls this behavior covering. He distinguishes passing from covering by noting that passing pertains to the visibility of a characteristic, while covering pertains to its obtrusiveness. He relates how F.D.R. stationed himself behind a desk before his advisers came in for meetings. Roosevelt was not passing, since everyone knew he used a wheelchair. He was covering, playing down his disability so people would focus on his more conventionally presidential qualities.
As is often the case when you learn a new idea, I began to perceive covering everywhere. Leafing through a magazine, I read that Helen Keller replaced her natural eyes (one of which protruded) with brilliant blue glass ones. On the radio, I heard that Margaret Thatcher went to a voice coach to lower the pitch of her voice. Friends began to send me e-mail. Did I know that Martin Sheen was Ramon Estevez on his birth certificate, that Ben Kingsley was Krishna Bhanji, that Kirk Douglas was Issur Danielovitch Demsky and that Jon Stewart was Jonathan Leibowitz?
In those days, spotting instances of covering felt like a parlor game. It's hard to get worked up about how celebrities and politicians have to manage their public images. Jon Stewart joked that he changed his name because Leibowitz was "too Hollywood," and that seemed to get it exactly right. My own experience with covering was also not particularly difficult - once I had the courage to write from my passions, I was immediately embraced.
It was only when I looked for instances of covering in the law that I saw how lucky I had been. Civil rights case law is peopled with plaintiffs who were severely punished for daring to be openly different. Workers were fired for lapsing into Spanish in English-only workplaces, women were fired for behaving in stereotypically "feminine" ways and gay parents lost custody of their children for engaging in displays of same-sex affection. These cases revealed that far from being a parlor game, covering was the civil rights issue of our time.
The New Discrimination
In recent decades, discrimination in America has undergone a generational shift. Discrimination was once aimed at entire groups, resulting in the exclusion of all racial minorities, women, gays, religious minorities and people with disabilities. A battery of civil rights laws - like the Civil Rights Act of 1964 and the Americans with Disabilities Act of 1990 - sought to combat these forms of discrimination. The triumph of American civil rights is that such categorical exclusions by the state or employers are now relatively rare.
Now a subtler form of discrimination has risen to take its place. This discrimination does not aim at groups as a whole. Rather, it aims at the subset of the group that refuses to cover, that is, to assimilate to dominant norms. And for the most part, existing civil rights laws do not protect individuals against such covering demands. The question of our time is whether we should understand this new discrimination to be a harm and, if so, whether the remedy is legal or social in nature.
Consider the following cases:
• Renee Rogers, an African-American employee at American Airlines, wore cornrows to work. American had a grooming policy that prevented employees from wearing an all-braided hairstyle. When American sought to enforce this policy against Rogers, she filed suit, alleging race discrimination. In 1981, a federal district court rejected her argument. It first observed that cornrows were not distinctively associated with African-Americans, noting that Rogers had only adopted the hairstyle after it "had been popularized by a white actress in the film '10.' " As if recognizing the unpersuasiveness of what we might call the Bo Derek defense, the court further alleged that because hairstyle, unlike skin color, was a mutable characteristic, discrimination on the basis of grooming was not discrimination on the basis of race. Renee Rogers lost her case.
• Lydia Mikus and Ismael Gonzalez were called for jury service in a case involving a defendant who was Latino. When the prosecutor asked them whether they could speak Spanish, they answered in the affirmative. The prosecutor struck them, and the defense attorney then brought suit on their behalf, claiming national-origin discrimination. The prosecutor responded that he had not removed the potential jurors for their ethnicity but for their ability to speak Spanish. His stated concern was that they would not defer to the court translator in listening to Spanish-language testimony. In 1991, the Supreme Court credited this argument. Lydia Mikus and Ismael Gonzalez lost their case.
• Diana Piantanida had a child and took a maternity leave from her job at the Wyman Center, a charitable organization in Missouri. During her leave, she was demoted, supposedly for previously having handed in work late. The man who was then the Wyman Center's executive director, however, justified her demotion by saying the new position would be easier "for a new mom to handle." As it turned out, the new position had less responsibility and half the pay of the original one. But when Piantanida turned this position down, her successor was paid Piantanida's old salary. Piantanida brought suit, claiming she had been discharged as a "new mom." In 1997, a federal appellate court refused to analyze her claim as a sex-discrimination case, which would have led to comparing the treatment she received to the treatment of "new dads." Instead, it found that Piantanida's (admittedly vague) pleadings raised claims only under the Pregnancy Discrimination Act, which it correctly interpreted to protect women only while they are pregnant. Diana Piantanida lost her case.
• Robin Shahar was a lesbian attorney who received a job offer from the Georgia Department of Law, where she had worked as a law student. The summer before she started her new job, Shahar had a religious same-sex commitment ceremony with her partner. She asked a supervisor for a late starting date because she was getting married and wanted to go on a celebratory trip to Greece. Believing Shahar was marrying a man, the supervisor offered his congratulations. Senior officials in the office soon learned, however, that Shahar's partner was a woman. This news caused a stir, reports of which reached Michael Bowers, the attorney general of Georgia who had successfully defended his state's prohibition of sodomy before the United States Supreme Court. After deliberating with his lawyers, Bowers rescinded her job offer. The staff member who informed her read from a script, concluding, "Thanks again for coming in, and have a nice day." Shahar brought suit, claiming discrimination on the basis of sexual orientation. In court, Bowers testified that he knew Shahar was gay when he hired her, and would never have terminated her for that reason. In 1997, a federal appellate court accepted that defense, maintaining that Bowers had terminated Shahar on the basis of her conduct, not her status. Robin Shahar lost her case.
• Simcha Goldman, an Air Force officer who was also an ordained rabbi, wore a yarmulke at all times. Wearing a yarmulke is part of the Orthodox tradition of covering one's head out of deference to an omnipresent god. Goldman's religious observance ran afoul of an Air Force regulation that prohibited wearing headgear while indoors. When he refused his commanding officer's order to remove his yarmulke, Goldman was threatened with a court martial. He brought a First Amendment claim, alleging discrimination on the basis of religion. In 1986, the Supreme Court rejected his claim. It stated that the Air Force had drawn a reasonable line between "religious apparel that is visible and that which is not." Simcha Goldman lost his case.
These five cases represent only a fraction of those in which courts have refused to protect plaintiffs from covering demands. In such cases, the courts routinely distinguish between immutable and mutable traits, between being a member of a legally protected group and behavior associated with that group. Under this rule, African-Americans cannot be fired for their skin color, but they could be fired for wearing cornrows. Potential jurors cannot be struck for their ethnicity but can be struck for speaking (or even for admitting proficiency in) a foreign language. Women cannot be discharged for having two X chromosomes but can be penalized (in some jurisdictions) for becoming mothers. Although the weaker protections for sexual orientation mean gays can sometimes be fired for their status alone, they will be much more vulnerable if they are perceived to "flaunt" their sexuality. Jews cannot be separated from the military for being Jewish but can be discharged for wearing yarmulkes.
This distinction between being and doing reflects a bias toward assimilation. Courts will protect traits like skin color or chromosomes because such traits cannot be changed. In contrast, the courts will not protect mutable traits, because individuals can alter them to fade into the mainstream, thereby escaping discrimination. If individuals choose not to engage in that form of self-help, they must suffer the consequences.
The judicial bias toward assimilation will seem correct and just to many Americans. Assimilation, after all, is a precondition of civilization - wearing clothes, having manners and obeying the law are all acts of assimilation. Moreover, the tie between assimilation and American civilization may be particularly strong. At least since Hector St. John de Crèvecoeur's 1782 "Letters from an American Farmer," this country has promoted assimilation as the way Americans of different backgrounds would be "melted into a new race of men." By the time Israel Zangwill's play "The Melting Pot" made its debut in 1908, the term had acquired the burnish of an American ideal. Theodore Roosevelt, who believed hyphenations like "Polish-American" were a "moral treason," is reputed to have yelled, "That's a great play!" from his box when it was performed in Washington. (He was wrong - it's no accident the title has had a longer run than the play.) And notwithstanding challenges beginning in the 1960's to move "beyond the melting pot" and to "celebrate diversity," assimilation has never lost its grip on the American imagination.
If anything, recent years have seen a revival of the melting-pot ideal. We are currently experiencing a pluralism explosion in the United States. Patterns of immigration since the late 1960's have made the United States the most religiously various country in the history of the world. Even when the demographics of a group - like the number of individuals with disabilities - are presumably constant, the number of individuals claiming membership in that group may grow exponentially. In 1970, there were 9 disability-related associations listed in the Encyclopedia of Associations; in 1980, there were 16; in 1990, there were 211; and in 2000, there were 799. The boom in identity politics has led many thoughtful commentators to worry that we are losing our common culture as Americans. Fearful that we are breaking apart into balkanized fiefs, even liberal lions like Arthur Schlesinger have called for a recommitment to the ethic of assimilation.
Beyond keeping pace with the culture, the judiciary has institutional reasons for encouraging assimilation. In the yarmulke case, the government argued that ruling in favor of the rabbi's yarmulke would immediately invite suits concerning the Sikh's turban, the yogi's saffron robes and the Rastafarian's dreadlocks. Because the courts must articulate principled grounds for their decisions, they are particularly ill equipped to protect some groups but not others in an increasingly diverse society. Seeking to avoid judgments about the relative worth of groups, the judiciary has decided instead to rely on the relatively uncontroversial principle of protecting immutable traits.
Viewed in this light, the judiciary's failure to protect individuals against covering demands seems eminently reasonable. Unfortunately, it also represents an abdication of its responsibility to protect civil rights.
The Case Against Assimilation
The flaw in the judiciary's analysis is that it casts assimilation as an unadulterated good. Assimilation is implicitly characterized as the way in which groups can evade discrimination by fading into the mainstream - after all, the logic goes, if a bigot cannot discriminate between two individuals, he cannot discriminate against one of them. But sometimes assimilation is not an escape from discrimination, but precisely its effect. When a Jew is forced to convert to Protestantism, for instance, we do not celebrate that as an evasion of anti-Semitism. We should not blind ourselves to the dark underbelly of the American melting pot.
Take the cornrows case. Initially, this case appears to be an easy one for the employer, as hairstyle seems like such a trivial thing. But if hair is so trivial, we might ask why American Airlines made it a condition of Renee Rogers's employment. What's frustrating about the employment discrimination jurisprudence is that courts often don't force employers to answer the critical question of why they are requiring employees to cover. If we look to other sources, the answers can be troubling.
John T. Molloy's perennially popular self-help manual "New Dress for Success" also tells racial minorities to cover. Molloy advises African-Americans to avoid "Afro hairstyles" and to wear "conservative pinstripe suits, preferably with vests, accompanied by all the establishment symbols, including the Ivy League tie." He urges Latinos to "avoid pencil-line mustaches," "any hair tonic that tends to give a greasy or shiny look to the hair," "any articles of clothing that have Hispanic associations" and "anything that is very sharp or precise."
Molloy is equally frank about why covering is required. The "model of success," he says, is "white, Anglo-Saxon and Protestant." Those who do not possess these traits "will elicit a negative response to some degree, regardless of whether that response is conscious or subconscious." Indeed, Molloy says racial minorities must go "somewhat overboard" to compensate for immutable differences from the white mainstream. After conducting research on African-American corporate grooming, Molloy reports that "blacks had not only to dress more conservatively but also more expensively than their white counterparts if they wanted to have an equal impact."
Molloy's basic point is supported by social-science research. The economists Marianne Bertrand and Sendhil Mullainathan recently conducted a study in which they sent out résumés that were essentially identical except for the names at the top. They discovered that résumés with white-sounding names like Emily Walsh or Greg Baker drew 50 percent more callbacks than those with African-American-sounding names like Lakisha Washington or Jamal Jones. So it seems that even when Americans have collectively set our faces against racism, we still react negatively to cultural traits - like hairstyles, clothes or names - that we associate with historically disfavored races.
We can see a similar dynamic in the termination of Robin Shahar. Michael Bowers, the state attorney general, disavowed engaging in first-generation discrimination when he said he had no problem with gay employees. This raises the question of why he fired Shahar for having a religious same-sex commitment ceremony. Unlike American Airlines, Bowers provided some answers. He argued that retaining Shahar would compromise the department's ability to deny same-sex couples marriage licenses and to enforce sodomy statutes.
Neither argument survives scrutiny. At no point did Shahar seek to marry her partner legally, nor did she agitate for the legalization of same-sex marriage. The Georgia citizenry could not fairly have assumed that Shahar's religious ceremony would entitle the couple to a civil license. Bowers's claim that Shahar's wedding would compromise her ability to enforce sodomy statutes is also off the mark. Georgia's sodomy statute (which has since been struck down) punished cross-sex as well as same-sex sodomy, meaning that any heterosexual in the department who had ever had oral sex was as compromised as Shahar.
Stripped of these rationales, Bowers's termination of Shahar looks more sinister. When she told a supervisor she was getting married, he congratulated her. When he discovered she was marrying a woman, it wasn't long before she no longer had a job. Shahar's religious ceremony was not in itself indiscreet; cross-sex couples engage in such ceremonies all the time. If Shahar was flaunting anything, it was her belief in her own equality: her belief that she, and not the state, should determine what personal bonds are worthy of celebration.
The demand to cover is anything but trivial. It is the symbolic heartland of inequality - what reassures one group of its superiority to another. When dominant groups ask subordinated groups to cover, they are asking them to be small in the world, to forgo prerogatives that the dominant group has and therefore to forgo equality. If courts make critical goods like employment dependent on covering, they are legitimizing second-class citizenship for the subordinated group. In doing so, they are failing to vindicate the promise of civil rights.
So the covering demand presents a conundrum. The courts are right to be leery of intervening in too brusque a manner here, as they cannot risk playing favorites among groups. Yet they also cannot ignore the fact that the covering demand is where many forms of inequality continue to have life. We need a paradigm that gives both these concerns their due, adapting the aspirations of the civil rights movement to an increasingly pluralistic society.
The New Civil Rights
The new civil rights begins with the observation that everyone covers. When I lecture on covering, I often encounter what I think of as the "angry straight white man" reaction. A member of the audience, almost invariably a white man, almost invariably angry, denies that covering is a civil rights issue. Why shouldn't racial minorities or women or gays have to cover? These groups should receive legal protection against discrimination for things they cannot help. But why should they receive protection for behaviors within their control - wearing cornrows, acting "feminine" or flaunting their sexuality? After all, the questioner says, I have to cover all the time. I have to mute my depression, or my obesity, or my alcoholism, or my shyness, or my working-class background or my nameless anomie. I, too, am one of the mass of men leading lives of quiet desperation. Why should legally protected groups have a right to self-expression I do not? Why should my struggle for an authentic self matter less?
I surprise these individuals when I agree. Contemporary civil rights has erred in focusing solely on traditional civil rights groups - racial minorities, women, gays, religious minorities and people with disabilities. This assumes those in the so-called mainstream - those straight white men - do not also cover. They are understood only as obstacles, as people who prevent others from expressing themselves, rather than as individuals who are themselves struggling for self-definition. No wonder they often respond to civil rights advocates with hostility. They experience us as asking for an entitlement they themselves have been refused - an expression of their full humanity.
Civil rights must rise into a new, more inclusive register. That ascent makes use of the recognition that the mainstream is a myth. With respect to any particular identity, the word "mainstream" makes sense, as in the statement that straights are more mainstream than gays. Used generically, however, the word loses meaning. Because human beings hold many identities, the mainstream is a shifting coalition, and none of us are entirely within it. It is not normal to be completely normal.
This does not mean discrimination against racial minorities is the same as discrimination against poets. American civil rights law has correctly directed its concern toward certain groups and not others. But the aspiration of civil rights - the aspiration that we be free to develop our human capacities without the impediment of witless conformity - is an aspiration that extends beyond traditional civil rights groups.
To fulfill that aspiration, we must think differently both within the law and outside it. With respect to legal remedies, we must shift away from claims that demand equality for particular groups toward claims that demand liberty for us all. This is not an exhortation that we strip protections from currently recognized groups. Rather, it is a prediction that future courts will be unable to sustain a group-based vision of civil rights when faced with the broad and irreversible trend toward demographic pluralism. In an increasingly diverse society, the courts must look to what draws us together as citizens rather than to what drives us apart.
As if in recognition of that fact, the Supreme Court has moved in recent years away from extending protections on the basis of group membership and toward doing so on the basis of liberties we all possess. In 2003, the court struck down a Texas statute that prohibited same-sex sodomy. It did not, however, frame the case as one concerning the equality rights of gays. Instead, it cast the case as one concerning the interest we all - straight, gay or otherwise - have in controlling our intimate lives. Similarly, in 2004, the court held that a state could be required by a Congressional statute to make its courthouses wheelchair accessible. Again, the court ruled in favor of the minority group without framing its analysis in group-based equality rhetoric. Rather, it held that all people - disabled or otherwise - have a "right of access to the courts," which had been denied in that instance.
In these cases, the court implicitly acknowledged the national exhaustion with group-based identity politics and quieted the anxiety about pluralism that is driving us back toward the assimilative ideal. By emphasizing the interest all individuals have in our own liberty, the court focused on what unites us rather than on what divides us. While preserving the distinction between being and doing, the court decided to protect doing in its own right.
If the Supreme Court protects individuals against covering demands in the future, I believe it will do so by invoking the universal rights of people. I predict that if the court ever recognizes the right to speak a native language, it will protect that right as a liberty to which we are all entitled, rather than as a remedial concession granted to a particular national-origin group. If the court recognizes rights to grooming, like the right to wear cornrows, I believe it will do so under something akin to the German Constitution's right to personality rather than as a right attached to racial minorities. And I hope that if the court protects the right of gays to marry, it will do so by framing it as the right we all have to marry the person we love, rather than defending "gay marriage" as if it were a separate institution.
A liberty-based approach to civil rights, of course, brings its own complications, beginning with the question of where my liberty ends and yours begins. But the ability of liberty analysis to illuminate our common humanity should not be underestimated. This virtue persuaded both Martin Luther King Jr. and Malcolm X to argue for the transition from civil rights to human rights at the ends of their lives. It is time for American law to follow suit.
While I have great hopes for this new legal paradigm, I also believe law will play a relatively small part in the new civil rights. A doctor friend told me that in his first year of medical school, his dean described how doctors were powerless to cure the vast majority of human ills. People would get better, or they would not, but it would not be doctors who would cure them. Part of becoming a doctor, the dean said, was to surrender a layperson's awe for medical authority. I wished then that someone would give an analogous lecture to law students and to Americans at large. My education in law has been in no small part an education in its limitations.
As an initial matter, many covering demands are made by actors the law does not - and in my view should not - hold accountable, like friends, family, neighbors, the "culture" or individuals themselves. When I think of the covering demands I have experienced, I can trace many of them only to my own censorious consciousness. And while I am often tempted to sue myself, I recognize this is not my healthiest impulse.
Law is also an incomplete solution to coerced assimilation because it has yet to recognize the myriad groups that are subjected to covering demands even though these groups cannot be defined by traditional classifications like race, sex, orientation, religion and disability. Whenever I speak about covering, I receive new instances of identities that can be covered. The law may someday move to protect some of these identities. But it will never protect them all.
For these and other reasons, I am troubled that Americans seem increasingly inclined to turn toward the law to do the work of civil rights precisely when they should be turning away from it. The primary solution lies in all of us as citizens, not in the tiny subset of us who are lawyers. People confronted with demands to cover should feel emboldened to seek a reason for that demand, even if the law does not reach the actors making the demand or recognize the group burdened by it. These reason-forcing conversations should happen outside courtrooms - in public squares and prayer circles, in workplaces and on playgrounds. They should occur informally and intimately, in the everyday places where tolerance is made and unmade.
What will constitute a good-enough reason to justify assimilation will obviously be controversial. We have come to some consensus that certain reasons are illegitimate - like racism, sexism or religious intolerance. Beyond that, we should expect conversations rather than foreordained results - what reasons count, and for what purposes, will be for us all to decide by facing one another as citizens. My personal inclination is always to privilege the claims of the individual against countervailing interests like "neatness" or "workplace harmony." But we should have that conversation.
Such conversations are the best - and perhaps the only - way to give both assimilation and authenticity their due. They will help us alleviate conservative alarmists' fears of a balkanized America and radical multiculturalists' fears of a monocultural America. The aspiration of civil rights has always been to permit people to pursue their human flourishing without limitations based on bias. Focusing on law prevents us from seeing the revolutionary breadth of that aspiration. It is only when we leave the law that civil rights suddenly stops being about particular agents of oppression and particular victimized groups and starts to become a project of human flourishing in which we all have a stake.
I don't teach classes on gay rights any more. I suspect many of my students now experience me as a homosexual professional rather than as a professional homosexual, if they think of me in such terms at all. But I don't experience myself as covering. I've just moved on to other interests, in the way scholars do. So the same behavior - not teaching gay rights - has changed in meaning over time.
This just brings home to me that the only right I have wanted with any consistency is the freedom to be who I am. I'll be the first to admit that I owe much of that freedom to group-based equality movements, like the gay rights movement. But it is now time for us as a nation to shift the emphasis away from equality and toward liberty in our debates about identity politics. Only through such freedom can we live our lives as works in progress, which is to say, as the complex, changeful and contradictory creatures that we are.
Kenji Yoshino is a professor at Yale Law School. This article is adapted from his book, "Covering: The Hidden Assault on Our Civil Rights," which will be published by Random House later this month.
As an initial matter, many covering demands are made by actors the law does not - and in my view should not - hold accountable, like friends, family, neighbors, the "culture" or individuals themselves. When I think of the covering demands I have experienced, I can trace many of them only to my own censorious consciousness. And while I am often tempted to sue myself, I recognize this is not my healthiest impulse.
Law is also an incomplete solution to coerced assimilation because it has yet to recognize the myriad groups that are subjected to covering demands even though these groups cannot be defined by traditional classifications like race, sex, orientation, religion and disability. Whenever I speak about covering, I receive new instances of identities that can be covered. The law may someday move to protect some of these identities. But it will never protect them all.
For these and other reasons, I am troubled that Americans seem increasingly inclined to turn toward the law to do the work of civil rights precisely when they should be turning away from it. The primary solution lies in all of us as citizens, not in the tiny subset of us who are lawyers. People confronted with demands to cover should feel emboldened to seek a reason for that demand, even if the law does not reach the actors making the demand or recognize the group burdened by it. These reason-forcing conversations should happen outside courtrooms - in public squares and prayer circles, in workplaces and on playgrounds. They should occur informally and intimately, in the everyday places where tolerance is made and unmade.
What will constitute a good-enough reason to justify assimilation will obviously be controversial. We have come to some consensus that certain reasons are illegitimate - like racism, sexism or religious intolerance. Beyond that, we should expect conversations rather than foreordained results - what reasons count, and for what purposes, will be for us all to decide by facing one another as citizens. My personal inclination is always to privilege the claims of the individual against countervailing interests like "neatness" or "workplace harmony." But we should have that conversation.
Such conversations are the best - and perhaps the only - way to give both assimilation and authenticity their due. They will help us alleviate conservative alarmists' fears of a balkanized America and radical multiculturalists' fears of a monocultural America. The aspiration of civil rights has always been to permit people to pursue their human flourishing without limitations based on bias. Focusing on law prevents us from seeing the revolutionary breadth of that aspiration. It is only when we leave the law that civil rights suddenly stops being about particular agents of oppression and particular victimized groups and starts to become a project of human flourishing in which we all have a stake.
I don't teach classes on gay rights any more. I suspect many of my students now experience me as a homosexual professional rather than as a professional homosexual, if they think of me in such terms at all. But I don't experience myself as covering. I've just moved on to other interests, in the way scholars do. So the same behavior - not teaching gay rights - has changed in meaning over time.
This just brings home to me that the only right I have wanted with any consistency is the freedom to be who I am. I'll be the first to admit that I owe much of that freedom to group-based equality movements, like the gay rights movement. But it is now time for us as a nation to shift the emphasis away from equality and toward liberty in our debates about identity politics. Only through such freedom can we live our lives as works in progress, which is to say, as the complex, changeful and contradictory creatures that we are.
Kenji Yoshino is a professor at Yale Law School. This article is adapted from his book, "Covering: The Hidden Assault on Our Civil Rights," which will be published by Random House later this month.
What Is a Living Wage?
NYT Magazine, January 15, 2006
By JON GERTNER
If It Happened in Baltimore, Maybe It Can Happen Anywhere
For a few weeks in the summer of 1995, Jen Kern spent her days at a table in the Library of Congress in Washington, poring over the fine print of state constitutions from around the country. This was, at the time, a somewhat-eccentric strategy to fight poverty in America. Kern was not a high-powered lawyer or politician; she was 25 and held a low-paying, policy-related job at Acorn, the national community organization. Yet to understand why living-wage campaigns matter - where they began, what they mean, and why they inspire such passion and hope - it helps to consider what Kern was doing years ago in the library, reading obscure legislation from states like Missouri and New Mexico.
A few months earlier, she and her colleagues at Acorn witnessed an energetic grass-roots campaign in Baltimore, led by a coalition of church groups and labor unions. Workers in some of Baltimore's homeless shelters and soup kitchens had noticed something new and troubling about many of the visitors coming in for meals and shelter: they happened to have full-time jobs. In response, local religious leaders successfully persuaded the City Council to raise the base pay for city contract workers to $6.10 an hour from $4.25, the federal minimum at the time. The Baltimore campaign was ostensibly about money. But to those who thought about it more deeply, it was about the force of particular moral propositions: first, that work should be rewarded, and second, that no one who works full time should have to live in poverty.
So Kern and another colleague were dispatched to find out if what happened in Baltimore could be tried - and expanded - elsewhere. As she plowed through documents, Kern was unsure whether to look for a particular law or the absence of one. Really, what she was trying to do was compile a list of places in the U.S. where citizens or officials could legally mount campaigns to raise the minimum wage above the federal standard. In other words, she needed to know if anything stood in the way, like a state regulation or court decision. What she discovered was that in many states a law more ambitious than Baltimore's - one that applied not just to city contractors but to all local businesses - seemed permissible.
Whether a wage campaign was winnable turned out to be a more complicated matter. In the late 1990's, Kern helped Acorn in a series of attempts to raise the minimum wage in Denver, Houston and Missouri. They all failed. "It wasn't even close," she says. In the past few years, though, as the federal minimum wage has remained fixed at $5.15 and the cost of living (specifically housing) has risen drastically in many regions, similar campaigns have produced so many victories (currently, 134) that Kern speaks collectively of "a widespread living-wage movement."
Santa Fe has been one of the movement's crowning achievements. This month the city's minimum wage rose to $9.50 an hour, the highest rate in the United States. But other recent victories include San Francisco in 2003 and Nevada in 2004. And if a pending bill in Chicago is any indication, the battles over wage laws will soon evolve into campaigns to force large, private-sector businesses like Wal-Mart to provide not only higher wages but also more money for employee health care.
It is a common sentiment that economic fairness - or economic justice, as living-wage advocates phrase it - should, or must, come in a sweeping and righteous gesture from the top. From Washington, that is. But most wage campaigns arise from the bottom, from residents and low-level officials and from cities and states - from everywhere except the federal government. "I think what the living-wage movement has done in the past 11 years is incredible," David Neumark, a frequent critic of the phenomenon who is a senior fellow at the Public Policy Institute of California, told me recently. "How many other issues are there where progressives have been this successful? I can't think of one."
The immediate goal for living-wage strategists is to put initiatives on the ballots in several swing states this year. If their reckoning is correct, the laws should effect a financial gain for low-income workers and boost turnout for candidates who campaign for higher wages. In Florida, a ballot initiative to raise the state's minimum wage by a dollar, to $6.15, won 71 percent of the vote in 2004, a blowout that surprised even people like Kern, who spent several weeks in Miami working on the measure. "We would like it to become a fact of political life," Kern says, "where every year the other side has to contend with a minimum-wage law in some state." Though victories like the one in Florida may have done little to help the Kerry-Edwards ticket - George Bush won 52 percent of the state's vote - Kern and some in the Democratic establishment have come to believe that the left, after years of electoral frustration, has finally found its ultimate moral-values issue. "This is what moves people to the polls now," Kern insists. "This is our gay marriage." Already, during the past few months, a coalition of grass-roots and labor organizations have begun gathering hundreds of thousands of signatures to ensure that proposed laws to increase wages are voted on in November. The first targets, Kern told me, will be Arizona, Colorado, Michigan and Ohio. Next in line, either this year or soon after, are Montana, Oklahoma and Arkansas, the home of Wal-Mart.
Does America Care About the Gap Between Rich and Poor?
I first met Kern on a sunny morning in late September in Albuquerque, a city of 470,000 that made her list when she was working in the Library of Congress 10 years ago. She was now, at age 35, campaigning for a ballot initiative that would raise the minimum wage in the city to $7.50 an hour from $5.15. There was no face for the placards, no charismatic presence to rally the troops at midnight or shake hands at dawn outside 7-Eleven. Instead, there was a number, $7.50, a troop of campaign workers to canvass the neighborhoods and an argument: that many low-wage workers were being paid poverty wages. That a full-time job at the federal minimum rate added up to $10,712 a year. That local businesses could afford the pay raise. And that it was up to the voters to restore balance.
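That annual figure is simply the federal minimum carried out over a standard full-time year - an assumption of 40 hours a week for 52 weeks, since the article does not spell out the arithmetic:

\[
\$5.15 \times 40 \times 52 = \$10{,}712
\]

which leaves a full-time worker supporting one other person roughly $2,000 short of the $12,800 poverty line for a family of two cited below.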
One of the more intriguing questions about campaigns like the one in Albuquerque, and those planned for swing states next fall, is whether they reflect a profound sense of public alarm about the divergence between rich and poor in this country. Certainly most Americans do not support higher wages out of immediate self-interest. Probably only around 3 percent of those in the work force are actually paid $5.15 or less an hour; most low-wage workers, including Wal-Mart employees, who generally start at between $6.50 and $7.50 an hour, earn more. Increasing the minimum wage to $7.25 an hour would directly affect the wages of only about 7 percent of the work force. Nevertheless, pollsters have discovered that a hypothetical state ballot measure typically generates support of around 70 percent. A recent poll by the Pew Research Center actually put the support for raising the national minimum wage to $6.45 at 86 percent. Rick Berman, a lobbyist who started the Employment Policies Institute and who is a longtime foe of living-wage laws, agrees that "the natural tendency is for people to support these things. They believe it's a free lunch." On the other hand, the electorate's reasons for crossing party lines to endorse the measures may be due to the simple fact that at least 60 percent of Americans have at one time or another been paid the minimum wage. Voters may just know precisely what they're voting for and why.
In the mid-1990's, the last time Congress raised the minimum wage, the Clinton White House was reluctant to start a war over the federal rate, according to Robert Reich, the former labor secretary. For an administration bent on policy innovation, that would have seemed "old" Democrat. "Then we did some polling and discovered that the public is overwhelmingly in favor," Reich told me recently. "At which point the White House gave the green light to Democrats in Congress." Reich, now a professor at the University of California, Berkeley, happens to view the minimum wage as a somewhat inefficient tool for alleviating poverty (compared with earned income tax credits, say). But he acknowledges that it has a powerful moral and political impact, in states red as well as blue, and especially now, in an era when workers see the social contract with their employers vanishing. "They see neighbors and friends being fired for no reason by profitable companies, executives making off like bandits while thousands of their own workers are being laid off," Reich says. "They see health insurance drying up, employer pensions shrinking. Promises to retirees of health benefits are simply thrown overboard. The whole system has aspects that seem grossly immoral to average working people." As Reich points out, whatever the minimum wage's limitations may be as a policy instrument, as an idea "it demarcates our concept of decency with regard to work."
The idea, Reich points out, isn't new, even if the recent fervor for it is. Massachusetts enacted a state minimum wage in 1912, several decades before the federal minimum wage of 25 cents an hour was adopted in 1938. And most of the wage ordinances of the past decade specifically trace their origins back to Baltimore, in 1995. After that moment, in fact, the phrase "living wage" soon caught on - or, you might say, returned. It was a popular workers' refrain in the late 19th century and was the title of a 1906 book by John Ryan, a Roman Catholic priest. In the late 1990's, a loose national network of advocates sprang up, incorporating organized labor, grass-roots groups like Acorn and the Industrial Areas Foundation and, more recently, the National Council of Churches. Legal advice often came out of the Brennan Center for Justice at New York University's law school, where a lawyer named Paul Sonn helped write wage ordinances and ballot measures for various states and cities.
By dint of its piecemeal, localized progress, the modern living-wage movement has grown without fanfare; one reason is that until recently, most of the past decade's wage laws, like Baltimore's, have been narrow in scope and modest in effect. Strictly speaking, a "living wage" law has typically required that any company receiving city contracts, and thus taxpayers' money, must pay its workers a wage far above the federal minimum, usually between $9 and $11 an hour. These regulations often apply to employees at companies to which municipalities have outsourced tasks like garbage collection, security services and home health care. Low-wage workers in the private sector - in restaurants, hotels, retail stores or the like - have been unaffected. Their pay stays the same.
In Santa Fe, the City Council passed a similar kind of wage law in 2002, raising the hourly pay for city employees and contractors. Some officials in Santa Fe, however, had decided from the start that its wage rules should ultimately be different - that the small city (population 66,000) could even serve as a test example for the rest of the U.S. Early on, several city councilors told me, they anticipated that Santa Fe - with a high cost of living, a large community of low-paid immigrants and a liberal City Council - would eventually extend its wage floor to all local businesses, private as well as public, so that every worker in the city, no matter the industry, would make more than $5.15. The initial numbers the councilors considered as they began to strategize seemed stratospheric: a living wage that began at $10 or $12 or even $14.50 an hour. For some laborers, that would constitute a raise of 200 percent or more. Nothing remotely like it existed in any other city in the country.
The Economists Are Surprised
In the years before the enactment of the federal minimum wage in the late 1930's, the country's post-Depression economy was so weak that the notion that government should leave private business to its own devices was effectively marginalized. During the past few decades, though, in the wake of a fairly robust economy, debates on raising the minimum wage have consistently resulted in a rhetorical caterwaul. While the arguments have usually been between those on the labor side, who think the minimum wage should be raised substantially, and those on the employer side, who oppose any increase, a smaller but vocal contingent has claimed, more broadly and more philosophically, that it is in the best interest of both business and labor to let the market set wages, not the politicians. And certainly not the voters.
This last position was long underpinned by the academic consensus that a rise in the minimum wage hurts employment by interfering with the flow of supply and demand. In simplest terms, most economists accepted that when government forces businesses to pay higher wages, businesses, in turn, hire fewer employees. It is a powerful argument against the minimum wage, since it suggests that private businesses as a group, along with teenagers and low-wage employees, will be penalized by a mandatory raise.
The tenor of this debate began to change in the mid-1990's following some work done by two Princeton economists, David Card (now at the University of California at Berkeley) and Alan B. Krueger. In 1992, New Jersey increased the state minimum wage to $5.05 an hour (applicable to both the public and private sectors), which gave the two young professors an opportunity to study the comparative effects of that raise on fast-food restaurants and low-wage employment in New Jersey and Pennsylvania, where the minimum wage remained at the federal level of $4.25 an hour. Card and Krueger agreed that the hypothesis that a rise in wages would destroy jobs was "one of the clearest and most widely appreciated in the field of economics." Both told me they believed, at the start, that their work would reinforce that hypothesis. But in 1995, and again in 2000, the two academics effectively shredded the conventional wisdom. Their data demonstrated that a modest increase in wages did not appear to cause any significant harm to employment; in some cases, a rise in the minimum wage even resulted in a slight increase in employment.
Card and Krueger's conclusions have not necessarily made philosophical converts of Congress or the current administration. Attempts to raise the federal minimum wage - led by Senators Edward M. Kennedy on the left and Rick Santorum on the right - have made little headway over the past few years. And the White House went so far as to temporarily suspend the obligation of businesses with U.S. government construction contracts to pay so-called prevailing wages (that is, whatever is paid to a majority of workers in an industry in a particular area) during the rebuilding after Hurricane Katrina. David Card, who seems nothing short of disgusted by the ideological nature of the debates over the wage issue, says he feels that opinions on the minimum wage are so politically entrenched that even the most scientific studies can't change anyone's mind. "People think we're biased, partisan," he says. And he's probably right. While Card has never advocated for or against raising the minimum wage, many who oppose wage laws have made exactly those assertions about his research. Nonetheless, in Krueger's view, he and Card changed the debate. "I'm willing to declare a partial victory," Krueger told me. Some recent surveys of top academics show a significant majority now agree that a modest raise in the minimum wage does little to harm employment, he points out.
If nothing else, Card and Krueger's findings have provided persuasive data, and a degree of legitimacy, to those who maintain that raising the minimum wage, whether at the city, state or federal level, need not be toxic. The Economic Policy Institute, which endorses wage regulations, has succeeded recently in getting hundreds of respected economists - excluding Card and Krueger, however, who choose to remain outside the debate - to support raising the federal minimum to $7 an hour. That would have been impossible as recently as five years ago, says Jeff Chapman, an economist at the E.P.I. Even Wal-Mart's president and C.E.O., Lee Scott, recently spoke out in favor of raising the minimum wage. It wasn't altruism or economic theory or even public relations that motivated him, but a matter of bottom-line practicality. "Our current average hourly wage for workers is $9.68," Lee Culpepper, a Wal-Mart spokesman, told me. "So I would think raising the wage would have minimal impact on our workers. But we think it would have a beneficial effect on our customers."
What a Higher Minimum Wage Can Mean to Those Making It
One evening in Santa Fe, I sat down with some of the people Wal-Mart is worried about. Like Louis Alvarez, a 58-year-old cafeteria worker in the Santa Fe schools who for many years helped prepare daily meals for 700 children. For that he was paid $6.85 an hour and brought home $203 every two weeks. He had no disposable income - indeed, he wasn't sure what I meant by disposable income; he barely had money for rent. Statistically speaking, he was far below the poverty line, which for a family of two is about $12,800 a year. For Alvarez, an increase in the minimum wage meant he would be able to afford to go to flea markets, he said.
I also met with Ashley Gutierrez, 20, and Adelina Reyes, 19, who have low-paying customer-service and restaurant jobs. By most estimates, 35 percent of those who make $7 an hour or less in the U.S. are teenagers. A few months ago, Reyes told me, she was spending 86 hours every two weeks at two minimum-wage jobs to pay for her car and to pay for college. Gutierrez, also in school, was working 20 hours a week at Blockbuster video for the minimum wage. People like Alvarez and Gutierrez and Reyes were the ones who spurred two city councilors in Santa Fe, Frank Montaño and Jimmie Martinez, to introduce the living-wage ordinance. "Our schools here don't do so well," Montaño told me, explaining that he believed higher-wage jobs would let parents, who might otherwise have to work a second job, spend more time with their children. (At the same time working teenagers like Gutierrez would have more time with their parents.) For Santa Fe residents who were living five or six to a room in two-bedroom adobes, Montaño said he hoped a higher minimum wage might put having their own places to live at least within the realm of possibility.
Montaño was confident - perhaps too confident, as it would turn out - that businesses would become acclimated to higher payroll costs. He has run a restaurant and a tour-bus company himself, and he knew that the tight labor market in Santa Fe had pushed up wages so that many entry-level workers were already earning more than $8 an hour. "The business owners believe that government, especially at the local level, should not dictate to business, so to them it was a matter of principle," Montaño says. It was to him too. "We knew that other communities were watching what we were doing," he explains. He and his colleagues on the council were already receiving help from Paul Sonn at the Brennan Center in New York. "I knew that their involvement meant that they saw this as something that was important nationally," Montaño says. "As we got our foot in the door in terms of this ordinance being applied to the private sector," he surmised, that would give the living-wage network the ammunition to help other communities across the country do likewise. "I always knew, early on," Montaño says, "that if Santa Fe enacted such an ordinance, that it likely would go to court, and that if it passed the legal test, it would be the kind of ordinance that other communities would copy." The problem, at least from Montaño's perspective, was getting it enacted in the first place.
The Moral Argument Carries the Day in Santa Fe
Santa Fe's City Council asked nine residents, representing the interests of labor and management, to join a round table that would settle the specifics of the proposed living-wage law - how high the wage would be, for instance, and how soon it would be phased in. Some members of the round table, like Al Lucero, who owns a popular local restaurant, Maria's New Mexican Kitchen, found the entire premise of a city wage law objectionable. "I think the minimum wage at $5.15 is ridiculous," Lucero told me. "If the state were to raise it overall, to $7 an hour or $7.50 an hour, I think that would be wonderful. I think we need to do it." But $9 or $10 or $11 was too high, in his view - and it would put Santa Fe at a disadvantage to other cities in the state or region that could pay workers less. Also, there were the free-market principles that Frank Montaño had anticipated: "They were trying to push and tell us how to live our lives and how to conduct our business," says Lucero, who employs about 60 people.
Not surprisingly, Lucero's opponents on the round table saw things in a different light. For example, Carol Oppenheimer, a labor attorney, viewed the proposed law as a practical and immediate solution. "I got involved with the living-wage network because unions are having a very hard time," she told me. She assumed that local businesses could manage with a higher payroll. Yet after only a few meetings of the task force, both sides dug in, according to Oppenheimer.
It was then that the living-wage proponents hit on a scorched-earth, tactical approach. "What really got the other side was when we said, 'It's just immoral to pay people $5.15, they can't live on that,' " Oppenheimer recalls. "It made the businesspeople furious. And we realized then that we had something there, so we said it over and over again. Forget the economic argument. This was a moral one. It made them crazy. And we knew that was our issue."
The moral argument soon trumped all others. The possibility that a rise in the minimum wage, even a very substantial one, would create unemployment or compromise the health of the city's small businesses was not necessarily irrelevant. Yet for many in Santa Fe, that came to be seen as an ancillary issue, one that inevitably led to fruitless discussions in which opposing sides cited conflicting studies or anecdotal evidence. Maybe all of that was beside the point, anyway. Does it - or should it - even matter what a wage increase does to a local economy, barring some kind of catastrophic change? Should an employer be allowed to pay a full-time employee $5.15 an hour, this argument went, if that's no longer enough to live on? Is it just under our system of government? Or in the eyes of God?
The Rev. Jerome Martinez, the city's influential monsignor, began to throw his support behind the living-wage ordinance. When I met with him in his parish, in a tidy, paneled office near the imposing 18th-century church that looks over the city plaza, Martinez traced for me the moral justification for a living wage back to the encyclicals of Popes Leo XIII and Pius XI and John Paul II, in which the pontiffs warned against the excesses of capitalism. "The church's position on social justice is long established," Father Jerome said. "I think unfortunately it's one of our best-kept secrets."
I asked if it had been a difficult decision to support the wage law. He smiled slightly. "It was a no-brainer," he said. "You know, I am not by nature a political person. I have gotten a lot of grief from some people, business owners, who say, 'Father, why don't you stick to religion?' Well, pardon me - this is religion. The scripture is full of matters of justice. How can you worship a God that you do not see and then oppress the workers that you do see?"
I heard refrains of the moral argument all over Santa Fe. One afternoon I walked around the city with Morty Simon, a labor lawyer and a staunch supporter of the living wage whose wife is Carol Oppenheimer. "This used to be the Sears," Simon told me as we walked, pointing to boutiques and high-end chain stores. "And we had a supermarket over here, and there was a hardware store too." Simon came to Santa Fe 34 years ago as a refugee from New York, he said, and for him the unpretentious city he once knew was gone. The wealthy retirees and second-home buyers had come in droves, and so had the movie stars. Gene Hackman and Val Kilmer had settled here; Simon had recently found out that someone had plans for a 26,000-square-foot house, a new local record. For him, the moral component of the law, the possibility of regaining some kind of balance, was what mattered. "It was really a question of, what kind of world do you want to live in?" he said.
Several Santa Fe councilors had, over the course of the previous year, come to Morty Simon's view that the wage ordinance presented an opportunity to stop the drift between haves and have-nots. Carol Robertson Lopez, for example, had initially opposed the living-wage law but changed her mind after 30 hours of debate. "We take risks, oftentimes, to benefit businesses," she told me, "and we take risks to benefit different sectors. I felt like this was an economic risk that we were taking on behalf of the worker." She acknowledged that some residents thought the city had started down a slippery slope toward socialism; jokes about the People's Republic of Santa Fe were rampant. But Robertson Lopez says that by the night of the vote she had few reservations. "I think the living wage is an indicator of when we've given up on the federal government to solve our problems," she says. "So local people have to take it on their own."
The living-wage ordinance had its final hearing on Feb. 26, 2003, in a rancorous debate that drew 600 people and lasted until 3 a.m. The proposal set a wage floor at $8.50 an hour, which would increase to $9.50 in January 2006 and $10.50 in 2008. It would also regulate only businesses with 25 or more employees.
It passed the City Council easily, by a vote of 7 to 1. A few weeks later, a group of restaurant and hotel owners filed suit in state court on the grounds that the living-wage ordinance exceeded the city's powers and was a violation of their rights under New Mexico's constitution. A judge suspended the wage law until a trial could resolve the issues.
Businesses Fight Back
To business owners in Santa Fe, the most worrisome aspect of the living-wage law is that the city has sailed into uncharted territory. Most of the minimum-wage campaigns in the U.S. have been modest increases of a dollar or a dollar and a half. The numerous state campaigns for 2006 will probably propose raises to between $6.15 and $7 an hour. (When San Francisco raised its minimum wage to $8.50 an hour in 2004 - indexed to inflation, it is now $8.82 - California's state minimum wage was $6.75, so the increase was 26 percent.) And even staunch supporters of a higher minimum wage accept that there is a point at which a wage is set so high as to do more harm than good. "There is no other municipality in the country that believes that $9.50 should be the living wage," says Rob Day, the owner of the Santa Fe Bar and Grill and one of the plaintiffs who sued the city. In fact, the most apt comparison would be Great Britain, which now has a minimum wage equivalent to about $8.80 an hour. "They have minimum wages that are Santa Fe level," says Richard Freeman, a Harvard economist. And at least for the moment, he says, "they have lower unemployment than we do."
As the lawsuit against the city progressed, though, Europe wasn't even a distant consideration. The focus was on the people of Santa Fe. I read through a transcript of New Mexicans for Free Enterprise v. City of Santa Fe one day this fall in a conference room at Paul, Weiss, Rifkind, Wharton & Garrison, the white-shoe law firm in Midtown Manhattan that defended, pro bono, Santa Fe's right to enact the living-wage ordinance. In many respects, the trial, which took place over the course of a week in April 2004, was an unusual public exchange on profits, poverty and class in America. Paul Sonn, the lawyer at the Brennan Center at New York University who wrote the Santa Fe ordinance, had enlisted Sidney Rosdeitcher, a partner at Paul, Weiss, to be lead counsel for Santa Fe's defense. Rosdeitcher told me that before the trial began he wasn't convinced that there were many factual issues in dispute; as he saw it, the living-wage controversy was about the law and, in particular, whether Santa Fe had a legal "home rule" authority, under the provisions of the New Mexico constitution, to set wages, even for private industry. Nevertheless, several low-wage workers took the stand to relate the facts, as they saw them, of what the wage increase would do to improve their quality of life.
The Rev. Jerome Martinez took the stand as an employer of 65 people in his parish and Catholic school. And a number of restaurant owners, in turn, explained how the new law could ultimately force them out of business.
The plaintiffs - the New Mexicans for Free Enterprise - were not unsympathetic: the restaurateurs who took the stand, like Rob Day or Elizabeth Draiscol, who runs the popular Zia Diner in town, opened their books to show that their margins were thin, their costs high, their payrolls large. They cared about their employees (providing health care and benefits), trained unskilled workers who spoke little or no English, gave regular raises and paid starting salaries well above $5.15. They had built up their businesses through an extraordinary amount of hard work. Draiscol testified that her restaurant, for instance, had $2.17 million in annual revenue in the fiscal year of 2003. Though her assets were substantial - a restaurant can be valued at anywhere from 30 to 70 percent of its annual revenues, and Draiscol said Zia had been appraised at 66 percent of revenues, or about $1.4 million - she earned a salary of $49,000 a year. Draiscol testified that the living wage would raise her payroll, which accounted for 55 to 65 employees (depending on the season), by about $43,192 a year. Rob Day put the expenses of a living-wage increase even higher. In addition to labor costs, he estimated that the price of goods would go up as his local suppliers, forced to pay employees higher wages themselves, passed along their expenses to the Santa Fe Bar and Grill.
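Draiscol's figures hang together arithmetically; as a back-of-the-envelope check (my arithmetic, not anything offered at trial):

\[
0.66 \times \$2{,}170{,}000 \approx \$1{,}430{,}000
\qquad\text{and}\qquad
\frac{\$43{,}192}{\$2{,}170{,}000} \approx 2\%
\]

that is, the appraisal matches the quoted "about $1.4 million," and the projected payroll increase amounts to roughly 2 percent of the restaurant's annual revenue.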
Rosdeitcher showed that the restaurants had made serious errors overestimating their costs. Still, the increase in expenditures was not negligible. Over the past few years, a variety of experts have tried to perfect the science of predicting what will happen to a community in the wake of a minimum-wage change, and one of those experts, Robert Pollin, a professor of economics at the University of Massachusetts, Amherst, served as the expert witness on behalf of Santa Fe. Pollin projected that the living wage would affect the wages of about 17,000 workers. About 9,000 of those workers would receive raises because of the ordinance, he said; the rest would receive what he called "ripple effect" increases - which meant that those making, say, $8.50 or more before the raise would most likely receive an additional raise from their employers to reflect their job seniority. Pollin calculated that wage increases would cost businesses a total of $33 million. And to pay for those amounts, restaurants and hotels and stores would probably need to raise prices by between 1 and 3 percent. The question, therefore, was whether business owners were willing to raise prices or make less in profits. In the trial, Pollin cited an obscure 1994 academic experiment in which several economists had set a different price within the same restaurant for a fried-haddock dinner. In varying the price of the haddock between $8.95 and $10.95, the researchers' goal was to find out whether variations in cost affected demand in a controlled environment. As it turned out, they didn't. Customers ordered the haddock at both $8.95 and $10.95.
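Pollin's estimate also implies a rough scale for the businesses absorbing the change. Assuming, as a simplification of my own, that the entire $33 million were recovered through prices alone, a 1 to 3 percent increase spreads the cost over combined annual sales of roughly

\[
\frac{\$33\ \text{million}}{0.03} \approx \$1.1\ \text{billion}
\quad\text{to}\quad
\frac{\$33\ \text{million}}{0.01} = \$3.3\ \text{billion}.
\]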
Results From the Santa Fe Experiment
That the city of Santa Fe has effectively become a very large fried-haddock-dinner experiment is difficult to deny. A state court judge ruled in favor of the city soon after the trial, allowing the living-wage ordinance to take effect in June 2004; recently, the judge's decision was affirmed by a state appellate court, giving the city, and its living-wage advocates, a sweeping victory. Many business owners have found these legal losses discouraging. This fall, not long after I visited the city, the Santa Fe Chamber of Commerce sent a note to its members to gauge their opinion on the $8.50 living wage and the hike on Jan. 1 to $9.50. Some members reported that they had no trouble adjusting to the first raise and supported a further increase. (Some of these owners, whose high-end businesses employ skilled workers, paid more than the $8.50 to begin with.) Others insisted that they were not averse to a state or federal raise in the minimum but that Santa Fe's citywide experiment had put local businesses at a competitive disadvantage: companies could move outside the city limits or could outsource their work to cheaper places in the state. But most respondents opposed the law. The living wage had forced them to raise prices on their products and services, which they feared would cut into business.
To look at the data that have accumulated since the wage went into effect is to get a more positive impression of the law. Last month, the University of New Mexico's Bureau of Business and Economic Research issued some preliminary findings on what had happened to the city over the past year and a half. The report listed some potential unintended consequences of the wage raise: the exemption in the living-wage law for businesses with fewer than 25 employees, for instance, created "perverse incentives" for owners to keep their payrolls below 25 workers. There was some concern that the high living wage might encourage more high-school students to drop out; in addition, some employers reported that workers had begun commuting in to Santa Fe to earn more for a job there than they could make outside the city.
Yet the city's employment picture stayed healthy - overall employment had increased in each quarter after the living wage went into effect and had been especially strong for hotels and restaurants, which have the most low-wage jobs. Most encouraging to supporters: the number of families in need of temporary assistance - a reasonably good indicator of the squeeze on the working poor - has declined significantly. On the other hand, the city's gross receipts totals, a reflection of consumer spending and tourism, have been disappointing since the wage went into effect. That could suggest that prices are driving people away. Or it could merely mean that high gas and housing prices are hitting hard. The report calculates that the cost of living in Santa Fe rose by 9 percent a year over the past two and a half years.
Rob Day of the Santa Fe Bar and Grill sees this as the crux of the matter. In his view, the problem with Santa Fe is the cost of housing, and there are better ways than wage regulations - housing subsidies, for example - to make homes more affordable. In the wake of the wage raise, Day told me, he eventually tweaked his prices, but not enough to offset the payroll increases. He let go of his executive chef and was himself working longer hours. "Now in the matter of a year and a half, I think there is a whole group of us who thought, if we were going to start over, this isn't the business we would have gone into," he says.
Al Lucero, the owner of Maria's New Mexican Kitchen, says that the living-wage battle has risked turning him into a caricature. Backers of the living wage "paint us as people who take advantage of workers," he told me. By contrast, Lucero sees himself as an upstanding member of the community who provides jobs (he has 60 employees) and has always paid well above the federal minimum. Other business owners said similar things but would not speak out publicly. They feared alienating customers. As some told it, they had started businesses with a desire to create wealth and jobs in a picturesque small city. Then they had awakened in a mad laboratory for urban liberalism.
The Issue in Albuquerque
Long after he did his influential research with David Card on the effect of minimum-wage raises, Alan Krueger says, he came to see that ultimately the minimum wage is less about broad economic outcomes than about values. Which is not to say that workers' values should trump those of owners'. Rather, that when wealth is being redistributed from one party to another - and not, in the case of Santa Fe, from overpaid C.E.O.'s and hedge-fund managers but from everyday entrepreneurs who have worked long hours to succeed in their businesses - things can get complicated. Indeed, while it is tempting to see the wage disputes in Santa Fe and elsewhere as a reflection of whether one side is right or wrong, on either economic or moral grounds, they are, more confusingly, small battles in a larger war (and, in America, a very old war) over where to draw the line on free-market capitalism. On one side there is Al Lucero, on the other someone like Morty Simon, or the economist Robert Pollin, who says: "The principled position is, 'Why should anyone tell anyone what to do? Why should the government?' I just happen to disagree with that. A minimally decent employment standard, to me, overrides the case for a free market."
And yet, the fact that voters or elected politicians should decide who wins these battles, rather than economists or policy makers, seems fitting. During Albuquerque's living-wage campaign this past fall, Santa Fe - the smaller, wealthier, northern neighbor - served as a rallying point. But it was also a question mark: Was Santa Fe's experience repeatable? Was it even worth pointing to as an exemplar? In the final days of the Albuquerque effort, Jen Kern of Acorn told me she had little doubt that the wage victory in Santa Fe, like the one in San Francisco, was an indication that a battle for creating high base wages in America's cities, in addition to the states, could be won. But these were also rich cities, liberal cities - "la-la lands," as she put it. "I think with citywide minimums, if this is going to be the next era in the living-wage movement, it's got to look like it's winnable," Kern says. "The danger or the limitations of just having San Francisco and Santa Fe having passed this is that people in other parts of the country are going to say, 'Well, I'm not Santa Fe, I'm not San Francisco."' In Kern's view, a win "in a city like Albuquerque, which I think everyone thinks of as sort of a normal city," was a truer test.
And it didn't pass that test. When the $7.50 ballot initiative lost by 51 percent to 49 percent on Oct. 4, it made many in the living-wage movement wonder how these battles will play out over the next year or two. One political consultant involved in the movement questioned whether the Albuquerque wage itself, at $7.50 an hour, had been set too high by Acorn to win broad support. Matthew Henderson of Acorn, who ran the day-to-day campaign, said he thought they were outspent by their opponents. Most likely, though, the outcome was determined by the actual grounds on which the battle was fought. The businesses that opposed the $7.50 wage, represented mainly by the Greater Albuquerque Chamber of Commerce, challenged a small provision in the proposed living-wage law that would allow those enforcing a living wage to have wide "access" to a workplace. The campaigns soon began trading allegations through television ads and direct mailings about how far such access might go. And so the living-wage campaign had become a surreal fight over privacy (it would allow "complete strangers to enter your child's school," one mailing against the measure alleged) rather than wages. When I met with Terri Cole, the president and C.E.O. of Albuquerque's Chamber of Commerce, a few days before the vote, she acknowledged that the chamber opposed the living-wage law on philosophical grounds. But she said she saw the access clause as legitimate grounds for a fight.
Will It Play Nationally?
In the aftermath of Albuquerque, Jen Kern took solace in the fact that 10 years after she visited the Library of Congress, and 10 years after she began working on living-wage campaigns, the opposition fought not on the economic merits or risks of a higher wage, but on a side issue like privacy. Still, a loss is a loss. It is possible that the Albuquerque wage campaign may still prevail, in effect: New Mexico's governor, Bill Richardson, has said he would consider a statewide raise this spring, presumably to $7 or $7.50, from $5.15, that would affect all New Mexicans. (It would, in all likelihood, leave Santa Fe's higher wage unaffected.) Yet such an act does little to clarify whether progressives can actually transform strong levels of voter support for higher wages into wins at the polls. Kristina Wilfore, the head of the Ballot Initiative Strategy Center, a progressive advocacy group, says that over the years there has been anywhere from a 2 to 5 percent increase in voter turnout specifically correlated with wage measures. "But people think it's some big panacea, and it's not," says Wilfore, who regards success as dependent on how well a local wage coalition (organized labor, grass-roots groups, church-based organizations) can work together at raising money and mobilizing voters.
For specific candidates in a state or city where a wage measure is on the ballot, it can be similarly complicated. Representative Rahm Emanuel of Illinois, chairman of the Democratic Congressional Campaign Committee, told me that the local battles over living wages reflect the broader debate in the U.S. over health care, retirement security and an advancing global economy. "Every district is different," Emanuel says of the slate of Congressional races for 2006. "But there is not one where the living wage, competitive wages or health care doesn't play out. The minimum-wage issue, if it's on the ballot, is part of the economic argument."
David Mermin of Lake Research Partners, who frequently conducts polls on minimum-wage issues, told me that the dollar level of a wage proposal is important, though it can vary from place to place. ("People have different feelings about what's a lot of money," he says.) But he has found that quirks can emerge. An increase to $6.15 sometimes doesn't poll as well as an increase to $6.75, which can generate more intensity and broader support from voters. Mermin also says that wage measures have had success in recent years, Albuquerque notwithstanding, not because Americans feel differently but because campaigners are getting smarter about stressing morals over economics. And when handled adroitly, a wage platform can motivate the kind of voters who are difficult to engage in other ways: younger voters, infrequent voters, low-income urban voters. His research, Mermin adds, shows that most people who vote for the minimum wage know it's not going to affect their lives tomorrow: "It's not like fixing the health-care system, or repairing the retirement system," he says. "It doesn't rise to that level directly. And if you list it in 10 issues, it doesn't pop out in priority. But when it is on the ballot, it crystallizes a lot of things people feel about the economy and about people who are struggling." In his experience, voters seem to process these measures as an opportunity to take things into their own hands and change their world, just as Morty Simon did.
Still, as an endgame, many in the living-wage movement see the prize not in a series of local victories in 2006 but in Congressional action that results in a substantial increase in the federal minimum wage - and even better - one that is indexed to inflation, so that such battles about raising the wage don't need to be fought every few years. The long-run trajectory, Paul Sonn told me, is for cities and states to create enough pressure to ultimately force a raise on the federal level. Or to put it another way, the hope is that raising wages across the U.S. will ultimately demonstrate to voters and to Washington lawmakers both the feasibility and the necessity of a significantly higher minimum wage. In the meantime, Sonn says, cities like Santa Fe play an important role in policy innovation, "really as sort of laboratories of economic democracy." Richard Freeman of Harvard echoes this point. "If you go back, a lot of the New Deal legislation, good or bad, came about because there was a lot of state legislation," Freeman says. Policies from New York or Wisconsin were adapted into the federal system of laws. "A lot of it came from state variations in the past, and I think we'll see a lot more of this in the next few years. The things that work the best might be adopted nationally."
Of course, it also seems plausible that any kind of national coherence on economic - or moral - matters may have ended long ago. Just as the voters of states and cities have sorted themselves politically into red and blue, and into pro- and anti-gay marriage, in other words, they are increasingly sorting their wage floors and (perhaps soon) their health-care coverage. This trend may produce not progressive national policies but instead a level of local self-determination as yet unseen. Or as Freeman puts it, "Let Santa Fe do what it wants, but let's not impose that on Gadsden, Ala." That wouldn't make a federal increase in the minimum wage insignificant, but it would make it something of a backdrop for major population centers. As Robert Reich says, "The reality is, even if the wage were raised to $6.15, it would not be enough to lift a family out of poverty." And as Jen Kern points out, even a federal minimum wage that goes up to $7.25, which is the proposal from the Senate Democrats and which probably isn't going anywhere until 2008, doesn't approach what it now costs to live in some cities.
This was why, in December, Kern and Acorn were considering the prospects for laying the groundwork for living-wage ordinances in other cities. And it's why, also in December, Paul Sonn was helping to write an ordinance for Lawrence Township, N.J., aimed at forcing the city's big-box retailers like Wal-Mart to pay a higher wage (more than $10 an hour) and to contribute a larger share of employee benefits. Last month, Sonn also pointed out to me that Santa Cruz, Calif., was considering plans to introduce a measure that would establish a minimum wage of $9.25 an hour.
It wasn't quite Santa Fe's level, but close. And that suggested that the small New Mexican city, to the delight of its living-wage advocates and the chagrin of many business owners, was no longer just an experiment. Rather, it had already become something best described, for better or for worse, as a model.
Jon Gertner is a contributing writer for the magazine.
NYT Magazine, January 15, 2006
By JON GERTNER
If It Happened in Baltimore, Maybe It Can Happen Anywhere
For a few weeks in the summer of 1995, Jen Kern spent her days at a table in the Library of Congress in Washington, poring over the fine print of state constitutions from around the country. This was, at the time, a somewhat-eccentric strategy to fight poverty in America. Kern was not a high-powered lawyer or politician; she was 25 and held a low-paying, policy-related job at Acorn, the national community organization. Yet to understand why living-wage campaigns matter - where they began, what they mean, and why they inspire such passion and hope - it helps to consider what Kern was doing years ago in the library, reading obscure legislation from states like Missouri and New Mexico.
A few months earlier, she and her colleagues at Acorn witnessed an energetic grass-roots campaign in Baltimore, led by a coalition of church groups and labor unions. Workers in some of Baltimore's homeless shelters and soup kitchens had noticed something new and troubling about many of the visitors coming in for meals and shelter: they happened to have full-time jobs. In response, local religious leaders successfully persuaded the City Council to raise the base pay for city contract workers to $6.10 an hour from $4.25, the federal minimum at the time. The Baltimore campaign was ostensibly about money. But to those who thought about it more deeply, it was about the force of particular moral propositions: first, that work should be rewarded, and second, that no one who works full time should have to live in poverty.
So Kern and another colleague were dispatched to find out if what happened in Baltimore could be tried - and expanded - elsewhere. As she plowed through documents, Kern was unsure whether to look for a particular law or the absence of one. Really, what she was trying to do was compile a list of places in the U.S. where citizens or officials could legally mount campaigns to raise the minimum wage above the federal standard. In other words, she needed to know if anything stood in the way, like a state regulation or court decision. What she discovered was that in many states a law more ambitious than Baltimore's - one that applied not only to city contractors but to all local businesses - seemed permissible.
Whether a wage campaign was winnable turned out to be a more complicated matter. In the late 1990's, Kern helped Acorn in a series of attempts to raise the minimum wage in Denver, Houston and Missouri. They all failed. "It wasn't even close," she says. In the past few years, though, as the federal minimum wage has remained fixed at $5.15 and the cost of living (specifically housing) has risen drastically in many regions, similar campaigns have produced so many victories (currently, 134) that Kern speaks collectively of "a widespread living-wage movement."
Santa Fe has been one of the movement's crowning achievements. This month the city's minimum wage rose to $9.50 an hour, the highest rate in the United States. But other recent victories include San Francisco in 2003 and Nevada in 2004. And if a pending bill in Chicago is any indication, the battles over wage laws will soon evolve into campaigns to force large, private-sector businesses like Wal-Mart to provide not only higher wages but also more money for employee health care.
It is a common sentiment that economic fairness - or economic justice, as living-wage advocates phrase it - should, or must, come in a sweeping and righteous gesture from the top. From Washington, that is. But most wage campaigns arise from the bottom, from residents and low-level officials and from cities and states - from everywhere except the federal government. "I think what the living-wage movement has done in the past 11 years is incredible," David Neumark, a frequent critic of the phenomenon who is a senior fellow at the Public Policy Institute of California, told me recently. "How many other issues are there where progressives have been this successful? I can't think of one."
The immediate goal for living-wage strategists is to put initiatives on the ballots in several swing states this year. If their reckoning is correct, the laws should effect a financial gain for low-income workers and boost turnout for candidates who campaign for higher wages. In Florida, a ballot initiative to raise the state's minimum wage by a dollar, to $6.15, won 71 percent of the vote in 2004, a blowout that surprised even people like Kern, who spent several weeks in Miami working on the measure. "We would like it to become a fact of political life," Kern says, "where every year the other side has to contend with a minimum-wage law in some state." Though victories like the one in Florida may have done little to help the Kerry-Edwards ticket - George Bush won 52 percent of the state's vote - Kern and some in the Democratic establishment have come to believe that the left, after years of electoral frustration, has finally found its ultimate moral-values issue. "This is what moves people to the polls now," Kern insists. "This is our gay marriage." Already, during the past few months, a coalition of grass-roots and labor organizations has begun gathering hundreds of thousands of signatures to ensure that proposed laws to increase wages are voted on in November. The first targets, Kern told me, will be Arizona, Colorado, Michigan and Ohio. Next in line, either this year or soon after, are Montana, Oklahoma and Arkansas, the home of Wal-Mart.
Does America Care About the Gap Between Rich and Poor?
I first met Kern on a sunny morning in late September in Albuquerque, a city of 470,000 that made her list when she was working in the Library of Congress 10 years ago. She was now, at age 35, campaigning for a ballot initiative that would raise the minimum wage in the city to $7.50 an hour from $5.15. There was no face for the placards, no charismatic presence to rally the troops at midnight or shake hands at dawn outside 7-Eleven. Instead, there was a number, $7.50, a troop of campaign workers to canvass the neighborhoods and an argument: that many low-wage workers were being paid poverty wages. That a full-time job at the federal minimum rate added up to $10,712 a year. That local businesses could afford the pay raise. And that it was up to the voters to restore balance.
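(The $10,712 figure is simple arithmetic; the short Python sketch below reproduces it, assuming a 40-hour week worked 52 weeks a year with no overtime or unpaid time off, and applies the same assumption to the $7.50 rate proposed in Albuquerque.)

    # Annual earnings at an hourly wage, assuming 40 hours a week, 52 weeks a year.
    def annual_earnings(hourly_wage, hours_per_week=40, weeks_per_year=52):
        return hourly_wage * hours_per_week * weeks_per_year

    print(annual_earnings(5.15))  # 10712.0 -- the federal minimum cited above
    print(annual_earnings(7.50))  # 15600.0 -- the wage proposed in Albuquerque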
One of the more intriguing questions about campaigns like the one in Albuquerque, and those planned for swing states next fall, is whether they reflect a profound sense of public alarm about the divergence between rich and poor in this country. Certainly most Americans do not support higher wages out of immediate self-interest. Probably only around 3 percent of those in the work force are actually paid $5.15 or less an hour; most low-wage workers, including Wal-Mart employees, who generally start at between $6.50 and $7.50 an hour, earn more. Increasing the minimum wage to $7.25 an hour would directly affect the wages of only about 7 percent of the work force. Nevertheless, pollsters have discovered that a hypothetical state ballot measure typically generates support of around 70 percent. A recent poll by the Pew Research Center actually put the support for raising the national minimum wage to $6.45 at 86 percent. Rick Berman, a lobbyist who started the Employment Policies Institute and who is a longtime foe of living-wage laws, agrees that "the natural tendency is for people to support these things. They believe it's a free lunch." On the other hand, the electorate's willingness to cross party lines and endorse the measures may come down to the simple fact that at least 60 percent of Americans have at one time or another been paid the minimum wage. Voters may just know precisely what they're voting for and why.
In the mid-1990's, the last time Congress raised the minimum wage, the Clinton White House was reluctant to start a war over the federal rate, according to Robert Reich, the former labor secretary. For an administration bent on policy innovation, that would have seemed "old" Democrat. "Then we did some polling and discovered that the public is overwhelmingly in favor," Reich told me recently. "At which point the White House gave the green light to Democrats in Congress." Reich, now a professor at the University of California, Berkeley, happens to view the minimum wage as a somewhat inefficient tool for alleviating poverty (compared with earned income tax credits, say). But he acknowledges that it has a powerful moral and political impact, in states red as well as blue, and especially now, in an era when workers see the social contract with their employers vanishing. "They see neighbors and friends being fired for no reason by profitable companies, executives making off like bandits while thousands of their own workers are being laid off," Reich says. "They see health insurance drying up, employer pensions shrinking. Promises to retirees of health benefits are simply thrown overboard. The whole system has aspects that seem grossly immoral to average working people." As Reich points out, whatever the minimum wage's limitations may be as a policy instrument, as an idea "it demarcates our concept of decency with regard to work."
The idea, Reich points out, isn't new, even if the recent fervor for it is. Massachusetts enacted a state minimum wage in 1912, several decades before the federal minimum wage of 25 cents an hour was adopted in 1938. And most of the wage ordinances of the past decade specifically trace their origins back to Baltimore, in 1995. After that moment, in fact, the phrase "living wage" soon caught on - or, you might say, returned. It was a popular workers' refrain in the late 19th century and was the title of a 1906 book by John Ryan, a Roman Catholic priest. In the late 1990's, a loose national network of advocates sprang up, incorporating organized labor, grass-roots groups like Acorn and the Industrial Areas Foundation and, more recently, the National Council of Churches. Legal advice often came out of the Brennan Center for Justice at New York University's law school, where a lawyer named Paul Sonn helped write wage ordinances and ballot measures for various states and cities.
By dint of its piecemeal, localized progress, the modern living-wage movement has grown without fanfare; one reason is that until recently, most of the past decade's wage laws, like Baltimore's, have been narrow in scope and modest in effect. Strictly speaking, a "living wage" law has typically required that any company receiving city contracts, and thus taxpayers' money, must pay its workers a wage far above the federal minimum, usually between $9 and $11 an hour. These regulations often apply to employees at companies to which municipalities have outsourced tasks like garbage collection, security services and home health care. Low-wage workers in the private sector - in restaurants, hotels, retail stores or the like - have been unaffected. Their pay stays the same.
In Santa Fe, the City Council passed a similar kind of wage law in 2002, raising the hourly pay for city employees and contractors. Some officials in Santa Fe, however, had decided from the start that its wage rules should ultimately be different - that the small city (population 66,000) could even serve as a test example for the rest of the U.S. Early on, several city councilors told me, they anticipated that Santa Fe - with a high cost of living, a large community of low-paid immigrants and a liberal City Council - would eventually extend its wage floor to all local businesses, private as well as public, so that every worker in the city, no matter the industry, would make more than $5.15. The initial numbers the councilors considered as they began to strategize seemed stratospheric: a living wage that began at $10 or $12 or even $14.50 an hour. For some laborers, that would constitute a raise of 200 percent or more. Nothing remotely like it existed in any other city in the country.
The Economists Are Surprised
In the years before the enactment of the federal minimum wage in the late 1930's, the country's post-Depression economy was so weak that the notion that government should leave private business to its own devices was effectively marginalized. During the past few decades, though, in the wake of a fairly robust economy, debates on raising the minimum wage have consistently resulted in a rhetorical caterwaul. While the arguments have usually been between those on the labor side, who think the minimum wage should be raised substantially, and those on the employer side, who oppose any increase, a smaller but vocal contingent has claimed, more broadly and more philosophically, that it is in the best interest of both business and labor to let the market set wages, not the politicians. And certainly not the voters.
This last position was long underpinned by the academic consensus that a rise in the minimum wage hurts employment by interfering with the flow of supply and demand. In simplest terms, most economists accepted that when government forces businesses to pay higher wages, businesses, in turn, hire fewer employees. It is a powerful argument against the minimum wage, since it suggests that private businesses as a group, along with teenagers and low-wage employees, will be penalized by a mandatory raise.
The tenor of this debate began to change in the mid-1990's following some work done by two Princeton economists, David Card (now at the University of California at Berkeley) and Alan B. Krueger. In 1992, New Jersey increased the state minimum wage to $5.05 an hour (applicable to both the public and private sectors), which gave the two young professors an opportunity to study the comparative effects of that raise on fast-food restaurants and low-wage employment in New Jersey and Pennsylvania, where the minimum wage remained at the federal level of $4.25 an hour. Card and Krueger agreed that the hypothesis that a rise in wages would destroy jobs was "one of the clearest and most widely appreciated in the field of economics." Both told me they believed, at the start, that their work would reinforce that hypothesis. But in 1995, and again in 2000, the two academics effectively shredded the conventional wisdom. Their data demonstrated that a modest increase in wages did not appear to cause any significant harm to employment; in some cases, a rise in the minimum wage even resulted in a slight increase in employment.
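Their method was, at bottom, a comparison of changes on either side of the state line - what economists call a difference-in-differences estimate: subtract Pennsylvania's change in fast-food employment from New Jersey's, and whatever remains is attributed to the wage increase. The Python sketch below illustrates only the logic; the employment figures in it are invented placeholders, not Card and Krueger's survey data.

    # Difference-in-differences logic of the Card-Krueger comparison.
    # The numbers are illustrative placeholders, NOT the actual 1992-93 survey results.
    nj_before, nj_after = 20.0, 20.5   # avg. employment per store, New Jersey (wage rose to $5.05)
    pa_before, pa_after = 23.0, 21.5   # avg. employment per store, Pennsylvania (wage stayed at $4.25)

    nj_change = nj_after - nj_before   # change where the minimum wage rose
    pa_change = pa_after - pa_before   # change where it did not
    effect = nj_change - pa_change     # the difference-in-differences estimate
    print(f"Estimated employment effect: {effect:+.1f} jobs per store")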
Card and Krueger's conclusions have not necessarily made philosophical converts of Congress or the current administration. Attempts to raise the federal minimum wage - led by Senators Edward M. Kennedy on the left and Rick Santorum on the right - have made little headway over the past few years. And the White House went so far as to temporarily suspend the obligation of businesses with U.S. government construction contracts to pay so-called prevailing wages (that is, whatever is paid to a majority of workers in an industry in a particular area) during the rebuilding after Hurricane Katrina. David Card, who seems nothing short of disgusted by the ideological nature of the debates over the wage issue, says he feels that opinions on the minimum wage are so politically entrenched that even the most scientific studies can't change anyone's mind. "People think we're biased, partisan," he says. And he's probably right. While Card has never advocated for or against raising the minimum wage, many who oppose wage laws have made exactly those assertions about his research. Nonetheless, in Krueger's view, he and Card changed the debate. "I'm willing to declare a partial victory," Krueger told me. Some recent surveys of top academics show a significant majority now agree that a modest raise in the minimum wage does little to harm employment, he points out.
If nothing else, Card and Krueger's findings have provided persuasive data, and a degree of legitimacy, to those who maintain that raising the minimum wage, whether at the city, state or federal level, need not be toxic. The Economic Policy Institute, which endorses wage regulations, has succeeded recently in getting hundreds of respected economists - excluding Card and Krueger, however, who choose to remain outside the debate - to support raising the federal minimum to $7 an hour. That would have been impossible as recently as five years ago, says Jeff Chapman, an economist at the E.P.I. Even Wal-Mart's president and C.E.O., Lee Scott, recently spoke out in favor of raising the minimum wage. It wasn't altruism or economic theory or even public relations that motivated him, but a matter of bottom-line practicality. "Our current average hourly wage for workers is $9.68," Lee Culpepper, a Wal-Mart spokesman, told me. "So I would think raising the wage would have minimal impact on our workers. But we think it would have a beneficial effect on our customers."
What a Higher Minimum Wage Can Mean to Those Making It
One evening in Santa Fe, I sat down with some of the people Wal-Mart is worried about. Like Louis Alvarez, a 58-year-old cafeteria worker in the Santa Fe schools who for many years helped prepare daily meals for 700 children. For that he was paid $6.85 an hour and brought home $203 every two weeks. He had no disposable income - indeed, he wasn't sure what I meant by disposable income; he barely had money for rent. Statistically speaking, he was far below the poverty line, which for a family of two is about $12,800 a year. For Alvarez, an increase in the minimum wage meant he would be able to afford to go to flea markets, he said.
I also met with Ashley Gutierrez, 20, and Adelina Reyes, 19, who have low-paying customer-service and restaurant jobs. By most estimates, 35 percent of those who make $7 an hour or less in the U.S. are teenagers. A few months ago, Reyes told me, she was spending 86 hours every two weeks at two minimum-wage jobs to pay for her car and to pay for college. Gutierrez, also in school, was working 20 hours a week at Blockbuster video for the minimum wage. People like Alvarez and Gutierrez and Reyes were the ones who spurred two city councilors in Santa Fe, Frank Montaño and Jimmie Martinez, to introduce the living-wage ordinance. "Our schools here don't do so well," Montaño told me, explaining that he believed higher-wage jobs would let parents, who might otherwise have to work a second job, spend more time with their children. (At the same time working teenagers like Gutierrez would have more time with their parents.) For Santa Fe residents who were living five or six to a room in two-bedroom adobes, Montaño said he hoped a higher minimum wage might put having their own places to live at least within the realm of possibility.
Montaño was confident - perhaps too confident, as it would turn out - that businesses would become acclimated to higher payroll costs. He has run a restaurant and a tour-bus company himself, and he knew that the tight labor market in Santa Fe had pushed up wages so that many entry-level workers were already earning more than $8 an hour. "The business owners believe that government, especially at the local level, should not dictate to business, so to them it was a matter of principle," Montaño says. It was to him too. "We knew that other communities were watching what we were doing," he explains. He and his colleagues on the council were already receiving help from Paul Sonn at the Brennan Center in New York. "I knew that their involvement meant that they saw this as something that was important nationally," Montaño says. "As we got our foot in the door in terms of this ordinance being applied to the private sector," he surmised, that would give the living-wage network the ammunition to help other communities across the country do likewise. "I always knew, early on," Montaño says, "that if Santa Fe enacted such an ordinance, that it likely would go to court, and that if it passed the legal test, it would be the kind of ordinance that other communities would copy." The problem, at least from Montaño's perspective, was getting it enacted in the first place.
The Moral Argument Carries the Day in Santa Fe
Santa Fe's City Council asked nine residents, representing the interests of labor and management, to join a round table that would settle the specifics of the proposed living-wage law - how high the wage would be, for instance, and how soon it would be phased in. Some members of the round table, like Al Lucero, who owns a popular local restaurant, Maria's New Mexican Kitchen, found the entire premise of a city wage law objectionable. "I think the minimum wage at $5.15 is ridiculous," Lucero told me. "If the state were to raise it overall, to $7 an hour or $7.50 an hour, I think that would be wonderful. I think we need to do it." But $9 or $10 or $11 was too high, in his view - and it would put Santa Fe at a disadvantage to other cities in the state or region that could pay workers less. Also, there were the free-market principles that Frank Montaño had anticipated: "They were trying to push and tell us how to live our lives and how to conduct our business," says Lucero, who employs about 60 people.
Not surprisingly, Lucero's opponents on the round table saw things in a different light. For example, Carol Oppenheimer, a labor attorney, viewed the proposed law as a practical and immediate solution. "I got involved with the living-wage network because unions are having a very hard time," she told me. She assumed that local businesses could manage with a higher payroll. Yet after only a few meetings of the task force, both sides dug in, according to Oppenheimer.
It was then that the living-wage proponents hit on a scorched-earth, tactical approach. "What really got the other side was when we said, 'It's just immoral to pay people $5.15, they can't live on that,' " Oppenheimer recalls. "It made the businesspeople furious. And we realized then that we had something there, so we said it over and over again. Forget the economic argument. This was a moral one. It made them crazy. And we knew that was our issue."
The moral argument soon trumped all others. The possibility that a rise in the minimum wage, even a very substantial one, would create unemployment or compromise the health of the city's small businesses was not necessarily irrelevant. Yet for many in Santa Fe, that came to be seen as an ancillary issue, one that inevitably led to fruitless discussions in which opposing sides cited conflicting studies or anecdotal evidence. Maybe all of that was beside the point, anyway. Does it - or should it - even matter what a wage increase does to a local economy, barring some kind of catastrophic change? Should an employer be allowed to pay a full-time employee $5.15 an hour, this argument went, if that's no longer enough to live on? Is it just under our system of government? Or in the eyes of God?
The Rev. Jerome Martinez, the city's influential monsignor, began to throw his support behind the living-wage ordinance. When I met with him in his parish, in a tidy, paneled office near the imposing 18th-century church that looks over the city plaza, Martinez traced for me the moral justification for a living wage back to the encyclicals of Popes Leo XIII and Pius XI and John Paul II, in which the pontiffs warned against the excesses of capitalism. "The church's position on social justice is long established," Father Jerome said. "I think unfortunately it's one of our best-kept secrets."
I asked if it had been a difficult decision to support the wage law. He smiled slightly. "It was a no-brainer," he said. "You know, I am not by nature a political person. I have gotten a lot of grief from some people, business owners, who say, 'Father, why don't you stick to religion?' Well, pardon me - this is religion. The scripture is full of matters of justice. How can you worship a God that you do not see and then oppress the workers that you do see?"
I heard refrains of the moral argument all over Santa Fe. One afternoon I walked around the city with Morty Simon, a labor lawyer and a staunch supporter of the living wage whose wife is Carol Oppenheimer. "This used to be the Sears," Simon told me as we walked, pointing to boutiques and high-end chain stores. "And we had a supermarket over here, and there was a hardware store too." Simon came to Santa Fe 34 years ago as a refugee from New York, he said, and for him the unpretentious city he once knew was gone. The wealthy retirees and second-home buyers had come in droves, and so had the movie stars. Gene Hackman and Val Kilmer had settled here; Simon had recently found out that someone had plans for a 26,000-square-foot house, a new local record. For him, the moral component of the law, the possibility of regaining some kind of balance, was what mattered. "It was really a question of, what kind of world do you want to live in?" he said.
Several Santa Fe councilors had, over the course of the previous year, come to Morty Simon's view that the wage ordinance presented an opportunity to stop the drift between haves and have-nots. Carol Robertson Lopez, for example, had initially opposed the living-wage law but changed her mind after 30 hours of debate. "We take risks, oftentimes, to benefit businesses," she told me, "and we take risks to benefit different sectors. I felt like this was an economic risk that we were taking on behalf of the worker." She acknowledged that some residents thought the city had started down a slippery slope toward socialism; jokes about the People's Republic of Santa Fe were rampant. But Robertson Lopez says that by the night of the vote she had few reservations. "I think the living wage is an indicator of when we've given up on the federal government to solve our problems," she says. "So local people have to take it on their own."
The living-wage ordinance had its final hearing on Feb. 26, 2003, in a rancorous debate that drew 600 people and lasted until 3 a.m. The proposal set a wage floor at $8.50 an hour, which would increase to $9.50 in January 2006 and $10.50 in 2008. It would also regulate only businesses with 25 or more employees.
It passed the City Council easily, by a vote of 7 to 1. A few weeks later, a group of restaurant and hotel owners filed suit in state court on the grounds that the living-wage ordinance exceeded the city's powers and was a violation of their rights under New Mexico's constitution. A judge suspended the wage law until a trial could resolve the issues.
Businesses Fight Back
To business owners in Santa Fe, the most worrisome aspect of the living-wage law is that the city has sailed into uncharted territory. Most of the minimum-wage campaigns in the U.S. have sought modest increases of a dollar or a dollar and a half. The numerous state campaigns for 2006 will probably propose raises to between $6.15 and $7 an hour. (When San Francisco raised its minimum wage to $8.50 an hour in 2004 - indexed to inflation, it is now $8.82 - California's state minimum wage was $6.75, so the increase was 26 percent.) And even staunch supporters of a higher minimum wage accept that there is a point at which a wage is set so high as to do more harm than good. "There is no other municipality in the country that believes that $9.50 should be the living wage," says Rob Day, the owner of the Santa Fe Bar and Grill and one of the plaintiffs who sued the city. In fact, the most apt comparison would be Great Britain, which now has a minimum wage equivalent to about $8.80 an hour. "They have minimum wages that are Santa Fe level," says Richard Freeman, a Harvard economist. And at least for the moment, he says, "they have lower unemployment than we do."
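(For comparison, the percentage jumps implied by the figures above are easy to reproduce; this sketch simply takes the cited wage levels at face value.)

    # Percentage increases implied by the wage levels cited above.
    def pct_increase(old, new):
        return (new - old) / old * 100

    print(round(pct_increase(6.75, 8.50)))  # San Francisco, 2004: about 26 percent
    print(round(pct_increase(5.15, 9.50)))  # Santa Fe, 2006: about 84 percent over the federal minimum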
As the lawsuit against the city progressed, though, Europe wasn't even a distant consideration. The focus was on the people of Santa Fe. I read through a transcript of New Mexicans for Free Enterprise v. City of Santa Fe one day this fall in a conference room at Paul, Weiss, Rifkind, Wharton & Garrison, the white-shoe law firm in Midtown Manhattan that defended, pro bono, Santa Fe's right to enact the living-wage ordinance. In many respects, the trial, which took place over the course of a week in April 2004, was an unusual public exchange on profits, poverty and class in America. Paul Sonn, the lawyer at the Brennan Center at New York University who wrote the Santa Fe ordinance, had enlisted Sidney Rosdeitcher, a partner at Paul, Weiss, to be lead counsel for Santa Fe's defense. Rosdeitcher told me that before the trial began he wasn't convinced that there were many factual issues in dispute; as he saw it, the living-wage controversy was about the law and, in particular, whether Santa Fe had a legal "home rule" authority, under the provisions of the New Mexico constitution, to set wages, even for private industry. Nevertheless, several low-wage workers took the stand to relate the facts, as they saw them, of what the wage increase would do to improve their quality of life. The Rev. Jerome Martinez took the stand as an employer of 65 people in his parish and Catholic school. And a number of restaurant owners, in turn, explained how the new law could ultimately force them out of business.
The plaintiffs - the New Mexicans for Free Enterprise - were not unsympathetic: the restaurateurs who took the stand, like Rob Day or Elizabeth Draiscol, who runs the popular Zia Diner in town, opened their books to show that their margins were thin, their costs high, their payrolls large. They cared about their employees (providing health care and benefits), trained unskilled workers who spoke little or no English, gave regular raises and paid starting salaries well above $5.15. They had built up their businesses through an extraordinary amount of hard work. Draiscol testified that her restaurant, for instance, had $2.17 million in annual revenue in the fiscal year of 2003. Though her assets were substantial - a restaurant can be valued at anywhere from 30 to 70 percent of its annual revenues, and Draiscol said Zia had been appraised at 66 percent of revenues, or about $1.4 million - she earned a salary of $49,000 a year. Draiscol testified that the living wage would raise her payroll, which accounted for 55 to 65 employees (depending on the season), by about $43,192 a year. Rob Day put the expenses of a living-wage increase even higher. In addition to labor costs, he estimated that the price of goods would go up as his local suppliers, forced to pay employees higher wages themselves, passed along their expenses to the Santa Fe Bar and Grill.
Rosdeitcher showed that the restaurants had seriously overestimated their costs. Still, the increase in expenditures was not negligible. Over the past few years, a variety of experts have tried to perfect the science of predicting what will happen to a community in the wake of a minimum-wage change, and one of those experts, Robert Pollin, a professor of economics at the University of Massachusetts, Amherst, served as the expert witness on behalf of Santa Fe. Pollin projected that the living wage would affect the wages of about 17,000 workers. About 9,000 of those workers would receive raises because of the ordinance, he said; the rest would receive what he called "ripple effect" increases - which meant that those making, say, $8.50 or more before the raise would most likely receive an additional raise from their employers to reflect their job seniority. Pollin calculated that wage increases would cost businesses a total of $33 million. And to pay for those amounts, restaurants and hotels and stores would probably need to raise prices by between 1 and 3 percent. The question, therefore, was whether business owners were willing to raise prices or make less in profits. In the trial, Pollin cited an obscure 1994 academic experiment in which several economists had set a different price within the same restaurant for a fried-haddock dinner. In varying the price of the haddock between $8.95 and $10.95, the researchers' goal was to find out whether variations in cost affected demand in a controlled environment. As it turned out, they didn't. Customers ordered the haddock at both $8.95 and $10.95.
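That 1-to-3-percent figure is, in essence, a ratio of added labor costs to revenue. A rough sketch of the arithmetic, using the Zia Diner figures from the testimony above (and assuming, as the haddock experiment hints, that sales volume holds roughly steady), runs as follows.

    # Price increase needed to absorb the living wage entirely through higher prices,
    # using the Zia Diner figures cited in the trial testimony.
    zia_annual_revenue = 2_170_000    # fiscal 2003 revenue
    zia_payroll_increase = 43_192     # Draiscol's estimate of the added annual payroll

    required_price_increase = zia_payroll_increase / zia_annual_revenue * 100
    print(f"About {required_price_increase:.1f} percent")  # roughly 2 percent, within Pollin's range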
Results From the Santa Fe Experiment
That the city of Santa Fe has effectively become a very large fried-haddock-dinner experiment is difficult to deny. A state court judge ruled in favor of the city soon after the trial, allowing the living-wage ordinance to take effect in June 2004; recently, the judge's decision was affirmed by a state appellate court, giving the city, and its living-wage advocates, a sweeping victory. Many business owners have found these legal losses discouraging. This fall, not long after I visited the city, the Santa Fe Chamber of Commerce sent a note to its members to gauge their opinion on the $8.50 living wage and the hike on Jan. 1 to $9.50. Some members reported that they had no trouble adjusting to the first raise and supported a further increase. (Some of these owners, whose high-end businesses employ skilled workers, paid more than the $8.50 to begin with.) Others insisted that they were not averse to a state or federal raise in the minimum but that Santa Fe's citywide experiment had put local businesses at a competitive disadvantage: companies could move outside the city limits or could outsource their work to cheaper places in the state. But most respondents opposed the law. The living wage had forced them to raise prices on their products and services, which they feared would cut into business.
To look at the data that have accumulated since the wage went into effect is to get a more positive impression of the law. Last month, the University of New Mexico's Bureau of Business and Economic Research issued some preliminary findings on what had happened to the city over the past year and a half. The report listed some potential unintended consequences of the wage raise: the exemption in the living-wage law for businesses with fewer than 25 employees, for instance, created "perverse incentives" for owners to keep their payrolls below 25 workers. There was some concern that the high living wage might encourage more high-school students to drop out; in addition, some employers reported that workers had begun commuting in to Santa Fe to earn more for a job there than they could make outside the city.
Yet the city's employment picture stayed healthy - overall employment had increased in each quarter after the living wage went into effect and had been especially strong for hotels and restaurants, which have the most low-wage jobs. Most encouraging to supporters: the number of families in need of temporary assistance - a reasonably good indicator of the squeeze on the working poor - has declined significantly. On the other hand, the city's gross receipts totals, a reflection of consumer spending and tourism, have been disappointing since the wage went into effect. That could suggest that prices are driving people away. Or it could merely mean that high gas and housing prices are hitting hard. The report calculates that the cost of living in Santa Fe rose by 9 percent a year over the past two and a half years.
Rob Day of the Santa Fe Bar and Grill sees this as the crux of the matter. In his view, the problem with Santa Fe is the cost of housing, and there are better ways than wage regulations - housing subsidies, for example - to make homes more affordable. In the wake of the wage raise, Day told me, he eventually tweaked his prices, but not enough to offset the payroll increases. He let go of his executive chef and was himself working longer hours. "Now in the matter of a year and a half, I think there is a whole group of us who thought, if we were going to start over, this isn't the business we would have gone into," he says.
Al Lucero, the owner of Maria's New Mexican Kitchen, says that the living-wage battle has risked turning him into a caricature. Those backing the living wage "paint us as people who take advantage of workers," he told me. By contrast, Lucero sees himself as an upstanding member of the community who provides jobs (he has 60 employees) and has always paid well above the federal minimum. Other business owners said similar things but would not speak out publicly. They feared alienating customers. As some told it, they had started businesses with a desire to create wealth and jobs in a picturesque small city. Then they had awakened in a mad laboratory for urban liberalism.
The Issue in Albuquerque
Long after he did his influential research with David Card on the effect of minimum-wage raises, Alan Krueger says, he came to see that ultimately the minimum wage is less about broad economic outcomes than about values. Which is not to say that workers' values should trump those of owners. Rather, that when wealth is being redistributed from one party to another - and not, in the case of Santa Fe, from overpaid C.E.O.'s and hedge-fund managers but from everyday entrepreneurs who have worked long hours to succeed in their businesses - things can get complicated. Indeed, while it is tempting to see the wage disputes in Santa Fe and elsewhere as a reflection of whether one side is right or wrong, on either economic or moral grounds, they are, more confusingly, small battles in a larger war (and, in America, a very old war) over where to draw the line on free-market capitalism. On one side there is Al Lucero, on the other someone like Morty Simon, or the economist Robert Pollin, who says: "The principled position is, 'Why should anyone tell anyone what to do? Why should the government?' I just happen to disagree with that. A minimally decent employment standard, to me, overrides the case for a free market."
And yet it seems fitting that voters or elected politicians, rather than economists or policy makers, should decide who wins these battles. During Albuquerque's living-wage campaign this past fall, Santa Fe - the smaller, wealthier, northern neighbor - served as a rallying point. But it was also a question mark: Was Santa Fe's experience repeatable? Was it even worth pointing to as an exemplar? In the final days of the Albuquerque effort, Jen Kern of Acorn told me she had little doubt that the wage victory in Santa Fe, like the one in San Francisco, was an indication that a battle for creating high base wages in America's cities, in addition to the states, could be won. But these were also rich cities, liberal cities - "la-la lands," as she put it. "I think with citywide minimums, if this is going to be the next era in the living-wage movement, it's got to look like it's winnable," Kern says. "The danger or the limitations of just having San Francisco and Santa Fe having passed this is that people in other parts of the country are going to say, 'Well, I'm not Santa Fe, I'm not San Francisco.'" In Kern's view, a win "in a city like Albuquerque, which I think everyone thinks of as sort of a normal city," was a truer test.
And it didn't pass that test. When the $7.50 ballot initiative lost by 51 percent to 49 percent on Oct. 4, it made many in the living-wage movement wonder how these battles will play out over the next year or two. One political consultant involved in the movement questioned whether the Albuquerque wage itself, at $7.50 an hour, had been set too high by Acorn to win broad support. Matthew Henderson of Acorn, who ran the day-to-day campaign, said he thought they were outspent by their opponents. Most likely, though, the outcome was determined by the actual grounds on which the battle was fought. The businesses that opposed the $7.50 wage, represented mainly by the Greater Albuquerque Chamber of Commerce, challenged a small provision in the proposed living-wage law that would allow those enforcing a living wage to have wide "access" to a workplace. The campaigns soon began trading allegations through television ads and direct mailings about how far such access might go. And so the living-wage campaign had become a surreal fight over privacy (it would allow "complete strangers to enter your child's school," one mailing against the measure alleged) rather than wages. When I met with Terri Cole, the president and C.E.O. of Albuquerque's Chamber of Commerce, a few days before the vote, she acknowledged that the chamber opposed the living-wage law on philosophical grounds. But she said she saw the access clause as legitimate grounds for a fight.
Will It Play Nationally?
In the aftermath of Albuquerque, Jen Kern took solace in the fact that 10 years after she visited the Library of Congress, and 10 years after she began working on living-wage campaigns, the opposition fought not on the economic merits or risks of a higher wage, but on a side issue like privacy. Still, a loss is a loss. It is possible that the Albuquerque wage campaign may still prevail, in effect: New Mexico's governor, Bill Richardson, has said he would consider a statewide raise this spring, presumably to $7 or $7.50, from $5.15, that would affect all New Mexicans. (It would, in all likelihood, leave Santa Fe's higher wage unaffected.) Yet such an act does little to clarify whether progressives can actually transform strong levels of voter support for higher wages into wins at the polls. Kristina Wilfore, the head of the Ballot Initiative Strategy Center, a progressive advocacy group, says that over the years there has been anywhere from a 2 to 5 percent increase in voter turnout specifically correlated with wage measures. "But people think it's some big panacea, and it's not," says Wilfore, who regards success as dependent on how well a local wage coalition (organized labor, grass-roots groups, church-based organizations) can work together at raising money and mobilizing voters.
For specific candidates in a state or city where a wage measure is on the ballot, it can be similarly complicated. Representative Rahm Emanuel of Illinois, chairman of the Democratic Congressional Campaign Committee, told me that the local battles over living wages reflect the broader debate in the U.S. over health care, retirement security and an advancing global economy. "Every district is different," Emanuel says of the slate of Congressional races for 2006. "But there is not one where the living wage, competitive wages or health care doesn't play out. The minimum-wage issue, if it's on the ballot, is part of the economic argument."
David Mermin of Lake Research Partners, who frequently conducts polls on minimum-wage issues, told me that the dollar level of a wage proposal is important, though it can vary from place to place. ("People have different feelings about what's a lot of money," he says.) But he has found that quirks can emerge. An increase to $6.15 sometimes doesn't poll as well as an increase to $6.75, which can generate more intensity and broader support from voters. Mermin also says that wage measures have had success in recent years, Albuquerque notwithstanding, not because Americans feel differently but because campaigners are getting smarter about stressing morals over economics. And when handled adroitly, a wage platform can motivate the kind of voters who are difficult to engage in other ways: younger voters, infrequent voters, low-income urban voters. His research, Mermin adds, shows that most people who vote for the minimum wage know it's not going to affect their lives tomorrow: "It's not like fixing the health-care system, or repairing the retirement system," he says. "It doesn't rise to that level directly. And if you list it in 10 issues, it doesn't pop out in priority. But when it is on the ballot, it crystallizes a lot of things people feel about the economy and about people who are struggling." In his experience, voters seem to process these measures as an opportunity to take things into their own hands and change their world, just as Morty Simon did.
Still, as an endgame, many in the living-wage movement see the prize not in a series of local victories in 2006 but in Congressional action that results in a substantial increase in the federal minimum wage - and even better - one that is indexed to inflation, so that such battles about raising the wage don't need to be fought every few years. The long-run trajectory, Paul Sonn told me, is for cities and states to create enough pressure to ultimately force a raise on the federal level. Or to put it another way, the hope is that raising wages across the U.S. will ultimately demonstrate to voters and to Washington lawmakers both the feasibility and the necessity of a significantly higher minimum wage. In the meantime, Sonn says, cities like Santa Fe play an important role in policy innovation, "really as sort of laboratories of economic democracy." Richard Freeman of Harvard echoes this point. "If you go back, a lot of the New Deal legislation, good or bad, came about because there was a lot of state legislation," Freeman says. Policies from New York or Wisconsin were adapted into the federal system of laws. "A lot of it came from state variations in the past, and I think we'll see a lot more of this in the next few years. The things that work the best might be adopted nationally."
Of course, it also seems plausible that any kind of national coherence on economic - or moral - matters may have ended long ago. In other words, just as the voters of states and cities have sorted themselves politically into red and blue, and into pro- and anti-gay marriage, they are increasingly sorting their wage floors and (perhaps soon) their health-care coverage. This trend may produce not progressive national policies but instead a level of local self-determination as yet unseen. Or as Freeman puts it, "Let Santa Fe do what it wants, but let's not impose that on Gadsden, Ala." That wouldn't make a federal increase in the minimum wage insignificant, but it would make it something of a backdrop for major population centers. As Robert Reich says, "The reality is, even if the wage were raised to $6.15, it would not be enough to lift a family out of poverty." And as Jen Kern points out, even a federal minimum wage that goes up to $7.25, which is the proposal from the Senate Democrats and which probably isn't going anywhere until 2008, doesn't approach what it now costs to live in some cities.
This was why, in December, Kern and Acorn were considering the prospects for laying the groundwork for living-wage ordinances in other cities. And it's why, also in December, Paul Sonn was helping to write an ordinance for Lawrence Township, N.J., aimed at forcing the city's big-box retailers like Wal-Mart to pay a higher wage (more than $10 an hour) and to contribute a larger share of employee benefits. Last month, Sonn also pointed out to me that Santa Cruz, Calif., was considering plans to introduce a measure that would establish a minimum wage of $9.25 an hour.
It wasn't quite Santa Fe's level, but close. And that suggested that the small New Mexican city, to the delight of its living-wage advocates and the chagrin of many business owners, was no longer just an experiment. Rather, it had already become something best described, for better or for worse, as a model.
Jon Gertner is a contributing writer for the magazine.