By JEFFREY GETTLEMAN
NAIROBI, Kenya
IN the early 1980s, in the lowlands of Mozambique, a new technology of warfare emerged that would sweep across Africa and soon the rest of the world: the child soldier.
Rebel commanders had constructed a four-foot-tall killing machine that cut its way through village after village and nearly overran the government. Its trail was smoking huts and sawed-off ears.
The Mozambicans learned that children were the perfect weapon: easily manipulated, intensely loyal, fearless and, most important, in endless supply.
Today, human rights groups say, there are 300,000 child soldiers worldwide. And experts say the problem is deepening as the nature of conflict itself changes — especially in Africa.
Here, in one country after another, conflicts have morphed from idea- or cause-driven struggles to warlord-led drives whose essential goal is plunder. Because those new rebel movements are motivated and financed by crime, popular support becomes irrelevant. Those in control don’t care about hearts and minds. They see the local population as prey.
The result is that few adults want to have anything to do with them, and manipulating and abducting children becomes the best way to sustain the organized banditry.
This dynamic has fueled some of the longest-running conflicts on the continent, and it could be seen this month alone in at least three countries:
• In Somalia, within the last month, more than 1,000 people have been killed in Mogadishu, the capital, in a complex civil war compounded by warlords who command armies of teenagers. The war traces to 1991, when the central government was brought down by clans fighting over old grievances. But soon it became a contest among the warlords for control of airports, seaports and access to international aid. Sixteen years later, they are still blasting away.
• In Congo, a civil war that started a decade ago to oust the cold-war-era tyrant Mobutu Sese Seko is now a multiheaded fight in which only one of the players is the government. The rest are rebel posses fighting among themselves for a share of the timber, copper, gold, diamonds and other resources. All sides, according to a report issued this month by Human Rights Watch, rely on child soldiers.
• In Uganda, the latest in a series of peace talks — none successful so far — resumed last week, in an effort to end a reign of terror in rural areas by the Lord’s Resistance Army. That group was mustered in the late 1980s in the name of the oppressed Acholi minority, but it soon degenerated into a drugged-out street gang living in the jungle with military-grade weaponry and 13-year-old brides. Its ranks are filled with boys who have been brainwashed to burn down huts and pound newborn babies to death in wooden mortars, as if they were grinding grain.
Africa didn’t invent the modern under-age soldier. The Nazis drafted adolescents when they got desperate. So did Iran, which gave boys 12 to 16 years old plastic keys to heaven to hang around their necks as they cleared minefields during the Iran-Iraq War. Young teenagers have fought in religion-driven or nationalistic fights in Kosovo, the Palestinian territories and Afghanistan.
But here, in Africa, armed movements that survive on children as young as 9 have acquired a special character, nourished by breakdowns of state power or ideology. Many of these movements are about raw greed, power and brutality, with no effort to make excuses for it.
“There might have been a little rhetoric at the beginning,” said Ishmael Beah, a former child soldier in Sierra Leone and author of the best seller “A Long Way Gone: Memoirs of a Boy Soldier.” “But very quickly the ideology gets lost. And then it just becomes a bloodbath, a way for the commanders to plunder, a war of madness.”
Neil Boothby, a Columbia University professor who has worked with child soldiers across the world, said this new crop of movements lacked the features associated with the winning insurgencies of yesteryear — a charming, intelligent leader, persuasive vocabulary, the goal of taking cities.
The typical rebel leader emerging today wants most of all to run his criminal enterprise deep in the bush. “These are brutally thuggy people who don’t want to rule politically and have no strategy for winning a war,” Dr. Boothby said.
This is a sharp change from the liberation movements of the 1970s and 1980s and the cause-driven conflicts that followed — for example, those in Zimbabwe or Eritrea. Even Rwanda’s 1994 genocide followed some familiar patterns: It remained a contest, however gruesome, for political power between two ethnic groups. And children, by and large, were the victims of atrocities by adults, rather than the other way around.
William Reno, a political scientist at Northwestern University who studies armed movements, said: “If you look back 30 years at how wars were fought in Africa, they were liberation movements — freeing their countries from apartheid or colonial rule. They had to plug into Soviet aid or American aid, and the superpowers wanted to see a state, a vision of the future.”
“The people who ended up doing well,” he said, “were the college-educated ideologues.”
Their plans may have been dreamy. But at least they had plans.
“We saw a lot of that in the 1980s,” said Chester Crocker, who was assistant secretary of state for Africa in the Reagan administration from 1981 to 1989. “For marketing reasons, leaders would dress themselves up in ideas.”
“Of course,” he added, “we always took it with a grain of salt.”
This charade ended with the cold war. Weak states that had been propped up by foreign aid and outside military assistance quickly collapsed. And Eastern bloc countries that had been cranking out Kalashnikovs for the Soviet Army had to find new markets. Africa, with its unpatrolled skies and endless shorelines, its gold mines and diamond mines and free-flowing cash economies, beckoned.
The result, Dr. Reno said, was that the political landscape opened up to well-armed opportunists, no longer inconvenienced by state regulations, state security or moral principles. “When there isn’t that big barrier anymore,” he said, “all these weird things start to happen.”
Like the use of child soldiers, who are often drawn into these movements, or kept there, with magic and superstition.
In many armed movements, children are taught that life and death depend on spirits, which are conjured up by their commanders and distilled in oils and amulets. Magic can spur children to do unspeakable things. It also endows otherwise lackluster leaders with a veneer of supernatural respectability. “The commanders would wear certain pearls and said that guns wouldn’t hurt us,” Mr. Beah recalled. “And we believed it.”
Renamo, the South African-backed rebel army that terrorized Mozambique in the 1980s as it tried to destabilize the Marxist government, was among the first to turn to magic; it carved out a special role for witch doctors, whom the Marxists had marginalized.
By the time groups in Congo took that technique to its lowest depths in the late 1990s — some child soldiers there were instructed that eating their victims made them stronger — the world started paying attention. Advocates succeeded in placing the child soldier issue on the United Nations agenda and passing protocols that called for the age of combatants to be at least 18 (the United States and the United Kingdom are among the countries that have refused to sign).
But renegade armed groups continue to be a stumbling block. As lawlessness spreads, they too are spreading, from the bush to urban slums, where violent, quasi-religious movements seem to be taking root.
“It’s ridiculous to appeal to human rights with these groups because they are so far on the criminal end of the spectrum,” said Victoria Forbes Adam, director of the London-based Coalition to Stop the Use of Child Soldiers.
Just this month, in a shantytown near Nairobi, Kenya’s capital, enforcers from a group called the Mungiki — essentially a street gang that uses teenage muscle — hacked up several opponents in an effort to control the minibus racket. True to form, their leader has told his young henchmen that he rolled to earth in a ball of stars.
Diplomacy at Its Worst
By NICHOLAS D. KRISTOF
In May 2003, Iran sent a secret proposal to the U.S. for settling our mutual disputes in a “grand bargain.”
It is an astonishing document, for it tries to address a range of U.S. concerns about nuclear weapons, terrorism and Iraq. I’ve placed it and related documents (including multiple drafts of it) on my blog, www.nytimes.com/ontheground.
Hard-liners in the Bush administration killed discussions of a deal, and interviews with key players suggest that was an appalling mistake. There was a real hope for peace; now there is a real danger of war.
Scattered reports of the Iranian proposal have emerged previously, but if you read the full documentary record you’ll see that what the hard-liners killed wasn’t just one faxed Iranian proposal but an entire peace process. The record indicates that officials from the repressive, duplicitous government of Iran pursued peace more energetically and diplomatically than senior Bush administration officials — which makes me ache for my country.
The process began with Afghanistan in 2001-2. Iran and the U.S., both opponents of the Taliban, cooperated closely in stabilizing Afghanistan and providing aid, and unofficial “track two” processes grew to explore opportunities for improved relations.
On the U.S. side, track two involved well-connected former U.S. ambassadors, including Thomas Pickering, Frank Wisner and Nicholas Platt. The Iranian ambassador to the U.N., Javad Zarif, was a central player, as was an Iranian-American professor at Rutgers, Hooshang Amirahmadi, who heads a friendship group called the American Iranian Council.
At a dinner the council sponsored for its board at Ambassador Zarif’s home in September 2002, the group met Iran’s foreign minister, Kamal Kharrazi. According to the notes of Professor Amirahmadi, the foreign minister told the group, “Yes, we are ready to normalize relations,” provided the U.S. made the first move.
This was shaping into a historic opportunity to heal U.S.-Iranian relations, and the track two participants discussed further steps, including joint U.S.-Iranian cooperation against Saddam Hussein. The State Department and National Security Council were fully briefed, and in 2003 Ambassador Zarif met with two U.S. officials, Ryan Crocker and Zalmay Khalilzad, in a series of meetings in Paris and Geneva.
Encouraged, Iran transmitted its “grand bargain” proposals to the U.S. One version was apparently a paraphrase by the Swiss ambassador in Tehran; that was published this year in The Washington Post.
But Iran also sent its own master text of the proposal to the State Department and, through an intermediary, to the White House. I’ve also posted that document, which Iran regards as the definitive one.
In the master document, Iran talks about ensuring “full transparency” and other measures to assure the U.S. that it will not develop nuclear weapons. Iran offers “active Iranian support for Iraqi stabilization.” Iran also contemplates an end to “any material support to Palestinian opposition groups” while pressuring Hamas “to stop violent actions against civilians within” Israel (though not the occupied territories). Iran would support the transition of Hezbollah to be a “mere political organization within Lebanon” and endorse the Saudi initiative calling for a two-state solution to the Israeli-Palestinian conflict.
Iran also demanded a lot, including “mutual respect,” abolition of sanctions, access to peaceful nuclear technology and a U.S. statement that Iran did not belong in the “axis of evil.” Many crucial issues, including verification of Iran’s nuclear program, needed to be hammered out. It’s not clear to me that a grand bargain was reachable, but it was definitely worth pursuing — and still is today.
Instead, Bush administration hard-liners aborted the process. Another round of talks had been scheduled for Geneva, and Ambassador Zarif showed up — but not the U.S. side. That undermined Iranian moderates.
A U.S.-Iranian rapprochement could have saved lives in Iraq, isolated Palestinian terrorists and encouraged civil society groups in Iran. But instead the U.S. hard-liners chose to hammer plowshares into swords.
All the President’s Press
By FRANK RICH
SOMEHOW it’s hard to imagine David Halberstam yukking it up with Alberto Gonzales, Paul Wolfowitz and two discarded “American Idol” contestants at the annual White House Correspondents’ Association dinner. Before there was a Woodward and Bernstein, there was Halberstam, still not yet 30 in the early 1960s, calling those in power to account for lying about our “progress” in Vietnam. He did so even though J.F.K. told the publisher of The Times, “I wish like hell that you’d get Halberstam out of there.” He did so despite public ridicule from the dean of that era’s Georgetown punditocracy, the now forgotten columnist (and Vietnam War cheerleader) Joseph Alsop.
It was Alsop’s spirit, not Halberstam’s, that could be seen in C-Span’s live broadcast of the correspondents’ dinner last Saturday, two days before Halberstam’s death in a car crash in California. This fete is a crystallization of the press’s failures in the post-9/11 era: it illustrates how easily a propaganda-driven White House can enlist the Washington news media in its shows. Such is literally the case at the annual dinner, where journalists serve as a supporting cast, but it has been figuratively true year-round. The press has enabled stunts from the manufactured threat of imminent “mushroom clouds” to “Saving Private Lynch” to “Mission Accomplished,” whose fourth anniversary arrives on Tuesday. For all the recrimination, self-flagellation and reforms that followed these journalistic failures, it’s far from clear that the entire profession yet understands why it has lost the public’s faith.
That state of denial was center stage at the correspondents’ dinner last year, when the invited entertainer, Stephen Colbert, “fell flat,” as The Washington Post summed up the local consensus. To the astonishment of those in attendance, a funny thing happened outside the Beltway the morning after: the video of Mr. Colbert’s performance became a national sensation. (Last week it was still No. 2 among audiobook downloads on iTunes.) Washington wisdom had it that Mr. Colbert bombed because he was rude to the president. His real sin was to be rude to the capital press corps, whom he caricatured as stenographers. Though most of the Washington audience failed to find the joke funny, Americans elsewhere, having paid a heavy price for the press’s failure to challenge White House propaganda about Iraq, laughed until it hurt.
You’d think that l’affaire Colbert would have led to a little circumspection, but last Saturday’s dinner was another humiliation. And not just because this year’s entertainer, an apolitical nightclub has-been (Rich Little), was a ludicrously tone-deaf flop. More appalling — and symptomatic of the larger sycophancy — was the press’s insidious role in President Bush’s star turn at the event.
It’s the practice on these occasions that the president do his own comic shtick, but this year Mr. Bush made a grand show of abstaining, saying that the killings at Virginia Tech precluded his being a “funny guy.” Any civilian watching on TV could formulate the question left hanging by this pronouncement: Why did the killings in Iraq not preclude his being a “funny guy” at other press banquets we’ve watched on C-Span? At the equivalent Radio and Television Correspondents’ Association gala three years ago, the president contributed an elaborate (and tasteless) comic sketch about his failed search for Saddam’s W.M.D.
But the revelers in the ballroom last Saturday could not raise that discrepancy and challenge Mr. Bush’s hypocrisy; they could only clap. And so they served as captive dress extras in a propaganda stunt, lending their credibility to the president’s sanctimonious exploitation of the Virginia Tech tragedy for his own political self-aggrandizement on national television. Meanwhile the war was kept as tightly under wraps as the troops’ coffins.
By coincidence, this year’s dinner occurred just before a Congressional hearing filled in some new blanks in the still incomplete story of a more egregious White House propaganda extravaganza: the Pat Tillman hoax. As it turns out, the correspondents’ dinner played an embarrassing cameo role in it, too.
What the hearing underscored was the likelihood that the White House also knew very early on what the Army knew and covered up: the football star’s supposed death in battle in Afghanistan, vividly described in a Pentagon press release awarding him a Silver Star, was a complete fabrication, told to the world (and Tillman’s parents) even though top officers already suspected he had died by friendly fire. The White House apparently decided to join the Pentagon in maintaining that lie so that it could be milked for P.R. purposes on two television shows, the correspondents’ dinner on May 1, 2004, and a memorial service for Tillman two days later.
The timeline of events in the week or so leading up to that dinner is startling. Tillman was killed on April 22, 2004. By the next day top officers knew he had not been killed by enemy fire. On April 29, a top special operations commander sent a memo to John Abizaid, among other generals, suggesting that the White House be warned off making specific public claims about how Tillman died. Simultaneously, according to an e-mail that surfaced last week, a White House speechwriter contacted the Pentagon to gather information about Tillman for use at the correspondents’ dinner.
When President Bush spoke at the dinner at week’s end, he followed his jokes with a eulogy about Tillman’s sacrifice. But he kept the circumstances of Tillman’s death vague, no doubt because the White House did indeed get the message that the Pentagon’s press release about Tillman’s losing his life in battle was fiction. Yet it would be four more weeks before Pat Tillman’s own family was let in on the truth.
To see why the administration wanted to keep the myth going, just look at other events happening in the week before that correspondents’ dinner. On April 28, 2004, CBS broadcast the first photographs from Abu Ghraib; on April 29 a poll on The Times’s front page found the president’s approval rating on the war was plummeting; on April 30 Ted Koppel challenged the administration’s efforts to keep the war dead hidden by reading the names of the fallen on “Nightline.” Tillman could be useful to help drown out all this bad news, and to an extent he was. The Washington press corps that applauded the president at the correspondents’ dinner is the same press corps that was slow to recognize the importance of Abu Ghraib that weekend and, as documented by a new study, “When the Press Fails” (University of Chicago Press), even slower to label the crimes as torture.
In his PBS report last week about the journalism breakdown before the war, Bill Moyers said that “the press has yet to come to terms with its role in enabling the Bush administration to go to war on false pretenses.” That’s not universally true; a number of news organizations have owned up to their disasters and tried to learn from them. Yet old habits die hard: for too long the full weight of the scandal in the Gonzales Justice Department eluded some of the Washington media pack, just as Abu Ghraib and the C.I.A. leak case did.
After last weekend’s correspondents’ dinner, The Times decided to end its participation in such events. But even were the dinner to vanish altogether, it remains but a yearly televised snapshot of the overall syndrome. The current White House, weakened as it is, can still establish story lines as fake as “Mission Accomplished” and get a free pass.
To pick just one overarching example: much of the press still takes it as a given that Iraq has a functioning government that might meet political benchmarks (oil law, de-Baathification reform, etc., etc.) that would facilitate an American withdrawal. In reality, the Maliki “government” can’t meet any benchmarks, even if they were enforced, because that government exists only as a fictional White House talking point. As Gen. Barry McCaffrey said last week, this government doesn’t fully control a single province. Its Parliament, now approaching a scheduled summer recess, has passed no major legislation in months. Iraq’s sole recent democratic achievement is to ban the release of civilian casualty figures, lest they challenge White House happy talk about “progress” in Iraq.
It’s our country’s bitter fortune that while David Halberstam is gone, too many Joe Alsops still hold sway. Take the current dean of the Washington press corps, David Broder, who is leading the charge in ridiculing Harry Reid for saying the obvious — that “this war is lost” (as it is militarily, unless we stay in perpetuity and draft many more troops). In February, Mr. Broder handed down another gem of Beltway conventional wisdom, suggesting that “at the very moment the House of Representatives is repudiating his policy in Iraq, President Bush is poised for a political comeback.”
Some may recall that Stephen Colbert offered the same prediction in his monologue at the correspondents’ dinner a year ago. “I don’t believe this is a low point in this presidency,” he said. “I believe it is just a lull before a comeback.” But the fake pundit, unlike the real one, recognized that this was a joke.
God Is Not Great: How Religion Poisons Everything, by Christopher Hitchens
Religion Poisons Everything
There are four irreducible objections to religious faith: that it wholly misrepresents the origins of man and the cosmos, that because of this original error it manages to combine the maximum of servility with the maximum of solipsism, that it is both the result and the cause of dangerous sexual repression, and that it is ultimately grounded on wish-thinking.
I do not think it is arrogant of me to claim that I had already discovered these four objections (as well as noticed the more vulgar and obvious fact that religion is used by those in temporal charge to invest themselves with authority) before my boyish voice had broken. I am morally certain that millions of other people came to very similar conclusions in very much the same way, and I have since met such people in hundreds of places, and in dozens of different countries. Many of them never believed, and many of them abandoned faith after a difficult struggle. Some of them had blinding moments of un-conviction that were every bit as instantaneous, though perhaps less epileptic and apocalyptic (and later more rationally and more morally justified) than Saul of Tarsus on the Damascene road. And here is the point, about myself and my co-thinkers. Our belief is not a belief. Our principles are not a faith. We do not rely solely upon science and reason, because these are necessary rather than sufficient factors, but we distrust anything that contradicts science or outrages reason. We may differ on many things, but what we respect is free inquiry, openmindedness, and the pursuit of ideas for their own sake. We do not hold our convictions dogmatically: the disagreement between Professor Stephen Jay Gould and Professor Richard Dawkins, concerning "punctuated evolution" and the unfilled gaps in post-Darwinian theory, is quite wide as well as quite deep, but we shall resolve it by evidence and reasoning and not by mutual excommunication. (My own annoyance at Professor Dawkins and Daniel Dennett, for their cringe-making proposal that atheists should conceitedly nominate themselves to be called "brights," is a part of a continuous argument.) We are not immune to the lure of wonder and mystery and awe: we have music and art and literature, and find that the serious ethical dilemmas are better handled by Shakespeare and Tolstoy and Schiller and Dostoyevsky and George Eliot than in the mythical morality tales of the holy books. Literature, not scripture, sustains the mind and—since there is no other metaphor—also the soul. We do not believe in heaven or hell, yet no statistic will ever find that without these blandishments and threats we commit more crimes of greed or violence than the faithful. (In fact, if a proper statistical inquiry could ever be made, I am sure the evidence would be the other way.) We are reconciled to living only once, except through our children, for whom we are perfectly happy to notice that we must make way, and room. We speculate that it is at least possible that, once people accepted the fact of their short and struggling lives, they might behave better toward each other and not worse. We believe with certainty that an ethical life can be lived without religion. And we know for a fact that the corollary holds true—that religion has caused innumerable people not just to conduct themselves no better than others, but to award themselves permission to behave in ways that would make a brothel-keeper or an ethnic cleanser raise an eyebrow.
Most important of all, perhaps, we infidels do not need any machinery of reinforcement. We are those whom Blaise Pascal took into account when he wrote to the one who says, "I am so made that I cannot believe."
There is no need for us to gather every day, or every seven days, or on any high and auspicious day, to proclaim our rectitude or to grovel and wallow in our unworthiness. We atheists do not require any priests, or any hierarchy above them, to police our doctrine. Sacrifices and ceremonies are abhorrent to us, as are relics and the worship of any images or objects (even including objects in the form of one of man's most useful innovations: the bound book). To us no spot on earth is or could be "holier" than another: to the ostentatious absurdity of the pilgrimage, or the plain horror of killing civilians in the name of some sacred wall or cave or shrine or rock, we can counterpose a leisurely or urgent walk from one side of the library or the gallery to another, or to lunch with an agreeable friend, in pursuit of truth or beauty. Some of these excursions to the bookshelf or the lunch or the gallery will obviously, if they are serious, bring us into contact with belief and believers, from the great devotional painters and composers to the works of Augustine, Aquinas, Maimonides, and Newman. These mighty scholars may have written many evil things or many foolish things, and been laughably ignorant of the germ theory of disease or the place of the terrestrial globe in the solar system, let alone the universe, and this is the plain reason why there are no more of them today, and why there will be no more of them tomorrow. Religion spoke its last intelligible or noble or inspiring words a long time ago: either that or it mutated into an admirable but nebulous humanism, as did, say, Dietrich Bonhoeffer, a brave Lutheran pastor hanged by the Nazis for his refusal to collude with them. We shall have no more prophets or sages from the ancient quarter, which is why the devotions of today are only the echoing repetitions of yesterday, sometimes ratcheted up to screaming point so as to ward off the terrible emptiness.
While some religious apology is magnificent in its limited way—one might cite Pascal—and some of it is dreary and absurd—here one cannot avoid naming C. S. Lewis—both styles have something in common, namely the appalling load of strain that they have to bear. How much effort it takes to affirm the incredible! The Aztecs had to tear open a human chest cavity every day just to make sure that the sun would rise. Monotheists are supposed to pester their deity more times than that, perhaps, lest he be deaf. How much vanity must be concealed—not too effectively at that—in order to pretend that one is the personal object of a divine plan? How much self-respect must be sacrificed in order that one may squirm continually in an awareness of one's own sin? How many needless assumptions must be made, and how much contortion is required, to receive every new insight of science and manipulate it so as to "fit" with the revealed words of ancient man-made deities? How many saints and miracles and councils and conclaves are required in order first to be able to establish a dogma and then—after infinite pain and loss and absurdity and cruelty—to be forced to rescind one of those dogmas? God did not create man in his own image. Evidently, it was the other way about, which is the painless explanation for the profusion of gods and religions, and the fratricide both between and among faiths, that we see all about us and that has so retarded the development of civilization.
The mildest criticism of religion is also the most radical and the most devastating one. Religion is man-made. Even the men who made it cannot agree on what their prophets or redeemers or gurus actually said or did. Still less can they hope to tell us the "meaning" of later discoveries and developments which were, when they began, either obstructed by their religions or denounced by them. And yet—the believers still claim to know! Not just to know, but to know everything. Not just to know that god exists, and that he created and supervised the whole enterprise, but also to know what "he" demands of us—from our diet to our observances to our sexual morality. In other words, in a vast and complicated discussion where we know more and more about less and less, yet can still hope for some enlightenment as we proceed, one faction—itself composed of mutually warring factions—has the sheer arrogance to tell us that we already have all the essential information we need. Such stupidity, combined with such pride, should be enough on its own to exclude "belief" from the debate. The person who is certain, and who claims divine warrant for his certainty, belongs now to the infancy of our species. It may be a long farewell, but it has begun and, like all farewells, should not be protracted.
The argument with faith is the foundation and origin of all arguments, because it is the beginning—but not the end—of all arguments about philosophy, science, history, and human nature. It is also the beginning—but by no means the end—of all disputes about the good life and the just city. Religious faith is, precisely because we are still-evolving creatures, ineradicable. It will never die out, or at least not until we get over our fear of death, and of the dark, and of the unknown, and of each other. For this reason, I would not prohibit it even if I thought I could. Very generous of me, you may say. But will the religious grant me the same indulgence? I ask because there is a real and serious difference between me and my religious friends, and the real and serious friends are sufficiently honest to admit it. I would be quite content to go to their children's bar mitzvahs, to marvel at their Gothic cathedrals, to "respect" their belief that the Koran was dictated, though exclusively in Arabic, to an illiterate merchant, or to interest myself in Wicca and Hindu and Jain consolations. And as it happens, I will continue to do this without insisting on the polite reciprocal condition—which is that they in turn leave me alone. But this, religion is ultimately incapable of doing. As I write these words, and as you read them, people of faith are in their different ways planning your and my destruction, and the destruction of all the hard-won human attainments that I have touched upon. Religion poisons everything.
Subject: Was Muhammad Epileptic?
There is some question as to whether Islam is a separate religion at all. It initially fulfilled a need among Arabs for a distinctive or special creed, and is forever identified with their language and their impressive later conquests, which, while not as striking as those of the young Alexander of Macedonia, certainly conveyed an idea of being backed by a divine will until they petered out at the fringes of the Balkans and the Mediterranean. But Islam when examined is not much more than a rather obvious and ill-arranged set of plagiarisms, helping itself from earlier books and traditions as occasion appeared to require. Thus, far from being "born in the clear light of history," as Ernest Renan so generously phrased it, Islam in its origins is just as shady and approximate as those from which it took its borrowings. It makes immense claims for itself, invokes prostrate submission or "surrender" as a maxim to its adherents, and demands deference and respect from nonbelievers into the bargain. There is nothing—absolutely nothing—in its teachings that can even begin to justify such arrogance and presumption.
The prophet died in the year 632 of our own approximate calendar. The first account of his life was set down a full hundred and twenty years later by Ibn Ishaq, whose original was lost and can only be consulted through its reworked form, authored by Ibn Hisham, who died in 834. Adding to this hearsay and obscurity, there is no agreed-upon account of how the Prophet's followers assembled the Koran, or of how his various sayings (some of them written down by secretaries) became codified. And this familiar problem is further complicated—even more than in the Christian case—by the matter of succession. Unlike Jesus, who apparently undertook to return to earth very soon and who (pace the absurd Dan Brown) left no known descendants, Muhammad was a general and a politician and—though unlike Alexander of Macedonia a prolific father—left no instruction as to who was to take up his mantle. Quarrels over the leadership began almost as soon as he died, and so Islam had its first major schism—between the Sunni and the Shia—before it had even established itself as a system. We need take no side in the schism, except to point out that one at least of the schools of interpretation must be quite mistaken. And the initial identification of Islam with an earthly caliphate, made up of disputatious contenders for the said mantle, marked it from the very beginning as man-made.
It is said by some Muslim authorities that during the first caliphate of Abu Bakr, immediately after Muhammad's death, concern arose that his orally transmitted words might be forgotten. So many Muslim soldiers had been killed in battle that the number who had the Koran safely lodged in their memories had become alarmingly small. It was therefore decided to assemble every living witness, together with "pieces of paper, stones, palm leaves, shoulder-blades, ribs and bits of leather" on which sayings had been scribbled, and give them to Zaid ibn Thabit, one of the Prophet's former secretaries, for an authoritative collation. Once this had been done, the believers had something like an authorized version.
If true, this would date the Koran to a time fairly close to Muhammad's own life. But we swiftly discover that there is no certainty or agreement about the truth of the story. Some say that it was Ali—the fourth and not the first caliph, and the founder of Shiism—who had the idea. Many others—the Sunni majority—assert that it was Caliph Uthman, who reigned from 644 to 656, who made the finalized decision. Told by one of his generals that soldiers from different provinces were fighting over discrepant accounts of the Koran, Uthman ordered Zaid ibn Thabit to bring together the various texts, unify them, and have them transcribed into one. When this task was complete, Uthman ordered standard copies to be sent to Kufa, Basra, Damascus, and elsewhere, with a master copy retained in Medina. Uthman thus played the canonical role that had been taken, in the standardization and purging and censorship of the Christian Bible, by Irenaeus and by Bishop Athanasius of Alexandria. The roll was called, and some texts were declared sacred and inerrant while others became "apocryphal." Outdoing Athanasius, Uthman ordered that all earlier and rival editions be destroyed.
Even supposing this version of events to be correct, which would mean that no chance existed for scholars ever to determine or even dispute what really happened in Muhammad's time, Uthman's attempt to abolish disagreement was a vain one. The written Arabic language has two features that make it difficult for an outsider to learn: it uses dots to distinguish consonants like "b" and "t," and in its original form it had no sign or symbol for short vowels, which could be rendered by various dashes or comma-type marks. Vastly different readings even of Uthman's version were enabled by these variations. Arabic script itself was not standardized until the later part of the ninth century, and in the meantime the undotted and oddly voweled Koran was generating wildly different explanations of itself, as it still does. This might not matter in the case of the Iliad, but remember that we are supposed to be talking about the unalterable (and final) word of god. There is obviously a connection between the sheer feebleness of this claim and the absolutely fanatical certainty with which it is advanced. To take one instance that can hardly be called negligible, the Arabic words written on the outside of the Dome of the Rock in Jerusalem are different from any version that appears in the Koran.
The situation is even more shaky and deplorable when we come to the hadith, or that vast orally generated secondary literature which supposedly conveys the sayings and actions of Muhammad, the tale of the Koran's compilation, and the sayings of "the companions of the Prophet." Each hadith, in order to be considered authentic, must be supported in turn by an isnad, or chain, of supposedly reliable witnesses. Many Muslims allow their attitude to everyday life to be determined by these anecdotes: regarding dogs as unclean, for example, on the sole ground that Muhammad is said to have done so.
As one might expect, the six authorized collections of hadith, which pile hearsay upon hearsay through the unwinding of the long spool of isnads ("A told B, who had it from C, who learned it from D"), were put together centuries after the events they purport to describe. One of the most famous of the six compilers, Bukhari, died 238 years after the death of Muhammad. Bukhari is deemed unusually reliable and honest by Muslims, and seems to have deserved his reputation in that, of the three hundred thousand attestations he accumulated in a lifetime devoted to the project, he ruled that two hundred thousand of them were entirely valueless and unsupported. Further exclusion of dubious traditions and questionable isnads reduced his grand total to ten thousand hadith. You are free to believe, if you so choose, that out of this formless mass of illiterate and half-remembered witnessing the pious Bukhari, more than two centuries later, managed to select only the pure and undefiled ones that would bear examination.
The likelihood that any of this humanly derived rhetoric is "inerrant," let alone "final," is conclusively disproved not just by its innumerable contradictions and incoherencies but by the famous episode of the Koran's alleged "satanic verses," out of which Salman Rushdie was later to make a literary project. On this much-discussed occasion, Muhammad was seeking to conciliate some leading Meccan polytheists and in due course experienced a "revelation" that allowed them after all to continue worshipping some of the older local deities. It struck him later that this could not be right and that he must have inadvertently been "channeled" by the devil, who for some reason had briefly chosen to relax his habit of combating monotheists on their own ground. (Muhammad believed devoutly not just in the devil himself but in minor desert devils, or djinns, as well.) It was noticed even by some of his wives that the Prophet was capable of having a "revelation" that happened to suit his short-term needs, and he was sometimes teased about it. We are further told—on no authority that need be believed—that when he experienced revelation in public he would sometimes be gripped by pain and experience loud ringing in his ears. Beads of sweat would burst out on him, even on the chilliest of days. Some heartless Christian critics have suggested that he was an epileptic (though they fail to notice the same symptoms in the seizure experienced by Paul on the road to Damascus), but there is no need for us to speculate in this way. It is enough to rephrase David Hume's unavoidable question. Which is more likely—that a man should be used as a transmitter by god to deliver some already existing revelations, or that he should utter some already existing revelations and believe himself to be, or claim to be, ordered by god to do so? As for the pains and the noises in the head, or the sweat, one can only regret the seeming fact that direct communication with god is not an experience of calm, beauty, and lucidity.
There are four irreducible objections to religious faith: that it wholly misrepresents the origins of man and the cosmos, that because of this original error it manages to combine the maximum of servility with the maximum of solipsism, that it is both the result and the cause of dangerous sexual repression, and that it is ultimately grounded on wish-thinking.
I do not think it is arrogant of me to claim that I had already discovered these four objections (as well as noticed the more vulgar and obvious fact that religion is used by those in temporal charge to invest themselves with authority) before my boyish voice had broken. I am morally certain that millions of other people came to very similar conclusions in very much the same way, and I have since met such people in hundreds of places, and in dozens of different countries. Many of them never believed, and many of them abandoned faith after a difficult struggle. Some of them had blinding moments of un-conviction that were every bit as instantaneous, though perhaps less epileptic and apocalyptic (and later more rationally and more morally justified) than Saul of Tarsus on the Damascene road. And here is the point, about myself and my co-thinkers. Our belief is not a belief. Our principles are not a faith. We do not rely solely upon science and reason, because these are necessary rather than sufficient factors, but we distrust anything that contradicts science or outrages reason. We may differ on many things, but what we respect is free inquiry, openmindedness, and the pursuit of ideas for their own sake. We do not hold our convictions dogmatically: the disagreement between Professor Stephen Jay Gould and Professor Richard Dawkins, concerning "punctuated evolution" and the unfilled gaps in post-Darwinian theory, is quite wide as well as quite deep, but we shall resolve it by evidence and reasoning and not by mutual excommunication. (My own annoyance at Professor Dawkins and Daniel Dennett, for their cringe-making proposal that atheists should conceitedly nominate themselves to be called "brights," is a part of a continuous argument.) We are not immune to the lure of wonder and mystery and awe: we have music and art and literature, and find that the serious ethical dilemmas are better handled by Shakespeare and Tolstoy and Schiller and Dostoyevsky and George Eliot than in the mythical morality tales of the holy books. Literature, not scripture, sustains the mind and—since there is no other metaphor—also the soul. We do not believe in heaven or hell, yet no statistic will ever find that without these blandishments and threats we commit more crimes of greed or violence than the faithful. (In fact, if a proper statistical inquiry could ever be made, I am sure the evidence would be the other way.) We are reconciled to living only once, except through our children, for whom we are perfectly happy to notice that we must make way, and room. We speculate that it is at least possible that, once people accepted the fact of their short and struggling lives, they might behave better toward each other and not worse. We believe with certainty that an ethical life can be lived without religion. And we know for a fact that the corollary holds true—that religion has caused innumerable people not just to conduct themselves no better than others, but to award themselves permission to behave in ways that would make a brothel-keeper or an ethnic cleanser raise an eyebrow.
Most important of all, perhaps, we infidels do not need any machinery of reinforcement. We are those whom Blaise Pascal took into account when he wrote to the one who says, "I am so made that I cannot believe."
There is no need for us to gather every day, or every seven days, or on any high and auspicious day, to proclaim our rectitude or to grovel and wallow in our unworthiness. We atheists do not require any priests, or any hierarchy above them, to police our doctrine. Sacrifices and ceremonies are abhorrent to us, as are relics and the worship of any images or objects (even including objects in the form of one of man's most useful innovations: the bound book). To us no spot on earth is or could be "holier" than another: to the ostentatious absurdity of the pilgrimage, or the plain horror of killing civilians in the name of some sacred wall or cave or shrine or rock, we can counterpose a leisurely or urgent walk from one side of the library or the gallery to another, or to lunch with an agreeable friend, in pursuit of truth or beauty. Some of these excursions to the bookshelf or the lunch or the gallery will obviously, if they are serious, bring us into contact with belief and believers, from the great devotional painters and composers to the works of Augustine, Aquinas, Maimonides, and Newman. These mighty scholars may have written many evil things or many foolish things, and been laughably ignorant of the germ theory of disease or the place of the terrestrial globe in the solar system, let alone the universe, and this is the plain reason why there are no more of them today, and why there will be no more of them tomorrow. Religion spoke its last intelligible or noble or inspiring words a long time ago: either that or it mutated into an admirable but nebulous humanism, as did, say, Dietrich Bonhoeffer, a brave Lutheran pastor hanged by the Nazis for his refusal to collude with them. We shall have no more prophets or sages from the ancient quarter, which is why the devotions of today are only the echoing repetitions of yesterday, sometimes ratcheted up to screaming point so as to ward off the terrible emptiness.
While some religious apology is magnificent in its limited way—one might cite Pascal—and some of it is dreary and absurd—here one cannot avoid naming C. S. Lewis—both styles have something in common, namely the appalling load of strain that they have to bear. How much effort it takes to affirm the incredible! The Aztecs had to tear open a human chest cavity every day just to make sure that the sun would rise. Monotheists are supposed to pester their deity more times than that, perhaps, lest he be deaf. How much vanity must be concealed—not too effectively at that—in order to pretend that one is the personal object of a divine plan? How much self-respect must be sacrificed in order that one may squirm continually in an awareness of one's own sin? How many needless assumptions must be made, and how much contortion is required, to receive every new insight of science and manipulate it so as to "fit" with the revealed words of ancient man-made deities? How many saints and miracles and councils and conclaves are required in order first to be able to establish a dogma and then—after infinite pain and loss and absurdity and cruelty—to be forced to rescind one of those dogmas? God did not create man in his own image. Evidently, it was the other way about, which is the painless explanation for the profusion of gods and religions, and the fratricide both between and among faiths, that we see all about us and that has so retarded the development of civilization.
The mildest criticism of religion is also the most radical and the most devastating one. Religion is man-made. Even the men who made it cannot agree on what their prophets or redeemers or gurus actually said or did. Still less can they hope to tell us the "meaning" of later discoveries and developments which were, when they began, either obstructed by their religions or denounced by them. And yet—the believers still claim to know! Not just to know, but to know everything. Not just to know that god exists, and that he created and supervised the whole enterprise, but also to know what "he" demands of us—from our diet to our observances to our sexual morality. In other words, in a vast and complicated discussion where we know more and more about less and less, yet can still hope for some enlightenment as we proceed, one faction—itself composed of mutually warring factions—has the sheer arrogance to tell us that we already have all the essential information we need. Such stupidity, combined with such pride, should be enough on its own to exclude "belief" from the debate. The person who is certain, and who claims divine warrant for his certainty, belongs now to the infancy of our species. It may be a long farewell, but it has begun and, like all farewells, should not be protracted.
The argument with faith is the foundation and origin of all arguments, because it is the beginning—but not the end—of all arguments about philosophy, science, history, and human nature. It is also the beginning—but by no means the end—of all disputes about the good life and the just city. Religious faith is, precisely because we are still-evolving creatures, ineradicable. It will never die out, or at least not until we get over our fear of death, and of the dark, and of the unknown, and of each other. For this reason, I would not prohibit it even if I thought I could. Very generous of me, you may say. But will the religious grant me the same indulgence? I ask because there is a real and serious difference between me and my religious friends, and the real and serious friends are sufficiently honest to admit it. I would be quite content to go to their children's bar mitzvahs, to marvel at their Gothic cathedrals, to "respect" their belief that the Koran was dictated, though exclusively in Arabic, to an illiterate merchant, or to interest myself in Wicca and Hindu and Jain consolations. And as it happens, I will continue to do this without insisting on the polite reciprocal condition—which is that they in turn leave me alone. But this, religion is ultimately incapable of doing. As I write these words, and as you read them, people of faith are in their different ways planning your and my destruction, and the destruction of all the hard-won human attainments that I have touched upon. Religion poisons everything.
From: Christopher Hitchens
Subject: Was Muhammad Epileptic?
Posted Thursday, April 26, 2007, at 10:28 AM ET
There is some question as to whether Islam is a separate religion at all. It initially fulfilled a need among Arabs for a distinctive or special creed, and is forever identified with their language and their impressive later conquests, which, while not as striking as those of the young Alexander of Macedonia, certainly conveyed an idea of being backed by a divine will until they petered out at the fringes of the Balkans and the Mediterranean. But Islam when examined is not much more than a rather obvious and ill-arranged set of plagiarisms, helping itself from earlier books and traditions as occasion appeared to require. Thus, far from being "born in the clear light of history," as Ernest Renan so generously phrased it, Islam in its origins is just as shady and approximate as those from which it took its borrowings. It makes immense claims for itself, invokes prostrate submission or "surrender" as a maxim to its adherents, and demands deference and respect from nonbelievers into the bargain. There is nothing—absolutely nothing—in its teachings that can even begin to justify such arrogance and presumption.
The prophet died in the year 632 of our own approximate calendar. The first account of his life was set down a full hundred and twenty years later by Ibn Ishaq, whose original was lost and can only be consulted through its reworked form, authored by Ibn Hisham, who died in 834. Adding to this hearsay and obscurity, there is no agreed-upon account of how the Prophet's followers assembled the Koran, or of how his various sayings (some of them written down by secretaries) became codified. And this familiar problem is further complicated—even more than in the Christian case—by the matter of succession. Unlike Jesus, who apparently undertook to return to earth very soon and who (pace the absurd Dan Brown) left no known descendants, Muhammad was a general and a politician and—though unlike Alexander of Macedonia a prolific father—left no instruction as to who was to take up his mantle. Quarrels over the leadership began almost as soon as he died, and so Islam had its first major schism—between the Sunni and the Shia—before it had even established itself as a system. We need take no side in the schism, except to point out that one at least of the schools of interpretation must be quite mistaken. And the initial identification of Islam with an earthly caliphate, made up of disputatious contenders for the said mantle, marked it from the very beginning as man-made.
It is said by some Muslim authorities that during the first caliphate of Abu Bakr, immediately after Muhammad's death, concern arose that his orally transmitted words might be forgotten. So many Muslim soldiers had been killed in battle that the number who had the Koran safely lodged in their memories had become alarmingly small. It was therefore decided to assemble every living witness, together with "pieces of paper, stones, palm leaves, shoulder-blades, ribs and bits of leather" on which sayings had been scribbled, and give them to Zaid ibn Thabit, one of the Prophet's former secretaries, for an authoritative collation. Once this had been done, the believers had something like an authorized version.
If true, this would date the Koran to a time fairly close to Muhammad's own life. But we swiftly discover that there is no certainty or agreement about the truth of the story. Some say that it was Ali—the fourth and not the first caliph, and the founder of Shiism—who had the idea. Many others—the Sunni majority—assert that it was Caliph Uthman, who reigned from 644 to 656, who made the finalized decision. Told by one of his generals that soldiers from different provinces were fighting over discrepant accounts of the Koran, Uthman ordered Zaid ibn Thabit to bring together the various texts, unify them, and have them transcribed into one. When this task was complete, Uthman ordered standard copies to be sent to Kufa, Basra, Damascus, and elsewhere, with a master copy retained in Medina. Uthman thus played the canonical role that had been taken, in the standardization and purging and censorship of the Christian Bible, by Irenaeus and by Bishop Athanasius of Alexandria. The roll was called, and some texts were declared sacred and inerrant while others became "apocryphal." Outdoing Athanasius, Uthman ordered that all earlier and rival editions be destroyed.
Even supposing this version of events to be correct, which would mean that no chance existed for scholars ever to determine or even dispute what really happened in Muhammad's time, Uthman's attempt to abolish disagreement was a vain one. The written Arabic language has two features that make it difficult for an outsider to learn: it uses dots to distinguish consonants like "b" and "t," and in its original form it had no sign or symbol for short vowels, which could be rendered by various dashes or comma-type marks. Vastly different readings even of Uthman's version were enabled by these variations. Arabic script itself was not standardized until the later part of the ninth century, and in the meantime the undotted and oddly voweled Koran was generating wildly different explanations of itself, as it still does. This might not matter in the case of the Iliad, but remember that we are supposed to be talking about the unalterable (and final) word of god. There is obviously a connection between the sheer feebleness of this claim and the absolutely fanatical certainty with which it is advanced. To take one instance that can hardly be called negligible, the Arabic words written on the outside of the Dome of the Rock in Jerusalem are different from any version that appears in the Koran.
The situation is even more shaky and deplorable when we come to the hadith, or that vast orally generated secondary literature which supposedly conveys the sayings and actions of Muhammad, the tale of the Koran's compilation, and the sayings of "the companions of the Prophet." Each hadith, in order to be considered authentic, must be supported in turn by an isnad, or chain, of supposedly reliable witnesses. Many Muslims allow their attitude to everyday life to be determined by these anecdotes: regarding dogs as unclean, for example, on the sole ground that Muhammad is said to have done so.
As one might expect, the six authorized collections of hadith, which pile hearsay upon hearsay through the unwinding of the long spool of isnads ("A told B, who had it from C, who learned it from D"), were put together centuries after the events they purport to describe. One of the most famous of the six compilers, Bukhari, died 238 years after the death of Muhammad. Bukhari is deemed unusually reliable and honest by Muslims, and seems to have deserved his reputation in that, of the three hundred thousand attestations he accumulated in a lifetime devoted to the project, he ruled that two hundred thousand of them were entirely valueless and unsupported. Further exclusion of dubious traditions and questionable isnads reduced his grand total to ten thousand hadith. You are free to believe, if you so choose, that out of this formless mass of illiterate and half-remembered witnessing the pious Bukhari, more than two centuries later, managed to select only the pure and undefiled ones that would bear examination.
The likelihood that any of this humanly derived rhetoric is "inerrant," let alone "final," is conclusively disproved not just by its innumerable contradictions and incoherencies but by the famous episode of the Koran's alleged "satanic verses," out of which Salman Rushdie was later to make a literary project. On this much-discussed occasion, Muhammad was seeking to conciliate some leading Meccan polytheists and in due course experienced a "revelation" that allowed them after all to continue worshipping some of the older local deities. It struck him later that this could not be right and that he must have inadvertently been "channeled" by the devil, who for some reason had briefly chosen to relax his habit of combating monotheists on their own ground. (Muhammad believed devoutly not just in the devil himself but in minor desert devils, or djinns, as well.) It was noticed even by some of his wives that the Prophet was capable of having a "revelation" that happened to suit his short-term needs, and he was sometimes teased about it. We are further told—on no authority that need be believed—that when he experienced revelation in public he would sometimes be gripped by pain and experience loud ringing in his ears. Beads of sweat would burst out on him, even on the chilliest of days. Some heartless Christian critics have suggested that he was an epileptic (though they fail to notice the same symptoms in the seizure experienced by Paul on the road to Damascus), but there is no need for us to speculate in this way. It is enough to rephrase David Hume's unavoidable question. Which is more likely—that a man should be used as a transmitter by god to deliver some already existing revelations, or that he should utter some already existing revelations and believe himself to be, or claim to be, ordered by god to do so? As for the pains and the noises in the head, or the sweat, one can only regret the seeming fact that direct communication with god is not an experience of calm, beauty, and lucidity.
Monday, April 23
Tougher gun laws are better politics than you think.
By Bruce Reed, SLATE
Monday, Apr. 23, 2007
Misfire: George H.W. Bush is famous for saying, "Read my lips," but the three words that best captured the way America felt during the first Bush administration were a catch phrase from Dana Carvey—"not gonna happen." The country faced a host of daunting social and economic problems, from rising crime rates to shrinking incomes to deep divisions that burst into view in South Central Los Angeles. But what troubled people most was that no matter how urgent the problem, the answer from Washington was always the same: "not gonna happen."
One Bush later, we find ourselves in the same grim mood today. We face a series of monumental challenges—Iraq, climate change, a vanishing social contract. Such problems would be breathtakingly difficult in any era but seem virtually impossible in this one. Glaciers move faster than our politics, and both are receding.
We have good reason to feel this way. Nothing happened after Hurricane Katrina. Nothing new ever seems to happen in Iraq. Even when something appears to happen, such as last week's decision on abortion, we know better: Nothing's happening when the same issues never go away.
But last week's response to the Virginia Tech tragedy made it official: Not-Gonna-Happen Days are here again. Across the political spectrum, commentators reached the same conclusion. Whatever they think ought to be done to prevent future tragedies, they're unanimous on one point: We're not going to do it.
Even in the ivory towers, where the laws of political gravity don't apply, the dreamers were silent. For its online feature, Think Tank Town, the Washington Post asked a variety of scholars, "How can policies be improved in the wake of the Virginia Tech shootings?" All the posts had more or less the same headlines: "The Real Problem Transcends Policy," "Gun Control Doesn't Fit This Crime," "Not Every Tragedy Has a Solution," "Evil Is Always With Us." Another post concluded, "There is not much we can or should do." Another warned not to pass new laws because existing ones might be the culprit. No scholar proposed much of anything on guns.
Granted, most of the scholars in the Post survey come from center-right think tanks and have ideological biases toward doing nothing. But they're not the only ones the Post asked. The center-left think tanks on the Post's list—like Brookings and the Center for American Progress—didn't even bother to show up.
Those of us who work in think tanks are supposed to come up with ideas with little or no chance of passage. Yet in this age of policy ennui, even people who get paid to be hopelessly unrealistic can't suspend disbelief on guns.
I grew up in gun country, and I know what it's like to be strafed by the NRA. I understand why Democrats from red states don't want to risk the next election on an issue of little interest back home. But over the long haul, it is a substantive and political mistake to duck the issue altogether. Guns are a cultural issue but also a crime one—and both parties should have learned over the years that they dodge any crime issue at their peril.
The substantive case for common-sense gun crime and safety measures is clear enough. When Clinton signed the Brady Bill in 1993—after seven years of talk that it would never happen—the NRA said the new law was pointless. In the years since, it has kept handguns out of the hands of tens of thousands of criminals, stalkers, and troubled individuals. If Virginia had properly interpreted the law, it probably would have stopped Cho from buying the guns that wreaked havoc at Virginia Tech.
When the 1994 crime bill banned the manufacture of high-capacity ammunition clips, the NRA once again went ballistic. The bill wasn't as tough as it should have been, because NRA sympathizers in Congress grandfathered existing clips. But the ban kept more clips from flooding the market. The best testimony to its impact is how much gun manufacturers tout that it has lapsed. TopGlock.com offers "new Glock factory magazines that are legal under the repeal of the 1994 Assault Weapons bill." The 15-round clip Cho used with his Glock semiautomatic pistol is on sale for $19.72. TopGlock advertises the clips on a "sunset" page (to mark the law's sunset), which you can access by clicking on the ad for ammunition clips that's just above the tribute to the victims at Virginia Tech.
The political case for not running for cover on guns is equally straightforward. Unlike most politicians, voters are not ideological about crime. They don't care what it takes, they just want it to go down. The Brady Bill and the clip ban passed because the most influential gun owners in America—police officers and sheriffs—were tired of being outgunned by drug lords, madmen, and thugs.
When Democrats ignore the gun issue, they think about the political bullet they're dodging but not about the opportunity they'll miss. In the 1980s, Republicans talked tough on crime and ran ads about Willie Horton but sat on their hands while the crime rate went up. When Bill Clinton promised to try everything to fight crime—with more police officers on the street, and fewer guns—police organizations dropped their support for the GOP and stood behind him instead.
The current political calculus is that guns cost Gore the 2000 election by denying him West Virginia and his home state of Tennessee. This argument might be more convincing if Gore hadn't essentially carried the gun-mad state of Florida. In some states, the gun issue made it more difficult for Gore to bridge the cultural divide but hardly caused it. Four years earlier, Gore and Clinton carried those same states with the same position on guns and the memory of the assault-weapons ban much fresher in voters' minds.
Not so long ago, in fact, Republicans were the ones who feared the gun issue. At his first campaign stop en route to the 1996 Democratic convention, Clinton stood with police officers to promise that in his second term, he would expand the Brady Bill to cover people with histories of domestic violence. Republicans in Congress were so afraid guns would hurt them in the suburbs, they sent Clinton the Brady expansion a few weeks later.
In those days, Rudy Giuliani was still in favor of tough gun-crime laws, either because he believed in them as a former prosecutor or because they were wildly popular. Giuliani's politics have changed, but contrary to conventional wisdom, the politics of guns have not. If gun laws were a true third rail, Michael Bloomberg—who wants to be president as much as any candidate in the race—wouldn't be seizing the opening to launch a national crusade around them.
Voters aren't the obstacle to banning high-capacity clips or closing the gun-show loophole; they support those measures by broad margins. The real hurdle is finding leaders who are willing to get tough on crime, no matter where they find it—and who have the standing to prove they know the difference between hunters and criminals. Bill Clinton wasn't a lifelong hunter, like Mitt Romney. He didn't need to be. He was a Bubba.
In recent years, Democrats have suffered a Bubba shortage. But Democratic Bubbas are making a comeback in the South, Midwest, and West. As they gain confidence, they will realize, as Clinton did, that real Bubbas look to cops for approval, not the NRA.
As it happens, one Bubba is in a unique position to lead a hard-headed look at gun laws and gun-crime enforcement: the new senator from Virginia, Jim Webb. Webb is one of the most independent-minded senators in memory and an outspoken man of principle. With an aide who was arrested for bringing a loaded gun into his Senate office, he has an unassailable pro-gun record. Moreover, the state Webb represents is deep in grief over a tragedy that underscores points that both the NRA and gun-control proponents have made—that our gun laws have too many loopholes and that existing laws need to be better enforced. Webb could even lead the effort hand in hand with his Republican colleague, Sen. John Warner, who voted against the assault ban in 1994 but stood with police officers in opposing its repeal in 2004.
A thorough look at gun laws might not lead in predictable ways. But the gun debate desperately needs what Webb and Warner could bring—a preference for independence over ideology, and the moral authority that comes from rejecting the politics of "not gonna happen" in favor of trying to find ways to prevent senseless crimes from happening again.
Friday, April 20
The Conservatives on the Supreme Court have gone into Wonderland
"Beyond Alice in Wonderland" . . .
Marty Lederman (Yale Law, reposted from Balkinization)
. . . to criminalize abortion to protect women. That's how Reva Siegel puts the point in this important piece from Linda Greenhouse in the New York Times. Greenhouse elaborates:
[N]ever until Wednesday had the court held that an abortion procedure could be prohibited because the procedure itself, not the pregnancy, threatened a woman’s health — mental health, in this case, and moral health as well. In his majority opinion, Justice Anthony M. Kennedy suggested that a pregnant woman who chooses abortion falls away from true womanhood.
“Respect for human life finds an ultimate expression in the bond of love the mother has for her child,” he said.
Justice Kennedy conceded that “we find no reliable data” on whether abortion in general, or the procedure prohibited by the Partial-Birth Abortion Ban Act, causes women emotional harm. But he said it was nonetheless “self-evident” and “unexceptional to conclude” that “some women” who choose to terminate their pregnancies suffer “regret,” “severe depression,” “loss of esteem” and other ills.
Consequently, he said, the government has a legitimate interest in banning a particularly problematic abortion procedure to prevent women from casually or ill-advisedly making “so grave a choice.”
* * * *
In an article to be published shortly in The University of Illinois Law Review, Professor Siegel traces the migration of the notion of abortion’s harm to women from internal strategy sessions of the anti-abortion movement in the 1990s to the formation of legal arguments and public policy.
* * * *
On his blog, Balkinization, Prof. Jack M. Balkin of Yale Law School defined the message behind what he called the “new paternalism”: “Either a woman is crazy when she undergoes an abortion, or she will become crazy later on.”
Despite the activity in the states, the anti-abortion movement’s new focus remained largely under the radar until it emerged full-blown in Justice Kennedy’s opinion. As evidence that “some women come to regret their choice to abort the infant life they once created and sustained,” Justice Kennedy cited a brief filed in the case by the Justice Foundation, an anti-abortion group that runs a Web site and telephone help line for women “hurting from abortion.” The brief contained affidavits from 180 such women, describing feelings of shame, guilt and depression.
As the Greenhouse piece explains, this new woman-protective component in the Court's jurisprudence consists of at least two parts. The first is an empirical presumption that women choosing abortions regularly suffer "regret," "anguished grief," "profound sorrow," and (as if that weren't enough) "severe depression and loss of esteem."
The second is a form of paternalism in which the Court (in Justice Ginsburg's words) "deprives women of the right to make an autonomous choice, even at the expense of their safety." As Jack writes below, "[t]he basic goal of this new rhetoric is to undermine the notion that women exercise any kind of choice when they decide to have abortions. It seeks to turn the rhetoric of the pro-choice movement on its head. Women, the new rhetoric argues, don't really understand what they are doing when they decide to have abortions; as a result, they often regret having them later on."
The argument for abortion restrictions (and the informed-consent laws that Jack identifies) is based in no small part not only on such empirical presumptions about the dire fate of women who choose abortions, but also on essentialist sex-based generalizations, most importantly that "respect for human life finds an ultimate expression in the bond of love the mother has for her child."
There are (at least) two obvious and significant problems with this new move. The first is that, until this week, to base state policy on such stereotypes and paternalism was itself a violation of the Equal Protection Clause, under numerous modern precedents. "This way of thinking," Justice Ginsburg lamented in dissent, "reflects ancient notions about women's place in the family and under the Constitution -- ideas that have long since been discredited." As Justice Ginsburg's citations demonstrate, the Court itself had played no small role in discrediting those ideas, in a series of cases litigated by, if not (in the case of the 1996 VMI decision) written by, Justice Ginsburg herself over the past 40-plus years. (Hence my quotation in the Times yesterday that the Court's women-protective rationale was "an attack on [Justice Ginsburg's] entire life's work.") (For more on this theme, see Dahlia Lithwick here.)
The second problem is that even if what Jack calls the "New Paternalism" would be justifiable if based on actual empirical trends -- again, something that the Court has routinely rejected as unconstitutional since 1970 -- the assumptions the Court invokes about the effects of abortions upon women are concededly based not on any reliable evidence ("While we find no reliable data to measure the phenomenon . . . "), but instead merely on anecdotes culled from an amicus brief (the only authority cited).
In this respect, as Jack notes below and as Reva Siegel's article examines in great detail, the Court's opinion is the striking culmination of a concerted litigation strategy that abortion-rights opponents designed in the past few years, in which they have so resolutely publicized anecdotal "proof" that abortions are bad for women that Justice Kennedy is now comfortable stating -- as the law of the land -- that the Court finds these truths to be "self-evident" and "unexceptional." (For much more on how this counter-narrative strategy took root and flourished so quickly, see this important article by Emily Bazelon in the New York Times Magazine back in January.)
This reliance on anecdote and popular myth, rather than on empirical proof, is a troubling development, one that has echoes in the Court's first decision of the Term, the little-noticed but significant per curiam opinion in Purcell v. Gonzales.
Purcell involved an Arizona voter-identification law that had been challenged as a violation of the Constitution and federal election laws. The court of appeals temporarily enjoined operation of the law, and Arizona made an emergency application for a stay to the Supreme Court. The Court sua sponte treated the request as a petition for certiorari, granted it, and reversed the court of appeals just before last November's election, thereby allowing the identification requirements to be implemented pending a full trial on the merits. In so doing, the Court described constitutional interests on both sides of the case.
The State's compelling interest, the Court said, was preventing so-called voter fraud: "Confidence in the integrity of our electoral processes is essential to the functioning of our participatory democracy. Voter fraud drives honest citizens out of the democratic process and breeds distrust of our government. Voters who fear their legitimate votes will be outweighed by fraudulent ones will feel disenfranchised."
The Court in Purcell thus unanimously credited the view that voters "feel" disenfranchised when voter fraud (allegedly) takes place in elections, and that such a "feeling" offsets the interests of voters who are disenfranchised by voter ID laws by actually driving honest citizens out of the democratic process!
This striking claim was accompanied by no citation of authority, and was based on no evidence whatsoever. Indeed, as Rick Hasen has explained in decrying the Court's "wholly unsupported empirical assumption":
[T]he available evidence . . . seems to suggest that voter identification requirements are more likely to depress turnout than to increase it, and that voter confidence in the electoral process, at least among African-Americans, is decreasing because of voter identification requirements. The empirical case for that contrary point is not yet solid, but the assumption is at least plausible given the evidence. The Court’s supposition does not even rely on any suggestive evidence, and I am aware of none. . . . Moreover, the Supreme Court did not acknowledge that some voters might “feel” disenfranchised when the state imposes barriers on voting such as a voter-identification law without proof that such laws are necessary to deter voter fraud. At the very least, the Court should have ordered briefing and oral argument on the question, which would have allowed the challengers to bring to the Court’s attention the Missouri Supreme Court’s important discussion of the issue [in its 2006 Weinschenk decision], which concluded that "if this Court were to approve the placement of severe restrictions on Missourians’ fundamental rights owing to the mere perception of a problem in this instance, then the tactic of shaping public misperception could be used in the future as a mechanism for further burdening the right to vote or other fundamental rights."
As we have learned in the wake of recent events, and as I discussed last week in connection with the several scandals swirling about the Department of Justice, this common perception of widespread voter fraud -- like the new received wisdom that abortions harm women -- is overwhelmingly the result not of any actual problem, but instead of a concerted effort (in this case, alas, powerfully supported by those wielding state power) to establish a mythical but increasingly canonical narrative. And here, the object of that narrative is to disenfranchise likely Democratic voters. (See, e.g., recent stories in the New York Times, McClatchy, and In These Times. See also Rick Hasen's account of one district court's recent rejection of a DOJ case based on alleged but nonexistent "voter fraud.")
Unfortunately, as in Wednesday's abortion case, the myth has so successfully taken hold in the public imagination that it has now been enshrined as the predicate for the Law of the Land.
Curiouser and curiouser . . .
Posted 1:04 AM by Marty Lederman
Thursday, April 19
NYT Editorial on Abortion Decision
Among the major flaws in yesterday’s Supreme Court decision giving the federal government power to limit a woman’s right to make decisions about her health was its fundamental dishonesty.
Under the modest-sounding guise of following existing precedent, the majority opinion — written by Justice Anthony Kennedy and joined by Chief Justice John Roberts and Justices Clarence Thomas, Antonin Scalia and Samuel Alito — gutted a host of thoughtful lower federal court rulings, not to mention past Supreme Court rulings.
It severely eroded the constitutional respect and protection accorded to women and the personal decisions they make about pregnancy and childbirth. The justices went so far as to eviscerate the crucial requirement, which dates to the 1973 ruling in Roe v. Wade, that all abortion regulations must have an exception to protect a woman’s health.
As far as we know, Mr. Kennedy and his four colleagues responsible for this atrocious result are not doctors. Yet these five male justices felt free to override the weight of medical evidence presented during the several trials that preceded the Supreme Court showdown. Instead, they ratified the politically based and dangerously dubious Congressional claim that criminalizing the intact dilation and extraction method of abortion in the second trimester of pregnancy — the so-called partial-birth method — would never pose a significant health risk to a woman. In fact, the American College of Obstetricians and Gynecologists has found the procedure to be medically necessary in certain cases.
Justice Kennedy actually reasoned that banning the procedure was good for women in that it would protect them from a procedure they might not fully understand in advance and would probably come to regret. This way of thinking, that women are flighty creatures who must be protected by men, reflects notions of a woman’s place in the family and under the Constitution that have long been discredited, said a powerful dissenting opinion by Justice Ruth Bader Ginsburg, joined by Justices John Paul Stevens, David Souter and Stephen Breyer.
Far from being compelled by the court’s precedents, Justice Ginsburg aptly objected, the new ruling is so at odds with its jurisprudence — including a concurring opinion by Justice Sandra Day O’Connor (who has now been succeeded by Justice Alito) when a remarkably similar state abortion ban was struck down just seven years ago — that it should not have staying power.
For anti-abortion activists, this case has never been about just one controversial procedure. They have correctly seen it as a wedge that could be used to undermine and perhaps eventually eliminate abortion rights. The court has handed the Bush administration and other opponents of women’s reproductive rights the big political victory they were hoping to get from the conservative judges Mr. Bush has added to the bench. It comes at a real cost to the court’s credibility, its integrity and the rule of law.
Under the modest-sounding guise of following existing precedent, the majority opinion — written by Justice Anthony Kennedy and joined by Chief Justice John Roberts and Justices Clarence Thomas, Antonin Scalia and Samuel Alito — gutted a host of thoughtful lower federal court rulings, not to mention past Supreme Court rulings.
It severely eroded the constitutional respect and protection accorded to women and the personal decisions they make about pregnancy and childbirth. The justices went so far as to eviscerate the crucial requirement, which dates to the 1973 ruling in Roe v. Wade, that all abortion regulations must have an exception to protect a woman’s health.
As far as we know, Mr. Kennedy and his four colleagues responsible for this atrocious result are not doctors. Yet these five male justices felt free to override the weight of medical evidence presented during the several trials that preceded the Supreme Court showdown. Instead, they ratified the politically based and dangerously dubious Congressional claim that criminalizing the intact dilation and extraction method of abortion in the second trimester of pregnancy — the so-called partial-birth method — would never pose a significant health risk to a woman. In fact, the American College of Obstetricians and Gynecologists has found the procedure to be medically necessary in certain cases.
Justice Kennedy actually reasoned that banning the procedure was good for women in that it would protect them from a procedure they might not fully understand in advance and would probably come to regret. This way of thinking, that women are flighty creatures who must be protected by men, reflects notions of a woman’s place in the family and under the Constitution that have long been discredited, said a powerful dissenting opinion by Justice Ruth Bader Ginsburg, joined by Justices John Paul Stevens, David Souter and Stephen Breyer.
Far from being compelled by the court’s precedents, Justice Ginsburg aptly objected, the new ruling is so at odds with its jurisprudence — including a concurring opinion by Justice Sandra Day O’Connor (who has now been succeeded by Justice Alito) when a remarkably similar state abortion ban was struck down just seven years ago — that it should not have staying power.
For anti-abortion activists, this case has never been about just one controversial procedure. They have correctly seen it as a wedge that could ultimately be used to undermine and perhaps eliminate abortion rights. The court has handed the Bush administration and other opponents of women’s reproductive rights the big political victory they were hoping to get from the conservative judges Mr. Bush has added to the bench. It comes at a real cost to the court’s credibility, its integrity and the rule of law.
Wednesday, April 18
Re-education
The New York Times Magazine
By ANN HULBERT
I. The Student
“Definitely wake me up around 9!!! I have an important presentation . . . wake me up at that time please. . . . Thanks!! Meijie.”
The e-mail message, sent to me at 3:55 a.m. under the subject line “yeah!” was my enthusiastic welcome to Harvard from a freshman named Tang Meijie. That was last May, nine months after she arrived on campus from mainland China. Except for the ungodly hour at which the message was dashed off, you wouldn’t have guessed that its author had come to Cambridge trailing accomplishments and expectations that were impressive even by Harvard standards. Nor was there obvious evidence of a student superstar in the tousled figure in a sweatshirt and khakis who appeared at the Greenhouse Café in the Science Center at around 10 a.m. Greeting me with a reflexive bow, as she had at our first meeting a couple of months earlier, Meijie apologized for taking a few minutes to finish up the talk she had been assigned to give that morning in one of her courses.
Her topic gave her away. What Meijie was editing between bites of a bacon cheeseburger and sips of coffee was a short presentation for an expository-writing class called Success Stories. The questions addressed in the course, which focused on “what philosopher William James once called ‘our national disease,’ the pursuit of success,” have become newly urgent ones in Meijie’s own country. “What is ‘success’?” the course introduction asked. “Is it a measure of one’s financial worth? Moral perfection? Popularity? How do families, schools and popular culture invite us to think about success? And how are we encouraged to think about failure?” At Harvard, she and her classmates were discussing those issues as they read, among other things, “The Great Gatsby” and David Brooks on America’s résumé-rich “organization kids” and watched movies. In China, a nation on a mission to become a 21st-century incubator of “world class” talent, Meijie is the movie. As she progressed through her classes in the cutting-edge city of Shanghai, spent a year abroad at a private high school in Washington, D.C., and came to Harvard, she became a celebrated embodiment of China’s efforts to create a new sort of student — a student trying to expand her country’s sometimes constricting vision of success.
Downstairs in the computer room of the Science Center, Meijie showed me the thousands of Chinese citations that come up when you Google her name. “That’s very crazy,” she said with a laugh, a girl all too familiar with the Chinese ardor for anything associated with the name Harvard. Getting in “early action” in December 2004 set off a media frenzy at home, where it’s still relatively rare for students to enroll as undergraduates at elite American schools, and study abroad promises to provide a crucial edge in a jammed job market. A packet of press coverage her parents gave me — Meijie rolled her eyes at the trove — portrayed her as every Chinese parent’s dream child. Child magazine accompanied photos of Meijie and her parents with counsel on how to “raise a great child.” The winner of no fewer than 76 prizes at the “city level” or above, as one article marveled, she was a model that top Chinese students themselves were dying to emulate. “What Does Her Success Tell Us?” read a headline on an article in The Shanghai Students’ Post. “Meijie Knocked at the Door of Harvard. Do You Want to Copy?” asked The Morning News Express in bold Chinese characters. For months, she was besieged by journalists begging to profile her; publishers, she recalls, clamored to sign her up to write her life story and companies asked her to advertise their products. A director of Goldman Sachs’s China division wanted her on the board of the private school he recently helped found, which was then under construction in an erstwhile rice field outside Shanghai.
But what was truly exceptional about Meijie was how she responded to the adulation. The fervent worship back home made Meijie uncomfortable and anxious to clarify what she wasn’t. “Don’t call me ‘Harvard Girl,’ ” she told one of many magazine interviewers. She was referring to a student six years ahead of her, Liu Yiting, whose arrival at Harvard in 1999 made her a huge celebrity in China when her parents published a book, “Harvard Girl,” describing the meticulous regimen that produced their star. It quickly sold almost a million and a half copies and inspired numerous how-to-groom-your-child-to-get-into-college-abroad knockoffs. For all her triumphs, Meijie wasn’t obsessed with being at the head of the class and didn’t want the well-programmed-paragon treatment. She excelled in assorted subjects, but her school reported that her overall ranking wasn’t in the top 10 percent. Her parents had stood by, a little stunned, as their intrepid daughter won distinction in an unusual way, by accomplishing all kinds of things outside of the classroom.
Amid the hoopla, Meijie insisted that the last thing Chinese students (or parents) needed was to be encouraged in their blind reverence for an academic brand name, much less be told there was some new formula to follow and competitive frenzy to join. That was just the kind of pressure they had too much of already. It was everywhere in a culture with a long tradition of rigidly hierarchical talent selection, dating back to the imperial civil-service-exam system more than a thousand years ago — and still there in a school system driven by a daunting national college-entrance exam. The Chinese call it the gaokao, a three-day ordeal for which the preparation is arduous — and on which a single point difference can spell radically different life options. The cramming ethos, which sets in before high school, was what Meijie had tried hard not to let erode her curiosity. In her experience, America had come to stand for a less pressured and more appealing approach to schooling. “There is something in the American educational system that helps America hold its position in the world,” she told me. “Many people will think it’s a cliché, but there is something huge about it, although there are a lot of flaws — like bad public schools and other stuff. But there’s something really good, and it’s very different from my educational system.”
Once at Harvard, in the fall of 2005, Meijie figured out what she wanted to do. She would try to make liberal education’s ideal of well-rounded self-fulfillment “more real in China.” She plunged into conceiving a summer exchange program run by and for students. Meijie named it the Harvard Summit for Young Leaders in China, or Hsylc — pronounced “H-silk,” evoking the historic trading route. In August 2006, on the campus of that now-completed private school outside Shanghai whose board she had joined, a cosmopolitan array of Harvard undergraduates would offer a dose of the more freewheeling American campus and classroom experience. Meijie and an inner circle of organizers (similarly on-the-go Harvard women, all of Chinese descent, some reared in the U.S.) envisaged nine days of small-group discussions on wide-ranging issues outside of math and science. Hsylc would also offer extracurricular excitement and social discovery — chances for students to try new things and connect with one another, rather than compete for prizes. The participants that Meijie had in mind were several hundred promising Chinese high-schoolers, to be chosen in an un-Chinese way. She and a selection committee would pick them on the basis not of their G.P.A.’s but of their extracurricular activities and their essays in response to the kinds of open-ended prompts they never encountered at school. On her list was a question that might be a banality in the U.S. but was a heresy at home: “If you could do one thing to change the world, what would it be?”
Meijie’s answer to that question — help shake up Chinese education — puts her in step with the latest wave of a 30-year-old government effort to overhaul China’s schools and universities to keep pace with “socialist modernization.” After the chaos of the Cultural Revolution, when schools were closed and cadres of students assaulted “enemies of the state,” Deng Xiaoping resumed the National College Entrance Exam in 1977, marking the start of a radical expansion of the education system. A developing economy demanded it; the implications for politics were less clear, and after Tiananmen Square, there was a brief slowdown. The continued growth since then has been a success in many respects; educational attainments and college attendance have surged. Yet in the process, some prominent government officials have grown concerned that too many students have become the sort of stressed-out, test-acing drone who fails to acquire the skills — creativity, flexibility, initiative, leadership — said to be necessary in the global marketplace. “Students are buried in an endless flood of homework and sit for one mock entrance exam after another, leaving them with heads swimming and eyes blurred,” lamented former Vice Premier Li Lanqing in a book describing his efforts to address the problem. They arrive at college exhausted and emerge from it unenlightened — just when the country urgently needs a talented elite of innovators, the word of the hour. A recent report from the McKinsey consulting firm, “China’s Looming Talent Shortage,” pinpointed the alarming consequences of the country’s so-called “stuffed duck” tradition of dry and outdated knowledge transfer: graduates lacking “the cultural fit,” language skills and practical experience with teamwork and projects that multinational employers in a global era are looking for.
Even as American educators seek to emulate Asian pedagogy — a test-centered ethos and a rigorous focus on math, science and engineering — Chinese educators are trying to blend a Western emphasis on critical thinking, versatility and leadership into their own traditions. To put it another way, in the peremptorily utopian style typical of official Chinese directives (as well as of educationese the world over), the nation’s schools must strive “to build citizens’ character in an all-round way, gear their efforts to each and every student, give full scope to students’ ideological, moral, cultural and scientific potentials and raise their labor skills and physical and psychological aptitudes, achieve vibrant student development and run themselves with distinction.” Meijie’s rise to star student reflects a much-publicized government call to promote “suzhi jiaoyu” — generally translated as “quality education,” and also sometimes as “character education” or “all-round character education.” Her story also raises important questions about the state’s effort, which has been more generously backed by rhetoric than by money. The goal of change is to liberate students to pursue more fulfilling paths in a country where jobs are no longer assigned; it is also to produce the sort of flexibly skilled work force that best fits an international knowledge economy. But can personal desires and national demands be reconciled? Will the most promising students of the new era be as overburdened and regimented as before? As new opportunities have begun to emerge, so have tensions. If Meijie’s own trajectory and her Hsylc brainchild are any guide, the force most likely to spur on deep-seated educational ferment in China may well turn out to be students themselves — still struggling with stress, yet doing so in an era of greater personal independence and international openness. Overachievers of the world unite!
II. The Expansion
Brave Shanghai’s traffic and head southwest for 40 minutes to the well-groomed grounds of Xiwai International School, the site of last year’s Hsylc conference, and you see the broad contours of what has been happening in Chinese education. In an area that is projected to become Shanghai’s biggest satellite city, new construction is everywhere and up-to-date school campuses are being built. While American leaders have been debating how best to demand more accountability from a decentralized education system, the Chinese government has decided to loosen its administrative and financial control. The process dates back 20 years now, to the Decision on the Reform of the Education System, issued in 1985 (the year Meijie was born). The push was on to consolidate the Soviet-style hyperspecialized universities into more comprehensive institutions; with the Compulsory Education Law of 1986, mandating nine years of education for all, a major expansion was also under way. In the early 1990s, the government urged an easing of exam pressures and took the step of encouraging “social forces” to establish private schools alongside the public system.
Parents whose own schooling was curtailed by the Cultural Revolution have been avid to realize their educational ambitions — the Confucian key to social and moral advancement — in the paths they chart for their “little emperors,” the singletons mandated by the one-child policy of the past quarter of a century. The pace of growth and school privatization surged in the course of the 1990s. The goal was to send 15 percent of the college-age population on to the postsecondary level — that figure being the standard definition of “mass higher education” — by 2010. Meanwhile, extra financing went to a group of top universities in a quest to make them “world class.” And in the new millennium, rice paddies are still making way for state-of-the-art school facilities. A nonprofit, private school, Xiwai could be mistaken for a medium-size college. Its spacious brick classroom buildings and dorms (capacity 3,500 students, from pre-K to 12th grade) flank a lovely courtyard with a fountain in the middle. At one end stand an imposing library and a dining facility, and across the way is a large arts-and-sports complex.
“You could say we overbuilt,” said Xiwai’s co-founder, Xu Ziwang. Boyish in his khakis and navy blazer, Xu, who is 50, has energy to match the wealth he earned as one of Goldman Sachs’s first mainland Chinese partners. He has devoted both his zeal and money to establishing the school with Lin Min, Xiwai’s headmaster, plowing proceeds from local real estate development into the enterprise. Theirs is a project with roots in a past that could hardly have seemed more remote on the balmy fall day the two of them proudly showed me around the one-year-old campus. Friends from their teenage years on a farm during the Cultural Revolution, Xu and Lin were sent from school to the countryside when they were about the age of the oldest Xiwai students who greeted us cheerfully on the paved pathways. The two men were among the many millions who, feverishly studying when they weren’t busy at their appointed labors, swarmed to take the college-entrance exam in the first sittings in 1977 and 1978; they ended up among the few who scored high enough to secure a scarce college spot. Thirty years later, both had studied and worked abroad (Xu in the United States, Lin in Slovenia, England and New Zealand), and back home, Xu had played a big role in privatization deals. Here they stood on what had been mud, eagerly sharing their vision of a pedagogical and curricular renaissance that would produce a generation “better than us.”
What a fortunate cohort today’s kids were, both men said: young people growing up in a booming country that had plenty of problems but also a growing middle class and expanding horizons. By 2006, China had vastly exceeded its higher-education enrollment goal; 22 percent of the college-age population — compared with roughly 40 percent of 18- to 24-year-olds in the United States — were receiving some form of postsecondary schooling. Yet Xu and Lin also joined in the widespread worry that Chinese youths, spared the real-life challenges their elders were forced to cope with, faced very different constraints. Hunkered down, doing endless exam-haunted schoolwork, they were constantly hovered over by their parents. In 1998, years before the McKinsey report of a talent shortage, Xu heard the wake-up call when he initiated Chinese recruiting for Goldman Sachs.
He picked three graduates from China’s top universities and was impressed that they all scored 100 percent on the exam following the associate training stint in New York — only to be disappointed a year later, when their performance reviews were in the bottom quartiles. “There’s a price,” he concluded, “for 12 years of prep for an exam, and that’s to always think there’s a narrow, right answer. If you give precise instructions, they do well. If you define a task broadly, they get lost and ask for help.” If he and Lin had their way, independent students eager to use their imaginations would be the dominant breed on their campus. They were counting on a rising tide of “broad-minded” parents eager to provide their children with the less-straitjacketed education — a creative mix of the best of East and West — that Xiwai preached and aimed to find teachers able to impart. But as we toured a campus plastered with exhortations to be “global citizens” and to “Smile, Embrace, Communicate, Cooperate, Negotiate,” Xu was also blunt: there are lots of obstacles, not the least of them the gaokao that exerts such sway. “The dilemma is, everybody realized it is the problem, but nobody knows what to do.”
Chinese routinely say they wish the exam weren’t such a monolithic force, and various provinces have lately been allowed to offer their own versions. Yet bigger changes — like Fudan University’s use last year of broader criteria and a totally different test to admit some 300 students — stir concern. In a country so huge — and in a culture so steeped in cronyism — the fear is that no other process could work as fairly. Meanwhile, the success of China’s educational expansion hasn’t eased gaokao panic, and in fact has made the secondary-school exam a newly fraught hurdle. The unforeseen pressures have unfolded this way: As the number of college graduates has outpaced the growth in desirable high-level jobs, generally located in China’s developed eastern region, one result has been a surge of unemployment among degree holders who resist settling for less. Along with that has come a rise in qualifications for lower-level jobs that once didn’t require a college diploma.
The situation has left students still desperately chasing elite-university credentials. A degree from the most prestigious Chinese schools, especially those given extra money in the quest for “world class” status (with Fudan and Jiao Tong universities in Shanghai, Peking and Tsinghua in Beijing at the pinnacle) — or from the University of Hong Kong or, more distinctive yet, from a college abroad — is the best shot at success in a job market where a big gap looms between top jobs and the level below. The college race has led in turn to an intensified struggle to get into the best high schools. They boast records of strong gaokao scorers and prestige university placements — yet high schools in general haven’t multiplied at the rate that colleges have. Xu wasn’t alone in sighing over these strains in the system and at the same time in seeing signs of hope: real change was bound to come.
III. The Experiment
When Meijie next had time to talk, it was in early June of last year, and she was swept up in arrangements for her education summit meeting in August. Among other things she and her fellow Harvard organizers would do when they were in Shanghai (where some Chinese university students would help out, too) was handle the late batch of Hsylc applications from seniors in China’s 10th-to-12th-grade high-school system. Meijie had extended the deadline for those applicants so they wouldn’t have to squeeze in work on the essays — one in English and one in Chinese — at the height of gaokao cramming. Answering Hsylc’s more creative questions would be a nice break for them, she told me at one point with a laugh and a shake of her cropped hair, and she wasn’t entirely kidding. Here was a college freshman who had barely closed her own blue books and was eagerly preparing to stage a $200,000 event (financed primarily by the Goldman Sachs Foundation, thanks to guidance from Xu). Lightening burdens, that “quality education” goal, was not exactly on any of these students’ agendas; juggling competing aspirations was more like it.
From the start, as Shanghai pioneered quality-education experiments during Meijie’s primary-school years in the early 1990s, she has been the rare student who navigated, undaunted, between China’s established educational ways and the emerging opportunities and expectations. Her upbringing reflects the deep-seated zeal for schooling that fuels but also complicates reform efforts. Almost the first thing Meijie told me about her mother (a former opera singer from a musical, Westernized family) and her father (a middle-school teacher of Chinese from a more traditional background) is that “they’re very typical Chinese parents.” By that she meant “they really focus on my education and cultivation.”
In China, a child’s schooling is a family endeavor worthy of great sacrifice, in money and time. Over dinner in Shanghai, a melodiously voluble Mrs. Tang confirmed that “when Meijie was very young we controlled her a lot, watched her very closely and guided her carefully. Luckily she was very cooperative and followed our instruction.” Effort rather than ability is considered the key to achievement — and among the most important expressions of filial piety is studying diligently (a word I heard a lot). “If there is no dark and dogged will, there will be no shining accomplishment; if there is no dull and determined effort, there will be no brilliant achievement” goes an old saying, invoked as soon as school starts — a far cry from the Western progressive interest in encouraging curiosity and play in the early years. Meijie told me her mother had her memorize her primary-school textbooks (much thinner than ours). Like many children, she was also sent to lessons in music, art and calligraphy. This kind of broader training is a legacy of the Confucian focus on self-perfection, and it is in step with the Maoist notion of “all-round development”; the emphasis is on practice and mastery, where American parents, busy enrolling their young kids in arty extras, are likely to stress self-expression and creativity.
For the reformist vision of more individualized, active learning, this ingrained educational drive has been something of a mixed blessing. It is a great core to build on: “quality education” advocates are emphatic that they have no intention of jettisoning a strong Asian heritage of discipline and humble, family-oriented commitment to self-cultivation. At the same time, the traditional emphasis on arduously conformist, adult-driven, hypercompetitive academic performance — well suited though it is to a standard class size of 40 or 50 — can get in the way of liberating individual initiative and easing pressures.
In her compulsory-education years, Meijie had plenty of old-style schooling — sitting in rows, being rigorously trained in the basics by revered teachers, and excelling. This was the well-entrenched approach observed by the developmental psychologists Harold W. Stevenson and James W. Stigler in the 1980s and praised in their frequently cited 1992 book “The Learning Gap: Why Our Schools Are Failing and What We Can Learn From Japanese and Chinese Education.” But she received new-style broadening, too. Seeing he had an eager reader, Meijie’s father began buying her books — she remembers the series of 115 Western classics he got a deal on one summer — in the belief that if she learned one thing from each, they were worth having. Meanwhile, in primary school, Meijie lucked into an early example of just the kind of extracurricular, community-oriented pursuit championed by Vice Premier Li. Thanks to an arrangement between her school and a Shanghai TV station, the 9-year-old Meijie was one of several top third graders tapped to produce a weekly kids’ news segment, which meant skipping class to work on the clips. She ended up doing it single-handedly for three years; her classmates’ parents pulled their children out, worried about school demands and exams.
Meijie moved on to middle school in the late 1990s just as “keypoint” schools, which accept the best students and are better financed, were banned from using the term in the interests of greater egalitarianism (though they remain as sought-after as ever). A lottery was instituted in Shanghai to spread the stellar students around. When Meijie landed in a merely ordinary school, her parents were distraught — and then upset when she flunked a computer-skills test. (She failed to hit “save.”) But soon they backed off, Mrs. Tang explained, to “let her develop herself because we saw how good she is.” Indeed, Meijie proceeded to reap benefits beyond Vice Premier Li’s dreams. “You have time to live your own life,” she told me, remembering the more laid-back atmosphere of her nonselect school, “and you have your freedom to think about a lot.” Among other things, she thought about Web design, partly to prove she was no computer dunce, but mainly because she was an unusually informed girl. Thirteen-year-old Meijie, former journalist, followed the news and was struck that in the midst of the Internet boom, “China is too quiet and behind” in appealing to teens. She saw a niche and focused on building one of the first popular youth sites in China. She was then recruited to help work on kids.eastday.com, a government-endorsed site with comprehensive information and services for younger teenagers.
Up to this point, which brought her to the turn of the millennium, Meijie’s experience was a preview of how less hierarchical, more flexible educational innovations might free an extroverted, quite extraordinary student — even as it also shed light on the persistent power exerted by stringent school expectations and demanding parents. By 2001, the pace of curricular change began to pick up, with private schools often in the lead, trumpeting mottoes like “We must put students in the center of learning and focus on cultivation of creativity.” At Xiwai, where I sat in on a first-grade class of merely 29, there was a smart board and desks arranged, Western-style, in clusters. A lively young teacher had the kids chanting cheerfully (and perfectly) in unison, old style, but also scrambling to find partners with whom to practice their Chinese characters; the room buzzed with collaborative work, as Xiwai’s administrators proudly pointed out.
Another day, over tea and then lunch in a cafe at East China Normal University, I met Cui Yunhuo, a young professor there who has been active in the nationwide curriculum review and implementation process. He gave an upbeat account of the progress he has seen in grades one through nine in a mere five years — though he also lamented the lack of good assessment methods. There is a wider variety of new textbooks to choose from, he explained, reporting that color had been added and outdated and often dense passages removed. Teachers are “more at an international level,” Cui said and gave me a booklet heavy on proclamations about the new importance placed on “encouraging students to inquire” and helping them “learn to learn.” More hands-on, project-based learning and cooperative endeavors are required. Time must also be allotted for “comprehensive practical activities and school-based curriculum,” which include optional courses designed by individual schools to appeal to students’ interests — a hortatory agenda hard to evaluate. At a so-called demonstration middle and high school I visited in Beijing, the vice principal extolled an environmental-studies project, which sent students to visit a waste-water recycling factory. They returned with ideas that they were eager to apply to the new campus under construction on Beijing’s outskirts. Student-run clubs are now de rigueur. There are also new curbs on competition. The middle-school entrance exam has been officially abolished. Shanghai eliminated midterms in the early primary-school grades, and weekend and vacation review classes are widely discouraged.
Yet no one pretends change is smooth. Cui worried that there hadn’t been enough effort devoted to explaining the curricular shifts to parents; others point to the lack of teacher training; and everyone cites dire problems in the countryside, like a shortage of teachers. Cui also joined Yang Deguang, a former president of Shanghai Normal University and now the executive director of the Higher Education Society of China, in voicing concern that competitive duress at the top has been spreading downward. “Don’t let the children lose at the starting point” has become a new parental slogan, Yang told me. The boom in private kindergartens had Cui worried, opposed as he is to the push for the early skill mastery and the highly structured English classes that are often their selling points. (With his own son, a fifth grader, he reported he was trying a new policy of fresh air and freedom: at 5 p.m. he sends him out of the house and tells him not to come home for an hour and a half.) Happy Cheung, who studied at the Harvard Graduate School of Education and is now the chairwoman of the Sino Capital Education Foundation in Beijing (as well as the co-host of a radio show on family education issues), told me that her fourth-grade daughter is one of the few students in her public-school class who isn’t enrolled in after-school English lessons or in “math olympics” — a craze that caught the attention of the Department of Education in Zhejiang Province, which canceled the olympics at the primary level, hoping to “lower the temperature.” In his book, Vice Premier Li warns against a vicious cycle: school efforts to diminish competition fueling a market for tutors outside of school, hired by parents as anxious as ever (or more) not to let their children get left behind — and perhaps give them an edge by developing special talents.
As Cheung is not the first to note, progressive ideas have a way of translating rather differently in a Chinese context. In 1919, John Dewey traveled to China, where his views on “student centered,” democratic education were all the rage, yet the rhetoric was fuzzy — as much a rallying cry for political renewal as a real blueprint for school change. If there is an American figure to whom Chinese proponents of more active, multidimensional, student-centered learning have listened especially attentively over the past half-decade, it is Howard Gardner of the Harvard Graduate School of Education (with whom Cheung herself studied). His Multiple Intelligences theory, which posits various different forms of intelligence in addition to the linguistic and logical-mathematical skills usually honed and rewarded in school, has inspired a huge array of books, articles and conferences. (There are also stuffed animals marketed as good for “interpersonal intelligence,” music boxes for “musical intelligence,” etc.) His work inspired a national project, begun in 2002, “Using M.I. Theory to Guide Discovery of Students’ Potential,” which financed efforts to implement the theory in classrooms all over the country.
Yet the M.I. vogue, as Cheung said over lunch, may reflect more familiar thinking than the fanfare suggests. The seemingly simple yet slippery theory readily lends itself to the homegrown tradition of “all round” cultivation, which is in fact informed by a quite different perspective. Where Gardner urges the individualized development of a distinctive blend of inborn abilities, she explained that his Chinese followers are prone to emphasize the structured mastery of multiple talents. Cheung brought along Zhan Wenling, a private-school principal who had toured around China in 2001 with a Ministry of Education delegation, explaining the “quality education” perspective. “Confucius said that a person is not simply a container,” Zhan exclaimed; a teacher “should be the fire, light the match,” and so must “know what kind of wood you are lighting.” But, she went on, it is not so easy for teachers to grasp the idea. The quest to promote a more student-driven ethos is also complicated by practical constraints, most notably China’s huge class sizes. Promoting discussion can be a problem. Earlier, another educator had told me: “You let them free, but it’s such a big group, it’s hard to get them back. It’s a real challenge how to get the balance right. Now the students may ask hundreds of different questions, and our teachers have to face that, they have to be well prepared.” A recent review of the national M.I. project turned up interesting experiments, I was told. Yet Zhan also saw resistance to less rank-oriented, more student-centered nurture. It is hard to loosen up, Vice Premier Li observed, in a culture that still reveres ancient scholars like Su Qin, who is said to have poked his thigh with the point of an awl to stay focused.
Meijie, confronting the high-school-entrance ordeal in 2001, found herself in the vise, too. Caught up as she was with Web-related activities when she took a mock version of the four-day test, she didn’t do well enough to get into Shanghai’s best schools. Her parents resisted the tutoring frenzy, which has intensified lately, thanks not least to the accelerating trend of applying to college abroad. (The New Oriental Education and Technology Group, which holds sway in English-language test prep and other training, just went public in September on the New York Stock Exchange and boasts skyrocketing enrollment figures: some 100,000 signed up in 1999, and the number is now a million.) But Meijie hunkered down to study, with help from a physics and a math teacher, and once again finessed what has others tied in knots. She ended up the top scorer in her district, and among the top 10 in all of Shanghai. That secured her a spot at her dream school, Fudan Fuzhong, which translates as “the high school affiliated with Fudan University.”
A premier Shanghai public high school with a well-established foreign-exchange program, Fudan Fuzhong was perfect for Meijie. Thanks to its stellar student body, teachers can spend less time on review without jeopardizing exam results, and the campus bustles with clubs, optional courses, service projects. Meijie, a champion debater and student leader, was the kind of self-driven, “quality” personality the faculty and administrators were eager to reward: she was their unanimous choice for the privilege of a year abroad at Sidwell Friends School in Washington, D.C. While the rest of her third-year classmates sweated through gaokao cramming in 2003-4, she had an American experience that she called “fabulous, life-changing, really core.”
At Sidwell, Meijie was an exchange student standout — history buff, bold field-hockey novice, social dynamo. And when she returned to China, she was convinced that an American liberal-arts education was for her. Still, she confessed that more than just her own “passion,” a favorite word of hers, propelled her Ivy League dream. She wasn’t immune to age-old Chinese status obsession: “The Harvard bell rings in me, too.”
IV. The Event
As she prepared for Hsylc during her freshman year in Cambridge, Meijie wasn’t slowed down by concern that her Harvard-affiliated education gathering might end up reinforcing a tendency it aimed to curb: the Chinese worship of a super-academic credential, at odds with the pursue-your-own-passion drive she hoped to inspire. It was the double bind of China’s reform effort in a nutshell (and a dilemma familiar to overachievers everywhere). But Hsylc was built on an insight that can get overlooked in more professional efforts to prescribe the right pedagogical mix of Western vigor and structured Asian rigor. Outside the standard classroom, bottom-up cultural exchange is invaluable, as Meijie knew first-hand; impressive faculty, enlightened pedagogy, great facilities, an innovative curriculum — whether imported or homegrown or both — are only part of the story. What may matter more than anything else is student chemistry.
The scene at Hsylc in August was certainly not stress-free, least of all for Meijie. Chinese education is enjoying a wave of international activity, as joint ventures and exchanges proliferate, but her student-driven endeavor called for pioneering entrepreneurship as she rallied sponsors, press coverage and more. During the frenetic lead-up, her parents were thrilled to meet an array of kids from the United States — so much less inhibited than Chinese teenagers, they said, laughing — even as they worried their daughter was overdoing it and hiding the strain. “She’s my daughter,” Mrs. Tang told me, full of fondness. “I know her well. But I also know that everyone has good times and bad times — I know I do. For us, she tells us the good times. She must suffer, pay for that somewhere within. But she doesn’t tell us that.”
The result was an event full of an impromptu energy uncharacteristic of Chinese high-school campuses, even at a flexible place like Fudan Fuzhong. With their messy dorms and worldly ways, Hsylc’s diverse band of Harvard student seminar leaders were skewed toward adventure-seeking underclassmen, by design. Meijie and her fellow Hsylc organizers had selected for bold well-roundedness; the more eclectic the Harvard students’ profiles, the more exotic their travels, the better. And the young seminar leaders had bitten off ambitious and various topics — from “Africa and the Problems of Development” to “Disneyworld!” — that added up to something quite unlike the scripted “quality education” supplements many of the Hsylc kids had been exposed to. (From 26 province-level divisions, including Hong Kong, the 300 participants reflected the map of educational advantage, coming mostly from eastern, urban areas, with big contingents from Beijing and Shanghai, though Meijie also recruited a dozen or so very poor students.) This was real consciousness-raising, rather than conscientious résumé enhancement. “With half-open eyes, sore legs, constant yawning and a spinning head, it is curious why I am still so excited about the coming day,” wrote a student in The Silk Road, the Hsylc newsletter; clearly not alcohol-induced, the daze wasn’t cram-induced either.
The writer, Sindy — the kids mostly went by English-sounding first names they had chosen — proceeded to praise the seminars and the lectures “by distinguished scholars and entrepreneurs with remarkable insight.” (A scheduling glitch produced the highlight event, I was told — two very different Chinese examples of success on stage together: the polished, Western-educated head of Google China, Kai-Fu Lee, and the jeans-clad Jack Ma, homegrown maverick who reportedly bombed the gaokao twice and founded Alibaba.com, China’s thriving e-commerce company.) Sindy was full of enthusiasm too for “various extracurricular activities in which you wish you could send different parts of your body to.” But what held the most allure were the imported seminar leaders themselves. “I am fascinated by the talent, personal charm and perfect articulation of minds in these young people only two or three years older than most of us,” she wrote of the Harvard students.
Meijie had set out to replace blind veneration of the Harvard brand with a more informed appreciation of the charismatic, self-motivated breed of liberal-arts students she found there. What she probably didn’t foresee was that China’s famously studious youth would also be seized with a bad, or rather a very healthy, case of adolescent infatuation. Thanks to flirting and gossiping over meals — “We began searching for beautiful Harvard girls and handsome boys to have dinners with,” one student reported — the seminar leaders became peers, not just paragons. Forget Chinese reticence. The Hsylc participants demonstrated plenty of persistence, seeking out Harvard students to pursue seminar discussions or talk about life problems. Uneasy students were emboldened by a participatory style impossible in their big classes; they joined debates, took on the role of presidential candidates, signed up for a sprawling talent show. “In such an atmosphere, can you keep yourself silent and passive?” asked several fans of the Harvard sophomore Richie Schwartz’s “explosive” seminar on the evolution wars. “Just speak out!” The questions at some of the lectures were contentious bordering on rude, I was told.
A spirit of social adventure was inseparable from the intellectual adventure at Hsylc. That blurring is rare in Chinese high schools, where the new curricular broadening coexists with a student culture that is more familial — nurturing or infantilizing, depending on whom you ask. Chinese high-school students, many of whom board at school during the week, generally spend two years together with the same 40-plus classmates and sometimes choose their third-year concentration in order to stick with their friends. When they go home on the weekends, it usually isn’t to a social life with peers — dating is strongly discouraged — but to parents who “take care of everything. It’s always little baby.” That’s how one Hsylc participant, William, put it during a long conversation with me and four other summit alumni in a Shanghai teahouse on a late October Sunday. He discovered what he felt was missing from the Chinese high-school experience during a year abroad in a Texas public high school. “American high schools are more colorful, more like real life . . . more complicated . . . I don’t know, you feel like you’ve somehow grown to be a more mature person. You have to deal with different people and all the complicated things about relationships, and friendship, and programs and activities.”
Yet William — whose personal style, down to his Nike wristband, was amazingly American, and who was in the midst of applying to American colleges “because I think they suit me better” — didn’t speak for all his peers. The view, from the best Shanghai high schools at any rate, is more complicated. Lily, from Fudan Fuzhong, had returned from a happy year abroad at the Taft School in Connecticut that nonetheless left her thinking how “stimulating” her own high school is — and feeling that for SAT-stressed American students, “it’s not much better than in Shanghai.” The previous morning, five other Hsylc kids argued just as fiercely about the state of Chinese education. Bluesky sang the praises of the well-rounded girls at her school, Shanghai No. 3 Girls High School. But Black, arriving from a weekend review class with “physics problems swimming in my head,” was bleak about gaokao-burdened life in his ordinary school. “I am very tired but I must strive,” he said wearily. Whatever their arguments as they compared schools, there was general agreement: it’s rural kids, stuck in bad schools and granted fewer slots at the best universities, who face truly impossible odds and stress.
As for the Hsylc students’ plans for the future, Meijie’s summit meeting had not triggered a stampede to apply to American colleges (where all but the wealthy must hope for full or generous scholarships). Interest in Harvard certainly was high, yet at the same time Hsylc sent a very different message that worked against reflexive Chinese competitive fervor. All stirred up by the experience, a group of talented young people was also left feeling, as one put it, “more energetic, braver and more confident” about figuring out for themselves what might lie ahead — where they could best pursue the personal goals they had been hearing about and which they were told to take seriously. “What impressed me about the Harvard students,” a girl by the name of Shine told me, “was their definite aim for life, whereas Chinese students just go on the road laid out by their parents. In Shanghai and Beijing, we have the sense that we can go on our road. But sometimes it is an empty concept; we don’t know how to contribute to society.” For her — and many other kids agreed — the Hsylc event was a time when “I always ask myself what I want to do.”
Applications for undergraduate study abroad are rising (six times as many students from the People’s Republic of China, 256, applied to Harvard College in 2005 as in 1999, for example), and more foreign degree holders are now coming back to put their educations to use at home. But perhaps as promising a sign of the momentum behind educational change in China is an Hsylc participant like Neal, master of the wryly raised eyebrow, who said when I spoke with him that he was planning on staying put but who was still thinking big. The most caustic critic of Chinese schooling among those assembled for tea, he had already spent time in Germany. But he was inspired to find that his Chinese schoolmates, “known for diligence, silence and obedience, thick glasses as a symbol,” in his words, could be “most vigorous” in a setting like Hsylc.
As he explained to me in an earlier e-mail message, composed in breaks from gaokao cramming, he said he needed “to watch and feel the system by my personal experience” — endure the burdens at their worst in the third year, the “endless homework, strict discipline, frequent exams and the peer pressure.” If he chose to stay in China, he would know how to push toward a new system in which students’ “curiosity is well protected to learn knowledge.” At the teahouse, Neal was already rallying the troops behind a vision of prodding change along at home. “At Fudan University” — which has just inaugurated a less specialized curriculum for freshmen and a house system modeled on Yale and Harvard — “I can take a lecture, and if I want to be more active, I can ask questions, I can tell my friends to ask questions and then students will change the system,” he said. “When a university is eager to change, the vital power is students.” Neal raised both eyebrows, a boy looking ready to rebel against the bookworm stereotype. “People say, ‘Whoa, you’re from China!’ Yes, I’m from China.”
A generation of more independent-minded students with wider horizons — a generation of Neals and Meijies, busy networking and innovating: it is a prospect that may inspire some trepidation as well as optimism among Chinese leaders. After all, campus unrest has left scars, and vast challenges loom, from the environment to rural discontent. The proof that the recent educational changes go deeper than a proliferation of newfangled curricula and degrees will not be merely how China’s economic future plays out; it will be what kinds of political and cultural repercussions unfold, most immediately among the lucky few who are currently benefiting most from the new opportunities. Will it be enough if a bolder breed of the best and the brightest — as Xu predicted — forms a cosmopolitan elite whose roots in China help make them the imaginative hybrids that global enterprises need? Over dinner with his wife and daughter (who was busy doing her homework while she ate), Xu was hopeful. At the same time, in a recent Foreign Affairs article, Xu’s friend John Thornton, a former president of Goldman Sachs, who has been directing a new Global Leadership program at Tsinghua University for several years now (and who also gave a rousing talk at Hsylc), points out a serious problem. In a country whose cutthroat educational system was famous for selecting successful bureaucrats, top talent these days goes just about anywhere but the government, clogged as it is with corrupt insiders. If creative, critical-minded outsiders aren’t given a reason to enter the public realm, the prospects for a world-class, more democratic future for all are only more precarious.
Right now, it’s quite unlikely that Meijie would even think of ending up as a mandarin. She was, though, full of plans as the spring semester of her sophomore year got under way after a somewhat dispiriting fall — her Harvard friends worrying about grades, her Fudan University friends in China panicked about careers. “Now I feel I am back again,” she wrote in a late February e-mail message, sent at — some things never change — 3:55 a.m. “Life is so hectic and overwhelmingly exciting!” With two Harvard friends, Meijie had just founded a company — Strategy Alpha International L.L.C., whose mission is to advise Chinese and American enterprises on carving out niches in the other country’s markets. (She and her team were already talking to a top Chinese financial magazine in search of an American partner.)
Meijie the resourceful global entrepreneur hadn’t stopped thinking about education, though you might say the liberal-arts champion had become more of a realist. With a nonprofit arm of S.A.I., she was eager to pursue college advising back home and to explore new curriculum designs for English-teacher training and — inescapable in China — test prep as well. Meanwhile, Hsylc 2007 was gearing up. Meijie planned to give this year’s event a theme, “Different Paths to Success,” with the emphasis on “different” to jolt Harvard kids and Chinese kids alike out of their G.P.A.-obsessed focus on climbing conventional ladders. (In China, she suggested, American-style options like entertainment or sports, say, could use a boost.) It would be a contrast to last year, the now-seasoned summit organizer told me with a nostalgic grin and a groan: then the theme, she joked, should have been “How to Survive in Chaos.” Meijie was being modest. How to thrive under the stress of new choices and change was the lesson she learned early, and was working hard — very hard — to share.
Ann Hulbert, a contributing writer, is the author of “Raising America: Experts, Parents and a Century of Advice About Children.”
II. The Expansion
Brave Shanghai’s traffic and head southwest for 40 minutes to the well-groomed grounds of Xiwai International School, the site of last year’s Hsylc conference, and you see the broad contours of what has been happening in Chinese education. In an area that is projected to become Shanghai’s biggest satellite city, new construction is everywhere and up-to-date school campuses are being built. While American leaders have been debating how best to demand more accountability from a decentralized education system, the Chinese government has decided to loosen its administrative and financial control. The process dates back 20 years now, to the Decision on the Reform of the Education System, issued in 1985 (the year Meijie was born). The push was on to consolidate the Soviet-style hyperspecialized universities into more comprehensive institutions; with the Compulsory Education Law of 1986, mandating nine years of education for all, a major expansion was also under way. In the early 1990s, the government urged an easing of exam pressures and took the step of encouraging “social forces” to establish private schools alongside the public system.
Parents whose own schooling was curtailed by the Cultural Revolution have been avid to realize their educational ambitions — the Confucian key to social and moral advancement — in the paths they chart for their “little emperors,” the singletons mandated by the one-child policy of the past quarter of a century. The pace of growth and school privatization surged in the course of the 1990s. The goal was to send 15 percent of the college-age population on to the postsecondary level — that figure being the standard definition of “mass higher education” — by 2010. Meanwhile, extra financing went to a group of top universities in a quest to make them “world class.” And in the new millennium, rice paddies are still making way for state-of-the-art school facilities. A nonprofit, private school, Xiwai could be mistaken for a medium-size college. Its spacious brick classroom buildings and dorms (capacity 3,500 students, from pre-K to 12th grade) flank a lovely courtyard with a fountain in the middle. At one end stand an imposing library and a dining facility, and across the way is a large arts-and-sports complex.
“You could say we overbuilt,” said Xiwai’s co-founder, Xu Ziwang. Boyish in his khakis and navy blazer, Xu, who is 50, has energy to match the wealth he earned as one of Goldman Sachs’s first mainland Chinese partners. He has devoted both his zeal and money to establishing the school with Lin Min, Xiwai’s headmaster, plowing proceeds from local real estate development into the enterprise. Theirs is a project with roots in a past that could hardly have seemed more remote on the balmy fall day the two of them proudly showed me around the one-year-old campus. Friends from their teenage years on a farm during the Cultural Revolution, Xu and Lin were sent from school to the countryside when they were about the age of the oldest Xiwai students who greeted us cheerfully on the paved pathways. The two men were among the many millions who, feverishly studying when they weren’t busy at their appointed labors, swarmed to take the college-entrance exam in the first sittings in 1977 and 1978; they ended up among the few who scored high enough to secure a scarce college spot. Thirty years later, both had studied and worked abroad (Xu in the United States, Lin in Slovenia, England and New Zealand), and back home, Xu had played a big role in privatization deals. Here they stood on what had been mud, eagerly sharing their vision of a pedagogical and curricular renaissance that would produce a generation “better than us.”
What a fortunate cohort today’s kids were, both men said: young people growing up in a booming country that had plenty of problems but also a growing middle class and expanding horizons. By 2006, China had vastly exceeded its higher-education enrollment goal; 22 percent of the college-age population — compared with roughly 40 percent of 18- to 24-year-olds in the United States — were receiving some form of postsecondary schooling. Yet Xu and Lin also joined in the widespread worry that Chinese youths, spared the real-life challenges their elders were forced to cope with, faced very different constraints. Hunkered down, doing endless exam-haunted schoolwork, they were constantly hovered over by their parents. In 1998, years before the McKinsey report of a talent shortage, Xu heard the wake-up call when he initiated Chinese recruiting for Goldman Sachs.
He picked three graduates from China’s top universities and was impressed that they all scored 100 percent on the exam following the associate training stint in New York — only to be disappointed a year later, when their performance reviews were in the bottom quartile. “There’s a price,” he concluded, “for 12 years of prep for an exam, and that’s to always think there’s a narrow, right answer. If you give precise instructions, they do well. If you define a task broadly, they get lost and ask for help.” If he and Lin had their way, independent students eager to use their imaginations would be the dominant breed on their campus. They were counting on a rising tide of “broad-minded” parents eager to provide their children with the less-straitjacketed education — a creative mix of the best of East and West — that Xiwai preached and that it aimed to find teachers able to impart. But as we toured a campus plastered with exhortations to be “global citizens” and to “Smile, Embrace, Communicate, Cooperate, Negotiate,” Xu was also blunt: there are lots of obstacles, not the least of them the gaokao that exerts such sway. “The dilemma is, everybody realized it is the problem, but nobody knows what to do.”
Chinese routinely say they wish the exam weren’t such a monolithic force, and various provinces have lately been allowed to offer their own versions. Yet bigger changes — like Fudan University’s use last year of broader criteria and a totally different test to admit some 300 students — stir concern. In a country so huge — and in a culture so steeped in cronyism — the fear is that no other process could work as fairly. Meanwhile, the success of China’s educational expansion hasn’t eased gaokao panic, and in fact has made the secondary-school exam a newly fraught hurdle. The unforeseen pressures have unfolded this way: As the number of college graduates has outpaced the growth in desirable high-level jobs, generally located in China’s developed eastern region, one result has been a surge of unemployment among degree holders who resist settling for less. Along with that has come a rise in qualifications for lower-level jobs that once didn’t require a college diploma.
The situation has left students still desperately chasing elite-university credentials. A degree from the most prestigious Chinese schools, especially those given extra money in the quest for “world class” status (with Fudan and Jiao Tong universities in Shanghai, Peking and Tsinghua in Beijing at the pinnacle) — or from the University of Hong Kong or, more distinctive yet, from a college abroad — is the best shot at success in a job market where a big gap looms between top jobs and the level below. The college race has led in turn to an intensified struggle to get into the best high schools. They boast records of strong gaokao scorers and prestige university placements — yet high schools in general haven’t multiplied at the rate that colleges have. Xu wasn’t alone in sighing over these strains in the system and at the same time in seeing signs of hope: real change was bound to come.
III. The Experiment
When Meijie next had time to talk, it was in early June of last year, and she was swept up in arrangements for her education summit meeting in August. Among other things she and her fellow Harvard organizers would do when they were in Shanghai (where some Chinese university students would help out, too) was handle the late batch of Hsylc applications from seniors in China’s 10th-to-12th-grade high-school system. Meijie had extended the deadline for those applicants so they wouldn’t have to squeeze in work on the essays — one in English and one in Chinese — at the height of gaokao cramming. Answering Hsylc’s more creative questions would be a nice break for them, she told me at one point with a laugh and a shake of her cropped hair, and she wasn’t entirely kidding. Here was a college freshman who had barely closed her own blue books and was eagerly preparing to stage a $200,000 event (financed primarily by the Goldman Sachs Foundation, thanks to guidance from Xu). Lightening burdens, that “quality education” goal, was not exactly on any of these students’ agendas; juggling competing aspirations was more like it.
From the start, as Shanghai pioneered quality-education experiments during Meijie’s primary-school years in the early 1990s, she was the rare student who navigated, undaunted, between China’s established educational ways and the emerging opportunities and expectations. Her upbringing reflects the deep-seated zeal for schooling that fuels but also complicates reform efforts. Almost the first thing Meijie told me about her mother (a former opera singer from a musical, Westernized family) and her father (a middle-school teacher of Chinese from a more traditional background) is that “they’re very typical Chinese parents.” By that she meant “they really focus on my education and cultivation.”
In China, a child’s schooling is a family endeavor worthy of great sacrifice, in money and time. Over dinner in Shanghai, a melodiously voluble Mrs. Tang confirmed that “when Meijie was very young we controlled her a lot, watched her very closely and guided her carefully. Luckily she was very cooperative and followed our instruction.” Effort rather than ability is considered the key to achievement — and among the most important expressions of filial piety is studying diligently (a word I heard a lot). “If there is no dark and dogged will, there will be no shining accomplishment; if there is no dull and determined effort, there will be no brilliant achievement” goes an old saying, invoked as soon as school starts — a far cry from the Western progressive interest in encouraging curiosity and play in the early years. Meijie told me her mother had her memorize her primary-school textbooks (much thinner than ours). Like many children, she was also sent to lessons in music, art and calligraphy. This kind of broader training is a legacy of the Confucian focus on self-perfection, and it is in step with the Maoist notion of “all-round development”; the emphasis is on practice and mastery, where American parents, busy enrolling their young kids in arty extras, are likely to stress self-expression and creativity.
For the reformist vision of more individualized, active learning, this ingrained educational drive has been something of a mixed blessing. It is a great core to build on: “quality education” advocates are emphatic that they have no intention of jettisoning a strong Asian heritage of discipline and humble, family-oriented commitment to self-cultivation. At the same time, the traditional emphasis on arduously conformist, adult-driven, hypercompetitive academic performance — well suited though it is to a standard class size of 40 or 50 — can get in the way of liberating individual initiative and easing pressures.
In her compulsory-education years, Meijie had plenty of old-style schooling — sitting in rows, being rigorously trained in the basics by revered teachers, and excelling. This was the well-entrenched approach observed by the developmental psychologists Harold W. Stevenson and James W. Stigler in the 1980s and praised in their frequently cited 1992 book “The Learning Gap: Why Our Schools Are Failing and What We Can Learn From Japanese and Chinese Education.” But she received new-style broadening, too. Seeing he had an eager reader, Meijie’s father began buying her books — she remembers the series of 115 Western classics he got a deal on one summer — in the belief that if she learned one thing from each, they were worth having. Meanwhile, in primary school, Meijie lucked into an early example of just the kind of extracurricular, community-oriented pursuit championed by Vice Premier Li. Thanks to an arrangement between her school and a Shanghai TV station, the 9-year-old Meijie was one of several top third graders tapped to produce a weekly kids’ news segment, which meant skipping class to work on the clips. She ended up doing it single-handedly for three years; her classmates’ parents pulled their children out, worried about school demands and exams.
Meijie moved on to middle school in the late 1990s just as “keypoint” schools, which accept the best students and are better financed, were banned from using the term in the interests of greater egalitarianism (though they remain as sought-after as ever). A lottery was instituted in Shanghai to spread the stellar students around. When Meijie landed in a merely ordinary school, her parents were distraught — and then upset when she flunked a computer-skills test. (She failed to hit “save.”) But soon they backed off, Mrs. Tang explained, to “let her develop herself because we saw how good she is.” Indeed, Meijie proceeded to reap benefits beyond Vice Premier Li’s dreams. “You have time to live your own life,” she told me, remembering the more laid-back atmosphere of her nonselect school, “and you have your freedom to think about a lot.” Among other things, she thought about Web design, partly to prove she was no computer dunce, but mainly because she was an unusually informed girl. Thirteen-year-old Meijie, former journalist, followed the news and was struck that in the midst of the Internet boom, “China is too quiet and behind” in appealing to teens. She saw a niche and focused on building one of the first popular youth sites in China. She was then recruited to help work on kids.eastday.com, a government-endorsed site with comprehensive information and services for younger teenagers.
Up to this point, which brought her to the turn of the millennium, Meijie’s experience was a preview of how less hierarchical, more flexible educational innovations might free an extroverted, quite extraordinary student — even as it also shed light on the persistent power exerted by stringent school expectations and demanding parents. By 2001, the pace of curricular change began to pick up, with private schools often in the lead, trumpeting mottoes like “We must put students in the center of learning and focus on cultivation of creativity.” At Xiwai, where I sat in on a first-grade class of merely 29, there was a smart board and desks arranged, Western-style, in clusters. A lively young teacher had the kids chanting cheerfully (and perfectly) in unison, old style, but also scrambling to find partners with whom to practice their Chinese characters; the room buzzed with collaborative work, as Xiwai’s administrators proudly pointed out.
Another day, over tea and then lunch in a cafe at East China Normal University, I met Cui Yunhuo, a young professor there who has been active in the nationwide curriculum review and implementation process. He gave an upbeat account of the progress he has seen in grades one through nine in a mere five years — though he also lamented the lack of good assessment methods. There is a wider variety of new textbooks to choose from, he explained, reporting that color had been added and that outdated, often dense passages had been removed. Teachers are “more at an international level,” Cui said, handing me a booklet heavy on proclamations about the new importance placed on “encouraging students to inquire” and helping them “learn to learn.” More hands-on, project-based learning and cooperative endeavors are required. Time must also be allotted for “comprehensive practical activities and school-based curriculum,” which include optional courses designed by individual schools to appeal to students’ interests — a hortatory agenda hard to evaluate. At a so-called demonstration middle and high school I visited in Beijing, the vice principal extolled an environmental-studies project, which sent students to visit a waste-water recycling factory. They returned with ideas that they were eager to apply to the new campus under construction on Beijing’s outskirts. Student-run clubs are now de rigueur. There are also new curbs on competition. The middle-school entrance exam has been officially abolished. Shanghai eliminated midterms in the early primary-school grades, and weekend and vacation review classes are widely discouraged.
Yet no one pretends change is smooth. Cui worried that there hadn’t been enough effort devoted to explaining the curricular shifts to parents; others point to the lack of teacher training; and everyone cites dire problems in the countryside, like a shortage of teachers. Cui also joined Yang Deguang, a former president of Shanghai Normal University and now the executive director of the Higher Education Society of China, in voicing concern that competitive duress at the top has been spreading downward. “Don’t let the children lose at the starting point” has become a new parental slogan, Yang told me. The boom in private kindergartens had Cui worried, opposed as he is to the push for early skill mastery and the highly structured English classes that are often their selling points. (With his own son, a fifth grader, he reported he was trying a new policy of fresh air and freedom: at 5 p.m. he sends him out of the house and tells him not to come home for an hour and a half.) Happy Cheung, who studied at the Harvard Graduate School of Education and is now the chairwoman of the Sino Capital Education Foundation in Beijing (as well as the co-host of a radio show on family education issues), told me that her fourth-grade daughter is one of the few students in her public-school class who isn’t enrolled in after-school English lessons or in “math olympics” — a craze that caught the attention of the Department of Education in Zhejiang Province, which canceled the olympics at the primary level, hoping to “lower the temperature.” In his book, Vice Premier Li warns against a vicious cycle: school efforts to diminish competition fueling a market for tutors outside of school, hired by parents as anxious as ever (or more) not to let their children get left behind — and perhaps to give them an edge by developing special talents.
As Cheung is not the first to note, progressive ideas have a way of translating rather differently in a Chinese context. In 1919, John Dewey traveled to China, where his views on “student-centered,” democratic education were all the rage, yet the rhetoric was fuzzy — as much a rallying cry for political renewal as a real blueprint for school change. If there is an American figure to whom Chinese proponents of more active, multidimensional, student-centered learning have listened especially attentively over the past half-decade, it is Howard Gardner of the Harvard Graduate School of Education (with whom Cheung herself studied). His Multiple Intelligences theory, which posits various forms of intelligence in addition to the linguistic and logical-mathematical skills usually honed and rewarded in school, has inspired a huge array of books, articles and conferences. (There are also stuffed animals marketed as good for “interpersonal intelligence,” music boxes for “musical intelligence,” etc.) His work also spurred a national project, begun in 2002, “Using M.I. Theory to Guide Discovery of Students’ Potential,” which financed efforts to implement the theory in classrooms all over the country.
Yet the M.I. vogue, as Cheung said over lunch, may reflect more familiar thinking than the fanfare suggests. The seemingly simple yet slippery theory readily lends itself to the homegrown tradition of “all-round” cultivation, which is in fact informed by a quite different perspective. Where Gardner urges the individualized development of a distinctive blend of inborn abilities, his Chinese followers, she explained, are prone to emphasize the structured mastery of multiple talents. Cheung brought along Zhan Wenling, a private-school principal who had toured around China in 2001 with a Ministry of Education delegation, explaining the “quality education” perspective. “Confucius said that a person is not simply a container,” Zhan exclaimed; a teacher “should be the fire, light the match,” and so must “know what kind of wood you are lighting.” But, she went on, it is not so easy for teachers to grasp the idea. The quest to promote a more student-driven ethos is also complicated by practical constraints, most notably China’s huge class sizes. Promoting discussion can be a problem. Earlier, another educator had told me: “You let them free, but it’s such a big group, it’s hard to get them back. It’s a real challenge how to get the balance right. Now the students may ask hundreds of different questions, and our teachers have to face that, they have to be well prepared.” A recent review of the national M.I. project turned up interesting experiments, I was told. Yet Zhan also saw resistance to less rank-oriented, more student-centered nurture. It is hard to loosen up, Vice Premier Li observed, in a culture that still reveres ancient scholars like Su Qin, who is said to have poked his thigh with the point of an awl to stay focused.
Meijie, confronting the high-school-entrance ordeal in 2001, found herself in the vise, too. Caught up as she was with Web-related activities when she took a mock version of the four-day test, she didn’t do well enough to get into Shanghai’s best schools. Her parents resisted the tutoring frenzy, which has intensified lately, thanks not least to the accelerating trend of applying to college abroad. (The New Oriental Education and Technology Group, which holds sway in English-language test prep and other training, just went public in September on the New York Stock Exchange and boasts skyrocketing enrollment figures: some 100,000 signed up in 1999, and the number is now a million.) But Meijie hunkered down to study, with help from a physics and a math teacher, and once again finessed what has others tied in knots. She ended up the top scorer in her district, and among the top 10 in all of Shanghai. That secured her a spot at her dream school, Fudan Fuzhong, which translates as “the high school affiliated with Fudan University.”
A premier Shanghai public high school with a well-established foreign-exchange program, Fudan Fuzhong was perfect for Meijie. Thanks to its stellar student body, teachers can spend less time on review without jeopardizing exam results, and the campus bustles with clubs, optional courses, service projects. Meijie, a champion debater and student leader, was the kind of self-driven, “quality” personality the faculty and administrators were eager to reward: she was their unanimous choice for the privilege of a year abroad at Sidwell Friends School in Washington, D.C. While the rest of her third-year classmates sweated through gaokao cramming in 2003-4, she had an American experience that she called “fabulous, life-changing, really core.”
At Sidwell, Meijie was an exchange student standout — history buff, bold field-hockey novice, social dynamo. And when she returned to China, she was convinced that an American liberal-arts education was for her. Still, she confessed that more than just her own “passion,” a favorite word of hers, propelled her Ivy League dream. She wasn’t immune to age-old Chinese status obsession: “The Harvard bell rings in me, too.”
IV. The Event
As she prepared for Hsylc during her freshman year in Cambridge, Meijie wasn’t slowed down by concern that her Harvard-affiliated education gathering might end up reinforcing a tendency it aimed to curb: the Chinese worship of a super-academic credential, at odds with the pursue-your-own-passion drive she hoped to inspire. It was the double bind of China’s reform effort in a nutshell (and a dilemma familiar to overachievers everywhere). But Hsylc was built on an insight that can get overlooked in more professional efforts to prescribe the right pedagogical mix of Western vigor and structured Asian rigor. Outside the standard classroom, bottom-up cultural exchange is invaluable, as Meijie knew firsthand; impressive faculty, enlightened pedagogy, great facilities, an innovative curriculum — whether imported or homegrown or both — are only part of the story. What may matter more than anything else is student chemistry.
The scene at Hsylc in August was certainly not stress-free, least of all for Meijie. Chinese education is enjoying a wave of international activity, as joint ventures and exchanges proliferate, but her student-driven endeavor called for pioneering entrepreneurship as she rallied sponsors, press coverage and more. During the frenetic lead-up, her parents were thrilled to meet an array of kids from the United States — so much less inhibited than Chinese teenagers, they said, laughing — even as they worried their daughter was overdoing it and hiding the strain. “She’s my daughter,” Mrs. Tang told me, full of fondness. “I know her well. But I also know that everyone has good times and bad times — I know I do. For us, she tells us the good times. She must suffer, pay for that somewhere within. But she doesn’t tell us that.”
The result was an event full of an impromptu energy uncharacteristic of Chinese high-school campuses, even at a flexible place like Fudan Fuzhong. With their messy dorms and worldly ways, Hsylc’s diverse band of Harvard student seminar leaders were skewed toward adventure-seeking underclassmen, by design. Meijie and her fellow Hsylc organizers had selected for bold well-roundedness; the more eclectic the Harvard students’ profiles, the more exotic their travels, the better. And the young seminar leaders had bitten off ambitious and various topics — from “Africa and the Problems of Development” to “Disneyworld!” — that added up to something quite unlike the scripted “quality education” supplements many of the Hsylc kids had been exposed to. (From 26 province-level divisions, including Hong Kong, the 300 participants reflected the map of educational advantage, coming mostly from eastern, urban areas, with big contingents from Beijing and Shanghai, though Meijie also recruited a dozen or so very poor students.) This was real consciousness-raising, rather than conscientious résumé enhancement. “With half-open eyes, sore legs, constant yawning and a spinning head, it is curious why I am still so excited about the coming day,” wrote a student in The Silk Road, the Hsylc newsletter; clearly not alcohol-induced, the daze wasn’t cram-induced either.
The result was an event full of an impromptu energy uncharacteristic of Chinese high-school campuses, even at a flexible place like Fudan Fuzhong. With their messy dorms and worldly ways, Hsylc’s diverse band of Harvard student seminar leaders were skewed toward adventure-seeking underclassmen, by design. Meijie and her fellow Hsylc organizers had selected for bold well-roundedness; the more eclectic the Harvard students’ profiles, the more exotic their travels, the better. And the young seminar leaders had bitten off an ambitious variety of topics — from “Africa and the Problems of Development” to “Disneyworld!” — that added up to something quite unlike the scripted “quality education” supplements many of the Hsylc kids had been exposed to. (From 26 province-level divisions, including Hong Kong, the 300 participants reflected the map of educational advantage, coming mostly from eastern, urban areas, with big contingents from Beijing and Shanghai, though Meijie also recruited a dozen or so very poor students.) This was real consciousness-raising, rather than conscientious résumé enhancement. “With half-open eyes, sore legs, constant yawning and a spinning head, it is curious why I am still so excited about the coming day,” wrote a student in The Silk Road, the Hsylc newsletter; clearly not alcohol-induced, the daze wasn’t cram-induced either.
The writer, Sindy — the kids mostly went by English-sounding first names they had chosen — proceeded to praise the seminars and the lectures “by distinguished scholars and entrepreneurs with remarkable insight.” (A scheduling glitch produced the highlight event, I was told — two very different Chinese examples of success on stage together: the polished, Western-educated head of Google China, Kai-Fu Lee, and the jeans-clad Jack Ma, homegrown maverick who reportedly bombed the gaokao twice and founded Alibaba.com, China’s thriving e-commerce company.) Sindy was full of enthusiasm too for “various extracurricular activities in which you wish you could send different parts of your body to.” But what held the most allure were the imported seminar leaders themselves. “I am fascinated by the talent, personal charm and perfect articulation of minds in these young people only two or three years older than most of us,” she wrote of the Harvard students.
Meijie had set out to replace blind veneration of the Harvard brand with a more informed appreciation of the charismatic, self-motivated breed of liberal-arts students she found there. What she probably didn’t foresee was that China’s famously studious youth would also be seized with a bad, or rather a very healthy, case of adolescent infatuation. Thanks to flirting and gossiping over meals — “We began searching for beautiful Harvard girls and handsome boys to have dinners with,” one student reported — the seminar leaders became peers, not just paragons. Forget Chinese reticence. The Hsylc participants demonstrated plenty of persistence, seeking out Harvard students to pursue seminar discussions or talk about life problems. Uneasy students were emboldened by a participatory style impossible in their big classes; they joined debates, took on the role of presidential candidates, signed up for a sprawling talent show. “In such an atmosphere, can you keep yourself silent and passive?” asked several fans of the Harvard sophomore Richie Schwartz’s “explosive” seminar on the evolution wars. “Just speak out!” The questions at some of the lectures were contentious bordering on rude, I was told.
A spirit of social adventure was inseparable from the intellectual adventure at Hsylc. That blurring is rare in Chinese high schools, where the new curricular broadening coexists with a student culture that is more familial — nurturing or infantilizing, depending on whom you ask. Chinese high-school students, many of whom board at school during the week, generally spend two years together with the same 40-plus classmates and sometimes choose their third-year concentration in order to stick with their friends. When they go home on the weekends, it usually isn’t to a social life with peers — dating is strongly discouraged — but to parents who “take care of everything. It’s always little baby.” That’s how one Hsylc participant, William, put it during a long conversation with me and four other summit alumni in a Shanghai teahouse on a late October Sunday. He discovered what he felt was missing from the Chinese high-school experience during a year abroad in a Texas public high school. “American high schools are more colorful, more like real life . . . more complicated . . . I don’t know, you feel like you’ve somehow grown to be a more mature person. You have to deal with different people and all the complicated things about relationships, and friendship, and programs and activities.”
Yet William — whose personal style, down to his Nike wristband, was amazingly American, and who was in the midst of applying to American colleges “because I think they suit me better” — didn’t speak for all his peers. The view, from the best Shanghai high schools at any rate, is more complicated. Lily, from Fudan Fuzhong, had returned from a happy year abroad at the Taft School in Connecticut that nonetheless left her thinking how “stimulating” her own high school is — and feeling that for SAT-stressed American students, “it’s not much better than in Shanghai.” The previous morning, five other Hsylc kids argued just as fiercely about the state of Chinese education. Bluesky sang the praises of the well-rounded girls at her school, Shanghai No. 3 Girls High School. But Black, arriving from a weekend review class with “physics problems swimming in my head,” was bleak about gaokao-burdened life in his ordinary school. “I am very tired but I must strive,” he said wearily. Whatever their arguments as they compared schools, there was general agreement: it’s rural kids, stuck in bad schools and granted fewer slots at the best universities, who face truly impossible odds and stress.
As for the Hsylc students’ plans for the future, Meijie’s summit meeting had not triggered a stampede to apply to American colleges (where all but the wealthy must hope for full or generous scholarships). Interest in Harvard certainly was high, yet at the same time Hsylc sent a very different message that worked against reflexive Chinese competitive fervor. All stirred up by the experience, a group of talented young people was also left feeling, as one put it, “more energetic, braver and more confident” about figuring out for themselves what might lie ahead — where they could best pursue the personal goals they had been hearing about and were told to take seriously. “What impressed me about the Harvard students,” a girl by the name of Shine told me, “was their definite aim for life, whereas Chinese students just go on the road laid out by their parents. In Shanghai and Beijing, we have the sense that we can go on our road. But sometimes it is an empty concept; we don’t know how to contribute to society.” For her — and many other kids agreed — the Hsylc event was a time when “I always ask myself what I want to do.”
Applications for undergraduate study abroad are rising (six times as many students from the People’s Republic of China, 256, applied to Harvard College in 2005 as in 1999, for example), and more foreign degree holders are now coming back to put their educations to use at home. But perhaps as promising a sign of the momentum behind educational change in China is an Hsylc participant like Neal, master of the wryly raised eyebrow, who told me he was planning on staying put but was still thinking big. The most caustic critic of Chinese schooling among those assembled for tea, he had already spent time in Germany. But he was inspired to find that his Chinese schoolmates, “known for diligence, silence and obedience, thick glasses as a symbol,” in his words, could be “most vigorous” in a setting like Hsylc.
As he explained to me in an earlier e-mail message, composed in breaks from gaokao cramming, he needed “to watch and feel the system by my personal experience” — endure the burdens at their worst in the third year, the “endless homework, strict discipline, frequent exams and the peer pressure.” If he chose to stay in China, he would know how to push toward a new system in which students’ “curiosity is well protected to learn knowledge.” At the teahouse, Neal was already rallying the troops behind a vision of prodding change along at home. “At Fudan University” — which has just inaugurated a less specialized curriculum for freshmen and a house system modeled on Yale and Harvard — “I can take a lecture, and if I want to be more active, I can ask questions, I can tell my friends to ask questions and then students will change the system,” he said. “When a university is eager to change, the vital power is students.” Neal raised both eyebrows, a boy looking ready to rebel against the bookworm stereotype. “People say, ‘Whoa, you’re from China!’ Yes, I’m from China.”
A generation of more independent-minded students with wider horizons — a generation of Neals and Meijies, busy networking and innovating: it is a prospect that may inspire some trepidation as well as optimism among Chinese leaders. After all, campus unrest has left scars, and vast challenges loom, from the environment to rural discontent. The proof that the recent educational changes go deeper than a proliferation of newfangled curricula and degrees will not be merely how China’s economic future plays out; it will be what kinds of political and cultural repercussions unfold, most immediately among the lucky few who are currently benefiting most from the new opportunities. Will it be enough if a bolder breed of the best and the brightest — as Xu predicted — form a cosmopolitan elite whose roots in China help make them the imaginative hybrids that global enterprises need? Over dinner with his wife and daughter (who was busy doing her homework while she ate), Xu was hopeful. At the same time, in a recent Foreign Affairs article, Xu’s friend John Thornton, a former president of Goldman Sachs, who has been directing a new Global Leadership program at Tsinghua University for several years now (and who also gave a rousing talk at Hsylc), points out a serious problem. In a country whose cutthroat educational system was famous for selecting successful bureaucrats, top talent these days goes just about anywhere but the government, clogged as it is with corrupt insiders. If creative, critical-minded outsiders aren’t given a reason to enter the public realm, the prospects for a world-class, more democratic future for all are only more precarious.
Right now, it’s quite unlikely that Meijie would even think of ending up as a mandarin. She was, though, full of plans as the spring semester of her sophomore year got under way after a somewhat dispiriting fall — her Harvard friends worrying about grades, her Fudan University friends in China panicked about careers. “Now I feel I am back again,” she wrote in a late February e-mail message, sent at — some things never change — 3:55 a.m. “Life is so hectic and overwhelmingly exciting!” With two Harvard friends, Meijie had just founded a company — Strategy Alpha International L.L.C., whose mission is to advise Chinese and American enterprises on carving out niches in the other country’s markets. (She and her team were already talking to a top Chinese financial magazine in search of an American partner.)
Meijie the resourceful global entrepreneur hadn’t stopped thinking about education, though you might say the liberal-arts champion had become more of a realist. With a nonprofit arm of S.A.I., she was eager to pursue college advising back home and to explore new curriculum designs for English-teacher training and — inescapable in China — test prep as well. Meanwhile, Hsylc 2007 was gearing up. Meijie planned to give this year’s event a theme, “Different Paths to Success,” with the emphasis on “different” to jolt Harvard kids and Chinese kids alike out of their G.P.A.-obsessed focus on climbing conventional ladders. (In China, she suggested, American-style options like entertainment or sports, say, could use a boost.) It would be a contrast to last year, the now-seasoned summit organizer told me with a nostalgic grin and a groan: then the theme, she joked, should have been “How to Survive in Chaos.” Meijie was being modest. How to thrive under the stress of new choices and change was the lesson she learned early, and was working hard — very hard — to share.
Ann Hulbert, a contributing writer, is the author of “Raising America: Experts, Parents and a Century of Advice About Children.”