Tuesday, July 29

"By age 5, it is possible to predict, with depressing accuracy, who will complete high school and college and who won’t."

The Biggest Issue

Why did the United States become the leading economic power of the 20th century? The best short answer is that a ferocious belief that people have the power to transform their own lives gave Americans an unparalleled commitment to education, hard work and economic freedom.

Between 1870 and 1950, the average American’s level of education rose by 0.8 years per decade. In 1890, the average adult had completed about 8 years of schooling. By 1900, the average American had 8.8 years. By 1910, it was 9.6 years, and by 1960, it was nearly 14 years.

As Claudia Goldin and Lawrence Katz describe in their book, “The Race Between Education and Technology,” America’s educational progress was amazingly steady over those decades, and the U.S. opened up a gigantic global lead. Educational levels were rising across the industrialized world, but the U.S. had at least a 35-year advantage on most of Europe. In 1950, no European country enrolled 30 percent of its older teens in full-time secondary school. In the U.S., 70 percent of older teens were in school.

America’s edge boosted productivity and growth. But the happy era ended around 1970, when America’s educational progress slowed to a crawl. Between 1975 and 1990, educational attainment stagnated completely. Since then, progress has been modest. America’s lead over its economic rivals has been entirely forfeited, with many nations surging ahead in school attainment.

This threatens the country’s long-term prospects. It also widens the gap between rich and poor. Goldin and Katz describe a race between technology and education. The pace of technological change has been surprisingly steady. In periods when educational progress outpaces this change, inequality narrows. The market is flooded with skilled workers and so their wages rise modestly. In periods, like the current one, when educational progress lags behind technological change, inequality widens. The relatively few skilled workers command higher prices, while the many unskilled ones have little bargaining power.

The meticulous research of Goldin and Katz is complemented by a report from James Heckman of the University of Chicago. Using his own research, Heckman also concludes that high school graduation rates peaked in the U.S. in the late 1960s, at about 80 percent. Since then they have declined.

In “Schools, Skills and Synapses,” Heckman probes the sources of that decline. It’s not falling school quality, he argues. Nor is it primarily a shortage of funding or rising college tuition costs. Instead, Heckman directs attention at family environments, which have deteriorated over the past 40 years.

Heckman points out that big gaps in educational attainment are present at age 5. Some children are bathed in an atmosphere that promotes human capital development and, increasingly, more are not. By 5, it is possible to predict, with depressing accuracy, who will complete high school and college and who won’t.

I.Q. matters, but Heckman points to equally important traits that start and then build from those early years: motivation levels, emotional stability, self-control and sociability. He uses common sense to intuit what these traits are, but on this subject economists have a lot to learn from developmental psychologists.

I point to these two research projects for three reasons. First, the skills slowdown is the biggest issue facing the country. Rising gas prices are bound to dominate the election because voters are slapped in the face with them every time they visit the pump. But this slow-moving problem, more than any other, will shape the destiny of the nation.

Second, there is a big debate under way over the sources of middle-class economic anxiety. Some populists emphasize the destructive forces of globalization, outsourcing and predatory capitalism. These people say we need radical labor market reforms to give the working class a chance. But the populists are going to have to grapple with the Goldin, Katz and Heckman research, which powerfully buttresses the arguments of those who emphasize human capital policies. It’s not globalization or immigration or computers per se that widen inequality. It’s the skills gap. Boosting educational attainment at the bottom is more promising than trying to reorganize the global economy.

Third, it’s worth noting that both sides of this debate exist within the Democratic Party. The G.O.P. is largely irrelevant. If you look at Barack Obama’s education proposals — especially his emphasis on early childhood — you see they flow naturally and persuasively from this research. (It probably helps that Obama and Heckman are nearly neighbors in Chicago.) McCain’s policies seem largely oblivious to these findings. There’s some vague talk about school choice, but Republicans are inept when talking about human capital policies.

America rose because it got more out of its own people than other nations. That stopped in 1970. Now, other issues grab headlines and campaign attention. But this tectonic plate is still relentlessly and menacingly shifting beneath our feet.

Why we should stop worrying about China

By John Pomfret
Sunday, July 27, 2008

Nikita Khrushchev said the Soviet Union would bury us, but these days, everybody seems to think that China is the one wielding the shovel. The People's Republic is on the march -- economically, militarily, even ideologically. Economists expect its GDP to surpass America's by 2025; its submarine fleet is reportedly growing five times faster than Washington's; even its capitalist authoritarianism is called a real alternative to the West's liberal democracy. China, the drumbeat goes, is poised to become the 800-pound gorilla of the international system, ready to dominate the 21st century the way the United States dominated the 20th.

Except that it's not.

Ever since I returned to the United States in 2004 from my last posting to China, as this newspaper's Beijing bureau chief, I've been struck by the breathless way we talk about that country. So often, our perceptions of the place have more to do with how we look at ourselves than with what's actually happening over there. Worried about the U.S. education system? China's becomes a model. Fretting about our military readiness? China's missiles pose a threat. Concerned about slipping U.S. global influence? China seems ready to take our place.

But is China really going to be another superpower? I doubt it.

It's not that I'm a China-basher, like those who predict its collapse because they despise its system and assume that it will go the way of the Soviet Union. I first went to China in 1980 as a student, and I've followed its remarkable transformation over the past 28 years. I met my wife there and call it a second home. I'm hardly expecting China to implode. But its dream of dominating the century isn't going to become a reality anytime soon.

Too many constraints are built into the country's social, economic and political systems. For four big reasons -- dire demographics, an overrated economy, an environment under siege and an ideology that doesn't travel well -- China is more likely to remain the muscle-bound adolescent of the international system than to become the master of the world.

In the West, China is known as "the factory to the world," the land of unlimited labor where millions are eager to leave the hardscrabble countryside for a chance to tighten screws in microwaves or assemble Apple's latest gizmo. If the country is going to rise to superpowerdom, says conventional wisdom, it will do so on the back of its massive workforce.

But there's a hitch: China's demographics stink. No country is aging faster than the People's Republic, which is on track to become the first nation in the world to get old before it gets rich. Because of the Communist Party's notorious one-child-per-family policy, the average number of children born to a Chinese woman has dropped from 5.8 in the 1970s to 1.8 today -- below the rate of 2.1 that would keep the population stable. Meanwhile, life expectancy has shot up, from just 35 in 1949 to more than 73 today. Economists worry that as the working-age population shrinks, labor costs will rise, significantly eroding one of China's key competitive advantages.

Worse, Chinese demographers such as Li Jianmin of Nankai University now predict a crisis in dealing with China's elderly, a group that will balloon from 100 million people older than 60 today to 334 million by 2050, including a staggering 100 million age 80 or older. How will China care for them? With pensions? Fewer than 30 percent of China's urban dwellers have them, and none of the country's 700 million farmers do. And China's state-funded pension system makes Social Security look like Fort Knox. Nicholas Eberstadt, a demographer and economist at the American Enterprise Institute, calls China's demographic time bomb "a slow-motion humanitarian tragedy in the making" that will "probably require a rewrite of the narrative of the rising China."

I count myself lucky to have witnessed China's economic rise first-hand and seen its successes etched on the bodies of my Chinese classmates. When I first met them in the early 1980s, my fellow students were hard and thin as rails; when I found them again almost 20 years later, they proudly sported what the Chinese call the "boss belly." They now golfed and lolled around in swanky saunas.

But in our exuberance over these incredible economic changes, we seem to have forgotten that past performance doesn't guarantee future results. Not a month goes by without some Washington think tank crowing that China's economy is overtaking America's. The Carnegie Endowment for International Peace is the latest, predicting earlier this month that the Chinese economy would be twice the size of ours by the middle of the century.

There are two problems with predictions like these. First, in the universe where these reports are generated, China's graphs always go up, never down. Second, while the documents may include some nuance, it vanishes when the studies are reported to the rest of us.

One important nuance we keep forgetting is the sheer size of China's population: about 1.3 billion, more than four times that of the United States. China should have a big economy. But on a per capita basis, the country isn't a dragon; it's a medium-size lizard, sitting in 109th place on the International Monetary Fund's World Economic Outlook Database, squarely between Swaziland and Morocco. China's economy is large, but its average living standard is low, and it will stay that way for a very long time, even assuming that the economy continues to grow at impressive rates.

The big number wheeled out to prove that China is eating our economic lunch is the U.S. trade deficit with China, which last year hit $256 billion. But again, where's the missing nuance? Nearly 60 percent of China's total exports are churned out by companies not owned by Chinese (including plenty of U.S. ones). When it comes to high-tech exports such as computers and electronic goods, 89 percent of China's exports come from non-Chinese-owned companies. China is part of the global system, but it's still the low-cost assembly and manufacturing part -- and foreign, not Chinese, firms are reaping the lion's share of the profits.

When my family and I left China in 2004, we moved to Los Angeles, the smog capital of the United States. No sooner had we set foot in Southern California than my son's asthma attacks and chronic chest infections -- so worryingly frequent in Beijing -- stopped. When people asked me why we'd moved to L.A., I started joking, "For the air."

China's environmental woes are no joke. This year, China will surpass the United States as the world's No. 1 emitter of greenhouse gases. It continues to be the largest depleter of the ozone layer. And it's the largest polluter of the Pacific Ocean. But in the accepted China narrative, the country's environmental problems will merely mean a few breathing complications for the odd sprinter at the Beijing Games. In fact, they could block the country's rise.

The problem is huge: Sixteen of the world's 20 most polluted cities are in China, 70 percent of the country's lakes and rivers are polluted, and half the population lacks clean drinking water. The constant smoggy haze over northern China diminishes crop yields. By 2030, the nation will face a water shortage equal to the amount it consumes today; factories in the northwest have already been forced out of business because there just isn't any water. Even Chinese government economists estimate that environmental troubles shave 10 percent off the country's gross domestic product each year. Somehow, though, the effect this calamity is having on China's rise doesn't quite register in the West.

And then there's "Kung Fu Panda." That Hollywood movie embodies the final reason why China won't be a superpower: Beijing's animating ideas just aren't that animating.

In recent years, we've been bombarded with articles and books about China's rising global ideological influence. (One typical title: "Charm Offensive: How China's Soft Power Is Transforming the World.") These works portray China's model -- a one-party state with a juggernaut economy -- as highly attractive to elites in many developing nations, although China's dreary current crop of acolytes (Zimbabwe, Burma and Sudan) doesn't amount to much of a threat.

But consider the case of the high-kicking panda who uses ancient Chinese teachings to turn himself into a kung fu warrior. That recent Hollywood smash broke Chinese box-office records -- and caused no end of hand-wringing among the country's glitterati. "The film's protagonist is China's national treasure, and all the elements are Chinese, but why didn't we make such a film?" Wu Jiang, president of the China National Peking Opera Company, told the official New China News Agency.

The content may be Chinese, but the irreverence and creativity of "Kung Fu Panda" are 100 percent American. That highlights another weakness in the argument about China's inevitable rise: The place remains an authoritarian state run by a party that limits the free flow of information, stifles ingenuity and doesn't understand how to self-correct. Blockbusters don't grow out of the barrel of a gun. Neither do superpowers in the age of globalization.

And yet we seem to revel in overestimating China. One recent evening, I was at a party where a senior aide to a Democratic senator was discussing the deal last year in which a Chinese state-owned investment company had bought a big chunk of the Blackstone Group, a U.S. investment firm. The Chinese company has lost more than $1 billion, but the aide wouldn't believe that it was just a bum investment. "It's got to be part of a broader plan," she insisted. "It's China."

I tried to convince her otherwise. I don't think I succeeded.

pomfretj@washpost.com

John Pomfret is the editor of Outlook. He is a former Beijing bureau chief of The Washington Post and the author of "Chinese Lessons: Five Classmates and the Story of the New China."

Thursday, July 24

An odd cabal of timorous Europeans, myopic media outlets, corrupt Afghans, blinkered Pentagon officers, politically motivated Democrats & the Taliban



Is Afghanistan a Narco-State?

On March 1, 2006, I met Hamid Karzai for the first time. It was a clear, crisp day in Kabul. The Afghan president joined President and Mrs. Bush, Secretary of State Condoleezza Rice and Ambassador Ronald Neumann to dedicate the new United States Embassy. He thanked the American people for all they had done for Afghanistan. I was a senior counternarcotics official recently arrived in a country that supplied 90 percent of the world’s heroin. I took to heart Karzai’s strong statements against the Afghan drug trade. That was my first mistake.

Over the next two years I would discover how deeply the Afghan government was involved in protecting the opium trade — by shielding it from American-designed policies. While it is true that Karzai’s Taliban enemies finance themselves from the drug trade, so do many of his supporters. At the same time, some of our NATO allies have resisted the anti-opium offensive, as has our own Defense Department, which tends to see counternarcotics as other people’s business to be settled once the war-fighting is over. The trouble is that the fighting is unlikely to end as long as the Taliban can finance themselves through drugs — and as long as the Kabul government is dependent on opium to sustain its own hold on power.

It wasn’t supposed to be like this. When I attended an Afghanistan briefing for Anne Patterson on Dec. 1, 2005, soon after she became assistant secretary of state for international narcotics and law-enforcement affairs, she turned to me with her characteristic smile and said, “What have we gotten ourselves into?” We had just learned that in the two previous months Afghan farmers had planted almost 60 percent more poppy than the year before, for a total of 165,000 hectares (637 square miles). The 2006 harvest would be the biggest narco-crop in history. That was the challenge we faced. Patterson — already a three-time ambassador — made me her deputy at the law-enforcement bureau, which has anti-crime programs in dozens of countries.

At the beginning of 2006, I went to the high-profile London Conference on Afghanistan. It was a grand event mired in deception, at least with respect to the drug situation. Everyone from the Afghan delegation and most in the international community knew that poppy cultivation and heroin production would increase significantly in 2006. But the delegates to the London Conference instead dwelled on the 2005 harvest, which was lower than that of 2004, principally because of poor weather and market manipulation by drug lords like Sher Muhammad Akhundzada, who had been governor of the heroin capital of the world — Helmand Province — and then a member of Afghanistan’s Parliament. So the Afghans congratulated themselves on their tremendous success in fighting drugs even as everyone knew the problem was worse than ever.

About three months later, after meeting with local officials in Helmand — my helicopter touched down in the middle of a poppy field — I went to the White House to brief Vice President Cheney, Secretary Rice, Defense Secretary Donald Rumsfeld and others on the expanding opium problem. I advocated a policy replicating what had worked in other countries: public education about the evils of heroin and the illegality of cultivating poppies; alternative crops; eradication of poppy fields; interdiction of drug shipments and arrest of traffickers; and improvements to the judicial system.

I emphasized at this and subsequent meetings that crop eradication, although claiming less than a third of the $500 million budgeted for Afghan counternarcotics, was the most controversial part of the program. But because no other crop came even close to the value of poppies, we needed the threat of eradication to force farmers to accept less-lucrative alternatives. (Eradication was an essential component of successful anti-poppy efforts in Guatemala, Southeast Asia and Pakistan.) The most effective method of eradication was the use of herbicides delivered by crop-dusters. But Karzai had long opposed aerial eradication, saying it would be misunderstood as some sort of poison coming from the sky. He claimed to fear that aerial eradication would result in an uprising that would cause him to lose power. We found this argument perplexing because aerial eradication was used in rural areas of other poor countries without a significant popular backlash. The chemical used, glyphosate, was a weed killer used all over the United States, Europe and even Afghanistan. (Drug lords use it in their gardens in Kabul.) There were volumes of evidence demonstrating that it was harmless to humans and became inert when it hit the ground. My assistant at the time was a Georgia farmer, and he told me that his father mixed glyphosate with his hands before applying it to their orchards.

Nonetheless, Karzai opposed it, and we at the Bureau of International Narcotics and Law Enforcement Affairs went along. We financed ground-based eradication instead: police using tractors and weed-whackers to destroy the fields of farmers who refused to plant alternative crops. Ground-based eradication was inefficient, costly, dangerous and more subject to corrupt dealings among local officials than aerial eradication. But it was our only option.

Yet I continued to press for aerial eradication and a greater commitment to providing security for eradicators. The briefing at the White House was well received by Rice and the others present. Rumsfeld was already in political trouble, so when he started to resist my points, Rice quickly and easily shut him down. White House staff members also made clear to me that Bush continued to be “a big fan of aerial eradication.”

The vice president made only one comment: “You got a tough job.”

Even before she got to the bureau of international narcotics, Anne Patterson knew that the Pentagon was hostile to the antidrug mission. A couple of weeks into the job, she got the story firsthand from Lt. Gen. Karl Eikenberry, who commanded all U.S. forces in Afghanistan. He made it clear: drugs are bad, but his orders were that drugs were not a priority of the U.S. military in Afghanistan. Patterson explained to Eikenberry that, when she was ambassador to Colombia, she saw the Revolutionary Armed Forces of Colombia (FARC) finance their insurgency with profits from the cocaine trade, and she warned Eikenberry that the risk of a narco-insurgency in Afghanistan was very high. Eikenberry was familiar with the Colombian situation, but the Pentagon strategy was “sequencing” — defeat the Taliban, then have someone else clean up the drug business.

The Drug Enforcement Administration worked the heroin trafficking and interdiction effort with the Afghans. They targeted kingpins and disrupted drug-smuggling networks. The D.E.A. had excellent agents in Afghanistan, but there were not enough of them, and they had seemingly unending difficulties getting Mi-17 helicopters and other equipment that the Pentagon promised for the training of the counternarcotics police of Afghanistan. In addition, the Pentagon had reneged on a deal to allow the D.E.A. the use of precious ramp space at the Kabul airport. Consequently, the effort to interdict drug shipments and arrest traffickers had stalled. Less than 1 percent of the opium produced in Afghanistan was being seized there. The effort became even more complicated later in 2006, when Benjamin Freakley, the two-star U.S. general who ran the eastern front, shut down all operations by the D.E.A. and Afghan counternarcotics police in Nangarhar — a key heroin-trafficking province. The general said that antidrug operations were an unnecessary obstacle to his military operations.

The United States Agency for International Development (USAid) was also under fire — particularly from Congress — for not providing better alternative crops for farmers. USAid had distributed seed and fertilizer to most of Afghanistan, but more comprehensive agricultural programs were slow to start in parts of the country. The USAid officers in Kabul were competent and committed, but they had already lost several workers to insurgent attacks, and were understandably reluctant to go into Taliban territory to implement their programs.

The Department of Justice had just completed an effort to open the Afghan anti-narcotics court, so capacity to prosecute was initially low. Justice in Afghanistan was administered unevenly by tribes, religious leaders and poorly paid, highly corruptible judges. In the rare cases in which drug traffickers were convicted, they often walked in the front door of a prison, paid a bribe and walked out the back door. We received dozens of reports to this effect.

And then there was the problem of the Afghan National Police. The Pentagon frequently proclaimed that the Afghan National Army (which the Pentagon trained) was performing wonderfully, but that the police (trained mainly by the Germans and the State Department) were not. A respected American general in Afghanistan, however, confided to me that the army was not doing well, either; that the original plan for training the army was flimsy and underfinanced; and that, consequently, they were using police to fill holes in the army mission. Thrust into a military role, unprepared police lost their lives trying to hold territory in dangerous areas.

There was no coherent strategy to resolve these issues among the U.S. agencies and the Afghan government. When I asked career officers at the State Department for the interagency strategy for Afghan counternarcotics, they produced the same charts I used to brief the cabinet in Washington months before. “There is no written strategy,” they confessed.

As big as these challenges were, there were even bigger ones. A lot of intelligence — much of it unclassified and possible to discuss here — indicated that senior Afghan officials were deeply involved in the narcotics trade. Narco-traffickers were buying off hundreds of police chiefs, judges and other officials. Narco-corruption went to the top of the Afghan government. The attorney general, Abdul Jabbar Sabit, a fiery Pashtun who had begun a self-described “jihad against corruption,” told me and other American officials that he had a list of more than 20 senior Afghan officials who were deeply corrupt — some tied to the narcotics trade. He added that President Karzai — also a Pashtun — had directed him, for political reasons, not to prosecute any of these people. (On July 16 of this year, Karzai dismissed Sabit after Sabit announced his candidacy for president. Karzai’s office said Sabit’s candidacy violated laws against political activity by officials. Sabit told a press conference that Karzai “has never been able to tolerate rivals.”)

A nearly equal challenge in 2006 was the lack of resolve in the international community. Although Britain’s foreign office strongly backed antinarcotics efforts (with the exception of aerial eradication), the British military were even more hostile to the antidrug mission than the U.S. military. British forces — centered in Helmand — actually issued leaflets and bought radio advertisements telling the local criminals that the British military was not part of the anti-poppy effort. I had to fly to Brussels and show one of these leaflets to the supreme allied commander in Europe, who oversees Afghan operations for NATO, to have this counterproductive information campaign stopped. It was a small victory; the truth was that many of our allies in the International Security Assistance Force were lukewarm on antidrug operations, and most were openly hostile to aerial eradication.

Nonetheless, throughout 2006 and into 2007 there were positive developments (although the Pentagon did not supply the helicopters to the D.E.A. until early 2008). The D.E.A. was training special Afghan narcotics units, while the Pentagon began to train Afghan pilots for drug operations. We put together educational teams that convened effective antidrug meetings in the more stable northern provinces. We used manual eradication to eliminate about 10 percent of the crop. In some provinces with little insurgent activity, the eradication numbers reached the 20 percent threshold — a level that drug experts see as a tipping point in eradication — and poppy cultivation all but disappeared in those areas by 2007. And the Department of Justice got the counternarcotics tribunal to process hundreds of midlevel cases.

By late 2006, however, we had startling new information: despite some successes, poppy cultivation over all would grow by about 17 percent in 2007 and would be increasingly concentrated in the south of the country, where the insurgency was the strongest and the farmers were the wealthiest. The poorest farmers of Afghanistan — those who lived in the north, east and center of the country — were taking advantage of antidrug programs and turning away from poppy cultivation in large numbers. The south was going in the opposite direction, and the Taliban were now financing the insurgency there with drug money — just as Patterson predicted.

In late January 2007, there was an urgent U.S. cabinet meeting to discuss the situation. The attendees agreed that the deputy secretary of state John Negroponte and John Walters, the drug czar, would oversee the development of the first interagency counternarcotics strategy for Afghanistan. They asked me to coordinate the effort, and, after Patterson’s intervention, I was promoted to ambassadorial rank. We began the effort with a briefing for Negroponte, Walters, Attorney General Alberto Gonzales and several senior Pentagon officials. We displayed a map showing how poppy cultivation was becoming limited to the south, more associated with the insurgency and disassociated from poverty. The Pentagon chafed at the briefing because it reflected a new reality: narcotics were becoming less a problem of humanitarian assistance and more a problem of insurgency and war.

The United Nations Office on Drugs and Crime was arriving at the same conclusion. Later that year, it issued a report that linked the drug trade to the insurgency and made a controversial statement: “Opium cultivation in Afghanistan is no longer associated with poverty — quite the opposite.” The office convincingly demonstrated that poor farmers were abandoning the crop and that poppy growth was largely confined to some of the wealthiest parts of Afghanistan. The report recommended that eradication efforts be pursued “more honestly and more vigorously,” along with stronger anticorruption measures. Earlier this year, the U.N. published an even more detailed paper titled “Is Poverty Driving the Afghan Opium Boom?” It rejected the idea that farmers would starve without the poppy, concluding that “poverty does not appear to have been the main driving factor in the expansion of opium poppy cultivation in recent years.”

The U.N. reports shattered the myth that poppies are grown by destitute farmers who have no other source of income. They demonstrated that approximately 80 percent of the land under poppy cultivation in the south had been planted with it only in the last two years. It was not a matter of “tradition,” and these farmers did not need an alternative livelihood. They had abandoned their previous livelihoods — mainly vegetables, cotton and wheat (which was in severely short supply) — to take advantage of the security vacuum to grow a more profitable crop: opium.

Around the same time, the United States released photos of industrial-size poppy farms — many owned by pro-government opportunists, others owned by Taliban sympathizers. Most of these narco-farms were near major southern cities. Farmers were digging wells, surveying new land for poppy cultivation, diverting U.S.-built irrigation canals to poppy fields and starting expensive reclamation projects.

Yet Afghan officials continued to say that poppy cultivation was the only choice for the country’s poor farmers. My first indication of the insincerity of this position came at a lunch in Brussels in September 2006 attended by Habibullah Qaderi, who was then Afghanistan’s minister for counternarcotics. He gave a speech in which he said that poor Afghan farmers have no choice but to grow poppies, and asked for more money. A top European diplomat challenged him, holding up a U.N. map showing the recent trend: poppy growth decreasing in the poorest areas and growing in the wealthier areas. The minister, taken aback, simply reiterated his earlier point that Afghanistan needed more money for its destitute farmers. After the lunch, however, Qaderi approached me and whispered: “I know what you say is right. Poverty is not the main reason people are growing poppy. But this is what the president of Afghanistan tells me to tell others.”

In July 2007, I briefed President Karzai on the drive for a new strategy. He was interested in the new incentives that we were developing, but became sullen and unresponsive when I discussed the need to balance those incentives with new disincentives — including arrests of high-level traffickers and eradication of poppy fields in the wealthier areas of the Pashtun south, where Karzai had his roots and power base.

We also tried to let the public know about the changing dynamics of the trade. Unfortunately, most media outlets clung to the myth that the problem was out of control all over the country, that only desperate farmers grew poppies and that any serious law-enforcement effort would drive them into the hands of the Taliban. The “starving farmer” was a convenient myth. It allowed some European governments to avoid involvement with the antidrug effort. Many of these countries had only one- or two-year legislative mandates to be in Afghanistan, so they wanted to avoid any uptick in violence that would most likely result from an aggressive strategy, even if the strategy would result in long-term success. The myth gave military officers a reason to stay out of the drug war, while prominent Democrats used the myth to attack Bush administration policies. And the Taliban loved it because their propaganda campaign consisted of trotting out farmers whose fields had been eradicated and having them say that they were going to starve.

An odd cabal of timorous Europeans, myopic media outlets, corrupt Afghans, blinkered Pentagon officers, politically motivated Democrats and the Taliban was preventing the implementation of an effective counterdrug program. And the rest of us could not turn them around.

Nonetheless, we stayed hopeful as we worked on what became the U.S. Counternarcotics Strategy for Afghanistan. The Defense Department was initially cooperative (as I testified to Congress). We agreed to expand the local meetings and education campaign that worked well in the north. Afghan religious leaders would issue anti-poppy statements, focusing on the anti-Islamic nature of drugs and the increasing addiction rate in Afghanistan. In the area of agricultural incentives, since most farmers already had an alternative crop, we agreed to improve access to markets not only in Afghanistan but also in Pakistan and the wider region. USAid would establish more cold-storage facilities, build roads and establish buying cooperatives that could guarantee prices for legal crops. With the British, we developed an initiative to reward provinces that became poppy-free or reduced their poppy crop by a specified amount. Governors who performed well would get development projects: schools, bridges and hospitals.

But there had to be disincentives too. We agreed to provide security for manual poppy eradication, so that we could show the Afghan people that the more-powerful farmers were vulnerable. We focused on achieving better ground-based eradication, but reintroduced the possibility of aerial eradication. We agreed to increase D.E.A. training of counternarcotics police and establish special investigative units to gather physical and documentary evidence against corrupt Afghan officials. And we developed policies that would increase the Afghan capacity to prosecute traffickers.

Adding to the wave of optimism was the arrival of William Wood as the new U.S. ambassador to Afghanistan. He had been ambassador in Colombia, so he understood drugs and insurgency well. His view was that poppy cultivation was illegal in Afghanistan, so he didn’t really care whether the farmers were poor or rich. “We have a lot of poor people in the drug trade in the U.S.A. — people mixing meth in their trailers in rural areas and people selling crack in the inner cities — and we put them in jail,” he said.

At first Wood advocated — in an unclassified e-mail message, surprisingly — a massive aerial-eradication program that would wipe out 80,000 hectares of poppies in Helmand Province, delivering a fatal blow to the root of the narcotics problem. “If there is no poppy, there is nothing to traffic,” Wood said. The plan looked good on paper, but we knew it would be impossible to sell to Karzai and the Pentagon. Wood eventually agreed to language advocating, at a minimum, force-protected ground-based eradication with the possibility of limited aerial eradication.

Another ally for a more aggressive approach to the problem was David Kilcullen, a blunt counterterrorism expert. He became increasingly concerned about the drug money flowing to the Taliban. He noted that, while Afghans often shift alliances, what remains constant is their respect for strength and consistency. He recommended mobile courts that had the authority to execute drug kingpins in their own provinces. (You could have heard a pin drop when he first made that suggestion at a large meeting of diplomats.) In support of aerial eradication, Kilcullen pointed out that, with manual eradication, you have to “fight your way in and fight your way out” of the poppy fields, making it deadly, inefficient and subject to corrupt bargaining. Aerial eradication, by contrast, is quick, fair and efficient. “If we are already bombing Taliban positions, why won’t we spray their fields with a harmless herbicide and cut off their money?” Kilcullen asked.

So it appeared that things were moving nicely. We were going to increase incentives to farmers and politicians while also increasing the disincentives with aggressive eradication and arrest of criminal officials and leading traffickers. The Pentagon seemed on board.

Then it all began to unravel.

In May 2007, to ensure that the strategy paper would be executed, Anthony Harriman, the senior director for Afghanistan at the National Security Council, decided to take it to the Deputies Committee — a group of cabinet deputy secretaries led by Lt. Gen. Douglas Lute, whom President Bush had appointed his “war czar” — which had the power to make the document official U.S. policy. Harriman asked me to start developing an unclassified version for public release.

Almost immediately, the Pentagon bureaucracy — particularly the South Asia office — made an about-face. First, they resisted bringing the paper to the deputies. When that effort failed (largely because of unexpected support for the plan from new field commanders like Gen. Dan McNeill, who saw the narcotics-insurgency nexus and were willing to buck their Pentagon minders), the Pentagon bureaucrats tried to prevent the release of an unclassified version to the public. Indeed, two senior Pentagon officials threatened me with professional retaliation if we made the unclassified document public. When we went ahead anyway, the Pentagon leaked the contents of the classified version to Peter Gilchrist, a British general posted in Washington. Defense Department officials were thus enlisting a foreign government to help kill U.S. policy — a policy that implicitly recognized that the Pentagon’s “sequencing” approach had failed and that the Defense Department would have to get more involved in fighting the narcotics trade.

Gilchrist told me that the plan was unacceptable to Britain. Britain, apparently joined by Sweden (which has fewer than 500 troops in a part of the country where there is no poppy cultivation), sent letters to Karzai urging him to reject key elements of the U.S. plan. By the time Wood and Secretary Rice pressed Karzai for more aggressive action, Karzai told Rice that because some people in the U.S. government did not support the plan, and some allies did not support it, he was not going to support it, either. An operations-center assistant, who summarized the call for me over my car phone just after it occurred, made an uncharacteristic editorial comment: “It was not a good call, ambassador.”

Even more startling, it appeared that top Pentagon officials knew nothing about the changing nature of the drug problem or about the new plan. When, through a back channel, I briefed the under secretary of defense for intelligence, James Clapper, on the relationship between drugs and the insurgency, he said he had “never heard any of this.” Worse still, Defense Secretary Robert Gates testified to Congress in December 2007 that we did not have a strategy for fighting drugs in Afghanistan. I received a quick apology from the Pentagon counterdrugs unit, which sent a memo to Gates informing him that we actually did have a strategy.

This dissension was, I believe, music to Karzai’s ears. When he convened all 34 Afghan provincial governors in Kabul in September 2007 (I was a “guest of honor”), he made antidrug statements at the beginning of his speech, but then lashed out at the international community for wanting to spray his people’s crops and for giving him conflicting advice. He got a wild ovation. Not surprising — since so many in the room were closely tied to the narcotics trade. Sure, Karzai had Taliban enemies who profited from drugs, but he had even more supporters who did.

Karzai was playing us like a fiddle: the U.S. would spend billions of dollars on infrastructure improvement; the U.S. and its allies would fight the Taliban; Karzai’s friends could get rich off the drug trade; he could blame the West for his problems; and in 2009 he would be elected to a new term.

This is not just speculation, even when you stick with unclassified materials. In September 2007, The Kabul Weekly, an independent newspaper, ran a blunt editorial laying out the issue: “It is obvious that the Afghan government is more than kind to poppy growers. . . . [It] opposes the American proposal for political reasons. The administration believes that it will lose popularity in the southern provinces where the majority of opium is cultivated. They’re afraid of losing votes. More than 95 percent of the residents of . . . the poppy growing provinces — voted for President Karzai.” The editorial recommended aerial eradication. That same week, the first vice president of Afghanistan, Ahmad Zia Massoud, wrote a scathing op-ed article in The Sunday Telegraph in London: “Millions of pounds have been committed in provinces including Helmand Province for irrigation projects and road building to help farmers get their produce to market. But for now this has simply made it easier for them to grow and transport opium. . . . Deep-rooted corruption . . . exists in our state institutions.” The Afghan vice president concluded, “We must switch from ground-based eradication to aerial spraying.”

But Karzai did not care. Back in January 2007, Karzai appointed a convicted heroin dealer, Izzatulla Wasifi, to head his anticorruption commission. Karzai also appointed several corrupt local police chiefs. There were numerous diplomatic reports that his brother Ahmed Wali, who was running half of Kandahar, was involved in the drug trade. (Said T. Jawad, Afghanistan’s ambassador to the United States, said Karzai has “taken the step of issuing a decree asking the government to be vigilant of any business dealing involving his family, and requesting that any suspicions be fully investigated.”) Some governors of Helmand and other provinces — Pashtuns who had advocated aerial eradication — changed their positions after the “palace” spoke to them. Karzai was lining up his Pashtun allies for re-election, and the drug war was going to have to wait. “Maybe we taught him too much about politics,” Rice said to me after I briefed her on these developments.

Karzai then put General Khodaidad (who, like many Afghans, goes by only one name) in charge of the Afghan counternarcotics efforts. Khodaidad — a conscientious man, competent and apparently not corrupt — was a Hazara. The Hazaras had no influence over the southern Pashtuns who were dominating the drug trade. While Khodaidad did well in the north, he got nowhere in Helmand and Kandahar — and told me so. Karzai had to have known this would be the case.

But the real test for the Afghan government and the Pentagon came with the “force protection” issue. At high-level international conferences, the Afghans — finally, under European pressure — agreed to eradicate 50,000 hectares (more than 25 percent of the crop) in the first months of this year; and they agreed that the Afghan National Army would provide force protection.

The plan was simple. The Afghan Poppy Eradication Force would go to Helmand Province with two battalions of the national army and eradicate the fields of the wealthier farmers — including fields owned by local officials. Protecting the eradication force would also enable the arrest of key traffickers. The U.S. military, which trained the Afghan army, would assist in moving the soldiers there and provide outer-perimeter security. The U.S. military would not participate directly in eradication or arrest operations; it would only enable them.

But once again, Karzai and his Pentagon friends thwarted the plan. First, Anthony Harriman was replaced at the National Security Council by a colonel who held the old-school Pentagon view that “we don’t do the drug thing.” He would not let me see General Lute or Stephen J. Hadley, the national security adviser, when the force-protection plans failed to materialize. We asked numerous Pentagon officials to lobby the defense minister, Abdul Rahim Wardak, for immediate force protection, but they did little.

Consequently, in late March, the central eradication force set out for Helmand without the promised Afghan National Army. Almost immediately, they came under withering attack for several days — 107-millimeter rockets, rocket-propelled grenades, machine-gun fire and mortars. Three members of the Afghan force were killed and several were seriously wounded. They eradicated just over 1,000 hectares, about 1 percent of the Helmand crop, before withdrawing to Kabul.

This spring, more U.S. troops arrived in Afghanistan. They were effective, experienced warriors — many coming from Iraq — but they knew little about drugs. When they arrived in southern Afghanistan, they announced that they would not interfere with poppy harvesting in the area. “Not our job,” they said. Despite the wheat shortage and the threat of starvation, they gave interviews saying that the farmers had no choice but to grow poppies.

At the same time, the 101st Airborne arrived in eastern Afghanistan. Its commanders promptly informed Ambassador Wood that they would only permit crop eradication if the State Department paid large cash stipends to the farmers for the value of their opium crop. Payment for eradication, however, is disastrous counternarcotics policy: If you pay cash for poppies, farmers keep the cash and grow poppies again next year for more cash. And farmers who grow less-lucrative crops start growing poppies so that they can get the money, too. Drug experts call this type of offer a “perverse incentive,” and it has never worked anywhere in the world. It was not going to work in eastern Afghanistan, either. Farmers were lining up to have their crops eradicated and get the money.

On May 12, at a press conference in Kabul, General Khodaidad declared the 2008 anti-poppy effort in southern Afghanistan to be a failure. Eradication this year would total less than a third of the 20,000 hectares that Afghanistan eradicated in 2007. The north and east — particularly Balkh, Badakhshan and Nangarhar provinces — continued to improve because of strong political will and better civilian-military cooperation. But the base of the Karzai government — Kandahar and Helmand — would have record crops, less eradication and fewer arrests than in years past. And the Taliban would get stronger.

Despite this development, the Afghans were busily putting together an optimistic assessment of their progress for the Paris Conference on Afghanistan — where, on June 12, world leaders, including Karzai, met in an event reminiscent of the London Conference of 2006. In Paris, the Afghan government raised more than $20 billion in additional development assistance. But the drug problem was a nuisance that could jeopardize the financing effort. So drugs were eliminated from the formal agenda and relegated to a 50-minute closed discussion at a lower-level meeting the week before the conference.

That is where we are today. The solution remains a simple one: execute the policy developed in 2007. It requires the following steps:

1. Inform President Karzai that he must stop protecting drug lords and narco-farmers or he will lose U.S. support. Karzai should issue a new decree of zero tolerance for poppy cultivation during the coming growing season. He should order farmers to plant wheat, and guarantee today’s high wheat prices. Karzai must simultaneously authorize aggressive force-protected manual and aerial eradication of poppies in Helmand and Kandahar Provinces for those farmers who do not plant legal crops.

2. Order the Pentagon to support this strategy. Position allied and Afghan troops in places that create security pockets so that Afghan counternarcotics police can arrest powerful drug lords. Enable force-protected eradication with the Afghan-set goal of eradicating 50,000 hectares as the benchmark.

3. Increase the number of D.E.A. agents in Kabul and assist the Afghan attorney general in prosecuting key traffickers and corrupt government officials from all ethnic groups, including southern Pashtuns.

4. Get new development projects quickly to the provinces that become poppy-free or stay poppy-free. The north should see significant rewards for its successful anticultivation efforts. Do not, however, provide cash to farmers for eradication.

5. Ask the allies either to help in this effort or stand down and let us do the job.

There are other initiatives that could help as well: better engagement of Afghanistan’s neighbors, more drug-treatment centers in Afghanistan, stopping the flow into Afghanistan of precursor chemicals needed to make heroin and increased demand-reduction programs. But if we — the Afghans and the U.S. — do just the five items listed above, we will bring the rule of law to a lawless country; and we will cut off a key source of financing to the Taliban.

Monday, July 21

Women on Campus - Two important pieces by Richard Whitmire

The Latest Way to Discriminate Against Women
By RICHARD WHITMIRE

There's something all-American about filing lawsuits. McDonald's coffee burn your lap? Dry cleaner lose your favorite pants? Sue! And somehow we find it perfectly logical that social policy should be guided by lawsuits. Upset by the University of Michigan's handing out admissions preferences to black students? Find a willing complainant and sue. Hey, quite often it works.

Why then does one of the biggest, sweetest lawsuits imaginable — colleges routinely discriminating against women in their admissions policies — go unfiled? Recently U.S. News & World Report laid bare the evidence. Far more women than men apply to college these days, and in desperate attempts to keep their campuses from swinging hugely female, colleges tell straight-A girls to look elsewhere while B-average boys get the fat envelope.

Take the University of Richmond, a small, private college that tries to hew close to a 50-50 gender balance, according to U.S. News. To do that, however, it admits boys at a rate 13 percentage points higher than girls. (University officials say they do it to accommodate university housing.) Can a private college discriminate against girls and get away with it? Perhaps.

Not so with public universities, as the University of Georgia discovered the hard way several years ago when its preferences for men and minority students got a legal thumping. And yet the same U.S. News article reported that the admittance rate for men at the College of William and Mary was an average of 12 percentage points higher than the rate for women from 1997 to 2006. Henry Broaddus, dean of admission at the college, explained it: "I don't think that's an issue of equity; it's an issue of institutional prerogative [to create] a community that will best serve both the men and the women who elect to be members of that community."

"Even women who enroll ... expect to see men on campus," he added. "It's not the College of Mary and Mary; it's the College of William and Mary."

To his credit, Broaddus is simply telling it like it is. At a time when, on average, 57 percent of all undergraduates are women, setting the admissions gender bar equally could tilt a campus past the 60/40 tipping point that radically changes a college. And nobody hates going past the tipping point more than the women on a campus where that occurs.

I've just spent a year working on a book about boys falling behind in elementary and secondary schools (Boy Troubles, to be published by Doubleday in the fall of 2008), a trend that explains their relative absence on college campuses. While my primary focus is explaining why boys aren't succeeding on the lower rungs of schooling, I keep an eye on the widening gender gaps in college. My hunch is that the public won't pay any attention to the problems boys are having in schools until someone notices the startling changes taking place on college campuses.

To date, however, my predictions about campuses flaring up over gender imbalances and admissions preferences have failed to materialize. Oddly, girls don't seem to mind that the University of Richmond sets a higher bar for them. And feminist groups appear more than willing to look the other way while public colleges such as William and Mary do the same. Understanding why requires taking a quick look at each of the players in this loose conspiracy:

The feminists: "There's no easy answer as to what's legal and what isn't legal," Marcia D. Greenberger, co-president of the National Women's Law Center, told U.S. News, referring to admissions biases. In truth, it's murky only because groups such as the National Women's Law Center prefer to keep it that way. If this were a gender case involving sports discrimination, pay gaps, or tenured faculty, I'm guessing the lawsuits would be flying. But in this case, the feminist legal groups aren't out trolling for complainants.

In a phone interview with me, Emily Martin, deputy director of the ACLU Women's Rights Project, was candid about the dilemma feminists face with this issue. "I was surprised at how stark the numbers were," said Martin, referring to the U.S. News list of admission gaps. "I think it's legally questionable, even for the private schools." Title IX was supposed to head off discriminatory practices by colleges, she said. And what could be more discriminatory than turning down a superb female candidate for a less-accomplished male?

Part of the reason for the lack of lawsuits may be the "natural murkiness" of the admissions process, she said. Colleges aren't going to admit they're discriminating against women. And feminist groups are hesitant to test the legal waters, murky or not. For starters, feminists are the first to recognize the benefits of diversity — and isn't having a campus reasonably balanced by gender part of diversity?

Discussions with feminists while researching my book point to another reason. Alerting the public that women increasingly dominate college campuses will make it appear women have "won." And if women have won, why are they still complaining about discrimination in higher education? Feminists have far bigger fish to fry than packing undergraduate classes with ever more women. Looming larger on their agendas are boosting tenured-faculty positions and leveling what they see as huge gender pay gaps in the work force. Hence, from the perspective of feminists, the less said about undergraduate women taking over college campuses, the better.

The conservatives: Both private and public colleges break the law when they favor men over women, Roger Clegg, president of the Center for Equal Opportunity, told me. Why no lawsuits? Race issues attract the litigation, he said, because it's harder to justify racial discrimination from a legal standpoint — plus, colleges place more weight on race than gender in admissions. Finally, the biggest offenders are private colleges, and libertarian-minded litigators prefer not to tangle with private institutions.

College-aspiring high-school girls: If you were a straight-A high-school senior, and your admissions counselor ruled out your top choices because those slots were going to slacker boys, wouldn't you be mad enough to sue? Perhaps, but keep in mind that if nobody's looking for complainants, the chances are pretty good none will be discovered. Jennifer Gratz, the key complainant in the University of Michigan case involving race-friendly admissions policies there, didn't surface totally on her own. The Center for Individual Rights went out fishing for someone to represent its cause. So while plenty of 18-year-old girls may be grumpy about this issue, that doesn't translate into lawsuits.

College women: What most college women want is an equal gender mix on campus. Sure, they object, in theory, to the unequal hurdles they faced getting in. But once inside, they are far more upset about finding gender imbalances. They will be the first to tell you about the problems triggered when a campus passes the 60/40 tipping point. Boys who couldn't win a glance from a pretty girl in high school suddenly become players. And that's really annoying. Worse, women are expected to fulfill a guy's sexual desire immediately or risk losing a prospective mate to the next girl in line. Just get the men into the college gates, women tell the admissions officers; we don't care how you do it!

College men: You're joking, right? I recall walking into an all-guy freshman dorm room at the University of Maine at Farmington, where there are nearly two girls for every boy, and hearing: "Sweeeeeet." "What's not to like?" they ask.

The colleges: As Jennifer Delahunty Britz learned the hard way, college officials are supposed to keep their mouths shut about this issue. In March 2006, Britz wrote an op-ed in The New York Times describing how her daughter got wait-listed at colleges where she should have been a shoo-in. As the admissions dean at Kenyon College, Britz was in a privileged position to know the source of her daughter's difficulties: the boy preference.

"The reality is that because young men are rarer, they're more valued applicants," wrote Britz, who later in the article asked, "And what messages are we sending young women that they must, nearly 25 years after the defeat of the Equal Rights Amendment, be even more accomplished than men to gain admission to the nation's top colleges?"

After the op-ed, Britz ran into a buzz saw of criticism. When she appealed to other admissions directors to speak out on her behalf, she was met with silence. At one point, she told me, she feared for her job at Kenyon, but top officials there, who knew about the op-ed before its publication, stood by her.

Nearly a year after the publication of that op-ed, I called Britz, who raised the same question I'm asking here: Why no protests? The answer, it appears, is that lawsuits need more than complainants. They need someone willing to honcho them. And in this case, the honchos are sitting on their hands.

Boy advocates: I know what you're thinking: What's this group doing on the list? Oddly, they are players here. The most prominent boy activist in the country is Thomas G. Mortenson, a higher-education policy analyst who draws up lengthy fact sheets documenting how profoundly boys have fallen behind over the last two decades. But the idea of giving boys preferences for spots at good colleges is anathema to Mortenson.

As he wrote in a USA Today op-ed article, "Addressing the growing gender imbalance in college through affirmative action for young men addresses the symptoms but not the causes. It insults the efforts and accomplishments of young women. Far worse, it leaves many boys in the same confused condition they are in now. And it lets parents — especially fathers — and schoolteachers off the hook for their failure to raise and educate boys to be as accomplished, goal-oriented, engaged, and responsible as young women are today."

There it is. Just when you thought the saga couldn't get any stranger, the sharpest critic of a college-admissions system unfair to girls turns out to be a boys' advocate. And the legal silence goes on and on.

A Tough Time to Be a Girl: Gender Imbalance on Campuses
By RICHARD WHITMIRE

Casual sex. The mere words give parents the jitters, which is partly why the college pickup culture has received so much attention. News-media coverage ranges from checkout-aisle magazine stories serving up titillating details of alcohol-fueled encounters to full-scale reports like the delightfully titled "Hooking Up, Hanging Out, and Hoping for Mr. Right," released by the Institute for American Values, a family-values think tank.

Last year, the writer Laura Sessions Stepp created a stir with her book Unhooked: How Young Women Pursue Sex, Delay Love and Lose at Both, which described what the author says is lost as young men and women move away from traditional romantic relationships and toward fleeting sexual encounters. Not only are women gambling with their health, argues Stepp, but they are making decisions they will regret in future years. The hookup culture could leave them bereft of the skills to build real relationships later in life. Whether Stepp is "retro," as some of her critics charged, may be less important than the fact that the hookup culture shows no signs of reversal.

One key element of the pickup culture, however, remains unreported: American colleges are undergoing a striking gender shift. In 2015 the average college graduating class will be 60 percent female, according to the U.S. Education Department. Some colleges have already reached or passed that threshold, which allows anecdotal insights into how those imbalances affect the pickup culture. What can be seen so far is not encouraging: Stark gender imbalances appear to act as an accelerant on the hookup culture.

Biologists and social scientists can't be surprised by that observation. In the animal kingdom, it is well known that whichever sex is in short supply has the upper hand.

College campuses are not immune to such laws of nature, something I glimpsed while doing research into why boys are lagging in literacy skills and college attendance. In 2006 I visited James Madison University, a public university with 17,000 students. At the time, women made up 61 percent of the campus population.

I chose James Madison because the president had just announced he would eliminate seven men's sports, a move necessary to comply with Title IX. In doing so, the university would bring its sports program back into alignment with its overall gender ratio. Many male athletes appeared shocked by the announcement, as though they had barely noticed the gender imbalances. Female students differed: While they protested the loss of the men's sports teams, they were very aware of those imbalances and saw them as involving far more than sports.

A junior I spoke with saw the sports controversy as an opening to expose problems she saw arising from the imbalances. Her first clue that something was different about the university came when she checked out the roughly 30 other students from her high school who attended: All but five were women. Her dorm assignment was the next revelation: Her "coed" dorm of 76 students included only 12 men. She realized that she was seeing a phenomenon unheard of at her high school, where the gender mix was about even. Women at the university would wear anything, and many would do anything, to win the competition to get a guy's attention. A striking brunette, she had no trouble competing, but she soon lost her taste for playing the game at a university where the gender imbalances changed the rules.

"My second semester freshman year I dated a guy, but it only lasted three weeks. I realized he was dabbling, if you will, with every other woman in his dorm. This was completely unacceptable to my standards," she said. However, her fellow female students were putting up with similar behavior. Many women there endure what she called a "mind shift," tolerating things they would never put up with in another setting where the male-female ratios were even.

The party scene was worse: "You'll walk into a room and there will be three boys and 10 girls. The girls are all competing to see who goes home with the boys. The guys have their pick." Another female junior agreed, noting that the phenomenon influences friendships, too: "I have a lot fewer guy friends in college than I did in high school. It's almost a trust issue, because I feel disposable. If he doesn't think I'm a good friend he can go elsewhere. A lot of women here don't invest as much in their guy relationships as they do in their relationships with other women."

A senior added: "The guys see that there are a lot more girls, and they're not interested in having a relationship longer than the next girl to come along. Men know how to take advantage of that competition. They'll set things up at parties to get girls to do stuff, such as having a slip and slide contest," in which girls strip to their underwear and get wet sliding through water on a plastic sheet.

As a result of the rising gender imbalances, the university has become "female-centric." But while women may run the clubs, dominate in classes, and generally define the character of the university, the law of supply and demand rules the social scene. That's why the women are both competitive in seeking men and submissive in lowering their standards.

Men at the university don't dispute what the women say. "Since there's such an overwhelming number of girls, they have such competition between each other to get a guy," a male junior admitted. "The guys here aren't stupid. They're plenty aware of that and know that girls have to get into a fight over them, instead of what's normal with guys courting girls."

At James Madison and other colleges I visited with severe gender imbalances, the men appeared to pay an eventual price by failing to develop relationship skills and losing the trust of the women. When guys abuse the women, the women eventually get mad and take it out on all the guys, not just the abusers, the male student acknowledged: "It makes it more difficult for a guy to have a girl at the university come to trust him. A lot of times they think you're one of the bad guys who just wants to hook up."

As a public university that refuses to give admissions preferences to men, James Madison has few options for rebalancing its campus. It is not the only college experiencing fallout from such growing gender imbalances; it just arrived early on the scene. By 2015 what women experience there may become common at hundreds of campuses.

No shortage of grist for the supermarket tabloids.

Richard Whitmire is an editorial writer at USA Today and blogs at http://www.whyboysfail.com.

Monday, July 14

Government as the Big Lender

The desperate worry over the health of huge financial institutions with country-cousin names — Fannie Mae and Freddie Mac — reflects a reality that has reshaped major spheres of American life: the government has in recent months taken on an increasingly dominant role in ensuring that Americans can buy a home or attend college.

Much of the private money that once surged into the mortgage industry has fled in a panicked horde, leaving most of the responsibility for financing American homes to the government-sponsored Fannie and Freddie.

Two years ago, when commercial banks were still jostling for fatter slices of the housing market, the share of outstanding mortgages Fannie and Freddie owned and guaranteed dipped below 40 percent, according to an analysis of Federal Reserve data by Moody’s Economy.com. By the first three months of this year, Fannie and Freddie were buying more than two-thirds of all new residential mortgages.

A similar trend is playing out in the realm of student loans. As commercial banks concluded that the business of lending to college students was no longer quite so profitable, the Bush administration promised in May to buy their federally guaranteed student loans, giving the banks capital to continue lending.

In short, in a nation that holds itself up as a citadel of free enterprise, the government has transformed from a reliable guarantor into effectively the only lender for millions of Americans engaged in the largest transactions of their lives.

Before, its more modest mission was to make more loans available at lower rates. Now it is to make sure loans are made at all. The government is setting the terms and the standards of Americans’ biggest loans.

On Sunday, that federal oversight and protection were made more explicit, as the Bush administration sought to mount a rescue of Fannie and Freddie, asking Congress to devote public money to buying the two companies’ flagging stocks.

The new reality is scorned by libertarians and conservatives, who fear state intrusions on the market, and by populists and progressives, who dislike the idea of education and housing increasingly resting upon the government’s willingness to finance it.

“If you’re a socialist, you should be happy,” said Michael Lind, a fellow at the New America Foundation, a research institute in Washington. “But you should really wonder whether you want people’s ability to pay for housing and college dependent on the motives of people in Washington.”

The government is trying to support plummeting housing prices and spare strapped homeowners from the wrath of the market: last week, the Senate adopted a bill authorizing the Federal Housing Administration to insure up to $300 billion in refinanced mortgages, enabling borrowers saddled with unaffordable loans to get better terms.

How the government came to dominate these two crucial areas of American lending is — depending on one’s ideological bent — a narrative of regulatory and market failure, or a cautionary tale about bureaucratic meddling in commerce. Perhaps it is both.

To those prone to blame lax regulation, the mortgage fiasco was the inevitable result of a quarter-century in which American policy makers prayed at the altar of market fundamentalism, letting entrepreneurs succeed or fail on their own.

This was the spirit in which Alan Greenspan, the longtime chairman of the Federal Reserve, allowed banks to engineer unfathomably complicated webs of mortgage-based investments that, through the first half of this decade, sent real estate prices soaring and expanded homeownership.

The banks relied on these investments to raise money for the next wave of loans. The system worked so long as lenders could keep selling their mortgages, and so long as someone would guarantee most of the debts. Fannie and Freddie took care of both tasks. Together, they now guarantee or own roughly half of the nation’s $12 trillion mortgage market.

Belief in Fannie and Freddie gave banks a sense of certainty as they plowed more of their capital into residential mortgages. That easy financing, in turn, brought more and more people into the market for homes, generating a belief that American real estate prices could keep rising forever.

And that contributed to the banks’ ultimately making extraordinarily risky loans, which were the first to default when home prices started falling. As lenders turned cautious, the whole speculative bubble burst.

As some called for intervention by the Fed to cool a speculative binge, Mr. Greenspan resisted. He believed the risks of real estate were effectively limited because debt was widely dispersed. The market would sort it all out.

“Alan Greenspan had this view that the light hand of regulation was best,” said Vincent R. Reinhart, a former Federal Reserve economist and now a scholar at the American Enterprise Institute.

When housing prices began plummeting, the ugly truth emerged that many banks did not understand the details of the mortgage-backed investments they owned. Ignorance proved expensive.

As one bank after another announced losses that now exceed $400 billion and that some estimate will ultimately cross the trillion-dollar mark, money ran screaming from the field, leaving Fannie and Freddie pretty much the only players.

A general fear of debt took hold. Banks that had offered loans to students under a federally guaranteed program suddenly could not sell investments linked to those outstanding debts, meaning they could not raise cash for the next crop of loans. Dozens of banks pulled out of the program.

“What’s happened kind of speaks for itself,” said Dean Baker, co-director of the Center for Economic and Policy Research in Washington. “You had this effort to weaken the government’s role. There was this conscious effort to turn things over to the private sector, and it failed.”

But there is a parallel narrative, the story that critics and competitors of Fannie and Freddie have told for years: how the two companies exploited their pedigree as entities backed by the government to secure an unfair advantage over the private sector.

They swelled into highly leveraged behemoths, it was said, on the implicit guarantee that the government would step in and rescue them if they ever got into trouble. This allowed them to borrow money more cheaply than their competitors could, enabling them in turn to lend at lower rates.

That secured more business and rewarded their shareholders, along with their handsomely compensated executives. It emboldened them to trade in highly risky investments.

“They were using their privileged position as favored children of the government to dominate the market, and taxpayers were on the hook for substantial risk,” said Martin N. Baily, a chairman of the Council of Economic Advisers in the Clinton administration. “You couldn’t possibly say this was a pure unfettered market.”

The government was getting something for its protective largess. It was using Fannie and Freddie to pursue the social goal of broader homeownership, particularly among racial minorities.

“When you’re looking at the upside, here’s the government helping people get mortgages and student loans,” said David R. Henderson, a self-described libertarian economist at the Hoover Institution at Stanford University. “The downside is there might be a bailout and then you pay in taxes. These things don’t come cost-free when government gets involved.”

As the Bush administration readies funds to buy student loans from cash-short banks, and officials plot a potential bailout of Fannie and Freddie that could run into tens of billions of dollars, the government’s outsize role in these two huge areas will not shrink anytime soon.

It seems a strange coda to an era in which markets were sacred, and regulation heresy.

For a generation, American policy makers have lectured the world on the need to unleash the animal spirits of the market. China’s rickety banks should stop lending to protect state factory jobs, Americans said, and focus on the bottom line. Now the Bush administration is reluctantly concluding that Fannie and Freddie might need to be propped up to protect the American homeowner.

During much of Japan’s lost decade of the 1990s, Americans called for an end to its coddling of weak banks. Better to let them keel over, along with the paper tiger companies they sustained. No company was “too big to fail,” Washington said.

Yet here, in the aftermath of a financial crisis brought on by what were once called American virtues — financial engineering and risk management — Washington may bail out Fannie and Freddie for the simple reason that they are too big to fail. If they go down, so do whole neighborhoods. So, perhaps, does the global financial system.

“The thing we have to do now is to make sure that Fannie and Freddie remain solvent and continue to make loans,” Mr. Baily said. “We just don’t have any choice.”

Sunday, July 13

2,691 Decisions: 30 Years Covering the Supreme Court

WASHINGTON — Sometime during the first of my nearly 30 years reporting on the Supreme Court, a distinct visual image of a Supreme Court term took hold in my mind and never let go. The nine-month term was a mountain. My job was to climb it.

The slope was gentle when the term began, every first Monday in October; the court was busy choosing new cases and hearing arguments, but it was not yet ready to issue decisions. The upward path steepened in January and February, when grants of new cases, arguments and decisions all came at once, competing for attention. Spring brought a breather as the path flattened out again: all the arguments had been heard, and the decisions were sporadic. The steepest climb came, predictably, every June, with the final outpouring of opinions before the summer recess. And then it was over. I could look down from the mountaintop to see the term whole and clear, while off in the distance the next term loomed, another climb.

But not this year. I am retiring from The New York Times to write and teach at Yale Law School. So this time, I can survey all the mountains, stretching back to the morning in 1978 when I first walked up the court’s marble steps — mistakenly, as it turned out, because people with business at the court actually use a less majestic but more practical side entrance at ground level.

I had been a political reporter, covering state government in New York from Albany, before I received a Ford Foundation fellowship for journalists to attend Yale Law School for a year. Certainly my Yale master’s degree, the ink barely dry as I walked up those marble steps, had given me a useful grasp of legal concepts. But it could scarcely prepare me for the texture and flavor, the sheer dailiness, of life at the court. So much happened behind closed doors. What did the justices do all day, anyway? I imagined them in earnest conversation with one another, grappling with the great legal questions of the day (in 1978 affirmative action was the most pressing). I learned only gradually that it isn’t like that at all, that except for their formal gatherings around the conference table once or twice a week, the justices spend their time, when they are not on the bench, in their chambers, alone or with their law clerks. Communications among them tend to be in writing, even today, and the ethos of the place discourages one justice from intruding on another’s space, physically or verbally. Membership in one of the world’s most exclusive clubs can be isolating, a little lonely, which I think is why those justices who enjoy companionship spend a fair amount of their free time on the road, speaking at law schools and judicial conferences.

In The Times’s Albany bureau, contact with the capitol’s newsmakers was constant, and feedback from them was instantaneous — not always pleasant, but essential for understanding competing perspectives and agendas, or simply for avoiding making the same mistake twice. Compared with the frenzied drama of the New York Legislature, the quiet of the Supreme Court press room was the silence of the tomb. In place of the easy banter with politicians that had made the Albany beat so engaging, there was an almost suffocating paper flow. Before I could work my way through one list of newly filed petitions to the court, two more would arrive.

Politics, comfortingly, had presented a moving target — an interpretation that seemed wrong today could well be proven correct tomorrow. But when it came to Supreme Court decisions, it was quite possible to get it wrong, flatly and irrevocably. And if I did get it wrong, how would I know? The fact that I received no feedback from those whose activities I was covering was hardly reassuring. It just underscored how different this new environment was going to be.

And yet I came to see my Albany experience as valuable, rather than irrelevant, to my new assignment. Watching the back-and-forth between a state legislature and the Supreme Court of the United States had given me a real sense of the court as an active participant in the ceaseless American dialogue about constitutional values and priorities, not a remote oracle.

For example, the New York Legislature in the 1970s was determined to channel taxpayer money to parochial schools. A majority of the Supreme Court was equally determined to keep that from happening. Session after legislative session in Albany, I reported on efforts to get around the latest Supreme Court ruling and to do indirectly (by providing textbooks or transportation rather than classroom instruction, for instance) what the court had said could not be done directly. It was a constitutional Ping-Pong match, foreshadowing, in its way, the recent one between the court and the Bush administration over the handling of the Guantánamo detainees; a battle over principle, to be sure, but also over who would get the last word.

There was another useful lesson for me in the struggle over parochial school aid: the court’s makeup changes, and so does the law. As an associate justice, William H. Rehnquist, who wanted to cultivate a much bigger space for religion in public life, planted a few seeds in arid soil. He tended those seeds assiduously as new allies joined the court and the climate warmed, until they germinated in the form of decisions like the one in 2002, Zelman v. Simmons-Harris, which upheld Ohio’s system of taxpayer-financed vouchers for parents to use for parochial school tuition. “A program of true private choice,” Chief Justice Rehnquist said in his 5-to-4 majority opinion — having established years earlier, in less freighted contexts, that when public money passes through parents’ hands, it loses its public character and its use becomes a “private choice.”

And then something interesting happened. The voucher movement, even though its constitutional shackles had been removed, stalled almost everywhere, owing not to the intervention of federal judges but to resistance from state courts, teachers’ unions and taxpayers. An ambitious legislative campaign by voucher advocates in 2004 ended in defeat in state after state. The court can only do so much. It can lead, but the country does not necessarily follow.

In fact, it is most often the Supreme Court that is the follower. It ratifies or consolidates change rather than propelling it, although in the midst of heated debate over a major case, it can often appear otherwise. Without delving into the vast political science and legal academic literature on this point, I’m simply offering my empirical observation that the court lives in constant dialogue with other institutions, formal and informal, and that when it strays too far outside the existing political or social consensus, the result is a palpable tension both inside and outside the court.

Such periods are fascinating, and inherently unstable. The early New Deal period is a classic example. The public demanded change, and the “nine old men” stood in the way. The “court-packing” crisis ensued; President Franklin D. Roosevelt had to back down from adding new and younger justices, and change came from inside the court anyway. Some decisions protecting the rights of criminal suspects, made by Earl Warren’s court in the 1960s, placed the court to the left of the country’s center (and provided useful campaign fodder for Richard M. Nixon).

A year ago, at the end of a Supreme Court term marked by sharp ideological divisions and attacks on precedent by a newly empowered conservative majority, I thought we were entering such a period; the court appeared to be moving to the right of the public. For example, the 5-to-4 decision blocking local communities from taking modest steps to preserve the hard-won gains of public school desegregation threatened to unravel delicate arrangements in school districts around the country. That remains a highly problematic decision, but the more muted and centrist tone of the term that just ended has made me less persuaded that the court is on a collision course with mainstream public opinion.

In any event, it is often the court that eventually retreats when it finds itself out of sync with the prevailing mood. That appeared to be the case with the “federalism revolution” that Chief Justice Rehnquist began in the mid-1990s. In a series of 5-to-4 decisions, the court declared that Congress did not have the power it assumed it had to make federal statutes binding on the states. These decisions, reflecting the chief justice’s longstanding goal to re-adjust the post-New Deal federal-state balance, signaled an abrupt jurisprudential shift.

But then 9/11 happened and the national mood changed. Suddenly, the federal government looked useful, even necessary. The Supreme Court’s federalism revolution had been overtaken by events. In 2003, Chief Justice Rehnquist wrote for a 6-to-3 majority that Congress acted within its constitutional authority when it said state governments could be sued for failing to give their employees the benefits required by the Family and Medical Leave Act. It was a decision of enormous symbolic significance. Without apology or much in the way of explanation, the chief justice gave up the fight and moved on.

I admired Chief Justice Rehnquist as a strategist and tactician; he knew what he wanted and knew his limits, just as in his weekly poker game he knew when to hold ’em and when to fold ’em. Justice Antonin Scalia, who joined the court in 1986, was a flashier attention-grabber, but I never had any doubt that William Rehnquist was the brains behind the court’s ascendant conservatives. He took his role seriously, but himself less so (unlike his stuffy predecessor, Warren E. Burger, the first chief justice of my tenure). When he emerged from behind the courtroom’s velvet curtain one morning in 1995 sporting four gold stripes on each sleeve of his robe — with some of his colleagues struggling to suppress smiles — many people saw pomposity, but I saw a wry or maybe even self-mocking comment on the boredom of basic black after 23 years on the court. He had another 10 years to go.

We had nothing approaching a confidential relationship, but we did chat now and then. On the morning after the 2000 presidential election, I ran into him on the court’s plaza as he was taking his morning walk. Wasn’t it amazing, we agreed, that the outcome of the election was still in doubt.

The court I began covering in 1978 was populated by men who were, for the most part, older than my father. Thurgood Marshall, William J. Brennan Jr. and Byron R. White were historic figures. Harry A. Blackmun had only a few years earlier been propelled from obscurity when he wrote the court’s 7-to-2 majority opinion in Roe v. Wade. Nine new justices joined the court during my time there. Of the original group, only John Paul Stevens remains. Three members of the court are younger than I am.

Amid all that change, nothing touched me as much as the arrival in September 1981 of Sandra Day O’Connor. I had never heard her name before President Ronald Reagan nominated her that summer to succeed Potter Stewart. Although I covered her confirmation hearing, she remained to me basically a blank slate. That didn’t matter. The first time I looked up from the press section and saw a woman sitting on the bench, I was thrilled in a way I would never have predicted. Her presence invaded my subconscious. I had recurring dreams about her. In one, she asked me my opinion on a pending case (something no justice ever did in real life). But mostly, she just had walk-on roles in ordinary nighttime dramas, her presence signifying what it meant to me to know that there was no longer a position in the legal profession that a woman could not aspire to.

Four summers later, I was pregnant. Encountering me in a hallway, Justice O’Connor asked me when the baby was due. “Just before the first Monday in October,” I replied. Sandra Day O’Connor, mother of three, laughed. “Oh, keep your legs crossed,” she urged. “Don’t let that baby come out until the First Monday!” Some 30 minutes into the first Monday in October 1985, my daughter, Hannah, came into the world. I later learned that right before going on the bench that morning for the term’s opening session, Justice O’Connor called the court’s public information office and asked: “Has anyone heard from Linda? Did she have her baby today?”

(Years later, my daughter bluntly reminded me that today’s young women have the luxury of taking for granted the pioneering accomplishments of a Sandra Day O’Connor or Ruth Bader Ginsburg. When I observed that I was out of college before I ever met a woman who was a lawyer, the teenage Hannah regarded me with compassion. “Face it, Mom,” she said. “You’ve led a sheltered life.”)

Continuity and change, the entwined spirals of a double helix, are the court’s DNA. Continuity is anchored by the gravitational pull of precedent. Who would have believed that William Rehnquist, long a vocal critic of the Warren court’s Miranda decision, could write a majority opinion in 2000 not only reaffirming it but proclaiming that the Miranda warnings had become “part of our national culture”?

The pull of precedent is powerful but scarcely all-powerful when a shift of personnel or perspective breaks the spell, allowing the forces of change to exert their counterpull. The road from Bowers v. Hardwick, the 1986 decision that dismissed a claim of gay rights as “at best, facetious,” to Lawrence v. Texas, which 17 years later located the privacy rights of gay men and lesbians at the heart of constitutional due process, was paved, I have no doubt, by the justices’ experience of knowing gay men and women in their personal and professional lives.

But with so many important cases decided by such close margins (the two leading cases of the past term, on the rights of the Guantánamo detainees and the Second Amendment right to own a gun, were decided by votes of 5 to 4), perhaps fragility, rather than stability, best characterizes the court today, and that is a reminder of the stakes involved in any Supreme Court vacancy. The galvanizing battle over the nomination of Robert H. Bork in 1987, a conflagration at the intersection of law and politics that held the country spellbound for three months, was the most riveting public event I ever witnessed at close range. Although Judge Bork was, of course, defeated, in many ways the Bork battle has never really ended, with today’s ceaseless judicial confirmation wars being carried on by ideological combatants too young to remember the original.

President Reagan nominated Robert Bork, a well-known conservative, to the “swing” seat on the court being vacated by Justice Lewis F. Powell Jr. I knew Bob Bork. He had been a professor of mine at Yale, an urbane and witty man who bore little resemblance to the instant portrait painted by his opponents. (“In Robert Bork’s America,” Senator Edward M. Kennedy famously said in response to the nomination, “there is no room at the inn for blacks and no place in the Constitution for women, and in our America there should be no seat on the Supreme Court for Robert Bork.”) The day he was nominated, I left a message on his home answering machine. “Congratulations, and keep your sense of humor,” I said. “I think you’ll need it.”

His sense of humor failed him. As the hearings went on, he became testy and abrupt. When he said that serving on the court would be an “intellectual feast,” he was simply being honest. It would have been more politic, but less candid, to claim that he was motivated by a desire to serve the cause of justice. He and his supporters emerged from defeat filled with bitterness, persuaded that he had been dealt an unfair hand.

To the contrary, I thought then and think now that the debate had been both fair and profound. In five days on the witness stand, Judge Bork had a chance to explain himself fully, to describe and defend his view that the Constitution’s text and the intent of its 18th-century framers provided the only legitimate tools for constitutional interpretation. Through televised hearings that engaged the public to a rare degree, the debate became a national referendum on the modern course of constitutional law. Judge Bork’s constitutional vision, anchored in the past, was tested and found wanting, in contrast to the later declaration by Judge Anthony M. Kennedy, the successful nominee, that the Constitution’s framers had “made a covenant with the future.”

It has made a substantial difference during these last 21 years that Anthony Kennedy got the seat intended for Robert Bork. The invective aimed at Justice Kennedy from the right this year alone, for his majority opinions upholding the rights of the Guantánamo detainees and overturning the death penalty for child rapists — 5-to-4 decisions that would surely have found Judge Bork on the opposite side — is a measure of the lasting significance of what happened during that long-ago summer and fall.

It is also a reminder of something I learned observing the court and the country, and listening in on the vital dialogue between them. The court is in Americans’ collective hands. We shape it; it reflects us. At any given time, we may not have the Supreme Court we want. We may not have the court we need. But we have, most likely, the Supreme Court we deserve.