Softhearted people always advocate spending more on kids. But according to a new and authoritative synthesis of available evidence, there's a hardheaded case for investing more in young kids over older ones.
In the United States, we've spent trillions of dollars over decades on K-12 schooling in the hopes of making young people more productive, or at least less criminally delinquent. The results have been mixed. About 20 percent of the American workforce is essentially illiterate (compared with 5 to 10 percent in Sweden or Germany), creating a major drag on our international competitiveness. And an astronomical 5.6 million adults in the United States have served time in state or federal prison, with 1.3 million there currently. Their incarceration, along with other costs of crime, costs us around $1.3 trillion a year.
Why doesn't all our spending on education buy better results? Nobel Prize winner James Heckman of the University of Chicago and Dimitriy Masterov of the University of Michigan argue that by waiting until kindergarten, we throw money at kids when it's too late. Their evidence urges shifting educational spending to younger children.
The early investment is needed, the authors argue, to supplement the role of the family. Recent developments in neuroscience have shown that the early years are vital to cognitive development, which in turn is important to subsequent success and productivity in school, life, and work. Early-childhood nurturing has traditionally been the province of families. But families are deteriorating. Roughly one in six kids was born into poverty or single parenthood or both in 1970. In 2000, the rate was about one in four. What's more, almost 10 percent of children were born to unmarried teenage mothers in 1999; these kids tend to receive especially low levels of emotional and intellectual support and cognitive stimulation. They arrive at kindergarten cognitively disadvantaged, and the gap widens as they get older, eventually leading to early babies, lousy jobs, and elevated crime.
Heckman and Masterov look at a number of pilot programs in early-childhood education that have targeted high-risk kids in disadvantaged families, and studied them into adulthood. These programs are like Head Start, only more intensive. For example, between 1962 and 1967, the Perry Project in Ypsilanti, Mich., provided two years of intensive preschool to a group of disadvantaged 3-year-old black children, chosen from an eligible pool by a coin flip. The program consisted of a daily session of two and a half hours and a weekly 90-minute teacher home visit. In today's dollars, it would cost $10,000 per child per year.
Perry participants have been followed through age 40, and the program has shown substantial benefits in educational achievement and other social outcomes. Participants achieved greater literacy and higher grades, and they were more likely to graduate high school. Later in life, they were more likely to be employed—and to earn more—and less likely to be on welfare. They also committed less crime and had lower rates of teen pregnancy.
The authors estimate the rate of return for programs like the Perry Project to be a substantial 16 percent. While some of this payback accrues directly to the kids, in the form of higher earnings when they're grown up, about three-quarters of it goes to the rest of us in the form of lower crime and savings on prison spending. Heckman and Masterov compare the return from investing in preschool kids with the returns from lower class size in high school (smaller than the return to preschool) and from GED programs (smaller still). They propose that the return on investment declines with age, although they don't offer a ton of quantitative evidence on this point.
The big economic return for intensive preschool for disadvantaged kids has two implications. First, while many people advocate spending on these kids for reasons of fairness or justice, Heckman and Masterov make a different case. They're saying this preschool spending is a sound economic investment. Each dollar we spend on targeted, intensive preschool returns more than a dollar invested in, say, a pretty good mutual fund.
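To make the comparison concrete, here is a rough, illustrative sketch of how a dollar compounds at the two rates. The 16 percent figure is Heckman and Masterov's estimate; the 7 percent mutual-fund return is assumed here purely for illustration and is not a number from their study.

```python
# Illustrative sketch only: compound growth of $1 at the 16 percent return
# Heckman and Masterov estimate for Perry-style preschool, versus an assumed
# 7 percent return for a "pretty good" mutual fund (the 7 percent is an
# assumption for comparison, not a figure from the article).

def future_value(rate, years, principal=1.0):
    """Value of `principal` after compounding at `rate` for `years` years."""
    return principal * (1 + rate) ** years

for years in (10, 20, 30):
    preschool = future_value(0.16, years)
    fund = future_value(0.07, years)
    print(f"{years} years: $1 at 16% grows to ${preschool:.2f}; "
          f"$1 at 7% grows to ${fund:.2f}")
```

On these assumptions, a dollar at 16 percent grows to roughly $4.40 over 10 years, versus about $2.00 at 7 percent.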
Many families already make this investment on their own, either by spending time with their kids or by purchasing high-quality child care. Why involve governments? Well, Heckman and Masterov show that if your kid goes to one of these programs, the rest of us get most of the benefit. Economists assume that even if parents of disadvantaged kids are rational and forward-looking—as if they didn't have enough to worry about—they will invest in preschool only to boost their kids' earnings and not to reduce crime and prison costs, which are borne by the rest of us. As a result, even conscientious parents will under-invest. So, Economics 101 tells you—granted, in an end-of-semester lecture that you probably skipped—that clearly this is a job for government.
The hardheaded case for Perry-like preschool extends beyond higher pay and reduced crime. Unlike many efforts to boost productivity—think trickle-down—this one would reduce inequality as well, by raising the incomes of the disadvantaged. Investing in preschools can also enhance international competitiveness. Much of the growth in American standards of living over the past half-century has flowed from our population's ever-increasing educational attainment. But for the generations born since 1950, the growth has stopped. The problem is not that a college education costs too much, but rather that many disadvantaged kids aren't academically ready for college when they finish high school. And Heckman and Masterov argue that it stems from the academic deficits they bring to kindergarten. If preschool whips them into shape, they'll be better prepared for all the other steps along the way.
A sales problem remains: These programs invade the traditional province of the family, and in Heckman and Masterov's conception, they would target disadvantaged populations that are disproportionately minority. Wanted: a credible and sympathetic pitchman. Paging Barack Obama.
Slate.com
----
Joel Waldfogel is the Ehrenkranz Family Professor of business and public policy at the Wharton School of the University of Pennsylvania. His book, The Tyranny of the Market: Why You Can't Always Get What You Want, will be published by Harvard University Press this year.
Tuesday, May 29
Monday, May 28
Israel and the Price of Blindness
By ROGER COHEN
International Herald Tribune
JERUSALEM
A three-minute Palestinian movie says what needs to be said about estrangement and violence in the Middle East. It features a woman driving around Jerusalem asking for directions to the adjacent West Bank town of Ramallah. She is met by dismay, irritation, blank stares and near panic from Israelis.
The documentary, called "A World Apart Within 15 Minutes" and directed by Enas Muthaffar, captures the psychological alienation that has intensified in recent years and left Israelis and Palestinians worlds apart, so alienated from each other that a major Palestinian city has vanished from Israelis' mental maps.
Never mind the latest flare-up in Gaza. What matters in the world's most intractable conflict is the way the personal narratives of Israelis and Palestinians, coaxed toward intersection by the Oslo Accords of the 1990s, have diverged to a point of mutual nonrecognition.
Ramallah is about 10 kilometers north of Jerusalem. For most Israelis, it might as well be on the moon. It is not just the fence, called the "separation barrier" by Israelis and the "racist separating wall" by Palestinians, that gets in the way. It is the death of the idea of peace and its replacement by the notion of security in detachment.
I can understand that notion's appeal. Israelis have had reason enough to throw up their hands since 2000 and say: To heck with suicide bombers, Gaza mayhem, inept Palestinian leadership and annihilationist Hamas. They would rather focus on their dot-com boom, high-speed trains and Goa vacations. They would rather be safe than worry about peace.
But detachment is an illusion. Life goes on behind the physical and mental barriers Israelis have erected. Or rather, it festers. As Itamar Rabinovich, the president of Tel Aviv University, remarked to me: "Palestine is a failed pre-state."
For that failure, Palestinians must take responsibility. But this aborted birth is also Israel's work. I drove recently from Jerusalem to the West Bank city of Nablus. A beautiful terrain of terraced olive groves is scarred by the cold imprint of Israeli occupation: shining garrison-like settlements on hilltops, fenced highways for settlers alone, watchtowers, check-points.
The West Bank, after 40 years under Israeli control, is a shameful place. If this is the price of Israeli security, it is unacceptable. Power corrupts; absolute power can corrupt absolutely. There are no meaningful checks and balances in this territory, none of the mechanisms of Israel's admirable democracy.
The result is what the World Bank this month called a "shattered economic space." If Israelis could be as inventive about seeking bridges to Palestinians as they are now in devising restrictions on their movement, the results could be startling. As it is, the bank noted, Israeli policy has produced "ever smaller and disconnected cantons."
This has been achieved through remorseless permit and ID checks, roadblocks, checkpoints and the creation of closed areas. Palestinians are caged in islets where doing business is near impossible.
More than 500 barriers hinder Palestinian movement. Meanwhile, Jewish settlers move freely; their number, outside East Jerusalem, has increased to about 250,000 from roughly 126,900 at the time of the Oslo Accords. These numbers alone make Palestinian political and religious radicalization less than entirely mysterious.
In his April 14, 2004, statement on a two-state solution, President George W. Bush offered concessions to Israel. He said it was "unrealistic" to expect "a full and complete return" to the Green Line. But he also urged "the establishment of a Palestinian state that is viable, contiguous, sovereign and independent."
More than three years later, there is no such state. What there is of a nascent Palestine is non-viable, non-contiguous, non-sovereign and dependent. While denouncing terrorism with appropriate vigor, Bush has an equal obligation to pressure Israel to accept that ruthless colonization is unworthy of it and no enduring recipe for security.
Israel has an obligation to open its eyes and do some wall-jumping. The country has just been shaken by the Winograd Report, a devastating look at last summer's war against the Lebanese militia, Hezbollah. It is now time for a report of similar scope on Israel's West Bank occupation.
I can see no better way to arrest the cycle of alienation. Time is not on the side of a two-state solution. A fast-growing Palestinian population inhabits a neighborhood where the Ahmadinejad-Hezbollah-Hamas school has leverage.
If Israelis do not rediscover where and what Ramallah is, they may one day be devoured by what they choose not to see.
Presidents at War, A Tainted History
“The Constitution has never greatly bothered any wartime president,” wrote Francis Biddle, F.D.R.’s attorney general during World War II. Biddle was writing about Roosevelt’s shameful 1942 decision to evacuate Japanese-Americans from the Pacific Coast and place them in internment camps. But Biddle’s comment applies to all presidents in times of crisis. National survival or, perhaps more accurately, the president’s perception of national survival always takes precedence. George W. Bush has been no exception.
In 1798, during the undeclared war against France, President John Adams supported passage of the Alien and Sedition Acts, which criminalized political dissent and gave the president a free hand to deport any noncitizen he deemed “dangerous to the peace and safety of the United States.”
Ten years later, President Thomas Jefferson sought to enforce the Embargo Act, which prohibited trade with Great Britain, by charging those who violated it with treason – an egregious example of executive overreach that the federal courts quickly rejected.
Andrew Jackson’s contempt for the treaty rights of the Cherokee Nation is a familiar story. Less well-known is Jackson’s attempt to halt the distribution of abolitionist literature in the South by censoring the mail.
Abraham Lincoln suspended the writ of habeas corpus during the Civil War, and in several states he ordered the trial of civilians by military tribunals. Although Congress explicitly authorized Lincoln to suspend the writ, it was a draconian measure that the president believed essential to preserve the Union. “Are all the laws, but one, to go unexecuted, and the government itself go to pieces, lest that one be violated?” he asked.
Unlike Adams, Jefferson and Jackson, Lincoln was responding to an armed insurrection that threatened the nation’s survival. Most historians have judged his action as commensurate with the threat.
For more than 100 years, from the expiration of the Sedition Act of 1798 until America’s entry into World War I, the United States had no federal legislation banning rebellious expression. The War of 1812, the Mexican War, the Civil War and the Spanish-American War all were fought without criminalizing the right of dissent.
It was Woodrow Wilson, shortly after his re-election in 1916 but well before America’s entry into World War I, who sought legislation to suppress disloyalty. Wilson requested that Congress give the president absolute authority to censor the press in the event of war, to make it a federal crime to promote the success of America’s enemies and to close the mail to any material deemed “of a treasonable or anarchistic character.” Wilson insisted that the power he requested was “absolutely necessary to the public safety.” After America entered the war, Congress passed the Espionage Act of 1917, which incorporated much of what Wilson asked for but not the authority to censor the press.
F.D.R. may be guilty of the most extreme disregard for civil liberty, although his action was endorsed by Congress and later upheld in two landmark Supreme Court decisions. Unlike Wilson and Adams, F.D.R. had no interest in launching a wartime crusade to promote ideological conformity. But he had been blindsided by the Japanese attack on Pearl Harbor, and he was unwilling to second-guess the War Department when it urged action in the interest of military security. The 1942 relocation of Japanese-Americans from their homes on the West Coast was, in Roosevelt’s view, simply another act of wartime necessity dictated by the risk to America’s defenses.
But there was little justification for the action. Adm. Harold Stark, the chief of naval operations, and Gen. Mark Clark, the Army’s deputy chief of staff, had testified before Congress that the Pacific Coast was in no danger of invasion, and the possibility of Japanese-immigrant-inspired sabotage was no greater than that which might arise from German or Italian immigrants elsewhere in the country.
The initial agitation to remove the Japanese came from California civilians, and was tainted by long-standing racism and greed. The clamor was magnified by the state’s political leaders, including Earl Warren, then California’s attorney general, and was transmitted to Washington by Lt. Gen. John DeWitt, the overall Army commander on the West Coast.
When DeWitt’s request arrived at the War Department, the Army general staff vigorously opposed the action. But the Pentagon’s civilian leadership, Secretary Henry L. Stimson and Assistant Secretary John J. McCloy, were convinced of the military necessity and transmitted that view to F.D.R. Roosevelt gave the matter too little attention; if Stimson and McCloy recommended that the Japanese be evacuated, he was not going to dispute them. On Feb. 19, 1942, Roosevelt signed the executive order that they had prepared, authorizing the forcible evacuation of people of Japanese ancestry from a designated war zone along the Pacific Coast.
By presidential directive, 120,000 Japanese residents, 80,000 of whom were American citizens by birth, were taken from their homes, farms and businesses and interned at relocation sites far inland. Roosevelt showed little remorse. In March of 1942, when Henry Morgenthau Jr., the treasury secretary, told F.D.R. about the financial losses the Japanese had suffered, the president said he was “not concerned about that.” History has judged Roosevelt harshly. There is little question that he had the authority to issue the order. Whether he should have done so is another matter.
In the Korean conflict, President Harry Truman stretched his commander-in-chief power to seize and operate the nation’s steel mills. During the Vietnam War, President Richard Nixon sought to prevent The New York Times and The Washington Post from publishing the Pentagon Papers, secret documents pertaining to American military strategy that Daniel Ellsberg had stolen from the Defense Department. In neither case was national survival at risk, and in both cases the Supreme Court struck down the president’s action.
There is an old legal maxim that in time of war the laws are silent: Inter arma silent leges. But the crucial issue is the extent to which the nation is threatened. In the cases of Lincoln and Roosevelt, the survival of the United States hung in the balance. A president will be forgiven by his contemporaries, though not necessarily by later generations, for acting outside the law when that is the case. As more than one Supreme Court justice has said, the Constitution is not a suicide pact. When national survival is not threatened, however, it is essential for a chief executive to resist an unwarranted enlargement of his powers.
A national security concern does not become a war simply because it is baptized as such. President George W. Bush’s questionable use of the metaphor “war on terror” to justify indefinite detention of suspects, warrantless eavesdropping and spying on the reading habits of citizens could invite from historians even more opprobrium than they have cast on the repressive actions taken by other presidents when the survival of the United States was at risk.
---
Jean Edward Smith, the John Marshall Professor of political science at Marshall University, in Huntington, W. Va., is the author of 12 books, including biographies of Ulysses S. Grant, Chief Justice John Marshall and General Lucius D. Clay. His latest book is “F.D.R.”
Sunday, May 27
Elite Colleges Open New Door to Low-Income Youths
NEW YORK TIMES
May 27, 2007
By SARA RIMER
AMHERST, Mass. — The discussion in the States of Poverty seminar here at Amherst College was getting a little theoretical. Then Anthony Abraham Jack, a junior from Miami, asked pointedly, “Has anyone here ever actually seen a food stamp?”
To Mr. Jack, unlike many of his classmates, food stamps are not an abstraction. His family has had to use them in emergencies. His mother raised three children as a single parent and earns $26,000 a year as a school security guard. That is just a little more than half the cost of a year’s tuition, room and board, fees and other expenses at Amherst, which for Mr. Jack’s class was close to $48,000.
So when Mr. Jack, now 22 and a senior, graduated with honors here on Sunday, he was not just the first in his family to earn a college degree, but a success story in the effort by Amherst and a growing number of elite colleges to open their doors to talented low-income students.
Concerned that the barriers to elite institutions are being increasingly drawn along class lines, and wanting to maintain some role as engines of social mobility, about two dozen schools — Amherst, Harvard, Princeton, Stanford, the University of Virginia, Williams and the University of North Carolina, among them — have pushed in the past few years to diversify economically.
They are trying tactics like replacing loans with grants and curtailing early admission, which favors the well-to-do and savvy. But most important, Amherst, for instance, is doing more than giving money to low-income students; it is recruiting them and taking their socioeconomic background — defined by family income, parents’ education and occupation level — into account when making admissions decisions.
Amherst’s president, Anthony Marx, turns to stark numbers in a 2004 study by the Century Foundation, a policy institute in New York, to explain the effort: Three-quarters of students at top colleges come from the top socioeconomic quartile, with only one-tenth from the poorer half and 3 percent from the bottom quartile.
“We want talent from across all divides, wherever we can find it,” President Marx said. Amherst covered the full cost of Mr. Jack’s education beyond what he earned in work-study. The only debt he says he owes is the $41 it cost to make copies of his 107-page honors thesis.
Amherst also provides its low-income students important support, from $400 “start-up grants” for winter coats and sheets and blankets for their dorm rooms, to summer science and math tutoring. At the same time, low-income students are expected to put in at least seven hours a week at $8-an-hour work-study jobs.
But they get to use $200 a month in their work-study earnings as spending money to get a haircut, for instance, or go out for pizza with classmates so they don’t feel excluded.
Mr. Jack, who is black and had never been on a plane until he flew to Amherst for his first visit, arrived as an A student, and with a steely focus.
His mother, Marilyn, 53, had guided her son from Head Start to a gifted program in elementary school to a magnet middle school and, in his final year of high school, to the private Gulliver Preparatory School on a full scholarship. But she never had to push Tony, she said. “He was on a mission from Day 1,” she said.
Mr. Jack’s high grades and test scores — a respectable 1200 on the SAT — won him a full scholarship to the University of Florida. But the median score for his Amherst class was 1422, and he would have been excluded had the admissions office not considered his socioeconomic class, and the obstacles he had overcome.
“Tony Jack with his pure intelligence — had he been raised in Greenwich, he would have been a 1500 kid,” said Tom Parker, the dean of admission. “He would have been tutored by Kaplan or Princeton Review. He would have had The New Yorker magazine on the coffee table.”
“Tony Jack is not an anomaly,” he added.
Mr. Jack, Amherst officials say, would likely not have benefited under traditional affirmative action programs. In their groundbreaking 1998 study of 28 selective universities, William Bowen, the former president of Princeton, and Derek Bok, now the interim president of Harvard, found that 86 percent of blacks who enrolled were middle or upper middle class. (Amherst was not included in that study.) The white students were even wealthier.
“Universities have prided themselves on making strides in racial diversity, but for the most part they have avoided the larger issue of class inequality,” said Richard D. Kahlenberg, a senior fellow at the Century Foundation.
For Mr. Jack, there were adjustments at this college, where half the students are affluent enough that their parents pay tuition without any aid from Amherst.
He did not let it bother him, he said, when wealthier classmates blithely inquired about the best clubs in Miami — as if he would know, Mr. Jack said dryly — before flying off to his hometown for spring break. Mr. Jack could afford to go home only at Christmas, and the end of the year, when Amherst paid his plane fare.
Mr. Jack is 6 foot 7 and built like the football player he used to be. In his freshman year, he said, he was walking to his dorm one night when a police car seemed to be following him. He recalled showing the officer his Amherst ID and explaining, “I’m a student here.”
In Mr. Jack’s class of 413, 15 percent, or 61 students, are from families with incomes of less than $45,000 a year; about two-thirds of those are from families earning less than $30,000. He was amazed to discover how much preparation wealthier students had.
“People are groomed for the SAT,” Mr. Jack said. “They take Latin to help them with their vocabulary.”
He seized every opportunity Amherst offered — the pre-freshman summer program in science and math, help from the writing center and faculty office hours. “They didn’t just invite me in,” he said. “They prepared the way.”
For his freshman year, he chose the most challenging classes, including Chemical Principles, even though he had no chemistry in high school. “I didn’t feel like I was in over my head,” he said. “I just felt like I was being pushed to the boundaries of my ability.”
He got all A’s and B’s his first year, except for a C-plus in chemistry. Sophomore year he plunged in even deeper, taking Organic Chemistry I and II. He got a B the first semester, and an A-minus the second.
“Organic chemistry was the happiest time of my life,” said Mr. Jack, who tends to gush about Amherst. “Everything started clicking.”
David Hansen, who taught Organic Chemistry II, called Mr. Jack’s improvement remarkable: “He had the motivation and the desire and the discipline to take advantage of the support that was here.”
Mr. Jack, who is as gregarious as he is studious, found time to mentor other students, serve on committees — and earn an A-plus in calculus last year, one of only 10 A-pluses the professor, David Cox, said he has given out in calculus in 30 years of teaching. This year Mr. Jack was Amherst’s nominee to be a Rhodes scholar at Oxford.
Squeamish about blood, Mr. Jack switched his major from pre-med to religion and gender studies. He said he intended to go to graduate school. For now, he loves Amherst so much, he is staying around as an “alumni fellow,” organizing events on campus. He says he thinks about teaching, or becoming a lawyer so that he can help his community. As for money, he says he just wants to be able to take care of his family.
At Amherst’s commencement on Sunday, Mr. Jack wept — and his classmates gave him a standing ovation — when President Marx awarded him the annual prize for the senior who has “shown by his or her own determination and accomplishment the greatest appreciation of and desire for a college education.”
Friday, May 25
Repairing the Damage Done
WASHINGTON — More than three decades ago, Nixon White House Counsel John Dean called the Watergate cover-up “a cancer on the presidency.” Another one exists today, posing a challenge for the next president to restore the office as a credible voice in foreign policy.
President Bush’s detour in Iraq off the multilateral track adhered to throughout the Cold War years has caused a deep drop in American prestige abroad, requiring extensive repair by his successor regardless of which party wins in 2008.
While Bush’s invasion and occupation of Iraq has been the immediate trigger for the decline of American influence, just as significant was his original failure to capitalize on the terrorist attacks of 9/11 to mobilize a truly collective global response.
The outpouring of empathy for the United States in the wake of those events was quickly short-circuited by the invasion. In diverting the American military from its legitimate focus against the real perpetrators of the attacks, Bush left the primary job undone in Afghanistan, in order to chase a more ambitious dream of superpower dominance.
A decade earlier, neoconservative theorists in the Republican Party saw in the collapse of the Soviet Union an invitation for America to assume a vastly more assertive, unilateral role in imposing its power and political ideology elsewhere.
Among these theorists at the Pentagon was Paul Wolfowitz, deputy undersecretary to Secretary of Defense Dick Cheney, who worried that with the demise of Soviet communism the strongest rationale for a muscular national defense was gone. Yet serious threats remained, from nuclear ambitions in North Korea and the determination in Iran and Iraq to assure control of their vast oil resources essential to American power.
Under Wolfowitz, a quest was undertaken for a strategy justifying continued American military hegemony. As James Mann wrote in his revealing 2004 book, “The Rise of the Vulcans: The History of Bush’s War Cabinet,” Wolfowitz assigned his chief assistant, I. Lewis “Scooter” Libby, to have a draft prepared that “set forth a new vision for a world dominated by a lone American superpower, actively working to make sure that no rival or group of rivals would ever emerge.”
Libby gave the assignment to another Wolfowitz aide named Zalmay Khalilzad, little known then outside defense circles. He ultimately became the American ambassador to occupied Iraq after the overthrow of Saddam Hussein and the establishment of a new American-sponsored regime in Baghdad, and subsequently ambassador to the United Nations.
A leak of the Khalilzad draft, according to Mann, caused embarrassment and was rewritten, but the finished product became a rough blueprint for the radical new American foreign policy that flowered in the George W. Bush administration.
The draft envisioned a world in which American military power alone would rival or replace the collective security that had marked U.S. containment policy through the Cold War. It even hypothesized, Mann wrote, the possible future need for “preempting an impending attack with nuclear, chemical and biological weapons” — the rationale eventually dusted off for the Iraq invasion.
A side incentive for developing the new strategy was pressure from congressional Democrats for a substantial “peace dividend” after the Cold War’s end. To counter such diversions of defense spending for neglected domestic needs, the Pentagon theorists needed a persuasive argument for a lusty military budget.
When Khalilzad’s draft kicked up criticism that it smacked of hostility to other nations, Libby toned down the language in what became the Defense Planning Guidance of 1992, but the essential message remained. By keeping America militarily all-powerful, other countries would be deterred from attempting to match its strength.
When Bill Clinton took over the White House after the 1992 election, he didn’t, according to Mann, seriously challenge the basic force concept, focusing more on domestic matters. The neoconservative theorists, out of power, nevertheless fretted about Congressional projections of static or shrinking defense budgets.
In 1997, they banded together as the Project for the New American Century to build on the 1992 policy statement. A subsequent paper called for more defense spending to preserve “the current Pax Americana … through the coming transformation of war made possible by the new techniques,” including nuclear weapons, in the hands of new, often regional threats.
The group noted critically that the Pentagon’s Quadrennial Defense Review of 1997 “assumed that [North Korea’s] Kim Jong Il and [Iraq’s] Saddam Hussein each could begin a war—perhaps even while employing chemical, biological or even nuclear weapons—and the United States would make no effort to unseat either ruler.”
The paper observed that “past Pentagon war games have given little or no consideration to the force requirements necessary not only to defeat an attack but to remove these regimes from power and conduct post-combat stability operations …
“The current American peace will be short-lived if the United States becomes vulnerable to rogue powers with small, inexpensive arsenals of ballistic missiles and nuclear warheads or other weapons of mass destruction. We cannot allow North Korea, Iran, Iraq or similar states to undermine American leadership, intimidate American allies or threaten the American homeland itself. The blessings of the American peace, purchased at fearful cost and a century of effort, should not be so trivially squandered.”
According to Gary Schmitt, a co-chairman of the project, George W. Bush, governor of Texas at the time, was neither a member of the group nor as far as Schmitt knows aware at the time of its findings. But among the participants were Wolfowitz and Libby, architects of the basic concept of a muscular defense including preemption of threats of weapons of mass destruction.
Did Bush as president come on his own to embrace the precepts of the project or was he sold on them by Cheney, Wolfowitz, Libby and others of the circle known as “the Vulcans”? Either way, events of the post-9/11 years have confirmed that those precepts were at the core of the radical foreign policy that has imperiled his presidency and American leadership across the globe.
Among the first challenges for Bush’s successor in 2009 will be to demonstrate dramatically that he or she has learned the hard lesson of that go-it-alone foreign policy, which in the end forced America to go hat-in-hand to the international community. The new president must waste no time putting America back on the track of multilateralism and collective security.
With very good luck and a return to diplomacy, the United States could be out of Iraq by that time, giving the next president, Republican or Democratic, a free hand to restore the reputation of the American presidency in the eyes of friends and foes abroad, and at home as well.
---
Jules Witcover, a political columnist for 30 years, has written a dozen books about American politics, including “No Way to Pick A President” and “The Year the Dream Died.” His most recent book is “Very Strange Bedfellows: The Short and Unhappy Marriage of Richard Nixon and Spiro Agnew.”
President Bush’s detour in Iraq off the multilateral track adhered to throughout the Cold War years has caused a deep drop in American prestige abroad, requiring extensive repair by his successor regardless of which party wins in 2008.
While Bush’s invasion and occupation of Iraq has been the immediate trigger for the decline of American influence, just as significant was his original failure to capitalize on the terrorist attacks of 9/11 to mobilize a truly collective global response.
The outpouring of empathy for the United States in the wake of those events was quickly short-circuited by the invasion. In diverting the American military from its legitimate focus against the real perpetrators of the attacks, Bush left the primary job undone in Afghanistan, in order to chase a more ambitious dream of superpower dominance.
A decade earlier, neoconservative theorists in the Republican Party saw in the collapse of the Soviet Union an invitation for America to assume a vastly more assertive, unilateral role in imposing its power and political ideology elsewhere.
Among these theorists at the Pentagon was Paul Wolfowitz, deputy undersecretary to Secretary of Defense Dick Cheney, who worried that with the demise of Soviet communism the strongest rationale for a muscular national defense was gone. Yet serious threats remained, from nuclear ambitions in North Korea and the determination in Iran and Iraq to assure control of their vast oil resources essential to American power.
Under Wolfowitz, a quest was undertaken for a strategy justifying continued American military hegemony. As James Mann wrote in his revealing 2004 book, “The Rise of the Vulcans: The History of Bush’s War Cabinet,” Wolfowitz assigned his chief assistant, I. Lewis “Scooter” Libby, to have a draft prepared that “set forth a new vision for a world dominated by a lone American superpower, actively working to make sure that no rival or group of rivals would ever emerge.”
Libby gave the assignment to another Wolfowitz aide named Zalmay Khalilzad, little known then outside defense circles. He ultimately became the American ambassador to occupied Iraq after the overthrow of Saddam Hussein and the establishment of a new American-sponsored regime in Baghdad, and subsequently ambassador to the United Nations.
A leak of the Khalilzad draft, according to Mann, caused embarrassment and was rewritten, but the finished product became a rough blueprint for the radical new American foreign policy that flowered in the George W. Bush administration.
The draft envisioned a world in which American military power alone would rival or replace the collective security that had marked U.S. containment policy through the Cold War. It even hypothesized, Mann wrote, the possible future need for “preempting an impending attack with nuclear chemical and biological weapons” — the rationale eventually dusted off for the Iraq invasion.
A side incentive for developing the new strategy was pressure from congressional Democrats for a substantial “peace dividend” after the Cold War’s end. To counter such diversions of defense spending for neglected domestic needs, the Pentagon theorists needed a persuasive argument for a lusty military budget.
When Khalilzad’s draft kicked up criticism that it smacked of hostility to other nations, Libby toned down the language in what became the Defense Policy Guidance of 1992, but the essential message remained. By keeping America militarily all-powerful, other countries would be deterred from attempting to match its strength.
When Bill Clinton took over the White House after the 1992 election, he didn’t, according to Mann, seriously challenge the basic force concept, focusing more on domestic matters. The neoconservative theorists, out of power, nevertheless fretted about Congressional projections of static or shrinking defense budgets.
In 1997, they banded together as the Project for the New American Century to build on the 1992 policy statement. A subsequent paper called for more defense spending to preserve “the current Pax Americana … through the coming transformation of war made possible by the new techniques,” including nuclear weapons, in the hands of new, often regional threats.
The group noted critically that the Pentagon’s Quadrennial Defense Review of 1997 “assumed that [North Korea’s] Kim Jong Il and [Iraq’s] Saddam Hussein each could begin a war—perhaps even while employing chemical, biological or even nuclear weapons—and the United States would make no effort to unseat either ruler.”
The paper observed that “past Pentagon war games have given little or no consideration to the force requirements necessary not only to defeat an attack but to remove these regimes from power and conduct post-combat stability operations …
“The current American peace will be short-lived if the United States becomes vulnerable to rogue powers with small, inexpensive arsenals of ballistic missiles and nuclear warheads or other weapons of mass destruction. We cannot allow North Korea, Iran, Iraq or similar states to undermine American leadership, intimidate American allies or threaten the American homeland itself. The blessings of the American peace, purchased at fearful cost and a century of effort, should not be so trivially squandered.”
According to Gary Schmitt, a co-chairman of the project, George W. Bush, then governor of Texas, was neither a member of the group nor, as far as Schmitt knows, aware at the time of its findings. But among the participants were Wolfowitz and Libby, architects of the basic concept of a muscular defense including preemption of threats of weapons of mass destruction.
Did Bush as president come on his own to embrace the precepts of the project, or was he sold on them by Cheney, Wolfowitz, Libby and others of the circle known as “the Vulcans”? Either way, events of the post-9/11 years have confirmed that those precepts were at the core of the radical foreign policy that has imperiled his presidency and American leadership across the globe.
Among the first challenges for Bush’s successor in 2009 will be to demonstrate dramatically that he or she has learned the hard lesson of that go-it-alone foreign policy, which in the end forced America to go hat-in-hand to the international community. The new president must waste no time putting America back on the track of multilateralism and collective security.
With very good luck and a return to diplomacy, the United States could be out of Iraq by that time, giving the next president, Republican or Democratic, a free hand to restore the reputation of the American presidency in the eyes of friends and foes abroad, and at home as well.
---
Jules Witcover, a political columnist for 30 years, has written a dozen books about American politics, including “No Way to Pick A President” and “The Year the Dream Died.” His most recent book is “Very Strange Bedfellows: The Short and Unhappy Marriage of Richard Nixon and Spiro Agnew.”
Krugman comes out AGAINST immigration bill
The New York Times
May 25, 2007
By PAUL KRUGMAN
A piece of advice for progressives trying to figure out where they stand on immigration reform: it’s the political economy, stupid. Analyzing the direct economic gains and losses from proposed reform isn’t enough. You also have to think about how the reform would affect the future political environment.
To see what I mean — and why the proposed immigration bill, despite good intentions, could well make things worse — let’s take a look back at America’s last era of mass immigration.
My own grandparents came to this country during that era, which ended with the imposition of severe immigration restrictions in the 1920s. Needless to say, I’m very glad they made it in before Congress slammed the door. And today’s would-be immigrants are just as deserving as Emma Lazarus’s “huddled masses, yearning to breathe free.”
Moreover, as supporters of immigrant rights rightly remind us, everything today’s immigrant-bashers say — that immigrants are insufficiently skilled, that they’re too culturally alien, and, implied though rarely stated explicitly, that they’re not white enough — was said a century ago about Italians, Poles and Jews.
Yet then as now there were some good reasons to be concerned about the effects of immigration.
There’s a highly technical controversy going on among economists about the effects of recent immigration on wages. However that dispute turns out, it’s clear that the earlier wave of immigration increased inequality and depressed the wages of the less skilled. For example, a recent study by Jeffrey Williamson, a Harvard economic historian, suggests that in 1913 the real wages of unskilled U.S. workers were around 10 percent lower than they would have been without mass immigration. But the straight economics was the least of it. Much more important was the way immigration diluted democracy.
In 1910, almost 14 percent of voting-age males in the United States were non-naturalized immigrants. (Women didn’t get the vote until 1920.) Add in the disenfranchised blacks of the Jim Crow South, and what you had in America was a sort of minor-key apartheid system, with about a quarter of the population — in general, the poorest and most in need of help — denied any political voice.
That dilution of democracy helped prevent any effective response to the excesses and injustices of the Gilded Age, because those who might have demanded that politicians support labor rights, progressive taxation and a basic social safety net didn’t have the right to vote. Conversely, the restrictions on immigration imposed in the 1920s had the unintended effect of paving the way for the New Deal and sustaining its achievements, by creating a fully enfranchised working class.
But now we’re living in the second Gilded Age. And as before, one of the things making antiworker, unequalizing policies politically possible is the fact that millions of the worst-paid workers in this country can’t vote. What progressives should care about, above all, is that immigration reform stop our drift into a new system of de facto apartheid.
Now, the proposed immigration reform does the right thing in principle by creating a path to citizenship for those already here. We’re not going to expel 11 million illegal immigrants, so the only way to avoid having those immigrants be a permanent disenfranchised class is to bring them into the body politic.
And I can’t share the outrage of those who say that illegal immigrants broke the law by coming here. Is that any worse than what my grandfather did by staying in America, when he was supposed to return to Russia to serve in the czar’s army?
But the bill creates a path to citizenship so tortuous that most immigrants probably won’t even try to legalize themselves. Meanwhile, the bill creates a guest worker program, which is exactly what we don’t want to do. Yes, it would raise the income of the guest workers themselves, and in narrow financial terms guest workers are a good deal for the host nation — because they don’t bring their families, they impose few costs on taxpayers. But it formally creates exactly the kind of apartheid system we want to avoid.
Progressive supporters of the proposed bill defend the guest worker program as a necessary evil, the price that must be paid for business support. Right now, however, the price looks too high and the reward too small: this bill could all too easily end up actually expanding the class of disenfranchised workers.
Thursday, May 24
Adam and Eve in the Land of the Dinosaurs
Museum Review: The Creation Museum
PETERSBURG, Ky. — The entrance gates here are topped with metallic Stegosauruses. The grounds include a giant tyrannosaur standing amid the trees, and a stone-lined lobby sports varied sauropods. It could be like any other natural history museum, luring families with the promise of immense fossils and dinosaur adventures.
But step a little farther into the entrance hall, and you come upon a pastoral scene undreamt of by any natural history museum. Two prehistoric children play near a burbling waterfall, thoroughly at home in the natural world. Dinosaurs cavort nearby, their animatronic mechanisms turning them into alluring companions, their gaping mouths seeming not threatening, but almost welcoming, as an Apatosaurus munches on leaves a few yards away.
What is this, then? A reproduction of a childhood fantasy in which dinosaurs are friends of inquisitive youngsters? The kind of fantasy that doesn’t care that human beings and these prefossilized thunder-lizards are usually thought to have been separated by millions of years? No, this really is meant to be more like one of those literal dioramas of the traditional natural history museum, an imagining of a real habitat, with plant life and landscape reproduced in meticulous detail.
For here at the $27 million Creation Museum, which opens on May 28 (just a short drive from the Cincinnati-Northern Kentucky International Airport), this pastoral scene is a glimpse of the world just after the expulsion from the Garden of Eden, in which dinosaurs are still apparently as herbivorous as humans, and all are enjoying a little calm in the days after the fall.
It also serves as a vivid introduction to the sheer weirdness and daring of this museum created by the Answers in Genesis ministry that combines displays of extraordinary nautilus shell fossils and biblical tableaus, celebrations of natural wonders and allusions to human sin. Evolution gets its continual comeuppance, while biblical revelations are treated as gospel.
Outside the museum scientists may assert that the universe is billions of years old, that fossils are the remains of animals living hundreds of millions of years ago, and that life’s diversity is the result of evolution by natural selection. But inside the museum the Earth is barely 6,000 years old, dinosaurs were created on the sixth day, and Jesus is the savior who will one day repair the trauma of man’s fall.
It is a measure of the museum’s daring that dinosaurs and fossils — once considered major challenges to belief in the Bible’s creation story — are here so central, appearing not as tests of faith, as one religious authority once surmised, but as creatures no different from the giraffes and cats that still walk the earth. Fossils, the museum teaches, are no older than Noah’s flood; in fact dinosaurs were on the ark.
So dinosaur skeletons and brightly colored mineral crystals and images of the Grand Canyon are here, as are life-size dioramas showing paleontologists digging in mock earth, Moses and Paul teaching their doctrines, Martin Luther chastising the church to return to Scripture, Adam and Eve guiltily standing near skinned animals, covering their nakedness, and a supposedly full-size reproduction of a section of Noah’s ark.
There are 52 videos in the museum, one showing how the transformations wrought by the eruption of Mount St. Helens in 1980 reveal how plausible it is that the waters of Noah’s flood could have carved out the Grand Canyon within days. There is a special-effects theater complete with vibrating seats meant to evoke the flood, and a planetarium paying tribute to God’s glory while exploring the nature of galaxies.
Whether you are willing to grant the premises of this museum almost becomes irrelevant as you are drawn into its mixture of spectacle and narrative. Its 60,000 square feet of exhibits are often stunningly designed by Patrick Marsh, who, like the entire museum staff, declares adherence to the ministry’s views; he evidently also knows the lure of secular sensations, since he designed the “Jaws” and “King Kong” attractions at Universal Studios in Florida.
For the skeptic the wonder is at a strange universe shaped by elaborate arguments, strong convictions and intermittent invocations of scientific principle. For the believer, it seems, this museum provides a kind of relief: Finally the world is being shown as it really is, without the distortions of secularism and natural selection.
The Creation Museum actually stands the natural history museum on its head. Natural history museums developed out of the Enlightenment: encyclopedic collections of natural objects were made subject to ever more searching forms of inquiry and organization. The natural history museum gave order to the natural world, taming its seeming chaos with the principles of human reason. And Darwin’s theory — which gave life a compelling order in time as well as space — became central to its purpose. Put on display was the prehistory of civilization, seeming to allude not just to the evolution of species but also cultures (which is why “primitive” cultures were long part of its domain). The natural history museum is a hall of human origins.
The Creation Museum has a similar interest in dramatizing origins, but sees natural history as divine history. And now that many museums have also become temples to various American ethnic and sociological groups, why not a museum for the millions who believe that the Earth is less than 6,000 years old and was created in six days?
Mark Looy, a founder of Answers in Genesis with its president, Ken Ham, said the ministry expected perhaps 250,000 visitors during the museum’s first year. In preparation, Mr. Ham has for 13 years been overseeing 350 seminars annually about the truths of Genesis, which have been drawing thousands of acolytes. The organization’s magazine has 50,000 subscribers. The museum also says that it has 9,000 charter members and international contributors who have left the institution free of debt.
But for a visitor steeped in the scientific world view, the impact of the museum is a disorienting mix of faith and reason, the exotic and the familiar. Nature here is not “red in tooth and claw,” as Tennyson asserted. In fact at first it seems almost as genteel as Eden’s dinosaurs. We learn that chameleons, for example, change colors not because that serves as a survival mechanism, but “to ‘talk’ to other chameleons, to show off their mood, and to adjust to heat and light.”
Meanwhile a remarkable fossil of a perch devouring a herring found in Wyoming offers “silent testimony to God’s worldwide judgment,” not because it shows a predator and prey, but because the two perished — somehow getting preserved in stone — during Noah’s flood. Nearly all fossils, the museum asserts, are relics of that divine retribution.
The heart of the museum is a series of catastrophes. The main one is the fall, with Adam and Eve eating of the tree of knowledge; after that tableau the viewer descends from the brightness of Eden into genuinely creepy cement hallways of urban slums. Photographs show the pain of war, childbirth, death — the wages of primal sin. Then come the biblical accounts of the fallen world, leading up to Noah’s ark and the flood, the source of all significant geological phenomena.
The other catastrophe, in the museum’s view, is of more recent vintage: the abandonment of the Bible by church figures who began to treat the story of creation as if it were merely metaphorical, and by Enlightenment philosophers, who chipped away at biblical authority. The ministry believes this is a slippery slope.
Start accepting evolution or an ancient Earth, and the result is like the giant wrecking ball, labeled “Millions of Years,” that is shown smashing the ground at the foundation of a church, the cracks reaching across the gallery to a model of a home in which videos demonstrate the imminence of moral dissolution. A teenager is shown sitting at a computer; he is, we are told, looking at pornography.
But given the museum’s unwavering insistence on belief in the literal truth of biblical accounts, it is strange that so much energy is put into demonstrating their scientific coherence with discussions of erosion or interstellar space. Are such justifications required to convince the skeptical or reassure the believer?
In the museum’s portrayal, creationists and secularists view the same facts, but come up with differing interpretations, perhaps the way Ptolemaic astronomers in the 16th century saw the Earth at the center of the universe, where Copernicans began to place the sun. But one problem is that scientific activity presumes that the material world is organized according to unchanging laws, while biblical fundamentalism presumes that those laws are themselves subject to disruption and miracle. Is not that a slippery slope as well, even affecting these analyses?
But for debates, a visitor goes elsewhere. The Creation Museum offers an alternate world that has its fascinations, even for a skeptic wary of the effect of so many unanswered assertions. He leaves feeling a bit like Adam emerging from Eden, all the world before him, freshly amazed at its strangeness and extravagant peculiarities.
The Creation Museum opens Monday at 2800 Bullittsburg Church Road, Petersburg, Ky.; (888) 582-4253.
Tuesday, May 22
Science, Education and Immigration
By THOMAS L. FRIEDMAN
First I had to laugh. Then I had to cry.
I took part in commencement this year at Rensselaer Polytechnic Institute, one of America’s great science and engineering schools, so I had a front-row seat as the first grads to receive their diplomas came on stage, all of them Ph.D. students. One by one the announcer read their names and each was handed their doctorate — in biotechnology, computing, physics and engineering — by the school’s president, Shirley Ann Jackson.
The reason I had to laugh was that it seemed like every one of the newly minted Ph.D.’s at Rensselaer was foreign born. For a moment, as the foreign names kept coming — “Hong Lu, Xu Xie, Tao Yuan, Fu Tang” — I thought that the entire class of doctoral students in physics was going to be Chinese, until “Paul Shane Morrow” saved the day. It was such a caricature of what President Jackson herself calls “the quiet crisis” in high-end science education in this country that you could only laugh.
Don’t get me wrong. I’m proud that our country continues to build universities and a culture of learning that attract the world’s best minds. My complaint — why I also wanted to cry — was that there wasn’t someone from the Immigration and Naturalization Service standing next to President Jackson stapling green cards to the diplomas of each of these foreign-born Ph.D.’s. I want them all to stay, become Americans and do their research and innovation here. If we can’t educate enough of our own kids to compete at this level, we’d better make sure we can import someone else’s; otherwise, we will not maintain our standard of living.
It is pure idiocy that Congress will not open our borders — as wide as possible — to attract and keep the world’s first-round intellectual draft choices in an age when everyone increasingly has the same innovation tools and the key differentiator is human talent. I’m serious. I think any foreign student who gets a Ph.D. in our country — in any subject — should be offered citizenship. I want them. The idea that we actually make it difficult for them to stay is crazy.
Compete America, a coalition of technology companies, is pleading with Congress to boost both the number of H-1B visas available to companies that want to bring in skilled foreign workers and the number of employment-based green cards given to high-tech foreign workers who want to stay here. Give them all they want! Not only do our companies need them now, because we’re not training enough engineers, but they will, over time, start many more companies and create many more good jobs than they would possibly displace. Silicon Valley is living proof of that — and where innovation happens matters. It’s still where the best jobs will be located.
Folks, we can’t keep being stupid about these things. You can’t have a world where foreign-born students dominate your science graduate schools, research labs, journal publications and can now more easily than ever go back to their home countries to start companies — without it eventually impacting our standard of living — especially when we’re also slipping behind in high-speed Internet penetration per capita. America has fallen from fourth in the world in 2001 to 15th today.
My hat is off to Andrew Rasiej and Micah Sifry, co-founders of the Personal Democracy Forum. They are trying to make this an issue in the presidential campaign by creating a movement to demand that candidates focus on our digital deficits and divides. (See: http://www.techpresident.com.) Mr. Rasiej, who unsuccessfully ran for public advocate of New York City in 2005 on a platform calling for low-cost wireless access everywhere, notes that “only half of America has broadband access to the Internet.” We need to go from “No Child Left Behind,” he says, to “Every Child Connected.”
Here’s the sad truth: 9/11, and the failing Iraq war, have sucked up almost all the oxygen in this country — oxygen needed to discuss seriously education, health care, climate change and competitiveness, notes Garrett Graff, an editor at Washingtonian Magazine and author of the upcoming book “The First Campaign,” which deals with this theme. So right now, it’s mostly governors talking about these issues, noted Mr. Graff, but there is only so much they can do without Washington being focused and leading.
Which is why we’ve got to bring our occupation of Iraq to an end in the quickest, least bad way possible — otherwise we are going to lose Iraq and America. It’s coming down to that choice.
Bush Wars: The Final Frontier
By David Greenberg
With the title of his 1980 book, the journalist Sidney Blumenthal coined a term that has entered the political lexicon: “The Permanent Campaign.” The phrase denotes the blurring of the line in modern times between campaigning and governing. Presidents, of course, have always made decisions with an eye on their popularity. But with the advent of television, polling, and professional consultants, presidents of the 1970s and ’80s—Nixon, Carter, and Reagan in particular—upped the ante by devoting the full arsenal of modern electioneering tools not just to winning office but to holding office as well.
Recently, the permanent campaign has added yet another stage—a final frontier. This is the fight for history, centered on the presidents’ libraries. The presidential libraries date from 1939, when Franklin Delano Roosevelt conceived his as a repository for his public papers, to be built with private funds and run by public officials. Since then, chief executives have sought to outdo their predecessors by entombing their archives in extravagant, self-memorializing shrines. These museums extend the permanent campaign by utilizing the same public relations techniques as a race for the White House—and lately requiring comparable fundraising efforts to boot.
Fittingly, Nixon and Carter were the ex-presidents who most assiduously sought to revive their badly damaged reputations, through their libraries and other activities. George W. Bush has taken this campaign for history to a new level. As the New York Daily News reported last fall, Bush’s team has imagined the most grandiose presidential library yet. His crack fundraisers are aiming to rake in a record-breaking $500 million for his personal temple, passing the cattle hat to “wealthy heiresses, Arab nations and captains of industry,” as the newspaper put it. For the site, Bush chose Southern Methodist University in Dallas, Texas, the First Lady’s alma mater.
Maybe Bush feared the reaction if he sought to build his library at Yale, his own undergraduate stomping grounds. But if he was hoping for smoother sailing in deep-red Texas, he miscalculated. The news of the museum’s impending arrival hardly went over well on the S.M.U. campus. Universities, after all, are supposed to promote disinterested research, and the gilded histories that tend to adorn the presidential galleries don’t exactly pulse with evenhandedness.
Making matters worse, the Bush team stipulated that the university should host a “Bush Institute.” This proposed policy shop, a spokesman for the president’s camp told the Daily News, would hire right-wing scholars or journalists and “give them money to write papers and books favorable to the president’s policies.”
The S.M.U. faculty, led by two theology professors, Bill McElvaney and Susanne Johnson, protested. So did the student newspaper, the Daily Campus. But after a flurry of reports over the winter, the issue abruptly dropped out of the national news. Perhaps the seeming determination of S.M.U. President R. Gerald Turner to bring the library to campus, willy-nilly, sapped the opposition’s resolve. Perhaps the news media are simply operating as Walter Lippmann famously described: like the beam of a searchlight that illuminates one thing for a passing moment but quickly moves on to something else.
Whatever the case, the issue deserves continued attention. In mid-April, the S.M.U. faculty senate overwhelmingly passed two resolutions affirming the independence of any proposed Bush Institute from the university proper. And earlier this month the Daily Campus named Professors McElvaney and Johnson “S.M.U. Persons of the Year”—a vote of support for their efforts.
And make no mistake: the students and faculty are right to insist that S.M.U. disassociate itself from the think tank. Arguably the most distressing pattern of the Bush administration has been its jettisoning of long-established standards in virtually all areas of knowledge. From U.S. attorneys to intelligence analysts, environmental scientists to the Washington press corps, Bushies dismiss non-partisan experts as liberals and thus ripe for replacement by conservative partisans. Areas of scholarly inquiry from biology to medicine to law to economics are routinely politicized. We can expect the proposed Bush Institute at S.M.U. to treat history the same way, setting aside the norms and practices of the scholarly community in the interest of generating pseudo-research.
Ironically, this intensification of the final stage of the permanent campaign comes at a moment when changes are afoot at the presidential museum previously known for most egregiously distorting the record: the Nixon Library. Because Nixon had tried to destroy and abscond with his papers, Congress awarded ownership of them (and the papers of subsequent presidents) to the government. The museum that Nixon then built with funds from his wealthy friends remained outside the family of official libraries—a pariah every bit as much as the man it venerated. It could falsify history—by inaccurately suggesting, for example, that the Democrats wanted to impeach Nixon in order to put House Speaker Carl Albert in the White House—but it had to do so on its own dime.
Now, though, due to legislation passed by the Republican-controlled Congress and signed by Bush in 2004, the Nixon Library is going legit. The law allows the Nixon Library to get government funds and official recognition, but—ironically—it will also force the institution to come under the direction of a real historian. The director-designate, Timothy J. Naftali, formerly of the University of Virginia, has already ripped out the notorious Watergate exhibit.
Indeed, even back when the Nixon Library was still shameless about whitewashing Nixon’s crimes, public pressure forced it to impose certain limits. When the library opened in 1990, the then-director, Hugh Hewitt (now a right-wing radio host), announced that he would exclude insufficiently pro-Nixon researchers. An outcry ensued and he was overruled and relieved of his job.
In contrast, the powers at S.M.U. have not full-throatedly condemned the idea of a Bush institute that would generate historical propaganda. For them to do so will require an outpouring of support that can only be triggered by sustained media attention. National journalists, it seems, will have to redirect their searchlight beams onto the doings of the faculty meetings, campus newspaper editorials, and administration boardrooms at S.M.U. History is at stake.
----
David Greenberg, an assistant professor of history and journalism at Rutgers University, in New Brunswick, N.J., is the author of three books, “Nixon’s Shadow,” “Presidential Doodles” and, most recently, “Calvin Coolidge.”
When Government Was the Solution
By Jean Edward Smith
For more than a generation, Americans have been told that government is the problem, not the solution. The mantra can be traced back to Barry Goldwater’s presidential bid in 1964. It provided the mind-set for the Reagan administration, and it has come to ultimate fruition during the presidency of George W. Bush.
On college campuses and at think tanks across the country, libertarian scholars stoke the urge to eliminate government from our lives. This thinking has led to the privatization of vital government functions such as the care of disabled veterans, the appointment to regulatory commissions of members at odds with the regulations they are sworn to enforce, the refusal of the Environmental Protection Agency to protect the environment, and the surrender of the government’s management of military operations to profit-seeking contractors.
A look back at Franklin D. Roosevelt’s presidency shows how differently Americans once viewed the government’s role, how much more optimistic they were and how much more they trusted the president.
F.D.R., like his cousin Theodore, saw government in positive terms. In 1912, speaking in Troy, N.Y., F.D.R. warned of the dangers of excessive individualism. The liberty of the individual must be harnessed for the benefit of the community, said Roosevelt. “Don’t call it regulation. People will hold up their hands in horror and say ‘un-American.’ Call it ‘cooperation.’ ”
When F.D.R. took office in 1933, one third of the nation was unemployed. Agriculture was destitute, factories were idle, businesses were closing their doors, and the banking system teetered on the brink of collapse. Violence lay just beneath the surface.
Roosevelt seized the opportunity. He galvanized the nation with an inaugural address that few will ever forget (“The only thing we have to fear is fear itself.”), closed the nation’s banks to restore depositor confidence and initiated a flurry of legislative proposals to put the country back on its feet. Sound banks were quickly reopened, weak ones were consolidated and, despite cries on the left for nationalization, the banking system was preserved.
Roosevelt had no master plan for recovery but responded pragmatically. Some initiatives, such as the Civilian Conservation Corps, which employed young men to reclaim the nation’s natural resources, were pure F.D.R. Others, such as the National Industrial Recovery Act, were Congressionally inspired. But for the first time in American history, government became an active participant in the country’s economic life.
After saving the banks, Roosevelt turned to agriculture. In Iowa, a bushel of corn was selling for less than a package of chewing gum. Crops rotted unharvested in the fields, and 46 percent of the nation’s farms faced foreclosure.
The New Deal responded with acreage allotments, price supports and the Farm Credit Administration. Farm mortgages were refinanced and production credit provided at low interest rates. A network of county agents, established under the Agricultural Adjustment Act, brought soil testing and the latest scientific advances to every county in the country.
The urban housing market was in equal disarray. Almost half of the nation’s homeowners could not make their mortgage payments, and new home construction was at a standstill. Roosevelt responded with the Home Owners’ Loan Corporation. Mortgages were refinanced. Distressed home owners were provided money for taxes and repairs. And new loan criteria, longer amortization periods and low interest rates made home ownership more widely affordable, also for the first time in American history.
The Glass-Steagall Banking Act, passed in 1933, authorized the Federal Reserve to set interest rates and established the Federal Deposit Insurance Corporation to insure individual bank deposits. No measure has had a greater impact on American lives or provided greater security for the average citizen.
The Tennessee Valley Authority, also established in 1933, brought cheap electric power and economic development to one of the most poverty-stricken regions of the country. Rural electrification, which we take for granted today, was virtually unknown when Roosevelt took office. Only about one in 10 American farms had electricity. In Mississippi, fewer than 1 in 100 did. The Rural Electrification Administration, which F.D.R. established by executive order in 1935, brought electric power to the countryside, aided by the construction of massive hydroelectric dams, not only on the Tennessee River system, but on the Columbia, Colorado and Missouri rivers as well.
To combat fraud in the securities industry, Roosevelt oversaw passage of the Truth in Securities Act, and then in 1934 established the Securities and Exchange Commission. As its first head he chose Joseph P. Kennedy. “Set a thief to catch a thief,” he joked afterward.
By overwhelming majorities, Congress passed laws establishing labor’s right to bargain collectively and the authority of the federal government to regulate hours and working conditions and to set minimum wages.
An alphabet soup of public works agencies — the C.W.A. (Civil Works Administration), the W.P.A. (Works Progress Administration) and the P.W.A. (Public Works Administration) — not only provided jobs, but restored the nation’s neglected infrastructure. Between 1933 and 1937, the federal government constructed more than half a million miles of highways and secondary roads, 5,900 schools, 2,500 hospitals, 8,000 parks, 13,000 playgrounds and 1,000 regional airports. Cultural projects employed and stimulated a generation of artists and writers, including such luminaries as Willem de Kooning, Jackson Pollock, John Cheever and Richard Wright.
Roosevelt saw Social Security, enacted in 1935, as the centerpiece of the New Deal. “If our Federal Government was established … ‘to promote the general welfare,’ ” said F.D.R., “it is our plain duty to provide for that security upon which welfare depends.”
For the first time, the government assumed responsibility for unemployment compensation, old-age and survivor benefits, as well as aid to dependent children and the handicapped. At F.D.R.’s insistence, Social Security was self-funding – supported by contributions paid jointly by employers and employees. (In most industrialized countries, the government provides the major funding for pension plans.) “Those taxes are in there,” Roosevelt said later, “so that no damn politician can ever scrap my Social Security program.”
The government’s positive role did not end when the New Deal lost effective control of Congress in 1938. Neither Wendell Willkie, the G.O.P. standard-bearer in 1940, nor Thomas E. Dewey, in 1944 and ’48, advocated turning back the clock.
The G.I. Bill of Rights, adopted unanimously by both houses of Congress in 1944, provided massive government funding to provide university and vocational training for returning veterans. The G.I. Bill changed the face of higher education by making universities accessible to virtually every American.
The Eisenhower administration continued to see government in positive terms. President Eisenhower added 15 million low-income wage earners to Social Security, and he launched the interstate highway system – which also was self-funding, through additional gasoline taxes. Only the federal government could have organized so vast an undertaking, the benefits of which continue to accrue.
The ideological obsession of the Bush administration to diminish the role of government has served the country badly. But perhaps this government’s demonstrated inability to improve the lives of ordinary Americans will ensure that future efforts to “repeal the New Deal” are not successful.
------
Jean Edward Smith, the John Marshall Professor of political science at Marshall University, in Huntington, W. Va., is the author of 12 books, including biographies of Ulysses S. Grant, Chief Justice John Marshall and General Lucius D. Clay. His latest book is “F.D.R.”
Sunday, May 20
Conservatives
WITH the death on Tuesday of the Rev. Jerry Falwell, the Baptist minister and founder of the Moral Majority, and the announcement on Thursday that Paul D. Wolfowitz would resign from the presidency of the World Bank, two major figures in the modern conservative movement exited the political stage. To many, this is the latest evidence that the conservative movement, which has dominated politics during the last quarter century, is finished.
But conservatives have heard this before, and have yet to give in. Weeks after Barry Goldwater suffered a humiliating defeat in 1964 to Lyndon B. Johnson, his supporters organized the American Conservative Union to take on the Republican Party establishment. After failing to unseat Gerald Ford for the Republican nomination in 1976, Ronald Reagan positioned himself for the 1980 election. The conservatives dismayed by the election of Bill Clinton spent the next eight years attacking him at every opportunity. And after failing to win a conviction of Mr. Clinton following his impeachment, Republicans, far from retreating into caution or self-doubt, kept up the pressure and turned the 2000 election into a referendum on Mr. Clinton’s character.
What accounts for this resilience — or stubbornness?
For one thing, since its beginnings in the 1950s, conservatism has been an insurgent movement fought on many fronts — cultural, moral and philosophical. Leaders on the right, as well as the rank and file, have always believed that defeats were inevitable and the odds often long.
Consider the careers, or cases, of Mr. Falwell and Mr. Wolfowitz. That the two were polar opposites in almost every way says a good deal about the movement they served — for one thing, about its ability in its formative years, the 1970s and ’80s, to make room for a constellation of agendas.
Mr. Falwell, a bootlegger’s son who disapproved of Martin Luther King Jr.’s civil rights activism because “preachers are not called to be politicians, but soul-winners,” changed his mind amid the “secular” excesses of the 1960s and went on to found Moral Majority Inc. The organization mobilized the Christian right into a political force that eventually helped elect Ronald Reagan in 1980 and also established the Republican Party as the home of politically active Christian evangelicals.
Mr. Wolfowitz was not a “movement conservative.” He did not inveigh against the sins of “secular liberalism” or homosexuals and the American Civil Liberties Union.
But like Mr. Falwell, he came to politics in the 1970s, when the conservative insurgency was reaching its apex and new ideological lines were emerging. Mr. Wolfowitz, from a family of immigrant Jews from Poland, was a brilliant student whose teachers included three neoconservative giants — Allan Bloom, Leo Strauss and Albert Wohlstetter. He felt out of place on the Yale faculty and discovered his métier as a military strategist in the Ford administration. In his first major assignment he joined a team of analysts who drew up a dire and, some said, exaggerated analysis of Soviet military strength.
The report clashed with intelligence assessments, presaging Mr. Wolfowitz’s clash with intelligence experts in the months leading up to the Iraq invasion.
During this period, the ideas espoused by Mr. Falwell and Mr. Wolfowitz, though both were valued on the right, did not mesh; they were unconnected, spokes in the large conservative wheel. And so they remained during the first months of George W. Bush’s administration.
But after 9/11, neoconservatives and evangelicals found common cause in their shared belief in American exceptionalism and in the idea that the country’s values could be exported abroad. Mr. Bush was receptive to the synthesis, and it became the ideological centerpiece of his foreign policy, with its stated mission to combat the “axis of evil” in a global “war on terror.”
Today, of course, this vision has been widely repudiated, if not altogether discredited. The public has grown skeptical, or maybe just tired, of the hard-edged and often polarizing politics. And this change coincides with broad-based skepticism of the Bush presidency itself — as witnessed by Mr. Bush’s and his party’s perilously low approval ratings. The G.O.P.’s embrace of the conservative movement is beginning, some say, to resemble a death grip.
But there have been no signs of atonement within the movement. Mr. Falwell, who notoriously suggested that the Sept. 11 attacks reflected God’s judgment of “the pagans, and the abortionists, and the feminists, and the gays and the lesbians,” said last September that he hoped Hillary Clinton was running for president because she would outdo “Lucifer” in energizing his constituents.
Then there is Mr. Wolfowitz. The figure he is often compared to is, of course, Robert McNamara, the former defense secretary in the Kennedy and Johnson administrations, who, like him, had been a brutal infighter at the Pentagon as he helped engineer a disastrous war. When Mr. McNamara left to head the World Bank, he seemed stricken by his failure in Vietnam and approached his new position with “a touch of penance,” as David Halberstam wrote in “The Best and the Brightest.”
Not Mr. Wolfowitz. Apparently unchastened by the Iraq disaster — though he was its intellectual architect — he simply carried his mission to a new venue, “adopting a single-minded position on certain matters, refusing to entertain alternative views, marginalizing dissenters,” as Steven R. Weisman reported in The Times.
These differences aren’t surprising. Mr. McNamara was a technician, an efficiency expert and systems manager unattached to any political ideology. He viewed Vietnam as a problem, not a cause. Whereas for Mr. Wolfowitz, Iraq was a fixation — as indeed it has been, and remains, for the principal figures in the Bush administration, who are still deeply committed conservatives.
George Bush himself is, after all, the first president who came of age while the conservative movement was in full force, and it continues to guide him, just as it continues to guide members of the Republican base who still back the administration and who are almost certain to choose the party’s next nominee.
These views, of course, are sharply at odds with public opinion. The unpopularity of the Iraq war is nearing Vietnam proportions, and President Bush’s plummeting poll numbers resemble those of Mr. McNamara’s boss, Lyndon B. Johnson.
Meanwhile, the conservative movement finds itself in a new place. No longer insurgents, its leaders now form an entrenched establishment, just like the liberals and moderates they defeated a generation ago. But if Mr. Wolfowitz’s brief tenure at the World Bank is any indication, they are a long way from feeling chastened.
Thursday, May 17
R.I.H. Jerry Falwell
The right's holy fool.
By Timothy Noah
God, they say, is love, but the Rev. Jerry Falwell, who died May 15, hit the jackpot trafficking in small-minded condemnation. The controversies Falwell generated followed a predictable loop. 1) Falwell would say something hateful or clownish about some person or group associated with liberalism. 2) A public outcry would ensue. 3) Falwell would apologize and retract the offending comment. 4) Falwell would repeat the comment, slightly rephrased.
Sen. John McCain, R-Ariz., a presidential candidate in 2000, condemned Falwell's intolerance ("The political tactics of division and slander are not our values, they are corrupting influences on religion and politics, and those who practice them in the name of religion or in the name of the Republican Party or in the name of America shame our faith, our party and our country") but last year, as a presidential candidate positioning for 2008, made peace with Falwell and gave a commencement address ("We have nothing to fear from each other") to the 2006 graduating class at Falwell's Liberty University. On news of Falwell's death, McCain said in a statement, "Dr. Falwell was a man of distinguished accomplishment who devoted his life to serving his faith and country."
Nonsense. He was a bigot, a reactionary, a liar, and a fool. Herewith, a Falwell sampler.
On Sept. 11: "The abortionists have got to bear some burden for this because God will not be mocked. And when we destroy 40 million little innocent babies, we make God mad. I really believe that the pagans, and the abortionists, and the feminists, and the gays and the lesbians who are actively trying to make that an alternative lifestyle, the ACLU, People for the American Way—all of them who have tried to secularize America—I point the finger in their face and say 'you helped this happen.' "
On AIDS: "AIDS is the wrath of a just God against homosexuals."
On homosexuality: "I believe that all of us are born heterosexual, physically created with a plumbing that's heterosexual, and created with the instincts and desires that are basically, fundamentally, heterosexual. But I believe that we have the ability to experiment in every direction. Experimentation can lead to habitual practice, and then to a lifestyle. But I don't believe anyone begins a homosexual."
On Martin Luther King Jr.: "I must personally say that I do question the sincerity and nonviolent intentions of some civil rights leaders such as Dr. Martin Luther King Jr., Mr. James Farmer, and others, who are known to have left-wing associations."
On Martin Luther King Jr., four decades later: "You know, I supported Martin Luther King Jr., who did practice civil disobedience."
On public education: "I hope I live to see the day when, as in the early days of our country, we won't have any public schools. The churches will have taken them over again, and Christians will be running them."
On the separation of church and state: "There is no separation of church and state."
On feminists: "I listen to feminists and all these radical gals. ... These women just need a man in the house. That's all they need. Most of the feminists need a man to tell them what time of day it is and to lead them home. And they blew it and they're mad at all men. Feminists hate men. They're sexist. They hate men; that's their problem."
On global warming: "I can tell you, our grandchildren will laugh at those who predicted global warming. We'll be in global cooling by then, if the Lord hasn't returned. I don't believe a moment of it. The whole thing is created to destroy America's free enterprise system and our economic stability."
On Bishop Desmond Tutu: "I think he's a phony, period, as far as representing the black people of South Africa."
On Islam: "I think Mohammed was a terrorist. I read enough of the history of his life, written by both Muslims and non-Muslims, that he was a violent man, a man of war."
On Jews: "In my opinion, the Antichrist will be a counterfeit of the true Christ, which means that he will be male and Jewish, since Jesus was male and Jewish."
Rest in peace, you blowhard.
Sunday, May 13
Earth to G.O.P.: The Gipper Is Dead
By FRANK RICH
OF course you didn’t watch the first Republican presidential debate on MSNBC. Even the party’s most loyal base didn’t abandon Fox News, where Bill O’Reilly, interviewing the already overexposed George Tenet, drew far more viewers. Yet the few telling video scraps that entered the 24/7 mediasphere did turn the event into an instant “Saturday Night Live” parody without “SNL” having to lift a finger. The row of 10 middle-aged white candidates, David Letterman said, looked like “guys waiting to tee off at a restricted country club.”
Since then, panicked Republicans have been either blaming the “Let’s Make a Deal” debate format or praying for salvation-by-celebrity in the form of another middle-aged white guy who might enter the race, Fred Thompson. They don’t seem to get that there is not another major brand in the country — not Wal-Mart, not G.E., not even Denny’s nowadays — that would try to sell a mass product with such a demographically homogeneous sales force. And that’s only half the problem. The other half is that the Republicans don’t have a product to sell. Aside from tax cuts and a wall on the Mexican border, the only issue that energized the presidential contenders was Ronald Reagan. The debate’s most animated moments by far came as they clamored to lip-sync his “optimism,” his “morning in America,” his “shining city on the hill” and even, in a bizarre John McCain moment out of a Chucky movie, his grin.
The candidates mentioned Reagan’s name 19 times, the current White House occupant’s once. Much as the Republicans hope that the Gipper can still be a panacea for all their political ills, so they want to believe that if only President Bush would just go away and take his rock-bottom approval rating and equally unpopular war with him, all of their problems would be solved. But it could be argued that the Iraq fiasco, disastrous to American interests as it is, actually masks the magnitude of the destruction this presidency has visited both on the country in general and the G.O.P. in particular.
By my rough, conservative calculation — feel free to add — there have been corruption, incompetence, and contracting or cronyism scandals in these cabinet departments: Defense, Education, Justice, Interior, Homeland Security, Veterans Affairs, Health and Human Services, and Housing and Urban Development. I am not counting State, whose deputy secretary, a champion of abstinence-based international AIDS funding, resigned last month in a prostitution scandal, or the General Services Administration, now being investigated for possibly steering federal favors to Republican Congressional candidates in 2006. Or the Office of Management and Budget, whose chief procurement officer was sentenced to prison in the Abramoff fallout. I will, however, toss in a figure that reveals the sheer depth of the overall malfeasance: no fewer than four inspectors general, the official watchdogs charged with investigating improprieties in each department, are themselves under investigation simultaneously — an all-time record.
Wrongdoing of this magnitude does not happen by accident, but it is not necessarily instigated by a Watergate-style criminal conspiracy. When corruption is this pervasive, it can also be a byproduct of a governing philosophy. That’s the case here. That Bush-Rove style of governance, the common denominator of all the administration scandals, is the Frankenstein creature that stalks the G.O.P. as it faces 2008. It has become the Republican brand and will remain so, even after this president goes, until courageous Republicans disown it and eradicate it.
It’s not the philosophy Mr. Bush campaigned on. Remember the candidate who billed himself as a “different kind of Republican” and a “compassionate conservative”? Karl Rove wanted to build a lasting Republican majority by emulating the tactics of the 1896 candidate, William McKinley, whose victory ushered in G.O.P. dominance that would last until the New Deal some 35 years later. The Rove plan was to add to the party’s base, much as McKinley had at the dawn of the industrial era, by attracting new un-Republican-like demographic groups, including Hispanics and African-Americans. Hence, No Child Left Behind, an education program pitched particularly to urban Americans, and a 2000 nominating convention that starred break dancers, gospel singers, Colin Powell and, as an M.C., the only black Republican member of Congress, J. C. Watts.
As always, the salesmanship was brilliant. One smitten liberal columnist imagined in 1999 that Mr. Bush could redefine his party: “If compassion and inclusion are his talismans, education his centerpiece and national unity his promise, we may say a final, welcome goodbye to the wedge issues that have divided Americans by race, ethnicity and religious conviction.” Or not. As Matthew Dowd, the disaffected Bush pollster, concluded this spring, the uniter he had so eagerly helped elect turned out to be “not the person” he thought, but instead a divider who wanted to appeal to the “51 percent of the people” who would ensure his hold on power.
But it isn’t just the divisive Bush-Rove partisanship that led to scandal. The corruption grew out of the White House’s insistence that partisanship — the maintenance of that 51 percent — dictate every governmental action no matter what the effect on the common good. And so the first M.B.A. president ignored every rule of sound management. Loyal ideologues or flunkies were put in crucial positions regardless of their ethics or competence. Government business was outsourced to campaign contributors regardless of their ethics or competence. Even orthodox Republican fiscal prudence was tossed aside so Congressional allies could be bought off with bridges to nowhere.
This was true way before many, let alone Matthew Dowd, were willing to see it. It was true before the Iraq war. In retrospect, the first unimpeachable evidence of the White House’s modus operandi was reported by the journalist Ron Suskind, for Esquire, at the end of 2002. Mr. Suskind interviewed an illustrious Bush appointee, the University of Pennsylvania political scientist John DiIulio, who had run the administration’s compassionate-conservative flagship, the Office of Faith-Based and Community Initiatives. Bemoaning an unprecedented “lack of a policy apparatus” in the White House, Mr. DiIulio said: “What you’ve got is everything — and I mean everything — being run by the political arm. It’s the reign of the Mayberry Machiavellis.”
His words have been borne out repeatedly: by the unqualified political hacks and well-connected no-bid contractors who sabotaged the occupation and reconstruction of Iraq; the politicization of science at the Food and Drug Administration and the Environmental Protection Agency; the outsourcing of veterans’ care to a crony company at Walter Reed; and the purge of independent United States attorneys at Alberto Gonzales’s Justice Department. But even more pertinent, perhaps, to the Republican future is how the Mayberry Machiavellis alienated the precise groups that Mr. Bush had promised to add to his party’s base.
By installing a political hack, his 2000 campaign manager, Joe Allbaugh, at the top of FEMA, the president foreordained the hiring of Brownie and the disastrous response to Katrina. At the Education Department, the signature No Child Left Behind program, Reading First, is turning out to be a cesspool of contracting conflicts of interest. It’s also at that department that Bush loyalists stood passively by while the student-loan industry scandal exploded; at its center is Nelnet, the single largest corporate campaign contributor to the 2006 G.O.P. Congressional campaign committee. Back at Mr. Gonzales’s operation, where revelations of politicization and cover-ups mount daily, it turns out that no black lawyers have been hired in the nearly all-white criminal section of the civil rights division since 2003.
The sole piece of compassionate conservatism that Mr. Bush has tried not to sacrifice to political expedience — nondraconian immigration reform — is also on the ropes, done in by a wave of xenophobia that he has failed to combat. Just how knee-jerk this strain has become could be seen in the MSNBC debate when Chris Matthews asked the candidates if they would consider a constitutional amendment to allow presidential runs by naturalized citizens like their party’s star governor, Arnold Schwarzenegger (an American since 1983), and its national chairman, Senator Mel Martinez of Florida. Seven out of 10 said no.
We’ve certainly come a long way from that 2000 Philadelphia convention, with its dream of forging an inclusive, long-lasting G.O.P. majority. Instead of break dancers and a black Republican congressman (there are none now), we’ve had YouTube classics like Mr. Rove’s impersonation of a rapper at a Washington journalists’ banquet and George Allen’s “macaca” meltdown. Simultaneously, the once-reliable evangelical base is starting to drift as some of its leaders join the battle against global warming and others recognize that they’ve been played for fools on “family values” by the G.O.P. establishment that covered up for Mark Foley.
Meanwhile, most of the pressing matters that the public cares passionately about — Iraq, health care, the environment and energy independence — belong for now to the Democrats. Though that party’s first debate wasn’t exactly an intellectual feast either, actual issues were engaged by presidential hopefuls representing a cross section of American demographics. You don’t see Democratic candidates changing the subject to J.F.K. and F.D.R. They are free to start wrestling with the future while the men inheriting the Bush-Rove brand of Republicanism are reduced to harking back to a morning in America on which the sun set in 1989.
Saturday, May 12
Free Speech is Essential to a Functioning Stock Market - it's just not in any individual company's interest...
Making Sure the Negative Can Be Heard
By JOE NOCERA, NYTIMES
“For securities analysis to be valid it needs to be objective.”
Although 50 percent of all trades consist of people selling stocks, the vast majority of analyst recommendations remain bullish. Despite the settlement that the former New York Attorney General Eliot Spitzer (now governor, of course) extracted from Wall Street in April 2003 — a settlement aimed specifically at making research more tough-minded and independent — “there are very strong pressures to be only positive about companies,” Mr. Rocker said.
One pressure point is investors themselves; individuals and institutions alike want to see stocks go up, so they prefer bullish analysts over bearish ones. But another, greater source of pressure is the companies the analysts cover. Managements that are showered in stock options have their personal wealth directly tied to a rising stock price, so they are often infuriated when an analyst puts out a critical report or downgrades a rating to a sell. And they retaliate.
They refuse to allow the negative analyst to ask questions on conference calls. They somehow “forget” to include him in e-mail messages that are sent to other analysts. They decline to attend that analyst’s conferences. They complain to his boss, who then inquires as to why the analyst has to be so darn negative all the time. Many companies still use investment banking business as a way to reward the firms that employ analysts they like and punish the ones with analysts they don’t like — even though that is a practice Mr. Spitzer sought to eradicate.
And sometimes companies sue, just as Overstock did. True, it’s not an everyday occurrence — and when it does happen, companies never admit that their suits are intended to silence critics, or put them out of business. Mr. Byrne, for instance, has always denied that his intent was to shut down criticism of Overstock. But it is hard to believe that silencing critics isn’t the intent.
“It is a pernicious practice,” said Owen Lamont, a professor at the Yale School of Management and a defender of short sellers.
“The rising threat of litigation is a huge disincentive for expressing negative views,” Mr. Rocker said in his speech. “It is costly and immensely time consuming. You have to face personal disparagement in the media, because they always have to include the allegation that you supposedly manipulated stocks. And it creates internal discord within the firm.”
Plus, it works.
You don’t believe me? Exhibit A: David Maris, an analyst who used to cover Biovail, a Canadian pharmaceutical company, for Banc of America Securities. After writing a series of blistering reports about Biovail, Mr. Maris and his wife discovered they were being followed by private investigators. For a time, Banc of America stood by its analyst, providing security and giving him free rein to cover the company as he saw fit. But in February 2006, Biovail filed a lawsuit against Gradient, a handful of hedge funds and Mr. Maris, making the same kind of stock manipulation accusations that Overstock made six months earlier. Not long after, a suit was filed on behalf of Biovail’s shareholders that included Banc of America Securities itself as a defendant.
Once Banc of America Securities was involved in the litigation, it dropped coverage of Biovail. “Regrettably,” wrote Joan Solotar, the firm’s head of equity research, “our coverage decision removes an informed and independent voice on Biovail.” Well, that’s one way of putting it. Here’s another: Biovail won.
Officials at Banc of America Securities insist that the situation is an aberration. But is it, really? What will the firm do the next time an angry company sues? If the tactic worked for Biovail (which, by the way, insists that the evidence will support its allegations), why won’t it work for any other company annoyed at BofA’s analysts? As for Mr. Maris, he is no longer with Banc of America Securities. He left last December for a hedge fund, which means his research no longer helps anybody except his fund. So his independent voice has been silenced permanently.
Exhibit B: Timothy Mulligan, former author of the newsletter “The Eyeshade Report.” Mr. Mulligan, a forensic accountant and lawyer, started a small business writing reports on “quality of earnings” — that is, how shaky (or solid) a company’s reported earnings were. A few years ago, a company called Matrixx Initiatives, which makes Zicam, a nasal spray, sued a group of anonymous message board critics — presumably to shut them up. In 2004, Matrixx subpoenaed Mr. Mulligan, demanding that he turn over his sources and subscriber list, on the theory that that would help Matrixx identify the message board critics.
Mr. Mulligan resisted, but the case dragged on for years. In the meantime, Mr. Mulligan, who was representing himself, was spending more time on his legal case than on his business. “It was overwhelming,” he said. “If I had had a lawyer, I would have had a legal bill in excess of $300,000. Our legal system is such that you can easily keep a frivolous lawsuit alive for years.” Indeed, although Matrixx abandoned the case in January, it was too late for Mr. Mulligan. By November 2005, he had closed the business. (A Matrixx official did not return my phone call.) Another independent voice silenced.
Exhibit C: John Gwynn, an analyst with the small firm of Morgan Keegan, who covered a Canadian insurance company called Fairfax Financial Holdings, which also has a history of lashing back at critics. (What is it about these small Canadian companies?) Last July, Fairfax filed a big lawsuit against 20 defendants, asserting — what else? — stock manipulation. One defendant was Mr. Gwynn, who had been one of the few bearish analysts covering the company.
In the immediate aftermath, Mr. Gwynn, who declined to return my phone calls, continued covering Fairfax. But in January, he dropped coverage. “The discontinuation of Fairfax coverage is not a reflection of any change in our relatively negative perspective of the company’s fundamental business prospects,” he wrote in a note to his clients. “Rather it is the result of a litigation strategy designed by Fairfax to silence negative research coverage.” Another firm that has been openly skeptical of Fairfax, Institutional Credit Partners, has had its staff followed and investigated, just like Mr. Maris.
In an e-mail message, Mike Sitrick, who represents Biovail and Fairfax, said: “The only analysts Fairfax named in its lawsuit are those which it alleges used improper means to attack the company and impact the price of the company’s stock. Mr. Rocker’s assertions that the suit was filed to stifle criticism are not only untrue in the case of Fairfax, but a thinly veiled attempt to divert attention from the wrongdoing alleged in the complaint.” Mr. Sitrick denied that Mr. Maris was ever “tailed” but said that Fairfax has investigated Institutional Credit Partners.
And then there’s Gradient, a skeptical independent voice if ever there was one. As it happens, Gradient’s co-founder and editor in chief, Donn Vickery, has also decided the time has come to speak out. I talked to him a few days after I heard Mr. Rocker’s speech.
“If we hadn’t been around for 10 years, the lawsuits might have put us out of business,” he said. “As it was, our growth was slowed, and our customer retention was more difficult. Certainly our legal costs are way up. We’ve had employees accosted in parking lots by private investigators. Calls to home numbers and cellphones. We’ve had more turnover during this time, and you have to figure this has had an impact.” And of course, he’s still embroiled in two lawsuits, both of which deserve to be thrown out of court, but will probably last for years.
These are bullish times in the stock market, so it is easy to forget how important it is to have skeptical — and even negative — voices to counterbalance all the happy talk surrounding stocks. Even when the skeptics are wrong, they make the market healthier because they offer a point of view that people need to hear. And quite often, of course, they’re right. Mr. Rocker, for instance, sniffed out problems at Boston Chicken and Krispy Kreme long before the market did. Wouldn’t you have wanted to know what he was saying about those companies?
“I hope all of you will recognize this threat and act courageously to protect free expression in the investment business,” Mr. Rocker said to the assembled analysts at the end of his speech. “We need to show those who would silence us that they are wrong.”
Are you listening, Banc of America Securities?
Friday, May 11
$350 Champagne - Private Industry steps in to do what Govt. doesn't: Redistribute Wealth
----
Steve Leuthold, a money manager out of Minneapolis, told his clients last month that he was appalled by something he had read in The New York Times.
He pointed to an article about “bottle hosts” at chic nightclubs in New York, whose job was to spot, and give the best tables to, the people willing to spend the most money on drinks. “One club had a table minimum of two bottles of liquor or Champagne at $350 per bottle,” Mr. Leuthold wrote. “But this is a minimum.” He noted that one brand of Champagne fetched $1,600 a bottle, and that the average liquor bill for a table at one club was about $3,500, with some tables bringing in $12,000 or more.
“Maybe I am too Norwegian (or too Midwest), but I find this reprehensible,” he wrote. “How about you?”
Mr. Leuthold is an old friend of mine, and I am surprised he does not understand and applaud the economic function of such things. It is a classic example of private enterprise stepping in to fill a void left when the government no longer fills a role it once did.
That role is income redistribution.
Half a century ago, when Dwight D. Eisenhower was in the White House, personal income tax rates ranged up to 91 percent on income of more than $400,000. The rate for those who made more than $100,000 a year was 75 percent.
Adjusted for inflation, that would be equivalent to around $3 million now for the 91 percent rate, and $730,000 for the 75 percent rate. The current top rate is 35 percent.
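A quick sanity check on that inflation adjustment (my own back-of-the-envelope, not the column's; the CPI values below are approximate annual averages I'm assuming, since the column doesn't say which deflator it used):

    # Rough check of the 1957-to-2007 inflation adjustment cited above.
    # CPI values are approximate assumptions, not taken from the column.
    CPI_1957 = 28.1
    CPI_2007 = 207.3
    factor = CPI_2007 / CPI_1957  # roughly 7.4

    for rate, threshold_1957 in [("91 percent", 400_000), ("75 percent", 100_000)]:
        today = threshold_1957 * factor
        print(f"{rate} bracket began at ${threshold_1957:,} in 1957 dollars, "
              f"about ${today:,.0f} in today's money")

A multiplier of about 7.4 puts the old thresholds near $2.95 million and $740,000, the same ballpark as the column's roughly $3 million and $730,000.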
There are plenty of other ways that the tax code is now more friendly to the wealthy than it was. In 1957, long-term capital gains faced a 25 percent tax. Now the rate is 15 percent. Dividends were taxed at ordinary income tax rates, but are now taxed at 15 percent. The estate tax is on its way to being repealed in 2010, although for just one year. Hedge fund barons who make hundreds of millions a year arrange to keep the money offshore, delaying for years the need to pay any taxes at all on the money. The high tax rates of long ago were often avoided through tax shelters, but they reflected a certain egalitarian spirit, which itself was a reaction to the excesses of the very wealthy “robber barons” of the late 19th century.
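To make that shift concrete, here is what a top-bracket investor keeps from $100 of investment income under the 1957 rates cited above versus today's 15 percent rates (a rough sketch of my own that ignores the era's dividend exclusion, surtaxes, and state taxes):

    # Illustrative only: assumes the stated rates apply to the full $100.
    def kept_of_100(rate_pct):
        """Dollars kept from $100 of income taxed at rate_pct percent."""
        return 100 - rate_pct

    # Long-term capital gains: 25 percent then, 15 percent now.
    print(f"Capital gain of $100: kept ${kept_of_100(25)} in 1957, ${kept_of_100(15)} now")
    # Dividends: taxed as ordinary income then (91 percent at the top), 15 percent now.
    print(f"Dividend of $100:     kept ${kept_of_100(91)} in 1957, ${kept_of_100(15)} now")

Roughly $75 then versus $85 now on a long-term gain, and about $9 then versus $85 now on a dividend taxed at the very top rate.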
“Surely we can afford to make a distinction between the people whose only capital is their mental and physical energy and the people whose income is derived from investments,” wrote Andrew Mellon, the Treasury secretary in the 1920s. He thought it was “beyond question” that taxes should be higher on investment income than on income earned from labor.
Those days are gone, and it may be that they cannot come back, even if the political will arrives to bring them back. Capital is too mobile, and companies and individuals can shift income overseas much more easily than they could in Mellon’s day.
But whether or not the government can redistribute the money, the private economy will try to do it. The wealthy are persuaded that they simply must be in hedge funds and private equity funds — and should pay a fee to a bank for getting them into such funds. That is on top of the high fees charged by the funds themselves. The investments may or may not do well, but those collecting the fees are sure to prosper.
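A minimal sketch of how those layered fees add up, using made-up numbers of my own (a 1 percent placement fee to the bank plus a standard "2 and 20" fund fee; none of these figures come from the column):

    # Hypothetical numbers: 1% bank placement fee, 2% management fee,
    # 20% incentive fee, and a 10% gross return on a $1,000,000 stake.
    stake = 1_000_000
    gross_gain = 0.10 * stake
    placement_fee = 0.01 * stake                    # the bank's cut for access
    management_fee = 0.02 * stake                   # the fund's annual fee
    incentive_fee = 0.20 * max(gross_gain - management_fee, 0)
    net_gain = gross_gain - placement_fee - management_fee - incentive_fee

    print(f"Gross gain:      ${gross_gain:,.0f}")
    print(f"Fees collected:  ${placement_fee + management_fee + incentive_fee:,.0f}")
    print(f"Investor keeps:  ${net_gain:,.0f} ({net_gain / stake:.1%} on the stake)")

On these assumptions, nearly half of a 10 percent gross year goes to the people collecting the fees, whether or not the investments themselves do well.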
Then there are the growth areas of the economy — and I don’t mean semiconductors. Last year at the World Economic Forum in Davos, Switzerland, officials suggested there was great opportunity for personal trainers, people who can keep the rich looking fit and thin. This week, The Wall Street Journal reported that membership in the Institute for Divorce Financial Analysts is rising at a rate of 25 percent a year. Such analysts claim to be able to help sort out divorce fights for the wealthy.
In that context, outrageously priced drinks at fancy clubs can be seen as simply taking money from those with too much of it, and passing it on to others.
Those who get it may not be the most deserving, but that was also true when the government was handing out the money. And such clubs at least are more honest than the penny-stock frauds that often aim at people who have made a lot of money without really understanding much about it.
Thirty years ago this month, Peter F. Drucker, the great writer on management, bemoaned the fact that a few chief executives were collecting millions. Attention given such pay, he wrote then, obscured “the achievement of U.S. business in this century: the steady narrowing of the income gap between the ‘boss man’ and the ‘working man.’ ”
That trend reversed, and in this century the boss makes more than ever. The government does not tax it away, so others seek ways to get some of the wealth.