Volume: 3, Issue: 3
Fall 2008
The Resurgence of Big Government
Yaron Brook
Following the economic disasters of the 1960s and 1970s, brought on by the statist policies of the political left, America seemed to change course. Commentators called the shift the "swing to the right"-that is, toward capitalism. From about 1980 to 2000, a new attitude took hold: the idea that government should be smaller, that recessions are best dealt with through tax cuts and deregulation, that markets work pretty effectively, and that many existing government interventions are doing more harm than good. President Bill Clinton found it necessary to declare, "The era of big government is over."
Today that attitude has virtually vanished from the public stage. We are now witnessing a swing back to the left-toward statism. As a wave of recent articles has proclaimed: The era of big government is back.1
The evidence is hard to miss. Consider our current housing and credit crisis. From day one, it was blamed on the market and a lack of oversight by regulators who were said to be "asleep at the wheel." In response to the crisis, the government, the policy analysts, the media, and the American people demanded action, and everyone understood this to mean more government, more regulation, more controls. We got our wish.
First came the Fed's panicked slashing of interest rates. Then the bailout of Bear Stearns. Then the bailout of Freddie Mac. Then a $300 billion mortgage bill, which passed by a substantial margin and was signed into law by President Bush. No doubt more is to come.
All of this intervention, of course, is supported by our presidential candidates. Both blame Wall Street for the current problems and vow to increase the power of the nation's financial regulators, the Fed and the SEC. John McCain has announced that there are "some greedy people on Wall Street that perhaps need to be punished."2 Both he and Barack Obama envision an ever-growing role for government in the marketplace, each promises to raise taxes in some form or another, and both support more regulations, particularly on Wall Street. Few doubt they will keep these promises.
What do Americans think of all this? A recent poll by the Wall Street Journal and NBC News found that 53 percent of Americans want the government to "do more to solve problems." Twelve years earlier, Americans said they opposed government interference by a 2-to-1 margin.3
In fact, our government has been "doing more" throughout this decade. While President Bush has paid lip service to freer markets, his administration has engineered a vast increase in the size and reach of government.
He gave us Sarbanes-Oxley, the largest expansion of business regulation in decades. He gave us the Medicare prescription drug benefit, the largest new entitlement program in thirty years. He gave us the "No Child Left Behind Act," the largest expansion of the federal government in education since 1979. This is to say nothing of the orgy of spending over which he has presided: His 2009 budget stands at more than $3 trillion-an increase of more than $1 trillion since he took office.4 All of this led one conservative columnist to label Bush "a big government conservative."5 It was not meant as a criticism.
Americans entered the 21st century enjoying the greatest prosperity in mankind's history. And many agreed that this prosperity was mainly the result of freeing markets from government intervention, not only in America, but also around the world. Yet today, virtually everyone agrees that markets have failed.
Why? What happened?
To identify the cause of today's swing to the left, we need first to understand the cause and consequences of the swing to the right. . . .
The Mystical Ethics of the New Atheists
Alan Germani
Examines the moral ideas of Christopher Hitchens, Sam Harris, Daniel Dennett, and Richard Dawkins, exposes some curious truths about their ethics, and provides sound advice for theists and atheists alike who wish to discover and uphold a rational, secular morality.
Mandatory Health Insurance: Wrong for Massachusetts, Wrong for America
Paul Hsieh
Identifies the theory behind the Massachusetts mandatory health insurance program, exposes the program as a fiasco, explains why the theory had to fail in practice, and sheds light on the only genuine, rights-respecting means to affordable, accessible health care for Americans.
Deeper Than Kelo: The Roots of the Property Rights Crisis
Eric Daniels
On June 23, 2005, the United States Supreme Court's acquiescence in a municipal government's use of eminent domain to advance "economic development" goals sent shockwaves across the country. When the Court announced its decision in Kelo v. City of New London, average homeowners realized that their houses could be condemned, seized, and handed over to other private parties.1 They wanted to know what had gone wrong, why the Constitution and Fifth Amendment had failed to protect their property rights.
The crux of the decision, and the source of so much indignation, was the majority opinion of Justice John Paul Stevens, which contended that "economic development" was such a "traditional and long accepted function of government" that it fell under the rubric of "public use."2 If a municipality or state determined, through a "carefully considered" planning process, that taking land from one owner and giving it to another would lead to increased tax revenue, job growth, and the revitalization of depressed urban areas, the Court would allow it. If the government had to condemn private homes to meet "the diverse and always evolving needs of society,"3 Stevens wrote, so be it.
The reaction to the Kelo decision was swift and widespread. Surveys showed that 80 to 90 percent of Americans opposed the decision. Politicians from both parties spoke out against it. Such strange bedfellows as Rush Limbaugh and Ralph Nader were united in their opposition to the Court's ruling. Legislatures in more than forty states proposed eminent domain "reforms," and most then passed them. In the 2006 elections, nearly a dozen states considered anti-Kelo ballot initiatives, and ten such measures passed. On the one-year anniversary of the decision, President Bush issued an executive order that barred federal agencies from using eminent domain to take property for economic development purposes (even though the primary use of eminent domain is by state and local agencies).4 The "backlash" against the Court's Kelo decision continues today by way of reform efforts in California and other states.
Public outcry notwithstanding, the Kelo decision did not represent a substantial worsening of the state of property rights in America. Rather, the Kelo decision reaffirmed decades of precedent-precedent unfortunately rooted in the origins of the American system. Nor is eminent domain the only threat to property rights in America. Even if the federal and state governments abolished eminent domain tomorrow, property rights would still be insecure, because the cause of the problem is more fundamental than law or politics.
In order to identify the fundamental cause of the property rights crisis, we must observe how the American legal and political system has treated property rights over the course of the past two centuries and take note of the ideas that courts and legislatures have offered in support of their rulings and regulations.5 In so doing, we will see that the assault on property rights in America is the result of a long chain of historical precedent moored in widespread acceptance of a particular moral philosophy.
Property, Principle, and Precedent
In the Revolutionary era, America's Founding Fathers argued that respect for property rights formed the very foundation of good government. For instance, Arthur Lee, a Virginia delegate to the Continental Congress, wrote that "the right of property is the guardian of every other right, and to deprive a people of this, is in fact to deprive them of their liberty."6 In a 1792 essay on property published in the National Gazette, James Madison expressed the importance of property to the founding generation. "Government is instituted to protect property of every sort," he explained, "this being the end of government, that alone is a just government, which impartially secures to every man, whatever is his own."7
Despite this prevalent attitude-along with the strong protections for property contained in the United States Constitution's contracts clause, ex post facto clause, and the prohibition of state interference with currency-the founders accepted the idea that the power of eminent domain, the power to forcibly wrest property from private individuals, was a legitimate power of sovereignty resting in all governments. Although the founders held that the "despotic power" of eminent domain should be limited to taking property for "public use," and that the victims of such takings were due "just compensation," their acceptance of its legitimacy was the tip of a wedge.8
The principle that property rights are inalienable had been violated. If the government can properly take property for "public use," then property rights are not absolute, and the extent to which they can be violated depends on the meaning ascribed to "public use."
From the earliest adjudication of eminent domain cases, it became clear that the term "public use" would cause problems. Although the founders intended eminent domain to be used only for public projects such as roads, 19th-century legislatures began using it to transfer property to private parties, such as mill and dam owners or canal and railroad companies, on the grounds that they were open to public use and provided wide public benefits. Add to this the fact that, during the New Deal, the Supreme Court explicitly endorsed the idea that property issues were to be determined not by reference to the principle of individual rights but by legislative majorities, and you have the foundation for all that followed.9 . . .
The Menace of Pragmatism
Tara Smith
Author's note: This essay is based on a lecture delivered at the Objectivist Conference (OCON) held in Newport Beach, CA, July 2008, and retains some of the informal character of an oral presentation.
While people commonly disagree about competing world views and substantive ideologies-arguing the merits of different religious creeds or value systems, for instance, of environmentalism or dominant business practices, of volunteerism or the specifics of political platforms-many are blind to the fact that nearly all these ideologies are fueled by a single, more basic philosophy: pragmatism. People increasingly complain that political candidates are "all the same," and, in fact, many of the ideas and approaches supported by these candidates do reflect a shared method. It is important to understand this common element not simply because of the breadth of its influence, but because of its destructiveness. While pragmatism presents itself as a tool of reason and enjoys the image of mature moderation, of common sense and practical "realism," in truth, it is anything but realistic or practical. Pragmatism has become a highly corrosive force in people's thinking. And insofar as it is thinking that drives actions-the actions of individuals and, correlatively, the course of history-as long as a person or a nation is infected by a warped philosophical approach, genuine progress will be impossible.
In this essay, I seek to demonstrate the stealthy but all too real menace that pragmatism poses. Pragmatism is not a substantive set of doctrines so much as a way of thinking, a unifying approach that helps to sustain an array of doctrines that are, in their content, irrational. Because it is a method, however, and informs the way that a practitioner tackles any issue, it proves much more difficult to uproot than an erroneous conclusion. Moreover, thanks to its positive image, pragmatism tends to give harmful ideas a good name, bestowing on them the misplaced aura of reason. It thereby makes people who wish to be rational all the more susceptible to those ideas.
I will begin by clarifying exactly what pragmatism is and proceed to supply evidence of its prevalence. I will then consider the distinctive appeal of pragmatism, as well as the heart of its error-where it goes wrong. Next, I will explain its destructive impact, the principal means by which pragmatism is, indeed, corrosive. Finally, I will offer some thoughts concerning means of combating its influence.1
What Pragmatism Is
As a formal school of philosophy, pragmatism was founded by C. S. Peirce (1839-1914) in the late 19th century. Its more renowned early advocates included William James (1842-1910) and John Dewey (1859-1952). Primarily, pragmatism is a way of tackling philosophical questions. This, according to its founders, is what made pragmatism different from all previous philosophy. James wrote that pragmatism does not stand for any results or specific substantive doctrines; rather, it is distinguished by its method of "clarifying ideas" in practical terms by tracing the practical consequences of accepting one idea or another.2 The meaning and the truth of any claim depend entirely on its practical effects. The mind, accordingly, should not be thought of as a mirror held up to the external world, but as a tool whose role is not to discover, but to do, to act.3
What, then, should we make of the concept of truth?-or the concept of reality? Don't we need to respect those, in order to achieve practical consequences? Well, of course truth exists, says James, but truth is not a stagnant property. Rather, an idea becomes true-"truth happens to an idea." Truth "lives on a credit system" in his view; what a truth has going for it is that people treat it in a certain way. The true is the "expedient," "any idea upon which we can ride." Any idea is true so long as it is "profitable."4
All truths do have something in common, then, namely, "that they pay."5 The question to ask of any proposed idea is: What is its "cash value in experiential terms?"6 The traditional notion of purely objective truth, however, is "nowhere to be found."7 The world we live in is "malleable, waiting to receive its final touches at our hands."8 As Peirce memorably put it, "there is absolutely no difference between a hard thing and a soft thing so long as they are not brought to the test."9 In the view of a much more recent and influential pragmatist, Richard Rorty, truth is "what your contemporaries let you get away with."10 To call a statement true is essentially to give it a rhetorical pat on the back.11
In short, for the pragmatists, we find no ready-made reality. Instead, we create reality. Correlatively, there are no absolutes-no facts, no fixed laws of logic, no certainty. . . .
How the FDA Violates Rights and Hinders Health
Stella Daily
This article is dedicated to Anna Tomalis, a young girl who died of liver cancer on August 15, 2008. Anna's parents desperately sought experimental treatment that might have saved her life, but were delayed for months by FDA bureaucracy. Anna finally received approval to obtain treatment through a clinical trial in July, but died after receiving just one round of treatment. She was thirteen years old.
Abigail Burroughs was not the typical cancer patient: She was just nineteen years old when she was diagnosed with squamous cell cancer that had spread to her neck and lungs. Her prognosis was poor, but a then-experimental drug, Erbitux, offered the hope of saving her life.
Abigail was denied that hope by the Food and Drug Administration. Because the drug was considered experimental, she could receive it only as part of a clinical trial-and Abigail was ineligible to participate in any trials at the time. Despite the best efforts of her family, friends, and doctor, Abigail was unable to receive the treatment that might have saved her life. At twenty-one years old, Abigail died of her disease.1
Abigail's father, Frank Burroughs, thought other patients with life-threatening illnesses should not be denied the ability to try any treatment that might give them a chance. In his daughter's name, he formed the Abigail Alliance for Better Access to Developmental Drugs, which sued the FDA in 2003. The group argued that the FDA's restrictions on access to experimental treatments constitute a violation of the right to self-defense as well as of the Fifth Amendment right not to be deprived of life, liberty, or property without due process of law. In August 2007, the U.S. Court of Appeals for the District of Columbia Circuit struck a blow against the Abigail Alliance, and against individual rights, when it ruled that patients, even the terminally ill, do not have the right to receive treatment that has not been approved by the FDA.2
Erbitux has since been approved by the FDA to treat cancer of the head and neck-too late, of course, for Abigail Burroughs. How has America come to a point where the government denies dying patients the right to try to save their own lives? To answer that question, let us begin with a brief history of the Food and Drug Administration.
A Brief History of the FDA
Prior to the 20th century, the government did not regulate pharmaceutical products in the United States. Although Congress had considered federal regulations on food and drug safety as early as 1879, it had refrained from passing any legislation in this regard. However, with the muckraking journalism of the early 1900s, and especially with the publication of Upton Sinclair's novel The Jungle, which portrayed unsavory practices in the meatpacking industry, the American public clamored for laws to ensure the safe production of food and drugs. This public outcry pushed Congress to pass federal legislation in 1906. As applied to drugs specifically, the resulting Food and Drugs Act required products to be sold only at certain levels of purity, strength, and quality, and ingredients considered dangerous (such as morphine or alcohol) had to be listed on the product's label. Violators would be subject to seizure of goods, fines, or imprisonment. Thus, in order to enforce the Act, the Food and Drug Administration was born.3
In its early years, the agency focused primarily on food rather than on pharmaceuticals, but in 1937 it increased its focus on drugs after a new formulation of sulfanilamide, a drug that had previously been successfully used to treat certain bacterial infections, proved to be deadly. The drug's manufacturer, S. E. Massengill Company, had dissolved an effective drug in a toxic solvent. More than one hundred people, babies and children among them, died as a result of taking Massengill's product, known as Elixir Sulfanilamide.
Under the 1906 Food and Drugs Act, the FDA was not authorized to prosecute Massengill for selling an unsafe drug, and the agency had the power to recall Elixir Sulfanilamide only via a technicality. Because "elixir" was defined as a drug dissolved in alcohol, and because Massengill's formulation used the nonalcoholic solvent diethylene glycol, the product was technically mislabeled, bringing it under FDA jurisdiction and enabling the agency to recall the product. The public and legislators wanted more: They wanted the FDA not only to recall mislabeled products, but to prevent the sale of unsafe drugs in the first place. Thus, popular demand gave rise to the Food, Drug, and Cosmetic Act of 1938, which greatly expanded the FDA's authority.4
The most important change brought about by this Act was a shift in the burden of proof. Rather than prosecuting a drugmaker after the fact for having fraudulently marketed a product, the FDA would now require proof of safety before a drug could be marketed at all.5 (Note that this required manufacturers to prove a negative-i.e., that a given drug would not harm consumers.)
After World War II, pharmaceutical companies came under still more scrutiny. Then, as now, complaints about the cost of drugs reached Congress, and in 1961 Senator Estes Kefauver led the charge in an investigation not only of drug pricing, but of the relationship between the drug industry and the FDA. Kefauver sought to pass legislation that would increase the agency's authority over drug production, distribution, and advertising. Whereas previously proof of safety alone was required to gain FDA approval, the proposed law would require drug manufacturers also to prove the efficacy of their products.
Kefauver's bill might have languished in congressional debate but for the emergence at that time of data showing that thalidomide, which was then sold as a sleep aid and antinausea medication for pregnant women, caused severe birth defects in the children of women who took it. Thalidomide had not yet been approved for use in the United States at that time due to concerns of an FDA reviewer over a different side effect noted in the drug's application for approval. The drug was widely used in other countries, however, and the babies of many women who used it were born with grotesquely deformed limbs. As their harrowing images flooded the media, Americans realized they had narrowly escaped inflicting these deformities on their own children. The resulting public outcry led to Kefauver's bill being made law in 1962. This law served as the cornerstone for the wide powers that the FDA acquired thereafter, from requiring companies to include warnings in drug advertisements to dictating the way companies must investigate their own experimental compounds.6
Thus, although the scope and power of the FDA were modest at the agency's inception, its scope widened and its power increased markedly in the decades that followed. Now, a century later, the agency's purview includes foods and drugs for humans and animals, cosmetics, medical devices (including everything from breast implants to powered wheelchairs), blood and tissues, vaccines, and any products deemed to be radiation emitters (including cell phones and lasers). And the agency's power is nothing short of enormous. . . .
Mugged by Reality: The Liberation of Iraq and the Failure of Good Intentions by John Agresto
Elan Journo
New York: Encounter Books, 2007. 202 pp. $25.95 (cloth).
Reviewed by Elan Journo
The measure of success in the Iraq war has undergone a curious progression. Early on, the Bush administration held up the vision of a peaceful, prosperous, pro-Western Iraq as its benchmark. But the torture chambers of Saddam Hussein were replaced by the horrors of a sadistic sectarian war and a fierce insurgency that consumed thousands of American lives. And the post-invasion Iraqi regime, it turns out, is led by Islamist parties allied with religious militias and intimately tied to the belligerent Iranian regime. The benchmark, if we can call it that, then shrank to the somewhat lesser vision of an Iraqi government that can stand up on its own, so that America can stand down. But that did not materialize, either. So we heard that if only the fractious Sunni and Shiite factions in the Iraqi government could have breathing space to reconcile their differences, and if only we could do more to blunt the force of the insurgency, that would be progress. To that end, in early 2007, the administration ordered a "surge" of tens of thousands more American forces to rein in the chaos in Iraq.
Today, we hear John McCain and legions of conservatives braying that we are, in fact, winning (some go so far as to say we have already won). Why? Because the "surge" has reduced the number of attacks on U.S. troops to the levels seen a few years ago (when the insurgency was raging wildly) and the number of Iraqis slaughtering their fellow countrymen has taken a momentary dip. Victory, apparently, requires only clearing out insurgents (for a while) from their perches in some neighborhoods, even though Teheran's influence in the country grows and Islamists carve out Taliban-like fiefdoms in Iraq.
The goals in Iraq "have visibly been getting smaller," observes John Agresto, a once keen but now disillusioned supporter of the campaign (p. 172). Iraq, he argues contra his fellow conservatives, has been a fiasco. "If we call it ‘success,' it's only because we've lowered the benchmark to near zero" (p. 191). . . .
Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler and Cass R. Sunstein
Eric Daniels
New Haven: Yale University Press, 2008. 304 pp. $26.00 (cloth).
Reviewed by Eric Daniels
One of the distinguishing features of American life is the large degree of freedom we have in making choices about our lives. When choosing our diets, we have the freedom to choose everything from subsisting exclusively on junk food to consuming meticulously planned portions of fat, protein, and carbohydrate. When choosing how to conduct ourselves financially, we have the freedom to choose everything from a highly leveraged lifestyle of debt to a modest save-for-a-rainy-day approach. In every area of life, from health care to education to personal relationships, we are free to make countless decisions that affect our long-term happiness and prosperity-or lack thereof.
According to Richard Thaler and Cass Sunstein, professors at the University of Chicago and authors of Nudge: Improving Decisions About Health, Wealth, and Happiness, this freedom and range of options is problematic. The problem, they say, is that most people, when given the opportunity, make bad choices; although Americans naturally want to do what is best for themselves, human fallibility often prevents them from knowing just what that is. "Most of us are busy, our lives are complicated, and we can't spend all our time thinking and analyzing everything" (p. 22). Average Americans, say Thaler and Sunstein, tend to favor the status quo, fall victim to temptation, use mental shortcuts, lack self-control, and follow the herd; as a result, they eat too much junk food, save too little, make bad investments, and buy faddish but useless products. Many Americans, according to the authors, are more like Homer Simpson (impulsive and easily fooled) than homo economicus (cool, calculating, and rational). "One of our major goals in this book," they note, "is to see how the world might be made easier, or safer, for the Homers among us" (p. 22). The particular areas where these Homers need the most help are those in which choices "have delayed effects . . . [are] difficult, infrequent, and offer poor feedback, and those for which the relation between choice and experience is ambiguous" (pp. 77-78).
The central theme of Nudge is the idea that government and the private sector can improve people's choices by manipulating the "choice architecture" they face. As Thaler and Sunstein explain, people's choices are often shaped by the way in which alternatives are presented. If a doctor explains to his patient that a proposed medical procedure results in success in 90 percent of cases, that patient will often make a different decision from the one he would have made if the doctor had told him that one in ten patients dies from the procedure. Free markets, the authors argue, too often cater to and exploit people's tendencies to make less than rational choices. Faced with choices about extended warranties or health care plans or investing in one's education, only the most exceptional and rational people will make the "correct" choices. Most people, the authors argue, cannot avoid the common foibles of bad thinking; thus we ought to adopt a better way of framing and structuring choices so that people will be more likely to make better decisions and thereby do better for themselves. Hence the title: By presenting information in a specific way, "choice architects" can "nudge" the chooser in the "right" direction, even while maintaining his "freedom of choice."
The Terrorist Watch: Inside the Desperate Race to Stop the Next Attack by Ronald Kessler
Joe Kroeger
New York: Crown Forum, 2007. 260 pp. $26.95 (hardcover).
Reviewed by Joe Kroeger
In the years since the attacks of 9/11, there have been numerous attempts by terrorists to attack Americans on our own soil, but all of these attempts have been foiled. Who is responsible for this remarkable record, and how have they achieved it? These questions are answered in Ronald Kessler's recent book, The Terrorist Watch: Inside the Desperate Race to Stop the Next Attack, which surveys the work of the individuals involved in America's intelligence community since 9/11.
In twenty-seven brief chapters, Kessler documents the post-9/11 work of the CIA, FBI, National Security Agency (NSA), National Geospatial-Intelligence Agency (NGA), National Counterterrorism Center (NCTC), and other agencies-showing the organizational, tactical, and technological changes that have occurred, along with their positive results.
The book begins by recounting the events of September 11, 2001, from President Bush being informed of the first plane crashing into the World Trade Center, to his "We're at war" declaration, to the initial coordination of efforts among the vice president, the military, and law enforcement and intelligence agencies. Proceeding from there, Kessler shows how the CIA immediately linked some of the hijackers to Al Qaeda and how, a few days later, the president began redirecting the priorities of the FBI and the Justice Department from prosecuting terrorists to preventing attacks. . . .
The Tyranny of the Market: Why You Can't Always Get What You Want by Joel Waldfogel
Eric Daniels
Cambridge: Harvard University Press, 2007. 216 pp. $35.00 (cloth).
Reviewed by Eric Daniels
According to Joel Waldfogel, a professor of business and public policy at the Wharton School of Business, "a dominant strand of current thinking" regards markets as superior to government when it comes to providing consumers with what they want. When government undertakes the provision of goods, the products offered are limited to those that meet with the approval of the majority, whereas "[m]arkets are thought to avoid the tyranny of the majority because in markets each person can decide what she wants." According to this dominant argument, he writes, "what's available to me in markets depends only on my preferences, not on anyone else's" (p. 2).
In his recent book, The Tyranny of the Market, Waldfogel challenges this assumption. When one considers what actually happens in free markets, when one considers the products available therein, says Waldfogel, "it's clear that you can be better off in your capacity as a consumer of a particular product as more consumers share your preferences" (p. 4). In other words, you are more likely to get exactly what you want if your tastes are shared by the majority and less likely to get exactly what you want if your tastes differ from the majority. Thus, Waldfogel contends, when it comes to providing the goods that people want, "the market does not generally avoid the tyranny of the majority" any more than does a democratic political system that allocates goods (p. 6).
Waldfogel's goal is to examine "how markets actually work" in order to allow policy makers and citizens to balance the shortcomings of markets against the shortcomings of government and "to determine an appropriate mix in each arena" (p. 36). Toward this end, he leads the reader through a series of examples in which there appears to be a breakdown in the market provision of goods. . . .
First into Nagasaki: The Censored Eyewitness Dispatches on Post-Atomic Japan and Its Prisoners of War by George Weller, edited and with an essay by Anthony Weller
John David Lewis
New York: Crown Publishers, 2006. 320 pp. $25.00 (cloth), $14.95 (paperback).
Reviewed by John David Lewis
During World War II, the prime source of information for Americans about the war overseas was the dispatches of foreign correspondents-men who put their lives on the line in war zones to report the truth. George Weller was a giant among such men. Captured by the Nazis and traded for a German journalist, Weller watched the Belgian Congolese Army attack Italians in Ethiopia, saw the invasion of Crete, interviewed Charles de Gaulle in South Africa following an escape through Lisbon, and overcame malaria to report on the war in the Pacific. He was the first foreign correspondent trained as a paratrooper, and he won a Pulitzer Prize for his report of an appendectomy on a submarine. He wrote the book Singapore is Silent in 1942 after seeing the city fall to the Japanese, and he advocated a global system of United States bases in his 1943 book Bases Overseas. After witnessing Japan's surrender on September 2, 1945, he broke General Douglas MacArthur's order against travel to Nagasaki by impersonating an American colonel and taking a train to the bombed-out city. In a period of six weeks, he sent typewritten dispatches totaling some fifty thousand words back to American newspapers through official channels of the military occupation. Under MacArthur's directives, they were censored and never made it into print.
Weller died in 2002 thinking his dispatches had been lost. Months later his son, Anthony Weller, found a crate of moldy papers with the only surviving carbon copies. Anthony Weller edited the dispatches and included his own essay about his father, resulting in this priceless addition to our information about World War II in the Pacific, and the birth of the atomic age. The importance of the dispatches, however, extends far beyond the value of the information from Nagasaki. George Weller is a voice from a past generation, and the publication of his censored dispatches raises a series of deeply important issues and, in the process, reveals an immense cultural divide between his world and ours today.
On September 8, 1945, two days after he arrived in Nagasaki, Weller wrote his third dispatch concerning Nagasaki itself. He described wounded Japanese in two of Nagasaki's undestroyed hospitals, and recorded the question posed by his official guide:
Showing them to you, as the first American outsider to reach Nagasaki since the surrender, your propaganda-conscious official guide looks meaningfully in your face and wants to know: "What do you think?"
What this question means is: Do you intend writing that America did something inhuman in loosing this weapon against Japan? That is what we want you to write (p. 37).
What would many reporters today write if asked this question by bombed enemy civilians? . . .
Craig Biddle
Surveys the promises of John McCain and Barack Obama, shows that these intentions are at odds with the American ideal of individual rights, demonstrates that the cause of such political aims is a particular moral philosophy (shared by McCain and Obama), and calls for Americans to repudiate that morality and to embrace instead a morality that supports the American ideal.