Volume: 5, Issue: 1
Spring 2010
Citizens United and the Battle for Free Speech in America
Steve Simpson
Government-Run Health Care vs. the Hippocratic Oath
Paul Hsieh
When students graduate from medical school, they take an oath-the Hippocratic oath-in which they solemnly swear, above all, to use their best judgment in treating their patients. Doctors hold this oath as sacrosanct; they regard upholding it as morally mandatory, and violating it as out of the question. But in order to uphold this oath, doctors must be free to practice medicine in accordance with their best judgment. Unfortunately, U.S. politicians are working feverishly to prevent doctors from upholding the Hippocratic oath. How so? By implementing government-run health care.
Politicians' efforts to impose government-run health care center on the goal of "guaranteeing" health care to everyone. But whenever the government attempts to "guarantee" health care, it must also control the costs of that service-which means it must dictate how doctors may and may not practice. Toward this end, as Harvard professor Martin Feldstein notes, advocates of government-run health care call for "comparative effectiveness" practice guidelines. Quoting the White House Council of Economic Advisers, Feldstein points out that these guidelines are designed to ration health care and reduce spending by "implementing a set of performance measures that all providers would adopt" and by "directly targeting individual providers . . . (and other) high-end outliers."1 ("High-end outliers" is government-speak for "physicians who order more tests or perform more procedures than the government deems appropriate.")
An example of such "effectiveness" guidelines is the new federal recommendations for screening mammography. The U.S. Preventive Services Task Force (USPSTF) recently recommended restricting mammogram screening to women over age fifty, despite the fact that medical organizations such as the American Cancer Society and the American College of Radiology-whose conclusions are based on years of peer-reviewed scientific research-have long recommended that women begin routine mammography at age forty.2
The USPSTF argues that eliminating mammograms for women between ages forty and forty-nine would result in only one additional cancer death per nineteen hundred women screened-an increase in deaths that it evidently considers acceptable.3 The announcement of these new guidelines caused so much public controversy that Secretary of Health and Human Services Kathleen Sebelius quickly backpedaled and stated that these particular USPSTF recommendations would be "nonbinding."4 But what does "nonbinding" mean when it refers to the guidelines of a government agency? The government is an agent of force. Any government "recommendations" come with at least the implicit threat that recalcitrant doctors may face negative consequences.
Not surprisingly, government medical agencies have already adopted the new guidelines. The California state government has begun using the USPSTF guidelines to determine which services patients in the Medi-Cal program may and may not receive.5 (Medi-Cal is the California equivalent of Medicaid in other states.) Government-funded health programs in New York and Ohio have already begun turning away women under fifty seeking mammograms.6 And, Sebelius's reassurances notwithstanding, Congress is considering giving the USPSTF legal authority to determine which screening tests will or will not be covered for patients with private health insurance.7
How are American physicians responding to these developments? Fortunately, many have chosen to ignore the guidelines, to continue practicing according to their best medical judgment, and to order mammograms for their female patients between ages forty and fifty as they see fit.8 But bear in mind that the White House Council of Economic Advisers has already pejoratively labeled such physicians "high-end outliers." If the government decided to enforce its "comparative effectiveness" guidelines, such doctors could be punished at any moment. And bear in mind what the punishment would be for: upholding their Hippocratic oath, their promise to practice according to their best judgment for the best interests of their patients. . . .
Endnotes
Acknowledgment: I would like to thank Evan Madianos, MD, for his assistance with an early version of this article.
1 Martin Feldstein, "ObamaCare Is All about Rationing," Wall Street Journal, August 18, 2009.
2 Rob Stein, "Breast Exam Guidelines Now Call for Less Testing," Washington Post, November 17, 2009; "Can Breast Cancer Be Found Early?" American Cancer Society, September 19, 2009, http://www.cancer.org/docroot/cri/content/cri_2_4_3x_can_breast_cancer_be_found_early_5.asp; Wendie Berg et al., "Frequently Asked Questions about Mammography and the USPSTF Recommendations: A Guide for Practitioners," Society of Breast Imaging, December 11, 2009, http://www.sbi-online.org/associations/8199/files/Detailed_Response_to_USPSTF_Guidelines-12-11-09-Berg.pdf.
3 "American Cancer Society Responds to Changes to USPSTF Mammography Guidelines," November 16, 2009, http://www.cancer.org/docroot/MED/content/MED_2_1x_American_Cancer_Society_Responds_to_Changes_to_USPSTF_Mammography_Guidelines.asp.
4 Valerie Richardson, "Sebelius Shuns New Mammogram Report," Washington Times, November 19, 2009.
5 Jim Sanders, "Move to Curb Mammograms for Poorest Women Sparks Outrage," Sacramento Bee, December 16, 2009.
6 Valerie Bauman, "Poor Being Turned Away from Free Cancer Screenings," Denver Post, December 12, 2009.
7 "Who Will Determine Who Gets a Mammogram and How Often under ObamaCare?" National Center for Policy Analysis, November 24, 2009, http://www.ncpa.org/pdfs/who_determines_mammogram.pdf.
8 Pam Belluck, "Many Doctors to Stay Course on Breast Exams for Now," New York Times, November 17, 2009.
The Virtue of Treating People Like Animals: Why Human Health Care Should Mirror Veterinary Health Care
Sarah Gelberg
When my two-year-old cat, Lily, began vomiting and refused her food and water, I took her to my veterinarian, who, after a battery of X-rays and other tests, found nothing conclusive. The vet offered a preliminary diagnosis of gastritis, an inflammation of the stomach lining, and sent us home with medication to treat the condition. When twenty-four hours of the treatment yielded no improvement, we returned to the vet, who admitted Lily for observation overnight. The next evening, the vet phoned to say: "Lily is still vomiting and refusing food and water, so we ran a second set of X-rays and a comparison of the two sets revealed that her intestines are bunching as if something's lodged inside. There's an emergency veterinary clinic twenty miles away that has an ultrasound machine, which will enable us to see what's inside. Please come pick up Lily and drive her there; we'll notify them that you're on your way."
The ultrasound revealed a large quantity of thread tangled in Lily's digestive tract. Unbeknownst to me, she had extracted a bobbin of thread from my sewing kit and swallowed the contents. The condition required surgery, which the vet at the emergency clinic performed that night, removing the thread (which was lodged in Lily's stomach, small intestine, and large intestine) without complications. Lily remained in intensive care for two days before the vet sent her home with a scar on her stomach, some antibiotics, and a list of instructions for postoperative care. She recovered fully and was back to mischief in short order.
As this story indicates, the state of animal health care in America, in terms of the quality of the diagnostics and treatments available, is in many ways on par with that of human health care. And the fact that advancements in veterinary medicine have progressed in close parallel with those in human medicine should come as little surprise: Animals are important to us. They provide us with, among other things, food, labor, and companionship. To ensure that our animals are tasty, reliable, healthy, and happy, we need the services of well-trained veterinarians equipped with the latest technologies. That demand is nicely satisfied.
Most veterinarians in private practice specialize in either large-animal or small-animal medicine, a division that roughly corresponds to the distinction between livestock, such as cows and sheep, and companion animals, such as dogs and cats. Small-animal veterinary medicine is, in important respects, remarkably similar to human medicine. The skills required in small-animal medicine are, by and large, the same as those required in human medicine,1 and today's veterinary schools are every bit as rigorous as their counterparts in human medicine. After earning their undergraduate degrees, veterinary students must complete four years of medical training and then pass national and state licensure exams. Those who choose to become specialists must also complete an internship and residency and pass an examination for their chosen specialty.2
The technologies used by veterinarians and those used by medical doctors are similar as well. Vets use many of the same drugs as medical doctors, albeit in different concentrations, doses, and formulations;3 and their facilities are equipped with essentially the same kind of medical equipment to treat essentially the same kinds of medical problems. In fact, a great deal of the medical equipment used in veterinary medicine, including surgical instruments, common devices such as stethoscopes, and CT scan machines, is either identical to that used in human medicine or downsized to accommodate the smaller size of most pets.4 In the United States, advancements in human medicine-whether in training, medications, or facilities-are generally mirrored in small-animal veterinary medicine.
Fortunately for our pets, however, veterinary medicine has not paralleled human medicine in two important respects: accessibility and affordability. . . .
Endnotes
1 "Animal Health: Veterinarians," American Veterinary Medical Association, revised February 2009, http://www.avma.org/animal_health/brochures/veterinarian/veterinarian_brochure.asp.
2 "Becoming a Veterinarian FAQs," Aardvarks to Zebras, http://aardvarks2zebras.org/becoming-a-veterinarian/becoming-a-veterinarian-faqs/.
3 Kara Rogers, "The Animals' Medicine Cabinet: Human Drugs and Clinical Trials for Animals," Encyclopædia Britannica's Advocacy for Animals, May 18, 2009, http://advocacy.britannica.com/blog/advocacy/2009/05/the-animals%E2%80%99-medicine-cabinet-human-drugs-and-clinical-trials-for-animals/.
4 See http://www.spectrumsurgical.com/index.php or http://www.medical-tools.com/index.php for comparison pricing between vet and human surgical instruments and tools.
The Practicality of Private Waterways
Alan Germani, J. Brian Phillips
For centuries, few have questioned the idea that waterways-streams, rivers, lakes, and oceans-are or should be "public property." The doctrine of "public trust," with roots in both Roman and English common law, holds that these resources should not be privately owned but rather held in trust by government for use by all. The United States Supreme Court cited this doctrine in 1892, ruling that state governments properly hold title to waterways such as lakes and rivers, "a title held in trust for the people of the state that they may enjoy the navigation of the waters, carry on commerce over them, and have liberty of fishing therein freed from the obstruction or interference of private parties."1
This "public ownership," however, is increasingly thwarting the life-serving nature of waterways as sources of drinking water, fish, and recreation. Predictably, when a resource-whether a park, an alleyway, or a pond-is owned by "everyone," its users have less incentive to protect or improve its long-term value than they would if it were owned by an individual or a corporation. Users of "public property" tend to use the resource for short-term gain, often causing the deterioration of its long-term value-the well-known "tragedy of the commons." This phenomenon is perhaps nowhere clearer than in the case of waterways.
"Public ownership" of waterways has led to, among other problems, harmful levels of pollution and depleted fish populations. Many waterways around the world have become so polluted that they are no longer fit for human use. In 2004, the Environmental Protection Agency reported that one-third of America's lakes and nearly one-fourth of its rivers were under fish-consumption advisories due to polluted waters.2 In 2005, officials in China estimated that 75 percent of that nation's lakes were contaminated with potentially toxic algal blooms caused by sewage and industrial waste.3 And the World Commission on Water has found that half the world's rivers are either seriously polluted or running dry from irrigation and other human uses or both.4 By one estimate, the contaminated drinking water and poor sanitation that result from pollution and low water levels account for five to ten million deaths per year worldwide.5
In addition to containing harmful levels of pollution, many of the world's waterways are being fished in a manner that is depleting fish populations and threatening with extinction fish species such as red snapper, white sturgeon, and bluefin tuna-species highly valuable to human life.6 By 2003, primarily due to fishing practices associated with public waterways, 27 percent of the world's fisheries (zones where fish and other seafood are caught) had "collapsed"-the term used by scientists to denote fish populations that drop to 10 percent or less of their historical highs.7 In 2006, the journal Science published a study that offered a grim prediction: All of the world's fisheries will collapse by 2048.8 Whether or not all of the world's fisheries will collapse in a mere forty years, the data clearly show that current fishing practices are depleting supplies of many species of consumable fish. At best, at the current rate of fish depletion, many fishermen will lose their livelihoods and consumers will have fewer and fewer species from which to choose, species that will become more and more expensive.
What solutions have been proposed? Federal and state governments have attempted to remedy these problems through regulation-violating rights and creating new problems in the process. For example, twenty-five states prohibit or severely restrict the use of laundry detergents containing phosphates, substances that harm aquatic life when present in water in high quantities.9 A growing number of state and local governments-including Westchester County, New York, and Annapolis, Maryland-are enacting similar regulations on phosphate-containing fertilizers.10 These laws violate the rights of detergent and fertilizer manufacturers by precluding them from creating the products they choose to create-and they violate the rights of consumers who want to buy such products rather than more-expensive, less-effective alternatives. Further, these rights-violating prohibitions have proven impractical in achieving their purpose: Despite many such regulations having been in effect for nearly forty years,11 an estimated two-thirds of America's bays and estuaries still contain harmful amounts of phosphates.12
Regulations regarding sewage treatment have proven similarly impractical: Since 1972, the federal government has forced water utilities to spend billions of dollars upgrading water treatment facilities, and yet, during the past four years, record numbers of beaches have closed due to pollution from sewage.13 And, for what it is worth, the EPA predicts that by 2016 American rivers will be as polluted by sewage as they were in the 1970s.14
Government efforts to address depleted fish populations have proven similarly impractical. The history of the halibut industry in Alaska is an illuminating case in point. In the 1970s, the International Pacific Halibut Commission (IPHC)-a U.S.-backed intergovernmental regulatory agency-established a five-month fishing season in public waters off the Alaskan coast with the hope of maintaining halibut populations, which had become severely depleted. But forcibly limiting the time during which fishermen could operate did little to improve the fishery's viability: Fishermen simply worked more vigorously during the season, and the halibut population remained at historically low levels. So, in the 1980s, the IPHC attempted to remedy the problem by reducing the five-month fishing season dramatically-to as few as two days.15 During these shortened windows of opportunity, fishermen took extreme risks to maximize their catches, only to be "rewarded" onshore with the plummeting prices of a glutted market. And, in the end, the huge catches brought in by fishermen on these days were still large enough to jeopardize the halibut population.16 So, in 1995, the IPHC dropped the idea of a short fishing season and instead introduced a "catch share program," through which it limits each fisherman's yearly catch to a percentage of what it deems to be a "safe" overall halibut harvest. But neither has this policy helped the situation; today, after more than two decades of shifting regulations, the usable halibut population in Alaskan waters is smaller than it was in 1985.17
Although some claim that still more government regulations are required to combat the ongoing problems of pollution and depleted fish populations, any such coercive measures are in principle doomed to failure because they attempt to treat problems in the waterways while ignoring their actual cause: "public ownership." Government force may provide a disincentive for certain behaviors, but this disincentive does not motivate the users of waterways to maintain or enhance the life-serving value of these resources. As a result, America's waterways remain largely and significantly polluted, and fish populations, even where they are stabilizing, remain at levels insufficient to meet the growing demand for seafood. . . .
Endnotes
The authors would like to thank Craig Biddle, Dwyane Hicks, and Thomas A. Bowden for discussions that aided the authors' understanding of the issues discussed in this article, and Matthew Gerber, Ben Bayer, and Steve Simpson for helpful comments on earlier drafts.
1 Illinois Central R.R. Co. v. Illinois, 146 U.S. 387, 452 (1892).
2 Jaime Holguin, "Pollution Overtaking Lakes, Rivers," CBSNews.com, http://www.cbsnews.com/stories/2004/08/24/tech/main638130.shtml.
3 Antoaneta Bezlova, "China's Toxic Spillover," Asia Times, December 2, 2005, http://www.atimes.com/atimes/China_Business/GL02Cb06.html. When consumed by fish, shellfish, and livestock, such hazardous algae can enter the human food chain.
4 Mary Dejevsky, "Half of World's Rivers Polluted or Running Dry," The Independent, November 30, 1999; http://www.independent.co.uk/news/world/half-of-worlds-rivers-polluted-or-running-dry-1129811.html.
5 http://www.grinningplanet.com/2005/07-26/water-pollution-facts-article.htm.
6 http://www.nmfs.noaa.gov/fishwatch/species/red_snapper.htm; species list from the U.S. Fish and Wildlife Service, http://ecos.fws.gov/tess_public/SpeciesReport.do?groups=E&listingType=L&mapstatus=1; http://news.nationalgeographic.com/news/2006/07/060724-bluefin-tuna.html.
7 "Catch Shares Key to Reviving Fisheries," Environmental Defense Fund, http://www.edf.org/article.cfm?contentID=8446.
8 Cornelia Dean, "Study Sees 'Global Collapse' of Fish Species," New York Times, November 3, 2006, http://www.nytimes.com/2006/11/03/science/03fish.
9 http://enviro.blr.com/enviro_docs/88147_9.pdf.
10 Juli S. Charkes, "Board Votes to Ban Phosphate Fertilizers," New York Times, May 1, 2009, http://www.nytimes.com/2009/05/03/nyregion/westchester/03lawnwe.html; Karl Blankenship, "Annapolis to Ban Use of Fertilizer with Phosphorus in Most Cases," Bay Journal, http://www.bayjournal.com/article.cfm?article=3511.
11 Michael Hawthorne, "From the Archives: Banned in Chicago but Available in Stores," Chicago Tribune, April 4, 2007, http://www.chicagotribune.com/news/local/chi-daley-phosphates,0,2871187.story.
12 http://www.grinningplanet.com/2005/07-26/water-pollution-facts-article.htm.
13 http://www.nrdc.org/water/oceans/ttw/titinx.asp and http://epa.gov/beaches/learn/pollution.html#primary.
14 Martha L. Noble, "The Clean Water Act at 30-Time to Renew a Commitment to National Stewardship," Catholic Rural Life Magazine, vol. 45, no. 2, Spring 2003, http://www.ncrlc.com/crl-magazine-articles/vol45no2/Noble.pdf.
15 http://www.fishex.com/seafood/halibut/halibut.html.
16 Halibut populations continued to decline, and the IPHC decreased the allowed catch by more than 26 percent between 1986 and 1995. http://www.iphc.washington.edu/halcom/commerc/limits80299.htm.
17 The total catch share for halibut-which is based on "exploitable biomass"-declined between 1985 and 2009. For 1985 limits, see http://www.iphc.washington.edu/halcom/commerc/limits80299.htm. For 2009 limits, see http://www.iphc.washington.edu/halcom/newsrel/2009/nr20090120.htm.
Norman Borlaug: The Man Who Taught People To Feed Themselves
Audra Hilse
In 1970, the Nobel Peace Prize was awarded to a man named Norman Borlaug. His achievement? Saving hundreds of millions of people from death by starvation. Yet today, few people in America and the West even know his name. This is unfortunate, for his story is heroic.
Borlaug was a geneticist and plant pathologist who discovered ways to produce hardier and faster-growing varieties of wheat and other grains, brought these methods to various parts of the world, and taught people how to implement them. Thanks to his work, farmers and agriculturalists were-and are-able to produce orders of magnitude more food than they could prior to his discoveries.
Borlaug was born in Iowa on March 25, 1914. His parents were farmers, and he was educated in a one-room schoolhouse through the eighth grade. He did well in high school, and wanted to pursue a college degree. In 1933, on the recommendation of a friend, and despite the onset of the Depression, he hitched a ride north to enroll at the University of Minnesota. He started in the General College, and later chose forestry as his major. He earned his degree in 1937, and was planning to enter the Forest Service until he attended a lecture presented by Dr. E. C. Stakman, a plant pathologist. That talk, Borlaug later said, "changed my life, my whole career."1
Stakman's lecture, "These Shifty Little Enemies that Destroy our Food Crops," discussed the spread of plant "rust" that was killing off grains across the United States.2 Borlaug was so fascinated by the subject that, instead of joining the Forest Service, he enrolled in the university's graduate program for plant pathology, where he proceeded to earn both a master's degree (1940) and a doctorate (1942).
After receiving his doctorate, Borlaug took a job as a microbiologist with the DuPont de Nemours Foundation, but he did not stay there long.3 In September 1943, the Rockefeller Foundation offered him a position running a joint program with the Mexican government, helping Mexican farmers to improve agricultural technology and increase their wheat production. Borlaug accepted the job, moved to Mexico with his wife and children, and launched the Cooperative Wheat Research and Production Program. . . .
Endnotes
1 Vicki Stavig, "Bread and Peace," Minnesota, January-February 2004, http://www.alumni.umn.edu/Bread_and_Peace (accessed December 29, 2009).
2 Mark Stuertz, "Green Giant," Dallas Observer, December 5, 2002, http://www.dallasobserver.com/2002-12-05/news/green-giant/ (accessed December 29, 2009).
3 Stavig, "Bread and Peace" (accessed January 6, 2010).
Making Life Meaningful: Living Purposefully
Craig Biddle
Author's note: This is chapter 5 of my book Loving Life: The Morality of Self-Interest and the Facts that Support It (Richmond: Glen Allen Press, 2002), which is an introduction to Ayn Rand's morality of rational egoism. Chapters 1-4 were reprinted in the prior four issues of TOS.
In chapter 4, we saw the life-or-death importance of productive work and, more fundamentally, of rational thinking. We also discovered what emotions are, where they come from, and what they mean. Finally, we observed and contrasted the crucial yet distinct roles of reason and emotion in human life and happiness. We will now capitalize on these truths. In this chapter, we turn to the question of how to make life meaningful. And the key word here is: make.
Life does not come with ready-made meaning; we are not born with pre-packaged purpose. If we want our life to be meaningful, we have to make it so.
Our life is a process of self-generated, goal-directed action-action that, because we have free will, is generated by us toward goals chosen by us. The meaning of our life is a function of the goals we choose to pursue-that is, our purposes.
A purpose is a conscious, intentional goal-a goal chosen and pursued for a desired outcome. A rational purpose is a purpose that promotes one's life-such as getting an education, developing a career, engaging in a hobby, building a romantic relationship, or raising one's children. These are the kinds of goals that make life meaningful.
For example, consider a college student who chooses his major carefully, goes to class regularly, and takes his studies seriously. He is selfishly after something; he is acting purposefully toward a life-promoting end. In so doing, he adds meaning to his life in the form of value-achievements-such as increased knowledge, improved judgment, and an earned diploma. By contrast, consider a college student who picks a major at random, frequently skips class to "hang out" in the coffee shop, and studies just enough to "get by." He is not selfishly after anything; he is not acting purposefully toward a life-promoting end. Consequently, he achieves nothing of value; he adds no meaning to his life. Even if he happens to receive a diploma, it will be meaningless, because he did not put anything into it; he did not earn it. Meaningful values are products of purposeful efforts. They have to be earned.
In regard to career, suppose a young office clerk decides that he wants to manage the company for which he works. He commits himself to learning everything he can about the business, constantly asks himself what can be done to improve operations, develops innovative ideas, presents them to his superiors, and seizes every opportunity to excel. Not surprisingly, over the course of some interesting, action-packed years, he makes his way to the top-where he does not stop: Once there, he strives to take the company to ever greater heights. Here is a person acting purposefully and, as a result, making his days and years exciting, inspiring, and rewarding-filling his life with meaning.
Now, contrast him to a young office clerk with the same potential, but who sets no such goals, takes no such actions, and stagnates as a clerk for the rest of his life. What will be the meaning of his days and years? What spiritual values will he achieve by means of his lethargy? The answer is obvious.
The meaning of one's life is determined by the choices one makes and the effort one exerts. Whether one's life is meaningful or meaningless depends on whether or not one chooses to be rational and purposeful.
Of course, irrational choices and actions may be said to have negative meaning-in that they have anti-life consequences. But this does not grant them any moral validity. Taking life-destroying actions is not a means to an "alternative lifestyle." Acting against one's life and long-term happiness is not another way to live; it is only a way to die.
Observe further, in this connection, that there is no such thing as a "neutral" goal or value. . . .
Infidel by Ayaan Hirsi Ali
Heike Larson
Infidel is a heroic, inspiring story of a courageous woman who escapes the hell of a woman's life in the Muslim world and becomes an outspoken and blunt defender of the West. Ms. Hirsi Ali takes the reader on her own journey of discovery, and enables him to see, through concretes and by sharing her thought processes, how she arrived at the conclusion that Islam is a stagnant, tyrannical belief system and that the Enlightenment philosophy of the West is the proper system for human beings.
In Part I, Ms. Hirsi Ali describes her childhood in Muslim Africa and the Middle East. With her father imprisoned for opposing Somalia's communist dictator Siad Barré and her mother often preoccupied with finding food for her family, young Ayaan and her siblings grew up listening to the ancient legends their grandmother told them-legends glorifying the Islamic values of honor, family clans, physical strength, and aggression. Born in 1969 in Somalia, Ms. Hirsi Ali moved frequently with her family to escape persecution and civil war, living in Saudi Arabia, Ethiopia, and Kenya. At a colonially influenced Kenyan school, she discovered Western ideas, in the form of novels, "tales of freedom, adventure, of equality between girls and boys, trust and friendship. These were not like my grandmother's stark tales of the clan, with their messages of danger and suspicion. These stories were fun, they seemed real, and they spoke to me as the old legends never had" (p. 64).
Forced into an arranged marriage, she was shipped to Germany to stay with distant family while awaiting a visa for Canada to join the husband she didn't know. At age twenty-two, alone and with nothing but a duffle bag of clothes and papers, she took a train to Holland to escape the dreary life of a Muslim wife-slave. "It was Friday, July 24, 1992, when I stepped on the train. Every year I think of it. I see it as my real birthday: the birth of me as a person, making decisions about my life on my own" (p. 188).
In Part II, Ms. Hirsi Ali shares her wonder at arriving in modernity, and her relentless effort to create a productive, independent life for herself. After being granted asylum, she worked menial jobs, learned Dutch, became a Swahili translator, earned a vocational degree, and finally graduated with a degree in political science from one of Holland's most prestigious universities. An outspoken advocate of the rights of Muslim women, she was elected to the Dutch parliament in 2003, as a "one-issue politician"-she "wanted Holland to wake up and stop tolerating the oppression of Muslim women in its midst" and to "spark a debate among Muslims about reforming aspects of Islam so people could begin to question" (p. 295). She became a notorious critic of Islam, at one point daring to call the Prophet Muhammad a pervert for consummating marriage with one of his many wives when she was only nine years old. In 2004, she made a short film called Submission: Part 1 in which she depicted women mistreated under Islamic law raising their heads and refusing to submit any longer. Tragically, the film's producer, Theo van Gogh, was brutally murdered by an offended Muslim, who left on van Gogh's body a letter threatening Ms. Hirsi Ali with the same fate. Since 2004, Ms. Hirsi Ali has had to live under the constant watch of bodyguards, often going into hiding for months at a time.
Although the straight facts of her life are in and of themselves admirable, Ms. Hirsi Ali's intellectual journey as presented in Infidel is truly awe-inspiring. This journey begins in Africa in the disturbingly dark world of Islam-with its disdain for thought and reason, its self-sacrificial ethics, and its corrupt, tyrannical politics-and ends in the West with her having become an outspoken champion of reason and freedom. . . .
Winning the Unwinnable War: America's Self-Crippled Response to Islamic Totalitarianism, edited by Elan Journo
Grant W. Jones
In Winning the Unwinnable War, editor Elan Journo and fellow contributors Yaron Brook and Alex Epstein consider the ideas and events that led to 9/11 and analyze America's response. Arguing that our nation has been made progressively less secure by policies based on "subordinating military victory to perverse, allegedly moral constraints" (p. ix), they offer an alternative: grounding American foreign policy on "the moral ideal of rational self-interest" (p. 188). This they accomplish in the space of seven chapters, divided into three sections: "Part One. The Enemy," "Part Two. America's Self-Crippled Response to 9/11," and "Part Three. From Here, Where Do We Go?"
In Part One, in a chapter titled "What Motivates the Jihad on America," Journo considers the nature of the enemy that attacked America on 9/11. With refreshing honesty, Journo dispenses with the whitewashing that often accompanies discussions of Islam and Jihad, pointing out that the meaning of "Islam" is "submission to Allah" and that its nature "demands the sacrifice of not only the mind, but also of self" (p. 33). Says Journo, the Jihadists seek to impose Allah's will-Islamic Law-just as Islamic teaching would have it: by means of the sword. "Islamic totalitarians consciously try to model themselves on the religion's founder and the figure who is held to exemplify its virtues, Muhammad. He waged wars to impose, and expand, the dominion of Islam" (p. 35).
In "The Road to 9/11," Journo summarizes thirty years of unanswered Jihadist aggression, beginning with the Iranian takeover of the American embassy in Tehran in 1979. Throughout, Journo criticizes the idea that influenced the actions of America's leaders during this time-"realism"-which he describes as eschewing "[m]oral ideals and other broad principles" in favor of achieving narrow, short-range goals by sheer expediency (p. 20). Because of the nature of their own ideas, says Journo, realists are incapable of understanding the Jihadists and thus incapable of understanding how to act with respect to them. "The operating assumption for realist policymakers is that (like them) no one would put an abstract, far off ideal ahead of collecting some concrete, immediate advantage (money, honor, influence). So for realists, an enemy that is dedicated to a long-term goal-and thus cannot be bought off with bribes-is an enemy that must remain incomprehensible" (p. 21). Journo indicates how realism was applied to the Islamist threat in the years leading up to 9/11:
Facing the Islamist onslaught, our policymakers aimed, at most, to manage crises with range-of-the-moment remedies-heedless of the genesis of a given crisis and the future consequences of today's solution. Running through the varying policy responses of Jimmy Carter, Ronald Reagan, George H.W. Bush, and Bill Clinton there is an unvarying motif. . . . Our leaders failed to recognize that war had been launched against us and that the enemy is Islamic totalitarianism. This cognitive failure rendered Washington impotent to defeat the enemy. Owing to myopic policy responses, our leaders managed only to appease and encourage the enemy's aggression (p. 6).
After 9/11, President George W. Bush shied away from the realist policy of passively reacting to the ever-escalating Islamist threat-and instead adopted the foreign policy favored by neoconservatives. "In place of ‘realism,' neoconservatives advocated a policy often called ‘interventionism,' one component of which calls for America to work assertively to overthrow threatening regimes and to replace them with peaceful ‘democracies'" (p. 118). Two chapters of Winning the Unwinnable War are devoted to dissecting this policy, "The ‘Forward Strategy' of Failure" by Brook and Journo (first published in TOS, Spring 2007) and "Neoconservative Foreign Policy: An Autopsy" by Brook and Epstein (first published in TOS, Summer 2007).
In the first of these chapters, Brook and Journo consider Bush's interventionist plan, the "forward strategy of freedom." On the premise that democracies do not wage wars of aggression, Bush launched two campaigns of democratic state building in the Middle East-in Afghanistan and Iraq. In 2003, Bush declared, "Iraqi democracy will succeed-and that success will send forth the news, from Damascus to Tehran-that freedom can be the future of every nation" (p. 54). But neither Iraqi freedom nor American security was achieved by Bush's "forward strategy" of enabling Iraqis and Afghans to vote. Because of democratic elections, Iraq "is [now] dominated by a Shiite alliance led by the Islamic Dawa Party and the Supreme Council for Islamic Revolution in Iraq (SCIRI)" (p. 54), and a "further effect of the elections in the region has been the invigoration of Islamists in Afghanistan" (p. 57). . . .
Why Are Jews Liberals? by Norman Podhoretz
Gideon Reich
Norman Podhoretz, Jewish neoconservative and former editor-in-chief of Commentary magazine, attempts in his book Why Are Jews Liberals? to explain the perplexing commitment of American Jews to modern liberalism. Jews, according to Podhoretz, violate "commonplace assumptions" about political behavior, such as that "people tend to vote their pocket books"; they "take pride . . . in their refusal to put self-interest . . . above the demands of ‘social justice'"; and they have consistently sided with the left in the "culture war" (pp. 2-3). According to statistics cited by Podhoretz, 74 percent of Jews support increased government spending and, since 1928, on average, 75 percent have voted for candidates of the Democratic Party.
Such political behavior "finds no warrant either in the Jewish religion or in the socioeconomic condition of the American Jewish community" (p. 3), argues Podhoretz; it can be explained only by realizing that Jews are treating liberalism as a "religion . . . obdurately resistant to facts that undermine its claims and promises" (p. 283).
Podhoretz traces the prevalent political orientation of present-day Jews to conditions suffered by their Jewish ancestors in medieval Europe and later in the United States. During the Dark and Middle Ages, Christian authorities in Europe placed severe restrictions on Jews, including where they could live and what professions they could practice. In later centuries, as the influence of Christianity declined, liberal revolutions swept much of the European continent, and, in the 19th century, Western European governments began recognizing the rights of Jews and treating them as equal under the law (p. 57). Even so, conservative Christians, who still supported the monarchies, remained opposed to the "emancipation" of the Jews (pp. 55-57). Consequently, Jews entered politics in Europe almost exclusively as liberals, in opposition to the Christian right that had oppressed them and their ancestors (pp. 58-59).
Governments in Eastern Europe and Russia, however, continued to persecute Jews well into the early 20th century (pp. 65-67), and, between 1881 and 1924, two million Jews immigrated to America, where they would be treated equally before the law. Most were poor, and few ventured out of Lower East Side Manhattan, where the majority found jobs in the textile industry, working more than sixty hours a week for low wages, and where even "modest improvements in their condition" were achieved only by the efforts of a Jewish labor movement (pp. 99-100). . . .
Capitalism Unbound: The Incontestable Moral Case for Individual Rights by Andrew Bernstein
Ari Armstrong
With Congress debating far-reaching bills to expand federal control of health care, politicians and pundits blaming the economic downturn on allegedly free markets, President Obama fulfilling his promise to "spread the wealth around," and dozens of czars overseeing wide swaths of American life, it seems that capitalism is in retreat. A rousing defense of capitalism, therefore, could not have come at a better time, and that is what Andrew Bernstein provides in his new book, Capitalism Unbound. Bernstein ably defends the achievements of the Industrial Revolution, presents the moral foundation for capitalism, skewers socialism, and indicates in some respects how several disasters-including the recent housing bust-were caused by government meddling in the economy.
Capitalism Unbound is an updated and highly condensed version of Bernstein's 2005 book, The Capitalist Manifesto: The Historic, Economic and Philosophic Case for Laissez-Faire. With the new book, Bernstein promises "the essential points-presented in a simple, easy to read format" (p. ix).
He begins his sixteen-page Prologue, "The Primordial Struggle for Individual Liberty," by mentioning that capitalism rests on the "moral code . . . of an individual's inalienable right to his own life" (p. 1). After recounting the American Revolution as a key example of the furthering of individual rights, Bernstein applies the principle of rights to issues such as contracts, property, and employment. He then defines some key terms, including capitalism ("the system of individual rights, including property rights, in which all property is privately owned"), freedom (protection "against the initiation of force by either private citizens or the government"), and statism ("the subordination of the individual to the state [and] the repudiation of inalienable individual rights") (pp. 10-11). The prologue concludes with a discussion of some of history's most horrifying instances of statism, including tribal dictatorships, Soviet communism, National Socialism, and Islamic theocracy.
The rest of the book is divided into three parts, about the historical, moral, and economic superiority of capitalism, respectively.
In Part One, "The Historic Superiority of Capitalism," Bernstein first summarizes the impoverished conditions of preindustrial Europe. He then explains how, inspired by Enlightenment thinkers, innovators of 18th-century England and 19th-century America achieved profound advances in technology and economic production, created goods and services that radically improved the living conditions of the common person, and often amassed fortunes in the process. These productive giants include steam engineer James Watt, steel titan Andrew Carnegie, and oil pioneer John D. Rockefeller, who by the height of his dominance had driven oil prices from fifty-eight cents to eight cents per gallon (p. 52). Bernstein reviews many of the economic advances of the Industrial Revolution, such as the enormous expansion of cotton cloth-spun English cotton increased twenty-four-fold between 1765 and 1784 alone-enabling "hundreds of millions of people worldwide . . . to dress . . . comfortably, cleanly, and hygienically" (pp. 34-35, emphasis removed). . . .
Essays on Ayn Rand's Atlas Shrugged, edited by Robert Mayhew
Daniel Wahl
Since it was first published in 1957, Ayn Rand's Atlas Shrugged has sold more than seven million copies. Remarkably, in that time, the popularity of the novel has actually grown, and, over the past couple of years, due to the parallels between the story and recent events, sales have skyrocketed. In 2009 alone, Atlas sold more than a half million copies. And now a new resource is available for the novel's legion of fans: Essays on Ayn Rand's Atlas Shrugged, edited by Robert Mayhew.
In "The Part and Chapter Headings of Atlas Shrugged," Onkar Ghate indicates why such a collection is of value to fans of the novel: "At over a thousand pages long and dealing with the fate of a civilization, Atlas Shrugged is a story of incredible scope and complexity."
Its theme is the role of man's reasoning mind in achieving all the values of his existence. Its plot is driven by a central question, a seeming contradiction: If the men of the mind are the creators and sustainers of man's life, why do they continually lose their battles and witness their achievements siphoned off and destroyed by men who have abandoned their minds? The story focuses on how the men of the mind learn to ask and to answer this question, thereby putting a stop to their own exploitation.
To resolve the apparent contradiction demands of the heroes a ruthless commitment to logic: to identify the problem, learn its fundamental cause, and grasp the path to its solution. To liberate themselves, the men of the mind must discover, understand, and then practice a new set of philosophical principles. And for us as readers to really appreciate the story's progression, the same exacting logical focus is demanded of us (p. 1).
Assuming "familiarity with the events of each part and each chapter," Ghate highlights "how the part and chapter headings help integrate those events and thus enable [readers] to gain a deeper understanding of the story and its meaning" (p. 2).
Gregory Salmieri presents two essays (either of which could be reason enough to purchase the book) elucidating the theme of the novel, which Rand specified as "the role of the mind in man's existence" (p. 219), and discussing how the novel constitutes "the demonstration of a new moral philosophy" (p. 398). . . .
The Sparrowhawk Series by Edward Cline
Dina Schein Federman
The founding of the United States was among the most dramatic and glorious events in history. For the first time, a nation was founded on the principle of individual rights. Those interested in learning about America's founding and its cause may turn to history texts. But history texts, even when their content is accurate, tend to be dry accounts of events. They lack the excitement of an adventure novel. Yet most novels set in the Revolutionary period are not good sources of information: Being works of fiction, they may take liberties with historical fact; and they often employ the American Revolution merely as their setting, not as their focus. What if one could find a work that combined the accuracy of a well-researched historical work with the dramatic presentation of a work of fiction? Fortunately, such a combination exists-the Sparrowhawk series of six novels by Edward Cline.
Cline's purpose in this series is to dramatize America's founding:
Most nations can claim a literature . . . that dramatizes the early histories of those countries. . . . But, except for a handful of novels that dramatize . . . specific periods of events in American colonial history, America has no such literature. The Sparrowhawk series of novels represents, in part, an ambitious attempt to help correct that deficiency (foreword to Book Three, p. ix).
The series is set in the decades preceding the Revolution, beginning in the 1740s in England and concluding in 1775 in colonial Virginia. Throughout, the books dramatize important events leading up to the war, such as the liberty-constraining acts of British Parliament against the colonies and the colonial response to them.
But Cline's theme is that the fundamental cause of America's declaration of independence from Great Britain was not mere events, but certain ideas Americans held. "[D]oing justice . . . to the founding of the United States . . . has meant understanding, in fundamentals, what moved the Founders to speak, write, and act as they did. Those fundamentals were ideas" (foreword to Book Three, p. ix). According to Cline, these fundamental ideas led inexorably to the Revolution: "The juggernaut of Parliamentary supremacy collided with the American colonies' incorruptible sense of liberty, which could be neither crushed nor flung aside. The result was a spectacular explosion: the American Revolution" (foreword to Book Six, p. 1). . . .
Born to Run: A Hidden Tribe, Superathletes, and the Greatest Race the World Has Never Seen by Christopher McDougall
Daniel Wahl
Christopher McDougall's Born to Run: A Hidden Tribe, Superathletes, and the Greatest Race the World Has Never Seen inspires by showing superathletes running hundreds of miles nonstop across treacherous terrain-and argues that these seemingly superhuman feats can be achieved and enjoyed by practically anyone who excises self-imposed limitations.
Readers first see these superathletes in action in the 1994 running of a brutal "ultra-race" called the Leadville Trail 100. McDougall describes participation in the Leadville 100, high in the mountains of Colorado, as "running the Boston Marathon two times in a row with a sock stuffed in your mouth"-then hiking "to the top of Pike's Peak"-and then doing it all over again, "this time with your eyes closed."
That's pretty much what the Leadville Trail 100 boils down to: nearly four full marathons, half of them in the dark, with twin twenty-six-hundred-foot climbs in the middle. Leadville's starting line is twice as high as the altitude where planes pressurize their cabins, and from there you only go up (p. 60).
In the 1994 running, a small group of runners from a secretive tribe based in Mexico's Copper Canyon participated. Called the Tarahumara, they ranged in age from 25 to 42 and ran in sandals, laughing as they left checkpoints more than halfway through the race. They completely dominated the Leadville, finishing first, third, fourth, fifth, seventh, tenth, and eleventh.
The question of how these "stone-age guys in sandals" were able to do it-all while looking as though they were simply having fun-has captivated both the ultra-running world and McDougall. The explanation McDougall posits was discovered-or, according to the book, rediscovered-by modern doctors, track coaches, anthropologists, and ultra-runners. According to McDougall and the maverick scientists he quotes, we are all born to run, and very long distances at that. McDougall argues that the seemingly superhuman feats displayed by the Tarahumara are achievable by us all if we remove the barriers that hold us back, including the modern "junk food" diet and mental limitations with regard to age and sex.
Your Inner Fish: A Journey Into the 3.5-Billion-Year History of the Human Body by Neil Shubin
David H. Mirman
In 2006, Neil Shubin made history when he announced his discovery of a "missing link": a new fossil species showing characteristics that bridge the evolutionary gap between two different types of organisms. He named it Tiktaalik.
[F]or me the greatest moment of the whole media blitz was . . . at my son's preschool. . . . The first child said it was a crocodile or an alligator. When queried why, he said that like a crocodile or lizard it has a flat head with eyes on top. Big teeth, too. Other children started to voice their dissent. Choosing the raised hand of one of these kids, I heard: No, no it isn't a crocodile, it is a fish, because it has scales and fins. Yet another child shouted "Maybe it is both." Tiktaalik's message is so straightforward even preschoolers can see it (p. 25).
Your Inner Fish seeks to reveal the evolutionary history of humans as we travel down the branches of our family tree toward its trunk, passing fossils such as Tiktaalik and finding our "inner fish" along the way.
Shubin makes this story accessible by maintaining a tight focus on the essentials of each topic while minimizing his use of technical terms. He uses analogy well in clarifying points; for instance, he compares the complexity of a smell made of many molecules to a musical chord made of many notes (p. 141), and compares the function of an inner ear bone to a plunger (p. 158). And he generally makes effective use of humor; for example, of searching for fossils, he writes: "Volcanic rocks are mostly out. No fish that we know of can live in lava" (p. 10). On waiting for monthly shipments of rare eggs for embryological research: "We became a virtual cargo cult as we waited" (p. 56). And on the alignment of teeth in mammals: "A mismatch between upper and lower teeth can shatter our teeth, and enrich our dentist" (p. 61). His droll style even affects some of his chapter titles, such as "Handy Genes" for the chapter on genetics of hand development, and "Getting Ahead" for the chapter on the head. . . .
Newton and the Counterfeiter: The Unknown Detective Career of the World's Greatest Scientist by Thomas Levenson
Daniel Wahl
Isaac Newton sought "to master all the apparent confusion of the world, to bring order where none was apparent" (p. 5). He had, as Thomas Levenson shows, a "legendary capacity for study"; and for him frequently "sleep was optional" (p. 9). He was absolutely brilliant. Yet "for all his raw intelligence," Levenson says, "Newton's ultimate achievement turned on [the fact that if] something mattered to him, the man pursued it relentlessly."
Equally crucial to his ultimate success, Newton was never a purely abstract thinker. He gained his central insight into the concept of force from evidence "known by [the] light of nature." He tested his ideas about gravity and the motion of the moon with data drawn from his own painstaking experiments and the imperfect observations of others. When it came time to analyze the physics of the tides, the landlocked Newton sought out data from travelers the world over (p. 20).
By 1695, at the age of 51, Newton had been at "the leading edge of discovery" for almost three decades (p. 104). He had, as Levenson shows, discovered calculus-"the essential tool used to analyze change over time" (p. 14), and he had "presented gravity as the engine of the system of all creation-one that binds the rise and fall of the Thames . . . to all the observed motions of the solar system" (p. 30).
And then "an odd message arrived from London . . . on a matter completely outside his usual competence" (p. 105). The letter, sent by Secretary of the Treasury William Lowndes, solicited Newton's opinion on the worsening shortage of silver coins, a matter of national importance. This unexpected correspondence would ultimately lead to Newton becoming warden of the Mint-and, as such, both an industrialist and a criminal investigator. Levenson tells the exciting story that follows in Newton and the Counterfeiter: The Unknown Detective Career of the World's Greatest Scientist.
The problem facing England was that from "the late 1680s to the mid-1690s, the supply of [silver] coins-the basic units of exchange for the daily business of the country-shrank year by year." "By 1695," Levenson says, "it was almost impossible to find legal silver in circulation" (p. 109). This led to a standstill in trade, tenants unable to pay rent, many suicides, and a "general sense of terror" that violence would soon erupt (p. 138). . . .