CIAO DATE: 10/2009
Volume: 4, Issue: 3
Fall 2009
Obama's Atomic Bomb: The Ideological Clarity of the Democratic Agenda
John David Lewis
Examines America's political climate in light of the unmistakably statist agenda emanating from Washington, and finds cause for optimism in the effect Obama is having on the minds of Americans-and cause for activism toward helping Americans to see the proper political alternative: not conservatism but capitalism.
America's Self-Crippled Foreign Policy: An Interview with Yaron Brook, Elan Journo, and Alex Epstein
Discusses the dismal state of American foreign policy and what should be done about it.
An Unwinnable War?
Elan Journo
Author's note: The following is the introduction to Winning the Unwinnable War: America's Self-Crippled Response to Islamic Totalitarianism. The book is being published by Lexington Books and is scheduled for release this November.
"I don't think you can win it. . . . I don't have any . . . definite end [for the war]"
-President George W. Bush1
The warriors came in search of an elusive Taliban leader. Operating in the mountains of eastern Afghanistan, the team of Navy SEALs was on difficult terrain in an area rife with Islamist fighters. The four men set off after their quarry. But sometime around noon that day, the men were boxed into an impossible situation. Three Afghan men, along with about one hundred goats, happened upon the team's position. What should the SEALs do?
Their mission potentially compromised, they interrogated the Afghan herders. But they got nothing. Nothing they could count on. "How could we know," recalls one of the SEALs, "if they were affiliated with a Taliban militia group or sworn by some tribal blood pact to inform the Taliban leaders of anything suspicious-looking they found in the mountains?" It was impossible to know for sure. This was war, and the "strictly correct military decision would still be to kill them without further discussion, because we could not know their intentions." Working behind enemy lines, the team was sent there "by our senior commanders. We have a right to do everything we can to save our own lives. The military decision is obvious. To turn them loose would be wrong."
But the men of SEAL Team 10 knew one more thing. They knew that doing the right thing for their mission-and their own lives-could very well mean spending the rest of their days behind bars at Leavenworth. The men were subject to military rules of engagement that placed a mandate on all warriors to avoid civilian casualties at all costs. They were expected to bend over backward to protect Afghans, even if that meant forfeiting an opportunity to kill Islamist fighters and their commanders, and even if that meant imperiling their own lives.
The SEALs were in a bind. Should they do what Washington and the military establishment deemed moral-release the herders and assume a higher risk of death-or protect themselves and carry out their mission-but suffer for it back home? The men-Lt. Michael Murphy; Sonar Technician 2nd Class Matthew Axelson; Gunner's Mate 2nd Class Danny Dietz; and Hospital Corpsman 2nd Class Marcus Luttrell-took a vote.
They let the herders go.
Later that afternoon, a contingent of about 100-140 Taliban fighters swarmed upon the team. The four Americans were hugely outnumbered. The battle was fierce. Dietz fought on after taking five bullets, but succumbed to a sixth, in the head. Murphy and Axelson were killed not long after. When the air support that the SEALs had called for finally arrived, all sixteen members of the rescuing team were killed by the Islamists. Luttrell was the lone survivor, and only just.2
The scene of carnage on that mountainside in Afghanistan captures something essential about American policy. What made the deadly ambush all the more tragic is that in reaching their decision, those brave SEALs complied with the policies handed down to them from higher-ups in the military and endorsed by the nation's commander-in-chief. Their decision to place the moral injunction to selflessness ahead of their mission and their very lives encapsulates the defining theme of Washington's policy response to 9/11.
Across all fronts U.S. soldiers are made to fight under the same, if not even more stringent, battlefield rules. Prior to the start of the Afghanistan War and the Iraq War, for instance, the military's legal advisors combed through the Pentagon's list of potential targets, and expansive "no-strike" lists were drawn up.3 Included on the no-strike lists were cultural sites, electrical plants, broadcast facilities-a host of legitimate strategic targets ruled untouchable, for fear of affronting or harming civilians. To tighten the ropes binding the hands of the military, some artillery batteries "were programmed with a list of sites that could not be fired on without a manual override," which would require an OK from the top brass.4 From top to bottom, the Bush administration consciously put the moral imperative of shielding civilians and bringing them elections above the goal of eliminating real threats to our security. . . .
The Creed of Sacrifice vs. The Land of Liberty
Craig Biddle
The proper purpose of government, wrote Thomas Jefferson, is to "guarantee to everyone the free exercise of his industry and the fruits acquired by it."1 The government "shall restrain men from injuring one another, shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned. This is the sum of good government."2
In accordance with this view of the purpose of government, the founders established a republic in which the government was constitutionally limited to the protection of individual rights-the rights to life, liberty, property, and the pursuit of happiness. In this new republic, men were free to think, to produce, and to trade in accordance with their own best judgment; thus, they were free to thrive in accordance with their intelligence, their ability, their initiative. The result was astounding.
Nineteenth-century America was a land of unparalleled innovation and prosperity-and further political achievement. In addition to countless inventions that sprang up-including the steamboat, the cotton gin, vulcanized rubber, the telephone, the incandescent light, the electric power plant, the skyscraper, and the safety elevator-and in addition to the vital industries that arose or were revolutionized-such as the railroad, oil, and steel industries-19th-century America witnessed the end of slavery, which was recognized as a violation of the basic principle of the land.
Between the end of the Civil War and the turn of the century, America came as close to being a fully rights-respecting society as any country has ever come. Men were essentially free to live their own lives, by their own judgment, for their own sake.
Unfortunately, although the Land of Liberty was a great success, it would not and could not last.
The founders established America on the principle of individual rights, but neither they nor the thinkers who followed them identified the deeper philosophic foundation on which this principle depends, namely, the morality of egoism-the idea that being moral consists in pursuing the values on which one's life and happiness depend. In the absence of this foundation, Americans have embraced philosophical ideas that are contrary to individual rights.
Over the past century, Americans have increasingly accepted the morality of altruism-the notion that being moral consists in self-sacrificially serving others-and they have increasingly applied this morality to the realm of politics. Consequently, our government is no longer committed to "restrain men from injuring one another [and] leave them otherwise free to regulate their own pursuits of industry and improvement." Rather, our government regularly-and increasingly-"take[s] from the mouth of labor the bread it has earned" and redistributes that bread to those who have not earned it.
Consider just a few of the countless altruistically motivated, wealth-redistributing laws and institutions that have been enacted or established over the past hundred years: The Federal Reserve violates the rights of Americans by (among other things) printing fiat money-thus debasing citizens' savings-in order to finance welfare programs, bail out failed banks, "rescue" bankrupt car companies, and the like. The Federal Deposit Insurance Corporation (FDIC) violates the rights of taxpayers by forcing them to insure the bank deposits of strangers. Social Security violates the rights of younger Americans by forcing them to fund the retirements of older Americans. The National Labor Relations Act (aka the Wagner Act) violates the rights of automakers (and other businessmen) by forcing them to "contract" with labor unions on terms that are detrimental to their businesses. Medicare and Medicaid violate the rights of taxpaying Americans by forcing them to fund the health care of the aged and the (allegedly) destitute. The Community Reinvestment Act violates the rights of bankers by forcing them to provide loans to people whom they regard as too risky for business. The Troubled Asset Relief Program (TARP) violates the rights of taxpayers by forcing them to purchase bad debt from failing financial institutions. The American Recovery and Reinvestment Act (ARRA) violates the rights of Americans by expanding the extent to which they are forced to fund welfare programs, unemployment benefits, government-run education, and the health care of others. Of course, federal, state, and municipal governments violate Americans' rights in thousands of other ways as well, but the foregoing indicates the enormity of the problem.
The explicit "justification" for all such rights-violating laws and institutions-the principle behind all of them-is altruism: the notion that we have a moral duty to serve others, whether "the poor" or "the public interest" or "society" or "the common good." As Theodore Roosevelt put it, the government must "regulate the use of wealth in the public interest" and "regulate the terms and conditions of labor, which is the chief element of wealth, directly in the interest of the common good";3 or as Franklin D. Roosevelt put it, the government must seek "the greater good of the greater number of Americans";4 or as John F. Kennedy put it, the individual must "weigh his rights and comforts against his obligations to the common good";5 or as Bill Clinton put it, the individual must "give something back" on behalf of "the common good";6 or as George W. Bush put it, we must "seek a common good beyond our comfort"; or as Barack Obama puts it, we must heed the "call to sacrifice" and uphold our "core ethical and moral obligation" to "look out for one another" and to "be unified in service to a greater good."7
A government animated by this principle will increasingly force citizens to serve the so-called "common good"-and with each political success, the government will get bolder and more aggressive in its enforcement of this principle. This is why the U.S. government has graduated over decades from the mere redistribution of wealth via taxation and inflation . . . to the establishment of wealth-redistributing institutions and hubs such as Social Security, Medicare, and TARP . . . to the outright nationalization of businesses, such as American International Group (AIG), General Motors (GM), and Citigroup . . . and to the nullification of private contracts that stand in its way (e.g., employment contracts in the case of AIG bonuses, investment contracts in the case of Chrysler's senior-secured creditors).
Under such expanding government control, explains an article in the New York Times:
Businesses and private property . . . become not an instrument of private "egoism" but "functions of the people." They remain private wherever and so long as they fulfill their "functions." Wherever and whenever they fall down, the State steps in and either forces them to fulfill the functions or takes them over entirely.8
That description of what we have witnessed recently, however, was not written recently; it was written in 1938. Nor was the author describing conditions in the United States; he was describing conditions in Germany under the then-burgeoning National Socialist Party. . . .
The Rise of American Big Government: A Brief History of How We Got Here
Michael Dahlen
Nineteenth-century America was the closest thing to capitalism-a system in which government is limited to protecting individual rights-that has ever existed. There was no welfare state, no central bank, no fiat money, no deficit spending to speak of, no income tax for most of the century, and no federal regulatory agencies or antitrust laws until the end of the century. Consequently, total (federal, state, and local) government spending averaged a mere 3.26 percent of Gross Domestic Product (GDP).1 The Constitution's protection of individual rights and limitation on the power of government gave rise to an economy in which individuals were free to pursue their own interests, to start new businesses, and to create as much wealth as their ability and ambition allowed. This near laissez-faire politico-economic system led to the freest, most innovative, and wealthiest nation in history.
Since the beginning of the 20th century, however, capitalism and freedom have been undermined by an explosion in the size and power of government: Total government spending has increased from 6.61 percent of GDP in 1907 to a projected 45.19 percent of GDP in 2009;2 the dollar has lost more than 95 percent of its value due to the Federal Reserve's inflationary policies; top marginal income tax rates have been as high as 94 percent; entitlement programs now constitute more than half of the federal budget; and businesses are hampered and hog-tied by more than eighty thousand pages of regulations in the Federal Register.
What happened? How did America shift from a predominantly free-market economy to a heavily regulated mixed economy; from capitalism to a welfare state; from limited government to big government? This article will survey the progression of laws, acts, programs, and interventions that brought America to its present state-and show their economic impact. Let us begin our survey by taking a closer look at the state of the country in the 19th century.
[Chart: Total (Federal, State, and Local) Government Spending]
America's Former Free Market
The Constitution established the political framework necessary for a free market. It provided for the protection of private property (the Fifth Amendment) including intellectual property (Article I, Section 8), the enforcement of private contracts (Article I, Section 10), and the establishment of sound (gold or silver)3 money (Article I, Sections 8 and 10). It prohibited the states from erecting trade barriers (Article I, Section 9), thereby establishing the whole nation as one large free-trade zone. It permitted direct taxes such as the income tax only if apportioned among the states on the basis of population (Article I, Sections 2 and 9), which made them very difficult to levy.4 Finally, it specifically enumerated and therefore limited Congress's powers (Article I, Section 8), severely constraining the government's power to intervene in the marketplace.
Federal regulatory agencies dictating how goods could be produced and traded did not exist. Rather than being forced to accept the questionable judgments of agencies such as the FDA, FTC, and USDA, participants in the marketplace were governed by the free-market principle of caveat emptor (let the buyer beware). As historian Larry Schweikart points out:
merchants stood ready to provide customers with as much information as they desired. . . . In contrast to the modern view of consumers as incompetent to judge the quality or safety of a product, caveat emptor treated consumers with respect, assuming that a person could spot shoddy workmanship. Along with caveat emptor went clear laws permitting suits for damage incurred by flawed goods.5
To be sure, 19th-century America was not a fully free market. Besides the temporary suspension of the gold standard and the income tax levied during the Civil War, the major exceptions to the free market in the 19th century were tariffs, national banking, and subsidies for "internal improvements" such as canals and railroads. These exceptions, however, were limited in scope and were accompanied by considerable debate about whether they should exist at all. Alexander Hamilton, Henry Clay, and Abraham Lincoln supported such interventions; Thomas Jefferson, Andrew Jackson, and John Tyler generally opposed them. These interventions (except for tariffs) were, as Jefferson, Jackson, and Tyler pointed out, unconstitutional. But history shows that they were also impractical. Tariffs were initially implemented, beginning with the Tariff Act of 1789, as a source of revenue-the main source in the 19th century-for the federal government. Pressure from northern manufacturers, however, to implement tariffs for purposes of protection led to the "Tariff of Abominations" (1828), which was scaled back by 1833 due to heavy opposition from the South. Tariff rates then remained relatively low-about 15 percent-until the Civil War. By 1864, average tariff rates had risen to 47.09 percent for protectionist reasons and remained elevated for the remainder of the century.6
As to national banking, the Second Bank of the United States' charter expired in 1836, thereby paving the way for the free banking era-which lasted until a national bank was reinstituted during the Civil War. By virtually every measure of bank health, this free banking era was the soundest in American history. In terms of capital adequacy, asset quality, liquidity, profitability, and prudent management, national banking proved to be inferior to free banking.7
As to subsidies for internal improvements, although private entrepreneurs financed and built most roads and many canals,8 state governments intervened in the 1820s to subsidize canal building-amending their constitutions to do so.9 However, most state-funded canals either went unfinished, generated little to no income, or went bankrupt. As a result, by 1860 most state constitutions were amended again to prohibit such subsidies.10 After the Civil War, federal subsidies for the transcontinental railroads caused similar problems-as well as corruption. Further, they were proven to be a hindrance to rather than a precondition of a thriving railroad industry: James Jerome Hill's Great Northern was the most successful of the transcontinental railroads, yet was built without any subsidies or land grants.11
The foregoing interventions, though impractical, were motivated in part by a desire to help promote the development of business and industry. But lurking in the periphery, growing in popularity, and poised to fuel further government interference in the marketplace, was the ideology of collectivism-the notion that the individual must be subordinated to the collective or the "common good." This idea was stated by economist Daniel Raymond in his 1820 textbook: "it is the duty of every citizen to forgo his own private advantage for the public good."12 And as the 19th century progressed, this idea was increasingly cited as a justification for government intervention. One of the most important instances of this was the Supreme Court's decision in Munn v. Illinois (1876). In the majority opinion, Chief Justice Morrison Waite declared:
Property does become clothed with a public interest when used in a manner to make it of public consequence. . . . When, therefore, one devotes his property to a use in which the public has an interest, he, in effect, grants to the public an interest in that use, and must submit to be controlled by the public for the common good. . . .13
Although the case applied only to the states, Munn undermined the sanctity of private property rights by establishing the precedent that property "clothed with a public interest" (i.e., any property related to business) is subject to government regulation and control. As a result, Munn helped pave the way for the two major assertions of federal control over the economy-the Interstate Commerce Act and the Sherman Antitrust Act-that would come in the Gilded Age.14 . . .
How the Freedom to Contract Protects Insurability
Paul Hsieh
Shows that, contrary to proposals being put forth by Republicans, a genuinely free market in health insurance is not only moral, in that it respects the rights of producers and consumers, but also practical, in that it enables businessmen to solve problems for profit-which leads to more and better products and services at lower prices for consumers.
How Morality Is Grounded in Reality
Craig Biddle
Author's note: This is chapter 3 of my book Loving Life: The Morality of Self-Interest and the Facts that Support It (Richmond: Glen Allen Press, 2002), which is an introduction to Ayn Rand's morality of rational egoism. Chapters 1 and 2 were reprinted in the prior two issues of TOS. In the book, this chapter is titled "To Be Or Not To Be: The Basic Choice."
In chapter 2, we encountered the problem known as the "is-ought" dichotomy, the notion that moral principles (principles regarding what people "ought" to do) cannot be derived from the facts of reality (from what "is"). We also saw that this problem persists for lack of an observation-based, objective standard of value. Here we turn to the solution to that problem. First, we will discover just such a standard; then, we will discover a number of objective moral principles-principles in accordance with that standard.
To begin, note that the basic fact that makes morality such a difficult subject is the very fact that makes it a subject in the first place: free will. As human beings we have the faculty of volition, the power of choice; we choose our actions. This fact gives rise to our need of morality. Indeed, the realm of morality is the realm of choice. What makes the issue complicated is the fact that our choices are guided by our values-which are also chosen. This is why it is so difficult to get to the bottom of morality: Human values are chosen-every last one of them. Consequently, people's values seem to differ in every imaginable way.
Some people choose to play soccer; they value footwork, teamwork, and winning. Some choose to dance ballet; they value grace, poise, and flight. And some choose to attend church; they value sermons, faith, and prayer. A person who goes hiking values the scenery and exercise. One who goes fishing values the nibble and catch. And one who takes heroin values the so-called "high." A person who steals jewelry values "free stuff." One who makes jewelry values craftsmanship. A sculptor values the process of creating art. A software developer values that creative process. A student who cheats on a test values "getting away" with it. One who studies for the test values the knowledge he gains thereby. A doctor specializing in internal medicine values the process of curing disease. A terrorist specializing in biological warfare values the process of spreading disease. A man who treats his wife with respect values certain qualities in her. One who abuses his wife values having power over her. A general who fights for mandatory "volunteerism" values involuntary servitude. One who fights to defend individual rights values freedom. And so on. Different people act in different ways; they value different things.
So the question is: How do we know if our choice of values is good or bad, right or wrong? What is our standard of value?
As we have already seen, if we do not consciously hold something as our standard of value, then we have nothing by reference to which we can determine what goals we should or should not pursue-how we should or should not act. And if we do not hold something rationally provable as our standard of value, then we default to some form of subjectivism-personal, social, or "supernatural"-which can lead only to human sacrifice, suffering, and death. If we want to live and achieve happiness, we need a non-sacrificial standard of value that is grounded in perceptual evidence-facts we can see.
In search of such a standard, the proper approach is to turn not to personal opinion or social convention or "super-nature," but to actual nature and ask, as the American philosopher Ayn Rand did: "What are values? Why does man need them?" . . .
Objectively Speaking: Ayn Rand Interviewed edited by Marlene Podritske and Peter Schwartz
Dina Schein Federman
People today sense that something is wrong with the world and are searching for answers. What they generally find is disappointing. Skeptics tell us that there is no clear-cut right and wrong in any issue, that all issues are "complex," that wisdom consists of dropping the notion that there are absolute truths. The most prevalent alternative to the skeptical, relativist position comes from religionists, who accept the existence of absolute truths but insist that they may be found only within a religious framework-a belief in a supernatural being who is the source of truth and morality. Both camps agree that absolutes cannot be discovered by a rational process. Both camps agree that morality consists of selfless service to others. Both camps support the welfare state.
Ayn Rand rejects all these claims and sweeps aside both skepticism and mysticism. Her philosophy, Objectivism, holds that reality is an objective absolute, independent of anyone's beliefs or feelings; that reason, based on the evidence of the senses, is our only means of knowing reality and, consequently, our only proper guide to action; that each man is an end in himself, not the means to the ends of others, and, therefore, that the pursuit of his own rational self-interest and happiness is the highest moral purpose of his life; that the proper political system is that of laissez-faire capitalism, in which men deal with one another as "traders, by free, voluntary exchange to mutual benefit."1 The reader may find the elucidation of her philosophical principles and their application in her novels, essays, and cultural commentary.
Objectively Speaking: Ayn Rand Interviewed is a recent addition to this body of work. It is a collection of radio and television interviews conducted with Ayn Rand from 1932 to 1981, in which she applies Objectivism to current events. Starting with her earliest known interview at age twenty-seven, it goes on to include a series of interviews conducted with her at Columbia University from 1962 to 1966, in which students and professors asked her questions on the principles of Objectivism and their application. It also includes a series of interviews in various media, ranging from the 1959 interview with Mike Wallace to her final public appearance, a 1981 interview with Louis Rukeyser. The epilogue is an interview with Dr. Leonard Peikoff, Rand's best student, heir, and the leading exponent of her philosophy, in which he recounts his thirty-year professional and personal association with her.
Among the topics Rand discusses in her interviews are the political structure of a free society, the American Constitution, objective law, the nature of capitalism and various myths about it, why political conservatives are worse enemies of capitalism than the leftists, the crucial need for a free press, proper foreign policy, the moral nature of businessmen, education, the arts, the nature of humor, the foundations of morality, individual rights, and many others.
For example, in one interview from the 1960s, during a discussion of the origin of individual rights, Rand is asked to elucidate her rejection of various alleged "rights," such as rights to a minimum wage, free education and medical care, and the like. She explains that because jobs, education, medical care, and other goods and services do not grow on trees but are produced and provided by individuals and businesses, a "right" to these things means that the providers are to be forced to serve those who allegedly have a right to the largesse, which is slavery. "Nobody can have a right to the unearned. . . . [These things] can only come from other men-and nobody may claim the right to enslave others" (pp. 154-55). She explains that the only political-economic system in which force is banished from human relations is the system of laissez-faire capitalism, in which men deal with one another as traders, voluntarily exchanging value for value to mutual benefit.
Discussing the nature of capitalism and debunking the myths that surround it, Rand answers the allegation that government must regulate the economy in order to prevent financial crises: "Depressions and panics are the result of government intervention in the economy-specifically, government manipulation of credit and money. That was the cause of the Depression of 1929. Once more, it is capitalism that is taking the blame for the evils created by its opposite: statist intervention" (p. 42). In order to prevent financial crises, she counsels, the government must stay out of the economy. . . .
The Snowball: Warren Buffett and the Business of Life, by Alice Schroeder. New York: Bantam, 2008. 976 pp. $35 (cloth).
Daniel Wahl
Nine-year-old Warren Buffett is in his yard playing in the snow.
Warren is catching snowflakes. One at a time at first. Then he is scooping them up by handfuls. He starts to pack them into a ball. As the snowball grows bigger, he places it on the ground. Slowly it begins to roll. He gives it a push, and it picks up more snow. Soon he reaches the edge of the yard. After a moment of hesitation, he heads off, rolling the snowball through the neighborhood. And from there, Warren continues onward, casting his eye on a whole world full of snow (prologue).
Many decades later, Alice Schroeder, a former insurance analyst at Morgan Stanley and the author of The Snowball, is sitting in front of Warren Buffett, one of the world's richest men. "Where did it come from," she asks, "caring so much about making money?" Buffett leans forward, "more like a teenager bragging about his first romance than a seventy-two-year-old financier," and begins to tell his story: "Balzac said that behind every great fortune lies a crime. That's not true at Berkshire [Hathaway]" (p. 4). Thus begins The Snowball, one of the most highly anticipated biographies of the past few years and the first to be written about Buffett with his full cooperation.
As its full title indicates, The Snowball: Warren Buffett and the Business of Life sets out to present Buffett's thinking, not only about business but about life in general. Among the many topics this hefty volume explores are those individuals who influenced his thinking. A major figure in this respect is Buffett's father, whom he idolizes and from whom he learned a crucial point when it comes to judging both oneself and others:
The big question about how people behave is whether they've got an Inner Scorecard or an Outer Scorecard. It helps if you can be satisfied with an Inner Scorecard.
I always pose it this way, I say: "Lookit. Would you rather be the world's greatest lover, but have everyone think you're the world's worst lover? Or would you rather be the world's worst lover but have everyone think you're the world's greatest lover?" . . . Now my dad: He was a hundred percent Inner Scorecard guy. He was really a maverick. But he wasn't a maverick for the sake of being a maverick. He just didn't care what other people thought. (p. 33)
In addition to the valuable lessons he learned from important figures in his life, Schroeder shows how Buffett's own interests and thinking during childhood contributed to his development. Schroeder reveals him to have been an efficacious child, intensely interested in collecting and processing facts. . . .
Fred Astaire by Joseph Epstein New Haven, CT: Yale University Press, 2008. 198 pp. $22 (cloth).
Scott Holleran
In the light and lively Fred Astaire, author and journalist Joseph Epstein offers an excellent overview of the career of the world's greatest male ballroom and tap dancer. This short biography, part of Yale University's Icons of America series, is like its subject-accessible yet elegant.
Astaire began his dance training at the age of five after his mother, Johanna Austerlitz, brought him to New York City in the hopes of grooming him and his talented older sister, Adele, for careers in show business. Attending dance school with his sister, young Frederick took to the art form and was soon rehearsing with Adele in routines developed by their instructor. Changing their last names to "Astaire," the brother-sister act hit the theatrical circuit and began a professional career that lasted many years and included appearances on Broadway with Al Jolson and Fanny Brice, as well as work with famed showman Flo Ziegfeld, who paid the duo an impressive $5,000 per week during the Depression (pp. 12, 15).
After Adele retired at the age of thirty-five, Astaire sought fortune in Hollywood. Shortly after being famously dismissed by a studio executive as "Balding. Can't sing. Dances a little." (p. 18), Astaire was noticed by Metro-Goldwyn-Mayer (MGM) and signed to a three-week contract for $1,500 per week. His first role-playing himself in Dancing Lady opposite Joan Crawford-proved that he had potential as a screen star (p. 19). . . .
The Garden of Invention: Luther Burbank and the Business of Breeding Plants by Jane S. Smith New York: The Penguin Press, 2009. 368 pp. $25.95 (cloth).
Daniel Wahl
In 1849, millions were starving due to a then-mysterious disease that "in a matter of days, if not hours, could transform a thriving field [of potatoes] into a slimy, foul-smelling patch of rotting vegetation" (pp. 38-39). Everywhere, plants were "extremely inconspicuous, [tasted] terrible, or [went] to seed in a fast and fabulously prolific way, leaving nothing behind to harvest." And plants had evolved characteristics by which they survived just long enough to reproduce, characteristics that were unsuitable for feeding a large and fast-growing population (p. 40).
But all this was about to change, and dramatically so, thanks to a man born that year: Luther Burbank (p. 6). In her new book The Garden of Invention: Luther Burbank and the Business of Breeding Plants, Jane S. Smith presents the life of this extremely influential but mostly forgotten plant breeder and businessman, emphasizing his innovations and the methods he used to develop and sell them.
Burbank displayed some mechanical ingenuity as a child, but, Smith reveals, apart from this, nothing in Burbank's background suggested that he might become an inventor of new plants (p. 19). Though young men of his time were encouraged toward academic careers, Burbank, fond as he was of the outdoors, was unsure whether he wanted to follow that path-until, at the age of twenty-one, he picked up a copy of Charles Darwin's The Variation of Animals and Plants Under Domestication.
Smith describes this book as a "detail-crammed response to those who had criticized On the Origin of Species by Means of Natural Selection as a hypothesis unsupported by sufficient proof" (p. 27), then concisely sums up what Burbank read:
From gooseberries to gladioli, Darwin compiled his evidence: plants changed in response to outside stimulus (like the cabbages Darwin described that changed their shape or color when planted in different countries), and these changes could happen within a short time span (like the hyacinths he said growers had managed to improve from the offerings of only a few generations earlier). The causes of the changes were still largely unknown, but their occurrence was a fact beyond dispute. This was evolution measured in human time (p. 28).
Burbank took from the book several big ideas-each of which Smith relates in an easy-to-read style:
The first was that it was possible to force the emergence of latent differences in fruits and flowers, even to the point of generating what seemed to be entirely new varieties. Still more exciting was Darwin's tentative suggestion that selecting, grafting, hybridizing, or simply moving a plant to a new environment might spur changes that would persist over generations. According to Darwin, these alterations were often inadvertent, but as Burbank immediately realized, such happy accidents could also be deliberately pursued. The creation of new plant varieties, something far beyond the familiar efforts to breed the best of an existing stock, did not need to wait for the slow accumulation of natural advantages Darwin had described in his Origin of Species. Evolutionary change could be accelerated by human intervention (p. 28).
Darwin's words "opened up a new world" for Burbank (p. 27). Not only did the book give him an intellectual framework for viewing the world and man's place in it, but an advertisement on the last page of Burbank's edition of Darwin's tract (for a book called Gardening for Profit) also enabled him to see, for the first time, his own place in it. He would not have to choose "between the outdoor life and the inventor's bench" after all-plant life "could be a subject for experimentation and improvement, and a commercial garden could provide a good living for an imaginative and enterprising man" (p. 30). . . .