

November 11, 2014

Model Flaws Result in No Useful Climate Consensus




At the end of the first page of the commentary quoted below, the following biographical credentials were provided for the author of the commentary:

(p. C1) Dr. Koonin was undersecretary for science in the Energy Department during President Barack Obama's first term and is currently director of the Center for Urban Science and Progress at New York University. His previous positions include professor of theoretical physics and provost at Caltech, as well as chief scientist of BP, where his work focused on renewable and low-carbon energy technologies.




(p. C1) The idea that "Climate science is settled" runs through today's popular and policy discussions. Unfortunately, that claim is misguided. It has not only distorted our public and policy debates on issues related to energy, greenhouse-gas emissions and the environment, but it also has inhibited the scientific and policy discussions that we need to have about our climate future.


. . .


(p. C2) We often hear that there is a "scientific consensus" about climate change. But as far as the computer models go, there isn't a useful consensus at the level of detail relevant to assessing human influences.


. . .


• Although the Earth's average surface temperature rose sharply by 0.9 degree Fahrenheit during the last quarter of the 20th century, it has increased much more slowly for the past 16 years, even as the human contribution to atmospheric carbon dioxide has risen by some 25%. This surprising fact demonstrates directly that natural influences and variability are powerful enough to counteract the present warming influence exerted by human activity.

Yet the models famously fail to capture this slowing in the temperature rise. Several dozen different explanations for this failure have been offered, with ocean variability most likely playing a major role. But the whole episode continues to highlight the limits of our modeling.


. . .


• A crucial measure of our knowledge of feedbacks is climate sensitivity--that is, the warming induced by a hypothetical doubling of carbon-dioxide concentration. Today's best estimate of the sensitivity (between 2.7 degrees Fahrenheit and 8.1 degrees Fahrenheit) is no different, and no more certain, than it was 30 years ago. And this is despite an heroic research effort costing billions of dollars.

These and many other open questions are in fact described in the IPCC research reports, although a detailed and knowledgeable reading is sometimes required to discern them. They are not "minor" issues to be "cleaned up" by further research. Rather, they are deficiencies that erode confidence in the computer projections. Work to resolve these shortcomings in climate models should be among the top priorities for climate research.

Yet a public official reading only the IPCC's "Summary for Policy Makers" would gain little sense of the extent or implications of these deficiencies. These are fundamental challenges to our understanding of human impacts on the climate, and they should not be dismissed with the mantra that "climate science is settled."



For the full commentary, see:

STEVEN E. KOONIN. "Climate Science Is Not Settled." The Wall Street Journal (Sat., Sept. 20, 2014): C1-C2.

(Note: italics in original; ellipses added.)

(Note: the online version of the commentary has the date Sept. 19, 2014.)






November 9, 2014

"Discovery Cannot Be Achieved by Directive"



(p. 170) As early as 1945 the medical advisory committee reporting to the federal government on a postwar program for scientific research emphasized the frequently unexpected nature of discoveries:

Discoveries in medicine have often come from the most remote and unexpected fields of science in the past; and it is probable that this will be equally true in the future. It is not unlikely that significant progress in the treatment of cardiovascular disease, kidney disease, cancer, and other refractory conditions will be made, perhaps unexpectedly, as the result of fundamental discoveries in fields unrelated to these diseases.... Discovery cannot be achieved by directive. Further progress requires that the entire field of medicine and the underlying sciences of biochemistry, physiology, pharmacology, bacteriology, pathology, parasitology, etc., be developed impartially.

Their statement "discovery cannot be achieved by directive" would prove to be sadly prophetic.



Source:

Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.

(Note: italics in original.)






November 5, 2014

"Folkman Persisted in His Genuinely Original Thinking"



(p. 141) As detailed by Robert Cooke in his 2001 book Dr. Folkman's War, the successful answers to these basic questions took Folkman through diligent investigations punctuated by an astonishing series of chance observations and circumstances. Over decades, Folkman persisted in his genuinely original thinking. His concept was far in advance of technological and other scientific advances that would provide the methodology and basic knowledge essential to its proof, forcing him to await verification and to withstand ridicule, scorn, and vicious competition for grants. Looking back three decades later, Folkman would ruefully reflect: "I was too young to realize how much trouble was in store for a theory that could not be tested immediately."


Source:

Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.

(Note: italics in original.)






October 28, 2014

In Finding Cure for Ulcers, Marshall Was Not Constrained by the Need to Obtain Approval or Funding



(p. 113) Marshall was a youthful maverick, not bound by traditional theory and not professionally invested in a widely held set of beliefs. There is such a thing as being too much of an insider. Marshall viewed the problem with fresh eyes and was not constrained by the requirement to obtain approval or funding for his pursuits. It is also noteworthy that his work was accomplished not at a high-powered academic ivory tower with teams of investigators but instead far from the prestigious research centers in the Western Hemisphere.

The delay in acceptance of Marshall's revolutionary hypothesis reflects the tenacity with which long-held concepts are maintained. Vested interests--intellectual, financial, commercial, status--keep these entrenched. Dogmatic believers find themselves under siege by a new set of explanations.



Source:

Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.






October 24, 2014

Ideas Should Not Be Rejected Just Because They Disagree with Reigning Theory



(p. 107) . . . Claude Bernard, the nineteenth-century founder of experimental medicine, . . . famously said, "If an idea presents itself to us, we must not reject it simply because it does not agree with the logical deductions of a reigning theory."


Source:

Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.

(Note: ellipses added.)






October 20, 2014

Needed Revolutionary Ideas Often Come From Outsiders



(p. 103) . . . where knowledge is no longer growing and the field has been worked out, a revolutionary new approach is required and this is more likely to come from the outsider. The skepticism with which the experts nearly always greet these revolutionary ideas confirms that the available knowledge has been a handicap.


Source:

W. I. B. Beveridge as quoted in Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.

(Note: ellipsis added.)






October 16, 2014

Medical Innovator "Maintained a Healthy Skepticism Toward Accepted Wisdom"



(p. 103) Barry Marshall, a lanky twenty-nine-year-old resident in internal medicine at Warren's hospital, was assigned to gastroenterology for six months as part of his training and was looking for a research project. The eldest son of a welder and a nurse, Marshall grew up in a remote area of Western Australia where self-sufficiency and common sense were essential characteristics. His personal qualities of intelligence, tenacity, open-mindedness, and self-confidence would serve him and Warren well in bringing about a conceptual revolution. Relatively new to gastroenterology, he did not hold a set of well-entrenched beliefs. Marshall could maintain a healthy skepticism toward accepted wisdom. Indeed, the concept that bacteria caused stomach inflammation, and even ulcers, was less alien to him than to most gastroenterologists.


Source:

Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.






October 12, 2014

"It Is Often Essential to Spot the Exceptions to the Rule"




Baruch Blumberg was awarded the Nobel Prize in 1976:


(p. 98) . . ., Blumberg learned an invaluable lesson: "In research, it is often essential to spot the exceptions to the rule--those cases that do not fit what you perceive as the emerging picture.... Frequently the most interesting findings grow out of the 'chance' or unanticipated results."


Source:

Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.

(Note: ellipsis added.)






September 27, 2014

Theory Said Giant Bird Could Not Fly, But It Flew Anyway



(p. A3) Scientists have identified the largest flying bird ever found--an ungainly glider with a wingspan of 21 feet or more that likely soared above ancient seas 25 million years ago.

Until now, though, it was a bird that few experts believed could get off the ground. By the conventional formulas of flight, the extinct sea bird--twice the size of an albatross, the largest flying bird today--was just too heavy to fly on its long, fragile wings.

But a new computer analysis reported Monday [July 7, 2014] in the Proceedings of the National Academy of Sciences shows that the bird apparently could ride efficiently on rising air currents, staying aloft for a week or more at a stretch.


. . .


"You have to conclude that this animal was capable of flapping its wings and taking off, even though it is much heavier than the theoretical maximum weight of a flapping flying bird," said Luis Chiappe, an expert on flight evolution at the Los Angeles County Natural History Museum, who wasn't involved in the project. "Our modern perspective on the diversity of flight is rather narrow," he said. "These were very unique birds."


. . .


"This was a pretty impressive creature," said avian paleontologist Daniel T. Ksepka at the Bruce Museum in Greenwich, Conn., who conducted the analysis of the bird's biomechanics. "Science had made a rule about flight, and life found a way around it."



For the full story, see:

ROBERT LEE HOTZ. "U.S. NEWS; Giant Bird Was Able to Fly, Scientists Find; Computer Analysis Shows Ancient Glider Could Get Off the Ground, Defying Conventional Theories of Flight." The Wall Street Journal (Tues., July 8, 2014): A3.

(Note: ellipses, and bracketed date, added.)

(Note: the online version of the article has the date July 7, 2014.)






September 26, 2014

Serendipitous Discoveries Are Made by "Accidents and Sagacity"



(p. 6) "Accident" is not really the best word to describe such fortuitous discoveries. Accident implies mindlessness. Christopher Columbus's discovery of the American continent was pure accident--he was looking for something else (the Orient) and stumbled upon this, and never knew, not even on his dying day, that he had discovered a new continent. A better name for the phenomenon we will be looking at in the pages to follow is "serendipity," a word that came into the English language in 1754 by way of the writer Horace Walpole. The key point of the phenomenon of serendipity is illustrated in Walpole's telling of an ancient Persian fairy tale, The Three Princes of Serendip (set in the land of Serendip, now known as Sri Lanka): "As their highnesses traveled, they were always making discoveries, by accidents and sagacity, of things they were not in quest of."

Accidents and sagacity. Sagacity--defined as penetrating intelligence, keen perception, and sound judgment--is essential to serendipity. The men and women who seized on lucky accidents that happened to them were anything but mindless. In fact, their minds typically had special qualities that enabled them to break out of established paradigms, imagine new possibilities, and see that they had found a solution, often to some problem other than the one they were working on. Accidental discoveries would be nothing without keen, creative minds knowing what to do with them.



Source:

Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.

(Note: italics in original.)






September 20, 2014

Modelers Can Often Obtain the Desired Result




(p. A13) After earning a master's degree in environmental engineering in 1982, I spent most of the next 10 years building large-scale environmental computer models. My first job was as a consultant to the Environmental Protection Agency. I was hired to build a model to assess the impact of its Construction Grants Program, a nationwide effort in the 1970s and 1980s to upgrade sewer-treatment plants.

The computer model was huge--it analyzed every river, sewer treatment plant and drinking-water intake (the places in rivers where municipalities draw their water) in the country. I'll spare you the details, but the model showed huge gains from the program as water quality improved dramatically. By the late 1980s, however, any gains from upgrading sewer treatments would be offset by the additional pollution load coming from people who moved from on-site septic tanks to public sewers, which dump the waste into rivers. Basically the model said we had hit the point of diminishing returns.

When I presented the results to the EPA official in charge, he said that I should go back and "sharpen my pencil." I did. I reviewed assumptions, tweaked coefficients and recalibrated data. But when I reran everything the numbers didn't change much. At our next meeting he told me to run the numbers again.

After three iterations I finally blurted out, "What number are you looking for?" He didn't miss a beat: He told me that he needed to show $2 billion of benefits to get the program renewed. I finally turned enough knobs to get the answer he wanted, and everyone was happy.


. . .


There are no exact values for the coefficients in models such as these. There are only ranges of potential values. By moving a bunch of these parameters to one side or the other you can usually get very different results, often (surprise) in line with your initial beliefs.
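Caprara's knob-turning is easy to reproduce in miniature. The toy calculation below is purely illustrative: it is not his EPA model, and every name and number in it is invented for the sketch. It simply shows how a result built from coefficients that are only known to within ranges can be pushed to very different answers while every input stays "plausible."

```python
# Toy illustration (hypothetical; not Caprara's EPA model): a "benefits"
# estimate built from a few coefficients, each known only to a range.
def benefits(pollution_removed, dollars_per_ton, compliance_rate):
    """Hypothetical benefits model: tons removed x value per ton x compliance."""
    return pollution_removed * dollars_per_ton * compliance_rate

# Run the same model twice, with every coefficient inside its plausible range.
low  = benefits(pollution_removed=1.0e6, dollars_per_ton=500,  compliance_rate=0.6)
high = benefits(pollution_removed=1.5e6, dollars_per_ton=2000, compliance_rate=0.9)

print(f"low:  ${low:,.0f}")   # low:  $300,000,000
print(f"high: ${high:,.0f}")  # high: $2,700,000,000
```

No individual coefficient is indefensible in either run, yet the two answers differ by a factor of nine; a modeler told to "sharpen his pencil" until the program shows $2 billion of benefits can find a combination that does.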



For the full commentary, see:

ROBERT J. CAPRARA. "OPINION; Confessions of a Computer Modeler; Any model, including those predicting climate doom, can be tweaked to yield a desired result. I should know." The Wall Street Journal (Weds., July 9, 2014): A13.

(Note: ellipsis added.)

(Note: the online version of the commentary has the date July 8, 2014.)






August 6, 2014

"Different Structural Models Can Fit Aggregate Macroeconomic Data About Equally Well"



(p. 1149) There is an apparent lack of encompassing forecasting and economic models that can explain the facts uniformly well across business cycles. This is perhaps an inevitable outcome given the changing nature of business cycles. The fact that business cycles are not all alike naturally means that variables that predict activity have a performance that is episodic. Notably, we find that term spreads were good predictors of economic activity in the 1970s and 1980s, but that credit spreads have fared better more recently. This is of course a challenge for forecasters, as we do (p. 1150) not know the origins of future business cycle fluctuations. Much needs to be learned to determine which and how financial variables are to be monitored in real time especially in an evolving economy when historical data do not provide adequate guidance.

Explanations for the Great Recession usually involve some form of nonlinearity. The sudden nature of the downturn following the collapse of Lehman is consistent with nonlinearity being part of the transmission mechanism. At the same time, we lack robust evidence of nonlinearity from aggregate low-frequency macroeconomic data. Essentially, there is an identification issue as different structural models can fit aggregate macroeconomic data about equally well.



For the full article, see:

Ng, Serena, and Jonathan H. Wright. "Facts and Challenges from the Great Recession for Forecasting and Macroeconomic Modeling." Journal of Economic Literature 51, no. 4 (Dec. 2013): 1120-54.






August 2, 2014

Climate Models Allow "the Modeler to Obtain Almost Any Desired Result"




Integrated assessment models (IAMs) are the commonly-used models that attempt to integrate climate science models with economic effect models. In the passage quoted below, "SCC" stands for "social cost of carbon."


(p. 870) I have argued that IAMs are of little or no value for evaluating alternative climate change policies and estimating the SCC. On the contrary, an IAM-based analysis suggests a level of knowledge and precision that is nonexistent, and allows the modeler to obtain almost any desired result because key inputs can be chosen arbitrarily.

As I have explained, the physical mechanisms that determine climate sensitivity involve crucial feedback loops, and the parameter values that determine the strength of those feedback loops are largely unknown. When it comes to the impact of climate change, we know even less. IAM damage functions are completely made up, with no theoretical or empirical foundation. They simply reflect common beliefs (which might be wrong) regarding the impact of 2°C or 3°C of warming, and can tell us nothing about what might happen if the temperature increases by 5°C or more. And yet those damage functions are taken seriously when IAMs are used to analyze climate policy. Finally, IAMs tell us nothing about the likelihood and nature of catastrophic outcomes, but it is just such outcomes that matter most for climate change policy. Probably the best we can do at this point is come up with plausible estimates for probabilities and possible impacts of catastrophic outcomes. Doing otherwise is to delude ourselves.
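Pindyck's point about made-up damage functions can be sketched in a few lines. The two functions below are hypothetical (the coefficients and exponents are invented for the illustration, not taken from any IAM): both are calibrated to match "common beliefs" about losses near 2°C of warming, yet they imply radically different losses at 5°C.

```python
# Sketch of Pindyck's critique (hypothetical numbers): two damage functions
# that agree near 2 C of warming but diverge wildly at 5 C.
def quadratic_damage(T, a=0.0023):
    """Loss as a fraction of GDP: a * T^2 (coefficient a is a guess)."""
    return a * T * T

def steep_damage(T, a=0.0006, p=4):
    """Alternative with a higher exponent, tuned to look similar at ~2 C."""
    return a * T ** p

for T in (2, 3, 5):
    print(T, round(quadratic_damage(T), 4), round(steep_damage(T), 4))
```

At 2°C the two functions give nearly the same answer (0.92% versus 0.96% of GDP), so beliefs about moderate warming cannot distinguish them. At 5°C one says about 6% of GDP is lost and the other about 37%, which is exactly the range of warming where the choice matters for policy.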



For the full article, see:

Pindyck, Robert S. "Climate Change Policy: What Do the Models Tell Us?" Journal of Economic Literature 51, no. 3 (Sept. 2013): 860-72.






March 24, 2014

Environmentalists Seek to Silence Those Who Dare to Disagree



(p. A13) Surely, some kind of ending is upon us. Last week climate protesters demanded the silencing of Charles Krauthammer for a Washington Post column that notices uncertainties in the global warming hypothesis. In coming weeks a libel trial gets under way brought by Penn State's Michael Mann, author of the famed hockey stick, against National Review, the Competitive Enterprise Institute, writer Rand Simberg and roving commentator Mark Steyn for making wisecracks about his climate work. The New York Times runs a cartoon of a climate "denier" being stabbed with an icicle.

These are indications of a political movement turned to defending its self-image as its cause goes down the drain.



For the full commentary, see:

HOLMAN W. JENKINS, JR. "BUSINESS WORLD; Personal Score-Settling Is the New Climate Agenda; The cause of global carbon regulation may be lost, but enemies still can be punished." The Wall Street Journal (Sat., March 1, 2014): A13.

(Note: the online version of the commentary has the date Feb. 28, 2014, and has the title "BUSINESS WORLD; Jenkins: Personal Score-Settling Is the New Climate Agenda; The cause of global carbon regulation may be lost, but enemies still can be punished.")



The Krauthammer column that the environmentalists do not want you to read:

Krauthammer, Charles. "The Myth of 'Settled Science'." The Washington Post (Fri., Feb. 21, 2014): A19.






March 16, 2014

Many Important Medical Articles Cannot Be Replicated




The standard scientific method is more fallible, and less logically rigorous, than is generally admitted. One implication is to strengthen the case for allowing patients considerable freedom in choosing their own treatments.


(p. D1) It has been jarring to learn in recent years that a reproducible result may actually be the rarest of birds. Replication, the ability of another lab to reproduce a finding, is the gold standard of science, reassurance that you have discovered something true. But that is getting harder all the time. With the most accessible truths already discovered, what remains are often subtle effects, some so delicate that they can be conjured up only under ideal circumstances, using highly specialized techniques.

Fears that this is resulting in some questionable findings began to emerge in 2005, when Dr. John P. A. Ioannidis, a kind of meta-scientist who researches research, wrote a paper pointedly titled "Why Most Published Research Findings Are False."
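The arithmetic behind Ioannidis's provocative title can be sketched briefly. Under standard assumptions (a significance level, statistical power, and a prior probability that a tested hypothesis is true), the chance that a "significant" finding is actually true follows from Bayes' rule; the numbers below are illustrative inputs, not figures from his paper.

```python
# Positive predictive value of a statistically significant finding,
# following the standard argument in Ioannidis (2005). The priors used
# below are illustrative assumptions.
def ppv(prior, power=0.8, alpha=0.05):
    """P(hypothesis true | significant result), ignoring bias and multiplicity."""
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)

print(round(ppv(prior=0.5), 3))   # well-motivated hypotheses: 0.941
print(round(ppv(prior=0.01), 3))  # exploratory long shots: 0.139
```

When only 1 in 100 tested hypotheses is true, as in exploratory searches over many candidate effects, fewer than 14% of "significant" findings are real even before any bias enters; Ioannidis's paper adds bias and competing research teams, which drive the figure lower still.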


. . .


. . . he published another blockbuster, examining more than a decade's worth of highly regarded papers -- the effect of a daily aspirin on cardiac disease, for example, or the risks of hormone replacement therapy for older women. He found that a large proportion of the conclusions were undermined or contradicted by later studies.

His work was just the beginning. Concern about the problem has reached the point that the journal Nature has assembled an archive, filled with reports and analyses, called Challenges in Irreproducible Research.

Among them is a paper in which C. Glenn Begley, who is chief scientific officer at TetraLogic Pharmaceuticals, described an experience he had while at Amgen, another drug company. He and his colleagues could not replicate 47 of 53 landmark papers about cancer. Some of the results could not be reproduced even with the help of the original scientists working in their own labs.



For the full commentary, see:

GEORGE JOHNSON. "Raw Data; New Truths That Only One Can See." The New York Times (Tues., JAN. 21, 2014): D1 & D6.

(Note: ellipses added.)

(Note: the online version of the commentary has the date JAN. 20, 2014.)


The first Ioannidis article mentioned above is:

Ioannidis, John P. A. "Why Most Published Research Findings Are False." PLoS Medicine 2, no. 8 (August 2005): 696-701.


The second Ioannidis article mentioned above is:

Ioannidis, John P. A. "Contradicted and Initially Stronger Effects in Highly Cited Clinical Research." JAMA 294, no. 2 (July 13, 2005): 218-28.


The Begley article mentioned above is:

Begley, C. Glenn, and Lee M. Ellis. "Drug Development: Raise Standards for Preclinical Cancer Research." Nature 483, no. 7391 (March 29, 2012): 531-33.






March 1, 2014

Better to Fail at Solving a Big Problem, than to Succeed at a Minor One?



BrilliantBlundersBK2014-02-23.jpg

Source of book image: http://ecx.images-amazon.com/images/I/61s10qMqpxL._SL1400_.jpg



Francis Collins, head of the NIH, discusses a favorite book of 2013:



(p. C6) Taking risks is part of genius, and genius is not immune to bloopers. Mario Livio's "Brilliant Blunders" leads us through the circumstances that surrounded famous gaffes.   . . .   Mr. Livio helps us see that such spectacular errors are opportunities rather than setbacks. There's a lesson for young scientists here. Boldly attacking problems of fundamental significance can have more impact than pursuing precise solutions to minor questions--even if there are a few bungles along the way.


For the full article, see:

"12 Months of Reading; We asked 50 of our friends--from April Bloomfield to Mike Tyson--to name their favorite books of 2013." The Wall Street Journal (Sat., Dec. 14, 2013): C6 & C9-C12.

(Note: the online version of the article has the date Dec. 13, 2013.)


The book that Collins praises is:

Livio, Mario. Brilliant Blunders: From Darwin to Einstein - Colossal Mistakes by Great Scientists That Changed Our Understanding of Life and the Universe. New York: Simon & Schuster, 2013.






February 17, 2014

Would Science Progress Faster If It Were Less Academic and More Entrepreneurial?



BootstrapGeologistBK2014-01-18.jpg

Source of caption and photo: online version of the NYT article quoted and cited below.




(p. D5) There is Big Science, defined as science that gets the big bucks. There is tried and true science, which, from an adventurous dissident's point of view, is boldly going where others have gone before but extending the prevailing knowledge by a couple of decimal places (a safe approach for dissertation writers and grant seekers).

Then there is bootstrap science, personified by Gene Shinn, who retired in 2006 after 31 years with the United States Geological Survey and 15 years with a research arm of the Shell Oil Company.


. . .


Without a Ph.D. and often without much financing, Mr. Shinn published more than 120 peer-reviewed papers that helped change many experts' views on subjects like how coral reefs expand and the underwater formation of limestone. Some of his papers, at odds with established scientific views, were initially rejected, only to be seen later as visionary.

His bootstrap ingredients included boundless curiosity, big ideas -- "gee-whiz science," he calls it -- persistence, a sure hand at underwater demolition (dynamite was comparatively easy to come by in those remarkably innocent days) and versatility at improvising core-sampling equipment on tight budgets. The ability to enlist the talents of other scientists, many with doctorates, who shared his love of hands-on field work and his impatience with official rules and permits added to the mix.



For the full review, see:

MICHAEL POLLAK. "BOOKS; Science on His Own Terms." The New York Times (Tues., November 5, 2013): D5.

(Note: the online version of the review has the date November 4, 2013.)


Book under review:

Shinn, Eugene A. Bootstrap Geologist: My Life in Science. Gainesville, FL: University Press of Florida, 2013.






January 27, 2014

The Use of Note Cards to Structure Writing



(p. A21) I tell college students that by the time they sit down at the keyboard to write their essays, they should be at least 80 percent done. That's because "writing" is mostly gathering and structuring ideas.

For what it's worth, I structure geographically. I organize my notes into different piles on the rug in my living room. Each pile represents a different paragraph in my column. The piles can stretch on for 10 feet to 16 feet, even for a mere 806-word newspaper piece. When "writing," I just pick up a pile, synthesize the notes into a paragraph, set them aside and move on to the next pile. If the piece isn't working, I don't try to repair; I start from scratch with the same topic but an entirely new structure.

The longtime New Yorker writer John McPhee wonderfully described his process in an essay just called "Structure." For one long article, McPhee organized his notecards on a 32-square-foot piece of plywood. He also describes the common tension between chronology and theme (my advice: go with chronology). His structures are brilliant, but they are far too complex for most of us. The key thing is he lets you see how a really fine writer thinks about the core problem of writing, which takes place before the actual writing.



For the full commentary, see:

DAVID BROOKS. "The Sidney Awards, Part 2." The New York Times (Tues., December 31, 2013): A21. [National Edition]

(Note: the online version of the commentary has the date December 30, 2013.)


The article praised by Brooks is:

McPhee, John. "Structure." The New Yorker (Jan. 14, 2013): 46-55.






September 19, 2013

Key to Google: "Both Larry and Sergey Were Montessori Kids"



(p. 121) [Marissa Mayer] conceded that to an outsider, Google's new-business process might indeed look strange. Google spun out projects like buckshot, blasting a spray and using tools and measurements to see what it hit. And sometimes it did try ideas that seemed ill suited or just plain odd. Finally she burst out with her version of the corporate Rosebud. "You can't understand Google," she said, "unless you know that both Larry and Sergey were Montessori kids."

"Montessori" refers to schools based on the educational philosophy of Maria Montessori, an Italian physician born in 1870 who believed that children should be allowed the freedom to pursue what interested them.

(p. 122) "It's really ingrained in their personalities," she said. "To ask their own questions, do their own things. To disrespect authority. Do something because it makes sense, not because some authority figure told you. In Montessori school you go paint because you have something to express or you just want to do it that afternoon, not because the teacher said so. This is really baked into how Larry and Sergey approach problems. They're always asking 'Why should it be like that?' It's the way their brains were programmed early on."



Source:

Levy, Steven. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New York: Simon & Schuster, 2011.

(Note: bracketed name added.)






July 18, 2013

Ignoring Einstein's Mistakes by Deifying Him, Makes Us Forget His Struggles



EinsteinsMistakesBK2013-07-17.jpg

Source of book image: http://ecx.images-amazon.com/images/I/41zyL4LVYxL.jpg




(p. A13) Mr. Ohanian finds that four out of five of the seminal papers that Einstein produced in the so-called "miracle year" of 1905, when he was working as a patent inspector in Bern, were "infested with flaws."


. . .


. . . he notes Einstein's errors for a purpose, showing us why his achievement was all the greater for them.

In this Mr. Ohanian provides a useful corrective, for there is a tendency, even today, to deify Einstein and other men of genius, treating them as if they were immortal gods. Einstein himself objected to the practice even as he reveled in his fame. "It is not fair," he once observed, "to select a few individuals for boundless admiration and to attribute superhuman powers of mind and of character to them." In doing so, ironically, we make less of the person, not more, forgetting and simplifying their struggle.


. . .


. . . Einstein's ability to make use of his mistakes as "stepping stones and shortcuts" was central to his success, in Mr. Ohanian's view. To see Einstein's wanderings not as the strides of a god-like genius but as the steps and missteps of a man -- fallible and imperfect -- does not diminish our respect for him but rather enhances it.



For the full review, see:

McMahon, Darrin M. "BOOKSHELF; Great and Imperfect." The Wall Street Journal (Fri., September 5, 2008): A13.

(Note: ellipses added.)


The book under review is:

Ohanian, Hans C. Einstein's Mistakes: The Human Failings of Genius. New York: W. W. Norton & Company, 2008.






June 24, 2013

We Should Disenthrall Ourselves of False Scientific Certainties



An Optimists Tour of the Future CoverBK2013-06-21.jpg

Source of book image: http://2.bp.blogspot.com/-ELpfH2bTO7c/Tb53WpKuDxI/AAAAAAAADrE/Zq8BQiiasJc/s640/An+Optimists+Tour+of+the+Future+Cover.jpg



(p. C4) Among the scientific certainties I have had to unlearn: that upbringing strongly shapes your personality; that nurture is the opposite of nature; that dietary fat causes obesity more than dietary carbohydrate; that carbon dioxide has been the main driver of climate change in the past.

I came across a rather good word for this kind of unlearning--"disenthrall"--in Mark Stevenson's book "An Optimist's Tour of the Future," published just this week. Mr. Stevenson borrows it from Abraham Lincoln, whose 1862 message to Congress speaks of disenthralling ourselves of "the dogmas of the quiet past" in order to "think anew."

Mr. Stevenson's disenthrallment comes in the course of a series of sharp and fascinating interviews with technological innovators and scientific visionaries. This disenthralls him of the pessimism about the future and nostalgia about the past that he barely realized he had and whose "fingers reach deep into [his] soul." It eventually turns him into an optimist almost as ludicrously sanguine about the 21st century as I am: "I steadfastly refuse to believe that human society can't grow, improve and learn; that it can't embrace change and remake the world better."

Along the way, Mr. Stevenson is struck by other examples of how the way he thinks and reasons is "in thrall to a world that is passing." The first of these bad habits is linear thinking about the future. . . .

We expect to see changes coming gradually, but because things like computing power or the cheapness of genome sequencing change exponentially, technologies can go from impossible to cheap quite suddenly and with little warning.
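The exponential dynamic described above can be sketched with hypothetical numbers (mine, not the book's): a capability that doubles every two years soon dwarfs any straight-line forecast made from its early progress.

```python
# Hypothetical numbers, not from the book: a capability that doubles
# every two years versus a forecast that extrapolates early growth linearly.

def exponential_capability(year, start=1.0, doubling_time=2.0):
    """Capability that doubles every `doubling_time` years."""
    return start * 2.0 ** (year / doubling_time)

def linear_forecast(year, start=1.0, doubling_time=2.0):
    """Extrapolate the growth over the first doubling as a constant slope."""
    slope = (exponential_capability(doubling_time) - start) / doubling_time
    return start + slope * year

# After two decades the linear view has fallen hopelessly behind.
actual = exponential_capability(20)   # 2**10 = 1024x the starting level
forecast = linear_forecast(20)        # only 11x the starting level
print(actual, forecast)
```

This is why, as the passage notes, a technology can seem impossible under linear thinking right up until it is suddenly cheap.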



For the full commentary, see:

MATT RIDLEY. "MIND & MATTER; A Key Lesson of Adulthood: The Need to Unlearn." The Wall Street Journal (Sat., February 5, 2011): C4.

(Note: ellipsis added.)


The book praised by Ridley, in the passages quoted above, is:

Stevenson, Mark. An Optimist's Tour of the Future: One Curious Man Sets out to Answer "What's Next?". New York: Avery, 2011.






April 21, 2013

Analytical Solutions Require Unrealistic Assumptions that Make Models Useless for Policy



Source of book image: http://www.anderson.ucla.edu/faculty/edward.leamer/images/COVER%209120_jkt_Rev1.jpg



(p. 190) When I was a younger man, I and all of my cohort were apprehensive if we saw Ed Leamer in the audience when we were presenting a paper. His comments were blunt, incisive, and often negative. But what truly terrified us was that he was almost always right. . . .

Leamer has produced a highly original little book, with big insights and lessons for us all. He explores the tension between economics that is mathematically sophisticated and complex but often vacuous, versus economics that may be vague but which is useful and carries a message. It is frankly a remarkable work, full of insights and persuasive arguments that need to be read, debated, and taken seriously.


. . .


(p. 191) But this is no rant of an old guy. Leamer gets very specific about his notions of usefulness versus rigor. A good drum to bang on is Samuelson, an important "mathematizer." I would strongly encourage all young trade economists and perhaps all graduate students who have been subjected to a traditional international trade course at any level, to read the section on factor-price equalization. This is beautifully done and even exciting and funny at times. As told by Leamer, the young Samuelson excoriates Ohlin for largely dismissing the possibility of factor-price equalization and then presents his (Samuelson's) "proof" of factor-price equalization. The latter, of course, is a theorem that is mathematically correct given the assumptions, but Ohlin is talking about its usefulness in understanding the world and constructing policy. The factor-price-equalization theorem is indeed a prime example of something that is valid but not useful.


. . .


Yet at the same time, I have thought long and hard about exactly what message should be given to graduate students and assistant professors without much success. The journal publishing business puts a huge premium on rigor over usefulness and few referees or editors are inclined to take the chance inherent in accepting papers that are a bit loose in their analytical or econometric structures, no matter how exciting they might be. If you accept that, then the profession as a whole has to rethink our view of what is an important scientific contribution: I cannot simply tell graduate students to think more broadly and worry less about elegance. Some will of course deny that there is any tension, but I side with Leamer. Over and over again, I hear, read, and/or referee papers (p. 192) where, in order to get an analytical solution to a model, the author has to assume away almost every interesting feature of the problem to the point that the remaining model is uninteresting and uninformative. But that at least qualifies the paper for possible publication in Econometrica, RESTud, or JET.



For the full review, see:

Markusen, James R. "Book Review of Ed Leamer's The Craft of Economics." Journal of Economic Literature 51, no. 1 (2013): 190-92.

(Note: ellipses added; italics in original.)


The book under review is:

Leamer, Edward E. The Craft of Economics, Ohlin Lectures. Cambridge, MA: The MIT Press, 2012.






March 13, 2013

To Avoid Economic Crises We Need to Look at Evidence from Economic History



(p. 1093) Methodologically, the most fundamental and forceful message from the book is that, by ignoring history and the fact that crises remain frequent, recurrent, episodic events--in both rich and poor countries--almost everyone, including researchers and policymakers, made themselves vulnerable to the wishful thinking encapsulated in the book's title. There is a deeper statistical point here. Crises, and for that matter large recessions and other phenomena that are of first-order interest given their implications for economic activity, occur at quite a low frequency. They are rare events, meaning that they do not occur so frequently, at least for most countries in a short-span time series. Thus recent experience can be an unfaithful guide for scholars and statesmen alike, a good example being the complacent thinking that accompanied the erstwhile Great Moderation of recent decades even as financial pressures built up nationally and internationally. Possibly the most important lesson that readers will take away from this book is that if we are to do better in future, from our policy thinking in the chambers of power to our macroeconometric analyses in academe, (p. 1094) we need to admit the existence of, and come to grips with, a much broader universe of evidence.
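Taylor's statistical point about low-frequency events can be illustrated with a back-of-the-envelope calculation (the 4% annual crisis probability is a hypothetical number of mine, not an estimate from the book):

```python
# Illustrative arithmetic with a hypothetical crisis probability: if a
# crisis strikes independently with probability 4% in any given year, how
# often does a short sample contain no crisis at all?

def prob_no_crisis(p_crisis_per_year, years):
    """Probability that `years` consecutive years are all crisis-free."""
    return (1.0 - p_crisis_per_year) ** years

# A 25-year, Great-Moderation-length window is quite likely to look calm:
print(prob_no_crisis(0.04, 25))   # ~0.36, roughly a 1-in-3 chance of seeing no crisis
# An eight-century window, the book's sample, almost surely is not:
print(prob_no_crisis(0.04, 800))  # effectively zero
```

Recent experience alone can thus easily look reassuring even when the underlying risk has not changed, which is Taylor's case for the broader universe of historical evidence.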


For the full review, see:

Taylor, Alan M. "Global Financial Stability and the Lessons of History: A Review of Carmen M. Reinhart and Kenneth S. Rogoff's This Time Is Different: Eight Centuries of Financial Folly." Journal of Economic Literature 50, no. 4 (Dec. 2012): 1092-105.

(Note: italics in original.)


The book that Taylor reviews is:

Reinhart, Carmen M., and Kenneth Rogoff. This Time Is Different: Eight Centuries of Financial Folly. Princeton, NJ: Princeton University Press, 2009.







February 22, 2013

Darwin Shared His Thought Processes Without Condescension



"SAGE OF AGES; Portrait of Charles Darwin in 1881, by Julia Margaret Cameron." Source of caption and photo: online version of the WSJ article quoted and cited below.






(p. C14) . . . Mr. Johnson observes:

No scientific innovator has ever taken more trouble to smooth the way for lay readers without descending into vulgarity. What is almost miraculous about the book is Darwin's generosity in sharing his thought processes, his lack of condescension. There is no talking down, but no hauteur, either. It is a gentlemanly book.

In both style and substance, this passage is classic Paul Johnson.


. . .


What makes Darwin good, in the biographer's estimation, is the scientist's democratic dissemination of knowledge. Darwin triumphed with "The Origin of Species," Mr. Johnson contends, not only because of his ability to portray the theory of evolution as the inescapable outcome of his decades of study and the work of fellow scientists, whom he was careful to praise, but because he was acutely aware that he had to present his notions of natural selection and survival of the fittest so as not to stir up public controversy. To an extraordinary degree, Darwin deflected attacks by couching his discoveries in terms of the plants he liked to examine and cultivate. He had relatively little to say about human evolution.



For the full review, see:

CARL ROLLYSON. "Studies of the Moral Animal." The Wall Street Journal (Sat., December 15, 2012): C14.

(Note: ellipses added.)

(Note: the online version of the review essay has the date December 14, 2012.)



The book under review is:

Johnson, Paul M. Darwin: Portrait of a Genius. New York: Viking Adult, 2012.







February 4, 2013

Social Scientists Prefer Articles that Contain Bogus Math



Source of graphic: online version of the WSJ article quoted and cited below.




(p. A2) . . . research has shown that even those who should be especially clear-sighted about numbers--scientific researchers, for example, and those who review their work for publication--are often uncomfortable with, and credulous about, mathematical material. As a result, some research that finds its way into respected journals--and ends up being reported in the popular press--is flawed.

In the latest study, Kimmo Eriksson, a mathematician and researcher of social psychology at Sweden's Mälardalen University, chose two abstracts from papers published in research journals, one in evolutionary anthropology and one in sociology. He gave them to 200 people to rate for quality--with one twist. At random, one of the two abstracts received an additional sentence, the one above with the math equation, which he pulled from an unrelated paper in psychology. The study's 200 participants all had master's or doctoral degrees. Those with degrees in math, science or technology rated the abstract with the tacked-on sentence as slightly lower-quality than the other. But participants with degrees in humanities, social science or other fields preferred the one with the bogus math, with some rating it much more highly on a scale of 0 to 100.

"Math makes a research paper look solid, but the real science lies not in math but in trying one's utmost to understand the real workings of the world," Prof. Eriksson said.



For the full story, see:

CARL BIALIK. "THE NUMBERS GUY; Don't Let Math Pull the Wool Over Your Eyes." The Wall Street Journal (Sat., January 5, 2013): A2.

(Note: ellipsis added.)

(Note: the online version of the story has the date January 4, 2013.)



A pdf of Eriksson's published article can be downloaded from:

Eriksson, Kimmo. "The Nonsense Math Effect." Judgment and Decision Making 7, no. 6 (November 2012): 746-49.






January 24, 2013

Economics Should Be in "Broad-Exploration Mode"



(p. 85) What does concern me about my discipline, . . . , is that its current core--by which I mainly mean the so-called dynamic stochastic general equilibrium approach--has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one. This is dangerous for both methodological and policy reasons. On the methodology front, macroeconomic research has been in "fine-tuning" mode within the local-maximum of the dynamic stochastic general equilibrium world, when we should be in "broad-exploration" mode. We are too far (p. 86) from absolute truth to be so specialized and to make the kind of confident quantitative claims that often emerge from the core. On the policy front, this confused precision creates the illusion that a minor adjustment in the standard policy framework will prevent future crises, and by doing so it leaves us overly exposed to the new and unexpected.


. . .


(p. 100) Going back to our macroeconomic models, we need to spend much more effort in understanding the topology of interactions in real economies. The financial sector and its recent struggles have made this need vividly clear, but this issue is certainly not exclusive to this sector.

The challenges are big, but macroeconomists can no longer continue playing internal games. The alternative of leaving all the important stuff to the "policy"-types (p. 101) and informal commentators cannot be the right approach. I do not have the answer. But I suspect that whatever the solution ultimately is, we will accelerate our convergence to it, and reduce the damage we do along the transition, if we focus on reducing the extent of our pretense-of-knowledge syndrome.



Source:

Caballero, Ricardo J. "Macroeconomics after the Crisis: Time to Deal with the Pretense-of-Knowledge Syndrome." Journal of Economic Perspectives 24, no. 4 (Fall 2010): 85-102.

(Note: ellipses added.)






January 12, 2013

Solow Testifies on Irrelevance of DSGE Macro Models




In Nobel-prize-winner Robert Solow's congressional testimony, quoted below, "DSGE" is an abbreviation for "dynamic stochastic general equilibrium."


(p. 221) Solow argues: "It may be unusual for the Committee to focus on so abstract a question, but it is certainly natural and urgent. Here we are, still near the bottom of a deep and prolonged recession, with the immediate future uncertain, desperately short of jobs, and the approach to macroeconomics that dominates serious thinking, certainly in our elite universities and in many central banks and other influential policy circles, seems to have absolutely nothing to say about the problem. Not only does it (p. 222) offer no guidance or insight, it really seems to have nothing useful to say. . . . Especially when it comes to matters as important as macroeconomics, a mainstream economist like me insists that every proposition must pass the smell test: does this really make sense? I do not think that the currently popular DSGE models pass the smell test."


Source:

Taylor, Timothy. "Recommendations for Further Reading." Journal of Economic Perspectives 24, no. 4 (Fall 2010): 219-26.

(Note: ellipsis in original.)






December 29, 2012

Debating Grammar: "Think Different" or "Think Differently"



(p. 329) They debated the grammatical issue: If "different" was supposed to modify the verb "think," it should be an adverb, as in "think dif-(p. 330)ferently." But Jobs insisted that he wanted "different" to be used as a noun, as in "think victory" or "think beauty." Also, it echoed colloquial use, as in "think big." Jobs later explained, "We discussed whether it was correct before we ran it. It's grammatical, if you think about what we're trying to say. It's not think the same, it's think different. Think a little different, think a lot different, think different. 'Think differently' wouldn't hit the meaning for me."


Source:

Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.






December 23, 2012

Internet Posting May Be Replacing Peer Reviewed Publishing



The article quoted below provides additional signs that institutions of knowledge production and dissemination may be changing in important ways. (Wikipedia is another, even bigger, sign.)


(p. 635) Over the past decade, there has been a decline in the fraction of papers in top economics journals written by economists from the highest-ranked economics departments. This paper documents this fact and uses additional data on publications and citations to assess various potential explanations. Several observations are consistent with the hypothesis that the Internet improves the ability of high-profile authors to disseminate their research without going through the traditional peer-review process.


Source:

Ellison, Glenn. "Is Peer Review in Decline?" Economic Inquiry 49, no. 3 (July 2011): 635-57.






December 8, 2012

"It Isn't What You Know that Counts--It Is How Efficiently You Can Refresh"



Source of book image: online version of the WSJ review quoted and cited below.







(p. A17) Knowledge, then, is less a canon than a consensus in a state of constant disruption. Part of the disruption has to do with error and its correction, but another part with simple newness--outright discoveries or new modes of classification and analysis, often enabled by technology.


. . .


In some cases, the facts themselves are variable.  . . .


. . .


More commonly, however, changes in scientific facts reflect the way that science is done. Mr. Arbesman describes the "Decline Effect"--the tendency of an original scientific publication to present results that seem far more compelling than those of later studies. Such a tendency has been documented in the medical literature over the past decade by John Ioannidis, a researcher at Stanford, in areas as diverse as HIV therapy, angioplasty and stroke treatment. The cause of the decline may well be a potent combination of random chance (generating an excessively impressive result) and publication bias (leading positive results to get preferentially published).

If shaky claims enter the realm of science too quickly, firmer ones often meet resistance. As Mr. Arbesman notes, scientists struggle to let go of long-held beliefs, something that Daniel Kahneman has described as "theory-induced blindness." Had the Austrian medical community in the 1840s accepted the controversial conclusions of Dr. Ignaz Semmelweis that physicians were responsible for the spread of childbed fever--and heeded his hand-washing recommendations--a devastating outbreak of the disease might have been averted.

Science, Mr. Arbesman observes, is a "terribly human endeavor." Knowledge grows but carries with it uncertainty and error; today's scientific doctrine may become tomorrow's cautionary tale. What is to be done? The right response, according to Mr. Arbesman, is to embrace change rather than fight it. "Far better than learning facts is learning how to adapt to changing facts," he says. "Stop memorizing things . . . memories can be outsourced to the cloud." In other words: In a world of information flux, it isn't what you know that counts--it is how efficiently you can refresh.
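The mechanism behind the "Decline Effect" mentioned above, random chance plus publication bias, can be sketched in a small simulation (all numbers are hypothetical):

```python
# A minimal simulation of the decline effect: a small true effect, noisy
# studies, and a journal that publishes only impressive-looking results.
# Published effects then overstate the truth, so later replications show
# an apparent "decline." All numbers are hypothetical.
import random
import statistics

random.seed(0)
TRUE_EFFECT = 0.2        # the real (modest) effect size
NOISE = 1.0              # sampling noise in each study's estimate
PUBLICATION_BAR = 1.0    # journals accept only estimates above this

published = []
replications = []
for _ in range(100_000):
    estimate = random.gauss(TRUE_EFFECT, NOISE)
    if estimate > PUBLICATION_BAR:          # publication bias at work
        published.append(estimate)
        # an unbiased replication of the same underlying effect
        replications.append(random.gauss(TRUE_EFFECT, NOISE))

print(statistics.mean(published))      # far above the true 0.2
print(statistics.mean(replications))   # back near 0.2: the decline effect
```

Nothing declined in reality; the original published estimates were simply selected for being lucky.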



For the full review, see:

DAVID A. SHAYWITZ. "BOOKSHELF; The Scientific Blind Spot." The Wall Street Journal (Mon., November 19, 2012): A17.

(Note: ellipses added, except for the one internal to the last paragraph, which was in the original.)

(Note: the online version of the article was dated November 18, 2012.)


The book under review is:

Arbesman, Samuel. The Half-Life of Facts: Why Everything We Know Has an Expiration Date. New York: Current, 2012.






December 5, 2012

Progress of Economic Science on Central Banking



The passage below is a comment by former head of the Fed, Paul Volcker.


(p. 25) . . . I recently commented to some of my economist friends that I'm not aware of any large contribution that economic science has made to central banking in the last 50 years or so.

Our ability to forecast is still very limited. The old issues of the relative role of fiscal and monetary policies are still debated. Markets are certainly more complex, and some of the old approaches toward monetary control seem less relevant. Recent events have certainly illustrated limitations in our understanding of the economy.

The advent of floating exchange rates, which partly reflects a shift in academic thinking, has certainly been important, but the underlying problems of policy seem familiar.



Source:

Stern, Gary H., interviewer. "Paul A. Volcker in Conversation with Gary H. Stern." The Region (September 2009): 18-29.

(Note: ellipsis added.)






November 23, 2012

Econometrician Leamer Argues for Methodological Pluralism



(p. 44) Ignorance is a formidable foe, and to have hope of even modest victories, we economists need to use every resource and every weapon we can muster, including thought experiments (theory), and the analysis of data from nonexperiments, accidental experiments, and designed experiments. We should be celebrating the small genuine victories of the economists who use their tools most effectively, and we should dial back our adoration of those who can carry the biggest and brightest and least-understood weapons. We would benefit from some serious humility, and from burning our "Mission Accomplished" banners. It's never gonna happen.


Source:

Leamer, Edward E. "Tantalus on the Road to Asymptopia." Journal of Economic Perspectives 24, no. 2 (Spring 2010): 31-46.






November 19, 2012

Econometric "Priests" Sell Their New "Gimmicks" as the "Latest Euphoria Drug"



The American Economic Association's Journal of Economic Perspectives published a symposium focused on the thought-provoking views of the distinguished econometrician Edward Leamer.

I quote below some of Leamer's comments in his own contribution to the symposium.


(p. 31) We economists trudge relentlessly toward Asymptopia, where data are unlimited and estimates are consistent, where the laws of large numbers apply perfectly and where the full intricacies of the economy are completely revealed. But it's a frustrating journey, since, no matter how far we travel, Asymptopia remains infinitely far away. Worst of all, when we feel pumped up with our progress, a tectonic shift can occur, like the Panic of 2008, making it seem as though our long journey has left us disappointingly close to the State of Complete Ignorance whence we began.

The pointlessness of much of our daily activity makes us receptive when the Priests of our tribe ring the bells and announce a shortened path to Asymptopia. (Remember the Cowles Foundation offering asymptotic properties of simultaneous equations estimates and structural parameters?) We may listen, but we don't hear, when the Priests warn that the new direction is only for those with Faith, those with complete belief in the Assumptions of the Path. It often takes years down the Path, but sooner or later, someone articulates the concerns that gnaw away in each of (p. 32) us and asks if the Assumptions are valid. (T. C. Liu (1960) and Christopher Sims (1980) were the ones who proclaimed that the Cowles Emperor had no clothes.) Small seeds of doubt in each of us inevitably turn to despair and we abandon that direction and seek another.

Two of the latest products-to-end-all-suffering are nonparametric estimation and consistent standard errors, which promise results without assumptions, as if we were already in Asymptopia where data are so plentiful that no assumptions are needed. But like procedures that rely explicitly on assumptions, these new methods work well in the circumstances in which explicit or hidden assumptions hold tolerably well and poorly otherwise. By disguising the assumptions on which nonparametric methods and consistent standard errors rely, the purveyors of these methods have made it impossible to have an intelligible conversation about the circumstances in which their gimmicks do not work well and ought not to be used. As for me, I prefer to carry parameters on my journey so I know where I am and where I am going, not travel stoned on the latest euphoria drug.

This is a story of Tantalus, grasping for knowledge that remains always beyond reach. In Greek mythology Tantalus was favored among all mortals by being asked to dine with the gods. But he misbehaved--some say by trying to take divine food back to the mortals, some say by inviting the gods to a dinner for which Tantalus boiled his son and served him as the main dish. Whatever the etiquette faux pas, Tantalus was punished by being immersed up to his neck in water. When he bowed his head to drink, the water drained away, and when he stretched up to eat the fruit hanging above him, wind would blow it out of reach. It would be much healthier for all of us if we could accept our fate, recognize that perfect knowledge will be forever beyond our reach and find happiness with what we have. If we stopped grasping for the apple of Asymptopia, we would discover that our pool of Tantalus is full of small but enjoyable insights and wisdom.



For the full article, see:

Leamer, Edward E. "Tantalus on the Road to Asymptopia." Journal of Economic Perspectives 24, no. 2 (Spring 2010): 31-46.






November 8, 2012

Coase: "Firms Never Calculate Marginal Costs"







Source of YouTube video:

http://www.youtube.com/watch?feature=player_embedded&v=ZAq06n79QIs#!




(p. 257) You can watch a 99-year-old Ronald Coase speaking in December 2009 for 25 minutes on the subjects of "Markets, Firms and Property Rights." "One of the things that people don't understand is that markets are creations. . . . In fact, it's very difficult to imagine that firms act in the way that is described in the textbooks, where you maximize profits by equating marginal costs and marginal revenues. One of the reasons one can feel doubtful about this particular way of looking at things is that firms never calculate marginal costs . . . I think we ought to study directly how firms operate and develop our theory accordingly." From the conference "Markets, Firms and Property Rights: A Celebration of the Research of Ronald Coase," held at the University of Chicago Law School by the Information Economy Project at George Mason University School of Law. The webpage also includes video of seven panels of prominent speakers, along with PDF files of a dozen or so papers given at the conference. Available at 〈http://iep.gmu.edu/CoaseConference.php〉.
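The textbook condition Coase doubts firms ever calculate can be sketched numerically (the demand and cost curves below are hypothetical, chosen only for illustration): profit peaks at the quantity where marginal revenue equals marginal cost.

```python
# The textbook profit-maximization condition, with hypothetical curves:
# at the profit-maximizing quantity, marginal revenue equals marginal cost.

def revenue(q):
    return (100 - q) * q           # linear inverse demand: p = 100 - q

def cost(q):
    return 20 * q + 0.5 * q ** 2   # rising marginal cost: 20 + q

def profit(q):
    return revenue(q) - cost(q)

best_q = max(range(101), key=profit)          # brute-force search over quantities
mr = revenue(best_q + 1) - revenue(best_q)    # discrete marginal revenue
mc = cost(best_q + 1) - cost(best_q)          # discrete marginal cost
print(best_q, mr, mc)  # at the optimum, MR and MC are (nearly) equal
```

Note that the brute-force search finds the optimum without ever computing a marginal quantity, which is loosely Coase's point: the MR = MC condition describes the optimum, not the procedure real firms use to find it.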


Source:

Taylor, Timothy. "Recommendations for Further Reading." Journal of Economic Perspectives 24, no. 3 (Summer 2010): 251-58.

(Note: ellipses in original.)






October 25, 2012

Reality Is Not Always "Elegant"



Source of book image: http://images.betterworldbooks.com/067/Ordinary-Geniuses-Segre-Gino-9780670022762.jpg



(p. C9) In the summer of 1953, while visiting Berkeley, Gamow was shown a copy of the article in Nature where Watson and Crick spelled out some of the genetic implications of their discovery that DNA is structured as a double helix. He immediately realized what was missing. Each helix is a linear sequence of four molecules known as bases. The sequence contains all the information that guides the manufacture of the proteins from which living things are made. Proteins are assembled from 20 different amino acids. What is the code that takes you from the string of bases to the amino acids? Gamow seems to have been the first to look at the problem in quite this way.

But he made a physicist's mistake: He thought that the code would be "elegant"--that each amino acid would be specified by only one string of bases. (These strings were dubbed "codons.") He produced a wonderfully clever code in which each codon consisted of three bases. That was the only part that was right. In the actual code sometimes three different codons correspond to the same amino acid, while some codons do not code for an amino acid at all. These irregularities are the results of evolutionary stops and starts, and no amount of cleverness could predict them.
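The combinatorics behind Gamow's conjecture can be checked directly. The mappings below are a small excerpt of the standard genetic code, included to show the degeneracy his elegant one-to-one scheme failed to predict:

```python
# Four bases taken three at a time give 64 codons, but there are only
# 20 amino acids (plus "stop"), so the real code must be degenerate.
from itertools import product

BASES = "ACGU"  # the four RNA bases
codons = ["".join(triple) for triple in product(BASES, repeat=3)]
print(len(codons))  # 64 possible codons for only 20 amino acids

# Excerpt of the standard genetic code: six different codons all spell
# leucine, while three codons code for no amino acid at all (stop signals).
code_excerpt = {
    "UUA": "Leu", "UUG": "Leu", "CUU": "Leu",
    "CUC": "Leu", "CUA": "Leu", "CUG": "Leu",
    "AUG": "Met",                      # also the start signal
    "UAA": "stop", "UAG": "stop", "UGA": "stop",
}
leucine_codons = [c for c, aa in code_excerpt.items() if aa == "Leu"]
print(len(leucine_codons))  # six codons, not the one-to-one code Gamow expected
```

The irregularity is exactly the reviewer's point: the actual mapping reflects evolutionary accident, not the elegance a physicist would guess.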



For the full review, see:

JEREMY BERNSTEIN. "The Inelegant Universe." The Wall Street Journal (Sat., August 13, 2011): C9.


The book under review is:

Segrè, Gino. Ordinary Geniuses: Max Delbrück, George Gamow, and the Origins of Genomics and Big Bang Cosmology. New York: Viking, 2011.






October 22, 2012

Paul Samuelson, in 2009 Interview, Says Economists Should Study Economic History



Conor Clarke interviewed Paul Samuelson in the summer of 2009. Since Samuelson died in December 2009, the interview was one of his last.

Samuelson was a student of Joseph Schumpeter at Harvard, and Schumpeter worked to get Samuelson financial support and a job. Near the end of his life, Schumpeter was ridiculed when he warned National Bureau of Economic Research (NBER) economists that they should not neglect economic history.

It took Paul Samuelson a long time to appreciate Schumpeter's truth.


Very last thing. What would you say to someone starting graduate study in economics? Where do you think the big developments in modern macro are going to be, or in the micro foundations of modern macro? Where does it go from here and how does the current crisis change it?

Well, I'd say, and this is probably a change from what I would have said when I was younger: Have a very healthy respect for the study of economic history, because that's the raw material out of which any of your conjectures or testings will come. And I think the recent period has illustrated that. The governor of the Bank of England seems to have forgotten or not known that there was no bank insurance in England, so when Northern Rock got a run, he was surprised. Well, he shouldn't have been.

But history doesn't tell its own story. You've got to bring to it all the statistical testings that are possible. And we have a lot more information now than we used to.



For the full interview, see:

Clarke, Conor. "An Interview with Paul Samuelson, Part Two." The Atlantic (2009), http://www.theatlantic.com/politics/archive/2009/06/an-interview-with-paul-samuelson-part-two/19627/.

(Note: bold indicates Conor question, and is bolded in original.)

(Note: the interview was posted on The Atlantic online website, but I do not believe that it ever appeared in the print version of the magazine.)






October 16, 2012

No Amount of Econometric Sophistication Will Substitute for Good Data



(p. 234) Using a powerful method due to Singh, we have established a relationship between God's attitude toward man and the amount of prayer (p. 235) transmitted to God. The method presented here is applicable to a number of important problems. Provided conditional density (1) is assumed, we do not need to observe a variable to compute its conditional expectation with respect to another variable whose density can be estimated. For example, one can extend current empirical work in a variety of areas of economics to estimate the effect of income on happiness or the effect of income inequality on democracy. We conjecture that this powerful method can be extended to the more general case when X is not observed either.


For the full article, from which the above is quoted, see:

Heckman, James. "The Effect of Prayer on God's Attitude toward Mankind." Economic Inquiry 48, no. 1 (Jan. 2010): 234-35.






September 28, 2012

Reference Point Ignored Due to "Theory-Induced Blindness"



(p. 290) The omission of the reference point from the indifference map is a surprising case of theory-induced blindness, because we so often encounter cases in which the reference point obviously matters. In labor negotiations, it is well understood by both sides that the reference point is the existing contract and that the negotiations will focus on mutual demands for concessions relative to that reference point. The role of loss aversion in bargaining is also well understood: making concessions hurts. You have much (p. 291) personal experience of the role of reference point. If you changed jobs or locations, or even considered such a change, you surely remember that the features of the new place were coded as pluses or minuses relative to where you were. You may also have noticed that disadvantages loomed larger than advantages in this evaluation--loss aversion was at work. It is difficult to accept changes for the worse. For example, the minimal wage that unemployed workers would accept for new employment averages 90% of their previous wage, and it drops by less than 10% over a period of one year.
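The loss aversion Kahneman describes can be sketched with the standard prospect-theory value function, using the parameter estimates Tversky and Kahneman reported in 1992 (diminishing-sensitivity exponent of about 0.88 and a loss-aversion coefficient of about 2.25); the dollar amounts are mine, for illustration only:

```python
# Prospect-theory value function with the Tversky-Kahneman (1992)
# parameter estimates: outcomes are valued relative to the reference
# point, and losses are weighted more than twice as heavily as gains.

ALPHA = 0.88   # diminishing sensitivity to larger outcomes
LAMBDA = 2.25  # loss-aversion coefficient

def value(x):
    """Subjective value of an outcome x relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

# A $100 loss hurts more than twice as much as a $100 gain pleases:
print(value(100), value(-100))   # ~57.5 and ~-129.5
```

This asymmetry around the reference point is what makes concessions in bargaining hurt, and what makes the features of a new job get coded as losses more heavily than as gains.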


Source:

Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.





September 24, 2012

Kahneman Grants that "the Basic Concepts of Economics Are Essential Intellectual Tools"



(p. 286) Most graduate students in economics have heard about prospect theory and loss aversion, but you are unlikely to find these terms in the index of an introductory text in economics. I am sometimes pained by this omission, but in fact it is quite reasonable, because of the central role of rationality in basic economic theory. The standard concepts and results that undergraduates are taught are most easily explained by assuming that Econs do not make foolish mistakes. This assumption is truly necessary, and it would be undermined by introducing the Humans of prospect theory, whose evaluations of outcomes are unreasonably short-sighted.

There are good reasons for keeping prospect theory out of introductory texts. The basic concepts of economics are essential intellectual tools, which are not easy to grasp even with simplified and unrealistic assumptions about the nature of the economic agents who interact in markets. Raising questions about these assumptions even as they are introduced would be confusing, and perhaps demoralizing. It is reasonable to put priority on helping students acquire the basic tools of the discipline. Furthermore, the failure of rationality that is built into prospect theory is often irrelevant to the predictions of economic theory, which work out with great precision in some situations and provide good approximations in many others. In some contexts, however, the difference becomes significant: the Humans described by prospect theory are (p. 287) guided by the immediate emotional impact of gains and losses, not by long-term prospects of wealth and global utility.

I emphasized theory-induced blindness in my discussion of flaws in Bernoulli's model that remained unquestioned for more than two centuries. But of course theory-induced blindness is not restricted to expected utility theory. Prospect theory has flaws of its own, and theory-induced blindness to these flaws has contributed to its acceptance as the main alternative to utility theory.



Source:

Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.





September 21, 2012

Models Often "Ignore the Messiness of Reality"



SuperCooperatorsBK2012-08-31.png


Source of book image: http://www.namingandtreating.com/wp-content/uploads/2011/04/SuperCooperators_small.png



(p. 18) Nowak is one of the most exciting modelers working in the field of mathematical biology today. But a model, of course, is only as good as its assumptions, and biology is much messier than physics or chemistry. Nowak tells a joke about a man who approaches a shepherd and asks, ''If I tell you how many sheep you have, can I have one?'' The shepherd agrees and is astonished when the stranger answers, ''Eighty-three.'' As he turns to leave, the shepherd retorts: ''If I guess your profession, can I have the animal back?'' The stranger agrees. ''You must be a mathematical biologist.'' How did he know? ''Because you picked up my dog.''


. . .


Near the end of the book, Nowak describes Gustav Mahler's efforts, in his grandiloquent Third Symphony, to create an all-encompassing structure in which ''nature in its totality may ring and resound,'' adding, ''In my own way, I would like to think I have helped to give nature her voice too.'' But there remains a telling gap between the precision of the models and the generality of the advice Nowak offers for turning us all into supercooperators. We humans really are infinitely more complex than falling apples, metastasizing colons, even ant colonies. Idealized accounts of the world often need to ignore the messiness of reality. Mahler understood this. In 1896 he invited Bruno Walter to Lake Attersee to glimpse the score of the Third. As they walked beneath the mountains, Walter admonished Mahler to look at the vista, to which he replied, ''No use staring up there -- I've already composed it all away into my symphony!''



For the full review, see:

OREN HARMAN. "A Little Help from Your Friends." The New York Times Book Review (Sun., April 10, 2011): 18.

(Note: ellipsis added.)

(Note: the online version of the review has the date April 8, 2011, and has the title "How Evolution Explains Altruism.")


The full reference for the book under review is:

Nowak, Martin A., and Roger Highfield. Supercooperators: Altruism, Evolution, and Why We Need Each Other to Succeed. New York: Free Press, 2011.






September 20, 2012

Sticking with Expected Utility Theory as an Example of "Theory-Induced Blindness"



(p. 286) Perhaps carried away by their enthusiasm, [Rabin and Thaler] . . . concluded their article by recalling the famous Monty Python sketch in which a frustrated customer attempts to return a dead parrot to a pet store. The customer uses a long series of phrases to describe the state of the bird, culminating in "this is an ex-parrot." Rabin and Thaler went on to say that "it is time for economists to recognize that expected utility is an ex-hypothesis." Many economists saw this flippant statement as little short of blasphemy. However, the theory-induced blindness of accepting the utility of wealth as an explanation of attitudes to small losses is a legitimate target for humorous comment.


Source:

Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

(Note: bracketed names and ellipsis added.)





September 16, 2012

"Theory-Induced Blindness"



(p. 276) The mystery is how a conception of the utility of outcomes that is vulnerable to . . . obvious counterexamples survived for so long. I can explain (p. 277) it only by a weakness of the scholarly mind that I have often observed in myself. I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing. You give the theory the benefit of the doubt, trusting the community of experts who have accepted it. . . . As the psychologist Daniel Gilbert observed, disbelieving is hard work, and System 2 is easily tired.


Source:

Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

(Note: ellipses added.)





September 14, 2012

How Politics Trumps Peer Review in Medical Research



Abstract

The U.S. public biomedical research system is renowned for its peer review process that awards federal funds to meritorious research performers. Although congressional appropriators do not earmark federal funds for biomedical research performers, I argue that they support allocations for those research fields that are most likely to benefit performers in their constituencies. Such disguised transfers mitigate the reputational penalties to appropriators of interfering with a merit-driven system. I use data on all peer-reviewed grants by the National Institutes of Health during the years 1984-2003 and find that performers in the states of certain House Appropriations Committee members receive 5.9-10.3 percent more research funds than those at unrepresented institutions. The returns to representation are concentrated in state universities and small businesses. Members support funding for the projects of represented performers in fields in which they are relatively weak and counteract the distributive effect of the peer review process.

Source:

Hegde, Deepak. "Political Influence Behind the Veil of Peer Review: An Analysis of Public Biomedical Research Funding in the United States." Journal of Law and Economics 52, no. 4 (Nov. 2009): 665-90.






September 10, 2012

Economists Have "the Tools to Slap Together a Model to 'Explain' Any and All Phenomena"



(p. 755) The economist of today has the tools to slap together a model to 'explain' any and all phenomena that come to mind. The flood of models is rising higher and higher, spouting from an ever increasing number of journal outlets. In the midst of all this evidence of highly trained cleverness, it is difficult to retain the realisation that we are confronting a complex system 'the working of which we do not understand'. . . . That the economics profession might be humbled by recent events is a realisation devoutly to be wished.


Source:

Leijonhufvud, Axel. "Out of the Corridor: Keynes and the Crisis." Cambridge Journal of Economics 33, no. 4 (July 2009): 741-57.

(Note: ellipsis added.)

(Note: the passage above was quoted on the back cover of The Cato Journal 30, no. 2 (Spring/Summer 2010).)






July 30, 2012

Simple Algorithms Predict Better than Trained Experts



(p. 222) I never met Meehl, but he was one of my heroes from the time I read his Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence.

In the slim volume that he later called "my disturbing little book," Meehl reviewed the results of 20 studies that had analyzed whether clinical predictions based on the subjective impressions of trained professionals were more accurate than statistical predictions made by combining a few scores or ratings according to a rule. In a typical study, trained counselors predicted the grades of freshmen at the end of the school year. The counselors interviewed each student for forty-five minutes. They also had access to high school grades, several aptitude tests, and a four-page personal statement. The statistical algorithm used only a fraction of this information: high school grades and one aptitude test. Nevertheless, the formula was more accurate than 11 of the 14 counselors. Meehl reported generally sim-(p. 223)ilar results across a variety of other forecast outcomes, including violations of parole, success in pilot training, and criminal recidivism.

Not surprisingly, Meehl's book provoked shock and disbelief among clinical psychologists, and the controversy it started has engendered a stream of research that is still flowing today, more than fifty years after its publication. The number of studies reporting comparisons of clinical and statistical predictions has increased to roughly two hundred, but the score in the contest between algorithms and humans has not changed. About 60% of the studies have shown significantly better accuracy for the algorithms. The other comparisons scored a draw in accuracy, but a tie is tantamount to a win for the statistical rules, which are normally much less expensive to use than expert judgment. No exception has been convincingly documented.



Source:

Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

(Note: italics in original.)
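
The kind of "statistical prediction" Meehl pitted against the counselors can be sketched as an equal-weight combination of standardized predictors. Everything specific below--the variable names, the equal 0.5 weights, the norming numbers--is hypothetical; the point is only that the formula is a fixed, mechanical rule applied the same way to every student.

```python
def zscore(x, mean, sd):
    """Standardize a raw score against its population norms."""
    return (x - mean) / sd

def predict_standing(hs_gpa, aptitude, norms):
    """A Meehl-style statistical rule: combine high school grades and
    one aptitude test with fixed equal weights. No interview, no
    clinical impression -- the same formula every time."""
    z_gpa = zscore(hs_gpa, norms["gpa_mean"], norms["gpa_sd"])
    z_apt = zscore(aptitude, norms["apt_mean"], norms["apt_sd"])
    return 0.5 * z_gpa + 0.5 * z_apt

# Hypothetical norms, for illustration only:
norms = {"gpa_mean": 3.0, "gpa_sd": 0.5, "apt_mean": 500, "apt_sd": 100}
print(predict_standing(3.5, 600, norms))  # 1.0
```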





May 19, 2012

Observed Climate "Not in Good Agreement with Model Predictions"



The author of the following commentary is a Princeton physics professor:


(p. A13) What is happening to global temperatures in reality? The answer is: almost nothing for more than 10 years. Monthly values of the global temperature anomaly of the lower atmosphere, compiled at the University of Alabama from NASA satellite data, can be found at the website http://www.drroyspencer.com/latest-global-temperatures/. The latest (February 2012) monthly global temperature anomaly for the lower atmosphere was minus 0.12 degrees Celsius, slightly less than the average since the satellite record of temperatures began in 1979.

The lack of any statistically significant warming for over a decade has made it more difficult for the United Nations Intergovernmental Panel on Climate Change (IPCC) and its supporters to demonize the atmospheric gas CO2 which is released when fossil fuels are burned.


. . .


Frustrated by the lack of computer-predicted warming over the past decade, some IPCC supporters have been claiming that "extreme weather" has become more common because of more CO2. But there is no hard evidence this is true.


. . .


Large fluctuations from warm to cold winters have been the rule for the U.S., as one can see from records kept by the National Oceanic and Atmospheric Administration, NOAA. For example, the winters of 1932 and 1934 were as warm as or warmer than the 2011-2012 one and the winter of 1936 was much colder.


. . .


It is easy to be confused about climate, because we are constantly being warned about the horrible things that will happen or are already happening as a result of mankind's use of fossil fuels. But these ominous predictions are based on computer models. It is important to distinguish between what the climate is actually doing and what computer models predict. The observed response of the climate to more CO2 is not in good agreement with model predictions.


. . .


. . . we should . . . remember the description of how science works by the late, great physicist, Richard Feynman:

"In general we look for a new law by the following process. First we guess it. Then we compute the consequences of the guess to see what would be implied if this law that we guessed is right. Then we compare the result of the computation to nature, with experiment or experience; compare it directly with observation, to see if it works. If it disagrees with experiment it is wrong."



For the full commentary, see:

WILLIAM HAPPER. "Global Warming Models Are Wrong Again; The observed response of the climate to more CO2 is not in good agreement with predictions." The Wall Street Journal (Tues., March 27, 2012): A13.

(Note: ellipses added.)





April 30, 2012

Physicist Says "Financial Models Are Only Mediocre Metaphors"



ModelsBehavingBadlyBK2012-04-08.jpg


Source of book image: online version of the WSJ review quoted and cited below.








(p. A19) Trained as a physicist, Emanuel Derman once served as the head of quantitative analysis at Goldman Sachs and is currently a professor of industrial engineering and operations research at Columbia University. With "Models Behaving Badly" he offers a readable, even eloquent combination of personal history, philosophical musing and honest confession concerning the dangers of relying on numerical models not only on Wall Street but also in life.

Mr. Derman's particular thesis can be stated simply: Although financial models employ the mathematics and style of physics, they are fundamentally different from the models that science produces. Physical models can provide an accurate description of reality. Financial models, despite their mathematical sophistication, can at best provide a vast oversimplification of reality. In the universe of finance, the behavior of individuals determines value--and, as he says, "people change their minds."

In short, beware of physics envy. When we make models involving human beings, Mr. Derman notes, "we are trying to force the ugly stepsister's foot into Cinderella's pretty glass slipper. It doesn't fit without cutting off some of the essential parts." As the collapse of the subprime collateralized debt market in 2008 made clear, it is a terrible mistake to put too much faith in models purporting to value financial instruments. "In crises," Mr. Derman writes, "the behavior of people changes and normal models fail. While quantum electrodynamics is a genuine theory of all reality, financial models are only mediocre metaphors for a part of it."



For the full review, see:

BURTON G. MALKIEL. "BOOKSHELF; Physics Envy; Creating financial models involving human behavior is like forcing 'the ugly stepsister's foot into Cinderella's pretty glass slipper.'" The Wall Street Journal (Weds., December 14, 2011): A19.


The book under review is:

Derman, Emanuel. Models.Behaving.Badly: Why Confusing Illusion with Reality Can Lead to Disaster, on Wall Street and in Life. New York: Free Press, 2011.





March 23, 2012

Faraday and Einstein Were Visual and Physical Thinkers, Not Mathematicians



Faraday_Chemical_History-of-a-CandleBK2012-03-08.jpg


Source of book image: http://www.rsc.org/images/Faraday_Chemical_History-of-a-Candle_180_tcm18-210390.jpg





(p. C6) Michael Faraday is one of the most beguiling and lovable figures in the history of science. Though he could not understand a single equation, he deduced the essential structure of the laws of electromagnetism through visualization and physical intuition. (James Clerk Maxwell would later give them mathematical form.) Albert Einstein kept a picture of Faraday over his desk, for Einstein also thought of himself primarily as a visual and physical thinker, not an abstract mathematician.


. . .


Faraday's text is still charming and rich, a judgment that few popular works on science could sustain after so many years. Though he addresses himself to an "auditory of juveniles," he calls for his audience to follow a close chain of reasoning presented through a series of experiments and deductions.


. . .


. . . : "In every one of us there is a living process of combustion going on very similar to that of a candle," as Faraday illustrates in his experiments.

In his closing, he turns from our metabolic resemblance to a candle to his deeper wish that "you may, like it, shine as lights to those about you."



For the full review, see:

PETER PESIC. "BOOKSHELF; Keeper of the Flame." The Wall Street Journal (Sat., January 7, 2012): C6.

(Note: ellipses added.)


Book under review:

Faraday, Michael. The Chemical History of a Candle. Oxford, UK: Oxford University Press, 2011.






March 21, 2012

In History, Documenting Your Sources Matters More than Your Credentials



DysonGeorge2012-03-09.jpg


George Dyson. Source of photo: online version of the NYT interview quoted and cited below.





(p. D11) BELLINGHAM, Wash. -- More than most of us, the science historian George Dyson spends his days thinking about technologies, old and very new.


. . .


Though this 58-year-old author's works are centered on technology, they often have an autobiographical subtext. Freeman Dyson, the physicist and mathematician who was a protagonist of Project Orion, is his father. Esther Dyson, the Internet philosopher and high-tech investor, is his sister. We spoke for three hours at his cottage here, and later by telephone. A condensed and edited version of the conversations follows.


. . .


. . . today you make your living as a historian of science and technology. How does a high school dropout get to do that?

Hey, this is America. You can do what you want! I love this idea that someone who didn't finish high school can write books that get taken seriously. History is one of the only fields where contributions by amateurs are taken seriously, providing you follow the rules and document your sources. In history, it's what you write, not what your credentials are.



For the full interview, see:

CLAUDIA DREIFUS, interviewer. "Looking Backward to Put New Technologies in Focus." The New York Times (Tues., December 6, 2011): D11.

(Note: question bolded in original; ellipses added.)

(Note: the online version of the interview is dated December 5, 2011.)


Dyson's most recent book is:

Dyson, George. Turing's Cathedral: The Origins of the Digital Universe. New York: Pantheon Books, 2012.






March 18, 2012

Simple Heuristics Can Work Better than Complex Formulas



(p. C4) Most business people and physicians privately admit that many of their decisions are based on intuition rather than on detailed cost-benefit analysis. In public, of course, it's different. To stand up in court and say you made a decision based on what your thumb or gut told you is to invite damages. So both business people and doctors go to some lengths to suppress or disguise the role that intuition plays in their work.

Prof. Gerd Gigerenzer, the director of the Max Planck Institute for Human Development in Berlin, thinks that instead they should boast about using heuristics. In articles and books over the past five years, Dr. Gigerenzer has developed the startling claim that intuition makes our decisions not just quicker but better.


. . .


The economist Harry Markowitz won the Nobel prize for designing a complex mathematical formula for picking fund managers. Yet when he retired, he himself, like most people, used a simpler heuristic that generally works better: He divided his retirement funds equally among a number of fund managers.

A few years ago, a Michigan hospital saw that doctors, concerned with liability, were sending too many patients with chest pains straight to the coronary-care unit, where they both cost the hospital more and ran higher risks of infection if they were not suffering a heart attack. The hospital introduced a complex logistical model to sift patients more efficiently, but the doctors hated it and went back to defensive decision-making.

As an alternative, Dr. Gigerenzer and his colleagues came up with a "fast-and-frugal" tree that asked the doctors just three sequential yes-no questions about each patient's electrocardiographs and other data. Compared with both the complex logistical model and the defensive status quo, this heuristic helped the doctors to send more patients to the coronary-care unit who belonged there and fewer who did not.



For the full commentary, see:

MATT RIDLEY. "MIND & MATTER; All Hail the Hunch--and Damn the Details." The Wall Street Journal (Sat., December 24, 2011): C4.

(Note: ellipsis added.)
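
The Markowitz anecdote above describes about the simplest heuristic there is: split the money equally and ignore the covariance mathematics entirely. A minimal sketch of the 1/N rule (the fund names and amounts are made up):

```python
def one_over_n(total, funds):
    """The 1/N retirement heuristic: divide the money equally
    across the available funds, skipping the optimization math."""
    share = total / len(funds)
    return {fund: share for fund in funds}

print(one_over_n(90000, ["stocks", "bonds", "international"]))
# {'stocks': 30000.0, 'bonds': 30000.0, 'international': 30000.0}
```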


A couple of Gigerenzer's relevant books are:

Gigerenzer, Gerd. Gut Feelings: The Intelligence of the Unconscious. New York: Penguin Books, 2007.

Gigerenzer, Gerd. Rationality for Mortals: How People Cope with Uncertainty. New York: Oxford University Press, USA, 2008.
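
The "fast-and-frugal" tree in the Michigan hospital example above can be sketched as three ordered yes/no questions, each of which can settle the decision by itself. The question wording and cue names below are an illustrative paraphrase, not the exact clinical protocol:

```python
def triage(st_segment_change, chest_pain_is_chief_complaint, any_other_risk_factor):
    """Fast-and-frugal tree: ask the cues in a fixed order and stop
    at the first one that decides. Each branch exits immediately."""
    if st_segment_change:                  # Q1: a "yes" is decisive
        return "coronary care unit"
    if not chest_pain_is_chief_complaint:  # Q2: a "no" is decisive
        return "regular nursing bed"
    if any_other_risk_factor:              # Q3: the final cue decides
        return "coronary care unit"
    return "regular nursing bed"

print(triage(True, False, False))   # coronary care unit
print(triage(False, True, False))   # regular nursing bed
```

Unlike the complex logistical model the doctors rejected, a tree like this uses at most three pieces of information and can be applied from memory.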





March 2, 2012

Amateurs Can Advance Science



(p. C4) The more specialized and sophisticated scientific research becomes, the farther it recedes from everyday experience. The clergymen-amateurs who made 19th-century scientific breakthroughs are a distant memory. Or are they? Paradoxically, in an increasing variety of fields, computers are coming to the rescue of the amateur, through crowd-sourced science.

Last month, computer gamers working from home redesigned an enzyme. Last year, a gene-testing company used its customers to find mutations that increase or decrease the risk of Parkinson's disease. Astronomers are drawing amateurs into searching for galaxies and signs of extraterrestrial intelligence. The modern equivalent of the Victorian scientific vicar is an ordinary person who volunteers his or her time to solving a small piece of a big scientific puzzle.

Crowd-sourced science is not a recent invention. In the U.S., tens of thousands of people record the number and species of birds that they see during the Christmas season, a practice that dates back more than a century. What's new is having amateurs contribute in highly technical areas.



For the full commentary, see:

MATT RIDLEY. "MIND & MATTER; Following the Crowd to Citizen Science." The Wall Street Journal (Sat., February 11, 2012): C4.





February 27, 2012

Big Data Opportunity for Economics and Business



(p. 7) Data is not only becoming more available but also more understandable to computers. Most of the Big Data surge is data in the wild -- unruly stuff like words, images and video on the Web and those streams of sensor data. It is called unstructured data and is not typically grist for traditional databases.

But the computer tools for gleaning knowledge and insights from the Internet era's vast trove of unstructured data are fast gaining ground. At the forefront are the rapidly advancing techniques of artificial intelligence like natural-language processing, pattern recognition and machine learning.

Those artificial-intelligence technologies can be applied in many fields. For example, Google's search and ad business and its experimental robot cars, which have navigated thousands of miles of California roads, both use a bundle of artificial-intelligence tricks. Both are daunting Big Data challenges, parsing vast quantities of data and making decisions instantaneously.


. . .


To grasp the potential impact of Big Data, look to the microscope, says Erik Brynjolfsson, an economist at Massachusetts Institute of Technology's Sloan School of Management. The microscope, invented four centuries ago, allowed people to see and measure things as never before -- at the cellular level. It was a revolution in measurement.

Data measurement, Professor Brynjolfsson explains, is the modern equivalent of the microscope. Google searches, Facebook posts and Twitter messages, for example, make it possible to measure behavior and sentiment in fine detail and as it happens.

In business, economics and other fields, Professor Brynjolfsson says, decisions will increasingly be based on data and analysis rather than on experience and intuition. "We can start being a lot more scientific," he observes.


. . .


Research by Professor Brynjolfsson and two other colleagues, published last year, suggests that data-guided management is spreading across corporate America and starting to pay off. They studied 179 large companies and found that those adopting "data-driven decision making" achieved productivity gains that were 5 percent to 6 percent higher than other factors could explain.

The predictive power of Big Data is being explored -- and shows promise -- in fields like public health, economic development and economic forecasting. Researchers have found a spike in Google search requests for terms like "flu symptoms" and "flu treatments" a couple of weeks before there is an increase in flu patients coming to hospital emergency rooms in a region (and emergency room reports usually lag behind visits by two weeks or so).


. . .


In economic forecasting, research has shown that trends in increasing or decreasing volumes of housing-related search queries in Google are a more accurate predictor of house sales in the next quarter than the forecasts of real estate economists. The Federal Reserve, among others, has taken notice. In July, the National Bureau of Economic Research is holding a workshop on "Opportunities in Big Data" and its implications for the economics profession.



For the full story, see:


STEVE LOHR. "NEWS ANALYSIS; The Age of Big Data." The New York Times, SundayReview (Sun., February 12, 2012): 1 & 7.

(Note: ellipses added.)

(Note: the online version of the article is dated February 11, 2012.)
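
The flu-search finding above is, at bottom, a lead-lag correlation: search volume this week predicts emergency-room visits about two weeks later. The toy sketch below shows the form of the check; the weekly numbers are invented for illustration.

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def lead_lag(searches, visits, lag):
    """Correlate search volume with ER visits `lag` weeks later,
    to test whether searches lead visits."""
    return pearson(searches[:-lag], visits[lag:])

# Toy weekly series in which visits simply echo searches two weeks later:
searches = [1, 2, 4, 7, 11, 9, 6, 3, 2, 1]
visits = [1, 1] + searches[:-2]
print(round(lead_lag(searches, visits, 2), 3))  # 1.0
```

With real data the lagged correlation would of course be far from perfect; the point is that a lead of a fixed number of weeks is a testable, mechanical relationship.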





December 28, 2011

Collins Says Successful CEOs Are Empirical and Disciplined



GreatByChoiceBK.jpg


Source of book image: online version of the WSJ review quoted and cited below.







(p. A15) 'Great by Choice" is a sequel to Jim Collins's best-selling "Good to Great" (2001), which identified seven characteristics that enabled companies to become truly great over an extended period of time. Never mind that one of the 11 featured companies is now bankrupt (Circuit City) and another is in government receivership (Fannie Mae). Mr. Collins has a knack for analysis that business readers find compelling.

Mr. Collins's new book tackles the question of how to steer a company to lasting success in an environment characterized by change, uncertainty and even chaos. Like his previous work, this book builds its conclusions on a framework of painstaking research, conducted over nine years and overseen by Mr. Collins and his co-author, Morten T. Hansen, a management professor at the University of California, Berkeley.


. . .


Messrs. Collins and Hansen draw some interesting and counterintuitive conclusions from their research. First, the successful leaders were not the most "visionary" or the biggest risk-takers; instead, they tended to be more empirical and disciplined, relying on evidence over gut instinct and preferring consistent gains to blow-out winners. The successful companies were not more innovative than the control companies; indeed, they were in some cases less innovative. Rather, they managed to "scale innovation"--introducing changes gradually, then moving quickly to capitalize on those that showed promise. The successful companies weren't necessarily the most likely to adopt internal changes as a response to a changing environment. "The 10X companies changed less in reaction to their changing world than the comparison cases," the authors conclude.


. . .


If "Great by Choice" shares the qualities that made "Good to Great" so popular, it also shares some that drew criticism. The authors' conclusions sometimes feel like the claims of a well-written horoscope--so broadly stated that they are hard to disprove. Their 10X leaders are both "disciplined" and "creative," "prudent" and "bold"; they go fast when they must but slow when they can; they are consistent but open to change. This encompassing approach allows the authors to fit pretty much any leader who achieves 10X performance into their analysis. Would it ever be possible, one wonders, to find a leader whose success contradicted their thesis?



For the full review, see:

ALAN MURRAY. "BOOKSHELF; Turbulent Times, Steady Success; How certain companies achieved shareholder returns at least 10 times greater than their industry." The Wall Street Journal (Tues., October 11, 2011): A15.

(Note: ellipses added.)






October 25, 2011

The Huge Value of Exposing Ourselves to Unexpected Evidence




Bill Bryson tells how much we learned from the remains of a man from the Neolithic age, who has been called Ötzi:


(p. 377) His equipment employed eighteen different types of wood - a remarkable variety. The most surprising of all his tools was the axe. It was copper-bladed and of a type known as a Remedello axe, after a site in Italy where they were first found. But Ötzi's axe was hundreds of years older than the oldest Remedello axe. 'It was,' in the words of one observer, 'as if the tomb of a medieval warrior had yielded a modern rifle.' The axe changed the timeframe for the copper age in Europe by no less than a thousand years.

But the real revelation and excitement were the clothes. Before Ötzi we had no idea - or, to be more precise, nothing but ideas - of how stone age people dressed. Such materials as survived existed only as fragments. Here was a complete outfit and it was full of surprises. His clothes were made from the skins and furs of an impressive range of animals - red deer, bear, chamois, goat and cattle. He also had with him a woven grass rectangle that was three feet long. This might have been a kind of rain cape, but it might equally have been a sleeping mat. Again, nothing like it had ever been seen or imagined.

Ötzi wore fur leggings held up with leather strips attached to a waist strap that made them look uncannily - almost comically - like the kind of nylon stockings and garter sets that Hollywood pin-ups wore in the Second World War. Nobody had remotely foreseen such a get-up. He wore a loincloth of goatskin and a hat made from the fur of a brown bear - probably a kind of hunting trophy. It would have been very warm and covetably stylish. The rest of his outfit was mostly made from the skin and fur of red deer. Hardly any came from domesticated animals, the opposite of what was expected.



Source:

Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.





October 11, 2011

Confirmation Bias (aka "Pigheadedness") in Science



(p. 12) In a classic psychology experiment, people for and against the death penalty were asked to evaluate the different research designs of two studies of its deterrent effect on crime. One study showed that the death penalty was an effective deterrent; the other showed that it was not. Which of the two research designs the participants deemed the most scientifically valid depended mostly on whether the study supported their views on the death penalty.

In the laboratory, this is labeled confirmation bias; observed in the real world, it's known as pigheadedness.

Scientists are not immune. In another experiment, psychologists were asked to review a paper submitted for journal publication in their field. They rated the paper's methodology, data presentation and scientific contribution significantly more favorably when the paper happened to offer results consistent with their own theoretical stance. Identical research methods prompted a very different response in those whose scientific opinion was challenged.



For the full commentary, see:

CORDELIA FINE. "GRAY MATTER; Biased but Brilliant." The New York Times, SundayReview Section (Sun., July 31, 2011): 12.

(Note: the online version of the article is dated July 30, 2011.)






September 28, 2011

We Tend to Ignore Information that Contradicts Our Beliefs



BelievingBrainBK2011-08-09.jpg


Source of book image: online version of the WSJ review quoted and cited below.






We learn the most when our priors are contradicted. But the dissonance between evidence and beliefs is painful. So we often do not see, or soon forget, evidence that does not fit with our beliefs.

The innovative entrepreneur is often a person who sees, and forces herself to remember, the dissonant fact, storing it away to make sense of, or make use of, later. At the start, she may be alone in what she sees and what she remembers. So if we are to benefit from her ability and willingness to bear the pain of dissonance, she must have the freedom to differ, and she must have the financial wherewithal to support herself until her vision is more widely shared, better understood, and more fruitfully applied.


(p. A13) Beliefs come first; reasons second. That's the insightful message of "The Believing Brain," by Michael Shermer, the founder of Skeptic magazine. In the book, he brilliantly lays out what modern cognitive research has to tell us about his subject--namely, that our brains are "belief engines" that naturally "look for and find patterns" and then infuse them with meaning. These meaningful patterns form beliefs that shape our understanding of reality. Our brains tend to seek out information that confirms our beliefs, ignoring information that contradicts them. Mr. Shermer calls this "belief-dependent reality." The well-worn phrase "seeing is believing" has it backward: Our believing dictates what we're seeing.


. . .


One of the book's most enjoyable discussions concerns the politics of belief. Mr. Shermer takes an entertaining look at academic research claiming to prove that conservative beliefs largely result from psychopathologies. He drolly cites survey results showing that 80% of professors in the humanities and social sciences describe themselves as liberals. Could these findings about psychopathological conservative political beliefs possibly be the result of the researchers' confirmation bias?

As for his own political bias, Mr. Shermer says that he's "a fiscally conservative civil libertarian." He is a fan of old-style liberalism, as in liberality of outlook, and cites "The Science of Liberty" author Timothy Ferris's splendid formulation: "Liberalism and science are methods, not ideologies." The "scientific solution to the political problem of oppressive governments," Mr. Shermer says, "is the tried-and-true method of spreading liberal democracy and market capitalism through the open exchange of information, products, and services across porous economic borders."

But it is science itself that Mr. Shermer most heartily embraces. "The Believing Brain" ends with an engaging history of astronomy that illustrates how the scientific method developed as the only reliable way for us to discover true patterns and true agents at work. Seeing through a telescope, it seems, is believing of the best kind.



For the full review, see:

RONALD BAILEY. "A Trick Of the Mind; Looking for patterns in life and then infusing them with meaning, from alien intervention to federal conspiracy." The Wall Street Journal (Weds., July 27, 2011): A13.

(Note: ellipsis added.)


Book reviewed:

Shermer, Michael. The Believing Brain: From Ghosts and Gods to Politics and Conspiracies---How We Construct Beliefs and Reinforce Them as Truths. New York: Times Books, 2011.





August 17, 2011

A Case for Epistemic and Technological Optimism



BeginningOfInfinityBK2011-08-07.jpg

Source of book image: http://us.penguingroup.com/static/covers/all/5/5/9780670022755H.jpg



Horgan is well-known for writing a pessimistic book about the future of science. For him to write such a positive review of a book that reaches the opposite conclusion is impressive (both about him and about the book he is reviewing).

From Horgan's review and the reviews on Amazon as of 8/7/11, I view the Deutsch book as potentially important and profound. (I will write more when I have read it.)




(p. 17) . . . Mr. Deutsch knocks my 1996 book, "The End of Science," for proposing that the glory days of science--especially pure science, the effort to map out and understand reality--may be over. Mr. Deutsch equates my thesis with "dogmatism, stagnation and tyranny," all of which, for the record, I oppose. But he makes the case for infinite progress with such passion, imagination and quirky brilliance that I couldn't help enjoying his argument. More often than not I found myself agreeing with him--or at least hoping that he is right.


. . .


If we acknowledge our imperfections, Mr. Deutsch observes, then, paradoxically, there is no problem that we cannot tackle. Death, for instance. Or the apparent incompatibility between the two pillars of modern physics, quantum theory and general relativity. Or global warming, which Mr. Deutsch believes we can overcome through innovation rather than drastic cutbacks in consumption. He gores the sacred cow of "sustainability": Societies are healthiest, he declares, not when they achieve equilibrium but when they are rapidly evolving.



For the full review, see:

JOHN HORGAN. "BOOKSHELF; To Err Is Progress; How to foster the growth of scientific knowledge: accept that it is limited no matter how definitive it may seem." The Wall Street Journal (Weds., JULY 20, 2011): A17.

(Note: ellipses added.)


Source information on book under review:

Deutsch, David. The Beginning of Infinity: Explanations That Transform the World. New York: Viking Adult, 2011.






June 9, 2011

"Progress Depended on the Empirical Habit of Thought"



In the passage below from 1984, Orwell presents an underground rebel's account of why the authoritarian socialist dystopia cannot advance in science and technology.


(p. 155) The world of today is a bare, hungry, dilapidated place compared with the world that existed before 1914, and still more so if compared with the imaginary future to which the people of that period looked forward. In the early twentieth century, the vision of a future society unbelievably rich, leisured, orderly, and efficient--a glittering (p. 156) antiseptic world of glass and steel and snow-white concrete--was part of the consciousness of nearly every literate person. Science and technology were developing at a prodigious speed, and it seemed natural to assume that they would go on developing. This failed to happen, partly because of the impoverishment caused by a long series of wars and revolutions, partly because scientific and technical progress depended on the empirical habit of thought, which could not survive in a strictly regimented society.



Source:

Orwell, George. Nineteen Eighty-Four. New York: The New American Library, 1961 [1949].

By Canadian law, 1984 is no longer under copyright. The text has been posted on the following Canadian web site: http://wikilivres.info/wiki/Nineteen_Eighty-Four





March 5, 2011

Caballero Worries about the Relevance of Mainstream Macro Modeling



In the past, I have found some of MIT economist Ricardo Caballero's research useful because he takes Schumpeter's process of creative destruction seriously.

In a recent paper, he joins a growing number of mainstream economists who worry that the recent and continuing economic crisis has implications for the methodology of economics:


In this paper I argue that the current core of macroeconomics--by which I mainly mean the so-called dynamic stochastic general equilibrium approach--has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one. This is dangerous for both methodological and policy reasons. On the methodology front, macroeconomic research has been in "fine-tuning" mode within the local-maximum of the dynamic stochastic general equilibrium world, when we should be in "broad-exploration" mode. We are too far from absolute truth to be so specialized and to make the kind of confident quantitative claims that often emerge from the core. On the policy front, this confused precision creates the illusion that a minor adjustment in the standard policy framework will prevent future crises, and by doing so it leaves us overly exposed to the new and unexpected.


Source:

Caballero, Ricardo J. "Macroeconomics after the Crisis: Time to Deal with the Pretense-of-Knowledge Syndrome." NBER Working Paper # w16429, October 2010.


The paper has been published as:

Caballero, Ricardo J. "Macroeconomics after the Crisis: Time to Deal with the Pretense-of-Knowledge Syndrome." Journal of Economic Perspectives 24, no. 4 (Fall 2010): 85-102.





February 3, 2011

"Inventors Are Sometimes Beneficiaries of Their Own Ignorance"




William Rosen gives us a thought-provoking anecdote about Edmund Cartwright, the inventor of the first power loom:


(p. 238) He was also, apparently, convinced of the practicality of such a machine by the success of the "Mechanical Turk," a supposed chess-playing robot that had mystified all of Europe and which had not yet been revealed as one of the era's great hoaxes: a hollow figurine concealing a human operator. Inventors are sometimes beneficiaries of their own ignorance.


Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.





January 1, 2011

Peer Review Versus Open Review (As Inspired by Wikipedia)



CohenDan2010-12-21.jpg "Dan Cohen, director of the Center for History and New Media at George Mason University, is among the academics who advocate a more open, Web-based approach to reviewing scholarly works." Source of caption and photo: online version of the NYT article quoted and cited below.


(p. A1) For professors, publishing in elite journals is an unavoidable part of university life. The grueling process of subjecting work to the up-or-down judgment of credentialed scholarly peers has been a cornerstone of academic culture since at least the mid-20th century.


. . .


"What we're experiencing now is the most important transformation in our reading and writing tools since the invention of movable type," said Katherine Rowe, a Renaissance specialist and media historian at Bryn Mawr College. "The way scholarly exchange is moving is radical, and we need to think about what it means for our fields."


. . .


(p. A3) Today a small vanguard of digitally adept scholars is rethinking how knowledge is understood and judged by inviting online readers to comment on books in progress, compiling journals from blog posts and sometimes successfully petitioning their universities to grant promotions and tenure on the basis of non-peer-reviewed projects.


. . .


Each type of review has benefits and drawbacks.

The traditional method, in which independent experts evaluate a submission, often under a veil of anonymity, can take months, even years.

Clubby exclusiveness, sloppy editing and fraud have all marred peer review on occasion. Anonymity can help prevent personal bias, but it can also make reviewers less accountable; exclusiveness can help ensure quality control but can also narrow the range of feedback and participants. Open review more closely resembles Wikipedia behind the scenes, where anyone with an interest can post a comment. This open-door policy has made Wikipedia, on balance, a crucial reference resource.

Ms. Rowe said the goal is not necessarily to replace peer review but to use other, more open methods as well.

In some respects scientists and economists who have created online repositories for unpublished working papers, like repec.org, have more quickly adapted to digital life. Just this month, mathematicians used blogs and wikis to evaluate a supposed mathematical proof in the space of a week -- the scholarly equivalent of warp speed.



For the full story, see:

PATRICIA COHEN. "Scholars Test Web Alternative to the Venerable Peer Review." The New York Times (Tues., August 24, 2010): A1 & A3.

(Note: ellipses added.)

(Note: the online version of the review has the date August 23, 2010, and had the slightly shorter title "Scholars Test Web Alternative to Peer Review.")






December 20, 2010

Government "Gave People the Crazy Juice"



BoettkePete2010-12-19.jpg "Peter J. Boettke of George Mason University is the emerging standard-bearer for a revived Austrian school of economics." Source of caption and photo: online version of the WSJ article quoted and cited below.


(p. B1) Peter J. Boettke, shuffling around in a maroon velour track suit or faux-leather rubber shoes he calls "dress Crocs," hardly seems like the type to lead a revolution.

But the 50-year-old professor of economics at George Mason University in Virginia is emerging as the intellectual standard-bearer for the Austrian school of economics that opposes government intervention in markets and decries federal spending to prop up demand during times of crisis. Mr. Boettke, whose latest research explores people's ability to self-regulate, also is minting a new generation of disciples who are spreading the Austrian approach throughout academia, where it had long been left for dead.

To these free-market economists, government intrusion ultimately sows the seeds of the next crisis. It hampers what one famous Austrian, Joseph Schumpeter, called the process of "creative destruction."


. . .


(p. B3) It wasn't a lack of government oversight that led to the crisis, as some economists argue, but too much of it, Mr. Boettke says. Specifically, low interest rates and policies that subsidized homeownership "gave people the crazy juice," he says.




For the full story, see:

KELLY EVANS. "Spreading Hayek, Spurning Keynes; Professor Leads an Austrian Revival." The Wall Street Journal (Sat., AUGUST 28, 2010): B1 & B3.

(Note: ellipsis added.)





December 18, 2010

Google Releases Intriguing New Bibliometric Tool



GoogleBookWordsGraphs2010-12-17.jpg


Source of graphs: online version of the NYT article quoted and cited below.

(p. A1) With little fanfare, Google has made a mammoth database culled from nearly 5.2 million digitized books available to the public for free downloads and online searches, opening a new landscape of possibilities for research and education in the humanities.

The digital storehouse, which comprises words and short phrases as well as a year-by-year count of how often they appear, represents the first time a data set of this magnitude and searching tools are at the disposal of Ph.D.'s, middle school students and anyone else who likes to spend time in front of a small screen. It consists of the 500 billion words contained in books published between 1500 and 2008 in English, French, Spanish, German, Chinese and Russian.


. . .


"The goal is to give an 8-year-old the ability to browse cultural trends throughout history, as recorded in books," said Erez Lieberman Aiden, a junior fellow at the Society of Fellows at Harvard. Mr. Lieberman Aiden and Jean-Baptiste Michel, a postdoctoral fellow at Harvard, assembled the data set with Google and spearheaded a research project to demonstrate how vast digital databases can transform our understanding of language, culture and the flow of ideas.

Their study, to be published in (p. A3) the journal Science on Friday, offers a tantalizing taste of the rich buffet of research opportunities now open to literature, history and other liberal arts professors who may have previously avoided quantitative analysis. Science is taking the unusual step of making the paper available online to nonsubscribers.

"We wanted to show what becomes possible when you apply very high-turbo data analysis to questions in the humanities," said Mr. Lieberman Aiden, whose expertise is in applied mathematics and genomics. He called the method "culturomics."


. . .


Looking at inventions, they found technological advances took, on average, 66 years to be adopted by the larger culture in the early 1800s and only 27 years between 1880 and 1920.



For the full story, see:

PATRICIA COHEN. "In 500 Billion Words, New Window on Culture." The New York Times (Fri., December 17, 2010): A1 & A3.

(Note: ellipses added.)

(Note: the online version of the article is dated December 16, 2010.)
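The data set described above is, at bottom, a table of (phrase, year, count) records plus a total word count per year, which makes trend queries straightforward. A minimal sketch, using made-up counts rather than the actual Google Books data:

```python
# Sketch of querying an n-gram table like the one described: a year-by-year
# count of a phrase, normalized by the total words published that year.
# All numbers below are invented for illustration, not the Google corpus.

ngram_counts = {("culturomics", 2005): 0, ("culturomics", 2010): 120}
total_words = {2005: 50_000_000, 2010: 80_000_000}

def frequency(phrase, year):
    """Relative frequency of a phrase among all words printed in a year."""
    return ngram_counts.get((phrase, year), 0) / total_words[year]

print(frequency("culturomics", 2010))  # 120 / 80,000,000 = 1.5e-06
```

Plotting such frequencies year by year is essentially what the researchers' public tool does.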





December 15, 2010

"We Need to Know What Works"



(p. A12) WASHINGTON--World Bank President Robert Zoellick challenged economists to take on tougher challenges in development economics and to consult a wider range of professionals in developing countries, opening a debate about how effectively economists have attacked problems in global poverty.

"Too often research economists seem not to start with the key knowledge gaps facing development practitioners, but rather search for questions they can answer with the industry's currently favorite tools," Mr. Zoellick said at Georgetown University.


. . .


"We need to know what works: we need a research agenda that focuses on results," he said.

Nobel Prize-winning economist Michael Spence, who led a commission on economic growth, said Mr. Zoellick's comments are "generally not only in the right direction, but very useful." Harvard economist Dani Rodrik, who favors a stronger government hand in development, also praised the World Bank president. "The speech hits all the right notes: the need for economists to demonstrate humility, eschew blueprints...and focus on evaluation but not at the expense of the big questions," Mr. Rodrik said.



For the full story, see:

BOB DAVIS. "World Bank Chief Ignites a Debate." The Wall Street Journal (Thurs., SEPTEMBER 30, 2010): A12.

(Note: first two ellipses added; ellipsis in last quoted paragraph is in original.)





December 9, 2010

Science Can Contribute "Diligent Experimental Habits" to Technology



(p. 101) Nothing is more common in the history of science than independent discovery of the same phenomenon, unless it is a fight over priority. To this day, historians debate how much prior awareness of the theory of latent heat was in Watt's possession, but they miss Black's real contribution, which anyone can see by examining the columns of neat script that attest to Watt's careful recording of experimental results. Watt didn't discover the existence of latent heat from Black, at least not directly; but he rediscovered it entirely through exposure to the diligent experimental habits of professors such as Black, John Robison, and Robert Dick.


Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.





December 5, 2010

A Key to Scientific Truth: Nullius in Verba ("On No One's Word")



(p. 68) . . . scientific understanding didn't progress by looking for truth; it did so by looking for mistakes.

This was new. In the cartoon version of the Scientific Revolution, science made its great advances in opposition to a heavy-handed Roman Catholic Church; but an even larger obstacle to progress in the understanding and manipulation of nature was the belief that Aristotle had already figured out all of physics and had observed all that biology had to offer, or that Galen was the last word in medicine. By this standard, the real revolutionary manifesto of the day was written not by Descartes, or Galileo, but by the seventeenth-century Italian poet and physician Francesco Redi, in his Experiments on the Generation of Insects, who wrote (as one of a hundred examples), "Aristotle asserts that cabbages produce caterpillars daily, but I have not been able to witness this remarkable reproduction, though I have seen many eggs laid by butterflies on the cabbage-stalks. . . ." Not for nothing was the motto of the Royal Society nullius in verba: "on no one's word."



Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: first ellipsis added; italics and second ellipsis, in original.)





December 1, 2010

"The Steam Engine Has Done Much More for Science than Science Has Done for the Steam Engine"



(p. 67) The great scientist and engineer William Thomson, Lord Kelvin, made his reputation on discoveries in basic physics, electricity, and thermodynamics, but he may be remembered just as well for his talent for aphorism. Among the best known of Kelvin's quotations is the assertion that "all science is either physics or stamp collecting" (while one probably best forgotten is the confident "heavier-than-air flying machines are impossible"). But the most relevant for a history of the Industrial Revolution is this: "the steam engine has done much more for science than science has done for the steam engine."


Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.





November 25, 2010

Neurosurgeons Treating Dogs Is Mutually Beneficial to Dogs and Humans



(p. D3) An operation commonly performed to remove brain tumors from the pituitary glands of humans is now available to dogs, thanks to a collaboration between a neurosurgeon and some veterinarians in Los Angeles. And that is turning out to be good for humans.

So far, nine dogs and one cat that otherwise would have died have been treated successfully.


. . .


What Dr. Mamelak has gained from teaching the procedure to veterinarians is access to tissue samples from the treated dogs. That's significant because Cushing's afflicts only one in a million humans, making it a difficult disease to study. By contrast, it afflicts about 100,000 dogs a year in the United States. The canine tissue samples are enabling him and his colleagues to develop drugs to one day treat Cushing's disease in both humans and dogs.

"We have a full loop," he said. "We're using a human procedure in animals, and using their tissue to study the disease."



For the full story, see:

SINDYA N. BHANOO. "Observatory; They Fetch, They Roll Over, They Aid Tumor Research." The New York Times, Science Times Section (Tues., October 26, 2010): D3.

(Note: ellipsis added.)

(Note: the online version of the article is dated October 22 (sic), 2010.)





November 19, 2010

Invention Aided By the Intelligent Hand and Spatial Intelligence



(p. 36) For centuries, certainly ever since Immanuel Kant called the hand "the window on the mind," philosophers have been pondering the very complex way in which the human hand is related to the human mind. Modern neuroscience and evolutionary biology have confirmed the existence of what the Scottish physician and theologian Charles Bell called the intelligent hand. Steven Pinker of Harvard even argues that early humans' intelligence increased partly because they were equipped with levers of influence on the world, namely the grippers found at the end of their two arms. We now know that the literally incredible amount of sensitivity and articulation of the human hand, which has increased at roughly the same pace as has the complexity of the human brain, is not merely a product of the pressures of natural selection, but an initiator of it: The hand has led the brain to evolve just as much as the brain has led the hand. The hands of a pianist, or a painter, or a sushi chef, or even, as with Thomas New-(p. 37)comen, hands that could use a hammer to shape soft iron, are truly, in any functional sense, "intelligent."

This sort of tactile intelligence was not emphasized in A. P. Usher's theory of invention, the components of which he filtered through the early twentieth-century school of psychology known as Gestalt theory, which was preeminently a theory of visual behavior. The most important precepts of Gestalt theory (to Usher, anyway, who was utterly taken with their explanatory power) are that the patterns we perceive visually appear all at once, rather than by examining components one at a time, and that a principle of parsimony organizes visual perceptions into their simplest form. Or forms; one of the most famous Gestalt images is the one that can look like either a goblet or two facing profiles. Usher's enthusiasm for Gestalt psychology explains why, despite his unshakable belief in the inventive talents of ordinary individuals, he devotes an entire chapter of his magnum opus to perhaps the most extraordinary individual in the history of invention: Leonardo da Vinci.

Certainly, Leonardo would deserve a large place in any book on the history of mechanical invention, not only because of his fanciful helicopters and submarines, but for his very real screw cutting engine, needle making machine, centrifugal pumps, and hundreds more. And Usher found Leonardo an extraordinarily useful symbol in marking the transition in mechanics from pure intuition to the application of science and mathematics.

But the real fascination for Usher was Leonardo's straddling of two worlds of creativity, the artistic and the inventive. No one, before or since, more clearly demonstrated the importance to invention of what we might call "spatial intelligence"; Leonardo was not an abstract thinker of any great achievement, nor were his mathematical skills, which he taught himself late in life, remarkable. (p. 38) His perceptual skills, on the other hand, developed primarily for his painting, were extraordinary, so extraordinary that Usher could write, "It is only with Leonardo that the process of invention is lifted decisively into the field of the imagination. . . . "



Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.





November 12, 2010

Guidelines for Innovative Thinking?



innovation-cartoon.jpg Source of cartoon: http://filipspagnoli.files.wordpress.com/2009/11/innovation-cartoon.jpg?w=361&h=364


The NYT ran the above cartoon by New Yorker cartoonist Leo Cullum as part of Cullum's obituary.


(p. A22) Leo Cullum, a cartoonist whose blustering businessmen, clueless doctors, venal lawyers and all-too-human dogs and cats amused readers of The New Yorker for the past 33 years, died on Saturday in Los Angeles. He was 68 and lived in Malibu, Calif.

Mr. Cullum, a TWA pilot for more than 30 years, was a classic gag cartoonist whose visual absurdities were underlined, in most cases, by a caption reeled in from deep left field. "I love the convenience, but the roaming charges are killing me," a buffalo says, holding a cellphone up to its ear. "Your red and white blood cells are normal," a doctor tells his patient. "I'm worried about your rosé cells."


. . .


His most popular cartoon, from 1998, showed a man addressing the family cat, which is sitting next to the litterbox. "Never, ever, think outside the box," he says.




For the full obituary, see:

WILLIAM GRIMES. "Leo Cullum, New Yorker Cartoonist, Dies at 68." The New York Times (Tues., October 26, 2010): A22.

(Note: the online version of the obituary is dated October 25, 2010.)

(Note: ellipsis added.)





November 11, 2010

Torricelli Experiment Disproved Aristotelian Theory that a Vacuum Was Impossible



(p. 8) Florence, in the year 1641, had been essentially the private fief of the Medici family for two centuries. The city, ground zero for both the Renaissance and the Scientific Revolution, was also where Galileo Galilei had chosen to live out the sentence imposed by the Inquisition for his heretical writings that argued that the earth revolved around the sun. Galileo was seventy years old and living in a villa in Arcetri, in the hills above the city, (p. 9) when he read a book on the physics of movement titled De motu (sometimes Trattato del Moto) and summoned its author, Evangelista Torricelli, a mathematician then living in Rome. Torricelli, whose admiration for Galileo was practically without limit, decamped in time not only to spend the last three months of the great man's life at his side, but to succeed him as professor of mathematics at the Florentine Academy.


. . .


(p. 9) . . . , Torricelli used a tool even more powerful than his well-cultivated talent for mathematical logic: He did experiments. At the behest of one of his patrons, the Grand Duke of Tuscany, whose engineers were unable to build a sufficiently powerful pump, Torricelli designed a series of apparatuses to test the limits of the action of contemporary water pumps. In spring of 1644, Torricelli filled a narrow, four-foot-long glass tube with mercury--a far heavier fluid than water--inverted it in a basin of mercury, sealing the tube's top, and documented that while the mercury did not pour out, it did leave a space at the closed top of the tube. He reasoned that since nothing could have slipped past the mercury in the tube, what occupied the top of the tube must, therefore, be nothing: a vacuum.


. . .


(p. 10) Torricelli was not, even by the standards of his day, a terribly ambitious inventor. When faced with hostility from religious authorities and other traditionalists who believed, correctly, that his discovery was a direct shot at the Aristotelian world, he happily returned to his beloved cycloids, the latest traveler to find himself on the wrong side of the boundary line between science and technology.

But by then it no longer mattered if Torricelli was willing to leave the messiness of physics for the perfection of mathematics: vacuum would keep mercury in the bottle, but the genie was already out. Nature might have found vacuum repugnant for two thousand years, but Europe was about to embrace it.



Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics in original; ellipses added.)
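The empty space at the top of the tube is easy to check with a back-of-the-envelope calculation: mercury rises only to the height at which the weight of the column balances atmospheric pressure, h = P / (rho g). A minimal sketch, using modern standard values that Torricelli of course did not have:

```python
# Why Torricelli's four-foot tube left an empty space at the top: mercury
# rises only until its weight balances atmospheric pressure, h = P / (rho * g).

P = 101_325        # standard atmospheric pressure, Pa
rho = 13_595       # density of mercury, kg/m^3 (near 0 deg C)
g = 9.80665        # standard gravity, m/s^2

h = P / (rho * g)  # equilibrium column height, metres
print(round(h, 3))  # ~0.76 m, about 30 inches -- well short of a four-foot tube
```

The same calculation with water (rho about 1,000 kg/m^3) gives roughly 10 metres, which is why the Grand Duke's engineers could not build a suction pump to lift water any higher.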





November 7, 2010

How Scientific Progress Was Slowed By Too Much Respect for Aristotelian Theory



William Rosen has a wonderful early example of how too much respect for theory can keep us from making the observations that would eventually prove the theory to be wrong:


(p. 7) Aristotle argued against the existence of a vacuum with unerring, though curiously inelegant, logic. His primary argument ran something like this:

1. If empty space can be measured, then it must have dimension.
2. If it has dimension, then it must be a body (this is something of a tautology: by Aristotelian definition, bodies are things that have dimension).
3. Therefore, anything moving into such a previously empty space would be occupying the same space simultaneously, and two bodies cannot do so.

More persuasive was the argument that a void is unnecessary, that since the fundamental character of an object consists of those measurable dimensions, then a void with the same dimensions as the cup, or horse, or ship occupying it is no different from the object. One, therefore, is redundant, and since the object cannot be superfluous, the void must be.

It takes millennia to recover from that sort of unassailable logic, temptingly similar to that used in Monty Python and the Holy Grail to demonstrate that if a woman weighs as much as a duck, she is a witch. Aristotle's blind spot regarding the existence of a void would be inherited by a hundred generations of his adherents. Those who read the work of Heron did so through an Aristotelian scrim on which was printed, in metaphorical letters twenty feet high: NATURE ABHORS A VACUUM.



Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics in original.)





October 4, 2010

Country Data on Light Intensity at Night May Be More Accurate than Official GDP



(p. 63) In a new working paper, Vernon Henderson, Adam Storeygard and David Weil of Brown University suggest an alternative source of data: outer space. In particular they track changes in the intensity of artificial light over a country at night, which should increase with incomes. American military weather satellites collect these data every night for the entire world.

It is hard to know exactly how much weight to put on extraterrestrial brightness. Changes in the efficiency of electricity transmission, for example, may cause countries to look brighter from outer space, even if economic activity has not increased much. But errors in its measurement are unlikely to be correlated with errors in the calculation of official GDP, since they arise for different reasons. A weighted average of the growth implied by changes in the intensity of artificial light and official GDP growth rates ought to improve the accuracy of estimates of economic growth. Poor countries in particular may have dodgy GDP numbers but their night-light data are as reliable as anyone else's.
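The combination the passage describes can be sketched numerically. A minimal illustration, with made-up numbers (not from the working paper): if the errors in official GDP growth and in night-light-implied growth are independent, an inverse-variance weighted average of the two has a lower error variance than either estimate alone.

```python
# Hypothetical sketch of the weighting idea quoted above: combine two
# independent, noisy estimates of true growth (official statistics and
# growth inferred from night-light intensity). Because their errors arise
# for different reasons, an inverse-variance weighted average is more
# precise than either input. All numbers are illustrative.

def combine_estimates(official, official_var, lights, lights_var):
    """Inverse-variance weighted average of two independent estimates."""
    w_official = 1.0 / official_var
    w_lights = 1.0 / lights_var
    combined = (w_official * official + w_lights * lights) / (w_official + w_lights)
    combined_var = 1.0 / (w_official + w_lights)
    return combined, combined_var

# A poor country with dodgy statistics: official growth is imprecise
# (high variance); night-light growth is comparatively reliable.
growth, var = combine_estimates(official=0.07, official_var=0.0009,
                                lights=0.04, lights_var=0.0004)
print(round(growth, 4), round(var, 6))  # combined variance < either input variance
```

The combined estimate is pulled toward the more reliable night-light figure, which is the paper's point about poor countries: the worse the official data, the more weight the satellite data should get.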



For the full story, see:

"Measuring growth from outer space; Light relief; Data about light emitted into space may help improve growth estimates." The Economist (Aug. 6, 2009): 63.


The working paper referenced is:

Henderson, J. Vernon, Adam Storeygard, and David N. Weil. "Measuring Economic Growth from Outer Space." NBER Working Paper No. 15199, July 2009.





October 2, 2010

CFOs Are Bad at Forecasting, and Don't Realize They Are Bad



(p. 5) . . . , three financial economists -- Itzhak Ben-David of Ohio State University and John R. Graham and Campbell R. Harvey of Duke -- found that chief financial officers of major American corporations are not very good at forecasting the future. The authors' investigation used a quarterly survey of C.F.O.'s that Duke has been running since 2001. Among other things, the C.F.O.'s were asked about their expectations for the return of the Standard & Poor's 500-stock index for the next year -- both their best guess and their 80 percent confidence limit. This means that in the example above, there would be a 10 percent chance that the return would be higher than the upper bound, and a 10 percent chance that it would be less than the lower one.

It turns out that C.F.O.'s, as a group, display terrible calibration. The actual market return over the next year fell between their 80 percent confidence limits only a third of the time, so these executives weren't particularly good at forecasting the stock market. In fact, their predictions were negatively correlated with actual returns. For example, in the survey conducted on Feb. 26, 2009, the C.F.O.'s made their most pessimistic predictions, expecting a market return of just 2.0 percent, with a lower bound of minus 10.2 percent. In fact, the market soared 42.6 percent over the next year.

It may be neither troubling nor surprising that C.F.O.'s can't accurately predict the stock market's path. If they could, they'd be running hedge funds and making billions. What is troubling, though, is that as a group, many of these executives apparently don't realize that they lack forecasting ability. And, just as important, they don't seem to be aware of how volatile the market can be, even in "normal" times.
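The calibration check described in the excerpt can be sketched in a few lines. This is an illustrative reconstruction with invented data, not the Duke survey itself: each forecaster states an 80% confidence interval for the next year's return, and calibration is measured as the fraction of realized returns that actually fall inside the stated intervals. A well-calibrated forecaster scores near 0.80; the surveyed CFOs scored about one third.

```python
# A minimal sketch, with made-up numbers, of the calibration measure
# described above: count how often the realized return fell inside the
# forecaster's stated 80% confidence interval.

def hit_rate(intervals, realized):
    """Fraction of realized outcomes falling inside the stated intervals."""
    hits = sum(1 for (lo, hi), r in zip(intervals, realized) if lo <= r <= hi)
    return hits / len(realized)

# Illustrative data: 80% intervals (lower, upper) vs. actual annual
# returns, all in decimal form. The first pair echoes the Feb. 2009
# example in the article: a lower bound of -10.2% before a 42.6% rally.
intervals = [(-0.102, 0.150), (-0.05, 0.12), (0.00, 0.10),
             (-0.08, 0.09), (-0.03, 0.11), (-0.06, 0.13)]
realized = [0.426, 0.02, 0.15, -0.20, 0.05, 0.30]

print(hit_rate(intervals, realized))  # about 0.33, far below the 0.80 a calibrated forecaster achieves
```

Intervals that are too narrow are the signature of overconfidence: the forecasters underestimate how volatile the market can be, exactly as the excerpt concludes.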



For the full commentary, see:

RICHARD H. THALER. "Economic View; Often Wrong, But Never in Doubt." The New York Times, SundayBusiness Section (Sun., August 22, 2010): 5.

(Note: ellipses added.)

(Note: the online version of the article is dated August 21, 2010 and has the somewhat shorter title "Economic View; The Overconfidence Problem in Forecasting.")


The Ben-David et al article is:

Ben-David, Itzhak, John R. Graham, and Campbell Harvey. "Managerial Miscalibration." Fisher College of Business Working Paper No. 2010-03-012, July 2010.





October 1, 2010

Japanese "Longevity" Due Partly to Government Over-Counting Centenarians



WataseMitsueJapanCentenerian2010-09-10.jpg "A Kobe city official, left, visited Mitsue Watase, 100, at her home last week as Japanese officials started a survey on the whereabouts of centenarians." Source of caption and photo: online version of the NYT article quoted and cited below.


Oskar Morgenstern is mainly known as the co-author, with John von Neumann, of the book that started game theory. But it may be that his most important contribution to economics is a little-known book called On the Accuracy of Economic Observations. In that book he gave examples of social scientists theorizing to explain 'facts' that turned out not to be true (such as the case of the 14-year-old male widowers).

The point is that truth would be served by economists spending a higher percentage of their time improving the quality of data.

One can imagine Morgenstern sadly smiling at the case of the missing Japanese centenarians:


(p. 1) TOKYO -- Japan has long boasted of having many of the world's oldest people -- testament, many here say, to a society with a superior diet and a commitment to its elderly that is unrivaled in the West.

That was before the police found the body of a man thought to be one of Japan's oldest, at 111 years, mummified in his bed, dead for more than three decades. His daughter, now 81, hid his death to continue collecting his monthly pension payments, the police said.

Alarmed, local governments began sending teams to check on other elderly residents. What they found so far has been anything but encouraging.

A woman thought to be Tokyo's oldest, who would be 113, was last seen in the 1980s. Another woman, who would be the oldest in the world at 125, is also missing, and probably has been for a long time. When city officials tried to visit her at her registered address, they discovered that the site had been turned into a city park, in 1981.

To date, the authorities have been unable to find more than 281 Japanese who had been listed in records as 100 years old or older. Facing a growing public outcry, the (p. 6) country's health minister, Akira Nagatsuma, said officials would meet with every person listed as 110 or older to verify that they are alive; Tokyo officials made the same promise for the 3,000 or so residents listed as 100 and up.

The national hand-wringing over the revelations has reached such proportions that the rising toll of people missing has merited daily, and mournful, media coverage. "Is this the reality of a longevity nation?" lamented an editorial last week in The Mainichi newspaper, one of Japan's biggest dailies.


. . .


. . . officials admit that Japan may have far fewer centenarians than it thought.

"Living until 150 years old is impossible in the natural world," said Akira Nemoto, director of the elderly services section of the Adachi ward office. "But it is not impossible in the world of Japanese public administration."



For the full story, see:

MARTIN FACKLER. "Japan, Checking on Its Oldest People, Finds Many Gone, Some Long Gone." The New York Times, First Section (Sun., August 15, 2010): 1 & 6.

(Note: ellipses added.)

(Note: the online version of the article is dated August 14, 2010 and has the somewhat shorter title "Japan, Checking on Its Oldest, Finds Many Gone"; the words "To date" appear in the online, but not the print, version of the article.)


The Morgenstern book is:

Morgenstern, Oskar. On the Accuracy of Economic Observations. 2nd ed. Princeton: Princeton University Press, 1965.





September 29, 2010

Myron Scholes on Sticking to His Ideas, Losing $4 Billion in Four Months, and Rejecting Taleb's Advice



ScholesMyron2010-08-29.jpg Myron Scholes. Source of photo: online version of the NYT article quoted and cited below.





(p. 22) The writer Nassim Nicholas Taleb contends that instead of giving advice on managing risk, you "should be in a retirement home doing sudoku."
If someone says to you, "Go to an old-folks' home," that's kind of ridiculous, because a lot of old people are doing terrific things for society. I never tried sudoku. Maybe he spends his time doing sudoku.



Some economists believe that mathematical models like yours lulled banks into a false sense of security, and I am wondering if you have revised your ideas as a consequence.
I haven't changed my ideas. A bank needs models to measure risk. The problem, however, is that any one bank can measure its risk, but it also has to know what the risk taken by other banks in the system happens to be at any particular moment.


. . .


After leaving academia, you helped found Long-Term Capital Management, a hedge fund that lost $4 billion in four months and became a symbol of '90s-style financial failure.
Obviously, you prefer not to have lost money for investors.



For the full interview, see:

DEBORAH SOLOMON. "Questions for Myron Scholes; Crash Course." The New York Times, Magazine Section (Sun., May 17, 2009): 22.

(Note: ellipsis added; bold in original versions, to indicate questions by Deborah Solomon.)

(Note: the online version of the article is dated May 14, 2009.)





September 25, 2010

"A Very Clear-Thinking Heretic" Doubted Big Bang Theory



BurbidgeGeoffrey2010-09-02.jpg "Geoffrey Burbidge's work in astronomy changed the field." Source of caption and photo: online version of the NYT obituary quoted and cited below.


(p. 26) Geoffrey Burbidge, an English physicist who became a towering figure in astronomy by helping to explain how people and everything else are made of stardust, died on Jan. 26 in San Diego. He was 84.


. . .


Dr. Burbidge's skepticism extended to cosmology. In 1990, he and four other astronomers, including Drs. Arp and Hoyle, published a broadside in the journal Nature listing arguments against the Big Bang.

Dr. Burbidge preferred instead a version of Dr. Hoyle's Steady State theory of an eternal universe. In the new version, small, local big bangs originating in the nuclei of galaxies every 20 billion years or so kept the universe boiling. To his annoyance, most other astronomers ignored this view.

In a memoir in 2007, Dr. Burbidge wrote that this quasi-steady state theory was probably closer to the truth than the Big Bang. But he added that "there is such a heavy bias against any minority point of view in cosmology that it may take a very long time for this to occur."

Despite his contrarian ways, Dr. Burbidge maintained his credibility in the astronomical establishment, serving as director of Kitt Peak from 1978 to 1984 and editing the prestigious Annual Review of Astronomy and Astrophysics for more than 30 years. He was "a very clear-thinking heretic," Dr. Strittmatter said.



For the full obituary, see:

DENNIS OVERBYE. "Geoffrey Burbidge, Who Traced Life to Stardust, Is Dead at 84." The New York Times, First Section (Sun., February 7, 2010): A7.

(Note: ellipsis added.)

(Note: the online version of the obituary is dated February 6, 2010.)





August 11, 2010

Documenting Dangers of Growing Public Debt (and of Replacing History with Math)



RogoffReinhart2010-08-04.jpg "Kenneth Rogoff and Carmen Reinhart at Ms. Reinhart's Washington home. They started their book around 2003, years before the economy began to crumble." Source of caption and photo: online version of the NYT article quoted and cited below.


(p. 1) Like a pair of financial sleuths, Ms. Reinhart and her collaborator from Harvard, Kenneth S. Rogoff, have spent years investigating wreckage scattered across documents from nearly a millennium of economic crises and collapses. They have wandered the basements of rare-book libraries, riffled through monks' yellowed journals and begged central banks worldwide for centuries-old debt records. And they have manually entered their findings, digit by digit, into one of the biggest spreadsheets you've ever seen.

Their handiwork is contained in their recent best seller, "This Time Is Different," a quantitative reconstruction of hundreds of historical episodes in which perfectly smart people made perfectly disastrous decisions. It is a panoramic opus, both geographically and temporally, covering crises from 66 countries over the last 800 years.

The book, and Ms. Reinhart's and Mr. Rogoff's own professional journeys as economists, zero in on some of the broader shortcomings of their trade -- thrown into harsh relief by economists' widespread failure to anticipate or address the financial crisis that began in 2007.

"The mainstream of academic research in macroeconomics puts theoretical coherence and elegance first, and investigating the data second," says Mr. Rogoff. For that reason, he says, much of the profession's celebrated work "was not terribly useful in either predicting the financial crisis, or in assessing how it would play out once it happened."

"People almost pride themselves on not paying attention to current events," he says.


. . .


(p. 6) Although their book is studiously nonideological, and is more focused on patterns than on policy recommendations, it has become fodder for the highly charged debate over the recent growth in government debt.

To bolster their calls for tightened government spending, budget hawks have cited the book's warnings about the perils of escalating public and private debt. Left-leaning analysts have been quick to take issue with that argument, saying that fiscal austerity perpetuates joblessness, and have been attacking economists associated with it.


. . .


The economics profession generally began turning away from empirical work in the early 1970s. Around that time, economists fell in love with theoretical constructs, a shift that has no single explanation. Some analysts say it may reflect economists' desire to be seen as scientists who describe and discover universal laws of nature.

"Economists have physics envy," says Richard Sylla, a financial historian at the Stern School of Business at New York University. He argues that Paul Samuelson, the Nobel laureate whom many credit with endowing economists with a mathematical tool kit, "showed that a lot of physical theories and concepts had economic analogs."

Since that time, he says, "economists like to think that there is some physical, stable state of the world if they get the model right." But, he adds, "there is really no such thing as a stable state for the economy."

Others suggest that incentives for young economists to publish in journals and gain tenure predispose them to pursue technical wizardry over deep empirical research and to choose narrow slices of topics. Historians, on the other hand, are more likely to focus on more comprehensive subjects -- that is, the material for books -- that reflect a deeply experienced, broadly informed sense of judgment.

"They say historians peak in their 50s, once they've accumulated enough knowledge and wisdom to know what to look for," says Mr. Rogoff. "By contrast, economists seem to peak much earlier. It's hard to find an important paper written by an economist after 40."



For the full story, see:

CATHERINE RAMPELL. "They Did Their Homework (800 Years of It)." The New York Times, SundayBusiness Section (Sun., July 4, 2010): 1 & 6.

(Note: the online version of the article is dated July 2, 2010.)

(Note: ellipses added.)


The reference for the book is:

Reinhart, Carmen M., and Kenneth Rogoff. This Time Is Different: Eight Centuries of Financial Folly. Princeton, NJ: Princeton University Press, 2009.



This-time-is-differentBK.jpg

Source of book image: http://www.paschaldonohoe.ie/wp-content/uploads/2010/02/This-time-is-different.jpg







August 9, 2010

Scientific Opinion Shifts to Galambos Who Was Fired for His Theory



GalambosRobertNerveScientist2010-08-04.jpg "Robert Galambos, . . . , studied the inaudible sounds that allow bats to fly in the dark." Source of caption and photo: online version of the NYT article quoted and cited below.



(p. 20) Dr. Robert Galambos, a neuroscientist whose work included helping to prove how bats navigate in total darkness and deciphering the codes by which nerves transmit sounds to the brain, died June 18 at his home in the La Jolla section of San Diego. He was 96.


. . .


In 1960, while on an airplane, Dr. Galambos wrote that he had an inspiring thought: that the tiny cells that make up 40 percent of the brain, called glia, are as crucial to mental functioning as neurons.

"I know how the brain works!" he exclaimed to his companion.

But his superiors at Walter Reed found the theory so radical that he was soon job-hunting. The view at the time was that glia existed mainly to support neurons, considered the structural and functional unit of the nervous system. But Dr. Galambos clung to his belief, despite the failure of three experiments he performed in the 1960s.

Since then, scientific opinion has been shifting in his direction. In 2008, Ben A. Barres of the Stanford University School of Medicine wrote glowingly in the journal Neuron about the powerful role glia are now seen to play. He concluded, "Quite possibly the most important roles of glia have yet to be imagined."




For the full obituary, see:

DOUGLAS MARTIN. "Robert Galambos, 96, Dies; Studied Nerves and Sound." The New York Times, First Section (Sun., July 18, 2010): 20.

(Note: the online version of the article is dated July 15, 2010 and has the title "Robert Galambos, Neuroscientist Who Showed How Bats Navigate, Dies at 96.")

(Note: ellipses added.)





August 6, 2010

Smithsonian and NIH Are Contributing to Wikipedia, But Will Professors?



Professor Jemielniak, in the passage quoted below, asks why professors would ever contribute to Wikipedia, since they can already get published in academic journals and have a captive audience at their lectures.

Based on that reasoning, professors likewise would have little motive to blog--yet many do. Why? Perhaps because there is something satisfying in reaching a wide audience of readers who are not required to read, but who choose to read.

(Readers of academic articles are often few, and students at academic lectures are often captives whose bodies are present, but whose minds are somewhere else.)


(p. B2) In the United States, the Wikimedia Foundation has sponsored an academy to teach experts at the National Institutes of Health how to contribute to the site and monitor what appears there. And Mr. Wyatt said that other institutions including the Smithsonian had inquired about getting their own Wikipedian in residence to facilitate their staff members' contributions to the site.

One talk here by a Polish professor, Dariusz Jemielniak, took a jab at the idea of experts as contributors. He said he had noticed that students often remained contributors to Wikipedia but that professors left quickly. His explanation was that Wikipedia was really just a game for people to gain status. A teenager offering the definitive account of the Thirty Years' War gets a huge audience and respect from his peers. But, Mr. Jemielniak asked, why would a professor stoop to edit Wikipedia?

"Professors already get published and can lecture and force people to listen to their ideas," he said.



For the full story, see:

NOAM COHEN. "Link by Link; How Can Wikipedia Grow? Maybe in Bengali." The New York Times (Mon., July 12, 2010): B2.

(Note: the online version of the article is dated July 11, 2010.)





August 1, 2010

Jefferson "Was Experimental and Had a Lot of Failures"



JeffersonianGardeningA2010-07-12.jpg "In the vegetable garden at Monticello, his home in Virginia, Thomas Jefferson sowed seeds from around the world and shared them with farmers. He was not afraid of failure, which happened often." Source of caption and photo: online version of the NYT article quoted and cited below.


Steven Johnson has written an intriguing argument that the intellectual foundation of the founding fathers was based as much on experimental science as on religion. The article quoted below provides a small bit of additional evidence in support of Johnson's argument.


(p. D1) NEW gardeners smitten with the experience of growing their own food -- amazed at the miracle of harvesting figs on a Brooklyn rooftop, horrified by the flea beetles devouring the eggplants -- might be both inspired and comforted by the highs and lows recorded by Thomas Jefferson from the sun-baked terraces of his two-acre kitchen garden 200 years ago.

And they could learn a thing or two from the 19th-century techniques still being used at Monticello today.

"He was experimental and had a lot of failures," Peter Hatch, the director of gardens and grounds, said on a recent afternoon, as we stood under a scorching sun in the terraced garden that took seven slaves three years to cut into the hill. "But Jefferson always believed that 'the failure of one thing is repaired by the success of another.' "

After he left the White House in 1809 and moved to Monticello, his Palladian estate here, Jefferson grew 170 varieties of fruits and 330 varieties of vegetables and herbs, until his death in 1826.

As we walked along the geometric beds -- many of them planted in an ancient Roman quincunx pattern -- I made notes on the beautiful crops I had never grown. Sea kale, with its great, ruffled blue-green leaves, now full of little round seed pods. Egyptian onions, whose tall green stalks bore quirky hats of tiny seeds and wavy green sprouts. A pre-Columbian tomato called Purple Calabash, whose energetic vines would soon be trained up a cedar trellis made of posts cut from the woods.

"Purple Calabash is one of my favorites," Mr. Hatch said. "It's an acidic, al-(p. D7)most black tomato, with a convoluted, heavily lobed shape."

Mr. Hatch, who has directed the restoration of the gardens here since 1979, has pored over Jefferson's garden notes and correspondence. He has distilled that knowledge in "Thomas Jefferson's Revolutionary Garden," to be published by Yale University Press.



For the full story, see:

ANNE RAVER. "A Revolutionary With Seeds, Too." The New York Times (Thurs., July 1, 2010): D1 & D7.

(Note: the online version of the article is dated June 30, 2010 and has the title "In the Garden; At Monticello, Jefferson's Methods Endure.")





July 26, 2010

The British Museum Collaborating with Wikipedia



WikipediaVisitsBritishMuseum2010-07-05.jpg "Two visitors from Wikipedia, Liam Wyatt, left, and Joseph Seddon, at the British Museum." Source of caption and photo: online version of the NYT article quoted and cited below.


(p. C1) The British Museum has begun an unusual collaboration with Wikipedia, the online, volunteer-written encyclopedia, to help ensure that the museum's expertise and notable artifacts are reflected in that digital reference's pages.

About 40 Wikipedia contributors in the London area spent Friday with a "backstage pass" to the museum, meeting with curators and taking photographs of the collection. And in a curious reversal in status, curators were invited to review Wikipedia's treatment of the museum's collection and make a case that important pieces were missing or given short shrift.

Among those wandering the galleries was the museum's first Wikipedian in residence, Liam Wyatt, who will spend five weeks in the museum's offices to build a relationship between the two organizations, one founded in 1753, the other in 2001.

"I looked at how many Rosetta Stone page views there were at Wikipedia," said Matthew Cock, who is in charge of the museum's Web site and is supervising the collaboration with Wikipedia. "That is perhaps our iconic object, and five times as many people go to the Wikipedia article as to ours."

In other words, if you can't beat 'em, join 'em.

Once criticized as amateurism run amok, Wikipedia has become ingrained in the online world: it is consulted by millions of users when there is breaking news; its articles are frequently the first result when a search engine is used.


. . .


(p. C6) Getting permission to work with Wikipedia was not as hard a sell as he expected, Mr. Cock said. "Everyone assumed everyone else hated it and that I shouldn't recommend it to the directorate," he said. "I laid it out, put a paper together. I won't say I was surprised, but I was very pleased it was very well received."

He said he had enthusiastic support from four departments, including Greek and Roman antiquity and prints and drawings. "I don't think it is just the young curators," he added.



For the full story, see:

NOAM COHEN. "Venerable British Museum Enlists in the Wikipedia Revolution." The New York Times (Sat., June 5, 2010): C1 & C6.

(Note: ellipsis added.)

(Note: the online version of the article is dated June 4, 2010.)





June 19, 2010

Economics Is More Like Biology than Physics



(p. A13) If economics is a science, it is more like biology than physics. Biologists try to understand the relationships in a complex system. That's hard enough. But they can't tell you what will happen with any precision to the population of a particular species of frog if rainfall goes up this year in a particular rain forest. They might not even be able to count the number of frogs right now with any exactness.

We have the same problems in economics. The economy is a complex system, our data are imperfect and our models inevitably fail to account for all the interactions.

The bottom line is that we should expect less of economists. Economics is a powerful tool, a lens for organizing one's thinking about the complexity of the world around us. That should be enough. We should be honest about what we know, what we don't know and what we may never know. Admitting that publicly is the first step toward respectability.



For the full commentary, see:

RUSS ROBERTS. "Is the Dismal Science Really a Science? Some macroeconomists say if we just study the numbers long enough we'll be able to design better policy. That's like the sign in the bar: Free Beer Tomorrow." The Wall Street Journal (Thurs., February 26, 2010): A13.







June 3, 2010

"The Intellectual Energy is No Longer with the Economists Who Construct Abstract and Elaborate Models"



(p. A23) In The Wall Street Journal, Russ Roberts of George Mason University wondered why economics is even considered a science. Real sciences make progress. But in economics, old thinkers cycle in and out of fashion. In real sciences, evidence solves problems. Roberts asked his colleagues if they could think of any econometric study so well done that it had definitively settled a dispute. Nobody could think of one.

"The bottom line is that we should expect less of economists," Roberts wrote.

In a column called "A Crisis of Understanding," Robert J. Shiller of Yale pointed out that the best explanation of the crisis isn't even a work of economic analysis. It's a history book -- "This Time is Different" by Carmen M. Reinhart and Kenneth S. Rogoff -- that is almost entirely devoid of theory.

One gets the sense, at least from the outside, that the intellectual energy is no longer with the economists who construct abstract and elaborate models. Instead, the field seems to be moving in a humanist direction. Many economists are now trying to absorb lessons learned by psychologists, neuroscientists and sociologists.



For the full commentary, see:

DAVID BROOKS. "The Return of History." The New York Times (Fri., March 26, 2010): A23.

(Note: the online version of the commentary is dated March 25, 2010.)





May 13, 2010

PowerPoint Useful for Graphs and for "Hypnotizing Chickens"



PowerpointChartAfganStrategy2010-05-12.jpg "A PowerPoint diagram meant to portray the complexity of American strategy in Afghanistan certainly succeeded in that aim." Source of caption and graphic: online version of the NYT article quoted and cited below.


(p. A1) WASHINGTON -- Gen. Stanley A. McChrystal, the leader of American and NATO forces in Afghanistan, was shown a PowerPoint slide in Kabul last summer that was meant to portray the complexity of American military strategy, but looked more like a bowl of spaghetti.

"When we understand that slide, we'll have won the war," General McChrystal dryly remarked, one of his advisers recalled, as the room erupted in laughter.

The slide has since bounced around the Internet as an example of a military tool that has spun out of control. Like an insurgency, PowerPoint has crept into the daily lives of military commanders and reached the level of near obsession. The amount of time expended on PowerPoint, the Microsoft presentation program of computer-generated charts, graphs and bullet points, has made it a running joke in the Pentagon and in Iraq and Afghanistan.

"PowerPoint makes us stupid," Gen. James N. Mattis of the Marine Corps, the Joint Forces commander, said this month at a military conference in North Carolina. (He spoke without PowerPoint.) Brig. Gen. H. R. McMaster, who banned PowerPoint presentations when he led the successful effort to secure the northern Iraqi city of Tal Afar in 2005, followed up at the same conference by likening PowerPoint to an internal threat.

"It's dangerous because it can create the illusion of understanding and the illusion of control," General McMaster said in a telephone interview afterward. "Some problems in the world are not bullet-izable."


. . .


(p. A8) Gen. David H. Petraeus, who oversees the wars in Iraq and Afghanistan and says that sitting through some PowerPoint briefings is "just agony," nonetheless likes the program for the display of maps and statistics showing trends. He has also conducted more than a few PowerPoint presentations himself.


. . .


Senior officers say the program does come in handy when the goal is not imparting information, as in briefings for reporters.

The news media sessions often last 25 minutes, with 5 minutes left at the end for questions from anyone still awake. Those types of PowerPoint presentations, Dr. Hammes said, are known as "hypnotizing chickens."



For the full story, see:

ELISABETH BUMILLER. "We Have Met the Enemy and He Is PowerPoint." The New York Times (Thurs., April 27, 2010): A1 & A8.

(Note: ellipses added.)

(Note: the online version of the story is dated April 26, 2010.)


An interesting, but overdone critique of PowerPoint by an intelligent expert on graphics is:

Tufte, Edward R. The Cognitive Style of PowerPoint. Cheshire, CT: Graphics Press, 2003.





April 18, 2010

Britannica Imitates Wikipedia



(p. 209) Britannica had already launched a project called WebShare in April 2008, which was described as "A special program for web publishers, including bloggers, webmasters, and anyone who writes for the Internet. You get complimentary access to the Encyclopaedia Britannica online and, if you like, an easy way to give your readers background on the topics you write about with links to complete Britannica articles." This was a rather radical move, obviously trying to vie with Wikipedia's emergence as one of the most linked-to resources on the Internet.

But the latest initiative was something quite astonishing, as Britannica was now inviting users to be part of the team of content creators:

To elicit their participation in our new online community of scholars, we will provide our contributors with a reward system and a rich online home that will enable them to promote themselves, their work, and their services. . . . Encyclopaedia Britannica will allow those visitors to suggest changes and additions to that content.


Source:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.

(Note: ellipsis in original.)





April 14, 2010

Highly Reputed Academic Science Journal Found Similar Error Rates in Britannica and Wikipedia



(p. 208) Wikipedia was already highly regarded, anecdotally, but it got a glowing evaluation from the prestigious Nature magazine in December 2005, when it concluded that Wikipedia "comes close" to Britannica in the quality of its science articles. "Our reviewers identified an average of four errors in each Wikipedia article, and three in each Britannica article."

The news came as a bit of a surprise. Many folks felt Wikipedia did better than they'd have thought, and Britannica did, well, worse than they expected. The result of the study was hotly debated between Nature and Britannica, but to most Wikipedians it was a vindication. They knew that Wikipedia was a minefield of errors, but to be in such close proximity in quality to a traditionally edited encyclopedia, while using such a grassroots process, was the external validation they had been waiting for.

Britannica wasn't pleased with the methodology, and posted a rebuttal with this criticism: "Almost everything about the journal's investigation, from the criteria for identifying inaccuracies to the discrepancy between the article text and its headline, was wrong and misleading." Nature and Britannica exchanged barbs and rebuttals, but in the end, the overall result seemed clear.

"The Nature (sic) article showed that we are on the right track with our current methods. We just need better ways to prevent the display of obvious vandalism at any time," wrote longtime Wikipedian Daniel Mayer on the mailing list.



Source:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.

(Note: italics in original.)





April 11, 2010

Quants Confused Mathematical Models and Reality



QuantsBK.jpg

Source of book image: http://seekingalpha.com/article/188632-the-quants-review-when-the-money-grid-went-dark



(p. 7) The virtually exclusive use of mathematical models, Mr. Patterson says, was what separated the younger cohorts of quants from their Wall Street forebears. Unlike Warren Buffett or Peter Lynch, the quants did not focus on so-called market fundamentals like what goods or services a particular company actually produced. Seldom if ever did they act on old-fashioned gut instinct. Instead, they focused on factors like how cheap a stock was relative to the rest of the market or how quickly its price had risen or fallen.

Therein was the quants' flaw, according to Mr. Patterson. Pioneers like Mr. Thorp understood that while the math world and the financial world have much in common, they aren't always in sync. The quant traders' model emphasized the most likely moves a stock or bond price could make. It largely ignored the possibility of big jolts caused by human factors, especially investor panics.

"The model soon became so ubiquitous that, hall-of-mirrors-like, it became difficult to tell the difference between the model and the market itself," Mr. Patterson declares.

Move ahead to August 2007 and beyond, when markets swooned on doubts about subprime mortgages. Stocks that the model predicted were bound to go up went sharply down, and vice versa. Events that were supposed to happen only once in 10,000 years happened three days in a row.
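
Mr. Patterson's last observation can be made concrete with a bit of arithmetic. As a deliberately crude illustration (reading "once in 10,000 years" as a fixed per-trading-day probability, with roughly 252 trading days per year, which is an assumption, not a figure from the review), the chance the models would assign to three such days in a row is:

```python
# Crude sketch: if an event is "once in 10,000 years" and the model treats
# trading days as independent, three in a row is astronomically improbable.
p_daily = 1 / (10_000 * 252)      # assumed per-trading-day probability
p_three_in_a_row = p_daily ** 3   # independence assumption of the models

print(f"per-day probability: {p_daily:.1e}")
print(f"three days in a row: {p_three_in_a_row:.1e}")
```

That the improbable happened anyway suggests the independence and thin-tail assumptions, not the market, were what failed.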




For the full review, see:

HARRY HURT III. "Off the Shelf; In Practice, Stock Formulas Weren't Perfect." The New York Times, SundayBusiness Section (Sun., February 21, 2010): 7.

(Note: the online version of the article is dated February 20, 2010.)



The reference to Patterson's book is:

Patterson, Scott. The Quants: How a New Breed of Math Whizzes Conquered Wall Street and Nearly Destroyed It. New York: Crown Business, 2010.






April 4, 2010

Philosopher Duped by Hoax Because He Failed to Consult Wikipedia



(p. A4) PARIS -- For the debut of his latest weighty title, "On War in Philosophy," the French philosopher Bernard-Henri Lévy made the glossy spreads of French magazines with his trademark panache: crisp, unbuttoned white Charvet shirts, golden tan and a windswept silvery mane of hair.

But this glamorous literary campaign was suddenly marred by an absolute philosophical truth: Mr. Lévy backed up the book's theories by citing the thought of a fake philosopher. In fact, the sham philosopher has never been a secret, and even has his own Wikipedia entry.

In the uproar that followed over the rigors of his research, Mr. Lévy on Tuesday summed up his situation with one e-mailed sentence: "My source of information is books, not Wikipedia."



For the full story, see:

DOREEN CARVAJAL. "Philosopher Left to Muse on Ridicule Over a Hoax." The New York Times (Weds., February 10, 2010): A4.

(Note: the online version of the article is dated February 9, 2010.)





April 2, 2010

"Expert Scholarship" Versus "People of Dubious Background"



(p. 71) The acknowledgment, by name, of volunteers in the preface sections of the OED is akin to Wikipedia's edit history, where one can inspect who contributed to each article. Some Oxford contributors were professors, some royalty, but most were ordinary folks who answered the call. Winchester, in The Professor and the Madman: A Tale of Murder, Insanity, and the Making of the Oxford English Dictionary, tells the story of the "madman" William Chester Minor, a U.S. Civil War survivor whose "strange and erratic behavior" resulted in him shooting an "innocent working man" to death in the street in Lambeth. He was sent to Broadmoor asylum for criminal lunatics. He discovered the OED as a project around 1881, when he saw the "Appeal for Readers" in the library, and worked for the next twenty-one years contributing to the project, receiving notoriety as a contributor "second only to the contributions of Dr. Fitzedward Hall in enhancing our illustration of the literary history of individual words, phrases and constructions." Minor did something unusual in not just sending submissions, but having his own cataloging system such that the dictionary editors could send a postcard and "out the details flowed, in abundance and always with unerring accuracy." Until Minor and Murray met in January 1891, no one working with (p. 72) the OED knew their prolific contributor was a madman and murderer housed at Broadmoor.

As we will see in later chapters, a common question of the wiki method is whether one can trust information created by strangers and people of dubious background. But the example of the OED shows that using contributors rather than original expert scholarship is not a new phenomenon, and that projects built as a compendium of primary sources are well suited for harnessing the power of distributed volunteers.



Source:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.

(Note: italics in original.)





March 29, 2010

Like Wikipedia, Oxford English Dictionary Was Built by Amateur Volunteers



(p. 70) The venerable Oxford English Dictionary (OED), the history of which is masterfully documented by Simon Winchester in The Meaning of Everything and The Professor and the Madman, was in fact possible only through the soliciting of contributions, and the receipt of thousands of "slips" of paper, each with words and definitions found by readers and volunteers.

The OED didn't start out with such a grand title, and was first a project of the Philological Society in Great Britian (sic), as a response to what they saw as the popular dictionaries of Noah Webster and Samuel Johnson not doing the "English language justice." In 1857, it was started as the Unregistered Words Committee, and the job was to comb through all forms of media of the era (printed matter, song, spoken word) leading to the inventorying and cataloging of English words. The three founders, Chenevix Trench, Herbert Coleridge, and Frederick Furnivall, sent out a notice in November of that year: "AN APPEAL TO THE ENGLISH-SPEAKING AND ENGLISH-READING PUBLIC TO READ BOOKS AND MAKE EXTRACTS FOR THE PHILOLOGICAL SOCIETY'S NEW ENGLISH DICTIONARY." Specifically, it described the project thusly:

Accordingly, in January 1859. the Society issued their Proposal for the publication of a New English Dictionary, in which the characteristics of the proposed work were explained, and an appeal made to the English and American public to assist in collecting the raw materials for the work, these materials consisting of quotations illustrating the use of English words by all writers of all ages and in all senses, each quotation being made on a uniform plan on a half-sheet of notepaper that they might in due course be arranged and classified alphabetically and significantly. This Appeal met with generous response: some hundreds of volunteers began to read books, make quotations and send in their slips to "sub-editors who volunteered each to take charge of a letter or part of one, and by whom the slips were in turn further arranged, classified, and (p. 71) to some extent used as the basis of definitions and skeleton schemes of the meanings of words in preparation for the Dictionary.

The notice was sent to "bookshops and libraries across the English-speaking world" and, under the direction of Scottish lexicographer James Murray, saw its growth blossom. In 1879, Oxford University Press formally agreed to be publisher and employed Murray to take on the editorship. Slips sent in to the effort were filed away in pigeonholes at the Scriptorium, a corrugated metal building Mill Hill School erected specifically for the effort of sorting and housing the staff to work on the dictionary.



Source:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.

(Note: italics and caps in original.)


The block quote within the Lih block quote is from p. 108 of:

Winchester, Simon. The Meaning of Everything: The Story of the Oxford English Dictionary. paperback ed. New York: Oxford University Press, USA, 2003.





March 25, 2010

At Odds with Academic Culture, Wiki Programmer Adams Released Early and Released Often



(p. 67) Adams did something unexpected for the academic community, but common in open source culture--release early and release often. Within weeks of its launch, one of the biggest annoyances of Wikipedia was resolved directly by the software's author. It was not because of monetary compensation or any formal request, but simply because the author was interested in solving it on his own time, and sharing it with others. It was the hacker ethos, and it had crossed from the domain of tech programmers into the world of encyclopedias.


Source:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.





March 23, 2010

"Strategy, as We Knew It, Is Dead''



(p. B7) During the recession, as business forecasts based on seemingly plausible swings in sales smacked up against reality, executives discovered that strategic planning doesn't always work.

Some business leaders came away convinced that the new priority was to be able to shift course on the fly. Office Depot Inc., for example, began updating its annual budget every month, starting in early 2009. Other companies started to factor more extreme scenarios into their thinking. A few even set up "situation rooms," where staffers glued to computer screens monitored developments affecting sales and finances.

Now, even though the economy is slowly picking up, those fresh habits aren't fading. "This downturn has changed the way we will think about our business for many years to come," says Steve Odland, Office Depot's chairman and chief executive.

Walt Shill, head of the North American management consulting practice for Accenture Ltd., is even more blunt: "Strategy, as we knew it, is dead," he contends. "Corporate clients decided that increased flexibility and accelerated decision making are much more important than simply predicting the future."

Companies have long planned for changing circumstances. What's new--and a switch from the distant calendars and rigid forecasts of the past--is the heavy dose of opportunism. Office Depot stuck with its three-year planning process after the recession hit, largely to make sure employees had a common plan to rally around, Mr. Odland says. But the CEO decided to review the budget every month rather than quarterly so the office-supply chain could react faster to changes in customers' needs.




For the full story, see:

JOANN S. LUBLIN and DANA MATTIOLI. "Theory & Practice; Strategic Plans Lose Favor; Slump Showed Bosses Value of Flexibility, Quick Decisions." The Wall Street Journal (Mon., January 22, 2010): B7.





March 17, 2010

Wikipedia Works in Practice, Not in Theory



(p. 20) Jimmy walked into the offices of Chicago Options Associates in 1994 and met the CEO Michael Davis for a job interview. Davis had looked over Wales's academic publication about options pricing.

"It was impressive looking," says Wales wryly about the paper. "It was a very theoretical paper but it wasn't very practical." But Davis was sufficiently intrigued, as he wanted someone like Wales to pore over the firm's financial models and help improve them. So he took on young Wales, who seemed to be sharp and had acumen for numbers. Little did either of them know they would have a long road ahead together, with Wikipedia in the future.

Wales's first job was to go over the firm's current pricing models. "What was really fascinating was that it was truly a step beyond what I'd seen in academia," he recalls. "It was very practical, and didn't have a real theoretical foundation." Wales was intrigued that the firm traded on principles that worked in practice, not in theory. (This is something he would say about his future endeavor Wikipedia.) "Basically they just knew in the marketplace that the existing models were wrong."



Source:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.

(Note: italics in original.)





March 12, 2010

The Entrepreneurial Epistemology of Wikipedia



Wikipedia-RrevolutionBK2010-02-08.jpg

Source of book image: http://kellylowenstein.files.wordpress.com/2009/04/wikipedia-revolution1.jpg



Wikipedia is a very unexpected and disruptive institution. Amateurs have produced an encyclopedia that is bigger, deeper, more up-to-date, and arguably at least as accurate as the best professional encyclopedias, such as Britannica.

I learned a lot from Lih's book. For instance, I did not know that the founders of Wikipedia were admirers of Ayn Rand. And I did not know that the Oxford English Dictionary was constructed mainly by volunteer amateurs.

I also did not know anything about the information technology precursors and the back-history of the institutions that helped Wikipedia to work.

I learned much about the background, values, and choices of Wikipedia entrepreneur "Jimbo" Wales. (Jimbo Wales seems not to be perfect, but on balance to be one of the 'good guys' in the world--one of those entrepreneurs who can be admired for something beyond their particular entrepreneurial innovation.)

Lih's book also does a good job of sketching the problems and tensions within Wikipedia.

I believe that Wikipedia is a key step in the development of faster and better institutions of knowledge generation and communication. I also believe that substantial further improvements can and will be made.

Most importantly, I think that you can only go so far with volunteers--ways must be found to reward and compensate contributors.

In the meantime, much can be learned from Lih. In the next few weeks, I will be quoting a few passages that I found especially illuminating.


Book discussed:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.





January 20, 2010

Global Warming "Consensus" Achieved by Suppressing Skeptical Research



(p. A25) When scientists make putative compendia of that literature, such as is done by the U.N. climate change panel every six years, the writers assume that the peer-reviewed literature is a true and unbiased sample of the state of climate science.

That can no longer be the case. The alliance of scientists at East Anglia, Penn State and the University Corporation for Atmospheric Research (in Boulder, Colo.) has done its best to bias it.

A refereed journal, Climate Research, published two particular papers that offended Michael Mann of Penn State and Tom Wigley of the University Corporation for Atmospheric Research. One of the papers, published in 2003 by Willie Soon and Sallie Baliunas (of the Harvard-Smithsonian Center for Astrophysics), was a meta-analysis of dozens of "paleoclimate" studies that extended back 1,000 years. They concluded that 20th-century temperatures could not confidently be considered to be warmer than those indicated at the beginning of the last millennium.

In fact, that period, known as the "Medieval Warm Period" (MWP), was generally considered warmer than the 20th century in climate textbooks and climate compendia, including those in the 1990s from the IPCC.

Then, in 1999, Mr. Mann published his famous "hockey stick" article in Geophysical Research Letters (GRL), which, through the magic of multivariate statistics and questionable data weighting, wiped out both the Medieval Warm Period and the subsequent "Little Ice Age" (a cold period from the late 16th century to the mid-19th century), leaving only the 20th-century warming as an anomaly of note.

Messrs. Mann and Wigley also didn't like a paper I published in Climate Research in 2002. It said human activity was warming surface temperatures, and that this was consistent with the mathematical form (but not the size) of projections from computer models. Why? The magnitude of the warming in CRU's own data was not as great as in the models, so therefore the models merely were a bit enthusiastic about the effects of atmospheric carbon dioxide.

Mr. Mann called upon his colleagues to try and put Climate Research out of business. "Perhaps we should encourage our colleagues in the climate research community to no longer submit to, or cite papers in, this journal," he wrote in one of the emails. "We would also need to consider what we tell or request of our more reasonable colleagues who currently sit on the editorial board."

After Messrs. Jones and Mann threatened a boycott of publications and reviews, half the editorial board of Climate Research resigned. People who didn't toe Messrs. Wigley, Mann and Jones's line began to experience increasing difficulty in publishing their results.




For the full commentary, see:

PATRICK J. MICHAELS. "OPINION; How to Manufacture a Climate Consensus; The East Anglia emails are just the tip of the iceberg." The Wall Street Journal (Fri., DECEMBER 18, 2009): A25.

(Note: the online version of the article is dated DECEMBER 17, 2009.)





January 18, 2010

Establishments Assume New Methods Are Unsound Methods



(p. 188) For the next two years, Conway coordinated her efforts under Sutherland at PARC with Mead's ongoing work at Caltech. But she was frustrated with the pace of progress. There was no shortage of innovative design ideas; computerized design tools had advanced dramatically since Mead's first efforts several years before. Yet the industry as a whole continued in the old rut. As Conway put it later, the problem was "How can you take methods that are new, methods that are not in common use and therefore perhaps considered unsound methods, and turn them into sound methods?" [Conway's italics].

She saw the challenge in the terms described in Thomas Kuhn's popular book The Structure of Scientific Revolutions. It was the problem that took Boltzmann to his grave. It was the problem of innovation depicted by economist Joseph Schumpeter in his essays on entrepreneurship: new systems lay waste to the systems of the past. Creativity is a solution for the creator and the new ventures he launches. But it wreaks dissolution--"creative destruction," in Schumpeter's words--for the defenders of old methods. In fact, no matter how persuasive the advocates of change, it is very rare that an entrenched establishment will reform its ways. Establishments die or retire or fall in revolution; they only rarely transform themselves.




Source:

Gilder, George. Microcosm: The Quantum Revolution in Economics and Technology. Paperback ed. New York: Touchstone, 1990.

(Note: italics in original.)





January 14, 2010

For 30 Years "Poincaré's Elegant Math Prevailed Over Boltzmann's Practical Findings"



(p. 182) . . . , Poincaré's elegant math prevailed over Boltzmann's practical findings. For some thirty years, Boltzmann struggled to get his ideas across. But he failed. He had the word, but he could not find a way to gain its acceptance in the world. For long decades, the establishment held firm.

So in the year 1906, Poincaré became president of the French (p. 183) Académie des Sciences and Boltzmann committed suicide. As Mead debatably puts it, "Boltzmann died because of Poincaré." At least, as Boltzmann's friends attest, this pioneer of the modern era killed himself in an apparent fit of despair, deepened by the widespread official resistance to his views.

He died, however, at the very historic moment when all over Europe physicists were preparing to vindicate the Boltzmann vision. He died just before the findings of Max Planck, largely derived from Boltzmann's probability concepts, finally gained widespread acceptance. He died several months after an obscure twenty-one-year-old student in Geneva named Albert Einstein used his theories in proving the existence of the atom and demonstrating the particle nature of light. In retrospect, Boltzmann can be seen as a near-tragic protagonist in the greatest intellectual drama of the twentieth century: the overthrow of matter.



Source:

Gilder, George. Microcosm: The Quantum Revolution in Economics and Technology. Paperback ed. New York: Touchstone, 1990.

(Note: ellipsis added.)





December 27, 2009

Emails Vindicate Skeptics Who Questioned Scientific Basis of Global Warming



(p. A1) Just two years ago, a United Nations panel that synthesizes the work of hundreds of climatologists around the world called the evidence for global warming "unequivocal."

But as representatives of about 200 nations converge in Copenhagen on Monday to begin talks on a new international climate accord, they do so against a background of renewed attacks on the basic science of climate change.

The debate, set off by the circulation of several thousand files and e-mail messages stolen from one of the world's foremost climate research institutes, has led some who oppose limits on greenhouse gas emissions, and at least one influential country, Saudi Arabia, to question the scientific basis for the Copenhagen talks.

The uproar has threatened to complicate a multiyear diplomatic effort already ensnared in difficult political, technical and financial disputes that have caused leaders to abandon hopes of hammering out a binding international climate treaty this year.


. . .


(p. A8) On dozens of Web sites and blogs, skeptics and foes of greenhouse gas restrictions take daily aim at the scientific arguments for human-driven climate change. The stolen material was quickly seized upon for the questions it raised about the accessibility of raw data to outsiders and whether some data had been manipulated.

An investigation into the stolen files is being conducted by the University of East Anglia, in England, where the computer breach occurred. Rajendra K. Pachauri, chairman of the United Nations Intergovernmental Panel on Climate Change, has also said he will look into the matter. At the same time, polls in the United States and Britain suggest that the number of people who doubt that global warming is dangerous or caused by humans has grown in recent years.


. . .


Science is about probability, not certainty. And the persisting uncertainties in climate science leave room for argument. What is a realistic estimate of how much temperatures will rise? How severe will the effects be? Are there tipping points beyond which the changes are uncontrollable?

Even climate scientists disagree on many of these questions. But skeptics have been critical of the data assembled to show that warming is occurring and the analytic methods that climate scientists use, including mathematical models used to demonstrate a human cause for warming and project future trends.

Both sides also have at times been criticized for overstatement in characterizing the scientific evidence. The contents of the stolen e-mail messages and documents have given fresh ammunition to the skeptics' camp.

The Climatic Research Unit's role as a central aggregator of temperature and other climate data has also made it a target. One widely discussed file extracted from the unit's computers, presumed to be the log of a researcher named Ian Harris, recorded his years of frustration in trying to make sense of disparate data and described procedures -- or "fudge factors," as he called them -- used by scientists to eliminate known sources of error.




For the full story, see:

ANDREW C. REVKIN and JOHN M. BRODER. "Facing Skeptics, Climate Experts Sure of Peril." The New York Times (Mon., December 7, 2009): A1 & A8.

(Note: the online version of the article is dated Sun., December 6, 2009 and has the title "In Face of Skeptics, Experts Affirm Climate Peril.")

(Note: ellipses added.)


Note: the online version of the article includes the following, very interesting, correction of the print version:

Correction: December 15, 2009
Because of an editing error, an article on Dec. 7 about the scientific evidence supporting global warming overstated the level of certainty expressed in a 2007 report by the Intergovernmental Panel on Climate Change, a network of scientists, that human-caused warming was under way and, if unabated, would pose rising risks. The panel said that most warming since 1950 was "very likely" caused by humans, not that there was "no doubt." The article also misidentified the temperature data cited by a scientist at the University of East Anglia's Climatic Research Unit who had expressed frustration in a log about trying to make sense of disparate data. The data was direct measurements of temperature, not indirect indicators like the study of tree rings.

(Note: italics and bold in original.)





December 26, 2009

Emails Reveal Global Warming Scientists Exclude Contrary Views



ClimateGateEmails.gif

Source of photo and email images: online version of the WSJ article quoted and cited below.



One can imagine Michael Crichton looking down on us with a sad smile:


(p. A3) The scientific community is buzzing over thousands of emails and documents -- posted on the Internet last week after being hacked from a prominent climate-change research center -- that some say raise ethical questions about a group of scientists who contend humans are responsible for global warming.

The correspondence between dozens of climate-change researchers, including many in the U.S., illustrates bitter feelings among those who believe human activities cause global warming toward rivals who argue that the link between humans and climate change remains uncertain.

Some emails also refer to efforts by scientists who believe man is causing global warming to exclude contrary views from important scientific publications.

"This is horrible," said Pat Michaels, a climate scientist at the Cato Institute in Washington who is mentioned negatively in the emails. "This is what everyone feared. Over the years, it has become increasingly difficult for anyone who does not view global warming as an end-of-the-world issue to publish papers. This isn't questionable practice, this is unethical."

John Christy, a scientist at the University of Alabama at Huntsville attacked in the emails for asking that an IPCC report include dissenting viewpoints, said, "It's disconcerting to realize that legislative actions this nation is preparing to take, and which will cost trillions of dollars, are based upon a view of climate that has not been completely scientifically tested--but rather orchestrated."

In all, more than 1,000 emails and more than 2,000 other documents were stolen Thursday from the Climate Research Unit at East Anglia University in the U.K. The identity of the hackers isn't certain, but the files were posted on a Russian file-sharing server late Thursday, and university officials confirmed over the weekend that their computer had been attacked and said the documents appeared to be genuine.


. . .


In one email, Benjamin Santer from the Lawrence Livermore National Laboratory in Livermore, Calif., wrote to the director of the climate-study center that he was "tempted to beat" up Mr. Michaels. Mr. Santer couldn't be reached for comment Sunday.

In another, Phil Jones, the director of the East Anglia climate center, suggested to climate scientist Michael Mann of Penn State University that skeptics' research was unwelcome: We "will keep them out somehow -- even if we have to redefine what the peer-review literature is!" Neither man could be reached for comment Sunday.




For the full story, see:

KEITH JOHNSON. "Climate Strife Comes to Light; Emails Illustrate Anger of Scientists Who Believe Humans Are Root of Global Warming." The Wall Street Journal (Mon., NOVEMBER 23, 2009): A3.

(Note: ellipsis added.)

(Note: the printed version of the article is mostly the same as the online version, but has some differences in order and content. The part quoted above is consistent with the printed version. The passages quoted are the same in both versions, except that the paragraph on the views of John Christy appears later in the online version, and the online version omits his phrase "but rather orchestrated." [I skimmed for differences, but am not absolutely sure that I caught them all.])

(Note: the title of the online version of the article is: "Climate Emails Stoke Debate; Scientists' Leaked Correspondence Illustrates Bitter Feud over Global Warming.")





December 24, 2009

Heretics to the Religion of Global Warming



SuperFreakonomicsBK.jpg

Source of book image: online version of the WSJ review quoted and cited below.



(p. A19) Suppose for a minute--. . . --that global warming poses an imminent threat to the survival of our species. Suppose, too, that the best solution involves a helium balloon, several miles of garden hose and a harmless stream of sulfur dioxide being pumped into the upper atmosphere, all at a cost of a single F-22 fighter jet.


. . .


The hose-in-the-sky approach to global warming is the brainchild of Intellectual Ventures, a Bellevue, Wash.-based firm founded by former Microsoft Chief Technology Officer Nathan Myhrvold. The basic idea is to engineer effects similar to those of the 1991 mega-eruption of Mt. Pinatubo in the Philippines, which spewed so much sulfuric ash into the stratosphere that it cooled the earth by about one degree Fahrenheit for a couple of years.

Could it work? Mr. Myhrvold and his associates think it might, and they're a smart bunch. Also smart are University of Chicago economist Steven Levitt and writer Stephen Dubner, whose delightful "SuperFreakonomics"--the sequel to their runaway 2005 bestseller "Freakonomics"--gives Myhrvold and Co. pride of place in their lengthy chapter on global warming. Not surprisingly, global warming fanatics are experiencing a Pinatubo-like eruption of their own.


. . .


. . . , Messrs. Levitt and Dubner show every sign of being careful researchers, going so far as to send chapter drafts to their interviewees for comment prior to publication. Nor are they global warming "deniers," insofar as they acknowledge that temperatures have risen by 1.3 degrees Fahrenheit over the past century.

But when it comes to the religion of global warming--the First Commandment of which is Thou Shalt Not Call It A Religion--Messrs. Levitt and Dubner are grievous sinners. They point out that belching, flatulent cows are adding more greenhouse gases to the atmosphere than all SUVs combined. They note that sea levels will probably not rise much more than 18 inches by 2100, "less than the twice-daily tidal variation in most coastal locations." They observe that "not only is carbon plainly not poisonous, but changes in carbon-dioxide levels don't necessarily mirror human activity." They quote Mr. Myhrvold as saying that Mr. Gore's doomsday scenarios "don't have any basis in physical reality in any reasonable time frame."

More subversively, they suggest that climatologists, like everyone else, respond to incentives in a way that shapes their conclusions. "The economic reality of research funding, rather than a disinterested and uncoordinated scientific consensus, leads the [climate] models to approximately match one another." In other words, the herd-of-independent-minds phenomenon happens to scientists too and isn't the sole province of painters, politicians and news anchors.


For the full commentary, see:

BRET STEPHENS. "Freaked Out Over SuperFreakonomics; Global warming might be solved with a helium balloon and a few miles of garden hose." The Wall Street Journal (Tues., OCTOBER 27, 2009): A19.

(Note: ellipsis added.)





December 12, 2009

Fat-Tailed Distributions Seldom Used "Because the Math Was So Unwieldy"



DragonCurveCartoon2009-10-28.jpg




















Source of cartoon: online version of the WSJ article quoted and cited below.




(p. C1) Last year, a typical investment portfolio of 60% stocks and 40% bonds lost roughly a fifth of its value. Standard portfolio-construction tools assume that will happen only once every 111 years.

With once-in-a-century floods seemingly occurring every few years, financial-services firms ranging from J.P. Morgan Chase & Co. to MSCI Inc.'s MSCI Barra are concocting new ways to protect investors from such steep losses. The shift comes from increasing recognition that conventional assumptions about market behavior are off the mark, substantially underestimating risk.


. . .


(p. C9) Many of Wall Street's new tools assume market returns fall along a "fat-tailed" distribution, where, say, last year's nearly 40% stock-market decline would be more common than previously thought.

Fat-tailed distributions are nothing new. Mathematician Benoit Mandelbrot recognized their relevance to finance in the 1960s. But they were never widely used in portfolio-building tools, partly because the math was so unwieldy.
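The gap the article describes can be made concrete with a short sketch. The 8% mean and 12% annual volatility below are hypothetical illustration values (not from the article); the sketch compares how often a thin-tailed normal model and a fat-tailed Student-t model with the same mean and variance predict a 20% portfolio loss:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def t3_cdf(t):
    """CDF of Student's t with 3 degrees of freedom (exact closed form)."""
    x = t / math.sqrt(3.0)
    return 0.5 + (x / (1.0 + x * x) + math.atan(x)) / math.pi

# Hypothetical annual portfolio return: mean 8%, volatility 12%.
mu, sigma = 0.08, 0.12
loss = -0.20  # the kind of drop the article calls a once-in-a-century event

# Thin-tailed (normal) model.
p_norm = normal_cdf((loss - mu) / sigma)

# Fat-tailed model: Student-t with 3 df, rescaled to the same standard
# deviation (Var(t_3) = 3, so the scale parameter is sigma / sqrt(3)).
p_fat = t3_cdf((loss - mu) / (sigma / math.sqrt(3.0)))

print(f"normal model:     once every {1 / p_norm:.0f} years")
print(f"fat-tailed model: once every {1 / p_fat:.0f} years")
```

Under these assumed parameters the normal model treats the loss as roughly a once-a-century event, while the fat-tailed model with identical mean and variance makes it noticeably more frequent, which is the article's point about conventional tools "substantially underestimating risk."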



For the full story, see:

ELEANOR LAISE. "Some Funds Stop Grading on the Curve." The Wall Street Journal (Tues., SEPTEMBER 8, 2009): C1 & C9.

(Note: ellipsis added.)





December 7, 2009

The Real Disney and the Disney of Academic Critiques



(p. 324) Disney seems no more real in the growing body of academic critiques of the man and the company that bears his name. Many of these critiques are vaguely if not specifically Marxist in their methodology, and they display the usual Marxist tendency to bulldoze the complexities of human behavior in the pursuit of an all-embracing interpretation of Disney's life and work. What fatally cripples most academic writing about Walt Disney is simple failure to examine its supposed subject. Disney scholarship, like many other kinds of scholarship in today's academy, feeds on itself. The common tendency is for scholars to rush past the facts of Disney's life and career, frequently getting a lot of them wrong, in order to write about what really interests them, which is what other scholars have already written. It is this incestuous quality, even more than such commonly cited sins as a reliance on jargon, that makes so much academic writing, on Disney as on other subjects, claustrophobically difficult to read.



Disney has attracted other writers whose unsupportable claims and speculations sometimes win approval of scholars all too eager to believe the worst of the man. The persistent accusations of anti-Semitism are only the mildest examples of an array whose cumulative effect is to portray a Disney who was, among other vile things, racist, misogynist, imperialist, sexually warped, a spy for J. Edgar Hoover, desperate to conceal his illegitimate Spanish birth, (p. 325) and so terrified of death that he had his body cryogenically frozen. Pathologies are undoubtedly at work here, none of them Disney's.



Source:

Barrier, Michael. The Animated Man: A Life of Walt Disney. 1 ed. Berkeley, CA: University of California Press, 2007.





November 29, 2009

Walt Disney: "I Don't Care About Critics"



(p. 286) "He is shy with reporters." Edith Efron wrote for TV Guide in 1965. "His eyes are dull and preoccupied, his affability mechanical and heavy-handed. He gabs away slowly and randomly in inarticulate, Midwestern speech that would be appropriate to a rural general store. His shirt is open, his tie crooked. One almost expects to see over-all straps on his shoulders and wisps of hay in his hair. . . . If one has the patience to persist, however, tossing questions like yellow flares into the folksy fog, the fog lifts, a remote twinkle appears in the preoccupied eves, and the man emerges."

Here again, as in other interviews from the 1960s, Disney permitted himself to sound bitter and resentful when he said anything of substance: "These avant-garde artists are adolescents. It's only a little noisy element that's going that way, that's creating this sick art. . . . There is no cynicism in me and there is none allowed in our work. . . . I don't like snobs. You find some of intelligentsia, they become snobs. They think they're above everybody else. They're not. More education doesn't mean more common sense. These ideas they have about art are crazy. . . . I don't care about critics. Critics take themselves too seriously. They think the only way to be noticed and to be the smart guy is to pick and find fault with things. It's the public I'm making pictures for."




Source:

Barrier, Michael. The Animated Man: A Life of Walt Disney. 1 ed. Berkeley, CA: University of California Press, 2007.

(Note: ellipses and italics in original.)





November 20, 2009

Breakthrough Innovations Require Judgment, Not Surveys



DesignDrivenInnovationBK.jpg














Source of book image: http://press.harvardbusiness.org/on/wp-content/uploads/2009/03/verganti_300dpi.jpg.



(p. W8) In "Design-Driven Innovation" (Harvard Business Press, 272 pages, $35), Roberto Verganti holds that product development should be grounded not in the data of survey-takers or the observations of anthropologists but in the judgment of executives. "We have experienced years of hype about user-centered design," he says. But breakthrough innovations, in Mr. Verganti's view, do not represent what customers knew they wanted. Rather, the most profitable innovations are those that create a radically new meaning for a product.

Nintendo's Wii video-game console and its motion-sensing controllers "transformed what a console meant: from an immersion in a virtual world approachable only by niche experts into an active workout, in the real world, for everyone." The Swatch in 1983 introduced a new meaning to the watch: neither an article of fine jewelry nor a utilitarian timekeeping tool but a fashion accessory. Starbucks, he says, changed the meaning of a coffee shop from a place to buy coffee to a home away from home.



For the full review, see:

DAVID A. PRICE. "The Shape of Things to Come; Design is more than aesthetics and ease of use. It's a way of doing business." The Wall Street Journal (Fri., OCTOBER 9, 2009): W8.


Reference for book under review:

Verganti, Roberto. Design-Driven Innovation: Changing the Rules of Competition by Radically Innovating What Things Mean. Boston: Harvard Business Press, 2009.





November 11, 2009

"A Foolish Faith in Authority Is the Worst Enemy of Truth"



(p. A21) Several years ago I grew concerned about my postmenopausal mother's risk of osteoporosis. I tried to convince her to initiate hormone replacement therapy. She didn't listen to me. Instead, she spoke with her gynecologist, who--contrary to best medical evidence at the time--recommended against such treatment. I would eventually be thankful my mother listened to the gynecologist who had known her for decades instead of me and the published medical reviews I was relying on. Some years later my mother was diagnosed with early breast cancer. Had she been on estrogen replacement, it is likely that her tumor would have progressed more rapidly. The gynecologist likely saved my mother's life.

Studies published in the medical literature are mostly produced by academics who face an imperative to publish or watch their careers perish. These academics aren't basing their careers on their clinical skills and experiences. Paradoxically, if we allow the academic literature to set guidelines for accepted practices, we are allowing those who are often academics first and clinicians second to determine what clinical care is appropriate.

Consciously or not, those who provide the peer review for medical journals are influenced by whether the work they are reviewing will impact their standing in the medical community. This is a dilemma. The experts who serve as reviewers compete with the work they are reviewing. Leaders in every community, therefore, exert disproportional influence on what gets published. We expect reviewers to be objective and free of conflicts, but in truth, only rarely is that the case.

Albert Einstein once noted that "a foolish faith in authority is the worst enemy of truth."



For the full commentary, see:

NORBERT GLEICHER. "'Expert Panels' Won't Improve Health Care; Government reliance on medical studies will make it harder to discard false prophecies and dogmas." The Wall Street Journal (Mon., October 19, 2009): A21.

(Note: the online version of the commentary is dated Sun., Oct. 18.)





November 2, 2009

Monty Python Success Arose from Freedom, Not Plans



Pythons1969.jpg"The unusual suspects, 1969: top row from left, Graham Chapman, Eric Idle and Terry Gilliam; bottom row from left, Terry Jones, John Cleese and Michael Palin." Source of caption and photo: online version of the NYT article quoted and cited below.


(p. 24) "A lot of contemporary comedy seems self-conscious," Mr. Palin said. "It's almost documentary, like 'The Office.' That's a very funny show, but you're looking at the human condition under stress. The Pythons made the human condition seem like fun."

He added: "I'm proud to be a Python. It's a badge of silliness, which is quite important. I was the gay lumberjack, I was the Spanish Inquisition, I was one-half of the fish-slapping dance. I look at myself and think that may be the most important thing I've ever done."

Mr. Cleese and Mr. Jones, in rare agreement, both suggested that one reason the Pythons have never been successfully imitated is that television executives nowadays would never let anyone get away with putting together a show like theirs. When they began, they didn't have an idea what the show should be about or even a title for it. The BBC gave them some money, and then, Mr. Cleese joked, the executives hurried off to the bar.

"The great thing was that in the beginning we had such a low profile," he said. "We went on at different times, and some weeks we didn't go on at all, because there might be a show-jumping competition. But that was the key to our feeling of freedom. We didn't know what the viewing figures were, and we didn't care. What has happened now is the complete reverse. Even the BBC is obsessed with the numbers."

So obsessed, Bill Jones pointed out, that in the case of "Monty Python: Almost the Truth" some people encouraged the documentarians to see if they couldn't squeeze the six hours down to one.




For the full story, see:

CHARLES McGRATH. "Television; On Comedy's Flying Trapeze." The New York Times, Arts & Leisure Section (Sun., October 4, 2009): 1 & 24.

(Note: ellipses added.)

(Note: the online version of the article is dated September 30, (sic) 2009.)


PythonsPremeireSpamalot2009-10-23.jpg"Above from left, Mr. Jones, Mr. Gilliam, Mr. Cleese, Mr. Idle and Mr. Palin at the premiere of "Spamalot."" Source of caption and photo: online version of the NYT article quoted and cited above.





November 1, 2009

Picking Up Surface Nuggets Versus Digging a Deep Hole in One Place



(p. 423) The work was extraordinarily difficult, pushing the limits of the technically possible. Disappointment is my daily bread, he had said. I thrive on it. But he did not thrive. Often he thought of abandoning the work, abandoning all of it. Yet every day he continued to fill nearly every waking hour with thinking about it. Between 1934 and 1941 he published nothing. Nothing. For a scientist to go through such a dry period is more than depressing. It is a refutation of one's abilities, of one's life. But in the midst of that dry spell, Avery told a young researcher there were two types of investigators: most "go around picking up surface nuggets, and whenever they can spot a surface nugget of gold they pick it up and add it to their collection. . . . [The other type] is not really interested in the surface nugget. He is much more interested in digging a deep hole in one place, hoping to hit a vein. And of course if he strikes a vein of gold he makes a tremendous advance."



Source:

Barry, John M. The Great Influenza: The Story of the Deadliest Pandemic in History. Revised ed. New York: Penguin Books, 2005.

(Note: italics, ellipsis, and brackets, all in original.)





October 24, 2009

Rapid Mutation of RNA-Based Flu Virus Allows Rapid Adaptation to Immune System Response



I found the passage quoted below to be especially illuminating on how rapid mutation helps explain why the flu virus is so successful and dangerous. (An additional important factor is that the virus can survive in birds, without killing them.)

It occurs to me that something akin to rapid mutation (e.g., rapid experimentation) has also been advocated as a way to quickly advance science (Karl Popper), or enterprise (George Gilder).


(p. 105) Whenever an organism reproduces, its genes try to make exact copies of themselves. But sometimes mistakes--mutations--occur in this process.

This is true whether the genes belong to people, plants, or viruses. The more advanced the organism, however, the more mechanisms exist to prevent mutations. A person mutates at a much slower rate than bacteria, bacteria mutates at a much slower rate than a virus--and a DNA virus mutates at a much slower rate than an RNA virus.

DNA has a kind of built-in proofreading mechanism to cut down on copying mistakes. RNA has no proofreading mechanism whatsoever, no way to protect against mutation. So viruses that use RNA to carry their genetic information mutate much faster--from 10,000 to 1 million times faster--than any DNA virus.

Different RNA viruses mutate at different rates as well. A few mutate so rapidly that virologists consider them not so much a population of copies of the same virus as what they call a "quasi species" or a "mutant swarm."

These mutant swarms contain trillions and trillions of closely related but different viruses. Even the viruses produced from a single cell will include many different versions of themselves, and the swarm as a whole will routinely contain almost every possible permutation of its genetic code.

Most of these mutations interfere with the functioning of the virus and will either destroy the virus outright or destroy its ability to infect. But other mutations, sometimes in a single base, a single letter, in its genetic code will allow the virus to adapt rapidly to a new situation. It is this adaptability that explains why these quasi species, these mutant swarms, can move rapidly back and forth between different environments and also develop extraordinarily rapid drug resistance. As one investigator has observed, the rapid mutation "confers a certain randomness to the disease processes that accompany RNA [viral] infections."

Influenza is an RNA virus. So is HIV and the coronavirus. And of all RNA viruses, influenza and HIV are among those that mutate the fastest. The influenza virus mutates so fast that 99 percent of the 100,000 to 1 million new viruses that burst out of a cell in the reproduction process (p. 106) are too defective to infect another cell and reproduce again. But that still leaves between 1,000 and 10,000 viruses that can infect another cell.

Both influenza and HIV fit the concept of a quasi species, of a mutant swarm. In both, a drug-resistant mutation can emerge within days. And the influenza virus reproduces rapidly--far faster than HIV. Therefore it adapts rapidly as well, often too rapidly for the immune system to respond.
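The arithmetic in the passage above checks out: if 99 percent of the 100,000 to 1 million virions bursting out of a cell are too defective to infect, the surviving 1 percent is exactly the 1,000 to 10,000 range Barry cites. A minimal sketch:

```python
defective_fraction = 0.99  # share of new influenza virions too damaged to infect

for burst_size in (100_000, 1_000_000):
    viable = round(burst_size * (1 - defective_fraction))
    print(f"{burst_size:>9,} copies -> {viable:>6,} still able to infect another cell")
```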




Source:

Barry, John M. The Great Influenza: The Story of the Deadliest Pandemic in History. Revised ed. New York: Penguin Books, 2005.

(Note: italics in original.)





October 21, 2009

Small Evidence Kills Big Theory



Raptorex_Trex2009-09-27.jpg














Big Tyrannosaurus rex and much smaller Raptorex kriegsteini. Source of image: http://scienceblogs.com/notrocketscience/upload/2009/09/raptorex_tiny_king_of_thieves_shows_how_tyrannosaurus_body_p/Raptorex_Trex.jpg



(p. A5) Paleontologists said Thursday that they had discovered what amounted to a miniature prototype of Tyrannosaurus rex, complete with the oversize head, powerful jaws, long legs -- and, as every schoolchild knows, puny arms -- that were hallmarks of the king of the dinosaurs.

But this scaled-down version, which was about nine feet long and weighed only 150 pounds, lived 125 million years ago, about 35 million years before giant Tyrannosaurs roamed the earth. So the discovery calls into question theories about the evolution of T. rex, which was about five times longer and almost 100 times heavier.

"The thought was these signature Tyrannosaur features evolved as a consequence of large body size," Stephen L. Brusatte of the American Museum of National History, an author of a paper describing the dinosaur published online by the journal Science, said at a news conference. "They needed to modify their entire skeleton so they could function as a predator at such colossal size."

The new dinosaur, named Raptorex kriegsteini, "really throws a wrench into this observed pattern," Mr. Brusatte said.



For the full story, see:

HENRY FOUNTAIN. "Fossil Discovery Challenges Theories on T. Rex Evolution." The New York Times (Fri., September 18, 2009): A5.

(Note: the online version is dated Sept. 17th and has the slightly different title: "Fossil Find Challenges Theories on T. Rex" but the body of the article seems the same as the print version.)





October 20, 2009

Scientist Huxley: "The Great End of Life is Not Knowledge But Action"



John Barry calls our attention to the views of Thomas Huxley who gave the keynote address at the founding of the Johns Hopkins University:


(p. 13) A brilliant scientist, later president of the Royal Society, he advised investigators, "Sit down before a fact as a little child, be prepared to give up every preconceived notion. Follow humbly wherever and to whatever abysses nature leads, or you shall learn nothing." He also believed that learning had purpose, stating, "The great end of life is not knowledge but action."



Source:

Barry, John M. The Great Influenza: The Story of the Deadliest Pandemic in History. Revised ed. New York: Penguin Books, 2005.

(Note: from the context in Barry, I am not certain whether the Huxley quotes are from the keynote address, or from elsewhere in Huxley's writings.)





September 13, 2009

Economists "Mistook Beauty, Clad in Impressive-Looking Mathematics, for Truth"



PlanglossianEconomistsCartoon2009-09-06.jpg Source of caricatures: online version of the NYT article quoted and cited below.


Nobel Prize winner Paul Krugman is no friend of the free market, and more importantly, his manner of dealing with opponents is a long way from gracious civility.

But he is not always completely wrong:


(p. 36) Few economists saw our current crisis coming, . . .


. . .


(p. 37) As I see it, the economics profession went astray because economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth.



For the full commentary, see:

PAUL KRUGMAN. "How Did Economists Get It So Wrong?." The New York Times, Magazine Section (Sun., September 6, 2009): 36-43.

(Note: ellipses added.)


DissentingEconomistsCartoon2009-09-06.jpgThe economist on the left is probably intended to resemble Keynes, but he also bears some resemblance to Hayek. Source of caricatures: online version of the NYT article quoted and cited above.






August 25, 2009

Wikipedia Continues to Gain Respect



(p. B5) Recognizing that the online encyclopedia Wikipedia is increasingly used by the public as a news source, Google News began this month to include Wikipedia among the stable of publications it trawls to create the site.

A visit to the Google News home page on Wednesday evening, for example, found that four of the 30 or so articles summarized there had prominent links to Wikipedia articles, including ones covering the global swine flu outbreak and the Iranian election protests.


. . .


The move by Google News was news to Wikipedia itself. Jay Walsh, a spokesman for the Wikimedia Foundation, said he learned about it by reading an online item on the subject by the Nieman Journalism Lab.

"Google is recognizing that Wikipedia is becoming a source for very up-to-date information," he said, although "it is an encyclopedia at the end of the day."



For the full story, see:

NOAM COHEN. "Google Starts Including Wikipedia on Its News Site." The New York Times (Mon., June 22, 2009): B5.

(Note: ellipsis added.)





August 22, 2009

"The Evidence of His Eyes Overturned 2,000 Years of Accepted Wisdom"




GalileoShowsVenetianSenators.jpg". . ., the Italian astronomer shows the satellites of Jupiter to Venetian senators in this 1882 illustration." Source of illustration and caption: online version of the WSJ article quoted and cited below.


(p. A9) A mathematician and experimental physicist, Galileo, however, immediately recognized that what he could see of Venus, Jupiter and the moon through his telescope offered crucial evidence that the sun, not Earth, was the center of our solar system. The evidence of his eyes overturned 2,000 years of accepted wisdom about cosmology in which philosophers had conceived the night sky as a system of crystalline spheres.

Moreover, Galileo quickly shared his observations with scientists throughout Europe by openly publishing his data.

"He wrought a change so fundamental for science and for humanity," says Munich astronomer Pedro Russo, who is global coordinator of the International Year of Astronomy. "For the first time, we realized we were not the center of the universe."

But his insistence on contradicting traditional cosmology led to his arrest and trial by the Roman Catholic Church. He was forced to recant his views and imprisoned for life. The Vatican did not formally admit that Galileo was correct until 1992. Now Vatican authorities are planning a statue in his honor.

During his life, Galileo is known to have built at least 100 telescopes, mostly as ornate presentation gifts for his patrons -- the powerful Medici family of Florence. Only one is known to survive with its optics intact -- the humble device now on show at the Franklin Institute.

"We assume it was personally used by Galileo," says Paolo Galluzzi, director of the science museum in Florence, which loaned the telescope for the exhibit. "Only this one was found among his property at his death. We believe that this is one of the major tools of his work."


. . .


"Science is fundamentally about establishing truth for yourself," says Dr. Pompea in Arizona. "People can make observations, take data and establish for themselves the nature of the universe. They don't have to take it from someone else or read it in a book."

Like Galileo, "they can see it."



For the full story, see:

ROBERT LEE HOTZ. "Galileo's Discoveries, 400 Years Later, Still Open Eyes; Astronomer's Telescope, on View Outside Italy for the First Time, Helped Expand Perceptions of the Universe." The Wall Street Journal (Fri., APRIL 10, 2009): A9.

(Note: ellipsis added.)


GalileoGalilei2009-08-12.gif










"Galileo Galilei." Source of image and caption: online version of the WSJ article quoted and cited above.






August 16, 2009

Richard Langlois on Why Capitalism Needs the Entrepreneur



DynamicsOfIndustrialCapitalismBK.jpg
















Source of book image: http://www.amazon.com/Dynamics-Industrial-Cpitalism-Schumpeter-Lectures/dp/0415771676/ref=sr_11_1?ie=UTF8&qid=1204828232&sr=11-1



Schumpeter is sometimes viewed as having predicted the obsolescence of the entrepreneur, although Langlois documents that Schumpeter was always of two minds on this issue.

Langlois discusses Schumpeter's ambivalence and the broader issue of the roles of the entrepreneur and the corporation in his erudite and useful book on The Dynamics of Industrial Capitalism. He concludes that changing economic conditions will always require new industrial structures, and the entrepreneur will always be needed to get these new structures built.

(I have written a brief positive review of the book that has recently appeared online.)



Reference to Langlois' book:

Langlois, Richard N. The Dynamics of Industrial Capitalism: Schumpeter, Chandler and the New Economy. London: Routledge, 2006.


Reference to my review of Langlois' book:

Diamond, Arthur M., Jr. "Review of Richard N. Langlois, The Dynamics of Industrial Capitalism: Schumpeter, Chandler and the New Economy." EH.Net Economic History Services, Aug 6 2009. URL: http://eh.net/bookreviews/library/1442


Apparently Langlois likes my review:

http://organizationsandmarkets.com/2009/08/07/another-nanosecond-of-fame/




LangloisRichard2009-08-12.jpg




"Richard N. Langlois." Source of photo and caption: http://www.clas.uconn.edu/facultysnapshots/images/langlois.jpg






July 25, 2009

The Epistemological Implications of Wikipedia



WikipediaRevolutionBK.jpg














Source of book image: online version of the WSJ review quoted and cited below.




I think the crucial feature of Wikipedia is in its being quick (what "wiki" means in Hawaiian), rather than in its current open source model. Academic knowledge arises in a slow, vetted process. Publication depends on refereeing and revision. On Wikipedia (and the web more generally) knowledge is posted first, and corrected later.

In actual fact, Wikipedia's coverage is vast, and its accuracy is high.

I speculate that Wikipedia provides clues to developing new, faster, more efficient knowledge generating institutions.

(Chris Anderson has a nice discussion of Wikipedia in The Long Tail, starting on p. 65.)


(p. A13) Until just a couple of years ago, the largest reference work ever published was something called the Yongle Encyclopedia. A vast project consisting of thousands of volumes, it brought together the knowledge of some 2,000 scholars and was published, in China, in 1408. Roughly 600 years later, Wikipedia surpassed its size and scope with fewer than 25 employees and no official editor.

In "The Wikipedia Revolution," Andrew Lih, a new-media academic and former Wikipedia insider, tells the story of how a free, Web-based encyclopedia -- edited by its user base and overseen by a small group of dedicated volunteers -- came to be so large and so popular, to the point of overshadowing the Encyclopedia Britannica and many other classic reference works. As Mr. Lih makes clear, it wasn't Wikipedia that finished off print encyclopedias; it was the proliferation of the personal computer itself.


. . .


By 2000, both Britannica and Microsoft had subscription-based online encyclopedias. But by then Jimmy Wales, a former options trader in Chicago, was already at work on what he called "Nupedia" -- an "open source, collaborative encyclopedia, using volunteers on the Internet." Mr. Wales hoped that his project, without subscribers, would generate its revenue by selling advertising. Nupedia was not an immediate success. What turned it around was its conversion from a conventionally edited document into a wiki (Hawaiian for "fast") -- that is, a site that allowed anyone browsing it to edit its pages or contribute to its content. Wikipedia was born.

The site grew quickly. By 2003, according to Mr. Lih, "the English edition had more than 100,000 articles, putting it on par with commercial online encyclopedias. It was clear Wikipedia had joined the big leagues." Plans to sell advertising, though, fell through: The user community -- Wikipedia's core constituency -- objected to the whole idea of the site being used for commercial purposes. Thus Wikipedia came to be run as a not-for-profit foundation, funded through donations.


. . .


It is clear by the end of "The Wikipedia Revolution" that the site, for all its faults, stands as an extraordinary demonstration of the power of the open-source content model and of the supremacy of search traffic. Mr. Lih observes that when "dominant encyclopedias" were still hiding behind "paid fire walls" -- and some still are -- Wikipedia was freely available and thus easily crawled by search engines. Not surprisingly, more than half of Wikipedia's traffic comes from Google.



For the full review, see:

JEREMY PHILIPS. "Business Bookshelf; Everybody Knows Everything." Wall Street Journal (Weds., March 18, 2009): A13.

(Note: ellipses added.)


The book being reviewed, is:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.





July 1, 2009

RIP Marjorie Grene, Who Helped Polanyi with Personal Knowledge



GreneMarjorie2009-06-10.jpg











"Marjorie Grene in 2003." Source of photo and caption: online version of the NYT obituary quoted and cited below.



The NYT reported, in the obituary quoted below, that philosopher Marjorie Grene died on March 16, 2009, at the age of 98.

Although I studied philosophy at the University of Chicago, my time there did not overlap with Marjorie Grene's and I don't believe that I ever met her, or ever even heard her speak (though I did occasionally walk past her former husband David Grene, on my way to talk to Stephen Toulmin).

I am increasingly appreciating Michael Polanyi's book Personal Knowledge in which he introduced his view of what he called "tacit knowledge." In particular, I am coming to believe that tacit knowledge is very important in understanding the role and importance of the entrepreneur.

So if Marjorie Grene was crucial to Personal Knowledge, as is indicated in the obituary quoted below, then she is deserving of serious consideration, and high regard.


(p. 23) In Chicago, she had met Michael Polanyi, a distinguished physical chemist turned philosopher; she ended up helping him research and develop his important book "Personal Knowledge" (1958). The book proposed a far more nuanced, personal idea of knowledge, and directly addressed approaches to science.

"There is hardly a page that has not benefited from her criticism," Dr. Polanyi wrote in his acknowledgments. "She has a share in anything I may have achieved here."


. . .


Her sense of humor sparkled when she was asked about being the first woman to have an edition of the Library of Living Philosophers devoted to her -- Volume 29 in 2002. Previous honorees included Bertrand Russell and Einstein. "I thought they must be looking desperately for a woman," Dr. Grene said.



For the full obituary, see:

DOUGLAS MARTIN. "Marjorie Grene, a Leading Philosopher of Biology, Is Dead at 98." The New York Times, First Section (Sun., March 29, 2009): 23.

(Note: ellipsis added.)


The reference for the Polanyi book, is:

Polanyi, Michael. Personal Knowledge: Towards a Post-Critical Philosophy. Chicago: University of Chicago Press, 1958.





May 19, 2009

Bacon Died Experimenting and Hegel Died Contradicting Himself



(p. C32) The philosopher Francis Bacon, that great champion of the empirical method, died of his own philosophy: in an effort to observe the effects of refrigeration, on a freezing cold day he stuffed a chicken with snow and caught pneumonia.

As a philosopher dies, so he has lived and believed. And from the manner of his dying we can understand his thinking, or so the philosopher Simon Critchley seems to be saying in his cheekily titled "Book of Dead Philosophers."

. . .

Mr. Critchley recounts that Voltaire, after decades of denouncing the Roman Catholic Church, announced on his deathbed that he wanted to die a Catholic. But the shocked parish priest kept asking him, "Do you believe in the divinity of Christ?" Voltaire begged, "In the name of God, Monsieur, don't speak to me any more of that man and let me die in peace."

Hegel, who, Mr. Critchley says, saw philosophy as an abstraction as much as any philosopher did, moaned while dying of cholera, "Only one man ever understood me ... and he didn't understand me."




For the full review, see:

DINITIA SMITH. "Books of The Times - Dying and Death: When You Sort It Out, What's It All About, Diogenes?" The New York Times (Fri., January 30, 2009): C32.

(Note: ellipsis between paragraphs was added; ellipsis in Hegel quote was in original.)


The reference to Critchley's book is:

Critchley, Simon. The Book of Dead Philosophers. New York: Vintage Books, 2009.





May 11, 2009

More Accurate Measurements Reveal Previously Undetected Anomalies



(p. 69) This is a standard pattern in the history of science: when tools for measuring increase their precision by orders of magnitude, new paradigms often emerge, because the newfound accuracy reveals anomalies that had gone undetected. One of the crucial benefits of increasing the accuracy of scales is that it suddenly became possible to measure things that had almost no weight. Black's discovery of fixed air, and its perplexing mixture with common air, would have been impossible without the state-of-the-art scales he employed in his experiments. The whole inquiry had begun when Black heated a quantity of white magnesia, and discovered that it lost a minuscule amount of weight in the process--a difference that would have been imperceptible using older scales. The shift in weight suggested that something was escaping from the magnesia into the air. By then running comparable experiments, heating a wide array of substances, Black was able to accurately determine the weight of carbon dioxide, and consequently prove the existence of the gas. It weighs, therefore it is.


Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.





May 7, 2009

Magdeburg Sphere Let Scientists "See" the Vacuum



(p. 68) When we think of technological advances powering scientific discovery, the image that conventionally comes to mind is a specifically visual one: tools that expand the range of our vision, that let us literally see the object of study with new clarity, or peer into new levels of the very distant, the very small. Think of the impact that the telescope had on early physics, or the microscope on bacteriology. But new ways of seeing are not always crucial to discovery. The air pump didn't allow you to see the vacuum, because of course there was nothing to see: but it did allow you to see it indirectly, in the force that held the Magdeburg Sphere together despite all that horsepower.


Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.





May 1, 2009

Fraser Institute Seeks Better Measures of Policy Variables



George Gilder emphasizes that the importance of entrepreneurship to economic growth has been missed by many economists, in part because of the difficulty of measuring both the inputs of entrepreneurship (e.g., courage, persistence, creativity, etc.) and the outputs of entrepreneurship (e.g., happiness from more challenging work, greater variety of products, etc.).

Unfortunately this is not just an academic problem, because economists' policy advice is based on their models, and their models focus on what they can measure. If they can't measure entrepreneurship, then policies to encourage entrepreneurship are neglected.

Now the Fraser Institute is seeking proposals to improve the measurement of important but poorly measured policy-relevant variables. This initiative is in the spirit of the good work that the Fraser Institute has done in correlating measures of economic freedom with measures of economic growth.

I have been asked to publicize this initiative, and am pleased to do so:


Dear Art Diamond,

The Fraser Institute is launching a new contest to identify economic and public policy issues which still require proper measurement in order to facilitate meaningful analysis and public discourse. We hope you can help promote this contest by posting it on your weblog, artdiamondblog.

The Essay Contest for Excellence in the Pursuit of Measurement is an opportunity for the public to comment on an economic or public policy issue that they feel is important and deserves to be properly measured.

A top prize of $1,000 and other cash prizes can be won by identifying a vital issue that is either not being measured, or is being measured inappropriately. Acceptable entry formats include a short 500-600 word essay, or a short one-minute video essay.

Complete details and a promotional flyer are available at: http://www.fraserinstitute.org/programsandinitiatives/measurement_center.htm.

Entry deadline is Friday, May 15th, 2009.

Sponsored by the R.J. Addington Center for the Study of Measurement.

Enquiries may be directed to:

Courtenay Vermeulen
Education Programs Assistant
The Fraser Institute
Direct: 604.714.4533
courtenay.vermeulen@fraserinstitute.org



The Fraser Institute is an independent international research and educational organization with offices in Canada and the United States and active research ties with similar independent organizations in more than 70 countries around the world. Our vision is a free and prosperous world where individuals benefit from greater choice, competitive markets, and personal responsibility. Our mission is to measure, study, and communicate the impact of competitive markets and government interventions on the welfare of individuals.



An important source of Gilder's views, obliquely referred to in my comments above, is:

Gilder, George. Recapturing the Spirit of Enterprise: Updated for the 1990s. updated ed. New York: ICS Press, 1992.





April 21, 2009

An Intellectual Collaboration Beyond the Grave



There is something touchingly noble in this:

(p. 11) There is no direct evidence in the historical record, but it is entirely probable that it was the waterspout sighting that sent Priestley off on his quest to measure the temperature of the sea, trying to marshal supporting evidence for a passing conjecture his friend had made a decade before. Franklin had been dead for nearly four years, but their intellectual collaboration continued, undeterred by war, distance, even death.


Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.





April 14, 2009

Steven Johnson's The Invention of Air



InventionOfAirBK.jpg

Source of book image: http://stevenberlinjohnson.typepad.com/photos/uncategorized/2008/09/10/invention_final_81908.jpg


Steven Johnson's The Ghost Map, about the determined entrepreneurial detective work that uncovered the cause of cholera, is one of my all-time favorite books, so I am now in the mode of reading everything else that Steven Johnson has written, or will write.

The most recent book, The Invention of Air, is not as spectacular as The Ghost Map, but is well-written on a thought-provoking topic. It focuses on Joseph Priestley's role in the American Revolution. Priestley is best known as an early chemist, but Johnson paints him as a polymath whose science was of a piece with his philosophy, his politics, and his religion.

Johnson's broader point is that for many of the founding fathers, science was not a compartment of their lives, but part of the whole cloth (hey, it's my blog, so I can mix as many metaphors as I want to).

And the neat bottom line is that Priestley's method of science (and polity) is the same broadly empirical/experimental/entrepreneurial method that usually leads to truth and progress.

Along the way, Johnson makes many amusing and thought-provoking observations, such as the paragraphs devoted to his coffee-house theory of the enlightenment. (You see, coffee makes for clearer thinking than beer.)


The book:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.






February 28, 2009

Financial Crisis Is "A Coming-Out Party" for Taleb and Behavioral Economists


(p. A23) My sense is that this financial crisis is going to amount to a coming-out party for behavioral economists and others who are bringing sophisticated psychology to the realm of public policy. At least these folks have plausible explanations for why so many people could have been so gigantically wrong about the risks they were taking.

Nassim Nicholas Taleb has been deeply influenced by this stream of research. Taleb not only has an explanation for what's happening, he saw it coming. His popular books "Fooled by Randomness" and "The Black Swan" were broadsides at the risk-management models used in the financial world and beyond.

In "The Black Swan," Taleb wrote, "The government-sponsored institution Fannie Mae, when I look at its risks, seems to be sitting on a barrel of dynamite, vulnerable to the slightest hiccup." Globalization, he noted, "creates interlocking fragility." He warned that while the growth of giant banks gives the appearance of stability, in reality, it raises the risk of a systemic collapse -- "when one fails, they all fail."

Taleb believes that our brains evolved to suit a world much simpler than the one we now face. His writing is idiosyncratic, but he does touch on many of the perceptual biases that distort our thinking: our tendency to see data that confirm our prejudices more vividly than data that contradict them; our tendency to overvalue recent events when anticipating future possibilities; our tendency to spin concurring facts into a single causal narrative; our tendency to applaud our own supposed skill in circumstances when we've actually benefited from dumb luck.

And looking at the financial crisis, it is easy to see dozens of errors of perception. Traders misperceived the possibility of rare events. They got caught in social contagions and reinforced each other's risk assessments. They failed to perceive how tightly linked global networks can transform small events into big disasters.

Taleb is characteristically vituperative about the quantitative risk models, which try to model something that defies modelization. He subscribes to what he calls the tragic vision of humankind, which "believes in the existence of inherent limitations and flaws in the way we think and act and requires an acknowledgement of this fact as a basis for any individual and collective action." If recent events don't underline this worldview, nothing will.



For the full commentary, see:

DAVID BROOKS. "The Behavioral Revolution." The New York Times (Tues., October 28, 2008): A31.


The reference to Taleb's Black Swan book is:

Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable. New York: Random House, 2007.


Another review of Taleb's book is:

Diamond, Arthur M., Jr. "Review of: Taleb, Nassim Nicholas. The Black Swan." Journal of Scientific Exploration 22, no. 3 (Fall 2008): 419-422.




February 10, 2009

Leeuwenhoek's Great Discovery Was at First Rejected by the "Experts"


In the passage quoted below, Hager discusses the reception that Leeuwenhoek's first report of the "animalcules" seen under his microscope received:

(p. 42) He hired a local artist to draw what he saw and sent his findings to the greatest scientific body of the day, the Royal Society of London.

(p. 43) Van Leeuwenhoek's raising of the curtain on a new world was greeted with what might kindly be called a degree of skepticism. Three centuries later a twentieth-century wit wrote a lampoon of what the Royal Society's secretary might well have responded:

Dear Mr. Anthony van Leeuwenhoek,

Your letter of October 10th has been received here with amusement. Your account of myriad "little animals" seen swimming in rainwater, with the aid of your so-called "microscope," caused the members of the society considerable merriment when read at our most recent meeting. Your novel descriptions of the sundry anatomies and occupations of these invisible creatures led one member to imagine that your "rainwater" might have contained an ample portion of distilled spirits---imbibed by the investigator. Another member raised a glass of clear water and exclaimed, "Behold, the Africk of Leeuwenhoek." For myself, I withhold judgement as to the sobriety of your observations and the veracity of your instrument. However, a vote having been taken among the members---accompanied, I regret to inform you, by considerable giggling---it has been decided not to publish your communication in the Proceedings of this esteemed society. However, all here wish your "little animals" health, prodigality and good husbandry by their ingenious "discoverer."



The satire was not far from the truth. Although very interested in the Dutchman's discoveries, so many English scientists were doubtful about his reports that van Leeuwenhoek had to enlist an English vicar and several jurists to attest to his findings. Then Hooke himself confirmed them. All doubt was dispelled.



Source:

Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor's Heroic Search for the World's First Miracle Drug. New York: Three Rivers Press, 2007.





December 28, 2008

"Four G's Needed for Success: Geduld, Geschick, Glück, Geld"


One of Domagk's predecessors, in goal and method, was Paul Ehrlich, who was a leader in the search for the Zauberkugel (magic bullet) against disease-causing organisms. He systematized the trial-and-error method, and pursued dyes as promising chemicals that might be modified to attach themselves to the intruders. But he never quite found a magic bullet:

(p. 82) Ehrlich announced to the world that he had found a cure for sleeping sickness. But he spoke too soon. Number 418, also, proved too toxic for general use. He and his chemists resumed the search.

Ehrlich said his method consisted basically of "examining and sweating"---and his coworkers joked that Ehrlich examined while they sweated. There was another motto attributed to Ehrlich's lab, the list of "Four Gs" needed for success: Geduld, Geschick, Glück, Geld---patience, skill, luck, and money.



Source:

Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor's Heroic Search for the World's First Miracle Drug. New York: Three Rivers Press, 2007.

(Note: do not confuse the "Paul Ehrlich" discussed above, with the modern environmentalist "Paul Ehrlich" who is best known for losing his bet with Julian Simon.)




December 20, 2008

Why You Want Your Surgeon to Be a Disciple of Lister


The sources of new ideas are diverse. Sometimes, as below, even a newspaper article can provide inspiration.

The passage below also provides another example of the project-oriented entrepreneur, who is motivated by a mission to get the job done.

(p. 60) In Lister's early years, the mid-1800s, half of all amputation patients died from hospital fever; in some hospitals the rate was as high as 80 percent. Lister, like all surgeons, had little idea of how to improve the situation. Then he chanced on a newspaper article that caught his interest. It described how the residents of a local town, tired of the smell of their sewage, had begun treating it by pouring into their system something called German Creosote, a by-product of coal tar. Something in the creosote stopped the smell. Lister had heard about the work of Pasteur, and he made the same mental connection the French chemist had: The stink of sewage came from putrefaction, rotting organic matter; the stink of infected wounds also came from putrefaction; whatever stopped the putrefaction of sewage might also stop the putrefaction of infected wounds. So Lister decided to try coal-tar chemicals on his patients. And he found one that worked exceptionally well: carbolic acid, a solution of what today is called phenol.   . . .

. . .

(p. 61) Lister's insistence on stopping the transfer of bacteria in the operating room became absolute. Once when a visiting knighted physician from King's College idly poked a forefinger into a patient's incision during one of Lister's operations, Lister flung him bodily from the room.



Source:

Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor's Heroic Search for the World's First Miracle Drug. New York: Three Rivers Press, 2007.

(Note: ellipses added.)




December 16, 2008

Doctors Rejected Pasteur's Work


Whether in science, or in entrepreneurship, at the initial stages of an important new idea, the majority of experts will reject the idea. So a key for the advance of science, or for innovation in the economy, is to allow scientists and entrepreneurs to accumulate sufficient resources so that they can make informed bets based on their conjectures, and on their tacit knowledge.

A few entries ago, Hager recounted how Leeuwenhoek faced initial skepticism from the experts. In the passage below, Hager recounts how Pasteur also faced initial skepticism from the experts:

(p. 44) If bacteria could rot meat, Pasteur reasoned, they could cause diseases, and he spent years proving the point. Two major problems hindered the acceptance of his work within the medical community: First, Pasteur, regardless of his ingenuity, was a brewing chemist, not a physician, so what could he possibly know about disease? And second, his work was both incomplete and imprecise. He had inferred that bacteria caused disease, but it was impossible for him to definitively prove the point. In order to prove that a type of bacterium could cause a specific disease, precisely and to the satisfaction of the scientific world, it would be necessary to isolate that one type of bacterium for study, to create a pure culture, and then test the disease-causing abilities of this pure culture.


Source:

Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor's Heroic Search for the World's First Miracle Drug. New York: Three Rivers Press, 2007.




December 8, 2008

Amateur Leeuwenhoek Made Huge Contribution to Science


(p. 40) Antoni van Leeuwenhoek was a scientific superstar. The greats of Europe traveled from afar to see him and witness his wonders. It was (p. 41) not just the leading minds of the era---Descartes, Spinoza, Leibnitz, and Christopher Wren---but also royalty, the prince of Liechtenstein and Queen Mary, wife of William III of Orange. Peter the Great of Russia took van Leeuwenhoek for an afternoon sail on his yacht. Emperor Charles of Spain planned to visit as well but was prevented by a strong eastern storm.

It was nothing that the Dutch businessman had ever expected. He came from an unknown family, had scant education, earned no university degrees, never traveled far from Delft, and knew no language other than Dutch. At age twelve he had been apprenticed to a linen draper, learned the trade, then started his own business as a fabric merchant when he came of age, making ends meet by taking on additional work as a surveyor, wine assayer, and minor city official. He picked up a skill at lens grinding along the way, a sort of hobby he used to make magnifying glasses so he could better see the quality of fabrics he bought and sold. At some point he got hold of a copy of Micrographia, a curious and very popular book by the British scientist Robert Hooke. Filled with illustrations, Micrographia showed what Hooke had seen through a novel instrument made of two properly ground and arranged lenses, called a "microscope."  . . .   Micrographia was an international bestseller in its day. Samuel Pepys stayed up until 2:00 A.M. one night poring over it, then told his friends it was "the most ingenious book that I ever read in my life."

Van Leeuwenhoek, too, was fascinated. He tried making his own microscopes and, as it turned out, had talent as a lens grinder. His lenses were better than anyone's in Delft; better than any Hooke had access to; better, it seemed, than any in the world.  . . .  

(p. 42) Then, in the summer of 1675, he looked deep within a drop of water from a barrel outside and became the first human to see an entirely new world. In that drop he could make out a living menagerie of heretofore invisible animals darting, squirming, and spinning.



Source:

Hager, Thomas. The Demon under the Microscope: From Battlefield Hospitals to Nazi Labs, One Doctor's Heroic Search for the World's First Miracle Drug. New York: Three Rivers Press, 2007.

(Note: ellipses added.)


The example above is consistent with Baumol's hypothesis that formal education matters less in the initial stages of great discoveries (and maybe is even a hindrance).

See:

Baumol, William J. "Education for Innovation: Entrepreneurial Breakthroughs Versus Corporate Incremental Improvements." In Innovation Policy and the Economy, edited by Adam B. Jaffe, Josh Lerner and Scott Stern, 33-56. Cambridge, Mass.: MIT Press, 2005.


The example is also consistent with Terence Kealey's claim that important science can often arise as a side-effect of the pursuit of business activity.

See:

Kealey, Terence. The Economic Laws of Scientific Research. New York: St. Martin's Press, 1996.




November 30, 2008

Einstein on What Counts


"Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted."


Source:

Albert Einstein, as quoted in Koch, Charles G. The Science of Success: How Market-Based Management Built the World's Largest Private Company. Hoboken, NJ: Wiley & Sons, Inc., 2007.




October 23, 2008

Based on Past Experience, the Renaissance Was Impossible


(p. 26) Even the wisest of them were at a hopeless disadvantage, for their only guide in sorting it all out---the only guide anyone ever has---was the past, and precedents are worse than useless when facing something entirely new. They suffered another handicap. As medieval men, crippled by ten centuries of immobility, they viewed the world through distorted prisms peculiar to their age.

In all that time nothing of real consequence had either improved or declined. Except for the introduction of waterwheels in the 800s and windmills in the late 1100s, there had been no inventions of significance. No startling new ideas had appeared, no new terri-(p. 27)tories outside Europe had been explored. Everything was as it had been for as long as the oldest European could remember. The center of the Ptolemaic universe was the known world---Europe, with the Holy Land and North Africa on its fringes. The sun moved round it every day. Heaven was above the immovable earth, somewhere in the overarching sky; hell seethed far beneath their feet. Kings ruled at the pleasure of the Almighty; all others did what they were told to do. Jesus, the son of God, had been crucified and resurrected, and his reappearance was imminent, or at any rate inevitable. Every human being adored him (the Jews and the Muslims being invisible). The Church was indivisible, the afterlife a certainty; all knowledge was already known. And nothing would ever change.

The mighty storm was swiftly approaching, but Europeans were not only unaware of it; they were convinced that such a phenomenon could not exist. Shackled in ignorance, disciplined by fear, and sheathed in superstition, they trudged into the sixteenth century in the clumsy, hunched, pigeon-toed gait of rickets victims, their vacant faces, pocked by smallpox, turned blindly toward the future they thought they knew---gullible, pitiful innocents who were about to be swept up in the most powerful, incomprehensible, irresistible vortex since Alaric had led his Visigoths and Huns across the Alps, fallen on Rome, and extinguished the lamps of learning a thousand years before.



Source:

Manchester, William. A World Lit Only by Fire: The Medieval Mind and the Renaissance, Portrait of an Age. New York: Little, Brown & Co., 1993.

(Note: italics in original.)




October 21, 2008

The Current Financial Crisis Reveals a Need for Reform


As I think about the current financial crisis, I have been struck by the uncertainty among economists about what should be done. Many economists are silent. Those who speak have offered very diverse opinions. And even among those who express opinions, there is a lack of confidence in their opinions.

Milton Friedman used to say that economists will be listened to when there is a crisis, and that economists need to be ready, as Friedman himself was with his floating exchange rate proposal. (Milton, we need you again.)

I believe that one lesson from the current crisis is that we need reform---reform of economists' research priorities and methods. We should become more interested in policy relevance, history and institutions; and less interested in mathematical rigor.

We should avoid what Schumpeter called "the Ricardian Vice." (Highly stylized, aggregated models, based on unrealistic simplifying assumptions, that are then blindly applied to policy decisions in the actual, richly "thick" world---see McCloskey's essay on thick and thin methods in economics.)

We also should spend less time in studying cute, counter-intuitive results ("freakonomics"), and spend more time on the big issues.

We should be willing to suggest institutional reforms and experiments, and participate in experiments (natural and artificial) to see how they work. (Spontaneous order is nice when it happens, but entrepreneurial vision and initiative can improve the world too.)

Capitalism has produced huge gains in longevity and standards of living. Yet capitalism is in danger of being hobbled or destroyed.

Schumpeter warned of "the crumbling of the protecting walls." We should have been better prepared to rebuild and defend them.

Note: The "Ricardian Vice" phrase is from Schumpeter's History of Economic Analysis, p. 473; the "protecting walls" phrase is from Capitalism, Socialism and Democracy, p. 143.

The McCloskey essay mentioned is:

McCloskey, Deirdre. "Thick and Thin Methodologies in the History of Economic Thought." In The Popperian Legacy in Economics, 245-57. Cambridge, UK: Cambridge University Press, 1988.




October 15, 2008

Schumpeter Saw Keynes' Work as a "Striking Example" of "the Ricardian Vice"


McCraw on Schumpeter's History of Economic Analysis:

(p. 460) . . . , Schumpeter compared Keynes to David Ricardo: "His work is a striking example of what we have called above the Ricardian Vice, namely, the habit of piling a heavy load of practical conclusions upon a tenuous groundwork, which was unequal to it yet seemed in its simplicity not only attractive but also convincing. All this goes a long way though not the whole way toward answering the questions that always interest us, namely the questions what it is in a man's message that makes people listen to him, and why and how."


Source:

McCraw, Thomas K. Prophet of Innovation: Joseph Schumpeter and Creative Destruction. Cambridge, Mass.: Belknap Press, 2007.

(Note: ellipsis added.)

(Note: italics in original.)




October 11, 2008

McCraw Calls Schumpeter's History of Economic Analysis "an Epic Analytical Narrative"


McCraw on Schumpeter's History of Economic Analysis:

(p. 461) History of Economic Analysis succeeds where much economic writing of our own time fails, having sacrificed the messy humanity of its subject on the altar of mathematical rigor. Above all else, Schumpeter's History is an epic analytical narrative. It is about real human beings, moored in their own time, struggling like characters in a novel to resolve difficult problems. Sometimes the problems (p. 462) are purely intellectual. Sometimes they are issues of public policy. Often they are both. But what Schumpeter was trying to do---and in fact did---was answer the deceptively simple question he posed in the early pages of his book: to discover "how economists have come to reason as they do."


Source:

McCraw, Thomas K. Prophet of Innovation: Joseph Schumpeter and Creative Destruction. Cambridge, Mass.: Belknap Press, 2007.




September 21, 2008

Among Academic Economists Interest in Entrepreneurship is "A Quick Ticket Out of a Job"


From McCraw's discussion of Schumpeter's "legacy":

(p. 500) In the new world of academic economics, neither the Schumpeterian entrepreneur as an individual nor entrepreneurship as a phenomenon attracts much attention. For professors in economics departments at most major universities, particularly in the United States and Britain, a focus on these favorite issues of Schumpeter's has become a quick ticket out of a job. This development arose from a self-generated isolation of academic economics from history, sociology, and the other social sciences. It represented a trend that Schumpeter himself had glimpsed and lamented but that accelerated rapidly during the two generations after his death.


Source:

McCraw, Thomas K. Prophet of Innovation: Joseph Schumpeter and Creative Destruction. Cambridge, Mass.: Belknap Press, 2007.




September 5, 2008

Schumpeter's Final Thoughts on the Importance of the Individual Entrepreneur



Here is McCraw discussing and quoting Schumpeter's notes for the Walgreen Lectures that he was preparing to deliver just before he died.

(p. 475) In notes he prepared in 1949 for the prestigious Walgreen Lectures, Schumpeter headed one entire section "The Personal Element and the Element of Chance: A Principle of Indeterminateness." Here, he wrote that the time had come for economists to face a problem they had long tried to dodge:

the problem of the influence that may be exerted by exceptional individuals, a problem that has hardly ever been treated without the most blatant preconceptions. Without committing ourselves either to hero worship or to its hardly less absurd opposite, we have got to realize that, since the emergence of exceptional indi-(p. 476)viduals does not lend itself to scientific generalization, there is here an element that, together with the element of random occurrences with which it may be amalgamated, seriously limits our ability to forecast the future. That is what is meant here by "a principle of indeterminateness." To put it somewhat differently: social determinism, where it is nonoperational, is a creed like any other and entirely unscientific.


Source:

McCraw, Thomas K. Prophet of Innovation: Joseph Schumpeter and Creative Destruction. Cambridge, Mass.: Belknap Press, 2007.





July 13, 2008

"Theory" Said Gene Sequencing Technique Was "Impossible"


In the book The Genome War, James Shreeve tells the story of how the leading theorist proved the impossibility of the gene sequencing technique. It was the Venter group that gave the technique a try and proved that it could work. The story is similar to the one about theory saying that what Marconi was attempting was impossible. (See: Larson 2006.)

Rosenberg and Birdzell (1986) discuss the case in which theory had proven how solid objects fall, until Galileo's experiments proved the theorists wrong. The episode established the primacy of experiment and evidence over theory.

When governments decide, they usually do what is safe, which is to follow current theory (or, in rare cases, they pick Lysenko).

The entrepreneurial system takes advantage of the tacit individual knowledge that is out there, but not yet theoretically defensible, and allows it to percolate to success.

References:

Larson, Erik. Thunderstruck. New York: Crown, 2006.

Rosenberg, Nathan, and L.E. Birdzell, Jr. How the West Grew Rich: The Economic Transformation of the Industrial World. New York: Basic Books, 1986.

Shreeve, James. The Genome War: How Craig Venter Tried to Capture the Code of Life and Save the World. 1st ed. New York: Alfred A. Knopf, 2004.




July 1, 2008

The Method of Milton Friedman's Practice Was Better Than the Method of His Essay


The method of the Chicago School is often thought to be the method outlined in Friedman's famous essay "The Methodology of Positive Economics." It can be (and has been) persuasively argued that the methodology Friedman actually practiced was broader and more eclectic than the one advocated in his early essay.

His practice continued to exemplify a kind of empiricism, but one that included not only 'rigorous' econometrics but also economic history, case studies, and 'stylized facts.'

I believe that the method of Friedman's practice is sounder than the method of his essay. So it is unfortunate that the institute founded in Friedman's name will probably support only those who practice the formal method of the essay.

(p. B5) The University of Chicago will announce Thursday that it plans to establish a center for economics honoring the late economist Milton Friedman.

The school plans to raise an endowment of $200 million to support the Milton Friedman Institute.

. . .

. . . his approach to economics embodies what has come to be known as the Chicago School. He defined that as "an approach that insists on the empirical testing of theoretical generalizations and that rejects alike facts without theory and theory without facts."

It is that approach, and the intellectual rigor that Mr. Friedman brought to it, that the Friedman Institute is meant to advocate, rather than any ideology, says Chicago economist Gary Becker, a Nobel Prize-winning former student of Mr. Friedman's who was on the faculty committee that recommended the institute.


For the full story, see:

JUSTIN LAHART. "University Plans Institute to Honor Milton Friedman." The Wall Street Journal (Thurs., May 15, 2008): B5.

(Note: ellipses added.)


The famous Friedman method essay is:

Friedman, Milton. "The Methodology of Positive Economics." In Essays in Positive Economics, 3-43. Chicago: University of Chicago Press, 1953.




March 27, 2008

Science Would Advance Faster if Results of Failed Experiments Were Easier to Find


The passages below are from a WSJ summary of an October 2007 article in Wired:

(p. B7) Scientists shouldn't be so quick to squelch the results of failed experiments, Wired Deputy Editor Thomas Goetz says.

Even if data don't deliver hoped-for results, they still have uses. This is especially the case today, when some compelling discoveries have come from meta-studies, in which statisticians sift the data of several papers to reach conclusions. "Your dead end may be another scientist's missing link, the elusive chunk of data they needed," Mr. Goetz says. Making such data available might one day fuel advances in genetics, neuroscience and biotechnology, he contends.
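The meta-study idea in the excerpt above can be made concrete with a small sketch. The pooling rule shown is a standard inverse-variance (fixed-effect) combination, and the effect sizes and variances are invented purely for illustration; they come from no study mentioned here:

```python
# Illustrative sketch: how a meta-analysis pools several studies,
# including individually inconclusive ("failed") ones, into one
# estimate. All numbers below are made up for demonstration.

def pooled_effect(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate.

    Each study is weighted by 1/variance, so precise studies
    count more; the pooled variance shrinks as studies accumulate.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    return pooled, pooled_var

# Three hypothetical studies; the first two were too noisy to be
# conclusive on their own, yet they still sharpen the pooled estimate.
effects = [0.10, 0.05, 0.40]
variances = [0.04, 0.09, 0.02]

est, var = pooled_effect(effects, variances)
print(round(est, 3), round(var, 4))
```

This is why a "dead end" data set can still matter: even a null result carries a weight of 1/variance and tightens the combined answer.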


For the full summary, see:

"The Informed Reader; Science; Researchers' 'Dark Data' Should Be Brought to Light." The Wall Street Journal (Thurs., Oct. 4, 2007): B7.




February 22, 2008

"Sometimes It Pays to Read the Old Literature"


(p. A1) Researchers in New York believe they have solved one of the great mysteries of the flu: Why does the infection spread primarily in the winter months?

The answer, they say, has to do with the virus itself. It is more stable and stays in the air longer when air is cold and dry, the exact conditions for much of the flu season.

. . .

(p. A22) To his surprise, Dr. Palese stumbled upon a solution that appeared to be a good second best.

Reading a paper published in 1919 in the Journal of the American Medical Association on the flu epidemic at Camp Cody in New Mexico, he came upon a key passage: "It is interesting to note that very soon after the epidemic of influenza reached this camp, our laboratory guinea pigs began to die." At first, the study's authors wrote, they thought the animals had died from food poisoning. But, they continued, "a necropsy on a dead pig revealed unmistakable signs of pneumonia."

Dr. Palese bought some guinea pigs and exposed them to the flu virus. Just as the paper suggested, they got the flu and spread it among themselves. So Dr. Palese and his colleagues began their experiments.

. . .

As for Dr. Palese, he was glad he spotted the journal article that mentioned guinea pigs.

"Sometimes it pays to read the old literature," he said.

 

For the full story, see:

GINA KOLATA. "Study Shows Why the Flu Likes Winter." The New York Times (Weds., December 5, 2007): A1 & A22.

(Note:  ellipses added.)

 




February 20, 2008

Government Biologists Spend Big Bucks Protecting Wrong Fish



"Without DNA tests, the rare greenback cutthroat trout, left, and the Colorado River cutthroat fish are difficult to tell apart." Source of caption and photo: online version of the NYT article quoted and cited below.

 

(p. 26) DENVER, Oct. 13 (AP) -- State and federal biologists, who are smarting from research showing that they may have been protecting the wrong fish the past 20 years, are regrouping in their efforts to restore the rare greenback cutthroat trout to Colorado waters.

Tom Nesler, the state biologist, had hoped to see the fish removed from the endangered species list during his career. He concedes that might not happen if it turns out some of the greenback populations biologists thought they were saving are actually the similar but more common Colorado River cutthroat trout.

A three-year study led by University of Colorado researchers and published in August found that out of nine fish populations believed to be descendants of original greenbacks, five were actually Colorado River cutthroat trout.

The recovery effort was thought to be near its goal of establishing 20 self-sustaining greenback populations.

"Hey, science happens," said Mr. Nesler with a shrug as he discussed the findings.

. . .

The Colorado Division of Wildlife has spent an average of $320,000 annually for the past five years to restore the greenback. Most of the money has come from state lottery revenue; no state tax dollars have been used.

. . .

"Science is not about proof and certainty," he said, "it's about testable hypotheses."

 

For the full story, see:

THE ASSOCIATED PRESS. "After Possible 'Oops,' a Trout Rescue Project Regroups." The New York Times, First Section (Sun., October 14, 2007): 26.

(Note: ellipses added.)

 




February 15, 2008

Private Money Supports Quest for Dinosaur DNA

 

   Source of graphic: the online version of the WSJ article quoted and cited below.

 

(p. A1)  JORDAN, Mont. -- Prospecting in Montana's badlands, rock ax in hand, paleontologist Jack Horner picks up a piece of the jawbone of a dinosaur. He examines the splinter, then puts it back and moves on. It isn't the kind of bone he is looking for.

Prof. Horner is searching for something that many scientists believe no longer exists: dinosaur bones that harbor blood cells, protein and, perhaps, even DNA.

"Most people looking for dinosaurs are looking for beautiful skeletons," he says. "We are looking for information."

. . .  

Prof. Horner, a curator at the Museum of the Rockies in Bozeman, is among the world's most influential and offbeat paleontologists. He pioneered studies of dinosaur parent-(p. A12)ing behavior, species variation and bone cells. He is dyslexic, a former Special Forces operative of the Vietnam War era, a MacArthur Foundation "genius" fellow, and a chaired professor of Montana State University who never finished a formal college degree.

"The lenses that people normally use to look at stuff are broken in Jack," says Mary Schweitzer, an assistant professor of paleontology at North Carolina State University, who has worked with him for years. "That's what makes Jack such a good scientist. Every now and then, every field should get a renegade weirdo in it who challenges assumptions."

. . .  

"The chances of finding any [dinosaur] DNA are pretty low," Prof. Horner acknowledges. "I am still hopeful."

In a field mostly outside the mainstream of federal research funding, Prof. Horner has a knack for attracting private grants. Star Wars producer George Lucas, Qualcomm co-founder Klein Gilhousen and Wade Dokken, a developer of Montana real estate, have contributed toward his research, the university says. Nathan Myhrvold, formerly chief technology officer at Microsoft Corp. and co-founder of Intellectual Ventures LLC, is helping to underwrite this season's fieldwork.

This summer, in Montana's Hell Creek Formation, Prof. Horner is searching the last landscape inhabited by dinosaurs. More than 65 million years ago, this plain was a wetland where herds of horned Triceratops watered. Today, it is an arid outwash of boulders, cactus and sage. The red and gray soil is littered with white shards of petrified wood that ring like bone china when tapped together and countless crumbs of dinosaur bone.

. . .

"As long as you are not bound by preconceived ideas of what you can find," Prof. Horner says, "there are an awful lot of things you can discover."

 

For the full story, see:

ROBERT LEE HOTZ. "Dinosaur Hunter Seeks More Than Just Bare Bones; Prof. Horner Searches For Traces of Blood, DNA; Lucky Break From T. Rex."  The Wall Street Journal  (Fri., August 24, 2007):  A1 & A12.

(Note:  ellipses added.)

  

     At top, Prof. Horner; at bottom: "Sarah Keenan, 21, an undergraduate at the University of St. Andrews in Scotland who is working this summer for Prof. Horner, covers the fossilized triceratops frill in a protective jacket of plaster."  Source of caption and photos: the online version of the WSJ article quoted and cited above.

 




January 29, 2008

Marconi Matters

 

    Source of book image:  http://palmaddict.typepad.com/photos/uncategorized/big_larsonthunderstruckdrm_1.jpg

 

Larson's book plays a murder mystery off against the story of Marconi, the innovator who brought us communication through the air.

I'm most enthused about the Marconi part.  It shows how he proceeded against the theorists of the day, whose theories told them that what he was trying to do was impossible.  He was more entrepreneur than scientist.  And it turned out that it was a good thing that the theoretical scientists did not rule, as they might if all decisions about technology were made by the government.

What happened here is an example of what Taleb would call a Black Swan.

 

Source:

Larson, Erik. Thunderstruck. New York: Crown, 2006.

 







November 29, 2007

Let the Evidence Decide if the Mapinguary is Myth or Real

 

   A statue of the mapinguary in Rio Branco, Brazil.  Source of the photo:  online version of the NYT article quoted and cited below. 

 

RIO BRANCO, Brazil — Perhaps it is nothing more than a legend, as skeptics say. Or maybe it is real, as those who claim to have seen it avow. But the mere mention of the mapinguary, the giant slothlike monster of the Amazon, is enough to send shivers down the spines of almost all who dwell in the world’s largest rain forest.

The folklore here is full of tales of encounters with the creature, and nearly every Indian tribe in the Amazon, including those that have had no contact with one another, have a word for the mapinguary (pronounced ma-ping-wahr-EE). The name is usually translated as “the roaring animal” or “the fetid beast.”

. . .  

The giant ground sloth, Megatherium, was once one of the largest mammals to walk the earth, bigger than a modern elephant. Fossil evidence is abundant and widespread, found as far south as Chile and as far north as Florida. But the trail stops cold thousands of years ago.

“When you travel in the Amazon, you are constantly hearing about this animal, especially when you are in contact with indigenous peoples,” said Peter Toledo, an expert on sloths at the Goeldi Institute. “But convincing scientific proof, in the form of even vestiges of bones, blood or excrement, is always lacking.”

Glenn Shepard Jr., an American ethnobiologist and anthropologist based in Manaus, said he was among the skeptics until 1997, when he was doing research about local wildlife among the Machiguenga people of the far western Amazon, in Peru. Tribal members all mentioned a fearsome slothlike creature that inhabited a hilly, forested area in their territory.

Dr. Shepard said “the clincher that really blew me away” came when a member of the tribe remarked matter of factly that he had also seen a mapinguary at the natural history museum in Lima. Dr. Shepard checked; the museum has a diorama with a model of the giant prehistoric ground sloth.

“At the very least, what we have here is an ancient remembrance of a giant sloth, like those found in Chile recently, that humans have come into contact with,” he said. “Let me put it this way: Just because we know that mermaids and sirens are myths doesn’t mean that manatees don’t exist.”

Even so, the mystery of the mapinguary is likely to continue, as is the search.

“There’s still an awful lot of room out there for a large sloth to be roaming around,” Dr. Shepard said.

 

For the full story, see: 

LARRY ROHTER.  "A Huge Amazon Monster Is Only a Myth. Or Is It?"  The New York Times, Section 1  (Sun., July 8, 2007):  3. 

(Note:  ellipsis added.)

 

Some scientists believe that the mapinguary may be based on actual sightings of a real creature related to the Megatherium, two of which are depicted above.  Source of the photo:  online version of the NYT article cited above. 

Source of the map:  online version of the NYT article cited above. 

 




October 27, 2007

Academic Entrepreneurs in a Toxic Wasteland

 

   The Berkeley Pit was once a copper mine, and now holds a lake of toxic waste.  Source of photo:  online version of the NYT article quoted and cited below.

 

Here are a few paragraphs from a fascinating story about a couple of people who seem to be practicing what Taleb is preaching in The Black Swan:

 

BUTTE, Mont. — Death sits on the east side of this city, a 40-billion-gallon pit filled with corrosive water the color of a scab. On the opposite side sits the small laboratory of Don and Andrea Stierle, whose stacks of plastic Petri dishes are smeared with organisms pulled from the pit. Early tests indicate that some of those organisms may help produce the next generation of cancer drugs.

From death’s soup, the Stierles hope to coax life.

“I love the idea of looking at toxic waste and finding something of value,” said Ms. Stierle, 52, a chemistry researcher at Montana Tech of the University of Montana.

For decades, scientists assumed that nothing could live in the Berkeley Pit, a hole 1,780 feet deep and a mile and a half wide that was one of the world’s largest copper mines until 1982, when the Atlantic Richfield Company suspended work there. The pit filled with water that turned as acidic as vinegar, laced with high concentrations of arsenic, aluminum, cadmium and zinc.

. . .

Mr. Stierle is a tenured professor at Montana Tech, but his wife gets paid only for teaching an occasional class or if there is a grant to finance her research. From 1996 to 2001 they applied for dozens of grants, but received only rejection letters. So they financed their own research, using personal savings and $12,000 in annual patent royalty payments. In 2001, they won a six-year, $800,000 grant from the United States Geological Survey.

“Their work is considered a very high-risk approach,” said Matthew D. Kane, a program director at the National Science Foundation. “It takes a long time to get funding, and some luck to find active compounds.”

Unlike scientists at large research universities, who commonly teach only one class a year and employ graduate students to run their laboratories, Mr. Stierle teaches four classes each semester at a college with 2,000 undergraduates and no major research presence.

. . .

The couple said they were negotiating privately with a pharmaceutical company to test some of the compounds they have discovered and possibly turn them into drugs. As they wait, they open another Mason jar filled with murky pit water, draw a sample and return to work.

“The pit very easily could have been a complete waste of time,” Mr. Stierle said. “We just had luck and worked our butts off. We take that first walk into the dark.”

 

For the full story, see:

CHRISTOPHER MAAG.  "In the Battle Against Cancer, Researchers Find Hope in a Toxic Wasteland."   The New York Times  (Tues., October 9, 2007):  A21.

(Note:  ellipses added.)

 

In the photo immediately above, Don and Andrea Stierle work in their lab.  The map to the left shows the location of the Berkeley Pit.  Source of the photo and map:  online version of the NYT article quoted and cited above.

 




October 7, 2007

Thales of Miletus Lives

 

   Source of book image:  http://store.43folders.com/books-3-1400063515-The_Black_Swan_The_Impact_of_the_Highly_Improbable

 

This is part entertaining rant and part serious epistemology.  I've finished 9 of 19 chapters so far--almost all of my reading time spent smiling. 

Historians of Greek philosophy used to tell a story about one of the first philosophers, Thales of Miletus: watching the stars one night, he fell into a well.  The citizens of Miletus made fun of him as an impractical philosopher.  To prove them wrong, he used his knowledge to corner the market in something, and made a fortune. 

Not a very plausible story, but appealing to us philosophers.  (Like Thales, we like to think we could all be rich if we didn't have higher goals.)

Well, apparently Taleb is the real Thales.  He wanted to be a philosopher, got rich on Wall Street using his epistemological insights, and is now using his wealth to finance his musings on whatever he cares to muse on.

Beautiful!

 

Here's an amusing sentence that broadened my grin.  (It was even more amusing, and profound, in context, but I don't have time to type in the context for you.)

(p. 87)  If you are a researcher, you will have to publish inconsequential articles in "prestigious" publications so that others say hello to you once in a while when you run into them at conferences.

 

Reference for the book:

Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable. New York: Random House, 2007.

 




October 5, 2007

Sherwin Rosen Approves a Positive Review of Rosenberg's Book

 

Another of Sherwin Rosen’s minor acts of methodological delinquency, this time in his role as co-editor of the Journal of Political Economy, was his asking me to review Alexander Rosenberg’s criticism of the economics profession for drifting further and further into irrelevant mathematical puzzle-solving.  My review was basically favorable to Rosenberg, and my memory is that Rosen was quite content with my review.

 

The reference to my review is:

"Review of:  Alexander Rosenberg's Economics--Mathematical Politics or Science of Diminishing Returns?"  Journal of Political Economy 104, no. 3 (June 1996):  655-659.

 




October 3, 2007

When Sherwin Rosen Stunned the Fifth World Congress of the Econometric Society

         

I remember hearing Sherwin Rosen speak to a good-sized auditorium of technical economists at a plenary session of the Fifth World Congress of the Econometric Society in 1985.[i]  Rosen was proceeding in his typically bemused style, when he suggested to the stunned audience that they might benefit from re-reading Alfred Marshall.  I remember him saying that there are some things in Marshall that we don’t talk about any more, but that we should still talk about.  I specifically remember him mentioning that Marshall had said that the success of the institution of contract depended on the correct expectation of a certain level of ethical behavior among the participants in the economy.  I wish I could remember the specifics better, but Rosen’s auditors were visibly dismayed, and I supposed that they were thinking something like:  ‘Read Marshall?  Here was the sad sight of a once proud theoretician, going soft and senile.’



[i] I remember a large auditorium-like venue for the session, but could not remember any other session details, so I dug out a copy of the program for the meetings.  The only plenary session participation listed for Rosen was his serving as a discussant for Oliver Hart and Bengt Holmstrom’s “Theory of Contracts” paper, which was delivered on August 19, 1985 (see:  “Program . . .,” 1986, p. 471).  In searching through Rosen’s publications, I cannot find any evidence that his comment was ever published.  In an email response (email dated Nov. 19, 2006) to my inquiry, Bengt Holmstrom has replied:  “I know that his comment was not published.”

"Program of the Fifth World Congress of the Econometric Society.”  Econometrica 54, no. 2 (March 1986):  459-505.

 




September 8, 2007

James Buchanan Convinced Harry Johnson to Over-Rule the Referees

 

James Buchanan (center) flanked by economics graduate students at the closing dinner of the 2007 Summer Institute for the Preservation of the History of Economics, held at George Mason University.  (Source of photo:  me.)

 

On the evening of Thursday, June 7, 2007, Nobel-prize-winner James Buchanan joined participants for their final dinner-gathering at George Mason's Summer Institute for the Preservation of the History of Economics.

As a young economist, you are advised that it is never productive to dispute the decisions of journal editors. 

But Buchanan, in conversation during the dinner, mentioned that he had only once disputed a journal rejection--of an article submitted to the JPE during Harry Johnson's editorship.  Buchanan said that he wrote Johnson a letter explaining that the referees had totally misunderstood his paper; Johnson read the paper himself, agreed with Buchanan, and published it.

This tells us something about Buchanan, but also something about Johnson.  I never took a class from Johnson, but had a conversation with him at a party or reception once, in the last year or two before his death.  All I remember about the conversation was that he was polite and respectful to an unknown graduate student, and that somehow we got onto the topic of car advertising.

I remember hearing that sometimes Harry Johnson whittled on wood carving projects during committee meetings.  Someone asked him why he did that, and he is reputed to have responded that at the end of the meeting he liked to feel as though something had been accomplished.

 




September 4, 2007

Astronauts (and the Rest of Us) Would Benefit from More Unscripted Time

 

Noctilucent clouds.  Source of photo:  http://apod.nasa.gov/apod/image/9907/noctilucent_pp.jpg

 

The excerpt below is from a WSJ summary of an article from the  June 2007 issue of Seed.

 

Many other scientific discoveries have come from astronauts puzzling over strange sights around them. Most of what is known about so-called noctilucent clouds -- thin, beautiful wisps that hover at the edge of the Earth's atmosphere -- comes from 30 years of astronauts sketching and trying to photograph them in their spare time. The strength of a certain type of cosmic ray was first recognized in 1969 when Buzz Aldrin asked fellow astronauts if they, like him, were seeing occasional streaks of light when their eyes were closed.

Such discoveries off the beaten path of the National Aeronautics and Space Administration's research agenda prompt the question attributed to an Apollo program geologist: "If human beings can do much better science than robots, why does NASA make its astronauts do science like robots?"

 

For the full summary, see: 

"Informed Reader; SCIENCE; Why Astronauts Need Down Time in Space."  The Wall Street Journal (Weds., May 9, 2007):  B15.

 




August 10, 2007

The Courage of Milton Friedman

 

The following two paragraphs are from a paper I am currently working on.

 

Milton Friedman wrote a Newsweek column many years ago that caused a firestorm of anger among his colleagues in the economics profession. Friedman’s argument was that, in general, the government is not going to do a good job of identifying the best and most productively innovative economists. In particular, he argued that economics funding by the National Science Foundation (NSF) had made the economics profession more mathematical than was appropriate.

Even his ‘Chicago’ colleagues, who were otherwise inclined to be sympathetic to his work, were appalled: Robert Lucas wrote against Friedman in the New York Times, and Zvi Griliches spoke against him before Congress. 

 

Not too long after Friedman’s article came out, I praised it during one of the sessions of a Liberty Fund colloquium held in California.  After the session, a very distinguished economist came up to me, and started talking about the Friedman article in a very irritated and animated manner.  He said that what Friedman wrote in the article might be true, but that he shouldn’t have written it in a public forum.[i]  He said that within the NSF, the physicists have always been opposed to funding economics, and that Friedman’s article gave the physicists just the ammunition they needed.  I remember distinctly that after this conversation, the distinguished economist got into his very large and very expensive car and drove off.  To the cynical, it may also be worth mentioning that this economist had received very substantial funding from the NSF.

I also remember mentioning to George Stigler my disappointment that Lucas had written contra Friedman, and Stigler gave me his cynical smile, and said that I should have expected that Lucas, and the rest of the profession, would defend NSF funding.


[i] Most of the conversation I remember in broad terms, but specifically, I remember he said something very close to:  ‘Friedman shouldn’t air the profession’s dirty laundry in public.’

 

The reference for the Friedman article is: 

Friedman, Milton.  "An Open Letter on Grants."  Newsweek, May 18, 1981, 99.




August 3, 2007

Beebe's "Colleagues Reacted Coolly"

 

    Photos of strange deep sea creatures.  Source of photos:  online version of the NYT article cited below.

 

When, more than 70 years ago, William Beebe became the first scientist to descend into the abyss, he described a world of twinkling lights, silvery eels, throbbing jellyfish, living strings as “lovely as the finest lace” and lanky monsters with needlelike teeth.

“It was stranger than any imagination could have conceived,” he wrote in “Half Mile Down” (Harcourt Brace, 1934). “I would focus on some one creature and just as its outlines began to be distinct on my retina, some brilliant, animated comet or constellation would rush across the small arc of my submarine heaven and every sense would be distracted, and my eyes would involuntarily shift to this new wonder.”

Beebe sketched some of the creatures, because no camera of the day was able to withstand the rigors of the deep and record the nuances of this cornucopia of astonishments.

Colleagues reacted coolly. Some accused Beebe of exaggeration. One reviewer suggested that his heavy breathing had fogged the window of the submarine vessel, distorting the undersea views.

Today, the revolution in lights, cameras, electronics and digital photography is revealing a world that is even stranger than the one that Beebe struggled to describe.

The images arrayed here come from “The Deep: The Extraordinary Creatures of the Abyss” (University of Chicago Press, 2007), by Claire Nouvian, a French journalist and film director.

. . .

Beebe, who ran the tropical research department at the New York Zoological Society, surely had intimations of what lay beyond the oceanic door he had opened. “The Deep” brings much of that dark landscape to light, even while noting that a vast majority of the planet’s largest habitat remains unexamined, awaiting a new generation of explorers. 

 

For the full story, see: 

WILLIAM J. BROAD.  "Mysteries to Behold in the Dark Down Deep: Seadevils and Species Unknown."  The New York Times  (Tues.,  May 22, 2007):  D3.

(Note:  ellipsis added.)

 

    "A Ping-Pong tree sponge."  Source of caption and photo:  online version of the NYT article cited above.

 




July 29, 2007

A Public Choice Theory of "Taxonomic Inflation"

 

The excerpt below is from a WSJ summary of an article that appeared in The Economist on May 19, 2007.

 

Scientists have taken to upgrading animals once thought to be subspecies into full-fledged species, in what the Economist says is an overzealous attempt to boost conservation of seemingly rare animals.

Sometimes, the reclassification of animals into their own species category is warranted, as new research reveals once-obscured markers that differentiate certain beasts. But lately, the weekly says, primatologists have been suffering from "taxonomic inflation."

. . .

. . .   One reason is that by fragmenting animal groups, the number of rare species increases, boosting animal-conservation claims.  At the same time, having a greater number of species boosts the chances that a habitat can pursue a legal designation as a protected area.

 

For the full summary, see: 

"Informed Reader; NATURE; Species Inflation May Infect Over-Eager Conservationists."  The Wall Street Journal  (Sat., May 19, 2007):  A6.

(Note:  ellipsis added.)

 




July 27, 2007

A Salute to Underappreciated Amateur Historians

 

On a bright Saturday afternoon earlier this month, 30 or so of us gathered to give James O. Hall the send-off he deserved. Appropriately enough, the memorial service was held in the James O. Hall Research Center of the Surratt House museum, in Clinton, Md., 12 miles south of Washington. Mr. Hall died in February at the age of 95, leaving no immediate survivors. The 30 who showed up were instead neighbors, friends, a pair of nieces and random hangers-on who, like me, had known him only slightly but who honored him as a giant in a long and noble and underappreciated line.

I don't think there's a good word for what Mr. Hall did: "researcher" is too dry, "historical investigator" carries hints of melodrama, and "archivist" suggests a dutiful drudge, which Mr. Hall was not. "Amateur historian" probably fits best, though it sounds vaguely derivative and second-tier. Following a career with the Labor Department -- he retired in the early 1970s -- Mr. Hall turned himself into the world's foremost authority on the assassination of Abraham Lincoln. Historians, pros and amateurs alike, sought him out for his knowledge and access to his exhaustive files. As one of them put it, James O. Hall knew more about Lincoln's murder than anyone who ever lived, including John Wilkes Booth.

. . .

"I had to teach myself genealogy," he said. "Not because I liked genealogy, but because it's how you find things that have been lost." Over the years, he tried to trace the descendants of everyone even remotely tied to the assassination. When he found a new great-granddaughter or the grandson of a nephew, he politely peppered that person with letters and phone calls, asking the descendant to rummage through attics -- or offering, even better, to do it himself. His industry never flagged, and it led him to some of his greatest discoveries. In a dusty cubby in a forgotten archive, Mr. Hall made one of the major Lincoln finds of the past 50 years: a letter of self-justification Booth wrote the morning of the murder.

Typically, in 1977, Mr. Hall chose to publish this astonishing find in the Lincoln Log, a newsletter for buffs. Its circulation was minuscule compared with the slick magazines -- National Geographic or American Heritage -- that would have loved to showcase such a find and maybe make its discoverer famous. But Mr. Hall was without professional vanity; that's what it means to be an amateur, after all.

At the end of his life, Mr. Hall treated his vast archives with the same modesty and discretion. At least two well-endowed universities made a play for the contents of his file cabinets. Instead, he gave them to the small, homespun Surratt House museum, once the country home of the Lincoln conspirator Mary Surratt and a favorite gathering place for buffs. With a single stroke, he transformed the museum into the Alexandrian library of assassination studies. It was a gesture of confidence and fellow feeling, made to all amateur historians from the best of their kind.

 

For the full commentary, see: 

ANDREW FERGUSON.  "TASTE; A History Hobby."  The Wall Street Journal  (Fri., May 25, 2007):  W13.

(Note:  ellipsis added.)

 




July 3, 2007

Neglect of the Important Issues Is the Opportunity Cost of Pursuing the Cutely Clever

 

The Wall Street Journal summarizes an April 2, 2007 article by Noam Scheiber in The New Republic:

 

A new generation of economists has become so addicted to cleverness that dull but genuinely useful research is under threat.

"Freakonomics," the 2005 best seller that sought to explain the mysteries of everyday life through economics, is only partly to blame, writes Noam Scheiber. The deeper roots lie in a 1980s crisis of faith over economists' ability to reliably crunch numbers. Influential economist H. Gregg Lewis kicked it off by demonstrating that a host of broad, worthwhile empirical surveys of unions' impact on wages came to opposite conclusions, mostly thanks to the differing original assumptions by the studies' authors.

As a result, some economists retrenched, opting to focus on finding "solid answers to modest questions."

 

For the full summary, see:

"Informed Reader; Economics; How 'Freakonomics' Quashes Real Debates." The Wall Street Journal (Weds., March 28, 2007):  B11.

 




May 26, 2007

"A Triumph of Engaged Amateurism"

 

Steven Johnson has a great passage on the contribution of the amateur in The Ghost Map story (see below).

This is a theme that resonates.  In The Long Tail, Chris Anderson makes the case for amateurs in astronomy.  (Is it he who points out that the root of the word "amateur" is "to love"?)

Stigler had an important early paper in which he discusses the professionalization of the economics profession.  He praises the results, but what he presents provides some grist for the mill of criticism too.  For example, the exit of the amateurs reduced the applicability of the work, and turned research more toward internal puzzle-solving and model-building.

The web is a leveler in science, as suggested in an NBER paper.  Maybe the result will be a resurgence of amateurism, and maybe that won't be all bad.

 

(p. 202)  But Broad Street should be understood not just as the triumph of rogue science, but also, and just as important, as the triumph of a certain mode of engaged amateurism.  Snow himself was a kind of amateur.  He had no institutional role where cholera was concerned; his interest in the disease was closer to a hobby than a true vocation.  But Whitehead was an amateur par excellence.  He had no medical training, no background in public health.  His only credentials for solving the mystery behind London's most devastating outbreak of disease were his open and probing mind and his intimate knowledge of the community. 

 

Source:

Johnson, Steven. The Ghost Map: The Story of London's Most Terrifying Epidemic - and How It Changed Science, Cities, and the Modern World. New York: Riverhead Books, 2006.

 

(Note:  A probably relevant, much-praised book that I have never gotten around to reading is Martin J.S. Rudwick's The Great Devonian Controversy:  The Shaping of Scientific Knowledge among Gentlemanly Specialists.)

 




May 5, 2007

George Stigler on Astrology

 

I remember George Stigler saying (I think in conversation with me, but maybe as an aside in a lecture), that at first he had been inclined to reject a paper for the JPE that empirically tested astrology.  His reason was that, while he liked what the authors were doing, he was not sure that what they were doing most appropriately belonged in an economics journal.

But when the authors convinced him that they would not be able to publish it anywhere else, he changed his mind and published it.

The episode tells us something about Stigler, and something about scientific method.  About Stigler, the episode provides strong evidence of the weight that he placed on evidence relative to theory.

On scientific method, the episode can be taken as an illustration of Karl Popper's distinction between the context of discovery and the context of justification.  There are many sources of hypotheses (the context of discovery).  Famously, Kepler's inspiration for the elliptical paths of planets had a somewhat mystical source in the "perfect" solids.  But what makes a theory "scientific" is not its context of discovery, but its context of justification, viz., is the theory subjected to empirical test?

Hypotheses are not ruled out of science ab initio, but by being inconsistent with the evidence.  I agree with this view, which explains why I am a (passive) member of the Society for Scientific Exploration, which is devoted to the empirical testing of politically incorrect theories (such as UFOs, the Shroud of Turin, ESP, the Loch Ness Monster, etc.)

 

The JPE astrology article, accepted by Stigler as editor, was: 

Bennett, James T., and James R. Barth. "Astronomics: A New Approach to Economics?" Journal of Political Economy 81, no. 6 (Nov.-Dec. 1973): 1473-75.

 

A more recent empirical test of an astrological hypothesis is:

Wong, Ka-Fu, and Linda Yung. "Do Dragons Have Better Fate?" Economic Inquiry 43, no. 3 (July 2005): 689-97.

 

For more on Popper's views of science, consult:

Popper, Karl R. The Logic of Scientific Discovery. New York: Basic Books, 1959.

 




May 4, 2007

Venturing Into the Mean Streets to Get the Job Done

 

(p. 108)  It is worth pausing for a second to reflect on Snow's willingness to pursue his investigation this far.  Here we have a man who had reached the very pinnacle of Victorian medical practice---attending on the queen of England with a procedure that he himself had pioneered---who was nonetheless willing to spend every spare moment away from his practice knocking on hundreds of doors in some of London's most dangerous neighborhoods, seeking out specifically those houses that had been attacked by the most dread disease of the age.  But without that tenacity, without that fearlessness, without that readiness to leave behind the safety of professional success and royal patronage, and venture into the streets, his "grand experiment"---as Snow came to call it---would have gone nowhere.   The miasma theory would have remained unchallenged, . . .

 

Source: 

Johnson, Steven. The Ghost Map: The Story of London's Most Terrifying Epidemic - and How It Changed Science, Cities, and the Modern World. New York: Riverhead Books, 2006.

 




April 29, 2007

"Under the Spell of a Theory"

 

Johnson's wonderful book is part mystery, part history, part philosophy of science, and part musing on political philosophy.  The passage below warms the heart (and stimulates the brain) of the libertarian.  Against great odds, Dr. Snow persisted in presenting ever-more convincing evidence for his correct water-borne theory of cholera.  Meanwhile Chadwick, the main advocate of government public health activities, continued to direct policy on the basis of the mistaken theory that cholera was spread by foul vapors in the air. 

 

(p. 120) Herein lies the dominant irony of the state of British public health in the late 1840s.  Just as Snow was concocting his theory of cholera as a waterborne agent that had to be ingested to do harm, Chadwick was building an elaborate scheme that would deliver the cholera bacteria directly to the mouths of Londoners.  (A modern bioterrorist couldn't have come up with a more ingenious and far-reaching scheme.)  Sure enough, the cholera returned with a vengeance in 1848-1849, the rising death toll neatly following the Sewer Commission's cheerful data on the growing supply of waste deposited in the river.  By the end of the outbreak, nearly 15,000 Londoners would be dead.  The first defining act of a modern, centralized public-health authority was to poison an entire urban population.  (There is some precedent to Chadwick's folly, however.  During the plague years of 1665-1666, popular lore had it that the disease was being spread by dogs and cats.  The Lord Mayor promptly called for a mass extermination of the city's entire population of pets and strays, which was dutifully carried out by his minions.  Of course, the plague turned out to be (p. 121) transmitted via the rats, whose numbers grew exponentially after the sudden, state-sponsored demise of their only predators.)

Why would the authorities go to such lengths to destroy the Thames?  All the members of these various commissions were fully aware that the waste being flushed into the river was having disastrous effects on the quality of the water.  And they were equally aware that a significant percentage of the population was drinking the water.  Even without a waterborne theory of cholera's origin, it seems like madness to celebrate the ever-increasing tonnage of human excrement being flushed into the water supply.  And, indeed, it was a kind of madness, the madness that comes from being under the spell of a Theory.  If all smell was disease, if London's health crisis was entirely attributable to contaminated air, then any effort to rid the houses and streets of miasmatic vapors was worth the cost, even if it meant turning the Thames into a river of sewage.

 

Source: 

Johnson, Steven. The Ghost Map: The Story of London's Most Terrifying Epidemic - and How It Changed Science, Cities, and the Modern World. New York: Riverhead Books, 2006.

 




February 23, 2007

New Book on Wiki (Quick) Process

   Source of book image:  http://ec2.images-amazon.com/images/P/1591841380.01._SS500_SCLZZZZZZZ_V37439749_.jpg

 

A new book is out on the wiki ("quick") phenomenon.  Chris Anderson has some stimulating comments on this phenomenon in his The Long Tail.  The Wikinomics book appears to be less profound, but may still be of interest.  (It appears to be a quick-read, management guru-jargon type book.)

The wiki issue that interests me is how wiki collaboration processes might substitute for rigorous editing and peer-review, as a way to get a lot of high-quality information out there fast.  (This is what Anderson claims, and the more I use the Wikipedia, the more plausible I find the claim.)

 

The reference to the book is:

Tapscott, Don, and Anthony D. Williams. Wikinomics: How Mass Collaboration Changes Everything. Portfolio, 2006.

 




January 29, 2007

Empirical Science at Its Best


   Source of book image:  http://images.barnesandnoble.com/images/11460000/11468284.jpg

 

I have not yet read The Ghost Map, but from the review excerpted below, it sounds like a wonderful book.  One lesson from the book appears to be that much good can come from a careful collection of evidence, and that much harm can come from sticking to a theory in spite of the evidence.  It is also interesting that in this tale, the villain turns out to be the advocate of public works, whose good intentions resulted in much death and suffering. 

 

(p. P8) The sociology of error is a wonderful subject. Some university ought to endow a chair in it -- and then make Steven Johnson the first professor. Mr. Johnson last provoked the public with his counterintuitive polemic "Everything Bad Is Good For You," in which he argued that TV and videogames actually improve our cognitive skills. In "The Ghost Map" he tells the story of how for 30 years and more the medical establishment in Victorian London refused to accept what was staring them in the face, namely that cholera was a waterborne disease.

Thousands of Londoners died while doctors and public-health officials stubbornly clung to the view that the plague was an airborne miasma that hung in the foul atmosphere of the slums and was inhaled by the wretched creatures who lived there. Every kind of cure was proposed: opium, linseed oil and hot compresses, smoke, castor oil, brandy -- everything but the simple, obvious remedy of rehydration, which reduces the otherwise fatal disease to a bad case of diarrhea.

The fact that the cholera toxin tricks the cells in the lining of the colon into expelling water at a terrifying rate (victims have been known to lose 30% of their body weight in a matter of hours) should surely have alerted someone to the possibility that putting this Niagara back into the body might be worth trying. Only one doctor, Thomas Latta, hit upon the answer, in 1832, just a few months after the first outbreak ever in Britain. His mistake was not to inject enough salty water, and his lone initiative was soon overwhelmed by the brainless babble of the quacks.

Chief among the villains of Mr. Johnson's unputdownable tale was the man whom we were brought up to revere as the father of public sanitation, Edwin Chadwick. This dour, tactless, unpopular reformer laid the foundations for all the government interventions in public health that we now take for granted. Yet in this story he labored under not one but two illusions that proved catastrophic.

. . .

With the austere teetotaller and vegetarian Dr. Snow and his devoted helper in the Soho slums, the Rev. Henry Whitehead, "The Ghost Map" gains not one but two heroes. Patiently they mapped the patterns of victims and survivors and narrowed down the most likely source of the cholera plague to the Broad Street pump. But even after the pump handle was removed so that Londoners could no longer fill their buckets there and the illness subsided, the miasmatists were not convinced. Snow then tramped the streets of Battersea and Vauxhall to demonstrate that those who had their water from higher up the Thames, above the reach of the tide, remained unharmed, while those who took it from the foul tidewater perished in the hundreds. This was no easy task, since the pattern of water pipes under London's houses was as tangled as the pattern of Internet service providers is today.

Why did it take so long? Because mapping epidemics was only in its infancy, though Snow's famous map was not quite the first. Because the questions that Chadwick's public-health board researched were self-fulfilling, all having to do with the smells and personal habits of the poor and not with the water they drank. The researchers mistook correlation for causation: Nobody died on the high ground of Hampstead, where the air was purer, therefore higher was safer -- or so it seemed until a Mrs. Eley, who had retired thither, arranged to receive a jugful of water from her beloved Broad Street pump and got cholera.

But above all Chadwick and his crew were certain of themselves because the stench of the slums was so utterly disgusting and because smell acts so powerfully on our imaginations. Only the most careful and dispassionate investigators were free of the obsession with stench. Henry Mayhew, for example, noted in his "London Labour and the London Poor" (1851) that sewer-hunters, who scavenged deep underground knee-deep in muck, lived to a ripe old age. The Great Stink of 1858, which finally persuaded the government to commission Sir Joseph Bazalgette to lay down the magnificent network of sewers that have lasted to this day, did not kill a single Londoner -- yet still Chadwick did not believe.

 

For the full review, see: 

FERDINAND MOUNT.  "BOOKS; Lost in a Time of Cholera; How a doctor's search solved the mystery of an epidemic in Victorian London."  The Wall Street Journal   (Sat., October 21, 2006):  P8.

(Note: ellipsis added.)

 

The reference to the book is:

Johnson, Steven. The Ghost Map: The Story of London's Most Terrifying Epidemic - and How It Changed Science, Cities, and the Modern World. New York: Riverhead Books, 2006.  299 pages, $26.95

 

SnowJohn.jpg   Dr. John Snow.  Source of photo:  online version of the WSJ article cited above.

ChadwickEdwin.jpg   Edwin Chadwick.  Source of photo:  online version of the WSJ article cited above.

 




January 13, 2007

Feynman on Viking Evidence of No Life on Mars

 

Based on the Viking tests, astronomers concluded that there probably was no life on Mars.  Begley (2006) documents the recent research showing that applying the Viking tests to Earth results in the conclusion that there is no life on Earth, either.  Once again, Feynman was way ahead of his time:

  

(p. 204)  We like to sit down and talk about how different things could be from what we expected; take the Viking landers on Mars, for example, we were trying to think how many ways there could be life that they couldn't find with that equipment. 

 

Source: 

Feynman, Richard P. The Pleasure of Finding Things Out: The Best Short Works of Richard P. Feynman. New York: Perseus Books, 1999.

(Note:  italics in original.)

 

The reference on the Begley article:

Begley, Sharon. "Science Journal; Scientists Revisit Data on Mars with Minds More Open to 'Life'." The Wall Street Journal  (Fri., October 27, 2006):  B1.

 




January 8, 2007

"Drawing the Best Minds into a Whirlpool of Mathematical Solipsism"

TroubleWithPhysicsBK.gif   Source of book image:  http://www.houghtonmifflinbooks.com/catalog/titledetail.cfm?titleNumber=689539

 

Physicists rightly feel uneasy about descriptions of the physical world that divide it into discrete clusters of equations and axioms, each cluster explaining one part of existence but not another.  Better would be finding a Theory of Everything capable of conjoining, in a few equations, planet-pulling gravitation and the microcosmic weirdness that goes on in the quantum world of atoms and particles.  Physicists would like to stitch time and space together as well.

Einstein tried and failed.  In recent years, "string theory" has been the favored means of attempting to tie everything together, but it has unraveled into mathematical frippery, positing ever more intricate elaborations extending into anywhere from 10 to 26 dimensions, some arising from themselves, some hidden in ways so baroquely scrolled that you can get a migraine just thinking about thinking about them.  Little wonder that, as an experimental science, string theory seems to have nowhere to go.

That is the problem that Lee Smolin identifies in "The Trouble With Physics."  He laments a kind of sociological imperative drawing the best minds into a whirlpool of mathematical solipsism.

 

For the full review, see:

RUSSELL SEITZ.  "BOOKS; Untangling the Knots in String Theory."  The Wall Street Journal  (Sat., December 2, 2006):  P9.

 

The reference to the book under review is: 

Smolin, Lee.  The Trouble With Physics: The Rise of String Theory, the Fall of a Science, and What Comes Next.  Houghton Mifflin, 2006.  (392 pages, $26)

 




December 24, 2006

Publishing Pretty Papers Full of Clever Mathematical Tricks

  Source of book image:  http://images.amazon.com/images/P/0738203491.01.LZZZZZZZ.jpg

 

In his elegant and thoughtful foreword, physicist, futurist, and guru Freeman Dyson writes:

(p. viii)  Before I met Feynman, I had published a number of mathematical papers, full of clever tricks but totally lacking in im-(p. ix)portance.  When I met Feynman, I knew at once that I had entered another world.  He was not interested in publishing pretty papers.  He was struggling, more intensely than I had ever seen anyone struggle, to understand the workings of nature by rebuilding physics from the bottom up.   

 

The reference to the book is:

Feynman, Richard P. The Pleasure of Finding Things Out: The Best Short Works of Richard P. Feynman. New York: Perseus Books, 1999.




December 23, 2006

Schumpeter's "Sarcastic Remark" on Mathematics in Economics

Erich Schneider had been a student of Schumpeter's at the University of Bonn in the late 1920s.  The following sentences are from his lectures on Schumpeter, which he published in German in 1970, and which were translated into English by W.E. Kuhn and published in that form in 1975.

(p. 41) When, after many years of separation, I saw Schumpeter again at Harvard in the fall of 1949 and heard his lectures on economic theory--which he gave at 2 p.m., as in Bonn--I found him to be exactly the same man as before. On that afternoon he talked about the nature of dynamic analysis and about the role of difference equations in the framework of such an analysis.

To the above passage, Schneider adds footnote 3:

(p. 59) He dropped the sarcastic remark: "There are economists who do not know what a difference equation is; but there are also those who know nothing else."

Schneider, Erich.  Joseph A. Schumpeter:  Leben Und Werk Eines Grossen Sozialokonomen (Life and Work of a Great Social Scientist). Lincoln, Neb.:  University of Nebraska--Lincoln Bureau of Business Research, 1975.
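Schumpeter's quip aside, a difference equation is simply a rule that generates each period's value from earlier ones.  A minimal sketch in Python (the coefficients below are illustrative, not drawn from any model Schumpeter discussed):

```python
# First-order linear difference equation: x_{t+1} = a*x_t + b.
# With |a| < 1 the path converges to the steady state b / (1 - a).
# Coefficients here are made up for illustration.

def iterate(a, b, x0, periods):
    """Return the path x_0, x_1, ..., x_periods."""
    path = [x0]
    for _ in range(periods):
        path.append(a * path[-1] + b)
    return path

a, b = 0.5, 10.0
path = iterate(a, b, x0=0.0, periods=8)
steady_state = b / (1 - a)  # = 20.0

print([round(x, 2) for x in path])
print("steady state:", steady_state)
```

With |a| < 1 the path converges geometrically to b / (1 - a); with |a| > 1 it diverges, which is why equations of this form appear throughout dynamic analysis (cobweb and multiplier-accelerator models, for example).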




December 19, 2006

Feynman: What Biology Needs is Not More Math, But to See Better at the Atomic Level

A very bright and very mathematically competent fellow grants that math is not the source of all knowledge.  So is economics more like physics, or more like biology? 

 

(p. 124)  We have friends in other fields--in biology, for instance.  We physicists often look at them and say, "You know the reason you fellows are making so little progress?"  (Actually I don't know any field where they are making more rapid progress than they are in biology today.)  "You should use more mathematics, like we do."  They could answer us--but they're so polite, so I'll answer for them:  "What you should do in order for us to make more rapid progress is to make the electron microscope 100 times better."

What are the most central and fundamental problems of biology today?  They are questions like:  What is the sequence of bases in the DNA?  What happens when you have a mutation?  How is the base order in the DNA connected to the order of amino acids in the protein?  What is the structure of the RNA:  is it a single-chain or double-chain, and how is it related in its order of bases to the DNA?  What is (p. 125) the organization of the microsomes?  How are proteins synthesized?  Where does the RNA go?  How does it sit?  Where do the proteins sit?  Where do the amino acids go in?  In photosynthesis, where is the chlorophyll; how is it arranged; where are the carotenoids involved in this thing?  What is the system of the conversion of light into chemical energy?

It is very easy to answer many of these fundamental biological questions; you just look at the thing!  You will see the order of bases in the chain; you will see the structure of the microsome.  Unfortunately, the present microscope sees at a scale which is just a bit too crude.  Make the microscope one hundred times more powerful, and many problems of biology would be made very much easier.  I exaggerate, of course, but the biologists would surely be very thankful to you--and they would prefer that to the criticism that they should use more mathematics.

 

Source:

Feynman, Richard P.  The Pleasure of Finding Things Out: The Best Short Works of Richard P. Feynman.  New York:  Perseus Books, 1999.

 




December 14, 2006

"Forgotten not for lack of importance, but for lack of theoretical frameworks"

A paper by the current head of the President's Council of Economic Advisers, Ed Lazear, is significant for what it says near the end about economists forgetting facts because the facts do not fit into current theory.

(p. 260)  Human capital theory is primarily a supply-side approach that focuses on the characteristics and skills of the individual workers.  It pays far less attention to the environments in which workers work.  As such, the human capital framework has led researchers to focus on one class of questions, but to ignore others.  Specifically, little attention has been paid to the jobs in which workers are employed. 

(p. 263) The fact that some jobs and some job characteristics are more likely to lead to promotions than other jobs is not surprising.  But the analysis suggests that other ways of thinking about wage determination, namely, through job selection, may have been unduly ignored in the past. 

. . .

Researchers have begun to make jobs rather than individuals the unit of analysis.  This change of focus can illuminate new issues and provide answers to questions that were once posed and forgotten.  The questions were forgotten not for lack of importance, but for lack of theoretical frameworks.  The theory is now developed and awaits confirmation in the data.

 

For the full paper, see:

Lazear, Edward P.  "A Jobs-Based Analysis of Labor Markets."  American Economic Review 85, no. 2 (May 1995):  260-265.

(Note:  ellipsis added.)

 




November 25, 2006

Does Focus on Scarcity Blind Us to Abundance?

Chris Anderson ends chapter 8 of his stimulating The Long Tail with a provocative jab at economists:

(p. 146)  Finally, it's worth noting that economics, for all its charms, doesn't have the answer to everything.  Many phenomena are simply left to other disciplines, from psychology to physics, or left without an academic theory at all.  Abundance, like growth itself, is a force that is changing our world in ways that we experience every day, whether we have an equation to describe it or not.

 

The reference to Anderson's book, is:

Anderson, Chris. The Long Tail. New York: Hyperion, 2006.




October 14, 2006

R&D Stats Better; But Still Omit a Lot of Innovation

GDPgrowthWithR&Dgraph.gif  Source of graphic:  online version of WSJ article cited below.

Note well Romer's caveat below that, although we may be measuring better, we are still not measuring Schumpeterian innovations (such as the Wal-Mart innovations that are vastly increasing the efficiency of retailing).

 

That research and development makes an important contribution to U.S. economic growth has long been obvious.  But in an important advance, the nation's economic scorekeepers declared they can now measure that contribution and found that it is increasing.

. . .

Since the 1950s, economists have explained economic output as the result of measurable inputs.  Any increase in output that can't be explained by capital and labor is called "multifactor productivity" or "the Solow residual," after Robert Solow, the Nobel Prize-winning economist considered the father of modern growth theory.

From 1959 to 2002, this factor accounted for about 20% of U.S. growth.  From 1995 to 2002, when productivity growth accelerated sharply, that grew to about 33%.  Accounting for R&D would explain about one-fifth, by some measures, of the productivity mystery.  It suggests companies have been investing more than the official data had previously shown -- a good omen for future economic growth.  "The slump in investment is not as drastic as people thought before they saw these figures," says Dale Jorgenson, professor of economics at Harvard University.

Mr. Jorgenson noted a lot of the multifactor productivity growth remains unexplained.  "The great mystery of growth . . . is not eliminated."

Paul Romer, an economics professor at Stanford Business School, said the better the measurements of R&D become, the more economists and policy makers will realize other factors may be more important.  "If you look at why we had rapid productivity growth in big-box retailing, there were lots of intangibles and ideas that . . . don't get recorded as R&D."

 

For the full story, see:

GREG IP and MARK WHITEHOUSE.  "Why Economists Track Firms' R&D; Data on Knowledge Creation Point to an Increasing Role In Domestic Product Growth."  Wall Street Journal  (Fri., September 29, 2006):  A2.

(Note:  The slightly different online version of the title is:  "Why Economists Track Firms' R&D; Data on Knowledge Creation Point to an Increasing Role In Domestic Product Growth.")

(Note:  ellipses in Jorgenson and Romer quotes, in original; ellipsis between paragraphs, added.)
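The growth accounting behind the "Solow residual" described in the story can be sketched in a few lines.  Assuming a Cobb-Douglas technology with capital share alpha, the residual is output growth minus share-weighted input growth.  The 0.3 share and the growth rates below are made-up illustrations, not numbers from the article:

```python
# Growth accounting sketch: the part of output growth not explained
# by capital and labor is the "Solow residual" (multifactor
# productivity).  All numbers are illustrative.

alpha = 0.3  # assumed capital share of income

# hypothetical annual growth rates (log differences)
g_output = 0.035   # 3.5% GDP growth
g_capital = 0.040  # 4.0% capital-stock growth
g_labor = 0.015    # 1.5% labor-input growth

# residual = output growth minus share-weighted input growth
residual = g_output - alpha * g_capital - (1 - alpha) * g_labor

share_of_growth = residual / g_output

print(f"Solow residual: {residual:.4f} per year")
print(f"Share of growth unexplained by inputs: {share_of_growth:.0%}")
```

Reclassifying R&D spending as investment, as the article describes, shrinks the measured residual by attributing more of output growth to a (now larger) measured capital input.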

 




August 10, 2006

Static Assumptions Undermine Economic Policy Analysis


Over 50 years ago, Schumpeter emphasized that static models of capitalism miss what is most important in capitalism.  Yet static analysis still dominates most policy discussions.  But there is hope:


(p. A14) A bit of background:  Most official analysis of tax policy is based on what economists call "static assumptions."  While many microeconomic behavioral responses are included, the future path of macroeconomic variables such as the capital stock and GNP are assumed to stay the same, regardless of tax policy.  This approach is not realistic, but it has been the tradition in tax analysis mainly because it is simple and convenient.

In his 2007 budget, President Bush directed the Treasury staff to develop a dynamic analysis of tax policy, and we are now reaping the fruits of those efforts.  The staff uses a model that does not consider the short-run effects of tax policy on the business cycle, but instead focuses on its longer run effects on economic growth through the incentives to work, save and invest, and to allocate capital among competing uses.

 

For the full story, see:

ROBERT CARROLL and N. GREGORY MANKIW.  "Dynamic Analysis."  The Wall Street Journal  (Weds., July 26, 2006):  A14.





July 12, 2006

Test That Showed No Life on Mars, Now Also Shows No Life on Earth, Either

  One of the Viking landers on Mars.  Source of photo:  http://www.msss.com/mars/pictures/viking_lander/viking_lander.html

 

When scientists announced Monday that the search for life on Mars 30 years ago may not have been quite the bust it has long been portrayed, it didn't mean that the mission had missed any microorganisms, let alone advanced life forms.  But it did underline the growing sense that decades of assumptions about extraterrestrial life need serious re-examination.

In 1976, scientists studying data sent back by the Viking landers were quick to dismiss life on Mars.  . . .

. . .

Some three decades later, more-sophisticated instruments have shown that the Vikings couldn't have detected organic molecules even if any were present.  When scientists fed soil from the Atacama Desert of Chile and Peru, and the Dry Valleys of Antarctica, into experiments like those the Vikings conducted, the tests came up empty.  Yet, new techniques show the samples contained 10 to 1,500 micrograms of carbon per gram.

"If we knew this 30 years ago, our interpretation of the Viking results would have been very different," says Rafael Navarro-González of Mexico's National Autonomous University, who led the study published in Proceedings of the National Academy of Sciences.

 

For the full story, see: 

SHARON BEGLEY.  "SCIENCE JOURNAL; Scientists Revisit Data On Mars With Minds More Open to 'Life'."  The Wall Street Journal  (Fri., October 27, 2006):  B1.

 

 




May 25, 2006

Omit the Footnotes?

When I was a graduate student at Chicago, Milton Friedman was rumored to have given a presentation on how to write a doctoral dissertation in which he said something like: 

Take everything nonessential, and move it into footnotes.  Then collect all the footnotes into an appendix.  Finally, delete the appendix.

My memory is that Deirdre McCloskey, in her wonderful advice on how to write economics clearly, also advises against footnotes.  I at least attribute this advice to McCloskey (and Friedman) when I pass it on to students.

But sometimes, when I write an article, a misguided referee, or editor, insists that I omit some stuff that I think is really good.  When that happens, sometimes, if I feel strongly, I sneak some of that material back into the paper in footnotes.  Maybe no one will ever read it, but I feel better that it is still there.

And every once in a while, it may turn out that the footnotes are what matter most: 

It was typical of Schumpeter's love for theory that he rejected Marshall's view that the reader could skip the footnotes and appendixes.  If time were short, Schumpeter advised, read them and skip the text!  (p. 7; italics in original.)

In this case, though, I suspect that Marshall was right, and Schumpeter wrong.

 

Source:

Samuelson, Paul. "Schumpeter as an Economic Theorist." In Schumpeterian Economics, edited by Helmut Frisch, New York: Praeger Publishers, 1981, pp. 1-27.

 




April 17, 2006

Becker on Goals of Economics: Understand the World, and Improve It

 

Gary Becker at April 7, 2006 tribute dinner.  Source of image:  online press release cited below.

 

Gary Becker has made enormous contributions to economic theory, most notably in convincing the profession of the importance of human capital and the family.  A new center has been established at the University of Chicago in Gary Becker's honor.

 

Becker's brief remarks concluded the evening.  Economics will change over time, but one constant—whatever the tools or techniques—is the goal of economics, he said.   “It is judged ultimately by how well it helps us understand the world, and how well we can help improve it.”

 

For the full story, see:

Goddu, Jenn Q.  "Gift Names the Becker Center on Chicago Price Theory, Founded by Richard O. Ryan."  University of Chicago News Office, 2006.

 




April 12, 2006

Studies Show Economic Freedom Boosts Economic Growth

A trio of European economists have just published a meta-analysis on the effects of economic freedom (EF) on economic growth. After many pages, here is their bottom-line conclusion:

(p. 182) Most studies reviewed in this paper have serious drawbacks, including lacking sensitivity analysis and poor specifications of the growth model used. However, studies that have applied some kind of sensitivity analysis and sensible specifications generally find support for a positive relationship between changes in EF and growth. This suggests that liberalization will indeed boost economic growth.


For the full article, see:

De Haan, Jakob, Susanna Lundström, and Jan-Egbert Sturm. "Market-Oriented Institutions and Policies and Economic Growth: A Critical Survey." Journal of Economic Surveys 20, no. 2 (2006): 157-91.
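
The kind of sensitivity check the authors favor can be sketched in a few lines. This is a hypothetical illustration on synthetic data (the variables `ef`, `invest`, and `school`, and the data-generating process, are my invention, not the paper's): estimate the coefficient on economic freedom under several sets of controls and see whether its sign survives across specifications.

```python
import numpy as np

# Synthetic cross-country data (for illustration only).
rng = np.random.default_rng(0)
n = 120
ef = rng.normal(size=n)        # economic-freedom index (standardized)
invest = rng.normal(size=n)    # investment-share control
school = rng.normal(size=n)    # schooling control
growth = 0.5 * ef + 0.3 * invest + rng.normal(scale=0.5, size=n)

def ef_coefficient(controls):
    """OLS coefficient on EF with a given list of control variables."""
    X = np.column_stack([np.ones(n), ef] + controls)
    beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
    return beta[1]

# A crude extreme-bounds-style check: estimate the EF coefficient
# under every combination of the candidate controls.
specs = [[], [invest], [school], [invest, school]]
coefs = [ef_coefficient(c) for c in specs]
print(all(b > 0 for b in coefs))  # True: EF effect positive in every specification
```

A coefficient that flips sign when controls change would be exactly the fragility the survey warns about; here, by construction, the effect is robust.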




March 2, 2006

Owlish Evidence: More on Why Crichton is Right

Environmentalists have hypothesized that there is a link between harvesting old-growth forests and declines in owl populations. But there is reason to believe that the hypothesis may be false, and apparently environmentalists and the federal government do not have much interest in testing it:


. . . , we know little about the relationship between harvesting and owl populations. One such study -- privately funded -- infers an inverse relationship between harvesting and owls. In other words, in areas where some harvesting has occurred, owl numbers are increasing a bit, or at least holding their own, while numbers are declining in areas where no harvesting has occurred.

This news will come as no surprise to Oregon, Washington and California timberland owners who are legally required to provide habitat for owls. Their actively managed lands are home to the highest reproductive rates ever recorded for spotted owls. Why is this?

One possible answer is that the anecdotal evidence on which the listing decision was based is incomplete. No one denies the presence of owls in old-growth forests, but what about the owls that are prospering in managed forests and in forests where little old growth remains? Could it be that spotted owls are more resourceful than we think?

We don't know -- and the reason we don't know is that 16 years ago federal scientists chose to politicize their hypothesis rather than test it rigorously, to flatly reject critiques from biometricians who questioned the statistical validity of the evidence on which the listing decision was based, and to declare with by-god certainty that once the old-growth harvest stopped owl populations would begin to recover.


For the full story, see:

JIM PETERSEN. "RULE OF LAW; Owl Be Damned." The Wall Street Journal (Sat., February 18, 2006): A9.




February 22, 2006

Solow's Wit (But Not Wisdom): Treat Schumpeter "Like a Patron Saint"


(p. 195) As Robert Solow wrote acidly in 1994, commenting on a series of papers on growth and imperfect competition, "Schumpeter is a sort of patron saint in this field. I may be alone in thinking that he should be treated like a patron saint: paraded around one day each year and more or less ignored the rest of the time."

Schumpeter was a most unwelcome guest at the neoclassical table. Yet it was hard for the mainstream to reject him out of hand, since Schumpeter was such a celebrant of capitalism and entrepreneurship. He thought it a superb, energetic, turbulent system, one that led to material betterment over time. He hoped it would triumph over socialism. He just didn't believe it functioned in anything close to the way the Marshallians did, and he was appalled that economists could apply an essentially static model to something as profoundly dynamic as capitalism. Schumpeter wrote presciently, "Whereas a stationary feudal economy would still be a feudal economy, and a stationary socialist economy would still be a socialist economy, stationary capitalism is a contradiction in terms." Its very essence, as the economic historian Nathan Rosenberg wrote, (p. 196) echoing Schumpeter, "lies not in equilibrating forces, but in the inevitable tendency to depart from equilibrium" every time an innovation occurs.


Source:

Kuttner, Robert. Everything for Sale: The Virtues and Limits of Markets. Chicago: University of Chicago Press, 1999.





January 10, 2006

Theory Uncomplemented by History, "Is Worse than no Theory at All"

I have been primarily a theorist all my life and feel quite uncomfortable in having to preach the historian's faith. Yet I have arrived at the conclusion that theoretical equipment, if uncomplemented by a thorough grounding in the history of the economic process, is worse than no theory at all.

Excerpted from a letter from Schumpeter to Miss Edna Lonegan, dated February 16, 1942, stored in the Schumpeter archives at Harvard, and reprinted in:

Swedberg, Richard. Schumpeter: A Biography. Princeton, NJ: Princeton University Press, 1991, pp. 229-230.




January 8, 2006

Ben Rogge on Consistency, Smith and Ricardo

Some would argue that consistency is not always a good thing. Ben Rogge's favorite quote from Emerson was:

A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines.

Rogge used to mention this quote when he defended Adam Smith against the charge of inconsistency. He would say that Smith's errors on one page would not keep him from writing an important (albeit inconsistent) truth on the next page. In this regard, he contrasted Smith with Ricardo. Ricardo was consistent, and since he was wrong at the start, he was consistently wrong throughout.


Source for the Emerson quote:

Bartlett, John. Familiar Quotations. Boston: Little, Brown and Company, 1955, p. 501, column b. Bartlett gives the source as Emerson's essay "Self-Reliance."




December 30, 2005

Mathematical Rigor

Mathematics brought rigor to economics. Unfortunately, it also brought mortis.

Attributed, in various versions, and in various places, to Kenneth Boulding.




November 18, 2005

The Importance of a Good Example

Stanley Fischer, the former deputy managing director of the IMF, once remarked to me, "One good example is worth a thousand theories." (p. 463)

Friedman, Thomas L. The World Is Flat: A Brief History of the Twenty-First Century. New York: Farrar, Straus and Giroux, 2005.




August 18, 2005

No Known Upper Bound for Economic Growth

Given the limited state of our knowledge of the process of technological change, we have no way to estimate what the upper bound on the feasible rate of growth for an economy might be. If economists had tried to make a judgment at the end of the 19th century, they would have been correct to argue that there was no historical precedent that could justify the possibility of an increase in the trend rate of growth of income per capita to 1.8% per year. Yet this increase is what we achieved in the 20th century. (p. 226)

Romer, Paul M. "Should the Government Subsidize Supply or Demand in the Market for Scientists and Engineers?" In Innovation Policy and the Economy, vol. 1, Cambridge, MA: MIT Press, 2001, pp. 221-252.
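
To see why Romer calls the 20th-century figure remarkable, it helps to compound it. The arithmetic below is my illustration, not Romer's: at 1.8% per year, income per capita multiplies roughly sixfold over a century.

```python
# Compound growth at Romer's cited 20th-century trend rate.
rate = 0.018   # 1.8% per year
years = 100
factor = (1 + rate) ** years
print(round(factor, 2))  # 5.95 -- income per capita grows roughly sixfold
```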




August 17, 2005

Evidence Matters: The Ivory-Billed Woodpecker

Science doesn't always work this way, but sometimes it does, and it would be good if it did more often.

The phoenix had nothing on the ivory-billed woodpecker.

It is hard to keep track of how many times this near-mythic bird, the largest American woodpecker and a poignant symbol of extinction and disappearing forests, has been lost, and then found. Now it is found again.

Even the most skeptical ornithologists now agree. They say newly presented recordings show that at least two of the birds are living in Arkansas.

Richard O. Prum, an ornithologist at Yale University and one of several scientists who had challenged the most recently claimed rediscovery of the ivory bill, said Monday after listening to the tape recordings that he was now "strongly convinced that there is at least a pair of ivory bills out there."

Mark B. Robbins, an ornithologist at the University of Kansas, who had also been a skeptic, listened to the same recordings with a graduate student and said, "We were absolutely stunned."

James Gorman and Andrew C. Revkin. "Vindication For a Bird And Its Fans." New York Times (Tuesday, August 2, 2005): A10.

See also, the earlier: James Gorman. "Woodpecker Flies By and the Critics Soon Follow." New York Times Section 1 (Sun., July 24, 2005): 1 & 21.




August 16, 2005

A Modest Proposal

Schools: Save cash

Teaching intelligent design exclusively would allow us to sell off a lot of "scientific" lab equipment and get rid of all of our grossly overpaid "science" teachers.

Imagine the savings if we could hire people who had the common sense to answer that "God did that" to every scientific inquiry. We don't need to limit intelligent design merely to the question of the origins of humanity, either. It's such a tidy theory that it can explain just about everything.

It can explain the 250,000 humans killed by the December tsunami. It can explain AIDS. It can explain cancer. It can explain 9/11.

Why bother studying anything at all when we already have the answer?

Dan Mundt, Denison, Iowa


Source: "The Public Pulse." Omaha World-Herald (Friday, April 12, 2005): 6B.




August 3, 2005

Debreu on Danger of Over-Mathematization of Economics

From Debreu's Presidential Address before the American Economic Association:

In the past two decades, economic theory has been carried away further by a seemingly irresistible current that can be explained only partly by the intellectual successes of its mathematization.

Essential to an attempt at a fuller explanation are the values imprinted on an economist by his study of mathematics. When a theorist who has been so typed judges his scholarly work, those values do not play a silent role; they may play a decisive role. The very choice of the questions to which he tries to find answers is influenced by his mathematical background. Thus, the danger is ever present that the part of economics will become secondary, if not marginal, in that judgment. (p. 5)

Debreu, Gerard. "The Mathematization of Economic Theory." American Economic Review 81, no. 1 (1991): 1-7.





July 30, 2005

Economics and Physics

Economists have sometimes been accused of physics-envy, but that is an utterly misleading accusation. Anyone who knows modern physics will testify that physicists care about experimental evidence, about bringing their theories into conformity with the experimental evidence, and very little about rigorous theorems and analytical lemmas. What economists really suffer from is mathematics-envy.

Blaug, Mark. "Not Only an Economist: Autobiographical Reflections of a Historian of Economic Thought." In Reflections of Eminent Economists, edited by Michael Szenberg and Lall Ramrattan, 71-94. Cheltenham, UK: Edward Elgar, 2004, p. 90.

We economists like to think of our science as akin to physics in its mathematical rigor. Quite by accident, while pursuing some research on stochastic processes, I obtained in 1995 a detailed description of the research being done by tenured and tenure-track members of the Harvard University physics department, ranked that same year in a National Research Council survey as the top physics department in the United States. Analysis of the research . . .



