November 18, 2015

Challenging Videogame Improves Attention and Memory in Seniors

(p. R1) Neuroscientist Adam Gazzaley and his colleagues at the University of California, San Francisco have found that playing a challenging videogame upgrades our ability to pay attention.

As reported in the journal Nature in 2013, the Gazzaley lab trained 60- to 85-year-old subjects on a game called NeuroRacer. The multitask version involves simulated driving along a winding road while quickly pressing keys or a game controller to respond to a green sign when it appears on the roadside. As a control, some subjects played a single-task version of the game that omits the winding road and involves only noticing and responding to the green sign. To ensure that subjects were genuinely challenged but not discouraged, the level of game difficulty was individualized.

After 12 hours of training spread evenly over a month, multitasking subjects were about twice as efficient at shifting attention as when they started, a huge improvement by any standard. Remarkably, their new scores were comparable to those of 20-year-olds not trained on NeuroRacer. The gains were still evident when the subjects were retested six months later.

The multitaskers also got an unexpected brain bonus. Their sustained concentration and working memory (briefly holding information such as a phone number) improved as well. The training had targeted neither of these functions, but the general benefits emerged nonetheless.

For the full commentary, see:

PATRICIA CHURCHLAND. "MIND AND MATTER; A Senior Moment for Videogames as Brain-Boosters." The Wall Street Journal (Sat., Oct. 3, 2015): C2.

(Note: the online version of the commentary has the date Sept. 30, 2015, and the title "MIND AND MATTER: Videogames for Seniors Boost Brainpower.")

The Gazzaley article mentioned above is:

Anguera, J. A., J. Boccanfuso, J. L. Rintoul, O. Al-Hashimi, F. Faraji, J. Janowich, E. Kong, Y. Larraburo, C. Rolle, E. Johnston, and Adam Gazzaley. "Video Game Training Enhances Cognitive Control in Older Adults." Nature 501, no. 7465 (Sept. 5, 2013): 97-101.

November 15, 2015

Dogged Dreamers Developed Deadly Dirigibles

(p. C7) "Dirigibility" means the ability to navigate through the air by engine power, unlike balloon flight, which is captive to the wind. Beginning and ending with the Hindenburg vignette, C. Michael Hiam gives in "Dirigible Dreams" a concise but comprehensive history of the airship and its evolution. With style and some flair, Mr. Hiam introduces a cast of dogged visionaries, starting with Albert Santos-Dumont, a Brazilian whose exploits from 1901 onward usually culminated in our hero dangling from a tree or a high building, shredded gas bags draped around him like a shroud. For all of these pioneers, problems queued up from the outset: Insurance companies, for example, refused to quote a rate for aerial liability. (Try asking your broker today.) And to inflate the craft the engineers were stuck with hydrogen, since non-flammable helium was too scarce and hot air has insufficient lifting force.

. . .

In 1929, British engineers pioneered a giant dirigible--at 133 feet in diameter, Mr. Hiam notes, it was "the largest object ever flown"--powered by six Rolls-Royce Condor engines. But too many died as the still-flimsy crafts plunged to the ground in flames. His Majesty's secretary of state for air perished in a luxurious airship cabin on the way to visit the king's subjects in India. One by one, nations gave up their dirigible dreams, especially after 35 souls burned to death on the Hindenburg in Lakehurst, N.J., one of the first transport disasters recorded on film. After that tragedy, commercial passengers never flew in an airship again, and by the start of World War II just two years later "the airship had become entirely extinct."

For the full review, see:

SARA WHEELER. "Inflated Hopes; Early airship experimenters found that insurance companies refused to quote rates for aerial liability." The Wall Street Journal (Sat., Oct. 18, 2014): C7.

(Note: ellipsis added.)

(Note: the online version of the review was updated on Oct. 23, 2014.)

The book under review is:

Hiam, C. Michael. Dirigible Dreams: The Age of the Airship. Lebanon, NH: ForeEdge, 2014.

November 12, 2015

The Cure for Technology Problems Is Better Technology

(p. B1) The real lesson in VW's scandal -- in which the automaker installed "defeat devices" that showed the cars emitting lower emissions in lab tests than they actually did -- is not that our cars are stuffed with too much technology. Instead, the lesson is that there isn't enough tech in vehicles.

In fact, the faster we upgrade our roads and autos with better capabilities to detect and analyze what's going on in the transportation system, the better we'll be able to find hackers, cheaters and others looking to create havoc on (p. B11) the highways.

. . .

"What happened at Volkswagen had to do with embedded software that's buried deep in the car, and only the supplier knows what's in it -- and it's a black box for everybody else," said Stefan Heck, the founder of Nauto, a new start-up that is introducing a windshield-mounted camera that monitors road conditions for commercial fleets and consumers. The camera uses artificial intelligence to track traffic conditions; over time, as more vehicles use it, it could provide users with traffic and safety information plus data about mileage and other automotive functions.

The end goal for intelligent-car systems, said Dr. Heck, is to create an on-road network with data that is constantly being analyzed to get a sharper picture of what's happening on the road. Sure, companies might still be able to cheat. But with enough independent data sources coming from different places on the road, it would become much more difficult.
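A minimal sketch of the cross-checking idea Dr. Heck describes: if independent roadside sensors report emissions for the same vehicles that manufacturers report on, a large gap between the self-reported figure and the independent measurements stands out. Everything below (the function name, the 25 percent tolerance, the sample numbers) is an illustrative assumption of mine, not a description of Nauto's actual system.

# Illustrative sketch only (not Nauto's system): flag vehicles whose
# self-reported emissions sit far below independent roadside measurements.
from statistics import median

def flag_suspect_vehicles(self_reported, roadside_readings, tolerance=0.25):
    # self_reported: {vehicle_id: grams per km claimed by the maker}
    # roadside_readings: {vehicle_id: list of independent grams-per-km readings}
    # Returns vehicle ids whose claim is more than `tolerance` below the
    # median of the independent readings.
    suspects = []
    for vid, claimed in self_reported.items():
        readings = roadside_readings.get(vid, [])
        if not readings:
            continue  # no independent data to compare against
        if claimed < median(readings) * (1 - tolerance):
            suspects.append(vid)
    return suspects

print(flag_suspect_vehicles(
    {"car_a": 120, "car_b": 95},
    {"car_a": [180, 175, 190], "car_b": [98, 94, 101]}))
# prints ['car_a']: the claim is far below what the independent sensors saw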

He said there really isn't any going back -- software in cars is responsible not just for driver comforts like in-dash navigation, but also for critical safety and performance systems, many of which improve the car's environmental footprint.

For the full commentary, see:

Farhad Manjoo. "STATE OF THE ART; Our Cars Need More Technology." The New York Times (Thurs., Oct. 1, 2015): B1 & B11.

(Note: ellipsis added.)

(Note: the online version of the commentary has the date SEPT. 30, 2015, and the title "STATE OF THE ART; VW Scandal Shows a Need for More Tech, Not Less.")

October 22, 2015

"Bring Prosperity to Billions of People"

(p. B1) If you're feeling down about the world, the book, "Resource Revolution: How to Capture the Biggest Business Opportunity in a Century," is an antidote. Mr. Rogers and Mr. Heck outline how emerging advances -- among them 3-D printing, autonomous vehicles, modular construction systems and home automation -- might in time alter some of the world's largest industries and (p. B7) bring prosperity to billions of people.

They put forward a rigorous argument bolstered by mountains of data and recent case studies. And once you start looking at Silicon Valley their way, your mind reels at the far-reaching potential of the innovations now spreading through society.

For the full commentary, see:

Farhad Manjoo. "STATE OF THE ART; The Future Could Work, if We Let It." The New York Times (Thurs., AUG. 28, 2014): B1 & B7.

(Note: the online version of the commentary has the date AUG. 27, 2014.)

The book praised in the commentary is:

Heck, Stefan, and Matt Rogers. Resource Revolution: How to Capture the Biggest Business Opportunity in a Century. New York: Melcher Media, 2014.

October 19, 2015

FCC Gains Arbitrary Power Over Internet Innovation

(p. A11) Imagine if Steve Jobs, Larry Page or Mark Zuckerberg had been obliged to ask bureaucrats in Washington if it was OK to launch the iPhone, Gmail, or Facebook's forthcoming Oculus virtual-reality service. Ridiculous, right? Not anymore.

A few days before the Independence Day holiday weekend, the Federal Communications Commission announced what amounts to a system of permission slips for the Internet.

. . .

As the FCC begins to issue guidance and enforcement actions, it's becoming clearer that critics who feared there would be significant legal uncertainty were right. Under its new "transparency" rule, for example, the agency on June 17 conjured out of thin air an astonishing $100 million fine against AT&T, even though the firm explained its mobile-data plans on its websites and in numerous emails and texts to customers.

The FCC's new "Internet Conduct Standard," meanwhile, is no standard at all. It is an undefined catchall for any future behavior the agency doesn't like.

. . .

From the beginning, Internet pioneers operated in an environment of "permissionless innovation." FCC Chairman Tom Wheeler now insists that "it makes sense to have somebody watching over their shoulder and ready to jump in if necessary." But the agency is jumping in to demand that innovators get permission before they offer new services to consumers. The result will be less innovation.

For the full commentary, see:

BRET SWANSON. "Permission Slips for Internet Innovation; The FCC's new Web rules are already as onerous as feared and favor some business models over others." The Wall Street Journal (Sat., Aug. 15, 2015): A11.

(Note: ellipses added.)

(Note: the online version of the commentary has the date Aug. 14, 2015.)

October 16, 2015

"We Embrace New Technology"

(p. 2D) . . . , the first digital images created by the earliest digital cameras "were terrible," Rockbrook's Chuck Fortina said. "These were real chunky images made by big, clunky cameras."

Viewing those results, some retailers dismissed the new digital technology and clung doggedly to film. But Rockbrook Camera began stocking digital cameras alongside models that used film, Fortina said.

"Film sales were great, but we just knew digital was going to take over," Fortina said. As those cameras and their images improved, the retailer saw a huge opportunity. ''Instead of thinking this is going to kill our business, we were thinking people are going to have to buy all new gear," Fortina said of the switch from analog to digital.

"By 2000, film was over," he said. Companies that didn't refocus their business found themselves struggling or forced to close their doors.

Today, Rockbrook Camera is constantly scouring the Internet, attending trade shows and quizzing customers and employees in search of new technologies, Fortina said. "We embrace new technology," he said.

For the full story, see:

Janice Podsada. "More Ready than Not for Tech Shifts; How 3 Omaha-area businesses altered course and thrived amid technological changes." Omaha World-Herald (SUNDAY, SEPTEMBER 27, 2015): 1D-2D.

(Note: ellipsis added.)

(Note: the online version of the story has the title "How 3 Omaha-area businesses altered course and thrived amid technological changes.")

October 14, 2015

John Paul Stapp Thumbed His Nose at the Precautionary Principle

(p. C7) In the early 19th century, a science professor in London named Dionysius Lardner rejected the future of high-speed train travel because, he said, "passengers, unable to breathe, would die of asphyxia." A contemporary, the famed engineer Thomas Tredgold, agreed, noting "that any general system of conveying passengers . . . [traveling] at a velocity exceeding 10 miles an hour, or thereabouts, is extremely improbable."

The current land speed for a human being is 763 miles an hour, or thereabouts, thanks in large part to the brilliance, bravery and dedication of a U.S. Air Force lieutenant colonel named John Paul Stapp, a wonderfully iconoclastic medical doctor, innovator and renegade consumer activist who repeatedly put his own life in peril in search of the line beyond which human survival at speed really was "extremely improbable."

. . .

Initial tests were carried out on a crash-test dummy named Oscar Eightball, then chimpanzees and pigs. There was plenty of trial and error--the term "Murphy's Law" was coined during the Gee Whiz experiments--until Stapp couldn't resist strapping himself into the Gee Whiz to experience firsthand what the cold data could never reveal: what it felt like. On May 5, 1948, for example, he "took a peak deceleration of an astounding twenty-four times the force of gravity," the author writes. "This was the equivalent of a full stop from 75 miles per hour in just seven feet or, in other words, freeway speed to zero in the length of a very tall man."
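As a rough check on the number quoted from the book: stopping from 75 miles per hour in seven feet at constant deceleration works out to roughly 27 times the force of gravity, the same order as the roughly 24 g peak Stapp recorded. The short calculation below is only a back-of-the-envelope sketch added for illustration; it is not from the book or the review.

# Back-of-the-envelope check: constant deceleration from 75 mph to rest in 7 feet.
MPH_TO_MS = 0.44704   # meters per second per mile per hour
FT_TO_M = 0.3048      # meters per foot
G = 9.80665           # standard gravity, m/s^2

v = 75 * MPH_TO_MS    # initial speed in m/s
d = 7 * FT_TO_M       # stopping distance in m
a = v ** 2 / (2 * d)  # constant deceleration, from v^2 = 2*a*d

print(f"{a:.0f} m/s^2, about {a / G:.0f} g")
# prints roughly 263 m/s^2, about 27 g -- consistent in magnitude with the
# ~24 g peak reported for the actual (non-constant) deceleration pulse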

Stapp endured a total of 26 rides on the Gee Whiz over the course of 50 months, measuring an array of physiological factors as well as testing prototype helmets and safety belts. Along the way he suffered a broken wrist, torn rib cartilage, a bruised collarbone, a fractured coccyx, busted capillaries in both eyes and six cracked dental fillings. Colleagues became increasingly concerned for his health every time he staggered, gamely, off the sled, but, according to Mr. Ryan, he never lost his sense of humor, nor did these ordeals stop Dr. Stapp from voluntarily making house calls at night for families stationed on the desolate air base.

. . .

After 29 harrowing trips down the track, Stapp prepared for one grand finale, what he called the "Big Run," hoping to achieve 600 miles per hour, the speed beyond which many scientists suspected that human survivability was--really, this time--highly improbable. On Dec. 10, 1954, Sonic Wind marked a speed of 639 miles per hour, faster than a .45 caliber bullet shot from a pistol. Film footage of the test shows the sled rocketing past an overhead jet plane that was filming the event. The Big Run temporarily blinded Stapp, and he turned blue for a few days, but the experiment landed him on the cover of Time magazine as the fastest man on earth. The record stood for the next 30 years.

For the full review, see:

PATRICK COOKE. "Faster Than a Speeding Bullet--Really." The Wall Street Journal (Sat., Aug. 22, 2015): C7.

(Note: first ellipsis, and bracketed word, in original; other ellipses added.)

(Note: the online version of the review has the date Aug. 21, 2015.)

The book under review is:

Ryan, Craig. Sonic Wind: The Story of John Paul Stapp and How a Renegade Doctor Became the Fastest Man on Earth. New York: Liveright Publishing Corp., 2015.

September 17, 2015

Fire Cooked Carbohydrates Fed Bigger Brains

(p. D5) Scientists have long recognized that the diets of our ancestors went through a profound shift with the addition of meat. But in the September issue of The Quarterly Review of Biology, researchers argue that another item added to the menu was just as important: carbohydrates, bane of today's paleo diet enthusiasts. In fact, the scientists propose, by incorporating cooked starches into their diet, our ancestors were able to fuel the evolution of our oversize brains.

. . .

Cooked meat provided increased protein, fat and energy, helping hominins grow and thrive. But Mark G. Thomas, an evolutionary geneticist at University College London, and his colleagues argue that there was another important food sizzling on the ancient hearth: tubers and other starchy plants.

Our bodies convert starch into glucose, the body's fuel. The process begins as soon as we start chewing: Saliva contains an enzyme called amylase, which begins to break down starchy foods.

Amylase doesn't work all that well on raw starches, however; it is much more effective on cooked foods. Cooking makes the average potato about 20 times as digestible, Dr. Thomas said: "It's really profound."

. . .

Dr. Thomas and his colleagues propose that the invention of fire, not farming, gave rise to the need for more amylase. Once early humans started cooking starchy foods, they needed more amylase to unlock the precious supply of glucose.

Mutations that gave people extra amylase helped them survive, and those mutations spread because of natural selection. That glucose, Dr. Thomas and his colleagues argue, provided the fuel for bigger brains.

For the full story, see:

Carl Zimmer. "MATTER; For Evolving Brains, a 'Paleo' Diet of Carbs." The New York Times (Tues., AUG. 18, 2015): D5.

(Note: ellipses added.)

(Note: the online version of the story has the date AUG. 13, 2015.)

The academic article summarized in the passages above is:

Hardy, Karen, Jennie Brand-Miller, Katherine D. Brown, Mark G. Thomas, and Les Copeland. "The Importance of Dietary Carbohydrate in Human Evolution." The Quarterly Review of Biology 90, no. 3 (Sept. 2015): 251-68.

September 15, 2015

More Danger from Existing Artificial Stupidity than from Fictional Artificial Intelligence

(p. B6) In the kind of artificial intelligence, or A.I., that most people seem to worry about, computers decide people are a bad idea, so they kill them. That is undeniably bad for the human race, but it is a potentially smart move by the computers.

But the real worry, specialists in the field say, is a computer program rapidly overdoing a single task, with no context. A machine that makes paper clips proceeds unfettered, one example goes, and becomes so proficient that overnight we are drowning in paper clips.

In other words, something really dumb happens, at a global scale. As for those "Terminator" robots you tend to see on scary news stories about an A.I. apocalypse, forget it.

"What you should fear is a computer that is competent in one very narrow area, to a bad degree," said Max Tegmark, a professor of physics at the Massachusetts Institute of Technology and the president of the Future of Life Institute, a group dedicated to limiting the risks from A.I.

In late June, when a worker in Germany was killed by an assembly line robot, Mr. Tegmark said, "it was an example of a machine being stupid, not doing something mean but treating a person like a piece of metal."

. . .

"These doomsday scenarios confuse the science with remote philosophical problems about the mind and consciousness," Oren Etzioni, chief executive of the Allen Institute for Artificial Intelligence, a nonprofit that explores artificial intelligence, said. "If more people learned how to write software, they'd see how literal-minded these overgrown pencils we call computers actually are."

What accounts for the confusion? One big reason is the way computer scientists work. "The term 'A.I.' came about in the 1950s, when people thought machines that think were around the corner," Mr. Etzioni said. "Now we're stuck with it."

It is still a hallmark of the business. Google's advanced A.I. work is at a company it acquired called DeepMind. A pioneering company in the field was called Thinking Machines. Researchers are pursuing something called Deep Learning, another suggestion that we are birthing intelligence.

. . .

DeepMind made a program that mastered simple video games, but it never took the learning from one game into another. The 22 rungs of a neural net it climbs to figure out what is in a picture do not operate much like human image recognition and are still easily defeated.
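To make the "22 rungs" image concrete: an image-recognition network of that era is essentially a fixed stack of layers, each applying learned weights and a simple nonlinearity, and nothing in the stack carries its learning over to a different task. The toy forward pass below, with random weights and NumPy only, is my own illustration of that layered structure, not DeepMind's model or anyone's production code.

# Toy illustration of a 22-layer "stack of rungs." Real classifiers learn
# these weights for one task; the weights here are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
layers = [rng.standard_normal((64, 64)) * 0.1 for _ in range(22)]

def forward(x, layers):
    for w in layers:                 # climb the rungs one layer at a time
        x = np.maximum(w @ x, 0.0)   # linear step followed by a ReLU
    return x

features = rng.standard_normal(64)   # stand-in for a preprocessed image
print(forward(features, layers).shape)   # (64,)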

For the full story, see:

QUENTIN HARDY. "The Real Threat Computers Pose: Artificial Stupidity, Not Intelligence." The New York Times (Mon., JULY 13, 2015): B6.

(Note: ellipses added.)

(Note: the online version of the story has the date JULY 11, 2015, and has the title "The Real Threat Posed by Powerful Computers.")

August 31, 2015

Marie Curie Opposed Patents Because Women Could Not Own Property in France

(p. C6) Ms. Wirtén, a professor at Linköping University in Sweden, pays special attention to the decision not to patent and how it was treated in the founding texts of the Curie legend: Curie's 1923 biography of her husband, "Pierre Curie," and their daughter Eve's 1937 biography of her mother, "Madame Curie." The books each recount a conversation in which husband and wife agree that patenting their radium method would be contrary to the spirit of science.

It is not quite that simple. As Ms. Wirtén points out, the Curies derived a significant portion of their income from Pierre's patents on instruments. Various factors besides beneficence could have affected their decision not to extend this approach to their radium process. Intriguingly, the author suggests that the ineligibility of women to own property under French law might have shaped Curie's perspective. "Because the law excluded her from the status of person upon which these intellectual property rights depend," Ms. Wirtén writes, "the 'property' road was closed to Marie Curie. The persona road was not."

For the full review, see:

EVAN HEPLER-SMITH. "Scientific Saint; After scandals in France, Curie was embraced by American women as an intellectual icon." The Wall Street Journal (Sat., March 21, 2015): C6.

(Note: the online version of the review has the date March 20, 2015.)

The book under review is:

Wirtén, Eva Hemmungs. Making Marie Curie: Intellectual Property and Celebrity Culture in an Age of Information. Chicago: University of Chicago Press, 2015.

August 26, 2015

Pentagon Seeks Innovation from Private Start-Ups Since "They've Realized that the Old Model Wasn't Working Anymore"

(p. A3) SAN FRANCISCO -- A small group of high-ranking Pentagon officials made a quiet visit to Silicon Valley in December to solicit national security ideas from start-up firms with little or no history of working with the military.

The visit was made as part of an effort to find new ways to maintain a military advantage in an increasingly uncertain world.

In announcing its Defense Innovation Initiative in a speech in California in November, Chuck Hagel, then the defense secretary, mentioned examples of technologies like robotics, unmanned systems, miniaturization and 3-D printing as places to look for "game changing" technologies that would maintain military superiority.

"They've realized that the old model wasn't working anymore," said James Lewis, director of the Strategic Technologies Program at the Center for Strategic and International Studies in Washington. "They're really worried about America's capacity to innovate."

There is a precedent for the initiative. Startled by the Soviet launch of the Sputnik satellite in 1957, President Dwight D. Eisenhower created the Advanced Research Projects Agency, or ARPA, at the Pentagon to ensure that the United States would not be blindsided by technological advances.

Now, the Pentagon has decided that the nation needs more than ARPA, renamed the Defense Advanced Research Projects Agency, or Darpa, if it is to find new technologies to maintain American military superiority.

. . .

The Pentagon focused on smaller companies during its December visit; it did not, for example, visit Google. Mr. Welby acknowledged that Silicon Valley start-ups were not likely to be focused on the Pentagon as a customer. The military has captive suppliers and a long and complex sales cycle, and it is perceived as being a small market compared with the hundreds of millions of customers for consumer electronics products.

Mr. Welby has worked for three different Darpa directors, but he said that Pentagon officials now believed they had to look beyond their own advanced technology offices.

"The Darpa culture is about trying to understand high-risk technology," he said. "It's about big leaps." Today, however, the Pentagon needs to break out of what can be seen as a "not invented here" culture, he said.

"We're thinking about what the world is going to look like in 2030 and what tools the department will need in 20 or 30 years," he added.

For the full story, see:

JOHN MARKOFF. "Pentagon Shops in Silicon Valley for Game Changers." The New York Times (Fri., FEB. 27, 2015): A3.

(Note: ellipsis added.)

(Note: the online version of the story has the date FEB. 26, 2015.)

August 21, 2015

More Tech Stars Skip College, at Least for a While

(p. B1) The college dropout-turned-entrepreneur is a staple of Silicon Valley mythology. Steve Jobs, Bill Gates and Mark Zuckerberg all left college.

In their day, those founders were very unusual. But a lot has changed since 2005, when Mr. Zuckerberg left Harvard. The new crop of dropouts has grown up with the Internet and smartphones. The tools to create new technology are more accessible. The cost to start a company has plunged, while the options for raising money have multiplied.

Moreover, the path isn't as lonely.

. . .

Not long ago, dropping out of school to start a company was considered risky. For this generation, it is a badge of honor, evidence of ambition and focus. Very few dropouts become tycoons, but "failure" today often means going back to school or taking a six-figure job at a big tech company.

. . .

(p. B5) There are no hard numbers on the dropout trend, but applicants for the Thiel Fellowship tripled in the most recent year; the fellowship won't disclose numbers.

. . .

It has tapped 82 fellows in the past five years.

"I don't think college is always bad, but our society seems to think college is always good, for everyone, at any cost--and that is what we have to question," says Mr. Thiel, a co-founder of PayPal and an early investor in Facebook.

Of the 43 fellows in the initial classes of 2011 and 2012, 26 didn't return to school and continued to work on startups or independent projects. Five went to work for large tech firms, including a few through acquisitions. The remaining 12 went back to school.

Mr. Thiel says companies started by the fellows have raised $73 million, a record that he says has attracted additional applicants. He says fellows "learned far more than they would have in college."

For the full story, see:

DAISUKE WAKABAYASHI. "College Dropouts Thrive in Tech." The Wall Street Journal (Thurs., June 4, 2015): B1 & B10.

(Note: ellipses added. The phrase "the fellowship won't disclose numbers" was in the online, but not the print, version of the article.)

(Note: the online version of the article has the date June 3, 2015, and has the title "College Dropouts Thrive in Tech.")

August 14, 2015

Computer Programs "Lack the Flexibility of Human Thinking"

(p. A11) . . . let's not panic. "Superintelligent" machines won't be arriving soon. Computers today are good at narrow tasks carefully engineered by programmers, like balancing checkbooks and landing airplanes, but after five decades of research, they are still weak at anything that looks remotely like genuine human intelligence.

. . .

Even the best computer programs out there lack the flexibility of human thinking. A teenager can pick up a new videogame in an hour; your average computer program still can only do just the single task for which it was designed. (Some new technologies do slightly better, but they still struggle with any task that requires long-term planning.)

For the full commentary, see:

GARY MARCUS. "Artificial Intelligence Isn't a Threat--Yet; Superintelligent machines are still a long way off, but we need to prepare for their future rise." The Wall Street Journal (Sat., Dec. 13, 2014): A11.

(Note: ellipsis added.)

(Note: the online version of the commentary has the date Dec. 11, 2014.)

August 13, 2015

Cultural and Institutional Differences Between Europe and U.S. Keep Europe from Having a Silicon Valley

(p. B7) "They all want a Silicon Valley," Jacob Kirkegaard, a Danish economist and senior fellow at the Peterson Institute for International Economics, told me this week. "But none of them can match the scale and focus on the new and truly innovative technologies you have in the United States. Europe and the rest of the world are playing catch-up, to the great frustration of policy makers there."

Petra Moser, assistant professor of economics at Stanford and its Europe Center, who was born in Germany, agreed that "Europeans are worried."

"They're trying to recreate Silicon Valley in places like Munich, so far with little success," she said. "The institutional and cultural differences are still too great."

. . .

There is . . . little or no stigma in Silicon Valley to being fired; Steve Jobs himself was forced out of Apple. "American companies allow their employees to leave and try something else," Professor Moser said. "Then, if it works, great, the mother company acquires the start-up. If it doesn't, they hire them back. It's a great system. It allows people to experiment and try things. In Germany, you can't do that. People would hold it against you. They'd see it as disloyal. It's a very different ethic."

Europeans are also much less receptive to the kind of truly disruptive innovation represented by a Google or a Facebook, Mr. Kirkegaard said.

He cited the example of Uber, the ride-hailing service that despite its German-sounding name is a thoroughly American upstart. Uber has been greeted in Europe like the arrival of a virus, and its reception says a lot about the power of incumbent taxi operators.

"But it goes deeper than that," Mr. Kirkegaard said. "New Yorkers don't get all nostalgic about yellow cabs. In London, the black cab is seen as something that makes London what it is. People like it that way. Americans tend to act in a more rational and less emotional way about the goods and services they consume, because it's not tied up with their national and regional identities."

. . .

With its emphasis on early testing and sorting, the educational system in Europe tends to be very rigid. "If you don't do well at age 18, you're out," Professor Moser said. "That cuts out a lot of people who could do better but never get the chance. The person who does best at a test of rote memorization at age 17 may not be innovative at 23." She added that many of Europe's most enterprising students go to the United States to study and end up staying.

She is currently doing research into creativity. "The American education system is much more forgiving," Professor Moser said. "Students can catch up and go on to excel."

Even the vaunted European child-rearing, she believes, is too prescriptive. While she concedes there is as yet no hard scientific evidence to support her thesis, "European children may be better behaved, but American children may end up being more free to explore new things."

For the full story, see:

JAMES B. STEWART. "Common Sense; A Fearless Culture Fuels Tech." The New York Times (Fri., JUNE 19, 2015): B1 & B7.

(Note: ellipses added.)

(Note: the online version of the story has the date JUNE 18, 2015, and has the title "Common Sense; A Fearless Culture Fuels U.S. Tech Giants.")

August 6, 2015

Chimps Are Willing to Delay Gratification in Order to Receive Cooked Food

This is a big deal because cooking lets humans spend far less energy digesting food, which frees much more energy for use by the brain. One theory is that cooking technology is what eventually allowed humans to develop cognitive abilities superior to those of other primates.

(p. A3) . . . scientists from Harvard and Yale found that chimps have the patience and foresight to resist eating raw food and to place it in a device meant to appear, at least to the chimps, to cook it.

. . .

But they found that chimps would give up a raw slice of sweet potato in the hand for the prospect of a cooked slice of sweet potato a bit later. That kind of foresight and self-control is something any cook who has eaten too much raw cookie dough can admire.

The research grew out of the idea that cooking itself may have driven changes in human evolution, a hypothesis put forth by Richard Wrangham, an anthropologist at Harvard and several colleagues about 15 years ago in an article in Current Anthropology, and more recently in his book, "Catching Fire: How Cooking Made Us Human."

He argued that cooking may have begun something like two million years ago, even though hard evidence only dates back about one million years. For that to be true, some early ancestors, perhaps not much more advanced than chimps, had to grasp the whole concept of transforming the raw into the cooked.

Felix Warneken at Harvard and Alexandra G. Rosati, who is about to move from Yale to Harvard, both of whom study cognition, wanted to see if chimpanzees, which often serve as stand-ins for human ancestors, had the cognitive foundation that would prepare them to cook.

. . .

Dr. Rosati said the experiments showed not only that chimps had the patience for cooking, but that they had the "minimal causal understanding they would need" to make the leap to cooking.

For the full story, see:

JAMES GORMAN. "Chimpanzees Would Cook if Given Chance, Research Says." The New York Times (Weds., JUNE 3, 2015): A3.

(Note: ellipses added.)

(Note: the online version of the story has the date JUNE 2, 2015, and has the title "Chimpanzees Would Cook if Given the Chance, Research Says.")

The academic article discussed in the passages quoted above is:

Warneken, Felix, and Alexandra G. Rosati. "Cognitive Capacities for Cooking in Chimpanzees." Proceedings of the Royal Society of London B: Biological Sciences 282, no. 1809 (June 22, 2015).

August 5, 2015

Plant Breeders Use Old Sloppy "Natural" Process to Avoid Regulatory Stasis

(p. A11) What's in a name?

A lot, if the name is genetically modified organism, or G.M.O., which many people are dead set against. But what if scientists used the precise techniques of today's molecular biology to give back to plants genes that had long ago been bred out of them? And what if that process were called "rewilding?"

That is the idea being floated by a group at the University of Copenhagen, which is proposing the name for the process that would result if scientists took a gene or two from an ancient plant variety and melded it with more modern species to promote greater resistance to drought, for example.

"I consider this something worth discussing," said Michael B. Palmgren, a plant biologist at the Danish university who headed a group, including scientists, ethicists and lawyers, that is funded by the university and the Danish National Research Foundation.

They pondered the problem of fragile plants in organic farming, came up with the rewilding idea, and published their proposal Thursday in the journal Trends in Plant Science.

. . .

The idea of restoring long-lost genes to plants is not new, said Julian I. Schroeder, a plant researcher at the University of California, Davis. But, wary of the taint of genetic engineering, scientists have used traditional breeding methods to cross modern plants with ancient ones until they have the gene they want in a crop plant that needs it. The tedious process inevitably drags other genes along with the one that is targeted. But the older process is "natural," Dr. Schroeder said.

. . .

Researchers have previously crossbred wheat plants with traits found in ancient varieties, noted Maarten Van Ginkel, who headed such a program in Mexico at the International Maize and Wheat Improvement Center.

"We selected for disease resistance, drought tolerance," he said. "This method works but it has drawbacks. You prefer to move only the genes you want."

When Dr. Van Ginkel crossbred for traits, he did not look for the specific genes conferring those traits. But with the flood-resistant rice plants, researchers knew exactly which gene they wanted. Nonetheless, they crossbred and did not use precision breeding to alter the plants.

Asked why not, Dr. Schroeder had a simple answer -- a complex maze of regulations governing genetically engineered crops. With crossbreeding, he said, "the first varieties hit the fields in a couple of years."

And if the researchers had used precision breeding to get the gene into the rice?

"They would still be stuck in the regulatory process," Dr. Schroeder said.

For the full story, see:

GINA KOLATA. "A Proposal to Modify Plants Gives G.M.O. Debate New Life." The New York Times (Fri., MAY 29, 2015): A11.

(Note: ellipses added.)

(Note: the online version of the story has the date MAY 28, 2015.)

August 1, 2015

Little Progress Toward Complex Autonomous Robots

(p. A8) [In June 2015] . . . , the Defense Advanced Research Projects Agency, a Pentagon research arm, . . . [held] the final competition in its Robotics Challenge in Pomona, Calif. With $2 million in prize money for the robot that performs best in a series of rescue-oriented tasks in under an hour, the event . . . offer[ed] what engineers refer to as the "ground truth" -- a reality check on the state of the art in the field of mobile robotics.

A preview of their work suggests that nobody needs to worry about a Terminator creating havoc anytime soon. Given a year and a half to improve their machines, the roboticists, who shared details about their work in interviews before the contest in June, appear to have made limited progress.

. . .

"The extraordinary thing that has happened in the last five years is that we have seemed to make extraordininary progress in machine perception," said Gill Pratt, the Darpa program manager in charge of the Robotics Challenge.

Pattern recognition hardware and software has made it possible for computers to make dramatic progress in computer vision and speech understanding. In contrast, Dr. Pratt said, little headway has been made in "cognition," the higher-level humanlike processes required for robot planning and true autonomy. As a result, both in the Darpa contest and in the field of robotics more broadly, there has been a re-emphasis on the idea of human-machine partnerships.

"It is extremely important to remember that the Darpa Robotics Challenge is about a team of humans and machines working together," he said. "Without the person, these machines could hardly do anything at all."

In fact, the steep challenge in making progress toward mobile robots that can mimic human capabilities is causing robotics researchers worldwide to rethink their goals. Now, instead of trying to build completely autonomous robots, many researchers have begun to think instead of creating ensembles of humans and robots, an approach they describe as co-robots or "cloud robotics."

Ken Goldberg, a University of California, Berkeley, roboticist, has called on the computing world to drop its obsession with singularity, the much-ballyhooed time when computers are predicted to surpass their human designers. Rather, he has proposed a concept he calls "multiplicity," with diverse groups of humans and machines solving problems through collaboration.

For decades, artificial-intelligence researchers have noted that the simplest tasks for humans, such as reaching into a pocket to retrieve a quarter, are the most challenging for machines.

"The intuitive idea is that the more money you spend on a robot, the more autonomy you will be able to design into it," said Rodney Brooks, an M.I.T. roboticist and co-founder two early companies, iRobot and Rethink Robotics. "The fact is actually the opposite is true: The cheaper the robot, the more autonomy it has."

For example, iRobot's Roomba robot is autonomous, but the vacuuming task it performs by wandering around rooms is extremely simple. By contrast, the company's Packbot is more expensive, designed for defusing bombs, and must be teleoperated or controlled wirelessly by people.

For the full story, see:

JOHN MARKOFF. "A Reality Check for A.I." The New York Times (Tues., MAY 26, 2015): D2.

(Note: ellipses, and bracketed expressions, added. I corrected a misspelling of "extraordinary.")

(Note: the online version of the story has the date MAY 25, 2015, and has the title "Relax, the Terminator Is Far Away.")

July 28, 2015

Mobile Tech Drives Social Revolution in Saudi Arabia

(p. 6) RIYADH, Saudi Arabia -- Life for many young Saudis is an ecosystem of apps.

Lacking free speech, they debate on Twitter. Since they cannot flirt at the mall, they do it on WhatsApp and Snapchat.

Young women who cannot find jobs sell food or jewelry through Instagram. Since they are banned from driving, they get rides from car services like Uber and Careem. And in a country where shops close for five daily Muslim prayers, there are apps that issue a call to prayer from your pocket and calculate whether you can reach, say, the nearest Dunkin' Donuts before it shuts.

Confronted with an austere version of Islam and strict social codes that place sharp restrictions on public life, young Saudis are increasingly relying on social media to express and entertain themselves, earn money and meet friends and potential mates.

That reliance on technology -- to circumvent the religious police, and the prying eyes of relatives and neighbors -- has accelerated since it first began with the spread of satellite television in the 1990s. Saudis in their 30s (and older) recall the days of unsanctioned courtship via BlackBerry Messenger.

But the scale of today's social media boom is staggering, with many of the country's 18 million citizens wielding multiple smartphones and spending hours online each day. Digital has not replaced face-to-face interaction, but it has opened the door to much more direct and robust communication, especially in a society that sharply segregates men and women who are not related.

The spread of mobile technology is driving nothing short of a social revolution in the lives of young people. In this rich but conservative kingdom that bans movie theaters, YouTube and Internet streaming have provided an escape from the censors and a window to the outside world. A young Shariah judge, for example, confided that he had watched all five seasons of "Breaking Bad."

For the full story, see:

BEN HUBBARD. "Young Saudis Find Freedom on Smartphones." The New York Times, First Section (Sun., MAY 24, 2015): 6 & 11.

(Note: the online version of the story has the date MAY 22, 2015, and has the title "Young Saudis, Bound by Conservative Strictures, Find Freedom on Their Phones.")

July 26, 2015

"Nimble" Account of the Creative Destruction of the Music Industry

(p. C1) Stephen Witt's nimble new book, "How Music Got Free," is the richest explanation to date about how the arrival of the MP3 upended almost everything about how music is distributed, consumed and stored. It's a story you may think you know, but Mr. Witt brings fresh reporting to bear, and complicates things in terrific ways.

He pushes past Napster (Shawn Fanning, dorm room, lawsuits) and goes deep on the German audio engineers who, drawing on decades of research into how the ear works, spent years developing the MP3 only to see it nearly become the Betamax to another group's VHS.

. . .

(p. C6) Even better, he has found the man -- a manager at a CD factory in small-town North Carolina -- who over eight years leaked nearly 2,000 albums before their release, including some of the best-known rap albums of all time. He smuggled most of them out behind an oversized belt buckle before ripping them and putting them online.

Mr. Witt refers to this winsome if somewhat hapless manager, Dell Glover, as "the most fearsome digital pirate of them all."

. . .

Into these two narratives Mr. Witt inserts a third, the story of Doug Morris, who ran the Universal Music Group from 1995 to 2011. At some points you wonder if Mr. Morris has been introduced just so the author can have sick fun with him.

The German inventors and Mr. Glover operate as if they unwittingly have voodoo dolls of this man. Every time they make an advance, and prick the music industry, there's a jump to Mr. Morris for a reaction shot, screaming in his corner office.

. . .

Mr. Witt covers a lot of terrain in "How Music Got Free" without ever becoming bogged down in one place for long. He is knowledgeable about intellectual property issues. In finding his reporting threads, he doesn't miss the big picture: He gives us a loge seat to the entire digital music revolution.

He is especially good on the arrival of iTunes and the iPod.

For the full review, see:

DWIGHT GARNER. "Books of The Times; That Download Has a Back Story." The New York Times (Tues., JUNE 16, 2015): C1 & C6.

(Note: ellipses added.)

(Note: the online version of the review has the date JUNE 15, 2015, and has the title "Books of The Times; Review: In 'How Music Got Free,' Stephen Witt Details an Industry Sea Change.")

The book under review is:

Witt, Stephen. How Music Got Free: The End of an Industry, the Turn of the Century, and the Patient Zero of Piracy. New York: Viking, 2015.

June 6, 2015

Science Fiction Creates "False Sense of Conflict between Humans and Machines"

(p. R4) "I think the development of full artificial intelligence could spell the end of the human race," astrophysicist Stephen Hawking told the BBC. Tesla founder Elon Musk called AI "our biggest existential threat." Former Microsoft Chief Executive Bill Gates has voiced his agreement.

. . .

Taking part in the discussion [is] . . .; Guruduth S. Banavar, vice president of cognitive computing at IBM's Thomas J. Watson Research Center; . . .

. . .

WSJ: Does AI pose a threat to humanity?

MR. BANAVAR: Fueled by science-fiction novels and movies, popular treatment of this topic far too often has created a false sense of conflict between humans and machines. "Intelligent machines" tend to be great at tasks that humans are not so good at, such as sifting through vast data. Conversely, machines are pretty bad at things that humans are excellent at, such as common-sense reasoning, asking brilliant questions and thinking out of the box. The combination of human and machine, which we consider the foundation of cognitive computing, is truly revolutionizing how we solve complex problems in every field.

. . .

(p. R5) WSJ: Some experts believe that AI is already taking jobs away from people. Do you agree?

. . .

MR. BANAVAR: From time immemorial, we have built tools to help us do things we can't do. Each generation of tools has made us rethink the nature and types of jobs. Productivity goes up, professions are redefined, new professions are created and some professions become obsolete. Cognitive systems, which can enhance and scale the capabilities of our minds, have the potential to be even more transformative.

The key question will be how to build institutions to quickly train professionals to exploit cognitive systems as their assistants. Once learned, these skills will make every individual a better professional, and this will set a new bar for the nature of expertise.

For the full interview, see:

TED GREENWALD, interviewer. "Does Artificial Intelligence Pose a Threat?" The Wall Street Journal (Mon., May 11, 2015): R4-R5.

(Note: ellipses, and bracketed word, added; bold in original online version.)

(Note: the online version of the interview has the date May 10, 2015.)

May 9, 2015

Incandescents Better than LEDs at Allowing a Good Night's Sleep

(p. D6) Studies have shown that such light, especially from the blue part of the spectrum, inhibits the body's production of melatonin, a hormone that helps people fall asleep.

. . .

Devices such as smartphones and tablets are often illuminated by light-emitting diodes, or LEDs, that tend to emit more blue light than incandescent products.

For the full story, see:

KATE GALBRAITH. "WIRED WELL; Can Orange Glasses Help You Sleep Better?" The New York Times (Tues., APRIL 7, 2015): D6.

(Note: ellipsis added.)

(Note: the online version of the story has the title "WIRED WELL; Can Orange Glasses Help You Sleep Better?")

April 25, 2015

Lincoln Defended Innovative Rail Against Incumbent Steam

(p. A15) "Lincoln's Greatest Case" convincingly shows that 1857 was a watershed year for the moral and political questions surrounding slavery's expansion to the west, something that Jefferson Davis's preferred railroad route would have facilitated. Mr. McGinty's discussion of Lincoln's philosophy and the career-making speeches he would develop in the late 1850s allows us to see the transportation disputes in light of the political and cultural dynamics that would lead to the Civil War. The book is also a case study of discomfort with new technology--and the futility of using a tort suit to prevent the adoption of inevitable innovation.

The book ends on an elegiac note, with steamboats making their inevitable passage into the mists of history. The rails, which could operate year-round through paths determined by man, not nature, would reign supreme, thanks in part to the efforts of a technophile future president.

For the full review, see:

MARGARET A. LITTLE. "BOOKSHELF; When Steam Was King; A dispute over a fiery collision pitted steamboats against railroads and the North against the South. Lincoln defended the rail." The Wall Street Journal (Mon., Feb. 23, 2015): A15.

(Note: the online version of the review has the date Feb. 22, 2015, and has the title "BOOKSHELF; Technology's Great Liberator; A dispute over a fiery collision pitted steamboats against railroads and the North against the South. Lincoln defended the rail.")

The book under review is:

McGinty, Brian. Lincoln's Greatest Case: The River, the Bridge, and the Making of America. New York: Liveright Publishing Corp., 2015.

April 15, 2015

New Evidence on the Antikythera Mechanism

The Antikythera Mechanism was recovered in about 1901 and is believed to date from about 200 BC. Its complicated gearing is believed to have been used to generate calendars or predict astronomical events. The technology never spread to benefit ordinary people; it was forgotten, and comparably complex geared mechanisms had to be re-invented many centuries later.

The Antikythera Mechanism raises a question: how is it that technologies with the potential to benefit humankind can fail to be adopted? The causes of technology adoption are an important issue for economic growth.

(p. D3) A riddle for the ages may be a small step closer to a solution: Who made the famed Antikythera Mechanism, the astronomical calculator that was raised from an ancient shipwreck near Crete in 1901?

. . .

. . . a new analysis of the dial used to predict eclipses, which is set on the back of the mechanism, provides . . . another clue to one of history's most intriguing puzzles. Christián C. Carman, a science historian at the National University of Quilmes in Argentina, and James Evans, a physicist at the University of Puget Sound in Washington, suggest that the calendar of the mysterious device began in 205 B.C., just seven years after Archimedes died.
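For readers unfamiliar with the device, the eclipse dial on the back is generally described as tracking the Saros cycle, the 223-lunar-month interval after which similar eclipses recur. The snippet below uses standard modern values for the lunar month to show the kind of cycle the gearing encoded; it is an illustration added here, not part of the Carman and Evans analysis.

# The eclipse dial is generally described as tracking the Saros cycle:
# similar eclipses recur after 223 synodic (new-moon-to-new-moon) months.
SYNODIC_MONTH_DAYS = 29.530589   # mean length of a lunar month, in days
SAROS_MONTHS = 223

saros_days = SAROS_MONTHS * SYNODIC_MONTH_DAYS
years, leftover_days = divmod(saros_days, 365.25)
print(f"Saros cycle: {saros_days:.1f} days "
      f"(about {int(years)} years and {leftover_days:.1f} days)")
# prints about 6585.3 days, roughly 18 years and 11 days: given one eclipse
# date, the dial points to when a similar eclipse should come around again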

. . .

Starting with the ways the device's eclipse patterns fit Babylonian eclipse records, the two scientists used a process of elimination to reach a conclusion that the "epoch date," or starting point, of the Antikythera Mechanism's calendar was 50 years to a century earlier than had been generally believed.

. . .

. . . Archimedes was killed by a Roman soldier in 212 B.C., while the commercial grain ship carrying the mechanism is believed to have sunk sometime between 85 and 60 B.C. The new finding suggests the device may have been old at the time of the shipwreck, but the connection to Archimedes now seems even less likely.

An inscription on a small dial used to date the Olympic Games refers to an athletic competition that was held in Rhodes, according to research by Paul Iversen, a Greek scholar at Case Western Reserve University.

"If we were all taking bets about where it was made, I think I would bet what most people would bet, in Rhodes," said Alexander Jones, a specialist in the history of ancient mathematical sciences at New York University.

For the full story, see:

JOHN MARKOFF. "On the Trail of an Ancient Mystery." The New York Times (Tues., NOV. 25, 2014): D3.

(Note: ellipses added.)

(Note: the online version of the story has the date NOV. 24, 2014.)

March 23, 2015

How Air Conditioning Can Improve Metabolism

(p. 14) Sleep is essential for good health, as we all know. But a new study hints that there may be an easy but unrealized way to augment its virtues: lower the thermostat. Cooler bedrooms could subtly transform a person's stores of brown fat -- what has lately come to be thought of as "good fat" -- and consequently alter energy expenditure and metabolic health, even into daylight hours.

. . .

"These were all healthy young men to start with," . . . [senior author Francesco S. Celi] says, "but just by sleeping in a colder room, they gained metabolic advantages" that could, over time, he says, lessen their risk for diabetes and other metabolic problems. The men also burned a few more calories throughout the day when their bedroom was chillier (although not enough to result in weight loss after four weeks).

. . .

The message of these findings, Celi says, is that you can almost effortlessly tweak your metabolic health by turning down the bedroom thermostat a few degrees. His own bedroom is moderately chilled, as is his office -- which has an added benefit: It "keeps meetings short."

For the full story, see:

GRETCHEN REYNOLDS. "Let's Cool It in the Bedroom." The New York Times Magazine (Sun., JULY 20, 2014): 14.

(Note: ellipses added.)

(Note: the online version of the story has the date JULY 17, 2014.)

The academic paper discussed above is:

Lee, Paul, Sheila Smith, Joyce Linderman, Amber B. Courville, Robert J. Brychta, William Dieckmann, Charlotte D. Werner, Kong Y. Chen, and Francesco S. Celi. "Temperature-Acclimated Brown Adipose Tissue Modulates Insulin Sensitivity in Humans." Diabetes 63, no. 11 (Nov. 2014): 3686-98.

March 14, 2015

Serendipitous Discovery that Titanium Fuses with Bone, Leads to Implants

(p. 24) Implants have been a major advance in dentistry, liberating millions of elderly people from painful, ill-fitting dentures, a diet of soft foods and the ignominy of a sneeze that sends false teeth flying out of the mouth. But addressing those problems was not Dr. Branemark's initial intent.

At the start of his career, he was studying how blood flow affects bone healing.

In 1952, he and his team put optical devices encased in titanium into the lower legs of rabbits in order to study the healing process. When the research period ended and they went to remove the devices, they discovered to their surprise that the titanium had fused into the bone and could not be removed.

Dr. Branemark called the process "osseointegration," and his research took a whole new direction as he realized that if the body could tolerate the long-term presence of titanium, the metal could be used to create an anchor for artificial teeth.

. . .

. . . , Dr. Branemark's innovation was poorly received. After Dr. Branemark gave a lecture on his work in 1969, Dr. Albrektsson recalled, one of the senior academics of Swedish dentistry rose and referred to an article in Reader's Digest describing Dr. Branemark's research, adding, "This may prove to be a popular article, but I simply do not trust people who publish themselves in Reader's Digest."

As it happened, that senior academic was well known to the Swedish public for recommending a particular brand of toothpick. So Dr. Branemark immediately rose and struck back, saying, "And I don't trust people who advertise themselves on the back of boxes of toothpicks."

For the full story, see:

TAMAR LEWIN. "Per-Ingvar Branemark, Dental Innovator, Dies at 85." The New York Times, First Section (Sun., DEC. 28, 2014): 24.

(Note: ellipses are added.)

(Note: the online version of the story has the date JAN. 27, 2015.)

February 17, 2015

Congress Appropriates Funds to Test Concussion Theory of Rain

(p. 190) the first century A.D., when the Greek moralist Plutarch came up with the notion that rain followed military battles. Napoleon believed as much and fired cannons and guns at the sky to muddy up the ground between him and his attackers. Civil War veterans who wallowed in cold slop believed that ceaseless, close-range artillery fire had opened up the skies. In the late 1890s, as the first nesters started to dig their toeholds on the dry side of the one hundredth meridian, Congress had appropriated money to test the concussion theory in Texas. The tests were done by a man named Dyrenforth. He tried mightily, with government auditors looking over (p. 191) his shoulder, but Dyrenforth could not force a drop from the hot skies of Texas. From then on, he was called "Dry-Henceforth."

Government-sponsored failure didn't stop others from trying. A man who called himself "the moisture accelerator," Charles M. Hatfield, roamed the plains around the turn of the century. A Colonel Sanders of rainmaking, Hatfield had a secret mixture of ingredients that could be sent to the sky by machine. In the age before the widespread use of the telephone, it was hard to catch up with the moisture accelerator after he had fleeced a town and moved on.


Egan, Timothy. The Worst Hard Time: The Untold Story of Those Who Survived the Great American Dust Bowl. Boston: Houghton Mifflin, 2006.

February 11, 2015

Ways Technology May Decrease Inequality

(p. 7) As the previous generation retires from the work force, many more people will have grown up with intimate knowledge of computers. And over time, it may become easier to work with computers just by talking to them. As computer-human interfaces become simpler and easier to manage, that may raise the relative return to less-skilled labor.

The future may also extend a growing category of employment, namely workers who team up with smart robots that require human assistance. Perhaps a smart robot will perform some of the current functions of a factory worker, while the human companion will do what the robot cannot, such as deal with a system breakdown or call a supervisor. Such jobs would require versatility and flexible reasoning, a bit like some of the old manufacturing jobs, but not necessarily a lot of high-powered technical training, again because of the greater ease of the human-computer interface. That too could raise the returns to many relatively unskilled workers.

For the full commentary, see:

TYLER COWEN. "TheUpshot; Economic View; The Technological Fix to Inequality." The New York Times, SundayBusiness Section (Sun., DEC. 7, 2014): 7.

(Note: the online version of the commentary has the date DEC. 6, 2014, and has the title "TheUpshot; Economic View; How Technology Could Help Fight Income Inequality." )

January 22, 2015

As with Airplanes, Lives Must Be Risked to Achieve Routine Safety in Spaceships

(p. A21) SEATTLE -- ONE clear winter day in 1909, in Hampshire, England, a young man named Geoffrey de Havilland took off in a twin-propeller motorized flying machine of his own design, built of wood, piano wire and stiff linen hand-stitched by his wife. The launch was flawless, and soon he had an exhilarating sensation of climbing almost straight upward toward the brilliant blue sky. But he soon realized he was in terrible trouble.

The angle of ascent was unsustainable, and moments later de Havilland's experimental plane crashed, breaking apart into a tangled mass of shards, splinters and torn fabric, lethal detritus that could easily have killed him even if the impact of smashing into the ground did not. Somehow, he survived and Sir Geoffrey -- he was ultimately knighted as one of the world's great aviation pioneers -- went on to build an astonishing array of military and civilian aircraft, including the world's first jet airliner, the de Havilland Comet.

I thought immediately of de Havilland on Friday when I heard that Virgin Galactic's SpaceShipTwo, a rocket-powered vehicle designed to take well-heeled tourists to the edge of space, had crashed on a flight over the Mojave Desert, killing one test pilot and seriously injuring the other.

. . .

Certainly the Wright brothers and others like de Havilland were involved in what we now view as an epic quest, but many experts of the day were certain that flight, however interesting, was destined to be not much more than a rich man's hobby with no practical value.

"The public has greatly over-estimated the possibilities of the aeroplane, imagining that in another generation they will be able to fly over to London in a day," said a Harvard expert in 1908. "This is manifestly impossible." Two other professors patiently explained that while laymen might think that "because a machine will carry two people another may be constructed that will carry a dozen," in fact "those who make this contention do not understand the theory of weight sustentation in the air."

. . .

There will be tragedies like the crash of SpaceShipTwo and nonlethal setbacks such as the fiery explosion, also last week, of a remote-controlled rocket intended for a resupply mission to the International Space Station. There will be debates about how to improve regulation without stifling innovation. Some will say private industry can't do the job -- though it's not as if the NASA-sponsored Apollo or space shuttle missions went off without a hitch (far from it, sadly).

But at the heart of the enterprise there will always be obsessives like Sir Geoffrey, who forged ahead with his life's work of building airplanes despite his own crash and, incredibly, the deaths of two of his three sons while piloting de Havilland aircraft, one in an attempt to break the sound barrier. Getting to routine safety aloft claimed many lives along the way, and a hundred years from now people will agree that in that regard, at least, spaceships are no different from airplanes.

For the full commentary, see:

SAM HOWE VERHOVEK. "Not a Flight of Fancy." The New York Times (Tues., NOV. 4, 2014): A21.

(Note: ellipses added.)

(Note: the online version of the commentary has the date NOV. 3, 2014.)

January 19, 2015

Leading Computability Expert Says Humans Can Do What Computers Cannot

(p. B4) What does Turing's research tell us?

"There is some scientific basis for the view that humans are doing something that a machine isn't doing--and that we don't even want our machine to do," says S. Barry Cooper, a mathematician at Leeds and the foremost scholar of Turing's work.

The math behind this is deep, but here's the short version: Humans seem to be able to decide the validity of statements that should stump us, were we strictly computers as Turing described them. And since all modern computers are of the sort Turing described, well, it seems that we've won the race against the machines before it's even begun.

. . .

The future of technology isn't about replacing humans with machines, says Prof. Cooper--it's about figuring out the most productive way for the two to collaborate. In a real and inescapable way, our machines need us just as much as we need them.
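
An illustrative aside, not part of the quoted commentary: the "deep math" referred to above is the family of undecidability results Turing proved, of which the halting problem is the standard example, the proof that no program can decide, for every program and input, whether that program eventually halts. A minimal sketch of the diagonal argument, written here in Python with hypothetical names (no real halts() decider can actually be written), runs as follows:

    # Assume, for contradiction, a decider that always answers correctly
    # whether program(data) eventually halts. No such function can actually
    # be implemented; the body is left as a placeholder.
    def halts(program, data):
        ...

    def diagonal(program):
        # Do the opposite of whatever the decider predicts for this input.
        if halts(program, program):
            while True:      # loop forever if the decider says "halts"
                pass
        else:
            return "halted"  # halt at once if the decider says "loops"

    # Applying diagonal to itself yields the contradiction: diagonal(diagonal)
    # halts exactly when halts() says it does not, so no such decider exists.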

For the full commentary, see:

CHRISTOPHER MIMS. "KEYWORDS; Why Humans Needn't Fear the Machines All Around Us; Turing's Heirs Realize a Basic Truth: The Machines We Create Are Not, Indeed Cannot Be, Replacements for Humans." The Wall Street Journal (Tues., DEC. 1, 2014): B4.

(Note: ellipsis added.)

(Note: the online version of the commentary has the date Nov. 30, 2014, and has the title "KEYWORDS; Why We Needn't Fear the Machines; A Basic Truth: Computers Can't Be Replacements for Humans.")

One of the major books by the Turing and computability expert quoted in the passages above, is:

Cooper, S. Barry. Computability Theory, Chapman Hall/CRC Mathematics Series. Boca Raton, Florida: Chapman and Hall/CRC Mathematics, 2003.

November 30, 2014

Esther Dyson Sees a Lot of Silicon Valley as Just Motivated to Make Money

(p. C11) The U.S. Commerce Department recently said that it plans to relinquish its oversight of Icann, handing that task to an international body of some kind. The details are still being worked out, but Ms. Dyson hopes that governments won't be the new regulators. . . .

For now, she thinks there are many Silicon Valley Internet companies with inflated market values. "There is the desire to make money that motivates a lot of that in Silicon Valley, and yes, I think it's totally a bubble," she says. "It's not like the last bubble in that there are a lot of real companies there [now], but there are a lot of unreal companies and...many of them will disappear." She thinks too many people are starting similar companies. "You have people being CEOs of teeny little things who would be much better as marketing managers of someone else's company," she says.

And though her work often takes her to California, she's happy to stay in New York. These days, she finds Silicon Valley "very fashionable," she says, "and I don't really like fashion."

For the full interview, see:

ALEXANDRA WOLFE, interviewer. "WEEKEND CONFIDENTIAL; Esther Dyson's Healthy Investments; The investor is hoping to produce better health through technology with a new nonprofit." The Wall Street Journal (Sat., May 3, 2014): C11.

(Note: first ellipsis added; second ellipsis in original.)

(Note: the online version of the interview has the date May 2, 2014, and has the title "WEEKEND CONFIDENTIAL; Esther Dyson's Healthy Investments; The investor is hoping to produce better health through technology with a new nonprofit.")

November 18, 2014

Japanese Try to Sell the iPhone of Toilets in United States

(p. B8) TOKYO--Yoshiaki Fujimori wants to be the Steve Jobs of toilets.

Like iPhones, app-packed commodes are objects of desire in Mr. Fujimori's Japan. The lids lift automatically. The seats heat up. Built-in bidets make cleanup a breeze. Some of them even sync with users' smartphones via Bluetooth so that they can program their preferences and play their favorite music through speakers built into the bowl.

Three-quarters of Japanese homes contain such toilets, most of them made by one of two companies: Toto Ltd., Japan's largest maker of so-called sanitary ware, or Lixil Corp., where Mr. Fujimori is the chief executive.

Now Mr. Fujimori is leading a push to bring them to the great unwashed. In May, Lixil plans to add toilets with "integrated bidets" to the lineup of American Standard Brands, which Lixil acquired last year for $542 million, including debt.

. . .

Few people realized they needed smartphones until Apple's iPhone came along. So it will be in the U.S. with American Standard's new toilets, Mr. Fujimori said.

"Industry presents iPhone--industry presents shower toilet," Mr. Fujimori said in an interview at Lixil's headquarters in Tokyo. "We can create the same type of pattern."

. . .

Mr. Fujimori maintained that once American consumers try such toilets, they won't go back.

"This improves your standard of living," he said. "It doesn't hurt you. People like comfort, they like ease, they like automatic. And people like clean."

For the full story, see:

ERIC PFANNER and ATSUKO FUKASE. "Smart Toilets Arrive in U.S." The Wall Street Journal (Tues., May 27, 2014): B8.

(Note: ellipses added.)

(Note: the online version of the story has the date May 26, 2014.)

November 14, 2014

High Skill Foreign Workers Raise Wages for Native Workers

Source of graph: online version of the WSJ article quoted and cited below.

(p. A6) "A lot of people have the idea there is a fixed number of jobs," said . . . , Giovanni Peri of the University of California, Davis. "It's completely turned around."

Immigrants can boost the productivity of the overall economy, he said, "because then the pie grows and there are more jobs for other people as well and there's not a zero-sum trade-off between natives and immigrants."

Mr. Peri, along with co-authors Kevin Shih at UC Davis, and Chad Sparber at Colgate University, studied how wages for college- and noncollege-educated native workers shifted along with immigration. They found that a one-percentage-point increase in the share of workers in STEM fields raised wages for college-educated natives by seven to eight percentage points and wages of the noncollege-educated natives by three to four percentage points.

Mr. Peri said the research bolsters the case for raising, or even removing, the caps on H-1B visas, the program that regulates how many high-skilled foreign workers employers can bring into the country.

For the full story, see:

JOSH ZUMBRUN and MATT STILES. "Study: Skilled Foreign Workers a Boon to Pay." The Wall Street Journal (Fri., May 23, 2014): A6.

(Note: ellipsis added.)

(Note: the online version of the story has the date May 22, 2014, and has the title "Skilled Foreign Workers a Boon to Pay, Study Finds.")

The paper discussed in the passage quoted above, is:

Peri, Giovanni, Kevin Shih, and Chad Sparber. "Foreign Stem Workers and Native Wages and Employment in U.S. Cities." National Bureau of Economic Research, Inc, NBER Working Paper Number 20093, May 2014.

November 5, 2014

"Folkman Persisted in His Genuinely Original Thinking"

(p. 141) As detailed by Robert Cooke in his 2001 book Dr. Folkman's War, the successful answers to these basic questions took Folkman through diligent investigations punctuated by an astonishing series of chance observations and circumstances. Over decades, Folkman persisted in his genuinely original thinking. His concept was far in advance of technological and other scientific advances that would provide the methodology and basic knowledge essential to its proof, forcing him to await verification and to withstand ridicule, scorn, and vicious competition for grants. Looking back three decades later, Folkman would ruefully reflect: "I was too young to realize how much trouble was in store for a theory that could not be tested immediately."


Meyers, Morton A. Happy Accidents: Serendipity in Modern Medical Breakthroughs. New York: Arcade Publishing, 2007.

(Note: italics in original.)

October 9, 2014

Feds Allow Hollywood to Use Drones

(p. B1) LOS ANGELES -- The commercial use of drones in American skies took a leap forward on Thursday [Sept. 25, 2014] with the help of Hollywood.

The Federal Aviation Administration, responding to applications from seven filmmaking companies and pressure from the Motion Picture Association of America, said six of those companies could use camera-equipped drones on certain movie and television sets. Until now, the F.A.A. has not permitted commercial drone use except for extremely limited circumstances in wilderness areas of Alaska.

Put bluntly, this is the first time that companies in the United States will be able to legally use drones to fly over people.

The decision has implications for a broad range of industries including agriculture, energy, real estate, the news media and online retailing. "While the approval for Hollywood is very limited in scope, it's a message to everyone that this ball is rolling," said Greg Cirillo, chairman of the aviation practice at Wiley Rein, a law firm in Washington.

Michael P. Huerta, the administrator of the F.A.A., said at least 40 similar applications were pending from companies beyond Hollywood. One is Amazon, which wants permission to move forward with a drone-delivery service. Google has acknowledged "self-flying vehicle" tests in the Australian outback.

"Today's announcement is a significant milestone in broadening commercial use," Anthony R. Foxx, secretary of transportation, told reporters in a conference call.

For the full story, see:

BROOKS BARNES. "Drone Exemptions for Hollywood Pave the Way for Widespread Use." The New York Times (Fri., SEPT. 26, 2014): B1 & B7.

(Note: bracketed date added.)

(Note: the online version of the story has the date SEPT. 25, 2014.)

October 6, 2014

Shellshock Bug Shows Low Quality of Open Source Software

(p. B1) Long before the commercial success of the Internet, Brian J. Fox invented one of its most widely used tools.

In 1987, Mr. Fox, then a young programmer, wrote Bash, short for Bourne-Again Shell, a free piece of software that is now built into more than 70 percent of the machines that connect to the Internet. That includes servers, computers, routers, some mobile phones and even everyday items like refrigerators and cameras.

On Thursday [Sept. 25, 2014], security experts warned that Bash contained a particularly alarming software bug that could be used to take control of hundreds of millions of machines around the world, potentially including Macintosh computers and smartphones that use the Android operating system.

The bug, named "Shellshock," drew comparisons to the Heartbleed bug that was discovered in a crucial piece of software last spring.

But Shellshock could be a bigger threat. While Heartbleed could be used to do things like steal passwords from a server, Shellshock can be used to take over the entire machine. And Heartbleed went unnoticed for two years and affected an estimated 500,000 machines, but Shellshock was not discovered for 22 years.

. . .

Mr. Fox maintained Bash -- which serves as a sort of software interpreter for different commands from a user -- for five years before handing over the reins to Chet Ramey, a 49-year-old programmer who, for the last 22 years, has maintained the software as an unpaid hobby. That is, when he is not working at his day job as a senior technology architect at Case Western Reserve University in Ohio.

. . .

(p. B2) The mantra of open source was perhaps best articulated by Eric S. Raymond, one of the elders of the open-source movement, who wrote in 1997 that "given enough eyeballs, all bugs are shallow." But, in this case, Steven M. Bellovin, a computer science professor at Columbia University, said, those eyeballs are more consumed with new features than quality. "Quality takes work, design, review and testing and those are not nearly as much fun as coding," Mr. Bellovin said. "If the open-source community does not develop those skills, it's going to fall further behind in the quality race."
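
An aside that is not drawn from the article: the widely circulated public test for Shellshock exploited exactly the weakness described above, in that a vulnerable Bash would execute whatever command was smuggled in after a function definition stored in an environment variable. A minimal sketch of that check, written in Python and assuming a Unix system with Bash installed at /bin/bash (the variable name "probe" is illustrative):

    import subprocess

    # Sketch of the published Shellshock probe: pass Bash a function-style
    # environment variable whose trailing command should never run.
    # A patched Bash prints only "ok"; a vulnerable Bash also prints
    # "VULNERABLE", because it executes the code after the function body.
    result = subprocess.run(
        ["/bin/bash", "-c", "echo ok"],
        env={"probe": "() { :; }; echo VULNERABLE"},
        capture_output=True,
        text=True,
    )
    print(result.stdout, end="")

The snippet changes nothing on the system; it only reports what the installed Bash does with the crafted variable.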

For the full story, see:

NICOLE PERLROTH. "Flaw in Code Puts Millions At Big Risk." The New York Times (Fri., SEPT. 26, 2014): B1-B2.

(Note: ellipses, and bracketed date, added.)

(Note: the online version of the story has the date SEPT. 25, 2014, and has the title "Security Experts Expect 'Shellshock' Software Bug in Bash to Be Significant.")

August 4, 2014

Did Intel Succeed in Spite of, or Because of, Tension Between Noyce and Grove?

(p. C5) . . . , much more so than in earlier books on Intel and its principals, the embedded thread of "The Intel Trinity" is the dirty little secret few people outside of Intel knew: Andy Grove really didn't like Bob Noyce.

. . .

(p. C6) . . . there's the argument that one thing a startup needs is an inspiring, swashbuckling boss who lights up a room when he enters it and has the confidence to make anything he's selling seem much bigger and more important than it actually is. And Mr. Malone makes a compelling case that Noyce was the right man for the job in this phase of the company. "Bob Noyce's greatest gift, even more than his talent as a technical visionary," Mr. Malone writes, "was his ability to inspire people to believe in his dreams, in their own abilities, and to follow him on the greatest adventure of their professional lives."

. . .

Noyce hid from Mr. Grove, who was in charge of operations, the fact that Intel had a secret skunk works developing a microprocessor, a single general-purpose chip that would perform multiple functions--logic, calculation, memory and power control. Noyce had the man who was running it report directly to him rather than to Mr. Grove, even though Mr. Grove was his boss on the organizational chart. When Mr. Grove learned what was going on, he became furious, but like the good soldier he was, he snapped to attention and helped recruit a young engineer from Fairchild to be in charge of the project, which ultimately redefined the company.

. . .

Remarkably, none of this discord seemed to have much effect on the company's day-to-day operations. Mr. Malone even suggests that the dysfunction empowered Intel's take-no-prisoners warrior culture.

. . .

So while the humble, self-effacing Mr. Moore, who had his own time in the CEO's chair from 1975 to 1987, played out his role as Intel's big thinker, the brilliant visionary "who could see into the technological future better than anyone alive," Mr. Grove was the kick-ass enforcer. No excuses. For anything.

For the full review, see:

STEWART PINKERTON. "Made in America; A Born Leader, a Frustrated Martinet Built One of Silicon Valley's Giants." The Wall Street Journal (Sat., July 19, 2014): C5-C6.

(Note: ellipses added.)

(Note: the online version of the review has the date July 18, 2014, and has the title "Book Review: 'The Intel Trinity' by Michael S. Malone; A born leader, an ethereal genius and a tough taskmaster built the most important company on the planet.")

The book under review is:

Malone, Michael S. The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World's Most Important Company. New York: HarperCollins Publishers, 2014.

July 31, 2014

Early Cars Were Playthings of the Idle Rich


(p. C14) Mr. Parissien writes that Frenchman Nicolas Cugnot may well have built the first mechanical vehicle in 1769, a two-ton, steam-driven colossus that reportedly went out of control and crashed into a wall. It wasn't until 1885 that Karl Benz, the acknowledged father of the automobile, debuted the first gasoline-powered motorcar, in Mannheim, Germany. It carried passengers just slightly quicker than they could walk.

With the arrival of that breakthrough, however, the race was on for who could come up with a sturdier, faster, more reliable motor car. Many of the innovators' names are still familiar: Renault, Bentley and Daimler among them. Even piano makers Steinway & Sons tried their hand at building cars. Other companies appeared for a time and then vanished--Durant, Lanchester, Panhard and De Dion-Bouton--victims of bad guesses or bad timing. Much of Mr. Parissien's story is devoted to the personalities, and eccentricities, of the men who created what for many years amounted to a plaything of the idle rich. Italian luxury builder Ettore Bugatti refused to sell one of his cars to King Zog of Albania because "the man's table manners are beyond belief."

It is the despotic Henry Ford who looms large in automotive history, not only for the introduction of his Model T but for his revolutionary system of shoveling raw materials in one end of his half-mile long Rouge River, Mich., factory complex and sending "Tin Lizzies" out the other end.

For the full review, see:

PATRICK COOKE. "Book Review: 'The Life of the Automobile' by Steven Parissien; The history of cars, from playthings of the idle rich to emblems of the working man." The Wall Street Journal (Sat., May 24, 2014): C14.

(Note: the online version of the review has the date May 23, 2014, and has the title "Book Review: 'The Life of the Automobile' by Steven Parissien; The history of cars, from playthings of the idle rich to emblems of the working man.")

The book under review is:

Parissien, Steven. The Life of the Automobile: The Complete History of the Motor Car. New York: Thomas Dunne Books, 2014.

July 28, 2014

Entrepreneur Gutenberg's Press Creatively Destroyed the Jobs of Scribes

(p. 32) Poggio possessed . . . [a] gift that set him apart from virtually all the other book-hunting humanists. He was a superbly well-trained scribe, with exceptionally fine handwriting, great powers of concentration, and a high degree of accuracy. It is difficult for us, at this distance, to take in the significance of such qualities: our technologies for producing transcriptions, facsimiles, and copies have almost entirely erased what was once an important personal achievement. That importance began to decline, though not at all precipitously, even in Poggio's own lifetime, for by the 1430s a German entrepreneur, Johann Gutenberg, began experimenting with a new invention, movable type, which would revolutionize the reproduction and transmission of texts. By the century's end printers, especially the great Aldus in Venice, would print Latin texts in a typeface whose clarity and elegance remain unrivalled after five centuries. That typeface was based on the beautiful handwriting of Poggio and his humanist friends. What Poggio did by hand to produce a single copy would soon be done mechanically to produce hundreds.


Greenblatt, Stephen. The Swerve: How the World Became Modern. New York: W. W. Norton & Company, 2011.

(Note: ellipsis, and bracketed word, added.)

July 14, 2014

Forecasts of Mass Unemployment from Robots Were Wrong

(p. 215) Frank Levy and Richard J. Murnane consider the interaction between workers and machinery in "Dancing with Robots: Human Skills for Computerized Work." "On March 22, 1964, President Lyndon Johnson received a short, alarming memorandum from the Ad Hoc Committee on the Triple Revolution. The memo warned the president of threats to the nation beginning with the likelihood that computers would soon create mass unemployment: 'A new era of production has begun. Its principles of organization are as different from those of the industrial era as those of the industrial era were different from the agricultural. The cybernation revolution has been brought about by the combination of the computer and the automated self-regulating machine. This results in a system of almost unlimited productive capacity which requires progressively less human labor. Cybernation is already reorganizing the economic and social system to meet its own needs.' The memo was signed by luminaries including Nobel Prize winning chemist Linus Pauling, Scientific American publisher Gerard Piel, and economist Gunnar Myrdal (a future Nobel Prize winner). Nonetheless, its warning was only half right. There was no mass unemployment--since 1964 the economy has added 74 million jobs. But computers have changed the jobs that are available, the skills those jobs require, and the wages the jobs pay. For the foreseeable future, the challenge of "cybernation" is not mass unemployment but the need to educate many more young people for the jobs computers cannot do." Third Way, 2013, /publications/714/Dancing-With-Robots.pdf.


Taylor, Timothy. "Recommendations for Further Reading." Journal of Economic Perspectives 27, no. 4 (Fall 2013): 211-18.

(Note: italics in original.)

July 3, 2014

Rickenbacker Wasn't the Best Pilot or the Best Shot "but He Could Put More Holes in a Target that Was Shooting Back"


(p. C6) With his unpolished manners, Rickenbacker encountered a good deal of arrogance from the privileged sons of Harvard and Yale, but after he had downed his first five enemies, criticism ceased. About Rickenbacker's killer instinct his colleague Reed McKinley Chambers had this to say: "Eddie wasn't the best pilot in the world. He could not put as many holes in a target that was being towed as I could, but he could put more holes in a target that was shooting back at him than I could."

For the full review, see:

HENRIK BERING. "Daring Done Deliberately." The Wall Street Journal (Sat., May 31, 2014): C6.

(Note: the online version of the review has the date May 30, 2014, and has the title "Book Review: 'Enduring Courage' by John F. Ross.")

The book under review is:

Ross, John F. Enduring Courage: Ace Pilot Eddie Rickenbacker and the Dawn of the Age of Speed. New York: St Martin's Press, 2014.

June 27, 2014

Instead of 50 Silicon Valleys, Andreessen Sees 50 Kinds of Silicon Valley

"Marc Andreessen, co-founder of the first major web browser, Netscape, has a record for knowing what's coming next with technology." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B8) Mr. Andreessen said new valleys will eventually emerge. But they won't be Silicon Valley copycats.

Over the past couple of years, venture firms have invested in start-ups in Los Angeles, New York, Chicago and all over China. Los Angeles, for example, is home to Snapchat, Tinder, Whisper, Oculus VR and Beats, some of the big tech stories of the year. Mr. Andreessen said another hot place is Atlanta, the home of Georgia Tech.

But he offers a caveat.

"My personal view is that Silicon Valley will continue to take a disproportionate share of the No. 1 positions in great new markets, and I think that's just a reflection that the fact that the valley works as well as it does," Mr. Andreessen said.

There is a caveat to his caveat.

In Mr. Andreessen's view, there shouldn't be 50 Silicon Valleys. Instead, there should be 50 different kinds of Silicon Valley. For example, there could be a Biotech Valley, a Stem Cell Valley, a 3-D Printing Valley or a Drone Valley. As he noted, there are huge regulatory hurdles in many of these fields. If a city wanted to spur innovation around drones, for instance, it might have to remove any local legal barriers to flying unmanned aircraft.

For the full interview, see:

NICK BILTON. "DISRUPTIONS; Forecasting the Next Big Moves in Tech." The New York Times (Mon., MAY 19, 2014): B8.

(Note: the online version of the interview has the date MAY 18, 2014, and has the title "DISRUPTIONS; Marc Andreessen on the Future of Silicon Valley(s), and the Next Big Technology." )

May 29, 2014

In Bringing Us Electricity, Westinghouse Rejected the Precautionary Principle

(p. 180) The defensive position that Westinghouse found himself in is illustrated by the way he contradicted himself as he tried to defend overhead wires. The wires that were supposedly safe were also the same wires that he had to admit, yes, posed dangers, yes, but dangers of various kinds had to be accepted throughout the modern city. Westinghouse said, "If all things involving the use of power were to be prohibited because of the danger to life, then the cable cars, which have already killed and maimed a number of people, would have to be abolished." Say good-bye to trains, too, he added, because of accidents at road crossings.


Stross, Randall E. The Wizard of Menlo Park: How Thomas Alva Edison Invented the Modern World. New York: Crown Publishers, 2007.

May 25, 2014

Entrepreneurial Consumer J.P. Morgan "Handled Setbacks with Equanimity"

Schumpeter wrote that the entrepreneur is the one who overcomes obstacles to get the job done (1950, p. 132). Obstacles come in many forms. One of them is consumer resistance to change. So one key contributor to technological progress is the "entrepreneurial consumer," who is willing to invest in new, buggy, possibly dangerous technologies at an early stage. (Paul Nodskov, a student in my spring 2014 Economics of Technology seminar, suggested using the phrase "entrepreneurial consumer.")

Alexis de Tocqueville observed that in contrast to Europeans, Americans were "restless in the midst of their prosperity" (2000 [first published 1835], Ch. 13). Perhaps even that early, America had more entrepreneurial consumers?

(p. 131) Morgan prized being ahead of everyone else, and the next year was concerned that his plant was already less than state of the art, a suspicion that was confirmed when he persuaded Edison to send Edward Johnson to the house for an evaluation. Johnson was instructed to upgrade the equipment and also to devise a way to provide an electric light that would sit on Morgan's desk in his library. At a time when the very concept of an electrical outlet and detachable electrical appliances had yet to appear, this posed a significant challenge. Johnson's solution was to run wires beneath the floor to metal plates that were installed in different places beneath the rugs. One of the legs of the desk was equipped with sharp metal prongs, designed to make contact with one of the plates when moved about the room.

In conception, it was clever; in implementation, it fell short of ideal. On the first evening when the light was turned on, there was a flash, followed by a fire that quickly engulfed the desk and spread across the rug before being put out. When Johnson was summoned to the house the next morning, he was shown into the library, where charred debris was piled in a heap. He expected that when Morgan appeared, he would angrily announce that the services of Edison Electric were no longer needed.

(p. 132) "Well?" Morgan stood in the doorway, with Mrs. Morgan standing behind him, signaling Johnson with a finger across her lips not to launch into elaborate explanations. Johnson cast a doleful eye at the disaster in the room and remained silent.

"Well, what are you going to do about it?" Morgan asked. Johnson said the fault was his own and that he would personally reinstall everything, ensuring that it would be done properly.

"All right. See that you do." Morgan turned and left. The eager purchaser of first-generation technology handled setbacks with equanimity. "I hope that the Edison Company appreciates the value of my house as an experimental station," he would later say. A new installation with second-generation equipment worked well, and Morgan held a reception for four hundred guests to show off his electric lights. The event led some guests to place their own orders for similar installations. Morgan also donated entire systems to St. George's Church and to a private school, dispatching Johnson to oversee the installation as a surprise to the headmistress. The family biographer compared Morgan's gifts of electrical power plants to his sending friends baskets of choice fruit.


Stross, Randall E. The Wizard of Menlo Park: How Thomas Alva Edison Invented the Modern World. New York: Crown Publishers, 2007.

Schumpeter's book is:

Schumpeter, Joseph A. Capitalism, Socialism and Democracy. 3rd ed. New York: Harper and Row, 1950.

The other book I mention, is:

de Tocqueville, Alexis. Democracy in America. Chicago: University of Chicago Press, 2000 [first published in two volumes in 1835 and 1840].

May 21, 2014

Edison Genuinely Believed that AC Was More Dangerous than DC

(p. 174) In Edison's view, . . . , Westinghouse did not pose a serious threat in the power-and-light business because he used the relatively more dangerous alternating current, certain to kill one of his own customers within six months.

Edison's conviction that direct current was less dangerous than alternating current was based on hunch, however, not empirical scientific research. He, like others at the time, focused solely on voltage (the force that pushes electricity through a wire) without paying attention to amperage (the rate of flow of electricity), and thought it would be best to stay at 1,200 volts or less. Even he was not certain that his own system was completely safe--after all, he had elected to place wires in underground conduits, which was more expensive than stringing wires overhead but reduced the likelihood of electrical current touching a passerby. Burying the wires could not give him complete peace of mind, however. Privately, he told Edward Johnson that "we must look out for crosses [i.e., short-circuited wires] for if we ever kill a customer it would be a bad blow to the business."
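
An aside not found in Stross's text: the reason amperage rather than voltage alone determines danger can be stated in one line of Ohm's law, where V is the voltage across a conducting path, R is the resistance of that path, and I is the current that actually flows:

    I = V / R

The same voltage drives far more current through a low-resistance path than through a high-resistance one, and it is the current passing through the body, together with its path and duration, that does the physiological harm, which is why a system's nominal voltage is, by itself, an incomplete guide to how dangerous it is.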


Stross, Randall E. The Wizard of Menlo Park: How Thomas Alva Edison Invented the Modern World. New York: Crown Publishers, 2007.

(Note: ellipsis added, bracketed words in original.)

May 14, 2014

Delta Overcomes Obstacles that Ground Other Airlines

Source of graphic: online version of the WSJ article quoted and cited below.

Cancellations due to mechanical failures, pilot illness, and government regulations are often announced as though they were acts of God, outside the airline's control and beyond its blame. But airlines can take actions, and improve processes, to reduce the frequency and consequences of such cancellations. In airlines, as in other firms, there is no sharp line between what can and what cannot be brought under the firm's control.

(p. D3) Atlanta

The crew of Delta Air Lines Flight 55 last Thursday couldn't legally fly from Lagos, Nigeria, to Atlanta unless they waited a day due to new limits on how much pilots can fly in a rolling 28-day period. The trip would have to be canceled.

Instead, Delta headquarters told the captain to fly to San Juan, Puerto Rico, which they could reach within their duty limits. There, two new pilots would be waiting to take the Boeing 767 on to Atlanta. The plane arrived in San Juan at 2:44 a.m., quickly took on fuel and pilots, and landed in Atlanta only 40 minutes late.

The episode, unorthodox in the airline industry, illustrates the fanaticism Delta now has for avoiding cancellations. Last year, Delta canceled just 0.3% of its flights, according to a flight-tracking service. That was twice as good as the next-best airlines, Southwest and Alaska, and five times better than the industry average of 1.7%.

. . .

Managers in Delta operations centers move planes, crews and parts around hourly trying to avoid canceling flights. How well an airline maintains its fleet and how smartly it stashes spare parts and planes at airports affect whether your flight goes or not.

Delta thinks it has come up with new analytical software and instruments that can help monitor the health of airplanes and predict which parts will soon fail. Empty planes are ferried to replace crippled jets rather than waiting for overnight repairs.

Mechanics developed a vibration monitor to install on cooling fans for cockpit instruments. A plane can't be sent out on a new trip with a broken fan.

Now when vibration starts to increase, indicating that a bearing may be wearing down and getting close to failing, a new fan is swapped in. The wobbly fan goes to the shop for new bearings. That has reduced canceled flights.

So has spending $2 million to have spare starters for Boeing 767 engines at all 767 stations abroad. Starters last about five years. While each plane has two and both engines can be started with one, you can't send a plane out on a long trip over oceans with only one working.

For the full story, see:

SCOTT MCCARTNEY. "THE MIDDLE SEAT; A World Where Flights Aren't Canceled; How Smartly an Airline Stashes Spare Parts and Planes at Airports Affects Whether or Not Your Flight Takes Off." The Wall Street Journal (Thurs., April 3, 2014): D3.

(Note: ellipsis added.)

(Note: the online version of the story was updated April 2, 2014, and has the title "THE MIDDLE SEAT; A World Where Flights Aren't Canceled; Inside Delta's new strategies to avoid stranding fliers.")

May 13, 2014

In the End Edison Said "I Am Not Business Man Enough to Spend Time" in the Electricity Business

(p. 186) In early 1892, the deal was done: Edison General Electric and Thomson-Houston merged as nominal equals. The organization chart, however, reflected a different understanding among the principals. Thomson-Houston's chief executive, Charles Coffin, became the new head and other Thomson-Houston executives filled out the other positions. Insull was the only manager from the Edison side invited to stay, which he did only briefly. From the outside, it appeared that Thomas (p. 187) Edison and his coterie had arranged the combination from a position of abject surrender. Edison did not want this to be the impression left in the public mind, however. When the press asked him about the announcement, he said he had been one of the first to urge the merger. This was not close to the truth, and is especially amusing when placed in juxtaposition to Alfred Tate's account of the moment when Tate, hearing news of the merger first, had been the one to convey the news to Edison.

I always have regretted the abruptness with which I broke the news to Edison but I am not sure that a milder manner and less precipitate delivery would have cushioned the shock. I never before had seen him change color. His complexion naturally was pale, a clear healthy paleness, but following my announcement it turned as white as his collar.

"Send for Insull," was all he said as he left me standing in his library.

Having collected himself before meeting with the reporters, Edison could say with sincerity that he was too busy to "waste my time" on the electric light. For the past three years, since he first realized that his direct-current system would ultimately be driven to the margins by alternating current, he had been carting his affections elsewhere. The occasion of the merger did shake him into a rare disclosure of personal shortcoming: He allowed that "I am not business man enough to spend time" in the power-and-light business.


Stross, Randall E. The Wizard of Menlo Park: How Thomas Alva Edison Invented the Modern World. New York: Crown Publishers, 2007.

April 26, 2014

One Way to Appreciate All We Take for Granted


(p. 8) Over the past generation or two we've gone from being producers and tinkerers to consumers. As a result, I think we feel a sense of disconnect between our modern existence and the underlying processes that support our lives. Who has any real understanding of where their last meal came from or how the objects in their pockets were dug out of the earth and transformed into useful materials? What would we do if, in some science-fiction scenario, a global catastrophe collapsed civilization and we were members of a small society of survivors?

My research has to do with what factors planets need to support life. Recently, I've been wondering what factors are needed to support our modern civilization. What key principles of science and technology would be necessary to rebuild our world from scratch?

. . .

. . . there are the many materials society requires: How do you transform base substances like clay and iron into brick or concrete or steel, and then shape that material into a useful tool? To learn a small piece of this, I spent a day in a traditional, 18th-century iron forge, learning the essentials of the craft of the blacksmith. Sweating over an open coke-fired hearth, I managed to beat a lump of steel into a knife. Once shaped, I got it cherry-red hot and then quenched it with a satisfying squeal into a water trough, before reheating the blade slightly to temper it for extra toughness.

. . .

. . . , it needn't take a catastrophic collapse of civilization to make you appreciate the importance of understanding the basics of how devices around you work. Localized disasters can disrupt normal services, making a reasonable reserve of clean water, canned food and backup technologies like kerosene lamps a prudent precaution. And becoming a little more self-reliant is immensely rewarding in its own right. Thought experiments like these can help us to explore how our modern world actually came to be, and to appreciate all that we take for granted.

For the full commentary, see:

LEWIS DARTNELL. "OPINION; Civilization's Starter Kit." The New York Times, SundayReview Section (Sun., MARCH 30, 2014): 8.

(Note: ellipses added.)

(Note: the online version of the commentary has the date MARCH 29, 2014.)

Dartnell's commentary, quoted above, has been elaborated in his book:

Dartnell, Lewis. The Knowledge: How to Rebuild Our World from Scratch. New York: Penguin Press, 2014.

April 25, 2014

Bill Clinton Says U.S. Control of Internet Protects Free Speech

(p. A11) . . . , Mr. Clinton, appearing on a panel discussion at a recent Clinton Global Initiative event, defended U.S. oversight of the domain-name system and the Internet Corporation for Assigned Names and Numbers, or Icann.

. . .

"I understand why the reaction in the rest of the world to the Edward Snowden declarations has given new energy to the idea that the U.S. should not be in nominal control of domain names on the Internet," Mr. Clinton said. "But I also know that we've kept the Internet free and open, and it is a great tribute to the U.S. that we have done that, including the ability to bash the living daylights out of those of us who are in office or have been.

"A lot of people who have been trying to take this authority away from the U.S. want to do it for the sole purpose of cracking down on Internet freedom and limiting it and having governments protect their backsides instead of empower their people."

Mr. Clinton asked Jimmy Wales, founder of Wikipedia: "Are you at all worried that if we give up this domain jurisdiction that we have had for all these years that we will lose Internet freedom?"

"I'm very worried about it," Mr. Wales answered. People outside the U.S. often say to him, "Oh, it's terrible. Why should the U.S. have this special power?" His reply: "There is the First Amendment in the U.S., and there is a culture of free expression."

He recalled being told on Icann panels to be more understanding of differences in cultures. "I have respect for local cultures, but banning parts of Wikipedia is not a local cultural variation that we should embrace and accept. That's a human-rights violation."

For the full commentary, see:

L. GORDON CROVITZ. "INFORMATION AGE; Open Internet: Clinton vs. Obama; The former president strongly defends the current system of oversight by the U.S." The Wall Street Journal (Mon., MARCH 31, 2014): A11.

(Note: ellipses added.)

(Note: the online version of the commentary has the shorter title "INFORMATION AGE; Open Internet: Clinton vs. Obama.")

April 7, 2014

William Vanderbilt Helped Disrupt His Gas Holdings by Investing in Edison's Electricity

(p. 84) But even the minimal ongoing work on the phonograph would be pushed aside by the launch of frenzied efforts to find a way to fulfill Edison's premature public claim that his electric light was working. A couple of months later, when asked in an interview about the state of his phonograph, Edison replied tartly, "Comatose for the time being." He changed metaphors and continued, catching hold of an image that would be quoted many times by later biographers: "It is a child and will grow to be a man yet; but I have a bigger thing in hand and must finish it to the temporary neglect of all phones and graphs."

Financial considerations played a part in allocation of time and resources, too. Commissions from the phonograph that brought in hundreds of dollars were hardly worth accounting for, not when William Vanderbilt and his friends were about to advance Edison $50,000 for the electric light. Edison wrote a correspondent that he regarded the financier's interest especially satisfying as Vanderbilt was "the largest gas stock owner in America."


Stross, Randall E. The Wizard of Menlo Park: How Thomas Alva Edison Invented the Modern World. New York: Crown Publishers, 2007.

April 3, 2014

As a Young Inventor, Edison Patented Fast

Edison filed patent applications as fast as the ideas arrived.


Stross, Randall E. The Wizard of Menlo Park: How Thomas Alva Edison Invented the Modern World. New York: Crown Publishers, 2007.

March 13, 2014

How the Brain May Be Able to Control Robots


Michio Kaku. Source of photo: online version of the NYT article quoted and cited below.

(p. 2) Michio Kaku is a theoretical physicist and professor at City College of New York. When not trying to complete Einstein's theory of everything, he writes books that explain physics and how developments in the field will shape the future.

. . .

One of the most intriguing things I've read lately was by Miguel Nicolelis, called "Beyond Boundaries: The New Neuroscience of Connecting Brains With Machines," in which he describes hooking up the brain directly to a computer, which allows you to mentally control a robot or exoskeleton on the other side of the earth.

For the full interview, see:

KATE MURPHY, interviewer. "Download; Michio Kaku." The New York Times, SundayReview Section (Sun., FEB. 9, 2014): 2.

(Note: ellipsis added.)

(Note: the first paragraph is an introduction by Kate Murphy; the next paragraph is part of a response by Michio Kaku.)

(Note: the online version of the interview has the date FEB. 8, 2014.)

The book mentioned above is:

Nicolelis, Miguel. Beyond Boundaries: The New Neuroscience of Connecting Brains with Machines---and How It Will Change Our Lives. New York: Times Books, 2011.

March 11, 2014

Khan's Cousins Liked Him Better on YouTube than in Person

"Salman Khan at the offices of Khan Academy, which reaches more than 10 million users. Bill Gates invested in the school." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. D5) In 2008, Salman Khan, then a young hedge-fund analyst with a master's in computer science from M.I.T., started the Khan Academy, offering free online courses mainly in the STEM subjects -- science, technology, engineering and mathematics.

Today the free electronic schoolhouse reaches more than 10 million users around the world, with more than 5,000 courses, and the approach has been widely admired and copied. I spoke with Mr. Khan, 37, for more than two hours, in person and by telephone. What follows is a condensed and edited version of our conversations.

. . .

Did you have background as a math educator?

No, though I've had a passion for math my whole life. It got me to M.I.T. and enabled me to get multiple degrees in math and engineering. Long story shortened: Nadia got through what she thought she couldn't. Soon word got around the family that "free tutoring" was going on, and I found myself working on the phone with about 15 cousins.

To make it manageable, I hacked together a website where my cousins could go to practice problems and I could suggest things for them to work on. When I'd tutor them over the telephone, I'd use Yahoo Doodle, a program that was part of Yahoo Messenger, so they could visualize the calculations on their computers while we talked.

The Internet videos started two years later when a friend asked, "How are you scaling your lessons?" I said, "I'm not." He said, "Why don't you make some videos of the tutorials and post them on YouTube?" I said, "That's a horrible idea. YouTube is for cats playing piano."

Still, I gave it a try. Soon my cousins said they liked me more on YouTube than in person. They were really saying that they found my explanations more valuable when they could have them on demand and where no one would judge them. And soon many people who were not my cousins were watching. By 2008, I was reaching tens of thousands every month.

YouTube is a search engine where producers can upload short videos at no cost. Would the Khan Academy have been possible without this technology?

No. Before YouTube, the cost of hosting streaming videos was incredibly expensive. I wouldn't have been able to afford the server space for that much video -- or traffic. That said, I was probably the 500th person to show up on YouTube with educational videos. Our success probably had to do with the technology being ready and the fact that my content resonated with users.

For the full interview, see:

CLAUDIA DREIFUS, interviewer. "A Conversation With Salman Khan; It All Started With a 12-Year-Old Cousin." The New York Times (Tues., JAN. 28, 2014): D5.

(Note: ellipsis added; bold in original; the first two paragraphs, and the bold questions, are Claudia Dreifus; the other paragraphs are Salman Khan.)

(Note: the online version of the interview has the date JAN. 27, 2014.)

February 25, 2014

Catmull's Pixar Had Technology Serve Story


Ed Catmull, one of the creators of Pixar, discusses a favorite book of 2013. Catmull's appreciation of the importance of storytelling may help explain why the early Pixar movies were so wonderful:

(p. C6) I am constantly struck by how many people think of stories solely as entertainment--edifying or time-wasting but still: entertainment. "The Storytelling Animal" by Jonathan Gottschall shows that the storytelling part of our brain is deeper and more complex than that, wired into the way we think and learn. This struck me as a powerful idea, that our brain is structured for and shaped by stories whose value goes beyond entertainment and socialization.

For the full article, see:

"12 Months of Reading; We asked 50 of our friends--from April Bloomfield to Mike Tyson--to name their favorite books of 2013." The Wall Street Journal (Sat., Dec. 14, 2013): C6 & C9-C12.

(Note: the online version of the article has the date Dec. 13, 2013.)

The book that Catmull praises is:

Gottschall, Jonathan. The Storytelling Animal: How Stories Make Us Human. New York: Houghton Mifflin Harcourt, 2012.

February 21, 2014

Hero Rebels Against the Bureau of Technology Control


Source of book image: online version of the WSJ review quoted and cited below.

(p. D8) In "Influx," . . . , a sinister Bureau of Technology Control kidnaps scientists that have developed breakthrough technologies (the cure to cancer, immortality, true artificial intelligence), and is withholding their discoveries from humanity, out of concern over the massive social disruption they would cause. "We don't have a perfect record--Steve Jobs was a tricky one--but we've managed to catch most of the big disrupters before they've brought about uncontrolled social change," says the head of the bureau, the book's villain. The hero has developed a "gravity mirror" but refuses to cooperate, despite the best efforts of Alexa, who has been genetically engineered by the Bureau to be both impossibly sexy and brilliant.

In the publishing world, there is a growing sense that "Influx," Mr. Suarez's fourth novel, may be his breakout book and propel him into the void left by the deaths of Tom Clancy and Michael Crichton. "Influx" has Mr. Suarez's largest initial print run, 50,000 copies, and Twentieth Century Fox bought the movie rights last month.

An English major at the University of Delaware with a knack for computers, Mr. Suarez started a consulting firm in 1997, working with companies like Nestlé on complex production and logistics-planning issues. "You only want to move 100 million pounds of sugar once," says Mr. Suarez, 49 years old.

He began writing in his free time. Rejected by 48 literary agents (a database expert, he kept careful track), he began self-publishing in 2006 under the name Leinad Zeraus, his name spelled backward. His sophisticated tech knowledge quickly attracted a cult following in Silicon Valley, Redmond, Wash., and Cambridge, Mass. The MIT bookstore was the first to stock his self-published books, in 2007.

For the full review, see:

EBEN SHAPIRO. "Daniel Suarez Sees Into the Future." The Wall Street Journal (Fri., Feb. 7, 2014): D8.

(Note: ellipsis added.)

(Note: the online version of the review has the date Feb. 5, 2014, and the title "Daniel Suarez Sees Into the Future.")

The book under review, is:

Suarez, Daniel. Influx. New York: Dutton, 2014.


Author of Influx, Daniel Suarez. Source of photo: online version of the WSJ article quoted and cited above.

February 12, 2014

It Does Not Take a Government to Raise a Railroad

(p. A17) . . . , All Aboard Florida (the train will get a new name this year), is not designed to push political buttons. It won't go to Tampa. It will zip past several aggrieved towns on Florida's Treasure Coast without stopping.

Nor will the train qualify as "high speed," except on a stretch where it will hit 125 miles an hour. Instead of running on a dedicated line, the new service will mostly share existing track with slower freight trains operated by its sister company, the Florida East Coast Railway.

But the sponsoring companies, all owned by the private-equity outfit Fortress Investment Group, appear to have done their sums. By minimizing stops, the line will be competitive with road and air in connecting the beaches, casinos and resorts of Miami and Fort Lauderdale with the big airport and theme-park destination of Orlando. Capturing a small percentage of the 50 million people who travel between these fleshpots, especially European visitors accustomed to intercity rail at home, would let the train cover its costs and then some.

But Fortress has a bigger fish in the pan. Its local operation, Florida East Coast Industries, is a lineal progeny of Henry Flagler, the 1890s entrepreneur who created modern Florida when he built a rail line to support his resort developments. Flagler's heirs are adopting the same model. A Grand Central-like complex will rise on the site of Miami's old train station. A similar but smaller edifice is planned for Fort Lauderdale.

The project is a vivid illustration of the factors that have to fall in place to make passenger rail viable nowadays. If the Florida venture succeeds, it would be the only intercity rail service anywhere in the world not dependent on government operating subsidies. It would be the first privately run intercity service in America since the birth of Amtrak in 1971.

For the full commentary, see:

HOLMAN W. JENKINS, JR. "BUSINESS WORLD; A Private Railroad Is Born; All Aboard Florida isn't looking for government operating subsidies." The Wall Street Journal (Weds., Jan. 15, 2014): A17.

(Note: ellipsis added.)

(Note: the online version of the commentary has the date Jan. 14, 2014.)

January 31, 2014

70 Percent of Current Jobs May Soon Be Done by Robots

Kelly may be right, but that does not imply that we will all be unemployed. What will happen is that new and better jobs, and entrepreneurial opportunities, will be created for humans.

Robots will do the boring, the dangerous, and the physically exhausting. We will do the creative and the analytic, and the social or emotional.

(p. A21) Kevin Kelly set off a big debate with a piece in Wired called "Better Than Human: Why Robots Will -- And Must -- Take Our Jobs." He asserted that robots will soon be performing 70 percent of existing human jobs. They will do the driving, evaluate CAT scans, even write newspaper articles. We will all have our personal bot to get coffee. There's already an existing robot named Baxter, who is deliciously easy to train: "To train the bot you simply grab its arms and guide them in the correct motions and sequence. It's a kind of 'watch me do this' routine. Baxter learns the procedure and then repeats it. Any worker is capable of this show-and-tell."

For the full commentary, see:

DAVID BROOKS. "The Sidney Awards, Part 2." The New York Times (Tues., December 31, 2013): A21. [National Edition]

(Note: the online version of the commentary has the date December 30, 2013.)

The article praised by Brooks is:

Kelly, Kevin. "Better Than Human: Why Robots Will -- and Must -- Take Our Jobs." Wired (Jan. 2013).

January 17, 2014

Carnegie Failed Twice Before Bessemer Success

(p. 101) [Carnegie] . . . organized his own company to secure the rights to the Dodd process for strengthening iron rails by coating them with steel facings. Thomson agreed to appropriate $20,000 of Pennsylvania Railroad funds to test the new technology.

On March 12, 1867, Thomson wrote to tell Carnegie that his Dodd-processed rails had failed their first test: "treatment under the hammer.... You may as well abandon the Patent--It will not do if this Rail is a sample." Three days later, Thomson wrote Carnegie again, this time marking his letter with a handwritten "Private" in the top left-hand corner and "a word to the wise" penned in just below. Carnegie had apparently asked Thomson for more time--and/or money--to continue his experiments. Thomson replied that the experiments his engineers had made had so "impaired my confidence in this process that I don't feel at liberty to increase our order for these Rails."

Instead of giving up, Carnegie pushed forward, hawking his new steel-faced iron rails to other railroad presidents, attempting to get a new contract with Thomson, and reorganizing the Freedom Iron Company in Lewistown, Pennsylvania, in which he was a major investor, into Freedom Iron and Steel. In the spring of 1867, he succeeded, despite Thomson's misgivings, in getting the approval to manufacture and deliver a second 500-ton batch of steel-faced rails. The new rails fared as poorly as the old ones. There would be no further contracts forthcoming from the Pennsylvania Railroad or any other railroad.

Carnegie tried to bluff his way through. When his contacts in England recommended that he purchase the American rights to a better process for facing iron rails with steel, this one invented by a Mr. Webb, Carnegie retooled his mill for the new process. He was fooled a second time. Not only was the Webb process as impractical as the Dodd, but there was, as there (p. 102) had been with the Dodd process, confusion as to who held the American patent rights. Within a year, the company Carnegie had organized to produce the new steel-faced rails was out of business.

. . .

These early failures did not deter him from investing in other start-up companies and technologies, but he would in future be a bit more careful before committing his capital. In March 1869, Tom Scott solicited his advice about investing in the rights to a new "Chrome Steel process." Carnegie replied that his "advice (which don't cost anything if of no value) would be to have nothing to do with this or any other great change in the manufacture of steel or iron.... I know at least six inventors who have the secret all are so anxiously awaiting.... That there is to be a great change in the manufacture of iron and steel some of these years is probable, but exactly what form it is to take no one knows. I would advise you to steer clear of the whole thing. One will win, but many lose and you and I not being practical men would very likely be among the more numerous class. At least we would wager at very long odds. There are many enterprises where we can go in even."


Nasaw, David. Andrew Carnegie. New York: Penguin Press, 2006.

(Note: bracketed name, ellipsis near start, and ellipsis between paragraphs added; ellipsis internal to other paragraphs, in original.)

(Note: the pagination of the hardback and paperback editions of Nasaw's book is the same.)

January 16, 2014

Malcolm Gladwell, on Harvard, Rings True to Debbie Sterling


Debbie Sterling, GoldieBlox entrepreneur. Source of photo: online version of the NYT article quoted and cited below.

(p. 2) Debbie Sterling is the founder and chief executive of GoldieBlox, a toy company dedicated to encouraging girls' interest in engineering and construction.

READING I just started "David and Goliath," by Malcolm Gladwell. He has some really interesting statistics about how at the top-tier universities like Stanford and Harvard, freshmen who go into engineering often fall out versus if those same students had gone to a second-tier school, they would have been in the top of their class and therefore would have stayed in. It really spoke to me because I was definitely one of those engineering students at Stanford who constantly felt like I was surrounded by geniuses. I was intimidated, but I stayed because I am just so stubborn.

For the full interview, see:

KATE MURPHY, interviewer. "DOWNLOAD; Debbie Sterling." The New York Times, SundayReview Section (Sun., December 22, 2013): 2.

(Note: bold in original, indicating that what follows are the words of Debbie Sterling.)

(Note: the online version of the interview has the date December 21, 2013.)

Book that "spoke to" Sterling:

Gladwell, Malcolm. David and Goliath: Underdogs, Misfits, and the Art of Battling Giants. New York, NY: Little, Brown and Company, 2013.

January 12, 2014

In 20th Century, Inventions Had Cultural Impact Twice as Fast as in 19th Century

NgramGraphTechnologies2013-12-08.png I used Google's Ngram tool to generate the Ngram above, charting the same technologies as the Ngram that appeared in the print (but not the online) version of the article quoted and cited below. The blue line is "railroad"; the red line is "radio"; the green line is "television"; the orange line is "internet." The search was case-insensitive. The print (but not the online) version of the article quoted and cited below includes a caption that describes the Ngram tool: "A Google tool, the Ngram Viewer, allows anyone to chart the use of words and phrases in millions of books back to the year 1500. By measuring historical shifts in language, the tool offers a quantitative approach to understanding human history."

(p. 3) Today, the Ngram Viewer contains words taken from about 7.5 million books, representing an estimated 6 percent of all books ever published. Academic researchers can tap into the data to conduct rigorous studies of linguistic shifts across decades or centuries. . . .

The system can also conduct quantitative checks on popular perceptions.

Consider our current notion that we live in a time when technology is evolving faster than ever. Mr. Aiden and Mr. Michel tested this belief by comparing the dates of invention of 147 technologies with the rates at which those innovations spread through English texts. They found that early 19th-century inventions, for instance, took 65 years to begin making a cultural impact, while turn-of-the-20th-century innovations took only 26 years. Their conclusion: the time it takes for society to learn about an invention has been shrinking by about 2.5 years every decade.

"You see it very quantitatively, going back centuries, the increasing speed with which technology is adopted," Mr. Aiden says.

Still, they caution armchair linguists that the Ngram Viewer is a scientific tool whose results can be misinterpreted.

Witness a simple two-gram query for "fax machine." Their book describes how the fax seems to pop up, "almost instantaneously, in the 1980s, soaring immediately to peak popularity." But the machine was actually invented in the 1840s, the book reports. Back then it was called the "telefax."

Certain concepts may persevere, even as the names for technologies change to suit the lexicon of their time.

For the full story, see:

NATASHA SINGER. "TECHNOPHORIA; In a Scoreboard of Words, a Cultural Guide." The New York Times, SundayBusiness Section (Sun., December 8, 2013): 3.

(Note: ellipsis added; bold in original.)

(Note: the online version of the article has the date December 7, 2013.)
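The chart above can be reproduced directly in the Ngram Viewer. As a minimal sketch (my own, not from the article), the Python snippet below builds a viewer URL for the same four technologies; the parameter names (content, year_start, year_end, corpus, smoothing, case_insensitive) and the corpus identifier are my assumptions about the viewer's URL scheme and may need adjusting.

# Sketch: build a Google Ngram Viewer URL for the four technologies charted above.
# The query parameter names and corpus identifier are assumptions, not a documented API.
from urllib.parse import urlencode

terms = ["railroad", "radio", "television", "internet"]
params = {
    "content": ",".join(terms),   # comma-separated ngrams to chart
    "year_start": 1800,
    "year_end": 2013,
    "corpus": "en-2012",          # assumed corpus identifier
    "smoothing": 3,
    "case_insensitive": "on",     # the chart above was case-insensitive
}
print("https://books.google.com/ngrams/graph?" + urlencode(params))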

January 10, 2014

"Pretty Cool" Cochlear Implant: "It Helps Me Hear"

CochlearImplant2013-11-15.jpg "The cochlear implant." Source of caption and photo: online version of the WSJ commentary quoted and cited below.

(p. A15) . . . , three pioneering researchers -- Graeme Clark, Ingeborg Hochmair and Blake Wilson -- shared the prestigious Lasker-DeBakey Award for Clinical Medical Research for their work in developing the [cochlear] implant. . . . The award citation says the devices have "for the first time, substantially restored a human sense with medical intervention" and directly transformed the lives of hundreds of thousands.

I've seen this up close. My 10-year-old son, Alex, is one of the 320,000 people with a cochlear implant.

. . .

"What's that thing on your head?" I heard a new friend ask Alex recently.

"It helps me hear," he replied, then added: "I think it's pretty cool."

"If you took it off, would you hear me?" she asked.

"Nope," he said. "I'm deaf."

"Cool," she agreed. Then they talked about something else.

Moments like that make me deeply grateful for the technology that allows Alex to have such a conversation, but also for the hard-won aplomb that lets him do it so matter-of-factly.

For the full commentary, see:

Denworth, Lydia. "OPINION; What Cochlear Implants Did for My Son; Researchers who were just awarded the 'American Nobel' have opened up the world of sound to the deaf." The Wall Street Journal (Fri., Sept. 20, 2013): A15.

(Note: ellipses, and bracketed word, added.)

(Note: the online version of the article has the date Sept. 19, 2013.)

January 1, 2014

"Carnegie Watched, Listened, Learned" from Scott's Process Innovations

(p. 65) Later in life, Scott would be better known for his political skills, but he was, like his mentor Thomson, a master of cost accounting. Together, the two men steadily cut unit costs and increased revenues by investing in capital improvements--new and larger locomotives, better braking systems, improved tracks, new bridges. Instead of running several smaller trains along the same route, they ran fewer but longer trains with larger locomotives and freight cars. To minimize delays--a major factor in escalating costs--they erected their own telegraph lines, built a second track and extended sidings alongside the first one, and kept roadways, tunnels, bridges, and crossings in good repair.

Carnegie watched, listened, learned. Nothing was lost on the young man. With an exceptional memory and a head for figures, he made the most of his apprenticeship and within a brief time was acting more as Scott's deputy than his assistant. Tom Scott had proven to be so good at his job that when Pennsylvania Railroad vice president William Foster died unexpectedly of an infected carbuncle, Scott was named his successor.


Nasaw, David. Andrew Carnegie. New York: Penguin Press, 2006.

(Note: the pagination of the hardback and paperback editions of Nasaw's book is the same.)

December 31, 2013

"Western Union Bullied the Makers of Public Policy into Serving Private Capital"


Source of book image: online version of the WSJ review quoted and cited below.

(p. A13) Until now there has been no full-scale, modern company history. Joshua D. Wolff's "Western Union and the Creation of the American Corporate Order, 1845-1893" ably fills the bill, offering an exhaustive and yet fascinating account.

. . .

If people today remember anything about Western Union, it is that its coast-to-coast line put the Pony Express out of business and that its leaders didn't see the telephone coming. Mr. Wolff tells us that neither claim is exactly true. It was Hiram Sibley, Western Union's first president, who went out on his own, when his board balked, to form a separate company and build the transcontinental telegraph in 1861; he made his fortune by eventually selling it to Western Union. And the company was very aware of Alexander Graham Bell's invention, patented in 1876, but history had supposedly shown that it wasn't necessary to control a patent to win the technology war. The company's third president, William Orton, was sure that Bell and his "toy" would not get the better of Western Union: "We would come along and take it away from him." They didn't.

. . .

Mr. Wolff contends that the company's practices set the template for today's "corporate triumphalism," not least in the way Western Union bullied the makers of public policy into serving private capital. Perhaps, but telecom competition today is so ferocious and differently arranged from that of the late 19th century that a "triumphant" company today may be toast tomorrow--think of BlackBerry--and can't purchase help with anything like Western Union's brazenness and scope. Western Union had friends in Congress, the regulatory bureaucracy and the press. Members of the company's board of directors chaired both the 1872 Republican and Democratic national conventions. It seemed that, whatever the battles in business, politics, technology or the courts, the company's shareholders won.

For the full review, see:

STUART FERGUSON. "Bookshelf; The Octopus of the Wires." The Wall Street Journal (Mon., Dec. 23, 2013): A13.

(Note: ellipses added.)

(Note: the online version of the review has the date Dec. 22, 2013, and has the title "BOOKSHELF; Book Review: 'Western Union and the Creation of the American Corporate Order, 1845-1893,' by Joshua D. Wolff.")

Book under review:

Wolff, Joshua D. Western Union and the Creation of the American Corporate Order, 1845-1893. New York: Cambridge University Press, 2013.

December 26, 2013

Innovators Agree: Whiteboard Is Fast, Easy to Use and Big

(p. B1) . . . Evernote, like pretty much every tech company I've ever visited, is in thrall to the whiteboard. Indeed, as technologically backward as they may seem, whiteboards are to Silicon Valley what legal pads are to lawyers, what Excel is to accountants, or what long sleeves are to magicians.

They're an all-purpose tool of innovation, often the first place a product or company's vision is dreamed up and designed, and a constant huddling point for future refinement. And though many digital technologies have attempted to unseat the whiteboard, the humble pre-electronic surface can't be beat.

The whiteboard has three chief virtues: It's fast. It's easy to use. And it's big. "We're often doing something I call 'designing in the hallway,' " said Jamie Hull, the product manager for Evernote's iOS apps. "When a new problem or request comes up, the fastest thing you can do is pull two or three people aside, go to the nearest wall, and figure it out."

Unlike a computer or phone, the whiteboard is always on, always fully charged, and it doesn't require that people download, install, and launch software to begin using it.

For the full commentary, see:

FARHAD MANJOO. "HIGH DEFINITION; High Tech's Secret Weapon: The Whiteboard." The Wall Street Journal (Thurs., Oct. 31, 2013): B1-B2.

(Note: ellipsis added.)

(Note: the online version of the commentary has the date Oct. 30, 2013. The online version combined paragraphs 1 and 2 above and 3 and 4 above. I have returned them to the form they had in the print version.)

December 25, 2013

Politically Correct Artisanal Locally Sourced Combat Video Game

CallOfDutyGhostsFemaleAvatar2013-11-06.jpg "Call of Duty: Ghosts. Female avatars have been added, and so has an "extinction" mode involving fighting aliens, in this game for the PlayStation 3, Xbox 360, Wii U and PC." Source of caption and image: online version of the NYT review quoted and cited below.

From a review of the video game "Call of Duty: Ghosts":

(p. C5) ". . . the South Americans torture a character using artisanal, locally sourced interrogation techniques supposedly (and naturally) used by Amazonian tribes."

For the full review, see:

CHRIS SUELLENTROP. "VIDEO GAME REVIEW; A Fantastical Shootout, Moving Across Space and Time." The New York Times (Weds., November 6, 2013): C5.

(Note: ellipsis added; bold in caption in original of both print and online versions.)

(Note: the online version of the review has the date November 5, 2013.)

December 23, 2013

Over-Regulated Tech Entrepreneurs Seek Their Own Country

The embed above is provided by YouTube, where the video clip is posted under the title "Balaji Srinivasan at Startup School 2013."

(p. B4) At a startup conference in the San Francisco Bay area last month, a brash and brilliant young entrepreneur named Balaji Srinivasan took the stage to lay out a case for Silicon Valley's independence.

According to Mr. Srinivasan, who co-founded a successful genetics startup and is now a popular lecturer at Stanford University, the tech industry is under siege from Wall Street, Washington and Hollywood, which he says he believes are harboring resentment toward Silicon Valley's efforts to usurp their cultural and economic power.

On its surface, Mr. Srinivasan's talk--called "Silicon Valley's Ultimate Exit"--sounded like a battle cry of the libertarian, anti-regulatory sensibility long espoused by some of the tech industry's leading thinkers. After arguing that the rest of the country wants to put a stop to the Valley's rise, Mr. Srinivasan floated a plan for techies to build an "opt-in society, outside the U.S., run by technology."

His idea seemed a more expansive version of Google Chief Executive Larry Page's call for setting aside "a piece of the world" to try out controversial new technologies, and investor Peter Thiel's "Seastead" movement, which aims to launch tech-utopian island nations.

For the full commentary, see:

FARHAD MANJOO. "HIGH DEFINITION; The Valley's Ugly Complex." The Wall Street Journal (Mon., Nov. 4, 2013): B4.

(Note: the online version of the commentary has the date Nov. 3, 2013, and has the title "HIGH DEFINITION; Silicon Valley Has an Arrogance Problem.")

December 10, 2013

Use of Floppy Disks Shows Slowness of Government

(p. A14) WASHINGTON -- The technology troubles that plagued the website rollout may not have come as a shock to people who work for certain agencies of the government -- especially those who still use floppy disks, the cutting-edge technology of the 1980s.

Every day, The Federal Register, the daily journal of the United States government, publishes on its website and in a thick booklet around 100 executive orders, proclamations, proposed rule changes and other government notices that federal agencies are mandated to submit for public inspection.

So far, so good.

It turns out, however, that the Federal Register employees who take in the information for publication from across the government still receive some of it on the 3.5-inch plastic storage squares that have become all but obsolete in the United States.

. . .

"You've got this antiquated system that still works but is not nearly as efficient as it could be," said Stan Soloway, chief executive of the Professional Services Council, which represents more than 370 government contractors. "Companies that work with the government, whether longstanding or newcomers, are all hamstrung by the same limitations."

The use of floppy disks peaked in American homes and offices in the mid-1990s, and modern computers do not even accommodate them anymore. But The Federal Register continues to accept them, in part because legal and security requirements have yet to be updated, but mostly because the wheels of government grind ever slowly.

. . .

. . . , experts say that an administration that prided itself on its technological savvy has a long way to go in updating the computer technology of the federal government. And the floppy disks of The Federal Register, they say, are but two recent examples of a government years behind the private sector in digital innovation.

For the full story, see:

JADA F. SMITH. "Slowly They Modernize: A Federal Agency That Still Uses Floppy Disks." The New York Times (Sat., December 7, 2013): A14.

(Note: ellipses added.)

(Note: the online version of the article has the date December 6, 2013.)

December 1, 2013

Kits Let Model T Owners Transform Them into Tractors, Snowmobiles, Roadsters and Trucks

ModelTtractorConversion2013-10-25.jpg "OFF ROAD; Kits to take the Model T places Henry Ford never intended included tractor conversions, . . . " Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 1) WHEN Henry Ford started to manufacture his groundbreaking Model T on Sept. 27, 1908, he probably never imagined that the spindly little car would remain in production for 19 years. Nor could Ford have foreseen that his company would eventually build more than 15 million Tin Lizzies, making him a billionaire while putting the world on wheels.

But nearly as significant as the Model T's ubiquity was its knack for performing tasks far beyond basic transportation. As quickly as customers left the dealers' lot, they began transforming their Ts to suit their specialized needs, assisted by scores of new companies that sprang up to cater exclusively to the world's most popular car.

Following the Model T's skyrocketing success came mail-order catalogs and magazine advertisements filled with parts and kits to turn the humble Fords into farm tractors, mobile sawmills, snowmobiles, racy roadsters and even semi-trucks. Indeed, historians credit the Model T -- which Ford first advertised as The Universal Car -- with launching today's multibillion-dollar automotive aftermarket industry.

For the full story, see:

LINDSAY BROOKE. "Mr. Ford's T: Mobility With Versatility." The New York Times, Automobiles Section (Sun., July 20, 2008): 1 & 14.

(Note: the online version of the story has the title "Mr. Ford's T: Versatile Mobility.")

November 29, 2013

Kerosene Creatively Destroyed Whale Oil

WhaleOilLamps2013-10-25.jpg "The whale-oil lamps at the Sag Harbor Whaling and Historical Museum are obsolete, though at one time, whale oil lighted much of the Western world." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 20) Like oil, particularly in its early days, whaling spawned dazzling fortunes, depending on the brute labor of tens of thousands of men doing dirty, sweaty, dangerous work. Like oil, it began with the prizes closest to home and then found itself exploring every corner of the globe. And like oil, whaling at its peak seemed impregnable, its product so far superior to its trifling rivals, like smelly lard oil or volatile camphene, that whaling interests mocked their competitors.

"Great noise is made by many of the newspapers and thousands of the traders in the country about lard oil, chemical oil, camphene oil, and a half-dozen other luminous humbugs," The Nantucket Inquirer snorted derisively in 1843. It went on: "But let not our envious and -- in view of the lard oil mania -- we had well nigh said, hog-gish opponents, indulge themselves in any such dreams."

But, in fact, whaling was already just about done, said Eric Jay Dolin, who . . . is the author of "Leviathan: The History of Whaling in America." Whales near North America were becoming scarce, and the birth of the American petroleum industry in 1859 in Titusville, Pa., allowed kerosene to supplant whale oil before the electric light replaced both of them and oil found other uses.

. . .

Mr. Dolin said the message for today was that one era's irreplaceable energy source could be the next one's relic. Like whaling, he said, big oil is ripe to be replaced by something newer, cleaner, more appropriate for its moment.

For the full story, see:

PETER APPLEBOME. "OUR TOWNS; Once They Thought Whale Oil Was Indispensable, Too." The New York Times, First Section (Sun., August 3, 2008): 20.

(Note: ellipses added.)

(Note: the online version of the story has the title, "OUR TOWNS; They Used to Say Whale Oil Was Indispensable, Too.")

Dolin's book is:

Dolin, Eric Jay. Leviathan: The History of Whaling in America. New York: W. W. Norton & Company, Inc., 2007.

November 8, 2013

Not All Environmentalists Reject the Refrigerator

(p. D4) MANY environmentalists -- even many who think nothing of using recycled toilet paper or cut the thermostat to near-arctic levels -- see fridge-free living as an extreme choice or an impractical and excessive goal.

"The refrigerator was a smart advance for society," said Gretchen Willis, 37, an environmentally conscious mother of four in Arlington, Tex., who recently read about the practice on a popular eco-themed blog,, and was astounded.

"I never would have thought of it," Ms. Willis said, explaining that although she's committed to recycling and using fluorescent bulbs, she draws the line at any environmental practice that will result in great expense or inconvenience. Living without a refrigerator, she said, qualifies on both counts: she would have to buy more food in smaller quantities because of spoilage, prepare exact amounts because she couldn't refrigerate leftovers, and make daily trips to the grocery store.

"It's silly not to have one," she said, "considering what the alternative is: drinking up a gallon of milk in one day so it doesn't spoil."

Deanna Duke, who lives in Seattle and runs the site Ms. Willis visited, said that taking a stand for or against unplugging has become "a badge of honor" for those on either side. "It's either 'look how far I'm willing to go,' or 'look how far I'm not willing to go,' " she said. For her part, Ms. Duke may refrain from watering her lawn in an effort at conservation, but she's firmly in the pro-refrigerator camp. "I can't think of any circumstances, other than an involuntary extreme situation, that would make me unplug my fridge," she said. "The convenience factor is too high."

. . .

Marty O'Gorman, the vice president of Frigidaire, said an 18-cubic-foot Energy Star-rated Frigidaire refrigerator uses about 380 kilowatt-hours a year -- less than a standard clothes dryer -- and costs a homeowner $40, or about 11 cents a day.

. . .

. . . , Mr. O'Gorman said downsizing from a standard model to Frigidaire's smallest minifridge would result in only about $6 in energy savings over a year.

It's this sort of practical calculus that has led many who advocate sustainable living to view unplugging the fridge as a dubious practice. They point out that it is likely to result in more trips to the store (which burns more gas, for those who drive) and the purchase of food in smaller portions (thus more packaging).

"It's easy to look at your bill and say, 'I'm saving energy,' " Ms. Duke said. "But you need to look at the whole supply chain."

For the full story, see:

STEVEN KURUTZ. "Trashing the Fridge." The New York Times (Thurs., February 5, 2009): D1 & D4.

(Note: ellipses added.)

(Note: the online version of the story has the date February 4, 2009.)
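The Frigidaire numbers quoted above invite a quick back-of-the-envelope check. The short Python sketch below (mine, not from the article) works out the electricity rate implied by 380 kilowatt-hours a year costing about $40, and confirms the roughly 11-cents-a-day figure.

# Sanity check of the refrigerator cost figures quoted above.
annual_kwh = 380.0    # 18-cubic-foot Energy Star refrigerator, per Mr. O'Gorman
annual_cost = 40.0    # dollars per year, as quoted

implied_rate = annual_cost / annual_kwh       # implied price of electricity, $/kWh
daily_cost_cents = annual_cost / 365.0 * 100  # cents per day

print(f"Implied electricity rate: ${implied_rate:.3f} per kWh")  # about $0.105
print(f"Daily cost: about {daily_cost_cents:.0f} cents")         # about 11 cents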

November 4, 2013

Better Batteries Would Be a General Purpose Technology (GPT)

Economists of technology have been thinking about General Purpose Technologies (GPTs) for the last 10 years or so. As the name implies, a GPT is a technology with broad applications, and new applications are invented as the price of the GPT declines. I would guess that a breakthrough in battery technology would be a very important GPT. The progress sketched below is probably not a breakthrough, but progress is good.

(p. C4) People take batteries for granted, but they shouldn't. All kinds of technological advances hinge on developing smaller and more powerful mobile energy sources.

Researchers at Harvard University and the University of Illinois are reporting just such a creation, one that happens to be no bigger than a grain of sand. These tiny but powerful lithium-ion batteries raise the prospect of a new generation of medical and other devices that can go where traditional hulking batteries can't.

. . .

Jennifer Lewis, a materials scientist at Harvard, says these batteries can store more energy because 3-D printing enables the stacking of electrodes in greater volume than the thin-film methods now used to make microbatteries.

For the full story, see:

DANIEL AKST. "R AND D: Batteries on the Head of a Pin." The Wall Street Journal (Sat., June 22, 2013): C4.

(Note: ellipsis added.)

(Note: the online version of the story has the date June 21, 2013.)

October 27, 2013

Silicon Valley Is Open to Creative Destruction, But Tired of Taxes

(p. A15) Rancho Palos Verdes, Calif.

When the howls of creative destruction blew through the auto and steel industries, their executives lobbied Washington for bailouts and tariffs. For now, Silicon Valley remains optimistic enough that its executives don't mind having their own businesses creatively destroyed by newer technologies and smarter innovations. That's an encouraging lesson from this newspaper's recent All Things Digital conference, which each year attracts hundreds of technology leaders and investors.

. . .

In a 90-minute grilling by the Journal's Walt Mossberg and Kara Swisher, Apple Chief Executive Tim Cook assured the audience that his company has "some incredible plans that we've been working on for a while."

Mr. Cook's sunny outlook was clouded only by his dealings with Washington. He was recently the main witness at hearings called by Sen. Carl Levin, a Michigan Democrat, who accused Apple of violating tax laws. In fact, Apple's use of foreign subsidiaries is entirely legal--and Apple is the largest taxpayer in the U.S., contributing $6 billion a year to the government's coffers.

Mr. Cook put on a brave face about the hearings, saying, "I thought it was very important to go tell our side of the story and to view that as an opportunity instead of a pain in the [expletive]." Mr. Cook's foul language was understandable. "Just gut the [tax] code," he told the conference. "It's 7,500 pages long. . . . Apple's tax return is two feet high. It's crazy."

When the audience applauded, Ms. Swisher quipped, "All right, Rand Paul." A woman shouted: "No, I'm a Democrat!" One reason the technology industry remains the center of innovation may be that many technologists of all parties view trips to Washington as a pain.

For the full commentary, see:

L. GORDON CROVITZ. "INFORMATION AGE; Techies Cheer Creative Destruction." The Wall Street Journal (Mon., June 3, 2013): A15.

(Note: ellipsis between paragraphs added; italics in original; ellipsis, and bracketed words, within next-to-last paragraph, in original.)

(Note: the online version of the commentary has the date June 2, 2013.)

September 27, 2013

Google's Bathrooms Showed Montessori Discipline

(p. 124) You could even see the company's work/play paradox in its bathrooms. In some of Google's loos, even the toilets were toys: high-tech Japanese units with heated seats, cleansing water jets, and a control panel that looked as though it could run a space shuttle. But on the side of the stall--and, for men, at an eye-level wall placement at the urinals--was the work side of Google, a sheet of paper with a small lesson in improved coding. A typical "Testing on the Toilet" instructional dealt with the intricacies of load testing or C++ microbenchmarking. Not a second was wasted in fulfilling Google's lofty--and work-intensive--mission.

It's almost as if Larry and Sergey were thinking of Maria Montessori's claim "Discipline must come through liberty.... We do not consider an individual disciplined only when he has been rendered as artificially silent as a mute and as immovable as a paralytic. He is an individual annihilated, not disciplined. We call an individual disciplined when he is master of himself." (p. 125) Just as it was crucial to Montessori that nothing a teacher does destroy a child's creative innocence, Brin and Page felt that Google's leaders should not annihilate an engineer's impulse to change the world by coding up some kind of moon shot.

"We designed Google," Urs Hölzle says, "to be the kind of place where the kind of people we wanted to work here would work for free."


Levy, Steven. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New York: Simon & Schuster, 2011.

(Note: ellipsis in original.)

September 16, 2013

Frank Lloyd Wright Loved Cars

CordL29OwnedByFrankLloydWright2013-08-10.jpg "In the early 1920s, Wright bought a 1929 Cord L-29, which he praised for its sensible front-wheel drive. Besides, "It looked becoming to my houses," he wrote in his book "An Autobiography." He seemed to have a special bond with the Cord. "The feeling comes to me that the Cord should be heroic in this autobiography somewhere," he wrote." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 9) Frank Lloyd Wright, the architect whose birth in 1867 preceded the gasoline-powered automobile's by about 20 years, was an early adopter of the internal-combustion engine and an auto aficionado all his life.

He was also eerily prophetic in understanding how the car would transform the American landscape, and his designs reflect this understanding. Wright often designed both for and around automobiles, and his masterpiece, the Guggenheim Museum in New York, owes its most distinctive feature, the spiral of its rotunda, to his love for the automobile.

. . .

Wright was seduced by the combination of beauty, power and speed, whether powered by hay or by gas. He owned horses, and his first car, a yellow Model K Stoddard-Dayton roadster, was the same model that in 1909 won the very first automobile race at the Indianapolis Motor Speedway. Called the Yellow Devil by his neighbors, this was a 45-horsepower car capable of going 60 miles an hour. Wright and his sons seemed to enjoy that horsepower with abandon: "Dad was kept busy paying fines," his son John observed. So enamored was Wright of his automobile that he installed gas pumps in the garage of his home and studio in Oak Park, Ill.

. . .

In the early 1920s, Wright owned a custom-built Cadillac and later bought a 1929 Cord L-29, which he praised for its sensible front-wheel drive. Besides, "It looked becoming to my houses," he wrote in his book "An Autobiography." He seemed to have a special bond with the Cord. "The feeling comes to me that the Cord should be heroic in this autobiography somewhere," he wrote.

Wright's Cord can be seen today at the Auburn Cord Duesenberg Museum in Auburn, Ind.

For the full story, see:

INGRID STEFFENSEN. "Frank Lloyd Wright: The Auto as Architect's Inspiration." The New York Times, SportsSunday Section (Sun., August 9, 2009): 9.

(Note: ellipses added.)

(Note: the online version of the article has the date August 6, 2009, and the title "The Auto as Architect's Inspiration." There are some small differences between the print and online versions, although I think the sentences quoted above are the same in both.)

Wright's autobiography, mentioned above, is:

Wright, Frank Lloyd. An Autobiography. New York: Horizon Press, 1977.

September 12, 2013

Why IT-Savvy Companies Are More Profitable


Dr. Peter Weill, Chair of the Center for Information Systems Research at the MIT Sloan School of Management. Source of caption information and photo: online version of the WSJ article quoted and cited below.

(p. R2) DR. WEILL: The IT-savvy companies are 21% more profitable than non-IT-savvy companies. And the profitability shows up in two ways. One is that IT-savvy companies have identified the best way to run their core day-to-day processes. Think about UPS or Southwest Airlines or Amazon: They run those core processes flawlessly, 24 hours a day.

The second thing is that IT-savvy companies are faster to market with new products and services that are add-ons, because their innovations are so much easier to integrate than in a company with siloed technology architecture, where you have to glue together everything and test it and make sure that it all works. We call that the agility paradox--the companies that have more standardized and digitized business processes are faster to market and get more revenue from new products.

Those are the two sources of their greater profitability: lower costs for running existing business processes, and faster innovation.

For the full interview, see:

Martha E. Mangelsdorf, interviewer. "EXECUTIVE BRIEFING; Getting an Edge From IT; Companies need to think strategically about their tech investments." The Wall Street Journal (Mon., November 30, 2009): R2.

(Note: bold in original.)

September 11, 2013

Yahoo Execs Complained that Google Did Yahoo Searches too Well

(p. 45) Even though Google never announced when it refreshed its index, there would invariably be a slight rise in queries around the world soon after the change was implemented. It was as if the global subconscious realized that there were fresher results available.

The response of Yahoo's users to the Google technology, though, was probably more conscious. They noticed that search was better and used it more. "It increased traffic by, like, 50 percent in two months," Manber recalls of the switch to Google. But the only comment he got from Yahoo executives was complaints that people were searching too much and they would have to pay higher fees to Google.

But the money Google received for providing search was not the biggest benefit. Even more valuable was that it now had access to many more users and much more data. It would be data that took Google search to the next level. The search behavior of users, captured and encapsulated in the logs that could be analyzed and mined, would make Google the ultimate learning machine.


Levy, Steven. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New York: Simon & Schuster, 2011.

September 9, 2013

Silicon Valley May Be Insulated from the Jobs Ordinary People Need to Get Done

A long while ago I read somewhere that in his prime Bill Gates deliberately tested Microsoft software on the limited hardware that mainstream customers could afford, rather than on the cutting-edge hardware he himself could easily afford. I thought that this gave an important clue to Gates' and Microsoft's success.

Christensen and Raynor (2003) suggest that the successful entrepreneur will think hard about what jobs ordinary people want to get done, but are having difficulty doing.

The passages quoted below suggest that Silicon Valley entrepreneurs are insulated from ordinary life, and so may need to work harder at learning what the real problems are.

(p. B5) Engineers tend to move to the Bay Area because of the opportunity to get together with other engineers and, just maybe, create a great company, Mr. Smith said. But in a region that has the highest concentration of tech workers in the United States, according to the Bureau of Labor Statistics, the bars, restaurants and other haunts of entrepreneurs can be an echo chamber. The result can be a focus on solutions for mundane problems.

. . .

. . . too often, says Jason Pontin, the editor in chief and publisher of MIT Technology Review, . . . start-ups are solving "fake problems that don't actually create any value." Mr. Pontin knows a thing or two about companies that aren't exactly reaching for the stars. From 1996 to 2002, he was the editor of Red Herring, a magazine in San Francisco that chronicled the region's dot-com boom and eventual collapse.

For the full commentary, see:

NICK BILTON. "Disruptions: The Echo Chamber of Silicon Valley." The New York Times (Mon., June 3, 2013): B5.

(Note: ellipses added.)

(Note: the online version of the commentary has the date June 2, 2013.)

The Christensen and Raynor book that I mention above, is:

Christensen, Clayton M., and Michael E. Raynor. The Innovator's Solution: Creating and Sustaining Successful Growth. Boston, MA: Harvard Business School Press, 2003.

August 22, 2013

"We Just Begged and Borrowed" for Equipment

(p. 32) Google was handling as many as 10,000 queries a day. At times it was consuming half of Stanford's Internet capacity. Its appetite for equipment and bandwidth was voracious. "We just begged and borrowed," says Page. "There were tons of computers around, and we managed to get some." Page's dorm room was essentially Google's operations center, with a motley assortment of computers from various manufacturers stuffed into a homemade version of a server rack-- a storage cabinet made of Legos. Larry and Sergey would hang around the loading dock to see who on campus was getting computers-- companies like Intel and Sun gave lots of free machines to Stanford to curry favor with employees of the future-- (p. 33) and then the pair would ask the recipients if they could share some of the bounty.

That still wasn't enough. To store the millions of pages they had crawled, the pair had to buy their own high-capacity disk drives. Page, who had a talent for squeezing the most out of a buck, found a place that sold refurbished disks at prices so low-- a tenth of the original cost-- that something was clearly wrong with them. "I did the research and figured out that they were okay as long as you replaced the [disk] operating system," he says. "We got 120 drives, about nine gigs each. So it was about a terabyte of space." It was an approach that Google would later adopt in building infrastructure at low cost.

Larry and Sergey would be sitting by the monitor, watching the queries-- at peak times, there would be a new one every second-- and it would be clear that they'd need even more equipment. What next? they'd ask themselves. Maybe this is real.


Levy, Steven. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New York: Simon & Schuster, 2011.

(Note: italics in original.)

August 18, 2013

Excite Rejected Google Because It Was too Good

(p. 28) Maybe the closest Page and Brin came to a deal was with Excite, a search-based company that had begun-- just like Yahoo-- with a bunch of sharp Stanford kids whose company was called Architext before the venture capitalists (VCs) got their hands on it and degeekified the name. Terry Winograd, Sergey's adviser, accompanied them to a meeting with Vinod Khosla, the venture capitalist who had funded Excite.

. . .

(p. 29) Khosla made a tentative counteroffer of $750,000 total. But the deal never happened. Hassan recalls a key meeting that might have sunk it. Though Excite had been started by a group of Stanford geeks very much like Larry and Sergey, its venture capital funders had demanded they hire "adult supervision," the condescending term used when brainy geeks are pushed aside as top executives and replaced by someone more experienced and mature, someone who could wear a suit without looking as though he were attending his Bar Mitzvah. The new CEO was George Bell, a former Times Mirror magazine executive. Years later, Hassan would still laugh when he described the meeting between the BackRub team and Bell. When the team got to Bell's office, it fired up BackRub in one window and Excite in the other for a bake-off.

The first query they tested was "Internet." According to Hassan, Excite's first results were Chinese web pages where the English word "Internet" stood out among a jumble of Chinese characters. Then the team typed "Internet" into BackRub. The first two results delivered pages that told you how to use browsers. It was exactly the kind of helpful result that would most likely satisfy someone who made the query.

Bell was visibly upset. The Stanford product was too good. If Excite were to host a search engine that instantly gave people information they sought, he explained, the users would leave the site instantly. Since his ad revenue came from people staying on the site--" stickiness" was the most desired metric in websites at the time-- using BackRub's technology would be (p. 30) counterproductive. "He told us he wanted Excite's search engine to be 80 percent as good as the other search engines," says Hassan. And we were like, "Wow, these guys don't know what they're talking about."

Hassan says that he urged Larry and Sergey right then, in early 1997, to leave Stanford and start a company. "Everybody else was doing it," he says. "I saw Hotmail and Netscape doing really well. Money was flowing into the Valley. So I said to them, 'The search engine is the idea. We should do this.' They didn't think so. Larry and Sergey were both very adamant that they could build this search engine at Stanford."

"We weren't ... in an entrepreneurial frame of mind back then," Sergey later said.


Levy, Steven. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New York: Simon & Schuster, 2011.

(Note: ellipsis between paragraphs added; ellipsis in last sentence, in original.)

August 16, 2013

New Technologies Often Are Feared at First

(p. 4) It is hard to think of a technology that wasn't feared when it was introduced. In his Atlantic article, Mr. Carr says that Socrates feared the impact that writing would have on man's ability to think. The advent of the printing press summoned similar fears. It wouldn't be the last time.

When Hewlett-Packard invented the HP-35, the first hand-held scientific calculator, in 1972, the device was banned from some engineering classrooms. Professors feared that engineers would use it as a crutch, that they would no longer understand the relationships that either penciled calculations or a slide rule somehow provided for proficient scientific thought.

But the HP-35 hardly stultified engineering skills. Instead, in the last 36 years those engineers have brought us iPods, cellphones, high-definition TV and, yes, Google and Twitter. It freed engineers from wasting time on mundane tasks so they could spend more time creating.

Many technological advances have that effect. Take tax software, for instance. The tedious job of filing a tax return no longer requires several evenings, but just a few hours. It gives us time for more productive activities.

For the full commentary, see:

DAMON DARLIN. "PING; Technology Doesn't Dumb Us Down. It Frees Our Minds." The New York Times, SundayBusiness Section (Sun., September 21, 2008): 4.

(Note: the online version of the commentary has the date September 20, 2008.)

August 14, 2013

"Web Links Were Like Citations in a Scholarly Article"

(p. 17) Page, a child of academia, understood that web links were like citations in a scholarly article. It was widely recognized that you could identify which papers were really important without reading them-- simply tally up how many other papers cited them in notes and bibliographies. Page believed that this principle could also work with web pages. But getting the right data would be difficult. Web pages made their outgoing links transparent: built into the code were easily identifiable markers for the destinations you could travel to with a mouse click from that page. But it wasn't obvious at all what linked to a page. To find that out, you'd have to somehow collect a database of links that connected to some other page. Then you'd go backward.


Levy, Steven. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New York: Simon & Schuster, 2011.
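Page's "go backward" idea is, in essence, inverting a forward-link map into a backlink index. Here is a minimal sketch in Python of that inversion, with a hypothetical three-page crawl standing in for the web; it is my illustration of the idea in the passage above, not Google's code.

# Invert outgoing links (easy to read off each page) into incoming links
# (which must first be collected, then traversed "backward").
from collections import defaultdict

# Hypothetical crawl results: page -> pages it links out to.
forward_links = {
    "pageA.html": ["pageB.html", "pageC.html"],
    "pageB.html": ["pageC.html"],
    "pageC.html": ["pageA.html"],
}

backlinks = defaultdict(list)   # page -> pages that link to it
for source, targets in forward_links.items():
    for target in targets:
        backlinks[target].append(source)

# The citation-count analogy: tally incoming links as a crude importance score.
for page in sorted(backlinks):
    print(f"{page}: {len(backlinks[page])} incoming link(s) from {backlinks[page]}")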

August 5, 2013

In the Plex Helps Us Understand Entrepreneurs Page and Brin


Source of book image:

In the Plex goes from detail to detail of the values, actions and quirks of a large cast of characters who have been involved in the Google story. I did not find the book as consistently gripping as Isaacson's Steve Jobs biography.

But some of the details help suggest new hypotheses, or test old ones, on important issues of entrepreneurship and technological progress. Some parts are revealing of the goals and methods of Page and Brin.

Over the next few weeks I will quote some of the more interesting passages.

Book discussed:

Levy, Steven. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New York: Simon & Schuster, 2011.

July 31, 2013

Falling Computer Prices Cured the "Digital Divide"

(p. 304) The more evident the power of the internet as an uplifting force became, the more evident the divide between the digital haves and have-nots. One sociological study concluded that there were "two Americas" emerging. The citizens of one America were poor people who could not afford a computer, and of the other, wealthy individuals equipped with PCs who reaped all the benefits. During the 1990s, when technology boosters like me were promoting the advent of the internet, we were often asked: What are we going to do about the digital divide? My an-(p. 305)swer was simple: nothing. We didn't have to do anything, because the natural history of a technology such as the internet was self-fulfilling. The have-nots were a temporary imbalance that would be cured (and more) by technological forces. There was so much profit to be made connecting up the rest of the world, and the unconnected were so eager to join, that they were already paying higher telecom rates (when they could get such service) than the haves. Furthermore, the costs of both computers and connectivity were dropping by the month. At that time most poor in America owned televisions and had monthly cable bills. Owning a computer and having internet access was no more expensive and would soon be cheaper than TV. In a decade, the necessary outlay would become just a $100 laptop. Within the lifetimes of all born in the last decade, computers of some sort (connectors, really) will cost $5.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

July 30, 2013

The French and Japanese Believe Water Cleans the Anus Better than Dry Paper


Source of the book image:

(p. C34) Ms. George's book is lively . . . . It is hard not to warm to a writer who can toss off an observation like this one: "I like engineers. They build things that are useful and sometimes beautiful -- a brick sewer, a suspension bridge -- and take little credit. They do not wear black and designer glasses like architects. They do not crow."

. . .

In Japan, where toilets are amazingly advanced -- most of even the most basic have heated seats and built-in bidet systems for front and rear -- the American idea of cleaning one's backside with dry paper is seen as quaint at best and disgusting at worst. As Ms. George observes: "Using paper to cleanse the anus makes as much sense, hygienically, as rubbing your body with dry tissue and imagining it removes dirt."

For the full review, see:

DWIGHT GARNER. "BOOKS OF THE TIMES; 15 Minutes of Fame for Human Waste and Its Never-Ending Assembly Line." The New York Times (Fri., December 12, 2008): C34.

(Note: ellipses added.)

(Note: the online version of the review has the date December 11, 2008.)

The book under review, is:

George, Rose. The Big Necessity: The Unmentionable World of Human Waste and Why It Matters. New York: Metropolitan Books, 2008.

July 28, 2013

Children of Chinese Entrepreneurs Want to Work for Government


"Engineering student Xie Chaobo has yet to land a job." Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. A1) BEIJING--Xie Chaobo figures he has the credentials to land a job at one of China's big state-owned firms. He is a graduate student at Tsinghua University, one of China's best. His field of study is environmental engineering, one of China's priorities. And he is experimenting with new techniques for identifying water pollutants, which should make him a valuable catch.

But he has applied to 30 companies so far and scored just four interviews, none of which has led to a job.

Although Mr. Xie's parents are entrepreneurs who have built companies that make glasses, shoes and now water pumps, he has no interest in working at a private startup. Chinese students "have been told since we were children to focus on stability instead of risk," the 24-year-old engineering student says.

Over the past decade, the number of new graduates from Chinese universities has increased sixfold to more than six million a year, creating an epic glut that is depressing wages, (p. A10) leaving many recent college graduates without jobs and making students fearful about their future. Two-thirds of Chinese graduates say they want to work either in the government or big state-owned firms, which are seen as recession-proof, rather than at the private companies that have powered China's remarkable economic climb, surveys indicate. Few college students today, according to the surveys, are ready to leave the safe shores of government work and "jump into the sea," as the Chinese expression goes, to join startups or go into business for themselves, although many of their parents did just that in the 1990s.

For the full story, see:

MIKE RAMSEY and VALERIE BAUERLEIN. "Tesla Clashes With Car Dealers; Electric-Vehicle Maker Wants to Sell Directly to Consumers; Critics Say Plan Violates Franchise Laws." The Wall Street Journal (Tues., June 18, 2013): B1-B2.

ChineseStudentAfterGraduationPlans2013-07-23.jpg Source of table: online version of the WSJ article quoted and cited above.

July 27, 2013

In 1916 a Single Home Motor Would Drive All Home Machines

(p. 301) By the 1910s, electric motors had started their inevitable spread into homes. They had been domesticated. Unlike a steam engine, they did not smoke or belch or drool. Just a tidy, steady whirr from a five-pound (p. 302) hunk. As in factories, these single "home motors" were designed to drive all the machines in one home. The 1916 Hamilton Beach "Home Motor" had a six-speed rheostat and ran on 110 volts. Designer Donald Norman points out a page from the 1918 Sears, Roebuck and Co. catalog advertising the Home Motor for $8.75 (which is equivalent to about $100 these days). This handy motor would spin your sewing machine. You could also plug it into the Churn and Mixer Attachment ("for which you will find many uses") and the Buffer and Grinder Attachments ("will be found very useful in many ways around the home"). The Fan Attachment "can be quickly attached to Home Motor," as well as the Beater Attachment to whip cream and beat eggs.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: the quote above omits the copy of a 1918 electric motor ad that appeared in the middle of the original paragraph.)

July 24, 2013

Laws to Protect Car Dealers, Keep Car Prices High

TeslaGalleryVirginia2013-07-23.jpg "Tesla 'galleries' such as this one in McLean, Va., can show but not sell cars." Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. B1) RALEIGH, N.C.--Elon Musk made a fortune disrupting the status quo in online shopping and renewable energy. Now he's up against his toughest challenge yet: local car dealers.

Mr. Musk, the billionaire behind PayPal and now Tesla Motors Inc., wants to sell his $70,000 Tesla electric luxury vehicles directly to consumers, bypassing franchised automobile dealers. Dealers are flexing their considerable muscle in states including Texas and Virginia to stop him.

The latest battleground is North Carolina, where the Republican-controlled state Senate last month unanimously approved a measure that would block Tesla from selling online, its only sales outlet here. Tesla has staged whiz-bang test drives for legislators in front of the State House and hired one of the state's most influential lobbyists to stave off a similar vote in the House before the legislative session ends in early July.

The focus of the power struggle between Mr. Musk and auto dealers is a thicket of state franchise laws, many of which go back to the auto industry's earliest days when industry pioneer Henry Ford began turning to eager entrepreneurs to help sell his Model T.

Dealers say laws passed over the decades to prevent car makers from selling directly to consumers are justified because without them auto makers could use their economic clout to sell vehicles for less than their independent franchisees.

For the full story, see:

MIKE RAMSEY and VALERIE BAUERLEIN. "Tesla Clashes With Car Dealers; Electric-Vehicle Maker Wants to Sell Directly to Consumers; Critics Say Plan Violates Franchise Laws." The Wall Street Journal (Tues., June 18, 2013): B1-B2.

July 23, 2013

If Driverless Cars Only Kill Half a Million Per Year, that "Would Be an Improvement"

(p. 261) . . . , human-piloted cars cause great harm, killing millions of people each year worldwide. If robot-controlled cars killed "only" half a million people per year, it would be an improvement!


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: ellipsis added.)

July 21, 2013

Students Learn More in Air Conditioning

(p. 5) My first year as a public school teacher, I taught at Manhattan's P.S. 98, which did not have air-conditioning. From mid-May until June's end -- roughly 17 percent of the school year -- the temperature in my classroom hovered in the 80s and often topped 90 degrees.

Students wilted over desks. Academic gains evaporated. Even restless pencil tappers and toe wigglers grew lethargic. Absenteeism increased as children sought relief at home or outdoors. By day's end, my hair was plastered to my face with perspiration.

It seems obvious: schools need to be cool. It's absurd to talk about inculcating 21st-century skills in classrooms that resemble 19th-century sweatshops.

. . .

Cool schools are critical if we are to boost achievement. Studies show that concentration and cognitive abilities decline substantially after a room reaches 77 or 78 degrees. This is a lesson American businesses learned long ago. . . . A pleasant atmosphere leads to more productive employees.

. . .

It isn't just white-collar laborers who work in cool climates. Amazon announced last year that it was spending $52 million to upgrade its warehouses with air-conditioning. Yet we can't seem to do the same for vulnerable children, though some of the achievement gap is most likely owing to a lack of air-conditioning. One Oregon study found that students working in three different temperature settings had strikingly different results on exams, suggesting that sweating a test actually undermines performance.

Students who enjoy the luxury of air-conditioning may enjoy an unfair advantage over their hotter peers.

We are also investing enormous sums to extend the school day and school year in many locales. But these investments won't be effective if schools are ovens.

For the full commentary, see:

SARA MOSLE. "SCHOOLING; Schools Are Not Cool." The New York Times, SundayReview Section (Sun., June 2, 2013): 5.

(Note: ellipses added.)

(Note: the online version of the commentary has the date June 1, 2013.)

July 20, 2013

Creator of C Language Worked Late "in a Chaotic Office"


"Dennis Ritchie received the Japan Prize in May at Bell Labs in Murray Hill, N.J., for his role in co-developing the Unix operating system." Source of caption and photo: online version of the WSJ obituary quoted and cited below.

(p. A7) Dennis Ritchie invented C, the computer-programming language that underlies Microsoft Windows, the Unix operating system and much of the other software running on computers around the world.

Mr. Ritchie was a longtime research scientist at Bell Labs, originally AT&T's research division. Bell Labs announced that he died at age 70 [his body was discovered on October 12, 2011].

. . .

Twitter and other online forums crackled with tributes to Mr. Ritchie after his death was announced.

One came from James Grimmelmann, a former Microsoft programmer who now is an associate professor at New York Law School.

"If [Steve] Jobs was a master architect of skyscrapers, it was Ritchie and his collaborators who invented steel," Mr. Grimmelmann wrote.

Long-haired and often working late into the night in a chaotic office, Mr. Ritchie fulfilled in some ways the computer-nerd stereotype. He was given to gnomic pronouncements on his creations.

"Unix is very simple, it just needs a genius to understand its simplicity" was one. Another: "C is quirky, flawed and an enormous success."

For the full obituary, see:

STEPHEN MILLER. "REMEMBRANCES; DENNIS RITCHIE 1941-2011; Pioneer Programmer Shaped the Evolution of Computers." The Wall Street Journal (Fri., October 14, 2011): A7.

(Note: ellipsis, and words in first brackets, added; name in second brackets, in original.)

July 19, 2013

The Precautionary Principle Is Biased Against the New, and Ignores the Risks of the Old

(p. 250) In general the Precautionary Principle is biased against anything new. Many established technologies and "natural" processes have unexamined faults as great as those of any new technology. But the Precautionary Principle establishes a drastically elevated threshold for things that are new. In effect it grandfathers in the risks of the old, or the "nat-(p. 251)ural." A few examples: Crops raised without the shield of pesticides generate more of their own natural pesticides to combat insects, but these indigenous toxins are not subject to the Precautionary Principle because they aren't "new." The risks of new plastic water pipes are not compared with the risks of old metal pipes. The risks of DDT are not put in context with the old risks of dying of malaria.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

July 16, 2013

Will Apple Innovate Without Jobs?

"Steve Jobs, introducing the iPhone 4 in January [2011]." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B4) "The good news for Apple is that the product road map in this industry is pretty much in place two and three years out," said David B. Yoffie, a professor at the Harvard Business School. "So 80 percent to 90 percent of what would happen in that time would be the same, even without Steve."

"The real challenge for Apple," Mr. Yoffie continued, "will be what happens beyond that road map. Apple is going to need a new leader with a new way of recreating and managing the business in the future."

. . .

His design decisions, Mr. Jobs explained, were shaped by his understanding of both technology and popular culture. His own study and intuition, not focus groups, were his guide. When a reporter asked what market research went into the iPad, Mr. Jobs replied: "None. It's not the consumers' job to know what they want."

. . .

Great products, Mr. Jobs once explained, were a triumph of taste, of "trying to expose yourself to the best things humans have done and then trying to bring those things into what you are doing."

Mr. Yoffie said Mr. Jobs "had a unique combination of visionary creativity and decisiveness," adding: "No one will replace him."

For the full story, see:

STEVE LOHR. "Without Its Master of Design, Apple Will Face Challenges." The New York Times (Thurs., August 25, 2011): B1 & B4.

(Note: ellipses in text, and bracketed year in caption, added.)

(Note: the online version of the story has the date August 24, 2011, and the slightly longer title "Without Its Master of Design, Apple Will Face Many Challenges.")

July 15, 2013

Chinese Peasants Applied Precautionary Principle to Scythe Technology

(p. 249) In a letter Orville Wright wrote to his inventor friend Henry Ford, Wright recounts a story he heard from a missionary stationed in China. Wright told Ford the story for the same reason I tell it here: as a cautionary tale about speculative risks. The missionary wanted to improve the laborious way the Chinese peasants in his province harvested grain. The local farmers clipped the stalks with some kind of small hand shear. So the missionary had a scythe shipped in from America and demonstrated its superior productivity to an enthralled crowd. "The next morning, however, a delegation came to see the missionary. The scythe must be destroyed at once. What, they said, if it should fall into the hands of thieves; a whole field could be cut and carried away in a single night." And so the scythe was banished, progress stopped, because nonusers could imagine a possible--but wholly improbable--way it could significantly harm their society.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

July 14, 2013

Record Companies Refused to See Efficiency of Napster Distribution System


Source of book image: online version of the WSJ review quoted and cited below.

(p. A15) . . . the central character in "Appetite for Self-Destruction" is technological change.

. . .

Record labels scrambled to negotiate with Napster and develop a legal version of the service with multiple revenue streams. The attempts all failed. In Mr. Knopper's telling, there were unreasonable demands on all sides. But he faults music executives for "cling[ing] to the old, suddenly inefficient model of making CDs and distributing them to record stores. . . . In this world, the labels controlled -- and profited from -- everything." In the new world being ushered in by Napster, he writes, control was shifting "to a snot-nosed punk and his crazy uncle."

The labels' inability to reach an agreement with Napster destroyed "the last chance for the record industry as we know it to stave off certain ruin," Mr. Knopper writes in a typically overheated passage. Had a deal been consummated, he suggests, a legal version of Napster might have generated revenues of $16 billion in 2002 and saved the industry. Whether or not the author's estimate is accurate, his larger point remains: The music industry's big mistake was trying to protect a business model that no longer worked. Litigation would not keep music consumers offline.

For the full review, see:

JEREMY PHILIPS. "BUSINESS BOOKSHELF; Spinning Out of Control; How the record industry missed out on a chance to compete in a new digital world." The Wall Street Journal (Weds., February 11, 2009): A15.

(Note: first two ellipses added; third ellipsis in original.)

The book under review is:

Knopper, Steve. Appetite for Self-Destruction: The Spectacular Crash of the Record Industry in the Digital Age. New York: Free Press, 2009.

July 12, 2013

The Decay of River Rouge's Diseconomies of Scale

"The rolling hall at Ford's River Rouge plant, one of Andrew Moore's photographs of Detroit." Source of caption and of the Andrew Moore photo: online version of the NYT article quoted and cited below.

Ford's River Rouge plant near Detroit is a standard textbook example of diseconomies of scale (aka diminishing returns to scale). The image above is an apt illustration of the consequences of diseconomies of scale.

(p. 19) A Connecticut native, Mr. Moore moved to New York in 1980, living near South and John Streets in Lower Manhattan. At night he would wander the neighborhood taking pictures of the construction of the South Street Seaport, which kindled an interest in documenting "life in flux," he said. "I like places in transformation, the process of becoming and changing."

. . .

Photos like those of the enormous rolling hall at Ford's River Rouge plant and a sunset over the Bob-Lo Island boat dock were inspired, Mr. Moore said, by 19th-century American landscape painters like Frederic Church and Martin Johnson Heade.

For the full story, see:

MIKE RUBIN. "Capturing the Idling of the Motor City." The New York Times, Arts&Leisure Section (Sun., August 21, 2011): 19.

(Note: ellipsis added.)

(Note: the online version of the story has the date August 18, 2011.")

Andrew Moore has a book of his photos of Detroit:

Moore, Andrew, and Philip Levine. Andrew Moore: Detroit Disassembled. Bologna, Italy: Damiani/Akron Art Museum, 2010.

July 7, 2013

The Precautionary Principle Stops Technological Progress

(p. 247) All versions of the Precautionary Principle hold this axiom in common: A technology must be shown to do no harm before it is embraced. It must be proven to be safe before it is disseminated. If it cannot be proven safe, it should be prohibited, curtailed, modified, junked, or ignored. In other words, the first response to a new idea should be inaction until its safety is established. When an innovation appears, we should pause. Only after a new technology has been deemed okay by the certainty of science should we try to live with it.

On the surface, this approach seems reasonable and prudent. Harm must be anticipated and preempted. Better safe than sorry. Unfortunately, the Precautionary Principle works better in theory than in practice. "The precautionary principle is very, very good for one thing--stopping technological progress," says philosopher and consultant Max More. Cass R. Sunstein, who devoted a book to debunking the principle, says, "We must challenge the Precautionary Principle not because it leads in bad directions, but because read for all it is worth, it leads in no direction at all."


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

July 4, 2013

Walker Says Those Who Call Him "Patent Troll" Want His Property Without Paying


Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. B1) Jay Walker turned his idea for "name your own price" Internet auctions into a fortune by starting Inc. Now the entrepreneur is trying to cash in on his ideas by suing other companies.

Since it was founded in 1994 as a research lab, Walker Digital LLC has made much of its money by spinning out its inventions, like online travel agent Priceline and vending-machine firm Vendmore Systems LLC, as independent businesses.

. . .

Mr. Walker defends his newly aggressive tactics, which some critics compare to those of "patent trolls," a derogatory term for firms that opportunistically enforce patents. Without the lawsuits, he said, his patents could expire while other companies exploit them. Patents have a 20-year lifespan.

"Not only are we not a troll, but the people who want to label me are often the same ones that want to use our property and not pay," Mr. Walker said in an interview.

For the full story, see:

JOHN LETZING. "Founder of Priceline Spoiling for a Fight Over Tech Patents." The Wall Street Journal (Mon., August 22, 2011): B1 & B10.

(Note: ellipsis added.)

July 3, 2013

Amish Break the Golden Rule

(p. 237) If we apply the ubiquity test--what happens if everyone does it?--to the Amish way, the optimization of choice collapses. By constraining the suite of acceptable occupations and narrowing education, the Amish are holding back possibilities not just for their children but indirectly for all.

If you are a web designer today, it is only because many tens of thousands of other people around you and before you have been expanding the realm of possibilities. They have gone beyond farms and home shops to invent a complex ecology of electronic devices that require new expertise and new ways of thinking. If you are an accountant, untold numbers of creative people in the past devised the logic and tools of accounting for you. If you do science, your instruments and field of study have been created by others. If you are a photographer, or an extreme sports athlete, or a baker, or an auto mechanic, or a nurse--then your potential has been given an opportunity by the work of others. You are being expanded as others expand themselves.

. . .

. . . as you embrace new technologies, you are indirectly working for future generations of Amish, and for the minimite homesteaders, even though they are not doing as much for you. Most of what you adopt they will ignore. But every once in a while your adoption of "something that doesn't quite work yet" (Danny Hillis's definition of technology) will evolve into an appropriate tool they can use. It might be a solar grain dryer; it might be a cure for cancer.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: ellipses added.)

June 30, 2013

iPhone: "A Gleaming World of Innovation and Opportunity, of Capitalism Behaving Well"

"The theft of electronic devices like iPhones has fueled a rise in subway crime this year, the police say. In the past, New Yorkers were mugged, sometimes killed, for bomber jackets, Cazal glasses and Air Jordan sneakers." Source of caption: print version of the NYT article quoted and cited below. Source of photo: online version of the NYT article quoted and cited below.

(p. 24) The current spate of iPhone thefts feels, if anything, more poignant than disruptive. Apple products have always read as cooler than their rivals' because their design suggests a gleaming world of innovation and opportunity, of capitalism behaving well -- a world that seems ever diminishing, ever less accessible to the struggling and young.

Unlike the sneakers and glasses that caused such a fury in the '80s and '90s, iPhones didn't originate in the celebrity system. They come with a democratic ethos (if not the analogous price tag); BlackBerrys are for suits, but even a child can work an iPhone. Wasn't everyone supposed to have a shot?

For the full story, see:

GINIA BELLAFANTE. "BIG CITY; Easy to Use and Easy to Steal, a Status Object Inches Out of Reach." The New York Times, First Section (Sun., October 30, 2011): 24.

(Note: the first paragraph quoted above is from the print version, rather than from the somewhat different online version. The second quoted paragraph is the same in both versions.)

(Note: the online version of the story has the date October 28, 2011, and has the slightly different title "BIG CITY; Easy to Use, or Steal, but Inching Out of Reach.")

June 29, 2013

"Self-Reliant" Amish Depend on the Technologies of the Outside World

(p. 230) The Amish are a little sensitive about this, but their self-reliant lifestyle as it is currently practiced is heavily dependent on the greater technium that surrounds their enclaves. They do not mine the metal they build their mowers from. They do not drill or process the kerosene they use. They don't manufacture the solar panels on their roofs. They don't grow or weave the cotton in their clothes. They don't educate or train their own doctors. They also famously do not enroll in armed forces of any kind. (But in compensation for that, the Amish are world-class volunteers in the outside world. Few people volunteer more often, or with (p. 231) more expertise and passion, than the Amish/Mennonites. They travel by bus or boat to distant lands to build homes and schools for the needy.) If the Amish had to generate all their own energy, grow all their clothing fibers, mine all metal, harvest and mill all lumber, they would not be Amish at all because they would be running large machines, dangerous factories, and other types of industry that would not sit well in their backyards (one of the criteria they use to decide whether a craft is appropriate for them). But without someone manufacturing this stuff, they could not maintain their lifestyle or prosperity. In short, the Amish depend on the outside world for the way they currently live. Their choice of minimal technology adoption is a choice--but a choice enabled by the technium. Their lifestyle is within the technium, not outside it.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

June 26, 2013

Larry Page Makes an O.K. Decision Now, Rather than a Perfect Decision Later

"Larry Page has pushed for quicker decision-making and jettisoned more than 25 projects that were not up to snuff." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. A1) MOUNTAIN VIEW, Calif. -- Larry Page, Google's chief executive, so hates wasting time at meetings that he once dumped his secretary to avoid being scheduled for them. He does not much like e-mail either -- even his own Gmail -- saying the tedious back-and-forth takes too long to solve problems.

. . .

(p. A3) Borrowing from the playbooks of executives like Steven P. Jobs and Mayor Michael R. Bloomberg, he has put his personal imprint on the corporate culture, from discouraging excessive use of e-mail to embracing quick, unilateral decision-making -- by him, if need be.

"Ever since taking over as C.E.O., I have focused much of my energy on increasing Google's velocity and execution, and we're beginning to see results," Mr. Page, 38, told analysts recently.

. . .

Despite the many external pressures on Google, it is dominant in its business and highly profitable. But, when asked at a recent conference about the biggest threat to his company, Mr. Page answered in one word, "Google."

The problem was that the company had ballooned so quickly -- it now has more than 31,000 employees and $27.3 billion in revenue so far this year -- that it had become sclerotic. A triumvirate of Mr. Page, his co-founder, Sergey Brin, and Eric E. Schmidt, Google's former chief and current chairman, had to agree before anything could be done. The unwieldy management and glacial pace of decision-making were particularly noticeable in the Valley, where start-ups overtake behemoths in months.

It is different now.

"It's much more of a style like Steve Jobs than the three-headed monster that Google was," said a former Google executive who has spoken with current executives about the changes and spoke anonymously to preserve business relationships. "When Eric was there, you'd walk into a product meeting or a senior staff meeting, and everyone got to weigh in on every decision. Larry is much more willing to make an O.K. decision and make it now, rather than a perfect decision later."

For the full story, see:

CLAIRE CAIN MILLER. "Google's Chief Works to Trim a Bloated Ship." The New York Times (Thurs., November 10, 2011): A1 & A3.

(Note: ellipses added.)

(Note: the online version of the story has the date November 9, 2011.)

June 25, 2013

Limited Choice Is Cost of Amish Community Closeness

(p. 230) . . . the cost of . . . closeness and dependency is limited choice. No education beyond eighth grade. Few career options for guys, none besides homemaker for girls. For the Amish and minimites, one's fulfillment must blossom inside the traditional confines of a farmer, tradesman, or housewife. But not everyone is born to be a farmer. Not every human is ideally matched to the rhythms of horse and corn and seasons and the eternal close inspection of village conformity. Where in the Amish scheme of things is the support for a mathematical genius or a person who might spend all day composing new music?


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: ellipses added.)

June 21, 2013

Amish "Are Big Boosters of Genetically Modified Corn"

(p. 222) The Amish use disposable diapers (why not?), chemical fertilizers, and pesticides, and they are big boosters of genetically modified corn. In Europe this corn is called Frankenfood. I asked a few of the Amish elders about that last one. Why do they plant GMOs? Well, they reply, corn is susceptible to the corn borer, which nibbles away at the bottom of the stem and occasionally topples the stalk. Modern 500-horsepower harvesters don't notice this fall; they just suck up all the material and spit out the corn into a bin. The Amish harvest their corn semimanually. It's cut by a chopper device and then pitched into a thresher. But if there are a lot of stalks that are broken, they have to be pitched by hand. That is a lot of very hard, sweaty work. So they plant Bt corn. This genetic mutant carries the genes of the corn borer's enemy, Bacillus thuringiensis, which produces a toxin deadly to the corn borer. Fewer stalks are broken and the harvest can be aided with machines, so yields are up. One elder Amish man whose sons run his farm said he was too old to be pitching heavy, broken cornstalks, and he told his sons that he'd only help them with the harvest if they planted Bt corn. The alternative was to purchase expensive, modern harvesting equipment, which none of them wanted. So the technology of genetically modified crops allowed the Amish to continue using old, well-proven, debt-free equipment, which accomplished their main goal of keeping the family farm together. They did not use these words, but they made it clear that they considered genetically modified crops appropriate technology for family farms.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: italics in original.)

June 19, 2013

Cars Increase Our Individual Freedom

(p. A13) Cars appeal powerfully to one of the most important conservative values: individual freedom. Straphangers in public conveyances can only travel in groups, moving along with hordes of strangers according to schedules imposed by others. Bicyclists, free as they may be, are clearly limited by distance and time constraints. Once you get into a car, however, you go wherever you want, whenever you want, subject only to your ability to put gas in the tank.

For the full commentary, see:

MICHAEL MEDVED. "OPINION; Honk If You Were Ever Devoted to a Car; I asked for a chance to say a proper goodbye to our family Plymouth. The night before we traded the car in, I slept in it." The Wall Street Journal (Sat., March 9, 2013): A13.

(Note: the online version of the commentary has the date March 8, 2013.)

June 17, 2013

Amish Factory Uses Pneumatics in Place of Electricity

(p. 219) The Amish also make a distinction between technology they have at work and technology they have at home. I remember an early visit to an Amish man who ran a woodworking shop near Lancaster, Pennsylvania. . . .

. . .

(p. 220) While the rest of his large workshop lacked electricity beyond that naked bulb, it did not lack power machines. The place was vibrating with an ear-cracking racket of power sanders, power saws, power planers, power drills, and so on. Everywhere I turned there were bearded men covered in sawdust pushing wood through screaming machines. This was not a circle of Renaissance craftsmen hand-tooling masterpieces. This was a small-time factory cranking out wooden furniture with machine power. But where was the power coming from? Not from windmills.

Amos took me around to the back where a huge SUV-sized diesel generator sat. It was massive. In addition to a gas engine there was a very large tank, which, I learned, stored compressed air. The diesel engine burned petroleum fuel to drive the compressor that filled the reservoir with pressure. From the tank, a series of high-pressure pipes snaked off toward every corner of the factory. A hard rubber flexible hose connected each tool to a pipe. The entire shop ran on compressed air. Every piece of machinery was running on pneumatic power. Amos even showed me a pneumatic switch, which he could flick like a light switch to turn on some paint-drying fans running on air.

The Amish call this pneumatic system "Amish electricity." At first, pneumatics were devised for Amish workshops, but air power was seen as so useful that it migrated to Amish households. In fact, there is an entire cottage industry in retrofitting tools and appliances to run on Amish electricity. The retrofitters buy a heavy-duty blender, say, and yank out the electrical motor. They then substitute an air-powered motor of appropriate size, add pneumatic connectors, and bingo, your Amish mom now has a blender in her electricity-less kitchen. You can get a pneumatic sewing machine and a pneumatic washer/dryer (with propane heat). In a display of pure steam-punk (air-punk?) nerdiness, Amish hackers try to outdo one another in building pneumatic versions of electrified contraptions. Their mechanical skill is quite impressive, particularly since none went to school beyond the eighth grade. They (p. 221) love to show off their geekiest hacks. And every tinkerer I met claimed that pneumatics were superior to electrical devices because air was more powerful and durable, outlasting motors that burned out after a few years of hard labor. I don't know if this claim of superiority is true or merely a justification, but it was a constant refrain.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: ellipses added.)

June 13, 2013

If Anarcho-Primitives Destroy Civilization, Billions of City-Dwellers Will Die

(p. 211) . . . , the . . . problem with destroying civilization as we know it is that the alternative, such as it has been imagined by the self-described "haters of civilization," would not support but a fraction of the people alive today. In other words, the collapse of civilization would kill billions. Ironically, the poorest rural inhabitants would fare the best, as they could retreat to hunting and gathering with the least trouble, but billions of urbanites would die within months or even weeks, once food ran out and disease took over. The anarcho-primitives are rather sanguine about this catastrophe, arguing that accelerating the collapse early might save lives in total.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: ellipses added.)

June 12, 2013

Patents Turned Steam from Toy to Engine


Source of book image:

(p. 20) The obvious audience for Rosen's book consists of those who hunger to know what it took to go from Heron of Alexandria's toy engine, created in the first century A.D., to practical and brawny beasts like George and Robert Stephenson's Rocket, which kicked off the age of steam locomotion in 1829. But Rosen is aiming for more than a fan club of steam geeks. The "most powerful idea" of his title is not an early locomotive: "The Industrial Revolution was, first and foremost, a revolution in invention," he writes, "a radical transformation in the process of invention itself." The road to Rocket was built with hundreds of innovations large and small that helped drain the mines, run the mills, and move coal and then people over rails.

. . .

Underlying it all, Rosen argues, was the recognition that ideas themselves have economic value, which is to say, this book isn't just gearhead wonkery, it's legal wonkery too. Abraham Lincoln, wondering why Heron's steam engine languished, claimed that the patent system "added the fuel of interest to the fire of genius." Rosen agrees, offering a forceful argument in the debate, which has gone on for centuries, over whether patents promote innovation or retard it.

Those who believe passionately, as Thomas Jefferson did, that inventions "cannot, in nature, be a subject of property," are unlikely to be convinced. Those who agree with the inventors James Watt and Richard Arkwright, who wrote in a manuscript that "an engineer's life without patent is not worthwhile," will cheer. Either way, Rosen's presentation of this highly intellectual debate will reward even those readers who never wondered how the up-and-down chugging of a piston is converted into consistent rotary motion.

For the full review, see:

JOHN SCHWARTZ. "Steam-Driven Dreams." The New York Times (Sun., August 29, 2010): 20.

(Note: ellipsis added; italicized words in original.)

(Note: the online version of the review has the date August 26, 2010.)

The book under review, is:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

June 9, 2013

Moore's Law: Inevitable or Intel?

I believe that Moore's Law remained true for a long time, not because it was inevitable, but because an exemplary company worked very hard and effectively to make it true.

(p. 159) In brief, Moore's Law predicts that computing chips will shrink by half in size and cost every 18 to 24 months. For the past 50 years it has been astoundingly correct.

It has been steady and true, but does Moore's Law reveal an imperative in the technium? In other words is Moore's Law in some way inevitable? The answer is pivotal for civilization for several reasons. First, Moore's Law represents the acceleration in computer technology, which is accelerating everything else. Faster jet engines don't lead to higher corn yields, nor do better lasers lead to faster drug discoveries, but faster computer chips lead to all of these. These days all technology follows computer technology. Second, finding inevitability in one key area of technology suggests invariance and directionality may be found in the rest of the technium.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.
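
A back-of-the-envelope way to state the halving rule quoted above (this sketch is mine, not Kelly's, and it assumes a single constant halving period T somewhere in the quoted 18-to-24-month range):

c(t) = c(0) \cdot 2^{-t/T}, \qquad T \approx 18\text{--}24 \text{ months},

where c(t) is the cost (or size) of a fixed amount of computing after t months. Over one decade (t = 120 months), T = 24 months gives 2^{120/24} = 2^{5} = 32, roughly a 32-fold drop; T = 18 months gives 2^{120/18} \approx 100, roughly a 100-fold drop.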

June 6, 2013

Faculty Unions Oppose MOOCs that Might Cost Them Their Jobs in Five to Seven Years

"Sebastian Thrun, a research professor at Stanford, is Udacity's chief executive officer." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. A1) SAN JOSE, Calif. -- Dazzled by the potential of free online college classes, educators are now turning to the gritty task of harnessing online materials to meet the toughest challenges in American higher education: giving more students access to college, and helping them graduate on time.

. . .

Here at San Jose State, . . . , two pilot programs weave material from the online classes into the instructional mix and allow students to earn credit for them.

"We're in Silicon Valley, we (p. A3) breathe that entrepreneurial air, so it makes sense that we are the first university to try this," said Mohammad Qayoumi, the university's president. "In academia, people are scared to fail, but we know that innovation always comes with the possibility of failure. And if it doesn't work the first time, we'll figure out what went wrong and do better."

. . .

Dr. Qayoumi favors the blended model for upper-level courses, but fully online courses like Udacity's for lower-level classes, which could be expanded to serve many more students at low cost. Traditional teaching will be disappearing in five to seven years, he predicts, as more professors come to realize that lectures are not the best route to student engagement, and cash-strapped universities continue to seek cheaper instruction.

"There may still be face-to-face classes, but they would not be in lecture halls," he said. "And they will have not only course material developed by the instructor, but MOOC materials and labs, and content from public broadcasting or corporate sources. But just as faculty currently decide what textbook to use, they will still have the autonomy to choose what materials to include."

. . .

Any wholesale online expansion raises the specter of professors being laid off, turned into glorified teaching assistants or relegated to second-tier status, with only academic stars giving the lectures. Indeed, the faculty unions at all three California higher education systems oppose the legislation requiring credit for MOOCs for students shut out of on-campus classes.

. . .

"Our ego always runs ahead of us, making us think we can do it better than anyone else in the world," Dr. Ghadiri said. "But why should we invent the wheel 10,000 times? This is M.I.T., No. 1 school in the nation -- why would we not want to use their material?"

There are, he said, two ways of thinking about what the MOOC revolution portends: "One is me, me, me -- me comes first. The other is, we are not in this business for ourselves, we are here to educate students."

For the full story, see:

TAMAR LEWIN. "Colleges Adapt Online Courses to Ease Burden." The New York Times (Tues., April 30, 2013): A1 & A3.

(Note: ellipses added.)

(Note: the online version of the story has the date April 29, 2013.)

"Katie Kormanik preparing to record a statistics course at Udacity, an online classroom instruction provider in Mountain View, Calif." Source of caption and photo: online version of the NYT article quoted and cited above.

June 4, 2013

Edison, Not Muybridge, Remains the Father of Hollywood


Source of book image: online version of the WSJ review quoted and cited below.

(p. A13) Wish it though we might, this strangely off-center Briton isn't really the Father of Hollywood, nor even a distant progenitor of "Avatar." The famous time-lapse images that he took for Stanford, proving that a horse does take all four hoofs off the ground while galloping--and the tens of thousands of photographs that he went on to make of birds flying and people sneezing or bending over and picking things up--were soon so comprehensively overtaken by newer technologies (lenses, shutters, celluloid) that his stature as a proto-movie-maker was soon reduced to a way-station. His contribution was technically interesting but hardly seminal at all. The tragic reality is that Thomas Edison, with whom Muybridge was friendly enough to propose collaboration, retains the laurels--though, as Mr. Ball points out with restrained politeness, Muybridge might have fared better had he been aware of Edison's reputation for "borrowing the work of others and not returning it."

For the full review, see:

SIMON WINCHESTER. "BOOKSHELF; Lights, Camera, Murder; The time-lapse photos Muybridge took in the 19th century were technically innovative, but they didn't make him the Father of Hollywood." The Wall Street Journal (Thurs., February 6, 2013): A13.

(Note: the online version of the review has the date February 6, 2013.)

The book under review is:

Ball, Edward. The Inventor and the Tycoon: A Gilded Age Murder and the Birth of Moving Pictures. New York: Doubleday, 2013.

June 2, 2013

Tesla CTO Straubel Likes Biography of Tesla


J.B. Straubel, Chief Technology Officer of Tesla Motors. Source of photo: online version of the NYT article quoted and cited below.

(p. 2) J. B. Straubel is a founder and the chief technical officer of Tesla Motors in Palo Alto, Calif. The company makes electric vehicles that some compare to Apple products in terms of obsessive attention to design, intuitive user interface and expense.

READING I like to read biographies of interesting people, mostly scientists and engineers. Right now, it's "Steve Jobs," by Walter Isaacson. One of my favorite biographies was "Wizard: The Life and Times of Nikola Tesla," by Marc Seifer, which I read even before Tesla Motors started.

. . .

WATCHING I really like the movie "October Sky." It's about a guy who grew up in a little coal-mining town around the time of Sputnik. He fell in love with the idea of building rockets and the movie follows him through his high school years when he's building rockets and eventually he ends up becoming an engineer at NASA. I watch it every year or so. It's inspirational. I always come out of it wanting to work harder.

For the full interview, see:

KATE MURPHY. "DOWNLOAD; J. B. Straubel." The New York Times, SundayReview Section (Sun., April 7, 2013): 2.

(Note: ellipsis added; bold in original.)

(Note: the online version of the interview has the date April 6, 2013.)

May 31, 2013

Paul Allen's Account of the Founding of Microsoft


Source of book image:

(p. C6) The first half of "Idea Man" sets forth Mr. Allen's version of the Microsoft creation myth, depicting Mr. Gates as a petulant, ambitious and money-minded mogul-to-be and Mr. Allen as an underappreciated visionary. Pictures of them from the 1970s and early '80s also tell this story, making Mr. Allen look like a hirsute, powerful older brother and Mr. Gates like a kid.

. . .

"Idea Man" is long overdue. It turns out to be as remote, yet as surpassingly strange, as its author, whose receipt of a diagnosis of Stage 4 non-Hodgkin's lymphoma in 2009 has made it that much more important for him to tell his story. Though it is written in the smoothly proficient style of many a collaborator-assisted memoir, it is a book filled with wild extremes: breakthrough, breakup, power, indulgence, blue-sky innovation. And it winds up offering Mr. Allen's guarded, partial answer to a universal question: what if you could make your wildest dreams come true?

For the full review, see:

JANET MASLIN. "BOOKS OF THE TIMES; The Reclusive Other Half of Microsoft's Odd Couple Breaks His Silence." The New York Times (Tues., April 19, 2011): C6.

(Note: ellipsis added.)

(Note: the online version of the review has the date April 18, 2011.)

The book under review is:

Allen, Paul. Idea Man: A Memoir by the Cofounder of Microsoft. New York: Portfolio, 2011.

May 30, 2013

MOOCs "Will Really Scale" Once Credible Credentialing Process Is Mastered

A "MOOC" is a "massive open online course."

(p. 1) Last May I wrote about Coursera -- co-founded by the Stanford computer scientists Daphne Koller and Andrew Ng -- just after it opened. Two weeks ago, I went back out to Palo Alto to check in on them. When I visited last May, about 300,000 people were taking 38 courses taught by Stanford professors and a few other elite universities. Today, they have 2.4 million students, taking 214 courses from 33 universities, including eight international ones.

Anant Agarwal, the former director of M.I.T.'s artificial intelligence lab, is now president of edX, a nonprofit MOOC that M.I.T. and Harvard are jointly building. Agarwal told me that since May, some 155,000 students from around the world have taken edX's first course: an M.I.T. intro class on circuits. "That is greater than the total number of M.I.T. alumni in its 150-year history," he said.

. . .

(p. 11) As we look to the future of higher education, said the M.I.T. president, L. Rafael Reif, something that we now call a "degree" will be a concept "connected with bricks and mortar" -- and traditional on-campus experiences that will increasingly leverage technology and the Internet to enhance classroom and laboratory work. Alongside that, though, said Reif, many universities will offer online courses to students anywhere in the world, in which they will earn "credentials" -- certificates that testify that they have done the work and passed all the exams. The process of developing credible credentials that verify that the student has adequately mastered the subject -- and did not cheat -- and can be counted on by employers is still being perfected by all the MOOCs. But once it is, this phenomenon will really scale.

I can see a day soon where you'll create your own college degree by taking the best online courses from the best professors from around the world -- some computing from Stanford, some entrepreneurship from Wharton, some ethics from Brandeis, some literature from Edinburgh -- paying only the nominal fee for the certificates of completion. It will change teaching, learning and the pathway to employment. "There is a new world unfolding," said Reif, "and everyone will have to adapt."

For the full commentary, see:

THOMAS L. FRIEDMAN. "Revolution Hits the Universities." The New York Times, SundayReview Section (Sun., January 27, 2013): 1 & 11.

(Note: ellipsis added.)

(Note: the online version of the commentary has the date January 26, 2013.)

May 28, 2013

Modern Cities Are "Successful Former Slums" that Allowed "Vibrant Economic Activity"

(p. 82) Babylon, London, and New York all had teeming ghettos of unwanted settlers erecting shoddy shelters with inadequate hygiene and engaging in dodgy dealings. Historian Bronislaw Geremek states that "slums constituted a large part of the urban landscape" of Paris in the Middle Ages. Even by the 1780s, when Paris was at its peak, nearly 20 percent of its residents did not have a "fixed abode"--that is, they lived in shacks. In a familiar complaint about medieval French cities, a gentleman from that time noted: "Several families inhabit one house. A (p. 83) weaver's family may be crowded into a single room, where they huddle around a fireplace." That refrain is repeated throughout history. A century ago Manhattan was home to 20,000 squatters in self-made housing. Slab City alone, in Brooklyn (named after the use of planks stolen from lumber mills), contained 10,000 residents in its slum at its peak in the 1880s. In the New York slums, reported the New York Times in 1858, "nine out of ten of the shanties have only one room, which does not average over twelve feet square, and this serves all the purposes of the family."

San Francisco was built by squatters. As Rob Neuwirth recounts in his eye-opening book Shadow Cities, one survey in 1855 estimated that "95 percent of the property holders in [San Francisco] would not be able to produce a bona fide legal title to their land." Squatters were everywhere, in the marshes, sand dunes, military bases. One eyewitness said, "Where there was a vacant piece of ground one day, the next saw it covered with half a dozen tents or shanties." Philadelphia was largely settled by what local papers called "squatlers." As late as 1940, one in five citizens in Shanghai was a squatter. Those one million squatters stayed and kept upgrading their slum so that within one generation their shantytown became one of the first twenty-first-century cities.

That's how it works. This is how all technology works. A gadget begins as a junky prototype and then progresses to something that barely works. The ad hoc shelters in slums are upgraded over time, infrastructure is extended, and eventually makeshift services become official. What was once the home of poor hustlers becomes, over the span of generations, the home of rich hustlers. Propagating slums is what cities do, and living in slums is how cities grow. The majority of neighborhoods in almost every modern city are merely successful former slums. The squatter cities of today will become the blue-blood neighborhoods of tomorrow. This is already happening in Rio and Mumbai today.

Slums of the past and slums of today follow the same description. The first impression is and was one of filth and overcrowding. In a ghetto a thousand years ago and in a slum today shelters are haphazard and dilapidated. The smells are overwhelming. But there is vibrant economic activity.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: italics, and bracketed "San Francisco" in original.)

May 24, 2013

Technology Brings Choices and Control, Which Brings Happiness

(p. 78) For the past 30 years the conventional wisdom has been that once a person achieves a minimal standard of living, more money does not bring more happiness. If you live below a certain income threshold, increased money makes a difference, but after that, it doesn't buy happiness. That was the conclusion of a now-classic study by Richard Easterlin in 1974. However, recent research from the Wharton School at the University of Pennsylvania shows that worldwide, affluence brings increased satisfaction. Higher income earners are happier. Citizens in higher-earning countries tend to be more satisfied on average.

My interpretation of this newest research--which also matches our intuitive impressions--is that what money brings is increased choices, rather than merely increased stuff (although more stuff comes with the territory). We don't find happiness in more gadgets and experiences. We do find happiness in having some control of our time and work, a chance for real leisure, in the escape from the uncertainties of war, poverty, and corruption, and in a chance to pursue individual freedoms--all of which come with increased affluence.

I've been to many places in the world, the poorest and the richest spots, the oldest and the newest cities, the fastest and the slowest cultures, and it is my observation that when given a chance, people who walk will buy a bicycle, people who ride a bike will get a scooter, people riding a scooter will upgrade to a car, and those with a car dream of a plane. Farmers everywhere trade their ox plows for tractors, their gourd bowls for tin ones, their sandals for shoes. Always. Insignificantly few ever go back. The exceptions such as the well-known Amish are not so exceptional when examined closely, for even their communities adopt selected technology without retreat.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: italics in original.)

May 19, 2013

Cooking Allowed the Toothless to Live


Source of book image:

(p. C12) . . . the narrative, ragtag though it may be, is a good one and it starts with the single greatest achievement in cookware--the cooking pot. Originally made of clay, this simple invention allowed previously inedible foods to be cooked in water, a process that removed toxins, made them digestible and reduced the need for serious chewing, a deadly problem for the toothless. (Archaeologists find adult skeletons without teeth only at sites dating from after the invention of the cooking pot.)

. . .

When "Consider the Fork" turns to cultural history, Ms. Wilson's points sometimes contradict one another. On one hand, she slyly condemns the rich throughout history and their use of cheap cooking labor. Yet she also relates how the Lebanese writer Anissa Helou remembers kibbé being made in Beirut by her mother and grandmother: They pounded the lamb in a mortar and pestle for an hour, a process described in loving terms. So is cooking labor a bedrock of family values or class exploitation?

For the full review, see:

CHRISTOPHER KIMBALL. "The World on a Plate." The Wall Street Journal (Sat., October 6, 2012): C12.

(Note: ellipses added.)

(Note: the online version of the review has the date October 5, 2012.)

The book under review, is:

Wilson, Bee. Consider the Fork: A History of How We Cook and Eat. New York: Basic Books, 2012.

May 16, 2013

New Technology Gets Better, Cheaper and More Diverse

(p. 75) Devices not only get better, they also get cheaper while they get better. We turn around to peer through our window into the past and realize there wasn't window glass back then. The past also lacked machine-woven cloth, refrigerators, steel, photographs, and the entire warehouse of goods spilling into the aisles of our local superstore. We can trace this cornucopia back along a diminishing curve to the Neolithic era. Craft from ancient times can surprise us in its sophistication, but in sheer quantity, variety, and complexity, it pales against modern inventions. The proof of this is clear: We buy the new over the old. Given the choice between an old-fashioned tool and a new one, most people--in the past as well as now--would grab the newer one. A very few will collect old tools, but as big as eBay is, and flea markets anywhere in the world, they are dwarfed by the market of the new. But if the new is not really better, and we keep reaching for it, then we are consistently duped or consistently dumb. The more likely reason we seek the new is that new things do get better. And of course there are more new things to choose from.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

May 15, 2013

Were Phone Phreaks Creative Incipient Entrepreneurs or Destructive "Sophomoric Savants"?


Source of book image:

(p. C6) Mr. Lapsley also describes John Draper, aka Captain Crunch, who was probably the most celebrated of the phreakers; his nickname derived from the fact that whistles that used to come in Cap'n Crunch cereal boxes happened to generate the key 2600-Hz tone used in long-distance switching. . . .

The phone-phreak netherworld was introduced to a mass audience by the October 1971 issue of Esquire magazine, which included what has to be (at least indirectly) one of the most influential articles ever written: Ron Rosenbaum's "Secrets of the Little Blue Box." Not only did it turn phreakers into folk heroes, but it inspired two young men, Steve Wozniak (who provided the foreword for this book) and Steve Jobs, to construct and sell blue boxes. Going door to door in Berkeley dorms, they managed to sell several dozen at $170 each. The "two Steves" savored this mix of clever engineering and entrepreneurial hustle: As Mr. Lapsley quotes Jobs saying: "If we hadn't made blue boxes, there would have been no Apple." (Mr. Rosenbaum's article also put the "phreak" into "phone phreak.")

. . .: By the 1980s, computerized phone systems and fiber-optic cables rendered many of the old phreaking modes obsolete. In addition, I can't help suspecting that the breakup of AT&T in 1984--the result of an antitrust lawsuit filed by the federal government--deeply discouraged the hard-core phreaks. Surreptitiously penetrating one of the shriveled new regional phone companies must have seemed a paltry caper compared with taking on mighty, majestic AT&T.

. . .

I must, however, take issue with one of Mr. Lapsley's conclusions. In reflecting on the phreaks' legacy, he writes: "The phone phreaks taught us that there is a societal benefit to tolerating, perhaps even nurturing (in the words of Apple) the crazy ones--the misfits, the rebels, the troublemakers." Is that truly what they taught us? . . .

Wilt Chamberlain supposedly once said that "nobody roots for Goliath." Perhaps. But the lesson to be learned from those waging guerrilla war against giants like the phone company and the Internet is that sophomoric savants who tamper with society's indispensable systems ultimately harm all too many innocent people.

For the full review, see:

HOWARD SCHNEIDER. "BOOKSHELF; Playing Tricks on Ma Bell." The Wall Street Journal (Sat., February 2, 2013): C6.

(Note: ellipses added.)

(Note: the online version of the review has the date February 1, 2013.)

The book under review, is:

Lapsley, Phil. Exploding the Phone: The Untold Story of the Teenagers and Outlaws Who Hacked Ma Bell. New York: Grove Press, 2013.

May 12, 2013

Knowledge Economy Migrating to Intangible Goods and Services

(p. 67) Our present economic migration from a material-based industry to a knowledge economy of intangible goods (such as software, design, and media products) is just the latest in a steady move toward the immaterial. (Not that material processing has let up, just that intangible processing is now more economically valuable.) Richard Fisher, president of the Federal Reserve Bank of Dallas, says, "Data from nearly all parts of the world show us that consumers tend to spend relatively less on goods and more on services as their incomes rise. . . . Once people have met their basic needs, they tend to want medical care, transportation and communication, information, recreation, entertainment, financial and legal advice, and the like." The disembodiment of value (more value, less mass) is a steady trend in the technium. In six years the average weight per dollar of U.S. exports (the most valuable things the U.S. produces) (p. 68) dropped by half. Today, 40 percent of U.S. exports are services (intangibles) rather than manufactured goods (atoms). We are steadily substituting intangible design, flexibility, innovation, and smartness for rigid, heavy atoms. In a very real sense our entry into a service- and idea-based economy is a continuation of a trend that began at the big bang.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: ellipsis in original; a graph is omitted that appears in the middle of the paragraph quoted above.)

May 8, 2013

You Can Buy a New Flint Knife or Stone Ax

(p. 55) Let's take the oldest technology of all: a flint knife or stone ax. Well, it turns out you can buy a brand-new flint knife, flaked by hand and carefully attached to an antler-horn handle by tightly wound leather straps. In every respect it is precisely the same technology as a flint knife made 30,000 years ago. It's yours for fifty dollars, available from more than one website. In the highlands of New Guinea, tribesmen were making stone axes for their own use until the 1960s. They still make stone axes the same way for tourists now. And stone-ax aficionados study them. There is an unbroken chain of knowledge that has kept this Stone Age technology alive. Today, in the United States alone, there are 5,000 amateurs who knap fresh arrowhead points by hand. They meet on weekends, exchange tips in flint-knapping clubs, and sell their points to souvenir brokers. John Whittaker, a professional archaeologist and flint knapper himself, has studied these amateurs and estimates that they produce over one million brand-new spear and arrow points per year. These new points are indistinguishable, even to experts like Whittaker, from authentic ancient ones.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

May 6, 2013

Currently Possible Inventions Are Not Inevitable; Someone Must Think of Them and Make Them Happen

(p. C4) . . . some inventions seem to have occurred to nobody until very late. The wheeled suitcase is arguably such a, well, case. Bernard Sadow applied for a patent on wheeled baggage in 1970, after a Eureka moment when he was lugging his heavy bags through an airport while a local worker effortlessly pushed a large cart past. You might conclude that Mr. Sadow was decades late. There was little to stop his father or grandfather from putting wheels on bags.

Mr. Sadow's bags ran on four wheels, dragged on a lead like a dog. Seventeen years later a Northwest Airlines pilot, Robert Plath, invented the idea of two wheels on a suitcase held vertically, plus a telescopic handle to pull it with. This "Rollaboard," now ubiquitous, also feels as if it could have been invented much earlier.

Or take the can opener, invented in the 1850s, eight decades after the can. Early 19th-century soldiers and explorers had to make do with stabbing bayonets into food cans. "Why doesn't somebody come up with a wheeled cutter?" they must have muttered (or not) as they wrenched open the cans.

For the full commentary, see:

MATT RIDLEY. "MIND & MATTER; Don't Look for Inventions Before Their Time." The Wall Street Journal (Sat., September 15, 2012): C4.

(Note: ellipsis added.)

(Note: the online version of the commentary has the date September 14, 2012.)

May 4, 2013

Steam-Powered Cars Show that Old Technologies Rarely Totally Disappear

(p. 53) In my own travels around the world I was struck by how resilient ancient technologies were, how they were often first choices where power and modern resources were scarce. It seemed to me as if no technologies ever disappeared. I was challenged on this conclusion by a highly regarded historian of technology who told me without thinking, "Look, they don't make steam-powered automobiles anymore." Well, within a few clicks on Google I very quickly located folks who are making brand-new parts for Stanley steam-powered cars. Nice shiny copper valves, pistons, whatever you need. With enough money you could put together an entirely new steam-powered car. And of course, thousands of hobbyists are still bolting together steam-powered vehicles, and hundreds more are keeping old ones running. Steam power is very much an intact, though uncommon, species of technology.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: italics in original.)

April 30, 2013

Increased CO2 "Kept a New Ice Age at Bay"

(p. 38) . . . the repeated inventions and spread of agriculture around the planet affected not only the surface of the Earth, but its 100-kilometer-wide (60-mile-wide) atmosphere as well. Farming disturbed the soil and increased CO2. Some climatologists believe that this early anthropogenic warming, starting 8,000 years ago, kept a new ice age at bay. Widespread adoption of farming disrupted a natural climate cycle that ordinarily would have refrozen the northernmost portions of the planet by now.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: ellipsis added.)

April 26, 2013

Longer Life Spans "Allowed More Time to Invent New Tools"

(p. 33) The primary long-term consequence of . . . slightly better nutrition was a steady increase in longevity. Anthropologist Rachel Caspari studied the dental fossils of 768 hominin individuals in Europe, Asia, and Africa, dated from 5 million years ago until the great leap. She determined that a "dramatic increase in longevity in the modern humans" began about 50,000 years ago. Increasing longevity allowed grandparenting, creating what is called the grandmother effect: In a virtuous circle, via the communication of grandparents, ever more powerful innovations carried forward were able to lengthen life spans further, which allowed more time to invent new tools, which increased population. Not only that: Increased longevity "provide[d] a selective advantage promoting further population increase," because a higher density of humans increased the rate and influence of innovations, which contributed to increased populations. Caspari claims that the most fundamental biological factor that underlies the behavioral innovations of modernity may be the increase in adult survivorship. It is no coincidence that increased longevity is the most measurable consequence of the acquisition of technology. It is also the most consequential.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

(Note: ellipsis added; bracketed "d" in Kelly's original.)

April 6, 2013

In Later Middle Ages Machines Replaced Slaves and Coolies

(p. 7) By the European Middle Ages, craftiness manifested itself most significantly in a new use of energy. An efficient horse collar had disseminated throughout society, drastically increasing farm acreage, while water mills and windmills were improved, increasing the flow of lumber and flour and improving drainage. And all this plentitude came without slavery. As Lynn White, historian of technology, wrote, "The chief glory (p. 8) of the later Middle Ages was not its cathedrals or its epics or its scholasticism: it was the building for the first time in history of a complex civilization which rested not on the backs of sweating slaves or coolies but primarily on non-human power." Machines were becoming our coolies.


Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

April 1, 2013

Kevin Kelly Explains and Criticizes Amish Attitude toward Technology



Kevin Kelly's book has received a lot of attention, sometimes in conjunction with Steven Johnson's Where Good Ideas Come From, with which it shares some themes. I found the Kelly book valuable, but frustrating.

The valuable part includes the discussion of the benefits of technology, and the chapter detailing Amish attitudes and practices related to technology. On the latter, for instance, I learned that the Amish do not categorically reject new technology, but believe that it should be adopted more slowly, after long community deliberation.

What frustrated me most about the book is that it argues that technology has a life of its own and that technological progress is predetermined and inevitable. (I believe that technological progress depends on enlightened government policies and active entrepreneurial initiative, neither of which is inevitable.)

In the next several weeks, I will be quoting some of the more important or thought-provoking passages in the book.

The reference for Kelly's book, is:

Kelly, Kevin. What Technology Wants. New York: Viking Adult, 2010.

The Johnson book mentioned above, is:

Johnson, Steven. Where Good Ideas Come From: The Natural History of Innovation. New York: Riverhead Books, 2010.

March 28, 2013

Driving to MobileIron Job Interview in $100,000 Car Tells CEO Tinker You Are Not Hungry Enough

"Above, Robert Tinker, the chief executive of MobileIron, at its offices in Mountain View, Calif." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B2) "There are disruptions everywhere," said Robert Tinker, the chief executive of MobileIron, which makes software for companies to manage smartphones and tablets. "Mobile disrupts personal computers, a market worth billions. Cloud disrupts computer servers and data storage, billions of dollars more. Social may be one of those rare things that is totally new."

Relative to the size of the markets that mobile devices, cloud computing and social media are toppling, he says, the valuations are reasonable.

But most of these chief executives are also veterans of the Internet bubble of the late '90s, and confess to worries that maybe things are not so different this time. Mr. Tinker, 43, drives a 1995 Ford Explorer that has logged 265,000 miles.

"If somebody comes to a job interview here in a $100,000 car, I know he's not hungry," he said. "The reality is, I've taken $94 million in investors' money, and we haven't gone public yet. I feel that responsibility every day."

For the full story, see:

QUENTIN HARDY. "A Billion-Dollar Club, and Not So Exclusive." The New York Times (Weds., February 5, 2013): B1 & B2.

(Note: the online version of the story has the date February 4, 2013.)

March 20, 2013

Many New Tech Entrepreneurs Shun "Fast Cars and Fancy Parties"


"Phil Libin, chief of Evernote, at its headquarters in Redwood City, Calif." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B1) SAN FRANCISCO -- The number of privately held Silicon Valley start-ups that are worth more than $1 billion shocks even the executives running those companies.

"I thought we were special," said Phil Libin, chief executive of Evernote, an online consumer service for storing clippings, photos and bits of information as he counted his $1 billion-plus peers.

He started Evernote in 2008 on the eve of the recession and built it methodically. "A lot of us didn't set out to have a big valuation, we're just trying to build something that lasts," Mr. Libin said. "There is no safe industry anymore, even here."

. . .

(p. B2) Silicon Valley entrepreneurs contend that the price spiral is not a sign of another tech bubble. The high prices are reasonable, they say, because innovations like smartphones and cloud computing will remake a technology industry that is already worth hundreds of billions of dollars.

. . .

The founders of the highly valued companies are old enough to remember past busts, and many shun the bubble lifestyle of fast cars and fancy parties.

Mr. Libin, who said he grew up on food stamps as the son of Russian immigrants in the Bronx, became a millionaire when he sold his first company, Engine5, to Vignette in 2000.

"The company I sold to, there were purple Lamborghinis in the garage. I got into watches," he said. "Maybe a half-dozen, nothing over $10,000, but I needed this glass and leather watch winder."

Evernote started as the financial crisis hit. "One night I was almost busted again," he said, "and there was that watch winder on the shelf, mocking me."

"Every job out there is insecure now," he said. "People sell 10 percent of their stock, and they have an incentive to make the other 90 percent worth more. They are still working, but not worrying about what will happen to their home or their kids."

For the full story, see:

QUENTIN HARDY. "A Billion-Dollar Club, and Not So Exclusive." The New York Times (Weds., February 5, 2013): B1 & B2.

(Note: the online version of the story has the date February 4, 2013.)

March 11, 2013

Open Systems Limit the Integrated Vision that Creates Great Products

The following passage is Steve Jobs speaking, as quoted by Walter Isaacson.

(p. 568) People pay us to integrate things for them, because they don't have the time to think about this stuff 24/7. If you have an extreme passion for producing great products, it pushes you to be integrated, to connect your hardware and your software and content management. You want to break new ground, so you have to do it yourself. If you want to allow your products to be open to other hardware or software, you have to give up some of your vision.


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

March 7, 2013

Steve Jobs: "Never Rely on Market Research"

The following passage is Steve Jobs speaking, as quoted by Walter Isaacson.

(p. 567) Some people say, "Give the customers what they want." But that's not my approach. Our job is to figure out what they're going to want before they do. I think Henry Ford once said, "If I'd asked customers what they wanted, they would have told me, 'A faster horse!'" People don't know what they want until you show it to them. That's why I never rely on market research. Our task is to read things that are not yet on the page.


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

March 1, 2013

Google's Eric Schmidt Saw that "Regulation Prohibits Real Innovation"

(p. A13) Eric Schmidt, executive chairman of Google, gave a remarkable interview this month to The Washington Post. So remarkable that Post editors preceded the transcript with this disclosure: "He had just come from the dentist. And he had a toothache."

Perhaps it was the Novocain talking, but Mr. Schmidt has done us a service. He said in public what most technologists will say only in private. Whatever caused him to speak forthrightly about the disconnects between Silicon Valley and Washington, his comments deserve wider attention.

Mr. Schmidt had just given his first congressional testimony. He was called before the Senate Judiciary Antitrust Subcommittee to answer allegations that Google is a monopolist, a charge the Federal Trade Commission is also investigating.

"So we get hauled in front of the Congress for developing a product that's free, that serves a billion people. OK? I mean, I don't know how to say it any clearer," Mr. Schmidt told the Post. "It's not like we raised prices. We could lower prices from free to . . . lower than free? You see what I'm saying?"

. . .

"Regulation prohibits real innovation, because the regulation essentially defines a path to follow," Mr. Schmidt said. This "by definition has a bias to the current outcome, because it's a path for the current outcome."

. . .

Washington is always slow to recognize technological change, which is why in their time IBM and Microsoft were also investigated after competing technologies had emerged.

Mr. Schmidt recounted a dinner in 1995 featuring a talk by Andy Grove, a founder of Intel: "He says, 'This is easy to understand. High tech runs three times faster than normal businesses. And the government runs three times slower than normal businesses. So we have a nine-times gap.' All of my experiences are consistent with Andy Grove's observation."

Mr. Schmidt explained there was only one way to deal with this nine-times gap, which this column hereby christens "Grove's Law of Government." That is "to make sure that the government does not get in the way and slow things down."

Mr. Schmidt recounted that when Silicon Valley first started playing a large role in the economy in the 1990s, "all of a sudden the politicians showed up. We thought the politicians showed up because they loved us. It's fair to say they loved us for our money."

He contrasted innovation in Silicon Valley with innovation in Washington. "Now there are startups in Washington," he said, "founded by people who were policy makers. . . . They're very clever people, and they've figured out a way in regulation to discriminate, to find a new satellite spectrum or a new frequency or whatever. They immediately hired a whole bunch of lobbyists. They raised some money to do that. And they're trying to innovate through regulation. So that's what passes for innovation in Washington."

For the full commentary, see:

L. GORDON CROVITZ. "INFORMATION AGE; Google Speaks Truth to Power; About the growing regulatory state, even Google's Eric Schmidt--a big supporter of the Obama administration--now feels the need to tell it like it is." The Wall Street Journal (Mon., October 24, 2011): A13.

(Note: ellipses between paragraphs added; ellipsis internal to Schmidt quote, in original WSJ commentary.)


February 20, 2013

Entrepreneur Kurzweil Says If He Gets Cancer, He Will Invent a Cure


"Ray Kurzweil." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 12) As a futurist, you are famous for making predictions of when technological innovations will actually occur. Are you willing to predict the year you will die?
My plan is to stick around. We'll get to a point about 15 years from now where we're adding more than a year every year to your life expectancy.

To clarify, you're predicting your immortality.
The problem is I can't get on the phone with you in the future and say, "Well, I've done it, I have lived forever," because it's never forever.

. . .

You've said that if you woke up one day with a terminal disease, you'd be forced to invent a cure. Were you being serious?
I absolutely would try. I'm working now on a cancer project with some scientists at M.I.T., and if I develop cancer, I do have some ideas of what I would do.

I imagine a lot of people would hear that and say, Ray, if you think you're capable of curing yourself, why don't you go ahead and start curing others?
Well, I mean, I do have to pick my priorities. Nobody can do everything. What we spend our time on is probably the most important decision we make. I don't know if you're aware, but I'm joining Google as director of engineering.

For the full interview, see:

Andrew Goldman, Interviewer. "TALK; The Life Robotic; The Futurist Ray Kurzweil Says We're Going to Live Forever. Really." The New York Times Magazine (Sun., January 27, 2013): 12.

(Note: ellipsis added; bold in original, indicating interviewer questions.)

(Note: the online version of the interview has the date January 25, 2013, and has the title "TALK; Ray Kurzweil Says We're Going to Live Forever.")

February 11, 2013

Apple's Corporate Culture Under Jobs: "Accountability Is Strictly Enforced"

(p. 531) In theory, you could go to your iPhone or any computer and access all aspects of your digital life. There was, however, a big problem: The service, to use Jobs's terminology, sucked. It was complex, devices didn't sync well, and email and other data got lost randomly in the ether. "Apple's MobileMe Is Far Too Flawed to Be Reliable," was the headline on Walt Mossberg's review in the Wall Street Journal.

Jobs was furious. He gathered the MobileMe team in the auditorium on the Apple campus, stood onstage, and asked, "Can anyone tell me what MobileMe is supposed to do?" After the team members offered their answers, Jobs shot back: "So why the fuck doesn't it do that?" Over the next half hour he continued to berate them. "You've tarnished Apple's reputation," he said. "You should hate each other for having let each other down. Mossberg, our friend, is no longer writing good things about us." In front of the whole audience, he got rid of the leader of the MobileMe team and replaced him with Eddy Cue, who oversaw all Internet content at Apple. As Fortune's Adam Lashinsky reported in a dissection of the Apple corporate culture, "Accountability is strictly enforced."


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

February 7, 2013

The Universality of Values: Every Kid Wants a Cell Phone

(p. 528) When they got to Istanbul, . . . [Jobs] hired a history professor to give his family a tour. At the end they went to a Turkish bath, where the professor's lecture gave Jobs an insight about the globalization of youth:

I had a real revelation. We were all in robes, and they made some Turkish coffee for us. The professor explained how the coffee was made very different from anywhere else, and I realized, "So fucking what?" Which kids even in Turkey give a shit about Turkish coffee? All day I had looked at young people in Istanbul. They were all drinking what every other kid in the world drinks, and they were wearing clothes that look like they were bought at the Gap, and they are all using cell phones. They were like kids everywhere else. It hit me that, for young people, this whole world is the same now. When we're making products, there is no such thing as a Turkish phone, or a music player that young people in Turkey would want that's different from one young people elsewhere would want. We're just one world now.


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

(Note: ellipsis and bracketed "Jobs" added; indented Jobs block quote was indented in the original.)

February 3, 2013

Steve Jobs Viewed Patents as Protecting Property Rights in Ideas

(p. 512) . . . Apple filed suit against HTC (and, by extension, Android), alleging infringement of twenty of its patents. Among them were patents covering various multi-touch gestures, swipe to open, double-tap to zoom, pinch and expand, and the sensors that determined how a device was being held. As he sat in his house in Palo Alto the week the lawsuit was filed, he became angrier than I had ever seen him:

Our lawsuit is saying, "Google, you fucking ripped off the iPhone, wholesale ripped us off." Grand theft. I will spend my last dying breath if I need to, and I will spend every penny of Apple's $40 billion in the bank, to right this wrong. I'm going to destroy Android, because it's a stolen product. I'm willing to go to thermonuclear war on this. They are scared to death, because they know they are guilty. Outside of Search, Google's products--Android, Google Docs--are shit.

A few days after this rant, Jobs got a call from Schmidt, who had resigned from the Apple board the previous summer. He suggested they get together for coffee, and they met at a café in a Palo Alto shopping center. "We spent half the time talking about personal matters, then half the time on his perception that Google had stolen Apple's user interface designs," recalled Schmidt. When it came to the latter subject, Jobs did most of the talking. Google had ripped him off, (p. 513) he said in colorful language. "We've got you red-handed," he told Schmidt. "I'm not interested in settling. I don't want your money. If you offer me $5 billion, I won't want it. I've got plenty of money. I want you to stop using our ideas in Android, that's all I want." They resolved nothing.


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

(Note: ellipsis added.)

January 22, 2013

Apple's iTunes for Windows Gave "a Glass of Ice Water to Somebody in Hell"

(p. 463) Mossberg wanted the evening joint appearance to be a cordial discussion, not a debate, but that seemed less likely when Jobs unleashed a swipe at Microsoft during a solo interview earlier that day. Asked about the fact that Apple's iTunes software for Windows computers was extremely popular, Jobs joked, "It's like giving a glass of ice water to somebody in hell."

So when it was time for Gates and Jobs to meet in the green room before their joint session that evening, Mossberg was worried. Gates got there first, with his aide Larry Cohen, who had briefed him about Jobs's remark earlier that day. When Jobs ambled in a few minutes later, he grabbed a bottle of water from the ice bucket and sat down. After a moment or two of silence, Gates said, "So I guess I'm the representative from hell." He wasn't smiling. Jobs paused, gave him one of his impish grins, and handed him the ice water. Gates relaxed, and the tension dissipated.


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

January 17, 2013

A Well-Researched Case Study on How Mulally Saved Ford



(p. C12) Tomes by management gurus telling you how to remake your company are a dime a dozen. Well-researched case studies are much rarer. In "American Icon," Bryce G. Hoffman takes a careful look at how Alan Mulally, recruited from Boeing in 2006, restructured Ford Motor Co. in the midst of the steepest economic downturn since the 1930s. An engineer with no automotive background, Mr. Mulally came into a company on the verge of collapse and brought it back with insistent demands for accountability, information-sharing and tough decisions. Mr. Hoffman, who wrote this book with the company's cooperation, provides a fascinating and detailed examination of how a dynamic leader brought about change. He makes clear that much of the credit goes to others, not least Don Leclair, then the chief financial officer, who, even before Mr. Mulally's arrival, was arranging to mortgage everything up to Ford's blue-oval trademark to amass the $23.6 billion in cash that enabled the company to survive the recession.

For the full review essay, see:

Marc Levinson. "Boardroom Reading of 2012." The Wall Street Journal (Sat., December 15, 2012): C12.

(Note: the online version of the review essay has the date December 14, 2012.)

The book under review, is:

Hoffman, Bryce G. American Icon: Alan Mulally and the Fight to Save Ford Motor Company. New York: Crown Business, 2012.

January 14, 2013

With iTunes, Apple Leapfrogged CD Burners (a Boat Apple Had Missed)

Is the example sketched below, and in a previous entry, a case of a first mover disadvantage? Or is it simply a case of a lucky or wise bounce-back from a genuine mistake?

(p. 382) . . . [Jobs's] angry insistence that the iMac get rid of its tray disk drive and use instead a more elegant slot drive meant that it could not include the first CD burners, which were initially made for the tray format. "We kind of missed the boat on that," he recalled. "So we needed to catch up real fast." The mark of an innovative company is not only that it comes up with new ideas first, but also that it knows how to leapfrog when it finds itself behind.


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

(Note: ellipsis and bracketed "Jobs's" added.)

January 10, 2013

Apple "Finding a Way to Leapfrog Over Its Competitors"

Isaacson says Jobs wanted two refinements in the iMac. One was new colors. The other is discussed below.

I am not sure what to make of this episode. Is Isaacson suggesting that it was good for Apple that Jobs made a mistake on the type of CD hardware to put in the iMac? That this added constraint "would then force Apple to be imaginative and bold"?

Or is the moral that good people who make a lot of quick decisions, make mistakes, sometimes big mistakes, and that Jobs found a way to bounce back from this one?

(p. 356) There was one other important refinement that Jobs wanted for the iMac: getting rid of that detested CD tray. "I'd seen a slot-load drive on a very high-end Sony stereo," he said, "so I went to the drive manufacturers and got them to do a slot-load drive for us for the version of the iMac we did nine months later." Rubinstein tried to argue him out of the change. He predicted that new drives would come along that could burn music onto CDs rather than merely play them, and they would be available in tray form before they were made to work in slots. "If you go to slots, you will always be behind on the technology," Rubinstein argued.

"I don't care, that's what I want," Jobs snapped back. They were having lunch at a sushi bar in San Francisco, and Jobs insisted that they continue the conversation over a walk. "I want you to do the slot-load drive for me as a personal favor," Jobs asked. Rubinstein agreed, of course, but he turned out to be right. Panasonic came out with a CD drive that could rip and burn music, and it was available first for computers that had old-fashioned tray loaders. The effects of this (p. 357) would ripple over the next few years: It would cause Apple to be slow in catering to users who wanted to rip and burn their own music, but that would then force
Apple to be imaginative and bold in finding a way to leapfrog over its competitors when Jobs finally realized that he had to get into the music market.


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

January 3, 2013

"People Said He Was a Fraud, But He Turned Out to Be Right"


"Willis Whitfield with a mobile clean room in the 1960s." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B16) Half a century ago, as a rapidly changing world sought increasingly smaller mechanical and electrical components and more sanitary hospital conditions, one of the biggest obstacles to progress was air, and the dust and germs it contains.

. . .

Then, in 1962, Willis Whitfield invented the clean room.

"People said he was a fraud," recalled Gilbert V. Herrera, the director of microsystems science and technology at Sandia National Laboratories in Albuquerque. "But he turned out to be right."

. . .

His clean rooms blew air in from the ceiling and sucked it out from the floor. Filters scrubbed the air before it entered the room. Gravity helped particles exit. It might not seem like a complicated concept, but no one had tried it before. The process could completely replace the air in the room 10 times a minute.

Particle detectors in Mr. Whitfield's clean rooms started showing numbers so low -- a thousand times lower than other methods -- that some people did not believe the readings, or Mr. Whitfield. He was questioned so much that he began understating the efficiency of his method to keep from shocking people.

"I think Whitfield's wrong," a scientist from Bell Labs finally said at a conference where Mr. Whitfield spoke. "It's actually 10 times better than he's saying."

For the full obituary, see:

WILLIAM YARDLEY. "W. Whitfield, 92, Dies; Built Clean Room." The New York Times (Weds., December 5, 2012): B16.

(Note: ellipses added.)

(Note: the online version of the obituary has the date December 4, 2012, and has the title "Willis Whitfield, Inventor of Clean Room That Purges Tiny Particles, Dies at 92.")

December 30, 2012

"The Arpanet Was Not an Internet"

"Xerox PARC headquarters." Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. A11) A telling moment in the presidential race came recently when Barack Obama said: "If you've got a business, you didn't build that. Somebody else made that happen." He justified elevating bureaucrats over entrepreneurs by referring to bridges and roads, adding: "The Internet didn't get invented on its own. Government research created the Internet so that all companies could make money off the Internet."

. . .

Robert Taylor, who ran the ARPA program in the 1960s, sent an email to fellow technologists in 2004 setting the record straight: "The creation of the Arpanet was not motivated by considerations of war. The Arpanet was not an Internet. An Internet is a connection between two or more computer networks."

If the government didn't invent the Internet, who did? Vinton Cerf developed the TCP/IP protocol, the Internet's backbone, and Tim Berners-Lee gets credit for hyperlinks.

But full credit goes to the company where Mr. Taylor worked after leaving ARPA: Xerox. It was at the Xerox PARC labs in Silicon Valley in the 1970s that the Ethernet was developed to link different computer networks. Researchers there also developed the first personal computer (the Xerox Alto) and the graphical user interface that still drives computer usage today.

According to a book about Xerox PARC, "Dealers of Lightning" (by Michael Hiltzik), its top researchers realized they couldn't wait for the government to connect different networks, so would have to do it themselves. "We have a more immediate problem than they do," Robert Metcalfe told his colleague John Shoch in 1973. "We have more networks than they do." Mr. Shoch later recalled that ARPA staffers "were working under government funding and university contracts. They had contract administrators . . . and all that slow, lugubrious behavior to contend with."

For the full commentary, see:

Gordon Crovitz. "INFORMATION AGE; Who Really Invented the Internet?" The Wall Street Journal (Mon., July 23, 2012): A11.

(Note: ellipsis between paragraphs was added; ellipsis internal to last paragraph was in original.)

(Note: the online version of the commentary has the date July 22, 2012.)

I read the Hiltzik book several years ago, and my memory of it is not sharp, but I remember thinking that it was a useful book:

Hiltzik, Michael A. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperBusiness, 1999.

December 18, 2012

Poor People Want Washing Machines

The wonderful clip above is from Hans Rosling's TED talk entitled "The Magic Washing Machine."

He clearly and strongly presents his central message that the washing machine has made life better.

What was the greatest invention of the industrial revolution? Hans Rosling makes the case for the washing machine. With newly designed graphics from Gapminder, Rosling shows us the magic that pops up when economic growth and electricity turn a boring wash day into an intellectual day of reading.


The version of the clip above is embedded from YouTube, where it was posted by TED. It can also be viewed at the TED web site.

(Note: I am grateful to Robin Kratina for telling me about Rosling's TED talk.)

(Note: I do not agree with Rosling's acceptance of the politically correct consensus view that the response to global warming should mainly be mitigation and green energy--to the extent that a response turns out to be necessary, I mainly support adaptation, as suggested in many previous entries on this blog.)

December 13, 2012

"Did Alexander Graham Bell Do Any Market Research Before He Invented the Telephone?"

(p. 170) After the Macintosh team returned to Bandley 3 that afternoon, a truck pulled into the parking lot and Jobs had them all gather next to it. Inside were a hundred new Macintosh computers, each personalized with a plaque. "Steve presented them one at a time to each team member, with a handshake and a smile, as the rest of us stood around cheering," Hertzfeld recalled. It had been a grueling ride, and many egos had been bruised by Jobs's obnoxious and rough management style. But neither Raskin nor Wozniak nor Sculley nor anyone else at the company could have pulled off the creation of the Macintosh. Nor would it likely have emerged from focus groups and committees. On the day he unveiled the Macintosh, a reporter from Popular Science asked Jobs what type of market research he had done. Jobs responded by scoffing, "Did Alexander Graham Bell do any market research before he invented the telephone?"


Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

(Note: italics in original.)

December 10, 2012

With Scorned Ideas, and Without College, Inventor and Entrepreneur "Ovshinsky Prevailed"


"Stanford R. Ovshinsky and Iris M. Ovshinsky founded Energy Conversion Laboratories in 1960." Source of caption and photo: online version of the NYT obituary quoted and cited below.

(p. A23) Stanford R. Ovshinsky, an iconoclastic, largely self-taught and commercially successful scientist who invented the nickel-metal hydride battery and contributed to the development of a host of devices, including solar energy panels, flat-panel displays and rewritable compact discs, died on Wednesday [October 17, 2012] at his home in Bloomfield Hills, Mich. He was 89.

. . .

His ideas drew only scorn and skepticism at first. He was an unknown inventor with unconventional ideas, a man without a college education who made his living designing automation equipment for the automobile industry in Detroit, far from the hotbeds of electronics research like Silicon Valley and Boston.

But Mr. Ovshinsky prevailed. Industry eventually credited him for the principle that small quantities or thin films of amorphous materials exposed to a charge can instantly reorganize their structures into semicrystalline forms capable of carrying significant current.

. . .

In 1960, he and his second wife, the former Iris L. Miroy, founded Energy Conversion Laboratories in Rochester Hills, Mich., to develop practical products from the discovery. It was renamed Energy Conversion Devices four years later.

Energy Conversion Devices and its subsidiaries, spinoff companies and licensees began translating Mr. Ovshinsky's insights into mechanical, electronic and energy devices, among them solar-powered calculators. His nickel-metal battery is used to power hybrid cars and portable electronics, among other things.

He holds patents relating to rewritable optical discs, flat-panel displays and electronic-memory technology. His thin-film solar cells are produced in sheets "by the mile," as he once put it.

. . .

"His incredible curiosity and unbelievable ability to learn sets him apart," Hellmut T. Fritzsche, a longtime friend and consultant, said in an interview in 2005.

For the full obituary, see:

BARNABY J. FEDER. "Stanford R. Ovshinsky Dies at 89, a Self-Taught Maverick in Electronics." The New York Times (Fri., October 19, 2012): A23.

(Note: ellipses and bracketed date added.)

(Note: the online version of the article was dated October 18, 2012.)

(Note: in the first sentence of the print version, "hybrid" was used instead of the correct "hydride.")

December 4, 2012

Isaacson's "Steve Jobs" Tells Us Much About the Innovative Project Entrepreneur



Steve Jobs is one of my favorite examples of what I call the "project entrepreneur." Walter Isaacson has written a fascinating biography of Jobs, full of memorable examples for any student of the innovative entrepreneur.

During the next few weeks, I will occasionally add entries that quote some of the more important or thought-provoking passages.

The book under review is:

Isaacson, Walter. Steve Jobs. New York: Simon & Schuster, 2011.

November 29, 2012

Personal DNA Data, Smart Phones, and the Social Network Can Democratize Medicine

(p. 236) With the personal montage of your DNA, your cell phone, your social network--aggregated with your lifelong health information and physiological and anatomic data--you are positioned to reboot the future of medicine. Who could possibly be more interested and more vested in your data? For the first time, the medical world is getting democratized. Think of the priests before the Gutenberg printing press. Now, nearly six hundred years later, think of physicians and the creative destruction of medicine.


Topol, Eric. The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care. New York: Basic Books, 2012.

November 26, 2012

American Innovators Created Synergies and Interchangeable Parts


Source of book image: online version of the WSJ review quoted and cited below.

(p. A13) . . . the post-Civil War industrialization had an important and largely overlooked predecessor in the first decades of the 19th century, when, as Charles Morris writes in "The Dawn of Innovation," "the American penchant for mechanized, large-scale production spread throughout industry, presaging the world's first mass-consumption economy." It is a story well worth telling, and Mr. Morris tells it well.

. . .

Whole industries sprang up as the country's population boomed and spilled over into the Middle West. The rich agricultural lands there produced huge surpluses of grain and meat, especially pork. The city of Cincinnati--whose population grew to 160,000 in 1860, from 2,500 in 1810--became known as "Porkopolis" because of the number of hogs its slaughterhouses processed annually.

Mr. Morris does a particularly good job of explaining the crucial importance of synergy in economic development, how one development leads to another and to increased growth. The lard (or pig fat) from the slaughterhouses, he notes, served as the basis for the country's first chemical industry. Lard had always been used for more than pie crust and frying. It was a principal ingredient in soap, which farm wives made themselves, a disagreeable and even dangerous task thanks to the lye used in the process.

But when lard processing was industrialized to make soap, it led to an array of byproducts such as glycerin, used in tanning and in pharmaceuticals. Stearine, another byproduct, made superior candles. Just in the decade from the mid-1840s to the mid-1850s, Cincinnati soap exports increased 20-fold, as did the export of other lard-based products. Procter & Gamble, founded in Cincinnati in 1837 by an Irish soap maker and an English candle maker who had married sisters, grew into a giant company as the fast-rising middle class sought gentility.

Mr. Morris goes into great detail on the development of interchangeable parts--the system of making the components of a manufactured product so nearly identical that they can be easily substituted and replaced.

For the full review, see:

John Steele Gordon. "BOOKSHELF; The Days Of Porkopolis." The Wall Street Journal (Tues., November 20, 2012): A13.

(Note: ellipses added.)

(Note: the online version of the article was updated November 19, 2012.)

The book under review, is:

Morris, Charles R. The Dawn of Innovation: The First American Industrial Revolution. Philadelphia, PA: PublicAffairs, 2012.

November 16, 2012

Race Car Flywheels Save Fuel


"FLYWHEEL; Using a flywheel system from Williams Hybrid Power. Audi's R18 e-tron quattro car became the first hybrid vehicle to win the iconic 24 Hours of Le Mans race." Source of caption: print version of the WSJ article quoted and cited below. Source of photo:

NASA costs are often justified by giving cases where innovations from space spill over into the consumer economy. But in the normal market economy, there are alternative sources of innovation that would spill over into the consumer economy, without taking taxpayer dollars. I wonder if anyone has ever studied how often race car innovations make their way into mainstream production cars.

(p. B4) European auto makers might turn to an unlikely source to reduce fuel consumption and pollution: gas-guzzling Formula One race cars.

Rotating mechanical devices called flywheels developed for these speed machines could make everyday cars more powerful and efficient.

. . .

Flywheels have been around since the Industrial Revolution, when they were widely used in steam engines. Mounted on a crankshaft, these spinning discs provide a steady flow of energy when the energy source isn't constant, as is the case with piston-driven engines in cars.

Flywheels can range from about a meter in diameter to less than three centimeters, depending on the amount of energy required. The larger and heavier the flywheel is, the more inertial energy it delivers when spinning.

In miniature form, they show up in friction toy cars that are driven by a flywheel and speed up when the toy is rolled quickly across a surface. When the car is let go, it is the flywheel that speeds the car across the floor.

Until recently, flywheels--known in the auto industry as kinetic energy recovery systems, or KERS--have been too heavy or too bulky to use on road vehicles. But that is changing, thanks to new, lighter materials, high-tech engineering and power-management systems.

For the full story, see:

DAVID PEARSON. "NEXT IN TECH; An Unlikely Fuel Saver: Racing Cars; To Reduce Pollution, Auto Makers Experiment With Flywheels Developed for Formula One Motors." The Wall Street Journal (Tues., August 21, 2012): B4.

(Note: ellipsis added.)

(Note: the online version of the article was updated August 22, 2012.)
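A rough back-of-the-envelope illustration of the scaling described in the flywheel passage quoted above (my own sketch, not from the article): the kinetic energy stored in a spinning solid disc is one-half the moment of inertia times the square of the angular speed (E = 0.5 * I * omega^2), and for a solid disc I = 0.5 * m * r^2, so stored energy grows with mass, with the square of the radius, and with the square of the rotational speed. The function name and figures below are hypothetical, chosen only to show the scaling.

    import math

    def flywheel_energy_joules(mass_kg, radius_m, rpm):
        """Kinetic energy of a solid-disc flywheel: E = 0.5 * I * omega^2, with I = 0.5 * m * r^2."""
        moment_of_inertia = 0.5 * mass_kg * radius_m ** 2   # kg * m^2
        omega = rpm * 2.0 * math.pi / 60.0                  # convert revolutions per minute to rad/s
        return 0.5 * moment_of_inertia * omega ** 2         # joules

    # Hypothetical figures, not taken from the article:
    small = flywheel_energy_joules(mass_kg=5, radius_m=0.10, rpm=40000)
    large = flywheel_energy_joules(mass_kg=20, radius_m=0.15, rpm=40000)
    print(f"small disc: {small / 1000:.0f} kJ; larger disc: {large / 1000:.0f} kJ")

Because energy rises with the square of the radius, doubling a flywheel's radius alone quadruples the energy it can store, which is why the article stresses that size and weight, not just speed, determine how much energy a flywheel delivers.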

November 12, 2012

Edison Foresaw Phonograph Music Potential

"EUROPEAN JOURNEY; Thomas Edison, seated center, sent Adelbert Theodor Edward Wangemann, standing behind him, to France in 1889. From there Wangemann traveled to Germany to record recitations and performances." Source of caption and photo: online version of the NYT article quoted and cited below.

Edison is often ridiculed for failing to foresee that playing music would be a major use for his phonograph invention. (Nye 1991, p. 142 approvingly references Hughes 1986, p. 201 on this point.) But if Edison failed to foresee, then why did he assign Wangemann to make the phonograph "a marketable device for listening to music"?

(p. D3) Tucked away for decades in a cabinet in Thomas Edison's laboratory, just behind the cot in which the great inventor napped, a trove of wax cylinder phonograph records has been brought back to life after more than a century of silence.

The cylinders, from 1889 and 1890, include the only known recording of the voice of the powerful chancellor Otto von Bismarck. . . . Other records found in the collection hold musical treasures -- lieder and rhapsodies performed by German and Hungarian singers and pianists at the apex of the Romantic era, including what is thought to be the first recording of a work by Chopin.

. . .

The lid of the box held an important clue. It had been scratched with the words "Wangemann. Edison."

The first name refers to Adelbert Theodor Edward Wangemann, who joined the laboratory in 1888, assigned to transform Edison's newly perfected wax cylinder phonograph into a marketable device for listening to music. Wangemann became expert in such strategies as positioning musicians around the recording horn in a way to maximize sound quality.

In June 1889, Edison sent Wangemann to Europe, initially to ensure that the phonograph at the Paris World's Fair remained in working order. After Paris, Wangemann toured his native Germany, recording musical artists and often visiting the homes of prominent members of society who were fascinated with the talking machine.

Until now, the only available recording from Wangemann's European trip has been a well-known and well-worn cylinder of Brahms playing an excerpt from his first Hungarian Dance. That recording is so damaged "that many listeners can scarcely discern the sound of a piano, which has in turn tarnished the reputations of both Wangemann and the Edison phonograph of the late 1880s," Dr. Feaster said. "These newly unearthed examples vindicate both."

For the full story, see:

RON COWEN. "Restored Edison Records Revive Giants of 19th-Century Germany." The New York Times (Tues., January 31, 2012): D3.

(Note: ellipses added.)

(Note: the online version of the article is dated January 30, 2012.)

"Adelbert Theodor Edward Wangemann used a phonograph to record the voice of Otto von Bismarck." Source of caption and photo: online version of the NYT article quoted and cited above.

November 7, 2012

Health Inefficiencies Free-Ride on "Home Run Innovations"

The article quoted below is a useful antidote to those economists who sometimes seem to argue that health gains fully justify the rise in health costs.

(p. 645) In the United States, health care technology has contributed to rising survival rates, yet health care spending relative to GDP has also grown more rapidly than in any other country. We develop a model of patient demand and supplier behavior to explain these parallel trends in technology growth and cost growth. We show that health care productivity depends on the heterogeneity of treatment effects across patients, the shape of the health production function, and the cost structure of procedures such as MRIs with high fixed costs and low marginal costs. The model implies a typology of medical technology productivity: (I) highly cost-effective "home run" innovations with little chance of overuse, such as anti-retroviral therapy for HIV, (II) treatments highly effective for some but not for all (e.g., stents), and (III) "gray area" treatments with uncertain clinical value such as ICU days among chronically ill patients. Not surprisingly, countries adopting Category I and effective Category II treatments gain the greatest health improvements, while countries adopting ineffective Category II and Category III treatments experience the most rapid cost growth. Ultimately, economic and political resistance in the United States to ever-rising tax rates will likely slow cost growth, with uncertain effects on technology growth.

Source of abstract:

Chandra, Amitabh, and Jonathan Skinner. "Technology Growth and Expenditure Growth in Health Care." Journal of Economic Literature 50, no. 3 (Sept. 2012): 645-80.

October 31, 2012

Thiel Fellows Avoid Formal Education to Pursue Entrepreneurial Projects


"Eden Full, 20, tested her rotating solar panel in Kenya in 2010." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 1) EDEN FULL should be back at Princeton by now. She should be hustling to class, hitting the books, acing tests. In short, she should be climbing that old-school ladder toward a coveted spot among America's future elite.

She isn't doing any of that. Instead, Ms. Full, as bright and poised and ambitious as the next Ivy Leaguer, has done something extraordinary for a Princetonian: she has dropped out.

It wasn't the exorbitant cost of college. (Princeton, all told, runs nearly $55,000 a year.) She says she simply received a better offer -- and, perhaps, a shot at a better education.

Ms. Full, 20, is part of one of the most unusual experiments in higher education today. It rewards smart young people for not going to college and, instead, diving into the real world of science, technology and business.

The idea isn't nuts. After all, Bill Gates and Steve Jobs dropped out, and they did O.K.

Of course, their kind of success is rare, degree or no degree. Mr. Gates and Mr. Jobs changed the world. Ms. Full wants to, as well, and she's in a hurry. She has built a low-cost solar panel and is starting to test it in Africa.

"I was antsy to get out into the world and execute on my ideas," she says.

At a time when the value of a college degree is being called into question, and when job prospects for many new graduates are grimmer than they've been in years, perhaps it's no surprise to see a not-back-to-school movement spring up. What is surprising is where it's springing up, and who's behind it.

The push, which is luring a handful of select students away from the likes of Princeton, Harvard and M.I.T., is the brainchild of Peter A. Thiel, 44, a billionaire and freethinker with a remarkable record in Sil-(p. 7)icon Valley. Back in 1998, during the dot-com boom, Mr. Thiel gambled on a company that eventually became PayPal, the giant of online payments. More recently, he got in early on a little start-up called Facebook.

Since 2010, he has been bankrolling people under the age of 20 who want to find the next big thing -- provided that they don't look for it in a college classroom. His offer is this: $50,000 a year for two years, few questions asked. Just no college, unless a class is helpful for their Thiel projects.

. . .

Ms. Full is friends with another Thiel fellow, Laura Deming, 18. Ms. Deming is clearly brilliant. When she was 12, her family moved to San Francisco from New Zealand so she could work with Cynthia Kenyon, a molecular biologist who studies aging. When Ms. Deming was 14, the family moved again, this time to the Boston area, so she could study at M.I.T.

"Families of Olympic-caliber athletes make these kinds of sacrifices all the time," says Tabitha Deming, Laura's mother. "When we lived nearby in Boston, we were lucky to see her once a month. She never came home for weekends."

John Deming, Laura's father, graduated from Brandeis University at the age of 35 but says he disdains formal education at every level. His daughter was home-schooled.

"I can't think of a worse environment than school if you want your kids to learn how to make decisions, manage risk and take responsibility for their choices," Mr. Deming, an investor, wrote in an e-mail. "Rather than sending them to school, turn your kids loose on the world. Introduce them to the rigors of reality, the most important of which is earning your own way." He added, "I detest American so-called 'education.' "

His daughter's quest to slow aging was spurred by her maternal grandmother, Bertie Deming, 85, who began having neuromuscular problems a decade ago. Laura, a first-year fellow, now spends her days combing medical journals, seeking a handful of researchers worth venture capital funding, which is a continuation of her earlier work.

"I'm looking for therapies that target aging damage and slow or reverse it," she says. "I've already spent six years on this stuff. So far I've found only a few companies, two or three I'm really bullish on."

For the full story, see:

CAITLIN KELLY. "Drop Out, Dive In, Start Up." The New York Times, SundayBusiness (Sun., September 16, 2012): 1 & 7.

(Note: ellipsis added.)

(Note: the online version of the article is dated September 15, 2012, and has the title "Forgoing College to Pursue Dreams.")

"Laura Deming, left, at age 6 with her grandmother, whose neuromuscular problems have now inspired Laura to work on anti-aging technology." Source of caption and photo: online version of the NYT article quoted and cited above.

October 28, 2012

The Kairos of Creative Destruction in Medicine

Wikipedia tells us that "Kairos" "is an ancient Greek word meaning the right or opportune moment (the supreme moment)."

(p. x) With a medical profession that is particularly incapable of making a transition to practicing individualized medicine, despite a new array of powerful tools, isn't it time for consumers to drive this capability? The median of human beings is not the message. The revolution in technology that is based on the primacy of individuals mandates a revolution by consumers in order for new medicine to take hold.

Now you've probably thought "creative destruction" is a pretty harsh term to apply to medicine. But we desperately need medicine to be Schumpetered, to be radically transformed. We need the digital world to invade (p. xi) the medical cocoon and to exploit the newfound and exciting technological capabilities of digitizing human beings. Some will consider this to be a unique, opportune moment in medicine, a veritable once-in-a-lifetime Kairos.

This book is intended to arm consumers to move us forward.


Topol, Eric. The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care. New York: Basic Books, 2012.

(Note: italics in original.)

October 24, 2012

"Our World Has Been "Schumpetered""

"Schumpeter" is now a verb!

(p. v) In the mid-twentieth century Joseph Schumpeter, the noted Austrian economist, popularized the term "creative destruction" to denote transformation that accompanies radical innovation. In recent years, our world has been "Schumpetered." By virtue of the intensive infiltration of digital devices into our daily lives, we have radically altered how we communicate with one another and with our entire social network at once. We can rapidly turn to our prosthetic brain, the search engine, at any moment to find information or compensate for a senior moment. Everywhere we go we take pictures and videos with our cell phone, the one precious object that never leaves our side. Can we even remember the old days of getting film developed? No longer is there such a thing as a record album that we buy as a whole--instead we just pick the song or songs we want and download them anytime and anywhere. Forget about going to a video store to rent a movie and finding out it is not in stock. Just download it at home and watch it on television, a computer monitor, a tablet, or even your phone. If we're not interested in getting a newspaper delivered and accumulating enormous loads of paper to recycle, or having our hands smudged by newsprint, we can simply click to pick the stories that interest us. Even clicking is starting to get old, since we can just tap a tablet or cell phone in virtual silence. The Web lets us sample nearly all books in print without even making a purchase and efficiently download the whole book in a flash. We have both a digital, virtual identity and a real one. This profile just scratches the surface of the way our lives have been radically transformed through digital innovation. Radically transformed. Creatively destroyed.


Topol, Eric. The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care. New York: Basic Books, 2012.

September 2, 2012

Information Technology Enables Massive Process Creative Destruction

(p. 2) . . . I want to argue that something deep is going on with information technology, something that goes well beyond the use of computers, social media, and commerce on the Internet. Business processes that once took place among human beings are now being executed electronically. They are taking place in an unseen domain that is strictly digital. On the surface, this shift doesn't seem particularly consequential -- it's almost something we take for granted. But I believe it is causing a revolution no less important and dramatic than that of the railroads. It is quietly creating a second economy, a digital one.

. . .

(p. 5) Now this second, digital economy isn't producing anything tangible. It's not making my bed in a hotel, or bringing me orange juice in the morning. But it is running an awful lot of the economy. It's helping architects design buildings, it's tracking sales and inventory, getting goods from here to there, executing trades and banking operations, controlling manufacturing equipment, making design calculations, billing clients, navigating aircraft, helping diagnose patients, and guiding laparoscopic surgeries. Such operations grow slowly and take time to form.

. . .

(p. 6) Is this the biggest change since the Industrial Revolution? Well, without sticking my neck out too much, I believe so. In fact, I think it may well be the biggest change ever in the economy. It is a deep qualitative change that is bringing intelligent, automatic response to the economy. There's no upper limit to this, no place where it has to end.

. . .

I think that for the rest of this century, barring wars and pestilence, a lot of the story will be the building out of this second economy, an unseen underground economy that basically is giving us intelligent reactions to what we do above the ground.


Arthur, W. Brian. "The Second Economy." McKinsey Quarterly, no. 4 (Oct. 2011): 90-99.

(Note: ellipses added.)

(Note: I first saw the passages quoted above on pages 243-244 of Timothy Taylor's "Recommendations for Further Reading" feature in The Journal of Economic Perspectives 26, no. 1 (Winter 2012).)

August 26, 2012

Decouple Learning from Credentialing


"JOHN HENNESSY: 'There's a tsunami coming.' [At left] . . . , John Hennessy & Salman Khan." Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. R8) Is there anything to be done about the rising price of higher education? That was the question posed to John Hennessy, president of Stanford University, and Salman Khan, founder of Khan Academy, a nonprofit online-learning organization. They sat down with The Wall Street Journal's Walt Mossberg to discuss how technology might be part of the solution.

Here are edited excerpts of their conversation.

. . .

MR. MOSSBERG: You have a lot of money at Stanford. I've been, until recently, a trustee of Brandeis University. It's a very good university. It charges about what you do. But it doesn't have your money, and there are a lot of colleges like that.

MR. HENNESSY: Agreed, and if you look at the vast majority of colleges in the U.S., there are way too many that are [dependent on tuition to fund their budgets]. That is not sustainable. We have to do something to bend the cost curve, and this is where technology comes in.

MR. KHAN: On the sustainability question, I agree. I think the elites will probably do just fine, but for the bulk of universities, nothing can grow 5% faster than inflation forever. It will just take over the world, and that's what's happening now.

There is a fundamental disconnect happening between the providers of education and the consumers of education. If you ask universities what they are charging the $60,000 for, they'll say, "Look at our research facilities. Look at our faculty. Look at the labs and everything else." And then if you ask the parents and the students why they are taking on $60,000 of debt, they'll say, "Well, I need the credential. I need a job."

So one party thinks they're selling a very kind of an enriching experience, and the other one thinks that they're buying a credential. And if you ask the universities what percentages of your costs are "credentialing," they say oh, maybe 5% to 10%. And so I think there's an opportunity if we could decouple those things--if the credentialing part could happen for significantly less.

MR. MOSSBERG: What do you mean by the credentialing part?

MR. KHAN: If you think about what education is, it's a combination. There's a learning part. You learn accounting, you learn to write better, to think, whatever. Then there is a credentialing part, where I'm going to hand you something that you can go take into the market and signal to people that you know what you're doing.

Right now they're very muddled, but this whole online debate or what's happening now is actually starting to clarify things. At Khan Academy we're 100% focused on the learning side of things. And I think it would be interesting [if credentials could be earned based on what you know and not on where you acquired that knowledge].

For the full interview, see:

Walt Mossberg, interviewer. "Changing the Economics of Education; John Hennessy and Salman Khan on how technology can make the college numbers add up." The Wall Street Journal (Mon., June 4, 2012): R8.

(Note: bracketed words in caption, and ellipses, added; bold and italics in original.)

August 22, 2012

"It's All about Creative Destruction"


"LARRY ELLISON: 'It's all about creative destruction.'" Source of book image: online version of the WSJ review quoted and cited below.

(p. R6) In Silicon Valley, a spot known for constant change, Larry Ellison has kept his job atop Oracle Corp. . . . for decades. And that gives him a unique perspective on the industry and where it's headed.

The Wall Street Journal's Kara Swisher spoke with Mr. Ellison about the state of tech innovation, the future of the Internet--and what keeps him inspired.

What follows are edited excerpts of their discussion.

. . .

MS. SWISHER: A lot of people talk about the end of Silicon Valley, the end of innovation. Do you imagine that?

MR. ELLISON: It's all about creative destruction. Remember Woody Allen's great line about relationships: "Relationships are like a shark. It either has to move forward, or it dies."

That's true of a company. If you don't keep your technology current, if you're not monitoring what is possible today that wasn't possible yesterday, then someone's going to beat you to the punch. Someone's going to get ahead of you, and you're going to lose your customers to some competitor.

We see a lot of companies in Silicon Valley that are under stress now. But there are a lot of other companies that have come along and are doing interesting things.

. . .

MS. SWISHER: What keeps you going?

MR. ELLISON: Red Bull.

I mean, this is going to sound really corny, but life's a journey of discovery. I'm really fascinated by people, and by what can be done with technology. I also enjoy the competition, the process of learning as we compete, learning as we exploit these technologies to solve customer problems.

The whole thing is just fascinating. I don't know what I would do if I retired.

For the full interview, see:

Kara Swisher, interviewer. "Silicon Valley, the Long View; Larry Ellison on how much simpler the consumer has it now." The Wall Street Journal (Mon., June 4, 2012): R6.

(Note: ellipses added; bold and italics in original.)

August 13, 2012

Revolutionary Entrepreneurs Need "Unbridled Confidence and Arrogance"

(p. B1) Will there be another?

It's a bit absurd to try to identify "the next Steve Jobs." Two decades ago, Mr. Jobs himself wouldn't even have qualified. Exiled from Apple Inc., . . . Mr. Jobs was then hoping to revive his struggling computer maker, NeXT Inc. . . .

But just as Mr. Jobs followed Henry Ford and Thomas Edison, there will some day be another innovator with the vision, drive and disdain of the status quo to spark, and then direct, big changes in how we live.

. . .

"You have to try the unreasonable," says Vinod Khosla, a co-founder of Sun Microsystems Inc., who, as a longtime venture capitalist, has seen thousands of would-be revolutionaries. Two key characteristics, Mr. Khosla says: "unbridled confidence and arrogance."

For the full story, see:

SCOTT THURM and STU WOO. "Who Will Be the 'Next Steve Jobs'?" The Wall Street Journal (Sat., October 8, 2011): B1 & B3.

(Note: ellipses added.)

August 6, 2012

Stewart Brand Marvels at Hippie Perfectionist Jobs' Results


Stewart Brand. Source of photo: online version of the NYT interview quoted and cited below.

(p. 3) Stewart Brand is best known as the editor of the Whole Earth Catalog, a counterculture compendium published twice a year between 1968 and 1972 and the only catalog to win the National Book Award. Its credo, "Stay hungry. Stay foolish," influenced many of the hippie generation, most notably Steve Jobs.

. . .

READING I'm devouring "Steve Jobs," by Walter Isaacson. Steve's life and interests intersected with mine a number of times, so revisiting all that in sequence is like galloping through a version of my own life, plus I get to fill in the parts of his life I wondered about. Take a hippie who is also a driven perfectionist at crafting digital tools, let him become adept at managing corporate power, and marvel at what can result. The book I'm studying line by line, and dog-earing every other page, is Steven Pinker's "Better Angels of Our Nature." It chronicles the dramatic decline of violence and cruelty in human affairs in every century. Now that we know that human behavior has been getting constantly gentler and fairer, how do we proceed best with that wind at our backs?

For the full interview, see:

KATE MURPHY, interviewer. "DOWNLOAD; Stewart Brand." The New York Times, Sunday Review (Sun., Nov. 6, 2011): 3.

(Note: ellipsis added.)

(Note: the online version of the interview has the date November 5, 2011.)

July 29, 2012

Neural Implants "Restored Their Human Functionality"


Ray Kurzweil. Source of photo: online version of the WSJ article quoted and cited below.

(p. C12) Inventor and entrepreneur Ray Kurzweil is a pioneer in artificial intelligence--the principal developer of the first print-to-speech reading machine for the blind, and the first text-to-speech synthesizer, among other breakthroughs. He is also a writer who explores the future of information technology and how it is changing our world.

In a wide-ranging interview, Mr. Kurzweil and The Wall Street Journal's Alan Murray discussed advances in artificial intelligence, nanotechnology, and what it means to be human. Here are edited excerpts of their conversation:

. . .

MR. MURRAY: What about life expectancy? Is there a limit?

MR. KURZWEIL: No. We're constantly pushing back life expectancy. Now it's going to go into high gear because of the inherent exponential progression of information technology. According to my models, within 15 years we'll be adding more than a year to your remaining life expectancy each year.

MR. MURRAY: So if you play the odds right, you never hit the endpoint.

MR. KURZWEIL: Right. If you can hang in there for another 15 years, we could get to that point.

What Is Human?

MR. MURRAY: What does it mean to be human in a post-2029 world?

MR. KURZWEIL: It's a slippery slope. But we've already gone down that slope. I've talked to people who have neural implants in their brain, for Parkinson's, and I've asked them, "Are you still human? Are you less human?"

Generally speaking, they say, "It's part of me." And they're very proud of it, because it restored their human functionality.

For the full interview, see:

Alan Murray, interviewer. "Man or Machine? Ray Kurzweil on how long it will be before computers can do everything the brain can do." The Wall Street Journal (Fri., June 29, 2012): C12.

(Note: ellipsis added; bold in original.)
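
Kurzweil's claim above has a simple arithmetic core: if medicine adds more than one year of remaining life expectancy per calendar year, the endpoint keeps receding. The toy calculation below illustrates only that arithmetic; the 15-year starting expectancy and the 1.1-year annual gain are illustrative assumptions, not figures from Kurzweil's models.

    # Toy illustration of the arithmetic behind "never hit the endpoint":
    # if more than one year of remaining expectancy is added per year lived,
    # remaining expectancy never reaches zero. All figures are illustrative.
    remaining = 15.0       # assumed remaining life expectancy, in years
    ANNUAL_GAIN = 1.1      # assumed years of expectancy added per calendar year

    for year in range(1, 51):
        remaining += ANNUAL_GAIN - 1.0   # a year passes, but more than a year is added
        if year % 10 == 0:
            print(f"after {year} years: {remaining:.1f} years of expectancy remain")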

July 19, 2012

Larry Page on Tesla, Commerce, and Changing the World

Funding is a key constraint for the innovative project entrepreneur. By "project entrepreneur" I mean the innovator who views money as a means to achieving the project, and not as an end in itself. In this brief clip from Page's 2007 AAAS talk, he discusses how, as a 12-year-old reading Tesla's autobiography, he almost cried at how Tesla's failure to commercialize his ideas limited his ability to change the world.

The Tesla autobiography is:

Tesla, Nikola. My Inventions: The Autobiography of Nikola Tesla. SoHo Books, 2012.

June 21, 2012

"A123 Systems" Battery Company Is Another Example of Failed Industrial Policy

The YouTube video embedded above was from a CBS Evening News broadcast in June 2012. It illustrates the difficulty the government faces in selecting the technologies, and companies, that will eventually prove successful. (The doctrine that government can and should do such selection is often called "industrial policy.")

The Obama administration has bet billions of tax dollars on lithium ion batteries for electric vehicles that A123 Systems won $249 million of. But as Sharyl Attkisson reports, expensive recalls and other setbacks have put substantial doubt in the company's ability to continue.

The text above and the embedded video clip were published on YouTube on June 17, 2012, by CBSNewsOnline at

May 25, 2012

Wise and Wyly Words on Air Conditioning

(p. 42) It was February 1958. I got myself a room, not far from the office, in a little house built in the 1920s owned by a seventy-five-year-old woman named Mrs. Thompson. I lived in her "in-law's room," which meant I had my own front door, but I had to share the bathroom with her and, because I did not have a kitchen, I had to eat out. My rent was $10 a week.

I had my car, which meant I could get around, and the training school was air-conditioned, which meant my second summer in Dallas was a lot more pleasant than my first.

Thank you, Willis Haviland Carrier, for inventing air-conditioning. I owe you one. And I'm not the only one. At the height of the dot-com stock market bubble of 1999, Barton Biggs--the wise, graying investments guru at Morgan Stanley--posed this question to seventy-one people: which invention is more important, the Internet or air-conditioning? Barton was on the losing side of the vote, 70-2.

Obviously, he'd found seventy people who'd never spent an August in Texas.


Wyly, Sam. 1,000 Dollars and an Idea: Entrepreneur to Billionaire. New York: Newmarket Press, 2008.

May 18, 2012

Asteroid-Mining Start-Up Hopes to Launch First Spacecraft within Two Years


"A computer image shows a rendering of a spacecraft preparing to capture a water-rich, near-Earth asteroid." Source of caption: print version of the WSJ article quoted and cited below. Source of photo: online version of the WSJ article quoted and cited below.

(p. B3) SEATTLE--A start-up with high-profile backers on Tuesday unveiled its plan to send robotic spacecraft to remotely mine asteroids, a highly ambitious effort aimed at opening up a new frontier in space exploration.

At an event at the Seattle Museum of Flight, a group that included former National Aeronautics and Space Administration officials unveiled Planetary Resources Inc. and said it is developing a "low-cost" series of spacecraft to prospect and mine "near-Earth" asteroids for water and metals, and thus bring "the natural resources of space within humanity's economic sphere of influence."

The solar system is "full of resources, and we can bring that back to humanity," said Planetary Resources co-founder Peter Diamandis, who helped start the X-Prize competition to spur nongovernmental space flight.

The company said it expects to launch its first spacecraft to low-Earth orbit--between 100 and 1,000 miles above the Earth's surface--within two years, in what would be a prelude to sending spacecraft to prospect and mine asteroids.

The company, which was founded three years ago but remained secret until last week, said it could take a decade to finish prospecting, or identifying the best candidates for mining.

For the full story, see:

AMIR EFRATI. "Asteroid-Mining Strategy Is Outlined by a Start-Up." The Wall Street Journal (Weds., April 25, 2012): B3.

(Note: the online version of the story is dated April 24, 2012, and has the title "Start-Up Outlines Asteroid-Mining Strategy.")

May 17, 2012

Quick Computing If Air Conditioning Worked

(p. 36) Using those IBM 650s was no easy feat. You had to take your turn in line with the other students, write your program, key punch it onto a big stack of cards, do your proofs to make sure it was accurate, and feed it into the computer. If you were lucky and the air-conditioning did not malfunction, you'd get your results back quickly. But there would be errors, which you had to correct, and then you had to repeat the process over and over again until the 650--working on the data with the program that you wrote--came up with the right answers.


Wyly, Sam. 1,000 Dollars and an Idea: Entrepreneur to Billionaire. New York: Newmarket Press, 2008.

May 16, 2012

"Birdseye Coaxes Readers to Re-examine Everyday Miracles"


Source of book image:

(p. C7) Birdseye made and lost money, went west to search for the cause of Rocky Mountain spotted fever and hunted fox for furs in Labrador, where he took his wife and infant son to live 250 miles by dogsled from the nearest hospital. He harpooned whales near his home in Gloucester, Mass., and wore a necktie while doing it. And he designed the industrial processes that made it possible to fast-freeze food, thus rendering obsolete much canned, dried, salted and smoked food and the musty basement bins that once held a winter's diet of turnips, onions and potatoes.

Food had been frozen earlier but more slowly. Crystallization turned it mushy and tasteless. It was poor man's food. In Labrador, fishing with the Inuit, Birdseye noticed that when a fish was pulled from a hole in the ice and into minus-40-degree air, it froze instantly, staying so fresh that when it was thawed months later, it would sometimes come alive.

He spent years putting together modern mass production with what he had seen in Labrador. By the 1920s, he was fast-freezing food that was far closer to fresh than any competition. "Today's locavore movement--the movement to shun food from afar and eat what is produced locally . . . would have perplexed him," Mr. Kurlansky writes. After all, "consumers could go to a supermarket and buy the food of California, France and China for less money."

. . .

The author makes a telling point about locavores: "We need to grasp that people who are accustomed only to artisanal goods long for the industrial. It is only when the usual product is industrial that the artisanal is longed for. This is why artisanal food, the dream of the food of family farms, caught on so powerfully in California, one of the early strongholds of agribusiness with little tradition of small family farms."

Birdseye's heroism has been forgotten, and his frozen food is taken for granted, the way all inventions are taken sooner or later. He sold his business for $23.5 million in 1929 to what would become General Foods. He stayed on as a consultant and also ran his light bulb company, which he would sell too.

For the full review, see:

HENRY ALLEN. "The American Way of Eating; Harlan Sanders and Clarence Birdseye, just like today's locavores, saw a meal as a way to improve people's lives." The Wall Street Journal (Sat., May 5, 2012): C5 & C7.

(Note: ellipsis added.)

(Note: the online version of the review is dated May 4, 2012.)

(p. C6) "Birdseye" is a slight but intriguing book that raises far more questions than it answers. But it indeed coaxes readers to re-examine everyday miracles like frozen food, and to imagine where places with no indigenous produce would be without them. It emphasizes the many steps that went into developing such a simple-seeming process.

For the full review, see:

JANET MASLIN. "BOOKS OF THE TIMES; The Inventor Who Put Frozen Peas on Our Tables." The New York Times (Thurs., April 26, 2012): C6.

(Note: the online version of the review is dated April 25, 2012.)

Book reviewed:

Kurlansky, Mark. Birdseye: The Adventures of a Curious Man. New York: Doubleday, 2012.


"Mark Kurlansky." Source of caption and photo: online version of the NYT article quoted and cited above.

May 13, 2012

A "Boring" and "Excellent" Business Education

(p. 34) Most of what they taught us in those days was functional. This was before they added "entrepreneurship" to business courses. It was all about manufacturing, marketing, and personnel. I found that somewhat boring. I had two favorite courses. The first was Small Business. It was the only course where all the pieces came together. The other was Computing, which was the first computer course that the Michigan Business School had ever taught. I had a feeling that this was the big new thing. But, more important, it was what IBM did. I had never seen a computer lab before. This was soon after Remington Rand made headlines with its UNIVAC I, the world's first commercial computer.

. . .

(p. 59) The University of Michigan is an excellent school. I loved being there and I am proud to have earned an MBA. When I was there, I noticed that the five-and-ten-cents-store founder, Sebastian S. Kresge--the man who invented the Kmart chain--had given them Kresge Hall. When I could afford to, I figured, why not do the same? I have always been so grateful for what I learned there. In 1997 I gave the school funding for a Sam Wyly Hall. (A few years earlier, Charles and I had helped to build Louisiana Tech's 16-story Wyly Tower of Learning.) It's fulfilling to me that today Paton Scholars study at Sam Wyly Hall on the Ann Arbor campus.

Source of both quotes:

Wyly, Sam. 1,000 Dollars and an Idea: Entrepreneur to Billionaire. New York: Newmarket Press, 2008.

(Note: ellipsis added.)

May 8, 2012

Entrepreneur Sam Wyly Hard to Classify


Source of book image:

I sometimes divide entrepreneurs into two broad types: free agent entrepreneurs and innovative entrepreneurs. Free agent entrepreneurs are the self-employed. Innovative entrepreneurs are the agents of Schumpeter's process of creative destruction.

Then there are entrepreneurs like Sam Wyly who don't fit very well in either category.

He built or improved businesses in ways that made the world better but usually did not involve breakthrough innovations.

Like many of the entrepreneurs considered in Amar Bhidé's main books, Wyly grew businesses that served consumers, enriched investors and created jobs. Some of his most important start-ups, especially early on, involved computer services. And his efforts to compete with the government-backed AT&T monopoly were heroic.

I read the 2008 version of his autobiography a few months ago, and found that it contained a few stories and observations that are worth pondering. In the next few weeks I will briefly quote a few of these.

The 2008 Wyly autobiography is:

Wyly, Sam. 1,000 Dollars and an Idea: Entrepreneur to Billionaire. New York: Newmarket Press, 2008.

I have not read the 2011 version of Wyly's autobiography:

Wyly, Sam. Beyond Tallulah: How Sam Wyly Became America's Boldest Big-Time Entrepreneur. New York: Melcher Media, 2011.

The dominant examples in Bhidé's two main books are entrepreneurs like Wyly. The two main Bhidé books are:

Bhidé, Amar. The Origin and Evolution of New Businesses. Oxford, UK: Oxford University Press, 2000.

Bhidé, Amar. The Venturesome Economy: How Innovation Sustains Prosperity in a More Connected World. Princeton, NJ: Princeton University Press, 2008.

May 6, 2012

Entrepreneurs Will Mine Asteroids to "Help Ensure Humanity's Prosperity"

"Space mining has captivated Hollywood. Director James Cameron is a backer of the new venture." Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. B1) A new company backed by two Google Inc. billionaires, film director James Cameron and other space exploration proponents is aiming high in the hunt for natural resources--with mining asteroids the possible target.

The venture, called Planetary Resources Inc., revealed little in a press release this week except to say that it would "overlay two critical sectors--space exploration and natural resources--to add trillions of dollars to the global GDP" and "help ensure humanity's prosperity." The company is formally unveiling its plans at an event . . . in Seattle.

. . .

[The] . . . event is being hosted by Peter H. Diamandis and Eric Anderson, known for their efforts to develop commercial space exploration, and two former NASA officials.

Mr. Diamandis, a driving force behind the Ansari X-Prize competition to spur non-governmental space flight, has long discussed his goal to become an asteroid miner. He contends that such work by space pioneers would lead to a "land rush" by companies to develop lower-cost technology to travel to and extract resources from asteroids.

For the full story, see:

AMIR EFRATI. "A Quixotic Quest to Mine Asteroids." The Wall Street Journal (Sat., April 21, 2012,): B1 & B4.

(Note: ellipses and bracketed word added.)

(Note: the online "updated" version of the article is dated April 23, 2012.)

May 4, 2012

Innovation Took "Three Years Working through the Bureaucratic Snags"

"FULL FLEDGED; The production prototype of the Terrafugia Transition, with its wings folded and road-ready." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 13) THE promise of an airplane parked in every driveway, for decades a fantasy of suburban commuters and a staple of men's magazines, resurfaced this month in Manhattan. On display at the New York auto show was the Terrafugia Transition, an airplane with folding wings and a drive system that enabled it to be used on the road.

. . .

But there can be many delays along the road from concept to certification. For instance, government officials and the designers have had to determine which regulations -- aircraft or automotive -- take precedence when the vehicle in question is both.

. . .

In 2010, the $94,000 Maverick, a rudimentary buggy that takes to the air under a powered parachute, earned certification as a light-sport aircraft. Troy Townsend, design manager and chief test pilot for the company, based in Dunnellon, Fla., said he spent nearly all of his time over the course of three years working through the bureaucratic snags.

"There was a lot of red tape," Mr. Townsend said. "The certification process went all the way to Oklahoma and Washington, D.C."

For the full story, see:

CHRISTINE NEGRONI. "Before Flying Car Can Take Off, There's a Checklist." The New York Times, SportsSunday Section (Sun., April 29, 2012): 13.

(Note: ellipses added.)

(Note: the online version of the story is dated April 27, 2012.)

Source of table: online version of the NYT article quoted and cited above.

April 13, 2012

Myhrvold Left Work with Hawking for the Excitement of Entrepreneurship

(p. 139) Microsoft was represented in the discussion by its senior vice president for advanced technology, a thirty-five-year-old Nathan Myhrvold. After finishing his Ph.D. at Princeton at age twenty-three, Myhrvold had worked for a year as a postdoctoral fellow with the physicist Stephen Hawking at Cambridge, tackling theories of (p. 140) gravitation and curved space-time, before taking a three-month leave of absence to help some friends in the Bay Area with a software project. He became caught up in the excitement of personal computer software and entrepreneurship and never went back. In Berkeley, he co-founded a company called Dynamical Systems to develop operating systems for personal computers, which struggled for two years until Microsoft bought it in 1986. At Microsoft, he persuaded Bill Gates to let him establish a corporate research center, Microsoft Research, with Myhrvold himself in charge.


Price, David A. The Pixar Touch: The Making of a Company. New York: Alfred A. Knopf, 2008.

(Note: italics in original.)

(Note: my strong impression is that the pagination is the same for the 2008 hardback and the 2009 paperback editions, except for part of the epilogue, which is revised and expanded in the paperback. I believe the passage above has the same page number in both editions.)

April 12, 2012

Benefits of Driverless Cars Justify Changing Liability Laws

"The car is driven by a computer that steers, starts and stops itself. A 360 degrees laser scanner on top of the car, a GPS system and other sensors monitor the surrounding traffic." Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. A13) Expect innovations that change the nature of driving more than anything since the end of the hand-crank engine--so long as the legal and regulatory systems don't strangle new digital technologies before they can roll off the assembly line.

. . .

Mr. Ford outlined a future of what the auto industry calls "semiautonomous driving technology," meaning increasingly self-driving cars. Over the next few years, cars will automatically be able to maintain safe distances, using networks of sensors, V-to-V (vehicle-to-vehicle) communications and real-time tracking of driving conditions fed into each car's navigation system.

This will limit the human error that accounts for 90% of accidents. Radar-based cruise control will stop cars from hitting each other, with cars by 2025 driving themselves in tight formations Mr. Ford describes as "platoons," cutting congestion as the space between cars is reduced safely.

. . .

Over the next decade, cars could finally become true automobiles. Our laws will have to be updated for a new relationship between people and cars, but the benefits will be significant: fewer traffic accidents and fewer gridlocked roads--and, perhaps best of all, young people will be in self-driving cars, not teenager-driven cars.

For the full commentary, see:

L. GORDON CROVITZ. "INFORMATION AGE; The Car of the Future Will Drive You; A truly auto-mobile is coming if liability laws don't stop it." The Wall Street Journal (Mon., March 5, 2012): A13.

(Note: ellipses added.)

March 31, 2012

Quantum Computers May Revolutionize Nanotechnology and Drug Design


"Scott Aaronson." Source of caption and photo: online version of the NYT commentary quoted and cited below.

(p. D5) When people hear that I work on quantum computing -- one of the most radical proposals for the future of computation -- their first question is usually, "So when can I expect a working quantum computer on my desk?" Often they bring up breathless news reports about commercial quantum computers right around the corner. After I explain the strained relationship between those reports and reality, they ask: "Then when? In 10 years? Twenty?"

Unfortunately, this is sort of like asking Charles Babbage, who drew up the first blueprints for a general-purpose computer in the 1830s, whether his contraption would be hitting store shelves by the 1840s or the 1850s. Could Babbage have foreseen the specific technologies -- the vacuum tube and transistor -- that would make his vision a reality more than a century later? Today's quantum computing researchers are in a similar bind. They have a compelling blueprint for a new type of computer, one that could, in seconds, solve certain problems that would probably take eons for today's fastest supercomputers. But some of the required construction materials don't yet exist.

. . .

While code-breaking understandably grabs the headlines, it's the more humdrum application of quantum computers -- simulating quantum physics and chemistry -- that has the potential to revolutionize fields from nanotechnology to drug design.

. . .

Like fusion power, practical quantum computers are a tantalizing possibility that the 21st century may or may not bring -- depending on the jagged course not only of science and technology, but of politics and economics.

For the full commentary, see:

SCOTT AARONSON. "ESSAY; Quantum Computing Promises New Insights, Not Just Supermachines." The New York Times (Tues., December 6, 2011): D5.

(Note: ellipses added.)

(Note: the online version of the commentary is dated December 5, 2011.)

March 24, 2012

The Intensity of Entrepreneur Jobs

(p. 114) As Jobs was criticizing the Pixar managers for failing to hit a delivery date on a project, Smith interrupted and said, "Steve, but you haven't delivered your board on time"--meaning, a board for the NeXT computer.

It was the sort of remark Jobs normally might have put up with, but it seemed Smith had crossed a line by joking about Jobs's [sic] computer. "He went completely nonlinear," Smith recalled. "He went crazy on me and started insulting my accent."

Jobs had homed in on a sensitive spot. Smith's native southwestern accent, which he had mostly suppressed since his days as an academic in New York City, sometimes reemerged in moments of stress. Jobs mocked it.

"So I went nonlinear, too, which I had never done before or since," Smith remembered. "We're screaming at each other, and our faces are about three inches apart."

There was an unspoken understanding around Jobs that the whiteboard in his office was part of his personal space--no one else was to write on it. As the confrontation went on, Smith defiantly marched past him and started writing on the whiteboard. "You can't do that," Jobs interjected. When Smith continued writing, Jobs stormed out of the room.

To outward appearances, the conflict blew over, but the men's relationship would never be the same.


Price, David A. The Pixar Touch: The Making of a Company. New York: Alfred A. Knopf, 2008.

(Note: italics in original.)

(Note: my strong impression is that the pagination is the same for the 2008 hardback and the 2009 paperback editions, except for part of the epilogue, which is revised and expanded in the paperback. I believe the passage above has the same page number in both editions.)

March 20, 2012

Tool Makers Cannot Predict Creative Ways Tools Will Be Used

(p. 89) Jobs had no use for small-minded naysayers. His experience had taught him that if you offered a better computer, well priced and accessible, there was no limit to what human ingenuity could achieve with it. No one, after all, had thought of electronic spread-sheets when he and Wozniak rolled out the Apple II, in 1977, but within two years, a spreadsheet program called VisiCalc--created in an attic by a first-year Harvard MBA student and a programmer friend--was one of the strongest drivers of Apple II sales. The PIC was not a consumer product like the Apple II, but the principle was the same. "People are inherently creative," Jobs remarked to an interviewer a few years later. "They will use tools in ways the tool makers never thought possible."


Price, David A. The Pixar Touch: The Making of a Company. New York: Alfred A. Knopf, 2008.

(Note: my strong impression is that the pagination is the same for the 2008 hardback and the 2009 paperback editions, except for part of the epilogue, which is revised and expanded in the paperback. I believe the passage above has the same page number in both editions.)

March 17, 2012

Internet Companies Respect the Value of Your Time

"Arvind Jain, a Google engineer, pointed out the loading speed of individual elements of a website on a test application used to check efficiency, at Google offices in Mountain View, Calif." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. A1) Wait a second.

No, that's too long.

Remember when you were willing to wait a few seconds for a computer to respond to a click on a Web site or a tap on a keyboard? These days, even 400 milliseconds -- literally the blink of an eye -- is too long, as Google engineers have discovered. That barely perceptible delay causes people to search less.

"Subconsciously, you don't like to wait," said Arvind Jain, a Google engineer who is the company's resident speed maestro. "Every millisecond matters."

Google and other tech companies are on a new quest for speed, challenging the likes of Mr. Jain to make fast go faster. The reason is that data-hungry smartphones and tablets are creating frustrating digital traffic jams, as people download maps, video clips of sports highlights, news updates or recommendations for nearby restaurants. The competition to be the quickest is fierce.

People will visit a Web site less often if it is slower than a close competitor by more than 250 milliseconds (a millisecond is a thousandth of a second).

"Two hundred fifty milliseconds, either slower or faster, is close to the magic number now for competitive advantage on the Web," said Harry Shum, a computer scientist and speed specialist at Microsoft.

. . .

(p. A3) The need for speed itself seems to be accelerating. In the early 1960s, the two professors at Dartmouth College who invented the BASIC programming language, John Kemeny and Thomas Kurtz, set up a network in which many students could tap into a single, large computer from keyboard terminals.

"We found," they observed, "that any response time that averages more than 10 seconds destroys the illusion of having one's own computer."

In 2009, a study by Forrester Research found that online shoppers expected pages to load in two seconds or fewer -- and at three seconds, a large share abandon the site. Only three years earlier a similar Forrester study found the average expectations for page load times were four seconds or fewer.

The two-second rule is still often cited as a standard for Web commerce sites. Yet experts in human-computer interaction say that rule is outdated. "The old two-second guideline has long been surpassed on the racetrack of Web expectations," said Eric Horvitz, a scientist at Microsoft's research labs.

For the full story, see:

STEVE LOHR. "For Impatient Web Users, an Eye Blink Is Just Too Long to Wait." The New York Times (Thurs., March 1, 2012): A1 & A3.

(Note: ellipsis added.)

(Note: the online version of the article is dated February 29, 2012.)

Source of graph: online version of the NYT article quoted and cited above.
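
The thresholds quoted above (roughly 250 milliseconds between close competitors, and the older two-second rule for commerce sites) can be checked against a live page. Below is a minimal sketch using only Python's standard library; the URLs are placeholders, and a single fetch of the HTML understates true page-load time, which also includes scripts, images and rendering.

    import time
    import urllib.request

    COMPETITIVE_GAP = 0.250   # seconds; the ~250 ms competitive gap cited above
    TWO_SECOND_RULE = 2.0     # seconds; the older e-commerce rule of thumb

    def fetch_seconds(url):
        """Wall-clock seconds needed to download the page body once."""
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
        return time.perf_counter() - start

    # Placeholder URLs; substitute the competing pages you actually want to compare.
    urls = ("https://example.com", "https://example.org")
    times = {url: fetch_seconds(url) for url in urls}
    for url, seconds in times.items():
        flag = "over" if seconds > TWO_SECOND_RULE else "under"
        print(f"{url}: {seconds * 1000:.0f} ms ({flag} the old two-second rule)")
    gap = abs(times[urls[0]] - times[urls[1]])
    print("gap exceeds 250 ms" if gap > COMPETITIVE_GAP else "gap within 250 ms")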

March 9, 2012

Web Sites Expose Petty Corruption

"Swati Ramanathan, a founder of the site I Paid a Bribe, in India." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B1) The cost of claiming a legitimate income tax refund in Hyderabad, India? 10,000 rupees.

The going rate to get a child who has already passed the entrance requirements into high school in Nairobi, Kenya? 20,000 shillings.

The expense of obtaining a driver's license after having passed the test in Karachi, Pakistan? 3,000 rupees.

Such is the price of what Swati Ramanathan calls "retail corruption," the sort of nickel-and-dime bribery, as opposed to large-scale graft, that infects everyday life in so many parts of the world.

Ms. Ramanathan and her husband, Ramesh, along with Sridar Iyengar, set out to change all that in August 2010 when they started I Paid a Bribe, a site that collects anonymous reports of bribes paid, bribes requested but not paid and requests that were expected but not forthcoming.

About 80 percent of the more than 400,000 reports to the site tell stories like the ones above of officials and bureaucrats seeking illicit payments to provide routine services or process paperwork and forms.

"I was asked to pay a bribe to get a birth certificate for my daughter," someone in Bangalore, India, wrote in to the Web site on Feb. 29, recording payment of a 120-rupee bribe in Bangalore. "The guy in charge called it 'fees' " -- except there are no fees charged for birth certificates, Ms. Ramanathan said.

Now, similar sites are spreading like kudzu around the globe, vexing petty bureaucrats the world over. Ms. Ramanathan said nongovernmental organizations and government agencies from at least 17 countries had contacted Janaagraha, the nonprofit organization in Bangalore that operates I Paid a Bribe, to ask about obtaining the source code and setting up a site of their own.

For the full story, see:

STEPHANIE STROM. "Web Sites Shine Light on Petty Bribery Worldwide." The New York Times (Weds., March 7, 2012): B1 & B4.

(Note: the online version of the article has the date March 6, 2012.)


"Antony Ragui started an I Paid a Bribe site in Kenya." Source of caption and photo: online version of the NYT article quoted and cited above.

March 4, 2012

Storytelling Trumps Technology in Making Good Movies

(p. 28) The calamity of Tubby the Tuba forced them to confront an unpleasant fact--namely, that they were in the wrong place for making good movies. Money was not enough, they could now see. Technical genius was not enough (though Tubby had grave technical problems, too). Splendid equipment would not be enough. For them to make worthwhile films someday--not just the R&D exercises (p. 29) they showed at SIGGRAPH meetings--there also had to be people on board who understood film storytelling. Schure, although blessed with great foresight, could not be their Walt Disney.


Price, David A. The Pixar Touch: The Making of a Company. New York: Alfred A. Knopf, 2008.

(Note: italics in original.)

(Note: my strong impression is that the pagination is the same for the 2008 hardback and the 2009 paperback editions, except for part of the epilogue, which is revised and expanded in the paperback. I believe the passage above has the same page number in both editions.)

February 18, 2012

Paul Allen Uses Microsoft Profits for Bold Private Space Project

Source of graphic: online version of the WSJ article quoted and cited below.

(p. B1) Microsoft Corp. co-founder Paul Allen indicated he is prepared to commit $200 million or more of his wealth to build the world's largest airplane as a mobile platform for launching satellites at low cost, which he believes could transform the space industry.

Announced Tuesday, the novel, high-risk project conceived by renowned aerospace designer Burt Rutan seeks to combine engines, landing gears and other parts removed from old Boeing 747 jets with a newly created composite craft from Mr. Rutan and a powerful rocket to be built by a company run by Internet billionaire and commercial-space pioneer Elon Musk.

Dubbed Stratolaunch and funded by one of Mr. Allen's closely held entities, the venture seeks to meld decades-old airplane technology with cutting-edge booster-rocket designs in an unprecedented way to assemble a hybrid that would offer the first totally privately funded space transportation system.

For the full story, see:

ANDY PASZTOR and DIONNE SEARCEY. "Paul Allen, Supersizing Space Flight; Billionaire's Novel Vision Has Wingspan Wider Than a Football Field, Weighs 1.2 Million Pounds." The Wall Street Journal (Weds., December 5, 2011): B1 & B5.

February 17, 2012

"What Success Had Brought Him, . . . , Was Freedom"

(p. 5) The success of Pixar's films had brought him something exceedingly rare in Hollywood: not the house with the obligatory pool in the backyard and the Oscar statuettes on the fireplace mantel, or the country estate, or the vintage Jaguar roadster--although he had all of those things, too. It wasn't that he could afford to indulge his affinity for model railroads by acquiring a full-size 1901 steam locomotive, with plans to run it on the future site of his twenty-thousand-square-foot mansion in Sonoma Valley wine country. (Even Walt Disney's backyard train had been a mere one-eighth-scale replica.)

None of these was the truly important fruit of Lasseter's achievements. What success had brought him, most meaningfully, was freedom. Having created a new genre of film with his colleagues at Pixar, he had been able to make the films he wanted to make, and he was coming back to Disney on his own terms.


Price, David A. The Pixar Touch: The Making of a Company. New York: Alfred A. Knopf, 2008.

(Note: ellipsis in title was added.)

(Note: my strong impression is that the pagination is the same for the 2008 hardback and the 2009 paperback editions, except for part of the epilogue, which is revised and expanded in the paperback. I believe the passage above has the same page number in both editions.)

February 12, 2012

Pixar as a Case Study on Innovative Entrepreneurship


Source of book image:

Toy Story and Finding Nemo are among my all-time-favorite animated movies. How Pixar developed the technology, and the story-telling sense, to make these movies is an enjoyable and edifying read.

Along the way, I learned something about entrepreneurship, creative destruction, and the economics of technology. In the next couple of months I will occasionally quote passages that are memorable examples of broader points or that raise thought-provoking questions about how innovation happens.

Book discussed:

Price, David A. The Pixar Touch: The Making of a Company. New York: Alfred A. Knopf, 2008.

February 7, 2012

The Tasmanian Technological Regress: "Slow Strangulation of the Mind"

(p. 78) The most striking case of technological regress is Tasmania. Isolated on an island at the end of the world, a population of less than 5,000 hunter-gatherers divided into nine tribes did not just stagnate, or fail to progress. They fell steadily and gradually back into a simpler toolkit and lifestyle, purely because they lacked the numbers to sustain their existing technology. Human beings reached Tasmania at least 35,000 years ago while it was still connected to Australia. It remained connected - on and off - until about 10,000 years ago, when the rising seas filled the Bass Strait. Thereafter the Tasmanians were isolated. By the time Europeans first encountered Tasmanian natives, they found them not only to lack many of the skills and tools of their mainland cousins, but to lack many technologies that their own ancestors had once possessed. They had no bone tools of any kind, such as needles and awls, no cold-weather clothing, no fish hooks, no hafted tools, no barbed spears, no fish traps, no spear throwers, no boomerangs. A few of these had been invented on the mainland after the Tasmanians had been isolated from it - the boomerang, for instance - but most had been made and used by the very first Tasmanians. Steadily and inexorably, so the archaeological history tells, these tools and tricks were abandoned. Bone tools, for example, grew simpler and simpler until they were dropped altogether about 3,800 years ago. Without bone tools it became impossible to sew skins into clothes, so even in the bitter winter, the Tasmanians went nearly naked but for seal-fat grease smeared on their skin and wallaby pelts over their shoulders. The first Tasmanians caught and ate plenty of fish, but by the time of Western contact they not only ate no fish (p. 79) and had eaten none for 3,000 years, but they were disgusted to be offered it (though they happily ate shellfish).

The story is not quite that simple, because the Tasmanians did invent a few new things during their isolation. Around 4,000 years ago they came up with a horribly unreliable form of canoe-raft, made of bundles of rushes and either paddled by men or pushed by swimming women (!), which enabled them to reach offshore islets to harvest birds and seals. The raft would become waterlogged and disintegrate or sink after a few hours, so it was no good for re-establishing contact with the mainland. As far as innovation goes, it was so unsatisfactory that it almost counts as an exception to prove the rule. The women also learnt to dive up to twelve feet below the water to prise clams off the rocks with wooden wedges and to grab lobsters. This was dangerous and exhausting work, which they were very skilled at: the men did not take part. So it was not that there was no innovation; it was that regress overwhelmed progress.

The archaeologist who first described the Tasmanian regress, Rhys Jones, called it a case of the 'slow strangulation of the mind', which perhaps understandably enraged some of his academic colleagues. There was nothing wrong with individual Tasmanian brains; there was something wrong with their collective brains. Isolation - self-sufficiency - caused the shrivelling of their technology. Earlier I wrote that division of labour was made possible by technology. But it is more interesting than that. Technology was made possible by division of labour: market exchange calls forth innovation.


Ridley, Matt. The Rational Optimist: How Prosperity Evolves. New York: Harper, 2010.

February 2, 2012

Kickstarter Helps Finance Projects

"The creators of the TikTok Watchband, left, and the Elevation Dock have raised far more money on Kickstarter than they initially sought." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B1) Kickstarter is a "crowd-funding" site. It's a place for creative people to get enough start-up money to get their projects off the ground. The categories include music, film, art, design, food, publishing and technology. The projects seeking support might be recording a CD, putting on a play, producing a short film or developing a cool new tech product.

Suppose you're the one who needs money. You describe your project with a video, a description and a target dollar amount. Listing your project is free.

If the citizens of the Web pledge enough money to meet your target by the deadline you set, then you get your money and (p. B7) you proceed with your project. At that point, Kickstarter takes 5 percent, and you pay 3 to 5 percent to Amazon's credit card service.

If you don't raise the money by the deadline, the deal is off. Your contributors keep their money, and Kickstarter takes nothing.

But here's the part I had trouble understanding: These are not investments. If you make a pledge, you'll never see your money again, even if the play, movie or gadget becomes a huge hit. You do get some little memento of your financial involvement -- a T-shirt or a CD, for example, or a chance to preorder the gadget being developed -- but nothing else tangible. Not even a tax deduction.

Furthermore, you have no guarantee that the project will even see the light of day. All kinds of things happen between inspiration and production. People lose interest, get married, move away, have trouble lining up a factory. The whole thing dies, and it was all for nothing.

So why, I kept wondering, does anybody participate? Who would give money for so little in return?

. . .

I started reading about . . . projects. The one that seemed to be drumming up the most interest lately is called the Elevation Dock. It's just a charging stand for the iPhone, but wow, what a stand. It's exquisitely milled from solid, Applesque aluminum. You don't have to take your iPhone (or iPod Touch) out of its case to insert it into this dock. And the dock is solid enough that you can yank the phone out of it with one hand. The dock stays on the desk.

. . .

Other projects seeking your support: Jaja, a drawing stylus for iPad and Android tablets that's pressure-sensitive (makes fatter lines when you bear down harder); LED Side Glow Hats (baseball caps with illuminated brims for working in dark places); Eye3 (an inexpensive flying drone for aerial photography); and so on.

Not all of them will reach their financing goals (only 44 percent do). Even fewer will wind up on store shelves.

But in dark economic times, Kickstarter offers aspirational voyeurism: you can read about the big dreams of the little people. And you can give the worthy artists a small financial vote of confidence -- and enjoy the ride with them.

For the full story, see:

DAVID POGUE. "STATE OF THE ART; Embracing the Mothers of Invention." The New York Times (Thurs., January 26, 2012): B1 & B7.

(Note: ellipses added.)

(Note: the online version of the article was dated January 25, 2012.)
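
As a worked example of the fee structure described above: a project that meets its target keeps the pledged total minus Kickstarter's 5 percent and the 3 to 5 percent credit-card charge, so roughly 90 to 92 cents of every pledged dollar. A minimal sketch of that arithmetic (the $100,000 total and the 4 percent midpoint processing rate are illustrative):

    def net_proceeds(pledged, kickstarter_fee=0.05, processing_fee=0.04):
        """Rough estimate of what a successfully funded project keeps after fees."""
        return pledged * (1.0 - kickstarter_fee - processing_fee)

    # Example: a project that raises $100,000 keeps roughly $91,000;
    # a project that misses its deadline keeps (and owes) nothing.
    print(f"${net_proceeds(100_000):,.0f}")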

January 31, 2012

Is "The Replicator" the Personal Fabricator of Gershenfeld's Dreams?

The Replicator 3-D printer. Source of photo: online version of the NYT article quoted and cited below.

Back in 2005, technology "visionary" Neil Gershenfeld predicted the soon-to-be-seen day when personal fabricators would follow the path of computers, which progressed from mainframes costing millions, to minicomputers costing hundreds of thousands, to personal computers costing a couple of thousand. Well, apparently that day is here.

Now we will see if the implications are as far-reaching as Gershenfeld predicted.

(p. B7) By now you may have heard about the Replicator, a $1,750 3-D printer made by the Brooklyn start-up MakerBot, due next month. If not, the significance of the Replicator is that it is the first 3-D printer to break the $2,000 barrier. Here's more about what the Replicator can and can't do.

Q: What does a 3-D printer use?
A: Spools of coiled A.B.S. (acrylonitrile butadiene styrene) plastic that cost about $45 each per kilogram. This is the same material that is used to make Lego blocks. It is strong, safe and comes in many colors. One spool can make about 176 chess pieces.

For the full story, see:

WARREN BUCKLEITNER. "Gadgetwise; A 3-D Printer for Under $2,000: What Can It Do?" The New York Times (Thurs., January 26, 2012): B7.

(Note: bold in original.)

(Note: the online version of the article was dated January 23, 2012, and had the title "3-D Printing for the Masses: MakerBot's Replicator." The online version differs in several places from the print version. Where they differ, I quote the print version.)

The Gershenfeld book discussed above is:

Gershenfeld, Neil. Fab: The Coming Revolution on Your Desktop--from Personal Computers to Personal Fabrication. New York: Basic Books, 2005.
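
The Q&A above also implies a simple material-cost calculation: a roughly $45 spool that yields about 176 chess pieces works out to about 26 cents of plastic per piece. A minimal sketch of that arithmetic (electricity, failed prints and the printer itself are ignored):

    def plastic_cost_per_part(spool_price, parts_per_spool):
        """Material cost of a single printed part, counting only the filament."""
        return spool_price / parts_per_spool

    # Figures from the Q&A above: a ~$45 spool of A.B.S. yields about 176 chess pieces.
    print(f"about ${plastic_cost_per_part(45.0, 176):.2f} of plastic per chess piece")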

January 17, 2012

Krim Saw Use for Noisy CK722 Transistors


Norman Krim holding some early transistors. He first put transistors into hearing aids. Source of caption and photo: online version of the NYT obituary quoted and cited below.

(p. B11) Mr. Krim, who made several breakthroughs in a long career with the Raytheon Company and who had an early hand in the growth of the RadioShack chain, did not invent the transistor. (Three scientists did, in 1947, at Bell Laboratories.)

But he saw the device's potential and persuaded his company to begin manufacturing it on a mass scale, particularly for use in miniaturized hearing aids that he had designed. Like the old tube, a transistor amplifies audio signals.

As Time magazine wrote in 1953: "This little device, a single speck of germanium, is smaller than a paper clip and works perfectly at one-tenth the power needed by the smallest vacuum tube. Today, much of Raytheon's transistor output goes to America's hearing aid industry." (Germanium, a relatively rare metal, was the predecessor to silicon in transistors.)

. . .

Thousands of hearing-disabled people benefited from Mr. Krim's initial use of the transistor in compact hearing aids. But not every transistor Raytheon made was suitable for them, he found.

"When transistors were first being manufactured by Raytheon on a commercial scale, there was a batch called CK722s that were too noisy for use in hearing aids," said Harry Goldstein, an editor at IEEE Spectrum, the magazine of the Institute of Electrical and Electronics Engineers.

So Mr. Krim contacted editors at magazines like Popular Science and Radio Electronics and began marketing the CK722s to hobbyists.

"The result was that a whole generation of aspiring engineers -- kids, really, working in their garages and basements -- got to make all kinds of electronic projects," Mr. Goldstein said, among them transistor radios, guitar amplifiers, code oscillators, Geiger counters and metal detectors. "A lot of them went on to become engineers."

Mr. Ward called Mr. Krim "the father of the CK722."

For the full obituary, see:

DENNIS HEVESI. "Norman Krim, 98, Dies; Championed the Transistor." The New York Times (Weds., December 21, 2011): B11.

(Note: ellipsis added.)

(Note: the online version of the article is dated December 20, 2011 and has the title "Norman Krim, Who Championed the Transistor, Dies at 98.")

January 11, 2012

Gentle Oshman Inspired Loyalty as He Made Work Fun in Silicon Valley


"M. Kenneth Oshman" Source of caption and photo: online version of the NYT obituary quoted and cited below.

(p. 19) M. Kenneth Oshman, who helped create one of the early successful technology start-up firms in Silicon Valley, one that embodied the informal management style that came to set the Valley apart from corporate America, died on Saturday in Palo Alto, Calif. He was 71.

. . .

In the 1970s and '80s, Rolm was the best example of an emerging Silicon Valley management style that effectively broke down the barrier between work and play. Setting out to recruit the most talented technical minds, Rolm became known as a great place to work, so much so that it was nicknamed "G.P.W."

Early on as chief executive, Mr. Oshman took funds normally used for company Christmas parties and used them to help construct a company recreational center, consisting of swimming pools, racquetball courts, exercise rooms and other amenities to attract new employees and underline the image that Rolm was a fun place to work.

But there was a tradeoff, said Keith Raffel, who left a staff position on Capitol Hill to become an assistant to Mr. Oshman at Rolm before starting his own company.

"The quid pro quo was you would be driven and work really hard," he said.

With a gentle, understated style, Mr. Oshman stood apart from other well-known leaders in Silicon Valley, many of whom were seen as capricious and even tyrannical. He was a mentor to a generation of Silicon Valley technologists and able to inspire a kind of loyalty in his employees not frequently seen in high-tech industries.

For the full obituary, see:

JOHN MARKOFF. "M. Kenneth Oshman, Silicon Valley Mentor, Dies at 71." The New York Times, First Section (Sun., August 10, 2011): A10.

(Note: ellipsis added.)

(Note: the online version of the obituary is dated August 10, 2011 and has the title "M. Kenneth Oshman, Who Brought Fun to Silicon Valley, Dies at 71.")

December 30, 2011

More Firms Adopt 'Bring Your Own Device' (BYOD) Policies to Empower Workers and Cut Costs

"At Citrix Systems, Berkley Reynolds, left, uses his Alienware laptop, and Alan Meridian, his MacBook Pro, paid for with stipends." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B1) SAN FRANCISCO -- Throughout the information age, the corporate I.T. department has stood at the chokepoint of office technology with a firm hand on what equipment and software employees use in the workplace.

They are now in retreat. Employees are bringing in the technology they use at home and demanding the I.T. department accommodate them. The I.T. department often complies.

Some companies have even surrendered to what is being called the consumerization of I.T. At Kraft Foods, the I.T. department's involvement in choosing technology for employees is limited to handing out a stipend. Employees use the money to buy whatever laptop they want from Best Buy, or the local Apple store.

"We heard from people saying, 'How come I have better equipment at home?' " said Mike Cunningham, chief technology officer for Kraft Foods. "We said, hey, we can address that."

Encouraging employees to buy their own laptops, or bring their mobile phones and iPads from home, is gaining traction in the workplace. A survey published on Thursday by Forrester Research found that 48 percent of information workers buy smartphones for work without considering what their I.T. department supports. By being more flexible, companies are hoping that workers will be more comfortable with their devices and therefore more productive.

"Bring your own device" policies, as they are called, are also shifting the balance of power among electronics makers. Manufacturers good at selling to consumers are increasingly gaining the upper hand, while those focused on bulk corporate sales are slipping.

. . .

(p. B6) Letting workers bring their iPhones and iPads to work can . . . save companies money. In some cases, employees pay for equipment themselves and seek tech help from store staff rather than their company's I.T. department. "You can basically outsource your I.T. department to Apple," said Ben Reitzes, an analyst with Barclays Capital.

A similar B.Y.O.D. program at Citrix Systems, a software maker that also helps its clients implement such programs, saves the company about 20 percent on each laptop over three years. Of the 1,000 or so employees in Citrix's program, 46 percent have bought Mac computers, according to Paul Martine, Citrix's chief information officer. "That was a little bit of a surprise."

For the full story, see:

VERNE G. KOPYTOFF. "More Offices Let Workers Choose Their Own Devices." The New York Times (Fri., September 23, 2011): B1 & B6.

(Note: ellipses added.)

(Note: the online version of the article is dated September 22, 2011.)

December 14, 2011

Entrepreneur Julius Blank's Greatest Pleasure Came from "Building Something from Nothing"

FairchildSemiconductorFoundersIn1988.jpg "Fairchild Semiconductor's founders in 1988. Victor Grinich (left), Jay Last, Jean Hoerni, Julius Blank, Eugene Kleiner, Sheldon Roberts, Robert N. Noyce (seated, left,) and Gordon E. Moore." Source of caption and photo: online version of the NYT obituary quoted and cited below.

(p. B14) Julius Blank, a mechanical engineer who helped start a computer chip company in the 1950s that became a prototype for high-tech start-ups and a training ground for a generation of Silicon Valley entrepreneurs, died on Saturday in Palo Alto, Calif. He was 86.

. . .

Mr. Blank and his partners -- who included Robert N. Noyce and Gordon E. Moore, the future founders of the Intel Corporation -- began their venture as scientist-entrepreneurs in the wake of a mutiny of sorts against their common previous employer, the Nobel Prize-winning physicist William B. Shockley.

Dr. Shockley, . . . , had recruited the eight scientists from around the country in 1956 to work in his own semiconductor lab in nearby Mountain View, Calif.

The group left en masse the next year because of what its members described as Dr. Shockley's authoritarian management style and their differences with him over his scientific approach. Dr. Shockley called it a betrayal.

Fairchild's founders came to be branded in the lore of Silicon Valley as the "Traitorous Eight." How that happened remains something of a mystery.

. . .

When he left Fairchild in 1969 -- he was the last of the eight founding partners to depart -- Mr. Blank became an investor and consultant to start-up companies and helped found the technology firm Xicor, which was sold in 2004 for $529 million to Intersil.

His former partners, in addition to founding Intel, had started Advanced Micro Devices and National Semiconductor. Mr. Kleiner had founded a venture capital firm that became an early investor in hundreds of technology companies, including Google and AOL. Still, the greatest pleasure of his working life, Mr. Blank said in a 2008 interview for the archives of the Computer History Museum, a project in Silicon Valley, came with the uncertainty and camaraderie of "the early years, building something from nothing."

Mr. Blank described a moment in the first days of Fairchild, just before production began in its factory built from nothing, when the ducts and plumbing and air-conditioning were set, and the new crystal growers and one-of-a-kind chip making machines were ready to be installed.

"I remember the day we finally got the floor tile laid," he said. "And that night, Noyce and the rest of the guys came out and got barefoot and rolled their pants up and were swabbing the floors. I wish I had a picture of that."

For the full obituary, see:

PAUL VITELLO. "Julius Blank, 86, Dies; Built First Chip Maker." The New York Times (Fri., September 23, 2011): B14.

(Note: ellipses added.)

(Note: the online version of the obituary is dated September 22, 2011 and had the title "Julius Blank, Who Built First Chip Maker, Dies at 86.")


May 2011 photo of Julius Blank. Source of photo: online version of the NYT obituary quoted and cited above.

December 5, 2011

"Private Life Was Completely Transformed in the Nineteenth Century"

(p. 448) Private life was completely transformed in the nineteenth century - socially, intellectually, technologically, hygienically, sartorially, sexually and in almost any other respect that could be made into an adverb. Mr Marsham was born (in 1822) into a world that was still essentially medieval - a place of candlelight, medicinal leeches, travel at walking pace, news from afar that was always weeks or months old - and lived to see the introduction of one marvel after another: steamships and speeding trains, telegraphy, photography, anaesthesia, indoor plumbing, gas lighting, antisepsis in medicine, refrigeration, telephones, electric lights, recorded music, cars and planes, skyscrapers, motion pictures, radio, and literally tens of thousands of tiny things more, from mass-produced bars of soap to push-along lawnmowers.

It is almost impossible to conceive just how much radical day-to-day change people were exposed to in the nineteenth century, particularly in the second half. Even something as elemental as the weekend was brand new.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

November 12, 2011

Wozniak Waits 20 Hours to Be First in Line for iPhone 4S; They Say "4S" Means "For Steve"

WozniakIphone4S2011-11-04.jpg "Apple co-founder Steve Wozniak uses the voice feature on his new Apple iPhone 4S at the Apple Store in Los Gatos, Calif., on Friday. Wozniak, who created Apple with Steve Jobs in a Silicon Valley garage in 1976, waited 20 hours in line to be the first customer at the store to buy the new iPhone." Source of caption and photo: online version of the Omaha World-Herald article cited below.

What a classy and wonderfully symbolic way to pay tribute to his friend and the values they shared.

Source of photo and caption:

AP. "Even Wozniak stood in line for new iPhone." Omaha World-Herald (Saturday October 15, 2011): 9A.

November 5, 2011

Art Diamond Defended Air Conditioning in WPR Debate with Stan Cox

From archive of the Joy Cardin show:

Wednesday 6/8/2011 7:00 AM

Joy Cardin - 110608B After seven, Joy Cardin asks her guests a weather-related Big Question: "Do we rely too much on air-conditioning?"

- Stan Cox, Senior Scientist, The Land Institute. Author, "Losing Our Cool: Uncomfortable Truths About Our Air Conditioned World"
- Arthur Diamond, Professor of Economics, University of Nebraska at Omaha. Author, conference paper, "Keeping Our Cool: In Defense of Air Conditioning"

Link to streaming version of debate between Art Diamond and Stan Cox (author of Losing Our Cool) on whether air conditioning is good (Diamond) or bad (Cox). Broadcast on the Joy Cardin Show on the Wisconsin Public Radio network on Weds., June 8, 2011, from about 7:00 - 7:50 AM:

October 25, 2011

The Huge Value of Exposing Ourselves to Unexpected Evidence

Bill Bryson tells how much we learned from the remains of a man from the neolithic age, who has been called Ötzi:

(p. 377) His equipment employed eighteen different types of wood - a remarkable variety. The most surprising of all his tools was the axe. It was copper-bladed and of a type known as a Remedello axe, after a site in Italy where they were first found. But Ötzi's axe was hundreds of years older than the oldest Remedello axe. 'It was,' in the words of one observer, 'as if the tomb of a medieval warrior had yielded a modern rifle.' The axe changed the timeframe for the copper age in Europe by no less than a thousand years.

But the real revelation and excitement were the clothes. Before Ötzi we had no idea - or, to be more precise, nothing but ideas - of how stone age people dressed. Such materials as survived existed only as fragments. Here was a complete outfit and it was full of surprises. His clothes were made from the skins and furs of an impressive range of animals - red deer, bear, chamois, goat and cattle. He also had with him a woven grass rectangle that was three feet long. This might have been a kind of rain cape, but it might equally have been a sleeping mat. Again, nothing like it had ever been seen or imagined.

Ötzi wore fur leggings held up with leather strips attached to a waist strap that made them look uncannily - almost comically - like the kind of nylon stockings and garter sets that Hollywood pin-ups wore in the Second World War. Nobody had remotely foreseen such a get-up. He wore a loincloth of goatskin and a hat made from the fur of a brown bear - probably a kind of hunting trophy. It would have been very warm and covetably stylish. The rest of his outfit was mostly made from the skin and fur of red deer. Hardly any came from domesticated animals, the opposite of what was expected.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

October 14, 2011

Larry Page's Wonderful Crusade to Save Us Time


Source of book image:

On C-SPAN's Book TV I saw the last part of an interesting and entertaining interview with Steven Levy that was originally recorded at the Computer History Museum on April 6, 2011. Levy is the author of In the Plex, which I have not read, but which is now on my to-read list.

At the end of the interview, Levy read a passage from his book about how Larry Page is obsessed with reducing latency, which is a technical term for how long we have to wait for something to happen on a computer.

Isn't it wonderful that Larry Page is on a crusade to save us from wasted time?

Book discussed above:

Levy, Steven. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New York: Simon & Schuster, 2011.

(Note: "latency" appears on the following pages of Steven Levy's book: 93, 184, 185, 186, 187, 207, 262, and 398.)

October 6, 2011

"Insanely Great" Entrepreneur Steve Jobs Wanted "a Chance to Change the World"

Steve Jobs died yesterday (Weds., October 5, 2011).

Jobs was an innovator of my favorite kind, what I call a "project entrepreneur." He showed us what excitement and progress are possible if we preserve the institutions that allow entrepreneurial capitalism to exist.

When he was recruiting John Sculley to leave Pepsi and join Apple, Jobs asked him: "Do you want to spend the rest of your life selling sugared water or do you want a chance to change the world?" (p. 90).

Steve Jobs wanted to change the world. He got the job done.

Source of quote of Jobs' question to Sculley:

Sculley, John, and John A. Byrne. Odyssey: Pepsi to Apple. paperback ed. New York: HarperCollins, 1988.

September 25, 2011

Lunar Entrepreneurs

(p. A1) Now that the last space shuttle has landed back on Earth, a new generation of space entrepreneurs would like to whip up excitement about the prospect of returning to the Moon.

. . .

(p. A3) "It's probably the biggest wealth creation opportunity in modern history," said Barney Pell, a former NASA computer scientist turned entrepreneur and now a co-founder of Moon Express. While Moon Express might initially make money by sending small payloads, the big fortune would come from bringing back platinum and other rare metals, Dr. Pell said.

"Long term, the market is massive, no doubt," he said. "This is not a question of if. It's a question of who and when. We hope it's us and soon."

Like the aviation prizes that jump-started airplane technology a century ago, the Google Lunar X Prize is meant to rally technologists and entrepreneurs. It is administered by the X Prize Foundation, which handed out $10 million in 2004 to the first private team to build a spacecraft that could carry people 60 miles above the Earth's surface. (The winner, SpaceShipOne, was built by the aerospace designer Burt Rutan with backing from Paul Allen, the software magnate.)

For the full story, see:

KENNETH CHANG. "In a Private Race to the Moon, Flights of Fancy Are in the Air." The New York Times (Fri., July 22, 2011): A1 & A3.

(Note: the online version of the story is dated July 21, 2011 and has the title "Race to the Moon Heats Up for Private Firms.")

September 22, 2011

Deregulation Revived Railroads


"ALL ABOARD: The Wasp magazine in 1881 lampooned railroad moguls as having regulators in the palms of their hands." Source of caricature: online version of the WSJ article quoted and cited below.

(p. C8) Mr. Klein has written thoroughly researched and scrupulously objective biographies of the previously much maligned Jay Gould and E.H. Harriman, remaking their public images by presenting them in full. Now he has published the third and final volume of his magisterial history of the Union Pacific railroad, taking the company from 1969 to the present day.

Union Pacific--the only one of the transcontinentals to remain in business under its original name--is now a flourishing business. Thanks to a series of mergers, it is one of the largest railroads in the world, with more than 37,000 miles of track across most of the American West. Thanks to its investment in new technology, it is also among the most efficient.

In 1969, though, the future of American railroading was in doubt as the industry struggled against competition from airplanes, automobiles and trucks--all of which were in effect heavily subsidized through the government's support for airports and the Interstate Highway System.

Another major factor in the decline of the railroads had been the stultifying hand of the Interstate Commerce Commission. The ICC had come into existence in the late 19th century to limit the often high-handed ways of the railroads as they wrestled with the difficult economics of an industry that has very high fixed costs. ( . . . .) But the ICC soon evolved into a cartel mechanism that discouraged innovation and wrapped the railroad industry in a cocoon of stultifying rules.

Mr. Klein notes that in 1975 he wrote a gloomy article about the sad state of an industry with a colorful past: "Unlike many other historical romances," he wrote back then, "the ending did not promise to be a happy one."

Fortunately, a deregulation movement that began under the Carter administration--yes, the Carter administration--limited the power of the ICC and then abolished it altogether. As Mr. Klein shows in the well-written "Union Pacific," the reduction of government interference left capitalism to work its magic and produce--with the help of dedicated and skillful management--the modern, efficient and profitable railroad that is the Union Pacific.

For the full review, see:

JOHN STEELE GORDON. "Tracks Across America." The Wall Street Journal (Sat., JUNE 11, 2011): C8.

(Note: ellipsis added.)

Book reviewed in the part of the review quoted above:

Klein, Maury. Union Pacific: The Reconfiguration: America's Greatest Railroad from 1969 to the Present. New York: Oxford University Press, USA, 2011.

September 17, 2011

Study Finds No Link Between Cellphones and Cancer

(p. A3) A European study involving nearly 1,000 participants has found no link between cellular-phone use and brain tumors in children and adolescents, a group that may be particularly sensitive to phone emissions.

The study, published in the Journal of the National Cancer Institute, was prompted by concerns that the brains of younger users may be more vulnerable to adverse health effects--such as cancer--from cellphones.

For the full story, see:

GAUTAM NAIK. "Study Sees No Cellphone-Cancer Ties." The Wall Street Journal (Thurs., July 28, 2011): A3.

September 8, 2011

Arthur Murray "America's First Space Pilot," RIP


"Maj. Arthur Murray in 1954." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. A18) "I begin to feel weightless, and I'm flying so fast my instruments can't keep up -- they show what happened two miles ago. I'm climbing so steeply I can't see the ground, and I feel confused. I have a sense of falling and I want to grab something for support."

It was May 28, 1954, and Maj. Arthur Murray, test pilot, would wrestle for the next 15 terrifying seconds with a rocket plane racing over 1,400 miles an hour and spinning wildly, supersonically out of control. In the turmoil, he would fly higher than any human being had ever been, 90,440 feet over the earth.

Finally, Major Murray's plane, a Bell X-1A, sank back into heavier air, and he had time to look at the dark blue sky and dazzling sunlight. He became the first human to see the curvature of the earth. At the time, he was called America's first space pilot.

Arthur Murray, known as Kit, died on July 25, in a nursing home in the town of West in Texas, his family said. He was 92. He requested that his ashes be scattered over the Mojave Desert, where some of his fellow test pilots crashed and died.

Tom Wolfe marveled at the test pilots of Edwards Air Force Base in his 1979 book "The Right Stuff" exclaiming, "My God -- to be part of Edwards in the late forties and early fifties!"

For the full obituary, see:

DOUGLAS MARTIN. "Arthur Murray, Test Pilot, Is Dead at 92." The New York Times (Fri., August 5, 2011): A18.

(Note: the online version of the story is dated August 4, 2011.)

The wonderful Tom Wolfe book mentioned is:

Wolfe, Tom. The Right Stuff. New York: Farrar, Straus & Giroux, Inc., 1979.

September 7, 2011

At First, Some Feared Electricity

(p. 133) Something of the prevailing ambivalence was demonstrated by Mrs Cornelius Vanderbilt, who went to a costume ball dressed as an electric light to celebrate the installation of electricity in her Fifth Avenue home in New York, but then had the whole system taken out when it was suspected of being the source of a small fire. Others detected more insidious threats. One authority named S. F. Murphy identified a whole host of electrically induced maladies - eyestrain, headaches, general unhealthiness and possibly even 'the premature exhaustion of life'. One architect was certain electric light caused freckles.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

September 6, 2011

The Movie Auteur as a Model for Technology Entrepreneurship

AuteurVersusCommittee2011-08-07.jpg Source of image: online version of the NYT article quoted and cited below.

(p. 3) Two years ago, the technology blogger John Gruber presented a talk, "The Auteur Theory of Design," at the Macworld Expo. Mr. Gruber suggested how filmmaking could be a helpful model in guiding creative collaboration in other realms, like software.

The auteur, a film director who both has a distinctive vision for a work and exercises creative control, works with many other creative people. "What the director is doing, nonstop, from the beginning of signing on until the movie is done, is making decisions," Mr. Gruber said. "And just simply making decisions, one after another, can be a form of art."

"The quality of any collaborative creative endeavor tends to approach the level of taste of whoever is in charge," Mr. Gruber pointed out.

Two years after he outlined his theory, it is still a touchstone in design circles for discussing Apple and its rivals.

Garry Tan, designer in residence and a venture partner at Y Combinator, an investor in start-ups, says: "Steve Jobs is not always right--MobileMe would be an example. But we do know that all major design decisions have to pass his muster. That is what an auteur does."

Mr. Jobs has acquired a reputation as a great designer, Mr. Tan says, not because he personally makes the designs but because "he's got the eye." He has also hired classically trained designers like Jonathan Ive. "Design excellence also attracts design talent," Mr. Tan explains.

For the full story, see:

RANDALL STROSS. "DIGITAL DOMAIN; The Auteur vs. the Committee." The New York Times, SundayBusiness Section (Sun., July 24, 2011): 3.

(Note: the online version of the story is dated July 23, 2011.)

September 3, 2011

Edison Excelled as an Organizer of Systems

(p. 131) Where Edison truly excelled was as an organizer of systems. The invention of the light bulb was a wondrous thing but of not much practical use when no one had a socket to plug it into. Edison and his tireless workers had to design and build the entire system from scratch, from power stations to cheap and reliable wiring, to lampstands and switches. Within months Edison had set up no fewer than 334 small electrical plants all over the world; (p. 132) within a year or so his plants were powering thirteen thousand light bulbs. Cannily he put them in places where they would be sure to make maximum impact: on the New York Stock Exchange, in the Palmer House Hotel in Chicago, La Scala opera house in Milan, the dining room of the House of Commons in London. Swan, meanwhile, was still doing much of his manufacturing in his own home. He didn't, in short, have a lot of vision. Indeed, he didn't even file for a patent. Edison took out patents everywhere, including in Britain in November 1879, and so secured his preeminence.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

August 30, 2011

Bill Bryson Slams Thomas Edison Based on Brief Comments in Linda Simon Book

In the passage quoted below, Bill Bryson is strongly critical of Thomas Edison. It's been many years since I last read a full biography of Edison, but my impression is that Bryson is not being fair to Edison.

I like Bryson and I like Edison, so I was bothered enough to dig out the online "Notes" that Bryson posted to go with his book. On the passage critical of Edison, he cites p. 83 of Linda Simon's Dark Light book.

It turns out that Simon is a literature professor whose book is mainly about the early fears that superstitious people had about electricity. Many of her sources are literary. The book is a long way from a focused, balanced biography of Edison.

On page 83, she makes a casual and unjustifiably snide comment on Morgan, Vanderbilt, and especially Gould, and then criticizes Edison by associating him with them. She also criticizes Edison because others sometimes challenged his patents. (That lawsuits were brought against Edison does not imply his patent claims were unsound---anyone willing to hire a lawyer can file a lawsuit.)

The "bribe" is apparently that Edison gave some reporters stock, or "suppers or songfests" who had reported favorably. To judge such claims, we would like more evidence and more context. (Today, many institutions hire former reporters to do public relations work. Universities often provide free meals to those whose favor they seek; even book publishers send out free books in the hope that they will be reviewed favorably. Do we count all of these as "bribes"? Are all "rewards" ipso facto "bribes"?)

My view is that if we are going to strongly malign the character of one who brought us so much good (Edison), we should do so based on stronger evidence than the brief casual opinions of Linda Simon.

On my "to do" list is to read a biography or two on Edison. When I do so, I will comment again on this issue.

(p. 130) By 1877, when he started his quest to make a commercially successful light, Edison was already well on his way to becoming known as 'the Wizard of Menlo Park'. Edison was not a wholly attractive human being. He didn't scruple to cheat or lie, and was prepared to steal patents or bribe journalists for favourable coverage. In the words of one of his contemporaries, he had 'a vacuum where his conscience ought to be'. But he was enterprising and hard-working and a peerless organizer.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

August 29, 2011

In the 1880s, Prices Fell Because of Technological Progress


Source of book image:

Michael Perelman has strongly suggested that I read David Wells's book. It is on my "to do" list.

(p. C10) The dull title of "Recent Economic Changes" does no justice to David A. Wells's fascinating contemporary account of a deflationary miasma that settled over the world's advanced economies in the 1880s. His cheery conclusion: Prices were falling because technology was progressing. What had pushed the price of a bushel of wheat down to 67 cents in 1887 from $1.10 in 1882 was nothing more sinister than the opening up of new regions to cultivation (Australia, the Dakotas) and astounding improvements in agricultural machinery.

For the full review, see:

JAMES GRANT. "FIVE BEST; Little-Known Gold From the Gilded Age." The Wall Street Journal (Sat., AUGUST 6, 2011): C10.

Source of book under review:

Wells, David A. Recent Economic Changes and Their Effect on Production and Distribution of Wealth and Well-Being of Society. New York: D. Appleton and Co., 1889.

Michael Perelman argues that in Recent Economic Changes, David Wells anticipates the substance, although not the wording, of Schumpeter's "creative destruction":

Perelman, Michael. "Schumpeter, David Wells, and Creative Destruction." The Journal of Economic Perspectives 9, no. 3 (Summer 1995): 189-97.

August 26, 2011

"Only One Person in History Thought Hermann Sprengel Deserved to Be Better Known: Hermann Sprengel"

(p. 130) . . . in the early 1870s Hermann Sprengel, a German chemist working in London, invented a device that came to be called the Sprengel mercury pump. This was the crucial invention that actually made household illumination possible. Unfortunately, only one person in history thought Hermann Sprengel deserved to be better known: Hermann Sprengel. Sprengel's pump could reduce the amount of air in a glass chamber to one-millionth of its normal volume, which would enable a filament to glow for hundreds of hours. All that was necessary now was to find a suitable material for the filament.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

(Note: ellipsis added.)

August 22, 2011

Gas Lighting Did Not Appeal to Those Who Had Servants to Light Their Candles

(p. 123) Gas was particularly popular in America and Britain. By 1850 it was available in most large cities in both countries. Gas remained, however, a (p. 124) middle-class indulgence. The poor couldn't afford it and the rich tended to disdain it, partly because of the cost and disruption of installing it and partly because of the damage it did to paintings and precious fabrics, and partly because when you have servants to do everything for you already there isn't the same urgency to invest in further conveniences. The ironic upshot, as Mark Girouard has noted, is that not only middle-class homes but institutions like lunatic asylums and prisons tended to be better lit - and, come to that, better warmed - long before England's stateliest homes were.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

August 19, 2011

"A Brilliant and Exhilarating and Profoundly Eccentric Book"


"David Deutsch." Source of caption and photo: online version of the NYT review quoted and cited below.

(p. 16) David Deutsch's "Beginning of Infinity" is a brilliant and exhilarating and profoundly eccentric book. It's about everything: art, science, philosophy, history, politics, evil, death, the future, infinity, bugs, thumbs, what have you. And the business of giving it anything like the attention it deserves, in the small space allotted here, is out of the question. But I will do what I can.

. . .

The thought to which Deutsch's conversation most often returns is that the European Enlightenment of the 17th and 18th centuries, or something like it, may turn out to have been the pivotal event not merely of the history of the West, or of human beings, or of the earth, but (literally, physically) of the universe as a whole.

. . .

(p. 17) Deutsch's enthusiasm for the scientific and technological transformation of the totality of existence naturally brings with it a radical impatience with the pieties of environmentalism, and cultural relativism, and even procedural democracy -- and this is sometimes exhilarating and sometimes creepy. He attacks these pieties, with spectacular clarity and intelligence, as small-­minded and cowardly and boring. The metaphor of the earth as a spaceship or life-­support system, he writes, "is quite perverse. . . . To the extent that we are on a 'spaceship,' we have never merely been its passengers, nor (as is often said) its stewards, nor even its maintenance crew: we are its designers and builders. Before the designs created by humans, it was not a vehicle, but only a heap of dangerous raw materials." But it's hard to get to the end of this book without feeling that Deutsch is too little moved by actual contemporary human suffering. What moves him is the grand Darwinian competition among ideas. What he adores, what he is convinced contains the salvation of the world, is, in every sense of the word, The Market.

For the full review, see:

DAVID ALBERT. "Explaining it All: David Deutsch Offers Views on Everything from Subatomic Particles to the Shaping of the Universe Itself." The New York Times Book Review (Sun., August 14, 2011): 16-17.

(Note: ellipses between paragraphs added; ellipsis in Deutsch quote in original.)

(Note: the online version of the review is dated August 12, 2011 and has the title "Explaining it All: How We Became the Center of the Universe.")

Book under review:

Deutsch, David. The Beginning of Infinity: Explanations That Transform the World. New York: Viking Adult, 2011.

August 18, 2011

"How Painfully Dim the World Was before Electricity"

(p. 112) We forget just how painfully dim the world was before electricity. A candle - a good candle - provides barely a hundredth of the illumination of a single 100-watt light bulb. Open your refrigerator door and you summon forth more light than the total amount enjoyed by most households in the eighteenth century. The world at night for much of history was a very dark place indeed.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

August 17, 2011

A Case for Epistemic and Technological Optimism


Source of book image:

Horgan is well-known for writing a pessimistic book about the future of science. For him to write such a positive review of a book that reaches the opposite conclusion is impressive (and speaks well of both him and the book he is reviewing).

From Horgan's review and the reviews on Amazon as of 8/7/11, I view the Deutsch book as potentially important and profound. (I will write more when I have read it.)

(p. 17) . . . Mr. Deutsch knocks my 1996 book, "The End of Science," for proposing that the glory days of science--especially pure science, the effort to map out and understand reality--may be over. Mr. Deutsch equates my thesis with "dogmatism, stagnation and tyranny," all of which, for the record, I oppose. But he makes the case for infinite progress with such passion, imagination and quirky brilliance that I couldn't help enjoying his argument. More often than not I found myself agreeing with him--or at least hoping that he is right.

. . .

If we acknowledge our imperfections, Mr. Deutsch observes, then, paradoxically, there is no problem that we cannot tackle. Death, for instance. Or the apparent incompatibility between the two pillars of modern physics, quantum theory and general relativity. Or global warming, which Mr. Deutsch believes we can overcome through innovation rather than drastic cutbacks in consumption. He gores the sacred cow of "sustainability": Societies are healthiest, he declares, not when they achieve equilibrium but when they are rapidly evolving.

For the full review, see:

JOHN HORGAN. "BOOKSHELF; To Err Is Progress; How to foster the growth of scientific knowledge: accept that it is limited no matter how definitive it may seem." The Wall Street Journal (Weds., JULY 20, 2011): A17.

(Note: ellipses added.)

Source information on book under review:

Deutsch, David. The Beginning of Infinity: Explanations That Transform the World. New York: Viking Adult, 2011.

August 14, 2011

Inventor of Mason Jar Died Poor, Alone and Forgotten

(p. 74) In 1859 an American named John Landis Mason solved the challenge that the Frenchman François (or Nicolas) Appert had not quite mastered the better part of a century before. Mason patented the threaded glass jar with a metal screw-on lid. This provided a perfect seal and made it possible to preserve all kinds of foods that would previously spoil. The Mason jar became a huge hit everywhere, though Mason himself scarcely benefited from it. He sold the rights in it for a modest sum, then turned his attention to other inventions - a folding life raft, a case for keeping cigars fresh, a self-draining soap dish - that he assumed would make him rich, but his other inventions not only weren't successful, they weren't even very good. As one after another failed, Mason withdrew into a semidemented poverty. He died alone and forgotten in a New York City tenement house in 1902.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

August 7, 2011

Theft of Elderly Woman's Air Conditioner Called "Murder"

Source of the "murder" quote is from:

J.D. Miles, reporter. "Elderly Woman Dies From Heat After A/C Stolen." Dallas, CBS 11 News, August 5, 2011.

(Note: this report is the source of the "murder" quote which was stated by Mrs. Grissom's neighbor Caroline Ware.)

(Note: Another version of the report with the "murder" quote has the title: "Texas Heat Wave." CBS 11 News, August 5, 2011.)

Another report on the incident is:


Ed Lavandera, reporter. "Woman Dies After Air Conditioner Stolen." CNN American Morning, August 5, 2011.

July 29, 2011

Resistance to New Technology

(p. 59) . . . , not everyone was happy with the loss of open hearths. Many people missed the drifting smoke and were convinced they had been healthier when kept "well kippered in wood smoke," as one observer put it. As late as 1577, a William Harrison insisted that in the days of open fires "our heads did never ake." Smoke in the roof space discouraged nesting birds and was believed to strengthen timbers. Above all, people complained that they weren't nearly as warm as before, which was true. Because fireplaces were so inefficient, they were constantly enlarged. Some became so enormous that they were built with benches in them, letting people sit inside the fireplace, almost the only place in the house where they could be really warm.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

(Note: ellipsis added.)

July 28, 2011

Zuckerberg Has Most Followers on Google+


"The profile page of Mark Zuckerberg on Google+, a service created to compete with Facebook." Source of caption and image: online version of the NYT article quoted and cited below.

(p. B1) Any guesses as to who is the most popular person on Google+, the company's new social networking service? Ashton Kutcher, perhaps? Or Lady Gaga?

Actually, that title is currently held by Mark Zuckerberg, the founder and chief executive of Facebook -- the very service that Google+ was meant to challenge.

As of Tuesday evening, Mr. Zuckerberg had nearly 35,000 people following his updates on the service, more than anyone else in a broad survey of Google+ profiles by Social Statistics, an outside service. His fan base exceeds that of Larry Page, one of the founders of Google and its recently appointed chief executive, who had only 24,000 people following him.

Google+ is less than a week old and is still not yet widely available to the public. But access to the service, which lets people share photos, links, status updates and video chats with groups of friends, is already in high demand among early adopters who are eager to play with its (p. B8) features. That includes Mr. Zuckerberg, who apparently signed up to keep tabs on his new adversary.

For the full story, see:

JENNA WORTHAM. "Zuckerberg Finds Fans on Google+." The New York Times (Weds., July 5, 2011): B1 & B8.

(Note: the online version of the story is dated July 6, 2011.)

July 26, 2011

Technology as an Enabler of Free Speech

InternetJalalabad2011-07-16.jpg "Volunteers have built a wireless Internet around Jalalabad, Afghanistan, from off-the-shelf electronics and ordinary materials." Source of caption and photo: online version of the NYT article quoted and cited below.

The main point of the passages quoted below is to illustrate how, with the right technology, we can dance around tyrants in order to enable human freedom.

(But as a minor aside, note in the large, top-of-front-page photo above that Apple once again is visibly the instrument of human betterment---somewhere, before turning to his next challenge, one imagines a fleeting smile on the face of entrepreneur Steve Jobs.)

(p. 1) The Obama administration is leading a global effort to deploy "shadow" Internet and mobile phone systems that dissidents can use to undermine repressive governments that seek to silence them by censoring or shutting down telecommunications networks.

The effort includes secretive projects to create independent cellphone networks inside foreign countries, as well as one operation out of a spy novel in a fifth-floor shop on L Street in Washington, where a group of young entrepreneurs who look as if they could be in a garage band are fitting deceptively innocent-looking hardware into a prototype "Internet in a suitcase."

Financed with a $2 million State Department grant, the suitcase could be secreted across a border and quickly set up to allow wireless communication over a wide area with a link to the global Internet.

For the full story, see:

JAMES GLANZ and JOHN MARKOFF. "U.S. Underwrites Internet Detour Around Censors." The New York Times, First Section (Sun., June 12, 2011): 1 & 8.


Source of graphic: online version of the NYT article quoted and cited above.

July 20, 2011

Zuckerberg: ''Filmmakers Can't Get Their Head around the Idea that Someone Might Build Something because They Like Building Things''


Marc Andreessen. Source of photo: online version of the NYT article quoted and cited below.

(p. 13) After hearing a story about Foursquare's co-founder, Dennis Crowley, walking into a press event in athletic wear and eating a banana, I developed a theory that bubbles might be predicted by fashion: when tech founders can't be bothered to appear businesslike, the power has shifted too much in their favor.

Believe it or not, this goes deep into the interior mentality of the engineer, which is very truth-oriented. When you're dealing with machines or anything that you build, it either works or it doesn't, no matter how good of a salesman you are. So engineers not only don't care about the surface appearance, but they view attempts to kind of be fake on the surface as fundamentally dishonest.

That reminds me of Mark Zuckerberg's criticism of ''The Social Network.'' He said that ''filmmakers can't get their head around the idea that someone might build something because they like building things.''

Aaron Sorkin was completely unable to understand the actual psychology of Mark or of Facebook. He can't conceive of a world where social status or getting laid or, for that matter, doing drugs, is not the most important thing.

For the full interview, see:

ANDREW GOLDMAN. "TALK; Bubble? What Bubble? Marc Andreessen, one of Silicon Valley's biggest venture capitalists, has no fear." The New York Times Magazine (Sun., July 10, 2011): 13.

(Note: bold in original, indicating comments/questions by interviewer Andrew Goldman.)

(Note: the online version of the interview is dated July 7, 2011 (sic).)

July 18, 2011

"If We Can't Win on Quality, We Shouldn't Win at All"


Source of book image: online version of the WSJ review quoted and cited below.

(p. A13) At the tail end of the 1990s dot-com boom, Douglas Edwards took a gamble: He left his marketing job at an old-media company, taking a $25,000 salary cut to start work at a small, little-known Internet concern in its second year of operation. That his new employer was losing money and burning through venture capital went without saying. But unlike the footloose 20-somethings who usually populated Silicon Valley start-ups, Mr. Edwards had little margin to bet wrong; he was 41, with a mortgage, three children and a worried wife. He hoped he could get his old job back if the company ran out of money.

. . .

Mr. Edwards came to his job as a subscriber to the conventional wisdom. In an early presentation to cofounder Larry Page and others, Mr. Edwards unwisely declared that only marketing, not technology, could set Google apart. "In a world where all search engines are equal," he asserted, "we'll need to rely on branding to differentiate us from our competitors."

The room became quiet. Then Mr. Page spoke up. "If we can't win on quality," he said, "we shouldn't win at all."

For the full review, see:

DAVID A. PRICE. "BOOKSHELF; How Google Got Going; Branding, shmanding, a marketer was told. 'If we can't win on quality,' Larry Page said, 'we shouldn't win at all.'" The Wall Street Journal (Tues., July 12, 2011): A13.

(Note: ellipsis added.)

Book being reviewed:

Edwards, Douglas. I'm Feeling Lucky: The Confessions of Google Employee Number 59. New York: Houghton Mifflin Harcourt Publishing Co., 2011.

July 9, 2011

38 Theories Why Humans Became Sedentary

(p. 36) . . . if people didn't settle down to take up farming, why then did they embark on this entirely new way of living? We have no idea--or actually, we have lots of ideas, but we don't know if any of them are right. According to the historian Felipe Fernández-Armesto, at least thirty-eight theories have been put forward to explain why people took to living in communities: that they were driven to it by climatic change, or by a wish to stay near their dead, or by a powerful desire to brew and drink beer, which could only be indulged by staying in one place. One theory, evidently seriously suggested (Jane Jacobs cites it in her landmark work of 1969, The Economy of Cities), was that "fortuitous showers" of cosmic rays caused mutations in grasses that made them suddenly attractive as a food source. The short answer is that no one knows why agriculture developed as it did.

Making food out of plants is hard work. The conversion of wheat, rice, corn, millet, barley, and other grasses into staple foodstuffs is one of the great achievements of human history, but also one of the more unexpected ones.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

(Note: italics in original; ellipsis added.)

July 6, 2011

Google CEO Larry Page Admires Steve Jobs

BrinPageSchmidtGoogle2011-06-05.jpg "Former colleagues describe Larry Page, center, as strong-willed and sometimes impolite. He is said to admire Apple CEO Steve Jobs." Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. B1) Larry Page's PageRank algorithm was the basis for Google Inc.'s search engine. As Google's new chief executive, Mr. Page will face the challenge of leading a company that has grown far beyond that algorithm and must compete with agile Web upstarts such as Facebook Inc. and Groupon Inc.

On Friday, a day after being named to replace outgoing CEO Eric Schmidt in April, Mr. Page gave little hint of how he planned to tackle such challenges. The 38-year-old Google co-founder didn't immediately address employees in an all-hands note or meeting, said a person familiar with the matter, though the company has a weekly Friday meeting that Mr. Page was expected to attend.

But several of Mr. Page's former colleagues describe him as having similarities to Apple CEO Steve Jobs, whom Mr. Page has said he admired. Both men are strong willed, sometimes impolite and push engineers hard to execute their ambitious projects.

Some former colleagues said Mr. Page is likely to try to pierce through the sometimes "paralyzing" bureaucracy that product managers and engineers have faced when trying to launch some Google products in recent years.

On Thursday, Messrs. Page and Schmidt said some top-level decision-making had gotten slower and the management change would improve that. Also, the company has said it is trying to allow more projects to operate like start-ups inside of Google in order to speed up innovation.

For the full story, see:

AMIR EFRATI and SCOTT MORRISON. "TECHNOLOGY; Chief Seeks More Agile Google; As CEO, Larry Page Must Pierce Bureaucracy, Compete With Nimble Upstarts." The Wall Street Journal (Tues., January 22, 2011): B1 & B4.
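
As an aside on the algorithm named at the top of the article quoted above: the core idea of PageRank, that a page matters if pages that matter link to it, can be sketched with a few lines of the standard power-iteration recipe. The toy Python sketch below is only an illustration; the damping value, the function name, and the tiny made-up link graph are my own choices for the example, and this is not Google's code.

    # Toy power-iteration sketch of the PageRank idea (not Google's implementation).
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}              # start with equal rank
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if not outlinks:                         # a page with no links spreads its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outlinks)
                    for target in outlinks:
                        new_rank[target] += share        # a link acts as a "vote" for its target
            rank = new_rank
        return rank

    # A made-up four-page web: C receives the most links, so it ends up ranked highest.
    toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
    print(sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]))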

July 5, 2011

"The American Machines Did Things that the World Earnestly Wished Machines to Do"

(p. 22) . . . when the displays were erected it came as something of a surprise to discover that the American section was an outpost of wizardry and wonder. Nearly all the American machines did things that the world earnestly wished machines to do--stamp out nails, cut stone, mold candles--but with a neatness, dispatch, and tireless reliability that left other nations blinking. Elias Howe's sewing machine dazzled the ladies and held out the impossible promise that one of the great drudge pastimes of domestic life could actually be made exciting and fun. Cyrus McCormick displayed a reaper that could do the work of forty men--a claim so improbably bold that almost no one believed it until the reaper (p. 23) was taken out to a farm in the Home Counties and shown to do all that it promised it could. Most exciting of all was Samuel Colt's repeat-action revolver, which was not only marvelously lethal but made from interchangeable parts, a method of manufacture so distinctive that it became known as "the American system." Only one homegrown creation could match these virtuoso qualities of novelty, utility, and machine-age precision--Paxton's great hall itself, and that was to disappear when the show was over. For many Europeans this was the first unsettling hint that those tobacco-chewing rustics across the water were quietly creating the next industrial colossus--a transformation so improbable that most wouldn't believe it even as it was happening.

The most popular feature at the Great Exhibition was not an exhibition at all, but rather the elegant "retiring rooms," where visitors could relieve themselves in comfort, an offer taken up with gratitude and enthusiasm by 827,000 people--11,000 of them on a single day. Public facilities in London were woefully lacking in 1851. At the British Museum, up to 30,000 daily visitors had to share just two outside privies. At the Crystal Palace the toilets actually flushed, enchanting visitors so much that it started a vogue for installing flushing toilets at home-- . . .


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

(Note: ellipses added.)

July 4, 2011

Steve Jobs as Project Entrepreneur

JobsSteveIpadIntroduction2011-06-05.jpg "Steve Jobs's presence at the unveiling seemed to reassure investors." Source of caption and photo: online version of the NYT article quoted and cited below.

Innovative entrepreneurs can have several different motives. I think Steve Jobs is mainly a "project entrepreneur"---his main motive is to envision a project and to accomplish it.

(p. B1) SAN FRANCISCO -- Steven P. Jobs, Apple's chief executive, interrupted his medical leave on Wednesday to introduce the company's much-anticipated new iPad, a thinner, faster and lighter version of its popular tablet computer that will sell at the same prices as the original models.

Mr. Jobs alluded to his leave but neither commented on his health nor said whether he planned to return to the company in the near future.

"We've been working on this product for a while and I just didn't want to miss today," he said.

For the full story, see:

MIGUEL HELFT. "Jobs Returns to Introduce a New iPad." The New York Times (Thurs., March 3, 2011): B1 & B6.

(Note: the online version of the commentary is dated March 2, 2011 and has the title "Jobs Returns to Introduce a New iPad.")

June 28, 2011

At NeXT Steve Jobs Learned to Delegate, Retain Talent, and Attend to the Price


"Steve Jobs, after returning to Apple in 1999. Would Apple be what it is today had he never left?" Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 5) Suppose Mr. Jobs had not left in 1985. Suppose he had convinced the Apple board to oust his nemesis, John Sculley, then chief executive and president. Under Mr. Jobs's uninterrupted direction, would Apple have arrived at the pinnacle it has reached today, but 12 years earlier?

It's hard to see how anything like that would have transpired. The Steve Jobs who returned to Apple was a much more capable leader -- precisely because he had been badly banged up. He had spent 12 tumultuous, painful years failing to find a way to make the new company profitable.

"I am convinced that he would not have been as successful after his return at Apple if he hadn't gone through his wilderness experience at Next," said Tim Bajarin, president of Creative Strategies, a technology consulting company.

. . .

Mr. Jobs's lieutenants tried to warn him away from certain disaster, but he was not receptive. In 1992-93, seven of nine Next vice presidents were shown the door or left on their own.

In this period, Mr. Jobs did not do much delegating. Almost every aspect of the machine -- including the finish on interior screws -- was his domain. The interior furnishings of Next's offices, a stunning design showplace, were Mr. Jobs's concern, too. While the company's strategy begged to be re-examined, Mr. Jobs attended to other matters. I spoke with many current and former Next employees for my 1993 book, "Steve Jobs and the NeXT Big Thing." According to one of them, while a delegation of visiting Businessland executives waited on the sidewalk, Mr. Jobs spent 20 minutes directing the landscaping crew on the exact placement of the sprinkler heads.

Next's computer hardware and software were filled with innovations that drew a small, but devoted, following. Mr. Jobs had created the first easy-to-use Unix machine, but the mainstream marketplace shrugged. He had already helped bring to market an easy-to-use machine, the Mac, so the Next couldn't differentiate itself enough -- and certainly not at the price the company charged.

. . .

And he had always been able to attract great talent. What he hadn't learned before returning to Apple, however, was the necessity of retaining it. He has now done so. One of the unremarked aspects of Apple's recent story is the stability of the executive team -- no curb filled with dumped managers.

Kevin Compton, who was a senior executive at Businessland during the Next years, described Mr. Jobs after returning to Apple: "He's the same Steve in his passion for excellence, but a new Steve in his understanding of how to empower a large company to realize his vision." Mr. Jobs had learned from Next not to try to do everything himself, Mr. Compton said.

For the full commentary, see:

RANDALL STROSS. "DIGITAL DOMAIN; What Steve Jobs Learned in the Wilderness." The New York Times, SundayBusiness Section (Sun., October 3, 2010): 5.

(Note: ellipses added.)

(Note: the online version of the commentary is dated October 2, 2010.)

June 27, 2011

"A Tax on Air and Light"

(p. 11) Paxton was very lucky in his timing, for just at the moment of the Great Exhibition glass suddenly became available in a way it never had before. Glass had always been a tricky material. It was not particularly easy to make, and really hard to make well, which is why for so much of its history it was a luxury item. Happily, two recent technological breakthroughs had changed that. First, the French invented plate glass--so called because the molten glass was spread across tables known as plates. This allowed for the first time the creation of really large panes of glass, which made shop windows possible. Plate glass, however, had to be cooled for ten days after being rolled out, which meant that each table was unproductively occupied most of the time, and then each sheet required a lot of grinding and polishing. This naturally made it expensive. In 1838, a cheaper refinement was developed--sheet glass. This had most of the virtues of plate glass, but it cooled faster and needed less polishing, and so could be made much more cheaply. Suddenly glass of a good size could be produced economically in limitless volumes.

Allied with this was the timely abolition of two long-standing taxes: the window tax and glass tax (which, strictly speaking, was an excise duty). The window tax dated from 1696 and was sufficiently punishing that (p. 12) people really did avoid putting windows in buildings where they could. The bricked-up window openings that are such a feature of many period buildings in Britain today were once usually painted to look like windows. (It is sometimes rather a shame that they aren't still.) The tax, sorely resented as "a tax on air and light," meant that many servants and others of constrained means were condemned to live in airless rooms.


Bryson, Bill. At Home: A Short History of Private Life. New York: Doubleday, 2010.

May 19, 2011

Entrepreneur Ken Olsen Was First Lionized and Then Chastised

OlsenKenObit2011-05-16.jpg "Ken Olsen, the pioneering founder of DEC, in 1996." Source of caption and photo: online version of the NYT article quoted and cited below.

I believe that in The Road Ahead, Bill Gates describes Ken Olsen as one of his boyhood heroes for having created a computer that could compete with the IBM mainframe. His hero failed to prosper when the next big thing, the PC, came along. Gates was determined to avoid his hero's fate, and so he threw his efforts toward the internet when it became the next big thing.

Christensen sometimes uses the fall of minicomputers, like Olsen's DEC, to PCs as a prime example of disruptive innovation, e.g., in his lectures on the subject available online through Harvard. A nice intro lecture is viewable (but only using Internet Explorer) at:

(p. A22) Ken Olsen, who helped reshape the computer industry as a founder of the Digital Equipment Corporation, at one time the world's second-largest computer company, died on Sunday. He was 84.

. . .

Mr. Olsen, who was proclaimed "America's most successful entrepreneur" by Fortune magazine in 1986, built Digital on $70,000 in seed money, founding it with a partner in 1957 in the small Boston suburb of Maynard, Mass. With Mr. Olsen as its chief executive, it grew to employ more than 120,000 people at operations in more than 95 countries, surpassed in size only by I.B.M.

At its peak, in the late 1980s, Digital had $14 billion in sales and ranked among the most profitable companies in the nation.

But its fortunes soon declined after Digital began missing out on some critical market shifts, particularly toward the personal computer. Mr. Olsen was criticized as autocratic and resistant to new trends. "The personal computer will fall flat on its face in business," he said at one point. And in July 1992, the company's board forced him to resign.

For the full obituary, see:

GLENN RIFKIN. "Ken Olsen, Founder of the Digital Equipment Corporation, Dies at 84." The New York Times (Tues., February 8, 2011): A22.

(Note: ellipsis added.)

(Note: the online version of the story is dated February 7, 2011 and has the title "Ken Olsen, Who Built DEC Into a Power, Dies at 84.")

Gates writes in autobiographical mode in the first few chapters of:

Gates, Bill. The Road Ahead. New York: Viking Penguin, 1995.

Christensen's mature account of disruptive innovation is best elaborated in:

Christensen, Clayton M., and Michael E. Raynor. The Innovator's Solution: Creating and Sustaining Successful Growth. Boston, MA: Harvard Business School Press, 2003.

April 29, 2011

"The Internet Is Really the Work of a Thousand People"


Paul Baran. Source of photo: online version of the NYT obituary quoted and cited below.

(p. A23) In the early 1960s, while working at the RAND Corporation in Santa Monica, Calif., Mr. Baran outlined the fundamentals for packaging data into discrete bundles, which he called "message blocks." The bundles are then sent on various paths around a network and reassembled at their destination. Such a plan is known as "packet switching."

Mr. Baran's idea was to build a distributed communications network, less vulnerable to attack or disruption than conventional networks. In a series of technical papers published in the 1960s he suggested that networks be designed with redundant routes so that if a particular path failed or was destroyed, messages could still be delivered through another.

Mr. Baran's invention was so far ahead of its time that in the mid-1960s, when he approached AT&T with the idea to build his proposed network, the company insisted it would not work and refused.

. . .

Mr. Baran was also an entrepreneur. He started seven companies, five of which eventually went public.

In recent years, the origins of the Internet have been subject to claims and counterclaims of precedence, and Mr. Baran was an outspoken proponent of distributing credit widely.

"The Internet is really the work of a thousand people," he said in an interview in 2001.

"The process of technological developments is like building a cathedral," he said in an interview in 1990. "Over the course of several hundred years, new people come along and each lays down a block on top of the old foundations, each saying, 'I built a cathedral.'

"Next month another block is placed atop the previous one. Then comes along an historian who asks, 'Well, who built the cathedral?' Peter added some stones here, and Paul added a few more. If you are not careful you can con yourself into believing that you did the most important part. But the reality is that each contribution has to follow onto previous work. Everything is tied to everything else."

For the full obituary, see:

KATIE HAFNER. "Paul Baran, Internet Pioneer, Dies at 84." The New York Times (Mon., MARCH 28, 2011): A23.

(Note: ellipsis added.)

(Note: the online version of the obituary is dated March 27, 2011.)

March 29, 2011

Cars Bring Convenience, Freedom, and Personal Security

(p. 16) Two generations ago in the United States, most families lacked a car; by our parents' generation, most families had one car while the two-car lifestyle was a much-sought ideal; today a third of America's families own three cars or more. The United States now contains just shy of one automobile per licensed driver, and is on track to having more cars than licensed drivers. Cars are a mixed blessing, as a future chapter will detail: But there is no doubt they represent convenience, freedom, and, for women, personal security, when compared to standing on street corners waiting for buses or lingering on dark subway platforms. Cars would not be so infuriatingly popular if they did not make our lives easier. Today all but the bottom-most fraction of the impoverished in the United States do most of their routine traveling by car: 100 auto trips in the United States for every one trip on a bus or the subway, according to the American Public Transit Association. The portion of routine trips made in private cars is rising toward overwhelming in the European Union, too. Two generations ago, people dreamed of possessing their own cars. Now almost everyone in the Western world who desires a car has one--and vehicles that are more comfortable, better-equipped, lower-polluting, and much safer than those available only a short time ago.


Easterbrook, Gregg. The Progress Paradox: How Life Gets Better While People Feel Worse. Paperback ed. New York: Random House, 2004.

March 20, 2011

"The Adventurous, Pioneering Spirit"


Source of book image:

(p. 30) "Jet Age" is ostensibly about the race between two companies and nations to commercialize a military technology and define a new era of air travel. There's Boeing with its back to the wall and its military contracts drying up, betting everything on passenger jets, pitted against de Havilland and the government-subsidized project meant to reclaim some of Britain's lost glory. . . .

. . .

But the book is really about the risk-taking essential for making any extreme endeavor commonplace. "Jet Age" celebrates the managers, pilots, engineers, flight attendants and, yes, even passengers (for without passengers there is no business) who gambled everything so that we might cross oceans and continents in hours rather than days.

It is easy to forget, in this time of overcrowded flights, demoralizing security checks, embattled flight attendants and dwindling service, that risk was once embraced as a necessary, even desirable, part of flying. Quoted in the book, the celebrated aviator Lord Brabazon summed it up in post-accident testimony: "You know, and I know, the cause of this accident. It is due to the adventurous, pioneering spirit of our race. It has been like that in the past, it is like that in the present, and I hope it will be in the future."

For the full review, see:

MICHAEL BELFIORE. "Fatal Flaws." The New York Times Book Review (Sun., February 6, 2011): 30.

(Note: ellipses added.)

(Note: the online version of the article is dated February 4, 2011.)

The book under review is:

Verhovek, Sam Howe. Jet Age: The Comet, the 707, and the Race to Shrink the World. New York: Avery, 2010.

March 19, 2011

Abraham Lincoln's Defence of the Patent System

William Rosen quotes a key passage from Abraham Lincoln's speech on "Discoveries, Inventions, and Improvements":

(p. 323) The advantageous use of Steam-power is, unquestionably, a modern discovery. And yet, as much as two thousand years ago the power of steam was not only observed, but an ingenious toy was actually made and put in motion by it, at Alexandria in Egypt. What appears strange is that neither the inventor of the toy, nor any one else, for so long a time afterwards, should perceive that steam would move useful machinery as well as a toy. . . . . . . in the days before Edward Coke's original Statute on Monopolies, any man could instantly use what another had invented; so that the inventor had no special advantage from his own invention. . . . The (p. 324) patent system changed this; secured to the inventor, for a limited time, the exclusive use of his invention; and thereby added the fuel of interest to the fire of genius, in the discovery of new and useful things.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics and ellipses in original.)

March 15, 2011

Lincoln's Popular Speech on "Discoveries, Inventions, and Improvements"

(p. 322) Lincoln, the only American president ever awarded a patent, had a long and passionate love for things mechanical. He made his living for many years as a railroad lawyer and appears to have absorbed something of the fascination with machines, and with steam, of the engineers with whom he worked. . . .     . . . , in 1859, after his loss in the Illinois senatorial race against Stephen Douglas, he was much in demand for a speech entitled "Discoveries, Inventions, and Improvements" that he gave at agricultural fairs, schools, and self-improvement societies.

The speech--decidedly not one of Lincoln's best--nonetheless revealed an enthusiasm for mechanical innovation that resonates (p. 323) powerfully even today. "Man," Lincoln said, "is not the only animal who labors, but he is the only one who improves his workmanship . . . by Discoveries and Inventions."


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics and last ellipsis in original; other ellipses added.)

March 14, 2011

"The Information in a Message Is Inversely Proportional to Its Probability"


Source of book image:

(p. A13) What, exactly, is information? Prior to Shannon, Mr. Gleick notes, the term seemed as hopelessly subjective as "beauty" or "truth." But in 1948 Shannon, then working for Bell Laboratories, gave information an almost magically precise, quantitative definition: The information in a message is inversely proportional to its probability. Random "noise" is quite uniform; the more surprising a message, the more information it contains. Shannon reduced information to a basic unit called a "bit," short for binary digit. A bit is a message that represents one of two choices: yes or no, heads or tails, one or zero.
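Shannon's definition is compact enough to state in one line: a message that occurs with probability p carries -log2(p) bits. A small sketch (my own illustration, with made-up probabilities) shows why the surprising message carries more information than the expected one.

from math import log2

def information_bits(probability):
    # Shannon's measure: information, in bits, is -log2 of the message's probability.
    return -log2(probability)

print(information_bits(0.5))    # a fair coin flip: 1 bit
print(information_bits(1 / 8))  # one of eight equally likely outcomes: 3 bits
print(information_bits(0.999))  # a nearly certain message: roughly 0.0014 bits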

For the full review, see:

JOHN HORGAN. "Little Bits Go a Long Way; The more surprising a message, the more information it contains." The Wall Street Journal (Tues., March 1, 2011): A13.

Book being reviewed:

Gleick, James. The Information: A History, a Theory, a Flood. New York: Pantheon Books, 2011.

March 11, 2011

"Rocket" Showed the Motive Power of the Industrial Revolution

Stephenson's steam locomotive, called "Rocket," won the Rainhill Trials in 1829. Rosen uses this as the culminating event in his history of the development of steam power.

(p. 310) The reason for ending with Stephenson's triumph . . . seems persuasive. Rainhill was a victory not merely for George and Robert Stephenson, but for Thomas Savery and Thomas Newcomen, for James Watt and Matthew Boulton, for Oliver Evans and Richard Trevithick. It was a triumph for the ironmongers of the Severn Valley, the weavers of Lancashire, the colliers of Newcastle, and the miners of Cornwall. It was even a triumph for John Locke and Edward Coke, whose ideas ignited the Rocket just as much as its firebox did.

When the American transcendentalist Ralph Waldo Emerson met Stephenson in 1847, he remarked, "he had the lives of many men in him."

Perhaps that's what he meant.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics in original; ellipsis added.)

March 7, 2011

Better Rails Were Needed Before Train Would "Work"

(p. 300) The other weight problem was the one that licked Trevithick at Penydarren: The tracks on which the locomotive ran were just not able to survive the tonnage traveling over them. Driving a five-ton steam locomotive over rails designed for horse-drawn carts was only slightly more sensible than driving a school bus over a bridge made of wet ice cubes. In both cases, it's a close call whether the vehicle will skid before or after the surface collapses.

. . .

(p. 301) Two years later, Stephenson, in collaboration with the ironmonger William Losh of Newcastle, produced, and in September 1816 jointly patented, a series of improvements in wheels, suspension, and--most important--the method by which the rails and "chairs" connected one piece of track to another. Stephenson's rails seem mundane next to better-known eureka moments, but as much as any other innovation of the day they underline the importance of such micro-inventions in the making of a revolution. For it was the rails that finally made the entire network of devices--engine, linkage, wheel, and track--work.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: ellipsis added.)

March 6, 2011

What a Picture Is Worth

"In Cairo, Egyptian youths used laptops to post video they had shot earlier Tuesday in Tahrir Square. The group has been collecting accounts of the demonstrations and voices of the protesters, putting them on social networking sites like Facebook and Twitter." Source of caption and photo: online version of the NYT article quoted and cited below.

The photo above was at the top of the first page of the New York Times on Weds., Feb. 9, 2011. You have a group of lively, engaged, young people intoxicated with the idea that they may be helping to bring their country freedom. And in the center of the dark picture, amidst the conversations, is one youth looking with concentration at an Apple laptop, the sole source of color and illumination.

If I were Steve Jobs, I would value this one photo at more than a whole hour's worth of Super Bowl ads.

The photo above was placed above the following story on the front page of the NYT:

DAVID D. KIRKPATRICK. "As Egypt Protest Swells, U.S. Sends Specific Demands." The New York Times (Weds., February 9, 2011): A1 & A12.

(Note: the online version of the article is dated February 8, 2011.)

February 15, 2011

Luddism in 1811 England

(p. 243) The stockingers began in the town of Arnold, where weaving frames were being used to make cut-ups and, even worse, were being operated by weavers who had not yet completed the seven-year apprenticeship that the law required. They moved next to Nottingham and the weaver-heavy villages surrounding it, attacking virtually every night for weeks, a few dozen men carrying torches and using prybars and hammers to turn wooden frames--and any doors, walls, or windows that surrounded them--into kindling. None of the perpetrators were arrested, much less convicted and punished.

The attacks continued throughout the spring of 1811, and after a brief summertime lull started up again in the fall, by which time nearly one thousand weaving frames had been destroyed (out of the 25,000 to 29,000 then in Nottingham, Leicestershire, and Derbyshire), resulting in damages of between £6,000 and £10,000. That November, a commander using the nom de sabotage of Ned Ludd (sometimes Lud)--the name was supposedly derived from an apprentice to a Leicester stockinger named Ned Ludham whose reaction to a reprimand was to hammer the nearest stocking frame to splinters--led a series of increasingly daring attacks throughout the Midlands. On November 13, a letter to the Home Office demanded action against the "2000 men, many of them armed, [who] were riotously traversing the County of Nottingham."

By December 1811, rioters appeared in the cotton manufacturing capital of Manchester, where Luddites smashed both weaving and spinning machinery. Because Manchester was further down the path to industrialization, and therefore housed such machines in large factories as opposed to small shops, the destruction demanded larger, and better organized, mobs.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics and bracketed word in original.)

February 12, 2011

"Powerful Pressure for Scientists to Conform"


Source of book image: online version of the WSJ review quoted and cited below.

(p. A13) In "Hyping Health Risks," Geoffrey Kabat, an epidemiologist himself, shows how activists, regulators and scientists distort or magnify minuscule environmental risks. He duly notes the accomplishments of epidemiology, such as uncovering the risks of tobacco smoking and the dangers of exposure to vinyl chloride and asbestos. And he acknowledges that industry has attempted to manipulate science. But he is concerned about a less reported problem: "The highly charged climate surrounding environmental health risks can create powerful pressure for scientists to conform and to fall into line with a particular position."

Mr. Kabat looks at four claims -- those trying to link cancer to man-made chemicals, electromagnetic fields and radon and to link cancer and heart disease to passive smoking. In each, he finds more bias than biology -- until further research, years later, corrects exaggeration or error.

. . .

I know whereof Mr. Kabat speaks. In 1992, as the producer of a PBS program, I interviewed an epidemiologist who was on the EPA's passive-smoking scientific advisory board. He admitted to me that the EPA had put its thumb on the evidentiary scales to come to its conclusion. He had lent his name to this process because, he said, he wanted "to remain relevant to the policy process." Naturally, he didn't want to appear on TV contradicting the EPA.

For the full review, see:

RONALD BAILEY. "Bookshelf; Scared Senseless." The Wall Street Journal (Mon., AUGUST 11, 2008): A13.

(Note: ellipsis added.)

(Note: the first paragraph quoted above has slightly different wording in the online version than the print version; the second paragraph quoted is the same in both.)

The book under review is:

Kabat, Geoffrey C. Hyping Health Risks: Environmental Hazards in Daily Life and the Science of Epidemiology. New York: Columbia University Press, 2008.

February 11, 2011

Luddism in France

(p. 240) Not only was Richard Hargreaves's original spinning jenny destroyed in 1767, but so also was his new and improved version in 1769.

Nor was the phenomenon exclusively British. Machine breaking in France was at least as frequent, and probably even more consequential, though it can be hard to tease out whether the phenomenon contributed to, or was a symptom of, some of the uglier aspects of the French Revolution. Normandy in particular, which was not only close to England but the most "English" region of France, was the site of dozens of incidents in 1789 alone. In July, hundreds of spinning jennies were destroyed, along with a French version of Arkwright's water frame. In October, an attorney in Rouen applauded the destruction of "the machines used in cotton-spinning that have deprived many workers of their jobs." In Troyes, spinners rioted, killing the mayor and mutilating his body because he had favored machines. The carders of Lille destroyed machines in 1790; in 1791, the spinning jennies of Roanne were hacked up and burned. By 1796, administrators in the Department of the Somme were complaining, it turns out presciently, that "the prejudice against machinery has led the commercial classes . . . to abandon their interest in the cotton industry."


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: ellipsis in original.)

February 7, 2011

After a Series of Anonymous Threats, Cartwright Power Looms Were Burned Down

(p. 239) Cartwright constructed twenty looms using his design and put them to work in a weaving "shed" in Doncaster. He further agreed to license the design to a cotton manufacturer named Robert Grimshaw, who started building five hundred Cartwright looms at a new mill in Manchester in the spring of 1792. By summertime, only a few dozen had been built and installed, but that was enough to provoke Manchester's weavers, who accurately saw the threat they represented. Whether their anger flamed hot enough to burn down Grimshaw's mill remains unknown, but something certainly did: In March 1792, after a series of anonymous threats, the mill was destroyed.

Cartwright's power looms were not the first textile machines to be attacked, and they would not be the last.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

February 3, 2011

"Inventors Are Sometimes Beneficiaries of Their Own Ignorance"

William Rosen gives us a thought-provoking anecdote about Edmund Cartwright, the inventor of the first power loom:

(p. 238) He was also, apparently, convinced of the practicality of such a machine by the success of the "Mechanical Turk," a supposed chess-playing robot that had mystified all of Europe and which had not yet been revealed as one of the era's great hoaxes: a hollow figurine concealing a human operator. Inventors are sometimes beneficiaries of their own ignorance.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

January 30, 2011

Carlyle (and Rosen) on Arkwright

(p. 236) The greatest hero-worshipper of them all, Thomas Carlyle, described Arkwright as

A plain, almost gross, bag-cheeked, potbellied, much enduring, much inventing man and barber... . French Revolutions were a-brewing: to resist the same in any measure, imperial Kaisers were impotent without the cotton and cloth of England, and it was this man that had to give England the power of cotton.... It is said ideas produce revolutions, and truly they do; not spiritual ideas only, but even mechanical. In this clanging clashing universal Sword-dance which the European world now dances for the last half-century, Voltaire is but one choragus [leader of a movement, from the old Greek word for the sponsor of a chorus] where Richard Arkwright is another.

. . .

Arkwright was not a great inven-(p. 237)tor, but he was a visionary, who saw, better than any man alive, how to convert useful knowledge into cotton apparel and ultimately into wealth: for himself, and for Britain.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: internal ellipses in original; ellipsis between paragraphs added.)

January 29, 2011

"It Isn't the Consumers' Job to Know What They Want"

"Steven P. Jobs has played a significant role in a string of successful products at Apple, including the iPad, shown above, which was introduced last year." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B1) Shortly before the iPad tablet went on sale last year, Steven P. Jobs showed off Apple's latest creation to a small group of journalists. One asked what consumer and market research Apple had done to guide the development of the new product.

"None," Mr. Jobs replied. "It isn't the consumers' job to know what they want."

For years, and across a career, knowing what consumers want has been the self-appointed task of Mr. Jobs, Apple's charismatic co-founder. Though he has not always been right, his string of successes at Apple is uncanny. His biggest user-pleasing hits include the Macintosh, the iMac, iBook, iPod, iPhone and iPad.

But as he takes a medical leave of absence, announced on Monday, the question is: Without him at the helm, can Apple continue its streak of innovation, particularly in an industry where rapid-fire product cycles can make today's leader tomorrow's laggard?

. . .

(p. B4) With the iPad tablet, Apple jump-started a product category. But with the iPod (a music and media player) and iPhone (smartphone), Apple moved into markets with many millions of users using rival products, but it gave consumers a much improved experience.

"These are seeing-around-the-corner innovations," said John Kao, an innovation consultant to corporations and governments. "Steve Jobs is totally tuned into what consumers want. But these are not the kind of breakthroughs that market research, where you are asking people's opinions, really help you make."

Regis McKenna, a Silicon Valley investor and marketing consultant, said employees at Apple stores provide the company with a powerful window into user habits and needs, even if it is not conventional market research.

"Steve visits the Apple store in Palo Alto frequently," said Mr. McKenna, a former consultant to Apple.

. . .

In a conversation years ago, Mr. Jobs said he was disturbed when he heard young entrepreneurs in Silicon Valley use the term "exit strategy" -- a quick, lucrative sale of a start-up. It was a small ambition, Mr. Jobs said, instead of trying to build companies that last for decades, if not a century or more.

That was a sentiment, Mr. Jobs said, that he shared with his sometime luncheon companion, Andrew S. Grove, then the chief executive of Intel.

"There are builders and traders," Mr. Grove said on Tuesday. "Steve Jobs is a builder."

For the full story, see:

STEVE LOHR. "The Missing Tastemaker?" The New York Times (Weds., JANUARY 19, 2011): B1 & B4.

(Note: ellipses added.)

(Note: the online version of the article is dated January 18, 2011 and has the title "Can Apple Find More Hits Without Its Tastemaker?.")

January 6, 2011

Supervising a Talented Inventor

(p. 180) Anyone who has ever supervised a talented subordinate with a tendency to set his own priorities will find Watt's letters familiar: "I wish William could be brought to do as we do, to mind the business in hand, and let such as Symington [William Symington, the builder of the Charlotte Dundas, one of the world's first steam-engine boats] and Sadler [James Sadler, balloonist and inventor of a table steam engine] throw away their time and money, hunting shadows."


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics and bracketed words in original.)

January 5, 2011

Chinese Communist Oligarchs Unfriend the World

"The Facebook friendship map, created by Paul Butler." Source of caption and map: online version of the WSJ article quoted and cited below.

(p. B7) The contrast between Facebook's spreading global network of users and its effective absence from China is starkly illustrated by a map, produced by a Facebook intern and flagged on the Economist's website earlier this month, that has lately become a point of fascination of the Chinese Internet.

Described by its creator Paul Butler as "a social graph of 500 million people," the map represents the worldwide volume of Facebook friendships across geographic locations using lines of varying intensity. Butler's methodology is interesting in its own right, but what appeared to most interest China's netizens was how China appears on the map. Or, rather, how it doesn't.

. . .

Since Facebook is blocked in China, the number of Facebook friendship lines flowing in and out of the country is essentially negligible, making China almost impossible to see.

For the full story, see:

Josh Chin. "Facebook Gets Back Into China (Sort of...)." The Wall Street Journal (Tues., December 21, 2010): B7.

(Note: ellipsis added.)

(Note: the online version of the article has the title "Facebook Gets Back Into China (Sort of...)" and includes paragraphs at the end that were not in the print version.)

January 2, 2011

Suppression of Cistercians Did Not Delay Industrial Revolution

(p. 138) . . . , the Cistercians' proven ability to produce substantial quantities of high-quality iron not only fails to prove that they were about to ignite an Industrial Revolution when they were suppressed in the early sixteenth century, it actually demonstrates the opposite--and for two reasons. First, the iron of Laskill and Fontenay was evidence not of industrialization, but of industriousness. The Cistercians owed their factories' efficiency to their disciplined and cheap workforce rather than any technological innovation; there's nothing like a monastic brotherhood that labors twelve hours a day for bread and water to keep costs down. The sixteenth-century monks were still using thirteenth-century technology, and they neither embraced, nor contributed to, the Scientific Revolution of Galileo and Descartes.

The second reason is even more telling: For centuries, the Cistercian monasteries (and other ironmakers; the Cistercians were leaders of medieval iron manufacturing, but they scarcely monopolized it) had been able to supply all the high-quality iron that anyone could use, but all that iron still failed to ignite a technological revolution. Until something happened to increase demand for iron, smelters and forges, like the waterpower that drove them, sounded a lot like one hand clapping. It would sound like nothing else for--what else?--two hundred years.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: ellipsis added.)

December 31, 2010

The Glamour of Trains and Windmills Hides Their High Costs

(p. C12) When Robert J. Samuelson published a Newsweek column last month arguing that high-speed rail is "a perfect example of wasteful spending masquerading as a respectable social cause," he cited cost figures and potential ridership to demonstrate that even the rosiest scenarios wouldn't justify the investment. He made a good, rational case--only to have it completely undermined by the evocative photograph the magazine chose to accompany the article.

The picture showed a sleek train bursting through blurred lines of track and scenery, the embodiment of elegant, effortless speed. It was the kind of image that creates longing, the kind of image a bunch of numbers cannot refute. It was beautiful, manipulative and deeply glamorous.

. . .

The problems come, of course, in the things glamour omits, including all those annoyingly practical concerns the policy wonks insist on debating. Neither trains nor wind farms are as effortlessly liberating as their photos suggest. Neither really offers an escape from the world of compromises and constraints. The same is true, of course, of evening gowns, dream kitchens and tropical vacations. But at least the people who enjoy that sort of glamour pay their own way.

For the full commentary, see:

VIRGINIA POSTREL. "COMMERCE & CULTURE; The Allure of Techno-Glamour." The Wall Street Journal (Sat., NOVEMBER 20, 2010): C12.

(Note: ellipsis added.)

December 29, 2010

"A Nation's Heroes Reveal Its Ideals"

(p. 133) Robert and John Hart were two Glasgow engineers and merchants who regarded James Watt with the sort of awe usually reserved for pop musicians, film stars, or star athletes. Or even more: They regarded him as "the greatest and most useful man who ever lived." . . .

. . .

(p. 134) . . . the hero worship of the brothers Hart is more enlightening about the explosion of inventive activity that started in eighteenth-century Britain than their reminiscences. For virtually all of human history, statues had been built to honor kings, soldiers, and religious figures; the Harts lived in the first era that built them to honor builders and inventors. James Watt was an inventor inspired in every way possible, right down to the neurons in his Scottish skull; but he was also, and just as significantly, the inspiration for thousands of other inventors, during his lifetime and beyond. The inscription on the statue of Watt that stood in Westminster Abbey from 1825 until it was moved in 1960 reminded visitors that it was made "Not to perpetuate a name which must endure while the peaceful arts flourish, but to shew that mankind have learned to know those who best deserve their gratitude" (emphasis added).

A nation's heroes reveal its ideals, and the Watt memorial carries an impressive weight of symbolism. However, it must be said that the statue, sculpted by Sir Francis Chantrey in marble, might bear that weight more appropriately if it had been made out of the trademark material of the Industrial Revolution: iron.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: ellipses added; italics in original.)

December 25, 2010

"Inventors Fear Wrong Answers Less than Noninventors"

(p. 123) [A] . . . study . . . conducted in 1962, compared the results of psychometric tests given to inventors and noninventors (the former defined by behaviors such as application for or receipt of a patent) in similar professions, such as engineers, chemists, architects, psychologists, and science teachers. Some of the results (p. 124) were about what one might expect: inventors are significantly more thing-oriented than people-oriented, more detail-oriented than holistic. They are also likely to come from poorer families than noninventors in the same professions. . . .

. . . , the 1962 study also revealed that independent inventors scored far lower on general intelligence tests than did research scientists, architects, or even graduate students. There's less to this than meets the eye: The intelligence test that was given to the subjects subtracted wrong answers from right answers, and though the inventors consistently got as many answers correct as did the research scientists, they answered far more questions, thereby incurring a ton of deductions. While the study was too small a sample to prove that inventors fear wrong answers less than noninventors, it suggested just that. In the words of the study's authors, "The more inventive an independent inventor is, the more disposed he will be--and this indeed to a marked degree--to try anything that might work."
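The scoring rule at issue is simple arithmetic: score equals right answers minus wrong answers. A toy illustration (the numbers are invented, not from the study) shows how someone who attempts many more questions can match another person's correct total and still come out far behind.

def test_score(right, wrong):
    # The 1962 study's rule as described above: wrong answers are subtracted from right ones.
    return right - wrong

cautious_scientist = test_score(right=40, wrong=5)   # attempts 45 questions, scores 35
bold_inventor = test_score(right=40, wrong=25)       # attempts 65 questions, scores 15
print(cautious_scientist, bold_inventor)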


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: word in brackets and ellipses added.)

December 13, 2010

"The Most Important Invention of the Industrial Revolution Was Invention Itself"

(p. 103) Alfred North Whitehead famously wrote that the most important invention of the Industrial Revolution was invention itself.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

December 10, 2010

Measuring Inflation by Internet Prices


Source of graphs: online version of the WSJ article quoted and cited below.

(p. A5) Economists Roberto Rigobon and Alberto Cavallo at the Massachusetts Institute of Technology's Sloan School of Management have come up with a method to scour the Internet for online prices on millions of items and then use them to calculate inflation statistics for a dozen countries on a daily basis. The two have been collecting data for the project for more than three years, but only made their results public this week.

. . .

In countries where the apparatus for collecting prices is limited, or where officials have manipulated inflation data, the economists' indexes might give a clearer view. In Argentina, for example, the government has been widely accused of massaging price figures to let it pay less interest to holders of inflation-indexed bonds. President Cristina Fernández has defended the government data. For September, the government's measure of prices rose 11.1% from a year earlier. The economists' measure in that period: up 19.7%.
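The basic bookkeeping behind such an index can be sketched in a few lines (my own rough caricature with invented prices, not the economists' actual method): scrape the same items' prices at two dates and average the price relatives.

def year_over_year_inflation(old_prices, new_prices):
    # Average the price relatives (new / old) across items, then express as a percentage.
    relatives = [new_prices[item] / old_prices[item] for item in old_prices]
    return (sum(relatives) / len(relatives) - 1) * 100

old = {"bread": 2.00, "milk": 1.50, "gasoline": 3.00}  # prices scraped a year ago (invented)
new = {"bread": 2.30, "milk": 1.70, "gasoline": 3.60}  # prices scraped today (invented)
print(round(year_over_year_inflation(old, new), 1))    # about 16.1 (percent)

A real daily index would chain many such comparisons and weight the items, but the appeal of the approach is visible even here: once the prices are scraped, the calculation itself is trivial and can be rerun every day.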

For the full story, see:

JUSTIN LAHART. "A Way, Day by Day, of Gauging Prices." The Wall Street Journal (Thurs., NOVEMBER 11, 2010): A5.

(Note: ellipsis added.)

(Note: the online version of the article is dated NOVEMBER 10, 2010.)

December 9, 2010

Science Can Contribute "Diligent Experimental Habits" to Technology

(p. 101) Nothing is more common in the history of science than independent discovery of the same phenomenon, unless it is a fight over priority. To this day, historians debate how much prior awareness of the theory of latent heat was in Watt's possession, but they miss Black's real contribution, which anyone can see by examining the columns of neat script that attest to Watt's careful recording of experimental results. Watt didn't discover the existence of latent heat from Black, at least not directly; but he rediscovered it entirely through exposure to the diligent experimental habits of professors such as Black, John Robison, and Robert Dick.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

December 1, 2010

"The Steam Engine Has Done Much More for Science than Science Has Done for the Steam Engine"

(p. 67) The great scientist and engineer William Thomson, Lord Kelvin, made his reputation on discoveries in basic physics, electricity, and thermodynamics, but he may be remembered just as well for his talent for aphorism. Among the best known of Kelvin's quotations is the assertion that "all science is either physics or stamp collecting" (while one probably best forgotten is the confident "heavier-than-air flying machines are impossible"). But the most relevant for a history of the Industrial Revolution is this: "the steam engine has done much more for science than science has done for the steam engine."


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

November 28, 2010

Whittle "Struggled for Years to Get Funding and Time to Pursue His Idea"

"When Britain Ruled The Skies: A De Havilland Comet under construction in Belfast in 1954." Source of caption and photo: online version of the WSJ review quoted and cited below.

(p. C8) Frank Whittle, the brilliant British military pilot and engineer who began patenting jet designs in 1930, struggled for years to get funding and time to pursue his idea. Even after World War II, when a competing Nazi design showed what fighter jets could achieve in battle, U.S. airlines were slow to see jets' potential for passenger travel.

It took another Brit, airplane designer Geoffrey de Havilland, to awaken postwar America's aviation behemoths. While Lockheed and Douglas were still churning out rumbling, low-flying propeller planes, De Havilland's jet-powered Comet began breaking records in 1952. Only after seeing Comets scorch the stratosphere at 500 miles an hour did Howard Hughes want jetliners for TWA and Juan Trippe get interested for Pan Am.

Among American plane makers, it was a military contractor that had struggled in the prewar passenger-plane market--Boeing--that first took up the jetliner challenge. In retrospect, the outcome seems obvious. The Boeing 707 inspired the term "jet set." Boeing's iconic 747 "Jumbo Jet" opened jet-setting to the masses.

But in 1952, that outcome was far from obvious. Mr. Verhovek zeroes in on the mid-1950s, when Comets first seemed to own the world and then started plunging from the sky in pieces. The Comet's fatal design flaw--the result of an insufficient appreciation of the danger of metal fatigue--holds resonance today as both Boeing and Airbus struggle to master the next generation of jetliner materials, composites of carbon fiber and plastic.

. . .

Although "Jet Age" inevitably centers on technology, Mr. Verhovek wisely focuses as well on the outsize personalities behind world-changing innovations. There's Mr. De Havilland, a manic depressive who was so dedicated to aviation that he kept going after two of his three sons died testing his planes. Mr. Whittle, we learn, sniffed Benzedrine to stay awake, popped tranquilizers to sleep and shriveled to just 127 pounds while developing the jet engine. And Boeing chief executive Bill Allen, a meticulous lawyer, bet the company on passenger jets when not a single U.S. airline wanted one.

For the full review, see:

DANIEL MICHAELS. "Shrinking the World; How jetliners commercialized air travel--stewardesses and all." The Wall Street Journal (Sat., October 9, 2010): C8.

(Note: ellipsis added.)

The book under review is:

Verhovek, Sam Howe. Jet Age: The Comet, the 707, and the Race to Shrink the World. New York: Avery, 2010.

November 27, 2010

Coke's Patent Law Motivated by Belief that Creative Craftsmen Were Source of Britain's Prosperity

William Rosen discusses the genesis and significance of the world's first patent law:

(p. 52) The Statute became law in 1624. The immediate impact was barely noticeable, like a pebble rolling down a gradual slope at the top of a snow-covered mountain. For decades, fewer than six patents were awarded annually, though still more in Britain than anywhere else. It was seventy-five years after the Statute was first drafted, on Monday, July 25, 1698, before an anonymous clerk in the employ of the Great Seal Patent Office on Southampton Row, three blocks from the present-day site of the British Museum, granted patent number 356: Thomas Savery's "new Invention for Raiseing of Water and occasioning Motion to all Sorts of Mill Work by the Impellent Force of Fire."

Both the case law and the legislation under which the application was granted had been written by Edward Coke. Both were imperfect, as indeed was Savery's own engine. The law was vague enough (and Savery's grant wide-ranging enough; it essentially covered all ways for "Raiseing of Water" by fire) that Thomas Newcomen was compelled to form a partnership with a man whose machine scarcely resembled his own. But it is not too much to claim that Coke's pen had as decisive an impact on the evolution of steam power as any of Newcomen's tools. Though he spent most of his life as something of a sycophant to Elizabeth and James, Coke's philosophical and temperamental affinity for ordinary Englishmen, particularly the nation's artisans, compelled him to act, time and again, in their interests even when, as with his advocacy of the 1628 Petition of Right (an inspiration for the U.S. Bill of Rights) it landed him in the King's prisons. He became the greatest advocate for England's craftsmen, secure in the belief that they, not her landed gentry or her merchants, were the nation's source of prosperity. By understanding that it was England's duty, and--perhaps even more important--in England's interest, to promote the creative labors of her creative laborers, he anticipated an economic philosophy far more modern than he probably understood, and if he grew rich in the service of the nation, he also, with his creation of the world's first durable patent law, returned the favor.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics in original.)

November 23, 2010

When Inventors Could Get Patents that Were Durable and Enforceable, "the World Started to Change"

(p. 50) . . . Coke, who had . . . been made Lord Chief Justice of England, drafted the 1623 "Act concerning Monopolies and Dispensations with penall Lawes and the Forfeyture thereof," or, as it has become known, the Statute on Monopolies. The Act was designed to promote the interests of artisans, and eliminate all traces of monopolies.

With a single, and critical, exception. Section 6 of the Statute, which forbade every other form of monopoly, carved out one area in which an exclusive franchise could still be granted: Patents could still be awarded to the person who introduced the invention to the realm--to the "first and true inventor."

This was a very big deal indeed, though not because it represented the first time inventors received patents. The Venetian Republic was offering some form of patent protection by 1471, and in 1593, the Netherlands' States-General awarded a patent to Mathys Siverts, for a new (and unnamed) navigational instrument. And, of course, Englishmen like John of Utynam had been receiving patents for inventions ever since Henry VI. The difference between Coke's statute and the customs in place before and elsewhere is that it was a law, with all that implied for its durability and its enforceability. Once only inventors could receive patents, the world started to change.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics in original; ellipses added.)

November 19, 2010

Invention Aided By the Intelligent Hand and Spatial Intelligence

(p. 36) For centuries, certainly ever since Immanuel Kant called the hand "the window on the mind," philosophers have been pondering the very complex way in which the human hand is related to the human mind. Modern neuroscience and evolutionary biology have confirmed the existence of what the Scottish physician and theologian Charles Bell called the intelligent hand. Stephen Pinker of Harvard even argues that early humans' intelligence increased partly because they were equipped with levers of influence on the world, namely the grippers found at the end of their two arms. We now know that the literally incredible amount of sensitivity and articulation of the human hand, which has increased at roughly the same pace as has the complexity of the human brain, is not merely a product of the pressures of natural selection, but an initiator of it: The hand has led the brain to evolve just as much as the brain has led the hand. The hands of a pianist, or a painter, or a sushi chef, or even, as with Thomas New-(p. 37)comen, hands that could use a hammer to shape soft iron, are truly, in any functional sense, "intelligent."

This sort of tactile intelligence was not emphasized in A. P. Usher's theory of invention, the components of which he filtered through the early twentieth-century school of psychology known as Gestalt theory, which was preeminently a theory of visual behavior. The most important precepts of Gestalt theory (to Usher, anyway, who was utterly taken with their explanatory power) are that the patterns we perceive visually appear all at once, rather than by examining components one at a time, and that a principle of parsimony organizes visual perceptions into their simplest form. Or forms; one of the most famous Gestalt images is the one that can look like either a goblet or two facing profiles. Usher's enthusiasm for Gestalt psychology explains why, despite his unshakable belief in the inventive talents of ordinary individuals, he devotes an entire chapter of his magnum opus to perhaps the most extraordinary individual in the history of invention: Leonardo da Vinci.

Certainly, Leonardo would deserve a large place in any book on the history of mechanical invention, not only because of his fanciful helicopters and submarines, but for his very real screw cutting engine, needle making machine, centrifugal pumps, and hundreds more. And Usher found Leonardo an extraordinarily useful symbol in marking the transition in mechanics from pure intuition to the application of science and mathematics.

But the real fascination for Usher was Leonardo's straddling of two worlds of creativity, the artistic and the inventive. No one, before or since, more clearly demonstrated the importance to invention of what we might call "spatial intelligence"; Leonardo was not an abstract thinker of any great achievement, nor were his mathematical skills, which he taught himself late in life, remarkable. (p. 38) His perceptual skills, on the other hand, developed primarily for his painting, were extraordinary, but they were so extraordinary that Usher could write, "It is only with Leonardo that the process of invention is lifted decisively into the field of the imagination. . . . "


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

November 15, 2010

If the Uncredentialed Succeed, It Must Be Luck

(p. 33) Newcomen and Calley had, in broad strokes, the design for a working engine. They had enjoyed some luck, though it was anything but dumb luck. This didn't seem to convince the self-named (p. 34) experimental philosopher J. T. Desaguliers, a Huguenot refugee like Papin, who became one of Isaac Newton's assistants and (later) a priest in the Church of England. Desaguliers wrote, just before his death in 1744, that the two men had made their engine work, but "not being either philosophers to understand the reason, or mathematicians enough to calculate the powers and to proportion the parts, very luckily by accident found what they sought for."

The notion of Newcomen's scientific ignorance persists to this day. One of its expressions is the legend that the original engine was made to cycle automatically by the insight of a boy named Humphrey Potter, who built a mazelike network of catches and strings from the plug rod to open the valves and close them. It is almost as if a Dartmouth ironmonger simply had to have an inordinate amount of luck to succeed where so many had failed.

The discovery of the power of injected water was luck; understanding and exploiting it was anything but. Newcomen and Calley replaced the accidental hole in the cylinder with an injection valve, and, ingeniously, attached it to the piston itself. When the piston reached the bottom of the cylinder, it automatically closed the injection valve and opened another valve, permitting the water to flow out.


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics in original.)

November 11, 2010

Torricelli Experiment Disproved Aristotelian Theory that a Vacuum Was Impossible

(p. 8) Florence, in the year 1641, had been essentially the private fief of the Medici family for two centuries. The city, ground zero for both the Renaissance and the Scientific Revolution, was also where Galileo Galilei had chosen to live out the sentence imposed by the Inquisition for his heretical writings that argued that the earth revolved around the sun. Galileo was seventy years old and living in a villa in Arcetri, in the hills above the city, (p. 9) when he read a book on the physics of movement titled De motu (sometimes Trattato del Moto) and summoned its author, Evangelista Torricelli, a mathematician then living in Rome. Torricelli, whose admiration for Galileo was practically without limit, decamped in time not only to spend the last three months of the great man's life at his side, but to succeed him as professor of mathematics at the Florentine Academy.

. . .

(p. 9) . . . , Torricelli used a tool even more powerful than his well-cultivated talent for mathematical logic: He did experiments. At the behest of one of his patrons, the Grand Duke of Tuscany, whose engineers were unable to build a sufficiently powerful pump, Torricelli designed a series of apparatuses to test the limits of the action of contemporary water pumps. In spring of 1644, Torricelli filled a narrow, four-foot-long glass tube with mercury--a far heavier fluid than water--inverted it in a basin of mercury, sealing the tube's top, and documented that while the mercury did not pour out, it did leave a space at the closed top of the tube. He reasoned that since nothing could have slipped past the mercury in the tube, what occupied the top of the tube must, therefore, be nothing: a vacuum.

. . .

(p. 10) Torricelli was not, even by the standards of his day, a terribly ambitious inventor. When faced with hostility from religious authorities and other traditionalists who believed, correctly, that his discovery was a direct shot at the Aristotelian world, he happily returned to his beloved cycloids, the latest traveler to find himself on the wrong side of the boundary line between science and technology.

But by then it no longer mattered if Torricelli was willing to leave the messiness of physics for the perfection of mathematics: vacuum would keep mercury in the bottle, but the genie was already out. Nature might have found vacuum repugnant for two thousand years, but Europe was about to embrace it.
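A back-of-the-envelope calculation (mine, not in Rosen's text) shows why the mercury stops short of the top of a four-foot tube: the column rises only until its weight balances atmospheric pressure, a height of h = P/(ρg), which for mercury is about 76 centimeters.

P_ATM = 101_325        # standard atmospheric pressure, in pascals
RHO_MERCURY = 13_534   # density of mercury, in kilograms per cubic meter
G = 9.81               # gravitational acceleration, in meters per second squared

height = P_ATM / (RHO_MERCURY * G)  # column height at which its weight balances the air pressure
print(round(height, 3))             # about 0.763 m, well short of a four-foot (roughly 1.2 m) tube

Whatever sits above that 76-centimeter column in the sealed tube can only be empty space, which is exactly the conclusion Torricelli drew.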


Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: italics in original; ellipses added.)

November 2, 2010

William Rosen's "The Most Powerful Idea in the World"


Source of book image:

The range of William Rosen's fascinating and useful book is very broad indeed. He is interested in THE question: why did the singular improvement in living standards known as the industrial revolution happen where and when it did?

The question is not just of historical interest--if we can figure out what caused the improvement then and there, we have a better shot at continuing to improve in the here and now.

I especially enjoyed and learned from William Rosen's discussion, examples and quotations on the difficult issue of whether patents are on balance a good or bad institution.

Deirdre McCloskey taught me that the most important part of a sentence is the last word, and the most important part of a paragraph is the last sentence, and the most important part of a chapter is the last paragraph.

Here are the last couple of sentences of Rosen's book:

(p. 324) Incised in the stone over the Herbert C. Hoover Building's north entrance is the legend that, with Lincoln's characteristic brevity, sums up the single most important idea in the world:

"THE PATENT SYSTEM ADDED THE FUEL OF INTEREST TO THE FIRE OF GENIUS."

In the next few weeks I will occasionally quote a few of the more illuminating passages from Rosen's well-written account.

Book discussed:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

September 27, 2010

Twitter CEO Returned to Nebraska to Found First Company


Evan Williams, Twitter CEO. Source of photo: online version of the NYT article quoted and cited below.

(p. 9) I GREW up on a farm in Nebraska, where we grew mostly corn and soybeans. During the summers I was responsible for making sure the crops were irrigated.

After high school, I enrolled at the University of Nebraska at Lincoln, but I stayed only a year and a half. I felt college was a waste of time; I wanted to start working. I moved to Florida, where I did some freelance copywriting. After that I moved to Texas and stayed with my older sister while I figured out what to do next. In 1994, I returned to Nebraska and started my first company with my dad.

We didn't know anything about the Internet, but I thought it was going to be a big deal. We produced CD-ROMs and a video on how to use the Internet, and we did some Web hosting. I recruited some friends and we tossed around some ideas, but none of us knew how to write software and we didn't have much money. We watched what entrepreneurs in California were doing and tried to play along.

. . .

My life has been a series of well-orchestrated accidents; I've always suffered from hallucinogenic optimism. I was broke for more than 10 years. I remember staying up all night one night at my first company and looking in couch cushions the next morning for some change to buy coffee. I've been able to pay my father back, which is nice, and my mother doesn't worry about me as much since I got married a year and a half ago.

For the full story, see:

EVAN WILLIAMS. "The Boss; For Twitter C.E.O., Well-Orchestrated Accidents." The New York Times, Sunday Business Section (Sun., March 8, 2009): 9.

(Note: the online version of the story is dated March 7, 2009.)

August 26, 2010

Air Conditioning as "the Antithesis of Passive Resignation"

In the passage quoted below, Severgnini captures something of the truth. Americans, at their best, have sought to control nature in order to make life longer and happier.

But Severgnini does not see that there is a difference between seeking to control nature and seeking to control other people. At its best, America excels at the former, and refrains from the latter.

(p. W9) A few years ago, Italian journalist Beppe Severgnini recounted his adventures in the U.S. in the book "Ciao, America!" in which he offered up humorous musings on many of the standard European complaints about the American way of living. Mr. Severgnini allows that he rather admires the Yankee "urge to control the outside world," whether that means sending planes off an aircraft carrier or sending out technicians from Carrier.

He notes that the refusal to suffer the sweaty indignity of equatorial heat is "the antithesis of passive resignation," and thus a perfect expression of the can-do American character. "In America, air-conditioning is not simply a way of cooling down a room," Mr. Severgnini writes. "It is an affirmation of supremacy."

For the full commentary, see:

ERIC FELTEN. "DE GUSTIBUS; The Big Chill: Giving AC the Cold Shoulder." The Wall Street Journal (Fri., July 23, 2010): W9.

August 24, 2010

Wozniak "Lucky" to Be Young "Just as a Revolution Is About to Take Off"

(p. 299) If you're as lucky as I've been, then you'll get to live in a time when you're young just as a revolution is about to take off. Just like Henry Ford was there for the automotive industry, I was there to see and build the first personal computers.

Back in the mid-1990s when I was teaching school, I thought one time to myself, Wow, I wish I could be twelve now, look at the things I could do with what's out there now.

(p. 300) But then I realized I was lucky. I got to see the before, the during, and the after of some of those changes in life. I got to be one of those few people who could effect some of those changes.

Excellence came to me from not having much money, and also from having good building skills but not having done these products before.

I hope you'll be as lucky as I am. The world needs inventors--great ones. You can be one. If you love what you do and are willing to do what it takes, it's within your reach. And it'll be worth every minute you spend alone at night, thinking and thinking about what it is you want to design or build. It'll be worth it, I promise.


Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

August 20, 2010

Wozniak on Borrowing Xerox Parc's Graphical User Interface (GUI)

(p. 293) But there was one exception. Right around 1980, Steve and a bunch of us from Apple got to tour the Xerox Palo Alto Research Center (PARC) facility, which is one of Xerox's research and development labs.

Inside, for the first time ever, we saw real video displays--computer monitors--and they were showing something entirely new. They were showing the first graphical user interface (GUI)--an interface that lets you interact with icons and menus to control a program.

(p. 294) Up to this point, everything had been text-based. That's going to sound odd to all the people who don't remember it, but that's how everything worked back then. A computer user had to actually type in text commands--long, complicated ones--to make something happen.

But this experimental Xerox computer had windows popping up all over the place. And they were using this funny-looking device everyone now knows as a mouse, clicking on words and small pictures, the icons, to make things happen.

The minute I saw this interface, I knew it was the future. There wasn't a doubt in my mind. It was like a one-way door to the future--and once you went through it, you could never turn back. It was such a huge improvement in using computers. The GUI meant you could get a computer to do the same things it could normally do, but with much less physical and mental effort. It meant that nontechnical people could do some pretty powerful things with computers without having to sit there and learn how to type in long commands. Also, it let several different programs run in separate windows at the same time. That was powerful!

A few years later, Apple designed the Lisa computer, and later the Macintosh, around this concept. And Microsoft did it a couple years after that with Microsoft Windows. And now, more than twenty-five years after we saw that experimental computer in the Xerox PARC lab, all computers work like this.

It's so rare to be able to see the future like that. I can't promise it'll happen to you. But when you see it, you know it. If this ever happens to you, leap at the chance to get involved. Trust your instincts. It isn't often that the future lets you in like that.


Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

August 5, 2010

Driving Blind: Exploring the Unexplorable

BlindDriver2010-07-24.jpg"The National Federation of the Blind operates a science camp to inspire young blind students, such as the girl above [Addison Hugen]. The organization is working with Virginia Tech on a vehicle equipped with sensors for blind drivers." Source of photo: online version of the article quoted and cited below. Source of caption: print version of the article quoted and cited below. [Bracketed name from online version of caption.]

(p. 3A) WASHINGTON (AP) - Could a blind person drive a car? Researchers are trying to make that far-fetched notion a reality.

The National Federation of the Blind and Virginia Tech plan to demonstrate a prototype vehicle next year equipped with technology that helps a blind person drive a car independently.

The technology, called "nonvisual interfaces," uses sensors to let a blind driver maneuver a car based on information transmitted to him about his surroundings: whether another car or object is nearby, in front of him or in a neighboring lane.

Advocates for the blind consider it a "moon shot," a goal similar to President John F. Kennedy's pledge to land a man on the moon. For many blind people, driving a car long has been considered impossible. But researchers hope the project could revolutionize mobility and challenge long-held assumptions about limitations.

"We're exploring areas that have previously been regarded as unexplorable," said Dr. Marc Maurer, president of the National Federation of the Blind.

For the full story, see:

KEN THOMAS. "Blind Drivers Goal of High-Tech Car Project." Omaha World-Herald (Saturday, July 3, 2010): 3A.

(Note: the online version of the article has the title: "Driving while blind? Maybe, with new high-tech car.")

July 26, 2010

The British Museum Collaborating with Wikipedia

"Two visitors from Wikipedia, Liam Wyatt, left, and Joseph Seddon, at the British Museum." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. C1) The British Museum has begun an unusual collaboration with Wikipedia, the online, volunteer-written encyclopedia, to help ensure that the museum's expertise and notable artifacts are reflected in that digital reference's pages.

About 40 Wikipedia contributors in the London area spent Friday with a "backstage pass" to the museum, meeting with curators and taking photographs of the collection. And in a curious reversal in status, curators were invited to review Wikipedia's treatment of the museum's collection and make a case that important pieces were missing or given short shrift.

Among those wandering the galleries was the museum's first Wikipedian in residence, Liam Wyatt, who will spend five weeks in the museum's offices to build a relationship between the two organizations, one founded in 1753, the other in 2001.

"I looked at how many Rosetta Stone page views there were at Wikipedia," said Matthew Cock, who is in charge of the museum's Web site and is supervising the collaboration with Wikipedia. "That is perhaps our iconic object, and five times as many people go to the Wikipedia article as to ours."

In other words, if you can't beat 'em, join 'em.

Once criticized as amateurism run amok, Wikipedia has become ingrained in the online world: it is consulted by millions of users when there is breaking news; its articles are frequently the first result when a search engine is used.

. . .

(p. C6) Getting permission to work with Wikipedia was not as hard a sell as he expected, Mr. Cock said. "Everyone assumed everyone else hated it and that I shouldn't recommend it to the directorate," he said. "I laid it out, put a paper together. I won't say I was surprised, but I was very pleased it was very well received."

He said he had enthusiastic support from four departments, including Greek and Roman antiquity and prints and drawings. "I don't think it is just the young curators," he added.

For the full story, see:

NOAM COHEN. "Venerable British Museum Enlists in the Wikipedia Revolution." The New York Times (Sat., June 5, 2010): C1 & C6.

(Note: ellipsis added.)

(Note: the online version of the article is dated June 4, 2010.)

July 24, 2010

Android App Phones Play "One Seriously Crazy Game of Leapfrog"


"The Droid X is the latest "best Android phone on the market."" Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B1) You think technology moves too fast now? You think your camera, camcorder and computer become obsolete quickly?

Try buying an app phone. In this business, the state of the art changes as often as Lady Gaga changes outfits.

Suppose, for example, that you want one of the increasingly popular phones that run Google's Android software.

Last November, you might have been tempted by the Motorola Droid, "the best Android phone on the market." A month later, the HTC Hero was "the best Android phone on the market." By January, "the best Android phone yet" was the Nexus One. In April, "the best Android device that you can purchase" was the HTC Incredible. In May, "the best Android phone on the market" was the Sprint Evo.

Either "the best Android phone on the market" is a tech critic's tic, or we're witnessing one seriously crazy game of leapfrog.

The latest buzz is about the Motorola Droid X, which Verizon will offer in mid-July for $200.

. . .

(p. B8) . . . , it's thrilling to see the array of excellent app phones that the original iPhone begat. If you crave power, speed, flexibility, dropless calls, an almost-Imax screen and Verizon's network (as opposed to Sprint and its similar Evo), the Droid X is a big, beautiful contender for the "best Android phone on the market" crown.

This month's crown, anyway.

For the full story, see:

DAVID POGUE. "State of the Art; Big Phone, Big Screen, Big Pleasure." The New York Times (Thurs., July 1, 2010): B1 & B8.

(Note: ellipses added.)

(Note: the online version of the article is dated June 30, 2010.)

July 9, 2010

Smarter Info Technology Frees Workers from Routine and Creates Jobs

(p. A22) Smarter computing technology, experts say, ought to make the most skilled workers -- in science, the arts and business -- even more productive and prosperous by freeing them from routine tasks. Their prosperity translates to spending that creates jobs in stores, schools, gyms, construction and elsewhere.

Artificial intelligence, experts say, should also generate new jobs even as it displaces others. The smart machines of the future will need programming, servicing and upgrading -- work done, perhaps, by a new class of digital technicians. The intelligent machines, experts add, will be specialists in a field, like the medical assistant project at Microsoft. They must be tailored with specialized software, perhaps igniting a new industry for artificial intelligence applications.

Of course, no one really knows just what artificial intelligence will mean for jobs and the economy, but the technology is marching ahead. "Its potential is far greater than simply substituting technology for human labor," said Erik Brynjolfsson, an economist at the M.I.T. Sloan School of Management.

For the full story, see:

STEVE LOHR. "Jobs Created and Displaced." The New York Times (Fri., June 25, 2010): A22.

(Note: the date of the online version of the article was June 24, 2010.)

July 8, 2010

Low End Tech Upstart Moves Up-Market to Compete with Incumbents


Source of graph: online version of the WSJ article quoted and cited below.

The MediaTek example, briefly mentioned below, seems a promising fit with Christensen's theory of disruptive innovators.

(p. B7) TAIPEI--A little-known Taiwanese chip-design company is making waves in the cellphone business, grabbing market share from larger U.S. rivals and helping drive down phone prices for consumers.

. . .

While MediaTek isn't known for cutting-edge innovation, it has been able to apply the nimble, cost-cutting approach of Taiwan's contract manufacturers to the business of designing semiconductors, in which engineers use advanced software to lay out the microscopic circuits that make gadgets like cellphones function.

"MediaTek has brought down the cost significantly," says Jessica Chang, an analyst at Credit Suisse Group AG, who says mobile-phone makers are increasingly drawn to MediaTek's products because of their functionality and low cost.

For the full story, see:

TING-I TSAI. "Taiwan Chip Firm Shakes Up Cellphone Business." The Wall Street Journal (Mon., APRIL 19, 2010): B7.

(Note: ellipsis added.)

On Christensen's theories, see:

Christensen, Clayton M., and Michael E. Raynor. The Innovator's Solution: Creating and Sustaining Successful Growth. Boston, MA: Harvard Business School Press, 2003.

July 2, 2010

Cellphones in North Korea Promote Free Speech

"Mun Seong-hwi, a North Korean defector, speaking to someone in North Korea to gather information at his office in Seoul." Source of caption and photo: online version of the NYT article quoted and cited below.

I have long believed, but cannot prove, that on balance technology improves human freedom more than it endangers it.

The case of cellphones in North Korea supports my belief.

(p. A1) SEOUL, South Korea -- North Korea, one of the world's most impenetrable nations, is facing a new threat: networks of its own citizens feeding information about life there to South Korea and its Western allies.

The networks are the creation of a handful of North Korean defectors and South Korean human rights activists using cellphones to pierce North Korea's near-total news blackout. To build the networks, recruiters slip into China to woo the few North Koreans allowed to travel there, provide cellphones to smuggle across the border, then post informers' phoned and texted reports on Web sites.

The work is risky. Recruiters spend months identifying and coaxing potential informants, all the while evading agents from the North and the Chinese police bent on stopping their work. The North Koreans face even greater danger; exposure could lead to imprisonment -- or death.

For the full story, see:

CHOE SANG-HUN. "North Koreans Use Cellphones to Bare Secrets." The New York Times (Mon., March 29, 2010): A1 & A10.

(Note: the online version of the article is dated March 28, 2010.)

June 29, 2010

Wozniak: "It Was as if My Whole Life Had Been Leading Up to this Point"

(p. 155) It was as if my whole life had been leading up to this point. I'd done my minicomputer redesigns. I'd done data on-screen with Pong and Breakout, and I'd already done a TV terminal. From the Cream Soda Computer and others, I knew how to connect memory and make a working system. I realized that all I needed was this Canadian processor or another processor like it and (p. 156) some memory chips. Then I'd have the computer I'd always wanted!

Oh my god. I could build my own computer, a computer I could own and design to do any neat things I wanted to do with it for the rest of my life.

I didn't need to spend $400 to get an Altair--which really was just a glorified bunch of chips with a metal frame around it and some lights. That was the same as my take-home salary, I mean, come on. And to make the Altair do anything interesting, I'd have to spend way, way more than that. Probably hundreds, even thousands of dollars. And besides, I'd already been there with the Cream Soda Computer. I was bored with it then. You never go back. You go forward. And now, the Cream Soda Computer could be my jumping-off point.

No way was I going to do that. I decided then and there I had the opportunity to build the complete computer I'd always wanted. I just needed any microprocessor, and I could build an extremely small computer I could write programs on. Programs like games, and the simulation programs I wrote at work. The possibilities went on and on. And I wouldn't have to buy an Altair to do it. I would design it, all by myself.

That night, the night of that first meeting, this whole vision of a kind of personal computer just popped into my head. All at once. Just like that.


Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

June 21, 2010

Electronics Projects Were Wozniak's "Passion" and "Pastime" and "Reward"

(p. 127) I think most people with day jobs like to do something totally different when they get home. Some people like to come home and watch TV. But my thing was electronics projects. It was my passion and it was my pastime.

Working on projects was something I did on my own time to reward myself, even though I wasn't getting rewarded on the outside, with money or other visible signs of success.


Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

June 17, 2010

Scientific Calculators Creatively Destroyed Slide Rules

(p. 120) I'd been a slide rule whiz in high school, so when I saw the calculator, it was just amazing. A slide rule was kind of like a ruler--you had to look at it precisely to read the values. The most accurate number you could get was only three digits long, however, and even that result was always questionable. With a calculator, you could punch in precisely the digits you wanted. You didn't have to line up a slider. You could type in your numbers exactly, hit a button, and get an answer immediately. You could get that number all the way out to ten digits. For example, the real answer might be 3.158723623. An answer like that was much more precise than anything engineers had ever gotten before.

Well, the HP 35 was the first scientific calculator, and it was the first in history that you could actually hold in your hand. It could calculate sines and cosines and tangents, all the trigonometric and exponential/logarithmic functions engineers use to calculate and to do their jobs. This was 1973, and back then cal-(p. 121)culators--especially handheld calculators--were a very, very big deal.

. . .

There was no doubt in my mind that calculators were going to put slide rules out of business. (In fact, two years later you couldn't even buy a slide rule. It was extinct.) And now all of a sudden I'd gotten a job helping to design the next generation of these scientific calculators. It was like getting to be a part of history.
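As a small aside of my own (not from the book): the precision gap Wozniak describes is easy to see in a few lines of Python, rounding the same trigonometric result to slide-rule precision and to calculator precision. The angle and the rounding helper below are purely illustrative.

```python
import math

# A sketch (mine, not Wozniak's): the same trig result at slide-rule
# precision (about three significant digits) and at HP-35 precision
# (about ten significant digits). The angle is arbitrary.

def round_sig(x, digits):
    """Round x to the given number of significant digits."""
    if x == 0:
        return 0.0
    return round(x, -int(math.floor(math.log10(abs(x)))) + (digits - 1))

angle_deg = 37.5
exact = math.sin(math.radians(angle_deg))

print("full float:           ", exact)
print("slide-rule, 3 digits: ", round_sig(exact, 3))
print("calculator, 10 digits:", round_sig(exact, 10))
```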


Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

(Note: ellipsis added.)

June 13, 2010

In the Age of Vacuum Tubes, 6th Grader's Dad Showed Him How Transistors Work

Wozniak went on to invent the personal computer.

This example would probably fit with some of what Malcolm Gladwell claims in his bestseller Outliers.

(p. 15) I have to point out here that at no time did my dad make a big deal about my progress in electronics. He taught me stuff, sure, but he always acted as if it was just normal for me. By the sixth grade, I was really advanced in math and science, everyone knew it, and I'd been tested for IQ and they told us it was 200-plus. But my dad never acted like this was something he should push me along with. He pulled out a blackboard from time to time, a tiny little blackboard we had in our house on Edmonton Avenue, and when I asked, he would answer anything and make diagrams for it. I remember how he showed me what happened if you put a plus voltage into a transistor and got a minus voltage out the other end of the transistor. There must have been an inverter, a type of logic gate. And he even physically taught me how to make an AND gate and an OR gate out of parts he got--parts called diodes and resistors. And he showed me how they needed a transistor in between to amplify the signal and connect the output of one gate to the input of the other.

(p. 16) To this very moment, that is the way every single digital device on the planet works at its most basic level.

He took the time--a lot of time--to show me those few little things. They were little things to him, even though Fairchild and Texas Instruments had just developed the transistor only a decade earlier.

It's amazing, really, to think that my dad taught me about transistors back when almost no one saw anything but vacuum tubes. So he was at the top of the state of the art, probably because his secret job put him in touch with such advanced technology. So I ended up being at the state of the art, too.

The way my dad taught me, though, was not to rote-memorize how parts are connected to form a gate, but to learn where the electrons flowed to make the gate do its job. To truly internalize and understand what is going on, not just read stuff off some blueprint or out of some book.

Those lessons he taught me still drive my intelligence and my methods for all the computer designs I do today.
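An aside of my own, not Wozniak's: the blackboard lesson about gates translates directly into a few lines of Python, treating 1 as a high voltage and 0 as a low one. The little truth-table loop at the end chains the gates the way he describes connecting the output of one gate to the input of another.

```python
# A sketch (mine, not from the book) of the gates described above.
# 1 stands for a high voltage, 0 for a low one; a real diode-resistor
# gate does the same thing with currents and voltages.

def AND(a, b):
    return a & b          # high only when both inputs are high

def OR(a, b):
    return a | b          # high when either input is high

def NOT(a):
    return 1 - a          # the inverter: plus in, minus out, roughly speaking

# Chain the gates, connecting one gate's output to the next gate's input,
# and print the full truth table.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "| AND:", AND(a, b), "OR:", OR(a, b), "NOT(AND):", NOT(AND(a, b)))
```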


Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

The reference to the Gladwell book is:

Gladwell, Malcolm. Outliers: The Story of Success. New York, NY: Little, Brown, and Co., 2008.

June 9, 2010

Wozniak's Dad Taught Him the Power of Technology

(p. 12) . . . my dad taught me . . . a lot about electronics. Boy, do I owe a lot to him for this. He first started telling me things and explaining things about electronics when I was really, really young--before I was even four years old. This is before he had that top secret job at Lockheed, when he worked at Electronic Data Systems in the Los Angeles area. One of my first memories is his taking me to his workplace on a weekend and showing me a few electronic parts, putting them on a table with me so I got to play with them and look at them. I can still picture him standing there working on some kind of equipment. I don't know if he was soldering or what, but I do remember him hooking something up to something else that looked like a little TV set. I now know it was an oscilloscope. And he told me he was trying to get something done, trying to get the picture on the screen with a line (it was a waveform) stable-looking so he could show his boss that his design worked.

And I remember sitting there and being so little, and thinking: Wow, what a great, great world he's living in. I mean, that's all I (p. 13) thought: Wow. For people who know how to do this stuff--how to take these little parts and make them work together to do something--well, these people must be the smartest people in the world. That was really what went through my head, way back then.

Now, I was, of course, too young at that point to decide that I wanted to be an engineer. That came a few years later. I hadn't even been exposed to science fiction or books about inventors yet, but just then, at that moment, I could see right before my eyes that whatever my dad was doing, whatever it was, it was important and good.


Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

(Note: ellipses added.)

June 4, 2010

At Apple Wozniak Was the Inventor, and Jobs Was the Entrepreneur



iWoz is a fun read, with wild fluctuations in the significance of what is written. When Wozniak writes about the ingredients of inventiveness, it is significant. When he talks about his pranks, or his obsessions with certain number combinations, it is strange. (Maybe I just haven't figured out the significance of Wozniak's quirks--I once heard George Stigler say that even the mistakes of a great mind were worth pondering.)

In the next few weeks I'll be quoting a few of the more significant passages.

An over-riding lesson from the book is the extent to which both Wozniak and Jobs were necessary for the Apple achievement. Wozniak was a genius inventor, but he did not have the drive, the skills, or the judgment of the entrepreneur.

Schumpeter famously distinguished invention from innovation. Wozniak was the inventor, and Jobs was the innovator (aka, the entrepreneur).

Book discussed:

Wozniak, Steve, and Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W. W. Norton & Co., 2006.

May 21, 2010

"The Evolutionary Concomitant of Incessant Climate Change Was Human Resilience"

"Early Homo sapiens created these symbolic objects between 60,000 and 30,000 years ago. Using natural materials and creativity, they combined animal and human features into fantastical creatures and fashioned instruments for making music." Source of caption and photo: online version of the WSJ article quoted and cited below.

Artifacts of the sort displayed above have been used to argue that Homo sapiens had essentially reached their modern capabilities by at least 40,000 years ago.

The handaxes below are fascinating, in that they clearly display progress, and they clearly display how slow that progress was.

(p. D13) The mysterious Ice Age extinction of the Neanderthals, losers in the competition against modern humans, still fires the popular imagination. So it's startling to learn that as recently as 70,000 years ago, at least four human species coexisted, including tenacious, long-lived Homo erectus and diminutive, hobbit-like Homo floresiensis, found in Indonesia in 2003.

The sensational 1974 discovery in Ethiopia of "Lucy," resembling an ape but walking upright, located human origins 3.2 million years in the past. Those same fossil deposits have recently yielded even more-ancient ancestors, who stood on their own two feet as far back as six million years ago.

Paleoanthropology is thriving, and human fossil finds--more than 6,000 and counting--regularly force revisions of old timelines and theories. Our species, Homo sapiens, turns out to have had an abundance of long-lost cousins, though scientists are still arguing about the closeness of those relationships. The new David H. Koch Hall of Human Origins at the Smithsonian National Museum of Natural History, whose opening marked the museum's centennial, provides a formidable overview of this still-developing story.

. . .

It's long been accepted that different human species were adapted to thrive in specific climatic niches. Neanderthals had short, compact bodies to conserve heat and large nasal passages to warm frigid air, while some of our African forebears had long, skinny frames suited to hotter climes. But this exhibition contends that the evolutionary concomitant of incessant climate change was human resilience--the flexibility to make it almost anywhere, thanks to large, sophisticated brains and social networks.

Versatility apparently characterized even our oldest relatives. The ability to walk upright through the drier, more open grasslands did not immediately divest them of their penchant for climbing trees in the shrinking woodlands. A diorama of Lucy (Australopithecus afarensis) depicts her with one foot on the ground and another on a tree limb, symbolizing her straddling of two environments.

For the full review, see:

JULIA M. KLEIN. "Natural History; Our Species Rediscovers Its Cousins." The Wall Street Journal (Tues., May 11, 2010): D13.

(Note: ellipsis added.)

"Handaxes -- multipurpose tools used to chop wood, butcher animals, and make other tools -- dominated early human technology for more than a million years. Left to right: Africa (1.6 million years old), Asia (1.1 million years old), and Europe (250,000 years old)." Source of caption and photo: online version of the WSJ article quoted and cited above.

April 27, 2010

Technology Can Enable the Disabled

"Eric P. Jones demonstrating his new prosthetic fingers. They have helped him master movements other people take for granted, like pouring soda into a cup." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 4) ERIC JONES sat in a middle seat on a recent flight from the New York area to Florida, but he wasn't complaining. Instead, he was quietly enjoying actions that many other people might take for granted, like taking a cup of coffee from the flight attendant or changing the channel on his video monitor.

These simple movements were lost to Mr. Jones when the fingers and thumb on his right hand were amputated three years ago. But now he has a prosthetic replacement: a set of motorized digits that can clasp cans, flimsy plastic water bottles or even thin slips of paper.

"Pouring a can of soda into a cup -- that is a mundane daily action for most people, but to me it is a very big deal," said Mr. Jones, who lives with his family in Mamaroneck, N.Y. "I slip my bionic fingers on like a glove, and then I have five moveable fingers to grasp things. It's wonderful to have regained these functions."

Mr. Jones's prosthesis, called ProDigits, is made by Touch Bionics in Livingston, Scotland. The device can replace any or all fingers on a hand; each replacement digit has a tiny motor and gear box mounted at the base. Movement is controlled by a computer chip in the prosthesis.

For the full story, see:

ANNE EISENBERG. "Novelties; Grabbing Gracefully, With Replacement Fingers." The New York Times, SundayBusiness Section (Sun., April 9, 2010): 4.

(Note: ellipses added.)

April 21, 2010

Genetically Modified Crops Provide Benefits, Scientists Say

"A Missouri corn and soybean farmer with a sample of BioTech seed corn." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B3) The report is described as the first comprehensive assessment of the impact of genetically modified crops on American farmers, who have rapidly adopted them since their introduction in 1996. The study was issued by the National Research Council, which is affiliated with the National Academy of Sciences and provides advice to the nation under a Congressional charter.

The report found that the crops allowed farmers to either reduce chemical spraying or to use less harmful chemicals. The crops also had lower production costs, higher output or extra convenience, benefits that generally outweighed the higher costs of the engineered seeds.

"That's a long and impressive list of benefits these crops can provide, and have provided to adopting farmers," David E. Ervin, the chairman of the committee that wrote the report, said on Tuesday during a webcast news conference from Washington.

For the full story, see:

ANDREW POLLACK. "Study Finds Benefits of Genetically Modified Crops But Warns of Overuse." The New York Times (Thurs., April 14, 2010): B3.

(Note: the online version of the article was dated April 13, 2010 and has the very different title "Study Says Overuse Threatens Gains From Modified Crops.")

April 1, 2010

"Real Innovation in Technology Involves a Leap Ahead"

"GAME CHANGER? After months of anticipation, Apple unveiled its iPad tablet computer last week." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 1) The more, the better. That's the fashionable recipe for nurturing new ideas these days. It emphasizes a kind of Internet-era egalitarianism that celebrates the "wisdom of the crowd" and "open innovation." Assemble all the contributions in the digital suggestion box, we're told in books and academic research, and the result will be collective intelligence.

Yet Apple, a creativity factory meticulously built by Steven P. Jobs since he returned to the company in 1997, suggests another innovation formula -- one more elitist and individual.

This approach is reflected in the company's latest potentially game-changing gadget, the iPad tablet, unveiled last week. It may succeed or stumble but it clearly carries the taste and perspective of Mr. Jobs and seems stamped by the company's earlier marketing motto: Think Different.

. . .

(p. 6) Great products, according to Mr. Jobs, are triumphs of "taste." And taste, he explains, is a byproduct of study, observation and being steeped in the culture of the past and present, of "trying to expose yourself to the best things humans have done and then bring those things into what you are doing."

His is not a product-design philosophy steered by committee or determined by market research. The Jobs formula, say colleagues, relies heavily on tenacity, patience, belief and instinct. He gets deeply involved in hardware and software design choices, which await his personal nod or veto. Mr. Jobs, of course, is one member of a large team at Apple, even if he is the leader. Indeed, he has often described his role as a team leader. In choosing key members of his team, he looks for the multiplier factor of excellence. Truly outstanding designers, engineers and managers, he says, are not just 10 percent, 20 percent or 30 percent better than merely very good ones, but 10 times better. Their contributions, he adds, are the raw material of "aha" products, which make users rethink their notions of, say, a music player or cellphone.

"Real innovation in technology involves a leap ahead, anticipating needs that no one really knew they had and then delivering capabilities that redefine product categories," said David B. Yoffie, a professor at the Harvard Business School. "That's what Steve Jobs has done."

For the full commentary, see:

STEVE LOHR. "The Apple in His Eye." The New York Times, Week in Review Section (Sun., MARCH 4, 2010): 1 & 6.

(Note: ellipsis added.)

(Note: the online version of the article is dated January 29, 2010 and had the title "Steve Jobs and the Economics of Elitism.")

March 24, 2010

The Ultimate Compliment: When Your Competitor Uses Your Product


". . . apparently a photo that was snapped from the iPhone as Ballmer brandished it above his head." Source of caption and photo:

(p. A1) REDMOND, Wash.--Microsoft Corp. employees are passionate users of the latest tech toys. But there is one gadget love that many at the company dare not name: the iPhone.

The iPhone is made, of course, by Microsoft's longtime rival, Apple Inc. The device's success is a nagging reminder for Microsoft executives of how the company's own efforts to compete in the mobile business have fallen short in recent years. What is especially painful is that many of Microsoft's own employees are nuts for the device.

The perils of being an iPhone user at Microsoft were on display last September. At an all-company meeting in a Seattle sports stadium, one hapless employee used his iPhone to snap photos of Microsoft Chief Executive Steve Ballmer. Mr. Ballmer snatched the iPhone out of the employee's hands, placed it on the ground and pretended to stomp on it in front of thousands of Microsoft workers, according to people present.

. . .

Nearly 10,000 iPhone users were accessing the Microsoft employee email system last year, say two people who heard the estimates from senior Microsoft executives. That figure equals about 10% of the company's glo-(p. A10)bal work force.

Employees at Apple, in contrast, appear to be more devoted to the company's own mobile phone. Several people who work at the company or deal regularly with employees there say they can't recall seeing Apple workers with mobile phones other than the iPhone in recent memory.

For the full story, see:

NICK WINGFIELD. "Forbidden Fruit: Microsoft Workers Hide Their iPhones; Steve Ballmer Sours on Apple Product; Work for Ford, Drive a Ford." The Wall Street Journal (Sat., MARCH 13, 2010): A1 & A10.

(Note: ellipses added.)

(Note: the online version of the article had the date MARCH 12, 2010.)

March 16, 2010

Myhrvold Innovates in Financing Innovation

"Nathan Myhrvold, chief of Intellectual Ventures, says patent holders are being treated unfairly." Source of caption and photo: online version of the NYT article quoted and cited below.

When Nathan Myhrvold was at Microsoft, he helped Bill Gates write The Road Ahead, a well-written book full of realistically optimistic speculation, forecast and analysis.

Besides his main initiative, discussed below, he has recently been in the news due to his bold and controversial suggestion for how to cheaply solve global warming.

(p. B1) BELLEVUE, Wash. -- Nathan Myhrvold wants to shake up the marketplace for ideas. His mission and the activities of the company he heads, Intellectual Ventures, a secretive $5 billion investment firm that has scooped up 30,000 patents, inspire admiration and angst.

Admirers of Mr. Myhrvold, the scientist who led Microsoft's technology development in the 1990s, see an innovator seeking to elevate the economic role and financial rewards for inventors whose patented ideas are often used without compensation by big technology companies. His detractors see a cynical operator deploying his bulging patent trove as a powerful bargaining chip, along with the implied threat of costly litigation, to prod high-tech companies to pay him lucrative fees. They call his company "Intellectual Vultures."

White hat or black hat, Intellectual Ventures is growing rapidly and becoming a major force in the marketplace for intellectual capital. Its rise comes as Congress is considering legislation, championed by large technology companies, that would make it more difficult for patent holders to win large damage awards in court -- changes that Mr. Myhrvold has opposed in Congressional testimony and that his company has lobbied against.

. . .

(p. B10) The issues surrounding Intellectual Ventures, viewed broadly, are the ground rules and incentives for innovation. "How this plays out will be crucial to the American economy," said Josh Lerner, an economist and patent expert at the Harvard Business School.

Mr. Myhrvold certainly thinks so. He says he is trying to build a robust, efficient market for "invention capital," much as private equity and venture capital developed in recent decades. "They started from nothing, were deeply misunderstood and were trashed by people threatened by new business models," he said in his offices here.

Mr. Myhrvold presents his case at length in a 7,000-word article published on Thursday in the Harvard Business Review. "If we and firms like us succeed," he writes, "the invention capital system will turbocharge technological progress, create many more new businesses, and change the world for the better."

In the article and in conversation, Mr. Myhrvold describes the patent world as a vastly underdeveloped market, starved for private capital and too dependent on federal financing for universities and government agencies, which is mainly aimed at scientific discovery anyway. Eventually, he foresees patents being valued as a separate asset class, like real estate or securities.

His antagonists, he says, are the "cozy oligarchy" of big technology companies like I.B.M., Hewlett-Packard and others that typically reach cross-licensing agreements with each other, and then refuse to deal with or acknowledge the work of inventors or smaller companies.

. . .

Mr. Myhrvold personifies the term polymath. He is a prolific patent producer himself, with more than 100 held or applied for. He earned his Ph.D. in physics from Princeton and did postdoctoral research on quantum field theory under Stephen Hawking, before founding a start-up that Microsoft acquired.

He is an accomplished French chef, who has also won a national barbecue contest in Tennessee. He is an avid wildlife photographer, and he has dabbled in paleontology, working on research projects digging for dinosaur remains in the Rockies.

For the full story, see:

STEVE LOHR. "Turning Patents Into 'Invention Capital'." The New York Times (Thur., February 18, 2010): B1 & B10.

(Note: ellipses added.)

(Note: the online version of the article is dated February 17, 2010.)

The Bill Gates book is:

Gates, Bill. The Road Ahead. New York: Viking Penguin, 1995.

Myhrvold's Harvard Business Review essay is:

Myhrvold, Nathan. "The Big Idea: Funding Eureka!" Harvard Business Review 88, no. 2 (March 2010): 40-50.

"Nathan Myhrvold with a machine that freeze-dries food. Intellectual Ventures so far has paid $315 million to individual inventors." Source of caption and photo: online version of the NYT article quoted and cited above.

March 13, 2010

"The Tech Industry's Innovation Engine Is in Idle"

(p. B10) . . . , the tech industry's innovation engine is in idle. The annual Mobile World Congress here -- traditionally a place to introduce products that blend computer and phone functions in novel ways -- has featured tweaks on existing designs.

"It's like with evolution, where you have a mutation and then a great explosion of diversity," said Scott A. McGregor, the chief executive of Broadcom, which makes chips that go into a wide range of consumer electronics. "Then, you have a period where you see which creatures can survive the big change."

For the full story, see:

ASHLEE VANCE. "At Tech Conference, the Industry Tweaks and Bets on Survivors." The New York Times (Thur., February 18, 2010): B10.

(Note: ellipsis added.)

(Note: the online version of the article is dated February 17, 2010 and has the title "Tech Industry Catches Its Breath.")

February 14, 2010

Senile Mice Benefit from Cellphone Radiation


"Mice seem to reap cognitive benefits from cellphone electromagnetism." Source of caption and photo: online version of the WSJ article quoted and cited below.

(p. D4) Alzheimer's and Cell Phones: Radiation associated with long-term cellphone use appears to protect against and reverse Alzheimer's-like symptoms in mice, according to a study in the Journal of Alzheimer's Disease. Mice genetically engineered to develop brain impairments similar to Alzheimer's in humans were divided into two groups. One group was exposed twice daily to hour-long electromagnetic fields akin to those created during cellphone use. Mice in the other group were not exposed to the radiation. After seven months, young mice in the first group fared significantly better on cognitive tests than their unexposed littermates. Older mice, which had already developed symptoms of Alzheimer's, exposed to the radiation for eight months in a subsequent experiment also performed better than older nonexposed mice. Mice, younger and older, not engineered to develop Alzheimer's also appeared to benefit from the radiation. Biopsies suggested such exposure might fight Alzheimer's by inhibiting the buildup of certain protein plaques in the brain, the researchers said.

For the full story, see:

JEREMY SINGER-VINE. "RESEARCH REPORT; NEW MEDICAL FINDINGS; Cellphone Radiation Aids Sick Mice." The Wall Street Journal (Tues., JANUARY 12, 2010): D4.

February 7, 2010

Entrepreneur Kurzweil Brought Sunshine to Stevie Wonder's Life

(p. 265) On the snowy morning of January 13, 1976, . . . , there was unusual traffic on Rogers Street. Outside the gray one-story buildings with their clouded tilt-out windows, vans from various television channels maneuvered to park. A man from the National Federation of the Blind struggled over a snow bank onto the sidewalk and began tapping earnestly to get his bearings. A dark-haired young man set out on a three-block trek to the nearest vendor of coffee and donuts for the gathering media. In the room at number 68, two engineers poked at a gray box that looked like a mimeograph machine sprouting wires to a Digital Equipment Corporation computer. Several intense young men in their early twenties debated when to begin a demonstration of the device. The short, curly-haired leader of the group, twenty-seven-year-old Raymond Kurzweil, refused to start until the arrival of a reporter from The New York Times.

The event was a press conference announcing the first breakthrough product in the field of artificial intelligence: a reader for the blind. Described as an "omnifont character recognition device" linked to a synthetic voice, the machine could read nearly any kind of book or document laid face down on its glass lens. With a learning faculty that improved the device's performance as it proceeded through blurred, faded, or otherwise illegible print, the machine solved problems of pattern recognition and synthesis that had long confounded IBM, Xerox, and the Japanese conglomerates, as well as thousands of university researchers.

. . .

(p. 266) Stevie Wonder, the great blind musician, called. He had heard about the device after its appearance on the "Today Show" and it seemed a lifelong dream come true. He headed up to Cambridge to meet with Kurzweil.

. . .

As Kurzweil remembers, "He was very excited about it and wanted (p. 267) one right away, so we actually turned the factory upside down and produced a unit that day. We showed him how to hook it up himself. He left with it practically under his arm. I understand he took it straight to his hotel room, set it up, and read all night." As Wonder said, the technology has been "a brother and a friend . . . . without question, another sunshine of my life." Wonder stayed in touch with Kurzweil over the years and would play a key role in conceiving and launching a second major Kurzweil product.


Gilder, George. Microcosm: The Quantum Revolution in Economics and Technology. Paperback ed. New York: Touchstone, 1990.

(Note: italics in original; all ellipses added except the ellipsis internal to the last paragraph, which was in the original.)

January 29, 2010

Another Boeing BHAG Takes Flight

"Members of the public watched the first test flight of the Boeing 787 on Tuesday in Everett, Wash." Source of caption and photo: online version of the NYT article quoted and cited below.

In their stimulating business best-seller Built to Last, Collins and Porras have a chapter in which they argue that one way to attract and retain the best employees is to give them a difficult but important project to work on. They call such projects "BHAGs," which stands for Big Hairy Audacious Goals. Among their main examples (e.g., p. 104) of BHAGs were Boeing's development of the 707 and 747.

Boeing's latest BHAG is the 787 Dreamliner.

(p. A25) EVERETT, Wash. -- The new Boeing 787 Dreamliner lifted into the gray skies here for the first time on Tuesday morning, more than two years behind schedule and burdened with restoring Boeing's pre-eminence in global commercial aviation.

"Engines, engines, engines, engines!" shouted April Seixeiro, 37, when the glossy twin-engine plane began warming up across from where spectators had informally gathered at Paine Field. Ms. Seixeiro was among scores of local residents and self-described "aviation geeks" who came to watch the first flight.

Moments after the plane took off at 10:27 a.m., Mrs. Seixeiro was wiping tears from her eyes. A friend, Katie Bailey, 34, cried, too.

"That was so beautiful," Ms. Bailey said.

For the full story, see:

WILLIAM YARDLEY. "As 787 Takes Flight, Seattle Wonders About Boeing's Future." The New York Times (Weds., December 16, 2009): A25.

(Note: the online version of the article has the title "A Takeoff, and Hope, for Boeing Dreamliner" and is dated December 15, 2009.)

The reference for the Collins and Porras book is:

Collins, James C., and Jerry I. Porras. Built to Last: Successful Habits of Visionary Companies. New York: HarperBusiness, 1994.

January 27, 2010

Warming of Arctic Would Allow Faster, Safer Cable Route

Source of map: online version of the Omaha World-Herald article quoted and cited below.

(p. 4A) ANCHORAGE, Alaska (AP) - Global warming has melted so much Arctic ice that a telecommunication group is moving forward with a project that was unthinkable just a few years ago: laying underwater fiber optic cable between Tokyo and London by way of the Northwest Passage.

The proposed system would nearly cut in half the time it takes to send messages from the United Kingdom to Asia, said Walt Ebell, CEO of Kodiak-Kenai Cable Co. The route is the shortest underwater path between Tokyo and London.

The quicker transmission time is important in the financial world where milliseconds can count in executing profitable trades and transactions. "Speed is the crux," Ebell said. "You're cutting the delay from 140 milliseconds to 88 milliseconds."

. . .

"It will provide the domestic market an alternative route not only to Europe - there's lots of cable across the Atlantic - but it will provide the East Coast with an alternative, faster route to Asia as well," he said.

The cable would pass mostly through U.S., Canadian, and international waters and avoid possible trouble spots along the way.

"You're not susceptible to 'events,' I should say, that you might run into with a cable that runs across Russia or the cables that run down around Asia and go up through the Suez Canal into the Mediterranean Sea. You're getting away from those choke points."

For the full story, see:

DAN JOLING, Associated Press Writer. "Loss of Arctic Ice Opens Up New Cable Route." Omaha World-Herald (Fri., January 22, 2010): 4A.

(Note: the online version of the article had the title: "Global warming opens up Arctic for undersea cable" and was dated January 21, 2010.)

(Note: ellipsis added.)

January 18, 2010

Establishments Assume New Methods Are Unsound Methods

(p. 188) For the next two years, Conway coordinated her efforts under Sutherland at PARC with Mead's ongoing work at Caltech. But she was frustrated with the pace of progress. There was no shortage of innovative design ideas; computerized design tools had advanced dramatically since Mead's first efforts several years before. Yet the industry as a whole continued in the old rut. As Conway put it later, the problem was "How can you take methods that are new, methods that are not in common use and therefore perhaps considered unsound methods, and turn them into sound methods?" [Conway's italics].

She saw the challenge in the terms described in Thomas Kuhn's popular book The Structure of Scientific Revolutions. It was the problem that took Boltzmann to his grave. It was the problem of innovation depicted by economist Joseph Schumpeter in his essays on entrepreneurship: new systems lay waste to the systems of the past. Creativity is a solution for the creator and the new ventures he launches. But it wreaks dissolution--"creative destruction," in Schumpeter's words--for the defenders of old methods. In fact, no matter how persuasive the advocates of change, it is very rare that an entrenched establishment will reform its ways. Establishments die or retire or fall in revolution; they only rarely transform themselves.


Gilder, George. Microcosm: The Quantum Revolution in Economics and Technology. Paperback ed. New York: Touchstone, 1990.

(Note: italics in original.)

January 9, 2010

Bose Leapfrogs the Competition in Defense of Your Peace and Quiet

"The Bose QuietComfort 15 has refined circuitry and redesigned earcaps." Source of caption: print version of the NYT article quoted and cited below. Source of photo: online version of the NYT article quoted and cited below.

(p. B8) . . . , if your sales are getting eaten alive by cheaper rivals, and you don't want to play the price game, you have only one option: play leapfrog. Make your gadget so much better than the me-toos that people will be willing to pay your premium once again.

That's the idea behind Bose's new QuietComfort 15 model ($300), which replaces the QuietComfort 2.

. . .

First, the QC15 model really, truly does advance the art of noise cancellation -- big time. The QC 2 headphones and my Panasonics cut the airplane roar by half. But the 15 reduced it by, say, 85 percent, leaving only a distant, whispery whoosh to remind you that you're in an aluminum tube 39,000 feet up in the air. Taking them off after a while, as you'll want to do because your ears get sweaty, is like walking into a rock concert when you've been outside the building.

For the full story, see:

DAVID POGUE. "State of the Art; Ho Ho Ho? You Won't Hear a Thing." The New York Times (Thurs., December 3, 2009): B1 & B8.

(Note: the online version of the article is "State of the Art; Bose's Latest Headphones Can Quell the Clangor" and is dated December 2, 2009.)

(Note: ellipses added.)

January 6, 2010

Replication Easier than "Sweat and Anguish" of First Discovery

(p. 137) No one will deny that Japan's triumph in semiconductors depended on American inventions. But many analysts rush on to a further theory that the Japanese remained far behind the United States until the mid- 1970s and caught up only through a massive government program of industrial targeting of American inventions by MITI.

Perhaps the leading expert on the subject is Makoto Kikuchi, a twenty-six-year veteran of MITI laboratories, now director of the Sony Research Center. The creator of the first transistor made in Japan, he readily acknowledges the key role of American successes in fueling the advances in his own country: "Replicating someone else's experiment, no matter how much painful effort it might take, is nothing compared with the sweat and anguish of the men who first made the discovery."

Kikuchi explains: "No matter how many failures I had, I knew that somewhere in the world people had already succeeded in making a transistor. The first discoverers . . . had to continue their work, their long succession of failures, face-to-face with the despairing possibility that in the end they might never succeed. . . . As I fought my own battle with the transistor, I felt this lesson in my very bones." Working at MITI's labs, Kikuchi was deeply grateful for the technological targets offered by American inventors.


Gilder, George. Microcosm: The Quantum Revolution in Economics and Technology. Paperback ed. New York: Touchstone, 1990.

(Note: ellipses in original.)

January 2, 2010

Entrepreneurial Innovation Comes from Diverse Outsiders Rather than Establishments

(p. 113) Firms that win by the curve of mind often abandon it when they establish themselves in the world of matter. They fight to preserve the value of their material investments in plant and equipment that embody the ideas and experience of their early years of success. They begin to exalt expertise and old knowledge, rights and reputation, over the constant learning and experience of innovative capitalism. They get fat.

A fat cat drifting off the curve, however, is a sitting duck for new nations and companies getting on it. The curve of mind thus tends to favor outsiders over establishments of all kinds. At the capitalist ball, the blood is seldom blue or the money rarely seasoned. Microcosmic technologies are no exception. Capitalism's most lavish display, the microcosm, is no respecter of persons.

The United States did not enter the microcosm through the portals of the Ivy League, with Brooks Brothers suits, gentleman Cs, and warbling society wives. Few people who think they are in already can summon the energies to break in. From immigrants and outcasts, street toughs and science wonks, nerds and boffins, the bearded and the beer-bellied, the tacky and uptight, and sometimes weird, the born again and born yesterday, with Adam's apples bobbing, psyches (p. 114) throbbing, and acne galore, the fraternity of the pizza breakfast, the Ferrari dream, the silicon truth, the midnight modem, and the seventy-hour week, from dirt farms and redneck shanties, trailer parks and Levittowns, in a rainbow parade of all colors and wavelengths, of the hyperneat and the sty high, the crewcut and khaki, the pony-tailed and punk, accented from Britain and Madras, from Israel and Malaya, from Paris and Parris Island, from Iowa and Havana, from Brooklyn and Boise and Belgrade and Vienna and Vietnam, from the coarse fanaticism and desperation, ambition and hunger, genius and sweat of the outsider, the downtrodden, the banished, and the bullied come most of the progress in the world and in Silicon Valley.


Gilder, George. Microcosm: The Quantum Revolution in Economics and Technology. Paperback ed. New York: Touchstone, 1990.

December 25, 2009

After Lab Accident, Chip Innovator Shima Was Resilient

The incident recounted below is from the story of the development of the 4004 microprocessor (which was the first commercially available microprocessor). Hoff and Shima played important roles in the development of the chip.

I am not sure that the main "lesson" from the incident is about the importance of details. (After all, many entrepreneurs, including Simplot, embark on big projects without a clear idea of how to accomplish the details.) A bigger and sounder lesson may be the usefulness of resilience for successful inventors and entrepreneurs.

(p. 104) Hoff's counterpart at Busicom was a young Japanese named Masatoshi Shima who also had been thinking about problems of computer architecture. An equally formidable intellect, Shima came to the project through a series of accidents, beginning with a misbegotten effort to launch a small rocket using gunpowder he made by hand in his high school chemistry laboratory. As he carefully followed the formula, he claims to have had the mixture exactly right, except for some details that he overlooked. The mixture exploded, and as he pulled away his right hand, it seemed a bloody stump. At the local hospital (p. 105) a doctor with wide experience treating combat wounds felt lucky to save the boy's thumb alone.

This ordeal taught the teen-aged Shima that "details are very important." In the future he should "pay attention to all the details." But the loss of his fingers convinced his parents--and later several key Japanese companies--that the boy should not become a chemical engineer, even though he had won his degree in chemical engineering. Thus Shima ended up at Busicom chiefly because it was run by a friend of one of his professors.


Gilder, George. Microcosm: The Quantum Revolution in Economics and Technology. Paperback ed. New York: Touchstone, 1990.

December 22, 2009

Packard Was Told, If He Wanted a Better Car "He Had Better Build It Himself"

The steering wheel of the 1954 Packard Panther. Source of photo: online version of the NYT article quoted and cited below.

(p. 11) The company may have started on a dare, according to "Packard: A History of the Motor Car and the Company," edited by Beverly Rae Kimes (Automobile Quarterly Publications, 2002).

After graduating from Lehigh University's engineering school and returning home to Warren, Ohio, James Ward Packard considered buying his first car, a Winton. When Packard asked for some special features, he got this response from Alexander Winton: "The Winton waggon (sic) as it stands is the ripened and perfected product of many years of lofty thought ... and could not be improved in any detail. If Mr. Packard wants any of his own cats and dogs worked into a waggon, he had better build it himself."

Despite the rude reply, Packard bought the car, but it broke down often. Commiserating over dinner with George Weiss, a friend (and Winton stockholder), Packard decided to take Winton's words seriously. It must have been an especially satisfying day for Packard on June 17, 1899, when Weiss sold his Winton stock and invested in Packard's new business, soon to be named the Ohio Automobile Company.

Although its first cars looked conventional, they had some unusual features. It was one of a few cars with an accelerator pedal, and its H-gate gearshift pattern, a Packard patent, was widely used in later years.

Packard's reputation for reliability and durability was established with its model A and B cars, but the company did not stop development there, even taking the lessons of early mishaps to improve subsequent vehicles.

During the summer of 1900, a model B swerved into a ditch after hitting a pothole -- a hazard on cars with tiller steering, as the impact could jerk the steering lever from the driver's grasp -- injuring the passenger and damaging the car. Packard started work on a solution; when the model C was introduced later that year, it featured the industry's first steering wheel.

. . .

After flirting with Nash in the early 1950s, Packard purchased Studebaker in 1954 (which explains why the Packard Predictor resides in the Studebaker Museum). Studebaker was larger but struggling. The merger hastened the end of both makes.

Still, Packard left its mark on the American auto industry.

For the full story, see:

ROBB MANDELBAUM. "Collecting; Packard's Visions of the Future, When It Still Had One." The New York Times, SportsSunday Section (Sun., September 10, 2009): 11.

(Note: ellipsis added.)

December 14, 2009

Gilder's Microcosm Tells the Story of the Entrepreneurs Who Made Personal Computers Possible



Many years ago Telecosm was the first George Gilder book that I read; I enjoyed it for its over-the-top verbal exuberance in detailing, praising and predicting the progress of the then-new broadband technologies. I bought his earlier Microcosm at about the same time, but didn't get around to reading it because I assumed it would be a dated read, dealing in a similar manner with the earlier personal computer (PC) technology.

In the last year or so I have read Gilder's Wealth and Poverty and Recapturing the Spirit of Enterprise. There is some interesting material in Gilder's famous Wealth and Poverty, which has sometimes been described as one of the main intellectual manifestos of the Reagan administration. But Recapturing the Spirit of Enterprise has become my favorite Gilder book (so far).

The main modus operandi of that book is to present, in each chapter, a case study of a recent entrepreneur, with plenty of interpretation of the lessons to be learned: why entrepreneurship is important to the economy, what sort of personal characteristics are common in entrepreneurs, and what government policies encourage or discourage entrepreneurs.

In that book I read that the original plan had been to include several chapters on the entrepreneurs who had built the personal computer revolution. But the original manuscript grew to an unwieldy size, and so the personal computer chapters became the basis of the book Microcosm.

So Microcosm moved to the top of my "to-read" list, and turned out to be a much less dated book than I had expected.

Microcosm does for the personal computer entrepreneurs what Recapturing the Spirit of Enterprise did for a broader set of entrepreneurs.

In the next few weeks, I will occasionally quote a few especially important examples or thought-provoking observations from Microcosm.

Reference to Gilder's Microcosm:

Gilder, George. Microcosm: The Quantum Revolution in Economics and Technology. Paperback ed. New York: Touchstone, 1990.

Other Gilder books mentioned:

Gilder, George. Recapturing the Spirit of Enterprise: Updated for the 1990s. updated ed. New York: ICS Press, 1992. (The first edition was called simply The Spirit of Enterprise, and appeared in 1984.)

Gilder, George. Telecosm: The World after Bandwidth Abundance. Paperback ed. New York: Touchstone, 2002.

Gilder, George. Wealth and Poverty. 3rd ed. New York: ICS Press, 1993.

November 4, 2009

Musings in Defense of the Car

Studebakers were made mainly in South Bend, Indiana, where I was born and raised. (One of our early family cars was a Studebaker Scotsman, but, alas, it did not look much like the Studebaker Commander that was once pictured above.)

By the way, in the musings quoted below, my understanding is that O'Rourke is not entirely right about Henry Ford: I believe Ford bankrupted two auto companies before founding the one that made the Model T that was once pictured below.

(p. W2) . . . cars didn't shape our existence; cars let us escape with our lives. We're way the heck out here in Valley Bottom Heights and Trout Antler Estates because we were at war with the cities. We fought rotten public schools, idiot municipal bureaucracies, corrupt political machines, rampant criminality and the pointy-headed busybodies. Cars gave us our dragoons and hussars, lent us speed and mobility, let us scout the terrain and probe the enemy's lines. And thanks to our cars, when we lost the cities we weren't forced to surrender, we were able to retreat.

. . .

I don't believe the pointy-heads give a damn about climate change or gas mileage, much less about whether I survive a head-on with one of their tax-sucking mass-transit projects. All they want to do is make me hate my car.

. . .

American cars have been manufactured mostly by romantic fools. David Buick, Ransom E. Olds, Louis Chevrolet, Robert and Louis Hupp of the Hupmobile, the Dodge brothers, the Studebaker brothers, the Packard brothers, the Duesenberg brothers, Charles W. Nash, E. L. Cord, John North Willys, Preston Tucker and William H. Murphy, whose Cadillac cars were designed by the young Henry Ford, all went broke making cars. The man who founded General Motors in 1908, William Crapo (really) Durant, went broke twice. Henry Ford, of course, did not go broke, nor was he a romantic, but judging by his opinions he certainly was a fool.

. . .

There are those of us who have had the good fortune to meet with strength and beauty, with majestic force in which we were willing to trust our lives.

For the full commentary, see:

P.J. O'ROURKE. "The End of the Affair. The fate of Detroit isn't a matter of economics. It's a tragic romance, whose magic was killed by bureaucrats, bad taste and busybodies. P.J. O'Rourke on why Americans fell out of love with the automobile." The Wall Street Journal (Sat., MAY 30, 2009): W1-W2.

(Note: ellipses added.)

(Note: thanks to my mother for refreshing my faulty memory on which model of Studebaker we owned.)

For some interesting brief background on Ford, see:

Nye, John Vincent. "Lucky Fools and Cautious Businessmen: On Entrepreneurship and the Measurement of Entrepreneurial Failure." In The Vital One: Essays in Honour of Jonathan R. T. Hughes, edited by Joel Mokyr. Greenwich, Conn. and London: JAI Press, 1991, pp.131-52.

October 31, 2009

Google Does Good

Source of cartoon: online version of the NYT commentary quoted and cited below.

(p. A25) . . . the vast majority of books ever written are not accessible to anyone except the most tenacious researchers at premier academic libraries. Books written after 1923 quickly disappear into a literary black hole. With rare exceptions, one can buy them only for the small number of years they are in print. After that, they are found only in a vanishing number of libraries and used book stores. As the years pass, contracts get lost and forgotten, authors and publishers disappear, the rights holders become impossible to track down.

Inevitably, the few remaining copies of the books are left to deteriorate slowly or are lost to fires, floods and other disasters. While I was at Stanford in 1998, floods damaged or destroyed tens of thousands of books. Unfortunately, such events are not uncommon -- a similar flood happened at Stanford just 20 years prior. You could read about it in The Stanford-Lockheed Meyer Library Flood Report, published in 1980, but this book itself is no longer available.

Because books are such an important part of the world's collective knowledge and cultural heritage, Larry Page, the co-founder of Google, first proposed that we digitize all books a decade ago, when we were a fledgling startup. At the time, it was viewed as so ambitious and challenging a project that we were unable to attract anyone to work on it. But five years later, in 2004, Google Books (then called Google Print) was born, allowing users to search hundreds of thousands of books. Today, they number over 10 million and counting.

. . .

In the Insurance Year Book 1880-1881, which I found on Google Books, Cornelius Walford chronicles the destruction of dozens of libraries and millions of books, in the hope that such a record will "impress the necessity of something being done" to preserve them. The famous library at Alexandria burned three times, in 48 B.C., A.D. 273 and A.D. 640, as did the Library of Congress, where a fire in 1851 destroyed two-thirds of the collection.

I hope such destruction never happens again, but history would suggest otherwise. More important, even if our cultural heritage stays intact in the world's foremost libraries, it is effectively lost if no one can access it easily. Many companies, libraries and organizations will play a role in saving and making available the works of the 20th century. Together, authors, publishers and Google are taking just one step toward this goal, but it's an important step. Let's not miss this opportunity.

For the full commentary, see:

SERGEY BRIN. "A Library to Last Forever." The New York Times (Fri., October 9, 2009): A25.

(Note: ellipses added.)

(Note: the online version is dated October 8th.)

September 24, 2009

Noble Savages Were Not So Noble

(p. A20) The idea that primitive hunter-gatherers lived in harmony with the landscape has long been challenged by researchers, who say Stone Age humans in fact wiped out many animal species in places as varied as the mountains of New Zealand and the plains of North America. Now scientists are proposing a new arena of ancient depredation: the coast.

In an article in Friday's issue of the journal Science, anthropologists at the Smithsonian Institution and the University of Oregon cite evidence of sometimes serious damage by early inhabitants along the coasts of the Aleutian Islands, New England, the Gulf of Mexico, South Africa and California's Channel Islands, where the researchers do fieldwork.

"Human influence is pretty pervasive," one of the authors, Torben C. Rick of the National Museum of Natural History, part of the Smithsonian Institution, said in an interview. "Hunter-gatherers with fairly simple technology were actively degrading some marine ecosystems" tens of thousands of years ago.

For the full story, see:

CORNELIA DEAN. "Ancient Man Hurt Coasts, Paper Says." The New York Times (Fri., August 21, 2009): A20.

September 17, 2009

Electric Mitsubishis and Nissans May Leapfrog Hybrid Toyotas

(p. B6) Both Nissan and Mitsubishi have their own reasons for rushing out an all-electric car. Having invested little in hybrids, they hope to leapfrog straight to the next technology.

. . .

"You don't see many competing technologies survive in a key market for very long," said Mr. Shimizu, the Keio University professor.

And more often than not in the history of innovation, a change in the dominant technology means a change in the market leader.

"Electric cars are a disruptive technology, and Toyota knows that," Mr. Shimizu said. "I wouldn't say Toyota is killing the electric vehicle. Perhaps Toyota is scared."

For the full story, see:

HIROKO TABUCHI. "The Electric Slide." The New York Times (Thursday, August 20, 2009): B1 & B6.

(Note: The online version of the article had the title: "Toyota, Hybrid Innovator, Holds Back in Race to Go Electric.")

(Note: ellipsis added.)

August 29, 2009

Andy Grove's Case Against the Car Bailout

(p. A13) Imagine if in the middle of the computer transformation the Reagan administration worried about the upheaval and tried to rescue this vital industry by making huge investments in leading mainframe companies. The purpose of such investments would have been to protect the viability of these companies. The effect, however, would have been to put the brakes on transformation and all but ensure that the U.S. would lose its leadership role.

The government's investment in General Motors might be directly helpful if the auto industry only had the recession to contend with. But that is not the case. The industry faces the confluence of a world-wide recession, rising fuel prices, environmental demands, globalization of manufacturing, and, most importantly, technological change involving the very nature of the automobile.

For the full commentary, see:

ANDREW S. GROVE. "What Detroit Can Learn From Silicon Valley; Vertically integrated production is a thing of the past. Will the auto industry's new overseers catch on?" Wall Street Journal (Mon., JULY 13, 2009): A13.

August 25, 2009

Wikipedia Continues to Gain Respect

(p. B5) Recognizing that the online encyclopedia Wikipedia is increasingly used by the public as a news source, Google News began this month to include Wikipedia among the stable of publications it trawls to create the site.

A visit to the Google News home page on Wednesday evening, for example, found that four of the 30 or so articles summarized there had prominent links to Wikipedia articles, including ones covering the global swine flu outbreak and the Iranian election protests.

. . .

The move by Google News was news to Wikipedia itself. Jay Walsh, a spokesman for the Wikimedia Foundation, said he learned about it by reading an online item on the subject by the Nieman Journalism Lab.

"Google is recognizing that Wikipedia is becoming a source for very up-to-date information," he said, although "it is an encyclopedia at the end of the day."

For the full story, see:

NOAM COHEN. "Google Starts Including Wikipedia on Its News Site." The New York Times (Weds., June 22, 2009): B5.

(Note: ellipsis added.)

August 19, 2009

"Established Experts Flee in Horror to All Available Caves and Cages"

(p. 96) While science and enterprise open vast new panoramas of opportunity, our established experts flee in horror to all available caves and cages, like so many primitives, terrified by freedom and change.


Gilder, George. Recapturing the Spirit of Enterprise: Updated for the 1990s. updated ed. New York: ICS Press, 1992.

August 6, 2009

"The Most Remarkable Period of Practical Inventiveness in World History"


Source of book image: online version of the WSJ review quoted and cited below.

(p. W8) There are technologies and then there are technologies. Some are trivial, such as Ziploc plastic bags. They're handy, to be sure, but they don't change the world. Some are extraordinarily simple but profound, such as the stirrup, which came along only after men had been riding horses for well over a thousand years. Nothing more than a ring of metal hung from a leather strap, the stirrup made cavalry the dominant force on the European battlefield and therefore made the mounted knight the dominant force in European society for several hundred years.

As Gavin Weightman's "The Industrial Revolutionaries" reminds us, inventions on the level of the stirrup's importance seemed to come every other month during the late 18th and 19th centuries -- what Mr. Weightman calls "the most remarkable period of practical inventiveness in world history."

When Thomas Hobbes famously wrote in the 17th century that the great majority of the population led lives that were "nasty, brutish and short," he was describing an agrarian society that was, in its essence, unchanged since the advent of agriculture about 10,000 years earlier. Ownership of land was the basis of wealth. Hobbes had no reason to think that the situation would change any time soon. But it did: A rapidly accelerating development of world-transforming technologies, subsumed under the rubric of "the Industrial Revolution," began in Britain and within 100 years had molded the modern world.

. . .

The Industrial Revolution revolutionized more than just the global economy: It transformed politics and society. A world divided between a handful of aristocrats and millions of peasants was transformed into a world dominated by the middle class, where wealth is widely distributed and the franchise universal.

For the full review, see:

JOHN STEELE GORDON. "Books; Inventing a New World; The men who engineered the astonishing emergence of the modern age." Wall Street Journal (Sat., April 11, 2009): W8.

(Note: ellipsis added.)

The book being reviewed, is:

Weightman, Gavin. The Industrial Revolutionaries: The Making of the Modern World 1776-1914. New York: Grove Press, 2009.

July 20, 2009

Durant and Studebaker Made Transition from Carriage to Car

Christensen's theory of disruptive innovation predicts that incumbents will seldom survive a major disruption. So it is interesting that Durant and Studebaker appear to have been exceptions, since they made the transition from producing carriages to producing cars. (Willie Durant founded General Motors in 1908.)

(p. 189) In 1900, fifty-seven surviving American automobile firms, out of hundreds of contenders, produced some 4,000 cars, three-quarters of which ran on steam or electricity. Companies famous for other products were entering the fray. Among them were the makers of the Pope bicycle, the Pierce birdcage, the Peerless wringer, the Buick bathtub, the White sewing machine, and the Briscoe garbage can. All vied for the market with stationary-engine makers, machine-tool manufacturers, and spinoffs of leading carriage firms, Durant and Studebaker. Among the less promising entrants seemed a lanky young engineer from Edison Illuminating Company named Henry Ford, whose Detroit Automobile Company produced twenty-five cars and failed in 1900.

. . .

(p. 191) Willie Durant, who knew all about production and selling from his carriage business, decided it was time to move into cars after several months of driving a prototype containing David Buick's valve-in-head engine--the most powerful in the world for its size--through rural Michigan in 1904. Within four years, Durant was to parlay his sturdy Buick vehicle into domination of the automobile industry, with a 25 percent share of the market in 1908, the year he founded General Motors.


Gilder, George. Recapturing the Spirit of Enterprise: Updated for the 1990s. updated ed. New York: ICS Press, 1992.

(Note: ellipsis added.)

Christensen's theory is most fully expressed in:

Christensen, Clayton M., and Michael E. Raynor. The Innovator's Solution: Creating and Sustaining Successful Growth. Boston, MA: Harvard Business School Press, 2003.

July 6, 2009

Our "Patently Absurd" Patent System

(p. A15) The Founders might have used quill pens, but they would roll their eyes at how, in this supposedly technology-minded era, we're undermining their intention to encourage innovation. The U.S. is stumbling in the transition from their Industrial Age to our Information Age, despite the charge in the Constitution that Congress "promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries."

. . .

Both sides may be right. New empirical research by Boston University law professors James Bessen and Michael Meurer, reported in their book, "Patent Failure," found that the value of pharmaceutical patents outweighed the costs of pharmaceutical-patent litigation. But for all other industries combined, they estimate that since the mid-1990s, the cost of U.S. patent litigation to alleged infringers ($12 billion in legal and business costs in 1999) is greater than the global profits that companies earn from patents (less than $4 billion in 1999). Since the 1980s, patent litigation has tripled and the probability that a particular patent is litigated within four years has more than doubled. Small inventors feel the brunt of the uncertainty costs, since bigger companies only pay for rights they think the system will protect.

These are shocking findings, but they point to the solution. New drugs require great specificity to earn a patent, whereas patents are often granted to broad, thus vague, innovations in software, communications and other technologies. Ironically, the aggregate value of these technology patents is then wiped out through litigation costs.

Our patent system for most innovations has become patently absurd. It's a disincentive at a time when we expect software and other technology companies to be the growth engine of the economy. Imagine how much more productive our information-driven economy would be if the patent system lived up to the intention of the Founders, by encouraging progress instead of suppressing it.

For the full commentary, see:

L. GORDON CROVITZ. "OPINION: INFORMATION AGE; Patent Gridlock Suppresses Innovation." Wall Street Journal (Mon., JULY 14, 2008): A15.

(Note: ellipsis added.)

July 3, 2009

Berkshire BYD Technology Bet Based on Munger's View of BYD Manager


"BOOK VALUE: Berkshire Hathaway's Charles Munger reads businesses well -- and, as a bibliophile, he goes through several books a week." Source of caricature and caption: online version of the WSJ article quoted and cited below.

At a Berkshire Hathaway annual meeting a few years ago, I remember hearing Warren Buffett say that he stays away from technology stocks because he does not know how to judge which technologies are likely to succeed in the long run. So I was a bit puzzled by the news that Berkshire Hathaway was investing in BYD, a Chinese company producing an electric car.

The passages quoted below may partially solve the puzzle: the investment in BYD was pushed by Charlie Munger and David Sokol, and was based more on a judgment about the quality of BYD's management than on the prospects for BYD's technology.

(p. C1) Mr. Munger's views have pushed Berkshire into some surprising directions. Several years ago, Mr. Munger learned of an obscure Chinese maker of batteries and automobiles called BYD Inc., which hopes to create a cheap, functional electric car.

A Chinese tech company is nothing like the shoe and underwear makers Berkshire had been buying. But Mr. Munger was enthusiastic, less about the technology than about Wang Chuanfu, who runs BYD. Mr. Wang, Mr. Munger says, is "likely to be one of the most important business people who ever lived."

Mr. Buffett was skeptical at first. But Mr. Munger persisted. David Sokol, chairman of Berkshire utility MidAmerican Energy Holdings Co., paid a visit to BYD's factory in China and agreed with Mr. Munger