Be Skeptical About the "Next Big Thing"

By Bob Seidensticker

Terry Winograd’s 1971 PhD thesis was a blockbuster. He had created a simple computer graphics world inhabited by colored boxes, pyramids, and blocks. But that wasn’t the innovation. The breakthrough was that you could give commands to this virtual environment in real, unconstrained English.

For example: “Find a block which is taller than the one you are holding and put it in the box.” Or, you could ask about its problem-solving process: “When did you pick up the green pyramid?” Or even demand that it parse sentences far too complex for everyday human conversation: “Does the shortest thing the tallest pyramid’s support supports support anything green?”

If a command was ambiguous or a question needed an answer, the program responded in English as well.

Who could see this and not imagine HAL, the computer from the book 2001: A Space Odyssey, just around the corner? And yet, 35 years later, HAL (or any practical example of advanced artificial intelligence) remains only a dream.

The computer industry is often swamped with hype over some “inevitable” technology that’s just around the corner. Ebooks. Micropayments. ISDN. We may fear being left behind by the “Next Big Thing,” but, as the example of language understanding indicates, the inevitable technology sometimes isn’t. Let’s look at a few more examples.

Japan’s Fifth Generation Computer project of the 1980s aimed to develop a powerful new type of computer and seize the computing initiative from the United States.

Japan had competed ferociously in consumer electronics and semiconductors (and, before that, in shipbuilding, steel production, and other industries). Why couldn’t it do the same with computers?

U.S. computer and software companies were alarmed, but the Fifth Generation Computer project was a flop.

The Iridium system used satellites in low Earth orbit to provide worldwide telephone service. One of its developers said, “Iridium will succeed because every time [Motorola] estimated the growth of cellular phones, we were low by a factor of four.”

Clever technology to be sure, but it was an expensive option with strong competition from cell phones. In 1998, the service was operational. In 1999, it was bankrupt. And, in 2000, it was sold for less than one percent of the $3.4 billion invested in it.

Robert Lucky, the former executive director of Research at Bell Labs, personally experienced the failure of AT&T’s videophone project: “I myself inherited one and used it for about two years. I think I had the last one in the world. Alas, there was no one left to call.” And there are many more examples. The pushback against the virtual reality hype began in the early ’90s, when one wit noted that there were more virtual reality conferences than customers.

Public key encryption was invented 30 years ago, but even now most email goes unencrypted. Computers built on Josephson junctions were an “inevitable” technology in the early 1980s. Is quantum computing inevitable in the same way?

This isn’t to say that none of these projects had an impact. However, what they delivered was far less than what was promised.

And there are more: smart homes, biometrics, telecommuting, smart cards, paperless offices, ebooks, micropayments, bubble memory, RISC, the network computer, and dozens of others, both inside and outside the IT industry. None lived up to its hype.

It’s easy to get sidetracked by the reasons these technologies failed: consumer resistance, poor economics, technical difficulties, and so on. But those reasons are beside the point.

These technologies were all big in their day. There was lots of buzz. The experts predicted great things for them. But history shows that most “inevitable” technologies aren’t. In fact, most new products fail.

The moral of the story? Be skeptical next time you hear about the “Next Big Thing.”

Bob Seidensticker is an engineer who writes and speaks on the topic of technology change. A graduate of MIT, Bob has more than 25 years of experience in the computer industry. He is author of Future Hype: The Myths of Technology Change and holds 13 software patents.