Following his blockbuster biography of Steve Jobs, The Innovators is Walter Isaacson’s revealing story of the people who created the computer and the Internet. It is destined to be the standard history of the digital revolution and an indispensable guide to how innovation really happens.
What were the talents that allowed certain inventors and entrepreneurs to turn their visionary ideas into disruptive realities? What led to their creative leaps? Why did some succeed and others fail?
In his masterly saga, Isaacson begins with Ada Lovelace, Lord Byron’s daughter, who pioneered computer programming in the 1840s. He explores the fascinating personalities that created our current digital revolution, such as Vannevar Bush, Alan Turing, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee, and Larry Page.
This is the story of how their minds worked and what made them so inventive. It’s also a narrative of how their ability to collaborate and master the art of teamwork made them even more creative.
For an era that seeks to foster innovation, creativity, and teamwork, The Innovators shows how they happen.
Publisher: Simon & Schuster
Product dimensions: 6.10(w) x 9.20(h) x 1.40(d)
About the Author
Walter Isaacson, University Professor of History at Tulane, has been CEO of the Aspen Institute, chairman of CNN, and editor of Time magazine. He is the author of Leonardo da Vinci; The Innovators; Steve Jobs; Einstein: His Life and Universe; Benjamin Franklin: An American Life; and Kissinger: A Biography, and the coauthor of The Wise Men: Six Friends and the World They Made. Facebook: Walter Isaacson, Twitter: @WalterIsaacson
Date of Birth: May 20, 1952
Place of Birth: New Orleans, LA
Education: Harvard, B.A. in History and Literature, 1974; Oxford (Rhodes Scholar), M.A. in Philosophy, Politics, & Economics
Read an Excerpt
The computer and the Internet are among the most important inventions of our era, but few people know who created them. They were not conjured up in a garret or garage by solo inventors suitable to be singled out on magazine covers or put into a pantheon with Edison, Bell, and Morse. Instead, most of the innovations of the digital age were done collaboratively. There were a lot of fascinating people involved, some ingenious and a few even geniuses. This is the story of these pioneers, hackers, inventors, and entrepreneurs—who they were, how their minds worked, and what made them so creative. It’s also a narrative of how they collaborated and why their ability to work as teams made them even more creative.
The tale of their teamwork is important because we don’t often focus on how central that skill is to innovation. There are thousands of books celebrating people we biographers portray, or mythologize, as lone inventors. I’ve produced a few myself. Search the phrase “the man who invented” on Amazon and you get 1,860 book results. But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today’s technology revolution was fashioned. It can also be more interesting.
We talk so much about innovation these days that it has become a buzzword, drained of clear meaning. So in this book I set out to report on how innovation actually happens in the real world. How did the most imaginative innovators of our time turn disruptive ideas into realities? I focus on a dozen or so of the most significant breakthroughs of the digital age and the people who made them. What ingredients produced their creative leaps? What skills proved most useful? How did they lead and collaborate? Why did some succeed and others fail?
I also explore the social and cultural forces that provide the atmosphere for innovation. For the birth of the digital age, this included a research ecosystem that was nurtured by government spending and managed by a military-industrial-academic collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.
Histories can be written with a different emphasis on any of these factors. An example is the invention of the Harvard/IBM Mark I, the first big electromechanical computer. One of its programmers, Grace Hopper, wrote a history that focused on its primary creator, Howard Aiken. IBM countered with a history that featured its teams of faceless engineers who contributed the incremental innovations, from counters to card feeders, that went into the machine.
Likewise, what emphasis should be put on great individuals versus on cultural currents has long been a matter of dispute; in the mid-nineteenth century, Thomas Carlyle declared that “the history of the world is but the biography of great men,” and Herbert Spencer responded with a theory that emphasized the role of societal forces. Academics and participants often view this balance differently. “As a professor, I tended to think of history as run by impersonal forces,” Henry Kissinger told reporters during one of his Middle East shuttle missions in the 1970s. “But when you see it in practice, you see the difference personalities make.”1 When it comes to digital-age innovation, as with Middle East peacemaking, a variety of personal and cultural forces all come into play, and in this book I sought to weave them together.
The Internet was originally built to facilitate collaboration. By contrast, personal computers, especially those meant to be used at home, were devised as tools for individual creativity. For more than a decade, beginning in the early 1970s, the development of networks and that of home computers proceeded separately from one another. They finally began coming together in the late 1980s with the advent of modems, online services, and the Web. Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.
Historians of science are sometimes wary about calling periods of great change revolutions, because they prefer to view progress as evolutionary. “There was no such thing as the Scientific Revolution, and this is a book about it,” is the wry opening sentence of the Harvard professor Steven Shapin’s book on that period. One method that Shapin used to escape his half-joking contradiction is to note how the key players of the period “vigorously expressed the view” that they were part of a revolution. “Our sense of radical change afoot comes substantially from them.”2
Likewise, most of us today share a sense that the digital advances of the past half century are transforming, perhaps even revolutionizing the way we live. I can recall the excitement that each new breakthrough engendered. My father and uncles were electrical engineers, and like many of the characters in this book I grew up with a basement workshop that had circuit boards to be soldered, radios to be opened, tubes to be tested, and boxes of transistors and resistors to be sorted and deployed. As an electronics geek who loved Heathkits and ham radios (WA5JTP), I can remember when vacuum tubes gave way to transistors. At college I learned programming using punch cards and recall when the agony of batch processing was replaced by the ecstasy of hands-on interaction. In the 1980s I thrilled to the static and screech that modems made when they opened for you the weirdly magical realm of online services and bulletin boards, and in the early 1990s I helped to run a digital division at Time and Time Warner that launched new Web and broadband Internet services. As Wordsworth said of the enthusiasts who were present at the beginning of the French Revolution, “Bliss was it in that dawn to be alive.”
I began work on this book more than a decade ago. It grew out of my fascination with the digital-age advances I had witnessed and also from my biography of Benjamin Franklin, who was an innovator, inventor, publisher, postal service pioneer, and all-around information networker and entrepreneur. I wanted to step away from doing biographies, which tend to emphasize the role of singular individuals, and once again do a book like The Wise Men, which I had coauthored with a colleague about the creative teamwork of six friends who shaped America’s cold war policies. My initial plan was to focus on the teams that invented the Internet. But when I interviewed Bill Gates, he convinced me that the simultaneous emergence of the Internet and the personal computer made for a richer tale. I put this book on hold early in 2009, when I began working on a biography of Steve Jobs. But his story reinforced my interest in how the development of the Internet and computers intertwined, so as soon as I finished that book, I went back to work on this tale of digital-age innovators.
The protocols of the Internet were devised by peer collaboration, and the resulting system seemed to have embedded in its genetic code a propensity to facilitate such collaboration. The power to create and transmit information was fully distributed to each of the nodes, and any attempt to impose controls or a hierarchy could be routed around. Without falling into the teleological fallacy of ascribing intentions or a personality to technology, it’s fair to say that a system of open networks connected to individually controlled computers tended, as the printing press did, to wrest control over the distribution of information from gatekeepers, central authorities, and institutions that employed scriveners and scribes. It became easier for ordinary folks to create and share content.
The collaboration that created the digital age was not just among peers but also between generations. Ideas were handed off from one cohort of innovators to the next. Another theme that emerged from my research was that users repeatedly commandeered digital innovations to create communications and social networking tools. I also became interested in how the quest for artificial intelligence—machines that think on their own—has consistently proved less fruitful than creating ways to forge a partnership or symbiosis between people and machines. In other words, the collaborative creativity that marked the digital age included collaboration between humans and machines.
Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered. “I always thought of myself as a humanities person as a kid, but I liked electronics,” Jobs told me when I embarked on his biography. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” The people who were comfortable at this humanities-technology intersection helped to create the human-machine symbiosis that is at the core of this story.
Like many aspects of the digital age, this idea that innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar of the creativity that flourishes when the humanities and sciences interact. When Einstein was stymied while working out General Relativity, he would pull out his violin and play Mozart until he could reconnect to what he called the harmony of the spheres.
When it comes to computers, there is one other historical figure, not as well known, who embodied the combination of the arts and sciences. Like her famous father, she understood the romance of poetry. Unlike him, she also saw the romance of math and machinery. And that is where our story begins.
Table of Contents
Illustrated Timeline
Chapter 1 Ada, Countess of Lovelace
Chapter 2 The Computer
Chapter 3 Programming
Chapter 4 The Transistor
Chapter 5 The Microchip
Chapter 6 Video Games
Chapter 7 The Internet
Chapter 8 The Personal Computer
Chapter 9 Software
Chapter 10 Online
Chapter 11 The Web
Chapter 12 Ada Forever
Photo Credits
Barnes & Noble Review Interview with Walter Isaacson
"Sometimes innovation is a matter of timing," writes Walter Isaacson. "Sometimes a big idea comes along at just the moment when the technology exists to implement it." But what happens when an individual conception leaps ahead of the tools available? As early as the 1840s, Ada Lovelace and Charles Babbage had developed the concept of the Analytical Engine, a programmable device that could perform potentially limitless kinds of operations: in short, a computer.
But as Isaacson notes in his capacious and engrossing new book, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, no single inventor, no matter how inspired, could turn the Analytical Engine into the humming heart of the Information Age. That would require a century's worth of fascinatingly interconnected business plans, lab experiments, false starts, acrimonious feuds, unintended consequences, and, yes, eureka moments.
Most of all, as Isaacson shows throughout, it would require collaboration, sometimes across decades, between colleagues who sometimes were, or became, fierce competitors. In a story whose web of interconnections seems to reflect the networked nature of the world to which it would give birth, we follow figures as famous as Alan Turing and Steve Jobs and as little known to the broad public as Internet visionary J.C.R. Licklider, and watch as programming languages, transistors, semiconductors, PCs, and the Web go from far-out ideas to the fabric of the everyday.
The author, of course, is as familiar with these precincts as anyone could be: the biographer of both America's first superstar inventor, Benjamin Franklin, and its most recent, Steve Jobs, Walter Isaacson has long been immersed in the question of how technology continues to shape our world. He spoke with me recently from his home in Washington, D.C., about the path through history his research took him on and where he sees it pointing next. The following is an edited transcript of our conversation. –Bill Tipper
The Barnes & Noble Review: The Innovators is something of a departure from your previous work; most of your books have focused on a single personality or a small, cohesive group. This is really a different project.
Walter Isaacson: We biographers realize that we distort history a little bit, by making it sound like there's some guy or gal in the garage or garret with a lightbulb moment, and the world changes. Usually, creativity comes from teams, from collaboration, and the people who created the Internet and the computer and the personal computer were all part of a wonderful mix of colorful characters. So I wanted to do a book that resurrects both the women and men who invented the computer and invented the Internet, but we don't know much about them.
BNR: There are so many personalities, so many micro stories within this kind of macro narrative. I was curious to know which were the real surprises for you? You'd already dipped into a lot of this territory.
WI: Yes. I think the creative and fun spirit of the women of ENIAC was a surprise to me. Also, the role that J.C.R. Licklider, an unknown hero of the Digital Age, played. He was an "aw, shucks" Missouri guy who didn't like to claim credit, but when he was at MIT he helped create interactive computing, easy-to-use screens and interfaces, and what he called an intergalactic computer network, all because he was helping create an air defense system for the United States. But he was also creating libraries of the future, and he really envisioned a lot of what we now have with graphical computer interfaces, as well as the Internet. I didn't previously know much about Licklider. Then I realized he is awesomely cool.
BNR: You portray him also as a really admirable person.
WI: He was a perfect saint for the Digital Age, because he loved to share credit.
BNR: You also challenge our notion of what an invention is. Take the computer, for example. What counts as the computer's invention?
WI: There are a lot of great visionaries, but vision without execution is just hallucination. There was a visionary in Iowa who figured out an electronic computer circuit, but he never really got the machine built. So I've tried to give more credit to people who can execute their vision.
BNR: As you point out, the things that we think of as the computer, or software, or the microchip: in any of these cases, the "invention" is rarely the end stage that we have now. In a way, each one of these things as we know it isn't one invention but a kind of constellation of inventions.
WI: And they all go together. You have to get the microchip in order to get the personal computer, and the combination of the personal computer and the Internet is 100 times more powerful than either of those inventions alone. Great visionaries not only understand innovation but understand the relationship between different innovations and how to pull them together like, for example, creating online services that can get people with personal computers onto the Internet.
BNR: You do a lot in The Innovators to highlight the moments when a new idea came along but it met a need that wasn't necessarily being clamored for at the moment. We think of technology as arising to solve problems that are there, but you make a nice point in your discussion of the invention of the transistor, which is such a crucial turning point both in your narrative and in your history.
WI: One of the things that Steve Jobs figured out was that you have to make not only innovations but products, and you have to figure out what people want before they know they want it. Nobody knew we needed 1,000 songs in our pocket, but Jobs came up with the iPod, which was a way to mix a computer and an MP3 player and a beautiful electronic device.
He combined all those things, and when he was doing the original Macintosh, somebody on the team said, "Shouldn't we have a focus group to figure out what people want in it?" He said, "Yeah. How do people know what they want until we show them?" That's what he does with the iPod.
Likewise, a guy named Pat Haggerty at Texas Instruments, he's making transistors, but nobody ever gets up in the morning and says, "I want to go buy a transistor." So he figures out, "I can make small radios." And nobody had thought of a radio other than as a household appliance, but this guy said, "People are going to want to take it with them to the beach, or the backyard, or to a party." So he invents the transistor radio. Getting back to your earlier point, it's a wonderful combination because it happens at the same time as rock 'n' roll is invented by Elvis Presley in the late 1950s, and the rebellious music, and the ability to have your own transistor radio actually go hand-in-hand, because you don't have to just listen to what your parents want to tune their radio to in the living room. You've got your own transistor radio.
BNR: Like the iPod, the transistor isn't something that you know that you want until you see it.
WI: And then you can't live without it. You never knew you wanted it, then you can't live without it. So there are three great steps to innovation. One is coming up with the vision, the second is executing it, and the third is making a product that people want. Because even if you invent a transistor, the world doesn't change until you've made products out of it.
BNR: There's a moment in the narrative in which Texas Instruments is trying to get the license to make transistors from Bell Labs, and it's a nice turning point in the book as well. It translates, I think, one of the other stories that runs through this, which is that along with the growth of a technology and a changing perspective about the potential of computing and electronics and digital technology, there's a change in American business, to a model that really goes from being skeptical of innovation to rewarding innovation. Texas Instruments was willing to pay for the license, but Bell Labs was reluctant to sell, because they didn't think Texas Instruments would do well with it. Bell Labs didn't have a financial incentive for refusing to sell the license.
BNR: It was more a cultural thing.
WI: What makes America so innovative is that when big companies find themselves a bit flatfooted, there's always some entrepreneur who is going to disrupt their business. Bell Labs doesn't ever quite figure out what to do with the transistor it invented, but all over the country people are turning transistors into products and then turning them into microchips that create more products.
BNR: That's something that, as you point out, was a shift in thinking for a lot of people. Gordon Moore and others leave Shockley Semiconductor, and you see one Nobel Prize-winning innovator having to step aside, in a way. You also suggest this is a part of a cultural change where it's acceptable for these guys to go off and create a new company, as well as create new ideas.
WI: And what you have is the simultaneous invention of a start-up culture with venture capital coming to California. These venture capitalists don't care if you're part of a big corporation, and they don't care whether you might fail a couple of times. They're going to place bets on innovators. So people like Bob Noyce and Gordon Moore could get Arthur Rock, who was creating a venture fund, to back a whole new business, as opposed to their having to work for an East Coast corporation.
BNR: You say that venture capitalism is as important to the Digital Age as the microchip.
WI: And in some ways these things go together. If you invent both a microchip and venture capital and a start-up culture, as well as a demand for a whole lot of products that use microchips, all of that goes into a mix, and suddenly you have a new industry.
BNR: Although you reach back to the work of Charles Babbage and Ada Lovelace, much of the work you describe in programming and electronics takes place against the backdrop of the Cold War and as much of it's going on, so is the space race with the Soviet Union. If you asked people during most of the twentieth century what a scientist did, or what the most glamorous picture of a scientist was, they would have probably answered you in terms of rockets or atomic energy. Now we think of digital electronics.
WI: I think that one fascinating trend of the Digital Age is that most innovations became more personal and more social. They got co-opted by users, so that the computer becomes a personal computer, and soon you're wearing it as a watch; and secondly, you're using it for social needs, not just for research or lone creativity. I think the power both of the computer and the Internet gets distributed to each individual user, as opposed to being something that big corporations or the government have control over.
BNR: I was talking some time ago to the writer Clive Thompson, who wrote a book called Smarter Than You Think . . .
WI: I reviewed it for the New York Times and gave it a good review.
BNR: Thompson writes in the book about the idea of the digital centaur, the person whose work is done in a kind of collaboration with computing, in such a way that you have almost that kind of human-computer hybrid that really is what enables certain kinds of work to be done. You talk a lot about the possibilities for that similar kind of crossover. That is to say, rather than worrying about AI replacing people, you think that humans and computers are going to continue to evolve together.
WI: Yes. I think that was an insight that begins with Ada Lovelace, and some of the coolest people of the Digital Revolution implemented it, including Vannevar Bush, Doug Engelbart, and Alan Kay: the idea that combining computers in an intimate way with human creativity will be a lot more exciting than trying to build artificial intelligence machines that chug along without a tight partnership with humans.
BNR: There is a utopian strain in all of this, the notion that fundamentally what we are all engaged with in the development of this technology is a dream of liberation, that we can go to places we couldn't go, and we can relieve ourselves of problems that we could never solve ourselves.
WI: Vannevar Bush in 1945 talks of a machine like a personal computer that can amplify our mind. But as visionary as he was, he could not have imagined the astonishing power of personal computers and the Internet combined, where anybody, anywhere on the planet can create something in any format, from music to words to ideas to videos, and share it with anybody else on the planet, and also be able to retrieve any creative product that anybody in the world has made. This is a breathtaking leap in enabling both personal empowerment and also global creativity.
BNR: Given that, what do you get excited about when you look out at what's happening now, be it Internet based or kind of personal technology based?
WI: I think collaborative creativity is the exciting thing. People working together with crowds of people they don't know to produce interactive role-playing games, or crowd-sourced forms of storytelling. I think that there will be a merger of games, multimedia sites, and books to create whole new forms of content. I also hope that Bitcoin and other small payment systems will allow people to create things collaboratively and share the revenue in a fair way.
BNR: That last seems like an even more challenging thing, because you are also naturally interacting with the bad actions that currency itself seems to ignite in people.
WI: Yes, but Bitcoin offers so much promise for good that we've only begun to envision what cyber-currencies could do to create a new creative economy.
BNR: Speaking of Big Data and privacy concerns, did you find any of the people involved in the Digital Revolution looking forward and saying, "Well, wait a minute, this is going to mean the potential for limitless surveillance of our daily lives by governments and corporations?"
WI: No. I think that when the Internet was invented, it was designed by graduate students who were disciples of hackers and rebels, and they designed it in a way that made it hard to have top-down control. They also designed it in a way to allow anonymity. Now, both of those things can be used for good or for ill. But technological tools are only as good as we are, and that applies to the fountain pen and the telephone just as it does to the Internet.
BNR: Your description of Stewart Brand's role in fostering the ideas that underlay the Internet is fascinating.
WI: There are so many fun things that happened in the Digital Age, like The Mother of All Demos, in which counterculture gurus like Stewart Brand helped put on a show displaying the power of interactive computing being done by people like Doug Engelbart, and it becomes almost as if it were a rock show or an acid trip festival.
BNR: You can see how they were already thinking well ahead of what could be created in a factory and made the next year, and so many of the things that have been happening in only the last couple of decades were already being sketched out, in a sense.
WI: The great visionaries had a lot of fun, but you know, people like Doug Engelbart and all the people who become the heroes in the book are able to turn these visions into realities that change our lives.
BNR: One thing I wondered a lot about after reading this was how much you think there is still an unresolved tension between the inventive drive often seen here, which tends to be about people's personal obsessions, and then corporate monetization of this inventive drive, the increasing way in which we see it as being driven by products, essentially. Is there a tension there, or do we still have a healthy balance between those two things?
WI: I think there is a tension, but that's part of the balance, which is: People are doing things for their passion, including Wikipedia or creating Linux, doing it just because they want to share, and also people are doing it because they want to create a company that will make a profit. Only in America do you have this healthy balance between people doing open-source things and people doing proprietary things, and people creating new products they are trying to make money on, but they are also doing it out of pure passion.
BNR: Do you think that there is enough openness in the current culture so that things existing outside of, say, new apps for existing phones, are going to keep being developed at the same rate? Do you see any risks of a kind of closure around the devices that exist and that we're familiar with?
WI: No. I'm terribly optimistic about the flood of new ideas coming along. If you'd asked me five years ago, "Does the world need . . . What is every kid going to need next?" I never would have guessed Snapchat or WhatsApp. But those go to the fact that we're social animals, as Aristotle said.
I do think that the next wave of innovation is going to come in two particular areas. One is a better connection between the creative industries and technology, so that we have whole new forms of creativity, such as crowd-sourced novels and interactive role-playing games. Hopefully, there will be a culture of Bitcoin-like micro-payments, so people can make a little bit of money by contributing to a big collaborative project. That's the theme of the arc of the Digital Age in my book: people finding new ways to collaborate and to do things socially.
The other thing coming is, of course, the birth of more life sciences technology, you know, genetic engineering. That will demand a new model, because that's not as easy to do in your garage or basement, and it's not as easy for lone entrepreneurs . . . I mean, for sort of romantic entrepreneurs to do a start-up in something that involves, say, bioengineering.
BNR: One of the things that your book implies is that these things go together in ways that are not always predictable. The kind of dissemination of powerful computing power into the hands of individuals means that sequencing of DNA and other kinds of computational challenges for something like the biomedical world can increasingly be done on smaller and smaller scales.
WI: One great trend will be the democratization of Big Data. Once we can all tap into DNA databases and other such repositories of information, there will be probably some privacy concerns but certainly some great innovations.
October 13, 2014
Most Helpful Customer Reviews
This is a very well-written book about the history of the development of computers, with individual stories of each of the contributors to the hardware and software that preceded today's computers. It is a highly readable and fascinating series of stories of the individuals and teams that created our world of digital devices.
A very well written and very readable history of everything computer, from Ada Byron Lovelace thru to Larry Page and Sergey Brin. I learned all sorts of things I never knew about computers, the Internet and the World Wide Web. This is dense but not dull. Even if you know nothing about math, physics or coding, you can understand this (at least understand enough to 'get' it). These were a big bunch of wild and wooly guys and gals who brought us to 'now.'
This book was exactly what I expected - a history of the evolution of the computer, from theory, to early attempts, to today's computing technologies. It was easy to read with just the right amount of background on the key individuals.
If you feel the need to know the history of data processing and the internet then this is a must-read. I see this as a primer for the young student. It will fill in the gaps of your knowledge of computing and I think more importantly encourage new creative thinking. These "Innovators" didn't achieve this status by accident; they were groomed and challenged from their youth to dream, create and innovate. A few sections seem to drag just a bit but overall a worthwhile, fun, and informative book.
If you're interested in learning when and how all aspects of the computer were developed and by whom, but are not computer savvy, then this is the book to read. Even if you do not know the difference between a bit and a byte, a server and a browser, or the Web and the Internet, you'll have no problem understanding the development of the computer. The book begins in the mid-1800s with the concept of a computer by English poet Lord Byron's daughter. From there the book explains what led to the development of the first digital computer in 1945, which happened to weigh 30 tons. Subsequent chapters deal with the development of the transistor, microchip, pocket calculator, microprocessor, and Xerox PARC, which all led to the invention of the personal computer. Additional chapters explain the history behind the beginnings of Intel, Microsoft, Apple, AOL, Yahoo, and Google, as well as the development of VisiCalc, the Hayes Smartmodem, Windows 1.0, the Internet and World Wide Web, and the Mosaic browser.
Capturing a cultural revolution of the magnitude brought about by digital technology in one volume may be too much to ask. One can really only expect to capture a few snippets of the overall impact of outside-the-box innovations such as the printing press and the electric light, and the changes brought about by the advent of ubiquitous computer technology are unquestionably among those topics that are challenging to condense. The first half of The Innovators does a competent job of tracing the evolution of computer technology from Ada Lovelace and Charles Babbage up to the dawn of the Internet and the World Wide Web, with a few pieces of new information and some alternative emphasis. But the author begins to wander mid-text with a questionable discussion of pot-smoking, LSD-popping fringe characters and their impact on computer technology. This over-emphasized tale results in some later glaring omissions that weaken the second half of the book. Missing entirely is any discussion of the Space Race and its impact on the electronics evolution. Likewise, Wang is mentioned only once (without comment) and Sun Microsystems twice (likewise without any explanation of what "workstations" implies). In both cases (Wang by demonstrating the reality of a market for small computers, and Sun by showing both the usefulness of desktop computing and the power of networking), the groundwork that spurred on the geeks and hackers was around long before the development of the personal computer. Although the digital revolution has certainly had its moments of genius (especially at its start), the modern portion of the tale is more one of gross mistakes than inevitable evolution. What if Xerox had not ignored the breakthroughs made by its research division at PARC? What if AT&T had not given away the transistor and then fought tooth and nail to halt the development of networks? And what if IBM had cut a better deal with Bill Gates rather than handing him the store on a golden platter? All the parts (often, literally) were lying about in the 1950s through 1970s for hackers and geeks to pluck and use. It is a tale of mega-corporate ineptitude as much as loner-in-a-garage innovation.
Isaacson develops a number of themes (rather unevenly) throughout the book, but especially in the closing chapter:
1. The quest for digital artificial intelligence is probably misguided if not (as others have pointed out) dangerous. The emphasis in the future development of digital technology should be on computers as human assistants rather than replacements.
2. Innovation proceeds better and faster if the technologies are open and free. Largely missing from the discussion, however, is much about why shareware failed (at least pre-iPhone) and just who can claim to have supported the infants in the digital nurseries.
3. The evolution of social media in some sense fulfills the prescient insights of Ada Lovelace: computers as handmaidens of art. But do Facebook and Twitter really have the equivalent social impact of Google and Wikipedia? Time will tell. And what about the impact of Amazon and just-in-time production?
More than most histories of the computer revolution, Isaacson's The Innovators details the lives and personalities of the players involved over the technological developments themselves. That is the primary value of this book. The Innovators is very nicely illustrated with abundant images of both people and equipment. Notes are extensive, as is a good index. Richard R. Pardi, Environmental Science, William Paterson University
I was introduced to the world of PCs in late 1984. It was fascinating to read how it all happened up to that point and where we have gone since. I am thankful that I now know who these brilliant men and women are who laid the groundwork for our world today and that of our digital future. This is my second book by Mr. Isaacson, and he does an excellent job of keeping me engaged with each page I turn.
This story is good?
We read this for book club and those in our group who are IT professionals enjoyed it. I did not enjoy it at all.
Has lots of information.
It is very interesting.
I don't understand it; not worth the $.
Is this book good? Please write back to me.