The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution
Note: I wrote the first three chapters. After that, I let AI write "in a continuous style" with access to the book, and edited the result afterward.
Ada, Countess of Lovelace
Ada, Lady Lovelace, born December 1815, was the daughter of the poet Lord Byron, a defender of the Luddites, whom she never knew. Her mother tried to steer her away from the poetic inclinations of her father, and she was tutored in mathematics from an early age. Still, she kept her poetic side, calling what she loved "poetical science".
Charles Babbage developed the Difference Engine, held regular salons for London high society, and envisioned the Analytical Engine, taking inspiration from the Jacquard loom's use of punch cards.
Ada sought to work with Babbage; she thought of herself as a genius. She translated a paper on Babbage's Turin lecture about the Analytical Engine and added her own notes, which would become more famous than the translation itself.
Ada was the only one visionary enough to see that the digits on the cards could represent anything, not just numbers, as long as the machine had the operations needed to act on them. Not even Babbage had that in mind.
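A tiny sketch of that idea (mine, not from the book): the same stored digits can be treated as quantities to add or as symbols to decode, depending on the rules applied to them. The encoding below is made up for illustration.

```python
# Illustrative only: one list of digits, two interpretations.
digits = [7, 15, 15, 4]

as_sum = sum(digits)                                      # arithmetic view: 41
as_word = "".join(chr(ord("A") + d - 1) for d in digits)  # symbolic view: "GOOD" (1 -> A, 2 -> B, ...)

print(as_sum, as_word)
```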
Babbage's Analytical Engine lost sponsorship. Ada wrote further notes, and Babbage wanted her to write a comment on the lost sponsorship. She did not.
Later, she proposed a sort of partnership between them. She would help him secure financing, in return she could run the business built on the machine. But Babbage declined. Still, they remained friends.
Ada's life slowly went downhill. She published no further scientific work, became addicted to gambling, and had an affair that left her open to blackmail. She died of a painful uterine cancer in 1852 and was buried next to her poet father.
The Computer
There is a time for each innovation. Rick Rubin says: "Ideas are ripe. If you do not take them now, someone else will instead." They can also come too early, as Babbage's did; his were only properly appreciated a hundred years later.
Punch cards, vacuum tubes, relays, transistors: all ways to automate computing.
1890: using punch cards, Herman Hollerith cuts the time to tabulate the census from the previous eight years to just one. His Tabulating Machine Company later becomes part of the firm that is renamed IBM in 1924.
Alan Turing, a genius with a lonely intensity and an avid long-distance runner. Pausing during one of his runs, he imagines a literal mechanical process for attacking the Entscheidungsproblem. It is the birth of the Turing machine: a conceptual device that can compute any computable sequence.
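A minimal sketch of the concept (mine, not from the book): a head reads and writes symbols on a tape while a finite table of rules drives it. The rules below are a made-up example that appends one "1" to a unary number.

```python
# Minimal Turing machine sketch: a read/write head on a tape, driven by rules.
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", accept="halt", max_steps=1000):
    cells = defaultdict(lambda: "_", enumerate(tape))  # "_" is the blank symbol
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# (state, symbol read) -> (symbol to write, head move, next state)
increment_rules = {
    ("start", "1"): ("1", "R", "start"),  # skip over the existing 1s
    ("start", "_"): ("1", "R", "halt"),   # append one more 1, then stop
}

print(run_turing_machine(increment_rules, "111"))  # -> 1111
```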
Claude Shannon works at MIT under Vannevar Bush and later at Bell Labs, absorbing its culture of free-flowing idea exchange; he shows that Boolean logic can be implemented with relay circuits. George Stibitz builds the relay-based Model K on his kitchen table (hence the K).
Howard Aiken got no university funding, but the military said yes. He came from a rough background and was self-made; facing a task, he bore down on it like an approaching thunderstorm.
He built the Harvard Mark I. It was digital but slow: its electromechanical relays meant six seconds for a single multiplication. But it was fully automatic.
Meanwhile, in Germany, Konrad Zuse, a lanky civil engineer, constructs the Z1 (begun 1936) on his parents' living-room floor, using thin metal plates for memory and a crank handle to clock its binary flaps. Funds are scarce, parts are scavenged, his dream is clear: automatic calculation.
1941: the Z3 comes alive. The war ministry yawns; an aeronautical research institute funds it anyway. Allied bombs on Berlin destroy the Z3; in the war's final months Zuse hauls the still-unfinished Z4 over the snowy Alps on a horse cart, hides it in a Bavarian barn, and later sells it to ETH Zürich. A phoenix of German computing.
"The computer" was not one invention. Rather, it was a moment in time, a window of opportunity that many attacked, each in a different way.
Programming
There were now many machines. But were they flexible? Could you not have just one machine for everything? Enter: programs, or programmable instruction sets. Grace Hopper, a trained mathematician, pioneered subroutines and helped popularize the term bug while programming the Mark I. But the machine was slow.
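A toy illustration (not Mark I code) of why subroutines mattered: a routine is written once and reused wherever the computation recurs, instead of being re-coded at every spot.

```python
# Hypothetical example: one reusable routine instead of repeating the formula.
import math

def distance(x1, y1, x2, y2):
    """Euclidean distance, written once and called from many places."""
    return math.hypot(x2 - x1, y2 - y1)

print(distance(0, 0, 3, 4))  # 5.0
print(distance(1, 1, 4, 5))  # 5.0
```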
Jean Jennings wrote programs for the ENIAC. John Mauchly was its visionary, J. Presper Eckert its chief engineer. The two knew: stored programs were the next big step.
Von Neumann helped with Aiken's Mark I at Harvard. He wanted faster computers! In August 1944, on the platform of the Aberdeen train station, he met Herman Goldstine, the Army's liaison to the ENIAC project. He was beyond excited: Goldstine told him about an electronic computer capable of 333 multiplications per second. What took the Mark I eighty hours, the ENIAC did in less than one. He got involved. The successor would be called the EDVAC. Von Neumann disseminated and cross-pollinated the ideas, and his report on the stored-program design circulated widely. Mauchly and Eckert were angry: they did not want the design in the public domain, they wanted patents.
The Transistor
Bell Labs, December 1947: John Bardeen's quiet intuition, Walter Brattain's steady hands, William Shockley's volcanic ego. A germanium point-contact device amplifies, switches, astonishes. They toast Christmas with bad coffee; the device gets the name 'transistor', short for transfer resistor.
Shockley wants all the credit, bullies, fights over patents, and leaves for California to start his own company. Eight rebels walk out of it in 1957: Robert Noyce, Gordon Moore, Jean Hoerni … the 'Traitorous Eight' found Fairchild Semiconductor, planting the first seed of what journalists will soon label Silicon Valley.
1956: the Nobel Prize is shared by the trio; Bardeen takes it in stride, Brattain returns to quiet research, Shockley sinks into eugenics and obscurity.
Tiny sand switches replace glass tubes; reliability soars, heat drops, size shrinks. The electronic era downsizes. What made it possible? Not one person; the collaboration of many did.
The Microchip
Texas, summer 1958, air conditioners wheeze. Jack Kilby, newly hired at TI, stays while colleagues vacation, carves a complete oscillator on one slice of germanium. 'Monolithic idea' scribbled in notebook.
Half a continent away, in Mountain View, January 1959: Robert Noyce, building on Jean Hoerni's planar process, links components with aluminum traces deposited like frost on silicon. No dangling wires, no hand-soldered chaos, just lithographed logic.
Microchips make small computers possible. And there are customers: In space, every gram counts!
1961: first customers, the Apollo guidance computer and the Minuteman missile. 1965: Moore plots a graph and notes the breathtaking slope: transistors per chip doubling every year, later revised to every two. Journal editors shrug, hobbyists rejoice; 'Moore's Law' becomes the drumbeat of progress.
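A back-of-the-envelope sketch (mine, not from the book) of what doubling every two years compounds to; the starting figure is the oft-cited transistor count of the Intel 4004, and the projection is only illustrative.

```python
# Moore's Law as compound doubling: count(t) = count(0) * 2 ** (years / period).
def transistors(start_count, years, doubling_period=2.0):
    return start_count * 2 ** (years / doubling_period)

# ~2,300 transistors (Intel 4004, 1971) projected forward 40 years:
print(f"{transistors(2_300, 40):,.0f}")  # ~2.4 billion, roughly the 2011 ballpark
```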
1971: Intel 4004, a four-bit brain on a fingernail. 1974: the 8080. 1978: the 8086. Calculators, digital watches, early video terminals: they all ride the microchip wave.
Video Games
MIT, 1961–62. A PDP-1 in a midnight-lit lab. Steve 'Slug' Russell, Martin Graetz, Wayne Wiitanen, science-fiction addicts, code Spacewar!, a cornerstone of hacker culture. A gravity well, dueling spaceship sprites, a canopy of plotted stars. Students queue, coffee percolates; interactive computing is suddenly fun.
1971: Galaxy Game at Stanford, 25 cent slug buys 90 seconds of dogfight.
Nolan Bushnell, Utah engineer turned carnival barker, repackages the thrill as Computer Space (1971), then simplifies to PONG (Atari, 1972). Bars, bowling alleys, pizza joints fill with wood grain cabinets, a chime of quarters funding research and development.
Home consoles: Ralph Baer's Magnavox Odyssey (1972), Atari 2600 (1977), cartridges swap worlds, children pilot pixels after homework.
1978–81: Space Invaders, Pac-Man, Donkey Kong: arcade loops, new icons. Games teach a generation to program without their realizing it.
The Internet
1962: J. C. R. Licklider writes of an 'Intergalactic Computer Network' and of online communities. Paul Baran (RAND) and Donald Davies (NPL) converge on the same cure: packet switching, messages chopped into reorderable shards.
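A toy sketch (mine, not from the book) of the packet-switching idea: chop a message into numbered packets, let the network deliver them in any order, and reassemble at the far end. Packet size and contents are arbitrary.

```python
# Illustrative only: split, shuffle (to mimic out-of-order delivery), reassemble.
import random

def to_packets(message, size=8):
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("Messages travel as chopped, reorderable shards.")
random.shuffle(packets)     # the network may deliver packets in any order
print(reassemble(packets))  # the receiver restores the original message
```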
1969, 29 October, 22:30. Student Charley Kline types 'LOGIN' from UCLA to SRI. Only 'LO' arrives before the remote system crashes. By the end of the year, four nodes chatter routinely.
Ray Tomlinson, 1971, bolts email onto the network, chooses the @ sign 'because it doesn't appear in names.'
1973–74: Vinton Cerf and Bob Kahn forge TCP/IP: handshakes, error checks, architecture-agnostic. 1 January 1983: the big flag-day switchover, ARPANET speaks TCP/IP; the name 'Internet' sticks.
Jon Postel keeps RFCs in a blue binder, assigns domain names from USC offices. NSFNet (1986) opens pipes to universities, commercial use still frowned upon, but inevitable. By 1990 ARPANET itself shuts down, its child grown too large.
The Personal Computer
January 1975: Popular Electronics unveils the Altair 8800, an array of red LEDs and toggle switches, no screen, no keyboard. Hobbyists smell the future. Bill Gates and Paul Allen hack together a BASIC interpreter in a few frantic weeks; Allen flies to Albuquerque with the paper tape, and it runs on the first try. MITS sells 10,000 kits; Microsoft is born.
The Homebrew Computer Club gathers in a Menlo Park garage: Lee Felsenstein fields questions, Bob Marsh shows off his Processor Technology boards, Steve Wozniak arrives with a single-board marvel. Jobs sells his VW microbus to help fund the Apple I (1976). Apple II (1977): color graphics, plastic case, instant boot.
VisiCalc (1979) transforms PCs into balance sheet engines; accountants justify buying Apples 'for the spreadsheet alone.'
1981: the IBM PC, model 5150, Intel 8088 inside, DOS licensed from Microsoft; the open architecture fosters a legion of 'clones.'
1984: Macintosh premieres with Ridley Scott's Super Bowl ad. Mouse, icons, WYSIWYG, ideas pilfered from Xerox PARC, skinned with calligraphy and charisma. GUI becomes gospel.
Software
1976: Gates's 'Open Letter to Hobbyists' argues that unpaid copying kills the incentive to write good software. Software morphs from free add-on to main commodity.
Richard Stallman, beard flowing, launches GNU (1983), 'GNU's Not Unix', vows four freedoms. 1985: Free Software Foundation. GPL license (1989) infects derivatives with openness.
Unix splinters: AT&T System V, BSD at Berkeley, SunOS, HP-UX, a culture of ports and patches.
1991: Linus Torvalds, a student in Helsinki, posts about a 'just for fun' kernel. Within months, thousands contribute patches for drivers and filesystems. Linux marries the GNU tools; servers hum.
1997: Eric Raymond pens 'The Cathedral and the Bazaar.' 1998: Netscape open-sources its browser code as Mozilla, and the buzzword 'open source' takes hold.
Online
Early hosts charged by the hour. CompuServe opens forums and email, priced for engineers, embraced by hobbyists. Steve Case repackages it all for ease of use; America Online's disks spill from magazines. "You've got mail!" greets novices, chat rooms mingle millions, paving the culture for the wider net.
Academic pipes widen: Larry Landweber shepherds CSNET (1981), linking universities beyond defense labs. NSF funds backbone; usage jumps. Jon Postel, 'God of numbers,' assigns domains by hand, maintains RFC wisdom. Standards triumph over speed; TCP/IP remains the common tongue.
The Web
CERN, 1989. Tim Berners-Lee sketches a web of nodes and links. 1990: the first page is served from a NeXT cube, and Robert Cailliau evangelizes inside Geneva's tunnels. Gift to the world: April 1993, the code is released free.
Illinois, 1993. Marc Andreessen and Eric Bina code Mosaic: inline images, point and click. Students download, professors link syllabi; traffic multiplies monthly.
1994: Netscape spins out; Navigator dominates; the 1995 IPO stuns Wall Street. Microsoft retaliates, bundling Internet Explorer into Windows. Browser wars ensue; standards are shepherded by the W3C.
Larry Page and Sergey Brin at Stanford (1996) rank pages by the links pointing to them; Google search is clean and fast. Wikipedia launches (2001); open collaboration proves the power of the commons Berners-Lee envisioned.
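A rough sketch (my simplification, not Google's actual algorithm or code) of the link-ranking idea: a page matters if pages that matter link to it. The graph and damping value below are made up for illustration.

```python
# Simplified PageRank-style iteration on a tiny hypothetical link graph.
links = {              # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85
ranks = {page: 1 / len(links) for page in links}

for _ in range(50):    # iterate until the scores settle
    new_ranks = {}
    for page in links:
        incoming = sum(ranks[src] / len(links[src])
                       for src in links if page in links[src])
        new_ranks[page] = (1 - damping) / len(links) + damping * incoming
    ranks = new_ranks

print({page: round(score, 3) for page, score in sorted(ranks.items())})
```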
Ada Forever
Isaacson closes the loop: Ada's 1843 dream of "poetical science" flowers in every smartphone, where circuits, code, and networks merge. Steve Jobs unveils the iPhone (2007): data, voice, music in one palm.
Yet the lesson stays: collaboration over the lone genius. Open standards and cross-disciplinary teams of mathematicians, tinkerers, and poets keep innovation alive. The baton passes again: the next breakthroughs will belong to those who couple beauty with bytes, logic with imagination. Ada's spirit, forever.