The science fiction of Apple Computer
For the so-called “Apple Faithful,” 1996 was a gloomy year. Semiconductor executive Gil Amelio was driving the company to the grave. The next-generation operating system project that would rescue the Mac from its decrepit software infrastructure had stalled and been cancelled, with the company hoping to buy its way out of the problem.
In the press, this “beleaguered” Apple was always described as on the brink of death, despite a passionate customer base and a unique cultural position. People loved the Mac, and identified with Apple even as its market share shrank.
Of course, we know what came next. Apple acquired NeXT, and Steve Jobs, nominally a “consultant,” orchestrated a board coup that ousted Amelio and most of Apple’s directors. There followed a turnaround unlike anything else in technology history, building the kernel of a business now worth $2.6 trillion.
Much has been said about the basic business hygiene that made this turnaround possible. Led by Jobs, Apple leadership slashed the complex product line to four quadrants across two axes: desktop-portable and consumer-professional. They got piled-up inventory down from months to hours. They cleaned up the retail channel and went online with a web-based build-to-order strategy.
Of course, they integrated NeXTSTEP, NeXT’s BSD-based, next-generation operating system, whose descendant underpinnings and developer tools live on to this day in every Mac, iPhone, and Apple Watch.
But none of these tasks alone was sufficient to turn Apple’s fortunes around. The halo of innovation that Apple earned put wind in its cultural sails, creating a sense of potential that drove both adoption and its stock price.
But what does “innovation” mean in practice?
Apple rides the leading edge of miniaturization, abstraction and convenience
Across the sweep from the first iMac to the latest M2 chips, Apple has been an aggressive early adopter of technologies that address familiar problems with smaller and more convenient solutions.
Apple’s turnaround coincided with a dramatic moment in hardware evolution, when components were simultaneously shrinking and gaining power. What was casually termed innovation was Apple’s relentless application of emerging technologies to genuine consumer needs and frustrations.
The USB revolution
Before USB, I/O in personal computing was miserable.
A babel of ports and specifications addressed different categories of product. Input devices needed one form of connector, printers another, external disks yet another.
These connectors were mostly low-bandwidth and finicky, often requiring that the computer be powered down entirely merely to connect or disconnect them. Perhaps the worst offender was SCSI, a high-bandwidth interface for disks and scanners, packed with rules the user had to learn and debug: individual device addresses, fiddly “termination,” and daisy-chaining off what was usually a single port per computer. You could have only a handful of SCSI devices, and some of that number was gobbled up by internals.
Originally conceived by a consortium of Wintel stakeholders, the USB standard emerged quietly to fix all of this just as Apple entered the worst of its death throes, though initial adoption was limited.
With the release of the first iMac in 1998, Apple broke with precedent, making USB the computer’s exclusive peripheral interface. Compared with previous I/O on the Mac, USB was a revelation in convenience: devices were hot-pluggable, no shutdown required. The ports and connectors were compact, yet offered 12 Mbit/s connections, supporting everything from keyboards to scanners to external drives. Devices beyond keyboards could even draw enough power from USB to skip an extra power cable. Best of all, USB hubs allowed endless expansion, up to a staggering 127 devices.
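That oddly specific ceiling isn’t marketing; it falls out of the protocol itself. As a rough illustration (the 7-bit address field is the commonly cited reason), the arithmetic looks like this:

```swift
// USB identifies each device on a bus with a 7-bit address, and address 0
// is reserved for newly attached devices that haven't been configured yet.
let addressBits = 7
let totalAddresses = 1 << addressBits   // 2^7 = 128 possible addresses
let maxDevices = totalAddresses - 1     // minus the reserved address 0 = 127
print(maxDevices)                       // 127
```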
The iMac was friendly and fresh in its design, but its everyday experience was just as stark a departure in ease of use, thanks in part to the simplicity of its peripheral story. Mac OS was retooled to make drivers easy to load as needed, eventually attempting to grab them from the web if they were missing on disk.
Meanwhile, in an unprecedented example of cross-platform compatibility, Apple’s embrace of USB created a compelling opportunity for peripheral manufacturers to target both Mac and PC users with a single SKU, reducing manufacturing and inventory complexity. In his 1999 Macworld New York introduction of the iBook, Jobs claimed that USB peripheral offerings had grown 10x, from just 25 devices at the iMac’s launch to over 250 less than a year later. Seeing the rush of consumer excitement for the iMac, manufacturers were happy to join the fray, offering hip, translucent complements to this strange computer that anyone could use.
Today, USB is ubiquitous. Every office has a drawer full of USB flash drives and cables. Desperate to make a mark, with nothing to lose, Apple went all-in on the next generation of peripheral technology, and won the day.
AirPort and WiFi
In that iBook intro, Steve’s showmanship found a dramatic flourish.
Rather than telling the audience about his “one more thing,” he showed it to them, forcing them to draw their own conclusions as they made sense of the impossible.
Loading a variety of web pages, Jobs did what is now commonplace: he lifted the iBook from its demonstration table, carried it elsewhere, and continued clicking around in his browser, session uninterrupted, no cables in sight.
The audience roared.
This magic, he explained, was the fruit of a partnership with Lucent, and the commercialization of an open networking standard, 802.11. Jobs assured us this was a technology fast heating up, but we didn’t have to wait. We could have it now, with iBook and AirPort.
Alongside the small card that let the iBook join wireless networks, Apple also offered the UFO-like AirPort base station, which interfaced with either an Ethernet network or your dial-up ISP.
Inflation-adjusted, joining the WiFi revolution cost about $700, including the base station and a single interface card. Not to mention a new computer that could host this hardware.
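For the curious, here is the rough arithmetic behind that figure, assuming the commonly cited 1999 launch prices (about $299 for the base station and $99 for the card) and an approximate inflation multiplier of 1.8 to today’s dollars:

```swift
// Back-of-the-envelope cost of joining the WiFi revolution in 1999,
// adjusted for roughly two decades of inflation. Prices and the CPI
// multiplier are approximations, not exact figures.
let baseStation1999 = 299.0    // AirPort base station, 1999 USD
let airportCard1999 = 99.0     // AirPort card, 1999 USD
let inflationMultiplier = 1.8  // rough 1999 → early-2020s factor

let costThen = baseStation1999 + airportCard1999   // ≈ $398
let costToday = costThen * inflationMultiplier     // ≈ $716
print("Then: $\(Int(costThen)), today: roughly $\(Int(costToday))")
```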
Nevertheless, this was a revolution. No longer tethered to a desk, you could explore the internet and communicate around the world anywhere in your house, without running cables you’d later trip over.
More than anything else Apple would do until 2007, the early adoption of WiFi was science fiction shit.
Miniaturized storage and the iPod
By 1999, Apple was firmly out of the woods. Profitable quarters were routine, sales were healthy, and its industrial design and marketing prowess had gripped culture at large.
But it would be the iPod that cemented Apple’s transition out of its “beleaguered” era and into its seemingly endless rise.
While iPod was a classic convergence of Apple taste and power, the technical underpinnings that made it possible were hidden entirely from the people who used it every day.
The earliest iPods used a shockingly small but complete hard drive, allowing them to pack vast amounts of music into a pocket-sized device. Before USB 2.0, Apple used FireWire, with its 400 Mbit/s throughput, to transfer MP3s rapidly from computer to pocket.
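To put those numbers in perspective, here is a back-of-the-envelope comparison of raw bus speeds (real sustained throughput was lower on both buses, and the 5 GB library is an arbitrary example, so treat the figures as illustrative):

```swift
import Foundation

// Transferring a 5 GB music library over USB 1.1 versus FireWire 400,
// using raw theoretical bus speeds.
let libraryBits = 5.0 * 8_000_000_000        // 5 GB expressed in bits
let usb11 = 12_000_000.0                     // USB 1.1: 12 Mbit/s
let firewire400 = 400_000_000.0              // FireWire 400: 400 Mbit/s

let usbMinutes = libraryBits / usb11 / 60            // ≈ 56 minutes
let firewireMinutes = libraryBits / firewire400 / 60 // ≈ 1.7 minutes
print(String(format: "USB 1.1: %.0f min, FireWire 400: %.1f min",
             usbMinutes, firewireMinutes))
```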
Whereas a portable music collection had once been a bulky accordion file packed with hour-long CDs, now even a vast collection was reduced to the size of a pack of playing cards. Navigating that collection was breezy and fully automated: a convenient database let you explore it along several dimensions, from songs to artists, albums to genres. You were no longer a janitor of your discs.
Instead of fumbling, you’d glide your thumb over a pleasing control surface, diving effortlessly through as much music as you ever cared to listen to.
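Under the hood, that experience amounts to treating music as structured metadata rather than a pile of files. Here is a minimal sketch, with hypothetical types and example data, of that kind of on-demand regrouping:

```swift
// A tiny model of metadata-driven browsing: one library, pivoted on demand
// across dimensions like artist or genre. Types and data are illustrative.
struct Track {
    let title: String
    let artist: String
    let album: String
    let genre: String
}

let library = [
    Track(title: "Hey Jude", artist: "The Beatles", album: "Past Masters", genre: "Rock"),
    Track(title: "Come Together", artist: "The Beatles", album: "Abbey Road", genre: "Rock"),
    Track(title: "So What", artist: "Miles Davis", album: "Kind of Blue", genre: "Jazz"),
]

// The same data, regrouped along whichever dimension you want to browse.
let byArtist = Dictionary(grouping: library, by: \.artist)
let byGenre = Dictionary(grouping: library, by: \.genre)

print(byArtist["The Beatles"]?.map(\.title) ?? [])  // ["Hey Jude", "Come Together"]
print(byGenre.keys.sorted())                        // ["Jazz", "Rock"]
```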
iPod was far from the first MP3 player, but most of them had far less capacity. It was not the first hard-drive-based MP3 player, either, but its competitors were far bulkier and, constrained by the narrow pipes of USB 1.1, much slower besides.
Once again, at the first opportunity to press novel technologies into service, Apple packaged them up, sold them at good margins, and made a ton of money.
Multitouch and the iPhone
For most of us, our first exposure to gestural operating systems came from Minority Report, as Tom Cruise’s cop-turned-fugitive, John Anderton, explored vast media libraries in his digital pre-crime fortress.
The technology, though fanciful, was under development in reality as well. A startup called FingerWorks, launched by academics, was puttering around trying to make a gestural keyboard. It looked weird, and the company never found commercial traction.
Nevertheless, they were exploring the edge of something pivotal, casting about though they were in the darkness. In 2005, the company was acquired by Apple in a secretive transaction.
Two years later, this gestural, multitouch technology was reborn as the iPhone’s interface.
In contrast to existing portable computers, the iPhone’s gesture recognition was fluid and effortless. Devices by Palm, along with their Windows-based competition, all relied on resistive touch screens that could detect only a single point of input: the firm stab of a pointy stylus. Even at that, they often struggled to pinpoint the precise location of the tap, and the devices were a study in frustration and cursing.
The iPhone, meanwhile, used capacitance to detect touches, and could sense quite a few of them at once. Because skin changes the electrical properties of the screen’s surface, the iPhone could track multiple fingers and use their motion relative to one another to draw conclusions about user intent.
Thus, like any pre-crime operative, we could pinch, stretch and glide our way through a new frontier of computing.
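To make that “relative motion” idea concrete, here is a minimal sketch (not Apple’s implementation) of how two tracked touch points can be reduced to a pinch-to-zoom scale factor:

```swift
// Once the hardware reports two touch points at once, a pinch gesture is
// just the ratio of the distance between them now versus when it began.
struct TouchPoint {
    let x: Double
    let y: Double
}

func distance(_ a: TouchPoint, _ b: TouchPoint) -> Double {
    ((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y)).squareRoot()
}

/// Returns a zoom factor: > 1 means the fingers spread apart (zoom in),
/// < 1 means they pinched together (zoom out).
func pinchScale(start: (TouchPoint, TouchPoint),
                current: (TouchPoint, TouchPoint)) -> Double {
    let startSpan = distance(start.0, start.1)
    let currentSpan = distance(current.0, current.1)
    guard startSpan > 0 else { return 1.0 }
    return currentSpan / startSpan
}

// Fingers that started 100 points apart and are now 150 apart → 1.5x zoom.
let scale = pinchScale(
    start: (TouchPoint(x: 100, y: 200), TouchPoint(x: 200, y: 200)),
    current: (TouchPoint(x: 75, y: 200), TouchPoint(x: 225, y: 200))
)
print(scale) // 1.5
```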
Minority Report depicted a futuristic society, with a computing interface that seemed impossibly distant. Apple delivered it to our pockets, for a mere $500, just five years later.
The future of the futuristic
Apple has continued this pattern, even as the potential for dramatic changes has stabilized. The shape of computers has remained largely constant since 2010, as has the appearance of our phones.
Nevertheless, the Apple Watch is a wildly miniaturized suite of sensors, computing power, and even cellular transceivers. It is a Star Trek comm badge and biometric sensor we can buy today.
AirPods do a remarkable amount of audio processing, even noise canceling, in an impossibly small package.
And Apple’s custom silicon, now driving even desktop and portable Macs, provides surprising performance despite low power consumption and minimal heat.
We are no longer lurching from one style of interface abstraction to the next, as we did from 1998 through 2007, trading CRTs for LCDs, hardware keyboards for multitouch screens. Still, Apple seems to be maintaining its stance of early adoption, applying the cutting edge of technology to products you can buy from its website today.
As rumors swirl about the coming augmented reality headset Apple has been cooking up, it will be interesting to see which science fiction activities they casually drop into our laps.
The challenge is formidable. Doing this well requires significant energy, significant graphics processing, and two dense displays, one for each eye. Existing players have given us clunky things that feel less futuristic than they do uncomfortable and tedious: firm anchors to the present moment in their poorly managed constraints.
But for Apple, tomorrow’s technology is today’s margins. I wonder what they’re going to sell us next.