If you ask most people to pinpoint the origin of the multitouch interface—one of the most significant revolutions in how we interact with our computers—one now-legendary event will usually come to mind: Steve Jobs’ dramatic demo of the iPhone in January 2007. In 65 riveting minutes of high-tech showmanship, many of us saw for the first time a whole suite of interactions that would become second nature to billions of people around the world: pinch-and-zoom, inertial scrolling, virtual keyboards, “springboard” home screens.
But as tempting as it is to attribute technological revolutions like multitouch to a single visionary, the story of the multitouch interface is actually a case study in how major breakthroughs are inevitably collective affairs—the aggregate work of many different inventors solving distinct problems that can often seem completely unrelated.
[Image: © Blake Patterson, CC BY 2.0, via Wikimedia Commons]
Sometimes our true software heroes are hidden because a single innovator failed to get the recognition they deserve: they never became a wealthy entrepreneur or a celebrated industry visionary. But sometimes the heroes are hidden because there are simply too many of them behind a single innovation to do them all justice.
The multitouch interface is now a routine part of daily life for billions of people around the planet, used to execute countless tasks: checking email, playing games, composing music. But it began with one very urgent task: keeping airplanes from crashing into each other.
From air traffic control to the physics lab
Sometime in the early 1960s, a British engineer at the Royal Radar Establishment named E. A. Johnson began tinkering with ideas for a new interface for air traffic controllers to manage flight paths in and out of UK airports—in addition to tracking planes for the purposes of air defense. The commercial aviation age was dawning, and the airspace around large metropolitan areas was becoming more challenging to manage. New jet planes were traveling at much higher speeds, making it even more essential that flight controllers could make quick, real-time decisions on the job.
Air traffic controllers were already used to tracking flight paths via radar screens; Johnson’s insight was that the screen itself could be used to help control those paths, not just visualize them. He built a working prototype of what is now called a “capacitive” touchscreen, with core features still used on multitouch devices more than 50 years later. Because glass cannot conduct electricity, the screen is coated with a grid of transparent conductive material, like indium tin oxide. A minuscule current flows through the grid at all times, but when a finger makes contact with the screen, some of that charge is drawn into the skin instead, and the resulting disturbance lets the device pinpoint where the touch occurred.
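The sensing idea Johnson pioneered can be illustrated with a toy simulation (all names and numbers here are invented for illustration, not taken from his design): scan a grid of capacitance readings, compare each cell against its untouched baseline, and report the cell where the signal dropped the most.

```python
def locate_touch(baseline, reading, threshold=0.1):
    """Return the (row, col) of the strongest capacitance drop, or None.

    baseline -- grid of readings with no finger present
    reading  -- current grid of readings
    """
    best = None
    best_delta = threshold  # ignore drops too small to be a real touch
    for r, (b_row, m_row) in enumerate(zip(baseline, reading)):
        for c, (b, m) in enumerate(zip(b_row, m_row)):
            delta = b - m  # a finger bleeds charge away, lowering the reading
            if delta > best_delta:
                best_delta = delta
                best = (r, c)
    return best

# Simulate a finger contacting one cell of a 4x4 sensor grid:
baseline = [[1.0] * 4 for _ in range(4)]
reading = [row[:] for row in baseline]
reading[2][1] = 0.6  # simulated finger contact
print(locate_touch(baseline, reading))  # → (2, 1)
```

A real controller would also filter noise and track touches over time, but the core loop — compare against baseline, find the disturbed cell — is the same.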
“The touch display,” Johnson explained in a paper he published in 1965, “can be considered as a means by which a computer presents a list of those 'choices' which are available to a controller at any given instant and as the means by which the controller indicates his choice back to the computer.” Johnson had primarily focused on its use in the aviation context, but he seems to have grasped that his innovation had broader applications. The device, he noted, “has the potential to provide a very efficient coupling between man and machine.”
British air traffic controllers actually integrated Johnson’s device into their workflow in the late 1960s, but the technology failed to advance much beyond that limited application. As in many stories from the history of innovation, the next big breakthrough that would lay the groundwork for the multitouch interface came out of a fortuitous accident. A few years after Johnson’s original paper was published, a physicist at the University of Kentucky named Samuel Hurst was working with a piece of equipment known as a Van de Graaff accelerator, commonly used to study charged particles. Hurst and his colleagues had to rely on sluggish, analog strip chart recorders to collect data from their experiments, which slowed down their work. One day, Hurst hit upon the idea of using electrically conductive paper to automatically record x and y coordinates from the experiments. It made the process of data analysis much faster, but in building the device, Hurst began thinking about how the same technology could be applied to the x and y coordinates of a computer monitor.
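The coordinate-reading trick at the heart of Hurst’s insight is simple to sketch (the function and constants below are invented for illustration, not from his actual hardware): drive a voltage across a conductive sheet, measure the voltage at the contact point, and scale that reading to a position along the axis.

```python
ADC_MAX = 1023  # full-scale count of a hypothetical 10-bit analog-to-digital converter

def adc_to_coord(adc_count, axis_pixels):
    """Scale a raw ADC reading from the conductive sheet to a pixel position.

    The contact point acts as the wiper of a voltage divider, so the measured
    voltage (and hence the ADC count) is proportional to position along the axis.
    """
    return round(adc_count / ADC_MAX * axis_pixels)

# A touch near the center of a hypothetical 800x600 display:
x = adc_to_coord(511, 800)
y = adc_to_coord(256, 600)
print(x, y)  # → 400 150
```

Two perpendicular measurements — one for x, one for y — are enough to recover a full screen coordinate, which is essentially how later “resistive” touchscreens worked.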
Before long, Hurst had left the Van de Graaff accelerator behind and founded a company called Elographics that produced touchscreen interfaces for computers, which he initially ran out of his basement. Hurst correctly envisioned that a touch-based interface would be a major advance in making computers more accessible. “You could just look at a screen, poke your finger, and get an answer,” Hurst, who died in 2011, told a reporter many years later. “Anybody can poke a finger!”
Hurst eventually sold his company—it continues to operate to this day as Elo Touch Solutions—and returned to his career as an academic researcher. But over the next decade or two, the idea that he had stumbled across in the lab began to spread around the world, as touchscreens slowly became a familiar experience, though the range of interaction was limited to simple, single-touch events, like typing a PIN into an ATM. A few new computing experiences—the Apple Newton and the Palm Pilot—explored the idea of direct contact with the screen using a stylus, but the vast majority of computers in circulation relied exclusively on mouse and keyboard-based interfaces.
The Apple Newton is now widely considered one of the most notorious flops in Apple’s history. Still, with hindsight, the device anticipated a number of computing developments that would become mainstream a decade later. Introduced in 1993, the Newton was a handheld device that lacked a physical keyboard—the primary input mechanism was handwriting, using a special stylus and touch-sensitive screen. Unfortunately, the Newton’s handwriting recognition software proved famously unreliable (even the newspaper comic strip Doonesbury mocked it), and the product line was ultimately retired in the late 1990s. Palm Computing had more success with its own handheld device, the Palm Pilot, which shared many features with the Newton but was less reliant on handwriting recognition.