==1960–2000==
The use of
touchscreen technology predates both multi-touch technology and the personal computer. Early synthesizer and electronic instrument builders like
Hugh Le Caine and
Robert Moog experimented with using touch-sensitive capacitance sensors to control the sounds made by their instruments.
IBM began building the first touch screens in the late 1960s. In 1972,
Control Data released the
PLATO IV computer, an infrared touch terminal used for educational purposes, which employed single-touch points in a 16×16 array. These early touchscreens registered only one point of touch at a time. On-screen keyboards (a well-known feature today) were thus awkward to use, because key rollover and holding down a shift key while typing another were not possible. Exceptions were a "cross-wire" multi-touch reconfigurable touchscreen keyboard/display developed at the
Massachusetts Institute of Technology in the early 1970s and the 16-button capacitive multi-touch screen developed at
CERN in 1972 for the controls of the
Super Proton Synchrotron, which was then under construction. In 1976, a new
x-y capacitive screen, based on the capacitance touch screens developed in 1972 by Danish electronics engineer
Bent Stumpe, was developed at
CERN. This technology, which allowed the exact location of each touch point to be determined, was used to develop a new type of
human–machine interface (HMI) for the control room of the Super Proton Synchrotron particle accelerator. In a handwritten note dated 11 March 1972, Stumpe had presented his proposed solution: a capacitive touch screen with a fixed number of programmable buttons presented on a display. The screen was to consist of a set of capacitors etched into a film of copper on a sheet of glass, each capacitor constructed so that a nearby flat conductor, such as the surface of a finger, would increase its capacitance by a significant amount. The etched copper lines were fine enough (80 μm) and sufficiently far apart (80 μm) to be invisible. In the final device, a simple lacquer coating prevented fingers from actually touching the capacitors. In the same year,
MIT described a keyboard with variable graphics capable of multi-touch detection. A 1982 system at the University of Toronto used a frosted-glass panel with a camera placed behind the glass. When a finger or several fingers pressed on the glass, the camera would detect the action as one or more black spots on an otherwise white background, allowing it to be registered as an input. Since the size of a dot was dependent on pressure (how hard the person was pressing on the glass), the system was somewhat pressure-sensitive as well. In the same year, the video-based Video Place/Video Desk system of
Myron Krueger was influential in the development of multi-touch gestures such as pinch-to-zoom, though this system had no touch interaction itself. By 1984, both Bell Labs and
Carnegie Mellon University had working multi-touch-screen prototypes, covering both input and graphics, that could respond interactively to multiple finger inputs. The Bell Labs system was based on capacitive coupling of fingers, whereas the CMU system was optical. In 1985, the canonical multi-touch pinch-to-zoom gesture was demonstrated, with coordinated graphics, on CMU's system. In October 1985,
Steve Jobs signed a non-disclosure agreement to tour CMU's Sensor Frame multi-touch lab. In 1990, Sears et al. published a review of academic research on single and multi-touch touchscreen
human–computer interaction of the time, describing single-touch gestures such as rotating knobs, swiping the screen to activate a switch (or a U-shaped gesture for a toggle switch), and touchscreen keyboards, including a study showing that users could type at 25 words per minute on a touchscreen keyboard, compared with 58 words per minute on a standard keyboard, with multi-touch hypothesized to improve the data-entry rate. Multi-touch gestures such as selecting a range of a line, connecting objects, and a "tap-click" gesture to select while maintaining location with another finger are also described. In 1991,
Pierre Wellner advanced the topic with his publications on the multi-touch "Digital Desk", which supported multi-finger and pinching motions. Various companies expanded upon these inventions at the beginning of the twenty-first century.
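At its core, the pinch-to-zoom gesture demonstrated on these early systems reduces to tracking the distance between two touch points: the ratio of the current distance to the starting distance gives a zoom scale factor. A minimal illustrative sketch (not the code of any historical system; the function name and point format are hypothetical):

```python
import math

def pinch_scale(p1_start, p2_start, p1_now, p2_now):
    """Return the zoom factor implied by two fingers moving from
    their starting positions to their current positions.

    Points are (x, y) tuples. A factor > 1 means zoom in (fingers
    spreading apart); a factor < 1 means zoom out (fingers pinching).
    """
    d_start = math.dist(p1_start, p2_start)
    d_now = math.dist(p1_now, p2_now)
    if d_start == 0:  # degenerate case: fingers started on the same spot
        return 1.0
    return d_now / d_start

# Fingers spread from 100 px apart to 200 px apart: 2x zoom in.
print(pinch_scale((0, 0), (100, 0), (-50, 0), (150, 0)))  # → 2.0
```

A real gesture recognizer would additionally smooth the input and track touch identities across frames, but the scale computation itself is this simple ratio.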
==2000–present==
Between 1999 and 2005, the company
Fingerworks developed various multi-touch technologies, including TouchStream keyboards and the iGesture Pad. In the early 2000s,
Alan Hedge, professor of human factors and ergonomics at
Cornell University, published several studies of this technology. In 2005, Apple acquired Fingerworks and its multi-touch technology. In 2004, the French start-up JazzMutant developed the
Lemur Input Device, a music controller that in 2005 became the first commercial product to feature a proprietary transparent multi-touch screen, allowing direct, ten-finger manipulation on the display. In January 2007, multi-touch technology became mainstream with the
iPhone, and in its iPhone announcement Apple even stated that it "invented multi touch". Nonetheless, both the function and the term predate the announcement and the patent requests, except in the area of capacitive mobile screens, which did not exist before Fingerworks/Apple's technology (Fingerworks filed patents from 2001 to 2005; subsequent multi-touch refinements were patented by Apple). However, the U.S. Patent and Trademark Office declared that the "pinch-to-zoom" functionality was anticipated by U.S. Patent 7,844,915, relating to gestures on touch screens, filed by
Bran Ferren and
Daniel Hillis in 2005, as was inertial scrolling, thus invalidating key claims of Apple's patent. In 2001, development began on Microsoft's table-top touch platform,
Microsoft PixelSense (formerly Surface), which interacts with both the user's touch and their electronic devices; it became commercial on May 29, 2007. Similarly, in 2001, Mitsubishi Electric Research Laboratories (MERL) began development of a multi-touch, multi-user system called
DiamondTouch. In 2008, DiamondTouch became a commercial product; it is also based on capacitance, but is able to differentiate between multiple simultaneous users, or rather between the chairs in which the users are seated or the floor pads on which they are standing. In 2007, NORTD labs
offered its open-source
multi-touch system, CUBIT. Small-scale touch devices rapidly became commonplace in 2008. The number of touch screen telephones was expected to increase from 200,000 shipped in 2006 to 21 million in 2012. In May 2015, Apple was granted a patent for a "fusion keyboard", which turns individual physical keys into multi-touch buttons.

==Applications==