Augmented reality has been explored for many uses, including education and business. Some of the earliest cited examples range from augmented reality used to support surgery, by providing virtual overlays to guide medical practitioners, to AR content for astronomy and welding. Example application areas described below include archaeology, architecture, commerce, and education.
==Education and training==
AR for education and training can overlay 3D models and step-by-step guidance in real settings (e.g., anatomy, maintenance); systematic reviews report learning benefits alongside design and implementation caveats that vary by context and task.
==Navigation and maps==
Augmented reality navigation overlays route guidance or hazard cues onto the real scene, typically via smartphone "live view" or in-vehicle head-up displays. Research finds AR can improve wayfinding and driver situation awareness, but human-factors trade-offs (distraction, cognitive load, occlusion) matter for safety-critical use.
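As a concrete illustration of route-guidance overlays, the sketch below (function and threshold are hypothetical, not from any shipping product) decides which arrow cue an AR navigation view might render, given the user's compass heading and the bearing to the next waypoint:

```python
def turn_cue(current_heading_deg, bearing_to_waypoint_deg):
    """Return a simple AR guidance cue from the signed heading error.

    Headings are compass degrees (0-360). The 10-degree tolerance for
    "straight" is an illustrative choice, not a standard value.
    """
    # Wrap the difference into the range (-180, 180].
    error = (bearing_to_waypoint_deg - current_heading_deg + 540) % 360 - 180
    if abs(error) < 10:
        return "straight"
    return "right" if error > 0 else "left"
```

The wrap-around arithmetic matters: heading 350° with a waypoint at bearing 10° is a small right turn, not a 340° left turn.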
See also: Head-up display, Automotive navigation system, Wayfinding

==Commerce==
In 2021,
iBite was one of the first iOS applications to integrate
Apple's ARKit and RealityKit Swift frameworks for interactive augmented reality digital ordering. iBite allows users to view 3D models of their food before ordering, and allows merchants to upload their own USDZ files, which they can generate using iBite's patented photogrammetry software. In 2018,
Apple announced USDZ, a file format based on
Universal Scene Description from Pixar, which allows 3D objects to be viewed in AR on iPhones and iPads with iOS 12. Apple has created an AR QuickLook Gallery that allows people to experience augmented reality through their own Apple device. In 2018,
Shopify, the Canadian e-commerce company, announced AR Quick Look integration. Merchants can upload 3D models of their products, and shoppers can tap on the models inside the Safari browser on their iOS devices to view them in their real-world environments. AR technology is used by furniture retailers such as
IKEA,
Houzz, and
Wayfair. These retailers offer apps that allow consumers to view their products in their home prior to purchasing anything. In 2017,
IKEA announced the IKEA Place app. It contains a catalogue of over 2,000 products (nearly the company's full collection of sofas, armchairs, coffee tables, and storage units) that customers can place anywhere in a room with their phone. The app made it possible to view 3D, true-to-scale models of furniture in the customer's living space. IKEA had observed that its customers were shopping in stores less often and making fewer direct purchases. Shopify's acquisition of Primer, an AR app, aims to push small and medium-sized sellers toward interactive AR shopping, with easy-to-use AR integration and user experience for both merchants and consumers. AR helps the retail industry reduce operating costs: merchants upload product information to the AR system, and consumers can use mobile devices to search the catalogue and view products in 3D.
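The merchant-upload and consumer-search flow described above can be sketched as a minimal catalog service. All names here are hypothetical; a real system would store and serve USDZ models to an AR viewer rather than plain strings:

```python
class ARCatalog:
    """Toy stand-in for an AR commerce backend."""

    def __init__(self):
        self._products = []

    def upload(self, merchant, name, model_file):
        # Merchants register a product together with a 3D model reference.
        self._products.append(
            {"merchant": merchant, "name": name, "model": model_file}
        )

    def search(self, keyword):
        # Consumers search by keyword; each match carries the model file
        # an AR view could then load and place in the room.
        return [p for p in self._products
                if keyword.lower() in p["name"].lower()]
```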
==Surgery==
One of the first applications of augmented reality was in healthcare, particularly to support the planning, practice, and training of surgical procedures. As far back as 1992, enhancing human performance during surgery was a formally stated objective when building the first augmented reality systems at U.S. Air Force laboratories. Examples include visualizing the position of a tumor in the video of an
endoscope, or radiation exposure risks from X-ray imaging devices. AR can enhance viewing a
fetus inside a mother's
womb. Siemens, Karl Storz and IRCAD have developed a system for
laparoscopic liver surgery that uses AR to view sub-surface tumors and vessels. Guidance overlays and image fusion support planning and intraoperative visualization across several specialties; reviews note accuracy/registration constraints and workflow integration issues. The
HoloLens is capable of displaying images for image-guided surgery. As augmented reality advances, it finds increasing applications in healthcare. Augmented reality and similar computer-based utilities are being used to train medical professionals. In healthcare, AR can be used to provide guidance during diagnostic and therapeutic interventions, e.g. during surgery. Magee et al., for instance, describe the use of augmented reality for medical training in simulating ultrasound-guided needle placement. Recently, augmented reality began seeing adoption in
neurosurgery, a field that requires heavy amounts of imaging before procedures.
Smartglasses can be incorporated into the operating room to aid in surgical procedures, possibly displaying patient data conveniently while overlaying precise visual guides for the surgeon. Augmented reality headsets like the
Microsoft HoloLens have been theorized to allow for efficient sharing of information between doctors, in addition to providing a platform for enhanced training. While mixed reality has considerable potential for enhancing healthcare, it also has some drawbacks.
==Flight training==
Building on decades of perceptual-motor research in experimental psychology, researchers at the Aviation Research Laboratory of the
University of Illinois at Urbana–Champaign used augmented reality in the form of a flight path in the sky to teach flight students how to land an airplane using a flight simulator. An adaptive augmented schedule in which students were shown the augmentation only when they departed from the flight path proved to be a more effective training intervention than a constant schedule. Flight students taught to land in the simulator with the adaptive augmentation learned to land a light aircraft more quickly than students with the same amount of landing training in the simulator but with constant augmentation or without any augmentation.
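The adaptive schedule described above can be contrasted with the constant schedule in a few lines. This is a sketch of the decision rule only; the function names and the 5 m tolerance are illustrative, not taken from the study:

```python
def constant_schedule(deviation_m):
    # Constant schedule: the flight-path overlay is always shown.
    return True

def adaptive_schedule(deviation_m, tolerance_m=5.0):
    # Adaptive schedule: show the flight-path overlay only when the
    # student has departed from the desired path by more than a
    # tolerance, so guidance appears exactly when it is needed.
    return abs(deviation_m) > tolerance_m
```

The study's finding, in these terms, is that gating the overlay on error (the second rule) transferred better to unaugmented landings than leaving it on permanently.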
==Military==
The first augmented reality system that integrated haptic 3D input was the
Virtual Fixtures platform, which was developed in 1992 by Louis Rosenberg at the
Armstrong Laboratories of the
United States Air Force. It enabled human users to control
robots in real-world environments using a haptic controller. Published studies showed that by introducing virtual objects into the real world, significant performance increases could be achieved by human operators. An interesting early application of AR occurred when
Rockwell International created video map overlays of satellite and orbital debris tracks to aid in space observations at the Air Force Maui Optical System. In their 1993 paper "Debris Correlation Using the Rockwell WorldView System", the authors describe the use of map overlays applied to video from space surveillance telescopes. The map overlays indicated the trajectories of various objects in geographic coordinates. This allowed telescope operators to identify satellites, and also to identify and catalog potentially dangerous space debris. Starting in 2003, the US Army integrated the SmartCam3D augmented reality system into the Shadow Unmanned Aerial System to aid sensor operators using telescopic cameras to locate people or points of interest. The system combined fixed geographic information, including street names, points of interest, airports, and railroads, with live video from the camera system. The system offered a "picture in picture" mode that allowed it to show a synthetic view of the area surrounding the camera's field of view. This helped solve a problem in which the field of view is so narrow that it excludes important context, as if "looking through a soda straw". The system displayed real-time friend/foe/neutral location markers blended with live video, providing the operator with improved situational awareness. Combat reality can be simulated and represented using complex, layered data and visual aids, most of which are
head-mounted displays (HMD), which encompass any display technology that can be worn on the user's head. Military training solutions are often built on
commercial off-the-shelf (COTS) technologies, such as
Improbable's synthetic environment platform, Virtual Battlespace 3 and VirTra, with the latter two platforms used by the
United States Army. VirTra is used by both civilian and military law enforcement to train personnel in a variety of scenarios, including active shooter, domestic violence, and military traffic stops. In 2017, the U.S. Army was developing the Synthetic Training Environment (STE), a collection of technologies for training purposes that was expected to include mixed reality. STE was still in development without a projected completion date. Some recorded goals of STE included enhancing realism and increasing both simulation training capabilities and STE availability to other systems. It was claimed that mixed-reality environments like STE could reduce training costs, such as reducing the amount of
ammunition expended during training. In 2018, it was reported that STE would include representation of any part of the world's terrain for training purposes. STE would offer a variety of training opportunities for squad, brigade, and combat teams, including
Stryker, armory, and infantry teams. Researchers at the USAF Research Lab (Calhoun, Draper et al.) found an approximately two-fold increase in the speed at which UAV sensor operators found points of interest using this technology. This ability to maintain geographic awareness quantitatively enhances mission efficiency. The system is in use on the US Army RQ-7 Shadow and the MQ-1C Gray Eagle Unmanned Aerial Systems. In combat, AR can serve as a networked communication system that renders useful battlefield data onto a soldier's goggles in real time. From the soldier's viewpoint, people and various objects can be marked with special indicators to warn of potential dangers. Virtual maps and 360° camera imaging can also be rendered to aid a soldier's navigation and battlefield perspective, and this can be transmitted to military leaders at a remote command center. The combination of 360° camera visualization and AR can be used on board combat vehicles and tanks as a circular review system. AR can be an effective tool for virtually mapping out the 3D topologies of munitions storage sites in the terrain, including the choice of munitions combinations in stacks, the distances between them, and a visualization of risk areas. The scope of AR applications also includes visualization of data from embedded munitions monitoring sensors. The LandForm software was also test flown at the Army
Yuma Proving Ground in 1999. In the accompanying photo, map markers indicating runways, the air traffic control tower, taxiways, and hangars can be seen overlaid on the video.
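The overlays LandForm and the WorldView system describe amount to projecting known geographic coordinates into the video frame. The sketch below is a deliberately simplified model (nadir-looking, north-up camera; the scale and frame size are assumed values, not from the cited systems):

```python
def geo_to_pixel(lat, lon, center_lat, center_lon,
                 deg_per_px=0.001, width=640, height=480):
    """Map a geographic coordinate to video pixel coordinates under a
    simplified nadir-looking, north-up camera model.

    Returns None when the point falls outside the frame, which mirrors
    the narrow "soda straw" field of view that synthetic overlays and
    picture-in-picture modes help mitigate.
    """
    # Longitude grows to the right (+x); latitude grows upward, so it
    # maps to decreasing pixel row (-y).
    x = width / 2 + (lon - center_lon) / deg_per_px
    y = height / 2 - (lat - center_lat) / deg_per_px
    if 0 <= x < width and 0 <= y < height:
        return (round(x), round(y))
    return None
```

A real system would also account for camera attitude, lens distortion, and terrain elevation; the point here is only the coordinate-to-pixel mapping that lets a runway label stick to the runway in live video.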
==Industrial environments==
In industrial environments, augmented reality is proving to have a substantial impact, with use cases emerging across all aspects of the product lifecycle: from product design and new product introduction (NPI), to manufacturing, to service and maintenance, to material handling and distribution. For example, labels can be displayed on parts of a system to clarify operating instructions for a mechanic performing maintenance. Assembly lines have benefited from the use of AR. In addition to Boeing, BMW and Volkswagen are known for incorporating this technology into assembly lines to monitor process improvements. Large machines are difficult to maintain because of their multiple layers and structures. AR permits people to look through the machine as if with an X-ray, pointing them to the problem right away.
==Functional mockup==
Augmented reality can be used to build
mockups that combine physical and digital elements. With the use of
simultaneous localization and mapping (SLAM), mockups can interact with the physical world to gain control of more realistic sensory experiences like
object permanence, which would normally be infeasible or extremely difficult to track and analyze without the use of both digital and physical aids.
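Why SLAM yields object permanence can be shown with a small pose transform: the virtual object's world coordinates never change, and each new camera pose estimated by SLAM simply re-expresses them in the camera's frame. A 2D sketch with hypothetical names:

```python
import math

def world_to_camera(obj_xy, cam_xy, cam_yaw_rad):
    """Express a world-anchored point in the camera's own frame.

    Re-running this transform with every tracked pose keeps a virtual
    object pinned to the same real-world spot as the device moves,
    which is the essence of AR object permanence.
    """
    # Translate into the camera-centered frame, then rotate by the
    # inverse of the camera's yaw.
    dx = obj_xy[0] - cam_xy[0]
    dy = obj_xy[1] - cam_xy[1]
    c, s = math.cos(-cam_yaw_rad), math.sin(-cam_yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy)
```

Production SLAM systems do this in full 3D with 6-degree-of-freedom poses, but the principle is the same: the anchor lives in world coordinates, and only the viewing transform changes per frame.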
==Translation==
AR applications such as
Word Lens can interpret the foreign text on signs and menus and, in a user's augmented view, re-display the text in the user's language. Spoken words of a foreign language can be translated and displayed in a user's view as printed subtitles.
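The replacement step in a Word Lens-style pipeline can be sketched as follows. OCR is assumed to have already recognized the text in the camera frame; the dictionary here is a stand-in for a real translation service, and all names are hypothetical:

```python
def translate_overlay(detected_text, lexicon):
    """Swap recognized words for the user's language, word by word.

    The returned string is what the AR view would re-render in place
    of the original sign or menu text. Unknown words pass through
    unchanged rather than failing.
    """
    return " ".join(lexicon.get(word.lower(), word)
                    for word in detected_text.split())
```

A shipping application additionally needs to erase the original text, match font size and perspective, and track the sign across frames, but those are rendering concerns layered on top of this core substitution.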
==Human-in-the-loop operation of robots==
Recent advances in mixed-reality technologies have renewed interest in alternative modes of communication for human-robot interaction. Human operators wearing augmented reality headsets such as
HoloLens can interact with (control and monitor), for example, robots and lifting machines on site in a digital factory setup. This use case typically requires real-time data communication between the mixed reality interface and the machine, process, or system, which could be enabled by incorporating
digital twin technology.

Consumers want to use augmented reality glasses to change their surroundings into something that reflects their own personal opinions; some would even like to modify their surroundings by erasing street signs, billboard ads, and uninteresting shopping windows. Around two in five want to change the way their surroundings look and even how people appear to them.

==Apps==