In the near future we may enrich our perception of reality through revolutionary virtual augmentation. Augmented reality (AR) technologies offer an enhanced perception to help us see, hear, and feel our environments in new and enriched ways that will benefit us in fields such as education, maintenance, design and reconnaissance, to name but a few. This essay describes the field of AR, including its definition, its development history, its enabling technologies and the technological problems that developers need to overcome. To give an idea of the state of the art, some recent applications of AR technology are also discussed, as well as a number of known limitations regarding human factors in the use of AR systems.

1 Introduction

Imagine a technology with which you could see more than others see, hear more than others hear, and perhaps even touch, smell and taste things that others cannot. What if we had technology to perceive completely computational elements and objects within our real-world experience, even entire creatures and structures, that help us in our daily activities, while interacting almost unconsciously through mere gestures and speech?

With such technology, mechanics could see instructions on what to do next when repairing an unknown piece of equipment, surgeons could see ultrasound scans of organs while performing surgery on them, firefighters could see building layouts to avoid otherwise invisible hazards, soldiers could see the positions of enemy snipers spotted by unmanned reconnaissance aircraft, and we could read reviews for each restaurant in the street we're walking in, or battle 10-foot-tall aliens on the way to work (Feiner, 2002).

Augmented reality (AR) is such a technology: it creates a next-generation, reality-based interface (Jacob, 2006) and in fact already exists, moving from laboratories around the world into various industries and consumer markets. AR supplements the real world with virtual (computer-generated) objects that appear to coexist in the same space as the real world. According to Technology Review (http://www.techreview.com/special/emerging/), AR is recognised at MIT as one of ten emerging technologies of 2007, and we are on the verge of embracing this very new and exciting kind of human-computer interaction.

For anyone who is interested and wants to get acquainted with the field, this essay provides an overview of important technologies, applications and limitations of AR systems. First, a short definition and history of AR are given below. Section 2 describes technologies that enable an augmented reality experience. To give an idea of the possibilities of AR systems, a number of recent applications are discussed in Section 3. In Section 4 a number of technological challenges and limitations regarding human factors are discussed that AR system designers have to take into consideration. Finally, the essay concludes in Section 5 with a number of directions that the author envisions AR research might take.

Figure 1: Reality-virtuality continuum, adapted from Azuma et al. (2001).

This essay is an effort to explore the field of augmented reality beyond the summary presented in the course material (Dix et al., 2004, pp. 736-7). The author wishes to thank the course lecturer, Bert Bongers, for his feedback and shared insights.
1.1 Definition

On the reality-virtuality continuum by Milgram and Kishino (1994) (Fig. 1), AR is one part of the general area of mixed reality. Both virtual environments (or virtual reality) and augmented virtuality, in which real objects are added to virtual ones, replace the surrounding environment by a virtual one. In contrast, AR takes place in the real world. Following the definitions in (Azuma, 1997; Azuma et al., 2001), an AR system:

- combines real and virtual objects in a real environment;
- registers (aligns) real and virtual objects with each other; and
- runs interactively, in three dimensions, and in real time.

Three aspects of this definition are important to mention. Firstly, it is not restricted to particular display technologies such as a head-mounted display (HMD). Nor is the definition limited to the sense of sight, as AR can and potentially will apply to all senses, including hearing, touch, and smell. Finally, removing real objects by overlaying virtual ones, an approach known as mediated or diminished reality, is also considered AR.

1.2 Brief history

The first AR prototypes (Fig. 2), created by computer graphics pioneer Ivan Sutherland and his students at Harvard University and the University of Utah, appeared in the 1960s and used a see-through HMD (see http://www.telemusic.org:8080/ramgen/realvideo/HeadMounted.rm) to present 3D graphics (Sutherland, 1968).

Figure 2: The world's first head-mounted display with the Sword of Damocles (Sutherland, 1968).

A small group of researchers at the U.S. Air Force's Armstrong Laboratory, the NASA Ames Research Center, the Massachusetts Institute of Technology, and the University of North Carolina at Chapel Hill continued research during the 1970s and 1980s. During this time mobile devices like the Sony Walkman (1979), digital watches and personal digital organisers were introduced. This paved the way for wearable computing (Mann, 1997; Starner et al., 1997) in the 1990s as personal computers became small enough to be worn at all times. Early palmtop computers include the Psion I (1984), the Apple Newton MessagePad (1993), and the Palm Pilot (1996). Today, many mobile platforms exist that may support AR, such as personal digital assistants (PDAs), tablet PCs, and mobile phones.

It took until the early 1990s before the term augmented reality was coined by Caudell and Mizell (1992), scientists at Boeing Corporation who were developing an experimental AR system to help workers put together wiring harnesses. True mobile AR was still out of reach, but a few years later Loomis et al. (1993) developed a GPS-based outdoor system that presents navigational assistance to the visually impaired with spatial audio overlays. Soon computing and tracking devices became sufficiently powerful and small enough to support graphical overlay in mobile settings. Feiner et al. (1997) created an early prototype of a mobile AR system (MARS) that registers 3D graphical tour guide information with buildings and artifacts the visitor sees.

By the late 1990s, as AR became a research field of its own, several conferences on AR began, including the International Workshop and Symposium on Augmented Reality, the International Symposium on Mixed Reality, and the Designing Augmented Reality Environments workshop. Around this time, well-funded organisations were formed, such as the Mixed Reality Systems Laboratory (MRLab, http://www.mr-system.com/) in Japan and the Arvika consortium (http://www.arvika.de/) in Germany. Also, it became possible to rapidly build AR applications thanks to freely available software toolkits like the ARToolKit. In the meantime, several surveys appeared that give an overview of AR advances, describe its problems, and summarise developments (Azuma, 1997; Azuma et al., 2001). By 2001, MRLab finished their pilot research, and the symposia were united in the International Symposium on Mixed and Augmented Reality (ISMAR, http://www.augmented-reality.org/), which has become the major symposium for industry and research to exchange problems and solutions.

2 Enabling technologies

The technological demands for AR are much higher than for virtual environments or VR, which is why the field of AR took longer to mature than that of VR.
2.1 Displays

Of all modalities in human sensory input, sight, sound and/or touch are currently the senses that AR systems commonly apply. This section mainly focuses on visual displays; however, aural (sound) displays are mentioned briefly below. Haptic (touch) displays are discussed with the interfaces in Section 2.3, while olfactory (smell) and gustatory (taste) displays are less developed or practically non-existent and will not be discussed in this essay.

2.1.1 Aural display

Aural display application in AR is mostly limited to self-explanatory mono (0-dimensional), stereo (1-dimensional) or surround (2-dimensional) headphones and loudspeakers. True 3D aural display is currently found in more immersive simulations of virtual environments and augmented virtuality, or is still in experimental stages.

Haptic audio refers to sound that is felt rather than heard (Hughes et al., 2006) and is already applied in consumer devices such as Turtle Beach's Ear Force headphones (http://www.turtlebeach.com/site/products/earforce/x2/) to increase the sense of realism and impact, but also to enhance user interfaces of e.g. mobile phones (Chang and O'Sullivan, 2005). Recent developments in this area are presented in workshops such as the international workshop on Haptic Audio Visual Environments (http://www.discover.uottawa.ca/have2006/) and the upcoming second international workshop on Haptic and Audio Interaction Design (http://www.haid2007.org/).

2.1.2 Visual display

There are basically three ways to visually present an augmented reality. Closest to virtual reality is video see-through, where the virtual environment is replaced by a video feed of reality and the AR is overlaid upon the digitised images. Another way, which includes Sutherland's approach, is optical see-through: it leaves the real-world perception alone and displays only the AR overlay by means of transparent mirrors and lenses. The third approach is to project the AR overlay onto real objects themselves, resulting in projective displays. The three techniques may be applied at varying distance from the viewer (Fig. 3). Each combination of technique and distance is listed in the overview presented in Table 1 with a comparison of their individual advantages.

Figure 3: Visual display techniques and positioning (Bimber and Raskar, 2005a).

Video see-through  Besides being the cheapest and easiest to implement, this display technique offers the following advantages. Since reality is digitised, it is easier to mediate or remove objects from reality. This includes removing or replacing fiducial markers or placeholders with virtual objects (see for instance Fig. 11 and 28). Also, brightness and contrast of virtual objects are matched easily with the real environment. The digitised images allow tracking of head movement for better registration. It also becomes possible to match perception delays of the real and virtual. Disadvantages of video see-through include a low resolution of reality, a limited field-of-view (although this can easily be increased), and user disorientation due to a parallax (eye offset) caused by the camera being positioned at a distance from the viewer's true eye location, which requires significant adjustment effort from the viewer (Biocca and Rolland, 1998). This problem was solved at the MR Lab by aligning the video capture (Takagi et al., 2000). A final drawback is the focus distance of this technique, which is fixed in most display types, providing poor eye accommodation. Some head-mounted setups can however move the display (or a lens in front of it) to cover a range of .25 meters to infinity within .3 seconds (Sugihara and Miyasato, 1998).
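The overlay step that video see-through rests on, blending a rendered virtual layer into the digitised camera frame, can be sketched in a few lines. This is a minimal illustration rather than any particular system's implementation; the array shapes, value ranges, and the `composite` helper are assumptions made for the example.

```python
import numpy as np

def composite(camera_frame, virtual_layer, alpha_mask):
    """Alpha-blend a rendered virtual layer onto a digitised camera frame.

    camera_frame, virtual_layer: float arrays of shape (H, W, 3), values in [0, 1].
    alpha_mask: float array of shape (H, W), 1.0 where a virtual object covers a pixel.
    """
    a = alpha_mask[..., None]                  # broadcast the mask over colour channels
    return a * virtual_layer + (1.0 - a) * camera_frame

# Toy example: a 2x2 black camera frame with a white virtual object on one pixel.
frame   = np.zeros((2, 2, 3))                  # digitised camera image
overlay = np.ones((2, 2, 3))                   # rendered virtual layer
mask    = np.array([[1.0, 0.0], [0.0, 0.0]])   # object occupies the top-left pixel only
out = composite(frame, overlay, mask)          # top-left pixel white, rest unchanged
```

Because the real world is available as pixels here, the same buffer can also be edited to remove or replace marker placeholders, which is exactly the mediation advantage noted above.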
Fig. 10 shows examples of four (parallax-free) head-worn display types: Canon's Co-Optical Axis See-through Augmented Reality (COASTAR) video see-through display (Tamura et al., 2001) (Fig. 10a), Konica Minolta's holographic optical see-through Forgettable Display prototype (Kasai et al., 2000) (Fig. 10b), MicroVision's monochromatic and monocular Nomad retinal scanning display (Hollerer and Feiner, 2004) (Fig. 10c), and an organic light-emitting diode (OLED) based head-mounted projective display (Rolland et al., 2005) (Fig. 10d).

Figure 10: Head-worn visual displays

Hand-held  This category includes hand-held video/optical see-through displays as well as hand-held projectors. Although this category of displays is bulkier than head-worn displays, it is currently the best work-around to introduce AR to a mass market due to low production costs and ease of use. For instance, hand-held video see-through AR acting as magnifying glasses may be based on existing consumer products like mobile phones (Mohring et al., 2004) (Fig. 11a) that show 3D objects, or personal digital assistants (PDAs) (Wagner and Schmalstieg, 2003) (Fig. 11b) with e.g. navigation information. Stetten et al. (2001) apply optical see-through in their hand-held sonic flashlight to display medical ultrasound imaging directly over the scanned organ (Fig. 12a). One example of a hand-held projective display or AR flashlight is the iLamp by Raskar et al. (2003). This context-aware or tracked projector adjusts the imagery based on the current orientation of the projector relative to the environment (Fig. 12b). Recently, MicroVision (maker of the retinal scanning display mentioned above) introduced the small Pico Projector (PicoP), which is 8 mm thick, provides full-colour imagery of 1366x1024 pixels at 60 Hz using three lasers, and will probably appear embedded in mobile phones soon.

Spatial  The last category of displays is placed statically within the environment and includes screen-based video see-through displays, spatial optical see-through displays, and projective displays. These techniques lend themselves well to large presentations and exhibitions with limited interaction. Early ways of creating AR are based on conventional screens (computer or television) that show a camera feed with an AR overlay. This technique is now being applied in the world of sports television, where environments such as swimming pools and race tracks are well defined and easy to augment. Head-up displays (HUDs) in military cockpits are a form of spatial optical see-through and are becoming a standard extension for production cars to project navigational directions in the windshield (Narzt et al., 2006). User viewpoints relative to the AR overlay hardly change in these cases due to the confined space. Spatial see-through displays may however appear misaligned when users move around in open spaces, for instance when the AR overlay is presented on a transparent screen such as the invisible interface by Ogi et al. (2001) (Fig. 13a). 3D holographs solve the alignment problem, as Goebbels et al. (2001) show with the ARSyS TriCorder (http://www.arsys-tricorder.de) (Fig. 13b) by the German Fraunhofer IMK (now IAIS, http://www.iais.fraunhofer.de/) research center.

2.2 Tracking sensors and approaches

Before an AR system can display virtual objects into a real environment, the system must be able to sense the environment and track the viewer's (relative) movement, preferably with six degrees of freedom (6DOF): three variables (x, y, and z) for position and three angles (yaw, pitch, and roll) for orientation.

There must be some model of the environment to allow tracking for correct AR registration. Furthermore, most environments have to be prepared before an AR system is able to track 6DOF movement, but not all tracking techniques work in all environments. To this day, determining the orientation of a user is still a complex problem with no single best solution.

2.2.1 Modelling environments

Both tracking and registration techniques rely on environmental models, often 3D geometrical models. To annotate for instance windows, entrances, or rooms, an AR system needs to know where they are located with regard to the user's current position and field of view.

Sometimes the annotations themselves may be occluded based on the environmental model. For instance, when an annotated building is occluded by other objects, the annotation should point to the non-occluded parts only (Bell et al., 2001).

Fortunately, most environmental models do not need to be very detailed about textures or materials. Usually a cloud of unconnected 3D sample points suffices, for example to present occluded buildings and essentially let users see through walls. To create a traveller guidance service (TGS), Kim et al. (2006) used models from a geographical information system (GIS). Stoakley et al. (1995) present users with the spatial model itself, an oriented map of the environment or world in miniature (WIM), to assist in navigation.

Modelling techniques  Creating 3D models of large environments is a research challenge in its own right. Automatic, semiautomatic, and manual techniques can be employed, and Piekarski and Thomas (2001) even employed AR itself for modelling purposes. Conversely, a laser range finder used for environmental modelling may also enable users themselves to place notes into the environment (Patel et al., 2006).

There are significant research problems involved in both the modelling of arbitrarily complex 3D spatial models and the organisation of storage and querying of such data in spatial databases. These databases may also need to change quite rapidly, as real environments are often also dynamic.
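Registration against such a model comes down to expressing world-space anchor points in the tracked viewer's 6DOF coordinate frame. The sketch below assumes a yaw-pitch-roll convention (yaw about z, then pitch about y, then roll about x); both the convention and the function names are illustrative, and a real system must follow its tracker's own definitions.

```python
import numpy as np

def rotation_yaw_pitch_roll(yaw, pitch, roll):
    """Rotation matrix for yaw (about z), pitch (about y), roll (about x), composed as Rz @ Ry @ Rx."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def world_to_viewer(point_w, viewer_pos, viewer_ypr):
    """Express a world-space point in the viewer's frame, given the tracked 6DOF pose."""
    R = rotation_yaw_pitch_roll(*viewer_ypr)
    return R.T @ (np.asarray(point_w) - np.asarray(viewer_pos))

# A viewer at the origin, turned 90 degrees (yaw): a point one meter along the
# world y axis ends up one meter along the viewer's own x axis.
p = world_to_viewer([0.0, 1.0, 0.0], [0.0, 0.0, 0.0], [np.pi / 2, 0.0, 0.0])
```

Once an annotation anchor is in viewer coordinates, the system can project it onto the display and, using the environmental model, decide whether it is occluded.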
Figure 12: Hand-held optical and projective displays

Figure 13: Spatial visual displays

Compared to virtual environments, AR tracking devices must have higher accuracy, a wider input variety and bandwidth, and longer ranges (Azuma, 1997). Registration accuracy depends not only on the geometrical model but also on the distance of the objects to be annotated. The further away an object is, (i) the less impact errors in position tracking have and (ii) the more impact errors in orientation tracking have on the overall misregistration (Azuma, 1999).

Tracking is usually easier in indoor settings than in outdoor settings, as the tracking devices do not have to be completely mobile and wearable or deal with shock, abuse, weather, etc. Instead, the indoor environment is easily modelled and prepared, and conditions such as lighting and temperature may be controlled. Currently, unprepared outdoor environments still pose tracking problems with no single best solution.

Mechanical, ultrasonic, and magnetic  Early tracking techniques are restricted to indoor use as they require special equipment to be placed around the user. The first HMD by Sutherland (1968) was tracked mechanically (Fig. 2) through ceiling-mounted hardware, also nicknamed the Sword of Damocles. Devices that send and receive ultrasonic chirps to determine position, i.e. ultrasonic positioning, were already experimented with by Sutherland (1968) and are still used today. A decade or so later, Polhemus magnetic trackers, which measure distances within electromagnetic fields, were introduced.

Global positioning systems  For outdoor tracking by global positioning system (GPS) there exist the American 24-satellite Navstar GPS (Getting, 1993), the Russian counterpart constellation Glonass, and the 30-satellite GPS Galileo, currently being launched by the European Union and scheduled to be operational in 2010.

Direct visibility with at least four satellites is no longer necessary with assisted GPS (A-GPS, http://www.globallocate.com/), where a worldwide network of servers and base stations enables signal broadcast in, for instance, urban canyons and indoor environments. Plain GPS is accurate to about 10-15 meters, but with wide area augmentation system (WAAS) technology this may be improved to 3-4 meters. For more accuracy, the environments have to be prepared with a local base station that sends a differential error-correction signal to the roaming unit: differential GPS yields 1-3 meter accuracy, while real-time-kinematic (RTK) GPS, based on carrier-phase ambiguity resolution, can estimate positions accurately to within centimeters. Update rates of commercial GPS systems such as the MS750 RTK receiver by Trimble (http://www.trimble.com/) have increased from five to twenty times a second and are deemed suitable for tracking fast motion of people and objects (Hollerer and Feiner, 2004).

Radio  Other tracking methods that require environment preparation by placing devices are based on ultra wide band radio waves. Active radio frequency identification (RFID) chips may be positioned inside structures such as aircraft (Willers, 2006) to allow in situ positioning. Complementary to RFID, one can apply the wide-area IEEE 802.11b/g standards for both wireless networking and tracking as well. The achievable resolution depends on the density of deployed access points in the network. Several techniques are researched by Bahl and Padmanabhan (2000) and Castro et al. (2001), and vendors like PanGo (http://www.pango.com/) (Fig. 14), AeroScout (http://aeroscout.com/) and Ekahau (http://www.ekahau.com/) offer integrated systems for personnel and equipment tracking in, for instance, hospitals.

Figure 14: An ultra wide band (UWB) based positioning system. © 2007 PanGo.

Optical  Promising approaches for 6DOF pose estimation of users and objects in general settings are vision-based. In closed-loop tracking, the field of view of the camera coincides with that of the user (e.g. in video see-through), allowing for pixel-perfect registration of virtual objects. Conversely, in open-loop tracking, the system relies only on the sensed pose of the user and the environmental model.

Using one or two tiny cameras, model-based approaches can recognise landmarks (given an accurate environmental model) or detect relative movement dynamically between frames. There are a number of techniques to detect scene geometry (e.g. landmark or template matching) and camera motion in both 2D (e.g. optical flow) and 3D, which require varying amounts of computation. Julier and Bishop (2002) combine some in a number of test scenarios. Although early vision-based tracking and interaction applications in prepared environments use fiducial markers (e.g. Naimark and Foxlin, 2002; Schmalstieg et al., 2000) or light-emitting diodes (LEDs) to see how and where to register virtual objects, there is a growing body of research on markerless AR for tracking physical positions (Chia et al., 2002; Comport et al., 2003; Ferrari et al., 2001; Gordon and Lowe, 2004; Koch et al., 2005). Robustness is still low and computational costs high, but results of these pure vision-based approaches (hybrid and/or markerless) for general-case, real-time tracking are very promising.

Hybrid  Commercial hybrid tracking systems became available during the 1990s and use, for instance, electromagnetic compasses (magnetometers), gravitational tilt sensors (inclinometers), and gyroscopes (mechanical and optical) for orientation tracking, combined with ultrasonic, magnetic, and optical position tracking. Hybrid tracking approaches are currently the most promising way to deal with the difficulties posed by general indoor and outdoor mobile AR environments (Hollerer and Feiner, 2004).
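The registration-error geometry noted earlier in this section (position errors matter most nearby, orientation errors matter most far away) can be checked with a few lines of arithmetic. The error magnitudes below are illustrative assumptions, not measured values from any of the cited systems.

```python
import math

def angular_error_from_position(err_m, distance_m):
    """Angular misregistration (radians) caused by a position-tracking error, seen at a distance."""
    return math.atan2(err_m, distance_m)

def offset_from_orientation(err_rad, distance_m):
    """World-space overlay offset (meters) caused by an orientation error, at a distance."""
    return distance_m * math.tan(err_rad)

# A 10 cm position error is severe at 2 m but negligible at 100 m ...
near = angular_error_from_position(0.10, 2.0)    # roughly 2.9 degrees of misregistration
far  = angular_error_from_position(0.10, 100.0)  # roughly 0.06 degrees

# ... whereas a fixed half-degree orientation error displaces the overlay
# ever further as the annotated object recedes.
d_near = offset_from_orientation(math.radians(0.5), 2.0)    # about 1.7 cm
d_far  = offset_from_orientation(math.radians(0.5), 100.0)  # about 0.87 m
```

This is why outdoor systems that annotate distant buildings invest most heavily in accurate orientation tracking, while GPS-grade position accuracy is often tolerable.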
Azuma et al. (2006) investigate hybrid methods without vision-based tracking, suitable for military use at night in an outdoor environment, with less than ten beacons mounted on, for instance, unmanned air vehicles (UAVs).

2.3 User interface and interaction

Besides registering virtual data with the user's real-world perception, the system needs to provide some kind of interface with both virtual and real objects.

New UI paradigm  WIMP (windows, icons, menus, and pointing), as the conventional desktop UI metaphor is referred to, does not apply that well to AR systems. Not only is interaction required with six degrees of freedom (6DOF) rather than 2D, the use of conventional devices like a mouse and keyboard is cumbersome to wear and reduces the AR experience.

As in WIMP UIs, AR interfaces have to support selecting, positioning, and rotating virtual objects, drawing paths or trajectories, assigning quantitative values (quantification) and text input. However, as a general UI principle, AR interaction also includes the selection, annotation, and, possibly, direct manipulation of physical objects. This computing paradigm is still a challenge (Hollerer and Feiner, 2004).

Many of these interaction techniques require well-tuned motor skills from the users. Ideally the number of extra devices that have to be carried around in mobile UIs is reduced, but this may be difficult with current mobile computing and UI technologies.

Devices like the mouse are tangible and unidirectional: they communicate from the user to the AR system only. Common 3D equivalents are tangible user interfaces (TUIs) like paddles and wands. Ishii and Ullmer (1997) discuss a number of tangible interfaces developed at MIT's Tangible Media Group, including phicons (physical icons) and sliding instruments. Some TUIs have placeholders or markers on them so the AR system can replace them visually with virtual objects. Poupyrev et al. (2001) use tiles with fiducial markers, while in StudierStube, Schmalstieg et al. (2000) allow users to interact through a Personal Interaction Panel with 2D and 3D widgets that also recognises pen-based gestures in 6DOF (Fig. 15).
Khoudja et al. (2004) survey tactile interfaces used in teleoperation, 3D surface simulation, games, etc.

Common in indoor virtual or augmented environments is the use of additional orientation and position trackers to provide 6DOF hand tracking for manipulating virtual objects. For outdoor environments, Foxlin and Harrington (2000) experimented with ultrasonic tracking of finger-worn acoustic emitters using three head-worn microphones.
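Ultrasonic position tracking of the kind Foxlin and Harrington describe reduces, in essence, to trilateration: each microphone's chirp time of flight, multiplied by the speed of sound (about 343 m/s), gives a range, and the ranges are intersected. The sketch below solves the 2D case by least squares; the geometry and helper names are assumptions for illustration, and a head-worn rig would solve the full 3D problem.

```python
import numpy as np

def trilaterate_2d(receivers, distances):
    """Position of an emitter from three known receiver positions and measured ranges (2D).

    receivers: three (x, y) tuples; distances: the three measured ranges.
    Subtracting the first range equation from the other two linearises the system.
    """
    (x1, y1), (x2, y2), (x3, y3) = receivers
    d1, d2, d3 = distances
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Emitter at (1, 2); the ranges are derived from the true geometry,
# so the solver should recover the position exactly.
mics = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
true = np.array([1.0, 2.0])
ranges = [np.linalg.norm(true - np.array(m)) for m in mics]
est = trilaterate_2d(mics, ranges)
```

With noisy real chirp timings the same least-squares formulation simply returns the best-fitting position rather than an exact intersection, which is one reason the formulation is popular.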
3 Applications

3.1 Personal information systems

Tonnis et al. (2005) investigate the success of using AR warnings to direct a car driver's attention towards danger (Fig. 21b). Kim et al. (2006) describe how a 2D traveler guidance service can be made 3D using GIS data for AR navigation. Nokia's MARA project researches deployment of AR on current mobile phone technology.

Touring  Hollerer et al. (1999) use AR to create situated documentaries about historic events, while Vlahakis et al. (2001) present the ArcheoGuide project, which reconstructs a cultural heritage site in Olympia, Greece. With this system, visitors can view and learn about ancient architecture and customs. Similar systems have been developed for the Pompeii site (Magnenat-Thalmann and Papagiannakis, 2005; Papagiannakis et al., 2005). The lifeClipper project does about the same for structures and technologies in medieval Germany and is moving from an art project to a serious AR exhibition. Bartie and Mackaness (2006) introduced a touring system, controlled through speech recognition, to explore landmarks in the cityscape of Edinburgh.

3.2 Industrial and military applications

Design, assembly, and maintenance are typical areas where AR may prove useful. These activities may be augmented in both corporate and military settings.

Assembly  BMW is experimenting with AR to improve welding processes on their cars (Sandor and Klinker, 2005). Assisting the production process at Boeing, Mizell (2001) uses AR to overlay schematic diagrams and accompanying documentation directly onto the wooden boards on which electrical wires are routed, bundled, and sleeved. Curtis et al. (1998) verify the approach and find that workers using AR create wire bundles as well as with conventional approaches, even though tracking and display technologies were limited at the time.

The MR Lab used data from DaimlerChrysler's cars to create Clear and Present Car, a simulation where one can open the door of a virtual concept car and experience the interior, dashboard layout and interface design for usability testing (Tamura, 2002; Tamura et al., 2001). Notice how the steering wheel is drawn around the hands, rather than over them (Fig. 22b).

Another interesting application, presented by Collett and MacDonald (2006), is the visualisation of robot programs (Fig. 23). With small robots such as the automated vacuum cleaner Roomba from iRobot entering our daily lives, visualising their sensor ranges and intended trajectories might be a welcome extension.

Figure 23: Robot sensor data visualisation (Collett and MacDonald, 2006).
Willers (2006) researches AR support for Airbus cable and water systems (Fig. 24). Leading (and talking) workers through the assembly process of large aircraft is not suited for stationary AR solutions, yet mobility and tracking with so much metal around also prove to be challenging.

An extra benefit of augmented assembly and construction is the possibility to monitor and schedule individual progress in order to manage large, complex construction projects. An example by Feiner et al. (1999) generates overview renderings of the entire construction scene while workers use their HMDs to see which strut is to be placed where in a space-frame structure. Distributed interaction on construction is further studied by Olwal and Feiner (2004).

3.3 Medical applications

Similar to maintenance personnel, roaming nurses and doctors could benefit from important information being delivered directly to their glasses (Hasvold, 2002). Surgeons, however, require very precise registration, while AR system mobility is less of an issue.
Collaboration is supported through shared displays (such as Emmie (Butz et al., 1999), StudierStube (Szalavari et al., 1998), MR2 (Tamura, 2002) and ARTHUR (Broll et al., 2004)). Privacy management is handled in the Emmie system through such metaphors as lamps and mirrors. Making sure everybody knows what someone is pointing at is a problem that StudierStube overcomes by using virtual representations of physical pointers. Similarly, Tamura (2002) presented a mixed reality meeting room (MR2) for 3D presentations (Fig. 29a). For urban planning purposes, Broll et al. (2004) introduced ARTHUR, complete with pedestrian flow visualisation (Fig. 29b) but lacking augmented pointers.

Figure 29: MR2 (Tamura, 2002) and ARTHUR (Broll et al., 2004).

3.6 Education and training

Close to the earlier mentioned collaborative applications like games and planning are AR tools that support education with 3D objects. Many studies research this area of application, such as (Billinghurst, 2003; G. Heidemann and Ritter, 2005; Hughes et al., 2005; Kommers and Zhiming; Kritzenberger et al., 2002; Pan et al., 2006; Winkler et al., 2003).

Kaufmann (2002) and Kaufmann et al. (2000) introduce the Construct3D tool for math and geometry education, based on the StudierStube framework (Fig. 30a). In MARIE (Fig. 30b), based in turn on the Construct3D tool, Liarokapis et al. (2004) employ screen-based AR with Web3D to support engineering education. Recently the MIT Education Arcade introduced game-based learning in Mystery at the Museum and Environmental Detectives, where each educative game has an engaging backstory, differentiated character roles, reactive third parties, guided debriefing, synthetic activities, and embedded recall/replay to promote both engagement and learning (Kaplan-Leiserson, 2004).

4 Limitations

AR faces technical challenges regarding, for example, binocular (stereo) view, high resolution, colour depth, luminance, contrast, field of view, and focus depth. However, before AR becomes accepted as part of users' everyday life, just like the mobile phone and the personal digital assistant (PDA), issues regarding intuitive interfaces, costs, weight, power usage, ergonomics, and appearance must also be addressed. A number of limitations, some of which have been mentioned earlier, are categorised here.

Portability and outdoor use  Most mobile AR systems mentioned in this survey are cumbersome, requiring a heavy backpack to carry the PC, sensors, display, batteries, and everything else. Connections between all the devices must be able to withstand outdoor use, including weather and shock, but universal serial bus (USB) connectors are known to fail easily. However, recent developments in mobile technology like cell phones and PDAs are bridging the gap towards mobile AR.

Optical and video see-through displays are usually unsuited for outdoor use due to low brightness, contrast, resolution, and field of view. However, the laser-powered displays recently developed at MicroVision offer a new dimension in head-mounted and hand-held displays that overcomes this problem.

Most portable computers have only one CPU, which limits the amount of visual and hybrid tracking. More generally, consumer operating systems are not suited for real-time computing, while specialised real-time operating systems don't have the drivers to support the sensors and graphics in modern hardware.
References
through calibration-free or auto-calibrating approaches Social acceptance Getting people to use AR may be
that minimise set-up requirements. The latter use re- more challenging than expected, and many factors play a
dundant sensor information to automatically measure and role in social acceptance of AR ranging from unobtrusive
compensate for changing calibration parameters (Azuma fashionable appearance (gloves, helmets, etc.) to privacy
et al., 2001). concerns. For instance, Accentures Assistant (Fig. 20)
blinks a light when it records for the sole purpose of alert-
ing the person who is being recorded. These fundamental
Latency A large source of dynamic registration errors
issues must be addressed before AR is widely accepted
are system delays (Azuma et al., 2001). Techniques like
(Hollerer and Feiner, 2004).
precalculation, temporal stream matching (in video see-
through such as live broadcasts), and prediction of future
viewpoints may solve some delay. System latency can 5 Conclusion
also be scheduled to reduce errors through careful system
design, and pre-rendered images may be shifted at the This essay presented a state of the art survey of AR tech-
last instant to compensate for pan-tilt motions. Similarly, nologies, applications and limitations. The comparative
image warping may correct delays in 6DOF motion (both table on displays (Table 1) and survey of frameworks and
translation and rotation). content authoring tools (Section 2.4) are specifically new
to the AR research literature. With over a hundred ref-
erences this essay has become a comprehensive investi-
Depth perception One difficult registration problem gation into AR and hopefully provides a suitable starting
is accurate depth perception. Stereoscopic displays point for readers new to the field.
help, but additional problems including accommodation- AR has come a long way but still has some distance
vergence conflicts or low resolution and dim displays to go before industries, the military and the general pub-
cause object to appear further away than they should be lic will accept it as a familiar user interface. For exam-
(Drascic and Milgram, 1996). Correct occlusion amelio- ple, Airbus CIMPA still struggles to get their AR systems
rates some depth problems (Rolland and Fuchs, 2000), for assembly support accepted by the workers (Willers,
as does consistent registration for different eyepoint lo- 2006). On the other hand, companies like Information in
cations (Vaissie and Rolland, 2000). Place estimate that by 2014, 30% of mobile workers will
be using augmented reality. Within 5-10 years, Feiner
Adaptation In early video see-through systems with a (2002) believes that augmented reality will have a more
parallax, users need to adapt to vertical displaced view- profound effect on the way in which we develop and in-
points. In an experiment by Biocca and Rolland (1998), teract with future computers. With the advent of such
subjects exhibit a large overshoot in a depth-pointing task complementary technologies as tactile networks by Red-
after removing the HMD. Tacton42 , artificial intelligence, cybernetics, and (non-
invasive) brain-computer interfaces, AR might soon pave
the way for ubiquitous (anytime-anywhere) computing
Fatigue and eye strain Like the parallax problem,
(Weiser, 1991) of a more natural kind (Abowd and My-
biocular displays (where both eyes see the same image)
natt, 2000) or even human-machine symbiosis as envi-
cause significantly more discomfort than monocular or
sioned by Licklider (1960).
binocular displays, both in eye strain and fatigue (Ellis
et al., 1997).
References
Overload and over-reliance Aside from technical
A BOWD , G. D. AND M YNATT, E. D. Charting past,
challenges, the user interface must also follow some present, and future research in ubiquitous comput-
guidelines as not to overload the user with informa- ing. ACM Trans. on Computer-Human Interaction 7,
tion while also preventing the user to overly rely on the 1 (Mar. 2000), pp. 2958. 5
AR system such that important cues from the environ-
ment are missed (Tang et al., 2003). At BMW, Bengler A HMAD , F. AND M USILEK , P. A keystroke and pointer
and Passaro (2004) use guidelines for AR system design control input interface for wearable computers. In Per-
in cars, including orientation on the driving task, no mov- Com06: Proc. 4th IEEE Intl Conf. Pervasive Com-
ing or obstructing imagery, add only information that im- puting and Communications (2006), pp. 211. 2.3
proves driving performance, avoid side effects like tunnel
vision and cognitive capture, and only use information A NTONIAC , P. AND P ULLI , P. Marisilmobile user
that does not distract, intrude or disturb given different interface framework for virtual enterprise. In Proc.
situations. 42
http://www.redtacton.com/
18
7th Int'l Conf. Concurrent Enterprising (Bremen, June 2001), pp. 171-180. 2.3

Azuma, R. T. A survey of augmented reality. Presence 6, 4 (Aug. 1997), pp. 355-385. 1.1, 1.2, 2.2.2

Azuma, R. T. Mixed Reality: Merging Real and Virtual Worlds. Ohmsha/Springer, Tokyo/New York, 1999, ch. The challenge of making augmented reality work outdoors, pp. 379-390. ISBN 3-540-65623-5. 2.2.2

Azuma, R. T., Baillot, Y., Behringer, R., Feiner, S. K., Julier, S. and MacIntyre, B. Recent advances in augmented reality. Computer Graphics and Applications 21, 6 (Nov./Dec. 2001), pp. 34-47. 1, 1.1, 1.2, 27, 4, 4

Azuma, R. T., Neely III, H., Daily, M. and Leonard, J. Performance analysis of an outdoor augmented reality tracking system that relies upon a few mobile beacons. In ISMAR06: Proc. 5th Int'l Symp. on Mixed and Augmented Reality (Santa Barbara, CA, USA, Oct. 2006), pp. 101-104. 2.2.2, 3.2

Bahl, P. and Padmanabhan, V. N. RADAR: an in-building RF-based user location and tracking system. In Proc. IEEE Infocom (Tel Aviv, Israel, 2000), IEEE CS Press, pp. 775-784. ISBN 0-7803-5880-5. 2.2.2

Baillot, Y., Brown, D. and Julier, S. Authoring of physical models using mobile computers. In ISWC01: Proc. 5th Int'l Symp. on Wearable Computers (Washington, DC, USA, 2001), IEEE CS Press, p. 39. ISBN 0-7695-1318-2. 3.2

Bartie, P. J. and Mackaness, W. A. Development of a speech-based augmented reality system to support exploration of cityscape. Trans. in GIS 10, 1 (2006), pp. 63-86. 3.1

Behringer, R., Tam, C., McGee, J., Sundareswaran, S. and Vassiliou, M. Wearable augmented reality testbed for navigation and control, built solely with commercial-off-the-shelf (COTS) hardware. In ISAR00: Proc. Int'l Symp. Augmented Reality (Washington, DC, USA, Oct. 2000), IEEE CS Press, pp. 12-19. 2.4

Bell, B., Feiner, S. and Hollerer, T. View management for virtual and augmented reality. In UIST01: Proc. ACM Symp. on User Interface Software and Technology (Orlando, FL, 2001), vol. 3 of CHI Letters, ACM Press, pp. 101-110. 2.2.1

Benali-Khoudja, M., Hafez, M., Alexandre, J.-M. and Kheddar, A. Tactile interfaces: a state-of-the-art survey. In ISR04: Proc. 35th Int'l Symp. on Robotics (Paris, France, Mar. 2004). 2.3

Benford, S., Greenhalgh, C., Reynard, G., Brown, C. and Koleva, B. Understanding and constructing shared spaces with mixed-reality boundaries. ACM Trans. on Computer-Human Interaction 5, 3 (Sep. 1998), pp. 185-223. 3.5

Bengler, K. and Passaro, R. Augmented reality in cars: Requirements and constraints. In ISMAR04: Proc. 3rd Int'l Symp. on Mixed and Augmented Reality (2004). 4

Billinghurst, M. Augmented reality in education. New Horizons in Learning 9, 1 (Oct. 2003). 3.6

Billinghurst, M. and Kato, H. Collaborative mixed reality. In ISMR99: Proc. Int'l Symp. on Mixed Reality (Yokohama, Japan, Mar. 1999), Springer Verlag, pp. 261-284. 3.5

Billinghurst, M., Kato, H. and Poupyrev, I. The MagicBook: moving seamlessly between reality and virtuality. Computer Graphics and Applications 21, 3 (May/June 2001), pp. 6-8. 3.5

Bimber, O. and Raskar, R. Modern approaches to augmented reality. In SIGGRAPH05 Tutorial on Spatial AR (2005a). 3, 4, 5, 6, 8, 9

Bimber, O. and Raskar, R. Spatial Augmented Reality: Merging Real and Virtual Worlds. A. K. Peters, Ltd., Wellesley, MA, USA, 2005b. ISBN 1-56881-230-2. 2.1.2

Biocca, F. A. and Rolland, J. P. Virtual eyes can rearrange your body: adaptation to visual displacement in see-through, head-mounted displays. Presence 7, 3 (June 1998), pp. 262-277. 2.1.2, 4

Broll, W., Lindt, I., Ohlenburg, J., Wittkamper, M., Yuan, C., Novotny, T., Fatah gen. Schieck, A., Mottram, C. and Strothmann, A. ARTHUR: A collaborative augmented environment for architectural design and urban planning. Journal of Virtual Reality and Broadcasting 1, 1 (Dec. 2004). 29, 3.5

Buchmann, V., Violich, S., Billinghurst, M. and Cockburn, A. FingARtips: Gesture based direct manipulation in augmented reality. In GRAPHITE04: Proc. 2nd Int'l Conf. on Computer graphics and interactive techniques (Singapore, 2004), ACM Press, pp. 212-221. ISBN 1-58113-883-0. 2.3
Butz, A., Hollerer, T., Feiner, S., MacIntyre, B. and Beshers, C. Enveloping users and computers in a collaborative 3D augmented reality. In IWAR99: Proc. 2nd Int'l Workshop on Augmented Reality (Washington, DC, USA, 1999), IEEE CS Press, pp. 35-44. ISBN 0-7695-0359-4. 3.5

Cakmakci, O. and Rolland, J. Head-worn displays: A review. Display Technology 2, 3 (Sep. 2006), pp. 199-216. Invited. 2.1.3

Castro, P., Chiu, P., Kremenek, T. and Muntz, R. A probabilistic room location service for wireless networked environments. In UbiComp01: Proc. Ubiquitous Computing (New York, NY, USA, 2001), vol. 2201 of Lecture Notes in Computer Science, ACM Press, pp. 18-35. 2.2.2

Caudell, T. P. and Mizell, D. W. Augmented reality: An application of heads-up display technology to manual manufacturing processes. In Proc. Hawaii Int'l Conf. on Systems Sciences (Washington, DC, USA, 1992), IEEE CS Press. 1.2

Cavallaro, R. The FoxTrax hockey puck tracking system. Computer Graphics and Applications 17, 2 (Mar./Apr. 1997), pp. 6-12. 3.4

Chang, A. and O'Sullivan, C. Audio-haptic feedback in mobile phones. In CHI05: Proc. SIGCHI Conf. on Human Factors in Computing Systems (Portland, OR, USA, 2005), ACM Press, pp. 1264-1267. ISBN 1-59593-002-7. 2.1.1

Cheok, A. D., Wan, F. S., Yang, X., Weihua, W., Huang, L. M., Billinghurst, M. and Kato, H. Game-city: A ubiquitous large area multi-interface mixed reality game space for wearable computers. IEEE CS Press. ISBN 0-7695-1816-8/02. 3.4

Chia, K. W., Cheok, A. D. and Prince, S. J. D. Online 6 DOF augmented reality registration from natural features. In ISMAR02: Proc. Int'l Symp. on Mixed and Augmented Reality (Darmstadt, Germany, Oct. 2002), IEEE CS Press. 2.2.2

Collett, T. H. J. and MacDonald, B. A. Developer oriented visualisation of a robot program: An augmented reality approach. In HRI06 (Salt Lake City, Utah, USA, Mar. 2006), ACM Press, pp. 49-56. ISBN 1-59593-294-1/06/0003. 23, 3.2

Correia, N. and Romero, L. Storing user experiences in mixed reality using hypermedia. The Visual Computer 22, 12 (2006), pp. 991-1001. 2.4

Crabtree, A., Benford, S., Rodden, T., Greenhalgh, C., Flintham, M., Anastasi, R., Drozd, A., Adams, M., Row-Farr, J., Tandavanitj, N. and Steed, A. Orchestrating a mixed reality game on the ground. In Proc. CHI Conf. on Human Factors in Computing Systems (Vienna, Austria, Apr. 2004), ACM Press. 3.4

Curtis, D., Mizell, D., Gruenbaum, P. and Janin, A. Several devils in the details: Making an AR application work in the airplane factory. In IWAR98: Proc. Int'l Workshop on Augmented Reality (Natick, MA, USA, 1998), A. K. Peters, Ltd., pp. 47-60. ISBN 1-56881-098-9. 3.2

Dix, A., Finlay, J. E., Abowd, G. D. and Beale, R. Human-Computer Interaction, 3rd ed. Pearson Education Ltd., Harlow, England, 2004. ISBN 0130-46109.

Drascic, D. and Milgram, P. Perceptual issues in augmented reality. In Proc. SPIE, Stereoscopic Displays VII and Virtual Systems III (Bellingham, WA, USA, 1996), vol. 2653, SPIE Press, pp. 123-134. 4

Ellis, S. R., Breant, F., Manges, B., Jacoby, R. and Adelstein, B. D. Factors influencing operator interaction with virtual objects viewed via head-mounted see-through displays: Viewing conditions and rendering latency. In VRAIS97: Proc. Virtual Reality Ann. Int'l Symp. (Washington, DC, USA, 1997), IEEE CS Press, pp. 138-145. ISBN 0-8186-7843-7. 4

Farringdon, J., Moore, A. J., Tilbury, N., Church, J. and Biemond, P. D. Wearable sensor badge and sensor jacket for context awareness. In ISWC99: Proc. 3rd Int'l Symp. on Wearable Computers (San Francisco, CA, 1999), IEEE CS Press, pp. 107-113. 2.3

Feiner, S., MacIntyre, B., Hollerer, T. and Webster, A. A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment. In ISWC97: Proc. 1st Int'l Symp. on Wearable Computers (Cambridge, MA, 1997), IEEE CS Press, pp. 74-81. 1.2
Feiner, S. K. Augmented reality: A new way of seeing. Scientific American (Apr. 2002). 1, 5

Ferrari, V., Tuytelaars, T. and Van Gool, L. Markerless augmented reality with a real-time affine region tracker. In ISAR01: Proc. 2nd Int'l Symp. Augmented Reality (Washington, DC, USA, 2001), IEEE CS Press. 2.2.2

Fiambolis, P. Virtual retinal display (VRD) technology. Available at http://www.cs.nps.navy.mil/people/faculty/capps/4473/projects/fiambolis/vrd/vrd_full.html. Aug. 1999. 2.1.2

Fiorentino, M., de Amicis, R., Monno, G. and Stork, A. Spacedesign: A mixed reality workspace for aesthetic industrial design. In ISMAR02: Proc. Int'l Symp. on Mixed and Augmented Reality (Darmstadt, Germany, Oct. 2002), IEEE CS Press, p. 86. ISBN 0-7695-1781-1. 22, 3.2

Flintham, M., Anastasi, R., Benford, S., Hemmings, T., Crabtree, A., Greenhalgh, C., Rodden, T., Tandavanitj, N., Adams, M. and Row-Farr, J. Where on-line meets on-the-streets: Experiences with mobile mixed reality games. 2003. 3.4

Foxlin, E. and Harrington, M. WearTrack: A self-referenced head and hand tracker for wearable computers and portable VR. In ISWC00: Proc. 4th Int'l Symp. on Wearable Computers (Atlanta, GA, 2000), IEEE CS Press, pp. 155-162. 2.3

Friedrich, W. and Wohlgemuth, W. ARVIKA: augmented reality for development, production and servicing. ARVR Beitrag, 2002. 3.2, 3.2, 3.2

Fuchs, H., Livingston, M. A., Raskar, R., Colucci, D., Keller, K., State, A., Crawford, J. R., Rademacher, P., Drake, S. H. and Meyer, A. A. Augmented reality visualization for laparoscopic surgery. In MICCAI98: Proc. 1st Int'l Conf. Medical Image Computing and Computer-Assisted Intervention (Heidelberg, Germany, 1998), Springer-Verlag, pp. 934-943. ISBN 3-540-65136-5. 25, 3.3

G. Heidemann, H. Bekel, I. B. and Ritter, H. Interactive online learning. Pattern Recognition and Image Analysis 15, 1 (2005), pp. 55-58. 3.6

Getting, I. The global positioning system. Spectrum 30, 12 (1993), pp. 36-47. 2.2.2

Goebbels, G., Troche, K., Braun, M., Ivanovic, A., Grab, A., von Lobtow, K., Zeilhofer, H. F., Sader, R., Thieringer, F., Albrecht, K., Praxmarer, K., Keeve, E., Hanssen, N., Krol, Z. and Hasenbrink, F. Development of an augmented reality system for intra-operative navigation in maxillo-facial surgery. In Proc. BMBF Statustagung (Stuttgart, Germany, 2001), pp. 237-246. 2.1.3

Gordon, I. and Lowe, D. G. What and where: 3D object recognition with accurate pose. In ISMAR04: Proc. 3rd Int'l Symp. on Mixed and Augmented Reality (Arlington, VA, USA, Nov. 2004), IEEE CS Press. 2.2.2

Hasvold, P. In-the-field health informatics. In The Open Group conf. (Paris, France, 2002). 3.3

Hayward, V., Astley, O., Cruz-Hernandez, M., Grant, D. and Robles-De La Torre, G. Haptic interfaces and devices. Sensor Review 24 (2004), pp. 16-29. 2.3

Henrysson, A., Billinghurst, M. and Ollila, M. Face to face collaborative AR on mobile phones. In ISMAR05: Proc. 4th Int'l Symp. on Mixed and Augmented Reality (Vienna, Austria, Oct. 2005), IEEE CS Press, pp. 80-89. ISBN 0-7695-2459-1. 28, 3.4

Hollerer, T., Feiner, S. and Pavlik, J. Situated documentaries: Embedding multimedia presentations in the real world. In ISWC99: Proc. 3rd Int'l Symp. on Wearable Computers (San Francisco, CA, 1999), IEEE CS Press, pp. 79-86. 3.1

Hollerer, T. H. and Feiner, S. K. Telegeoinformatics: Location-Based Computing and Services. Taylor & Francis Books Ltd., 2004, ch. Mobile Augmented Reality. 2.1.3, 2.2.2, 2.2.2, 2.3, 2.3, 2.4, 3.1, 4

Hughes, C. E., Stapleton, C. B., Hughes, D. E. and Smith, E. M. Mixed reality in education, entertainment, and training. Computer Graphics and Applications (Nov./Dec. 2005), pp. 24-30. 3.6

Hughes, C. E., Stapleton, C. B. and O'Connor, M. The evolution of a framework for mixed reality experiences. In Emerging Technologies of Augmented Reality: Interfaces and Design (Hershey, PA, USA, Nov. 2006), Idea Group Publishing. 2.1.1

Ishii, H. and Ullmer, B. Tangible bits: Towards seamless interfaces between people, bits and atoms. In CHI97: Proc. SIGCHI Conf. on Human Factors in Computing Systems (New York, NY, USA, Mar. 1997), ACM Press, pp. 234-241. 2.3

Jacob, R. J. K. What is the next generation of human-computer interaction? In CHI06: Proc.
SIGCHI Conf. on Human Factors in Computing Systems (Montreal, Quebec, Canada, Apr. 2006), ACM Press, pp. 1707-1710. ISBN 1-59593-298-4. 1

Jebara, T., Eyster, C., Weaver, J., Starner, T. and Pentland, A. Stochasticks: Augmenting the billiards experience with probabilistic vision and wearable computers. In ISWC97: Proc. 1st Int'l Symp. on Wearable Computers (Washington, DC, USA, 1997), IEEE CS Press, pp. 138-145. 3.4

Julier, S. and Bishop, G., Eds. Computer Graphics and Applications, Special Issue on Tracking, vol. 6. IEEE CS Press, Washington, DC, USA, 2002. 2.2.2

Julier, S., King, R., Colbert, B., Durbin, J. and Rosenblum, L. The software architecture of a real-time battlefield visualization virtual environment. In VR99: Proc. IEEE Virtual Reality (Washington, DC, USA, 1999), IEEE CS Press, pp. 29-36. ISBN 0-7695-0093-5. 3.2

Julier, S. J., Lanzagorta, M., Baillot, Y., Rosenblum, L. J., Feiner, S., Hollerer, T. and Sestito, S. Information filtering for mobile augmented reality. In ISAR00: Proc. Int'l Symp. Augmented Reality (Washington, DC, USA, 2000), IEEE CS Press, pp. 3-11. 3.2

Kaplan-Leiserson, E. Trend: Augmented reality check. Learning Circuits (Dec. 2004). 3.2, 3.6

Kasai, I., Tanijiri, Y., Endo, T. and Ueda, H. A forgettable near eye display. In ISWC00: Proc. 4th Int'l Symp. on Wearable Computers (Atlanta, GA, USA, 2000), IEEE CS Press, pp. 115-118. 2.1.3

Kaufmann, H. Construct3D: an augmented reality application for mathematics and geometry education. In MULTIMEDIA02: Proc. 10th ACM Int'l Conf. on Multimedia (Juan-les-Pins, France, 2002), ACM Press, pp. 656-657. ISBN 1-58113-620-X. 3.6

Kaufmann, H., Schmalstieg, D. and Wagner, M. Construct3D: A virtual reality application for mathematics and geometry education. Education and Information Technologies 5, 4 (Dec. 2000), pp. 263-276. 30, 3.6

Kim, S., Kim, H., Eom, S., Mahalik, N. P. and Ahn, B. A reliable new 2-stage distributed interactive TGS system based on GIS database and augmented reality. In IEICE Trans. on Information and Systems, vol. E89-D. Jan. 2006. 2.2.1, 3.1

Kiyokawa, K., Billinghurst, M., Campbell, B. and Woods, E. An occlusion-capable optical see-through head mount display for supporting co-located collaboration. In ISMAR03: Proc. 2nd Int'l Symp. on Mixed and Augmented Reality (Tokyo, Japan, Oct. 2003), IEEE CS Press, p. 133. ISBN 0-7695-2006-5. 2.1.2

Klinker, G., Stricker, D. and Reiners, D. Augmented Reality and Wearable Computers. Lawrence Erlbaum Press, 2001, ch. Augmented Reality for Exterior Construction Applications. 3.2

Koch, R., Koeser, K., Streckel, B. and Evers-Senne, J.-F. Markerless image-based 3D tracking for real-time augmented reality applications. 2005. 2.2.2

Kommers, P. A. and Zhiming, Z. Virtual reality for education. Available at http://projects.edte.utwente.nl/proo/kommers.htm. 3.6

Kritzenberger, H., Winkler, T. and Herczeg, M. Collaborative and constructive learning of elementary school children in experiental learning spaces along the virtuality continuum. In Mensch & Computer 2002: Vom interaktiven Werkzeug zu kooperativen Arbeits- und Lernwelten, M. Herczeg, W. Prinz, and H. Oberquelle, Eds. B. G. Teubner, Stuttgart, Germany, 2002. 3.6

Kumar, M. and Winograd, T. GUIDe: Gaze-enhanced UI design. In CHI07: Proc. SIGCHI Conf. on Human Factors in Computing Systems (San Jose, CA, Apr./May 2007). 2.3

Liarokapis, F., Mourkoussis, N., White, M., Darcy, J., Sifniotis, M., Petridis, P., Basu, A. and Lister, P. F. Web3D and augmented reality to support engineering education. World Trans. on Engineering and Technology Education 3, 1 (2004), pp. 1-4. UICEE. 30, 3.6

Licklider, J. Man-computer symbiosis. IRE Trans. on Human Factors in Electronics 1 (1960), pp. 4-11. 2.3, 5

Loomis, J., Golledge, R. and Klatzky, R. Personal guidance system for the visually impaired using GPS, GIS, and VR technologies. In Proc. Conf. on Virtual Reality and Persons with Disabilities (Millbrae, CA, 1993). 1.2

Magnenat-Thalmann, N. and Papagiannakis, G. Virtual worlds and augmented reality in cultural heritage applications. 2005. 3.1
Mann, S. Wearable computing: A first step toward personal imaging. Computer 30, 2 (Feb. 1997), pp. 25-32. 1.2, 2.4

Matysczok, C., Radkowski, R. and Berssenbruegge, J. AR-bowling: immersive and realistic game play in real environments using augmented reality. In ACE04: Proc. ACM SIGCHI Int'l Conf. on Advances in Computer Entertainment technology (Singapore, 2004), ACM Press, pp. 269-276. ISBN 1-58113-882-2. 3.4

Mayol, W. W. and Murray, D. W. Wearable hand activity recognition for event summarization. In ISWC05: Proc. 9th Int'l Symp. on Wearable Computers (Osaka, Japan, Oct. 2005), IEEE CS Press. 2.3

Merten, M. Erweiterte Realität: Verschmelzung zweier Welten. Deutsches Ärzteblatt 104, 13 (Mar. 2007). 26, 3.3

Milgram, P. and Kishino, F. A taxonomy of mixed reality visual displays. IEICE Trans. on Information and Systems E77-D, 12 (Dec. 1994), pp. 1321-1329. 1.1

Mizell, D. Fundamentals of Wearable Computers and Augmented Reality. Lawrence Erlbaum Assoc, Mahwah, NJ, 2001, ch. Boeing's wire bundle assembly project, pp. 447-467. 3.2

Mohring, M., Lessig, C. and Bimber, O. Video see-through AR on consumer cell-phones. In ISMAR04: Proc. 3rd Int'l Symp. on Mixed and Augmented Reality (Arlington, VA, USA, Nov. 2004), IEEE CS Press, pp. 252-253. ISBN 0-7695-2191-6. 2.1.3

Naimark, L. and Foxlin, E. Circular data matrix fiducial system and robust image processing for a wearable vision-inertial self-tracker. In ISMAR02: Proc. Int'l Symp. on Mixed and Augmented Reality (Darmstadt, Germany, Oct. 2002), IEEE CS Press. 2.2.2

Narzt, W., Pomberger, G., Ferscha, A., Kolb, D., Muller, R., Wieghardt, J., Hortner, H. and Lindinger, C. Pervasive information acquisition for mobile AR-navigation systems. In WMCSA03: Proc. 5th Workshop on Mobile Computing Systems and Applications (Oct. 2003), IEEE CS Press, pp. 13-20. ISBN 0-7695-1995-4/03. 21, 3.1

Narzt, W., Pomberger, G., Ferscha, A., Kolb, D., Muller, R., Wieghardt, J., Hortner, H. and Lindinger, C. Augmented reality navigation systems. Universal Access in the Information Society 4, 3 (2006), pp. 177-187. 2.1.3, 3.1

Navab, N., Bani-Hashemi, A. and Mitschke, M. Merging visible and invisible: Two camera-augmented mobile C-arm (CAMC) applications. In IWAR99: Proc. 2nd Int'l Workshop on Augmented Reality (Washington, DC, USA, 1999), IEEE CS Press, pp. 134-141. ISBN 0-7695-0359-4. 3.3

Ogi, T., Yamada, T., Yamamoto, K. and Hirose, M. Invisible interface for immersive virtual world. In IPT01: Proc. Immersive Projection Technology Workshop (Stuttgart, Germany, 2001), pp. 237-246. 2.1.3

Ohshima, T., Satoh, K., Yamamoto, H. and Tamura, H. AR2 Hockey: A case study of collaborative augmented reality. In VRAIS98: Proc. Virtual Reality Ann. Int'l Symp. (Washington, DC, USA, 1998), IEEE CS Press, pp. 268-275. 3.4

Ohshima, T., Satoh, K., Yamamoto, H. and Tamura, H. RV-Border Guards: A multi-player mixed reality entertainment. Trans. Virtual Reality Soc. Japan 4, 4 (1999), pp. 699-705. 3.4

Olwal, A. and Feiner, S. Unit: modular development of distributed interaction techniques for highly interactive user interfaces. In GRAPHITE04: Proc. 2nd Int'l Conf. on Computer graphics and interactive techniques (Singapore, 2004), ACM Press, pp. 131-138. ISBN 1-58113-883-0. 3.2

Pan, Z., Cheok, A. D., Yang, H., Zhu, J. and Shi, J. Virtual reality and mixed reality for virtual learning environments. Computers & Graphics 30, 1 (Feb. 2006), pp. 20-28. 3.6

Papagiannakis, G., Schertenleib, S., O'Kennedy, B., Arevalo-Poizat, M., Magnenat-Thalmann, N., Stoddart, A. and Thalmann, D. Mixing virtual and real scenes in the site of ancient Pompeii. Computer Animation and Virtual Worlds 16 (2005), pp. 11-24. 3.1

Patel, S. N., Rekimoto, J. and Abowd, G. D. iCam: Precise at-a-distance interaction in the physical environment. Feb. 2006. 2.2.1

Picard, R. W. Affective Computing. MIT Press, Cambridge, MA, USA, 1997. 2.3

Piekarski, W. and Thomas, B. Tinmith-Metro: New outdoor techniques for creating city models with an augmented reality wearable computer. In ISWC01: Proc. 5th Int'l Symp. on Wearable Computers (Zurich, Switzerland, 2001), IEEE CS Press, pp. 31-38. 2.2.1, 3.4
Piekarski, W. and Thomas, B. ARQuake: the outdoor augmented reality gaming system. Communications of the ACM 45, 1 (2002), pp. 36-38. 3.4

Piekarski, W., Gunther, B. and Thomas, B. Integrating virtual and augmented realities in an outdoor application. In IWAR99: Proc. 2nd Int'l Workshop on Augmented Reality (Washington, DC, USA, 1999), IEEE CS Press, pp. 45-54. 3.2, 3.4

Pietrzak, P., Arya, M., Joseph, J. V. and Patel, H. R. H. Three-dimensional visualization in laparoscopic surgery. British Journal of Urology International 98, 2 (Aug. 2006), pp. 253-256. 3.3

Poupyrev, I., Tan, D., Billinghurst, M., Kato, H., Regenbrecht, H. and Tetsutani, N. Tiles: A mixed reality authoring interface. 2001. 2.3

Raab, F. H., Blood, E. B., Steiner, T. O. and Jones, H. R. Magnetic position and orientation tracking system. Trans. on Aerospace and Electronic Systems 15, 5 (1979), pp. 709-717. 2.2.2

Raskar, R., Welch, G. and Chen, W.-C. Table-top spatially-augmented reality: Bringing physical models to life with projected imagery. In IWAR99: Proc. 2nd Int'l Workshop on Augmented Reality (Washington, DC, USA, 1999), IEEE CS Press, pp. 64-70. ISBN 0-7695-0359-4. 2.1.2

Raskar, R., van Baar, J., Beardsley, P., Willwacher, T., Rao, S. and Forlines, C. iLamps: geometrically aware and self-configuring projectors. ACM Trans. on Graphics 22, 3 (2003), pp. 809-818. 2.1.3

Rekimoto, J. Transvision: A hand-held augmented reality system for collaborative design. In VSMM96: Proc. Virtual Systems and Multimedia (Gifu, Japan, 1996), Int'l Soc. Virtual Systems and Multimedia, pp. 85-90. 3.5

Rekimoto, J. NaviCam: A magnifying glass approach to augmented reality. Presence 6, 4 (Aug. 1997), pp. 399-412. 3.1

Rekimoto, J. and Saitoh, M. Augmented surfaces: A spatially continuous workspace for hybrid computing environments. In CHI99: Proc. SIGCHI Conf. on Human Factors in Computing Systems (Pittsburgh, Pennsylvania, USA, 1999), ACM Press, pp. 378-385. ISBN 0-201-48559-1. 3.5

Rolland, J. P. and Fuchs, H. Optical versus video see-through head-mounted displays in medical visualization. Presence 9, 3 (June 2000), pp. 287-309. 4

Rolland, J. P., Biocca, F., Hamza-Lup, F., Ha, Y. and Martins, R. Development of head-mounted projection displays for distributed, collaborative, augmented reality applications. Presence 14, 5 (2005), pp. 528-549. 2.1.3

Sandor, C. and Klinker, G. A rapid prototyping software infrastructure for user interfaces in ubiquitous augmented reality. Personal and Ubiquitous Computing 9, 3 (May 2005), pp. 169-185. 3.2

Schmalstieg, D., Fuhrmann, A. and Hesina, G. Bridging multiple user interface dimensions with augmented reality. In ISAR00: Proc. Int'l Symp. Augmented Reality (2000). 2.2.2, 15, 2.3

Schowengerdt, B. T., Seibel, E. J., Kelly, J. P., Silverman, N. L. and Furness III, T. A. Binocular retinal scanning laser display with integrated focus cues for ocular accommodation. In Proc. SPIE, Electronic Imaging Science and Technology, Stereoscopic Displays and Applications XIV (Bellingham, WA, USA, Jan. 2003), vol. 5006, SPIE Press. 2.1.2, 7

Starner, T., Mann, S., Rhodes, B., Levine, J., Healey, J., Kirsch, D., Picard, R. and Pentland, A. Augmented reality through wearable computing. Presence 6, 4 (1997), pp. 386-398. 1.2, 3.1

Stetten, G., Chib, V., Hildebrand, D. and Bursee, J. Real time tomographic reflection: Phantoms for calibration and biopsy. In ISAR01: Proc. 2nd Int'l Symp. Augmented Reality (Washington, DC, USA, 2001), IEEE CS Press, pp. 11-19. ISBN 0-7695-1375-1. 2.1.3

Stoakley, R., Conway, M. and Pausch, R. Virtual reality on a WIM: Interactive worlds in miniature. In CHI95: Proc. SIGCHI Conf. on Human Factors in Computing Systems (1995), pp. 265-272. 2.2.1

Sugihara, T. and Miyasato, T. A lightweight 3-D HMD with accommodative compensation. In SID98: Proc. 29th Society for Information Display (San Jose, CA, USA, May 1998), pp. 927-930. 2.1.2

Sutherland, I. E. A head-mounted three-dimensional display. In Proc. Fall Joint Computer Conf. (Washington, DC, 1968), Thompson Books, pp. 757-764. 2, 1.2, 2.2.2

Szalavari, Z., Schmalstieg, D., Fuhrmann, A. and Gervautz, M. Studierstube: An environment for collaboration in augmented reality. Virtual Reality 3, 1 (1998), pp. 37-49. 2.4, 3.5
Takagi, A., Yamazaki, S., Saito, Y. and Taniguchi, N. Development of a stereo video see-through HMD for AR systems. In ISAR00: Proc. Int'l Symp. Augmented Reality (München, Germany, 2000), ACM Press. ISBN 0-7695-0846-4. 2.1.2

Tamura, H. Steady steps and giant leap toward practical mixed reality systems and applications. In VAR02: Proc. Int'l Status Conf. on Virtual and Augmented Reality (Leipzig, Germany, Nov. 2002). 22, 3.2, 3.4, 29, 3.5

Tamura, H., Yamamoto, H. and Katayama, A. Mixed reality: Future dreams seen at the border between real and virtual worlds. Computer Graphics and Applications 21, 6 (Nov./Dec. 2001), pp. 64-70. 2.1.3, 22, 3.2, 3.4

Tang, A., Owen, C., Biocca, F. and Mou, W. Comparative effectiveness of augmented reality in object assembly. In CHI03: Proc. SIGCHI Conf. on Human Factors in Computing Systems (Ft. Lauderdale, Florida, USA, 2003), ACM Press, pp. 73-80. ISBN 1-58113-630-7. 4

Tonnis, M., Sandor, C., Klinker, G., Lange, C. and Bubb, H. Experimental evaluation of an augmented reality visualization for directing a car driver's attention. In ISMAR05: Proc. 4th Int'l Symp. on Mixed and Augmented Reality (Vienna, Austria, Oct. 2005), IEEE CS Press, pp. 56-59. ISBN 0-7695-2459-1. 21, 3.1

Vogt, S., Khamene, A. and Sauer, F. Reality augmentation for medical procedures: System architecture, single camera marker tracking, and system evaluation. Int'l Journal of Computer Vision 70, 2 (2006), pp. 179-190. 3.3

Wagner, D. and Schmalstieg, D. First steps towards handheld augmented reality. In ISWC03: Proc. 7th Int'l Symp. on Wearable Computers (Washington, DC, USA, 2003), IEEE CS Press, p. 127. ISBN 0-7695-2034-0. 2.1.3

Weiser, M. The computer for the 21st century. Scientific American 265, 3 (Sep. 1991), pp. 94-104. 5

Willers, D. Augmented reality at Airbus. In ISMAR06: Proc. 5th Int'l Symp. on Mixed and Augmented Reality (Santa Barbara, CA, USA, Oct. 2006), Airbus CIMPA GmbH, IEEE CS Press. 2.2.2, 24, 3.2, 5

Winkler, T., Kritzenberger, H. and Herczeg, M. Mixed reality environments as collaborative and constructive learning spaces for elementary school children. 2003. 3.6