Tracking Requirements for Augmented Reality

Ronald Azuma

In this issue, Fitzmaurice and Feiner describe two different augmented reality systems. Such systems require highly capable head and object trackers to create an effective illusion of virtual objects coexisting with the real world. For ordinary virtual environments that completely replace the real world with a virtual world, it suffices to know the approximate position and orientation of the user's head. Small errors are not easily discernible because the user's visual sense tends to override the conflicting signals from his vestibular and proprioceptive systems. But in augmented reality, virtual objects supplement rather than supplant the real world. Preserving the illusion that the two coexist requires proper alignment and registration of the virtual objects to the real world. Even tiny errors in registration are easily detectable by the human visual system. What does augmented reality require from trackers to avoid such errors?

First, a tracker must be accurate to a small fraction of a degree in orientation and a few millimeters (mm) in position. Errors in measured head orientation usually cause larger registration offsets than object orientation errors do, making this requirement more critical for systems based on Head-Mounted Displays (HMDs). Try the following simple demonstration. Take out a dime and hold it at arm's length. The diameter of the dime covers approximately 1.5 degrees of arc. In comparison, a full moon covers 1/2 degree of arc. Now imagine a virtual coffee cup sitting on the corner of a real table two meters away from you. An angular error of 1.5 degrees in head orientation moves the cup by about 52 mm. Clearly, small orientation errors could result in a cup suspended in midair or interpenetrating the table. Similarly, if we want the cup to stay within 1 to 2 mm of its true position, then we cannot tolerate tracker positional errors of more than 1 to 2 mm.
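The geometry behind these numbers is the small-angle arc-length relation: the lateral offset of an object at distance d caused by an angular error θ is approximately d·θ (θ in radians). A quick check (illustrative only, not from the original article):

```python
import math

def registration_error_mm(distance_m, angular_error_deg):
    """Lateral offset (mm) of a virtual object at the given distance,
    caused by an angular error in the measured head orientation.
    Small-angle approximation: offset ~= distance * angle (radians)."""
    return distance_m * 1000.0 * math.radians(angular_error_deg)

# Coffee cup on a table 2 m away, 1.5-degree head-orientation error:
print(round(registration_error_mm(2.0, 1.5)))  # ~52 mm
```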

Second, the combined latency of the tracker and the graphics engine must be very low. Combined latency is the delay from the time the tracker subsystem takes its measurements to the time the corresponding images appear in the display devices. Many HMD-based systems have a combined latency of over 100 milliseconds (ms). At a moderate head or object rotation rate of 50 degrees per second, 100 ms of latency causes 5 degrees of angular error. At a rapid rate of 300 degrees per second, keeping angular errors below 0.5 degrees requires a combined latency of under 2 ms!
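The latency budget follows from the same arithmetic: angular error equals rotation rate times latency. The two figures quoted above can be reproduced directly (an illustrative sketch, not from the article):

```python
def latency_error_deg(rate_deg_per_s, latency_ms):
    """Angular registration error accumulated while the image lags the head."""
    return rate_deg_per_s * latency_ms / 1000.0

def max_latency_ms(rate_deg_per_s, error_budget_deg):
    """Largest combined latency that keeps the error within budget."""
    return error_budget_deg / rate_deg_per_s * 1000.0

print(latency_error_deg(50, 100))  # 5.0 degrees at a moderate rotation rate
print(max_latency_ms(300, 0.5))    # ~1.67 ms budget at a rapid rate
```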

Finally, the tracker must work at long ranges. When the environment is completely virtual, long-range trackers aren't required because we can create an illusion of flight by translating all the objects around a stationary user. But in augmented reality, flying is not a valid means of locomotion: the virtual objects must remain registered with the real world. Since we cannot translate real objects around a user at the touch of a button, the user must instead move, along with the worn display devices. Thus, many augmented reality applications demand extended-range trackers that can support walking users. For example, Fitzmaurice's active maps and augmented-library applications require trackers that can cover an entire map or all the bookshelves in the library, respectively.

No existing system completely satisfies all of these requirements. Systems commonly used to track airplanes, ships and cars have sufficient range but insufficient accuracy. Many different tracking technologies exist [1], but almost all are short-range systems that cannot be easily extended.

An exception is an optoelectronic system developed by UNC Chapel Hill that can be extended to arbitrary room sizes while still providing reasonable tracking performance. Optical sensors mounted on the head unit view panels of infrared beacons in the ceiling above the user (Photos 1 and 2, Figure 1). The known locations of these beacons and the measurements taken by the sensors provide enough information to compute the position and orientation of the user's head. The system can resolve head motions of under 2 mm in position and 0.2 degrees in orientation, without the distortions commonly seen in magnetic trackers. Typical values for the update rate and latency are 70 to 80 Hz and 15 to 30 ms, respectively. The existing ceiling covers a 10 x 12 foot area, but we can extend the range by simply adding more panels to the ceiling grid. By the time this article is published, a new expanded ceiling that covers approximately 16 x 30 feet should be operational. UNC first demonstrated this system to the public in the Tomorrow's Realities gallery of the ACM's SIGGRAPH '91 conference in Las Vegas, and to our knowledge this is the first demonstrated scalable tracking system for HMDs [2].
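The underlying principle, that known beacon locations plus sensor sighting measurements determine the user's pose, can be illustrated with a deliberately simplified 2-D analogue (a hypothetical sketch; the actual UNC system recovers full 3-D position and orientation with photogrammetric techniques):

```python
import math

def locate_2d(b1, bearing1, b2, bearing2):
    """Recover an observer's 2-D position from absolute bearings (radians)
    to two beacons at known positions b1, b2. The observer lies on the ray
    from each beacon opposite its measured bearing; intersect the rays."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve b1 - t1*d1 = b2 - t2*d2 for t1 by Cramer's rule.
    rx, ry = b1[0] - b2[0], b1[1] - b2[1]
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    t1 = (-rx * d2[1] + d2[0] * ry) / det
    return (b1[0] - t1 * d1[0], b1[1] - t1 * d1[1])

# Beacons straight above and to the right of an observer at (1, 1):
print(locate_2d((1, 3), math.pi / 2, (4, 1), 0.0))  # → (1.0, 1.0)
```

Two beacons suffice in this planar toy; the real problem needs more sightings, since sensor orientation is unknown and measurements are noisy, which is why the ceiling holds many beacons.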

While this system is suitable for augmented reality applications, it is far from ideal. We need to reduce the weight of the head unit and expand the restricted range of head rotation. Due to line-of-sight constraints, this system is not well suited for object tracking, although we do have a "hat" that tracks an ultrasonic wand (Photo 3). Because of the large number of beacons in the ceiling, we sometimes call it "the thousand points of light." Research is needed to develop long-range trackers that require far fewer modifications to the environment. Perhaps the most effective solutions will be technology hybrids. For example, inertial trackers have infinite range, but lose accuracy with time due to accumulated drift. Occasional measurements from several accurate but short-range trackers might control that drift. These and other potential improvements must be explored to meet the stringent requirements of augmented reality.
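As one hedged illustration of the hybrid idea (hypothetical code, not any existing tracker's implementation): dead-reckon from a drifting inertial sensor at a high rate, and pull the estimate toward an absolute fix whenever a short-range tracker happens to see the user.

```python
def hybrid_orientation(gyro_rates, absolute_fixes, dt, gain=1.0):
    """Integrate 1-D angular rates (deg/s) from a drifting gyro; whenever an
    accurate absolute fix (deg) is available at step i, blend toward it.
    absolute_fixes: dict mapping step index -> fix angle."""
    angle = 0.0
    history = []
    for i, rate in enumerate(gyro_rates):
        angle += rate * dt                               # inertial dead reckoning
        if i in absolute_fixes:                          # occasional accurate fix
            angle += gain * (absolute_fixes[i] - angle)  # cancel accumulated drift
        history.append(angle)
    return history

# Stationary user, gyro with a constant 1 deg/s bias, a fix every half second:
biased = [1.0] * 1000                        # pure drift: 10 degrees after 10 s
fixes = {i: 0.0 for i in range(0, 1000, 50)}
drift_only = hybrid_orientation(biased, {}, dt=0.01)
bounded = hybrid_orientation(biased, fixes, dt=0.01)
print(round(drift_only[-1], 2), round(max(bounded), 2))  # 10.0 vs 0.49
```

Without fixes the error grows without bound; with even sparse fixes it stays bounded by the drift accumulated between sightings, which is the appeal of the hybrid.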


The optoelectronic tracker was partially supported by ONR contract N00014-86-K-0680, DARPA contract DAEA 18-90-C-0044, and NSF contract ASC-8920219.


1. Meyer, K., Applewhite, H., Biocca, F. A survey of position trackers. Presence 1, 2 (Spring 1992), 173-200.

2. Ward, M., Azuma, R., Bennett, R., Gottschalk, S., Fuchs, H. A demonstrated optical tracker with scalable work area for head-mounted display systems. In Proceedings of the 1992 Symposium on Interactive 3D Graphics (Mar. 29 - Apr. 1, Cambridge, Mass.). Computer Graphics 1992, 43-52.

CR Categories and Subject Descriptors: I.3.1 [Computer Graphics]: Hardware Architecture - three-dimensional displays; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Virtual Reality

Additional Key Words and Phrases: Augmented Reality, tracking
