Augmented reality: from eclectic history to today’s electricity
By John J. Simmins, Gerald R. Gray & Norman McCollough
You might not have realized it, but augmented reality (AR) has been part of your college football experience since 1998. Thanks to a company called Sportvision, viewers at home know when a team gets a first down before viewers at the stadium. How? TV viewers see the yellow first-down line drawn across the screen. That is augmented reality, but not virtual reality.
Virtual reality replaces your entire realm of sensory perception with interactions with an electronic device; augmented reality uses computer-generated imagery and sounds to enhance the real world.
Back to the game. During it, you're watching computer-generated information overlaid on a view of the real world: augmented reality. It works well on television, but could it be used in the workplace?
It can, even in our industry. So, let’s talk augmented reality—how it started and how it now applies to your job.
A little history
The first true mention of augmented reality appears in L. Frank Baum's 1901 novel The Master Key. Baum, who also wrote The Wonderful Wizard of Oz, describes a set of spectacles that reveal the character of people: "While you wear them, everyone you meet will be marked upon the forehead with a letter indicating his or her character. The good will bear the letter 'G,' the evil the letter 'E.' The wise will be marked with a 'W' and the foolish with an 'F.' The kind will show a 'K' upon their foreheads and the cruel a letter 'C.'"
Augmented reality remained the stuff of fiction until the mid-20th century, when it began to move from fictional to possible.
In 1962, Morton Heilig's work in what he labeled "experience theater" culminated in a patent. Heilig, known as the father of virtual reality, tried to develop an experience that went past simple visual stimulation to include sound, smell and touch. He called it "Sensorama," and the original run included a series of journeys: a motorcycle ride through Brooklyn (complete with seat vibrations mimicking the movement of the bike and the smell of baking pizza wafting in from a passing shop) and a view of a belly dancer (with the whiff of perfume).
As a harbinger of the ultimate utility of augmented reality, the patent for the Sensorama mentions a use for the technology where "there are increasing demands today for ways and means to teach and train individuals without actually subjecting the individuals to possible hazards of particular situations.”
Heilig's device was technically virtual reality, but it sparked research throughout the sixties, much of it inspired by the space program, that culminated in the Sword of Damocles, the first augmented reality system, developed by famed computer scientist Ivan Sutherland at Harvard University in 1968. The system was a head-mounted display (HMD) suspended from the ceiling, through which the viewer was fed computer-generated graphics. This device came much closer to achieving what Baum envisioned at the beginning of the century.
Unwieldy—it was suspended from the ceiling, after all—and primitive, it nevertheless tracked the user’s gaze and kept the projected images in the proper location.
It took two scientists at Boeing, Thomas Caudell and David Mizell, to put augmented reality to work. In analyzing the complexity of manufacturing the Boeing 747, they realized they needed to digitize the documentation and create an environment in which workers could access necessary drawings and directions without leaving the immediate worksite. What Caudell and Mizell developed were see-through goggles that augmented the worker's field of view with useful, dynamically changing information. And it was Caudell who coined the term "augmented reality."
From Oz to EPRI
EPRI's research in augmented reality started with a question: "If you can project a first-down marker onto reality at a football game, why can't you project GIS (geospatial information system) information onto reality in a work environment?" The project is called "Field Force Data Visualization" (FFDV) and relies on three technologies:
- Augmented reality – overlay GIS data, measurements, menus, and documentation on an interactive view of the utility distribution or transmission environment.
- GIS context awareness – knowing the location of the worker relative to utility assets, customizing the user’s experience and defining what the user can and cannot do.
- Common Information Model (CIM) messaging – using CIM to communicate with multiple back office systems to accomplish the allowed workflows.
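As an illustration of that third item, a CIM-style request to a back-office system might be assembled along the lines of the sketch below. The element names loosely follow the IEC 61968 verb/noun header pattern, but the exact structure, and the asset identifier shown, are illustrative assumptions rather than a conformant message.

```python
import xml.etree.ElementTree as ET

def build_asset_query(mrid, verb="get", noun="Asset"):
    """Build a minimal, illustrative CIM-style request envelope.

    The Header/Request layout sketches the IEC 61968 verb/noun pattern;
    it is a simplified example, not a conformant message.
    """
    msg = ET.Element("RequestMessage")
    header = ET.SubElement(msg, "Header")
    ET.SubElement(header, "Verb").text = verb      # what to do
    ET.SubElement(header, "Noun").text = noun      # what kind of object
    request = ET.SubElement(msg, "Request")
    ET.SubElement(request, "mRID").text = mrid     # CIM master resource ID
    return ET.tostring(msg, encoding="unicode")

# Example: query a (hypothetical) transformer by its mRID.
print(build_asset_query("TX-1234"))
```

The same envelope shape, with different verbs and nouns, could carry the work-order and status messages mentioned below.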
The concept of FFDV was to develop an integrated field force tool for managing work and maintaining assets in the field, one that can be used across the industry. The envisioned applications include:
- Viewing asset maintenance manuals
- Performing storm damage assessment or inspections
- Accessing asset information in the field
- Facilitating switching communications
- Integrating work-order information flows
- Obtaining real-time system status validation
- Visualizing faults in the field
- Overlaying any data in the field, such as weather or operational data
- Viewing and analyzing power quality data in the field
- Using the same technology in the control center as well as in the field devices
A key use case: allow a field crew to identify all relevant data for the network at the crew's current location. From this interface, the crew could navigate through all the data for a transformer: identify its location in the GIS, view a single-line diagram, see the downstream circuit on a map, and query its asset history, maintenance history, manufacturer information and catalogue. Upon arriving in a street, the GPS receiver, magnetometer (compass) and gyroscopes built into a tablet would identify the crew's location and orientation. The crew member could then hold up the mobile device and see real-time graphical information overlaid on the camera's view of the area, just like that line drawn on the TV during the game.
The data would be correlated with a single-line diagram and a map. The crew member could query the state of a switch, identify premises with outages, automatically tag devices, and so on. This would all be possible with a properly integrated data environment and accurate GIS data.
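The overlay step described above can be sketched as a small geometry problem: given the device's GPS position and compass heading, compute the bearing from the crew to an asset and check whether that bearing falls inside the camera's horizontal field of view. The function names and the 60-degree field of view below are assumptions for illustration, not values from the EPRI tool.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def in_view(device_heading, asset_bearing, fov_deg=60):
    """True if the asset's bearing lies within the camera's horizontal field of view.

    The difference is wrapped to (-180, 180] so headings near north compare correctly.
    """
    diff = (asset_bearing - device_heading + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2
```

An asset that passes the `in_view` test would then be drawn at a screen position proportional to that angular difference, which is essentially how the televised first-down line stays pinned to the field.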
As a first step in bringing this technology to fruition, EPRI is developing a field force tool on an iOS-based device (think iPad or iPhone) platform, based on the Common Information Model (CIM) for messaging. The application uses GIS context awareness to determine the options available to the user. What the user sees, and the operations they are able to perform, depend on their role and their spatial relationship to the utility asset. For example, if the user has permission, they might have the following options:
- Schedule maintenance
- View the manual of the nearest device
- Move the asset location (“redlining”)
- View location details – asset history
- Report storm damage
- Take a photo of the asset or site
The user can access the menu by tapping on the symbol for the asset in the map view or in the augmented reality view. Details for storm damage assessment, asset maintenance, etc. will be on subsequent submenus and forms.
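That role- and proximity-based filtering can be sketched as follows. The role names, permission strings and 50-meter proximity limit are hypothetical placeholders; the point is only that the menu is the intersection of what the role allows and what the user's distance from the asset permits.

```python
# Hypothetical role model: which menu options each role may ever see.
ROLE_PERMISSIONS = {
    "lineworker": {"view_manual", "report_damage", "take_photo"},
    "inspector": {"view_manual", "report_damage", "take_photo",
                  "schedule_maintenance", "view_asset_history"},
}

# Options that make sense even when the user is far from the asset.
READ_ONLY_OPTIONS = {"view_manual", "view_asset_history"}

def menu_options(role, distance_m, proximity_limit_m=50.0):
    """Menu options for a tapped asset.

    The role filters what the user may ever do; distance to the asset
    gates actions that require actually being on site.
    """
    allowed = ROLE_PERMISSIONS.get(role, set())
    if distance_m > proximity_limit_m:
        # Too far away: only the read-only options remain.
        allowed = allowed & READ_ONLY_OPTIONS
    return sorted(allowed)

# Example: an inspector standing at the asset sees the full inspector menu;
# a lineworker 500 m away sees only read-only options.
print(menu_options("inspector", 10.0))
print(menu_options("lineworker", 500.0))
```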
EPRI has installed one demonstration project and plans three more this year. Each successive implementation will add more capability to the platform, and augmented reality will become more real than Baum or Heilig could have dreamed, and for a better purpose than smelling a belly dancer's perfume, by far. One could even venture that the working life of augmented reality may surpass its great use in college football, if EPRI has its way.