Augmented Reality in AEC

AECbytes Feature (July 30, 2015)
While the concept of virtual reality is now very familiar to most technology professionals and its use commonplace in many industries including AEC, there is another concept that is slowly emerging and gaining ground—that of “augmented reality.” In contrast to the fully immersive experience of virtual reality (VR), augmented reality (AR) “augments” physical reality with additional relevant content instead of replacing it altogether. A whole industry of augmented reality technologies is emerging, with dedicated conferences showcasing the latest developments in AR hardware and software as well as individual applications. A case in point is the annual Augmented World Expo (AWE) event, which I have attended for two years now to get a better understanding of AR technology and how it is evolving.
While the commercial application of AR in AEC is still quite nascent, we are starting to see some interesting ideas emerge as well as some early-stage developments. The recent 2015 AWE conference had an entire track devoted to AR in Real Estate, Urban Design, and Construction, and while it reinforced the fact that it is still early days, it did provide some measure of how the technology might be useful. Let’s take a closer look at augmented reality and its potential application in AEC.
How Augmented Reality Works
Since augmented reality adds content to a real scene instead of replacing it altogether, it requires being able to see physical reality in conjunction with a filter or lens that overlays the additional content on top of it. Thus, there is a hardware component to the technology, which currently falls into two main categories—mobile phones or tablets that need to be held up in front of the viewer, or wearable glasses through which both the real scene and the augmented content can be viewed. An example of the first category is illustrated in Figure 1, which shows an app developed by ViewAR for a manufacturer of outdoor furniture. It allows a customer to use a smartphone or tablet to visualize different types of outdoor furniture from the manufacturer’s catalog against their actual outdoor space to better assist them in making a selection. This is by far the most common category of AR since smartphones are so ubiquitous, making AR available to everyone—all they need to do is download an app. Not surprisingly, the number of AR apps added to both the iOS and Android app stores continues to multiply every day.
The second category, wearable glasses, is far from ubiquitous—in fact, most people do not have ready access to the specialized glasses that are needed to see augmented reality. However, when available, they have a definite advantage over smartphones and tablets in that they do not need to be constantly held in front of you to experience an AR-enhanced display, which can get quite tiresome after some time. In contrast, the glasses are hands-free, and provide a continuous and steady AR experience until they are taken off (Figure 2). Currently, the glasses that are available are quite heavy and unwieldy, and it is not surprising that a lot of the development in AR hardware is happening on the glasses front in an attempt to make them lighter and easier to wear for a longer period of time. In fact, over time, they can even evolve into a “fashion accessory” like the Apple Watch and be included in the trendy “wearable tech” category.
So what is it exactly that triggers the display of augmented content? There are two approaches to enabling this. The first, and less complicated, approach is to use a marker that is recognized by the AR application using the camera of the viewing device; it then overlays the augmented content at that location. The simplest kind of marker is a square 2D pattern, which can be easily recognized by the application when parsing through camera data. The example shown in Figure 2 used this kind of marker, with the display of the augmented data for each chemical element triggered by the unique pattern printed on the back of the card for the element. Markers can also be more complex, such as the example shown in Figure 3, where the 2D drawing of a house acts as the trigger for the AR display of its 3D model. What makes marker-based AR technology easier to develop is that it is hard-coded in the application. In other words, the application is programmed to recognize a certain pattern in a scene through the device’s camera, and when it is found, it simply brings up the AR content associated with that pattern. The most basic AR apps are developed to recognize only a single pattern and show the AR content associated with it—an example being a free AR basketball game I downloaded on my phone just to try it out—while more expansive AR apps can recognize several triggers and display the corresponding content—such as the Augment app that powers the AR display shown in Figure 3.
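The marker-based approach can be sketched in a few lines of code. The toy example below (plain Python, not a real computer-vision library, and with entirely hypothetical data) treats a camera frame as a binary grid, searches it for one hard-coded 2D pattern, and anchors the associated AR content wherever the pattern is found—which is essentially what a single-marker app does, just with real image processing in place of the grid match:

```python
# A hard-coded 3x3 marker pattern, standing in for the square 2D
# patterns a real marker-based AR app is programmed to recognize.
MARKER = [
    [1, 0, 1],
    [0, 1, 0],
    [1, 0, 1],
]

def find_marker(frame, marker=MARKER):
    """Slide the marker over the frame (a 2D grid of 0s and 1s) and
    return the (row, col) of the first exact match, or None."""
    mh, mw = len(marker), len(marker[0])
    for r in range(len(frame) - mh + 1):
        for c in range(len(frame[0]) - mw + 1):
            if all(frame[r + i][c + j] == marker[i][j]
                   for i in range(mh) for j in range(mw)):
                return (r, c)
    return None

def overlay_content(frame, content_id):
    """If the marker is found, anchor the AR content at its location."""
    pos = find_marker(frame)
    if pos is None:
        return None
    return {"content": content_id, "anchor": pos}

# A 5x6 "camera frame" with the marker embedded at row 1, column 2.
frame = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 0, 1, 0],
    [0, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 0],
    [0, 0, 0, 0, 0, 0],
]
print(overlay_content(frame, "house_3d_model"))  # anchor at (1, 2)
```

An app that recognizes several triggers, like the Augment app mentioned above, simply repeats this matching step for each pattern in its catalog.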
The second approach to triggering the display of AR content is by detecting the location and field of view of the AR device provided by its camera, compass and GPS data—which most mobile devices already have and which all specialized AR glasses need to have—and displaying the data relevant to that location. This type of AR is also called “marker-less” as it does not rely on a hard-coded marker as a trigger. Needless to say, this type of AR is much more difficult to develop as it must rely not only on precisely identifying the location and orientation of the user but also on recognizing, on the fly, the patterns, images, or other features that will be needed to trigger the corresponding AR displays. An example of this kind of AR implemented on a mobile phone for mapping and navigation is shown in Figure 4.
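The geometric core of the marker-less approach can be illustrated with a minimal sketch: given the device’s GPS position, compass heading, and the camera’s horizontal field of view, compute the bearing to a point of interest and decide where (if at all) its overlay belongs on screen. This is a hypothetical simplification for illustration, not any vendor’s actual implementation:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(device_lat, device_lon, heading_deg, fov_deg, width_px,
             poi_lat, poi_lon):
    """Horizontal pixel at which a point of interest should be drawn,
    or None if it lies outside the camera's field of view."""
    # Bearing relative to where the camera is pointing, in -180..180
    # (0 means the POI is straight ahead).
    rel = (bearing_deg(device_lat, device_lon, poi_lat, poi_lon)
           - heading_deg + 180) % 360 - 180
    if abs(rel) > fov_deg / 2:
        return None
    return round((rel / fov_deg + 0.5) * width_px)

# Device at the origin facing due east (heading 90), 60-degree FOV,
# 640 px wide; a POI slightly to the east lands at screen center.
print(screen_x(0, 0, 90.0, 60.0, 640, 0, 0.001))  # 320
```

A real marker-less app layers considerable machinery on top of this—sensor fusion to stabilize the noisy compass, and on-the-fly feature recognition to refine where the overlay is pinned—which is why this category is so much harder to develop.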
And finally, of course, there is the multitude of AR apps, each of which has been developed to address a specific task, whether it is to augment 2D information on paper with 3D content—used primarily in education, retail, toys and games, etc.—or augment real-world scenes with additional useful information—potentially useful in almost every field of human activity, including AEC. The AEC track at the recent AWE conference provided a good sense of the current implementation of AR in the field.
AEC Track at the AWE Conference
Much of augmented reality at the moment is focused on consumer and retail applications, and this seems to be true of AEC as well. At the Real Estate, Urban Design, and Construction track at AWE, we saw an app developed by an Austrian company, Imabis, which is focused on providing real estate data to potential clients. As shown in Figure 5, it lets you see available properties for sale or rent, their prices, and other relevant data on a phone as you are viewing a neighborhood through it. This is a good example of the marker-less AR technology that was described earlier.
Another application of AR in real estate sales was demonstrated by the start-up company Augmented Pixels, headquartered in Silicon Valley. While Augmented Pixels also develops AR apps for other fields such as retail, its prime focus is the real estate market, where its AR apps can allow real estate developers to provide potential clients with the opportunity to visualize a building in 3D and get a better understanding of it. This can happen in two ways. The first is through markers in traditional 2D print material such as real estate brochures, allowing customers to see the corresponding content in 3D, similar to the application shown earlier in Figure 3. The second method is more novel—it uses a physical 3D model as the trigger for different AR experiences that can allow the customer to get a better understanding of the interior details of a building that cannot be provided by a static physical model. As shown in the top image of Figure 6, when the physical model is viewed using the Augmented Pixels app on a tablet, several markers are displayed, each of which is tied to a different AR experience that can be selected by clicking on it. Some of the markers simply provide additional information, while others launch more sophisticated interactions such as the ability to explore the interior of a space in 3D (lower image of Figure 6).
Augmented reality is also becoming the subject of academic research in AEC, as demonstrated by two presentations in the AEC-related track at the AWE conference. A group at NYU’s Media Lab has developed an AR app that augments the NYU college campus when viewed through a smartphone with additional useful information about the different buildings, navigation aids, safety features, etc. (Figure 7). In another presentation, we saw several AR projects that have been developed at the VTT Technical Research Centre of Finland, the largest research institute in Northern Europe. Its AR development has spanned a wide spectrum of fields including games and entertainment, print and advertising, interior design, tourism, industrial production and maintenance, city planning, and construction. A more recent research project called “DigiSpaces” is focused on the use of AR for building maintenance (Figure 8). The idea is that the BIM model of a building is used to augment the actual physical views seen by a maintenance worker—through a phone or tablet—and can be used to highlight problem areas, look through a surface at the underlying building elements that are otherwise hidden, get additional information on any equipment including maintenance notes and links if they are included, and if required, visualize the building as it was built compared to how it was designed.
While Augmented Reality as a technology is rapidly gaining momentum, it has yet to show real potential in mainstream AEC, as evidenced not only by the presentations at the AEC-related track at the AWE conference but also by the lack of any visible AR development by mainstream AEC technology vendors such as Autodesk, Bentley, Graphisoft, and others. In fact, the only AR-related technology I am aware of that is being developed by a leading vendor is Trimble’s work on Microsoft’s HoloLens platform with SketchUp and Trimble Connect (Figure 9). I had the chance to get a demo of this technology in early May, and what it essentially does is project a 3D hologram of a SketchUp model in physical space that can be seen and interacted with by a user wearing the HoloLens headset. It also allows a construction problem on site to be seen as a hologram and collaboratively resolved with another team member by interacting with it. Microsoft and Trimble refer to this technology as “mixed reality” rather than “augmented reality,” but the two technologies do not seem that far apart, both conceptually and in terms of their potential application in AEC.
Going forward, it would seem as though the most compelling applications of AR in AEC are for design and construction visualization, and building and infrastructure maintenance. Skeptics will question whether we need the technology in the first place given that we already have great visualization tools on the computer, but it is undeniable that, at least for construction and especially maintenance, it would be extremely helpful to “see” inside a structure at what lies underneath a surface. Imagine a contractor being able to determine the exact location of studs in a wall or pipes under a slab and avoid unnecessary drilling and cutting. Or a maintenance engineer being able to see the exact location of utilities under a street and know exactly where to dig for access. Of course, this requires models of buildings and infrastructure to be brought into an AR platform and superimposed exactly on top of the corresponding real-world elements, which is a complex undertaking—although, judging from its presentation at AWE, VTT has made some progress on that front.
One thing, however, is certain—augmented reality may be a promising technology, but it will be a while before we can see some tangible benefits of it in AEC.
Related Archive Articles
- The "Internet of Things" in AEC
  While the "Internet of Things" can be used to make buildings with smarter controls and sensors once they are built and inhabited, can it also be applied in the design and construction phases?
- iPad Apps for AEC: Design and Visualization
  An overview of Graphisoft's BIMx app for iPad, Autodesk's new cloud strategy and the Autodesk Design Review app, the iVisit 3D app, and the Inception app from Architactile.
- SketchUp Pro 2014
  This review explores the key new features in SketchUp Pro 2014, the paid professional version of SketchUp, and, in particular, the increasing AEC-specific and BIM-related capabilities that are being added to it under the Trimble umbrella.
- AEC Technology Updates, Fall 2014
  Updates including Vectorworks Architect 2015, Newforma Project Center Eleventh Edition, form•Z 8, Vico Office R5, Dassault Systèmes "Façade Design for Fabrication" tool, and more.
- Autodesk's 2015 Building Design Portfolio
  Improvements include expanding the capabilities of FormIt and Dynamo, extending the scope and scale of Revit especially for fabrication and construction, enhancing the point cloud capabilities across all of Autodesk's modeling products, tighter integration with Autodesk cloud services, and improved analysis and simulation.