
How AR and VR Add Dimension to AEC

Virtual reality (VR) and its sibling, augmented reality (AR), have been around in various forms for several decades, but until recently they remained the exclusive domain of cutting-edge, technologically sophisticated design firms with savvy, well-funded clients who could afford the significant hardware and software costs. Recent, drastic reductions in those costs, driven in part by the rise of high-quality consumer gaming computers, are finally bringing these technologies within reach of the design and communication processes of our AEC industry.

Virtual reality, in which a user participates in an immersive visual environment by wearing a stereoscopic headset, allows viewers to navigate a 3D environment at many different scales, from a single room to an entire city. This technology is especially useful for exploring new building designs and user experiences, and for understanding the relationships between spaces and a user's emotional connection with a proposed design.

Augmented reality differs from VR in a key way: instead of creating a fully immersive environment, it places a virtual digital overlay on top of physical reality. There are different implementations of AR, from apps that allow a phone or tablet to be used as a viewing “portal” to transparent goggles that overlay 3D scenes and information directly onto the user’s field of vision.

While both technologies have evolved rapidly in the last decade, a number of substantial hurdles remain before either offers a seamless experience that an untrained user can pick up “off the shelf.” Perhaps the single largest challenge is the slight lag in headset graphics that accompanies head movements, producing a disorienting sense of imbalance or even nausea; the scene must be redrawn within roughly 10 to 20 milliseconds of each head turn for the illusion to hold. This can be overcome through practice and experience, however. Currently, AR solutions lag behind VR solutions in this respect: because VR is totally immersive, there is no view of the real world for the graphics to stay registered with, whereas AR must continuously align its simulated overlay with the user’s view of the physical environment.

Even measured against the recent, rapid evolution of these technologies, the start of the new decade is a true watershed moment for virtual reality applications. The revolution arrives in the form of real-time ray tracing, long considered the holy grail of rendering and visualization. New chips developed by Nvidia, which began shipping in 2019, allow real-time depiction of lighting, acoustics, thermodynamics and many other physical phenomena whose simulation relies on massively parallel geometric transformations.
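
To make “massively parallel” concrete, here is a minimal, illustrative sketch of ray tracing in Python: every ray is independent of every other, so an entire image’s worth of rays can be intersected and shaded in a single batch of array operations. The one-sphere scene, the light and all names are invented for illustration; RTX-class hardware runs the same idea on dedicated silicon at vastly higher ray counts and with full global illumination.

# A toy ray tracer, illustrative only: one sphere, one light, and every
# primary ray intersected and shaded at once as NumPy array operations.
# The per-ray independence is exactly what GPUs exploit at scale.
import numpy as np

def trace_sphere(width=320, height=240):
    # Generate one primary ray per pixel, all in a single batch.
    xs = np.linspace(-1.0, 1.0, width)
    ys = np.linspace(-0.75, 0.75, height)
    px, py = np.meshgrid(xs, ys)
    dirs = np.stack([px, py, np.ones_like(px)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

    center = np.array([0.0, 0.0, 3.0])      # sphere center (invented scene)
    radius = 1.0
    light = np.array([0.6, 0.8, -0.2])      # direction toward the light
    light /= np.linalg.norm(light)

    # Ray-sphere intersection solved for every pixel simultaneously.
    oc = -center                            # camera sits at the origin
    b = 2.0 * dirs @ oc
    c = oc @ oc - radius * radius
    disc = b * b - 4.0 * c
    hit = disc > 0.0
    t = (-b - np.sqrt(np.where(hit, disc, 0.0))) / 2.0

    # Simple diffuse (Lambertian) shading at each hit point.
    points = t[..., None] * dirs
    normals = (points - center) / radius
    shade = np.clip(normals @ light, 0.0, 1.0)
    return np.where(hit, shade, 0.0)        # grayscale image, 0..1

image = trace_sphere()
print(image.shape, float(image.max()))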

What does this mean for architectural design? For the first time, users will be able to experience physically accurate, real-time lighting conditions that are essential to our understanding of spaces and environments, not just in terms of technical aspects like scale or illumination metrics but in the more personal, intangible terms of emotional connection, sense of place and “vibe.” Users will be able to rapidly explore design options, cycling through different daylighting scenarios or evaluating a number of lighting schemes—all without leaving the headset environment or waiting for time-consuming pre-processing.
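
As a rough sketch of what that kind of option-cycling looks like computationally, the toy Python below re-evaluates direct-sun incidence on two surfaces for several times of day, with no precomputation step in between; each scenario is simply another function call. The solar arc is a crude invention for illustration and the surface setup is hypothetical; real daylighting tools (Radiance, for example) use true solar positions and sky models.

# Toy daylighting sweep: every scenario is a fresh evaluation, nothing
# is precomputed. The solar path is a crude stand-in (an east-to-west
# arc with a fixed southward tilt), not a real solar almanac.
import numpy as np

def sun_vector(hour):
    # Unit vector toward the sun; axes: x = east, y = south, z = up.
    theta = np.pi * (hour - 6.0) / 12.0   # 6:00 -> sunrise, 18:00 -> sunset
    v = np.array([np.cos(theta), 0.4, np.sin(theta)])
    return v / np.linalg.norm(v)

ROOF = np.array([0.0, 0.0, 1.0])          # horizontal surface normal
SOUTH_WALL = np.array([0.0, 1.0, 0.0])    # hypothetical south-facing facade

for hour in (8, 10, 12, 14, 16):
    s = sun_vector(hour)
    # cos(incidence) drives the direct-beam contribution on each surface.
    roof = max(float(s @ ROOF), 0.0)
    wall = max(float(s @ SOUTH_WALL), 0.0)
    print(f"{hour:02d}:00  roof {roof:.2f}  south wall {wall:.2f}")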

In addition to expanding our creative imagination, this technology will permit fast, accurate simulation and exploration of other physical phenomena. Real-time acoustical modeling will improve both the quality and the accessibility of spaces by allowing designers to understand the complex interactions of geometry and materials that shape the acoustical character of an environment. Real-time thermal radiation visualizations will improve our ability to design sustainably, treating thermal management as another tool in our kit rather than an esoteric mathematical specialty.
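
Acoustics lends itself to the same machinery because sound, like light, can be approximated with rays. The toy Python below bounces random rays around a shoebox room, draining energy at every wall hit, and estimates how long the sound field takes to decay by 60 dB (a rough RT60). The room dimensions and the single broadband absorption coefficient are invented; real engines trace far more rays against full 3D geometry with frequency-dependent materials.

# Toy acoustic ray cast: specular bounces in a shoebox room with a
# uniform absorption coefficient, used to estimate a rough RT60.
# All values are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
ROOM = np.array([10.0, 7.0, 3.5])    # room dimensions, meters (invented)
ABSORB = 0.15                        # average absorption coefficient
SPEED = 343.0                        # speed of sound, m/s

def estimated_rt60(n_rays=2000):
    times = []
    for _ in range(n_rays):
        pos = ROOM / 2.0                         # source at room center
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        energy, travelled = 1.0, 0.0
        while energy > 1e-6:                     # stop at -60 dB
            # Distance to the nearest wall along the current direction.
            t = np.where(d > 0, (ROOM - pos) / d,
                         np.where(d < 0, -pos / d, np.inf))
            wall = int(t.argmin())
            pos = pos + t[wall] * d
            d[wall] = -d[wall]                   # specular reflection
            travelled += t[wall]
            energy *= 1.0 - ABSORB               # wall absorbs energy
        times.append(travelled / SPEED)
    return float(np.mean(times))

print(f"estimated RT60 ~ {estimated_rt60():.2f} s")

For this particular box, the estimate should land near the one-second figure that Sabine’s classic formula, 0.161·V/(S·α), predicts for the same room, which is a useful sanity check on the ray approach.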

Beyond these ray-casting models, many other complex simulations benefit from these parallel-processing GPU (graphics processing unit) architectures. While many of these techniques have been around for decades, real-time performance transforms them from static evaluations into interactive design tools. Structural dynamics can be overlaid and optimized, improving material efficiency and communication between disciplines. Pedestrian and vehicle traffic models can be accurate, massive and modified “on the fly” to better understand the unforeseen implications that small relational changes can have on complex systems.
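
The pattern that makes such models GPU-friendly is the same one running through the examples above: update every element at once. In the toy crowd step below, all agents advance in one set of vectorized operations (NumPy here; swapping in CuPy or JAX would move the identical structure onto a GPU). The forces and parameters are invented placeholders rather than a calibrated pedestrian model such as the social-force family.

# Toy pedestrian step: every agent is updated simultaneously with
# vectorized array math. Forces and constants are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
N = 500
pos = rng.uniform(0.0, 50.0, size=(N, 2))   # agent positions, meters
EXIT = np.array([50.0, 25.0])               # hypothetical doorway location

def step(pos, dt=0.1, desired_speed=1.4, repel=0.3):
    # Drive each agent toward the exit (one vector op for all N agents).
    to_exit = EXIT - pos
    dist = np.linalg.norm(to_exit, axis=1, keepdims=True)
    drive = desired_speed * to_exit / np.maximum(dist, 1e-9)

    # Pairwise inverse-square repulsion between agents. O(N^2) here;
    # production models use spatial hashing to keep neighbor queries cheap.
    diff = pos[:, None, :] - pos[None, :, :]
    d2 = (diff ** 2).sum(axis=-1) + 1e-9
    push = repel * (diff / d2[..., None]).sum(axis=1)

    return pos + (drive + push) * dt

for _ in range(100):                        # ten simulated seconds
    pos = step(pos)
print("mean distance to exit:", float(np.linalg.norm(EXIT - pos, axis=1).mean()))

Because each scenario is just a parameter change, a designer could nudge a doorway or widen a corridor and rerun the crowd immediately, which is exactly the “on the fly” interactivity described above.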

These possibilities have long been touted as the future of VR/AR, and proofs of concept in one form or another have existed for at least the last decade. However, the technology is finally catching up to the promise, and the coming decade will see the impact of high-quality, affordable and flexible virtual environments on the design and construction of our own physical reality.

Will Tufts

Will Tufts is an architectural designer with experience in master planning, urbanism, vernacular architecture, computational and parametric design, energy modeling, and 3D rendering and animation.