Mission critical construction environments, such as data centers, semiconductor fabrication facilities, and life sciences manufacturing plants, present unique challenges that demand robust geospatial management and specialized expertise. While each facility type has its own nuances, they all share a common challenge: incorrectly sized openings, including those for cables, pipes, HVAC duct transitions, and equipment access panels.
This universal and seemingly innocuous issue of openings that are too small, too large, or in the wrong location can lead to serious consequences. Without the help of construction validation processes supported by digital intent models, mission critical facilities may experience significant construction delays, substantial cost overruns, and long-term operational failures, such as equipment damage caused by overheating or contamination.
What Happens When Openings Aren’t Correctly Sized?
The primary issue caused by incorrectly sized or improperly placed openings in walls and floors within mission critical facilities is noncompliance with fire-safety requirements. These facilities must adhere to strict building codes and regulations governing wall and floor penetrations, involving multiple stakeholders throughout design, construction, coordination, programming, and final sign-off. This includes wall installation, mechanical/electrical/plumbing installation, architect/engineer approval, and general contractor accountability.
Each stage requires coordinated input and can influence the work of subsequent trades. If an opening is incorrectly located, it can compromise structural integrity. If it is undersized, MEP systems may not fit properly, which can also hinder the use of modular racks. These issues ultimately introduce negative impacts across the project, affecting schedule, cost, change management, risk management, and stakeholder engagement.
It’s worth noting that for greenfield data center builds, the critical coordination of wall openings is rigidly built into the facility’s modular design, making mistakes of this nature less likely. Wall openings serve as pathways between two elements inside the facility — and if those elements aren’t positioned correctly, the accuracy of the opening’s location becomes irrelevant. In the context of new builds, therefore, the precise placement of prefabricated assemblies is more important than ensuring the opening itself is in the correct location.
The Advantage of Catching Issues Digitally Rather Than Physically
Project owners are acutely aware of the consequences that arise from incorrectly sized or improperly positioned openings. When they inevitably encounter these mistakes, they must undertake tedious and costly rework and modifications, eating up precious time. This challenge is especially pronounced in legacy semiconductor fabs built before building information modeling, which often operate with drawings and models that no longer reflect field conditions.
Although project owners cannot ensure total perfection across all openings on a mission critical construction site, they can control how early incorrect openings are identified. With construction validation processes supported by digital models that accurately reflect what has been built, teams can resolve sizing problems digitally before they become costly issues in the field.
Diagnosing opening issues begins with design teams generating a digital intent model. A geospatial solution partner then translates that model onto the ground, verifies construction against it, and continuously tracks changes as they occur. Typically, geospatial teams use multiple tools — including laser scanning, control networks, building information modeling analytics, and Power BI dashboards — to support progress tracking, identify incorrect openings, and validate installations.
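To illustrate the kind of automated check this workflow enables, the sketch below compares as-designed opening dimensions from an intent model against as-built measurements (such as those derived from laser scans) and flags openings that fall outside an allowed tolerance. All identifiers, dimensions, and the tolerance value here are hypothetical, not drawn from any real project or vendor tool.

```python
# Hypothetical sketch: flag wall/floor openings whose as-built dimensions
# deviate from the design intent model beyond an allowed tolerance.
# All IDs, dimensions, and tolerances are illustrative.

from dataclasses import dataclass

@dataclass
class Opening:
    opening_id: str
    width_mm: float
    height_mm: float

def find_nonconforming(design, as_built, tolerance_mm=5.0):
    """Pair design and as-built openings by ID and report any dimension
    that differs by more than tolerance_mm."""
    built_by_id = {o.opening_id: o for o in as_built}
    issues = []
    for d in design:
        b = built_by_id.get(d.opening_id)
        if b is None:
            issues.append((d.opening_id, "missing in field data"))
            continue
        dw = abs(b.width_mm - d.width_mm)
        dh = abs(b.height_mm - d.height_mm)
        if dw > tolerance_mm or dh > tolerance_mm:
            issues.append((d.opening_id, f"width off {dw:.1f} mm, height off {dh:.1f} mm"))
    return issues

design = [Opening("W-101", 600, 400), Opening("W-102", 300, 300)]
as_built = [Opening("W-101", 612, 401), Opening("W-102", 301, 299)]
print(find_nonconforming(design, as_built))
# → [('W-101', 'width off 12.0 mm, height off 1.0 mm')]
```

In practice, the as-built dimensions would come from validated scan data tied to the project's control network, and flagged items would feed the dashboards described above rather than a simple printout.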
In the mission critical sector, digital and virtual services have become fully embedded into the construction process and are pivotal to improving schedule accuracy, cost certainty, change management, and risk mitigation. Nevertheless, these tools are the minimum entry point. In such highly complex construction environments, simply identifying issues isn't enough, especially because the solution is not always obvious.
Geospatial Technology Is Table Stakes; People-Centric Collaboration Is the Difference Maker
Consider a math professor who marks only the incorrect answers on their students’ tests. These redlines tell the students which questions they got wrong but don’t provide the formulas to help them arrive at the correct answers. The same holds for mission critical construction environments with incorrectly sized openings: a geospatial partner or construction validator should do more than just flag issues; they should propose genuine solutions.
A geospatial partner should support the entire resolution process, working directly with contractors to adjust designs, improve installation methods, and maintain accuracy across both on-site and off-site fabrication. This ongoing digital project collaboration delivers a notable return on investment by reducing rework, avoiding delays, supporting “build-right-first-time” outcomes, and reducing requests for information costs by resolving issues digitally rather than through on-site rework.
What makes digital project collaboration so effective is that it is a people-centric method. The geospatial partner seeks to elevate the work of all the entities and vendors involved. Best-in-class partners that utilize this approach often do not require construction companies to use a certain type of software or technology, nor do they demand that their client remove any part of their supply chain or even the contractor that made the opening error in the first place. Rather, this people-centric construction process aims to establish digital oversight on complex projects for better program certainty.
Minimizing Physical Mistakes Amid Escalating Demand
The development of mission critical facilities shows no sign of slowing down. As organizations push for faster, more cost-efficient builds to meet escalating demand, the value of robust digital oversight will only continue to grow. Catching issues like incorrectly sized openings early, and then resolving those discrepancies digitally rather than physically, will empower teams to deliver higher-quality facilities at the speed today’s market requires.
To kick off 2026 right, every day this month Woolpert’s global leaders shared their perspectives on what to watch for across the architecture, engineering, and geospatial industry in the year ahead. The idea was to spotlight areas of opportunity and areas of concern, sparking conversation and enhancing our collective ability to build a better tomorrow.
“One of our pillars to success at Woolpert is industry leadership, and it’s the insights of our global team that drive it,” Woolpert President and CEO Neil Churman said. “We’re fortunate to have an amazing group of thought leaders who truly value education, collaboration, and innovation. Their insights keep our company at the forefront of industry and technology trends to help our clients see around corners, deliver faster and more efficiently, and support our ever-changing world.”
The report below provides a high-level view of their insights shared this month.
Healthcare design is integral to the success and effectiveness of any healthcare facility. Research shows that the design of healthcare environments can positively or negatively affect health outcomes and psychological well-being for patients and their loved ones. It also influences the efficiency, safety, and stress levels of clinical staff.
In addition to designing healing environments that incorporate natural lighting, positive distractions, noise reduction, and ease of navigation, architects must allocate space for current and future equipment needs. This balancing act will become increasingly challenging with the recent influx of artificial intelligence and robots into these environments.
As Woolpert’s global director of healthcare, I believe much has been said in the industry about the impact AI and robotics will have on healthcare. However, what is often overlooked is how these technologies will affect the built environments of hospitals, medical centers, and healthcare systems.
How New Technology Influences Healthcare Design
In October, I joined Woolpert to build on the exceptional work of Bermello Ajamil, a Woolpert Company, and to expand the firm’s healthcare design practice worldwide. Previously, I served as president and CEO of E4H Architecture, the nation’s largest architecture firm dedicated exclusively to healthcare. Throughout my career, I have witnessed periods of rapid technological advancement that transformed healthcare design, and I believe we are now on the cusp of a similar transformation.
I’ve seen electronic health records and cloud storage render filing cabinets and supply closets obsolete, and marveled as telehealth and virtual care bridged hundreds of miles to diagnose patients with a phone call. Momentous changes such as these have paved the way for what healthcare systems face today as AI and robotics permeate processes, workflows, and physical environments.
AI has evolved into real-life care delivery, supporting diagnostic imaging, clinical decision-making, and even drug research and development. AI-enabled robots are also rapidly gaining traction in the healthcare industry. As with past technological shifts, today’s healthcare designers must create spaces that seamlessly incorporate and accommodate these innovations while remaining flexible enough to handle the unpredictable future.
The Architectural Response to the Clinical AI Revolution
New healthcare facilities have the advantage of being designed from the outset with spaces dedicated to AI command centers, high-tech robots, and robust data infrastructure. Existing facilities, however, don’t have that luxury.
Healthcare designers must determine how much space robots will require: Where will they be stored? How and where will they charge? Where will maintenance occur? Will specific rooms be needed for these functions? In most cases, facilities will need to reallocate space — either by repurposing general-purpose areas or constructing entirely new rooms.
Next‑generation robots are intelligent, AI‑powered, and increasingly vital to health systems’ delivery of care. As they begin autonomously navigating hospital corridors, they will likely require dedicated pathways. Hallways may need delineated lanes and specialized public‑awareness signage to prevent congestion and accidents. Designers might even incorporate creative solutions such as interior drones and ceiling‑ or floor‑mounted navigation systems.
Operating rooms also need to accommodate advanced robotics and multimodal imaging technologies. Robotic surgical systems and other high-tech equipment are expected to make these specialty rooms even larger.
Building on the success of telehealth, one approach hospitals and medical facilities are using to increase accessibility is creating dedicated spaces for both healthcare providers and remote patients. These spaces are specifically designed with widescreen visibility, high‑definition cameras, and sound‑isolating features for privacy. This strategy benefits patients by allowing them to remain comfortably at home rather than making costly, time‑consuming trips to healthcare facilities.
Energy and Data Considerations: The Additional Demands of AI and Robots
Healthcare designers also must consider energy requirements when preparing medical facilities for the widespread adoption of AI and advanced robotics. These technologies require significant energy to function and generate massive amounts of data, which in turn demands immense computational power.
Additionally, the future of healthcare includes smart‑room technology, giving patients easy control over multiple aspects of their environment, including lighting, temperature, sound, and streaming services. While these amenities enhance comfort and improve the patient experience, they will similarly introduce additional power and data requirements.
Effective strategies healthcare designers can leverage to address these challenges include integrating renewable energy sources, implementing advanced cooling and heat recovery systems, and building robust data infrastructure. Of course, all of these solutions will necessitate sufficient space allocations.
Although AI is driving increased energy demands, it can also help optimize energy management. AI- and machine learning-driven building management systems, for instance, can reduce hospital energy consumption by optimizing HVAC systems, lighting, and other utilities based on real-time data, occupancy, and predictive analytics.
The Importance of Flexibility in Healthcare Design
As with most technological leaps forward, healthcare designers will face unknowns as they work to create optimal spaces to heal, work, and thrive. The spatial and energy demands of AI-enabled robots in healthcare are no exception. We can position current and future healthcare facilities for success by designing adaptive, scalable spaces. The more modular and flexible these designs are, the more seamlessly they can incorporate technological upgrades.
Since the birth of our nation, population and manufacturing centers have thrived on water access through our coastal ports and inland waterways. They provided critical access from the coasts to the heartland long before there were roads and rails. As the U.S. has grown, so has our reliance on our inland marine transportation system (IMTS) that today links agriculture, energy, manufacturing, and national security across the country and around the world.
The IMTS has evolved into a large and complex network that supports the national and international transshipment of goods. It contributes almost $500 billion to the U.S. gross domestic product, while saving approximately $8 billion compared to shipping by road or rail. This network directly supports 38 of 50 states through 12,000 miles of navigable waterways and 192 navigation locks that serve hundreds of intermodal ports, terminals, shippers, and transportation companies. It is crucial to keeping the country competitive in agriculture and energy exports and to enabling our manufacturing processes.
Yet, despite its critical role supporting our nation’s economy and security, the IMTS has been neglected and subsequently compromised. Floods, droughts, sedimentation, environmental conditions, infrastructure health, and economic factors have impacted the once robust and now fragile network. Because these challenges occur over time, they are easily overlooked; but the impact and ramifications of chronic underinvestment are huge and growing. According to a 2017 study from the National Waterways Foundation and the U.S. Maritime Administration, delays due to inland navigation lock failures alone cost shippers over $1 billion annually.
To constructively address these issues and unleash the immense potential of the IMTS, it will take adequate funding and a comprehensive and innovative approach. We must consider not only the connectivity of the system, but its power to protect and increase economic growth, bolster defense capabilities, and improve both national and international trade. Successful, smaller-scale examples of this approach already exist, and they include the Saint Lawrence Seaway and the Rhine and Danube River systems. More complex examples, such as the global marine shipping and air traffic control systems, also hold great insights for IMTS improvements.
One could argue that the time has come to develop a new American waterway system that will capitalize on the immense potential this nationwide network provides. This should be a multifaceted approach, with objectives plainly stated, that embraces and leverages the distinct advantages of this inherently complicated system. Chief areas of focus for IMTS modernization would be:
- Implement a systems approach: Integrate the IMTS and all its elements into an interconnected network linking the U.S. interior to national and international trade.
- Improve data management: Simplify data capture, accuracy, and accessibility, and reduce the cost (time and money) of information management.
- Improve infrastructure management: Establish a long-term, large-scale investment plan to ensure reliability and optimize design performance across the entire expected benefit lifecycle (structural health monitoring, digital twins, sustainment and predictive maintenance, controls modernization, sediment management and dredging, etc.).
- Innovate shipping: Increase waterway utilization and types of use (intermodal, container-on-barge, etc.).
- Develop a focused IMTS freight model aligned with U.S. Department of Transportation and the Committee on Marine Transportation System’s existing national freight strategies: This can inform IMTS user groups and help guide national priorities and policy, including how the IMTS increases connections with other transportation modes.
This might seem daunting, but the clock is ticking. This proven approach will help leverage our massive investment, advance economic and security opportunities, and enable us to grow and thrive as a nation. The EU has been successful with this route, shipping nearly twice the United States’ annual cargo on a waterway network roughly half the size of ours. Among other precedents, in 1954 the U.S. established the Ohio River Navigation and Modernization program, which ran through 2023 and replaced 52 navigation locks with 19 new, modernized locks.
The benefits a modern IMTS would deliver are significant, including better utilization, lower transport costs, improved transit times, better data management and accessibility, and innovative shipping and infrastructure management. Our quality of life is directly affected by the health of the inland waterway, from supplying essential daily products to the big-picture impact of our country’s economic performance.
The untapped potential of this network is massive, and the U.S. has the technology and expertise to execute. What we need is a plan and the leadership to champion it.
Construction projects for semiconductor fabrication facilities are highly complex and capital-intensive, with a typical fab now costing $10 billion and requiring 6,000 workers over three years to complete. Like other mission critical facilities — such as data centers and pharmaceutical manufacturing plants — semiconductor fabs are in extremely high demand worldwide due to the ever-growing need for the chips that power the modern technology people use every day.
Each site requires spatial precision in design and construction to ensure millimeter-level accuracy. If measurements are even a hair off, critical components — such as tools and automated material handling systems — may need to be repositioned, adversely affecting schedules, time-to-production, and already immense budgets. The pressure for “right first time” accuracy is further intensified by the shortage of skilled labor in key trades and the industry’s shift toward off-site fabrication.
As a result, fabs must now be delivered with fewer resources while maintaining, and ideally increasing, speed and quality of execution to meet aggressive timelines. Geospatial management provides the precision, automation, and digital continuity required to address these constraints.
The Primary Challenges the Semiconductor Industry Currently Faces
One of the semiconductor industry’s biggest challenges today is the need for precise on-site measurement. Compared to traditional construction projects, semiconductor fabrication sites are exceptionally dense and intricate, housing thousands of highly specialized tools and systems that must operate in perfect harmony. Every component — from piping, electrical conduits, and gas and chemical lines to cleanroom equipment — must be installed with millimeter tolerances to maintain precise process control.
This level of precision and complexity means that if components arrive on-site and fail to fit as intended, the consequences can be costly and disruptive. Imagine a scenario where one contractor installs structural steel and another fabricates equipment to fit between those steel elements. If the steel columns are even a few millimeters out of position, the prefabricated components may not fit. These seemingly innocuous deviations can cascade into major delays and months of rework, derailing schedules, and creating cost overruns.
Another challenge is the mounting emphasis on off-site fabrication, where components are built at a different location and then delivered to the construction site for installation. For these components to fit perfectly upon arrival, precise coordination and measurement are essential — a task complicated by the involvement of multiple contractors. From electricians and pipefitters to plumbers and wire installers, each relies on their own technical drawings. When these drawings differ, which is often the case, problems are inevitable.
Additionally, the semiconductor industry faces persistent workforce shortages. Large projects can require thousands of workers and dozens of contractors. Although the CHIPS Act injected significant funding into semiconductor companies, their design requirements are so specialized that only a finite number of professionals are available to support these projects. In short, demand for new semiconductor facilities far exceeds the supply of qualified workers.
At the same time, timelines continue to become more aggressive. Chipmakers need fabs online quickly to meet global demand, stay ahead of technology cycles, and maintain a competitive advantage. In the past, a six- or eight-month delay might have been tolerated. Today, companies expect projects to be completed as quickly as possible, a challenge that is exacerbated by the industry’s inability to find skilled labor. These demands are increasingly difficult to meet without new digital strategies and tighter geospatial governance.
The Benefits of Geospatial Management for Semiconductor Fab Construction
To address these challenges, it is paramount that companies work with a geospatial solutions expert throughout the construction and design process. An ideal partner will integrate high-accuracy surveying, building information modeling (BIM) validation, and continuous model-to-field alignment — from site preparation through tool installation — reducing rework and redesigns, minimizing wasted effort, and keeping projects on schedule. This will ultimately accelerate the construction of semiconductor fabs.
There is a critical need for governance around measurement and positioning on semiconductor projects. In these congested environments, all the different contractors installing multi-million-dollar tools and components need a common spatial framework to ensure millimeter-level accuracy. Geospatial management provides that framework by establishing a control network — a grid of physical reference points on the ground that allow engineers to accurately position themselves on-site. These reference points create a critical link between the digital design world (BIM models) and the physical construction environment.
By bridging the gap between digital and physical, this geospatial framework delivers a unified view of the site, where all spatial data originates from a single source of truth. Verified spatial data is shared across engineering, procurement, and construction partners, original equipment manufacturers, and owner stakeholders. With all stakeholders accessing the same accurate data, decisions are based on actual site conditions, significantly reducing manual verification, rework cycles, and field-based problem solving.
Geospatial management also plays a critical role in construction validation. A best-in-class geospatial solutions partner continuously checks installation accuracy as the project progresses and identifies any deviations before they spiral into larger downstream issues. If the geospatial partner detects something out of tolerance, it alerts the contractors and project managers early so corrective action can be taken before subsequent work begins. This proactive approach enables cleaner installations, fewer clashes, and measurable improvements in project timelines. It also reduces reliance on large on-site labor teams to resolve clashes and positioning errors.
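As a simplified illustration of this kind of validation check (not any specific vendor’s workflow; the point IDs, coordinates, and tolerance below are invented), the sketch computes the 3D deviation between designed and surveyed component positions in a shared control-network coordinate frame and flags anything outside a millimeter-level tolerance.

```python
import math

# Hypothetical example: compare surveyed component positions against design
# coordinates in a shared control-network frame and flag out-of-tolerance items.
# Coordinates are in meters; tolerance is in millimeters. All values are invented.

def deviations(design_pts, surveyed_pts, tolerance_mm=3.0):
    """Return a list of (point_id, deviation_mm, within_tolerance)."""
    results = []
    for pid, design_xyz in design_pts.items():
        surveyed_xyz = surveyed_pts[pid]
        dev_mm = 1000.0 * math.dist(design_xyz, surveyed_xyz)  # meters → mm
        results.append((pid, round(dev_mm, 2), dev_mm <= tolerance_mm))
    return results

design = {"COL-A1": (10.000, 4.000, 0.000), "COL-A2": (10.000, 8.000, 0.000)}
survey = {"COL-A1": (10.001, 4.001, 0.000), "COL-A2": (10.004, 8.003, 0.001)}
for pid, dev_mm, ok in deviations(design, survey):
    print(pid, dev_mm, "OK" if ok else "OUT OF TOLERANCE")
```

In a real deployment, the surveyed coordinates would come from total stations or laser scans referenced to the control network, and an out-of-tolerance result would trigger the early alerts to contractors and project managers described above.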
The Role of Digital Twins in Semiconductor Fab Construction
The structured as-built data produced through geospatial management forms the foundation for high-fidelity digital twins — a fully digital representation of a semiconductor facility. While digital twins vary in complexity, at their most advanced level they enable powerful simulations of plant performance. These simulations help users identify inefficiencies and make data-driven adjustments before implementing changes in the physical environment.
For semiconductor fabrication facilities, digital twins support a variety of different applications, including energy optimization, predictive maintenance, and future retooling. The result: reduced labor demand, lower rework ratios, compressed installation timelines, improved ramp-to-yield performance, and accelerated time-to-value — critical advantages in an increasingly competitive semiconductor manufacturing landscape.
Of course, the effectiveness of a digital twin model depends entirely on the quality of the data used to build it. As with a large language model, inaccurate input data produces unreliable output. The issue for the semiconductor industry is that the multiple contractors involved in these projects are each responsible for a specific scope, so each creates an individual model; these are then combined into a federated model representing the entire project. However, design changes and on-site adjustments occur throughout the project, and updating these complex digital twin models is time-consuming and costly. As a result, many models end up inaccurate.
Leading geospatial solution providers overcome these issues by delivering the building blocks for digital twins. Woolpert, for example, can digitize legacy sites that predate BIM and modern modeling standards to create accurate base models. The multidisciplinary firm can also compare contractor models against actual site conditions, identify discrepancies, and make necessary adjustments. This rigorous validation ensures that the final models truly reflect the as-built environment, enabling the most reliable simulations.
How to Choose the Right Geospatial Partner
Building a semiconductor fab facility is an extraordinarily involved undertaking with many different players and moving pieces. Experience is essential to navigate challenges effectively. Similarly, geospatial management for these types of critical facilities is incredibly complicated, requiring experience across disciplines — especially digital twin development.
For companies looking to accelerate and enhance fab construction and operational readiness, partnering with an experienced, multidisciplinary geospatial solutions provider is crucial. This partner must understand the processes and pain points unique to semiconductor fab construction and have a proven track record of success. With timelines being more aggressive than ever, working with the right partner can mean the difference between staying ahead of schedule or falling behind and incurring significant costs.
One of the first differences American tourists often notice between the United States and Europe is the age of the buildings. In the United Kingdom, for example, 38% of homes were built before 1946. In contrast, the average age of an owner-occupied home in the U.S. is about 40 years. These older buildings require specialized care, maintenance, and refurbishment, typically overseen by surveying firms with expertise in disciplines such as building pathology.
What Is Building Pathology?
Buildings, much like humans, deteriorate over time. The rate and nature of this deterioration depend on factors such as the quality of materials used and the standard of construction. To ensure a building remains usable and enjoyable for years to come, it’s essential to understand its materials, how it’s deteriorating, and which interventions are appropriate and when to apply them.
Surveying firms in Europe employ multiple disciplines to maintain older buildings. Building pathology, for instance, is the process of understanding why defects appear, why materials break down, and why a structure might not perform as expected. Building pathology can include anything from addressing damp ingress and roof leaks to façade deterioration and timber decay. Fundamentally, it enables professionals to investigate defects and determine when to intervene with the right solutions.
Leading building pathology firms excel at maintaining, enhancing, and extending existing buildings — skills that are essential in regions with older building stock. These firms typically possess extensive experience in core building surveying services, legacy defect investigations, project management, and dilapidations. In the UK, surveyors often manage party wall matters, while in Ireland, they may act as building control assessors for local authorities.
Why Is Building Pathology So Important in Europe?
In Europe, building pathology plays a vital role in the upkeep of legally protected historic structures — such as “protected structures” in Ireland and “listed buildings” in the UK — where owners are required to repair, maintain, and upgrade in ways that preserve the building’s original fabric and character. Building pathology helps preserve these protected buildings, allowing surveyors to understand how and when these buildings were constructed, as well as the materials used.
While building pathology is essential for older buildings, newer ones can benefit from it too. In Ireland, for example, the economic boom of the 1990s triggered a surge in construction. However, the emphasis on quantity during that time often compromised quality, resulting in a legacy of defects. Addressing these issues requires building surveyors to apply building pathology techniques to repair and upgrade those buildings.
Building pathology is also critical for sustainability. Consider that when an older building gets demolished, the structure releases a significant amount of embodied carbon. It is often said that the greenest building is the one that is already built, and through building pathology, professionals can effectively retrofit or refurbish an existing building to be more sustainable, avoiding unnecessary demolition. Energy retrofitting can even increase rental income, as shown by a study on office buildings in Ireland, and help organizations demonstrate progress toward meeting climate goals.
If demolition is unavoidable, building pathology and pre-demolition surveys can help maximize material reuse, which represents a growing trend in Ireland, especially. These surveys assess what materials — like bricks, steel, or timber — can be salvaged and reused in the refurbishment process.

Building Pathology vs. Facility Condition Assessment
Chartered building surveyors are recognized as experts in building pathology. While this profession is well-established in the UK and Ireland, it is less common in the U.S., where facility condition assessments are more typical. One common type of facility condition assessment involves collecting visual observations to evaluate the general condition of a building and its surrounding site.
These assessments often address code compliance, safety concerns, and potential upgrades, helping clients prioritize improvements. The evaluation typically includes a review of façades, roofs, interiors, mechanical, electrical, plumbing systems, and structural components. Insights are largely based on the assessor’s expertise and their understanding of the client’s goals during the observation process.
Building pathology, however, is much more diagnostic than facility condition assessment. It relates more to the study of defects (and failures) in buildings to understand the root cause and possible fixes. Unlike visual assessments, building pathology involves deeper analysis, often examining how environmental factors affect building elements, such as façades, foundations, and roofs.
Of course, there are nuances and overlaps between facility condition assessment and building pathology. And although each service is well established in its home market, the need spans continents: European firms can benefit greatly from adopting facility condition assessment techniques, and American firms from building pathology, as demonstrated by Woolpert expanding its services and capabilities through Murphy Geospatial, Bluesky, and Omega Surveying Services, which all recently rebranded to Woolpert or as a Woolpert company.
The Benefit of Marrying Building Pathology with Geospatial Insights
Just as the combination of facility condition assessment and building pathology offers significant value for clients with portfolios of aging assets, so too does the union of building pathology with geospatial insights. In Europe especially, the need to understand buildings is growing, driven by an increasing focus on retrofitting. Knowing what’s in the portfolio and understanding the existing building stock is becoming more critical every day.
Geospatial firms create digital representations of buildings, primarily focusing on capturing spatial layouts. A building pathology expert complements these efforts by assessing and documenting the condition of building elements and — in some cases — diagnosing essential repairs to help preserve that asset’s value. Together, they deliver a more holistic view of the building — combining geospatial insights with detailed material and structural analysis.
The combination of spatial data and material analysis provides clients with a deeper, more valuable understanding of their buildings. Moreover, by delivering actionable insights and extracting intelligence from the data — rather than leaving clients to decipher it themselves — firms empower clients to make more informed decisions about their building portfolios.
The Future of Building Intelligence
Whether a building is 100 days old or 100 years old, there is a growing desire among organizations, including designers, building managers, owners’ management companies, and investment funds, to digitize their portfolios. In addition to having access to digital models, they want to better understand the history of their assets. Eventually, they want to implement Internet of Things sensors to gain real-time operational data and foster a culture of predictive maintenance to avoid costly reactive repairs.
Achieving these goals requires more than just building pathology expertise; it demands a robust integration of architecture, engineering, and surveying services. When these disciplines are brought together under one roof at a multidisciplinary firm, companies can gain a deeper understanding of their asset portfolios across Europe and beyond.
For the first time ever, Woolpert exhibited at INTERGEO in Frankfurt, Germany, and was joined by colleagues from Murphy Geospatial, a Woolpert Company, Bluesky International Limited, a Woolpert Company, and Woolpert Asia-Pacific. As the world’s leading international trade fair and conference for the geospatial industry, INTERGEO is a respected venue for productive, high-level conversations with global geospatial clients and business partners.
INTERGEO Takeaways
The Woolpert team offered several key takeaways from the conference.
The Geospatial Industry is Excited About the Future of Lidar
Woolpert’s Amar Nayegandhi gave two presentations: “High-Definition Lidar Development” and “SLAM Navigation for Hydrographic Surveys in GNSS-Denied Environments.” He was thrilled by the audience’s response to Woolpert’s investment in developing emerging lidar technologies, such as the next-gen Zeus airborne sensing platform, and the fusion of simultaneous localization and mapping (SLAM) navigation for improving accuracy in vessel-based bathymetry. These innovations are poised to significantly influence how lidar is applied in the marketplace, particularly in terrestrial and maritime environments.
Connecting with the Expanding Team
The conference enabled Woolpert to connect not only with clients but also with its growing team.
Nayegandhi noted, “It has been highly productive to have Woolpert, Murphy Geospatial, and Bluesky represented at this event. I am very pleased with how well we have integrated as a team, enhancing our understanding of each organization’s expertise and identifying opportunities to advance value for our clients.”
Bluesky’s Rachel Tidmarsh, who was invited to participate in the World Geospatial Industry Council panel, “Data to Decisions,” echoed Nayegandhi’s sentiment. “I think the culture and the values of Woolpert sit very closely and align with Bluesky’s. It was great to be able to talk to everybody about the additional services we can offer.”
Woolpert’s Augmented Capabilities: The Value of a Multidisciplinary Firm
This year’s INTERGEO conference gave Woolpert the perfect platform to showcase its augmented capabilities and highlight the value of working with a multidisciplinary firm.
Adapting to an Evolving Industry
Niall Murphy, CEO of Murphy Geospatial, pointed out that the geospatial industry is rapidly evolving. Over the past decade, there has been a noticeable convergence among the architecture, engineering, and geospatial disciplines, driven by increasing interoperability, the growing need for multidisciplinary teams, and other factors.
“What we are really seeing is that the barriers between those professions — architecture, engineering, and geospatial — are continuing to be challenged and removed,” Niall Murphy explained.
A Truly End-To-End Solution
Woolpert continues to bolster its capabilities through strategic acquisitions, bringing firms with state-of-the-art technology and expertise to the table. Woolpert’s geospatial expertise — especially high-altitude lidar — combined with Bluesky’s airborne data collection and high-resolution imagery and Murphy Geospatial’s precise ground surveying capabilities enable Woolpert to provide truly end-to-end solutions for clients around the globe.
“Quite simply, it’s an end-to-end solution that clients are getting and benefiting from us as a group,” said Raymond Murphy, CSO of Murphy Geospatial.
Maximizing the Usefulness of Data
Data is another area where the combined strength of Woolpert, Murphy Geospatial, and Bluesky will be particularly valuable for clients.
Tidmarsh emphasized that Bluesky’s access to Woolpert’s AI teams and tech stacks supports more consistent large-volume data processing, generates clearer insights for customers, and enables faster decision-making. This data also contains a wealth of insights that can now be extracted through AI and machine learning — capabilities that, according to Tidmarsh, “…were previously prohibited due to the amount of resources required and the associated costs. But now we can actually extract this information and deliver it to our customers.”
Raymond Murphy stressed that data capture isn’t enough: Firms must provide meaningful insights to their clients.
“A lot of geospatial companies are very good at data capture. So, the evolution is getting into what we do with this data and creating better insights. When you look across our company, we have a data-centric approach to how we design, engineer, and deliver geospatial data,” Raymond Murphy made clear.
Nurturing Woolpert Europe
The ongoing transition of Bluesky and Murphy Geospatial to Woolpert branding strengthens Woolpert’s surveying footprint in Europe and is helping the global firm deliver even more value to its European clients.
Recruiting Top European Talent
Having a strong presence at conferences like INTERGEO is pivotal for Woolpert’s recruiting efforts. Germany is central to manufacturing and technology on the European continent, and Raymond Murphy noted that, in addition to clients and partners, students were curious about what Woolpert has to offer.
Delivering World-Class Services to European Clients
Murphy Geospatial recently expanded into Germany with a beachhead project in Dresden for ESMC, a large semiconductor client. The firm is also working on data center projects in Frankfurt and Berlin.
According to Niall Murphy, “We have been experts in the semiconductor space for a number of years working for clients like Intel. We are using this opportunity to grow out the rest of our solutions and offerings through our monitoring division, widen our geospatial capabilities, and work on large mainstream infrastructure projects.”
Blending Capabilities, Building Culture: Woolpert’s Unified Vision at INTERGEO
Nayegandhi summarized Woolpert’s success at INTERGEO: “We now offer our clients a broad range of capabilities, which we integrate to deliver tailored solutions that meet their evolving needs. Our success is founded on having the right people, and the strong cultural alignment among our family of firms collaborating together demonstrates our potential — particularly in Europe — to expand our operations beyond previous limits.”
Technology infrastructure—data centers, semiconductor fabrication factories, EV battery plants—depends not just on power and connectivity, but on water. As these industries have grown, water has quietly become one of the most underappreciated gating factors in scale, reliability, cost, and risk. Reconceiving water strategy isn’t optional—it is critical to support the expansion of these industries.
The Rising Tide: Electricity, Heat, and Water Demand
The Lawrence Berkeley National Laboratory (LBNL) published a study in 2024 reporting that U.S. data center electricity demand climbed from 58 terawatt-hours (TWh) in 2014 to 176 TWh in 2023. Projections suggest additional growth to between 325 and 580 TWh by 2028 (DOE, 2024; Shehabi et al., 2024). In 2023, data centers consumed roughly 4.4% of U.S. electricity production; by 2028, that share could rise to as much as 12% (Shehabi et al., 2024).
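As a quick sanity check on those figures, the implied totals can be worked out directly. This is a back-of-envelope sketch: the 2028 total-generation assumption (~4,800 TWh) is ours for illustration, not a figure from the study.

```python
# Rough arithmetic on the LBNL figures cited above (illustrative only).
dc_2023_twh = 176            # reported 2023 data center demand
share_2023 = 0.044           # reported 2023 share of U.S. electricity

# Implied total U.S. electricity production in 2023 (~4,000 TWh):
total_2023_twh = dc_2023_twh / share_2023

# High-end 2028 projection; assume total generation grows to ~4,800 TWh
# (an assumption for illustration, not a study figure):
dc_2028_high_twh = 580
share_2028 = dc_2028_high_twh / 4800

print(f"Implied 2023 U.S. total: {total_2023_twh:.0f} TWh")
print(f"Implied 2028 share (high case): {share_2028:.1%}")
```

The roughly 12% figure cited for 2028 is consistent with this kind of simple division, though the study’s own generation forecast may differ.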
This increase is significant for water consumption because nearly all of the electrical power consumed by computing workloads in a data center is ultimately converted to heat that must be removed. Increasing demand for high-performance computing, cloud services, and AI workloads is driving higher power densities within data centers, along with higher-capacity sites. In many cases, this is forcing designs toward water-intensive means of heat rejection.
Yet consuming water for cooling in data centers is hardly a new challenge. The U.S. Energy Information Administration has long discussed the tension between cooling systems and water supply, noting that data center water demands compete with other industrial and municipal needs (EIA, “Today in Energy”). This historical context underscores water’s role as not only a future risk, but also an inherent constraint.
Why Water Quality, Quantity, and Reliability Matter
Evaporative mechanical cooling systems, available in a wide range of configurations, offer superior thermal performance compared to air-only cooling. This enhanced efficiency—along with reductions in power consumption and operating costs—comes with trade-offs, including substantial and continuous water withdrawal, treatment, return, and management requirements.
Water quality is a critical consideration in evaporative cooling systems. Elevated levels of total dissolved solids (TDS), ions, suspended particles, microbial load, and high conductivity can lead to mineral scaling, corrosion, reduced heat exchange efficiency, and shortened equipment lifespan.
In semiconductor fabs and precision manufacturing, ultrapure water (UPW) is needed. Any deviation—trace ions, particulate contamination, etc.—can degrade yields. So, for chip fabs, water isn’t just for cooling; it is part of the process-control environment.
For battery and advanced manufacturing, consistent feed water quality is essential for chemical consistency, process reliability, and regulatory compliance. Fluctuations in water pressure or feed purity can trigger cascading failures.
Water Footprint: Direct + Indirect Consumption
LBNL also estimates that in 2023, U.S. data centers consumed about 17 billion gallons of water directly for cooling and around 211 billion gallons indirectly through power generation (CivilBeat, 2025; “Data Centers Consume Massive Amounts of Water”). These figures illustrate that risks associated with local and regional water availability translate directly to risks associated with power supply and the ability to remove heat from facility operations. Thus, managing water footprint means both optimizing internal water supply and understanding upstream water stresses tied to power generation.
Water Cooling Has a Legacy; This is Not Just a Trend
Because water cooling is not novel, many of the technical, regulatory, and operational challenges are well documented:
- Competing demands for freshwater (municipal, industrial, agricultural) have long constrained large thermal systems.
- Utilities historically have not sized systems expecting hyperscale data center water demands—so capacity, permitting, and capital infrastructure upgrades often lag.
- Regulatory constraints: discharge permits, temperature limits, evaporation accounting, zero liquid discharge (ZLD) mandates, groundwater drawdown regulations, and water rights can all be restrictive.
- Retrofitting or scaling water systems after site selection is costly, delayed, and risky.
- Climate variability, drought cycles, and regulatory tightening (especially in water-scarce regions) amplify uncertainty.
This legacy reality means that firms cannot treat water as a variable to be “solved later.” It must be integral from site selection through lifetime operations.
Integrating Water Strategy across Hyperscale Life Cycles
An effective water strategy must be multi-phase, integrated, and adaptable. Here’s how Woolpert’s architecture, engineering, and geospatial (AEG) approach delivers:
- Site Selection and Early Concept Planning
- Geospatial and hydraulic modeling.
- Assessment of watershed behavior, recharge paths, and surface water access.
- Analysis of water rights mapping, aquifer stress, and long-term sustainability.
- Evaluation of local utility intake/distribution capacity and review of future municipal plans.
- Climate resilience screening (drought risk, variability, etc.) and regulatory risk profiling.
- Evaluation and prioritization of sites with multi-benefit infrastructure, such as existing co-located industrial or municipal reuse infrastructure, or sites where water can be sourced from treated effluent or a shared corridor pipeline.
- Architectural and System Design
- Early alignment of cooling system design alternatives (air, evaporative, direct liquid, hybrid) with water risk profile.
- Design of modular water systems (e.g., capacity for phase-wise expansion) to match growth demands and timing.
- Incorporate closed-loop recycling, condensate capture, blow-down recovery, heat reuse, and smart controls to minimize net withdrawal.
- For chip fabs and sensitive processes, embed advanced treatment (RO, deionization, polishing) and redundancy into water design scaffolding.
- Microclimate- and humidity-aware architectural detailing to enhance envelope performance.
- Explore the viability of fallback cooling (e.g., partial air cooling) in water-stressed regions.
- Construction, Integration, and Commissioning
- Through Construction Engineering and Inspection (CE&I), ensure water infrastructure is properly installed, leak-tested, hydraulically balanced, and integrated with mechanical, electrical, and site systems.
- Test under multiple load points, seasonal extremes, and degraded supply conditions to validate resilience.
- Operations and Performance Monitoring
- Deploy instrumentation to monitor flows, pressures, conductivity, turbidity, differential pressure, temperature, and water chemistry in real time.
- Use analytics to detect drift, leaks, inefficiency, scaling onset, or quality issues.
- Model future drought or supply stress and test system operation under “stress mode.”
- Engage with utilities to forecast shared demand, leverage demand response or expansion, and participate in shared infrastructure planning.
- Plan incremental system upgrades tied to planned growth rather than overbuilding early.
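The "detect drift, leaks, inefficiency" step above can be illustrated with a minimal rolling-baseline check. This is a sketch under simplifying assumptions (a single sensor stream and a z-score threshold), not a production monitoring system; the function name and threshold are ours.

```python
from statistics import mean, stdev

def detect_drift(readings, window=20, z_threshold=3.0):
    """Flag readings that deviate sharply from a trailing baseline.

    `readings` is a time-ordered list of sensor values (e.g., makeup-water
    conductivity in uS/cm). Returns indices of anomalous points.
    """
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# Stable conductivity with a sudden excursion (possible scaling onset or leak):
series = [500.0 + 0.1 * i for i in range(30)] + [620.0]
print(detect_drift(series))  # flags only the final point, index 30
```

Real deployments would track multiple parameters (flow, differential pressure, turbidity, chemistry) and combine statistical checks with physics-based models, but the pattern is the same: establish a baseline, then alert on deviation.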
Water as Differentiator, Not as Constraint
Technology infrastructure developers who approach water as an afterthought will face implementation delays, premiums, regulatory and permit pushback, and performance risk. Those who embed water strategy early gain advantages in cost, speed, resilience, and stakeholder trust.
Woolpert’s integrated AEG model reframes water as a core design and operational asset, not as a siloed plumbing afterthought. From watershed modeling to mission-critical mechanical systems to construction assurance, Woolpert empowers technology clients to transform hidden water risks into strategic advantages.
References:
CivilBeat. (2025, August 25). Data centers consume massive amounts of water (U.S. data center water consumption). CivilBeat.
DOE. (2024, December 20). DOE Releases New Report Evaluating Increase in Electricity Demand from Data Centers. U.S. Department of Energy.
U.S. Energy Information Administration. (n.d.). Today in Energy: Why water cooling and data center loads compete with water supply.
A new FAA rule, Part 108, is about to reshape the way drones can operate in America. The FAA’s Notice of Proposed Rulemaking (NPRM) is the biggest regulatory step for the Unmanned Aircraft Systems (UAS) industry in over a decade. The NPRM introduces proposed rules for two new regulations:
- Part 108, which establishes rules for operating beyond visual line of sight (BVLOS).
- Part 146, a new pathway for the certification and oversight of entities that provide automated data services for safety functions that support BVLOS, like strategic airspace deconfliction.
This post will focus on Part 108 and its implications for the aviation community and UAS teams.
How Part 108 Differs from Part 107
Unlike Part 107, which limits UAS flights to visual line of sight (VLOS) unless a waiver is granted, Part 108 is designed to enable routine BVLOS operations. The rule introduces new roles — Operations Supervisors and Flight Coordinators — replacing the traditional Remote Pilot designation, reflecting the FAA’s recognition of increasing automation in UAS operations, particularly in industries like package delivery. Part 108 shifts the operational model from “human in the loop” to “human on the loop,” allowing drones to operate largely autonomously while humans oversee and intervene remotely only when necessary. Wing’s Remote Operations Center, pictured below, is an example of what BVLOS operations could look like in a package delivery setting.

Source: Wing (2022)
The role changes also shift the liability from personal liability on behalf of the pilot to company liability. Instead of individual pilots being certificated by the FAA as in Part 107, companies will manage the training of their staff, ensuring they learn the criteria outlined in Part 108.
The proposed rules also allow for significantly heavier drones, up to 1,320 pounds, compared with the 55-pound limit under Part 107, expanding payload capacity and mission flexibility. For permitted operations in shielded areas — such as near infrastructure where traditional manned aircraft do not typically operate — drones would have priority right of way. Perhaps most impactful, Part 108 allows operators to manage fleets of highly automated drones flying BVLOS across larger areas and in synchronization with one another, rather than relying on a single pilot manually controlling one drone at a time. These changes represent a fundamental shift in how drones can be deployed, moving from small-scale, manual operations to large-scale, automated missions that can be carried out without constant human intervention.
Historically, the drone industry has focused on developing aircraft under 55 pounds to comply with the limitations of Part 107. This weight cap shaped the market, favoring lightweight platforms with constrained payload capacity and endurance. With Part 108 raising the allowable weight to 1,320 pounds, there will likely be a significant expansion of available drone platforms. This change will open the door to larger, more capable sensors and payloads, greater mission duration, and integrated systems that were previously impractical under Part 107.
It is important to note that Part 108 does not replace Part 107, so operations can continue under the existing Part 107 framework.
Part 108 Framework Explained
Part 108 introduces two distinct operational pathways:
- Permitted Operations
- Certificated Operations
Permitted operations are designed for lower-risk missions and offer a streamlined approval process. This pathway includes activities like training, flight demonstrations, package delivery, agriculture, aerial surveying, civic functions, and recreational use — typically in areas with low population density to minimize ground risk. Each permit is tied to a specific category, and operators are expected to stay within those bounds. Operational limits under the permit framework are designed to reflect the reduced risk profile. For example, package delivery missions may involve up to 100 drones, while categories like agriculture, surveying, and civic interest are capped at 25 active aircraft. Recreational use is limited to a single drone. Testing activities are exempt from aircraft caps but must be conducted in sparsely populated areas.
For organizations aiming to scale up or operate in more densely populated environments, the certificate pathway offers broader capabilities but with increased regulatory scrutiny. Certificates are available for package delivery, agriculture, aerial surveying, and civic interest missions. These operations are considered higher-risk and therefore require more robust safety protocols, including formal training programs, Safety Management Systems, hazardous materials handling, and detailed equipment procedures.
Aerial survey missions are supported under both pathways, but with different constraints. Under the permitted category, operators are limited to 25 active aircraft, a maximum takeoff weight of 110 pounds, and flights restricted to areas with Category 3 or lower population densities. Certificated operations allow for the full 1,320-pound limit and flights over Category 4 or lower population densities, while maintaining the same 25-aircraft cap. Additional personnel requirements also apply, including TSA background checks, crew duty time limits, and mandatory rest periods. Whether a company opts for a permit or a certificate will depend on the nature of their operations and the environments in which they intend to fly.
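The aerial-survey constraints described above can be captured in a small lookup table for planning purposes. The limits reflect the NPRM as summarized here, but the data structure and function are illustrative, and the final rule may change these numbers.

```python
# Aerial-survey constraints under the proposed Part 108 pathways,
# as summarized above (illustrative encoding of a proposed rule).
SURVEY_RULES = {
    "permitted": {
        "max_active_aircraft": 25,
        "max_takeoff_weight_lb": 110,
        "max_population_category": 3,
    },
    "certificated": {
        "max_active_aircraft": 25,
        "max_takeoff_weight_lb": 1320,
        "max_population_category": 4,
    },
}

def survey_mission_allowed(pathway, aircraft, weight_lb, pop_category):
    """Check a planned aerial-survey mission against the pathway limits."""
    r = SURVEY_RULES[pathway]
    return (aircraft <= r["max_active_aircraft"]
            and weight_lb <= r["max_takeoff_weight_lb"]
            and pop_category <= r["max_population_category"])

# A 300 lb survey drone over a Category 4 area needs the certificate pathway:
print(survey_mission_allowed("permitted", 5, 300, 4))     # False
print(survey_mission_allowed("certificated", 5, 300, 4))  # True
```

A check like this makes the permit-versus-certificate decision concrete: the mission profile, not preference, dictates the pathway.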
What is Enabling BVLOS?
As mentioned previously, Part 108 leans into the general trend of autonomy moving from “human in the loop” to “human on the loop”. This next evolution of autonomy will require operators to manage multiple drones flying across larger areas. Consequently, aircraft will need a Simplified User Interface (SUI). An SUI helps reduce the flight coordinator’s cognitive load by requiring minimal input for the aircraft to operate.
Additionally, all drones operating under Part 108 in Class B or C airspace will need onboard Detect-And-Avoid (DAA) systems to autonomously deconflict from other aircraft, making mitigation decisions without sole reliance on the human operator. For Part 108 operations outside of Class B and C airspace, operators may employ ground-based radar as a means of DAA if DAA is not available onboard the aircraft. AFRL’s SkyVision bus is an example of ground-based DAA utilized at the National Advanced Air Mobility Center of Excellence.

SUI and DAA systems are central to enabling BVLOS operations. However, additional safety layers are required — though these are not as prescriptive as previous FAA rules for aircraft airworthiness certification. To accommodate this shift, the FAA is adopting performance-based requirements, similar to those used in the Electric Vertical Take-Off and Landing (eVTOL) space. Under Part 108, UAS Original Equipment Manufacturers (OEMs) must obtain airworthiness acceptance, which is distinct from a traditional airworthiness certificate. To receive this acceptance, manufacturers must submit a Declaration of Compliance to the FAA, confirming that all Part 108 requirements have been met. They must also agree to provide any test, inspection, or flight data upon request for compliance audits.

Many of these requirements stem from industry standards. For example, the aircraft separation component of DAA is derived from standards set by ASTM Committee F38 on Unmanned Aircraft Systems and by the Radio Technical Commission for Aeronautics (RTCA) for airborne collision avoidance systems. Rather than creating new static rules, the FAA allows these standards to evolve with technology by referencing and updating the cited industry standards. Note that this process applies only to uncrewed aircraft without passengers. Uncrewed aircraft intended to carry passengers, such as Wisk’s eVTOL, must still obtain airworthiness certification under Part 21.

Source: AFRL (2019)
As discussed, DAA provides separation from an aircraft perspective, but the other major safety layer is the use of Automated Data Service Providers (ADSPs), which play a critical role in the digital infrastructure that enables safe BVLOS operations via UAS Traffic Management (UTM), governed under the new Part 146. These tech companies provide back-end data to BVLOS operators in the form of strategic deconfliction, conformance monitoring, airspace data delivery, and conflict alerts — all of which must be interoperable with other ADSPs.
Requirements for ADSPs in Part 146 were also derived from industry standards, such as ASTM Standard for UTM UAS Service Supplier Interoperability, which was later validated in partnership with the FAA through live testing at the UTM Key Site in North Texas beginning in 2023.
Strategic deconfliction is a preflight function that checks a flight plan’s intent against other flight plans in the system to detect conflicts, adjusting the flight plan’s altitude, routing, or departure times until a conflict-free route is found. Conformance monitoring refers to how the ADSP tracks a UA’s adherence to its planned flight route, notifying other airspace users if the UA deviates from its operational intent, so that the flight coordinator can take action to mitigate potential collision risks. DAA, SUI, and ADSP, when paired with additional elements specific to some OEMs or operators — such as specialized C2 links, ground-based radars, and weather monitoring services — combine to create multilayered safety cases for scaled BVLOS operations under Part 108.
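The preflight strategic-deconfliction loop described above (check a plan against existing plans, adjust, and re-check until conflict-free) can be sketched in a few lines. This toy model treats a shared waypoint with close altitude and departure time as a conflict, a drastic simplification of real 4D trajectory deconfliction; all names and thresholds are ours.

```python
def conflicts(plan_a, plan_b, min_sep_ft=100, min_gap_min=5):
    """Two plans conflict if their routes share a waypoint while their
    altitudes and departure times are both too close (simplified model)."""
    shared = set(plan_a["route"]) & set(plan_b["route"])
    close_alt = abs(plan_a["alt_ft"] - plan_b["alt_ft"]) < min_sep_ft
    close_time = abs(plan_a["depart_min"] - plan_b["depart_min"]) < min_gap_min
    return bool(shared) and close_alt and close_time

def deconflict(plan, existing, alt_step_ft=100, max_tries=10):
    """Raise altitude until no conflicts remain (one of several levers;
    a real ADSP could also shift routing or departure time)."""
    for _ in range(max_tries):
        if not any(conflicts(plan, other) for other in existing):
            return plan
        plan = {**plan, "alt_ft": plan["alt_ft"] + alt_step_ft}
    raise RuntimeError("no conflict-free plan found")

existing = [{"route": ["A", "B"], "alt_ft": 200, "depart_min": 0}]
new_plan = {"route": ["B", "C"], "alt_ft": 200, "depart_min": 2}
print(deconflict(new_plan, existing)["alt_ft"])  # 300
```

The essential property this sketch preserves is interoperability: every operator's intent must live in a shared system so that a new plan can be tested against all others before takeoff.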
Big Picture: What will Part 108 Mean for the Aviation Industry?
This evolution will mean greater flexibility in how aerial data is collected and delivered. Heavier drones can carry more advanced sensors, closing the gap with manned-aircraft sensor capabilities while enabling richer datasets and more efficient workflows. While the energy density of today’s batteries remains a limiting factor (still tied to OEM innovation), advancements in energy systems such as gas or hydrogen fuel cells will continue to push the boundaries of what’s possible. In parallel, hybrid drone solutions will gain greater prominence in the industry, combining electric propulsion with traditional hydrocarbon fuels (or hydrogen) to extend flight times and payload capacity. As technology progresses, so will our ability to support larger, more complex projects with fewer operational constraints.
Part 108 presents an opportunity to rethink how projects are surveyed, mapped, and delivered. UAS teams that are already ISO 9001:2015 compliant are well positioned for a smooth transition from Part 107 to Part 108. These teams will have an additional leg-up if they have already adopted formal training programs, risk assessments, and operational procedures. As Part 108 moves toward finalization, heralding a new age in UAS efficiency, it is crucial to continue monitoring this rulemaking and prepare accordingly for any changes.

Proactive coastal resilience is at the forefront of national conversations as extreme weather events, like hurricanes and tropical storms, have increased in frequency and intensity during the past 30 years along the Atlantic, Caribbean, and Gulf coasts.
The National Ocean Service, a branch of the National Oceanic and Atmospheric Administration, defines coastal resilience as proactively building a community’s capacity to bounce back from hazardous events such as hurricanes, coastal storms, and flooding.
Coastal communities can employ a variety of methods to achieve coastal resilience. One of the most useful—and underutilized—practices is collecting coastal elevation data via aerial lidar before and after hurricanes to monitor beach erosion.
St. Johns County Enhances Coastal Resilience with Lidar
An archetypal case study highlighting the importance of proactive coastal resilience comes from St. Johns County, Florida. In 2019, county officials entered a contractual agreement with Woolpert to collect and process coastal elevation data every summer before hurricane season to help establish a crucial baseline for that year.
If a tropical storm or hurricane ever hit, Woolpert would deploy teams to collect new lidar data. The county could then compare pre- and post-storm data to determine exactly how much the coastline had eroded.
This foresight proved invaluable. In 2022, Hurricane Ian, a powerful Category 4 hurricane, and Hurricane Nicole, a sprawling late-season Category 1 hurricane, battered St. Johns County’s coastline and vital tourist beaches. Woolpert strategically flew lidar on Sept. 23, before Ian’s arrival, again on Oct. 11, and once more on Nov. 18 to assess Nicole’s impact.
While these missions were logistically challenging, Woolpert’s lidar strategy produced a highly accurate surface model with about 5 centimeters of vertical precision. This supported the creation of 1-foot resolution elevation models and grayscale intensity images that detailed the extent of the coastline damage. On average, Hurricane Ian stripped away approximately 50 feet of beach, and Hurricane Nicole around 60 feet.
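Conceptually, the erosion measurement is a DEM of difference: subtract the post-storm surface from the pre-storm surface and sum the losses. Here is a minimal sketch on an invented toy grid (not St. Johns County data):

```python
# DEM-of-difference: subtract the post-storm surface from the pre-storm
# surface; positive values indicate elevation (sand) lost.
pre  = [[3.0, 2.5, 2.0],
        [3.1, 2.6, 2.1],
        [3.2, 2.7, 2.2]]   # pre-storm elevations, meters
post = [[3.0, 2.1, 1.4],
        [3.1, 2.2, 1.5],
        [3.2, 2.3, 1.6]]   # post-storm elevations, meters
cell_area_m2 = 1.0          # assumed 1 m grid spacing

loss = 0.0
for pre_row, post_row in zip(pre, post):
    for z_pre, z_post in zip(pre_row, post_row):
        diff = z_pre - z_post
        if diff > 0:                 # count erosion only, not accretion
            loss += diff * cell_area_m2

print(f"Eroded volume: {loss:.1f} m^3")
```

Production workflows do the same thing at millions of lidar-derived cells, which is why having a recent pre-storm baseline is the whole game: without it, there is nothing to difference against.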
What are the Benefits of Collecting Pre- and Post-Storm Coastal Elevation Data?
In the case of St. Johns County, the primary benefit of having highly detailed and visual pre- and post-storm elevation datasets was that they enabled officials to apply for grants through the Federal Emergency Management Agency—specifically, FEMA Category B berm. Getting FEMA reimbursement is not easy, and it is particularly challenging to demonstrate that a beach or dune is eligible for this type of funding.
Counties must have readily available datasets to streamline FEMA’s analysis, especially when timelines are tight. Another benefit of accurate pre- and post-storm elevation data is that beaches classified as engineered (like St. Johns County’s) can be eligible for additional funding. Elevation data also helps officials identify areas most vulnerable to inundation and destruction and effectively narrow the scope of coastal resilience projects.
Additionally, because St. Johns County was able to quickly secure funding, it could promptly begin beach renourishment. Rebuilding damaged areas along the coastline was essential, as many of the affected beaches were commercially significant tourist destinations. Ultimately, prioritizing pre- and post-hurricane elevation data helped safeguard the county’s economy.
The Key to Coastal Resilience is Proactivity
Coastal resilience should not be reactive, but proactive. Like St. Johns County, other coastal counties should take the initiative to begin long-term monitoring programs, collecting pre-storm elevation data as a form of insurance in case a hurricane or tropical cyclone affects them later in the year.
Currently, St. Johns County is an anomaly, as there are not many counties across the Atlantic, Gulf, and even West coasts routinely collecting pre-storm data. Granted, some counties do use lidar data; however, these datasets are often too outdated to serve as a sufficient baseline to qualify for FEMA funding.
Proactive coastal resilience efforts are also beneficial from a return on investment perspective. If a hurricane strikes a county with outdated elevation data and causes significant damage, it may be ineligible for FEMA funding, forcing local or state governments to cover the costs. Alternatively, it is much more cost-effective to have a partner firm collect data on a regular basis before every hurricane season.
Preparing for the Next Storm Starts Today
As storm frequency and intensity increase, more coastal communities in Florida and throughout the nation should proactively prioritize coastal resilience. While it is impossible to completely negate the effects of natural disasters, communities can certainly put themselves in the best position to respond by partnering with qualified firms — increasing the likelihood of securing federal funding and streamlining recovery efforts.