1. Introduction
We are now in the Anthropocene - an unprecedented time of global environmental change caused by urbanization [1], desertification [2], biodiversity loss [3], and an increase in the frequency and severity of extreme events [4]. The drivers of this change in our interconnected social-ecological world include global warming [4] and rapid population growth [5], and are likely to threaten people’s lives, livelihoods, and assets in the years to come [2]. This increasing risk motivates the need to think about mitigating impacts.
Broadly speaking, this is what is referred to as resilience - the ability to prevent, withstand, respond to, and recover from environmental perturbations and shocks [6]. Some leading questions in disturbance and resilience ecology include: 1) How and why does life organize across scales? [7] This encompasses questions of disease and infestation, evolution, food security, type conversions, migrations, extinctions, etc.; 2) How will disturbance impact how life organizes across scales? [8] This encompasses questions of mass extinctions, biodiversity loss, habitat fragmentation, settlements, land cover and land use change, etc.; and 3) How do we mitigate climate risk to ecosystems and communities as a function of vulnerability, exposure, and hazard? [9] This includes questions about storms, fire, water commodities, food security, air quality, sea level rise, etc. Researchers are now well-positioned to examine these questions and inform a more resilient future as we navigate the information age with cutting-edge analytics and big data on Earth observations from diverse data sources, from remote sensing [10] to social media [11].
1.1. Environmental Resilience Technology: Definitions
It is increasingly evident that new technologies informed by climate and environmental research, as well as new applications of existing technologies, will be both useful and necessary to support environmental change mitigation and adaptation toward more resilient communities. These “environmental resilience technologies” may include remote sensing, machine learning, artificial intelligence, geospatial analytics, sensors, and social media data that can be synthesized and used to support decisions in the natural, built and social environment; it may also include technologies that can directly increase resilience, such as wildfire mitigation methods, improved built infrastructure, better water management, direct air carbon capture, and renewable energy generation. One embodiment of environmental resilience technologies, relevant to the present work, is the synthesis and interpretation of data for solutions-oriented science to create value-added analytics that enable society to become more resilient to changing environmental futures. Within this definition of environmental resilience technology, we further define value-added analytics and solutions-oriented science as follows.
Figure 1.
Environmental Resilience Technology addresses societal challenges through solution development that involves cross-sector collaborations for scaling and adoption. Specific to this manuscript, we develop a model for solutions-oriented science that provides value-added analytics.
At present, there are dozens of solutions for different applications, and yet one of the biggest limitations is sustaining those applications through time [12], as much of the research funding focuses on developing solutions. As such, innovation requires not merely creating a piece of technology, but developing the business model for its sustained use. Because value inherently has cost [13], in the context of solutions using analytics, value-added analytics are those worth paying for and thus have the ability to go from research to commercialization.
A growing methodology used for commercializing a product or service is human-centered design (HCD) [14,15]. HCD, also known as “design thinking” or “systems thinking”, is a bottom-up approach to problem solving that starts by understanding people, their actions, decisions, and feelings. It is rooted in anthropology and engineering [16,17,18]. It is best suited for complex, non-linear problems where gaining an understanding of user needs leads to better and more effective solutions. HCD differs from traditional business and management approaches in that it is not a hypothesis-first approach; rather, it starts from the study of people in a problem or opportunity space and generates solutions from what is learned. HCD’s approach to innovation of commercial products and services centers human desirability, technical feasibility, and business viability. Of these three factors, it always begins with human desirability, and it includes the human perspective and emotion in all steps of the problem-solving space. Some limitations of traditional HCD include incremental design changes based on the human-centric perspective, typically within a single application rather than a broader examination of related systems and the decision space [19]. This often occurs when designers are too closely anchored in what people are directly asking for, rather than seeking to understand and solve for the root causes behind those desires and the novel future designs [20] that could systematically address these core issues; this is especially true in data science applications [21]. As Henry Ford is famously quoted as saying, “If I had asked people what they wanted, they would have said faster horses” [22].
The WKID Innovation framework, however, focuses on transparent and systematic traceability across the decision space to identify information technology requirements and inform data science in support of closing information gaps. WKID Innovation was designed to tackle wicked problems, known in complex systems science as problems that have no single cause or best solution [23]. In modern society, multiple organizations often tackle wicked problems, presenting differences in values across entities that complicate decision making. As defined here, decisions are made using two elements [24]: 1) data, which can be quantitative or qualitative, and 2) values, represented as the subjective weighting of the costs and benefits of the outcome of that decision. While data and synthesized information may provide guidance on potential outcomes, the solution often depends on different users’ values, which often results in disagreement about the cause of and solution to the problem. Wicked problems fundamentally need systematic change to their drivers, which has the potential to address many causes and solutions at once. WKID Innovation defines “wisdom” in the Knowledge Hierarchy [25] as informed action and sets a foundation for defining solutions-oriented science as scientific discovery that informs how people and communities act. Drivers of how we act and operate as a society include the rules that govern us (Policy), our markets for the exchange of goods and services (Economics), our values (Sociocultural factors), and the tools that we use (Technology). Often a Political, Economic, Social and Technological (“PEST” [26]) analysis is used to assess these external factors.
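To make the two-element framing of decisions concrete, the following minimal Python sketch, with entirely hypothetical options, outcome estimates, and value weights (none drawn from our analysis), shows how the same data can yield different preferred actions when stakeholders weight costs and benefits differently:

```python
# Minimal sketch of "decision = data + values": the data (estimated outcomes) are
# shared, but each stakeholder applies subjective weights to costs and benefits,
# which can flip the preferred action.

# Hypothetical outcome estimates (the "data") for two wildfire mitigation options.
options = {
    "prescribed_burn": {"risk_reduction": 0.7, "smoke_days": 5, "cost_usd_m": 2.0},
    "mechanical_thinning": {"risk_reduction": 0.5, "smoke_days": 0, "cost_usd_m": 6.0},
}

# Hypothetical value weightings (the "values") for two stakeholders.
stakeholder_weights = {
    "fire_manager": {"risk_reduction": 10.0, "smoke_days": -0.2, "cost_usd_m": -0.5},
    "public_health_officer": {"risk_reduction": 6.0, "smoke_days": -1.5, "cost_usd_m": -0.3},
}

def score(outcomes, weights):
    """Weighted sum of outcomes: positive weights are benefits, negative are costs."""
    return sum(weights[k] * v for k, v in outcomes.items())

for stakeholder, weights in stakeholder_weights.items():
    best = max(options, key=lambda name: score(options[name], weights))
    print(f"{stakeholder} prefers: {best}")
```

With these illustrative numbers, the two stakeholders reach different preferred options from the same data, which is the essence of the value-driven disagreement that makes wicked problems hard to resolve.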
The WKID Innovation framework [27] uses NASA systems engineering [28], the Knowledge Hierarchy [25], and PEST [26] to identify and define “intel” (synthesized information in the form of knowledge) from data, with the objective of changing actions based on decisions. The framework leverages Theory of Change [29] as a method of social change to map the decisions behind informed action within a system. Specifically, WKID Innovation maps the intended goal of a decision, the decision to be made, who makes it, why they make it, what tools they use, and what information they rely on, as well as any gaps or limitations. Broadly speaking, the systems in which communities operate are defined by processes driven by policy, economics, sociocultural factors, and tools [26] that inform and facilitate interactions among people, hardware (the infrastructure supporting the system), and software (the means of communicating).
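For illustration only (the field names below are our own shorthand, not a prescribed schema from the WKID Innovation framework), one entry in such a decision mapping might be recorded as a simple structured record:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DecisionMapping:
    """One row of a decision-space mapping: who decides what, why, with which
    tools and information, and where the information gaps are."""
    goal: str                      # intended outcome of the decision
    decision: str                  # the decision to be made
    decision_maker: str            # who makes it
    rationale: str                 # why they make it
    tools: List[str] = field(default_factory=list)        # tools they use
    information: List[str] = field(default_factory=list)  # information relied on
    gaps: List[str] = field(default_factory=list)         # gaps or limitations

# Hypothetical example record for illustration.
example = DecisionMapping(
    goal="Keep residents safe during a fast-moving fire",
    decision="When and where to issue evacuation orders",
    decision_maker="County emergency manager",
    rationale="Minimize entrapment risk while avoiding unnecessary evacuations",
    tools=["GIS dashboard", "reverse-911 system"],
    information=["fire perimeter", "wind forecast", "road capacity"],
    gaps=["real-time public perception", "traffic conditions"],
)
print(example.gaps)
```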
Since WKID Innovation systematically evaluates the information needed to change a system, and HCD provides a framework for iterative engagement to create market viable and consumer desirable products and services, the two methodologies can be combined to identify value-added analytics that support solutions-oriented science. With respect to environmental resilience technology, this framework can be applied to facilitate the transition from global change research to applications in operational decision making that aim to support a more resilient future.
1.2. Resilience Technology in Practice: Academic-Private Partnerships
Research to commercialization can be achieved through academic-private partnerships. Deloitte Consulting and the University of Colorado Boulder (CU Boulder) launched the Climate Innovation Collaboratory (CIC) in 2022 to translate cutting-edge climate research and data into meaningful solutions for government agencies, businesses, non-profit organizations, and communities. As a convening body, the CIC enables climate impact through technology and market development. These engagements between Deloitte and CU Boulder develop critical climate data analytics, research, and technology to build innovative and meaningful solutions that help Deloitte partners and clients become more climate resilient. The CIC’s first efforts include two environmental resilience technology projects: 1) wildfire analytics and 2) environmental commodity markets for water and soils management.
Establishing effective academic-private partnerships is challenging and requires a model for how to transition from research to commercialization. In this paper we develop and apply a Research to Commercialization (R2C) model for environmental resilience technology in support of: 1) systematically identifying information gaps in the decision space, 2) defining algorithm requirements, and 3) developing sustainable solutions for resilience to global environmental change through market viability. This paper describes the outcomes achieved from the wildfire analytics project to support community wildfire preparedness, resilience, and response. The pilot project combined CU Boulder’s experience with wildfire research and applications and Deloitte’s experience with human-centered design. In the following sections, we apply our R2C model to a case study to address the wicked wildfire problem in the United States.
1.3. Case Study Background: the Wicked Wildfire Problem in the United States
For the scope of this work, we explored how data analytics could support megafire mitigation, resilience, and response. The wicked wildfire problem refers to the fires that matter, or the fires that have negative social impact [30]; these are commonly referred to by the public as “megafires”. Megafire is a sociopolitical term and therefore has no standard definition, but in the Western United States it generally refers to fires that meet some or all of the following criteria: 1) they burn more than 50,000 acres (~20,234 hectares) [25]; 2) they produce smoke affecting millions of people over regional scales [32]; 3) they threaten residential homes or community structures [33,34]; and 4) they may be fast moving, creating entrapment scenarios for local residents and making suppression efforts even more difficult.
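The criteria above are partly qualitative, so any encoding of them is a judgment call. As a purely illustrative sketch (the smoke-exposure and spread-rate thresholds below are assumptions we introduce here, not definitions from the literature), they could be screened as follows:

```python
# Illustrative screening of the megafire criteria listed above.
# Thresholds and field names are assumptions for demonstration only.

ACRES_THRESHOLD = 50_000  # criterion 1: burned area

def megafire_flags(fire):
    """Return which of the four listed criteria a fire event meets."""
    return {
        "large_area": fire.get("acres_burned", 0) > ACRES_THRESHOLD,
        "regional_smoke": fire.get("people_affected_by_smoke", 0) > 1_000_000,
        "structures_threatened": fire.get("structures_threatened", 0) > 0,
        "fast_moving": fire.get("spread_rate_mph", 0.0) > 3.0,  # assumed cutoff
    }

# Hypothetical fire event.
event = {"acres_burned": 62_000, "people_affected_by_smoke": 4_000_000,
         "structures_threatened": 120, "spread_rate_mph": 2.1}
flags = megafire_flags(event)
print(flags, "-> meets", sum(flags.values()), "of 4 criteria")
```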
Megafires result from the interactions between flammable fuels, climate, and at least one source of ignition [29]. These interactions do not share a common mechanism, so managing for megafires requires a complex understanding of multiple systems and the ability to execute complex decision making across those systems. Climate change from greenhouse gases emitted by the burning of fossil fuels [4] is resulting in warmer, drier, and windier conditions, ripe for increased fire danger [36] and a greater likelihood of large fires in the continental United States [37,38,39]. Despite fuel constraints from increased fire in the future, fire area is still expected to increase [40]. Ignition rates for fires, including from lightning and people (e.g., prescribed fire, arson, overheated cars, power lines, bonfires, fireworks, etc.), are variable, but we know that humans change the fire season, extending it year round and sometimes creating peaks outside the period when lightning ignitions are possible [33]. Furthermore, humans are changing where fires occur and how proximal they are to settlements [34] through urban expansion into wildlands. Finally, the United States has experienced nearly a century of fire exclusion (putting out all fires) and limited prescribed burning, which has led to extreme fuel accumulation [41] that affects fire behavior substantially in some vegetation types, and especially in steep terrain [42]. Not only does fuel accumulation affect fire behavior, but it can also affect the condition of the fuels as vegetation competes for resources and stresses the ecosystem, which can worsen burn severity [43], especially under a warming climate that further exacerbates ecosystem stress. Land use and management have also contributed to fire frequency and severity and to the type of fuels available to burn. For example, land use influences the presence of human development, and thus homes as fuel [34,44,45], as well as the presence of invasive species [46,47] that affect fire occurrence and frequency [48]. When all these ingredients come together, the fires that matter (megafires) are the ones with high risk to society [30], and many actors play an important role in mitigating their impact.
Because different actors are responsible for different contributors to the wicked wildfire problem, there are no single solutions. Furthermore, there is a plethora of technologies aimed at and designed for different actors across the disaster lifecycle, from pre-fire resilience, preparedness, and hazard mitigation to active fire detection, tracking, and response to post-fire recovery [12; Supplemental Material S1]. In fact, lack of technology is not the biggest challenge facing the wicked wildfire community, but rather: 1) a need for strategic and coordinated efforts; 2) access to data and standardization; 3) research and development that thinks holistically about the problem across the disaster lifecycle in the context of resilience; and 4) considerations of financing solutions for long-term sustainability [12,49]. Specific to point four, the wicked wildfire problem provides a strong use case for testing a research to commercialization model.
3. Results
Our analysis of the CTM (Supplemental Material S1) showed that the information most relied upon across user personas concerned Public Perception and Dynamic Risk (Figure 5a). Public Perception, whether of fire risk or of the role and performance of the interviewee, was the information most needed across user stories, representing the greatest need for this type of information to support decisions within the wicked wildfire solution space (Figure 5b). While more user personas need information on dynamic risk (Figure 5a), how they define that risk varies by decision (Figure 5b) within the context of fire hazard (i.e., the potential of fire to occur). Information on fire hazard was as important as information on exposure, while information on how vulnerable a community is was needed more often across the decision space (Figure 5b).
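The synthesis shown in Figure 5 amounts to tallying, for each information type, the share of the 8 user personas and 39 user stories that rely on it. A minimal pandas sketch of that tally, using hypothetical column names rather than our actual CTM schema, is shown below:

```python
import pandas as pd

# Hypothetical long-form CTM: one row per (user story, information type) dependency.
ctm = pd.DataFrame({
    "persona":    ["dispatcher", "dispatcher", "planner", "resident", "planner"],
    "user_story": ["US-01", "US-02", "US-07", "US-12", "US-08"],
    "info_type":  ["public_perception", "dynamic_risk", "public_perception",
                   "dynamic_risk", "exposure"],
})

N_PERSONAS, N_STORIES = 8, 39  # totals reported in the Results

# Percent of personas needing each information type (Figure 5a analogue).
pct_personas = (ctm.groupby("info_type")["persona"].nunique() / N_PERSONAS * 100).round(1)

# Percent of user stories relying on each information type (Figure 5b analogue).
pct_stories = (ctm.groupby("info_type")["user_story"].nunique() / N_STORIES * 100).round(1)

print(pct_personas)
print(pct_stories)
```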
Our analysis from the HCD methods that constrained the analysis of the CTM showed that relying solely on HCD would have excluded the development of social media filtering, despite it being the most needed piece of information across user personas and stories (Figure 5). While HCD focused on workers and communities involved in wildfire mitigation, response, and recovery, which allowed for identifying some of the greatest root problems in this space and the creation of more effective solutions, it did not capture the solution of social media filtering, which interviewees had not imagined. Specifically, because few technologies exist (if any) to filter social media data down to unique content contributions, and because there is often widespread mistrust of information shared on social media, it was hard for potential users to imagine what such a capability would look like. However, when questioned about what drives decisions and what information was missing, the systematic analysis from WKID Innovation highlighted the overarching need for information on public perception.
Figure 5.
Synthesis of the Change Traceability Matrix (CTM) focusing on: a) percent of user personas (out of 8) needing information directly related to or relying on filtered social media or evacuations from high fire risk areas defined as a function of hazard, exposure, and vulnerability; and b) percent of user stories (out of 39) relying on this information to make a decision.
Despite hazard information being most valuable across decisions, the most common information gaps across personas (Table 1) are in social media filtering and risk futures. In particular, it is critical to recognize the need not merely for filtering by hashtag, but for the use of artificial intelligence to identify deep fakes, misinformation, bots, verified accounts, and unique contributions to the conversation during natural disasters. Additionally, risk futures include consideration of how we message risk, not just how we have historically calculated it as acres burned.
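As a rough illustration of what filtering beyond hashtags could look like (the heuristic and threshold below are assumptions for demonstration, not the analytic developed in this work, and a production system would also need bot, misinformation, and deep-fake detection), near-duplicate posts can be collapsed so that only unique contributions remain:

```python
from difflib import SequenceMatcher

def unique_contributions(posts, similarity_threshold=0.85):
    """Collapse near-duplicate posts (e.g., retweets, copy-paste chains) so that
    only sufficiently novel contributions are kept. Illustrative heuristic only."""
    kept = []
    for text in posts:
        is_duplicate = any(
            SequenceMatcher(None, text.lower(), seen.lower()).ratio() >= similarity_threshold
            for seen in kept
        )
        if not is_duplicate:
            kept.append(text)
    return kept

posts = [
    "Evacuation order issued for the town of Superior",
    "evacuation order issued for the town of superior!!",  # near-duplicate
    "Traffic at a standstill on McCaslin Blvd, avoid it",
]
print(unique_contributions(posts))
```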
The PTM showed that the most valuable requirement for any information technology across user personas (Table 2) was the need for interoperable “plug-in” technologies that work with decision makers’ existing tools. Next was the need for information technologies to be accessible with limited connectivity and limited compute resources. Finally, all information should be intuitive to interpret and consistent when scaled from federal reporting to local decision support for implementation.
With these constraints, we demonstrated analytics of value for determining evacuations from a fire (Figure 7) and for understanding how public perception influences our ability to communicate and keep communities safe (Figure 6). The first evacuation notice appears on Twitter at 12:57 pm but does not include specifics about which areas are affected by the evacuation order. Official guidelines specify that if you see flames, you should evacuate. Shortly after that, individual replies provide details about which areas are currently affected and note that the entire town of Superior is under evacuation orders. Over the next hour, video shared by those returning to grab their belongings and pets documents conditions and traffic flow into Superior and Boulder, the first areas under mandatory evacuation. Within an hour, traffic is at a standstill for the town of Superior and residents of Boulder. Shortly after 2 pm, the town of Louisville is ordered to evacuate. There is a steady stream of communications as evacuated residents share updates from their mobile phones while stuck in traffic. Information is shared about unforeseen events, such as traffic light outages and freight trains, also disrupted by the fire, blocking surface streets. By analyzing evacuation potential at walking speed (Figure 7), we see that it can take hours to escape the fire perimeter and surrounding areas. Pedestrians or disabled travelers without access to motorized transport may not be able to cover enough ground to find a safe destination and would rely on others to get them out safely. These would not be concerns if traffic were traveling at normal speeds, as residents would have the entire Denver Metro Area at their disposal to find friends, family, hotels, or shelters.
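A simplified version of the evacuation-potential analysis in Figure 7 can be sketched as shortest travel times over a road graph at a fixed average speed. The toy network, node names, and segment lengths below are placeholders for illustration only; the actual analysis used the local transport network and more realistic traffic assumptions:

```python
import networkx as nx

def travel_times_to_exits(road_graph, exit_nodes, speed_mph):
    """Minutes from every node to the nearest exit node at a fixed average speed.
    Edge attribute 'miles' gives segment length; this simplified model ignores
    intersection delays and dynamic congestion."""
    # Convert segment length to traversal time (minutes) at the given speed.
    for u, v, data in road_graph.edges(data=True):
        data["minutes"] = data["miles"] / speed_mph * 60.0
    # Multi-source Dijkstra gives the time to the closest exit for every node.
    return nx.multi_source_dijkstra_path_length(road_graph, exit_nodes, weight="minutes")

# Toy road network (placeholder geometry).
G = nx.Graph()
G.add_edge("neighborhood_A", "junction", miles=1.5)
G.add_edge("neighborhood_B", "junction", miles=2.0)
G.add_edge("junction", "highway_exit", miles=3.0)

for speed in (30, 8, 4):  # normal, moderate, and heavy (walking-speed) traffic
    times = travel_times_to_exits(G, {"highway_exit"}, speed_mph=speed)
    print(f"{speed} mph -> minutes to exit:", {n: round(t) for n, t in times.items()})
```

Even on this toy network, dropping the average speed from 30 mph to 4 mph turns a roughly 9-minute escape into more than an hour, which is the kind of bottleneck effect the evacuation maps quantify for real road geometry.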
Results from the social media filter (Figure 6), juxtaposed with the evacuation maps, highlight the role the public plays in bridging information gaps between official reporting and geographic information system (GIS) tools such as Google Maps or Waze, enabling a more efficient evacuation. Tweets from evacuees provide destinations and public perceptions about evacuation messaging that can be used in modeling. Furthermore, machine learning using observed traffic patterns from cameras, or calculated through real-time navigational tools such as Google Maps and Waze, can be validated using personal tweets with advice on the most efficient routes. Finally, tweets from official sources can highlight key breakdowns in communication and the lack of cross-entity coordination during the evacuation.
Figure 6.
Twitter in a disaster: Information content from social media can provide critical context to understand evacuation patterns during natural disasters, such as the Marshall fire in Boulder County in 2021. Note: TWITTER, TWEET, RETWEET and the Twitter Bird logo are trademarks of Twitter Inc. or its affiliates.
Figure 7.
Better understanding evacuations: utilizing data analytics to advance understanding of bottlenecks during natural disasters, such as the Marshall fire in Boulder County in December of 2021. The blue polygon is the Marshall fire perimeter. White lines are municipal boundaries within the evacuation zone. Gray lines represent the transport network. The first panel shows normal traffic conditions, in which a driver can easily escape the fire perimeter within 20 minutes. The second panel shows moderate traffic conditions, with an average travel speed of 8 miles per hour. The third panel shows heavy traffic conditions moving at an average of 4 miles per hour; 4 mph is approximately walking speed and, at that speed, there are locations from which it would be difficult to escape the fire. These are models of traffic conditions based on the local transport network.
The results of our analysis overlaying major road networks with increases in fire hazard, characterized by larger and more frequent fires in the future as a result of climate change [37,60], show that more places in the Southeast and up into the Midwest and Northeast United States are likely to experience an increased burden on infrastructure for evacuations (Figure 8).
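The overlay behind Figure 8 can be approximated as a spatial join between road segments and polygons of projected change in fire hazard. The sketch below is a generic geopandas pattern with assumed file names and column names, not our exact workflow:

```python
import geopandas as gpd

# Assumed inputs: a major-road network and polygon projections of change in fire
# hazard (e.g., change in large-fire likelihood under a future climate).
roads = gpd.read_file("major_roads.gpkg")           # assumed path
hazard = gpd.read_file("fire_hazard_change.gpkg")   # assumed path, column "delta_hazard"

# Reproject to a common CRS before any spatial operation.
hazard = hazard.to_crs(roads.crs)

# Attach projected hazard change to every intersecting road segment.
roads_hazard = gpd.sjoin(roads, hazard[["delta_hazard", "geometry"]],
                         how="inner", predicate="intersects")

# Flag segments likely to see an increased evacuation burden.
burdened = roads_hazard[roads_hazard["delta_hazard"] > 0]
print(f"{len(burdened)} of {len(roads)} road segments intersect areas of increasing hazard")
```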
4. Discussion
Environmental resilience technology offers great promise to shift science from observing environmental change to providing solutions that serve society. An important step for transitioning research using environmental data to commercialization is creating analytics of value. A study investigating 905 sustainable development projects in China and the United States found that big data analytics and artificial intelligence play an important role in the success of both sustainable design and commercialization, which mediates the relationship between emerging capabilities, sustainable growth, and performance [61]. Moreover, it has been shown that cutting-edge artificial intelligence and data fusion can lead to important and robust predictions about our possible future climate and environmental pathways [62,63]. There is a growing call to leverage science to better inform market solutions, with a key focus on technology breakthroughs that meet societal and economic needs to address climate change [10,64].
The R2C model demonstrated in this paper reflects the integration of public-private capabilities [49,65], which are key to solving complex environmental challenges. The R2C model integrates the WKID Innovation framework [27] with HCD to map the information gaps across multiple end user types and identify where the greatest gains would come from the development of market-driven solutions. WKID Innovation provides an in-depth, systematic analysis of the decision space and uses researcher domain expertise about the complexity of the interconnected human-environmental system to identify data-driven solutions. Research on its own, however, does not precipitate sustainable solutions. Sustainable solutions consider a funding model beyond preliminary research or development [12]. Designing and clearly stating the requirements for how research fits into the decision space at the outset ensures an off-ramp from research to application [66], but it does not mitigate risks within sustained operations [67], which can be limited by demand (or market viability). By integrating WKID Innovation with HCD and market research, we are able to leverage the strengths of each framework to mitigate risks to technology adoption, thereby scaling the use-inspired translational research of environmental science domain experts [68] to drive tomorrow’s technologies and solutions.
R2C sits in the research-to-commercialization taxonomy of “contract research and consultancy” and addresses the call to integrate traditional agency knowledge of translational research and development (e.g., WKID Innovation) with resource-based methods motivated by understanding organizational needs such as technology, strategy, and markets (e.g., HCD) [69,70]. By leveraging HCD, R2C does not differ drastically from other models proposed for co-production [71] that rely on iterative feedback from users, but it does provide a more top-down complex systems analysis that overcomes some of the limitations of HCD [19,20,21]. Expanding translational ecology [72], which seeks to link ecological knowledge to decision making for use-inspired research [68] and real-world outcomes [73], the R2C model goes beyond information sharing to co-developing not only analytics but also technological solutions that facilitate changing how people interact with the natural environment, making for a more resilient future. Models of co-production [74,75] and translational ecology [72,73,76] offer potential mechanisms and motivations for the environmental sciences to build cross-sector partnerships with end users, but they focus on end-user uptake of information, particularly in the non-profit sector and federal/state/local governments, rather than on the potential for markets to drive solutions. They do, however, acknowledge the importance of building a workforce with skills beyond academic education [76].
There is a long history of research-to-market pathways, particularly in engineering [77], medicine and pharmaceuticals [78], computer science [79], and biology [80], but less so in ecology and the environmental sciences. The incredible wealth of environmental data from satellite sensors, social media platforms, government records, and other data sources offers remarkable opportunities for market-driven solutions to complex environmental challenges. For example, carbon markets offer a means for offsetting fossil fuel emissions by sequestering carbon [81], and reparative finance offers a means for water security [82]. Some companies are starting to use data to verify and validate credits that support improved natural resource management, but there is a need for such analytics to be trustworthy - transparent, consistent, and secure (Table 2). Co-production between industry and scientific research at academic institutions offers a foundation for developing such trustworthy information.
Co-production that merges academic research with industry application, however, requires navigating differences in culture and institutional practices; through the development of this R2C model, we have learned three key lessons and present some best practices for others aiming to apply this approach. First, thought needs to go into how to delineate intellectual property from the start [83] and how a team can co-produce a project together while still maintaining each institution’s rights. A key challenge arises when negotiating contracts and maintaining rights around intellectual property. Legally, “co-production” muddies the waters around who owns what based on contributions [84,85]. A best practice is to communicate this clearly among all participants and to use multiple modalities for communicating, such as diagrams, legal contracts, Gantt charts, and documentation of roles and responsibilities in advance.
Second, we recommend establishing mechanisms such as agreements on sharing knowledge and data. Such mechanisms will help hold the tension between open science, defined as “transparent and accessible knowledge that is shared and developed through collaborative networks” [86], and proprietary information sold as value-added analytics. This would require consideration of data sharing standards [87] that account for the FAIR (Findable, Accessible, Interoperable, and Reusable) and CARE (Collective Benefit, Authority to Control, Responsibility, and Ethics) data principles [88,89], as well as support for trustworthy artificial intelligence (AI) [51], which considers: i) the objectivity of data that promotes fairness and mitigates harmful bias; ii) how data are secured and protected against unauthorized access, use, disclosure, disruption, modification, or destruction; iii) how safety is protected so that the system does not cause unacceptable risk; iv) how privacy is protected; v) how explainable or transparent an algorithm is; vi) how accurate it is; vii) how reliably it performs as expected; viii) how resilient it is through its adaptive capability; and ix) how accountable it is through a traceable record of events in the decision-making process.
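One lightweight way to operationalize such an agreement is a shared checklist that every released analytic must pass. The structure below is only a sketch of how the FAIR/CARE and trustworthy-AI criteria listed above might be recorded; the field names are our own shorthand, not a standard or required format:

```python
# Illustrative release checklist combining FAIR/CARE data principles with the
# trustworthy-AI considerations listed above. Field names are our own shorthand.
release_checklist = {
    "fair": {"findable": True, "accessible": True, "interoperable": True, "reusable": True},
    "care": {"collective_benefit": True, "authority_to_control": True,
             "responsibility": True, "ethics": True},
    "trustworthy_ai": {
        "bias_mitigated": True,      # objectivity of data, fairness
        "secure": True,              # protected against unauthorized access or modification
        "safe": True,                # does not cause unacceptable risk
        "privacy_preserving": True,
        "explainable": False,        # e.g., model documentation still pending
        "accurate": True,
        "reliable": True,
        "resilient": True,
        "accountable": True,         # traceable record of decisions
    },
}

def ready_for_release(checklist):
    """An analytic is released only if every criterion in every category is met."""
    return all(all(section.values()) for section in checklist.values())

print("Ready for release:", ready_for_release(release_checklist))
```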
Third, we recommend developing and implementing a communication plan between the institutions that can help to overcome variations in meeting and work cultures. A communication plan [87] that includes a management structure with roles and responsibilities, as well as a reporting structure within teams [83], can alleviate this confusion and enable work management within teams that can cross-pollinate to the other team(s) relying on the outcomes of the partnership. Such a plan would also help alleviate tensions between the faster cadence of deliverables required in industry and the slower turnaround in academia due to the exploratory nature of the work and different incentive structures [90]. Clear technical deliverables, timelines, and inter-team meetings that create “tie-points” between industry and academic workflows can allow each team to manage as is culturally appropriate, while still collaborating towards the same end purpose. This is similar to findings that recommend taking a “design approach” to cross-sector collaborations by keeping the end product in mind and iterating throughout [91].
With respect to the presented use case for this R2C model, our findings are supported by recent developments in both research and industry, and they provide clear guidance on requirements for integration into the decision space. With respect to risk assessments and the development of hazard, exposure, and vulnerability information, new models are being developed for risk [30,92], hazard [31,38,43,59,93], exposure [44,94], and vulnerability [95,96], and they are being integrated into commercial capabilities such as Risk Factor by First Street Foundation and public offerings such as the US Forest Service’s WildfireRisk.org or Fuelcast.net. By combining these risk data analytics with evacuation simulations (Figure 7), there is an opportunity for resilience planning with respect to infrastructure. Our case study showed that traffic is a predictable consequence of fire evacuations and that it could be mitigated with both planning and technology. Combining longer-term fire hazard analytics with these simulations could further show how and why traffic develops and enable experimentation with new road configurations that save lives by preventing entrapment as people move away from high natural hazard areas (e.g., Appendix B User Stories 3.6 and 3.7). Finally, with respect to using information gleaned from social media filtering, research advancements show capabilities for geolocating content [97,98,99], identifying misinformation and deep fakes [100], and identifying unique contributions [11]. These advancements can greatly enhance more traditional GIS analytics that use remote sensing and rely on maps.
Author Contributions
Conceptualization, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Christina Zapata, Ethan Doyle, Evan Thomas and Jennifer Balch; Data curation, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Virginia Iglesias, Christina Zapata, Maxwell Cook, Jilmarie Stephens and Ty Tuff; Formal analysis, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Christina Zapata, Michael Byrne, Laurel Cooper and Mario Tapia; Funding acquisition, E. Natasha Stavros, Ethan Doyle, Evan Thomas, SJ Maxted, Rana Sen and Jennifer Balch; Investigation, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Virginia Iglesias, Christina Zapata, Michael Byrne, Maxwell Cook, Jilmarie Stephens, Mario Tapia and Ty Tuff; Methodology, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Christina Zapata, Ethan Doyle and Jennifer Balch; Project administration, E. Natasha Stavros, Caroline Gezon, Evan Thomas and Jennifer Balch; Resources, E. Natasha Stavros, SJ Maxted, Rana Sen and Jennifer Balch; Supervision, E. Natasha Stavros, Caroline Gezon and Jennifer Balch; Validation, Caroline Gezon, Lise St Denis, Christina Zapata, Michael Byrne, Laurel Cooper and Mario Tapia; Visualization, E. Natasha Stavros, Lise St Denis, Jilmarie Stephens and Ty Tuff; Writing – original draft, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Virginia Iglesias, Maxwell Cook, Jilmarie Stephens, Ty Tuff, Evan Thomas and Jennifer Balch; Writing – review & editing, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Virginia Iglesias, Maxwell Cook, Jilmarie Stephens, Ty Tuff, Evan Thomas and Jennifer Balch.