Preprint
Article

Environmental Resilience Technology: Sustainable Solutions Using Value-Added Analytics in a Changing World


A peer-reviewed article of this preprint also exists.

Submitted: 23 July 2023

Posted: 24 July 2023

Abstract
Global climate change and associated environmental extremes present a pressing need to understand and predict social-environmental impacts while identifying opportunities for mitigation and adaptation. In support of informing a more resilient future, emerging data analytics technologies can leverage the growing availability of Earth observations from diverse data sources, ranging from satellites to sensors to social media. Yet, there remains a need to transition from research for knowledge gain to sustained operational deployment. In this paper, we use the Wisdom-Knowledge-Information-Data (WKID) Innovation framework to inform solutions-oriented science in an integrated Research to Commercialization (R2C) model that explores market viability for value-added analytics. We conduct a case study using this R2C model to address the wicked wildfire problem. By integrating WKID and human-centered design (HCD) through an industry-university partnership called the Climate Innovation Collaboratory between the University of Colorado Boulder and Deloitte Consulting, we systematically evaluated 39 different user stories across 8 user personas and identified common gaps, how to define information technologies that add value, and how to develop a marketable product. This R2C model could enable the transition of knowledge to operational implementation, informing policy and decision makers tasked with addressing some of our most challenging environmental problems.
Keywords: 
Subject: Environmental and Earth Sciences - Sustainable Science and Technology

1. Introduction

We are now in the Anthropocene - an unprecedented time of global environmental change caused by urbanization [1], desertification [2], biodiversity loss [3], and an increase in the frequency and severity of extreme events [4]. The drivers for this change in our interconnected social-ecological world include global warming [4] and rapid population growth [5], and are likely to threaten people's lives, livelihoods, and assets in years to come [2]. This increasing risk motivates a need to think about mitigation of impact.
Broadly speaking, this is what is referred to as resilience - the prevention of, ability to withstand, and capacity to respond to and recover from environmental perturbations and shocks [6]. Some leading questions in disturbance and resilience ecology include: 1) How and why does life organize across scales? [7] This encompasses questions of disease and infestation, evolution, food security, type conversions, migrations, extinctions, etc.; 2) How will disturbance impact how life organizes across scales? [8] This encompasses questions of mass extinctions, biodiversity loss, habitat fragmentation, settlements, land cover and land use change, etc.; and 3) How do we mitigate our climate risk to ecosystems and communities as a function of vulnerability, exposure, and hazard? [9] This includes questions about storms, fire, water commodities, food security, air quality, sea level rise, etc. Researchers are now well-positioned to examine these questions and inform a more resilient future as we navigate the information age with cutting-edge analytics and big data on Earth observations from diverse data sources, from remote sensing [10] to social media [11].

1.1. Environmental Resilience Technology: Definitions

It is increasingly evident that new technologies informed by climate and environmental research, as well as new applications of existing technologies, will be both useful and necessary to support environmental change mitigation and adaptation toward more resilient communities. These “environmental resilience technologies” may include remote sensing, machine learning, artificial intelligence, geospatial analytics, sensors, and social media data that can be synthesized and used to support decisions in the natural, built and social environment; it may also include technologies that can directly increase resilience, such as wildfire mitigation methods, improved built infrastructure, better water management, direct air carbon capture, and renewable energy generation. One embodiment of environmental resilience technologies, relevant to the present work, is the synthesis and interpretation of data for solutions-oriented science to create value-added analytics that enable society to become more resilient to changing environmental futures. Within this definition of environmental resilience technology, we further define value-added analytics and solutions-oriented science as follows.
Figure 1. Environmental Resilience Technology addresses societal challenges through solution development that involves cross-sector collaborations for scaling and adoptions. Specific to this manuscript, we develop a model for solutions-oriented science that provides value-added analytics.
At present, there are dozens of solutions for different applications, and yet one of the biggest limitations is sustaining those applications in use over time [12], as much of the research funding focuses on developing solutions. As such, innovation requires not merely creating a piece of technology, but developing the business model for its sustained use. Because value inherently has cost [13], value-added analytics - in the context of solutions using analytics - are worth paying for and thus have the potential to go from research to commercialization.
A growing methodology used for commercializing a product or service is human-centered design (HCD) [14,15]. HCD, also known as "design thinking" or "systems thinking", is a bottom-up approach to problem solving that starts by understanding people - their actions, decisions, and feelings. It is rooted in anthropology and engineering [16,17,18]. It is best suited for complex, non-linear problems where gaining an understanding of user needs leads to better and more effective solutions. HCD differs from traditional business and management approaches in that it is not a hypothesis-first approach; rather, it starts from the study of people in a problem or opportunity space and leads to the generation of solutions from what is learned. HCD's approach to innovation of commercial products and services centers on human desirability, technical feasibility, and business viability. Of these three factors, it always begins with human desirability and includes the human perspective and emotion in all steps of the problem-solving space. Some limitations of traditional HCD include incremental design changes based on the human-centric perspective, typically within a single application rather than a broader examination of related systems and the decision space [19]. This often occurs when designers are too closely anchored in what people are directly asking for, rather than seeking to understand and solve for the root causes behind those desires and the novel future designs [20] that could systematically address these core issues; this is especially true in data science applications [21]. As Henry Ford is famously quoted as saying, "If I had asked people what they wanted, they would have said faster horses" [22].
The WKID Innovation framework, by contrast, focuses on transparent and systematic traceability across the decision space to identify information technology requirements and inform data science in support of closing information gaps. WKID Innovation was designed to tackle wicked problems, known in complex systems science as problems that have no single cause or best solution [23]. In modern society, multiple organizations often tackle wicked problems, presenting differences in values across entities that complicate decision making. As defined here, decisions are made using two elements [24]: 1) data, which can be quantitative or qualitative, and 2) values, represented as the subjective weighting of the costs and benefits of the outcome of that decision. While the data and synthesized information may provide guidance for potential outcomes, the solution often depends on different users' values, which often results in disagreement about the cause of and solution to the problem. Wicked problems fundamentally need systematic change to their drivers, which has the potential to address many causes and solutions at once. WKID Innovation defines "wisdom" in the Knowledge Hierarchy [25] as informed action and sets a foundation for defining solutions-oriented science as scientific discovery that informs how people and communities act. Drivers of how we act and operate as a society include the rules that govern us (Policy), our markets for the exchange of goods and services (Economics), our values (Sociocultural factors), and the tools that we use (Technology). Often a Political, Economic, Social and Technological ("PEST" [26]) analysis is used to assess these external factors.
The WKID Innovation framework [27] uses NASA system engineering [28], the Knowledge Hierarchy [25], and PEST [26] to identify and define “intel” (synthesized information in the form of knowledge) from data with the objective of changing actions based on decisions. The framework leverages Theory of Change [29] as a method of social change to map the decisions behind informed action within a system. Specifically, WKID Innovation maps the intended goal of a decision, the decision to be made, who makes it, why they make it, what tools they use, and on what information they rely as well as any gaps or limitations. Broadly speaking, the systems in which communities operate are defined by the processes driven by policy, economics, sociocultural factors, and tools [26] that inform and facilitate interactions of people, hardware or the infrastructure supporting the system, and software or the means of communicating.
Since WKID Innovation systematically evaluates the information needed to change a system, and HCD provides a framework for iterative engagement to create market viable and consumer desirable products and services, the two methodologies can be combined to identify value-added analytics that support solutions-oriented science. With respect to environmental resilience technology, this framework can be applied to facilitate the transition from global change research to applications in operational decision making that aim to support a more resilient future.

1.2. Resilience Technology in Practice: Academic-Private Partnerships

Research to commercialization can be achieved through academic-private partnerships. Deloitte Consulting and the University of Colorado Boulder (CU Boulder) launched the Climate Innovation Collaboratory (CIC) in 2022 to translate cutting-edge climate research and data into meaningful solutions for government agencies, businesses, non-profit organizations, and communities. As a convening body, the CIC enables climate impact through technology and market development. These engagements between Deloitte and CU Boulder develop critical climate data analytics, research, and technology to build innovative and meaningful solutions to help Deloitte partners and clients become more climate resilient. As part of this effort, the CIC's first efforts include two environmental resilience technology projects: 1) wildfire analytics and 2) environmental commodity markets for water and soils management.
Establishing effective academic-private partnerships is challenging and requires a model for how to transition from research to commercialization. In this paper we develop and apply a Research to Commercialization (R2C) model for environmental resilience technology in support of: 1) systematically identifying information gaps in the decision space, 2) defining algorithm requirements, and 3) developing sustainable solutions for resilience to global environmental change through market viability. This paper describes the outcomes achieved from the wildfire analytics project to support community wildfire preparedness, resilience, and response. The pilot project combined CU Boulder’s experience with wildfire research and applications and Deloitte’s experience with human-centered design. In the following sections, we apply our R2C model to a case study to address the wicked wildfire problem in the United States.

1.3. Case Study Background: the Wicked Wildfire Problem in the United States

For the scope of this work, we explored how data analytics could support megafire mitigation, resilience, and response. The wicked wildfire problem refers to the fires that matter, or the fires that have negative social impact [30]; these are commonly referred to by the public as "megafires". Megafire is a sociopolitical term and therefore has no standard definition, but in the Western United States, megafire generally refers to fires that meet some or all of the following criteria: 1) they burn more than 50,000 acres or ~20,234 hectares [25], 2) they produce smoke affecting millions of people over regional scales [32], 3) they threaten residential homes or community structures [33,34], and 4) they may be fast moving, creating entrapment scenarios for local residents and making suppression efforts even more difficult.
Megafires result from the interactions between flammable fuels, climate, and at least one source of ignition [29]. These interactions do not share a common mechanism, so managing for megafires requires a complex understanding of multiple systems and the ability to execute complex decision making across those systems. Climate change from greenhouse gases emitted by the burning of fossil fuels [4] is resulting in warmer, drier, and windier conditions, ripe for increased fire danger [36] and a greater likelihood of large fires in the continental United States [37,38,39]. Despite fuel constraints from increased fire in the future, fire area is still expected to increase [40]. Ignition rates for fires, including from lightning and people (e.g., prescribed fire, arson, overheated cars, power lines, bonfires, fireworks, etc.), are variable, but we know that humans change the fire season, extending it year-round and sometimes shifting its peak outside the period when lightning ignitions are possible [33]. Furthermore, humans are changing where fires occur and how proximal they are to settlements [34] through urban expansion into wildlands. Finally, the United States has experienced nearly a century of fire exclusion (putting out all fires) and limited prescribed burning that has led to extreme fuel accumulation [41], which affects fire behavior substantially in some vegetation types, especially in steep terrain [42]. Not only does fuel accumulation affect fire behavior, but it can also affect the condition of the fuels: as vegetation competes for resources, it can stress the ecosystem, which can worsen burn severity [43], especially under a warming climate that further exacerbates ecosystem stress. Land use and management have also contributed to fire frequency and severity, and to the type of fuels available to burn.
For example, land use influences the presence of human development, and with it the role of homes as fuel [34,44,45], as well as the presence of invasive species [46,47] that affect fire occurrence and frequency [48]. When all of these ingredients come together, the fires that matter (megafires) are the ones with high risk to society [30], and many actors play an important role in mitigating their impact.
Because different actors are responsible for different contributors to the wicked wildfire problem, there are no single solutions. Furthermore, there is a plethora of technologies aimed at and designed for different actors across the disaster lifecycle, from pre-fire resilience, preparedness, and hazard mitigation to active fire detection, tracking, and response to post-fire recovery [12; Supplemental Material S1]. In fact, lack of technology is not the biggest challenge facing the wicked wildfire community, but rather: 1) a need for strategic and coordinated efforts, 2) access to data and standardization, 3) research and development that thinks holistically about the problem across the disaster lifecycle in the context of resilience, and 4) considerations of financing solutions for long-term sustainability [12,49]. Specific to point four, the wicked wildfire problem provides a strong use case for testing a research to commercialization model.

2. Methods

2.1. Research to Commercialization (R2C) Model

First, we developed a Research to Commercialization (R2C) model (Figure 2) that integrates the WKID Innovation and HCD methodologies. The HCD approach engages a variety of tools and frameworks to assess user needs and market conditions. Leveraging the outcomes of interviews, we developed user stories, use cases, and an ecosystem map to define the current market landscape. To find the intersection of innovation potential, we estimated the user desirability, business viability, and technical feasibility of a development based on qualitative and market research. Our HCD process brought potential end users into the design process through interviews and focus groups to define the problem set, narrow potential solution sets, and test initial solution prototypes and wireframes. The HCD process also aims to ensure end user needs are met and aligned to organizational requirements by reviewing enterprise architectures for industry-provided technologies capable of scaling to support large organizations.
WKID Innovation uses two core tools: the Change Traceability Matrix (CTM) and the Product Traceability Matrix (PTM) [27]. The CTM organizes information to provide a comprehensive look across the decision space to identify common information needs and specifications for changing the way decisions are made. It documents everything from the key performance indicators (KPIs; e.g., "# lives saved", "$ saved") used, the decisions being made, by whom and for what purpose, how decisions are funded, what motivates the decision, the tools currently used, the information needed, and gaps and limitations, to the key parties engaged in and affected by the problem. The PTM provides high-level "requirements" on product definition. WKID Innovation does not assess market viability explicitly, but rather assumes that if there is a need and it fits the decision space, it has value. This may not always be the case, and interpretation will be needed in the context of business needs and decisions.
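To make the CTM's row structure concrete, a minimal sketch in Python follows. The field names and example values are our own illustration, not the canonical WKID Innovation schema; a decision with two KPIs expands into two rows, one per KPI.

```python
from dataclasses import dataclass, field

@dataclass
class CTMRow:
    """One traceability row: a decision, one of its KPIs, and one piece
    of information that KPI relies on. Field names are illustrative,
    paraphrasing the CTM columns described in the text."""
    persona: str           # e.g., "Local Emergency Responder"
    decision: str          # the decision being made
    purpose: str           # why the decision is made
    kpi: str               # e.g., "# lives saved", "$ saved"
    tools: list = field(default_factory=list)  # tools currently used
    information: str = ""  # information the KPI relies on
    gaps: str = ""         # known gaps or limitations

# Invented example: one decision with two KPIs becomes two rows.
rows = [
    CTMRow("Local Emergency Responder", "order an evacuation",
           "protect lives", "# lives saved", ["radio", "GIS"],
           "dynamic risk", "no real-time fire spread data"),
    CTMRow("Local Emergency Responder", "order an evacuation",
           "protect lives", "$ saved", ["radio", "GIS"],
           "dynamic risk", "no real-time fire spread data"),
]
```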
Figure 2. An Environmental Resilience Technology research to commercialization (R2C) model that integrates WKID Innovation and Human Centered Design (HCD) to define information of value for changing systems by leveraging global change research to inform decisions around a more resilient future.
In this R2C model, we allowed the WKID Innovation framework to drive the initial interviews, providing more structure than the emergent probing used in HCD for initial user interviews. We conducted 26 interviews across four gradients of representative decision makers, which we called user archetypes (Figure 3). The first gradient spanned decision makers in pre-fire, active fire, and post-fire situations, such as emergency responders at local, state, and federal levels; public information officers (e.g., communications managers); land use and asset planners (e.g., developers, real estate, (re-)insurance companies, city planners, and utility companies); recovery (e.g., public assistance, emergency management organizations, and (re-)insurance companies); resilience and preparedness planning (e.g., local, state, and federal offices); land managers (i.e., private, state, federal, and Tribal partners); and policy implementation and enforcement agents. The second gradient included people working in the public, private, and non-profit sectors. The third gradient spanned decision makers from rural to urban environments. Finally, we interviewed across the gradient of decision maker budgets from low to high. User personas were created based on the sector (public, private, non-profit, academic), the domain of jurisdiction (local, state, regional, national), and the user archetypes: Resilience, Public Information Officer, Land Planning, Recovery, Land Managers, Policy Implementation/Enforcement, and Emergency Management.
Figure 3. Number of interviewees by user persona. Note that here we show five personas when the analysis determined eight personas, because Emergency Responders, Land Management/Preparedness/Resilience Planning, and Public Information Officers were further partitioned into local and state/regional personas.

2.2. Design Analysis

For the interviews, we standardized the questions we asked (Appendix A) and used a standardized data collection form to take notes as a precaution against subjective interviewer bias. For each interview there was a lead interviewer and two note takers, one taking verbatim notes and the other synthesizing key points emphasized by the interviewee. The questions (Appendix A) paralleled the structure used to populate a Change Traceability Matrix (Supplemental Material S1). All notes were input into an online form in real time, which generated a shared spreadsheet with 36 columns denoted A through AJ. These columns were then cross-walked to the CTM to show where key information from explicit questions would likely populate the CTM.
To parse the interview responses into the CTM, we assigned columns from the interview form as the most likely responses to each column of the CTM (Figure 4) and pulled key information from responses into synthesized statements relevant to each column of the CTM. We then created a separate CTM per user archetype as a different sheet in the CTM workbook. Each response was then read and parsed into the appropriate User Persona sheet by creating a new row for each KPI, working left to right. Sometimes one decision would have multiple KPIs as goals or objectives; in these cases, the decision descriptor was a merged cell spanning all associated KPIs. Similarly, any one KPI may rely on multiple types of information. A new row was added for each piece of needed information, and the KPI cell related to that decision was merged to span all needed information. While every effort was made to ask all questions, sometimes interviewees answered questions not yet asked in response to earlier questions, and sometimes interviewees simply did not understand or could not answer a question. In these cases, any questions that participants did not provide details on were left blank in the final CTM and represent gaps in information from the interviews or potential gaps in the system. Bolded text in the CTM represents key quotes from interviews that capture a recurring sentiment by interviewees within that persona. Both the verbatim and synthesis notes (each taken by a different note taker) were treated independently (as different data points) and analyzed through a double-blind process whereby the interviewee and the analyzer were not identified to each other. Where information from an interview overlapped with existing synthesized information in the CTM, no new rows were created, but existing rows may have been edited for clarity. All details specific to a company, agency, or organization were removed for anonymity.
As such, the CTM provides a synthesis that captures general methods, mechanisms, and patterns for a user persona and each key decision they make.
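The cross-walk and row-expansion logic described above can be sketched as follows. The column letters, question texts, and example response are hypothetical stand-ins for the actual 36-column form, and the decision/KPI merged-cell convention is flattened into one dictionary per row.

```python
# Hypothetical cross-walk from interview-form columns (the real form
# had 36 columns, A-AJ) to CTM fields; column letters are invented.
INTERVIEW_TO_CTM = {
    "C": "decision",     # "What decisions do you make, and why?"
    "F": "kpis",         # "How do you measure success?" (list)
    "K": "information",  # "What information do you rely on?" (list)
    "N": "gaps",         # "What is missing or limiting?"
}

def parse_response(persona, response):
    """Expand one interview response (a dict keyed by column letter)
    into CTM rows: one row per (KPI, needed-information) pair,
    mirroring the merged-cell convention of the spreadsheet CTM."""
    f = {name: response.get(col) for col, name in INTERVIEW_TO_CTM.items()}
    rows = []
    for kpi in f["kpis"] or ["(unanswered)"]:
        for info in f["information"] or ["(unanswered)"]:  # blanks mark gaps
            rows.append({"persona": persona, "decision": f["decision"],
                         "kpi": kpi, "information": info, "gaps": f["gaps"]})
    return rows

# Invented example: one decision, one KPI, two information needs -> 2 rows
resp = {"C": "prioritize fuel treatments", "F": ["# acres treated"],
        "K": ["fuel maps", "long-term fire hazard"], "N": "coarse resolution"}
rows = parse_response("Land Manager", resp)
```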
Prior to interview synthesis, the CU Boulder researchers presented nine data analytics capabilities in development, which were used as potential solutions for further refinement. As part of the HCD process, the joint Deloitte-CU Boulder team spent several days qualitatively synthesizing the insights gathered from the interviews, documenting individual pain points and solution areas with sticky notes, and then aggregating information into cross-cutting insights and themes. These insights were later validated through the CTM analysis.
We then used a common framework in HCD to gauge and then maximize human desirability, business viability, and technical feasibility. Using this desirability-viability-feasibility, or "DVF", framework [50], the team analyzed the nine researcher data analytics concepts and down-selected the three that scored the highest to pursue further in agile product development. The integrated CU Boulder-Deloitte team discussed research insights to collaboratively arrive at a desirability score for each potential solution. The Deloitte team led the scoring for business viability, using market research insights and subject-matter expertise from project advisors working in the field. The CU Boulder team led the scoring for technical feasibility, leveraging their knowledge of what their team could build and how long it would feasibly take to develop. The initial areas selected for further research and development included several data analytics product concepts in the wildfire risk and resilience space: "dynamic risk", "evacuations", and "public perception".
Using these identified value-added information needs, a PTM (Supplemental Material S1) was created. Looking across all user personas in the CTM, we identified any time that "risk" ("hazard", "exposure", and "vulnerability" were included, as these terms fall within the definition of risk [30]), "evacuation", or "public perception" ("social media" and "# likes and retweets" were included) was identified as needed information in the associated limitation or opportunity column (Figure 4), and then ported it into the PTM under one of the following categories: Product Definition; User Experience (UX)/Training & Personnel; Accessibility; Business Model; Trustworthy AI [51]; Product Requirements; Other Tech & Culture Needs; and Information Gaps. Each of these limitations and opportunities thus became a row in the PTM; rows were then reviewed, grouped based on similarity, and a requirement was written for each group. Each requirement was then cross-referenced with the number of rows and the number of unique user personas it represented.
Two analyses were performed on the PTM: one investigated the number of user personas and the other the number of decisions used to create user stories for each of the key information variables related to the highest-priority, market-viable information needs: "dynamic risk" and "public perception". User stories were constructed as: To make a [decision] for [purpose], a [user persona, possibly including a generalized job descriptor, e.g., "local emergency response firefighter"] relies on [information]. Because each interviewee provided a job descriptor and was assigned a persona, stories could be verified with interviewees while maintaining anonymity of respondents. The number of user personas and user stories were determined based on information needs specifically using the terms "public perception", "dynamic risk", "vulnerability", "hazard", "exposure", "filtered communications", "evacuation", "misinformation", and "# of likes and retweets". One user persona may make many different decisions for different reasons; this is represented by the 39 user stories from 8 user personas.
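The persona and story tallies described above amount to keyword matching over the information-need column of the PTM. A minimal sketch, using invented example stories rather than the study's data:

```python
from collections import defaultdict

# Keyword list taken from the text above.
KEYWORDS = ["public perception", "dynamic risk", "vulnerability", "hazard",
            "exposure", "filtered communications", "evacuation",
            "misinformation", "# of likes and retweets"]

def tally(user_stories):
    """Count distinct user stories and distinct personas per keyword.
    Each story is a dict with 'persona', 'decision', 'information'."""
    story_counts = defaultdict(int)
    persona_sets = defaultdict(set)
    for s in user_stories:
        info = s["information"].lower()
        for kw in KEYWORDS:
            if kw in info:
                story_counts[kw] += 1
                persona_sets[kw].add(s["persona"])
    return story_counts, {k: len(v) for k, v in persona_sets.items()}

# Invented example rows for illustration only:
demo = [
    {"persona": "Local Emergency Responder", "decision": "order an evacuation",
     "information": "dynamic risk"},
    {"persona": "Public Information Officer", "decision": "issue alerts",
     "information": "public perception of fire danger"},
    {"persona": "Recovery", "decision": "allocate aid",
     "information": "public perception"},
]
stories, personas = tally(demo)
# "public perception" appears in 2 stories across 2 personas
```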

2.3. Human Centered Design: Proof-of-Concept Demonstrations

Following the HCD process, and using the CTM and PTM analyses of user stories to further define each information variable, we developed proof-of-concept analyses of social media filtering, juxtaposed with Geographic Information System (GIS) evacuation simulations and annual fire hazard maps out to 2070. Recognizing that people evacuate away from fire hazard, and given recent investments in infrastructure and climate resilience, projecting long-term fire hazard across the continental USA in the context of current infrastructure proved highly desirable and market viable. Here we describe the methods used to create these proof-of-concept analyses, which were used in continued iterations of qualitative research to gather feedback from potential users.
Human developments fill geographical space using a combination of different land covers (e.g., agricultural, forest, development, etc.) and transport networks. Wildland fires spread and burn across natural and human-produced fuels. Humans trying to escape the path of those fires use local transport networks to move to a low-danger destination. In the case of the Marshall Fire, human development is adjacent to a large expanse of naturally preserved Boulder County open space, and the transport network is best described as suburban, dominated by slow, curved residential streets nested into a meandering development. As the fire traveled through grasslands and suburbia with gusts greater than 100 miles per hour [52], evacuees flooded onto small, circuitous streets and sat in traffic trying to get away from the fire blowing overhead.
For the social media filtering and evacuations, we focused on the Colorado Marshall Fire, which occurred over New Year's Eve (December 30, 2021 - January 3, 2022) in a suburban neighborhood with traditionally low fire hazard and for which tens of thousands of people were evacuated. Social media data were collected using the Twitter API version 2 [53] to extract all original tweets (no retweets) containing the hashtag #marshallfire or the words "Marshall Fire" from December 30, 2021 through the end of January 2022. We collected beyond the containment date to capture the post-fire discussion related to the evacuation and returns. This resulted in 26,788 tweets, 1,756 of which related in some way to the evacuation. These tweets were further stratified using a neural net classifier to identify the tweet contributor role [11,54,55]. The content of the evacuation-subset tweets was examined manually, including the related conversational threads and linked sources, to build a timeline of events (Figure 6). We filtered out media sources and focused on official messaging and contributions from people directly impacted by the event. This allowed us to identify the key issues and communication disconnects related to the evacuation as it evolved over time, as well as the socio-technical innovations that supported decision-making and communication. Based on our initial analysis, we expanded the data collection to include the names of the towns under evacuation orders (Louisville, Superior, Boulder, and Lafayette) and tweets related to evacuation or traffic. For example, we searched for original tweets for the town of Louisville using the term "Louisville" plus either "traffic" or "evacuation" during the evacuation timeframe. After eliminating noise (e.g., tweets reporting traffic in Louisville, KY, or marketing tweets promoting superior internet traffic), this resulted in the addition of 232 tweets.
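The expanded keyword search and noise elimination described above can be sketched as a simple text filter. The actual collection used the Twitter API v2 and a neural net classifier for contributor roles, neither of which is reproduced here; the example tweets and noise patterns below are invented for illustration.

```python
import re

def matches_expansion(text, town="Louisville"):
    """Keep tweets mentioning the town plus 'traffic' or 'evacuation'
    (the expanded query described in the text)."""
    t = text.lower()
    return town.lower() in t and ("traffic" in t or "evacuation" in t)

def is_noise(text):
    """Drop known false positives, e.g., Louisville, KY traffic reports
    or marketing for 'superior' internet traffic. Patterns are
    illustrative, not the study's actual filter."""
    t = text.lower()
    return bool(re.search(r"louisville,?\s*ky\b", t)) or "internet traffic" in t

tweets = [  # invented examples
    "Evacuation order for all of Louisville CO #marshallfire",
    "Heavy traffic on I-65 near Louisville, KY this morning",
    "Get superior internet traffic with our new plan!",
]
kept = [t for t in tweets if matches_expansion(t) and not is_noise(t)]
# kept -> only the first tweet survives the filter
```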
We simulated the evacuation traffic [56] away from the Marshall Fire using three inputs: a representation of the road network provided by OpenStreetMap, a fire perimeter from the National Interagency Fire Center [57] showing the extent of the burned area, and a list of origins and destinations for travelers evacuating the fire. We made a naive estimation of origins as one person leaving each destroyed structure, determined using locations reported by the Boulder County Sheriff's Office [43] and geocoded to matching building records from the Zillow ZTRAX dataset [58]. We then estimated isochrones for different amounts of travel time.
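The travel-time bands (isochrones) can be illustrated on a toy road network. The study used OpenStreetMap data and a traffic simulation [56]; the sketch below simply runs Dijkstra's algorithm over an invented adjacency list weighted by travel time in minutes and buckets reachable nodes into bands.

```python
import heapq

def travel_times(graph, origin):
    """Dijkstra: shortest travel time (minutes) from origin to every
    reachable node. graph maps node -> list of (neighbor, minutes)."""
    dist = {origin: 0.0}
    pq = [(0.0, origin)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def isochrones(graph, origin, cutoffs):
    """Group nodes by the smallest cutoff (minutes) within which they
    are reachable -- the bands drawn on an isochrone map."""
    dist = travel_times(graph, origin)
    bands = {c: set() for c in cutoffs}
    for node, d in dist.items():
        for c in sorted(cutoffs):
            if d <= c:
                bands[c].add(node)
                break
    return bands

# Toy suburban street network (minutes per edge); names are invented.
roads = {
    "home": [("cross_st", 2), ("cul_de_sac", 1)],
    "cross_st": [("arterial", 5)],
    "arterial": [("highway_on_ramp", 8)],
    "cul_de_sac": [],
}
bands = isochrones(roads, "home", cutoffs=[5, 10, 20])
# bands[5] holds nodes within 5 minutes of "home"
```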
We used existing research to predict future fires from 2020 to 2060 in the contiguous U.S. [59,60] and overlaid major road networks to better see where infrastructure exists and hazard is changing. Deloitte continues to lead the agile HCD process refining enterprise technologies for usability, and the CU Boulder team adapts algorithms to provide value-added analytics at the spatial and temporal resolutions relevant for decision making.

3. Results

Our analysis of the CTM (Supplemental Material S1) showed that the information most relied upon across user personas concerned Public Perception and Dynamic Risk (Figure 5a). Public Perception of fire risk, or of the role/performance of the interviewee, was the information most needed across user stories, meaning it supports the most decisions within the wicked wildfire solution space (Figure 5b). While more user personas need information on ‘dynamic risk’ (Figure 5a), how they define that varies by decision (Figure 5b) within the context of fire hazard (i.e., the potential of fire to occur). Information on fire hazard was as important as information on exposure, while information on how vulnerable a community is was needed across a greater share of the decision space (Figure 5b).
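The Figure 5 synthesis amounts to tallying what share of personas (5a) and what share of user stories (5b) rely on each information category in the CTM. A minimal sketch of that tally follows; the records are invented placeholders, not the study's actual CTM rows.

```python
# Each CTM row links one user story (a decision) to a persona and to the
# category of information that decision relies on. Hypothetical data:
stories = [
    {"persona": "Local Emergency Responder", "info": "public perception"},
    {"persona": "Local Emergency Responder", "info": "dynamic risk"},
    {"persona": "Resilience Planner",        "info": "public perception"},
    {"persona": "Recovery Administrator",    "info": "dynamic risk"},
]

def share_by(rows, key_fn, info):
    """Percent of distinct units (personas or stories) needing `info`."""
    units = {key_fn(i, r) for i, r in enumerate(rows)}
    needing = {key_fn(i, r) for i, r in enumerate(rows) if r["info"] == info}
    return 100 * len(needing) / len(units)

# Figure 5a analogue: percent of personas needing the information.
persona_pct = share_by(stories, lambda i, r: r["persona"], "dynamic risk")
# Figure 5b analogue: percent of user stories relying on it.
story_pct = share_by(stories, lambda i, r: i, "dynamic risk")
```

The two denominators differ (8 personas vs. 39 stories in the actual analysis), which is why an information type can rank highest on one panel but not the other.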
Our analysis from the HCD methods that constrained the analysis of the CTM showed that relying solely on HCD would have excluded the development of social media filtering, despite it being the most needed piece of information across user personas and stories (Figure 5). HCD focused on workers and communities involved in wildfire mitigation, response, and recovery, which allowed us to identify some of the greatest root problems in this space and to create more effective solutions, but it did not capture social media filtering, a solution interviewees had not imagined. Because few technologies exist (if any) to filter social media data down to unique content contributions, and because information shared on social media is often widely mistrusted, it was hard for potential users to imagine what such a tool would look like. However, when interviewees were asked what drives their decisions and what information was missing, the systematic analysis from WKID Innovation highlighted the overarching need for information on public perception.
Figure 5. Synthesis of the Change Traceability Matrix (CTM) focusing on: a) percent of user personas (out of 8) needing information directly related to or relying on filtered social media or evacuations from high fire risk areas defined as a function of hazard, exposure, and vulnerability; and b) percent of user stories (out of 39) relying on this information to make a decision.
Despite hazard information being most valuable across decisions, the most common information gaps across personas (Table 1) are in social media filtering and risk futures. In particular, there is a critical need not merely to filter by hashtag, but to use artificial intelligence to identify deep fakes, misinformation, bots, verified accounts, and unique contributions to the conversation during natural disasters. Additionally, risk futures include consideration of how we communicate risk, not just how we have historically calculated it as acres burned.
The PTM showed that the most valuable requirement for any information technology across user personas (Table 2) was interoperable “plug in” technologies that work with decision makers' existing tools. Next was the need for information technologies to be accessible with limited connectivity and limited compute resources. Finally, all information should be intuitive to interpret and should remain consistent when scaled from federal reporting to local decision support for implementation.
With these constraints, we demonstrated analytics of value for determining evacuations from a fire (Figure 7) and for understanding how public perception influences our ability to communicate and keep communities safe (Figure 6). The first evacuation notice appeared on Twitter at 12:57 pm but did not specify which areas were affected by the evacuation order; official guidelines simply specify that if you see flames, evacuate. Shortly afterward, individual replies provided details about which areas were affected and that the entire town of Superior was under evacuation orders. Over the next hour, video shared by those returning to grab their belongings and pets documented conditions and traffic flow into Superior and Boulder, the first areas under mandatory evacuation. Within an hour, traffic was at a standstill for the town of Superior and residents of Boulder. Shortly after 2 pm, the town of Louisville was ordered to evacuate. A steady stream of communications followed as evacuated residents, stuck in traffic, shared updates from their mobile phones, including information about unforeseen events such as traffic light outages and freight trains, also disrupted by the fire, blocking surface streets. By analyzing evacuation potential at walking speed (Figure 7), we see that it can take hours to escape the fire perimeter and surrounding areas. Pedestrians or disabled travelers without access to motorized transport may not be able to cover enough ground to find a safe destination and would rely on others to get them out safely. These would not be concerns if traffic were moving at normal speeds, as residents would have the entire Denver Metro Area at their disposal to find friends, family, hotels, or shelters.
Results from the social media filter juxtaposed with the evacuation maps highlight the role the public plays in bridging information gaps between official reporting and navigation tools built on Geographic Information Systems (GIS), such as Google Maps or Waze, enabling a more efficient evacuation (Figure 6). Tweets from evacuees provide destinations and public perceptions about evacuation messaging that can be used in modeling. Machine learning using observed traffic patterns from cameras, or calculated through real-time navigational tools such as Google Maps and Waze, can be validated against personal tweets advising on the most efficient routes. Furthermore, tweets from official sources can highlight key breakdowns in communication and the lack of cross-entity coordination during the evacuation.
Figure 6. Twitter in a disaster: Information content from social media can provide critical context to understand evacuation patterns during natural disasters, such as the Marshall fire in Boulder County in 2021. Note: TWITTER, TWEET, RETWEET and the Twitter Bird logo are trademarks of Twitter Inc. or its affiliates.
Figure 7. Better understanding evacuations: Utilizing data analytics to advance understanding of bottlenecks during natural disasters such as the Marshall fire in Boulder County in December of 2021. The blue polygon is the Marshall fire perimeter. White lines are municipal boundaries within the evacuation zone. Gray lines represent the transport network. The first panel shows normal traffic conditions, in which a driver can easily escape the fire perimeter within 20 minutes. The second panel shows moderate traffic conditions with an average travel speed of 8 miles per hour. The third panel shows heavy traffic conditions moving at an average of 4 miles per hour, approximately walking speed; at that speed, there are locations from which it would be difficult to escape the fire. These are models of traffic conditions based on the local transport network.
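A back-of-envelope check makes the Figure 7 scenarios concrete: distance coverable is simply speed times time. The "normal" speed of 30 mph below is an assumption for illustration; the caption states only that normal traffic clears the perimeter within 20 minutes.

```python
# Assumed average speeds per traffic scenario (mph); "normal" is a guess,
# the moderate and heavy values come from the Figure 7 caption.
MILES_PER_HOUR = {"normal": 30, "moderate": 8, "heavy": 4}

def miles_covered(scenario: str, minutes: float) -> float:
    """Distance (miles) an evacuee covers in `minutes` under a scenario."""
    return MILES_PER_HOUR[scenario] * minutes / 60

# In 20 minutes, heavy traffic at 4 mph covers about 1.3 miles, not
# enough to clear a fire perimeter several miles across, while normal
# traffic covers roughly 10 miles.
```

This is why, at walking pace, escape can take hours and some locations become effectively trapped.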
The results of our analysis overlaying major road networks with increases in fire hazard, characterized by larger and more frequent fires in the future as a result of climate change [37,60], show that more places in the Southeast, and up into the Midwest and Northeast United States, are likely to experience an increased burden on infrastructure for evacuations (Figure 8).

4. Discussion

Environmental resilience technology offers great promise to shift science from observing environmental change to providing solutions that serve society. An important step for transitioning research using environmental data to commercialization is creating analytics of value. A study investigating 905 sustainable development projects in China and the United States found that big data analytics and artificial intelligence play an important role in the success of both sustainable design and commercialization, which mediates the relationship between emerging capabilities, sustainable growth, and performance [61]. Moreover, it has been shown that cutting-edge artificial intelligence and data fusion can lead to important and robust predictions about our possible future climate and environmental pathways [62,63]. There is a growing call to leverage science to better inform market solutions with a key focus on technology breakthroughs that meet societal and economic needs to address climate change [10,64].
The R2C model demonstrated in this paper reflects the integration of public-private capabilities [49,65], which are key to solving complex environmental challenges. The R2C model integrates the WKID Innovation framework [27] with HCD, to map the information gaps across multiple end user types and identify where the greatest gain would happen in the development of market-driven solutions. WKID Innovation provides an in-depth systematic analysis of the decision space and uses researcher domain expertise about the complexity of the interconnected human-environmental system to identify data-driven solutions. Research on its own, however, does not precipitate sustainable solutions. Sustainable solutions consider a funding model beyond preliminary research or development [12]. Designing and clearly stating the requirements for how research fits into the decision space at the outset ensures an off-ramp from research to application [66], but it does not mitigate risks within sustained operations [67], which can be limited by demand (or market viability). By integrating WKID Innovation with HCD and market research, we are able to leverage the strengths of each framework to mitigate risks to technology adoption, thereby scaling the use-inspired translational research of environmental science domain experts [68] to drive tomorrow’s technologies and solutions.
R2C sits in the research-to-commercialization taxonomy of “contract research and consultancy” and addresses the call to integrate traditional agency knowledge of translational research and development (e.g., WKID Innovation) with resource-based methods motivated by understanding organizational needs such as technology, strategy, and markets (e.g., HCD) [69,70]. By leveraging HCD, R2C does not differ drastically from other models proposed for co-production [71] that rely on iterative feedback from users, but it does provide more top-down complex systems analysis that overcomes some of the limitations of HCD [19,20,21]. Expanding translational ecology [72], which seeks to link ecological knowledge to decision making for use-inspired research [68] and real-world outcomes [73], the R2C model goes beyond information sharing to co-developing not only analytics but also technological solutions that change how people interact with the natural environment, making for a more resilient future. Models of co-production [74,75] and translational ecology [72,73,76] offer potential mechanisms and motivations for environmental sciences to build cross-sector partnerships with end users, but they focus on end-user uptake of information, particularly in the non-profit sector and federal/state/local governments, rather than on the potential for markets to drive solutions. They do, however, acknowledge the importance of building a workforce with skills beyond academic education [76].
There is a long history of research-to-market pathways, particularly in engineering [77], medicine and pharmaceuticals [78], computer science [79], and biology [80], but less so in ecology and environmental sciences. The incredible wealth of environmental data from satellite sensors, social media platforms, government records, and other sources offers remarkable opportunities for market-driven solutions to complex environmental challenges. For example, carbon markets offer a means of offsetting fossil fuel emissions by sequestering carbon [81] or of reparative finance for water security [82]. Some companies are starting to use data to verify and validate credits that support improved natural resource management, but such analytics need to be trustworthy: transparent, consistent, and secure (Table 2). Coproduction between industry and scientific research at academic institutions offers a foundation for developing such trustworthy information.
Co-production that merges academic research with industry application, however, requires navigating differences in culture and institutional practices; through development of this R2C model, we have learned three key lessons and present some best practices for others aiming to apply this approach. First, thought needs to go into how to delineate intellectual property from the start [83] and how a team can co-produce a project together while still maintaining each institution's rights. A key challenge arises when negotiating contracts and maintaining rights around intellectual property; legally, “coproduction” muddies the water around who owns what based on contributions [84,85]. A best practice is to communicate this clearly among all participants, in advance, using multiple modalities such as diagrams, legal contracts, Gantt charts, and documentation of roles and responsibilities.
Second, we recommend establishing mechanisms such as agreements on sharing knowledge and data. Such mechanisms help hold the tension between open science, defined as “transparent and accessible knowledge that is shared and developed through collaborative networks” [86], and proprietary information sold as value-added analytics. This requires consideration of data sharing standards [87] that account for the FAIR (Findable, Accessible, Interoperable, and Reusable) and CARE (Collective Benefit, Authority to Control, Responsibility, and Ethics) data principles [88,89], as well as supporting trustworthy artificial intelligence (AI) [51], which considers: i) the objectivity of data that promotes fairness and mitigates harmful bias; ii) how data are secured and protected against unauthorized access, use, disclosure, disruption, modification, or destruction; iii) how safety is protected so the system does not cause unacceptable risk; iv) how privacy is protected; v) how explainable or transparent an algorithm is; vi) how accurate it is; vii) how reliably it performs as expected; viii) how resilient it is in its adaptive capability; and ix) how accountable it is through a traceable record of events in the decision-making process.
Third, we recommend developing and implementing a communication plan between the institutions that can help overcome variations in meeting and work cultures. A communication plan [87] that includes a management structure with roles and responsibilities, as well as a reporting structure within teams [83], can alleviate confusion and enable work management within teams that cross-pollinates to the other team(s) relying on the outcomes of the partnership. Such a plan also helps alleviate tensions between the faster cadence of deliverables required in industry and the slower turnaround in academia due to the exploratory nature of the work and different incentive structures [90]. Clear technical deliverables, timelines, and inter-team meetings that create “tie-points” between industry and academic workflows allow each team to manage as is culturally appropriate while still collaborating toward the same end purpose. This is similar to findings that recommend taking a “design approach” to cross-sector collaborations, keeping the end product in mind and iterating throughout [91].
With respect to the presented use case for this R2C model, our findings are supported by recent developments in both research and industry, and provide clear guidance on requirements for integration into the decision space. With respect to risk assessments and development of hazard, exposure, and vulnerability, new models are being developed for risk [30,92], hazard [31,38,43,59,93], exposure [44,94], and vulnerability [95,96], and are being integrated into commercial capabilities such as Risk Factor by First Street Foundation and public offerings such as the US Forest Service’s WildfireRisk.org or Fuelcast.net. By combining these risk data analytics with evacuation simulations (Figure 7), there is opportunity for resilience planning with respect to infrastructure. Our case study showed that traffic is a predictable consequence of fire evacuations and that it could be mitigated with both planning and technology. Combining longer-term fire hazard analytics with these simulations could further show how and why traffic develops and allow experimentation with new road configurations that save lives by preventing entrapment as people move away from high natural hazard areas (e.g., Appendix B User Stories 3.6 and 3.7). Finally, with respect to using information gleaned from social media filtering, research advancements show capabilities for geolocating [97,98,99], identifying misinformation and deep fakes [100], and identifying unique contributions [11]. These advancements can greatly enhance more traditional geographic information system (GIS) analytics that use remote sensing and rely on maps.

5. Conclusion

For the scope of this paper, we studied a subset of environmental resilience technology focused on the synthesis and interpretation of data for solutions-oriented science to create value-added analytics that enable society to become more resilient to environmental change. There is a critical need to leverage domain expertise from research in creating value-added analytics that can inform a more resilient future. Data can provide key insights for decision making, but doing so sustainably requires solutions that can fund themselves (i.e., research to commercialization). Methods exist to help design data analytics that provide information to change the decision space (WKID Innovation), and these can work with Human Centered Design to develop commercial solutions.
The alliance between CU Boulder and Deloitte is a model for this type of approach and for the more efficient transfers of knowledge and practice needed between industry and academic partners. Bridging the gap between research and commercialization necessitates academic-private partnerships for serving the public, which come with their own challenges as different sectors operate differently. Key to academia and industry working together to co-produce solutions are: i) clear delineation of intellectual property rights; ii) a communication plan with clearly delineated technical deliverables that helps overcome cultural differences in working styles and reward systems; and iii) early discussion of how to both satisfy open science approaches and protect proprietary information and strategy. This view of environmental resilience technology addresses the critical need for a more resilient future by shifting from observing environmental change to providing solutions that serve society through academic-private partnerships that transition research to commercial applications.

Supplementary Materials

The following supporting information can be downloaded at the website of this paper posted on Preprints.org. Supplemental Material S1: Change Traceability Matrices and Product Traceability Matrix (Table S1).

Author Contributions

Conceptualization, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Christina Zapata, Ethan Doyle, Evan Thomas and Jennifer Balch; Data curation, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Virginia Iglesias, Christina Zapata, Maxwell Cook, Jilmarie Stephens and Ty Tuff; Formal analysis, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Christina Zapata, Michael Byrne, Laurel Cooper and Mario Tapia; Funding acquisition, E. Natasha Stavros, Ethan Doyle, Evan Thomas, SJ Maxted, Rana Sen and Jennifer Balch; Investigation, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Virginia Iglesias, Christina Zapata, Michael Byrne, Maxwell Cook, Jilmarie Stephens, Mario Tapia and Ty Tuff; Methodology, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Christina Zapata, Ethan Doyle and Jennifer Balch; Project administration, E. Natasha Stavros, Caroline Gezon, Evan Thomas and Jennifer Balch; Resources, E. Natasha Stavros, SJ Maxted, Rana Sen and Jennifer Balch; Supervision, E. Natasha Stavros, Caroline Gezon and Jennifer Balch; Validation, Caroline Gezon, Lise St Denis, Christina Zapata, Michael Byrne, Laurel Cooper and Mario Tapia; Visualization, E. Natasha Stavros, Lise St Denis, Jilmarie Stephens and Ty Tuff; Writing – original draft, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Virginia Iglesias, Maxwell Cook, Jilmarie Stephens, Ty Tuff, Evan Thomas and Jennifer Balch; Writing – review & editing, E. Natasha Stavros, Caroline Gezon, Lise St Denis, Virginia Iglesias, Maxwell Cook, Jilmarie Stephens, Ty Tuff, Evan Thomas and Jennifer Balch.

Funding

This research was funded by Deloitte Consulting, LLC., through the Climate Innovation Collaboratory.

Data Availability Statement

To maintain anonymity of interview participants, we provide synthesized data in the form of the Change Traceability Matrix and Product Traceability Matrix as supplemental material to this manuscript.

Acknowledgments

The authors would like to thank Casey Jenson with the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder for help with Figures 4 and 6. Thank you to Daniel Morton with the Renewable and Sustainable Energy Institute (RASEI) at the University of Colorado Boulder, who contributed graphic design for Figure 1. The authors would also like to thank Jessica Helzer for her work helping to negotiate the partnership between Deloitte LLC and the University of Colorado Boulder.

Conflicts of Interest

The funding source (Deloitte Consulting, LLC) for this research participated in the design of the research, but the analysis was led by the University of Colorado Boulder, which does not receive long-term monetary compensation beyond the contract to do the work for this study and develop the analytics identified by it. Lead author E. Natasha Stavros is the founder of WKID Innovation and has an LLC called WKID Solutions providing education resources on the framework.

Appendix A: Interview Questions

PDF of Google Form interview questions (all internal comments removed): https://drive.google.com/file/d/1vYQaocrNOD9QEaw4_Zrcjd2qp6-U6n9A/view?usp=share_link

Appendix B: User Stories

  • Evacuation Route User Stories
    1.1.
    For protecting lives, property, and the environment, a Local Emergency Responder relies on information such as commute hours with respect to weather, time of day, distance, and fire behavior.
    1.2.
    For protecting lives, property, and the environment, a Local Emergency Responder relies on information such as Variable Sheriff/Emergency Management response time to communities in need.
    1.3.
    To effectively communicate with communities (access - e.g. 5G, language, messaging, notifications/alerts, etc), a Local/Regional Resilience Administrator relies on information of transportation networks (who is going where and how - e.g., public transit - on what ingress/egress routes).
  • Dynamic Risk as a function of hazard, exposure and vulnerability User Stories
    2.1.
    For protecting lives, property, and the environment, a Local Emergency Responder relies on information such as dynamic risk by parcel based on fuels, weather, and home inspection information.
    2.2.
    To determine where and when to strategically position resources on the ground at the right location when needed and in response to mutual aid requisitions brokered between the public and local, state, and federal agencies, a State/Regional Emergency Responder relies on information such as dynamic current risk as it relates to anticipated short-term impacts from fire.
    2.3.
    For Disaster Declaration recommendations sent to the President determine how much grant dollars are needed for what kind of assistance and for how long to which communities to build capability for state and local level response based on a cost-benefit analysis, a Regional Recovery Administrator relies on information of fire risk.
    2.4.
    For developing a strategic plan on what mitigation efforts to prioritize based on capability/capacity, infrastructure programs, and social justice that is often vetted with the local community through public engagement exercise and must be approved by city council/commissioner, a Local Land Use/Land Management/ Resilience Planner relies on information of dynamic fire risk as it relates to changing fire hazard (as people cut trees, and structures are built/destroyed as combustible fuels), structural exposure, and structural vulnerability.
    2.5.
    To develop a wildfire strategy with priority high risk areas and methods for reducing wildfire risk (fuels management - mechanical, prescribed fire, etc) decided by rangers in each Forest park and local Resilience Offices/County Commissioners and sometimes regionally (most contentious) often communicated and negotiated with the local communities, State/Regional Resilience Administrators rely on information of community Risk updated quarterly that scales from parcel to regional context (e.g., identify highest risk communities locally and regionally).
    2.6.
    To determine how many staff to hire in support of producing requested analytics by policy makers to assess capacity for meeting legislation mandates, a Resilience Planning Analytics Office relies on information of dynamic risk by parcel (60m pixel) of assets (structures, power lines, habitat, critical infrastructure, watersheds, etc) based on fuels, weather and home inspection information.
    2.7.
    To manage risk/reward trade-offs in natural perils insurance portfolio by deciding whether or not to take on a risk (e.g., wildfire exposure) and what to charge for that risk based on where it sits within company tolerance for loss as it is written: property, finance, insurance, reinsurance, [Re-]insurance companies models assets that they want to insure and send it to the underwriter who will assess the premium that can be charged for the risk and an engineering team may visit the site and assess while offering services like mitigation advice. Underwriting then accepts/rejects risks and may initiate a process with the broker. Models are run daily on existing reinsured portfolios and monthly on the insured portfolios. This relies on information of dynamic risk (to insured losses) as asset (building) exposure and (building) vulnerability to hazard (not just today, but how it's likely to change).
  • Hazard User Stories
    3.1.
    To determine where and when to strategically position resources (contracted or in-house) on the ground at the right location when needed and in response to mutual aid requests brokered between the public and local, state and federal agencies, a state/regional emergency responder relies on information of dynamic "current" fire "risk" (i.e. hazard) as it relates to changes in fuels, topography and weather.
    3.2.
    To determine where and when to set fuel breaks (e.g., prescribe fire, hand crew, dozer, etc) during response to active wildfire or in the “shoulder” season, a State/Regional Emergency Responder relies on information of dynamic "current" fire risk as it relates to evolving hazard of fuel condition (stress/moisture, beetles, etc), type (veg and urban), and accumulation.
    3.3.
    To decide to defend a home or not, a Local/State/Regional Firefighter on the scene relies on information on home building materials.
    3.4.
    To determine how many staff to hire in support of producing requested analytics by policy makers to assess capacity for meeting legislation mandates, a Resilience Planning Analytics Office relies on information of National-scale, rapid, annual updates of vegetation and fuels (updated 3D layers).
    3.5.
    To manage risk/reward trade-offs in natural perils insurance portfolio by deciding whether or not to take on a risk (e.g., wildfire exposure) and what to charge for that risk based on where it sits within company tolerance for loss as it is written: property, finance, insurance, reinsurance, [Re-]Insurance companies models assets that they want to insure and send it to the underwriter who will assess the premium that can be charged for the risk and an engineering team may visit the site and assess while offering services like mitigation advice. Underwriting then accepts/rejects risks and may initiate a process with the broker. Models are run daily on existing reinsured portfolios and monthly on the insured portfolios. This relies on information of dynamic hazards (not just today, but how it's likely to change).
    3.6.
    To develop a strategic plan on what mitigation efforts to prioritize based on capability/capacity, infrastructure programs, and social justice that is often vetted with the local community through public engagement exercise and must be approved by city council/commissioner, a Local Land Manager/Land Use/Resilience Planning Administrator relies on information of dynamic fire hazard (as people cut trees, and structures are built/destroyed as combustible fuels).
    3.7.
    To target communications and prepare communities about risk reduction needs and measures (e.g., evacuation routes and planning as well as home hardening), a Local/Regional/National Resilience Administrator relies on information of building locations.
    3.8.
    To develop a wildfire strategy with priority high risk areas and methods for reducing wildfire risk (fuels management - mechanical, prescribed fire, etc) decided by rangers in each Forest park and local Resilience Offices/County Commissioners and sometimes regionally (most contentious) often communicated and negotiated with the local communities, State/Regional Resilience Administrators rely on information of fuel composition updated quarterly.
  • Vulnerability User Stories
    4.1.
For Disaster Declarations, regional administrators write a recommendation to the President to determine how many grant dollars are needed, for what kind of assistance, for how long, and for which communities, to build capability for state- and local-level response based on a cost-benefit analysis. To do this, a Regional Recovery Administrator relies on information of maps of the built environment (structures).
    4.2.
To build capacity for mitigation through projects that reduce future costs (e.g., debris removal, home hardening, defensible space), State/Regional Recovery Administrators rely on information of projected maps of the built environment (structures).
    4.3.
To determine where to focus sheltering resources for both displaced citizens and responders, State/Regional Recovery Administrators rely on information of social vulnerability.
    4.4.
To develop a strategic plan on what mitigation efforts to prioritize based on capability/capacity, infrastructure programs, and social justice (a plan that is often vetted with the local community through a public engagement exercise and must be approved by the city council/commissioner), Local Land Use/Land Management/Resilient Planners rely on information on building vulnerability (ignitability) based on factors such as low-income housing, retrofitting, materials, etc.
    4.5.
To develop a strategic plan on what mitigation efforts to prioritize based on capability/capacity, infrastructure programs, and social justice (a plan that is often vetted with the local community through a public engagement exercise and must be approved by the city council/commissioner), Local Land Use/Land Management/Resilient Planners rely on information of social equity.
    4.6.
To manage risk/reward trade-offs in a natural perils insurance portfolio, [Re-]insurance companies decide whether or not to take on a risk (e.g., wildfire exposure) and what to charge for that risk based on where it sits within the company's tolerance for loss as it is written across property, finance, insurance, and reinsurance. The company models the assets it wants to insure and sends them to the underwriter, who assesses the premium that can be charged for the risk; an engineering team may visit and assess the site while offering services such as mitigation advice. Underwriting then accepts or rejects the risk and may initiate a process with the broker. Models are run daily on existing reinsured portfolios and monthly on insured portfolios. This relies on information of (building) vulnerability.
    4.7.
To determine whether or not to defend a home, a local emergency response firefighter relies on information such as urban biomass ("green biomass") as a Wildland Urban Interface (WUI) layer.
    4.8.
    To support evacuation planning, a Local Resilience Administrator relies on information of social equity.
  • Exposure User Stories
    5.1.
To build capacity for mitigation through projects that reduce future costs (e.g., debris removal, home hardening, defensible space), a State/Regional Recovery Administrator relies on information of projected maps of the built environment (structures).
    5.2.
To manage risk/reward trade-offs in a natural perils insurance portfolio, [Re-]insurance companies decide whether or not to take on a risk (e.g., wildfire exposure) and what to charge for that risk based on where it sits within the company's tolerance for loss as it is written across property, finance, insurance, and reinsurance. The company models the assets it wants to insure and sends them to the underwriter, who assesses the premium that can be charged for the risk; an engineering team may visit and assess the site while offering services such as mitigation advice. Underwriting then accepts or rejects the risk and may initiate a process with the broker. Models are run daily on existing reinsured portfolios and monthly on insured portfolios. This relies on information of dynamic asset (building) exposure.
    5.3.
To develop a strategic plan on what mitigation efforts to prioritize based on capability/capacity, infrastructure programs, and social justice (a plan that is often vetted with the local community through a public engagement exercise and must be approved by the city council/commissioner), a Local Land Manager/Land Use/Resilience Planning Administrator relies on information of dynamic structural exposure.
    5.4.
To develop a strategic plan on what mitigation efforts to prioritize based on capability/capacity, infrastructure programs, and social justice (a plan that is often vetted with the local community through a public engagement exercise and must be approved by the city council/commissioner), Local Land Use/Land Management/Resilient Planners rely on information of social equity.
    5.5.
To manage risk/reward trade-offs in a natural perils insurance portfolio, [Re-]insurance companies decide whether or not to take on a risk (e.g., wildfire exposure) and what to charge for that risk based on where it sits within the company's tolerance for loss as it is written across property, finance, insurance, and reinsurance. The company models the assets it wants to insure and sends them to the underwriter, who assesses the premium that can be charged for the risk; an engineering team may visit and assess the site while offering services such as mitigation advice. Underwriting then accepts or rejects the risk and may initiate a process with the broker. Models are run daily on existing reinsured portfolios and monthly on insured portfolios. This relies on information of asset locations now and in the future.
  • Social Media Influence User Stories
    6.1.
To influence communication strategy for effective communications with communities (access, e.g., 5G; language; messaging; notifications/alerts; etc.), a Local/State/Regional Resilience Administrator relies on information on the number of likes and impressions of messaging.
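The likes-and-impressions need in this story can be operationalized as a simple engagement metric. The sketch below is illustrative only; the function name, message identifiers, and counts are hypothetical, not drawn from the study:

```python
def engagement_rate(likes: int, impressions: int) -> float:
    """Share of impressions that produced a like: a crude proxy
    for how well a resilience message resonates with a community."""
    if impressions == 0:
        return 0.0
    return likes / impressions

# Hypothetical candidate messages for a resilience campaign.
messages = [
    {"id": "evac-route-map", "likes": 120, "impressions": 4000},
    {"id": "home-hardening-tips", "likes": 95, "impressions": 1900},
]

# The message with the highest engagement rate informs communication strategy.
best = max(messages, key=lambda m: engagement_rate(m["likes"], m["impressions"]))
```

In practice an administrator would weigh reach (impressions) alongside engagement rate, since a resonant message seen by few people may still fail its audience.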
  • Misinformation User Stories
    7.1.
To decide how, when, and what vetted, validated information [on community needs and situational awareness] to disseminate to the public in a timely manner, and where to get the information from, a Local/Regional Public Information Officer needs to identify point sources of misinformation as well as the misinformation itself.
    7.2.
To develop a strategic plan on what mitigation efforts to prioritize based on capability/capacity, infrastructure programs, and social justice (a plan that is often vetted with the local community through a public engagement exercise and must be approved by the city council/commissioner), a Local/Regional Resilience Administrator relies on information of public perception of risk and mitigation efforts with filtered misinformation.
  • Filtered Communications User Stories
    8.1.
To prioritize 9-1-1 emergency response dispatch by consolidating resource requests (matching the right number of resources to needs, rather than dispatching resources per caller when multiple callers report the same event) to the right local agency, a Local Emergency Responder relies on situational information (weapons, threats, etc.).
    8.2.
To decide when people can return based on hazards and access to utilities (water and power), a Regional Recovery Administrator relies on information on who is and is not evacuating in real time.
    8.3.
To decide where to focus sheltering resources for both displaced citizens and responders, a Regional Recovery Administrator relies on information on who needs resources (filtered from social media).
    8.4.
To decide whether the resources spent helping on the ground are less than what the company would receive consulting on recovery, a Regional Recovery consulting company relies on validated, geolocated information from reliable sources on damages (e.g., downed power lines).
    8.5.
    To decide what agency information to share publicly based on what the public needs to know to reduce the number of duplicate calls on the same incident, a Local Public Information Officer relies on information of evacuations (plans and crowdsourced feedback on available/limited resources and access).
    8.6.
To decide how, when, and what vetted, validated information [on community needs and situational awareness] to disseminate to the public in a timely manner, and where to get the information from, a Regional Public Information Officer will use information to identify the mavens (local media influencers).
    8.7.
To build public-facing relationships around a cohesive, collaborative strategy across political boundaries for incident response, Incident Command approves staff of the National Incident Management Organization (NIMO), part of the USFS, to use validated information on the local event with images, locations, timestamps, and information on who captured them.
    8.8.
    To determine how and when to pay out on a claim and how to reorganize capital to handle catastrophic events, a [Re-]insurance company in National/International resilience planning relies on information of claim validation in the form of geolocation and photos.
  • Public Perception User Stories
    9.1.
The decision to protect lives, property, and the environment through response, prevention, and education is made locally across departments; Local Emergency Responders, coordinated across jurisdictional boundaries ("mutual aid") by the State (e.g., CAL FIRE) with federal resources allocated by Geographic Area Coordination Centers (GACCs), rely on information of community perceptions of risk based on fire history and awareness.
    9.2.
The decision to protect lives, property, and the environment through response, prevention, and education is made locally across departments; Local Emergency Responders, coordinated across jurisdictional boundaries ("mutual aid") by the State (e.g., CAL FIRE) with federal resources allocated by Geographic Area Coordination Centers (GACCs), rely on information of public perception with respect to rumor control of misinformation and the ability to turn information into intelligence.
    9.3.
To develop resilience plans coordinated with each community locally, based on watersheds, for planning evacuation routes, infrastructure improvements, where to do fuel hazard reductions, and which homes to defend during active response, Local Emergency Responders rely on information of public perception with respect to rumor control of misinformation about resilience measures (e.g., prescribed fire).
    9.4.
To prioritize 9-1-1 emergency response dispatch by consolidating resource requests (matching the right number of resources to needs, rather than dispatching resources per caller when multiple callers report the same event) to the right local agency, Local Emergency Responders rely on information of public perception of the event.
    9.5.
To set strategic priorities for how to use limited staff successfully and where to prioritize investments to reduce risks (e.g., construction tailored to threats) in preparation for the upcoming wildfire season, grant managers in State/Regional Emergency Management rely on information of public perception about prioritization of protecting assets based on variable value systems.
    9.6.
To decide where and when to strategically position resources (contracted or in-house) on the ground, at the right location when needed and in response to mutual aid requests brokered between the public and local, state, and federal agencies, State and Regional Emergency Managers rely on information of public perception about prioritization of protecting assets based on variable value systems (e.g., timber vs. homes).
    9.7.
    To decide whether to defend a home or not, a local/state/regional firefighter relies on information of public perception about prioritization of protecting assets based on variable value systems (e.g., timber vs homes).
    9.8.
To decide where, when, and what kind of fuel breaks to use (prescribed fire, hand crews, dozers, goats, etc.) during response to an active wildfire or in the "shoulder season", local/state/regional firefighters rely on information of public perception of fuel treatments and geotagged photos of what is happening; written text needs to be verified in real time (trusted vs. not trusted).
    9.9.
To decide when to alert and warn people of risk and how to educate the public to take mitigation action, a Local Resilience Administrator relies on information of public perception of events in real time as they happen, information that is reliable, from trusted sources, and made accurate with photos and geotagging.
    9.10.
To develop a strategic plan on what mitigation efforts to prioritize based on capability/capacity, infrastructure programs, and social justice (a plan that is often vetted with the local community through a public engagement exercise and must be approved by the city council/commissioner), a Local/Regional Resilience Administrator relies on information of public perception of risk and mitigation efforts with filtered misinformation.
    9.11.
    To decide how to transition from strategic planning to implementation based on priorities of the local community identified by and ranked by the city Chief Resilience Officer, a Local/Regional Resilience Administrator relies on information of public perception of risk and mitigation efforts with filtered misinformation.
    9.12.
To communicate with and prepare communities about risk reduction needs and measures (e.g., evacuation routes and planning, as well as home hardening), a Local/Regional Resilience Administrator relies on information of public perception and understanding of fire risk and preparedness, as well as public sentiment, to determine messaging to communities about fire expectations.
    9.13.
To influence communication strategy for effective communications with communities (access, e.g., 5G; language; messaging; notifications/alerts; etc.), a Local/State/Regional Resilience Administrator relies on information of public sentiment to determine buy-in on which assets are being protected.
    9.14.
To decide how, when, and what vetted, validated information [on community needs and situational awareness] to disseminate to the public in a timely manner, and where to get the information from, within the constraints and scope directed by an Incident Commander, a Local/Regional Public Information Officer relies on information of public sentiment about the event.
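Each user story above follows a common shape: a persona, a decision, and the information the decision relies on. Encoding the stories as structured records is one way to surface common gaps systematically, since an information need shared by many personas is a candidate for a value-added analytics product. The schema and entries below are an illustrative sketch, not the study's actual data model:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class UserStory:
    persona: str       # who makes the decision
    decision: str      # what they are trying to decide
    information: str   # the information the decision relies on

# Illustrative subset of the 39 stories evaluated in the case study.
stories = [
    UserStory("State/Regional Recovery Administrator",
              "where to focus sheltering resources", "social vulnerability"),
    UserStory("Local Resilience Administrator",
              "evacuation planning", "social equity"),
    UserStory("Local Land Use/Land Management/Resilient Planner",
              "what mitigation efforts to prioritize", "social equity"),
    UserStory("[Re-]insurance company",
              "whether to take on a risk and what to charge", "dynamic hazard"),
]

# Counting shared information needs across personas surfaces common gaps:
# needs appearing in many stories are strong candidates for one shared product.
needs = Counter(story.information for story in stories)
```

On the full set of 39 stories, ranking `needs.most_common()` would highlight which information products serve the most personas at once.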

References

  1. IOM World Migration Report 2015 - Migrants and Cities: New Partnerships to Manage Mobility; 2015; Vol. 8; ISBN 978-92-9068-709-2.
  2. IPCC Climate Change and Land: IPCC Special Report on Climate Change, Desertification, Land Degradation, Sustainable Land Management, Food Security, and Greenhouse Gas Fluxes in Terrestrial Ecosystems; 1st ed.; Cambridge University Press, 2022; ISBN 978-1-00-915798-8.
  3. Barnosky, A.D.; Matzke, N.; Tomiya, S.; Wogan, G.O.U.; Swartz, B.; Quental, T.B.; Marshall, C.; McGuire, J.L.; Lindsey, E.L.; Maguire, K.C.; et al. Has the Earth’s Sixth Mass Extinction Already Arrived? Nature 2011, 471, 51–57. [Google Scholar] [CrossRef]
  4. Climate Change 2021: The Physical Science Basis. Contribution of Working Group I to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change; Masson-Delmotte, V.; Zhai, P.; Pirani, A.; Connors, S.L.; Péan, C.; Berger, S.; Caud, N.; Chen, Y.; Goldfarb, L.; Gomis, M.I.; Huang, M.; Leitzell, K.; Lonnoy, E.; Matthews, J.B.R.; Maycock, T.K.; Waterfield, T.; Yelekçi, Ö.; Yu, R.; Zhou, B. (Eds.) Cambridge University Press: Cambridge, United Kingdom and New York, NY, USA, 2021. [Google Scholar]
  5. Khan, I.; Hou, F.; Le, H.P. The Impact of Natural Resources, Energy Consumption, and Population Growth on Environmental Quality: Fresh Evidence from the United States of America. Sci. Total Environ. 2021, 754, 142222. [Google Scholar] [CrossRef]
  6. Van Meerbeek, K.; Jucker, T.; Svenning, J.-C. Unifying the Concepts of Stability and Resilience in Ecology. J. Ecol. 2021, 109, 3114–3132. [Google Scholar] [CrossRef]
  7. Bohrer, G.; Cavender-Bares, J.; Chaplin-Kramer, R.; Chavez, F.; Dietze, M.; Fatoyinbo, T.; Gaddis, K.; Geller, G.; Guralnick, R.; Hestir, E.; et al. NASA Biological Diversity and Ecological Forecasting: Current State of Knowledge and Considerations for the Next Decade; NASA, 2022.
  8. IPBES Summary for Policymakers of the Global Assessment Report on Biodiversity and Ecosystem Services; Zenodo, 2019.
  9. IPCC Summary for Policymakers. In Climate Change 2022: Impacts, Adaptation and Vulnerability. Contribution of Working Group II to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change; Pörtner, H.-O., Roberts, D.C., Tignor, M.M.B., Poloczanska, E.S., Mintenbeck, K., Alegría, A., Craig, M., Langsdorf, S., Löschke, S., Möller, V., Okem, A., Rama, B., Eds.; Cambridge University Press, 2022.
  10. NRC Thriving on Our Changing Planet: A Decadal Strategy for Earth Observation from Space: An Overview for Decision Makers and the Public; National Academies Press: Washington, D.C., 2019; p. 25437; ISBN 978-0-309-49241-6.
  11. Diaz, J.; Denis, L.S. Classifying Twitter Users for Disaster Response: A Highly Multimodal or Simple Approach? 2020, 16.
  12. Tabor, K. Achieving Multiple Conservation Goals with Satellite-Based Monitoring and Alert Systems. Doctoral Dissertation, University of Maryland: Baltimore County, 2023.
  13. Ingram, P.; Choi, Y.; Martin, R.L.; Reeves, M.; Gulati, R. Harvard Business Review. 2022.
  14. Sharp, H.; Preece, J.; Rogers, Y. Interaction Design: Beyond Human-Computer Interaction, 5th edition.; Wiley: Indianapolis, IN, 2019; ISBN 978-1-119-54725-9. [Google Scholar]
  15. Norman, D. The Design Of Everyday Things; Revised edition.; Basic Books: New York, New York, 2013; ISBN 978-0-465-05065-9. [Google Scholar]
  16. Koberg, D.; Bagnall, J. The Universal Traveler: A Soft-Systems Guide to: Creativity, Problem-Solving, and the Process of Reaching Goals; Revised edition.; W. Kaufmann: Los Altos, Calif, 1974; ISBN 978-0-913232-05-7. [Google Scholar]
  17. Nelson, G. How to See: A Guide to Reading Our Manmade Environment; Little, Brown and Company, 1979; ISBN 978-0-316-60312-6.
  18. McKim, R.H. Experiences in Visual Thinking, 2nd Edition; 2nd edition.; Cengage Learning: Monterey, Calif, 1980; ISBN 978-0-8185-0411-2. [Google Scholar]
  19. Thomas, V.; Remy, C.; Bates, O. The Limits of HCD: Reimagining the Anthropocentricity of ISO 9241-210. In Proceedings of the Proceedings of the 2017 Workshop on Computing Within Limits; Association for Computing Machinery: New York, NY, USA, June 22 2017; pp. 85–92.
  20. Velsen, L. van; Ludden, G.; Grünloh, C. The Limitations of User-and Human-Centered Design in an EHealth Context and How to Move Beyond Them. J. Med. Internet Res. 2022, 24, e37341. [Google Scholar] [CrossRef] [PubMed]
  21. Tanweer, A.; Aragon, C.R.; Muller, M.; Guha, S.; Passi, S.; Neff, G.; Kogan, M. Interrogating Human-Centered Data Science: Taking Stock of Opportunities and Limitations. In Proceedings of the Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, April 28 2022; pp. 1–6.
  22. Vlaskovits, P. Harvard Business Review. August 29 2011.
  23. Rittel, H.W.J.; Webber, M.M. Dilemmas in a General Theory of Planning. Policy Sci. 1973, 4, 155–169. [Google Scholar] [CrossRef]
  24. Kennedy, M.C.; Ford, E.D.; Singleton, P.; Finney, M.; Agee, J.K. Informed Multi-Objective Decision-Making in Environmental Management Using Pareto Optimality. J. Appl. Ecol. 2008, 45, 181–192. [Google Scholar] [CrossRef]
  25. Ackoff, R. From Data to Wisdom. Journal of Applied Systems Analysis 1989, 16, 3–9. [Google Scholar]
  26. Aguilar, F.J. Scanning the Business Environment; Macmillan Publishers Limited, 1967.
  27. Stavros, E.N. Wicked Problems Need WKID Innovation. Res.-Technol. Manag. 2022, 65, 39–47. [CrossRef]
  28. NASA NASA Systems Engineering Handbook. NASA SP-2016-6105 2007, 297.
  29. Taplin, D.H.; Clark, H.; Collins, E.; Colby, D.C. Theory of Change; ActKnowledge: New York, NY, 2003. [Google Scholar]
  30. Iglesias, V.; Stavros, N.; Balch, J.K.; Barrett, K.; Cobian-Iñiguez, J.; Hester, C.; Kolden, C.A.; Leyk, S.; Nagy, R.C.; Reid, C.E.; et al. Fires That Matter: Reconceptualizing Fire Risk to Include Interactions between Humans and the Natural Environment. Environ. Res. Lett. 2022, 17, 045014. [Google Scholar] [CrossRef]
  31. Stavros, E.N.; Abatzoglou, J.; Larkin, N.K.; McKenzie, D.; Steel, E.A. Climate and Very Large Wildland Fires in the Contiguous Western USA. Int. J. Wildland Fire 2014, 23, 899. [Google Scholar] [CrossRef]
  32. Jerrett, M.; Jina, A.S.; Marlier, M.E. Up in Smoke: California’s Greenhouse Gas Reductions Could Be Wiped out by 2020 Wildfires. Environ. Pollut. 2022, 310, 119888. [Google Scholar] [CrossRef] [PubMed]
  33. Balch, J.K.; Bradley, B.A.; Abatzoglou, J.T.; Nagy, R.C.; Fusco, E.J.; Mahood, A.L. Human-Started Wildfires Expand the Fire Niche across the United States. Proc. Natl. Acad. Sci. 2017, 114, 2946–2951. [Google Scholar] [CrossRef]
  34. Mietkiewicz, N.; Balch, J.K.; Schoennagel, T.; Leyk, S.; St. Denis, L.A.; Bradley, B.A. In the Line of Fire: Consequences of Human-Ignited Wildfires to Homes in the U.S. (1992–2015). Fire 2020, 3, 50. [CrossRef]
  35. Moritz, M.A.; Morais, M.E.; Summerell, L.A.; Carlson, J.M.; Doyle, J. Wildfires, Complexity, and Highly Optimized Tolerance. Proc. Natl. Acad. Sci. 2005, 102, 17912–17917. [Google Scholar] [CrossRef]
  36. Jolly, W.M.; Cochrane, M.A.; Freeborn, P.H.; Holden, Z.A.; Brown, T.J.; Williamson, G.J.; Bowman, D.M.J.S. Climate-Induced Variations in Global Wildfire Danger from 1979 to 2013. Nat. Commun. 2015, 6, 7537. [Google Scholar] [CrossRef] [PubMed]
  37. Stavros, E.N.; Abatzoglou, J.T.; McKenzie, D.; Larkin, N.K. Regional Projections of the Likelihood of Very Large Wildland Fires under a Changing Climate in the Contiguous Western United States. Clim. Change 2014, 126, 455–468. [Google Scholar] [CrossRef]
  38. Barbero, R.; Abatzoglou, J.T.; Larkin, N.K.; Kolden, C.A.; Stocks, B. Climate Change Presents Increased Potential for Very Large Fires in the Contiguous United States. Int. J. Wildland Fire 2015, 24, 892. [Google Scholar] [CrossRef]
  39. Iglesias, V.; Balch, J.K.; Travis, W.R. U.S. Fires Became Larger, More Frequent, and More Widespread in the 2000s. Sci. Adv. 2022, 8, eabc0020. [CrossRef]
  40. Abatzoglou, J.T.; Battisti, D.S.; Williams, A.P.; Hansen, W.D.; Harvey, B.J.; Kolden, C.A. Projected Increases in Western US Forest Fire despite Growing Fuel Constraints. Commun. Earth Environ. 2021, 2, 1–8. [Google Scholar] [CrossRef]
  41. Kolden, C.A. We’re Not Doing Enough Prescribed Fire in the Western United States to Mitigate Wildfire Risk. Fire 2019, 2, 30. [Google Scholar] [CrossRef]
  42. Coen, J.L.; Stavros, E.N.; Fites-Kaufman, J.A. Deconstructing the King Megafire. Ecol. Appl. 2018, 28, 1565–1580. [Google Scholar] [CrossRef]
  43. Pascolini-Campbell, M.; Lee, C.; Stavros, N.; Fisher, J.B. ECOSTRESS Reveals Pre-Fire Vegetation Controls on Burn Severity for Southern California Wildfires of 2020. Glob. Ecol. Biogeogr. 2022, n/a. [CrossRef]
  44. Iglesias, V.; Braswell, A.E.; Rossi, M.W.; Joseph, M.B.; McShane, C.; Cattau, M.; Koontz, M.J.; McGlinchy, J.; Nagy, R.C.; Balch, J.; et al. Risky Development: Increasing Exposure to Natural Hazards in the United States. Earths Future 2021, n/a, e2020EF001795. [CrossRef]
  45. Higuera, P.E.; Cook, M.C.; Balch, J.K.; Stavros, E.N.; Mahood, A.L.; St. Denis, L.A. Shifting Social-Ecological Fire Regimes Explain Increasing Structure Loss from Western Wildfires. PNAS Nexus 2023, 2, pgad005. [CrossRef]
  46. Vilà, M.; Ibáñez, I. Plant Invasions in the Landscape. Landsc. Ecol. 2011, 26, 461–472. [Google Scholar] [CrossRef]
  47. Mosher, E.S.; Silander, J.A.; Latimer, A.M. The Role of Land-Use History in Major Invasions by Woody Plant Species in the Northeastern North American Landscape. Biol. Invasions 2009, 11, 2317. [Google Scholar] [CrossRef]
  48. Fusco, E.J.; Finn, J.T.; Balch, J.K.; Nagy, R.C.; Bradley, B.A. Invasive Grasses Increase Fire Occurrence and Frequency across US Ecoregions. Proc. Natl. Acad. Sci. 2019, 116, 23594–23599. [Google Scholar] [CrossRef] [PubMed]
  49. Stavros, E.N.; Iglesias, V.; Decastro, A. The Wicked Wildfire Problem and Solution Space for Detecting and Tracking the Fires That Matter. Available online: http://www.essoar.org/doi/10.1002/essoar.10506888.1 (accessed on 7 June 2021).
  50. Nagji, B.; Tuff, G. Harvard Business Review. May 1 2012.
  51. NIST Trustworthy and Responsible AI. NIST 2022.
  52. Fovell, R.G.; Brewer, M.J.; Garmong, R.J. The December 2021 Marshall Fire: Predictability and Gust Forecasts from Operational Models. Atmosphere 2022, 13, 765. [Google Scholar] [CrossRef]
  53. Twitter Inc. Twitter API V2 2023.
  54. Denis, L.A.S.; Hughes, A.L. ‘What I Need to Know Is What I Don’t Know!’: Filtering Disaster Twitter Data for Information from Local Individuals. In Proceedings of the Social Media for Disaster Response and Resilience Proceedings, Hughes, A.L., McNeill, F., Zobel, C., Eds.; Blacksburg, VA, USA; 2020. [Google Scholar]
  55. St Denis, L. Social Media Content Filtering For Emergency Management.
  56. Pereira, R.H.M.; Saraiva, M.; Herszenhut, D.; Braga, C.K.V.; Conway, M.W. R5r: Rapid Realistic Routing on Multimodal Transport Networks with R5 in R. Findings 2021. [Google Scholar] [CrossRef]
  57. NIFC NIFC Open Data Site: Federal Interagency Wildland Fire Maps and Data for All 2018.
  58. Zillow Inc. ZTRAX: Zillow Transaction and Assessment Dataset 2016.
  59. Joseph, M.B.; Rossi, M.W.; Mietkiewicz, N.P.; Mahood, A.L.; Cattau, M.E.; St. Denis, L.A.; Nagy, R.C.; Iglesias, V.; Abatzoglou, J.T.; Balch, J.K. Spatiotemporal Prediction of Wildfire Size Extremes with Bayesian Finite Sample Maxima. Ecol. Appl. 2019, 29. [Google Scholar] [CrossRef]
  60. Stephens, J.J.; Joseph, M.B.; Iglesias, V.; Tuff, T.; Mahood, A.; Rangwala, I.; Wolken, J.; Balch, J.K. Fires of Unusual Size: Future of Extreme and Novel Wildfire in a Warming United States (2020-2060). 2023. Manuscript submitted for publication.
  61. Zhang, H.; Song, M.; He, H. Achieving the Success of Sustainability Development Projects through Big Data Analytics and Artificial Intelligence Capability. Sustainability 2020, 12, 949. [Google Scholar] [CrossRef]
  62. Harfouche, A.L.; Jacobson, D.A.; Kainer, D.; Romero, J.C.; Harfouche, A.H.; Scarascia Mugnozza, G.; Moshelion, M.; Tuskan, G.A.; Keurentjes, J.J.B.; Altman, A. Accelerating Climate Resilient Plant Breeding by Applying Next-Generation Artificial Intelligence. Trends Biotechnol. 2019, 37, 1217–1235. [Google Scholar] [CrossRef]
  63. Jung, J.; Maeda, M.; Chang, A.; Bhandari, M.; Ashapure, A.; Landivar-Bowles, J. The Potential of Remote Sensing and Artificial Intelligence as Tools to Improve the Resilience of Agriculture Production Systems. Curr. Opin. Biotechnol. 2021, 70, 15–22. [Google Scholar] [CrossRef]
  64. NSF Directorate for Technology, Innovation and Partnerships (TIP) Resources and Contracts. Available online: https://beta.nsf.gov/tip/resources (accessed on 29 March 2023).
  65. Ramirez, J. International Association of Wildland Fire. 2021.
  66. NRC Accelerating Technology Transition: Bridging the Valley of Death for Materials and Processes in Defense Systems; National Academies Press: Washington, D.C., 2004.
  67. NASA The Application Readiness Level Metric; NASA Applied Sciences, 2019.
  68. Hallett, L.M.; Morelli, T.L.; Gerber, L.R.; Moritz, M.A.; Schwartz, M.W.; Stephenson, N.L.; Tank, J.L.; Williamson, M.A.; Woodhouse, C.A. Navigating Translational Ecology: Creating Opportunities for Scientist Participation. Front. Ecol. Environ. 2017, 15, 578–586. [Google Scholar] [CrossRef]
  69. Markman, G.D.; Siegel, D.S.; Wright, M. Research and Technology Commercialization. J. Manag. Stud. 2008, 45, 1401–1423. [Google Scholar] [CrossRef]
  70. Polk, M. Transdisciplinary Co-Production: Designing and Testing a Transdisciplinary Research Framework for Societal Problem Solving. Futures 2015, 65, 110–122. [Google Scholar] [CrossRef]
  71. Hawkins, J.; Madden, K.; Fletcher, A.; Midgley, L.; Grant, A.; Cox, G.; Moore, L.; Campbell, R.; Murphy, S.; Bonell, C.; et al. Development of a Framework for the Co-Production and Prototyping of Public Health Interventions. BMC Public Health 2017, 17, 689. [Google Scholar] [CrossRef] [PubMed]
  72. Schlesinger, W.H. Translational Ecology. Science 2010, 329, 609–609. [Google Scholar] [CrossRef]
  73. Enquist, C.A.; Jackson, S.T.; Garfin, G.M.; Davis, F.W.; Gerber, L.R.; Littell, J.A.; Tank, J.L.; Terando, A.J.; Wall, T.U.; Halpern, B.; et al. Foundations of Translational Ecology. Front. Ecol. Environ. 2017, 15, 541–550. [Google Scholar] [CrossRef]
  74. Bovaird, T. Beyond Engagement and Participation: User and Community Coproduction of Public Services. Public Adm. Rev. 2007, 67, 846–860. [Google Scholar] [CrossRef]
  75. Brandsen, T.; Honingh, M. Distinguishing Different Types of Coproduction: A Conceptual Analysis Based on the Classical Definitions. Public Adm. Rev. 2016, 76, 427–435. [Google Scholar] [CrossRef]
  76. Schwartz, M.W.; Hiers, J.K.; Davis, F.W.; Garfin, G.M.; Jackson, S.T.; Terando, A.J.; Woodhouse, C.A.; Morelli, T.L.; Williamson, M.A.; Brunson, M.W. Developing a Translational Ecology Workforce. Front. Ecol. Environ. 2017, 15, 587–596. [Google Scholar] [CrossRef]
  77. Gwyn, C.W.; Silverman, P.J. EUVL: Transition from Research to Commercialization. In Proceedings of the Photomask and Next-Generation Lithography Mask Technology X, SPIE, August 28 2003; Vol. 5130; pp. 990–1004. [Google Scholar]
  78. Wouters, O.J.; McKee, M.; Luyten, J. Estimated Research and Development Investment Needed to Bring a New Medicine to Market, 2009-2018. JAMA 2020, 323, 844–853. [Google Scholar] [CrossRef]
  79. Buyya, R. Market-Oriented Cloud Computing: Vision, Hype, and Reality of Delivering Computing as the 5th Utility. In Proceedings of the 2009 Fourth ChinaGrid Annual Conference; August 2009; pp. xii–xv.
  80. Stoeklé, H.-C.; Mamzer-Bruneel, M.-F.; Vogt, G.; Hervé, C. 23andMe: A New Two-Sided Data-Banking Market Model. BMC Med. Ethics 2016, 17, 19.
  81. Duchelle, A.E.; Simonet, G.; Sunderlin, W.D.; Wunder, S. What Is REDD+ Achieving on the Ground? Curr. Opin. Environ. Sustain. 2018, 32, 134–140.
  82. Thomas, E.; Ntazinda, J.; Kathuni, S. Applying Climate Reparative Finance toward Water Security. Sci. Total Environ. 2023, 875, 162506.
  83. Bryson, J.M.; Crosby, B.C.; Stone, M.M. The Design and Implementation of Cross-Sector Collaborations: Propositions from the Literature. Public Adm. Rev. 2006, 66, 44–55.
  84. Florin, U.; Lindhult, E. Norms and Ethics: Prerequisites for Excellence in Co-Production. 2015.
  85. Page, K. Ethics and the Co-Production of Knowledge. 2022.
  86. Vicente-Saez, R.; Martinez-Fuentes, C. Open Science Now: A Systematic Literature Review for an Integrated Definition. J. Bus. Res. 2018, 88, 428–436.
  87. Mikhaylov, S.J.; Esteve, M.; Campion, A. Artificial Intelligence for the Public Sector: Opportunities and Challenges of Cross-Sector Collaboration. Philos. Trans. R. Soc. Math. Phys. Eng. Sci. 2018, 376, 20170357.
  88. Wilkinson, M.D.; Dumontier, M.; Aalbersberg, Ij.J.; Appleton, G.; Axton, M.; Baak, A.; Blomberg, N.; Boiten, J.-W.; da Silva Santos, L.B.; Bourne, P.E.; et al. The FAIR Guiding Principles for Scientific Data Management and Stewardship. Sci. Data 2016, 3, 160018.
  89. Carroll, S.R.; Garba, I.; Figueroa-Rodriguez, O.L.; Halbrook, J.; Raseroka, K.; Rodriguez-Lonebear, D.; Rowe, R.; Rodrigo, S.; Walker, J.D.; Anderson, J.; et al. The CARE Principles for Indigenous Data Governance. Available online: https://datascience.codata.org/articles/10.5334/dsj-2020-043/ (accessed on 20 April 2021).
  90. Rasmussen, E.; Moen, Ø.; Gulbrandsen, M. Initiatives to Promote Commercialization of University Knowledge. Technovation 2006, 26, 518–533.
  91. Bryson, J.M.; Crosby, B.C.; Stone, M.M. Designing and Implementing Cross-Sector Collaborations: Needed and Challenging. Public Adm. Rev. 2015, 75, 647–663.
  92. Mahmoud, H.; Chulahwat, A. Assessing Wildland–Urban Interface Fire Risk. R. Soc. Open Sci. 2020, 7, 201183.
  93. Farahmand, A.; Stavros, E.N.; Reager, J.T.; Behrangi, A. Introducing Spatially Distributed Fire Danger from Earth Observations (FDEO) Using Satellite-Based Data in the Contiguous United States. Remote Sens. 2020, 12, 1252.
  94. Mann, M.L.; Berck, P.; Moritz, M.A.; Batllori, E.; Baldwin, J.G.; Gately, C.K.; Cameron, D.R. Modeling Residential Development in California from 2000 to 2050: Integrating Wildfire Risk, Wildland and Agricultural Encroachment. Land Use Policy 2014, 41, 438–452.
  95. Quarles, S.L.; Pohl, K. Building a Wildfire-Resistant Home: Codes and Costs; Headwaters Economics, 2018.
  96. Chulahwat, A.; Mahmoud, H.; Monedero, S.; Diez Vizcaíno, F.J.; Ramirez, J.; Buckley, D.; Forradellas, A.C. Integrated Graph Measures Reveal Survival Likelihood for Buildings in Wildfire Events. Sci. Rep. 2022, 12, 15954.
  97. Ribeiro, S.S., Jr.; et al. Traffic Observatory: A System to Detect and Locate Traffic Events and Conditions Using Twitter. In Proceedings of the 5th ACM SIGSPATIAL International Workshop on Location-Based Social Networks, 2012.
  98. Kotzias, D.; Lappas, T.; Gunopulos, D. Home Is Where Your Friends Are: Utilizing the Social Graph to Locate Twitter Users in a City. Inf. Syst. 2016, 57, 77–87.
  99. Cheng, Z.; Caverlee, J.; Lee, K. You Are Where You Tweet: A Content-Based Approach to Geo-Locating Twitter Users. In Proceedings of the 19th ACM International Conference on Information and Knowledge Management; Association for Computing Machinery: New York, NY, USA, 2010; pp. 759–768.
  100. Islam, M.R.; Liu, S.; Wang, X.; Xu, G. Deep Learning for Misinformation Detection on Online Social Networks: A Survey and New Perspectives. Soc. Netw. Anal. Min. 2020, 10, 82.
Figure 4. A cartoon showing how information from interviews is organized and traced into the Change Traceability Matrix (CTM).
Figure 8. Climate-driven changes in future fire: increases in fires per year and burned area, shown as percent change between 1990–2019 and 2020–2060, overlaid with current major USA road networks to identify where roads lie in the context of increasing fire hazard.
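The percent-change mapping described in the Figure 8 caption reduces to a simple calculation per grid cell. A minimal sketch, using hypothetical fire-frequency values rather than the study's actual data:

```python
def percent_change(baseline: float, future: float) -> float:
    """Percent change from a baseline period (e.g., 1990-2019)
    to a future period (e.g., 2020-2060)."""
    return 100.0 * (future - baseline) / baseline

# Hypothetical example: 40 fires/yr in the baseline period vs.
# 52 fires/yr projected for the future period.
print(percent_change(40.0, 52.0))  # → 30.0
```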
Table 1. Information gaps identified across user personas and a summary definition listed as a requirement for providing that information.
| Type of Limitation or Opportunity | # of User Personas | Requirement Consolidated Across Personas |
|---|---|---|
| Social Media | 5 | Social media information shall include filters for "deep fakes", misinformation, bots, verified accounts, etc. |
| Risk Futures | 5 | Risk futures shall project risk, defined by how it is messaged rather than just acres burned, under different management scenarios to link the cost of management to its risk-mitigation benefit. |
| Risk General | 3 | Risk information shall provide uncertainty for each layer: hazard, exposure, vulnerability. |
| | | Risk information should consider scalability beyond the data limitations of the United States. |
| | | Risk information shall include more than simple maps of the Wildland-Urban Interface. |
| Hazard | 2 | Hazard information shall provide fuel maps that are updated frequently as fuels change. |
| Vulnerability | 2 | Vulnerability information shall include building ignition potential today and into the future. |
| Incident Reporting | 2 | Incident information shall populate automatically based on curated data from different data sources. |
| Exposure | 1 | Exposure information shall include building locations today and likely locations into the future. |
| General | 1 | Information technologies shall focus on proactive solutions rather than only reactive solutions (i.e., suppression). |
| | 1 | Information of value shall include metadata. |
| | 1 | Impact information shall link building damage to insurance policies. |
| | 1 | Information of value shall be verified with local knowledge. |
| | 1 | Information of value shall provide the granularity needed to inform decisions. |
| | 1 | Information technologies shall enable analytics (e.g., trend analyses). |
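The social-media requirement in Table 1 (filtering by misinformation, bots, and account verification) can be illustrated with a minimal sketch. The post fields (`bot_score`, `misinfo_score`, `verified`) and the 0.5 thresholds are hypothetical stand-ins for outputs of upstream classifiers, not part of the study:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    verified: bool        # account verification status
    bot_score: float      # 0-1, from a hypothetical bot-detection model
    misinfo_score: float  # 0-1, from a hypothetical misinformation classifier

def filter_posts(posts, max_bot=0.5, max_misinfo=0.5, require_verified=False):
    """Drop likely bots and likely misinformation; optionally keep
    only verified accounts (the filters named in Table 1)."""
    return [p for p in posts
            if p.bot_score <= max_bot
            and p.misinfo_score <= max_misinfo
            and (p.verified or not require_verified)]

posts = [
    Post("Smoke visible near Hwy 36", True, 0.1, 0.05),
    Post("Fire was started by lasers", False, 0.2, 0.95),  # likely misinformation
    Post("Evacuate now!!!", False, 0.9, 0.30),             # likely bot
]
kept = filter_posts(posts)
print(len(kept))  # → 1
```

In a deployed system each score would come from a dedicated model (e.g., the misinformation-detection approaches surveyed in ref. 100), with the thresholds tuned per use case.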
Table 2. The Product Traceability Matrix, which lists requirements for providing information and information technologies of value.
| Type of Limitation or Opportunity | # of User Personas | Requirement Consolidated Across Personas |
|---|---|---|
| Product Definition | 7 | Information technology shall be interoperable, able to "plug in" to existing data portals used by user personas, to reduce the number of sources/screens they must visit and enable them to use existing data layers. |
| | 1 | Data platforms should plug into a single existing government data portal once the federal government makes one available. |
| User Experience (UX) | 5 | Information layers shall be intuitive to interpret, reducing the training required for use. |
| Accessibility | 6 | Information technologies shall be accessible via a cell phone or government laptop with limited connectivity. |
| | 1 | Information layers shall be accessible via both information technologies and printouts. |
| | 1 | Information layers shall be archivable, with provenance, to serve as public record. |
| Business Model | 5 | Information technologies shall meet the objectives of federal funding sources while also serving local and state decision needs. |
| Trustworthy AI | 4 | Information for value-added analytics shall have transparent documentation of algorithms. |
| | | Information for value-added analytics shall be open source. |
| | | Information for value-added analytics shall include uncertainty and error propagation. |
| Product Requirements | 5 | Information for value-added analytics shall be archived for long-term access. |
| | | Information for value-added analytics shall be pre-processed and ready to use. |
| | | Information for value-added analytics shall incentivize more resilient behavior and penalize less resilient behavior. |
| | | Information technologies shall integrate cybersecurity. |
| | | Information technologies should be marketed to the relevant agencies for using the available information. |
| Other Technology and Cultural Needs | 1 | Information technologies should include a business model to better serve less advantaged communities without exploiting them. |
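The "# of User Personas" columns in Tables 1 and 2 consolidate how many distinct personas voiced requirements in each category. A minimal sketch of that consolidation step; the persona names and the field names (`category`, `personas`) are illustrative, not the study's actual eight personas:

```python
from collections import defaultdict

# Each requirement row traces back to the user personas who voiced it
# (persona names below are hypothetical examples).
matrix = [
    {"category": "Accessibility",
     "requirement": "Accessible via cell phone or laptop with limited connectivity",
     "personas": {"Incident Commander", "Fuels Planner", "County Planner",
                  "Insurer", "Utility Manager", "NGO Analyst"}},
    {"category": "Trustworthy AI",
     "requirement": "Transparent documentation of algorithms",
     "personas": {"Fuels Planner", "Insurer", "NGO Analyst", "County Planner"}},
]

# Consolidate: count distinct personas per category, as in the
# "# of User Personas" column of Tables 1 and 2.
counts = defaultdict(set)
for row in matrix:
    counts[row["category"]] |= row["personas"]

for category, personas in counts.items():
    print(category, len(personas))
```

Keeping the persona sets (rather than bare counts) preserves the traceability from each consolidated requirement back to the interviews it came from, which is the point of the CTM/PTM structure.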
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.