Fast impact maps for earthquakes are mostly obtained by employing ground motion prediction equations (GMPEs), empirical relationships that relate the maximum shaking to the distance from the event source. Combined with site effects from the uppermost geological layers, these give a quick but rough approximation of the impact over a wide area. Physical 3D modelling of seismic waves can produce a much more detailed shaking map for a particular event, but it requires long computing times and urgent access to supercomputing resources, and it is very sensitive to uncertainties in the subsurface velocity model and in the source parameters.
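As an illustration, a GMPE reduces to a simple closed-form relation that can be evaluated instantly; the functional form below is a generic textbook shape, and the coefficients are purely hypothetical placeholders, not a published model:

```python
import math

def gmpe_pga(magnitude, distance_km, a=-1.0, b=0.9, c=-1.5, h=6.0):
    """Generic GMPE functional form (hypothetical coefficients, illustration
    only): ln(PGA) = a + b*M + c*ln(sqrt(R^2 + h^2))."""
    r = math.sqrt(distance_km**2 + h**2)  # pseudo-depth term avoids a singularity at R = 0
    ln_pga = a + b * magnitude + c * math.log(r)
    return math.exp(ln_pga)  # nominal peak ground acceleration

# Predicted shaking decays with distance for a fixed magnitude:
print(gmpe_pga(6.5, 10.0), gmpe_pga(6.5, 50.0))
```

Evaluating such a relation over a grid of sites is what makes GMPE-based maps fast; the cost of 3D wave propagation replaces this one-line formula with a full numerical simulation.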
FTRT tsunami computations are crucial in the context of Tsunami Early Warning Systems (TEWS). Greatly improved, highly efficient computational methods are the first ingredient for extremely fast and effective calculations; HPC facilities then push this efficiency to the maximum while drastically reducing computational times. This PD will comprise both earthquake and landslide sources. Earthquake tsunami generation is to an extent simpler than landslide tsunami generation, since landslide-generated tsunamis depend on the landslide dynamics, which necessitates coupling dynamic landslide simulation models to the tsunami propagation. In both cases, FTRT simulations in several contexts and configurations are the final aim of this pilot.
High-resolution 4D models of volcanic plumes and pyroclastic flows are needed for the interpretation of observational data (in particular radar, TIR images, and infrasound), for the calibration of empirical parameters in the integral models used for PVHA, and for a more accurate description (reduction of the epistemic uncertainty) of the ash source for atmospheric dispersion models. This pilot will be developed after the ASHEE code has gone through the optimization and enabling stage. Numerical simulations at very high resolution will be performed on specific selected test cases.
An open problem of current operational tsunami simulations is their capability to compute forecasts in real time, partly for efficiency reasons and partly due to the initial-condition assimilation problem. Furthermore, tsunami early warning should issue a science-based alarm of an imminent damaging event with enough accuracy and reliability to justify measures such as evacuations. The fast-traveling signals of seismic (surface) waves and acoustic waves, which precede the slower-propagating tsunami signal, give hope for establishing early warning systems that employ ocean-bottom pressure gauges or on-demand buoy measurements. A hydrostatic 2D model (as usually applied in tsunami modeling) is not capable of representing either acoustic or seismic waves adequately.
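The speed gap that this warning concept relies on can be quantified with back-of-the-envelope numbers; the ocean depth and source-to-gauge distance below are illustrative, and the acoustic speed is the approximate sound speed in sea water:

```python
import math

g = 9.81            # gravity, m/s^2
depth = 4000.0      # open-ocean depth, m (illustrative)
c_tsunami = math.sqrt(g * depth)   # long-wave (hydrostatic) tsunami speed
c_acoustic = 1500.0                # approximate sound speed in sea water, m/s

distance = 200e3    # source-to-gauge distance, m (hypothetical)
lead_time = distance / c_tsunami - distance / c_acoustic
print(f"tsunami speed = {c_tsunami:.0f} m/s, warning lead time = {lead_time / 60:.1f} min")
```

Even over 200 km, the acoustic signal arrives several minutes ahead of the tsunami, and seismic surface waves are faster still; capturing those signals, however, requires the non-hydrostatic physics that a standard 2D tsunami model lacks.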
PSHA is widely established for deciding safety criteria: it underpins official national hazard maps, building code requirements, the safety of critical infrastructure (e.g. nuclear power plants), and the earthquake insurance rates determined by governments and industry. However, PSHA currently rests on empirical, time-independent assumptions known to be too simplistic and to conflict with earthquake physics. The resulting deficits become apparent when damaging earthquakes occur in regions rated as low-risk by PSHA hazard maps, and near-fault effects from rupture on extended faults are not taken into account. Combined simulations of dynamic fault rupture and seismic wave propagation are crucial tools for shedding light on the poorly constrained processes of earthquake faulting. Realistic model setups should acknowledge topography, 3D geological structures, rheology, and fault geometries with appropriate stress and frictional parameters, all of which contribute to complex ground motion patterns. A fundamental challenge is to model the high-frequency content of the three-dimensional wave field, since the frequency range of 0–10 Hz is of pivotal importance for engineering purposes. Multiple executions of such multi-physics simulations need to be performed to provide a probabilistic hazard estimation.
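The final aggregation step can be sketched as follows; the rupture scenarios, annual rates, and simulated shaking values are purely hypothetical stand-ins for the multi-physics simulation ensemble:

```python
import math

# Hypothetical ensemble: each simulated rupture scenario yields a peak ground
# acceleration (in g) at the site, with an assumed annual rate of occurrence.
scenarios = [(0.05, 0.30), (0.02, 0.45), (0.005, 0.70), (0.001, 1.10)]  # (rate, PGA)

def annual_exceedance_rate(threshold_g):
    # Sum the rates of all scenarios whose simulated shaking exceeds the threshold.
    return sum(rate for rate, pga in scenarios if pga > threshold_g)

def prob_exceedance(threshold_g, years=50):
    # Poisson assumption: probability of at least one exceedance in `years`.
    return 1.0 - math.exp(-annual_exceedance_rate(threshold_g) * years)

print(f"P(PGA > 0.4 g in 50 yr) = {prob_exceedance(0.4):.3f}")
```

In a physics-based PSHA, each (rate, PGA) pair would come from one execution of the coupled rupture-and-wave-propagation model rather than from an empirical GMPE.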
PVHA methodologies provide a framework for assessing the likelihood that a given measure of intensity of different volcanic phenomena, such as tephra loading on the ground, airborne ash concentration, or pyroclastic flows, will be exceeded at a particular location within a given time period. This pilot deals with regional long- and short-term PVHA. Regional assessments are crucial for better land-use planning and for the risk mitigation countermeasures of civil protection authorities. Because of the computational cost required to adequately simulate volcanic phenomena, PVHA is mostly based on a single or very few selected reference scenarios. Independently of the degree of approximation of the numerical model used, PVHA for tephra loading and/or airborne ash concentration requires a high number of tephra dispersion simulations (typically several thousands, in order to capture the variability in meteorological and volcanological conditions), each of which is moderately intensive. This pilot will comprise both long- and short-term probabilistic hazard assessment for volcanic tephra fallout, adopting and improving a recently proposed methodology (Sandri et al., 2016) able to capture aleatory and epistemic uncertainties. Long-term probabilistic hazard assessment for PDCs will also be envisaged, focussing on aleatory and epistemic uncertainties in the Eruptive Source Parameters. Since tephra fallout models also allow a consistent treatment of spatially and temporally variable wind fields and can describe phenomena such as ash aggregation, Exascale capacity will make it possible, for the first time, to spatially extend PVHA to evaluate the potential impact of all active volcanoes in Italy on the entire national territory.
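A minimal sketch of how such a simulation ensemble feeds a hazard estimate at one site; a lognormal sample is used here as a placeholder for the thousands of real tephra dispersion runs, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder for an ensemble of tephra dispersion runs: simulated tephra load
# (kg/m^2) at one site, sampled over meteorological/volcanological variability
# (a lognormal stand-in, not a real dispersion model).
loads = rng.lognormal(mean=1.0, sigma=1.2, size=5000)

def hazard_curve(thresholds):
    # Probability of exceedance = fraction of ensemble members above each threshold.
    return [(loads > t).mean() for t in thresholds]

for t, p in zip([1, 10, 100], hazard_curve([1, 10, 100])):
    print(f"P(load > {t:3d} kg/m^2) = {p:.3f}")
```

The pattern explains the computational profile of the pilot: each ensemble member is a moderately intensive simulation, and the hazard curve only stabilises once thousands of members are available.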
PTHA methodologies provide a framework for assessing the likelihood that a given measure of tsunami intensity (e.g., maximum run-up height) will be exceeded at a particular location within a given time period. As historical tsunami catalogs are almost inherently incomplete, numerical models of tsunami generation, propagation, and inundation (e.g. Geist and Lynett, 2014) are combined with source probabilities to provide quantitative estimates of the probability of exceedance of different levels of a given tsunami metric. Here, a pilot for regional and local long-term PTHA is proposed. The regional assessment is crucial for identifying the coastal areas most exposed to a significant tsunami hazard. This can be used as a proxy for local hazard to define tsunami evacuation zones for civil protection purposes. Starting from the regional assessment, detailed inundation maps can be produced at local scale to allow local authorities to design ad hoc emergency plans. PTHA is mostly developed for seismic sources (SPTHA), whereas the corresponding methods for landslide sources (LPTHA) are far less mature. In both cases, PTHA requires a high number (typically tens to hundreds of thousands) of depth-averaged (2D) simulations, each of them moderately intensive.
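The combination of source probabilities with precomputed simulation results can be sketched as follows; the scenario rates and run-up values are invented for illustration:

```python
import numpy as np

# Hypothetical precomputed scenario database: each source scenario has an
# annual rate and a simulated maximum run-up (m) at one coastal site.
rates = np.array([1e-2, 5e-3, 1e-3, 5e-4, 1e-4])
runups = np.array([0.3, 0.8, 1.5, 2.5, 4.0])

thresholds = np.array([0.5, 1.0, 2.0, 3.0])
# Annual exceedance rate: sum the rates of scenarios whose run-up exceeds each threshold.
lam = np.array([rates[runups > t].sum() for t in thresholds])
# Probability of exceedance in a 50-year window (Poisson arrivals assumed).
poe_50yr = 1.0 - np.exp(-lam * 50.0)
for t, p in zip(thresholds, poe_50yr):
    print(f"P(run-up > {t:.1f} m in 50 yr) = {p:.4f}")
```

In a full SPTHA the scenario database holds tens to hundreds of thousands of entries, which is precisely why the 2D simulation campaign dominates the computational cost.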
A crucial aspect in TEWS is the time at which robust and accurate earthquake parameters become available: in the first minutes after a seismic event, only the magnitude and hypocentre are available, whereas information about the fault mechanism, fundamental to better constrain the tsunami source, is missing. Nevertheless, a tsunami warning has to be issued as fast as possible, attempting to forecast the coastal tsunami impact. Without the earthquake fault mechanism available in real time, a balance must be found between tsunami forecast accuracy and earliness. In this pilot, we propose a novel methodology that provides a rapid tsunami forecast at several coastal points; the forecast, only for seismically induced tsunamis, is formulated for each point as a probability density function (PDF) of a given tsunami metric and arises from an ensemble of scenarios and a priori regional seismotectonic information. We call this method Probabilistic Tsunami Forecast (PTF). This novel methodology will allow taking into account the uncertainty in the tsunami forecast, which is presently overlooked. As a specific implementation of PTF, we also propose a novel procedure that exploits a single scenario or a suite of scenarios arising from the PTF to assess the coastal tsunami impact and the associated uncertainty (e.g., in terms of inundation maps) for specific sites.
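The ensemble-to-PDF step can be sketched as follows; the scenario heights and the a priori weights are invented for illustration, standing in for precomputed scenarios weighted by regional seismotectonic information:

```python
import numpy as np

# Hypothetical ensemble: each scenario compatible with the early magnitude and
# hypocentre estimates predicts a wave height (m) at one forecast point, and
# carries an a priori weight from regional seismotectonics (weights sum to 1).
heights = np.array([0.2, 0.5, 0.9, 1.4, 2.3, 3.1])
weights = np.array([0.30, 0.25, 0.20, 0.12, 0.08, 0.05])

def exceedance_prob(threshold):
    # Probability mass of the scenarios predicting a height above the threshold.
    return weights[heights > threshold].sum()

# Weighted percentile as a point forecast with uncertainty attached.
order = np.argsort(heights)
cdf = np.cumsum(weights[order])
median = heights[order][np.searchsorted(cdf, 0.5)]
print(f"P(height > 1 m) = {exceedance_prob(1.0):.2f}, median forecast = {median} m")
```

Reporting the full distribution (or selected percentiles) rather than a single scenario is what lets the forecast carry the source uncertainty that a deterministic warning discards.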
Adopting 3D simulations of wave propagation will benefit the assessment of PSHA, in particular by reducing the uncertainties related to the use of GMPEs. Analysing ground motion scenarios for potential seismic events relies on the capability to accurately image the geological structure at multiple scales and to use the information buried in seismic "big data" records. Several projects, like PRACE IMAGINE_IT, are taking advantage of Tier-0 systems to capture a realistic heterogeneous image of the interior of the Earth. This pilot aims to extend the experience in full-waveform adjoint tomography gained by these projects to one or more selected small European regions (for example the industrial region of South-Eastern Sicily) where improvements towards an accurate PSHA are critical for increasing resilience to seismic hazard.
Modern digital seismological networks record a wealth of continuous waveforms containing huge numbers of buried signals from seismic sources (earthquakes, slow earthquakes, environmental sources). Breakthroughs can be achieved by systematically exploring these observations in a common framework of data-streaming workflows that combine advanced non-stationary signal processing and statistical analysis with Bayesian inference and machine learning methods. Important applications include innovative continuous monitoring and space-and-time statistical analysis of the energy-release activity in active (tectonic, volcanic) regions before large earthquakes or volcanic eruptions.
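A classical building block of such data-streaming detection workflows is the STA/LTA trigger; the sketch below runs it on synthetic data, with window lengths, threshold, and the injected "event" all chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 100                                     # sampling rate, Hz
trace = rng.normal(0.0, 1.0, 60 * fs)        # one minute of synthetic noise
trace[3000:3200] += 8.0 * rng.normal(0.0, 1.0, 200)  # buried "event" at t = 30 s

def sta_lta(x, nsta, nlta):
    # Ratio of short-term to long-term average signal energy: the classic
    # trigger used to flag transient signals in continuous waveforms.
    energy = x ** 2
    sta = np.convolve(energy, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(energy, np.ones(nlta) / nlta, mode="same")
    return sta / np.maximum(lta, 1e-12)

ratio = sta_lta(trace, nsta=50, nlta=2000)   # 0.5 s / 20 s windows
onset = np.argmax(ratio > 4.0)               # first sample above the trigger threshold
print(f"trigger at t = {onset / fs:.1f} s")
```

Production systems replace this fixed-threshold detector with the non-stationary processing and learned detectors described above, but the streaming structure (a cheap statistic updated sample by sample over continuous data) is the same.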
The Earth’s magnetic field is sustained by a fluid dynamo operating in the Earth’s fluid outer core. Its geometry and strength define the equivalent of the climatological mean over which the interaction of the Earth with its magnetic environment takes place. It is consequently important to make physics-based predictions of the evolution of the dynamo field over the next few decades. In addition, the geomagnetic field has the remarkable ability to reverse its polarity from time to time (the last reversal occurred some 780,000 years ago). Observations of the properties of the field during polarity transitions are sparse, and ultra-high-resolution simulations should help better define these properties.
Operational volcanic ash dispersal forecasts are routinely used to prevent aircraft encounters with volcanic ash clouds and to perform re-routings avoiding contaminated airspace areas. However, a gap exists between current operational forecast products (e.g. those issued by the Volcanic Ash Advisory Centers) and the requirements of the aviation sector and related stakeholders. Two aspects are particularly critical: 1) the time and space scales of current forecasts are coarse (for example, the current operational setup of the London VAAC at the U.K. Met Office outputs on a 40 km horizontal resolution grid with 6-hour time averages); and 2) quantitative forecasts are lacking. Several studies (e.g. Kristiansen et al., 2012) have concluded that the main source of epistemic/aleatory uncertainty in ash dispersal forecasts comes from the quantification of the source term (eruption column height and strength), which very often is not fully constrained in real time. This limitation can be partly circumvented by integrating into the models ash cloud observations away from the source, typically satellite retrievals of fine ash column mass load (i.e. the vertical integration of concentration). Data assimilation has the potential to improve ash dispersal forecasts through an efficient joint estimation of the (uncertain) volcanic source parameters and the state of the ash cloud.
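One common way to realise such a joint estimation is a stochastic ensemble Kalman update over an augmented parameter-plus-state vector; the sketch below uses a toy linear "dispersal model" and invented numbers throughout:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100  # ensemble size

# Toy joint state per member: [source strength (kg/s), column mass load (g/m^2)].
# A hypothetical linear "dispersal model" maps strength to mass load plus spread.
strength = rng.normal(1e6, 3e5, N)                     # uncertain source term
mass_load = 2e-3 * strength + rng.normal(0, 100, N)    # forecast observable

obs, obs_err = 2500.0, 150.0    # satellite mass-load retrieval and its error (invented)

# Stochastic ensemble Kalman update of the joint [parameter, state] vector.
X = np.stack([strength, mass_load])                    # 2 x N
anomalies = X - X.mean(axis=1, keepdims=True)
P = anomalies @ anomalies.T / (N - 1)                  # ensemble covariance
H = np.array([[0.0, 1.0]])                             # we observe the mass load only
K = P @ H.T / (H @ P @ H.T + obs_err**2)               # Kalman gain (2 x 1)
perturbed_obs = obs + rng.normal(0, obs_err, N)
X_a = X + K @ (perturbed_obs - H @ X)                  # analysis ensemble

print(f"source strength: {strength.mean():.3g} -> {X_a[0].mean():.3g} kg/s")
```

Because the ensemble covariance links the observable to the source term, assimilating the satellite retrieval corrects the unobserved eruption strength as well as the cloud state, which is exactly the joint estimation the pilot targets.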