DRIHM and its companion project DRIHM2US have developed a prototype research infrastructure for simulating the complete process involved in extreme hydrometeorological events, enabling a step change in how scientists can approach studying high-impact weather events.
Every year, high-impact weather events (HIWEs) related to meteorological, hydrological, geological, and climate hazards cause significant loss of life. From 1970 to 2012, about 9,000 HIWEs were reported globally. Altogether, they caused the loss of 1.94 million lives and economic damage of $2.4 trillion (U.S. dollars; UNISDR 2014). Storms, droughts, floods, extreme temperatures, and coastal hazards all figure on the lists of the worst HIWE-related disasters. Storms and floods accounted for 79% (44% floods and 35% storms) of the total number of disasters caused by weather, water, and climate extremes and caused 54% of the lives lost (14% floods and 40% storms) and 84% (33% floods and 51% storms) of the economic losses (WMO 2014). Such losses may hold back economic and social development by years or even decades.
Disaster risk reduction (DRR) is a broad issue that calls for political commitment and public understanding in order to be properly addressed. The primary aim of DRR is to make the public aware of the risks it faces from natural hazards, such as storms and flash floods, and to offer reassurance that adequate resources are available to minimize their impacts. A relevant indicator of the reliability and proper functioning of a DRR organization is its ability to inform the public of the procedures it relies upon to rapidly assess an impending disaster and to issue alerts. The knowledge that warnings will be issued through clear and sound procedures, using the most advanced tools, also helps to build consensus around the authority, which in turn supports its risk reduction efforts, such as controls on land use or property.
In summary, sophisticated “warning scenarios” not only serve immediate needs in a crisis, but also establish the credibility of the organizations, furthering the development of consensus on the required regulations, with a strong focus on risk reduction.
Improving the quality and reliability of such sophisticated warning scenarios requires focused hydrometeorological research (Parodi et al. 2012) to 1) understand, explain, and predict the physical processes producing HIWEs, 2) understand the possible intensification of such events because of climate change effects, and 3) explore the potential of the increasing computational power provided by high-performance computing (HPC), high-throughput computing (HTC), and cloud computing—in combination often called e-infrastructures—to provide deeper understanding of those events through fine-resolution modeling over large domains.
At the heart of these research challenges lies the ability to have easy access to hydrometeorological data and models and to facilitate the necessary collaboration between meteorologists, hydrologists, and Earth science experts to achieve accelerated scientific advances in hydrometeorological research (HMR). This can be achieved through stronger collaboration with the information and communication technologies (ICT) community, which continually provides new technological solutions (Shapiro et al. 2007, 2010; Shukla et al. 2009, 2010; Charlton-Perez et al. 2015; Leong and Kranzlmüller 2015).
The European Union (EU)-funded projects Distributed Research Infrastructure for Hydro-Meteorology (DRIHM; www.drihm.eu) and Distributed Research Infrastructure for Hydro-Meteorology to United States of America (DRIHM2US; www.drihm2us.eu), together denoted as DRIHM(2US), developed a prototype distributed computing infrastructure (DCI) to facilitate this collaboration by providing advanced end-to-end HMR services (models, datasets, and postprocessing tools), with the aim of paving the way to a step change in how scientists can approach the study of HIWEs, with a special focus on flood and flash flood events. This paper discusses how the DRIHM(2US) services now make it possible to work in a modular environment and enhance the modeling and data processing capabilities of the HMR community through the adaptation, optimization, and integration of dedicated HMR services over the associated e-infrastructure, which itself features several different computing paradigms (HPC, HTC, and cloud computing).
The paper is organized as follows: The next section presents the motivations of the proposed DRIHM(2US) DCI for hydrometeorology; followed by a discussion of the key DRIHM(2US) elements, a detailed review of the application of the DRIHM(2US) services to the Genoa, Italy, 2014 flash flood event, and then a discussion and conclusions.
THE DISTRIBUTED COMPUTING INFRASTRUCTURE FOR HYDROMETEOROLOGY: MOTIVATIONS.
The quality, quantity, and complexity of model engines, postprocessing tools, and datasets for hydrometeorology and climate research have dramatically increased over the past 15 years. Some state-of-the-art initiatives can be identified: the Community Earth System Model (CESM; Hurrell et al. 2013), which provides a fully coupled, global climate-modeling suite; the Community Surface Dynamics Modeling System (CSDMS), using a component-based approach to support geoscience modeling of Earth’s surface (Peckham et al. 2013); the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS; Horsburgh et al. 2009), which provides an Internet-based system for sharing hydrologic data, through databases and servers, connected through web services and client applications, allowing for the publication, discovery, and access of data; the Earth System Modeling Framework (ESMF; Hill et al. 2004), which provides generic tools for building climate, numerical weather prediction, data assimilation, and other Earth science applications; and the Water Information Research and Development Alliance (WIRADA, 2008–ongoing) initiative, which is a partnership between the Bureau of Meteorology and the Commonwealth Scientific and Industrial Research Organisation in Australia, covering four broad categories (water information systems, foundation data products, water accounting and assessment, and water forecasting and prediction).
Along these lines, a first analysis of existing gaps between the most advanced HMR communities and the best available ICT tools was conducted within the DRIHMS project in 2011 (Schiffers et al. 2011). The analysis was based on the results of two questionnaires, one for the HMR community and one for the ICT community, augmented by additional expert interviews. Globally, about 300 respondents from 40 countries returned the questionnaire: 82% from European institutions, while the remaining 18% came from overseas, mainly from the United States. At the European level, the leading countries in terms of number of collected questionnaires were Italy (20%), Germany (11%), France (9%), Spain (9%), and the United Kingdom (4%). Most of the HMR respondents were from the fields of hydrometeorology (40%) or meteorology (43%), with a smaller but still significant contribution from hydrology (10%). About half of the HMR respondents were from research institutions (47%), with the remainder from institutions with both research and operational responsibilities (38%) or purely operational institutions (15%). A summary of the results indicated that the ICT challenges for HMR scientists include the ability to exploit significant computational resources for research and operational activities and the ability to retrieve and access data from different sources.
DRIHM2US represented the natural evolution of the DRIHMS project survey activities. The key element was a set of transatlantic networking activities involving hydrometeorologists, climate scientists, and ICT scientists from both Europe and the United States, all focused on the challenge of overcoming current limitations in the interplay between existing e-science environments in these two fields. The DRIHM2US consultation (Harpham et al. 2017) focused on identifying the most important features for state-of-the-art numerical models, including eliciting and prioritizing research and development needs, identifying opportunities to answer these needs, and determining how such a research infrastructure can be maintained, operated, and improved over time. Overall, responses were received from about 150 EU and U.S. specialists from a wide variety of organizations and roles, ranging from scientific communities and citizen scientists to ICT support staff, and exhibiting a very high level of experience. Respondents reached consensus on a number of key factors that must be taken into account when scoping and specifying any future e-science infrastructure for HMR. Variations between respondents from Europe and those from the United States indicated slightly different experiences with such infrastructures but a fairly united view overall: the infrastructure must be very easy to access, very easy to use, and accompanied by comprehensive training and support. It must be built on a clear set of standards, particularly for data and model interfacing, with the objective of enabling flexible usage rather than restricting users. It must not be tied too strongly to any single HMR community and should allow interfacing with adjacent scientific communities, but it must not become too vast and unwieldy. With regard to data, practitioners from the United States had better experiences of access to open data than their European counterparts.
The DRIHM(2US) initiative has built on these DRIHMS and DRIHM2US findings and has developed a modular environment, with the DRIHM(2US) DCI enabling
the provision of integrated HMR services (such as meteorological models, hydrological models, stochastic downscaling tools, and hydraulics models) enabled by unified access to and seamless integration of underlying e-infrastructures;
the design, development, and deployment of user-friendly interfaces aiming to abstract HMR service provision from the underlying e-infrastructure complexities and specific implementations, thus enabling multidisciplinary and global collaboration between meteorologists, hydrologists, and possibly other Earth scientists; and
the user-driven composition of virtual facilities in the form of hydrometeorological forecasting chains, composed of different HMR resources (models, postprocessing tools, and data).
The result is an enhancement of the modeling and data processing capabilities of the HMR community through the adaptation, optimization, and integration of dedicated HMR services relying on different computing paradigms and technologies (e.g., high-performance, high-throughput, and cloud and grid computing).
KEY ELEMENTS OF THE DRIHM(2US) INITIATIVE.
The DRIHM(2US) scientific case is built around three experiment modeling suites able to address the interdisciplinary and international challenges of HMR in forecasting flash flood–related HIWEs. These three modeling experiment suites (Fig. 1) compose the so-called hydrometeorological forecasting chain, whose end result is a prediction of hydrological quantities such as river runoff and water level, obtained by feeding the chain with a large variety of models and data sources.
Fig. 1. HMR experiment suites.
On a conceptual level, a complete hydrometeorological forecasting chain consists of three consecutive layers:
The rainfall layer combines different numerical weather prediction (NWP) models into a high-resolution multimodel ensemble, with the option of applying stochastic downscaling algorithms, to produce quantitative rainfall predictions for severe rainfall events.
The discharge layer combines outputs from the rainfall layer, such as predictions of rainfall, 2-m temperature, wind speed and direction, and relative humidity, with corresponding observations; these become inputs to multiple hydrological models to produce river discharge predictions.
The water level, flow, and impact layer addresses the execution of hydraulic model compositions in different modes to assess the water levels, flow, and impact created by the flood events and to compare them against observations through verification metrics (a conceptual sketch of this three-layer composition follows the list).
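As a purely illustrative sketch, and an assumption about structure rather than actual DRIHM(2US) code, the three layers can be viewed as composable stages in Python, each consuming the output of the previous one; the type aliases and function parameters are placeholders for the concrete models introduced below.

```python
# Purely illustrative sketch (not DRIHM code): the three layers of the
# hydrometeorological forecasting chain viewed as composable stages, each
# consuming the previous layer's output. Type aliases and function names are
# placeholders for the concrete models described later in the paper.
from typing import Callable, Sequence

RainfallField = Sequence[Sequence[float]]  # gridded rainfall (e.g., mm h-1)
Hydrograph = Sequence[float]               # river discharge series (m3 s-1)
WaterLevels = Sequence[float]              # water level series (m)

def run_chain(rainfall_layer: Callable[[], RainfallField],
              discharge_layer: Callable[[RainfallField], Hydrograph],
              water_level_layer: Callable[[Hydrograph], WaterLevels]) -> WaterLevels:
    """Execute the three consecutive layers of a forecasting chain."""
    rainfall = rainfall_layer()              # NWP ensemble (+ optional downscaling)
    discharge = discharge_layer(rainfall)    # hydrological model(s)
    return water_level_layer(discharge)      # hydraulic model(s) and impact assessment
```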
Model chaining and model interoperability: The metadata, adaptors, and portability (MAP) approach.
DRIHM(2US) identified an initial set of state-of-the-art model engines for the different modeling experiment suites. In the present version of the platform, nine models are available: three meteorological models, the Advanced Research version of the Weather Research and Forecasting (WRF) Model (WRF-ARW; Michalakes et al. 2004), the Nonhydrostatic Mesoscale Model (WRF-NMM; Janjić et al. 2005), and Meso-NH (Lafore et al. 1998), together with the option for stochastic downscaling with the Rainfall Filtered Autoregressive Model (RainFARM; Rebora et al. 2006). Three hydrological models simulating catchment drainage are available: the semidistributed rainfall–runoff model Discharge River Forecast (DRiFt; Giannoni et al. 2000), the distributed rainfall–runoff model Real-Time Interactive Basin Simulator (RIBS; Garrote and Bras 1995), and the distributed hydrological model Hydrologiska Byråns Vattenbalansavdelning (HBV; Bergström 1995), together with the option to initialize the models using rain gauge observations. Two main hydraulic options have been provided for simulating the flood itself: an Open Modeling Interface (OpenMI; Gregersen et al. 2007) composition of three models [MASCARET, the Regional Statistical Forecast Model (RSFM), and an impact (property damage) calculator] to model lateral exchanges between a river channel and a floodplain, or Delft3D (Delft Hydraulics 1999). Although other hydrological and hydraulic models could be introduced into the platform, the present configuration allows for realizing 3 × 3 × 2 (18) different hydrometeorological chains. Both meteorological and hydrological models have been selected for their specific ability to reproduce mesoscale deep moist convective processes and their hydrological effects in areas where complex topography plays a crucial role (Atencia et al. 2011; Fiori et al. 2017).
When considering standards-based DCIs for running numerical models and accessing the supporting data, new numerical models can be written to be directly compliant with the incorporated standards. Conversely, if the infrastructure is to include existing models, then these must be made compliant to the necessary level (e.g., for input and output data). The DRIHM(2US) e-infrastructure is exclusively populated by legacy models, ranging from models common to their scientific domains, with long development histories and large user bases, to research-standard code that has been iterated many times at universities. To incorporate such a wide variety of models, a simple gateway concept for numerical model compatibility was derived. Adherence to this concept makes a model compatible for implementation on the infrastructure and also points toward future, more formal standardization.
DRIHM(2US) then abstracted common characteristics from leading integrated modeling technologies and derived a generic framework, characterized as the model MAP approach (Harpham et al. 2015, 2016): metadata, documentation, and license (each model must be supplied with metadata according to a given standard, appropriate documentation, and a license for users to use it); adaptors (or bridges, which translate the model inputs and outputs to and from common standards); and portability (each model must be made portable, that is, not tied strongly to local infrastructure). The model MAP is a key factor enabling the extensibility of the DRIHM(2US) portal at the model level, allowing the inclusion of new HMR model engines and the extension across new model domains. An example is provided by the Application of Research to Operations at Mesoscale (AROME) model in Fig. 2; the AROME model engine, an instance of which is operational at Météo-France, cannot be shared on the DRIHM DCI because of strict licensing constraints. Still, building on the MAP concepts, its outputs can be used to force subsequent hydrological models, as described in the section “An example of DRIHM(2US) case study: The Genoa 2014 flash flood.”
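To make the MAP idea concrete, the following minimal sketch, an illustration and not the DRIHM(2US) implementation, admits a model engine to a catalog only if it carries metadata, documentation, a license, adaptors to and from the common standards, and a portability guarantee; the field names and the register function are hypothetical.

```python
# Minimal sketch (an assumption, not the DRIHM implementation) of a MAP-style
# model registration: metadata/documentation/license, adaptors translating
# inputs and outputs to and from the common standards, and a portability flag.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ModelMAP:
    name: str                 # e.g., "WRF-ARW"
    metadata: dict            # catalog metadata (the project adopted ISO 19139)
    documentation_url: str    # user documentation
    license: str              # license granted to portal users
    to_standard: Callable     # adaptor: native output -> common format
    from_standard: Callable   # adaptor: common format -> native input
    portable: bool            # not tied strongly to local infrastructure

def register(model: ModelMAP, catalog: list) -> None:
    """Admit a model engine to the platform only if it satisfies the MAP."""
    missing = [field for field in ("metadata", "documentation_url", "license")
               if not getattr(model, field)]
    if missing or not model.portable:
        raise ValueError(f"{model.name} does not satisfy the model MAP: "
                         f"missing {missing}, portable={model.portable}")
    catalog.append(model)
```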
Fig. 2. The DRIHM(2US) model chain.
DRIHM(2US) makes possible any combination of the abovementioned (and new) models in a chain using two standards-based interfaces (Harpham and Danovaro 2015). The precipitation (P) interface between the meteorological and hydrological models is a one-way, file-based interface using the Network Common Data Form (netCDF) file format, since the meteorological outputs are grid series. The second is the flow (Q) interface, which allows the hydrological variables to be used as inputs for the hydraulic models, applying the Water Markup Language, version 2 (WaterML2), file format for point series outputs. Interfaces in addition to P and Q can be added to support other topics of potential interest, such as ocean dynamics or coastal morphology. Indeed, the P interface has been conceived to offer mainly the precipitation variable for flooding studies, the reference use case of the DRIHM(2US) platform, together with other key meteorological parameters such as 10-m wind speed and direction, surface air pressure, 2-m temperature and specific humidity, latent and sensible heat fluxes, and incoming solar radiation, depending on the hydrological models selected for the prescribed chain. Thus, DRIHM(2US) offers the potential to be extended to many other modeling domains with minimal additional effort.
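As an illustration of what a P-interface-style payload might look like, the following sketch, assuming the netCDF4 Python library and illustrative variable names and grid sizes rather than the exact DRIHM(2US) specification (given in Harpham and Danovaro 2015), writes a gridded hourly precipitation forecast to a CF-convention netCDF file.

```python
# Minimal sketch of a P-interface-style adaptor: writing a gridded hourly
# precipitation forecast to a CF-convention netCDF file with the netCDF4
# library. Variable names, units, and grid sizes are illustrative only; the
# actual DRIHM P-interface specification is in Harpham and Danovaro (2015).
import numpy as np
from netCDF4 import Dataset

ntime, ny, nx = 24, 100, 100
precip = np.random.gamma(shape=0.5, scale=2.0, size=(ntime, ny, nx))  # synthetic data

with Dataset("p_interface_precip.nc", "w", format="NETCDF4") as ds:
    ds.Conventions = "CF-1.6"
    ds.createDimension("time", ntime)
    ds.createDimension("y", ny)
    ds.createDimension("x", nx)

    time = ds.createVariable("time", "f8", ("time",))
    time.units = "hours since 2014-10-09 00:00:00"
    time.standard_name = "time"
    time[:] = np.arange(ntime)

    pr = ds.createVariable("precipitation_amount", "f4", ("time", "y", "x"),
                           zlib=True)
    pr.units = "kg m-2"            # equivalent to mm of water per time step
    pr.standard_name = "precipitation_amount"
    pr[:] = precip
```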
The underlying e-infrastructure.
From an ICT operational perspective, the major objective of DRIHM(2US) is to enable the HMR community to set up chains of models on various spatiotemporal scales, to support their integrated configuration, to fetch the required data, and to execute the workflow on the most appropriate ICT resources. Such resources are available to the community within the existing European and U.S. e-infrastructure ecosystem, while adhering to constraints imposed by model developers, data owners, and resource providers.
To overcome these challenges, DRIHM(2US) developed the science bus concept, adapted from Chappell’s enterprise service bus approach (Chappell 2004) but extended to support the required model chaining and the chain execution on grid resources [e.g., granted by the European Grid Infrastructure (EGI); Kranzlmüller et al. 2010], cloud resources (e.g., available through EGI’s Federated Cloud Initiative), and HPC resources [e.g., provided through the Partnership for Advanced Computing in Europe (PRACE); Turunen et al. 2010].
Because of their heterogeneity, the runtime environments for the various models need to be prepared in a standardized manner prior to model execution in order to make them executable on the arbitrary grid, cloud, and HPC resources available. These aspects represent the adaptor and portability elements of the model MAP. While the software modules required to make the model engines compliant with the adopted P and Q interfaces (i.e., the adaptors) could be supplied by the model developers, the assembly line management (the process of making a model engine portable) has been provided by DRIHM(2US). This strategy has proven stable and extensible: it allows new HMR applications to be seamlessly integrated with legacy ones, and it supports access to the bus through external web services.
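A rough, purely hypothetical sketch of this resource-matching idea is given below; the selection rules and thresholds are assumptions for illustration and do not reflect the actual science bus scheduling logic.

```python
# Rough illustration (not DRIHM code) of matching a model run to a class of
# e-infrastructure resource, in the spirit of the science bus: large
# MPI-parallel meteorological runs go to HPC (e.g., PRACE systems), ensembles
# of independent lightweight runs to grid/HTC resources (e.g., EGI), and
# service-like components to cloud resources. The rules and thresholds here
# are assumptions for illustration only.
def select_resource(cores_needed: int, is_mpi: bool, n_independent_runs: int) -> str:
    if is_mpi and cores_needed >= 256:
        return "HPC (e.g., PRACE)"
    if n_independent_runs > 10 and not is_mpi:
        return "grid/HTC (e.g., EGI)"
    return "cloud (e.g., EGI Federated Cloud)"

print(select_resource(cores_needed=1024, is_mpi=True, n_independent_runs=1))   # HPC
print(select_resource(cores_needed=4, is_mpi=False, n_independent_runs=34))    # grid/HTC
```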
The DRIHM portal: A science gateway for hydrometeorology.
The DRIHM portal is the science gateway designed to shape the DRIHM(2US) vision (Danovaro et al. 2014). The portal supports users in experiment configuration and execution by providing integrated solutions to manage and exploit the e-infrastructure’s key ingredients: state-of-the-art numerical simulation model engines (Fig. 2), a set of powerful distributed ICT resources, and an easy-to-use interface. The result is a flexible and extensible environment that, in the case of the WRF-ARW for example, guides the user through domain selection (Fig. 3, top) and parameter selection (Fig. 3, middle), produces ready-to-use configuration files or namelists (Fig. 3, bottom), manages job submission and result retrieval, and enables results to be analyzed and compared in a straightforward way.
Fig. 3. DRIHM(2US) portal snapshots: (top) WRF-ARW domain configuration, (middle) the WRF-ARW physics options menu, and (bottom) a snapshot of the namelist.input configuration file.
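As an illustration of the namelist generation step shown in Fig. 3 (bottom), the following sketch, an assumption rather than the portal’s actual code, turns a dictionary of user selections into a Fortran namelist file of the kind WRF-ARW expects; the group and parameter names follow the WRF convention, while the values are illustrative only.

```python
# Minimal sketch (not the actual DRIHM portal code): rendering a dictionary of
# user selections from a web form as a WRF-style namelist.input file. Group and
# parameter names follow the WRF convention; the values are illustrative only.
selections = {
    "domains": {"max_dom": 2, "e_we": [220, 301], "e_sn": [220, 301],
                "dx": [1000.0, 200.0], "dy": [1000.0, 200.0]},
    "physics": {"mp_physics": [6, 6], "cu_physics": [0, 0],
                "bl_pbl_physics": [1, 1]},
}

def to_fortran(value):
    """Render a Python value as a Fortran namelist literal."""
    if isinstance(value, bool):
        return ".true." if value else ".false."
    if isinstance(value, list):
        return ", ".join(to_fortran(v) for v in value)
    if isinstance(value, str):
        return f"'{value}'"
    return str(value)

def write_namelist(groups, path="namelist.input"):
    """Write each group as an &group ... / block, one parameter per line."""
    with open(path, "w") as f:
        for group, params in groups.items():
            f.write(f"&{group}\n")
            for name, value in params.items():
                f.write(f" {name:<20} = {to_fortran(value)},\n")
            f.write("/\n\n")

write_namelist(selections)
```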
The DRIHM(2US) portal is based on the technologies proposed by the Scientific Gateway Based User Support (SCI-BUS; Kacsuk et al. 2013) project, that is, a version of the general-purpose Web Service–Parallel Grid Run-Time and Application Development Environment (WS-PGRADE)/Grid and Cloud User Support Environment (gUSE) portal family customized for e-science environments (D’Agostino et al. 2015). The principle is to improve the way the scientist works by decoupling the HMR aspects from the ICT aspects, shielding non-ICT experts from the underlying ICT complexities that specific implementations and computational resources require.
The portal represents a step beyond the state of the art in HMR because models can be freely combined in complex simulation chains. The adoption of standardized interfaces and of the data conversion tools developed in the project makes it possible to interpret Fig. 2 as a directed graph: models are the nodes, and the arrows are directed arcs connecting two models that share the same interface. Each possible simulation chain is a path on the directed graph; thus, the selection of a single model or of a complex chain (e.g., one exploiting WRF-NMM, RainFARM, RIBS, and Delft3D) defines a valid chain supported by the science gateway.
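The following minimal sketch, an illustration rather than DRIHM(2US) source code, expresses this directed-graph view in Python: an arc connects two models that share the P or Q interface, and a chain is valid if it is a path in the graph.

```python
# Minimal sketch (an assumption, not DRIHM source code) of the directed-graph
# view of Fig. 2: models are nodes, and an arc connects two models sharing the
# P (precipitation, netCDF) or Q (flow, WaterML2) interface. Any path through
# the graph is a candidate simulation chain.
EDGES = {
    # meteorological layer -> downscaling / hydrological layer (P interface)
    "WRF-ARW": ["RainFARM", "DRiFt", "RIBS", "HBV"],
    "WRF-NMM": ["RainFARM", "DRiFt", "RIBS", "HBV"],
    "Meso-NH": ["RainFARM", "DRiFt", "RIBS", "HBV"],
    "RainFARM": ["DRiFt", "RIBS", "HBV"],
    # hydrological layer -> hydraulic layer (Q interface)
    "DRiFt": ["OpenMI composition", "Delft3D"],
    "RIBS": ["OpenMI composition", "Delft3D"],
    "HBV": ["OpenMI composition", "Delft3D"],
    "OpenMI composition": [],
    "Delft3D": [],
}

def is_valid_chain(chain):
    """A chain is valid if every consecutive pair is connected by an arc."""
    return all(b in EDGES.get(a, []) for a, b in zip(chain, chain[1:]))

print(is_valid_chain(["WRF-NMM", "RainFARM", "RIBS", "Delft3D"]))  # True
print(is_valid_chain(["RIBS", "WRF-ARW"]))                         # False
```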
Three user categories exist for the DRIHM portal: citizen scientists, scientists, and expert scientists. All users have to register (http://portal.drihm.eu/liferay-portal-6.1.0) on the DRIHM portal and are automatically classified as citizen scientists. HMR researchers can then apply to be classified as scientists or expert scientists on the basis of their skills and research purposes, which are assessed by a DRIHM(2US) review committee. Each category corresponds to different rules and regulations for accessing the available services. For example, the calibration of a basin for hydrological simulations has to be provided by expert hydrologists, who can assure the correctness of the data and therefore the scientific validity of the simulation results. While scientists are free to define every single parameter of an experiment and to use their own input data, citizen scientists are supplied with predefined scenarios targeted to give maximum insight to these nontechnical users. All DRIHM(2US) users are offered a rich set of training information to learn more about the available modeling and visualization services. This training material has allowed DRIHM(2US) to be introduced into some higher-education curricula (e.g., the European master of meteorology at the University of Barcelona).
Key feedback and achievements of the DRIHM(2US) 2014 implementation workshop.
The DRIHM(2US) platform was thoroughly tested during a hands-on workshop organized in Madrid, Spain, in September 2014. The workshop gathered 31 participants, coming from many European countries but also the United States, Bolivia, Barbados, Sudan, Thailand, and the Philippines, selected from over 150 applications, to receive training on how to build interoperable forecasting chains in DRIHM(2US) and execute them on HPC platforms. After testing the DRIHM(2US) system through the portal, the participants were asked to provide feedback through a questionnaire. Most participants agreed that DRIHM(2US) can fill a crucial gap in hydrometeorology, allowing practitioners to widen the scope of their work by including other types of models that they had not been using so far. Some participants expected to enhance their modeling chains by incorporating some DRIHM(2US) components, while others enjoyed the simple access to the grid computing infrastructure. DRIHM(2US) was generally perceived as a developing project, and participants encouraged further improvement, mainly along two lines: data availability to run models in large geographical areas and flexibility in workflow configuration to customize its application (for instance, for model calibration or data assimilation). Several participants mentioned the possibility of configuring model instances for their own case studies. Overall, the participants expected the platform to grow in the near future, including more models and more critical cases.
AN EXAMPLE OF DRIHM(2US) CASE STUDY: THE GENOA 2014 FLASH FLOOD.
During the project lifetime, the western Mediterranean area was affected by a number of very intense flash flood phenomena that hit regions with complex topography, lasted less than 1 day, and produced hundreds of millimeters of rainfall in a few hours on small watersheds. These kinds of HIWEs are nowadays documented by various studies, owing both to their increasing number and to improved observational capabilities (Alexander et al. 2006; Coumou and Rahmstorf 2012; Min et al. 2011; Tramblay et al. 2013). The postevent analysis of one of these events, which occurred in October 2014 in Genoa (Liguria, Italy), is reported here because its analysis involved all the functionalities associated with DRIHM(2US), allowing the use of different models and the nesting of different runs down to a very fine meteorological grid size of 200 m.
The Genoa 2014 event.
The Genoa 2014 event, which affected the Liguria region in northwestern Italy (Fig. 4a) and in particular the torrentlike Bisagno River, which crosses the city center (Fig. 4b; 100 km2), was characterized by two distinct phases. The panels on the right side of Fig. 4 show the first phase during the morning of the day, until 1200 UTC, with rainfall depths between 50 and 130 mm over the Bisagno catchment. After a break in the rainfall, the second phase in the late evening was characterized by hourly rainfall peaks of around 100–130 mm between 2000 and 2100 UTC, reaching rainfall depths between 150 and 260 mm. The daily peak rainfall depth was around 400 mm, and the average rainfall depth over the catchment was 220 mm. As a consequence of these torrential rainfalls, the Bisagno River produced a deadly flash flood in the city center (around 2100 UTC) with a discharge peak of 1,100 m3 s−1.
Fig. 4. (left) (a) Northern Italy topography. (b) Bisagno River catchment (100 km2) and positions of four telemetering rain gauges. (right) Time series of observed hourly rainfall depth and relative cumulated rainfall depth for the four rain gauges on 9 Oct 2014 (Fiori et al. 2017).
It is worth noting that, because of its peculiar spatiotemporal evolution and intrinsically low predictability, the operational hydrometeorological suite, composed of the Modello Locale in Hybrid Coordinates (MOLOCH; Buzzi et al. 2014) meteorological model at cloud-permitting grid spacing (2 km), the RainFARM model, and the DRiFt model [as used at the hydrometeorological office of the Liguria Region Environmental Agency (ARPAL)], was not able to predict with adequate accuracy, 12–24 h in advance, the observed peak discharge (Fig. 5) that occurred around 2200 UTC at the Passerella Firpo gauging station, near the Genoa city center; the ensemble of DRiFt hydrographs (about 50) falls well below the Q = 500 m3 s−1 (return period T = 10 yr) critical discharge. Similar results hold when using the quantitative precipitation forecast (QPF) provided by the Consortium for Small-Scale Modeling (COSMO) 2.8-km (Baldauf et al. 2011) operational model (not shown). For this reason the event resulted in a missed alert.
Fig. 5. Bisagno at Passerella Firpo (close to the Genoa city center). The operational hydrometeorological suite, composed of the MOLOCH (Buzzi et al. 2014) meteorological model at cloud-permitting grid spacing (2 km), the RainFARM model, and the DRiFt model, was not able to predict, with adequate accuracy, 12–24 h in advance the observed Q discharge; the ensemble of DRiFt hydrographs (about 50, black curves depict the 80% confidence interval) falls well below the Q = 500 m3 s−1 (T = 10 yr) critical discharge.
At the mesoscale, the V-shaped back-building mesoscale convective system (MCS) observed during the events of October 2010 and October and November 2011 over the Liguria region (Cassola et al. 2015; Davolio et al. 2015; Rebora et al. 2013) was also a peculiar characteristic of the Genoa 2014 event. Schumacher and Johnson (2005, 2006, 2008, 2009) describe back-building MCS mechanisms, which have been found to be one of the major patterns characterizing these severe and persistent phenomena (Fiori et al. 2017). The process is an atmospheric setting in which convective cells repeatedly develop upstream of previous ones and pass over the same region; radar signatures of these storms are characterized by a typical V shape (Fig. 6). As already done for the 2011 events (Rebora et al. 2013), a map of the persistence of rainfall intensity exceeding a threshold of 1 mm h−1 over 24 h, obtained from the Italian national radar composite, was produced (Fig. 7); the V-shaped MCS structure is apparent. As for the previous HIWEs over the Liguria region, a fundamental meteorological feature for establishing and maintaining the back-building process in the Genoa 2014 HIWE was the presence of a robust convergence line over the Ligurian Sea (Fig. 8), as detected by the Advanced Scatterometer (ASCAT), a six-beam spaceborne radar instrument designed to measure wind fields over the oceans (Wilson et al. 2010).
Fig. 6. Radar reflectivity images for four different V-shaped back-building MCSs that occurred in the last 5 years over the Liguria region. The selected timing refers to the most intense phase of each event. The bottom-right map concerns the event studied in this paper.
Fig. 7. Percentage of the day (9 Oct 2014) when the rainfall intensity exceeded 1 mm h−1 (Italian national radar composite). The Bisagno River catchment is highlighted (Fiori et al. 2017).
Fig. 8. ASCAT ocean surface wind vectors (25-km resolution) for the Genoa 2014 HIWE (2047 UTC).
The Genoa 2014 event hydrometeorological predictive ability through DRIHM services.
Bearing in mind the ARPAL operational results, and first considering cloud-permitting grid spacing QPF results, the RIBS model has been forced, through the DRIHM portal, by 34 different members of the AROME ensemble at 2.5 km (Hally et al. 2015), initialized at 0000 UTC 9 October 2014. The AROME outputs have been postprocessed in agreement with the MAP procedure and then offered in netCDF Climate and Forecast (CF) format (Harpham and Danovaro 2015) for chaining with the RIBS model. Only one AROME member among the 34 members of the ensemble, namely member 8, has been able to produce a peak discharge above 500 m3 s−1, albeit with a significant error (>12 h) in the timing of the peak with respect to the observed one (Fig. 9, bottom).
Fig. 9. (top) Comparison between the mean rainfall over the Bisagno catchment predicted by the AROME member 8 (black dashed line) and observed (blue heavy dashed line). (bottom) Comparison between the observed discharge, near Genoa city center, at Passerella Firpo (red squares), RIBS-simulated discharge using observed rainfall depth (blue heavy dashed line), and RIBS-simulated discharge using the QPF produced by AROME member 8 (black dashed line).
Moving toward cloud-resolving grid spacing for the QPF results, the RIBS model has then been forced by 10 different members of the Meso-NH ensemble at 500-m grid spacing (Fig. 10), generated in agreement with the procedure described in Hally et al. (2015) and initialized at 0000 UTC 9 October 2014. Almost all of the Meso-NH members driving RIBS produce discharge predictions above 500 m3 s−1, again underestimating the true event and with a timing error, but still very relevant in an operational logic, while some members (e.g., member 9) are even able to predict the double peak discharge (morning and evening) in the observed hydrograph.
Fig. 10. (top) For all Meso-NH members, comparison between the mean rainfall over the Bisagno catchment predicted by each Meso-NH member (member 9 highlighted in heavy dashed green) and observed (blue heavy dashed line). (bottom) Comparison between the observed discharge, near Genoa city center, at Passerella Firpo (red squares), RIBS-simulated discharge using observed rainfall depth (blue heavy dashed line), and RIBS-simulated discharge using the QPF produced by each Meso-NH member (member 9 highlighted in heavy dashed green).
The Meso-NH results are also confirmed by another hydrometeorological chain tested on the DRIHM(2US) platform: WRF-ARW at 1-km and 200-m grid spacing, with the setup defined in Fiori et al. (2014, 2017), feeding into the RIBS model (Fig. 11). The results confirm again, even in deterministic mode, the added value provided by the adoption of grid spacing in the cloud-resolving range. In particular, with the WRF-ARW 1 km as forcing (Fig. 11, bottom, black dashed line), it is possible to predict a discharge peak of around 800 m3 s−1, exhibiting, however, a significant timing error. The situation improves when the 200-m simulation is used: the peak discharge becomes comparable with the observed one, and the temporal offset is reduced by about 3 h (Fig. 11, bottom, green dashed line).
Fig. 11. (top) Comparison between the mean rainfall over the Bisagno catchment predicted by the WRF-ARW at 1 km (black dashed line) and 200 m (green dashed line) and observed (blue heavy dashed line). (bottom) Comparison between the observed discharge at Passerella Firpo (red squares), RIBS-simulated discharge using observed rainfall depth (blue heavy dashed line), and RIBS-simulated discharge using the QPF produced by the WRF-ARW at 1 km (black dashed line) and 200 m (green dashed line).
Similar findings emerge using the WRF-ARW–DRiFt chain (Fig. 12), although it is interesting to notice that, despite the same WRF-ARW input (1 km and 200 m), RIBS and DRiFt produce significantly different peak discharges, which nevertheless correspond to return periods well above 20–30 years, based on regional frequency analysis of annual rainfall and discharge maxima in the Liguria region (Boni et al. 2006, 2007).
Fig. 12. (top) Comparison between the mean rainfall over the Bisagno catchment predicted by the WRF-ARW at 1 km (black dashed line) and 200 m (green dashed line) and observed (blue heavy dashed line). (bottom) Comparison between the observed discharge at Passerella Firpo (red squares), DRiFt-simulated discharge using observed rainfall depth (blue heavy dashed line), and DRiFt-simulated discharge using the QPF produced by the WRF-ARW at 1 km (black dashed line) and 200 m (green dashed line).
The comparison between the different hydrometeorological forecasting chains in both operational and hindcast modes suggests that the adoption of finer grid spacing QPF results (in the cloud-resolving range, WRF-ARW at 1 km and 200 m as well as Meso-NH at 500 m) provides better results in terms of peak discharge and its timing than using cloud-permitting QPF results (the operational COSMO 2.8 km and MOLOCH 2 km as well as AROME 2.5 km in hindcast mode). Figure 13 confirms this statement: the finer the grid spacing (Figs. 13e–g), the better the 24-h radar quantitative precipitation estimate (QPE; Fig. 13a) and its localization over the Bisagno catchment are approximated.
Fig. 13. Comparison between the 24-h radar QPE (a) for 9 Oct 2014 and the 24-h QPF provided by the cloud-permitting simulations [(b) COSMO 2.8 km, (c) MOLOCH 2 km, and (d) AROME 2.5-km member 8] and the cloud-resolving simulations [(e) WRF 1.0 km, (f) Meso-NH 0.5 km, and (g) WRF 0.2 km]. The Bisagno catchment (100 km2) is highlighted.
Altogether, the DRIHM(2US) platform allowed the execution, within a time frame of 4 h, of 48 hydrometeorological workflows, with 15 of them (about 30%) predicting a peak discharge above 500 m3 s−1, thus justifying, in an operational logic, the issuing of an alert.
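A minimal sketch of this kind of ensemble-based alert logic is given below; it is an illustration rather than the operational decision procedure, the 500 m3 s−1 threshold corresponds to the T = 10 yr critical discharge cited above, and the exceedance fraction used to trigger the alert is an assumption.

```python
# Minimal sketch (an illustration, not the operational DRIHM decision code) of
# ensemble-based alert logic: count how many workflows predict a peak discharge
# above the critical threshold and trigger an alert when the exceedance
# fraction is large enough. The 500 m3 s-1 threshold corresponds to the
# T = 10 yr critical discharge cited in the text; the 20% trigger fraction is
# an assumption for illustration only.
CRITICAL_DISCHARGE = 500.0   # m3 s-1
TRIGGER_FRACTION = 0.20      # assumed decision threshold

def issue_alert(peak_discharges, critical=CRITICAL_DISCHARGE,
                trigger=TRIGGER_FRACTION):
    """Return (alert, fraction) given the peak discharge of each workflow."""
    exceed = sum(1 for q in peak_discharges if q > critical)
    fraction = exceed / len(peak_discharges)
    return fraction >= trigger, fraction

# Example consistent with the Genoa 2014 exercise: 15 of 48 workflows above
# threshold gives an exceedance fraction of about 0.31, so an alert is issued.
peaks = [600.0] * 15 + [300.0] * 33
print(issue_alert(peaks))  # (True, 0.3125)
```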
CONCLUSIONS.
DRIHM(2US) represents a promising advancement in hydrometeorological research because it allows researchers and operators to repeat simulations of critical HIWE cases much more rapidly, giving more scientific confidence and allowing more simulations and deeper analysis of the results. Where before HMR chains were often clumsily stitched together and hardwired to individual models, DRIHM(2US) allows a more interoperable and extensible model chain formulation.
The DRIHM(2US) services now make it possible to work in a modular environment and enhance the modeling and data processing capabilities of the HMR community through the adaptation, optimization, and integration of dedicated HMR services over the DRIHM e-infrastructure, which relies on different computing paradigms (HPC, HTC, and cloud computing). By integrating HMR resources, DRIHM allows specialists to enter e-science environments more easily and at the same time stimulates use by nonspecialists.
In general, the DRIHM(2US) innovations work together to enable a step change in how scientists can approach studying HIWE:
DRIHM distributed computing infrastructure: This allows scientists to execute model chains with each model executed on the most appropriate computing resource. These include high-performance computing environments such as PRACE (which provides massively parallel machines) and grid and cloud environments such as those supported by EGI.
DRIHM portal: Supported by the gUSE technology, the DRIHM portal allows users to execute model chains by selecting from sets of meteorological, hydrological, and hydraulic models. Triggering the models and passing data between them is done seamlessly on behalf of the user. Facilities are also included to run ensembles and visualize outputs.
Standards: DRIHM has been built around standards. Many of these relate to environmental numerical models, such as standards for cataloging, coupling, and file formats. In addition to providing invaluable evidence of how standards are applied, DRIHM(2US) has specified particular implementations of standards such as netCDF, WaterML2, and International Organization for Standardization (ISO) 19139.
Interoperability experiments: Ranging from ICT infrastructures to semantic vocabularies, DRIHM(2US) has tested transatlantic interoperability. European numerical models have been coupled to those from the United States, a European workflow engine has accessed computing resources on the Extreme Science and Engineering Discovery Environment (XSEDE) in the United States, and WaterML2 data have been ingested directly into an OpenMI composition via web services.
Model MAP: No new numerical models were written; all of the models used were either established research codes at universities, commercial products, or community models with large user bases. To handle the functional and technical diversity presented, DRIHM(2US) developed the MAP gateway concept: metadata, adaptors, and portability as a route to model standardization and interoperability.
The DRIHM(2US) results for the Genoa 2014 critical case demonstrate the great potential, from a research and potentially from an operational standpoint, of the DRIHM services.
In this sense, a dialogue has been initiated with the Liguria region and the Italian Civil Protection Department authorities, aiming at sharing the key findings and achievements of the project. This culminated in an invitation for the DRIHM(2US) initiative to present its achievements at the Major Risks National Committee meeting held in Rome, Italy, on 23 February 2015, devoted to opening a discussion on improving the predictive ability for HIWEs over complex topographic areas in Italy. The key recommendations provided by DRIHM(2US) and accepted by the committee were to use a multimodel approach in hydrometeorological forecasting chains at cloud-permitting grid spacing (1 km or so) to model HIWEs in complex topographic areas, to recognize the importance of DRIHM(2US)-like platforms for gaining a deeper understanding of these extreme hydrometeorological events, and to recognize the relevance of cloud, grid, and high-performance computing.
ACKNOWLEDGMENTS
This work was supported by the FP7 DRIHM (Distributed Research Infrastructure for Hydro-Meteorology, 2011–15) project (Contract 283568) and by the FP7 DRIHM2US (Distributed Research Infrastructure for Hydro-Meteorology to United States of America, 2012–15) project (Contract 313122). We acknowledge the Italian Civil Protection Department, Regione Liguria, and Regione Piemonte for providing us with the data of the regional meteorological observation networks. The numerical simulations were performed on the SuperMUC Petascale System of the LRZ Supercomputing Centre, Garching, Germany (Project ID: pr45de) and on European Grid Infrastructure (EGI) resources.
REFERENCES
Alexander, L. V., and Coauthors, 2006: Global observed changes in daily climate extremes of temperature and precipitation. J. Geophys. Res., 111, D05109, doi:10.1029/2005JD006290.
Atencia, A., L. Mediero, M. C. Llasat, and L. Garrote, 2011: Effect of radar rainfall time resolution on the predictive capability of a distributed hydrologic model. Hydrol. Earth Syst. Sci., 15, 3809–3827, doi:10.5194/hess-15-3809-2011.
Baldauf, M., A. Seifert, J. Förstner, D. Majewski, M. Raschendorfer, and T. Reinhardt, 2011: Operational convective-scale numerical weather prediction with the COSMO model: Description and sensitivities. Mon. Wea. Rev., 139, 3887–3905, doi:10.1175/MWR-D-10-05013.1.
Bergström, S., 1995: The HBV model. Computer Models of Watershed Hydrology, V. P. Singh, Ed., Water Resources Publications, 443–476.
Boni, G., A. Parodi, and R. Rudari, 2006: Extreme rainfall events: Learning from raingauge time series. J. Hydrol., 327, 304–314, doi:10.1016/j.jhydrol.2005.11.050.
Boni, G., L. Ferraris, F. Giannoni, G. Roth, and R. Rudari, 2007: Flood probability analysis for un-gauged watersheds by means of a simple distributed hydrologic model. Adv. Water Resour., 30, 2135–2144, doi:10.1016/j.advwatres.2006.08.009.
Buzzi, A., S. Davolio, P. Malguzzi, O. Drofa, and D. Mastrangelo, 2014: Heavy rainfall episodes over Liguria in autumn 2011: Numerical forecasting experiments. Nat. Hazards Earth Syst. Sci., 14, 1325–1340, doi:10.5194/nhess-14-1325-2014.
Cassola, F., F. Ferrari, and A. Mazzino, 2015: Numerical simulations of Mediterranean heavy precipitation events with the WRF Model: A verification exercise using different approaches. Atmos. Res., 164–165, 210–225, doi:10.1016/j.atmosres.2015.05.010.
Chappell, D. A., 2004: Enterprise Service Bus. O’Reilly Media, 276 pp.
Charlton-Perez, C., H. L. Cloke, and A. Ghelli, 2015: Rainfall: High-resolution observation and prediction. Meteor. Appl., 22, 1–2, doi:10.1002/met.1496.
Coumou, D., and S. Rahmstorf, 2012: A decade of weather extremes. Nat. Climate Change, 2, 491–496, doi:10.1038/nclimate1452.
D’Agostino, D., E. Danovaro, A. Clematis, L. Roverelli, G. Zereik, A. Parodi, and A. Galizia, 2015: Lessons learned implementing a science gateway for hydro-meteorological research. Concurrency Comput., 28, 2014–2023, doi:10.1002/cpe.3700.
Danovaro, E., and Coauthors, 2014: Setting up an hydro-meteo experiment in minutes: The DRIHM e-infrastructure for HM research. Preprints, 10th Int. Conf. on e-Science, Sao Paulo, Brazil, IEEE, 47–54, doi:10.1109/eScience.2014.40.
Davolio, S., F. Silvestro, and P. Malguzzi, 2015: Effects of increasing horizontal resolution in a convection-permitting model on flood forecasting: The 2011 dramatic events in Liguria, Italy. J. Hydrometeor., 16, 1843–1856, doi:10.1175/JHM-D-14-0094.1.
Delft Hydraulics, 1999: Delft3D-FLOW user manual. Delft Hydraulics, 684 pp.
Fiori, E., A. Comellas, L. Molini, N. Rebora, F. Siccardi, D. J. Gochis, S. Tanelli, and A. Parodi, 2014: Analysis and hindcast simulations of an extreme rainfall event in the Mediterranean area: The Genoa 2011 case. Atmos. Res., 138, 13–29, doi:10.1016/j.atmosres.2013.10.007.
Fiori, E., L. Ferraris, L. Molini, F. Siccardi, D. Kranzlmueller, and A. Parodi, 2017: Triggering and evolution of a deep convective system in the Mediterranean Sea: Modelling and observations at a very fine scale. Quart. J. Roy. Meteor. Soc., 143, 927–941, doi:10.1002/qj.2977.
Garrote, L., and R. L. Bras, 1995: An integrated software environment for real-time use of a distributed hydrologic model. J. Hydrol., 167, 307–326, doi:10.1016/0022-1694(94)02593-Z.
Giannoni, F., G. Roth, and R. Rudari, 2000: A semi-distributed rainfall-runoff model based on a geomorphologic approach. Phys. Chem. Earth, 25B, 665–671, doi:10.1016/S1464-1909(00)00082-4.
Gregersen, J., P. Gijsbers, and S. Westen, 2007: OpenMI: Open modelling interface. J. Hydroinform., 9, 175–191, doi:10.2166/hydro.2007.023.
Hally, A., and Coauthors, 2015: Hydrometeorological multi-model ensemble simulations of the 4 November 2011 flash-flood event in Genoa, Italy, in the framework of the DRIHM project. Nat. Hazards Earth Syst. Sci., 15, 537–555, doi:10.5194/nhess-15-537-2015.
Harpham, Q., and E. Danovaro, 2015: Towards standard metadata to support models and interfaces in a hydro-meteorological model chain. J. Hydroinform., 17, 260–274, doi:10.2166/hydro.2014.061.
Harpham, Q., P. Cleverley, D. D’Agostino, A. Galizia, E. Danovaro, F. Delogu, and E. Fiori, 2015: Using a model MAP to prepare hydro-meteorological models for generic use. Environ. Modell. Software, 73, 260–271, doi:10.1016/j.envsoft.2015.08.007.
Harpham, Q., J. Lhomme, A. Parodi, E. Fiori, B. Jagers, and A. Galizia, 2016: Using OpenMI and a model MAP to integrate WaterML2 and NetCDF data sources into flood modeling of Genoa, Italy. J. Amer. Water Resour. Assoc., 52, 933–949, doi:10.1111/1752-1688.12418.
Harpham, Q., O. Gimeno, A. Parodi, and D. D’Agostino, 2017: A stakeholder consultation into hydro-meteorological e-science environments. Earth Sci. Inf., 10, 219–234, doi:10.1007/s12145-017-0294-6.
Hill, C., C. DeLuca, V. Balaji, M. Suarez, and A. da Silva, 2004: The architecture of the earth system modeling framework. Comput. Sci. Eng., 6, 18–28, doi:10.1109/MCISE.2004.1255817.
Hurrell, J., and Coauthors, 2013: The Community Earth System Model: A framework for collaborative research. Bull. Amer. Meteor. Soc., 94, 1339–1360, doi:10.1175/BAMS-D-12-00121.1.
Janjić, Z., T. Black, M. Pyle, E. Rogers, H. Y. Chuang, and G. DiMego, 2005: High resolution applications of the WRF NMM. 21st Conf. on Weather Analysis and Forecasting/17th Conf. on Numerical Weather Prediction, Washington, DC, Amer. Meteor. Soc., 16A.4. [Available online at https://ams.confex.com/ams/WAFNWP34BC/techprogram/paper_93724.htm.]
Kacsuk, P., G. Terstyanszky, A. Balasko, K. Karoczkai, and Z. Farkas, 2013: Executing multi-workflow simulations on a mixed grid/cloud infrastructure using the SHIWA and SCI-BUS technology. Cloud Computing and Big Data, C. Catlett et al., Eds., Advances in Parallel Computing Series, Vol. 23, IOS Press, 141–160.
Kranzlmüller, D., J. M. de Lucas, and P. Öster, 2010: The European Grid Initiative (EGI). Remote Instrumentation and Virtual Laboratories, Springer, 61–66.
Lafore, J. P., and Coauthors, 1998: The Meso-NH atmospheric simulation system. Part I: Adiabatic formulation and control simulations. Ann. Geophys., 16, 90–109, doi:10.1007/s00585-997-0090-6.
Leong, S. H., and D. Kranzlmüller, 2015: Towards a general definition of urgent computing. Procedia Comput. Sci., 51, 2337–2346, doi:10.1016/j.procs.2015.05.402.
Michalakes, J., J. Dudhia, D. O. Gill, T. B. Henderson, J. B. Klemp, W. Skamarock, and W. Wang, 2004: The Weather Research and Forecast Model: Software architecture and performance. Proc. 11th ECMWF Workshop on the Use of High Performance Computing in Meteorology, Reading, United Kingdom, ECMWF, 156–168.
Min, S.-K., X. Zhang, F. W. Zwiers, and G. C. Hegerl, 2011: Human contribution to more-intense precipitation extremes. Nature, 470, 378–381, doi:10.1038/nature09763.
Parodi, A., G. Boni, L. Ferraris, F. Siccardi, P. Pagliara, E. Trovatore, E. Foufoula-Georgiou, and D. Kranzlmueller, 2012: The “perfect storm”: From across the Atlantic to the hills of Genoa. Eos, Trans. Amer. Geophys. Union, 93, 225–226, doi:10.1029/2012EO240001.
Peckham, S. D., E. W. Hutton, and B. Norris, 2013: A component-based approach to integrated modeling in the geosciences: The design of CSDMS. Comput. Geosci., 53, 3–12, doi:10.1016/j.cageo.2012.04.002.
Rebora, N., L. Ferraris, J. von Hardenberg, and A. Provenzale, 2006: RainFARM: Rainfall downscaling by a filtered autoregressive model. J. Hydrometeor., 7, 724–738, doi:10.1175/JHM517.1.
Rebora, N., and Coauthors, 2013: Extreme rainfall in the Mediterranean: What can we learn from observations? J. Hydrometeor., 14, 906–922, doi:10.1175/JHM-D-12-083.1.
Schiffers, M., and Coauthors, 2011: Towards a grid infrastructure for hydro-meteorological research. Comput. Sci., 12, 45–62, doi:10.7494/csci.2011.12.0.45.
Schumacher, R. S., and R. H. Johnson, 2005: Organization and environmental properties of extreme-rain-producing mesoscale convective systems. Mon. Wea. Rev., 133, 961–976, doi:10.1175/MWR2899.1.
Schumacher, R. S., and R. H. Johnson, 2006: Characteristics of U.S. extreme rain events during 1999–2003. Wea. Forecasting, 21, 69–85, doi:10.1175/WAF900.1.
Schumacher, R. S., and R. H. Johnson, 2008: Mesoscale processes contributing to extreme rainfall in a midlatitude warm-season flash flood. Mon. Wea. Rev., 136, 3964–3986, doi:10.1175/2008MWR2471.1.
Schumacher, R. S., and R. H. Johnson, 2009: Quasi-stationary, extreme-rain-producing convective systems associated with midlevel cyclonic circulations. Wea. Forecasting, 24, 555–574, doi:10.1175/2008WAF2222173.1.
Shapiro, M., and Coauthors, 2007: The socio-economic and environmental benefits of a revolution in weather, climate and Earth system analysis and prediction. The Full Picture, GEO Secretariat, Ed., Group on Earth Observations/Tudor Rose, 136–139.
Shapiro, M., and Coauthors, 2010: An Earth-system prediction initiative for the twenty-first century. Bull. Amer. Meteor. Soc., 91, 1377–1388, doi:10.1175/2010BAMS2944.1.
Shukla, J., R. Hagedorn, M. Miller, T. N. Palmer, B. Hoskins, J. Kinter, J. Marotzke, and J. Slingo, 2009: Strategies: Revolution in climate prediction is both necessary and possible: A declaration at the world modelling summit for climate prediction. Bull. Amer. Meteor. Soc., 90, 175–178, doi:10.1175/2008BAMS2759.1.
Shukla, J., T. N. Palmer, R. Hagedorn, B. Hoskins, J. Kinter, J. Marotzke, M. Miller, and J. Slingo, 2010: Toward a new generation of world climate research and computing facilities. Bull. Amer. Meteor. Soc., 91, 1407–1412, doi:10.1175/2010BAMS2900.1.
Tramblay, Y., L. Neppel, J. Carreau, and K. Najib, 2013: Non-stationary frequency analysis of heavy rainfall events in southern France. Hydrol. Sci. J., 58, 280–294, doi:10.1080/02626667.2012.754988.
Turunen, A., J. P. Nominé, F. Robin, D. Erwin, H. Huber, A. Berg, R. Murri, and A. Simpson, 2010: PRACE: Partnership for Advanced Computing in Europe preparation of a petascale supercomputing infrastructure for European scientists. Numerical Modeling of Space Plasma Flows, Astronum-2009, N. V. Pogorelov, E. Audit, and G. Zank, Eds., Astronomical Society of the Pacific Conference Series, Vol. 429, Astronomical Society of the Pacific, 317.
UNISDR, 2014: Annual report 2014. United Nations Office for Disaster Risk Reduction Rep., 68 pp. [Available online at www.unisdr.org/files/42667_unisdrannualreport2014.pdf.]
Wilson, J. J., and Coauthors, 2010: Radiometric calibration of the advanced wind scatterometer radar ASCAT carried onboard the MetOp-A satellite. IEEE Trans. Geosci. Remote Sens., 48, 3236–3255, doi:10.1109/TGRS.2010.2045763.
WMO, 2014: Atlas of mortality and economic losses from weather, climate and water extremes (1970–2012). WMO-1123, 48 pp.