1. Introduction
Wind power is one of the fastest growing areas of the energy sector, and as the amount of wind generation increases, accurate forecasts of wind power production are needed to integrate this variable resource onto the power grid (Shaw et al. 2009; Marquis et al. 2011; Banta et al. 2013). Numerical weather prediction (NWP) models are a critical part of wind forecasting, so understanding the sources of NWP forecast error is necessary to improve power forecasts. It is well documented that the parameterization of planetary boundary layer (PBL) processes is a significant source of error in near-surface wind speed forecasts and, therefore, in forecast power production (Yang et al. 2013; Draxl et al. 2014; Carvalho et al. 2014; Krogsaeter and Reuder 2015; Gómez-Navarro et al. 2015). A recent project led by the Department of Energy, the Second Wind Forecast Improvement Project (WFIP2), specifically focused on improving the representation of boundary layer processes in models to improve low-level atmospheric wind prediction (Shaw et al. 2019; Wilczak et al. 2019; Olson et al. 2019). One research goal of that project was to quantify the uncertainty of the predicted low-level wind field based on perturbations to the parameters of the Mellor–Yamada–Nakanishi–Niino PBL scheme (MYNN; Nakanishi and Niino 2004, 2006, 2009) currently used in the U.S. operational High-Resolution Rapid Refresh (HRRR) model (Benjamin et al. 2016). Such uncertainty quantification (UQ) could be used to improve the specific parameter configuration in the HRRR modeling system.
Two recent UQ studies investigated the sensitivity of hub-height wind speed forecasts to changes in the value of parameters in the MYNN PBL scheme and found that a small number of parameters are responsible for the majority of the forecast uncertainty. Forecast wind speed was most sensitive to three of the eight parameters investigated in Jahn et al. (2017). Yang et al. (2017) investigated 12 parameters in the MYNN scheme and 14 parameters in the MM5 surface layer scheme and also found that a small number of parameters were responsible for the majority of the forecast variance. Furthermore, they found that parametric sensitivity changed with terrain slope and between nighttime and daytime periods. More details on these studies are presented in section 2a.
WFIP2 focused on improving wind forecasts in complex terrain. The project was centered on Wasco, Oregon, in the northwestern United States because of the large number of nearby wind farms and prominent terrain features such as the Columbia River basin (CRB) and the Cascade Mountains (Fig. 1). Further, WFIP2 specifically examined meteorological phenomena that produce wind ramps, or changes in wind speed, associated with relatively large forecast errors. This study examines wind ramp events in the CRB caused by two meteorological phenomena of interest to WFIP2: warm-season marine push events and cool-season stable mix-outs. Marine push events are characterized by a surge of cool, moist maritime air moving inland and displacing warm, dry continental air, and frequently result in a temperature drop, an increase in humidity, and a change in wind velocity (Mass et al. 1986). Stable mix-out events involve the erosion of cold air in a mountain basin or valley by warm, high-momentum air aloft, which can produce sudden increases in low-level wind speeds (Whiteman et al. 2001; Reeves and Stensrud 2009). The difficulty NWP models have in representing stable mix-out events results from inadequate representation of turbulent mixing processes (Lareau and Horel 2015; Foster et al. 2017; Crosman and Horel 2017).

Fig. 1. Terrain height (in meters) in the Pacific Northwest with the Cascade Mountains and Columbia River basin (CRB) labeled. The purple box indicates the WascoBox, the red dots mark the locations of the surface stations at Wasco (WASC) and Astoria (ASTO), and the yellow dots are individual wind turbines.
The purpose of this study is to build on the work of Yang et al. (2017) by understanding how parametric sensitivity varies across specific wind ramp-producing events, how it changes temporally throughout these events, and whether it varies with horizontal resolution. Another important, and unique, goal is to understand whether parametric sensitivity varies across multiple forecasts of a single event. This has implications for UQ-based parameter tuning on a single forecast, as that forecast may show very different parametric sensitivity from a forecast with slightly different initial conditions, potentially reducing the effectiveness of the tuning. All of these results are expected to contribute to future model improvement efforts, especially since the variability of parametric sensitivity investigated here provides key knowledge for executing such efforts successfully. This study uses four wind ramp events from WFIP2, two marine pushes and two mix-out events, to investigate these questions.
The outline of the remainder of this paper is as follows: section 2 provides a description of the background and methodology, section 3 presents results, section 4 discusses the results, and conclusions are provided in section 5.
2. Background and methodology
a. MYNN parametric sensitivity
The MYNN uses a gradient theory closure scheme, which requires calculating the value of a master mixing length, L, and dimensionless stability functions. The formulation can be found in Eqs. (27)–(51) of Nakanishi and Niino (2009). Jahn et al. (2017) investigated the sensitivity of 110-m wind speed to eight closure constants (A1, A2, B1, B2, C1, C2, C3, and C5) used in calculating the stability functions. Forecasts were found to be most sensitive to A1, B1, and C1. B1 also modulates the dissipation rate of turbulent kinetic energy (TKE), and they note that reducing the value of B1 results in less TKE and mixing, allowing a stronger low-level jet (LLJ) to develop.
Yang et al. (2017) investigated the sensitivity of 80-m wind speeds in the Columbia River basin to 12 parameters in the MYNN scheme and 14 parameters in the MM5 surface layer scheme and found that parametric sensitivity changed with the terrain slope and between nighttime and daytime periods. Parametric sensitivity was assessed using an ensemble of forecasts, with each ensemble member having a unique combination of parameter values. A generalized linear model (GLM) was used to quantify the contribution of each parameter to the ensemble variance. The average daytime and nighttime wind speeds over a month-long period were used to investigate diurnal differences. Yang et al. (2017) ran separate UQ experiments on the PBL parameters and the surface layer parameters.
The PBL experiments included the parameters B1, C3, C5, and γ1 (in the stability function calculations); α1, α2, α3, α4, α5, and β (used in calculating L); as well as the Prandtl number, Pr, and the TKE diffusion factor, Df. Though Yang et al. (2017) did not specify perturbations to A1 and C1, these parameters were varied because they are functions of B1 and γ1. Results from the PBL experiment indicate that β and B1 are responsible for the largest contributions to wind speed variance during the day, with β more important in lower terrain areas and B1 more important at higher elevations. The parameters α4 and α1 are also responsible for significant contributions to wind speed variance during the day. At night, Pr, α5, and B1 are the largest sources of wind speed variance. The surface layer UQ experiment showed that zf (a surface roughness scaling factor) and k (the von Kármán constant) contributed the majority of the total forecast wind speed variance during both daytime and nighttime conditions.
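For orientation, the mixing length parameters listed above enter through the master mixing length, which in the MYNN scheme is a harmonic combination of three component length scales. Schematically, following Nakanishi and Niino (2009) (this is a summary for reference, not the exact perturbed formulation),

$$\frac{1}{L}=\frac{1}{L_S}+\frac{1}{L_T}+\frac{1}{L_B},\qquad L_T=\alpha_1\,\frac{\int_0^\infty q\,z\,dz}{\int_0^\infty q\,dz},$$

where q is the square root of twice the TKE, LS is a surface-layer length scale that depends on kz and ζ, LT is the length scale tied to the depth of the turbulent layer (hence the role of α1 discussed in section 3), and LB is a buoyancy length scale. The parameters α4, α5, and β enter the stability-dependent branches of these component length scales, with β and α4 active in unstable conditions (ζ < 0) and α5 in stable conditions (ζ > 0), as discussed in section 3. The exact expressions are given in Nakanishi and Niino (2009) and Yang et al. (2017).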
b. Experimental setup
Two types of ensembles are used in this study; Fig. 2 shows a diagram of the experimental process. The first type is an initial condition (IC) ensemble that uses the same model physics for each member but varied ICs provided by an ensemble Kalman filter (EnKF) data assimilation system. The second type consists of physics ensembles that assess MYNN parametric sensitivity based on the experimental design of Yang et al. (2017). Selected members from the IC ensemble provide input and boundary files for the physics ensembles. This experimental framework allows for a unique sampling of state space and parameter space; a conceptual diagram of the framework is shown in Fig. 3.

Fig. 2. Diagram showing the steps in the experimental process.

Fig. 3. Conceptual example of the experimental setup spanning both state space and parameter space. The vertical axis represents variations in state space; the black dots represent the individual members of the IC ensemble, with the black line representing the state space spanned by the IC ensemble. The yellow dot with black outline is a central member of the IC ensemble and provides input and boundary files to the physics ensemble represented by the yellow dots; the yellow line represents the parameter space spanned by this ensemble. Similarly, other colors represent other physics ensembles, which sample parameter space throughout different parts of the state space.
1) WRF configuration
Experiments for this study are conducted using WRF ARW version 3.6 (Skamarock et al. 2008) with three domains: 1) an outer domain with 12 km horizontal grid spacing, 2) a nested domain with 4 km grid spacing, and 3) an innermost nested domain with 1.333 km grid spacing centered on the Columbia River valley (Fig. 4). The domains are nested using one-way nesting so that the performance of Domain 2 can be assessed independent of the presence of Domain 3. All domains use 55 vertical levels, matching the setup in Yang et al. (2017). At Wasco, there are 26 vertical levels in the lowest 1 km AGL and 14 levels in the lowest 200 m; the HRRR setup has 9 levels below 1 km and 5 below 200 m.

Fig. 4. The three domains used in this study. Domain 1 is the entire map, and Domains 2 and 3 are shown by the black boxes. The grid spacings for Domains 1, 2, and 3 are 12, 4, and 1.333 km, respectively.
The model physics used in this study were chosen to match those used by the WFIP2 model development team and the operational HRRR model as closely as possible. However, in order to modify surface layer parameters following Yang et al. (2017), the MM5 surface layer scheme (Grell et al. 1994) was used instead of the Eta scheme used by the HRRR. This study uses the Thompson aerosol-aware microphysics scheme (Thompson and Eidhammer 2014), the unified Noah land surface model (Tewari et al. 2004), and the Rapid Radiative Transfer Model for Global Climate Models (RRTMG; Iacono et al. 2008) for both shortwave and longwave parameterizations, with radiation calls every 15 min. The HRRR model does not use a cumulus scheme, so the Tiedtke scheme (Tiedtke 1989; Zhang et al. 2011) was selected for use on the outer domain only. All simulations use the level-2.5 MYNN boundary layer scheme. Forecasts were run using adaptive time stepping with a target CFL of 1.2, and time step ratios of 3 were used between Domains 1 and 2 and between Domains 2 and 3. Experimental runs consisted of 24-h forecasts with hourly output.
2) Initial condition ensemble and data assimilation setup
The 63-member IC ensembles were initialized from GFS forecasts using perturbations drawn from the climatological covariances in the WRF data assimilation package (WRFDA; Barker et al. 2004). Data assimilation was performed with the Data Assimilation Research Testbed (DART; Anderson et al. 2009) ensemble adjustment Kalman filter (EAKF; Anderson 2001) using cloud track wind (CTW), radiosonde, Aircraft Communications Addressing and Reporting System (ACARS), marine, mesonet, and aviation routine weather report (METAR) observations obtained through the Meteorological Assimilation Data Ingest System (MADIS). No observations from the WFIP2 field campaign were assimilated. Observations were assimilated every 6 h, when available, with the exception of radiosonde observations, which were only available at 0000 and 1200 UTC. Covariance localization (Gaspari and Cohn 1999) and inflation (Anderson and Anderson 1999) were applied during the assimilation process using the values listed in Table 1. The system was cycled every 6 h over a 48-h period without extended forecasts to allow for the development of flow-dependent covariances before running extended 24-h forecasts to capture the events of interest. Assimilation was only performed on Domains 1 and 2 to avoid the computational cost of cycling forecasts on Domain 3. Before running the extended forecasts, input files for Domain 3 were generated from the Domain 2 input files using the WRF ndown tool.
Table 1. DART parameters used on Domains 1 and 2.
The lateral boundary conditions (LBCs) for the outer domain came from the NCEP Global Ensemble Forecast System (GEFS). The GEFS currently contains 21 members, so boundary conditions for the full 63-member EnKF ensemble were obtained by time lagging. For a given forecast initialization time, members 1–21 used data from the current GEFS forecast, members 22–42 used data from the previous GEFS forecast (6 h old), and members 43–63 used data from the 12-h-old GEFS forecast.
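As a concrete illustration of this time-lagging scheme, the following minimal Python sketch (not the authors' code; the function name and layout are hypothetical) maps an EnKF member index to the GEFS cycle and GEFS member that supply its LBCs.

```python
from datetime import datetime, timedelta

def gefs_source_for_member(member, init_time, gefs_members=21, lag_hours=6):
    """Map an EnKF member (1-63) to the GEFS cycle and GEFS member that supply
    its lateral boundary conditions: members 1-21 use the current GEFS cycle,
    22-42 the 6-h-old cycle, and 43-63 the 12-h-old cycle."""
    lag_index = (member - 1) // gefs_members        # 0, 1, or 2 lags
    gefs_member = (member - 1) % gefs_members + 1   # 1-21 within that cycle
    gefs_init = init_time - timedelta(hours=lag_hours * lag_index)
    return gefs_init, gefs_member

# Example: member 43 of a 0000 UTC cycle draws from member 1 of the 12-h-old GEFS cycle.
print(gefs_source_for_member(43, datetime(2016, 4, 8, 0)))
```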
Two extended forecasts were run for each event, with the first forecast beginning roughly 18 h prior to the wind ramp, and the second beginning 6 h later. The exception was the 9 April 2016 case, which was a prolonged event with two ramps. For this case the two initializations were 12 h apart instead of 6. The black dots in Fig. 3 represent the different ICs for a given initialization.
3) Selection of parent members for physics experiments
For each EnKF extended ensemble forecast, five physics ensembles are run using ICs and LBCs from individual members of the EnKF ensemble. Several factors are considered when selecting the EnKF "parent" members upon which to base the physics ensembles. The primary criterion is selecting members that span the forecast distribution of 80 m AGL wind speed in Wasco. Secondary factors include the timing and amplitude of the ramp, the pressure difference between the coast and the CRB (for the marine push cases), and the temperature difference between 1000 m AGL and the surface (for the mix-out cases). The same colors are used to represent parent members throughout this study. The five parent members are 1) a low wind speed, or weak ramp, member (in red); 2) a member near the center of the ensemble distribution or that closely matches observed wind speeds (yellow); 3) a high wind speed, or large ramp, member (green); 4) a "bust" forecast that does not produce a ramp (blue); and 5) a member that has an interesting feature in the pressure or temperature gradient plots, or is very similar to another member already selected as a parent member (purple).
4) Physics ensemble setup
Physics ensembles are used to assess the sensitivity of hub-height wind speeds to nine parameters in the MYNN scheme, with ensembles created about the various initial conditions conceptually depicted in Fig. 3. The nine parameters used in this study are listed in Table 2, along with their default values and the range of values used in this study. These parameters were selected based on results from Yang et al. (2017); six boundary layer parameters (B1, Pr, α1, α4, α5, and β) and two surface layer parameters (zf and k) were each responsible for at least 10% of the wind speed variance during the daytime or nighttime and were therefore selected for this study. The final parameter, γ1, is included for its role in modifying the closure constants A1 and C1, which Jahn et al. (2017) found to be the most significant parameters (along with B1, which is already included in this study).
Table 2. Parameters used in this study, with their default values and perturbation ranges.
The range of values for each parameter is the same as in Yang et al. (2017), with the exception of α4. Yang et al. (2017) used a range of 20–100 for α4, but the WFIP2 model development team has experimented with α4 values of 10 (personal communication with Joe Olson), so the α4 range in this study is 10–100. A quasi–Monte Carlo sampling method is used to select the values of the parameters for individual ensemble members (Caflisch 1998; Hou et al. 2012); this is done in Python using the sobol_seq package (Fox et al. 2016). Each physics ensemble contains 81 members. The other parameters (i.e., those aside from the nine varied here) were set to the default values listed in Yang et al. (2017).
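A minimal sketch of this quasi–Monte Carlo draw is given below. It is not the code used in the study: it uses scipy.stats.qmc rather than sobol_seq, and all bounds other than the α4 range of 10–100 are placeholders that would need to be replaced with the values in Table 2.

```python
import numpy as np
from scipy.stats import qmc

# Parameter ranges: only alpha4 (10-100) is taken from the text; all other
# bounds are PLACEHOLDERS to be replaced with the Table 2 / Yang et al. (2017) values.
bounds = {
    "B1":     (0.0, 1.0),   # placeholder
    "Pr":     (0.0, 1.0),   # placeholder
    "alpha1": (0.0, 1.0),   # placeholder
    "alpha4": (10.0, 100.0),
    "alpha5": (0.0, 1.0),   # placeholder
    "beta":   (0.0, 1.0),   # placeholder
    "gamma1": (0.0, 1.0),   # placeholder
    "zf":     (0.0, 1.0),   # placeholder
    "k":      (0.0, 1.0),   # placeholder
}
names = list(bounds)
lo = np.array([bounds[n][0] for n in names])
hi = np.array([bounds[n][1] for n in names])

sampler = qmc.Sobol(d=len(names), scramble=False)
unit = sampler.random(81)          # 81 quasi-random points in [0, 1)^9 (SciPy warns
                                   # that 81 is not a power of 2; harmless here)
values = qmc.scale(unit, lo, hi)   # rescale each column to its parameter range
# values[i, :] holds the parameter set assigned to physics-ensemble member i.
```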
c. Parametric uncertainty quantification
The method for quantifying parametric sensitivity is similar to that used in Yang et al. (2017). A generalized linear model (GLM) is used to assess how much of the wind speed variance is due to variations in a particular parameter's value, using the glm function in R (R Core Team 2017). This is an iterative process involving stepwise linear regression with the wind speed at a single forecast hour as the predictand. Parameters are individually added to the model, and the change in variance explained by the model is attributed to the parameter just added. The final results are normalized by the original forecast variance, so the percentage reduction in variance attributed to each parameter is used as the UQ metric.
The order in which the parameters are added produces slightly different UQ values; differences of up to 10% were observed for some parameters as a result of changing the order of the parameters in the stepwise regression. To account for this, all UQ analyses were performed five times, each with a different parameter order.
The GLM in this study contains only linear and quadratic terms. Interaction terms were investigated, but in most cases their inclusion did not produce much additional reduction in wind speed variance. In some cases interaction terms contributed over 10% of the total variance; however, since this occurred only a minority of the time, they are not included in the analysis. This is the same technique used in Yang et al. (2017).
Parameters that were responsible for at least 10% of the total variance were deemed "significant" parameters, based on the average value of the five UQ regressions. If a parameter unequivocally contributed more variance to the forecast than any other parameter (i.e., no overlap in the range of UQ values), it was considered the "primary" parameter at that time.
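A minimal sketch of this attribution bookkeeping is given below. The study used R's glm; the version here uses ordinary least squares in Python purely to illustrate the procedure (incremental variance explained per parameter, averaged over several parameter orderings, with the 10% significance threshold). The function names are hypothetical, and random orderings stand in for whatever five orders were actually used.

```python
import numpy as np

def variance_attribution(params, wspd, order):
    """Incremental variance explained by adding each parameter (linear +
    quadratic terms) to a least-squares model, in the given order.
    params: (n_members, n_params) array; wspd: (n_members,) wind speeds."""
    n = len(wspd)
    X = np.ones((n, 1))                       # start from an intercept-only model
    total_var = np.var(wspd)
    prev_r2 = 0.0
    contrib = {}
    for j in order:
        X = np.column_stack([X, params[:, j], params[:, j] ** 2])
        coef, *_ = np.linalg.lstsq(X, wspd, rcond=None)
        resid = wspd - X @ coef
        r2 = 1.0 - np.var(resid) / total_var
        contrib[int(j)] = r2 - prev_r2        # variance newly explained by parameter j
        prev_r2 = r2
    return contrib

def average_over_orders(params, wspd, n_orders=5, seed=0):
    """Average the attribution over several parameter orderings and flag
    'significant' parameters (>= 10% of the total variance)."""
    rng = np.random.default_rng(seed)
    n_par = params.shape[1]
    runs = [variance_attribution(params, wspd, rng.permutation(n_par))
            for _ in range(n_orders)]
    mean_contrib = {j: np.mean([r[j] for r in runs]) for j in range(n_par)}
    significant = {j: c for j, c in mean_contrib.items() if c >= 0.10}
    return mean_contrib, significant
```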
d. Wind metric
The WFIP2 observational network was centered on Wasco, Oregon, because of the large number of wind farms in the area. The primary metric in this study is the average 80 m AGL wind speed in a 28 km × 32 km box centered on Wasco (Fig. 1), referred to as the "WascoBox" for the remainder of this study.
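For illustration, a minimal sketch of this metric is given below (not the code used in the study). It assumes u and v have already been interpolated to 80 m AGL (e.g., with wrf-python's interplevel), selects grid points inside the box with a simple flat-Earth distance approximation, and uses approximate Wasco coordinates; which box dimension is east–west versus north–south is an assumption here.

```python
import numpy as np

def wascobox_mean_speed(u80, v80, lat2d, lon2d,
                        center=(45.6, -120.7),       # approximate Wasco location
                        half_ew_km=14.0, half_ns_km=16.0):
    """Average 80 m AGL wind speed over a 28 km x 32 km box centered on Wasco.
    u80, v80, lat2d, lon2d are 2D arrays on the model grid."""
    km_per_deg_lat = 111.0
    km_per_deg_lon = 111.0 * np.cos(np.deg2rad(center[0]))
    dx = (lon2d - center[1]) * km_per_deg_lon        # east-west distance (km)
    dy = (lat2d - center[0]) * km_per_deg_lat        # north-south distance (km)
    in_box = (np.abs(dx) <= half_ew_km) & (np.abs(dy) <= half_ns_km)
    return np.hypot(u80, v80)[in_box].mean()
```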
e. Case overviews
Two marine push events and two mix-out events were analyzed; however, for brevity, a detailed analysis of only one event of each type is presented. These events were selected from a catalogue of ramp events created during WFIP2 as cases that are highly important to the wind industry and had poor forecast skill [Atmosphere to Electrons (A2e) 2015]. This study provides a detailed analysis of individual ramp cases, while other WFIP2 papers cover a breadth of ramp events (Djalalova et al. 2019, manuscript submitted to Wea. Forecasting; Banta et al. 2019, manuscript submitted to Mon. Wea. Rev.; McCaffrey et al. 2019).
1) 9 April 2016 marine push
On 9 April 2016, a marine push produced a strong up-ramp at Wasco. Figure 5 shows 81 m AGL wind speed and direction observations from a profiling radar in Wasco (top) as well as the sea level pressure (SLP) at Wasco and on the coast at Astoria, and the difference between the two (bottom). At 1200 UTC 8 April, SLP in Wasco began to drop, producing an inland-directed pressure gradient. Shortly after 0400 UTC 9 April, the marine push arrived in Wasco, as indicated by the shift in wind direction from easterly to westerly and a corresponding increase in wind speed. The wind speed continued to increase over the next 9 h, peaking at greater than 12 m s−1. From 1500 to 2000 UTC winds remained near 10 m s−1, followed by a second increase in wind speed from 2100 UTC 9 April to 0100 UTC 10 April, with a maximum wind speed in excess of 17 m s−1.

Fig. 5. (top) Wind speed and wind direction observations and (bottom) sea level pressure observations for the 9 Apr marine push event. The wind speed and direction observations are from the lowest range gate (bin center of 81 m AGL) of the radar wind profiler located at the Wasco Airport. The sea level pressure observations are from the mesonet stations in Astoria and Wasco (see Fig. 4 for station locations). The dashed black line shows Wasco SLP subtracted from the Astoria SLP (right axis).
2) 18 January 2017 mix-out event
On 18 January 2017 there was a partial mix-out of a cold pool in the CRB. The mix-out stalled at about 200 m AGL over Wasco; however, real-time models forecast an increase in rotor-layer wind speeds that did not occur, producing significant forecast error and an overestimation of wind generation. Figure 6 shows the temperature, wind speed, and wind direction profiles at the Wasco airport spanning this event. During 17 January, a strong cold pool was in place, with surface temperatures 15°C colder than temperatures at 1000 m AGL. Within the cold pool there were weak (<7 m s−1) easterly winds, while above 600 m winds were 10–15 m s−1 and out of the south to southwest. Throughout 17 January, the top of the cold pool remained between 600 and 700 m AGL. Shortly before 0000 UTC 18 January there was an increase in wind speed aloft, with winds at 1000 m exceeding 20 m s−1. The increased wind shear produced turbulent mixing, which began to erode the cold pool, resulting in a gradual descent of the warm, high-momentum air. Between 1000 and 2000 UTC there was a rapid erosion of the cold pool as the warm, high-momentum layer descended before stalling out at approximately 200 m AGL. By 0000 UTC 19 January, there was a cooling of the air 600–1600 m AGL, which weakened the capping layer and ended the cold pool.

Fig. 6. Time–height plots of (top) temperature, (middle) wind speed, and (bottom) wind direction from the Wasco radiometer and radar wind profiler during the 18 Jan cold pool mix-out event.
3. Results and discussion
a. 8 April UQ analysis
The WascoBox wind speeds for the EnKF ensemble initialized at 1800 UTC 8 April are shown in Fig. 7, along with observations from the radar and sodar deployed at the Wasco airport. The ensemble distribution encompasses the observed wind speed for the final 20 h of the forecast, and many individual ensemble members produce a wind ramp. The EnKF ensemble produces a large range of ramp magnitudes, so members 55 ("Large Ramp"), 9 ("Central"), 17 ("Weak Ramp"), 62 ("No Ramp"), and 6 ("Analog") are used as parents for the physics experiments. The Central member closely matches the observed wind speeds throughout the ramp event, while the Analog member was selected because of its similarity to the Central member. The physics ensembles are shown in the lower panels of Fig. 8.

Fig. 7. Time series of the WascoBox wind speed for all members of the EnKF ensemble initialized at 1800 UTC 8 Apr. The dashed lines show 80 and 81 m AGL wind speed observations from a sodar and profiling radar, respectively, deployed at the Wasco airport. The ensemble mean is indicated by the dark blue, dot–dashed line. The five solid, colored lines indicate the parent members used in subsequent physics experiments.

Fig. 8. WascoBox wind speed for the (top) EnKF ensemble and the five physics ensembles initialized at 1800 UTC 8 Apr. The colored, bold line in each of the physics plots shows the wind speed from the parent member that provided the ICs and LBCs for that physics ensemble and is included for reference.
Parametric sensitivity results for the 1800 UTC 8 April case are shown in Fig. 9. The time series on the left indicates the percentage of the total variance attributed to a specific parameter throughout the forecast. The envelope for each parameter indicates the maximum and minimum of the five regression orders, while the solid line shows the mean of the five regressions. The colored bars in the histograms on the right display the number of hours in the forecast that a particular parameter is "significant," while the black bars show the number of hours that a parameter is the "primary" parameter, based on the criteria outlined in section 2c.

Fig. 9. (left) Time series of the percentage of Domain 3 WascoBox wind speed variance attributed to individual parameters for the 1800 UTC 8 Apr physics ensembles. The envelopes show the maximum and minimum values of the five UQ regressions. (right) The number of hours in the forecast a parameter is significant (colored bars) and the number of hours in the forecast a parameter is the primary parameter (black bars). The initial time (forecast hour zero) is omitted as there is no variance at that time.
The inspection of several boundary layer variables can help with the interpretation of the parametric sensitivity. Figure 10 shows the static stability, the dimensionless height (ζ = z/LM, where LM is the Monin–Obukhov length and z is height), PBL height, WascoBox wind speed, TKE, and the TKE tendencies for the dissipation, buoyancy, and shear terms. All quantities are valid at the Wasco airport with the exception of the WascoBox wind speed. Other than ζ and the TKE tendency terms, all values are the mean of the physics ensembles; ζ and the TKE tendency are from the parent EnKF member.
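For reference, ζ follows the standard Monin–Obukhov definition,

$$\zeta=\frac{z}{L_M},\qquad L_M=-\frac{u_*^{3}\,\overline{\theta_v}}{\kappa\,g\,\overline{(w'\theta_v')}_s},$$

where u* is the friction velocity, θv the virtual potential temperature, κ the von Kármán constant, g gravity, and (w′θv′)s the surface kinematic buoyancy flux; ζ < 0 indicates unstable and ζ > 0 stable conditions (here evaluated at z = 80 m AGL). The exact expression evaluated by the model's surface layer scheme may differ in detail.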

Fig. 10. Boundary layer variables for the 1800 UTC 8 Apr ensembles. (top left) Static stability at the Wasco Airport for 40–120 m AGL (solid lines) and 2–1000 m AGL (dotted lines); (middle left) ζ at 80 m AGL at the Wasco Airport, where values above 3 and below −3 have been clipped so that the vertical axis focuses on values near 0; and (bottom left) PBL height at the Wasco Airport. (top right) Mean WascoBox wind speed; solid lines show the mean and shaded regions show the interquartile range for each UQ ensemble. (middle right) TKE at the Wasco Airport; solid lines show the mean and shaded regions show the interquartile range for each UQ ensemble. (bottom right) Tendency of the dissipation (solid line), buoyancy (dotted line), and shear (dashed line) terms in the TKE budget at the Wasco Airport. The values of ζ and the TKE budget terms are from the parent members; all other plots are from the UQ ensembles.
The transition to the nocturnal boundary layer is marked by several important changes in sensitivity. The first 6 h of the forecast correspond to the afternoon (local time), with the nocturnal transition occurring between 0100 and 0200 UTC. This transition is marked by changes in rotor-layer (40–120 m) stability, ζ, and PBL height (Fig. 10). Before 0100 UTC, α1 is the primary parameter for at least 1 h in each ensemble. α1 regulates the mixing length associated with the PBL height, and as the PBL height shrinks between 0200 and 0500 UTC the sensitivity to α1 decreases in each ensemble. Another significant parameter during the afternoon period is β, which impacts calculations of L during unstable conditions, when ζ is negative. As ζ moves from unstable to stable (negative to positive), the sensitivity to β drops. After 1500 UTC, the daytime boundary layer redevelops and sensitivity to β increases in the Central, Analog, and Large Ramp ensembles.
In stable conditions, when ζ is positive, α5 influences L. In Fig. 9, α5 is significant for 6 or more hours in three ensembles (the Central, Analog, and Large Ramp ensembles), but the timing of the sensitivity is important. The Large Ramp ensemble has a sharp increase in sensitivity to α5 between 0400 and 0600 UTC. This is shortly after the transition to stable, nighttime conditions, but is also the same time the wind ramp occurs. The Central and Analog ensembles have increases in α5 sensitivity near 0900 UTC, which is when the wind ramp occurs in these ensembles. This timing suggests that the sensitivity to α5 is related to the wind ramp event and not simply the diurnal cycle. The fact that the two remaining physics ensembles have little to no wind ramp and low sensitivity to α5 strengthens this conclusion. The No Ramp ensemble has no significant sensitivity to α5 and no wind ramp, while the Weak Ramp ensemble has weak sensitivity to α5 at 1400 and 1500 UTC, which corresponds to the peak of the small wind ramp in this ensemble.
The relationship between the arrival of the wind ramp and the parametric sensitivity is the result of changes in the state of the PBL. The Large Ramp, Central, and Analog ensembles have an increase in shear-generated TKE after the wind speed increase and, therefore, more boundary layer mixing. As a result, there are changes to the PBL structure in these ensembles, which include an increase in the PBL height, a decrease in ζ, and an increase in TKE. Because of the increased boundary layer mixing, the value of the mixing length is more important in the ensembles with stronger ramp events than in ensembles with a weak ramp or no ramp. However, the stable environment means that the relevant parameter in the mixing length (LS) calculations is α5, and not β or α4.
The timing of the ramp also coincides with several other changes in sensitivity. There is increased sensitivity to the surface roughness scaling factor, zf, after the ramp event in the Central, Analog, and Large Ramp ensembles. This is likely a result of the role surface roughness plays in generating TKE from wind shear, as these ensembles have the highest postramp wind speeds. Sensitivity to B1 grows after the wind ramp in the Large Ramp ensemble, and B1 remains a significant parameter for the remainder of the forecast (with the exception of 1 h). B1 is related to the TKE dissipation rate; there is an increase in the TKE dissipation rate after the ramp, which explains the significant sensitivity to B1 at this time. The Central and Analog ensembles also have an increase in B1 sensitivity during the ramp event. Curiously, B1 sensitivity in these ensembles decreases after the ramp event despite a growth in TKE dissipation (though B1 is significant from 1500 to 1700 UTC in the Analog ensemble). This demonstrates that ensembles with similar boundary layer states can have different parametric sensitivities.
The sensitivities of the Weak Ramp and No Ramp ensembles are quite different from those of the three ensembles that produce strong ramp events. The No Ramp ensemble is largely sensitive to Pr after 0300 UTC. The lack of a marine push, along with the low PBL height and high ζ values overnight, suggests that the No Ramp ensemble experiences a more typical night. The strong sensitivity to Pr during a typical night agrees with Yang et al. (2017), who found Pr to be the most important nighttime parameter during the month of May.
The Weak Ramp ensemble has a gradual increase in wind speed from 0700 to 1400 UTC. During this time, the ensemble is mostly sensitive to B1 and γ1 and has a low TKE dissipation rate. After 1400 UTC there is a decrease in wind speed and an increase in TKE and the TKE dissipation rate. This corresponds to a drop in sensitivity to B1 and γ1 and a sharp increase in sensitivity to Pr. Since B1 influences the TKE dissipation rate, it is interesting that the sensitivity to B1 is high when TKE dissipation is low and that, when the TKE dissipation rate increases at 1500 UTC, B1 is no longer a significant parameter. B1, along with γ1, impacts the values of the closure constants A1 and C1, which are in turn used to calculate the stability functions. Therefore, the sensitivity to B1 in the Weak Ramp ensemble is more likely due to its impact on the stability functions than to the TKE dissipation rate. Jahn et al. (2017) previously demonstrated the sensitivity of wind speed forecasts to A1 and C1.
Overall, β, α1, k, B1, and Pr are significant parameters for at least 6 h in every ensemble. zf is significant for 5 h in the No Ramp ensemble, and for more than 6 h in the other four ensembles. γ1 is significant in every ensemble, while α5 and α4 are significant in four of the ensembles.
These results show that there are notable sensitivity differences between physics ensembles. Many of these differences are the result of different boundary layer states. For the first 9 h of the forecast, the ensembles have similar PBL characteristics including similar PBL heights, TKE, WascoBox wind speeds, rotor layer stability and negative values of ζ. At this time all the ensembles are primarily sensitive to α1, β, and k. After 0300 UTC, differences in the PBL state develop in response to the marine push, and this is reflected in different parametric sensitivities. The two ensembles with the most similar wind speed forecasts and boundary layer states, the Central and Analog ensembles, have the most similar parametric sensitivities, though there are still differences between the two (e.g., sensitivity to B1 between 1500 and 1700 UTC). Not all of the sensitivity differences are easily explained by the PBL values in Fig. 10, and it is beyond the scope of this study to investigate the reason for every difference. These differences are not random variations since many of them have clear meteorological interpretations as described above.
b. 18 January UQ analysis
There is a smaller spread in ramp magnitudes for the 18 January EnKF ensemble (Fig. 11), when compared to the 8 April EnKF ensemble, so wind speeds and surface temperature are also used to select Member 19 (“Slow”), Member 26 (“Central”), Member 56 (“Fast”), Member 7 (“Cold”), and Member 57 (“Early Ramp”) as parents for physics ensembles (Fig. 12). The Cold member has the lowest surface (2 m) temperatures in the ensemble.

Fig. 11. Time series of the WascoBox wind speed for all members of the EnKF ensemble initialized at 0000 UTC 18 Jan (same color convention for the members used to force the physics ensembles as in Fig. 7).

Fig. 12. WascoBox wind speed for the (top) EnKF ensemble and the five physics ensembles initialized at 0000 UTC 18 Jan (same color convention for the members used to force the physics ensembles as in Fig. 8).
Figures 13 and 14 show the 0000 UTC 18 January parametric sensitivity results and PBL plots. In each of the five ensembles, B1 and γ1 are significant for more than 12 h, and Pr is significant for more than 6 h. However, there are differences in when these parameters are important. The relative importance of Pr, B1, and γ1 alternates throughout the Slow ensemble; each is the primary parameter at least twice during the forecast, but never for more than 3 h at a time. In the Fast and Early Ramp ensembles, B1 is the primary parameter, and γ1 a significant parameter, for most of the forecast. However, in the Early Ramp ensemble there is little sensitivity to B1 and γ1 after 1700 UTC, while Pr is the primary parameter for the majority of the remaining hours.

Fig. 13. Sensitivity of Domain 3 WascoBox wind speed to physics parameters for the 0000 UTC 18 Jan physics ensembles (same layout and color convention as in Fig. 9).

Fig. 14. Boundary layer variables for the 0000 UTC 18 Jan ensembles (same layout and color convention as in Fig. 10).
Low TKE dissipation rates, combined with significant sensitivity to B1 and γ1, suggest that the sensitivity to B1 acts through the stability functions rather than through the TKE dissipation rate. However, results in the Fast ensemble suggest that the sensitivity to B1 comes from its impact on both the stability functions and the TKE dissipation rate. Before 1200 UTC, B1 and γ1 are significant parameters and the TKE dissipation rate is low. This is consistent with the interpretation that B1 influences the forecast through the stability functions. Between 1700 and 1800 UTC there is a spike in the TKE dissipation rate, an increase in sensitivity to B1, and a decrease in the sensitivity to γ1. The low sensitivity to γ1 suggests that variations in the stability functions do not impact the forecast at this time. This, along with the large TKE dissipation rates, implies that the strong sensitivity to B1 at 1700 and 1800 UTC acts through the TKE dissipation rate. This highlights the difficulty in understanding the details of each parameter's importance, as it can rarely be explained by a single process. It also reinforces the conclusion that differences between the states of the physics ensembles can produce significant variations in the parametric sensitivity of each ensemble.
Overall, B1, Pr, and γ1 are the most important parameters in this forecast. B1 is significant for more than 15 h, and is the primary parameter for at least 6 h, in every ensemble. γ1 is significant for at least 12 h in every ensemble. β and α4 are not significant parameters in any ensemble because of the stable conditions during the forecast period.
c. Model resolution
Results presented up to this point have been from Domain 3 (1.333 km grid spacing). Figure 15 shows UQ results for the 0000 UTC 18 January ensembles from Domain 2 (4 km grid spacing). Comparing Figs. 13 and 15 reveals that while many of the broad sensitivity patterns are the same between the two domains, some notable differences exist. In the Domain 3 UQ plot for the Fast ensemble, γ1 is a significant parameter for 20 h, while in the Domain 2 results γ1 is significant for only 5 h. α5 is significant for 11 h on Domain 2, but is insignificant on Domain 3. Though B1 is the primary parameter for most of the forecast on both domains, there are notable differences in the magnitude of the sensitivity. Between 1500 and 1800 UTC there is a dip in the amount of variance attributed to B1 on both domains. This drop is more pronounced in the Domain 2 results, where B1 is responsible for less than 20% of the forecast variance at 1600 UTC, compared with more than 40% on Domain 3 at the same time. While other examples exist, these are sufficient to demonstrate that the sensitivity to individual parameters is a function of the model resolution.

Fig. 15. Sensitivity of Domain 2 WascoBox wind speed to physics parameters for the 0000 UTC 18 Jan physics ensembles.

Fig. 16. Percentage of hours a parameter was significant (colored bars) and the primary parameter (black bars) for each ramp event. The April and July events are marine pushes and the January and December cases are mix-outs.
Variations in parametric sensitivity also have implications for the model improvement process. Sensitivity differences between Domain 2 and Domain 3 indicate that the sensitivity to individual parameters is a function of grid spacing; therefore, any work on parameter optimization is resolution specific. The sensitivity differences between Domain 2 and Domain 3 tend to be much smaller than the sensitivity differences between different ensemble members. Therefore, if future sensitivity studies are limited by computational resources, using a coarser model resolution with more IC sources would likely provide the best assessment of parametric sensitivity.
d. Common sensitivity features
Despite the sensitivity differences that exist between physics ensembles, some generalizations can be made about the relevance of the parameters in each case. For each case, the number of hours a parameter was significant in each of the five ensembles was summed. The result is the percentage of hours a parameter was significant at that initialization time, shown in Fig. 16 by the colored bars. The black bars show the percentage of hours a parameter was the primary parameter.
Of the nine parameters used in this study, B1 is most frequently a significant parameter and most frequently the primary parameter. The importance of B1 across a range of cases, and stability conditions, is consistent with the results of Yang et al. (2017), who attributed this to the role B1 plays in regulating the TKE dissipation rate. Some results in this study agree with this explanation, as there are periods when increased sensitivity to B1 coincides with changes in the TKE dissipation rate. However, there are also periods when the TKE dissipation rate does not explain the sensitivity to B1. At these times, γ1 is often a significant parameter. It is suspected that, during times with low TKE dissipation rates and significant sensitivity to γ1, the importance of B1 is due to its influence on the stability functions rather than its influence on the TKE dissipation rate.
After B1, Pr is the parameter that is most frequently significant. Yang et al. (2017) found that Pr was responsible for more wind speed variance during the night than any other parameter. The frequent sensitivity to Pr during the winter cases is consistent with this finding, as conditions are stable during the majority of these forecasts. However, Pr is also an important parameter during the marine push cases, and in some members is the primary parameter during the daytime hours. This is in contrast to Yang et al. (2017), who found Pr to be one of the least relevant parameters during the day. Additional work is needed to fully understand what is clearly an important parameter.
The importance of γ1 in this study is in contrast to the results of Yang et al. (2017), who found γ1 to be one of the least important parameters. The PBL ensemble in Yang et al. (2017) included γ1, C3, and C5, all of which impact the stability functions. C3 accounted for roughly 5% of the variance in Yang et al. (2017), while C5 was practically irrelevant. As C3 and C5 are not included in this study, all variations in the stability functions are due to γ1 and B1, which likely increases the importance of γ1 in regulating the stability functions here.
There are notable differences in sensitivity between the marine push and mix-out cases. β, α4, and α1 are more frequently significant during the marine push forecasts than in the mix-out forecasts. β and α4 are only relevant during unstable conditions, while α1 impacts the mixing length associated with the PBL height. These results show that the sensitivity to individual parameters changes between marine push cases and winter cases. Furthermore, there are differences between the April and July cases, demonstrating that parametric sensitivity varies between similar cases. As with the other changes in sensitivity discussed in this study, variations in the atmospheric state between the April and July cases, and between the marine push and winter cases, are the source of the sensitivity differences.
It is beyond the scope of this article to provide a detailed comparison of parametric sensitivity and IC sensitivity. However, because Figs. 8 and 12 suggest that wind ramps are more sensitive to IC variations than to physics perturbations, the topic is worth mentioning briefly. In both marine push cases, the EnKF ensemble has more wind speed variance than any of the physics ensembles. There is little variation in the timing and amplitude of the ramp event within the physics ensembles, with the exception of two physics ensembles from the July marine push (not shown). In the mix-out cases, wind speed variance in the EnKF ensemble is again larger than in most of the physics ensembles, though there are periods when the variance of one or two physics ensembles exceeds that of the EnKF ensemble. Results from these four cases suggest that wind ramp forecasts are more sensitive to IC uncertainty than to physics uncertainty; however, more work is needed to understand the relative importance of uncertainty sources and their impacts on predictability.
4. Discussion
There are several points worth highlighting in these results. First, within a single physics ensemble there are variations in sensitivity over the course of the forecast. Some of these changes are driven by diurnal cycles and demonstrate that changes in parametric sensitivity are related to the PBL state. Other temporal changes are related to the occurrence of the wind ramp. As with the diurnal sensitivity cycles, these changes in sensitivity are related to the state of the ensemble. In the April case, the arrival of the marine push produces increased wind speeds, a growth in the PBL height, and higher TKE dissipation rates. The increased sensitivity to α1 is likely related to the change in PBL height, while changes in sensitivity to B1 can be explained by the higher TKE dissipation rates. The sensitivity to zf is related to the higher wind speeds and the role of surface roughness in generating mixing, via wind shear, in stable conditions. The important conclusion is that as the state of the atmosphere changes throughout the day, and in response to meteorological events, the sensitivity to specific parameters changes as well.
A second takeaway is that parametric sensitivity changes between physics ensembles. No two physics ensembles in this study have the same parametric sensitivity. The Central and Analog members were selected from the 1800 UTC 8 April EnKF ensemble in large part because of the similar forecasts they produced. While the physics ensembles that use these two parent members share many sensitivities, there are notable differences. The Analog physics ensemble has a spike in sensitivity to zf when the wind ramp begins at 0600 UTC and an increase in sensitivity to B1 starting at 1500 UTC, when there is a peak in the TKE dissipation rate. The Central physics ensemble produces very similar wind speeds and TKE dissipation rates to the Analog ensemble, but it lacks both of these sensitivity features. The UQ results from the Central and Analog ensembles are two of the most similar in this study; the differences in parametric sensitivity between other pairs of physics ensembles are typically much larger.
It is not surprising that variations in the state affect the sensitivity to the PBL parameterization. The MYNN scheme is designed to treat stable and unstable conditions differently, so the fact that parametric sensitivity changes with stability is reassuring. This is consistent with Yang et al. (2017), who showed large sensitivity differences between daytime and nighttime periods. While the contrast between stable and unstable conditions is an extreme case, what is novel about this study is the demonstration that dramatic sensitivity differences exist between multiple forecasts of the same event.
This result has important consequences for future sensitivity studies and model development work. To make improvements to parameterization schemes, it is necessary to have an understanding of how uncertainties in the scheme impact a forecast. Sensitivity studies are an effective method of identifying the most important uncertainties. However, it is clear that the sensitivity to a parameter is not consistent across forecasts, even between similar forecasts of the same event, such as the members of an EnKF ensemble. In the context of an IC ensemble this means that changing the physics parameterization to improve one member will not necessarily improve all members, and could possibly degrade the forecast in some members. The same can be said for two deterministic forecasts run with different ICs, such as different global models. Beyond the context of ensemble forecasting, this result broadly suggests that variations in parametric sensitivity exist between forecasts of similar event types. Therefore, changes to the model physics to improve one marine push forecast will not necessarily improve all marine push forecasts. The sensitivity differences between the April and July cases support this interpretation.
This also highlights the difficulty of separating IC error from physics error in an NWP model. If parametric sensitivity were consistent throughout the EnKF ensemble, it would be possible to isolate the error resulting from the physics parameterizations. It is clear, however, that uncertainties in the initial state affect how the physics parameterizations influence wind speed forecasts. This means that future sensitivity studies would benefit from incorporating multiple IC sources into their experimental design to more fully sample parametric sensitivity throughout state space. Because physics uncertainty cannot be isolated from IC error, future efforts at model improvement should consider IC error throughout the development process, and potentially as part of the model itself.
5. Conclusions
This study developed an ensemble framework that allowed for a novel sampling of state space and PBL parameter space. It was used to investigate two ramp-causing phenomena in the Columbia River basin (CRB): marine pushes and cold pool mix-out events. An EnKF data assimilation system was used to produce an initial condition (IC) ensemble using standard physics parameterizations. Five members of the IC ensemble were selected as parent members to provide ICs and lateral boundary conditions to the physics experiments. The values of parameters in the MYNN PBL scheme and the MM5 surface layer scheme were perturbed in a systematic way to evaluate the parametric sensitivity of the parent members. Parametric sensitivity was assessed by the wind speed variance in the physics ensembles and through UQ analysis.
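The systematic perturbation of parameter values can be illustrated with a quasi-Monte Carlo sample of parameter space, consistent with the Sobol-sequence software cited in the references (Fox et al. 2016; Caflisch 1998). The sketch below uses SciPy's quasi-Monte Carlo module, and the parameter names and ranges are illustrative assumptions rather than the values used in this study.

```python
# Hypothetical sketch of quasi-Monte Carlo (Sobol sequence) sampling of
# PBL/surface-layer parameter space. Parameter names and ranges below are
# illustrative only, not the configuration used in this study.
import numpy as np
from scipy.stats import qmc

# Illustrative (lower, upper) ranges for a few MYNN constants.
ranges = {"B1": (16.0, 32.0), "gamma1": (0.2, 0.3), "alpha1": (0.2, 0.4)}
lower = np.array([lo for lo, _ in ranges.values()])
upper = np.array([hi for _, hi in ranges.values()])

# Draw a low-discrepancy Sobol sample and scale it to the parameter ranges;
# each row defines the parameter set for one physics-ensemble member.
sampler = qmc.Sobol(d=len(ranges), scramble=False)
unit_sample = sampler.random_base2(m=5)          # 2**5 = 32 members
members = qmc.scale(unit_sample, lower, upper)

for i, row in enumerate(members[:3]):
    print(f"member {i}:", dict(zip(ranges, np.round(row, 3))))
```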
The major finding in this work is that parametric sensitivity varies with IC member, over time, between cases, and with model resolution. The sensitivity of wind speed forecasts to variations in the values of the physics parameters differs between members of the EnKF ensemble. Even members with similar wind speed forecasts are shown to have markedly different parametric sensitivities. These variations in sensitivity are the result of differences in the atmospheric state between ensemble members. Parametric sensitivity is also shown to vary over the course of the forecast. Some changes are due to the diurnal cycle, while others are associated with meteorological events, including wind ramps. There are large sensitivity differences between the marine push and mix-out cases, as well as smaller differences between cases of the same event type. Finally, comparisons of the parametric sensitivity of wind speeds from the two model domains reveal that sensitivity changes with model resolution.
These results indicate that any improvements to physics parameters are specific to the ICs, model configuration and case. Modifying the model physics to improve one forecast could degrade forecasts during other cases, for different event types, or even in members of the same ensemble. This suggests that an ideal set of model parameters for all wind forecasts does not exist, and that efforts at model improvement should not focus on errors in the physics scheme alone. The difficulty of isolating physics error from IC error means that they should both be considered when making model improvements, as well as during the forecasting process. It is our hope that these results can lead to forecast improvements that allow for flexible model configurations in both space and time.
Acknowledgments
We thank Ben Yang for help in understanding and reproducing his UQ methodology. We also thank the Department of Energy and Vaisala for funding and coordinating WFIP2, as well as the individuals who collected the observational data, compiled the event log, and managed the A2e Data Archive and Portal. Computing resources were provided by the Texas Tech High Performance Computing Center and the Texas Advanced Computing Center.
REFERENCES
Anderson, J. L., 2001: An ensemble adjustment Kalman filter for data assimilation. Mon. Wea. Rev., 129, 2884–2903, https://doi.org/10.1175/1520-0493(2001)129<2884:AEAKFF>2.0.CO;2.
Anderson, J. L., and S. L. Anderson, 1999: A Monte Carlo implementation of the nonlinear filtering problem to produce ensemble assimilations and forecasts. Mon. Wea. Rev., 127, 2741–2758, https://doi.org/10.1175/1520-0493(1999)127<2741:AMCIOT>2.0.CO;2.
Anderson, J. L., T. Hoar, K. Raeder, H. Liu, N. Collins, R. D. Torn, and A. Avellano, 2009: The Data Assimilation Research Testbed: A community facility. Bull. Amer. Meteor. Soc., 90, 1283–1296, https://doi.org/10.1175/2009BAMS2618.1.
Atmosphere to Electrons (A2e), 2015: wfip2/log.z01.00. A2e Data Archive and Portal, U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, accessed 15 October 2017, https://doi.org/10.21947/1523403.
Banta, R. M., Y. L. Pichugina, N. D. Kelley, R. M. Hardesty, and W. A. Brewer, 2013: Wind energy meteorology: Insight into wind properties in the turbine-rotor layer of the atmosphere from high-resolution Doppler lidar. Bull. Amer. Meteor. Soc., 94, 883–902, https://doi.org/10.1175/BAMS-D-11-00057.1.
Barker, D. M., W. Huang, Y.-R. Guo, A. J. Bourgeois, and Q. N. Xiao, 2004: A three-dimensional variational data assimilation system for MM5: Implementation and initial results. Mon. Wea. Rev., 132, 897–914, https://doi.org/10.1175/1520-0493(2004)132<0897:ATVDAS>2.0.CO;2.
Benjamin, S. G., and Coauthors, 2016: A North American hourly assimilation and model forecast cycle: The Rapid Refresh. Mon. Wea. Rev., 144, 1669–1694, https://doi.org/10.1175/MWR-D-15-0242.1.
Caflisch, R. E., 1998: Monte Carlo and quasi-Monte Carlo methods. Acta Numer., 7, 1–49, https://doi.org/10.1017/S0962492900002804.
Carvalho, D., A. Rocha, M. Gomez-Gesteira, and C. Silva Santos, 2014: Sensitivity of the WRF model wind simulation and wind energy production estimates to planetary boundary layer parameterizations for onshore and offshore areas in the Iberian Peninsula. Appl. Energy, 135, 234–246, https://doi.org/10.1016/j.apenergy.2014.08.082.
Crosman, E. T., and J. D. Horel, 2017: Large-eddy simulations of a Salt Lake Valley cold-air pool. Atmos. Res., 193, 10–25, https://doi.org/10.1016/j.atmosres.2017.04.010.
Draxl, C., A. N. Hahmann, A. Peña, and G. Giebel, 2014: Evaluating winds and vertical wind shear from Weather Research and Forecasting model forecasts using seven planetary boundary layer schemes. Wind Energy, 17, 39–55, https://doi.org/10.1002/we.1555.
Foster, C. S., E. T. Crosman, and J. D. Horel, 2017: Simulations of a cold-air pool in Utah’s Salt Lake Valley: Sensitivity to land use and snow cover. Bound.-Layer Meteor., 164, 63–87, https://doi.org/10.1007/s10546-017-0240-7.
Fox, B., J. Burkardt, and C. Chisari, 2016: SOBOL—The Sobol quasirandom sequence: sobol_seq.py. Accessed April 2017, http://people.sc.fsu.edu/~jburkardt/py_src/sobol/sobol.html.
Gaspari, G., and S. E. Cohn, 1999: Construction of correlation functions in two and three dimensions. Quart. J. Roy. Meteor. Soc., 125, 723–757, https://doi.org/10.1002/qj.49712555417.
Gómez-Navarro, J. J., C. C. Raible, and S. Dierer, 2015: Sensitivity of the WRF model to PBL parametrisations and nesting techniques: Evaluation of wind storms over complex terrain. Geosci. Model Dev., 8, 3349–3363, https://doi.org/10.5194/gmd-8-3349-2015.
Grell, G. A., J. Dudhia, and D. R. Stauffer, 1994: A description of the fifth-generation Penn State/NCAR Mesoscale Model (MM5). NCAR Tech. Note NCAR/TN-398+STR, University Corporation for Atmospheric Research, 121 pp., https://doi.org/10.5065/D60Z716B.
Hou, Z., M. Huang, L. R. Leung, G. Lin, and D. M. Ricciuto, 2012: Sensitivity of surface flux simulations to hydrologic parameters based on an uncertainty quantification framework applied to the Community Land Model. J. Geophys. Res., 117, D15108, https://doi.org/10.1029/2012JD017521.
Iacono, M. J., J. S. Delamere, E. J. Mlawer, M. W. Shephard, S. A. Clough, and W. D. Collins, 2008: Radiative forcing by long-lived greenhouse gases: Calculations with the AER radiative transfer models. J. Geophys. Res., 113, D13103, https://doi.org/10.1029/2008JD009944.
Jahn, D. E., E. S. Takle, and W. A. Gallus, 2017: Wind-ramp-forecast sensitivity to closure parameters in a boundary-layer parametrization scheme. Bound.-Layer Meteor., 164, 475–490, https://doi.org/10.1007/s10546-017-0250-5.
Krogsaeter, O., and J. Reuder, 2015: Validation of boundary layer parameterization schemes in the Weather Research and Forecasting model under the aspect of offshore wind energy applications—Part I: Average wind speed and wind shear. Wind Energy, 18, 769–782, https://doi.org/10.1002/we.1727.
Lareau, N. P., and J. D. Horel, 2015: Turbulent erosion of persistent cold-air pools: Numerical simulations. J. Atmos. Sci., 72, 1409–1427, https://doi.org/10.1175/JAS-D-14-0173.1.
Marquis, M., J. Wilczak, M. Ahlstrom, J. Sharp, R. Stern, J. C. Smith, and S. Calvert, 2011: Forecasting the wind to reach significant penetration levels of wind energy. Bull. Amer. Meteor. Soc., 92, 1159–1171, https://doi.org/10.1175/2011BAMS3033.1.
Mass, C. F., M. D. Albright, and D. J. Brees, 1986: The onshore surge of marine air into the Pacific Northwest: A coastal region of complex terrain. Mon. Wea. Rev., 114, 2602–2627, https://doi.org/10.1175/1520-0493(1986)114<2602:TOSOMA>2.0.CO;2.
McCaffrey, K., and Coauthors, 2019: Identification and characterization of persistent cold pool events from temperature and wind profilers in the Columbia River basin. J. Appl. Meteor. Climatol., https://doi.org/10.1175/JAMC-D-19-0046.1, in press.
Nakanishi, M., and H. Niino, 2004: An improved Mellor–Yamada Level-3 model with condensation physics: Its design and verification. Bound.-Layer Meteor., 112, 1–31, https://doi.org/10.1023/B:BOUN.0000020164.04146.98.
Nakanishi, M., and H. Niino, 2006: An improved Mellor–Yamada Level-3 model: Its numerical stability and application to a regional prediction of advection fog. Bound.-Layer Meteor., 119, 397–407, https://doi.org/10.1007/s10546-005-9030-8.
Nakanishi, M., and H. Niino, 2009: Development of an improved turbulence closure model for the atmospheric boundary layer. J. Meteor. Soc. Japan, 87, 895–912, https://doi.org/10.2151/jmsj.87.895.
Olson, J. B., and Coauthors, 2019: Improving wind energy forecasting through numerical weather prediction model development. Bull. Amer. Meteor. Soc., 100, 2201–2220, https://doi.org/10.1175/BAMS-D-18-0040.1.
R Core Team, 2017: R: A Language and Environment for Statistical Computing. Accessed April 2017, https://www.r-project.org.
Reeves, H. D., and D. J. Stensrud, 2009: Synoptic-scale flow and valley cold pool evolution in the western United States. Wea. Forecasting, 24, 1625–1643, https://doi.org/10.1175/2009WAF2222234.1.
Shaw, W. J., and Coauthors, 2019: The Second Wind Forecast Improvement Project (WFIP2): General overview. Bull. Amer. Meteor. Soc., 100, 1687–1699, https://doi.org/10.1175/BAMS-D-18-0036.1.
Shaw, W. J., J. K. Lundquist, and S. J. Schreck, 2009: Research needs for wind resource characterization. Bull. Amer. Meteor. Soc., 90, 535–538, https://doi.org/10.1175/2008BAMS2729.1.
Skamarock, W. C., and Coauthors, 2008: A description of the Advanced Research WRF version 3. NCAR Tech. Note NCAR/TN-475+STR, 113 pp., https://doi.org/10.5065/D68S4MVH.
Tewari, M., and Coauthors, 2004: Implementation and verification of the unified Noah land surface model in the WRF model. 20th Conf. on Weather Analysis and Forecasting/16th Conf. on Numerical Weather Prediction, Seattle, WA, Amer. Meteor. Soc., 14.2a, https://ams.confex.com/ams/84Annual/techprogram/paper_69061.htm.
Thompson, G., and T. Eidhammer, 2014: A study of aerosol impacts on clouds and precipitation development in a large winter cyclone. J. Atmos. Sci., 71, 3636–3658, https://doi.org/10.1175/JAS-D-13-0305.1.
Tiedtke, M., 1989: A comprehensive mass flux scheme for cumulus parameterization in large-scale models. Mon. Wea. Rev., 117, 1779–1800, https://doi.org/10.1175/1520-0493(1989)117<1779:ACMFSF>2.0.CO;2.
Whiteman, C. D., S. Zhong, W. J. Shaw, J. M. Hubbe, X. Bian, and J. Mittelstadt, 2001: Cold pools in the Columbia basin. Wea. Forecasting, 16, 432–447, https://doi.org/10.1175/1520-0434(2001)016<0432:CPITCB>2.0.CO;2.
Wilczak, J. M., and Coauthors, 2019: The Second Wind Forecast Improvement Project (WFIP2): Observational field campaign. Bull. Amer. Meteor. Soc., 100, 1701–1723, https://doi.org/10.1175/BAMS-D-18-0035.1.
Yang, B., and Coauthors, 2017: Sensitivity of turbine-height wind speeds to parameters in planetary boundary-layer and surface-layer schemes in the Weather Research and Forecasting Model. Bound.-Layer Meteor., 162, 117–142, https://doi.org/10.1007/s10546-016-0185-2.
Yang, Q., L. K. Berg, M. Pekour, J. D. Fast, R. K. Newsom, M. Stoelinga, and C. Finley, 2013: Evaluation of WRF-predicted near-hub-height winds and ramp events over a Pacific Northwest site with complex terrain. J. Appl. Meteor. Climatol., 52, 1753–1763, https://doi.org/10.1175/JAMC-D-12-0267.1.
Zhang, C., Y. Wang, and K. Hamilton, 2011: Improved representation of boundary layer clouds over the Southeast Pacific in ARW-WRF using a modified Tiedtke cumulus parameterization scheme. Mon. Wea. Rev., 139, 3489–3513, https://doi.org/10.1175/MWR-D-10-05091.1.