Effect of Boundary Conditions on Adjoint-Based Forecast Sensitivity Observation Impact in a Regional Model

Hyun Mee Kim and Dae-Hui Kim

Atmospheric Predictability and Data Assimilation Laboratory, Department of Atmospheric Sciences, Yonsei University, Seoul, South Korea


Abstract

In this study, the effect of boundary-condition configurations in the regional Weather Research and Forecasting (WRF) Model on the adjoint-based forecast sensitivity observation impact (FSOI) for 24-h forecast error reduction was evaluated. The FSOI has been used to diagnose the impact of observations on forecast performance in several global and regional models. Unlike in a global model, in a regional model the lateral boundaries affect the forecasts and hence the FSOI results. Several experiments with different lateral boundary conditions were conducted for the period from 1 to 14 June 2015. With or without data assimilation, the larger the buffer size in the lateral boundary conditions, the smaller the forecast error. The nonlinear and linear forecast error reductions (i.e., the observation impact) decreased as the buffer size increased, implying a larger impact of the lateral boundaries and a smaller observation impact on the forecast error. In most experiments, among observation types (variables), upper-air radiosonde observations (brightness temperature) exhibited the greatest observation impact. The ranking of observation impacts across observation types and variables was consistent among the experiments with a constraint on the response function at the upper boundary. The fraction of beneficial observations was approximately 60% and did not vary considerably with the boundary conditions specified when calculating the FSOI in the regional modeling framework.

© 2021 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

Corresponding author: Hyun Mee Kim, khm@yonsei.ac.kr


1. Introduction

In numerical weather prediction (NWP), the initial conditions obtained by assimilating observations with the model background are used to predict the weather. Because the individual observations assimilated to produce these initial conditions contribute differently to the forecast performance, the impact of each observation on the forecasts needs to be evaluated quantitatively to improve NWP performance.

The impact of real observations on forecasts can be measured by observing system experiments (OSEs) (Cardinali 2009; Kelly et al. 2007; Yamaguchi et al. 2009; Jung et al. 2010, 2012; Kim et al. 2017) or by the forecast sensitivity observation impact (FSOI) method based on adjoint sensitivity (Baker and Daley 2000; Langland and Baker 2004; Errico 2007; Cardinali 2009; Gelaro and Zhu 2009; Jung et al. 2013; Kim et al. 2017; Kim and Kim 2019). OSEs generally require substantial computational resources because many forecast experiments with different observation sets are needed to examine the impact of each observation set. In contrast, the adjoint-based FSOI can measure the impact of each assimilated observation independently through a single backward adjoint integration.

The adjoint-based FSOI has been developed and calculated in the global modeling systems operated at national meteorological centers to monitor the impact of observation data, including satellite data (Daescu 2008; Joo et al. 2013; Lorenc and Marriott 2014; Kim and Kim 2014, 2017, 2018, 2019). The adjoint-based FSOI indicates that the Infrared Atmospheric Sounding Interferometer (IASI) and the Advanced Microwave Sounding Unit-A (AMSU-A) exhibit the largest impact on short-term forecast performance and that the rate of observations beneficial in reducing the forecast error is approximately 50%–55% in the global model. This indicates that, under the linear hypothesis, approximately half of the observations contribute to reducing the forecast error while the other half increase it in current global modeling systems.

In comparison with the global model, the adjoint-based FSOI in regional models has shown different rankings of the observation impacts depending on the regional situation (Jung et al. 2013; Kim et al. 2017). Additionally, the beneficial observation rate in the regional model was consistently greater (approximately 60%–65%). In contrast to the global model, the boundary conditions can affect the forecast results in the regional model. Wang et al. (2010) investigated the effect of lateral boundary-condition configurations (i.e., nudging coefficient and buffer zone size) on the track forecasts of Typhoon Winnie (1997) in a regional climate model. Kumar et al. (2011) demonstrated that the model domain size affected the intensity and track forecasts of Tropical Cyclone Sidr (2007) in the Bay of Bengal in a high-resolution regional model. Amerault et al. (2013) discussed the influence of the lateral boundaries on the forecast error in the COAMPS regional model with the aim of minimizing lateral boundary effects on the FSOI calculation. Although these studies considered the effect of lateral boundary conditions on the general performance of model forecasts, the effect of boundary conditions on the FSOI has not been fully examined. Because forecasts in the regional model are affected by the lateral boundary conditions, the boundary conditions may affect the adjoint-based FSOI results as well.

The FSOI calculation is performed by two processes: a forward forecast accompanying the cycling run and a backward adjoint calculation. Therefore, the boundary conditions for both the forward forecast and the backward adjoint calculation can be specifically configured to investigate the effect of the boundary-condition configurations encountered at each process of the FSOI calculation on the FSOI results.

In this study, using the Weather Research and Forecasting (WRF) Model, WRF Data Assimilation System (WRFDA), and the adjoint model of WRFPLUS (Zhang et al. 2013), the effect of boundary conditions on the FSOI results was examined for the period from 1 to 14 June 2015 in East Asia. Using the FSOI calculation method, the total observation impact, observation impact per observation, and beneficial observation rate were deduced.

Section 2 presents the methodology and experiments, section 3 presents the results, and section 4 presents the summary and discussion.

2. Methodology

a. Forecast sensitivity to observation

The forecast in NWP is obtained from the nonlinear model equation
$$\mathbf{x}^f = N(\mathbf{x}^a), \tag{1}$$
where $\mathbf{x}^a$ and $\mathbf{x}^f$ are the initial condition (i.e., analysis) and the forecast, respectively, and $N$ is the nonlinear model. The forecast aspect of interest (or response function) $R$ can be represented by a function of $\mathbf{x}^f$ as
$$R = f(\mathbf{x}^f). \tag{2}$$
The first-order variation of $R$ ($\delta R$) is expressed as
$$\delta R = \left\langle \delta\mathbf{x}^a, \frac{\partial R}{\partial \mathbf{x}^a} \right\rangle = \left\langle \delta\mathbf{x}^f, \frac{\partial R}{\partial \mathbf{x}^f} \right\rangle = \left\langle \mathbf{M}\,\delta\mathbf{x}^a, \frac{\partial R}{\partial \mathbf{x}^f} \right\rangle, \tag{3}$$
where $\mathbf{M}$ is the tangent linear model of the nonlinear model $N$, $\delta\mathbf{x}^a$ is the variation of the initial condition, and the forecast variation satisfies $\delta\mathbf{x}^f = \mathbf{M}\,\delta\mathbf{x}^a$. The relationship between the second and fourth terms of Eq. (3) yields Eq. (4), where $\mathbf{M}^{\mathrm{T}}$ is the adjoint model of $\mathbf{M}$:
$$\frac{\partial R}{\partial \mathbf{x}^a} = \mathbf{M}^{\mathrm{T}} \frac{\partial R}{\partial \mathbf{x}^f}. \tag{4}$$
The optimal linear analysis equation producing $\mathbf{x}^a$ in the DA is expressed as (Kalnay 2003)
$$\mathbf{x}^a = \mathbf{x}^b + \mathbf{K}(\mathbf{y} - H\mathbf{x}^b) = \mathbf{x}^b + \mathbf{K}\mathbf{d}, \tag{5}$$
where $\mathbf{x}^b$ is the background, $\mathbf{y}$ is the observation vector, $H$ is the observation operator, $\mathbf{K}$ is the Kalman gain matrix, and $\mathbf{d}$ is the innovation vector. Using Eq. (5), Eq. (3) is expressed as
$$\delta R = \left\langle \delta\mathbf{x}^a, \frac{\partial R}{\partial \mathbf{x}^a} \right\rangle = \left\langle \mathbf{K}\mathbf{d}, \frac{\partial R}{\partial \mathbf{x}^a} \right\rangle = \left\langle \mathbf{d}, \mathbf{K}^{\mathrm{T}} \frac{\partial R}{\partial \mathbf{x}^a} \right\rangle = \left\langle \mathbf{d}, \frac{\partial R}{\partial \mathbf{y}} \right\rangle, \tag{6}$$
where $\mathbf{K}^{\mathrm{T}}$ is the adjoint of the Kalman gain matrix. Using Eq. (4) and the relationship between the fourth and fifth terms in Eq. (6), the forecast sensitivity to observation (FSO) is expressed as in Baker and Daley (2000):
$$\frac{\partial R}{\partial \mathbf{y}} = \mathbf{K}^{\mathrm{T}} \frac{\partial R}{\partial \mathbf{x}^a} = \mathbf{K}^{\mathrm{T}} \mathbf{M}^{\mathrm{T}} \frac{\partial R}{\partial \mathbf{x}^f}. \tag{7}$$

b. Forecast error reduction and forecast sensitivity observation impact

Given the true state $\mathbf{x}^t$, the forecast error is expressed as
$$e = (\mathbf{x}^f - \mathbf{x}^t)^{\mathrm{T}} \mathbf{C} (\mathbf{x}^f - \mathbf{x}^t) = \frac{1}{2}\sum_{\sigma,x,y}\left\{ u'^2 + \upsilon'^2 + \left(\frac{g}{\bar{N}\bar{\theta}}\right)^2 \theta'^2 + \left(\frac{1}{\bar{\rho} c_s}\right)^2 p'^2 \right\}, \tag{8}$$
where $\mathbf{C}$ is a diagonal matrix associated with the dry total energy norm (Rabier et al. 1996; Jung et al. 2013; Kim et al. 2017). Here, $\bar{N}$, $\bar{\theta}$, $\bar{\rho}$, and $c_s$ represent the Brunt–Väisälä frequency, potential temperature, density, and speed of sound, respectively, and the primed quantities denote the errors in the zonal wind, meridional wind, potential temperature, and pressure. The true state is assumed to be the analysis of the cycling experiments.
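As a concrete illustration of Eq. (8), the following is a minimal sketch of how the dry total energy norm of a forecast-error field might be evaluated on a regular (sigma, y, x) grid. This is not the WRFPLUS implementation; the array shapes, variable names, and constant reference values are assumptions for illustration only.

```python
import numpy as np

def dry_total_energy_norm(du, dv, dtheta, dp, Nbar, thetabar, rhobar, cs, g=9.81):
    """Dry total energy norm of an error field, following Eq. (8).

    du, dv, dtheta, dp : 3D arrays (sigma, y, x) of errors in u, v, theta, p.
    Nbar, thetabar, rhobar, cs : reference Brunt-Vaisala frequency, potential
    temperature, density, and speed of sound (scalars or broadcastable arrays).
    """
    kinetic = du**2 + dv**2
    potential = (g / (Nbar * thetabar))**2 * dtheta**2
    elastic = (1.0 / (rhobar * cs))**2 * dp**2
    # Sum over all sigma levels and horizontal grid points, with the 1/2 factor.
    return 0.5 * np.sum(kinetic + potential + elastic)

# Toy usage with random error fields on a small (levels, ny, nx) grid.
rng = np.random.default_rng(0)
shape = (10, 20, 20)
e = dry_total_energy_norm(du=rng.normal(size=shape), dv=rng.normal(size=shape),
                          dtheta=rng.normal(size=shape), dp=rng.normal(size=shape),
                          Nbar=1.2e-2, thetabar=300.0, rhobar=1.0, cs=340.0)
print(f"dry total energy norm: {e:.3e}")
```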
The nonlinear forecast error reduction (FER; $\Delta e$) is defined as the difference between the forecast error based on the forecast trajectory $\mathbf{x}^f_a$ integrated from the analysis $\mathbf{x}^a$ and the forecast error based on the forecast trajectory $\mathbf{x}^f_b$ integrated from the background $\mathbf{x}^b$, and is expressed as (Langland and Baker 2004)
$$\Delta e = (\mathbf{x}^f_a - \mathbf{x}^t)^{\mathrm{T}} \mathbf{C} (\mathbf{x}^f_a - \mathbf{x}^t) - (\mathbf{x}^f_b - \mathbf{x}^t)^{\mathrm{T}} \mathbf{C} (\mathbf{x}^f_b - \mathbf{x}^t), \tag{9}$$
where the first and second terms on the right-hand side of Eq. (9) are denoted as $e_a$ and $e_b$, respectively. Using Eqs. (6), (7), and (9), the linear approximation of the nonlinear FER in observation space ($\delta e$, the observation impact; Gelaro et al. 2007) becomes
$$\delta e = \mathbf{d}^{\mathrm{T}} \mathbf{K}^{\mathrm{T}} \left\{ \mathbf{M}_b^{\mathrm{T}} \mathbf{C} (\mathbf{x}^f_b - \mathbf{x}^t) + \mathbf{M}_a^{\mathrm{T}} \mathbf{C} (\mathbf{x}^f_a - \mathbf{x}^t) \right\}, \tag{10}$$
where $\mathbf{M}_a$ and $\mathbf{M}_b$ are the tangent linear models linearized along the forecast trajectories $\mathbf{x}^f_a$ and $\mathbf{x}^f_b$, respectively. With the augmented third-order approximation (Gelaro et al. 2007), Eq. (10) is written as
$$\delta e_{3a} = \mathbf{d}^{\mathrm{T}} \mathbf{K}^{\mathrm{T}} \left\{ \mathbf{M}_a^{\mathrm{T}} \mathbf{C} (\mathbf{x}^f_b - \mathbf{x}^t) + \mathbf{M}_a^{\mathrm{T}} \mathbf{C} (\mathbf{x}^f_a - \mathbf{x}^t) \right\}. \tag{11}$$

This $\delta e_{3a}$ is the FSOI calculated in this study.
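To make the bookkeeping behind Eq. (11) explicit, the toy example below uses small random matrices as stand-ins for the Kalman gain, the adjoint-propagated forecast-error gradient, and the innovations; none of these numbers come from the WRF system used in the paper. It shows how the scalar impact $\delta e_{3a} = \mathbf{d}^{\mathrm{T}}\mathbf{K}^{\mathrm{T}}\mathbf{g}$ decomposes into per-observation contributions $d_i (\mathbf{K}^{\mathrm{T}}\mathbf{g})_i$, which is how impacts can later be summed by observation type or variable.

```python
import numpy as np

rng = np.random.default_rng(1)
n_state, n_obs = 50, 12           # toy sizes; a real system has a far larger state

# Stand-ins for quantities the adjoint system would provide:
K = rng.normal(scale=0.1, size=(n_state, n_obs))   # Kalman gain (toy)
d = rng.normal(size=n_obs)                         # innovations y - H(x_b)
# g stands in for M_a^T C (x_b^f - x^t) + M_a^T C (x_a^f - x^t) in Eq. (11).
g = rng.normal(size=n_state)

# Observation-space sensitivity and per-observation impacts.
dR_dy = K.T @ g                      # K^T g, one value per observation
impact_per_obs = d * dR_dy           # d_i * (K^T g)_i
total_impact = impact_per_obs.sum()  # equals d^T K^T g

assert np.isclose(total_impact, d @ (K.T @ g))
beneficial = impact_per_obs < 0      # negative contribution = forecast error reduced
print(f"total impact: {total_impact:.3f}, "
      f"beneficial fraction: {beneficial.mean():.2f}")
```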

c. Model, data assimilation, and observations

The model used was WRF v3.8, and the physics schemes used were the Yonsei University (YSU) scheme (Hong et al. 2006) for planetary boundary layer parameterization, the Noah land surface model (Chen and Dudhia 2001) for surface parameterization, the Kain–Fritsch scheme (Kain 2004) for cumulus parameterization, the WRF single-moment 6-class scheme (Hong and Lim 2006) for microphysics parameterization, the Dudhia scheme (Dudhia 1989) for shortwave radiation parameterization, and the Rapid Radiative Transfer Model scheme (Mlawer et al. 1997) for longwave radiation parameterization. The horizontal resolution was 18 km, centered at 26°N, 128°E, with 361 × 252 grid points in a domain covering the western Pacific and East Asia (Fig. 1), and there were 61 vertical model layers up to 10 hPa, the top of the modeled atmosphere.

Fig. 1. Experimental domain of this study, indicated by the black solid line. The buffer zones with 5 buffer grids (gray solid), 10 buffer grids (black dashed), and 20 buffer grids (gray dashed) are also shown.

The DA system used was WRFDA 3DVAR (Barker et al. 2004; Barker et al. 2012). The background error covariance was calculated using the National Meteorological Center (NMC) method (Parrish and Derber 1992). The differences between the 12 h forecasts and 24 h forecasts for the period from 0000 UTC 26 May to 1200 UTC 15 June 2015 were used to obtain the background error covariance.
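As an illustration of the NMC method referenced above, the sketch below estimates a static background error covariance from pairs of 24-h and 12-h forecasts valid at the same time. The field handling is hypothetical (states are assumed already flattened to vectors), and an operational WRFDA application works with control variables and much larger samples.

```python
import numpy as np

def nmc_background_error_cov(f24, f12):
    """Estimate background error covariance from forecast differences (NMC method).

    f24, f12 : arrays of shape (n_samples, n_state), 24-h and 12-h forecasts
    valid at the same times.
    """
    diffs = f24 - f12                 # forecast-difference samples
    diffs = diffs - diffs.mean(axis=0)  # remove the sample mean
    # Sample covariance of the differences, often rescaled before use in DA.
    return diffs.T @ diffs / (diffs.shape[0] - 1)

# Toy usage with random "forecasts" for a 100-variable state and 40 sample times.
rng = np.random.default_rng(2)
B = nmc_background_error_cov(rng.normal(size=(40, 100)), rng.normal(size=(40, 100)))
print(B.shape)  # (100, 100)
```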

The observation data used (Table 1) were the Prepared Binary Universal Form for the Representation of Meteorological Data (PREPBUFR) observations assimilated in the Global Data Assimilation System (GDAS) of the National Centers for Environmental Prediction (NCEP). Additionally, the AMSU-A data were assimilated with a thinning distance of 90 km. Some channels of the AMSU-A data (i.e., NOAA-18 AMSU-A channel 9, MetOp-2 AMSU-A channel 7, and NOAA-19 AMSU-A channel 8) were not assimilated because they were not provided by the PREPBUFR due to low quality (https://www.emc.ncep.noaa.gov/mmb/data_processing/Satellite_Historical_Documentation.htm).
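The 90-km thinning of the AMSU-A radiances could be implemented in several ways; the following is a minimal sketch of one simple approach (keep at most one observation per roughly 90 km × 90 km box). It is an illustration only and not the WRFDA thinning algorithm.

```python
import numpy as np

def thin_by_box(lats, lons, box_km=90.0):
    """Keep at most one observation per box of roughly box_km x box_km.

    Returns indices of the retained observations (the first one found per box).
    """
    km_per_deg_lat = 111.0
    kept, seen = [], set()
    for i, (lat, lon) in enumerate(zip(lats, lons)):
        # Approximate local box indices (adequate for a midlatitude regional domain).
        iy = int(lat * km_per_deg_lat // box_km)
        ix = int(lon * km_per_deg_lat * np.cos(np.deg2rad(lat)) // box_km)
        if (iy, ix) not in seen:
            seen.add((iy, ix))
            kept.append(i)
    return np.array(kept)

rng = np.random.default_rng(3)
lats, lons = rng.uniform(10, 50, 1000), rng.uniform(100, 150, 1000)
print(f"kept {thin_by_box(lats, lons).size} of {lats.size} observations")
```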

Table 1. Descriptions of the observation types used. The left column denotes typical observation types, and the right column gives the specific observation types as acronyms. The observation variables are the zonal wind (U), meridional wind (V), temperature (T), surface pressure (P), specific humidity (Q), total precipitable water (TPW), and brightness temperature (BT).

d. Experimental framework

The schematic of the experiments is shown in Fig. 2. The cycling experiments were conducted from 0000 UTC 25 May to 0000 UTC 16 June 2015. After one week of model spinup, the results for the 2-week period from 0000 UTC 1 June to 1800 UTC 14 June 2015 were analyzed. The lateral boundary conditions used to run the WRF were the NCEP Final analyses (FNL) with 1° × 1° horizontal resolution, updated every 6 h (i.e., at 0000, 0600, 1200, and 1800 UTC). If global model forecasts had been used for the lateral boundary conditions, the lateral boundary conditions applied to the forecast trajectory integrated from the analysis would have differed from those applied to the forecast trajectory integrated from the background, so that the FER would have been affected by both the observation impact and the temporal changes of the lateral boundary conditions. Because the FER caused by the observation impact was the main focus of this study, the FNL was used for the lateral boundary conditions, and the aforementioned differences that would arise with global model forecasts did not occur.

Fig. 2. Process of cycling DA experiment and FSOI calculation.

The initial condition at 0000 UTC 25 May 2015 was also the NCEP FNL. Thereafter, the initial conditions were generated using 3DVAR every 6 h (i.e., at 0000, 0600, 1200, and 1800 UTC) through the cycling experiments. The assimilation window at each analysis time was ±3 h. The initial conditions at each analysis time were used to obtain the 24- and 30-h forecasts. The 24-h FER at a specific time is the difference between the forecast error of the 24-h forecast integrated from the analysis and the forecast error of the 24-h forecast integrated from the background (i.e., the 30-h forecast integrated from the analysis 6 h earlier). The FSOI was then calculated using both 24-h forecast errors as the input to the adjoint model.

Thus, the FSOI calculation was performed in two steps. The first step is the analysis–forecast cycling, which is based on the analysis generation by the DA process and the forward model integration. The second step is the adjoint calculation, which is the backward adjoint integration using Eq. (11). The boundary conditions affect both steps in the FSOI estimation using the regional model. For the forward model integration in the first step, the effect of the lateral boundary conditions with a buffer zone was considered. For the backward adjoint model integration in the second step, the effects of both the lateral boundary conditions with a buffer zone and a constrained response function at the upper boundary were considered.
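Schematically, one 6-hourly cycle of this two-step procedure could be organized as in the sketch below. Every function here is a lightweight stub standing in for the corresponding WRFDA/WRF/WRFPLUS component (toy arrays only); the names and signatures are hypothetical placeholders, not actual APIs.

```python
import numpy as np

rng = np.random.default_rng(4)
NSTATE, NOBS = 200, 30

# --- Stubs standing in for the real WRFDA/WRF/WRFPLUS components (toy arrays) ---
def run_3dvar(x_b, y):            return x_b + 0.1 * rng.normal(size=NSTATE)     # analysis
def run_wrf(x0, hours):           return x0 + 0.01 * hours * rng.normal(size=NSTATE)
def truth(t):                     return rng.normal(size=NSTATE)                 # verifying analysis
def energy_norm(err):             return float(err @ err)                        # stands in for Eq. (8)
def adjoint_gradient(ef_a, ef_b): return ef_a + ef_b                             # stands in for Eq. (11) gradient
def kalman_gain_adjoint(g):       return rng.normal(size=(NOBS, NSTATE)) @ g     # K^T g

def fsoi_one_cycle(t, x_b, y):
    """One 6-hourly cycle of the two-step FSOI procedure (toy sketch)."""
    # Step 1: analysis-forecast cycling.
    x_a = run_3dvar(x_b, y)                      # 3DVAR analysis at time t
    xf_a = run_wrf(x_a, hours=24)                # 24-h forecast from the analysis
    xf_b = run_wrf(x_b, hours=24)                # 24-h forecast from the background
    x_t = truth(t + 24)                          # analysis at t + 24 h, treated as truth
    e_a, e_b = energy_norm(xf_a - x_t), energy_norm(xf_b - x_t)
    delta_e = e_a - e_b                          # nonlinear FER, Eq. (9)

    # Step 2: backward adjoint calculation, Eq. (11).
    g = adjoint_gradient(xf_a - x_t, xf_b - x_t)
    d = rng.normal(size=NOBS)                    # innovations
    impacts = d * kalman_gain_adjoint(g)         # per-observation impacts
    return delta_e, impacts

delta_e, impacts = fsoi_one_cycle(0, rng.normal(size=NSTATE), None)
print(delta_e, impacts.sum())
```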

The experiments conducted are summarized in Table 2. EXP0 has five buffer grid points in the lateral boundary conditions for both the forward and backward integration steps, which means that the lateral boundary buffers are applied to the outermost five grid points of the model domain. Five buffer grid points in the lateral boundary conditions is the default configuration in the WRF forward and backward integrations and is referred to here as the basic buffer. For the FSOI calculation, no constraint was applied to the response function at the upper boundary in EXP0. In addition to the configuration of EXP0, a constraint at the upper boundary is applied in the backward integration of EXP1 by setting the response function above 150 hPa to 0. This constraint is usually used in adjoint integrations (Gelaro et al. 2010; Lorenc and Marriott 2014; Kim et al. 2017; Kim and Kim 2017, 2018, 2019) in both global and regional models because the response function above 150 hPa includes large forecast errors near the model top and the forecast in the troposphere is of interest in terms of adjoint sensitivities and FSOI. Thus, the difference between EXP1 and EXP0 is caused by the upper-boundary configuration of the response function in the backward adjoint integration. EXP1 is the typical setting used in many adjoint sensitivity and FSOI studies; thus, it can be treated as the control experiment. To examine the effect of the buffering in the lateral boundary conditions during the forward cycling and backward adjoint integrations, two additional experiments were conducted: EXP2 (EXP3) with twice (4 times) the buffer grid points of the basic buffer configuration. For EXP2 and EXP3, the top-boundary configuration in the backward integration is the same as in EXP1. Thus, the differences between EXP2/EXP3 and EXP1 are caused by the buffer size of the lateral boundaries in the forward and backward model integrations.

Table 2. Experimental configurations with different combinations of the lateral boundary-condition buffer and the response-function constraint in the forward and backward model integrations during the FSOI calculation.

The buffer grid points of the lateral boundaries correspond to the "specified zone" and "relaxation zone" in Skamarock et al. (2008). Relaxation at the lateral boundaries of the regional model relaxes the model toward the large-scale forecast or analysis (i.e., the lateral boundary conditions) provided by the larger-domain model, which reduces the mismatch between the lateral boundary conditions and the regional model. Details of the lateral boundary buffers in the regional model are presented in Davies and Turner (1977) and Skamarock et al. (2008).
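In WRF terms, the buffer configuration corresponds to the &bdy_control namelist settings (specified and relaxation zone widths, whose sum is the total buffer width). The paper does not list the exact namelist values, so the numbers below are a hedged reconstruction of what the Table 2 experiments might look like; treat the split between the specified and relaxation zones as an assumption.

```python
# Hypothetical &bdy_control settings consistent with Table 2 (assumed, not from the paper).
# spec_bdy_width = spec_zone + relax_zone is the total number of buffer rows/columns.
bdy_control = {
    "EXP0": {"spec_zone": 1, "relax_zone": 4,  "spec_bdy_width": 5},   # WRF default, "basic buffer"
    "EXP1": {"spec_zone": 1, "relax_zone": 4,  "spec_bdy_width": 5},
    "EXP2": {"spec_zone": 1, "relax_zone": 9,  "spec_bdy_width": 10},  # twice the basic buffer
    "EXP3": {"spec_zone": 1, "relax_zone": 19, "spec_bdy_width": 20},  # four times the basic buffer
}
```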

3. Results

a. Forecast error

Figure 3 shows time series of forecast errors with DA (ea) and without DA (eb), which are calculated by Eq. (8), for each experiment during the experimental period. The forecast errors (ea and eb) are defined in relation to the response function considered during the backward model integration. This is because the nonlinear FER based on both ea and eb should be compared with the linearly approximated FER calculated after integration of the backward adjoint model.

Fig. 3. Time series of forecast error with DA (gray) and without DA (black) during the experimental period: (a) EXP0, (b) EXP1, (c) EXP2, and (d) EXP3. Note that the vertical scale of (a) is greater than those of (b)–(d).

In all experiments, e_a is smaller than e_b, which indicates that the forecast error decreases after DA. Based on the experimental configurations presented in Table 2, EXP0 shows the largest forecast error, followed by EXP1, EXP2, and EXP3. Compared with EXP0, in which the forecast errors are considered for all vertical layers, EXP1 exhibits significantly smaller forecast errors because the response function is defined below 150 hPa at the start of the backward model integration. Thus, the configuration of the response function at the upper boundary (i.e., the response function defined below 150 hPa) makes the forecast error much smaller. Previous FSOI studies using the regional model (e.g., Kim et al. 2017) used the boundary-condition configuration of EXP1. In comparison with EXP1, EXP2 exhibits a smaller forecast error because the buffer size of EXP2 in the forward and backward model integrations is twice that of EXP1. Because the buffer size of EXP3 is even larger (i.e., fourfold larger than that of EXP1), EXP3 exhibits the smallest forecast errors. With or without DA (e_a or e_b), the smaller forecast error for a larger buffer size is caused by the smaller difference between the forecast and the analysis that is treated as the true state. Thus, the larger the buffer size, the smaller the forecast error. The buffer area reduces the overspecification of the lateral boundary conditions at the lateral boundaries of the regional model, which could reduce the error in the regional model. By increasing the buffer size, the error caused by the overspecification of the lateral boundary conditions could be further reduced.

The average forecast errors without DA (e_b) of EXP0, EXP1, EXP2, and EXP3 during the experimental period were 4649.71, 725.61, 672.71, and 578.62 (×10^5 J kg^-1), respectively (Table 3). The e_b of EXP1 was 84.4% smaller than that of EXP0, implying a significant impact of the response-function configuration at the upper boundary in the backward adjoint integration. The e_b of EXP2 and EXP3 was reduced by 7.3% and 14% relative to EXP1 and EXP2, respectively, indicating the effect of the buffer-size configuration in the forward and backward model integrations on e_b. As the number of buffer grid points increased from 10 to 20, the percentage decrease in the forecast error became roughly twice as large as that for the increase from 5 to 10. The average forecast errors with DA (e_a) of EXP0, EXP1, EXP2, and EXP3 during the experimental period were 3772.13, 592.71, 555.51, and 480.65 ×10^5 J kg^-1, respectively (Table 3). The e_a of EXP1 was reduced by 84.3% relative to EXP0, that of EXP2 by 6.3% relative to EXP1, and that of EXP3 by 13.5% relative to EXP2. These rates of decrease were very similar to those for e_b. The forecast errors both with and without DA varied significantly among the experiments, indicating the significant effect of the boundary-condition configuration on the forecast errors with and without DA.

Table 3. Average forecast errors without DA (e_b) and with DA (e_a) for each experiment. The unit is ×10^5 J kg^-1.

In comparison with the large percentage changes among experiments for ea or eb, the ea was reduced by 17%–19% relative to eb for all experiments, which implies that the DA effect on each experiment did not vary significantly among experiments. Therefore, the reduction rates in the forecast error with DA in comparison with those without DA were similar for all experiments regardless of the boundary-condition configuration, indicating that the DA effect was somewhat consistent for the given boundary configurations.
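The percentage changes quoted above follow directly from the Table 3 values; the short check below reproduces them (all numbers are taken from Table 3 of this paper).

```python
e_b = {"EXP0": 4649.71, "EXP1": 725.61, "EXP2": 672.71, "EXP3": 578.62}  # x1e5 J/kg
e_a = {"EXP0": 3772.13, "EXP1": 592.71, "EXP2": 555.51, "EXP3": 480.65}

# Reduction of e_b relative to the preceding experiment in the sequence EXP0 -> EXP1 -> EXP2 -> EXP3.
for prev, cur in [("EXP0", "EXP1"), ("EXP1", "EXP2"), ("EXP2", "EXP3")]:
    print(f"{cur} vs {prev}: e_b down {100 * (1 - e_b[cur] / e_b[prev]):.1f}%")
# -> 84.4%, 7.3%, 14.0%

# Effect of DA within each experiment: e_a relative to e_b.
for exp in e_b:
    print(f"{exp}: DA reduces the forecast error by {100 * (1 - e_a[exp] / e_b[exp]):.1f}%")
# -> roughly 17%-19% for all experiments
```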

b. Forecast error reduction

Figure 4 shows the time series of the nonlinear FER and the linear approximation of the nonlinear FER (i.e., the linear FER or observation impact) for each experiment during the experimental period. The nonlinear and linear FERs in each experiment were similar, indicating that the linear approximation holds well. The nonlinear and linear FERs of EXP0 were -877.58 × 10^5 J kg^-1 and -870.86 × 10^5 J kg^-1, respectively. The nonlinear FER of EXP1 was -132.90 × 10^5 J kg^-1, and the linear FER of EXP1 was -136.56 × 10^5 J kg^-1; these values were 85% and 84% smaller than the nonlinear and linear FERs of EXP0, respectively. The significantly smaller FERs in EXP1 than in EXP0 were due to the difference in the top-boundary configuration of the adjoint calculation. Compared with the other experiments, the consideration of the response function above 150 hPa in EXP0 caused a significant difference between the forecasts with and without DA. The nonlinear (linear) FER of EXP2 was -117.20 × 10^5 J kg^-1 (-121.51 × 10^5 J kg^-1), which was 12% (11%) smaller than the nonlinear (linear) FER of EXP1. The smaller FERs in EXP2 compared with those in EXP1 were due to the difference in the buffer size of the lateral boundary conditions of the forward and backward simulations. EXP2 had a twofold larger buffer size in the lateral boundary conditions than EXP1, resulting in an 11%–12% smaller nonlinear and linear FER (i.e., observation impact). This implies that the difference between the forecast errors with and without DA becomes smaller, and thus the observation impact on the forecast error becomes smaller. In the case of EXP3, the nonlinear (linear) FER was -97.97 × 10^5 J kg^-1 (-95.39 × 10^5 J kg^-1), which was 16% (21%) smaller than the nonlinear (linear) FER of EXP2. The larger rates of decrease in the nonlinear and linear FERs in EXP3, relative to those in EXP2, imply a greater impact of the lateral boundaries caused by the buffer size and a smaller observation impact on the forecast error. Thus, as the buffer size increases, the impact of observations (or of DA) in reducing the forecast error decreases.

Fig. 4. Time series of nonlinear FER (black) and approximated FER (i.e., linear FER or observation impact; gray) during the experimental period: (a) EXP0, (b) EXP1, (c) EXP2, and (d) EXP3. Note that the vertical scale of (a) is greater than those of (b)–(d).

Figure 5 shows the time series of the differences in linear FER between experiments. In comparison with EXP0, other experiments show considerably smaller observation impacts. Compared with the difference between EXP1 and EXP2, the difference between EXP3 and EXP1 or EXP3 and EXP2 is greater, indicating that the larger the buffer size in the lateral boundary condition, the smaller the observation impact. The observation impacts of EXP2 and EXP3 are 11% and 30% smaller than the observation impact of EXP1, respectively. Thus, the decrease in FER (i.e., the smaller difference between the forecast errors with and without DA) implies that the forecast error without DA becomes similar to that with DA, and the observation impact by DA on the forecast becomes smaller.

Fig. 5. Time series of linear FER difference between experiments during the experimental period: (a) EXP1 − EXP0, (b) EXP2 − EXP1, (c) EXP3 − EXP1, and (d) EXP3 − EXP2. Note that the vertical scale of (a) is greater than those of (b)–(d).

c. Observation impact for observation type and variable

Figure 6 shows the time-averaged observation impacts for observation types and variables during the experimental period for the four experiments in descending order with respect to EXP1. In comparison with the other experiments, the observation impact of EXP0 was much greater. After the observation impact of EXP0, that of EXP1 was the greatest, followed by EXP2 and EXP3, which is consistent with the results discussed in section 3b. For observation types in EXP1, EXP2, and EXP3, SOUND exhibited the greatest observation impact followed by QSCAT, GEOAMV, NOAA-19 AMSUA, SYNOP, NOAA-18 AMSUA, NOAA-15 AMSUA, MetOp-2 AMSUA, METAR, PILOT, SHIPS, PROFILER, GPSPW, AIREP, and BUOY. This result is similar to those reported by Kim et al. (2017) and Jung et al. (2013). However, in contrast to the FSOI studies using the global model (Joo et al. 2013; Kim and Kim 2019), the ranking of the observation impact of the aircraft data (i.e., AIREP) was relatively low (Fig. 6a) due to the small number of AIREP data (Fig. 6b) used in this study. The ranking of the observation impact in EXP0 was similar to, but not exactly the same as, those in EXP1, EXP2, and EXP3. In terms of observation variables, the brightness temperature showed the greatest observation impact followed by meridional wind, zonal wind, temperature, specific humidity, and surface pressure.

Fig. 6. Time-averaged observation impacts for (a) observation types and (c) observation variables during the experimental period. (b) Time-averaged observation number for observation types. All ranks are aligned in descending order with respect to the absolute observation impact values of EXP1. Black, hatched, gray, and white bars represent the observation impacts of EXP0, EXP1, EXP2, and EXP3, respectively.

The ranking of observation impact was consistent for observation types and variables for EXP1, EXP2, and EXP3. In contrast, the decreasing rates of the observation impacts for observation types and variables were not consistent for EXP1, EXP2, and EXP3. In comparison with EXP1, EXP2 showed an averaged decreasing rate of 11% for individual observation impacts for individual observation types and variables. In comparison with EXP1, EXP3 exhibited an averaged decreasing rate of 37% for individual observation impacts for individual observation types, and 26% for individual observation impacts for individual observation variables. Thus, the observation impact decreases rapidly as the buffer zone increases.

Figure 7 shows the time-averaged observation impacts normalized by the number of observations for observation types and variables during the experimental period, for the four experiments in descending order with respect to EXP1. Similar to the total observation impact in Fig. 6, the observation impact of EXP0 was much greater than those of the other experiments. After EXP0, the impact of EXP1 was the greatest, followed by EXP2 and EXP3, which is consistent with the results for the total observation impact. For the observation types in EXP1, EXP2, and EXP3, GPSPW exhibited the greatest observation impact, followed by SYNOP, SOUND, METAR, PILOT, AIREP, PROFILER, NOAA-18 AMSUA, NOAA-15 AMSUA, NOAA-19 AMSUA, MetOp-2 AMSUA, SHIPS, GEOAMV, QSCAT, and BUOY. The ranking of the normalized observation impact in EXP0 differed from that in EXP1, EXP2, and EXP3. Compared with the total observation impact (Fig. 6a), the normalized observation impact of AIREP was relatively large (Fig. 7a), which implies a large observation impact per aircraft observation. In terms of observation variables, temperature exhibited the greatest observation impact, followed by specific humidity, surface pressure, brightness temperature, meridional wind, and zonal wind. The normalized observation impacts for T and P were greater than those for U and V, indicating that the observation variables associated with the height information exhibit greater normalized observation impacts than those associated with the momentum information. For EXP1, EXP2, and EXP3, the ranking of observation impacts was consistent for observation types and variables, whereas the decreasing rates of the observation impacts for observation types and variables were not.
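The per-type and per-variable summaries of Figs. 6 and 7 amount to grouping the per-observation impacts (section 2b) by their metadata, then summing or normalizing by count. A minimal pandas sketch of that bookkeeping is given below; the column names and the toy data are illustrative assumptions, not the actual adjoint output.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n = 1000
# Toy per-observation impact table; in practice this would come from the adjoint output.
df = pd.DataFrame({
    "obs_type": rng.choice(["SOUND", "GEOAMV", "QSCAT", "SYNOP", "AMSUA"], n),
    "variable": rng.choice(["U", "V", "T", "Q", "P", "BT"], n),
    "impact":   rng.normal(loc=-0.01, scale=0.1, size=n),   # negative = beneficial
})

total_by_type = df.groupby("obs_type")["impact"].sum().sort_values()       # Fig. 6a analogue
per_obs_by_type = (df.groupby("obs_type")["impact"].sum()
                   / df.groupby("obs_type").size()).sort_values()          # Fig. 7a analogue
print(total_by_type, per_obs_by_type, sep="\n")
```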

Fig. 7. As in Fig. 6, but normalized by the number of observations for a given observation type or variable.

Figure 8 shows the time-averaged observation impacts of the AMSU-A channels on board the NOAA-15, NOAA-18, NOAA-19, and MetOp-2 satellites. The peak pressure levels (altitudes) of channels 5, 6, 7, 8, and 9 are approximately 700 hPa (3 km), 400 hPa (8 km), 250 hPa (11 km), 150 hPa (14 km), and 90 hPa (17 km), respectively (Shi et al. 2013; Lupu et al. 2016). Similar to the total and normalized observation impacts in Figs. 6 and 7, the satellite observation impact of EXP0 was considerably greater than those of the other experiments. After EXP0, the impact of EXP1 was the greatest, followed by EXP2 and EXP3, for all satellites and channels. The decreases of the observation impacts in EXP2 and EXP3 relative to EXP1 were greatest for channel 5, which measures the brightness temperature of the lower troposphere. In contrast, the observation impact of channel 7 in EXP2 and EXP3 did not decrease considerably compared with EXP1. The average observation impact of channel 5 in EXP2 (EXP3) decreased by 15.2% (35.1%) in comparison with that in EXP1, whereas that of channel 7 in EXP2 (EXP3) decreased by 11.3% (18.8%). Therefore, the observation impacts of the brightness temperature for the satellite observations (i.e., NOAA-15, NOAA-18, NOAA-19, and MetOp-2) in EXP1, EXP2, and EXP3 are sensitive to the channels (i.e., the vertical heights they sample).

Fig. 8. Time-averaged observation impact for (a) NOAA-15, (b) NOAA-18, (c) NOAA-19, and (d) MetOp-2 for each experiment during the experimental period. Black, red, green, yellow, and blue colors represent channels 5, 6, 7, 8, and 9, respectively.

Figure 9 shows the vertical profiles of the observation impacts for the SOUND and GEOAMV observations. The observation impacts of EXP1, EXP2, and EXP3 for both SOUND and AMV were greatest at the layers with the greatest observation numbers. Except near the upper boundary, the decrease of the SOUND observation impacts from EXP1 to EXP2 and EXP3 was vertically homogeneous. This differs from the observation impacts of the brightness temperature, which vary depending on the channels (i.e., vertical heights). In addition, the observation impacts of AMV in EXP1, EXP2, and EXP3 showed vertically similar decreasing rates except between 200 and 300 hPa. In contrast to the observation impacts of the brightness temperature for the satellite observations, which are sensitive to the observation heights, the observation impacts of U, V, T, and Q for SOUND in EXP1, EXP2, and EXP3 do not change significantly with the observation height.

Fig. 9. Vertical profile of (a) SOUND observation impact, (b) SOUND observation number, (c) GEOAMV observation impact, and (d) GEOAMV observation number for EXP1 (dots), EXP2 (squares), and EXP3 (circles).

d. Beneficial observation rate

Figure 10a shows the fraction of beneficial observations for each observation type, for the four experiments in descending order with respect to EXP1. GPSPW, which has a small number of observations, exhibited the largest average value of 71.5%. The beneficial observation rates for the observation types, averaged over the experiments, were between 59.0% and 61.6%. The variation of the beneficial observation rates among the experiments was smaller than that of the observation impacts in Figs. 6 and 7 for the various observation types. Compared with the beneficial observation rate in EXP1, that in EXP2 (EXP3) increased (decreased) by 0.01% (0.8%), indicating a negligible change among EXP1, EXP2, and EXP3. In comparison with the beneficial observation rate in EXP0, those in EXP1, EXP2, and EXP3 decreased by 3.3%–4.1% on average.
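Counting beneficial observations, as in Fig. 10, is simply the fraction of per-observation impacts that are negative within each group. A self-contained sketch with a toy impact table (hypothetical column names, random data) is shown below.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
df = pd.DataFrame({
    "obs_type": rng.choice(["SOUND", "GEOAMV", "SYNOP", "GPSPW", "BUOY"], 2000),
    "impact":   rng.normal(loc=-0.02, scale=0.1, size=2000),  # toy impacts; negative = beneficial
})

# Fraction of observations in each type whose impact is negative (i.e., error reducing).
beneficial_rate = df.assign(beneficial=df["impact"] < 0).groupby("obs_type")["beneficial"].mean()
print(beneficial_rate.round(3))
print(f"overall beneficial fraction: {(df['impact'] < 0).mean():.3f}")
```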

Fig. 10. Fraction of beneficial observations for (a) observation types and (b) observation variables. The ranks are aligned in descending order with respect to the values of EXP1. Black, hatched, gray, and white bars represent the fraction of beneficial observations of EXP0, EXP1, EXP2, and EXP3, respectively. The dashed line indicates a ratio of 50%.

However, the beneficial observation rates in EXP1, EXP2, and EXP3 were greater than that of EXP0 for certain observation types (e.g., METAR and SHIPS) and similar to that of EXP0 for SYNOP, all of which are surface observations. Unlike the other surface observations, the BUOY data showed a distinct decrease of the beneficial observation rate from EXP0 to EXP1, EXP2, and EXP3. The BUOY data mostly consist of surface pressure observations, which can contribute to the reduction of the forecast error in the entire troposphere (Lei and Anderson 2014). By setting the response function above 150 hPa to zero in EXP1, EXP2, and EXP3, the beneficial observation rate of BUOY decreases distinctly from 57.31% in EXP0 to approximately 50% in the other experiments. Thus, approximately 7% of the BUOY observations (especially the pressure observations) may contribute to the reduction of the forecast error above 150 hPa. The other surface data (i.e., SYNOP, SHIPS, and METAR) include considerable numbers of observations for U, V, T, and Q, as well as P. Thus, when the beneficial observation rates are averaged over all variables, those for the other surface data do not vary much between EXP0 and the other experiments. A detailed analysis of the impact of observation types and variables on the forecast error reduction for specific atmospheric layers is beyond the scope of this study and will be discussed in a future study.

Figure 10b shows the fraction of beneficial observations for each observation variable, for the four experiments in descending order with respect to EXP1. Q exhibited the largest value, followed by T, BT, V, U, and P. For all variables except P and Q, EXP2 and EXP3 show smaller beneficial observation rates than EXP1. Interestingly, the beneficial observation rates for Q in EXP1, EXP2, and EXP3 are greater than that for Q in EXP0. Thus, the rate of specific humidity observations that are beneficial for reducing the forecast error of the atmosphere below 150 hPa is higher than that for the atmosphere from the surface to 10 hPa, which includes the layer above 150 hPa. Similar to Fig. 10a, the variation of the beneficial observation rates among the experiments was smaller than that of the observation impacts for the observation variables. The beneficial observation rates for the observation variables, averaged over the experiments, were between 60.5% and 62.1%. In comparison with the beneficial observation rate in EXP0, those in EXP1, EXP2, and EXP3 decreased by 2.3%–2.7% on average, which could be a considerable change in the beneficial observation rate. Except for Q, the 2.3%–2.7% greater beneficial observation rate in EXP0 implies that these observations contribute to the forecast error reduction in the atmospheric layer above 150 hPa. In comparison with the beneficial observation rate in EXP1, those in EXP2 and EXP3 decreased by 0.2% and 0.4%, respectively, demonstrating little change among EXP1, EXP2, and EXP3. The beneficial observation rates for T were greater than those for U and V by approximately 10%–14%, indicating that the observation variables associated with the height information may be more beneficial than those associated with the momentum information, similar to Fig. 7.

The relatively smaller variations in the fraction of beneficial observations among the experiments, in comparison with the observation impacts, imply that the fraction of beneficial observations does not vary considerably with the boundary-condition configurations (especially the lateral ones) when calculating the FSOI in the regional modeling framework. Despite the relatively smaller variations of the beneficial observation rate compared with those of the observation impact among the experiments, the beneficial observation rates of individual observation types and variables for specific atmospheric layers could change considerably, as shown for BUOY in Fig. 10a. A detailed analysis of the beneficial observation rates of observation types and variables with respect to the forecast error reduction for specific atmospheric layers will be presented in a future study.

4. Summary and discussion

The rapid increase in observation data, including satellite data, and improvements in numerical models are steadily increasing the capability of numerical forecasting. These data are used in the data assimilation process, in which the model background state and the observations are combined to obtain the initial conditions for short-, medium-, and long-range (climate) forecasts. To evaluate the impact of individual observations on numerical forecasts, the effect of each observation on the forecasts must be calculated quantitatively. The FSOI method has been used for this purpose. In the regional model framework, the FSOI calculation is affected by the boundary conditions of the model; nevertheless, the effect of boundary conditions on the FSOI results in the regional model framework has not been fully studied. Thus, in this study, the effect of the boundary-condition configuration during the FSOI calculation in a regional model (i.e., WRF) was evaluated.

Four experiments were conducted. EXP0 had the basic buffer in the lateral boundary conditions for both the forward model and backward adjoint model integration steps and did not have an upper-boundary constraint on the response function in the backward model integration. EXP1 also had the basic buffer, along with a constraint on the response function in the backward integration obtained by setting the response function above 150 hPa to 0. EXP2 (EXP3) had 2 times (4 times) the buffer grid points of the basic configuration in the lateral boundary conditions during the forward and backward integrations, with the same constraint on the response function at the upper boundary as EXP1.

The experiment without the upper-boundary constraint on the response function (EXP0) exhibited the largest forecast error and observation impact, although EXP0 is not the typical type of experiment used in FSOI studies. With or without DA, the forecast errors were smaller for larger buffer sizes in the lateral boundaries. This is caused by the smaller difference between the forecast and the analysis that was treated as the true state. Thus, the larger the buffer size, the smaller the forecast error. The differences among the forecast errors with DA (or without DA) in the experiments with different boundary-condition configurations are large, indicating that the boundary-condition configuration has a significant effect on the forecast errors with and without DA. The larger decrease rates in the nonlinear and linear FERs in the experiments with larger buffer sizes imply a larger impact of the lateral boundaries caused by the buffer size and a smaller observation impact on the forecast error. Thus, the observation impact decreases as the buffer size increases. The decrease in the FER (i.e., the smaller difference between the forecast errors with and without DA) implies that the forecast error without DA becomes similar to that with DA and that the observation impact of DA on the forecast becomes smaller.

In all experiments, SOUND exhibited the greatest total observation impact, followed by QSCAT, GEOAMV, and AMSU-A on board NOAA and MetOp satellites. After that, the surface and upper-air observations generally followed although the ranking of individual observation types was mixed. In terms of the observation variables, the brightness temperature exhibited the greatest observation impact followed by meridional wind, zonal wind, temperature, specific humidity, and surface pressure. The ranking of observation impacts was consistent for observation types and variables between experiments with the upper boundary constraint in the response function (i.e., EXP1, EXP2, and EXP3).

Compared with EXP1, the observation impacts in EXP2 and EXP3 decrease the most for channel 5, which measures the brightness temperature of the lower troposphere. In EXP1, EXP2, and EXP3, the observation impacts of the brightness temperature for the satellite observations (i.e., NOAA-15, NOAA-18, NOAA-19, and MetOp-2) were sensitive to the observation heights. In contrast, the observation impacts of U, V, T, and Q for SOUND, and of U and V for AMV, were not considerably sensitive to the observation heights.

The average beneficial observation rates for the observation types were between 59.0% and 61.6%. The beneficial observation rates for T were greater than those for U and V by approximately 10%–14%, indicating that the observation variables associated with the height information may be more beneficial than those associated with the momentum information. Thus, observing the height information of the atmosphere and constraining the winds (i.e., the momentum information) through the background error covariance used in Eq. (5) seems to work more effectively in this framework than vice versa. Although related to the height information, BT exhibited a relatively smaller beneficial observation rate, which is associated with the relatively larger uncertainty of BT. Because adjoint-based forecast sensitivities [i.e., linear combinations of singular vectors (SVs)] and SVs vary depending on the norm chosen (moist or dry total energy norm) (Kim and Jung 2009; Jung and Kim 2009), the FSOI results for Q could change with the moist total energy norm, which requires further study.

The smaller variation of the fraction of beneficial observations among the experiments, in comparison with the observation impacts, implies that the fraction of beneficial observations does not vary much with the lateral boundary conditions specified when calculating the FSOI in the regional modeling framework. This insensitivity of the beneficial observation rate to the boundary-condition configurations would be useful in selecting beneficial observation sets within the regional modeling framework, especially when producing regional reanalyses and reforecasts.

Acknowledgments

This study was supported by a National Research Foundation of Korea (NRF) grant funded by the South Korean government (Ministry of Science and ICT) (Grant 2021R1A2C1012572) and the Yonsei Signature Research Cluster Program of 2021 (2021-22-0003). The simulations were primarily conducted by utilizing the supercomputer system supported by the National Center for Meteorological Supercomputer of the Korea Meteorological Administration (KMA). Additionally, the authors appreciate Myunghwan Kim and Dr. Sung-Min Kim for their earlier work associated with this study.

Data availability statement

The observational data used in this study were Prepared Binary Universal Form for the Representation of Meteorological Data (PREPBUFR) of National Centers for Environmental Prediction (NCEP). The WRF, WRFDA, and WRFPLUS v3.8 were run on the supercomputer of the National Center for Meteorological Supercomputer of the KMA (http://super.kma.go.kr). The model output data evaluated are archived on the cluster of the National Center for Meteorological Supercomputer of the KMA (http://super.kma.go.kr).

REFERENCES

  • Amerault, C., K. Sashegyi, P. Pauley, and J. Doyle, 2013: Quantifying observation impact for a limited area atmospheric forecast model. Data Assimilation for Atmospheric, Oceanic and Hydrologic Applications, Vol. II, S. Park and L. Xu, Eds., Springer, 125–145.

  • Baker, N. L., and R. Daley, 2000: Observation and background adjoint sensitivity in the adaptive observation-targeting problem. Quart. J. Roy. Meteor. Soc., 126, 14311454, https://doi.org/10.1002/qj.49712656511.

    • Search Google Scholar
    • Export Citation
  • Barker, D. M., W. Huang, Y.-R. Guo, A. J. Bourgeois, and Q. N. Xiao, 2004: A three-dimensional variational data assimilation system for MM5: Implementation and initial results. Mon. Wea. Rev., 132, 897914, https://doi.org/10.1175/1520-0493(2004)132<0897:ATVDAS>2.0.CO;2.

    • Search Google Scholar
    • Export Citation
  • Barker, D. M., and Coauthors, 2012: The Weather Research and Forecasting Model’s Community Variational/Ensemble Data Assimilation System: WRFDA. Bull. Amer. Meteor. Soc., 93, 831843, https://doi.org/10.1175/BAMS-D-11-00167.1.

    • Search Google Scholar
    • Export Citation
  • Cardinali, C., 2009: Monitoring the observation impact on the short-range forecast. Quart. J. Roy. Meteor. Soc., 135, 239250, https://doi.org/10.1002/qj.366.

    • Search Google Scholar
    • Export Citation
  • Chen, F., and J. Dudhia, 2001: Coupling an advanced land surface–hydrology model with the Penn State–NCAR MM5 modeling system. Part I: Model implementation and sensitivity. Mon. Wea. Rev., 129, 569585, https://doi.org/10.1175/1520-0493(2001)129<0569:CAALSH>2.0.CO;2.

    • Search Google Scholar
    • Export Citation
  • Daescu, D. N., 2008: On the sensitivity equations of four-dimensional variational (4D-Var) data assimilation. Mon. Wea. Rev., 136, 30503065, https://doi.org/10.1175/2007MWR2382.1.

    • Search Google Scholar
    • Export Citation
  • Davies, H. C., and R. E. Turner, 1977: Updating prediction models by dynamical relaxation: An examination of the technique. Quart. J. Roy. Meteor. Soc., 103, 225245, https://doi.org/10.1002/qj.49710343602.

    • Search Google Scholar
    • Export Citation
  • Dudhia, J., 1989: Numerical study of convection observed during the Winter Monsoon Experiment using a mesoscale two-dimensional model. J. Atmos. Sci., 46, 30773107, https://doi.org/10.1175/1520-0469(1989)046<3077:NSOCOD>2.0.CO;2.

    • Search Google Scholar
    • Export Citation
  • Errico, R. M., 2007: Interpretations of an adjoint-derived observational impact measure. Tellus, 59A, 273276, https://doi.org/10.1111/j.1600-0870.2006.00217.x.

    • Search Google Scholar
    • Export Citation
  • Gelaro, R., and Y. Zhu, 2009: Examination of observation impacts derived from observing system experiments (OSEs) and adjoint models. Tellus, 61A, 179193, https://doi.org/10.1111/j.1600-0870.2008.00388.x.

    • Search Google Scholar
    • Export Citation
  • Gelaro, R., Y. Zhu, and R. M. Errico, 2007: Examination of various-order adjoint-based approximations of observation impact. Meteor. Z., 16, 685692, https://doi.org/10.1127/0941-2948/2007/0248.

    • Search Google Scholar
    • Export Citation
  • Gelaro, R., R. H. Langland, S. Pellerin, and R. Todling, 2010: The THORPEX observation impact intercomparison experiment. Mon. Wea. Rev., 138, 40094025, https://doi.org/10.1175/2010MWR3393.1.

    • Search Google Scholar
    • Export Citation
  • Hong, S.-Y., and J.-O. Lim, 2006: The WRF single-moment 6-class microphysics scheme (WSM6). Asia-Pac. J. Atmos. Sci., 42, 129151.

  • Hong, S.-Y., Y. Noh, and J. Dudhia, 2006: A new vertical diffusion package with an explicit treatment of entrainment processes. Mon. Wea. Rev., 134, 23182341, https://doi.org/10.1175/MWR3199.1.

    • Search Google Scholar
    • Export Citation
  • Joo, S., J. Eyre, and R. Marriott, 2013: The impact of MetOp and other satellite data within the Met Office global NWP system using an adjoint-based sensitivity method. Mon. Wea. Rev., 141, 33313342, https://doi.org/10.1175/MWR-D-12-00232.1.

    • Search Google Scholar
    • Export Citation
  • Jung, B.-J., and H. M. Kim, 2009: Moist-adjoint based forecast sensitivities for a heavy snowfall event over the Korean Peninsula on 4–5 March 2004. J. Geophys. Res., 114, D15104, https://doi.org/10.1029/2008JD011370.

    • Search Google Scholar
    • Export Citation
  • Jung, B.-J., H. M. Kim, Y.-H. Kim, E.-H. Jeon, and K.-H. Kim, 2010: Observation system experiments for Typhoon Jangmi (200815) observed during T-PARC. Asia-Pac. J. Atmos. Sci., 46, 305316, https://doi.org/10.1007/s13143-010-1007-y.

    • Search Google Scholar
    • Export Citation
  • Jung, B.-J., H. M. Kim, F. Zhang, and C.-C. Wu, 2012: Effect of targeted dropsonde observations and best track data on the track forecasts of Typhoon Sinlaku (2008) using an ensemble Kalman filter. Tellus, 64A, 14984, https://doi.org/10.3402/tellusa.v64i0.14984.

    • Search Google Scholar
    • Export Citation
  • Jung, B.-J., H. M. Kim, T. Auligne, X. Zhang, and X.-Y. Huang, 2013: Adjoint-derived observation impact using WRF in the western North Pacific. Mon. Wea. Rev., 141, 40804097, https://doi.org/10.1175/MWR-D-12-00197.1.

    • Search Google Scholar
    • Export Citation
  • Kain, J. S., 2004: The Kain–Fritsch convective parameterization: An update. J. Appl. Meteor., 43, 170181, https://doi.org/10.1175/1520-0450(2004)043<0170:TKCPAU>2.0.CO;2.

    • Search Google Scholar
    • Export Citation
  • Kalnay, E., 2003: Atmospheric Modelling, Data Assimilation, and Predictability. Cambridge University Press, 341 pp.

  • Kelly, G., J.-N. Thepaut, R. Buizza, and C. Cardinali, 2007: The value of observations. I: Data denial experiments for the Atlantic and the Pacific. Quart. J. Roy. Meteor. Soc., 133, 18031815, https://doi.org/10.1002/qj.150.

    • Search Google Scholar
    • Export Citation
  • Kim, H. M., and B.-J. Jung, 2009: Influence of moist physics and norms on singular vectors for a tropical cyclone. Mon. Wea. Rev., 137, 525543, https://doi.org/10.1175/2008MWR2739.1.

    • Search Google Scholar
    • Export Citation
  • Kim, M., H. M. Kim, J. Kim, S.-M. Kim, C. Velden, and B. Hoover, 2017: Effect of enhanced satellite-derived atmospheric motion vectors on numerical weather prediction in East Asia using an adjoint-based observation impact method. Wea. Forecasting, 32, 579594, https://doi.org/10.1175/WAF-D-16-0061.1.

    • Search Google Scholar
    • Export Citation
  • Kim, S.-M., and H. M. Kim, 2014: Sampling error of observation impact statistics. Tellus, 66A, 25435, https://doi.org/10.3402/tellusa.v66.25435.

    • Search Google Scholar
    • Export Citation
  • Kim, S.-M., and H. M. Kim, 2017: Adjoint-based observation impact of Advanced Microwave Sounding Unit-A (AMSU-A) on the short range forecasts in East Asia (in Korean). Atmosphere, 27, 93104, https://doi.org/10.14191/Atmos.2017.27.1.093.

    • Search Google Scholar
    • Export Citation
  • Kim, S.-M., and H. M. Kim, 2018: Effect of observation error variance adjustment on numerical weather prediction using forecast sensitivity to error covariance parameters. Tellus, 70A, 116, https://doi.org/10.1080/16000870.2018.1492839.

    • Search Google Scholar
    • Export Citation
  • Kim, S.-M., and H. M. Kim, 2019: Forecast sensitivity observation impact in the 4DVAR and hybrid-4DVAR data assimilation systems. J. Atmos. Oceanic Technol., 36, 15631575, https://doi.org/10.1175/JTECH-D-18-0240.1.

    • Search Google Scholar
    • Export Citation
  • Kumar, A., J. Done, and J. Dudhia, 2011: Simulations of Cyclone Sidr in the Bay of Bengal with a high-resolution model: Sensitivity to large-scale boundary forcing. Meteor. Atmos. Phys., 114, 123137, https://doi.org/10.1007/s00703-011-0161-9.

    • Search Google Scholar
    • Export Citation
  • Langland, R., and N. Baker, 2004: Estimation of observation impact using the NRL atmospheric variational data assimilation adjoint system. Tellus, 56A, 189201, https://doi.org/10.3402/tellusa.v56i3.14413.

    • Search Google Scholar
    • Export Citation
  • Lei, L., and J. L. Anderson, 2014: Impacts of frequent assimilation of surface pressure observations on atmospheric analyses. Mon. Wea. Rev., 142, 44774483, https://doi.org/10.1175/MWR-D-14-00097.1.

    • Search Google Scholar
    • Export Citation
  • Lorenc, A. C., and R. Marriott, 2014: Forecast sensitivity to observations in the Met Office global numerical weather prediction system. Quart. J. Roy. Meteor. Soc., 140, 209224, https://doi.org/10.1002/qj.2122.

    • Search Google Scholar
    • Export Citation
  • Lupu, C., A. Geer, N. Bormann, and S. English, 2016: An evaluation of radiative transfer modelling errors in AMSU-A data. ECMWF Tech. Memo. 770, 39 pp., https://www.ecmwf.int/en/elibrary/16711-evaluation-radiative-transfer-modelling-errors-amsu-data.

  • Mlawer, E. J., S. J. Taubman, P. D. Brown, M. J. Iacono, and S. A. Clough, 1997: Radiative transfer for inhomogeneous atmosphere: RRTM, a validated correlated-k model for the longwave. J. Geophys. Res., 102, 16 66316 682, https://doi.org/10.1029/97JD00237.

    • Search Google Scholar
    • Export Citation
  • Parrish, D. F., and J. C. Derber, 1992: The National Meteorological Center’s spectral statistical-interpolation analysis system. Mon. Wea. Rev., 120, 17471763, https://doi.org/10.1175/1520-0493(1992)120<1747:TNMCSS>2.0.CO;2.

  • Rabier, F., E. Klinker, P. Courtier, and A. Hollingsworth, 1996: Sensitivity of forecast errors to initial conditions. Quart. J. Roy. Meteor. Soc., 122, 121–150, https://doi.org/10.1002/qj.49712252906.

  • Shi, Y., K.-F. Li, Y. L. Yung, H. H. Aumann, Z. Shi, and T. Y. Hou, 2013: A decadal microwave record of tropical air temperature from AMSU-A/Aqua observations. Climate Dyn., 41, 1385–1405, https://doi.org/10.1007/s00382-013-1696-x.

  • Skamarock, W. C., and Coauthors, 2008: A description of the Advanced Research WRF version 3. NCAR Tech. Note NCAR/TN-475+STR, 113 pp., https://doi.org/10.5065/D68S4MVH.

  • Wang, X. D., Z. Zhong, Y. J. Hu, and H. H. Yuan, 2010: Effect of lateral boundary scheme on the simulation of tropical cyclone track in regional climate model RegCM3. Asia-Pac. J. Atmos. Sci., 46, 221–230, https://doi.org/10.1007/s13143-010-0019-y.

  • Yamaguchi, M., T. Iriguchi, T. Nakazawa, and C.-C. Wu, 2009: An observing system experiment for Typhoon Conson (2004) using a singular vector method and DOTSTAR data. Mon. Wea. Rev., 137, 2801–2816, https://doi.org/10.1175/2009MWR2683.1.

  • Zhang, X., X.-Y. Huang, and N. Pan, 2013: Development of the upgraded tangent linear and adjoint of the Weather Research and Forecasting (WRF) Model. J. Atmos. Oceanic Technol., 30, 1180–1188, https://doi.org/10.1175/JTECH-D-12-00213.1.

  • Fig. 1. Experimental domain of this study, indicated by the black solid line. Buffer zones with 5 buffer grids (gray solid), 10 buffer grids (black dashed), and 20 buffer grids (gray dashed) are also shown.

  • Fig. 2. Process of the cycling DA experiment and the FSOI calculation.

  • Fig. 3. Time series of forecast error with DA (gray) and without DA (black) during the experimental period: (a) EXP0, (b) EXP1, (c) EXP2, and (d) EXP3. Note that the vertical scale of (a) is greater than those of (b)–(d).

  • Fig. 4. Time series of nonlinear FER (black) and approximated FER (i.e., linear FER or observation impact; gray) during the experimental period: (a) EXP0, (b) EXP1, (c) EXP2, and (d) EXP3. Note that the vertical scale of (a) is greater than those of (b)–(d).

  • Fig. 5. Time series of the difference in linear FER (black) between experiments during the experimental period: (a) EXP1 − EXP0, (b) EXP2 − EXP1, (c) EXP3 − EXP1, and (d) EXP3 − EXP2. Note that the vertical scale of (a) is greater than those of (b)–(d).

  • Fig. 6. Time-averaged observation impacts for (a) observation types and (c) observation variables during the experimental period. (b) Time-averaged observation number for observation types. All ranks are aligned in descending order with respect to the absolute observation impact values of EXP1. Black, hatched, gray, and white bars represent the observation impacts of EXP0, EXP1, EXP2, and EXP3, respectively.

  • Fig. 7. As in Fig. 6, but normalized by the number of observations for a given observation type or variable.

  • Fig. 8. Time-averaged observation impact for (a) NOAA-15, (b) NOAA-18, (c) NOAA-19, and (d) MetOp-2 for each experiment during the experimental period. Black, red, green, yellow, and blue colors represent channels 5, 6, 7, 8, and 9, respectively.

  • Fig. 9. Vertical profile of (a) SOUND observation impact, (b) SOUND observation number, (c) GEOAMV observation impact, and (d) GEOAMV observation number for EXP1 (dots), EXP2 (squares), and EXP3 (circles).

  • Fig. 10. Fraction of beneficial observations for (a) observation types and (b) observation variables. The ranks are aligned in descending order with respect to the values of EXP1. Black, hatched, gray, and white bars represent the fraction of beneficial observations of EXP0, EXP1, EXP2, and EXP3, respectively. The dashed line indicates a ratio of 50%.
