## 1. Introduction

In numerical weather prediction (NWP) systems, analysis is conducted using a data assimilation (DA) system that combines observations and the background by considering their respective error statistics. Although the number of observations has rapidly increased, it is not clear that these observations are always beneficial to forecast performance. Thus, it is necessary to monitor and evaluate how observations are used in DA and forecast systems.

The impact of observations on forecasts has traditionally been assessed with observing system experiments (OSEs: Atlas 1997; Masutani et al. 2010). In OSEs, new analyses are made by adding (subtracting) specific observations to (from) the reference set of observations in a DA system. The impact of given observations can then be evaluated by comparing the forecast integrated from a new analysis with that from a reference analysis. By performing OSEs, the observation impact for any forecast lead time or for subsets of geographical regions (e.g., the tropics, each hemisphere, or East Asia) can be evaluated. OSEs are usually performed in operational centers to assess the contribution of each observation type to the improvement of their NWP systems (e.g., Bouttier and Kelly 2001; Kelly et al. 2007; Zapotocny et al. 2008; WMO 2008; Laroche and Sarrazin 2010a,b). However, OSEs require considerable computational resources because the entire DA and forecast system must be run independently of the reference analysis–forecast run to evaluate the impact of each observation.

An alternative method to evaluate the impact of observations on forecasts is the adjoint-derived observation impact method, which is based on the adjoint-based observation sensitivity approach introduced by Baker and Daley (2000) in the context of adaptive observation. The methodology of Baker and Daley (2000) provided the basis for succeeding adjoint-derived observation impact studies. The adjoint-derived observation impact method can simultaneously evaluate the observation impact for all datasets with less computation than OSEs by using the adjoint of the DA system and forecast system. The adjoint of the DA system can be developed by line-by-line adjoint coding (Zhu and Gelaro 2008) or by estimating the analysis error covariance using an iterative minimization algorithm (Tremolet 2008). Higher-order approximations of forecast error measurement and their characteristics in the context of adjoint-derived observation impact calculation have been discussed by Errico (2007), Gelaro et al. (2007), and Tremolet (2007). Other higher-order approximations of forecast error were introduced as a parametric approach by Daescu and Todling (2009). Langland and Baker (2004) analyzed the observation impact of the short-range forecast error in the Naval Research Laboratory (NRL) system. Gelaro et al. (2010) compared the observation impacts from three global operational systems from NRL, Environment Canada (EC), and the Global Modeling and Assimilation Office (GMAO) of the National Aeronautics and Space Administration (NASA).

Two methods of observation impact estimation (i.e., OSEs and the adjoint-derived method) were evaluated and compared by Gelaro and Zhu (2009) for the NASA GMAO system and by Cardinali (2009) for the European Centre for Medium-Range Weather Forecasts (ECMWF) system. These authors found a qualitatively similar observation impact on short-range forecasts using both methods. Observation impact estimation without the adjoint model was also recently suggested and evaluated in the context of ensemble data assimilation (Ancell and Hakim 2007; Torn and Hakim 2008; Liu and Kalnay 2008; Li et al. 2010; Kunii et al. 2012). The impact of observations from special field campaigns was also evaluated for the Fronts and Atlantic Storm-Track Experiment (FASTEX; Doerenbecher and Bergot 2001; Fourrie et al. 2002) and the Atlantic Regional Campaign (A-TReC; Langland 2005) of The Observing System Research and Predictability Experiment (THORPEX).

In this study, the impact of observations on forecasts is evaluated using the adjoint-derived method in a limited-area model for the 2008 typhoon season, during which an international field campaign, the THORPEX Pacific Asian Regional Campaign (T-PARC), was performed. This evaluation employs the Advanced Research version of the Weather Research and Forecasting Model (ARW-WRF), its adjoint model (WRFPLUS), and the corresponding three-dimensional variational data assimilation (3DVAR) system, centered in East Asia and the western North Pacific. The adjoint-derived observation impact is compared to that from OSEs for major observation types. Sensitivities to the background and observation error covariance parameter are also evaluated. The adjoint-derived observation impact tool used in this study was developed by Auligné et al. (2011). To the authors' knowledge, this is the first study to fully assess the adjoint-derived observation impact in the WRF system and the sensitivity to the error covariance parameter within the limited-area model framework. Section 2 introduces the methodology for adjoint-derived observation impact, and section 3 provides the experimental framework. Section 4 provides the observation impact results, and section 5 presents a summary and discussion.

## 2. Methodology

### a. Basic concept

For a forecast response function *R*,^{1} the sensitivity to the initial state can be calculated as follows:
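A standard form of this relation, sketched here under assumed notation (**M** the tangent-linear forecast model, **M**^T its adjoint, **x**_0 the initial state, and **x**_f the forecast state), is:

```latex
\frac{\partial R}{\partial \mathbf{x}_0} = \mathbf{M}^{\mathrm{T}}\,\frac{\partial R}{\partial \mathbf{x}_f}
```

That is, a gradient of the response function with respect to the forecast state is mapped back to the initial time by a single integration of the adjoint model.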

There have been many previous studies using adjoint-derived forecast sensitivity analysis for idealized cyclogenesis (Rabier et al. 1992; Langland et al. 1995; Kim and Beare 2011), real cyclogenesis (Errico and Vukicevic 1992; Rabier et al. 1996; Klinker et al. 1998; Zou et al. 1998; Langland et al. 2002; Kleist and Morgan 2005a,b; Ancell and Mass 2006, 2008; Jung and Kim 2009), tropical cyclones (Kim and Jung 2006; Wu et al. 2007; Chu et al. 2011), and Asian dust transport events (Kim et al. 2008; Kim and Kay 2010). A detailed introduction to adjoint theory and applications can be found in Errico (1997).

*R* in (2.2) can be expressed using (2.4) and the adjointness relationship as

*R* can be calculated as in Baker and Daley (2000):
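A minimal sketch of the Baker and Daley (2000) relation, in assumed notation (**K** the gain matrix of the analysis, **H** the linearized observation operator, **B** and **O** the background and observation error covariances, **y** the observations, and **x**_a the analysis):

```latex
\frac{\partial R}{\partial \mathbf{y}} = \mathbf{K}^{\mathrm{T}}\,\frac{\partial R}{\partial \mathbf{x}_a},
\qquad
\mathbf{K} = \left(\mathbf{B}^{-1} + \mathbf{H}^{\mathrm{T}}\mathbf{O}^{-1}\mathbf{H}\right)^{-1}\mathbf{H}^{\mathrm{T}}\mathbf{O}^{-1}
```

The observation sensitivity is thus obtained from the analysis sensitivity through the transpose (adjoint) of the gain matrix.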

### b. Practical approach

The practical approach defines *R* as the difference between the errors of the forecasts initialized from the analysis and from the background [cf. (2.9)].
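A sketch of the commonly used form of this error-difference estimate (e.g., Langland and Baker 2004; Gelaro et al. 2007), under assumed notation, with **d** = **y** − *H*(**x**_b) the innovation, *e*(·) the forecast error norm, and **x**_f^a and **x**_f^b the forecasts from the analysis and the background:

```latex
\delta e = e\!\left(\mathbf{x}_f^{a}\right) - e\!\left(\mathbf{x}_f^{b}\right)
\approx \frac{1}{2}\,\mathbf{d}^{\mathrm{T}}\mathbf{K}^{\mathrm{T}}\mathbf{M}^{\mathrm{T}}
\left[\left.\frac{\partial e}{\partial \mathbf{x}_f}\right|_{a} + \left.\frac{\partial e}{\partial \mathbf{x}_f}\right|_{b}\right]
```

Because the estimate is an inner product with the innovation vector, the total can be partitioned over any subset of observations (variable, type, channel, or region), which is the property exploited throughout section 4.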

## 3. Experimental framework

### a. Modeling system

The numerical experiments in this study use the ARW-WRF system (Skamarock et al. 2008). The model domain comprises 141 (zonal direction) by 131 (meridional direction) horizontal grid points centered at 25°N latitude and 125°E longitude, with a 45-km horizontal grid spacing that includes East Asia and the western North Pacific.^{2} The domain has 41 vertical levels with the model top at 50 hPa. The subgrid-scale parameterizations used in this study include the new Kain–Fritsch scheme (Kain 2004) for cumulus parameterization, the WRF single-moment 6-class scheme (Hong and Lim 2006) for microphysics parameterization, the Dudhia scheme (Dudhia 1989) for shortwave radiation parameterization, the Rapid Radiative Transfer Model (RRTM) scheme (Mlawer et al. 1997) for longwave radiation parameterization, the Yonsei University (YSU) scheme (Hong et al. 2006) for planetary boundary layer parameterization, and the Noah land surface model (Chen and Dudhia 2001) for land surface parameterization.

The adjoint version of WRF is needed to calculate the adjoint-derived observation impact. This study uses the WRFPLUS system (Xiao et al. 2008; Huang et al. 2009; Zhang and Huang 2011), which includes the adjoint and tangent linear version of WRF.

Because the WRF system is based on a limited-area model, the lateral boundary condition (LBC) is essential. This study uses the Final Analysis (FNL) data of the National Centers for Environmental Prediction (NCEP) at 1° × 1° horizontal resolution to provide the LBC and initial condition in the analysis–forecast cycles.

### b. Analysis system

The 3DVAR (Barker et al. 2004; Barker et al. 2012) DA system within the WRFDA system is used as an analysis system.^{3} In WRFDA, the analysis procedure is performed using the incremental formulation (Courtier et al. 1994), in which an analysis increment is defined. The cost function *J* in (2.13) is minimized using the Lanczos algorithm (Golub and Van Loan 1996; Auligné et al. 2011). The background error statistics (BES) for the 3DVAR DA system are calculated using the National Meteorological Center (NMC, now known as NCEP) method (Parrish and Derber 1992). This study uses 47-day statistics of the difference between the 12- and 24-h forecasts from 15 August to 30 September 2008 to estimate the BES for the given domain using the gen_be utilities within the WRFDA system.
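The NMC method estimates the static background error covariance from differences between forecasts of different lengths valid at the same time. A minimal sketch of the idea, using synthetic stand-ins for the 47 daily samples of 24-h minus 12-h forecast differences (the real gen_be utilities work on full model fields and fit structured covariance models, not a raw sample covariance):

```python
import numpy as np

# Synthetic placeholders: 47 daily pairs of forecasts over a toy state
# vector. In practice these are full WRF forecast fields.
rng = np.random.default_rng(42)
n_samples, n_state = 47, 6
f24 = rng.normal(size=(n_samples, n_state))              # 24-h forecasts
f12 = f24 + 0.1 * rng.normal(size=(n_samples, n_state))  # 12-h forecasts

diff = f24 - f12            # forecast differences valid at the same time
diff -= diff.mean(axis=0)   # remove the sample mean

# Sample covariance of the differences serves as a proxy for B.
B = diff.T @ diff / (n_samples - 1)
```

The sample covariance is symmetric positive semidefinite by construction, which is the property the minimization in (2.13) relies on.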

### c. Observations

Table 1 provides acronyms of the observation types, and Table 2 summarizes the observations used in this study with their assimilated observational variables. The observation data are the prepared Binary Universal Form for the Representation of Meteorological Data (PREPBUFR) files that were assimilated in the operational NCEP Global Data Assimilation System (GDAS) and archived in the National Center for Atmospheric Research (NCAR) Research Data Archive (RDA). The raw data have already been processed into PREPBUFR format at NCEP, and additional processing, including data thinning, bias correction, and other quality control procedures, is performed in the WRFDA system. Conventional observations and satellite wind observations [i.e., atmospheric motion vector (AMV) winds from geostationary satellites (GEOAMV) and the Quick Scatterometer (QuikSCAT)] are thinned at 20-km resolution, and the satellite radiance observations are thinned at 90-km resolution.

Acronyms used for the various observation types studied.

Descriptions of the observation types used in this study. The rightmost column contains specific observation types with corresponding assimilated observational variables; *u*, *υ*, *T*, *q*, Ps, TPW, and Tb represent the zonal wind, meridional wind, temperature, specific humidity, surface pressure, total precipitable water, and brightness temperature, respectively.

### d. Reference experiment

The reference state (REFER) is produced by assimilating all observations introduced in section 3c into the WRF 3DVAR DA system from 15 August to 2 October 2008. The earliest integration starts at 0000 UTC 15 August 2008 with the NCEP FNL data as an initial condition. The reference analyses are subsequently constructed with a 6-h assimilation cycle. The radiance observations are directly assimilated in the 3DVAR DA system with the Community Radiative Transfer Model (CRTM; Han et al. 2006) as a forward operator. To handle systematic biases in the radiance observations, the variational bias correction scheme (VarBC; Derber and Wu 1998; Dee 2005; Auligné et al. 2007) is used.

### e. Experimental design

The adjoint-derived observation impact is evaluated from 0000 UTC 16 August to 1800 UTC 1 October 2008 using the WRF, WRFPLUS, and WRFDA systems. The response function for the adjoint model integration is the difference in the forecast errors shown in (2.9). The error reduction of 6-h forecasts is chosen partly because the data assimilation window of most operational centers is 6 h and partly because of limited computational resources. An additional experiment with 24-h forecasts was performed to demonstrate the validity of the results with 6-h forecasts. To define the forecast errors in (2.8), the reference analysis that assimilated all observation types in the WRFDA system (section 3d) is considered a true state. The dry total energy norm is used to define the forecast error norm in (2.7). The forecast error is calculated for the entire domain.
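A commonly used form of the dry total energy norm, sketched here under assumed notation (primes denote forecast errors in wind, temperature, and surface pressure; *c_p* and *R_d* are the specific heat and gas constant of dry air; *T_r* and *p_r* are reference temperature and pressure):

```latex
e = \frac{1}{2}\int_{V}\left(u'^2 + v'^2 + \frac{c_p}{T_r}\,T'^2\right) dV
  + \frac{1}{2}\int_{S} R_d T_r \left(\frac{p_s'}{p_r}\right)^{2} dS
```

This dry form carries no moisture term, which is relevant to the interpretation of the moisture-observation impacts in section 4.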

## 4. Results

### a. Validity of the linear estimation

Figure 1a compares the forecast error reduction in (2.9) with its linear estimation in (2.11). Overall,

### b. Sensitivity to observation

#### 1) Observation impact estimation

In this section, the observation impact is aggregated over a given subset (i.e., variable, type, or channel) at each analysis time; the time-averaged statistics are then evaluated. Figure 2 shows the observation impact according to the observation variables (i.e., *u*, *υ*, *T*, Ps, *q*, and Tb). The satellite wind observations (wind_s) are distinguished from the conventional wind observations (wind_c). Because the dry total energy norm is used for the forecast error definition, the observation impact has units of joules per kilogram (i.e., J kg^{−1}). The greatest total observation impact is from the radiance observations (Fig. 2a). This impact is one order of magnitude greater than the impacts of the other variables. For conventional observations, the total observation impact of the momentum variables is greater than those of the mass and moisture variables, which is consistent with the recent result reported by the World Meteorological Organization (WMO 2012). The satellite wind observations have a much smaller total impact than the conventional wind observations. The magnitude of the total observation impact is closely related to the number of observations (Fig. 2b). The radiance observations have the greatest number of observations, which contributes to their greatest total observation impact. For conventional observations, the number of wind observations is greater than the numbers of temperature, surface pressure, and humidity observations (Fig. 2b). To evaluate the normalized observation impact, the observation impact per observation number is calculated (Fig. 2c). The conventional wind observations have the greatest normalized impact. The surface pressure (Ps) observations, which have the least total impact, have the second greatest normalized impact.
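The three statistics of Fig. 2 (total impact, normalized impact, beneficial fraction) follow directly from the per-observation impact values. A minimal sketch, assuming the sign convention that negative values denote forecast error reduction (beneficial observations); the function name and sample values are hypothetical:

```python
import numpy as np

def impact_stats(imp):
    """Summarize per-observation impacts (J kg^-1) for one subset.

    Assumes negative values denote forecast-error reduction
    (beneficial observations).
    """
    imp = np.asarray(imp, dtype=float)
    total = imp.sum()               # total observation impact
    n = imp.size                    # number of observations
    normalized = total / n          # impact per observation number
    beneficial = (imp < 0).mean()   # fraction of beneficial observations
    return total, n, normalized, beneficial

# Hypothetical impacts for four observations of one variable:
total, n, norm, frac = impact_stats([-2.0, -0.5, 0.3, -0.1])
```

Computed per subset at each analysis time and then time averaged, these quantities reproduce the panels of Figs. 2–4.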

Time-averaged statistics aggregated for each observation variable for (a) total observation impact, (b) number of observations, (c) normalized observation impact, and (d) fraction of beneficial observations. Satellite and conventional wind observations are represented by wind_s and wind_c, respectively.

Citation: Monthly Weather Review 141, 11; 10.1175/MWR-D-12-00197.1


In an idealized study, Bengtsson (1980) found that surface pressure observations contain useful information about the amplitudes and phases of vertically tilted baroclinic modes throughout the troposphere. Compo et al. (2006) and Whitaker et al. (2009) also found that surface pressure observations provide more information on large-scale tropospheric circulation than do surface temperature or wind observations. Through geostrophy, the surface pressure information yields a reasonable approximation of the barotropic part of the flow, which accounts for a substantial part of the total flow (Compo et al. 2011). The normalized impact of radiance observations is comparable to that of temperature. The moisture observations have the smallest normalized impact. Figure 2d shows the fraction of beneficial observations that reduce the 6-h forecast error for each observational variable. This statistic is averaged for all analysis times. Only two-thirds of the observations contribute to the reduction in the 6-h forecast error, a proportion that is significantly greater than that found in previous studies [50%–54% in Gelaro et al. (2010) and slightly greater than 50% in Kunii et al. (2012)]. This difference may be because the true state is taken from an analysis made every 6 h in a cycling experiment; hence, the true state is partly correlated with the forecast based on the assimilation at the initial time.

Figure 3 shows the observation impact for the observation types: nine types of conventional observations (SOUND, SYNOP, PILOT, AIREP, GPSPW, METAR, SHIPS, PROFL, and BUOY; see Table 1 for explanations of the terms) and three types of satellite observations [QuikSCAT, GEOAMV, and AMSU-A radiance observations obtained from four different satellites: *National Oceanic and Atmospheric Administration* (*NOAA*)-*15*, -*16*, and -*17*, and the Meteorological Operational satellite *MetOp-A*]. In the time-averaged total observation impact for each observation type, the SOUND observations have the greatest impact. The radiance observations from each satellite also have a large total observation impact. The sum of the AMSU-A impacts from the four satellites is much greater than the impact of SOUND, which demonstrates that satellite observations are an indispensable component of the observing system in NWP. The SYNOP and two satellite wind observations (i.e., QuikSCAT and GEOAMV) follow SOUND and AMSU-A in importance. The smaller impact of GEOAMV when used together with many satellite radiance data (i.e., AMSU-A) is consistent with recent studies in WMO (2012). When normalized by the observation number (Fig. 3b), the impact of the radiance observations from the four satellites and of the SOUND observations is greatly reduced and becomes comparable to that of the other observation types (Fig. 3c). The greatest observation impact per observation number is obtained from GPSPW, which observes total precipitable water with the surface-based global positioning system (GPS). Note that GPSPW has a great impact on 6-h forecast error reduction even though the forecast error is defined without the moisture term and moisture is assimilated with a univariate formulation in the WRFDA system. In contrast to the results obtained for the total observation impact, the satellite wind observations (i.e., GEOAMV and QuikSCAT) have a relatively small observation impact per observation number. When the fraction of beneficial observations is examined (Fig. 3d), 60%–70% of the observations are beneficial, similar to the result shown in Fig. 2d.

As in Fig. 2, but for time-averaged statistics stratified by observation type.


Figure 4 shows the observation impact for each channel of the AMSU-A radiance observations. The time-averaged total observation impact is greatest for most channels of the *MetOp-A* and *NOAA-18* satellites. Both the total and normalized observation impacts are greatest for channel 9, which is sensitive to the upper troposphere (Figs. 4a and 4c). Whereas channel 9 shows a larger beneficial fraction, channel 5, which is sensitive to mid- to lower-tropospheric temperatures, shows a smaller fraction (Fig. 4d). This is not consistent with the results of Gelaro et al. (2010), in which channels 5–7 had the greatest impact. This discrepancy may be due to the different configuration of the modeling system used. The modeling system in this study has a relatively low model top (50 hPa), which makes the forecast error largest in the upper troposphere (not shown). In addition, the error norm used in this study does not weight for vertical level thickness; thus, the forecast error at upper levels, where the density is low, may be overemphasized. Proper norm definition is one of the problems encountered when the adjoint-derived observation impact methodology is used.

As in Fig. 2, but for time-averaged statistics stratified by AMSU-A channel from four satellites.


Figure 5 presents scatterplots of the innovation and the corresponding observation impact at 0000 UTC 11 September 2008. Overall, the scatterplots display bow-shaped features, which indicate that greater innovation corresponds to greater observation impact. However, a small observation impact can occur despite a large innovation because the given observation may have low observation sensitivity; the observations near the *x* axis exhibit this characteristic. As discussed in sections 4b(2) and 4b(3), not all observations contribute beneficially to forecast error reduction. At the given analysis time, a larger beneficial fraction is shown for AMSU-A channels 7 and 9 from *MetOp-A* (70% and 72%, respectively) than for GEOAMV *u* and *υ* (53% and 60%, respectively). Channel 9 shows higher observation sensitivity than channel 7, as shown in Fig. 4c.

Scatterplots of innovation and observation impact for (a) zonal and (b) meridional wind observations of GEOAMV; and channels (c) 7 and (d) 9 of AMSU-A from *METOP-A* satellite at 0000 UTC 11 Sep 2008.


To verify that the observation impact discussed above holds for forecast ranges longer than 6 h, the observation impact on the 24-h forecast error reduction was also evaluated for a 2-week period from 0000 UTC 1 to 1800 UTC 14 September 2008. The results for the 24-h forecast error reduction are generally consistent with those for the 6-h forecast error reduction except for the beneficial fraction (not shown). The major rankings of the observation variables and major observation types are unchanged. The relative impact of the surface pressure observations is increased for the 24-h forecast error reduction. Except for the surface pressure observations, the beneficial fraction of the other observation variables is reduced by approximately 6% (from 66%–72% to 60%–66%). This implies that the verifying analysis is partly correlated with the assimilated observations. Although the impact of the LBC at longer forecast times is one of the main interests in a limited-area model (e.g., Errico et al. 1993; Gustafsson et al. 1998), the impact of the LBC on the results may be small for the longer forecast period examined here because the general features of the observation impacts for the 6- and 24-h forecast error reductions are very similar. The impact of the LBC on the forecast sensitivity to observations within a limited-area model deserves further, more detailed study.

#### 2) Observation impact distribution

This section presents the vertical and horizontal distributions of the observation impact for SOUND and GEOAMV. Figure 6 shows the vertical distribution of the observation impact evaluated in this study. The observation impact is categorized for eight vertical intervals: surface (1000)–850, 850–700, 700–500, 500–400, 400–300, 300–200, 200–100, and 100–model top (50) hPa. For SOUND, although the total observation impact has two peaks at 100–200 and 500–700 hPa (Fig. 6a), the observation impact per observation number is greatest in the upper troposphere (Fig. 6c). Compared to SOUND, the vertical distribution for GEOAMV exhibits more vertical variation. The total observation impact for GEOAMV is greatest at 200–300 hPa (Fig. 6d) and is one order of magnitude greater than those of the other levels. Although the total observation impact is small at the lower levels, the observation impact per observation number is greatest at 500–700 hPa and below 850 hPa because GEOAMV provides wind retrievals over oceanic areas where the observation network is sparse.
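The layer aggregation above can be sketched directly from the eight pressure intervals listed. The observation pressures and impact values below are hypothetical placeholders; only the bin edges come from the text:

```python
import numpy as np

# Bin edges of the eight vertical intervals (hPa), surface to model top.
edges = np.array([1000, 850, 700, 500, 400, 300, 200, 100, 50])

p_obs = np.array([925, 780, 600, 250, 250, 70])          # ob pressures (hPa)
impact = np.array([-0.2, 0.1, -0.4, -1.0, -0.6, -0.1])   # impacts (J kg^-1)

# np.digitize supports monotonically decreasing bins; subtract 1 so that
# layer 0 is 1000-850 hPa and layer 7 is 100-50 hPa.
layer = np.digitize(p_obs, edges) - 1

total = np.bincount(layer, weights=impact, minlength=8)  # total impact per layer
count = np.bincount(layer, minlength=8)                  # observation count per layer
normalized = np.where(count > 0, total / np.maximum(count, 1), 0.0)
```

With per-layer totals, counts, and normalized impacts in hand, the vertical profiles of Fig. 6 are just these arrays plotted against the layer midpoints.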

Vertical distribution of (a),(d) time-averaged total observation impact; (b),(e) number of observations; and (c),(f) normalized observation impact for (top) SOUND and (bottom) GEOAMV.


Figure 7 shows the observation impact per observation number for SOUND during targeted dropsonde observations for Typhoons Sinlaku (200813) and Jangmi (200815). Most soundings contribute to the 6-h forecast error reduction. The impact of the dropsonde soundings near the typhoon is similar to or greater than that of the radiosonde soundings. However, the dropsonde observations near the typhoon core region degrade the 6-h forecast (Fig. 7c). This negative impact of dropsonde observations near core regions has been reported by Aberson (2008) and Weissmann et al. (2011) and is mainly attributed to the lack of representation of the inner-core structure in the model fields and the DA procedure.

Horizontal distribution of normalized observation impact for SOUND at (a) 0000 UTC 10 Sep, (b) 0000 UTC 11 Sep, (c) 1200 UTC 11 Sep, (d) 0000 UTC 27 Sep, and (e) 0000 UTC 28 Sep 2008. Typhoons Sinlaku (200813) and Jangmi (200815) are shown at analysis times of (a)–(c) and (d),(e), respectively.


#### 3) Observation impact for targeted regions

This section presents the observation impacts for targeted regions. Figure 8a shows the analyzed track and the best track from the Regional Specialized Meteorological Center (RSMC) Tokyo for Typhoon Sinlaku (200813). In general, the REFER analysis simulates the major movement of the typhoon track well. However, it cannot simulate the track during the periods of initial development and rapid development. This may be due to 1) the coarse resolution in the current configuration of the numerical modeling system, 2) the use of static background error statistics for analysis of the typhoon, and/or 3) the limited in situ observations near the storms. While the observation impact on the 6-h forecast error reduction was calculated for the entire domain in the previous sections, it is calculated here for a limited area that is defined horizontally over most of the typhoon tracks and vertically from the surface to 300 hPa for the period from 8 to 21 September 2008 (Fig. 8a).

(a) The horizontal distribution of the targeted region, the analyzed typhoon track (open circles), and the RSMC best track (crosses); (b) observation impacts for the observation variables; (c) observation impacts for the observation types; and (d) the horizontal distribution of the normalized observation impact for SOUND at 1200 UTC 11 Sep 2008.


Figure 8b presents the time-averaged total observation impact. The satellite wind observations provide less information than the conventional wind observations, similar to the findings presented in Fig. 2a. Comparing Fig. 8b with Fig. 3a, the impact of the radiance observations is reduced because the targeted region is less likely to coincide with the passage of a polar-orbiting satellite than is the entire domain (Fig. 8c). Relative to the radiance observations, the impacts of SOUND and SYNOP are increased. By defining the forecast error in the targeted (limited) area near the typhoon, the observation impact related to the typhoon forecast can be emphasized. Compared to Fig. 7c, the impacts of dropsonde and radiosonde observations near the coastline are emphasized in terms of the typhoon forecast, whereas the impacts of the radiosondes north of Japan are greatly reduced (Fig. 8d).

### c. Sensitivity to the error covariance parameter

*i* represents a certain subset of observations. The sensitivities to the error covariance parameter are then

Figure 9a shows the sensitivity to

(a) Time series of the sensitivity to the background error covariance parameter. Time-averaged sensitivity to the observation error covariance parameter, aggregated with (b) observation variable and (c) observation type; Bg in (b) and (c) represents the time-averaged sensitivity to the background error covariance parameter.


### d. Observing system experiments

This section compares the impact of observations evaluated using typical OSEs with that evaluated using the adjoint-derived method. The REFER experiment, which assimilates all observation types (section 3d), is used as the control. Each OSE is then conducted by performing new analysis–forecast cycles for the given period (i.e., from 15 August to 2 October 2008). In the new analysis procedure, the specific observation type to be evaluated is removed from the observation set in data-denial experiments (Arnold and Dey 1986). Through a series of OSEs, the impact of five major observation types (i.e., AMSU-A, SOUND, SYNOP, GEOAMV, and QuikSCAT) is evaluated. The corresponding OSEs are referred to as EXP_AMSU-A, EXP_SOUND, EXP_SYNOP, EXP_GEOAMV, and EXP_QuikSCAT, respectively. The configurations of the analysis and forecast system are the same as those described in section 3.

Figure 10 shows the time-averaged forecast error norms for each OSE using (2.7) for 6- to 24-h forecast times. The forecast fields are verified against the reference analysis fields. For all forecast times, the forecast error is greatest for EXP_AMSU-A, followed by EXP_SOUND. This is consistent with the result obtained from the adjoint-derived observation impact.^{4} In addition, SYNOP and QuikSCAT show comparable impacts in both the adjoint-derived experiment and the OSEs. However, although the impact of GEOAMV is similar to that of SYNOP and QuikSCAT in the OSEs, it is smaller than that of SYNOP and QuikSCAT in the adjoint-derived experiment. Gelaro and Zhu (2009) and Cardinali (2009) also reported qualitative similarity between the adjoint method and OSEs for the major observation types but dissimilarity for the minor observation types. The two methods evaluate the observation impact on the forecast in fundamentally different ways. The adjoint-derived method provides the observation impact for an individual observation component (and for each analysis cycle), using the adjoint of the data assimilation system, in the context of all observations being present. In contrast, the observation impact from OSEs accumulates over the analysis and forecast cycles of the experiment period, during which the removal of a specific observing system changes the quality of the DA system in each OSE. This fundamental difference may affect the comparison of the observation impacts from the two methods.
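The relationship between the two evaluation methods can be illustrated in a toy scalar system. The sketch below (all numerical values are hypothetical, and the quadratic error measure stands in for the energy norm of (2.7)) computes the "OSE style" impact, the change in forecast error when the observation is assimilated versus denied, and the Langland–Baker adjoint-derived estimate, which projects the innovation through the adjoint of the gain and forecast model onto the mean of the two forecast-error gradients. For a linear model and quadratic norm, the two agree exactly.

```python
import numpy as np

# Toy scalar assimilation-forecast system (hypothetical values): truth x_t,
# background x_b with error variance B, one observation y with error
# variance R, and a linear forecast model with coefficient m.
x_t, B, R, m = 0.0, 1.0, 1.0, 0.8
rng = np.random.default_rng(0)
x_b = x_t + rng.normal(scale=np.sqrt(B))
y = x_t + rng.normal(scale=np.sqrt(R))

K = B / (B + R)                    # Kalman gain of the 3DVAR-like analysis
x_a = x_b + K * (y - x_b)          # analysis

# Quadratic forecast-error measure e = (m*x - m*x_t)**2.
# "OSE style" impact: error of forecast from analysis minus error of
# forecast from background (observation denied).
e = lambda x: (m * x - m * x_t) ** 2
ose_impact = e(x_a) - e(x_b)

# Langland-Baker adjoint-derived impact: innovation d, mapped through the
# adjoint of the gain and of the forecast model onto the sum of the
# forecast-error gradients from the analysis and background trajectories.
d = y - x_b                                              # innovation
grad = m * ((m * x_a - m * x_t) + (m * x_b - m * x_t))   # M^T (g_a + g_b)
adjoint_impact = d * K * grad

print(ose_impact, adjoint_impact)  # identical for a linear model and quadratic norm
```

In a real NWP system the forecast model is nonlinear, so the adjoint estimate is only approximate, and the OSE additionally changes every subsequent analysis cycle, which this single-cycle sketch does not capture.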

Time-averaged forecast error norms for 6- to 24-h forecast times from OSEs.

Citation: Monthly Weather Review 141, 11; 10.1175/MWR-D-12-00197.1

## 5. Summary and discussion

In NWP systems, an analysis is determined by a data assimilation (DA) system that combines observations with a background, usually consisting of short-term forecast fields, according to their respective error statistics. Although the number of available observations has increased rapidly, it is not clear that these observations always benefit forecast performance. Thus, it is necessary to monitor and evaluate how observations are used in DA and forecast systems. In this study, the impact of observations was evaluated using two methods: the adjoint-derived method and traditional OSEs. In the adjoint-derived method, the adjoints of the forecast system and the analysis system were used to estimate the impact of all observations simultaneously. The adjoint-derived method requires fewer computational resources than OSEs and can monitor the observation impact in an operational framework.

In the adjoint-derived observation impact method, the radiance observations have the greatest total impact, partly because of their large number. When normalized by the number of observations, the impact is greatest for conventional wind observations. The greater impact of conventional wind observations compared with mass observations is consistent with recent studies reported in WMO (2012). Surface pressure observations (the second largest normalized impact) have been reported to contain more useful information about the large-scale circulation than surface wind or temperature observations (Compo et al. 2006; Whitaker et al. 2009). When aggregated by observation type, the total impact is greatest for SOUND and each satellite, followed by SYNOP, QuikSCAT, GEOAMV, and METAR. The normalized impacts of QuikSCAT and GEOAMV are smaller than those of the other observation types. The fraction of beneficial observations is approximately 66%–72%, which is significantly greater than that reported in previous observation impact studies [50%–54% in Gelaro et al. (2010) and slightly greater than 50% in Kunii et al. (2012)]. This difference may arise because the true state in this study was taken from an analysis conducted every 6 h during a cycling experiment; thus, the true state is partly correlated with the forecast integrated from the analysis at the initial time.
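The distinction between total and normalized (per observation) impact is a simple aggregation step. The sketch below uses invented per-observation impact records (the type names match the paper, but the numbers are purely illustrative), groups them by observation type, and reports total impact, impact per observation, and the fraction of beneficial observations, following the sign convention that negative impact means forecast error was reduced.

```python
from collections import defaultdict

# Hypothetical per-observation impact records (type, impact); negative
# impact = beneficial (forecast error reduced). Values are illustrative only.
records = [("SOUND", -0.30), ("SOUND", -0.12), ("SOUND", 0.05),
           ("AMSU-A", -0.04), ("AMSU-A", -0.02), ("AMSU-A", -0.03),
           ("AMSU-A", 0.01), ("AMSU-A", -0.05),
           ("GEOAMV", -0.01), ("GEOAMV", 0.02)]

total = defaultdict(float)   # total impact per observation type
count = defaultdict(int)     # number of observations per type
for obs_type, impact in records:
    total[obs_type] += impact
    count[obs_type] += 1

for obs_type in total:
    normalized = total[obs_type] / count[obs_type]  # impact per observation
    print(obs_type, round(total[obs_type], 3), round(normalized, 3))

# Fraction of observations whose individual impact is beneficial (negative).
beneficial = sum(1 for _, imp in records if imp < 0) / len(records)
print(f"beneficial fraction: {beneficial:.0%}")
```

In this invented sample AMSU-A has the most observations but SOUND the largest total and normalized impact, mirroring the qualitative pattern the paper reports for numerous radiance observations versus fewer but individually influential soundings.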

An additional experiment with 24-h forecast error reduction showed that the fraction of beneficial observations decreases to 60%–66%, a value that is still higher than those obtained in previous studies. The fraction of nonbeneficial observations (30%–40%) found here may be inevitable for the data assimilation and forecast system used in this study: even when the error statistics are perfectly specified, only 60%–65% of observations have a beneficial impact on the analysis field in a simple idealized assimilation system when the background field and observations are of comparable accuracy (Ehrendorfer 2007). This indicates that further investigation of the statistical nature of the data assimilation procedure is needed.
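The idealized 60%–65% figure can be reproduced with a one-line Monte Carlo experiment. In the sketch below (a scalar caricature of the argument, not the system of Ehrendorfer 2007), the background and the observation are equally accurate with unit error variance, so the optimal analysis is their simple average; even with perfectly specified error statistics, only about 65% of observations move the analysis closer to the truth than the background was.

```python
import numpy as np

# Scalar truth x_t = 0; background and observation errors both N(0, 1),
# so the optimal (minimum-variance) analysis is the plain average.
rng = np.random.default_rng(42)
n = 1_000_000
x_b = rng.normal(size=n)       # background errors
y = rng.normal(size=n)         # observation errors
x_a = 0.5 * (x_b + y)          # optimal analysis for equal error variances

# An observation is "beneficial" if the analysis error is smaller than the
# background error it started from.
beneficial = np.mean(np.abs(x_a) < np.abs(x_b))
print(beneficial)              # ~0.65
```

The point of the exercise is that a 30%–40% nonbeneficial fraction is a statistical property of optimal averaging itself, not necessarily a deficiency of the DA system.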

For the radiance observations, channel 9, which is sensitive to the upper troposphere, had a greater impact than channel 5, which is sensitive to the middle to lower troposphere. These results are inconsistent with those of Gelaro et al. (2010), in which channels 5–7 had the greatest impacts. This difference is mainly due to the different configurations of the modeling systems and the different definitions of the forecast error norm used in each study.

In the vertical distribution, the total impact of GEOAMV was greatest in the layer between 200 and 300 hPa, whereas the normalized impact was greatest in the middle to lower troposphere. For SOUND, both the total and normalized impacts were greatest in the upper troposphere. For several analysis times during Typhoons Sinlaku (200813) and Jangmi (200815), the observation impact of dropsonde soundings near the typhoon was similar to or greater than that of radiosonde soundings. When the adjoint-derived observation impact was evaluated for the targeted area near Typhoon Sinlaku, the impact of the radiance observations was reduced, whereas that of the surface pressure observations was increased.

Following Daescu and Todling (2010), the sensitivity of the forecast error to the error covariance parameters was also evaluated. In the current framework of the analysis system, the background error covariance is overconfident and the observation error covariance is underconfident, so reducing the observation error covariance helps to reduce the forecast error.
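The idea behind this diagnostic can be sketched in a scalar system (all numbers illustrative, and finite differences standing in for the adjoint calculation of Daescu and Todling 2010): scale the assumed observation error variance by a parameter s_o, rebuild the gain, and measure how the mean forecast error responds. When the error statistics are correctly specified, the sensitivity at s_o = 1 is zero; a nonzero sensitivity, as found in the paper's system, signals a misspecified covariance.

```python
import numpy as np

# Scalar system with true error variances B_true and R_true and a linear
# forecast model m; the DA system assumes R = s_o * R_true.
x_t, B_true, R_true, m = 0.0, 1.0, 1.0, 0.9
rng = np.random.default_rng(1)
n = 200_000
x_b = x_t + rng.normal(scale=np.sqrt(B_true), size=n)
y = x_t + rng.normal(scale=np.sqrt(R_true), size=n)

def mean_forecast_error(s_o):
    """Mean squared forecast error when the gain assumes R = s_o * R_true."""
    K = B_true / (B_true + s_o * R_true)
    x_a = x_b + K * (y - x_b)
    return np.mean((m * x_a - m * x_t) ** 2)

# Finite-difference sensitivity de/ds_o at s_o = 1 (correctly specified R):
eps = 1e-3
sens = (mean_forecast_error(1 + eps) - mean_forecast_error(1 - eps)) / (2 * eps)
print(sens)   # near zero, because the optimal gain is a stationary point
```

The stationarity at the optimal gain is what makes the sign of the sensitivity diagnostic: a positive sensitivity to s_o means the assumed observation error covariance is too large (underconfident), so reducing it would reduce the forecast error.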

The adjoint-derived observation impact was compared with the impact deduced from OSEs performed as data-denial experiments for the major observation types. Consistent with the adjoint-derived results, the OSEs showed the greatest impacts for AMSU-A and SOUND. Overall, the adjoint method and OSEs showed qualitatively similar impacts for the major observation types and dissimilar impacts for the minor observation types; these findings agree with previous research using operational global models (Cardinali 2009; Gelaro and Zhu 2009). The disagreement for the minor observation types may be due to the different characteristics of the two evaluation methods. The variation of the forecast error reduction from the assimilated observations in this study is somewhat greater than that in previous studies using global modeling systems, as shown in Gelaro et al. (2010). This difference may occur because the coverage of the observing systems (mainly polar-orbiting satellites) is highly variable within a limited domain.

This study confirms that SOUND provides primary in situ information on the atmospheric state and that satellite radiance observations are an essential remote sensing component of the atmospheric observing systems. It also confirms that the adjoint-derived method can be used to evaluate the observation impact in a limited-area analysis and forecast system focused on East Asia and the western North Pacific, for which routine OSEs are not feasible in an operational sense. Furthermore, the adjoint-derived sensitivity can be used to evaluate the background and observation error covariances, which play important roles in NWP systems. Together, the adjoint-derived method and OSEs can provide comprehensive information on the effects of the observing systems on the overall forecast ability of NWP systems, especially when abundant satellite observations are available.

## Acknowledgments

The authors thank the three anonymous reviewers for their valuable comments. This study was supported by the Korea Meteorological Administration Research and Development Program under Grant CATER 2012-2030.

## REFERENCES

Aberson, S. D., 2008: Large forecast degradations due to synoptic surveillance during the 2004 and 2005 hurricane seasons. *Mon. Wea. Rev.*, **136**, 3138–3150.

Ancell, B. C., and C. F. Mass, 2006: Structure, growth rates, and tangent linear accuracy of adjoint sensitivities with respect to horizontal and vertical resolution. *Mon. Wea. Rev.*, **134**, 2971–2988.

Ancell, B. C., and G. J. Hakim, 2007: Comparing adjoint- and ensemble-sensitivity analysis with applications to observation targeting. *Mon. Wea. Rev.*, **135**, 4117–4134.

Ancell, B. C., and C. F. Mass, 2008: The variability of adjoint sensitivity with respect to model physics and basic-state trajectory. *Mon. Wea. Rev.*, **136**, 4612–4628.

Arnold, C. P., and C. H. Dey, 1986: Observing-systems simulation experiments: Past, present, and future. *Bull. Amer. Meteor. Soc.*, **67**, 687–695.

Atlas, R., 1997: Atmospheric observations and experiments to assess their usefulness in data assimilation. *J. Meteor. Soc. Japan*, **75**, 111–130.

Auligné, T., A. P. McNally, and D. P. Dee, 2007: Adaptive bias correction for satellite data in a numerical weather prediction system. *Quart. J. Roy. Meteor. Soc.*, **133**, 631–642.

Auligné, T., H. Huang, H.-C. Lin, H. Wang, Q. Xiao, X. Zhang, and X. Zhang, 2011: Forecast Sensitivity to Observations (FSO)—WRF/WRFPLUS/WRFDA v3.3 user's guide. NCAR, 8 pp. [Available online at http://www.mmm.ucar.edu/wrf/users/wrfda/Tutorials/2012_July/docs/README_FSO_v3.3.pdf.]

Baker, N. L., and R. Daley, 2000: Observation and background adjoint sensitivity in the adaptive observation-targeting problem. *Quart. J. Roy. Meteor. Soc.*, **126**, 1431–1454.

Bannister, R. N., 2008a: A review of forecast error covariance statistics in atmospheric variational data assimilation. I: Characteristics and measurements of forecast error covariances. *Quart. J. Roy. Meteor. Soc.*, **134**, 1951–1970.

Bannister, R. N., 2008b: A review of forecast error covariance statistics in atmospheric variational data assimilation. II: Modelling the forecast error covariance statistics. *Quart. J. Roy. Meteor. Soc.*, **134**, 1971–1996.

Barker, D. M., W. Huang, Y.-R. Guo, A. J. Bourgeois, and Q. N. Xiao, 2004: A three-dimensional variational data assimilation system for use with MM5: Implementation and initial results. *Mon. Wea. Rev.*, **132**, 897–914.

Barker, D. M., and Coauthors, 2012: The Weather Research and Forecasting model's community variational/ensemble data assimilation system: WRFDA. *Bull. Amer. Meteor. Soc.*, **93**, 831–843.

Bengtsson, L., 1980: On the use of a time sequence of surface pressures in four-dimensional data assimilation. *Tellus*, **32**, 189–197.

Bouttier, F., and G. Kelly, 2001: Observing-system experiments in the ECMWF 4D-Var data assimilation system. *Quart. J. Roy. Meteor. Soc.*, **127**, 1469–1488.

Cardinali, C., 2009: Monitoring the observation impact on the short-range forecast. *Quart. J. Roy. Meteor. Soc.*, **135**, 239–250.

Chen, F., and J. Dudhia, 2001: Coupling an advanced land surface–hydrology model with the Penn State–NCAR MM5 modeling system. Part I: Model implementation and sensitivity. *Mon. Wea. Rev.*, **129**, 569–585.

Chu, K., Q. Xiao, Z. Tan, and J. Gu, 2011: A forecast sensitivity study on the intensity change of Typhoon Sinlaku (2008). *J. Geophys. Res.*, **116**, D22109, doi:10.1029/2011JD016127.

Compo, G. P., J. S. Whitaker, and P. D. Sardeshmukh, 2006: Feasibility of a 100-year reanalysis using only surface pressure data. *Bull. Amer. Meteor. Soc.*, **87**, 175–190.

Compo, G. P., and Coauthors, 2011: The Twentieth Century Reanalysis Project. *Quart. J. Roy. Meteor. Soc.*, **137**, 1–28, doi:10.1002/qj.776.

Courtier, P., 1997: Variational methods. *J. Meteor. Soc. Japan*, **75**, 211–218.

Courtier, P., J.-N. Thepaut, and A. Hollingsworth, 1994: A strategy for operational implementation of 4D-Var, using an incremental approach. *Quart. J. Roy. Meteor. Soc.*, **120**, 1367–1387.

Daescu, D. N., 2008: On the sensitivity equations of four-dimensional variational (4D-Var) data assimilation. *Mon. Wea. Rev.*, **136**, 3050–3065.

Daescu, D. N., and R. Todling, 2009: Adjoint estimation of the variation in model functional output due to the assimilation of data. *Mon. Wea. Rev.*, **137**, 1705–1716.

Daescu, D. N., and R. Todling, 2010: Adjoint sensitivity of the model forecast to data assimilation system error covariance parameters. *Quart. J. Roy. Meteor. Soc.*, **136**, 2000–2012.

Daescu, D. N., and R. H. Langland, 2013: Error covariance sensitivity and impact estimation with adjoint 4D-Var: Theoretical aspect and first applications to NAVDAS-AR. *Quart. J. Roy. Meteor. Soc.*, **139**, 226–241.

Dee, D. P., 2005: Bias and data assimilation. *Quart. J. Roy. Meteor. Soc.*, **131**, 3323–3343.

Derber, J. C., and W.-S. Wu, 1998: The use of TOVS cloud-cleared radiances in the NCEP SSI analysis system. *Mon. Wea. Rev.*, **126**, 2287–2299.

Doerenbecher, A., and T. Bergot, 2001: Sensitivity to observations applied to FASTEX cases. *Nonlinear Processes Geophys.*, **8**, 467–481.

Dudhia, J., 1989: Numerical study of convection observed during the Winter Monsoon Experiment using a mesoscale two-dimensional model. *J. Atmos. Sci.*, **46**, 3077–3107.

Ehrendorfer, M., 2007: A review of issues in ensemble-based Kalman filtering. *Meteor. Z.*, **16**, 795–818.

Errico, R. M., 1997: What is an adjoint model? *Bull. Amer. Meteor. Soc.*, **78**, 2577–2591.

Errico, R. M., 2007: Interpretations of an adjoint-derived observational impact measure. *Tellus*, **59A**, 273–276.

Errico, R. M., and T. Vukicevic, 1992: Sensitivity analysis using an adjoint of the PSU-NCAR mesoscale model. *Mon. Wea. Rev.*, **120**, 1644–1660.

Errico, R. M., T. Vukicevic, and K. Raeder, 1993: Comparison of initial and lateral boundary condition sensitivity for a limited-area model. *Tellus*, **45A**, 539–557.

Fourrie, N., A. Doerenbecher, T. Bergot, and A. Joly, 2002: Adjoint sensitivity of the forecast to TOVS observations. *Quart. J. Roy. Meteor. Soc.*, **128**, 2759–2777.

Gelaro, R., and Y. Zhu, 2009: Examination of observation impacts derived from observing system experiments (OSEs) and adjoint models. *Tellus*, **61A**, 179–193.

Gelaro, R., Y. Zhu, and R. M. Errico, 2007: Examination of various-order adjoint-based approximations of observation impact. *Meteor. Z.*, **16**, 685–692.

Gelaro, R., R. H. Langland, S. Pellerin, and R. Todling, 2010: The THORPEX observation impact intercomparison experiment. *Mon. Wea. Rev.*, **138**, 4009–4025.

Golub, G. H., and C. F. Van Loan, 1996: *Matrix Computations*. 3rd ed. Johns Hopkins University Press, 694 pp.

Gustafsson, N., E. Källén, and S. Thorsteinsson, 1998: Sensitivity of forecast errors to initial and lateral boundary conditions. *Tellus*, **50A**, 167–185.

Han, Y., P. van Delst, Q. Liu, F. Weng, B. Yan, R. Treadon, and J. Derber, 2006: JCSDA Community Radiative Transfer Model (CRTM)—version 1. NOAA Tech. Rep. NESDIS 122, 40 pp.

Hong, S.-Y., and J.-O. Lim, 2006: The WRF single-moment 6-class microphysics scheme (WSM6). *J. Korean Meteor. Soc.*, **42**, 129–151.

Hong, S.-Y., Y. Noh, and J. Dudhia, 2006: A new vertical diffusion package with an explicit treatment of entrainment processes. *Mon. Wea. Rev.*, **134**, 2318–2341.

Huang, X.-Y., and Coauthors, 2009: Four-dimensional variational data assimilation for WRF: Formulation and preliminary results. *Mon. Wea. Rev.*, **137**, 299–314.

Jung, B.-J., and H. M. Kim, 2009: Moist-adjoint based forecast sensitivities for a heavy snowfall event over the Korean peninsula on 4–5 March 2004. *J. Geophys. Res.*, **114**, D15104, doi:10.1029/2008JD011370.

Kain, J. S., 2004: The Kain–Fritsch convective parameterization: An update. *J. Appl. Meteor.*, **43**, 170–181.

Kalnay, E., 2003: *Atmospheric Modeling, Data Assimilation and Predictability*. Cambridge University Press, 341 pp.

Kelly, G., J.-N. Thepaut, R. Buizza, and C. Cardinali, 2007: The value of observations. I: Data denial experiments for the Atlantic and the Pacific. *Quart. J. Roy. Meteor. Soc.*, **133**, 1803–1815.

Kim, H. M., and B.-J. Jung, 2006: Adjoint-based forecast sensitivities of Typhoon Rusa. *Geophys. Res. Lett.*, **33**, L21813, doi:10.1029/2006GL027289.

Kim, H. M., and J. K. Kay, 2010: Forecast sensitivity analysis of an Asian dust event occurred on 6–8 May 2007 in Korea (in Korean with English abstract). *Atmosphere*, **20**, 399–414.

Kim, H. M., and R. J. Beare, 2011: Characteristics of adjoint sensitivity to potential vorticity. *Meteor. Atmos. Phys.*, **111**, 91–102.

Kim, H. M., J. K. Kay, and B.-J. Jung, 2008: Application of adjoint-based forecast sensitivities to Asian dust transport events in Korea. *Water Air Soil Pollut.*, **195**, 335–343, doi:10.1007/s11270-008-9750-8.

Kleist, D. T., and M. C. Morgan, 2005a: Interpretation of the structure and evolution of adjoint-derived forecast sensitivity gradients. *Mon. Wea. Rev.*, **133**, 466–484.

Kleist, D. T., and M. C. Morgan, 2005b: Application of adjoint-derived forecast sensitivities to the 24–25 January 2000 U.S. East coast snowstorm. *Mon. Wea. Rev.*, **133**, 3148–3175.

Klinker, E., F. Rabier, and R. Gelaro, 1998: Estimation of key analysis errors using the adjoint technique. *Quart. J. Roy. Meteor. Soc.*, **124**, 1909–1933.

Kunii, M., T. Miyoshi, and E. Kalnay, 2012: Estimating impact of real observations in regional numerical weather prediction using an ensemble Kalman filter. *Mon. Wea. Rev.*, **140**, 1975–1987.

Langland, R. H., 2005: Observation impact during the North Atlantic TReC-2003. *Mon. Wea. Rev.*, **133**, 2297–2309.

Langland, R. H., and N. L. Baker, 2004: Estimation of observation impact using the NRL atmospheric variational data assimilation adjoint system. *Tellus*, **56A**, 189–201.

Langland, R. H., R. L. Elsberry, and R. M. Errico, 1995: Evaluation of physical processes in an idealized extratropical cyclone using adjoint sensitivity. *Quart. J. Roy. Meteor. Soc.*, **121**, 1349–1386.

Langland, R. H., M. A. Shapiro, and R. Gelaro, 2002: Initial condition sensitivity and error growth in forecasts of the 25 January 2000 East Coast snowstorm. *Mon. Wea. Rev.*, **130**, 957–974.

Laroche, S., and R. Sarrazin, 2010a: Impact study with observations assimilated over North America and the North Pacific Ocean on the MSC global forecast system. Part I: Contribution of radiosonde, aircraft and satellite data. *Atmos.–Ocean*, **48**, 10–25.

Laroche, S., and R. Sarrazin, 2010b: Impact study with observations assimilated over North America and the North Pacific Ocean on the MSC global forecast system. Part II: Sensitivity experiments. *Atmos.–Ocean*, **48**, 26–38.

Le Dimet, F.-X., H.-E. Ngodock, B. Luong, and J. Verron, 1997: Sensitivity analysis in variational data assimilation. *J. Meteor. Soc. Japan*, **75**, 245–255.

Li, H., J. Liu, and E. Kalnay, 2010: Correction of "Estimating observation impact without adjoint model in an ensemble Kalman filter." *Quart. J. Roy. Meteor. Soc.*, **136**, 1652–1654.

Liu, H., and J. Li, 2010: An improvement in forecasting rapid intensification of Typhoon Sinlaku (2008) using clear-sky full spatial resolution advanced IR soundings. *J. Appl. Meteor. Climatol.*, **49**, 821–827.

Liu, J., and E. Kalnay, 2008: Estimating observation impact without adjoint model in an ensemble Kalman filter. *Quart. J. Roy. Meteor. Soc.*, **134**, 1327–1335.

Lorenc, A. C., 1986: Analysis methods for numerical weather prediction. *Quart. J. Roy. Meteor. Soc.*, **112**, 1177–1194.

Masutani, M., and Coauthors, 2010: Observing system simulation experiments. *Data Assimilation: Making Sense of Observations*, W. Lahoz, B. Khattatov, and R. Menard, Eds., Springer, 647–679.

Michel, Y., and T. Auligné, 2010: Inhomogeneous background error modeling and estimation over Antarctica. *Mon. Wea. Rev.*, **138**, 2229–2252.

Mlawer, E. J., S. J. Taubman, P. D. Brown, M. J. Iacono, and S. A. Clough, 1997: Radiative transfer for inhomogeneous atmosphere: RRTM, a validated correlated-*k* model for the longwave. *J. Geophys. Res.*, **102** (D14), 16 663–16 682.

Palmer, T. N., R. Gelaro, J. Barkmeijer, and R. Buizza, 1998: Singular vectors, metrics, and adaptive observations. *J. Atmos. Sci.*, **55**, 633–653.

Parrish, D. F., and J. C. Derber, 1992: The National Meteorological Center's spectral statistical-interpolation analysis system. *Mon. Wea. Rev.*, **120**, 1747–1763.

Rabier, F., P. Courtier, and O. Talagrand, 1992: An application of adjoint models to sensitivity analysis. *Beitr. Phys. Atmos.*, **65**, 177–192.

Rabier, F., E. Klinker, P. Courtier, and A. Hollingsworth, 1996: Sensitivity of forecast errors to initial conditions. *Quart. J. Roy. Meteor. Soc.*, **122**, 121–150.

Skamarock, W. C., and Coauthors, 2008: A description of the Advanced Research WRF version 3. NCAR Tech. Note NCAR/TN-475+STR, 113 pp.

Torn, R. D., and G. J. Hakim, 2008: Ensemble-based sensitivity analysis. *Mon. Wea. Rev.*, **136**, 663–677.

Tremolet, Y., 2007: First-order and higher-order approximations of observation impact. *Meteor. Z.*, **16**, 693–694.

Tremolet, Y., 2008: Computation of observation sensitivity and observation impact in incremental variational data assimilation. *Tellus*, **60A**, 964–978.

Wang, X., D. M. Barker, C. Snyder, and T. M. Hamill, 2008: A hybrid ETKF-3DVAR data assimilation scheme for the WRF model. Part II: Real observation experiments. *Mon. Wea. Rev.*, **136**, 5132–5147.

Weissmann, M., and Coauthors, 2011: The influence of assimilating dropsonde data on typhoon track and midlatitude forecasts. *Mon. Wea. Rev.*, **139**, 908–920.

Whitaker, J. S., G. P. Compo, and J.-N. Thépaut, 2009: A comparison of variational and ensemble-based data assimilation systems for reanalysis of sparse observations. *Mon. Wea. Rev.*, **137**, 1991–1999.

WMO, 2008: *Proceedings of the Fourth WMO Workshop on the Impact of Various Observing Systems on Numerical Weather Prediction*. Geneva, Switzerland, WMO, 218 pp. [Available online at http://www.wmo.int/pages/prog/www/OSY/Meetings/NWP-4-Geneva2008/Abridged_Version.pdf.]

WMO, 2012: *Proceedings of the Fifth WMO Workshop on the Impact of Various Observing Systems on Numerical Weather Prediction*. Tech. Rep. 2012-1, Sedona, AZ, WMO, 25 pp. [Available online at http://www.wmo.int/pages/prog/www/OSY/Meetings/NWP5_Sedona2012/Final_Report.pdf.]

Wu, C.-C., J.-H. Chen, P.-H. Lin, and K.-H. Chou, 2007: Targeted observations of tropical cyclone movement based on the adjoint-derived sensitivity steering vector. *J. Atmos. Sci.*, **64**, 2611–2626.

Wu, C.-C., Y.-H. Huang, and G.-Y. Lien, 2012: Concentric eyewall formation in Typhoon Sinlaku (2008). Part I: Assimilation of T-PARC data based on the ensemble Kalman filter (EnKF). *Mon. Wea. Rev.*, **140**, 506–527.

Xiao, Q., and Coauthors, 2008: Application of an adiabatic WRF adjoint to the investigation of the May 2004 McMurdo, Antarctica, severe wind event. *Mon. Wea. Rev.*, **136**, 3696–3713.

Zapotocny, T. H., J. A. Jung, J. F. Le Marshall, and R. E. Treadon, 2008: A two-season impact study of four satellite data types and rawinsonde data in the NCEP Global Data Assimilation System. *Wea. Forecasting*, **23**, 80–100.

Zhang, X., and X.-Y. Huang, 2011: Recent upgrades and improvements of WRF 4D-Var V3.3. *Proc. 12th WRF User's Workshop*, Boulder, CO, NCAR, 7B.3. [Available online at http://www.mmm.ucar.edu/wrf/users/workshops/WS2011/WorkshopPapers.php.]

Zhu, Y., and R. Gelaro, 2008: Observation sensitivity calculations using the adjoint of the Gridpoint Statistical Interpolation (GSI) analysis system. *Mon. Wea. Rev.*, **136**, 335–351.

Zou, X., F. Vandenberghe, M. Pondeca, and Y.-H. Kuo, 1997: Introduction to adjoint techniques and the MM5 adjoint modeling system. NCAR Tech. Note NCAR/TN-435+STR, 110 pp.

Zou, X., Y.-H. Kuo, and S. Low-Nam, 1998: Medium-range prediction of an extratropical oceanic cyclone: Impact of initial state. *Mon. Wea. Rev.*, **126**, 2737–2763.

^{1}

Note that

^{2}

The domain is centered in the typhoon region rather than near the Korean Peninsula because the experiments were conducted for the 2008 typhoon season, when the international field campaign T-PARC was conducted.

^{3}

The WRFDA system includes four-dimensional variational data assimilation (4DVAR; Huang et al. 2009) and hybrid systems (Wang et al. 2008), as well as the 3DVAR system. Because of the complex characteristics of the adjoint-derived observation impact estimation, we adopted 3DVAR, which does not consider the time variation of the state, as the analysis system. First guess at appropriate time (FGAT) was not used in this study.