Search Results

Showing 1–10 of 39 items for Author or Editor: Masao Kanamitsu.
Masao Kanamitsu

Abstract

The National Meteorological Center's (NMC) Global Data Assimilation and Forecast System is described in some detail. The system consists of 1) preprocessing of the initial guess, 2) optimum interpolation objective analysis, 3) update of the initial guess, 4) initialization, 5) forecast, and 6) postprocessing of the forecast.

The assimilation and forecast system is continually evolving; the version described here was implemented on 30 November 1988.
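The six-step cycle described above can be sketched as a toy Python loop. This is a minimal scalar caricature, not NMC code: every function name and numeric value here is a hypothetical stand-in for the corresponding system component.

```python
# Toy, scalar stand-ins for the six components; none of this is NMC code.
def preprocess(x):                  # 1) preprocessing of the initial guess
    return x

def optimum_interpolation(x, obs):  # 2) OI analysis in its simplest scalar form:
    return 0.5 * (obs - x)          #    a weighted innovation (weight 0.5 is invented)

def initialize(x):                  # 4) initialization (e.g., balancing the analysis)
    return x

def run_model(x, hours):            # 5) toy "forecast": a constant drift in time
    return x + 0.1 * hours

def postprocess(x):                 # 6) postprocessing of the forecast
    return round(x, 3)

def assimilation_cycle(guess, obs, forecast_hours=24):
    guess = preprocess(guess)
    analysis = guess + optimum_interpolation(guess, obs)  # 3) update the guess
    analysis = initialize(analysis)
    return postprocess(run_model(analysis, forecast_hours))
```

With a first guess of 10.0 and an observation of 12.0, the analysis moves halfway to the observation (11.0) and the toy model then drifts 2.4 over 24 h, returning 13.4.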

Masao Kanamitsu and Suranjana Saha

Abstract

The budget of the systematic component of the short-range forecast error in the National Meteorological Center's Medium-Range Forecast Model (NMC MRF) is examined. The budget is computed for the spectral coefficients and the variances of vorticity, divergence, virtual temperature, and specific humidity at every time step during the 24-h model integration. Two months in winter and three months in summer, totaling 150 cases, were integrated with the budget diagnostics. The results of the budget of the spectral coefficients—that is, the budget of the mean error—showed compensation among large terms except near the model boundary; therefore, it is difficult to point to a significant source of the systematic error in the free atmosphere. Near the model's lower boundary, the dynamics cannot fully compensate for the physical forcing, and estimation of some of the physical processes responsible for the mean errors is possible. In contrast, the budget of the variance of the coefficients—that is, the energy budget—is more interesting and informative. The most apparent problem found in the model is a loss of rotational kinetic energy in the medium (total wavenumber n = 11–40) and small (n = 41–80) scales in the free atmosphere. About 50% of the loss is explained by excessive horizontal and vertical diffusion. There is a strong indication that the rest of the loss of kinetic energy is related to insufficient generation of available potential energy in the medium scale.

To isolate further the cause of the error in the energetics, several forecasts with budget diagnostics were performed. The experiments showed complex interactions between the physics and dynamics and among the different physical processes. Particularly noteworthy are (a) the compensation between horizontal and vertical diffusion and (b) the balance among horizontal/vertical diffusion, the barotropic scale interaction, and the baroclinic conversion terms in the rotational kinetic energy equation. The results of this study guided the design and implementation of changes in the NMC model in the horizontal diffusion and the cumulus parameterization.

Masao Kanamitsu and Suranjana Saha

Abstract

Atmospheric budget calculations suffer from various observational and numerical errors. This paper demonstrates that all budget calculations applied to a large number of samples suffer from additional errors originating from systematic tendency errors of the budget equation used. Quantitative evaluation of this systematic tendency error for various types of budget computations showed that the systematic tendency errors are generally comparable in magnitude to the leading terms in the budget equations. Because of this error, the calculated budget does not satisfy conservation properties under steady conditions.

Hideki Kanamaru and Masao Kanamitsu

Abstract

As an extreme demonstration of regional climate model capability, a dynamical downscaling of the NCEP–NCAR reanalysis was successfully performed over the Northern Hemisphere. Its success is due to the use of the scale-selective bias-correction scheme, which maintains the large-scale analysis of the driving global reanalysis in the interior of the domain where lateral boundary forcing has very little control. The downscaled analysis was found to produce reasonable regional details by comparison against 0.5° gridded analysis from the Climatic Research Unit of the University of East Anglia. Comparisons with smaller-area regional downscaling runs in India, Europe, and Japan using the same downscaling system showed that there is no degradation of quality in downscaled climate analysis by expanding the domain from a regional scale to a hemispherical scale.

Kei Yoshimura and Masao Kanamitsu

Abstract

With the aim of producing higher-resolution global reanalysis datasets from a coarse-resolution reanalysis, a global version of dynamical downscaling using a global spectral model is developed. A variant of spectral nudging, a modified form of the scale-selective bias correction developed for regional models, is adopted. The method includes 1) nudging of temperature in addition to the zonal and meridional components of the wind, 2) nudging toward the perturbation field rather than the perturbation tendency, and 3) no nudging or correction of the humidity. The downscaling experiment was performed using a T248L28 (about 50-km resolution) global model, driven by the so-called R-2 reanalysis (T62L28, or about 200-km resolution) during 2001. Evaluation against high-resolution observations showed that the monthly averaged global surface temperature and the daily variation of precipitation were much improved. Over North America, surface wind speed and temperature are much better, and over Japan the diurnal pattern of surface temperature is much improved, as are wind speed and precipitation, but not humidity. Three well-known synoptic/subsynoptic-scale weather patterns over the United States, Europe, and Antarctica were shown to become more realistic. This study suggests that global downscaling is a viable and economical method for obtaining a high-resolution reanalysis without rerunning a very expensive high-resolution full data assimilation.
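The core idea of spectral nudging can be illustrated with a one-dimensional caricature: relax only the large-scale part of the downscaled field toward the driving analysis, leaving small scales free. This is a sketch under assumptions; the actual scheme operates on spherical-harmonic coefficients of winds and temperature, and the parameter names (`cutoff`, `alpha`) are invented here.

```python
import numpy as np

def spectral_nudge(field, driving, cutoff, alpha=0.1):
    """Relax only the large-scale (wavenumber <= cutoff) part of `field`
    toward `driving`; smaller scales are left free to develop regional
    detail. `alpha` is an invented relaxation strength in [0, 1]."""
    f_hat = np.fft.rfft(field)
    d_hat = np.fft.rfft(driving)
    large = np.arange(f_hat.size) <= cutoff        # wavenumbers to nudge
    f_hat[large] += alpha * (d_hat[large] - f_hat[large])
    return np.fft.irfft(f_hat, n=field.size)
```

With `alpha = 1` the resolved large scales are replaced outright by the driving field, while a high-wavenumber perturbation survives untouched; smaller `alpha` values relax the large scales only gradually.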

Eugenia Kalnay and Masao Kanamitsu

Abstract

In atmospheric models that include vertical diffusion and surface fluxes of heat and moisture, it is common to observe large-amplitude “fibrillations” associated with these nonlinear damping terms. In this paper this phenomenon is studied through the analysis of a simple nonlinear damping equation, ∂X/∂t = −(KX^p)X + S. It is concluded that the behavior of several currently used time schemes for strongly nonlinear damping equations can be quite pathological, with either large-amplitude oscillations or even nonoscillatory but incorrect solutions. Also presented are new simple schemes, which are easy to implement and have a much wider range of stability. These schemes are applied in the new National Meteorological Center (NMC) spectral model.
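The pathology can be seen in a toy integration of dX/dt = −(KX^p)X + S. The constants below are invented, and the remedy shown is a generic linearly implicit scheme (damping coefficient lagged, damped variable implicit), not necessarily one of the schemes proposed in the paper.

```python
# dX/dt = -(K X^p) X + S with hypothetical constants; the steady state is
# X* = (S/K)**(1/(p+1)) ≈ 0.464 for these values.
K, p, S, dt = 10.0, 2, 1.0, 0.2

def explicit_step(x):
    # forward Euler: overshoots badly when dt * (damping rate) is large
    return x + dt * (-(K * x**p) * x + S)

def implicit_step(x):
    # lag the coefficient, treat the damped variable implicitly:
    # (X_new - X_old) / dt = -(K X_old^p) X_new + S
    return (x + dt * S) / (1.0 + dt * K * x**p)

xe = explicit_step(1.0)       # one explicit step overshoots to -0.8 (nonphysical)
xi = 1.0
for _ in range(50):
    xi = implicit_step(xi)    # converges smoothly toward X*
```

The implicit update stays positive whenever X and S are positive, and it shares the exact fixed point X* with the continuous equation, which is why schemes of this family have a much wider stability range than forward Euler.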

Hideki Kanamaru and Masao Kanamitsu

Abstract

The California Reanalysis Downscaling at 10 km (CaRD10) was compared with the North American Regional Reanalysis (NARR), a 32-km-resolution regional data assimilation analysis with 3-hourly output, produced with the Eta Model for the period from 1979 to the present using the NCEP–Department of Energy (DOE) reanalysis as lateral boundary conditions. The objectives of this comparison are twofold: 1) to understand the efficacy of regional downscaling and horizontal resolution and 2) to estimate the uncertainties in regional analyses due to system differences.

The large-scale component of the atmospheric analysis is similar in CaRD10 and NARR. The CaRD10 daily winds fit station observations better than NARR over the ocean, where daily variability is large, and over land. The daily near-surface temperature comparison shows a similar temporal correlation with observations in CaRD10 and NARR. Several synoptic examples, such as the Catalina eddy, coastally trapped wind reversal, and Santa Ana winds, are better reproduced in CaRD10 than in NARR. These results suggest that the horizontal resolution of the model has a large influence on the regional analysis and that near-surface observations are not properly assimilated in the current state-of-the-art regional data assimilation system.

The CaRD10 near-surface temperature and winds on monthly and hourly scales are similar to NARR, with more regional detail available in CaRD10. The southwestern monsoon is poorly reproduced in CaRD10 because of the position of the lateral boundary. The spatial pattern of the two precipitation analyses is similar, but CaRD10 shows smaller-scale features despite a positive bias. The trends of 500-hPa height and precipitation are similar in the two analyses, but the spatial patterns of the near-surface temperature trend do not agree, suggesting the importance of regional topography, model physics, and land surface schemes. A comparison of a major storm event shows that both analyses suffer from budget residuals. CaRD10's large precipitation is related to wind direction, the spatial distribution of precipitable water, and a large moisture convergence.

Dynamical downscaling forced by a global analysis is a computationally economical approach to regional-scale long-term climate analysis and can provide a high-quality climate analysis comparable to current state-of-the-art data-assimilated regional reanalysis. However, uncertainties in regional analyses can be large and caution should be exercised when using them for climate applications.

Masao Kanamitsu and Hideki Kanamaru

Abstract

For the purpose of producing datasets for regional-scale climate change research and application, the NCEP–NCAR reanalysis for the period 1948–2005 was dynamically downscaled to hourly, 10-km resolution over California using the Regional Spectral Model.

This is Part I of a two-part paper, describing the details of the downscaling system and comparing the downscaled analysis [California Reanalysis Downscaling at 10 km (CaRD10)] against observation and global analysis. An extensive validation of the downscaled analysis was performed using station observations, Higgins gridded precipitation analysis, and Precipitation-Elevation Regression on Independent Slopes Model (PRISM) precipitation analysis.

In general, the CaRD10 near-surface wind and temperature fit better to regional-scale station observations than the NCEP–NCAR reanalysis used to force the regional model, supporting the premise that the regional downscaling is a viable method to attain regional detail from large-scale analysis. This advantage of CaRD10 was found on all time scales, ranging from hourly to decadal scales (i.e., from diurnal variation to multidecadal trend).

Dynamically downscaled analysis provides ways to study various regional climate phenomena of different time scales because all produced variables are dynamically, physically, and hydrologically consistent. However, the CaRD10 is not free from problems. It suffers from positive bias in precipitation for heavy precipitation events. The CaRD10 is inaccurate near the lateral boundary where regional detail is damped by the lateral boundary relaxation. It is important to understand these limitations before the downscaled analysis is used for research.

Tosiyuki Nakaegawa and Masao Kanamitsu

Abstract

Cluster analysis was used to study the seasonal forecast skill of winter-season NCEP seasonal forecast model (SFM) hindcasts over the Pacific–North America (PNA) sector. Two skill scores, based on the cluster mean and the ensemble mean, are compared. It was shown that the anomaly correlation coefficients (ACCs) of the cluster mean are generally higher than those of the simple ensemble mean. The results indicated that the skill was affected by the existence of multiple atmospheric regimes. Multiple regimes tend to appear more often during near-normal tropical Pacific sea surface temperature (SST) episodes, while a single regime tends to appear during warm/cold episodes. The dissimilarity among the cluster members is small and the number of the dominant cluster members is also small when the tropical SST anomaly is large, suggesting that the external forcing reduces the frequency of occurrence of multiple regimes. The improvements from the ensemble-mean ACCs to the cluster-mean ACCs are statistically significant. Thus, the cluster mean can be used as a supplementary tool for seasonal forecasting.
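A hypothetical toy version of the comparison shows why a cluster mean can beat the ensemble mean when two regimes coexist. The member counts, noise levels, and bare-bones two-cluster k-means below are all invented for the demonstration; none of it is the paper's data or clustering method.

```python
import numpy as np

rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0.0, 2.0 * np.pi, 50))

# 10 synthetic "members": 6 follow regime +truth, 4 follow the opposite regime
members = np.array(
    [truth + 0.5 * rng.standard_normal(50) for _ in range(6)]
    + [-truth + 0.5 * rng.standard_normal(50) for _ in range(4)])

def acc(f, o):
    # anomaly correlation coefficient (the fields here are already anomalies)
    return float(np.corrcoef(f, o)[0, 1])

# bare-bones 2-means clustering, centroids seeded with the first and last member
cen = members[[0, -1]].copy()
for _ in range(10):
    lab = ((members[:, None, :] - cen[None]) ** 2).sum(-1).argmin(1)
    cen = np.array([members[lab == k].mean(0) for k in (0, 1)])

cluster_mean = cen[np.bincount(lab).argmax()]  # mean of the dominant regime
ens_mean = members.mean(0)
```

Averaging across the opposing regimes washes out the signal, so the dominant-cluster mean correlates with `truth` noticeably better than the plain ensemble mean, mirroring the ACC improvement reported above.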

Kei Yoshimura and Masao Kanamitsu

Abstract

This research was motivated by the need for a less costly alternative to the conventional brute-force approach to ensemble downscaling, which applies dynamical downscaling to each ensemble member and obtains a reliable forecast by taking the ensemble average of all the downscaled members. That approach, although straightforward, is too computationally expensive for an operational environment. Herein a method for downscaling ensemble mean forecasts is proposed. Although this method does not provide probabilistic forecasts, it provides regional-scale detail at minimum cost. In this product, all of the predicted parameters are dynamically and physically consistent (i.e., most likely to occur on a seasonal time scale). It is believed that such a product has great utility for regional climate forecast and application products. The method applies a correction to one of the global forecast members in such a way that its seasonal mean is equal to that of the ensemble mean, and it then downscales the corrected global forecast. The method was tested for a 140-yr period using the Twentieth-Century Reanalysis dataset, a product of ensemble Kalman filter data assimilation. Use of the method clearly improves the downscaling skill compared to the case of using only a single member; the skill becomes equivalent to that achieved when between two and six members are downscaled directly.
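The correction step described above can be sketched in a few lines. The array shapes and the function name are assumptions for illustration; the actual product applies this idea within the full downscaling system.

```python
import numpy as np

def correct_member(member, ensemble):
    """Shift one member's time series so that its seasonal (time) mean equals
    the ensemble's seasonal mean, leaving its day-to-day variability intact.
    member:   (ntime, ...) one global forecast member
    ensemble: (nmember, ntime, ...) all members"""
    return member - member.mean(axis=0) + ensemble.mean(axis=(0, 1))
```

The corrected member, rather than the smooth ensemble mean itself, is what gets dynamically downscaled, so the regional model sees realistic synoptic variability while still matching the ensemble's seasonal signal.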
