Search Results

You are looking at 1 - 10 of 14 items for

  • Author or Editor: Masayoshi Ishii
Tatsuo Suzuki and Masayoshi Ishii

Abstract

Using historical ocean hydrographic observations, decadal to multidecadal sea level changes from 1951 to 2007 in the North Pacific were investigated focusing on vertical density structures. Hydrographically, the sea level changes could reflect the following: changes in the depth of the main pycnocline, density gradient changes across the pycnocline, and modification of the water mass density structure within the pycnocline. The first two processes are characterized as the first baroclinic mode. The changes in density stratification across the pycnocline are sufficiently small to maintain the vertical profile of the first baroclinic mode in this analysis period. Therefore, the first mode should represent mainly the dynamical response to the wind stress forcing. Meanwhile, changes in the composite of all modes of order greater than 1 (remaining baroclinic mode) can be attributed to water mass modifications above the pycnocline. The first baroclinic mode is associated with 40–60-yr fluctuations in the subtropical gyre and bidecadal fluctuations of the Kuroshio Extension (KE) in response to basin-scale wind stress changes. In addition to this, the remaining baroclinic mode exhibits strong variability around the recirculation region south of the KE and regions downstream of the KE, accompanied by 40–60-yr and bidecadal fluctuations, respectively. These fluctuations follow spinup/spindown of the subtropical gyre and meridional shifts of the KE shown in the first mode, respectively. A lag correlation analysis suggests that interdecadal sea level changes due to water mass density changes are a secondary consequence of changes in basin-scale wind stress forcing related to the ocean circulation changes associated with the first mode.
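
For readers unfamiliar with the lag correlation analysis mentioned above, a minimal sketch is given below. It assumes annual-mean sea level series have already been separated into a first-baroclinic-mode part and a remaining-mode part; all series and names here are synthetic stand-ins rather than the study's data.

```python
import numpy as np

def lag_correlation(x, y, max_lag):
    """Correlation of x(t) with y(t + lag); a positive lag means x leads y."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = np.arange(-max_lag, max_lag + 1)
    corrs = []
    for lag in lags:
        if lag >= 0:
            a, b = x[:x.size - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:y.size + lag]
        corrs.append(np.corrcoef(a, b)[0, 1])
    return lags, np.array(corrs)

# Synthetic annual-mean sea level anomalies, 1951-2007, at one location.
rng = np.random.default_rng(0)
years = np.arange(1951, 2008)
first_mode = np.sin(2 * np.pi * (years - 1951) / 50.0)   # stand-in for the wind-driven first mode
remaining_modes = np.roll(first_mode, 3) + 0.3 * rng.standard_normal(years.size)  # delayed copy plus noise

lags, corrs = lag_correlation(first_mode, remaining_modes, max_lag=10)
print("lag of maximum correlation (yr):", lags[np.argmax(corrs)])
```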

Full access
Shoji Hirahara, Masayoshi Ishii, and Yoshikazu Fukuda

Abstract

A new sea surface temperature (SST) analysis on a centennial time scale is presented. In this analysis, a daily SST field is constructed as a sum of a trend, interannual variations, and daily changes, using in situ SST and sea ice concentration observations. All SST values are accompanied by theory-based analysis errors as a measure of reliability. An improved equation is introduced to represent the ice–SST relationship, which is used to produce SST data from observed sea ice concentrations. Prior to the analysis, biases of individual SST measurement types are estimated to homogenize the long-term time series of global mean SST. Because metadata necessary for the bias correction are unavailable for many historical observational reports, the biases are determined so as to ensure consistency among existing SST and nighttime air temperature observations. The global mean SSTs with bias-corrected observations are in agreement with those of a previously published study, which adopted a different approach. Satellite observations are newly introduced to reconstruct SST variability over data-sparse regions. Moreover, uncertainty in areal means of the present and previous SST analyses is investigated using the theoretical analysis errors and estimated sampling errors. The result confirms the advantages of the present analysis and is helpful for assessing the reliability of SST for a specific area and time period.
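
As a rough illustration of building a field as the sum of a trend, interannual variations, and daily changes, the sketch below splits a single-point daily SST series using deliberately simple choices (a linear fit and a 365-day running mean). The actual analysis uses more elaborate statistics; all values and names here are synthetic.

```python
import numpy as np

def decompose_sst(sst_daily):
    """Split a daily SST series into trend + interannual + daily components."""
    t = np.arange(sst_daily.size)

    # Trend: a simple linear fit (the real analysis estimates a smoother centennial trend).
    trend = np.polyval(np.polyfit(t, sst_daily, 1), t)

    # Interannual variations: 365-day running mean of the detrended series.
    detrended = sst_daily - trend
    interannual = np.convolve(detrended, np.ones(365) / 365.0, mode="same")

    # Daily changes: whatever remains.
    daily = detrended - interannual
    return trend, interannual, daily

# Synthetic 40-yr daily SST series (degC) at one grid point.
rng = np.random.default_rng(0)
t = np.arange(40 * 365)
sst = 20.0 + 0.3 * t / t.size + 0.5 * np.sin(2 * np.pi * t / (365 * 6)) + 0.2 * rng.standard_normal(t.size)

trend, interannual, daily = decompose_sst(sst)
print("components sum back to the input:", np.allclose(trend + interannual + daily, sst))
```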

Full access
Masayoshi Ishii, Masahide Kimoto, and Misako Kachi

Abstract

An objective analysis of monthly ocean subsurface temperatures from 1950 to 1998 is carried out. The analysis scheme and the results with estimated analysis errors are presented.

The analysis domain is global, with a horizontal grid of 1° × 1° and 14 vertical levels in the upper 500 m. Subsurface temperature observations used in the objective analysis are archived by the National Oceanographic Data Center of the National Oceanic and Atmospheric Administration, together with those collected through the Global Telecommunication System and domestic communication lines in Japan. All the observations are preprocessed by quality control and data selection procedures developed in this study. From these observations, three-dimensional fields of the upper-ocean temperature are optimally estimated using a variational technique. To ensure smooth and continuous vertical temperature profiles, a constraint term is introduced into the cost function that is minimized in the analysis. In addition, the analysis scheme is formulated to constrain mixed layer temperatures to remain close to sea surface temperatures produced by the Met Office.
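
The flavor of such a variational analysis can be sketched for a single vertical profile: a cost function penalizes departures from a first guess, from the observations, from vertical smoothness, and from an SST value at the top level, and is then minimized. The weights, levels, and profile values below are hypothetical and chosen only for illustration, not taken from the study.

```python
import numpy as np
from scipy.optimize import minimize

z = np.linspace(0.0, 500.0, 14)        # 14 analysis levels in the upper 500 m
first_guess = 25.0 - 0.03 * z          # hypothetical climatological profile (degC)

# Hypothetical observations at a few depths, plus an SST value for the mixed layer constraint.
obs_depth = np.array([10.0, 120.0, 400.0])
obs_temp = np.array([24.5, 20.0, 11.5])
sst = 24.8

def cost(x):
    # Background term: stay close to the first guess.
    j_bg = np.sum((x - first_guess) ** 2)
    # Observation term: interpolate the analysis to observation depths.
    j_obs = 10.0 * np.sum((np.interp(obs_depth, z, x) - obs_temp) ** 2)
    # Smoothness constraint: penalize curvature of the vertical profile.
    j_smooth = 5.0 * np.sum(np.diff(x, n=2) ** 2)
    # Mixed layer constraint: keep the uppermost level close to SST.
    j_sst = 10.0 * (x[0] - sst) ** 2
    return j_bg + j_obs + j_smooth + j_sst

result = minimize(cost, first_guess)
print("analyzed profile (degC):", np.round(result.x, 2))
```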

The three-dimensional structure of thermal anomalies is represented by the objective analysis. Interannual variations of temperature anomalies in the northern and tropical Pacific are presented and examined together with the estimated errors. To verify the objective analysis against independent observations, dynamic heights estimated from the analyzed temperatures and climatological salinity are compared with tide gauge and sea surface height observations.

An investigation of the analysis errors and the signal-to-noise (S–N) ratio reveals that reliability has increased in the tropical Pacific since the 1970s; the S–N ratio for seasonally averaged temperatures in a 3° latitude × 6° longitude box at 100-m depth reaches 2.5 in the 1990s. This is due not only to the increase in data sampling but also to an increase in interannual variances of subsurface temperature. At midlatitudes of the Northern (Southern) Hemisphere, the S–N ratio is above (below) unity over the whole analysis period. The changes there are very small over these 50 yr, although recent observational networks cover the global oceans better and the observations are more homogeneously distributed than in previous decades.

Full access
Yukiko Imada, Shinjiro Kanae, Masahide Kimoto, Masahiro Watanabe, and Masayoshi Ishii

Abstract

Predictability of above-normal rainfall over Thailand during the rainy season of 2011 was investigated with a one-tier seasonal prediction system based on an atmosphere–ocean coupled general circulation model (CGCM) combined with a statistical downscaling method. The statistical relationship was derived using singular value decomposition analysis (SVDA) between observed regional rainfall and hindcasts of tropical sea surface temperature (SST) from the seasonal prediction system, which can forecast oceanic variability at lead times of up to several months. The downscaled product of 2011 local rainfall was obtained by combining rainfall patterns derived from the significant SVDA modes. This method has the advantage of flexibility in that phenomenon-based statistical relationships, such as teleconnections associated with El Niño–Southern Oscillation (ENSO), the Indian Ocean dipole (IOD), or the newly recognized central Pacific El Niño, are treated separately in each SVDA mode. The downscaled prediction initialized on 1 August 2011 reproduced the anomalously intense precipitation pattern over Indochina, including northern Thailand, during the latter half of the rainy season, even though the direct hindcast from the CGCM failed to predict the local rainfall distribution and intensity. Further analysis revealed that this method is also applicable to other recent events, such as the heavy rainfall during the rainy seasons of 2002 and 2008 in Indochina.
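
A minimal sketch of SVD-based statistical downscaling in this spirit is shown below: the cross-covariance between predictor (SST) and predictand (rainfall) anomalies is decomposed, and a forecast SST anomaly is projected onto the leading coupled modes to rebuild rainfall. All arrays, sizes, and names are random stand-ins, not the study's data or code.

```python
import numpy as np

# Hypothetical training data: anomalies over n_time seasons.
n_time, n_sst, n_rain = 30, 200, 50
rng = np.random.default_rng(0)
sst_anom = rng.standard_normal((n_time, n_sst))     # hindcast tropical SST anomalies
rain_anom = rng.standard_normal((n_time, n_rain))   # observed regional rainfall anomalies

# SVD of the cross-covariance matrix gives paired SST/rainfall patterns.
cov = sst_anom.T @ rain_anom / (n_time - 1)
u, s, vt = np.linalg.svd(cov, full_matrices=False)
n_modes = 3                                          # keep the leading coupled modes
u, vt = u[:, :n_modes], vt[:n_modes, :]

# Expansion coefficients of each field on its own patterns.
a = sst_anom @ u                                     # (n_time, n_modes)
b = rain_anom @ vt.T                                 # (n_time, n_modes)

# Per-mode regression of rainfall coefficients on SST coefficients.
reg = (a * b).sum(axis=0) / (a * a).sum(axis=0)

def downscale(sst_forecast_anom):
    """Project a forecast SST anomaly onto the SST patterns and rebuild rainfall."""
    a_fcst = sst_forecast_anom @ u
    return (a_fcst * reg) @ vt

rain_forecast = downscale(rng.standard_normal(n_sst))
print("downscaled rainfall anomalies (first five points):", np.round(rain_forecast[:5], 2))
```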

Full access
Sayaka Yasunaka, Masayoshi Ishii, Masahide Kimoto, Takashi Mochizuki, and Hideo Shiogama

Abstract

The influence of the expendable bathythermograph (XBT) depth bias correction on decadal climate prediction is examined using a coupled atmosphere–ocean general circulation model, the Model for Interdisciplinary Research on Climate 3 (MIROC3). The global mean subsurface ocean temperatures simulated by the model with prescribed anthropogenic and natural forcing are consistent with bias-corrected observations from the mid-1960s onward, but not with uncorrected observations. The disagreement with the uncorrected observations reflects biases in subsurface ocean temperatures, particularly along the thermocline in the tropics and subtropics. When the correction is not applied to XBT observations, these biases are retained in the data assimilation results used for the model’s initial conditions. Hindcasting past Pacific decadal oscillations (PDOs) is more successful in the experiment with the bias-corrected observations than in the experiment without the correction. Improved skill in predicting the 5-yr mean, vertically averaged ocean subsurface temperature is also seen in the tropical Pacific and the central North Pacific, where PDO-related signals appear large.
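
Conceptually, an XBT depth bias correction rescales the depths inferred from the probe's fall-rate equation before the profiles are used. The sketch below applies a purely hypothetical multiplicative factor (the real corrections depend on probe type and period, and may also include a thermal bias term) and interpolates the corrected profile back to the reported levels; all numbers are invented for illustration.

```python
import numpy as np

def correct_xbt_profile(reported_depth, temperature, depth_factor=1.03):
    """Apply a hypothetical multiplicative depth correction to an XBT profile
    and interpolate the temperatures back to the originally reported levels."""
    corrected_depth = reported_depth * depth_factor   # sign and size of the factor are placeholders
    return np.interp(reported_depth, corrected_depth, temperature)

# Hypothetical XBT profile: reported depths (m) and temperatures (degC).
reported_depth = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 700.0])
temperature = np.array([28.0, 27.0, 22.0, 15.0, 9.0, 6.0])

corrected = correct_xbt_profile(reported_depth, temperature)
print("temperature at reported levels after depth correction:", np.round(corrected, 2))
```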

Full access
Yukiko Imada, Hiroaki Tatebe, Masayoshi Ishii, Yoshimitsu Chikamoto, Masato Mori, Miki Arai, Masahiro Watanabe, and Masahide Kimoto

Abstract

Predictability of El Niño–Southern Oscillation (ENSO) is examined using ensemble hindcasts made with a seasonal prediction system based on an atmosphere–ocean general circulation model, the Model for Interdisciplinary Research on Climate, version 5 (MIROC5). Particular attention is paid to differences in predictive skill, measured by the prediction error, for two prominent types of El Niño: the conventional eastern Pacific (EP) El Niño and the central Pacific (CP) El Niño, the latter having a maximum warming around the date line. Although the system adopts ocean anomaly assimilation for the initialization process, it retains a significant ability to predict ENSO at lead times of more than half a year. This is partly because the system is little affected by the “spring prediction barrier,” since growth of the error has little dependence on the thermocline variability. Composite analyses of each type of El Niño reveal that, compared to EP El Niños, the ability to predict CP El Niños is limited and has a shorter lead time. This is because CP El Niños have relatively small amplitudes and are therefore more strongly affected by atmospheric noise, which prevents the development of oceanic signals that can be used for prediction.
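
As an illustration of how such skill comparisons are often quantified, the sketch below composites the root-mean-square prediction error of an ENSO index by lead time separately for EP-type and CP-type events. The event samples are synthetic placeholders, not MIROC5 hindcasts.

```python
import numpy as np

def rmse_by_lead(forecasts, observations):
    """Root-mean-square error as a function of lead time.
    forecasts, observations: arrays of shape (n_events, n_leads)."""
    return np.sqrt(np.mean((forecasts - observations) ** 2, axis=0))

rng = np.random.default_rng(1)
n_leads = 9  # monthly lead times

# Synthetic stand-ins: EP events have larger amplitude than CP events,
# and forecast errors grow with lead time in both composites.
obs_ep = 2.0 * np.ones((6, n_leads))
obs_cp = 0.8 * np.ones((4, n_leads))
error_growth = 0.2 * np.arange(1, n_leads + 1)
fcst_ep = obs_ep + error_growth * rng.standard_normal((6, n_leads))
fcst_cp = obs_cp + error_growth * rng.standard_normal((4, n_leads))

print("EP composite RMSE by lead:", np.round(rmse_by_lead(fcst_ep, obs_ep), 2))
print("CP composite RMSE by lead:", np.round(rmse_by_lead(fcst_cp, obs_cp), 2))
```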

Full access
Tim Boyer, Catia M. Domingues, Simon A. Good, Gregory C. Johnson, John M. Lyman, Masayoshi Ishii, Viktor Gouretski, Josh K. Willis, John Antonov, Susan Wijffels, John A. Church, Rebecca Cowley, and Nathaniel L. Bindoff

Abstract

Ocean warming accounts for the majority of the earth’s recent energy imbalance. Historical ocean heat content (OHC) changes are important for understanding changing climate. Calculations of OHC anomalies (OHCA) from in situ measurements provide estimates of these changes. Uncertainties in OHCA estimates arise from calculating global fields from temporally and spatially irregular data (the mapping method), from instrument bias corrections, and from the definitions of a baseline climatology from which anomalies are calculated. To investigate the sensitivity of OHCA estimates for the upper 700 m to these factors, the same quality-controlled dataset is used by seven groups and comparisons are made. Two time periods (1970–2008 and 1993–2008) are examined. Uncertainty due to the mapping method is 16.5 ZJ for 1970–2008 and 17.1 ZJ for 1993–2008 (1 ZJ = 1 × 10²¹ J). Uncertainty due to instrument bias correction varied from 8.0 to 17.9 ZJ for 1970–2008 and from 10.9 to 22.4 ZJ for 1993–2008, depending on the mapping method. Uncertainty due to the baseline mean varied from 3.5 to 14.5 ZJ for 1970–2008 and from 2.7 to 9.8 ZJ for 1993–2008, depending on the mapping method and offsets. On average, the mapping method is the largest source of uncertainty. The linear trend varied from 1.3 to 5.0 ZJ yr⁻¹ (0.08–0.31 W m⁻²) for 1970–2008 and from 1.5 to 9.4 ZJ yr⁻¹ (0.09–0.58 W m⁻²) for 1993–2008, depending on the method, instrument bias correction, and baseline mean. Despite these complications, a statistically robust upper-ocean warming was found in all cases for the full time period.
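
A minimal sketch of the OHCA bookkeeping involved: integrate density times specific heat times the temperature anomaly over the upper 700 m and over the grid-cell volumes, then fit a linear trend and express it in ZJ per year. The grid, constants, and anomaly field below are nominal placeholders (a 5° grid, no land mask), not the dataset used by the seven groups.

```python
import numpy as np

RHO = 1025.0   # seawater density (kg m-3), nominal
CP = 3990.0    # specific heat of seawater (J kg-1 K-1), nominal

def global_ohca(temp_anom, depth_edges, lat, lon_res_deg):
    """Global OHC anomaly (J) from a (time, depth, lat, lon) temperature anomaly
    array on a regular grid; land masking is omitted for brevity."""
    earth_radius = 6.371e6
    dlat = np.deg2rad(lat[1] - lat[0])
    dlon = np.deg2rad(lon_res_deg)
    cell_area = earth_radius ** 2 * dlat * dlon * np.cos(np.deg2rad(lat))  # (lat,)
    layer_thickness = np.diff(depth_edges)                                 # (depth,)
    heat = temp_anom * layer_thickness[None, :, None, None] * cell_area[None, None, :, None]
    return RHO * CP * heat.sum(axis=(1, 2, 3))

# Hypothetical anomaly field: 1970-2008, seven layers spanning 0-700 m, 5-degree grid.
years = np.arange(1970, 2009)
depth_edges = np.array([0.0, 50.0, 100.0, 200.0, 300.0, 400.0, 550.0, 700.0])
lat = np.arange(-87.5, 90.0, 5.0)
temp_anom = 0.005 * (years - years.mean())[:, None, None, None] * np.ones((1, 7, lat.size, 72))

ohca = global_ohca(temp_anom, depth_edges, lat, lon_res_deg=5.0)
print("linear trend: %.2f ZJ per year" % (np.polyfit(years, ohca, 1)[0] / 1e21))
```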

Full access
Abhishek Savita, Catia M. Domingues, Tim Boyer, Viktor Gouretski, Masayoshi Ishii, Gregory C. Johnson, John M. Lyman, Josh K. Willis, Simon J. Marsland, William Hobbs, John A. Church, Didier P. Monselesan, Peter Dobrohotoff, Rebecca Cowley, and Susan E. Wijffels

Abstract

The Earth system is accumulating energy as a result of human activities. More than 90% of this energy has been stored in the ocean as heat since 1970, with ∼60% of that in the upper 700 m. Differences among upper-ocean heat content anomaly (OHCA) estimates, however, exist. Here, we use a dataset protocol for 1970–2008, with six instrumental bias adjustments applied to expendable bathythermograph (XBT) data and mapping performed by six research groups, to evaluate the spatiotemporal spread in upper OHCA estimates arising from two choices: 1) the instrumental bias adjustment and 2) the mathematical (i.e., mapping) technique used to interpolate and extrapolate data in space and time. We also examine the effect of a common ocean mask, which reveals that exclusion of shallow seas can reduce global OHCA estimates by up to 13%. The spread due to mapping method is largest in the Indian Ocean and in the eddy-rich and frontal regions of all basins. The spread due to XBT bias adjustment is largest in the Pacific Ocean within 30°N–30°S. In both cases, the spread is higher for 1990–2004. Statistically different trends among mapping methods are found not only in the poorly observed Southern Ocean but also in the well-observed northwest Atlantic. Our results cannot determine the best mapping or bias adjustment schemes, but they identify where important sensitivities exist, and thus where further understanding will help to refine OHCA estimates. These results highlight the need for further coordinated OHCA studies to evaluate the performance of existing mapping methods, along with a comprehensive assessment of uncertainty estimates.
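
The effect of a common ocean mask and the inter-group spread can be sketched as follows, with all estimates replaced by random placeholders: cells are kept only where every group reports a value, and the spread is the standard deviation across the groups' global means. Array shapes and masks here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n_groups, n_time, n_lat, n_lon = 6, 39, 36, 72

# Hypothetical gridded upper-OHCA estimates (J m-2) from six mapping groups, 1970-2008.
estimates = rng.standard_normal((n_groups, n_time, n_lat, n_lon))

# Common ocean mask: keep only cells where every group provides a value
# (e.g., shallow seas excluded by some groups are dropped for all).
masks = rng.random((n_groups, n_lat, n_lon)) > 0.1   # True = cell mapped by this group
common_mask = masks.all(axis=0)

# Per-group global-mean series on the common mask, and the inter-group spread.
masked = estimates[:, :, common_mask]                 # (n_groups, n_time, n_cells)
global_means = masked.mean(axis=2)
spread = global_means.std(axis=0)                     # spread among the six estimates each year
print("mean inter-group spread, 1970-2008:", round(float(spread.mean()), 3))
```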

Open access
Ryo Mizuta, Akihiko Murata, Masayoshi Ishii, Hideo Shiogama, Kenshi Hibino, Nobuhito Mori, Osamu Arakawa, Yukiko Imada, Kohei Yoshida, Toshinori Aoyagi, Hiroaki Kawase, Masato Mori, Yasuko Okada, Tomoya Shimura, Toshiharu Nagatomo, Mikiko Ikeda, Hirokazu Endo, Masaya Nosaka, Miki Arai, Chiharu Takahashi, Kenji Tanaka, Tetsuya Takemi, Yasuto Tachikawa, Khujanazarov Temur, Youichi Kamae, Masahiro Watanabe, Hidetaka Sasaki, Akio Kitoh, Izuru Takayabu, Eiichi Nakakita, and Masahide Kimoto

Abstract

An unprecedentedly large ensemble of climate simulations with a 60-km atmospheric general circulation model and dynamical downscaling with a 20-km regional climate model has been performed to obtain probabilistic future projections of low-frequency local-scale events. The climate of the latter half of the twentieth century, the climate 4 K warmer than the preindustrial climate, and the climate of the latter half of the twentieth century without historical trends associated with the anthropogenic effect are each simulated for more than 5,000 years. From large ensemble simulations, probabilistic future changes in extreme events are available directly without using any statistical models. The atmospheric models are highly skillful in representing localized extreme events, such as heavy precipitation and tropical cyclones. Moreover, mean climate changes in the models are consistent with those in phase 5 of the Coupled Model Intercomparison Project (CMIP5) ensembles. Therefore, the results enable the assessment of probabilistic change in localized severe events that have large uncertainty from internal variability. The simulation outputs are open to the public as a database called “Database for Policy Decision Making for Future Climate Change” (d4PDF), which is intended to be utilized for impact assessment studies and adaptation planning for global warming.
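
Estimating event probabilities directly from such a large ensemble amounts to simple counting. The sketch below compares empirical exceedance probabilities between a historical-like and a +4K-like ensemble using synthetic annual maxima; the distributions, sample sizes, and threshold are invented for illustration and are not d4PDF output.

```python
import numpy as np

def exceedance_probability(annual_maxima, threshold):
    """Empirical probability that the annual maximum exceeds a threshold,
    counted directly over all simulated years (no statistical model fitted)."""
    return np.mean(annual_maxima > threshold)

rng = np.random.default_rng(3)

# Hypothetical annual-maximum daily precipitation (mm) at one location,
# for several thousand simulated years in each climate.
historical = rng.gumbel(loc=100.0, scale=20.0, size=6000)
plus_4k = rng.gumbel(loc=115.0, scale=24.0, size=5400)

threshold = 200.0   # a rare, damaging event
p_hist = exceedance_probability(historical, threshold)
p_warm = exceedance_probability(plus_4k, threshold)
print("P(exceed %d mm): historical %.4f, +4K %.4f, ratio %.1f"
      % (threshold, p_hist, p_warm, p_warm / max(p_hist, 1e-9)))
```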

Full access
Adrian M. Tompkins, María Inés Ortiz De Zárate, Ramiro I. Saurral, Carolina Vera, Celeste Saulo, William J. Merryfield, Michael Sigmond, Woo-Sung Lee, Johanna Baehr, Alain Braun, Amy Butler, Michel Déqué, Francisco J. Doblas-Reyes, Margaret Gordon, Adam A. Scaife, Yukiko Imada, Masayoshi Ishii, Tomoaki Ose, Ben Kirtman, Arun Kumar, Wolfgang A. Müller, Anna Pirani, Tim Stockdale, Michel Rixen, and Tamaki Yasuda
Open access