Search Results
You are looking at 31–40 of 40 items for
- Author or Editor: Jin Huang
- Refine by Access: All Content
The Coordinated Enhanced Observing Period (CEOP) is an international project that was first proposed by the Global Energy and Water Cycle Experiment (GEWEX) in 1997 and was formally launched in 2001. Since that time it has been adopted by the World Climate Research Programme (WCRP), which views it as an essential part of its strategy for developing global datasets to evaluate global climate models, and by the Integrated Global Observing Strategy Partnership (IGOS-P), which views it as the first element of its global water cycle theme. The United States has been an active partner in all phases of CEOP. In particular, the United States has taken the lead in contributing data from a number of reference sites, providing data processing and archiving capabilities, and supporting related research activities through the GEWEX Americas Prediction Project (GAPP). Other U.S. programs and agencies are providing components including model and data assimilation output, satellite data, and other services. The U.S. science community has also been using the CEOP database in model evaluation and phenomenological studies. This article summarizes the U.S. contributions during the first phase of CEOP and outlines opportunities for readers to become involved in the data analysis phase of the project.
Abstract
Global simulations have been conducted with the European Centre for Medium-Range Weather Forecasts operational model run at T1279 resolution for multiple decades representing climate from the late twentieth and late twenty-first centuries. Changes in key components of the water cycle are examined, focusing on variations at short time scales. Metrics of coupling and feedbacks between soil moisture and surface fluxes and between surface fluxes and properties of the planetary boundary layer (PBL) are inspected. Features of precipitation and other water cycle trends from coupled climate model consensus projections are well simulated. Extreme 6-hourly rainfall totals become more intense over much of the globe, suggesting an increased risk for flash floods. Seasonal-scale droughts are projected to escalate over much of the subtropics and midlatitudes during summer, while tropical and winter droughts become less likely. These changes are accompanied by an increase in the responsiveness of surface evapotranspiration to soil moisture variations. Even though daytime PBL depths increase over most locations in the next century, greater latent heat fluxes also occur over most land areas, contributing a larger energy effect per unit mass of air, except over some semiarid regions. This general increase in land–atmosphere coupling is represented in a combined metric as a “land coupling index” that incorporates the terrestrial and atmospheric effects together. The enhanced feedbacks are consistent with the precipitation changes, but a causal connection cannot be made without further sensitivity studies. Nevertheless, this approach could be applied to the output of traditional climate change simulations to assess changes in land–atmosphere feedbacks.
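The abstract describes the "land coupling index" only qualitatively. A common way to quantify such two-legged coupling (a sketch under our own assumptions, not the paper's exact formulation) is to score each leg as the correlation between driver and response, scaled by the standard deviation of the response, and then combine the terrestrial and atmospheric legs:

```python
import numpy as np

def coupling_leg(driver, response):
    """One 'leg' of a land-atmosphere coupling metric: the
    correlation of driver and response, scaled by the standard
    deviation of the response so the leg carries the magnitude
    of the induced variability, not just its phase agreement."""
    r = np.corrcoef(driver, response)[0, 1]
    return r * np.std(response)

def land_coupling_index(soil_moisture, surface_flux, pbl_metric):
    """Combine the terrestrial leg (soil moisture -> surface flux)
    and the atmospheric leg (surface flux -> PBL property) into a
    single number. The product form is a hypothetical combination;
    the paper's combined metric may differ in detail."""
    terrestrial = coupling_leg(soil_moisture, surface_flux)
    atmospheric = coupling_leg(surface_flux, pbl_metric)
    return terrestrial * atmospheric
```

In this form the index is large only when both legs are active, which matches the abstract's point that land-atmosphere feedback requires the soil moisture signal to reach the surface fluxes *and* the fluxes to influence the boundary layer.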
Abstract
Northern Hemisphere tropical cyclone (TC) activity is investigated in multiyear global climate simulations with the ECMWF Integrated Forecast System (IFS) at 10-km resolution forced by the observed records of sea surface temperature and sea ice. The results are compared to analogous simulations with the 16-, 39-, and 125-km versions of the model as well as observations.
In the North Atlantic, mean TC frequency in the 10-km model is comparable to the observed frequency, whereas it is too low in the other versions. While spatial distributions of the genesis and track densities improve systematically with increasing resolution, the 10-km model displays qualitatively more realistic simulation of the track density in the western subtropical North Atlantic. In the North Pacific, the TC count tends to be too high in the west and too low in the east for all resolutions. These model errors appear to be associated with the errors in the large-scale environmental conditions that are fairly similar in this region for all model versions.
The largest benefits of the 10-km simulation are the dramatically more accurate representation of the TC intensity distribution and the structure of the most intense storms. The model can generate a supertyphoon with a maximum surface wind speed of 68.4 m s⁻¹. The life cycle of an intense TC comprises intensity fluctuations that occur in apparent connection with variations of the eyewall/rainband structure. These findings suggest that a hydrostatic model with cumulus parameterization and of high enough resolution could be efficiently used to simulate the TC intensity response (and the associated structural changes) to future climate change.
Abstract
How tropical cyclone (TC) activity in the northwestern Pacific might change in a future climate is assessed using multidecadal Atmospheric Model Intercomparison Project (AMIP)-style and time-slice simulations with the ECMWF Integrated Forecast System (IFS) at 16-km and 125-km global resolution. Both models reproduce many aspects of the present-day TC climatology and variability well, although the 16-km IFS is far more skillful in simulating the full intensity distribution and genesis locations, including their changes in response to El Niño–Southern Oscillation. Both IFS models project a small change in TC frequency at the end of the twenty-first century related to distinct shifts in genesis locations. In the 16-km IFS, this shift is southward and is likely driven by the southeastward penetration of the monsoon trough/subtropical high circulation system and the southward shift in activity of the synoptic-scale tropical disturbances in response to the strengthening of deep convective activity over the central equatorial Pacific in a future climate. The 16-km IFS also projects about a 50% increase in the power dissipation index, mainly due to significant increases in the frequency of the more intense storms, which is comparable to the natural variability in the model. Based on composite analysis of large samples of supertyphoons, both the development rate and the peak intensities of these storms increase in a future climate, which is consistent with their tendency to develop more to the south, within an environment that is thermodynamically more favorable for faster development and higher intensities. Coherent changes in the vertical structure of supertyphoon composites show system-scale amplification of the primary and secondary circulations with signs of contraction, a deeper warm core, and an upward shift in the outflow layer and the frequency of the most intense updrafts. 
Considering the large differences in the projections of TC intensity change between the 16-km and 125-km IFS, this study further emphasizes the need for high-resolution modeling in assessing potential changes in TC activity.
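The power dissipation index cited above is conventionally defined (following Emanuel) as the cube of the maximum surface wind speed integrated over each storm's lifetime and summed over all storms. A minimal sketch, assuming 6-hourly track data as in the abstract:

```python
import numpy as np

def power_dissipation_index(tracks, dt_hours=6.0):
    """Power dissipation index (PDI): the maximum surface wind
    speed cubed, integrated over each storm's lifetime and summed
    over all storms.

    `tracks` is a list of 1-D arrays of vmax (m/s), one array per
    storm, sampled every `dt_hours` hours -- a simplified stand-in
    for real best-track or model-tracker output.
    """
    dt_seconds = dt_hours * 3600.0
    return sum(float(np.sum(np.asarray(v) ** 3) * dt_seconds)
               for v in tracks)
```

Because vmax enters cubed, a ~50% PDI increase does not require 50% stronger storms: with frequency and duration fixed, a roughly 14% increase in intensity already suffices (1.14³ ≈ 1.5), which is why the index is so sensitive to shifts in the frequency of the most intense storms.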
Abstract
The future state of the global water cycle and prediction of freshwater availability for humans around the world remain among the challenges of climate research and are relevant to several United Nations Sustainable Development Goals. The Global Precipitation EXperiment (GPEX) takes on the challenge of improving the prediction of precipitation quantity, phase, timing, and intensity, characteristics that are products of a complex integrated system. It will achieve this by leveraging existing World Climate Research Programme (WCRP) activities and community capabilities in satellite, surface-based, and airborne observations, modeling and experimental research, and by conducting new and focused activities. It was launched in October 2023 as a WCRP Lighthouse Activity. Here we present an overview of the GPEX Science Plan that articulates the primary science questions related to precipitation measurements, process understanding, model performance and improvements, and plans for capacity development. The central phase of GPEX is the WCRP Years of Precipitation, a 2-3-yr period of coordinated global field campaigns focusing on different storm types (atmospheric rivers, mesoscale convective systems, monsoons, and tropical cyclones, among others) over different regions and seasons. Activities are planned over the three phases (before, during, and after the Years of Precipitation) spanning a decade. These include gridded data evaluation and development, advanced modeling, enhanced understanding of processes critical to precipitation, prediction of precipitation events across scales, and capacity development. These activities will be further developed as part of the GPEX Implementation Plan.
Test beds have emerged as a critical mechanism linking weather research with forecasting operations. The U.S. Weather Research Program (USWRP) was formed in the 1990s to help identify key gaps in research related to major weather prediction problems and the role of observations and numerical models. This planning effort ultimately revealed the need for greater capacity and new approaches to improve the connectivity between the research and forecasting enterprise.
Out of this developed the seeds for what is now termed “test beds.” While many individual projects, and even more broadly the NOAA/National Weather Service (NWS) Modernization, were successful in advancing weather prediction services, it was recognized that specific forecast problems warranted a more focused and elevated level of effort. The USWRP helped develop these concepts with science teams and provided seed funding for several of the test beds described.
Based on the varying NOAA mission requirements for forecasting, differences in the organizational structure and methods used to provide those services, and differences in the state of the science related to those forecast challenges, test beds have taken on differing characteristics, strategies, and priorities. Current test bed efforts described have all emerged between 2000 and 2011 and focus on hurricanes (Joint Hurricane Testbed), precipitation (Hydrometeorology Testbed), satellite data assimilation (Joint Center for Satellite Data Assimilation), severe weather (Hazardous Weather Testbed), satellite data support for severe weather prediction (Short-Term Prediction Research and Transition Center), mesoscale modeling (Developmental Testbed Center), climate forecast products (Climate Testbed), testing and evaluation of satellite capabilities [Geostationary Operational Environmental Satellite-R Series (GOES-R) Proving Ground], aviation applications (Aviation Weather Testbed), and observing system experiments (OSSE Testbed).
Abstract
As the second-largest shifting sand desert worldwide, the Taklimakan Desert (TD) represents the typical aeolian landforms of arid regions and is an important source of global dust aerosols. It directly affects the ecological environment and human health across East Asia. Thus, establishing a comprehensive environment and climate observation network for field research in the TD region is essential to improve our understanding of the desert meteorology and environment, assess its impact, mitigate potential environmental issues, and promote sustainable development. With a nearly 20-yr effort under the extremely harsh conditions of the TD, the Desert Environment and Climate Observation Network (DECON) has been established, completely covering the TD region. The core of DECON is the Tazhong station in the hinterland of the TD. The network also includes 4 satellite stations located along the edge of the TD for synergistic observations, and 18 automatic weather stations interspersed between them. Thus, DECON marks a new chapter of environmental and meteorological observation capabilities over the TD, including dust storms, dust emission and transport mechanisms, desert land–atmosphere interactions, desert boundary layer structure, ground calibration for remote sensing monitoring, and desert carbon sinks. In addition, DECON promotes cooperation and communication within the research community in the field of desert environments and climate, fostering a better understanding of the status and role of desert ecosystems. Finally, DECON is expected to provide the basic support necessary for coordinated environmental and meteorological monitoring and mitigation, joint construction of ecologically friendly communities, and sustainable development of central Asia.
The recent U.S. National Academies report, Assessment of Intraseasonal to Interannual Climate Prediction and Predictability, unequivocally recommended the development of a North American Multimodel Ensemble (NMME) operational predictive capability. Indeed, such an effort is required to meet the specific tailored regional prediction and decision support needs of a large community of climate information users.
The multimodel ensemble approach has proven extremely effective at quantifying the prediction uncertainty that arises from uncertainty in model formulation, and it produces better prediction quality (on average) than any single-model ensemble. This multimodel approach is the basis for several international collaborative prediction research efforts and an operational European system, and there are numerous examples of how it yields superior forecasts compared to any single model.
Based on two NOAA Climate Test Bed (CTB) NMME workshops (18 February and 8 April 2011), a collaborative and coordinated implementation strategy for an NMME prediction system has been developed and is currently delivering real-time seasonal-to-interannual predictions on the NOAA Climate Prediction Center (CPC) operational schedule. The hindcast and real-time prediction data are readily available (e.g., http://iridl.ldeo.columbia.edu/SOURCES/.Models/.NMME/) and in graphical format from CPC (www.cpc.ncep.noaa.gov/products/NMME/). Moreover, the NMME forecast is already being used as guidance for operational forecasters. This paper describes the new NMME effort, and presents an overview of the multimodel forecast quality and the complementary skill associated with individual models.
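The core of the multimodel approach can be sketched in a few lines. The version below uses equal weighting of per-model ensemble means (a minimal illustration; the operational NMME combination, bias correction, and calibration are more involved, and the model names are hypothetical):

```python
import numpy as np

def multimodel_forecast(model_ensembles):
    """Equal-weight multimodel combination: average each model's
    ensemble members into a model mean, then average the model
    means, so that models contributing many members do not
    dominate the combined forecast.

    `model_ensembles` maps a model name to a 2-D array of shape
    (members, gridpoints).
    """
    model_means = [np.mean(members, axis=0)
                   for members in model_ensembles.values()]
    return np.mean(model_means, axis=0)
```

Averaging model means rather than pooling all members is a deliberate choice: it treats model-formulation uncertainty (the quantity the abstract highlights) on an equal footing across models, independent of each center's ensemble size.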
The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission will provide a calibration laboratory in orbit for the purpose of accurately measuring and attributing climate change. CLARREO measurements establish new climate change benchmarks with high absolute radiometric accuracy and high statistical confidence across a wide range of essential climate variables. CLARREO's inherently high absolute accuracy will be verified and traceable on orbit to Système Internationale (SI) units. The benchmarks established by CLARREO will be critical for assessing changes in the Earth system and climate model predictive capabilities for decades into the future as society works to meet the challenge of optimizing strategies for mitigating and adapting to climate change. The CLARREO benchmarks are derived from measurements of the Earth's thermal infrared spectrum (5–50 μm), the spectrum of solar radiation reflected by the Earth and its atmosphere (320–2300 nm), and radio occultation refractivity from which accurate temperature profiles are derived. The mission has the ability to provide new spectral fingerprints of climate change, as well as to provide the first orbiting radiometer with accuracy sufficient to serve as the reference transfer standard for other space sensors, in essence serving as a “NIST [National Institute of Standards and Technology] in orbit.” CLARREO will greatly improve the accuracy and relevance of a wide range of space-borne instruments for decadal climate change. Finally, CLARREO has developed new metrics and methods for determining the accuracy requirements of climate observations for a wide range of climate variables and uncertainty sources. These methods should be useful for improving our understanding of observing requirements for most climate change observations.
The importance of using dedicated high-end computing resources to enable high spatial resolution in global climate models and advance knowledge of the climate system has been evaluated in an international collaboration called Project Athena. Inspired by the World Modeling Summit of 2008 and made possible by the availability of dedicated high-end computing resources provided by the National Science Foundation from October 2009 through March 2010, Project Athena demonstrated the sensitivity of climate simulations to spatial resolution and to the representation of subgrid-scale processes with horizontal resolutions up to 10 times higher than those of contemporary climate models. While many aspects of the mean climate were found to be reassuringly similar beyond a suggested minimum resolution, the magnitudes and structure of regional effects can differ substantially. Project Athena served as a pilot project to demonstrate that an effective international collaboration can be formed to efficiently exploit dedicated supercomputing resources. The outcomes to date suggest that, in addition to substantial and dedicated computing resources, future climate modeling and prediction require a substantial research effort to efficiently explore the fidelity of climate models when explicitly resolving important atmospheric and oceanic processes.