A method is described to calibrate a satellite microwave radiometer operating near 18–37 GHz on decadal time scales for the purposes of climate studies. The method uses stable on-earth brightness temperature references over the full dynamic range of on-earth brightness temperatures to stabilize the radiometer calibration and is applied to the Ocean Topography Experiment (TOPEX) Microwave Radiometer (TMR). These references are a vicarious cold reference, which is a statistical lower bound on ocean surface brightness temperature, and heavily vegetated, pseudoblackbody regions in the Amazon rain forest. The sensitivity of the on-earth references to climate variability is assessed. No significant climate sensitivity is found in the cold reference, because it does not correspond to a climatic extreme (e.g., the coldest sea surface temperature or the driest atmosphere) but instead arises from a minimum in the sea surface radio brightness that occurs in the middle of the climatic distribution of sea surface temperatures (SSTs). The hot reference is observed to have a small climate dependency, which is most evident during the 1997/98 El Niño event. A time-dependent model for the hot reference region is constructed using meteorological fields from the National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis product. This model is shown to accurately account for the small climate variations in this reference. In addition to the long-term stabilization of the brightness temperatures, an improvement to the TMR antenna pattern correction is described that removes residual geographically correlated errors, in particular errors correlated with distance to land or sea ice. The recalibrated TMR climate data record is cross-validated with the climate data record produced from the Special Sensor Microwave Imager (SSM/I).
It is shown that the intersensor drift is small, providing realistic error bars for the climate trends generated from the instrument pair, as well as validating both the methodology described in this paper and the SSM/I climate data record.
As the satellite data record of earth observations grows, the possibility of observing the global climate system and its variability becomes feasible. Among other things, this requires precise instrument calibration, free of long-term systematic errors that would otherwise be falsely interpreted as climatic signals. Microwave radiometers, in particular, which have been flying since the late 1970s, provide several geophysical variables that are key components of the climate system, including but not limited to precipitable water vapor, integrated cloud liquid water, wind speed, precipitation, and sea ice extent. Methods for stabilizing the long-term calibration of microwave radiometers are steadily progressing (Mears et al. 2003; Thorne et al. 2005; Brown et al. 2007; Hilburn and Wentz 2008), but there remains the question of how these records can be reliably validated. Here, we demonstrate a method to stabilize the long-term brightness temperature calibration of a microwave radiometer operating near 18–37 GHz. This method is fundamentally different from those used to produce other passive microwave climate records, such as the long Special Sensor Microwave Imager (SSM/I) data record. The methodology is applied to the extended time series from the Ocean Topography Experiment (TOPEX) Microwave Radiometer (TMR). The resulting climate data record is intervalidated with that produced independently from SSM/I.
The TMR was included on the TOPEX/Poseidon oceanography satellite to measure the wet tropospheric path delay experienced by the radar altimeter signal. This satellite was decommissioned in January 2006 but has generated a 13-yr time series of precipitable water vapor (PWV), cloud liquid water (CLW), and wind speed (WS) measurements over the ocean. This extraordinarily long time series of global measurements from a single instrument is unprecedented and is an ideal opportunity to derive a climate data record that is complementary to existing products and will increase our confidence in the record as a whole. With this in mind, an effort was undertaken to minimize systematic calibration errors in the 13-yr time series. The TMR brightness temperatures are stabilized over this 13-yr period by calibrating to stable on-earth brightness temperature references at the hottest and coldest on-earth brightness temperatures. It is of critical importance that any climate signals in the references be characterized and removed. A description of the steps required to remove small climatic signals in the on-earth references and the recalibration methodology are presented. Additionally, small geographically correlated errors were minimized by improving the method by which antenna sidelobe contamination in the TMR antenna temperatures is removed. The recalibration methodology described is not limited to the TMR channels or viewing geometry and can be applied to other microwave radiometers operating near the range of 18–37 GHz.
The TMR orbit, as well as its longevity, makes it ideal for cross-validating climate records produced using other satellite radiometers. The TMR flew in a non-sun-synchronous orbit inclined at 66°, which therefore crosses the orbits of many radiometers on polar-orbiting sun-synchronous satellites. The calibrated TMR data record is used to cross-validate the SSM/I climate record that is produced by interleaving the observations from five SSM/I sensors (Wentz 1997). This intervalidation of two independently calibrated instruments then provides a realistic estimate of the uncertainty in the derived climate trends from the instrument pair.
2. TMR instrument description
The TMR is nadir pointing and measures radiometric brightness temperatures (TB) at 18.0, 21.0, and 37.0 GHz. It has a single, multifrequency feed horn that illuminates a 60-cm offset parabolic reflector. The main-beam efficiency of the TMR antenna is 93%–95%, and the 3-dB footprint diameters are 26–46 km, with both ranges spanning the three channels. The TMR antenna temperatures are operationally calibrated using a two-point calibration method by internally switching to an ambient reference load (∼300 K) and an external horn that views cold space (Ruf et al. 1995). The calibrated antenna temperatures are converted to main-beam brightness temperatures by employing an antenna pattern correction algorithm that removes the contributions from the on-earth and off-earth sidelobes (Janssen et al. 1995). The main-beam brightness temperatures are then input to a geophysical retrieval algorithm to produce precipitable water vapor, cloud liquid water, and wind speed estimates (Keihm et al. 1995).
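The two-point calibration step can be sketched as follows. This is a minimal illustration of the linear gain-and-offset form only; the operational TMR algorithm additionally applies front-end loss and nonlinearity corrections (Ruf et al. 1995), and the function and variable names here are hypothetical.

```python
def two_point_calibration(counts, counts_cold, counts_hot, t_cold, t_hot):
    """Map raw detector counts to antenna temperature (K) using the
    cold-sky (~2.7 K effective) and ambient-load (~300 K) references.

    Illustrative linear sketch; not the full operational algorithm.
    """
    gain = (t_hot - t_cold) / (counts_hot - counts_cold)  # K per count
    return t_cold + gain * (counts - counts_cold)
```

By construction, the calibration reproduces the reference temperatures exactly when the radiometer views the cold-sky horn or the ambient load.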
Unlike many scanning radiometers that use external calibration sources to calibrate the system, including the feed horn, the internal calibration references of the TMR require a correction for the component losses and reflections between the antenna and internal calibration plane. This so-called front-end path correction, along with other corrections such as a nonlinearity correction, was derived during the TMR prelaunch calibration phase (Ruf et al. 1995). This calibration was fine-tuned just after launch to correct for a gross gain and offset error (Ruf et al. 1994). Any subsequent on-orbit changes in the losses or reflections of the components between the antenna and the internal calibration plane will introduce changes in the calibration that cannot be tracked by the internal references and must be detected and corrected using external brightness temperature references.
3. Selection of on-earth brightness temperature references
The decadal-scale stability of the radiometer will be ensured by forcing the radiometer measurements to agree with stable on-earth brightness temperature references. This requires that either the selected references are insensitive to climate variations or that any climate signals present in the references are removed. Two such references that bound the on-earth dynamic range of brightness temperature measured by the TMR are a so-called vicarious cold reference (Ruf 2000) and pseudoblackbody regions of the Amazon rain forest (Brown and Ruf 2005). These references have been successfully demonstrated for the on-orbit calibration of several microwave radiometers (Ruf 2002; Brown et al. 2004; Ruf et al. 2006; Brown et al. 2007; Mo 2007), and a rigorous comparison of several on-earth calibration references identified these as the most reliable for the long-term on-orbit monitoring and cross-calibration of the European Remote Sensing Satellite-2 (ERS-2) radiometer and the TMR (Eymard et al. 2005). However, these previous studies have not fully considered the stability of the references on decadal time scales, as discussed below.
a. Long-term stability of the vicarious cold reference
The cold reference values are computed by applying the vicarious cold reference (VCR) algorithm to a large database of modeled brightness temperatures at the TMR frequencies calculated using globally distributed open-ocean radiosonde observations. The algorithm involves the generation of cubic polynomial fits to the cumulative distribution function of the coldest brightness temperatures for each channel, then extrapolation to the zero population values of brightness temperature. The resultant cold reference values are 124.4, 131.8, and 154.3 K at 18, 21, and 37 GHz, respectively, with an absolute uncertainty of 1 K, defined by the uncertainty in the dielectric constant of seawater. Details about the radiative transfer model used are found in Brown et al. (2004). The same radiative transfer model was used to derive both the modeled cold reference values and the coefficients used in the TMR statistical geophysical retrieval algorithms, ensuring consistency between the two and reducing the impact of the absolute uncertainty in the modeled cold reference value on the geophysical retrievals.
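The extrapolation step of the VCR algorithm can be sketched as follows. This is a minimal illustration, assuming a simple cubic fit of TB against cumulative population over the coldest tail; the tail size and the fitting details of Ruf (2000) may differ from the choices shown here.

```python
import numpy as np

def vicarious_cold_reference(tb, n_coldest=2000):
    """Estimate the vicarious cold reference from a set of brightness
    temperatures (K).

    Sketch of the approach in Ruf (2000): fit a cubic polynomial to the
    cumulative distribution of the coldest TBs and extrapolate to the
    zero-population value. The tail size n_coldest is illustrative.
    """
    cold = np.sort(np.asarray(tb, dtype=float))[:n_coldest]  # coldest tail, ascending
    population = np.arange(1, cold.size + 1)                 # cumulative population count
    coeffs = np.polyfit(population, cold, 3)                 # TB as a cubic in population
    return np.polyval(coeffs, 0.0)                           # extrapolate to zero population
```

Because the estimate is an extrapolated lower bound rather than a minimum of the sample, it is far less sensitive to outliers and sampling noise than the single coldest measurement.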
The stability of the VCR is related to the probability of occurrence of the coldest brightness temperatures. The probability of occurrence of the coldest brightness temperatures is in turn defined by the probability of occurrence of the geophysical states that produce the coldest top-of-atmosphere (TOA) TBs. Between 18 and 37 GHz, the minimum surface emission on the earth occurs for a flat ocean surface at a frequency-dependent optimal SST at which the product of the sea surface emissivity and sea surface temperature is a minimum. This optimum SST is 15.1°C at 18.0 GHz, 17.9°C at 21.0 GHz, and 26.4°C at 37.0 GHz at nadir incidence angle (Ruf 2000). The surface TB is then propagated through the atmosphere, where oxygen, water vapor, and liquid water drops absorb and re-emit the radiation, which increases the TOA TB. For a globally distributed TB dataset under clear-sky and calm wind conditions, water vapor and SST are the dominant geophysical variables driving the probability of the coldest TBs observed in the TMR frequency range. Note that the large size of the TMR footprints (∼40 km) prevents scattering by large storms from being a significant contributor to the coldest TOA brightness temperatures. Because water vapor and SST are inherently coupled through the Clausius–Clapeyron relationship, the coldest TOA brightness temperatures will occur over a range of SST and column PWV combinations near the optimal SST. In general, as the surface temperature decreases, the precipitable water vapor decreases. So as we move away from the SST of minimum surface emission toward colder SSTs, the probability of drier atmospheric conditions increases, creating a balance between the increasing surface contribution and the decreasing atmospheric contribution. This effect broadens the probability distribution function of the coldest TOA TBs in geophysical state space.
To examine this further, a large database of modeled brightness temperatures (TBs) at the TMR frequencies is generated using four-times-daily meteorological fields for 1998 from the National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis product (Kalnay et al. 1996). Figure 1 shows the joint probability distribution of SST and precipitable water vapor computed from the NCEP database. It is observed that dry atmospheric conditions (PWV < 10 mm) occur over a wide range of SST values.
Statistical box plots generated from the modeled TB database show the statistical distribution of the 18.0-, 21.0-, and 37.0-GHz TOA TB with respect to SST and PWV (Fig. 2). Each box plot shows the bounds of the upper and lower quartile of the data, represented by the box, and the extent of the data (1.5 times the interquartile distance), represented by the whiskers on either side of the box. The black plus signs represent points greater than 1.5 times the interquartile distance. The arrows in the bottom three plots indicate approximately where the minimum surface emission occurs. The coldest brightness temperatures at each frequency are indeed observed over a broad range of SST and PWV combinations and do not necessarily occur at a climatic minimum (e.g., the lowest SST or the driest atmosphere). This characteristic reduces the sensitivity of the VCR to climate perturbations. To bias the cold reference on long time scales, the probability of every combination of SST and PWV that leads to the coldest TOA TBs would have to simultaneously decrease below some threshold. This is an unlikely scenario for global climate change. This assertion is supported by the TMR data record, as will be discussed in a subsequent section.
b. Long-term stability of the hot brightness temperature reference
Heavily vegetated regions in the Amazon rain forest are used to calibrate the TMR TBs at the hot end of the spectrum. A model for the brightness temperature of these regions as a function of frequency, incidence angle, time of day, and time of year was previously developed and parameterized using SSM/I data (Brown and Ruf 2005). This model for the Amazon rain forest is static and does not account for year-to-year variations. With the exception of a significant climate perturbation, such as an El Niño event, the year-to-year variations are assumed to be small but cannot be neglected when calibrating the radiometer for climatology. The variations in the hot reference are mainly driven by changes in the temperature of the forest canopy and to some extent by the atmospheric water vapor concentration. Another potential source of change is large-scale deforestation, but this is not considered to be significant here because of the large TMR footprints and averaging areas used. However, this may not be true for future sensors.
To quantify the variations of the Amazon hot reference over time due to variations in surface temperature and water vapor concentration, the NCEP–NCAR reanalysis fields are used as inputs to a radiative transfer model to compute the hot reference as a function of time (Kalnay et al. 1996). The NCEP reanalysis fields were generated using a static advanced data assimilation system with stringent quality control on the input data sources to eliminate jumps or other spurious artifacts in the data over time. These fields contain the surface and atmospheric state variables needed to model the top-of-atmosphere brightness temperature and are available four times daily. The modeled values for the hot reference were computed using the same atmospheric absorption model as the cold reference and the values of the surface emissivity for each channel determined from SSM/I Amazon data. A second-order Fourier harmonic fit was used to interpolate the computed brightness temperatures to once per hour. The resultant model hot reference values averaged over time of day and time of year are 286, 286, and 283 K at 18, 21, and 37 GHz, respectively, with an absolute uncertainty of 1 K (Brown and Ruf 2005).
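The harmonic interpolation of the four-times-daily modeled TBs to hourly values can be sketched as a least-squares fit of a second-order Fourier series in local solar time. The function and variable names below are illustrative, not taken from the actual processing code.

```python
import numpy as np

def diurnal_fourier_fit(t_hours, tb, order=2):
    """Fit an order-N Fourier series with a 24-h fundamental period to
    (time, TB) samples and return a function evaluating the fit.

    Sketch of the interpolation of four-times-daily modeled hot
    reference TBs to hourly values.
    """
    w = 2.0 * np.pi / 24.0  # fundamental angular frequency (rad/h)

    def design(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        cols = [np.ones_like(t)]
        for k in range(1, order + 1):
            cols += [np.cos(k * w * t), np.sin(k * w * t)]
        return np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(design(t_hours), np.asarray(tb, dtype=float),
                               rcond=None)
    return lambda t: design(t) @ coef
```

A second-order series (five coefficients) captures both the fundamental diurnal cycle and its first harmonic, which is the dominant shape of the Amazon diurnal TB variation described above.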
Figure 3 shows the observed TMR 18.0-GHz brightness temperature averaged for the Amazon region over a 10-day repeat cycle from the original TMR Geophysical Data Record (GDR) (Fig. 3, left) along with the model values computed from the NCEP fields (Fig. 3, right). The model values for each 10-day cycle are selected based on the TMR local overpass times, then averaged. The black lines are 1-yr running averages of the measured and modeled hot reference values, respectively. The strong 1997/98 El Niño introduces an approximately 1-K increase in the measured TMR 18-GHz TB, which is reflected in both the measured and modeled reference values. If not accounted for, this climate signal would be spuriously introduced into the brightness temperatures and, hence, the geophysical retrieval time series. The scatter between the cycle averages is caused by the significant diurnal variation of the Amazon brightness temperature (∼6 K), since the scene is sampled at different local solar times. The difference in magnitude between the model and the TMR brightness temperatures (roughly 4 K) is due to a gain error that was present in the original TMR data record. The origin of this gain error was traced to the original on-orbit calibration of the TMR (Ruf et al. 1994). This will be discussed in more detail in subsequent sections.
4. Observed TMR calibration anomalies
The TMR is operationally calibrated by internally switching to a cold sky horn and an internal ambient reference load. A ferrite switch assembly connects the radiometer receiver to the calibration sources at 14-s intervals. Because the electrical paths differ between the multifrequency feed horn (MFFH), cold sky horn, and ambient reference, any long-term variations in the losses or reflections of the radiometer components in front of the internal calibration plane will introduce changes into the antenna temperature calibration, requiring corrections using external reference sources. Furthermore, any small residual errors in the loss terms used in the correction for the front-end path loss will introduce a calibration error that is a function of the instrument temperature.
a. Characterization of radiometer drift
The presence of a drift in the TMR became evident midway through the TOPEX mission. It was first suggested from an observed ∼2 mm yr−1 drift in the comparison of sea level measurements between TOPEX and the global tide gauge network (Mitchum 1998). The latitudinal dependence of the relative sea level drift suggested that a drift in the TMR wet-delay correction be considered as an explanation. Following this, it was recognized that measurements of the lowest radiometer ocean brightness temperatures, under clear, calm, and dry conditions, provided an extremely stable, cycle-to-cycle calibration reference, which led to the development of the VCR algorithm (Ruf 2000). This algorithm was used to characterize the drift in coldest TMR brightness temperatures (Ruf 2002). As it turned out, the global average 2 mm yr−1 drift observed in the tide gauge sea level comparisons was equivalent in magnitude and direction to the wet-delay drift predicted by a drift in the 18-GHz TB derived from the VCR algorithm (Keihm et al. 2000). The drift in the TMR is revisited here for the complete 13-yr time series using the VCR and the Amazon hot reference derived from the NCEP inputs.
Figure 4 shows the 13-yr (481 10-day exact repeat orbit cycles) history of the original TMR-measured VCR and Amazon hot TBs for the three channels, plotted as variations from the modeled values. The largest drift, averaging ∼0.27 K yr−1 during the first 7 yr, is observed in the 18-GHz channel only at the cold end of the TB spectrum. This is consistent with the explanation for this drift as a slow change in the cold sky horn switch isolation over time (Ruf 2002), which would introduce a gain drift that would be maximum for cold TBs and minimum for hot TBs near the physical temperature of the internal reference load (∼285 K). Small drifts of approximately 0.5 K over the 13-yr time series are also observed in the 21.0- and 37.0-GHz channels at the cold end. The 18.0- and 21.0-GHz channels exhibit little or no drift at the hot end of the spectrum, but the 37.0-GHz channel is observed to drift by approximately 0.5 K at the hot end over the mission. The apparent gain and offset errors observed in the TMR, evident as offsets from the modeled hot and cold reference values, are discussed in the following subsection.
No significant perturbations in the measured-minus-modeled hot and cold reference are observed during the strong 1997/98 El Niño event, further reinforcing the assertion that the cold reference is insensitive to climate perturbations and that the NCEP-based model for the Amazon rain forest TB accurately accounts for the real year-to-year variations in the hot reference. The 1997/98 El Niño serves as an excellent metric to assess the sensitivity of any on-earth TB reference to climate change, since it was such a large perturbation on the global climate state over a relatively short period of time. The stability of the cold reference over time, as opposed to the hot reference, is much more critical for calibrating over-ocean data records, since a majority of the TBs over the ocean occur near the cold reference value. Statistically, 50% of the open-ocean TBs at 18 and 37 GHz fall within 15 K of the cold reference values.
b. Gain and offset errors
Referencing Fig. 4, the offsets between the TMR-measured and modeled cold reference are generally small, peaking at −1.5 K for the 37-GHz channel at the beginning of the mission. The offsets at the hot end are much larger, up to 4.5 K for the 21.0-GHz channel, suggesting both significant gain and offset errors in the TMR brightness temperature calibration relative to the modeled reference values. During the initial on-orbit calibration of the TMR, a ∼10% gain error was observed in comparisons between the TMR integrated vapor and collocated radiosonde observations (Ruf et al. 1994). Based on the uncertainties in the water vapor absorption model and the techniques used to validate the TB calibration, it was deemed equally probable that the observed gain error was caused by a TB calibration error or by an error in the absorption model. Therefore, half of the gain error was allocated to the brightness temperature calibration, and the remainder was allocated to the water vapor absorption model used to generate the retrieval algorithm coefficients. The comparisons with the improved Amazon model imply that the entire ∼10% gain error should have been allocated to the brightness temperature calibration.
c. Errors correlated with instrument temperature
Because the TMR used internal calibration references, the electrical path for the calibration measurements was different from that for the antenna measurements. This required that front-end component losses and reflections between the antenna and the internal calibration plane be accounted for in the antenna temperature calibration algorithm. The internal calibration plane is defined as the point in the radiometer receiver chain after which the remaining components are common to the calibration and antenna measurements. For TMR, this plane is just after the internal calibration switch and before the isolator. For an externally calibrated conically scanning instrument like SSM/I, this plane would be between the feed horn and the main reflector. A detailed description of the TMR antenna temperature calibration algorithm is given in Ruf et al. (1995). It is shown that the model for the radiometer front end can be parameterized as a linear combination of the physical temperatures of the front-end components. The front-end path loss coefficients were derived during prelaunch calibration and testing. An error in this correction would create an instrument-temperature-dependent bias in the TMR calibration.
It was first discovered that there was such a bias in the TMR during the tandem mission just after the launch of the Jason-1 satellite in 2002. The TOPEX and Jason-1 satellites flew in formation with only a 70-s displacement between them, allowing for an unprecedented level of intercalibration between the TMR and the Jason Microwave Radiometer (JMR). It was quickly discovered that there was a time-variable path-delay bias between the JMR and TMR that was correlated with the TMR instrument temperature. This cyclical bias totaled about 5 mm peak-to-peak as the satellite transitioned through different yaw steering modes, producing 15°C physical temperature variations in the instrument that repeated on 60-day intervals (Zlotnicki and Callahan 2002). This path-delay bias was traced to small instrument-temperature-dependent biases in the antenna temperature calibration (Brown et al. 2002). At that time, an ad hoc empirical correction was applied to the TMR data to remove the bias correlated with the attitude of the satellite. Knowledge of the instrument-temperature dependency at the hot and cold reference points permits the tuning of the front-end path loss coefficients to mitigate the error.
The instrument-temperature-dependent biases in the TMR calibration are observed by sampling the on-earth hot and cold references as a function of the physical temperature of the front-end components. Figure 5 shows the TMR 37.0-GHz measured cold reference and TMR-measured–minus–modeled hot reference as a function of the physical temperature of the internal load. Each point was found by binning the 13-yr data record as a function of the internal load temperature in 0.5-K increments and computing the average hot and cold reference values for each bin. A slope is clearly present in both, with a slightly different magnitude at the cold and hot ends. This process was repeated with 1-yr subsets of the data, and no significant change in the slope was observed over time. Both the gain and offset calibration equations in the TMR TA algorithm contain temperature-dependent terms. The residual dependency of the TB on the internal load temperature (dTB/dTLoad) of the channels for both hot and cold TBs is shown in Table 1.
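The binning procedure that exposes the instrument-temperature dependency can be sketched as follows; names, bin width handling, and the slope estimate are illustrative.

```python
import numpy as np

def binned_means(t_load, tb, width=0.5):
    """Average TB in internal-load-temperature bins of the given width (K).

    Sketch of the characterization of dTB/dTLoad: bin the record by the
    physical temperature of the internal load, average each occupied bin,
    then fit a line to the bin means.
    """
    t_load = np.asarray(t_load, dtype=float)
    tb = np.asarray(tb, dtype=float)
    edges = np.arange(t_load.min(), t_load.max() + width, width)
    idx = np.digitize(t_load, edges) - 1          # bin index for each sample
    centers, means = [], []
    for i in range(edges.size - 1):
        sel = idx == i
        if sel.any():                             # keep only occupied bins
            centers.append(edges[i] + 0.5 * width)
            means.append(tb[sel].mean())
    return np.asarray(centers), np.asarray(means)
```

The residual sensitivity dTB/dTLoad is then the slope of a straight-line fit to the bin means, e.g. `np.polyfit(centers, means, 1)[0]`.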
d. Antenna pattern correction
The TMR had a 60-cm offset parabolic reflector. The main-beam efficiency of the antenna was between 93% and 95% across the channels, necessitating a correction for the on-earth and off-earth sidelobe contributions. The TMR main-beam brightness temperature is determined through an antenna pattern deconvolution of the measured antenna temperature. In practice, the antenna pattern correction (APC) algorithm is simplified such that the antenna temperature is written as a linear combination of the brightness temperature in the main beam, the on-earth sidelobes, and the off-earth sidelobes, which view cold space (Janssen et al. 1995). The algorithm requires both the fraction of power received in and the effective brightness temperature incident on an annular ring of the antenna pattern. The fractional power received in the on-earth sidelobe region (10°–55° off boresight) and the off-earth sidelobe region (>55° from boresight) is determined from the antenna patterns measured prelaunch. For the off-earth sidelobes, the incident brightness temperature is simply the cosmic microwave background. The brightness temperature contribution for the on-earth sidelobes is more complicated and a function of the geophysical state of the surrounding scene. In the original APC algorithm, on-earth sidelobe contributions were based on island radiosonde data and on models with a simple latitude dependence (Janssen et al. 1995). This approach did not account for the complex geographic and temporal variations of water vapor, or for land and sea ice contamination of the on-earth sidelobes. The treatment of the on-earth sidelobe brightness temperature distribution was improved based on a concept suggested by Obligis et al. (2007). The concept is to generate climatological maps of the on-earth sidelobe TB distribution, referred to as TE, using the actual radiometer measurements over a period of several years. The approach used here is slightly different from that of Obligis et al. (2007), in that we explicitly account for the variation in TB with incidence angle in the far sidelobes, as opposed to simply averaging the nadir measurements in the on-earth sidelobe region.
The procedure to generate the climatological TE maps is summarized as follows. First, nadir maps of TMR TB are generated for each channel in a 0.4° × 0.4° grid every 2 months for the last 5 yr of the mission (2000–05). The polar regions, above TMR’s orbit, are assumed constant at 221 K. Six bimonthly averaged nadir TB maps for each frequency are created from the 5 yr of data. A quadratic function is used to map the measured nadir TB to the TB that would be observed at any other earth incidence angle, where the polynomial coefficients are a function of incidence angle and frequency. The coefficients in this algorithm are parameterized using a database of modeled brightness temperatures generated from a globally distributed open-ocean island radiosonde database. The sidelobe brightness temperature contribution is assumed to be unpolarized, and no incidence angle dependence is assumed for land and ice data (i.e., the nadir TB value is used). The next step is to use this function to generate the sidelobe brightness temperature distribution TB(θel, ϕ; Lat0, Lon0), which represents the brightness temperature that would be observed at an elevation angle θel and azimuth angle ϕ from the boresight of the antenna when it is centered over the earth coordinates (Lat0, Lon0). To estimate the integrated on-earth sidelobe brightness at a given location, this temperature distribution is convolved with the TMR antenna pattern. This procedure is used to generate TE values on a 2.4° × 2.4° grid for each of the six nadir TB maps. The 2.4° grid spacing is used because TE is observed to be a spatially smooth function, lending itself to accurate bilinear interpolation, and because this is close to the nominal spacing between TMR footprints near the equator, meaning there would be little additional information gained from reducing the grid spacing. In practice, the revised TMR APC algorithm determines the TE value by interpolating as a function of latitude, longitude, and day of the year.
Examples of the 21.0-GHz maps for days of year 1, 82, 182, and 272 are shown in Fig. 6. Including the interpolation over time accounts for the seasonal migration of water vapor and sea ice. This is evident as the increased TE in the Northern Hemisphere summer due to water vapor and the increased TE around Antarctica during the Southern Hemisphere winter–spring.
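In discrete form, the convolution of the scene brightness with the antenna pattern over the on-earth sidelobe region reduces to a gain-weighted average of the scene TB. The following is a minimal sketch with hypothetical names; the actual computation integrates the measured antenna pattern over the full sidelobe solid angle.

```python
import numpy as np

def integrated_sidelobe_tb(tb_scene, gain, solid_angle):
    """Gain-weighted average of scene brightness over sidelobe directions.

    tb_scene[i]    : TB incident from direction element i (K)
    gain[i]        : antenna gain toward direction element i
    solid_angle[i] : solid angle of direction element i (sr)
    """
    w = np.asarray(gain, dtype=float) * np.asarray(solid_angle, dtype=float)
    return float(np.sum(w * np.asarray(tb_scene, dtype=float)) / np.sum(w))
```

For a uniform scene the weighted average returns the scene TB unchanged, while a scene that is hotter in high-gain directions pulls TE toward the hot values, which is exactly the land-contamination behavior the climatological maps are built to capture.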
5. Recalibration of the TMR
It is the goal of the recalibration effort to minimize the calibration anomalies discussed in the previous section. The instrument-temperature dependency was found to be constant over time and was removed by adjusting the front-end path loss coefficients that operate on the TMR thermistor measurements in the antenna temperature algorithm. The gain and offset errors as a function of time were removed by fitting a third-order polynomial to the biases relative to the modeled hot and cold references, shown in Fig. 4, and linearly interpolating for brightness temperatures in between. By fitting a smooth function to the entire 13-yr TMR data record, the introduction of spurious short-time-scale geophysical signals, such as residual interannual variations in the references, is minimized. The instrument-temperature dependency was reduced to less than 0.1 K peak to peak at the cold end. The residual instrument-temperature dependency of the TMR brightness temperatures is shown in Table 2 for the measured hot and cold TBs.
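The time-dependent gain and offset removal can be sketched as follows: fit smooth third-order polynomials to the hot and cold reference bias time series, then interpolate the bias linearly in TB between the two reference points. The call signature and names are illustrative.

```python
import numpy as np

def build_correction(t_yr, cold_bias, hot_bias, tb_cold, tb_hot):
    """Return a function that corrects TB given time (yr) and measured TB.

    Sketch of the recalibration step: cubic polynomial fits in time to
    the measured-minus-modeled cold and hot reference biases, with
    linear interpolation in TB between the reference points.
    """
    p_cold = np.polyfit(t_yr, cold_bias, 3)
    p_hot = np.polyfit(t_yr, hot_bias, 3)

    def corrected_tb(t, tb):
        c = np.polyval(p_cold, t)                    # smoothed cold-end bias at t
        h = np.polyval(p_hot, t)                     # smoothed hot-end bias at t
        frac = (tb - tb_cold) / (tb_hot - tb_cold)   # position between references
        return tb - (c + frac * (h - c))             # subtract interpolated bias

    return corrected_tb
```

Because the bias is interpolated at the measured (biased) TB rather than the true TB, the correction at the references is exact only to first order in the bias; for biases of a few kelvin over a ~160-K span the residual is a few hundredths of a kelvin.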
The effect of the improved APC algorithm is assessed by applying the VCR algorithm to TMR data binned by the new 18.0-GHz sidelobe brightness temperature model. With the new algorithm, increasing TE is nearly equivalent to a decreasing distance from land. The TBs processed using the old and new APC algorithms for the last 3 yr of the TMR dataset are binned by the new 18-GHz TE in 1-K increments. The VCR is then computed for each bin for the old and new APC algorithms. The result for the old APC algorithm is shown in the left panel of Fig. 7, and the result with the new APC algorithm is shown in the right panel. In both cases, the measured cold reference value is differenced from the mean open-ocean value (defined as TE < 160 K) to highlight the difference approaching land. It is observed that the measured cold reference using the original APC algorithm increases as a function of TE with a slope of about 0.03, which is almost exactly the value of the on-earth sidelobe fraction divided by the main-beam efficiency. This is what would be expected from the derivative of TB with respect to TE and validates this approach for assessing the improvement provided by the new APC algorithm. The residual correlation of the measured cold reference value with TE is significantly reduced by using the new algorithm. It should be noted that this agreement was not forced and is only a by-product of implementing the climatological sidelobe brightness temperature maps in the TMR APC algorithm. It is an indication that the errors in the modeled on-earth sidelobe fraction and sidelobe brightness temperature are small.
6. Cross-validation of long-term stability with SSM/I
To validate the methodology described above for long-term microwave radiometer calibration, the resulting TMR climate data record is compared to a climate record of suitable length and quality. The SSM/I climate record is produced by interleaving the observations from five SSM/I sensors and extends from 1987 to the present (Wentz 1997). The TMR climate record is produced from a single instrument and extends from 1992 to 2005. The quality of both records hinges on the long-term stability of the instrument calibrations. The SSM/I record additionally depends on the quality of the intercalibration between the five sensors. The SSM/I climate records used in this analysis are the version 6 records from Remote Sensing Systems, which have been extensively calibrated and validated for long-term stability.
The SSM/I images the earth at an incidence angle of approximately 53° and has dual-polarized channels at 19.35, 37.0, and 85.0 GHz, and a vertically polarized channel at 22.235 GHz. The SSM/I sensors are in sun-synchronous polar orbits with different local equator-crossing times. The different orbits of TMR and SSM/I provide a significant number of globally distributed crossover points. The precipitable water vapor, integrated cloud liquid water, and wind speed products from each instrument are compared for the 1992–2005 time period in Figs. 8 and 9. SSM/I and TMR observations that occur within 25 km and 1 h of each other are collected and categorized by SSM/I sensor and by SSM/I morning and evening passes. To assess the relative global trends between the TMR and SSM/I, the difference between the matched pairs is averaged annually, and a global bias for the entire dataset is removed. The global average comparisons between the SSM/I and TMR for the 1992–2005 time period are shown in Fig. 8. The top panels show the difference trends for each SSM/I sensor individually, combining morning and evening passes. The second and third panels show the SSM/I–TMR trends for morning-only and evening-only passes, respectively. The agreement in the relative trends of precipitable water vapor, cloud liquid water, and wind speed between the two records is compelling. The intersensor drift in the water vapor is less than 0.1 mm decade−1, and almost no discernible trends are evident in the cloud liquid water and wind speed comparisons. Slight differences in the trends and relative biases between the morning and evening passes are observed. Overall, the intersensor biases between the SSM/I sensors are small, pointing to the quality of the SSM/I intercalibration.
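The relative-trend computation from matched pairs can be sketched as follows; the function name and the synthetic inputs used to exercise it are illustrative, not the actual matchup database.

```python
import numpy as np

def annual_difference_trend(times_yr, ssmi_vals, tmr_vals):
    """Average the SSM/I-minus-TMR matched-pair differences annually, after
    removing a single global bias over the whole dataset, leaving only the
    relative drift between the two sensors."""
    diff = ssmi_vals - tmr_vals
    diff = diff - diff.mean()                  # remove global bias
    years = np.floor(times_yr).astype(int)     # calendar-year binning
    uyears = np.unique(years)
    annual = np.array([diff[years == y].mean() for y in uyears])
    return uyears, annual
```

A linear fit to the annual means then gives the intersensor drift directly, in the units of the geophysical product per year.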
The TMR and SSM/I collocated pairs are distributed over the globe between ±66° latitude, allowing for the cross-validation of regional trends as well as global trends. Global maps of the average difference between the TMR and all available SSM/I data have been created on a 5° × 5° grid every 100 days from 1992 to 2005, yielding 48 maps for each geophysical variable. The linear trends over time at each grid point are computed and a 2 × 2 cosine taper smoothing function is applied to the resulting map. The regional intersatellite trends on the SSM/I–TMR water vapor, wind speed, and cloud liquid water retrievals are shown in Fig. 9. It is observed that the regional SSM/I–TMR water vapor difference trends are nearly uniform globally, at about −0.05 to 0.1 mm decade−1. An interesting regional feature is observed in the cloud liquid water (CLW) trends (Fig. 9, middle). High northern latitudes show a positive difference trend of about 0.005 mm decade−1, whereas the tropics and high southern latitudes show negative difference trends of about 0.007 and 0.003 mm decade−1, respectively. No potential calibration anomalies in the TMR TBs could be identified to explain this regional pattern. The regional wind speed difference trends vary from about +0.1 to −0.05 m s−1 decade−1, with no significant regional patterns observed.
7. Conclusions
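The regional trend maps can be sketched as below: a least-squares linear trend is fit at each grid cell of the stack of 100-day difference maps, and the result is smoothed. The exact form of the paper's 2 × 2 cosine taper is not specified here, so the raised-cosine kernel is an assumption; the trend estimator itself is ordinary least squares.

```python
import numpy as np

def gridpoint_trends(maps, times):
    """Least-squares linear trend at each cell of a (time, lat, lon) stack."""
    t = times - times.mean()
    anom = maps - maps.mean(axis=0)
    return np.tensordot(t, anom, axes=(0, 0)) / (t**2).sum()

def cosine_taper_smooth(field, half_width=2):
    """Smooth a 2D map with a normalized raised-cosine kernel spanning
    +/- half_width cells (an assumed form of the 2 x 2 cosine taper)."""
    k = np.arange(-half_width, half_width + 1)
    w1d = 0.5 * (1.0 + np.cos(np.pi * k / (half_width + 1)))
    kern = np.outer(w1d, w1d)
    kern /= kern.sum()
    pad = np.pad(field, half_width, mode="edge")
    out = np.empty_like(field)
    for i in range(field.shape[0]):
        for j in range(field.shape[1]):
            window = pad[i:i + 2 * half_width + 1, j:j + 2 * half_width + 1]
            out[i, j] = (window * kern).sum()
    return out
```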
A technique to calibrate satellite microwave radiometers operating near 18–37 GHz on decadal time scales using on-earth brightness temperature references is presented. The methodology is applied to the 13-yr record from the TOPEX Microwave Radiometer. A vicarious cold reference is used to stabilize the TMR brightness temperatures at the low end of the instrument’s on-earth dynamic range. The cold reference is found to be generally insensitive to climate variability since it is not tied to a climate minimum but to a range of geophysical states that occur near an optimal sea surface temperature in the middle of the range of SSTs observed on the earth. The hot reference is observed to exhibit some climate sensitivity, particularly during the 1997/98 El Niño event. NCEP–NCAR reanalysis meteorological fields are used along with a radiative transfer model to produce time-dependent modeled hot reference values for the 1992–2005 period. This model is shown to account for the natural variability observed in this reference. These references are used to remove a long-term drift in the TMR calibration, as well as a residual TB error correlated with the physical temperature of the radiometer front end.
The TMR antenna pattern correction algorithm is also improved to reduce residual geographically correlated errors, in particular errors correlated with distance to land or sea ice. Time-dependent maps of the on-earth antenna sidelobe brightness temperature contribution are produced and used in the algorithm that generates main-beam brightness temperatures from the calibrated antenna temperatures. This improvement near land is demonstrated by sampling the vicarious cold reference as a function of the sidelobe brightness temperature and observing no significant correlation between the measured cold reference and increasing TE values. The new algorithm is shown to reduce the residual error approaching land to a negligible level.
The validity of the methodology using on-earth TB references to stabilize the calibration on decadal time scales is assessed by cross comparing the resulting calibrated TMR record with that produced from the intercalibration of five SSM/I sensors. It is found that the intersatellite trends between the TMR and the SSM/I are exceedingly small. This is an important validation of both the calibration methodology presented here and that used for the SSM/I record. The cross comparisons shown represent realistic error bars on the absolute climate trends of precipitable water vapor, cloud liquid water, and wind speed produced by TMR and SSM/I and increase confidence in the record as a whole.
Future work will focus on a similar, rigorous recalibration of the radiometers on the Jason-1 and Jason-2 ocean altimetry missions. These radiometers are functionally equivalent to the TMR. The Jason missions are flying in the same orbit as TOPEX, continuing the climate record started in 1992. A distinct advantage of producing a climate record from these satellites is that each mission was launched into a tandem orbit with its predecessor. Jason-1 flew in formation with TOPEX, and Jason-2 flew in formation with Jason-1. This will allow for a seamless intersensor calibration of a climate record that will span at least 2 decades.
This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
Corresponding author address: Shannon Brown, Jet Propulsion Laboratory, M/S 168-314, 4800 Oak Grove Dr., Pasadena, CA 91109. Email: email@example.com