1. Introduction
Historical ocean temperature profiles are used in a variety of climate research applications, including assessing Earth’s energy imbalance and ocean heat content change (von Schuckmann et al. 2016; Palmer 2017), ocean reanalysis and state estimation (Balmaseda et al. 2015; Palmer et al. 2017) and seasonal-to-decadal weather and climate forecasting (Doblas-Reyes et al. 2013; Meehl et al. 2014). Expendable bathythermograph (XBT) measurements, which dominate temperature profile observations over the latter half of the twentieth century, are prone to time-varying biases that can affect estimates of ocean heat uptake and sea level rise (Gouretski and Koltermann 2007; Domingues et al. 2008). This has led to a number of international groups developing XBT bias corrections for use in climate studies (Abraham et al. 2013; Cheng et al. 2016). Several studies have demonstrated that choice of XBT correction is a leading-order uncertainty in time series of global upper-ocean heat content change (Palmer et al. 2010; Lyman et al. 2010; Boyer et al. 2016; Cheng et al. 2016). The impact of XBT biases on the spatial patterns of ocean warming and forecast initialization is an important area of present research.
XBTs were first developed during the 1960s with the aim of providing a cheap and effective means of surveying the temperature of the upper ocean, with the ability to be deployed underway from ships at speeds exceeding 15 kt (1 kt = 0.51 m s−1; Abraham et al. 2013). An XBT consists of a small torpedo-shaped probe that includes a thermistor attached to a spool of copper wire that is linked to an onboard data acquisition system. Once deployed, the XBT falls vertically through the ocean under its own weight and the system records measurements of temperature until the wire runs out and breaks. Time elapsed is converted to an estimate of probe depth using a fall-rate equation (e.g., Hanawa et al. 1995) to provide a profile of ocean temperature against depth. XBTs began widespread deployment on naval, merchant, and research vessels in the mid-1960s and brought about a dramatic improvement in the coverage of upper-ocean temperature observations (Abraham et al. 2013; Palmer 2017).
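For illustration, the depth conversion takes the form of a quadratic in elapsed time. The minimal Python sketch below uses the Hanawa et al. (1995) coefficients for Sippican/TSK T-4, T-6, T-7, and Deep Blue probes; the function name is illustrative, and other probe types and the original manufacturer equations use different coefficients.

```python
def xbt_depth(elapsed_s, a=6.691, b=0.00225):
    """Estimate probe depth (m) from elapsed time (s) with a quadratic
    fall-rate equation, z = a*t - b*t**2.  Default coefficients are the
    Hanawa et al. (1995) values for T-4, T-6, T-7, and Deep Blue probes."""
    return a * elapsed_s - b * elapsed_s**2

# Roughly 120 s after entering the water, a Deep Blue probe is near 770 m.
print(xbt_depth(120.0))
```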
The U.S. company Sippican (now called Lockheed Martin Sippican Inc.) was the original developer and manufacturer of XBT instruments, while the Japanese Tsurumi-Seiki Co. (TSK) started manufacturing in the early 1970s (Kizu et al. 2011) under the Sippican license. Later, Sparton of Canada also manufactured XBTs of its own design. Over the years, different probe types were developed for a variety of sampling depth ranges and vessel deployment speeds (Table 1); probes from the two manufacturers occasionally share the same model name, but their production is independent. However, Sippican and TSK have always used the same brand of thermistor temperature sensor (currently GE Sensing, also used for all XCTD probe versions). XBTs from each manufacturer have shown sizable differences in fall rate (Kizu et al. 2005, 2011) and therefore should be distinguished when developing XBT bias correction schemes. Table 1 summarizes the probe types supplied by Sippican and TSK, with their operational depth ranges, maximum ship speeds, and the dates when the manufacturers started to supply individual models to the market. The depth ranges and maximum deployment speeds are based on product catalogues and historical acquisition software, and the dates have been provided by each manufacturer.
Probe types supplied by Sippican and TSK, and their basic information.
However, the release dates appear to be uncertain for some probe types. For example, Sippican has stated that the T-11 XBT probe became commercially available in 2006, but Johnson and Lange (1979) studied the properties of that model and noted that the T-11 probe had been available since 1975. Also, Sippican indicated 1968 and 1971 as release years for its T-5 and T-10 probes, respectively, in response to subsequent questions about probe manufacture dates (cf. Table 1 values). Similarly, TSK subsequently indicated 1979 and 1984 as release years for its T-7 and T-10, respectively (cf. Table 1 values). We also found literature mentioning other types planned by Sippican—T-2 (Wannamaker et al. 1985), T-3 (Saur and Stewart 1967), T-8 (Sippican Corporation 1968), and T-9 (Brown et al. 1977)—but these are not included in the table because we have not confirmed the details of their specifications or whether they became available for purchase. We note that not all XBT types listed in Table 1 are present in the World Ocean Database (Table 2), for example, the Sippican T-12 (Hannon 2000; Gilson and Roemmich 2002) and the TSK T-7 20-kt version. However, we include all information for completeness, noting that these probes may enter the database at a later date or may already be present with an unknown probe type.
Number of temperature profiles in WOD with known manufacturer and probe type. SSXBT = submarine-launched expendable bathythermograph.
Sippican and TSK have maintained agreement on their sales territories, which have been renewed occasionally. As of July 2017, customers in Europe, North America, South America, Australia, New Zealand, India, Malaysia, and Singapore are in Sippican’s sales territory, and Japan and China are in TSK’s sales territory. South Korea, Taiwan, and Thailand are separately covered by both manufacturers: their military forces are in Sippican’s sales territory but their civilian/research customers are in TSK’s sales territory. Thus, the country code reported with each profile gives an indication of the likely manufacturer, when this information is absent.
The typical approach when developing an XBT bias correction algorithm is to aggregate probes into a small number of types (Table 1) and to compute bulk statistics against a “matchup” database of high-quality temperature measurements (such as ship hydrography or Argo profiling float observations). This enables an estimate of the time-varying temperature and/or fall-rate bias for each of the major probe types or designs (Cheng et al. 2016). However, these efforts are fundamentally limited by the availability of probe type and manufacturer information, which is missing for approximately 50% of all XBT drops available to the global community in the World Ocean Database (WOD). This has led many groups to make simple assumptions about the likely probe type (and manufacturer) for “unknown” XBT drops (e.g., Cowley et al. 2013; Cheng et al. 2014).
The purpose of this paper is to present a new algorithm for assigning probe type and manufacturer to XBT data for which these metadata are missing. This effort builds on the work presented by Cowley et al. (2013) and represents a community effort that has been fostered under the auspices of the International Quality Controlled Ocean Database (IQuOD; www.iquod.org) initiative (Domingues and Palmer 2015). The “intelligent metadata” (iMeta) generated by the algorithm presented here are associated with the IQuOD, version 0.1, data release and will be served alongside the WOD temperature profiles by the National Centers for Environmental Information (NCEI) and other ocean data repositories. The paper outline is as follows. In section 2 we describe the database used in this study. In section 3 we present the iMeta algorithm with the results of our data analysis in support of the algorithm presented in section 4. In section 5 we provide an initial skill assessment of the algorithm and a summary of probe assignments across the database. This is followed by a discussion and summary in section 6. Further information on the Sippican XBT manufacturing history is provided in the appendix.
2. Data
Our analysis makes use of WOD ASCII files on reported depth levels (not the WOD standard depths) that were downloaded from the National Centers for Environmental Information in July 2016. The data include all profiles categorized as XBT observations for the inclusive period 1966–2015, a total of 2.3 million temperature profiles. The number of XBT profiles that include both manufacturer and probe type information represents about 50% of the database overall; this fraction ranges between 20% and 60% for any given year prior to 2000 and is close to 100% for the last 15 years (see section 5). There are a total of 27 unique probe types listed in the WOD, with just 7 of these accounting for >95% of known XBT profiles (Table 2).
The three manufacturers of XBT probes in the WOD—Sippican, TSURUMI SEIKI Co., and Sparton—account for 95.2%, 4.3%, and 0.5% of known XBT profiles, respectively. Because of the much lower numbers of Sparton XBTs in the database, our iMeta algorithm does not assign any unknown probes to that manufacturer. The maximum depth of an XBT profile and the date it was recorded are important pieces of information that can be used to distinguish between probe types. To inform depth criteria for assigning likely probe type for each manufacturer, we computed histograms of maximum profile depth for known XBTs (see section 4). Following the approach of Cowley et al. (2013), quality control flags were disregarded for the purposes of this analysis: we used all the available profiles and data points.
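As an illustration of this step, the sketch below (assuming the profiles have been parsed into a pandas DataFrame with hypothetical columns probe_type and max_depth) produces maximum-depth histograms for each known probe type, analogous to Figs. 2 and 3.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical flat file of known XBT profiles parsed from the WOD ASCII data,
# with one row per profile and the maximum recorded depth in metres.
profiles = pd.read_csv("wod_known_xbt_profiles.csv")

fig, axes = plt.subplots(2, 4, figsize=(16, 6), sharex=True)
for ax, (ptype, group) in zip(axes.flat, profiles.groupby("probe_type")):
    ax.hist(group["max_depth"], bins=range(0, 2350, 50))  # 50-m bins, as in Figs. 2 and 3
    ax.set_title(ptype)
    ax.set_xlabel("Maximum recorded depth (m)")
fig.tight_layout()
plt.show()
```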
3. The iMeta algorithm
The iMeta algorithm presented in this section is an evolution of the approach described by Cowley et al. (2013). The objective of the algorithm is to assign the most likely probe type and manufacturer to XBT drops that are missing this information based on 1) the reporting country, 2) the maximum depth reported, and 3) the date on which the profile was taken. The main innovations relative to Cowley et al. (2013) are in informing the criteria for items 2 and 3 based on known XBT profiles, where manufacturer specifications and date-to-market information were used previously. In addition, we retain all estimated manufacturer and probe type information, and do not aggregate probes of a similar type (e.g., combining T-4 and T-6 probes). Histograms and time series plots to support our choice of depth and date criteria for the algorithm are presented in section 4, with a wider discussion of those results. We present an initial skill assessment of the algorithm in section 5. Since the iMeta algorithm is deterministic, it can be usefully summarized as a flowchart (Fig. 1).
Flow diagram of the IQuOD, version 0.1, iMeta algorithm to classify XBT probes of unknown type. Flow is from left to right, starting with a criterion to determine the manufacturer (SIPPICAN = blue, TSK = orange). The next step is to classify the probes according to the maximum depth recorded for the profile, followed by a date classification step. Country code abbreviations are as follows: JP = Japan, TW = Taiwan, CN = China, and KR = South Korea. Probe type abbreviations are DB = Deep Blue and FD = Fast Deep.
The first step for any unknown profile is to specify the manufacturer based on the country of origin. We follow the same criterion as used by Cowley et al. (2013), based on sales territories: XBT profiles from Japan, Taiwan, China, and South Korea are assumed to be manufactured by TSK. Profiles from all other countries are assumed to come from a Sippican instrument, consistent with our present knowledge of sales territories. As noted in the introduction, South Korea is a sales territory for both Sippican (military) and TSK (civilian) XBTs, and there is some inherent error associated with our assumption on manufacturer. However, we leave further research into the relationship between country of origin and manufacturer, and more sophisticated approaches to treating sales territories for future work.
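A minimal sketch of this first step is given below; the country codes follow Fig. 1 and the function name is illustrative.

```python
TSK_COUNTRIES = {"JP", "TW", "CN", "KR"}  # Japan, Taiwan, China, South Korea (Fig. 1)

def assign_manufacturer(country_code):
    """Step 1: infer the likely manufacturer from the reporting country code."""
    return "TSK" if country_code in TSK_COUNTRIES else "SIPPICAN"
```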
The second step is to classify the probe into the most likely type(s) according to the maximum recorded depth. On inspection of the maximum recorded depth histograms (Figs. 2 and 3), we select the following depth ranges to be used as criteria for distinguishing between XBT probe types: 0–360, 360–600, 600–1000, 1000–1350, 1350–2300 m. These ranges are chosen to differentiate between different probe types while encompassing the main histogram peaks in each category. For comparison, Cowley et al. (2013) used intervals that were determined by applying the Hanawa et al. (1995) fall-rate correction to manufacturer specifications, resulting in depth ranges of 0–362, 362–568, 568–982, and 982–2584 m. Our ranges are similar and include an additional depth range (1000–1350 m) in order to distinguish between Sippican T-5 and Fast Deep probes. We have also reduced the final depth cutoff from 2584 to 2300 m based on tests performed with probes manufactured after 2002 that suggest there is insufficient wire for T-5 probes to reach depths beyond 2300 m. A total of 371 profiles contained a maximum depth >2300 m, which represents about 0.02% of the database.
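The depth criterion can be written as a simple interval lookup, as in the sketch below; treating each lower bound as inclusive is an assumption, since the exact boundary handling is set by Fig. 1.

```python
# Depth intervals (m) used to distinguish probe types (section 3).
DEPTH_BINS = [(0, 360), (360, 600), (600, 1000), (1000, 1350), (1350, 2300)]

def depth_category(max_depth):
    """Step 2: return the depth interval containing the maximum recorded
    depth, or None for the small number of profiles deeper than 2300 m."""
    for lower, upper in DEPTH_BINS:
        if lower <= max_depth < upper:
            return (lower, upper)
    return None
```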
Histograms of maximum recorded depth for known SIPPICAN XBT probes in the WOD for the period 1966–2015. Bin widths of 50 m are used for all probe categories. Shaded regions indicate the depth interval used in the iMeta algorithm (Fig. 1) for the allocation of unknown probes to the type indicated in each subplot title.
As in Fig. 2, but for TSK XBT probes.
The final step in the algorithm is to classify the profile according to the date when the profile was recorded. For simplicity, we sort probes in time only according to the year in which the profile was taken; hence, all dates used are 1 January (Fig. 1). This approach could be refined in future analyses by considering the month or date on which each profile was recorded. This date criterion is informed by looking at time series of total profile numbers for the two dominant XBT types in each depth range (Figs. 4 and 5). The only exception is the distinction between TSK T-5 and T-7 probes in the 600–1000-m depth range, for which there are insufficient known profiles to facilitate this approach. Instead, the date criterion for these probes is based on T-7 probes becoming available only in 1979. This final step is not invoked for the 360–600-m depth range for Sippican, for which all profiles are assigned as T-4, or for the 1000–2300-m depth range for TSK, for which all profiles are assigned as T-5.
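Combining the three steps, a minimal sketch of the decision tree is given below. The transition years (1993, 1995, 1997, 2007, and the 1979 availability of the TSK T-7) follow Fig. 1 and the discussion in section 4; the treatment of the boundary years themselves is an assumption, as are the function names.

```python
def assign_probe_type(country_code, max_depth, year):
    """Sketch of the iMeta decision tree (Fig. 1) for an unknown XBT profile."""
    manufacturer = assign_manufacturer(country_code)

    if manufacturer == "SIPPICAN":
        if max_depth < 360:
            probe = "T-4" if year < 1993 else "T-10"
        elif max_depth < 600:
            probe = "T-4"
        elif max_depth < 1000:
            probe = "T-7" if year < 1997 else "DEEP BLUE"
        elif max_depth < 1350:
            probe = "T-5" if year < 2007 else "FAST DEEP"
        else:
            probe = "T-5"
    else:  # TSK
        if max_depth < 600:
            probe = "T-4" if year < 1995 else "T-6"
        elif max_depth < 1000:
            probe = "T-5" if year < 1979 else "T-7"
        else:
            probe = "T-5"

    return manufacturer, probe
```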
(a) Number of profiles for known Sippican T-10 and T-4 XBT profiles with maximum depths terminating in the depth range 0–360 m. (b) As in (a), but for T-5 and FD XBT profiles terminating in the depth range 1000–1350 m. (c) As in (a), but for T-7 and DB XBT profiles terminating in the depth range 600–1000 m. Year of transition for the dominant XBT probe type in that depth range (vertical dashed lines).
Number of profiles for known TSK T-4 and T-6 XBT profiles with maximum depths terminating in the depth range 0–600 m. Year of transition for the dominant XBT probe type (vertical dashed lines).
In addition to the main algorithm, it is necessary to provide some information about the fall-rate equations (FREs) that were used to determine the depth of each temperature observation in an XBT profile. Hanawa et al. (1995) proposed a correction to the manufacturer FRE following a coordinated side-by-side comparison with CTD data that demonstrated a faster observed fall rate for the T-4, T-6, T-7, and Deep Blue probes, which were the most widely used in the science community. The Integrated Global Ocean Services System (IGOSS) approved the use of the new FRE as proposed by its Task Team on Quality Control for Automated Systems (TT/QCAS), which oversaw the matter and issued an amendment to the BATHY data protocol on 8 November 1995, enabling implementation of the new FRE and enhancement of the metadata description. The new FRE is reported to have been adopted in the software packages of TSK and Sippican in January 1996 and around August 1996, respectively (G. Ferguson and J. Hannon 2005, personal communication). However, it is not clear how quickly these updates were adopted by users. Since our IQuOD data files include the Cheng et al. (2014) XBT bias corrections, we adopt their assumption that all XBTs dropped in or before 1997 used the original Sippican FRE and that all XBTs dropped in or after 1998 used the Hanawa et al. (1995) revised FRE. The number of XBT profiles in the World Ocean Database that do not include FRE information is summarized in Table 3.
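In code, the FRE assumption adopted here reduces to a single year test; a minimal sketch is given below, with the label strings being illustrative.

```python
def assumed_fre(year):
    """Assign the assumed fall-rate equation for a profile with missing FRE
    metadata, following the Cheng et al. (2014) convention adopted here."""
    return "original Sippican FRE" if year <= 1997 else "Hanawa et al. (1995) FRE"
```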
Number of XBT profiles with unknown FRE for each year over the period 1996–2001. All subsequent years have just a few hundred profiles with missing FRE information (<5% of profiles).
4. Supporting results
Histograms of maximum recorded depth for known Sippican probe profiles (Fig. 2) corresponding to iMeta categories (Fig. 1) show distributions that correspond well with manufacturer specifications (Cowley et al. 2013, their Table 3). There are also interesting features in the histograms that warrant further investigation. For example, many of the deeper probe types show a peak in profile numbers around 500 m. This is thought to be associated with Sippican XBT recorder software, which imposed a depth cutoff at 480 m until the software was changed in the mid-1990s. Although the T-10 probes have a manufacturer-specified operation depth of 200 m, a number of these probes appear to have recorded much greater depths, suggesting some mislabeling of reported probe type.
The histograms of maximum recorded depth for TSK probes also show good agreement with manufacturer specifications (Fig. 3). They are similar to the Sippican probes in showing a peak around 500-m depth for the T-5, T-6, and T-7 probes. The T-5 histogram is particularly noisy, with large peaks around 500, 1000, and 1750 m. However, this may be partly due to the relatively small sample size for this XBT type, which makes it more difficult to draw clear conclusions on issues such as mislabeling. The TSK T-6 histogram shows a second peak at about 750 m, with some probes reporting maximum depths even greater than this. It seems most likely that these are deeper probes (e.g., T-5 or T-7) that have been mislabeled. The same is true of the peak at 500 m for the T-10 probes, whose nominal sampling depth of 300 m is deeper than that of their Sippican counterparts (Table 1).
For both Sippican and TSK probes, the depth ranges used in the iMeta algorithm (Figs. 2 and 3, shaded regions) appear to do a good job of capturing the main histogram peaks while differentiating between different XBT probe types.
Time series of total profiles for known XBTs over a number of depth ranges are used to inform the date criteria in the final step of the iMeta algorithm. For Sippican probes with maximum recorded depths in the range 0–360 m, we can see a transition from T-4 to T-10 as the dominant probe type after 1993 (Fig. 4a). In the 1000–1350-m depth range, the transition between Sippican T-5 and Fast Deep as the dominant probe type is less distinct (Fig. 4b). For simplicity, and to maintain only single-date criteria in the iMeta algorithm, we take 2007 as the transition from T-5 to the newer Fast Deep probes, despite the earlier peak in Fast Deep numbers in 2001. We note that in 2007 Sippican started the production of T-5/T-20 probes (the T-5 version for ships traveling up to 20 kt) with the same properties and output-file characteristics as the standard T-5 version. In the 600–1000-m depth range, there is a distinct transition between T-7 and the newer Deep Blue probes after 1997 (Fig. 4c).
The TSK probes with maximum recorded depths in the upper few hundred meters are dominated by T-4 and T-6 probes, with a transition from T-4 to T-6 as the dominant probe type after 1995 (Fig. 5). There are too few known profiles in the remaining iMeta categories to perform an analysis of the dominant probe types.
5. Skill assessment
In this section we carry out a simple skill assessment of the iMeta algorithm presented in section 3 (Fig. 1). The measure of skill is based upon running the iMeta algorithm on all known XBT profiles and checking for agreement with the recorded metadata for both manufacturer and probe type. From the 1970s to the late 1990s, the number of XBT profiles recorded each year generally exceeded 50 000, with this number declining substantially at the start of the twenty-first century (Fig. 6a). While XBT profiles missing metadata constitute about 50% of the database overall, there are large temporal variations. Prior to the late 1990s, the percentage of known XBT profiles ranges from about 65% in the late 1970s to a minimum of about 10% in the early 1990s (Fig. 6b). There was a dramatic rise in the number of known XBT profiles over the mid-to-late 1990s with enhancements to the BATHY data protocol (IOC–WMO 1995), reaching over 95% from 2000 onward. Further inspection of the data during the early 1990s reveals that the low rate of metadata completeness is mostly due to the large number of probes having the manufacturer reported as “Unknown Brand” (about 70% of the unknown probes).
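In outline, the skill calculation simply applies the sketched classifier to every known profile and tabulates the annual agreement rate; the snippet below assumes the hypothetical DataFrame and functions from the earlier sketches.

```python
# Apply the iMeta sketch to all known profiles and compute the annual skill:
# the percentage for which both manufacturer and probe type agree with the
# recorded metadata (cf. Fig. 6b).  Column names are illustrative.
known = profiles.dropna(subset=["manufacturer", "probe_type"])

predicted = known.apply(
    lambda row: assign_probe_type(row["country_code"], row["max_depth"], row["year"]),
    axis=1,
)
hit = [
    (manu, ptype) == truth
    for (manu, ptype), truth in zip(predicted, zip(known["manufacturer"], known["probe_type"]))
]
skill_by_year = 100.0 * pd.Series(hit, index=known.index).groupby(known["year"]).mean()
print(skill_by_year)
```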
(a) Total number of XBT profiles (black) and profiles containing manufacturer and probe type metadata (blue) for each year during the period 1966–2015. Also shown are the total profiles for which iMeta predicts the correct probe type and manufacturer (red) and just the probe type (orange). (b) Assessment of the iMeta performance expressed as a percentage for probe type and manufacturer (red) and probe type alone (orange). Percentage of probes with known type and manufacturer is also shown (blue).
The iMeta algorithm skill starts very high (>90%), presumably because there are very few probe types available initially (Fig. 6b). The skill declines over the 1970s and 1980s from values around 90% to a minimum of around 50% in the early 1990s. This is followed by a slow recovery back up to about 80% by the mid-2010s. While the overall performance of the iMeta algorithm is quite good, with an average of 77% for the entire period, there is room for substantial improvement. As a sensitivity test, we also evaluate the skill in predicting only the probe type and disregarding the information on manufacturer (Fig. 6b, orange line). In this case, we see some improvement of the iMeta skill during the mid-1970s to mid-1990s, which may be largely explained by the use of Sippican probes deployed from Japanese vessels in the Thermal Structure Monitoring Program in the Pacific (TRANSPAC) ship of opportunity XBT program (e.g., Koblinsky et al. 1984). This improvement in skill is relevant for XBT correction schemes that aggregate probe types across manufacturers. However, previous studies have shown that both manufacturer and probe type are important determinants of XBT bias (e.g., Kizu et al. 2005, 2011; Cowley et al. 2013).
It is interesting to consider the percentages of total probe numbers for both the known XBTs (Fig. 7a) and the probe assignments of the iMeta algorithm for all profiles (Fig. 7b). The iMeta algorithm retains the Sippican T-4, Deep Blue, and T-7 probes as the most numerous types, with some substantial changes in the percentages. The iMeta algorithm suggests the next most numerous probe type is the TSK T-4, accounting for just under 10% of the WOD.
Relative proportions of XBT probe type and manufacturer for (a) known XBT profiles and (b) totals based on iMeta assignment for all XBT profiles. Letters in parentheses indicate the manufacturer: S = SIPPICAN and T = TSK.
An estimate of total probe numbers manufactured by Sippican prior to August 2002 (see the appendix) suggests the following percentages (excluding air-dropped and submarine-deployed XBTs): T-4 = 68%, T-7 = 17%, Deep Blue = 8.5%, T-5 = 3.4%, T-10 = 1.7%, and Fast Deep < 1%. Although no direct correspondence between our iMeta assignments and these numbers can be expected, because a large number of probes are unaccounted for in the World Ocean Database (presumably because the data are classified, the U.S. Navy being the largest customer), it is reassuring that the proportions are broadly similar.
6. Discussion and summary
We have presented an “intelligent metadata” (iMeta) algorithm for assigning manufacturer and probe type information to unknown XBT profiles. The primary purpose of the algorithm is to facilitate advances in XBT bias corrections for climate research applications. Our approach is an extension of the work presented by Cowley et al. (2013) and uses country code, maximum recorded depth, and profile date to inform the most likely XBT manufacturer and probe type. A skill assessment based on all known XBT profiles for the period 1966–2015 shows that the correct probe type and manufacturer are assigned on average 77% of the time. Skill is poorest during the early 1990s, which is also a period of particularly high rates of missing XBT metadata. A histogram analysis of the maximum recorded depths has highlighted some interesting features that warrant further investigation. In particular, the data suggest that there may be a substantial number of mislabeled probes in the database. If possible, these errors should be eliminated, since erroneous data may introduce artifacts into the iMeta algorithm and/or compromise the evaluation of skill.
There are a number of avenues of future research that could be usefully pursued. We have presented a deterministic algorithm here, but ultimately there may be greater value in adopting a probabilistic framework, that is, one that gives likelihoods for all possible probe types rather than a single “best guess.” Machine learning may be a particularly well-suited approach, and initial research efforts are currently underway. A probabilistic framework would allow the generation of multiple realizations of iMeta and could underpin ensembles of XBT bias corrections, which may offer a more complete description of the associated uncertainties.
In addition, the number of predictor variables could be increased. Information such as cruise identification, scientific institute, and geographic location (perhaps combined with bathymetry data) could add to the skill of the algorithm. It would also be useful to seek further information from manufacturers on the numbers of probes (or relative proportions) that have been sold over time. Further advances in the provision of intelligent metadata and the impact on estimated ocean heat content variability and change will be fostered under the International Quality Controlled Ocean Database (IQuOD, www.iquod.org) initiative.
Acknowledgments
The work presented here has been carried out as part of the IQuOD working group activities sponsored by the Scientific Committee on Oceanic Research and the International Oceanographic Data and Information Exchange program of the Intergovernmental Oceanographic Commission (IOC). We would like to acknowledge the wider community of scientists who have contributed to useful discussions on this work at IQuOD workshops. We thank Catia Domingues and two anonymous reviewers for their constructive comments on an earlier version of this manuscript. MP was supported by the Joint U.K. BEIS/Defra Met Office Hadley Centre Climate Programme (GA01101). TS and SK were supported by the Environment Research and Technology Development Fund (2-1506) of the Ministry of Environment, Japan. RC and AT were supported by the Australian Government's National Environmental Science Program through the Earth Systems and Climate Change Hub.
APPENDIX
Sippican Production History Notes
The aim of this appendix is to capture some of the information on Sippican XBT production history based on various e-mail exchanges between company staff and research scientists. This information may help facilitate future improvements in approaches to assigning intelligent XBT metadata.
In e-mail correspondence dated 2 August 2002, Jim Hannon of Sippican provided the following information and “guesstimate” on the total number of probes manufactured. The XBT was developed by Sippican using internal funding and patented in 1962. Production of XBTs began in 1964 and the first model (and by far the largest number produced) was the T-4, which had a depth capability of 1500 ft (460 m) and a maximum ship speed of 30 kt. On 10 December 1972, Sippican produced its 1 000 000th XBT. Prior to August 2002, over 7 million XBTs had been produced in various models, including submarine- and aircraft-launched XBTs (Table A1).
Estimate of the total number of XBT probes manufactured at Sippican prior to August 2002.
The probe has a wire length of 868 to 893 meters (winding tolerance). I believe we made a change to the winding tolerances back in 1999 to resolve a concern about not getting as deep as probes used to. We changed wire lengths when we moved to Juarez because we changed wire suppliers. Given these wire lengths it is not surprising at all that the reported depths are going to as much as 925 meters with some stretch (this relates to a 3% stretch, which is well within our expectations).
[The] rated depth of the Deep Blue XBT is 760m, but actual depth is 922m. Extra wire was added to the canister spool (ship end) to accommodate our fast commercial vessels, so that the ship spool would not run out before the probe spool, even at high vessel speeds. So we continue to this day to get probe depths of 922m in good weather. Note: wind and sea state 'take' more wire from both ends, so in rough seas we typically get 850m depths, even though the wire almost always breaks at the probe. Identifying probes due to maximum depth is fraught with many uncertainties. I would suggest that it is an unreliable way to determine probe type. We launched a bunch of T-12s on an experimental basis, but it was so long ago (~20 years), I've lost the bead on that. We were hoping that the 2000 m probe would pan out but after Jim gave us a 4 boxes, we discovered that the LMP-T5 (fall rate ~10m/sec) had a temperature offset of 4 C at the surface - after spending some time in storage, due to the encapsulation curing over time, putting stresses on the circuitry. After the French spent significant funds for development for their 30knot Charles De Gaul, Sippican eventually abandoned that project, as unsolvable.
REFERENCES
Abraham, J. P., and Coauthors, 2013: A review of global ocean temperature observations: Implications for ocean heat content estimates and climate change. Rev. Geophys., 51, 450–483, https://doi.org/10.1002/rog.20022.
Balmaseda, M. A., and Coauthors, 2015: The Ocean Reanalyses Intercomparison Project (ORA-IP). J. Oper. Oceanogr., 8, 80–97, https://doi.org/10.1080/1755876X.2015.1022329.
Boyer, T., and Coauthors, 2016: Sensitivity of global upper-ocean heat content estimates to mapping methods, XBT bias corrections, and baseline climatologies. J. Climate, 29, 4817–4842, https://doi.org/10.1175/JCLI-D-15-0801.1.
Brown, W., J. A. Vermersh, and R. C. Beardsley, 1977: Wintertime 1974-75 Western Gulf of Maine Experiment data report. Woods Hole Oceanographic Institution Tech. Rep. WHOI-77-22, 95 pp.
Cheng, L., J. Zhu, R. Cowley, T. Boyer, and S. Wijffels, 2014: Time, probe type, and temperature variable bias corrections to historical expendable bathythermograph observations. J. Atmos. Oceanic Technol., 31, 1793–1825, https://doi.org/10.1175/JTECH-D-13-00197.1.
Cheng, L., and Coauthors, 2016: XBT science: Assessment of instrumental biases and errors. Bull. Amer. Meteor. Soc., 97, 924–933, https://doi.org/10.1175/BAMS-D-15-00031.1.
Cowley, R., S. Wijffels, L. Cheng, T. Boyer, and S. Kizu, 2013: Biases in expendable bathythermograph data: A new view based on historical side-by-side comparisons. J. Atmos. Oceanic Technol., 30, 1195–1225, https://doi.org/10.1175/JTECH-D-12-00127.1.
Doblas-Reyes, F. J., J. García-Serrano, F. Lienert, A. P. Biescas, and L. R. L. Rodrigues, 2013: Seasonal climate predictability and forecasting: Status and prospects. Wiley Interdiscip. Rev. Climate Change, 4, 245–268, https://doi.org/10.1002/wcc.217.
Domingues, C. M., and M. D. Palmer, 2015: The IQuOD initiative: Towards an International Quality controlled Ocean Database. CLIVAR Exchanges, No. 67, International CLIVAR Project Office, Southampton, United Kingdom, 38–40.
Domingues, C. M., J. A. Church, N. J. White, P. J. Gleckler, S. E. Wijffels, P. M. Barker, and J. R. Dunn, 2008: Improved estimates of upper-ocean warming and multi-decadal sea-level rise. Nature, 453, 1090–1093, https://doi.org/10.1038/nature07080.
Gilson, J., and D. Roemmich, 2002: Mean and temporal variability in Kuroshio geostrophic transport south of Taiwan (1993–2001). J. Oceanogr., 58, 183–195, https://doi.org/10.1023/A:1015841120927.
Gouretski, V., and K. P. Koltermann, 2007: How much is the ocean really warming? Geophys. Res. Lett., 34, L01610, https://doi.org/10.1029/2006GL027834.
Hanawa, K., P. Rual, R. Bailey, A. Sy, and M. Szabados, 1995: A new depth-time equation for Sippican or TSK T-7, T-6 and T-4 expendable bathythermographs (XBT). Deep-Sea Res. I, 42, 1423–1451, https://doi.org/10.1016/0967-0637(95)97154-Z.
Hannon, J., 2000: New developments in expendable oceanographic sensors and data acquisition systems. OCEANS 2000 MTS/IEEE Conference Proceedings, Vol. 3, IEEE, 1875–1877, https://doi.org/10.1109/OCEANS.2000.882210.
Johnson, B. P., and R. E. Lange, 1979: Rapid sampling of temperature and temperature gradient using XBT’s. Scripps Institution of Oceanography Rep. SIO Reference Series 79-4, 41 pp.
IOC–WMO, 1995: Integrated Global Ocean Services System (IGOSS). Circular Letter 95-96.
Kizu, S., S. Itoh, and T. Watanabe, 2005: Inter-manufacturer difference and temperature dependency of the fall-rate of T-5 expendable bathythermograph. J. Oceanogr., 61, 905–912, https://doi.org/10.1007/s10872-006-0008-z.
Kizu, S., C. Sukigara, and K. Hanawa, 2011: Comparison of the fall rate and structure of recent T-7 XBT manufactured by Sippican and TSK. Ocean Sci., 7, 231–244, https://doi.org/10.5194/os-7-231-2011.
Koblinsky, C. J., R. L. Bernstein, W. J. Schmitz Jr., and P. P. Niiler, 1984: Estimates of the geostrophic stream function in the western North Pacific from XBT surveys. J. Geophys. Res., 89, 10 451–10 460, https://doi.org/10.1029/JC089iC06p10451.
Lyman, J. M., S. A. Good, V. V. Gouretski, M. Ishii, G. C. Johnson, M. D. Palmer, D. M. Smith, and J. K. Willis, 2010: Robust warming of the global upper ocean. Nature, 465, 334–337, https://doi.org/10.1038/nature09043.
Meehl, G. A., and Coauthors, 2014: Decadal climate prediction: An update from the trenches. Bull. Amer. Meteor. Soc., 95, 243–267, https://doi.org/10.1175/BAMS-D-12-00241.1.
Palmer, M. D., 2017: Reconciling estimates of ocean heating and Earth’s radiation budget. Curr. Climate Change Rep., 3, 78, https://doi.org/10.1007/s40641-016-0053-7.
Palmer, M. D., and Coauthors, 2010: Future observations for monitoring global ocean heat content. Proceedings of OceanObs’09: Sustained Ocean Observations and Information for Society, J. Hall, D. E. Harrison, and D. Stammer, Eds., Vol. 2, ESA Publ. WPP-306, https://doi.org/10.5270/OceanObs09.cwp.68.
Palmer, M. D., and Coauthors, 2017: Ocean heat content variability and change in an ensemble of ocean reanalyses. Climate Dyn., 49, 909–930, https://doi.org/10.1007/s00382-015-2801-0.
Saur, J. F. T., and D. D. Stewart, 1967: Expendable bathythermograph data on subsurface thermal structure in the eastern North Pacific Ocean. U.S. Fish and Wildlife Service Special Scientific Rep.—Fisheries 548, 70 pp.
Sippican Corporation, 1968: R-467B: Instruction for installation, operation and maintenance of Sippican expendable bathythermograph system. The Sippican Corporation, 100 pp.
von Schuckmann, K., and Coauthors, 2016: An imperative to monitor Earth’s energy imbalance. Nat. Climate Change, 6, 138–144, https://doi.org/10.1038/NCLIMATE2876.
Wannamaker, B., R. Rossi, P. Nesfield, and P. Saia, 1985: A microcomputer-controlled digital acquisition and analysis system for the expendable bathythermograph. SACLANT ASW Research Centre Memo. SM-183, 23 pp.