An objective mapping exercise simulating observations of temperature in the North Atlantic Ocean was used to assess the resolution capabilities of ocean acoustic tomography in combination with Argo floats. A set of basis functions for a basinwide area was obtained from a singular value decomposition of a covariance derived from an ocean state estimate. As demonstrated by the formal uncertainty estimates from the objective maps, Argo and tomography are complementary measurements. In several examples, each data type separately obtained uncertainties for large-scale monthly average temperature of about 50% of the prior (75% of the variance resolved), while when both data types were employed together, uncertainties were reduced to about 25% of the prior (94% of the variance resolved). Possible tomography configurations range from arrays that span specific regions, to line arrays that supplement existing observations, to arrays that span the Atlantic basin. A basinwide array consisting of two acoustic sources and seven receivers can be used to significantly reduce the uncertainties of estimated broad-scale temperature. An optimal observing system study would combine simulated measurements with data assimilation techniques and numerical ocean modeling. This objective map study, however, showed that the addition of tomography to the existing observing system could substantially reduce the uncertainties of estimated large-scale temperature. To the extent that tomography offers a 50% reduction in uncertainty at a fraction of the cost of the Argo program, it is a cost-effective contribution to the ocean observing system.
Ocean acoustic tomography was introduced as an approach to ocean observation in the late 1970s (Munk et al. 1995; Munk and Wunsch 1982a; Munk 1986; Spiesberger and Metzger 1992). Several scientific programs in the 1980s and 1990s developed and established this measurement as both unique and valuable. Examples of tomographic contributions to physical oceanography are many, including the following: 1) The Greenland Sea Project made remarkable three-dimensional (3D) measurements of the evolution of deep-water forming events during the winter of 1989–90 (Worcester et al. 1993; Morawitz et al. 1996a,b). 2) Highlighting the inherent averaging properties of tomography, precise measurements of barotropic, open-ocean tidal currents, from the dominant M2 and S2 constituents to the small P1 and Q1 constituents, have been made in several places (Dushaw et al. 1997; Stammer et al. 2014). 3) The tiny tidal relative vorticity at M2 frequency caused by tidal variations in sea surface height was also measured. 4) Tomography was central to the detection of coherent, mode-1, internal-tide radiation far into the interior of the North Pacific (Dushaw et al. 1995; Ray and Mitchum 1996) and 5) the determination that, much like the barotropic tides, such variability is predictable over most of the world’s oceans (Dushaw et al. 2011; Dushaw 2015). 6) An O(1000)-km-scale tomographic array deployed during 2000 in the equatorial Pacific measured the mean relative vorticity, showing that it was positive during La Niña and negative during the normal state (Nakano et al. 2001). 7) Basin-scale measurements of temperature variations of the layer of Atlantic water in the Arctic were made by transbasin acoustic experiments in the 1990s, one of the first indications of Arctic warming (Mikhalevsky et al. 1999, 2015). 
8) Measurements of temperature were made across the North Pacific over the 1996–2006 decade by the Acoustic Thermometry of Ocean Climate (ATOC) program (ATOC Consortium 1998; Worcester et al. 1999; Dushaw et al. 2009), indicating that rapid changes in basin-scale temperature, comparable in size to the seasonal cycle, sometimes occur. 9) Data assimilation techniques used in combining tomography and other data with dynamical constraints in western boundary current regions substantially reduce the uncertainties of the state estimate (Gaillard 1992; Lebedev et al. 2003). In all of these measurements, tomography detected and quantified oceanographic phenomena in ways not achievable by other means. Munk et al. (1995) describe the technique and applications of acoustic tomography in general. Updated reviews have been given by Worcester (2001), Dushaw et al. (2001), and Dushaw (2014).
In the course of these experimental programs, the remarkable stability of the acoustic environment over most of the world’s oceans was demonstrated (Munk et al. 1995; Cornuelle et al. 1993; Worcester et al. 1999) such that in most open-ocean regions, quite accurate predictions of the acoustic propagation and spatial sampling characteristics can be made (Jensen et al. 2011; Dushaw et al. 2013). Such predictions can use a climatological ocean such as the World Ocean Atlas (Antonov et al. 2010; Locarnini et al. 2010) of the National Oceanic and Atmospheric Administration (NOAA). The oceanographic signals are derived from the small deviations in acoustic travel times from the basic climatological conditions. Variations of travel times over a 5000-km range across an ocean basin can be measured to an accuracy of about 0.010 s, out of a total travel time of over 3300.000 s. This accuracy in travel time is roughly equivalent to a range- and depth-averaged sound-speed (temperature) change of 0.005 m s−1 (1 m °C).
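As a quick consistency check of these figures, the scaling between travel-time accuracy and the equivalent sound-speed and temperature changes can be computed directly. This is a back-of-envelope sketch using the round numbers quoted above; the nominal sound speed of 1500 m s−1 is an assumption, not a value from the text:

```python
# Back-of-envelope check of the travel-time accuracy quoted above.
range_m = 5.0e6     # 5000-km acoustic path
c0 = 1500.0         # nominal sound speed (m/s), an assumed round number
dt = 0.010          # travel-time measurement accuracy (s)

tau = range_m / c0  # total travel time, roughly 3333 s
# A small, uniform sound-speed change dc perturbs the travel time by
# d(tau) ~ -tau * dc / c0, so the resolvable sound-speed change is:
dc = c0 * dt / tau  # ~0.0045 m/s, consistent with the ~0.005 m/s above
# With dc/dT ~ 3.5 m/s per degC, the equivalent temperature change:
dT = dc / 3.5       # ~0.0013 degC, i.e., on the order of 1 m degC
print(tau, dc, dT)
```

The result is consistent with the roughly 0.005 m s−1 (order 1 m °C) equivalence quoted above.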
Ocean observing systems (OOSs) began to be developed in the 1990s, almost concurrent with the development of the acoustical measurements. Employment of acoustical methods within OOSs has been elusive, however. International workshops focused on OOSs, such as the 1999 and 2009 OceanObs conferences, highlighted acoustic tomography as a good potential contribution, suggesting the deployment of pilot systems in the North Atlantic (Scientific Committee on Ocean Research 1994; Dushaw et al. 2001, 2010; BOM 2001; Fischer et al. 2010). Despite this community consensus, there has been little progress in developing such prototype sustained systems. One difficulty has been the challenge of completing modeling studies that would quantify the precise contributions of tomography to the ocean observing system and determine optimal configurations of the acoustical systems. In the context of the Global Ocean Data Assimilation Experiment (GODAE; https://www.godae-oceanview.org/) and other such programs, observing system simulation experiments (OSSEs) that would bring numerical ocean modeling and data assimilation techniques to bear on the problem of system design and optimization have been discussed for many years (Cornuelle et al. 1989; Semtner and Chervin 1990; Foster 1991; Duda et al. 1995; Cornuelle and Worcester 1996; Sheinbaum 1995; Chiu et al. 1994; Yaremchuk and Yaremchuk 2001; Rémy et al. 2002; Yaremchuk et al. 2004). For the purposes of this paper, at least, an OSSE is defined as a study employing a numerical ocean model and data assimilation. While such numerical studies are the gold standard of system assessment and design, in practice they are technically challenging and time consuming, such that few OSSEs, examining tomography or any other data type, in the context of general circulation models have been completed (Halliwell et al. 2017; Gasparin et al. 2018).
Considerable success has been achieved in obtaining ocean state estimates by data assimilation of existing data, however. Many such global and regional estimates are now available for research or operational purposes. State estimates represent accurate ocean environments that can be used for observing system design, for example, Mazloff et al. (2018). While not as definitive as a rigorous OSSE in quantifying the information contribution of data types and their configurations, analyses based on realistic ocean environments offer a simple, but objective, way to determine information requirements and to assess the relative resolution capabilities of different data types. For the exercise here, simulated measurements of temperature by ocean acoustic tomography in conjunction with simulated float measurements in the North Atlantic were assessed. A recent state estimate from the Estimating the Circulation and Climate of the Ocean (ECCO) consortium was employed. The approach used here is objective analysis, or objective mapping, which has been commonly used for data mapping or observing array analysis (Bretherton et al. 1976; Roemmich and Gilson 2009). Previous analyses have often represented ocean variability using covariances derived directly from data. In the present case, the ocean variability was represented by a covariance derived from the ECCO state estimate. A large set of mapping basis functions was obtained from a singular value decomposition (SVD) of the covariance.
The ECCO global ocean state estimate is described in section 2. Some examples of notional tomographic observations for the North Atlantic are described in section 3, and the basic, predicted acoustic characteristics associated with those observations are described in section 4. This study makes the simplifying, but justified, assumption that a tomographic measurement represents a simple temperature measurement, averaged over depth and over range. The derivation of the basis functions from an SVD of the ECCO temperature covariance is described in section 5. The measurements of temperature by Argo drifting floats in the North Atlantic and their representation and noise covariance for the objective mapping calculation are described in section 6, and the objective maps themselves are described in section 7. Metrics for assessing the quality of the ocean estimate from the various data configurations are described in section 8. A concluding discussion is given in section 9.
2. The ECCO, version 4, global ocean state estimate
The ocean state estimate used for this study was the recent ECCO solution, version 4, release 3 (ECCOv4; ftp://ecco.jpl.nasa.gov/Version4/Release3/) (Forget et al. 2015, 2016). The ECCO modeling program was initiated in the mid-1990s, partly to support the ATOC program. ECCOv4 provides an estimate for the global ocean state from 1992 to 2016. The model has a longitude resolution of 1° and a latitude resolution that telescopes to smaller values at high and low latitudes. A product derived from this solution consisting of monthly mean potential temperature interpolated to a uniform grid spacing was employed for this analysis. This product has 0.5° horizontal resolution, a finer grid than that of the model itself in many places, and 50 layers in the vertical, spanning 5–5906 m. The details of the implementation of the numerical ocean model, the data and forcing employed, etc., of this particular solution are given by Fukumori et al. (2017) and references cited therein. Ocean data employed for the estimate included a wide-ranging variety of global datasets, including sea level, temperature profiles, salinity profiles, sea surface temperature, sea surface salinity, and ocean bottom pressure. The model is also constrained by temperature–salinity climatology from the 2009 World Ocean Atlas and mean dynamic topography. Figure 1 illustrates the ECCOv4 state estimate with a map of the potential temperature field relative to climatology for the North Atlantic Ocean in February 1996 at 300-m depth. The lower panel of the figure also shows the root-mean-square (RMS) temperature at this depth computed from the state estimate, indicating, among other things, the large temperature variations of the North Atlantic Current to the south and east of Nova Scotia.
Since this study was limited to the purpose of assessing and illustrating the basic properties of acoustic tomography for resolving large-scale temperature variability, limiting the variables employed to just potential temperature was adequate. The sound speed that governs the acoustic travel times employed by tomography is a function of temperature, salinity, and pressure, to be sure (Del Grosso 1974), but the dominant variable for acoustic tomography is temperature (e.g., Morawitz et al. 1996a; Dushaw et al. 2009, 2013, 2016b). Indeed, this author knows of no cases where salinity variations caused significant ambiguity with temperature estimated from a tomographic inverse analysis. To good approximation, sound speed and temperature are proxy variables for each other in the open ocean.
Existing tomographic data obtained by the ATOC program between 1996 and 2006 in the North Pacific (Dushaw et al. 2009) can be compared to equivalent data computed from the ECCOv4 solution for potential temperature (Fig. 2). Time series of travel time, relative to the time average, were derived from ECCOv4 by averaging the potential temperature along the 3000–5000-km acoustic paths between the surface and 3500 m, and scaling those averages by a factor of 3.5 m s−1 °C−1 to estimate sound speed and travel time. (The section average omits the upper ocean near Hawaii to 200-m depth, linearly shoaling to the surface at 1000-km range, to account for the nature of the acoustic ray paths there.) The measurement and model have some similarity (Fig. 2), but the significant differences, comparable in size to the seasonal cycle, between the two time series indicate the additional information about the ocean state provided by acoustic tomography for the large-scale thermal field. The ECCO state estimates include all available datasets, sea surface height in particular, so such differences indicate a gap in the ocean observing system. The comparisons shown in Fig. 2 are little changed from rigorous comparisons made a decade ago using an ECCO state estimate available then (Dushaw et al. 2009, their Figs. 12 and 13).
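The conversion from model temperature to equivalent travel time described above can be sketched as follows. This is a minimal illustration; the function name and the uniform-perturbation approximation are assumptions, the nominal sound speed of 1500 m s−1 is assumed, and the actual computation also handled the near-Hawaii depth exclusion:

```python
import numpy as np

def travel_time_anomaly(theta_avg, path_range=4.0e6, c0=1500.0, dcdT=3.5):
    """Convert path- and depth-averaged potential-temperature anomalies
    (degC, relative to the time mean) into travel-time anomalies (s).

    theta_avg: 1D array of monthly temperature anomalies averaged along
    the acoustic path (surface to 3500 m). Hypothetical input; c0 is an
    assumed nominal sound speed.
    """
    dc = dcdT * theta_avg  # sound-speed anomaly (m/s)
    # For a small uniform perturbation, d(tau) ~ -range * dc / c0**2;
    # warming (faster sound) shortens the travel time.
    return -path_range * dc / c0**2

# Example: a 0.1 degC path-average warming on a 4000-km path
anoms = travel_time_anomaly(np.array([0.0, 0.1]))
```

The resulting anomaly of roughly −0.6 s for a 0.1°C warming is the right order for the seasonal-scale travel-time variations discussed above.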
3. Prototype acoustic array designs
The design of an acoustic observing array is driven by several factors, not the least of which is the physical property of the ocean that is to be measured. Acoustic arrays can be regional, such as the pentagonal arrays deployed for observing deep-mixing events in the Greenland Sea (Morawitz et al. 1996b) or for observing mesoscale variability (Worcester et al. 2013), or transbasin, such as a set of acoustic paths across the North Pacific or Arctic for observing the large-scale temperature (ATOC Consortium 1998; Dushaw et al. 2009; Mikhalevsky et al. 2015). Other acoustic measurements have been made across important oceanographic choke points, such as the Strait of Gibraltar (Send et al. 2002) or Fram Strait (Sagen et al. 2016). The present analysis is focused on basin-scale observations.
Acoustic paths that would cause the acoustic signals to interact with the seafloor should be avoided, because such signals can be difficult to interpret. Thus, while acoustic instruments deployed near the Azores may seem to be logistically sensible, these islands are surrounded by the relatively shallow Azores Plateau, making them less than ideal. The seafloor depth data employed for this study were Smith and Sandwell (1997), version 18.1, with 1-min resolution. The acoustic paths were geodesics computed using the 1984 World Geodetic System (WGS84), using the algorithm of Vincenty (1975). The horizontal refraction of tomographic signals caused by ocean variability is inconsequential (Dushaw 2014; Dushaw and Menemenlis 2014).
Examples of three possible, but somewhat arbitrary, arrays will be described here to highlight different observational approaches and acoustical properties (Fig. 3). These arrays are 1) a network of acoustic paths spanning the western North Atlantic, 2) a single basin-scale acoustic path from Senegal to Bermuda, and 3) two 1200-km-long acoustic paths on either side of the Mid-Atlantic Ridge along 26°N. Array 1 consists of six transceivers that give 15 acoustic paths that span the western subtropical North Atlantic, from Antigua to Newfoundland and along the North American coast. Bermuda is highlighted as an ideal location for acoustic instruments, since it is central to the basin and the steep flanks of the Bermuda Rise allow for topographically unobstructed acoustical propagation into the open ocean. Array 2 is a basin-scale path, similar to those employed for the ATOC program in the Pacific. This particular path was used in 1945 to demonstrate the remarkable ability of the ocean to carry sound long distances (Anonymous 1960). The two paths of array 3 lie along the Rapid Climate Change (RAPID) array (http://www.rapid.ac.uk), a transatlantic line of moored instruments along 26°N that has been used to monitor the Atlantic meridional overturning circulation (AMOC) over the past decade (McCarthy et al. 2015, 2017). The precise measurements of average temperature by these two paths were anticipated to be constraints on the AMOC heat flux.
The actual design of acoustical observations would necessarily require close consultation between acousticians and oceanographers. The need to obtain particular or ideal measurements of ocean temperature or current has to be balanced against the practical acoustical and engineering limitations of what can be measured. The goal of employing these three array types is to begin the process of building intuition as to what is possible and how the estimated uncertainties from objective maps respond to these new data. The measurement capabilities of a fourth basinwide Atlantic array are described below.
4. Ray paths and the tomography measurement
Any new measurement for acoustic tomography requires an initial analysis of the acoustic forward problem (e.g., Worcester et al. 1999; Dushaw et al. 2016a, 2017). The acoustic propagation characteristics on any given path determine the resolution characteristics for oceanographic properties. From this analysis, the computation of an “inverse estimate” for the parameters of interest, either in temperature or current, follows. Such an inverse estimate could be computed in any number of ways, from a simple two-dimensional least squares fit using a basis set of simple functions that represent the ocean in depth and range (e.g., Dushaw and Sagen 2016) to an assimilation of the acoustic data as a constraint on a numerical ocean model. While the basic acoustic properties can most often be readily determined from existing ocean climatologies, oftentimes details of the recorded acoustic signals require additional analysis, for example, acoustic propagation in Fram Strait (Dushaw et al. 2016a) or the Canary basin (Dushaw et al. 2017). Regardless of such details, the essential information provided by tomography data obtained on any particular acoustic path is the average temperature, either a profile averaged over range or an average over both range and depth (Dushaw et al. 1993; Cornuelle and Worcester 1996; Dushaw 1999; Sagen et al. 2016). Averaging is the basic nature of the acoustic data.
Acoustic predictions can be made for any of the acoustic paths depicted in Fig. 3 (e.g., Figs. 4–6). The use of efficient broadband acoustical sources in the 30–100-Hz frequency range was assumed; acoustic signals of 30–40-Hz frequency will travel antipodal distances with little attenuation (Dushaw and Menemenlis 2014). The sound-speed structure of the North Atlantic is more conducive to ray paths that turn at a wide variety of upper and lower depths than the North Pacific. Ray paths typically turn in the upper 100–500 m, or are surface reflected, and they have lower turning depths of 3000–5000 m. Good separation of the ray arrivals in travel time is apparent, indicating that many rays could be resolved in tomographic data. On paths such as Senegal to Bermuda (Fig. 4), the ray predictions suggest the acoustic sampling would not measure the upper few hundred meters. In this case, the acoustic propagation is bottom limited by the Mid-Atlantic Ridge. The varied depths of ray turning in the upper and lower ocean suggest that tomography in the Atlantic would likely offer a degree of resolution of depth-dependent structure, but such possibilities are set aside for the present analysis. Tomography has an inherent “up-down ambiguity” (Munk and Wunsch 1982b), such that observed variations in acoustic travel time cannot be used to distinguish between upper- and lower-ocean temperature changes.
For the purposes of this study, all the technical details of acoustic propagation were set aside, and tomography was assumed to provide a simple measure of temperature, averaged over range and over a 0–3000-m-depth interval. While such an approximation disregards any number of acoustical properties and the details of resolution characteristics, it encompasses the essential nature of the acoustical measurement. For the purposes of illustrating the response of a basin-scale mapping problem to the acoustical measurements, assuming that tomography represents such a simple average measurement is adequate. The assumption avoids the extensive technical discussions regarding the details of acoustic propagation and inverse computation that would be eventually required.
The tomography uncertainty stems from a combination of measurement noise and representation error. Only a single datum was used for each acoustic path. The uncertainty employed here was made using the assumption that 30–50 transmissions—for example, 10 per week—would be employed in estimating the temperature average on any given path and month. A single transmission along a basin-scale path in the North Pacific obtained 12 m °C as a formal estimated uncertainty for the depth-averaged temperature (Dushaw 1999). The uncertainty of an average of 36 such measurements would be 2 m °C, which is the uncertainty value that was used here for all acoustic paths.
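The averaging arithmetic behind the 2 m °C figure is simply the 1/√N reduction for independent measurements:

```python
import math

single = 12.0  # m degC, formal uncertainty of one transmission (Dushaw 1999)
n = 36         # transmissions per month, within the assumed 30-50 range

# Averaging N independent measurements reduces the uncertainty by 1/sqrt(N)
monthly = single / math.sqrt(n)
print(monthly)  # 2.0 m degC, the value adopted for all acoustic paths
```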
5. SVD of model covariance: Basis functions and statistical weights for mapping
The ECCOv4 monthly mean potential temperature fields span 24 years, giving 288 realizations for the evolution of the global ocean over this time interval. By definition, the state estimate comprises an optimal estimate for the actual ocean between 1992 and 2016, consistent with known dynamics, estimated forcing, and available data. It represents a nearly ideal ocean environment for the purposes of this study. We seek to derive a large set of 3D functions and their associated statistical weightings as a functional model for ocean variability from this state estimate using an SVD of its covariance (Fukumori and Wunsch 1991; Hannachi et al. 2007). These functions will empirically model the gamut of large-scale ocean temperature variability over a large area of the North Atlantic, from the near-surface mixed layer to the North Atlantic Current and other current systems to the abyss. Since the aim was to assess the measurement of ocean temperature variability, the time mean was removed from the temperature fields. The seasonal cycle was also removed by subtracting the average temperature by month. The seasonal cycle is well determined, and it would otherwise dominate the statistical analysis.
The ECCOv4 state estimate for potential temperature consists of a time series of fields θ(x, y, z, t) over spatial points (x, y, z) and times t. To compute covariances or SVDs of these fields, they were resized spatially to two-dimensional arrays F, with one column per monthly field. An array F had typical dimensions of about 30 000 × 288. The covariances were computed by C = ⟨FF^T⟩, where the brackets indicate a time average. An SVD of the 288 ECCOv4 monthly temperature fields obtained only 276 empirical orthogonal functions (EOFs), however. Each EOF has length about 30 000, which can be resized to a 3D spatial function. There were only 276 degrees of freedom, perhaps 288 less 12 for subtracting the monthly mean temperatures. This limitation is also true for multivariable state estimates—for example, temperature, current, sea surface height, carbon content, etc.—and irrespective of domain size, whether a particular region or the global ocean. Setting aside the nuances of data error, to determine the amplitudes of the multivariable EOFs, only 276 data of any type—for example, 276 point measurements of sea surface temperature—are formally required for a perfect resolution of this entire multivariable ocean state. This particular set of EOFs is therefore too limited and unsuitable for the purpose of assessing the resolution capabilities of the ocean observing system. This study therefore employed a temperature covariance that was modified from the ECCOv4 covariance, as described below.
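The rank limitation described here can be demonstrated with a toy calculation. This is a sketch using synthetic fields at reduced size; the array shapes, random data, and rank tolerance are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_space, n_months = 500, 288   # toy sizes; the actual F is ~30000 x 288

# Synthetic stand-in for 24 years of monthly temperature fields, resized
# to a two-dimensional array F with one column per month
F = rng.standard_normal((n_space, n_months))

# Remove the monthly climatology: the mean over the 24 realizations of
# each calendar month, as was done for the state estimate
F = F.reshape(n_space, 24, 12)
F -= F.mean(axis=1, keepdims=True)
F = F.reshape(n_space, n_months)

# Subtracting 12 monthly means leaves at most 288 - 12 = 276 degrees of
# freedom, so the covariance C = <F F^T> supports only 276 EOFs
s = np.linalg.svd(F, compute_uv=False)
rank = int(np.sum(s > 1e-8 * s[0]))
print(rank)  # 276
```

However large the spatial dimension, the number of resolvable EOFs is capped by the number of independent time realizations, which is the limitation the modified covariance is designed to work around.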
Because of limitations in available computer memory, temperature covariances were computed using a decimated state estimate. Single precision was used throughout the computation. A horizontal resolution of 102 km was used, while 15 ECCOv4 depths were used, from the surface to 3900-m depth. The domain grid size was 61 (longitude) × 34 (latitude) × 15 (depth); a domain in Cartesian coordinates was employed. Physically, this domain spanned the width of the North Atlantic between about 15° and 45°N, an area of size 6200 × 3400 km2. This domain size and grid correspond to a temperature covariance of dimensions 31 110 × 31 110, but omitting points deeper than the seafloor reduced the matrix size to 27 476 × 27 476. This array size is about the maximum possible for computing an SVD with 32 GB of computer memory using standard routines.
Often the lowest EOFs are interpreted as physical modes of variability, but such interpretation is not generally correct (Hannachi et al. 2007). The EOFs are a mixture of the natural physical modes of variability and ordinary matrix mathematics, since, irrespective of the physical system, the functions derived this way need only account for the number of degrees of freedom of the system, 276 in this case. One symptom of the limited degrees of freedom was that the spatial correlations of temperature were exaggerated (Fig. 7), while the EOFs themselves suggested significant correlations between distant points. These pathologies were remedied by artificially localizing the ECCOv4 covariance by applying a Gaussian filter (Fig. 7). The Gaussian filter had the effect of suppressing distant correlations, but the cost was that the covariance no longer respected the dynamical constraints of the ECCOv4 solution. Rather, aside from some modest residual local structure, the filtered covariance was agnostic with respect to the model dynamical and forcing constraints. The resulting EOFs were better suited for a system to be used for objectively testing measurement resolution, however.
While it was initially hoped that this filter would be required only in the horizontal, preserving the ECCOv4 vertical correlations, the unmodified vertical correlations also proved to be unworkable. While there are justifications for possible substantial vertical correlations, such that near-surface variability could indicate abyssal variability (e.g., Lacasce 2017; Frajka-Williams 2015), retaining the vertical correlations introduced other obvious pathologies. The covariances at all depths were dominated by the large near-surface variances. Maps derived by fitting Argo data, described below, were able to perfectly resolve abyssal temperature variability to 4000 m, even though those profiles extended to only 2000-m depth. The ECCOv4 covariances in the vertical were therefore also filtered to suppress excessive vertical correlations.
In summary, the distant horizontal and vertical correlations were suppressed by weighting the ECCOv4 covariance with a Gaussian function of the separation between pairs of domain grid points. In the horizontal, this weighting corresponds to a correlation width of about 2400 km zonally and about 940 km meridionally, irrespective of depth (Fig. 7). In the vertical, the weighting corresponds to smaller vertical scales (500 m) near the surface and larger vertical scales (2000 m) at abyssal depths. This weighting was convolved with the model covariance, so the effective correlations were a combination of the weighting values and the characteristics of the model covariance. Most of the dynamical information or correlation inherent in the state estimate was removed, though not entirely. The covariance modified this way retained the state estimate variances, however; hence, it preserved the natural ocean variability.
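The localization step can be sketched as an elementwise (Schur) product of the covariance with a Gaussian taper. This is shown in one horizontal dimension for clarity; the grid, scales, and toy covariance are assumptions, and the actual filter was three-dimensional with direction-dependent scales:

```python
import numpy as np

def localize(C, x, scale):
    """Taper covariance C with a Gaussian of grid-point separation.

    The elementwise (Schur) product leaves the diagonal, i.e., the
    variances, unchanged while suppressing distant covariances.
    """
    dx = x[:, None] - x[None, :]
    W = np.exp(-(dx / scale) ** 2)
    return C * W

# Toy example: a covariance with unrealistically long-range correlations
x = np.linspace(0.0, 6200.0, 63)     # km, roughly the zonal domain extent
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 5000.0)
Cloc = localize(C, x, scale=2400.0)  # ~2400-km zonal correlation width

# Variances preserved; basin-spanning correlations strongly suppressed
assert np.allclose(np.diag(Cloc), np.diag(C))
assert Cloc[0, -1] < 0.01 * C[0, -1]
```

One convenient property of this construction is that the elementwise product of two positive semidefinite matrices is itself positive semidefinite (the Schur product theorem), so the localized matrix remains a valid covariance.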
The lowest-order EOFs had their largest variations either within the near-surface ocean or within the North Atlantic Current system. Higher-order modes had oscillatory structures over the North Atlantic, with increasing variations at depth. Thus, the larger the mode number, the shorter the scale of the variations, and the deeper the extent of the mode's influence. Since the set of these modes was used to model all large-scale North Atlantic temperature variability, all modes were necessary; there was no real natural mode cutoff. Abyssal variability was accounted for by the high-order modes; such variability comprised weak, but important, temperature variance. The SVD spectrum, indicating the RMS of the mode amplitudes, showed the effects of the correlation filters (Fig. 8). The spectrum for the original ECCOv4 covariance had a sharp falloff after mode 276; higher-order modes were not defined. After filtering the covariance to suppress distant correlations horizontally, the resulting spectrum fell off gradually, with no obvious spectrum cutoff point. The additional filter applied to the vertical covariance caused a more gradual falloff in the spectrum. For this study, 5000 EOFs were employed.
6. Argo data and noise covariance
To ensure an accurate representation of Argo sampling, Argo profiles for the North Atlantic were obtained from the GODAE data server (http://www.usgodae.org) for the first six months of 2015 and 2017. Considerable effort has been made, and is ongoing, in estimating the large-scale temperature and salinity fields using Argo data to assess data quality, to determine the adequacy of Argo sampling, and for comparison of Argo data to other datasets, for example, Roemmich and Gilson (2009). For the objective maps here, Argo positions for months in 2017 were used to define a typical set of Argo measurements, but ideal profiles to 2000-m depth, or to the ocean depth in shallower regions, were assumed. Data from 2015 were used together with the ECCOv4 state estimate to determine the data noise covariance.
Argo profile data can be obtained at small depth intervals, but in the context of the objective mapping exercise here, these data are not independent. Rather, there are significant correlations in depth, such that the effective number of independent data (Ziȩba 2010) is small. If data are used at finely spaced intervals, treating the values as independent, then the objective map lends the Argo profiles far more resolution than is real. The formal uncertainties get reduced by 1/√N for N independent data points. Hypothetically, if the uncertainty of an Argo profile is 1°C, temperature noise is highly correlated over all depths, and 200 data points are used for the profile (values at 10-m intervals to 2000 m), then the uncertainty gets artificially reduced in the least squares fit to 1°C/√200 ≈ 0.07°C. Without properly assessing the effective number of independent data, the effective resolution contributed by Argo profiles would be far greater than is physical.
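The artificial 1/√N reduction can be made concrete using the hypothetical numbers above, together with the roughly five effectively independent points per profile found in section 6:

```python
import math

sigma = 1.0    # degC, hypothetical uncertainty of one Argo profile
n_naive = 200  # values at 10-m intervals to 2000 m, treated as independent
n_eff = 5      # effectively independent points per profile (section 6)

naive = sigma / math.sqrt(n_naive)    # ~0.07 degC: far too optimistic
realistic = sigma / math.sqrt(n_eff)  # ~0.45 degC
print(naive, realistic)
```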
The noise of Argo profiles can be directly determined by computing the difference between the Argo data and the ECCOv4 state estimates (Figs. 9, 10); this residual is the noise by definition. Noise in this case is not the instrument noise but mostly reflects representation error, that is, the fact that the model employed for ECCOv4 did not resolve the smaller scales of ocean variability. For Argo floats, noise is much greater than signal.
An SVD of the Argo–ECCOv4 misfit showed that the noise was typically dominated by five to six significant EOFs (Figs. 9, 10). It was therefore conjectured that Argo profiles convey information content to this extent; that is, each Argo profile conveyed five to six independent points of information to the ECCOv4 estimate. For the objective maps, Argo floats were assumed to provide independent data at the five depths of 5, 193, 477, 814, and 1517 m, roughly corresponding to the maximum values of the lowest EOFs (Figs. 9, 10) and the ECCOv4 layer depths.
Argo noise is obviously much larger in the region of the North Atlantic current. Within the domain employed for this study, Argo profiles were therefore separated into those north and south of 35°N. The noise covariances (°C²) for north (C1) and south (C2) of 35°N at the depths given above were
Both noise variances and correlations were much larger for profiles north of 35°N. These covariances were used as data noise covariances for Argo profiles in the mapping computations described below.
Another obvious correlation, ignored in this analysis, is that the profiles obtained by an Argo float in any particular month were usually correlated. In both Figs. 9 and 10, the residual profiles occurred in groups of about three, corresponding to the number of profiles obtained by an Argo float in a month. These sequential profiles were evidently correlated; hence, they represented redundant information. For the mapping exercise here, however, all profiles were employed and assumed to be independent. Within the mapping domain, some 700–800 profiles were obtained in any given month in 2017.
7. Objective mapping using Argo and tomography data
Using the set of EOFs derived from an SVD of the localized ECCOv4 covariance as a basis set, the computations of objective maps using Argo or tomography data were simple least squares problems. In this case, an objective map, or inverse, is an estimate for the 3D temperature of the model domain. For the present analysis, however, the map itself was not computed. Rather, the quantity of interest was the uncertainty estimate for the EOF amplitudes and the uncertainty for whatever derived quantity was to be estimated, for example, basin-averaged upper-ocean temperature. We did not seek to derive accurate estimates for the ocean temperature field per se; rather, we sought to understand how the estimated uncertainties respond to various measurement configurations. In particular, we sought to determine, within a realistic, self-consistent mapping system, how the addition of tomographic measurements would reduce the resulting uncertainties.
The least squares problem can be reduced to the equation
y = Ex + n,
where y is the vector of data, E is the forward problem matrix, x is the vector of model parameters to solve for, and n is the data noise. The data y, a vector of length M, consist of the set of either Argo float measurements of temperature at a depth or tomography measurements of average temperature. The forward problem matrix E is obtained by computing the data-equivalent measurements of the 5000 individual EOFs; hence, this matrix has dimension M × 5000. The model parameters x are the amplitudes of the EOFs, a vector of length 5000. The data noise n is a concatenation of the noise values for Argo or tomography, described above; n has dimension M.
The least squares solution is weighted by a prior estimate for the model covariance P and by the data noise covariance Cnn. This solution is derived from the inverse operator applied to the data vector y,
x̂ = PEᵀ(EPEᵀ + Cnn)⁻¹y.
In this case, the prior model covariance P is just the variances of the mode amplitudes determined by the SVD, a diagonal matrix. The noise covariances for the Argo profiles were described above. For this analysis, the essential results from the inverses were the model parameter error covariance matrices Cxx of size 5000 × 5000, where
Cxx = P − PEᵀ(EPEᵀ + Cnn)⁻¹EP.
Suitable mapping or averaging operations that transform the inverse solution for mode amplitudes to variables of interest are applied similarly to the error covariance Cxx to derive error covariances for those variables.
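The uncertainty calculation can be sketched at toy scale, assuming the standard Gauss–Markov form of the weighted least squares inverse. The dimensions and covariances below are synthetic placeholders (50 modes, 30 data), not the 5000-mode system of the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dimensions: n_mode EOF amplitudes constrained by n_data measurements.
n_mode, n_data = 50, 30
E = rng.standard_normal((n_data, n_mode))      # forward problem matrix
P = np.diag(rng.uniform(0.5, 2.0, n_mode))     # prior model covariance (diagonal)
Cnn = 0.5 * np.eye(n_data)                     # data noise covariance

# Gauss-Markov inverse operator and posterior (error) covariance:
#   x_hat = P E^T (E P E^T + Cnn)^(-1) y
#   Cxx   = P - P E^T (E P E^T + Cnn)^(-1) E P
G = P @ E.T @ np.linalg.inv(E @ P @ E.T + Cnn)
Cxx = P - G @ E @ P

# A derived metric (e.g., a large-scale average) is a linear functional a^T x;
# its error variance follows directly from the error covariance.
a = np.ones(n_mode) / n_mode
prior_var = a @ P @ a
post_var = a @ Cxx @ a
print(post_var / prior_var)  # < 1: the data reduce the metric's uncertainty
```

Note that Cxx, and hence the uncertainty of any derived metric, depends only on E, P, and Cnn, not on the data values themselves, which is why the maps themselves never need to be computed in this study.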
Examples of the mapped estimated uncertainties for a combination of Argo profiles from January 2017 and tomography data from hypothetical arrays 1–3, introduced above, are given in Fig. 11. The figure shows mapped 3D uncertainty in two panels: at 300-m depth across the North Atlantic and a transbasin section in depth and range along 26°N. Such maps show just the diagonal of the error covariance. At the top of the figure, the a priori RMS is indicated in the two panels, or the uncertainty in the case of no available data. This prior uncertainty is equal to the RMS of the ECCOv4 state estimate. The middle panels show the uncertainty estimate, which is reduced from the prior value by the available measurements. The uncertainties are reduced where there are point measurements and slightly reduced, broadly, where there are line-average measurements. The bottom panels show the relative uncertainty, or the ratio of the estimated uncertainty to the prior RMS values, indicating a modest reduction of uncertainty across the North Atlantic. The tomography array 3, the two 1200-km paths on 26°N, contributes significantly, of course, to broadly reducing the uncertainty where those measurements occur; tomography provides information on the average, rather than on values at particular points. Such error covariances and uncertainty maps were computed for a variety of measurement configurations.
8. Inverse metrics and uncertainty response
The mapped uncertainties indicate the uncertainties at particular points, but such results are of limited use in determining the effectiveness of a particular measurement configuration. Rather, we require some specific defined quantities of special interest, or metrics, that can be derived from the maps, together with their uncertainties. An example of a sophisticated metric is in the OSSE of Halliwell et al. (2017), a determination of how different ocean measurements could improve ocean model initialization for better predictions of tropical cyclones. Here, the simple metric of basin-averaged upper-ocean temperature, equivalent to ocean heat content, was employed. This quantity has been of particular interest in determining the climatological changes to the world’s oceans (Levitus et al. 2012; Balmaseda et al. 2013), and it is a quantity that is naturally derived from the acoustic measurements. While the formal uncertainty of such a simple metric is a number that can be used to objectively compare different measurement configurations, the ultimate test occurs in the context of data assimilation and the assessment of the accuracy of the resulting state estimate.
The uncertainties for basin- and upper-ocean averaged temperature for three areas and different measurement configurations are given in Table 1. The uncertainties are given as standard errors (m °C); values in variance can be readily determined. The average was computed over the two areas west and east of the Mid-Atlantic Ridge at 26°N and over the entire basin (Fig. 11). Uncertainties over these areas were computed for four depth intervals: 0–1000, 0–2000, 0–3000, and 3000–3900 m. The measurements have minimal resolution for the deepest interval. For the three areas and all the upper-ocean intervals, the Argo-only configuration had an uncertainty that was about 50% of the prior.
Of the three tomography arrays, tomography array 1, the array of 15 paths in the western North Atlantic, obtained an uncertainty for the western Atlantic area (7.0 m °C for the 0–3000-m-depth-interval case) that is comparable to Argo (8.8 m °C), about 50% of the prior (17 m °C). When both Argo and tomography were included, the uncertainty for the temperature of the western Atlantic was halved (3.9 m °C).
All tomography observations, the three configurations described above, provided modest resolution for the basinwide average. For these configurations much of the North Atlantic was unobserved. The best resolution was afforded by tomography array 1, but even the single basin-scale acoustic path provided about 10% resolution for the average. The combinations of Argo and tomography arrays reduced the uncertainties substantially, with about 35% reduction when all tomography configurations were employed. While all the tomography measurements in this case averaged between the surface and 3000 m, those measurements still provided significant resolution for the 0–1000-m temperature average.
It is possible to design a basinwide tomographic array to resolve the North Atlantic at the largest scales (Fig. 12). The hypothetical array in the figure consists of two acoustic sources and seven receivers. One acoustic source off Antigua and Barbuda is located at 19.1°N, 62.8°W, and the second source off the Cape Verde Islands is located at 17.3°N, 25.8°W. Hypothetical receivers are located at, clockwise from Antigua and Barbuda, 19.1°N, 62.8°W; 25.5°N, 75.9°W; 35.1°N, 73.8°W; 41.4°N, 65.4°W; 41.8°N, 50.7°W; 51.1°N, 15.4°W; and 32.4°N, 17.4°W. As is apparent from Fig. 12, these instruments provide for a coarse, but uniform, coverage of the subtropical North Atlantic. Since this array leaves no region of the North Atlantic unobserved, substantial reductions in the uncertainties were obtained from the tomography data (Table 2). For this tomographic configuration, the Argo and tomography resolutions were comparable, with the Argo resolution better for the 0–1000-m-depth average and the tomography resolution better for the greater depth intervals, in keeping with the nature of those data. Though the uncertainties for Argo and tomography separately were comparable, when both were combined the uncertainties were about halved, showing that the information provided by these two data types is not redundant.
The computation was repeated using float data for each month from January to June 2017, which yielded 10%–20% variations in the numbers without changing the relative responses of the data types. Last, the density of Argo float sampling was tripled by combining all profiles from January, March, and May; alternating months were used to avoid the redundancies of sequential Argo profiles. The greater density of float sampling reduced the uncertainties by about 25%, with significant improvement still apparent when those data were combined with the tomography measurements. Argo and tomography are complementary measurements.
The sensitivity of the objective maps to localizing the ECCOv4 covariance was tested by doubling the horizontal correlation lengths in the expression for W above and repeating the calculations. The system with the larger correlation lengths improved both the Argo and tomography resolutions by several percent (the Argo-only uncertainty for the 0–2000-m, basinwide average temperature improved from 48% to 41% of the prior), while leaving the essential nature of the analysis and its conclusions unchanged.
Uncertainties were also computed for the particular sections along 26°N (Table 3). Averaged along the two 1200-km sections east and west of the Mid-Atlantic Ridge, the tomography uncertainties are small, since such averages correspond identically to the acoustic measurements. The table shows that tomography has relatively large uncertainty for 5–1485 or 1500–3000-m-depth intervals, but small uncertainties for the full depth. As above, the least uncertainties were obtained when Argo and tomography data were combined. The computation also showed that even the basinwide array (Fig. 12) alone made modest contributions to reducing the uncertainties on this particular section.
To illustrate the nature and possible signals in acoustic travel time as might be observed by tomography in the Atlantic, time series of temperature averaged along these paths can be readily computed from the ECCOv4 state estimate. These time series can then be scaled to estimate equivalent travel times (Fig. 13). Judging by the comparisons to actual data obtained in the Pacific (Fig. 2), had actual time series been measured, they could be expected to be considerably different from these predictions.
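The scaling from path-averaged temperature anomaly to travel-time anomaly can be sketched with the usual first-order relation. The sensitivity and path length below are representative textbook values assumed for illustration, not parameters taken from the ECCOv4 computation.

```python
# Travel-time anomaly from a path-averaged temperature anomaly, to first order:
#   dtau ~= -L * (dc/dT) * dT / c**2
# Assumed representative values (not taken from the study):
c = 1500.0    # reference sound speed, m/s
dcdT = 3.2    # sound-speed sensitivity near the sound channel, m/s per deg C
L = 5.0e6     # path length, m (a basin-scale path)

def travel_time_anomaly(dT, L=L, c=c, dcdT=dcdT):
    """First-order travel-time change (s) for path-averaged anomaly dT (deg C)."""
    return -L * dcdT * dT / c ** 2

print(travel_time_anomaly(0.01))  # ~ -0.07 s for a 10 m-deg-C warming
```

The sign convention reflects that warming raises sound speed and so shortens travel times; a few hundredths of a degree of basin-averaged change maps to a readily measurable fraction of a second.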
For this study, the acoustic measurements were purposefully simplified to be represented by section averages of temperature. This approximation nevertheless captures the bulk of the information provided by tomography. The details of the acoustic propagation, acoustic sampling, and tomography inverse would add or subtract secondary aspects of resolution with depth, depending on local conditions. The near-surface ocean would be unresolved on acoustic paths where rays do not reach the surface. The varied ray turning depths on most paths would provide some resolution of the depth dependence of temperature. Similarly, the details of how Argo sampling was implemented for this study could be refined. The refined details of these measurements would leave the essential and primary conclusion of this study unchanged, however: Argo and tomography are complementary measurements. The addition of tomography to the observing system would substantially reduce uncertainties of basinwide average temperatures specifically and state estimates generally. This conclusion is consistent with the significant differences between the tomography observations obtained from 1996 to 2006 in the North Pacific and equivalent observations computed from ECCOv4 (Fig. 2). The uncertainties resulting from the existing system can be reduced by averaging over longer time intervals—for example, the 5-yr running mean of Levitus et al. (2012)—but the cost of doing so is reduced time resolution for ocean processes. The acoustical observations in the Pacific demonstrated that significant large-scale thermal variability can occur at quite short time scales.
Although this computation employed a state estimate that did not include small-scale variability, the nature of a computation employing such a state estimate may be conjectured. A basin-scale EOF computation that admits mesoscale eddies would require an order-of-magnitude-larger computer memory to be able to manage the much larger covariance matrices. Since the large-scale variability of low- and high-resolution ocean estimates would be nearly identical, both estimates being for the same ocean, the initial part of the EOF spectrum for the high-resolution estimate would be similar to the spectrum for the low-resolution estimate. The EOF spectrum for the high-resolution case would then continue at a decaying rate to higher modes, reflecting the ever-lessening variance at the smaller scales. These high-order modes lie mostly in the null space of both tomography measurements and the large-scale metrics employed here, since those quantities are large-scale averages. Argo point measurements project equally well on all modes, however, reflecting the sensitivity of the Argo data to small-scale variability. To compute the inverse, the tomography data uncertainties would be essentially unchanged from the low-resolution model computation, while the Argo data uncertainties would have to be reduced to account for the different representation uncertainty. In essence, the very small unresolved processes, such as internal-wave variability, become the noise for Argo data. In the inverse computation, most of the Argo information would go to constraining the mesoscale variability, although rather poorly, given the sparseness of the Argo sampling. That most of the Argo information goes to constraining the small scales is essentially equivalent to the relatively large uncertainties assumed for Argo data in the low-resolution computation.
For the large-scale average metrics employed for this study, the responses of the uncertainties to the different measurement configurations are therefore likely to be about the same for the low- and high-resolution ocean state estimates.
Two common objections to employing acoustic tomography are its possible effects on marine life and its costs, but these objections are specious. These two issues are addressed in turn. The 1996–2006 ATOC program included a substantial multiyear marine mammal observation program that reached the conclusion that its acoustic transmissions had no significant biological impact (Frankel and Clark 2002; Mobley 2005). Basic physical considerations indicate why this is so. The acoustic power of ATOC sources is about 250 W, or about the same intensity as the sound made by a blue whale. The tomographic signals are transmitted at approximately 50–100-Hz frequency, which is outside the frequency bands employed by marine mammals. In addition, tomographic sound sources are usually deployed near the sound channel axis at around 1000-m depth; hence, sound levels at the surface are reduced by spherical spreading. The signal level relative to the level at the source is 20 log10(1/r), so r = 1000 m gives a level at the surface 60 dB below that at the source. The tomographic signals employ lengthy coded transmissions together with signal processing, thus spreading the acoustic energy out over a time of 10–20 min and requiring less absolute intensity. The transmissions have a low duty cycle, such that, operationally, transmissions could likely be reduced to just several transmissions per week. Obtaining the appropriate government permits for the operation of tomographic acoustic sources is now routine. There are now decades of experience with tomographic and other similar acoustic sources, with no evidence for harmful effects. While vigilance for possible adverse effects should be maintained, precluding deployment out of fear of unlikely, hypothetical scenarios for how marine life can be impacted by these relatively weak acoustical signals is unreasonable.
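The spherical-spreading figure is a simple worked computation, sketched below; the 1000-m range corresponds to the nominal sound channel axis depth cited above.

```python
import math

def spreading_loss_db(r, r0=1.0):
    """Spherical-spreading transmission loss (dB) at range r relative to r0 (m)."""
    return 20.0 * math.log10(r / r0)

# Source near the sound channel axis at ~1000-m depth: the level directly
# above it at the surface is reduced by 20 * log10(1000) = 60 dB.
print(spreading_loss_db(1000.0))  # 60.0
```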
The costs for sustaining a tomography system are less than those for other global-scale measurements. Satellite missions cost a few $100 million (U.S. dollars), and, over the past 20 years, Argo floats have cost $500 million (4000 floats at $25,000 each, redeployed every 4 years for 20 years), not counting deployment costs. Operational tomography costs less than either, perhaps $2–$3 million for a cabled acoustic source and less for a receiver. The cost profiles for tomography and Argo over time differ, however, with tomography being more similar to a satellite program in having relatively large start-up costs. Once deployed, however, acoustic instrumentation lasts for decades. The acoustic source deployed over 20 years ago north of Kauai used by the ATOC project is still operational, though acoustic transmissions ceased in 2006. The cumulative costs over time for tomography are significantly less than for the satellite or Argo programs. To the extent that tomography offers a 50% reduction in uncertainty at a fraction of the cost of the Argo program, it is certainly a cost-effective measurement.
This simplified study had the limited purpose of demonstrating the ability of a tomographic system for reducing the observational uncertainties of the existing ocean observing system for temperature at large spatial and temporal (monthly) scales. Many other possible applications of tomography require further consideration. Arrays or particular acoustical lines can be designed or selected to observe regions or phenomena of particular interest. The use of tomography to measure gyre-scale currents and relative vorticity—for example, the circulation of the AMOC—by exploiting the acoustical sampling of the abyssal ocean, requires a more thorough analysis. As shown above, tomography and Argo are complementary measurements, but, in addition, the former is Eulerian in character, while the latter is Lagrangian, with different properties for monitoring ocean gyres. Other advantages of a basin-scale system are that it provides for an underwater global positioning system (UGPS) for tracking positions of, for example, abyssal Argo floats, and that such a system provides a measurement of temperature independent from Argo for guarding against systemic errors. Exploiting the acoustical observations to optimize their use and effectiveness will require closer, active cooperation between acousticians and oceanographers.
The author was supported by the U.S. Office of Naval Research (Grant N00014-15-1-2186). Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the author and do not necessarily reflect the views of the Office of Naval Research. L. Bertino, B. Cornuelle, and D. Menemenlis provided valuable discussion and advice for this project. B. Howe, E. Rémy, and E. Skarsoulis provided constructive comments on the manuscript.