Abstract
We analyze the calibration stability of 17 years of precipitation radar (PR) data from the Tropical Rainfall Measuring Mission (TRMM) satellite in order to develop a precipitation climate record from the spaceborne precipitation radars of TRMM and subsequent satellite missions. Because the PR measures the normalized radar cross section (NRCS) of the ocean surface, a temporal change in the NRCS at incidence angles where its variability is insensitive to the sea surface wind can be regarded as a temporal change in the PR calibration. The temporal change of the PR calibration in TRMM, version 7, is found to be 0.19 dB decade⁻¹ from 1998 to 2013. A simple adjustment for this calibration change is applied to evaluate the NRCS time series and the near-surface precipitation trend within the latitudinal band between 35°S and 35°N. The NRCS time series at nadir and off-nadir are uncorrelated before the calibration adjustment but correlated after it. The 0.19 dB decade⁻¹ change in the PR calibration causes an overestimation of 0.08 mm day⁻¹ decade⁻¹, or 2.9% decade⁻¹, in the linear trend of near-surface precipitation. Even after the adjustment, agreement among the satellite products depends on the analysis period, so the temporal stability of data quality is also important for a plausible trend analysis. The reprocessing of the PR data in TRMM, version 8 (and later), incorporates the temporal adjustment of the calibration change based on the results of this study, which can provide more credible data for long-term precipitation analysis.
Significance Statement
The stability of long-term data records is essential for climate research, so temporal changes in sensor calibration must be accounted for. In this study, we investigate the calibration stability of the TRMM PR data and evaluate its impact on precipitation trend analysis. The temporal change of the PR calibration is estimated to be 0.19 dB decade⁻¹. Compensating for this change improves the consistency of precipitation trends between the PR and other precipitation datasets. The reprocessed PR data provide more credible data for long-term precipitation analysis.
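As a rough illustration of the adjustment described above (a minimal sketch, not the authors' code; the synthetic NRCS series, its noise level, and the sign convention are assumptions), a linear 0.19 dB decade⁻¹ drift can be removed from a monthly time series before a trend is fit:

```python
# Sketch: remove a linear calibration drift from a synthetic NRCS series
# (1998-2013, monthly) and compare trends before and after the adjustment.
import numpy as np

DRIFT_DB_PER_DECADE = 0.19              # calibration change reported above
years = np.arange(192) / 12.0           # 16 years of monthly samples

rng = np.random.default_rng(0)
nrcs_db = 10.0 + 0.05 * np.sin(2 * np.pi * years) + rng.normal(0, 0.05, years.size)
nrcs_db += DRIFT_DB_PER_DECADE * years / 10.0          # spurious drift

nrcs_adjusted = nrcs_db - DRIFT_DB_PER_DECADE * years / 10.0

trend_raw = np.polyfit(years, nrcs_db, 1)[0] * 10.0    # dB per decade
trend_adj = np.polyfit(years, nrcs_adjusted, 1)[0] * 10.0
print(f"trend before adjustment: {trend_raw:+.2f} dB/decade")
print(f"trend after adjustment:  {trend_adj:+.2f} dB/decade")
```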
Abstract
The density of newly fallen snow ρN is an important parameter for assessing accumulated snowfall depth. We examined the relationships between X-band polarimetric radar parameters and ρN in dry snow cases with ground temperatures below 0°C. Our study was based on observations in Niigata Prefecture, Japan, along the coast of the Sea of Japan. This region is subject primarily to sea-effect snow during the winter monsoon season, and convective clouds and rimed snow are common. We assumed that the snow particles accumulating on the ground originated above the altitude at which the temperature is −15°C, and we focused on the ratio of the specific differential phase KDP to the radar reflectivity Zh, which is influenced by both particle aspect ratio and the inverse of particle size. We found that KDP/Zh at the −15°C altitude was larger for lower ρN values, and its correlation with ρN was the strongest among the polarimetric parameters we examined. Because aggregation has not progressed at this altitude, the ratio highlights differences in ice crystal flatness rather than differences in size. On the basis of this result, we propose an empirical relationship between KDP/Zh at the −15°C altitude and ρN at the ground, facilitating the estimation of snowfall depth by combining the estimated ρN with the liquid-equivalent snowfall rate derived from, for example, Zh or KDP.
Significance Statement
This study aims to estimate the density of newly fallen (just-accumulated) snow from polarimetric radar observations. Knowing the density of newly fallen snow helps determine snowfall depth accurately. Focusing on polarimetric parameters at the altitude where the temperature is −15°C, we conducted radar and ground-based observations of snow particles and found that the density of newly fallen dry snow can be estimated. By focusing on these higher altitudes, we were able to highlight differences in ice crystal flatness before aggregation progressed.
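The final step described in the abstract can be sketched as follows; the power-law form and its coefficients are placeholders rather than the paper's fitted relationship, and only the unit conversion from liquid-equivalent rate to depth rate is exact:

```python
# Sketch: estimate rho_N from KDP/Zh at the -15°C level (hypothetical fit),
# then convert a liquid-equivalent snowfall rate to a snowfall depth rate.

def density_from_kdp_zh(kdp_over_zh, a=0.02, b=-0.35):
    """Hypothetical power law rho_N = a * (KDP/Zh)**b [g cm^-3]; b < 0
    encodes the reported inverse relation (larger KDP/Zh, lower rho_N)."""
    return a * kdp_over_zh ** b

def snow_depth_rate(liquid_rate_mm_h, rho_n_g_cm3):
    """Depth rate [cm h^-1] from liquid-equivalent rate [mm h^-1]:
    1 mm h^-1 of water equals 0.1 g cm^-2 h^-1 of mass flux."""
    return 0.1 * liquid_rate_mm_h / rho_n_g_cm3

rho_n = density_from_kdp_zh(0.01)   # arbitrary KDP/Zh value -> ~0.1 g/cm^3
print(f"depth rate ~ {snow_depth_rate(1.0, rho_n):.1f} cm/h at 1 mm/h liquid")
```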
Abstract
Of the 45 radars that compose the Terminal Doppler Weather Radar (TDWR) network, 21 are located in areas of complex terrain. Their mission to observe low-level wind shear at major airports prone to accidents induced by strong shear puts them in an ideal position to fill critical boundary layer observation gaps within the NEXRAD network in these regions. Retrievals such as the velocity azimuth display and velocity volume processing (VVP) are used to create time-height profiles of the boundary layer from radar conical scans, but they assume that a wide area around the radar is horizontally homogeneous, an assumption rarely met in regions of complex terrain. This paper introduces a limited-radius VVP retrieval that makes these profiling techniques informative for flows affected by topography. The retrieval can be applied to any operational radar to help examine critical boundary layer processes. VVP retrievals were derived from the TDWR serving Salt Lake City International Airport (TSLC) during a summertime high-ozone period. These observations resolved thermally driven circulations and variations in boundary layer depth at high vertical and temporal resolution and provided insight into their influence on air quality.
Significance Statement
Residents of many urban areas in the United States are exposed to elevated ozone concentrations during the summer months. In complex terrain, thermally driven circulations and terrain-forced flows affect chemical processes by modulating mixing and transport. A novel technique for monitoring local boundary layer conditions at small horizontal length scales was applied to data from the Terminal Doppler Weather Radar near Salt Lake City International Airport during a multiday high-ozone event, and the effects of these flows on ozone concentrations are illustrated. The technique can be applied to other operational weather radars to create long-term and real-time records of near-surface processes at high vertical and temporal resolution.
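A minimal sketch of the limited-radius idea, assuming the standard zeroth-order (uniform-wind) VVP model fit by least squares; an operational retrieval typically also fits wind-gradient terms, which are omitted here:

```python
# Sketch: fit (u, v, w) to radial velocities from a conical scan, keeping
# only gates within a limited horizontal radius of the radar.
import numpy as np

def vvp_uniform_wind(vr, az_deg, el_deg, rng_km, max_radius_km=10.0):
    """Least-squares fit of Vr = u sin(az) cos(el) + v cos(az) cos(el)
    + w sin(el), restricted to gates within max_radius_km."""
    keep = rng_km * np.cos(np.radians(el_deg)) <= max_radius_km
    az, el = np.radians(az_deg[keep]), np.radians(el_deg[keep])
    A = np.column_stack([np.sin(az) * np.cos(el),
                         np.cos(az) * np.cos(el),
                         np.sin(el)])
    sol, *_ = np.linalg.lstsq(A, vr[keep], rcond=None)
    return sol                                    # (u, v, w), units of vr

# Synthetic check: a pure 5 m/s westerly on a 2-degree conical scan
az = np.arange(0.0, 360.0, 1.0)
el, rng = np.full_like(az, 2.0), np.full_like(az, 8.0)
vr = 5.0 * np.sin(np.radians(az)) * np.cos(np.radians(el))
print(np.round(vvp_uniform_wind(vr, az, el, rng), 3))   # ~ [5. 0. 0.]
```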
Abstract
The Geostationary Lightning Mapper (GLM) has provided unprecedented observations of total lightning since becoming operational in 2017. The potential for GLM observations to be used in forecasting and analyzing tropical cyclone (TC) structure and intensity has been complicated by inconsistencies in the GLM data arising from a number of artifacts. The algorithm that processes raw GLM data has improved with time; however, the need for a consistent long-term dataset has motivated the development of quality control (QC) techniques to remove clear artifacts such as blooming events, spurious false lightning, “bar” effects, and sun glint. Simple QC methods are applied that include scaled maximum energy thresholds and minima in the variance of lightning group area and group energy. QC and anomaly detection methods based on machine learning (ML) are also explored. Each QC method successfully removes artifacts while maintaining the fidelity of the GLM observations within TCs. As the GLM processing algorithm has improved, the fraction of QC-flagged lightning within 100 km of Atlantic TCs has decreased from 70% in 2017 to 10% in 2018 and 2% in 2021. These QC methods are relevant to the design of ML-based forecasting techniques, which could learn the artifacts rather than the signal of interest in TCs if QC were not applied beforehand.
Significance Statement
The Geostationary Lightning Mapper (GLM) provides total lightning observations in tropical cyclones that can benefit forecasts of intensity change. However, nonlightning artifacts in GLM observations make the data difficult for automated intensity prediction techniques to interpret. Quality control procedures have been developed to aid the TC community in using GLM observations for statistical and pattern-matching techniques.
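A hedged sketch of what a threshold-style QC pass could look like; the field names, window length, and threshold values are placeholders, not the paper's operational settings:

```python
# Sketch: flag GLM groups by a maximum-energy threshold (blooming/glint-like
# artifacts) and by low-variance runs of group area/energy ('bar'-like
# artifacts). True in the returned array means flagged as an artifact.
import numpy as np

def qc_flag_groups(energy_j, area_km2, energy_max=1e-13,
                   min_var_area=1.0, min_var_energy=1e-30, window=50):
    energy_j = np.asarray(energy_j, float)
    area_km2 = np.asarray(area_km2, float)
    flagged = energy_j > energy_max
    for i in range(0, energy_j.size - window + 1, window):
        s = slice(i, i + window)
        if (np.var(area_km2[s]) < min_var_area
                or np.var(energy_j[s]) < min_var_energy):
            flagged[s] = True
    return flagged
```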
Abstract
The Ka-band radar interferometer (KaRIn) on the Surface Water and Ocean Topography (SWOT) satellite, launched in December 2022, is providing the first two-dimensional altimetric views of sea surface height (SSH). Measurements are made across two parallel swaths, each 50 km wide, separated by a 20-km gap. In the data product that will be used for most oceanographic applications, SSH estimates with a footprint diameter of about 3 km are provided on a 2 km × 2 km grid. Early analyses of in-flight KaRIn data conclude that the instrumental noise for this footprint diameter has a standard deviation of less than σ3km = 0.40 cm for a significant wave height of 2 m, a factor of 2.3 better than the prelaunch expectation based on the science requirement specification. The SSH fields measured by KaRIn allow the first satellite estimates of essentially instantaneous surface current velocity and vorticity, computed geostrophically from SSH. The effects of instrumental noise on smoothed estimates of velocity and vorticity, based on early postlaunch assessments, are quantified here as functions of the half-power filter cutoff wavelength of the smoothing. Signal-to-noise ratios for smoothed estimates of velocity and vorticity are determined from simulated noisy KaRIn data derived from a high-resolution numerical model of the California Current System. The wavelength resolution capabilities for σ3km = 0.40 cm are found to be about 17 and 35 km for velocity and vorticity, respectively, corresponding to feature diameters of about 8.5 and 17.5 km; these are better than the prelaunch expectations by about 45% and 35%.
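The geostrophic computation referenced above follows u = −(g/f)∂η/∂y, v = (g/f)∂η/∂x, and ζ = ∂v/∂x − ∂u/∂y; a minimal sketch on a synthetic SSH field at KaRIn's 2-km grid spacing (the f-plane value and eddy parameters are assumptions):

```python
# Sketch: geostrophic velocity and relative vorticity from gridded SSH via
# centered finite differences, on a synthetic Gaussian eddy.
import numpy as np

G, F = 9.81, 1e-4            # gravity (m s^-2); midlatitude Coriolis (s^-1)
DX = DY = 2000.0             # grid spacing (m), matching the 2 km x 2 km grid

x = np.arange(0, 100e3, DX)
y = np.arange(0, 100e3, DY)
X, Y = np.meshgrid(x, y)
ssh = 0.03 * np.exp(-((X - 50e3) ** 2 + (Y - 50e3) ** 2) / (2 * 15e3 ** 2))

deta_dy, deta_dx = np.gradient(ssh, DY, DX)   # axis 0 is y, axis 1 is x
u = -(G / F) * deta_dy                        # zonal geostrophic velocity
v = (G / F) * deta_dx                         # meridional geostrophic velocity
du_dy, _ = np.gradient(u, DY, DX)
_, dv_dx = np.gradient(v, DY, DX)
zeta = dv_dx - du_dy                          # relative vorticity (s^-1)
print(f"max speed {np.hypot(u, v).max():.3f} m/s, "
      f"max |zeta/f| {np.abs(zeta / F).max():.3f}")
```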
Abstract
In remote sensing imaging systems, stripe noise is a pervasive issue caused primarily by the inconsistent response of multiple detectors. Stripe noise not only degrades image quality but also severely hinders subsequent quantitative products and applications, so it is crucial to eliminate stripe noise while preserving detailed structural information. Although existing destriping methods are effective to some extent, they still suffer from loss of image detail, image blur, and ringing artifacts. To address these issues, this study proposes an image stripe correction algorithm based on weighted block sparse representation. The method applies techniques such as a differential low-rank constraint and an edge weight factor to remove stripe noise while retaining image detail. The algorithm uses the alternating direction method of multipliers (ADMM) to solve the minimax concave penalty (MCP)-regularized least squares optimization model, improving processing efficiency. The results have been applied and validated on imager data from the Medium Resolution Spectral Imager (MERSI-II) onboard the Fengyun-3D satellite, the Advanced Geosynchronous Radiation Imager (AGRI) multichannel scanning radiometer onboard the Fengyun-4A satellite, and the Microwave Radiation Imager-Rainfall Mission (MWRI-RM) precipitation microwave radiometer onboard the Fengyun-3G satellite. Compared with typical stripe correction methods, the proposed method achieves better stripe removal while preserving image detail. The destriped image data can be used to generate high-quality quantitative products for various applications. Overall, by combining insights from prior research with innovative techniques, this study provides a more effective and robust solution to the stripe noise problem in remote sensing and weather forecasting.
Significance Statement
Stripe noise is a persistent problem in remote sensing imaging systems, degrading image quality and hindering subsequent analysis. This study introduces a novel algorithm based on weighted block sparse representation that removes stripe noise while preserving image details. By incorporating techniques such as a differential low-rank constraint and an edge weight factor, our method achieves superior stripe removal. The approach was validated using data from the MERSI-II and AGRI instruments, demonstrating its effectiveness in enhancing image quality. This research provides a more robust solution to the stripe noise issue, benefiting various applications in remote sensing and weather forecasting.
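A toy one-dimensional analogue may clarify the optimization machinery named above (ADMM applied to an MCP-regularized least squares model); the block-sparse, low-rank, and edge-weight terms of the actual algorithm are omitted, and all parameter values are placeholders:

```python
# Sketch: ADMM for min_x 0.5*||x - y||^2 + sum_i MCP((Dx)_i; lam, gamma),
# with D the first-difference operator along a row. The MCP prox is the
# firm-thresholding operator.
import numpy as np

def mcp_prox(z, lam, gamma, t):
    """Prox of t * MCP(.; lam, gamma); requires gamma > t."""
    mag = np.abs(z)
    mid = np.sign(z) * (mag - lam * t) / (1.0 - t / gamma)
    return np.where(mag <= lam * t, 0.0, np.where(mag <= lam * gamma, mid, z))

def destripe_row(y, lam=0.05, gamma=4.0, rho=1.0, iters=100):
    n = y.size
    D = np.diff(np.eye(n), axis=0)            # (n-1, n) difference matrix
    M = np.eye(n) + rho * D.T @ D             # x-update normal equations
    x, z, u = y.copy(), np.zeros(n - 1), np.zeros(n - 1)
    for _ in range(iters):
        x = np.linalg.solve(M, y + rho * D.T @ (z - u))  # quadratic x-step
        z = mcp_prox(D @ x + u, lam, gamma, 1.0 / rho)   # MCP prox z-step
        u += D @ x - z                                   # dual ascent
    return x
```

Unlike an L1 (total variation) penalty, the MCP prox leaves large differences untouched, which is why genuine edges can survive while small stripe-induced jumps are suppressed.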
Abstract
This study investigated trends in satellite-based chlorophyll-a (Chl-a; 1998–2022), sea surface temperature (SST; 1982–2022), and sea level anomaly (SLA; 1993–2021) from the European Space Agency’s Climate Change Initiative records, integrating time series decomposition and spectral analysis. Trends signify prolonged increases, decreases, or no change over time; they are time series in the same space as the original parameters, exclude seasonality and noise, and can be nonlinear. Trend rates approximate the pace of change per unit time. We quantified trends using a conventional linear fit and three increasingly sophisticated time series decomposition methods: the simple moving average (SMA), seasonal-trend decomposition using locally estimated scatterplot smoothing (STL), and multiple STL (MSTL), across the global ocean, the Bay of Bengal, and the Chesapeake Bay. A challenge in decomposition is specifying accurate seasonal periods, which are derived here by combining Fourier and wavelet transforms. Globally, SST and SLA trend upward and Chl-a shows no significant change, yet regional variations are notable. We highlight the advantage of extracting multiple periods with MSTL and, more broadly, decomposition’s role in disentangling time series components (seasonality, trend, noise) without resorting to monotonic functions, thereby preventing episodic events from being overlooked. Illustrations include extreme events that temporarily counteract background trends, e.g., the 2010–2011 SLA drop due to La Niña-induced rainfall over land. The continuous analysis clarifies the warming hiatus debate, affirming sustained warming. Decadal trend rates per grid cell are also mapped; these are ubiquitously significant for SST and SLA, whereas Chl-a trend rates are globally low but extreme along coasts and boundary currents.
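A minimal sketch of decomposition-based trend extraction on a synthetic monthly series, assuming statsmodels ≥ 0.14 for MSTL (the series and its seasonal periods are invented for illustration):

```python
# Sketch: extract a trend by STL/MSTL decomposition instead of a monotonic
# fit, then approximate its decadal rate with a linear fit to the trend.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL, MSTL

t = pd.date_range("1998-01-01", "2022-12-01", freq="MS")
n = t.size
rng = np.random.default_rng(1)
series = pd.Series(0.002 * np.arange(n)                           # slow trend
                   + 0.5 * np.sin(2 * np.pi * np.arange(n) / 12)  # annual cycle
                   + rng.normal(0, 0.1, n), index=t)

stl_trend = STL(series, period=12).fit().trend           # one seasonal period
mstl_trend = MSTL(series, periods=(6, 12)).fit().trend   # multiple periods
rate = np.polyfit(np.arange(n) / 120.0, stl_trend, 1)[0]
print(f"decadal trend rate ~ {rate:.3f} units per decade")
```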
Abstract
Earth and planetary radiometry requires spectrally dependent observations spanning an expansive range of signal flux because of variability in celestial illumination, spectral albedo, and attenuation. Insufficient dynamic range inhibits contemporaneous measurements of dissimilar signal levels and restricts the environments, time periods, target types, or spectral ranges that instruments can observe. Next-generation (NG) advances in temporal, spectral, and spatial resolution also require further increases in detector sensitivity and dynamic range, corresponding to increased sampling rates and decreased fields of view (FOV), both of which capture greater intrapixel variability (i.e., variability within the spatial and temporal integration of a pixel observation). Optical detectors typically must support expansive linear radiometric responsivity while simultaneously enduring the inherent stressors of field, airborne, or satellite deployment. Rationales for significantly improving radiometric observations of nominally dark targets are described herein, along with demonstrations of state-of-the-art (SOTA) capabilities and NG strategies for advancing the SOTA. An evaluation of linear dynamic range and the efficacy of optical data products is presented based on representative sampling scenarios. Low-illumination (twilight or total lunar eclipse) observations are demonstrated using a SOTA prototype. Finally, a ruggedized and miniaturized commercial-off-the-shelf (COTS) NG capability for absolute radiometric observations spanning an expanded range of target brightness and illumination is presented. This NG technology combines a Multi-Pixel Photon Counter (MPPC) with a silicon photodetector (SiPD) to form a dyad optical sensing component supporting expansive dynamic range, exceeding the nominal 10 decades of usable dynamic range documented for SOTA instruments.
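A hedged sketch of the dyad concept described above: merge a sensitive photon-counting channel that saturates early with a photodiode channel that remains linear at high flux, cross-calibrating where both are in their linear regimes. The sensor limits and variable names are invented for illustration:

```python
# Sketch: combine MPPC counts and SiPD photocurrent into one radiance proxy
# spanning both channels' usable ranges.
import numpy as np

MPPC_SATURATION = 1e6   # counts/s above which the MPPC is nonlinear (assumed)
SIPD_FLOOR = 1e-3       # photocurrent (A) below which the SiPD is noisy (assumed)

def merge_dyad(mppc_counts, sipd_current):
    mppc_counts = np.asarray(mppc_counts, float)
    sipd_current = np.asarray(sipd_current, float)
    # Cross-calibration gain from samples where both channels are linear
    overlap = (mppc_counts < MPPC_SATURATION) & (sipd_current > SIPD_FLOOR)
    gain = np.median(mppc_counts[overlap] / sipd_current[overlap])
    # Use the MPPC where it is linear; fall back to the scaled SiPD elsewhere
    merged = np.where(mppc_counts < MPPC_SATURATION,
                      mppc_counts, gain * sipd_current)
    return merged, gain
```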