1. Introduction
As well as being visually more appealing than shades of gray (grayscale), true-color imagery provides a far more information-rich and intuitive palette for at-a-glance, rapid interpretation of remotely sensed imagery, useful for operational analysts and nonexperts alike (Miller et al. 2012). With true color, the ocean appears blue, and the land appears green or brown depending on vegetation content and health, rather than various shades of gray, making it far easier to distinguish the coastline, shallow water, and the deep ocean. Cloud, dust, and smoke offer very little contrast against the background in grayscale, but in color, atmospheric dust tends to be tan, smoke can appear in various bluish-white to dark brown hues, and meteorological clouds are almost universally white. Brown/red areas of deforestation and dark black burn scars produce strong contrast against surrounding green vegetation. Imagery at a spatial resolution of 1 km or finer allows cloud structure, such as the overshooting tops of storm systems, to be more easily observed, providing meteorologists with extra tools to improve forecasts. In short, color imagery adds an extra dimension to the information available from grayscale-only images and more closely matches how humans naturally perceive the world around them.
Broadband visible detectors are typically displayed as grayscale intensity in visualization systems. If the red, green, and blue parts of the spectrum are imaged separately, via narrowband response functions, they can be combined as a true-color image using a visual display system. The blending of the intensities of the three bands can produce the millions of colors perceptible by normal human vision. By scaling the intensity in a way that mimics the retinal response, such imagery can be made to closely resemble what a human observer would see (or be “true” to human vision) from the same vantage point.
True-color imagery has been available for many years from polar-orbiting satellites such as Landsat and MODIS, utilizing methods such as those described by Gumley et al. (2010). Depending on the position on Earth, Landsat images each location at least every 16 days. MODIS images the equatorial region at least once per day during daylight hours and the polar regions every 99 min (due to overlapping swaths). This observational frequency does not allow rapidly changing features to be adequately observed. Until recently, true color was not available from a geostationary platform, but this has changed with the launch of Himawari-8.
2. The Himawari-8 Advanced Himawari Imager
The Advanced Himawari Imager (AHI; Bessho et al. 2016) is the flagship environmental sensor of Himawari-8. With AHI, the spectral capability of previous satellites has increased to six reflective (three visible and three near infrared), one transitional (shortwave infrared), and nine emissive (thermal infrared) bands, dramatically advancing the scope of applications possible from the geostationary platform. AHI, a next-generation radiometer, scans the full disk (for a field of regard centered at 0°, 140.7°E) with 16 bands over a wide spectral range, including the notional red, green, and blue spectral bands required for rendering true-color imagery. Launched on 7 October 2014, AHI is the first of a series of advanced geostationary meteorological imagers. A similar sensor, the Advanced Baseline Imager (ABI; Schmit et al. 2017), now flies aboard the U.S. Geostationary Operational Environmental Satellite (GOES) R series, and the Korean Geostationary Korea Multi-Purpose Satellite-2A [GEO-KOMPSAT-2A (GK-2A)], launched in December 2018, houses the Advanced Meteorological Imager (AMI; Korea Meteorological Administration 2018). AHI is configured to scan the full disk every 10 min. During this interval, it also scans a mesoscale box over Japan 4 times, a selectable target area (e.g., a tropical cyclone) 4 times, and two landmark areas (used primarily for navigation) 20 times. During each scan, band 3 (the notional red band at 0.64-µm central wavelength) is imaged at a spatial resolution of 500 m; bands 1 (blue; 0.47 µm), 2 (green; 0.51 µm), and 4 (near infrared; 0.856 µm) at a resolution of 1 km; and the remaining 12 bands at a resolution of 2 km (JMA 2015). The rich spectral diversity of the 16 bands permits a multitude of different products to be calculated. The blue, green, and red regions of the visible spectrum can be combined as a red–green–blue (RGB) composite to create, for the first time, geostationary true-color imagery over the Asia–Pacific region. A number of corrections can then be applied to the imagery to make it more vivid or visually rich so that features are more perceptible to the eye. The following sections detail a methodology for rendering vivid true-color imagery, beyond the simple combination of the red, green, and blue bands.
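As a minimal illustration of this final compositing step (independent of the corrections described in the following sections), three co-registered, normalized reflectance fields are simply stacked into an image array. The sketch below uses placeholder data standing in for the AHI red, green, and blue bands.

```python
import numpy as np

def compose_rgb(red, green, blue):
    """Stack three co-registered reflectance arrays (0-1) into an H x W x 3 RGB image."""
    rgb = np.stack([red, green, blue], axis=-1)
    return np.clip(rgb, 0.0, 1.0)

# Placeholder 2 x 2 reflectance fields standing in for AHI bands 3, 2, and 1.
r = np.array([[0.30, 0.45], [0.05, 0.80]])
g = np.array([[0.25, 0.40], [0.08, 0.78]])
b = np.array([[0.20, 0.35], [0.10, 0.75]])
image = compose_rgb(r, g, b)   # ready for display, e.g. with matplotlib imshow
```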
3. Correcting for the Rayleigh scattering within the atmosphere
To produce the best-quality imagery, a number of processing steps are required. An important process that is often ignored when producing true-color imagery from satellite instruments is compensating for the effect of the atmosphere on the propagation of light. As with any instrument sensing Earth from the vantage point of space, AHI imagery is affected by the intervening atmosphere. If one is interested in deriving products from the radiation reflected or emitted by the surface, then a full atmospheric correction needs to be applied. If the goal is to produce imagery, which would include smoke, dust, and cloud (normally considered contaminants in full atmospheric correction processes), then only the atmospheric effects that detract from the clarity of the imagery need be considered. Among the many mechanisms by which shortwave light interacts with the atmosphere, Rayleigh scattering (RS) is the dominant effect overall (Hsu et al. 2004; Wang 2016); it occurs where the geometric size of the scattering species is much smaller than the wavelength of the radiation being scattered. Visible light is Rayleigh scattered by the gaseous constituents of the atmosphere, with the interaction more pronounced at the blue end of the spectrum, meaning that uncorrected color imagery from space has an unwanted “bluish haze.”
Quantitative description of, and correction for, the RS component of the satellite-observed signal requires the use of a radiative transfer model. Models such as MODTRAN5 (Berk et al. 2008) can adequately quantify the RS effects based on the collection of standard atmospheres included within the model and the geometry between the sun, Earth's surface, and the satellite sensor. As a single AHI full-disk image contains several hundred million pixels, and a new image is produced every 10 min, it is impractical to perform a rigorous atmospheric correction by running a full radiative transfer calculation on a pixel-by-pixel basis, especially for real-time imagery production. An alternative approach is to parameterize the components of the atmospheric correction for specific sensors using code such as the Simplified Method for Atmospheric Correction (SMAC; Rahman and Dedieu 1994) and then apply a set of simple equations to the satellite data. Another approach, used for this work, is to generate a set of lookup tables (LUTs) from radiative transfer calculations covering a set of specified geometries between the satellite, Earth, and sun. Using the LUTs, only simple interpolations between these specified geometries are required, after which a simple set of equations is applied to the satellite data (Guanter et al. 2009; Vermote and Vermeulen 1999; Lyapustin et al. 2011). Observations from geostationary satellite sensors extend to much higher sensor zenith angles than do those from sun-synchronous polar orbiters such as MODIS and VIIRS. While the most extreme angles are largely avoided by polar orbiters, research into the use of lookup tables with these sensors has shown (Wang 2016) that estimates of Rayleigh scatter exhibit increasing error as sensor zenith angles (and the corresponding pathlength through the atmosphere) increase. As found in previous research (Miller et al. 2016), errors are much greater at the angular extremities of geostationary sensors and require special mitigation procedures.
The primary objective of this work is to produce near-real-time true-color imagery from AHI that is useful to both forecasters and the general public. Forecasters also use RGB imagery created from combinations of IR and reflective bands to highlight properties of the atmosphere (air mass, severe convection, etc.), referred to as “false color” imagery, as well as true-color imagery from uncorrected AHI data. It is also anticipated that Rayleigh-corrected reflectance data will contribute to improvements in all RGB combinations and true-color imagery used by the Bureau of Meteorology. True color serves as a useful companion to false-color imagery, providing the analyst with an intuitive baseline for interpreting the various enhancement colors.
4. Methods and discussion
The remainder of this paper describes the generation of an LUT for atmospheric correction, its use in producing a Rayleigh-corrected reflectance dataset, and the subsequent processing steps required to create a true-color image. Whereas this development is done in the context of Himawari-8 AHI, it is generally applicable to any sensor with the prerequisite bands required for true-color imagery with full knowledge of the optical properties of those bands.
a. The data
The Bureau of Meteorology (BoM) receives full-disk Himawari-8 AHI data at the native resolutions of 2 km (bands 5–16), 1 km (bands 1, 2, and 4), and 500 m (band 3). These data, provided by JMA in Himawari Standard Format (HSF), are supplied as counts along with a calibration table that provides coefficients to convert counts to spectral radiance (L) in units of W m−2 sr−1 μm−1 for each of the 16 bands. Information on HSF data can be found in the user's guide (JMA 2015). The 10-min temporal resolution, less the two observation periods at 0240 and 1440 UTC that are devoted to satellite housekeeping, results in the reception of 142 images per day. This ingest yields over 382 GB of data per day that are stored, processed, and disseminated to forecasters and researchers in near–real time.
b. Radiative transfer modeling and atmospheric correction
The atmospheric correction process used for this work outputs intermediate products in radiance units. All AHI data processed by the atmospheric correction code are therefore scaled to match the radiance units of the radiative transfer model, μW cm−2 sr−1 nm−1. This work enlists the MODTRAN5 (Berk et al. 2008) radiative transfer model, and its configuration for the current application very closely follows the methods of Guanter et al. (2009) and A. Rodger (2009, private communication). MODTRAN allows the user to specify an illumination source (generally the sun), a target (such as Earth's surface), an atmosphere (either predefined or user defined), and an observer location and associated geometry (e.g., an Earth-viewing satellite instrument). MODTRAN also allows the user to define numerous processing options; the specifics of the MODTRAN setup used for this work are described in the appendix.
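As an illustration of this preprocessing step, a minimal sketch of the count-to-radiance conversion and the unit rescaling is given below. The linear gain and offset are hypothetical placeholders; the real, band-dependent coefficients are read from the calibration block of each HSF file.

```python
import numpy as np

# Hypothetical calibration coefficients; real values come from the HSF calibration
# block and differ per band. Counts -> W m-2 sr-1 um-1 (assumed linear calibration).
GAIN = 3.685e-4
OFFSET = -0.3

def counts_to_radiance(counts, gain=GAIN, offset=OFFSET):
    """Convert raw AHI counts to spectral radiance in W m-2 sr-1 um-1."""
    return gain * counts.astype(np.float64) + offset

def to_modtran_units(radiance):
    """Rescale W m-2 sr-1 um-1 to the MODTRAN units of uW cm-2 sr-1 nm-1.

    1 W m-2 sr-1 um-1 = 1e6 uW/W x 1e-4 (per m2 -> per cm2) x 1e-3 (per um -> per nm)
                      = 0.1 uW cm-2 sr-1 nm-1.
    """
    return 0.1 * radiance

counts = np.array([[1023, 2046], [512, 4095]], dtype=np.uint16)
L = to_modtran_units(counts_to_radiance(counts))
```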
MODTRAN is configured to produce output over the range 350–2550 nm at 1-nm resolution. This means that the surface reflectance over this range at 1-nm intervals can also be produced, with important implications for the generalization of this approach. The terms in Eq. (1), namely the radiative transfer components derived from the MODTRAN output, are evaluated for each AHI band using the relative spectral response functions shown in Fig. 1.

Fig. 1. Relative or normalized spectral response functions for bands 1–4 of the Himawari-8 AHI instrument.
All the spectral components derived from MODTRAN are band averaged first before they are used in the ensuing calculations.
To compensate for the coarse LUT increments, the values of the four radiative transfer components (RTCs) in Eq. (1) are interpolated from the LUTs. Each LUT can be considered to contain four (one for each band) three-dimensional grids of solar zenith angle (SZA) × relative azimuth (RA) × view zenith angle (VZA) values. To interpolate to the geometry of an actual pixel, the eight nearest LUT node values are selected from the three-dimensional grid, and a trilinear interpolation method (Kang 2006) is used to derive each pixel value.
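A minimal sketch of this interpolation step is shown below for a single RTC grid. The axis spacing mirrors the LUT described in the appendix; the toy LUT values are placeholders, and an operational implementation would vectorize this over all pixels and over all four RTCs per band.

```python
import numpy as np

# Hypothetical LUT axes mirroring the appendix: zenith angles every 5 deg to 85 deg
# then 89 deg, relative azimuth every 10 deg (real tables hold one grid per RTC per band).
SZA_AX = np.append(np.arange(0.0, 90.0, 5.0), 89.0)      # 19 values
VZA_AX = np.append(np.arange(0.0, 90.0, 5.0), 89.0)      # 19 values
RA_AX = np.arange(0.0, 181.0, 10.0)                      # 19 values

def trilinear(lut, sza, ra, vza):
    """Trilinearly interpolate one RTC grid (SZA x RA x VZA) to a single geometry."""
    def bracket(axis, x):
        # Index of the lower grid node and the fractional distance to the next node.
        i = np.clip(np.searchsorted(axis, x) - 1, 0, len(axis) - 2)
        t = (x - axis[i]) / (axis[i + 1] - axis[i])
        return i, np.clip(t, 0.0, 1.0)

    i, ts = bracket(SZA_AX, sza)
    j, tr = bracket(RA_AX, ra)
    k, tv = bracket(VZA_AX, vza)

    # Weighted sum of the eight surrounding LUT nodes.
    value = 0.0
    for di, ws in ((0, 1 - ts), (1, ts)):
        for dj, wr in ((0, 1 - tr), (1, tr)):
            for dk, wv in ((0, 1 - tv), (1, tv)):
                value += ws * wr * wv * lut[i + di, j + dj, k + dk]
    return value

# Toy LUT (e.g. path radiance for one band) filled with a smooth function for testing.
S, R, V = np.meshgrid(SZA_AX, RA_AX, VZA_AX, indexing="ij")
toy_lut = 0.05 / (np.cos(np.radians(S)) * np.cos(np.radians(V))) + 1e-4 * R
print(trilinear(toy_lut, sza=32.5, ra=47.0, vza=61.2))
```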
Accuracy of LUT interpolation
The Rayleigh corrected (RC) reflectance values calculated from interpolating from the lookup tables will have some error compared to using exact view zenith angles, solar zenith angles, and relative azimuth for each pixel. This error can be assessed by comparing reflectance values calculated by using exact angular values in MODTRAN runs and those derived from the lookup tables.
Pixel positions were selected on the full Himawari-8 disk to sample a range of VZA, SZA, and RA. At these positions, MODTRAN was run using the exact values of VZA, SZA, and RA. The reflectance values for specific Himawari-8 scenes were then derived at these points using the MODTRAN runs. Reflectance values for each scene at the selected pixel positions were also determined using the lookup tables. Pixel properties (VZA, SZA, RA, and reflectance) were extracted at the approximate positions shown in Fig. 2 for both interpolated and exact values. The points lie at every 250th pixel down and along the lines 3000 pixels from the southern and western edges, respectively. The southernmost and easternmost points shown here each comprise 5 pixels at 10-pixel spacing, which were selected to show the rapid changes at Earth's limb but cannot be resolved individually in Fig. 2. A few pixel positions are marked in Fig. 2 for reference, including where the groups of multiple pixels are located.

Fig. 2. Test points where exact observational geometric coordinates were extracted and used to generate RTM outputs for 0250 UTC 2 Feb 2016.
The solar geometries were extracted for the scene at 0250 UTC 2 February 2016. This gave 52 unique positions for which all the band-equivalent RTCs were generated. These RTCs were then used to calculate Rayleigh-corrected reflectances at the 52 points for bands 1–4 of Himawari-8. The same data were generated using the precalculated LUT and interpolation method, giving a set of 52 points where interpolated results could be compared to the exact results. This analysis is shown in section 5a. Another image was selected on the 2016 autumnal equinox (20 March), when the path of the subsolar point lies directly along the equator. The image selected was at 0800 UTC, when the solar terminator is close to the nadir point of Himawari-8. Figure 3 shows the points where pixel properties were extracted; because of the spacing between samples, these positions mostly cannot be resolved individually in Fig. 3. The arrangement in Fig. 3 gives values with high SZA but low VZA near the terminator; the opposite is true near the dayside limb of Earth, where SZA is low and VZA is high. Pixels where the SZA is greater than 90°, and some of the pixels with high VZA, do not produce reflectance values because the radiative transfer model considers there to be no solar illumination or the pixels to be invisible to the sensor.

Fig. 3. The SZAs, RAs, and sensor zenith angles for Himawari-8 AHI at 0800 UTC 20 Mar 2016. The white points show where pixel values were compared to highlight interpolation error.
c. Color correction, pathlength correction, resolution sharpening, and image enhancement
The primary motivation for the Rayleigh-corrected data is to generate true-color or RGB images either for forecasters or for public outreach. The Himawari-8 AHI data have other issues that cannot be addressed completely with Rayleigh correction alone, and the limiting assumptions of the RS correction require ad hoc modification in some areas as well. This section details the steps required to take Rayleigh-corrected data and produce vibrant RGB images, where vibrant is defined herein as having clarity, good contrast, improved dynamic range, the best available resolution, and good color reproduction, and being free of artifacts.
1) Color correction
2) Cloud-disrupted atmospheric pathlength correction
The pathlength correction is applied over the entire disk during the atmospheric correction process and will thus contribute to the final reflectance products for Himawari-8 AHI bands 1–4. Since an actual cloud mask is not used for this adjustment, certain clear-sky pixels (especially land surface at high latitudes, where temperatures can fall below 230 K) will be subject to this adjustment as well. These Rayleigh-corrected bands are likely to be used by forecasters and interested researchers, so it is important to note that this correction has been introduced.
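The exact rescaling is not reproduced here; purely to illustrate the triggering logic described above (a simple infrared brightness temperature test standing in for a formal cloud mask), a sketch might look like the following, with the 230-K figure taken from the text and the flat scale factor a placeholder for the operational, cloud-height-dependent adjustment.

```python
import numpy as np

def adjust_path_radiance(path_radiance, bt_band13, bt_threshold=230.0, scale=0.6):
    """Reduce the modeled path radiance where cold (presumed high) cloud is present.

    A crude, flat rescaling for illustration only; the operational adjustment varies
    with the cloud-shortened pathlength rather than using a single factor.
    """
    cold = bt_band13 < bt_threshold   # no formal cloud mask: a brightness temperature test
    return np.where(cold, scale * path_radiance, path_radiance)

# Toy fields: modeled path radiance and band-13 brightness temperature (K).
lp = np.full((2, 2), 0.04)
bt = np.array([[285.0, 210.0], [225.0, 260.0]])
print(adjust_path_radiance(lp, bt))
```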
3) Resolution sharpening
Applying the resolution sharpening as shown in Eq. (6) yields true-color imagery at the 500-m resolution of band 3.
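Equation (6) is not reproduced here; as a generic illustration of this kind of sharpening, the sketch below applies simple ratio sharpening, in which a 1-km band is modulated by the ratio of the 500-m red band to its 1-km average. This is an assumed, stand-in formulation rather than the exact operational equation.

```python
import numpy as np

def ratio_sharpen(band_1km, red_500m):
    """Sharpen a 1-km band to 500 m using the 500-m red band (generic ratio method).

    Illustrative only; not a reproduction of Eq. (6).
    """
    # Upsample the 1-km band, and the 1-km block average of the red band, by pixel replication.
    band_up = np.kron(band_1km, np.ones((2, 2)))
    red_1km = red_500m.reshape(red_500m.shape[0] // 2, 2,
                               red_500m.shape[1] // 2, 2).mean(axis=(1, 3))
    red_up = np.kron(red_1km, np.ones((2, 2)))
    ratio = np.where(red_up > 0, red_500m / red_up, 1.0)
    return band_up * ratio

green_1km = np.array([[0.20, 0.30], [0.25, 0.35]])
red_500m = np.array([[0.22, 0.26, 0.31, 0.29],
                     [0.24, 0.20, 0.33, 0.35],
                     [0.27, 0.23, 0.36, 0.40],
                     [0.25, 0.29, 0.34, 0.30]])
green_500m = ratio_sharpen(green_1km, red_500m)   # 4 x 4 sharpened field
```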
4) Image enhancement
5) Blending
The blending of data at the extremities of the solar and satellite zenith angles is purely for aesthetic purposes. Pixels at high satellite and solar zenith angles can have very large, and even negative, reflectance values artificially induced by calculation error in the atmospheric correction process. Miller et al. (2016) address this by gradually blending in top-of-atmosphere (uncorrected) data at high satellite and solar zenith angles. That method has been modified here. At the planetary limb, the corrected data are multiplied by a factor that decreases linearly from 1.0 to 0.0 over the VZA range of 78°–88°. This gradually blends the high satellite zenith angle values toward the background space value, which replicates the look of the Miller et al. (2016) approach and is simpler to implement. When the solar zenith angle exceeds 90°, there is no direct sunlight and Earth is in shadow. The shadowed regions are replaced with brightness temperature data from band 13. Rather than imposing a sharp cutoff at the solar terminator, these data are blended into each other: in the transition zone between SZAs of 78° and 88°, the reflectance data are multiplied by a factor that decreases linearly from 1.0 to 0.0 with SZA, and the IR data are multiplied by a factor that increases linearly from 0.0 to 1.0 over the same SZA range, giving a seamless transition between the two datasets. This results in a more natural-looking transition from day to night.
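A minimal sketch of these two linear ramps is given below; the band-13 imagery is assumed to have already been scaled to a 0–1 grayscale, and the 78°–88° ramp limits follow the text.

```python
import numpy as np

def linear_ramp(angle, start=78.0, end=88.0):
    """Weight that falls linearly from 1 at `start` to 0 at `end` (degrees)."""
    return np.clip((end - angle) / (end - start), 0.0, 1.0)

def blend_day_night(rgb, ir_gray, sza, vza):
    """Blend corrected RGB toward space at the limb and toward band-13 imagery at night.

    rgb: H x W x 3 reflectance image; ir_gray: H x W band-13 image already scaled to 0-1.
    """
    w_limb = linear_ramp(vza)[..., np.newaxis]   # fade to black (space) at the limb
    w_day = linear_ramp(sza)[..., np.newaxis]    # fade out reflectance at the terminator
    night = np.repeat(ir_gray[..., np.newaxis], 3, axis=-1)
    return w_limb * (w_day * rgb + (1.0 - w_day) * night)

# Toy 1 x 2 scene: one daytime pixel near nadir, one pixel beyond the terminator.
rgb = np.ones((1, 2, 3)) * 0.5
ir = np.array([[0.2, 0.8]])
sza = np.array([[30.0, 95.0]])
vza = np.array([[10.0, 40.0]])
print(blend_day_night(rgb, ir, sza, vza))
```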
5. Results
This section shows samples of the results achieved by applying the corrections and enhancements discussed in section 4.
a. Accuracy of the data interpolated from the lookup tables
Determining the accuracy of the LUT interpolation method is reasonably straightforward, as direct comparisons can be made between the RTCs (and the resultant Rayleigh-corrected reflectance values) interpolated from the lookup tables and those from MODTRAN runs using the exact angular values.
Figure 4 shows the percentage error in the Rayleigh-corrected (RC) reflectance interpolation for AHI bands 1–4. These errors are for the approximate pixel positions shown in Fig. 2 for 0250 UTC 2 February 2016. The error level increases toward the limb of Earth, where the VZA changes rapidly from pixel to pixel. In Fig. 4, the SZA varies from 6.3° to 77.1°, the VZA varies from 5.3° to 86.5°, and the RA varies from 1.7° to 175.4°. The highest values of RA do not occur at the extremities, where VZA and SZA are high, so Fig. 4 shows that RA has a limited effect on the error associated with interpolation from the lookup tables. As the AHI scene used to produce Fig. 4 was close to solar noon, the SZA and VZA values are close to each other for each pixel, so they cannot be considered independently.

Fig. 4. Percentage error in the Rayleigh-corrected reflectance for VZA due to interpolation for Himawari-8 bands 1–4 for the two lines of pixels (down and across) indicated in Fig. 2.
Figure 5 shows the percentage error in the Rayleigh-corrected reflectance interpolation for the line of pixels marked in Fig. 3 for 0800 UTC 20 March 2016. This transect has high VZA values toward pixel zero and high SZA values toward pixel 3500. Interpolation errors in AHI band 1 approach 18% at a VZA of nearly 86°. The interpolation error is much higher for the SZA, where the maximum error is 102% at an angle of 89.9°, although this partly reflects the fact that AHI captures data at higher SZAs than the VZAs sampled here. The errors at high sensor zenith angles drop off for the longer-wavelength bands, but there is little drop-off with wavelength at high solar zenith angles. The interpolation error due to sensor zenith angle is only significant (greater than 5%) at angles greater than 80°, and only for AHI bands 1 and 2. The interpolation error due to SZA is significant at angles greater than 80° for the first three bands and becomes significant at angles greater than 83° for band 4. These interpolation errors are due to the rapid, nonlinear change in atmospheric pathlength at high zenith angles and the commensurately higher Rayleigh scatter contribution to the total signal. A small change in solar zenith angle can result in a large increase in the pathlength between the TOA and the surface. As the interpolation method is linear and the angular step in the lookup table is relatively coarse, it does not adequately capture this nonlinear behavior at high zenith angles, particularly for the solar zenith angle.
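To see why, consider the plane-parallel relative air mass, which grows approximately as sec θ (ignoring Earth curvature and refraction); the short sketch below evaluates it at the LUT nodes nearest the limb and terminator.

```python
import numpy as np

# Plane-parallel relative air mass sec(theta), ignoring Earth curvature and refraction,
# at LUT nodes near the limb/terminator (5-deg steps to 85 deg, then a final node at 89 deg).
for theta in (80.0, 85.0, 88.0, 89.0):
    print(f"{theta:4.0f} deg -> air mass ~ {1.0 / np.cos(np.radians(theta)):6.1f}")
# ~5.8, ~11.5, ~28.7, ~57.3: the pathlength grows by roughly a factor of five across the
# final 85-89 deg LUT interval, which a linear interpolation across that interval cannot follow.
```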

Fig. 5. Percentage error in the Rayleigh-corrected reflectance for pixel position due to interpolation for Himawari-8 bands 1–4 for the geometry of the line of pixels indicated in Fig. 3.
Figures 4 and 5 show that there can be large relative percentage differences in the determination of reflectance between interpolated LUTs and exact geometric values in the RTM. These numbers and the underlying data do not assess the actual error in reflectance, which would require an independent source of verification data, such as in situ reflectance measurements. However, these figures do provide enough justification to remove or replace data at these high zenith angles, as the interpolation error can be substantial.
b. Imagery results
The following subsections show the effects of the corrections and enhancements described in section 4 on the resultant imagery. Unless specifically mentioned, only the correction or enhancement being demonstrated has been omitted from the processing chain, so that its effect can be compared against a completely corrected and enhanced image.
1) Impact of Rayleigh correction
Figure 6 shows the result of Rayleigh correction applied to true-color imagery. The left panel shows the result of color correction, brightening, and a contrast enhancement on non-Rayleigh-corrected imagery, and the right-hand panel shows the same image with Rayleigh correction included. This shows that Rayleigh correction removes much of the atmospheric haze and yields sharper contrast between cloud and surface features, in addition to enhancing surface detail overall.

Fig. 6. (left) The TOA image without Rayleigh correction with hybrid green color adjustment and gamma brightening applied. (right) The full complement of corrections, adjustments, and blending has been applied and represents the finished bottom-of-atmosphere product. Himawari-8 scene from 0220 UTC 6 Jun 2016.
2) Impact of hybrid green correction
The application of the Miller et al. (2016) hybrid green correction is evident in Fig. 7. Figure 7a shows the uncorrected imagery, which has a noticeable red tinge where vegetation is sparse, while densely vegetated areas appear overly dark. Figure 7b, where the hybrid green band is used, appears more like imagery from other sensors whose green band is centered near 555 nm, as shown by Fig. 7c, the Rayleigh-corrected MODIS Terra RGB image captured close to the AHI scan time.
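Following Miller et al. (2016), the hybrid green band is a weighted linear blend of AHI band 2 (0.51 µm) with band 4 (0.86 µm); a minimal sketch is given below, with the blend fraction shown purely as an illustrative placeholder rather than the published weighting.

```python
import numpy as np

def hybrid_green(band2, band4, nir_fraction=0.07):
    """Blend AHI band 2 (0.51 um) with band 4 (0.86 um) to approximate a 0.55-um green band.

    nir_fraction is illustrative; see Miller et al. (2016) for the derived weighting.
    """
    return (1.0 - nir_fraction) * band2 + nir_fraction * band4

green = np.array([[0.08, 0.12]])
nir = np.array([[0.35, 0.40]])
print(hybrid_green(green, nir))
```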

Fig. 7. (a) Rayleigh-corrected AHI band 2 data, which have not been corrected for color. (b) Band 2 data which have been blended with band 4 data to create a hybrid green band. These data are from 2230 UTC 17 Mar 2017. (c) The Rayleigh-corrected MODIS Terra RGB image at 2335 UTC 17 Mar 2017.
3) Impact of atmospheric path length correction
Figure 8 shows the result of reducing the path radiance at the limb when high cloud is present. The red coloration arises because the blue and green bands used in the RGB combination are overcorrected with respect to the red band. The top-right-hand corner of both panels, where there is limited cloud and the limb correction is not applied, also shows some slight reddening. This overcorrection could be a result of the interpolation error from the lookup table at high view zenith angles, or of limitations of the radiative transfer code at high angles, which are difficult to quantify without corresponding in situ measurements. It may be possible to improve the correction at the limb by reducing the angular increment in the LUTs or by configuring the radiative transfer code differently. Adding height to the radiative transfer process is the most obvious way to reduce the uncertainty in the pathlength. This would require the use of a cloud-height product (derived from the AHI data) and would reduce the timeliness of the true-color imagery. As the goal of this work is to produce imagery as quickly as possible, not climate data record–level products, the simplest (and fastest) viable approach is used, although it could be improved.

Fig. 8. (top) A true-color image after Rayleigh correction but with no rescaling of the path radiance. (bottom) Rescaling the path radiance based on cloud height removes the reddening at high view zenith angles.
4) Impact of resolution sharpening
The resolution sharpening is evident in Fig. 9, which shows the Faure sill in the Shark Bay region of Western Australia. The sill is a large sandbar that lies across the inner bay, or Hamelin Pool. The 500-m data show far more detail of the small channels that snake through the sill, whereas the 1000-m data resolve only the large channels. The coastline is also poorly defined in the 1000-m data.

Fig. 9. The Shark Bay area of Western Australia at (left) 500 and (right) 1000 m. The smaller channels in the Faure sill (the large sandbar atop the inner bay) are visible in the 500-m imagery but not in the 1000-m imagery.
5) Impact of nonlinear image enhancement
Figure 10 shows the result of applying nonlinear scale brightening and contrast enhancement to a Rayleigh- and color-corrected image. The top panel is dull and has poor contrast, while the bottom panel is brighter and has better contrast.
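As an illustration of what such an enhancement might look like, the sketch below applies a gamma brightening followed by a simple percentile-based contrast stretch; the gamma value and percentiles are placeholders, not the operational settings.

```python
import numpy as np

def gamma_brighten(rgb, gamma=2.0):
    """Nonlinear brightening: raise reflectance (0-1) to 1/gamma (gamma value is a placeholder)."""
    return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)

def contrast_stretch(rgb, low=0.02, high=0.98):
    """Simple linear stretch between low/high quantiles of the image (placeholder values)."""
    lo, hi = np.quantile(rgb, [low, high])
    return np.clip((rgb - lo) / (hi - lo), 0.0, 1.0)

img = np.random.default_rng(0).uniform(0.0, 0.6, size=(4, 4, 3))   # dull toy image
enhanced = contrast_stretch(gamma_brighten(img))
```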

Fig. 10. (top) The RGB image where gamma brightening and contrast have not been applied to the Rayleigh-corrected imagery. (bottom) These enhancements have been applied. Both images show the south of the state of Victoria.
6) Blending at the limb and with nighttime infrared imagery
The visible imagery is blended with black at the limb, the proportion varying linearly from 100% to 0% with view zenith angle between 78° and 88°. This removes most edge effects not previously removed by the pathlength correction, most noticeably the prominent red edge in Fig. 11, and produces a more realistic-looking transition from Earth to space than a sharp cutoff. Solar zenith angles approaching 90° also exhibit edge effects, such as overcorrection due to the path radiance calculation, which likewise produces a red edge. Section 5a shows that there are very large interpolation errors at the limb and the solar terminator, and these are the primary cause of the overcorrection seen at the edges. These overcorrected pixels can be blended out to remove the red edge (as shown in Fig. 11), or IR brightness temperature imagery can be blended in at the solar transition to retain full-disk coverage, as shown in Fig. 12. Blending in IR data makes time series animations more informative, as the movement of clouds can still be tracked overnight.

Fig. 11. (left) The unblended imagery at high solar zenith angles. The path radiance calculated from MODTRAN overcorrects at shorter wavelengths which results in a red rim. (right) This part of the image has been blended out to both remove these edge effects and provide a more physically representative (and visually pleasing) effect than a sharp cutoff.

Fig. 12. (left) The RGB image only and (right) the result of blending in BT data from band 13 for the Himawari scene at 0740 UTC 6 Jun 2016.
6. Conclusions
This paper describes a method to produce Rayleigh-corrected true-color imagery from Himawari-8 AHI. A lookup table approach is well suited to this application, as running a pixel-by-pixel atmospheric correction would be prohibitively slow given that AHI data are produced every 10 min and a full-disk scene contains 121 million pixels at 1-km resolution. Reflectances calculated from values interpolated from the lookup tables approach 20% error at the planetary limb and over 100% at the solar terminator when compared with values derived using exact geometries.
These extreme angle pixels at the planetary limb or the solar terminator are blended to background values, which largely negates any unwanted effects on the imagery from the limitations of the Rayleigh-correction process.
Because artifacts of the Rayleigh-correction process remain in the imagery after all of the other corrections are applied, these areas at the angular extremities are masked: the imagery is blended to background (space) values at the limb, and brightness temperature values from a thermal infrared band are gradually blended in across the solar terminator.
The interpolation error at higher zenith angles could be reduced by increasing the resolution of the lookup tables at those angles. All the radiative transfer modeling conducted for this work used the U.S. Standard Atmosphere and sea level height. Using other MODTRAN5 built-in atmospheric models that better address the latitudinal and seasonal variations of the global atmosphere may provide an increase in accuracy and could be implemented using the same lookup table structure. The radiative transfer modeling could also be extended to include topographic and cloud-top height. The pathlength would then be described for each pixel, avoiding the requirement for a pathlength correction during the atmospheric correction process. These enhancements have the potential to improve the accuracy of the Rayleigh-corrected reflectance, particularly at higher solar and satellite zenith angles. Adding the capacity to run the Rayleigh correction with defined pathlengths would require an appropriate elevation model and cloud-height product, many more radiative transfer runs (a unique set for each selected pathlength), and an extra dimension added to the lookup table structure. It is unlikely that the appearance of the true-color imagery would be greatly improved by including pathlength as an extra variable in the atmospheric correction process, except perhaps at the highest sensor and solar zenith angles.
The RTM output was produced over a wavelength range of 350–2550 nm at 1-nm intervals. This allows construction of lookup tables for any satellite band in this wavelength region, including those of Himawari-9 AHI and AMI, provided the spectral properties of the band are known. The approach could also be adapted for most Earth observation satellites that sense Earth's reflected signal.
Utilizing the methods of Miller et al. (2016), it is possible to produce a hybrid green band that better matches the chlorophyll reflectance peak around 555 nm. True-color imagery produced with the hybrid green band more closely resembles imagery from satellite sensors such as MODIS and Landsat-8, which have a green band centered around 555 nm. The corrected true-color RGB images appear dark and lack contrast, and require brightening and contrast adjustment to produce vivid true-color images. Even after Rayleigh correction, edge effects at high VZA and SZA values persist owing to limitations in the approximations used with the RTM. To overcome these artifacts, the edges can be blended toward zero (black). An alternative near the limb is to blend in uncorrected imagery; an alternative at the solar terminator is to blend in brightness temperature information from a band in the thermal infrared region, such as band 13, providing a 24-h imagery product.
High-temporal-resolution animations of the true-color imagery provide a tool for visualizing rapidly evolving phenomena such as bushfires, dust storms, volcanic eruptions, and even solar eclipses. Here, true-color information provides a practical means of interpreting the components of a complex scene. High-temporal-resolution true-color imagery from Himawari-8 AHI gives scientists and forecasters a tool to visualize the dynamic Earth and has great potential to engage the general public.
Acknowledgments
Miller: Support from the NOAA GOES-R Program Office, the Naval Research Laboratory (Contract N00173-17-1-G015), the Oceanographer of the Navy PEO C4I and Space/PMW-120, Program Element PE-0603207N, and the Office of Naval Research (Contract N00014-16-1-2040) is gratefully acknowledged.
APPENDIX
Using MODTRAN
The format and nomenclature of the MODTRAN5 inputs and outputs reflect the legacy of the model. The primary input to the model is a structured ASCII file called a tape5 file. Each tape5 file is separated into individual cards, which control model parameters such as atmosphere type, observational geometry, output formats, and many other configurable options. While the format and setup of these input cards are described in the manual, creating them from scratch is not a simple undertaking. The latest version of MODTRAN is available with a graphical user interface and a new input file format [JavaScript Object Notation (JSON)], which greatly simplifies the process (http://modtran.spectral.com). The remainder of this appendix focuses on using the MODTRAN output to generate lookup tables.
Once the radiative transfer method has been decided and the radiative transfer model inputs are set, the only parameters that are changed are 1) the surface reflectance, 2) the VZA, 3) the RA (degrees east of north), and 4) the SZA. The model is run for surface reflectance values of 1.0 and 0.5 for each combination of VZA, RA, and SZA, as the method requires two runs (with any two distinct surface reflectance values) so that the surface reflectance components cancel out of the final calculations. The VZA and SZA are varied from 0° to 85° in 5° increments, with a final increment of 4° to provide a final zenith angle of 89°. The RA is varied from 0° to 180° in 10° increments. This gives a total of 13 718 tape5 files, and thus 13 718 MODTRAN5 runs, required to build an LUT. This range of geometries covers almost the entire range of observational geometries a geostationary satellite will see, apart from a very small region at the edge of the Earth disk and just on the daylight side of the solar terminator. The observational geometries encountered by AHI (and by most other satellite instruments) are thus essentially covered within the LUTs; examples of the geometries sampled are shown in Figs. 2 and 3.
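For reference, the size of this run set can be checked by enumerating the grid directly; the sketch below only counts the geometry and reflectance combinations and does not write any tape5 files.

```python
import numpy as np
from itertools import product

# Angular grids from the appendix: zenith angles 0-85 deg in 5-deg steps plus 89 deg,
# relative azimuth 0-180 deg in 10-deg steps, and two surface reflectance values.
vza = np.append(np.arange(0.0, 90.0, 5.0), 89.0)   # 19 values
sza = np.append(np.arange(0.0, 90.0, 5.0), 89.0)   # 19 values
ra = np.arange(0.0, 181.0, 10.0)                   # 19 values
reflectances = (1.0, 0.5)

runs = list(product(vza, sza, ra, reflectances))
print(len(runs))   # 19 * 19 * 19 * 2 = 13718 MODTRAN5 runs (one tape5 file each)
```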
The parameters
REFERENCES
Berk, A., G. P. Anderson, P. K. Acharya, and E. P. Shettle, 2008: MODTRAN 5.2.0.0 user’s manual. Air Force Research Laboratory Rep., 100 pp.
Bessho, K., and Coauthors, 2016: An introduction to Himawari-8/9—Japan’s new-generation geostationary meteorological satellites. J. Meteor. Soc. Japan, 94, 151–183, https://doi.org/10.2151/jmsj.2016-009.
Guanter, L., R. Richter, and H. Kaufmann, 2009: On the application of MODTRAN4 atmospheric radiative transfer code to optical remote sensing. Int. J. Remote Sens., 30, 1407–1424, https://doi.org/10.1080/01431160802438555.
Gumley, L., J. Descloitres, and J. Schmaltz, 2010: Creating reprojected true color MODIS images: A tutorial. NASA Rep., 17 pp., https://cdn.earthdata.nasa.gov/conduit/upload/946/MODIS_True_Color.pdf.
Hsu, N. C., S.-C. Tsay, M. D. King, and J. R. Herman, 2004: Aerosol properties over bright-reflecting source regions. IEEE Trans. Geosci. Remote Sens., 42, 557–569, https://doi.org/10.1109/TGRS.2004.824067.
JMA, 2015: Himawari-8/9: Himawari standard data—User’s guide, version 1.2. Japan Meteorological Agency Rep., 22 pp., https://www.data.jma.go.jp/mscweb/en/himawari89/space_segment/hsd_sample/HS_D_users_guide_en_v12.pdf.
Kang, H., 2006: Computational Color Technology. International Society for Optical Engineering, 524 pp.
Korea Meteorological Administration, 2018: GEO-KOMPSAT-2A basic. Korean Meteorological Agency National Meteorological Satellite Center, http://nmsc.kma.go.kr/html/homepage/en/ver2/static/selectStaticPage.do?view=satellites.gk2a.gk2aIntro.
Lyapustin, A., J. Martonchik, Y. Wang, I. Laszlo, and S. Korkin, 2011: Multiangle Implementation of Atmospheric Correction (MAIAC): 1. Radiative transfer basis and look-up tables. J. Geophys. Res., 116, D03210, https://doi.org/10.1029/2010JD014985.
Miller, S., C. Schmidt, T. Schmit, and D. Hillger, 2012: A case for natural colour imagery from geostationary satellites, and the approximation for GOES-R ABI. Int. J. Remote Sens., 33, 3999–4028, https://doi.org/10.1080/01431161.2011.637529.
Miller, S., T. Schmit, C. Seaman, D. Lindsey, M. Gunshor, R. Kohrs, Y. Sumida, and D. Hillger, 2016: A sight for sore eyes: The return of true color to geostationary satellites. Bull. Amer. Meteor. Soc., 97, 1803–1816, https://doi.org/10.1175/BAMS-D-15-00154.1.
Rahman, H., and G. Dedieu, 1994: SMAC: A simplified method for the atmospheric correction of satellite measurements in the solar spectrum. Int. J. Remote Sens., 15, 123–143, https://doi.org/10.1080/01431169408954055.
Schmit, T., P. Griffith, M. Gunshor, J. Daniels, S. Goodman, and W. Lebair, 2017: A closer look at the ABI on the GOES-R series. Bull. Amer. Meteor. Soc., 98, 681–698, https://doi.org/10.1175/BAMS-D-15-00230.1.
Vermote, E., and A. Vermeulen, 1999: Atmospheric correction algorithm: Spectral reflectances (MOD09). NASA Rep., 107 pp., https://eospso.nasa.gov/sites/default/files/atbd/atbd_mod08.pdf.
Wang, M., 2016: Rayleigh radiance computations for satellite remote sensing: Accounting for the effect of sensor spectral response function. Opt. Express, 24, 12 414–12 429, https://doi.org/10.1364/OE.24.012414.