1. Introduction
Unmanned aerial vehicles (UAVs) are relatively small, remotely operated aircraft that are becoming increasingly popular as environmental surveying platforms. Compared to other airborne optical surveying platforms, one of the main reasons for their increasing popularity is the ease with which they obtain high-resolution imagery at moderate spatial and high temporal scales in environments that lack a high vantage point (Table 1). Historically, small UAVs were costly; required specialized training; and were limited by low-amperage, heavy batteries. Early applications focused primarily on military and public safety activities such as inspection, reconnaissance, and surveillance. In recent years, the arrival of lightweight, high-capacity batteries, low-power electronics, and compact high-definition cameras has driven the development of commercially available UAVs for hobbyists. The low operating costs have increased their potential for scientific research. New applications include mapping and monitoring of agriculture (e.g., Zhang and Kovacs 2012; Rasmussen et al. 2013), archeology (e.g., Rinaudo et al. 2012), meteorology (e.g., Rogers and Finn 2013), and marine fauna (e.g., Hodgson et al. 2013), among others.
Table 1. Comparison of airborne optical surveying platforms.
UAVs are separated into two classes: fixed-wing vehicles, which resemble a small airplane; and rotary-wing vehicles, which resemble a helicopter but with multiple propellers rotating in the horizontal plane (see Fig. 1). Small, fixed-wing UAVs have existed for over two decades and currently have flight durations up to 3.5 h and cruising speeds up to 80 km h−1. The combination of high speed and long flight duration allows for photogrammetric mapping of large areas at high spatial resolution (e.g., G. Pennucci et al. 2008, unpublished manuscript). In contrast, rotary-wing UAVs (Fig. 1) exhibit shorter flight durations (up to 50 min) and slower cruising speeds (up to 20 km h−1). However, they are capable of loitering at a fixed position, holding a steady field of view (FOV) for extended periods of time, and thus providing moderate spatial and high temporal resolution. Their ability to fly in any direction with no requirement for a runway greatly simplifies launching and landing procedures, making them an ideal instrument for monitoring otherwise difficult-to-access and highly dynamic areas, in particular surfzones.
Surfzones, defined as the areas of breaking waves, control the arrival of biota and pollutants at the beach and are important for recreation and swimmer safety, as rip currents can carry swimmers involuntarily offshore (Dalrymple et al. 2011). Surfzone kinematics are notoriously difficult to measure because of large gradients in fluid motions across a range of spatial and temporal scales (Battjes 1988; Peregrine 1998). For instance, Eulerian in situ measurements can accurately capture the temporal variability of the kinematics but cannot provide large spatial coverage without a significant number of sensors, making them cost prohibitive. Our understanding of beach morphology and surfzone processes has dramatically increased owing to long-term video monitoring stations such as Argus (Holman and Stanley 2007, and references therein). Video monitoring, whether short term or long term, requires a high vantage point, such as a large tower, a cliff, or a tall hotel adjacent to the beach. In addition, the oblique angle of the camera results in a nonuniform pixel footprint, which increases with distance from the camera. A rotary-wing UAV can operate directly above the surfzone, providing better pixel resolution for capturing successive images at a fraction of the cost. This enables new approaches for studying surfzone and beach processes that were previously unavailable. Two rotary-wing UAVs, their operational use, and their errors are discussed in the context of a unique surfzone monitoring effort.
2. Methods
a. SCOPE
The Surfzone Coastal Oil Pathways Experiment (SCOPE), designed to examine surfzone control on oil transport, was performed on a sandy, rip-channeled beach with a crescentic outer bar system at Fort Walton Beach, Okaloosa Island, Florida, in December 2013. Several UAV monitoring missions were flown on 9–15 December with varying Rhodamine WT (water tracing; 20% by concentration) dye releases (continuous, blob, and streak) both outside and inside the surfzone to augment an array of in situ instruments. Since dye is difficult to track with fixed in situ sensors, video monitoring from a high vantage point was required to complement the in situ observations.
b. UAV systems
Two types of commercial rotary-wing UAVs, the Aerialtronics Altura AT6 (Fig. 1a) and the 3D Robotics Y6 (Fig. 1b), were flown to monitor the surfzone. Both systems are hexacopters; the three-strut configuration of the Y6 offers two additional advantages over the AT6: it provides a larger, unobstructed view for the camera, and its propeller layout remains stable with only five of the six blades spinning. The latter provides important redundancy in the event of a motor failure while operating over water. UAV navigation is performed autonomously with planned missions and waypoints or remotely with radio control. The onboard flight controller is capable of stabilizing the UAV and holding its position and altitude based on the internal sensors (Fig. 1). UAV data are stored on board and transmitted to the operator in real time at frequencies of 2400 and 915 MHz for the AT6 and Y6, respectively. A live feed from the onboard camera is transmitted at a frequency of 5.8 GHz for both systems. Both UAVs are powered by a 5000-mAh, four-cell lithium polymer battery.
c. Camera and lens correction
During SCOPE, both UAVs were equipped with GoPro Hero 3+ Black edition cameras stabilized by gyroscopic gimbals capable of precise pitch control by the operator. This camera has two main advantages that are particularly useful in combination with UAVs: 1) it is lightweight (74 g), resulting in longer flying times; and 2) it has a large FOV owing to its fish-eye lens (horizontal FOV = 122.6°, vertical FOV = 94.4°), which is necessary to capture large areas from a limited altitude. With a targeted operating altitude of 120 m and an inclined viewing angle of the camera, typical ground coverage of O(1 km) alongshore and O(0.5 km) cross-shore can be achieved (Fig. 2).
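For a nadir-pointing camera, this footprint can be estimated from the FOV and altitude alone; the simplified calculation below ignores the camera inclination and lens distortion, both of which stretch the far edge of the footprint toward the O(1 km) coverage reported above.

```latex
% Approximate ground footprint of a nadir-pointing camera at altitude h = 120 m
% W_h, W_v: footprint widths spanned by the horizontal and vertical FOV
\begin{align*}
  W_h &= 2h\tan\!\left(\tfrac{\mathrm{FOV}_h}{2}\right)
       = 2(120\,\mathrm{m})\tan(61.3^\circ) \approx 440\ \mathrm{m},\\
  W_v &= 2h\tan\!\left(\tfrac{\mathrm{FOV}_v}{2}\right)
       = 2(120\,\mathrm{m})\tan(47.2^\circ) \approx 260\ \mathrm{m}.
\end{align*}
```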
Correcting the lens distortion generally requires determining the intrinsic camera parameters, that is, the focal length, principal point, and distortion coefficients. Several techniques are available to determine these intrinsic camera parameters, for example, Tsai (1987), Heikkilä and Silvén (1997), Zhang (1999), and Kannala and Brandt (2006). In addition, several ready-to-use toolboxes are available, for example, Bouguet’s (2014) calibration toolbox, Scaramuzza et al.’s (2006) OCamCalib calibration toolbox, and the lens correction tool in Adobe Photoshop. As an example, we used OCamCalib to undistort the raw camera image of a dye release on 15 December 2013 (Figs. 2a,b). Analysis of the lens correction using the corresponding chessboard test showed an rms error of 1.51 pixels. Hößler and Landgraf (2014) showed that subpixel accuracy of the GoPro camera calibration is possible using a specifically designed calibration room. However, for the purpose of this study, the relatively quick camera calibration with OCamCalib provides sufficient accuracy.
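For reference, a minimal sketch of the same calibrate-then-undistort workflow using OpenCV's fisheye model is given below. It is illustrative only (the SCOPE imagery was processed with OCamCalib), and the file names and chessboard geometry are hypothetical.

```python
# Minimal fisheye calibration/undistortion sketch using OpenCV
# (illustrative only; SCOPE images were processed with OCamCalib).
# File names and chessboard geometry below are hypothetical.
import glob
import cv2
import numpy as np

BOARD = (9, 6)           # inner chessboard corners (hypothetical)
SQUARE = 0.025           # chessboard square size in meters (hypothetical)

# 3D chessboard points in the board's own coordinate system
objp = np.zeros((1, BOARD[0] * BOARD[1], 3), np.float32)
objp[0, :, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts = [], []
for fname in glob.glob("calib/*.jpg"):
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    ok, corners = cv2.findChessboardCorners(gray, BOARD)
    if ok:
        obj_pts.append(objp)
        img_pts.append(corners.reshape(1, -1, 2).astype(np.float32))

K = np.zeros((3, 3))     # intrinsic matrix (focal length, principal point)
D = np.zeros((4, 1))     # fisheye distortion coefficients
rms, K, D, _, _ = cv2.fisheye.calibrate(
    obj_pts, img_pts, gray.shape[::-1], K, D,
    flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC | cv2.fisheye.CALIB_FIX_SKEW)
print(f"chessboard reprojection rms: {rms:.2f} px")

# Undistort one raw frame with the estimated intrinsics
raw = cv2.imread("frame_raw.jpg")
undistorted = cv2.fisheye.undistortImage(raw, K, D, Knew=K)
cv2.imwrite("frame_undistorted.jpg", undistorted)
```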
d. Mission planning
Dominant processes in the surfzone typically have time scales on the order of seconds to minutes, requiring images to be obtained every few seconds to properly resolve surfzone kinematics. During our missions, we used the camera's time-lapse function at a sample rate of 0.5 Hz with a photo resolution of 12 megapixels (4000 × 3000 pixels). Higher sample rates up to 2 Hz are possible, but obtaining each image interrupts the video stream to the operators, reducing real-time evaluation of the focus area. To obtain a near-continuous dataset, we flew two UAVs in cyclical deployments. When the battery of the first UAV reached its lower limit, the second UAV was launched to relieve it. Both vehicles were programmed to loiter autonomously at the same location to ensure that the observation position was constant. This cycling scheme allowed a UAV to be on station almost continuously. To increase the temporal coverage, the UAV batteries were charged on-site using a portable gasoline-powered generator. Fortuitously, this setup was sustainable: each cycle, defined as the time to execute a mission, land, and take off, was approximately equal to the battery charge time.
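A quick check of whether such a two-vehicle scheme remains near continuous can be made from the loiter, turnaround, and charge times; the sketch below uses placeholder values rather than SCOPE log data.

```python
# Back-of-the-envelope check of the two-UAV cycling scheme.
# All durations are illustrative placeholders (minutes), not SCOPE log values.
SAMPLE_RATE_HZ = 0.5   # camera time-lapse rate used during SCOPE
loiter = 9.0           # time on station per battery charge
turnaround = 4.0       # land, swap battery, and take off again (assumption)
charge = 12.0          # generator charge time per battery (assumption)

cycle = loiter + turnaround            # one vehicle's full cycle
sustainable = charge <= cycle          # battery is recharged before it is needed
images_per_loiter = loiter * 60 * SAMPLE_RATE_HZ

print(f"cycle = {cycle:.0f} min, sustainable = {sustainable}, "
      f"~{images_per_loiter:.0f} images per loiter")
```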
3. Results
a. Georectification
To use the aerial images to complement the available in situ data, it is necessary to project the two-dimensional (2D) image plane onto a three-dimensional (3D) geographic plane. The camera calibration defines the relation between the 2D image and the 3D geographic plane. In general, camera calibration consists of two steps (e.g., Holland et al. 1997). First, the lens distortion is removed from the images (see section 2c); then a perspective transformation is applied to project the 2D image plane onto the 3D geographic plane. To find the necessary transformation matrix (see, e.g., Hartley and Zisserman 2003, part 1), the pixel locations of ground control points (GCPs) in the undistorted image (Fig. 2b) must be related to their known GPS locations in the real world. During SCOPE, there were land-based GCPs, blue rectangular tarps on the beach (red squares), and water-based GCPs, pink boogie boards anchored to the bottom outside the surfzone (white circles). On each experiment day, the contours of the tarps were surveyed using real-time kinematic (RTK) GPS, which is accurate to O(1 cm). The boogie boards were equipped with a GT31 GPS, which is accurate to O(2–3 m). Using the transformation matrix, the image is projected onto the geographic plane; the result is referred to as an orthophoto (see Fig. 2c). For Fig. 2c the pixel resolution ranges between approximately 0.035 and 0.7 m. These values depend on the flying altitude and camera inclination, among other factors, and thus on the area covered by the orthophoto. The maximum and mean reprojection errors, that is, the differences between the surveyed GPS location of a GCP and the location of the same GCP obtained from the orthophoto, are 1.22 and 0.71 m, respectively. These errors are affected by, among others, the accuracy of 1) the GPS devices used to survey the GCPs, 2) the lens correction, and 3) locating the GCPs in the undistorted image.
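The perspective-transformation step can be illustrated with a planar homography, a common simplification when the points of interest lie near a single horizontal plane such as the sea surface. The sketch below is not the exact procedure used for Fig. 2c, and the GCP coordinates are placeholders.

```python
# Planar-homography georectification sketch (assumes the scene is well
# approximated by a single horizontal plane, e.g., the mean sea surface).
# GCP coordinates below are placeholders, not surveyed SCOPE values.
import cv2
import numpy as np

# Pixel locations of GCPs in the undistorted image (u, v)
gcp_pix = np.array([[512, 2210], [3480, 2305], [950, 880], [3120, 910]],
                   dtype=np.float32)
# Corresponding local horizontal coordinates (x alongshore, y cross-shore), m
gcp_xy = np.array([[0, 0], [450, 10], [60, 320], [430, 330]],
                  dtype=np.float32)

# Map image pixels to orthophoto pixels at a chosen ground resolution
res = 0.5                                   # m per orthophoto pixel
H_world, _ = cv2.findHomography(gcp_pix, gcp_xy / res)

img = cv2.imread("frame_undistorted.jpg")
ortho = cv2.warpPerspective(img, H_world, (int(500 / res), int(400 / res)))
cv2.imwrite("orthophoto.jpg", ortho)

# Reprojection error: difference between surveyed and rectified GCP positions
# (with only four GCPs the homography fits them exactly; in practice more
# GCPs are used, which makes this error estimate meaningful).
proj = cv2.perspectiveTransform(gcp_pix.reshape(-1, 1, 2), H_world).reshape(-1, 2)
err = np.linalg.norm(proj * res - gcp_xy, axis=1)
print(f"mean reprojection error: {err.mean():.2f} m, max: {err.max():.2f} m")
```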
b. System performance
In using UAVs to monitor the surfzone, there are two important aspects to consider: 1) the loiter duration and 2) the loitering accuracy. Longer loiter durations result in longer continuous data acquisition with the UAV. A high loitering accuracy of the UAV yields a stable FOV, making it easier to keep all GCPs used for the rectification procedure in view.
Regarding loiter duration, analysis of the log data from 36 SCOPE missions (18 per vehicle) showed that the Y6 on average draws less power than the AT6 while loitering (325 and 505 W, respectively). This results in longer loiter durations for the Y6 than for the AT6: the mean and maximum loiter durations are 9.63 and 11.61 min for the Y6 and 5.72 and 8.97 min for the AT6. These durations depend primarily on battery type and age, flying style, and environmental conditions.
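These logged durations are roughly consistent with a simple energy estimate based on the battery capacity and the mean power draw; in the sketch below, the nominal pack voltage and the usable-capacity fraction are assumptions.

```python
# Rough loiter-duration estimate from battery capacity and mean power draw.
# Nominal pack voltage and usable-capacity fraction are assumptions.
CAPACITY_AH = 5.0        # 5000-mAh pack
VOLTAGE_V = 14.8         # nominal four-cell LiPo voltage (assumption)
USABLE = 0.8             # usable fraction before the low-voltage limit (assumption)

energy_wh = CAPACITY_AH * VOLTAGE_V * USABLE      # ~59 Wh of usable energy

for name, power_w in [("Y6", 325.0), ("AT6", 505.0)]:
    minutes = energy_wh / power_w * 60.0
    print(f"{name}: ~{minutes:.1f} min at {power_w:.0f} W")
# -> Y6 ~10.9 min, AT6 ~7.0 min; comparable to the logged mean durations.
```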
To analyze the loitering accuracy of the two UAVs, we focus on the missions of 13 and 15 December 2013 (see Table 2; Fig. 3). During these missions we deployed the UAVs to fly to the same predefined waypoint in order to obtain a near-continuous dataset with a similar FOV. Defining a watch circle in which the UAV spends 90% of its time loitering (Fig. 3a), it follows that in general the AT6 has a larger loiter radius than the Y6 (see Table 2). In turn, there is less variability in the mean position of the AT6 than in that of the Y6 (except for an AT6 outlier on 15 December; see Fig. 3b). These results suggest that the GPS of the AT6 is more accurate than that of the Y6, but that the position-holding correction gains of the Y6 are better calibrated than those of the AT6.
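The watch-circle radius and the mean loiter error can be computed directly from the logged positions; a minimal sketch, using placeholder positions in local coordinates rather than the actual flight logs, is:

```python
# 90% watch-circle radius and mean loiter error from logged loiter positions.
# `east`, `north` are positions in meters relative to the commanded waypoint
# (placeholder random data used here for illustration).
import numpy as np

rng = np.random.default_rng(0)
east = rng.normal(0.0, 1.5, 600)    # placeholder logged positions (m)
north = rng.normal(0.5, 1.5, 600)

# Mean loiter error: offset of the mean position from the waypoint
mean_err = np.hypot(east.mean(), north.mean())

# Watch circle: radius around the mean position containing 90% of the fixes
r = np.hypot(east - east.mean(), north - north.mean())
watch_radius = np.percentile(r, 90)

print(f"mean loiter error: {mean_err:.2f} m, 90% watch radius: {watch_radius:.2f} m")
```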
Table 2. UAV loitering accuracy data on 13 and 15 Dec 2013. SD denotes standard deviation.
Furthermore, relating the collected wind data to each individual UAV mission on 13 and 15 December (Fig. 3b) shows that the mean wind strength (defined in Table 2, footnote a) significantly influences the variability of the UAVs' mean loiter error. However, in our experience, at altitudes around 100 m and wind speeds up to 10 m s−1, it is relatively easy to keep all the necessary GCPs in view.
4. Application
Rotary-wing UAVs are flexible surfzone monitoring platforms: they require about 1 h of setup, including placement of GCPs, and can loiter at a fixed position directly above or seaward of the surfzone for several hours when deployed in cycles. In addition, the resulting moderate-spatial- and high-temporal-resolution images can be georectified with good accuracy. These assets make them highly suitable for extracting surfzone characteristics and investigating surfzone kinematics at the key spatial and temporal scales. A single orthophoto already provides much valuable information about the surfzone that is otherwise tedious or difficult to obtain; see Fig. 2c for some examples. Additionally, averaging successive orthophotos over a certain period yields a so-called time-exposure image, commonly used in sandbar morphology and rip current studies (e.g., Lippmann and Holman 1989); see Fig. 2d. As a result, UAVs can, for instance, be used pre- and poststorm/hurricane to quickly identify and measure important morphological changes (using single orthophotos) or as a day-to-day beach safety tool to locate potentially dangerous areas where strong (rip) currents may occur (using a time-exposure image).
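Because the orthophotos share a common geographic grid, the time-exposure image is simply their pixelwise mean over the averaging period; a minimal sketch, with a hypothetical file layout, is:

```python
# Time-exposure (pixelwise mean) image from a sequence of orthophotos.
# File pattern is hypothetical; orthophotos are assumed to be co-registered.
import glob
import cv2
import numpy as np

files = sorted(glob.glob("orthophotos/*.jpg"))
acc = None
for f in files:
    img = cv2.imread(f).astype(np.float64)
    acc = img if acc is None else acc + img

# Persistent bright bands in the mean image mark preferred wave-breaking
# locations (e.g., sandbars); dark gaps can indicate rip channels.
timex = (acc / len(files)).astype(np.uint8)
cv2.imwrite("time_exposure.jpg", timex)
```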
The ability to obtain consecutive orthophotos with a similar FOV opens new opportunities in the scientific pursuit of surfzone kinematics at different spatial and temporal scales. For example, on the smallest scales of O(1 m) and O(1 s), it is possible to track individual wave crests (see Figs. 4a,b), which provides a quantitative spatial pattern of wave celerity and dissipation. This information can be used to estimate surfzone bathymetry using algorithms such as Beach Wizard (van Dongeren et al. 2008) or cBathy (Holman et al. 2013). The latter algorithm was applied by Holman et al. (2011) to aerial imagery from a fixed-wing UAV (data that are short and gappy in time and unsteady in aim compared to rotary-wing UAV data) and already showed reasonable agreement between estimated bathymetry and ground truth data. In turn, the spatial patterns in wave dissipation can be evaluated to understand the formation of surfzone eddies (MacMahan et al. 2004; Spydell and Feddersen 2009) on the intermediate spatial, O(10 m), and temporal, O(10 min), scales (see Figs. 4c,d) that affect rip current kinematics and thereby swimmer safety. On the same scales, dye releases yield estimates of tracer dispersion (Grant et al. 2005), concentration (Clark et al. 2014), and eddy diffusivity (Bogucki et al. 2005). On the largest spatial, O(100 m–1 km), and temporal, O(30 min–hours), scales, the evolution of a dye cloud can be used to investigate the residence time of material in the surfzone (Reniers et al. 2009) and the exchange of material between the surfzone and the inner shelf (see Figs. 4e,f).
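As a simple illustration of the depth-inversion idea behind such algorithms (a shallow-water shortcut, not Beach Wizard or cBathy themselves), the celerity obtained from crest displacement between successive orthophotos can be converted to depth; the crest positions below are hypothetical.

```python
# Shallow-water depth inversion from wave-crest displacement between frames.
# This is a simplified illustration, not the Beach Wizard or cBathy algorithm.
import numpy as np

G = 9.81                  # gravitational acceleration (m s^-2)
DT = 2.0                  # time between successive frames at 0.5 Hz (s)

# Hypothetical cross-shore crest positions (m) in two successive orthophotos
crest_t0 = np.array([120.0, 150.0, 185.0])
crest_t1 = np.array([128.5, 160.2, 197.0])

celerity = (crest_t1 - crest_t0) / DT      # wave phase speed (m s^-1)
depth = celerity**2 / G                    # shallow-water limit: c = sqrt(g h)

for c, h in zip(celerity, depth):
    print(f"c = {c:.2f} m/s  ->  h = {h:.2f} m")
```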
5. Summary
Here, we describe an exciting new potential of rotary-wing UAVs for monitoring the surfzone. UAVs are extremely flexible surveying platforms that can gather near-continuous, moderate-spatial- and high-temporal-resolution imagery from a fixed position high above a study site, data that have previously been difficult to obtain. The georectified images are accurate to O(1 cm–1 m), based on pixel resolution. A number of creative approaches can be applied to quickly obtain surfzone and beach characteristics in response to storms or for day-to-day beach safety information, as well as for scientific pursuits of surfzone kinematics on different spatial and temporal scales and for dispersion and advection estimates of pollutants.
Acknowledgments
RB and MS are supported by the ERC-Advanced Grant 291206-NEMO. Furthermore, this research was funded by a grant from BP’s Gulf of Mexico Research Initiative. The authors thank two anonymous reviewers for their constructive comments.
REFERENCES
Battjes, J. A., 1988: Surf-zone dynamics. Annu. Rev. Fluid Mech., 20, 257–291, doi:10.1146/annurev.fl.20.010188.001353.
Bogucki, D. J., B. H. Jones, and M.-E. Carr, 2005: Remote measurements of horizontal eddy diffusivity. J. Atmos. Oceanic Technol., 22, 1373–1380, doi:10.1175/JTECH1794.1.
Bouguet, J.-Y., cited 2014: Camera calibration toolbox for MATLAB. [Available online at http://www.vision.caltech.edu/bouguetj/calib_doc/.]
Clark, D. B., L. Lenain, F. Feddersen, E. Boss, and R. T. Guza, 2014: Aerial imaging of fluorescent dye in the near shore. J. Atmos. Oceanic Technol., 31, 1410–1421, doi:10.1175/JTECH-D-13-00230.1.
Dalrymple, R. A., J. H. MacMahan, A. J. H. M. Reniers, and V. Nelko, 2011: Rip currents. Annu. Rev. Fluid Mech., 43, 551–581, doi:10.1146/annurev-fluid-122109-160733.
Grant, S. B., J. H. Kim, B. H. Jones, S. A. Jenkins, J. Wasyl, and C. Cudaback, 2005: Surf zone entrainment, along-shore transport, and human health implications of pollution from tidal outlets. J. Geophys. Res., 110, C10025, doi:10.1029/2004JC002401.
Hartley, R. I., and A. Zisserman, 2003: Multiple View Geometry in Computer Vision. 2nd ed. Cambridge University Press, 665 pp.
Heikkilä, J., and O. Silvén, 1997: A four-step camera calibration procedure with implicit image correction. Proceedings of the 1997 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, D. Plummer and I. Torwick, Eds., IEEE, 1106–1112, doi:10.1109/CVPR.1997.609468.
Hodgson, A., N. Kelly, and D. Peel, 2013: Unmanned aerial vehicles (UAVs) for surveying marine fauna: A Dugong case study. PLoS ONE, 8, e79556, doi:10.1371/journal.pone.0079556.
Holland, K. T., R. A. Holman, T. C. Lippmann, J. Stanley, and N. Plant, 1997: Practical use of video imagery in nearshore oceanographic field studies. IEEE J. Oceanic Eng., 22, 81–92, doi:10.1109/48.557542.
Holman, R. A., and J. Stanley, 2007: The history and technical capabilities of Argus. Coastal Eng., 54, 477–491, doi:10.1016/j.coastaleng.2007.01.003.
Holman, R. A., K. T. Holland, D. M. Lalejini, and S. D. Spansel, 2011: Surf zone characterization from Unmanned Aerial Vehicle imagery. Ocean Dyn., 61, 1927–1935, doi:10.1007/s10236-011-0447-y.
Holman, R. A., N. Plant, and T. Holland, 2013: cBathy: A robust algorithm for estimating nearshore bathymetry. J. Geophys. Res. Oceans, 118, 2595–2609, doi:10.1002/jgrc.20199.
Hößler, T., and T. Landgraf, 2014: Automated traffic analysis in aerial images. Computer Vision and Graphics, L. J. Chmielewski et al., Eds., Lecture Notes in Computer Science, Vol. 8671, Springer International Publishing, 262–269, doi:10.1007/978-3-319-11331-9_32.
Kannala, J., and S. S. Brandt, 2006: A generic camera model and calibration method for conventional, wide-angle, and fish-eye lenses. IEEE Trans. Pattern Anal. Mach. Intell., 28, 1335–1340, doi:10.1109/TPAMI.2006.153.
Lippmann, T. C., and R. A. Holman, 1989: Quantification of sand bar morphology: A video technique based on wave dissipation. J. Geophys. Res., 94, 995–1011, doi:10.1029/JC094iC01p00995.
MacMahan, J. H., A. J. H. M. Reniers, E. B. Thornton, and T. P. Stanton, 2004: Surf zone eddies coupled with rip current morphology. J. Geophys. Res., 109, C07004, doi:10.1029/2003JC002083.
Peregrine, D. H., 1998: Surf zone currents. Theor. Comput. Fluid Dyn., 10, 295–309, doi:10.1007/s001620050065.
Rasmussen, J., J. Nielsen, F. Garcia-Ruiz, S. Christensen, and J. C. Streibig, 2013: Potential uses of small unmanned aircraft systems (UAS) in weed research. Weed Res., 53, 242–248, doi:10.1111/wre.12026.
Reniers, A. J. H. M., J. H. MacMahan, E. B. Thornton, T. P. Stanton, M. Henriquez, J. W. Brown, J. A. Brown, and E. Gallagher, 2009: Surf zone surface retention on a rip-channeled beach. J. Geophys. Res., 114, C10010, doi:10.1029/2008JC005153.
Rinaudo, F., F. Chiabrando, A. Lingua, and A. Spanò, 2012: Archeological site monitoring: UAV photogrammetry can be an answer. Proceedings of the XXII ISPRS Congress: Imaging a Sustainable Future, M. Shortis and J. Mills, Eds., International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXIX-B5, 583–588.
Rogers, K., and A. Finn, 2013: Three-dimensional UAV-based atmospheric tomography. J. Atmos. Oceanic Technol., 30, 336–344, doi:10.1175/JTECH-D-12-00036.1.
Scaramuzza, D., A. Martinelli, and R. Siegwart, 2006: A toolbox for easily calibrating omnidirectional cameras. 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2006), IEEE, 5695–5701, doi:10.1109/IROS.2006.282372.
Spydell, M., and F. Feddersen, 2009: Lagrangian drifter dispersion in the surf zone: Directionally spread, normally incident waves. J. Phys. Oceanogr., 39, 809–830, doi:10.1175/2008JPO3892.1.
Tsai, R. Y., 1987: A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE J. Rob. Autom., 3, 323–344, doi:10.1109/JRA.1987.1087109.
van Dongeren, A., N. Plant, A. Cohen, D. Roelvink, M. C. Haller, and P. Catalán, 2008: Beach Wizard: Nearshore bathymetry estimation through assimilation of model computations and remote observations. Coastal Eng., 55, 1016–1027, doi:10.1016/j.coastaleng.2008.04.011.
Zhang, C., and J. M. Kovacs, 2012: The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric., 13, 693–712, doi:10.1007/s11119-012-9274-5.
Zhang, Z., 1999: Flexible camera calibration by viewing a plane from unknown orientations. The Proceedings of the Seventh IEEE International Conference on Computer Vision, B. Werner, Ed., Vol. 1, IEEE, 666–673, doi:10.1109/ICCV.1999.791289.