1. Introduction
This paper describes an accurate automated technique of terrestrial¹ photogrammetry that applies to weather images obtained in uncontrolled circumstances. Traditionally, photogrammetry has been performed on carefully obtained images from special airborne cameras with carefully calibrated focal length and orientation (Slama 1980). In meteorological terrestrial photogrammetry, images have provided much useful information. For example, Hoecker (1960), Golden and Purcell (1978), and others cited in Bluestein and Golden (1993) have measured wind speeds in tornadoes photogrammetrically. Wakimoto and Bringi (1988) utilized radar data overlaid on still photographs to document the development and descent of precipitation in deep cumulus clouds during the Microburst and Severe Thunderstorm (MIST) project. Colorado microbursts were similarly analyzed using still photographs (Wakimoto et al. 1994). A set of images of a Colorado tornado was overlaid with Doppler radar data in the analysis of Wakimoto and Martner (1992). These studies were performed with telephoto images obtained under relatively controlled circumstances: that is, known focal length lenses and negligible difference between the lens and visible horizons (for definitions of terms, see Table 1). This is not always the case with terrestrial weather photographs. These investigators used established photogrammetry procedures (Holle 1982, hereafter H82) that apply only to objects with small angular displacements from the principal axis of the lens and assume that the camera is held perfectly horizontally. In other words, the roll angle of the camera, or the angle at the principal point between the “vertical” side on the photograph and true vertical in object space, is assumed to be identically zero. Saunders (1963) outlined a more general technique that involves finding the lens horizon by a manual trial-and-error method. The fast computer algorithm presented in this paper is essentially an automated and more precise version of Saunders' method with the labor-intensive manual iterations replaced by convergent iterative solutions of the photogrammetric equations.
The methods in this paper apply to movie or video images as well as still images. Historically, movies have been used to assess the motion of cloud or debris by comparing positions of a feature between frames exposed at a known time interval [e.g., Golden and Purcell (1978), Hoecker (1960), and references cited by Bluestein and Golden (1993)]. Much of the literature concerning motion picture weather photogrammetry is informal and will not be cited here. In most cases the small-angle (linear scaling) approximation described in section 3 was appropriately utilized for image scaling, and prephotogrammetry determination of landmark azimuth and elevation was used for image orientation.
In recent field programs such as the Verification of the Origins of Rotation in Tornadoes Experiment (VORTEX; Rasmussen et al. 1994), numerous photographs and video images were obtained and proved very useful for deducing cloud and tornado locations. These photographs were typically obtained with unknown focal length and camera orientation, unmarked principal point, and poorly calibrated image placement with respect to the camera optical axis. Often, cameras were hand-held with little attention to careful orientation with respect to the horizon. The cameras used lenses with a variety of focal lengths that often changed between exposures. Exact camera position was seldom recorded. Photographic parameters are not recorded at the time either by the general public or by scientists in the stressful, rapidly changing circumstances of a severe storm intercept. Thus the parameters have to be deduced a posteriori using information gained from revisiting the camera site with surveying equipment.
Very little information is present in the formal meteorological literature concerning techniques for single-camera or multicamera photogrammetry of images obtained in these uncontrolled circumstances. For that matter, papers in the formal literature rarely address issues such as the actual location of the horizon line, camera roll angle about the optical axis, or other crucial details of photogrammetric analysis. This paper describes a new algorithm for analyzing terrestrial weather images obtained with any type of lens (telephoto, normal, or wide angle). The input for this algorithm consists of measurements obtained in prephotogrammetry surveys from the camera site (section 2). These surveys consist of locating the exact camera position and, from that point, precisely measuring the azimuth and elevation angles of landmarks that appear in the images. The mathematical method for retrieving focal length, principal azimuth, and camera tilt and roll is developed in section 3. Once these parameters are found, the azimuth and elevation angle of any feature in the image can be determined. The scale distortion inherent in photographs obtained with wide-angle lenses is accommodated by the nonlinear equations developed herein. The algorithm is tested in section 4 using a telephoto image obtained during VORTEX and also simulated wide-angle photography. The range of a visible feature from a camera is unknown unless further information is available, such as a map of the damage path in the case of a tornado or a simultaneous image from a second camera with a different viewing angle. Section 5 describes a search method used in analyses of VORTEX data for locating the same feature in photographs from different directions and then deducing its 3D position.
2. The prephotogrammetry survey
A prephotogrammetric field survey is unnecessary in the ideal situation when full images are obtained with a special camera and the camera's exact Cartesian and angular coordinates are measured and recorded. The special camera would have a fixed and calibrated focal length and would mark the position of the principal point on the image (Slama 1980). Such controlled procedures are impractical when pursuing rare, short-lived phenomena such as tornadoes because of time constraints, the expense of special cameras, and the need for zoom lenses to obtain an optimum field of view for a given situation. The meteorologist has even less control of the data gathering if photography is obtained from the public, rather than as part of a scientific project. Thus focal length, camera orientation, and sometimes camera location are unknowns, which can be deduced accurately only through information acquired in a prephotogrammetric survey. Lacking a survey, the analyst typically assumes that the visible horizon line coincides with the lens horizon and that this line can then be used to determine camera orientation. However, this only works with a flat horizon at the same elevation as the camera (a condition that is hard to verify).
We now present guidelines for conducting a prephotogrammetry survey that are sufficient to obtain the data needed for scaling and orienting a weather image. The goals of the survey are to determine the camera location, as well as azimuths and elevations of landmarks visible in the image(s). Further, information must be obtained so that field-measured azimuths can be made earth relative in later analysis. Certain equipment is essential for the survey, including a global positioning system (GPS) receiver and measuring wheels to determine camera location, and a survey transit capable of measuring azimuth and elevation with an accuracy of 1′ of arc (0.0167°). Also essential are prints or traces of the original photography with marked identifiable landmarks, and a camera of fixed focal length.
First, the exact camera location must be determined. This can be facilitated in a field project if the location is marked at the time of the original photography by spray painting the ground or recording the location from GPS (the former is presently more accurate and may be essential if nearby, tall landmarks are used for image scaling and orientation). During a survey, the camera site is located through simple comparison of perspective between foreground and background objects in the photograph. In practice, a camera site can usually be determined to within 1 m unless the image is lacking in landmarks (which occurs most often in telephoto images). After the site is found, it is necessary to determine the geographical location of the site, preferably with GPS. For gross error checking, distance to nearby landmarks can be measured with a measuring wheel. Further, by measuring the azimuths (as described below) of distant landmarks of a known location, triangulation can be used to validate the camera location information.
After the camera location has been established, the azimuth and elevation angles (ϕ, θ) of as many landmarks as practical should be measured in order to minimize errors (section 3). The survey instrument is placed as close as possible to the actual camera location and leveled very precisely, because both the azimuth and the elevation of each landmark will be measured. In practice, a sketch is first made of the scene showing and numbering the approximate locations of the landmarks in the image, and a numbered list is prepared that describes each landmark (e.g., “top of leaning power pole”). The actual location of the survey point should also be documented. Increasing the number of landmarks increases the confidence with which both subjective and objective scaling and orientation can be performed. In practical experience, it has proven valuable to have two people involved in the survey, each making independent measurements in order to catch gross errors.
The azimuths obtained in the field are measured relative to the survey instrument because it is not generally possible to know the orientation of true north during the survey. In order to determine earth-relative azimuths in later processing, it is possible to locate two or more “reference landmarks” and measure the azimuth of these during the field survey. Reference landmarks should be tall objects such as transmission towers that are visible from many or all of the camera sites that are being surveyed. It is imperative that the actual location of the reference landmarks be ascertained (through topographic maps or GPS). Then, using the known camera and reference landmark locations, the actual azimuth of the reference landmark can be computed to within a fraction of a degree, and by comparing this value with the measured azimuth, the azimuth bias of the survey instrument at the camera site can be determined.
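As an illustration of this bookkeeping, the sketch below computes the true azimuth of a reference landmark from the camera and landmark coordinates with a standard great-circle initial-bearing formula, and then the azimuth bias of the survey instrument. The coordinates and the field reading are hypothetical; the paper does not prescribe a particular formula, only that the computed azimuth be good to a small fraction of a degree over these baselines.

```python
from math import radians, degrees, sin, cos, atan2

def true_azimuth(lat_cam, lon_cam, lat_ref, lon_ref):
    """Initial great-circle bearing (deg clockwise from north) from the
    camera site to a reference landmark, given geographic coordinates."""
    phi1, phi2 = radians(lat_cam), radians(lat_ref)
    dlon = radians(lon_ref - lon_cam)
    x = sin(dlon) * cos(phi2)
    y = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dlon)
    return degrees(atan2(x, y)) % 360.0

# Hypothetical numbers for illustration only.
cam = (34.5500, -102.3100)   # camera site (lat, lon) from GPS
ref = (34.6100, -102.2500)   # transmission tower (lat, lon) from a topo map
measured_az = 37.25          # instrument-relative azimuth read in the field (deg)

bias = (true_azimuth(*cam, *ref) - measured_az) % 360.0
print(f"instrument azimuth bias = {bias:.2f} deg")
# Adding this bias to every instrument-relative azimuth makes it earth relative.
```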
The preceding is sufficient for scaling images in terms of angular separations. Scaling in terms of linear distances requires range information. For example, the horizontal range of tornado debris is determined approximately by the intersection of the object's azimuth with the centerline of the tornado's damage track. Hence a careful survey of the tornado track is also required. The range of cloud features often can be obtained by triangulation (section 5) if there is simultaneous photography from different viewing directions.
3. A computer algorithm for ground-based photogrammetry
We now present a mathematical technique for retrieving the parameters describing image orientation and scaling from information gathered during the prephotogrammetry survey, and measurements made in the image. The image orientation (α, ω, κ; principal azimuth, elevation angle, roll angle) and scaling (f; focal length) are based on the azimuths and elevation angles (ϕ, θ) of two or more landmarks that are visible in the image (see Table 2 for a summary of variable definitions).
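As a minimal sketch of how such a retrieval can be posed numerically (in Python; the paper's own IDL routines are not reproduced here), a generic pinhole (gnomonic) projection maps each landmark's surveyed (ϕ, θ) to predicted image coordinates for trial values of (f, α, ω, κ), and an off-the-shelf least-squares solver adjusts the four parameters until the predictions match the measured image positions. The axis, projection, and roll-sign conventions below are assumptions standing in for the equations developed in this section, and scipy.optimize.least_squares stands in for the convergent iterative solution described in the text.

```python
import numpy as np
from scipy.optimize import least_squares

def camera_basis(alpha, omega, kappa):
    """Right, optical-axis, and up unit vectors of the camera for principal
    azimuth alpha, elevation omega, and roll kappa (all in radians).
    World axes: x east, y north, z up; azimuth is clockwise from north."""
    n = np.array([np.cos(omega) * np.sin(alpha),
                  np.cos(omega) * np.cos(alpha),
                  np.sin(omega)])                       # optical axis
    r0 = np.array([np.cos(alpha), -np.sin(alpha), 0.0])  # horizontal "right"
    u0 = np.cross(r0, n)                                  # "up" before roll
    r = np.cos(kappa) * r0 + np.sin(kappa) * u0           # apply roll about the axis
    u = -np.sin(kappa) * r0 + np.cos(kappa) * u0
    return r, n, u

def project(phi, theta, f, alpha, omega, kappa):
    """Predicted image coordinates (x'', z''), relative to the principal point,
    of a target at azimuth phi and elevation theta (deg), for focal length f
    and camera orientation angles (deg)."""
    p, t = np.radians(phi), np.radians(theta)
    d = np.array([np.cos(t) * np.sin(p), np.cos(t) * np.cos(p), np.sin(t)])
    r, n, u = camera_basis(*np.radians([alpha, omega, kappa]))
    depth = d @ n                        # component along the optical axis
    return f * (d @ r) / depth, f * (d @ u) / depth

def retrieve(landmark_angles, image_xy, f_guess=1000.0):
    """Fit (f, alpha, omega, kappa) to surveyed landmark angles (deg) and the
    measured image coordinates of those landmarks (same length units as
    f_guess, measured from the principal point)."""
    angles = np.asarray(landmark_angles, float)
    # Crude first guess (assumes landmark azimuths do not straddle 0/360 deg).
    p0 = [f_guess, angles[:, 0].mean(), angles[:, 1].mean(), 0.0]
    def residuals(p):
        pred = [project(ph, th, *p) for ph, th in angles]
        return (np.asarray(pred) - np.asarray(image_xy, float)).ravel()
    return least_squares(residuals, p0).x
```

In use, the surveyed (ϕ, θ) of the landmarks and their measured image coordinates would be passed in, with f_guess of the order of the image width in the same units (pixels or millimeters).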
































What happens if the principal point is unknown, owing to the photograph being cropped or other causes? Do three or more landmarks provide enough information to determine the six unknowns: F, α, ω, κ, and the two image coordinates of the principal point?
Difficulties in determining the principal point in a wide-angle photograph may be anticipated since the grid is locally Cartesian in the neighborhood of the principal point. Our fears were confirmed by the results of simulated tests (section 4) in which the above method for finding the principal point generally converged to the wrong point. Therefore, in scaling wide-angle photographs, the “crossed-diagonals technique” (H82) should be used on the original negative or slide in order to obtain a good estimate of the location of the principal point in the image. For example, the principal point of the Canon 16 MS 16-mm movie camera used during VORTEX is within (±0.2 mm, ±0.3 mm) of the center of the 10.37 mm × 7.52 mm image frame according to specifications obtained from the manufacturer.
4. Tests of the algorithm
With telephoto images or in situations in which only a feature in the central part of the image is of interest, the H82 method is found to be acceptable. In other words, a scaled Cartesian grid can be placed on the image, and rotated so that its horizontal axis is parallel to the visible horizon, which is assumed to be at 0° elevation. This has been done in Fig. 3 with an image of the Dimmitt, Texas, tornado of 2 June 1995 recorded in Super-VHS video format during a VORTEX intercept. The overlay was created using drawing software, and scaled, translated, and rotated to obtain a subjective best fit to the four survey landmarks. This ∼10.5° image is typical of the sort of telephoto image in which the small-angle approximation does not lead to significant errors (i.e., a Cartesian grid can be used for scaling).
This photograph (video image capture) also has been analyzed by the algorithm (Fig. 4). The resulting (slightly non-Cartesian) grid is generated in a few seconds at most, and is similar to that obtained by the more labor-intensive subjective method (Fig. 3). The grid fits the landmark positions with an rmse of 0.045°. The small-angle solution based on the two outer landmarks has an rmse that is larger by 20% [or 40% if κ is set to zero in (13b)]. Although the H82 method may be accurate enough for telephoto images, the algorithm still has a considerable speed advantage. Once the computer code has been written (a one-time effort), the work in generating the grid consists merely of inputting the azimuths and elevation angles (ϕn, θn) into the program and running it. Moreover, the algorithm computes the magnified focal length F (2242 pixels, where the image dimensions are 416 × 238 pixels), principal azimuth α (262.67°), camera elevation angle ω (1.33°), and roll angle κ (0.74°). These parameters are then available for immediately computing via (11a) the azimuth and elevation angles (ϕ, θ) of any object, given the Cartesian coordinates (x″, z″) of its image.
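Given the retrieved values, the mapping from image coordinates to angles can be inverted directly. The sketch below uses the same generic pinhole conventions as the retrieval sketch in section 3 rather than the paper's (11a); the pixel offset from the principal point is hypothetical, and only the parameter values (F = 2242 pixels, α = 262.67°, ω = 1.33°, κ = 0.74°) are taken from the text.

```python
import numpy as np

def camera_basis(alpha, omega, kappa):
    """Camera right, optical-axis, and up unit vectors (angles in radians);
    same conventions as the retrieval sketch in section 3."""
    n = np.array([np.cos(omega) * np.sin(alpha),
                  np.cos(omega) * np.cos(alpha),
                  np.sin(omega)])
    r0 = np.array([np.cos(alpha), -np.sin(alpha), 0.0])
    u0 = np.cross(r0, n)
    return (np.cos(kappa) * r0 + np.sin(kappa) * u0, n,
            -np.sin(kappa) * r0 + np.cos(kappa) * u0)

def image_to_angles(x, z, f, alpha, omega, kappa):
    """Azimuth and elevation (deg) of the object imaged at (x, z) pixels from
    the principal point, for magnified focal length f in pixels."""
    r, n, u = camera_basis(*np.radians([alpha, omega, kappa]))
    d = x * r + f * n + z * u                  # line-of-sight direction
    d /= np.linalg.norm(d)
    return (np.degrees(np.arctan2(d[0], d[1])) % 360.0,   # azimuth
            np.degrees(np.arcsin(d[2])))                    # elevation

# Dimmitt video parameters from the text; the offset (-150, +40) is hypothetical.
print(image_to_angles(-150.0, 40.0, 2242.0, 262.67, 1.33, 0.74))
```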
A more challenging test is provided by the following simulation of a 35-mm photograph (image dimensions 36 mm × 24 mm) taken with a 28-mm wide-angle lens. It is assumed that the camera is pointed at 270° azimuth, is tilted upward at 15°, and has a roll of 7.5°, and that there are four landmarks at angular positions (ϕ1, θ1) = (240°, 0.5°), (ϕ2, θ2) = (290°, 1.0°), (ϕ3, θ3) = (260°, 2.0°), and (ϕ4, θ4) = (275°, 0.7°). The principal point is assumed to be at the center of the slide. The corresponding positions of the images of the landmarks on the slide are computed from (10) and then rounded to the nearest 0.1 mm to allow for observational imprecision.
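To put the scale distortion in perspective (this is simple lens geometry rather than a value quoted in the paper), the full width and height of the 36 mm × 24 mm frame behind a 28-mm lens subtend

$$2\arctan\!\left(\tfrac{18}{28}\right) \approx 65.5^{\circ}\ \text{(horizontal)}, \qquad 2\arctan\!\left(\tfrac{12}{28}\right) \approx 46.4^{\circ}\ \text{(vertical)},$$

so near the edges of the frame tan ψ exceeds ψ (in radians) by roughly 12%, and linear (Cartesian) scaling would misplace features by a correspondingly large fraction of their off-axis angle.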
We did not succeed in finding a reliable method for locating the principal point (PP) in wide-angle photographs. Our attempts consisted of running the algorithm with different assumed positions of the PP in order to determine rmse as a function of the assumed PP location.
5. Application example using triangulation of VORTEX data
One of the uses of photogrammetry in VORTEX data analysis has been to locate tornadoes, wall clouds, and other accessory clouds by triangulation. In many events, numerous photographers, working with VORTEX and independently, obtained still and video images from different viewing directions. The video images were registered in time to the nearest ∼5 s (time registration techniques are beyond the scope of this paper). Time registration of still photos is more problematic, but in the case of tornado cyclones, the cloud features tend to rotate and evolve quickly enough that comparisons with time-stamped video provide time estimates with an uncertainty of about 15 s in practice. Using two or more simultaneous images from different angles (stereo photogrammetry), the locations of common features in the photos can be triangulated. (In field research programs, whenever practical, camera teams should synchronize their photographs via radio communication.)
The technique of graphical intersection simply involves plotting, on an isometric map projection, straight lines from the camera locations oriented along the azimuths of the feature. The intersection of these lines gives the horizontal coordinates of the common feature. The height of the feature can be computed directly from (16) below using the range and elevation angle of the feature from one of the cameras and the camera's height above mean sea level. The graphical intersection technique is illustrated in Fig. 5 for the Dimmitt, Texas, tornado photographed at ∼0106:30 UTC 2 June 1995. (In this illustration, the two images were not obtained simultaneously, so there is a small error in tornado position compared to accurate tracking that has been performed using stereo photogrammetry and mobile Doppler data for a formal study in progress.) The center of the visible tornado near the ground and the left and right extent of the completely opaque portion of the debris cloud were mapped. Examination of the scaled images as well as the output from the objective scaling and orientation technique indicates that uncertainty in azimuth from orientation and scaling errors is ∼0.1°, while uncertainty owing to the “nebulous” appearance and definition of the features is probably closer to 0.2°. It is typically the case that more uncertainty accrues from the identification of a cloud feature than from image measurement, scaling, and orientation errors. At ranges of 6–10 km, an uncertainty in azimuth of 0.3° equates to an uncertainty in position of about 30–50 m.
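A minimal numerical version of this graphical intersection is sketched below, assuming flat local east–north coordinates and hypothetical camera positions and angles. The paper's (16) is not reproduced; the height relation here is the basic range–elevation geometry plus the H82 curvature–refraction term quoted later in this section.

```python
import numpy as np

def intersect_azimuths(p1, az1, p2, az2):
    """Horizontal intersection of rays from camera positions p1, p2 (east,
    north coordinates in meters) along azimuths az1, az2 (deg from north).
    Returns the intersection point and the range parameter from each camera."""
    d1 = np.array([np.sin(np.radians(az1)), np.cos(np.radians(az1))])
    d2 = np.array([np.sin(np.radians(az2)), np.cos(np.radians(az2))])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2.
    A = np.column_stack((d1, -d2))
    t1, t2 = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t1 * d1, t1, t2

def feature_height(camera_msl, elev_deg, range_m):
    """Feature height (m MSL) from camera altitude, elevation angle, and
    horizontal range, including the H82 curvature/refraction correction."""
    theta = np.radians(elev_deg)
    return camera_msl + range_m * np.tan(theta) + 6.76e-8 * range_m**2 * np.cos(theta)**2

# Hypothetical example: two cameras ~8 km apart viewing the same feature.
xy, r1, r2 = intersect_azimuths((0.0, 0.0), 315.0, (-8000.0, 0.0), 20.0)
print(xy, r1, r2, feature_height(1180.0, 1.5, r1))
```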








This procedure is illustrated by another example from VORTEX. Figure 6 depicts the camera positions and geometry for this example. A target cloud feature in a photograph from camera P is chosen (i.e., the diamond near 267°, 2° in Fig. 7a). The coordinates of points along the ray are computed from (16), and the ray is then projected into the image of a second camera, C, using (17) and (18). The projected points in C's image are labeled with the pertinent values of RCT, the range from camera C to the target (see Fig. 7b). The analyst then chooses the location along the ray of points that coincides with the same cloud feature of the target image. In this case, the lowered portion of the distant cloud base to the right of the tornado at roughly 15 750-m range is chosen as the solution. This technique has the attractive feature of allowing estimates of uncertainty based on how close the solution ray passes to target features. In the example, it appears that the uncertainty is perhaps around 0.5°. Hence, this cloud feature is near azimuth 250.7°, elevation 0.8°, and range 15 750 m from camera C. A correction term for earth curvature and atmospheric refraction, +6.76 × 10⁻⁸R² cos²θ (H82), is used in the height computations. This correction is +3 m at P and +16 m at C. The target height is roughly 1400 m MSL from Fig. 6. [The close agreement between the calculations using P's and C's data is probably fortuitous.]
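The search can be scripted along the following lines: step points outward along the ray from camera P defined by the target's azimuth and elevation, and compute each point's azimuth, elevation, and range from camera C so that the ray can be drawn in C's scaled image. This is a geometric sketch only; the camera coordinates are hypothetical, and the paper's (17) and (18) are not reproduced here.

```python
import numpy as np

def ray_points(p_xyz, az_deg, el_deg, ranges_m):
    """3D points (east, north, up; meters) along the ray from camera P defined
    by the target's azimuth and elevation, at the given slant ranges."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    d = np.array([np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el)])
    return np.asarray(p_xyz, float) + np.outer(ranges_m, d)

def as_seen_from(c_xyz, points):
    """Azimuth, elevation (deg), and range (m) of each point as viewed from
    camera C; these angles locate the ray in C's scaled image.  For ranges of
    many kilometers, the H82 term +6.76e-8 R^2 cos^2(theta) should be added to
    any height computed from these angles."""
    rel = points - np.asarray(c_xyz, float)
    rng_h = np.hypot(rel[:, 0], rel[:, 1])
    az = np.degrees(np.arctan2(rel[:, 0], rel[:, 1])) % 360.0
    el = np.degrees(np.arctan2(rel[:, 2], rng_h))
    return az, el, np.linalg.norm(rel, axis=1)

# Hypothetical camera coordinates (east, north, height MSL in meters).
P = (0.0, 0.0, 1180.0)
C = (9000.0, -4000.0, 1190.0)
pts = ray_points(P, 267.0, 2.0, np.arange(5000.0, 20000.0, 1000.0))
for az, el, r in zip(*as_seen_from(C, pts)):
    print(f"R_CT = {r:7.0f} m  ->  azimuth {az:6.2f} deg, elevation {el:5.2f} deg")
```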
This pair of images was chosen because they reproduce clearly in publication. In reality, they were not obtained at the same instant (as evidenced by the differing morphology of the tornado funnel), although the trailing low cloud feature was evolving slowly enough that this measurement might not be too much in error. Another pathology of this image pair is that the viewing directions of the two cameras were quite close to collinear near the time these images were obtained, which can lead to serious errors in practice: stereo photogrammetry, like dual-Doppler analysis, relies on adequate separation in the angle of view.
It is not strictly necessary to identify distinctive targets. For example, it is often sufficient to choose a more amorphous target, such as the edge of a cloud base. The analyst simply chooses the point in the solution image where the ray passes through the cloud base (being cognizant of perspective), irrespective of whether a distinctive feature can be noted on the cloud base. This approach is the one utilized in the foregoing discussion of the actual photographs.
6. Conclusions
After deducing the camera site accurately by lining up foreground and background landmarks in the imagery with those seen in person and then carefully measuring the azimuth and elevation angles of the landmarks, it is possible to perform accurate photogrammetry on images with unrecorded focal length and camera orientation angles. We have developed an algorithm (section 3) that rapidly deduces the focal length, azimuth, and tilt of the optical axis, and the roll angle of the camera, and then computes the azimuth and elevation angles of any feature in the image. Assumptions that the roll is negligible, that the visible horizon is the lens horizon, and that angles are small are unnecessary and potential sources of error. Although the small-angle solution derived in section 3 and the appendix might be sufficiently accurate for telephoto images, there is no advantage to using it over the more general and accurate algorithm if the time to develop the computer code has been invested previously. The method accommodates the scale distortion inherent in wide-angle photographs (e.g., the distortion visible in the wide-angle photograph of Fig. 5). Optical effects, such as pincushion and barrel distortion, are not accommodated here, but are treatable using techniques readily found in Web searches. Results are insensitive to the exact position of the principal point for telephoto images. For wide-angle photography, the principal point can be determined only if there is a sufficient number of precisely measured landmarks with diverse azimuth and elevation angles. If all the landmarks have low elevation angles, the PP is impossible to determine and must be assumed to lie at the intersection of the diagonals of the uncropped image.
A photogrammetric search technique is described for finding an entity that is visible in one camera's photography in a simultaneous image obtained from a different direction by a second camera. Once the same object has been identified in both images, its 3D position is determined by triangulation. This method could be used to superpose visible cloud features onto fields of reflectivity and Doppler velocity observed by mobile Doppler radar.
Acknowledgments
This work was supported under Grants ATM-9617318 and ATM-0003869 from the National Science Foundation. Additional support was provided by the National Severe Storms Laboratory. We gratefully acknowledge the very thorough and helpful reviews provided by the anonymous reviewers. Computer routines in the IDL programming language that perform some of the photogrammetric functions described in this paper can be obtained from the corresponding author.
REFERENCES
Bluestein, H. B., and J. H. Golden, 1993: A review of tornado observations. The Tornado: Its Structure, Dynamics, Prediction, and Hazards, Geophys. Monogr., No. 79, Amer. Geophys. Union, 319–352.
Golden, J. H., and D. Purcell, 1978: Airflow characteristics around the Union City tornado. Mon. Wea. Rev., 106, 22–28.
Hoecker, W. H., 1960: Wind speed and airflow patterns in the Dallas tornado of April 2, 1957. Mon. Wea. Rev., 88, 167–180.
Holle, R. L., 1982: Photogrammetry of thunderstorms. Thunderstorms: A Social and Technological Documentary, E. Kessler, Ed., University of Oklahoma Press, 77–98.
Press, W. H., B. P. Flannery, S. A. Teukolsky, and W. T. Vetterling, 1986: Numerical Recipes: The Art of Scientific Computing. Cambridge University Press, 818 pp.
Rasmussen, E. N., J. M. Straka, R. Davies-Jones, C. A. Doswell III, F. H. Carr, M. D. Eilts, and D. R. MacGorman, 1994: Verification of the Origins of Rotation in Tornadoes Experiment: VORTEX. Bull. Amer. Meteor. Soc., 75, 995–1006.
Saunders, P. M., 1963: Simple sky photogrammetry. Weather, 18, 8–11.
Slama, C. C., Ed., 1980: Manual of Photogrammetry. 4th ed. American Society of Photogrammetry, 1056 pp.
Wakimoto, R. M., and V. N. Bringi, 1988: Dual-polarization observations of microbursts associated with intense convection: The 20 July storm during the MIST project. Mon. Wea. Rev., 116, 1521–1539.
Wakimoto, R. M., and B. E. Martner, 1992: Observations of a Colorado tornado. Part II: Combined photogrammetric and Doppler radar analysis. Mon. Wea. Rev., 120, 522–543.
Wakimoto, R. M., C. J. Kessinger, and D. E. Kingsmill, 1994: Kinematic, thermodynamic, and visual structure of low-reflectivity microbursts. Mon. Wea. Rev., 122, 72–92.
APPENDIX
Small-Angle Solution for a Known or Assumed Principal Point











Fig. 1. (a) The transformation of horizontal coordinates. (b) The relationship between [X, Y, Z] and (R, ϕ, θ) coordinates. Here T is the object or target and T1 is the projection of T onto the vertical plane that contains the optical axis. (c) The geometry in the principal plane of image formation for a camera with a thin lens focused on infinity, pointing along azimuth α, and tilted upward at an elevation angle ω. (d) The geometry of the image formation projected onto the plane that contains the optical axis and is normal to the principal plane. Symbols as in text.

Fig. 2. Schematic of a photograph taken with a camera at a roll angle κ. The intersection C of the diagonals (dashed) approximately locates the principal point P (the separation between C and P is exaggerated for clarity). The x′ and z′ axes are horizontal and vertical; the x″ and z″ axes are parallel to the long and short edges of the photograph.

Fig. 3. Image of the Dimmitt, TX, tornado of 2 Jun 1995 obtained by the “CAM-1” VORTEX intercept team using Super-VHS video. The four survey landmarks are marked with arrowheads near the survey point, and azimuth and elevation notated in decimal degrees. From left to right, the landmarks are a faint power pole, the left roof peak of the barn, a power pole to the right of the barn, and the top of the second power pole from the image edge. The Cartesian grid overlay was generated in graphics drawing software and magnified, translated, and rotated to find a subjective best fit with the survey data.

Fig. 4. As in Fig. 3 but the principal vertical and horizontal lines are solid white with a gap at the principal point (at image center), and the overlay is the objectively computed image scaling and orientation. Lines of constant azimuth and elevation are dash-dotted at 5° intervals, and dotted at 1° intervals. This and subsequent images are the output of an IDL image analysis program.

Fig. 5. Example of graphical intersection technique. The top image is from a 35-mm still photograph obtained by the “PROBE-4” VORTEX team located at “P4” on Texas Highway 194 southeast of Dimmitt. The bottom image is a digitized frame of Super-VHS video from the CAM-1 team located on Texas Highway 86 east of Dimmitt. The images were objectively scaled and oriented using the method described in the text. The map shows lines plotted from the camera sites along the measured azimuths of the center of the tornado near the ground, as well as the left and right extent of the opaque portion of the debris cloud. The circle represents the ∼350 m diameter location of this opaque debris cloud.

Fig. 6. The geometrical relationship in two vertical planes and in projection onto a horizontal plane between a target T and two cameras C and P. The numerical values represent the solution of the problem depicted in Fig. 7.

Fig. 7. Example of automated graphical intersection for a full 3D target location solution. In the top image (obtained by I. Wittmeyer of the VORTEX PROBE-4 team) a target has been identified on the lower edge of the cloud base trailing the tornado (black box containing white diamond). In the lower image, obtained by the VORTEX CAM-1 team, the ray from the PROBE-4 camera through the target is traced. Symbols are marked with range from the CAM-1 camera in tens of meters. The likely location of the target along the ray is between the 14 990- and 15 940-m range marks.
Table 1. Definitions of common photogrammetry terms.


Table 2. Definitions of symbols used in text.


¹ Photogrammetry in general is much more often concerned with measurements in airborne photographs obtained at near-vertical incidence. Terrestrial photogrammetry is concerned with measurements in photographs obtained from the ground at incidence close to horizontal.