Search Results
You are looking at 1 - 3 of 3 items for :
- Author or Editor: Tim Scheitlin x
- Bulletin of the American Meteorological Society x
Abstract
Visualizations enable us to detect patterns, time-evolving features, and trends in complex datasets that might not be obvious from the raw data. The visual exploration process often requires comparisons between multiple visualizations, either within a single dataset or across datasets, to identify relationships and patterns. This process, referred to as comparative visualization, is valuable for analyzing multivariate, multispectral, or multidimensional data. Existing tools facilitate visual comparisons by three means: juxtaposition (placing visuals side by side), superposition (overlaying visuals), and explicit encoding (visualizing a derived quantity corresponding to the relationship being studied). While superposition is ideal for static, geospatial datasets, where spatialization is a key component of the data, the spatiotemporal nature of Earth science datasets presents a challenge for comparative visualization. Visual Comparator is an interactive, cross-platform (desktops, kiosks, and web), open-source application developed to address this shortcoming. The application superimposes and compares up to three synchronized, animated visualizations, with transitions between the visualizations provided through a slider-based interface. This form of visualization has the advantage of drawing the viewers’ attention to changes between the datasets, enabling comparisons of scale, and reducing the clutter caused by having multiple variables in one visual. This article provides an overview of the project, a brief review of literature pertaining to human perception research and comparative visualizations, and a guide to accessing this application.
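The slider-based superposition transition described in the abstract can be thought of as a weighted blend between two co-registered fields: at one end of the slider the viewer sees the first visualization, at the other end the second, with smooth interpolation between. The sketch below illustrates that idea on gridded data; `crossfade` is a hypothetical helper for illustration, not Visual Comparator's actual implementation.

```python
import numpy as np

def crossfade(field_a, field_b, t):
    """Blend two co-registered gridded fields.

    t = 0.0 shows field_a only, t = 1.0 shows field_b only; intermediate
    slider positions mix the two linearly. Illustrative sketch only.
    """
    t = float(np.clip(t, 0.0, 1.0))  # clamp slider position to [0, 1]
    return (1.0 - t) * field_a + t * field_b

# Two toy "visualizations" on the same grid
a = np.zeros((4, 4))
b = np.ones((4, 4))

halfway = crossfade(a, b, 0.5)  # every cell blends to 0.5
```

In an interactive tool, `t` would be driven by the slider widget, and the blend applied per frame so the animated visualizations stay synchronized while the viewer sweeps between them.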
Abstract
Mixed reality taps into intuitive human perception by merging computer-generated views of digital objects (or flow fields) with natural views. Digital objects can be positioned in 3D space and can mimic real objects in the sense that walking around the object produces smoothly changing views toward the other side. Only recently have gaming graphics advanced to the point that views of moving 3D digital objects can be calculated in real time and displayed together with digital video streams. Auxiliary information can be positioned and timed to give the viewer a deeper understanding of a scene; for example, a pilot landing an aircraft might “see” zones of shear or decaying vortices from previous heavy aircraft. A rotating digital globe might be displayed on a tabletop to demonstrate the evolution of El Niño. In this article, the authors explore a novel mixed reality data visualization application for atmospheric science data, present the methodology using game development platforms, and demonstrate a few applications to help users quickly and intuitively understand evolving atmospheric phenomena.
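The world-anchoring behavior described above — a digital object that presents smoothly changing views as the viewer walks around it — reduces, at its simplest, to recomputing the viewer-to-object bearing every frame so the renderer can orient the object accordingly. The sketch below shows that geometric core in 2D; `apparent_azimuth` is an illustrative helper, not the authors' game-engine implementation (engines such as Unity handle this via their camera and transform systems).

```python
import math

def apparent_azimuth(viewer_xy, object_xy):
    """Bearing (radians) from the viewer to a world-anchored digital object.

    As the viewer walks around the fixed object, this angle changes
    smoothly, letting a renderer rotate the object's apparent face so
    the viewer eventually sees its far side. Illustrative sketch only.
    """
    dx = object_xy[0] - viewer_xy[0]
    dy = object_xy[1] - viewer_xy[1]
    return math.atan2(dy, dx)

# Object anchored at the origin; viewer circles it
from_south = apparent_azimuth((0.0, -1.0), (0.0, 0.0))  # looking north
from_east = apparent_azimuth((1.0, 0.0), (0.0, 0.0))    # looking west
```

A game engine performs the full 3D version of this per frame — tracking the device camera's pose and re-rendering the anchored object from the new viewpoint — which is what makes the mixed-reality illusion hold as the viewer moves.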