Student Presenter: Alan Whelan

Event: Data Visualization in Virtual Reality for Oceanography

Wednesday, April 30th, 2025, 2-5 pm, JSGMakerspace @ the STEM Library

This event was generously hosted by an undergraduate student who does research in the Atmospheric and Oceanic Sciences department. The primary objective was to showcase how virtual reality can be used to better visualize oceanographic data. The world’s oceans are studied extensively through satellite imagery and observations, which produce raw numerical data that researchers must then interpret. However, interpreting this complex, multidimensional data is very difficult using traditional 2D projections.

To address this, researchers are turning to the immersive nature of virtual reality to make the data more interpretable. Using Python and ParaView to prepare the visualizations, and a Meta Quest 2 VR headset to view them, this demonstration produced clear, time-evolving, interactive 3D projections of several categories of oceanographic data, including temperature, kinetic energy, and salinity. The projections represented data points collected from the Gulf of Mexico.

I think the student presents a compelling argument for using virtual reality as a tool to better visualize and interpret oceanographic data sets. The presentation did a good job of highlighting what current data analysis looks like and the pitfalls of that approach. Presently, output datasets from satellite observations contain millions of data points, each recording a specific quantity along with its latitude, longitude, depth, and time. Visualizing such high-dimensional data is difficult: you can’t get a clear picture by looking at tables of numbers. Current visualizations typically come from taking “slices” of the data and projecting them in two dimensions, but these are poor representations of the true volumetric nature of our seas, oceans, and gulfs.
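The “slicing” limitation described above is easy to see in code. The sketch below builds a synthetic 4-D field shaped (time, depth, latitude, longitude), mimicking the data structure the student described; the array shape and values are illustrative assumptions, not real observations.

```python
import numpy as np

# Hypothetical 4-D field shaped (time, depth, lat, lon), mimicking the
# structure described above; values are synthetic, not real observations.
temperature = np.random.default_rng(1).normal(20.0, 2.0, size=(10, 5, 30, 40))

# A traditional 2-D "slice": one time step at one fixed depth level.
surface_at_t0 = temperature[0, 0, :, :]

# The slice keeps only lat/lon; every other depth level and time step
# in the volume is discarded, which is exactly the critique above.
print(surface_at_t0.shape)  # prints (30, 40)
```

Each 2-D slice shows one plane of one snapshot, so a reader must mentally stitch many such images together to recover the volume, which is the gap the VR approach aims to close.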

I had the pleasure of trying this form of visualization at the event through the lens of the Quest 2 virtual reality headset. The student giving the presentation accessed a large oceanographic dataset from the GLORYS Reanalysis and applied basic equations, written in Python, to obtain derived quantities (kinetic energy, salinity, and temperature). Since each data point carries location and time coordinates (longitude, latitude, depth, and a timestamp), the derived quantities can be stored and exported to ParaView, where they are rendered as 3D models. The student chose a dataset spanning August to November of 2024, creating 121 individual three-dimensional models of the Gulf of Mexico. When you quickly cycle through this range, what you get is an evolving graph of the data: you can see certain areas of the Gulf get warmer or cooler, more or less saline, faster- or slower-moving, giving the viewer a clear, encompassing view of what is happening. The purpose of the project is to demonstrate how much simpler VR makes it to understand what is happening, and I can personally attest that using the headset, and asking questions while viewing the data, gave me a clear understanding of everything going on, even as someone with no experience in the AOSC field or with VR itself.
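A minimal sketch of the kind of pipeline step described above: deriving kinetic energy from velocity components and writing the result in a format ParaView can open. This is my own illustration, not the student’s actual code; the grid shape, file layout, and variable names are assumptions, and real GLORYS data would be read from NetCDF rather than generated randomly.

```python
import numpy as np

def kinetic_energy(u, v):
    """Kinetic energy per unit mass (m^2/s^2) from eastward (u) and
    northward (v) velocity components."""
    return 0.5 * (u**2 + v**2)

def write_vtk_structured(path, field, name, spacing=(1.0, 1.0, 1.0)):
    """Write a 3-D scalar field as a legacy-format VTK STRUCTURED_POINTS
    file, one of the formats ParaView reads directly."""
    nz, ny, nx = field.shape
    with open(path, "w") as f:
        f.write("# vtk DataFile Version 3.0\n")
        f.write(f"{name}\nASCII\nDATASET STRUCTURED_POINTS\n")
        f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
        f.write("ORIGIN 0 0 0\n")
        f.write(f"SPACING {spacing[0]} {spacing[1]} {spacing[2]}\n")
        f.write(f"POINT_DATA {nx * ny * nz}\n")
        f.write(f"SCALARS {name} float 1\nLOOKUP_TABLE default\n")
        # VTK expects x varying fastest, which matches a C-order ravel
        # of a (z, y, x)-shaped array.
        np.savetxt(f, field.ravel(), fmt="%.6f")

# Toy example: random velocity fields on a small (depth, lat, lon) grid.
rng = np.random.default_rng(0)
u = rng.normal(size=(4, 8, 8))
v = rng.normal(size=(4, 8, 8))
ke = kinetic_energy(u, v)
write_vtk_structured("ke_day000.vtk", ke, "kinetic_energy")
```

Repeating this over each daily time step would produce a file series (one per day), which ParaView loads as an animation, matching the 121 time-stepped models the student cycled through.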

The student did a great job of avoiding logical fallacies while presenting their work. The event focused largely on hands-on interaction with the VR technology rather than on a lecture about why the approach is innovative. Even so, the poster that accompanied the presentation was very well made. It clearly demonstrated the process used to derive the data, explaining each of the steps and tools involved, and then offered readers different real-world scientific uses for the technology. Throughout, the arguments stayed objective and straightforward. One thing I would have liked to see, however, is examples of the two-dimensional projections the student mentioned; that way, viewers could have compared the two types of projection side by side, further supporting the argument that virtual reality is superior in this context.