Suspended Cities

Real-time Data Visualization and Case Study
Custom Graphics Software
In any data visualization piece, the representation of the data must be legibly connected both to the data itself and to the artwork's conceptual thread.

Drawing on the suspended sensation of biking over city bridges, I visualize real-time New York bike-share routes that interact with a weather simulation to create the final visual system.

This system conveys information about the relative density of bike-share usage while indicating relevant weather conditions.

Background

I love to bike around New York, and I avidly use NYC’s Citibike bike-share system. No matter how bad the job, how rough the day, or how long the night, biking is a meditative and restorative experience. There are few feelings like floating over the crest of the bridge, a moment suspended before descending effortlessly into the clamor of the streets. Whenever I consider taking a bike anywhere, especially across the bridge, three things are on my mind: is it too windy, is it raining too hard, and can I find an electric bike? So, I wanted to create a work that clearly communicates this information to the viewer while also conveying the emotional freedom and weightlessness that biking brings me.

Below are two more samples showing the same visualization in different weather environments (a nice summer day, a rainy fall day, an overcast winter day) - see if you can tell which is which. You might also notice the degree of flow in the fluid post-processing emitted by the rendered routes: areas of higher ride density emit a stronger fluid pulse, adding another visual layer of semantics.

Project Structure

This project consists of several pieces of custom software, including:

  • A Python-based ETL pipeline to gather, store and query real-time Citibike ride data
  • Historical data analysis using simple machine learning algorithms to predict the distribution of ride destination stations based on the station of ride origin
  • OpenStreetMap and OpenRouteService API integrations to calculate turn-by-turn cycling directions (using real-time road closures, etc.) for every predicted route, and to generate a GeoJSON of {Latitude, Longitude} points that lie along the route (see the sketch after this list)
  • Weather integration via the Visual Crossing Weather API
  • A TouchDesigner visualization system leveraging vertex shaders to generate routes from real-time ride starts and deform them according to the returned GeoJSON route point coordinates. This system also integrates current wind and rain conditions into the visualization, and manages the hardware interface.
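
To make the prediction and routing bullets concrete, here is a minimal sketch of how those two steps might fit together. It assumes the openrouteservice Python client and stands in a plain frequency count for the "simple machine learning" step; the names, profile, and parameters are illustrative, not the production code.

```python
from collections import Counter

import openrouteservice

client = openrouteservice.Client(key="YOUR_ORS_API_KEY")  # placeholder key

def destination_distribution(historical_rides, start_station_id, top_n=10):
    """Empirical distribution over destination stations for one start station,
    truncated to the top_n most likely destinations."""
    counts = Counter(
        ride["end_station_id"]
        for ride in historical_rides
        if ride["start_station_id"] == start_station_id
    )
    top = counts.most_common(top_n)
    total = sum(n for _, n in top)
    return {dest: n / total for dest, n in top} if total else {}

def route_points(start_lonlat, end_lonlat):
    """Turn-by-turn route between two stations, returned as the [lon, lat]
    points lying along the route (extracted from the GeoJSON response)."""
    geojson = client.directions(
        coordinates=[start_lonlat, end_lonlat],
        profile="cycling-regular",  # assumption: a bike-appropriate ORS profile
        format="geojson",
    )
    return geojson["features"][0]["geometry"]["coordinates"]
```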

Weather Integration

As I noted in the introduction, the integrity of the data visualization was a key component of this project. Because data is simply an observer, and an observer’s interactions with a system must be legible to other observers, I wanted the data’s presence to be as obvious as that of a physical viewer. Four primary weather features alter the dynamics of the visualization:

  • Wind speed and direction are visualized continuously through the fluid post-processing layer’s velocity field and diffusion parameters
  • Cloud cover masks the corners of the image, substantially decreasing emitted light when overcast
  • Precipitation (snow and rain) is introduced when appropriate through an additional layer in the fluid post-processing
  • Finally, temperature and humidity are used to choose the palette from a set of pre-selected palettes I designed (see the sketch after this list)
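
As a rough illustration of these mappings, the sketch below converts a Visual Crossing current-conditions payload into the parameters above. The field names (windspeed, winddir, cloudcover, precip, preciptype, temp, humidity) follow Visual Crossing's response format, but the scalings, thresholds, and palette rule are placeholder assumptions, not the values used in the piece:

```python
import math

# Placeholder names standing in for the pre-designed palette set
PALETTES = ["warm_humid", "warm_dry", "cool_humid", "cool_dry"]

def pick_palette(temp_f, humidity):
    # Assumed rule: bucket temperature and humidity into four quadrants
    warm = temp_f >= 60
    humid = humidity >= 60
    return PALETTES[(0 if warm else 2) + (0 if humid else 1)]

def weather_to_params(cc):
    """Map a currentConditions dict to the visualization's parameters."""
    wind_rad = math.radians(cc["winddir"])  # wind bearing in degrees -> radians
    return {
        # wind drives the fluid layer's velocity field and diffusion
        "fluid_velocity": (math.sin(wind_rad) * cc["windspeed"],
                           math.cos(wind_rad) * cc["windspeed"]),
        "fluid_diffusion": min(cc["windspeed"] / 30.0, 1.0),  # assumed scaling
        # cloud cover darkens the corner mask (0 = clear, 1 = fully overcast)
        "corner_mask": cc["cloudcover"] / 100.0,
        # precipitation toggles the additional fluid layer
        "precip_active": (cc.get("precip") or 0) > 0,
        "precip_is_snow": "snow" in (cc.get("preciptype") or []),
        # temperature and humidity choose among the pre-selected palettes
        "palette": pick_palette(cc["temp"], cc["humidity"]),
    }
```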

Key Technical Challenge

Transferring Large-Dimensional API Data from CPU to GPU in Real-Time

The real-time nature of this project required quite a bit of CPU work on each API call. First we have to hit the API, then calculate or look up the destination distributions for our given start station, then calculate the routing directions and GeoJSON points, and finally transmit all of that data to our GLSL Material for rendering.
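
In pseudocode terms, the naive per-poll flow chains the helpers sketched earlier under Project Structure; fetch_new_ride_starts, STATION_LONLAT, and upload_to_glsl_material are placeholders here, and everything before the final upload is CPU work:

```python
# Naive per-poll pipeline (illustrative; reuses the earlier sketch's helpers)
def on_api_poll(historical_rides):
    for ride in fetch_new_ride_starts():                       # 1. hit the API
        dist = destination_distribution(historical_rides,
                                        ride["start_station_id"])  # 2. predict
        routes = [route_points(ride["start_lonlat"], STATION_LONLAT[dest])
                  for dest in dist]                            # 3. route + GeoJSON
        upload_to_glsl_material(routes)                        # 4. CPU -> GPU
```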

This presented complications, as transferring data from the CPU to the GPU in real time is quite taxing on the system. My solution was to precompute all possible routes and their corresponding points so that we could simply look them up while rendering on the fly:

  • Generate all possible routes using destination distributions for all possible start stations (distributions are truncated to the ten most likely destinations per start station)
  • Pre-compute predicted routes for all start<>end station combinations, and then generate points for each route using ORS. Then, pack the points for all of these routes, representing the universe of possible rides, into a uniform array that can be passed to our vertex shader (see the packing sketch after this list)
  • During this pre-computation, we also keep track of the number of points per route and the offset, or number of points in the packed texture before this route's points begin, for each route
  • Then we use the Event CHOP to create an Event for each predicted destination route every time a ride is started. Custom functionality was built to include route information like points and offset in the Event data structure. These Events are passed to the vertex shader as instances, along with the lookup texture of unrolled points for all routes
  • For each instance (i.e. each route to render), the vertex shader looks up that route's point array using Custom Instance Attributes for offset and number of points, deforming each vertex of the instance (a Line SOP with many points) to the corresponding point in the geo-located route array
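
Here is a minimal sketch of that packing step, under the same illustrative names as before; the real system packs into a texture, but the offset/count bookkeeping is the same idea:

```python
def pack_routes(routes):
    """Flatten all precomputed routes into one lookup buffer.

    routes: dict mapping (start_id, end_id) -> list of [lon, lat] points.
    Returns the flat point list (the lookup texture's contents) and a
    per-route (offset, count) table used as Custom Instance Attributes.
    """
    flat_points = []   # unrolled points for every route, in packing order
    attributes = {}    # (offset, count) per start<>end combination
    for key, points in routes.items():
        attributes[key] = (len(flat_points), len(points))
        flat_points.extend(points)
    return flat_points, attributes
```

At render time, the instance for a given route reads count points starting at offset, so per-frame traffic across the CPU-GPU boundary is limited to the small per-instance attributes rather than the route geometry itself.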