I started this project trying to force myself to think away from screens. How would one experience a map-based dataset without actually having to look at a screen? I will document the project and the concept separately; this post is meant to explain how I built / am building this experience.
Moving away from screens, the first alternative I considered was audio. With that in mind, I explored different datasets that I could visualize. At around the same time, the Sutherland Springs church shooting occurred. I wondered if I could “visualize” pain. I found the concept of depicting pain as a measure interesting and challenging. I first looked up US shooting data, but soon realized that if pain was my subject, gun violence and US data alone might not cut it. I ended up with global homicide rates.
Here’s an image from Wikimedia that I will bring up again later in this post.
I also realized that sound was probably not the best way to depict pain. Pain is a feeling in itself, and that feeling can be used to depict the dataset. I was reminded of some experiments from a few years ago in which I used a TENS machine to electrically contract my ulnar nerve (and a few facial nerves). While the experience is not necessarily painful, it does feel weird for the person wearing it. Anyone viewing it, on the other hand, feels a lot worse than the person wearing it. For some reason, the idea of someone else being controlled by the TENS machine is more “painful” than being controlled yourself.
I decided to have the audience walk through a space that bears no resemblance to a map – just a white rectangle. They would wear this hacked device. A Kinect tracks their position in the space, and as they walk through it, they temporarily explore/unveil parts of the map. When they step on a land mass, the TENS machine they wear forces their forearm to twitch. As they move into areas with higher homicide rates, the TENS machine forces contractions more frequently (and less frequently in areas with lower rates).
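The rate-to-frequency mapping above can be sketched as a simple linear interpolation. This is a minimal illustration, not the installation’s actual tuning – the 0.5–5 Hz band and the assumption that rates arrive normalized to [0, 1] are mine:

```java
// Illustrative sketch: map a normalized homicide rate to a twitch frequency.
// The frequency band is an assumption, not the project's real calibration.
public class TwitchMapper {
    static final double MIN_HZ = 0.5; // calmest regions: one twitch every 2 s
    static final double MAX_HZ = 5.0; // worst regions: five twitches per second

    // rate is assumed normalized to [0, 1]; clamp, then interpolate linearly
    static double twitchFrequencyHz(double rate) {
        double r = Math.max(0.0, Math.min(1.0, rate));
        return MIN_HZ + r * (MAX_HZ - MIN_HZ);
    }

    public static void main(String[] args) {
        System.out.println(twitchFrequencyHz(0.0)); // lowest rate
        System.out.println(twitchFrequencyHz(1.0)); // highest rate
    }
}
```

A linear map keeps the experience legible: a walker crossing from a low-rate to a high-rate region feels the tempo ramp smoothly rather than jump.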
The TENS machine is not programmatically controllable out of the box, so I had to modify it. I decided to play with the potentiometer that controls the frequency of twitches. The other option was the potentiometer controlling the intensity of each twitch, but I could not bring myself to touch that one: if the intensity spiked, it could be actually painful. An Arduino MKR1000 is my microcontroller of choice for this project – it’s portable and WiFi-enabled. A Processing sketch runs the Kinect code to track the audience member in the space, creates the visuals, and sends signals to the Arduino. The dataset is itself a visual representation of the data (seen in one of the reference links below): the redder a region, the higher its homicide rate. So I decided to use this image as a heatmap that the computer reads and translates.
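Reading the map as a heatmap boils down to sampling the red channel at the tracked position. Here is a standalone sketch of that idea in plain Java rather than Processing, using a synthetic two-pixel image so it runs without the actual map file – the real sketch would sample the Wikimedia image at the Kinect-derived coordinates:

```java
import java.awt.image.BufferedImage;

// Illustrative sketch: turn pixel redness at the walker's position into
// a normalized "rate". The synthetic image below stands in for the map.
public class HeatmapSampler {
    // Map pixel redness (0-255) to [0, 1]; black/ocean pixels read as 0.
    static double rateAt(BufferedImage map, int x, int y) {
        int rgb = map.getRGB(x, y);
        int red = (rgb >> 16) & 0xFF; // extract the red channel from ARGB
        return red / 255.0;
    }

    public static void main(String[] args) {
        // A 2x1 stand-in "map": one fully red pixel, one black pixel.
        BufferedImage map = new BufferedImage(2, 1, BufferedImage.TYPE_INT_RGB);
        map.setRGB(0, 0, 0xFF0000); // high-homicide region
        map.setRGB(1, 0, 0x000000); // ocean / no data
        System.out.println(rateAt(map, 0, 0)); // 1.0
        System.out.println(rateAt(map, 1, 0)); // 0.0
    }
}
```

The nice property of this approach is that the image *is* the dataset: no lookup tables, no geocoding – the computer just reads the same picture a human would.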
I will insert a video here soon.