Week 01: 001–007
Week 02: 008–014
Week 03: 015–021
Day 015: Stroke Rate and Watts for 1000m Row on Concept2 Ergometer
Day 016: Ice Core Acidity from 1995–2004
Day 018: Covid-19 Cases and Deaths in Scotland Feb 2020–2021
Day 019: Covid-19 Cases and Deaths in Scotland Feb 2020–2021 (Conversion Bias)
Day 020: Bitcoin Close Prices 2017–2018
Day 021: Covid-19 Cases and Deaths in Scotland Feb 2020–2021
Week 04: 022–028
Week 05: 029–035
Day 029: Fewer Children Are Dying
Day 032: United States Covid-19 Case Rates, January 22, 2020–March 2, 2021
Day 033: Comparison of United States to Canada Covid-19 Case Rates
Day 034: Comparison of United States, Canada, and Mexico Covid-19 Case Rates
Day 035: United States and Brazil Covid-19 Case Rates, January 22, 2020–March 3, 2021
Week 06: 036–042
Week 07: 043–049
Day 043–049: Sources of Energy in California, 2002–2017
Week 08: 050–056
Day 050–056: Percentage of US with Heat Anomalies, 1895–2020
Week 09: 057–063
Day 057–063: Percentage of US with Heat Anomalies, 1895–2020
Week 10: 064–070
Day 064–070: Cherry Blossom Bloom Dates in Washington, D.C.
Week 11: 071–077
Week 12: 078–084
Week 13: 085–091
Day 085–091: FOIA Requests Fulfilled by the NSA, 2008–2020
Week 14: 092–100
The Procedure
In 2020, my partner Mary introduced me to data sonification as a way of making Internet content accessible beyond the use of screen readers. After tinkering with some of the available tools, I realized there are additional meanings that can be ‘heard’ in a data visualization.
Over 100 sequential days in 2021, I explored the overlap between data sonification and ambient music in as many different ways as I could, following the rough guidelines of the 100 Day Project. Every day, I created a sonification of a chart using the following workflow:
1. Find an interesting data table that can be displayed as a chart or graph. In some cases, I’ve needed to get the raw data as a .CSV from the web page, pull it into Google Sheets to normalize it, and then output it as a web page.
2. Load the data table into the SAS Graphics Accelerator plugin for Chrome. This plugin lets you capture data tables, translate them into charts and graphs, and then create audio representations of the data when viewing the visualizations. It also has a map sonification function that I’ll try out. (One bias I’ve found so far: the plugin makes all the decisions about how to map the data to five octaves on a chromatic scale. This is a point of contention in the sonification community, as you’re immediately locked into a specific Western musical interpretation of the data.)
3. Create a data visualization from the table that can be output by the plugin. In most cases, this is a bar chart or linear graph.
4. Prepare to ‘perform’ the visualization. A few settings have to be selected: chord or melody for the sound itself, and explore mode (‘playing’ the chart by hand) versus scan mode, which plays the chart automatically. If I’m using explore mode, I pick a BPM that feels on theme with the data and run a metronome to stay on time.
5. Screen/audio record the performance of the data visualization via QuickTime Pro. If there are multiple charts or graphs, I will record each as its own video with timing to match. (I'm using Soundflower to capture the audio out of Chrome so it's incorporated into the video.)
6. Pull the screen recordings from QuickTime Pro into a project in Ableton Live. I’ll set the BPM to match the timing of the performance.
7. Convert the audio recordings into MIDI tracks using Ableton’s audio detection algorithm. (Bias: While I wish this step were foolproof, Ableton can mangle the audio data from the original sonification. In a few cases I have left stray data [notes] in place, but I have also had to manually clean up the MIDI data so it’s accurate, move it up or down an octave to fit an instrument’s register, quantize the MIDI data to a 1/16 + 1/16T grid, or abandon whole data sets because a clean conversion proved impossible.)
8. Assign instruments to the MIDI tracks based on the theme of the initial data. This is usually the first point in this process that I hear what the data sounds like, and start responding to it in terms of musical decision-making.
9. Set levels, panning, and effects for the track. Every data set has its own story, and while I want the source data to stay true to what it’s communicating, I’ve made some creative decisions to emphasize the theme of the data’s subject or to acknowledge biases that may be inherent in the data set itself. I have also made choices regarding instrument and timbre to push the sonification toward ambient, electronic, or classical music genres.
10. Bounce and master a mix of the audio. For this step, I use iZotope’s Ozone and the Cloudbounce desktop app, which I’ve also leveraged for my other musical project.
11. Post the mix to my YouTube channel for this project. You can go here to hear all the tracks and get links to the source data for each sonification. For the second half of the project, I’ve only posted the final iteration of a particular data set’s sonification (to spare you boring, incremental repetition).
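To illustrate the mapping bias mentioned in step 2, here is a minimal Python sketch of the kind of value-to-pitch mapping involved; the linear scaling and the base note are my own assumptions, not the SAS plugin’s actual algorithm:

```python
def to_chromatic_midi(values, low_note=36, octaves=5):
    """Map data values linearly onto a chromatic scale spanning five
    octaves of MIDI notes. low_note=36 (C2) is an assumed base pitch."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [low_note] * len(values)  # flat data sits on the base note
    span = octaves * 12  # 12 semitones per octave
    return [low_note + round((v - lo) / (hi - lo) * span) for v in values]

# A rising-then-falling series becomes a symmetric melodic contour.
print(to_chromatic_midi([10, 25, 40, 25, 10]))  # → [36, 66, 96, 66, 36]
```

Once the data is locked onto a chromatic (or any other Western) scale like this, a particular musical interpretation is already baked in, which is exactly the point of contention noted above.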
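Step 7’s quantization to a combined 1/16 + 1/16T grid can be sketched as snapping each note onset to the nearest point on either grid; this is my own reconstruction of the idea, not Ableton’s implementation:

```python
from fractions import Fraction

def quantize(onsets, steps=(Fraction(1, 4), Fraction(1, 6))):
    """Snap note onsets (in beats) to the nearest point on a combined
    grid: straight sixteenths fall every 1/4 beat, sixteenth triplets
    every 1/6 beat. Returns exact beat positions as Fractions."""
    snapped = []
    for t in onsets:
        candidates = [step * round(t / step) for step in steps]
        snapped.append(min(candidates, key=lambda c: abs(c - t)))
    return snapped

print(quantize([0.22, 0.4, 0.8]))
# → [Fraction(1, 4), Fraction(1, 3), Fraction(5, 6)]
```

The union of the two grids keeps both straight and triplet feels available, which is why a single 1/16 grid alone would flatten any swung or triplet material in the sonification.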
Sources + Inspiration
I used this project as a way to learn more about sonification, and I don’t claim to be an expert in any of this. There are folks who have spent decades of their careers mastering this set of skills and have created plenty of extraordinary sonifications.
If you're seeking more inspiration in the realm of data sonification, here are some good places to start:
This is a good article by Carolyn Beans about scientists and musicians collaborating, highlighting the work of Margaret Schedel and Carla Scaletti.
Check out the work of Brian Foo, who is the Data-Driven DJ and curator of Citizen DJ. This episode of 20 Thousand Hertz goes into his process for the data sonification of an epilepsy patient's seizures.
I liked this high-level overview of the sonification practice by Miriam Quick and Duncan Geere. They’re launching a new podcast called Loud Numbers focused on the practice, with many examples they’ve hand-created. Their newsletter associated with the podcast is great.
If you want to go deep on the practice of sonification, The Sonification Handbook is available as a free PDF download. This is a huge resource I’m slowly working through.
This tool used by the team at Reveal for assigning time-series data to MIDI notes looks super useful; there are more details on it here. If you wanted to keep the output in code, combining it with Sonic Pi could yield some interesting results.
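To hint at what the keep-it-in-code route might look like, here is a hedged Python sketch that generates a small Sonic Pi script from a data series. The function and the linear five-octave mapping are my own assumptions, not how the Reveal tool actually works; `play` and `sleep` are real Sonic Pi commands.

```python
def to_sonic_pi(values, low_note=36, octaves=5, beat=0.5):
    """Emit Sonic Pi code that performs a data series as a melody on a
    five-octave chromatic range, one note every `beat` beats."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) or 1  # avoid dividing by zero on flat data
    span = octaves * 12
    notes = [low_note + round((v - lo) / scale * span) for v in values]
    return "\n".join(f"play {n}\nsleep {beat}" for n in notes)

print(to_sonic_pi([10, 40, 25]))
```

Pasting the output into a Sonic Pi buffer would play the series one note per half beat; from there, Sonic Pi’s synths and effects could take over the musical decisions this project made in Ableton.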
Contact
Email: dksherwin (at) msn (dot) com
Twitter: https://www.twitter.com/changeorder
Instagram: https://www.instagram.com/dksherwin