Visual analytics provides a powerful way for people to see patterns and trends in data. But in real life we use both our eyes and our ears, so can we also hear patterns and trends if we listen to the data?
I spent a few days studying the JavaScript Sound API and adding simple data sonification to our Visual Process Analytics (VPA) to explore this question. I don’t know where adding the auditory sense to the analytics toolkit may lead us, but you never know. It is always good to experiment with ideas.
Note that the data sonification capabilities of VPA are very experimental at this point. To make matters worse, I am not a musician by any stretch of the imagination, so the generated sounds in the latest version of VPA may sound horrible to you. But this represents a step forward toward better interactions with complex learner data. As my knowledge of music improves, the data should sound less terrifying.
The first test feature added to VPA is very simple: it just converts a time series into a sequence of notes and rests. To adjust the sound, you can change a number of parameters such as pitch, duration, attack, decay, and oscillator type (sine, square, triangle, sawtooth, etc.). All these options are available through the context menu of a time series graph.
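For readers curious about what such a mapping might look like in code, here is a minimal sketch built on the browser's Web Audio API (which I assume is the API being referred to). It is not the actual VPA implementation; the function name, frequency range, and note length are my own illustrative choices. The idea is simply to map each value in a time series to a pitch, apply a short attack/decay envelope, and treat missing values as rests.

```javascript
// Minimal sketch (not the actual VPA code): sonify a time series with the Web Audio API.
// Each finite value is mapped linearly to a frequency; null/NaN entries become rests.
function playTimeSeries(values, {
  noteDuration = 0.25,    // seconds per note
  attack = 0.02,          // seconds to ramp the gain up
  decay = 0.1,            // seconds to ramp the gain down
  minFreq = 220,          // Hz assigned to the smallest value
  maxFreq = 880,          // Hz assigned to the largest value
  oscillatorType = 'sine' // 'sine' | 'square' | 'triangle' | 'sawtooth'
} = {}) {
  // Note: some browsers require the AudioContext to be created after a user gesture.
  const ctx = new AudioContext();
  const finite = values.filter(v => Number.isFinite(v));
  const lo = Math.min(...finite);
  const span = (Math.max(...finite) - lo) || 1; // avoid division by zero for a flat series

  let t = ctx.currentTime;
  for (const v of values) {
    if (Number.isFinite(v)) {
      const freq = minFreq + ((v - lo) / span) * (maxFreq - minFreq);
      const osc = ctx.createOscillator();
      const gain = ctx.createGain();
      osc.type = oscillatorType;
      osc.frequency.setValueAtTime(freq, t);
      // Simple attack/decay envelope to avoid clicks at note boundaries.
      gain.gain.setValueAtTime(0, t);
      gain.gain.linearRampToValueAtTime(0.8, t + attack);
      gain.gain.setValueAtTime(0.8, t + noteDuration - decay);
      gain.gain.linearRampToValueAtTime(0, t + noteDuration);
      osc.connect(gain);
      gain.connect(ctx.destination);
      osc.start(t);
      osc.stop(t + noteDuration);
    }
    // A rest is simply silence: advance the clock without scheduling a note.
    t += noteDuration;
  }
}

// Example: playTimeSeries([1, 3, null, 2, 5], { oscillatorType: 'triangle' });
```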
While the sound plays, you can also see a synchronized animation in VPA (as demonstrated by the embedded videos). This means that, from now on, VPA is a multimodal analytics tool. But I have no plan to rename it, as data visualization is, and will remain, dominant in the data mining platform.
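The synchronization itself can be done in more than one way. One common approach (again only a sketch under my own assumptions, not the actual VPA code) is to drive the visual cursor from the audio clock, so the graphics follow whatever the sound scheduler is doing. Here, drawCursor is a hypothetical callback that highlights the data point currently being played.

```javascript
// Sketch: keep an on-screen cursor in step with the audio schedule by reading
// the AudioContext clock inside a requestAnimationFrame loop.
// ctx        - the AudioContext used for playback
// startTime  - the ctx.currentTime at which the first note was scheduled
// drawCursor - hypothetical callback that marks the currently sounding point
function animateWithAudio(ctx, startTime, noteDuration, pointCount, drawCursor) {
  function frame() {
    const elapsed = ctx.currentTime - startTime;      // seconds since playback began
    const index = Math.floor(elapsed / noteDuration); // which note is sounding now
    if (index >= 0 && index < pointCount) {
      drawCursor(index);            // e.g. move a marker along the time series graph
      requestAnimationFrame(frame); // keep animating until the last note finishes
    }
  }
  requestAnimationFrame(frame);
}
```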
The next step is to figure out how to synthesize better sounds from multiple types of actions, treating them as multiple sources or instruments (much like the Song from Pi). I will start by sonifying the scatter plot in VPA. Stay tuned.