The next Explorative Design assignment is about Citizen Science. My project suggestion is about music classification and image mapping. A few years ago I was working on a little university project with two colleagues in this domain – we tried to write an algorithm that analyzes both a music track and an image in order to map the music to an image. Those were our first steps in this area, and as you can imagine we didn't get far. ;) It's quite hard to automate something like this, but I think that with some kind of learning algorithm and support from a lot of users, this project could be very interesting.

Here are some ideas and thoughts / required steps:

– music analysis & feature extraction
– image processing & feature extraction
– smartphone app lets user listen to music (short snippets of tracks)
– app presents two or more images and asks the user to choose the most fitting image

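To give an idea of what the feature-extraction step could look like, here is a minimal sketch in plain Python. It computes two classic audio features – RMS energy (a loudness proxy) and zero-crossing rate (a rough brightness proxy) – from a mono waveform. The function name and the synthetic test signal are my own illustration; a real implementation would use an audio library and many more features.

```python
import math

def extract_features(samples):
    """Compute two simple audio features from a mono waveform:
    RMS energy (loudness proxy) and zero-crossing rate (brightness proxy)."""
    n = len(samples)
    # Root-mean-square energy of the signal
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Fraction of adjacent sample pairs whose sign differs
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    zcr = crossings / (n - 1)
    return {"rms": rms, "zcr": zcr}

# Illustrative input: one second of a 440 Hz sine at 8 kHz sample rate
sr = 8000
samples = [math.sin(2 * math.pi * 440 * t / sr) for t in range(sr)]
features = extract_features(samples)
```

The same idea carries over to images: reduce each image to a compact feature vector (color histograms, texture, etc.) so that music features and image features can be compared or learned against.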
A second approach would be to skip the image analysis part and just let the user choose between moods (music classification). A lot of things could be added to such an app:

– like / favorite music
– find music based on mood
– maybe some kind of “listen to the whole track” button (maybe a cooperation with a service like Spotify – or a purchase button)
– a “skip 30 seconds” button to “browse” through songs
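The mood-based approach boils down to crowdsourced labeling: many users vote on a mood for each track, and the app aggregates the votes into a label. A minimal sketch of that aggregation step (majority vote per track) could look like this – the track IDs and mood names are just made-up examples:

```python
from collections import Counter

def aggregate_mood_votes(votes):
    """Given user votes as (track_id, mood) pairs,
    return the majority mood per track (crowdsourced label)."""
    per_track = {}
    for track_id, mood in votes:
        per_track.setdefault(track_id, Counter())[mood] += 1
    # Pick the most frequent mood for each track
    return {t: counts.most_common(1)[0][0] for t, counts in per_track.items()}

# Hypothetical votes from three users
votes = [
    ("track1", "happy"), ("track1", "happy"), ("track1", "sad"),
    ("track2", "calm"),
]
labels = aggregate_mood_votes(votes)
```

With enough votes per track, these labels could then serve as training data for a learning algorithm, and as the basis for the “find music based on mood” feature above.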