A proof-of-concept app was developed for iOS that lets the user hear different hummed melodies, and then hear what each of those melodies sounds like with the same musical algorithms applied.
Several potential end users were interviewed, from hobbyist musicians to a multi-Grammy-Award-winning composer.
With the knowledge gained from the interviews, we began to model potential futures of song ideation and audio recording. The futures that seemed most likely were developed further into opportunities for design solutions.
Concepts were then generated to capture the opportunities discovered during the futures study. These ranged from app ideas to physical recording devices, and combinations of the two.
A direction was finalized that addressed two major pain points for songwriters: 1. the inability to record ideas easily and immediately, and 2. a lack of tools that promote song ideation. A key-detection algorithm was developed in Max/MSP to test the conversion of a hummed tune into MIDI notes in a specific scale, and the UX/UI of the app went through several iterations.
With the visual language of the interface solidified, the remaining screens were produced in their final form, and a functional proof of concept, written in C++ and running on iOS, was developed.