Presentation given by Janani Mukundan and Richard Daskas at Watson DevCon 2016.
As AI advances, scientists are pushing the boundaries of computational creativity. IBM Research is teaching Watson to compose original music based on human emotion. Using machine learning, the researchers train artificial neural networks to understand music theory, musical structure (pitch, time signature, key signature), and emotional intent, so the system can co-create music with a human partner. "Watson Beat" generates new, unique compositions from a beat played by the user and the emotional tone that person chooses. Here’s how it works.
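To make the idea concrete, here is a minimal, purely illustrative sketch (not IBM's actual implementation, and all names are hypothetical) of how structural features like pitch, key signature, and time signature, along with a chosen mood, might be encoded as a numeric feature vector of the kind a neural network could learn from:

```python
# Hypothetical sketch: encode a short melody's structure plus a "mood"
# label as a flat numeric feature vector. Illustrative only -- this is
# not Watson Beat's real representation.

MOODS = ["happy", "sad", "tense"]   # illustrative emotion labels
KEYS = ["C", "G", "D", "A", "F"]    # illustrative key signatures

def encode(melody_midi, key, beats_per_bar, mood):
    """Turn (notes, key, time signature, mood) into a feature vector."""
    # Normalize MIDI pitches to [0, 1] over the piano range 21..108.
    pitches = [(p - 21) / 87 for p in melody_midi]
    # One-hot encode the key signature and the mood.
    key_vec = [1.0 if k == key else 0.0 for k in KEYS]
    mood_vec = [1.0 if m == mood else 0.0 for m in MOODS]
    # Time-signature numerator, scaled to a small range.
    return pitches + key_vec + mood_vec + [beats_per_bar / 8]

features = encode([60, 62, 64, 65], key="C", beats_per_bar=4, mood="happy")
print(len(features))  # 4 pitches + 5 keys + 3 moods + 1 meter = 13
```

A generative model would then be trained to map vectors like this to new note sequences, conditioned on the mood the human partner selects.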