MUSical Improvising Collaborating Agent (MUSICA) is a DARPA project (Communicating with Computers, BAA-15-18) to build an information system capable of musical improvisation in collaboration with a human performer. Our research draws on techniques from machine learning, artificial intelligence, computational creativity, cognitive science, and music theory to develop a system that communicates with human performers to make music that is meaningful and context-aware.
Bryan Carter conceived Virtual Harlem as a virtual representation of Jazz Era Harlem in the 1920s. One of the earliest virtual reality projects developed for Humanities education, it began as a Second Life project and was later ported to Open Sim and CAVE facilities. The Creative Computing Lab is porting existing assets from Virtual Harlem into the Unity 3D game engine and using Google Earth to build a VR environment that maps accurately onto the geographic area around 125th Street. We anticipate that migrating Virtual Harlem to an open-source platform will enable greater access to the environment as a space for immersive education and historical experience.
In June 2015, Jamie Aditya Graham and I spent a week rehearsing, arranging, and recording seven tracks that became the Trad & Soul EP. For several years we had discussed the possibility of collaborating, and our discussions always came back to our shared passion for the traditional music of the U.S.: early jazz, blues, and gospel. Jamie is well known in Indonesia and throughout Asia for his tenure as an MTV VJ, his acting, and his amazing voice; he is an R&B and soul singer with great range and style. The amalgamation of his soulful modern voice and my arrangements, taken straight from the traditional jazz repertoire, sounds unique and, to my ear, surprisingly coherent.
The EP is available for digital download and streaming on all major digital distribution services, including CD Baby.