New Works and Laboratory highlights by the UnSupervised Group


Motion capture session for machine-learning training at the VR2 facility and NOVARS

First laboratory session (of three) with Greek beatbox champion GG (Giorgios Gargalas), thanks to SALC, UoM Recovery grants.
The project ‘Noh Virtual’ is led by Ricardo Climent; the team includes GG (voice), Hongshuo Fan (machine learning) and Manusamo&Biza (3D models and rigging).
The Virtual Reality Research Facility (VR2) is a cutting-edge virtual reality and motion capture research facility at the University of Manchester, UK. It includes eight OptiTrack Prime 17W tracking cameras, a Zotac wearable backpack PC and commercial tracking software. A custom NOVARS face-tracking implementation was commissioned from UnSupervised PhD researcher Hongshuo Fan.
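
The post doesn't detail the software pipeline, but in setups like this, mocap and face-tracking data is commonly rebroadcast to music software as OSC messages. Here is a minimal receiver sketch, assuming the python-osc library and a purely hypothetical /face/... address space (not the actual NOVARS implementation):

```python
# Minimal sketch: listening for streamed face-tracking data over OSC.
# The /face/* address space is a hypothetical example, not the actual
# NOVARS implementation.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_face_data(address, *values):
    # Each message carries one tracking parameter (e.g. a blendshape weight).
    print(f"{address}: {values}")

dispatcher = Dispatcher()
dispatcher.map("/face/*", on_face_data)

# Listen on the local machine; port 9000 is an arbitrary choice.
server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
server.serve_forever()  # blocks until interrupted
```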

Hongshuo Fan & SWR Experimentalstudio »Conversation in the Cloud« | Giga-Hertz Award.

(2020-21) For clarinet and AI musician, ca. 12', world premiere. »Conversation in the Cloud« is a live multimedia composition for one human musician and one AI musician at the intersection of reality and virtuality in music. The AI musician is a comprehensive system that applies multiple machine learning techniques, such as deep neural networks and human body pose estimation, to enhance its machine musicianship. The combination of live multimedia and the performance of two musicians creates a multidimensional musical conversation. – Hongshuo Fan
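
To illustrate one of the techniques Fan names, here is a minimal sketch of body-pose estimation driving a musical control value. It uses MediaPipe's off-the-shelf pose model; the mapping (right-wrist height to a 0-127 control) is an invented example, not the actual AI-musician system:

```python
# Illustrative only: pose estimation mapped to a musical control value.
# The wrist-height-to-controller mapping is an invented example.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)  # webcam view of the performer
with mp_pose.Pose() as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            wrist = results.pose_landmarks.landmark[mp_pose.PoseLandmark.RIGHT_WRIST]
            # Normalised y runs 0 (top) to 1 (bottom); invert so raising
            # the arm raises the control value.
            control = max(0, min(127, int((1.0 - wrist.y) * 127)))
            print("pose control:", control)
cap.release()
```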


Vicky Clarke inSonic ZKM talk

Artist talk for the inSonic Festival, ZKM, December 2020, introducing SleepStates and the NOVARS & EASTnDC network residency exploring machine learning and musique concrète.

Duet for Violin and Biofeedback (2019) by Chris Rhodes.

Violin: Sarah Keirle
Duet for Violin and Biofeedback (2019) is a work for violin and live electronics, composed by Chris Rhodes. In the piece, the performer (here, Sarah Keirle) wears a Myo armband (a wearable sensor) on the bow arm. As the bow arm moves during the second movement, the sensor's data changes the timbre and playability of the violin in real time. This novel musical interaction is made possible by machine learning algorithms, as sketched below.
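
As a rough illustration of that kind of mapping, the sketch below trains a small regression model to turn armband sensor frames into synthesis parameters. The 8-channel layout matches the Myo's EMG sensors, but the placeholder data, parameter names and model choice are assumptions, not Rhodes's actual patch:

```python
# Sketch of a learned gesture-to-sound mapping; data and parameter
# names are placeholders, not the actual patch used in the piece.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Pretend rehearsal recordings: rows are 8-channel EMG frames from the
# bow arm, paired with two normalised synthesis controls.
rng = np.random.default_rng(0)
X_train = rng.random((200, 8))   # placeholder EMG frames
y_train = rng.random((200, 2))   # placeholder (cutoff, grain density)

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000)
model.fit(X_train, y_train)

# In performance, each incoming frame is mapped to parameters in real
# time and forwarded to the live-electronics engine.
frame = rng.random((1, 8))
cutoff, density = model.predict(frame)[0]
print(f"cutoff={cutoff:.2f} density={density:.2f}")
```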


We’ve all, at some point and with varying levels of embarrassment, played the air guitar. It’s the perfect instrument for those of us wishing to live the rock n’ roll lifestyle minus the dedication and talent needed to perfect a real instrument. But what if you could play the air guitar for real, with your movement creating sound and music based on how you choose to play? On the latest episode of Insight Faster I’ll be chatting to a researcher working with technology able to do just that, and far more.

 