Art-inspired Science & Innovation Showcase: Spatial Computing, AI & Quantum

What if audience members could “become” part of an orchestra, either in real time or years later as “belated guests”? And what if artists could use algorithms to transform their hand and body movements into audio and abstract visual feedback for the audience during a show? This interdisciplinary research project, in collaboration with the Stanford Virtual Human Interaction Lab, seeks to integrate spatial computing and mixed reality (MR), interactive AI, quantum-centric computing, and HCI with artistic expression, while also addressing ethical, legal, social, and policy implications (ELSPI) across market verticals. Our initiative includes a full performance by an ensemble of the Royal Concertgebouw Orchestra on the Stanford campus, together with soloists Jin-Hee Lee and Katie Liu, all wearing MR headsets. Today, we’ll showcase a first-stage experimental setup, with the musicians performing Regnava nel silenzio from Donizetti’s Lucia di Lammermoor, designed to capture and record their unique gestures, perspectives, and emotional responses, allowing the audience to connect with the music and the performers through novel technological modalities. We expect the project to advance use cases in classical music, gaming, healthcare, manufacturing, and beyond.

In addition, Jin-Hee and Katie will outline their own STEM-related projects within RQT’s core mission: ELSPI considerations of Q-HCI, and Q-Neuroscience.