Remote Orchestral Conduction via a Virtual Reality System

Leonardo Severi, Matteo Sacchetto, Andrea Bianco, et al.

Published: Sep. 30, 2024

This paper envisions the adoption of a Virtual Reality (VR)-based approach to provide visual feedback to remote musicians without acquiring and transmitting a video stream in a Networked Music Performance (NMP) scenario. Focusing on an orchestral conduction setup, a VR headset tracks the conductor's gestures and conveys them to the remotely connected performers, where hand and head poses are displayed by means of an avatar. Quantitative results suggest that the Motion-to-Photon latency introduced by the system is tolerable for NMP applications, while the proposed approach achieves a substantial reduction in bit-rate requirements compared with traditional video streaming.
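The bit-rate argument can be made concrete with a back-of-the-envelope sketch: instead of a video stream, only head and hand poses need to cross the network. The snippet below is a minimal illustration, not the authors' implementation; the message layout, the 60 Hz send rate, and the use of a WebRTC data channel are assumptions made for the example.

```typescript
// Minimal sketch: send conductor pose updates instead of video.
// Assumptions: a 7-float pose (position + quaternion) per tracked node,
// 3 tracked nodes (head, left hand, right hand), sent 60 times per second
// over an already-open WebRTC data channel.

interface Pose {
  position: [number, number, number];             // metres
  orientation: [number, number, number, number];  // quaternion (x, y, z, w)
}

interface PoseFrame {
  timestamp: number; // ms, lets the receiver estimate Motion-to-Photon latency
  head: Pose;
  leftHand: Pose;
  rightHand: Pose;
}

const FLOATS_PER_POSE = 7;
const TRACKED_NODES = 3;
const SEND_RATE_HZ = 60;

// One float32 per value plus a float32 timestamp per frame.
const bytesPerFrame = (TRACKED_NODES * FLOATS_PER_POSE + 1) * 4;
const kbps = (bytesPerFrame * SEND_RATE_HZ * 8) / 1000;
console.log(`Pose stream ≈ ${kbps.toFixed(1)} kbit/s`); // ≈ 42 kbit/s vs. Mbit/s for video

function sendPoseFrame(channel: RTCDataChannel, frame: PoseFrame): void {
  const buffer = new Float32Array([
    frame.timestamp,
    ...frame.head.position, ...frame.head.orientation,
    ...frame.leftHand.position, ...frame.leftHand.orientation,
    ...frame.rightHand.position, ...frame.rightHand.orientation,
  ]);
  channel.send(buffer); // receiver applies the poses to the conductor avatar
}
```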

Language: English

Cited: 0

Music Generation Algorithms: An In-Depth Review of Future Directions and Applications Explored

Premkumar Duraisamy, V. Jagan Naveen, G. N. V. Kishore, et al.

Published: Feb. 24, 2024

Language: English

Cited: 1

Using Web Audio Modules for Immersive Audio Collaboration in the Musical Metaverse
Michel Buffa, Dorian Girard, Ayoub Hofr, et al.

Published: Sep. 30, 2024

W3C Web standards now enable online applications that were unthinkable just a few years ago. In 2020, the Web Audio Modules V2 (WAM) initiative, known as the "VST for the web", revolutionized music on the web. WAMs facilitate hosting reusable web components such as note generators, virtual instruments, and audio effects. Today, the WAM ecosystem includes over 100 plugins and tutorials. Sequencer Party, a real-time collaborative platform built entirely with WAMs, allows multiple participants to build installations by patching plugins together. This article adapts the concepts of Sequencer Party to the musical metaverse, reusing existing plugins without modification and generating new 3D interfaces for VR/XR use. BabylonJS was used for rendering with WebXR support, and a Conflict-Free Replicated Data Type (CRDT) approach was chosen for client synchronization. The software is open-source and hosted online.
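For readers unfamiliar with the CRDT approach mentioned above, the sketch below shows one way patch state (which plugins exist and how they are connected) can be kept in sync without a central authority. It uses the Yjs library purely as an illustration; the paper does not state which CRDT implementation is used, and the document schema and relay endpoint here are invented for the example.

```typescript
import * as Y from "yjs";
import { WebsocketProvider } from "y-websocket";

// Shared document describing the patch: a map of plugin instances and
// a map of connections between them. Concurrent edits from several
// VR/XR clients merge deterministically thanks to CRDT semantics.
const doc = new Y.Doc();
const plugins = doc.getMap<string>("plugins");         // pluginId -> WAM module URL
const connections = doc.getMap<string>("connections"); // "sourceId->destId"

// Hypothetical relay endpoint; any y-websocket-compatible server would work.
const provider = new WebsocketProvider("wss://example.org/sync", "metaverse-room-1", doc);

// Local mutations are applied optimistically and broadcast to peers.
export function addPlugin(pluginId: string, moduleUrl: string): void {
  plugins.set(pluginId, moduleUrl);
}

export function connect(sourceId: string, destId: string): void {
  connections.set(`${sourceId}->${destId}`, `${sourceId}->${destId}`);
}

// Every client re-renders its 3D scene and audio graph when the shared
// state changes, regardless of which participant made the edit.
plugins.observe(() => {
  console.log("plugins now:", Array.from(plugins.keys()));
});
connections.observe(() => {
  console.log("connections now:", Array.from(connections.values()));
});
```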

Language: English

Cited: 1

Use-Cases of the new 3GPP Immersive Voice and Audio Services (IVAS) Codec and a Web Demo Implementation
Eleni Fotopoulou, Kacper Sagnowski, Karin Prebeck, et al.

Published: Sep. 30, 2024

The newly standardized 3GPP codec for Immersive Voice and Audio Services (IVAS) is the first codec specifically tailored to immersive communication within 5G mobile networks. IVAS introduces capabilities for coding and rendering of stereo, multi-channel, Ambisonics, and audio objects, as well as an innovative metadata-assisted spatial audio format. It paves the way for new service scenarios involving interactive stereo communication, content sharing, and distribution. This paper describes a demonstration of three exemplary use-cases (multi-party conferencing, calls, and ad-hoc conferencing) using a tablet with headphones. Attendees can experience the impact of IVAS in future services firsthand, comparing it interactively with its mono predecessor, EVS, which is currently the state of the art. Finally, the technical details and building blocks of the demo implementation are described.

Language: English

Cited: 0

Extending Networked Mapping Research Middleware into the Browser Sandbox
Matthew Peachey, Kyle S. Smith, Joseph Malloch, et al.

Published: Sep. 30, 2024

Web-based technologies have seen rapid technological improvements over the past several decades. This is especially true for audio applications, with the specification of the Web Audio API enabling users to deploy highly performant projects on the web. Existing literature describes how mappings are a critical component of end-to-end projects, including Digital Musical Instruments, Internet of Sounds devices, and more. Due to this, years of research efforts have produced mapping middleware that facilitates establishing mappings between sources and destinations. This paper discusses how the libmapper ecosystem was extended to support use from a sandboxed browser environment. Establishing connectivity on the web is achieved through websockets and a backend daemon. We discuss the implementation details of the binding as well as potential use-cases via user-story driven scenarios.
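The browser binding described above rests on a simple architectural idea: the sandboxed page cannot open the sockets libmapper normally uses, so it talks over a WebSocket to a daemon that participates in the mapping network on its behalf. The sketch below illustrates that pattern only; the message format, class names, and the `ws://localhost:8000` endpoint are assumptions, not the actual protocol of the libmapper web binding.

```typescript
// Sketch of a browser-side signal mirrored onto the mapping network by a
// backend daemon. The JSON messages are hypothetical; they only illustrate
// the websocket-bridge pattern.

type BridgeMessage =
  | { type: "register_signal"; name: string; length: number; unit?: string }
  | { type: "signal_update"; name: string; value: number[] };

class WebSignalBridge {
  private socket: WebSocket;

  constructor(url: string) {
    this.socket = new WebSocket(url);
  }

  private send(message: BridgeMessage): void {
    // For brevity, messages sent before the socket opens are dropped.
    if (this.socket.readyState === WebSocket.OPEN) {
      this.socket.send(JSON.stringify(message));
    }
  }

  // Announce an output signal so the daemon can expose it as a mapping source.
  registerSignal(name: string, length: number, unit?: string): void {
    this.send({ type: "register_signal", name, length, unit });
  }

  // Forward a new sample; the daemon routes it to whatever destinations
  // the mapping network has connected to this source.
  update(name: string, value: number[]): void {
    this.send({ type: "signal_update", name, value });
  }
}

// Usage: expose a browser-generated control value as a mappable source.
const bridge = new WebSignalBridge("ws://localhost:8000");
bridge.registerSignal("browser/rms", 1);
setInterval(() => bridge.update("browser/rms", [Math.random()]), 50); // placeholder values
```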

Language: English

Cited: 0

Holodeck: A Research Framework for Distributed Multimedia Concert Performances
Andrea Genovese, Zack Nguyen, Marta Gospodarek, et al.

Published: Sep. 30, 2024

This paper presents the Holodeck project, a multi-room research framework designed to support distributed multimedia concert performances. Developed through an inter-lab collaboration at New York University, the platform leverages the Corelink engine, a flexible and unified data routing system that facilitates immersive, interactive experiences across diverse networked environments. The framework enables real-time streaming and synchronization of various data types, including audio, video, and motion-capture data, thus supporting the design and implementation of augmented Network Music Performances (NMPs). A discussion of case studies from large-scale concerts illustrates the system's capabilities and potential applications. In a pilot exploration, the system has been used to gather preliminary feedback on quality of experience from both audiences and musicians. The project aims to enhance telepresence and realism in remote collaborations, contributing to the development of new methodologies and artistic practices. The paper also discusses the technical challenges and solutions associated with implementing such a framework, and future work regarding its usage for immersive NMPs.
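As a rough illustration of the routing role a unified engine like Corelink plays here, the sketch below shows a client subscribing to a motion-capture stream over a websocket connection. The channel names, message envelope, and endpoint are invented for the example; Corelink's actual client API is not described in this abstract.

```typescript
// Hypothetical client for a Corelink-style data router: one connection,
// many named streams (audio levels, video control, motion capture, ...).

interface StreamEnvelope {
  stream: string;    // e.g. "mocap/performer-1"
  timestamp: number; // sender clock, used for cross-stream synchronization
  payload: unknown;  // stream-specific data
}

class RoutedClient {
  private socket: WebSocket;
  private handlers = new Map<string, (msg: StreamEnvelope) => void>();

  constructor(url: string) {
    this.socket = new WebSocket(url);
    this.socket.onmessage = (event) => {
      const msg: StreamEnvelope = JSON.parse(event.data);
      this.handlers.get(msg.stream)?.(msg);
    };
  }

  // For brevity, assumes the socket is already open when called.
  subscribe(stream: string, handler: (msg: StreamEnvelope) => void): void {
    this.handlers.set(stream, handler);
    this.socket.send(JSON.stringify({ type: "subscribe", stream }));
  }

  publish(stream: string, payload: unknown): void {
    this.socket.send(JSON.stringify({ stream, timestamp: Date.now(), payload }));
  }
}

// Usage: drive a remote performer's avatar from a motion-capture stream.
const client = new RoutedClient("wss://example.org/router");
client.subscribe("mocap/performer-1", (msg) => {
  // apply the joint positions carried in msg.payload to the avatar
});
```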

Language: English

Cited: 0

Immersive Io3MT Environments: Design Guidelines, Use Cases and Future Directions

Rômulo Vieira, Shu Wei, Thomas Röggla, et al.

Published: Sep. 30, 2024

The Internet of Multisensory, Multimedia, and Musical Things (Io3MT) is an emerging field at the intersection of computer science, the arts, and the humanities. It focuses on integrating technologies and datasets to explore human sensory perception, multimedia elements, and musical compositions for artistic and entertainment purposes. This position paper advocates merging Io3MT with Extended Reality (XR) in creative contexts. Through a literature review and expert focus group discussions, we have identified five key design guidelines that aid the development of immersive environments for creation within the Io3MT framework. We developed PhysioDrum as a practical demonstration of these guidelines. This work provides insights into the infrastructure, tools, opportunities, and research challenges arising from combining Io3MT and XR.

Language: English

Cited: 0

“It Takes Two” - Shared and Collaborative Virtual Musical Instruments in the Musical Metaverse
Alberto Boem, Damian Dziwis, Matteo Tomasetti, et al.

Published: Sep. 30, 2024

The relevance and technical possibilities of Shared Virtual Environments (SVEs) are constantly growing as part of what is known as the Metaverse. This includes software and web platforms for creating SVEs and the availability of hardware for experiencing these environments. SVEs offer unique capabilities that have yet to be explored, especially in music. In this paper we explore the concept of networked Virtual Musical Instruments (VMIs) in the Metaverse, where virtual spaces are specifically designed for musical collaboration and social interactions. We describe three prototypes of shared, collaborative VMIs that incorporate specific features of SVEs, such as spatial audio, data sonification, and embodied avatar-based interactions. We conducted a user study to investigate how these instruments can support creativity and usability, and to what extent they deliver a sense of presence and mutual engagement between users. Finally, we discuss how the implementations of the proposed shared VMIs provide novel avenues for music-making. Our results show that the instruments exhibit varying degrees of usability; however, those that employ symmetrical interactions better support interdependence among users.
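Of the SVE features listed above, spatial audio is the most directly reproducible with standard web APIs. The snippet below is a generic Web Audio sketch of giving each participant's instrument a position in the shared space; it is not taken from the prototypes in the paper, and the listener and instrument coordinates are placeholders.

```typescript
// Generic sketch: spatialize each participant's instrument with a PannerNode
// so co-located users hear each other's playing from the avatar's position.

const audioCtx = new AudioContext();

function createSpatializedInstrument(
  source: AudioNode,
  x: number,
  y: number,
  z: number
): PannerNode {
  const panner = new PannerNode(audioCtx, {
    panningModel: "HRTF",   // binaural rendering for headphone users
    distanceModel: "inverse",
    refDistance: 1,
    maxDistance: 50,
    positionX: x,
    positionY: y,
    positionZ: z,
  });
  source.connect(panner).connect(audioCtx.destination);
  return panner;
}

// Keep the listener aligned with the local user's avatar/head pose.
function updateListener(x: number, y: number, z: number): void {
  audioCtx.listener.positionX.value = x;
  audioCtx.listener.positionY.value = y;
  audioCtx.listener.positionZ.value = z;
}

// Usage: an oscillator standing in for a remote participant's instrument.
const osc = new OscillatorNode(audioCtx, { frequency: 220 });
const panner = createSpatializedInstrument(osc, 2, 0, -3);
osc.start();

// As the remote avatar moves, update the panner position (placeholder value).
panner.positionX.value = 1.5;
```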

Language: English

Cited: 0

The Musical Metaverse: Advancements and Applications in Networked Immersive Audio
Claudia Rinaldi, Carlo Centofanti

Published: Sep. 30, 2024

The convergence of the Sound and Music Computing (SMC) domain with the Internet of Things (IoT) paradigm has given rise to the burgeoning field of the Internet of Sounds (IoS). Within this context, the concept of the Musical Metaverse emerges as a pivotal area of exploration, offering unprecedented opportunities for immersive audio experiences and networked musical interactions. This paper provides a comprehensive examination of the Musical Metaverse, delving into the technological innovations, methodologies, and applications that underpin this dynamic domain. We examine the integration of technologies such as Networked Extended Reality (XR) to create collaborative virtual environments for music performance and education. We also explore advancements in smart instruments and ubiquitous music, focusing on their impact on accessibility, inclusiveness, and sustainability. Additionally, we address privacy, security, and large database management, highlighting the role of cloud services and wireless acoustic sensor networks in enhancing the efficiency of applications. Our aim is to provide a foundational understanding of the Musical Metaverse and its potential influence on the future global music industry.

Language: English

Cited: 0
