Beat estimation from musician visual cues

dc.authorid: 0000-0002-1415-1198
dc.authorid: 0000-0001-8635-8860
dc.authorid: 0000-0001-5822-4742
dc.authorscopusid: 57221833067
dc.authorscopusid: 57393201900
dc.authorscopusid: 57515994000
dc.authorscopusid: 24578248900
dc.contributor.author: Chakraborty, Sutirtha
dc.contributor.author: Aktaş, Senem
dc.contributor.author: Clifford, William
dc.contributor.author: Timoney, Joseph
dc.date.accessioned: 2024-09-25T19:42:51Z
dc.date.available: 2024-09-25T19:42:51Z
dc.date.issued: 2021
dc.department: BAİBÜ, Faculty of Engineering, Department of Computer Engineering
dc.description: Elk
dc.description: 18th Sound and Music Computing Conference, SMC 2021 -- 29 June 2021 through 1 July 2021 -- Virtual, Online -- 175590
dc.description.abstract: Musical performance is an expressive art form in which musicians interact with each other using auditory and nonverbal information. This paper aims to discover a robust technique that can identify musical phases (beats) from visual cues derived from a musician's body movements captured by camera sensors. A multi-instrumental dataset was used to carry out a comparative study of two approaches to detecting phase from body sway: (a) motiongram and (b) pose estimation. Decomposition and filtering algorithms were used to clean and fuse multiple signals. The final representations were analysed, and from them estimates of the beat, based on a 'trust factor', were obtained. Motiongram and pose estimation each proved useful depending on the musical instrument, as some playing gestures stimulate more movement in the players than others. Overall, the results were most promising for motiongram, which performed well where string instruments were used. The spatial-derivative technique based on human pose estimation was consistent for woodwind instruments, where only a small degree of motion was observed. Copyright: © 2021 the Authors.
dc.identifier.endpage: 52
dc.identifier.isbn: 978-889454154-0
dc.identifier.issn: 2518-3672
dc.identifier.scopus: 2-s2.0-85122096528
dc.identifier.scopusquality: N/A
dc.identifier.startpage: 46
dc.identifier.uri: https://hdl.handle.net/20.500.12491/12307
dc.identifier.volume: 2021-June
dc.indekslendigikaynak: Scopus
dc.institutionauthor: Aktaş, Senem
dc.language.iso: en
dc.publisher: Sound and Music Computing Network
dc.relation.ispartof: Proceedings of the Sound and Music Computing Conferences
dc.relation.publicationcategory: Conference Item - International - Administrative Staff and Student
dc.rights: info:eu-repo/semantics/closedAccess
dc.snmz: YK_20240925
dc.subject: Body Movements
dc.subject: Body Sway
dc.subject: Camera Sensor
dc.subject: Comparative Studies
dc.subject: Decomposition Algorithm
dc.subject: Musical Performance
dc.subject: Non-Verbal Information
dc.subject: Pose Estimation
dc.subject: Robust Technique
dc.subject: Visual Cues
dc.subject: Music
dc.title: Beat estimation from musician visual cues
dc.type: Conference Object
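The abstract describes extracting a periodic body-sway signal from video and obtaining beat estimates from it after decomposition and filtering. As an illustration only, the sketch below shows one common way such a periodicity could be estimated from a pre-extracted 1-D sway signal (band-pass filtering followed by autocorrelation); the function names, tempo range, and synthetic input are assumptions, not the authors' pipeline.

```python
# Minimal sketch: estimate a beat period from a 1-D body-sway signal.
# Assumes the sway signal has already been extracted from video (e.g. the
# vertical coordinate of a pose keypoint, or a column-wise summary of a
# motiongram), sampled at the camera frame rate. All names and the tempo
# range below are illustrative, not the method reported in the paper.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_beat_period(sway, fps, tempo_range_bpm=(40.0, 200.0)):
    """Return an estimated beat period in seconds for a 1-D sway signal."""
    sway = np.asarray(sway, dtype=float)
    sway = sway - sway.mean()                       # remove DC offset

    # Band-pass to the range of plausible beat frequencies (in Hz).
    lo_hz, hi_hz = tempo_range_bpm[0] / 60.0, tempo_range_bpm[1] / 60.0
    b, a = butter(2, [lo_hz, hi_hz], btype="bandpass", fs=fps)
    filtered = filtfilt(b, a, sway)

    # Autocorrelation: the strongest peak inside the allowed lag range
    # gives the dominant periodicity of the body sway.
    ac = np.correlate(filtered, filtered, mode="full")[len(filtered) - 1:]
    min_lag = int(fps * 60.0 / tempo_range_bpm[1])  # shortest allowed period
    max_lag = int(fps * 60.0 / tempo_range_bpm[0])  # longest allowed period
    lag = min_lag + int(np.argmax(ac[min_lag:max_lag]))
    return lag / fps

if __name__ == "__main__":
    # Synthetic example: a 2 Hz (120 BPM) sway plus noise, sampled at 30 fps.
    fps, seconds = 30, 20
    t = np.arange(fps * seconds) / fps
    sway = np.sin(2 * np.pi * 2.0 * t) + 0.3 * np.random.randn(t.size)
    period = estimate_beat_period(sway, fps)
    print(f"Estimated beat period: {period:.3f} s (~{60.0 / period:.1f} BPM)")
```

Autocorrelation is only one possible periodicity estimator; the paper's 'trust factor' weighting of beat estimates is not reproduced here.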

Files

Original bundle
Now showing 1 - 1 of 1
Name:
sutirtha-chakraborty.pdf
Size:
1.94 MB
Format:
Adobe Portable Document Format
Description:
Full Text