Structure and Automatic Segmentation of Dhrupad Vocal Bandish Audio

by Rohit M. A., et al.

A Dhrupad vocal concert comprises a composition section that is interspersed with improvised episodes of increased rhythmic activity involving the interaction between the vocals and the percussion. Tracking the changing rhythmic density, in relation to the underlying metric tempo of the piece, thus facilitates the detection and labeling of the improvised sections in the concert structure. This work concerns the automatic detection of the musically relevant rhythmic densities as they change in time across the bandish (composition) performance. An annotated dataset of Dhrupad bandish concert sections is presented. We investigate a CNN-based system, trained to detect local tempo relationships, and follow it with temporal smoothing. We also employ audio source separation as a pre-processing step to the detection of the individual surface densities of the vocals and the percussion. This helps us obtain the complete musical description of the concert sections in terms of capturing the changing rhythmic interaction of the two performers.
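The pipeline the abstract describes ends with temporal smoothing of the CNN's frame-wise tempo-relationship predictions. As a hedged illustration only (the function name, window size, and majority-vote rule are assumptions, not the authors' published method), a minimal sketch of such smoothing over a sequence of predicted rhythmic-density class labels might look like:

```python
import numpy as np

def smooth_predictions(labels, win=9):
    """Hypothetical temporal smoothing: replace each frame's predicted
    class label with the majority label in a sliding window around it.
    `labels` is a 1-D integer array of per-frame class predictions."""
    half = win // 2
    # Pad edges by repeating the boundary labels so every frame
    # has a full window.
    padded = np.pad(labels, half, mode="edge")
    out = np.empty_like(labels)
    for i in range(len(labels)):
        window = padded[i:i + win]
        vals, counts = np.unique(window, return_counts=True)
        out[i] = vals[np.argmax(counts)]
    return out

# Example: a single-frame spurious prediction is suppressed.
noisy = np.array([0, 0, 0, 1, 0, 0, 0, 0, 0])
print(smooth_predictions(noisy, win=3))
```

Majority-vote (mode) filtering is one common choice for smoothing categorical sequences; median filtering behaves identically for binary labels.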


