Research Works

 

Academic Research Paper: Lip-sync (Playback Singing) Technology of Bollywood (Indian Cinema) – Reasons for the Downfall and Annihilation of Indipop (Non-Film) Music 

Abstract: This paper analyzes the history of the Indian music scene, in which Bollywood playback singing dealt a critical setback to the Indian non-film music category and ultimately led to its obliteration. The study also compares the song-and-dance spectacle and fixed-format songs of Bollywood with the experimental, forward-looking approach of 1990s Indipop music. The properties of playback singing are discussed, highlighting factors such as “hero worship” and the “non-realistic, larger-than-life” quotient, which were key to its monopoly. The paper traces a chronological timeline of consequences, supported by historical facts and examples, which helps explain the dominance of Bollywood playback singing, the rise of the commercial “independent music” category of the 1990s, the drawbacks and inconsistency of Indipop, and its final annihilation. The study ends with a brief discussion of the current scenario, in which non-film music seems to see some light of hope. 

 

Capstone Project: Developing an Android App for Controlling DSP from Accelerometer and Touch Data 

Brief: An Android app was developed for DJs and live electronic musicians that lets any smartphone be used as a controller. Accelerometer and touch data are tracked to change the values of different DSP effects in real time. Any effect parameter can be manipulated by horizontal, vertical, or lateral tilt and movement of the phone, as well as by finger taps, tap-and-drag, two-finger taps, and other gestures. Extensive Pure Data programming was used to create the DSP controller. The PD patch was then converted to an embedded library with libpd so that it could run inside an app built in Android Studio; I also learned Java in the process. I am currently extending the project toward a consumer-ready design. 
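The core of a mapping like this is normalizing a raw sensor reading into a DSP parameter range before handing it to the patch. A minimal sketch of that idea, in Python rather than the app's actual Java/libpd code, with illustrative ranges (the real app forwards values to named Pure Data receivers via libpd):

```python
def tilt_to_param(tilt_deg, lo=0.0, hi=1.0, max_tilt=60.0):
    """Map a tilt angle in degrees (-max_tilt..+max_tilt) to a DSP
    parameter in [lo, hi], clamping out-of-range readings.
    The 60-degree usable range is an assumption for illustration."""
    t = max(-max_tilt, min(max_tilt, tilt_deg))   # clamp sensor noise/overshoot
    norm = (t + max_tilt) / (2.0 * max_tilt)      # normalize to 0..1
    return lo + norm * (hi - lo)

# e.g. a delay-feedback parameter driven by left/right tilt:
print(tilt_to_param(0.0))    # level phone -> midpoint 0.5
print(tilt_to_param(90.0))   # over-rotation clamps to 1.0
```

In the Android version the same scaled value would be sent into the embedded patch (e.g. with libpd's `PdBase.sendFloat` to a named `receive` object) on each sensor callback.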

 

Special Project: Computer Vision Virtual Tabla using Pure Data 

Brief: As the final project for digital instrument design, a computer-vision virtual tabla (a popular Indian percussion instrument) was created: a user makes actual tabla hand and finger gestures on top of a simple black table to trigger real tabla sounds. The instrument not only acts as a MIDI tabla but also lets any tabla player play other kinds of Indian percussion (most of which share similar gestures). In addition, basic Western drums with a wide variety of tones can be played with the same tabla gestures. All existing virtual tabla apps and VSTs have to be played from a keyboard or a generic finger tap; this instrument lets real acoustic tabla players expand into the electronic music world. The main concept behind the instrument was to break electronic music's dependence on the keyboard and expand to a variety of real, non-keyboard instruments, starting with the tabla. Extensive PD programming with GEM was used; in particular, an object called pix_mano was explored in depth, which gathers real-time contour data from the moving image captured by a simple webcam. Besides triggering samples, DSP controls, volume, looping, bank/instrument selection, and many other features can also be operated virtually. Apart from the extensive coding, the camera setup and light management were a big part of the project. A high-end Linux computer was used to handle the complex graphics processing. Taking this to the next level, I am planning to enter various digital instrument design competitions, and I am considering 3D-printed models of the instruments (the tabla, in this case) to replace the simple black table. 
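Once the contour tracker reports where a strike landed in normalized camera coordinates, the position still has to be binned into drum zones to select a sample. The sketch below shows that last mapping step in Python; the zone names and boundaries are hypothetical stand-ins, not the actual PD/GEM patch logic:

```python
# Hypothetical strike zones across the camera's x axis (0.0 = left edge,
# 1.0 = right edge): left drum (bass), then rim and center of the right drum.
ZONES = [
    (0.33, "bayan"),         # bass drum region
    (0.66, "dayan_rim"),     # right-drum rim region
    (1.01, "dayan_center"),  # right-drum center region
]

def strike_to_sample(x):
    """Map a normalized strike x position (0..1) to a sample name."""
    for boundary, name in ZONES:
        if x < boundary:
            return name
    return ZONES[-1][1]  # out-of-range readings fall back to the last zone
```

In the real patch this dispatch happens inside PD, where the selected zone gates which sample player receives the trigger; a finer-grained version would also use the y coordinate and contour shape to distinguish finger from palm strokes.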

 

Research: Polyphonic Pitch Tracking of the Vibraphone with Artificial Neural Networks 

Brief: The spectrum of a vibraphone tone is inharmonic, and the instrument is typically played polyphonically with four mallets and frequent use of a sustain pedal. Our approach is to leverage the classification power of artificial neural networks (ANNs) to classify the most distinctive moment of a vibraphone note event: the transient note onset. While the spectra of the vibraphone's 37 bars are not harmonic, they do contain unique signatures of high-frequency spectral energy. We will use these distinct spectral features as training data for a multi-layer ANN capable of classifying pitches and polyphonic chords. Because our classification is based on a short time window sampled during the first milliseconds of a note or chord event, we expect to achieve latency low enough for real-time performance.
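The feature-extraction idea can be illustrated with a toy example: take a short window from the onset, compute its magnitude spectrum, and use the resulting bins as the ANN's input vector. The sketch below is stdlib-only Python, not our actual pipeline; the synthetic partial ratios are illustrative (loosely bar-like), an FFT would replace the naive DFT in practice, and no network is trained here:

```python
import cmath
import math

def dft_mag(frame):
    """Naive DFT magnitude spectrum (stdlib only; use an FFT in practice).
    Returns magnitudes for the first N/2 bins."""
    N = len(frame)
    return [abs(sum(frame[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N)))
            for k in range(N // 2)]

# Synthesize ~12 ms of an inharmonic "bar-like" onset at 44.1 kHz:
# partials at f, 2.76f, and 5.40f do not form a harmonic series
# (the ratios here are illustrative, not measured vibraphone data).
SR, F0, N = 44100, 880.0, 512
onset = [sum(a * math.sin(2 * math.pi * f * n / SR)
             for a, f in [(1.0, F0), (0.5, 2.76 * F0), (0.25, 5.40 * F0)])
         for n in range(N)]

mags = dft_mag(onset)                                  # candidate ANN input vector
peak_bin = max(range(len(mags)), key=mags.__getitem__)
peak_hz = peak_bin * SR / N                            # strongest partial, ~F0
```

A 512-sample window at 44.1 kHz is about 11.6 ms, which is what makes the low-latency claim plausible: the feature vector exists within the first milliseconds of the note, before the decaying sustain complicates the spectrum.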