Degree
PhD (Doctor of Philosophy)
Committee Member
Segre, Alberto Maria
Recently, there have been efforts to design more efficient ways to internalize music by applying the disciplines of cognition, psychology, temporality, aesthetics, and philosophy. Bridging art and science, computational techniques can also be applied to musical analysis. Although a wide range of research projects has been conducted, the automation of music analysis remains an emerging field. Automated tools can reveal patterns in the core musical elements of melody, harmony, and rhythm, high-level features that are perceivable by the human ear. For music to be captured and successfully analyzed by a computer, however, one must extract certain information found in the lower-level features of amplitude, frequency, and duration. Moreover, while the identification of harmonic progressions, melodic contour, musical patterns, and pitch quantification is crucial to traditional music analysis, these factors alone are not sufficient. Visual representations are useful tools that reflect the form and structure of non-conventional musical repertoire.
Because I regard the fluidity of music and visual shape as strongly interactive, the ultimate goal of this thesis is to construct a practical tool that prepares the visual material used for musical composition. By drawing on concepts of time, computation, and composition, this tool integrates computer science, signal processing, and music perception. It does so through two concepts, one abstract and one mathematical, that provide the materials leading to the original composition. To extract the desired visualization, I propose a fully automated tool for musical analysis that is grounded in both the mid-level elements of loudness, density, and range and the low-level features of frequency and duration. As evidenced by my sinfonietta, Equilibrium, this tool, capable of rapidly analyzing a variety of musical examples such as instrumental repertoire, electro-acoustic music, improvisation, and folk music, is highly beneficial to my proposed compositional procedure.
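To make the idea of low-level feature extraction concrete, the following is a minimal sketch, not the tool developed in the thesis, of how frequency and a loudness proxy might be estimated from a raw audio frame. It assumes NumPy and uses an FFT peak for dominant frequency and root-mean-square amplitude as a stand-in for perceived loudness; the function name and signal are illustrative only.

```python
import numpy as np

def extract_low_level_features(signal, sample_rate):
    """Estimate dominant frequency (Hz) and RMS amplitude of an audio frame."""
    # RMS amplitude: a simple proxy for the mid-level percept of loudness.
    rms = float(np.sqrt(np.mean(signal ** 2)))
    # Magnitude spectrum via the real FFT; each bin maps to a frequency in Hz.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    dominant_freq = float(freqs[np.argmax(spectrum)])
    return dominant_freq, rms

# Example: one second of a 440 Hz sine tone sampled at 44.1 kHz.
sr = 44100
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 440.0 * t)
freq, loudness = extract_low_level_features(tone, sr)
```

For the pure tone above, the FFT peak falls at 440 Hz and the RMS of a sine of amplitude 0.5 is 0.5/√2 ≈ 0.354; a real analysis tool would apply windowing and frame-by-frame processing, which are omitted here for brevity.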
Composition, Feature extraction, Music perception, Music visualization, Temporality
xii, 170 pages
Includes bibliographical references (pages 168-170).
Copyright © 2017 Nima Hamidi Ghalehjegh
Hamidi Ghalehjegh, Nima. "Visualizing temporality in music: music perception – feature extraction." PhD (Doctor of Philosophy) thesis, University of Iowa, 2017.