Tod Machover (born November 24, 1953, in Mount Vernon, New York) is a composer and an innovator in the application of technology to music. He is the son of Wilma Machover, a pianist, and Carl Machover, a computer scientist.
He was named Director of Musical Research at IRCAM in 1980. Joining the faculty of the new Media Laboratory at the Massachusetts Institute of Technology (MIT) in 1985, he became Professor of Music and Media and Director of the Experimental Media Facility. Currently Professor of Music and Media at the MIT Media Lab, he heads the Lab's Hyperinstruments/Opera of the Future group and has been co-director of the Things That Think (TTT) and Toys of Tomorrow (TOT) consortia since 1995. In 2006, he was named visiting professor of composition at the Royal Academy of Music in London. He has composed significant works for Yo-Yo Ma, Joshua Bell, Matt Haimovitz, the Ying Quartet, the Boston Pops, the Los Angeles Philharmonic, Penn & Teller, and many others, and has designed and implemented interactive systems for performances by Peter Gabriel and Prince. Machover gave a keynote lecture at NIME-02, the second international conference on New Interfaces for Musical Expression, held in 2002 at the former Media Lab Europe in Dublin, Ireland, and is a frequent lecturer worldwide. Machover was a finalist for the 2012 Pulitzer Prize in Music for his opera "Death and the Powers."
He attended the University of California, Santa Cruz, in 1971 and received a BM and MM from the Juilliard School in New York, where he studied with Elliott Carter and Roger Sessions (1973–1978). He also began doctoral studies at Juilliard before being invited as composer-in-residence to Pierre Boulez's new Institut de Recherche et Coordination Acoustique/Musique (IRCAM) in 1978.
In the fall of 1978, Tod Machover arrived at IRCAM in Paris, where he was introduced to Giuseppe di Giugno's 4 series of digital synthesizers. Light was premiered at the Metz Festival in November 1979 using the 4C, which embodied di Giugno's conviction that "synthesizers should be made for musicians, not for the people that make them" (Electric Sound, p. 181). In 1981 he composed Fusione Fugace for solo performance on the 4X, a real-time digital synthesizer. At IRCAM in 1986 and 1987 he composed Valis, scored for keyboard and percussion duet, again using di Giugno's 4X system to process the voices, with an emphasis on extending the performers' playing into many complex sound layers. This desire to enhance human performance foreshadowed his concept of the hyperinstrument (a term he coined in 1986). At MIT's Media Lab, he developed methods for taking far more sophisticated measurements of both the instrument and the performer's expression. He focused on augmenting keyboard instruments, percussion, strings, and even the act of conducting, with the goal of developing and implementing new technology to expand the function of musical instruments and their performers. He propelled forward-thinking research in musical performance and interaction using new musical and technological resources. Originally concentrated on the enhancement of virtuosic performance, the research has since expanded toward building sophisticated interactive musical instruments for non-professional musicians, children, and the general public. In 1996 he premiered the Brain Opera, an interactive music experience built on hyperinstruments that aimed to turn every participant into a musician.
The instrument is essentially an electric violin whose audio output provides raw material for real-time timbre analysis and synthesis techniques. Coupled with an enhanced bow (see Hyperbow), measured properties of both the instrument's audio output and the player's bowing gesture produce data that controls aspects of the resulting amplified sound.
In addition to bow pressure and string contact, wrist sensors and left-hand fingering-position indicators produce measurements that are evaluated and processed in response to the performance.
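The sensor-to-sound mapping described above can be sketched in miniature. The following is a hypothetical illustration, not Machover's actual implementation: the function names, sensor ranges, and parameter mappings are all assumptions chosen for clarity.

```python
# Hypothetical sketch of a hyperinstrument sensor mapping: raw sensor
# readings (here bow pressure and finger position) are normalized and
# mapped to synthesis control parameters. All ranges are illustrative.

def normalize(value, lo, hi):
    """Clamp a raw sensor reading into the 0.0-1.0 control range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def map_sensors(bow_pressure, finger_pos):
    """Map raw readings (assumed 10-bit ADC) to synthesis parameters."""
    pressure = normalize(bow_pressure, 0, 1023)
    position = normalize(finger_pos, 0, 1023)
    return {
        "amplitude": pressure,                     # more pressure, louder
        "filter_cutoff_hz": 200 + position * 4800  # brighter up the string
    }

params = map_sensors(bow_pressure=512, finger_pos=256)
```

In a real system, mappings like these would run continuously at the sensors' sampling rate, feeding an audio processing engine.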
Main article: Hyperbow
Bowing parameters (speed, force, position) are measured, and the data is processed to create an interaction between performance properties and audio output. Different types or styles of bowing drive calculations that support the performance and manipulation of larger structures and compositional shapes.
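One way such gesture-driven control could work is to classify a bowing gesture from its measured parameters and use the result to select a larger-scale process. This is a hedged sketch: the thresholds, labels, and process names are invented for illustration and do not come from the actual Hyperbow software.

```python
# Hypothetical sketch: classify a bowing gesture from normalized speed
# and force readings (0.0-1.0), then choose a processing "shape".
# Thresholds and process names are illustrative assumptions.

def classify_bowing(speed, force):
    """Return a coarse gesture label from normalized speed and force."""
    if force > 0.7 and speed > 0.7:
        return "aggressive"        # e.g. trigger dense, loud textures
    if speed < 0.2:
        return "sustained"         # e.g. slowly evolving sound layers
    return "ordinary"

def select_process(gesture):
    """Map a gesture label to a hypothetical audio process."""
    return {
        "aggressive": "granular_burst",
        "sustained": "long_reverb_wash",
        "ordinary": "dry_amplified",
    }[gesture]

print(select_process(classify_bowing(speed=0.9, force=0.8)))
# prints "granular_burst"
```

The design point is that individual sensor frames are abstracted into gestures before they influence compositional structure, so moment-to-moment noise does not destabilize the larger musical shape.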
MIDI data generated by the performer on a Yamaha Disklavier is manipulated by various Max/MSP processes to accompany and augment the keyboard performance.
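A minimal sketch of this kind of MIDI processing is shown below. It is a hypothetical Python stand-in for what a Max/MSP patch might do, not the actual patch: each performed note spawns a delayed, transposed, quieter echo as accompaniment. The function name, event format, and default parameters are assumptions.

```python
# Hypothetical sketch of Disklavier MIDI augmentation: every performed
# note generates a delayed, transposed echo, merged back into the note
# stream. Events are (time_sec, midi_pitch, velocity) tuples.

def echo_accompaniment(notes, delay=0.5, transpose=12, decay=0.6):
    """Return the performed notes plus one echo per note (illustrative)."""
    out = list(notes)
    for t, pitch, vel in notes:
        out.append((t + delay, pitch + transpose, int(vel * decay)))
    return sorted(out)

performance = [(0.0, 60, 100), (1.0, 64, 90)]
augmented = echo_accompaniment(performance)
# [(0.0, 60, 100), (0.5, 72, 60), (1.0, 64, 90), (1.5, 76, 54)]
```

In a live setting the echoes would be streamed back to the Disklavier or a synthesizer in real time rather than collected into a list.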