by Carlo Serafini
This is the first article on jazzitalia.net to deal with electronic music, so I think it right to begin with some historical background on its evolution.
Since this article appears on a site devoted mostly to jazz (as well as music education), it might seem out of place. However, I believe that anyone who uses an electronic musical instrument today (and I imagine there are several among this site's users) can benefit from knowing the history that led to the current widespread use of technology in music (including jazz, at least in its most "contaminated" forms), just as anyone studying an acoustic instrument finds it useful and stimulating to study the repertoire and history of that instrument.
The history of electronic music is as much a history of inventions and entrepreneurial ventures as of artistic imagination.
The first significant electronic musical instrument was the Telharmonium, patented in 1897 by the American lawyer, businessman and inventor Thaddeus Cahill (Mount Zion, Iowa, 1867 - New York City, 1934). The idea was to broadcast music into homes and public places via telephone lines (much like radio), to be listened to through special horns connected to telephones. Cahill managed to find investors who steadily funded his idea, built this giant of a machine weighing 200 tons, and convinced the New York telephone company to sign a contract for the provision of the service; but owing to a number of problems, not least interference with the phone lines, the company went bankrupt in 1908.
Among the various instruments developed in the first half of the twentieth century, the most notable was the theremin, invented by Leon Theremin (Lev Termen; St. Petersburg, 1896 - 1993) in Moscow in 1917. It is the oldest electronic instrument still in use. Its main features are the two antennas used to control the pitch and the volume of the sound. The pitch antenna is mounted vertically on the main body of the instrument: moving the right hand closer to it produces a higher sound, moving it away a lower one. The volume antenna is mounted horizontally on the main body of the instrument: moving the left hand closer to it lowers the volume, moving it away raises it.
Theremin won over Soviet scientists and Lenin himself; in 1921 a triumphant European tour followed to promote the instrument, and in 1927 he arrived in New York, where he gave his first performance with the New York Philharmonic. Among the musicians present were Arturo Toscanini and Sergei Rachmaninoff.
There he met Clara Rockmore (1911-1998), who would become the first virtuoso of the instrument; an example of her playing is "The Swan" (1886) by Saint-Saëns.
Back in Russia he was sent to a gulag for a short period and then put to work on radar-related projects during the Second World War. Later he invented a listening device for the KGB, won the Stalin Prize, and taught acoustics at Moscow University, where he died a few years ago (1993).
Laurens Hammond (1895 - 1973) had more success with the organ that took his name, introduced in 1935. Its success was such that "Hammond" became synonymous with the electric organ itself. I will not elaborate on this instrument, since other authors are already covering the subject.
The revolution in electronic music took off when the concept of music began to broaden to include all possible sounds:
John Cage (Los Angeles, September 5, 1912 - New York, August 12, 1992) was the first composer to use ambient sounds as musical material, in his "Imaginary Landscape No. 1" of 1939, in which he combined recorded sounds played on two variable-speed turntables with percussion sounds.
(One of the precursors of this revolution was the Italian Luigi Russolo with his "Intonarumori" of 1913: though they were not electronic instruments, they played a big role in affirming the idea of including noise and environmental sounds in modern music.)
In 1948 Pierre Schaeffer (Nancy, France, 1910-1995), an engineer and announcer at the French broadcasting corporation, used a mobile studio near Paris to record the sounds of the steam engines of several locomotives (including the train whistles and the noise of the rails), which he used to create a short composition called "Étude aux chemins de fer". It was broadcast on the radio with great success, along with other pieces created using similar techniques, under the name "Concert of noises". He also coined the term "musique concrète", denoting music in which the composer works "concretely" with the sounds themselves rather than "abstractly" with symbols that represent them (as in a musical score).
Schaeffer had used a system of direct-to-disc recording. Around 1950, tape recorders appeared on the market, and the idea of using recorded sounds for artistic purposes was in the air. Studios were born worldwide: Cologne, Milan, New York, Tokyo, Buenos Aires, etc.
At the Brussels World Expo in 1958, the Philips pavilion, designed by Le Corbusier, hosted the first spectacular multimedia presentation of sounds and images. The music, reproduced by 425 loudspeakers positioned to create spatial sound effects, included the concrete-music compositions "Poème électronique" by Edgard Varèse (1883 - 1965) and "Concret PH" by Iannis Xenakis (1922). The projected images and the lighting were created by Le Corbusier.
In 1957 RCA built the RCA Mark II synthesizer, an analog synthesizer controlled by punched paper tape. The huge machine was too complex to be a commercial success. In 1959, thanks to the Rockefeller Foundation, it was acquired by the Columbia-Princeton Electronic Music Center in New York and used mainly by the composer Milton Babbitt (Philadelphia, May 10, 1916).
In 1957 Max Mathews (Columbus, Nebraska, November 13, 1926), at Bell Telephone Laboratories, developed "Music I", the first computer program to generate sounds. The enthusiasm generated by the ability to create sounds digitally led, in later years, to a series of revisions and improvements culminating, in 1968, in "Music V", which became the basic model for most of the next generation of music software. A memorable moment from those years was the use of a computer-sung song near the end of Stanley Kubrick's film "2001: A Space Odyssey": HAL (the onboard computer) sings a song while being shut down and, in its last moments of "life", recalls its past, including its "childhood" at Bell Labs.
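The basic idea behind Music I and its successors, computing each audio sample with a mathematical formula and then converting the numbers to sound, is easy to reproduce today. As a purely illustrative sketch (my own modern example in Python, not Mathews' actual program), here is how a one-second sine tone can be computed sample by sample and written to a standard WAV file:

```python
import math
import struct
import wave

RATE = 8000  # samples per second

def sine_samples(freq=440.0, dur=1.0, rate=RATE, amp=0.8):
    """Compute one channel of 16-bit PCM samples for a sine tone."""
    n = int(dur * rate)
    return [int(amp * 32767 * math.sin(2 * math.pi * freq * i / rate))
            for i in range(n)]

def write_wav(path, samples, rate=RATE):
    """Write mono 16-bit samples to a WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16 bits per sample
        w.setframerate(rate)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

# One second of a 440 Hz tone (the orchestral A):
write_wav("tone.wav", sine_samples())
```

What took a room-sized computer hours in 1957 now runs in a fraction of a second, which is exactly the technological trajectory this article traces.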
Since the '60s, the technology applied to electronic music has evolved in parallel with the development of technology in general.
The '60s were the decade of the development of analog synthesizers.
Among these were the instruments designed by Robert Moog (May 23, 1934) and by Donald Buchla (with Moog, history repeated what had happened with Hammond: "Moog" became synonymous with the synthesizer). The first synthesizers, developed by Robert Moog in 1964 with the composer Herbert Deutsch, had a modular structure. The modules (among them oscillators, filters, envelope generators and mixers) were connected by patch cables. The advantages of this structure were both musical, since the modules could be linked together in various ways, making possible a great variety of timbres, and commercial, since the system could be expanded a little at a time. In 1967 Moog gave his name to his instruments, and it was in that year that they began to attract attention, including commercial interest.
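The modular idea, independent building blocks wired together freely, maps naturally onto software. As a hypothetical illustration of the concept (mine, not anything from Moog's catalog), each "module" below is a Python function from sample index to amplitude, and patching modules together is just function composition:

```python
import math

RATE = 8000  # samples per second

def oscillator(freq):
    """Sine oscillator module: sample index -> amplitude in [-1, 1]."""
    return lambda i: math.sin(2 * math.pi * freq * i / RATE)

def envelope(source, decay=2.0):
    """Envelope module: shape a source with an exponential decay."""
    return lambda i: source(i) * math.exp(-decay * i / RATE)

def mixer(*sources):
    """Mixer module: average any number of sources."""
    return lambda i: sum(s(i) for s in sources) / len(sources)

# "Patch" two slightly detuned oscillators through a mixer and an
# envelope, just as one would with cables on a modular synthesizer:
voice = envelope(mixer(oscillator(220.0), oscillator(221.5)))
samples = [voice(i) for i in range(RATE)]  # one second of audio
```

The point of the sketch is the one the article makes about the Moog: because every module has the same interface, they can be recombined endlessly, and new modules can be added without changing the rest of the system.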
In 1969 Wendy Carlos's "Switched-On Bach", arranged and played on a Moog synthesizer, became the first success of electronic music on an LP record.
The Buchla synthesizers of those years were also modular, but differed from the others in a couple of innovative features: the user interface did not use the usual piano- or organ-style keyboard but touch-sensitive surfaces, and the systems included a sequencer that allowed users to store a sequence of events to be played back automatically.
The '70s saw the advent of digital synthesizers such as the Synclavier.
The Synclavier, developed by Sydney Alonso and Cameron Jones with the composer Jon Appleton as musical consultant, was available from 1977. It was the direct descendant of the Dartmouth Digital Synthesizer, the first digital synthesizer, built by the same group in 1972. The idea was to build an instrument suitable for live use, with a control panel from which the parameters of the machine could be managed in real time. In 1979 a new version, the Synclavier II, appeared and quickly became a great commercial success, widely used by the film industry and in pop music production.
The '80s witnessed the expansion of the market for electronic music. Ikutaro Kakehashi, founder and chairman of Roland, realized that an agreement among the various manufacturers was needed to standardize (and then further develop) the field.
Thus, together with Dave Smith, president of Sequential Circuits, and in collaboration with other manufacturers, he defined a protocol called MIDI (Musical Instrument Digital Interface) that would allow products from different companies to talk to each other, so that, for example, a Roland keyboard could control a Yamaha synthesizer.
The first draft of the MIDI protocol dates back to 1983. In the same year the Yamaha DX7 became the first commercially successful MIDI synthesizer. The next year Dave Oppenheim of Opcode Systems developed a MIDI interface that allowed a Macintosh computer to exchange information with a MIDI synthesizer, opening the door to a new world of music software.
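To give an idea of what this common language looks like on the wire, here is a small Python sketch (my own illustration, not part of the standard documents) that builds the three bytes of the MIDI "Note On" and "Note Off" channel messages, the ones a Roland keyboard would send to play a note on a Yamaha synthesizer:

```python
def note_on(channel, note, velocity):
    """Build the 3 bytes of a MIDI Note On message.

    Status byte 0x90 combined with the channel (0-15), followed by
    the note number (0-127, middle C is 60) and the velocity (0-127).
    """
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    """Note Off: status byte 0x80 combined with the channel."""
    assert 0 <= channel <= 15 and 0 <= note <= 127
    return bytes([0x80 | channel, note, 0])

# Middle C at moderate velocity on channel 0:
msg = note_on(0, 60, 64)
print(msg.hex())  # 903c40
```

The simplicity of these three-byte messages is a large part of why MIDI spread so quickly and is still in use today.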
The '90s saw an ever-increasing use of electronic tools, thanks to technological advances and a consequent improvement in the quality/price ratio, giving a wider audience access to technologies that only a few years earlier had been the exclusive preserve of a small elite.
Today we talk more and more about "virtual" synthesizers: computer applications that use the machine's own resources to create and manipulate digital information, which is then converted into audio signals.
All this has allowed the emergence of new concepts in the use of technology applied to audiovisual art forms. Among other things, music can now include sounds recorded in our environment, sounds of nature, words and sounds from other cultures. Music, in other words, can communicate extra-musical content, making us more aware of the world we live in, and can therefore also have an educational value.
This technological development, together with new types of sensors, makes possible the creation of new musical instruments that can expand our creativity.
The broadening of the concept of art and communication, along with the adoption of new technologies as tools for creativity, will allow everyone a degree of creativity never seen or heard before. This seems to me the best hope we can offer each other, as well as a good way to close this article on a positive note.
An article like this can only hint at the history of electronic music, about which whole volumes have been written. It may well raise more questions than it answers. All cybernauts can search the internet for further information (almost all of it in English). In this regard (if English is not a problem), I suggest reading the article I published on the site http://electro-music.com, where I retraced my personal journey of about 25 years as a user of electronic music. The article is called "Fixin' to hole" and you can find it at the address below: http://electro-music.com/forum/viewtopic.php?t=225 .
See you soon.