Electronic music

Electronic music is music that employs electronic musical instruments and electronic music technology in its production; an electronic musician is a musician who composes and/or performs such music. In general, a distinction can be made between sound produced using electromechanical means and that produced using electronic technology.[1] Examples of electromechanical sound-producing devices include the telharmonium, Hammond organ, and the electric guitar. Purely electronic sound production can be achieved using devices such as the theremin, sound synthesizer, and computer.[2]

The first electronic devices for performing music were developed at the end of the 19th century, and shortly afterward Italian Futurists explored sounds that had previously not been considered musical. During the 1920s and 1930s, electronic instruments were introduced and the first compositions for electronic instruments were written. By the 1940s, magnetic audio tape allowed musicians to record sounds and then modify them by changing the tape speed or direction, leading to the development of electroacoustic tape music in Egypt and France. Musique concrète, created in Paris in 1948, was based on editing together recorded fragments of natural and industrial sounds. Music produced solely from electronic generators was first created in Germany in 1953. Electronic music was also created in Japan and the United States beginning in the 1950s. An important new development was the advent of computers for the purpose of composing music. Computer-generated music was first demonstrated in Australia in 1951.

In America and Europe, live electronics were pioneered in the early 1960s. From the 1970s to the early 1980s, the monophonic Minimoog was the most widely used synthesizer in both popular music and electronic art music.

In the 1970s, electronic music began having a significant influence on popular music, with the adoption of polyphonic synthesizers such as the Yamaha GX-1 and Prophet-5, electronic drums, and drum machines such as the Roland CR-78, and with the emergence of genres such as krautrock, disco, new wave, and synthpop. In the 1980s, electronic music became more dominant in popular music, with a greater reliance on synthesizers and the adoption of programmable drum machines such as the Roland TR-808, TR-909, and Linn LM-1, and bass synthesizers such as the Roland TB-303. In the early 1980s, a group of musicians and music merchants developed the Musical Instrument Digital Interface (MIDI), and Yamaha released the first FM digital synthesizer, the DX7.

Electronically produced music became prevalent in the popular domain by the 1990s, because of the advent of affordable music technology.[3] Contemporary electronic music includes many varieties and ranges from experimental art music to popular forms such as electronic dance music.

Origins: late 19th century to early 20th century

Lee de Forest's 1906 invention, the triode audion tube, later had a profound effect on electronic music. It was the first thermionic valve, or vacuum tube, and led to circuits that could create and amplify audio signals, broadcast radio waves, compute values, and perform many other functions.

Before electronic music, there was a growing desire among composers to use emerging technologies for musical purposes. Several instruments were created that employed electromechanical designs, and they paved the way for the later emergence of electronic instruments. An electromechanical instrument called the Telharmonium (sometimes Teleharmonium or Dynamophone) was developed by Thaddeus Cahill in the years 1898 to 1912. However, the Telharmonium's immense size made it impractical and hindered its adoption. One of the earliest electronic instruments often cited is the theremin, invented by Professor Léon Theremin circa 1919–1920.[4] Other early electronic instruments include the Audion Piano, invented in 1915 by Lee de Forest, inventor of the triode audion mentioned above,[5][6] the Croix Sonore, invented in 1926 by Nikolai Obukhov, and the ondes Martenot, which was most famously used in the Turangalîla-Symphonie by Olivier Messiaen as well as in other works by him. The ondes Martenot was also used by other, primarily French, composers such as André Jolivet.[7]

Sketch of a New Esthetic of Music

In 1907, just a year after the invention of the triode audion, Ferruccio Busoni published Sketch of a New Esthetic of Music, which discussed the use of electrical and other new sound sources in future music. He wrote of the future of microtonal scales in music, made possible by Cahill's Dynamophone: "Only a long and careful series of experiments, and a continued training of the ear, can render this unfamiliar material approachable and plastic for the coming generation, and for Art."[8]

Also in the Sketch of a New Esthetic of Music, Busoni states:

Music as an art, our so-called occidental music, is hardly four hundred years old; its state is one of development, perhaps the very first stage of a development beyond present conception, and we—we talk of "classics" and "hallowed traditions"! And we have talked of them for a long time!

We have formulated rules, stated principles, laid down laws;—we apply laws made for maturity to a child that knows nothing of responsibility!

Young as it is, this child, we already recognize that it possesses one radiant attribute which signalizes it beyond all its elder sisters. And the lawgivers will not see this marvelous attribute, lest their laws should be thrown to the winds. This child—it floats on air! It touches not the earth with its feet. It knows no law of gravitation. It is well nigh incorporeal. Its material is transparent. It is sonorous air. It is almost Nature herself. It is—free!

But freedom is something that mankind have never wholly comprehended, never realized to the full. They can neither recognize nor acknowledge it.

They disavow the mission of this child; they hang weights upon it. This buoyant creature must walk decently, like anybody else. It may scarcely be allowed to leap—when it were its joy to follow the line of the rainbow, and to break sunbeams with the clouds.[9]

Through this writing, as well as personal contact, Busoni had a profound effect on many musicians and composers, perhaps most notably on his pupil, Edgard Varèse, who said:

Together we used to discuss what direction the music of the future would, or rather, should take and could not take as long as the straitjacket of the tempered system. He deplored that his own keyboard instrument had conditioned our ears to accept only an infinitesimal part of the infinite gradations of sounds in nature. He was very much interested in the electrical instruments we began to hear about, and I remember particularly one he had read of called the Dynamophone. All through his writings one finds over and over again predictions about the music of the future which have since come true. In fact, there is hardly a development that he did not foresee, as for instance in this extraordinary prophecy: 'I almost think that in the new great music, machines will also be necessary and will be assigned a share in it. Perhaps industry, too, will bring forth her share in the artistic ascent.'[10]

Futurists

In Italy, the Futurists approached the changing musical aesthetic from a different angle. A major thrust of the Futurist philosophy was to value "noise," and to place artistic and expressive value on sounds that had previously not been considered even remotely musical. Balilla Pratella's "Technical Manifesto of Futurist Music" (1911) states that their credo is: "To present the musical soul of the masses, of the great factories, of the railways, of the transatlantic liners, of the battleships, of the automobiles and airplanes. To add to the great central themes of the musical poem the domain of the machine and the victorious kingdom of Electricity."[11]

On 11 March 1913, futurist Luigi Russolo published his manifesto "The Art of Noises". He held the first "art-of-noises" concert in Milan on 21 April 1914, using his Intonarumori, described by Russolo as "acoustical noise-instruments, whose sounds (howls, roars, shuffles, gurgles, etc.) were hand-activated and projected by horns and megaphones."[12] In June, similar concerts were held in Paris.

The 1920s to 1930s

This period brought a wealth of early electronic instruments and the first compositions for electronic instruments. The first new electronic musical instrument of the period, the etherphone (or ætherphone), was created by Léon Theremin (born Lev Termen) between 1919 and 1920 in Leningrad; it was eventually renamed the theremin (sometimes also known as the thereminophone or termenvox/thereminvox). This led to the first compositions for electronic instruments, as opposed to noisemakers and re-purposed machines. In 1929, Joseph Schillinger composed First Airphonic Suite for Theremin and Orchestra, premièred by the Cleveland Orchestra with Leon Theremin as soloist.

In addition to the theremin, the ondes Martenot was invented in 1928 by Maurice Martenot, who debuted it in Paris.[13]

In 1924, George Antheil composed a score for Fernand Léger's film Ballet Mécanique, featuring synchronized player pianos, airplane propellers, and other artificial sounds.[clarification needed] In 1929, Antheil composed for mechanical devices, electrical noisemakers, motors, and amplifiers in his unfinished opera, Mr. Bloom.[citation needed]

Recording of sounds made a leap in 1927, when American inventor J. A. O'Neill developed a recording device that used magnetically coated ribbon. However, this was a commercial failure. Two years later, Laurens Hammond established his company for the manufacture of electronic instruments. He went on to produce the Hammond organ, which was based on the principles of the Telharmonium, along with other developments including early reverberation units.[14] Hammond (along with John Hanert and C. N. Williams) would also go on to invent another electronic instrument, the Novachord, which Hammond's company manufactured from 1939–1942.[15]

The method of photo-optic sound recording used in cinematography made it possible to obtain a visible image of a sound wave, as well as to realize the opposite goal—synthesizing a sound from an artificially drawn sound wave.

In this same period, experiments began with sound art, early practitioners of which include Tristan Tzara, Kurt Schwitters, Filippo Tommaso Marinetti, and others. The animation film L'Idee (1932) by Berthold Bartosch featured a score composed by Arthur Honegger with ondes Martenot and chamber orchestra.[16][17]

Development: 1940s to 1950s

Electroacoustic tape music

Halim El-Dabh at a Cleveland festival in 2009

Low-fidelity magnetic wire recorders had been in use since around 1900[18] and in the early 1930s the movie industry began to convert to the new optical sound-on-film recording systems based on the photoelectric cell.[19] It was around this time that the German electronics company AEG developed the first practical audio tape recorder, the "Magnetophon" K-1, which was unveiled at the Berlin Radio Show in August 1935.[20]

During World War II, Walter Weber rediscovered and applied the AC biasing technique, which dramatically improved the fidelity of magnetic recording by adding an inaudible high-frequency tone. It extended the 1941 'K4' Magnetophone frequency curve to 10 kHz and improved the dynamic range up to 60 dB,[21] surpassing all known recording systems at that time.[22]

As early as 1942, AEG was making test recordings in stereo.[23] However, these devices and techniques remained a secret outside Germany until the end of WWII, when captured Magnetophon recorders and reels of I.G. Farben ferric-oxide recording tape were brought back to the United States by Jack Mullin and others.[24] These captured recorders and tapes were the basis for the development of America's first commercially made professional tape recorder, the Model 200, manufactured by the American company Ampex[25] with support from entertainer Bing Crosby, who became one of the first performers to record radio broadcasts and studio master recordings on tape.[26]

Magnetic audio tape opened up a vast new range of sonic possibilities to musicians, composers, producers and engineers. Audio tape was relatively cheap and very reliable, and its fidelity of reproduction was better than any audio medium to date. Most importantly, unlike discs, it offered the same plasticity of use as film. Tape can be slowed down, sped up or even run backwards during recording or playback, with often startling effect. It can be physically edited in much the same way as film, allowing for unwanted sections of a recording to be seamlessly removed or replaced; likewise, segments of tape from other sources can be edited in. Tape can also be joined to form endless loops that continually play repeated patterns of pre-recorded material. Audio amplification and mixing equipment further expanded tape's capabilities as a production medium, allowing multiple pre-taped recordings (and/or live sounds, speech or music) to be mixed together and simultaneously recorded onto another tape with relatively little loss of fidelity. Another unforeseen windfall was that tape recorders can be relatively easily modified to become echo machines that produce complex, controllable, high-quality echo and reverberation effects (most of which would be practically impossible to achieve by mechanical means).
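
The tape-echo effect described above translates directly into the digital domain as a feedback delay line. The following Python sketch is a minimal illustration of that principle only, not of any particular historical machine; the delay time, feedback amount, and use of NumPy are arbitrary choices for the example.

```python
import numpy as np

def tape_echo(signal, sample_rate=44100, delay_seconds=0.3, feedback=0.5):
    """Approximate a tape echo: each output sample is fed a delayed,
    attenuated copy of the output itself, like a loop of tape running
    past a record head and a playback head."""
    delay_samples = int(delay_seconds * sample_rate)
    output = np.copy(signal).astype(float)
    for i in range(delay_samples, len(output)):
        output[i] += feedback * output[i - delay_samples]
    return output

# Example: apply the echo to one second of a 440 Hz sine tone.
t = np.linspace(0, 1.0, 44100, endpoint=False)
dry = np.sin(2 * np.pi * 440 * t)
wet = tape_echo(dry)
```

Longer delay times give discrete tape-style repeats, while very short delays with high feedback approach the sound of reverberation.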

The spread of tape recorders eventually led to the development of electroacoustic tape music. The first known example was composed in 1944 by Halim El-Dabh, a student in Cairo, Egypt.[27] He recorded the sounds of an ancient zaar ceremony using a cumbersome wire recorder and, at the Middle East Radio studios, processed the material using reverberation, echo, voltage controls, and re-recording. The resulting work, entitled The Expression of Zaar, was presented in 1944 at an art gallery event in Cairo. While his initial experiments in tape-based composition were not widely known outside of Egypt at the time, El-Dabh is also notable for his later work in electronic music at the Columbia-Princeton Electronic Music Center in the late 1950s.[28]

Musique concrète

It wasn't long[when?] before composers in Paris also began using the tape recorder to develop a new technique for composition called musique concrète. This technique involved editing together recorded fragments of natural and industrial sounds.[29] The first pieces of musique concrète in Paris were assembled by Pierre Schaeffer, who went on to collaborate with Pierre Henry.

On 5 October 1948, Radiodiffusion Française (RDF) broadcast composer Pierre Schaeffer's Etude aux chemins de fer. This was the first "movement" of Cinq études de bruits, and marked the beginning of studio realizations[30] and musique concrète (or acousmatic art). Schaeffer employed a disk-cutting lathe, four turntables, a four-channel mixer, filters, an echo chamber, and a mobile recording unit. Not long after this, Henry began collaborating with Schaeffer, a partnership that would have profound and lasting effects on the direction of electronic music. Another associate of Schaeffer, Edgard Varèse, began work on Déserts, a work for chamber orchestra and tape. The tape parts were created at Pierre Schaeffer's studio, and were later revised at Columbia University.

In 1950, Schaeffer gave the first public (non-broadcast) concert of musique concrète at the École Normale de Musique de Paris. "Schaeffer used a PA system, several turntables, and mixers. The performance did not go well, as creating live montages with turntables had never been done before."[31] Later that same year, Pierre Henry collaborated with Schaeffer on Symphonie pour un homme seul (1950), the first major work of musique concrète. In Paris in 1951, in what was to become an important worldwide trend, RTF established the first studio for the production of electronic music. Also in 1951, Schaeffer and Henry produced an opera, Orpheus, for concrete sounds and voices.

Elektronische Musik

Karlheinz Stockhausen in the Electronic Music Studio of WDR, Cologne, in 1991

Karlheinz Stockhausen worked briefly in Schaeffer's studio in 1952, and afterward for many years at the WDR Cologne's Studio for Electronic Music.

In Cologne, what would become the most famous electronic music studio in the world was officially opened at the radio studios of the NWDR in 1953, though it had been in the planning stages as early as 1950, and early compositions were made and broadcast in 1951.[32] The brainchild of Werner Meyer-Eppler, Robert Beyer, and Herbert Eimert (who became its first director), the studio was soon joined by Karlheinz Stockhausen and Gottfried Michael Koenig. In his 1949 thesis Elektronische Klangerzeugung: Elektronische Musik und Synthetische Sprache, Meyer-Eppler had conceived the idea of synthesizing music entirely from electronically produced signals; in this way, elektronische Musik was sharply differentiated from French musique concrète, which used sounds recorded from acoustical sources.[33]

"With Stockhausen and Mauricio Kagel in residence, it became a year-round hive of charismatic avante-gardism [sic]"[34] on two occasions combining electronically generated sounds with relatively conventional orchestras—in Mixtur (1964) and Hymnen, dritte Region mit Orchester (1967).[35] Stockhausen stated that his listeners had told him his electronic music gave them an experience of "outer space," sensations of flying, or being in a "fantastic dream world".[36] More recently, Stockhausen turned to producing electronic music in his own studio in Kürten, his last work in the medium being Cosmic Pulses (2007).

Japanese electronic music

While early electric instruments such as the ondes Martenot, theremin, and trautonium were little known in Japan prior to World War II,[neutrality is disputed][37] certain composers such as Minao Shibata had known about them at the time. Several years after the end of World War II, musicians in Japan began experimenting with electronic music; institutional sponsorship enabled composers to experiment with the latest audio recording and processing equipment, resulting in some of the most dedicated early efforts in the field. These efforts represented an infusion of Asian music into the emerging genre and would eventually pave the way for Japan's dominance in the development of music technology several decades later.[38]

Following the foundation of electronics company Sony (then called Tokyo Tsushin Kogyo K.K.) in 1946, two Japanese composers, Toru Takemitsu and Minao Shibata, independently wrote about the possible use of electronic technology to produce music during the late 1940s.[39] In 1948, Takemitsu conceived of a technology that would "bring noise into tempered musical tones inside a busy small tube," an idea similar to Pierre Schaeffer's musique concrète the same year, which Takemitsu was unaware of until several years later. In 1949, Shibata wrote about his concept of "a musical instrument with very high performance" that can "synthesize any kind of sound waves" and is "operated very easily," predicting that with such an instrument, "the music scene will be changed drastically."[40] The same year, Sony developed the magnetic tape recorder G-Type,[41] which became a popular recording device in courtrooms and government offices, leading to Sony releasing the H-Type for home use by 1951.[38]

In 1950, the Jikken Kōbō (Experimental Workshop) electronic music studio was founded by a group of musicians in order to produce experimental electronic music using Sony tape recorders. It included musicians such as Toru Takemitsu, Kuniharu Akiyama, and Joji Yuasa, and was supported by Sony, which offered them access to the latest audio technology, hired Takemitsu to compose electronic tape music to demonstrate their tape recorders, and sponsored concerts.[42] The group's first electronic tape pieces were "Toraware no Onna" ("Imprisoned Woman") and "Piece B", completed in 1951 by Kuniharu Akiyama.[43] Many of the electroacoustic tape pieces they produced were used as incidental music for radio, film, and theatre. They also held concerts such as 1953's Experimental Workshop, 5th Exhibition, which employed an 'auto-slide', a machine developed by Sony that made it possible to synchronize a slide show with a soundtrack recorded on tape; they used the same device to produce the concert's tape music at the Sony studio. The concert, along with the experimental electroacoustic tape music they produced, anticipated the introduction of musique concrète in Japan later that year.[44] Beyond the Jikken Kōbō, several other composers such as Yasushi Akutagawa, Saburo Tominaga, and Shiro Fukai were also experimenting with radiophonic tape music between 1952 and 1953.[41]

Japan was introduced to musique concrète through Toshiro Mayuzumi, who in 1952 attended a Schaeffer concert in Paris.[43] On his return to Japan, he experimented with a short tape music piece for the 1952 comedy film Carmen Jyunjyosu (Carmen With Pure Heart)[45] and then produced X, Y, Z for Musique Concrète, broadcast by the JOQR radio station in 1953.[43] Mayuzumi also composed another musique concrète piece for Yukio Mishima's 1954 radio drama Boxing.[45] Schaeffer's French concept of objet sonore (sound object), however, was not influential among Japanese composers, whose main interest in music technology was instead to, according to Mayuzumi, overcome the restrictions of "the materials or the boundary of human performance."[46] This led to several Japanese electroacoustic musicians making use of serialism and twelve-tone techniques,[46] evident in Yoshirō Irino's 1951 dodecaphonic piece "Concerto da Camera",[45] in the organization of electronic sounds in Mayuzumi's "X, Y, Z for Musique Concrète", and later in Shibata's electronic music by 1956.[47]

Following the model of the NWDR studio in Cologne, Japan's national broadcaster NHK established an electronic music studio in Tokyo in 1955, which became one of the world's leading electronic music facilities. The NHK studio was equipped with technologies such as tone-generating and audio-processing equipment, recording and radiophonic equipment, the ondes Martenot, Monochord and Melochord, sine-wave oscillators, tape recorders, ring modulators, band-pass filters, and four- and eight-channel mixers. Musicians associated with the studio included Toshiro Mayuzumi, Minao Shibata, Joji Yuasa, Toshi Ichiyanagi, and Toru Takemitsu. The studio's first electronic compositions were completed in 1955, including Mayuzumi's five-minute pieces "Studie I: Music for Sine Wave by Proportion of Prime Number", "Music for Modulated Wave by Proportion of Prime Number", and "Invention for Square Wave and Sawtooth Wave", produced using the studio's various tone-generating capabilities, and Shibata's 20-minute stereo piece "Musique Concrète for Stereophonic Broadcast".[48][49]

In the late 1940s, Ikutaro Kakehashi founded Kakehashi Watch Shop, repairing watches and radios; in 1954 he founded Kakehashi Musen ("Kakehashi Radio"), which by 1960 had grown into the company Ace Tone and by 1972 became the Roland Corporation. Kakehashi began producing electronic musical instruments in 1955, with the aim of creating devices that could produce monophonic melodies. During the late 1950s he produced theremins, ondes Martenots, and electronic keyboards, and by 1959, a Hawaiian guitar amplifier and electronic organs.[50][neutrality is disputed][51]

In the 1970s, the composer Isao Tomita created electronic arrangements of music by Debussy,[52] Ravel,[53] and Mussorgsky.[54] He received three Grammy nominations for his 1974 album Snowflakes Are Dancing.[55]

American electronic music

In the United States, electronic music was being created as early as 1939, when John Cage published Imaginary Landscape, No. 1, using two variable-speed turntables, frequency recordings, muted piano, and cymbal, but no electronic means of production. Cage composed five more "Imaginary Landscapes" between 1942 and 1952 (one withdrawn), mostly for percussion ensemble, though No. 4 is for twelve radios and No. 5, written in 1952, uses 42 recordings and is to be realized as a magnetic tape. According to Otto Luening, Cage also performed a William [sic] Mix at Donaueschingen in 1954, using eight loudspeakers, three years after his alleged collaboration.[clarification needed] Williams Mix was a success at the Donaueschingen Festival, where it made a "strong impression".[56]

The Music for Magnetic Tape Project was formed by members of the New York School (John Cage, Earle Brown, Christian Wolff, David Tudor, and Morton Feldman),[57] and lasted three years until 1954. Cage wrote of this collaboration: "In this social darkness, therefore, the work of Earle Brown, Morton Feldman, and Christian Wolff continues to present a brilliant light, for the reason that at the several points of notation, performance, and audition, action is provocative."[58]

Cage completed Williams Mix in 1953 while working with the Music for Magnetic Tape Project.[59] The group had no permanent facility, and had to rely on borrowed time in commercial sound studios, including the studio of Louis and Bebe Barron.

Columbia-Princeton Center

Around this time, Columbia University purchased its first tape recorder—a professional Ampex machine—for the purpose of recording concerts. Vladimir Ussachevsky, who was on the music faculty of Columbia University, was placed in charge of the device, and almost immediately began experimenting with it.

Herbert Russcol writes: "Soon he was intrigued with the new sonorities he could achieve by recording musical instruments and then superimposing them on one another."[60] Ussachevsky said later: "I suddenly realized that the tape recorder could be treated as an instrument of sound transformation."[60] On Thursday, May 8, 1952, Ussachevsky presented several demonstrations of tape music/effects that he created at his Composers Forum, in the McMillin Theatre at Columbia University. These included Transposition, Reverberation, Experiment, Composition, and Underwater Valse. In an interview, he stated: "I presented a few examples of my discovery in a public concert in New York together with other compositions I had written for conventional instruments."[60] Otto Luening, who had attended this concert, remarked: "The equipment at his disposal consisted of an Ampex tape recorder . . . and a simple box-like device designed by the brilliant young engineer, Peter Mauzey, to create feedback, a form of mechanical reverberation. Other equipment was borrowed or purchased with personal funds."[61]

Just three months later, in August 1952, Ussachevsky traveled to Bennington, Vermont at Luening's invitation to present his experiments. There, the two collaborated on various pieces. Luening described the event: "Equipped with earphones and a flute, I began developing my first tape-recorder composition. Both of us were fluent improvisors and the medium fired our imaginations."[61] They played some early pieces informally at a party, where "a number of composers almost solemnly congratulated us saying, 'This is it' ('it' meaning the music of the future)."[61]

Word quickly reached New York City. Oliver Daniel telephoned and invited the pair to "produce a group of short compositions for the October concert sponsored by the American Composers Alliance and Broadcast Music, Inc., under the direction of Leopold Stokowski at the Museum of Modern Art in New York. After some hesitation, we agreed. . . . Henry Cowell placed his home and studio in Woodstock, New York, at our disposal. With the borrowed equipment in the back of Ussachevsky's car, we left Bennington for Woodstock and stayed two weeks. . . . In late September, 1952, the travelling laboratory reached Ussachevsky's living room in New York, where we eventually completed the compositions."[61]

Two months later, on October 28, Vladimir Ussachevsky and Otto Luening presented the first Tape Music concert in the United States. The concert included Luening's Fantasy in Space (1952)—"an impressionistic virtuoso piece"[61] using manipulated recordings of flute—and Low Speed (1952), an "exotic composition that took the flute far below its natural range."[61] Both pieces were created at the home of Henry Cowell in Woodstock, NY. After several concerts caused a sensation in New York City, Ussachevsky and Luening were invited onto a live broadcast of NBC's Today Show to do an interview demonstration—the first televised electroacoustic performance. Luening described the event: "I improvised some [flute] sequences for the tape recorder. Ussachevsky then and there put them through electronic transformations."[62]

1954 saw the advent of what would now be considered authentic electric-plus-acoustic compositions—acoustic instrumentation augmented or accompanied by recordings of manipulated or electronically generated sound. Three major works were premiered that year: Varèse's Déserts, for chamber ensemble and tape sounds, and two works by Luening and Ussachevsky: Rhapsodic Variations for the Louisville Symphony and A Poem in Cycles and Bells, both for orchestra and tape. Because Varèse had been working at Schaeffer's studio, the tape part for his work contains many more concrète sounds than electronic ones. "A group made up of wind instruments, percussion and piano alternates with the mutated sounds of factory noises and ship sirens and motors, coming from two loudspeakers."[63]

At the German premiere of Déserts in Hamburg, which was conducted by Bruno Maderna, the tape controls were operated by Karlheinz Stockhausen.[63] The title Déserts suggested to Varèse not only "all physical deserts (of sand, sea, snow, of outer space, of empty streets), but also the deserts in the mind of man; not only those stripped aspects of nature that suggest bareness, aloofness, timelessness, but also that remote inner space no telescope can reach, where man is alone, a world of mystery and essential loneliness."[64]

In 1958, Columbia-Princeton developed the RCA Mark II Sound Synthesizer, the first programmable synthesizer.[65] This device was actually a special-purpose, digitally controlled analogue computer; it was the first electronic music synthesizer in which a large range of sounds could not only be produced and sequenced but also be programmed by the user. This programming feature had a profound influence on the nature of Milton Babbitt's electronic music.[citation needed] Prominent composers such as Vladimir Ussachevsky, Otto Luening, Milton Babbitt, Charles Wuorinen, Halim El-Dabh, Bülent Arel, and Mario Davidovsky used the RCA Synthesizer extensively in various compositions.[66] One of the most influential composers associated with the early years of the studio was Egypt's Halim El-Dabh who,[67] after having developed the earliest known electronic tape music in 1944,[27] became better known for Leiyla and the Poet, a 1959 series of electronic compositions that stood out for its immersive, seamless fusion of electronic and folk music, in contrast to the more mathematical approach used by serial composers of the time such as Babbitt. El-Dabh's Leiyla and the Poet, released as part of the album Columbia-Princeton Electronic Music Center in 1961, would be cited as a strong influence by a number of musicians, ranging from Neil Rolnick, Charles Amirkhanian, and Alice Shields to rock musicians Frank Zappa and The West Coast Pop Art Experimental Band.[68]

Stochastic music

An important new development was the advent of computers for the purpose of composing music, as opposed to manipulating or creating sounds. Iannis Xenakis began what is called musique stochastique, or stochastic music, which is a composing method that uses mathematical probability systems. Different probability algorithms were used to create a piece under a set of parameters. Xenakis used computers to compose pieces like ST/4 for string quartet and ST/48 for orchestra (both 1962),[69] Morsima-Amorsima, ST/10, and Atrées. He developed the computer system UPIC for translating graphical images into musical results and composed Mycènes Alpha (1978) with it.
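
As a rough illustration of the stochastic idea, the sketch below draws pitches and durations from weighted probability distributions. It is a toy example rather than Xenakis's actual procedures, and the scale, weights, and durations are arbitrary assumptions made for the illustration.

```python
import random

# Candidate pitches (MIDI note numbers) and their relative probabilities.
pitches = [60, 62, 63, 65, 67, 68, 70, 72]
pitch_weights = [4, 2, 3, 2, 4, 1, 2, 3]

# Candidate durations in beats and their relative probabilities.
durations = [0.25, 0.5, 1.0, 2.0]
duration_weights = [3, 4, 2, 1]

def stochastic_phrase(length=16, seed=None):
    """Generate a phrase as (pitch, duration) pairs drawn at random
    according to the probability weights above."""
    rng = random.Random(seed)
    return [(rng.choices(pitches, pitch_weights)[0],
             rng.choices(durations, duration_weights)[0])
            for _ in range(length)]

print(stochastic_phrase(seed=1962))
```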

Mid-to-late 1950s

In 1954, Stockhausen composed his Elektronische Studie II—the first electronic piece to be published as a score. In 1955, more experimental and electronic studios began to appear. Notable among them were the Studio di fonologia musicale di Radio Milano, a studio at NHK in Tokyo founded by Toshiro Mayuzumi, and the Philips studio at Eindhoven, the Netherlands, which moved to the University of Utrecht as the Institute of Sonology in 1960.

The score for Forbidden Planet, by Louis and Bebe Barron,[70] was composed in 1956 entirely using custom-built electronic circuits and tape recorders.

The world's first computer to play music was CSIRAC, which was designed and built by Trevor Pearcey and Maston Beard. Mathematician Geoff Hill programmed the CSIRAC to play popular melodies from the very early 1950s. In 1951 it publicly played the Colonel Bogey March, of which no known recordings exist.[71] However, CSIRAC played standard repertoire and was not used to extend musical thinking or composition practice; although its performances were never recorded, the music it played has been accurately reconstructed. The oldest known recordings of computer-generated music were of the Ferranti Mark 1 computer, a commercial version of the Baby machine from the University of Manchester, in the autumn of 1951. The music program was written by Christopher Strachey.

The impact of computers continued in 1956. Lejaren Hiller and Leonard Isaacson composed Illiac Suite for string quartet, the first complete work of computer-assisted composition using algorithmic composition. "... Hiller postulated that a computer could be taught the rules of a particular style and then called on to compose accordingly."[72] Later developments included the work of Max Mathews at Bell Laboratories, who in 1957 developed MUSIC I, one of the first computer programs to play electronic music. Vocoder technology was also a major development in this early era. In 1956, Stockhausen composed Gesang der Jünglinge, the first major work of the Cologne studio, based on a text from the Book of Daniel. An important technological development of that year was the invention of the Clavivox synthesizer by Raymond Scott, with subassembly by Robert Moog.
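
Hiller's rule-based approach can be caricatured as "generate at random, keep only what obeys the rules." The Python sketch below is a toy illustration of that generate-and-test idea, not the actual Illiac Suite procedures; the scale and the two stylistic rules are invented for the example.

```python
import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major, one octave (MIDI numbers)

def obeys_rules(melody, max_leap=7):
    """Toy stylistic rules: no leap larger than a fifth (7 semitones),
    and the melody must end on the tonic pitch (here, the lowest C)."""
    leaps_ok = all(abs(b - a) <= max_leap for a, b in zip(melody, melody[1:]))
    return leaps_ok and melody[-1] == SCALE[0]

def compose(length=8, attempts=10000, seed=None):
    """Generate random melodies and return the first one that passes the rules."""
    rng = random.Random(seed)
    for _ in range(attempts):
        melody = [rng.choice(SCALE) for _ in range(length)]
        if obeys_rules(melody):
            return melody
    return None

print(compose(seed=1956))
```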

Also in 1957, Kid Baltan (Dick Raaymakers) and Tom Dissevelt released their debut album, Song Of The Second Moon, recorded at the Philips studio.[73] The public remained interested in the new sounds being created around the world, as can be seen from the inclusion of Varèse's Poème électronique, which was played over four hundred loudspeakers at the Philips Pavilion of the 1958 Brussels World Fair. That same year, Mauricio Kagel, an Argentine composer, composed Transición II. The work was realized at the WDR studio in Cologne. Two musicians perform on a piano, one in the traditional manner, the other playing on the strings, frame, and case. Two other performers use tape to unite the presentation of live sounds with its future (pre-recorded material to be heard later) and its past (recordings made earlier in the performance).

Expansion: 1960s

These were fertile years for electronic music—not just for academia, but for independent artists as synthesizer technology became more accessible. By this time, a strong community of composers and musicians working with new sounds and instruments was established and growing. 1960 witnessed the composition of Luening's Gargoyles for violin and tape as well as the premiere of Stockhausen's Kontakte for electronic sounds, piano, and percussion. This piece existed in two versions—one for 4-channel tape, and the other for tape with human performers. "In Kontakte, Stockhausen abandoned traditional musical form based on linear development and dramatic climax. This new approach, which he termed 'moment form,' resembles the 'cinematic splice' techniques in early twentieth century film."[74]

The theremin had been in use since the 1920s but it attained a degree of popular recognition through its use in science-fiction film soundtrack music in the 1950s (e.g., Bernard Herrmann's classic score for The Day the Earth Stood Still).[75]

In the UK in this period, the BBC Radiophonic Workshop (established in 1958) came to prominence, thanks in large measure to their work on the BBC science-fiction series Doctor Who. One of the most influential British electronic artists in this period[76] was Workshop staffer Delia Derbyshire, who is now famous for her 1963 electronic realisation of the iconic Doctor Who theme, composed by Ron Grainer.

Israeli composer Josef Tal at the Electronic Music Studio in Jerusalem (c. 1965). On the right, Hugh Le Caine's sound synthesizer the Special Purpose Tape Recorder.

In 1961, Josef Tal established the Centre for Electronic Music in Israel at The Hebrew University, and in 1962 Hugh Le Caine arrived in Jerusalem to install his Creative Tape Recorder in the centre.[77] In the 1990s, Tal conducted, together with Dr Shlomo Markel and in cooperation with the Technion – Israel Institute of Technology and the VolkswagenStiftung, a research project (Talmark) aimed at the development of a novel musical notation system for electronic music.[78]

Milton Babbitt composed his first electronic work using the synthesizer—his Composition for Synthesizer (1961)—which he created using the RCA synthesizer at the Columbia-Princeton Electronic Music Center.

For Babbitt, the RCA synthesizer was a dream come true for three reasons. First, the ability to pinpoint and control every musical element precisely. Second, the time needed to realize his elaborate serial structures were brought within practical reach. Third, the question was no longer "What are the limits of the human performer?" but rather "What are the limits of human hearing?"[79]

Collaborations also occurred across oceans and continents. In 1961, Ussachevsky invited Varèse to the Columbia-Princeton Electronic Music Center (CPEMC). Upon arrival, Varèse embarked upon a revision of Déserts. He was assisted by Mario Davidovsky and Bülent Arel.[80]

The intense activity occurring at CPEMC and elsewhere inspired the establishment of the San Francisco Tape Music Center in 1963 by Morton Subotnick, with additional members Pauline Oliveros, Ramon Sender, Anthony Martin, and Terry Riley.[citation needed]

Later, the Center moved to Mills College, directed by Pauline Oliveros, where it is today known as the Center for Contemporary Music.[81]

Simultaneously in San Francisco, composer Stan Shaff and equipment designer Doug McEachern presented the first "Audium" concert at San Francisco State College (1962), followed by a work at the San Francisco Museum of Modern Art (1963), conceived of as the controlled movement of sound in space over time. Twelve speakers surrounded the audience, and four speakers were mounted on a rotating, mobile-like construction above.[82] In an SFMOMA performance the following year (1964), San Francisco Chronicle music critic Alfred Frankenstein commented, "the possibilities of the space-sound continuum have seldom been so extensively explored".[82] In 1967, the first Audium, a "sound-space continuum", opened, holding weekly performances through 1970. In 1975, enabled by seed money from the National Endowment for the Arts, a new Audium opened, designed floor to ceiling for spatial sound composition and performance.[83] "In contrast, there are composers who manipulated sound space by locating multiple speakers at various locations in a performance space and then switching or panning the sound between the sources. In this approach, the composition of spatial manipulation is dependent on the location of the speakers and usually exploits the acoustical properties of the enclosure. Examples include Varese's Poeme Electronique (tape music performed in the Philips Pavilion of the 1958 World Fair, Brussels) and Stanley Schaff's [sic] Audium installation, currently active in San Francisco."[84] Through weekly programs (over 4,500 in 40 years), Shaff "sculpts" sound, performing now-digitized spatial works live through 176 speakers.[85]

A well-known example of the use of Moog's full-sized Moog modular synthesizer is the Switched-On Bach album by Wendy Carlos, which triggered a craze for synthesizer music.

Pietro Grossi was an Italian pioneer of computer composition and tape music, who first experimented with electronic techniques in the early sixties. Grossi was a cellist and composer, born in Venice in 1917. He founded the S 2F M (Studio di Fonologia Musicale di Firenze) in 1963 in order to experiment with electronic sound and composition.

Computer music

CSIRAC, the first computer to play music, did so publicly in August 1951.[86] One of the first large-scale public demonstrations of computer music was a pre-recorded national radio broadcast on the NBC radio network program Monitor on February 10, 1962. In 1961, LaFarr Stuart programmed Iowa State University's CYCLONE computer (a derivative of the Illiac) to play simple, recognizable tunes through an amplified speaker that had been attached to the system originally for administrative and diagnostic purposes. An interview with Mr. Stuart accompanied his computer music.

Laurie Spiegel is also notable for her development of "Music Mouse—an Intelligent Instrument" (1986) for Macintosh, Amiga, and Atari computers. The intelligent-instrument name refers to the program's built-in knowledge of chord and scale convention and stylistic constraints. She continued to update the program through Macintosh OS 9, and as of 2012, it remained available for purchase or demo download from her Web site.

The late 1950s, 1960s, and 1970s also saw the development of large mainframe computer synthesis. Starting in 1957, Max Mathews of Bell Labs developed the MUSIC programs, culminating in MUSIC V, a direct digital synthesis language.[87]

Live electronics

In Europe in 1964, Karlheinz Stockhausen composed Mikrophonie I for tam-tam, hand-held microphones, filters, and potentiometers, and Mixtur for orchestra, four sine-wave generators, and four ring modulators. In 1965 he composed Mikrophonie II for choir, Hammond organ, and ring modulators.[88]

In 1966–67, Reed Ghazala discovered and began to teach "circuit bending"—the application of the creative short circuit, a process of chance short-circuiting, creating experimental electronic instruments, exploring sonic elements mainly of timbre and with less regard to pitch or rhythm, and influenced by John Cage’s aleatoric music [sic] concept.[89]

Popularization: 1970s to early 1980s

Synthesizers

Released in 1970 by Moog Music, the Minimoog was among the first widely available, portable, and relatively affordable synthesizers, and it became the most widely used synthesizer of its time in both popular music and electronic art music.[90] Patrick Gleeson, playing live with Herbie Hancock at the beginning of the 1970s, pioneered the use of synthesizers in a touring context, where they were subject to stresses the early machines were not designed for.[91][92]

In 1974, the WDR studio in Cologne acquired an EMS Synthi 100 synthesizer, which a number of composers used to produce notable electronic works—including Rolf Gehlhaar's Fünf deutsche Tänze (1975), Karlheinz Stockhausen's Sirius (1975–76), and John McGuire's Pulse Music III (1978).[93]

The early 1980s saw the rise of bass synthesizers, the most influential being the Roland TB-303, a bass synthesizer and sequencer released in late 1981 that later became a fixture in electronic dance music,[94] particularly acid house.[95] One of the first to use it was Charanjit Singh in 1982, though it wouldn't be popularized until Phuture's "Acid Tracks" in 1987.[95]

IRCAM, STEIM, and Elektronmusikstudion

IRCAM in Paris became a major center for computer music research and realization and for the development of the Sogitec 4X computer system,[96] featuring then-revolutionary real-time digital signal processing. Pierre Boulez's Répons (1981) for 24 musicians and 6 soloists used the 4X to transform and route soloists to a loudspeaker system.

STEIM is a center for research and development of new musical instruments in the electronic performing arts, located in Amsterdam, Netherlands. STEIM has existed since 1969. It was founded by Misha Mengelberg, Louis Andriessen, Peter Schat, Dick Raaymakers, Jan van Vlijmen, Reinbert de Leeuw, and Konrad Boehmer. This group of Dutch composers had fought for the reformation of Amsterdam's feudal music structures; they insisted on Bruno Maderna's appointment as musical director of the Concertgebouw Orchestra and secured the first public funding for experimental and improvised electronic music in the Netherlands.

Elektronmusikstudion (EMS), formerly known as Electroacoustic Music in Sweden, is the Swedish national centre for electronic music and sound art. The research organisation started in 1964 and is based in Stockholm.

Rise of popular electronic music

Keith Emerson performing in St. Petersburg in 2008

In the late 1960s, pop and rock musicians, including The Beach Boys and The Beatles, began to use electronic instruments, like the theremin and Mellotron, to supplement and define their sound. By the end of the decade, the Moog synthesizer took a leading place in the sound of emerging progressive rock with bands including Pink Floyd, Yes, Emerson, Lake & Palmer, and Genesis making them part of their sound. Instrumental prog rock was particularly significant in continental Europe, allowing bands like Kraftwerk, Tangerine Dream, Can, and Faust to circumvent the language barrier.[97] Their synthesiser-heavy "krautrock", along with the work of Brian Eno (for a time the keyboard player with Roxy Music), would be a major influence on subsequent electronic rock.[98]

Electronic rock was also produced by several Japanese musicians, including Isao Tomita's Electric Samurai: Switched on Rock (1972), which featured Moog synthesizer renditions of contemporary pop and rock songs,[99] and Osamu Kitajima's progressive rock album Benzaiten (1974).[100] The mid-1970s saw the rise of electronic art music musicians such as Jean Michel Jarre, Vangelis, and Tomita, who with Brian Eno were a significant influence on the development of new-age music.[101]

After the arrival of punk rock, a form of basic electronic rock emerged, increasingly using new digital technology to replace other instruments. Pioneering bands included Ultravox with their 1977 single "Hiroshima Mon Amour",[102] Yellow Magic Orchestra from Japan, Gary Numan, Depeche Mode, and The Human League.[103] Yellow Magic Orchestra in particular helped pioneer synthpop with their self-titled album (1978) and Solid State Survivor (1979). The definition of MIDI and the development of digital audio made the development of purely electronic sounds much easier.[104] These developments led to the growth of synthpop, which after it was adopted by the New Romantic movement, allowed synthesizers to dominate the pop and rock music of the early 80s. Key acts included Duran Duran, Spandau Ballet, A Flock of Seagulls, Culture Club, Talk Talk, Japan and the Eurythmics. Synthpop sometimes used synthesizers to replace all other instruments, until the style began to fall from popularity in the mid-1980s.[103]

Sequencers and drum machines

Music sequencers began to be used around the mid-20th century, Tomita's albums in the mid-1970s being later examples.[99] In 1978, Yellow Magic Orchestra were using computer-based technology in conjunction with a synthesiser to produce popular music,[105] making early use of the microprocessor-based Roland MC-8 Microcomposer sequencer.[106][107][not in citation given]

Drum machines, also known as rhythm machines, came into use around the late 1950s; a later example is Osamu Kitajima's progressive rock album Benzaiten (1974), which used a rhythm machine along with electronic drums and a synthesizer.[100] In 1977, Ultravox's "Hiroshima Mon Amour" was one of the first singles to use the metronome-like percussion of a Roland TR-77 drum machine.[102] In 1980, Roland Corporation released the TR-808, one of the first and most popular programmable drum machines. The first band to use it was Yellow Magic Orchestra in 1980, and it would later gain widespread popularity with the release of Marvin Gaye's "Sexual Healing" and Afrika Bambaataa's "Planet Rock" in 1982.[108] The TR-808 was a fundamental tool in the later Detroit techno scene of the late 1980s, and was the drum machine of choice for Derrick May and Juan Atkins.[109]

Birth of MIDI

In 1980, a group of musicians and music merchants met to standardize an interface that new instruments could use to communicate control instructions with other instruments and computers. This standard was dubbed the Musical Instrument Digital Interface (MIDI) and resulted from a collaboration between leading manufacturers, initially Sequential Circuits, Oberheim, and Roland, and later other participants including Yamaha, Korg, and Kawai.[110] Dave Smith of Sequential Circuits proposed the standard in a paper presented to the Audio Engineering Society in 1981, and in August 1983 the MIDI Specification 1.0 was finalized.

MIDI technology allows a single keystroke, control wheel motion, pedal movement, or command from a microcomputer to activate every device in the studio remotely and in synchrony, with each device responding according to conditions predetermined by the composer.
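
At the byte level, a MIDI control instruction is very small. The Python sketch below assembles the three raw bytes of a "note on" message (status byte, note number, velocity); the channel, pitch, and velocity values here are arbitrary, and actually sending the bytes to an instrument would require a MIDI interface library, which is not shown.

```python
def note_on(channel, note, velocity):
    """Build the three bytes of a MIDI 'note on' message:
    status byte 0x90 plus the channel (0-15), then the note
    number and velocity (each 0-127)."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    """'Note off' uses status byte 0x80; release velocity 0 is used here."""
    return bytes([0x80 | channel, note, 0])

# Middle C (note 60) at moderate velocity on the first channel (numbered 0).
msg = note_on(0, 60, 64)
print(msg.hex())   # '903c40'
```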

MIDI instruments and software made powerful control of sophisticated instruments easily affordable by many studios and individuals. Acoustic sounds became reintegrated into studios via sampling and sampled-ROM-based instruments.

Miller Puckette developed graphic signal-processing software for the 4X called Max (after Max Mathews) and later ported it to the Macintosh (with Dave Zicarelli extending it for Opcode)[111] for real-time MIDI control, bringing algorithmic composition within reach of most composers with a modest background in computer programming.

Digital synthesis

In 1975, the Japanese company Yamaha licensed the algorithms for frequency modulation synthesis (FM synthesis) from John Chowning, who had experimented with it at Stanford University since 1971.[112][113] Yamaha's engineers began adapting Chowning's algorithm for use in a digital synthesizer, adding improvements such as the "key scaling" method to avoid the introduction of distortion that normally occurred in analog systems during frequency modulation.[114] However, the first commercial digital synthesizer to be released was the Australian Fairlight company's Fairlight CMI (Computer Musical Instrument), in 1979, the first practical polyphonic digital synthesizer/sampler system.
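
The heart of Chowning's technique is a carrier sine wave whose phase is modulated by a second sine wave. The Python sketch below is a minimal two-operator FM illustration, not Yamaha's hardware implementation; the frequencies, modulation index, and the absence of an amplitude envelope are simplifications assumed for the example.

```python
import numpy as np

def fm_tone(carrier_hz, modulator_hz, index, seconds=1.0, sample_rate=44100):
    """Two-operator FM: y(t) = sin(2*pi*fc*t + index * sin(2*pi*fm*t)).
    The modulation index controls how many audible sidebands appear,
    and the carrier-to-modulator ratio shapes the resulting timbre."""
    t = np.arange(int(seconds * sample_rate)) / sample_rate
    return np.sin(2 * np.pi * carrier_hz * t
                  + index * np.sin(2 * np.pi * modulator_hz * t))

# A bell-like tone: non-integer carrier-to-modulator ratio, moderate index.
samples = fm_tone(carrier_hz=200.0, modulator_hz=280.0, index=5.0)
```

Raising the modulation index adds sidebands and brightens the timbre, which is one reason FM could produce bell-like and electric-piano tones far more cheaply than additive methods.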

In 1980, Yamaha released the first FM digital synthesizer, the Yamaha GS-1, though at an expensive price.[115] In 1983, Yamaha introduced the first stand-alone digital synthesizer, the DX7, which also used FM synthesis and would become one of the best-selling synthesizers of all time.[112] The DX7 was known for its recognizable bright tonalities, which were partly due to its high sampling rate of 57 kHz.[116]

Barry Vercoe describes one of his experiences with early computer sounds:

At IRCAM in Paris in 1982, flutist Larry Beauregard had connected his flute to DiGiugno's 4X audio processor, enabling real-time pitch-following. On a Guggenheim at the time, I extended this concept to real-time score-following with automatic synchronized accompaniment, and over the next two years Larry and I gave numerous demonstrations of the computer as a chamber musician, playing Handel flute sonatas, Boulez's Sonatine for flute and piano and by 1984 my own Synapse II for flute and computer—the first piece ever composed expressly for such a setup. A major challenge was finding the right software constructs to support highly sensitive and responsive accompaniment. All of this was pre-MIDI, but the results were impressive even though heavy doses of tempo rubato would continually surprise my Synthetic Performer. In 1985 we solved the tempo rubato problem by incorporating learning from rehearsals (each time you played this way the machine would get better). We were also now tracking violin, since our brilliant, young flautist had contracted a fatal cancer. Moreover, this version used a new standard called MIDI, and here I was ably assisted by former student Miller Puckette, whose initial concepts for this task he later expanded into a program called MAX.[117]

Chiptunes

The characteristic lo-fi sound of chip music was initially the result of early sound cards' technical limitations; however, the sound has since become sought after in its own right.

Late 1980s to 1990s

Rise of dance music

The trend has continued to the present day, with modern nightclubs worldwide regularly playing electronic dance music (EDM). Today, electronic dance music has radio stations,[118] websites,[119] and publications like Mixmag dedicated solely to the genre. Moreover, the genre has found commercial and cultural significance in the United States and elsewhere in North America, thanks to the wildly popular big room house/EDM sound that has been incorporated into U.S. pop music[120] and the rise of large-scale commercial raves such as Electric Daisy Carnival, Tomorrowland, and Ultra Music Festival.

Advancements

Other recent developments included the Tod Machover (MIT and IRCAM) composition Begin Again Again for "hypercello", an interactive system of sensors measuring physical movements of the cellist. Max Mathews developed the "Conductor" program for real-time tempo, dynamic and timbre control of a pre-input electronic score. Morton Subotnick released a multimedia CD-ROM All My Hummingbirds Have Alibis.

2000s and 2010s

Qlimax, a large electronic music event held each year in the Netherlands, celebrating the hardstyle subgenre of electronic music

In recent years, as computer technology has become more accessible and music software has advanced, it has become possible to interact with music production technology using means that bear no relationship to traditional musical performance practices:[121] for instance, laptop performance (laptronica)[122] and live coding.[123] In general, the term Live PA refers to any live performance of electronic music, whether with laptops, synthesizers, or other devices.

In the last decade, a number of software-based virtual studio environments have emerged, with products such as Propellerhead's Reason and Ableton Live finding popular appeal.[124] Such tools provide viable and cost-effective alternatives to typical hardware-based production studios, and thanks to advances in microprocessor technology, it is now possible to create high-quality music using little more than a single laptop computer. These advances have democratized music creation,[125] leading to a massive increase in the amount of home-produced electronic music available to the general public via the internet.

Artists can now also individualize their production practice by creating personalized software synthesizers, effects modules, and various composition environments. Devices that once existed exclusively in the hardware domain can easily have virtual counterparts. Some of the more popular software tools for achieving such ends are commercial releases such as Max/MSP and Reaktor and open-source packages such as Csound, Pure Data, SuperCollider, and ChucK.
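
As a concrete, if toy, example of such a personalized instrument, the sketch below, assuming Python with NumPy, builds a simple synthesizer voice from a sawtooth oscillator, a one-pole low-pass filter, and a decay envelope; it illustrates the general idea only and does not reproduce the APIs of Max/MSP, Reaktor, Csound, Pure Data, SuperCollider, or ChucK.

    # Illustrative software-synth voice; names and parameter values are arbitrary.
    import numpy as np
    import wave

    SR = 44100

    def saw(freq, dur):
        """Naive (non-band-limited) sawtooth oscillator."""
        t = np.arange(int(SR * dur)) / SR
        return 2.0 * ((freq * t) % 1.0) - 1.0

    def lowpass(x, cutoff_hz):
        """One-pole low-pass filter, the simplest possible tone control."""
        a = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / SR)
        y = np.zeros_like(x)
        state = 0.0
        for i, s in enumerate(x):
            state += a * (s - state)
            y[i] = state
        return y

    def voice(freq, dur=1.0, cutoff=1200.0):
        """One note: oscillator -> filter -> exponential decay envelope."""
        x = lowpass(saw(freq, dur), cutoff)
        env = np.exp(-4.0 * np.arange(x.size) / SR)
        return x * env

    signal = np.concatenate([voice(f) for f in (110.0, 146.83, 220.0)])
    pcm = (signal * 0.8 * 32767).astype(np.int16)
    with wave.open("softsynth_demo.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SR)
        f.writeframes(pcm.tobytes())

Graphical and text-based environments such as Pure Data or SuperCollider express the same signal chain in their own idioms, but the underlying practice of patching oscillators, filters and envelopes into a custom instrument is the same.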

Circuit bending

Probing for "bends" using a jeweler's screwdriver and alligator clips

Circuit bending is the creative customization of the circuits within electronic devices such as low-voltage, battery-powered guitar effects, children's toys and small digital synthesizers to create new musical or visual instruments and sound generators. Emphasizing spontaneity and randomness, the techniques of circuit bending have commonly been associated with noise music, though many more conventional contemporary musicians and musical groups have been known to experiment with "bent" instruments. Circuit bending usually involves dismantling the machine and adding components such as switches and potentiometers that alter the circuit. With the revival of interest in analogue synthesizers, circuit bending became a cheap way for many experimental musicians to create their own individual analogue sound generators. Many schematics can now be found for building noise generators such as the Atari Punk Console or the Dub Siren, as well as for simple modifications of children's toys, such as the famous Speak & Spell, which circuit benders often modify. Reed Ghazala has explored circuit bending with the Speak & Spell toy, and has held apprenticeships and workshops on circuit bending.[126][not in citation given]

See also

Live electronic music

Footnotes

  1. "The stuff of electronic music is electrically produced or modified sounds. ... two basic definitions will help put some of the historical discussion in its place: purely electronic music versus electroacoustic music" (Holmes 2002, p. 6).
  2. "Electroacoustic music uses electronics to modify sounds from the natural world. The entire spectrum of worldly sounds provides the source material for this music. This is the domain of microphones, tape recorders and digital samplers … can be associated with live or recorded music. During live performances, natural sounds are modified in real time using electronics. The source of the sound can be anything from ambient noise to live musicians playing conventional instruments" (Holmes 2002, p. 8).
  3. "Electronically produced music is part of the mainstream of popular culture. Musical concepts that were once considered radical—the use of environmental sounds, ambient music, turntable music, digital sampling, computer music, the electronic modification of acoustic sounds, and music made from fragments of speech-have now been subsumed by many kinds of popular music. Record store genres including new age, rap, hip-hop, electronica, techno, jazz, and popular song all rely heavily on production values and techniques that originated with classic electronic music" (Holmes 2002, p. 1). "By the 1990s, electronic music had penetrated every corner of musical life. It extended from ethereal sound-waves played by esoteric experimenters to the thumping syncopation that accompanies every pop record" (Lebrecht 1996, p. 106).
  4. Anonymous 2001.
    Note: Historically, the world's first electronic musical instrument using an active component may have been the Audion Piano, developed in 1915 by Lee de Forest, who also invented the audion tube, the first active component for amplification and oscillation. In that era, however, passive components were also being researched for oscillation (e.g. the intermittent glow discharge of neon tubes [1]), and earlier electronic musical instruments may yet be found.
  5. Lua error in package.lua at line 80: module 'strict' not found.
  6. Lua error in package.lua at line 80: module 'strict' not found.
  7. Orton & Davies n.d.
  8. Busoni 1962, p. 95.
  9. Busoni 1962, pp. 76–77.
  10. Russcol 1972, pp. 35–36.
  11. Quoted in Russcol 1972, p. 40.
  12. Russcol 1972, p. 68.
  13. Composers using the instrument ultimately include Boulez, Honegger, Jolivet, Koechlin, Messiaen, Milhaud, Tremblay, and Varèse. In 1937, Messiaen wrote Fête des belles eaux for 6 ondes Martenot, and wrote solo parts for it in Trois petites liturgies de la présence divine (1943–44) and the Turangalîla Symphonie (1946–48/90).
  14. Russcol 1972, p. 70.
  15. Crab 2005.
  16. L'Idee entry at ClassicalArchives
  17. Arthur Honegger entry at FilmReference
  18. Anonymous n.d.(1).
  19. Tyson n.d.
  20. Anonymous 2006.
  21. Engel 2006, pp. 4 and 7.
  22. Krause 2002 abstract.
  23. Engel & Hammar 2006, p. 6.
  24. Snell 2006, [2].
  25. Angus 1984.
  26. Hammar 1999, [3].
  27. 27.0 27.1 Young 2007, p. 24.
  28. Holmes 2008, pp. 156–57.
  29. "Musique Concrete was created in Paris in 1948 from edited collages of everyday noise" (Lebrecht 1996, p. 107).
  30. NB: To the pioneers, an electronic work did not exist until it was "realized" in a real-time performance (Holmes 2008, p. 122).
  31. Snyder n.d.
  32. Eimert 1972, p. 349.
  33. Eimert 1958, p. 2; Ungeheuer 1992, p. 117.
  34. (Lebrecht 1996, p. 75). "... at Northwest German Radio in Cologne (1953), where the term 'electronic music' was coined to distinguish their pure experiments from musique concrete..." (Lebrecht 1996, 107).
  35. Stockhausen 1978, pp. 73–76, 78–79
  36. "In 1967, just following the world premiere of Hymnen, Stockhausen said about the electronic music experience: '... Many listeners have projected that strange new music which they experienced—especially in the realm of electronic music—into extraterrestrial space. Even though they are not familiar with it through human experience, they identify it with the fantastic dream world. Several have commented that my electronic music sounds "like on a different star," or "like in outer space." Many have said that when hearing this music, they have sensations as if flying at an infinitely high speed, and then again, as if immobile in an immense space. Thus, extreme words are employed to describe such experience, which are not "objectively" communicable in the sense of an object description, but rather which exist in the subjective fantasy and which are projected into the extraterrestrial space'" (Holmes 2002, p. 145).
  37. Before World War II, several "electric" instruments appear to have already been developed in Japan (see ja:電子音楽#黎明期), and in 1935 a kind of "electronic" musical instrument, the Yamaha Magna Organ, was developed. It seems to have been a multi-timbral keyboard instrument based on electrically blown free reeds with pickups, possibly similar to the electrostatic reed organs developed by Frederick Albert Hoschke in 1934 and then manufactured by Everett and Wurlitzer until 1961.
    • Lua error in package.lua at line 80: module 'strict' not found.
    • Lua error in package.lua at line 80: module 'strict' not found.
  38. 38.0 38.1 Holmes 2008, p. 106.
  39. Holmes 2008, p. 106 & 115.
  40. Fujii 2004, pp. 64–66.
  41. 41.0 41.1 Fujii 2004, p. 66.
  42. Holmes 2008, pp. 106–7.
  43. 43.0 43.1 43.2 Holmes 2008, p. 107.
  44. Fujii 2004, pp. 66–67.
  45. 45.0 45.1 45.2 Fujii 2004, p. 64.
  46. 46.0 46.1 Fujii 2004, p. 65.
  47. Holmes 2008, p. 108.
  48. Holmes 2008, pp. 108 & 114–5.
  49. Loubet 1997, p. 11
  50. Reid 2004.
  51. In the development and commercialization of electronic musical instruments in Japan, Yamaha (from 1935), Kuroda Organ (from 1955), JVC (from 1958), and Teisco (from 1958) all preceded Ace Tone/Roland. In drum machines, Korg (from 1963) was also earlier. In the commercialization of electric guitars and amplifiers, Teisco (established in 1948) and Guyatone (established in 1956) were the pioneers in Japan.
  52. Lua error in package.lua at line 80: module 'strict' not found.
  53. Lua error in package.lua at line 80: module 'strict' not found.
    Notes: Titled "Daphnis Et Chloé" in Japan, this album was retitled "Bolero" in North America, and "The Ravel Album" in Europe.
  54. Lua error in package.lua at line 80: module 'strict' not found.
  55. Lua error in package.lua at line 80: module 'strict' not found.
  56. Luening 1968, p. 136
  57. Johnson 2002, p. 2.
  58. Johnson 2002, p. 4.
  59. "Carolyn Brown [Earle Brown's wife] was to dance in Cunningham's company, while Brown himself was to participate in Cage's 'Project for Music for Magnetic Tape.'... funded by Paul Williams (dedicatee of the 1953 Williams Mix), who—like Robert Rauschenberg—was a former student of Black Mountain College, which Cage and Cunnigham had first visited in the summer of 1948" (Johnson 2002, p. 20).
  60. 60.0 60.1 60.2 Russcol 1972, p. 92.
  61. 61.0 61.1 61.2 61.3 61.4 61.5 Luening 1968, p. 48.
  62. Luening 1968, p. 49.
  63. 63.0 63.1 Kurtz 1992, pp. 75–76.
  64. Anonymous 1972.
  65. Holmes 2008, pp. 145–46.
  66. Rhea 1980, p. 64.
  67. Holmes 2008, p. 153.
  68. Holmes 2008, pp. 153–54 & 157
  69. Xenakis 1992[page needed]
  70. "From at least Louis and Bebbe Barron's soundtrack for 'The Forbidden Planet" onwards, electronic music—in particular synthetic timbre—has impersonated alien worlds in film" (Norman 2004, p. 32).
  71. Doornbusch 2005[page needed].
  72. Schwartz 1975, p. 347.
  73. Harris n.d.
  74. Kurtz 1992, p. 1.
  75. Glinsky 2000, p. 286.
  76. Delia Derbyshire's Audiological Chronology
  77. Gluck 2005[page needed].
  78. Tal & Markel 2002, pp. 55–62.
  79. Schwartz 1975, p. 124.
  80. Bayly 1982–83, p. 150.
  81. "A central figure in post-war electronic art music, Pauline Oliveros [b. 1932] is one of the original members of the San Francisco Tape Music Center (along with Morton Subotnick, Ramon Sender, Terry Riley, and Anthony Martin), which was the resource on the U.S. west coast for electronic music during the 1960s. The Center later moved to Mills College, where she was its first director, and is now called the Center for Contemporary Music." from CD liner notes, "Accordion & Voice," Pauline Oliveros, Record Label: Important, Catalog number IMPREC140: 793447514024.
  82. 82.0 82.1 Frankenstein 1964.
  83. Loy 1985, pp. 41–48.
  84. Begault 1994, p. 208, online reprint.
  85. Hertelendy 2008.
  86. Doornbusch 2005[page needed].
  87. Mattis 2001.
  88. Stockhausen 1971, pp. 51, 57, 66.
  89. "This element of embracing errors is at the centre of Circuit Bending, it is about creating sounds that are not supposed to happen and not supposed to be heard (Gard 2004). In terms of musicality, as with electronic art music, it is primarily concerned with timbre and takes little regard of pitch and rhythm in a classical sense. ... . In a similar vein to Cage’s aleatoric music, the art of Bending is dependent on chance, when a person prepares to bend they have no idea of the final outcome" (Yabsley 2007).
  90. "In 1969, a portable version of the studio Moog, called the Minimoog Model D, became the most widely used synthesizer in both popular music and electronic art music" Montanaro 2004[page needed].
    Note: Its total-shipments record was later surpassed by the Yamaha DX7 (over 200,000 units between 1983 and 1989) and the Korg M1 (250,000 units between 1988 and 1995). For details, see Yamaha DX7 § Footnote.
  91. Zussman 1982, pp. 1, 5
  92. Sofer & Lynner 1977, p. 23 "Yes, I used [ Moog modular equipment ] until I went with Herbie (Hancock) in 1970. Then I used a [ ARP ] 2600 because I couldn't use the Moog on stage. It was too big and cranky; every time we transported it, we would have to pull a module out, and I knew I couldn't do that on the road, so I started using ARP's."
  93. Morawska-Büngeler 1988, pp. 52, 55, 107–108
  94. Vine 2011.
  95. 95.0 95.1 Aitken 2011.
  96. Schutterhoef 2007 [4].
  97. Bussy 2004, pp. 15–17.
  98. Unterberger 2002, pp. 1330–1.
  99. 99.0 99.1 Jenkins 2007, pp. 133–34
  100. 100.0 100.1 Osamu Kitajima – Benzaiten at Discogs
  101. Holmes 2008, p. 403.
  102. 102.0 102.1 Maginnis n.d.
  103. 103.0 103.1 Anonymous n.d.(2).
  104. Russ 2004, p. 66.[verification needed]
  105. Anonymous 1979.
  106. Yellow Magic Orchestra – Yellow Magic Orchestra at Discogs
  107. Lua error in package.lua at line 80: module 'strict' not found.
  108. Anderson 2008.
  109. Blashill 2004, p. [page needed]
  110. Holmes 2008, p. 227.
  111. Ozab 2000 [5].
  112. 112.0 112.1 Holmes 2008, p. 257.
  113. Chowning 1973.
  114. Holmes 2008, pp. 257–8.
  115. Roads 1996, p. 226.
  116. Holmes 2008, pp. 258–9.
  117. Vercoe 2000, pp. xxviii–xxix.
  118. Electric Area: Bigroom House & DJ Mixes, Sirius XM
  119. Dancing Astronaut website
  120. "House Music: How It Sneaked Its Way Into Mainstream Pop" by Kia Makarechi, The Huffington Post, August 11, 2011
  121. Emmerson 2007, pp. 111–13.
  122. Emmerson 2007, pp. 80–81.
  123. Emmerson 2007, p. 115; Collins 2003.
  124. Anonymous 2009 (Best Audio Editing Software of the Year: 1st Ableton Live, 4th Reason; Best Audio DJ Software of the Year: Ableton Live).
  125. Chadabe 2004, pp. 5–6.
  126. Lua error in package.lua at line 80: module 'strict' not found.

Further reading

  • Best of Electronic Music Podcasts/Eurock Live
  • A. Patterson Light & Sound by Mikhail Chekalin, itunes.apple.com Best of Electronic Music
  • Dorschel, Andreas, Gerhard Eckel, and Deniz Peters (eds.) (2012). Bodily Expression in Electronic Music: Perspectives on Reclaiming Performativity. Routledge Research in Music 2. London and New York: Routledge. ISBN 978-0-415-89080-9.
  • Strange, Allen (1983), Electronic Music: Systems, Technics, and Controls, second ed. Dubuque, Iowa: W.C. Brown Co. ISBN 978-0-697-03602-5.
