Composer explores legacy of computer pioneer Ada Lovelace and the use of AI for musical composition


An exploration of artificial intelligence and musical composition may seem like a distinctly modern pursuit, but Dr. Patricia Alessandrini found the beginnings of the idea in the 19th century. “Ada Lovelace is credited with the first published imaginings of AI-assisted composition,” Alessandrini said. She quoted Lovelace: “Numerous fundamental relations of music can be expressed by those of the abstract science of operations, such that a machine could compose elaborate and scientific pieces of music of any degree of complexity or extent.”

At a Feb. 13 Clayman Institute Artist’s Salon, Alessandrini presented a project developed as a tribute to the 19th-century mathematician. Called Ada’s Song: A Tribute to Ada Lovelace, the work was performed in November 2019 by the Britten Sinfonia, with forces that included a small instrumental ensemble, a soprano vocalist and a device of Alessandrini’s design – the piano machine. At the salon, Alessandrini explained how her interest in human musical expressivity, interactivity and the innovations of Lovelace led to this exploration of computer-assisted composition and performance.

Alessandrini is an assistant professor in the Department of Music as well as the Stanford Center for Computer Research in Music and Acoustics. She is a composer, sound artist and researcher on embodied interaction and immersive experience. Her work has been presented at festivals worldwide, and she has also toured extensively as a performer of live electronics. She previously taught at Bangor University and Goldsmiths, University of London; here at Stanford she teaches composition, sonic arts and computer music. The Clayman Institute’s annual Artist’s Salon, under the leadership of Artist-in-Residence Valerie Miner, shows how the arts contribute to the larger mission of gender equality and research. Miner invites a diverse group of artists from the Stanford community to appear in the Artist’s Salon series.

Before playing a video excerpt of the November performance, Alessandrini explained the project’s development. “For this project, what was most relevant was the piano machine,” she said. “It’s an artificial way of causing the piano to sound through computing.” Created with a former colleague at Goldsmiths, the machine uses MIDI messages – sent from a computer or an electronic piano keyboard – to communicate with microprocessors that control physical devices. Each device houses a small vibration motor – the same kind found in many cell phones – in a clear acrylic casing, connected to a piece inside the piano that touches the strings and causes them to sound. “This idea of physically making this connection between the symbolic and the sound really interested me,” she said.
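That chain from symbolic message to physical vibration can be made concrete with a minimal sketch. The Python fragment below is purely illustrative – the note-to-actuator assignments, the PWM scaling and the command format are assumptions made for the sake of example, not details of Alessandrini’s device – but it shows how a MIDI note-on message might be translated into a drive level for a vibration motor resting on a particular string.

```python
# Illustrative sketch of a MIDI-to-actuator mapping for an imagined
# piano-machine controller. The note assignments, PWM scaling and
# command format are assumptions, not details of the actual device.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Actuator:
    """One vibration motor resting against the strings of a single note."""
    channel: int     # hypothetical microcontroller output channel
    midi_note: int   # the piano key this actuator can sound

# A tiny bank of actuators; a real instrument would cover far more notes.
ACTUATORS = {
    60: Actuator(channel=0, midi_note=60),  # middle C
    64: Actuator(channel=1, midi_note=64),  # E above middle C
    67: Actuator(channel=2, midi_note=67),  # G above middle C
}

def note_on_to_command(note: int, velocity: int) -> Optional[str]:
    """Translate a MIDI note-on (note and velocity both 0-127) into a
    text command for the microcontroller, e.g. 'SET 0 80'.
    Returns None if no actuator sits on that string or velocity is zero."""
    actuator = ACTUATORS.get(note)
    if actuator is None or velocity == 0:
        return None
    duty = round(velocity / 127 * 255)  # map velocity onto an 8-bit PWM duty cycle
    return f"SET {actuator.channel} {duty}"

if __name__ == "__main__":
    # A short C-major arpeggio expressed as (note, velocity) pairs.
    for note, velocity in [(60, 40), (64, 70), (67, 110)]:
        print(note_on_to_command(note, velocity))
```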

"... I always go back to this question of human expression. Where does that lie? Where is that exact aspect?"

Regarding the role of AI in Ada’s Song, Alessandrini said: “The basis is similarity matching.” She compared the process to a smartphone selfie app that matches a user’s photograph with an image from a large bank of stored images. “I can take that same principle and do that in music,” she said, with the computer trying to match a note, passage or sound in order to generate new material. “The trajectory of the piano machine in that piece is that maybe in the beginning you hear it’s rather elementary, and there’s not too much, and then… it’s learning a bit from the musicians, and then by the end” it was taking a more leading role in the performance.
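In its simplest form, similarity matching is a nearest-neighbour search: an incoming fragment from the live musicians is compared against a bank of stored fragments, and the closest match (or something derived from it) becomes the machine’s response. The sketch below illustrates only that general principle – the corpus, the distance measure and the transposed “response” rule are assumptions made for the sake of example, not a description of the Ada’s Song software.

```python
# Generic illustration of similarity matching over short pitch sequences.
# The corpus, distance measure and response rule are assumptions made for
# the sake of example, not a description of the Ada's Song software.

def distance(a, b):
    """Mean absolute pitch difference between two equal-length fragments."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def best_match(fragment, corpus):
    """Return the stored fragment closest to what was just played."""
    return min(corpus, key=lambda stored: distance(fragment, stored))

# A small bank of stored fragments (MIDI note numbers).
CORPUS = [
    [60, 62, 64, 65],  # stepwise ascending line
    [67, 64, 60, 55],  # descending arpeggio
    [60, 60, 67, 67],  # repeated-note figure
]

if __name__ == "__main__":
    heard = [61, 63, 64, 66]                 # fragment "heard" from the players
    match = best_match(heard, CORPUS)
    response = [note + 2 for note in match]  # e.g. echo the match up a whole tone
    print("heard:   ", heard)
    print("matched: ", match)
    print("response:", response)
```

In a scheme like this, the trajectory Alessandrini describes – the machine “learning a bit from the musicians” – could correspond to the stored bank being filled with material captured during the performance itself, though that is an interpretation rather than a detail she specified.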

The piano machine is not the only performer assisted by the computer; the other performers also play or sing from a musical score generated in real time. The soprano wears an earpiece that provides her next notes. “Part of the material she sings is generated in real time, so she’s part of the machine in a way.”

Regarding her inspiration, Ada Lovelace, Alessandrini noted that she “was a strong artificial intelligence skeptic.” Lovelace, often referred to as the first computer programmer for the algorithm she wrote to accompany Charles Babbage’s design for the Analytical Engine, saw limits to the capabilities of such calculating devices. “She said, computers can’t create anything. For creation requires, minimally, originating something. But computers originate nothing. They merely do that which we order them, via programs, to do.” Alessandrini said, “You can imagine with AI theory, these statements are fascinating to reflect upon, and timely right now.”

Lovelace was cited extensively by Alan Turing, the pioneering computer scientist of the mid-20th century. His Turing test of artificial intelligence – can computers have consciousness? Can they fool humans into thinking they are other humans? – “is past now,” Alessandrini said, “too low a bar.” She said, “We can talk about a Lovelace Test, which some theorists put as, ‘Only if computers originate things, should they be believed to have minds.’”


Alessandrini presents her work at the Clayman Institute Artist’s Salon.

Before presenting Ada’s Song, Alessandrini shared her work on other interactive projects. In one, she took a domestic scene – a woman ironing in a home – and made a few changes. “I was thinking about how especially when it was a very traditional image of the women at home, doing the housework – my thing was a push toward innovation in these devices, and at the same time, innovations in hi-fi.” She noted that 1950s marketing for stereo equipment often targeted women. “This was my idea of women as pioneers in ways that maybe we haven’t thought about.”

She turned an iron into an instrument. Where steam usually comes out, a microphone was added. With the iron in a relaxed position, the person operating it – a soprano – could sing into its microphone, providing material for the instrument to process. A vintage radio was also emptied out and fitted with a microprocessor.

In a video game project, Alessandrini networked players from different sites who produced both electronics and a musical score in real time. She also entered – and won – a team hackathon to design prototypes for sex toys. Their challenge: “Getting out of the kind of stereotypes and very strict functions of sex toys, very heteronormative kinds of things.” The Love Pad allowed users to pre-program an experience via a sensuous interface, incorporating her sound design, which could then guide the experience of someone else remotely. She said, “I won a very expensive vibrator I now use for musical purposes.”

In questions at the end of her presentation, Alessandrini returned to some of the ideas she wanted to examine with Ada’s Song. “The means of producing sound – people know this who are working in film – getting really expressive performance from virtual instruments is not so easy. That’s why I always go back to this question of human expression. Where does that lie? Where is that exact aspect?”

Her work is not about replacing human performance, but better understanding how it differs from computer-generated music. “This question of expressivity to me has not been solved, and it’s still an area to work on. That’s a positive thing, because there is an individuality of human expression.”

Photos by Cynthia Newberry, Clayman Institute