Music technology was prominently featured at MIT during “FUTURE PHASES,” a showcase of compositions for string orchestra and electronics, organized by the MIT Music Technology and Computation Graduate Program as part of the 2025 International Computer Music Conference (ICMC).
The well-attended event took place last month in the Thomas Tull Concert Hall of the new Edward and Joyce Linde Music Building, and was produced in partnership with the MIT Media Lab’s Opera of the Future Group and A Far Cry, Boston’s self-conducted chamber orchestra. “FUTURE PHASES” marked the MIT Music Technology and Computation Graduate Program’s first presentation in MIT Music’s new venue.
Among the offerings at “FUTURE PHASES” were two original pieces by MIT composers: the world premiere of “EV6,” by MIT Music’s Kenan Sahin Distinguished Professor Evan Ziporyn and professor of the practice Eran Egozy; and the U.S. premiere of “FLOW Symphony,” by the MIT Media Lab’s Muriel R. Cooper Professor of Music and Media Tod Machover. Three additional works were chosen by a jury from an open call for submissions: “The Wind Will Carry Us Away,” by Ali Balighi; “A Blank Page,” by Celeste Betancur Gutiérrez and Luna Valentin; and “Coastal Portrait: Cycles and Thresholds,” by Peter Lane. All five compositions were performed by the multi-Grammy-nominated A Far Cry.
“The ICMC is focused on showcasing the latest advancements in research, compositions, and performances within electronic music,” says Egozy, director of the new Music Technology and Computation Graduate Program at MIT. When MIT was invited to participate in this year’s conference, “it felt like an ideal chance to highlight MIT’s dedication to music technology, particularly in relation to the exciting new initiatives currently underway: a new master’s program in music technology and computation, the new Edward and Joyce Linde Music Building featuring enhanced music technology resources, and new faculty members arriving at MIT with overlapping appointments between MIT Music and Theater Arts (MTA) and the Department of Electrical Engineering and Computer Science (EECS).” Recently appointed faculty include Anna Huang, a keynote speaker at the conference and creator of the machine learning model Coconet, which powered Google’s first AI Doodle, the Bach Doodle.
Egozy underscores the distinctiveness of this event: “You should appreciate that this is a remarkable circumstance. Having a full 18-member string orchestra [A Far Cry] perform new compositions that include electronics is an unusual occurrence. Typically, ICMC performances are either solely electronic and computer-generated music or feature a small group of two to four musicians. Thus, the opportunity to present to the wider music technology community was especially exhilarating.”
To make the most of this opportunity, an international open call was issued to select the pieces that would complement Ziporyn and Egozy’s “EV6” and Machover’s “FLOW Symphony.” A total of 46 submissions were reviewed by a judging panel that included Egozy, Machover, and other esteemed composers and technologists.
“We received a tremendous assortment of works through this call,” Egozy remarks. “We encountered diverse musical styles and innovative uses of electronics. No two compositions were quite alike, and I believe as a result, our audience gained insight into how varied and captivating a concert can be in this format. A Far Cry truly served as the unifying force, performing all pieces with enthusiasm and nuance. They have a remarkable ability to engage audiences with the music. Furthermore, the circular layout of the Thomas Tull Concert Hall allowed the audience to feel even more connected to the performance.”
Egozy elaborates: “We utilized the advanced technology embedded in the Thomas Tull Concert Hall, which features 24 built-in speakers for surround sound that enabled us to deliver unique, amplified sound to every seat in the venue. It’s likely that each attendee experienced the sound slightly differently, yet there was a consistent sense of a multidimensional evolution of sound as the pieces unfolded.”
The five pieces performed during the event incorporated various technological elements, including playing synthesized, pre-recorded, or digitally altered sounds; attaching microphones to instruments for real-time signal processing; transmitting custom-generated musical notation to the performers; employing generative AI to manipulate live sound and present it in novel and unpredictable manners; and audience involvement, where attendees could use their cellphones as musical instruments, thus participating in the ensemble.
Ziporyn and Egozy’s piece, “EV6,” made particular use of this last element: “Evan and I had previously collaborated on a system called Tutti, which means ‘together’ in Italian. Tutti allows an audience to use their smartphones as musical instruments, enabling us to all play collectively.” Egozy developed the technology, first used in the MIT Campaign for a Better World in 2017, which included a three-minute piece solely for cellphones. “For this concert,” Egozy explains, “Evan had the creative idea to adapt the technology to compose a new piece — this time for audience phones along with a live string orchestra.”
To elaborate on the piece’s name, Ziporyn shares, “I drive an EV6; it’s my first electric vehicle, and when I first got it, it felt akin to driving an iPhone. However, it’s still just a car: it has wheels and an engine, which transports me from one location to another. This seemed to be a fitting metaphor for this composition, where much of the sound is indeed produced on cellphones, yet still behaves like any other musical piece. Additionally, it pays tribute to David Bowie’s song ‘TVC 15,’ which revolves around falling in love with a robot.”
Egozy adds, “We aimed for audience participants to experience the essence of playing in an orchestra. Through this technology, each audience member becomes part of an orchestral section (winds, brass, strings, etc.). As they play, they hear their section producing similar music while also perceiving different sections in various parts of the hall performing distinct sounds. This design allows the audience to feel accountable to their section, appreciate how music transitions among different orchestral parts, and relish the excitement of live performance. In ‘EV6,’ this engagement was even more electrifying since everyone in the audience could collaborate with a live string orchestra — possibly the first occurrence in recorded history.”
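The section design Egozy describes — each joining phone placed into an orchestral section, with each section hearing and producing its own material — can be illustrated with a short sketch. Nothing below comes from the actual Tutti system; the section names follow the article’s description, while the round-robin assignment and the per-section sound palettes are assumptions made purely for illustration.

```python
# Hypothetical sketch of a Tutti-style section assignment (not the real Tutti code).
# Phones are assigned to orchestral sections round-robin as they join, keeping
# sections balanced so each part of the hall contributes a distinct layer.

SECTIONS = ["winds", "brass", "strings", "percussion"]

# Illustrative per-section sound palettes; the real system's sounds are unknown.
SECTION_SOUNDS = {
    "winds": ["flute_c5", "clarinet_g4"],
    "brass": ["trumpet_c4", "horn_f3"],
    "strings": ["violin_a4", "cello_c3"],
    "percussion": ["shaker", "woodblock"],
}


def assign_section(join_order: int) -> str:
    """Assign the nth phone to join (0-indexed) to a section, round-robin."""
    return SECTIONS[join_order % len(SECTIONS)]


def sounds_for_phone(join_order: int) -> list[str]:
    """Return the sound palette a given phone's section would trigger."""
    return SECTION_SOUNDS[assign_section(join_order)]


if __name__ == "__main__":
    # Six phones joining in sequence land in balanced, repeating sections.
    for n in range(6):
        print(n, assign_section(n), sounds_for_phone(n))
```

Round-robin assignment is just one plausible choice; a real system might instead group phones by their physical location in the hall so that each section occupies a contiguous area, matching the spatial effect the quote describes.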
Following the concert, guests enjoyed six demonstrations of music technology showcasing research from undergraduate and graduate students from both the MIT Music program and the MIT Media Lab. These included a gamified interface for utilizing just intonation systems (Antonis Christou); insights from a human-AI co-created concert (Lancelot Blanchard and Perry Naseck); a system for analyzing piano performance data across campus (Ayyub Abdulrezak ’24, MEng ’25); extracting musical features from audio via latent frequency-masked autoencoders (Mason Wang); a device that transforms any surface into a drum machine (Matthew Caren ’25); and a play-along interface for mastering traditional Senegalese rhythms (Mariano Salcedo ’25). This final example led to the development of Senegroove, a drumming application specifically designed for an upcoming edX online course conducted by ethnomusicologist and MIT associate professor in music Patricia Tang, along with world-renowned Senegalese drummer and MIT lecturer in music Lamine Touré, who provided performance videos of the foundational rhythms incorporated in the system.
In conclusion, Egozy reflects, “‘FUTURE PHASES’ illustrated how having the appropriate space — in this case, the new Edward and Joyce Linde Music Building — can truly drive innovation in thinking, inspire new projects, and foster collaboration. My aspiration is for everyone in the MIT community, the Boston area, and beyond to soon recognize what an extraordinary place and environment we are creating, and continue to build here, for music and music technology at MIT.”