Beyond Queen’s Stomp-Stomp-Clap: Concerts and Computer Science Converge In New Research


Freddie Mercury in front of the Wembley Stadium crowd at Live Aid in 1985.

The iconic “stomp-stomp-clap” of UK stadium rock band Queen’s “We Will Rock You” was born out of a challenge that rock stars and professors alike know all too well: how to get large numbers of people participating in a live performance, whether a concert or a lecture, and channel that energy for a sustained period of time.

Sang Won Lee, an assistant professor of computer science in Virginia Tech’s College of Engineering, and his collaborators tested theories about not only engaging large audiences but also sustaining that engagement during a live music performance.

While Lee isn’t part of a world-renowned, stadium-filling rock band, his team uncovered a tool for audience engagement that Queen guitarist Brian May and the late frontman Freddie Mercury never had: live social media.

Lee will present his findings at the Association for Computing Machinery’s 12th Creativity and Cognition Conference in San Diego, California, on June 26, 2019. He collaborated with Walter Lasecki and Danai Koutra, both assistant professors of computer science and engineering, and undergraduate sound engineering student Aaron Willette, all from the University of Michigan.

In addition to presenting his findings in a formal paper, he will perform a concert to demonstrate his participatory smartphone app and create a real-time composition using Crowd in C, an interactive musical piece designed for large-scale audience involvement. The sounds of the composition will be generated solely by the audience.

“This research is important in learning what resonates with larger audiences and prompts people to not only participate in a group, but remain engaged and create an artistic artifact,” said Lee. “Artifacts let the audience see the fruits of their labor as a group and give them something to invest in as far as remaining engaged.”

For Lee’s performance, the audience will log in to an app that presents them with a pattern of dots. By moving the dots, audience members will be able to manipulate the sounds played collectively over a sound system and create their own compositions in the key of C. Lee will be able to change the instrument’s chord in real time to shift the sounds lower or higher, or to play a simple melody.

In developing his composition for smartphones, Lee faced three challenges: engaging a large audience with an instrument simple enough for novices, keeping the audience engaged with their new music makers, and performing a piece of music while the crowd interacted with the app.

Lee patterned the social media engagement tools for Crowd in C after dating apps such as Tinder. Users can listen to individual compositions during the performance and hit a like button in the shape of a heart. Additionally, if two users like each other’s musical profiles, they are greeted with an “it’s a match” message and magic fairy wand sounds.

Lee tested the Crowd in C app last December at the Moss Arts Center in Blacksburg, Virginia, and found that 87 audience members remained engaged at a constant rate for 540 seconds, or nine minutes. On average, audience members sent or received hearts 8.21 times. Sending hearts was driven by a small portion of people: the top 20 percent of participants sent 62.2 percent of all hearts.

While a small number of participants were responsible for sending and receiving the majority of hearts, varied individual approaches to engagement emerged over the course of Lee’s performance: some audience members were socially active, while others focused on musical interaction and contributed to the artifact.

“We saw that social interaction helped audience members stay engaged longer with the app and the performance, so this could be a tool that professors or anyone else who has to captivate large audiences at conferences could use in the future,” Lee said.

He finds it promising that the computer-mediated participatory platform was flexible enough to accommodate various types of participation: some members were influencers, some were lurkers, and some were music geeks.

“Using computer science in nontraditional ways is a wonderful gateway to connect with the public and make technology relatable to people who may not interact or realize they are interacting with computer science on a regular basis,” said Steve Harrison, an associate professor of practice in the Department of Computer Science and the School of Visual Arts, as well as director of the human-centered design program.

Harrison has had dual roles as associate chair and co-chair of the Creativity and Cognition and Designing Interactive Systems conferences this year.

“We are experimenting with the joint format to bring together two related computer science research communities,” he said. “The conferences will host a shared art exhibition and one full day of conference programming to support dialogue between the two overlapping communities.”

Virginia Tech’s Department of Computer Science has a large presence at both conferences this year. In addition to Harrison, Assistant Professor Kurt Luther is serving as papers co-chair of Creativity and Cognition; Professor Deborah Tatar is serving as technical program co-chair for the Designing Interactive Systems conference; and graduate student Aakash Gautam is serving as co-chair for student volunteers.

Whether it’s a raucous concert or a compelling lecture, Lee’s research indicates the rules of audience engagement may have moved beyond stomping and clapping and into the realm of computer-mediated technology that can help performers of all kinds, in the classroom or the concert hall.



