AI is recreating the voices of mass shooting victims

The advancement of artificial intelligence has long been controversial, but now one group is trying to use the technology to effect significant change. In an effort to push Congress toward gun control legislation, the parents of gun violence victims are using AI to send voice messages from their deceased children to lawmakers.

The project, called The Shotline, is a collaborative effort by six families who lost a child to gun violence or a mass shooting. At the helm is a marketing firm working with Manuel Oliver, whose 17-year-old son Joaquin Oliver was killed during the Marjory Stoneman Douglas High School shooting in Parkland, Florida, in 2018. Also involved in the project is March for Our Lives, a high-profile gun control advocacy group created in the aftermath of the Parkland shooting.

The Shotline’s website went live on Feb. 14, exactly six years after the Parkland shooting. While the use of artificial intelligence has generated controversy, the parents behind The Shotline are convinced that the project will be a force for positive change. What does The Shotline entail, and what has been the outside reaction?

What is The Shotline?

It is a program that uses AI to “[recreate] the voices of those shot and killed by guns so they can call our representatives in hopes of changing our country’s gun laws,” according to The Shotline website. Using AI, these gun violence victims “have a chance to speak again.”

The website features AI-generated audio excerpts from six people: the aforementioned Joaquin Oliver; Akilah DaSilva, who died at the age of 23 during a shooting at a Waffle House near Nashville in 2018; Uzi Garcia, who was shot and killed at the age of 10 during the Uvalde, Texas, school shooting in 2022; Mike Vaughan, who died by a self-inflicted gunshot wound at the age of 30 in 2014; Jaycee Kemachet-Webster, who was shot and killed in Montgomery County, Maryland, in 2017 at the age of 20; and Ethan Song, who died in 2018 at the age of 15 after accidentally shooting himself with an unsecured gun.


Each of the recordings features the AI-generated voice of a shooting victim talking about their life and the circumstances of their death, while asking Congress to pass gun legislation. Visitors to the website can enter their ZIP code and send any of the recordings to their local representative.

What is The Shotline hoping to accomplish?

The main goal of the project is to influence Congress to pass gun control legislation. The point is to "interrupt people's regularly scheduled programming as a movement to get their attention," David Hogg, a Parkland survivor and co-founder of March for Our Lives, said to NPR.

“We have to use all the tools that we can at our disposal in an ethical way, of course, to get their attention in the first place. And if that means using AI to simulate the voices of people that have been stolen by gun violence, then so be it,” Hogg said to NPR.

Deciding to recreate the victims' voices was a "heartbreaking thing for us to do," Ethan Song's father, Mike Song, said to The Washington Post. "But I think this is the kind of thing that wakes people up." And while many people have criticized the use of AI-generated voices, particularly in the political space, parents involved with The Shotline say that this is a necessary development of the technology.

“Kids shouldn’t be shot in schools. So I don’t think that what I’m doing here is worse than what happened to Joaquin,” Manuel Oliver said to the Post. “If you feel uncomfortable with this, well, lucky you, because I feel uncomfortable for other major things that have happened in my life.” 


What has the outside reaction been?

Most experts seem to agree on the merits of the project, even while noting caveats. The Shotline is "one of the least nefarious uses of voice-cloning technology I've heard of yet," though "[t]here is a forest of ethical concerns to navigate there," Aram Sinnreich, a communications studies professor at American University, said to the Post.

Others mostly concurred. While “I’m not saying this [initiative] isn’t complicated and we should talk and have a serious conversation about the ethics of it,” The Shotline is “not a negative use case,” Hany Farid, a UC Berkeley professor of digital forensics, said to NPR. The technology works “as long as there is disclosure about it, as long as they’re not trying to be deceptive, which they clearly are not,” Farid said.
