Former CNN anchor Jim Acosta faced widespread backlash for conducting what he called a “one of a kind interview” with an AI avatar of Joaquin Oliver, a 17-year-old victim of the 2018 Parkland school shooting. The controversial segment, which aired on Monday and was created by Oliver’s parents to send a “powerful message on gun violence,” instead sparked outrage over its tone-deaf use of AI technology to recreate a deceased shooting victim.
What happened: Acosta interviewed an AI recreation of Oliver, one of 17 people killed at Marjory Stoneman Douglas High School in Parkland, Florida, asking the avatar what had happened to him.
The backlash: Critics immediately condemned the interview as insensitive and ethically questionable, with many pointing out that living survivors could have shared authentic firsthand accounts instead.
Broader AI concerns: The incident reflects growing public wariness about inappropriate uses of AI technology, particularly in sensitive contexts involving death and grief.
Precedent: This isn’t the first time AI has been used to recreate Parkland victims or other deceased individuals for advocacy purposes.
What Acosta said: Despite the criticism, the former CNN anchor defended the interview, telling Oliver’s father: “I really felt like I was speaking with Joaquin. It’s just a beautiful thing.”
Why this matters: The controversy highlights the ethical challenges surrounding AI’s use in sensitive contexts, particularly when dealing with tragedy and grief, as gun violence remains a leading cause of death for children and teens in the United States.