After a 16-year-old from Kentucky committed suicide earlier this year, his parents discovered he had been receiving threatening messages demanding $3,000 in exchange for not distributing a fake, AI-generated nude image of him.
The tragedy highlights the rapidly growing global problem of sextortion targeting children and young people. At its core are so-called “nudify” apps – AI tools that digitally remove clothing from photos or generate sexualized images of real people.
The victim, Elijah Heacock, 16, is just one of thousands of American children who have fallen prey to such crimes. His parents told US media he had been threatened that the image would be sent to his family and friends if he did not pay.
“The people who are stalking our children are organized, well-funded and ruthless,” the boy’s father, John Burnett, told CBS News.
“They don’t need real photos — they can generate anything they want and then use it to blackmail the child,” he said.
A federal investigation into the case is ongoing. It comes amid the growing use of “nudify” apps, which were originally used to fake photos of celebrities but are now increasingly being turned against minors.
The FBI has warned of a “horrifying rise” in sextortion cases, with the most common victims being boys aged 14 to 17. The agency says the crimes have already led to an alarming number of suicides in the United States. | BGNES