THE SUN
“WHAT does your room look like?” seemed an innocent question to teenager Thomas… but within minutes of his reply he was being ‘sextorted’ and feared his life was over.
The 14-year-old, whose name we have changed, had been talking to a ‘girl’ online and sent pictures showing his surroundings – fatefully, they included his face.
Moments later, those images had been twisted into a deepfake child sexual abuse video generated by Artificial Intelligence (AI) – and the stranger was threatening to make it public.
In a harrowing plea for help to Childline, Thomas wrote: “Now they’re demanding money from me and they said if I don’t pay my life will be over!
“I know it’s not me in the video but it looks so real! I’m worried what will happen if my friends find out. I can’t believe I got myself in this situation, I’m so scared. I don’t know what to do.”
This is just one of a number of terrifying tricks used by paedophiles and their enablers to ‘sextort’ children and teenagers online – and tech is at the heart of it.
The Internet Watch Foundation (IWF) found 2,562 AI-generated images of child sexual abuse material (CSAM) on just one dark web forum in a single month last year… some of the content showed kids as young as two.
Among the twisted creators is Hugh Nelson, from Bolton, who was locked up for 18 years on Monday after he turned ordinary images of children into AI-generated explicit material.
The 27-year-old sicko charged paedophiles £80 for a new “character” depicting a real-life kid – and would then flog them images of the child in different repulsive positions for £10 apiece.
The harrowing case comes as The Sun today reveals that vile paedophiles are swapping and selling images of kids “like trading cards” and try to “collect every single abuse image of a child that exists”.