In December 2023, the city of Lancaster, Pennsylvania, was shaken by revelations that two local teenage boys had shared hundreds of nude images of girls in their community in a private chat on the social platform Discord. Witnesses said the photos could easily have been mistaken for real ones, but they were fake: the boys had used an artificial intelligence tool to superimpose real photos of girls’ faces onto sexually explicit images.

We know that seeing images and videos of child sexual abuse online is upsetting. It is perhaps surprising that there is not a higher proportion of images showing multiple children in the ‘self-generated’ 3–6 age group. It would be easy to assume that a child of that age would only engage in this type of activity on camera with the in-person encouragement of an older child leading the way, but shockingly this is not what we have seen.
- The group asks the operators of content-sharing sites to remove certain images on behalf of people who appear in them.
- More than 200 Australians have collectively paid more than $1.3 million to watch live-streamed child sexual abuse filmed in the Philippines.
- Technology is woven into our everyday lives, and it is necessary in many ways even for young children.
The shocking statistics were revealed on Wednesday in a report by the Australian Institute of Criminology, which says it has identified more than 2,700 financial transactions linked to 256 webcam child predators between 2006 and 2018.
Dear Concerned Adult,
It may seem like the best solution is to restrict or remove access to digital media, but this can actually increase the risk of harm. A youth may become more secretive about their digital media use and therefore may not reach out when something concerning or harmful happens. Instead, it’s crucial that children and youth have the tools and the education to navigate social media, the internet, and other digital media safely. See our guide, Keeping Children and Youth Safe Online, for tips on preparing for internet safety.

Unclear language can lead to confusion, misunderstanding or even harm, as in the case of the term ‘child pornography’. This phrase, which continues to be used today, is a perfect example of how harmful language can be.
Other measures allow people to take control even if they can’t tell anybody about their worries, provided the original images or videos still remain on a device they hold, such as a phone, computer or tablet. His job was to delete content that depicted or discussed child sexual abuse. These are very young children, supposedly in the safety of their own bedrooms, very likely unaware that the activities they are being coerced into are being recorded, saved and ultimately shared multiple times on the internet.

Below is the breakdown of the sexual activity seen in the whole sample alongside the activity in those images that showed multiple children. Most of the time these children are initially clothed, and much of what we see is a quick display of genitals. It could also be that most 3–6-year-olds are not left alone long enough for the discussion and the coercion to progress further, towards full nudity and more severe sexual activity.
To be clear, the term ‘self-generated’ does not mean that the child is instigating the creation of this sexual content themselves; rather, they are being groomed, coerced and in some cases blackmailed into engaging in sexual behaviour. In cases involving “deepfakes,” in which a real child’s photo has been digitally altered to make them appear sexually explicit, the Justice Department is bringing charges under the federal “child pornography” law. In one such case, a North Carolina child psychiatrist who used an AI application to digitally “undress” girls posing in a decades-old first-day-of-school photo shared on Facebook was convicted of federal charges last year.