Sexting is the sharing of inappropriate or explicit images online or through mobile phones. It is an increasingly common activity among children and young people, and has been found to be more commonplace than you may think.
Most young people do not see ‘sexting’ as a problem, and many are afraid to talk to adults about it for fear of being judged or having their phones taken away.

The NSPCC website highlights some of the dangers associated with sexting:

  • No control over the image and how it is shared
  • It’s easy to send a photo or message, but the sender has no control over how it is passed on
  • When images are stored or shared online they become public. They can be deleted from social media, or may only last a few seconds on apps like Snapchat, but images can still be saved or copied by others
  • These images may never be completely removed and could resurface in the future, for example when the young person applies for jobs or university

Young people may think ‘sexting’ is harmless but it can leave them vulnerable to:

  • Blackmail

An offender may threaten to share the pictures with the child’s family and friends unless the child sends money or more images

  • Bullying

If images are shared with their peers or in school, the child may be bullied

  • Unwanted attention

Images posted online can attract the attention of sex offenders, who know how to search for, collect and modify images

  • Emotional distress

Children can feel embarrassed and humiliated. Severe distress can lead to self-harm or suicide.


It may be common, but ‘sexting’ is illegal. By sending an explicit image, a young person is producing and distributing child abuse images and risks being prosecuted, even if the picture is taken and shared with permission.

Make use of the CEOP websites below, where a great deal of information and advice is available for everyone.

Reporting youth-produced sexual imagery online

The quickest way to get content removed from the internet is for the person who posted it to take it down. If the young person posted the content themselves using their account, they should be asked to log in and delete it.
If someone else posted the image or re-posted it, they should be asked to log in and delete it from any sites they’ve shared it on.
If the school knows where the content is hosted but doesn’t know who posted it, or the poster refuses to take it down, the content can still be reported to the online service. If it breaches the site’s Terms of Service it will be removed.

Each provider takes a different approach to removal requests and will respond at a different speed. More information can be found on individual providers’ websites, where their Terms of Service and reporting process should be published. Most of the main providers do not allow nudity or sexual content, and sexual imagery of young people is illegal and should not be hosted by any provider.

The following provides an overview of the reporting functions provided by the main service providers:


Snapchat offers users the ability to share images and videos, which it calls ‘snaps’. A snap is shared and then disappears after a few seconds. Snapchat also allows users to share Snapchat Stories: snaps that are shared in a sequence across a 24-hour period.
Snapchat provides a reporting function here:
Users are able to block other users.


WhatsApp is a messaging service where users can share pictures, text or videos. These can be shared with one person or multiple users.
WhatsApp encourages users to report problematic content; however, it advises that the contents of messages are generally not available to it, which can limit its ability to verify a report and take action.
Please see instructions on how to report here:
Users are able to block other users here:


Instagram is a picture and video sharing app which allows users to share images, make comments and post messages.
Instagram provides a reporting function here:
Users are able to block other users.


Facebook is a social network which allows users to create a profile, share images, videos and messages.
Facebook provides a reporting function here:

Social reporting –
This lets users contact another user directly to ask them to take down something that does not necessarily breach Facebook’s terms of service. A young person who does not feel comfortable contacting the person directly can use the report flow to ask a trusted person, such as a teacher, friend or parent, to help them.
Public reporting –
Users who do not have a Facebook account are able to report directly to Facebook using the link above and completing the form.
Users are able to block other users.


YouTube allows users to watch, create and share videos. Users can create their own YouTube account, make playlists and create their own channel. Users are also able to comment on other users’ channels.
YouTube provides a reporting function here:
Users can report an individual video, a channel or a comment on a video. Only account holders can make reports on YouTube.


The “right to be forgotten” ruling allows the public to request, on a country-by-country basis, the removal of search results that they feel link to outdated or irrelevant information about themselves. Users complete a form identifying the content they wish to have removed; they must specify why the content relates to them and why it is unlawful, so the exact URLs of the search results need to be referenced. See

A list of many other providers and links to their reporting functions can be found at the NSPCC’s NetAware website:

