Published: Friday, Jun 14th, 2019.
Almost half (46%) of teenagers aged 13-17 who use social media have seen posts that they believe should not be allowed, new research published this week by the Chartered Institute of Marketing (CIM) has revealed.
The survey of over 2,500 adults and teenagers, published ahead of the close of the Government’s consultation on online harms, shows that 95% of young people aged 13-17 have a social media account, with the most popular being YouTube (79%), followed by Instagram (73%), Snapchat (66%) and Facebook (45%).
Despite many children coming across potentially harmful posts on social media platforms, very few are doing anything about them. Almost two thirds (62%) of teenagers who have seen content they think should not be allowed say they rarely or never report these posts; only 7% say they always do.
Seeing these posts does seem to be discouraging some children from engaging on social media; close to half (44%) agree that they would be put off from engaging in discussion and conversations online. But very few are prepared to give up their accounts; two in three (66%) said that seeing posts on social media that should not be allowed would not make them want to delete their account, while more than half (52%) said it would not put them off signing up for an account in the first place.
The survey put similar questions to adults and found that almost half (44%) of those who had seen harmful content on social media say they rarely or never report it, while only one in five (20%) say they always do.
Who is responsible?
When it comes to who should be protecting children under the age of 18 from harmful or inappropriate content on social media, the public place responsibility on parents and social media companies.
Three quarters of people over 18 say it is the responsibility of parents/guardians (76%) and social media companies (74%) to protect children on social media.
However, most people believe strongly that social media companies should be removing harmful content from social media.
– Who’s responsible? Eight in ten (83%) said that social media companies have a responsibility to monitor for harmful content on social media. Many people also felt there was a role for individuals themselves (57%) and for government (49%).
– Who pays? When it came to paying to deal with harmful content on social media, the vast majority of the public felt this was the responsibility of social media companies. 67% of adults said the cost of monitoring and regulating harmful content on social media should be borne by the social media companies themselves, compared with only 14% who said government was responsible.
Revenue from marketing and advertising is the main source of income for most social media companies, and the Chartered Institute of Marketing believes more must be done to protect users on social media if UK businesses are to continue to spend their marketing budgets reaching customers through social media platforms.
Chair of the Chartered Institute of Marketing, Leigh Hopwood, said:
“Social media is a crucial marketing channel, but professional marketers need to be confident that enough is being done to protect the users of these platforms, especially children. With regulation of social media under review we felt it was important to seek out the views and experiences of the users themselves.
It is alarming that so many children have seen inappropriate posts on social media and failed to report them. Moreover, while more adults do report harmful content, it is concerning that only one in five always do so.
Our research shows that we could make a huge difference quickly if we all take the simple action of hitting the report button when we see something that shouldn’t be on social media. When the new regulations take effect, social media companies will have a legal responsibility to act once we have reported it.
We are calling for a public education campaign to show people, especially children, how to report harmful content and to highlight the importance of reporting it whenever you see it. We don’t believe we should wait for the regulations; this is something that can happen now.”
The research also demonstrates the prevalence and impact of harmful content being seen by adults on social media:
– Harmful content: Three in ten (29%) adults said that, in the last six months, they had seen content that could be damaging if seen by children, could encourage illegal activity, or could be considered abusive or offensive. Only one in five (21%) said that they had not seen harmful content, while a third (32%) were not sure or couldn’t recall.
– Who’s seeing it? Younger adults are much more likely to recall seeing harmful content than older generations: 46% of 18-24 year olds say they had seen it in the last six months, compared with only 16% of those aged 55 and over. Those most active on social media are also the most likely to have seen harmful content; among those who are active on all three of the most popular platforms — Facebook, Instagram and Twitter — 44% say they had seen harmful content in the past six months.
– Stifling debate: Three quarters of people who use social media (74%) say that the presence of abusive or offensive content can put them off engaging in discussions on social media, while more than half (52%) agree that it would make them consider deleting their account.