Children searching for depression and self-harm content can be exposed to more of it through recommendation engines built into social networks.
Sophie Parkinson was only thirteen years old when she took her own life. She had been struggling with depression and suicidal thoughts, and her mother, Ruth Moss, believes the videos Sophie watched online contributed to her death.
Like many young people, Sophie got a phone when she was 12 years old.
Ruth remembers discovering shortly afterwards that Sophie was using it to view inappropriate material online.
Sophie Parkinson and her mother, Ruth Moss
"The really hard part for the family after Sophie's death was finding some really difficult pictures, and clues about how she had planned to take her own life," she says.
Almost 90% of children aged 12 to 15 have a mobile phone, according to Ofcom, the communications watchdog. An estimated three-quarters of them have social media accounts.
The most popular apps restrict access to people under the age of 13, but many young children sign up and the platforms do little to stop them.
The National Society for the Prevention of Cruelty to Children (NSPCC) believes that tech companies should be compelled by law to consider the risks their products pose to children.
"For more than a decade, child safety has not been considered part of core business models by major tech companies," says Andy Burroughs, head of online child safety policy at the charity.
"Site designs can push vulnerable teenagers who look up suicide or self-harm material toward ever more of this type of content."
Recognizes and removes
Recently, a video clip of a young man committing suicide was posted on Facebook.
The footage then spread to other platforms, including TikTok, where it stayed online for days.
TikTok acknowledged that users would be better protected if social media providers worked together more closely.
Ruth echoes the NSPCC's view and believes social networks should not be allowed to police themselves.
She says that some of the material her daughter found six years ago is still online, and that typing certain words into Facebook or Instagram still brings up the same pictures.
Facebook announced an expansion of an automated tool to recognize and remove self-harm and suicide content from Instagram earlier this week, but said data privacy laws in Europe limit what it can do.
Smaller startups are also trying to use technology to address the problem.
SafeToWatch is developing software, trained with machine learning techniques, that blocks inappropriate scenes, including violence and nudity, in real time. It analyzes the context of any visual material and also monitors the audio.
Founder Richard Bursey says this gives parents a balanced way to protect their children without intruding deeply on their privacy. "We never allow parents to see what a child is doing, because we need to keep the child's trust, which is critical to the online-safety process," he explains.
'Frank talks'
It is all too easy to blame parents, Ruth suggests, adding that safety technology only helps in limited circumstances as children become more independent.
"Most parents can't keep track of exactly what's happening on their teenager's phone and monitor everything they see," she says.
Many experts agree that it is inevitable that most children will encounter inappropriate content at some point, so they need to acquire 'digital resilience'.
"Internet safety should be taught the same way as the other skills that keep us safe in the physical world," explains Dr. Linda Papadopoulos, a psychologist who works with the online safety nonprofit Internet Matters.
"Parents should have candid conversations about the types of content children may encounter on the Internet and teach them ways to protect themselves."
She says the average age at which children are first exposed to pornography is 11. When this happens, she advises parents to discuss the issues involved rather than confiscate the device used to view it.