Molly Russell: The suicide of a British girl raises controversy over the responsibility of social networks


Molly Russell, 14, was found dead in her bedroom on the morning of November 21, 2017, in Harrow, northwest London. She had taken her own life. Her family had never detected any strange behavior in her, beyond the fact that she had spent more time in her room over the previous year. They put it down to adolescence. But when her father, Ian Russell, checked Molly’s email for any possible explanation for the tragedy, he came across a two-week-old message from Pinterest titled “Depression pins you may like.” The investigation continued and found that in the six months leading up to her death, the young woman had shared or interacted with more than 2,000 Instagram posts related to suicide, self-harm, or depression.

Five years later, Instagram and Pinterest were called to account by the British authorities. Elizabeth Lagone, head of health and wellbeing policy at Meta, Instagram’s parent company, and Judd Hoffman, global head of community operations at Pinterest, gave testimony in early October before a British court. It is the first time that two tech companies have been involved in a legal proceeding related to a user’s suicide.

“[Molly Russell] died of an act of self-harm while suffering from depression and the negative effects of online content,” said Andrew Walker, the senior coroner for the North London area. In the UK, this figure has the power to conduct independent investigations to determine the causes of people’s deaths. Walker did not record the death as a suicide: he concluded that the internet “affected her mental health [referring to Russell] negatively and contributed to her death.”

The companies do not face fines or penalties. They were not summoned to a criminal or civil trial, but to a coroner’s inquest. But the hearings have opened a debate, for the first time, over their share of responsibility in some suicides. “Our thoughts are with Molly’s family and with all the other families affected by suicide or self-harm,” Hoffman tells EL PAÍS by email. “Molly’s story has marked a profound change for us, and we will continue working to make Pinterest a safe and positive place for Pinners [Pinterest users].”

Frances Haugen pointed the way a year ago now. The former Meta employee plunged the company into its worst existential crisis with a leak of internal documents that sparked a massive press investigation by The Wall Street Journal. Among the engineer’s many revelations, one had a special impact: Instagram executives knowingly served content that was toxic to young people because it was more addictive and monetized better. So much so that, according to the leaked documents, 13% of British teenage girls and 6% of American ones who reported suicidal thoughts traced that desire back to the social network.

Data from the Pew Research Center shows that lawsuits are proliferating in the US from parents who believe social network algorithms are causing physical harm to their children. So far this year, more than 70 suits have been filed against Meta, Snap (owner of Snapchat), ByteDance (parent of TikTok), and Google, blaming teenagers’ and young adults’ addiction to social networks for causing anxiety, depression, eating disorders, or lack of sleep. According to Bloomberg Businessweek, at least seven of these suits come from parents whose children died by suicide.

Ian Russell, father of young Molly, speaks to the press after one of the inquest sessions to determine the cause of his daughter’s death, on September 30. Joshua Pratt – PA Images (PA Images via Getty Images)

Janet Majowski, whose 14-year-old daughter died by suicide, sued TikTok, Snapchat, and Meta in August, alleging that these social networks were responsible for setting the young woman on a path of no return. “They have to change what kids see, and tweak the algorithm in a way that doesn’t lead them into the dark,” she told Bloomberg Businessweek.

The lawsuits facing social networks ask them to take responsibility for the harmful effects of their products, just as the tobacco companies were made to do 30 years ago. The tech companies maintain that it is not their problem. Albert Gimeno, a spokesperson for the Padres 2.0 association, which specializes in cyberbullying, technology addiction, and digital violence, among other issues, says it is not in their business culture to fight the spread of content that could encourage suicide. “The measures they have taken and the teams they have created to get rid of malicious content not only have to deal with a huge volume of information to review, but also with other departments in the same companies pulling in the opposite direction, such as marketing, advertising, sales, or communications.”

Social networks greatly influence the lives of young people. “Adolescents with certain personality traits and emotional vulnerabilities find an environment in which they can express pain, hopelessness, and detachment outside traditional channels of communication,” explains Luis Fernando López, psychologist and psychotherapist, co-director of the ISNISS project and technical coordinator of the Let’s Talk About Suicide program at the Official College of Psychologists of Madrid. “These profiles meet on social networks because they feel accompanied on issues that matter to them, they maintain a degree of anonymity, they see that they belong to a group, and they have the security of not being judged or rejected. They start with general contacts and then move into private spaces where behaviors like self-harm or suicidal ideation begin to develop,” he describes. In Spain, the number of child suicides has tripled since 2006.

Gimeno does not expect lawsuits against social networks to spread in Spain as they have in the United States, nor does he think they would get very far, let alone solve the problem. “Parents themselves, the administration, schools, other internet users, and other technology and communications companies have a role to play as well,” he explains.

Algorithms and manual supervision

Every minute, 2.4 million snaps are shared on Snapchat, 1.7 million posts go up on Facebook, and 66,000 photos are uploaded to Instagram, according to Domo Consulting. The technological approach to sifting all this information combines automated and human means. “Pinterest’s current policy on self-harm provides a detailed list of content to be removed or limited in distribution, with more than 25,000 terms on the block list,” notes Hoffman. “When content violates our policies, we take action on it through human and automated processes. If a user searches for content related to suicide or self-harm, they see no results and are instead shown a notification directing them to experts who can help.”
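Pinterest’s actual moderation system is not public; purely as an illustration of the term-blocklist approach Hoffman describes, a search filter along these lines (all names and terms here are hypothetical) would intercept matching queries and return a help notice instead of results:

```python
# Hypothetical sketch of a term-blocklist moderation step, loosely modeled
# on the policy Hoffman describes. Pinterest's real system is not public;
# every name, term, and function here is invented for illustration.

BLOCKLIST = {"selfharm", "suicide"}  # stand-in for a ~25,000-term list
HELP_NOTICE = "If you're struggling, help is available: contact a local helpline."

def run_search(query: str) -> list:
    """Placeholder for the actual search backend."""
    return [f"result for {query!r}"]

def moderate_search(query: str) -> dict:
    """Return search results, or a help notice if the query hits the blocklist."""
    # Normalize lightly so variants like "self-harm" match "selfharm".
    tokens = query.lower().replace("-", "").split()
    if any(token in BLOCKLIST for token in tokens):
        # Matching queries get no results, only a pointer to expert help.
        return {"results": [], "notice": HELP_NOTICE}
    return {"results": run_search(query), "notice": None}
```

In practice, production systems pair this kind of exact-match list with machine-learned classifiers and human review, since a static list alone is easy to evade with misspellings.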

A teenage girl uses the social network Instagram. Baku Points (EL PAÍS)

Instagram’s response is more flexible. On the one hand, it has created parental controls over the content teenagers see, and it bans posts that promote suicide or self-harm. “We found and removed 98% of this content before it was reported to us,” says a Meta spokesperson. On the other hand, the company allows people to talk about their feelings and share content dealing with suicide, as long as it does not promote it.

This blended approach, combining automated detection of problematic material with human content moderation, dominates the industry. TikTok, for example, publishes quarterly reports on compliance with its standards. The latest, covering April to July of this year, shows that 113.8 million videos were deleted, about 1% of all videos posted. Of those, 6.1% were removed for violating policies on suicide and dangerous challenges, say sources at ByteDance, the owner of the social network.

The most easily circumvented measure, since it can be faked, but one that tech companies take very seriously, is the minimum access age. Facebook, TikTok, Instagram, Pinterest, and Snapchat do not accept users under 13; on YouTube the minimum is 14. Google blocks certain searches and displays helplines to those looking for content related to self-harm or suicide. So does TikTok, which has begun to replace Google as the search engine of choice for the youngest users.

You can follow EL PAÍS Tecnología on Facebook and Twitter, or sign up here to receive our weekly newsletter.
