Since launching in 2005, YouTube has revolutionized the way people watch videos online. Offering users endless content on virtually any topic, there’s no denying YouTube’s convenience. Now, however, researchers from Griffith University suggest spending too much time on the platform can be detrimental to mental health. A team from the Australian Institute for Suicide Research and Prevention (AISRAP) reports that frequent, habitual YouTube users exhibit higher levels of loneliness, anxiety, and depression.
Dr. Luke Balcombe and Emeritus Professor Diego De Leo, from Griffith University’s School of Applied Psychology and AISRAP, originally set out to better understand both the positive and negative mental health implications associated with the world’s busiest streaming platform. Notably, the individuals experiencing the most negative effects were young people under 29 years of age and those regularly watching content about other people’s lives.
Dr. Balcombe explains that the parasocial relationships fostered between content creators and their followers may be cause for concern, though the study did find a few instances in which these closer creator-follower relationships were neutral or even positive.
“These online ‘relationships’ can fill a gap for people who, for example, have social anxiety, however it can exacerbate their issues when they don’t engage in face-to-face interactions, which are especially important in developmental years,” Balcombe says in a university release. “We recommend individuals limit their time on YouTube and seek out other forms of social interaction to combat loneliness and promote positive mental health.”
Is there a problem with YouTube’s algorithms?
Study authors add that many parents are especially concerned about the amount of time their kids spend on YouTube. It’s common for moms and dads to say it’s tough to constantly monitor their children’s use of the platform for educational or other purposes.
During this study, the team considered spending over two hours daily on YouTube “high frequency use” and over five hours daily “saturated use.” Researchers also concluded much more needs to be done to prevent suicide-related content from being recommended to users via suggested viewing algorithms. YouTube’s algorithm suggests videos based on previous searches, potentially sending users further down a disturbing “rabbit hole.”
While users are able to report this type of content, it often goes unreported. Even when a harmful video remains online for just a few days or weeks, the sheer volume of content passing through the platform makes it nearly impossible for YouTube’s algorithms to catch all of it. If a piece of content is flagged as potentially promoting self-harm or suicide, YouTube generates a warning and asks users if they want to continue watching the video.