In a world where interaction with people from across the globe is easily accessible through a single app, it is fair to say that social media has become an integral part of people’s lives (4.48 billion lives, to be precise).
With that in mind, a growing topic of discussion in the last few years has been the impact that social media has on people’s mental health and wellbeing, especially the role that these apps play in protecting and safeguarding their teen and young adult audience.
In fact, a wellbeing study from 2021 showed that two thirds of UK adults with a social media account believe social media companies have a duty to protect the mental health of their users. However, 53% of people surveyed felt that social media platforms are not doing enough to safeguard their users’ mental wellbeing.
Whilst no social media platform is perfect, it is important for these companies to take mental health awareness seriously and to push for safer, more inclusive and more positive online spaces.
At Reward, we believe in promoting and supporting our team’s wellbeing and mental health. That is why we use our platform to promote positive conversation around mental health, as well as support our team members who need days off for their mental wellbeing.
With the increasing demand for better mental health resources and guidelines, various social media apps have begun to implement features and initiatives that protect their users.
Emotional Health Support Guides
As the Covid-19 pandemic escalated in 2020, Facebook (whose parent company has since rebranded as Meta) added resources from mental health experts to its Emotional Health Resource Centre to provide users with emotional and mental health support. These guides and resources, developed with the World Health Organisation, address mental health topics such as stress, anxiety, depression, grief and loss, support that was needed more than ever during the mental health crisis brought on by the pandemic.
Facebook Watch Show
First airing on 14th December 2020, Peace of Mind with Taraji is an original Facebook Watch show featuring Empire and Hidden Figures actress Taraji P. Henson. The show, which has accumulated 11.8 million followers, follows Taraji and her best friend Tracie as they hold open discussions about mental health issues, sharing personal stories from the BIPOC community. Each episode focuses on a different mental health topic and features celebrity guests, therapists, doctors and members of the public. Having wrapped its second season in December 2021, Peace of Mind with Taraji has showcased the support and tools available to help people cope with and work through their own mental health struggles, reminding viewers that they are not alone.
Whilst WhatsApp differs from Facebook as a platform, working more like Facebook Messenger, it offers features that support its users through chatbots. Partnering with UNICEF in 2021, WhatsApp launched the Global Mental Health chatbot, which offers tips on starting a conversation with someone you’re concerned about, breaking down stigma and helping people communicate their thoughts and feelings. Similarly, the WHO’s Health Alert chatbot (also launched on Messenger) provides mental health and wellbeing resources, such as exercises for coping with anxiety and stress, to users worldwide.
Alongside this, WhatsApp has partnered with localised services such as Connection Coalition UK, whose Loneliness Advice chatbot helped mitigate the impact of isolation during the pandemic and supports people in building strong relationships and connections for the future.
Mental Health and Wellbeing Instagram Guides
Initially created in May 2020 for health and wellbeing advocates to support people struggling during the pandemic, Instagram Guides give people an easy way to access information, tips and recommendations through a curated flow of posts. The feature allows users to produce Guides containing positive, supportive content that teaches people how to practise self-care and mindfulness exercises. It was launched in partnership with mental health and wellbeing organisations such as UNICEF, Heads Together and AFSP, but has since been expanded so that content creators on the platform can access it as well.
Eating Disorders and Body Image Resources
Instagram has always been a highly visual platform, which has prompted much criticism regarding the mental health of young people, particularly around body image issues for teenage girls. Whilst the app will never be able to fully resolve these issues, Instagram announced in 2021 that they would be introducing new ways to support people affected by eating disorders and negative body image. Advised by health experts, Instagram have created dedicated resources for coping with eating disorders and body dissatisfaction. These resources appear before the search results when people search the app for terms related to eating disorders and body image issues. Alongside the expert-backed resources, Instagram provides local and national helplines, as well as self-help information, which appear upon search or when someone attempts to share content on these topics.
Stricter Penalties and Bullying Regulations
Cyberbullying on the popular social media platform has been an issue for many years now; however, Instagram has made it clear that they take a zero-tolerance approach to online bullying. In 2021, they announced that stricter penalties would apply to users sending abusive Instagram comments and Direct Messages (DMs). When a comment or DM violates Instagram’s rules, the sender is prohibited from sending messages for a set period of time and, after repeated offences, will have their account disabled. Instagram have also said that they will work alongside UK law enforcement to respond to valid legal requests for information in hate speech related cases.
Users on the platform can also take control of their Instagram comments by filtering out offensive words, phrases or emojis that they do not want to see. This feature was extended to Direct Message requests in April 2021. There is also the option to switch off DMs from people who do not follow you, eliminating hateful private messages from people you don’t know.
Earlier in 2021, Instagram announced that they were developing a new version of the app, called Instagram Kids, for children under the age of 13. However, these plans were soon put on hold amid concerns about young people’s mental health. Whilst the app could have opened up the market to a wider audience, Instagram decided it would be best to invest further research and time into understanding the concerns of parents, experts and regulators. By taking the time to understand the implications such an app could have for younger teens online, Instagram are underlining the importance of child safety and mental health, to ensure that Instagram Kids would not be detrimental to the health and wellbeing of children.
In-App Resources and Support
TikTok has become one of the most popular social media apps since its launch in 2016, with 1 billion active users worldwide. The app is especially popular with young people: around 63% of its users are Gen Z teens and adults.
With that in mind, TikTok has been particularly active in implementing mental health initiatives and resources for its users. In 2021 alone, TikTok launched new in-app support resources for people struggling with eating disorders and body image, providing access to help from expert organisations through the app. Users can access the helpline of Beat, a UK-based eating disorder charity, to find support, information and help with treatment options, as well as expert advice on supporting yourself or a friend who is struggling.
To further support people affected by these issues all year round, TikTok announced that they would be introducing permanent public service announcements on eating disorder and recovery related hashtags (such as #whatieatinaday and #bingerecovery) to drive awareness and support to those affected and recovering from eating disorders.
Empowering Creator Control over Comments
Similar to Instagram’s filter feature, creators on TikTok can apply comment filters to their content: they can filter out spam and offensive comments, as well as comments containing specific keywords. There is also a comment management feature which allows TikTok creators to choose whether or not to approve a comment for display on their post.
Similarly, people who wish to comment on a post will be prompted to reconsider their comment if TikTok deems it inappropriate or unkind; the prompt reminds users of the Community Guidelines and enables them to edit their comment before sharing.
TikTok Wellbeing Hub
In April 2021, TikTok announced the launch of their Wellbeing Content Hub (to US, Australian and New Zealand users), which offers a centralised destination for tools, expert advice and content that supports holistic wellbeing. Launched to coincide with Mental Health Awareness Month, the hub showcases content from TikTokers who focus on the mind, body and wellbeing, enabling users to openly discuss and find support on emotional and physical wellbeing.
It is very easy to get lost for hours in the endless scroll of the TikTok ‘For You’ page (I know I have gone down that rabbit hole many times), which is why, in April 2019, TikTok introduced a Screen Time Management feature as part of their commitment to looking after their users’ wellbeing.
This feature, a screen time prompt that appears in a user’s feed, reminds users how much time they have spent scrolling through the app, acting as a cue to take a break. Social media usage and addiction have risen over the past few years, which research suggests has contributed to the rise in anxiety and depression in young adults. By taking time out from the app, people can give their brains a rest and focus on their wellbeing and their life outside of a screen.
‘Here For You’ Campaign
Snapchat launched its ‘Here For You’ campaign early in 2020, and went on to become a founding partner of the first-ever Mental Health Action Day, created to turn awareness into action by providing support and advice for Snapchatters on their mental health. Supported by hundreds of brands, charities and government agencies, the campaign encourages and empowers people to take action for themselves or for someone they know who is struggling with their mental health.
The Here For You feature showcases resources from local experts whenever a user searches for a topic related to mental health, such as anxiety, depression, eating disorders, stress, grief or bullying.
Snapchat’s partnership with the popular meditation and wellbeing app Headspace became available to users in 2020, designed to provide a safe space for the app’s users to practise meditation and mindfulness exercises. Headspace Mini was created to help users who may be struggling with depression, anxiety and other mental health issues. Users can access in-app meditation videos that help them deal with stress and anxiety, as well as check in with friends and family and take part in the “classes” together.
In-App Education Portal
To provide support and advice to people struggling with substance abuse, Snapchat has created an in-app portal called ‘Heads Up’. The portal distributes content from expert organisations to users, surfacing whenever someone searches Snapchat for drug-related keywords. In partnership with organisations such as Song for Charlie and Shatterproof, Heads Up provides educational videos and resources that raise awareness of the dangers of drugs, along with advice and helpline links that people can reach out to for support.
Mental Health Filters and Lenses
Snapchat’s claim to fame is its introduction of Filters and Lenses, features that Snapchatters keep coming back for. With that in mind, Snapchat has introduced new initiatives that make use of this in-app feature, partnering with organisations to start a discussion about mental health.
Partnering with AdCouncil and The Boris L. Henson Foundation, Snapchat have created Filters and Lenses that provide unique conversation starters for people to talk about mental health, as well as addressing mental health risks for BIPOC youth.
Compared to some of the other major social media platforms, Twitter is less active when it comes to mental health initiatives, despite being seen by many as the most toxic of them all. That said, in February 2021 Twitter announced new steps to combat racism on its platform. Using automated tools, Twitter detects and removes racist and abusive tweets, as well as acting on Tweets reported by users and suspending offending accounts.
Alongside this, Twitter is testing a new feature that will temporarily autoblock accounts using harmful language, as well as triggering reply prompts on potentially harmful Tweets to encourage users to reconsider the language they have used.
During Mental Health Awareness Month in the US, Pinterest announced a partnership with #HalfTheStory, a non-profit organisation that works to empower the next generation’s relationship with social media through education, advocacy and resources, with the goal of evolving digital wellbeing to connect people on a deeper level. As part of the partnership, Pinterest donated ad credits on its platform to encourage users to discover #HalfTheStory. Through this collaboration, Pinterest was able to encourage Pinners to learn more about the organisation and provide resources that empower healthy tech usage and safe advocacy online.
Introducing ‘Compassionate Search’
Initially launched in the US in 2019, Pinterest’s ‘Compassionate Search’ responds to mental health and emotion-related search terms with expert-backed wellbeing practices, resources and support. These guides and activities are designed to help people who are feeling stressed, anxious or sad, or who are struggling to manage difficult emotions. It is not designed to replace professional care, but to help someone who is looking for support and doesn’t know where to start. The feature also provides direct access to suicide prevention lifelines such as the Samaritans.
The Future of Mental Health and Social Media Apps
It’s positive that many of the social media goliaths of this world are taking steps to support their users’ mental health, encouraging people to focus on their wellbeing and to start conversations when help is needed. But that isn’t to say the job is finished. The creators of social media apps have a duty to protect their communities, constantly evolving their guidelines, support and in-app features to create sustainable platforms that protect their users and creators.