- To mark Mental Health Awareness Week and World Mental Health Day, Team Vitality and Bodyguard have released a report investigating online hate and toxicity within esports.
- The findings highlight the urgent need to provide a safe, inclusive, and positive online environment for players, fans, and professionals.
- This initiative is part of the KARE program launched by Team Vitality in 2023. Supported since its launch by its partner EVNIA (Philips’ gaming monitor brand), KARE focuses on three pillars: awareness, prevention, and action, to make mental health a central priority in esports and gaming.
Key figures (August 1 – October 6, 2025)
For the first time in France, an esports club is taking a stand against online hate and toxicity. Team Vitality, a leader on both the French and international scenes, and its partner EVNIA have joined forces with Bodyguard, a technology solution that lets brands and platforms moderate text, comments, images, and videos across their online accounts in real time.
This initiative is part of the KARE program, launched by Team Vitality in 2023. The shared goal is to protect the mental health of those involved in esports, support their performance, and provide the community with a safer environment. Together, the three organisations are fostering a healthy and positive environment within the esports industry.
Tens of thousands of hateful messages moderated
As Team Vitality regularly competes in the world's biggest tournaments, from the Esports World Cup to Rocket League Worlds, their players and talents are directly exposed to online hate.
Since August 1, 19 official club accounts have been protected by Bodyguard. Thanks to its hybrid technology, which combines AI with human expertise, Bodyguard detects, analyses, and removes toxic content in real time, based on custom moderation rules set by Team Vitality. As a result, over 2,000 hateful messages have already been blocked, mainly on Team Vitality's X and Instagram accounts, where toxicity rates reach 4.6% and 2.5% respectively.
The roughly 4.5 million followers of these 19 accounts have benefited from this protection since August 1. The messages in question are far from harmless: they include violent attacks related to performance, calls to exclude certain players, racist, homophobic, fatphobic, or religion-based insults, as well as threats targeting players’ families. Some users post toxic messages repeatedly, and such messages can make up as much as 80% of their comments. Left unchecked, this kind of online abuse can have serious consequences for a player’s mental health and performance.
“In esports, just like in traditional sports, online hate can destroy careers. With Bodyguard, we protect both the players and the passion of millions of fans,” explains Charles Cohen, Founder and CEO of Bodyguard.
“In esports, the pressure doesn’t stop when the match ends. It often continues on social media, where every decision and action is scrutinised. After a win, messages can be friendly and supportive, but after a loss, they often become a heavy burden. For a coach, as well as for the players, this constant pressure eventually takes its toll. It’s essential to protect our teams and stay on the right path, even when the online storm feels more intense than the match itself,” says James "Mac" MacCormack, League of Legends coach at Team Vitality.
Toxicity under control, but still worrying
In sport as in esports, the challenge remains the same: insults, hate speech, and online harassment can have devastating effects on an athlete, impacting their confidence, morale, and performance. For Team Vitality, 3.6% of messages are hateful, slightly below the esports average of 4.2%, showing that the measures put in place, including the partnership with Bodyguard, awareness initiatives, and engagement with the community, are beginning to pay off. By comparison, football is less exposed to toxicity (3%) but much more affected by unwanted content (3.5%, versus 1% for esports).
"Social media is everywhere, and in esports, its influence has become immense. For a player, or even a staff member, positive messages can be a source of comfort - but when they’re not, they can hurt, corrode, and sometimes even gradually destroy someone. This constant exposure pushes people to question and doubt themselves, and not always in a healthy way. It's essential to find a balance and set boundaries. In this fight, Bodyguard is a true ally." says Fabien Devide, known as “Neo”, President and co-founder of Team Vitality.
Creating a more respectful and accessible environment for everyone
By its very nature, esports thrives on social media and streaming - spaces where passionate communities react and share their emotions in real time. While this constant exposure fuels fans' enthusiasm and attachment, it also comes with above-average levels of toxicity, highlighting the urgent need to protect esports players and safeguard their mental well-being.
The joint Bodyguard x Team Vitality report clearly illustrates this dual reality. On the one hand, 10% of messages received by Team Vitality are positive, a figure in line with industry standards; esports fans particularly enjoy video content, vlogs, interviews, and behind-the-scenes footage, reflecting real loyalty to and solidarity with their club. However, this encouraging signal contrasts with traditional sports: in football, nearly one in four messages is positive (24%), whereas esports tops out at around 18%.
This gap highlights a specific challenge: esports fans express their enthusiasm as much through criticism and toxic comments as through passion, making moderation essential. While esports remains more exposed to online hate than traditional sports, Team Vitality, EVNIA, and Bodyguard show that with innovative solutions, it is possible to reduce toxicity and protect the community.
A pioneer in its approach, Team Vitality is raising awareness of the need to mitigate this online harassment. With Bodyguard, the club is paving the way for an esports environment where performance, inclusion, and freedom of expression coexist. This strong commitment is reflected in concrete tools to protect the community, players, and staff from the excesses of online hate.