Microsoft is doing more to tackle toxicity in multiplayer Xbox games. The company is introducing a feature that allows Xbox Series X/S and Xbox One players to capture a 60-second video clip of abusive or inappropriate voice chat and submit it for moderators to review.
“This feature is purpose-built to support the broadest arena of in-game interactions between players and works across thousands of games that offer in-game multiplayer voice chat, including Xbox 360 backward-compatible titles,” Xbox Player Services corporate vice president Dave McCarthy wrote in a blog post.
Microsoft designed the tool to be easy to use and to minimize its impact on gameplay. When you capture a clip for reporting, it will stay on your Xbox for “24 online hours.” You’ll have the option to submit it right away or wait until you finish your gaming session, and you’ll get a reminder before the 24 hours are up. If you opt not to report the clip, your Xbox will automatically delete it.
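As a rough sketch of how such a retention rule could work, the Python below models a pending clip whose 24-hour clock only advances while the console is online. The class, function names and the 23-hour reminder threshold are illustrative assumptions, not Microsoft’s implementation.

```python
from dataclasses import dataclass

ONLINE_HOURS_LIMIT = 24.0   # per Microsoft: clips persist for "24 online hours"
REMINDER_THRESHOLD = 23.0   # assumed: nudge the player an hour before expiry

@dataclass
class PendingClip:
    """A captured voice chat clip awaiting the player's report decision."""
    clip_id: str
    online_hours_elapsed: float = 0.0
    reminded: bool = False

def tick(clip: PendingClip, hours_online: float) -> str:
    """Advance the clip's clock by time the console spent online.

    Offline time doesn't count toward the window, which is how we read
    the "24 online hours" wording.
    """
    clip.online_hours_elapsed += hours_online
    if clip.online_hours_elapsed >= ONLINE_HOURS_LIMIT:
        return "deleted"        # never submitted, so it's removed automatically
    if clip.online_hours_elapsed >= REMINDER_THRESHOLD and not clip.reminded:
        clip.reminded = True
        return "reminder_sent"  # remind the player before the window closes
    return "pending"

clip = PendingClip("clip-001")
print(tick(clip, 10.0))   # pending
print(tick(clip, 13.5))   # reminder_sent (23.5 online hours elapsed)
print(tick(clip, 1.0))    # deleted (24.5 exceeds the limit)
```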
No one else will have access to the clip unless you submit it. “Xbox is not saving or uploading any voice clips without you, the player, choosing to start the reporting process,” McCarthy said. Clips you capture through the tool won’t appear in your recent captures, and you won’t be able to download, share or modify them; they’ll be used only for moderation purposes. After the safety team has reviewed your report, you’ll receive a notification telling you whether it took action against the reported player.
An Xbox spokesperson told Engadget the safety team will use “a range of moderation tools that leverage AI and human moderators” to analyze the clips. Moderators will review audio and video to determine if someone has violated the community standards.
At the outset, the reactive voice reporting system allows a player to report up to three individuals at once. “If a moderator cannot determine who was talking at a given time and associate it with the reported Xbox Live player, the report will be closed as unactionable, no enforcement action will be taken and the captured video will be deleted within 90 days,” the spokesperson said.
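A minimal sketch of that triage rule might look like the following. The function and field names, and the assumption that speaker attribution arrives as a set of gamertags, are hypothetical; only the three-player cap and the 90-day deletion window come from the spokesperson’s description.

```python
from datetime import date, timedelta

MAX_REPORTED_PLAYERS = 3   # initial cap on players named per report
RETENTION_DAYS = 90        # unactionable clips are deleted within 90 days

def triage_report(reported_gamertags: list[str],
                  attributed_speakers: set[str]) -> dict:
    """Close a report as unactionable when none of the reported players
    can be matched to a voice in the clip; otherwise escalate it."""
    if len(reported_gamertags) > MAX_REPORTED_PLAYERS:
        raise ValueError("a report may name at most three players")
    actionable = [tag for tag in reported_gamertags if tag in attributed_speakers]
    if not actionable:
        return {"status": "closed_unactionable",
                "delete_clip_by": date.today() + timedelta(days=RETENTION_DAYS)}
    return {"status": "escalated", "players": actionable}

# No speaker could be attributed, so the report is closed.
print(triage_report(["PlayerA", "PlayerB"], attributed_speakers=set()))
```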
The advent of cross-platform play muddies the waters here too. The safety team won’t take action over inappropriate voice chat from players on other platforms. “The announced reactive voice moderation feature is specifically for reporting Xbox players to the Xbox Safety Team,” the spokesperson noted.
It’s encouraging to see Xbox tackle the problem of toxic voice chat at a platform-wide level. The PlayStation 5 has had a similar feature since it debuted in 2020.
Several studios have adopted similar approaches in their own games. In 2021, Riot said it would record Valorant voice communications but only listen to them when a report was filed. It started testing the system last July.
Ahead of Overwatch 2 going live last October, Blizzard said it would automatically transcribe a recording of a match’s voice chat following a report from a player. The company’s chat review tools analyze the transcription for signs of abusive behavior, after which both the recording and the text file are deleted. (Of note, Blizzard and Overwatch 2 may soon be Microsoft properties.)
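As an illustration of that transcribe-then-delete flow, here is a hedged sketch in which a stubbed speech-to-text step and a placeholder block list stand in for Blizzard’s actual tooling:

```python
ABUSIVE_TERMS = {"example_slur", "example_threat"}  # placeholder block list

def transcribe(recording: bytes) -> str:
    """Stand-in for a real speech-to-text service."""
    return "placeholder transcript of the match voice chat"

def review_reported_chat(recording: bytes) -> bool:
    """Transcribe reported match audio, scan the text, then discard both
    artifacts, mirroring the flow Blizzard described."""
    transcript = transcribe(recording)
    flagged = any(term in transcript.lower() for term in ABUSIVE_TERMS)
    # Blizzard says both the recording and the text file are deleted after
    # analysis; dropping the references here is the in-memory analogue.
    del recording, transcript
    return flagged

print(review_reported_chat(b"\x00fake-audio-bytes"))  # False for the stub
```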
To begin with, Xbox’s voice reporting feature will be available to Xbox Insiders in the Alpha and Alpha Skip-Ahead rings in the English-speaking markets of the US, Canada, Great Britain, Ireland, Australia and New Zealand. Microsoft hopes Insiders will provide feedback to help it improve the feature. It plans to continue investing in voice moderation and to support more languages, and Xbox will share data and updates on voice chat moderation in its twice-yearly transparency report.
Source: www.engadget.com