The data was included in the company’s inaugural ‘Xbox Transparency Report’, which was published on Monday to highlight steps the platform holder is taking to moderate content to create a safer environment for players.
According to Microsoft, it took a total of 7.126 million enforcement actions between January 1 and June 30 this year.
Account-only suspensions accounted for 4.5 million (63%) of all enforcements, incidents of content removal totalled 196,000 (3%), and incidents of both account suspension and content removal were 2.43 million (34%).
In terms of enforcement by policy area, Microsoft said cheating/inauthentic accounts were responsible for 4.33 million enforcements, ahead of adult sexual content (199,000), fraud (87,000), harassment or bullying (54,000), profanity (46,000), phishing (26,000) and ‘other’ (23,000), a category which includes “smaller volume areas such as piracy, account tampering, real world concerns, drugs, vulgar content, hate speech, spam, advertising, or solicitation”.
Microsoft said approximately two-thirds of the enforcements issued in the first half of 2022 were the result of proactive detection. The remaining third were made in reaction to player reports, of which Microsoft received 33.08 million during the six-month period.
Negative communications from other players were behind 46% of user reports, ahead of complaints about player conduct (43%) and user-generated content (11%).
Microsoft said the total number of player reports in the first half of the year was down significantly, from 41.83 million in the second half of 2021 and 52.05 million in the first half of that year.
Xbox’s appeals process enables players to get more information about any enforcements they have received and to challenge them. Of 151,000 cases reviewed in the first half of the year following appeals, just 6.5% of enforcement actions were overturned.
Microsoft said: “With this inaugural Xbox Transparency Report, it is our goal to share with you more about the wide range of actions that the Xbox team takes to moderate content on our platform and create safer experiences. As an essential part of our growth, we expect this report to evolve over time as we learn, iterate, incorporate feedback, and make improvements.
“Our proactive moderation, up 9x from the same period last year, allows us to catch negative content and conduct before it reaches players. We continue to invest and improve our tech so players can have safe, positive, and inviting experiences.”