This Guidance demonstrates how game developers can moderate user-generated content (UGC) to keep player interactions appropriate and safe. Using AWS managed services together with custom machine learning (ML) models, developers can quickly set up a unified content moderation backend. This backend detects and filters a broad range of toxic content and supports customizable content flagging, and its APIs allow fast integration with the game client and community tools. This helps developers address the operational risks of user-provided content on online gaming platforms head-on: manual content moderation is error-prone and costly, whereas moderation powered by artificial intelligence (AI) dramatically accelerates the process and helps keep gaming communities safe.
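
To illustrate the integration pattern, the sketch below shows how a game service might call such a moderation backend over HTTP before broadcasting a chat message. The endpoint URL, request fields, and response shape (`toxicity_score`, `flagged_categories`) are illustrative assumptions, not the Guidance's actual API contract.

```python
"""Minimal sketch of pre-broadcast chat moderation against a hypothetical
moderation backend endpoint exposed by the Guidance's API layer."""

import requests

# Hypothetical API Gateway endpoint; replace with the deployed backend's URL.
MODERATION_ENDPOINT = "https://<api-id>.execute-api.<region>.amazonaws.com/prod/moderate"


def is_message_allowed(player_id: str, message: str, threshold: float = 0.7) -> bool:
    """Return True if the message passes moderation, False if it should be blocked."""
    response = requests.post(
        MODERATION_ENDPOINT,
        json={"player_id": player_id, "text": message},  # assumed request fields
        timeout=2,
    )
    response.raise_for_status()
    result = response.json()

    # Block the message when the toxicity score exceeds the game's threshold
    # or any custom flag category was triggered (assumed response fields).
    toxicity_ok = result.get("toxicity_score", 0.0) < threshold
    no_custom_flags = not result.get("flagged_categories")
    return toxicity_ok and no_custom_flags


if __name__ == "__main__":
    if is_message_allowed("player-123", "gg, well played!"):
        print("Message approved for broadcast.")
    else:
        print("Message blocked and routed for moderator review.")
```

In a design like this, the game client or chat service stays thin: it sends the raw text and a player identifier, and the backend applies the managed AI services, custom models, and any customer-defined flagging rules, returning a simple verdict the game can act on.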