I think the current reporting system is excessive, since most players use voice chat, and context and language can easily get confused when reporting a player. What I propose instead is an automatic moderation bot ("automod") that detects swear words, and the creators of servers or Realms can choose the level of automod.
Levels of the automod:
Level 0: Messages are not checked.
Level 1: words related to sexual content.
Level 2: Level 1, plus vulgar language.
Level 3: Level 2, plus insulting or offensive language.
Level 4: Level 3, plus abuse of capital letters or malicious links.
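To make the idea concrete, here is a minimal sketch of how the cumulative levels could be represented. This is not any real Minecraft API; the class and category names are purely illustrative.

```java
import java.util.EnumSet;
import java.util.Set;

// Illustrative sketch only: cumulative automod levels as sets of blocked categories.
public class AutomodLevels {

    enum Category { SEXUAL_CONTENT, VULGAR_LANGUAGE, INSULTS, CAPS_ABUSE_OR_MALICIOUS_LINKS }

    // Each level includes everything from the levels below it; level 0 blocks nothing.
    static Set<Category> categoriesFor(int level) {
        Set<Category> blocked = EnumSet.noneOf(Category.class);
        if (level >= 1) blocked.add(Category.SEXUAL_CONTENT);
        if (level >= 2) blocked.add(Category.VULGAR_LANGUAGE);
        if (level >= 3) blocked.add(Category.INSULTS);
        if (level >= 4) blocked.add(Category.CAPS_ABUSE_OR_MALICIOUS_LINKS);
        return blocked;
    }

    public static void main(String[] args) {
        System.out.println("Level 0: " + categoriesFor(0)); // nothing checked
        System.out.println("Level 3: " + categoriesFor(3)); // sexual content, vulgar language, insults
    }
}
```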
When a disallowed word is detected, the game warns the player before the text is sent, whether they are sending chat messages, writing signs, renaming items, or writing books.
If the "automod" detects one of these words will temporarily mute the player from the media with messages. In case there are several abuses as they did in school with these levels of moderation skip it 4 times without affecting the multiplayer.
I propose this because many children pick up bad words, or repeat swear words in other languages without even knowing what they mean. This system would give more room for learning and correcting language while keeping servers and Realms free of these kinds of dangers.
In multiplayer, besides the built-in bad-word lists, you could add custom words that are not allowed on a given server or Realm. For example, servers that support people with autism could block certain words that someone might use to offend another user.
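A small sketch of how a per-server custom list could sit on top of the built-in level lists; again, the names and the simple word-by-word check are only assumptions to show the idea.

```java
import java.util.HashSet;
import java.util.Locale;
import java.util.Set;

// Illustrative sketch: a per-server filter combining built-in level words with custom entries.
public class ServerWordFilter {

    private final Set<String> blockedWords = new HashSet<>();

    // Built-in words would come from the chosen automod level; custom words from the server owner.
    void addBlockedWord(String word) {
        blockedWords.add(word.toLowerCase(Locale.ROOT));
    }

    boolean containsBlockedWord(String message) {
        for (String token : message.toLowerCase(Locale.ROOT).split("\\s+")) {
            if (blockedWords.contains(token)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        ServerWordFilter filter = new ServerWordFilter();
        filter.addBlockedWord("exampleword"); // a server-specific custom entry
        System.out.println(filter.containsBlockedWord("this contains ExampleWord here")); // true
    }
}
```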