Once you have reviewed the philosophy behind moderation (check out our discussion of that here), it is time to consider the options available to achieve your goals.
This article covers the three options that help ensure the thoughts your participants see are appropriate and respectful while preserving trust and transparency in your Exchange: hiding reported thoughts, community moderation, and machine moderation.
Hide Reported Thoughts
Like everything in ThoughtExchange, the settings for reported thoughts are designed to preserve transparency and trust in the conversation.
By default, thoughts reported by participants (see the Community Moderation section below) remain visible (and available to rate) to all participants until you, the leader, review them and either approve them to stay in the conversation or remove them.
You can choose to hide reported thoughts from your participants until you have had a chance to review them by toggling the Hide Reported Thoughts option on the Create page to Yes.
Once you've selected the settings you would like to use for your Exchange, be sure to click the Apply button at the bottom of the page.
Community Moderation
Community moderation is the most open and transparent approach to reviewing shared thoughts. With this method, all thoughts entered into the Exchange are immediately visible to participants, keeping the conversation flowing without delay. If a participant sees a thought they think is inappropriate, they can click the three dots in the top right-hand corner of the thought bubble to Report it.
You can review and decide to keep or remove any reported thoughts in your exchange using the Moderate tool in the Results Dashboard.
When it’s appropriate:
- Exchanges with trusted/respectful groups (e.g. staff, leadership teams, etc.)
- Exchanges about non-contentious topics
- Exchanges where you have resources in place to keep an eye on the conversation
Machine Moderation
Machine moderation allows the software to automatically review thoughts before they are visible to participants and report those that contain rude or hurtful language or individuals' names, as well as duplicate thoughts shared by the same participant. By default, this feature uses a list of commonly seen words and phrases to determine which thoughts should be reviewed.
You can choose the settings for Machine Moderated thoughts under the Moderation tab in the Settings menu for your Exchange (look for the gear icon in the top right-hand corner when creating or editing your Exchange). You can easily turn Machine Moderation on and off and control what happens to thoughts that are flagged by the AI (removed from the Exchange or visible to participants).
Leaders may also add up to 100 custom words and phrases to the list of moderation criteria the Machine Moderator uses for a particular Exchange. Simply toggle ON the setting labeled "Enable machine reporting for custom words and phrases" and type your list into the dialogue box below, pressing Enter after each new word or phrase.
When you are finished creating moderation settings, be sure to click the Apply button at the bottom of the page.
You can review any thoughts reported by the software using the Moderate tool in the Results Dashboard, just as you would with community-reported thoughts.
Note: Machine moderation, like any feature that relies on artificial intelligence, can make mistakes. It is not meant to replace the decision-making power of a person, only to make moderation easier.
When it’s appropriate:
- Exchanges with broad groups
- Exchanges about more contentious topics
- Exchanges where you have resources in place to review flagged thoughts
If you have any questions about which method is right for your exchange, feel free to reach out to your Client Success Manager or Engagement Coach, or drop us a line at firstname.lastname@example.org.