Once you have reviewed the philosophy behind moderation (see our separate discussion of that topic), it is time to consider the options available to achieve your goals.
This article covers three approaches that help ensure the thoughts your participants see are appropriate and respectful while preserving trust and transparency in your exchange: hiding reported thoughts, community moderation, and machine moderation.
Hide Reported Thoughts
Like all things related to Thoughtexchange, settings for reported thoughts lean into transparency and trust in the conversation.
By default, thoughts reported by participants (see the Community Moderation section below) remain visible (and available to rate) to all participants until you, the leader, review them and either approve them to remain in the conversation or remove them.
You can choose to hide reported thoughts from your participants until you have had a chance to review them by toggling the Hide Reported Thoughts option on the Create page to Yes.
Community Moderation
Community moderation is the most open and transparent approach to reviewing shared thoughts. With this method, all thoughts entered into the exchange are immediately visible to participants, so the conversation flows without interruption. If a participant sees a thought they think is inappropriate, they can click the three dots in the top right-hand corner of the thought bubble to report it.
When it’s appropriate:
- Exchanges with trusted/respectful groups (e.g. staff, leadership teams, etc.)
- Exchanges about non-contentious topics
- Exchanges where you have resources in place to keep an eye on the conversation
Machine Moderation (BETA)
Machine moderation allows the software to automatically review thoughts before they are visible to participants and report those that contain rude or hurtful language or individuals’ names.
You can choose the settings for Machine Moderated thoughts under the Moderation tab in the Create menu for your exchange. You can easily turn Machine Moderation on and off and control what happens to thoughts that are flagged by the AI (removed from the exchange or visible to participants).
You can review any thoughts reported by the software using the Moderate tool in the Discover Dashboard, which keeps the review process simple.
Note: Machine moderation, like any feature that relies on artificial intelligence, can make mistakes. It is not meant to replace a person’s decision-making, only to make moderation easier.
When it’s appropriate:
- Exchanges with broad groups
- Exchanges about more contentious topics
- Exchanges where you have resources in place to review flagged thoughts
If you have any questions about which method is right for your exchange, feel free to reach out to your Client Success Manager or Engagement Coach, or drop us a line at email@example.com.