What Is Moderation?

The moderation feature lets you assign certain users to review and approve (or reject) content changes submitted by community users before those changes are published. You can designate one or more moderators and enable moderation for a variety of content types. You can also set up abuse reporting and moderate user registration.
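
Every variety of moderation described below follows the same basic lifecycle: a submission is held in a pending state, a designated moderator approves or rejects it, and only approved items become visible. The following minimal sketch illustrates that lifecycle; all names (ModerationItem, Status, and so on) are illustrative assumptions, not the application's actual API.

    # Illustrative sketch only; these names are not the application's real API.
    from enum import Enum

    class Status(Enum):
        PENDING = "pending"      # submitted, awaiting a moderator's decision
        APPROVED = "approved"    # published and visible to the community
        REJECTED = "rejected"    # kept out of publication

    class ModerationItem:
        """One queued submission: a content change, registration request, or image."""

        def __init__(self, author, payload):
            self.author = author
            self.payload = payload
            self.status = Status.PENDING   # nothing is published until reviewed

        def approve(self, moderator):
            self.status = Status.APPROVED

        def reject(self, moderator, reason=""):
            self.status = Status.REJECTED
            self.reason = reason

        def is_published(self):
            return self.status is Status.APPROVED

    item = ModerationItem("alice", "Edited document text")
    print(item.is_published())   # False until a moderator acts
    item.approve("pat")
    print(item.is_published())   # True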

For a complete list of content types that can be moderated and the places where you can enable moderation, see Who Moderates What?

The following types of moderation are available in the application:
Content moderation
You can enable content moderation in a space and the projects it contains, or in the root space and all social groups (social groups are contained by the root space, so they inherit its moderation settings). You can designate one or more content moderators per place, and you can limit the types of content that are moderated in that place (documents, blog posts, discussions, and so on); a sketch of how place-based settings can be inherited follows this list.
Document approval
Document approval differs from content moderation in that document approvers can approve only documents in a given space. For more information, see Setting a Document Approver in a Space.
User registration moderation
When you enable user registration moderation, registration requests are reviewed by one or more moderators and then approved or rejected. You can blacklist email addresses from specific domains, or auto-approve users whose email addresses originate from your community's domain; a sketch of these domain rules follows this list.
Profile image moderation
You can enable moderation for the images that users upload to their user profiles. This feature is all or nothing: it is either enabled for all users or disabled for all users.
Avatar moderation
You can enable moderation for every user-uploaded avatar image. For more information, see Setting Up User-Uploaded Avatar Moderation.
Abuse reporting
When users report abuse, the reports are sent to a moderator. For more information, see Setting Up Abuse Reporting.
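
Because places are nested (the root space contains social groups, and a space contains its projects), a place without its own moderation settings falls back to those of its container. The sketch below illustrates that lookup; the Place class and its field names are assumptions made for illustration, not the application's data model.

    # Illustrative sketch only; the Place class is an assumption, not the real model.
    class Place:
        """A space, project, or social group in the place hierarchy."""

        def __init__(self, name, parent=None, moderated_types=None, moderators=None):
            self.name = name
            self.parent = parent                    # containing place; None for the root
            self.moderated_types = moderated_types  # e.g. {"document", "blog_post"}
            self.moderators = moderators or []

        def effective_moderation(self):
            """Walk up the containment chain to find the settings that apply here."""
            place = self
            while place is not None:
                if place.moderated_types is not None:
                    return place.moderated_types, place.moderators
                place = place.parent
            return None, []   # no moderation enabled anywhere up the chain

    # Social groups are contained by the root space, so enabling moderation
    # there covers every group:
    root = Place("root", moderated_types={"document", "discussion"}, moderators=["pat"])
    group = Place("hiking-club", parent=root)
    print(group.effective_moderation())   # inherits the root space's settings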
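
The registration domain rules reduce moderator workload: requests from blacklisted domains can be rejected outright, requests from the community's own domain can bypass review, and everything else is queued for a moderator. Here is a minimal sketch of that decision; the domain values and the function name are invented for illustration and are not part of the application.

    # Illustrative sketch only; domains and function name are hypothetical.
    BLACKLISTED_DOMAINS = {"spammy.example"}   # reject without review
    COMMUNITY_DOMAIN = "example.com"           # auto-approve trusted addresses

    def triage_registration(email):
        """Decide what happens to a registration request based on its email domain."""
        domain = email.rsplit("@", 1)[-1].lower()
        if domain in BLACKLISTED_DOMAINS:
            return "rejected"                  # blacklisted domain
        if domain == COMMUNITY_DOMAIN:
            return "approved"                  # originates from the community domain
        return "pending"                       # queued for moderator review

    print(triage_registration("newuser@example.com"))   # approved
    print(triage_registration("bot@spammy.example"))    # rejected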