Flagged content explained

Monitor and respond to content deemed inappropriate by your members

Written by Mor Aframian

At JouleBug, we rely on the community's collaboration to maintain a clean activity feed. If any of your members come across posts that seem inappropriate, offensive, or potentially fraudulent (such as using online images instead of in-the-moment photos), they can flag and report the content.

Flagged Content Types

  • Photo and/or Caption: An action logged by a participant that includes a photo, a caption, or both. Click the photo thumbnail to see a larger preview.

  • Comment: This is a comment that was left on an action logged by another participant, typically via the activity feed.

  • Profile: Members may customize their profile photo, name, or headline, which are most visible in the activity feed and leaderboard.

Reviewing and Taking Action on Flagged Content

The flagged content section allows Org Owners and Admins to monitor and respond to content that members have flagged as inappropriate. Once content has been flagged, it appears on the Flags tab of the Admin Dashboard.

Flagged Content is displayed in a table with the following information:

  • Flagged User: The name of the user who authored the flagged content. When a comment on another member's post is flagged, this column shows the comment's author, not the original post's author.

  • Comment: The text of the flagged content. If the flagged post contains only a photo with no caption, this column is blank.

  • Photo: The photograph content of what was flagged. Click on the thumbnail to see a larger preview of the photo.

  • # Flags: The number of distinct members who have flagged this content item. Once more than two members flag an item, it is hidden (but not removed) until the Org Owner or an Admin takes action on it.

  • Created (time): The time and date that the content was created. Times are displayed in your browser's local time.

  • Flagged (time): The time and date that the content was last flagged. Times are displayed in your browser's local time.

  • Action Buttons: Each item can be moderated with one of two actions: Approve or Exclude. Note: An action cannot be undone, so inspect the content carefully before deciding.

    • Approve: This permanently approves the content. The item appears in the app as if it had never been flagged, and it will not return to the Flagged Content section even if it is flagged again.

    • Exclude: This permanently removes the content from the app. The item will no longer appear in the user's history or any activity feeds; however, points earned for the original logged action remain awarded.
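To make the moderation rules above concrete, here is a minimal, illustrative sketch of the lifecycle of a flagged item: auto-hide after more than two distinct flaggers, Approve suppressing future flags, and Exclude removing content while leaving points awarded. The class and field names are hypothetical, not JouleBug's actual implementation.

```python
from dataclasses import dataclass, field

HIDE_THRESHOLD = 2  # hidden once flagged by more than 2 distinct members

@dataclass
class FlaggedItem:
    """Hypothetical model of one row in the Flagged Content table."""
    content: str
    points: int                      # points earned for the original action
    flagged_by: set = field(default_factory=set)
    approved: bool = False
    excluded: bool = False

    def flag(self, member_id: str) -> None:
        # Approved items never re-enter the flagged queue
        if not self.approved and not self.excluded:
            self.flagged_by.add(member_id)

    @property
    def hidden(self) -> bool:
        # Hidden (but not removed) until an Org Owner or Admin acts
        return (not self.approved and not self.excluded
                and len(self.flagged_by) > HIDE_THRESHOLD)

    def approve(self) -> None:
        # Permanent: the item behaves as if it had never been flagged
        self.approved = True
        self.flagged_by.clear()

    def exclude(self) -> None:
        # Permanent removal from feeds; points remain awarded
        self.excluded = True
```

For example, an item flagged by three different members becomes hidden; approving it makes it visible again and ignores any later flags, while excluding it removes it without revoking points.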
