Outprobe: Content Moderation, DMCA & Legal Compliance

Effective Date: April 4, 2026
Last Updated: March 20, 2026
Version: 1.0


Our Approach

Outprobe is a platform, not a publisher. We do not create, curate, or editorially control user-generated content. As on WhatsApp or Telegram, most content on Outprobe lives inside private or semi-private spaces (communities, circles, probes, DMs) controlled by their members.

Our role:

  • Provide tools for community owners to moderate their own spaces
  • Act swiftly on illegal content (CSAM, terrorism) when reported or detected
  • Process DMCA/copyright takedowns as required by law
  • Maintain a reporting system for all users
  • Comply with law enforcement requests through proper legal channels

We do not:

  • Pre-screen or pre-approve user content
  • Algorithmically suppress or promote content based on editorial judgment
  • Monitor private messages, probes, or hidden communities proactively
  • Make editorial decisions about what opinions are acceptable

1. Content Responsibility

1.1 Who Is Responsible for What

  • Users: All content they create, upload, or share. Users are legally liable for their own posts, media, comments, messages, and listings.
  • Community owners/admins: Moderating their community. Setting and enforcing community rules. Removing content that violates their community guidelines. Responding to member reports within their community.
  • Community moderators: Assisting owners with content moderation within their assigned community. Acting on reports and enforcing community rules.
  • Outprobe (platform): Providing moderation tools. Acting on reports of illegal content. Processing DMCA takedowns. Complying with law enforcement. Removing content that violates platform-wide Terms of Service. Maintaining the reporting system.

1.2 Platform Liability Protection

Outprobe operates as an intermediary platform under:

  • United States (Section 230, Communications Decency Act): Not liable for user content. Must process DMCA notices (17 U.S.C. 512).
  • European Union (Digital Services Act, DSA): Act expeditiously on illegal content when notified. Provide transparency reports. Designate a point of contact.
  • India (IT Act, Section 79 + IT Rules 2021): Follow due diligence. Appoint a Grievance Officer. Remove content within 36 hours of a government order. Remove CSAM within 24 hours.
  • United Kingdom (Online Safety Act): Conduct risk assessments for illegal content. Remove it swiftly when notified.
  • General (local laws): Comply with applicable content laws in each jurisdiction where we operate.

This protection requires us to act when notified. If we receive a valid report of illegal content and fail to act, we lose safe harbor protection.


2. Content Categories

2.1 Zero Tolerance — Removed Immediately, Reported to Authorities

This content is never allowed under any circumstances. It is removed immediately upon detection or report, and reported to relevant law enforcement.

  • Child Sexual Abuse Material (CSAM): Immediate removal. Account permanently banned. All associated accounts investigated. Reported to NCMEC (US), IWF (UK), and local law enforcement.
  • Terrorism / violent extremism: Immediate removal. Account permanently banned. Reported to relevant counter-terrorism authorities.
  • Content facilitating imminent violence: Immediate removal. Account suspended pending investigation. Reported to local law enforcement if the threat is credible.
  • Non-consensual intimate imagery: Immediate removal. Account suspended. Reported to law enforcement if applicable.

No warnings. No appeals for CSAM or terrorism content. Permanent ban.

2.2 Terms of Service Violations — Removed by Outprobe

Content that violates our Terms of Service but is not necessarily illegal:

  • Spam / scam / phishing: Removed. Repeat offenders banned.
  • Impersonation: Removed. Account may be suspended.
  • Doxxing (sharing private information): Removed immediately. Account suspended.
  • Hate speech (targeted harassment based on protected characteristics): Removed. Warning or suspension based on severity.
  • Malware / malicious links: Removed immediately. Account banned.

2.3 Community-Level Moderation — Handled by Owners

Everything else is up to community owners and their moderation teams:

  • Off-topic posts
  • Low-quality content
  • Arguments and heated discussions
  • NSFW content in appropriate communities (must be properly labeled)
  • Community-specific rules
  • Member behavior

Outprobe does not intervene in community-level moderation decisions unless the content violates platform-wide Terms of Service or applicable law.


3. DMCA / Copyright Takedown Process

3.1 Overview

Outprobe complies with the Digital Millennium Copyright Act (DMCA) and equivalent copyright laws in other jurisdictions. We have a designated DMCA Agent and follow the standard notice-and-takedown process.

3.2 Filing a Copyright Takedown Notice

If you believe content on Outprobe infringes your copyright, send a notice to our designated DMCA Agent:

DMCA Agent Contact: see Section 3.7.

Your notice must include:

  1. Identification of the copyrighted work — what is being infringed (title, URL of original, registration number if applicable)
  2. Identification of the infringing content — URL(s) or description of where the infringing content is on Outprobe
  3. Your contact information — full name, email address, phone number, physical address
  4. Good faith statement: "I have a good faith belief that the use of the material described above is not authorized by the copyright owner, its agent, or the law."
  5. Accuracy statement: "The information in this notice is accurate, and under penalty of perjury, I am the copyright owner or am authorized to act on behalf of the owner of an exclusive right that is allegedly infringed."
  6. Your signature — physical or electronic

Incomplete notices will not be processed. We will inform you of what is missing.

3.3 What Happens After a Valid Notice

  1. Within 24 hours: we acknowledge receipt of the notice.
  2. Within 48 hours: we review the notice for completeness and validity.
  3. Within 72 hours: if valid, we remove or disable access to the content.
  4. Immediately after removal: we notify the user who posted the content, providing a copy of the notice.
  5. Within 10 business days: the user may file a counter-notice (see below).
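As a rough sketch, the timeline above reduces to deadline arithmetic from the moment a valid notice is received. The function and key names below are illustrative, not an Outprobe API, and the counter-notice window is approximated in calendar days:

```python
from datetime import datetime, timedelta

def takedown_deadlines(received):
    """Compute the deadline for each step of the notice-and-takedown
    process, counted from receipt of a valid DMCA notice."""
    return {
        "acknowledge": received + timedelta(hours=24),      # step 1
        "review": received + timedelta(hours=48),           # step 2
        "remove_if_valid": received + timedelta(hours=72),  # step 3
        # the counter-notice window is 10 *business* days; approximated
        # here as 14 calendar days for simplicity
        "counter_notice_ends": received + timedelta(days=14),
    }
```

For example, a notice received on January 1 must be reviewed by January 3 and acted on by January 4.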

3.4 Counter-Notice (Disputing a Takedown)

If your content was removed and you believe it was removed in error (fair use, you own the rights, misidentification), you may file a counter-notice.

Send to: dmca@outprobe.com
Subject line: DMCA Counter-Notice

Your counter-notice must include:

  1. Identification of the removed content — what was taken down and where it was located
  2. Your contact information — full name, email, phone number, physical address
  3. Consent to jurisdiction: "I consent to the jurisdiction of the Federal District Court for the district in which my address is located, or if outside the US, any judicial district in which Outprobe may be found."
  4. Good faith statement: "I swear, under penalty of perjury, that I have a good faith belief that the material was removed or disabled as a result of mistake or misidentification."
  5. Your signature — physical or electronic

3.5 What Happens After a Counter-Notice

  1. Within 24 hours: we forward the counter-notice to the original complainant.
  2. Within 10 business days: the complainant may file a court action seeking to restrain the user.
  3. If no court action is filed: we restore the content within 14 business days of receiving the counter-notice.
  4. If a court action is filed: the content remains down pending court resolution.

3.6 Copyright Strikes System

Outprobe maintains a copyright strike system to comply with the DMCA's repeat infringer policy.

  • 1st strike: Content removed. Warning notification sent. User informed of copyright policies.
  • 2nd strike: Content removed. Formal warning. User's upload capabilities may be temporarily restricted.
  • 3rd strike: Content removed. Account suspended for 30 days. All content reviewed.
  • 4th strike: Account permanently terminated.

Strike expiration: Strikes expire after 12 months of no further violations.

Strike removal: If a counter-notice is successful (content restored), the associated strike is removed.
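A minimal sketch of the strike ledger described above, assuming a 365-day expiry window and shorthand consequence labels (both are illustrative simplifications, not Outprobe internals):

```python
from datetime import date, timedelta

STRIKE_TTL = timedelta(days=365)  # strikes expire after 12 months

CONSEQUENCES = {
    1: "content removed + warning",
    2: "content removed + possible upload restriction",
    3: "content removed + 30-day suspension",
    4: "permanent termination",
}

def active_strikes(strike_dates, today):
    """Count strikes issued within the last 12 months (a successful
    counter-notice would also remove its strike from the list)."""
    return sum(1 for d in strike_dates if today - d < STRIKE_TTL)

def consequence(strike_dates, today):
    """Map the active strike count to the consequence ladder above."""
    n = active_strikes(strike_dates, today)
    if n == 0:
        return "no action"
    return CONSEQUENCES[min(n, 4)]
```

Note how expiry and counter-notice removal both reduce the active count, so a user's standing reflects only recent, upheld strikes.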

Dedicated copyright page: Users can view their copyright strike history and status from their account settings page. This page shows:

  • Number of active strikes
  • Details of each strike (date, content, complainant)
  • Strike expiration dates
  • How to file a counter-notice
  • Educational resources about copyright

3.7 Designated DMCA Agent

As required by 17 U.S.C. 512(c), our designated agent for receiving DMCA notices is:

Outprobe DMCA Agent
Email: dmca@outprobe.com
[Physical address to be registered with US Copyright Office]

4. Reporting System

4.1 How Users Report Content

Every piece of content on Outprobe has a "Report" option accessible via the three-dot menu:

  • Posts: three-dot menu on the post
  • Comments: three-dot menu on the comment
  • Media: three-dot menu on the media item
  • Messages (DM/probe): long-press on the message
  • Communities: community page > three-dot menu
  • User profiles: profile page > three-dot menu
  • Ads (promoted posts): three-dot menu on the promoted post
  • Events: three-dot menu on the event
  • Listings (marketplace): three-dot menu on the listing

4.2 Report Categories

When reporting, users select a category:

  • CSAM / child exploitation: Outprobe trust & safety (immediate). Response within 24 hours.
  • Terrorism / violent extremism: Outprobe trust & safety (immediate). Response within 24 hours.
  • Imminent threat of violence: Outprobe trust & safety (immediate). Response within 24 hours.
  • Non-consensual intimate imagery: Outprobe trust & safety. Response within 24 hours.
  • Copyright infringement: Outprobe DMCA team. Response within 72 hours.
  • Spam / scam / phishing: community moderators first, then Outprobe if unresolved. Response within 7 days.
  • Harassment / bullying: community moderators first, then Outprobe if severe. Response within 7 days.
  • Hate speech: community moderators first, then Outprobe if unresolved. Response within 7 days.
  • Impersonation: Outprobe trust & safety. Response within 7 days.
  • Self-harm / suicide: Outprobe trust & safety. Response within 24 hours, with crisis resources provided.
  • Other violation: community moderators. Response at moderator discretion.

4.3 Report Flow

User reports content
  │
  ├─ CSAM / Terrorism / Imminent Violence / NCII
  │   → Goes directly to Outprobe trust & safety
  │   → Content hidden immediately pending review
  │   → Reviewed within 24 hours
  │   → If confirmed: removed permanently, account action taken, authorities notified
  │
  ├─ Copyright
  │   → Goes to Outprobe DMCA team
  │   → Standard DMCA process (see Section 3)
  │
  ├─ Spam / Harassment / Hate Speech / Other
  │   → Goes to community owner/moderators first
  │   → Moderators decide within their community rules
  │   → If user disagrees with moderator decision → can escalate to Outprobe
  │   → Outprobe reviews only for Terms of Service violations
  │
  └─ Self-harm
      → Goes to Outprobe trust & safety
      → Crisis resources shown to reporter and content creator
      → Content reviewed for policy compliance
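The branching above can be sketched as a routing table. The category strings and queue names here are hypothetical, chosen only to illustrate the triage logic:

```python
# Categories that bypass community moderation and go straight to
# Outprobe trust & safety, with the content hidden pending review.
IMMEDIATE = {"csam", "terrorism", "imminent_violence", "ncii"}

def route_report(category):
    """Return (queue, hide_immediately) for a report category."""
    if category in IMMEDIATE:
        return ("trust_and_safety", True)
    if category == "copyright":
        return ("dmca_team", False)        # standard DMCA process
    if category == "self_harm":
        return ("trust_and_safety", False) # crisis resources also shown
    # spam, harassment, hate speech, other: community moderators first
    return ("community_moderators", False)
```

The key design point is the second element of the tuple: only the zero-tolerance categories hide content before any human review.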

4.4 Reporter Privacy

  • The identity of the reporter is never revealed to the reported user
  • Reports are processed confidentially
  • Reporters receive a notification when their report is resolved
  • False reporting (abuse of the report system) may result in the reporter's account being restricted

5. Content Appeals

5.1 When Content Is Removed

If Outprobe removes your content, you will receive a notification explaining:

  • What content was removed
  • Which policy it violated
  • What action was taken on your account (if any)
  • How to appeal

5.2 Appeal Process

  1. Anytime: the user submits an appeal via the notification or Settings > Account > Appeals.
  2. Within 7 days: Outprobe reviews the appeal (with a different reviewer than the original decision).
  3. After review: the decision is communicated: upheld (content stays removed) or overturned (content restored).

5.3 What Can Be Appealed

  • Content removal (ToS violation): appealable
  • Account suspension: appealable
  • Copyright strike: appealable (via DMCA counter-notice)
  • CSAM removal: not appealable
  • Terrorism content removal: not appealable
  • Permanent ban (repeated violations): appealable (one appeal only)

5.4 What Cannot Be Appealed

  • CSAM removal (zero tolerance, no exceptions)
  • Terrorism content removal (zero tolerance, no exceptions)
  • Community-level moderation decisions (take it up with the community owner, not Outprobe)

6. Community Moderation Responsibilities

6.1 Owner Obligations

Community owners are the first line of moderation. By creating a community, owners agree to:

  • Set community rules: clearly define what is and isn't allowed in the community
  • Moderate content: remove content that violates community rules
  • Respond to reports: act on member reports within their community in a reasonable timeframe
  • Appoint moderators: for larger communities, appoint moderators to help
  • Enforce Outprobe ToS: ensure community content doesn't violate platform-wide Terms of Service
  • Escalate illegal content: report illegal content to Outprobe immediately (don't just delete it)

6.2 Moderation Tools Provided

Outprobe provides community owners and moderators with:

  • Delete post/comment: remove any content within the community
  • Mute member: temporarily prevent a member from posting
  • Ban member: remove a member and prevent them from rejoining
  • Report to Outprobe: escalate content to Outprobe trust & safety
  • Moderation log: view all moderation actions taken in the community
  • Auto-moderation (future): keyword filters, link blocking, spam detection

6.3 When Outprobe Overrides Community Moderation

Outprobe will override community moderation decisions only when:

  • Illegal content not removed by moderators: Outprobe removes the content directly
  • Community systematically violates ToS: the community may be suspended or removed
  • Owner/moderators are complicit in ToS violations: their accounts may face action
  • Law enforcement request: Outprobe complies regardless of community moderation

7. Law Enforcement & Government Requests

7.1 Our Principles

  • We comply with valid legal orders in applicable jurisdictions
  • We minimize the data disclosed to only what is legally required
  • We notify affected users when legally permitted to do so
  • We do not provide bulk or warrantless access to user data
  • We do not build "backdoors" into our systems

7.2 What We Require

  • User data request: a valid subpoena, court order, or equivalent legal process
  • Content removal (government): a valid court order or legal directive from the applicable jurisdiction
  • Emergency disclosure: an imminent threat to life; we may disclose limited data without a court order to prevent harm
  • Preservation request: a valid legal request; we preserve (but do not disclose) the specified data for 90 days

7.3 What We Provide

  • Basic subscriber info (subpoena): name, email, account creation date, IP log (last 90 days)
  • Content data (court order): posts, comments, and media created by the specified user
  • Message content (search warrant): DM/probe messages of the specified user (requires a search warrant or equivalent)
  • Real-time interception: not supported; we do not have the capability for real-time wiretapping

7.4 User Notification

  • We notify users of law enforcement requests unless prohibited by law (e.g., gag order)
  • If a gag order expires, we notify the user at that time
  • Notification includes: what was requested, by whom, and what was disclosed

7.5 Transparency Report

Outprobe will publish an annual transparency report including:

  • Number of law enforcement requests received
  • Number of requests complied with (full and partial)
  • Number of requests rejected
  • Number of user accounts affected
  • Number of content removals due to legal orders
  • Breakdown by country

8. Private Content (DMs, Probes, Hidden Communities)

8.1 Our Position

Outprobe treats private content similarly to how WhatsApp and other messaging platforms treat messages:

  • We don't read private content: Outprobe staff does not access DMs, probe messages, or hidden community content unless required by law or a valid user report.
  • We don't scan private content: no automated scanning of private messages for advertising, analytics, or general moderation purposes.
  • We act on reports: if a participant in a private conversation reports content, we review the reported content only.
  • We comply with the law: valid legal orders can compel disclosure of private content (see Section 7).
  • CSAM exception: if we become aware of CSAM in any context (public or private), we are legally required to report it to NCMEC and remove it.

8.2 What This Means in Practice

Private DM between User A and User B:
  → Outprobe cannot see the content
  → Outprobe does not scan the content
  → If User A reports a message from User B:
    → Outprobe reviews the reported message(s) only
    → Takes action if it violates ToS
  → If law enforcement presents a valid warrant:
    → Outprobe provides the specified messages
    → Notifies the user unless legally prohibited

9. CSAM Detection and Reporting

9.1 Legal Obligations

Under US federal law (18 U.S.C. 2258A), electronic service providers must report apparent CSAM to NCMEC's CyberTipline. This is not optional.

9.2 Our Process

  1. CSAM is detected (via user report, or it becomes known to Outprobe staff).
  2. The content is immediately removed and preserved for law enforcement.
  3. The user account is permanently banned.
  4. A report is filed with the NCMEC CyberTipline within 24 hours.
  5. All relevant data is preserved and made available to law enforcement.
  6. Associated accounts are investigated for additional violations.

9.3 No Appeals

There is no appeal process for CSAM-related removals and bans. This is a legal and moral absolute.


10. Terrorist and Violent Extremist Content (TVEC)

10.1 Definition

Content that:

  • Promotes, incites, or glorifies terrorism or violent extremism
  • Provides instructions for carrying out terrorist acts
  • Recruits for terrorist organizations
  • Depicts terrorist acts for the purpose of glorification

10.2 Our Process

  1. Content is reported or detected.
  2. The content is immediately removed.
  3. The user account is permanently banned.
  4. The content is reported to relevant authorities (varies by jurisdiction).
  5. A hash of the content is shared with industry databases (e.g., GIFCT) if applicable.

10.3 No Appeals

There is no appeal process for terrorism-related removals and bans.


11. Self-Harm and Suicide Content

11.1 Our Approach

We treat self-harm content with care, not just enforcement:

  • Crisis resources: when self-harm content is reported, both the reporter and the content creator are shown crisis helpline information.
  • Content review: content is reviewed by Outprobe trust & safety.
  • Removal: content that promotes or encourages self-harm is removed.
  • Support: content that is a genuine cry for help is handled sensitively; the user is directed to crisis resources, not punished.
  • No automatic bans: users expressing distress are not banned; they need help, not punishment.

11.2 Crisis Resources

When self-harm content is detected, we display localized crisis helpline information:

  • Global: International Association for Suicide Prevention: https://www.iasp.info/resources/Crisis_Centres/
  • United States: 988 Suicide & Crisis Lifeline (call or text 988)
  • India: AASRA: 9820466626
  • United Kingdom: Samaritans: 116 123
  • European Union: EU-wide helpline: 116 123

(Additional regions to be added)


12. Account Actions

12.1 Types of Account Actions

  • Warning: notification that content violated a policy. Permanent record. Appeal: N/A.
  • Content removal: specific content is removed. Permanent. Appealable.
  • Temporary mute: user cannot post or comment. 24 hours to 30 days. Appealable.
  • Upload restriction: user cannot upload media. Until the restriction is lifted. Appealable.
  • Account suspension: user cannot access the platform. 7 to 90 days. Appealable.
  • Permanent ban: account terminated. Permanent. One appeal only.

12.2 Escalation Path

1st violation: Warning + content removal
2nd violation: Content removal + temporary mute (24-72 hours)
3rd violation: Content removal + extended mute (7-30 days)
4th violation: Account suspension (30 days)
5th violation: Permanent ban

Exception: Zero-tolerance content (CSAM, terrorism)
  → Immediate permanent ban, no escalation path
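The ladder above can be sketched as a lookup keyed on the violation count, with the zero-tolerance short-circuit applied first. The labels are shorthand for the stated ranges, not exact policy values:

```python
ZERO_TOLERANCE = {"csam", "terrorism"}

def sanction(violation_count, category):
    """Map a violation count to the escalation step described above."""
    if category in ZERO_TOLERANCE:
        return "permanent ban"  # immediate, no escalation path
    if violation_count < 1:
        return "no action"
    ladder = {
        1: "warning + content removal",
        2: "content removal + mute (24-72 hours)",
        3: "content removal + mute (7-30 days)",
        4: "account suspension (30 days)",
    }
    # 5th and later violations fall through to a permanent ban
    return ladder.get(violation_count, "permanent ban")
```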

12.3 Evading Bans

Creating new accounts to evade a ban is a violation of our Terms of Service. If detected:

  • All associated accounts are permanently banned
  • IP-based restrictions may be applied
  • Legal action may be pursued in severe cases

13. Content on the Platform After Account Deletion

13.1 What Happens to Content

When a user deletes their account:

  • Posts in communities: anonymized; the author is shown as "Deleted User". Content remains for community context.
  • Comments: anonymized; the author is shown as "Deleted User".
  • Direct messages: removed from the deleted user's side. The other participant retains their copy.
  • Probe messages: anonymized; the author is shown as "Deleted User".
  • Media uploads: removed from platform storage within 30 days.
  • Profile data: permanently deleted within 30 days.
  • Community ownership: transferred to the next admin. If there is no admin, the community is deleted.

13.2 Why Content Is Anonymized, Not Deleted

Deleting all posts from communities would destroy conversation context and harm other users who participated in discussions. Anonymization preserves the community's content while removing all connection to the deleted user.
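A hedged sketch of the per-item dispositions described in Section 13.1; the item schema and type names are illustrative assumptions, not the actual data model:

```python
ANONYMIZED = {"post", "comment", "probe_message"}
PURGED = {"media", "profile"}  # removed/deleted within 30 days

def on_account_deletion(item):
    """Return the post-deletion form of one content item, or None
    if the item is purged entirely."""
    kind = item["type"]
    if kind in ANONYMIZED:
        # content survives for community context, authorship removed
        return {**item, "author": "Deleted User"}
    if kind in PURGED:
        return None
    if kind == "dm":
        # removed from the deleted user's side only; the other
        # participant keeps their copy
        return item if item.get("held_by_other") else None
    return item
```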


14. Immunity and Indemnification

14.1 Platform Immunity

Outprobe is an intermediary platform. We are not the author, creator, or endorser of any user-generated content. Under applicable safe harbor laws, we are not liable for content posted by users, provided we:

  • Do not exercise editorial control over user content
  • Act on valid takedown notices
  • Remove illegal content when notified
  • Maintain a repeat infringer policy
  • Comply with applicable legal obligations

14.2 User Responsibility

By using Outprobe, users agree that:

  • They are solely responsible for all content they post
  • They will not post content that infringes third-party rights
  • They will not post illegal content
  • They indemnify Outprobe against claims arising from their content
  • They accept that their content may be removed if it violates policies

14.3 Community Owner Responsibility

Community owners accept additional responsibility:

  • They are responsible for moderating their community
  • They must take reasonable action on reports within their community
  • They must escalate illegal content to Outprobe
  • They are not employees or agents of Outprobe

15. Contact Information

  • DMCA takedown notices: dmca@outprobe.com
  • DMCA counter-notices: dmca@outprobe.com
  • Report illegal content: report@outprobe.com or the in-app Report button
  • Law enforcement requests: legal@outprobe.com
  • Content appeals: the in-app appeals system or appeals@outprobe.com
  • General support: support@outprobe.com
  • Grievance Officer (India, IT Rules): grievance@outprobe.com
  • DSA contact (EU): dsa@outprobe.com

16. Compliance Calendar

  • Transparency report: annual. Law enforcement requests, content removals, DMCA stats.
  • NCMEC reporting: as needed. Within 24 hours of CSAM detection.
  • DSA compliance report (EU): annual. If applicable based on user count.
  • India IT Rules compliance report: monthly. If applicable based on Indian user count.
  • DMCA agent registration: updated as needed. US Copyright Office registration.

Summary

  • Users own their content: users are responsible for what they post.
  • Community owners moderate: they are the first line of defense, not Outprobe.
  • Outprobe is the platform: we provide tools, not editorial control.
  • Zero tolerance for illegal content: CSAM and terrorism content is removed immediately, with a permanent ban and authorities notified.
  • DMCA compliance: a full notice-and-takedown process with counter-notices and strikes.
  • Private content is private: we don't read or scan private messages.
  • Due process: appeals are available for all decisions except CSAM/terrorism.
  • Transparency: annual reports on law enforcement requests and content removals.
  • Law enforcement compliance: valid legal orders only, with user notification when permitted.

Questions about this policy? legal@outprobe.com