Outprobe: Content Moderation, DMCA & Legal Compliance
Effective Date: April 4, 2026
Last Updated: March 20, 2026
Version: 1.0
Our Approach
Outprobe is a platform, not a publisher. We do not create, curate, or editorially control user-generated content. As on WhatsApp or Telegram, most content on Outprobe lives inside private or semi-private spaces (communities, circles, probes, DMs) controlled by their members.
Our role:
- Provide tools for community owners to moderate their own spaces
- Act swiftly on illegal content (CSAM, terrorism) when reported or detected
- Process DMCA/copyright takedowns as required by law
- Maintain a reporting system for all users
- Comply with law enforcement requests through proper legal channels
We do not:
- Pre-screen or pre-approve user content
- Algorithmically suppress or promote content based on editorial judgment
- Monitor private messages, probes, or hidden communities proactively
- Make editorial decisions about what opinions are acceptable
1. Content Responsibility
1.1 Who Is Responsible for What
| Party | Responsible For |
|---|---|
| Users | All content they create, upload, or share. Users are legally liable for their own posts, media, comments, messages, and listings. |
| Community Owners/Admins | Moderating their community. Setting and enforcing community rules. Removing content that violates their community guidelines. Responding to member reports within their community. |
| Community Moderators | Assisting owners in content moderation within their assigned community. Acting on reports and enforcing community rules. |
| Outprobe (Platform) | Providing moderation tools. Acting on reports of illegal content. Processing DMCA takedowns. Complying with law enforcement. Removing content that violates platform-wide Terms of Service. Maintaining the reporting system. |
1.2 Platform Liability Protection
Outprobe operates as an intermediary platform under:
| Jurisdiction | Law | Our Obligation |
|---|---|---|
| United States | Section 230, Communications Decency Act | Generally not liable for user-generated content. Must process DMCA notices (17 U.S.C. § 512). |
| European Union | Digital Services Act (DSA) | Act expeditiously on illegal content when notified. Provide transparency reports. Designate point of contact. |
| India | IT Act, Section 79 + IT Rules 2021 | Follow due diligence obligations. Appoint a Grievance Officer. Remove content within 36 hours of a government or court order. Remove CSAM within 24 hours. |
| United Kingdom | Online Safety Act | Risk assessments for illegal content. Swift removal when notified. |
| General | Local laws | Comply with applicable content laws in each jurisdiction we operate in. |
This protection is conditional: it requires us to act when notified. If we receive a valid report of illegal content and fail to act, we risk losing safe harbor protection.
2. Content Categories
2.1 Zero Tolerance — Removed Immediately, Reported to Authorities
This content is never allowed under any circumstances. It is removed immediately upon detection or report, and reported to relevant law enforcement.
| Content | Action | Reporting |
|---|---|---|
| Child Sexual Abuse Material (CSAM) | Immediate removal. Account permanently banned. All associated accounts investigated. | Reported to NCMEC (US), IWF (UK), and local law enforcement. |
| Terrorism / Violent Extremism | Immediate removal. Account permanently banned. | Reported to relevant counter-terrorism authorities. |
| Content facilitating imminent violence | Immediate removal. Account suspended pending investigation. | Reported to local law enforcement if credible threat. |
| Non-consensual intimate imagery | Immediate removal. Account suspended. | Reported to law enforcement if applicable. |
No warnings. No appeals for CSAM or terrorism content. Permanent ban.
2.2 Terms of Service Violations — Removed by Outprobe
Content that violates our Terms of Service but is not necessarily illegal:
| Content | Action |
|---|---|
| Spam / scam / phishing | Removed. Repeat offenders banned. |
| Impersonation | Removed. Account may be suspended. |
| Doxxing (sharing private info) | Removed immediately. Account suspended. |
| Hate speech (targeted harassment based on protected characteristics) | Removed. Warning or suspension based on severity. |
| Malware / malicious links | Removed immediately. Account banned. |
2.3 Community-Level Moderation — Handled by Owners
Everything else is up to community owners and their moderation teams:
| Content | Who Decides |
|---|---|
| Off-topic posts | Community owner/mods |
| Low-quality content | Community owner/mods |
| Arguments / heated discussions | Community owner/mods |
| NSFW content (in appropriate communities) | Community owner/mods (must be properly labeled) |
| Community-specific rules | Community owner/mods |
| Member behavior | Community owner/mods |
Outprobe does not intervene in community-level moderation decisions unless the content violates platform-wide Terms of Service or applicable law.
3. DMCA / Copyright Takedown Process
3.1 Overview
Outprobe complies with the Digital Millennium Copyright Act (DMCA) and equivalent copyright laws in other jurisdictions. We have a designated DMCA Agent and follow the standard notice-and-takedown process.
3.2 Filing a Copyright Takedown Notice
If you believe content on Outprobe infringes your copyright, send a notice to our designated DMCA Agent:
DMCA Agent Contact:
- Email: dmca@outprobe.com
- Subject line: DMCA Takedown Notice
Your notice must include:
- Identification of the copyrighted work — what is being infringed (title, URL of original, registration number if applicable)
- Identification of the infringing content — URL(s) or description of where the infringing content is on Outprobe
- Your contact information — full name, email address, phone number, physical address
- Good faith statement: "I have a good faith belief that the use of the material described above is not authorized by the copyright owner, its agent, or the law."
- Accuracy statement: "The information in this notice is accurate, and under penalty of perjury, I am the copyright owner or am authorized to act on behalf of the owner of an exclusive right that is allegedly infringed."
- Your signature — physical or electronic
Incomplete notices will not be processed; we will tell you what is missing so you can resubmit.
3.3 What Happens After a Valid Notice
| Step | Timeline | Action |
|---|---|---|
| 1 | Within 24 hours | We acknowledge receipt of the notice |
| 2 | Within 48 hours | We review the notice for completeness and validity |
| 3 | Within 72 hours | If valid, we remove or disable access to the content |
| 4 | Immediately after removal | We notify the user who posted the content, providing a copy of the notice |
| 5 | User has 10 business days | User may file a counter-notice (see below) |
3.4 Counter-Notice (Disputing a Takedown)
If your content was removed and you believe it was removed in error (fair use, you own the rights, misidentification), you may file a counter-notice.
Send to: dmca@outprobe.com
Subject line: DMCA Counter-Notice
Your counter-notice must include:
- Identification of the removed content — what was taken down and where it was located
- Your contact information — full name, email, phone number, physical address
- Consent to jurisdiction: "I consent to the jurisdiction of the Federal District Court for the district in which my address is located, or if outside the US, any judicial district in which Outprobe may be found."
- Good faith statement: "I swear, under penalty of perjury, that I have a good faith belief that the material was removed or disabled as a result of mistake or misidentification."
- Your signature — physical or electronic
3.5 What Happens After a Counter-Notice
| Step | Timeline | Action |
|---|---|---|
| 1 | Within 24 hours | We forward the counter-notice to the original complainant |
| 2 | Complainant has 10 business days | To file a court action seeking to restrain the user |
| 3 | If no court action filed | We restore the content within 14 business days of receiving the counter-notice |
| 4 | If court action filed | Content remains down pending court resolution |
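As a rough illustration, the counter-notice deadlines above can be computed with naive business-day arithmetic. This is a sketch only: it counts weekends but ignores public holidays, and the function name and dates are purely illustrative.

```python
from datetime import date, timedelta

def add_business_days(start, n):
    """Advance `start` by n business days (Mon-Fri only; holidays ignored)."""
    d = start
    while n > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # 0-4 are Monday through Friday
            n -= 1
    return d

# Hypothetical counter-notice received on Monday, April 6, 2026:
received = date(2026, 4, 6)
court_action_deadline = add_business_days(received, 10)  # complainant's window
restore_by = add_business_days(received, 14)             # restoration deadline
print(court_action_deadline, restore_by)  # → 2026-04-20 2026-04-24
```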
3.6 Copyright Strikes System
Outprobe maintains a copyright strike system to implement the repeat-infringer policy that the DMCA requires of service providers.
| Strike | Consequence |
|---|---|
| 1st strike | Content removed. Warning notification sent. User informed of copyright policies. |
| 2nd strike | Content removed. Formal warning. User's upload capabilities may be temporarily restricted. |
| 3rd strike | Content removed. Account suspended for 30 days. All content reviewed. |
| 4th strike | Account permanently terminated. |
Strike expiration: Strikes expire after 12 months of no further violations.
Strike removal: If a counter-notice is successful (content restored), the associated strike is removed.
Dedicated copyright page: Users can view their copyright strike history and status on their account settings page. This page shows:
- Number of active strikes
- Details of each strike (date, content, complainant)
- Strike expiration dates
- How to file a counter-notice
- Educational resources about copyright
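The strike lifecycle described above (expiration after 12 months, removal on a successful counter-notice, escalating consequences) can be sketched as a small routine. The data shapes and function names here are illustrative assumptions, not Outprobe's actual implementation.

```python
from datetime import date, timedelta

STRIKE_TTL = timedelta(days=365)  # strikes expire after 12 months

def active_strikes(strikes, today):
    """Count strikes that have neither expired nor been removed
    by a successful counter-notice."""
    return sum(
        1 for s in strikes
        if not s["removed"] and today - s["date"] < STRIKE_TTL
    )

def consequence(n):
    """Map an active-strike count to the consequence table in 3.6."""
    return {
        0: "no action",
        1: "warning",
        2: "formal warning; uploads may be restricted",
        3: "30-day suspension; content review",
    }.get(min(n, 4), "permanent termination")

# Hypothetical strike history:
strikes = [
    {"date": date(2024, 1, 10), "removed": False},  # expired (over 12 months old)
    {"date": date(2025, 6, 1), "removed": True},    # counter-notice succeeded
    {"date": date(2025, 11, 3), "removed": False},  # still active
]
print(consequence(active_strikes(strikes, date(2026, 3, 20))))  # → warning
```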
3.7 Designated DMCA Agent
As required by 17 U.S.C. § 512(c), our designated agent for receiving DMCA notices is:
Outprobe DMCA Agent
Email: dmca@outprobe.com
[Physical address to be registered with US Copyright Office]
4. Reporting System
4.1 How Users Report Content
Every piece of content on Outprobe has a "Report" option accessible via the three-dot menu:
| Reportable Content | Where to Find Report Button |
|---|---|
| Posts | Three-dot menu on post |
| Comments | Three-dot menu on comment |
| Media | Three-dot menu on media item |
| Messages (DM/Probe) | Long-press on message |
| Communities | Community page > three-dot menu |
| User profiles | Profile page > three-dot menu |
| Ads (promoted posts) | Three-dot menu on promoted post |
| Events | Three-dot menu on event |
| Listings (marketplace) | Three-dot menu on listing |
4.2 Report Categories
When reporting, users select a category:
| Category | Routed To | Response Time |
|---|---|---|
| CSAM / Child exploitation | Outprobe trust & safety (immediate) | Within 24 hours |
| Terrorism / violent extremism | Outprobe trust & safety (immediate) | Within 24 hours |
| Imminent threat of violence | Outprobe trust & safety (immediate) | Within 24 hours |
| Non-consensual intimate imagery | Outprobe trust & safety | Within 24 hours |
| Copyright infringement | Outprobe DMCA team | Within 72 hours |
| Spam / scam / phishing | Community moderators first, then Outprobe if unresolved | Within 7 days |
| Harassment / bullying | Community moderators first, then Outprobe if severe | Within 7 days |
| Hate speech | Community moderators first, then Outprobe if unresolved | Within 7 days |
| Impersonation | Outprobe trust & safety | Within 7 days |
| Self-harm / suicide | Outprobe trust & safety | Within 24 hours (with crisis resources provided) |
| Other violation | Community moderators | At moderator discretion |
4.3 Report Flow
User reports content
│
├─ CSAM / Terrorism / Imminent Violence / NCII
│ → Goes directly to Outprobe trust & safety
│ → Content hidden immediately pending review
│ → Reviewed within 24 hours
│ → If confirmed: removed permanently, account action taken, authorities notified
│
├─ Copyright
│ → Goes to Outprobe DMCA team
│ → Standard DMCA process (see Section 3)
│
├─ Spam / Harassment / Hate Speech / Other
│ → Goes to community owner/moderators first
│ → Moderators decide within their community rules
│ → If user disagrees with moderator decision → can escalate to Outprobe
│ → Outprobe reviews only for Terms of Service violations
│
└─ Self-harm
→ Goes to Outprobe trust & safety
→ Crisis resources shown to reporter and content creator
→ Content reviewed for policy compliance
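The routing in the diagram above amounts to a simple dispatch on the report category. The category strings and queue labels below are illustrative, not Outprobe's internal identifiers.

```python
# Zero-tolerance categories go straight to trust & safety and are
# hidden pending review; everything else routes per Section 4.3.
ZERO_TOLERANCE = {"csam", "terrorism", "imminent_violence", "ncii"}

def route_report(category):
    """Return a (queue, initial_action) pair for a report category."""
    if category in ZERO_TOLERANCE:
        return ("trust_and_safety", "hide_pending_review")
    if category == "copyright":
        return ("dmca_team", "standard_dmca_process")
    if category == "self_harm":
        return ("trust_and_safety", "show_crisis_resources")
    # spam, harassment, hate speech, other: community moderators first
    return ("community_moderators", "may_escalate_to_outprobe")

print(route_report("csam"))  # → ('trust_and_safety', 'hide_pending_review')
print(route_report("spam"))  # → ('community_moderators', 'may_escalate_to_outprobe')
```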
4.4 Reporter Privacy
- The identity of the reporter is never revealed to the reported user
- Reports are processed confidentially
- Reporters receive a notification when their report is resolved
- False reporting (abuse of the report system) may result in the reporter's account being restricted
5. Content Appeals
5.1 When Content Is Removed
If Outprobe removes your content, you will receive a notification explaining:
- What content was removed
- Which policy it violated
- What action was taken on your account (if any)
- How to appeal
5.2 Appeal Process
| Step | Timeline | Action |
|---|---|---|
| 1 | Anytime | User submits appeal via the notification or Settings > Account > Appeals |
| 2 | Within 7 days | Outprobe reviews the appeal (different reviewer than original decision) |
| 3 | After review | Decision communicated: upheld (content stays removed) or overturned (content restored) |
5.3 What Can Be Appealed
| Decision | Appealable? |
|---|---|
| Content removal (ToS violation) | Yes |
| Account suspension | Yes |
| Copyright strike | Yes (via DMCA counter-notice) |
| CSAM removal | No |
| Terrorism content removal | No |
| Permanent ban (repeated violations) | Yes (one appeal only) |
5.4 What Cannot Be Appealed
- CSAM removal (zero tolerance, no exceptions)
- Terrorism content removal (zero tolerance, no exceptions)
- Community-level moderation decisions (take it up with the community owner, not Outprobe)
6. Community Moderation Responsibilities
6.1 Owner Obligations
Community owners are the first line of moderation. By creating a community, owners agree to:
| Obligation | Description |
|---|---|
| Set community rules | Clearly define what is and isn't allowed in the community |
| Moderate content | Remove content that violates community rules |
| Respond to reports | Act on member reports within their community in a reasonable timeframe |
| Appoint moderators | For larger communities, appoint moderators to help |
| Enforce Outprobe ToS | Ensure community content doesn't violate platform-wide Terms of Service |
| Escalate illegal content | Report illegal content to Outprobe immediately (don't just delete it) |
6.2 Moderation Tools Provided
Outprobe provides community owners and moderators with:
| Tool | Function |
|---|---|
| Delete post/comment | Remove any content within the community |
| Mute member | Temporarily prevent a member from posting |
| Ban member | Remove a member and prevent them from rejoining |
| Report to Outprobe | Escalate content to Outprobe trust & safety |
| Moderation log | View all moderation actions taken in the community |
| Auto-moderation (future) | Keyword filters, link blocking, spam detection |
6.3 When Outprobe Overrides Community Moderation
Outprobe will override community moderation decisions only when:
| Situation | Action |
|---|---|
| Illegal content not removed by moderators | Outprobe removes content directly |
| Community systematically violates ToS | Community may be suspended or removed |
| Owner/moderators are complicit in ToS violations | Their accounts may face action |
| Law enforcement request | Outprobe complies regardless of community moderation |
7. Law Enforcement & Government Requests
7.1 Our Principles
- We comply with valid legal orders in applicable jurisdictions
- We minimize the data disclosed to only what is legally required
- We notify affected users when legally permitted to do so
- We do not provide bulk or warrantless access to user data
- We do not build "backdoors" into our systems
7.2 What We Require
| Request Type | What We Require |
|---|---|
| User data request | Valid subpoena, court order, or equivalent legal process |
| Content removal (government) | Valid court order or legal directive from applicable jurisdiction |
| Emergency disclosure | Imminent threat to life — we may disclose limited data without a court order to prevent harm |
| Preservation request | Valid legal request — we preserve (but do not disclose) specified data for 90 days |
7.3 What We Provide
| Request | Data Provided |
|---|---|
| Basic subscriber info (subpoena) | Name, email, account creation date, IP log (last 90 days) |
| Content data (court order) | Posts, comments, media created by the specified user |
| Message content (search warrant) | DM/probe messages of specified user (requires search warrant or equivalent) |
| Real-time interception | Not supported. We do not have the capability for real-time wiretapping. |
7.4 User Notification
- We notify users of law enforcement requests unless prohibited by law (e.g., gag order)
- If a gag order expires, we notify the user at that time
- Notification includes: what was requested, by whom, and what was disclosed
7.5 Transparency Report
Outprobe will publish an annual transparency report including:
- Number of law enforcement requests received
- Number of requests complied with (full and partial)
- Number of requests rejected
- Number of user accounts affected
- Number of content removals due to legal orders
- Breakdown by country
8. Private Content (DMs, Probes, Hidden Communities)
8.1 Our Position
Outprobe treats private content similarly to how WhatsApp and other messaging platforms treat messages:
| Principle | Implementation |
|---|---|
| We don't read private content | Outprobe staff does not access DMs, probe messages, or hidden community content unless required by law or a valid user report |
| We don't scan private content | No automated scanning of private messages for advertising, analytics, or general moderation purposes |
| We act on reports | If a participant in a private conversation reports content, we review the reported content only |
| We comply with law | Valid legal orders can compel disclosure of private content (see Section 7) |
| CSAM exception | If we become aware of CSAM in any context (public or private), we are legally required to report it to NCMEC and remove it |
8.2 What This Means in Practice
Private DM between User A and User B:
→ Outprobe does not read the content
→ Outprobe does not scan the content
→ If User A reports a message from User B:
→ Outprobe reviews the reported message(s) only
→ Takes action if it violates ToS
→ If law enforcement presents a valid warrant:
→ Outprobe provides the specified messages
→ Notifies the user unless legally prohibited
9. CSAM Detection and Reporting
9.1 Legal Obligations
Under US federal law (18 U.S.C. § 2258A), electronic service providers must report apparent CSAM to NCMEC's CyberTipline. This is not optional.
9.2 Our Process
| Step | Action |
|---|---|
| 1 | CSAM detected via a user report, or otherwise brought to the attention of Outprobe staff |
| 2 | Content immediately removed and preserved for law enforcement |
| 3 | User account permanently banned |
| 4 | Report filed with NCMEC CyberTipline within 24 hours |
| 5 | All relevant data preserved and available for law enforcement |
| 6 | Associated accounts investigated for additional violations |
9.3 No Appeals
There is no appeal process for CSAM-related removals and bans. This is a legal and moral absolute.
10. Terrorist and Violent Extremist Content (TVEC)
10.1 Definition
Content that:
- Promotes, incites, or glorifies terrorism or violent extremism
- Provides instructions for carrying out terrorist acts
- Recruits for terrorist organizations
- Depicts terrorist acts for the purpose of glorification
10.2 Our Process
| Step | Action |
|---|---|
| 1 | Content reported or detected |
| 2 | Content immediately removed |
| 3 | User account permanently banned |
| 4 | Reported to relevant authorities (varies by jurisdiction) |
| 5 | Hash of content shared with industry databases (e.g., GIFCT) if applicable |
10.3 No Appeals
There is no appeal process for terrorism-related removals and bans.
11. Self-Harm and Suicide Content
11.1 Our Approach
We treat self-harm content with care, not just enforcement:
| Action | Details |
|---|---|
| Crisis resources | When self-harm content is reported, both the reporter and the content creator are shown crisis helpline information |
| Content review | Content is reviewed by Outprobe trust & safety |
| Removal | Content that promotes or encourages self-harm is removed |
| Support | Content that is a genuine cry for help is handled sensitively — the user is directed to crisis resources, not punished |
| No automatic bans | Users expressing distress are not banned — they need help, not punishment |
11.2 Crisis Resources
When self-harm content is detected, we display localized crisis helpline information:
| Region | Resource |
|---|---|
| Global | International Association for Suicide Prevention: https://www.iasp.info/resources/Crisis_Centres/ |
| United States | 988 Suicide & Crisis Lifeline (call or text 988) |
| India | AASRA: 9820466626 |
| United Kingdom | Samaritans: 116 123 |
| European Union | EU-wide: 116 123 |
(Additional regions to be added)
12. Account Actions
12.1 Types of Account Actions
| Action | What It Means | Duration | Appealable? |
|---|---|---|---|
| Warning | Notification that content violated a policy | Permanent record | N/A |
| Content removal | Specific content is removed | Permanent | Yes |
| Temporary mute | User cannot post or comment | 24 hours to 30 days | Yes |
| Upload restriction | User cannot upload media | Until restriction is lifted | Yes |
| Account suspension | User cannot access the platform | 7 to 90 days | Yes |
| Permanent ban | Account terminated | Permanent | One appeal only |
12.2 Escalation Path
1st violation: Warning + content removal
2nd violation: Content removal + temporary mute (24-72 hours)
3rd violation: Content removal + extended mute (7-30 days)
4th violation: Account suspension (30 days)
5th violation: Permanent ban
Exception: Zero-tolerance content (CSAM, terrorism)
→ Immediate permanent ban, no escalation path
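The graduated ladder above, with its zero-tolerance exception, can be sketched as follows. This is illustrative only; real enforcement weighs severity case by case, and the function name is an assumption.

```python
def next_action(violation_count, zero_tolerance=False):
    """Sketch of the escalation path in 12.2 (illustrative only)."""
    if zero_tolerance:
        # CSAM, terrorism: immediate permanent ban, no escalation path
        return "permanent ban"
    if violation_count < 1:
        return "no action"
    ladder = {
        1: "warning + content removal",
        2: "content removal + temporary mute (24-72 hours)",
        3: "content removal + extended mute (7-30 days)",
        4: "account suspension (30 days)",
    }
    return ladder.get(violation_count, "permanent ban")

print(next_action(1))                       # → warning + content removal
print(next_action(2, zero_tolerance=True))  # → permanent ban
```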
12.3 Evading Bans
Creating new accounts to evade a ban is a violation of our Terms of Service. If detected:
- All associated accounts are permanently banned
- IP-based restrictions may be applied
- Legal action may be pursued in severe cases
13. Content on the Platform After Account Deletion
13.1 What Happens to Content
When a user deletes their account:
| Content Type | What Happens |
|---|---|
| Posts in communities | Anonymized — author shown as "Deleted User". Content remains for community context. |
| Comments | Anonymized — author shown as "Deleted User" |
| Direct messages | Removed from the deleted user's side. Other participant retains their copy. |
| Probe messages | Anonymized — author shown as "Deleted User" |
| Media uploads | Removed from platform storage within 30 days |
| Profile data | Permanently deleted within 30 days |
| Community ownership | Transferred to next admin. If no admin, community is deleted. |
13.2 Why Content Is Anonymized, Not Deleted
Deleting all posts from communities would destroy conversation context and harm other users who participated in discussions. Anonymization preserves the community's content while removing all connection to the deleted user.
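The per-content-type handling in 13.1 can be expressed as a single dispatch. The content-type names, record shapes, and handler are illustrative assumptions, not Outprobe's data model.

```python
DELETED_AUTHOR = "Deleted User"

def on_account_deletion(item):
    """Sketch of how one content item is handled when its author
    deletes their account (per the table in 13.1)."""
    kind = item["type"]
    if kind in {"post", "comment", "probe_message"}:
        # Anonymize: preserve community context, sever the link to the user.
        return {**item, "author": DELETED_AUTHOR, "author_id": None}
    if kind in {"media", "profile"}:
        return None  # purged from platform storage within 30 days
    if kind == "dm":
        # Removed from the deleted user's side only; the other
        # participant's copy is untouched (handled on their side).
        return None
    return item

post = {"type": "post", "author": "alice", "author_id": 42, "body": "hi"}
print(on_account_deletion(post)["author"])  # → Deleted User
```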
14. Immunity and Indemnification
14.1 Platform Immunity
Outprobe is an intermediary platform. We are not the author, creator, or endorser of any user-generated content. Under applicable safe harbor laws, we are not liable for content posted by users, provided we:
- Do not exercise editorial control over user content
- Act on valid takedown notices
- Remove illegal content when notified
- Maintain a repeat infringer policy
- Comply with applicable legal obligations
14.2 User Responsibility
By using Outprobe, users agree that:
- They are solely responsible for all content they post
- They will not post content that infringes third-party rights
- They will not post illegal content
- They indemnify Outprobe against claims arising from their content
- They accept that their content may be removed if it violates policies
14.3 Community Owner Responsibility
Community owners accept additional responsibility:
- They are responsible for moderating their community
- They must take reasonable action on reports within their community
- They must escalate illegal content to Outprobe
- They are not employees or agents of Outprobe
15. Contact Information
| Purpose | Contact |
|---|---|
| DMCA takedown notices | dmca@outprobe.com |
| DMCA counter-notices | dmca@outprobe.com |
| Report illegal content | report@outprobe.com or in-app Report button |
| Law enforcement requests | legal@outprobe.com |
| Content appeals | In-app appeals system or appeals@outprobe.com |
| General support | support@outprobe.com |
| Grievance Officer (India, IT Rules) | grievance@outprobe.com |
| DSA contact (EU) | dsa@outprobe.com |
16. Compliance Calendar
| Requirement | Frequency | Details |
|---|---|---|
| Transparency report | Annual | Law enforcement requests, content removals, DMCA stats |
| NCMEC reporting | As needed | Within 24 hours of CSAM detection |
| DSA compliance report (EU) | Annual | If applicable based on user count |
| India IT Rules compliance report | Monthly | If applicable based on Indian user count |
| DMCA agent registration | Update as needed | US Copyright Office registration |
Summary
| Principle | Implementation |
|---|---|
| Users own their content | Users are responsible for what they post |
| Community owners moderate | First line of defense, not Outprobe |
| Outprobe is the platform | We provide tools, not editorial control |
| Zero tolerance for illegal content | CSAM, terrorism — immediate removal, permanent ban, authorities notified |
| DMCA compliance | Full notice-and-takedown process with counter-notice and strikes |
| Private content is private | We don't read or scan private messages |
| Due process | Appeals for all decisions except CSAM/terrorism |
| Transparency | Annual reports on law enforcement requests and content removals |
| Law enforcement compliance | Valid legal orders only, user notification when permitted |
Questions about this policy? legal@outprobe.com