Acceptable Use Guidelines Enforcement Report


At Zoom, we are committed to bringing happiness to our users while maintaining a culture of trust, safety, and respect. In October 2020, we published our first Acceptable Use Guidelines, formerly known as “Community Standards,” which govern the Zoom platform. These Guidelines describe the types of content and behavior we prohibit. These Guidelines and our Terms of Service help keep Zoom vibrant and reduce the risk of harm and disruption on our platform. For more information on how we approach Trust and Safety and enforce our Acceptable Use Guidelines, please visit our Safety Center.

Zoom’s Acceptable Use Guidelines are updated on an as-needed basis. Our Policy team and Appeals Panel regularly review each standard to ensure that they fit our products, serve our users, and are easy for our analysts to apply. We also review and update them based on regular engagement with third-party stakeholders, such as civil society organizations.

After review and approval, we post the update in our Acceptable Use Guidelines change log, which you can access here.

This Acceptable Use Guidelines Enforcement Report features data on the number of reports our Trust & Safety team has processed in a particular month, as opposed to the total number of reports we have received in a particular month. The Report also features data on the actions taken in response to violations of our Acceptable Use Guidelines, which could include suspending an event or host, issuing a strike against a reported user, etc. This information is updated monthly.

We publish a separate Government Requests Transparency Report, which you can access here.

Because the reporter chooses the “issue type,” the actual issue type may sometimes differ from the reported one.

Reports Actioned and Appeals Actioned



Issue Type

Issue types are further defined in our Acceptable Use Guidelines.

Special Issues


We permanently suspend accounts we determine have transmitted, displayed, stored, shared, or promoted Child Sexual Abuse Material (CSAM) on our platform. We have a specific section on our trust form allowing users to report child sexual exploitation and abuse (CSEA). On the back end, these cases are prioritized for review by our analysts. Additionally, we have a dedicated National Center for Missing & Exploited Children (NCMEC) API that allows us, after human review, to report instances of CSAM directly from our dashboard to the NCMEC. In this reporting period, we sent

CyberTipline reports to NCMEC.


At Zoom, we take our responsibility to prevent the spread of child sexual abuse material (CSAM) seriously. To help us achieve this, we’ve been using PhotoDNA to scan certain customer content data, such as files uploaded to persistent chat, Zoom Room backgrounds, and profile pictures.

PhotoDNA compares the hash value of a file uploaded into Zoom’s system against hash values assigned to known CSAM. If a match is found, a report is generated for our Trust and Safety team to review and determine whether the report is a true positive match. This human review ensures accuracy and weeds out false positives.
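The match-then-review flow described above can be sketched roughly as follows. This is a simplified illustration only: PhotoDNA’s actual perceptual-hash algorithm is proprietary, so a cryptographic hash stands in for it here, and the function names, queue structure, and placeholder hash value are all hypothetical.

```python
import hashlib

# Placeholder for a vetted list of hash values of known CSAM. In a real
# deployment these come from trusted sources; PhotoDNA uses proprietary
# perceptual hashes, not the cryptographic hash used here for illustration.
KNOWN_HASHES = {"0" * 64}

def hash_file(data: bytes) -> str:
    """Compute a hash for an uploaded file (illustrative stand-in)."""
    return hashlib.sha256(data).hexdigest()

def check_upload(data: bytes, review_queue: list) -> bool:
    """If the file's hash matches a known hash, flag it for human review.

    A match generates a report rather than triggering automatic action:
    an analyst must confirm the match is a true positive, which is how
    false positives are weeded out.
    """
    digest = hash_file(data)
    if digest in KNOWN_HASHES:
        review_queue.append({"hash": digest, "status": "pending_review"})
        return True
    return False
```

The key design point from the text is that a hash match only enqueues a report for review; no enforcement action is taken until a human confirms it.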

Between March 2022 and January 2023, we received over 11,000 reports related to child sexual exploitation. Most of these reports were generated by PhotoDNA hash matches. After review by our Trust and Safety team, we dismissed 96.85% of these reports as false positives.

Due to this high false positive rate, we are currently working with various stakeholders to evaluate new CSAM hash-matching technology and other metadata-driven methodologies for detecting this heinous form of abuse.

Violent Extremist Groups and Glorification of Violence

In partnership with the Federal Bureau of Investigation (FBI), we utilize an API to report content related to violent extremist groups and/or depicting glorification of violence directly to the FBI’s National Threat Operations Center (NTOC). In this reporting period, we escalated

cases to the NTOC.

To learn more about how we work to address abuse on our platform, please visit Zoom’s Safety Center.

Action Taken

Blocked User(s): The user was deactivated and/or blocked. They are prohibited from using Zoom unless they successfully appeal the decision. Appeals for certain issue types, such as those involving CSAM or references to terrorism and violent extremism, will not be granted.

Disable App: A Zoom Marketplace app was disabled. We removed its listing from the Marketplace so the app is no longer searchable, prevented new users from downloading it, and revoked access for all current users.

Dismissed: No action was taken. A report could be dismissed for various reasons, including the report being accidental, a false positive, or lacking sufficient evidence of a violation. Definitions and an example for each reason are included below:

  • Accidental Report by User: it is clear the reporter did not intend to submit the report or that the information in the report needed correction.
    • Example:
      • Report is submitted by mistake or for the wrong user.
  • Action Not Listed: all available actions listed do not match the type of report or unique information submitted.
    • Example:
      • The report is submitted, and the action required on the case was taken in a system or tool outside of a Trust and Safety form.
  • False Positive: a case where the report appears to be a violation but is later determined to be incorrect or submitted with information that does not match the evidence.
    • Example:
      • The report filed contains images that were flagged incorrectly via internal solutions.
  • Incorrect Form: the report was submitted through the wrong channels or report category.
    • Example:
      • The report was submitted using our abusive content or behavior form while trying to report an account as compromised.
  • Insufficient Evidence or Information Provided: the report is missing information, and action cannot be taken.
    • Example:
      • The report is submitted with little to no information, impeding a full investigation.
  • Internal Test: the report was submitted for internal testing purposes.
    • Example:
      • A report was automatically generated to ensure newly deployed code worked as intended.
  • No Violation: the reported content and/or behavior does not violate Zoom’s Terms of Service and Acceptable Use Guidelines.
    • Example:
      • A report is submitted for someone joining a waiting room with no indication of a TOS violation.
  • Not Actionable: unable to take action on a report due to technical limitations with internal tooling or reporting.
    • Example:
      • A report was filed against an application that was authorized by one of the meeting’s participants.*
      • Technical limitations with internal tooling or reporting.
  • Spam/Phishing: the report is not Trust and Safety related and points to external websites or links.
    • Example:
      • The report was submitted with a link to see a host’s external website that is not relevant to Zoom.

*Note about applications: During a meeting, a participant can allow the use of an application built for the Zoom platform (known as Marketplace apps). Some Marketplace apps will appear as a participant in the meeting. If the Marketplace app accesses meeting or webinar content, such as video, audio, chat, and/or meeting files during a meeting, it will appear in the Active Apps Notifier in the upper left corner of the meeting window. Zoom developed the Active Apps Notifier to help you make informed decisions about how you use Zoom. During meetings where a Marketplace app has access to meeting content, you may mute your microphone, turn off your video, refrain from sending chats, or leave the meeting entirely. Zoom is committed to fostering a developer-friendly environment and enabling the creation of innovative applications on its platform.

If you encounter any problems or have any concerns about a Marketplace app, please don’t hesitate to let us know through our reporting function. For more information on reporting a Marketplace app, please read here.

Duplicate: Two or more reports about the same issue from the same reporter. In this instance, we consolidate duplicate reports and take action once.
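The consolidation of duplicate reports could look roughly like the sketch below. This is a hypothetical illustration; the field names and data shapes are assumptions, not Zoom’s actual internal schema.

```python
def consolidate(reports):
    """Collapse reports about the same issue from the same reporter.

    Reports are keyed on (reporter, reported user, issue type): later
    duplicates are folded into the first report so that action is taken
    only once. Field names here are illustrative.
    """
    seen = {}
    for report in reports:
        key = (report["reporter_id"], report["reported_user_id"], report["issue_type"])
        if key in seen:
            seen[key]["duplicate_count"] += 1
        else:
            seen[key] = {**report, "duplicate_count": 0}
    return list(seen.values())
```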

Removed from Zoom Event Lobby: One or more Zoom Event Participants were removed from the event’s lobby.

Removed Message(s) (Lobby ZE): One or more messages were removed from a Zoom Event lobby.

Remove ZE Recording: One or more recordings were removed from a Zoom Event Account.

Suspend App: A Zoom Marketplace app was suspended. We removed its listing from the Marketplace so the app is no longer searchable and prevented new users from downloading it, but current users are still allowed to use the app.

Suspended Developer: The developer was deactivated and/or blocked from our Marketplace. They are prohibited from using Zoom’s Marketplace unless they successfully appeal the decision.

Suspended Event: An event was ended or prohibited from taking place.

Suspended Meeting: A Zoom Meeting was ended or prevented from taking place.

Suspended User from ZE: One or more Zoom Events users were deactivated and/or blocked.

Suspended Whiteboard: A user’s Zoom Whiteboard function was suspended.

OnZoom/Zoom Events Host(s) Suspended: One or more OnZoom/Zoom Events hosts were deactivated and/or blocked.

Striked User(s): The user received a strike. Strikes expire after 180 days and do not affect the user’s ability to use the platform unless they accumulate. Depending on the reason for the strike, one or two additional strikes within the same 180-day period will result in a suspension against the user.
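The strike mechanics described above can be modeled roughly as follows. This is a hypothetical sketch: the per-issue-type suspension thresholds and Zoom’s internal systems are not public, so the default threshold here is illustrative only.

```python
from datetime import datetime, timedelta

# Per the policy, strikes expire after 180 days.
STRIKE_TTL = timedelta(days=180)

def active_strikes(strike_dates, now):
    """Return only strikes issued within the last 180 days."""
    return [d for d in strike_dates if now - d < STRIKE_TTL]

def should_suspend(strike_dates, now, threshold=3):
    """Suspend once active strikes reach the threshold.

    The policy says the threshold depends on the reason for the strike:
    one or two additional strikes within the same 180-day window (i.e. a
    total of 2 or 3 active strikes) results in suspension. The default
    of 3 here is an assumption for illustration.
    """
    return len(active_strikes(strike_dates, now)) >= threshold
```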

Reporter Country

This information is based on the reporter’s IP address when the report was submitted. The IP address usually corresponds with where a reporter is geolocated unless they use a virtual private network (VPN) or proxy server.

Our Review Process

When a user reports a violation of our Acceptable Use Guidelines or Terms of Service, our Trust and Safety team will investigate and, if warranted, take action as quickly as possible.

Our tiered review process starts with a team of analysts who review different kinds of reports and flags. Reports are first divided into queues by issue or reporter type. Team members rotate among the different queues to ensure each individual reviewer has experience across reporting categories. As we resolve reports, the report type and resolution information feeds into a dashboard. The dashboard gives us meaningful data to spot trends, test abuse-prevention tools, or see spikes in demand so we can refine our processes over time.
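The routing and dashboard flow described above can be sketched as follows. This is a simplified, hypothetical illustration; the queue keys, field names, and aggregation are assumptions, not Zoom’s actual tooling.

```python
from collections import defaultdict, Counter

def route_report(report, queues):
    """Place an incoming report into a queue keyed by its issue type."""
    queues[report["issue_type"]].append(report)

def resolution_stats(resolved):
    """Aggregate (issue_type, resolution) counts for a dashboard view.

    Feeding report type and resolution into a counter like this is what
    makes it possible to spot trends and spikes in demand over time.
    """
    return Counter((r["issue_type"], r["resolution"]) for r in resolved)
```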

Analysts escalate difficult or ambiguous cases to higher tiers. The highest tier is our Appeals Panel. Appeals Panelists serve for one-year terms and come from diverse backgrounds, experience levels, tenures, and departments at Zoom.

Users who have been suspended from Zoom may appeal the action here. Appeals for certain issue types, such as those involving CSAM or references to terrorism and violent extremism, will not be granted.

The dashboard below features data on the number of appeals we processed. “Appeal Approved” means that the user’s access to the platform was reinstated, and they have read and agreed to adhere to Zoom’s Acceptable Use Guidelines and Terms of Service. “Appeal Rejected” means that the user’s access to the platform was not reinstated, either because the user had been granted a prior appeal and was cited for the same violation again, or because the violation involved an issue type that is not appealable.