Tip
Did you know you can try the features in Microsoft Defender for Office 365 Plan 2 for free? Use the 90-day Defender for Office 365 trial at the Microsoft Defender portal trials hub. Learn about who can sign up and trial terms on Try Microsoft Defender for Office 365.
In organizations with Microsoft Defender for Office 365 Plan 1 or Plan 2 or Microsoft Defender XDR, admins can decide whether users can report messages or calls in Microsoft Teams. The following clients support reporting:
- The Microsoft Teams desktop client.
- The Microsoft Teams Web App.
- The Microsoft Teams app for iOS/iPadOS: Version 7.15 or later (messages only).
- Microsoft Teams for Android: Version 1416/1.0.0.2025153104 or later (messages only).
Users can report Teams messages from chats, channels, and meeting conversations as malicious or non-malicious. They can also report Teams calls from their call history as scam or not scam. Admins can view the Teams messages and calls that users report.
Note
User reporting of calls and messages in Teams is not supported in U.S. Government organizations (Microsoft 365 GCC, GCC High, and DoD).
For information about user reporting of email messages, see Report suspicious email messages to Microsoft.
User reporting settings for Teams items
User reporting of messages or calls in Teams is controlled by two separate settings:
In the Teams admin center: On by default and controls whether users are able to report items from Teams. When this setting is turned off, users can't report items within Teams, so the corresponding setting in the Microsoft Defender portal is irrelevant.
In the Microsoft Defender portal: On by default for new tenants; existing tenants need to enable it. If user reporting of messages is turned on in the Teams admin center, it also needs to be turned on in the Defender portal for user reported messages to show up correctly on the User reported tab on the Submissions page.
Important
When a user reports a Teams message or call to Microsoft, all data directly associated with the item is copied and included in ongoing algorithm reviews. This information includes:
- Message content
- Headers
- Attachments
- Routing metadata
- Call metadata
- Any other related information
The submission might also include contextual data for the reported message. Specifically, up to fifteen messages before and after the reported message might also be shared for analysis.
Microsoft treats this feedback as your organization's authorization to analyze the submitted information to improve hygiene algorithms. Submitted content is stored in secured, compliance-audited data centers located in the United States and is deleted as soon as it's no longer required.
Microsoft personnel might read submitted messages, calls, and files, which is typically not permitted for Teams items in Microsoft 365. However, your submission remains confidential between you and Microsoft and isn't shared with any third party during the review process. Microsoft might also use AI to evaluate and generate responses tailored to your submission. Microsoft doesn't use customer data to train any generative AI foundation models, except pursuant to the customer's documented instructions.
Turn off or turn on user reporting in the Teams admin center
To view or configure this setting, you need to be a member of the Global Administrator or Teams Administrator roles. For more information about permissions in Teams, see Use Microsoft Teams administrator roles to manage Teams.
Important
Microsoft strongly advocates for the principle of least privilege. Assigning accounts only the minimum permissions necessary to perform their tasks helps reduce security risks and strengthens your organization's overall protection. Global Administrator is a highly privileged role that you should limit to emergency scenarios or when you can't use a different role.
1. In the Teams admin center, go to the Settings & policies page at https://admin.teams.microsoft.com/one-policy/settings.
2. On the Settings & policies page, select either the Global (Org-wide) default settings tab for all users or the Custom policies for users & groups tab for specific users.
3. On the tab, go to the Messaging section and select Messaging. If you selected the Custom policies for users & groups tab in the previous step, do one of the following steps to edit the specific policy:
   - Click the policy name in the Name column.
   - Click anywhere in the row other than the Name column, and then select the Edit action that appears.
4. On the policy details page that opens, find the Report a security concern toggle and verify the value is On. If the value is Off, move the toggle to On, and then select Save.
5. In the Teams admin center, go to the Messaging settings page at https://admin.teams.microsoft.com/one-policy/settings/messaging.
6. On the Messaging settings page, go to the Messaging safety section, find the Report incorrect security detections toggle, and verify the value is On. If the value is Off, move the toggle to On, and then select Save.
7. In the Teams admin center, go to the Calling settings page at https://admin.teams.microsoft.com/one-policy/settings/calling.
8. On the Calling settings page, go to the General section, find the Report a call toggle, and verify the value is On. If the value is Off, move the toggle to On, and then select Save.
For more information about messaging policies in Teams, see Manage messaging policies in Teams. For more information about calling policies in Teams, see Manage calling policies in Teams.
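If you manage Teams with PowerShell, the same toggles can also be inspected and set through the MicrosoftTeams PowerShell module. The following is a minimal sketch, not a definitive reference: `Get-CsTeamsMessagingPolicy` and `Set-CsTeamsMessagingPolicy` are documented cmdlets, but the parameter name used here for the Report a security concern toggle is an assumption; verify it against your installed module version before relying on it.

```powershell
# Requires the MicrosoftTeams module and a Teams admin account.
Connect-MicrosoftTeams

# Inspect the org-wide (Global) messaging policy and its current values.
Get-CsTeamsMessagingPolicy -Identity Global

# Turn on "Report a security concern" in the Global messaging policy.
# NOTE: AllowSecurityEndUserReporting is an assumed parameter name; confirm it
# with: Get-Help Set-CsTeamsMessagingPolicy -Detailed
Set-CsTeamsMessagingPolicy -Identity Global -AllowSecurityEndUserReporting $true
```

Scripting the setting is mainly useful when you maintain multiple custom messaging policies and want to verify the toggle across all of them with `Get-CsTeamsMessagingPolicy` in one pass.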
Turn off or turn on user reporting in the Defender portal
To modify this setting in the Defender portal, you need to be a member of the Organization Management or Security Administrator role groups. For more information about permissions in the Defender portal, see Permissions in the Microsoft Defender portal.
The value of this setting is meaningful only if reporting is turned on in the Teams admin center as described in the previous section.
1. In the Microsoft Defender portal at https://security.microsoft.com, go to Settings > Email & collaboration > User reported settings. To go directly to the User reported settings page, use https://security.microsoft.com/securitysettings/userSubmission.
2. On the User reported settings page, go to the Microsoft Teams section and find the Monitor reported items in Microsoft Teams setting.
As previously described, this setting is turned on by default for new tenants, and existing tenants need to enable it. Typically, you leave it turned on if message reporting is also turned on in the Teams admin center. Learn more about reported message destinations.
For more information about user reported items settings in the Defender portal, see User reported settings.
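The Defender portal's user reported settings are also surfaced in Exchange Online PowerShell through the report submission policy. This is a hedged sketch, assuming the Teams-related property is named `ReportChatMessageEnabled`; check the actual property names in your tenant's `Get-ReportSubmissionPolicy` output before setting anything.

```powershell
# Requires the ExchangeOnlineManagement module and a Security Administrator
# or Organization Management account.
Connect-ExchangeOnline

# View the current user reported settings (Report* properties).
Get-ReportSubmissionPolicy | Format-List Report*

# Turn on monitoring of user reported Teams messages in the default policy.
# NOTE: ReportChatMessageEnabled is an assumed parameter name; verify it
# against the Get-ReportSubmissionPolicy output above before running.
Set-ReportSubmissionPolicy -Identity DefaultReportSubmissionPolicy -ReportChatMessageEnabled $true
```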
How users report items in Teams
Tip
- Reported items remain visible to users.
- Users can report the same items multiple times.
- Message senders aren't notified their messages were reported.
- The caller isn't notified that their calls were reported.
Report malicious messages in Teams
1. In the Microsoft Teams client, hover over the malicious message without selecting it, and then select More options > Report this message.
2. In the Report this message dialog that opens, verify Security risk - Spam, phishing, malicious content is selected, and then select Report.

Note

If reporting for Microsoft Purview Communication Compliance is turned off, users might not see the dropdown list to select Security risk - Spam, phishing, malicious content. Instead, they see a confirmation pop-up.

3. In the confirmation dialog that opens, select Close.
Report non-malicious messages in Teams
1. In the Teams chat or channel, hover over the message without selecting it, and then select More options > Report this message.
2. In the Report this message dialog that opens, select Not a security concern, and then select Report.
Report calls in Teams
Currently, only completed or missed one-to-one calls can be reported.
1. In the Microsoft Teams client, go to the call history view, and then select More options > Report call.
2. In the Report call dialog that opens, verify Security concern - Spam, phishing, malicious call is selected, and then select Report.
3. In the confirmation dialog that opens, select Close.
What happens after a user reports an item from Teams?
What happens to a user reported Teams item depends on the settings in the Reported items destinations section on the User reported settings page at https://security.microsoft.com/securitysettings/userSubmission:
- Send the reported items to > Microsoft and my reporting mailbox: The default user reporting mailbox is the Exchange Online mailbox of the global admin. The value for older Microsoft 365 organizations is unchanged.
- Send the reported items to > Microsoft only
- Send the reported items to > My reporting mailbox only
For more information, see User reported settings.
Notes:
For shared channel user reports, the report goes to the organization that owns/created the channel.
If you select Send the reported items to > My reporting mailbox only, reported items don't go to Microsoft for analysis unless an admin manually submits the item from the User reported tab on the Submissions page at https://security.microsoft.com/reportsubmission?viewid=user. Reporting items to Microsoft is an important part of training the service to help improve the accuracy of filtering (reduce false positives and false negatives). That's why we use Send the reported items to > Microsoft and my reporting mailbox as the default.
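The Send the reported items to destination can also be scripted through the report submission rule in Exchange Online PowerShell. A minimal sketch follows; `reports@contoso.com` is a placeholder address, and the default rule name is an assumption to verify with `Get-ReportSubmissionRule` first.

```powershell
# Requires the ExchangeOnlineManagement module.
Connect-ExchangeOnline

# View the current reporting mailbox destination, if a rule exists.
Get-ReportSubmissionRule | Format-List Name, SentTo, State

# Point user reported items at a custom reporting mailbox.
# NOTE: reports@contoso.com is a placeholder; use an Exchange Online mailbox
# in your own tenant. Verify the rule name from the output above.
Set-ReportSubmissionRule -Identity DefaultReportSubmissionRule -SentTo reports@contoso.com
```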
Regardless of the Send the reported items to setting, the following actions occur when a user reports a Teams item:
- Metadata from the reported Teams items (for example, sender/caller, recipients, reported by, and item details) is available on the User reported tab on the Submissions page.
- The alert policies named Teams message reported by user as a security risk, Teams message reported by user as a not security risk, Teams call reported by user as a security risk, and Teams call reported by user as a not security risk generate alerts by default. For more information, see Manage alerts.
To view the corresponding alert for a user reported item in Teams, go to the User reported tab on the Submissions page, double-click the item to open the submission flyout, select More options, and then select View alert.
View and triage user reported items in Teams
As previously described, information about user reported items in Teams is available on the User reported tab on the Submissions page at https://security.microsoft.com/reportsubmission?viewid=user. For more information, see View user reported items to Microsoft.