Scan and protect Drive files using DLP rules

As a super administrator for G Suite Enterprise, you can prevent users from sharing sensitive content in Google Drive or Google Team Drive with people outside your organization. To do this, you define Data Loss Prevention (DLP) rules. DLP for Drive scans your organization's Drive and Team Drive files for sensitive content and triggers policy-based actions when any is detected. Available actions include sending an email to super administrators, sending an email to the user who created, edited, or uploaded the file, and blocking sharing of the file.

How rules work

You can work with a predefined template or create your own. You assign a rule to your whole domain, an organizational unit, or a group in Google Groups. Only whole domain level rules apply to Google Team Drive files. You can also exempt a group in Google Groups.

If a sensitive item is detected, you determine what action to take. For details, see How to define a rule.

Google Drive Files—Currently, DLP rules are only available for Drive files and Google Team Drive files.

Conditions—The variables that affect your rule.

Users

Choose Apply to organizational unit, then select an organizational unit from the pull-down menu. DLP rules apply to Google Team Drive files only when they're applied to the root organizational unit (OU).

Click the Add button to add more. This rule will apply to files owned by users in any of the selected organizational units.

Choose Apply to group then enter a group name.

Click the Add button to add more. This rule will apply to files owned by users in any one of the selected groups.

Choose Exempt group then enter a group name.

Click the Add button to add more. This rule will be ignored for files belonging to users in any of the selected groups.
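The scoping behavior above can be sketched as follows. This is a conceptual illustration only, not Google's implementation; the `User`, `Rule`, and `rule_applies` names are hypothetical. The key point is that an exempt group overrides any matching OU or group.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    org_units: set   # OUs the user belongs to
    groups: set      # Google Groups the user belongs to

@dataclass
class Rule:
    org_units: set = field(default_factory=set)      # "Apply to organizational unit"
    groups: set = field(default_factory=set)         # "Apply to group"
    exempt_groups: set = field(default_factory=set)  # "Exempt group"

def rule_applies(owner: User, rule: Rule) -> bool:
    # Exempt groups take precedence: the rule is ignored for their members.
    if owner.groups & rule.exempt_groups:
        return False
    # Otherwise the file owner must be in an applied OU or an applied group.
    return bool(owner.org_units & rule.org_units or owner.groups & rule.groups)

# A user in the Sales OU who is also in an exempted "contractors" group:
alice = User(org_units={"/Sales"}, groups={"contractors"})
rule = Rule(org_units={"/Sales"}, exempt_groups={"contractors"})
print(rule_applies(alice, rule))  # False: the exemption overrides the OU match
```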

Confidence threshold—Set whether to trigger the action if the detector in the file meets a medium confidence threshold (default), or only if the detector meets a high confidence threshold.
The confidence threshold indicates how likely it is that the detected file content meets your compliance criteria.
A medium threshold means that more files trigger the action.
A high threshold can result in fewer false positives (fewer files triggering the action that don't require it), but possibly more false negatives (more files being shared that should have triggered the action).
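The threshold trade-off above can be illustrated with a minimal sketch. The numeric cutoffs here are invented for illustration, not Google's actual values: each detected item carries a confidence score, and the action fires only when the score clears the configured threshold.

```python
THRESHOLDS = {"medium": 0.5, "high": 0.8}  # illustrative cutoffs only

def triggers(detector_confidence: float, threshold: str = "medium") -> bool:
    """Return True when a detector match should trigger the rule's action."""
    return detector_confidence >= THRESHOLDS[threshold]

# Confidence scores for three detected items in a file:
matches = [0.55, 0.65, 0.9]
print(sum(triggers(c, "medium") for c in matches))  # 3: all items trigger
print(sum(triggers(c, "high") for c in matches))    # 1: only the strongest match
```

Lowering the threshold catches more true matches at the cost of acting on weaker, possibly spurious ones.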

Actions—What the rule does when it finds an issue (it always flags the file).

Block external sharing—Ensures that any files a user has created, edited, or uploaded with sensitive content are blocked from sharing with external users (no one outside your organization can see the file contents).

Warn on external sharing—Displays a subwindow that informs the user that they have created, edited, or uploaded a file with sensitive content.
They'll need to click OK to close this subwindow.

Send email to Super Administrators—Sends an email to inform the super administrator that a user has created, edited, or uploaded a file with sensitive content.
An email is sent whenever the type of sensitive content in the file changes.
A maximum of 25 emails is sent in any 2-hour period.
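A cap like "25 emails in 2 hours" behaves like a sliding-window rate limit. The sketch below is purely conceptual (Google's actual throttling mechanism isn't documented here); it shows how such a quota suppresses notifications once the window is full.

```python
from collections import deque

class AdminEmailLimiter:
    """Cap notification emails at `limit` per `window_s` seconds
    (25 per 2 hours in the text above). Conceptual sketch only."""

    def __init__(self, limit: int = 25, window_s: int = 2 * 60 * 60):
        self.limit = limit
        self.window_s = window_s
        self.sent = deque()  # timestamps of recently sent emails

    def try_send(self, now: float) -> bool:
        # Drop timestamps that have fallen out of the window.
        while self.sent and now - self.sent[0] >= self.window_s:
            self.sent.popleft()
        if len(self.sent) >= self.limit:
            return False  # over quota: suppress this email
        self.sent.append(now)
        return True
```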

Note: If you don't choose an action, any matching files will only be "flagged" and will be visible in Rules Audit. External members of a Google Team Drive can't access files flagged with the "Block external sharing" action.

Tip: If you notice a high number of false positives, as administrator, create a pair of rules. In the first rule, add a strong action such as "Block external sharing" with the confidence threshold set to high. Then create a second rule with a medium confidence threshold, and add the "Warn on external sharing" action to it.
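The two-rule setup in the tip amounts to an escalating policy: block only on high-confidence matches and warn on medium-confidence ones. A minimal sketch, with illustrative cutoffs that stand in for the high and medium thresholds:

```python
def apply_rules(confidence: float) -> str:
    """Evaluate the paired rules from the tip: block on high confidence,
    warn on medium confidence, do nothing otherwise. Cutoff values are
    hypothetical, not Google's."""
    HIGH, MEDIUM = 0.8, 0.5
    if confidence >= HIGH:
        return "block external sharing"
    if confidence >= MEDIUM:
        return "warn on external sharing"
    return "no action"

print(apply_rules(0.9))  # block external sharing
print(apply_rules(0.6))  # warn on external sharing
print(apply_rules(0.3))  # no action
```

Likely false positives are downgraded from a hard block to a warning the user can acknowledge, while strong matches are still blocked outright.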

Date and time range—A start and end date and time for listing events.
Each entry in the log is associated with a single event.

To export the report data directly to a Google Sheets file within Drive or to download a CSV file with the report data, click Download. Both the exported Google Sheets file and the downloaded CSV file can contain a maximum of 200,000 cells. The maximum number of rows depends on the number of selected columns.

Does DLP guarantee that all sensitive data is caught?

No. We can't guarantee that all sensitive data will be caught and flagged. The DLP detection system translates predefined templates into regexes (regular expressions) and uses additional content parameters to determine the probability of a match. There can be both false positives and false negatives, triggered by many factors.
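The template-to-regex idea above can be illustrated with a toy detector. This is a simplified, hypothetical sketch, not Google's actual patterns or scoring: a bare pattern match gets a modest confidence, and nearby context keywords (one of the "additional content parameters") raise it.

```python
import re

# Toy "US SSN" template translated to a regex, plus context keywords.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CONTEXT = ("ssn", "social security")

def detect_ssn(text: str) -> float:
    """Return a rough match confidence in [0, 1]; 0 when nothing matches."""
    if not SSN_RE.search(text):
        return 0.0
    confidence = 0.5  # a bare pattern match: could be a false positive
    lowered = text.lower()
    if any(keyword in lowered for keyword in CONTEXT):
        confidence += 0.4  # supporting context raises confidence
    return confidence

print(detect_ssn("Order no. 123-45-6789"))     # 0.5: plausible false positive
print(detect_ssn("SSN: 123-45-6789 on file"))  # 0.9: context keyword present
```

An order number shaped like an SSN still matches the regex, which is exactly how false positives arise; conversely, sensitive data written in an unanticipated format would match nothing, producing a false negative.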

Users will be shown a DLP-specific message explaining why sharing is blocked. If a file matches multiple detectors, the message in the sharing policy violation screen indicates the first detector that was matched.