Most people do not know where all their important files are, and this makes detecting data breaches difficult. To fully understand where data resides, companies use data indexing solutions, but these are expensive and come with significant IT overhead. There are, however, simpler ways to gain just enough insight into where important files are and to prepare for implementing a data breach detection process. It all starts with designating specific repositories to hold important data and defining a workflow that supports this approach. After that, it is just a matter of implementing a software solution that provides enough insight into how data is manipulated and alerts on actions that put it at risk.

1. Know where important information is

– Usually, important files are generated by certain applications as part of everyday business activity. These applications can be configured to create the files in a particular location;
– Many companies already use designated repositories for files containing valuable information and give employees access to those repositories. The only problem at this stage is that there is no visibility into how the files are used and what happens to them once they leave the designated repositories.

File monitoring feature to look for: The ability to tag information based on where it is created, and, optionally, based on the user or application that creates it.
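As a rough illustration of location-based tagging, a monitoring agent could map designated repositories to tags. This is only a sketch; the repository paths and tag names below are hypothetical, not part of any real product configuration:

```python
from pathlib import Path

# Hypothetical mapping of designated repositories to tags
# (illustrative paths and tag names, not a real configuration).
TAG_RULES = {
    Path("/srv/finance"): "finance-confidential",
    Path("/srv/hr"): "hr-restricted",
}

def tag_for(path):
    """Return the tag for a file based on where it was created, or None."""
    path = Path(path)
    for repo, tag in TAG_RULES.items():
        if repo in path.parents:
            return tag
    return None
```

A real solution would assign the tag at creation time and keep it attached to the file as it moves, rather than re-deriving it from the path.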

2. Watch your sensitive folders and track the files as they move out

Next, start monitoring what happens to the files in the designated repositories. Most audit solutions provide basic information about file access, such as read, write, or attribute changes.
However, this may not be enough, especially when users copy files out of the repository. A copy only requires read access, so there is no way to let users read files for on-screen display while blocking them from copying the files to other locations.
An advanced file monitoring solution also reports on file movement across the network and provides insight into who copied files out of secure repositories.

File monitoring feature to look for: along with the ability to report on basic read/write operations, a good solution should also provide information on file copy operations (local, to the network, to removable devices) as well as file archiving operations (archiving is a major step in data exfiltration scenarios).
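To see why plain read/write auditing misses copies, consider a minimal after-the-fact sketch that detects copies by matching file content hashes outside a repository. A real monitoring solution intercepts the copy operation itself; the function names here are illustrative assumptions:

```python
import hashlib
from pathlib import Path

def content_hash(path):
    """SHA-256 of a file's content, used as a copy fingerprint."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def find_copies(repo, search_roots):
    """Report files outside `repo` whose content matches a file inside it."""
    known = {content_hash(p): p for p in Path(repo).rglob("*") if p.is_file()}
    hits = []
    for root in search_roots:
        for p in Path(root).rglob("*"):
            if p.is_file() and content_hash(p) in known:
                hits.append((known[content_hash(p)], p))
    return hits
```

Note that this only finds exact copies it can scan; a monitoring agent that hooks file operations catches renames, partial copies, and copies to locations it cannot later scan.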

3. Follow your important files on employees' workstations

Once you have a grasp of what happens to the files in the watched folders, you should track what happens to them as they travel across the network to employees' endpoints.

File monitoring feature to look for: in conjunction with the ability to tag files, look for the capacity to monitor endpoint file operations such as files being attached to emails, uploaded through the browser, archived, or copied to other remote locations.
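One simple heuristic an endpoint agent can apply is classifying an operation by which application touched a tagged file. This is only a sketch; the process names and categories below are purely illustrative assumptions, not an exhaustive or product-specific list:

```python
# Hypothetical mapping from process name to the kind of endpoint
# operation it usually implies for a tagged file (illustrative only).
RISKY_APPS = {
    "outlook.exe": "email attachment",
    "chrome.exe": "browser upload",
    "7z.exe": "archiving",
}

def classify(process_name):
    """Classify an endpoint file operation by the accessing process."""
    return RISKY_APPS.get(process_name.lower(), "unclassified")
```

In practice, a monitoring product combines the process identity with the actual operation (attach, upload, compress) observed at the driver or API level, rather than relying on process names alone.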

4. Get alerts and run scheduled reports

In general, you want to be alerted when a file is copied out of secure repositories so that you can run further reports on what happened next. You should also be interested in endpoint file activity that involves the internet or the network.

File monitoring feature to look for: Enough granularity in alerting and reporting to isolate only the unusual cases (based on various parameters of the file activity, such as user, application, file tag, type of operation, and so on).
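The kind of granularity described above can be pictured as rules that combine several parameters of a file event. In this sketch, each rule is a set of field constraints and an event raises an alert when any rule matches it fully; the field names (user, tag, operation) are illustrative assumptions:

```python
# Minimal sketch of granular alert rules over file-activity events.
# An event is a dict of activity parameters; a rule is a dict of
# constraints that must all hold for the rule to fire.

def matches(event, rule):
    """True if the event satisfies every constraint in the rule."""
    return all(event.get(field) == value for field, value in rule.items())

def alerts(events, rules):
    """Return only the events that match at least one alert rule."""
    return [e for e in events if any(matches(e, r) for r in rules)]
```

Because each rule only names the fields it cares about, one rule can target "any copy of a finance-tagged file to removable media" while another targets a specific user, which is exactly the isolation of unusual cases the section describes.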

How we can help

TEMASOFT FileMonitor delivers file access auditing technology that provides information on how data is accessed on a computer system, along with all the features needed for reporting on file activity, detecting data breaches, and performing forensic analysis of security incidents.
TEMASOFT offers this functionality for FREE for up to two workstation PCs, for personal use.

For more information, follow us on social media and subscribe to our newsletter.