Important Factors to Consider When Choosing Your Perfect DLP System

To meet modern standards, the functionality of DLP (Data Loss Prevention) systems has become more complex. At the same time, different DLP solutions have evolved in different ways, so each has its own strengths and weaknesses. In this article, I would like to describe an approach to choosing a DLP system based not on simply listing the required features but on assessing how well these systems solve real-life data leak prevention tasks.

Today, DLP systems have become a popular tool in the arsenal of corporate information security departments. Unfortunately, their choice is often approached about as thoughtfully as homemakers approach the choice of detergent: whose brand is stronger, whose advertising is more persistent, whose price is more attractive.

Nevertheless, under the hood of DLP systems, you can find a lot of things that are never mentioned in advertising brochures, and the cost of an error here is much higher than when buying household chemicals. In this regard, it is useful for information security experts to dive deeper into the technical details rather than limit themselves to a formal comparison of available features.

There are many solutions of different maturity levels on the market now. One more checkbox in a comparison table does not guarantee that the function actually works the way the customer needs. Some vendors try to release the product as quickly as possible, while its quality, convenience, and reliability may leave much to be desired.

Below are several practical points that will help you understand how good a solution really is and avoid spending significant funds on a product that falls far short of generally accepted corporate software standards.

The main advantage of a technologically mature enterprise solution is that the vendor has no problems maintaining it and releasing new functionality. Typically, DLP systems are selected based on a three- or five-year planning horizon, so compliance with current requirements is not the only selection criterion. You need to understand where the vendor is heading and predict what tasks and problems your company may face in the future.

Content blocking

By definition, a DLP system must prevent data leaks caused by employees or malicious software. This can only be done by blocking a suspicious operation. Nevertheless, many organizations prefer to run DLP systems in monitoring mode without actively interfering with information flows. All information is stored in archives, and security officers often react to leaks post factum. As a result, blocking functions lose priority since their active use is not planned.

Often this situation is explained by the risks of false positives or DLP system failures. If, because of the DLP system, email or some other vital service is disrupted, the reputational damage to the information security and IT departments can be considerable.

However, this argument only applies to immature DLP systems that are hard to configure to run smoothly. A data breach caused by the lack of a blocking mode can be awfully expensive for the business. In addition, more and more regulations (for example, the GDPR) effectively require preventing data leaks rather than merely recording them and provide for serious fines for violators.

Some communication channels monitored by DLP systems do not allow blocking for purely technical reasons. For example, for messengers such as WhatsApp and Telegram, only passive monitoring is possible; otherwise, they will not work at all. However, a DLP system can hardly be considered mature if it does not support content-based blocking of the following channels: email, external USB drives, printers, and web services (HTTP/HTTPS).

Content analysis and policies

Not all DLP systems on the market today were initially created as complete systems with a well-designed architecture. Many vendors started with one main module, and the rest of the environment was added around it later to control the remaining channels. With such an approach, all the limitations and peculiarities of the first module had to be carried along. Naturally, the result can rarely be called an effective architecture.

Thus, based on where content analysis is performed, it is often possible to determine where the history of a particular DLP system began. For example, if an endpoint agent transmits information for analysis to a DLP server using SMTP, we can assume that the product started with an email control module. In this case, the server may or may not return the analysis verdict to the agent. If it does not, there is no possibility of content-based blocking when writing to a USB drive or printing.

It is clear that such an architecture requires a constant connection to the server, and the network load can be a weak point. In documents, you can sometimes even come across the question of whether a DLP system supports prioritization of network traffic. With proper design, content analysis is performed at the place where the policies are applied. There is then simply no need to transfer large amounts of information over the network, and the issue of traffic prioritization becomes irrelevant.
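The idea of analyzing content at the place where the policy is applied can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: the pattern names and regular expressions are placeholders, and real products use far richer detection, but the point is that the agent reaches a block/allow verdict locally, with no round-trip to the DLP server.

```python
import re

# Illustrative patterns only; real DLP classifiers are far more sophisticated.
POLICY_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def local_verdict(text: str) -> str:
    """Return 'block' if any sensitive pattern matches, else 'allow'.

    Runs entirely on the endpoint, so the verdict does not depend on
    server connectivity and no file content crosses the network.
    """
    for _name, pattern in POLICY_PATTERNS.items():
        if pattern.search(text):
            return "block"
    return "allow"
```

With this design, the only data that has to travel to the server is the event record itself, not the inspected content.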

No connection to the corporate network

In addition to performing content analysis and applying policies, the DLP agent must transmit event information, shadow copies, and other data to the server. If the main archive is not accessible, this important data should not be lost. As a rule, it is saved on a local disk and transferred to the server once the connection is re-established. Ideally, policies should be applied depending on the connection status: whether there is a direct connection to the local network, the computer is connected via a VPN, or there is no connection at all. This way, you can set appropriate rules for each situation, which is important when an employee with a work laptop is out of the office, for example, on a business trip or working remotely from home.
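The connection-aware behavior described above can be sketched as follows. All policy names and values here are hypothetical, chosen only to show the mechanism: one policy set per connectivity state, plus local spooling of events while the server is unreachable.

```python
import json
import os

# Illustrative policy sets; the names and actions are placeholders.
POLICY_SETS = {
    "lan":     {"usb_write": "allow_with_audit", "web_upload": "block_confidential"},
    "vpn":     {"usb_write": "block_confidential", "web_upload": "block_confidential"},
    "offline": {"usb_write": "block", "web_upload": "block"},
}

def detect_mode(server_reachable: bool, on_corporate_lan: bool) -> str:
    """Map the agent's connectivity state to one of the policy sets above."""
    if not server_reachable:
        return "offline"
    return "lan" if on_corporate_lan else "vpn"

def spool_event(event: dict, spool_dir: str) -> str:
    """Persist an event on the local disk so it survives until the
    connection to the DLP server is re-established."""
    os.makedirs(spool_dir, exist_ok=True)
    path = os.path.join(spool_dir, f"{event['id']}.json")
    with open(path, "w") as f:
        json.dump(event, f)
    return path
```

A real agent would also sign and encrypt the spool, cap its size, and drain it in order once the server answers again.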


Management interface

Ease and convenience of system usage and management are highly subjective factors. Some people are more accustomed to managing the system from the command line, while others are more comfortable creating policies and rules with the help of scripting languages. Nevertheless, a well-thought-out, convenient interface can be an indirect sign that the developer is able to deliver a high-quality product, and you can expect the important components of the system to be equally well-made and efficient.

In terms of convenience and sophistication of the interface, two aspects are worth noting. First, there should be a unified management console. Nowadays, web consoles are preferred: they are cross-platform, do not require additional software installation, and can work on mobile devices. If there are several consoles, for example, separate consoles for managing individual modules, this indicates that the product was not designed and developed as a single, logically complete system. Perhaps the modules were added by different teams and attached (each with its own console) to one another in the course of an accelerated development cycle, or maybe they were assembled and licensed from different manufacturers.

Second, a mature DLP system should have “omnichannel” policies. This means that if you want to create a policy to control, for example, legal contracts, you only need to do it once and simply indicate which channels it should apply to: email, web, USB devices, and so on. A less mature system will require creating several similar policies for each channel separately. This may seem like a small problem, but it grows with the number of policies. When that number exceeds a hundred, and the rules use complex conditions combining content rules with user groups, days of the week, and the like, keeping such a set synchronized can turn into a serious problem.
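The difference is easy to see in a data-model sketch. In the hypothetical omnichannel shape below, the content rule is declared once and simply mapped to channels; the per-channel alternative would force you to clone the same rule into four near-identical policies and keep them in sync by hand.

```python
from dataclasses import dataclass

# Hypothetical policy object; field names are illustrative.
@dataclass(frozen=True)
class Policy:
    name: str
    content_rule: str              # e.g. the name of a content classifier
    channels: frozenset = frozenset()

    def applies_to(self, channel: str) -> bool:
        return channel in self.channels

# One policy, one content rule, four channels.
contracts = Policy(
    name="Legal contracts",
    content_rule="contract_classifier",
    channels=frozenset({"email", "web", "usb", "print"}),
)
```

Editing the content rule in one place then updates enforcement on every channel at once, which is exactly what keeps a hundred-policy rule set maintainable.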

The minimum number of servers

Another clear sign of poor architecture, which can already be spotted during a pilot project, is the number of servers required for the system to work. If more than one server is needed for a pilot covering 50–100 employees, the system architecture is not optimal and, most likely, will require additional resources during the production phase. A quality enterprise-level system scales equally well in all directions. Of course, for very large deployments, you will need the ability to separate individual components of the system onto several servers and cluster them. But for small projects, a DLP system should not require unreasonably large resources.

Implementation options

A good DLP system should support many implementation options. This, firstly, simplifies combining the DLP system with the existing IT infrastructure, and secondly, allows you to flexibly balance the functional and computational load while controlling various channels.

Today, there are several systems on the market that control all channels at the endpoint agent level only. This greatly simplifies the vendor's work, as it can save on the development team and shorten the development time.

However, such an architecture can hardly be considered enterprise-level. It is more natural to control most network channels at the gateway level, especially in large-scale deployments. A mature DLP system, in addition to the endpoint agent, offers the following implementation options:

  • Integration with mail servers (for example, Microsoft Exchange). In this case, you can conveniently control internal mail as well.
  • Receiving mail from a technical mailbox.
  • Integration with an existing Internet gateway via the ICAP protocol.
  • The vendor's own mail transport server.

The best vendors also offer their own proxy servers that provide integration with a DLP system to control HTTP and HTTPS traffic.

Cloud infrastructure

Since companies are switching to remote work and at the same time want to save money on security services, cloud DLP solutions will definitely grow in both quality and quantity.

Already today, it is often necessary to install the server components of a DLP system in the cloud. As a rule, this happens in pilot projects or in small organizations. The archive of a DLP system contains all corporate secrets, and not many business owners want to place it in an uncontrolled environment. However, if a DLP system does not support hosting its server modules in the cloud, this may come as an unpleasant surprise at some point.

There are also issues with managing and controlling cloud storage and services. This can be a simple upload of a file to the cloud or a more complex scheme, for example, when an organization uses G Suite or Office 365 mail services. There are many nuances here. For example, mail servers can be accessed both through a browser client and through a classic desktop client such as Microsoft Outlook, and each option requires different protocols and different implementation approaches.

Also, when cloud storage is used in an organization, a DLP system must regularly scan all cloud folders to monitor the confidential information stored there. To solve such problems, a whole class of solutions has appeared: Cloud Access Security Brokers (CASB). In terms of the tasks being solved, these systems are conceptually very close to DLPs. One way or another, these tasks need to be solved, so the corresponding functionality will appear in DLP systems as well.
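The periodic scan of cloud folders mentioned above can be sketched as below. This is a simplified, hypothetical example: it walks a locally synced copy of the cloud storage (as OneDrive or Google Drive sync clients maintain) and looks for a single confidentiality marker, whereas a real product would use the cloud provider's API and full content classification.

```python
import os
import re

# A single illustrative marker; real detection is much richer.
MARKER = re.compile(r"\b(confidential|internal only)\b", re.IGNORECASE)

def scan_folder(root: str) -> list:
    """Return paths of readable text files under root that contain a marker."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                # errors="ignore" lets the scan skip over binary content
                # instead of failing on undecodable bytes.
                with open(path, "r", errors="ignore") as f:
                    if MARKER.search(f.read()):
                        hits.append(path)
            except OSError:
                continue  # unreadable file: log and move on in a real system
    return hits
```

Run on a schedule, the resulting hit list feeds the same incident pipeline as events from email or USB channels.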

Integration with other corporate systems

Here, of course, I am not talking about integration with Microsoft Active Directory, which should be implemented in any modern DLP system. When implementing a DLP system, it is good if integration with the following classes of software is supported:

  • Security Information and Event Management (SIEM). This is perhaps the most frequent compatibility question. Building a Security Operations Center (SOC) is one of the hottest topics in information security, and everyone wants DLP events to flow into the SIEM. Most SIEM systems can independently pull information from the DLP database, but it is much more convenient if the DLP system supports CEF, Syslog, and other similar protocols. In this case, when configuring the DLP system, you can control more flexibly what information enters the SIEM database and in what form.
  • Enterprise Digital Rights Management (EDRM). Basically, DLP and rights management systems complement each other and, if properly deployed and configured, provide more reliable protection. In this case, the DLP system understands EDRM policies, and they can be used in DLP policies, when searching the archive, building reports, and so on. Besides, the DLP system can itself apply EDRM policies based on certain rules, for example, when a specific type of content is detected.
  • Data classification systems. The most advanced DLP systems can work with the labels and markings added to documents by data classification systems such as Boldon James or TITUS. In this case, you benefit from the effort already spent on data classification.
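To make the SIEM point above concrete, here is a hypothetical sketch of serializing a DLP incident as a CEF line that a SIEM could ingest over syslog. The header layout follows the CEF convention (CEF:Version|Vendor|Product|DeviceVersion|SignatureID|Name|Severity|Extension); the vendor and product values are placeholders, and a production encoder would also escape special characters per the CEF rules.

```python
def to_cef(event: dict) -> str:
    """Render a DLP incident as a single CEF line (placeholder vendor/product)."""
    header = "CEF:0|ExampleVendor|ExampleDLP|1.0|{sig}|{name}|{sev}".format(
        sig=event["rule_id"], name=event["rule_name"], sev=event["severity"])
    # Extension: space-separated key=value pairs using standard CEF keys
    # (suser = source user, act = action taken, fname = file name).
    extension = " ".join(f"{k}={v}" for k, v in event["fields"].items())
    return f"{header}|{extension}"

line = to_cef({
    "rule_id": "DLP-042",
    "rule_name": "Confidential file upload blocked",
    "severity": 8,
    "fields": {"suser": "jdoe", "act": "blocked", "fname": "contract.docx"},
})
```

Because the DLP side controls which fields go into the extension, you decide exactly what reaches the SIEM database and in what form, rather than letting the SIEM scrape the raw DLP database.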


Supported platforms

The ability to work with multiple endpoint platforms is also a hallmark of a solid DLP system. The simplest solutions offer a Windows-only agent module. However, this may not be enough; who knows, maybe someday we will face a massive switch to Linux. There are already several DLP solutions whose agents work on Windows, Linux, and Mac.

Mobile devices based on iOS and Android should also be mentioned. For now, for technical reasons, it is almost impossible to create a full-fledged agent for smartphones and tablets, especially those made by Apple. On the other hand, when implementing a BYOD strategy, you can (and should) use a Mobile Device Management (MDM) solution. It allows you to use the mobile OS's built-in capabilities, create security policies, and reduce the risk of data leaks to an acceptable level.


As we have seen, DLP systems have evolved heterogeneously. This affects the degree of system integrity, how well the components work together, and the ability to support individual communication channels. Of course, in the modern world, the solution's price and the cost of ownership also matter. Still, a DLP system should first of all solve everyday security problems and, whenever possible, work not only in monitoring mode.