Does Your DLP/IRM Implementation Keep You Awake At Night?

With the distributed nature of work in the 21st century, organizations have woken up to the fact that they cannot keep information confined within their heavily guarded data centers. Clients, employees and vendors need that information to keep their businesses running smoothly.

Information usually falls into one of the categories below:

Classification | Legitimate Users                                  | Examples
-------------- | ------------------------------------------------- | ------------------------------------------------------
Public         | Visible to everyone in the world                  | Tender documents
Internal       | Only for employees                                | Memos, newsletters
Restricted     | Only for certain people/teams                     | Contracts, invoices, purchase orders, network diagrams
Confidential   | Sensitive information – for very specific people  | Financial proposals
Private        | Personal information of individuals               | Salaries, employee HR information

For example, your infrastructure team holds details of the servers installed in the organization. If you are looking to implement a new antivirus solution or server client software, you have to share those server details with the vendor.

For such scenarios, companies implement a Data Leakage Prevention (DLP) or Information Rights Management (IRM) Solution.

 

A DLP solution monitors your internal employees and prevents them from knowingly or unknowingly sending information out to third parties outside the organization.

An IRM solution, on the other hand, allows your administrators to control who gets to view data that must be shared externally.

Both of these are good solutions and help organizations protect their intellectual property.

 

However, during our various audit engagements we have come across implementations of these solutions (from different vendors) that do not protect corporate data to the fullest extent possible.

Without a proper implementation, an organization’s investment in such products not only burdens its administrators but also makes a big dent in its finances.

Product Evaluation:

The decision to implement a DLP/IRM solution should start at the evaluation stage. You need to know all the players in the market and then evaluate them to see which best suits your needs. A quick starting point is to check the top-rated products as per Gartner’s ratings (http://www.gartner.com/technology/research/methodologies/research_mq.jsp). The document to review is the ‘Magic Quadrant for Content-Aware Data Loss Prevention’ (http://www.gartner.com/DisplayDocument?doc_cd=213871).

Proof-of-Concept:

Popularly called the PoC stage, this is the most critical phase of your solution implementation. This is where you technically evaluate each solution and how well it integrates with your environment.

During our engagements we have seen organizations overlook the importance of this phase.

Usually the solution provider (vendor) demonstrates a few use cases showing how the product would be useful in your environment. The major features are highlighted well by the vendor. However, you need to ensure it all gels with your infrastructure. It is imperative that the project owner prepare test cases that are relevant to the organization, not just the tried-and-tested cases demonstrated by the vendor.

For example, during a recent engagement we came across a situation where the PoC was performed merely as a formality. This resulted in a failed investment of $1 million in the DLP solution!

Simple checks like the following were never tested:

  • File extensions that the solution supports: The product should scan all files irrespective of their extension, since a malicious user can easily change a file’s extension. If the DLP solution can only check DOC, PDF and XLS files, it will fail miserably (see the first sketch after this list).
  • Internet filtering: Since we are trying to prevent data leakage, the solution should be capable of monitoring all internet traffic in the organization. In a bank, credit-card information is extremely critical, so the solution should be able to block users trying to send bulk credit-card numbers (or data) online.
  • Online storage blocking: Most organizations have a web proxy that filters access to online storage sites like Dropbox, Box.net, Rapidshare etc. However, certain categories may not have been blocked by the proxy administrator – such as webmail (OWA, Gmail, Hotmail) and code-repository services – either through oversight or because of corporate policy. DLP solutions should have their own category filtering to prevent leakage via a malicious employee uploading credit-card information to source-code hosting websites (Google Code, SourceForge, GitHub, BitBucket) or to online storage sites like those above.
  • Logging or alerting encrypted file transfers: For highly sensitive files, employees encrypt the contents before sending them to the recipient. Cracking the encryption to check the contents is practically infeasible, but the organization should at least be aware that data is being sent out in encrypted files. Depending on the designation of the sender, the transfer may be permitted or blocked. Malicious users can also hide data in encrypted archives (RAR, ZIP, 7ZIP) or password-protected PDF, DOC and XLS files (see the second sketch after this list).
  • Integration with PGP or email encryption software: As with file encryption, some organizations also allow their employees to send encrypted emails. However, during the PoC stage they fail to envisage their requirement for a product that integrates with email encryption. As a result, anyone can send out encrypted emails containing critical documents.
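
To make the first check above concrete, here is a minimal Python sketch (not tied to any vendor's product) that identifies a file by its leading "magic" bytes rather than by its extension. The signature table is a small illustrative subset and the file name is hypothetical.

```python
# Minimal sketch: identify a file's real type from its leading bytes ("magic
# numbers") instead of trusting the extension.

FILE_SIGNATURES = {
    b"\x25\x50\x44\x46": "pdf",        # %PDF
    b"\x50\x4b\x03\x04": "zip-based",  # ZIP container (also DOCX/XLSX/PPTX)
    b"\xd0\xcf\x11\xe0": "ole2",       # legacy DOC/XLS/PPT
    b"\x52\x61\x72\x21": "rar",        # Rar!
    b"\x37\x7a\xbc\xaf": "7z",
}

def detect_type(path: str) -> str:
    """Return the detected type, or 'unknown' if no signature matches."""
    with open(path, "rb") as fh:
        header = fh.read(8)
    for magic, ftype in FILE_SIGNATURES.items():
        if header.startswith(magic):
            return ftype
    return "unknown"

if __name__ == "__main__":
    # A spreadsheet renamed to report.txt still reports 'zip-based'.
    print(detect_type("report.txt"))  # hypothetical file name
```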

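Similarly, for the encrypted-file check, here is a minimal sketch that flags an outbound ZIP archive containing password-protected entries, using Python's standard zipfile module. Other formats (RAR, 7-Zip, protected Office or PDF files) would need their own checks; the point is simply to alert on encrypted content rather than attempt to crack it. The attachment name is hypothetical.

```python
# Minimal sketch: flag ZIP archives that contain encrypted entries so that an
# outbound encrypted archive at least raises an alert.

import zipfile

def has_encrypted_entries(path: str) -> bool:
    """Return True if any entry in the ZIP archive is marked as encrypted."""
    with zipfile.ZipFile(path) as zf:
        # Bit 0 of the general-purpose flag is set for encrypted entries.
        return any(info.flag_bits & 0x1 for info in zf.infolist())

if __name__ == "__main__":
    # Hypothetical outbound attachment picked up by a mail or proxy hook.
    if has_encrypted_entries("attachment.zip"):
        print("ALERT: encrypted archive leaving the organization")
```
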
Deployment:

Once you have successfully completed the PoC phase and finalized the solution you wish to implement in your organization, you need to decide on your deployment strategy.

Deploying a DLP solution requires you to first identify:

  • Your critical data assets: For the DLP solution to work efficiently, it needs to be trained on what to identify, so the first few months of deployment will probably be spent in ‘monitoring’ mode. It is imperative that the results obtained during this monitoring phase are scrutinized. Signature definitions should be created to detect the patterns in file contents that you wish to prevent users from sending externally. Considering the earlier example of the bank, credit-card patterns can be defined as regular expressions (see the sketch after this list). For a product/service-oriented company, customer data is very sensitive; such a company would probably want to define patterns that match a customer name along with their phone number.
  • Data leakage points: To fix a problem you need to know where it is! You need to analyze the avenues a malicious user could use to exfiltrate data from your organization. The usual culprits you should be aware of are:
    1. Emails
    2. USB
    3. Internet Access
    4. Backups
    5. (S)FTP

You need to ensure that your solution deployment covers all these areas.

  • The different users of the information within the organization: Classifying the users of your data is the next step. The users broadly fall into two categories:
    1. Legitimate business users who require the information for processing, sharing or analyzing. How they use the data should ideally be monitored and logged for non-repudiation purposes.
    2. Everyone else who is not authorized to handle the data.
  • Legitimate external users: Some organizations outsource business functions to external third-party vendors – print vendors, call centers etc. These vendors are often the biggest source of data leakage in an organization, because they sit beyond the restrictions your organization applies at its perimeter.
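
To illustrate what such a signature definition might look like outside any particular product, here is a minimal Python sketch that finds candidate credit-card numbers with a regular expression and discards random digit strings using the Luhn checksum, the same kind of check the bulk credit-card scenario in the PoC section calls for. The regex, the idea of a bulk threshold and the sample numbers are illustrative; a real DLP product would express this in its own rule syntax.

```python
# Minimal sketch of a credit-card content signature: a regular expression to
# find candidate numbers plus a Luhn check to discard random digit strings.

import re

CARD_CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum used by payment-card numbers."""
    digits = [int(d) for d in reversed(number)]
    total = sum(digits[0::2])
    for d in digits[1::2]:
        d *= 2
        total += d - 9 if d > 9 else d
    return total % 10 == 0

def count_card_numbers(text: str) -> int:
    """Count distinct Luhn-valid card numbers found in a blob of text."""
    found = set()
    for match in CARD_CANDIDATE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            found.add(digits)
    return len(found)

if __name__ == "__main__":
    sample = "4111 1111 1111 1111 and 5500-0000-0000-0004"  # well-known test numbers
    # A policy might alert on any hit and block when numbers appear in bulk.
    print(count_card_numbers(sample), "card numbers detected")
```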

 

Hardening:

Once your deployment is complete, the final step is hardening your DLP environment. This may seem counter-intuitive, since you would not expect to have to secure a security product. Unfortunately, during all the testing, PoC and deployment phases, administrators inadvertently make configuration changes that can introduce weaknesses into your solution. Ideally, an independent entity should review the final deployment for security issues. Areas we usually see misconfigured are:

  • Local user accounts configured on the management console: There should be no test or unauthorized user accounts configured on the system, especially administrator/auditor accounts (a reconciliation sketch follows this list).
  • Unapproved exceptions granted during the initial stages need to be removed so that your solution comprehensively covers all users in your organization.
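
Finally, for the account review above, here is a minimal sketch of reconciling the accounts configured on the management console against an approved list. The CSV export format, the column names and the account names are all assumptions made for the example; most consoles offer some form of account export or API that this could be adapted to.

```python
# Minimal sketch: flag management-console accounts that are not on the
# approved list, or whose names suggest leftover test/PoC accounts.

import csv

APPROVED = {"dlp_admin", "dlp_auditor"}       # hypothetical approved accounts
SUSPECT_KEYWORDS = ("test", "temp", "poc")    # names that suggest leftovers

def review_accounts(export_path: str) -> None:
    """Print accounts that need attention, reading a console export
    (CSV with 'username' and 'role' columns, assumed for this example)."""
    with open(export_path, newline="") as fh:
        for row in csv.DictReader(fh):
            user, role = row["username"], row["role"]
            if user not in APPROVED:
                print(f"UNAPPROVED: {user} ({role})")
            if any(k in user.lower() for k in SUSPECT_KEYWORDS):
                print(f"SUSPECT NAME: {user} ({role})")

if __name__ == "__main__":
    review_accounts("console_accounts.csv")   # hypothetical export file
```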

 

NOTE: This blog post has been cross-posted from my original post at NII Consulting – Checkmate
