

7 steps to create a data loss prevention policy

Data loss prevention is an ever-changing process of proactive and reactive protection and planning. Read on to learn how to set up a successful DLP policy.

Data loss can be devastating, leading to financial setbacks, legal issues and reputational damage. Counteracting these threats starts with a well-designed data loss prevention policy, but success demands regular updates and sustained adherence to the policy.

A DLP policy is a set of processes, procedures and tools to prevent the loss, misuse or unauthorized transfer of sensitive data. DLP is not just about technology; it also demands strategy and collaboration. Organizations need employee training and data governance to make it work.

A common assumption about DLP is that it's purely restrictive. Effective DLP isn't just about putting up walls; it lays a foundation for greater innovation and agility. With the right protections in place, business users, analysts and developers can explore, experiment and innovate more freely with data.

Build a DLP policy

Follow these seven steps to build a solid DLP policy that helps safeguard core data assets and achieve strategic goals. Data specialists and their colleagues must classify data; search out data leak risks; build a toolset; get business buy-in; and test, prepare and monitor for constant protection.

1. Pinpoint sensitive data

The first critical task in crafting a DLP policy is to know what data to protect. Start by identifying major categories of confidential data. These types include the following:

  • Personally identifiable information of any kind.
  • Financial information about individuals and businesses.
  • Intellectual property.
  • Customer and partner data.
  • Business plans such as forecasts and internal reports.

Conduct a thorough audit of all data storage systems. Scrutinize legacy databases and other vulnerable repositories that may be poorly secured. You might uncover unofficial assets, such as rogue cloud storage or makeshift servers, which could hold sensitive data. To find hidden data pockets, consult IT, security, legal and key business units during your audit.

The result is a classified inventory of data, mapped to specific locations and ranked by risk levels. Use the document to establish DLP controls. To be effective, the document must stay current. Perform regular scans to update the inventory, particularly as organizational needs and systems evolve.
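The audit and inventory described above can be sketched in code. The following is a minimal, illustrative scan, not a production scanner: the regex patterns, file types and risk weights are assumptions, and real DLP tools use far more robust detection methods.

```python
import re
from pathlib import Path

# Illustrative detectors only -- real DLP scanners use far more robust methods.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

# Hypothetical weights: higher number = higher risk when the pattern is found.
RISK_WEIGHT = {"ssn": 3, "credit_card": 3, "email": 1}

def scan_repository(root: str) -> list[dict]:
    """Walk a directory tree and build a risk-ranked inventory of text files."""
    inventory = []
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        found = {name: len(p.findall(text)) for name, p in PATTERNS.items()}
        score = sum(RISK_WEIGHT[n] * c for n, c in found.items())
        if score > 0:
            inventory.append({"location": str(path), "matches": found, "risk": score})
    # Rank by risk so the highest-risk repositories surface first.
    return sorted(inventory, key=lambda e: e["risk"], reverse=True)
```

The output maps each location to the sensitive data it holds and a risk rank, which is exactly the shape of document this step calls for.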

2. Spot data leak risks

After identifying what needs protection, assess how sensitive data could potentially be exposed. Knowledge of weaknesses helps craft a DLP policy that is both proactive and adaptive.


Think about ways in which people could transfer data outside system boundaries. Potential data leak channels include email, cloud storage, web uploads and endpoint devices such as phones and USB drives.

Identify potential external and internal threats to these vulnerable data channels. Threat analysis provides insight into which channels carry the highest risk. This process might require cybersecurity experts to analyze the various forms of risk in detail.

Don't limit the DLP conversation to security experts. Engage with key personnel who handle sensitive data in their daily work. Their insights can reveal weak spots that might not be apparent through a purely technical lens, such as bad practices in storing backups.

After the assessment, rank the risks by their likelihood and impact. Focus initial DLP efforts on the most critical vulnerabilities.
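The likelihood-and-impact ranking can be as simple as a scored matrix. The sketch below uses hypothetical channel assessments on a 1 (low) to 5 (high) scale; the numbers are assumptions and would come from the threat analysis in practice.

```python
# Hypothetical channel assessments -- scores are illustrative assumptions.
# likelihood and impact are each rated on a 1 (low) to 5 (high) scale.
channels = [
    {"channel": "email",         "likelihood": 4, "impact": 4},
    {"channel": "cloud storage", "likelihood": 3, "impact": 5},
    {"channel": "web uploads",   "likelihood": 2, "impact": 3},
    {"channel": "usb drives",    "likelihood": 3, "impact": 4},
]

def prioritize(channels: list[dict]) -> list[dict]:
    """Rank leak channels by a simple likelihood x impact risk score."""
    for c in channels:
        c["risk"] = c["likelihood"] * c["impact"]
    return sorted(channels, key=lambda c: c["risk"], reverse=True)
```

The top of the sorted list identifies where initial DLP efforts should focus.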

3. Choose DLP tools

Software tools are one essential component of an effective DLP strategy.

Data teams can either use a dedicated DLP suite or build controls into existing systems. Dedicated DLP tools offer comprehensive features, but organizations must weigh those features against the investment in new software. Integrating DLP into the existing security stack -- especially if the current infrastructure is deployed on a major cloud platform -- requires less spending and change but generally provides less granular policy control.

When choosing a tool, consider the following capabilities:

  • Range of detection methods.
  • Ability to fine-tune policies.
  • Workflows for escalating incidents.
  • Analytics.
  • Interoperability with the existing data stack.

When configuring tools, don't take a one-size-fits-all approach. Align the policies and controls to the sensitivity and risk categories identified in the first two steps. Highly sensitive financial records require stricter controls than general business communications. The aim is a DLP policy that ensures security without unnecessarily hindering agility.
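The tiered approach described above can be expressed as a small policy table. This is a hypothetical configuration sketch, not any vendor's policy syntax: the category names and control flags are assumptions, and real DLP suites express these as policy rules in their own formats.

```python
from dataclasses import dataclass

# Hypothetical control tier -- real DLP suites define richer rule sets.
@dataclass
class Controls:
    block_external_share: bool
    require_encryption: bool
    alert_security_team: bool

# Tiered configuration: stricter controls for more sensitive categories,
# matching the sensitivity classes identified in steps 1 and 2.
POLICY = {
    "financial_records": Controls(True, True, True),
    "customer_data":     Controls(True, True, False),
    "general_comms":     Controls(False, False, False),
}

def controls_for(category: str) -> Controls:
    """Look up the control tier; unknown categories default to the strictest."""
    return POLICY.get(category, Controls(True, True, True))
```

Defaulting unknown categories to the strictest tier is a deliberate fail-safe choice: unclassified data is treated as sensitive until proven otherwise.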

4. Make the business case

Now comes a crucial step: Make a compelling business case to secure DLP program buy-in from leadership and affected departments.

Making the case at this point in the process, rather than earlier, enables a targeted and credible proposal. Seeking approval with generalizations rather than details risks errors in budgeting, tool selection and support, as well as an unrealistic project timeline. Doing the upfront homework creates a more realistic case for implementing the DLP policy.

The first order of business is to identify the potential risks the organization faces. Use statistics and case studies from vendors or analysts to showcase the financial and reputational costs of data loss incidents.

Consider the investment required for implementing DLP, including software costs, training and possibly hiring. Compare implementation costs with the financial losses the organization could incur from a data breach. Establish a possible return on investment that justifies the upfront costs.
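The cost comparison above boils down to simple expected-value arithmetic. The figures in this sketch are entirely hypothetical placeholders; an organization would substitute its own estimates from vendor or analyst studies.

```python
def dlp_roi(annual_breach_cost: float, breach_probability: float,
            implementation_cost: float, risk_reduction: float) -> float:
    """Return expected annual savings net of cost, as a fraction of the cost.

    All inputs are assumptions to be replaced with the organization's
    own estimates (e.g. from vendor or analyst studies).
    """
    expected_loss_avoided = annual_breach_cost * breach_probability * risk_reduction
    return (expected_loss_avoided - implementation_cost) / implementation_cost

# Hypothetical numbers: a $4M breach cost, 20% annual breach likelihood,
# $250K DLP investment, 60% risk reduction from the DLP program.
roi = dlp_roi(4_000_000, 0.20, 250_000, 0.60)
```

A positive result means the expected loss avoided exceeds the upfront cost, which is the core of the financial argument to leadership.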

In addition to preventing data loss, a well-implemented DLP policy can provide teams the confidence to innovate and improve data management. Explain these benefits to stakeholders even if they don't have a quantifiable financial impact.

5. Test the policy

It's time to test the policy in practice. The objective is to validate the rules while minimizing business disruptions and false positives, which can undermine confidence and adoption.

Before testing, establish a baseline for key metrics, such as system resource utilization, false positives and user experience measures.

Begin with a pilot test involving a small, representative group within the organization. A pilot helps show how the policies perform in a real-world environment without affecting the entire company. Adjust the rules based on the pilot results.

Once the team has confidence in the tested policy, begin rolling it out across the organization in phases. A gradual approach sets organizations up to manage the policy and make quick adjustments if new issues arise.

6. Prepare for incidents

No system is completely secure. Organizations must be ready to respond to possible data loss or unauthorized access.

A formalized incident response plan should outline procedures and responsibilities to address data loss incidents. Include steps for immediate containment, investigation and remediation.

The incident response plan should designate a cross-functional team comprising members from IT, legal, communications and business units to respond to incidents. Team members need training on their role in the incident response process. Set standard operating procedures for typical incident scenarios so the team has quick-reference guides during crises.

Define forensic investigation procedures to determine the root cause, scope and effects of data breaches. Contract with specialists to assist in large-scale incident investigation, if needed.

Don't wait for a crisis before testing response abilities. Regularly simulate data loss incidents to test readiness, identify plan gaps and adjust accordingly.

7. Keep monitoring

In a dynamic business and technology environment where threats continually evolve, a static DLP policy can be as bad as no policy at all. Ongoing monitoring and policy adaptation are vital for an effective defense against threats.

Regularly review the data the organization handles. Companies add new data types at times, and older ones may change in importance. Periodically rescan repositories for new sources of confidential data that require protection. Again, adjust policies accordingly.

Set KPIs for policies, such as false positives, prevented leaks and system resource utilization. KPIs offer insights into efficacy. Analyze KPIs regularly to identify weaknesses.
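One KPI worth computing routinely is alert precision: the fraction of DLP alerts that were genuine rather than false positives. The function below is a minimal sketch of that calculation, assuming incident counts are already tallied from the DLP tool's logs.

```python
def alert_precision(alerts: int, false_positives: int) -> float:
    """Fraction of DLP alerts that were genuine -- a core policy-tuning KPI.

    A low value means the rules are too noisy and risk undermining
    user confidence; a persistently high value with few alerts may
    mean the rules are too narrow.
    """
    if alerts == 0:
        return 1.0  # no alerts, nothing misfired
    return (alerts - false_positives) / alerts
```

Tracking this number over each review cycle shows whether rule adjustments are actually improving signal quality.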

Modify rules to align with changes in workflows, networks, devices and software. Static policies become outdated remarkably quickly. For example, extend DLP capabilities to the cloud to match data migration at the organization.

Compare DLP metrics against industry standards and best practices to spot deficiencies and improvement opportunities. Keep key stakeholders updated and involved regularly -- their insight and feedback matter.

The human element is critical to all business policies. People will work around prohibitively rigid controls, often unintentionally exposing data along the way. Policies must keep up with working practices.

Potential pitfalls in implementing and managing DLP

Mistakes can undermine the effectiveness of a DLP program. Be aware of common traps:

  • Overcomplicated policies. Overly complex rules are a burden. Keep policies as straightforward as possible.
  • Ignored insider threats. Focusing solely on external threats leaves an organization vulnerable to breaches from within. Consider both intentional and unintentional insider risks.
  • Inadequate training. Employees who are not well-informed about policies may fail to abide by them.
  • Poor incident planning. Without a structured incident plan, responses can be chaotic and ineffective.
  • Static policies. A DLP policy that is not regularly updated and adapted will gradually fall into obsolescence.

Be sure to answer the following questions: What data is the organization managing? Where is it located? Is it sensitive data? And remember, ask the same questions every week. That's a policy.

Donald Farmer is the principal of TreeHive Strategy, where he advises software vendors, enterprises and investors on data and advanced analytics strategy. He has worked on some of the leading data technologies in the market and in award-winning startups. He previously led design and innovation teams at Microsoft and Qlik.
