Data handling for DSPM platforms

Understand vendor options and best practices for data handling.

DSPM deployment options

When evaluating a DSPM vendor, it is important to consider how that vendor handles data. DSPM deployments can vary significantly depending on an organization's operational needs and security policies, as well as the vendor selected. There are three distinct approaches; some vendors offer customers a choice, while others offer only one method.

As organizations research Data Security Posture Management, they should understand the pros and cons of the three methods below:

Summary

1. Extract and scan: Risky
2. In-place scanning: Preferred
3. Sidecar: Alternative

1. Extract and scan (Risky)

The most controversial of the three, this approach requires allowing the vendor to extract customer data, which is then transferred to the vendor's external environment for scanning, analysis and classification. While this approach, also called the “snapshot” approach, expedites implementation because fewer permissions are typically needed, it introduces two concerns that customers should ask vendors to address:

What happens to the extracted data?

 
The customer loses control over the extracted data, including whether it remains secure during transfer and in the vendor environment, and whether it is properly purged after analysis is complete. Visibility into data lineage is likely lost on extraction, and sovereignty or other compliance issues may be introduced depending on the location and configuration of the vendor environment.

Who pays the processing costs? 

 
In most cases, the cost of scanning the data is passed along to the customer, even though the scanning takes place in the vendor environment. While the initial appeal of this approach is bypassing internal roadblocks, procurement teams will often ask about the costs associated with scanning, since this method can be 2-3x more costly than other approaches.
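
As a rough illustration of how these costs can add up, here is a minimal Python sketch comparing the two models. All prices, volumes and the 2.5x multiplier are hypothetical assumptions chosen only to show the shape of the calculation, not vendor pricing.

    # Hypothetical cost comparison: in-place scanning vs. extract and scan.
    # Every figure below is an assumption for illustration, not a quote.
    DATA_SCANNED_TB = 100                # assumed size of the data estate
    EGRESS_COST_PER_TB = 90.0            # assumed cloud egress price (USD per TB)
    IN_PLACE_COMPUTE_PER_TB = 10.0       # assumed compute cost to scan 1 TB in place
    EXTRACT_SCAN_MULTIPLIER = 2.5        # midpoint of the "2-3x" figure above

    in_place_cost = DATA_SCANNED_TB * IN_PLACE_COMPUTE_PER_TB

    extract_and_scan_cost = (
        DATA_SCANNED_TB * IN_PLACE_COMPUTE_PER_TB * EXTRACT_SCAN_MULTIPLIER
        + DATA_SCANNED_TB * EGRESS_COST_PER_TB  # data leaves the customer cloud
    )

    print(f"In-place scanning: ~${in_place_cost:,.0f}")
    print(f"Extract and scan:  ~${extract_and_scan_cost:,.0f}")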

2. In-place scanning (Preferred)

This approach, preferred by Normalyze customers, allows DSPM scanning functions to run directly in the customer’s data environments, continuously operating as data is created, moved or updated.

This method ensures real-time data monitoring and analysis, which is crucial for environments where data sensitivity and immediacy are critical. By analyzing data within its native system, in-place scanning minimizes data exposure to external threats and eliminates the latency and potential security risks associated with data transfer. Unlike the other two approaches, this model does not attempt to bypass vendor due diligence or internal risk assessment/audits.
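
For readers who want a concrete picture, below is a minimal sketch of the in-place pattern, assuming an AWS S3 bucket scanned with read-only credentials inside the customer account via boto3. The bucket name and classification patterns are placeholders, and a production DSPM scanner is far more sophisticated; the point the sketch illustrates is that content is read and classified where it lives, and only findings (metadata) ever leave.

    # Minimal sketch of in-place scanning: classify objects inside the customer
    # account and emit only findings (metadata), never the raw content.
    # Bucket name and patterns are placeholders, not a real DSPM implementation.
    import re
    import boto3

    SENSITIVE_PATTERNS = {
        "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    }

    def scan_bucket(bucket_name: str) -> list[dict]:
        s3 = boto3.client("s3")
        findings = []
        for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket_name):
            for obj in page.get("Contents", []):
                body = s3.get_object(Bucket=bucket_name, Key=obj["Key"])["Body"].read()
                text = body.decode("utf-8", errors="ignore")
                hits = [name for name, rx in SENSITIVE_PATTERNS.items() if rx.search(text)]
                if hits:
                    # Record only metadata about the finding, not the content itself.
                    findings.append({"key": obj["Key"], "classifications": hits})
        return findings

    if __name__ == "__main__":
        for finding in scan_bucket("example-customer-bucket"):  # placeholder bucket
            print(finding)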

Advantages of in-place scanning

In-place scanning represents a significant advancement in managing data security by integrating DSPM capabilities directly within the data environment.

By continuously monitoring and analyzing data in its native environment as it is created, moved or modified, context about critical data is never lost, and there are no gaps of vulnerability between scans.

This method ensures that security measures are always in step with the latest data, providing a living view of an organization’s data landscape and security posture.

Lower operational cost
By processing data within the customer environment, organizations can significantly reduce the operational costs of their DSPM program. Other methods require the vendor to pass along egress charges as well as the costs of its own computing resources.

Simplified data governance and sovereignty
Managing data within its native jurisdiction simplifies compliance with data sovereignty regulations. It mitigates the risks associated with data transfers across borders, ensuring that data handling practices meet the requirements set by governing bodies.

Continuous compliance
In-place scanning simplifies adherence to strict data protection regulations like GDPR, HIPAA, and PCI DSS. Since the data stays with the customer, they retain control over data management practices such as purging or retaining data per regulatory requirements. This approach ensures compliance from the moment data is created, maintaining data integrity and accuracy while reducing compliance-related liabilities.

Better trust among teams
IT and security teams have likely configured access within the native environment according to very specific internal policies and best practices. In-place scanning maintains these policies and best practices automatically. Under other methods, teams may be reluctant to rely on assurances that the DSPM vendor has implemented the same level of access control within its own environment.

Smaller risk footprint
Keeping data inside its native environment minimizes data exposure risk and removes exposure to potential security threats during transit and at rest in external systems. With alternate methods, risks can arise from several sources, including system or human vulnerabilities. For example, employees who handle sensitive data in multiple environments might misuse it, intentionally or accidentally. Even if data is encrypted to mitigate some of the data exposure risks of extract and scan, mismanagement of encryption keys can potentially expose the data. In addition, data may be stored in environments susceptible to cyber-attacks or it may not be properly deleted after analysis, increasing the overall risk footprint unnecessarily.

Better data integrity
Keeping data in its original environment maintains the lineage of that data so that security teams know its context – where it came from, who has access to it, whether it is derived or duplicated from other data – and get a more precise picture of risk. In regulated industries where there is a “duty to prove” the integrity of data throughout its lifecycle, maintaining data in its native environment reduces data movement and duplication, thereby simplifying the tracking of data lineage.
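
The kind of lineage context described above can be captured as simple metadata alongside each finding. The sketch below shows one hypothetical shape for such a record in Python; the field names and values are illustrative assumptions, not the Normalyze data model.

    # Hypothetical sketch of the lineage context a finding might carry when data
    # stays in its native environment. Field names and values are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class LineageRecord:
        resource: str                                           # table or object path
        source_system: str                                      # where the data originated
        derived_from: list[str] = field(default_factory=list)   # upstream copies or derivations
        accessible_by: list[str] = field(default_factory=list)  # principals with access

    record = LineageRecord(
        resource="s3://example-bucket/claims/2024.parquet",     # placeholder path
        source_system="claims-intake-db",                       # placeholder system
        derived_from=["s3://example-bucket/raw/claims.csv"],
        accessible_by=["role/analytics-readonly"],
    )
    print(record)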

3. Sidecar model (Alternative)

In this configuration, a software component is deployed in a separate account within the customer environment. This component operates in parallel to the primary system, pulling data into the sidecar account, then scanning and analyzing it in near real time. It allows for an isolated – yet integrated – environment where data can be processed and analyzed by DSPM tools without leaving the customer environment.

This is ideal for customers with many on-premises data stores, since it balances operational isolation with the convenience of close data proximity, offering a compromise between in-place scanning and external processing. In certain scenarios, Normalyze recommends a sidecar housed within the customer environment. With an established process for sidecar maintenance, managed either by the customer or under strict control by Normalyze, risk and compliance teams keep the data under their control while still giving security and data teams rapid insights into their posture.
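
To make the pattern concrete, here is a minimal Python sketch of one way sidecar access could be wired on AWS, assuming the primary data account grants the sidecar account a read-only role assumed via STS. The role ARN, account ID and bucket names are placeholders, and this is a sketch of the general cross-account pattern rather than how Normalyze implements it.

    # Hypothetical sketch of the sidecar pattern: a scanner in a separate account
    # inside the customer environment assumes a read-only role in the primary data
    # account, classifies data there, and stores only findings in the sidecar.
    # Role ARN, account ID and bucket names are placeholders.
    import json
    import boto3

    PRIMARY_ACCOUNT_ROLE = "arn:aws:iam::111111111111:role/dspm-readonly"  # placeholder
    FINDINGS_BUCKET = "sidecar-findings-bucket"                            # placeholder

    def primary_account_session() -> boto3.Session:
        """Assume the read-only role the primary data account grants to the sidecar."""
        creds = boto3.client("sts").assume_role(
            RoleArn=PRIMARY_ACCOUNT_ROLE,
            RoleSessionName="dspm-sidecar-scan",
        )["Credentials"]
        return boto3.Session(
            aws_access_key_id=creds["AccessKeyId"],
            aws_secret_access_key=creds["SecretAccessKey"],
            aws_session_token=creds["SessionToken"],
        )

    def record_findings(findings: list[dict]) -> None:
        """Keep findings (metadata only) in the sidecar account, not the data account."""
        sidecar_s3 = boto3.client("s3")  # the sidecar account's own credentials
        sidecar_s3.put_object(
            Bucket=FINDINGS_BUCKET,
            Key="scan-results.json",
            Body=json.dumps(findings).encode("utf-8"),
        )

    if __name__ == "__main__":
        primary_s3 = primary_account_session().client("s3")
        # ...list and classify objects in the primary account with primary_s3,
        # then persist only the resulting findings in the sidecar account:
        record_findings([{"key": "example-object", "classifications": ["email"]}])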

Resources

GigaOm Radar for DSPM 2024

Data is the most valuable asset for a modern enterprise, and its proliferation everywhere makes DSPM an essential tool for visibility into where sensitive data is, who has access to it, and how it's being used.

The Normalyze cloud-native platform

Learn how we deliver the fastest scanning at scale with the most accurate classification across every data environment.

A Buyer’s Guide to Data Security Posture Management

The 2024 DSPM Buyer’s Guide is designed to help you work through your research process, clearly define your internal requirements, and make well-informed decisions for your organization.