
GDPR and the Transparency Revolution

It’s been over a year since GDPR went into effect, and in that time, the regulation has driven a great deal of meaningful conversation around consumer privacy and enterprise data management policies. In an age where “data is the new oil,” fueling innovation and growth, enterprises face all kinds of new challenges when it comes to handling consumer data.
Large data breaches are constantly putting personal data at risk, and today, consumers are demanding greater transparency when it comes to the collection and use of their data, marking a shift from the days when everyone blindly accepted terms of service contracts from tech companies.
GDPR has forced companies to be honest and clear about how they leverage data now that consumers are paying more attention to the fine print. Individuals are more informed than ever about their privacy rights and are exercising them accordingly, and organizations are finally beginning to understand the importance of demonstrating accountability and transparency in how they collect, handle, and transfer personal data. One misstep, and an enterprise runs the risk of hefty fines or, worse, losing customers.
Thanks to GDPR, companies are finally realizing that while data is a hugely valuable asset, it is also a significant liability. Companies that fail to prioritize transparency around data privacy and protection aren’t just risking damage to profits and brand reputation; they’re also risking non-compliance with the GDPR’s most important overarching obligation. To get ahead of this, proactive organizations have already taken steps to ensure transparency and compliance in the age of GDPR.
Capturing Consent at Every Step of the Customer Journey
A large portion of the GDPR text is dedicated to regulating how businesses capture consent in an effort to demonstrate transparency. Updating a company’s privacy policy is a great start, but capturing consent by adding clear and concise language to every lead form, chat box, and email opt-in is ideal. Individuals should know exactly what they’re signing up for, and companies that give them the option to do so are taking a step in the right direction towards data transparency and accountability.
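In practice, capturing consent at each touchpoint means recording who agreed to what, when, where, and under which exact wording. As a minimal sketch (the field names, purposes, and touchpoint labels here are illustrative assumptions, not anything prescribed by the GDPR text), a consent log might look like this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One auditable consent entry: who, what purpose, where, exact wording, when."""
    subject_id: str
    purpose: str          # e.g. "marketing-email" (illustrative label)
    touchpoint: str       # e.g. "newsletter-signup-form", "chat-box"
    wording_shown: str    # the exact language the individual saw and accepted
    granted_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

consent_log: list[ConsentRecord] = []

def capture_consent(subject_id: str, purpose: str, touchpoint: str,
                    wording_shown: str) -> ConsentRecord:
    """Append a consent entry for one specific purpose at one touchpoint."""
    record = ConsentRecord(subject_id, purpose, touchpoint, wording_shown)
    consent_log.append(record)
    return record

rec = capture_consent(
    "user-42", "marketing-email", "newsletter-signup-form",
    "We will email you our weekly newsletter. Unsubscribe any time.",
)
```

Keeping the exact wording alongside the timestamp is what lets a company later demonstrate that the individual knew exactly what they were signing up for.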
Adopt Data Minimization Practices
For a while, professional data scientists were in high demand, and companies sought to amass and analyze as much data as possible to drive business innovation and growth. That mentality has changed in the wake of GDPR.
Businesses are adopting data minimization practices at collection or at retention so they can handle less but more meaningful data, and they’re putting policies in place to dispose of it when it’s no longer needed or useful. Ultimately, an organization’s best cybersecurity measure is to collect less data and encrypt it so that it’s safe in the event of a breach.
Consider Encryption and Data Distribution
Encryption makes personal data indecipherable to anyone who isn’t authorized to see it, and data distribution splits information files so they aren’t all held in one place, where they would be more susceptible to a cyber attack.
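To make the distribution idea concrete, here is a minimal sketch of a two-share split, one of the simplest ways to avoid holding data in a single place (this is an illustrative technique chosen for the example, not a specific product’s method): the plaintext is XORed with a random one-time pad, and the pad and the masked result are stored in separate locations. Neither share alone reveals anything about the original; both are required to rebuild it.

```python
import secrets

def split(data: bytes) -> tuple[bytes, bytes]:
    """Split data into two shares to be stored in separate locations."""
    pad = secrets.token_bytes(len(data))               # share 1: random pad
    masked = bytes(a ^ b for a, b in zip(data, pad))   # share 2: data XOR pad
    return pad, masked

def combine(pad: bytes, masked: bytes) -> bytes:
    """XOR is its own inverse, so recombining the shares restores the data."""
    return bytes(a ^ b for a, b in zip(pad, masked))

email = b"jane.doe@example.com"
pad, masked = split(email)
restored = combine(pad, masked)  # equals the original email
```

An attacker who breaches the store holding only one share learns nothing, which is the point of not keeping everything in one place.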
Companies that care about their customers, employees, and vendors as much as they care about being GDPR-compliant should adopt data minimization practices and leverage technology to encrypt and distribute the small amount of data they do collect.
Don’t Skimp on Quality Cybersecurity
We live in a world where companies are collecting and handling more personal data than ever before, but holding any amount of data is risky if an organization isn’t doing enough to protect it.
The benefit of having less data to manage is that there’s less data to steal, but that shouldn’t exempt companies from implementing cybersecurity strategies and technologies that don’t just “tick the box” for GDPR compliance but are actually effective at thwarting cybercrime. New technologies can provide strong data protection while still enabling companies to offer compelling, customized digital experiences that give customers confidence their data isn’t being carelessly collected or processed.
It’s time for global organizations to proactively address the fact that data is both an asset and a liability by being more thoughtful about how much data they’re capturing instead of storing everything, unfiltered, for some unforeseen future use.
Companies need data to function; it’s an asset to every organization, and it would be unrealistic (and unfair) to ask a company to stop collecting it entirely. However, businesses that understand the risks associated with negligent data management practices will be better positioned to comply with GDPR and offer clarity to data subjects. Collecting only the most relevant information for a very specific purpose is not only easier to explain to individuals, but it also gives them confidence that you won’t misuse or abuse it.
It’s then up to you, the organization, to communicate transparently with your consumers and put processes in place to prioritize their personal data and handle it responsibly.
About the author: David Thomas is the CEO of Evident ID, a provider of online identity verification solutions. David has held key leadership roles at Motorola, AirDefense, VeriSign, and SecureIT. Since being recruited at a young age by the Department of Defense, David has been at the forefront of cybersecurity, including firewalls as corporations began connecting to the Internet, Web security as online shopping emerged, wireless security as Wi-Fi and smartphones became ubiquitous, and security sensing networks as analytic technology became mainstream. He has been featured in CNN, The Wall Street Journal, and other leading publications.