Domain 2 · Lesson 5 of 5

Privacy Controls, DLP & Compliance


Theory

Privacy by Design (PbD) — 7 Principles

Developed by Ann Cavoukian, Privacy by Design (PbD) is the principle that privacy must be built into systems and processes from the start — not added as an afterthought. It is embedded in GDPR (Article 25) and is fundamental to regulatory compliance for FinTech Company X.

# | Principle | What It Means
1 | Proactive not Reactive | Anticipate and prevent privacy issues before they occur — not respond after a breach
2 | Privacy as the Default | Default settings must protect privacy; users must actively opt in to share more data, not opt out
3 | Privacy Embedded into Design | Privacy is integral to the system architecture — not a feature added on top afterward
4 | Full Functionality (Positive-Sum) | Privacy AND functionality — not a trade-off. Both goals can and must be achieved simultaneously
5 | End-to-End Security | Protect data through its entire lifecycle — from creation to secure destruction
6 | Visibility and Transparency | Be open about what data is collected, why, and how it is processed — privacy notices, data maps
7 | Respect for User Privacy | Keep it user-centric — individual data subject rights are paramount; build controls that empower users

Anonymization vs Pseudonymization

These are often confused — the distinction has major regulatory implications:

Feature | Anonymization | Pseudonymization
Reversible? | No — no mapping table kept; original identity cannot be recovered | Yes — with the separate mapping table, original identity can be restored
GDPR personal data? | No — once truly anonymized, GDPR no longer applies | Yes — still personal data (GDPR applies); offers reduced risk but not exemption
Example | Replace customer phone number with a random hash and discard the mapping | Replace phone number with a token; mapping table stored separately in a secure vault
Use case | Analytics datasets, ML training data for production models | Production databases with audit-trail needs; reduces breach impact while preserving reversibility
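
The distinction is easy to see in code. Below is a minimal Python sketch, illustrative only: the function names are hypothetical, and a real mapping table would live in an access-controlled vault, not an in-memory dict.

```python
import secrets

# Pseudonymization: a reversible token with a separately stored mapping.
# (Illustration only: in production the mapping lives in a secure vault.)
_mapping: dict[str, str] = {}

def pseudonymize(phone: str) -> str:
    token = secrets.token_hex(16)   # no mathematical link to the input
    _mapping[token] = phone         # the way back exists, so GDPR still applies
    return token

def re_identify(token: str) -> str:
    return _mapping[token]          # possible only while the mapping exists

# Anonymization: no mapping is ever kept, so re-identification is impossible
# and the output is no longer personal data under GDPR.
def anonymize(phone: str) -> str:
    return secrets.token_hex(16)    # random surrogate; original is unrecoverable
```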

Tokenization & Data Masking

  • Tokenization: Replaces sensitive data with a non-sensitive surrogate value (token) with no mathematical relationship to the original. Primary use: PCI-DSS scope reduction. Card Processor holds the real PAN; FinTech Company X holds only the token — no PAN ever stored in Platform C systems.
  • Data masking: Shows partial data (e.g., ****-****-****-1234). Used in non-production environments and UI displays where the full value is not needed. Partner E card display should show only last 4 digits.
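
As a sketch of the masking rule in the bullet above, here is a small hypothetical helper that renders only the last four digits for UI display. The function name and output format are illustrative assumptions.

```python
def mask_pan(pan: str) -> str:
    """Display-safe masking: show only the last four digits of a card number."""
    digits = "".join(ch for ch in pan if ch.isdigit())
    return "****-****-****-" + digits[-4:]

print(mask_pan("4111 1111 1111 1234"))  # ****-****-****-1234
```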

Data Loss Prevention (DLP)

DLP systems monitor and prevent unauthorized transfer or exfiltration of sensitive data. Three deployment modes:

Type | Monitors | FinTech Company X Use Case
Network DLP | Outbound traffic — email attachments, web uploads, SFTP transfers | Alert if customer PII is emailed to personal Gmail; block unencrypted PII uploads to external storage
Endpoint DLP | USB copy, print, clipboard, screen capture on employee devices | Prevent staff from copying customer data to personal USB drives
Cloud DLP | Cloud storage buckets (GCS, S3) — scans for sensitive data in the wrong location | Scan GCS buckets for misplaced PII (e.g., a public bucket containing loan PDFs)
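
To make the content-inspection step concrete, here is a hedged sketch of the kind of check a Network DLP might run on outbound text: a regex for candidate card-number runs plus a Luhn checksum to cut false positives. The pattern and the response are assumptions for illustration, not any specific product's rules.

```python
import re

# Candidate card-number runs: 13-16 digits, optionally space/dash separated.
PAN_RE = re.compile(r"\b(?:\d[ -]?){12,15}\d\b")

def luhn_ok(candidate: str) -> bool:
    """Luhn checksum filters out random digit runs that are not real PANs."""
    digits = [int(ch) for ch in candidate if ch.isdigit()]
    odd, even = digits[-1::-2], digits[-2::-2]
    total = sum(odd) + sum(sum(divmod(2 * d, 10)) for d in even)
    return total % 10 == 0

def contains_pan(text: str) -> bool:
    return any(luhn_ok(m.group()) for m in PAN_RE.finditer(text))

outbound = "Please process card 4111 1111 1111 1111 for this customer."
if contains_pan(outbound):
    print("DLP: blocking outbound message containing a PAN")
```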

Digital Rights Management (IRM / DRM)

IRM (Information Rights Management) — also called DRM — controls what recipients can do with a document or file. Unlike perimeter controls (firewalls, DLP), IRM controls travel with the data itself. This means a recipient who legitimately receives a document still cannot print it, forward it, or copy from it if the IRM policy prohibits those actions.

  • View-only restrictions (cannot print or copy)
  • No-forward on emails
  • Time-limited access (document expires after 30 days)
  • Watermarking (track who leaked a document)
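
A toy sketch of the core idea, that the policy travels with the document rather than sitting at the perimeter. Real IRM products encrypt the payload and rely on a trusted client for enforcement; this models only the policy evaluation, with hypothetical field names.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Toy model: the policy is carried with the document payload itself.
@dataclass
class IrmPolicy:
    can_print: bool = False
    can_forward: bool = False
    expires: Optional[date] = None   # time-limited access

def allowed(policy: IrmPolicy, action: str, today: date) -> bool:
    if policy.expires is not None and today > policy.expires:
        return False                 # document has expired for this recipient
    permissions = {"view": True, "print": policy.can_print, "forward": policy.can_forward}
    return permissions.get(action, False)

# View-only document that expires; the recipient's device does not matter.
doc_policy = IrmPolicy(expires=date(2025, 6, 30))
print(allowed(doc_policy, "print", date(2025, 6, 1)))  # False
```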

Data Subject Rights

Right | GDPR | Philippines DPA 2012
Access | Art. 15 — right to know what data is held and how it is processed | Sec. 16(b) — right to access personal information
Rectification | Art. 16 — right to correct inaccurate data | Sec. 16(c) — right to dispute and correct inaccurate data
Erasure (Right to be Forgotten) | Art. 17 — right to deletion (with exceptions) | Sec. 16(d) — right to erasure or blocking of data
Data Portability | Art. 20 — right to receive data in structured format | Sec. 18 — right to data portability
Object | Art. 21 — right to object to processing (e.g., marketing) | Sec. 16(f) — right to object to processing
No Automated Decision-Making | Art. 22 — right not to be subject to solely automated decisions with significant effect | Sec. 16(g) — right to complain about automated decisions
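
The erasure right in particular is conditional rather than absolute (see Tip 5 below). Here is a hedged sketch of a request handler that checks statutory exceptions before deleting; the field names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical erasure-request handler illustrating that the right to erasure
# (GDPR Art. 17 / DPA 2012 Sec. 16(d)) has statutory exceptions.
@dataclass
class Subject:
    id: str
    under_legal_hold: bool      # e.g., loan in arrears heading to collection
    retention_required: bool    # e.g., statutory record-keeping obligation

def handle_erasure_request(subject: Subject) -> str:
    if subject.under_legal_hold:
        return "DENY: needed for establishment/exercise/defense of legal claims"
    if subject.retention_required:
        return "DENY: retention required by legal obligation"
    return "ERASE: delete personal data and confirm to the data subject"

print(handle_erasure_request(Subject("c-1001", under_legal_hold=True, retention_required=False)))
```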

Key Terms

Privacy by Design — embed privacy into systems from the start; proactive, not reactive
Anonymization — irreversible removal of identifying information; GDPR no longer applies
Pseudonymization — reversible replacement with token/key; GDPR still applies but risk is reduced
Tokenization — replaces sensitive data with a non-related surrogate; primary use is PCI-DSS scope reduction
Data Masking — partial display of data (e.g., ****1234); for non-prod environments and UI
DLP — Data Loss Prevention; monitors/prevents unauthorized data exfiltration
IRM / DRM — Information/Digital Rights Management; controls travel with the data (no-print, no-forward, expiry)
GDPR — EU General Data Protection Regulation; applies to any processing of EU data subjects' personal data
Data Subject Rights — rights of individuals over their personal data: access, rectification, erasure, portability, objection
Right to Erasure — data subject right to deletion; has exceptions for legal obligation, public interest, legal claims
NPC — National Privacy Commission (Philippines); enforces DPA 2012

Exam Tips

Tip 1 — Privacy by Design: Proactive Not Reactive. The key phrase is "proactive not reactive" — privacy is anticipated and built in before problems arise. Exam questions often present a scenario where privacy was "bolted on" after the system was built — that violates PbD. The answer is always to embed privacy into design from the beginning.
Tip 2 — Anonymization vs Pseudonymization: GDPR Scope. This is heavily tested. Anonymized data = GDPR does NOT apply (no longer personal data). Pseudonymized data = GDPR DOES apply (personal data — just harder to link). The differentiator is reversibility: if a mapping table exists, it's pseudonymization. If no re-identification is possible by any means, it's anonymization.
Tip 3 — Tokenization Reduces PCI-DSS Scope. When FinTech Company X uses Card Processor for card processing, Card Processor holds the actual PAN. Platform C stores only the token. This removes Platform C from PCI-DSS scope for cardholder data (the token has no value to an attacker without the mapping at Card Processor). Tokenization = scope reduction, not elimination of PCI-DSS compliance entirely.
Tip 4 — DRM / IRM Controls Travel With the Data. Unlike firewalls or DLP (which protect the perimeter), IRM/DRM controls are embedded in the document or file itself. A recipient who legitimately downloads the file still cannot print it or forward it if IRM prohibits those actions. Exam question: "A document was legitimately emailed to a partner, but we don't want them to print it." → IRM/DRM, not DLP.
Tip 5 — Right to Erasure Has Exceptions. GDPR Art. 17 right to erasure can be denied when: (1) legal obligation requires retention; (2) public interest / public health; (3) establishment, exercise, or defense of legal claims; (4) freedom of expression and information. Know these exceptions — exam questions will present scenarios where the answer is to deny the erasure request lawfully.

Work Application — FinTech Company X

Action Item — Apply Privacy by Design to Platform C & Partner C. Audit each PbD principle against Platform C's current implementation:
PbD Principle | Platform C Implementation Check | Gap / Action
1. Proactive | Are privacy impact assessments (PIA/DPIA) run before new features ship? | Add a DPIA gate to the feature release process; required for any new personal data field
2. Privacy Default | Does Platform C default to collecting minimum fields? Does the Partner C lead form have all optional fields unchecked by default? | Audit the Partner C form — any pre-ticked consent checkboxes are non-compliant under DPA 2012
3. Embedded | Is encryption part of the schema design, not a post-deployment patch? | Confirm AES-256-CTR is defined in schema migrations, not retrofitted via the application layer
6. Transparency | Is there a privacy notice displayed before Partner C (PH) form submission? | Required by DPA 2012 — the privacy notice must be shown before data collection begins; consent checkbox required; NPC-registered data processing activity
Partner E / Card Processor card display: Card number must show only last 4 digits (****-****-****-1234) in all UI surfaces — account dashboard, transaction history, customer service screens. PAN must never appear in full in any Platform C/Platform B interface. Card Processor holds PAN; Platform C holds only the token.
Non-production environments: Staging, QA, and development environments must use masked or synthetic data only. Production PII must never be copied to non-production environments — a common audit finding and DPA 2012 violation risk.
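
One way to enforce that last rule is to synthesize records before they ever reach staging. Here is a sketch using the Faker library, an assumed tooling choice; any synthetic-data generator works, and the field names are illustrative.

```python
from faker import Faker  # pip install faker

fake = Faker("en_PH")  # Philippine locale: realistic but entirely fake values

def synthesize_record(prod_row: dict) -> dict:
    """Return a staging-safe copy with every PII field replaced synthetically."""
    return {
        **prod_row,
        "full_name": fake.name(),
        "phone": fake.phone_number(),
        "email": fake.email(),
        "address": fake.address(),
    }

row = {"loan_id": "L-1042", "full_name": "REAL NAME", "phone": "+63...", "email": "x@y", "address": "..."}
print(synthesize_record(row))  # loan_id preserved; all PII fields replaced
```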

Practice Questions

Q1. Platform C's loan application form collects 32 fields. A privacy review finds that only 18 are required for the credit decision. The remaining 14 are collected "because they might be useful for future marketing." Which Privacy by Design principle does removing the 14 fields most directly address?

A) Principle 4 — Full Functionality
B) Principle 2 — Privacy as the Default + data minimization
C) Principle 6 — Transparency
D) Principle 3 — Embedded into Design

✓ B) Principle 2 — Privacy as the Default + data minimization
Privacy as the Default means collecting only what is necessary by default — users do not need to opt out of over-collection. Combined with the GDPR data minimization principle (collect only what is adequate, relevant, and necessary for the stated purpose), collecting extra fields for speculative future marketing violates both. The 14 extra fields should be removed unless explicit purpose and legal basis for collection are established. Principle 7 (respect for user privacy) also supports this, but Principle 2 is the most direct answer.

Q2. Platform C replaces a customer's phone number with a randomly generated token for analytics use. The mapping table (token → original phone number) is stored in a separate, access-controlled vault. Is this anonymization or pseudonymization?

A) Anonymization — data is replaced with a token
B) Pseudonymization — mapping table exists; original identity can be restored
C) Tokenization — equivalent to PCI-DSS tokenization
D) Encryption — functionally equivalent to AES encryption

✓ B) Pseudonymization — mapping table exists
The key differentiator is reversibility. Because a mapping table exists (stored in the vault), the original phone number can be recovered by anyone with access to the mapping. This is pseudonymization, not anonymization. GDPR still applies to pseudonymized data — it is still personal data. Anonymization requires that re-identification be permanently impossible by any means. The mapping table must be destroyed for the data to become truly anonymized.

Q3. FinTech Company X uses Card Processor for card processing. Platform C stores only the Card Processor-issued token; Card Processor holds the actual PAN. What is the primary security benefit of this arrangement?

A) Biometric data is protected by eKYC Vendor
B) FinTech Company X's Platform C systems are removed from PCI-DSS cardholder data scope
C) Customer consent is automatically captured
D) Data minimization is achieved for PII

✓ B) Platform C systems are removed from PCI-DSS cardholder data scope
PCI-DSS compliance requirements apply to any system that stores, processes, or transmits cardholder data (PANs). By using tokenization — where Card Processor holds the PAN and Platform C holds only a non-sensitive token — Platform C's systems are removed from PCI-DSS cardholder data scope. If Platform C were breached, attackers would find only tokens, which have no value without Card Processor's secure mapping table. This is the primary commercial driver for tokenization in fintech architectures.

Q4. A confidential Partner A credit policy document was legitimately emailed to a Partner A executive. FinTech Company X wants to ensure the executive cannot print it or forward it to third parties. Which control achieves this?

A) Network DLP — block outbound email attachments
B) Endpoint DLP — monitor USB and print on the executive's device
C) IRM / DRM — embed controls in the document that prevent printing and forwarding
D) Data masking — redact sensitive sections before sending

✓ C) IRM / DRM — controls travel with the document
DLP (Network or Endpoint) protects data within the organization's perimeter — it cannot control what a recipient does with legitimately received data on their own device. IRM/DRM embeds access controls directly into the document file. When the executive opens the document, the IRM client enforces the policy: no print, no forward, no copy — regardless of whose device they use. This is the defining advantage of IRM/DRM over perimeter-based controls.

Q5. A Philippine Partner C customer submits a data erasure request under DPA 2012. However, the customer has an active loan with Partner A that is currently in arrears and may result in legal action. Can FinTech Company X deny the erasure request? Why?

A) No — data subject rights are absolute under DPA 2012
B) Yes — the data may be required for the establishment, exercise, or defense of legal claims
C) Yes — but only if NPC is notified within 72 hours
D) No — the right to erasure has no exceptions under Philippines law

✓ B) Yes — data required for legal claims
The right to erasure — like its GDPR equivalent — has statutory exceptions. Data retention may be justified when it is necessary for the establishment, exercise, or defense of legal claims (e.g., loan collection proceedings). FinTech Company X can lawfully deny the erasure request, but must: (1) document the legal basis for denial; (2) inform the customer of the reason; (3) inform the customer of their right to complain to the NPC (National Privacy Commission). Once the legal proceedings conclude and the legal basis for retention no longer exists, the erasure request must be honored.