Domain 2 · Lesson 3 of 5

Data Lifecycle & Data States

Vòng đời Dữ liệu & Trạng thái Dữ liệu

Theory

Data Lifecycle Stages & Controls

Data passes through six stages from creation to destruction. Security controls must be applied at every stage — there is no safe gap in the lifecycle.

| Stage | Tiếng Việt | Key Controls | TS Example |
| --- | --- | --- | --- |
| Create | Tạo | Input validation, classification at ingestion, consent capture | Customer submits loan application; KYC data captured via Partner C/Platform C onboarding |
| Store | Lưu trữ | Encryption at rest, access control (RBAC), backup and recovery, integrity controls | AES-256-CTR in PostgreSQL (Platform C); signed PDFs in GCS; card tokens in encrypted store |
| Use | Sử dụng | Authorization enforcement, audit logging, need-to-know access, minimized in-memory exposure | Credit scoring service processes income data; AML screening queries customer profile |
| Share | Chia sẻ | Encryption in transit (TLS/mTLS), data sharing agreements (DPA/NDA), data minimization in payloads | Send credit score to Partner A via H2H; AML check with AML Vendor; biometric verify with eKYC Vendor |
| Archive | Lưu trữ lâu dài | Retention schedule compliance, continued encryption, restricted access, integrity verification | Closed Partner A loan records archived to cold GCS storage after 2 years; access restricted to compliance team |
| Destroy | Hủy | NIST SP 800-88 method appropriate to media, verification and documentation of destruction | Platform B 30-day soft-delete window expires; NIST media sanitization for decommissioned hardware |
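The "integrity controls" listed for the Store stage (and "integrity verification" at Archive) can be as simple as a keyed checksum kept alongside each record. A minimal sketch using Python's standard library; `INTEGRITY_KEY` and the record values are illustrative only — in practice the key would come from a KMS such as Vault:

```python
import hashlib
import hmac

INTEGRITY_KEY = b"demo-key-rotate-via-vault"  # illustrative; a real key lives in a KMS

def seal(record: bytes) -> bytes:
    """Compute an HMAC tag at Store time; keep it alongside the record."""
    return hmac.new(INTEGRITY_KEY, record, hashlib.sha256).digest()

def verify(record: bytes, tag: bytes) -> bool:
    """Re-check on read, and again during Archive integrity verification."""
    return hmac.compare_digest(seal(record), tag)

tag = seal(b"loan-record-001")
print(verify(b"loan-record-001", tag))  # True
print(verify(b"tampered-record", tag))  # False
```

An HMAC (rather than a bare SHA-256 hash) is used so that an attacker who can modify the stored record cannot simply recompute a matching checksum without the key.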

Data States & Their Controls

At any point in its lifecycle, data exists in one of three states. Each state requires different protection mechanisms:

| State | Tiếng Việt | Controls |
| --- | --- | --- |
| At Rest | Lưu trữ (tĩnh) | AES-256 encryption, full-disk encryption (FDE), database encryption, access control lists, key management (Vault) |
| In Transit | Truyền tải | TLS 1.3, mTLS (mutual authentication), VPN tunnels, SFTP, certificate pinning |
| In Use | Đang xử lý | Secure enclaves (Intel SGX, AWS Nitro), minimized exposure time, memory scrubbing, process isolation |

In Use — The Hardest State to Protect

When data is actively being processed, it must be decrypted in RAM. Standard encryption does not apply to data in memory. This is why secure enclaves (isolated memory regions that prevent even the OS from reading the data) are used for the most sensitive operations. Memory-scrapers and cold-boot attacks target this state specifically.

Data Minimization

A core GDPR principle: collect only what is necessary for the stated purpose, and retain only as long as necessary. Collecting more data than needed increases breach impact, regulatory exposure, and storage costs. For FinTech Company X: collect the minimum credit decision fields; do not collect data "because it might be useful later."
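In code, collection-side minimization often reduces to an explicit allow-list: fields not on the list never enter the system. A minimal sketch; the field names are hypothetical credit-decision fields, not an actual FinTech Company X schema:

```python
# Hypothetical allow-list of fields needed for a credit decision.
ALLOWED_CREDIT_FIELDS = {
    "customer_id",
    "monthly_income",
    "existing_debt",
    "employment_status",
}

def minimize(record: dict) -> dict:
    """Drop every field not explicitly required for the stated purpose."""
    return {k: v for k, v in record.items() if k in ALLOWED_CREDIT_FIELDS}

submitted = {"customer_id": "c-17", "monthly_income": 4200, "religion": "n/a"}
print(minimize(submitted))  # religion is dropped, not stored
```

An allow-list is preferable to a deny-list here: a new, unanticipated field is excluded by default, which matches the "collect only what is necessary" principle.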

Key Terms

Data Lifecycle — the six stages data passes through: Create, Store, Use, Share, Archive, Destroy
At Rest — data stored on disk, database, or media; protected by encryption and access control
In Transit — data moving across a network; protected by TLS, mTLS, VPN
In Use — data actively processed in memory (RAM); hardest to protect; requires secure enclaves
Data Minimization — GDPR principle to collect and retain only minimum necessary data
Archiving — moving data to long-term storage; classification and controls must be maintained
Retention Schedule — formal policy defining how long each data type must be kept before destruction
Secure Enclave — isolated memory region (e.g., Intel SGX) where data can be processed without OS access
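A retention schedule can be expressed as data rather than prose, which makes the Destroy stage auditable. A minimal sketch; the data types and periods below are illustrative assumptions, not FinTech Company X's actual policy:

```python
from datetime import date, timedelta

# Illustrative retention periods; real values come from the formal retention policy.
RETENTION = {
    "kyc_document": timedelta(days=5 * 365),  # e.g. a statutory KYC period
    "access_log":   timedelta(days=365),
}

def due_for_destruction(data_type: str, created: date, today: date) -> bool:
    """True once the retention period for this data type has elapsed."""
    return today >= created + RETENTION[data_type]

print(due_for_destruction("access_log", date(2020, 1, 1), date(2022, 1, 1)))   # True
print(due_for_destruction("kyc_document", date(2020, 1, 1), date(2022, 1, 1)))  # False
```

A scheduled job can then sweep each data store against this table, feeding the verification-and-documentation requirement of the Destroy stage.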

Exam Tips

Tip 1 — In Use Is the Hardest to Protect
Exam questions frequently test this. Data in RAM is decrypted — no standard encryption protects it. Secure enclaves provide hardware-level isolation, but are not universally deployed. If a question asks which data state is most vulnerable, the answer is In Use.

Tip 2 — Data Minimization Limits Both Collection AND Retention
GDPR's data minimization principle applies at two points: (1) at the Create stage — collect only what's needed; (2) at the Archive/Destroy stage — don't retain longer than the stated purpose requires. Both are exam-relevant.

Tip 3 — Security Controls Apply at EVERY Lifecycle Stage
A common exam trap: "once data is archived, it no longer needs protection." Wrong — archived data retains its original classification level and requires continued encryption and access control. "Archived" does not mean "no longer sensitive."

Tip 4 — Archived Data Keeps Its Original Classification
Data classified as Confidential when active is still Confidential when archived. Classification does not automatically downgrade when data moves to archive. Only a formal reclassification review can change the classification level.

Work Application — FinTech Company X

Action Item — Audit Platform C Data Lifecycle
Audit each lifecycle stage for Platform C customer data:

| Stage | Audit Question | Expected State |
| --- | --- | --- |
| Create | Is consent captured and logged (with timestamp + version) for every data field collected? | Consent log in immutable store; field-level consent mapping |
| Store | Is AES-256-CTR confirmed active for all PII fields in PostgreSQL? | Encryption verified; Vault key rotation schedule active |
| Use | Is the credit scoring service audit-logged for every data access event? | SIEM receiving access logs; anomaly alerting enabled |
| Share | Is mTLS enforced for the AML Vendor AML API call? Is the payload minimized to required fields only? | mTLS certificates verified; no PII beyond required AML fields in payload |
| Archive | Are closed Partner A loan records encrypted in cold GCS storage with access restricted to the compliance team? | GCS bucket: encryption on, public access off, IAM role-gated |
| Destroy | Platform B 30-day soft-delete: does it also purge the Redis cache, Kafka events, Elasticsearch index, AND eKYC Vendor biometrics? | All data stores included in deletion job; eKYC Vendor biometric deletion confirmed via API |
Critical gap to check: Kafka is append-only — you cannot delete individual records from a Kafka topic. If customer PII flows through Kafka, the payload must be anonymized (not just the offset deleted) to achieve true erasure at the Destroy stage.

Practice Questions

Q1. A Platform B customer's data is loaded into RAM during their active session for profile rendering and credit limit display. Which data state does this represent?

A) At Rest   B) In Transit   C) In Use   D) In Archive

✓ C) In Use
Data actively being processed in memory (RAM) is In Use. At this point, the data is decrypted and accessible to the application process. Standard disk encryption (AES-256-CTR) does not protect data once it is loaded into RAM for processing — hence In Use is the most vulnerable data state.

Q2. mTLS (mutual TLS) between Platform C and the AML Vendor AML API protects data in which state?

A) At Rest   B) In Use   C) In Transit   D) In Archive

✓ C) In Transit
mTLS is a transport-layer security protocol — it encrypts data as it moves between two systems over a network. Once data arrives at AML Vendor's servers and is written to disk, it is At Rest. Once it is loaded into AML Vendor's processing engine, it is In Use. mTLS specifically protects the In Transit state.

Q3. Partner A loan records are moved to a cold GCS archive after 2 years. What classification level applies to this archived data?

A) Unclassified — archived data is no longer active   B) Internal — reduced sensitivity over time   C) Same Confidential/Restricted level as when active   D) Public — data is historical

✓ C) Same Confidential/Restricted level as when active
Archived data retains its original classification. Moving data to archive does not change its sensitivity — a customer's credit score and loan history are just as sensitive whether the loan is active or closed 2 years ago. Classification can only change through a formal reclassification review process initiated by the Data Owner.

Q4. Why is the "In Use" data state considered the hardest to protect?

A) Encryption algorithms are weaker for RAM   B) Data must be decrypted in memory for processing, making standard encryption inapplicable   C) Regulations don't cover In Use data   D) It is only temporary, so controls are skipped

✓ B) Data must be decrypted in memory for processing
For an application to use data, the CPU must work on the plaintext. This means the data exists unencrypted in RAM during processing. Memory-scraping attacks, cold-boot attacks, and hypervisor attacks can extract this data. Secure enclaves (Intel SGX, AWS Nitro Enclaves) partially mitigate this by creating hardware-isolated memory regions, but they are not universally deployed.

Q5. GDPR's data minimization principle requires that an organization:

A) Encrypt all data at rest   B) Collect only necessary data and retain it only as long as needed   C) Share data only within the EU   D) Anonymize all data before storage

✓ B) Collect only necessary data and retain it only as long as needed
Data minimization (GDPR Article 5(1)(c)) has two dimensions: (1) collection minimization — only collect data that is adequate, relevant, and limited to what is necessary for the stated purpose; (2) retention minimization — do not keep data longer than necessary for the original purpose. For FinTech Company X: only collect KYC fields required for credit decisions, and delete them when the purpose expires or the customer exercises erasure rights.