Domain 2 Quiz: Asset Security

100 Practice Questions — Scenario-Based, ISC² Mindset

100 Questions
5 Topics
Data Governance Focus
Fintech / PH / VN Context
T1 Data Classification — Q1 to Q20
1
Data Classification — Easy

A FinTech Company X analyst discovers a spreadsheet containing applicants' full names, national IDs, and credit scores. The company has a four-tier classification scheme: Public, Internal, Confidential, and Restricted. Which classification BEST fits this dataset, and who is ultimately responsible for assigning that label?

  • A. Internal — the IT Security team assigns labels based on sensitivity scans
  • B. Confidential — the Data Custodian assigns labels when ingesting data into the DLP system
  • C. Restricted — the Data Owner assigns the classification based on business impact and regulatory requirement
  • D. Restricted — the Chief Privacy Officer mandates all PII be labeled by legal counsel
✓ Correct: C — Restricted, assigned by the Data Owner
National IDs and credit scores are regulated PII subject to the Philippine Data Privacy Act (Republic Act 10173) and credit-reporting rules, warranting the highest commercial tier (Restricted). In ISC² doctrine, the Data Owner — the business executive accountable for the data — determines classification, not IT or legal teams. Custodians and others implement controls; they do not assign labels.
💡 ISC2 Mindset: Classification authority belongs to the Data Owner, not to technical or legal teams who only support the decision.
2
Data Classification — Easy

A government security agency classifies a document as "Secret." According to the U.S. government classification scheme, what is the correct ordering from lowest to highest classification level?

  • A. Unclassified → Sensitive → Confidential → Secret → Top Secret
  • B. Unclassified → Confidential → Secret → Top Secret
  • C. Public → Internal → Confidential → Restricted → Top Secret
  • D. Confidential → Sensitive → Secret → Top Secret → Compartmented
✓ Correct: B — Unclassified → Confidential → Secret → Top Secret
The U.S. government uses three formal classification levels: Confidential, Secret, and Top Secret (with Unclassified below all of them). There is no official "Sensitive" tier in the government scheme — Sensitive But Unclassified (SBU) and Controlled Unclassified Information (CUI) are handling designations, not classification levels. Options A, C, and D mix commercial and government schemes or invent levels.
💡 ISC2 Mindset: Know the three-tier government scheme vs. multi-tier commercial schemes — exams frequently test this distinction.
3
Data Classification — Easy

A fintech company publishes its quarterly earnings report on its investor relations website. Under a commercial classification policy, this information should be labeled as which of the following?

  • A. Internal — only employees and regulators may read it
  • B. Confidential — financial data always requires the highest protection
  • C. Public — it is intentionally disclosed to all without restriction
  • D. Restricted — regulatory filings carry legal sensitivity
✓ Correct: C — Public
Once information is intentionally released to the general public (e.g., published on a website, filed with a stock exchange), it loses its confidentiality requirement and is classified as Public. The sensitivity of financial data is addressed before disclosure; once published, confidentiality controls no longer apply (though the integrity of the published report still matters). Classifying public information as Confidential would misallocate protective resources.
💡 ISC2 Mindset: Classification reflects the sensitivity of information at a point in time — publicly released information is Public regardless of its original sensitivity.
4
Data Classification — Medium

A FinTech Company X data engineer proposes classifying all machine-learning training datasets as "Internal" to avoid administrative overhead. The training data includes anonymized borrower behavioral signals derived from Restricted PII. Which principle BEST explains why this proposal is problematic?

  • A. Anonymized data cannot be re-identified, so Internal is actually too high a classification
  • B. Classification must reflect the most sensitive element in a dataset, not the derived output alone
  • C. ML training data should always be classified as Restricted per regulatory guidance
  • D. Only the Data Custodian, not a data engineer, may propose classification changes
✓ Correct: B — Classification must reflect the most sensitive element
ISC² doctrine holds that a dataset's classification is driven by its most sensitive component. Even if the training data appears anonymized, the derivation chain from Restricted PII and the re-identification risk (especially in behavioral data) mean the dataset may warrant a higher label. Downgrading classification to reduce overhead violates the principle that protection requirements follow sensitivity. Option A is incorrect because anonymization is not always irreversible — risk of re-identification must be assessed.
💡 ISC2 Mindset: Classify by the highest sensitivity tier present — never classify down to avoid overhead.
5
Data Classification — Medium

During a classification review, a manager asks whether the same classification label should apply to a database and a printed report derived from that database. What is the ISC² correct position?

  • A. The printed report may carry a lower classification because physical documents are harder to exfiltrate at scale
  • B. Classification follows the data, not the medium — both must bear the same label if they contain the same information
  • C. Physical media is classified separately from digital media under a different policy framework
  • D. Only the database needs classification; derived reports inherit protection from the access control list
✓ Correct: B — Classification follows the data, not the medium
ISC² emphasizes that classification is data-centric, not medium-centric. If a printed report and a database contain equivalent sensitive information, both must carry the same classification label and receive commensurate physical or logical controls. Allowing the physical form to carry a lower label creates a loophole where sensitive data can be downgraded simply by printing it.
💡 ISC2 Mindset: Data classification travels with the data regardless of format — paper, digital, or verbal.
6
Data Classification — Medium

Partner A's Vietnam operations store loan repayment histories for micro-finance customers. The legal team notes that Vietnam's Decree 13/2023 on personal data protection classifies certain financial data as "sensitive personal data." How should this regulatory designation affect Partner A's internal classification scheme?

  • A. Regulatory designations are advisory only; Partner A's internal scheme takes precedence
  • B. The internal classification must be at least as protective as the regulatory designation — regulatory requirements set the floor
  • C. Partner A should adopt Decree 13/2023 terminology verbatim to avoid compliance gaps
  • D. Financial data regulated under Decree 13/2023 is automatically classified as Top Secret under ISC² standards
✓ Correct: B — Regulatory requirements set the floor for protection
When regulations impose specific sensitivity designations, an organization's internal classification scheme must provide at least the same level of protection — regulations represent the minimum bar. Partner A can use its own terminology and labels, but the controls applied to data classified as sensitive under Decree 13/2023 must meet or exceed what the decree requires. Ignoring the regulatory floor creates legal liability.
💡 ISC2 Mindset: Regulatory requirements are the minimum — internal policies can exceed them but never fall below.
7
Data Classification — Medium

A CISSP candidate is reviewing classification criteria. Which of the following is the PRIMARY criterion an organization should use when determining a classification label for a specific dataset?

  • A. The storage cost of implementing controls at each classification tier
  • B. The value, sensitivity, and potential damage to the organization if the data were disclosed, altered, or destroyed
  • C. The number of users who need access — higher access volume justifies a lower classification
  • D. The format of the data (structured vs. unstructured) because controls differ by format
✓ Correct: B — Value, sensitivity, and potential damage
ISC² identifies value, sensitivity, criticality, and potential harm as the primary criteria for classification. Cost, user count, and data format are operational considerations that may influence control selection, but they do not determine the appropriate classification tier. The classification label must reflect the real-world impact of a breach or loss, driving appropriate safeguards.
💡 ISC2 Mindset: Classify by impact of loss, not by implementation convenience.
8
Data Classification — Medium

FinTech Company X's eKYC Vendor team stores facial recognition templates for identity verification. A government agency requests access to these templates for law enforcement purposes. Before granting access, the Data Owner reviews the request. What classification consideration is MOST relevant?

  • A. Biometric templates are always Unclassified because they are derived from public-facing interactions
  • B. Biometric templates qualify as sensitive personal data under the PH DPA and likely Restricted under internal policy, restricting disclosure without legal authority
  • C. Government requests automatically override internal classification and access controls
  • D. The Data Custodian should make the access decision because they manage the storage system
✓ Correct: B — Biometric templates are sensitive/Restricted; disclosure requires legal authority
Under the Philippine Data Privacy Act (RA 10173) and ISC² asset security principles, biometric data is among the most sensitive categories of personal data. An internal classification of Restricted is appropriate. Government requests do not automatically override data protection obligations — a lawful court order or specific legal authority is required. The Data Owner (not Custodian) decides whether to grant access, guided by legal counsel.
💡 ISC2 Mindset: Classification drives access decisions — even government requests must meet the legal bar set by data sensitivity.
9
Data Classification — Medium

An organization's classification policy states that data classified as "Confidential" must be encrypted at rest. A developer argues that a Confidential dataset stored on an encrypted disk partition is compliant, even though individual files are not encrypted. Is this interpretation correct?

  • A. Yes — full-disk encryption satisfies file-level encryption requirements under any reasonable interpretation
  • B. No — the policy specifies data-at-rest encryption; the Data Owner must clarify whether full-disk or file-level encryption satisfies the control
  • C. No — only AES-256-CTR is acceptable for Confidential data under ISC² standards
  • D. Yes — encryption at the partition level exceeds file-level encryption and therefore over-satisfies the policy
✓ Correct: B — Policy ambiguity must be resolved by the Data Owner
Full-disk encryption and file-level encryption have different threat models — FDE does not protect data when the system is running and authenticated, while file-level encryption does. The policy states "encrypted at rest" without specifying granularity. ISC² requires that ambiguous policy language be clarified by the Data Owner, who can decide whether FDE meets the intent. Assuming compliance without owner clarification is a governance failure. ISC² does not mandate a specific algorithm like AES-256-CTR universally.
💡 ISC2 Mindset: Policy ambiguity is resolved by the Data Owner — implementers should not self-interpret security controls.
10
Data Classification — Medium

A financial institution uses a three-tier commercial classification: Public, Internal, and Confidential. A newly hired compliance officer proposes adding a fourth tier "Regulated" specifically for data subject to PCI-DSS and local banking regulations. What is the BEST reason to support or reject this proposal?

  • A. Reject — four-tier schemes are more complex and increase misclassification risk
  • B. Support — a dedicated tier for regulated data simplifies compliance mapping and makes regulatory controls explicit
  • C. Reject — PCI-DSS mandates that card data must always be classified as Confidential, not in a separate tier
  • D. Support — only adding tiers increases classification precision; there is no downside
✓ Correct: B — A dedicated tier simplifies compliance mapping
Adding a "Regulated" tier can be beneficial when specific datasets trigger distinct legal obligations (PCI-DSS, BSP regulations, DPA) that differ from general Confidential controls. This makes the compliance linkage explicit and reduces the chance that regulatory controls are missed. ISC² supports classification schemes that are fit-for-purpose. Option A is a blanket rejection without merit. Option D overstates the case — complexity can be a downside if not managed.
💡 ISC2 Mindset: Classification schemes should be fit-for-purpose and map clearly to the organization's regulatory obligations.
11
Data Classification — Medium

A security manager receives a request to declassify a dataset that previously contained Restricted PII after the data has been anonymized. What process MUST occur before reclassification is approved?

  • A. A DLP tool scan confirming no PII fields remain in the dataset
  • B. A formal re-identification risk assessment reviewed and approved by the Data Owner
  • C. An audit by the external privacy regulator confirming the anonymization is complete
  • D. Reclassification is automatic once anonymization tools are applied — no further process is needed
✓ Correct: B — Formal re-identification risk assessment approved by the Data Owner
Anonymization is not binary — re-identification risk depends on the dataset's attributes, external data availability, and the strength of the anonymization technique. ISC² requires that declassification decisions be made by the Data Owner based on a formal assessment, not automated tool output alone. A DLP scan (A) looks for field-level PII but cannot assess re-identification risk from quasi-identifiers. Regulatory audit (C) is not a standard prerequisite for internal reclassification. Option D is incorrect — automatic reclassification on tool output alone is a governance failure.
💡 ISC2 Mindset: Declassification is a formal Data Owner decision backed by risk assessment, not an automatic outcome of processing.
12
Data Classification — Medium

In the U.S. government classification system, who has the authority to originally classify information as "Top Secret"?

  • A. Any federal employee who determines the information meets the Top Secret criteria
  • B. Only the President and Vice President of the United States
  • C. Officials designated as Original Classification Authorities (OCAs) by the President or agency heads
  • D. The Director of National Intelligence and the Secretary of Defense jointly
✓ Correct: C — Designated Original Classification Authorities (OCAs)
Under Executive Order 13526, original classification authority is granted by the President to specific officials (OCAs). Not every federal employee can classify — only those explicitly designated. OCAs at the Top Secret level are typically agency heads or their delegated senior officials. The President delegates this authority broadly; it is not limited to the President and VP alone.
💡 ISC2 Mindset: In government systems, classification authority is explicitly delegated — it is not assumed by role alone.
13
Data Classification — Medium

FinTech Company X maintains two separate data repositories: one for loan application data (Restricted) and one for marketing analytics (Internal). A data scientist wants to join these two datasets to build a churn prediction model. What classification should the resulting joined dataset receive?

  • A. Internal — the marketing dataset dominates because it has more rows
  • B. Restricted — the joined dataset inherits the highest classification of any source dataset
  • C. Confidential — joining datasets creates a new category that is one tier below the highest
  • D. The classification depends entirely on whether the join key is a PII field
✓ Correct: B — Restricted, inheriting the highest classification of any source
ISC² doctrine (aggregation principle) holds that combining datasets produces a result that must be classified at least as high as the most sensitive source. When Restricted and Internal data are joined, the result is Restricted — the higher label propagates. Additionally, the combination may itself create additional sensitivity (aggregation risk) that could justify an even higher classification. Volume and join key type do not lower the classification floor.
💡 ISC2 Mindset: Aggregation elevates classification — combined data is at least as sensitive as the most sensitive component.
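The inheritance rule above can be sketched in a few lines. This is an illustrative sketch only — the tier names and ordering are a generic four-tier commercial scheme, and `classify_join` is a hypothetical helper, not part of any standard:

```python
# Ordered from least to most sensitive; a joined dataset's classification
# floor is the highest tier appearing among its sources.
TIERS = ["Public", "Internal", "Confidential", "Restricted"]

def classify_join(*source_labels: str) -> str:
    """Return the classification floor for a dataset combining the sources."""
    return max(source_labels, key=TIERS.index)

print(classify_join("Internal", "Restricted"))  # Restricted — the higher label propagates
```

Note this computes only the floor: as the explanation says, aggregation risk may justify an even higher label after a Data Owner review.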
14
Data Classification — Medium

A healthcare company's classification policy requires all "Sensitive" data to be labeled with a visible header/footer on printed documents. An employee prints a patient record but forgets to add the header. Who is PRIMARILY responsible for this policy violation?

  • A. The IT team, for failing to configure the print system to add headers automatically
  • B. The employee, for failing to follow the labeling policy when handling sensitive data
  • C. The Data Custodian, for not enforcing technical controls that prevent unlabeled printing
  • D. The Data Owner, for failing to communicate the policy clearly enough to users
✓ Correct: B — The employee bears primary responsibility for the violation
Each individual who handles classified information is responsible for complying with the labeling policy. The employee knew the classification and was obligated to apply the required label. While technical controls (auto-labeling) reduce risk and IT/Custodian should implement them, their absence does not transfer primary responsibility away from the individual handler. Blaming only the Data Owner is too remote — policy communication is a shared responsibility.
💡 ISC2 Mindset: All individuals who handle data are responsible for following classification-based handling requirements.
15
Data Classification — Medium

An organization uses a mandatory access control (MAC) system where classification labels are enforced by the OS. A user with "Secret" clearance attempts to write a file containing Top Secret data to a Secret-labeled directory. What does the Bell-LaPadula model mandate in this scenario?

  • A. Allow the write — the user's clearance covers Secret and the directory accepts Secret files
  • B. Deny the write — writing TS data to a Secret directory would violate the "no write-down" rule
  • C. Allow the write — the simple security property permits users to write at any level they can read
  • D. Deny the read but allow the write — MAC only restricts reading, not writing
✓ Correct: B — Deny; writing TS data to a Secret container violates no write-down (*-property)
The Bell-LaPadula model's *-property (star property) states "no write-down" — a subject may not write to an object at a lower classification level. Writing Top Secret content to a Secret directory would effectively downgrade TS data, violating the confidentiality model. The system must deny this operation regardless of the user's intent. The simple security property ("no read-up") controls reading, while *-property controls writing.
💡 ISC2 Mindset: Bell-LaPadula's *-property (no write-down) is the key control that prevents accidental data downgrade.
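The two Bell-LaPadula rules in this question reduce to simple comparisons on an ordered lattice. A minimal sketch, assuming the standard U.S. government levels (function names are illustrative, not from any reference monitor API):

```python
# Classification levels ordered from lowest to highest.
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def can_read(subject_level: str, object_level: str) -> bool:
    # Simple security property: no read-up.
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level: str, object_level: str) -> bool:
    # *-property (star property): no write-down.
    return LEVELS[subject_level] <= LEVELS[object_level]

# A subject operating at Top Secret (handling TS content) must be
# denied a write into a Secret-labeled directory:
print(can_write("Top Secret", "Secret"))  # False — write-down denied
```

The asymmetry between the two comparisons is the whole model: reads are allowed downward, writes are allowed upward, and nothing leaks down.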
16
Data Classification — Medium

A fintech startup operating in the Philippines processes credit card numbers, CVVs, and expiry dates. The Chief Information Security Officer (CISO) wants to classify this data under the internal scheme. Under PCI-DSS, which elements are considered "Sensitive Authentication Data" (SAD) and may NEVER be stored after authorization, regardless of encryption?

  • A. Primary Account Number (PAN), cardholder name, and expiry date
  • B. CVV/CVC codes, PIN blocks, and full magnetic stripe data
  • C. PAN and expiry date only — CVVs may be stored if encrypted
  • D. All cardholder data must be deleted after authorization under PCI-DSS
✓ Correct: B — CVV/CVC codes, PIN blocks, and full magnetic stripe/chip data are SAD
PCI-DSS distinguishes between Cardholder Data (CHD) — PAN, name, expiry, service code — which may be stored with protection, and Sensitive Authentication Data (SAD) — CVV/CVC, PIN blocks, full track data — which must NEVER be stored after authorization, even in encrypted form. The classification for fintech card data should place SAD in the highest tier with a note that storage post-authorization is prohibited.
💡 ISC2 Mindset: Some data elements carry absolute prohibitions (no storage), not just access controls — know the difference between CHD and SAD.
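The CHD-vs-SAD distinction translates into a hard pre-storage filter rather than an access control. A hypothetical sketch (field names are illustrative; a real implementation would also truncate or encrypt the PAN per CHD rules):

```python
# Sensitive Authentication Data under PCI-DSS: never stored after
# authorization, even encrypted.
SAD_FIELDS = {"cvv", "cvc", "pin_block", "track1", "track2"}

def strip_sad(authorized_txn: dict) -> dict:
    """Drop SAD fields before the transaction record is persisted.

    Remaining CHD fields (PAN, expiry, name) still require protection
    (encryption/truncation) — this filter only enforces the storage ban.
    """
    return {k: v for k, v in authorized_txn.items() if k not in SAD_FIELDS}

record = {"pan": "4111111111111111", "expiry": "12/27", "cvv": "123"}
print(strip_sad(record))  # {'pan': '4111111111111111', 'expiry': '12/27'}
```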
17
Data Classification — Medium

A data governance team is designing classification criteria. Which of the following criteria would be LEAST appropriate as a primary classification driver?

  • A. Regulatory obligations associated with the data type
  • B. Competitive damage if the information were disclosed to a competitor
  • C. The age of the data — older data is inherently less sensitive
  • D. Privacy implications for individuals whose data is included
✓ Correct: C — Age alone is not a valid classification criterion
Age is not a reliable classification criterion because old data can still be highly sensitive (e.g., decades-old criminal records, historical medical data, or legacy financial transactions). Regulatory obligations, competitive impact, and privacy implications are all legitimate classification drivers recognized in ISC² frameworks. Assuming data loses sensitivity with age without a formal review is a classification governance failure.
💡 ISC2 Mindset: Age does not automatically reduce sensitivity — only a formal review by the Data Owner can justify declassification.
18
Data Classification — Hard

FinTech Company X's Platform B applies a 30-day soft-delete policy for customer records before permanent deletion. During this 30-day window, the data remains in the database flagged as "deleted." A legal hold is placed on an account on Day 25. What is the CORRECT action, and how does the classification of the data affect the hold decision?

  • A. Delete the data immediately on Day 30 regardless of the legal hold — the retention schedule takes precedence
  • B. Suspend the deletion process for all legal-hold records regardless of the 30-day schedule; classification does not affect the hold
  • C. The legal hold overrides the retention schedule; the data's classification (Restricted, containing PII) requires additional access controls while under hold
  • D. Notify the regulator and delete on Day 30 — legal holds only apply to financial records, not PII
✓ Correct: C — Legal hold overrides retention; Restricted classification requires access controls during hold
A legal hold (litigation hold) supersedes any retention schedule, including automated soft-delete processes. The organization must preserve the data in its current state until the hold is lifted. Because the data is classified as Restricted (PII), it must also be protected with appropriate access controls during the hold period — only authorized legal and IT staff should access it. Deleting data under legal hold constitutes spoliation, which can result in adverse court rulings and penalties.
💡 ISC2 Mindset: Legal holds are absolute overrides — no retention schedule, deletion policy, or automation can supersede them once issued.
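The override described here means the purge job must check the hold flag before it ever consults the retention clock. A minimal sketch, assuming a hypothetical record shape with a `legal_hold` flag and a `soft_deleted_on` date:

```python
from datetime import date, timedelta

RETENTION_DAYS = 30  # the Platform B soft-delete window from the scenario

def eligible_for_purge(record: dict, today: date) -> bool:
    """A record under legal hold is never purgeable, regardless of age."""
    if record.get("legal_hold"):
        return False  # hold is an absolute override of the schedule
    return today - record["soft_deleted_on"] >= timedelta(days=RETENTION_DAYS)

rec = {"soft_deleted_on": date(2024, 1, 1), "legal_hold": True}
print(eligible_for_purge(rec, date(2024, 3, 1)))  # False — preserved under hold
```

Putting the hold check first (not as a filter bolted onto the schedule) is what prevents an automated job from committing spoliation.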
19
Data Classification — Hard

A CISSP is reviewing a classification policy that allows business unit managers to reclassify data downward (e.g., from Confidential to Internal) without additional approval. What risk does this policy create, and what control would BEST mitigate it?

  • A. No significant risk — managers closest to the data are best positioned to classify it accurately
  • B. Risk of accidental downgrade; mitigate by requiring dual authorization from both the manager and the Data Steward for any downward reclassification
  • C. Risk of systematic downgrade to reduce compliance burden; mitigate by requiring Data Owner approval and a documented re-identification or impact assessment for any downward reclassification
  • D. Risk of regulatory breach; mitigate by prohibiting all downward reclassification permanently
✓ Correct: C — Risk of systematic downgrade; control is Data Owner approval + impact assessment
Allowing managers unilateral downward reclassification creates an incentive to reduce classifications to avoid compliance overhead — a pattern ISC² recognizes as a governance risk. The correct control is requiring the Data Owner (not just a business unit manager) to approve any downward reclassification, supported by a documented impact assessment showing the sensitivity has genuinely decreased. Dual manager+steward authorization (B) is weaker because neither is the Data Owner. Permanently prohibiting all downward reclassification (D) is operationally unworkable and ignores legitimate declassification needs.
💡 ISC2 Mindset: Only the Data Owner — with documented justification — should authorize downward reclassification.
20
Data Classification — Hard

A FinTech Company X security architect is mapping data flows in a microservices environment. An API endpoint receives Restricted customer PII, processes it, and returns only a tokenized ID to the calling service. The architect proposes classifying the API response (tokenized ID) as Internal. What consideration MOST threatens this classification decision?

  • A. Tokens are random and cannot be linked to PII, so Internal is appropriate
  • B. If the tokenization vault is compromised, the token becomes a surrogate for Restricted PII — the token's classification must reflect the de-tokenization risk
  • C. API responses should always inherit the classification of the input data regardless of transformation
  • D. Only the API's Data Owner can determine classification; the architect's proposal is advisory only
✓ Correct: B — Token classification must reflect de-tokenization risk if vault is compromised
Tokenization replaces PII with a surrogate value; the token alone carries no PII. However, if the tokenization vault mapping is exposed, every token immediately becomes as sensitive as the original PII. The classification of the token should therefore account for the de-tokenization risk — in high-stakes financial environments, tokens linked to a live vault may warrant Confidential, not Internal. Option A ignores vault compromise risk. Option C is overly broad — truly one-way transformed output (like a hash with no reverse mapping) can legitimately carry a lower classification.
💡 ISC2 Mindset: Transformed data inherits risk proportional to the reversibility of the transformation and the sensitivity of the source.
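A toy vault makes the de-tokenization risk concrete: the token carries no PII, but anyone holding the vault mapping recovers the Restricted value. This is an illustrative sketch only (class and method names are hypothetical; a production vault would be a hardened, access-controlled service, not an in-memory dict):

```python
import secrets

class TokenVault:
    def __init__(self):
        self._forward = {}   # PII value -> token
        self._reverse = {}   # token -> PII value

    def tokenize(self, pii: str) -> str:
        if pii in self._forward:
            return self._forward[pii]
        token = secrets.token_hex(8)  # random surrogate; no PII content
        self._forward[pii] = token
        self._reverse[token] = pii
        return token

    def detokenize(self, token: str) -> str:
        # Compromise of this mapping makes every outstanding token
        # exactly as sensitive as the PII it stands in for.
        return self._reverse[token]

vault = TokenVault()
t = vault.tokenize("PH-ID-1234567890")
assert vault.detokenize(t) == "PH-ID-1234567890"
```

Contrast this with a keyed one-way hash that keeps no reverse mapping: with nothing to reverse, the output can legitimately carry a lower classification, which is the distinction Option C misses.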
T2 Data Ownership Roles — Q21 to Q40
21
Data Roles — Medium

A FinTech Company X product manager decides that the borrower loan-application database should only be accessible to the Credit Risk and Compliance teams. The IT DBA team receives this directive and implements the access controls in the database management system. Which roles are correctly described here?

  • A. Product manager = Data Custodian; DBA team = Data Owner
  • B. Product manager = Data Owner; DBA team = Data Custodian
  • C. Product manager = Data Steward; DBA team = Data Owner
  • D. Product manager = Data User; DBA team = Data Steward
✓ Correct: B — Product manager = Data Owner; DBA team = Data Custodian
The Data Owner is the business executive or manager who makes decisions about who can access the data and how it should be protected — here the product manager holds that accountability. The Data Custodian (DBA team) implements those decisions technically but has no authority to change access policy. This Owner vs. Custodian split is the most frequently tested concept in ISC² Domain 2.
💡 ISC2 Mindset: Data Owner = decides policy; Data Custodian = implements policy. Never confuse authority with implementation.
22
Data Roles — Medium

A Data Custodian at a bank notices that a critical customer database is not being backed up according to the schedule specified in the data protection policy. What is the MOST appropriate action for the Custodian to take?

  • A. Independently determine a new backup schedule based on technical best practices and implement it
  • B. Report the gap to the Data Owner and recommend corrective action, then implement the fix once authorized
  • C. Escalate directly to the CISO and implement emergency backups without notifying the Data Owner
  • D. Wait for the next scheduled audit — operational issues are outside the Custodian's scope
✓ Correct: B — Report to Data Owner and implement fix once authorized
The Data Custodian's role is to implement and maintain controls specified by the Data Owner, not to set policy independently. When a gap is found, the Custodian should surface it to the Data Owner (the accountable party) and await authorization before changing procedures. Self-authorizing a policy change (A) or bypassing the Owner by going directly to the CISO (C) violates the governance chain. Ignoring the issue (D) is a negligence risk.
💡 ISC2 Mindset: Custodians surface problems to Owners; they do not self-authorize policy changes.
23
Data Roles — Medium

Under GDPR, an organization that determines the purposes and means of processing personal data is called the _____, while an organization that processes data on behalf of another entity is called the _____.

  • A. Data Controller; Data Processor
  • B. Data Owner; Data Custodian
  • C. Data Steward; Data User
  • D. Data Principal; Data Fiduciary
✓ Correct: A — Data Controller; Data Processor
GDPR Article 4 defines these terms precisely: the Data Controller decides why and how personal data is processed; the Data Processor acts on the Controller's behalf. These GDPR roles map roughly (but not perfectly) to ISC²'s Data Owner (Controller) and Data Custodian (Processor). ISC² candidates must know both terminologies. The Data Processor bears its own obligations under GDPR, including written data processing agreements and breach notification to the Controller.
💡 ISC2 Mindset: GDPR Controller ≈ ISC² Owner; GDPR Processor ≈ ISC² Custodian — but regulatory terms carry distinct legal obligations.
24
Data Roles — Medium

FinTech Company X contracts Card Processor to process card transaction data. FinTech Company X defines what data is collected and why; Card Processor only processes it according to a written agreement. Under GDPR and PH DPA frameworks, which entity bears ultimate accountability to the data subjects (cardholders) for this processing?

  • A. Card Processor, because they perform the actual processing and have physical custody of transaction records
  • B. FinTech Company X, as the Data Controller/Owner that determines purposes and means of processing
  • C. Accountability is equally shared — both entities sign the DPA and are jointly liable
  • D. The payment scheme (Visa/Mastercard), because they govern the card transaction rules
✓ Correct: B — FinTech Company X as Data Controller/Owner bears ultimate accountability
Under both GDPR and the Philippine DPA (RA 10173), the Data Controller is ultimately accountable to data subjects — this is the entity that determines "why" data is processed. Card Processor, as a Processor/Custodian, acts on FinTech Company X's instructions and bears processor-level obligations, but primary accountability to cardholders remains with FinTech Company X. Joint and several liability may apply in some breach scenarios, but ultimate strategic accountability rests with the Controller.
💡 ISC2 Mindset: The Controller/Owner is accountable to the data subject even when a Processor/Custodian does the work.
25
Data RolesMedium

A Data Steward at a financial institution is asked to approve a data quality improvement project that will relabel 50,000 loan records with corrected income-bracket codes. What is WITHIN the Data Steward's authority to approve?

  • A. Approving the reclassification of the dataset from Confidential to Internal to simplify the project
  • B. Approving the data quality correction process itself, as data quality is the Steward's domain
  • C. Approving a new access control policy granting the project team permanent access to the dataset
  • D. Approving the deletion of all records that cannot be corrected, to maintain dataset integrity
✓ Correct: B — Data quality correction is within the Steward's domain
The Data Steward manages data quality, metadata, definitions, and ensures adherence to data governance standards. Approving a data quality correction process (relabeling records accurately) is a core Steward function. Reclassifying data (A) and granting permanent access (C) are Data Owner decisions. Deleting records permanently (D) is also an Owner-level decision with legal and compliance implications. Stewards operate within the policies set by Owners.
💡 ISC2 Mindset: Data Steward = data quality and metadata governance; does not set access policy or classification.
26
Data RolesMedium

An employee in the collections department accesses the customer loan repayment database daily to update payment statuses. In ISC² terminology, this employee's role with respect to this data is BEST described as:

  • A. Data Owner — they update the data and therefore control it
  • B. Data Custodian — they maintain the data on behalf of the organization
  • C. Data User — they access and use the data within authorized parameters
  • D. Data Steward — they ensure data quality by keeping records current
✓ Correct: C — Data User
A Data User is any individual authorized to access and use data for a specific business purpose within the parameters defined by the Data Owner. The collections employee uses the data operationally — they are not responsible for overall governance (Owner), technical maintenance (Custodian), or data quality policy (Steward). Updating individual records as part of daily duties is a user-level activity, not ownership or custodianship.
💡 ISC2 Mindset: Data User = accesses data for operational purposes within defined boundaries — the largest population in any organization.
27
Data RolesMedium

A CISSP exam scenario: The VP of Marketing owns the customer contact database. The database is hosted on servers managed by the IT Operations team. A data analyst needs read-only access to run campaign reports. Who should GRANT this access?

  • A. The IT Operations team, because they manage the servers and the database
  • B. The VP of Marketing (Data Owner), because access decisions are the Owner's responsibility
  • C. The data analyst's direct manager, because they are accountable for the analyst's work
  • D. The DBA, because they can technically provision the access quickly
✓ Correct: B — The VP of Marketing (Data Owner) grants access
The Data Owner is responsible for determining who may access their data and under what conditions. IT Operations (Custodian) provisions access only after the Owner approves the request. The analyst's manager can request access but cannot grant it — authority over data access belongs to the Owner, not the requester's management chain. This Owner→Custodian workflow is foundational ISC² Domain 2 knowledge.
💡 ISC2 Mindset: Access decisions flow from the Data Owner down to Custodians — never from IT upward.
28
Data RolesMedium

A cloud provider hosts FinTech Company X's customer PII on behalf of FinTech Company X. The cloud provider's SRE team accesses the data briefly during a debugging session to resolve a production outage. In ISC² role terms, the cloud provider is acting as:

  • A. Data Owner — they are in physical control of the infrastructure
  • B. Data Custodian — they maintain the infrastructure under a contractual agreement
  • C. Data User — they access the data to perform their job duties
  • D. Data Processor — they process personal data under GDPR on behalf of FinTech Company X
✓ Correct: B — Data Custodian (ISC² term); also D under GDPR framing
The cloud provider, under a contractual relationship with FinTech Company X, acts as a Data Custodian in ISC² terms — they have technical custody but no decision-making authority over the data. Under GDPR, this same relationship is characterized as Controller (FinTech Company X) and Processor (cloud provider). Both B and D reflect the same underlying governance reality; B is the ISC²-centric answer. The scenario tests whether candidates know these roles transcend physical possession.
💡 ISC2 Mindset: Physical custody does not equal ownership — the Custodian holds data under contract without governance authority.
29
Data RolesMedium

A new regulation requires that all customer financial records be retained for 7 years. The Data Owner wants to shorten retention to 3 years to reduce storage costs. Who has final authority to resolve this conflict?

  • A. The Data Owner, because they have ultimate authority over their data
  • B. The Chief Information Officer, because storage cost is an IT budget issue
  • C. Senior management or Legal/Compliance, because regulatory requirements supersede the Data Owner's discretion
  • D. The Data Custodian, because they implement retention schedules and understand the technical implications
✓ Correct: C — Legal/Compliance backed by senior management resolve regulatory conflicts
While the Data Owner has significant authority, that authority does not extend to violating mandatory legal or regulatory requirements. Regulatory retention obligations are not discretionary — shortening retention below the legal minimum exposes the organization to sanctions. Legal and Compliance teams, backed by senior management, have authority to enforce regulatory floors. The Data Owner must comply; cost reduction must be achieved through other means (e.g., cheaper storage tiers).
💡 ISC2 Mindset: Data Owner authority is bounded by law — regulatory requirements always override business convenience.
30
Data RolesMedium

Partner E Lhuillier processes remittance data on behalf of FinTech Company X under a formal data processing agreement. Partner E's IT team accidentally deletes a batch of transaction records. Under the data processing agreement, who must Partner E notify FIRST?

  • A. The National Privacy Commission (NPC) of the Philippines — regulators must be notified of any data incident
  • B. The affected data subjects directly — they have a right to know their data was lost
  • C. FinTech Company X as the Data Controller/Owner — the Processor notifies the Controller who then manages regulatory reporting
  • D. Partner E's own legal team — internal notification precedes any external communication
✓ Correct: C — Partner E notifies FinTech Company X (Controller) first
Under both GDPR and the Philippine DPA framework, when a Processor experiences a data incident, the Processor must notify the Controller without undue delay. The Controller then manages regulatory reporting to authorities (NPC) and communication to data subjects. Partner E does not have a direct notification obligation to the NPC or data subjects — that obligation belongs to FinTech Company X as Controller. This chain of notification is a critical governance concept.
💡 ISC2 Mindset: Processor notifies Controller → Controller notifies regulator and data subjects. The chain must not be skipped.
31
Data RolesMedium

A data governance framework lists the following responsibilities: "ensures data quality standards are met, maintains data dictionaries, resolves data definition disputes across business units." These responsibilities BEST describe which role?

  • A. Data Owner
  • B. Data Custodian
  • C. Data Steward
  • D. Data Protection Officer
✓ Correct: C — Data Steward
Data Stewards are responsible for data quality, metadata management, and enforcing data definitions and standards across business units. They serve as the operational arm of data governance between the strategic Data Owner and the technical Data Custodian. The Data Protection Officer (DPO) is a regulatory role under GDPR focused on compliance oversight, not data quality management.
💡 ISC2 Mindset: Data Steward bridges business and IT — responsible for data quality and definitional consistency, not access control or classification.
32
Data RolesMedium

A penetration tester is granted temporary read access to the production database to perform a security assessment. After the test, the Data Owner instructs the DBA to revoke the tester's access. The DBA delays this for three days due to a backlog. Who bears ACCOUNTABILITY for the access overstay?

  • A. The penetration tester, for not voluntarily relinquishing access after the engagement ended
  • B. The DBA (Custodian), for failing to implement the Owner's revocation instruction in a timely manner
  • C. The Data Owner, because they are ultimately accountable for all access to their data
  • D. Both the DBA and the Data Owner share equal accountability
✓ Correct: C — The Data Owner bears ultimate accountability
While the DBA (Custodian) failed to execute the revocation promptly and bears operational responsibility for that delay, the Data Owner is ultimately accountable for all access to their data at all times. The Owner should have followed up to ensure the revocation was completed. ISC² emphasizes that accountability cannot be delegated — the Owner remains accountable even when Custodians fail in their duties. The Owner should establish SLAs to prevent such delays.
💡 ISC2 Mindset: Accountability stays with the Data Owner — delegation to Custodians does not transfer ultimate responsibility.
33
Data RolesMedium

Under GDPR, which of the following obligations falls on the Data Processor (ISC² Custodian) rather than the Data Controller (ISC² Owner)?

  • A. Determining the lawful basis for processing personal data
  • B. Responding to data subject access requests directly
  • C. Implementing appropriate technical and organizational measures to secure processing activities
  • D. Conducting Data Protection Impact Assessments (DPIAs) for high-risk processing
✓ Correct: C — Implementing security measures for processing activities
Under GDPR Articles 28 and 32, Processors must implement appropriate technical and organizational security measures for their processing activities — this is a direct Processor obligation. Determining lawful basis (A) and conducting DPIAs (D) are Controller responsibilities. Responding to data subject requests (B) is primarily the Controller's obligation, though Processors assist. This distinction is crucial for CISSP candidates working in organizations that act as both Controllers and Processors.
💡 ISC2 Mindset: Processors own the security of their processing activities; Controllers own the legal basis and data subject relationship.
34
Data RolesMedium

A CISSP is designing a data governance framework. She proposes that the Data Owner role should be assigned to the CISO because the CISO is responsible for security. A senior manager disagrees. Who is correct, and why?

  • A. The CISSP is correct — the CISO's accountability for security makes them the logical Data Owner
  • B. The senior manager is correct — the Data Owner should be the business manager who understands the business value and use of the data, not a security officer
  • C. Both are wrong — Data Owners must be designated by the board of directors under corporate governance rules
  • D. The senior manager is correct — the CISO can only be a Data Custodian because CISOs are technical roles
✓ Correct: B — Data Owner should be the business manager responsible for the data's value and use
ISC² is explicit that the Data Owner is a business manager or executive who understands the value, business purpose, and risk implications of the data — not a technical or security role. The CISO provides security guidance and oversight but is not typically the Owner. Assigning ownership to the CISO creates a conflict: the CISO would be approving access for their own security team, and business units would lose accountability for their data. Option D is incorrect — CISOs can be Owners of security-related data (e.g., security logs) but not of all organizational data.
💡 ISC2 Mindset: Data Ownership is a BUSINESS accountability, not a security or IT function.
35
Data RolesMedium

FinTech Company X's Partner A loan division generates operational reports from customer loan data. The report template is created by a data analyst, the underlying data is owned by the Credit Risk VP, and the reports are stored on a shared drive managed by IT. If the reports contain PII, who bears responsibility for ensuring the shared drive applies appropriate controls?

  • A. The data analyst, because they created the report
  • B. The Credit Risk VP (Data Owner), because the reports contain their data
  • C. IT Operations (Custodian), because they manage the shared drive
  • D. All three share equal responsibility under a collective ownership model
✓ Correct: B — The Credit Risk VP (Data Owner) is responsible for ensuring appropriate controls
Classification and protection requirements follow the data, not the format or storage location. Because the reports contain the Credit Risk VP's Restricted PII, the Data Owner is responsible for ensuring commensurate controls are applied — including specifying to IT (Custodian) what controls the shared drive must implement. The analyst is a User; they created the report but do not set policy. IT implements what the Owner directs. Option D dilutes clear accountability, which ISC² discourages.
💡 ISC2 Mindset: Data ownership follows the data, not the container — the Owner's responsibilities extend to all copies and derived outputs.
36
Data RolesHard

A merger between two fintech companies creates a data governance problem: both companies had separate Data Owners for identical customer segments. Post-merger, both legacy Owners claim authority over the merged customer database. What is the BEST resolution strategy from an ISC² governance perspective?

  • A. Allow both Owners to retain authority and resolve conflicts through a joint committee
  • B. Assign ownership to the more senior executive as a tiebreaker
  • C. Senior management must designate a single Data Owner for the merged dataset, clearly defined in the updated data governance policy
  • D. Temporarily assign ownership to the CISO until the merger integration is complete
✓ Correct: C — Senior management designates a single Data Owner
ISC² requires that each dataset have one clear, accountable Data Owner to avoid governance gaps. Dual ownership (A) creates decision paralysis and diluted accountability — a recognized anti-pattern. Senior management (not just the most senior of the two legacy owners, B) must make a formal decision and update governance documentation. Temporary CISO ownership (D) is inappropriate because CISOs should not own business data; it also perpetuates the governance vacuum.
💡 ISC2 Mindset: Every dataset must have ONE Data Owner — shared or dual ownership creates accountability gaps.
37
Data RolesHard

A Data Owner approves a third-party vendor's access to customer PII for analytics purposes. The vendor sub-contracts a portion of the work to a sub-processor. The sub-processor suffers a breach. Under the Philippine Data Privacy Act (RA 10173), who bears liability to the data subjects?

  • A. Only the sub-processor, because they caused the breach
  • B. The vendor (Processor), because they chose the sub-processor
  • C. FinTech Company X (Data Controller), because it is ultimately accountable to data subjects, though vendors share liability through the processing chain
  • D. The sub-processor and vendor equally, but FinTech Company X is fully exempt because they did not directly cause the breach
✓ Correct: C — FinTech Company X (Controller) bears ultimate accountability; the processing chain shares liability
Under RA 10173 and aligned GDPR principles, the Data Controller (FinTech Company X) remains ultimately accountable to data subjects regardless of sub-processing arrangements. The Processor (vendor) is required to get Controller approval before engaging a sub-processor, and the sub-processor carries obligations passed down from the Processor. A breach anywhere in the chain remains the Controller's responsibility toward regulators and data subjects. Contractual liability may distribute financial exposure, but the Controller cannot disclaim regulatory accountability.
💡 ISC2 Mindset: Controllers cannot outsource accountability — sub-processing chains do not break the Controller's responsibility to data subjects.
38
Data RolesHard

A Data Protection Officer (DPO) under GDPR disagrees with a Data Controller's decision to proceed with a high-risk processing activity without a DPIA. What authority does the DPO have in this situation?

  • A. The DPO can halt the processing unilaterally, as their compliance role grants veto power
  • B. The DPO must advise, document their objection, and escalate to senior management — but cannot unilaterally block the processing
  • C. The DPO should report directly to the supervisory authority (e.g., NPC or CNIL) without informing the Controller
  • D. The DPO has no relevant authority — processing decisions belong exclusively to the Data Controller
✓ Correct: B — DPO advises, documents objection, escalates — no unilateral veto
Under GDPR Articles 38 and 39, the DPO is an advisory role — they provide expert guidance, monitor compliance, and advise on DPIAs, but do not have veto power over Controller decisions. If the Controller proceeds against DPO advice, the DPO must document the disagreement and may escalate to senior management or ultimately report to the supervisory authority, but not covertly (C). The DPO has real organizational influence but no unilateral legal authority to stop processing.
💡 ISC2 Mindset: The DPO advises and escalates — decision authority rests with the Controller, but documented DPO objections create an important governance trail.
39
Data RolesHard

FinTech Company X's eKYC Vendor system captures and stores biometric facial templates during loan applicant verification. The product team wants to repurpose these templates for a marketing personalization feature. From a data governance perspective, what must happen FIRST?

  • A. The Data Custodian must confirm the database can technically support the new use case
  • B. The Data Owner must assess whether the new purpose is compatible with the original consent given by applicants, and obtain new consent if required
  • C. The DPO must approve the repurposing as biometric data falls under DPO jurisdiction
  • D. A penetration test must be conducted on the marketing platform before biometric data is used
✓ Correct: B — Data Owner must assess purpose compatibility and consent requirements
Biometric data collected under consent for identity verification cannot be repurposed for marketing without a separate legal basis — this is a core principle of purpose limitation under the Philippine DPA (RA 10173) and GDPR. The Data Owner must evaluate whether the new use is compatible with the original consent; if not, new explicit consent must be obtained before repurposing. The DPO advises on this analysis but does not grant approval (C). Technical feasibility (A) and security testing (D) are secondary — governance and consent come first.
💡 ISC2 Mindset: Purpose limitation means data collected for one purpose cannot be repurposed without fresh governance review and legal basis.
40
Data RolesHard

A CISSP candidate is told: "This person determines what data is collected, why it is processed, and who may access it. They accept accountability for any harm arising from misuse." This description applies to which role REGARDLESS of whether the organization uses ISC², GDPR, or Philippine DPA terminology?

  • A. Data Custodian / Data Processor / Personal Information Processor
  • B. Data Steward / Data Quality Officer / Data Fiduciary
  • C. Data Owner / Data Controller / Personal Information Controller
  • D. Data User / Data Subject / Registered Individual
✓ Correct: C — Data Owner / Data Controller / Personal Information Controller
All three terminologies describe the same fundamental role: the entity that determines purposes and means of data processing and bears accountability for its outcomes. ISC² calls this the Data Owner; GDPR calls it the Data Controller; the Philippine DPA (RA 10173) calls it the Personal Information Controller. CISSP candidates must be fluent in all three frameworks and recognize that the same underlying governance role appears under different names in different regulatory contexts.
💡 ISC2 Mindset: ISC² Data Owner = GDPR Data Controller = PH DPA Personal Information Controller — same accountability, different labels.
T3 Data Lifecycle & Data States — Q41 to Q60
41
Data StatesMedium

FinTech Company X's AES-256-CTR encryption protects customer PII stored in the core database server. A security auditor notes the encryption only applies when data is not being processed by the application server. Which data state is NOT protected by this control?

  • A. Data at Rest — the database files on disk are unencrypted
  • B. Data in Transit — the network path from database to application server is unencrypted
  • C. Data in Use — data decrypted in application memory is exposed while being processed
  • D. Data in Archive — backup tapes use a different encryption scheme
✓ Correct: C — Data in Use is not protected
AES-256-CTR protects data at rest (on disk). When the application server decrypts and processes the data, it exists in plaintext in memory — this is the "data in use" state, unprotected by at-rest encryption. Controls for data in use include Trusted Execution Environments (TEEs), homomorphic encryption, and process isolation. The audit finding is correct: the encryption gap is at the application processing layer, not transit or archive.
💡 ISC2 Mindset: At-rest encryption does not protect data while it is being processed in memory — data in use requires separate controls.
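The gap the auditor found can be made concrete in a few lines. The sketch below uses a toy SHA-256-based keystream as a stand-in for AES-256-CTR (real systems must use a vetted crypto library; the key, nonce, and record values are illustrative): the ciphertext is what sits on disk, but the moment the application XORs the keystream back, the plaintext lives unprotected in process memory.

```python
import hashlib

# Toy CTR-style keystream cipher standing in for AES-256-CTR.
# Illustration ONLY -- use a vetted library (e.g., AES-GCM) in practice.
def ctr_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    out = bytearray()
    for offset in range(0, len(data), 32):
        # Keystream block = H(key || nonce || counter), XORed with the data.
        stream = hashlib.sha256(key + nonce + offset.to_bytes(8, "big")).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ s for b, s in zip(chunk, stream))
    return bytes(out)

key, nonce = b"k" * 32, b"n" * 16
record = b'{"name": "Juan", "credit_score": 712}'

at_rest = ctr_xor(key, nonce, record)   # ciphertext on disk: protected
in_use = ctr_xor(key, nonce, at_rest)   # plaintext in app memory: UNPROTECTED

assert at_rest != record and in_use == record
```

The asserts pass because XORing the same keystream twice restores the original bytes — which is precisely why a memory dump of the application server defeats at-rest encryption.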
42
Data StatesMedium

A FinTech Company X engineer is designing the security architecture for customer PII as it moves from the mobile app to the API gateway to the database. Which combination of controls BEST addresses all three data states?

  • A. TLS 1.3 for transit; AES-256 for rest; access controls for in-use
  • B. VPN for transit; BitLocker for rest; antivirus for in-use
  • C. SSH tunneling for transit; RAID for rest; DLP for in-use
  • D. PKI for transit; HSM for rest; IDS for in-use
✓ Correct: A — TLS 1.3 + AES-256 + access controls
TLS 1.3 is the current standard for protecting data in transit. AES-256 (symmetric encryption) is the standard for data at rest. For data in use, access controls (least privilege, role-based access) are the primary ISC²-recognized control — they limit which processes and users can access decrypted data in memory. Option B mixes misplaced controls: a VPN secures only part of the network path, BitLocker is device-level full-disk encryption rather than a database at-rest control, and antivirus does not protect data in use. Option C confuses RAID (availability, not confidentiality) with rest protection.
💡 ISC2 Mindset: Each data state requires its own control layer — TLS (transit), encryption (rest), access controls (in-use).
43
Data LifecycleMedium

Which of the following CORRECTLY lists the typical phases of the data security lifecycle in order?

  • A. Create → Store → Use → Share → Archive → Destroy
  • B. Collect → Process → Analyze → Report → Delete
  • C. Input → Processing → Output → Storage → Archival
  • D. Create → Classify → Use → Retain → Archive → Purge
✓ Correct: A — Create → Store → Use → Share → Archive → Destroy
ISC² and the Cloud Security Alliance (CSA) both identify six phases of the data security lifecycle: Create, Store, Use, Share, Archive, and Destroy. Each phase carries distinct security considerations and controls. Option D partially overlaps but is not the standard ISC² formulation. Understanding which controls apply at each phase is key for Domain 2.
💡 ISC2 Mindset: Know the six-phase data lifecycle — Create, Store, Use, Share, Archive, Destroy — each phase needs its own control set.
44
Data StatesMedium

Partner A sends daily loan repayment summaries from Vietnam to FinTech Company X's headquarters in Singapore via an SFTP connection. During transmission, an attacker performs a man-in-the-middle attack and intercepts the file. Which data state was exposed, and what control SHOULD have been in place?

  • A. Data at Rest — the file was cached on the SFTP server; end-to-end encryption of stored files is needed
  • B. Data in Use — the file was being parsed by the receiving application; memory protection is needed
  • C. Data in Transit — TLS/SFTP with certificate validation and mutual authentication should have been enforced
  • D. Data in Archive — the file was in a backup queue; archive encryption would have prevented the breach
✓ Correct: C — Data in Transit; enforce TLS/SFTP with mutual authentication
The attack occurred during the transmission — this is the "data in transit" state. SFTP provides encryption in transit, but a MitM attack can succeed if certificate validation is absent or if a self-signed cert is trusted blindly. Mutual authentication (both client and server authenticate) prevents spoofing. The control gap is in transport security, not at-rest or in-use. ISC² expects candidates to match the state with the appropriate control.
💡 ISC2 Mindset: MitM attacks target data in transit — mutual authentication and certificate validation are the correct mitigations.
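The certificate-validation settings the explanation calls for can be sketched with Python's standard `ssl` module. This is a client-side context only; the commented-out certificate file names are hypothetical placeholders for the mutual-authentication step.

```python
import ssl

# Client-side TLS context with strict certificate validation -- the control
# whose absence enables the MitM interception described above.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.check_hostname = True             # reject certificates for the wrong host
ctx.verify_mode = ssl.CERT_REQUIRED   # never accept unvalidated/self-signed certs
ctx.load_default_certs()              # trust only the configured CA store

# For MUTUAL authentication, the client also presents its own certificate
# (file names are placeholders):
# ctx.load_cert_chain(certfile="client.pem", keyfile="client.key")

assert ctx.verify_mode == ssl.CERT_REQUIRED and ctx.check_hostname
```

Blindly trusting a self-signed certificate is equivalent to setting `verify_mode = ssl.CERT_NONE`, which is exactly the gap the attacker in the scenario exploited.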
45
Data LifecycleMedium

FinTech Company X's Platform B platform implements a 30-day soft-delete before permanent deletion. During the soft-delete window, data is still in the database but flagged as "deleted." What lifecycle phase does this 30-day window BEST represent?

  • A. Destroy — the data is logically deleted and no longer active
  • B. Archive — the data is retained for a defined period before final disposition
  • C. Use — the application still reads the soft-deleted records for compliance purposes
  • D. Store — the data remains in primary storage with full at-rest protections applying
✓ Correct: B — Archive (retention phase before final disposition)
A soft-delete window where data is marked for deletion but retained for a defined period (30 days) is a retention/archival phase — data is no longer actively used but is preserved for recovery, compliance, or legal hold purposes. It has not been destroyed (A) because it still exists. It is not actively in Use (C) by business processes. The data may still be in primary storage (D), but the governance phase is archival/retention. The appropriate controls for this phase include access restriction and audit logging.
💡 ISC2 Mindset: Soft-delete is a retention mechanism — data in this phase must still carry appropriate access controls until it is permanently destroyed.
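The soft-delete mechanism described above can be sketched as a deletion timestamp plus a purge routine. The field and function names are hypothetical; the 30-day window comes from the scenario.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION = timedelta(days=30)   # soft-delete window from the scenario

@dataclass
class Record:
    payload: dict
    deleted_at: Optional[datetime] = None   # None => active (Store/Use phases)

def soft_delete(rec: Record, now: datetime) -> None:
    rec.deleted_at = now                    # record enters the Archive/retention phase

def purge_expired(records: list, now: datetime) -> list:
    """Keep active records and those still inside the retention window."""
    return [r for r in records
            if r.deleted_at is None or now - r.deleted_at < RETENTION]

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
old = Record({"loan_id": 1}); soft_delete(old, now - timedelta(days=45))
new = Record({"loan_id": 2}); soft_delete(new, now - timedelta(days=5))
kept = purge_expired([old, new], now)
assert kept == [new]   # the 45-day-old record is permanently destroyed
```

Note that during the window the record still exists and must keep its access restrictions and audit logging — soft-deleted is not the same as destroyed.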
46
Data LifecycleMedium

During the "Create" phase of the data lifecycle, which security activity is MOST critical from an ISC² perspective?

  • A. Encrypting the data immediately upon creation using AES-256
  • B. Assigning the appropriate classification label and notifying the Data Owner
  • C. Backing up the data to ensure availability from the moment of creation
  • D. Registering the data in the SIEM for immediate audit logging
✓ Correct: B — Assigning classification and notifying the Data Owner
The "Create" phase is where classification must be assigned — all subsequent controls depend on knowing the data's sensitivity level. Without classification, encryption, access controls, retention, and destruction decisions cannot be made correctly. ISC² emphasizes that classification at creation is the foundational governance act. Encryption and backup (A, C) are correct controls, but they depend on classification having been assigned. SIEM logging (D) is a detective control, not a governance priority at creation.
💡 ISC2 Mindset: Data must be classified at creation — all other lifecycle controls depend on knowing what sensitivity level applies.
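The idea that every later control keys off the label assigned at creation can be shown in a small sketch. The class, label, and threshold here are illustrative assumptions, using the four-tier scheme from this quiz:

```python
from dataclasses import dataclass
from enum import Enum

class Label(Enum):          # four-tier commercial scheme used in this quiz
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4

@dataclass(frozen=True)
class DataAsset:
    owner: str              # accountable Data Owner, named at creation
    classification: Label   # mandatory at the Create phase: no label, no asset
    payload: dict

# Downstream controls (illustrative policy) are derived from the label:
def must_encrypt(asset: DataAsset) -> bool:
    return asset.classification.value >= Label.CONFIDENTIAL.value

asset = DataAsset("Credit Risk VP", Label.RESTRICTED, {"national_id": "..."})
assert must_encrypt(asset)
```

Because `classification` has no default, an asset simply cannot be created without a label — the code-level analogue of "classify at creation."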
47
Data StatesMedium

A cloud-based analytics platform decrypts customer behavioral data, processes it for scoring, and then re-encrypts the output. During which phase is the data MOST vulnerable, and what emerging technology addresses this gap?

  • A. Data at Rest — addressed by client-side encryption before upload
  • B. Data in Transit — addressed by quantum-resistant cryptography
  • C. Data in Use (during decryption and processing) — addressed by Homomorphic Encryption or Trusted Execution Environments (TEEs)
  • D. Data in Archive — addressed by tape encryption with AES-256
✓ Correct: C — Data in Use; addressed by Homomorphic Encryption or TEEs
The processing step — where data is decrypted in memory and processed — is the most vulnerable state because it is exposed in plaintext inside the compute environment. Homomorphic Encryption allows computation on encrypted data without decryption, eliminating this exposure. Trusted Execution Environments (TEEs, e.g., Intel SGX) isolate processing in hardware-protected enclaves. These technologies represent the frontier of data-in-use protection and are increasingly relevant for cloud fintech workloads like FinTech Company X's scoring engine.
💡 ISC2 Mindset: Data in Use is the hardest state to protect — homomorphic encryption and TEEs are emerging solutions for this gap.
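Homomorphic encryption's promise — computing on ciphertexts without ever decrypting — can be demonstrated with a toy Paillier cryptosystem, which is additively homomorphic. The tiny primes below are for illustration only; this is not production cryptography.

```python
from math import gcd

# Toy Paillier cryptosystem (additively homomorphic) with tiny primes.
# Demonstrates computation on encrypted data; NOT production crypto.
p, q = 1009, 1013
n = p * q
n2 = n * n
g = n + 1                                     # standard simplification
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)

def L(x: int) -> int:
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)           # modular inverse (Python 3.8+)

def encrypt(m: int, r: int) -> int:
    assert 0 <= m < n and gcd(r, n) == 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(41, 7), encrypt(1, 11)
c_sum = (c1 * c2) % n2     # multiplying ciphertexts adds the plaintexts
assert decrypt(c_sum) == 42
```

The scoring engine in the scenario could, in principle, sum encrypted values this way without the plaintexts ever existing in its memory — which is exactly the data-in-use gap the question highlights.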
48
Data LifecycleMedium

A security policy states that Restricted data must be reviewed for continued necessity every 12 months. After review, data no longer needed for business or legal purposes must be destroyed. A department head ignores this policy for three years. What is the PRIMARY risk created by this inaction?

  • A. Regulatory fines for retaining data longer than necessary — data minimization violations
  • B. Increased storage costs — the main risk is financial, not privacy or security
  • C. Breach of the backup SLA — retaining data past its lifecycle may cause backup system overloads
  • D. Loss of data classification accuracy — old data inevitably becomes unclassified
✓ Correct: A — Regulatory fines for data minimization violations
Both GDPR (Article 5(1)(e)) and the Philippine DPA require that personal data be retained only as long as necessary for its original purpose. Retaining Restricted PII beyond necessity without legal justification is a data minimization violation. Regulators (CNIL, NPC) can impose significant fines. While storage costs (B) are a secondary concern, they are not the primary risk. ISC² frames this as a governance and compliance failure, not merely a financial inefficiency.
💡 ISC2 Mindset: Retaining data longer than necessary is a compliance violation, not just a cost issue — data minimization is a legal obligation.
49
Data LifecycleMedium

During the "Share" phase of the data lifecycle, a FinTech Company X data engineer sends customer credit profiles to a partner credit bureau via API. What security control is MOST critical at this phase specifically?

  • A. DLP (Data Loss Prevention) policies to prevent unauthorized sharing channels
  • B. API encryption (TLS) only — the bureau handles security on its end
  • C. A data sharing agreement defining purpose, scope, controls, and liability, combined with transport encryption
  • D. Requiring the partner to sign an NDA before any data sharing begins
✓ Correct: C — Data sharing agreement + transport encryption
The "Share" phase introduces third-party governance risk. ISC² requires that data sharing be governed by formal agreements (DSA/DPA) that specify what data is shared, for what purpose, with what controls, and who bears liability for breaches. Transport encryption (TLS) protects the channel but does not govern how the bureau uses the data after receipt. An NDA (D) is a legal tool but narrower than a full data sharing agreement and does not address security controls. DLP (A) is important but is a technical control, not the primary governance mechanism.
💡 ISC2 Mindset: At the Share phase, governance (data sharing agreements) is as important as technical controls — third-party obligations must be contractually defined.
50
Data StatesMedium

A laptop containing unencrypted Confidential salary data is stolen from an employee's car. The attacker removes the hard drive and connects it to another computer. Which data state is compromised, and what single control would have MOST effectively prevented this breach?

  • A. Data in Transit — a VPN would have prevented the theft
  • B. Data at Rest — full-disk encryption (e.g., BitLocker/FileVault) would have protected the drive when offline
  • C. Data in Use — locking the screen would have prevented unauthorized access
  • D. Data in Archive — the data should have been stored on a server, not locally
✓ Correct: B — Data at Rest; full-disk encryption is the primary control
When a hard drive is removed and connected to another computer, the data is accessed directly from storage — this is the "data at rest" state. Full-disk encryption (FDE) such as BitLocker (Windows) or FileVault (macOS) encrypts all data on the drive; without the decryption key (tied to the original device's TPM or user password), the stolen drive's contents are unreadable. Screen lock (C) only prevents interactive access while the OS is running — it does not protect a removed drive. This is a classic ISC² scenario illustrating the value of at-rest encryption.
💡 ISC2 Mindset: Physical theft of a drive is a data-at-rest attack — full-disk encryption is the definitive control for this threat.
51
Data LifecycleMedium

A data security lifecycle review identifies that a dataset classified as Restricted is being shared on an internal Slack channel without restriction. At which lifecycle phase did the control fail, and what is the BEST corrective action?

  • A. Create phase — the data was misclassified initially; reclassify as Internal
  • B. Share phase — implement DLP controls on collaboration tools and enforce data sharing policy training
  • C. Store phase — move the data to an access-controlled repository and restrict the Slack channel
  • D. Archive phase — move the dataset to a read-only archive to prevent further sharing
✓ Correct: B — Share phase failure; DLP controls + policy training are the corrective action
The data was shared inappropriately through an unauthorized channel — this is a "Share" phase control failure. The correct response is dual: technical (DLP to block Restricted data from leaving approved channels) and administrative (training employees on the data sharing policy). Simply moving the data (C) or archiving it (D) does not address the behavioral and process gap that allowed the share. Reclassifying to Internal (A) is inappropriate — the sensitivity has not changed.
💡 ISC2 Mindset: Share-phase failures require both technical controls (DLP) and administrative controls (policy + training) — neither alone is sufficient.
52
Data StatesMedium

A FinTech Company X application processes loan scoring in real-time. Developers log all scoring inputs and outputs to a debug log for troubleshooting. The debug log is stored unencrypted on the application server. What risk does this create?

  • A. The scoring algorithm is exposed, risking intellectual property theft
  • B. Customer PII in the log is now data at rest without classification-appropriate controls, creating a data exposure risk
  • C. Debug logs are transient and present no meaningful risk if stored temporarily
  • D. The log creates a data-in-transit risk because it may be sent to monitoring systems
✓ Correct: B — PII in logs is data at rest without appropriate controls
Logging PII to debug files creates a shadow copy of Restricted data that may lack the access controls, encryption, and retention policies applied to the primary database. This is a common data sprawl problem in fintech — PII migrates from protected systems into less-controlled log files. ISC² would classify this as a data lifecycle governance failure: the Create/Store phase of the log did not inherit the classification controls of the source data.
💡 ISC2 Mindset: Data sprawl — PII leaking into logs, caches, or temp files — creates uncontrolled at-rest exposure that is harder to track than primary storage.
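The remediation idea for this scenario, masking PII before it ever reaches a log file, can be sketched with Python's standard `logging` module. This is a minimal illustration: the national-ID pattern is invented for the example and would need to match the real identifier format in use.

```python
import logging
import re

# Hypothetical national-ID pattern (the 4-7-1 digit grouping is an assumption
# for illustration; adjust to the actual identifier format in use).
NATIONAL_ID_RE = re.compile(r"\b\d{4}-\d{7}-\d\b")

class PIIRedactionFilter(logging.Filter):
    """Mask national IDs in a log record before any handler writes it."""
    def filter(self, record):
        record.msg = NATIONAL_ID_RE.sub("[REDACTED-NID]", str(record.msg))
        return True  # keep the (now-masked) record

logger = logging.getLogger("scoring")
logger.addFilter(PIIRedactionFilter())
logger.warning("scoring input: applicant national_id=1234-5678901-2")
# The record that reaches any handler now reads: ... national_id=[REDACTED-NID]
```

A filter like this reduces sprawl but does not replace governance: debug logs should still inherit the access controls and retention schedule of their source data.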
53
Data LifecycleMedium

An organization moves from a 5-year data retention policy to a 3-year policy for customer transaction data. What must happen to data currently past the new 3-year threshold?

  • A. It can remain indefinitely — changing future policy does not affect existing data
  • B. It must be reviewed for legal hold or regulatory retention obligations before disposition
  • C. It must be immediately deleted upon the policy change to minimize compliance risk
  • D. It should be reclassified as Public since it is no longer actively retained
✓ Correct: B — Review for legal holds and regulatory obligations before disposition
A retention policy change does not automatically authorize mass deletion. ISC² requires that before data is destroyed, organizations check for active legal holds, regulatory minimum retention periods, and ongoing business needs. Immediate deletion (C) without this review risks destroying data subject to a hold (spoliation) or required by regulation. Option A ignores the new policy. Option D is nonsensical — retention status does not affect classification.
💡 ISC2 Mindset: Retention policy changes trigger a review, not an automatic deletion — legal holds and regulatory floors must be checked first.
54
Data StatesMedium

A database administrator uses column-level encryption (CLE) to protect the national_id column in the customer table. The remaining columns (name, address, phone) are stored in plaintext. Which statement BEST describes the security posture?

  • A. CLE provides complete protection — since the most sensitive field is encrypted, the risk is acceptable
  • B. CLE protects the national_id column at rest, but the other PII columns in plaintext create residual data-at-rest risk and may enable re-identification via aggregation
  • C. CLE is ineffective because column encryption is always weaker than full-disk encryption
  • D. The design is correct — encrypting only the highest-sensitivity field is the standard practice
✓ Correct: B — Plaintext PII columns create residual risk and re-identification potential
While CLE is more targeted than FDE and protects the most sensitive field (national_id), leaving name, address, and phone in plaintext creates two risks: (1) direct exposure of PII if the database is compromised, and (2) aggregation risk — combining plaintext fields with the encrypted column's context can enable re-identification. ISC² expects a holistic view of data protection: protecting one field while leaving adjacent PII exposed is not sufficient for a Restricted dataset.
💡 ISC2 Mindset: Partial encryption creates false assurance — all PII fields must be assessed for protection, not just the most obvious one.
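The aggregation risk described above can be shown with a toy join. The rows and names below are invented, and a hash merely stands in for the encrypted column; the point is that re-identification never needs to touch the protected field.

```python
import hashlib

# Toy dataset: national_id is protected (hashed here as a stand-in for CLE),
# but the quasi-identifiers stay in plaintext.
customers = [
    {"name": "Ana Cruz", "address": "12 Mabini St", "phone": "0917-000-1111",
     "national_id_protected": hashlib.sha256(b"PH-0001").hexdigest()},
]

# A leaked or public dataset holding only quasi-identifiers...
external = [{"name": "Ana Cruz", "address": "12 Mabini St", "employer": "Acme"}]

# ...re-identifies the customer without touching the encrypted column at all.
matches = [c for c in customers for e in external
           if (c["name"], c["address"]) == (e["name"], e["address"])]
assert len(matches) == 1  # the "protected" record is linked to a real person
```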
55
Data LifecycleMedium

FinTech Company X receives raw credit bureau data from an external provider. The data is ingested into the data lake, transformed by a data pipeline, and used by risk models. At the STORE phase, what is the MOST important security consideration beyond encryption?

  • A. Compression ratio — higher compression reduces breach exposure by making files smaller
  • B. Access control enforcement — who may read, write, or delete the data must align with the classification
  • C. Network segmentation of the storage layer — stored data should be on a separate VLAN
  • D. Backup frequency — data stored longer needs more frequent backups to ensure availability
✓ Correct: B — Access control enforcement aligned with classification
At the Store phase, access control is the primary governance control after encryption. Defining who can read, write, modify, or delete stored data — and enforcing this via role-based or mandatory access controls — is the mechanism that operationalizes the classification label. Network segmentation (C) is a valuable defense-in-depth measure but is a secondary control. Backup frequency (D) addresses availability, not confidentiality. Compression (A) has no direct security benefit for confidentiality.
💡 ISC2 Mindset: At the Store phase, access control is the governance mechanism that enforces classification — encryption without access control is insufficient.
56
Data LifecycleMedium

FinTech Company X migrates its customer database from an on-premises server to a cloud provider. During the migration window, data exists simultaneously in both environments. What is the PRIMARY security risk during this transition phase?

  • A. Data is in transit during migration — TLS must be used for all transfer connections
  • B. Duplicate copies of Restricted data exist in two environments with potentially different access controls, increasing the attack surface
  • C. The cloud provider becomes a Data Owner during the migration window
  • D. Migration is a Create phase event — data must be reclassified upon arrival in the new environment
✓ Correct: B — Duplicate copies in two environments with inconsistent controls
Cloud migration creates a window where Restricted data exists in two places simultaneously — on-prem and cloud — potentially with different access control policies, logging configurations, and encryption settings. This data duplication expands the attack surface and creates governance complexity. ISC² would identify this as the highest risk: a breach in either environment during the transition window exposes the data. Transit encryption (A) is important but addresses only the transfer channel, not the dual-residence risk.
💡 ISC2 Mindset: Migration creates temporary data duplication — both environments must enforce equivalent controls during the transition window.
57
Data StatesMedium

An employee pastes Restricted customer PII from a secure database into a Microsoft Teams message to share it with a colleague. Which data state transition is MOST concerning from a security perspective?

  • A. The data moved from At Rest to In Use when the employee opened the database
  • B. The data transitioned from a controlled In Use environment to an uncontrolled In Transit / Shared state via an unauthorized channel
  • C. The data moved from In Transit to At Rest when it was stored in Teams' servers
  • D. There is no concern if Teams uses end-to-end encryption — the data state transition is secure
✓ Correct: B — Controlled In Use to uncontrolled sharing via unauthorized channel
The critical failure is not the data state transition per se, but the fact that Restricted data was moved from a controlled access environment (secure database) to an unauthorized channel (Teams, which may not have equivalent DLP controls, access restrictions, or retention policies). ISC² focuses on the control environment, not just the state. Teams' encryption (D) protects the channel but does not enforce classification-based access controls or prevent further unauthorized sharing by the recipient.
💡 ISC2 Mindset: Data state transitions are concerning when they move data from controlled to uncontrolled environments — channel encryption alone is not sufficient governance.
58
Data LifecycleHard

During a security review, a CISSP discovers that FinTech Company X's Vietnam operations store Partner A micro-loan customer data on a SaaS CRM platform not listed in the approved vendor registry. The CRM vendor processes data under its own privacy policy. What is the MOST serious governance failure here?

  • A. The CRM vendor has not signed a data processing agreement — FinTech Company X lacks contractual control over the data
  • B. Using a SaaS CRM is inherently insecure — all data should be stored on-premises
  • C. The CRM was not approved, so its security controls are unknown — the risk cannot be assessed without a vendor security review
  • D. Vietnam's Decree 13/2023 prohibits storing personal data on foreign cloud platforms
✓ Correct: A — No data processing agreement means no contractual control over the data
The most serious failure is the absence of a Data Processing Agreement (DPA/DSA). Without a signed agreement, FinTech Company X has no contractual authority to govern how the CRM vendor stores, accesses, or processes the customer data. The vendor's own privacy policy governs, which may conflict with PH DPA, Decree 13/2023, and FinTech Company X's internal policies. This is a fundamental Controller-Processor governance failure. Option C (unknown controls) is a secondary risk flowing from A. Option D overstates — Vietnam's data localization requirements arise chiefly from its Cybersecurity Law and Decree 53/2022 rather than Decree 13/2023, and they do not broadly prohibit foreign cloud platforms.
💡 ISC2 Mindset: No processing agreement = no contractual control = unmanaged risk. Before data goes to a vendor, the legal framework must exist.
59
Data LifecycleHard

A FinTech Company X data engineer proposes using a data lake as a "schema-on-read" repository where all ingested data is stored raw and classified later at query time. From an ISC² asset security perspective, what is the MOST critical concern with this approach?

  • A. Schema-on-read lakes are technically inferior to schema-on-write databases for financial data
  • B. Storing data without classification labels means appropriate access controls, encryption, and retention policies cannot be applied at ingestion — creating a "data swamp" governance risk
  • C. Raw data cannot be encrypted — all data must be transformed before encryption can be applied
  • D. Query-time classification is acceptable as long as the data lake has strong perimeter controls
✓ Correct: B — Deferred classification creates a governance gap from ingestion onward
ISC² requires classification at data creation/ingestion. A "classify later" approach means Restricted PII sits in the data lake without appropriate access controls, encryption key management, or retention policies from the moment of ingestion. Any breach during this unclassified window exposes data without the controls that classification would mandate. Perimeter controls (D) are a single point of failure — defense in depth requires classification-based controls throughout. This is the "data swamp" anti-pattern in data governance.
💡 ISC2 Mindset: Classification must occur at ingestion — deferring it creates an uncontrolled window where Restricted data has no governance framework.
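One way to picture "classification at ingestion" is a gate that refuses unlabeled data and attaches a label-driven control set at the moment of landing. The control names below are illustrative, not a prescribed mapping.

```python
ALLOWED_LABELS = {"Public", "Internal", "Confidential", "Restricted"}

def ingest(record, label):
    """Refuse to land data in the lake without a classification label;
    the label then selects the controls applied from ingestion onward
    (illustrative mapping, not a standard)."""
    if label not in ALLOWED_LABELS:
        raise ValueError(f"unclassified data rejected at ingestion: {label!r}")
    controls = {"Restricted":   ["encrypt", "strict-acl", "retention-schedule"],
                "Confidential": ["encrypt", "acl"],
                "Internal":     ["acl"],
                "Public":       []}[label]
    return {"data": record, "label": label, "controls": controls}

item = ingest({"national_id": "..."}, "Restricted")
```

A schema-on-read lake can still pass this gate, provided the label (and its controls) travel with the raw data rather than being deferred to query time.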
60
Data LifecycleHard

FinTech Company X's risk model uses customer behavioral data that is 4 years old. A data scientist argues that old data is less risky and proposes removing encryption to improve query performance. The Data Owner must decide. What is the CORRECT ISC² position?

  • A. Agree — data older than 3 years can be downgraded to Internal classification to reduce overhead
  • B. Agree — encryption is only mandatory during the active business use period, not for historical data
  • C. Disagree — the Data Owner must conduct a formal review before changing controls; age alone does not justify reducing protection
  • D. Disagree — historical data must always be encrypted regardless of reclassification; ISC² mandates lifetime encryption of all PII
✓ Correct: C — Formal review required; age alone does not justify reducing controls
Reducing encryption or other controls requires a formal Data Owner review that assesses current risk, regulatory obligations, and data sensitivity — not merely the data's age. Historical behavioral data can still be highly sensitive (re-identification from historical patterns is a known risk) and may still be subject to regulatory retention requirements. ISC² does not mandate lifetime encryption for all PII (D is too absolute), but it does require that any control reduction be justified through formal governance. Performance optimization is a valid goal, but must not drive security decisions unilaterally.
💡 ISC2 Mindset: Performance optimization never justifies unilateral reduction of security controls — changes require formal Data Owner review and documented risk acceptance.
T4 Data Destruction, Retention & Legal Hold — Q61 to Q80
61
Data DestructionMedium

An organization wants to dispose of 200 HDDs containing Restricted customer PII. The security team proposes degaussing. Which statement about degaussing HDDs is CORRECT?

  • A. Degaussing is ineffective on HDDs — only physical shredding achieves NIST SP 800-88 "Clear" level
  • B. Degaussing destroys the magnetic field on HDD platters, rendering data unrecoverable and meeting NIST SP 800-88 "Purge" level for magnetic media
  • C. Degaussing is equally effective on HDDs and SSDs because both use magnetic storage
  • D. Degaussing only reaches "Clear" level — overwriting is required to reach "Purge" level for HDDs
✓ Correct: B — Degaussing is effective for HDDs and meets NIST Purge level for magnetic media
NIST SP 800-88 defines three sanitization levels: Clear (overwriting), Purge (rendering data recovery infeasible with state-of-the-art techniques), and Destroy (physical destruction). Degaussing — exposing magnetic media to a strong magnetic field — meets the Purge level for HDDs because it destroys the magnetic domain structure that stores data. However, degaussing is completely ineffective on SSDs (which use flash memory, not magnetic storage) and also renders the HDD inoperable.
💡 ISC2 Mindset: Degaussing = Purge for HDD; degaussing is USELESS for SSDs — know the medium before choosing the sanitization method.
62
Data DestructionMedium

FinTech Company X is retiring 50 NVMe SSDs from its Singapore data center that contained Restricted PII. The IT team proposes a 7-pass DoD overwrite. What is WRONG with this proposal?

  • A. 7-pass overwrite is overkill — 1-pass is sufficient for SSDs per NIST SP 800-88
  • B. Overwriting is unreliable on SSDs because wear-leveling algorithms may leave residual data in reserved cells not accessible to the OS
  • C. DoD overwrite only works on HDDs — SSDs require degaussing to achieve Purge level
  • D. NVMe SSDs cannot be overwritten — they must be encrypted before use and then the key destroyed
✓ Correct: B — Overwriting is unreliable on SSDs due to wear-leveling and reserved cells
NIST SP 800-88 explicitly notes that overwriting is unreliable for flash-based storage (SSDs, NVMe) because wear-leveling distributes writes across the drive, and the OS cannot directly address reserved/spare cells where old data may reside. For SSDs, NIST recommends using the device's built-in Secure Erase command (ATA Secure Erase or NVMe Format), Cryptographic Erasure (destroying the encryption key), or physical destruction. Multi-pass overwriting (originally from DoD 5220.22-M) was designed for magnetic media, not flash.
💡 ISC2 Mindset: SSD sanitization ≠ HDD sanitization — use Secure Erase, Cryptographic Erasure, or physical destruction for SSDs. Overwriting is unreliable.
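The media-versus-method point can be condensed into a small decision helper. This is a simplified sketch loosely following NIST SP 800-88's logic; the standard's own decision flow and the drive vendor's documentation remain authoritative.

```python
def select_sanitization(media, leaves_org_control, encrypted_since_first_use=False):
    """Simplified sanitization-method sketch loosely based on NIST SP 800-88.
    Not a substitute for the standard's actual decision flowchart."""
    media = media.lower()
    if media == "hdd":
        if not leaves_org_control:
            return "Clear (verified single-pass overwrite)"
        return "Purge (degauss) or Destroy (shred)"
    if media in ("ssd", "nvme"):
        if encrypted_since_first_use:
            return "Purge (cryptographic erase: destroy the key)"
        # Overwriting is unreliable on flash; use the drive's built-in command.
        return "Purge (built-in secure erase) or Destroy"
    return "Destroy"
```

Note how the same HDD gets different treatment depending on whether it stays inside the organization's control, which is exactly the reuse-context question tested later in this topic.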
63
Data DestructionMedium

A CISSP uses the term "Cryptographic Erasure" (CE) to sanitize an SSD. What does this method entail, and what is its LIMITATION?

  • A. CE overwrites all sectors with random cryptographic keys; limitation is that it voids the manufacturer warranty
  • B. CE destroys the encryption key protecting data on a self-encrypting drive (SED); limitation is that data recovery remains theoretically possible if the key is later recovered
  • C. CE applies a 256-bit AES key to overwrite each sector; limitation is that it takes 24+ hours for large drives
  • D. CE erases the encryption key of a SED; limitation is that it only works if the drive was encrypted from the start — unencrypted data is not addressed
✓ Correct: D — CE destroys the encryption key of a SED; only works if data was pre-encrypted
Cryptographic Erasure (NIST SP 800-88) involves destroying or sanitizing the encryption key used by a Self-Encrypting Drive (SED), rendering encrypted data unreadable without needing to overwrite the data itself. The critical limitation: if the drive was not encrypted before data was written, CE is irrelevant — the data in unencrypted areas remains recoverable. Organizations using CE as their primary SSD sanitization strategy must ensure all drives use encryption from first use (an "encrypt-first" policy). Option B partially describes CE but omits the pre-encryption requirement.
💡 ISC2 Mindset: Cryptographic Erasure is fast and effective — but ONLY if the drive was encrypted from the beginning. No encryption = no CE option.
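A toy model makes the mechanism and its limitation concrete. The "cipher" below is a hash-based keystream for illustration only (a real SED uses AES in hardware); the point is that destroying the key is what makes the stored bytes unreadable.

```python
import hashlib
import secrets

def keystream(key, n):
    # Toy keystream for illustration only -- real SEDs use AES, not this.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(data, ks):
    return bytes(a ^ b for a, b in zip(data, ks))

key = secrets.token_bytes(32)            # the drive's media encryption key
plaintext = b"national_id=PH-0001"
ciphertext = xor(plaintext, keystream(key, len(plaintext)))

# Normal operation: key present, data readable.
assert xor(ciphertext, keystream(key, len(ciphertext))) == plaintext

# Cryptographic erase: destroy the key; the ciphertext is all that remains.
del key
# Limitation: any data written BEFORE encryption was enabled is untouched
# by key destruction and stays recoverable.
```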
64
Data DestructionMedium

According to NIST SP 800-88, which sanitization method provides the HIGHEST assurance of data irretrievability for all media types?

  • A. Clear — overwriting all user-addressable storage locations
  • B. Purge — using targeted techniques such as secure erase or degaussing that make recovery infeasible with state-of-the-art techniques
  • C. Destroy — physical or chemical destruction of the media so it cannot function as a storage device
  • D. Archive — moving data to a secure cold storage facility under strict access control
✓ Correct: C — Destroy provides the highest assurance
NIST SP 800-88 defines a hierarchy: Clear < Purge < Destroy. The "Destroy" level — shredding, disintegration, melting, or incineration — provides the absolute highest assurance that no data can be recovered, because the physical medium no longer exists as a storage device. This is used for the most sensitive data (Top Secret, biometric, cryptographic key material) or when reuse of the media is not a concern. Archive (D) is not a sanitization method — it is a retention phase.
💡 ISC2 Mindset: NIST Hierarchy — Clear < Purge < Destroy. Use Destroy when the data sensitivity demands absolute assurance, not just technical infeasibility.
65
Data DestructionMedium

A FinTech Company X employee reformats an HDD containing Confidential data and then donates the drive to a charity. Is this an acceptable sanitization method?

  • A. Yes — formatting removes all file system pointers, making data effectively inaccessible
  • B. No — formatting only removes file system metadata; the underlying data remains recoverable with common forensic tools
  • C. Yes — if the format is a "full format" (not quick format), all sectors are overwritten and meet NIST Clear level
  • D. It depends — if the charity signs an NDA, the residual data risk is transferred contractually
✓ Correct: B — Formatting only removes metadata; data remains recoverable
A standard (quick) format only removes the file allocation table and directory entries — the actual data sectors remain intact and are trivially recovered with free tools like Recuva or PhotoRec. A "full format" in Windows 7+ does overwrite sectors (approaching NIST Clear level), but a standard format does not. For Confidential data intended for donation (third-party hands), ISC² requires at minimum a Clear-level sanitization (verified overwrite) or preferably Purge. An NDA (D) does not neutralize data recovery risk — contractual agreements cannot undo poor technical sanitization.
💡 ISC2 Mindset: Formatting is NOT sanitization — it removes pointers, not data. Always verify the sanitization method matches the media and classification level.
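A toy simulation shows why a quick format is not sanitization: the "disk" below is just a dict, but it mirrors the real mechanics — the allocation table is cleared while the data blocks are untouched, so file carving still succeeds.

```python
# Toy model of a disk: an allocation table plus raw data blocks.
disk = {"alloc_table": {"salary.xlsx": [0, 1]},
        "blocks": [b"name=Ana;salary=90000", b"name=Ben;salary=85000"]}

def quick_format(d):
    d["alloc_table"].clear()   # pointers gone... but blocks untouched

def overwrite(d):
    # Overwrite every block, in the spirit of a NIST "Clear"-level pass.
    d["blocks"] = [b"\x00" * len(b) for b in d["blocks"]]

quick_format(disk)
carved = b"".join(disk["blocks"])   # what a forensic carving tool would see
assert b"salary=90000" in carved    # data survives a quick format

overwrite(disk)
assert b"salary" not in b"".join(disk["blocks"])
```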
66
Data DestructionMedium

Partner A is decommissioning a cloud-based virtual machine that processed Restricted micro-loan applicant data. The underlying physical hardware is managed by the cloud provider. What is the MOST appropriate sanitization approach?

  • A. Request the cloud provider to physically shred the host server's drives
  • B. Terminate the VM — cloud providers guarantee data erasure upon VM termination per their SLA
  • C. Use Cryptographic Erasure by deleting all encryption keys used for the VM's storage volumes, then confirm via provider's data destruction certificate
  • D. Overwrite the VM's virtual disk with zeros before termination to achieve NIST Clear level
✓ Correct: C — Cryptographic Erasure of storage keys + destruction certificate from provider
In cloud environments, the organization typically cannot physically access the underlying hardware. The best practice per NIST SP 800-88 and cloud security guidelines is Cryptographic Erasure: ensure all VM storage was encrypted (which reputable cloud providers do by default), then delete the encryption keys and request a data destruction certificate or audit evidence from the provider. Simply terminating the VM (B) provides no formal guarantee without documentation. Physical shredding (A) is impractical, and cloud providers will not shred hardware for one customer's workload.
💡 ISC2 Mindset: In cloud environments, Cryptographic Erasure + provider destruction certificate is the standard sanitization approach — physical access is not available.
67
RetentionMedium

A FinTech Company X compliance officer is developing a data retention schedule. What is the PRIMARY factor that should drive the minimum retention period for customer financial records?

  • A. The cost of storage — longer retention is only justified if storage is inexpensive
  • B. The classification level — Restricted data must be retained for at least 5 years regardless of regulation
  • C. Applicable regulatory and legal requirements — these set the mandatory floor for retention
  • D. Business value — data should be retained as long as the analytics team finds it useful
✓ Correct: C — Regulatory and legal requirements set the mandatory minimum
ISC² identifies regulatory and legal requirements as the primary driver of minimum retention periods. Regulations like BSP Circular 900 (Philippines banking records), Decree 13/2023 (Vietnam), and AML/CTF laws mandate specific retention durations regardless of storage cost or business interest. Classification level (B) influences access controls, not retention duration directly. Business value (D) drives the maximum useful period but cannot override legal minimums. A complete retention schedule considers regulatory floors, legal hold possibilities, and business needs together.
💡 ISC2 Mindset: Retention schedules start with regulatory floors — business needs may extend retention above the minimum but never below it.
68
Legal HoldMedium

A legal team issues a litigation hold for all communications related to a disputed loan. An automated data retention system deletes the relevant emails three days later as per the 90-day email policy. The deletion was not intentional. What legal concept describes this situation, and what is the likely consequence?

  • A. Data minimization — the deletion was compliant with data minimization principles and creates no liability
  • B. Spoliation — the negligent destruction of evidence subject to a legal hold can result in adverse inference instructions or sanctions
  • C. Data breach — automated deletion of PII is a reportable privacy incident under RA 10173
  • D. Chain of custody failure — the court will demand re-creation of the deleted emails
✓ Correct: B — Spoliation with potential adverse inference consequences
Spoliation is the negligent or intentional destruction of evidence that a party knows or should know is relevant to pending or reasonably foreseeable litigation. Even if automated, once a legal hold is issued, all covered data must be preserved — automated systems must be configured to suspend normal deletion for held data. Courts may impose sanctions including adverse inference instructions (telling jurors to assume the destroyed evidence was unfavorable to the spoliating party) or even case dismissal. ISC² candidates must recognize that legal holds override all automated retention/deletion processes.
💡 ISC2 Mindset: A legal hold freezes all automated deletion for covered data — failure to implement the hold in automated systems constitutes spoliation.
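The "hold overrides automation" rule reduces to a single guard in the retention job. A minimal sketch with invented records:

```python
from datetime import date, timedelta

RETENTION_DAYS = 90  # e.g., the 90-day email policy in the scenario

emails = [
    {"id": 1, "sent": date.today() - timedelta(days=120), "legal_hold": True},
    {"id": 2, "sent": date.today() - timedelta(days=120), "legal_hold": False},
    {"id": 3, "sent": date.today() - timedelta(days=10),  "legal_hold": False},
]

def purge_expired(items, today=None):
    """Retention job: drop past-retention mail UNLESS a legal hold applies."""
    today = today or date.today()
    cutoff = today - timedelta(days=RETENTION_DAYS)
    return [m for m in items if m["legal_hold"] or m["sent"] >= cutoff]

remaining = purge_expired(emails)
assert [m["id"] for m in remaining] == [1, 3]  # held mail survives past policy
```

In a real archiving system the hold flag would come from a legal-hold registry that the deletion job must consult on every run; a job that never checks it is exactly the spoliation scenario above.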
69
Legal HoldMedium

FinTech Company X receives a government subpoena for customer transaction records. The records include data from accounts that have already been soft-deleted under the Platform B 30-day policy. Can FinTech Company X comply with the subpoena by responding that the data was deleted per policy?

  • A. Yes — if the data was deleted before the subpoena was received, FinTech Company X has no obligation to produce it
  • B. It depends on whether the deletion occurred before or after FinTech Company X reasonably anticipated the legal demand, and whether the data is within the 30-day soft-delete window and recoverable
  • C. No — all deleted data must be restored and produced regardless of when it was deleted
  • D. Yes — soft-deleted data is legally considered destroyed and is not producible under any circumstance
✓ Correct: B — Timing of deletion relative to anticipation of legal demand is the key factor
The duty to preserve evidence (and thus comply with a subpoena) begins when litigation is "reasonably anticipated" — not just when the subpoena is formally received. If FinTech Company X deleted data in good faith before any reasonable anticipation of legal demand, they may have a valid defense. If the deletion occurred after they should have anticipated the demand (e.g., after a prior court notice or formal complaint), it may constitute spoliation. If the data is still within the 30-day soft-delete window and recoverable, it must be produced. Legal counsel must evaluate the specific facts.
💡 ISC2 Mindset: The preservation duty begins at reasonable anticipation of litigation — not at formal subpoena receipt. Timing is everything.
70
Data DestructionMedium

An organization uses a "Clear" sanitization method (single-pass overwrite) on HDDs containing Confidential data before re-deploying them internally. Under NIST SP 800-88, is this appropriate?

  • A. No — Clear is only appropriate for Public data; Confidential data requires Purge
  • B. Yes — Clear is appropriate when media is being reused within the same organization or released to a trusted party at the same classification level
  • C. No — all redeployment scenarios require Purge-level sanitization regardless of data sensitivity
  • D. Yes — as long as the new user does not have access to forensic tools, Clear is sufficient for any redeployment
✓ Correct: B — Clear is appropriate for reuse within the same trust boundary
NIST SP 800-88 states that the Clear level (logical overwriting) is appropriate when media is being reused within the organization or released to trusted parties at equivalent or lower risk levels, where the adversary model does not include sophisticated lab-based recovery. Purge is required when media is released outside the organization's control or when the data classification is very high. This question tests whether candidates understand that sanitization level selection depends on the threat model and reuse context, not just the data's original classification.
💡 ISC2 Mindset: Match sanitization level to the reuse context and threat model — internal reuse at the same classification level may only need Clear.
71
Data DestructionMedium

FinTech Company X uses a third-party e-waste vendor to destroy decommissioned HDDs. What control MOST reduces the risk that the vendor improperly disposes of data?

  • A. Requiring the vendor to sign an NDA prohibiting them from disclosing customer data
  • B. Performing on-site degaussing before the drives are handed to the vendor, and obtaining a Certificate of Destruction
  • C. Selecting a vendor with a good reputation and verifying their ISO 9001 certification
  • D. Auditing the vendor annually — the audit will deter improper disposal between visits
✓ Correct: B — Pre-sanitize on-site before handoff; obtain Certificate of Destruction
The most effective control is to sanitize (degauss) the drives before they leave organizational custody — at that point, even if the vendor mishandles the drives, there is no recoverable data. The Certificate of Destruction provides audit evidence and contractual accountability. An NDA (A) does not prevent data recovery — it only creates legal liability after the fact. ISO 9001 (C) covers quality management, not security. Annual audits (D) detect problems retroactively and do not prevent harm between cycles.
💡 ISC2 Mindset: Never rely solely on vendor controls for data destruction — sanitize before handoff. Certificates document accountability; they are not a substitute for prior sanitization.
72
RetentionMedium

An organization's data retention policy specifies that employee HR records must be retained for 7 years after employment ends, while customer transaction records must be kept for 5 years. An employee is also a former customer. How long must their combined record be retained?

  • A. 5 years — the shorter retention period takes precedence to minimize risk
  • B. 7 years — the longer retention obligation applies to the combined record
  • C. Records must be separated and each component retained per its applicable policy
  • D. 6 years — average of the two retention periods as a compromise
✓ Correct: C — Separate the records and retain each per its applicable policy
The correct approach is to separate the HR and customer data components and apply the appropriate retention policy to each. This requires proper data architecture (segregation) and governance. If records cannot be separated, the longer retention period (7 years) must govern the combined record to avoid premature deletion of legally required data. ISC² emphasizes that retention policies are type-specific; applying the shorter period to a mixed record risks destroying data before its legal minimum is met.
💡 ISC2 Mindset: When multiple retention obligations apply, segregate records by type — if segregation is impossible, apply the longer retention period.
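The decision rule above can be sketched in a few lines of Python. This is an illustrative model only — the component names and the "longest period governs" fallback follow the question's scenario, not any real schema:

```python
from datetime import timedelta

# Illustrative schedule from the question: HR records 7 years, customer
# transactions 5 years (years approximated as 365 days).
POLICY = {"hr": timedelta(days=7 * 365), "customer_txn": timedelta(days=5 * 365)}

def retention_for(components, separable):
    """Return per-component periods when the record can be segregated;
    otherwise the single longest obligation governs the whole record."""
    if separable:
        return {c: POLICY[c] for c in components}
    return max(POLICY[c] for c in components)

# Segregated: each part follows its own policy (answer C).
per_part = retention_for({"hr", "customer_txn"}, separable=True)
# Inseparable fallback: the 7-year obligation wins.
combined = retention_for({"hr", "customer_txn"}, separable=False)
print(combined.days)  # 2555
```

The design point is that the fallback uses `max`, never `min` or an average: applying the shorter period to a mixed record risks destroying data before its legal minimum is met.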
73
Legal HoldHard

A legal hold is placed on all FinTech Company X emails related to a disputed transaction. The IT team implements the hold in the email archiving system. Two weeks later, an employee manually deletes the emails from their local client, unaware of the hold. What should the CISO prioritize IMMEDIATELY?

  • A. Terminate the employee for violating the legal hold policy
  • B. Check if the email archive captured the messages before local deletion, restore from archive, and notify legal counsel of the incident
  • C. File an emergency data recovery request with the email vendor and conduct a forensic investigation
  • D. Notify the opposing party in the dispute that evidence was inadvertently destroyed
✓ Correct: B — Check archive for copies, restore, and notify legal counsel immediately
The immediate priority is to assess whether the emails were captured in the archive (which they should have been, given the hold was implemented in the archiving system) and restore them if so. Legal counsel must be notified immediately to advise on disclosure obligations and whether spoliation has occurred. Terminating the employee (A) is an HR decision separate from the legal emergency. Notifying the opposing party (D) is a legal decision counsel will make — not an immediate IT action. The CISO's role is to restore data integrity and loop in legal, not to make disclosure decisions.
💡 ISC2 Mindset: In a legal hold incident, the first priorities are data recovery and legal notification — preserve evidence, then govern the response.
74
Data DestructionHard

FinTech Company X uses self-encrypting NVMe drives (SEDs) with AES-256 hardware encryption enabled from deployment. Upon decommission, the team performs Cryptographic Erasure by destroying the media encryption key (MEK). A forensic auditor challenges the method, saying recovery might still be possible. What is the CORRECT assessment?

  • A. The auditor is correct — hardware encryption can be bypassed by manufacturer backdoors, so physical destruction is required
  • B. The auditor is correct — the MEK may be backed up in firmware, so destroying the software key does not guarantee erasure
  • C. CE on a properly implemented SED meets NIST SP 800-88 Purge level — the encrypted data is computationally unrecoverable without the MEK, which has been destroyed
  • D. The auditor is correct — CE only meets Clear level, not Purge, and the drive must be overwritten after CE for Purge level
✓ Correct: C — CE on a properly implemented SED meets NIST Purge level
NIST SP 800-88 explicitly recognizes Cryptographic Erasure as a Purge-level method when the drive's encryption was properly implemented from the start. AES-256 encrypted ciphertext without the key is computationally unrecoverable with current or foreseeable technology — this satisfies the Purge criterion that recovery be infeasible using state-of-the-art laboratory techniques. The auditor's challenge is unfounded IF the SED implementation is correct (no key escrow or backup, proper hardware attestation). Auditors should request a certificate of CE completion and review the SED vendor's security documentation rather than assert theoretical vulnerabilities.
💡 ISC2 Mindset: CE on a properly configured SED = NIST Purge level. The key is "properly implemented" — verify the SED specification, not just the method.
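The mechanic behind CE — ciphertext stays on the media, but destroying the key makes it unreadable — can be sketched with a toy cipher. The keystream below (SHA-256 in counter mode) is for illustration ONLY; a real SED performs AES-256 in hardware, and nothing here should be used to protect real data:

```python
import hashlib
import secrets

def keystream(key, n):
    # Toy SHA-256 counter-mode keystream -- ILLUSTRATION ONLY.
    # A real SED uses hardware AES-256; never roll your own crypto.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(data, ks):
    return bytes(a ^ b for a, b in zip(data, ks))

mek = secrets.token_bytes(32)          # media encryption key, set at deployment
plaintext = b"name=Ana Cruz;national_id=PH-123"
ciphertext = xor(plaintext, keystream(mek, len(plaintext)))  # every write encrypted

assert xor(ciphertext, keystream(mek, len(ciphertext))) == plaintext  # key held: readable

# Cryptographic Erasure: destroy the MEK. The ciphertext remains on the
# media, but without the key it is computationally unrecoverable.
mek = None
guess = secrets.token_bytes(32)        # an attacker can only guess keys
recovered = xor(ciphertext, keystream(guess, len(ciphertext)))
assert recovered != plaintext
```

The analogy to the exam point: the "drive" still holds bits after CE, yet once `mek` is gone, every decryption attempt without it is equivalent to guessing a 256-bit key.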
75
Data DestructionHard

An organization uses tape backups for disaster recovery. The tapes contain a mix of Restricted and Internal data. The tapes are 10 years old and are being retired. What is the MOST appropriate sanitization method for tape media?

  • A. Degaussing — magnetic tape is erased by a strong magnetic field, meeting NIST Purge level
  • B. Overwriting the tape with zeros using the backup software — this meets NIST Clear level
  • C. Physical shredding — tape cannot be reliably overwritten, so destruction is the only option
  • D. Leaving the tapes in a secured vault — stored data poses no risk if physically protected
✓ Correct: A — Degaussing is effective for magnetic tape and meets NIST Purge level
Like HDDs, tape is magnetic media, so exposure to a strong magnetic field (degaussing) sanitizes it effectively; NIST SP 800-88 recognizes degaussing as a Purge-level method for magnetic tape. Physical shredding (C) also works but may not be necessary if degaussing is available and documented. Overwriting (B) is possible for some tape formats but less reliable than degaussing and may not cover all data tracks. Vaulting (D) is retention, not sanitization — it does not protect the data if the vault is ever accessed by unauthorized parties.
💡 ISC2 Mindset: Magnetic media (HDD + tape) = degaussing works for Purge. Flash media (SSD) = degaussing does nothing. Know your media.
76
RetentionHard

FinTech Company X processes PH DPA-subject personal data of Filipino customers. A customer exercises their "right to erasure" (right to be forgotten). However, the customer's loan account is still active and subject to a BSP (Bangko Sentral ng Pilipinas) mandatory 5-year retention requirement. How should the Data Owner respond?

  • A. Immediately erase all data — the customer's right to erasure is absolute under RA 10173
  • B. Deny the request entirely and refer the customer to BSP if they disagree
  • C. Partially comply — erase data not subject to regulatory retention, restrict access to retained data, and inform the customer of the legal basis for continued retention
  • D. Erase data after the 5-year BSP retention period expires, without notifying the customer until that time
✓ Correct: C — Partial compliance; erase non-retained data, restrict retained data, notify customer
Under RA 10173 and GDPR equivalents, the right to erasure is not absolute — it is subject to overriding legal obligations. BSP regulations create a mandatory retention obligation that supersedes the erasure request for covered data. The correct response is: (1) erase all data not covered by a legal retention obligation, (2) restrict access to retained data to only those who need it for compliance purposes, and (3) inform the customer in writing of the specific legal basis for retention and what data is being retained. This balances privacy rights with regulatory compliance.
💡 ISC2 Mindset: Right to erasure is subject to legal retention overrides — always check regulatory floors before complying, and document the legal basis for any retained data.
77
RetentionHard

A data governance review finds that FinTech Company X retains customer data "indefinitely" for analytics purposes. The DPO warns this violates the storage limitation principle. Under GDPR/RA 10173, which exception, if any, legitimately allows indefinite retention?

  • A. Business necessity — if the analytics team needs the data, retention is justified indefinitely
  • B. Anonymization — if data is irreversibly anonymized, it no longer constitutes personal data and the storage limitation principle does not apply
  • C. Consent — if customers consented to analytics use, data may be retained forever
  • D. Data minimization — the DPO is incorrect; retention schedules are advisory, not mandatory
✓ Correct: B — Irreversible anonymization removes the personal data designation
The storage limitation principle under GDPR (Article 5(1)(e)) and RA 10173 applies to "personal data" — data that can identify a natural person. If data is irreversibly anonymized such that re-identification is not reasonably possible, it no longer constitutes personal data and the storage limitation principle does not apply. This is the legal basis for long-term analytics using anonymized datasets. Consent (C) does not grant indefinite retention rights — it is a lawful basis for processing, not an override of storage limitation. Business necessity (A) requires documented legitimate purpose and proportionality.
💡 ISC2 Mindset: Anonymization is the legitimate path to indefinite analytics retention — but anonymization must be irreversible and verified, not merely claimed.
78
Legal HoldHard

A CISSP at FinTech Company X is designing the e-discovery process. Which of the following best describes the correct order of steps in an e-discovery response?

  • A. Collect → Identify → Preserve → Process → Review → Produce
  • B. Identify → Preserve → Collect → Process → Review → Produce
  • C. Preserve → Identify → Collect → Review → Process → Produce
  • D. Identify → Collect → Process → Preserve → Review → Produce
✓ Correct: B — Identify → Preserve → Collect → Process → Review → Produce
The EDRM (Electronic Discovery Reference Model) defines the standard e-discovery workflow: Identify potentially relevant data sources first, then Preserve (implement legal holds before collecting — to prevent alteration), then Collect the data, Process it (filter, convert, index), Review for privilege and relevance, and finally Produce to the requesting party. Preservation must occur before collection — otherwise data may be altered during the collection process. Option A places collection before both identification and preservation, a critical error that could constitute spoliation.
💡 ISC2 Mindset: In e-discovery, Preserve comes BEFORE Collect — never collect without first freezing the data in place.
79
Data DestructionHard

A FinTech Company X security manager must choose a sanitization method for 300 flash drives (USB) containing Restricted data before repurposing them for a different department. Which method meets NIST SP 800-88 Purge level for flash media?

  • A. Full-format via operating system — this overwrites all sectors
  • B. ATA Secure Erase or the vendor's built-in Sanitize command
  • C. Degaussing — strong magnetic field destroys flash memory cells
  • D. Storing them in a secure cabinet for 6 months before reuse
✓ Correct: B — ATA Secure Erase or vendor Sanitize command
For flash media (USB drives, SSDs), NIST SP 800-88 recommends using the device's built-in Secure Erase or Sanitize command, which is designed by the manufacturer to erase all cells including wear-leveled and reserved areas that OS-level writes cannot reach. This meets Purge level. Degaussing (C) has no effect on flash memory, which is not magnetic. OS-level format (A) is unreliable due to wear leveling. Storage (D) is not sanitization. Physical destruction is the alternative if no Secure Erase command is available or if the drive cannot be repurposed.
💡 ISC2 Mindset: Flash Purge = vendor Secure Erase command. OS-level overwriting is unreliable. Degaussing does nothing to flash.
80
RetentionHard

FinTech Company X's Vietnam operations manager tells a security auditor: "We keep all customer data for 10 years regardless of purpose because storage is cheap." Vietnam's Decree 13/2023 requires personal data to be retained only as long as necessary for its processing purpose. What is the PRIMARY violation, and what should be implemented?

  • A. No violation — organizational retention policies are self-determined and Decree 13/2023 is advisory
  • B. Violation of the storage limitation principle — purpose-based retention schedules with defined periods per data type must be implemented
  • C. Minor violation — the 10-year policy needs to be documented formally to be compliant
  • D. Violation of data minimization — less data should be collected at source to avoid long-term retention
✓ Correct: B — Storage limitation violation; purpose-based retention schedules required
Decree 13/2023 on personal data protection in Vietnam, like GDPR, mandates the storage limitation principle: personal data must be retained only for as long as necessary to fulfill the stated purpose of collection. A blanket "10 years because storage is cheap" policy violates this principle for any data whose purpose expires before 10 years. The remedy is to establish purpose-based retention schedules: each data category has a defined retention period tied to its processing purpose, after which data is deleted or anonymized. Cost is not a legitimate basis for retention.
💡 ISC2 Mindset: Retention must be purpose-driven, not storage-cost-driven. "We keep everything forever" is a compliance violation, not a best practice.
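A purpose-based retention schedule reduces to a per-category lookup plus a disposition check. The periods and category names below are hypothetical examples for illustration — actual figures must come from legal review of BSP, Decree 13/2023, and similar sources, not from this sketch:

```python
from datetime import date, timedelta

# Hypothetical purpose-based schedule -- EXAMPLE periods only;
# each period must be set by legal/regulatory review, not by storage cost.
RETENTION = {
    "loan_transaction":  timedelta(days=5 * 365),  # e.g., regulatory floor
    "marketing_profile": timedelta(days=2 * 365),  # purpose-bound, shorter
    "support_ticket":    timedelta(days=1 * 365),
}

def disposition(data_type, purpose_fulfilled_on, today):
    """Delete or anonymize once the purpose-bound period elapses."""
    if today - purpose_fulfilled_on >= RETENTION[data_type]:
        return "delete_or_anonymize"
    return "retain"

# A marketing profile whose purpose ended three years ago must go,
# while a loan transaction from the same date is still inside its floor.
print(disposition("marketing_profile", date(2021, 1, 1), date(2024, 1, 1)))
print(disposition("loan_transaction", date(2021, 1, 1), date(2024, 1, 1)))
```

Contrast this with the manager's blanket policy: a single 10-year constant for every category is exactly what the per-purpose lookup table replaces.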
T5 Privacy Controls — Q81 to Q100
81
Privacy ControlsMedium

A data scientist replaces customer names and national IDs in a loan dataset with randomly generated codes. A mapping table linking codes to real identities is kept in a separate secured database. What technique is described, and what is its KEY risk?

  • A. Anonymization — risk is that the coding algorithm may be reverse-engineered
  • B. Pseudonymization — risk is that re-identification is possible if the mapping table is compromised
  • C. Tokenization — risk is that payment card industry rules may prohibit this approach
  • D. Encryption — risk is key management complexity at scale
✓ Correct: B — Pseudonymization; re-identification risk if mapping table is compromised
Pseudonymization replaces direct identifiers with artificial identifiers (pseudonyms) while retaining a mapping that allows re-identification. This is explicitly defined in GDPR Article 4(5). The key risk is that the dataset remains personal data under GDPR because re-identification is possible — the mapping table is the single point of failure. If the mapping table is breached, all pseudonymized records can be re-identified. Anonymization (A) is different: it permanently and irreversibly removes re-identification possibility, leaving no mapping table.
💡 ISC2 Mindset: Pseudonymization ≠ Anonymization — pseudonymized data is still personal data under GDPR because a mapping table exists for re-identification.
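The scenario can be sketched in Python to make the KEY risk concrete. Field names and records are invented for illustration, not an actual schema:

```python
import secrets

def pseudonymize(records):
    """Swap direct identifiers for random codes. The mapping table is
    exactly what makes this pseudonymization rather than anonymization."""
    mapping, released = {}, []
    for r in records:
        code = secrets.token_hex(8)  # random pseudonym, no inherent meaning
        mapping[code] = {"name": r["name"], "national_id": r["national_id"]}
        released.append({"subject": code, "credit_score": r["credit_score"]})
    return released, mapping

records = [{"name": "Ana Cruz", "national_id": "PH-123", "credit_score": 710}]
released, mapping = pseudonymize(records)

assert "name" not in released[0]   # released set carries no direct identifiers
# The single point of failure: whoever holds the mapping table
# re-identifies every record.
assert mapping[released[0]["subject"]]["name"] == "Ana Cruz"
```

Because `mapping` exists and re-identification is possible, the `released` dataset is still personal data under GDPR Article 4(5); true anonymization would require that no such table exist anywhere.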
82
Privacy ControlsMedium

FinTech Company X's Card Processor card tokenization replaces the 16-digit PAN with a surrogate token for use in transactions. The token vault is hosted by Card Processor. Under GDPR/PH DPA, does the token (stored by FinTech Company X) constitute personal data?

  • A. No — tokens are random strings with no inherent meaning; they are not personal data
  • B. Yes — because FinTech Company X has a contractual relationship with Card Processor that enables de-tokenization, the token is effectively linkable to an individual and may constitute personal data in context
  • C. No — PCI-DSS classifies tokens as non-PCI data, which by extension means they are not personal data under privacy laws
  • D. Yes — all data related to financial services is automatically classified as personal data under RA 10173
✓ Correct: B — Token is personal data in context if de-tokenization is accessible
GDPR Recital 26 and RA 10173 define personal data contextually: if a party can, by reasonable means, link data to an identifiable individual, it constitutes personal data. FinTech Company X, via its contract with Card Processor, can trigger de-tokenization — meaning the token is linked to a real PAN and cardholder. Therefore, in FinTech Company X's operational context, the token is personal data. If FinTech Company X had no means to de-tokenize (e.g., the vault is completely isolated and contractually inaccessible), the argument for non-personal data would be stronger. PCI-DSS classification (C) is irrelevant to privacy law designations.
💡 ISC2 Mindset: Data is personal if the holding party can reasonably link it to an individual — context determines the personal data designation, not the token format.
83
Privacy ControlsMedium

What is the FUNDAMENTAL difference between anonymization and pseudonymization from a GDPR compliance perspective?

  • A. Anonymization uses AES-256; pseudonymization uses SHA-256 hashing
  • B. Anonymized data is no longer subject to GDPR because re-identification is not reasonably possible; pseudonymized data remains personal data because re-identification is possible with additional information
  • C. Pseudonymization provides stronger protection than anonymization because the mapping table can be secured
  • D. Both are equivalent under GDPR — neither technique affects the classification of data as personal
✓ Correct: B — Anonymization removes GDPR applicability; pseudonymization does not
This is the most critical distinction in privacy engineering. GDPR Recital 26 states that principles of data protection should not apply to anonymous information. Truly anonymized data exits the GDPR regulatory framework. Pseudonymized data, however, remains personal data under Article 4 because the original data controller (who holds the mapping key) can reverse the pseudonymization. ISC² expects CISSP candidates to clearly distinguish these techniques by their legal and governance implications, not just their technical methods.
💡 ISC2 Mindset: Anonymization = exits GDPR scope. Pseudonymization = still personal data, still GDPR-regulated. This distinction determines your entire compliance posture.
84
Privacy ControlsMedium

Privacy by Design (PbD) was developed by Ann Cavoukian and is embedded in GDPR Article 25. Which of the following BEST describes the "Privacy as the Default" principle of PbD?

  • A. Privacy controls should be available as an opt-in feature that power users can activate
  • B. The most privacy-protective settings should be the default — individuals should not have to take action to protect their privacy
  • C. Privacy features should be the default only for sensitive data categories such as health and biometric data
  • D. Default settings should balance privacy and functionality — the organization decides the appropriate default based on business needs
✓ Correct: B — Most protective settings are the default; individuals should not have to act
Privacy by Default means that without action by the individual, the system automatically applies the most privacy-protective settings. This contrasts with opt-in models where users must actively enable privacy features. GDPR Article 25(2) codifies this: controllers must ensure that by default, only personal data necessary for the specific purpose is processed. This principle prevents "dark pattern" defaults that harvest excessive data unless the user notices and changes settings. ISC² frames PbD as a proactive, not reactive, privacy governance approach.
💡 ISC2 Mindset: Privacy by Default = maximum privacy out of the box, not minimum. The burden is on the organization to protect privacy, not on the individual to enable it.
85
Privacy ControlsMedium

A FinTech Company X product manager wants to build a loan eligibility feature. A privacy engineer proposes using Privacy by Design from the beginning. Which of the following is a PbD principle being applied?

  • A. Collecting all possible customer data fields first, then filtering to what is needed during post-processing
  • B. Building consent management and data minimization into the feature's requirements before any code is written
  • C. Adding a privacy policy link to the UI after the feature is built and tested
  • D. Requesting a privacy audit annually after deployment to identify issues retroactively
✓ Correct: B — Embedding consent and data minimization in requirements before coding
Privacy by Design mandates that privacy is embedded into system design from the start ("proactive, not reactive; preventive, not remedial"). Building consent management and data minimization into requirements before coding is PbD in action. Options A, C, and D are all reactive approaches — they address privacy after data has already been collected or after the system is built. PbD's first principle is "Proactive not Reactive" — privacy must be anticipated and designed in, not bolted on.
💡 ISC2 Mindset: Privacy by Design = privacy baked in from requirements, not added as a compliance checkbox after build.
86
Privacy ControlsMedium

A DLP system at FinTech Company X triggers an alert when an employee emails a file containing 20+ national ID numbers to an external address. The security analyst must decide on the BEST response. What action should be taken FIRST?

  • A. Block the email and terminate the employee immediately for policy violation
  • B. Allow the email to send — DLP alerts are informational and do not justify blocking business email
  • C. Block the outbound email pending review; investigate whether the transmission was authorized, and escalate based on findings
  • D. Quarantine the email, notify the recipient's domain that a potential breach occurred, and file an NPC report
✓ Correct: C — Block pending review; investigate authorization; escalate findings
ISC² emphasizes proportionate, process-driven responses. The first action is to prevent potential data exfiltration (block) while the investigation determines if this was an authorized business transmission (e.g., to a licensed credit bureau under a DPA). Immediate termination (A) is premature without due process. Allowing it through (B) is negligent. Filing an NPC report (D) is premature — regulatory reporting comes after confirming that a breach occurred. The investigation must establish whether this was an insider threat, a mistake, or a legitimate business activity.
💡 ISC2 Mindset: DLP alerts trigger investigation, not automatic punishment. Block first, investigate second, respond proportionately based on findings.
87
Privacy ControlsMedium

Under the Philippine Data Privacy Act (Republic Act 10173), which of the following is classified as "sensitive personal information" requiring heightened protection?

  • A. Full name, email address, and home address
  • B. Race, ethnic origin, health information, biometric data, and commission of offenses
  • C. Professional title, employer name, and work address
  • D. Date of birth, marital status, and educational background
✓ Correct: B — Race, ethnic origin, health, biometric data, and criminal offenses
RA 10173 Section 3(l) defines "sensitive personal information" to include: personal information about race, ethnic origin, marital status, age, color, and religious/political beliefs; health, education, genetic, or sexual life; proceedings for any offense; government-issued IDs (SSS, GSIS, passport, driver's license); and biometric information. Name, email, and address (A) are regular personal information, not sensitive. Note that marital status, age, and education in (D) do appear in the Section 3(l) enumeration, which makes D a strong distractor; B remains the BEST answer because every item it lists is unambiguously sensitive, and biometric data in particular is always classified as sensitive under RA 10173 without exception.
💡 ISC2 Mindset: PH DPA Section 3(l) lists sensitive personal information — biometric, health, race, ethnic origin, and criminal offenses are the key categories to memorize.
88
Privacy ControlsMedium

A data analyst at FinTech Company X publishes a public research report using a dataset that has been k-anonymized (k=5) from the original customer database. A researcher later demonstrates that specific individuals can be re-identified using publicly available social media data combined with the quasi-identifiers in the report. What attack technique is described?

  • A. Inference attack — deriving sensitive attributes from public model outputs
  • B. Linkage attack — re-identifying individuals by joining the anonymized dataset with external data sources
  • C. Aggregation attack — combining multiple low-sensitivity data points to derive high-sensitivity conclusions
  • D. Replay attack — reusing a prior session token to access the database
✓ Correct: B — Linkage attack (also called re-identification attack)
A linkage attack (or re-identification attack) occurs when an adversary combines a supposedly anonymized dataset with external data sources to re-identify individuals. This is why k-anonymization alone is often insufficient for high-risk datasets — quasi-identifiers (age range, ZIP code, gender) that appear harmless in isolation can uniquely identify individuals when joined with public social media profiles, voter rolls, or other datasets. ISC² CISSP candidates must understand that anonymization effectiveness is context-dependent and must be evaluated against reasonably available external data.
💡 ISC2 Mindset: Anonymization effectiveness must be assessed against available external data — k-anonymization alone does not guarantee re-identification impossibility.
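A linkage attack is, mechanically, just a join on quasi-identifiers. The rows below are invented for illustration: a released dataset with names stripped but quasi-identifiers intact, and a public source covering the same people:

```python
# "Anonymized" release: names removed, quasi-identifiers kept.
released = [
    {"age_band": "30-34", "zip": "1605", "sex": "F", "defaulted": True},
    {"age_band": "30-34", "zip": "1605", "sex": "M", "defaulted": False},
]
# Public data, e.g. scraped social-media profiles or voter rolls.
public = [
    {"name": "Ana Cruz", "age_band": "30-34", "zip": "1605", "sex": "F"},
]

QUASI = ("age_band", "zip", "sex")

def linkage_attack(released, public):
    """Join the two sets on quasi-identifiers alone."""
    return [
        {**p, **r}
        for r in released
        for p in public
        if all(r[q] == p[q] for q in QUASI)
    ]

hits = linkage_attack(released, public)
assert hits[0]["name"] == "Ana Cruz" and hits[0]["defaulted"] is True
```

Only one released row matches the public profile on all three quasi-identifiers, so the supposedly anonymous default record is uniquely re-attached to a name — which is why k-anonymity must be evaluated against external data, not in isolation.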
89
Privacy ControlsMedium

FinTech Company X deploys a DLP solution to prevent PII exfiltration. The DLP is configured to scan outbound emails. A user circumvents DLP by encoding the national ID as Base64 before attaching it. What does this expose about the DLP implementation?

  • A. The DLP system needs more memory to handle Base64 decoding in real-time
  • B. Content-aware DLP must include encoding/obfuscation detection to prevent trivial bypass techniques
  • C. DLP systems cannot inspect encrypted files — users should be banned from encrypting attachments
  • D. Base64 is itself encryption; the organization should deploy an encryption key management system to address this gap
✓ Correct: B — DLP must include encoding and obfuscation detection
Base64 is an encoding scheme (not encryption) that transforms data into ASCII characters to bypass pattern-matching DLP rules. Effective DLP systems must decode common encoding schemes (Base64, hex, URL encoding) before applying content inspection rules. This is a standard bypass technique that mature DLP implementations address. The finding indicates the DLP is configured for surface-level pattern matching without pre-processing — a gap that must be remediated. ISC² expects security engineers to understand the limitations of DLP and configure them to resist common evasion techniques.
💡 ISC2 Mindset: DLP that only matches literal patterns is easily bypassed. Content inspection must include decoding/normalization before pattern matching.
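The bypass and its remediation can be shown in a few lines. The national-ID pattern is hypothetical, and a production DLP would normalize many more encodings (hex, URL encoding, nested archives) than the single Base64 pass sketched here:

```python
import base64
import re

# Hypothetical national-ID pattern, for illustration only.
NID = re.compile(rb"\b\d{4}-\d{4}-\d{4}\b")

def dlp_naive(payload):
    """Literal pattern matching only -- the misconfigured deployment."""
    return bool(NID.search(payload))

def dlp_normalizing(payload):
    """Decode common encodings (here: just Base64) before matching."""
    if NID.search(payload):
        return True
    for token in payload.split():
        try:
            if NID.search(base64.b64decode(token, validate=True)):
                return True
        except ValueError:          # token was not valid Base64; skip it
            continue
    return False

leak = base64.b64encode(b"national_id 1234-5678-9012")  # user's bypass attempt
assert not dlp_naive(leak)       # surface-level matching misses it
assert dlp_normalizing(leak)     # decode-then-match catches it
```

The design lesson mirrors the explanation above: the Base64 alphabet contains no hyphens, so the literal regex can never fire on the encoded form — inspection must decode and normalize before applying content rules.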
90
Privacy ControlsHard

A GDPR data subject submits a Subject Access Request (SAR) to FinTech Company X asking for "all personal data you hold about me." FinTech Company X holds: loan application data, behavioral scoring inputs, eKYC Vendor biometric templates, and internal fraud risk scores. Which of these MUST be disclosed under GDPR Article 15?

  • A. All four categories must be disclosed in full with no exceptions
  • B. Loan application data and behavioral scoring inputs must be disclosed; biometric templates may be withheld for security reasons; fraud scores are exempt as internally derived data
  • C. All four must generally be disclosed; however, fraud scores may be withheld under the prevention of crime exemption if disclosure would prejudice fraud detection, and biometric templates may be withheld under proportionality
  • D. Only the loan application data must be disclosed — scoring and biometric data are proprietary and exempt
✓ Correct: C — Disclose all; fraud scores may be withheld under crime prevention exemption
GDPR Article 15 grants individuals the right to access their personal data. All four categories contain personal data and are in principle disclosable. However, national law exemptions under Article 23 allow restriction of SAR rights to protect the detection and prevention of crime — fraud scores may qualify if disclosure would enable applicants to game the fraud detection system. Biometric templates are personal data and must generally be disclosed unless a specific exemption applies. ISC² expects nuanced understanding: the right to access is broad but subject to specific, narrow, documented exemptions.
💡 ISC2 Mindset: SAR rights are broad — use exemptions narrowly, document the legal basis for any withholding, and never withhold data simply because it is inconvenient.
91
Privacy ControlsHard

FinTech Company X processes biometric data (facial templates from eKYC Vendor) for identity verification in the Philippines. Under RA 10173, what is the MINIMUM lawful basis required for processing sensitive personal information like biometric data?

  • A. Legitimate interest — biometric verification is in the organization's interest and implicitly consented to by loan applicants
  • B. Explicit written consent of the data subject, OR processing is necessary for a purpose authorized by law
  • C. A Privacy Impact Assessment (PIA) submitted to the NPC — approval grants lawful processing authority
  • D. Registration with the NPC as a personal information controller — registration constitutes lawful basis
✓ Correct: B — Explicit written consent OR legal necessity/authorization
RA 10173 Section 13 states that processing sensitive personal information requires explicit consent of the data subject, OR that processing is necessary for a specific purpose authorized by law. For biometric data used in financial identity verification, FinTech Company X should rely on explicit written consent obtained during the application process. Legitimate interest (A) is a basis for regular personal data, not sensitive personal information. A PIA (C) is a compliance tool, not a lawful basis. NPC registration (D) is a compliance obligation, not a processing authority.
💡 ISC2 Mindset: Sensitive personal information under RA 10173 requires explicit consent or specific legal authorization — legitimate interest is insufficient.
92
Privacy ControlsHard

Vietnam's Decree 13/2023 on Personal Data Protection requires a Data Protection Impact Assessment (DPIA) for high-risk processing. FinTech Company X's Partner A operations plan to process sensitive financial data of 500,000 customers using automated profiling for loan eligibility. Does this processing require a DPIA under Decree 13/2023, and what factors determine this?

  • A. No — DPIAs are only required for government agencies processing Vietnamese citizen data
  • B. Yes — large-scale processing of sensitive financial data using automated profiling is high-risk and requires a DPIA
  • C. Only if Partner A is incorporated in Vietnam — foreign-incorporated entities are not subject to Decree 13/2023
  • D. No — DPIAs are a GDPR requirement only; Decree 13/2023 does not mandate DPIAs
✓ Correct: B — Large-scale automated profiling of sensitive data requires a DPIA
Decree 13/2023 (Vietnam's personal data protection regulation, effective September 2023) requires that controllers conducting high-risk processing activities — including large-scale processing, processing of sensitive data, and automated profiling — conduct an impact assessment and notify the Ministry of Public Security (equivalent to a DPIA under GDPR principles). Processing 500,000 customers' financial data through automated profiling clearly triggers high-risk criteria. Decree 13/2023 applies to entities processing Vietnamese citizens' data regardless of where the entity is incorporated.
💡 ISC2 Mindset: Decree 13/2023 applies to any entity processing Vietnamese personal data — incorporation location is irrelevant. Large-scale automated profiling is explicitly high-risk.
93
Privacy ControlsHard

FinTech Company X transfers customer behavioral data from Vietnam to Singapore for processing by the central data science team. Vietnam's Decree 13/2023 restricts cross-border personal data transfers. What is the MOST appropriate mechanism to legitimize this transfer?

  • A. Anonymizing the data before transfer — anonymized data is not subject to cross-border transfer restrictions
  • B. Obtaining approval from Vietnam's Ministry of Public Security for the cross-border transfer, or establishing that Singapore has adequate data protection standards
  • C. Encrypting the data in transit with TLS 1.3 — encrypted data transfer satisfies cross-border transfer requirements
  • D. Including a clause in employee contracts that consent to data transfer to Singapore — employee consent satisfies the requirement
✓ Correct: B — MPS approval or adequacy assessment of the destination country
Decree 13/2023 requires that cross-border personal data transfers be assessed for the receiving country's data protection standards and, for sensitive data, may require approval from Vietnam's Ministry of Public Security. This mirrors GDPR's adequacy decisions and Standard Contractual Clauses framework. Encryption (C) addresses security in transit but does not constitute a legal transfer mechanism. Employee consent (D) addresses HR-related data, not customer behavioral data. Anonymization (A) is a valid strategy but requires genuine irreversible anonymization — not just pseudonymization — which may not be feasible for behavioral data.
💡 ISC2 Mindset: Cross-border transfer legality is a governance question, not a technical one — encryption protects the channel, but legal mechanisms authorize the transfer.
94
Privacy ControlsHard

Under GDPR, a data subject exercises their "right to data portability" and requests their loan application data in a machine-readable format. FinTech Company X's legacy system only exports data in a proprietary binary format. What obligation does GDPR Article 20 impose?

  • A. GDPR allows any format the controller chooses — the requirement is to provide the data, not a specific format
  • B. The data must be provided in a "structured, commonly used and machine-readable format" (e.g., JSON or CSV) — proprietary binary formats do not fulfill this requirement
  • C. The right to portability only applies to data actively provided by the data subject — profiling or derived data is excluded
  • D. Portability requests must be fulfilled within 90 days — there is no format requirement under Article 20
✓ Correct: B — Must provide in structured, commonly used, machine-readable format (e.g., JSON/CSV)
GDPR Article 20 mandates that ported data be provided in a "structured, commonly used and machine-readable format." Proprietary binary formats that require the controller's own software to read do not fulfill this requirement — the intent is interoperability, allowing the data subject to transfer their data to another controller. Standard formats like JSON, CSV, or XML satisfy this. Option C is partially correct (portability applies to data "provided by" the subject and processed on consent/contract basis), but the question's scenario involves application data which typically qualifies. The 90-day timeline (D) is incorrect — GDPR requires one month with possible 2-month extension.
💡 ISC2 Mindset: Data portability = structured, machine-readable, interoperable format. Proprietary formats defeat the purpose of portability rights.
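To make the format requirement concrete, here is a minimal sketch of what an Article 20-compliant export might look like once the legacy record has been decoded into an in-memory structure. The record fields and values are hypothetical, invented for illustration only; the point is that both JSON and CSV are "structured, commonly used and machine-readable," while a proprietary binary dump is neither.

```python
import csv
import io
import json

# Hypothetical loan-application record, assumed already decoded from the
# legacy system's proprietary format into a plain Python dict.
loan_application = {
    "applicant_id": "A-10482",
    "full_name": "Tran Thi B",
    "requested_amount": 250000,
    "currency": "PHP",
    "submitted_at": "2024-03-01T09:30:00+08:00",
}

# JSON: structured, commonly used, machine-readable — interoperable with
# any receiving controller's systems.
json_export = json.dumps(loan_application, indent=2, ensure_ascii=False)

# CSV works equally well for flat, tabular records.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=loan_application.keys())
writer.writeheader()
writer.writerow(loan_application)
csv_export = buf.getvalue()

print(json_export)
print(csv_export)
```

Either output can be handed to the data subject or transmitted directly to another controller, which is the interoperability goal Article 20 is after.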
95
Privacy ControlsHard

A FinTech Company X data analyst builds a credit scoring model trained on historical loan data. A civil society group argues the model is discriminatory against a protected ethnic group based on correlated behavioral signals. Under fairness and privacy frameworks, what is the MOST relevant concept, and what should the Data Owner do?

  • A. Model bias is a product liability issue, not a privacy issue — refer to the legal department only
  • B. The model uses legitimate business correlations; bias claims do not require a governance response
  • C. Algorithmic fairness and anti-discrimination principles intersect with privacy law — the Data Owner should commission a fairness audit and, if discrimination is found, retrain or constrain the model
  • D. The model should be immediately disabled pending full regulatory review before retraining
✓ Correct: C — Fairness audit and remediation if discrimination is found
Automated profiling in financial services is subject to anti-discrimination regulations and, under GDPR Article 22, individuals have rights regarding automated decision-making including the right to obtain human review and to contest decisions. The intersection of privacy, fairness, and anti-discrimination is a recognized governance area. ISC² expects security and privacy managers to take fairness complaints seriously — they represent both legal risk (discrimination claims) and reputational risk. The proportionate response is a fairness audit, not immediate shutdown (D) or dismissal (B). This is a governance and risk management decision, not solely a legal one.
💡 ISC2 Mindset: Algorithmic fairness is a governance responsibility — automated decision systems must be auditable and correctable, not just technically accurate.
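One common screening step in such a fairness audit is a disparate impact check. The sketch below uses the four-fifths (80%) rule, a widely used heuristic rather than a legal verdict, on entirely hypothetical approval counts: if a group's approval rate falls below 80% of the most-favoured group's rate, the model is flagged for deeper review.

```python
# Hypothetical approval outcomes per demographic group (illustration only).
outcomes = {
    "group_a": {"approved": 720, "total": 1000},
    "group_b": {"approved": 450, "total": 1000},
}

# Approval rate per group.
rates = {g: v["approved"] / v["total"] for g, v in outcomes.items()}

# Compare each group against the most-favoured group's rate.
reference = max(rates.values())

for group, rate in sorted(rates.items()):
    ratio = rate / reference
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: approval={rate:.2f} ratio={ratio:.2f} -> {flag}")
```

Here group_b's ratio is 0.45 / 0.72 ≈ 0.63, below the 0.8 threshold, so the audit would escalate to the Data Owner for remediation (retraining, feature constraints, or human review of affected decisions).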
96
Privacy ControlsHard

FinTech Company X suffers a data breach exposing 50,000 customer records containing national IDs and credit scores. Under the Philippine Data Privacy Act (RA 10173), what are the notification timelines and recipients?

  • A. Notify the NPC within 72 hours; notify affected individuals within 7 days of confirmation of breach
  • B. Notify the NPC within 72 hours of discovery; notify affected individuals as soon as practicable but within 5 days of confirming the breach
  • C. No mandatory timeline — notification is required only "without undue delay" at the organization's discretion
  • D. Notify affected individuals within 24 hours; notify the NPC within 30 days of breach discovery
✓ Correct: B — NPC within 72 hours; individuals within 5 days
Under NPC Circular 16-03 (implementing RA 10173), organizations must notify the NPC of a personal data breach within 72 hours of discovery when the breach is likely to give rise to a real risk of serious harm to data subjects. Affected individuals must be notified as soon as practicable but within five (5) days of the organization's confirmation that the breach occurred and that notification is required. This differs from GDPR (72 hours to DPA; no fixed timeline for individuals). Knowing jurisdiction-specific notification timelines is a tested CISSP Domain 2 skill.
💡 ISC2 Mindset: PH DPA breach notification = 72h to NPC + 5 days to individuals. This differs from GDPR (72h to DPA, "without undue delay" to individuals). Know both.
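The two clocks described above (72 hours from discovery for the NPC, five days from confirmation for individuals) can be computed mechanically. This is a worked example with hypothetical timestamps, not legal advice; it simply shows that the two deadlines run from different starting events.

```python
from datetime import datetime, timedelta, timezone

PH_TZ = timezone(timedelta(hours=8))  # Philippine Standard Time (UTC+8)

# Hypothetical incident timeline.
discovery = datetime(2024, 5, 10, 14, 0, tzinfo=PH_TZ)     # breach discovered
confirmation = datetime(2024, 5, 11, 9, 0, tzinfo=PH_TZ)   # breach confirmed

# NPC notification clock runs from discovery.
npc_deadline = discovery + timedelta(hours=72)

# Data-subject notification clock runs from confirmation.
subject_deadline = confirmation + timedelta(days=5)

print("NPC notification due:    ", npc_deadline.isoformat())
print("Data-subject notice due: ", subject_deadline.isoformat())
```

Tracking both deadlines separately matters in practice: a breach discovered on a Friday afternoon can exhaust most of its 72-hour NPC window over a weekend.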
97
Privacy ControlsHard

FinTech Company X's eKYC Vendor facial recognition system processes biometric data in real time. The security team wants to apply the principle of data minimization. Which design choice BEST embodies this principle?

  • A. Store the raw facial image and the derived biometric template — raw images provide a fallback for re-processing
  • B. Store only the biometric template needed for comparison; delete the raw facial image immediately after template extraction
  • C. Store all biometric data in encrypted form — encryption satisfies data minimization requirements
  • D. Retain biometric data for 7 years to ensure availability for future model retraining
✓ Correct: B — Store only the template; delete the raw image after extraction
Data minimization requires collecting and retaining only the data necessary for the stated purpose. If the purpose is biometric identity verification, the template (a mathematical representation) is sufficient — the raw facial image is not needed after extraction and constitutes unnecessary additional biometric data. Storing both the image and template doubles the sensitive data footprint without proportionate benefit. Encryption (C) addresses security but not minimization — you can still hold unnecessary data in encrypted form. Long retention (D) for model retraining is a separate purpose that requires its own legal basis and minimization assessment.
💡 ISC2 Mindset: Data minimization = hold only what is necessary for the defined purpose. Redundant data (raw image + template) violates minimization even if it is useful.
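The minimization pattern in option B can be sketched in a few lines. Note the heavy caveat: a real eKYC system derives the template with a face-embedding model; SHA-256 is used here purely as a stand-in so the retain-template, discard-image flow is runnable, and all names below are hypothetical.

```python
import hashlib


def extract_template(raw_image: bytes) -> bytes:
    # Placeholder: a real system would run a face-embedding model here.
    # SHA-256 stands in only to demonstrate the minimization pattern,
    # not actual biometric template extraction.
    return hashlib.sha256(raw_image).digest()


def enroll(raw_image: bytes, store: dict, user_id: str) -> None:
    template = extract_template(raw_image)
    store[user_id] = template  # retain only the derived template
    del raw_image              # drop the in-memory reference immediately;
    # a real pipeline would also securely delete the uploaded image file
    # and any temporary copies, so no raw biometric is retained.


store: dict = {}
enroll(b"\x89PNG...raw-face-bytes", store, "user-42")
print(sorted(store))  # only the template survives, keyed by user
```

The design point: the verification purpose is fully served by the template, so keeping the raw image adds sensitive-data footprint with no proportionate benefit, exactly what minimization forbids.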
98
Privacy ControlsHard

A senior manager at FinTech Company X argues: "We don't need a formal privacy program — we have strong encryption and access controls. Security is privacy." A CISSP must correct this misconception. What is the BEST response?

  • A. The manager is correct — encryption and access controls together constitute a complete privacy program
  • B. Security is a necessary but insufficient condition for privacy. Privacy additionally requires purpose limitation, consent management, data subject rights fulfillment, transparency, and retention governance — technical controls alone cannot satisfy these obligations
  • C. Privacy programs are only required for organizations over a certain size — small teams can rely on security controls
  • D. Privacy is a subset of security — a mature security program automatically satisfies all privacy obligations
✓ Correct: B — Security is necessary but insufficient for privacy
ISC² explicitly distinguishes privacy from security: security ensures CIA (Confidentiality, Integrity, Availability) of information, but privacy requires governance of personal information across its entire lifecycle — including purpose limitation, lawful processing basis, consent management, data subject rights (access, correction, erasure, portability), transparency (privacy notices), and proper retention and disposal. A perfectly encrypted, access-controlled system can still violate privacy law if data is retained beyond its purpose, used without consent, or not made accessible to data subjects on request. Security enables privacy but does not substitute for it.
💡 ISC2 Mindset: Security ≠ Privacy. Security is a technical tool that supports privacy; privacy is a governance framework that security tools help implement.
99
Privacy ControlsHard

FinTech Company X's new mobile app collects device location data continuously, even when the app is running in the background. The privacy engineer warns this violates data minimization. The product manager argues location data improves fraud detection and customer experience. What is the ISC²-aligned resolution?

  • A. Collection of location data is legitimate because fraud detection is a compelling business interest that overrides minimization
  • B. A Data Protection Impact Assessment (DPIA) must be conducted to assess the proportionality of continuous collection versus the privacy intrusion; collection should be limited to what is necessary and proportionate to the fraud detection purpose
  • C. The privacy engineer should be overruled — product decisions are the product manager's authority, not the privacy team's
  • D. Collect all location data initially and then delete what is not used after 30 days — this satisfies minimization retrospectively
✓ Correct: B — DPIA to assess proportionality; collect only what is necessary
The data minimization principle (GDPR Article 5(1)(c); RA 10173 Section 11) requires that personal data collection be "adequate, relevant and not excessive" relative to the stated purpose. Continuous background location collection may be disproportionate if the fraud detection purpose can be served with more limited collection (e.g., location at transaction time only). A DPIA assesses whether the privacy intrusion is proportionate to the business benefit and identifies less privacy-invasive alternatives (Privacy by Design). Option D is a retrospective collection approach that violates minimization at the point of collection. Option C incorrectly subordinates privacy governance to product authority.
💡 ISC2 Mindset: Data minimization and proportionality must be assessed BEFORE collection — a DPIA is the governance tool that balances business need against privacy intrusion.
100
Privacy ControlsHard

A CISSP candidate sitting the exam is asked: "An organization collects customer PII for loan origination. The same data is later used for marketing analytics without additional consent. Under both GDPR and PH DPA principles, which concept is violated, and what is the MINIMUM remediation?" This is Q100 — the capstone question integrating D2 principles.

  • A. Purpose limitation is violated. Minimum remediation: obtain fresh explicit consent for marketing analytics, OR establish an alternative lawful basis, AND conduct a compatibility assessment if relying on legitimate interest
  • B. Data minimization is violated. Remediation: delete excess data collected beyond what the loan origination purpose requires
  • C. Storage limitation is violated. Remediation: establish a defined retention period for marketing analytics data separate from loan origination data
  • D. Integrity and confidentiality is violated. Remediation: encrypt marketing analytics datasets to the same standard as loan origination data
✓ Correct: A — Purpose limitation violated; new consent or compatible lawful basis required
This capstone question integrates the key Domain 2 principle: purpose limitation. Under GDPR Article 5(1)(b) and RA 10173 Section 11, personal data collected for one purpose (loan origination) cannot be used for an incompatible purpose (marketing analytics) without a fresh lawful basis. The minimum remediation is: obtain explicit consent for marketing analytics (preferred), OR establish an alternative lawful basis (legitimate interest with compatibility assessment, legal obligation, etc.). The compatibility assessment must consider: link between purposes, context of collection, nature of data, consequences, and safeguards. Options B, C, and D describe real privacy principles but do not address the primary violation — unauthorized repurposing.
💡 ISC2 Mindset: Purpose limitation is the cornerstone of privacy governance — data collected for one purpose cannot migrate to another without fresh lawful basis. This principle ties classification, lifecycle, ownership, and consent together as Domain 2's unifying theme.