Bank of Canada research has established that there is a public good aspect to privacy in payments (Garratt and van Oordt 2019). This note outlines what is technologically feasible for privacy in a central bank digital currency (CBDC) system. Privacy in a CBDC goes beyond binary choices of anonymity or full disclosure. System designers have a range of choices around the type of information to keep private and whom to keep it private from. Because privacy is not in the sole purview of the Bank, defining it requires consultation with external parties. Our approach in this note is to:

  • develop a framework to evaluate different privacy models
  • understand the technical tools to enact various privacy models
  • suggest a design approach for CBDC privacy
  • list the key risks and trade-offs

Key messages

  • There are many cryptographic techniques and operational arrangements for a fine-grained privacy design. These demand knowledge of the detailed requirements around privacy and disclosure.
  • The Bank could engineer a CBDC system with higher levels of privacy than commercial products can offer—but with trade-offs. Some combinations of requirements will not be feasible or may lead to high operational costs and excessive complexity and risk. Also, the user’s overall privacy will depend on factors such as user behaviour and the privacy policies of other entities in the CBDC ecosystem.
  • Techniques to achieve cash-like privacy are immature. They have limited deployments, none of which comply with know-your-customer (KYC) and anti–money laundering (AML) regulations. Their risks include hidden vulnerabilities, a lack of scalability and complicated operations.
  • Maintaining privacy while complying with regulations that require the disclosure of information presents a fundamental tension for a CBDC. This is further complicated by the need for proactive disclosure to prevent fraud.
  • Public trust in the privacy design the Bank enacts could be enhanced through third-party reviews of CBDC architecture and operations.

Analysis of system privacy

Diverse payment systems with a range of privacy offerings

The technologies that are or could be used in payment systems are diverse—ranging from cash, debit and credit cards to public and private distributed ledger technologies (DLTs), centralized systems and offline devices. The privacy available to users varies markedly from system to system. For example, most users appreciate that, broadly speaking, cash use is highly private while credit cards offer less privacy. We aim to categorize this variation in privacy between systems more precisely.

A framework to analyze, compare and define CBDC system requirements

We consider a CBDC system consisting of holdings and transactions, where a holding has an owner (O) and a balance (B), and a transaction has a payer (Pr), payee (Pe) and amount (A). Privacy is the degree to which holdings and transactions data are hidden from participating entities. The entities are many—the payer’s bank or money services business (MSB), the payee’s MSB, government institutions, payment providers and the general public—each with a varying degree of visibility into holdings and transactions. We present a framework in which the privacy profile of a system is the combination of the extent to which details are kept private from each entity. This formulation allows us to compare privacy profiles of diverse technologies.
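
To make the framework concrete, the short Python sketch below encodes two privacy profiles on the 0 (fully visible) to 3 (fully hidden) scale of Table 1 below and compares them. The entity set is truncated to two entities for brevity, and the code is purely illustrative, not part of any proposed design.

    FIELDS = ("O", "B", "Pr", "Pe", "A")  # owner, balance, payer, payee, amount

    # Scores taken from the cash and magnetic stripe credit card rows of Table 1.
    cash = {
        "government": {"O": 3, "B": 3, "Pr": 3, "Pe": 3, "A": 3},
        "public":     {"O": 3, "B": 3, "Pr": 3, "Pe": 3, "A": 3},
    }
    credit_card_stripe = {
        "government": {"O": 3, "B": 3, "Pr": 1, "Pe": 1, "A": 0},
        "public":     {"O": 3, "B": 3, "Pr": 3, "Pe": 3, "A": 3},
    }

    def at_least_as_private(a, b):
        """True if profile a hides at least as much as profile b from every entity."""
        return all(a[e][f] >= b[e][f] for e in a for f in FIELDS)

    assert at_least_as_private(cash, credit_card_stripe)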

We summarize the privacy levels of the following representative technologies in Table 1:

  • magnetic stripe credit cards
  • EMV1 chip credit cards
  • electronic transfers
  • debit cards
  • permissioned DLT
  • custodial and advanced use of public DLTs (e.g., Bitcoin)
  • tiered ledger systems composed of multiple different ledgers
  • device-based systems using devices assigned either to identified customers as per KYC regulations or to anonymous customers
  • cash

Offline device-based systems (such as PUF Cash, a system based on physically uncloneable functions [see Calhoun et al. 2019]) come closest to achieving cash-like privacy.

Table 1: Privacy profiles of payment technologies

Solution                             | Government    | Payer MSB     | Payee MSB     | Payee | Payment providers | Public (other users)
                                     | O  B  Pr Pe A | O  B  Pr Pe A | O  B  Pr Pe A | Pr    | O  B  Pr Pe A     | O  B  Pr Pe A
Credit card (stripe)                 | 3  3  1  1  0 | 0  0  0  0  0 | 2  3  2  0  0 | 0     | 1  3  1  0  0     | 3  3  3  3  3
Credit card (EMV)                    | 3  3  1  1  0 | 0  0  0  0  0 | 2  3  2  0  0 | 2     | 1  3  1  1  0     | 3  3  3  3  3
E-transfer                           | 3  3  1  1  0 | 0  0  0  1  0 | 1  3  1  0  0 | 2     | 1  3  1  1  0     | 3  3  3  3  3
Debit card                           | 3  3  1  1  0 | 0  0  0  0  0 | 1  3  1  0  0 | 1     | 1  3  1  1  0     | 3  3  3  3  3
Permissioned DLT                     | 1  0  1  1  0 | 0  0  0  1  0 | 1  3  1  0  0 | 1     | 1  0  1  1  0     | 3  3  3  3  3
Bitcoin custodial                    | 2  3  2  2  0 | 0  0  0  2  0 | 2  3  2  0  0 | 2     | 2  3  2  2  0     | 2  3  2  2  0
Bitcoin pro                          | 3  3  2  2  0 | 3  3  2  2  0 | 3  3  2  2  0 | 2     | 3  3  2  2  0     | 3  3  2  2  0
Tiered ledgers                       | 1  0  1  1  0 | 0  0  0  1  0 | 2  3  2  0  0 | 1     | 3  3  3  3  3     | 3  3  3  3  3
Device-based (KYC, non-transferable) | 0  2  2  0  2 | 0  2  2  0  2 | 0  2  2  0  2 | 1     | 2  3  3  3  3     | 3  3  3  3  3
Device-based (non-KYC, transferable) | 3  3  2  0  2 | 3  3  2  0  2 | 3  3  2  0  2 | 1     | 2  3  3  3  3     | 3  3  3  3  3
Cash                                 | 3  3  3  3  3 | 3  3  3  3  3 | 3  3  3  3  3 | 2     | 3  3  3  3  3     | 3  3  3  3  3

Note: Values indicate the degree of privacy of each detail from each entity, from 0 (fully visible) to 3 (fully hidden); higher values mean more privacy. O = owner and B = balance of a holding; Pr = payer, Pe = payee and A = amount of a transaction. For the payee, only the privacy of the payer’s identity (Pr) is shown.

Dependence on multiple entities and behaviour

A system may be more private with respect to one entity (e.g., merchant) and less so for another (e.g., government). Privacy may also vary by user behaviour: for example, paying through an online account may reveal personal details. In designing a CBDC system’s privacy, we must consider many nuanced questions:

  • Should all transactions be routinely disclosed to the government, or only some (e.g., those above a dollar threshold)?
  • Should law enforcement be able to determine a person’s holdings, even if only approximately?
  • Should a payer’s identity be hidden from a merchant?
  • What transaction details should be shown to a payer’s MSB?
  • Should users be able to transact outside of KYC regulations to some extent?

By using a framework to document CBDC privacy requirements, system designers can ensure they cover all entities and use design ideas from systems with similar privacy profiles.

Privacy techniques and design

Adopting privacy by design

It is important that privacy, along with compliance, be designed into the system from the outset. Privacy by design (Cavoukian 2011) is a well-known approach that calls for privacy to be proactively engineered throughout the design process. The alternative, completing a functional design first and adding privacy and compliance later, carries the risk of unnecessary trade-offs.

Customizing and combining privacy constructs to achieve a desired design

Privacy design can apply building blocks of varying maturity and trade-offs:

  • Group signatures (Chaum and van Heyst 1991) allow a set of entities to transact while obscuring their identities, revealing only that “someone in the group” transacted.
  • Secret sharing (Shamir 1979) or multi-signature (Itakura and Nakamura 1983) schemes can guarantee that sensitive data are disclosed only when an adequate number of entities (e.g., three of five) agree; a minimal sketch of this idea follows this list.
  • Zero-knowledge proofs (Blum, Feldman and Micali 1988) can prove claims about data without revealing them (e.g., they can prove an account balance is adequate for a transaction without revealing the balance).
  • Homomorphic encryption (Rivest, Adleman and Dertouzos 1978) allows mathematical operations on obscured data (e.g., payment of interest on a balance that is encrypted).
  • Multi-party computation (Yao 1982) allows several entities to securely contribute their data to a combined dataset for fraud detection while keeping their data private from one another.
  • Differential privacy (Dwork and Roth 2014) and anonymization are techniques that ensure personally identifiable information cannot be extracted from sensitive datasets. The data are rendered safe and private for uses such as research and data analytics.
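
As a concrete illustration of the secret-sharing item above, the following minimal Python sketch splits a sensitive value into five shares such that any three reconstruct it, while fewer reveal nothing. The field prime, threshold and escrowed value are illustrative assumptions, not parameters of any CBDC design.

    import random

    PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is over this finite field

    def make_shares(secret, k, n):
        """Split `secret` into n shares; any k of them reconstruct it."""
        # Random polynomial of degree k - 1 whose constant term is the secret.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
        return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers the secret."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    # E.g., a disclosure key escrowed so that any three of five oversight
    # entities must cooperate to reveal the underlying data.
    shares = make_shares(123456789, k=3, n=5)
    assert reconstruct(shares[:3]) == 123456789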

More techniques not covered here could be explored by system designers for potential use: for example, private information retrieval (Chor et al. 1998) and deniable encryption (Canetti et al. 1997). Most of these are flexible enough to be used across a variety of technology platforms (e.g., centralized, DLT and device-based) and can be combined and customized to achieve fine-grained CBDC privacy goals.

Privacy and governance

Optimal mix of business model, attributes and platform

Knowing the CBDC business model, attributes and technology platform is essential to choosing the right constructs and combining them appropriately.

For example, consider a system where private transactions are verified by MSBs. If the business model states that MSBs are highly trusted, then privacy protocols can be simplified by assuming verifiers are honest. If not, the chosen protocols must guard against dishonest verifiers, which entails higher complexity. If amounts are hidden and policy dictates an interest-bearing CBDC, then the chosen schemes must support encrypted computation of interest payments, as in the sketch below.
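
To illustrate the interest-bearing case, here is a toy additively homomorphic sketch in the style of Paillier encryption, where scaling a hidden balance (e.g., applying interest) is performed directly on the ciphertext. The primes, balance and rate are illustrative assumptions; a production system would need vetted parameters and implementations.

    import random
    from math import gcd

    p, q = 1117, 1123          # toy primes; real keys would be ~2048 bits
    n, n2 = p * q, (p * q) ** 2
    g = n + 1                  # standard Paillier generator
    phi = (p - 1) * (q - 1)    # private key material

    def L(x):
        return (x - 1) // n

    mu = pow(L(pow(g, phi, n2)), -1, n)  # precomputed decryption factor

    def encrypt(m):
        r = random.randrange(1, n)
        while gcd(r, n) != 1:
            r = random.randrange(1, n)
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        return (L(pow(c, phi, n2)) * mu) % n

    balance = 10_000                     # in cents, never seen by the operator
    c = encrypt(balance)
    # Raising a ciphertext to k multiplies the hidden plaintext by k, so a
    # 2 percent interest credit is a scaling by 102 (the holder rescales by 100).
    c_scaled = pow(c, 102, n2)
    assert decrypt(c_scaled) == balance * 102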

Further, the selection of privacy techniques will depend on the chosen platform. Typical proof systems are made up of provers (e.g., end-users) who generate proofs and verifiers (e.g., the system) that check them. In a DLT system, multiple nodes perform verification, so system designers would need to ensure verification protocols are highly efficient; centralized systems could tolerate slower verification. Another consideration is the trade-off between prover efficiency and proof size: algorithms that achieve fast proof generation generally produce large proofs. This could be a challenge in device-based solutions constrained by limited storage. Device-based solutions must also ensure that selected schemes can operate within the restrictions of sporadic CBDC network connectivity and limited computing capacity.
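
As a toy example of the prover/verifier pattern, the sketch below implements a Schnorr proof of knowledge made non-interactive with the Fiat-Shamir heuristic: the prover demonstrates knowledge of a secret exponent without revealing it. The group parameters are deliberately tiny illustrative values, not production choices.

    import hashlib
    import random

    # Toy Schnorr group: p = 2q + 1 with q prime; g generates the order-q subgroup.
    q = 1019
    p = 2 * q + 1          # 2039, prime
    g = 4                  # an element of order q

    def challenge(y, t):
        """Fiat-Shamir: derive the challenge by hashing the transcript."""
        return int.from_bytes(hashlib.sha256(f"{g}:{y}:{t}".encode()).digest(), "big") % q

    def prove(x):
        """Prover: show knowledge of x with y = g^x mod p, revealing only (y, t, s)."""
        y = pow(g, x, p)
        r = random.randrange(q)        # one-time nonce
        t = pow(g, r, p)               # commitment
        s = (r + challenge(y, t) * x) % q
        return y, t, s

    def verify(y, t, s):
        """Verifier: accepts iff g^s == t * y^c (mod p), never seeing x."""
        return pow(g, s, p) == (t * pow(y, challenge(y, t), p)) % p

    secret = 123                       # e.g., a hidden credential
    assert verify(*prove(secret))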

The impact of complying with regulations on privacy and techniques

A CBDC system is required to comply with regulations (e.g., KYC and AML), which can dictate both the level of privacy and the selection of privacy techniques. KYC may require entities to store personal data with proper classification. Generally, achieving high levels of privacy while complying with regulations is complicated. A designer could, however, build a system with hybrid privacy levels, in which unregulated holdings and transactions (offering maximum privacy to users) are permitted within limits (e.g., a maximum amount) alongside regulated ones without limits.
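
The sketch below shows one way such a hybrid policy could be expressed: anonymous transactions are admitted only below a cap, while larger ones require a verified identity. The dollar threshold and field names are illustrative assumptions, not proposed policy.

    from dataclasses import dataclass
    from typing import Optional

    UNREGULATED_LIMIT_CENTS = 100_000  # $1,000; an illustrative cap

    @dataclass
    class Transaction:
        amount_cents: int
        payer_kyc_id: Optional[str]    # None = payer did not disclose identity

    def admit(tx: Transaction) -> bool:
        """Accept anonymous transactions only under the unregulated limit."""
        if tx.amount_cents <= UNREGULATED_LIMIT_CENTS:
            return True                      # maximum privacy within limits
        return tx.payer_kyc_id is not None   # above the limit, identity required

    assert admit(Transaction(5_000, None))            # small, anonymous: allowed
    assert not admit(Transaction(500_000, None))      # large, anonymous: refused
    assert admit(Transaction(500_000, "kyc-4421"))    # large, identified: allowed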

Trade-offs and risks

Trade-offs for higher levels of privacy

In general, lower privacy levels are easier to achieve because less information must be secured. For higher privacy, the system must encapsulate information in controls proven to be reliable. This adds complexity, which raises operational costs. It also adds computational overhead, so scaling to a national population can be challenging or impractical, whether on a DLT or non-DLT platform. For example, Bitcoin (Nakamoto 2008) is a public DLT with fully visible transactions, while Zcash (Hopwood et al. 2020) is a public DLT with fully private transactions built on zero-knowledge proofs. Zcash’s privacy constructs are highly complex, do not scale well and impose computational overhead that makes private transactions significantly slower than regular ones.

Risks of emerging cryptographic techniques

Cryptographic techniques such as zero-knowledge proofs are in their infancy and remain areas of active research. The skill set needed to employ them is not as widely available as in more mature technical areas. Few systems have deployed these techniques in production, even in private industry. The risk here is that their technical complexity combined with their immaturity could mask vulnerabilities. Further, no known deployments have scaled up to a national population. The risk in this case is the unknown technical obstacles in applying these techniques to the Canadian population and beyond for future uses, such as micropayments at internet-of-things endpoints.

  1. EMV originally stood for Europay, Mastercard and Visa, the three companies that created the standard. The standard is now managed by EMVCo, a consortium with control split equally among Visa, Mastercard, JCB, American Express, UnionPay and Discover.

References

  1. Blum, M., P. Feldman and S. Micali. 1988. "Non-Interactive Zero-Knowledge and Its Applications." In STOC ’88: Proceedings of the Twentieth Annual ACM Symposium on Theory of Computing, 103–112. Chicago, IL. January.
  2. Calhoun, J., C. Minwalla, C. Helmich, F. Saqib, W. Che and J. Plusquellic. 2019. "Physical Unclonable Function (PUF)-Based e-Cash Transaction Protocol (PUF-Cash)." Cryptography 3 (3): 18.
  3. Canetti, R., C. Dwork, M. Naor and R. Ostrovsky. 1997. "Deniable Encryption." In Advances in Cryptology — CRYPTO '97, ed. B. S. Kaliski, 90–104. Lecture Notes in Computer Science, vol. 1294. Berlin, Heidelberg: Springer.
  4. Cavoukian, A. 2011. "Privacy by Design." Information and Privacy Commissioner of Ontario, Canada. January.
  5. Chaum, D. and E. van Heyst. 1991. "Group Signatures." In Advances in Cryptology — EUROCRYPT ’91, ed. D. W. Davies, 257–265. Lecture Notes in Computer Science, vol. 547. Berlin, Heidelberg: Springer.
  6. Chor, B., O. Goldreich, E. Kushilevitz and M. Sudan. 1998. "Private Information Retrieval." Journal of the ACM 45 (6): 965–982.
  7. Dwork, C. and A. Roth. 2014. "The Algorithmic Foundations of Differential Privacy." Foundations and Trends in Theoretical Computer Science 9 (3–4): 211–407.
  8. Garratt, R. and M. van Oordt. 2019. "Privacy as a Public Good: A Case for Electronic Cash." Bank of Canada Staff Working Paper No. 2019-24.
  9. Hopwood, D., S. Bowe, T. Hornby and N. Wilcox. 2020. "Zcash Protocol Specification." March 20.
  10. Itakura, K. and K. Nakamura. 1983. "A Public-Key Cryptosystem Suitable for Digital Multisignatures." NEC Research & Development 71: 1–8.
  11. Nakamoto, S. 2008. "Bitcoin: A Peer-to-Peer Electronic Cash System." November 8.
  12. Rivest, R. L., L. Adleman and M. L. Dertouzos. 1978. "On Data Banks and Privacy Homomorphisms." In Foundations of Secure Computation, ed. R. A. DeMillo, 169–179. New York: Academic Press.
  13. Shamir, A. 1979. "How to Share a Secret." Communications of the ACM 22 (11): 612–613.
  14. Yao, A. 1982. "Protocols for Secure Computation." In Proceedings of the 23rd Annual Symposium on Foundations of Computer Science (SFCS ’82), 160–164. Los Alamitos, CA: IEEE Computer Society.

Disclaimer

Bank of Canada staff analytical notes are short articles that focus on topical issues relevant to the current economic and financial context. Produced independently of the Bank’s Governing Council, they may support or challenge prevailing policy orthodoxy. The views expressed in this note are solely those of the authors; accordingly, they do not necessarily reflect the official views of the Bank of Canada, and no responsibility for them should be attributed to the Bank.

DOI: https://doi.org/10.34989/san-2020-9
