Beyond the buzzword that is tokenization, the reality is that replacing sensitive data with unique identifiers is not the bulletproof method that some may think it is. That’s because standards are lacking, says GEOBRIDGE CTO Jason Way, who describes the “extra mile” that firms must travel to make sure that tokens are unique and data is secure.
All too often, we look toward technology as a magic bullet to cure what ails us. In some cases, tech works wonders on any number of fronts — from improving healthcare to changing the very way we pay for goods and services, rendering transactions digital and instant.
However, when it comes to the security of those transactions, specifically data security, technology comes up short. That is partly because fraud is always evolving, with fraudsters constantly probing for vulnerabilities in firms’ and consumers’ best efforts to protect data. Shortfalls also crop up when promises outweigh what is delivered. When tech becomes a buzzword, excitement follows and, ultimately, complacency. “Set it and forget it” goes the mantra, and that’s never a good maxim to follow, as fraudsters are assuredly not “forgetting it.”
Might we see the same forces at work when it comes to tokenization?
On its broadest face, tokenization replaces sensitive data with unique identification attributes. In payments, specifically in the payment card industry (PCI), data — which can range from names and addresses to Social Security numbers and account details — is presumably kept safe and firms satisfy compliance mandates. It is the “presumably” that might give pause, because tokenization is not as impervious a process as some might assert.
In an interview with PYMNTS, Jason Way, chief technology officer of information security firm GEOBRIDGE, stated that tokenization has a vulnerability that ties into its most basic premise: As it stands now, and as generally deployed, tokens cannot be guaranteed to be unique — and, quite simply, that guarantee must be in place when moving data that is of value. The most common vulnerability and weakness, the CTO said, is that nearly all the solutions in the market these days use software and “pseudo-random values” that do not ensure that uniqueness, which means “the amount of work to deploy [that solution] can be in vain.”
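To make the concern concrete, consider a rough sketch, in Python, of a purely software-based tokenizer of the kind Way criticizes. The names and details here are illustrative only and are not drawn from any vendor’s product; the point is simply that a pseudo-random token with no uniqueness check can collide.

```python
import random

vault = {}  # token -> clear value

def naive_tokenize(clear_value: str) -> str:
    # random.randint() is a software pseudo-random source, not a hardware-
    # certified generator, and no check for an existing token is performed.
    token = f"{random.randint(0, 999_999):06d}"
    vault[token] = clear_value  # a collision silently overwrites the prior mapping
    return token

# By the birthday problem, after roughly 1,200 six-digit tokens the odds of at
# least one collision already exceed 50 percent -- two different clear values
# would then answer to the same token.
```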
The solutions available to companies today, said Way, come up short because they do not rely on standards such as hardware-based, FIPS 140-2 Level 3 certified random number generators. Generators of that caliber are already part of the encryption work required by PCI-PIN and PCI-P2PE, the executive told PYMNTS. He mused that firms should follow such high standards even where they are not explicitly mandated, ensuring that tokenization is robust and security is treated with the utmost care.
That’s a different mindset than the one typically adopted by most hardware manufacturers and software designers, which tend to strive to meet only minimum requirements.
“There are no well-known solutions for tokenization that do any more than PCI-DSS mandates,” he added. Maybe it’s not the industry’s fault. As Way noted, “While standards are emerging, there is no governance on the proper implementation of a token solution.”
Nothing is codified as to how a token should be generated and how it should be associated with the clear value it represents. Holding efforts to minimum standards … well, that’s akin to a “get out of jail free” card, said Way, as employees tasked with data security can simply point to what’s out there and what current best practices are reputed to be.
He said, “If I can claim that I have tokenized your PII (anything from your date of birth, your Social Security number, your address and PANs), if we can present that in a way that it is not the actual true clear value, then, as a consumer, when you hear that, it gives you a warm and fuzzy feeling. … The thought is, [the firm], they are holding XYZ instead of the real Social Security number. But the reality is they still have got your Social Security number.”
The savviest of fraudsters can monitor a system and build a dictionary of what each token references, eventually making off with that sensitive information. The firm that is truly concerned with security will wisely adopt a mindset that seeks to, as Way put it, “manage ahead” in anticipation of standards that may shift toward greater rigor. To address those problems, GEOBRIDGE has introduced its TokenBRIDGE solution, which aims to overcome the pitfalls inherent in tokenization as commonly practiced. The new offering, said Way, exposes a RESTful API that lets users generate tokens that are truly random and that support data of any size or form.
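Before turning to how TokenBRIDGE works, it helps to picture that dictionary risk concretely. The sketch below is purely illustrative and assumes an attacker who can observe token-and-value pairs leaking over time, or who can submit chosen values to a predictable tokenizer; it is not tied to any particular system.

```python
def build_dictionary(observed_pairs):
    """Accumulate a token -> clear-value lookup from whatever leaks over time."""
    dictionary = {}
    for clear_value, token in observed_pairs:
        dictionary[token] = clear_value
    return dictionary

# Once assembled, any token later spotted in logs, databases or network traffic
# can be reversed with a simple lookup; no cryptography has to be broken:
#     dictionary.get(some_observed_token)
```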
Way explained that “TokenBRIDGE is a licensable extension that is operated from the same enterprise key management platform KeyBRIDGE, utilized throughout the industry to satisfy PCI-PIN and PCI-P2PE certification standards.”
In other words, with GEOBRIDGE’s approach, tokenization embraces the same cryptographic key management and utilization practices that are applied in meeting those related PCI standards.
“Until a choice is available, the mandate is difficult to create. GEOBRIDGE is now making this alternative available to the market,” said Way.
But the proactive firm faces several challenges.
“I need to guarantee the uniqueness of my tokens. I need to make sure that I cannot produce the same token twice. I need to make sure of that because my token is intended to be a public value; there cannot be a situation that allows the collection of those tokens to produce a dictionary that would otherwise tell an attacker how to defeat a token protection mechanism,” he said.
Not an easy task, to be sure. In explaining the mechanics of the GEOBRIDGE solution, Way said that the internal HSM on the KeyBRIDGE platform is a FIPS 140-2 Level 3 device with a certified random number generator. All data submitted for tokenization is stored under hardware-based 256-bit AES encryption. A unique cryptographic checksum is created for each clear value that is stored.
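In broad strokes, that design can be sketched as follows. The Python below is a simplified illustration of the ideas Way describes, not GEOBRIDGE’s code: the key storage, random source and function names are hypothetical, and in a real deployment the AES-256 key and random number generator would live inside a certified HSM rather than in software libraries such as `secrets` and `cryptography`.

```python
import hmac
import hashlib
import secrets
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

aes_key = AESGCM.generate_key(bit_length=256)  # stands in for an HSM-held AES-256 key
mac_key = secrets.token_bytes(32)              # key for the per-value checksum
aead = AESGCM(aes_key)

by_checksum = {}  # checksum of clear value -> token   (one token per clear value)
by_token = {}     # token -> (nonce, ciphertext)       (no token ever issued twice)

def tokenize(clear_value: str) -> str:
    # A keyed checksum recognizes a clear value seen before without
    # keeping the value itself in the lookup index.
    checksum = hmac.new(mac_key, clear_value.encode(), hashlib.sha256).hexdigest()
    if checksum in by_checksum:
        return by_checksum[checksum]           # no clear value ends up with two tokens
    while True:
        token = secrets.token_hex(16)          # strong randomness; a hardware RNG in practice
        if token not in by_token:              # no two clear values share one token
            break
    nonce = secrets.token_bytes(12)
    by_token[token] = (nonce, aead.encrypt(nonce, clear_value.encode(), None))
    by_checksum[checksum] = token
    return token
```

The two lookups are what deliver the guarantees quoted below: the checksum index prevents a clear value from accidentally receiving two tokens, and the token index prevents the same token from ever being issued twice.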
“TokenBRIDGE ensures that no two clear values will ever have the same token, and that no clear value may accidentally have more than one token,” said Way. “Think of it as a logical extension of GEOBRIDGE’s platform, now enabling tokenization and going that extra mile to protect customer data.”