Tokenization: Why EMVCo Falls Short
Universal, Open Standard Is a Must for Merchants
To help prevent massive payment card breaches, U.S. card issuers and retailers have agreed that the time has come to migrate to the EMV chip. But tokenization and end-to-end encryption also have to be part of any update to our payments systems (see 7 Lessons from Target's Breach).
More merchants are coming around to supporting this three-pronged approach. But investments in tokenization won't make sense for merchants until a universal, open standard that's acceptable to merchants and card issuers is developed.
The EMVCo tokenization standard, which came out in March, falls short. The standard was issued by EMVCo, the global body that manages specifications and testing processes for the Europay, MasterCard, Visa standard, better known as EMV, and it already has been adopted by ApplePay. But merchants argue it's a proprietary standard that is not interoperable and serves only the interests of the banks.
It's time for the payments industry to act to ensure a truly open and universal standard is established. Otherwise, the movement to reform payments systems could be derailed. And that would mean even more breaches, and more fraud.
Making the Case
The Merchant Advisory Group, a consortium of large merchants founded in 2007, has done a good job of stating the case for a universal tokenization standard.
"A properly designed, implemented and enforced tokenization standard would move the U.S. payments system in the right direction toward mitigating payment card fraud and identity theft," MAG notes. "An open, interoperable platform will ensure merchants can support the technology across multiple providers and make back-end security processes seamless for the customer experience."
In a July interview with Digital Transactions, MAG's CEO, Mark Horwedel, speaking about the EMVCo standard, said that the merchant community is concerned about the issuance of technical standards that are "exclusively controlled by the prominent payment brands."
Specifications for magnetic-stripe transactions are issued by the International Organization for Standardization, an open body that is not run by the card brands. Horwedel argues that in the migration from mag-stripe to chip transactions, we are moving from an environment where payments specs are open to one where payments specs will be proprietary.
Avivah Litan, an analyst at the consultancy Gartner, agrees with MAG's assertion that EMVCo's specifications do not address the needs of merchants.
Until a universal, open standard for tokenization is issued and accepted, merchants are at the mercy of security and POS vendors, Litan says. "Right now, with the EMVCo standard, a retailer could have many tokens for one card number, which defeats the purpose," she explains.
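Litan's point can be illustrated with a minimal sketch of the vault-style tokenization merchants already use to shrink PCI scope (a hypothetical illustration only; the class and method names are invented, not any vendor's API). The key property is that one card number always maps to one stable token, so analytics and fraud screening on the back end still recognize repeat use of the same card:

```python
import secrets

class TokenVault:
    """Toy merchant-side vault: one stable surrogate token per card number (PAN)."""

    def __init__(self):
        self._pan_to_token = {}
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # Return the existing token if this PAN was seen before, so a single
        # card never accumulates multiple tokens across transactions.
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        token = secrets.token_hex(8)  # random surrogate; no mathematical relation to the PAN
        self._pan_to_token[pan] = token
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the real card number.
        return self._token_to_pan[token]

vault = TokenVault()
t1 = vault.tokenize("4111111111111111")
t2 = vault.tokenize("4111111111111111")
assert t1 == t2  # same card, same token on a repeat purchase
```

A scheme that instead issued a fresh token per transaction, as Litan describes happening under the EMVCo approach, would break that `t1 == t2` guarantee and, from the merchant's vantage point, defeat the purpose.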
In a draft paper, which Litan shared with me, she notes: "EMV tokens as first implemented by ApplePay and the payment card networks (Visa, MasterCard) are based on different protocols than are tokenization systems widely used by merchants to limit the scope of PCI audits."
She also says that the card brands should work with standards bodies - such as Accredited Standards Committee X9, which develops and manages standards certified by the American National Standards Institute - to develop a tokenization standard that works equally well for merchants, card issuers and all payment ecosystem players, "unlike the current EMV token standard."
But not everyone agrees that a new standard is needed.
Natalie Reinelt, a payments analyst at the consultancy Aite, says that anyone who deploys EMVCo's specs will properly devalue card data, so long as the data is tokenized at the point of capture.
And Mark Rasch, a former federal prosecutor who launched the cybercrime division at the Department of Justice, says the proprietary nature of tokenization is inevitable. Ultimately, Rasch says, retailers will have to implement translation programs that convert all of these different types of tokens on the back end.
But why follow that route if we don't have to?
One of the reasons why it's taken the U.S. so long to move forward on the EMV debit front is that we lacked agreed-upon specifications. So the industry developed the necessary specifications, paving the way for universal chip payments.
Now the time has come to develop a widely supported standard for tokenization.