About


Take On Payments, a blog sponsored by the Retail Payments Risk Forum of the Federal Reserve Bank of Atlanta, is intended to foster dialogue on emerging risks in retail payment systems and enhance collaborative efforts to improve risk detection and mitigation. We encourage your active participation in Take on Payments and look forward to collaborating with you.

Take On Payments

February 20, 2018


Best Practices for Data Privacy Policies

In my last couple of posts, I've discussed ethical policies related to data collection and analysis. The first post focused on why such policies are needed; the second, on ethical elements to include in policies that directly involve the end user. Whether or not the customer is actively involved in accepting these policies, any company that collects data should have a strong privacy and protection policy. Unfortunately, judging by the sheer number and magnitude of data breaches that have occurred, many companies clearly have not implemented the protection element sufficiently, resulting in the theft of personally identifiable information that can jeopardize an individual's financial well-being. In this post, the last of the series, I look at some best practices that appear in many data policies.

The average person cannot fathom the amount, scope, and velocity of personal data being collected. In fact, the power of big data has given rise to a new term: "newborn data," which describes new data created from analyses of multiple databases. Such aggregation can be beneficial in a number of cases, including marketing, medical research, and fraud detection. But it has also recently come to light that enemy forces could use data collected from fitness wearables to determine the most likely paths and congregation points of military service personnel. As machine learning technology advances, newborn data will become more common, and it will be used in ways that no one considered when the original data was first collected.

All this data collecting, sharing, and analyzing has produced a plethora of position papers on data policies, each with its own set of best practices, but the elements I see in most policies include the following (a simple illustrative sketch follows the list):

  • Data must not be collected in violation of any regulation or statute, or in a deceptive manner.
  • The benefits and harms of data collection must be thoroughly evaluated, and how the collected data will be used, and by whom, must be clearly defined.
  • When the information comes from direct user interaction, the user's consent should be obtained and a full disclosure provided.
  • The quality of the data must be constantly and consistently evaluated.
  • A neutral party should periodically conduct a review to ensure adherence to the policy.
  • Protection of the data, especially data that is individualized, is paramount; there should be stringent protection controls in place to guard against both internal and external risks. An action plan should be developed in case there is a breach.
  • The position of data czar—one who has oversight of and accountability for an organization's data collection and usage—should be considered.
  • In the event of a compromise, the data breach action plan must be immediately implemented.
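
To make the checklist above more concrete, here is a minimal sketch, in Python, of how an organization might encode these practices so that gaps can be flagged automatically. The class and field names are my own illustration and are not drawn from any particular standard or framework.

    from dataclasses import dataclass, fields

    @dataclass
    class DataPolicyChecklist:
        """Illustrative checklist mirroring the best practices listed above."""
        lawful_and_nondeceptive_collection: bool = False  # no statute violated, no deception
        benefits_and_harms_evaluated: bool = False        # usage and users clearly defined
        user_consent_and_full_disclosure: bool = False    # when data come from direct interaction
        ongoing_data_quality_review: bool = False         # constant, consistent evaluation
        neutral_party_policy_review: bool = False         # periodic independent check of adherence
        stringent_protection_controls: bool = False       # guards against internal and external risks
        data_czar_role_considered: bool = False           # oversight of and accountability for data use
        breach_action_plan_ready: bool = False            # implemented immediately on compromise

    def unmet_practices(checklist: DataPolicyChecklist) -> list:
        """Return the names of any best practices the draft policy does not yet satisfy."""
        return [f.name for f in fields(checklist) if not getattr(checklist, f.name)]

    # Hypothetical example: a draft policy that so far covers only collection and consent.
    draft = DataPolicyChecklist(lawful_and_nondeceptive_collection=True,
                                user_consent_and_full_disclosure=True)
    print("Gaps to address:", unmet_practices(draft))

Encoding the policy this way is not a substitute for the policy itself, but it does make periodic reviews, such as the neutral-party audit mentioned above, easier to run consistently.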

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

February 20, 2018 in consumer protection, cybercrime, data security, identity theft, privacy | Permalink | Comments (0)

February 12, 2018


If the Password Is Dying, Is the PIN Far Behind?

Back in January, I wrote a post that highlighted the rising incidence of lost-and-stolen card fraud in the United Kingdom. I concluded that the decades-old PIN solution for the card-present environment is now showing signs of weakness. Results of a recent Minneapolis Fed survey of 283 financial institutions lend some support to that conclusion: the survey found that losses on PIN-based debit increased by 50 percent from 2015 to 2016. In fact, 81 percent of the respondents reported fraud losses from PIN-based debit, compared with only 77 percent for credit cards.

The news wasn't all bad for PIN-based debit. Signature-based debit and credit cards still drew more fraud attempts than any other payment instrument, and signature debit fraud losses actually rose faster than PIN debit losses from 2015 to 2016, increasing by 63 percent. The PIN remains a far superior verification method for card payments, but I'm willing to bet that, much like the password, it is becoming less effective.

Is this happening just as the PIN is about to become more prominent? In late January, the PCI Security Standards Council announced a new security standard for software-based PIN entry, also known as "PIN on glass." The standard specifies the security requirements for accepting a PIN on a mobile point-of-sale device such as a Square card reader.

As an aside, I am a bit surprised by this announcement. Apparently, mobile phones are considered secure enough for entering PINs, yet when someone uses a mobile wallet such as Apple Pay or Samsung Pay, the card's PAN, or primary account number, is still tokenized for security purposes. I'll save a discussion of this inconsistency for another post.
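
For readers unfamiliar with tokenization, the underlying idea is that the real PAN never sits in the wallet or travels with the transaction; a surrogate value stands in for it, and only a secure vault can map that surrogate back to the PAN. The Python sketch below is a deliberately simplified illustration of that substitution idea; it is not an implementation of EMV payment tokenization or of any wallet provider's actual scheme, and the names in it are hypothetical.

    import secrets

    class TokenVault:
        """Toy vault mapping surrogate tokens back to real PANs.

        Real payment tokenization involves token service providers, domain
        restrictions, and per-transaction cryptograms; this sketch shows only
        the basic substitution idea.
        """

        def __init__(self):
            self._token_to_pan = {}

        def tokenize(self, pan):
            # Issue a random surrogate that reveals nothing about the underlying PAN.
            token = secrets.token_hex(8)
            self._token_to_pan[token] = pan
            return token

        def detokenize(self, token):
            # Only the vault holder can recover the real PAN from the token.
            return self._token_to_pan[token]

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")       # well-known test PAN, not a real card
    print("Merchant and network see:", token)        # surrogate value, of little use if stolen
    print("Vault lookup recovers:", vault.detokenize(token))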

People have been talking for years about how the password has passed its prime as a standalone authentication solution. Yet it lives on, and it's as difficult as ever to mitigate its vulnerabilities. In my opinion, attempts to do so have increased customer friction while having minimal impact. I think the PIN is following a similar path. It creates customer friction (especially for me, as I now have different PINs for multiple cards that I struggle to keep straight), and, according to the data I mentioned in the first paragraph, it is losing its effectiveness. Yet with the PCI's recent announcement, the PIN could become even more prevalent for cardholders. Is it time, in the name of both security and reduced customer friction, for us to replace PINs and passwords with more modern authentication technologies such as biometrics?

By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

 

February 12, 2018 in authentication, banks and banking, cards, chip-and-pin, consumer fraud, debit cards, EMV, mobile payments | Permalink | Comments (0)

February 5, 2018


Elements of an Ethical Data Policy

In my last post, I introduced the issue of ethical considerations around data collection and analytics. I asked whether the significant expansion of technical capabilities over the last five years had been matched by progress on the ethical issues of big data, such as privacy, confidentiality, transparency, protection, and ownership. In this post, I focus on the disclosures provided to users and suggest some ethics-related elements to include in them.

The complexities of ethics policies

As I've researched the topic of big data, I've come to realize that developing a universal ethics policy will be difficult because of the diversity of data that's collected and the many uses for this data—in the areas of finance, marketing, medicine, law enforcement, social science, and politics, to name just a few.

Privacy and data usage policies are often disclosed to users signing up for particular applications, products, or services. My experience has been that the details about the data being collected are hidden in the customer agreement. Normally, the agreement offers no "opt-out" of any specific elements, so users must either decline the service altogether or begrudgingly accept the conditions wholesale.

But what about the databases that are part of public records? Often these records are created without any direct communication with the affected individuals. Did you know that in most states, property records at the county level are available online to anyone? You can look up property ownership by name or address and find the property's sales history and details, including prices, square footage, number of bedrooms and baths, often a floor plan, and even the name of the mortgage company. All of this is useful information for a pricing analysis of comparable properties, but it is also useful to a criminal who combines it with other socially engineered information for an account takeover or new-account fraud attempt. Doesn't it seem reasonable that I should be notified, or at least be able to see a record of it, when someone makes such an inquiry on my own property record?

Addressing issues in the disclosure

Often, particularly with financial instruments and medical information, those collecting data must comply with regulations that require specific disclosures and ways of handling the data. The following elements together can serve as a good benchmark in developing an ethical data policy disclosure (a structured sketch of these elements follows the list):

  • Type of data collected and usage. What type of data are being collected and how will that data be used? Will the data be retained at the individual level or aggregated, thereby preventing identification of individuals? Can the data be sold to third parties?
  • Accuracy. Can an individual review the data and submit corrections?
  • Protection. Are people notified how their data will be protected, at least in general terms, from unauthorized access? Are they told how they will be notified if there is a breach?
  • Public versus private system. Is it a private system that usually restricts access, or a public system that usually allows broad access?
  • Open versus closed. Is it a closed system, which prevents sharing, or is it open? If it's open, how will the information be shared, at what level, and with whom? An example of an open system is one that collects information for a governmental background check and potentially shares that information with other governmental or law enforcement agencies.
  • Optional versus mandatory. Can individuals decline participation in the data collection, or decline specific elements? Or is the individual required to participate such that refusal results in some sort of punitive action?
  • Fixed versus indefinite duration. Will the captured data be deleted or destroyed on a timetable or in response to an event—for example, two years after an account is closed? Or will it be retained indefinitely?
  • Data ownership. Do individuals own and control their own data? Biometric data stored on a mobile phone, for example, are not also stored on a central site. On the other hand, institutions may retain ownership. Few programs place ownership with the user, although an agreement may establish legal rights governing how the data can be used.
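
One way to make these questions concrete is to treat the disclosure itself as a structured record that can be reviewed, compared across services, and presented to users. The Python sketch below is purely illustrative: the field names and the sample values for an imaginary fitness app are my own assumptions and are not drawn from any regulation or existing template.

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class DataPolicyDisclosure:
        """Illustrative structure covering the disclosure elements listed above."""
        data_types_collected: list                  # type of data collected and usage
        retained_at_individual_level: bool          # or aggregated to prevent identification
        may_be_sold_to_third_parties: bool
        correction_process: str                     # accuracy: how individuals review and fix data
        protection_summary: str                     # protection, described in general terms
        breach_notification_method: str
        public_or_private_system: str               # "public" or "private"
        open_or_closed_sharing: str                 # "open" or "closed"
        participation: str                          # "optional" or "mandatory"
        retention: str                              # fixed timetable or event, or "indefinite"
        data_owner: str                             # "individual" or "institution"

    # Hypothetical values for an imaginary fitness app, shown only to illustrate the shape.
    example = DataPolicyDisclosure(
        data_types_collected=["location", "heart rate"],
        retained_at_individual_level=True,
        may_be_sold_to_third_parties=False,
        correction_process="in-app review and correction form",
        protection_summary="encrypted at rest and in transit",
        breach_notification_method="email within 72 hours of discovery",
        public_or_private_system="private",
        open_or_closed_sharing="closed",
        participation="optional",
        retention="deleted two years after account closure",
        data_owner="individual",
    )

    print(json.dumps(asdict(example), indent=2))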

What elements have I missed? Do you have anything to suggest?

In my next post, I will discuss appropriate guiding principles for those circumstances in which individuals have no direct interaction with the collection effort.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


February 5, 2018 in consumer protection, innovation, regulations | Permalink | Comments (0)

January 29, 2018


Big Data, Big Dilemma

Five years ago, I authored a post discussing the advantages and pitfalls of "big data." Since then, data analytics has come to the forefront of computer science, and data analysts are among the most sought-after talent across many industries. One of my nephews, a month out of college with an honors dual degree in computer science and statistics, was hired by a rail transportation carrier to work on freight movement efficiency using data analytics, with a starting salary of more than $100,000.

Big data, machine learning, deep learning, artificial intelligence—these are terms we constantly see and hear in technology articles, webinars, and conferences. Some of this usage is marketing hype, but clearly the significant increases in computing power at lower cost have enabled a continued expansion of data analytics capabilities across a wide range of businesses, including consumer products and marketing, financial services, and health care. But along with this expansion of technical capability, has there been a corresponding heightened awareness of the ethical issues of big data? Have we fully considered issues such as privacy, confidentiality, transparency, and ownership?

In 2014, the Executive Office of the President issued a report on big data privacy issues. The report was prefaced with a letter that included this caution:

Big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans' relationship with data should expand, not diminish, their opportunities and potential.

(The report was updated in May 2016.)

In the European Union (EU), the General Data Protection Regulation was adopted in 2016, with enforcement beginning in May 2018; it gives EU citizens significant control over their personal data, including the export of that data outside the EU. Although numerous cybersecurity bills, including some addressing data collection and protection, have been proposed in the U.S. Congress (see Doug King’s 2015 post), nothing has been passed to date despite the continuing announcements of data breaches. We have to go all the way back to the Privacy Act of 1974 for federal privacy legislation (other than constitutional rights), and that act dealt only with the collection and use of data on individuals by federal agencies.

In a future blog post, I will give my perspective on what I believe to be the critical elements of a data collection and usage policy that addresses ethical issues in both overt and covert programs. In the interim, I would like to hear your perspective on this topic.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

January 29, 2018 in consumer protection, innovation, regulations | Permalink | Comments (0)
