About


Take On Payments, a blog sponsored by the Retail Payments Risk Forum of the Federal Reserve Bank of Atlanta, is intended to foster dialogue on emerging risks in retail payment systems and enhance collaborative efforts to improve risk detection and mitigation. We encourage your active participation in Take on Payments and look forward to collaborating with you.

Take On Payments

July 6, 2018


Attack of the Smart Refrigerator

We've all heard about refrigerators that automatically order groceries when they sense the current supply is running low or out. These smart refrigerators are what people usually point to when giving an example of an "internet-of-things" (IoT) device. Briefly, an IoT device is a physical device connected to the internet, often wirelessly, that transmits data, sometimes without direct human interaction. I suspect most of you already have at least one of these devices operating in your home or office, whether it's a wireless router, baby monitor, or voice-activated assistant, or "smart" lights, thermostats, security systems, or TVs.

Experts are forecasting that IoT device manufacturing will be one of the fastest-growing industries over the next decade. Gartner estimates there were more than 8 billion connected IoT devices globally in 2017, with about $2 trillion going toward IoT endpoints and services. By 2020, Gartner expects that number to exceed 20 billion. But what security are manufacturers building into these devices to prevent monitoring or outside manipulation? What prevents someone from hacking into your security system to monitor the patterns of your house or office, or from turning on your interior security cameras and invading your privacy? For devices that can generate financial transactions, what authentication processes will ensure that transactions are legitimate? It's one kind of mistake for a refrigerator to order an unneeded gallon of milk; it's another entirely for a criminal to use that connection to access a home computer, monitor online banking activity, and capture log-on credentials.
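As a rough check on those projections, the implied annual growth rate can be computed from the rounded figures cited above (a back-of-the-envelope sketch, not Gartner's methodology):

```python
# Implied compound annual growth rate (CAGR) of connected IoT devices,
# using the rounded figures cited above; an illustration only.
devices_2017 = 8e9   # more than 8 billion devices in 2017
devices_2020 = 20e9  # projected to exceed 20 billion by 2020
years = 3

cagr = (devices_2020 / devices_2017) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # roughly 35.7% per year
```

Even with conservative rounding, the implied growth is well over 30 percent a year, which helps explain why IoT manufacturing is forecast to be among the fastest-growing industries.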

As one would probably suspect, there is no simple or consistent answer to these security questions, but the overall track record of IoT device security has not been a good one. Major distributed denial-of-service (DDoS) attacks have been launched against websites using botnets composed of millions of compromised IoT devices. Ransomware attacks have targeted consumers' home security systems and thermostats, forcing the victims to pay the extortionist to get their systems working again.

Some high-end devices, such as driverless cars and medical devices, have been designed with security controls at the forefront, but most other manufacturers have given little thought to a criminal's ability to use a device to access and control other devices running on the same network. Adding to the problem, many of these devices never receive software updates, including security patches.

With cybersecurity issues grabbing so many headlines, people are paying more and more attention to the role and impact of IoT devices. The National Institute of Standards and Technology (NIST) has begun efforts to develop standards for lightweight cryptography that can operate within the processing constraints of IoT devices. However, NIST estimates it will take two to four years to finalize the standard.

In the meantime, the Department of Justice has some recommendations for securing IoT devices, including:

  • Research your device to determine security features. Does it have a changeable password? Does the manufacturer deliver security updates?
  • After you purchase a device and before you install it, download security updates and reset any default passwords.
  • If automatic updates are not provided to registered users, check at least monthly to determine if there are updates and download only from reputable sites.
  • Protect your routers and home Wi-Fi networks with firewalls, strong passwords, and security keys.
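The recommendations above amount to a per-device checklist. A minimal sketch of how one might track it follows; the field names and the example device are hypothetical illustrations, not part of the DOJ guidance:

```python
# A minimal sketch of the DOJ-style checklist above, applied per device.
# Field names and the example device are hypothetical illustrations.

CHECKLIST = [
    ("changeable_password", "Default password can be changed"),
    ("vendor_ships_updates", "Manufacturer delivers security updates"),
    ("defaults_reset", "Default credentials reset before installation"),
    ("updates_current", "Security updates checked at least monthly"),
]

def audit(device: dict) -> list:
    """Return the checklist items a device fails."""
    return [desc for key, desc in CHECKLIST if not device.get(key, False)]

thermostat = {
    "changeable_password": True,
    "vendor_ships_updates": True,
    "defaults_reset": False,   # still using the factory password
    "updates_current": True,
}

for failure in audit(thermostat):
    print("FAIL:", failure)
```

Running the audit against each device on a home network makes it easy to see which ones still need attention.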

I see IoT device security as an issue that will continue to grow in importance. In a future post, I will discuss the privacy issues that IoT devices could create.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

July 6, 2018 in consumer fraud, cybercrime, cybersecurity, fraud, identity theft, innovation, online banking fraud, privacy | Permalink

Comments

Post a comment

Comments are moderated and will not appear until the moderator has approved them.

If you have a TypeKey or TypePad account, please Sign in

April 9, 2018


Fintech for Financial Wellness

When you hear the term fintech, you might free-associate "blockchain" or "APIs" or "machine learning." Chances are "financial opportunities and capabilities for all" might not be the first topic to spring to mind. Recently, I've been learning about the vast ecosystem of fintech entrepreneurs seeking to improve what the Center for Financial Services Innovation calls "financial health"—that is, our financial resiliency in the face of adversity, ability to take advantage of opportunities, and ability to manage day-to-day finances.

Consumer-focused fintech projects ask the question: Can we use data to improve financial wellness for individuals?

Some of these projects are directed toward specific groups. There are apps to help SNAP (Supplemental Nutrition Assistance Program) recipients manage benefits, enable immigrants to import their credit history from their home countries into U.S. credit reporting tools, and teach recent college grads about financial decisions such as paying off student loans or signing up for employer-sponsored retirement accounts.

Some can help you to:

  • Analyze your cash flows over the course of the month and tell you how much you could save.
  • Save or invest when you make purchases by automatically rounding up and putting your change into an account.
  • Analyze your accounts to identify peaks and valleys in your income and help you smooth it out.
  • Know when you have enough money to pay a particular bill and let you pay it by swiping your finger.
  • Link saving to opportunities to win prizes by incorporating lotteries.
  • Know, via text message, if you're likely to overdraw your account in the next few days.
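One of these ideas, rounding up each purchase and sweeping the change into savings, is simple enough to sketch directly. The function and amounts below are illustrative assumptions, not any particular app's logic:

```python
# A sketch of "round-up" savings: sweep the gap between each purchase
# and the next whole dollar into a savings account. Illustrative only.
from decimal import Decimal

def round_up_amount(purchase: Decimal) -> Decimal:
    """Change swept to savings for a single purchase."""
    cents = purchase % 1
    return (Decimal(1) - cents) % 1  # a whole-dollar purchase sweeps nothing

purchases = [Decimal("4.35"), Decimal("12.80"), Decimal("3.00")]
saved = sum(round_up_amount(p) for p in purchases)
print(f"Swept to savings: ${saved}")  # $0.65 + $0.20 + $0.00 = $0.85
```

Using `Decimal` rather than floating point avoids the rounding surprises that binary floats introduce when handling money.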

Recent research finds that these sorts of interventions can be effective. For example, in "Preventing Over-Spending: Increasing Salience of Credit Card Payments through Smartphone Interventions," the authors find that people who use an app that suggests weekly savings goals significantly reduce their expenditures. This trial took place with a small sample of Swiss credit card users. As part of the experiment, participants reviewed and classified every credit card transaction, thus making every payment more visible to them. On average, participants reduced their weekly spending by about 14 percent.

Of course, not only entrepreneurs but also economists, policymakers, and traditional institutions appreciate the importance of financial education. Increasing financial literacy makes for a stronger economy, and financial education is an important part of what the Fed does. Just last week, Atlanta Fed president and CEO Raphael Bostic spoke about the importance of financial literacy. You can read his remarks here.

If you, too, care about improving financial wellness for everyone and want to learn more, please reach out to share information and ideas.

By Claire Greene, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 9, 2018 in fintech, innovation | Permalink


April 2, 2018


Advice to Fintechs: Focus on Privacy and Security from Day 1

Fintech continues to have its moment. In one week in early March, I attended three Boston-area meetings devoted to new ideas built around the blockchain, open banking APIs, and apps for every conceivable wrinkle in personal financial management.

"Disruptive" was the vocabulary word of the week.

But no matter how innovative, disruptive technology happens within an existing framework of consumer protection practices and laws. Financial products and tools—whether a robofinancial adviser seeking to consolidate your investment information or a traditional checking account at a financial institution—are subject to laws and regulations that protect consumers. As an attorney speaking at one of the Boston meetings put it, "The words 'unfair,' 'deceptive,' and 'misleading' keep me up at night."

A failure to understand the regulatory framework can play out in various ways. For example, in a recent survey of New York financial institutions (FIs) by the Fintech Innovation Lab, 60 percent of respondents reported that regulatory, compliance, or security issues made it impossible to move fintech proposals into proof-of-concept testing. Great ideas, but inadequate infrastructure.

To cite another example, in 2016, the Consumer Financial Protection Bureau took action against one firm for misrepresenting its data security practices. And just last month, the Federal Trade Commission (FTC) reached a settlement with another firm over allegations that the firm had inadequately disclosed both restrictions on funds availability for transfer to external bank accounts and consumers' ability to control the privacy of their transactions. Announcing the settlement, acting FTC chairman Maureen Ohlhausen pointed out that it sent a strong message of the "need to focus on privacy and security from day one."

As Ohlhausen made clear, whoever the disrupter—traditional financial institution or garage-based startup—consumer protection should be baked in from the start. At the Boston meetings, a number of entrepreneurs advocated a proactive stance for working with regulators and urged that new businesses bring in compliance expertise early in product design. Good advice, not only for disrupters but also for innovation labs housed in FIs, FIs adopting third-party technology, and traditional product design.

By Claire Greene, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 2, 2018 in innovation, regulations, regulators | Permalink


February 5, 2018


Elements of an Ethical Data Policy

In my last post, I introduced the issue of ethical considerations around data collection and analytics. I asked if, along with the significant expansion of technical capabilities over the last five years, there had been a corresponding increase in awareness and advancement of the ethical issues of big data, such as privacy, confidentiality, transparency, protection, and ownership. In this post, I focus on the disclosures provided to users and suggest some elements concerning ethics to include in them.

The complexities of ethics policies

As I've researched the topic of big data, I've come to realize that developing a universal ethics policy will be difficult because of the diversity of data that's collected and the many uses for this data—in the areas of finance, marketing, medicine, law enforcement, social science, and politics, to name just a few.

Privacy and data usage policies are often disclosed to users signing up for particular applications, products, or services. My experience has been that the details about the data being collected are hidden in the customer agreement. Normally, the agreement offers no "opt-out" of any specific elements, so users must either decline the service altogether or begrudgingly accept the conditions wholesale.

But what about the databases that are part of public records? Often these public records are created without any direct communication with the affected individuals. Did you know that in most states, property records at the county level are available online to anyone? You can look up property ownership by name or address and find out the sales history of the property, including prices, square footage, number of bedrooms and baths, often a floor plan, and even the name of the mortgage company—all useful information for performing a pricing analysis for comparable properties, but also useful for a criminal to combine with other socially engineered information for an account takeover or new-account fraud attempt. Doesn't it seem reasonable that I should receive a notification or be able to document when someone makes such an inquiry on my own property record?

Addressing issues in the disclosure

Often, particularly with financial instruments and medical information, those collecting data must comply with regulations that require specific disclosures and ways to handle the data. The following elements together can serve as a good benchmark in the development of an ethical data policy disclosure:

  • Type of data collected and usage. What type of data are being collected and how will that data be used? Will the data be retained at the individual level or aggregated, thereby preventing identification of individuals? Can the data be sold to third parties?
  • Accuracy. Can an individual review the data and submit corrections?
  • Protection. Are people notified how their data will be protected, at least in general terms, from unauthorized access? Are they told how they will be notified if there is a breach?
  • Public versus private system. Is it a private system that usually restricts access, or a public system that usually allows broad access?
  • Open versus closed. Is it a closed system, which prevents sharing, or is it open? If it's open, how will the information be shared, at what level, and with whom? An example of an open system is one that collects information for a governmental background check and potentially shares that information with other governmental or law enforcement agencies.
  • Optional versus mandatory. Can individuals decline participation in the data collection, or decline specific elements? Or is the individual required to participate such that refusal results in some sort of punitive action?
  • Fixed versus indefinite duration. Will the captured data be deleted or destroyed on a timetable or in response to an event—for example, two years after an account is closed? Or will it be retained indefinitely?
  • Data ownership. Do individuals own and control their own data? Biometric data stored on a mobile phone, for example, are not also stored on a central storage site. On the other hand, institutions may retain ownership. Few programs place ownership with the user, although agreements may grant individuals legal rights governing how the data can be used.
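Taken together, these elements amount to a structured disclosure that could even be captured in machine-readable form. The sketch below is a hypothetical illustration; the field names and sample values are my own, not a proposed standard:

```python
# A hypothetical, machine-readable rendering of the disclosure elements above.
# Field names and sample values are illustrative, not a proposed standard.

data_policy = {
    "data_collected": ["transaction history", "device identifiers"],
    "usage": {"aggregated_only": False, "sold_to_third_parties": False},
    "accuracy": {"user_can_review": True, "user_can_correct": True},
    "protection": {"breach_notification": True},
    "access": "private",          # "private" or "public" system
    "sharing": "closed",          # "closed" or "open"
    "participation": "optional",  # "optional" or "mandatory"
    "retention": {"duration": "fixed", "delete_after_years": 2},
    "ownership": "institution",   # or "individual"
}

def missing_elements(policy: dict) -> list:
    """Flag disclosure elements the policy leaves unaddressed."""
    required = ["data_collected", "usage", "accuracy", "protection",
                "access", "sharing", "participation", "retention", "ownership"]
    return [k for k in required if k not in policy]

print(missing_elements(data_policy))  # [] -- every element is addressed
```

A checker like this could flag disclosures that omit one of the elements before they ever reach a customer agreement.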

What elements have I missed? Do you have anything to suggest?

In my next post, I will discuss appropriate guiding principles for those circumstances in which individuals have no direct interaction with the collection effort.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


February 5, 2018 in consumer protection, innovation, regulations | Permalink


January 29, 2018


Big Data, Big Dilemma

Five years ago, I authored a post discussing the advantages and pitfalls of "big data." Since then, data analytics has come to the forefront of computer science, and data analysts are among the most sought-after talent across many industries. One of my nephews, a month out of college (graduating with honors with a dual degree in computer science and statistics), was hired by a rail transportation carrier to work on freight movement efficiency using data analytics—with a starting salary of more than $100,000.

Big data, machine learning, deep learning, artificial intelligence—these are terms we constantly see and hear in technology articles, webinars, and conferences. Some of this usage is marketing hype, but clearly the significant increases in computing power at lower costs have empowered a continued expansion in data analytical capability across a wide range of businesses including consumer products and marketing, financial services, and health care. But along with this expansion of technical capability, has there been a corresponding heightened awareness of the ethical issues of big data? Have we fully considered issues such as privacy, confidentiality, transparency, and ownership?

In 2014, the Executive Office of the President issued a report on big data privacy issues. The report was prefaced with a letter that included this caution:

Big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans' relationship with data should expand, not diminish, their opportunities and potential.

(The report was updated in May 2016.)

In the European Union (EU), the General Data Protection Regulation, adopted in 2016 and enforceable beginning in May 2018, gives EU citizens significant control over their personal data, including control over the exportation of that data outside the EU. Although numerous cybersecurity bills, including some addressing data collection and protection, have been proposed in the U.S. Congress (see Doug King's 2015 post), none has been passed to date despite the continuing announcements of data breaches. For federal privacy legislation (other than constitutional rights), we have to go all the way back to the Privacy Act of 1974, and that act dealt only with the collection and use of data on individuals by federal agencies.

In a future blog post, I will give my perspective on what I believe to be the critical elements in developing a data collection and usage policy that addresses ethical issues in both overt and covert programs. In the interim, I would like to hear from you as to your perspective on this topic.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

January 29, 2018 in consumer protection, innovation, regulations | Permalink


December 11, 2017


Fintechs and the Psychology of Trust

In the 14th century, Chaucer used the word trust to mean "virtual certainty and well-grounded hope." Since then, psychologists have described trust as an essential ingredient for social functioning, which, in turn, affects many economic variables. So how do we define trust in the 21st century, in the age of the internet? In particular, how do fintechs, relative newcomers in the financial services industry and not yet coalesced into an industry, gain the trust of the public? Would they more effectively gain that trust by relying on banks to hold them to certain standards, or by coming together to create their own?

In 2004, social psychologists Hans-Werner Bierhoff and Bernd Vornefeld, in "The Social Psychology of Trust with Applications in the Internet," wrote about trust in relation to technology and systems. They observed that "trust and risk are complementary terms. Risk is generally based on mistrust, whereas trust is associated with less doubts about security." They further explained that trust in technology and systems is based on whether an individual believes the system's security is guaranteed. Psychologically speaking, when companies show customers they care about the security of their information, customers have increased confidence in the company and the overall system. Understanding this provides insight into the development of certification authorities, third-party verification processes, and standardized levels of security.

To understand how fintechs might gain the trust of consumers and the financial industry, it's worth taking a step back, to look at how traditional financial services, before the internet and fintechs, used principles similar to those outlined by Bierhoff and Vornefeld. Take, for example, the following list of efforts the industry has taken to garner trust (this list is by no means comprehensive):

  • FDIC-insured depository institutions must advertise FDIC membership.
  • All financial institutions (FIs) must undergo regulator supervision and examination.
  • FIs must obtain USA PATRIOT Act certifications from any foreign banks with which they maintain correspondent accounts.
  • Organizations with payment card data must comply with the PCI Standards Council's security standards and audit requirements.
  • Organizations processing ACH can have NACHA membership but must follow NACHA Operating Rules and undergo annual audits and risk assessments.
  • The Accredited Standards Committee X9 Financial Industry Standards Inc. has developed international as well as domestic standards for FIs.
  • The International Organization for Standardization has also developed international standards for financial services.
  • The American National Standards Institute provides membership options and develops standards and accreditation for financial services.

FIs have often been an integral part of the standards creation process. To the extent that these standards and requirements also affect fintechs, shouldn't fintechs also have a seat at the table? In addition, regulatory agencies have given us an additional overarching "virtual certainty" that FIs are adhering to the agreed-upon standards. Who will provide that oversight—and virtual certainty—for the fintechs?

The issue of privacy further adds to the confusion surrounding fintechs. The Gramm-Leach-Bliley Act (GLBA) of 1999 requires companies defined under the law as "financial institutions" to ensure the security and confidentiality of customer information. Further, the Federal Trade Commission's (FTC) Safeguards Rule requires FIs to have measures in place to keep customer information secure, and to comply with certain limitations on disclosure of nonpublic personal information. It's not clear that the GLBA's and FTC's definition of "financial institution" includes fintechs.

So, how will new entrants to financial services build trust? Will fintechs adopt the same standards, certifications, and verifications so they can influence assessments of risk versus security? What oversight will provide overarching virtual certainty that new systems are secure? And in the case of privacy, will fintechs identify themselves as FIs under the law? Or will it be up to a fintech's partnering financial institution to supervise compliance? As fintechs continue to blaze new trails, we will need clear directives as to which existing trust guarantees (certifications, verifications, and standards) apply to them and who will enforce those expectations.

As Bierhoff and Vornefeld conclude, "it is an empirical question how the balance between trust and distrust relates to successful use of the Internet." Although Chaucer was born a little too soon for internet access, he might agree.

By Jessica Washington, AAP, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

December 11, 2017 in banks and banking, financial services, innovation, mobile banking | Permalink


December 4, 2017


What Will the Fintech Regulatory Environment Look Like in 2018?

As we prepare to put a bow on 2017 and begin to look forward to 2018, I can't help but observe that fintech was one of the bigger topics in the banking and payments communities this year. (Be sure to sign up for our December 14 Talk About Payments webinar to see if fintech made our top 10 newsworthy list for 2017.) Many industry observers would likely agree that it will continue to garner a lot of attention in the upcoming year, as financial institutions (FIs) continue to partner with fintech companies to deliver client-friendly solutions.

No doubt, fintech solutions are making our daily lives easier, whether they are helping us deposit a check with our mobile phones or activating fund transfers with a voice command in a mobile banking application. But at what cost to consumers? To date, the direct costs, such as fees, have been minimal. However, are there hidden costs such as the loss of data privacy that could potentially have negative consequences for not only consumers but also FIs? And what, from a regulatory perspective, is being done to mitigate these potential negative consequences?

Early in the year, there was a splash in the regulatory environment for fintechs. The Office of the Comptroller of the Currency (OCC) began offering limited-purpose bank charters to fintech companies. This charter became the subject of heated debates and discussions—and even lawsuits, by the Conference of State Bank Supervisors and the New York Department of Financial Services. To date, the OCC has not formally begun accepting applications for this charter.

So where will the fintech regulatory environment take us in 2018?

Will it continue to be up to the FIs to perform due diligence on fintech companies, much as they do for third-party service providers? Will regulatory agencies offer FIs additional guidance or due diligence frameworks for fintechs, over and above what they provide for traditional third-party service providers? Will one of the regulatory agencies decide that the role of fintech companies in financial services is becoming so important that the companies should be subject to examinations like those financial institutions undergo? Finally, will U.S. regulatory agencies create sandboxes that allow fintechs and FIs to launch products on a limited scale, as regulators have done in the United Kingdom and Australia?

The Risk Forum will continue to closely monitor the fintech industry in 2018. We would enjoy hearing from our readers about how they see the regulatory environment for fintechs evolving.

By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

December 4, 2017 in banks and banking, financial services, innovation, mobile banking, regulations, regulators, third-party service provider | Permalink


November 27, 2017


How Intelligent Is Artificial Intelligence?

At the recent Money20/20 conference, sessions on artificial intelligence (AI) joined those on friction in regulatory and technological innovation in dominating the agenda. A number of panels highlighted the competitive advantages AI tools offer companies. It didn't matter if the topic was consumer marketing, fraud prevention, or product development—AI was the buzzword. One speaker noted the social good that could come from such technology, pointing to the work of a Stanford research team trying to identify individuals with a strong likelihood of developing diabetes by running an automated review of photographic images of their eyes. Another panel discussed the privacy and ethical issues around the use of artificial intelligence.

But do any of these applications marketed as AI pass the now-famous test that Alan Turing proposed in 1950 to define true artificial intelligence? Turing is widely regarded as the father of computer science. During World War II, his efforts led a cryptographic team to break the Enigma code used by the Germans, as depicted in the 2014 movie The Imitation Game. Turing once said, "A computer would deserve to be called intelligent if it could deceive a human into believing that it was human." An annual competition held since 1991, the Loebner Prize, offers a solid 18-karat gold medal and a monetary prize of $100,000 for the first computer whose responses are indistinguishable from a real human's. To date, no one has won the gold medal, but every year a bronze medal and a smaller cash prize are given to the "most humanlike" entry.

Incidentally, many vendors seem to use artificial intelligence as a synonym for the terms deep learning and machine learning. Is this usage of AI mostly marketing hype for the neural network technology developed in the mid-1960s, now greatly improved thanks to the substantial increase in computing power? A 2016 Forbes article by Bernard Marr provides a good overview of the different terms and their applications.

My opinion is that none of the tools in the market today meet the threshold of true artificial intelligence based on Turing's criteria. That isn't to say the lack of this achievement should diminish the benefits that have already emerged and will continue to be generated in the future. Computing technology certainly has advanced to be able to handle complex mathematical and programmed instructions at a much faster rate than a human.

What are your thoughts?

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

November 27, 2017 in emerging payments, innovation, payments | Permalink


November 20, 2017


Webinar: Key Payment Events in 2017

This year has been an exciting one for the payments industry. Topics such as blockchain and distributed ledger technology, card-not-present fraud, and chip-card migration continued to be in the news, and new subjects such as behavioral biometrics and machine learning/artificial intelligence made their way into the spotlight.

In the past, the Retail Payments Risk Forum team has coauthored a year-end post identifying what we believed to be the major payment events of the year. This year, we are doing something a little different and hope you will like the change. Taking advantage of our new webinar series, Talk About Payments, the RPRF team will share our perspectives in a roundtable discussion during a live webinar. We encourage financial institutions, retailers, payments processors, law enforcement, academia, and other payments system stakeholders to participate. Participants will be able to submit questions during the webinar.

The webinar will be held on Thursday, December 14, from 1 to 2 p.m. (ET). Participation in the webinar is complimentary, but you must register in advance. To register, click on the TAP webinar link. After you complete your registration, you will receive a confirmation email with all the log-in and toll-free call-in information. A recording of the webinar will be available to all registered participants in various formats within a couple of weeks.

We look forward to you joining us on December 14 and sharing your perspectives on the major payment events that took place in 2017.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

November 20, 2017 in banks and banking, biometrics, emerging payments, EMV, innovation | Permalink


November 13, 2017


The Future of Wearables

My wife and I took our children to a Florida theme park for their recent fall break. While I would love to spend the next few paragraphs opining on why I think our school calendar is crazy or giving a review of the most phenomenal ride that I have ever experienced, it doesn't really fit the mission or purpose of Take On Payments. Fortunately, the trip did provide some fodder and thought for a blog post, thanks to a much-discussed and written-about wearable NFC—or near-field-communication—device that the theme park offers.

These bands were introduced in 2013 to create an awesome customer experience. This experience is much bigger than a payment platform and has absolutely nothing to do with a rewards program around which so many mobile wallet and payment applications are being developed. The band's functionality certainly includes payments, but the device also replaces room keys, park entry cards, and ride-specific tickets known as fast passes. As an additional feature, it is waterproof, which proves handy for a trip to the water park. I was able to spend the week without ever having anything in my pockets (yes, I even left my phone in the room). My wife commented how fantastic it would be to take the NFC band experience outside of the park because it was just so easy and convenient.

Ease and convenience–isn't that what a lot of us are after? If you have to give me something to get me to open an application and tap my phone in place of a payment card, is that really providing ease and convenience? I am now 100 percent convinced that rewards programs aren't going to drive mobile commerce to any significant degree. Experiences that provide ease and convenience will drive mobile commerce. Hello, mobile order-ahead. Hello, grocery delivery. And hello, wearable of the future.

It isn't hard to imagine a wearable device, like an open-loop band, transforming our lives. After my theme park experience, I long for the day when a wearable will be the key to my vehicle—which I won't have to drive, either—and to my house, my communication device, and my payment device (or wallet). Of course, we'll have to consider the security issues. Even the bands incorporate PINs and fingerprint biometrics in some cases to ensure that the legitimate customer is the one wearing the band.

Is this day really so far-fetched? I can already order a pizza through a connected speaker, initiate a call from the driver's seat of my car without touching my phone, or tap my phone to pay for a hamburger. The more I think about these possibilities, the more I have to ask myself: is it crazy to think that using mobile phones for payments might become obsolete before long? Or maybe mobile phones will provide that band functionality?

By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

November 13, 2017 in banks and banking, innovation | Permalink

