About


Take On Payments, a blog sponsored by the Retail Payments Risk Forum of the Federal Reserve Bank of Atlanta, is intended to foster dialogue on emerging risks in retail payment systems and enhance collaborative efforts to improve risk detection and mitigation. We encourage your active participation in Take on Payments and look forward to collaborating with you.

Take On Payments

October 15, 2018


An Ounce of Prevention

Benjamin Franklin coined the phrase "An ounce of prevention is worth a pound of cure," and after attending late September's FinovateFall 2018 Conference in New York City, I find this aphorism as relevant today as it was in 1735. The conference showcased 80 demonstrations of leading-edge financial technology over two days with presenters representing five continents. Demos touched on a wide range of technologies and solutions, including game-based marketing and financial education; "lifestyle" mobile banking applications that integrate social media, news, e-commerce, and financial management to deliver personalized recommendations; lending and home buying; and integration with intelligent personal assistants. What stood out to me most were the many possible technologies offered to authenticate users, cards, and mobile transactions, each with the potential to prevent payments fraud.

As card payments continue to dominate consumer transactions in the United States, gain ground in other countries, and power more remote purchases, the demand for fast, reliable identity and payment authentication has grown. Consumer demand for frictionless payments has grown even faster. But how does technology reward the good guys, keep out the bad ones, and prevent cart abandonment and consumer frustration? Here are just a few examples of how some of the fintech companies at the conference propose to satisfy these competing priorities.

SMS—While one company proclaimed that SMS was designed for teenagers and never intended as a secure messaging channel, another proposed a three-factor authentication method, combining a PIN, Bluetooth communication, and facial recognition, that uses SMS to alert account holders to a possible fraud event in real time. Enhancing this technology is artificial intelligence that analyzes facial characteristics such as smiling or frowning.

Biometrics—Developers demonstrated numerous biometrics options, including those using unique, multifactor, non-gesture-based biometric characteristics such as the speed and pressure we use to swipe our mobile devices. Also demonstrated was the process of linking facial recognition to cards for both in-person and e-commerce purchases, as well as "liveness" tests that access the mobile phone's gyroscope to detect slight physical movements not present when a bot is involved. Another liveness test demonstrated was one in which people use their mobile devices to shoot videos of themselves reciting a number or performing randomized movements. Video content is then checked against identity verification documents, such as driver's license photos, that account holders used at setup. The developers noted that using video for liveness testing helps prevent fraudsters from using stolen photos or IDs in the authentication process.
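The gyroscope-based liveness check the developers described can be sketched in a few lines. This is an illustrative reconstruction, not any vendor's actual algorithm; the sample values and the threshold are invented for the example:

```python
import statistics

def passes_liveness(gyro_magnitudes, threshold=0.005):
    """Return True if gyroscope readings show the slight, irregular movement
    a human hand produces while holding a phone.

    gyro_magnitudes: angular-velocity magnitudes (rad/s) sampled during capture.
    threshold: minimum standard deviation to count as live (illustrative value).
    """
    if len(gyro_magnitudes) < 2:
        return False
    return statistics.stdev(gyro_magnitudes) > threshold

# A bot replaying a stolen photo on a mounted device is almost perfectly still...
replay_attack = [0.0001, 0.0001, 0.0002, 0.0001]
# ...while a person holding a phone is not.
live_capture = [0.01, 0.03, 0.02, 0.05, 0.01]
```

Real products would combine a check like this with the video and document comparison described above; motion variance alone is only one signal.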

Passwords—Some developers declared that behavioral biometrics would bring about the death of the password, and others offered services that search the corners of the dark web for compromised credentials. Companies presented solutions including a single, unique identification across all platforms and single-use passwords generated automatically at each login. One of the most interesting password technologies displayed involved the use of colors, emojis, numbers, and logos. This password system, which could be as short as four characters, uses a behind-the-scenes "end code," where the definition of individual password characters is unique to each company employing the technology, rendering the password useless in the event of a data breach.
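The "end code" idea can be shown with a toy sketch. Everything here is hypothetical, including the mapping tables; the point is only that the same four-character password decodes differently at each company, so a stolen copy is useless elsewhere:

```python
# Hypothetical per-company "end codes": each company privately defines what
# every visible password character (a color, emoji, number, or logo) means.
COMPANY_A = {"red": "x7", "car": "q2", "7": "m9", "star": "t4"}
COMPANY_B = {"red": "b1", "car": "z8", "7": "k3", "star": "w6"}

def encode(password, end_code):
    """Translate the characters a user picks through one company's private table."""
    return "".join(end_code[ch] for ch in password)

# The same short password the user chooses everywhere...
password = ["red", "car", "7", "star"]
# ...yields a different internal value at each company.
```

A real deployment would also hash and salt the encoded value; this sketch shows only the company-specific translation step.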

As I sat in the audience fascinated by so many of the demos, I wished I could go to my app store to download and use some of these technologies right away; the perceived security and convenience, combined with ease of use, tugged at the early adopter in me. Alas, most are white-labeled solutions to be deployed by financial institutions, card networks, and merchant acquirers rather than offered for direct consumer use. But I am buoyed by the fact that so many solutions are abiding by the words of Ben Franklin and seek to apply an ounce of prevention.

Photo of Nancy Donahue  By Nancy Donahue, project manager in the Retail Payments Risk Forum at the Atlanta Fed

 

October 15, 2018 in biometrics, cards, cybersecurity, emerging payments, fintech, innovation

October 1, 2018


Safeguarding Things When They’re All Connected

In a July 6 post, I discussed the explosive growth of internet-of-things (IoT) devices in the consumer market. I expressed my concerns about how poor security practices with those devices could allow criminals to use them as gateways for fraudulent activity. At a recent technology event for Atlanta Fed employees, Ian Perry-Okpara of the Atlanta Fed’s Information Security Department led an information session on better ways to safeguard IoT devices against unauthorized access and usage. Ian and I have collaborated on the following suggestions to help you secure your IoT devices.

Prepurchase

  • Visit the manufacturer's website and get specific product information regarding security and privacy features. Is encryption being used and, if so, what level? What data is being collected, where and how long is it being stored, and is it shared with any other party? Does the product have firmware that you can update? Does it have a changeable password? (You should avoid devices that cannot receive updates or have their passwords changed.) What IoT standards have been adopted?
  • Check with reliable product review sites to see what others have to say about the product’s security features.
  • If your home network router supports a secondary "guest" network, create one for your IoT devices to separate them from your more secure devices such as desktop and laptop computers and printers.

Postpurchase

  • If the device supports it, immediately perform a factory reset in case someone has modified the settings. This is especially important if the device is used or refurbished or was a display model.
  • Download the most recent firmware available for the device. Newer firmware often becomes available while a device sits in the merchant's inventory.
  • Use strong password techniques and change the user ID and password from the factory settings. Use different passwords for each one of your IoT devices.
  • Register your device with the manufacturer to be notified of security updates or recalls.
  • Add the device to your separate network if available.
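The "different passwords for each device" advice is easy to automate. Here is a minimal sketch using Python's standard library; the device names are placeholders:

```python
import secrets
import string

def device_password(length=16):
    """Generate one strong random password; call once per IoT device and
    store the result in a password manager."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One distinct credential per device on the network:
credentials = {name: device_password() for name in ("thermostat", "camera", "doorbell")}
```

The `secrets` module draws from the operating system's cryptographic random source, which is the right tool for credentials (unlike the `random` module, which is not).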

If you adopt these suggestions, you will have a more secure IoT network that minimizes your risk of attack. Criminals will find it much harder to take over your IoT devices for bot attacks or to use them as gateways to other devices on your home network. You do not want criminals getting hold of personal information such as the credentials to your financial services applications.

We hope this information will be helpful. If you have other suggestions to better secure your IoT devices, we certainly would like to hear from you.

Photo of Ian Perry-Okpara  By Ian Perry-Okpara, an information security architect in the Information Security Department at the Atlanta Fed

 

Photo of David Lott  By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

 

October 1, 2018 in account takeovers, cybercrime, cybersecurity, data security, identity theft, innovation

September 24, 2018


Racing Ahead in the Wireless Space

This past Sunday, Eliud Kipchoge smashed the marathon world record at the Berlin Marathon with a time of 2:01:39, shaving 1 minute 18 seconds off the previous world record. Though some running experts claim a marathon under two hours will never happen, I think elite runners will continue to push the speed envelope, and we will witness a sub-two-hour marathon one day.

The marathon isn’t the only area where the speed envelope is being pushed. Another area, and the focus of today’s blog, is in the wireless space.

The first commercial 3G network in the United States launched in 2002. 3G made it possible for our phones to run applications using a global positioning system (GPS) or videoconferencing, among other things. The second half of 2010 marked the first commercial 4G launches in the United States, with many of the mobile network operators rolling out the service. 4G expanded on the speed of 3G and made it possible for consumers to access the web with their mobile devices, stream high-definition video, and connect internet-of-things devices.

Now, as we approach the fourth quarter of 2018, we are on the cusp of 5G networks, which will be 10 times as fast as our 4G networks. According to a recent Wall Street Journal article on 5G that sparked my interest in the topic, the speed of 5G networks will allow the proliferation of applications such as self-driving cars, virtual reality, and remote surgery. That got me thinking: what impact will 5G have on the future of commerce, payments, and security?

I haven’t spent any time researching that last question, but no doubt 5G networks will introduce significant benefits and risks into retail payments. I can draw inspiration from one of my favorite cartoons, The Jetsons, and think ahead to what a Jetson house might look like in 2025: one filled with connected devices that communicate not only with us but also with each other. Close your eyes and imagine a house with a robotic vacuum that tells a virtual home assistant when it needs new bags, with zero human interaction in the process. Or imagine a vehicle that drives itself to the nearest gas station when the low-fuel light comes on. Undoubtedly, this new faster-speed wireless world will create security threats that we have yet to face.

So as we at the Risk Forum think about the possibilities and new risks of a 5G world and its impact on commerce, payments, and security, what should we be paying attention to?

Photo of Douglas King By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

September 24, 2018 in data security, emerging payments, innovation

July 6, 2018


Attack of the Smart Refrigerator

We've all heard about refrigerators that automatically order groceries when they sense the current supply is running low or out. These smart refrigerators are what people usually point to when giving an example of an "internet-of-things" (IoT) device. Briefly, an IoT device is a physical device wirelessly connected to the internet that transmits data, sometimes without direct human interaction. I suspect most of you already have at least one of these devices operating in your home or office, whether it's a wireless router, a baby monitor, a voice-activated assistant, or "smart" lights, thermostats, security systems, or TVs.

Experts are forecasting that IoT device manufacturing will be one of the fastest-growing industries over the next decade. Gartner estimates there were more than 8 billion connected IoT devices globally in 2017, with about $2 trillion spent on IoT endpoints and services, and projects that the number will exceed 20 billion by 2020. But what security are manufacturers building into these devices to prevent monitoring or outside manipulation? What prevents someone from hacking into your security system to monitor the patterns of your house or office, or turning on your interior security cameras and invading your privacy? For devices that can generate financial transactions, what authentication processes will ensure that transactions are legitimate? It's one kind of mistake to order an unneeded gallon of milk, but another one entirely for that connection to be used to access a home computer, monitor one's online banking activity, and capture log-on credentials.

As one would probably suspect, there is no simple or consistent answer to these security questions, and the overall track record of device security has not been great. There have been major distributed denial-of-service (DDoS) attacks against websites using botnets composed of millions of IoT devices. Ransomware attacks have been made against consumers' home security systems and thermostats, forcing consumers to pay the extortionist to get their systems working again.

Some high-end devices, such as driverless cars and medical devices, have been designed with security controls at the forefront, but most other manufacturers have given little thought to a criminal's ability to use a device to access and control other devices running on the same network. Adding to the problem, many of these devices do not get software updates, including security patches.

With cybersecurity issues grabbing so many headlines, people are paying more and more attention to the role and impact of IoT devices. The National Institute of Standards and Technology (NIST) has begun efforts to develop cryptography standards that can operate within the constraints of IoT devices. However, NIST estimates it will take two to four years to get the standard out.

In the meantime, the Department of Justice has some recommendations for securing IoT devices, including:

  • Research your device to determine security features. Does it have a changeable password? Does the manufacturer deliver security updates?
  • After you purchase a device and before you install it, download security updates and reset any default passwords.
  • If automatic updates are not provided to registered users, check at least monthly to determine if there are updates and download only from reputable sites.
  • Protect your routers and home Wi-Fi networks with firewalls, strong passwords, and security keys.

I see IoT device security as an issue that will continue to grow in importance. In a future post, I will discuss the privacy issues that IoT devices could create.

Photo of David Lott By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

July 6, 2018 in consumer fraud, cybercrime, cybersecurity, fraud, identity theft, innovation, online banking fraud, privacy

April 9, 2018


Fintech for Financial Wellness

When you hear the term fintech, you might free-associate "blockchain" or "APIs" or "machine learning." Chances are "financial opportunities and capabilities for all" might not be the first topic to spring to mind. Recently, I've been learning about the vast ecosystem of fintech entrepreneurs seeking to improve what the Center for Financial Services Innovation calls "financial health"—that is, our financial resiliency in the face of adversity, ability to take advantage of opportunities, and ability to manage day-to-day finances.

Consumer-focused fintech projects ask the question: Can we use data to improve financial wellness for individuals?

Some of these projects are directed toward specific groups. There are apps to help SNAP (Supplemental Nutrition Assistance Program) recipients manage benefits, enable immigrants to import their credit history from their home countries into U.S. credit reporting tools, and teach recent college grads about financial decisions such as paying off student loans or signing up for employer-sponsored retirement accounts.

Some can help you to:

  • Analyze your cash flows over the course of the month and tell you how much you could save.
  • Save or invest when you make purchases by automatically rounding up and putting your change into an account.
  • Analyze your accounts to identify peaks and valleys in your income and help you smooth it out.
  • Know when you have enough money to pay a particular bill and let you pay it by swiping your finger.
  • Link saving to opportunities to win prizes by incorporating lotteries.
  • Know, via text message, if you're likely to overdraw your account in the next few days.
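The round-up idea in the second bullet above is simple enough to sketch in a few lines. The purchase amounts are invented for illustration:

```python
import math

def round_up_savings(purchases):
    """Sum the spare change from rounding each purchase up to the next whole
    dollar, as round-up savings apps do before sweeping it into savings."""
    return round(sum(math.ceil(amount) - amount for amount in purchases), 2)

# A few card purchases: round-ups of $0.75, $0.20, $0.90, and $0.01.
monthly_purchases = [4.25, 12.80, 3.10, 7.99]
```

Note that in this sketch a whole-dollar purchase contributes nothing; some real apps instead round those up by a full dollar to keep savings flowing.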

Recent research finds that these sorts of interventions can be effective. For example, in "Preventing Over-Spending: Increasing Salience of Credit Card Payments through Smartphone Interventions," the authors find that people who use an app that suggests weekly savings goals significantly reduce their expenditures. This trial took place with a small sample of Swiss credit card users. As part of the experiment, participants reviewed and classified every credit card transaction, thus making every payment more visible to them. On average, participants reduced their weekly spending by about 14 percent.

Of course, not only entrepreneurs but also economists, policymakers, and traditional institutions appreciate the importance of financial education. Increasing financial literacy makes for a stronger economy, and financial education is an important part of what the Fed does. Just last week, Atlanta Fed president and CEO Raphael Bostic spoke about the importance of financial literacy. You can read his remarks here.

If you, too, care about improving financial wellness for everyone and want to learn more, please reach out to share information and ideas.

Photo of Claire Greene By Claire Greene, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 9, 2018 in fintech, innovation

April 2, 2018


Advice to Fintechs: Focus on Privacy and Security from Day 1

Fintech continues to have its moment. In one week in early March, I attended three Boston-area meetings devoted to new ideas built around the blockchain, open banking APIs, and apps for every conceivable wrinkle in personal financial management.

"Disruptive" was the vocabulary word of the week.

But no matter how innovative, disruptive technology happens within an existing framework of consumer protection practices and laws. Financial products and tools—whether a robo financial adviser seeking to consolidate your investment information or a traditional checking account at a financial institution—are subject to laws and regulations that protect consumers. As an attorney speaking at one of the Boston meetings put it, "The words 'unfair,' 'deceptive,' and 'misleading' keep me up at night."

A failure to understand the regulatory framework can play out in various ways. For example, in a recent survey of New York financial institutions (FIs) by the Fintech Innovation Lab, 60 percent of respondents reported that regulatory, compliance, or security issues made it impossible to move fintech proposals into proof-of-concept testing. Great ideas, but inadequate infrastructure.

To cite another example, in 2016, the Consumer Financial Protection Bureau took action against one firm for misrepresenting its data security practices. And just last month, the Federal Trade Commission (FTC) reached a settlement with another firm over allegations that the firm had inadequately disclosed both restrictions on funds availability for transfer to external bank accounts and consumers' ability to control the privacy of their transactions. Announcing the settlement, acting FTC chairman Maureen Ohlhausen pointed out that it sent a strong message of the "need to focus on privacy and security from day one."

As Ohlhausen made clear, whoever the disrupter—traditional financial institution or garage-based startup—consumer protection should be baked in from the start. At the Boston meetings, a number of entrepreneurs advocated a proactive stance for working with regulators and urged that new businesses bring in compliance expertise early in product design. Good advice, not only for disrupters but also for innovation labs housed in FIs, FIs adopting third-party technology, and traditional product design.

Photo of Claire Greene By Claire Greene, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 2, 2018 in innovation, regulations, regulators

February 5, 2018


Elements of an Ethical Data Policy

In my last post, I introduced the issue of ethical considerations around data collection and analytics. I asked if, along with the significant expansion of technical capabilities over the last five years, there had been a corresponding increase in awareness and advancement of the ethical issues of big data, such as privacy, confidentiality, transparency, protection, and ownership. In this post, I focus on the disclosures provided to users and suggest some elements concerning ethics to include in them.

The complexities of ethics policies

As I've researched the topic of big data, I've come to realize that developing a universal ethics policy will be difficult because of the diversity of data that's collected and the many uses for this data—in the areas of finance, marketing, medicine, law enforcement, social science, and politics, to name just a few.

Privacy and data usage policies are often disclosed to users signing up for particular applications, products, or services. My experience has been that the details about the data being collected are hidden in the customer agreement. Normally, the agreement offers no "opt-out" of any specific elements, so users must either decline the service altogether or begrudgingly accept the conditions wholesale.

But what about the databases that are part of public records? Often these records are created without any direct communication with the affected individuals. Did you know that in most states, county-level property records are available online to anyone? You can look up property ownership by name or address and find the property's sales history, including prices, square footage, number of bedrooms and baths, often a floor plan, and even the name of the mortgage company. All of this is useful for performing a pricing analysis of comparable properties, but it is also useful to a criminal who combines it with other socially engineered information for an account takeover or new-account fraud attempt. Doesn't it seem reasonable that I should be notified, or at least be able to see a record of it, when someone makes such an inquiry on my own property?

Addressing issues in the disclosure

Often, particularly with financial instruments and medical information, those collecting data must comply with regulations that require specific disclosures and ways to handle the data. The following elements together can serve as a good benchmark in the development of an ethical data policy disclosure:

  • Type of data collected and usage. What type of data are being collected and how will that data be used? Will the data be retained at the individual level or aggregated, thereby preventing identification of individuals? Can the data be sold to third parties?
  • Accuracy. Can an individual review the data and submit corrections?
  • Protection. Are people notified how their data will be protected, at least in general terms, from unauthorized access? Are they told how they will be notified if there is a breach?
  • Public versus private system. Is it a private system that usually restricts access, or a public system that usually allows broad access?
  • Open versus closed. Is it a closed system, which prevents sharing, or is it open? If it's open, how will the information be shared, at what level, and with whom? An example of an open system is one that collects information for a governmental background check and potentially shares that information with other governmental or law enforcement agencies.
  • Optional versus mandatory. Can individuals decline participation in the data collection, or decline specific elements? Or is the individual required to participate such that refusal results in some sort of punitive action?
  • Fixed versus indefinite duration. Will the captured data be deleted or destroyed on a timetable or in response to an event—for example, two years after an account is closed? Or will it be retained indefinitely?
  • Data ownership. Do individuals own and control their own data? Biometric data stored on a mobile phone, for example, are not also stored on a central storage site. On the other hand, institutions may retain ownership. Few programs place ownership with the user, although legal rights governing how the data can be used may be established by agreement.

What elements have I missed? Do you have anything to suggest?

In my next post, I will discuss appropriate guiding principles for those circumstances when individuals have no direct interaction with the collection effort.

Photo of David Lott By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


February 5, 2018 in consumer protection, innovation, regulations

January 29, 2018


Big Data, Big Dilemma

Five years ago, I authored a post discussing the advantages and pitfalls of "big data." Since then, data analytics has come to the forefront of computer science, with data analyst being among the most sought-after talents across many industries. One of my nephews, a month out of college (graduating with honors with a dual degree in computer science and statistics) was hired by a rail transportation carrier to work on freight movement efficiency using data analytics—with a starting salary of more than $100,000.

Big data, machine learning, deep learning, artificial intelligence—these are terms we constantly see and hear in technology articles, webinars, and conferences. Some of this usage is marketing hype, but clearly the significant increases in computing power at lower costs have empowered a continued expansion in data analytical capability across a wide range of businesses including consumer products and marketing, financial services, and health care. But along with this expansion of technical capability, has there been a corresponding heightened awareness of the ethical issues of big data? Have we fully considered issues such as privacy, confidentiality, transparency, and ownership?

In 2014, the Executive Office of the President issued a report on big data privacy issues. The report was prefaced with a letter that included this caution:

Big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans' relationship with data should expand, not diminish, their opportunities and potential.

(The report was updated in May 2016.)

In the European Union, the General Data Protection Regulation, adopted in 2016 and enforceable beginning in May 2018, gives citizens of the European Union (EU) significant control over their personal data, including control over the exportation of that data outside the EU. Although numerous cybersecurity bills addressing data collection and protection have been proposed in the U.S. Congress (see Doug King’s 2015 post), nothing has been passed to date despite the continuing announcements of data breaches. We have to go all the way back to the Privacy Act of 1974 for federal privacy legislation (other than constitutional rights), and that act dealt only with the collection and usage of data on individuals by federal agencies.

In a future blog post, I will give my perspective on what I believe to be the critical elements in developing a data collection and usage policy that addresses ethical issues in both overt and covert programs. In the interim, I would like to hear from you as to your perspective on this topic.

Photo of David Lott By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

January 29, 2018 in consumer protection, innovation, regulations

December 11, 2017


Fintechs and the Psychology of Trust

In the 14th century, Chaucer used the word trust to mean "virtual certainty and well-grounded hope." Since then, psychologists have described trust as an essential ingredient for social functioning, which, in turn, affects many economic variables. So how do we define trust in the 21st century, in the age of the internet? In particular, how do fintechs, relative newcomers in the financial services industry and not yet coalesced into an industry, gain the trust of the public? Would they more effectively gain that trust by relying on banks to hold them to certain standards, or by coming together to create their own?

In 2004, social psychologists Hans-Werner Bierhoff and Bernd Vornefeld, in "The Social Psychology of Trust with Applications in the Internet," wrote about trust in relation to technology and systems. They observed that "trust and risk are complementary terms. Risk is generally based on mistrust, whereas trust is associated with less doubts about security." They further explained that trust in technology and systems is based on whether an individual believes the system's security is guaranteed. Psychologically speaking, when companies show customers they care about the security of their information, customers have increased confidence in the company and the overall system. Understanding this provides insight into the development of certification authorities, third-party verification processes, and standardized levels of security.

To understand how fintechs might gain the trust of consumers and the financial industry, it's worth taking a step back, to look at how traditional financial services, before the internet and fintechs, used principles similar to those outlined by Bierhoff and Vornefeld. Take, for example, the following list of efforts the industry has taken to garner trust (this list is by no means comprehensive):

  • FDIC-insured depository institutions must advertise FDIC membership.
  • All financial institutions (FI) must undergo regulator supervision and examination.
  • FIs must obtain USA PATRIOT Act certifications from any foreign banks with which they maintain correspondent accounts.
  • Organizations with payment card data must comply with the PCI Standards Council's security standards and audit requirements.
  • Organizations processing ACH payments may join NACHA, but members or not, they must follow the NACHA Operating Rules and undergo annual audits and risk assessments.
  • The Accredited Standards Committee X9 Financial Industry Standards Inc. has developed international as well as domestic standards for FIs.
  • The International Organization for Standardization has also developed international standards for financial services.
  • The American National Standards Institute provides membership options and develops standards and accreditation for financial services.

FIs have often been an integral part of the standards creation process. To the extent that these standards and requirements also affect fintechs, shouldn't fintechs also have a seat at the table? In addition, regulatory agencies have given us an overarching "virtual certainty" that FIs are adhering to the agreed-upon standards. Who will provide that oversight, and that virtual certainty, for the fintechs?

The issue of privacy further adds to the confusion surrounding fintechs. The Gramm-Leach-Bliley Act (GLBA) of 1999 requires companies defined under the law as "financial institutions" to ensure the security and confidentiality of customer information. Further, the Federal Trade Commission's (FTC) Safeguards Rule requires FIs to have measures in place to keep customer information secure and to comply with certain limitations on the disclosure of nonpublic personal information. It's not clear that the GLBA's and the FTC's definitions of "financial institution" include fintechs.

So, how will new entrants to financial services build trust? Will fintechs adopt the same standards, certifications, and verifications so they can influence assessments of risk versus security? What oversight will provide overarching virtual certainty that new systems are secure? And in the case of privacy, will fintechs identify themselves as FIs under the law? Or will it be up to a fintech's partnering financial institution to supervise compliance? As fintechs continue to blaze new trails, we will need clear directives as to which existing trust guarantees (certifications, verifications, and standards) apply to them and who will enforce those expectations.

As Bierhoff and Vornefeld conclude, "it is an empirical question how the balance between trust and distrust relates to successful use of the Internet." Although Chaucer was born a little too soon for internet access, he might agree.

By Jessica Washington, AAP, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


December 11, 2017 in banks and banking, financial services, innovation, mobile banking | Permalink


December 4, 2017


What Will the Fintech Regulatory Environment Look Like in 2018?

As we prepare to put a bow on 2017 and begin to look forward to 2018, I can’t help but observe that fintech was one of the bigger topics in the banking and payments communities this year. (Be sure to sign up for our December 14 Talk About Payments webinar to see if fintech made our top 10 newsworthy list for 2017.) Many industry observers would likely agree that it will continue to garner a lot of attention in the upcoming year, as financial institutions (FIs) continue to partner with fintech companies to deliver client-friendly solutions.

No doubt, fintech solutions are making our daily lives easier, whether they are helping us deposit a check with our mobile phones or activating fund transfers with a voice command in a mobile banking application. But at what cost to consumers? To date, the direct costs, such as fees, have been minimal. However, are there hidden costs such as the loss of data privacy that could potentially have negative consequences for not only consumers but also FIs? And what, from a regulatory perspective, is being done to mitigate these potential negative consequences?

Early in the year, there was a splash in the regulatory environment for fintechs: the Office of the Comptroller of the Currency (OCC) moved to offer special-purpose bank charters to fintech companies. This charter became the subject of heated debates and discussions, and even lawsuits by the Conference of State Bank Supervisors and the New York Department of Financial Services. To date, the OCC has not formally begun accepting applications for the charter.

So where will the fintech regulatory environment take us in 2018?

Will it continue to be up to the FIs to perform due diligence on fintech companies, much as they do for third-party service providers? Will regulatory agencies offer FIs additional guidance or due diligence frameworks for fintechs, over and above what they provide for traditional third-party service providers? Will one of the regulatory agencies decide that the role of fintech companies in financial services is becoming so important that the companies should be subject to examination, just as financial institutions are? Finally, will U.S. regulatory agencies create sandboxes that allow fintechs and FIs to launch products on a limited scale, as has happened in the United Kingdom and Australia?

The Risk Forum will continue to closely monitor the fintech industry in 2018. We would enjoy hearing from our readers about how they see the regulatory environment for fintechs evolving.

By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


December 4, 2017 in banks and banking, financial services, innovation, mobile banking, regulations, regulators, third-party service provider | Permalink

