About


Take On Payments, a blog sponsored by the Retail Payments Risk Forum of the Federal Reserve Bank of Atlanta, is intended to foster dialogue on emerging risks in retail payment systems and enhance collaborative efforts to improve risk detection and mitigation. We encourage your active participation in Take On Payments and look forward to collaborating with you.

Take On Payments

April 9, 2018


Fintech for Financial Wellness

When you hear the term fintech, you might free-associate "blockchain" or "APIs" or "machine learning." Chances are "financial opportunities and capabilities for all" might not be the first topic to spring to mind. Recently, I've been learning about the vast ecosystem of fintech entrepreneurs seeking to improve what the Center for Financial Services Innovation calls "financial health"—that is, our financial resiliency in the face of adversity, ability to take advantage of opportunities, and ability to manage day-to-day finances.

Consumer-focused fintech projects ask the question: Can we use data to improve financial wellness for individuals?

Some of these projects are directed toward specific groups. There are apps to help SNAP (Supplemental Nutrition Assistance Program) recipients manage benefits, enable immigrants to import their credit history from their home countries into U.S. credit reporting tools, and teach recent college grads about financial decisions such as paying off student loans or signing up for employer-sponsored retirement accounts.

Some can help you to:

  • Analyze your cash flows over the course of the month and tell you how much you could save.
  • Save or invest when you make purchases by automatically rounding up and putting your change into an account.
  • Analyze your accounts to identify peaks and valleys in your income and help you smooth it out.
  • Know when you have enough money to pay a particular bill and let you pay it by swiping your finger.
  • Link saving to opportunities to win prizes by incorporating lotteries.
  • Know, via text message, if you're likely to overdraw your account in the next few days.
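
The round-up mechanic in the list above is simple enough to sketch. The following is a minimal illustration of the idea, not any particular app's implementation; working in integer cents avoids floating-point rounding surprises.

```python
def round_up_cents(price_cents):
    # Change needed to round a purchase up to the next whole dollar;
    # a purchase of an exact dollar amount transfers nothing.
    return (-price_cents) % 100

def total_saved(purchases_cents):
    # Sum the spare change swept into savings across a batch of purchases.
    return sum(round_up_cents(p) for p in purchases_cents)

# Three purchases: $3.75, $12.50, $4.99 -> 25 + 50 + 1 = 76 cents saved
print(total_saved([375, 1250, 499]))
```

The per-purchase amounts are tiny, but they accumulate without requiring any decision from the saver, which is the behavioral point.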

Recent research finds that these sorts of interventions can be effective. For example, in "Preventing Over-Spending: Increasing Salience of Credit Card Payments through Smartphone Interventions," the authors find that people who use an app that suggests weekly savings goals significantly reduce their expenditures. This trial took place with a small sample of Swiss credit card users. As part of the experiment, participants reviewed and classified every credit card transaction, thus making every payment more visible to them. On average, participants reduced their weekly spending by about 14 percent.

Of course, not only entrepreneurs but also economists, policymakers, and traditional institutions appreciate the importance of financial education. Increasing financial literacy makes for a stronger economy, and financial education is an important part of what the Fed does. Just last week, Atlanta Fed president and CEO Raphael Bostic spoke about the importance of financial literacy. You can read his remarks here.

If you, too, care about improving financial wellness for everyone and want to learn more, please reach out to share information and ideas.

By Claire Greene, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 9, 2018 in fintech, innovation | Permalink

Comments

Post a comment

Comments are moderated and will not appear until the moderator has approved them.

If you have a TypeKey or TypePad account, please Sign in

April 2, 2018


Advice to Fintechs: Focus on Privacy and Security from Day 1

Fintech continues to have its moment. In one week in early March, I attended three Boston-area meetings devoted to new ideas built around the blockchain, open banking APIs, and apps for every conceivable wrinkle in personal financial management.

"Disruptive" was the vocabulary word of the week.

But no matter how innovative, disruptive technology happens within an existing framework of consumer protection practices and laws. Financial products and tools—whether a robo-adviser seeking to consolidate your investment information or a traditional checking account at a financial institution—are subject to laws and regulations that protect consumers. As an attorney speaking at one of the Boston meetings put it, "The words 'unfair,' 'deceptive,' and 'misleading' keep me up at night."

A failure to understand the regulatory framework can play out in various ways. For example, in a recent survey of New York financial institutions (FIs) by the Fintech Innovation Lab, 60 percent of respondents reported that regulatory, compliance, or security issues made it impossible to move fintech proposals into proof-of-concept testing. Great ideas, but inadequate infrastructure.

To cite another example, in 2016, the Consumer Financial Protection Bureau took action against one firm for misrepresenting its data security practices. And just last month, the Federal Trade Commission (FTC) reached a settlement with another firm over allegations that the firm had inadequately disclosed both restrictions on funds availability for transfer to external bank accounts and consumers' ability to control the privacy of their transactions. Announcing the settlement, acting FTC chairman Maureen Ohlhausen pointed out that it sent a strong message of the "need to focus on privacy and security from day one."

As Ohlhausen made clear, whoever the disrupter—traditional financial institution or garage-based startup—consumer protection should be baked in from the start. At the Boston meetings, a number of entrepreneurs advocated a proactive stance for working with regulators and urged that new businesses bring in compliance expertise early in product design. Good advice, not only for disrupters but also for innovation labs housed in FIs, FIs adopting third-party technology, and traditional product design.

By Claire Greene, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 2, 2018 in innovation, regulations, regulators | Permalink


February 5, 2018


Elements of an Ethical Data Policy

In my last post, I introduced the issue of ethical considerations around data collection and analytics. I asked if, along with the significant expansion of technical capabilities over the last five years, there had been a corresponding increase in awareness and advancement of the ethical issues of big data, such as privacy, confidentiality, transparency, protection, and ownership. In this post, I focus on the disclosures provided to users and suggest some elements concerning ethics to include in them.

The complexities of ethics policies

As I've researched the topic of big data, I've come to realize that developing a universal ethics policy will be difficult because of the diversity of data that's collected and the many uses for this data—in the areas of finance, marketing, medicine, law enforcement, social science, and politics, to name just a few.

Privacy and data usage policies are often disclosed to users signing up for particular applications, products, or services. My experience has been that the details about the data being collected are hidden in the customer agreement. Normally, the agreement offers no "opt-out" of any specific elements, so users must either decline the service altogether or begrudgingly accept the conditions wholesale.

But what about the databases that are part of public records? Often these records are created without any direct communication with the affected individuals. Did you know that in most states, property records at the county level are available online to anyone? You can look up property ownership by name or address and find the sales history of the property, including prices, square footage, number of bedrooms and baths, often a floor plan, and even the name of the mortgage company. All of that is useful for pricing comparable properties—but it is also useful to a criminal who combines it with other socially engineered information for an account takeover or new-account fraud attempt. Doesn't it seem reasonable that I should be notified, or at least be able to see a record, when someone makes such an inquiry on my own property record?

Addressing issues in the disclosure

Often, particularly with financial instruments and medical information, those collecting data must comply with regulations that require specific disclosures and ways to handle the data. The following elements together can serve as a good benchmark in the development of an ethical data policy disclosure:

  • Type of data collected and usage. What type of data are being collected and how will that data be used? Will the data be retained at the individual level or aggregated, thereby preventing identification of individuals? Can the data be sold to third parties?
  • Accuracy. Can an individual review the data and submit corrections?
  • Protection. Are people notified how their data will be protected, at least in general terms, from unauthorized access? Are they told how they will be notified if there is a breach?
  • Public versus private system. Is it a private system that usually restricts access, or a public system that usually allows broad access?
  • Open versus closed. Is it a closed system, which prevents sharing, or is it open? If it's open, how will the information be shared, at what level, and with whom? An example of an open system is one that collects information for a governmental background check and potentially shares that information with other governmental or law enforcement agencies.
  • Optional versus mandatory. Can individuals decline participation in the data collection, or decline specific elements? Or is the individual required to participate such that refusal results in some sort of punitive action?
  • Fixed versus indefinite duration. Will the captured data be deleted or destroyed on a timetable or in response to an event—for example, two years after an account is closed? Or will it be retained indefinitely?
  • Data ownership. Do individuals own and control their own data? Biometric data stored on a mobile phone, for example, are not also stored on a central site. On the other hand, institutions may retain ownership. Few programs place ownership with the user, although agreements may establish legal rights governing how the data can be used.
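
One way to put the checklist above to work is to treat each element as a required field in a draft disclosure and flag whatever the draft fails to address. The sketch below is purely illustrative; the field names are mine, not drawn from any regulation or standard.

```python
# Checklist elements from the post, encoded as hypothetical field names.
REQUIRED_ELEMENTS = {
    "type_of_data_and_usage",
    "accuracy",
    "protection",
    "public_vs_private",
    "open_vs_closed",
    "optional_vs_mandatory",
    "duration",
    "data_ownership",
}

def missing_elements(disclosure):
    """Return the checklist elements a draft disclosure does not address."""
    return REQUIRED_ELEMENTS - set(disclosure)

# A draft disclosure that covers only three of the eight elements.
draft = {
    "type_of_data_and_usage": "Transaction data, aggregated before any resale.",
    "accuracy": "Users may review their data and submit corrections.",
    "duration": "Deleted two years after the account is closed.",
}
print(sorted(missing_elements(draft)))
```

A review like this does not judge whether each answer is ethical, of course; it only guarantees that the disclosure takes a position on every element.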

What elements have I missed? Do you have anything to suggest?

In my next post, I will discuss appropriate guiding principles for those circumstances in which individuals have no direct interaction with the collection effort.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


February 5, 2018 in consumer protection, innovation, regulations | Permalink


January 29, 2018


Big Data, Big Dilemma

Five years ago, I authored a post discussing the advantages and pitfalls of "big data." Since then, data analytics has come to the forefront of computer science, and data analysts are among the most sought-after hires across many industries. One of my nephews, a month out of college (graduating with honors with a dual degree in computer science and statistics), was hired by a rail transportation carrier to work on freight movement efficiency using data analytics—with a starting salary of more than $100,000.

Big data, machine learning, deep learning, artificial intelligence—these are terms we constantly see and hear in technology articles, webinars, and conferences. Some of this usage is marketing hype, but clearly the significant increases in computing power at lower costs have empowered a continued expansion in data analytical capability across a wide range of businesses including consumer products and marketing, financial services, and health care. But along with this expansion of technical capability, has there been a corresponding heightened awareness of the ethical issues of big data? Have we fully considered issues such as privacy, confidentiality, transparency, and ownership?

In 2014, the Executive Office of the President issued a report on big data privacy issues. The report was prefaced with a letter that included this caution:

Big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans' relationship with data should expand, not diminish, their opportunities and potential.

(The report was updated in May 2016.)

In the European Union, the General Data Protection Regulation, adopted in 2016 and enforceable beginning in May 2018, gives citizens of the European Union (EU) significant control over their personal data, including control over the exportation of that data outside the EU. Although numerous cybersecurity bills—including some addressing data collection and protection—have been proposed in the U.S. Congress (see Doug King's 2015 post), none has passed to date despite the continuing announcements of data breaches. For federal privacy legislation (beyond constitutional protections), we have to go all the way back to the Privacy Act of 1974, and that act dealt only with the collection and use of data on individuals by federal agencies.

In a future blog post, I will give my perspective on what I believe to be the critical elements in developing a data collection and usage policy that addresses ethical issues in both overt and covert programs. In the interim, I would like to hear from you as to your perspective on this topic.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

January 29, 2018 in consumer protection, innovation, regulations | Permalink


December 11, 2017


Fintechs and the Psychology of Trust

In the 14th century, Chaucer used the word trust to mean "virtual certainty and well-grounded hope." Since then, psychologists have described trust as an essential ingredient for social functioning, which, in turn, affects many economic variables. So how do we define trust in the 21st century, in the age of the internet? In particular, how do fintechs, relative newcomers in the financial services industry and not yet coalesced into an industry, gain the trust of the public? Would they more effectively gain that trust by relying on banks to hold them to certain standards, or by coming together to create their own?

In 2004, social psychologists Hans-Werner Bierhoff and Bernd Vornefeld, in "The Social Psychology of Trust with Applications in the Internet," wrote about trust in relation to technology and systems. They observed that "trust and risk are complementary terms. Risk is generally based on mistrust, whereas trust is associated with less doubts about security." They further explained that trust in technology and systems is based on whether an individual believes the system's security is guaranteed. Psychologically speaking, when companies show customers they care about the security of their information, customers have increased confidence in the company and the overall system. Understanding this provides insight into the development of certification authorities, third-party verification processes, and standardized levels of security.

To understand how fintechs might gain the trust of consumers and the financial industry, it's worth taking a step back, to look at how traditional financial services, before the internet and fintechs, used principles similar to those outlined by Bierhoff and Vornefeld. Take, for example, the following list of efforts the industry has taken to garner trust (this list is by no means comprehensive):

  • FDIC-insured depository institutions must advertise FDIC membership.
  • All financial institutions (FIs) must undergo regulator supervision and examination.
  • FIs must obtain USA PATRIOT Act certifications from any foreign banks with which they maintain correspondent accounts.
  • Organizations with payment card data must comply with the PCI Standards Council's security standards and audit requirements.
  • Organizations processing ACH can have NACHA membership but must follow NACHA Operating Rules and undergo annual audits and risk assessments.
  • The Accredited Standards Committee X9 Financial Industry Standards Inc. has developed international as well as domestic standards for FIs.
  • The International Organization for Standardization has also developed international standards for financial services.
  • The American National Standards Institute provides membership options and develops standards and accreditation for financial services.

FIs have often been an integral part of the standards creation process. To the extent that these standards and requirements also affect fintechs, shouldn't fintechs also have a seat at the table? Moreover, regulatory agencies have given us an overarching "virtual certainty" that FIs are adhering to the agreed-upon standards. Who will provide that oversight—and virtual certainty—for the fintechs?

The issue of privacy further adds to the confusion surrounding fintechs. The Gramm-Leach-Bliley Act (GLBA) of 1999 requires companies defined under the law as "financial institutions" to ensure the security and confidentiality of customer information. Further, the Federal Trade Commission's (FTC) Safeguards Rule requires FIs to have measures in place to keep customer information secure, and to comply with certain limitations on disclosure of nonpublic personal information. It's not clear that the GLBA's and FTC's definition of "financial institution" includes fintechs.

So, how will new entrants to financial services build trust? Will fintechs adopt the same standards, certifications, and verifications so they can influence assessments of risk versus security? What oversight will provide overarching virtual certainty that new systems are secure? And in the case of privacy, will fintechs identify themselves as FIs under the law? Or will it be up to a fintech's partnering financial institution to supervise compliance? As fintechs continue to blaze new trails, we will need clear directives as to which existing trust guarantees (certifications, verifications, and standards) apply to them and who will enforce those expectations.

As Bierhoff and Vornefeld conclude, "it is an empirical question how the balance between trust and distrust relates to successful use of the Internet." Although Chaucer was born a little too soon for internet access, he might agree.

By Jessica Washington, AAP, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

December 11, 2017 in banks and banking, financial services, innovation, mobile banking | Permalink


December 4, 2017


What Will the Fintech Regulatory Environment Look Like in 2018?

As we prepare to put a bow on 2017 and begin to look forward to 2018, I can't help but observe that fintech was one of the bigger topics in the banking and payments communities this year. (Be sure to sign up for our December 14 Talk About Payments webinar to see if fintech made our top 10 newsworthy list for 2017.) Many industry observers would likely agree that it will continue to garner a lot of attention in the upcoming year, as financial institutions (FIs) will continue to partner with fintech companies to deliver client-friendly solutions.

No doubt, fintech solutions are making our daily lives easier, whether they are helping us deposit a check with our mobile phones or activating fund transfers with a voice command in a mobile banking application. But at what cost to consumers? To date, the direct costs, such as fees, have been minimal. However, are there hidden costs such as the loss of data privacy that could potentially have negative consequences for not only consumers but also FIs? And what, from a regulatory perspective, is being done to mitigate these potential negative consequences?

Early in the year, there was a splash in the regulatory environment for fintechs. The Office of the Comptroller of the Currency (OCC) began offering limited-purpose bank charters to fintech companies. This charter became the subject of heated debates and discussions—and even lawsuits, by the Conference of State Bank Supervisors and the New York Department of Financial Services. To date, the OCC has not formally begun accepting applications for this charter.

So where will the fintech regulatory environment take us in 2018?

Will it continue to be up to the FIs to perform due diligence on fintech companies, much as they do for third-party service providers? Will regulatory agencies offer FIs additional guidance or due diligence frameworks for fintechs, over and above what they do for traditional third-party service providers? Will one of the regulatory agencies decide that the role of fintech companies in financial services is becoming so important that the companies should be subject to examinations like financial institutions get? Finally, will U.S. regulatory agencies create sandboxes to allow fintechs and FIs to launch products on a limited scale, such as has taken place in the United Kingdom and Australia?

The Risk Forum will continue to closely monitor the fintech industry in 2018. We would enjoy hearing from our readers about how they see the regulatory environment for fintechs evolving.

By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

December 4, 2017 in banks and banking, financial services, innovation, mobile banking, regulations, regulators, third-party service provider | Permalink


November 27, 2017


How Intelligent Is Artificial Intelligence?

At the recent Money20/20 conference, sessions on artificial intelligence (AI), along with those on friction in regulatory and technological innovation, dominated the agenda. A number of panels highlighted the competitive advantages AI tools offer companies. It didn't matter whether the topic was consumer marketing, fraud prevention, or product development—AI was the buzzword. One speaker noted the social good that could come from such technology, pointing to the work of a Stanford research team trying to identify individuals with a strong likelihood of developing diabetes by running an automated review of photographic images of their eyes. Another panel discussed the privacy and ethical issues around the use of artificial intelligence.

But do any of these applications marketed as AI pass the now-famous test that Alan Turing proposed in 1950 as a criterion for true artificial intelligence? Turing is widely regarded as the father of computer science. His efforts during World War II led a cryptographic team to break the Enigma code used by the Germans, as featured in the 2014 movie The Imitation Game. Turing once said, "A computer would deserve to be called intelligent if it could deceive a human into believing that it was human." An annual competition held since 1991, the Loebner Prize, aims to award a solid 18-karat gold medal and a monetary prize of $100,000 to the first computer whose responses are indistinguishable from a real human's. To date, no one has received the gold medal, but every year a bronze medal and a smaller cash prize go to the "most humanlike" entrant.

Incidentally, many vendors seem to use artificial intelligence as a synonym for the terms deep learning and machine learning. Is this usage of AI mostly marketing hype for the neural network technology developed in the mid-1960s, now greatly improved thanks to the substantial increase in computing power? A 2016 Forbes article by Bernard Marr provides a good overview of the different terms and their applications.

My opinion is that none of the tools in the market today meet the threshold of true artificial intelligence based on Turing's criteria. That isn't to say the lack of this achievement should diminish the benefits that have already emerged and will continue to be generated in the future. Computing technology certainly has advanced to be able to handle complex mathematical and programmed instructions at a much faster rate than a human.

What are your thoughts?

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

November 27, 2017 in emerging payments, innovation, payments | Permalink


November 20, 2017


Webinar: Key Payment Events in 2017

This year has been an exciting one for the payments industry. Topics such as blockchain and distributed ledger, card-not-present fraud, and chip-card migration continued to be in the news, and new subjects such as behavioral biometrics and machine learning/artificial intelligence made their way into the spotlight.

In the past, the Retail Payments Risk Forum team has coauthored a year-end post identifying what we believed to have been the major payment events of the year. This year, we are doing something a little different and hope you will like the change. Taking advantage of our new webinar series, Talk About Payments, the RPRF team will share our perspectives in a roundtable discussion during a live webinar. We encourage financial institutions, retailers, payments processors, law enforcement, academia, and other payments system stakeholders to participate. Participants will be able to submit questions during the webinar.

The webinar will be held on Thursday, December 14, from 1 to 2 p.m. (ET). Participation in the webinar is complimentary, but you must register in advance. To register, click on the TAP webinar link. After you complete your registration, you will receive a confirmation email with all the log-in and toll-free call-in information. A recording of the webinar will be available to all registered participants in various formats within a couple of weeks.

We look forward to you joining us on December 14 and sharing your perspectives on the major payment events that took place in 2017.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

November 20, 2017 in banks and banking, biometrics, emerging payments, EMV, innovation | Permalink


November 13, 2017


The Future of Wearables

My wife and I took our children to a Florida theme park for their recent fall break. While I would love to spend the next few paragraphs opining on why I think our school calendar is crazy or giving a review of the most phenomenal ride that I have ever experienced, it doesn't really fit the mission or purpose of Take On Payments. Fortunately, the trip did provide some fodder and thought for a blog post, thanks to a much-discussed and written-about wearable NFC—or near-field-communication—device that the theme park offers.

These bands were introduced in 2013 to create an awesome customer experience. This experience is much bigger than a payment platform and has absolutely nothing to do with a rewards program around which so many mobile wallet and payment applications are being developed. The band's functionality certainly includes payments, but the device also replaces room keys, park entry cards, and ride-specific tickets known as fast passes. As an additional feature, it is waterproof, which proves handy for a trip to the water park. I was able to spend the week without ever having anything in my pockets (yes, I even left my phone in the room). My wife commented how fantastic it would be to take the NFC band experience outside of the park because it was just so easy and convenient.

Ease and convenience–isn't that what a lot of us are after? If you have to give me something to get me to open an application and tap my phone in place of a payment card, is that really providing ease and convenience? I am now 100 percent convinced that rewards programs aren't going to drive mobile commerce to any significant degree. Experiences that provide ease and convenience will drive mobile commerce. Hello, mobile order-ahead. Hello, grocery delivery. And hello, wearable of the future.

It isn't hard to imagine a wearable device, like an open-loop band, transforming our lives. After my theme park experience, I long for the day when a wearable will be the key to my vehicle—which I won't have to drive, either—and to my house, my communication device, and my payment device (or wallet). Of course, we'll have to consider the security issues. Even the bands incorporate PINs and fingerprint biometrics in some cases to ensure that the legitimate customer is the one wearing the band.

Is this day really so far-fetched? I can already order a pizza through a connected speaker, initiate a call from the driver's seat of my car without touching my phone, or tap my phone to pay for a hamburger. The more I think about these possibilities, the more I have to ask myself: is it crazy to wonder whether using mobile phones for payments might become obsolete before long? Or maybe mobile phones will provide that band functionality?

By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

November 13, 2017 in banks and banking, innovation | Permalink


November 6, 2017


My Fingertips, My Data

I am not a user of old-style financial services. While I remember learning how to balance a checkbook, I never had to do it, since I never had checks. Recently, my financial adviser suggested several mobile applications that could help me manage my finances in a way that made sense to me. I researched them, evaluated a few, and decided which one I thought would be the best. I'm always excited to try new apps, hopeful that this one will be the one that will simplify my life.

As I clicked through the process of opening an account with my new financial management app, I entered the name of my financial institution (FI), where I have several accounts: checking, savings, money market, and line of credit. The app identified my credit union (which has over $5 billion in assets and ranks among the top 25), and I entered my online banking credentials—and then I was brought up short. The app was asking for my routing and account number. As I said, I don't own any checks, and I don't know how to find this information on my credit union's mobile app. (I do know where to find it using an internet browser.) I stopped creating my account at this point and have yet to finish.

I later discovered that if I banked with one of the larger banks, for which custom APIs have been negotiated, I would not have been asked for a routing and account number. I would have simply entered my online login details, and I'd be managing my finances with my fingertips already. I started digging into why my credit union doesn't have full interoperability.

In the United States, banking is a closed system. APIs are built as custom integrations, and each financial institution must consent before third parties can access customer data. However, many FIs haven't been approached, or integration is bottlenecked at the core processor level: if the core processor denies access to customer data (which some do), the FI has no choice in the matter.

New Consumer Financial Protection Bureau (CFPB) guidance on data sharing and aggregation addresses the accessibility and ownership issue. The upshot of the CFPB's guidance is that consumers own their financial data and FIs should allow sharing of the data with third-party companies. But should doesn't equal will or can.

The CFPB guidance, though not a rule, is in the same vein as the European Union's PSD2 (the revised Payment Services Directive), under which FIs must provide access to account information with the consumer's permission. PSD2 represents an open banking approach: it standardizes APIs that banks can proactively make available to third parties for plug-and-play development.
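
To make the plug-and-play idea concrete, here is a toy sketch of what a third-party app might do with a standardized account-information payload once the consumer has granted access. The JSON shape and field names are hypothetical, not taken from PSD2 or any bank's actual API.

```python
import json

# Hypothetical response shaped loosely like an account-information payload
# a standardized open-banking API might return; field names are illustrative.
sample = json.loads("""
{
  "accounts": [
    {"id": "chk-001", "type": "checking", "balance_cents": 125000},
    {"id": "sav-001", "type": "savings",  "balance_cents": 540000}
  ]
}
""")

def balances_by_type(payload):
    # Aggregate consented account balances by type, as a budgeting app
    # might do across several institutions sharing one schema.
    totals = {}
    for acct in payload["accounts"]:
        totals[acct["type"]] = totals.get(acct["type"], 0) + acct["balance_cents"]
    return totals

print(balances_by_type(sample))
```

With a standardized schema, the same parsing code works against every participating bank; that uniformity, rather than any single endpoint, is what open banking buys developers.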

While open banking is a regulatory requirement in Europe, market competition is driving North American banks to be very interested in implementing open banking here. An Accenture survey recently found that 60 percent of North American banks already have an open banking strategy, compared to 74 percent of European banks.

It is no surprise that bankers are becoming more comfortable with the shift-in-ownership concept. FIs have been increasingly sharing their customers' data with third parties. Consumer data are what fuel organizations like credit agencies, payment fraud databases, identity and authentication solutions, and anomaly detection services, to name a few. As these ownership theories change, we will also need to see new approaches to security. What are your thoughts about open banking?

By Jessica Washington, AAP, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

November 6, 2017 in banks and banking, data security, emerging payments, innovation, mobile banking | Permalink

