About


Take On Payments, a blog sponsored by the Retail Payments Risk Forum of the Federal Reserve Bank of Atlanta, is intended to foster dialogue on emerging risks in retail payment systems and enhance collaborative efforts to improve risk detection and mitigation. We encourage your active participation in Take on Payments and look forward to collaborating with you.

Take On Payments

May 20, 2019


Could Federal Privacy Law Happen in 2019?

Some payments people have suggested that this could be the year for mobile payments to take off. My take? Nah. I gave up on that thought several years ago, as I've made clear in some of my previous posts. I'm actually wondering if this will be the year that federal privacy legislation is enacted in the United States. The effects of the European Union's General Data Protection Regulation (GDPR), which took effect a year ago (see this Take on Payments post), are being felt in the United States and across the globe. The GDPR has essentially created a global standard for how companies should protect citizens' personal data, along with rights for everyone to understand what data is being collected and how to opt out of that collection. While the GDPR technically applies only to EU citizens, even when they are traveling outside the European Union, most businesses have taken a cautious approach and are treating every transaction—financial or informational—that they process as something that could be covered under the GDPR.

A tangible impact of the GDPR in the United States is that the state of California has passed a data privacy law known as the California Consumer Privacy Act of 2018 (CCPA) that is partly patterned after the GDPR. The CCPA gives California residents five basic rights related to data privacy:

  • The right to know what personal information a business has collected about them, where it was obtained, how it is being used, and whether it is being disclosed or sold to other parties and, if so, to whom it is being disclosed or sold
  • The right to access that personal information free of charge up to two times within a 12-month period
  • The right to opt out of allowing a business to sell their personal information to third parties
  • The right to have a business delete their personal information, except for information that is required to effect a transaction or comply with other regulatory requirements
  • The right to receive equal service and pricing from a business, even if they have exercised their privacy rights under the CCPA

According to the National Conference of State Legislatures (NCSL), 17 states have mandated that their governmental websites and access portals state their privacy policies and procedures. Additionally, other states have laws addressing specific aspects of privacy, such as children's online privacy, the monitoring of employee email, and e-reader policies.

Take On Payments has previously discussed the numerous efforts to introduce federal legislation regarding privacy and data breach notification, efforts that have gained little traction. So why do I think change is in the air? The growing trend of states implementing privacy legislation is putting pressure on Congress to act so that businesses operating across state lines have a consistent national policy and process to understand and follow.

What do you think?

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


May 20, 2019 in data security, privacy, regulations | Permalink

Comments

Post a comment

Comments are moderated and will not appear until the moderator has approved them.

If you have a TypeKey or TypePad account, please Sign in

April 22, 2019


The Prepaid Rule: All Jokes Aside

A payments compliance rule took effect this year on April Fools' Day, and it occurred to me that when a compliance deadline is approaching, you might not feel like joking around. The Prepaid Accounts Final Rule was issued a few years ago, in 2016, but after a number of postponements, its effective date is finally behind us.

The rule standardizes disclosures, error resolution procedures, consumer liability limits, and access to records. These changes are intended to provide comprehensive consumer protections for prepaid accounts under the Electronic Fund Transfer Act and its implementing Regulation E. The rule is fairly comprehensive, but for the sake of brevity, I'm going to look at only a couple of areas of the rule—those that stand out to me.

Consumers can now expect protections over their transaction accounts regardless of whether the account is offered directly by a traditional financial institution or by a third party, such as a fintech or merchant, as they make electronic payments (debit, prepaid, ACH). Also, fintech companies that allow consumers to store funds, or are thinking about adding that ability, may want to prepare to be designated as prepaid services providers and therefore subject to the regulatory and licensing requirements that go along with that designation. To that point, I am not surprised to see several big names recently listed on the FinCEN Money Services Business Registration as "Providers of prepaid access." (To see the list, scroll down the web page to the MSB registration form; in the MSB ACTIVITIES field, click the down arrow to open the dropdown list; select Provider of prepaid access and click the Submit button.)

Established prepaid issuers have long been preparing for the new prepaid rule despite the stops and starts of an effective date and the uncertainty about some of its key provisions. Because consumers open prepaid accounts in a variety of ways—from starting a new job to purchasing prepaid cards at a retail checkout lane—it can be difficult to accommodate the disclosure requirements, such as those for listing fees, that the prepaid rule prescribes. Most issuers have changed product packaging to accommodate the new disclosures. These changes required complicated logistics coordination for the prepaid supply chain to replace old, noncompliant inventory with new, compliant card packages. Some issuers are still grappling with how to list types of fees that may not apply to their particular account program.

Many issuers had already been providing some level of consumer protection from unauthorized transactions before the rule requirement took effect. Now there will be a standard expectation. Limited liability and error resolution benefits need apply only to customers who have successfully completed the identification and verification process, if there is one for their particular program. Regulation E's error resolution and limited liability requirements do not extend to prepaid accounts (other than payroll or government benefit accounts) that have not completed the verification process, one of the key revisions after the rule's initial issue.

The rule will change the way we categorize prepaid services. For instance, in the past, discussion around prepaid products focused on whether the product was open- or closed-loop, and whether it was reloadable or nonreloadable. While those characteristics still exist, they no longer necessarily determine whether the rule applies to a particular product. There are clear exclusions for certain products, such as those marketed and labeled as gift cards, health care savings cards, or disaster relief cards. However, even if a product doesn't have "prepaid" on its label, it may still fall under Regulation E. Coverage extends to asset accounts that consumers can use to conduct transactions with multiple, unaffiliated merchants for goods or services, to pull cash from automated teller machines, or to make person-to-person transfers.

For both incumbents and those finding themselves new to prepaid, it has been no joke to prepare to comply with the new rule. Despite the extra burden, do you think we will look back on this milestone favorably in the future? I think the new prepaid rule will strengthen trust and confidence in these products. The Consumer Financial Protection Bureau (CFPB) pledges to be vigilant in evaluating new rules. Moreover, the CFPB is required to submit a formal evaluation five years after a rule's effective date. The industry should be ready to help measure the rule's impact.

By Jessica Washington, AAP, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 22, 2019 in fintech, prepaid, regulations | Permalink


February 11, 2019


AI and Privacy: Achieving Coexistence

In a post early last year, I raised the issue of privacy rights in the use of big data. After attending the AI (artificial intelligence) Summit in New York City in December, I believe it is necessary to expand that call to the wider spectrum of technology under the banner of AI, including machine learning. There is no question that increased computing power, reduced costs, and improved developer skills have made machine learning programs more affordable and powerful. As discussed at the conference, the various facets of AI technology have reached far beyond financial services and fraud detection into numerous aspects of our lives, including product marketing, health care, and public safety.

In May 2018, the White House announced the creation of the Select Committee on Artificial Intelligence. The main mission of the committee is "to improve the coordination of Federal efforts related to AI to ensure continued U.S. leadership in this field." It will operate under the National Science and Technology Council and will have senior research and development officials from key governmental agencies. The White House's Office of Science and Technology Policy will oversee the committee.

Soon after, Congress established the National Security Commission on Artificial Intelligence in Title II, Section 238 of the 2019 John McCain National Defense Authorization Act. While the commission is independent, it operates within the executive branch. Composed of 15 members appointed by Congress and the Secretaries of Defense and Commerce—including representatives from Silicon Valley, academia, and NASA—the commission's aim is to "review advances in artificial intelligence, related machine learning developments, and associated technologies." It is also charged with looking at technologies that keep the United States competitive and considering the legal and ethical risks.

While the United States wants to retain its leadership position in AI, it cannot overlook AI's privacy and ethical implications. A national privacy advocacy group, EPIC (or the Electronic Privacy Information Center), has been lobbying hard to ensure that both the Select Committee on Artificial Intelligence and the National Security Commission on Artificial Intelligence obtain public input. EPIC has asked these groups to adopt the 12 Universal Guidelines for Artificial Intelligence released in October 2018 at the International Data Protection and Privacy Commissioners Conference in Brussels.

These guidelines, which I will discuss in more detail in a future post, are based on existing regulatory guidelines in the United States and Europe regarding data protection, on human rights doctrine, and on general ethical principles. They state that any AI system with the potential to affect an individual's rights should be accountable and transparent and that humans should retain control over such systems.

As the strict privacy and data protection elements of the European Union's General Data Protection Regulation take hold in Europe and spread to other parts of the world, I believe that privacy and ethical elements will gain a brighter spotlight and AI will be a major topic of discussion in 2019. What do you think?

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

February 11, 2019 in consumer protection, emerging payments, fintech, innovation, privacy, regulations | Permalink


December 3, 2018


Building Blocks for the Sandbox

I just returned from a leave of absence to welcome my third child to this world. As I catch up on payments news, one theme emerging is the large number of state and federal regulatory bodies launching their own fintech sandboxes. Typically, these testing grounds allow businesses to experiment with various "building blocks" while they innovate. Some businesses are even allowed regulatory relief as they work out the kinks. As I've researched, I've found myself daydreaming about how my new little human also needs to work with the right building blocks, or core principles, to ensure he develops properly and "plays nice" in the sandbox.

But—back to work. What guidance do fintechs have available to them to grow and prosper?

On July 31 of this year, the U.S. Department of the Treasury released a report suggesting regulatory reform to promote financial technology and innovation among both traditional financial institutions and nonbanks. The report in its entirety is worth a review, but I'll highlight some of it here.

The blueprint for a unified regulatory sandbox is still up for discussion, but the Treasury suggests a hierarchical structure, overseen either by a single existing regulator or by an entirely new one. The Treasury also suggests that Congress will likely have to assist by passing legislation with the necessary preemptions to grant authority to the newly created or newly designated agency.

The report outlines these core principles of a unified regulatory sandbox:

  • Promote the adoption and growth of innovation and technological transformation in financial services.
  • Provide equal access to companies in various stages of the business lifecycle (e.g., startups and incumbents). [The regulator should define when a business could or should participate.]
  • Delineate clear and public processes and procedures, including a process by which firms enter and exit.
  • Provide targeted relief across multiple regulatory frameworks.
  • Offer the ability to achieve international regulatory cooperation or appropriate deference where applicable.
  • Maintain financial integrity, consumer protections, and investor protections commensurate with the scope of the project, not with the organization type (bank or nonbank).
  • Increase the timeliness of regulator feedback offered throughout the product or service development lifecycle. [Slow regulator feedback is typically a deterrent to startup participation.]

Clearly, the overarching intent of these principles is to help align guidance, standards, and regulation to meet the needs of a diverse group of participants. Should entities offering the same financial services be regulated similarly? More importantly, is such a mission readily achievable?

People have long recognized the fragmentation of the U.S. financial regulatory system. The number of agencies at the federal and state levels with a hand in financial services oversight creates inconsistencies and overlapping powers. Fintech innovations sometimes even invite attention from regulators outside the financial umbrella, such as the Federal Communications Commission or the Federal Trade Commission.

In the domain of financial services are kingdoms of industry. Take the payments kingdom, for example. Payments are interstate, global, and multi-schemed (each scheme with its own rules framework). And let's be honest: in the big picture of financial services innovation and in the minds of fintechs, payments are an afterthought, not front and center in business plans. Consumers want products or services; payments connect the dots. (In fact, the concept of invisible payments is only growing stronger.)

What is more, a fintech, even though it may have a payments component in its technology, might not identify itself as a fintech. And a business that doesn't see itself as a fintech is not going to get in line for a unified financial services regulator sandbox (though it might want to play in a payments regulator sandbox).

When regulatory restructuring takes place, I hope it will build a dedicated infrastructure to nurture the payments piece of fintech, so that all can play nice in the payments sandbox. (Insert crying baby.)

By Jessica Washington, AAP, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

December 3, 2018 in bank supervision, emerging payments, financial services, fintech, innovation, regulations, regulators | Permalink


September 10, 2018


The Case of the Disappearing ATM

The longtime distribution goal of a major soft drink company is to have its product "within an arm's reach of desire." This goal might also be applied to ATMs—the United States has one of the highest concentrations of ATMs per adult. In a recent post, I highlighted some of the findings from an ATM locational study conducted by a team of economics professors from the University of North Florida. Among their findings, for example, was that of the approximately 470,000 ATMs and cash dispensers in the United States, about 59 percent have been placed and are operated by independent entrepreneurs. Further, these independently owned ATMs "tend to be located in areas with less population, lower population density, lower median and average income (household and disposable), lower labor force participation rate, less college-educated population, higher unemployment rate, and lower home values."

This finding directly relates to the issue of financial inclusion, an issue of concern to the Federal Reserve. A 2016 study by Accenture pointed "to the ATM as one of the most important channels, which can be leveraged for the provision of basic financial services to the underserved." I think most would agree that the majority of the unbanked and underbanked population is likely to reside in the demographic areas described above. One could conclude that the independent ATM operators are fulfilling a demand of people in these areas for access to cash, their primary method of payment.

Unfortunately for these communities, a number of independent operators are having to shut down and remove their ATMs because their banking relationships are being terminated. These closures started in late 2014, but a larger wave of account closures has been occurring over the last several months. In many cases, the operators are given no reason for the sudden termination. Some operators believe their settlement bank views them as a high-risk business related to money laundering, since the primary product of the ATM is cash. Financial institutions may incorrectly group these operators with money service businesses (MSB), even though state regulators do not consider them to be MSBs. Earlier this year, the U.S. House Financial Services Subcommittee on Financial Institutions and Consumer Credit held a hearing over concerns that this de-risking could be blocking consumers' (and small businesses') access to financial products and services. You can watch the hearing on video (the hearing actually begins at 16:40).

While a financial institution should certainly monitor its customer accounts for adherence to its risk tolerance and compliance policies, we have to ask if the independent ATM operators are being painted with a risk brush that is too broad. The reality is that it is extremely difficult for an ATM operator to funnel "dirty money" through an ATM. First, to gain access to the various ATM networks, the operator has to be sponsored by a financial institution (FI). In the sponsorship process, the FI rigorously reviews the operator's financial stability and other business operations as well as its compliance with Bank Secrecy Act/anti-money laundering (BSA/AML) requirements, because the FI sponsor is ultimately responsible for any network violations. Second, the networks handling the transactions are completely independent from the ATM owners. They produce financial reports that show the amount of funds an ATM dispenses in any given period and generate the settlement transactions. These networks maintain controls that clearly document the funds flowing through the ATM, and a review of the settlement account activity would quickly identify any suspicious activity.

The industry groups representing the independent ATM operators appear to have gained a sympathetic ear from legislators and, to some degree, regulators. But the sympathy hasn't extended to those financial institutions that are accelerating account closures in some areas. We will continue to monitor this issue and report any major developments. Please let us know your thoughts.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

September 10, 2018 in banks and banking, consumer protection, financial services, money laundering, regulations, regulators, third-party service provider | Permalink


June 4, 2018


The GDPR's Impact on U.S. Consumers

If your email inbox is like mine, it's recently been flooded with messages from companies you’ve done online business with about changes in their terms and conditions, particularly regarding privacy. What has prompted this wave of notices is the May 25 implementation of Europe's General Data Protection Regulation (GDPR). Approved by the European Parliament in April 2016 after considerable debate, the regulation standardizes data privacy regulations across Europe for the protection of EU citizens.

The regulation applies to both data "controllers" and data "processors." A data controller is the organization that owns the data, while the data processor is an outside company that helps to manage or process that data. The focus of the GDPR requirements is on controllers and processors directly conducting business in the 28 countries that make up the European Union (EU). But the GDPR has the potential to affect businesses based in any country, including the United States, that collect or process the personal data of any EU citizen. Penalties for noncompliance can be quite severe. For that reason, many companies are choosing to err on the side of caution and are sending notices of changes to their privacy disclosure terms and conditions to all their customers. Some companies have even gone so far as to provide the privacy protections contained in the GDPR to all their customers, EU citizens or not.

The GDPR has a number of major consumer protections:

  • Individuals can request that controllers erase all information collected on them that is not required for transaction processing. They can also ask the controller to stop distributing that data any further and, with some exceptions, to have third parties stop processing the data. (This provision is known as "data erasure" or the "right to be forgotten.")
  • Companies must design information technology systems to include privacy protection features. In addition, they must have a robust notification system in place for when breaches occur. After a breach, the data processor must notify the data controller "without undue delay." When the breach threatens "risk for the rights and freedoms of individuals," the data controller must notify the supervisory authority within 72 hours of discovery of the breach. Data controllers must also notify "without undue delay" the individuals whose information has been affected.
  • Individuals can request to be informed if companies are obtaining their personal data and, if so, how they will use that data. Individuals also have the right to obtain without charge electronic copies of collected data, and they may send that data to another company if they choose.

In addition, the GDPR requires large processing companies, as well as public authorities and other specified businesses, to designate a data protection officer to oversee the companies' compliance with the GDPR.

There have been numerous efforts in the United States to pass uniform privacy legislation, so far with little or no success. My colleague Doug King authored a post back in May 2015 about three cybersecurity bills under consideration that included privacy rights. Three years later, for each bill, either action has been suspended or it's still in committee. It will be interesting to see, as the influence of the GDPR spreads globally, whether there will be any additional efforts to pass similar legislation in the United States. What do you think?

And by the way, fraudsters are always looking for opportunities to install malware on your phones and other devices. We've heard reports of the criminal element using "update notice" emails. The messages, which appear to be legitimate, want the unsuspecting recipient to click on a link or open an attachment containing malware or a virus. So be careful!

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

June 4, 2018 in consumer protection, cybersecurity, data security, privacy, regulations | Permalink


April 2, 2018


Advice to Fintechs: Focus on Privacy and Security from Day 1

Fintech continues to have its moment. In one week in early March, I attended three Boston-area meetings devoted to new ideas built around the blockchain, open banking APIs, and apps for every conceivable wrinkle in personal financial management.

"Disruptive" was the vocabulary word of the week.

But no matter how innovative, disruptive technology happens within an existing framework of consumer protection practices and laws. Financial products and tools—whether a robofinancial adviser seeking to consolidate your investment information or a traditional checking account at a financial institution—are subject to laws and regulations that protect consumers. As an attorney speaking at one of the Boston meetings put it, "The words 'unfair,' 'deceptive,' and 'misleading' keep me up at night."

A failure to understand the regulatory framework can play out in various ways. For example, in a recent survey of New York financial institutions (FIs) by the Fintech Innovation Lab, 60 percent of respondents reported that regulatory, compliance, or security issues made it impossible to move fintech proposals into proof-of-concept testing. Great ideas, but inadequate infrastructure.

To cite another example, in 2016, the Consumer Financial Protection Bureau took action against one firm for misrepresenting its data security practices. And just last month, the Federal Trade Commission (FTC) reached a settlement with another firm over allegations that the firm had inadequately disclosed both restrictions on funds availability for transfer to external bank accounts and consumers' ability to control the privacy of their transactions. Announcing the settlement, acting FTC chairman Maureen Ohlhausen pointed out that it sent a strong message of the "need to focus on privacy and security from day one."

As Ohlhausen made clear, whoever the disrupter—traditional financial institution or garage-based startup—consumer protection should be baked in from the start. At the Boston meetings, a number of entrepreneurs advocated a proactive stance for working with regulators and urged that new businesses bring in compliance expertise early in product design. Good advice, not only for disrupters but also for innovation labs housed in FIs, FIs adopting third-party technology, and traditional product design.

By Claire Greene, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 2, 2018 in innovation, regulations, regulators | Permalink


February 5, 2018


Elements of an Ethical Data Policy

In my last post, I introduced the issue of ethical considerations around data collection and analytics. I asked if, along with the significant expansion of technical capabilities over the last five years, there had been a corresponding increase in awareness and advancement of the ethical issues of big data, such as privacy, confidentiality, transparency, protection, and ownership. In this post, I focus on the disclosures provided to users and suggest some elements concerning ethics to include in them.

The complexities of ethics policies

As I've researched the topic of big data, I've come to realize that developing a universal ethics policy will be difficult because of the diversity of data that's collected and the many uses for this data—in the areas of finance, marketing, medicine, law enforcement, social science, and politics, to name just a few.

Privacy and data usage policies are often disclosed to users signing up for particular applications, products, or services. My experience has been that the details about the data being collected are hidden in the customer agreement. Normally, the agreement offers no "opt-out" of any specific elements, so users must either decline the service altogether or begrudgingly accept the conditions wholesale.

But what about the databases that are part of public records? Often these public records are created without any direct communication with the affected individuals. Did you know that in most states, property records at the county level are available online to anyone? You can look up property ownership by name or address and find out the sales history of the property, including prices, square footage, number of bedrooms and baths, often a floor plan, and even the name of the mortgage company—all useful information for performing a pricing analysis for comparable properties, but also useful for a criminal to combine with other socially engineered information for an account takeover or new-account fraud attempt. Doesn't it seem reasonable that I should receive a notification or be able to document when someone makes such an inquiry on my own property record?

Addressing issues in the disclosure

Often, particularly with financial instruments and medical information, those collecting data must comply with regulations that require specific disclosures and ways to handle the data. The following elements together can serve as a good benchmark in the development of an ethical data policy disclosure:

  • Type of data collected and usage. What type of data are being collected and how will that data be used? Will the data be retained at the individual level or aggregated, thereby preventing identification of individuals? Can the data be sold to third parties?
  • Accuracy. Can an individual review the data and submit corrections?
  • Protection. Are people notified how their data will be protected, at least in general terms, from unauthorized access? Are they told how they will be notified if there is a breach?
  • Public versus private system. Is it a private system that usually restricts access, or a public system that usually allows broad access?
  • Open versus closed. Is it a closed system, which prevents sharing, or is it open? If it's open, how will the information be shared, at what level, and with whom? An example of an open system is one that collects information for a governmental background check and potentially shares that information with other governmental or law enforcement agencies.
  • Optional versus mandatory. Can individuals decline participation in the data collection, or decline specific elements? Or is the individual required to participate such that refusal results in some sort of punitive action?
  • Fixed versus indefinite duration. Will the captured data be deleted or destroyed on a timetable or in response to an event—for example, two years after an account is closed? Or will it be retained indefinitely?
  • Data ownership. Do individuals own and control their own data? Biometric data stored on a mobile phone, for example, are not also stored on a central storage site, so they remain under the user's control. On the other hand, institutions may retain ownership. Few programs place ownership with the user, although agreements may confer legal rights governing how the data can be used.

What elements have I missed? Do you have anything to suggest?

In my next post, I will discuss appropriate guiding principles for those circumstances in which individuals have no direct interaction with the collection effort.

Photo of David Lott By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed


February 5, 2018 in consumer protection, innovation, regulations | Permalink

January 29, 2018


Big Data, Big Dilemma

Five years ago, I authored a post discussing the advantages and pitfalls of "big data." Since then, data analytics has come to the forefront of computer science, with data analysts among the most sought-after talent across many industries. One of my nephews, a month out of college (he graduated with honors with a dual degree in computer science and statistics), was hired by a rail transportation carrier to work on freight movement efficiency using data analytics—with a starting salary of more than $100,000.

Big data, machine learning, deep learning, artificial intelligence—these are terms we constantly see and hear in technology articles, webinars, and conferences. Some of this usage is marketing hype, but clearly the significant increases in computing power at lower costs have empowered a continued expansion in data analytical capability across a wide range of businesses including consumer products and marketing, financial services, and health care. But along with this expansion of technical capability, has there been a corresponding heightened awareness of the ethical issues of big data? Have we fully considered issues such as privacy, confidentiality, transparency, and ownership?

In 2014, the Executive Office of the President issued a report on big data privacy issues. The report was prefaced with a letter that included this caution:

Big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans' relationship with data should expand, not diminish, their opportunities and potential.

(The report was updated in May 2016.)

In 2016, the European Union adopted the General Data Protection Regulation (enforceable beginning in May 2018), which gives citizens of the European Union (EU) significant control over their personal data as well as over the export of that data outside the EU. Although numerous cybersecurity bills have been proposed in the U.S. Congress, including some addressing data collection and protection (see Doug King's 2015 post), none has passed to date despite the continuing announcements of data breaches. For federal privacy legislation (beyond constitutional protections), we have to go all the way back to the Privacy Act of 1974, and that act dealt only with the collection and use of data on individuals by federal agencies.

In a future blog post, I will give my perspective on what I believe to be the critical elements in developing a data collection and usage policy that addresses ethical issues in both overt and covert programs. In the interim, I would like to hear from you as to your perspective on this topic.

Photo of David Lott By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

January 29, 2018 in consumer protection, innovation, regulations | Permalink


December 4, 2017


What Will the Fintech Regulatory Environment Look Like in 2018?

As we prepare to put a bow on 2017 and begin to look forward to 2018, I can’t help but observe that fintech was one of the bigger topics in the banking and payments communities this year. (Be sure to sign up for our December 14 Talk About Payments webinar to see if fintech made our top 10 newsworthy list for 2017.) Many industry observers would likely agree that it will continue to garner a lot of attention in the upcoming year, as financial institutions (FIs) continue to partner with fintech companies to deliver client-friendly solutions.

No doubt, fintech solutions are making our daily lives easier, whether they are helping us deposit a check with our mobile phones or activating fund transfers with a voice command in a mobile banking application. But at what cost to consumers? To date, the direct costs, such as fees, have been minimal. However, are there hidden costs such as the loss of data privacy that could potentially have negative consequences for not only consumers but also FIs? And what, from a regulatory perspective, is being done to mitigate these potential negative consequences?

Early in the year, there was a splash in the regulatory environment for fintechs. The Office of the Comptroller of the Currency (OCC) began offering limited-purpose bank charters to fintech companies. This charter became the subject of heated debates and discussions—and even lawsuits filed by the Conference of State Bank Supervisors and the New York Department of Financial Services. To date, the OCC has not formally begun accepting applications for this charter.

So where will the fintech regulatory environment take us in 2018?

Will it continue to be up to the FIs to perform due diligence on fintech companies, much as they do for third-party service providers? Will regulatory agencies offer FIs additional guidance or due diligence frameworks for fintechs, over and above what they provide for traditional third-party service providers? Will one of the regulatory agencies decide that the role of fintech companies in financial services is becoming so important that the companies should be subject to examinations, much as financial institutions are? Finally, will U.S. regulatory agencies create sandboxes to allow fintechs and FIs to launch products on a limited scale, as has taken place in the United Kingdom and Australia?

The Risk Forum will continue to closely monitor the fintech industry in 2018. We would enjoy hearing from our readers about how they see the regulatory environment for fintechs evolving.

Photo of Douglas King By Douglas A. King, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

December 4, 2017 in banks and banking, financial services, innovation, mobile banking, regulations, regulators, third-party service provider | Permalink

