At the recent Money20/20 conference, sessions on artificial intelligence (AI) dominated the agenda, alongside those on friction in regulatory and technological innovation. A number of panels highlighted the competitive advantages AI tools offer companies. Whether the topic was consumer marketing, fraud prevention, or product development, AI was the buzzword. One speaker noted the social good that could come from such technology, pointing to the work of a Stanford research team trying to identify individuals with a strong likelihood of developing diabetes through an automated review of photographic images of their eyes. Another panel discussed the privacy and ethical issues surrounding the use of artificial intelligence.

But do any of these applications marketed as AI pass the now-famous Turing test, which Alan Turing proposed in 1950 as a benchmark for true artificial intelligence? Turing is widely regarded as the father of computer science. During World War II, his efforts led a cryptographic team to break the Enigma code used by the Germans, as featured in the 2014 movie The Imitation Game. Turing once said, "A computer would deserve to be called intelligent if it could deceive a human into believing that it was human." An annual competition known as the Loebner Prize, held since 1991, offers a solid 18-karat gold medal and a monetary prize of $100,000 for the first computer whose responses are indistinguishable from a real human's. To date, no one has received the gold medal, but every year a bronze medal and a smaller cash prize are awarded to the entrant judged the "most humanlike."

Incidentally, many vendors seem to use artificial intelligence as a synonym for deep learning and machine learning. Is this usage of AI mostly marketing hype for neural network technology first developed in the 1950s and '60s, now greatly improved thanks to the substantial increase in computing power? A 2016 Forbes article by Bernard Marr provides a good overview of the different terms and their applications.

My opinion is that none of the tools on the market today meets the threshold of true artificial intelligence based on Turing's criteria. That isn't to say the lack of this achievement diminishes the benefits that have already emerged and will continue to be generated in the future. Computing technology has certainly advanced to the point that it can handle complex mathematical and programmed instructions at a much faster rate than a human can.

What are your thoughts?

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed