I always enjoy reading the words David writes.
This particular post creates a moment to reflect. As we consider the implications of the Fourth Industrial Revolution, we must remember the significance many have attributed to Artificial Intelligence. Those two letters, AI, are clearly key to the what, why, and wherefore of the change ahead.
Clearly, machines that work faster, search deeper, and are capable of studying vast realms of data are changing the nature of so much. Simply consider the security risks that cyber hackers and terrorists have wrought on this world, or the shenanigans many claim the Russians use to disrupt as they explore and exploit the power of social media.
Moreover, as we look afield, many industries are being disrupted: movies, books, music, news … to name a few. Outsourcing and robotics are changing the nature of work and the skills necessary to compete and, ultimately, survive to enjoy the pleasures available in our increasingly digital world.
David makes the point that the intelligence Isaac Asimov and other science fiction writers envisioned has not yet emerged. I think he is right. The message I take away: we who market these solutions should walk forward with care.
People are clearly feeling threatened by the change impacting their towns, families and livelihood.
We must be mindful that complexity breeds confusion. Confusion drives disillusion. This then causes people to react, often in nonsensical ways.
Take On Payments
Federal Reserve Bank of Atlanta
Posted: Nov 27, 2017 10:51 am
At the recent Money20/20 conference, sessions on artificial intelligence (AI) joined those on friction in regulatory and technological innovation in dominating the agenda. A number of panels highlighted the competitive advantages AI tools offer companies. It didn’t matter if the topic was consumer marketing, fraud prevention, or product development—AI was the buzzword. One speaker noted the social good that could come from such technology, pointing to the work of a Stanford research team trying to identify individuals with a strong likelihood of developing diabetes by running an automated review of photographic images of their eyes. Another panel discussed the privacy and ethical issues around the use of artificial intelligence.
But do any of these applications marketed as AI pass Alan Turing’s now-famous test, proposed in the 1950s, defining true artificial intelligence? Turing is regarded as the father of computer science. It was his efforts during World War II that led a cryptographic team to break the Enigma code used by the Germans, as featured in the 2014 movie The Imitation Game. Turing once said, “A computer would deserve to be called intelligent if it could deceive a human into believing that it was human.” An annual competition, held since 1991, aims to award a solid 18-karat gold medal and a monetary prize of $100,000 for the first computer whose responses are indistinguishable from a real human’s. To date, no one has received the gold medal, but every year, a bronze medal and smaller cash prize are given to the “most humanlike.”
Incidentally, many vendors seem to use artificial intelligence as a synonym for the terms deep learning and machine learning. Is this usage of AI mostly marketing hype for the neural network technology developed in the mid-1960s, now greatly improved thanks to the substantial increase in computing power? A 2016 Forbes article by Bernard Marr provides a good overview of the different terms and their applications.
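To make the distinction concrete, here is a minimal sketch (not from the post; the data and function names are illustrative) of what “machine learning” typically means in practice: a single artificial neuron, the building block of the neural networks mentioned above, fitting its weights to example data by gradient descent. It learns the logical OR function from four examples, a statistical pattern-fitting exercise rather than anything resembling Turing’s notion of intelligence.

```python
import math
import random

def sigmoid(z):
    # Squashing function that maps any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, epochs=2000, lr=0.5, seed=0):
    """Fit the weights of one neuron (two inputs plus a bias)
    by gradient descent on squared error."""
    rng = random.Random(seed)
    w1, w2, b = rng.random(), rng.random(), rng.random()
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = sigmoid(w1 * x1 + w2 * x2 + b)
            # Gradient of 0.5 * (out - target)^2 with respect to each weight.
            grad = (out - target) * out * (1 - out)
            w1 -= lr * grad * x1
            w2 -= lr * grad * x2
            b -= lr * grad
    return w1, w2, b

def predict(weights, x1, x2):
    w1, w2, b = weights
    return 1 if sigmoid(w1 * x1 + w2 * x2 + b) >= 0.5 else 0

# Training data: the OR truth table.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
weights = train(data)
print([predict(weights, x1, x2) for (x1, x2), _ in data])
```

The neuron ends up reproducing the OR table, but only because it has adjusted three numbers to fit four examples; nothing here would come close to deceiving a judge in Turing’s imitation game.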
My opinion is that none of the tools in the market today meet the threshold of true artificial intelligence based on Turing’s criteria. That isn’t to say the lack of this achievement should diminish the benefits that have already emerged and will continue to be generated in the future. Computing technology certainly has advanced to be able to handle complex mathematical and programmed instructions at a much faster rate than a human.
What are your thoughts?
By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed