The Power of Natural Language Processing

Today’s boom in Artificial Intelligence

In 2014, Benedict Cumberbatch played the famous role of Alan Turing in the movie “The Imitation Game”. Source: OnSugar

The history of Natural Language Processing dates back to the 1950s, when Alan Turing proposed a simple test to determine whether a machine could be considered "intelligent".

“A computer would deserve to be called intelligent if it could deceive a human into believing that it was human.”
- Alan Turing

The “standard interpretation” of the Turing Test, in which player C, the interrogator, is given the task of trying to determine which player — A or B — is a computer and which is a human. The interrogator is limited to using the responses to written questions to make the determination. (Source: Saygin, 2000)

Turing proposed that evaluators hold conversations with both humans and machines to test whether they could consistently tell the two apart. The conversation would be limited to a text-only channel to create a level playing field. If, during the test, the evaluator consistently fails to distinguish machine from human, the machine is said to have passed the Turing test.

Turing's test framework was ahead of its time, as it would take over half a century for technology to even come close to passing the test.

Media references to various technologies (2011–2016). Source: Nathan Benaich

Artificial Intelligence is … when a machine mimics “cognitive” functions that humans associate with other human minds, such as learning and problem solving. — Russell & Norvig 2009


The Timing

The sheer computing power required to run AI algorithms only became available to researchers and developers in the last few years, thanks to the proliferation of Cloud Computing. Massive numbers of GPUs are needed to meet these intensive processing demands, and with Cloud Computing they are at anybody's fingertips. With such immense advances in computational power and availability, AI has even begun to rival human ability.

Source: IBM Watson

In 2011, IBM's Watson beat Ken Jennings and Brad Rutter in a highly televised Jeopardy! contest. The system comprised ten racks of Power 750 servers, none of it accessible to the public. Today, however, we can harness the power of Watson and other comparable engines on hardware as modest as our smartphones, all thanks to the power of Cloud Computing.

Artificial Intelligence landscape

Major AI companies competing for similar services

Today, there are multiple large companies offering cloud services for AI and NLP.

Google has its Cloud Machine Learning Engine; Microsoft offers a series of Cognitive Services APIs through Azure; IBM's Watson ecosystem has spread into various sectors, with its Natural Language Processing services at the forefront; and Amazon, the biggest cloud player, offers many Machine Learning tools through AWS, such as Rekognition, Polly and Lex.

From the perspective of NLP, there are several options available to engineers, depending on the level of customization you need. On one hand, you have open-source, hands-on frameworks like TensorFlow, Stanford's CoreNLP suite, Caffe, Theano, Torch, CNTK and more. Working with these frameworks requires you to assemble your own dataset and create, train and deploy your own models, which might not be feasible for many teams.
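To make the "assemble your own dataset, train your own model" route concrete, here is a toy sketch of an intent classifier written in plain Python (a naive Bayes model with Laplace smoothing). It deliberately uses none of the frameworks named above, and the tiny labeled dataset is entirely made up for illustration; a real project would use one of those frameworks and far more data.

```python
import math
from collections import Counter, defaultdict

# Hypothetical intent-labeled utterances, stand-ins for a real dataset.
TRAIN = [
    ("what is the weather today", "weather"),
    ("will it rain tomorrow", "weather"),
    ("book a table for two", "booking"),
    ("reserve a room for tonight", "booking"),
]

def train(examples):
    """Fit a naive Bayes model: per-class word counts, class counts, vocab."""
    word_counts = defaultdict(Counter)
    class_counts = Counter()
    vocab = set()
    for text, label in examples:
        tokens = text.lower().split()
        word_counts[label].update(tokens)
        class_counts[label] += 1
        vocab.update(tokens)
    return word_counts, class_counts, vocab

def predict(model, text):
    """Pick the class with the highest Laplace-smoothed log-likelihood."""
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total)  # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for tok in text.lower().split():
            score += math.log((word_counts[label][tok] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

model = train(TRAIN)
print(predict(model, "rain today"))     # -> weather
print(predict(model, "table tonight"))  # -> booking
```

Even this toy version shows where the effort goes: collecting labeled examples and choosing the modeling approach, which is exactly the work the hosted services below take off your plate.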

If, however, you're looking to build something quickly to test a hypothesis, you can pick from services like API.AI, Watson Conversation, Amazon Lex or Microsoft Cognitive Services. They are all comparable in price, and some have generous free credit offerings for startups and entrepreneurs (e.g. IBM Global Entrepreneurship, AWS Activate). This makes them highly accessible to small developers.
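Integration with these hosted services generally boils down to an authenticated HTTPS call with a JSON payload. The sketch below builds the pieces of such a call for a hypothetical endpoint; the URL, header names and payload schema are placeholders, since each provider (Watson Conversation, Lex, etc.) defines its own authentication scheme and request format in its documentation.

```python
import json

# Placeholders for illustration only -- not a real provider URL or key.
ENDPOINT = "https://api.example.com/v1/understand"
API_KEY = "YOUR_API_KEY"

def build_request(utterance, session_id):
    """Assemble URL, headers and JSON body for a hypothetical
    intent-detection call. Real services differ in auth and schema;
    consult each provider's API docs for the actual shape."""
    headers = {
        "Authorization": "Bearer " + API_KEY,
        "Content-Type": "application/json",
    }
    body = json.dumps({"text": utterance, "session": session_id})
    return ENDPOINT, headers, body

url, headers, body = build_request("book a table for two", "sess-1")
print(url)
print(json.loads(body)["text"])
```

The appeal of this model is that all the NLP machinery lives behind the endpoint: your application only manages credentials and JSON, which is why a small team can have a working prototype in an afternoon.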

Overall, the current landscape of NLP is booming, as it becomes possible for almost anyone to build language-aware applications. This development will transform AI, and in the near future we can expect human-like bots to join us in our daily lives.