Artificial Intelligence has made the leap from science fiction to real life in a remarkably short time. It was initially envisioned as a panacea for the intricate but repetitive processes underpinning scientific research and technological advancement – a role it has fulfilled and, in many instances, surpassed. The inclusion of ‘learning abilities’ – long thought unique to humans and a handful of other primates – defines artificial intelligence to a large extent. How a program deals with unfamiliar situations, and how it attempts to solve the problems they pose, is key to identifying a stretch of software code as ‘artificially intelligent’. Training a program to interpret a variety of sensory inputs, whether in the form of digital or analog data, does not by itself mean the program has ‘intelligence’. When this criterion is used to judge the intelligence of software, many technologies that were quite revolutionary at their inception come to be classified as routine rather than artificially intelligent.