The first underpinnings of modern Cognitive Computing date back to the mid-19th century, with the work of mathematician George Boole and his book The Laws of Thought, and the proposals of Charles Babbage for what he termed an “analytical engine.” The term Artificial Intelligence (AI) was coined by the late John McCarthy in 1955; in his 2007 revision of that work he defined AI as “the science and engineering of making intelligent machines.”
The study of AI–which draws on disciplines such as psychology, neuroscience, linguistics, mathematics, logic, computer science, perception, natural language processing, and many others–really began to accelerate during the 1980s, when funding increased considerably over previous decades. Then, on May 11, 1997, the world’s imagination was captivated when IBM’s Deep Blue beat Garry Kasparov, the reigning world chess champion. The world of AI research exploded. Some other notable dates include:
- 2005: Stanford-built robot wins DARPA Grand Challenge
- 2011: Watson defeats two of the greatest Jeopardy! champions without being connected to the Internet
But in reality, ideas about thinking machines date back to antiquity: Greek myth imagined artificial beings such as Hephaestus’ bronze automaton Talos, Hero of Alexandria described mechanical automatons, and the carved ivory statue Galatea came to life in Ovid’s retelling of Pygmalion.
Such grand creations have long been the purview of human imagination, but only in the past 30 to 40 years (with the last decade being especially important) has the reality of Cognitive Computing started to manifest in our daily affairs. According to Dharmendra Modha, the Manager of Cognitive Computing at IBM Research:
“Cognitive computing goes well beyond artificial intelligence and human-computer interaction as we know it–it explores the concepts of perception, memory, attention, language, intelligence and consciousness. Typically, in AI, one creates an algorithm to solve a particular problem. Cognitive computing seeks a universal algorithm for the brain. This algorithm would be able to solve a vast array of problems.”
What is Cognitive Computing?
Questions abound about the parallels and distinctions between Cognitive Computing and Artificial Intelligence, as well as misunderstandings about semantic processing, natural language processing, decision automation, cognitive science, computational intelligence, machine learning, statistical intelligence modelling, cognitive simulation, and a host of other terms. This article does not have the space to delineate each of them, so it uses Cognitive Computing as an umbrella term that, throughout its long history, has in one way or another encompassed, been developed through, or stood alongside the terms mentioned above, among others. Libraries are filled with tomes on Artificial Intelligence, a term that remains predominant in its own right but has been superseded in the Big Data and Data Management industry by Cognitive Computing.
IBM has long been a leader in research on Cognitive Computing and says that:
“Cognitive computing systems learn and interact naturally with people to extend what either humans or machine could do on their own. They help human experts make better decisions by penetrating the complexity of Big Data…
“[And] Big Data growth is accelerating as more of the world’s activity is expressed digitally. Not only is it increasing in volume, but also in speed, variety and uncertainty. Most data now comes in unstructured forms such as video, images, symbols and natural language–a new computing model is needed in order for businesses to process and make sense of it, and enhance and extend the expertise of humans. Rather than being programmed to anticipate every possible answer or action needed to perform a function or set of tasks, cognitive computing systems are trained using artificial intelligence (AI) and machine learning algorithms to sense, predict, infer and, in some ways, think.”
Thus, Cognitive Computing has borrowed foundational tenets from the AI community, especially algorithms that can be employed through Data Management systems–Big Data systems in particular–to gain a deeper understanding of, and control over, the vast amounts of data pouring into modern enterprises.
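To make the contrast between “programmed for every answer” and “trained on data” more concrete, here is a minimal sketch in Python using the scikit-learn library. The tiny support-ticket documents and labels are invented purely for illustration; this is not a description of Watson or any IBM system, only the general pattern of learning to classify unstructured text from labeled examples.

```python
# Minimal sketch: a model that learns to classify unstructured text
# from labeled examples instead of hand-coded rules.
# Requires scikit-learn; the toy data below is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# A handful of labeled, unstructured documents (hypothetical examples).
documents = [
    "server is down and customers cannot check out",
    "invoice totals do not match the purchase order",
    "application crashes when uploading large images",
    "payment was charged twice for the same order",
]
labels = ["it_incident", "billing", "it_incident", "billing"]

# The pipeline converts raw text into numeric features (TF-IDF) and
# fits a simple probabilistic classifier on those features.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(documents, labels)

# The trained model now infers a label for text it has never seen.
print(model.predict(["the checkout page keeps crashing"]))
```

The point of the sketch is the design choice the IBM quote describes: the system’s behavior comes from the data it is trained on, not from a programmer anticipating every possible input in advance.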
Big Data Evangelist James Kobielus expands on this idea:
“When we speak of ‘cognitive computing,’ we’re generally referring to the ability of automated systems to handle the conscious, critical, logical, attentive, reasoning mode of thought that humans engage in when they, say, play ‘Jeopardy!’ or try to master some academic discipline.”
Cognitive Systems are still in their infancy today, even after the advances IBM’s Watson has made over the past few years in fields such as Cognitive Cloud apps and healthcare. Even so, the potential of Cognitive Systems is heralded as the next stage in computational growth. According to IBM Senior Vice President John E. Kelly, there have been two eras of computing thus far:
- The Tabulating Era: original calculators, tabulating machines, and early vacuum tube systems. “In the first era of data we basically fed data in on punch cards.”
- The Programmable Era: later vacuum tube systems up to our current microprocessor-based computers. “It was about taking processes and putting them into the machine. It’s completely controlled by the programming we inflict on the system.”
According to Kelly, the next era will be:
- The Cognitive Computing Era: computers work directly with humans in a more synergetic association, and the boundary between human and computer begins to blur. The computer helps the human unravel vast stores of information through its advanced processing speed, while human creativity provides the environment for such an “unlocking” to occur.
Kelly went on to say, “This is no longer a game. This is about unleashing a new form of interaction between man and machine. And unleashing a new power in that data we’re generating.” Eventually, Cognitive Systems will facilitate much richer interaction and mutual amplification between machines and humans.
Contrary to popular belief, such interaction has little to do with Terminator or the other classic sci-fi scenarios that certain critics of Cognitive Computing and AI like to invoke. AlchemyAPI CEO Elliot Turner put it well earlier this year:
“I believe in the strength of the human spirit. While the systems that are coming online are amazing… you can still have a person read a document better than a machine can today. Same thing for vision. We just have to focus on the things that make us special, and move away from the historical view of rote memorization.”
IBM’s Stephen Gold added:
“I think the challenge and the skills are how we educate. Understanding natural language and machine learning, understanding analytics and Big Data — we need to modify our educational system to get people who can truly build these new systems.”
Cognitive Systems do not aim to have the computer take over every task, but rather to help humans complete tasks with more precision, at greater speed, and at greater complexity than ever before. Humans bring creativity and expertise in fields like finance, commerce, engineering, healthcare, and Data Management; computers bring the power to draw insights from data while getting smarter as they work. Together, humans and Cognitive Systems can tame the gluttonous Leviathan that is Big Data, a devourer of vast resources, productivity hours, and money.
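The idea of a system “getting smarter as it works” can be illustrated, under the same hypothetical setup as the earlier sketch, by incremental (online) learning: the model is updated each time new labeled feedback arrives, rather than being retrained from scratch. The example below is an illustrative scikit-learn sketch with invented data, not a description of any vendor’s product.

```python
# Illustrative sketch of online learning: the model improves incrementally
# as new labeled examples arrive, without retraining from scratch.
# Requires scikit-learn; all data below is invented for illustration.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**16)  # stateless, so no refitting needed
model = SGDClassifier()                           # linear classifier trained by SGD
classes = ["billing", "it_incident"]              # all labels must be declared up front

# Simulated stream of (document, label) feedback arriving over time.
stream = [
    ("payment charged twice for one order", "billing"),
    ("database connection times out overnight", "it_incident"),
    ("refund has not appeared on the statement", "billing"),
]

for text, label in stream:
    X = vectorizer.transform([text])
    # partial_fit updates the existing model with just this new example.
    model.partial_fit(X, [label], classes=classes)

# The model can already make predictions, and it keeps improving as more
# feedback is folded in over time.
print(model.predict(vectorizer.transform(["checkout service is timing out"])))
```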
Conclusion – Investments Abound
The deluge of data is not going to abate; it is only going to get larger. Therefore, the need for more powerful processing systems, built on precepts that allow machines to learn faster, think better, and interact more efficiently with humans, has become the next great arena for venture capitalists and tech gurus–not to mention every person who owns a smart device, personal computer, or smart refrigerator.
The Cognitive Systems currently being researched and produced seek to aid doctors in patient care, speed up interactions on the sales floor, improve supply chain efficiency, support financial advisors, provide lay users with better “smart” experiences, and improve the way humans live on a global scale.
Recently, IBM invested $1 billion to establish the IBM Watson Group, “a new business unit dedicated to the development and commercialization of cloud-delivered cognitive innovations.” Google paid $400 million to purchase DeepMind, a company that is developing AI for e-commerce, online gaming, and image recognition. Mark Zuckerberg, Elon Musk, and Ashton Kutcher have all invested millions in Vicarious, a company whose primary aim, according to its co-founder Scott Phoenix, is nothing less than:
“Replicating the neocortex, the part of the brain that sees, controls the body, understands language and does math. Translate the neocortex into computer code and you have a computer that thinks like a person. Except it doesn’t have to eat or sleep.”
Extensive research efforts are ongoing at universities such as MIT, Carnegie Mellon, New York University, Rensselaer Polytechnic, and many others, and companies are investing as well: Rebellion Research has invested $7 million to develop a stock investment program based on machine learning, and Cerebellum Capital in San Francisco has invested $10 million toward similar aims. Such investments will only grow as the potential of Cognitive Computing and its partner-in-crime, AI, becomes more obvious.
The process that began long ago in the imagination of the ancient Greeks is finally coming to fruition. Yet, according to Zachary Lemnios, the Vice President of Strategy at IBM Research, more breakthroughs are still needed:
“Cognitive systems will require innovation breakthroughs at every layer of information technology, starting with nanotechnology and progressing through computing systems design, information management, programming and machine learning, and, finally, the interfaces between machines and humans. Advances on this scale will require remarkable efforts and collaboration, calling forth the best minds–and the combined resources–of academia, government and industry.”