I’ve been nearsighted since I was a little boy. My visual acuity has never been sharp, but at least my mind has retained its cognitive prowess into late middle age. Without it, my career as a high-tech thought leader would have been impossible.
Everyone wants to retain their cognitive skills into old age. But we’re all aware that Alzheimer’s disease, cerebrovascular impairments, and other organic conditions can easily rob us of those powers through no fault of our own. Our susceptibility to these debilitating diseases may be encoded in our genes. Or it may be exacerbated by environmental stresses to which we’ve been exposed since we were in our mothers’ wombs. Or it may stem from bad habits, such as unhealthy diets, that we’ve acquired over the course of our lives.
None of us truly knows whether we’ll lose our cognitive acuity in old age. Fortunately, genomic analysis is increasingly illuminating the factors that make some people more susceptible to Alzheimer’s and other aging-related cognitive impairments. Regardless of root cause, dementias are usually irreversible, no matter what any of us has tried to do to keep ourselves from being afflicted. Though many people tout faddish lifestyle changes (e.g., pharmaceuticals, foods, exercises) that they claim will keep your cognitive powers in tip-top shape, medical science has not confirmed the efficacy of any of these approaches.
In this recent article, author Abdullahi Muhammed claims that cognitive computing tools—aka artificial intelligence, machine learning, and data science—might help healthy people sharpen their intellectual skills.
On one level, I don’t doubt that claim, if you understand it as referring simply to the use of cognitive applications, such as IBM Watson, to accelerate human learning. You can definitely get smarter if you’re engaging with a massive, organized, ever-growing corpus of information that predictively delivers the insights you’re seeking.
For my thoughts in this regard, check out this post from this past February. In that piece, I discuss the use of a machine learning model to algorithmically detect when a student’s grasp of some body of knowledge is weak. The personalized learning program uses machine learning to dynamically address those deficiencies. It adjusts an online course of study to drill students in exactly the right topics at the right times and in the right ways to boost retention.
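To make that concrete, here is a minimal sketch, in Python, of how such an adaptive-drilling loop might work. This is my own toy illustration, not the model from that post: the class name, decay factor, and drill threshold are all invented for the example, and a production system would use a far richer learner model.

```python
from collections import defaultdict

class MasteryTracker:
    """Toy mastery estimator: exponentially weighted accuracy per topic.

    Hypothetical illustration only; the decay factor and drill threshold
    are placeholders, not values from any real adaptive-learning product.
    """

    def __init__(self, decay=0.7, drill_threshold=0.6):
        self.decay = decay                        # weight given to the prior estimate
        self.drill_threshold = drill_threshold    # below this, a topic needs review
        self.mastery = defaultdict(lambda: 0.5)   # start every topic at 50 percent

    def record_answer(self, topic, correct):
        # Blend the previous estimate with the newest quiz result,
        # so recent answers count more than old ones.
        prior = self.mastery[topic]
        self.mastery[topic] = self.decay * prior + (1 - self.decay) * (1.0 if correct else 0.0)

    def topics_to_drill(self):
        # Surface the topics whose estimated mastery has fallen below the threshold,
        # weakest first.
        return sorted(
            (t for t, m in self.mastery.items() if m < self.drill_threshold),
            key=lambda t: self.mastery[t],
        )

# Example: three quiz responses, then ask which topics need review.
tracker = MasteryTracker()
tracker.record_answer("linear_regression", correct=True)
tracker.record_answer("gradient_descent", correct=False)
tracker.record_answer("gradient_descent", correct=False)
print(tracker.topics_to_drill())   # ['gradient_descent']
```

The exponential blend is the whole trick: recent answers outweigh old ones, which is roughly how adaptive courseware decides that a topic has slipped and should be drilled again.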
There’s nothing controversial about claiming that you can use cognitive computing as an educational tool. To deny it would be equivalent to arguing that earning a 4-year university degree or availing oneself of on-the-job training doesn’t make you smarter. If you want conclusive evidence of educational efficacy, just refer to pre- and post-tests that show the students have indeed mastered the subject matter.
However, I take issue with Muhammed’s assertion that using cognitive applications can actually improve “brain function,” a term that usually refers to the neurological substrate that drives organic cognition. He doesn’t offer even an iota of scientific evidence to bolster this claim. Instead, he relies on a dubious “brain is a muscle” analogy—specifically, a resistance-training analogy that equates using cognitive-learning tools with lifting weights.
In fact, there is no cure for dementia, as the Wikipedia page devoted to this topic states in no uncertain terms. So let’s not create false hope that using cognitive apps will prevent aging-related cognitive decline and other organic conditions that impair brain function.
Nevertheless, we should recognize that cognitive computing provides medical science with very useful tools for analyzing the neurological and other organic factors that may cause or contribute to Alzheimer’s and other dementias. The hope is that, through the ability to analyze non-invasive neuro-imaging scans, medical science can identify potential cures and people afflicted with dementias can receive appropriate treatments as early as possible.
In that regard, I strongly recommend this recent Scientific American article that reports on the use of machine learning tools to analyze brain scans and thereby improve early detection of diverse dementias. Researchers can perform fine-grained, correlated analyses of several types of MRI scans in order to identify functional and structural changes in the brains of the afflicted.
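For readers who want a feel for what “machine learning on brain scans” means in practice, here is a hedged sketch using scikit-learn. The features below are synthetic stand-ins for MRI-derived measurements (regional volumes, cortical thickness, and the like); this is not the pipeline from the article the researchers used, just the general shape of a classification-plus-cross-validation workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for MRI-derived measurements; a real study would
# extract these from an imaging pipeline, not generate them randomly.
n_subjects, n_features = 200, 12
X = rng.normal(size=(n_subjects, n_features))
y = rng.integers(0, 2, size=n_subjects)   # 0 = control, 1 = dementia diagnosis

# Make two "structural" features weakly informative so the example is not pure noise.
X[y == 1, 0] -= 0.8
X[y == 1, 1] += 0.6

# Train a classifier and estimate how well it separates the two groups.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {scores.mean():.2f}")
```

The cross-validation step matters: early-detection claims only hold up when a model generalizes to scans it has never seen, not just to the cohort it was trained on.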
If we extrapolate these innovations into the near future, it’s not inconceivable that researchers will use machine-learning tools to identify biological markers that are strongly correlated with the early onset of various dementia-related neurological impairments. One could even foresee a day when each of us could use wearable devices and other Internet of Things tools to track the presence of these biomarkers in ourselves and our loved ones. By correlating these biomarkers with other wearable-derived data, as well as with neurological MRI scans captured by professional radiologists, we may all someday be able to assess our own cognitive fitness to the finest neurological degree and in a thoroughly non-invasive fashion.
Let’s say that such futuristic machine-learning tools identify factors indicative of the onset of one or more types of dementia. Diagnosis might be accelerated through various behavioral A/B tests that look for specific physiological, behavioral, gestural, speech, facial, social, and other variables associated with one or another form of cognitive decline. Cognitive-fitness testing of this sort could be integrated with the physical-fitness-monitoring apps that many people already use. In that way, people might be able to provide their doctors with a 360-degree real-time portrait of their cognitive health, and perhaps even treat otherwise invisible conditions such as micro-strokes before they can do irreversible damage.
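To illustrate what such a 360-degree cognitive-health summary might look like in code, here is a deliberately simple sketch. Every signal name, weight, and threshold below is a placeholder I’ve invented for illustration; a real system would calibrate them against clinical outcomes rather than hard-coding them.

```python
from dataclasses import dataclass

@dataclass
class CognitiveSnapshot:
    """One day's worth of hypothetical signals from wearables and apps."""
    speech_pause_rate: float     # pauses per minute in voice samples
    gait_variability: float      # step-interval variance from an accelerometer
    recall_test_score: float     # 0..1 score from an in-app memory quiz
    sleep_hours: float

def cognitive_risk_score(s: CognitiveSnapshot) -> float:
    """Combine the signals into a single 0..1 'review with your doctor' score.

    The weights and cutoffs are arbitrary placeholders for illustration only.
    """
    risk = 0.0
    risk += 0.3 * min(s.speech_pause_rate / 10.0, 1.0)   # more pausing, more risk
    risk += 0.3 * min(s.gait_variability / 0.5, 1.0)     # unsteadier gait, more risk
    risk += 0.3 * (1.0 - s.recall_test_score)            # weaker recall, more risk
    risk += 0.1 * (1.0 if s.sleep_hours < 5 else 0.0)    # chronic short sleep flag
    return round(risk, 2)

# Example: a single day's snapshot fed into the summary score.
today = CognitiveSnapshot(speech_pause_rate=6.0, gait_variability=0.2,
                          recall_test_score=0.9, sleep_hours=7.5)
print(cognitive_risk_score(today))   # 0.33
```

The value of a rollup like this would not be the number itself but the trend: a score that drifts upward over months is exactly the kind of early signal a physician would want surfaced.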
We’re all mortal and no one expects to have the same robust physiology at age 100 that they enjoyed in their early-adult years. But there’s every reason to believe that we can keep our cognitive faculties sharp till our final breath. If we can use cognitive computing to reduce the onset and severity of aging-related dementias, we can significantly improve people’s quality of life.