Racing down the tracks with the equivalent of 170 Boeing 747s in tow, the 440,000-pound (200,000 kg) next-generation locomotive is about to break down. Some of the hundreds of sensors pulling in over 150,000 data points per minute show that one of the train’s two turbochargers (yes, some trains have turbochargers) is starting to fail, along with part of the cooling system.
While the cost of replacement parts alone could buy you a small fleet of Teslas, it pales in comparison to the lost revenue from downtime of the multi-million-dollar locomotive and its 4,000+ horsepower engine. Unfortunately, these aren’t the kind of components you have just lying around. It will take two months to make the most complicated of these specialty parts, and that’s a best-case scenario which assumes the manufacturing plant has the raw materials available, can stop whatever it is doing, retool, and rush through the emergency order.
But to the makers of the train this is not new information. They’ve been continuously monitoring every aspect of this train’s performance in real time since it went into service. Over time, and with a great degree of confidence, they predicted that servicing would be needed right about now, avoiding a catastrophic failure while towing a fortune in freight. With access to continuously integrated data across all lines of business (LoB), their data models were able to preemptively suggest a number of high-value alternative actions. They chose one: three months ago, when they were building these same parts for a new train, they added one more production run without any noticeable disruption. The parts have already arrived at the local maintenance yard, and total downtime for this locomotive will be less than a week.
One Percent in Efficiency = $1.8 Billion
Today we hear a lot about the advances of electric cars, reusable rockets, and talking speakers, but over the decades trains have been quietly and consistently chugging along at the forefront of innovation. And in an industry where a one percent increase in efficiency is worth $1.8 billion, it’s easy to see their motivation. A central figure in the success story of this modern marvel of engineering is the same as that of other manufacturing endeavors: data. And lots of it. As we enter the zettabyte era, concluding that we have lots of data offers nothing new. Fortunately, that’s only where this narrative begins.
The military employs a useful method of communication called a SITREP, or situation report. There are several versions of what exactly constitutes a SITREP, but they all provide an insightful summary of the situation right now so that life-and-death decisions can be quickly made. The computing corollary is “Descriptive Analytics,” and for decades it has been, and continues to be, the most widely used type of Data Analytics. It is essentially a collection of “What’s up?” questions mixed in with some “why” diagnostics that usually yield performance numbers. Ask a specific question, get a specific answer. Over the years, as these questions started to get more involved, running them on the production systems that generated the data became prohibitive. We started creating batch-oriented ETL (extract, transform, and load) tools to format and cleanse production data and periodically load it into custom-built data warehouses with tremendous processing power to tell us how we’re doing; how we did last quarter, last month, yesterday, and now.
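To make that pipeline concrete, here is a minimal sketch of the batch ETL pattern in Python, assuming a hypothetical production.csv export and using a local SQLite file to stand in for the warehouse; the file, table, and column names are illustrative, not from any particular system.

```python
import csv
import sqlite3

# Extract: pull the latest batch export from the production system.
# "production.csv" is a hypothetical file with machine_id, metric, value columns.
with open("production.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: cleanse the data -- drop rows with missing readings and
# coerce values to floats so the warehouse stays consistent.
clean = [
    (r["machine_id"], r["metric"], float(r["value"]))
    for r in rows
    if r.get("value") not in (None, "")
]

# Load: append the batch into the warehouse (SQLite stands in here).
warehouse = sqlite3.connect("warehouse.db")
warehouse.execute(
    "CREATE TABLE IF NOT EXISTS readings (machine_id TEXT, metric TEXT, value REAL)"
)
warehouse.executemany("INSERT INTO readings VALUES (?, ?, ?)", clean)
warehouse.commit()

# Descriptive Analytics: ask a specific question, get a specific answer.
# "What's up?" -- the average reading per machine and metric.
for row in warehouse.execute(
    "SELECT machine_id, metric, AVG(value) FROM readings GROUP BY machine_id, metric"
):
    print(row)
```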
Real-Time as an Evolving Concept
Our definition of “now” – or “real-time” – continues to evolve over time too. As Rebecca Solnit points out in her book River of Shadows, it was, in fact, the arrival of train schedules in the mid-1800s that first forced modern society to collectively agree on the current time. Prior to that, the clocks in the bank, post office, saloon, and everywhere else weren’t in sync because they didn’t have to be. Suddenly, it was the clock at the train station that mattered most. Timewise, trains got us all on the same track.
As our situational Descriptive Analytics have increased efficiencies and helped fuel a competitive-advantage arms race, our data-centric definition of real-time has steadily shortened from weeks, days, and hours to minutes, seconds, and milliseconds. Continuous integration through data streaming displaces batch loading, or you get left behind. When a machine that’s playing a part in the symphony of manufacturing goes silent, kicking off a workflow to fix it 15 minutes later is, these days, about 14 minutes and 59 seconds too late. Action must be taken immediately.
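As a minimal sketch of that immediacy, assuming only a stream of (machine_id, timestamp) heartbeat events rather than any particular messaging system, a monitor can flag a silent machine the moment the data stops instead of waiting for the next batch load:

```python
import time

ALERT_AFTER_SECONDS = 5  # illustrative threshold, not from the article

def monitor(heartbeats):
    """Consume a stream of (machine_id, timestamp) heartbeats and flag any
    machine that has gone quiet. A production system would also re-check on
    a timer so silence is caught even when no new events arrive at all."""
    last_seen = {}
    for machine_id, ts in heartbeats:
        last_seen[machine_id] = ts
        now = time.time()
        for m, seen in last_seen.items():
            if now - seen > ALERT_AFTER_SECONDS:
                print(f"ALERT: {m} silent for {now - seen:.0f}s; "
                      "kick off the repair workflow immediately")
```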
But better than knowing immediately is knowing about something before it happens. Enter “Predictive Analytics.” Building on Descriptive Analytics, we can spot trends and foretell specific outcomes. If Descriptive Analytics is the “what and why” question, this is the “when” question. These prophetic predictions have been with us almost as long as their descriptive companions. Both require asking a specific question to get a specific answer, and both have been steadily trending toward fresher data inputs as efficiencies improve and competition intensifies.
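As a minimal sketch of that “when” question, here is a trend extrapolation with NumPy; the vibration readings and failure threshold are invented for illustration:

```python
import numpy as np

# Hypothetical daily vibration readings from a turbocharger sensor;
# the values and threshold are invented for illustration.
days = np.arange(10)  # observation days 0..9
vibration = np.array([2.1, 2.2, 2.2, 2.4, 2.5, 2.7, 2.8, 3.0, 3.1, 3.3])
FAILURE_THRESHOLD = 6.0  # level at which the part is expected to fail

# Descriptive: what is happening? Fit the trend in the data so far.
slope, intercept = np.polyfit(days, vibration, deg=1)

# Predictive: when will it happen? Extrapolate to the failure threshold.
days_until_failure = (FAILURE_THRESHOLD - intercept) / slope - days[-1]
print(f"Trend: +{slope:.2f}/day; predicted failure in ~{days_until_failure:.0f} days")
```

A real model would be far more sophisticated than a straight line, but the shape of the answer is the same one our locomotive makers acted on: not just that a part is degrading, but roughly when it will fail.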
A dramatic example of Predictive Analytics can be seen in a video where a Tesla on “autopilot” in the Netherlands predicted, before it happened, an accident between two cars beyond the driver’s line of sight, slowing down early enough to spare its passengers from a potentially fatal collision. Predictive Analytics was also clearly in play when our locomotive makers could see the parts trending toward failure months in advance. This is real-time Predictive Analytics on wheels, brought to you by technologies spanning multiple generations.
Defining Prescriptive Analytics
But it doesn’t stop there. The push toward data-driven decisions requires us to integrate new data faster and know our data better, and compels us to tear down the walls of siloed data. Repeated over and over is the story where each LoB has crafted its own single (or multiple) source of truth over the years. What began as a controlled pilot project morphed into a mission-critical monster with an insatiable appetite. Over the last several years, to contain these creatures, we have in droves cordoned them off to the cloud (or plan to), where they can gorge on unlimited resources so long as they continue to do exactly what they’re told.
Data from manufacturing plants, sales organizations, financial systems, and customer support structures is typically managed independently, yet the company’s goals are supposed to transcend them all. Once the data barriers are broken down and the monsters merged, a truly holistic view of the enterprise is allowed to rise up and, as some enterprises are already learning, a brave new opportunity emerges: unleash the beast so that it can tell us what to do. Or at least give us several strategic options based on a complex array of goals, constraints, and requirements. This is how the entire ecosystem was able to benefit in our train story. Coined by IBM in 2010, the term for this is “Prescriptive Analytics.”
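As a minimal sketch of that idea, here is a toy prescription framed as constrained optimization with SciPy’s linprog; the costs, capacities, and demand figures are invented to echo the locomotive story, and a real system would weigh far more goals and constraints:

```python
from scipy.optimize import linprog

# A toy prescriptive question, with invented numbers: how many spare parts
# to build in the regular run now (cheap, limited capacity) versus
# rush-ordering later (expensive, unlimited) to cover predicted failures?
COST_REGULAR = 1.0     # relative cost per part in a scheduled run
COST_EMERGENCY = 6.0   # relative cost per rush-ordered part
CAPACITY_REGULAR = 3   # spare slots in this month's production run
PREDICTED_DEMAND = 4   # failures the predictive models expect

# Minimize total cost: COST_REGULAR*x + COST_EMERGENCY*y
# subject to x + y >= PREDICTED_DEMAND  (cover every predicted failure)
#            0 <= x <= CAPACITY_REGULAR (can't overload the regular run)
result = linprog(
    c=[COST_REGULAR, COST_EMERGENCY],
    A_ub=[[-1, -1]],           # -(x + y) <= -demand
    b_ub=[-PREDICTED_DEMAND],
    bounds=[(0, CAPACITY_REGULAR), (0, None)],
)
regular, emergency = result.x
print(f"Prescription: build {regular:.0f} now, rush-order {emergency:.0f} if needed")
```

The prescription is the cheapest plan that still covers every predicted failure, which is exactly the shape of answer our train makers chose from.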
Gartner predicts that by as soon as 2019 the Prescriptive Analytics software market alone will reach $1.1 billion. The rise in status of the programming nerds over the last few decades is poised to be paralleled by that of the math geeks. Armed with nearly unlimited Cloud Computing capacity and timed with the dawn of useful Artificial Intelligence, these so-called Data Scientists and decision analysts are expected to help catalyze the next wave of better alternatives for solving today’s problems. That is, if we let them.
Understanding the World
The true test will not be in optimizing manufacturing and supply chains, which have always been steeped in science and are well suited to Prescriptive Analytics. I believe the true test will come when outcomes are more social; when they affect people and policies; when they cause us to switch tracks and change directions. It will be whether decision makers at the highest levels are willing to cast aside their common sense and intuition when presented with counterintuitive yet data-driven prescriptive options. The more seasoned we are, the harder it can be to admit that our brains are no good at comprehending complex systems like the economy, an industry, or a large company. At the same time, we must move forward. As the physicist, mathematician, and social science luminary Duncan J. Watts puts it in his insightful book Everything Is Obvious (Once You Know the Answer), “[C]ommon sense is wonderful at making sense of the world, but not necessarily at understanding it.” Data Analytics has been helping us understand it by giving us hindsight and insight, and now, if we allow it, foresight. Prescriptive Analytics has arrived, and early boarding has begun.