In our field of predictive analytics (asset health and failures), and I presume the same goes for other fields, there is a major misconception about what an algorithm represents. Speak with business leaders and, more worryingly, with practitioners too, about PdA, and in most cases the conversation will turn to “and then we build the algorithm that describes the behaviour of…” Here’s some bad news for these people: many, if not most, assets can’t be described with a single algorithm. No sooner does the algorithm enter production than it starts to age, and as with most things technology-related, it ages fast and at ever faster speeds.

I was listening to Peter Hinssen talk about innovation and disruption the other night and he mentioned “the end of average”. As an example, he used the pharmaceutical industry, where drugs are made to be applicable in as many cases as possible. Today, this industry is a volume business. What about tomorrow? Well, there are signs that certain drugs will be tailored to both the patient and the disease. This will allow faster and more effective treatment with fewer side effects.

Having some experience in the printing industry (for those who know this industry, I was on the extreme end: printing at 600lpi), I can tell you that a printing press can behave very differently depending on the circumstances. Not only will the printed result differ, but the reliability of the machine also depends on the operator, the ink, the atmosphere, etc. To make matters even more complex, this also changes over time due to the ageing of the machine, the impact of maintenance, etc.

The only way to deal with this variety is to continuously re-evaluate the algorithm used to predict machine behaviour. “Algorithm” should no longer refer to a static formula but rather to whatever is applicable today (or even: at this precise moment). Averaging algorithms simply can’t generate good enough results (in most cases – feel free to try!). This creates a problem for approaches that rely heavily on data scientists to analyse the problem: how fast can the data scientists adapt to the ever-changing requirements for the algorithm, and at what expense? Bandwidth, speed, flexibility and cost should be key considerations alongside forecasting effectiveness when launching a predictive analytics project.
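To make that a bit more concrete, here is a minimal sketch, entirely my own assumption about one possible setup rather than a prescribed method, of what continuous re-evaluation could look like: refit the model on a rolling window of recent observations and flag it for refresh when its error drifts. The libraries (scikit-learn, NumPy), the window size, the tolerance and the synthetic data are all placeholder choices.

```python
# A minimal sketch (my own illustration, not a reference implementation) of
# keeping a predictive model tied to current machine behaviour: the model is
# refit on a rolling window of recent data and flagged for refresh when its
# error drifts. Window sizes, tolerances and the toy data are placeholders.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error


def refresh_model(X, y, window_size=500):
    """Fit a model on only the most recent `window_size` observations."""
    model = LinearRegression()
    model.fit(X[-window_size:], y[-window_size:])
    return model


def needs_refresh(model, X_new, y_new, baseline_mae, tolerance=1.25):
    """True when the error on fresh data drifts beyond `tolerance` times
    the error the model achieved when it was last (re)trained."""
    return mean_absolute_error(y_new, model.predict(X_new)) > tolerance * baseline_mae


# Toy data: sensor readings whose relationship to the target slowly drifts,
# mimicking machine ageing, maintenance effects, changing operators, etc.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))
drift = np.linspace(1.0, 3.0, 2000)
y = (X @ np.array([2.0, -1.0, 0.5])) * drift + rng.normal(scale=0.1, size=2000)

static_model = refresh_model(X[:500], y[:500])      # trained once, never revisited
rolling_model = refresh_model(X[:-200], y[:-200])   # refit on the most recent window

X_eval, y_eval = X[-200:], y[-200:]                 # the newest, unseen data
print("static model MAE :", mean_absolute_error(y_eval, static_model.predict(X_eval)))
print("rolling model MAE:", mean_absolute_error(y_eval, rolling_model.predict(X_eval)))
```

On the drifting toy data, the model trained once and left alone falls behind the one refit on recent observations, which is the whole point: the “algorithm” is whatever fits the machine as it behaves now.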

The four forces of disruption are now being listed as: data – networks – intelligence – automation. Predictive analytics has in it the elements to be all of those – imagine the disruptive potential! However, while we see a lot of data, plenty of networks and at least some intelligence, automation is often still just a concept. Automating data science is key to predictive analytics projects that not only work as a proof of concept but can also evolve to cover real-life situations.
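As a rough illustration of what “automating data science” can mean in its simplest form (again an assumption of mine, not a description of any particular product or project), the sketch below scores a set of candidate models automatically and promotes the best one; the kind of step that could run on every refresh cycle without a data scientist in the loop.

```python
# A rough sketch (my assumption, not a prescribed approach) of the smallest
# possible automated model-selection step: cross-validate each candidate and
# promote the winner. The candidate list, metric and fold count are placeholders.

from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor


def select_model(X, y, candidates=None, cv=5):
    """Cross-validate every candidate and return the best one, refit on all data."""
    if candidates is None:
        candidates = [Ridge(), DecisionTreeRegressor(), RandomForestRegressor(n_estimators=50)]
    scores = [
        cross_val_score(m, X, y, scoring="neg_mean_absolute_error", cv=cv).mean()
        for m in candidates
    ]
    best = max(zip(scores, candidates), key=lambda pair: pair[0])[1]
    return best.fit(X, y)   # refit the winning model on all available data


# Example: pick whichever model best fits the most recent window of data,
# e.g. model = select_model(X[-500:], y[-500:])
```

Run inside a refresh loop like the earlier sketch, this turns “the algorithm” into whatever happens to work best on today’s data rather than a formula fixed at project kick-off.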

The Algorithm is dead, long live the algorithm!