The Scalability of Data Science: Part 3 – Reality Check  

You’re an operations manager, or even a CEO. Let’s say you have a big predictive analytics initiative and need to deploy several instances of your solution. What are your options? OK, go ahead and hire a team of 5 data scientists, each with a relatively modest salary of $100k (US) per year (a very conservative estimate). Now step back… You’ve just spent half a million dollars on staffing (plus the time for hiring, software procurement, etc.) for something that will develop slowly and, if it works at all, may not work well. Have you made a wise investment?

This is the reality most companies entering the realm of IoT and predictive analytics will face. Why? Most predictive analytics solutions can’t scale (i.e., can’t be rapidly applied across different problems). They’re too time consuming and too expensive, and the value may be lost in the ramp-up attempt. A deployed predictive analytics solution must be scalable, fast, and affordable. A data scientist can be great (and many are), but they’re limited in how far they can scale and by the subjectivity of their approach to a solution. There are many correct ways to approach a data analysis problem, but there’s probably an alternative that is more valuable.

The next generation of predictive analytics solutions should be able to accomplish most, if not all, of the above automatically and rapidly, with decreasing involvement from humans, and should perform as well as or better than a good data science team; this is what Predikto has done (patents pending). We enable operations managers and data scientists by tackling the bulk of the grunt work.

I’m well aware that this may downplay the need for a large data science industry, but really, what’s an industry if it can’t scale? A fad, perhaps. Data science is not just machine learning and some basic data manipulation skills. There’s much more to a deployed solution that will impact a customer’s bottom line. To make things worse, many of the key components of success are not covered in textbooks or in online data science courses.

It’s one thing to build the “best” predictive analytics solution (e.g., to win a Kaggle competition), but try repeating that process dozens of times in a matter of weeks for predictions of different sorts. If any of these solutions is not correct, it costs real dollars. Realistically scaling in an applied predictive analytics environment should scare the pants off any experienced data scientist who relies on manual development. Good data science is traditionally slow and manual. Does it have to be?

Rest assured, I’m not trying to undercut the value of a good data scientist; it is a needed trade. The issue is simply that data science is difficult to scale in a business setting.

The Scalability of Data Science: Part 2 – The Reality of Deployment

To put my previous post into perspective, let me give you a for instance… An organization wants to develop a deployed predictive analytics solution for an entire class of commuter trains. Let’s be modest and go with 10 different instances from within the data (e.g., 1) predicting engine failure, 2) turbocharger pressure loss, 3) door malfunction, … and so on). We’ll focus on just one…

Data from dozens of assets (i.e., trains) are streaming in every second or faster, and these data must be cleaned and aggregated with other data sources. It’s a big deal to get just this far. Next, you have to become an expert in the data and begin cleaning and developing context-based feature data from the raw source data. This is where art comes into play, and this part is difficult and time consuming for data scientists. Once a set of inputs has been established, then comes the easier part: applying an appropriate statistical model (or models) to predict something (e.g., event occurrence, time to failure, latent class, etc.), followed by validating and deploying the results. Oh yes, let’s not forget the oft-unspoken reality of setting thresholds for the customer (i.e., weighing the costs of true positives vs. false positives, etc.). To this point, we’re assuming the solution has value, and it’s important to keep in mind that the data science team has probably never seen this sort of data before.
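To make those last steps concrete, here is a minimal sketch (in Python, with pandas and scikit-learn) of aggregating raw sensor streams into features, fitting a model, and then choosing an alert threshold from customer-specific costs. Everything in it is an illustrative assumption – the synthetic data, the column names, and the dollar figures – not a description of Predikto’s actual pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy stand-in for per-second sensor streams from a couple dozen trains.
rng = np.random.default_rng(0)
n = 200_000
raw = pd.DataFrame({
    "asset_id": rng.integers(0, 24, n),
    "timestamp": pd.Timestamp("2015-01-01")
                 + pd.to_timedelta(rng.integers(0, 30 * 86400, n), unit="s"),
    "engine_temp": rng.normal(90, 8, n),
    "turbo_pressure": rng.normal(2.1, 0.3, n),
})
raw["failed_within_24h"] = (raw["engine_temp"] > 100).astype(int)  # toy label

# Feature step: aggregate the raw stream into hourly, per-asset features.
hourly = (raw.set_index("timestamp")
             .groupby("asset_id")[["engine_temp", "turbo_pressure", "failed_within_24h"]]
             .resample("1h").mean().dropna().reset_index())

X = hourly[["engine_temp", "turbo_pressure"]]
y = (hourly["failed_within_24h"] > 0).astype(int)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Modeling step: the "easier part" once the inputs exist.
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
probs = model.predict_proba(X_val)[:, 1]

# Threshold step: weigh a false alarm (needless inspection) against a
# missed failure (unplanned downtime), using illustrative dollar costs.
COST_FP, COST_FN = 500.0, 20_000.0
grid = np.linspace(0.01, 0.99, 99)
costs = [COST_FP * ((probs >= t) & (y_val.values == 0)).sum()
         + COST_FN * ((probs < t) & (y_val.values == 1)).sum()
         for t in grid]
print(f"deploy with alert threshold {grid[int(np.argmin(costs))]:.2f}")
```

Note how little of the sketch is the model itself; the data wrangling and the customer-specific cost trade-off dominate, which is exactly the scaling problem described above.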

So on top of requiring computer programming skills, feature engineering prowess (which is an art), an understanding of statistics/machine learning, and communication skills good enough both to learn from the customer about their data and to “sell” the solution, this must all be accomplished in a reasonable amount of time. We’re talking about one instance to this point, remember? And we’re still not deployed. Do you have expertise in deploying the solution for the customer? Now repeat this situation ten times and you’re closer to reality. Your team may have just filled up the next 12 months with work, and the utility of the solution is still unknown.

3 Ways to Improve Performance Using Predictive Maintenance

Predikto prevents failure of critical assets

As information technology rapidly advances and proliferates into business, success depends on how well businesses process and analyze the information and data they have access to for clearer insights.

A recent study revealed “failure of critical assets” to be the biggest problem faced in operations. This is attributed not only to a lack of insight but also to the absence of “predictability” in the health and performance of assets, which reduces asset productivity and efficiency, both fundamental to a successful business. Here are 3 value propositions of predictive analytics for making better decisions:

    1. Monitor, maintain, and maximize assets to gain better utilization and performance in real time: With data arriving in real time, it is even more challenging to deliver immediate, actionable insights that boost productivity and efficiency across machinery, platforms, and market changes. Without the proper manpower or skills, this can prove an uphill task for most IT departments. Hence, appropriate IT investments in professional teams could potentially help businesses revamp their decision-making processes and effectiveness so the right management decisions can be made instantly.

    2. Predict asset failures to improve quality and supply chain processes: The key difference between preventive and predictive maintenance is that predictive maintenance accounts for the asset’s frequency of usage, wear and tear, and environmental conditions to accurately predict its failure. This prediction is vital for resource planning, for placing orders for spare parts, and for financial planning and budgeting as well.

    3. Eliminate risk and uncertainty in the decision-making process: Capital equipment in production facilities carries high costs and is subject to unplanned downtime. By adopting predictive analytics, we can optimize decisions within resource constraints by surfacing issues and recommending immediate actions, ultimately maximizing efficiency and production for the company.

Example – By adding predictive maintenance capabilities to its production lines, a steel manufacturing plant can reduce production delays by predicting whether a batch will be delayed before the rolling process starts, improving reliability and thereby significantly reducing maintenance costs.
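As a rough sketch of how such a prediction could be set up: the Python snippet below trains a classifier on synthetic stand-in batch records and scores the next batch before rolling starts. The feature names, figures, and simple logistic model are all illustrative assumptions, not any plant’s real system.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Toy historical batch records standing in for the plant's data.
rng = np.random.default_rng(1)
n = 5_000
batches = pd.DataFrame({
    "furnace_temp": rng.normal(1200, 25, n),      # Celsius at charge (assumed)
    "mill_queue_len": rng.integers(0, 12, n),     # batches ahead in queue
    "days_since_maint": rng.integers(0, 90, n),   # mill maintenance lag
})
# Toy ground truth: cold furnaces and overdue maintenance cause delays.
logit = 0.04 * (1180 - batches["furnace_temp"]) + 0.03 * batches["days_since_maint"] - 1.0
batches["was_delayed"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

FEATURES = ["furnace_temp", "mill_queue_len", "days_since_maint"]
model = LogisticRegression(max_iter=1000).fit(batches[FEATURES], batches["was_delayed"])

# Score the next batch BEFORE rolling starts; hold it if a delay looks likely.
next_batch = pd.DataFrame([{"furnace_temp": 1150, "mill_queue_len": 9, "days_since_maint": 70}])
p_delay = model.predict_proba(next_batch[FEATURES])[0, 1]
print(f"predicted delay probability: {p_delay:.0%}" + ("  -> hold batch" if p_delay > 0.5 else ""))
```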

Using Predictive Analytics in Maintenance

Predictive Maintenance is the best type of maintenance a company can undertake, but not all asset classes and applications justify it. That is why the best type of maintenance is the type that works for each client. The latest technology in Predictive Maintenance is the use of Predictive Analytics, which in some cases reaches accuracies above 95% in predicting an asset failure. These results are much higher than those of traditional predictive maintenance techniques like lubrication analysis, infrared, and vibration analysis. These are all excellent techniques, and companies should continue using them if they are seeing success reducing downtime, extending the lifetime of equipment, and subsequently saving money.

In the MAPCITE blog, Eric Spiegel, CEO of Siemens U.S.A., observes that “while analytics were implemented widely in industries such as banking and communications initially, we view capital-goods organizations as a huge untapped opportunity, driven primarily by the “Internet of things” and the significant potential to optimize product development, supply chain and asset related services. One example is predictive maintenance – if we were able to better predict when critical and expensive equipment is most likely to fail, we could reduce downtimes, extend the lifetime of the equipment, and realize significant savings”. Read the entire story HERE.

Here Comes the Flood of Analytics, Manufacturing Industries!

Yes – we have heard about the magic of predictive analytics and how it has helped various companies and industries in predicting the rise and fall of companies’ finances, social media, marketing, and such. But could the same predictive analytics methods aid manufacturing industries today?

According to Bala Deshpande’s article, manufacturing industries are not entirely oblivious to the idea of collecting data. As a matter of fact, you could call them ‘The Forefathers of Data Collecting’. Manufacturing industries have been collecting data for years on their current operations and the quality of their products. However, the time has come for them to start digging deeper into these data sets to improve their operations, so much so that companies could notably improve their production yield.

The benefits of engaging in predictive analytics can be seen when the production process becomes even more efficient and unnecessary costs (e.g., unexpected machine failure) are cut. Deshpande highlighted a small manufacturing company that has already started to engage in predictive analytics by installing overhead GPS sensors that record the number of workers on a particular project and whether the project requires assembly, in order to gauge how heavily machines are being used and to predict machine failures.

Whether manufacturing companies like it or not, engaging in analytics is inevitable, especially when competing manufacturers are using the same predictive analytics to outperform them!

Maintenance Analytics to Generate $24.7 Billion in 2019

Predikto was recently featured in a Maintenance Analytics report by ABI Research. The London-based research group forecasts that revenues from maintenance analytics will total $9.1 billion this year. The market will reach $24.7 billion in 2019, driven largely by the adoption of predictive analytics and M2M connectivity.

Senior analyst Aapo Markkanen comments, “Today, predictive maintenance is one of the commercially readiest forms of M2M and IoT analytics, possibly second only to usage-based insurance. It helps asset-intensive organizations transform their maintenance operations and eliminate waste, reducing costly downtime. Infrastructure, vehicles, and industrial equipment can all benefit from it.”

ABI Research stated that Predikto is one of the vendors specializing in predictive maintenance in specific verticals. ABI Research provides in-depth analysis and quantitative forecasting of trends in global connectivity and other emerging technologies.

We are excited to be recognized as a leader in Predictive Analytics for Asset Intensive industries with a focus on improving overall maintenance and equipment reliability.

The report can be found HERE.

Preventive vs. Predictive Maintenance

You are about to leave on a road trip vacation. Before leaving, you check the fluid levels, change the oil, and check the tires. You are diligent about performing the manufacturer-recommended maintenance. This is called Preventive Maintenance: you are, in a preventive way, trying to ensure that your machine will run without problems. And this is the exact same process big asset-intensive companies follow with their preventive maintenance programs.

But is that preventive maintenance enough to ensure there will be no problems with your car? The answer is no. Even though preventive maintenance is ideal for preventing failures, we need to complement it with something else to avoid or reduce unplanned breakdowns. In our example, imagine that the car itself anticipates that, given the current coolant level, the RPMs over the past few hours, and the temperature and humidity of the environment, it will likely fail in the next 45 minutes. The car can warn you to stop and add more coolant because it will begin to overheat in the next 30 minutes and possibly fail in 45. That approach is called Predictive Analytics, and it results in you performing Predictive Maintenance.

Predictive Analytics relies on data scientists to analyze the data generated by maintenance history and sensors. The result is a set of predictive models (mathematical and statistical models) that predict, almost in real time and with high probability and accuracy, when a machine will fail.
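As a rough illustration of the car example above, the Python sketch below fits a regressor to synthetic telemetry history and warns when the predicted time to overheating drops below 45 minutes. Every column name and number here is an invented assumption for illustration, not data from any real vehicle.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Toy history standing in for real telemetry: each row is a snapshot of the
# car's state plus the minutes that elapsed before it actually overheated.
rng = np.random.default_rng(2)
n = 4_000
history = pd.DataFrame({
    "coolant_level": rng.uniform(0.1, 1.0, n),   # fraction of a full reservoir
    "rpm_avg_2h": rng.uniform(1500, 4500, n),
    "ambient_temp": rng.uniform(-5, 45, n),      # Celsius
    "humidity": rng.uniform(0.1, 1.0, n),
})
# Toy ground truth: low coolant, high RPMs, and heat shorten the time left.
history["minutes_to_overheat"] = (
    400 * history["coolant_level"]
    - 0.05 * history["rpm_avg_2h"]
    - 2.0 * history["ambient_temp"]
    + 300 + rng.normal(0, 15, n)
).clip(lower=1)

FEATURES = ["coolant_level", "rpm_avg_2h", "ambient_temp", "humidity"]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(history[FEATURES], history["minutes_to_overheat"])

def check(reading: dict) -> None:
    """Score one live sensor reading and warn the driver if failure looks imminent."""
    minutes = model.predict(pd.DataFrame([reading])[FEATURES])[0]
    if minutes <= 45:
        print(f"WARNING: likely overheat in ~{minutes:.0f} min – stop and add coolant")

check({"coolant_level": 0.1, "rpm_avg_2h": 4400, "ambient_temp": 42, "humidity": 0.8})
```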

Predictive Maintenance can be improved significantly by using the power of Predictive Analytics. The goal is not to replace traditional preventive maintenance programs, but to work alongside them to minimize maintenance costs and prioritize which pieces of equipment need attention first.