Digital Transformations: From Analysis Paralysis to Execution Mode


I have never been more excited about the future of Predikto. We started four years ago with the vision of “Moving Unplanned to Planned”. We wanted to help large industrials “harness the power of predictive analytics to optimize operational performance”. We are enabling this with:

  1. Our software platform, including Predikto MAX, which automates machine learning algorithm generation at massive scale
  2. Our unique approach to data preparation optimized for Machine Learning

So why am I so excited? We have seen a big shift in the past 12 months: organizations are moving from “analysis paralysis” to “let’s start to execute”. We love it when prospects get what we have built. This was not the case three or four years ago. It takes a sophisticated organization to be ready to capitalize on our technology. We look for:

Clear Strategy

A clear strategy, backed by executive support, that incorporates AI, predictive analytics, and digital transformation investments that will enable the organization to increase revenue or cut costs by leveraging its own data. A recent report by IDC found that 80% of senior executives said investing in digital transformation is critical to future success. Investments in digital transformation initiatives are projected to reach USD 2.2 trillion by 2019, roughly 60% more than this year.

Organizational Readiness

Organizational readiness is another key trait of prospects and customers who are ready for our technology. Most customers have hired a Chief Digital Officer from the outside to change the way they tackle innovation and digital transformation. They have dedicated teams with the power and budgets to run multiple pilots with companies big and small to learn how new technology can bring tangible value to their organization. They are learning to move fast and fail fast. The best ones are learning from startups and aligning their key initiatives with true disruptors. If you are looking for Ginni to sell you IBM Watson to solve all your problems, you are going to have a rude awakening in 18 months. We actually look for prospects who have already hired IBM and failed. IBM, please send me your list of pilot customers from the past three years for predictive maintenance projects.

Technical Transformation

Technical transformation means a lot of different things depending on the industry vertical. Some customers did not have access to their own equipment sensor data because the OEM would keep it; they had to invest in new hardware to tap into the sensor data inside trains. Others had data stored in siloed on-prem historians, and it was a challenge to get their IT security organization to push that data to the cloud. Others are still trying to figure out which cloud provider to go with. If you are still wondering, there are only two you should consider: AWS and Microsoft Azure.

We are finding that prospects that are moving and executing have figured out all three components of their digital transformation. IDC also states that by 2019, 40% of all digital transformation initiatives, and 100% of all effective IoT efforts, will be supported by cognitive/AI capabilities. I am excited about our future and the AI / machine learning software we have built to bring value to large industrial transportation companies looking to move from Unplanned to Planned using a data-driven approach to complement their engineering-based condition monitoring.

The Missing Link in Why You’re Not Getting Value From Your Data Science


by Robert Morris, Ph.D.

DECEMBER 28, 2016

Recently, Kalyan Veeramachaneni of MIT published an insightful article in the Harvard Business Review entitled “Why You’re Not Getting Value from Your Data Science.” The author argued that businesses struggle to see value from machine learning/data science solutions because most machine learning experts tend not to build and design models around business value. Rather, machine learning models are built around nuanced tuning and subtle, yet complex, performance enhancements. Further, experts tend to make broad assumptions about the data that will be used in such models (e.g., consistent and clean data sources). With these arguments, I couldn’t agree more.

 

WHY IS THERE A MISSING LINK?

At Predikto, I have overseen many deployments of our automated predictive analytics software across Industrial IoT (IIoT) verticals, including the transportation industry. In many cases, our initial presence at a customer is due in part to the limited short-term value gained from an internal (or consulting) human-driven data science effort where the focus had been on just what Kalyan mentioned: the “model,” rather than how to actually get business value from the results. Many companies aren’t seeing a return on their investment in human-driven data science.

There are many reasons why experts don’t bake business objectives into their analytics from the outset. This is largely due to a disjunction between academic expertise, habit, and operations management (not to mention the immense diversity of focus areas within the machine learning world, which is a separate topic altogether). This is particularly relevant for large industrial businesses striving to cut costs by preventing unplanned operational downtime. Unfortunately, one of the most difficult aspects of deploying machine learning solutions geared toward business value is actually delivering and demonstrating that value to customers.

WHAT IS THE MISSING LINK?

In the world of machine learning, over 80% of the work revolves around cleaning and preparing data for analysis, which comes before the sexy machine learning part (see this recent Forbes article for some survey results supporting this claim). The remaining 20% involves tuning and validating results from a machine learning model(s). Unfortunately, this calculation fails to account for the most important element of the process: extracting value from the model output.

In business, the goal is to gain value from predictive model accuracy (another subjective topic area worthy of its own dialog). We have found that this is the most difficult aspect of deploying predictive analytics for industrial equipment. In my experience, the breakdown of effort required from beginning (data prep) to end (demonstrating business value) is really more like:

40% Cleaning/Preparing the Data

10% Creating/Validating a well-performing machine learning model(s)

50% Demonstrating Business Value by operationalizing the output of the model

The latter 50% is something that is rarely discussed in machine learning conversations (with the aforementioned exception). Veeramachaneni is right. It makes a lot of sense to keep models simple if you can, cast a wide net to explore more problems, don’t assume you need all of the data, and automate as much as you can. Predikto is doing all of these things. But again, this is only half the battle. Once you have each of the above elements tackled, you still have to:

Provide an outlet for near-real-time performance auditing. In our market (heavy industry), customers want proof that the models work with their historical data, with their “not so perfect” data today, and with their data in the future. The right solution provides fully transparent and consistent access to detailed auditing data from top to bottom: from what data are used, to how models are developed, to how the output is being used. This is not only about trust; it’s also about a continuous improvement process.

Provide an interface for users to tune output to fit operational needs and appetites. Tuning output (not the model) is everything. Users want to set their own thresholds for each output, respectively, and have the option to return to a previous setting on the fly, should operating conditions change. One person’s red-alert is not the same as another’s, and this all may be different tomorrow.

Provide a means for taking action from the model output (i.e., the predictions). Users of our predictive output are fleet managers and maintenance technicians. Even with highly precise, high-coverage machine learning models, the first thing they all ask is “What do I do with this information?” They need an easy-to-use, configurable interface that allows them to take a prediction notification, originating from a predicted probability, to business action in a single click. For us, it is often the creation of an inspection work order in an effort to prevent a predicted equipment failure; a rough sketch of this flow follows below.
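To make the last two points concrete, here is a minimal, hypothetical sketch in Python of how a user-configurable threshold can turn a predicted failure probability into an inspection work order. The class, function, and field names are invented for illustration; this is not Predikto’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Prediction:
    asset_id: str
    failure_probability: float  # the model's output for this asset

@dataclass
class WorkOrder:
    asset_id: str
    action: str
    reason: str

def to_work_orders(predictions, threshold):
    """Turn model output into maintenance actions using a user-set threshold.

    The threshold belongs to the operator, not the data scientist, and can be
    changed on the fly as operating conditions (and risk appetites) change.
    """
    orders = []
    for p in predictions:
        if p.failure_probability >= threshold:
            orders.append(WorkOrder(
                asset_id=p.asset_id,
                action="inspect",
                reason=(f"predicted failure probability {p.failure_probability:.2f} "
                        f"exceeds threshold {threshold:.2f}"),
            ))
    return orders

# Two users, two appetites: a conservative threshold versus an aggressive one.
predictions = [Prediction("loco-101", 0.82), Prediction("loco-102", 0.35)]
print(to_work_orders(predictions, threshold=0.7))  # only loco-101 triggers an inspection
print(to_work_orders(predictions, threshold=0.3))  # both assets trigger inspections
```

The point of the sketch is that tuning happens on the output, not inside the model: the same predictions produce different work orders for different operators, and every decision is auditable because the reason is carried along with the action.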

Predikto has learned by doing, and iterating. We understand how to get value from machine learning output, and it’s been a big challenge. This understanding led us to create the Predikto Enterprise Platform®, Predikto MAX® [patent pending], and the Predikto Maintain® user interface. We scale across many potential use cases automatically (regardless of the type of equipment), we test countless model specifications on the fly, we give some control to the customer in terms of interfacing with the predictive output, and we provide an outlet for them to take action from their predictions and show value.

As to the missing 50% discussed above, we tackle it directly with Predikto Maintain® and we believe this is why our customers are seeing value from our software.


Robert Morris, Ph.D. is Co-founder and Chief Science/Technology Officer at Predikto, Inc. (and former Associate Professor at University of Texas at Dallas).

What about unplanned?

Everybody’s looking at process inefficiencies to improve maintenance but there’s lower hanging – and bigger – fruit to focus on first: unplanned events!

Maintenance has pretty simple goals: guarantee and increase equipment uptime, and do so at the lowest possible cost. Let’s take a quick look at how unplanned events influence these three conditions.

Guarantee uptime

When production went through the evolutions of JIT (Just In Time), Lean, and other optimisation schemes, schedules got ever tighter and deviations from the plan ever more problematic. WIP (Work In Progress) has to be limited as much as possible for understandable reasons. However, this has the side-effect of also limiting buffers, which means that when any cog in the mechanism locks up, the whole thing stops. Therefore, maintenance is under increasing pressure to guarantee uptime, at least during planned production time. Operational risk is something investors increasingly look at when evaluating big-ticket investments or during M&A due diligence, and for good reason; it’s like investing in a top athlete – don’t just pick the fastest runner, pick the one who can run that fast consistently!

Failures are bound to happen, so the name of the game is to foresee these events in order to remediate them beforehand: planned, and preferably outside of production time.

Increase Uptime

The more you are able to increase (guaranteed) uptime, the more output you can generate from your investment. Unplanned events are true output killers; not just because they stop the failing machine, but also because they may cause a waterfall of other equipment – depending on the failing machine’s output – to come to a halt. Unplanned events should therefore a) be avoided and b) dealt with in the fastest possible manner. The latter means having technicians and parts at hand, which can be very expensive (like insurance policies; they’re always too expensive until you need them). In order to avoid unplanned failures, we have therefore introduced preventive maintenance (for cheaper or cyclical events) and condition-based or predictive maintenance. Capturing machine health and deciding when to pre-emptively intervene in order to avoid unplanned failures is a pretty young science, but one that shows the highest potential for operational and financial gains in the field of maintenance.

Lower maintenance cost

By now most people know that unplanned maintenance costs a multiple of planned maintenance; a factor of three to nine (depending on the industry) is the generally accepted ballpark. It therefore keeps surprising me that most investments have traditionally been made in optimising planned maintenance. Agreed, how to increase efficiencies for planned maintenance is easier to grasp, but we have by now reached a level where returns on extra investments in this field are diminishing.

Enter unplanned maintenance: it can either be avoided (by increasing equipment reliability) or foreseen (in which case it can be prevented). Increasing equipment reliability has not always been the goal of OEMs. In the traditional business model, they made a good buck from selling spare parts, and they therefore had to carefully balance how to stay ahead of the competition without pricing themselves out of the market (reliability comes at a cost). Mind you, this was more an economic balancing act than a deliberate “let’s make equipment fail” decision. Now, however, with uptime-based contracts, OEMs are incentivised to improve equipment reliability.

Unfortunately, unplanned failures still occur; and due to tighter planning and higher equipment utilisation requirements, these failures’ costs have increased! Therefore, in order to lower maintenance costs, we have to lower the number of unplanned events. The only practical way is to become better at foreseeing these events so that interventions can be planned before they occur. The simple plan is: gather data, turn it into information, make predictions, and take action to avoid these events. And voilà, three to nine times more money saved than if we focused on planned events!
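As a rough back-of-the-envelope illustration of what that three-to-nine multiple means, here is a tiny sketch; every number in it is made up for the example, not drawn from customer data.

```python
# Hypothetical example: a planned maintenance event costs 10,000; an unplanned
# event costs 3x to 9x that (the ballpark range cited above).
planned_cost = 10_000
unplanned_events_per_year = 50
foreseen_fraction = 0.4  # assume predictions let us convert 40% of unplanned events

for multiplier in (3, 9):
    unplanned_cost = planned_cost * multiplier
    converted = int(unplanned_events_per_year * foreseen_fraction)
    # Each converted event is handled as a planned intervention instead of a failure.
    savings = converted * (unplanned_cost - planned_cost)
    print(f"{multiplier}x multiplier: converting {converted} events saves {savings:,}")

# Output: 3x -> 20 events * 20,000 = 400,000 ; 9x -> 20 events * 80,000 = 1,600,000
```

Even with these invented numbers, the asymmetry is clear: every unplanned event you can foresee and convert into a planned intervention saves the cost difference between the two, which is why attacking unplanned events yields far more than squeezing further efficiency out of planned work.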

Life can be simple.

5 Reasons Why You Suck at Predictive Analytics


We see it every day… Customer X embarked on a grand in-house initiative to implement Big Data analytics. $X million and Y years later, they can predict nothing, aside from the fact that their consultants seem to get fatter with every passing month, and the engineering guys keep asking for more time and bigger budgets. It’s understandable… These days, cutting through all the noise related to big data analytics can be difficult. The bloggers, analysts, and consultants certainly make it sound easy, yet the road to successful predictive analytics implementations seems to be littered with the corpses of many a well-intentioned executive.

Many of our customers come to us, after they have spun their wheels in the mud trying to implement big data projects on their own or with the help of the talking heads. Below is a list of what I believe to be the top reasons for failed projects with buzzwords omitted:

  1. Data Science and Engineering alignment fail: Or… the fear that engineering will cannibalize data science. After all, “If I can automate Data Science, why do I need the Data Scientists?”, the reasoning goes. Aligning both camps is difficult in larger organizations, as turf wars will erupt. Analytics software should seek to include the data science day-to-day activities rather than exclude them.
  2. Your data sucks: Nothing can save you here. If your vendor/manufacturer is providing shoddy data, you won’t be able to predict or analyze anything. Any consultant that tells you otherwise is selling fertilizer, not analytics. It is best to reach out to your data generator/vendor and work with them to fix the root of the problem.
  3. You hired IBM: Watson does well with Jeopardy questions, but sadly couldn’t even predict the most recent round of IBM layoffs.
  4. You build when you should buy: Predictive analytics is really hard, and chances are it’s not your core competency, so why are you bringing all of this in-house? The real short-term costs of implementing and maintaining custom software, data science groups, engineering groups, and infrastructure can easily eat away millions of dollars, and you’re placing really big bets on being able to hire the high-level talent to pull it off.
  5. Operations Misalignment: Predictions are useless unless there is someone or something to act on the results. It’s important to make operations a partner in the initiative from the onset. Increasing operational efficiency is the goal here, so really… Operations is the customer. A tight feedback loop with ongoing implementation between both camps is a must.

And so that’s the gist of it – 5 bullets forced out of me at our marketing department’s insistence. As much as I enjoy mocking the hype-masters in the industry, these days I find myself extremely busy helping build a real startup, solving real-world problems, for the Fortune 500, for real dollars. 😉

Should you buy a tool or a solution?

There is an old saying “keep your hand on the plow”. It was quite interesting to read about how this phrase came to be, and how it applies to business today.

Most of us pay little attention to tractors in a field as we drive past. Yet it was not long ago that this task was done with a horse and plow. Now I know you are thinking: what the heck does that have to do with business? Plow the ground, plant the crop, tend the crop, and then enjoy the fruits of your labor – sounds a lot like business to me. Back to plowing – keeping straight rows and turning the soil to the right depth is critical. A successful farmer knows to apply constant pressure to the plow. Too little and the plow pulled out of the ground; too much caused the plow to go too deep and stressed out the horse and farmer. Another aspect, if you look closely at old pictures, is that you will see the reins of the horse tossed around the farmer’s neck. If the farmer looked right, the horse knew to pull right. So if the farmer was daydreaming and looking around, the result would be crooked rows. Crooked rows wasted space and made it hard to care for and harvest the crops. The farmer knew that to survive he had to keep a steady hand and look forward to get the best results.

Now we interpret this saying to mean don’t get distracted and don’t lose sight of the goal, in other words focus! In predictive maintenance I see this ageless advice often ignored. Many offerings in this space are tools; legacy technology with an element of predictive analytics added as a “check box”.  It makes a good story:  “Here is a tool that gives you the ability to create your own rules and do your own predictive analytics with a pretty UI.”  Customers get enamored with pretty graphs and slick rules configuration and in doing so lose sight of the goal.  The whole purpose of predictive maintenance is to get your maintenance team at the right location, at the right time, with part in hand before something breaks.  It is all about results – period.

Nobody argues the point above, yet we still see some major companies focusing on the wrong thing when considering predictive analytics for maintenance.  Why?  Maintenance teams are conditioned to find a tool versus finding a solution.

The field of data science grew hand in hand with the explosion of cloud computing. Cloud computing completely turned the concept of buying hardware and operating systems upside down. Today, cloud-based SaaS (software as a service) has eliminated the need for deep expertise in technology and instead simply provides the answer to the business problem. Now businesses can buy complete turnkey solutions instead of buying a tool, buying the hardware to run the tool, paying their IT department to host the tool, and hiring expertise in using and maintaining the tool.

There is a reason data science and machine learning have been the domain of PhDs and academia – advances are happening at a rapid pace, and it takes deep expertise with these tools to get results. In fact, the most important aspect of successful machine learning is the process of feature selection. It is a rare customer who even understands that a feature is data in context, never mind the hundreds of machine learning algorithms available today. Yet they believe they should buy a tool that requires them to define their own features and algorithms. Sadly, these same companies relate stories of failed attempts at using data science to solve some problem. Instead of looking for a solution to a business problem, they looked for a tool. They took their “hand off the plow.” Savvy business leaders understand that buying tools and building expertise in a fast-changing technology space that is not the core of their business is not a good use of their resources. Others understand that business moves too fast and is too competitive to waste time and money buying tools and building expertise when they can buy the complete solution now. Successful companies know they need a solution, not another tool. They “keep their hand on the plow.”
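To make “a feature is data in context” concrete, here is a minimal, hypothetical sketch; the asset, column names, and values are invented for illustration. A raw sensor reading says little on its own; placed in the context of its recent history, it becomes a feature a model can learn from.

```python
import pandas as pd

# Hypothetical raw sensor readings: one bearing-temperature sample per minute per asset.
raw = pd.DataFrame({
    "asset_id": ["loco-101"] * 6,
    "timestamp": pd.date_range("2016-01-01", periods=6, freq="min"),
    "bearing_temp": [71.0, 72.5, 74.0, 78.5, 85.0, 93.0],
}).sort_values(["asset_id", "timestamp"])

# Context turns readings into features: how hot is the bearing relative to its
# recent average, and how fast is the temperature rising?
temps = raw.groupby("asset_id")["bearing_temp"]
raw["temp_rolling_mean_5"] = temps.transform(lambda s: s.rolling(5, min_periods=1).mean())
raw["temp_delta_1"] = temps.diff()  # change since the previous reading
raw["temp_above_mean"] = raw["bearing_temp"] - raw["temp_rolling_mean_5"]

print(raw)  # the engineered columns are the "data in context" a model would learn from
```

Even in this toy example, the useful signal (a bearing running hotter than its own recent norm, and heating up quickly) only exists after the raw data has been put in context; that engineering step, multiplied across hundreds of signals and use cases, is where most in-house efforts stall.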



Predikto, Inc. Partners With New York Air Brake to Incorporate Predictive Analytics In NYAB’s LEADER Advanced Train Control Technology Solutions

ATLANTA, July 14, 2015 /PRNewswire/ — Predikto, Inc. today announced a new collaborative effort where New York Air Brake will incorporate Predikto’s auto-dynamic predictive analytics platform, MAX, into the company’s LEADER advanced train control technology solutions via its internet of things (IoT) initiative.

New York Air Brake, a subsidiary of the Knorr-Bremse Group (Munich, Germany), an innovation leader and supplier in the rail industry since 1890, will integrate a new predictive analytics component to its Advanced Train Control Technology solution, LEADER (Locomotive Engineer Assist/Display & Event Recorder). The mutually developed solution will now leverage a suite of predictive analytics software applications engineered by Predikto, Inc. Predikto’s patent pending solution, called MAX, is an auto-dynamic machine learning engine that draws upon LEADER train data in addition to capturing data external to the train itself, such as weather and line of road conditions. MAX is a self-learning artificial-intelligence solution that adapts itself to rapid changes in context in near real-time in order to provide the most accurate forecasts possible across an array of use-cases.

“Integrating predictive analytics with the rich train information from LEADER will allow the railroads to utilize their data to proactively identify opportunities to improve operating efficiency and rail safety,” said Mario Montag, CEO of Predikto. “Partnering with a premier technology company in the rail industry, such as New York Air Brake, will allow Predikto’s award-winning platform to make a defining impact on the rail industry.”

The predictions provided by MAX will enable new and existing users to incorporate advanced data analytics to enhance the capabilities currently available through LEADER. Predikto’s MAX platform has already proven successful within the rail industry, forecasting failures and health in rail equipment ranging from bullet trains in Europe to wayside detection equipment in North America. This partnership will allow for the deployment of dynamic predictive capabilities that include a locomotive energy efficiency forecaster, a braking efficiency forecaster, and track health forecasting. The LEADER/MAX solution is poised to revolutionize the rail industry by providing advanced insight to improve velocity and operating efficiency.

“You can have data without information, but you cannot have information without data.  Predikto’s MAX allows us to extract every bit of information and turn it into actionable insights that will improve visibility into operations, provide innovative solutions to improve safety, and provide clarity into the critical maintenance and performance indicators that impact the bottom line most,” states Greg Hrebek, Director of Engineering for New York Air Brake.  “The capability offered between us through this collaboration is unprecedented in the rail industry and will rapidly accelerate the value of the investment the railroads have made into locomotive onboard intelligence.”

About New York Air Brake

New York Air Brake, Inc., headquartered in Watertown, NY, has a long-standing history of innovation and technology in the rail industry ranging from providing advanced braking technology for trains to train control systems. New York Air Brake’s mission is to provide superior railroad brake and train control systems, products, and services with high quality and high value. For more information visit the New York Air Brake website at www.NYAB.com.

About Predikto, Inc.

Predikto, Inc., headquartered in Atlanta, GA, provides actionable solutions for the rail industry as well as industrial equipment and fleets using predictive analytics. Its proprietary data analysis and prediction engine is built on an auto-dynamic machine learning protocol that adapts to changing environments in near real time.  Predikto specializes in operationalizing predictions of key industrial events like asset failures and poor asset health to enhance a company’s overall performance.

The company is comprised of engineers, developers, academics, and industry professionals. Predikto’s technology solution enables companies to achieve seamless operational functionality, efficiency and exponential return on their asset investment.

For more information, visit www.Predikto.com.

 

SOURCE Predikto, Inc.

RELATED LINKS
http://www.predikto.com

Deploying Predictive Analytics (PdA) as an Operational Improvement Solution: A few things to consider

“…in data science…many decisions must be made and there’s a lot of room to be wrong…”

There are a good number of software companies out there who claim to have developed tools that can potentially deploy a PdA solution to enhance operational performance. Some of these packages appear to be okay, some claim that they are really good, and others seem really ambiguous, other than being a tool that a data scientist might use to slice and dice data. What’s missing from most that claim they are more than an over-glorified calculator is actual use cases that demonstrate value. Without calling out any names, the one thing these offerings share in common is that they require services (i.e., consulting) on top of the software itself, which is a hidden cost, before they are operational. There is nothing inherently unique about any of these packages; all of the features they tout can be carried out via open-source software and some programming prowess, but herein lies the challenge. Some so-called solutions bank on training potential users (i.e., servicing) for the long term. These packages differ in their look-and-feel and their operation/programming language, and most seem to require consulting, servicing, or a data science team. In each of these cases, a data scientist must choose a platform(s), learn its language and/or interface, and then become an expert in the data at hand in order to be successful. In the real world, the problem lies in the fact that data tends to differ for each use case (oftentimes dramatically), and even after data sources have been ingested and modified so they are amenable to predictive analytics, many decisions must be made; there’s a lot of room to be wrong and even more room to overlook something.

“…a tall order for a human.”

Unfortunately, data scientists, by nature, are subjective (at least in the short term) and slow, whereas good data science must be objectively contextual and quick to deploy, since there are so many different ways to develop a solution. A good solution must be dynamic when there may be thousands of options. A good product will be objective, context-driven, and able to capitalize on new information stemming from a rapidly changing environment. This is a tall order for a human. In fairness, data science is manual and tough (there’s a tremendous amount of grunt work involved), and in a world of many “not wrong” paths, the optimal solution may not be quickly obtained, if at all. That said, a data science team might not be an ideal end-to-end solution when the goal is a long-term auto-dynamic solution that is adaptive, can be deployed in a live environment rapidly, and can scale quickly across different use cases.

“…a good solution must be dynamic…”

End-to-end PdA platforms are available (Data Ingestion -> Data Cleansing/Modification -> Prediction -> Customer Interfacing). Predikto is one such platform, where the difference is auto-dynamic scalability that relieves much of the burden from a data science team. Predikto doesn’t require a manual data science team to ingest and modify data for a potential predictive analytics solution. The platform takes care of most of the grunt work in a very sophisticated way while capitalizing on detail from domain experts, ultimately providing customers with what they want very rapidly (accurate predictions) at a fraction of the cost of a data science team, particularly when the goal is to deploy predictive analytics solutions across a range of problem areas. This context-based solution also automatically adapts to feedback from operations regarding the response to the predictions themselves.
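As a generic illustration of those four stages, here is a toy sketch of the pipeline shape. It is not Predikto’s actual implementation; every function name and data value is invented, and each stage is reduced to a single trivial step.

```python
# Hypothetical sketch of the generic PdA pipeline shape named above:
# Data Ingestion -> Data Cleansing/Modification -> Prediction -> Customer Interfacing.

def ingest(source):
    """Pull raw records from a data source (files, historians, sensor feeds)."""
    return list(source)

def cleanse(records):
    """Drop obviously bad readings; real pipelines do far more modification here."""
    return [r for r in records if r.get("value") is not None]

def predict(records):
    """Stand-in for the model: score each record with a failure probability."""
    return [{**r, "failure_probability": min(1.0, r["value"] / 100.0)} for r in records]

def interface(scored, threshold=0.7):
    """Customer-facing step: surface only actionable predictions."""
    return [s for s in scored if s["failure_probability"] >= threshold]

# Chaining the stages end to end on toy data:
raw_source = [{"asset": "loco-101", "value": 85}, {"asset": "loco-102", "value": None}]
alerts = interface(predict(cleanse(ingest(raw_source))))
print(alerts)  # [{'asset': 'loco-101', 'value': 85, 'failure_probability': 0.85}]
```

The argument in this post is about who owns each stage: a tool hands all four stages to the customer’s data science team, while an end-to-end solution automates the first three and exposes only the last one, the interface where predictions become operational decisions.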

Predikto Solution Utilizing Predictive Analytics

 

Skeptical? Let us show you what Auto-Dynamic Predictive Analytics is all about and how it can reduce downtime in your organization. And by the way, it works… [patents pending]

Predikto Enterprise Platform