Deploying Predictive Analytics (PdA) as an Operational Improvement Solution: A few things to consider

“…in data science…many decisions must be made and there’s a lot of room to be wrong…”

There are a good number of software companies out there claiming to have developed tools that can deploy a PdA solution to enhance operational performance. Some of these packages appear to be okay, some claim to be really good, and others seem ambiguous beyond being a tool a data scientist might use to slice and dice data. What’s missing from most that claim to be more than an overglorified calculator are actual use cases that demonstrate value. Without calling out any names, the one thing these offerings have in common is that they require services (i.e., consulting) on top of the software itself, a hidden cost, before they are operational. There is nothing inherently unique about any of these packages; all of the features they tout can be carried out with open-source software and some programming prowess, but herein lies the challenge. Some so-called solutions bank on training potential users (i.e., servicing) for the long term. These packages differ in their look-and-feel and their operating/programming language, and most seem to require consulting, servicing, or a data science team. In each case, a data scientist must choose a platform (or several), learn its language and/or interface, and then become an expert in the data at hand in order to succeed. In the real world, the problem is that data tend to differ for each use case (often dramatically), and even after data sources have been ingested and modified so they are amenable to predictive analytics, many decisions must be made, there is a lot of room to be wrong, and even more room to overlook.

“…a tall order for a human.”

Unfortunately, data scientists are, by nature, subjective (at least in the short term) and slow, while good data science must be objectively contextual and quick to deploy, since there are so many different ways to develop a solution. A good solution must be dynamic when there may be thousands of options. A good product will be objective, context-driven, and able to capitalize on new information from a rapidly changing environment. This is a tall order for a human. In fairness, data science is manual and tough (there’s a tremendous amount of grunt work involved), and in a world of many “not wrong” paths, the optimal solution may not be obtained quickly, if at all. That said, a data science team might not be an ideal end-to-end solution when the goal is a long-term, auto-dynamic solution that is adaptive, can be deployed in a live environment rapidly, and can scale quickly across different use cases.

“…a good solution must be dynamic…”

End-to-end PdA platforms are available (Data Ingestion -> Data Cleansing/Modification -> Prediction -> Customer Interfacing). Predikto is one such platform, and the difference is auto-dynamic scalability that relieves much of the burden from a data science team. Predikto doesn’t require a manual data science team to ingest and modify data for a potential predictive analytics solution. The platform takes care of most of the grunt work in a very sophisticated way while capitalizing on detail from domain experts, ultimately providing customers with what they want (accurate predictions) very rapidly and at a fraction of the cost of a data science team, particularly when the goal is to deploy predictive analytics solutions across a range of problem areas. This context-based solution also adapts automatically to feedback from operations about the response to the predictions themselves.
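To make the Data Ingestion -> Cleansing/Modification -> Prediction -> Customer Interfacing flow above concrete, here is a minimal sketch of what the stages of such a pipeline might look like in Python. This is not Predikto’s implementation; the function names, column names, and file path are hypothetical placeholders.

```python
# Illustrative end-to-end pipeline skeleton (hypothetical names throughout).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier


def ingest(path: str) -> pd.DataFrame:
    """Pull raw sensor/maintenance records from a source file or feed."""
    return pd.read_csv(path, parse_dates=["timestamp"])


def cleanse(raw: pd.DataFrame) -> pd.DataFrame:
    """Drop records missing key identifiers and forward-fill short gaps."""
    clean = raw.dropna(subset=["asset_id", "timestamp"])
    return clean.sort_values("timestamp").ffill()


def predict(features: pd.DataFrame, target: pd.Series) -> pd.Series:
    """Fit a baseline classifier and return failure probabilities.

    In a real deployment the model would be fit on historical data and then
    used to score new, unseen data; this sketch scores the training set only
    to keep the example short.
    """
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(features, target)
    return pd.Series(model.predict_proba(features)[:, 1], index=features.index)


def publish(scores: pd.Series, threshold: float = 0.8) -> pd.DataFrame:
    """Convert raw scores into customer-facing alerts."""
    return pd.DataFrame({"score": scores, "alert": scores >= threshold})
```

The point of the sketch is simply that each stage is a sizable amount of manual work when rebuilt by hand for every new data source; the value of an auto-dynamic platform lies in automating and adapting these stages, not in any single model.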

Predikto Solution Utilizing Predictive Analytics

 

Skeptical? Let us show you what Auto-Dynamic Predictive Analytics is all about and how it can reduce downtime in your organization. And by the way, it works… [patents pending]

Predikto Enterprise Platform

The Scalability of Data Science: Part 3 – Reality Check  

You’re an operations manager, or even a CEO. Let’s say you have a big predictive analytics initiative and need to deploy several instances of your solution. What are your options? OK, go ahead and hire a team of 5 data scientists, each with a relatively modest salary of $100k (US) per year (a very conservative estimate). Now step back… You’ve just spent half a million dollars on staffing (plus the time for hiring, software procurement, etc.) for something that’s going to develop slowly and, if it works at all, may not work well. Have you made a wise investment?

This is the reality that most companies entering the realm of IoT and predictive analytics will face. Why? Most predictive analytics solutions can’t scale (i.e., can’t be rapidly applied across different problems). It’s too time-consuming and too expensive, and the value may be lost in a ramp-up attempt. A deployed predictive analytics solution must be scalable, fast, and affordable. A data scientist can be great (and many are), but they’re limited by how far they can scale and by the subjectivity of their respective approaches to the solution. There are many ways to approach data analysis that are correct, but there’s probably an alternative that is more valuable.

The next generation of predictive analytics solutions should be able to accomplish most, if not all, of the above automatically and rapidly with decreasing involvement from humans, and should perform as well as or better than a good data science team; this is what Predikto has done (patents pending). We enable operations managers and data scientists by tackling the bulk of the grunt work.

I’m well aware that this may downplay the need for a large data science industry, but really, what’s an industry if it can’t scale? A fad, perhaps. Data science is not just machine learning and some basic data manipulation skills; there’s much more to a deployed solution that will impact a customer’s bottom line. To make things worse, many of the key components of success aren’t covered in textbooks or in online data science courses.

It’s one thing to build the “best” predictive analytics solution and win with it (e.g., a Kaggle competition), but try repeating this process dozens of times in a matter of weeks for predictions of different sorts. If any of these solutions is wrong, it costs real dollars. Realistically scaling in an applied predictive analytics environment should scare the pants off any experienced data scientist who relies on manual development. Good data science is traditionally slow and manual. Does it have to be?

Rest assured, I’m not trying to undercut the value of a good data scientist; it is a needed trade. The issue is simply that data science is difficult to scale in a business setting.

The Scalability of Data Science: Part 2 – The Reality of Deployment

To put my previous post into perspective, let me give you a for instance… An organization wants to develop a deployed predictive analytics solution for an entire class of commuter trains. Let’s be modest and go with 10 different instances from within the data (e.g., 1) predicting engine failure, 2) turbocharger pressure loss, 3) door malfunction, … and so on…). We’ll focus on just one…

Data from dozens of assets (i.e., trains) are streaming in by the second or faster, and these data must be cleaned and aggregated with other data sources. It’s a big deal to get just this far. Next, you have to become an expert in the data and begin cleaning and developing context-based feature data from the raw source data. This is where art comes into play, and this part is difficult and time-consuming for data scientists. Once a set of inputs has been established, then comes the easier part: applying an appropriate statistical model (or models) to predict something (e.g., event occurrence, time to failure, latent class), followed by validating and deploying the results. And let’s not forget the oft-unspoken reality of setting alert thresholds for the customer (i.e., weighing the costs of TPs vs. FPs, etc.; see the sketch below). To this point, we’re assuming the solution has value, and it’s important to keep in mind that the data science team has probably never seen this sort of data before.
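As a hedged illustration of that threshold step: once validated failure probabilities exist, a customer-specific alert threshold can be chosen by trading off the cost of a missed failure against the cost of a false alarm. The cost figures, variable names, and validation data below are fabricated assumptions for illustration only, not values from any real deployment.

```python
# Illustrative threshold selection by expected cost (all numbers fabricated).
import numpy as np


def pick_threshold(y_true, y_prob, cost_fn=50_000.0, cost_fp=2_000.0):
    """Return the alert threshold that minimizes total expected cost.

    cost_fn: assumed cost of a missed failure (e.g., unplanned downtime)
    cost_fp: assumed cost of a false alarm (e.g., an unnecessary inspection)
    """
    y_true = np.asarray(y_true)
    y_prob = np.asarray(y_prob)
    best_t, best_cost = 0.5, float("inf")
    for t in np.linspace(0.01, 0.99, 99):
        alerts = y_prob >= t
        fn = np.sum(~alerts & (y_true == 1))  # failures the alerts would miss
        fp = np.sum(alerts & (y_true == 0))   # alarms raised on healthy assets
        cost = fn * cost_fn + fp * cost_fp
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t, best_cost


# Fabricated validation data, just to show the call:
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=500)
y_prob = np.clip(0.6 * y_true + rng.normal(0.3, 0.2, size=500), 0.0, 1.0)
print(pick_threshold(y_true, y_prob))
```

The specific model doesn’t matter here; the point is that this kind of customer-specific tuning is one more decision a team must revisit for every one of the ten instances.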

So on top of requiring computer programming skills, feature engineering prowess (which is an art), an understanding of statistics/machine learning, and communication skills good enough both to learn about the customer’s data and to “sell” the solution, this must all be accomplished in a reasonable amount of time. We’re talking about one instance to this point, remember? And we’re still not deployed. Do you have expertise in deploying the results for the customer? Now repeat this situation ten times and you’re closer to reality. Your team may have just filled up the next 12 months of work, and the utility of the solution is still unknown.

Predikto at Georgia Manufacturing Day

Georgia Manufacturing Day

Predikto was excited to be part of the “BUY FROM GEORGIA” proclamation event and Georgia Manufacturing Day at the State Capitol. Governor Nathan Deal presented the proclamation to a group of more than 70 manufacturing and business leaders from around Georgia, including Mario Montag, CEO of Predikto.

The event was organized by the Georgia Manufacturing Alliance, the most active group of manufacturing professionals in the state of Georgia. For a list of upcoming GMA events, visit the GMA website.

Mario Montag, CEO of Predikto

Speaking at GA Manufacturing Day

 

Startups driving a $5.7 Billion Market for IoT Analytics in 2015 according to ABI Research


A new report from ABI Research, presented in Q4 2014, states that revenues from integrating, storing, analyzing, and presenting Internet of Things (IoT) data will reach $5.7 billion in 2015.

It is projected that by 2020 the IoT market will make up a third of all big data and analytics revenue.

“About 60% of this year’s revenues come from three key areas: energy management, security management, as well as monitoring and status applications” Aapo Markkanen, principal analyst at ABI Research said. “Within these segments, we can generally find analytic applications that reduce the cost base of asset-intensive operations (condition-based maintenance), automate routine workflows (surveillance), or even enable new business models (usage-based insurance). These early growth drivers also have in common the fact that the economics of IoT connectivity align easily enough with the requirements of analytic modeling.”

A major challenge identified in the research is the deep domain expertise in analysis needed to extract relevant data and insights from machine and sensor data. That challenge is especially hard to overcome for the current leading technologies, which were designed for more traditional analytics. That is why the majority of innovations aimed at filling those gaps are being developed by startup companies. According to practice director Dan Shey, “What is remarkable about this market is how much of the innovation actually comes from startups”.

Predikto, an Atlanta-based startup whose solution helps companies reduce the cost base of asset-intensive operations, was included in the research as one of the niche players in the Core Analytics field.

The full report can be found here.

 

Internet of Things: The third wave of the Internet?


The Internet of Things (IoT) is emerging as the third wave in the development of the Internet. In the 1990s, 1 billion users were connected to the internet; by the 2000s, with the release of Web 2.0 and the popularity of internet-capable smartphones, that number grew to 3 billion; and it is now expected that by 2020 the IoT will connect 28 billion devices to the internet.

According to a report published by Goldman Sachs, “The Internet of Things: Making sense of the next mega-trend,” the Internet of Things “connects devices such as everyday consumer objects and industrial equipment onto the network, enabling information gathering and management of these devices via software to increase efficiency, enable new services, or achieve other health, safety, or environmental benefits.”

The report classifies the “things” into five key verticals for the adoption of this third wave of the Internet: Connected Wearable Devices, Connected Cars, Connected Homes, Connected Cities, and the Industrial Internet.

The infrastructure that will fully support the connection of 28 billion devices, consisting of telecommunications, sensors, and software, among other components, will allow us to use the data generated to make our lives easier (e.g., adjusting the temperature of your home before you arrive), use energy efficiently (e.g., running your washing machine when electricity usage and prices fall in the middle of the night), and help us anticipate and predict failures or problems (e.g., predicting when your car is going to fail even before the check engine light turns on).

One of the main concerns about the connectivity of everything is privacy and security. The key for the IoT to develop to its full potential is to make sure the “things” are genuinely adding value instead of being merely intrusive. Goldman Sachs identified three key areas where the development of the IoT will generate the most value: home automation, resources, and manufacturing.

The Industrial Internet has already drawn large investments of time and money in sensors, connectivity, microcontrollers, and microprocessors, and the amount of data generated is often large and unintuitive. The missing piece of that puzzle is software that combines all that information and analyzes it to transform it into actionable outputs.

Many predictive analytics companies are trying to solve a technology problem: How will you manage all that data? Should you use Hadoop or NoSQL? What are the IT department’s concerns about the data being collected? Is there valuable information in the huge amount of data the company isn’t using? But they are not solving the main issue: What problems in your manufacturing facilities are slowing down your production rates? How can the existing data help you reach and surpass production goals by reducing downtime and inefficiencies?

If the operational problems are not being solved, the future of the IoT will not be bright.

 

Predikto raises $4M from TechOperators, ATA to predict machine failure

By Urvaksh Karkaria
Staff Writer-Atlanta Business Chronicle


Atlanta software firm Predikto has raised nearly $4 million to help manufacturers predict product failures earlier.

Predikto’s software engine — dubbed Max — allows manufacturers, railroad companies and other asset-intensive industries to predict equipment failure and warranty claims.

By detecting failures before they happen, companies can increase productivity, reduce downtime and tweak the manufacturing process to increase production volume, CEO Mario Montag said.

Max — an artificial intelligence and machine learning software robot built to design custom algorithms — uses real-time sensor data, historical maintenance records and past failure data to predict equipment breakdowns.

“Clients give us their data and (Max) spits out algorithms that can predict when a piece of equipment is going to fail,” Montag said.

While predictive analytics software is widely used in business, manufacturing has been late to leverage Big Data to squeeze efficiencies.

Predikto is riding the wave of the industrial Internet of Things. Pumps, motors and other parts are being manufactured with on-board sensors that deliver tons of data on the health and performance of the devices.

“The industrial Internet of Things is creating a drive to do more with data,” Montag said. “The big data revolution of being able to do more and chew more information is sweeping through industrial manufacturing.”

Predikto raised the $3.6 million in a Series A round led by TechOperators, an Atlanta early stage venture firm managed by a quartet of serial entrepreneurs with billion-dollar exits under their belts.
Super angel groups Atlanta Technology Angels (ATA) and Angel Investor Management Group (AIM) also participated in the Predikto raise.

Predikto’s technology, targeted to specific equipment and industry sectors, and its managed-service approach of delivering insight rather than tools differentiate the startup, said Said Mohammadioun, partner at TechOperators.

“The market expects solutions rather than tools,” Mohammadioun said. “They don’t want to be in the business of figuring out how to use tools.”

The capital will be used for product development and sales and marketing — critical challenges for the startup.

Predikto must continue to innovate to stay ahead of the competition, while taking as much market share as possible, said Stephen Walden, a board member at Atlanta Technology Angels.

“That’s why this raise was so important for them to get into the market quickly,” Walden said. “Right now they’ve got, I won’t say a lock on the market, but a proprietary algorithm that nobody else can yet match.”

Launched in late 2012, Predikto is targeting a large addressable market. The industrial predictive analytics market in North America is estimated at more than $10 billion in annual sales, Montag said.

“Our sweet spots are continuous manufacturing facilities, such as steel plants, food & beverage plants and transportation companies with distributed assets — airplanes, trains and truck fleets,” Montag said. Siemens, for example, uses Predikto software to detect failures in train doors and diesel engines.

While the application of predictive analytics has so far been limited to the financial services and retail sector, the next market is industrial systems. Until recently, engineers typically followed a standardized maintenance schedule for industrial equipment, similar to annual maintenance schedules for automobiles.

Predikto relies on real-time data from sensors in the machinery, such as vibrations, electrical usage, and ambient temperature, to give engineers a more efficient predictive maintenance process. Rising temperatures, for instance, when combined with other things, can be a sign of inadequate lubrication, or an incorrectly fitting part that’s causing friction.

“The secret is in being able to run all these variables at once through a sophisticated algorithm to tell what is really going on,” Walden said.

Keeping maintenance downtime for critical equipment to the minimum required can save operators millions of dollars.

“If an engine that powers a steel mill costs $1 million for every hour it is shut down for maintenance, you’d rather not do (maintenance) every thousand hours, if you can do it every five thousand hours,” Walden said.

In the future, Predikto plans to target the booming oil and gas industry.

“The pain of asset downtime in that industry is significant,” Montag noted, adding it costs $500,000 for every day an oil rig is shut down.

For Predikto, future success will depend on getting customers to fix things proactively, which requires a change in the way of doing business, Mohammadioun said.

“Companies know how to fix things that break — now, we are able to tell them when things are going to break,” he said. “That requires a change in culture.”

Via: http://www.bizjournals.com/atlanta/blog/atlantech/2015/01/atlantas-predikto-raises-4m-from-techoperators-ata.html