“If you don’t take action, you don’t realize the value.”
When businesses turn to a tech platform to capture Big Data, one of the key issues is how to leverage different types of data (at rest or in motion) and what kind of analytics to perform on that data. Vitria Technology’s platform involves building analytic pipelines for data to enable historical, predictive, and prescriptive analytics.
In 2015 the company launched an advanced analytics platform for the IoT market that supports both streaming data ingestion and access to a data lake or warehouse. The platform can also be used for operational intelligence, and use cases include predictive maintenance; outage management (such as on an electric grid); fraud detection (such as for banks); supply optimization; and customer engagement and marketing. Customers of Vitria include utilities, banks, telecom companies, and even Starbucks.
Vitria now intends to take its analytics platform to the cloud. In the video below, Dale Skeen, CTO and co-founder of Vitria, explains how such pipelines can be built, using smart meters and the electric grid as examples, to enable actionable analytics for average business users:
Interview by Adrian Bowles, RTInsights executive analyst and founder of Storm Insights, Inc.
Dale Skeen: We actually started Vitria, what we called Generation 1 back in 1994. … We focused, at that time, on some emerging markets of BPM and business activity monitoring. It was really the more real-time integration movement and orchestration of information, and then trying to analyze that. Very successful. Took the company public on that.
By around the late 2000s, 2007, 2008, we decided that there was an emerging opportunity we really wanted to focus the company on, so the original founders and some of the original investors came back in. We invested in the company, took it private, and since then we've been really focused on real-time analytics. That is an area that I am very passionate about and very excited about.
Adrian: You talked about sort of restarting and recapitalizing and going private based on a passion in this area. What is it about analytics that got you excited enough to do this? Because that’s a major undertaking.
Dale: Excited about analytics and excited about real time, both of those, and putting those two together. That's what we're really focused on. With analytics, we're now seeing in business, and especially in the emerging Internet of Things (IoT), that analytics is constituting 30 to 40 percent of the value there. Actually, that percentage is likely to go up. It's something that is contributing huge value to the economy and to businesses. That makes it exciting.
Real time has always been exciting for me, because I believe that people interact in real time in the real world. This is the way we want to conduct businesses. This is the way we should be getting the information. This is the way we should be acting on those. Bringing those two concepts together, to me, was extraordinarily exciting.
Plus, there was a third thing happening in the marketplace, and we've really seen it now: the rise of big data. Actually, most of that now is being generated through the Internet of Things. That is a trend that, as we know, is accelerating. Those three trends really made me very excited about restarting the company and really focusing on this new opportunity.
Adrian: What sorts of technologies are you dependent on for your solutions, and what are you developing yourself?
Dale: The key technologies are the rise of big data infrastructures to support that. Then the rise of streaming big data infrastructures that support that. There’s actually excellent open-source technology out there. These are frameworks that are enablers for acquiring data and perhaps processing it quickly. What you need to build on top of that is the modeling environments that capture the analytics. Enable ordinary people, as we call it, to capture the analytics. The operational people, for example, or the business people to capture that. To really build what we would call complex analytics, dense analytics.
Adrian: Would it be fair to say that, as you’re talking about, ordinary people or regular people, I forget the phrase you used, as opposed to data scientists, that you’re providing some level of abstraction that allows people with more limited knowledge of the underlying technology to get the business value from it?
Dale: We don’t expect people to be rocket scientists, data scientists.
Adrian: That’s a small market.
Dale: Our big data programmers do this. We really want to put it into the hands of the people who know the business. What we’re doing is we’re providing a very high-level abstraction. We call it the analytic pipeline. You think in terms, you capture the data. You enrich the data with contextual information. I think what would help there is if you take an example, coming off of, for example, a smart meter. You want to not only be able to [see] … the operational performance of that … [but] concern about maybe fraud or failure. That data comes in, and you want to be able to capture it. You need to enrich it. In this case, if you’re looking at failure, you would want real-time weather information. As you know, that’s a big contributor to failures on the grid.
You enrich that data, and then you do analytics, like KPIs. You look at voltage sags, up and down for that. That provides a certain set of value. Then you can extend that with the notion of dense analytics. What I’d like to do is, looking at voltage signatures over time, do I predict a failure, or a problem with that? You can come and introduce predictive models now. Predictive failures and predictive maintenance is probably one of the largest use cases for real time information.
The next step is using another type of advanced analytics, which is called prescriptive analytics. Now, descriptive, prescriptive and predictive analytics are things that are not in many people’s vocabularies. A few years ago, they would not be able to use those. It’s really our goal to enable people who are focused on the business itself, or on the things, the Internet of Things that they are trying to measure, to really enable them to do this. This is what we’re bringing to the table, on top of these more basic frameworks that are provided.
What we want to do is enable this analytic pipeline, where you are doing your descriptive analytics. Telling your KPIs, for example, and what’s going on right now. You can do your predictive analytics to say what’s likely to happen next, whether you’re trying to track someone to make them an offer, or whether you’re trying to predict failure in the electric grid. The prescriptive part, which is telling you what’s the next best action. There’s one final piece, which is called the action part. Because if you don’t take the action, you don’t realize the value. That’s the core. That’s what we call the analytic pipeline. As you walk down those steps of the pipeline, you’re determining more and more value. It’s really like the analytic value chain, like the concept of a value chain in there.
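The pipeline Skeen walks through (capture, enrich, descriptive, predictive, prescriptive, action) can be pictured as a chain of stages over each incoming event. The sketch below, using his smart-meter example, is purely illustrative: the function names, thresholds, and toy "model" are hypothetical and are not Vitria's actual API.

```python
# Illustrative sketch of the analytic pipeline described above:
# capture -> enrich -> descriptive -> predictive -> prescriptive -> action.
# All names and thresholds are hypothetical, not Vitria's API.

def capture(reading):
    """Stage 1: ingest a raw smart-meter reading."""
    return {"meter_id": reading["meter_id"], "voltage": reading["voltage"]}

def enrich(event, weather):
    """Stage 2: add contextual data, e.g. real-time weather."""
    return {**event, "storm_nearby": weather.get("storm_nearby", False)}

def descriptive(event):
    """Stage 3: compute a KPI -- flag a voltage sag (what's happening now)."""
    event["voltage_sag"] = event["voltage"] < 114.0  # nominal 120 V, minus 5%
    return event

def predictive(event):
    """Stage 4: toy failure prediction (a trained model would go here)."""
    event["failure_risk"] = 0.8 if (event["voltage_sag"]
                                    and event["storm_nearby"]) else 0.1
    return event

def prescriptive(event):
    """Stage 5: choose the next best action from the prediction."""
    event["action"] = ("dispatch_crew"
                       if event["failure_risk"] > 0.5 else "monitor")
    return event

def run_pipeline(reading, weather):
    """Walk one event down the full pipeline; the caller takes the action."""
    return prescriptive(predictive(descriptive(enrich(capture(reading),
                                                      weather))))

result = run_pipeline({"meter_id": "m-42", "voltage": 109.5},
                      {"storm_nearby": True})
print(result["action"])  # -> dispatch_crew
```

Each stage adds information to the event, which is why Skeen frames the pipeline as a value chain: the further an event travels down the stages, the more actionable it becomes.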
The Analytic Value Chain
What we focus on is enabling our customers to develop those analytic pipelines. We provide a nice visual programming capability to allow them to build these pipelines. We provide all the analytics that they need for that. Now, one factor in that is, I want to do predictions, I need predictive models. I want to do prescriptions, I need prescriptive models. Quite likely, you’ll be using machine learning techniques for that, and you need the historical data. We support the entire life cycle by also capturing the data, supporting machine-learning techniques over that, and allowing you to operationalize those machine-learning techniques to run at speed. We take care of the entire analytic life cycle. That’s what we do. We allow our customers to build this analytic value chain, which delivers value to them. Our tools allow them to maintain the entire analytic life cycle.
We go to market in two ways. We build this analytic platform that allows our customers to define the analytic value chain, or the analytic pipeline. We're also, in just a couple of months, coming out with an analytic-platform-as-a-service offering, a cloud-based offering. We'll have a public side, and they can also install it privately if they want to. They can come and define analytics as a service. We expect them to have an ecosystem around them, as you said. They may be using … a data science service, a machine learning service, et cetera. Our goal, both in the on-premise and the cloud offerings, is to make it easy to capture those analytic models as they build them, and to operationalize them. Because the models have no value until you operationalize them. You build the models. These may be great predictive models. Now you need to insert them into your data stream. That's what we call operationalizing it: running it at scale and at speed, so it gives you those real-time predictions and allows you to take those real-time actions.
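One way to picture the "operationalizing" step Skeen describes is: a model is learned offline from historical data, then inserted into the live stream so every incoming event gets a real-time prediction. The sketch below is a minimal, hypothetical illustration; a real deployment would use an actual machine-learning model and a streaming framework rather than a Python generator.

```python
# Illustrative sketch of operationalizing a model: train offline on
# historical data, then run the model over the live event stream.
# Names, data, and the toy "model" are hypothetical.

def train_on_history(history):
    """Offline step: learn a simple threshold from historical readings.
    (In practice, this is where machine-learning techniques would be used.)"""
    failures = [r["voltage"] for r in history if r["failed"]]
    # Highest voltage at which a failure was still observed.
    return max(failures) if failures else 0.0

def operationalize(model_threshold, stream):
    """Online step: apply the learned model to each live event as it arrives."""
    for event in stream:
        event["predicted_failure"] = event["voltage"] <= model_threshold
        yield event  # downstream stages act on the prediction in real time

history = [
    {"voltage": 108.0, "failed": True},
    {"voltage": 119.5, "failed": False},
    {"voltage": 110.2, "failed": True},
]
threshold = train_on_history(history)  # 110.2 in this toy data

live = [{"voltage": 109.0}, {"voltage": 120.1}]
predictions = [e["predicted_failure"] for e in operationalize(threshold, live)]
print(predictions)  # -> [True, False]
```

The point of the sketch is the life cycle itself: historical data feeds model building, and the resulting model only delivers value once it is running inline against the stream, at speed.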
Adrian: You made some really important points, in terms of not waiting until you have the perfect model, because you will be waiting a very long time. You’re missing out on the value that you could be getting. I also like what you’re talking about in terms of the process of constant improvement of the model as you get more data, as you get more experience. I think that’s great. I think you’re in a terrific space, and I wish you luck with it.
Dale: Thank you. We’re very excited about the space. I appreciate the time to talk with you.