A look at what ‘streaming’ and ‘real-time’ data analytics should be. You’ll be surprised by how your customers define those terms.
In the analytics world, there’s a lot of debate over the precise definitions of what “streaming” and “real-time” data/analytics should be. Some argue that streaming analytics must be near-instantaneous, whereas others say a few seconds to receive results is real-time enough for most purposes, particularly when one considers the additional cost of such fast transactions.
But the internal debate becomes meaningless in the face of what the customer wants and needs. That’s why VoltDB conducted a survey of 2,000 U.S. consumers about their expectations when it comes to “real-time” information, and where businesses are falling short.
And the bar is, indeed, quite high. The study found that 63 percent of consumers feel that waiting for results during any experience negatively impacts their relationship with the brand in question. Given that more than 8 out of 10 consumers define “real-time” as less than two seconds, they have definitively set the most important bar for analytics-driven experiences.
One typical reaction to this dissatisfaction is moving to a different brand or service, and that’s a notable cause of churn and a significant opportunity for businesses to improve the everyday interactions with their customers.
For a startling example of how quickly customers make these decisions, look no further than gamers. Even with an app that’s meant to entertain them, gamers tolerate only three instances of lag before abandoning it. Imagine a similar reaction to a customer dashboard that takes more than a few seconds to load, or isn’t nearly as responsive as users would like. They might keep using it merely because they have no other choice, but their satisfaction drops with every further hiccup.
The age of real-time relevance
Speed is crucial to these customer interactions, but immediate results that are lackluster in quality hurt the business more than slower, more accurate ones. In fact, 86 percent of consumers said that accurate information was crucial to the customer experience, barely edging out the 83 percent who said the same of speed.
For example, Amazon’s pages reliably deliver recommended products within a few milliseconds of clicking on a new product page. The speed is there, clearly, but unless those recommended products are relevant to your needs, they’re useless to Amazon and only serve to distract consumers. Given that 63 percent of those surveyed said that non-relevant ads were annoying, the accuracy threat is real.
Layered on top of sheer relevance is the timing of an advertisement. Nearly 8 out of 10 consumers said they wouldn’t click on a recommended product advertisement unless they see it at precisely the right moment during the purchase funnel. And if you deliver that recommendation at the wrong time, such as after they’ve already purchased something similar, you’re bound to annoy them once again — 63 percent of consumers get annoyed at untimely recommendations.
Creating an infrastructure to satisfy easily annoyed customers
With these numbers in mind, you might be wondering: how do I actually meet these expectations with anything resembling real-time? Remember that the average user expects accurate results in under two seconds. Whatever technology you throw at the problem, you can at least take comfort in having a relatively firm goal: two seconds.
The challenge is in not only developing this solution, but also being able to scale it as needed.
Unsurprisingly, VoltDB is eager to publish this information in light of its recent efforts to create what it calls a “translytical data platform” that “combines transactions and analytics under one system, accelerating the analytics and action on data the moment it matters.”
Other platforms, such as DataStax Enterprise (DSE), IBM DB2, MemSQL, Oracle Database In-Memory, and SAP Hana, all offer variations on the translytical model as well, with varying features and guarantees about the speediness of results to common queries. Forrester offers some deeper insights into this highly competitive space.
Foundation of in-memory storage
The foundation of these translytical platforms is in-memory storage, which lets them process data a few orders of magnitude faster than systems that analyze data kept on even the fastest solid-state disks.
In-memory storage at this scale is made practical by intelligent compression of the data, which lets you keep more data in memory without paying exorbitant amounts for storage. A scale-out architecture then gives companies the elasticity to span thousands of nodes, allowing them to hold even petabytes of data in memory.
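As a rough illustration of those two ideas (and not any vendor’s actual implementation), here is a toy Python sketch of hash-partitioning records across a fixed set of in-memory shards, with each value compressed before it is stored. The node count, record format, and CRC-based routing are all assumptions made up for the example:

```python
import zlib

NUM_NODES = 4  # illustrative; real clusters scale to thousands of nodes

def node_for(key: str) -> int:
    """Route a record to a shard by hashing its partition key."""
    return zlib.crc32(key.encode()) % NUM_NODES

class InMemoryShard:
    """One node's slice of the data, compressed and held in memory."""
    def __init__(self):
        self.store = {}  # key -> compressed bytes

    def put(self, key: str, value: str) -> None:
        self.store[key] = zlib.compress(value.encode())

    def get(self, key: str) -> str:
        return zlib.decompress(self.store[key]).decode()

shards = [InMemoryShard() for _ in range(NUM_NODES)]

def put(key: str, value: str) -> None:
    shards[node_for(key)].put(key, value)

def get(key: str) -> str:
    return shards[node_for(key)].get(key)

put("user:42", "last_seen=2021-06-01;segment=gamer")
print(get("user:42"))  # round-trips through compression on one shard
```

Because every key deterministically maps to one shard, adding capacity is (conceptually) a matter of adding nodes and rebalancing the key space; real platforms layer replication and consistent hashing on top of this basic routing idea.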
The promise of these platforms isn’t just speed: before, enterprises that wanted real-time experiences had to build highly customized (and expensive) in-house applications. Now the same technology is accessible through a translytical database provider with the simplicity of an API.
Companies can begin to consolidate their siloed (and on-disk) data warehouses into platforms that allow tiered storage across a variety of mediums, which is useful for less frequently accessed data that doesn’t need to be analyzed in real-time.
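To make the tiering idea concrete, here is a hypothetical Python sketch of a hot/cold store: records touched recently stay in an in-memory dict, while stale records are demoted to a slower “cold” tier (a plain dict standing in here for on-disk warehouse storage). The `HOT_TTL` threshold and the demotion policy are assumptions for illustration, not any platform’s documented behavior:

```python
import time

HOT_TTL = 60.0  # seconds a record may sit untouched before demotion

class TieredStore:
    def __init__(self):
        self.hot = {}   # key -> (value, last_access_time), in memory
        self.cold = {}  # stand-in for slower on-disk storage

    def put(self, key, value):
        self.hot[key] = (value, time.monotonic())

    def get(self, key):
        if key in self.hot:
            value, _ = self.hot[key]
            self.hot[key] = (value, time.monotonic())  # refresh recency
            return value
        value = self.cold[key]  # slower path for cold data
        self.put(key, value)    # promote back into memory on access
        return value

    def demote_stale(self):
        """Move records not accessed within HOT_TTL to the cold tier."""
        now = time.monotonic()
        for key in list(self.hot):
            value, last_access = self.hot[key]
            if now - last_access > HOT_TTL:
                self.cold[key] = value
                del self.hot[key]
```

A background job would periodically call `demote_stale()`, keeping the expensive in-memory tier reserved for the data that real-time queries actually touch.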
For streaming and other real-time applications, these translytical platforms offer all-new possibilities. And, given that consumers appear to become increasingly picky about the smoothness of their digital experiences with every passing year, that two-second “real-time” definition might not last long.