Swim: 48% of Organizations Analyze Streaming Data in Real-Time


Organizations are increasingly opting for an analyze-then-store method for streaming data, according to a new survey by Swim.

Swim, the developer of a platform for continuous intelligence applications, found that almost half of all respondents to its quarterly streaming data survey analyze data in real-time, instead of storing it first. 

This marks an improvement on previous surveys and shows the industry is moving toward an analyze-then-store approach to avoid data staleness. The survey found that 16 percent of organizations perform contextual analysis in stream, further reducing the time between streaming data entering the system and analysis.
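To make the distinction concrete, here is a minimal sketch of the analyze-then-store idea: each event is analyzed the moment it arrives, with a small rolling window supplying context, and only the derived summary is persisted. The `RollingAnalyzer` class and its `stored_summaries` list are hypothetical illustrations, not part of any Swim or survey-cited product.

```python
from collections import deque

class RollingAnalyzer:
    """Analyze each event on arrival (analyze-then-store), keeping a
    rolling window so insights never go stale. Only summaries are
    persisted, not the raw event firehose."""

    def __init__(self, window_size=5):
        self.window = deque(maxlen=window_size)
        self.stored_summaries = []  # stand-in for a warehouse/lake write

    def ingest(self, value):
        # Analysis happens in stream, before anything is stored.
        self.window.append(value)
        summary = {
            "count": len(self.window),
            "mean": sum(self.window) / len(self.window),
            "max": max(self.window),
        }
        # Store the derived insight rather than the raw data.
        self.stored_summaries.append(summary)
        return summary

analyzer = RollingAnalyzer(window_size=3)
for reading in [10, 12, 50, 11]:
    latest = analyzer.ingest(reading)
```

A store-then-analyze pipeline would instead write every raw reading to storage and compute these aggregates later in a batch job, which is exactly the latency the surveyed organizations are trying to avoid.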

SEE ALSO: Where is Streaming Data Analytics at Its Most Tangible?

Not all data needs to be analyzed in real time: 32 percent of organizations store data in a warehouse or lake and process it later in batches to gain contextual insights.


Continuous processing of streaming data has several key advantages. According to the organizations surveyed, the most important is improved customer experience (44%), followed by early warning of an outage or threat (36%) and a more efficient supply chain (36%).

Most organizations use streaming technology for more than four purposes. The main ones are data pipelines (54%), messaging (54%), and microservices (52%). A third of respondents said they use it for streaming applications.

Twenty-one percent of organizations are processing over one million messages per second, a staggering amount to process and analyze, and two percent are processing over 10 million messages a second.

Seventy percent of organizations build their own custom streaming data solutions: 50 percent build the full stack with available open source infrastructure, and 20 percent build on top of a pre-integrated commercial platform. The remaining 30 percent outsource all stages to commercial platforms.

Apache Kafka is by far the most popular open source streaming technology, with 83 percent of organizations using it. Apache Pulsar is in second place, with 17 percent usage. Apache Cassandra, Apache ActiveMQ and Apache Spark Streaming were also cited as popular frameworks.

In terms of skills, people who are technically proficient in Python and Java are rated highest, followed by Kafka Streams and JavaScript. Scala, TypeScript, Golang and Rust each drew less than 15 percent interest from organizations, confirming the industry's shift toward the two most popular languages. This makes sense, as the computer hardware, software and SaaS industry processes and analyzes streaming data at a much higher rate than other industries.

Computer hardware, software and SaaS companies are leading the way in processing and analyzing streaming data, accounting for 20 percent of those surveyed. Retail/e-commerce and financial services were tied for second in adoption.

As the shift to edge computing continues, we expect to see more adoption of streaming analytics capabilities and a greater effort to analyze data in real time or at the source. Through this, organizations will have a continuous feedback loop instead of weekly or monthly batch updates, leading to quicker, more precise decisions.

David Curry

About David Curry

David is a technology writer with several years' experience covering all aspects of IoT, from technology to networks to security.
