This article is sponsored by Streamlio

Building a Unified, Edge-to-Cloud-to-Data-Center Data Platform


The explosion of data created by ever-increasing numbers of connected devices presents an opportunity to build many new and innovative applications. To maximize this opportunity, a modern, unified data platform architecture for fast-moving data is essential.

A profound change to the way data gets processed and analyzed is occurring as more applications extend their reach beyond the data center to the network edge. As more data gets created by mobile devices, wearables, autonomous vehicles, drones, and sensors connected to Internet of Things (IoT) applications, data is being fed into any number of edge systems, including microservers, micro-data centers, and even content delivery networks (CDNs).

All that data at the edge creates an opportunity to develop a raft of innovative applications, such as product usage analytics, real-time diagnostics, predictive alerting, personalization, immediate fraud detection, and security applications. Each of these applications requires access to data that needs to be processed as it arrives, in real time. It’s no longer practical for organizations to transfer all the data that needs to be processed to some central location.

Traditional batch-oriented architectures simply can’t support these classes of applications. Application architectures that rely on bringing all data back into a centralized data center residing in the cloud or on-premises have become untenable. They introduce too much latency to applications that require real-time processing capabilities at the edge. It’s also not economically feasible to backhaul massive amounts of data being generated at the edge to a centralized processing center.

A Modern Data Platform Architecture

What’s required today is a more elegant approach encompassing multiple layers of data collection, processing, and communication: a modern, unified data platform architecture for fast-moving data. A platform architected this way provides a single, consistent system for collecting and analyzing disparate data types and sources, enabling data to be processed immediately at the location that makes the most sense for it. Performing a first layer of filtering and processing at the edge where data is collected means that only a subset or aggregation of that data needs to be transported to systems running in the cloud or on-premises.
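As a rough illustration of that first layer of processing, the sketch below (a minimal Python example with hypothetical names; the upstream transport is stubbed out with a print statement) filters raw sensor readings at the edge and rolls them up into small window summaries, so only the aggregates ever need to travel to the cloud or data center.

```python
# A minimal, hypothetical sketch of first-layer processing at the edge:
# readings below a threshold are dropped, and the rest are rolled up into
# fixed-size window summaries so only small aggregates leave the edge.
from statistics import mean

def summarize_at_edge(readings, threshold=0.0, window_size=100):
    summaries, window = [], []
    for value in readings:
        if value < threshold:            # first-layer filtering at the edge
            continue
        window.append(value)
        if len(window) == window_size:   # roll a full window up into one record
            summaries.append({"count": len(window), "min": min(window),
                              "max": max(window), "mean": mean(window)})
            window = []
    if window:                           # flush the final partial window
        summaries.append({"count": len(window), "min": min(window),
                          "max": max(window), "mean": mean(window)})
    return summaries

if __name__ == "__main__":
    raw = (i % 50 / 10.0 for i in range(10_000))   # simulated sensor readings
    for record in summarize_at_edge(raw, threshold=1.0):
        print(record)   # in practice, publish these upstream instead of printing
```

In this sketch, 10,000 raw readings collapse into roughly 100 summary records, which is the kind of reduction that makes backhauling edge data to a central location economically viable.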

With a consistent data platform that stretches from the edge to the data center, appropriate filtering, cleansing, enrichment, processing, and analytics can be applied in a consistent manner at the layer of computing where it makes the most sense.

The benefits of a unified data platform architecture include:

  • Efficiency: Processing data in each layer reduces data duplication and unnecessary data movement.
  • Responsiveness: Processing data closer to where it is created makes it possible to act on it faster, ensuring relevance and timeliness.
  • Agility: This architecture makes it easier to make changes because of unified management and a consistent environment from the edge to the data center.
  • Manageability: A simpler, more consistent environment reduces the management and operations burden.

Much of what passes for a data platform architecture today is little more than multiple generations of batch-oriented systems cobbled together, each allowed to evolve independently of the others. Any time data needs to be shared between batch-oriented systems, organizations not only introduce additional application latency, but they also incur integration costs that can all too quickly become painful. A unified data platform architecture sharply reduces those costs because it is built from the start for the scenario where every application, to one degree or another, can share data with the others. Just as importantly, a common, distributed data platform architecture means data will no longer go missing as it moves between disparate platforms.

Requirements for a Modern Data Platform

Back-end systems based on batch-oriented application architectures are obviously not going away any time soon. There is too much time and money invested in those legacy applications. But it’s also apparent that those systems were never designed to process data in real time.

IT organizations going forward need a modern unified data platform capable of:

  • Being deployed in edge, cloud, and data center environments
  • Dynamically right-sizing deployments to fit resources available in each environment
  • Providing a consistent interface in each environment for easier development
  • Efficiently moving data between environments
  • Supporting real-time data-driven applications, not only batch-driven processes
  • Scaling to meet any level of performance requirements

Streamlio recognized the need for a modern data platform and saw that new technology was required to meet the demands of modern environments. Taking streaming and real-time processing technology that its team had originally developed and deployed in production at Twitter and Yahoo, Streamlio built a solution capable of connecting, processing, and storing streaming data in real time.

Whether at the edge, in the cloud, or in a private data center, the Streamlio solution enables enterprises to more easily and more rapidly process and analyze data to support both users and modern data applications.
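The article does not name the specific technologies underneath the platform, so the following is an illustrative sketch only: it assumes an Apache Pulsar broker (the open-source messaging system that originated at Yahoo) reachable at pulsar://localhost:6650, and uses the pulsar-client Python package with made-up topic and subscription names to show how a stream of edge summaries could be published and consumed in real time.

```python
# Illustrative only: assumes a Pulsar broker at localhost:6650 and the
# `pulsar-client` Python package; topic and subscription names are hypothetical.
import pulsar

client = pulsar.Client('pulsar://localhost:6650')

# An edge or cloud service publishes summary records onto a shared topic...
producer = client.create_producer('edge-summaries')
producer.send(b'{"count": 100, "mean": 2.4}')

# ...and a downstream analytics service consumes them as they arrive.
consumer = client.subscribe('edge-summaries', subscription_name='analytics')
msg = consumer.receive(timeout_millis=5000)
print(msg.data())
consumer.acknowledge(msg)

client.close()
```

The same publish-and-subscribe pattern applies whether the producer runs on an edge gateway and the consumer in the cloud, or both run inside a private data center, which is the consistency across environments the article describes.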

To learn more about how Streamlio delivers this modern data platform, and about the technologies needed to build and deploy today’s data applications, visit streaml.io. The way enterprise IT applications are managed is about to change for the better, and choosing the right technology will be critical to taking advantage of the new opportunities that change creates.
