Scalability: The Key to High, Long-Term Analytics ROI


To deliver strong ROI, analytics capabilities need to be able to scale. But enabling scalability across highly tailored use cases isn’t always easy.

Across the modern enterprise, every team wants something slightly different from analytics. They have their own goals, own data, and own KPIs—leading many teams to create and deploy their own analytics solutions with available resources.

Over time, that’s created a highly fragmented analytics landscape across many organizations. Siloed solutions stagnate within individual teams and lines of business, creating an environment where:

  • Efforts and investments are duplicated across teams, wasting vital resources that could be better spent driving analytics innovation.
  • A lack of consistency in how analytics is approached and how success is measured leads to conflicting recommendations from different teams.
  • Solutions are poorly maintained because there’s no centralized resource dedicated to keeping them operating efficiently or improving them over time.
  • Teams can’t directly benefit from each other’s analytics innovations or utilize existing models and solutions in new ways.
  • Solutions are narrow in scope, pulling data from only a handful of sources and ignoring other data that could add vital, valuable context to insights.

The missing piece in environments and organizations like these is scalability. When teams push ahead with their own siloed analytics projects, the solutions they create can’t scale—making it far harder to realize high ROI from them.

Unfortunately, there’s one big reason why that all-important piece is still missing across many enterprise analytics landscapes: It’s tough to enable if you don’t have the right strategy and support.

Why has enabling analytics scalability proven so hard for so many?

Three main challenges limit organizations’ ability to scale their analytics solutions and investments today:

#1) Disparate and inconsistent data

When an individual team builds its own analytics model, it builds around its data—designing models that make the most of the available data sets, whatever their type or quality may be. Models created for and driven by those data sets become incredibly tough to use in different contexts, where the same type or quality of data isn’t available. There’s no interoperability, so the models can’t be scaled elsewhere.
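One common way to build that interoperability in is to validate every dataset against a shared schema contract before a model will accept it, so a model built in one team can run on another team's data. A minimal sketch in Python (the field names and checks here are illustrative assumptions, not taken from the article):

```python
# Sketch: a lightweight schema contract that any team's dataset must satisfy
# before an analytics model will accept it. Field names are illustrative.
SALES_SCHEMA = {
    "transaction_id": str,
    "region": str,
    "amount": float,
}

def validate(records, schema):
    """Return a list of error strings; an empty list means the data conforms."""
    errors = []
    for i, row in enumerate(records):
        for field, expected in schema.items():
            if field not in row:
                errors.append(f"row {i}: missing field '{field}'")
            elif not isinstance(row[field], expected):
                errors.append(f"row {i}: '{field}' should be {expected.__name__}")
    return errors

good = [{"transaction_id": "t1", "region": "EMEA", "amount": 19.99}]
bad = [{"transaction_id": "t2", "amount": "19.99"}]

print(validate(good, SALES_SCHEMA))  # []
print(validate(bad, SALES_SCHEMA))
```

When every team's data passes the same contract, a model designed around that contract can be moved between contexts instead of being locked to the dataset it was built on.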

#2) Low visibility and understanding across silos

If one team doesn’t know about an adjacent team’s existing analytics investments, they can’t leverage and customize them for their own use. Siloed creation and management of analytics capabilities create cultures where people simply aren’t aware of where and how the organization has already invested in analytics—leading to significant duplication of effort and increased costs for the enterprise.

#3) Scalability is hard to build in retroactively

When a team identifies a new internal use case for analytics, they rarely stop to ask, ‘How could other teams or markets benefit from what we’re creating?’ As a result, solutions are built with a single purpose in mind, making it difficult for other teams to utilize them across slightly different use cases. Instead of building a widely usable foundation, then customizing it for each team, solutions are designed for a single team at a core level, making them tough to repurpose or apply elsewhere.

See also: Moving to the Cloud for Better Data Analytics and Business Insights  

Best practices for highly scalable, high-ROI analytics

Organizations need to fundamentally change how they think about, design, and manage analytics capabilities to overcome those challenges and unlock the full ROI of highly scalable analytics models and solutions.

Here are four practices helping organizations do that effectively:

Start with a standardized foundation

Each team across your organization needs bespoke, tailored capabilities to get the most from analytics. But that doesn’t mean they have to build their own solutions from the ground up.

By having a centralized team create a customizable, standardized foundation for analytics, teams can create exactly what they need in a consistent way that enables interoperability and sharing of models and insights across the enterprise.

With an analytics center of excellence (CoE), for example, a centralized team can create everything each team needs for its unique use cases, and add further value by incorporating insights and capabilities that have already proven valuable for adjacent teams.
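One way a CoE can make that concrete is a shared pipeline foundation that teams extend rather than rebuild. The sketch below is a simplified illustration; the class and method names are assumptions for the example, not part of any specific framework:

```python
# Sketch: a shared pipeline base class maintained by a central CoE team.
# Each team overrides only the analysis step; loading and reporting stay
# consistent and reusable across the enterprise. Names are illustrative.
class AnalyticsPipeline:
    def load(self, source):
        # Centralized, standardized data loading (stubbed out here).
        return list(source)

    def analyze(self, data):
        raise NotImplementedError("each team supplies its own analysis")

    def report(self, result):
        # Consistent reporting format shared by every team.
        return f"insight: {result}"

    def run(self, source):
        return self.report(self.analyze(self.load(source)))

class AverageBasketSize(AnalyticsPipeline):
    # One team's tailored use case, built on the shared foundation.
    def analyze(self, data):
        return sum(data) / len(data)

print(AverageBasketSize().run([2, 4, 6]))  # insight: 4.0
```

Because only the analysis step varies, a second team's solution automatically inherits the same data handling and reporting conventions, which is what makes sharing models and insights across teams practical.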

Bring data science and data engineering closer together

Even today, many still view analytics as the exclusive domain of data scientists. But, if you want to enable the scalability of analytics models, data engineers need to be involved in the conversation and decision-making.

Data scientists may build the models and algorithms that generate analytical insights, but data engineers ensure a consistent, interoperable data foundation to power those models. By working closely together, they can align their decisions and design choices to help scale analytical capabilities across the business.

Zoom out and get some external perspective on what’s possible

If you want analytics investments to deliver broad value across your organization, your projects should start with a broad view of what analytics could help you achieve across multiple use cases. Practically, that means:

  • Identifying high-value and complementary use cases together and designing for them in tandem.
  • Finding the commonalities between different analytics needs that individual teams don’t recognize.
  • Going to teams proactively with capabilities and insights that can improve what they do, rather than waiting for them to request specific capabilities.
  • Looking at internal and external data holistically to determine where the most valuable insights may be found.
  • Carefully considering how analytics can support high-level business goals and the organization’s overall strategy—not just individual teams within it.

Continuously learn and improve

Analytics always requires some degree of experimentation, and you can’t realistically expect every single use case to deliver high long-term value. But even when a use case is unsuccessful, organizations should take steps to learn from it.

Within an enterprise, someone needs to take responsibility for learning from each use case explored. That person or team can then apply those lessons across new use cases and use them to develop assets and modules that can be reused across geographies and domains, extending and increasing the value they deliver to the business.
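Making those assets discoverable is half the battle: a team can only reuse a module it knows exists. A minimal sketch of the idea, with a simple in-process registry (the asset name and scoring logic are purely illustrative assumptions):

```python
# Sketch: a minimal registry so reusable analytics assets built for one
# market can be discovered and applied in another. Names are illustrative.
REGISTRY = {}

def register(name, fn):
    """Publish a reusable analytics asset under a discoverable name."""
    REGISTRY[name] = fn

def reuse(name, *args):
    """Look up and apply a previously registered asset."""
    return REGISTRY[name](*args)

# A churn-scoring helper built for one geography...
register("churn_score", lambda visits, complaints: complaints / max(visits, 1))

# ...reused directly by another team with its own data.
print(reuse("churn_score", 10, 3))  # 0.3
```

In practice this role is usually played by a shared model catalog or internal package index rather than a dict, but the principle is the same: every lesson or asset gets a findable home so its value extends beyond the team that created it.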

About Nitin Aggarwal

Nitin Aggarwal is VP of Data Analytics for The Smart Cube. He is an analytics executive and a seasoned business leader with 15+ years of experience across industries and functions. Based out of The Smart Cube’s Chicago office, Nitin leads the Retail, CPG & Consumer Markets practice in the US. Prior to this role, Nitin developed and scaled the data analytics practice, and managed operations across the globe. He also drove the practice strategy in terms of new capabilities, solutions, and technologies from India. Nitin studied electrical engineering at Punjab Engineering College, Chandigarh, MBA at University of Notre Dame, and got his Executive education in Analytics from SDA Bocconi. An avid sports person, he loves playing tennis and badminton, and is a committed follower of American football.
