Can’t Fail: Learning From Common Analytics Project Challenges


Missing ROI targets is the most visible form of failure, but implementing analytics solutions and not using them to their fullest potential is a failure too. Here’s what we’ve learned.

Organizations do not start analytics or data management projects thinking they will fail, but the reality is that many projects do fall short of expectations. Organizations commonly think of failure as not meeting timelines or staying within allocated budgets, but implementing solutions and not using them to their fullest potential is also a form of failure. In these cases, a lot of effort is expended, yet not much is gained. People may lose confidence in their team’s ability and their trust in the data or the value of business intelligence (BI). Additionally, not maximizing the project’s value can actually affect overall BI maturity and future use.

Therefore, understanding the value of successful project implementations and adoption, and avoiding common pitfalls are important for realizing the full value of data investments. Here are some common challenges organizations face:

Time to implementation versus time to value

When planning projects, time to implementation is usually a key project milestone that defines the desired endpoint of the development cycle. In many cases, focus is placed on gathering the business and technical requirements and ensuring that solutions are delivered on time. When developing BI applications, for instance, IT departments may examine how long the tools take to develop and leverage the timeline as a marker for success, considering the potential adoption of a new platform, data management processes, and application development. While delivering solutions to end users may signal the end of the design or implementation phase and time to delivery is important, it does not necessarily lead to automatic value.

See also: Teradata says… stop buying analytics

Business users want to know that what they are using provides business value and answers the questions required to do their jobs and support growth. Even though a solution exists, end users will not consider their analytical environment valuable unless they know how to use it and are gaining quantifiable benefits from its use. Simply looking at a dashboard is not enough. Understanding the value proposition of the accessed data, and how added visibility can affect overall business decisions, is what closes the gap between time to implementation and time to value.

Understanding these differing viewpoints and adjusting internal milestones can make the difference between project success and one team feeling their job is complete while another views analytics as not delivering its promised value.

Current evaluation for future requirements

Although ensuring BI requirements match current project needs is important, understanding future requirements and building out a roadmap is critical for later expansion and scaling of initiatives. All too often, departments implement their own solutions, resulting in a multitude of dashboards across the organization that suit the unique needs of various teams. Although effective for individual analytics, this approach can hamper the development of an enterprise approach to data management and BI delivery as data sources become siloed. The benefits include self-service and targeted analytics, but those benefits are quickly outweighed by the tradeoffs when trying to expand use.

Therefore, organizations and individual teams must consider future use cases as well. It might not be possible to anticipate all needs, but understanding what additional data may be captured in the future, who will access analytics, and some potential use cases will help define growth required. This roadmap of future requirements can help organizations make the right software selection now and prevent the costly reevaluation of their toolsets as new needs are defined.

Incorporating people and processes into technology

One of the reasons embedded BI is increasingly important to organizations is that decision making can no longer be separated from operations. However, many IT teams focus on the technical requirements – what data is needed, how to best handle data complexities and processing times, what capabilities should be built, what metrics are required – but overlook the business processes required to make information meaningful, as well as the end users themselves. Technology deployed without considering processes and people limits successful outcomes. “Build it and they will come” only works if stakeholders are considered and involved in the collaboration process, and when business processes are built in to technology outcomes.

Understanding and incorporating these three areas (people, processes, and technology) creates an environment that supports agile development and higher levels of adoption, and therefore should not be overlooked. As more project sponsorship comes from business units, organizations will increasingly understand the value of ensuring process-oriented analytics are incorporated within operational applications. In turn, more business-focused decision making within analytics adoption will increase an organization’s understanding of why these three areas must be considered.

Understanding and selling the value and benefits associated with strong data management

Including adoption and use in the project lifecycle, and considering both business and technology requirements, will go a long way in nurturing project success. But the greatest value comes from ensuring strong data management. The adage “garbage in, garbage out” remains true, and is even more relevant today given increasing data complexity. Ensuring security, privacy, compliance, and quality, among other factors, also requires taking into account how information is processed and stored.
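The “garbage in, garbage out” point can be made concrete with a simple quality gate that screens records before they reach the analytics layer. The sketch below is purely illustrative: the field names (customer_id, email, revenue) and the validation rules are hypothetical, not drawn from any specific tool discussed in this article.

```python
def validate_record(record):
    """Return a list of quality issues found in one customer record."""
    issues = []
    if not record.get("customer_id"):
        issues.append("missing customer_id")
    if "@" not in record.get("email", ""):
        issues.append("malformed email")
    if record.get("revenue", 0) < 0:
        issues.append("negative revenue")
    return issues

def partition_by_quality(records):
    """Split records into clean rows and quarantined rows with reasons."""
    clean, quarantined = [], []
    for rec in records:
        problems = validate_record(rec)
        if problems:
            quarantined.append((rec, problems))
        else:
            clean.append(rec)
    return clean, quarantined

records = [
    {"customer_id": "C001", "email": "ana@example.com", "revenue": 1200},
    {"customer_id": "", "email": "bad-address", "revenue": -50},
]
clean, quarantined = partition_by_quality(records)
print(len(clean), len(quarantined))  # prints: 1 1
```

Quarantining bad rows with an explicit reason, rather than silently dropping or passing them through, is what lets a team tie data quality back to business outcomes: the reasons become a measurable backlog rather than an invisible source of mistrust in the dashboards.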

In some cases, organizations concern themselves with delivery and not with how data is managed over time. With increasing complexities, data diversity, and emerging technologies, focusing on data management enables organizations to leverage information assets in a way that keeps them ahead of the competition. However, doing so effectively also requires the ability to sell its value proposition to business decision makers. As organizations expand their analytics use, the value of data becomes more obvious, but is not always easy to sell within newer BI environments.

Laying a strong data management foundation may mean that initial implementations take more time, offsetting the promise of quick time to delivery. For longer-term success, however, business and data requirements need to be mapped out. A plan must be built that expands beyond simple data quality or other isolated data management initiatives toward the ability to tie data management to business outcomes. Many organizations start with master data management (MDM) and customer records, but companies in various industries may choose to focus first on other data sets such as quality control, patient outcomes, or R&D. Irrespective of where an organization starts, it is important to pay attention to the details and understand that simple data integration will not be enough to extract quantitative value from information assets over time.

The complexities of strong analytics

Successful outcomes require a process that weighs the desired goals and develops a path to reach them. This map includes linking people, processes, and technology to ensure that analytics are tied to operations and forward-looking outcomes. Most importantly, strategic initiatives must include strong data management within an agile development environment to make sure that analytics are not only trusted and reliable, but also delivered proactively to support better business decisions.

Lyndsay Wise

About Lyndsay Wise

Lyndsay Wise is the director of market intelligence at Information Builders. Before joining Information Builders, Lyndsay worked as an industry analyst for 12 years, founding WiseAnalytics in 2007 and covering research areas related to data visualization, analytics, BI in the cloud, and implementation strategies for mid-market organizations. She provided consulting services as well as industry research into leading technologies, market trends, BI products and vendors, mid-market needs, and data visualization. In 2012, Lyndsay wrote Using Open Source Platforms for Business Intelligence: Avoid Pitfalls and Maximize ROI to help provide organizations with the tools needed to evaluate open source business intelligence and make the right software decisions.
