Gartner: Data Science Needs Are Coming for Your IT Budget


Research house Gartner says enterprise data executives should keep an eye on the booming data science market for new critical capabilities and use cases.

With IoT and other disruptive technologies creating a deluge of data, and with growing deployment of artificial intelligence, research firm Gartner says enterprise data and analytics executives should keep an eye on the booming data science market for new critical capabilities and use-case scenarios for platforms.

The firm’s report found a wide range of practices within the three major use cases for data science today: production refinement, business exploration, and advanced prototyping. These differing needs often drive the selection and mix of data science tools.

See also: How to build a data science team, one member at a time

Production refinement remains the most common use case for data science platforms and is most affected by delivery and model-management capabilities. Business exploration, meanwhile, is becoming increasingly common among “citizen” data scientists and “upskilled” business analysts within enterprise organizations.

Advanced prototyping involves a more varied set of needs across organizations, but data science deployments in this area are often driven by cutting-edge open-source technologies such as Apache Spark and deep learning. Leveraging innovation from the vast open-source data science ecosystem is a vital trend in the data science platform market.

Gartner Says Take Stock

Gartner recommends that enterprise leaders take stock of their organization’s current analytics and business intelligence resources, and work with data science teams to identify their most significant challenges and highest-priority needs.

That should help determine which critical capabilities have the greatest impact on data science projects, and make it possible to choose the best platform by balancing the desired use-case mix, users’ skill sets, and the prioritized strengths of those capabilities.

And because platform and tool development in the space is moving so quickly, the firm also recommends avoiding long-term commitments in this market segment as vendors and the open-source community rapidly innovate. Key innovations currently center on cloud, Apache Spark, automation, collaboration, and deep-learning capabilities.

Looking forward, the firm predicts that predictive and prescriptive analytics will attract 40% of enterprises’ net-new investment in business intelligence and analytics by 2020, and that more than 40% of data science tasks will be automated by then, resulting in increased productivity and broader use by “citizen” data scientists.

Next year, the firm sees deep learning, in the form of deep neural networks, becoming a standard component of 80% of data scientists’ toolboxes, and by 2019, “citizen” data scientists will surpass data scientists in the amount of advanced analysis they produce.

About Trevor Curwin

Trevor has been a new industries analyst and editorial leader in New York and San Francisco for over 15 years, where he has worked in advisory, marketing and capital formation for alternative investments and distributed infrastructure projects. He was Editor-in-Chief of Readwrite, where he turned a 13-year-old tech blog into one of the leading media platforms on IoT and the connected world. He’s an award-winning journalist who’s worked for CNBC, DVICE.com and other NBCUniversal web properties, as well as the Toronto Star, Outside, GigaOm, AWARE Magazine, and Environmental Finance. He’s also a successful angel investor and start-up advisor.
