MapR has debuted a suite of new services designed to launch and advance client AI projects.
MapR Technologies at the Strata Data Conference this week unveiled six professional engagement services designed to either jumpstart or realign an artificial intelligence (AI) project for a fixed fee within a specified amount of time.
MapR also announced it has extended an existing relationship with Waterline Data, through which a data catalog will now be added to its portfolio of products and services.
The provider of a platform for running big data and AI applications is pursuing that approach because most organizations are no longer willing to engage a professional services team via an hourly contract, says Bill Peterson, vice president of industry solutions for MapR.
“I’m of the belief that any contract that does not include a time parameter is not going to get signed,” says Peterson.
The MapR Data Science Lifecycle Offerings span everything from working with an organization to set up a hackathon, identify a use case, and build a prototype of an AI model, to integrating AI models with streaming events. MapR Technologies will also work with organizations to identify existing AI models that have failed or may be about to fail for one reason or another. Most AI models fail either because new sources of data require different types of algorithms to process effectively or because the data driving those models has been compromised by a cybersecurity attack, notes Peterson. MapR professional services teams have seen AI models that relied on encrypted data compromised by cybercriminals within six to seven hours of being deployed, adds Peterson.
See also: AI needs big data, and big data needs AI
At the Strata Data Conference in New York today, O'Reilly Media launched a new book, "AI and Analytics in Production," authored by Ted Dunning, PhD, board member of the Apache Software Foundation and chief application architect at MapR, and Ellen Friedman, PhD, principal technologist at MapR and Apache committer. The book is intended to provide a non-technical guide to best practices for analytics and AI in a production environment, based on real-world use cases. Those AI models can rely on raw data, data stored in Hadoop, and streaming data sources in any combination.
In general, Peterson says, as more business processes become dependent on analytics and AI applied to streaming data, the rate at which organizations make critical decisions is accelerating. Not every business process requires that analytics be applied to streaming data, but it is apparent that the definition of near real time in the enterprise is now set by the rate at which streaming data can be processed.
The challenge organizations will increasingly face is knowing when an AI model needs to be replaced, preferably before it breaks, says Peterson. Achieving that goal requires not only continuous building and training of AI models but also a set of well-documented processes for replacing one model with another. Ultimately, Peterson says, nearly every enterprise application will either have AI capabilities embedded within it or accessible via REST application programming interfaces (APIs). The path organizations take to achieve that goal will vary by use case, says Peterson. But if a process in any way impacts revenue, Peterson says it's only a matter of time before some form of AI is applied to optimize it.
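The article does not describe a specific API, but the pattern Peterson points to, an application consuming a model's predictions over a REST endpoint rather than embedding the model itself, can be sketched as follows. The endpoint URL, payload shape, and function names here are illustrative assumptions, not MapR's actual interface.

```python
import json
import urllib.request

# Hypothetical scoring endpoint; any model served behind a REST API
# would be called in roughly this way.
SCORING_URL = "https://models.example.com/v1/score"

def build_payload(features: dict) -> bytes:
    """Serialize feature values into a JSON request body (assumed format)."""
    return json.dumps({"features": features}).encode("utf-8")

def score(features: dict, url: str = SCORING_URL) -> dict:
    """POST the features to the model endpoint and return its prediction."""
    req = urllib.request.Request(
        url,
        data=build_payload(features),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read())

# Example usage (requires a live endpoint):
# result = score({"amount": 125.0, "merchant_id": "m-42"})
```

The point of the design is that the application sees only JSON in and JSON out, so the model behind the endpoint can be retrained or swapped, per the well-documented replacement processes Peterson describes, without any change to the client code.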