Organizations like NASA are sharing their experiences with AI and AIOps with those just getting started.
Organizations interested in implementing AIOps have the benefit of learning from the advice of others that have already taken steps toward using artificial intelligence in their operations and DevOps groups.
One such innovator sharing advice is NASA, which recently highlighted the importance of taking a holistic approach to rolling out AI for the space agency’s business applications, such as finance and procurement, particularly in an AIOps environment.
In a presentation to the AI operations group ATARC, NASA’s data science group leaders identified three key elements that an AIOps initiative needs. Technology may be the most obvious one, but organizations also must prepare their data to make sure it is ready for AI. Equally important is an enterprise culture shift, with people accepting the need to let data help shape decisions.
Why listen to NASA? Maybe because they’re the ones who can honestly say, “Yes, we are rocket scientists.”
“The bit where the algorithm is run is a very small part. The data is rarely in a form where you can really use it. People have their ways of doing things already; there’s this problem of needing to learn a new tool or set of tools and understanding how it really works. People have different conceptions of what AI can do for them,” said Nikunj Oza, NASA’s data sciences group lead, according to a report by Government CIO Media.
He added, “Data is not [automatically] ready for AI to use, so you can start an AIOps project and it screeches to a halt because the other parts of your system are not ready for it.”
Shenandoah Speers, associate CIO of applications at NASA, told the ATARC audience that the agency’s digital transformation is underway, but still maturing. “We’re seeing a lot of influx of data, and how to digest that data and make business decisions and mission decisions on that data.”
Oza also discussed some of the misconceptions that surround AI and machine learning, including the in-house fear that AI will steal jobs from humans.
A webinar speaker from the US Defense Department Joint Artificial Intelligence Center (JAIC) also discussed the data quality challenge. “There’s issues with label quality, data quality,” said Yevgeniya Pinelis, chief of test and evaluation of AI/machine learning at JAIC. “There are infrastructure issues, of course. … For us to really build trustworthy AI systems, we need to have that ecosystem and all the piping in place.”
She added that culture is a factor because of the need to get teams to adopt Agile and DevSecOps. “If your user and tester are engaged really early on, that’s how you get this Agile process and avoid disasters at the end. That’s a big culture change we’re going through. We’ve had a lot of luck in joint logistics — those tend to be AI problems that are well scoped, although data readiness is always an issue.”
As more enterprises gain experience with AIOps and observability, there are growing opportunities for those who are just getting started to gather AIOps advice from the pioneers.