
Not Sure What Your AI Is Doing? Google Will Explain


Google Explainable AI is a new set of tools and resources to help developers better understand model behavior and detect and resolve bias, drift, and other gaps in their models.

While many organizations are already seeing positive results from their AI deployments, some are struggling to understand what the AI is doing and why.

Few businesses have strong AI expertise in-house, and most AI systems do not come with a simple interface that explains their decisions. The result is confused business leaders who cannot make sense of the AI's behavior.

SEE ALSO: Summarization Platform Receives Google, Microsoft Backing

Google wants to help bridge that gap with the launch of Explainable AI. The new set of tools and resources, available to its cloud platform customers, enables developers to better understand model behavior and “detect and resolve bias, drift, and other gaps” in their models.

In cases where a model has gone off the rails or misread its data, Explainable AI provides an analysis of what led to the decision. From that, developers should be able to correct the model and better understand the cause and effect behind its output.
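For developers on Google Cloud, explanations are retrieved in much the same way as ordinary predictions, via the AI Platform's explain endpoint. The sketch below is a minimal illustration in Python, assuming a model already deployed with explanations enabled; the project, model, and version names are placeholders, and the exact request payload and response fields depend on the model's input schema and Google's current API.

```python
# Minimal sketch: requesting feature attributions from a model deployed on
# Google Cloud AI Platform. PROJECT, MODEL, and VERSION are placeholders.
from googleapiclient import discovery

PROJECT = "my-project"   # assumption: your GCP project ID
MODEL = "my-model"       # assumption: a model deployed with explanations enabled
VERSION = "v1"           # assumption: the version of that model to query


def explain_instances(instances):
    """Send instances to the AI Platform :explain endpoint and return its response."""
    service = discovery.build("ml", "v1")
    name = f"projects/{PROJECT}/models/{MODEL}/versions/{VERSION}"
    response = service.projects().explain(
        name=name, body={"instances": instances}
    ).execute()
    if "error" in response:
        raise RuntimeError(response["error"])
    # The response pairs each prediction with per-feature attributions,
    # showing how much each input contributed to the model's output.
    return response


if __name__ == "__main__":
    # Example input; replace the features with ones matching your model's schema.
    result = explain_instances([{"feature_a": 1.0, "feature_b": 0.5}])
    print(result)
```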

“While this capability allows AI models to reach incredible accuracy, inspecting the structure or weights of a model often tells you little about a model’s behavior,” said Tracy Frey, Google’s director of AI strategy, in a blog post. “This means that for some decision-makers, particularly those in industries where confidence is critical, the benefits of AI can be out of reach without interpretability.”

Google does note that while Explainable AI will provide an analysis of the model, it cannot pass judgment on the quality of the input data. As explained in a Gartner webinar, one of the key factors in AI deployment success is good data quality and governance.

“AI Explanations reflect the patterns the model found in the data, but they don’t reveal any fundamental relationships in your data sample, population, or application. We’re striving to make the most straightforward, useful explanation methods available to our customers while being transparent about its limitations,” said Frey.

David Curry

About David Curry

David is a technology writer with several years’ experience covering all aspects of IoT, from technology to networks to security.
