A team led by Charles River Analytics has secured a four-year contract worth approximately $8 million to support the Defense Advanced Research Projects Agency's Explainable Artificial Intelligence program, which seeks to make the reasoning behind AI systems' decisions understandable to human users.
Charles River said Monday its team, which includes Brown University, the University of Massachusetts Amherst and Roth Cognitive Engineering, will carry out the Causal Models to Explain Learning, or CAMEL, effort under DARPA's XAI program.
“In CAMEL, we’re developing causal models of the operation of highly complex machine learning systems, such as autonomous systems that use deep neural networks and deep reinforcement learning,” said Brian Ruttenberg, a senior scientist at Charles River.
Ruttenberg added that the CAMEL team aims to create tools and techniques that can help users “understand why a machine learning system reached a particular conclusion.”
Charles River will use Figaro, its open-source probabilistic programming language, to help simplify explanations of how machine learning models reach their conclusions.
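Figaro embeds probabilistic models in Scala: a developer defines random variables, conditions them on observed evidence and queries an inference algorithm for posterior probabilities. The minimal sketch below, adapted from Figaro's public introductory example rather than from anything CAMEL-specific, illustrates the style of reasoning involved: observing an effect and inferring the probability of its hidden cause.

```scala
import com.cra.figaro.language._
import com.cra.figaro.library.compound.If
import com.cra.figaro.algorithm.factored.VariableElimination

object WeatherModel {
  def main(args: Array[String]): Unit = {
    // A latent cause: is it sunny today?
    val sunny = Flip(0.7)

    // An observable effect whose distribution depends on the cause.
    val greeting = If(sunny,
      Select(0.6 -> "Hello, world!", 0.4 -> "Howdy, universe!"),
      Select(0.2 -> "Hello, world!", 0.8 -> "Oh no, not again"))

    // Observe the effect, then infer the posterior probability
    // of the cause via exact inference (variable elimination).
    greeting.observe("Hello, world!")
    println(VariableElimination.probability(sunny, true))
  }
}
```

In CAMEL's setting, the latent variables would stand in for the internal factors driving a learned system's decision rather than the weather, but the cause-to-effect structure of the model is the same.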
Under XAI, DARPA funds multiple teams to develop machine learning techniques and human-computer interaction designs that could help make AI more understandable.