The Defense Advanced Research Projects Agency awarded a $7.5 million grant to researchers at the University of Arizona to develop artificial intelligence that can comprehend social signals and human exchanges.
Researchers plan to study the AI in a video game environment that will be played by human participants.
Adarsh Pyarelal, a research scientist in the School of Information's Machine Learning for Artificial Intelligence Lab and principal investigator for the project, explained what the research aims to solve.
“Actually understanding the content of the dialogue is really not an easy thing for a computer to do,” Pyarelal said. “There’s been a lot of research in the last couple of decades. I would say we’re really trying to bring together sort of complementary strengths from many labs to attack this really hard problem.”
Pyarelal said the project will look into specific things people take for granted, such as recognizing people’s body language and facial expressions.
The development could help incorporate AI in individualized tutoring. An AI tutor could potentially recognize when a student is feeling frustrated in the way a teacher understands, according to Pyarelal.
The health sciences have also taken an interest in the technology, for instance to help recognize when a patient is in distress.
The grant was negotiated in June 2019, and roughly a million dollars was ultimately cut from the funding. Pyarelal expects the project to run four years, through the end of 2023.
According to Pyarelal, multiple people are needed due to the complexity of the project, so the grant will support four labs and 20 personnel.
Pyarelal said he hopes to release free software tools that are accessible to everyone, advance the state of the art in the field and publish papers to share the results with the scientific community.
Clayton Morrison, associate professor for the School of Information and co-principal investigator for Theory of Mind-based Cognitive Architecture for Teams, or ToMCAT, explained his role in the project.
“The work that I do is at the intersection of machine learning and artificial intelligence methods that involve what’s called automated planning,” Morrison said. “For this project, we’re building an AI system that is able to pay attention to human users that are using the system … A key focus of the project is to do a better job of understanding how teams of humans who are collaborating on a task get along.”
The AI system will need to understand what humans are trying to accomplish, and Morrison's role is to develop the technology that makes sense of what people are doing.
“We learn how to get along with each other and collaborate,” Morrison said. “A key part of that is the ability to, in a sense, read the minds of other people.”
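One simple way to picture this narrow kind of "mind reading" is goal inference: the system keeps a probability over what a teammate might be trying to do and updates it as it observes their actions. The sketch below only illustrates that general idea, not ToMCAT's actual architecture; the goals, actions and likelihood numbers are made-up assumptions.

```python
# Hypothetical goal-inference sketch (not the project's actual method):
# maintain a Bayesian posterior over a teammate's possible goals and
# update it after each observed action.
from collections import OrderedDict

# Prior belief over what the teammate might be trying to do (assumed goals).
beliefs = OrderedDict(rescue_victim=1/3, clear_rubble=1/3, search_room=1/3)

# Assumed likelihoods: P(observed action | goal). Purely illustrative numbers.
likelihood = {
    "moves_toward_victim": {"rescue_victim": 0.7, "clear_rubble": 0.2, "search_room": 0.1},
    "picks_up_tool":       {"rescue_victim": 0.2, "clear_rubble": 0.6, "search_room": 0.2},
    "opens_door":          {"rescue_victim": 0.3, "clear_rubble": 0.1, "search_room": 0.6},
}

def update(beliefs, action):
    """One Bayesian update of the goal posterior after observing `action`."""
    posterior = {goal: p * likelihood[action][goal] for goal, p in beliefs.items()}
    total = sum(posterior.values())
    return OrderedDict((goal, p / total) for goal, p in posterior.items())

for action in ["opens_door", "moves_toward_victim", "moves_toward_victim"]:
    beliefs = update(beliefs, action)

print(beliefs)  # probability mass shifts toward "rescue_victim"
```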
He hopes to make machines able to understand people’s expressions.
“[There’s] fundamental work in natural language processing, understanding language. People who are using the system are going to talk to each other as well as with the system,” Morrison said. “We’re also going to be measuring some aspects of people: what their expressions are, how they’re behaving and what their tone of voice is.”
Special equipment will be used to record electrical activity on the surface of the scalp, a technique known as electroencephalography, or EEG. This information can indicate the emotional state of people completing a task.
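As a rough illustration of how such recordings can be turned into an emotional-state signal, the sketch below computes EEG band power and a simple beta / (alpha + theta) "engagement" ratio. The article does not describe the project's actual analysis pipeline, so the sampling rate, frequency bands and index used here are assumptions for demonstration only.

```python
# Hypothetical sketch: estimating engagement from scalp EEG band power.
# All constants below are illustrative assumptions, not project details.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def engagement_index(signal, fs=FS):
    """Crude beta / (alpha + theta) ratio, a common proxy for engagement."""
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return beta / (alpha + theta)

if __name__ == "__main__":
    # Synthetic 10-second channel standing in for a real recording.
    rng = np.random.default_rng(0)
    t = np.arange(0, 10, 1 / FS)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
    print(f"engagement index: {engagement_index(eeg):.3f}")
```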
Kobus Barnard, a computer science professor and co-principal investigator on the project, provided insight on the process.
“We will develop an understanding of team coordination and performance that is amenable to interventions,” Barnard said in an email. “For example, team dynamics in a search-and-rescue scenario or a triage scenario, where there are physical risks as well as social and emotional factors, could be improved by monitoring systems that can help the team do their job. You might think of the end goal as digital assistants for the team as a whole that understands that social and emotional factors can be extremely important for success and safety.”