How robots learn to understand our world


After breakfast, the dishes are cleared away: the empty milk carton goes in the trash, the dirty plates in the dishwasher. For humans, these hand movements are second nature. What if we could teach robots such manual skills the same way parents teach their children? A consortium of scientists from the universities of Bielefeld, Paderborn and Bremen wants to radically rethink the interaction between humans and machines and teach robots knowledge and manual skills in a natural way.

Until now, machines have mainly been programmed to perform a single task in factories and production halls. There are also laboratory robots that handle more complex tasks, trained on extremely large amounts of data and fully pre-programmed, but here too the task domains are clearly defined in advance. The scientists' goal goes beyond this: they want to break new ground in human-robot interaction and enable robots to take on completely new tasks in interaction with human users.

More powerful than the brain: artificial neural networks

This flexibility is a characteristic of computer programs based on artificial intelligence (AI). Such programs already perform complex tasks that sometimes exceed the capacities of the human brain. One example is deep learning, which uses artificial neural networks to make data-based predictions. This allows AI to generate solutions to problems that human experts might not have thought of.

One hurdle in using this technology, however, is that an AI does not provide a rationale humans can understand for why it made a particular prediction. This makes dialogue difficult, as the AI often cannot give meaningful answers to follow-up questions. For the transfer of AI technology to robotics, this is a particular problem: how can an AI-supported robot learn to adapt to the very personal requirements of its users? The scientists' answer: like a human - by means of co-construction, i.e. learning through collaboration.

Strongly positioned through collaborations

"Human-robot research is forward-looking - but in science, co-construction as a guiding principle in robotics is not addressed enough," says Prof. Dr. Philipp Cimiano, spokesman for the Research Institute for Cognitive Interaction Technology (CITEC) at Bielefeld University. He is developing the new research approach together with computer scientist Prof. Dr.-Ing. Britta Wrede from Bielefeld University, psycholinguist Prof. Dr. Katharina Rohlfing and computer scientist Prof. Dr. Axel-Cyrille Ngonga Ngomo from Paderborn University, and computer scientists Prof. Dr. Michael Beetz and Prof. Dr. Tanja Schultz from Bremen University. The initiative further develops the ongoing interdisciplinary collaborations of the three universities and bundles them in the new center CoAI. CoAI stands for Cooperative and Cognition-enabled Artificial Intelligence.

In the Collaborative Research Center/Transregio "Constructing Explainability" (SFB/TRR 318), scientists from Bielefeld and Paderborn are already investigating the cooperative practices of explanation and how these can be taken into account in the design of AI systems. At the same time, scientists at the University of Bremen are studying in the Collaborative Research Center EASE "Everyday Activity Science and Engineering" (SFB 1320) which skills robots need in order to understand their environment and their own actions.

When breakfast becomes science

Interdisciplinarity is central, says Britta Wrede, co-leader of Project Ö (public relations) in SFB/TRR 318, explaining how the research focuses complement each other: "The team in Bremen has robots with highly complex architectures. These interact with the scientists on site, but not yet with ordinary users. Robots from Bielefeld, such as Pepper, interact with humans, but cannot yet perform sophisticated actions. Paderborn, in turn, is home to experts in the principle of co-construction, who will now apply their knowledge of human interaction to robots as well."

The team spanning the three universities brings together researchers from computer science, robotics, linguistics, psycholinguistics, psychology, philosophy and the cognitive sciences. A morning kitchen scenario provides the team with a special field of research in the lab: "There are variations for every everyday action, because each of us sets the table differently or has different preferences. We are interested in precisely these actions that involve flexibility," says Philipp Cimiano. This interaction between humans and machines is an example of where AI can be put to good use, especially if it also masters the method of co-construction. In the CoAI center, he says, the goal is now to develop a common, convergent research approach that allows co-construction to be used across disciplinary boundaries.

Co-construction: creating something together

For the robot to one day be able to prepare the breakfast egg just as its user likes it, the researchers are analyzing exactly how skill learning works in humans. Co-construction as a pedagogical approach means learning through collaboration. For psycholinguist Rohlfing, this interaction is not a single moment but something that builds up: "Signals are constantly sent back and forth and result in a constant adaptation to the other person. This adaptation creates something new between two people that wasn't there before." This is how understanding and learning arise.

We know co-construction from human development. When adults teach children something, they use the method of scaffolding: "As an adult, I take over the child's part in some places so that the child can fulfill its part in other places and thus learn. Gradually, this scaffold of assistance is dismantled," Rohlfing explains. The scientists are specifically investigating how people teach others in a kitchen scenario: How do pouring, stirring and cutting work? "This is done by showing, demonstrating and presenting. For AI-supported robots to use this principle and learn manual skills, they have to be made sensitive to such assistance strategies."
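
How such gradually dismantled assistance might look in computational terms can be sketched with a toy example. The following Python snippet is purely illustrative and assumes nothing about the CoAI systems: a simulated learner succeeds more often while a teacher's assistance tops up its still-incomplete skill, and each success lets the teacher withdraw a little more of the scaffold.

```python
# A toy model of scaffolding: assistance starts high and is dismantled
# step by step as the learner succeeds. All names here are invented for
# illustration and do not come from the CoAI project.
import random

class Learner:
    """Toy learner whose chance of success grows with practice and with help."""
    def __init__(self) -> None:
        self.skill = 0.1

    def attempt(self, assistance: float) -> bool:
        # The scaffold (assistance) compensates for whatever skill is missing.
        success = random.random() < min(1.0, self.skill + assistance)
        if success:
            self.skill = min(1.0, self.skill + 0.1)  # success consolidates skill
        return success

learner = Learner()
assistance = 0.9  # at first, the adult covers most of the task
for trial in range(20):
    if learner.attempt(assistance):
        assistance = max(0.0, assistance - 0.1)  # dismantle the scaffold

print(f"skill: {learner.skill:.1f}, remaining assistance: {assistance:.1f}")
```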

New representations in robot architecture

Understanding human cognitive abilities is also fundamental for the computer scientists. "As soon as the robot can grasp what humans want and what they themselves are capable of, it can help them in direct interaction," says Michael Beetz, spokesperson for the SFB EASE. "If we have no idea of that, we cannot build systems that act with and for humans."

The goal, he says, is to create new technological foundations for robotic and AI systems. "This requires new architectures that bring all these things together: dialogue, action, perception, planning, reasoning, general knowledge, partner modeling - all aspects that are necessary for a different quality of human-machine interaction," says Beetz. With the help of this new representation in the robot - an understanding of people, and especially of action - application scenarios arise that, according to the researchers, can form a basis for flexible and meaningful everyday interaction between robots and people.
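
To make this list of capabilities a little more concrete, here is a deliberately minimal Python sketch of how dialogue, partner modeling, general knowledge and planning might interlock in a single agent. Every class and method name is invented for illustration; this is not the project's actual architecture.

```python
# An invented, minimal composition of the capabilities Beetz lists. None of
# these classes come from the CoAI project; they only illustrate how dialogue,
# knowledge, partner modeling and planning might feed into one another.
from dataclasses import dataclass, field

@dataclass
class PartnerModel:
    """What the robot currently believes about its human partner."""
    goal: str = "unknown"
    needs_help: bool = False

@dataclass
class Robot:
    # general knowledge: where everyday objects belong
    knowledge: dict = field(default_factory=lambda: {
        "milk carton": "trash",
        "dirty plate": "dishwasher",
    })
    partner: PartnerModel = field(default_factory=PartnerModel)

    def perceive(self, utterance: str) -> None:
        """Dialogue + perception: update the partner model from an utterance."""
        if "clear the table" in utterance:
            self.partner.goal = "clear the table"
            self.partner.needs_help = True

    def plan(self) -> list[str]:
        """Planning + reasoning: turn the partner's goal into concrete actions."""
        if not self.partner.needs_help:
            return []
        return [f"put the {obj} in the {place}"
                for obj, place in self.knowledge.items()]

robot = Robot()
robot.perceive("Could you help me clear the table?")
print(robot.plan())
# ['put the milk carton in the trash', 'put the dirty plate in the dishwasher']
```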


Photo (Universität Bielefeld, Michael Adamski): Prof. Philipp Cimiano.
Photo (Universität Bielefeld, Michael Adamski): Prof. Britta Wrede.
Photo (Universität Bielefeld, Patrick Pollmeier): A robot preparing breakfast.
Photo (Universität Bielefeld, Susanne Freitag): Prof. Katharina Rohlfing.
Photo (Universität Bremen): Prof. Michael Beetz.