Human Tutoring of Robots in Industry

The project Human Tutoring of Robots in Industry investigates the requirements of industrial companies for augmenting their robots with artificial intelligence for efficient human-robot interaction. We research how an artificial system that learns new actions and objects through observation and natural language descriptions by a human tutor can be adapted to the industrial context. To enable successful worker-robot interaction, however, it is also necessary to research factors influencing workers' well-being, perceived control, and related aspects. In the project, we build on results from the project RALLI on robot action and word learning through observation and natural language descriptions. Coming from basic research, the idea of the project is to investigate up-to-date requirements on human-robot interaction from stakeholders in industries that employ robots in their manufacturing processes.

Through interviews and online surveys, our system will be adapted to the industrial context, taking into account (i) requirements from the management level of industries employing industrial robots, and (ii) requirements from workers interacting with industrial robots. Partial results are implemented in URSim, focusing on non-verbal aspects of worker-cobot interaction.

Summing up, the practical results show the importance of individually adapted systems for worker-cobot interaction. In this respect, we offer consultancy for interested companies.

Publications

Press coverage

  • Interview with Stephanie Gross in an episode of matrix, ORF Radio Ö1's weekly show on new technologies, on April 23rd, 2021. The episode, entitled "Hallo, spricht hier ein Bot?", investigates the roles and capabilities of voice assistants, from social robots to Shakespeare automatons. Link

Gallery

An example of a decision tree on how a robot reacts to human gestures in a collaborative scenario. When the system recognizes a human gesture, it needs to identify the type of gesture and then act accordingly.
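
As a minimal illustration (not the project's actual implementation), such a decision tree can be encoded as a simple branching function; the gesture labels and reactions below are illustrative assumptions:

```python
# Minimal sketch of a gesture-reaction decision tree. The gesture labels
# and reactions are illustrative assumptions, not the project's actual set.

def react_to_gesture(gesture: str) -> str:
    """Map a recognized human gesture to a robot reaction."""
    if gesture == "wave":
        # The human asks for attention: face the human.
        return "direct_attention_to_human"
    elif gesture == "point":
        # The human indicates a location: attend to that location.
        return "attend_to_indicated_location"
    elif gesture == "stop":
        # Safety-relevant gesture: halt the current motion.
        return "halt_motion"
    else:
        # Unknown gesture: fall back to a safe waiting behavior.
        return "wait_and_observe"

if __name__ == "__main__":
    for g in ["wave", "point", "stop", "shrug"]:
        print(g, "->", react_to_gesture(g))
```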

Example 1 of nonverbal behavior implemented on the UR10 in the simulation environment URSim. The robot directs its attention to the human. Potential context: the robot has just finished a task and is now waiting for the human to proceed. (code)

Example 2 of nonverbal behavior implemented on the UR10 in the simulation environment URSim. The robot beckons to the human and then directs its attention to the human. Potential context: the robot has just finished a task and beckons the human to proceed with the task. (code)

Example 3 of nonverbal behavior implemented on the UR10 in the simulation environment URSim. The robot points at a certain location in the room and then directs its attention to the human. Potential context: the robot signals the human to bring an object from a certain location in the room. (code)
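
Behaviors like the three above could, for instance, be scripted against URSim over the robot's RTDE interface. The following is a minimal sketch using the open-source ur_rtde Python library; the IP address and joint configurations are illustrative assumptions, and the project's actual trajectories are in the linked code files:

```python
# Minimal sketch (illustrative assumptions throughout): driving nonverbal
# gestures on a UR10 in URSim via the ur_rtde library.
import rtde_control

URSIM_IP = "127.0.0.1"  # assumption: URSim reachable on localhost
rtde_c = rtde_control.RTDEControlInterface(URSIM_IP)

# Assumed joint configurations in radians; the project's real poses differ.
NEUTRAL = [0.0, -1.57, 1.57, -1.57, -1.57, 0.0]   # resting pose
FACE_HUMAN = [0.8, -1.2, 1.2, -1.57, -1.57, 0.0]  # tool oriented toward human
BECKON_UP = [0.8, -1.2, 1.2, -1.2, -1.57, 0.0]    # wrist raised
BECKON_DOWN = [0.8, -1.2, 1.2, -1.8, -1.57, 0.0]  # wrist lowered

# Start from the resting pose.
rtde_c.moveJ(NEUTRAL, 1.05, 1.4)

# Rough analogue of Example 2 above: beckon (repeated wrist motion),
# then direct attention to the human by holding the facing pose.
for _ in range(2):
    rtde_c.moveJ(BECKON_UP, 1.05, 1.4)
    rtde_c.moveJ(BECKON_DOWN, 1.05, 1.4)
rtde_c.moveJ(FACE_HUMAN, 1.05, 1.4)

rtde_c.stopScript()
```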

Research staff

Sponsor

Vienna Science and Technology Fund

NEXT – New Exciting Transfer Projects 2019

Key facts