Published 2022 | Version v1
Publication

Trust Metrics for Task Assignment in Cooperative Teams of Robots

Description

In the last decade, the concept of trust and its dynamics have received considerable attention in robotics research. This is particularly true in the field of human-robot interaction, where several different factors, ranging from user expectations of the robot's capabilities to the robot's physical appearance, have been identified as strongly affecting trust. By contrast, trust dynamics between robotic agents remain far less explored. Starting from this premise, this work proposes a framework in which different robotic agents can model the trust they place in each other for the accomplishment of a given task. Preliminary experiments performed with real robots (two Pepper robots and one NAO by SoftBank Robotics) provide a proof of concept for broader use of the system in cooperative robotic scenarios.
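The abstract does not specify how inter-agent trust is represented or updated, but the core idea of trust-driven task assignment can be sketched minimally. In the sketch below, all names, the exponential-moving-average update, and the greedy assignment rule are illustrative assumptions, not the authors' actual method:

```python
class TrustModel:
    """Illustrative per-agent trust scores in [0, 1], updated from task outcomes.

    Note: this is a hypothetical sketch, not the framework from the paper.
    """

    def __init__(self, agents, initial_trust=0.5, learning_rate=0.2):
        # Every known agent starts with a neutral trust score.
        self.trust = {agent: initial_trust for agent in agents}
        self.learning_rate = learning_rate

    def assign(self, candidates):
        """Greedily assign the task to the currently most-trusted candidate."""
        return max(candidates, key=lambda agent: self.trust[agent])

    def update(self, agent, success):
        """Move trust toward 1 on success, toward 0 on failure (EMA update)."""
        target = 1.0 if success else 0.0
        self.trust[agent] += self.learning_rate * (target - self.trust[agent])
```

With such a model, a robot observing a teammate repeatedly succeed at a task type would come to prefer delegating that task type to it; richer formulations could make trust task-specific or time-decaying.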

Additional details

Created:
February 11, 2024
Modified:
February 11, 2024