TY - GEN
T1 - Cognitive semantics for dynamic planning in human-robot teams
AU - Angleraud, Alexandre
AU - Houbre, Quentin
AU - Netzev, Metodi
AU - Pieters, Roel
PY - 2019
Y1 - 2019
N2 - Robots are making their way into our society and are foreseen to become an important part of our everyday life, at work and at home. Industrial factory layouts are moving robots out of enclosures, bringing them side by side with human workers. Service robots, by definition, are meant to perform tasks in our immediate proximity. To be performed successfully, these tasks, also referred to as joint actions, require coordination and trust. Coordination implies that the robot needs to account not only for its own actions and their effects on the environment but also for changes that the user introduces. Therefore, flexible planning capacities allowing on-the-fly adaptation to what a human is requesting or doing, together with a shared mental representation of the task, are needed. In this paper we present (i) a symbolic knowledge system and the way it translates into simple temporal networks (STN) to generate action plans, and (ii) interaction models based on natural language. First results indicate that the robot can build plans for a joint action according to several parameters, given its conceptual semantic description. Furthermore, a human can interactively either modify the plan or ask for explanations about it. Through several experiments we demonstrate the generation and adaptation of these dynamic human-robot collaboration plans.
DO - 10.1109/COASE.2019.8842842
M3 - Conference contribution
SN - 978-1-7281-0357-0
T3 - IEEE International Conference on Automation Science and Engineering
SP - 942
EP - 947
BT - IEEE 15th International Conference on Automation Science and Engineering (CASE) 2019
PB - IEEE
T2 - IEEE International Conference on Automation Science and Engineering
ER -