Robots are making their way into our society and are foreseen to become an important part of our everyday life, at work and at home. Industrial factory layouts are moving robots out of enclosures, bringing them side by side with human workers, while service robots are by definition meant to perform tasks in our immediate proximity. To be performed successfully, these tasks, also referred to as joint actions, require coordination and trust. Coordination implies that the robot needs to account not only for its own actions and their effects on the environment but also for changes that the user introduces. Therefore, flexible planning capacities allowing on-the-fly adaptation to what a human is requesting or doing, together with a shared mental representation of the task, are needed. In this paper we present (i) a symbolic knowledge system and the way it translates into simple temporal networks (STNs) to generate action plans, and (ii) interaction models based on natural language. First results indicate that the robot can build plans for a joint action according to several parameters, given its conceptual semantic description. Furthermore, a human can interactively either modify the plan or ask for explanations about it. Several experiments demonstrate the generation and adaptation of these dynamic human-robot collaboration plans.
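The abstract's planning machinery rests on simple temporal networks. The paper itself does not give an implementation, but the standard STN formulation is well known: each constraint bounds the difference between two time points, and the network is consistent exactly when its distance graph has no negative cycle. The sketch below illustrates that check with Floyd-Warshall; the function name and the example time points are illustrative, not taken from the paper.

```python
from itertools import product

def stn_consistent(n, constraints):
    """Check consistency of a Simple Temporal Network.

    `n` time points (0..n-1); each constraint (i, j, w) encodes
    t_j - t_i <= w.  The STN is consistent iff the distance graph
    has no negative cycle (checked here via Floyd-Warshall).
    """
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j, w in constraints:
        d[i][j] = min(d[i][j], w)
    for k, i, j in product(range(n), repeat=3):
        if d[i][k] + d[k][j] < d[i][j]:
            d[i][j] = d[i][k] + d[k][j]
    # A negative entry on the diagonal means a negative cycle.
    return all(d[i][i] >= 0 for i in range(n))

# Illustrative joint-action fragment: action A takes 5-10 s and a
# human step B must start within 2 s of A ending.
# Points: 0 = A_start, 1 = A_end, 2 = B_start.
ok = stn_consistent(3, [
    (0, 1, 10), (1, 0, -5),   # 5 <= A_end - A_start <= 10
    (1, 2, 2),  (2, 1, 0),    # 0 <= B_start - A_end <= 2
])
```

On-the-fly adaptation then amounts to adding or tightening constraints (e.g. when the human changes the plan) and re-running the consistency check before executing.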
Conference: IEEE International Conference on Automation Science and Engineering