Mobile and adaptive User interface for human robot collaboration in assembly tasks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


The manufacturing sector is constantly looking for more efficient ways of production. Industry 4.0 technologies such as augmented and mixed reality, connectivity, and digitalisation, together with the current trend of robotisation, have resulted in a number of technical solutions to support production in factories. The combination of human-robot collaboration and augmented reality shows great promise. The challenges in this case arise from the need to reconfigure the physical production layout and from how to deliver digital instructions to the operator. This paper introduces a model for collaborative assembly tasks that uses a mobile user interface based on depth sensors and a projector. The novelty of this research lies in the adaptivity of the user interface: it can be moved freely between tasks around the workstation according to the operator's needs and the requirements of the tasks. The ability to move the projection surface is achieved by detecting the surface position with ArUco markers and computing the required transformation of the projector image.
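The transformation step described in the abstract can be sketched as follows. In practice the surface corners would come from marker detection (e.g. OpenCV's ArUco module), and OpenCV provides `cv2.findHomography` and `cv2.warpPerspective` for this; the minimal NumPy sketch below shows only the underlying homography estimation via the direct linear transform (DLT). The function names and the numeric corner coordinates are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points (DLT).
    src, dst: (N, 2) arrays of corresponding points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(A)
    # The homography is the right null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise scale

def apply_homography(H, pts):
    """Apply H to (N, 2) points, returning transformed (N, 2) points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Hypothetical example: corners of the movable surface as detected by the
# camera (pixel coordinates) mapped to the projector's full image plane.
surface_corners = np.array([[120., 80.], [520., 95.], [510., 400.], [130., 390.]])
projector_corners = np.array([[0., 0.], [1280., 0.], [1280., 800.], [0., 800.]])
H = estimate_homography(surface_corners, projector_corners)
```

With `H` computed, the projected content would be pre-warped (e.g. with `cv2.warpPerspective`) so that it appears undistorted on the surface wherever the operator places it.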

Original language: English
Title of host publication: 2021 20th International Conference on Advanced Robotics (ICAR)
Number of pages: 6
ISBN (Electronic): 9781665436847
ISBN (Print): 9781665436854
Publication status: Published - 2021
Publication type: A4 Article in a conference publication
Event: International Conference on Advanced Robotics - Ljubljana, Slovenia
Duration: 6 Dec 2021 – 10 Dec 2021


Conference: International Conference on Advanced Robotics

Publication forum classification

  • Publication forum level 1

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Software

