2018 | OriginalPaper | Chapter
GroundSim: Animating Human Agents for Validated Workspace Monitoring
Authors: Kim Wölfel, Tobias Werner, Dominik Henrich
Published in: Tagungsband des 3. Kongresses Montage Handhabung Industrieroboter
Publisher: Springer Berlin Heidelberg
In the promising field of human-robot cooperation, robot manipulators must account for humans in the shared workspace. To this end, current prototypes integrate various algorithms (e.g. path planning or computer vision) into complex solutions for workspace monitoring. The step from research to industrial use demands rigorous validation of the underlying software with both real-world and synthetic data. Related fields (e.g. human factors and ergonomics) provide toolsets to create synthetic data of human-machine interactions. However, existing toolsets rely on hand-crafted motion paths or motion segments for their human agents, which limits the variety of resulting motions and makes composing animation sequences laborious. In contrast, we contribute a novel approach to human animation for synthetic validation: we animate our human agents through a realistic physics simulation and expose motion paths in a flexible, intuitive high-level editing interface. We also generate photo-realistic images of the resulting animations through state-of-the-art rendering techniques. Finally, we employ these synthetic images and their ground-truth backing to validate a prototype workspace monitoring system and a subsequent online path planner.
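The core idea of ground-truth-backed validation can be illustrated with a minimal sketch: because the synthetic images come from a simulation, the true occupied region of the human agent is known exactly, so a monitoring system's output can be scored against it. The code below is an illustrative assumption, not the paper's actual pipeline; the masks, the shift, and the intersection-over-union metric are all hypothetical stand-ins for the rendered images and the monitoring prototype.

```python
import numpy as np

def iou(prediction: np.ndarray, ground_truth: np.ndarray) -> float:
    """Intersection-over-union of two boolean occupancy masks.

    Returns 1.0 for two empty masks (vacuously perfect agreement).
    """
    intersection = np.logical_and(prediction, ground_truth).sum()
    union = np.logical_or(prediction, ground_truth).sum()
    return float(intersection) / float(union) if union else 1.0

# Hypothetical ground truth from the simulation: the human agent
# occupies a known 20x20 region of a 64x64 workspace grid.
gt = np.zeros((64, 64), dtype=bool)
gt[20:40, 20:40] = True

# Hypothetical monitor output: the same region detected with a
# small vertical offset, as a real vision pipeline might produce.
pred = np.zeros((64, 64), dtype=bool)
pred[22:42, 20:40] = True

print(f"IoU against ground truth: {iou(pred, gt):.2f}")
```

A validation run would repeat this comparison over many rendered frames and motion variants, flagging frames where the score drops below an acceptance threshold.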