Automatica 2008

05/08/2008 : 13:05

CoTeSys’ first participation in a trade fair was extremely successful.

The three exhibits attracted a professional audience from industry and science. At times, the visitors requesting information simultaneously far outnumbered the CoTeSys staff of six. Many students also expressed great interest in the work of the Cluster.

Dr. Haass stated: "We were taken by surprise. Every day our stand had to cope with huge crowds of professionals and the high quality of their interest. It appears that the automation industry desperately needs solutions where human workers cooperate with machines and robots side by side. This is one of the core competences of CoTeSys."

The eye-tracking camera "EyeSeeCam" in particular drew the attention of many visitors. This new wearable, eye-controlled camera system has been developed to continuously align a camera with the direction of the user's gaze. The camera moves as fast as the eyes and thereby records exactly what the user sees.

Furthermore, the exhibit "Japanese table" showed a photorealistic model of a typical household environment. The images rendered from this model serve as a prediction of the visual appearance of the scene during any manipulation of its objects. Cognitive systems need an appropriate model of their environment in order to execute tasks. Currently, robots use geometric models for navigation and for modeling the environment. However, these methods fail for filigree structures or translucent objects. Here, image-based models yield realistic representations and enable the cognitive system to reason about the true visual appearance of its surroundings.

The third exhibit, "Joint Action for Humans and Industrial Robots (JAHIR)", is part of the research platform Cognitive Factory. It combines the respective strengths of humans and robots: the robot hands over work pieces or tools for the next construction step, or takes on repetitive processes, while the worker, with his fine motor skills and superior perceptual capabilities, can meanwhile handle the more delicate tasks more effectively than his robot companion.