Real-World Interaction Challenge

KUKA Innovation Award 2018

The finalists for the 2018 KUKA Innovation Award have been chosen! An international jury of experts selected the five best concepts addressing this year’s topic: the “Real-World Interaction Challenge”. The finalist teams now have until April 2018 to implement their ideas. At next year’s Hannover Messe, the winner will be awarded the 20,000-euro prize.

This year’s competition focuses on robots that can interact outside as well as inside the industrial environment, with emphasis on direct support for humans. The concept and presentation of the applications should be as realistic as possible.

To enable fair comparison of the concepts, KUKA provides each of the final teams with a flexFellow – a mobile robotic unit on which an LBR iiwa (a sensitive lightweight robot for safe human-robot collaboration) is mounted. Beyond this, a 3D vision system from the start-up company Roboception will be available.

On this page we present the teams in more detail.

Team Alberta

The Robot Vision research group at the University of Alberta works on image-guided motion control of robot arms and hands. The team is implementing processes that allow the robot to learn from humans through observation, gestures and dialog, so that future robot systems can work with humans even in unstructured environments. With this acquired knowledge, the robot should be able to grip various everyday objects, workpieces and components and sort them independently, even when new, unknown objects are introduced.

"We think of the fair as a window to the future, a place where you can present and discover real innovations. For us, it is a great opportunity to build a bridge between research and industry."

Team Alberta - University of Alberta, Canada

Team CoAware

The project brings together experts from the Istituto Italiano di Tecnologia in the fields of dynamic human modeling, image processing and robot interaction control. Its objective is to deploy robots to support and guide humans in laborious industrial processes in order to ease the strain on them and prevent injury. When collaborating with a human, the robot monitors the dynamic performance and mental condition of its counterpart.

"The idea is to make robots collaborate with humans - not to avoid them. This requires a lot of data fusion and is the biggest challenge we have."

Team CoAware - Istituto Italiano di Tecnologia, Italy

Team CRoW

The team from the Institute for Computational Design and Construction at the University of Stuttgart combines expertise in algorithmic geometry and the development of robot-based material processing systems. In this project, it aims to provide small and medium-sized companies with access to robot-assisted methods of work. The concept comprises a collaborative robot workbench with an augmented reality interface. The project will demonstrate a woodworking scenario in which a robot assists a human.

"Existing robotic interfaces and workflows require extensive expert knowledge and experience. We believe that they should radically change. With the help of Augmented Reality, non-expert users should be able to engage with robots as well."

Team CRoW - University of Stuttgart, Germany

Team DynaMaP

Researchers from Draper, a non-profit R&D organization, the Robot Locomotion Group at MIT and the Agile Robotics Lab at Harvard aim to show that robots can orient themselves and execute tasks in unstructured environments. To this end, the team uses neural networks to determine the positions and interaction dynamics of objects in the environment. The team's developments are demonstrated by means of a maintenance task that is typically carried out by a human.

"If a robot is working next to a person at a cluttered workbench or in a crowded kitchen, it needs to be able to effectively recognize objects and to manipulate them – without really knowing anything about them beforehand."

Team DynaMaP - MIT & Harvard University, USA

Team UPEnD

Four researchers from the University of Pennsylvania combine their expertise across robotics disciplines, ranging from image processing and manipulation planning to system integration. The team is tackling the challenges a robotic system faces when handling containers filled with liquids for exact dosing, for example in the pharmaceutical industry. The robot is controlled using sensors on the robot arm and two stereo cameras.

"We believe that our project really embodies the goal of KUKA: to have robot teammates that can work effectively and safely hand in hand with human colleagues."

Team UPEnD - University of Pennsylvania, USA

About the KUKA Innovation Award

KUKA launched the research competition in 2014 to promote innovation in the field of robot-based automation and to support technology transfer from science to industry. It is aimed at developers, graduates and research teams from companies or universities. The participants develop ideas for tackling challenges specified by KUKA. A jury of experts selects the finalists from all of the entries submitted.

The best teams implement their projects using KUKA hardware and present the results to wide-ranging specialist audiences at major trade fairs. The winners receive a monetary prize of 20,000 euros. The award was first presented at AUTOMATICA 2014.

Timeline of the KUKA Innovation Award 2018