Real-world Interaction Challenge

KUKA Innovation Award 2018 - Get to know the Finalists

The finalists for the 2018 KUKA Innovation Award have been chosen! An international jury of experts selected the five best concepts addressing this year’s topic: the “Real-World Interaction Challenge”. The finalist teams now have until April 2018 to implement their ideas. At next year’s Hannover Messe, the winner will be awarded the 20,000-euro prize.

This year’s competition focuses on robots that can interact outside as well as inside the industrial environment, with emphasis on direct support for humans. The concept and presentation of the applications should be as realistic as possible.

To enable a fair comparison of the concepts, KUKA provides each of the finalist teams with a flexFellow – a mobile robotic unit on which an LBR iiwa (a sensitive lightweight robot for safe human-robot collaboration) is mounted. In addition, a 3D vision system from the start-up company Roboception will be available.

Team Alberta

The Robot Vision research group from the University of Alberta is working on image-guided motion control of robot arms and hands. The team is implementing processes that allow the robot to learn from humans by means of observation, gestures and dialog, so that future robot systems will be able to work with humans even in unstructured environments. The goal is for the robot to use this acquired knowledge to grip and independently sort various everyday objects, workpieces and components, even when new, unknown objects are included.

"We want to provide a possibility for humans to transfer their knowledge to robots and vice versa."

"We are thinking the fair is like a window to the future, a place where you can present and discover real innovations . For us, it is a great opportunity to build a bridge between research and industry."

Team Alberta - University of Alberta, Canada

Team CRoW

The team from the Institute for Computational Design and Construction at the University of Stuttgart combines expertise in algorithmic geometry and the development of robot-based material processing systems. In this project, the team aims to give small and medium-sized companies access to robot-assisted working methods. The concept comprises a collaborative robot workbench with an augmented reality interface. The project will demonstrate a woodworking scenario in which a robot assists a human.

"We believe that augmented reality can make complex processes and complex tools like robots more accessible to a larger audience."

"Existing robotic interfaces and workflows require extensive expert knowledge and experience. We believe that they should radically change to enable non-expert users to engage with robots as well. It’s about real-life, direct interaction, where a user can make informed decisions and influence robotic processes, based on tactile feedback and superimposed data."

Team CRoW - University of Stuttgart, Germany

Team Co-Aware

The project brings together experts from the Istituto Italiano di Tecnologia in the fields of dynamic human modeling, image processing and robot interaction control. Its objective is to deploy robots to support and guide humans in laborious industrial processes in order to ease the strain on them and prevent injury. When collaborating with a human, the robot monitors the dynamic performance and mental condition of its counterpart.

"The idea is to make robots collaborate with humans - not to avoid them. This requires a lot of data fusion and is the biggest challenge we have."

"Hannover Fair is a great opportunity for us to interact with people of the industry who use or develop robots – and we want to inspire them with our idea."

Team Co-Aware - Istituto Italiano di Tecnologia, Italy

Team DynaMap

The team brings together employees of Draper, a non-profit R&D organization, the Robot Locomotion Group at MIT and the Agile Robotics Lab at Harvard, and aims to show that robots can orient themselves and execute tasks in unstructured environments. To this end, the team uses neural networks to determine the positions and interactive dynamics of objects in the environment. The team's developments will be demonstrated by means of a maintenance task that is typically carried out by a human.

"We are building robots which can interact in a regular human environment instead of being limited to traditional fixed industrial applications."

"If a robot is working next to a person at a cluttered workbench or in a crowded kitchen, it needs to be able to effectively recognize objects and to manipulate them without really knowing anything about them beforehand. Our control software enables the robot to plan actions and trajectories on-the-fly, depending on the sensed configuration of the objects."

Team DynaMap - MIT & Harvard University, USA

Team UPEnD

Four researchers from the University of Pennsylvania combine expertise from robotics disciplines ranging from image processing and manipulation planning to system integration. The team is tackling the challenges a robotic system faces when handling liquid-filled containers and performing exact dosing, for example in the pharmaceutical industry. The robot is controlled using sensors on the robot arm and two stereo cameras.

"We believe that our project really embodies the goal of KUKA: To have robot team mates that can work effectively and safely hand in hand with human colleagues."

"We're looking forward to the great opportunity to demonstrate our work at the fair. It is going to be really exciting to present it in front of a large audience."

Team UPEnD - University of Pennsylvania, USA

Timeline after the application phase

The finalists are now realizing their ideas with a KUKA robot. They will present their developments to an international audience of specialists at Hannover Messe 2018, a major flagship trade fair. There, the jury will crown the winner of the renowned competition.

Timeline of the KUKA Innovation Award 2018
