
Main Motivation

The increased availability of affordable robotic manipulation systems such as robot arms and grippers over the last decade has opened new perspectives for various applications. However, these systems lack the intelligence to interpret their working and environmental conditions, which makes them unreliable in real-life use.

A key challenge of intelligent robotics is to create robots that are capable of directly and autonomously interacting with and manipulating the world around them to achieve their goals. Learning is essential to such systems, as the real world contains too many variations for a robot to have in advance an accurate model of human requests and behaviour, of its surrounding environment and objects, or of the skills required to manipulate them. It is therefore crucial to be able to transfer knowledge and abilities from one application to another and from one robotic system to another.

Further aspects are the safety and acceptability of the system, which must be guaranteed at all times. Robots must detect conditions in which they cannot solve a problem while respecting safety requirements, and they must be accepted by their human users.

Project Goal and Objectives

The IntelliMan project focuses on how a robot can learn efficiently to perform tasks in a targeted, high-performance and safe manner.

Main Goal

To enable next-generation robots to learn efficiently how to interact with and manipulate their surroundings in a purposeful and highly performant way. Such robots must be able to:

  • Perform tasks with limited human supervision.
  • Autonomously interact with objects regardless of their material, size and shape.
  • Deal with environments that neither they nor their designers have foreseen or encountered before.
  • Transfer knowledge between different systems and domains in a platform-independent manner.

As an additional goal, the project investigates user perception of such AI-powered robotic manipulation systems and the factors that enhance human acceptability.


Objectives

Development of next-generation robotic manipulation systems, empowered by artificial intelligence, that:

  • Learn individual manipulation skills from human demonstration.
  • Learn abstract descriptions of manipulation tasks suitable for high-level planning.
  • Discover functionalities of objects by interaction.
  • Guarantee performance and safety via automated, adaptive fault detection, and ensure user acceptability.

Use cases as demonstration objectives

The development of various solutions to the problem of interactive object manipulation in complex environments is supported by a comprehensive set of application scenarios. These use cases will demonstrate the effectiveness of the technologies developed:

Upper-limb prosthetics

Increase user acceptance and trustworthiness and enhance embodiment in upper-limb prostheses.

Daily-life kitchen activities

Reliable robotic manipulation of everyday kitchen objects.

Robot-based assembly of products

Robust handling of deformable objects.

Robotic fresh food handling for logistic applications

Sensing-based multi-finger grasping.

Outcomes and expected impact

The project targets a wide variety of specialized deployment scenarios for manipulation robots in unstructured environments that neither the robots nor their designers have foreseen or encountered before. The potential for autonomous manipulation applications with robots capable of manipulating their environment is vast: hospitals, elder and child care, factories, outer space, restaurants, service industries, and home environments.

The expected outcomes can be summarized as:

  • Broader adoption of AI-oriented methods in robotic manipulation and human-machine interaction for increased portability, affordability, reliability and safety.
  • Penetration into industry through safe, fast and easily adaptable AI-powered manipulation systems.
  • Increased acceptance of robots by the general population.
  • Improved acceptability and reliability of AI-enhanced prostheses and service robots.
  • Raised awareness of the benefits of AI-oriented manipulation methods.

IntelliMan PERT chart