About us

The FlexSight experiment is funded by the European Community's ECHORD++ project. FlexSight is a research project that involves: the Ro.Co.Co. Laboratory of DIAG (Department of Computer, Control, and Management Engineering Antonio Ruberti at Sapienza University of Rome), which provides expertise in software design and development for robotic perception applications; IT+Robotics, which provides expertise in the industrial robotics domain; and Robox, a company specialized in the development of innovative hardware and software solutions for robotics and control systems.

To learn more about the FlexSight sensor, please contact the project coordinator:

Alberto Pretto pretto@dis.uniroma1.it

Goal

The FlexSight experiment aims to provide a robotic solution for the “pick&place” class of applications with rigid and deformable objects. The project focuses on building a prototype smart camera – the FlexSight Sensor (FSS) – which can be integrated into the chassis of an existing robot to empower it with detection and localization capabilities. The main objectives of the FlexSight experiment are:

  1. Enable a robot to perceive a broad class of rigid and deformable objects accurately and reliably, with particular emphasis on the computational speed of the whole system.
  2. Implement a prototype of a compact industrial sensor (the FlexSight Sensor, FSS) that integrates all the sensing and processing required to run the detection and localization algorithms inside a small, robust chassis.
  3. Integrate the FSS into a working system that will be tested in several industrial and logistic use cases.

FlexSight C1: First working prototype at Automatica 2018

The FlexSight Consortium is proud to announce the first working prototype of the FlexSight C1 device. The prototype was presented at the Automatica 2018 international trade fair in Munich. The C1 was featured in a random bin picking demo in which an anthropomorphic robotic manipulator was driven directly by the sensor, without the need for any external computation unit, to collect deformable objects in a cluttered environment. The demonstration attracted considerable interest from the professional and academic community, showcasing the sensor’s main features: deep learning based object detection and localization algorithms, high-resolution and high-quality 3D reconstruction, compactness, and an embedded all-in-one solution to common high-level industrial tasks, among others.

The FlexSight C1 brings together the power of a high-end image processing unit with the usability of a compact, ready-to-use camera device. It is equipped with high-resolution cameras and powerful tools such as a pseudo-random pattern laser projector, which is used to recover a 3D reconstruction of the working environment. This hardware is complemented by novel and powerful software for solving complex industrial tasks, e.g. deformable object detection and recognition, stereo-camera-driven 3D reconstruction, and automatic grasping point generation.
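To give a concrete sense of how a projected pattern and a stereo camera pair can yield a 3D reconstruction, here is a minimal sketch using OpenCV's semi-global block matcher. It is not the FlexSight implementation: the image file names, focal length, and baseline below are illustrative assumptions only.

```python
# Minimal sketch (assumed parameters, not the FlexSight C1 pipeline):
# dense depth from a rectified stereo pair with a projected pattern.
import cv2
import numpy as np

# Rectified left/right images; the projected pseudo-random pattern adds
# texture so matching also works on bland, textureless surfaces.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching produces a dense disparity map.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM output is fixed-point

# Classic stereo relation: depth Z = f * B / disparity.
f, B = 1400.0, 0.10          # focal length (pixels) and baseline (metres), illustrative values
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * B / disparity[valid]
```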

The next showcase of the FlexSight C1 sensor will take place in October 2018 in Padova, at the IT+Robotics facility.

Below is a brief video review of our amazing experience in Munich.

Project presentation

FlexSight @ Hannover Messe

Research

Sensor Design

Design and development of hardware solutions based on different technologies, ensuring robustness and reliability of the perception system.

Object Recognition and Localization

Implementation of state-of-the-art algorithms for the recognition and localization of textureless and deformable objects. The system must provide a high confidence level for recognition along with highly accurate full 3D pose estimation of the objects in the workspace.

System Integration

The FlexSight sensor will be integrated into a fully autonomous system that will be tested in several industrial and logistic applications.
