Theme 3 Robotics for Industrial Automation (R4IA)
Lab L01029 is dedicated to this theme.
- Robotics for Manufacturing of Large-Size Structures: A wheeled UGV (Husky), a serial manipulator (UR5 arm), a gripping mechanism, and an attached brick storage are used for bricklaying tasks in construction processes. Mechanical system development focused on sizing the ground and aerial robots and on developing appropriate gripping systems to pick and place the bricks. To make brick collection more efficient, a storage platform was attached to the UGV. Autonomous navigation of both the ground and aerial robots is achieved by fusing data from the onboard sensors (including a Velodyne LiDAR): wheel odometry, visual odometry, IMU, altitude, and GPS measurements are combined to localize the robots.
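The odometry/GPS fusion described above can be illustrated, in a deliberately simplified one-dimensional form, as a Kalman filter predict/update cycle. This is only a sketch: the actual robots fuse wheel odometry, visual odometry, IMU, altitude, and GPS in full 3D, and all noise values below are made-up assumptions.

```python
def kf_step(x, P, u, z, q=0.05, r=1.0):
    """One predict/update cycle for a scalar position state.

    x, P : current position estimate and its variance
    u    : odometry displacement since the last step (prediction input)
    z    : absolute position measurement (e.g. a GPS fix)
    q, r : assumed process / measurement noise variances (illustrative)
    """
    # Predict: dead-reckon with odometry; uncertainty grows by q.
    x_pred = x + u
    P_pred = P + q
    # Update: blend in the absolute fix, weighted by the Kalman gain.
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

# Three steps of driving ~1 m per step with slightly noisy GPS fixes.
x, P = 0.0, 1.0
for u, z in [(1.0, 1.2), (1.0, 2.1), (1.0, 2.9)]:
    x, P = kf_step(x, P, u, z)
```

The same structure generalizes to the multi-sensor 3D case by replacing the scalars with state vectors and covariance matrices.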
- Compliant Manipulator: a Baxter robot with a vision-based gripper that uses event-based cameras to observe deformations of the object to be grasped, extracting data that assists effective grasping.
- Novel smart biorobotic assistive exoskeletons for the rehabilitation of stroke patients, and novel discrete variable stiffness actuators for safe Human-Robot Interaction (HRI). The novelty of the actuators lies in a design topology that allows the stiffness level to be changed independently of the actuator position.
- A soft robot for underwater locomotion and manipulation, inspired by prokaryotic flagella.
Noteworthy equipment includes Ultimaker and LulzBot 3D printers, a Mitsubishi industrial manipulator with 6 DoF and a 6 kg payload, and NAO humanoid robots.
- Marine Robotics Facility with wave tank system
The marine robotics testing facility in D00058 enables testing in a controlled environment that simulates the adverse conditions of the sea. The facility comprises an approximately 10 m x 15 m pool in a 300 m2 (x 8 m height) lab space. To achieve this, an integrated wave tank system is required, consisting of a wave generator, a flow/current generator, and an overhanging (gantry-type) mechanism spanning the full pool width. This infrastructure will allow extensive, controlled testing of marine robotic vehicles, including underwater, surface, and bio-robotic autonomous devices as well as remotely teleoperated devices.

Theme 2 Robotics for Extreme Environments (R4EE)
Labs G0010, C00059 and C00060 are dedicated to this theme. The primary objective is to investigate robot-based emergency response systems for extreme environments.
- Collaborative Robot Navigation and Robot Interventions in Harsh Environments
Here, we deploy collaborative robots (DJI Wind 4 and Agras MG-1S drones, and Jackal ground vehicles) in a fire-related urban search and rescue (USAR) scenario. The work addresses the following objectives for the application of a team of UAVs and UGVs in a high-rise building fire scenario:
- Intelligent feature detection, state estimation and data fusion (using deep learning workstations)
- Robust localization using multiple sensors (GPS fused with radar, LiDAR, a visual camera, and the drone's onboard sensor suite)
- Decentralized collaboration of heterogeneous robot systems
- Optimized mission planning and decision making
- Search-space coverage path planning experimentation for inspecting complex structures such as aircraft using a drone (e.g., a DJI Matrice 100)
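The simplest instance of the coverage path planning mentioned above is a lawnmower (boustrophedon) pattern: the drone sweeps a rectangular area in alternating passes spaced by the camera footprint width. Real aircraft-inspection planners operate on 3D structure models; this 2D sketch is only illustrative, and all names and units are assumptions.

```python
def lawnmower(width, height, spacing):
    """Waypoints covering a width x height area in back-and-forth passes.

    Passes are `spacing` apart (units arbitrary but consistent); each
    pass contributes a start and an end waypoint.
    """
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height:
        xs = (0.0, width) if left_to_right else (width, 0.0)
        waypoints.append((xs[0], y))   # pass start
        waypoints.append((xs[1], y))   # pass end
        left_to_right = not left_to_right  # reverse direction each pass
        y += spacing
    return waypoints

# A 10 x 4 area swept with passes 2 apart: three passes, six waypoints.
path = lawnmower(10.0, 4.0, 2.0)
```

In practice the spacing is derived from the sensor footprint and a desired image overlap, and the resulting 2D pattern is projected onto the structure's surface.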
Some of the noteworthy equipment used includes the OptiTrack motion capture system, various small drones, and the Quanser QDrone, which are used for indoor localization and algorithm testing.
Theme 1 Robotics for Infrastructures Inspection (R4II)
The Visual Signal and Processing lab L03029 is dedicated to this theme. The focus will be on autonomous and persistent surveillance of civil and industrial infrastructures, based on easy-to-deploy unmanned vehicles.
VSAP focuses on vision and perception research, where the key equipment consists of high-performance workstations and computers for training the deep learning models required for autonomous vehicle, infrastructure inspection, and human tracking projects. The lab is equipped with state-of-the-art machines that deliver the high-performance computing needed for deep learning applications, including Nvidia Quadro graphics cards and the latest Nvidia RTX graphics cards. In addition, an array of professional-grade, specialized cameras for depth, tracking, and localization is available for the creation of novel visual datasets.
- Lambda TensorBook laptops: high-performance, deep-learning-optimized laptops with a 6-core CPU and an Nvidia RTX 2080 Max-Q GPU with 8 GB of VRAM
- A collection of ZED stereoscopic cameras and Intel® RealSense™ D435 depth cameras
- Nvidia Jetson Xavier: AI computing platform for autonomous machines and deep learning
The following state-of-the-art sensors and cameras are used:
Hyperspectral cameras, such as the Photonfocus SN5*5 HS camera and the Ximea SN5*5 HS camera, are used for image detection and analysis for UAVs operating in harsh environments
Other sensors in VSAP include:
Lumidigm V-Series V302 Multispectral Fingerprint Scanner
Fujitsu PalmSecure Palm Vein Scanner
FLIR Camera
Portable 3D Face Scanner
3D Face Scanner
- Caps-sim: Software Packages Development For Active Capsules
A virtual simulation environment for a magnetically driven active endoscopic capsule. The environment will integrate haptic feedback into the endoscopic system, allowing doctors to control the robotic capsule's movements based on both vision and the interaction forces with the colon environment. The aims are to give the human operator adequate guidance and an intuitive control interface to perform the best procedure possible, and to introduce new diagnosis and treatment procedures, such as performing palpation inside the colon, feeling the strength of the colon contractions and their effect on the capsule motion, and controlling and feeling the pressure of a tool attached to the capsule for possible new intervention procedures. Development is done in both Gazebo and SOFA, where the latter allows
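A minimal sketch of how such a haptic loop might render the capsule pressing against the colon wall is a penalty-based (spring-damper) contact model: the deeper the capsule penetrates the simulated tissue, the larger the force sent to the haptic device. The stiffness and damping values below are illustrative assumptions, not tuned simulation parameters.

```python
def contact_force(penetration, velocity, k=300.0, b=2.0):
    """Spring-damper (Kelvin-Voigt) normal contact force in newtons.

    penetration : how far the capsule has pushed into the wall (m);
                  values <= 0 mean no contact, hence no force.
    velocity    : penetration rate (m/s), used for the damping term.
    k, b        : assumed tissue stiffness (N/m) and damping (N*s/m).
    """
    if penetration <= 0.0:
        return 0.0
    # Clamp at zero so the model never pulls the capsule inward.
    return max(0.0, k * penetration + b * velocity)
```

In a real system this force would be computed at the haptic update rate (typically around 1 kHz) and sent to the device so the operator feels the wall resistance.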
Noteworthy equipment used here:
- Geomagic Touch X haptic device: haptic feedback device with 6-degree-of-freedom positional sensing and 3 degrees of freedom for force output
- Oculus Rift S VR headset: inside-out tracking for an immersive simulation experience