KU Center for Autonomous Robotic Systems

Facility

Theme 1: Robotics for Infrastructure Inspection (R4II)

The Visual Signal Analysis and Processing (VSAP) lab, L03029, is dedicated to this theme. The focus is on autonomous, persistent surveillance of civil and industrial infrastructure using easy-to-deploy unmanned vehicles.

VSAP focuses on vision and perception research, where the key equipment is high-performance workstations and computers for training the deep learning models required for the autonomous vehicle, infrastructure inspection, and human tracking projects. The lab is equipped with state-of-the-art machines that deliver the high-performance computing needed for deep learning applications, including Nvidia Quadro and the latest Nvidia RTX graphics cards. In addition, an array of professional-grade and specialized cameras for depth, tracking, and localization is at the lab's disposal for the creation of novel visual datasets.

  • Lambda TensorBook laptops: high-performance, deep-learning-optimized laptops with a 6-core CPU and an Nvidia RTX 2080 Max-Q GPU with 8 GB of VRAM
  • A collection of ZED stereoscopic cameras and Intel® RealSense™ D435 depth cameras for depth sensing, tracking, and localization (see the sketch after this list)
  • Nvidia Jetson Xavier: AI computing platform for autonomous machines and deep learning
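
As an illustration of how the depth cameras feed these projects, here is a minimal sketch that grabs a single depth reading from a RealSense D435 through Intel's pyrealsense2 wrapper. The stream resolution, frame rate, and probed pixel are illustrative choices, not lab settings.

```python
# Minimal sketch: read one depth value from an Intel RealSense D435.
# Requires the librealsense Python wrapper (pip install pyrealsense2).
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Request a 640x480 depth stream at 30 FPS (illustrative parameters).
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # Distance in meters to the pixel at the center of the frame.
    dist = depth.get_distance(320, 240)
    print(f"Distance at image center: {dist:.3f} m")
finally:
    pipeline.stop()
```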

The following state-of-the-art sensors and cameras are used:

Hyperspectral cameras such as the Photonfocus SN5*5 HS camera and the Ximea SN5*5 HS camera are used for image detection and analysis on UAVs operating in harsh environments.
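
Snapshot-mosaic hyperspectral cameras of this kind interleave spectral bands spatially. The sketch below shows one way to unpack such a frame into a band cube, assuming a 5x5 mosaic period; the array shape, bit depth, and band ordering are assumptions for illustration and do not reflect either vendor's actual calibration.

```python
# Sketch: unpack a 5x5 snapshot-mosaic frame into a 25-band cube,
# assuming each 5x5 pixel block samples 25 spectral bands.
import numpy as np

def demosaic_5x5(raw: np.ndarray) -> np.ndarray:
    """Split a raw mosaic frame (H, W) into a (25, H//5, W//5) band cube."""
    h, w = raw.shape
    h, w = h - h % 5, w - w % 5  # crop to a multiple of the mosaic period
    bands = [raw[i:h:5, j:w:5] for i in range(5) for j in range(5)]
    return np.stack(bands)

# Fake 12-bit frame standing in for real sensor output.
raw = np.random.randint(0, 4096, (1080, 2045), dtype=np.uint16)
cube = demosaic_5x5(raw)
print(cube.shape)  # (25, 216, 409)
```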

Other sensors in VSAP include:

  • Lumidigm V-Series V302 Multispectral Fingerprint Scanner
  • Fujitsu PalmSecure Palm Vein Scanner
  • FLIR Camera
  • Portable 3D Face Scanner
  • 3D Face Scanner

  • Caps-sim: Software Package Development for Active Capsules

A virtual simulation environment for a magnetically driven active endoscopic capsule. The environment will integrate haptic feedback into the endoscopic system, allowing doctors to control the robotic capsule's movements based on both vision and the interaction forces with the colon environment. The aims are to give the human operator adequate guidance and an intuitive control interface to perform the best procedure possible, and to introduce new diagnosis and treatment procedures, such as performing palpation inside the colon, feeling the strength of the colon contractions and their effect on the capsule's motion, and controlling and feeling the pressure of a tool attached to the capsule for a possible new intervention procedure. Development is done on both Gazebo and SOFA, where the latter allows simulation of soft, deformable bodies such as the colon wall.
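
A minimal sketch of the kind of bilateral control loop described above, not the lab's actual implementation: device, actuator, and sim (and their methods) are hypothetical stand-ins for the haptic-device SDK, the magnetic actuation driver, and the Gazebo/SOFA simulation, respectively.

```python
# Hedged sketch of a bilateral haptic teleoperation loop. The `device`,
# `actuator`, and `sim` objects and all their methods are hypothetical
# placeholders, not real APIs.
import time

import numpy as np

LOOP_HZ = 1000      # haptic rendering loops typically run near 1 kHz
FORCE_SCALE = 0.8   # fraction of the simulated contact force reflected back

def teleoperation_loop(device, actuator, sim):
    period = 1.0 / LOOP_HZ
    while sim.is_running():
        pose = device.read_stylus_pose()           # operator's 6-DOF stylus pose
        actuator.command_magnetic_field(pose)      # steer the capsule toward it
        force = np.asarray(sim.contact_force())    # capsule-colon interaction force
        device.render_force(FORCE_SCALE * force)   # let the operator feel it
        time.sleep(period)
```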

Noteworthy equipment used here:

  • Geomagic Touch X haptic device: haptic feedback device with 6-degree-of-freedom positional sensing and 3-degree-of-freedom force feedback
  • Oculus Rift S VR headset: inside-out tracking for an immersive simulation experience