RESEARCH
Smart Vision & Robotic Sensing
Professor, Robotics Laboratory
Smart Innovation Program, Graduate School of Advanced Science and Engineering
Hiroshima University
Idaku ISHII
- >> Research Contents
- To establish high-speed robot senses that are much faster than human senses, we are researching and developing information systems and devices that achieve real-time image processing at 1000 frames/s or faster. In addition to integrated algorithms that accelerate sensor information processing, we are studying new sensing methodologies based on vibration and flow dynamics, which are too fast for humans to perceive.
Multicopter Tracking Using High-speed Vision
We propose a high-frame-rate (HFR) vision-based surveillance solution for real-time multicopter detection and tracking. The proposed vibration source localization algorithm relies on vibration-based image features, which are obtained by applying pixel-level band-pass digital filters to detect periodic changes in image intensity at pixels around the multicopter's rotating propellers, which vibrate at audio-level frequencies (several hundred hertz). Unlike conventional appearance-based object tracking with spatial pattern recognition, our algorithm is robust to various appearance changes, such as brightness, defocus blur, apparent scale, pose variation, complicated backgrounds, partial occlusion, and motion blur, and is therefore applicable in real-world scenarios.
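The following is a minimal sketch, not the authors' implementation, of how pixel-level band-pass filtering over an HFR grayscale sequence can yield a vibration-based feature map and a vibration source location. The frame rate, pass band, filter order, and threshold are illustrative assumptions.

```python
# Sketch: pixel-level band-pass filtering of a 1000 fps grayscale sequence to
# highlight pixels around a propeller vibrating at several hundred hertz.
# All numeric constants below are illustrative assumptions.
import numpy as np
from scipy.signal import butter, lfilter

FPS = 1000.0                   # assumed camera frame rate [frames/s]
LOW_HZ, HIGH_HZ = 80.0, 250.0  # assumed propeller vibration pass band [Hz]

def vibration_feature_map(frames: np.ndarray) -> np.ndarray:
    """frames: (T, H, W) grayscale sequence -> (H, W) vibration amplitude map."""
    # Second-order Butterworth band-pass, applied independently to the
    # intensity time series of every pixel (axis 0 is time).
    b, a = butter(2, [LOW_HZ, HIGH_HZ], btype="bandpass", fs=FPS)
    filtered = lfilter(b, a, frames.astype(np.float32), axis=0)
    # Accumulate the rectified band-pass response: pixels near the vibrating
    # propellers respond strongly, while static background pixels stay near zero.
    return np.abs(filtered).mean(axis=0)

def localize_vibration_source(frames: np.ndarray, rel_threshold: float = 0.5):
    """Return the (row, col) centroid of pixels whose vibration amplitude
    exceeds rel_threshold * max amplitude, or None if nothing vibrates."""
    amp = vibration_feature_map(frames)
    mask = amp > rel_threshold * amp.max()
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    weights = amp[rows, cols]
    return (np.average(rows, weights=weights), np.average(cols, weights=weights))
```

Because the feature is a temporal signature rather than a spatial pattern, the map responds to the propellers' periodic intensity changes even when the multicopter's appearance is blurred or partially occluded.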
The algorithm was implemented on a high-speed vision platform with a 2-DOF active vision system, which is controlled by visual feedback at 1000 fps so that the target multicopter is always kept at the center of the camera view. The proposed HFR-vision-based method with pixel-level digital filters enables vibration source localization with sub-degree-level angular directivity in real time. A simple control sketch is given below.
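The sketch below, under assumed parameters, shows one way such a visual-feedback loop could steer a 2-DOF pan/tilt head toward the localized vibration source; the gains, image size, focal length, and the `camera.grab` / `actuator.send_pan_tilt_rates` calls are hypothetical placeholders, not the platform's actual API.

```python
# Sketch of a proportional pan/tilt visual-feedback loop driven by the
# vibration source localization above. Constants and driver calls are
# illustrative assumptions, not the actual hardware interface.
import numpy as np

WIDTH, HEIGHT = 640, 480   # assumed image size [pixels]
FOCAL_PX = 800.0           # assumed focal length [pixels]
KP = 5.0                   # assumed proportional gain [1/s]

def pan_tilt_command(target_px):
    """Map the localized source (row, col) to pan/tilt rate commands that
    drive the target toward the image center."""
    row, col = target_px
    # Pixel error from the image center, converted to angular error [rad].
    pan_err = np.arctan2(col - WIDTH / 2.0, FOCAL_PX)
    tilt_err = np.arctan2(row - HEIGHT / 2.0, FOCAL_PX)
    return KP * pan_err, KP * tilt_err

def control_step(camera, actuator, frames_per_window=32):
    """One cycle: capture a short HFR window, localize, steer the 2-DOF head."""
    window = np.stack([camera.grab() for _ in range(frames_per_window)])  # hypothetical capture call
    target = localize_vibration_source(window)   # from the sketch above
    if target is not None:
        pan_rate, tilt_rate = pan_tilt_command(target)
        actuator.send_pan_tilt_rates(pan_rate, tilt_rate)  # hypothetical driver call
```

A proportional law on the angular error is the simplest choice here; the high feedback rate is what keeps the fast-moving target centered despite the low per-frame correction.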
AVI movie (14.7 MB): Multicopter tracking demo
Reference
- Mingjun Jiang, Tadayoshi Aoyama, Takeshi Takaki, and Idaku Ishii: Pixel-Level and Robust Vibration Source Sensing in High-Frame-Rate Video Analysis, Sensors, Vol.16, No.11, 1842, 2016.
- Mingjun Jiang, Tadayoshi Aoyama, Takeshi Takaki, and Idaku Ishii: Vibration Source Localization for Motion-blurred High-frame-rate Videos, Proc. IEEE Int. Conf. on Robotics and Biomimetics, pp.774-779, 2016.