Scientists Propose New Real-Time Running Person Detection System

Date: 09-05-2020

Moving-object detection under dynamic backgrounds is of great significance in many applications that permeate daily life, such as autonomous driving and smart homes.

Much attention has been paid to both the speed and the accuracy of detection across applications. Balancing this speed-accuracy trade-off is already challenging, and detecting moving targets from an unmanned aerial vehicle (UAV) is harder still because of varying viewing angles, camera motion and real-time requirements.

A research team led by Prof. ZHOU Yimin from the Shenzhen Institutes of Advanced Technology (SIAT) of the Chinese Academy of Sciences has proposed a fast running-person detection system for UAVs based on optical flow and deep convolutional networks, which accurately recognizes running people against varied backgrounds.

This study was published in IET Intelligent Transport Systems on April 30.  

First, a series of pre-processing steps was performed to extract an effective region of interest (ROI), including image greying and image blurring. The blurring filtered the original frame with a high-pass filter in the time domain and a low-pass filter in the spatial domain at multiple scales.
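
The greying and blurring steps can be sketched in plain numpy as follows. This is a minimal illustration, not the paper's implementation: the function name is hypothetical, and the multi-scale temporal/spatial filtering is simplified to a single spatial low-pass (box) filter.

```python
import numpy as np

def preprocess(frame_rgb, k=3):
    """Grey an RGB frame, then blur it with a k x k box filter.

    Illustrative stand-in for the pre-processing step; the described
    system uses multi-scale high-pass (temporal) and low-pass (spatial)
    filtering, which a single box filter only approximates.
    """
    # Luminance-weighted greyscale conversion (ITU-R BT.601 weights).
    grey = frame_rgb @ np.array([0.299, 0.587, 0.114])
    # Average over a k x k neighbourhood (spatial low-pass).
    pad = k // 2
    padded = np.pad(grey, pad, mode="edge")
    h, w = grey.shape
    out = np.zeros_like(grey)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)
```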

To locate candidate targets quickly, the optical flow, which encodes the motion information, was computed between every two successive frames.
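
The idea of recovering motion from two successive frames can be illustrated with the classic Lucas-Kanade least-squares step. This sketch solves a single global flow vector from the image gradients; the actual system computes a dense per-pixel flow field, and the function name and synthetic setup below are assumptions.

```python
import numpy as np

def lk_flow(f0, f1):
    """Estimate one (u, v) translation between two greyscale frames
    via the Lucas-Kanade least-squares step over the whole frame.

    Solves [Ix Iy] [u v]^T = -It in the least-squares sense, where
    Ix, Iy are spatial gradients and It is the temporal difference.
    """
    # Central-difference spatial gradients on the first frame.
    Ix = np.gradient(f0, axis=1)
    Iy = np.gradient(f0, axis=0)
    It = f1 - f0
    # Normal equations of the over-determined linear system.
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    u, v = np.linalg.solve(A, b)
    return u, v
```

On a synthetic Gaussian blob shifted one pixel to the right, the recovered flow is close to (1, 0).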

Afterwards, a series of further operations, including spatial average filtering, morphological expansion (dilation) and outer-contour extraction, was performed to extract the salient regions of the moving targets (i.e., running and walking persons).

To avoid extraction failures, the researchers further proposed a morphology-based leak-filling algorithm.
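
The exact leak-filling algorithm is not detailed in this summary. A plausible morphology-based sketch is a binary closing (dilation followed by erosion), which seals small holes and gaps in the motion mask while preserving the overall blob shape; all function names below are illustrative.

```python
import numpy as np

def dilate(mask, r=1):
    """Binary dilation with a (2r+1) x (2r+1) square structuring element."""
    h, w = mask.shape
    padded = np.pad(mask, r, mode="constant", constant_values=False)
    out = np.zeros_like(mask)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out |= padded[dy:dy + h, dx:dx + w]
    return out

def erode(mask, r=1):
    """Binary erosion: a pixel survives only if its whole neighbourhood
    is foreground (pixels outside the image are treated as foreground)."""
    h, w = mask.shape
    padded = np.pad(mask, r, mode="constant", constant_values=True)
    out = np.ones_like(mask)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out &= padded[dy:dy + h, dx:dx + w]
    return out

def fill_leaks(mask, r=1):
    """Morphological closing (dilate, then erode) as a plausible
    reading of the leak-filling idea."""
    return erode(dilate(mask, r), r)
```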

Mean filtering was applied to blur the motion information, and the result was then passed through morphological dilation for accurate bounding-box prediction.
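
A minimal numpy sketch of the blur-then-box step: mean-filter a motion-magnitude map, binarise it, and take the bounding box of the remaining foreground. The fixed threshold stands in for the adaptive one the system actually uses, and the dilation step is omitted here for brevity; the function name and parameters are assumptions.

```python
import numpy as np

def motion_bbox(mag, thresh=0.5, k=3):
    """Mean-filter a motion-magnitude map, binarise it, and return the
    axis-aligned bounding box (x_min, y_min, x_max, y_max) of the
    foreground, or None if nothing survives the threshold.
    """
    h, w = mag.shape
    pad = k // 2
    padded = np.pad(mag, pad, mode="edge")
    # k x k mean filter (blur the motion information).
    blurred = np.zeros_like(mag)
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= k * k
    # Binarise and box the surviving pixels.
    ys, xs = np.nonzero(blurred > thresh)
    if len(xs) == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```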

In addition, an adaptive threshold based on the optical flow was adopted as the segmentation criterion to suppress the adverse effects of UAV motion and jitter.
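
The paper's exact adaptive rule is not given in this summary; a common stand-in for flow-based segmentation is a mean-plus-k-sigma threshold on the flow magnitude, which rises automatically when global camera motion inflates the whole flow field.

```python
import numpy as np

def adaptive_mask(mag, k=1.0):
    """Segment a flow-magnitude map with a data-driven threshold,
    mean + k * std. A uniform shift of the whole field (e.g. camera
    jitter) shifts the mean, and hence the threshold, by the same
    amount, so only relative motion survives.

    Hypothetical sketch: the described system's adaptive rule may differ.
    """
    t = mag.mean() + k * mag.std()
    return mag > t
```

Adding a constant offset to the whole magnitude map, as uniform camera jitter would, shifts both the data and the threshold equally, so the segmentation result is unchanged.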

Field experiments and benchmark tests demonstrated that the proposed system detects running people at 15 frames per second (fps) with 81.1% accuracy in complex UAV scenes.

In future research, the team will investigate combining different modules, such as a Kalman filter for tracking moving targets, to achieve high-level semantic segmentation or video-based action classification for abnormal-event detection.


Media Contact:
ZHANG Xiaomin