Applications of Cooperative Imaging System on Working Pattern Analyses in Agricultural Environment
Date Issued
2016
Author(s)
Huang, Yang-Wen
Abstract
Surveillance systems ease the burden of long-term observation. We design a cooperative surveillance system that helps managers and owners better understand their agricultural environments by providing working patterns and other information gathered over long-term observation. This research continues and improves the predecessor's system structure. The old system used a combination of a panorama camera set and a Pan-Tilt-Zoom (PTZ) camera, which had the advantage of monitoring both an object's surroundings and the object itself in high resolution at the same time. Because of hardware performance limits and the low image quality of that camera set, the new system uses an ultra-wide field-of-view (FOV) camera and a PTZ camera. Ultra-wide FOV images from the static camera, covering up to 135 degrees, capture most possible events, and the PTZ images supplement the low-resolution ultra-wide FOV images with detail. In software, we changed the architecture from a master-slave system to a cooperative surveillance system: the relationship between the cameras changed from unequal to equal. The Network Control Center lets the cameras communicate with each other instead of working independently. To detect objects in the camera streams, we use a multi-resolution Gaussian Mixture Model, static object detection, and a dynamic Euclidean distance so that the system adapts to video sequences from different environments. Several changes make the system more "cooperative". First, the system structure was changed to parallel thread processing. We also apply frame differencing and centroid estimation to the PTZ images so that the PTZ camera can perform self-tracking, increasing the accuracy of capturing a tracked object. All information from the cameras is sent to the Network Control Center to analyze the working patterns of a place. 
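The PTZ self-tracking step described above (frame differencing followed by centroid estimation) can be sketched as follows. This is a minimal NumPy illustration under assumed values, not the thesis implementation; the threshold and frame sizes are chosen for the example only.

```python
import numpy as np

def track_centroid(prev_frame, curr_frame, threshold=25):
    """Frame differencing + centroid estimation.

    Returns the (row, col) centroid of pixels whose intensity changed
    by more than `threshold` between two grayscale frames, or None if
    no motion was detected. A PTZ controller could steer toward this
    centroid to keep the moving object near the image center.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    mask = diff > threshold                 # binary motion mask
    ys, xs = np.nonzero(mask)               # coordinates of changed pixels
    if ys.size == 0:
        return None                         # no motion detected
    return float(ys.mean()), float(xs.mean())

# Toy example: a bright 3x3 blob appears in an otherwise static frame.
prev = np.zeros((100, 100), dtype=np.uint8)
curr = prev.copy()
curr[40:43, 60:63] = 200                    # new bright object
centroid = track_centroid(prev, curr)
# centroid is (41.0, 61.0), the center of the 3x3 changed region
```

In a real system the offset between this centroid and the image center would be converted into pan/tilt commands; that conversion is camera-specific and is omitted here.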
We use a mixed feature-extraction method consisting of Gabor filters and Bag of Words to process the images of detected objects; the extracted features are trained and predicted with an SVM. So that the system can fit different environments, a Custom Define Rules system lets users create their own working patterns from existing features and trajectory information, and an Object View Manager lets users look up the details of any individual detected object. We designed five experiments to validate the system. The first, in the NTU farm, verified our object-tracking methods with the new camera. The second, in the Zhi Chen farm, enhanced and redesigned the Custom Define Rules system. The third, in the 3rd vegetable packaging factory, verified that the previous changes adapt to different environments. In the fourth, outside the Tomatake Hall, we improved the PTZ self-tracking algorithm. Finally, in the fifth experiment, in the plaza in front of the Dept. of BIME, we again verified the above changes and made final improvements to the system and data structure. The cooperative system secures the maximum information during any event, allowing us to judge more precisely.
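The Bag-of-Words step can be illustrated with a small sketch: local descriptors (in the thesis, Gabor-filter responses) are quantized against a learned codebook, and each object image becomes a fixed-length, normalized histogram of codeword counts that an SVM can then classify. The codebook and descriptors below are toy values assumed for illustration, not the thesis data.

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """Assign each descriptor to its nearest codeword (Euclidean
    distance) and return an L1-normalized histogram over the codebook.
    """
    # Pairwise distances, shape (n_descriptors, n_codewords)
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = dists.argmin(axis=1)            # nearest codeword per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

# Toy codebook of 3 "visual words" in a 2-D descriptor space
# (a real codebook would come from k-means over training descriptors).
codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
# Four descriptors from one object image (e.g., Gabor responses).
descriptors = np.array([[0.1, 0.1], [0.9, 0.1], [0.1, 0.9], [0.0, 0.8]])
hist = bow_histogram(descriptors, codebook)
# hist = [0.25, 0.25, 0.5]; this fixed-length vector is the SVM input
```

Because every image yields a histogram of the same length regardless of how many descriptors it produced, these vectors are directly usable as training and prediction inputs for an SVM classifier.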
Subjects
Cooperative Surveillance System
Pattern Analyses
Object Tracking
Custom Define Rules System
Agricultural Environment
Type
thesis
File(s)
Name
ntu-105-R03631012-1.pdf
Size
23.32 KB
Format
Adobe PDF
Checksum
(MD5):93c7f372e76d400e1a367a3a02c39d63