With the development and maturation of object detection techniques, more and more researchers are applying deep learning methods to detect and classify objects in the image data obtained by dynamic vision sensors (DVS). Because the event stream produced by a DVS carries no grayscale feature information, we convert it into frame images and train the YOLOv3 neural network model on them to achieve object detection. Since the IoU loss function in the original YOLOv3 network cannot represent the distance between the predicted box and the ground truth, this paper improves YOLOv3 by employing the GIoU loss function to achieve more accurate detection. Experiments on a self-collected dataset show that the GIoU-improved YOLOv3 network performs well and can accurately recognize and classify human actions.
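As a rough illustration of the GIoU loss mentioned above (a sketch only, not the paper's implementation; the function name and the (x1, y1, x2, y2) box format are assumptions), the replacement for the plain IoU term can be computed as follows in Python:

    def giou_loss(box_p, box_g):
        # Illustrative GIoU loss for axis-aligned boxes given as (x1, y1, x2, y2).
        # Intersection area of the predicted box and the ground-truth box
        ix1, iy1 = max(box_p[0], box_g[0]), max(box_p[1], box_g[1])
        ix2, iy2 = min(box_p[2], box_g[2]), min(box_p[3], box_g[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

        # Union area
        area_p = (box_p[2] - box_p[0]) * (box_p[3] - box_p[1])
        area_g = (box_g[2] - box_g[0]) * (box_g[3] - box_g[1])
        union = area_p + area_g - inter
        iou = inter / union

        # Smallest enclosing box of the two boxes
        cx1, cy1 = min(box_p[0], box_g[0]), min(box_p[1], box_g[1])
        cx2, cy2 = max(box_p[2], box_g[2]), max(box_p[3], box_g[3])
        area_c = (cx2 - cx1) * (cy2 - cy1)

        giou = iou - (area_c - union) / area_c
        return 1.0 - giou  # 0 when the boxes coincide, approaching 2 when disjoint and far apart

Unlike the plain IoU loss, the enclosing-box penalty keeps the loss informative even when the predicted box and the ground truth do not overlap at all.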
Address-event-based Dynamic Vision Sensors (DVS) and Convolutional Neural Networks (CNN) have been widely researched in recent years. However, the data collected by a DVS are easily corrupted by noise, which makes it difficult to identify the target during classification. To solve this misclassification problem, a novel improved CNN (NI-CNN) technique is proposed in this paper. First, an appropriate number of event pulses is chosen and mapped to the frame domain, and an optimized denoising approach is applied to the whole classification system. Second, intra-class spacing is reduced and inter-class divergence is enlarged by a joint loss function with adjusted regularization parameters. Numerical comparisons between the proposed approach and several state-of-the-art solvers on accessible databases demonstrate its efficiency and effectiveness.
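As a hedged sketch of the event-to-frame mapping step described above (the event count, polarity handling, and normalization below are illustrative assumptions, not the paper's settings), a fixed number of DVS events can be accumulated into a grayscale-like frame roughly as follows:

    import numpy as np

    def events_to_frame(events, height, width, num_events=5000):
        # Accumulate a fixed number of DVS events (x, y, polarity, timestamp) into a 2D frame.
        frame = np.zeros((height, width), dtype=np.float32)
        for x, y, polarity, _t in events[:num_events]:
            frame[y, x] += 1.0 if polarity > 0 else -1.0
        # Rescale to [0, 255] so the frame can be fed to a CNN like an ordinary grayscale image
        frame -= frame.min()
        if frame.max() > 0:
            frame = frame / frame.max() * 255.0
        return frame.astype(np.uint8)

The resulting frames can then be denoised and passed to the classification network in place of conventional camera images.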