Preprint
Article

Identifying Human Behaviours Using Deep Trajectory Descriptors


This version is not peer-reviewed

Submitted: 24 May 2019
Posted: 29 May 2019

Abstract
Identifying human actions in complex scenes is widely considered a challenging research problem due to unpredictable behaviours and variations in appearance and posture. Trajectories provide a meaningful way to extract these variations in motion and posture. However, simple trajectories are normally represented by a vector of spatial coordinates; to identify human actions, we must exploit the structural relationships between different trajectories. In this paper, we propose a method that divides a video into N segments and extracts trajectories for each segment. We then compute a trajectory descriptor for each segment that captures the structural relationships among the trajectories in that segment. To build this descriptor, we project all extracted trajectories onto a canvas, producing a texture image that encodes the relative motion and structural relationships among the trajectories. We then train a Convolutional Neural Network (CNN) to learn a representation from these dense trajectories. Experimental results show that our proposed method outperforms state-of-the-art methods, achieving 90.01% on a benchmark dataset.
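The sketch below illustrates the pipeline described in the abstract for a single video segment: trajectories are projected onto a blank canvas to form a texture image, which is then fed to a CNN. It is a minimal, hypothetical sketch in PyTorch; the names (rasterize_trajectories, SegmentCNN), canvas size, network layout, and the assumption of normalized trajectory coordinates are illustrative choices, not the authors' implementation, and how the N per-segment outputs are combined is not covered here.

import numpy as np
import torch
import torch.nn as nn

def rasterize_trajectories(trajectories, canvas_size=(224, 224)):
    """Project trajectories (each an (T, 2) array of x/y points in [0, 1])
    onto a single-channel canvas by drawing the segments between consecutive points."""
    h, w = canvas_size
    canvas = np.zeros((h, w), dtype=np.float32)
    for traj in trajectories:
        pts = np.asarray(traj, dtype=np.float32)
        for (x0, y0), (x1, y1) in zip(pts[:-1], pts[1:]):
            # Sample points densely along the segment and mark them on the canvas.
            n = max(int(np.hypot(x1 - x0, y1 - y0) * max(h, w)), 1)
            xs = np.linspace(x0, x1, n) * (w - 1)
            ys = np.linspace(y0, y1, n) * (h - 1)
            canvas[ys.astype(int), xs.astype(int)] = 1.0
    return canvas

class SegmentCNN(nn.Module):
    """Toy CNN mapping one trajectory-texture image to action-class scores."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(7),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Usage: two toy trajectories -> texture image -> class scores for one segment.
trajs = [np.column_stack([np.linspace(0.1, 0.9, 50), np.linspace(0.2, 0.8, 50)]),
         np.column_stack([np.full(50, 0.5), np.linspace(0.1, 0.9, 50)])]
image = rasterize_trajectories(trajs)
scores = SegmentCNN()(torch.from_numpy(image)[None, None])  # shape: (1, num_classes)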
Keywords: 
Subject: Computer Science and Mathematics - Artificial Intelligence and Machine Learning
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.