Preprint
Article

This version is not peer-reviewed.

Human-Vehicle Leader-Follower Control using Deep Learning-Driven Gesture Recognition

Submitted: 02 February 2022

Posted: 04 February 2022


Abstract
Leader-follower autonomy (LFA) systems have so far focused on vehicles following other vehicles. Despite several decades of research on vehicle-following, there has been no prior work on human-vehicle leader-follower systems. We present a system in which an autonomous vehicle, our ACTor 1 platform, follows a human leader who controls the vehicle through gestures. We developed a modular pipeline that uses deep learning to recognize hand and body gestures from a user in view of the vehicle's camera and translates those gestures into physical action by the vehicle. We demonstrate our work on ACTor 1, a modified Polaris Gem 2 electric vehicle. This work has numerous applications, such as material transport in industrial settings.
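The final stage of the pipeline described in the abstract, translating a recognized gesture into a vehicle action, could be sketched as follows. This is a minimal illustration, not the authors' implementation: the gesture labels, command values, and the confidence gate (defaulting to a safe stop on uncertain detections) are all assumptions for the sake of example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DriveCommand:
    """A simplified drive command: forward speed (m/s) and steering angle (rad)."""
    speed: float
    steering: float

# Hypothetical gesture-to-command table; the actual gesture vocabulary
# and command values are not specified in the abstract.
GESTURE_COMMANDS = {
    "follow":     DriveCommand(speed=1.0, steering=0.0),
    "stop":       DriveCommand(speed=0.0, steering=0.0),
    "turn_left":  DriveCommand(speed=0.5, steering=0.3),
    "turn_right": DriveCommand(speed=0.5, steering=-0.3),
}

def gesture_to_command(label: str, confidence: float,
                       threshold: float = 0.8) -> DriveCommand:
    """Map a classifier's (label, confidence) output to a drive command.

    Unknown gestures or low-confidence detections fall back to a full
    stop, a sensible safety default when a vehicle follows a human.
    """
    if confidence < threshold or label not in GESTURE_COMMANDS:
        return GESTURE_COMMANDS["stop"]
    return GESTURE_COMMANDS[label]
```

In a ROS-based platform such as ACTor, a command like this would typically be published as a velocity message on a drive topic; that wiring is omitted here to keep the sketch self-contained.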
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.