Article
This version is not peer-reviewed
YOLOv2 for Pigs Detection in Industrial Farming
Version 1: Received: 3 September 2020 / Approved: 4 September 2020 / Online: 4 September 2020 (07:59:03 CEST)
How to cite: Khan, A. Q.; Khan, S. YOLOv2 for Pigs Detection in Industrial Farming. Preprints 2020, 2020090088. https://doi.org/10.20944/preprints202009.0088.v1
Abstract
Generic object detection is one of the most important and flourishing branches of computer vision, with real-life applications in our day-to-day lives. With the rapid development of deep learning-based techniques for object detection, performance has improved considerably over the last two decades. However, due to the data-hungry nature of deep models, they do not perform well on tasks for which only a very limited labeled dataset is available. To address this problem, we propose a transfer learning-based deep learning approach for detecting multiple pigs in an indoor farm setting. The approach is based on YOLOv2, whose pretrained parameters are used as the starting values for training the network. Compared to the original YOLOv2, we transformed the detector to detect only one class of objects, i.e., pigs against the background. For training the network, farm-specific data are annotated with bounding boxes enclosing pigs in the top view. Experiments are performed on different pen configurations in the farm, and convincing results are achieved using only a few hundred annotated frames for fine-tuning the network.
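To illustrate the single-class adaptation described above, the sketch below shows how a YOLOv2-style prediction head can be reduced to one class. This is a minimal PyTorch sketch, not the authors' code: the layer sizes (a 1024-channel Darknet-19 feature map on a 13x13 grid, five anchor boxes) follow the standard YOLOv2 configuration, and the class name SingleClassYoloHead is a hypothetical placeholder.

    # Minimal sketch (not the authors' code) of adapting a YOLOv2-style
    # detection head to a single "pig" class. In YOLOv2 the final 1x1 conv
    # predicts, for each of A anchors per grid cell, 4 box offsets,
    # 1 objectness score, and C class scores: A * (5 + C) output channels.
    # With A = 5 anchors and C = 1 class, that is 5 * (5 + 1) = 30 channels.
    import torch
    import torch.nn as nn

    NUM_ANCHORS = 5   # YOLOv2's default anchor count (assumption)
    NUM_CLASSES = 1   # pigs vs. background

    class SingleClassYoloHead(nn.Module):
        """Replaces the multi-class head of a pretrained YOLOv2 backbone."""
        def __init__(self, in_channels=1024):
            super().__init__()
            out_channels = NUM_ANCHORS * (5 + NUM_CLASSES)  # 30
            self.pred = nn.Conv2d(in_channels, out_channels, kernel_size=1)

        def forward(self, features):
            # features: (N, 1024, 13, 13) from the Darknet-19 backbone
            return self.pred(features)

    head = SingleClassYoloHead()
    dummy_features = torch.randn(1, 1024, 13, 13)
    print(head(dummy_features).shape)  # torch.Size([1, 30, 13, 13])

Under this transfer-learning setup, the pretrained backbone weights serve as initialization while the new head is trained on the annotated pig frames.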
Keywords
YOLOv2; transfer learning; pig farming; object detection
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.