Article
Version 1
This version is not peer-reviewed
One-Shot Learning from Prototype SKU Images
Received: 11 July 2024 / Approved: 11 July 2024 / Online: 11 July 2024 (12:37:30 CEST)
How to cite: Kowalczyk, A.; Sarwas, G. One-Shot Learning from Prototype SKU Images. Preprints 2024, 2024070979. https://doi.org/10.20944/preprints202407.0979.v1
Abstract
This paper highlights the importance of one-shot learning from prototype SKU images for efficient product recognition in retail and inventory management. Traditional methods require large supervised datasets to train deep neural networks, which can be costly and impractical. One-shot learning techniques mitigate this issue by enabling classification from a single prototype image per product class, thus reducing data annotation efforts. We introduce the variational prototyping-encoder (VPE), a novel deep neural network for one-shot classification. Utilizing a support set of prototype SKU images, VPE learns to classify query images by capturing image similarity and prototypical concepts. Unlike metric learning-based approaches, VPE pre-learns image translation from real-world object images to prototype images as a meta-task, facilitating efficient one-shot classification with minimal supervision. Our research demonstrates that VPE can significantly reduce the need for large datasets while accurately classifying query images into their respective categories, providing a practical solution for product classification tasks.
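At inference time, the one-shot scheme the abstract describes amounts to encoding the query image and every prototype SKU image with the trained encoder, then assigning the query to the class of the nearest prototype in latent space. The sketch below illustrates that classification step only; the `encode` function is a hypothetical placeholder for VPE's learned encoder (in the actual method, its latent space is shaped by training the network to translate real-world photos into their clean prototype images), and all names and toy data are assumptions for illustration.

```python
import numpy as np

# Hypothetical stand-in for VPE's trained encoder: maps an image to a
# latent vector. Here it simply flattens the image; in the real method
# this would be the encoder learned via the prototype-translation task.
def encode(image: np.ndarray) -> np.ndarray:
    return image.reshape(-1).astype(float)

def one_shot_classify(query: np.ndarray, prototypes: dict) -> str:
    """Assign the query to the class of the nearest prototype in latent space.

    prototypes: dict mapping class label -> single prototype SKU image.
    """
    q = encode(query)
    best_label, best_dist = None, np.inf
    for label, proto in prototypes.items():
        d = np.linalg.norm(q - encode(proto))  # Euclidean distance in latent space
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Toy usage: two 2x2 "prototype images" and a noisy query close to sku_b.
protos = {"sku_a": np.zeros((2, 2)), "sku_b": np.ones((2, 2))}
query = np.ones((2, 2)) * 0.9
print(one_shot_classify(query, protos))  # -> "sku_b"
```

Because each class needs only its single prototype image in the support set, adding a new SKU requires no retraining, only one more entry in the `prototypes` dictionary.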
Keywords
one-shot learning; autoencoders; prototyping
Subject
Computer Science and Mathematics, Computer Vision and Graphics
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.