Preprint Article, Version 1 (this version is not peer-reviewed)

A Smart Grasp Deep Network Based on One-Way Fusion Strategy

Version 1 : Received: 1 October 2024 / Approved: 1 October 2024 / Online: 2 October 2024 (08:03:50 CEST)

How to cite: Yang, Y.; Li, W.; Cang, X.; Cao, Z.; Bao, J. A Smart Grasp Deep Network Based on One-Way Fusion Strategy. Preprints 2024, 2024100076. https://doi.org/10.20944/preprints202410.0076.v1

Abstract

Robot grasp modeling and implementation is one of the essential abilities of a robot with embodied artificial intelligence. However, most existing deep learning-based grasp methods suffer from a large number of parameters and heavy computational overhead. To address this issue, by fully exploiting the complementary capabilities of CNNs and Transformers, we propose a smart grasp deep network with a one-way fusion strategy via a context path and a spatial path (SGNet), which enjoys a lightweight structure, fast inference speed, and easy deployment on devices with limited computation resources. Specifically, the context path employs lightweight depthwise separable convolution to achieve fast down-sampling, while a novel DSFormer module, built mainly by integrating Transformer blocks, extracts global and context-rich features. The spatial path efficiently fuses feature information from the context path in a one-way manner and generates high-resolution feature maps via pointwise convolution operations. Experimental results show that the proposed model, with only 1M parameters, achieves strong overall performance, reaching 99.4% accuracy on the Cornell dataset and 93.4% accuracy on the Jacquard dataset with an inference time within 12.5 ms.
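The building blocks named in the abstract (depthwise separable convolution for fast down-sampling, and a one-way fusion of context-path features into the spatial path via pointwise convolution) can be illustrated with a minimal PyTorch sketch. This is an assumption-based illustration only; the module names (DepthwiseSeparableConv, OneWayFusion), channel sizes, and layer choices below are hypothetical and do not reproduce the authors' SGNet or DSFormer code.

import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise 3x3 convolution followed by a pointwise (1x1) convolution."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

class OneWayFusion(nn.Module):
    """Fuse context-path features into the spatial path with a pointwise convolution."""
    def __init__(self, spatial_ch, context_ch, out_ch):
        super().__init__()
        self.proj = nn.Conv2d(spatial_ch + context_ch, out_ch, 1, bias=False)

    def forward(self, spatial_feat, context_feat):
        # Upsample low-resolution context features, then fuse them one-way
        # into the high-resolution spatial path via a 1x1 projection.
        context_feat = nn.functional.interpolate(
            context_feat, size=spatial_feat.shape[-2:],
            mode="bilinear", align_corners=False)
        return self.proj(torch.cat([spatial_feat, context_feat], dim=1))

# Example: fuse a 1/8-resolution context map into a 1/2-resolution spatial map.
spatial = torch.randn(1, 32, 112, 112)
context = torch.randn(1, 128, 28, 28)
fused = OneWayFusion(32, 128, 64)(spatial, context)
print(fused.shape)  # torch.Size([1, 64, 112, 112])

The depthwise separable block keeps parameter count and computation low, which is consistent with the lightweight design the abstract describes; the fusion flows only from the context path to the spatial path, matching the "one-way" strategy in name, though the actual SGNet fusion details may differ.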

Keywords

robot grasp; lightweight structure; deep network; fusion strategy; transformer

Subject

Engineering, Control and Systems Engineering
