Preprint Article · Version 1 · This version is not peer-reviewed

Smooth Attention: Improving Image Semantic Segmentation

Version 1 : Received: 16 September 2024 / Approved: 17 September 2024 / Online: 17 September 2024 (06:14:20 CEST)

How to cite: Kriuk, B.; Kriuk, F.; Praveen, K. Smooth Attention: Improving Image Semantic Segmentation. Preprints 2024, 2024091283. https://doi.org/10.20944/preprints202409.1283.v1

Abstract

Attention mechanisms have become a fundamental component of deep learning, including in computer vision. The key idea behind attention in computer vision is to help the model focus on the relevant spatial regions of the input image rather than treating all regions equally. Traditional attention mechanisms in computer vision often suffer from distribution inconsistencies in their attention maps, producing sharp transitions that degrade the model's focus and lead to poor generalization on complex shapes. This spatial incoherence is particularly pronounced in semantic segmentation, where accurate pixel-level predictions require a detailed understanding of the spatial relationships within the image. In this paper, we propose an attention mechanism for convolutional neural networks, called Smooth Attention, that addresses spatial inconsistency in attention maps through multidimensional spatial smoothing. We conduct a series of experiments to evaluate the effectiveness of the proposed mechanism and demonstrate its superior performance compared to traditional methods.
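The abstract does not give Smooth Attention's exact formulation, but the core idea it describes, suppressing sharp transitions in a spatial attention map via smoothing, can be illustrated generically. The sketch below applies a separable Gaussian filter to a 2-D attention map; the kernel width `sigma` and the edge-padding strategy are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    # Normalized 1-D Gaussian kernel; used twice for a separable 2-D smooth.
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def smooth_attention_map(attn, sigma=1.0):
    """Smooth a 2-D attention map so that neighbouring spatial positions
    receive similar weights (illustrative stand-in for the paper's
    multidimensional spatial smoothing; details are assumptions)."""
    radius = max(1, int(3 * sigma))
    k = gaussian_kernel1d(sigma, radius)
    # Edge padding keeps the output the same size as the input.
    padded = np.pad(attn, radius, mode="edge")
    # Separable convolution: smooth along rows, then along columns.
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)

# A hard left/right step in the raw map becomes a gradual transition.
raw = np.zeros((8, 8))
raw[:, 4:] = 1.0
smoothed = smooth_attention_map(raw, sigma=1.0)
```

Because the kernel is normalized, smoothed values stay within the range of the original map, and the largest jump between adjacent positions is strictly reduced, which is the spatial-coherence property the abstract motivates.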

Keywords

Semantic Segmentation; Attention Mechanism; Spatial Relationships; Computer Vision; Deep Learning

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning


