Preprint Article, Version 1 (this version is not peer-reviewed)

The Impact of Adversarial Attacks on a Computer Vision Model's Perception of Images

Version 1 : Received: 1 August 2024 / Approved: 2 August 2024 / Online: 5 August 2024 (07:14:51 CEST)

How to cite: Bolozovskii, R.; ---, A. L. The Impact of Adversarial Attacks on a Computer Vision Model's Perception of Images. Preprints 2024, 2024080204. https://doi.org/10.20944/preprints202408.0204.v1

Abstract

Image clustering and classification are fundamental tasks in computer vision, critical for applications such as image retrieval and object recognition. However, the robustness of clustering algorithms against adversarial attacks remains an understudied topic. In this paper, we investigate how adversarial attacks on image classification models affect image clustering, similarity search based on the dot product and the KNN and HNSW algorithms, and Gradient-Weighted Class Activation Mapping (Grad-CAM). We propose a targeted study of the impact of adversarial attacks on the clustering ability of ResNet50 under various adversarial scenarios. ResNet50, a widely used architecture known for its effectiveness in image classification, serves as the basis for our experiments; the network was subjected to a range of adversarial attacks in order to understand how these perturbations affect its clustering capabilities. By thoroughly examining the resulting clustering outcomes under different attack scenarios, we aim to uncover the vulnerabilities and nuances inherent in clustering algorithms and similarity metrics when confronted with adversarial input.
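To make the pipeline concrete, below is a minimal sketch (not the authors' code) of one step of such an experiment: perturb an image with a single-step FGSM attack and compare ResNet50's penultimate-layer embeddings, via raw dot product and cosine similarity, before and after the attack. The choice of FGSM, torchvision's pretrained weights, and the epsilon value are illustrative assumptions; the paper evaluates a broader set of attacks and similarity methods.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet50, ResNet50_Weights

# Pretrained ResNet50 plus its matching preprocessing pipeline.
weights = ResNet50_Weights.IMAGENET1K_V2
model = resnet50(weights=weights).eval()
preprocess = weights.transforms()

# Feature extractor: everything up to and including the global average pool,
# i.e. the 2048-dim embedding that clustering / similarity search would use.
backbone = torch.nn.Sequential(*list(model.children())[:-1])

@torch.no_grad()
def embed(x: torch.Tensor) -> torch.Tensor:
    return backbone(x).flatten(1)

def fgsm(x: torch.Tensor, y: torch.Tensor, eps: float = 0.03) -> torch.Tensor:
    """Single-step FGSM: nudge x along the sign of the loss gradient.
    For simplicity, eps is applied in the normalized input space."""
    x = x.clone().requires_grad_(True)
    F.cross_entropy(model(x), y).backward()
    return (x + eps * x.grad.sign()).detach()

# A random stand-in image; in practice this would be a real photo.
x = preprocess(torch.rand(3, 256, 256)).unsqueeze(0)
y = model(x).argmax(dim=1)      # use the model's own prediction as the label
x_adv = fgsm(x, y)

e_clean, e_adv = embed(x), embed(x_adv)
dot = (e_clean * e_adv).sum().item()              # raw dot-product similarity
cos = F.cosine_similarity(e_clean, e_adv).item()  # scale-invariant variant
print(f"dot product: {dot:.2f}   cosine: {cos:.4f}")
```

Under a successful attack, these similarity scores degrade even though the perturbation is imperceptible to a human, which is the kind of effect the clustering experiments probe.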

Keywords

adversarial attacks; computer vision; information security; ResNet50; image clustering; KNN; HNSW

Subject

Computer Science and Mathematics, Security Systems
