Preprint Article, Version 1 (this version is not peer-reviewed)

Multi-View 3D Reconstruction Based on FEWO-MVSNet

Version 1: Received: 30 October 2024 / Approved: 31 October 2024 / Online: 31 October 2024 (15:04:34 CET)

How to cite: Yao, G.; Wang, Z.; Wei, G.; Zhu, F.; Fu, Q.; Yu, Q.; Wei, M. Multi-View 3D Reconstruction Based on FEWO-MVSNet. Preprints 2024, 2024102566. https://doi.org/10.20944/preprints202410.2566.v1

Abstract

To address the limited adaptability of existing multi-view stereo reconstruction methods to repetitive patterns and weak textures in multi-view images, this paper proposes a three-dimensional (3D) reconstruction algorithm based on feature enhancement and weight optimization MVSNet (abbreviated as FEWO-MVSNet). First, to obtain accurate and detailed global and local features, we develop an adaptive feature enhancement approach that extracts multi-scale information from the images. Second, we introduce an attention mechanism and a spatial feature capture module to enable highly sensitive detection of weak-texture features. Third, a 3D convolutional neural network predicts a fine depth map for the multi-view images, from which the complete 3D model is reconstructed. Finally, we evaluate the proposed FEWO-MVSNet through training and testing on the DTU, BlendedMVS, and Tanks & Temples datasets. The results demonstrate clear advantages of our method for 3D reconstruction from multi-view images: it ranks first in accuracy and second in completeness among the representative methods compared.
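To make the described pipeline concrete, the following is a minimal PyTorch-style sketch of its two main stages: multi-scale feature enhancement with a spatial attention gate, and 3D-CNN cost-volume regularization into an expected depth map. All module and parameter names (FeatureEnhancer, DepthRegularizer, channel sizes, etc.) are hypothetical illustrations chosen for this sketch, not the authors' implementation of FEWO-MVSNet.

```python
# Illustrative sketch only; module names and hyperparameters are assumptions,
# not the released FEWO-MVSNet code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureEnhancer(nn.Module):
    """Multi-scale feature extraction with a lightweight spatial attention gate."""
    def __init__(self, in_ch=3, out_ch=32):
        super().__init__()
        self.fine = nn.Conv2d(in_ch, out_ch, 3, padding=1)              # local detail
        self.coarse = nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1)  # global context
        self.attn = nn.Conv2d(out_ch, 1, 1)                             # spatial attention map

    def forward(self, img):
        fine = F.relu(self.fine(img))
        coarse = F.interpolate(F.relu(self.coarse(img)), size=fine.shape[-2:],
                               mode="bilinear", align_corners=False)
        feat = fine + coarse                        # fuse local and global information
        weight = torch.sigmoid(self.attn(feat))     # emphasize weak-texture regions
        return feat * weight

class DepthRegularizer(nn.Module):
    """3D CNN that turns a plane-sweep cost volume into an expected depth map."""
    def __init__(self, feat_ch=32):
        super().__init__()
        self.conv3d = nn.Sequential(
            nn.Conv3d(feat_ch, 8, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv3d(8, 1, 3, padding=1),
        )

    def forward(self, cost_volume, depth_values):
        # cost_volume: [B, C, D, H, W]; depth_values: [B, D] depth hypotheses
        prob = F.softmax(self.conv3d(cost_volume).squeeze(1), dim=1)      # [B, D, H, W]
        depth = torch.sum(prob * depth_values[:, :, None, None], dim=1)   # expectation over hypotheses
        return depth
```

In an MVSNet-style pipeline, the enhanced features of the reference and source views would be warped onto the depth hypotheses to build the cost volume that DepthRegularizer consumes; that warping step is omitted here for brevity.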

Keywords

multi-view; MVSNet; Transformer; depth estimation; 3D reconstruction

Subject

Computer Science and Mathematics, Computer Vision and Graphics

