Preprint Article, Version 1. This version is not peer-reviewed.

Simultaneous Stereo Matching and Confidence Estimation Network

Version 1: Received: 2 July 2024 / Approved: 3 July 2024 / Online: 3 July 2024 (09:49:02 CEST)

How to cite: Schmähling, T.; Müller, T.; Eberhardt, J.; Elser, S. Simultaneous Stereo Matching and Confidence Estimation Network. Preprints 2024, 2024070313. https://doi.org/10.20944/preprints202407.0313.v1

Abstract

In this paper, we present a multi-task model that simultaneously predicts disparities and confidence levels in deep stereo matching. We do this by combining a successful model for each separate task into a single multi-task model that can be trained with a proposed loss function. We show the advantages of this model over training and predicting disparity and confidence sequentially: training in parallel rather than sequentially improves the area under the curve (AUC) metric by 15% to 30%. In addition, we investigate how weighting the components of the loss function affects stereo and confidence performance. By improving the confidence estimate, the practicality of stereo estimators for creating distance images is increased.
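To make the idea of the combined loss concrete, the following is a minimal sketch in PyTorch of a weighted multi-task loss over a disparity term and a confidence term. The abstract does not specify the exact loss components or the weighting scheme, so the smooth-L1 disparity term, the binary cross-entropy confidence term, the 3-pixel correctness threshold, and the weight alpha below are all illustrative assumptions, not the authors' method.

import torch
import torch.nn.functional as F

def multitask_loss(pred_disp, pred_conf, gt_disp, valid_mask,
                   alpha=0.5, conf_threshold=3.0):
    """Combine a disparity loss and a confidence loss with weight alpha.

    pred_disp:  (B, H, W) predicted disparities
    pred_conf:  (B, H, W) predicted confidences in [0, 1]
    gt_disp:    (B, H, W) ground-truth disparities
    valid_mask: (B, H, W) boolean mask of pixels with valid ground truth
    """
    # Disparity term: smooth L1 over valid pixels (an assumed choice).
    disp_loss = F.smooth_l1_loss(pred_disp[valid_mask], gt_disp[valid_mask])

    # Confidence target: treat a pixel as "correct" if its disparity error
    # is below a threshold (the common 3 px convention is assumed here).
    err = (pred_disp - gt_disp).abs()
    conf_target = (err < conf_threshold).float()
    conf_loss = F.binary_cross_entropy(pred_conf[valid_mask],
                                       conf_target[valid_mask])

    # Weighted sum of the two terms; the paper studies how this weighting
    # affects stereo and confidence performance.
    return alpha * disp_loss + (1.0 - alpha) * conf_loss

Varying alpha trades disparity accuracy against confidence quality, which is the kind of weighting effect the abstract says is investigated.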

Keywords

Stereo Vision; Confidence; Multi-task Learning; Uncertainty

Subject

Computer Science and Mathematics, Computer Vision and Graphics
