Preprint Article, Version 1. This version is not peer-reviewed.

Collaborative Channel Perception of UAV Data Link Network Based on Data Fusion

Version 1 : Received: 8 August 2024 / Approved: 8 August 2024 / Online: 12 August 2024 (02:53:58 CEST)

How to cite: Zhao, Z.; Mao, Z.; Zhang, Z.; Pan, Y.; Xu, J. Collaborative Channel Perception of UAV Data Link Network Based on Data Fusion. Preprints 2024, 2024080657. https://doi.org/10.20944/preprints202408.0657.v1

Abstract

Existing collaborative channel perception methods suffer from unreasonable data-fusion weight allocation that does not match the channel perception capabilities of the node devices, often causing significant deviations between the perceived and actual channel state. To address this issue, this paper integrates the data-fusion algorithm of evidence fusion theory with data link channel state perception, applying the fusion strengths of evidence theory to evaluate the traffic-pulse statistical capability of network node devices. Specifically, the typical characteristic parameters describing each node's channel perception capability are treated as evidence parameter sets under the recognition framework. By calculating the credibility and falsity of these characteristic parameters, the differences and conflicts between nodes are measured, yielding a comprehensive evaluation of each node's traffic-pulse statistical capability. On this basis, the geometric mean method is used to compute channel state perception weights for each node within a single-hop range, and a weight allocation strategy is formulated that improves the accuracy of channel state perception.
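The abstract only outlines the weighting scheme, so the following is a minimal Python sketch of one way a credibility/falsity-based, geometric-mean weight allocation could look. The Euclidean conflict measure, the exponential support function, the specific falsity definition, and the final weighted fusion of per-node channel-busy estimates are illustrative assumptions rather than the paper's exact formulation, and names such as perception_weights and fuse_channel_state are hypothetical.

```python
import numpy as np

def perception_weights(evidence, eps=1e-12):
    """Illustrative weight allocation for collaborative channel perception.

    evidence : (n_nodes, n_params) array of characteristic parameters
               (e.g., traffic-pulse statistics) reported by each node.
    Returns one normalized perception weight per node.
    """
    n = evidence.shape[0]
    # Pairwise Euclidean distance between evidence vectors: a simple conflict measure.
    dist = np.linalg.norm(evidence[:, None, :] - evidence[None, :, :], axis=-1)

    # Credibility: how strongly the other nodes' evidence supports this node.
    similarity = np.exp(-dist)             # close evidence -> high mutual support
    support = similarity.sum(axis=1) - 1.0  # drop self-similarity (exp(0) = 1)
    credibility = support / (support.sum() + eps)

    # Falsity: average conflict with the other nodes, scaled to [0, 1].
    falsity = (dist.sum(axis=1) / (n - 1)) / (dist.max() + eps)

    # Geometric mean of credibility and (1 - falsity) gives the raw weight.
    raw = np.sqrt(credibility * (1.0 - falsity))
    return raw / (raw.sum() + eps)

def fuse_channel_state(states, weights):
    """Weighted fusion of per-node channel-state estimates (e.g., busy ratio)."""
    return float(np.dot(weights, states))

# Example: four single-hop neighbours, three characteristic parameters each.
evidence = np.array([
    [0.82, 0.11, 0.65],
    [0.80, 0.13, 0.61],
    [0.45, 0.40, 0.20],   # node whose traffic-pulse statistics deviate from the rest
    [0.79, 0.12, 0.63],
])
states = np.array([0.71, 0.69, 0.30, 0.70])   # each node's channel-busy estimate
w = perception_weights(evidence)
print("weights:", np.round(w, 3))
print("fused channel state:", round(fuse_channel_state(states, w), 3))
```

Under these assumptions, a node whose evidence conflicts with its single-hop neighbours receives low credibility and high falsity, so the geometric-mean weight suppresses its contribution to the fused channel state.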

Keywords

collaborative perception; data link; data fusion; weight allocation

Subject

Engineering, Telecommunications
