Due to the nature of the motion experienced by a ship-borne mechanical platform at sea, three degrees of freedom need to be considered: heave, roll, and pitch. Typically, the heave motion signal is acquired through double integration of data from an acceleration sensor. However, the platform also moves in the roll and pitch directions, which causes the acceleration sensor to deviate from the ideal horizontal plane. These inclination angles exert additional horizontal forces on the piezoelectric element of the acceleration sensor, leading to errors in the displacement measurement obtained through integration. As a result, it becomes difficult to accurately calculate the attitude of the ship-borne mechanical platform.
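The integration error described above can be illustrated with a short numerical sketch. All values here are hypothetical: an ideal sinusoidal heave acceleration is double-integrated with the trapezoidal rule, and the tilt effect is approximated by the simplified assumption that a sensor inclined by θ senses only the cos θ projection of the vertical acceleration (the real piezoelectric error mechanism is more involved).

```python
import math

def double_integrate(accel, dt):
    """Double-integrate an acceleration series (trapezoidal rule)
    to obtain a relative displacement series starting from rest."""
    vel, disp = [0.0], [0.0]
    for k in range(1, len(accel)):
        vel.append(vel[-1] + 0.5 * (accel[k - 1] + accel[k]) * dt)
        disp.append(disp[-1] + 0.5 * (vel[-2] + vel[-1]) * dt)
    return disp

# Hypothetical 0.2 Hz heave: a(t) = -A*w^2*cos(w*t) integrates
# (from rest) to the relative displacement x(t) = A*(cos(w*t) - 1).
dt, A, w = 0.01, 1.0, 2.0 * math.pi * 0.2
t = [k * dt for k in range(2001)]
accel_true = [-A * w * w * math.cos(w * tk) for tk in t]
disp = double_integrate(accel_true, dt)

# Under the simplified tilt model, a 5-degree inclination scales the
# sensed vertical acceleration by cos(theta), so the integrated
# displacement is biased by the same factor.
theta = math.radians(5.0)
disp_tilted = double_integrate([a * math.cos(theta) for a in accel_true], dt)
```

At the half-period t = 2.5 s the true relative displacement is −2A, while the tilted sensor under-reads it by a factor of cos 5° ≈ 0.996, a bias that the integration cannot distinguish from genuine motion.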
Key concerns in the attitude estimation of a ship-borne mechanical platform are improving the accuracy and real-time performance of the sensor system, particularly under the low-frequency motion caused by sea waves. Challenges arise because each type of sensor has its limitations, and no single sensor can obtain accurate multidimensional attitude data on its own. Yet aggressively increasing the number of sensors is not a wise solution, since it imposes a massive computational burden. To achieve precise estimation of the ship-borne mechanical platform attitude while eliminating information redundancy, an efficient multi-sensor fusion method with multilevel information-processing capabilities is required.
To address the shortcomings of a single sensor in accurately estimating real-time, comprehensive multi-degree-of-freedom information, scholars have conducted extensive research on multi-sensor fusion techniques. Luo R C developed the general paradigm of a sensor data fusion system in 1988.[1] Durrant-Whyte H F presented a fully decentralized architecture for data fusion problems in 1990.[2] Wen W described an algorithm for implementing a multi-sensor system in a model-based environment with consideration of the constraints in 1992.[3] Harris C J formally introduced the process of data fusion and sensor integration with a variety of implementation architectures, recognizing data fusion as a critical element in overall systems integration, in 1998.[4] Llinas J provided a tutorial on data fusion, introducing data fusion applications, process models, and the identification of applicable techniques, also in 1998.[5] Chen S designed a detection system based on multi-sensor data fusion technology in 2003.[6] Herpel T modeled sensor phenomena, road traffic scenarios, data fusion paradigms, and signal processing algorithms, and investigated, by means of discrete-event simulation, the impact of combining sensor data at different levels of abstraction on the performance of the multi-sensor system in 2008.[7] In the same year, Manjunatha P proposed a multi-sensor data fusion algorithm in WSN using fuzzy logic for event detection applications, improving the reliability and accuracy of the sensed information.[8] Dong J presented an overview of recent advances in multi-sensor fusion: the most popular existing fusion algorithms were introduced, with emphasis on their recent improvements; advances in the main application fields of remote sensing, including object identification, classification, change detection, and maneuvering-target tracking, were described; and both the advantages and limitations of those applications were discussed.[9] Wolter P T used partial least squares (PLS) regression to integrate different combinations of satellite sensor data and determine the best combination in 2010.[10] Medjahed H proposed an automatic in-home healthcare monitoring system with several sensors that can be installed at home, enabling a full and tightly controlled universe of data sets, in 2011.[11] Banerjee T P proposed and investigated a hybrid method for fault-signal classification based on sensor data fusion using Support Vector Machine (SVM) and Short-Term Fourier Transform (STFT) techniques in 2012.[12] Frikha A proposed a modified Analytical Hierarchy Process (AHP) that incorporates several criteria to determine the weights of a sensor reading set.[13] Azimirad E presented a comprehensive review of data fusion architecture, exploring its conceptualizations, benefits, and challenging aspects, as well as existing architectures, in 2015.[14] Fortino G proposed a system based on a multi-sensor data fusion schema to automatically detect handshakes between two individuals and capture possible heart-rate-based emotional reactions due to the individuals' meeting.[15] Rawat S used a back-propagation neural network to solve an inherent problem of multi-sensor data fusion in wireless sensor network applications in 2016.[16] Duro J A proposed a novel multi-sensor data fusion framework to enable identification of the best sensor locations for monitoring cutting operations in 2016.[17] Maimaitijiang M evaluated the power of high-spatial-resolution RGB, multispectral, and thermal data fusion to estimate soybean (Glycine max) biochemical parameters, including chlorophyll content and nitrogen concentration, in 2017.[18] Jing L proposed an adaptive multi-sensor data fusion method based on deep convolutional neural networks (DCNN) for fault diagnosis in 2017.[19] Kumar P proposed a novel multi-sensor fusion framework for Sign Language Recognition (SLR) using a Coupled Hidden Markov Model (CHMM) in 2017.[20] Bouain M proposed a Multi-Sensor Data Fusion (MSDF) embedded design for vehicle perception tasks using stereo camera and Light Detection and Ranging (LIDAR) sensors in 2018.[21] Xiao F proposed a weighted combination method for conflicting pieces of evidence in multi-sensor data fusion in 2018.[22] Zhang W proposed a mathematical model based on a multi-sensor data fusion algorithm to diagnose the safety distance of the line drone in 2019.[23] De Farias C M proposed an MDF technique that divides the monitored interval into a set of intervals (non-overlapping and overlapping) and attributes each interval to an abstract sensor in 2019.[24] Xiao F proposed a novel method for multi-sensor data fusion based on a new belief divergence measure of evidence and the belief entropy in 2019, and then proposed a hybrid MSDF method in 2020.[25,26] Muzammal M proposed a data-fusion-enabled ensemble approach in 2020.[27] Li N proposed a RUL prediction method based on a multi-sensor data fusion model in 2021.[28] Kashinath S A reviewed DF methods used for real-time and multi-sensor (heterogeneous) TFA studies in 2021.[29] Fei S used machine learning (ML) methods for the fusion of unmanned aerial vehicle (UAV)-based multi-sensor data to improve the prediction accuracy of crop yield in 2023.[30] Han C proposed an absolute displacement measurement method and its application in ship motion measurement.[31] However, in the field of three-degree-of-freedom motion measurement of ship-borne mechanical platforms, multi-sensor fusion methods have seldom been used, and many technical problems remain in their application to ship-borne mechanical platform motion measurement. In particular, the inherent interference between the effects of different degree-of-freedom ship movements on sensor outputs poses challenges to traditional sensor-fusion methods.
In this paper, in order to improve the accuracy of the ship-borne mechanical platform motion signal computed by double integration of a single-degree-of-freedom acceleration sensor, a motion measurement correction method based on multi-sensor fusion is proposed, using one set of angle sensors and four sets of acceleration sensors. The proposed sensor configuration eliminates the influence of the inclination angle on the integrated displacement signal in the heave direction, and utilizes a Kalman filter to estimate the optimal inclination angle and heave displacement. In the proposed Kalman-filter-based algorithm, the calculated angle signal is used as the estimation value, and the angle signal collected by the angle sensor is used as the observation value, iteratively optimizing the inclination angles. To demonstrate the effectiveness of the method, we verified the proposed algorithm on a laboratory platform by simulating the motion of the ship-borne mechanical platform and computing the multi-sensor-fusion heave displacement estimate. Finally, the estimated data were compared with the standard motion signal obtained by a laser sensor. The results verify that, through data fusion of acceleration sensors and a tilt-angle sensor, the algorithm restores the actual motion characteristics of the ship-borne mechanical platform with improved accuracy compared with methods using only one type of motion sensor.
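The iterative structure described above, where a calculated angle serves as the prediction and the angle-sensor reading as the observation, can be sketched as a minimal scalar Kalman iteration. The noise variances, the constant true angle, and the signal model below are illustrative assumptions for the sketch, not the parameters of the actual system.

```python
import random

def fuse_angle(theta_calc, theta_obs, p, q, r):
    """One scalar Kalman iteration. theta_calc: angle computed from the
    acceleration-sensor geometry (prediction / estimation value);
    theta_obs: angle-sensor reading (observation value); p: error
    covariance; q, r: process / measurement noise variances."""
    p_pred = p + q                                      # predict covariance
    k = p_pred / (p_pred + r)                           # Kalman gain
    theta = theta_calc + k * (theta_obs - theta_calc)   # blended estimate
    return theta, (1.0 - k) * p_pred                    # update covariance

random.seed(1)
true_angle = 3.0      # degrees; held constant for this sketch
p = 1.0               # initial error covariance (hypothetical)
est = []
for _ in range(500):
    # The computed angle is assumed noisier than the angle sensor here.
    theta_calc = true_angle + random.gauss(0.0, 1.0)
    theta_obs = true_angle + random.gauss(0.0, 0.3)
    theta, p = fuse_angle(theta_calc, theta_obs, p, q=1.0, r=0.09)
    est.append(theta)

# Average of the fused estimates after the filter has settled.
mean_est = sum(est[100:]) / len(est[100:])
```

With these assumed variances the gain settles so that the fused angle weights the cleaner angle-sensor observation more heavily, and the averaged estimate converges to the true inclination; the full method applies the same predict-update cycle jointly to the inclination angles and the heave displacement.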