Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Bias Analysis and Correction in Weighted-L1 Estimators for the First-Order Bifurcating Autoregressive Model

Version 1 : Received: 24 September 2024 / Approved: 25 September 2024 / Online: 25 September 2024 (13:24:46 CEST)

How to cite: Elbayoumi, T.; Mostafa, S. Bias Analysis and Correction in Weighted-L1 Estimators for the First-Order Bifurcating Autoregressive Model. Preprints 2024, 2024092024. https://doi.org/10.20944/preprints202409.2024.v1

Abstract

This study examines the bias of weighted least absolute deviation (WL1) estimation in stationary first-order bifurcating autoregressive (BAR(1)) models, which are frequently employed to analyze binary tree-like data, including applications in cell lineage studies. Initial findings indicate that WL1 estimators can exhibit substantial and problematic bias, especially in small to moderate sample sizes. Both the autoregressive parameter and the correlation between the model errors influence the magnitude and direction of this bias. To address this issue, we propose two bootstrap-based bias-corrected versions of the WL1 estimator and conduct extensive simulations to assess their performance. Our empirical findings demonstrate that these estimators effectively reduce the bias inherent in WL1 estimation, with the improvement being particularly pronounced near the extremes of the autoregressive parameter range.
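The abstract summarizes the approach without specifying the estimation or resampling details. Purely as an illustration of the single-bootstrap bias-correction idea, the Python sketch below simulates a BAR(1) tree with correlated sister errors, fits an L1 (least absolute deviation) regression of offspring on parent with placeholder uniform weights, and applies the correction b_bc = 2*b_hat - mean(b*). The function names, the uniform weighting, and the parametric re-simulation of bootstrap trees are assumptions made for this sketch; they are not the paper's WL1 weighting scheme or its single/fast-double bootstrap procedures.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)


def simulate_bar1(n_gen, a, b, rho, sigma=1.0):
    """Simulate a stationary BAR(1) tree with correlated sister errors.

    Node t has offspring 2t and 2t+1:
        X_{2t}   = a + b*X_t + e_{2t}
        X_{2t+1} = a + b*X_t + e_{2t+1},
    where (e_{2t}, e_{2t+1}) is bivariate normal with correlation rho.
    Returns flat (parent, offspring) arrays for regression-style fitting.
    """
    x = np.empty(2 ** (n_gen + 1))
    x[1] = a / (1.0 - b)  # start the root at the stationary mean
    cov = sigma ** 2 * np.array([[1.0, rho], [rho, 1.0]])
    parents, offspring = [], []
    for t in range(1, 2 ** n_gen):
        e = rng.multivariate_normal([0.0, 0.0], cov)
        x[2 * t] = a + b * x[t] + e[0]
        x[2 * t + 1] = a + b * x[t] + e[1]
        parents += [x[t], x[t]]
        offspring += [x[2 * t], x[2 * t + 1]]
    return np.array(parents), np.array(offspring)


def wl1_fit(parents, offspring, weights=None):
    """Weighted-L1 (least absolute deviation) fit of offspring = a + b*parent.
    Uniform weights are a placeholder; the paper's weighting scheme differs."""
    if weights is None:
        weights = np.ones_like(parents)

    def loss(theta):
        return np.sum(weights * np.abs(offspring - theta[0] - theta[1] * parents))

    return minimize(loss, x0=[0.0, 0.0], method="Nelder-Mead").x  # (a_hat, b_hat)


def single_bootstrap_bc(parents, offspring, n_gen, B=199):
    """Single-bootstrap bias correction: b_bc = 2*b_hat - mean(b*_hat).
    Bootstrap trees are re-simulated at the fitted parameters (a parametric
    bootstrap used here only as a stand-in for the paper's resampling)."""
    a_hat, b_hat = wl1_fit(parents, offspring)
    resid = offspring - a_hat - b_hat * parents
    rho_hat = np.corrcoef(resid[::2], resid[1::2])[0, 1]  # sister-error correlation
    sigma_hat = resid.std()
    boot = [
        wl1_fit(*simulate_bar1(n_gen, a_hat, b_hat, rho_hat, sigma_hat))[1]
        for _ in range(B)
    ]
    return 2.0 * b_hat - np.mean(boot), b_hat


if __name__ == "__main__":
    n_gen = 7  # 2**7 - 1 = 127 parent cells, 254 offspring observations
    parents, offspring = simulate_bar1(n_gen, a=0.5, b=0.7, rho=0.5)
    b_bc, b_hat = single_bootstrap_bc(parents, offspring, n_gen, B=99)
    print(f"true b = 0.70, WL1 estimate = {b_hat:.3f}, bias-corrected = {b_bc:.3f}")
```

The sketch targets only the autoregressive slope b, since the abstract identifies it (together with the error correlation) as the driver of the bias; a fast double bootstrap would add one second-level resample per first-level replicate to refine the correction.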

Keywords

Bifurcating; Autoregressive; Single Bootstrap; Fast Double Bootstrap

Subject

Computer Science and Mathematics, Probability and Statistics

