1. Introduction
The negative binomial (NB) regression model (NBRM) is analogous to a multiple regression model, except that the response variable consists of counts that follow the NB distribution. The NBRM is a generalized linear model (GLM) that relaxes the restrictive assumption of the Poisson regression model that the variance equals the mean. The NBRM is chosen over the Poisson regression model especially when the data are over-dispersed (Algamal (2012), Cameron & Trivedi (2013), De Jong & Heller (2008)). The parameters of the NBRM are generally estimated by the maximum likelihood (ML) method.
According to Alobaidi et al. (2021), the NBRM assumes that the explanatory variables do not exhibit high correlation among themselves, that is, there is no multicollinearity. Recent studies have demonstrated that multicollinearity is often a challenge among the explanatory variables, and the ML method is extremely sensitive to it. As noted in Kibria (2003), the regression coefficients become indeterminate when the variables are perfectly related. One major consequence of multicollinearity in a regression model is that the standard errors of the regression coefficients are inflated, making individual coefficients hard to test (Oranye & Ugwuowo, 2021); it therefore becomes difficult to judge whether the regression coefficients are significant or not. In addition, multicollinearity widens the confidence intervals, which tends to produce false negative results, known as errors of omission, in hypothesis testing (Qasim et al. 2020).
Researchers have proposed diverse ways to contend with multicollinearity in the linear regression model (LRM); see, e.g., Hoerl & Kennard (1970), Liu (2003), Kibria & Lukman (2020), Ozkale & Kaciranlar (2007), Suhail et al. (2020), Ugwuowo et al. (2021), Arum & Ugwuowo (2022), Perveen & Suhail (2021), Babar et al. (2021), Dawoud et al. (2022), Algamal et al. (2023), Wasim et al. (2023) and Abonazel et al. (2023). Hoerl & Kennard (1970) proposed the ridge estimator to handle high correlation in the LRM. Singh et al. (1986) introduced an estimator that reduces the bias of the ridge estimate using the jackknife procedure; the reduction in bias of the new estimator makes its MSE smaller than that of the ridge estimator. According to Khurana et al. (2014), the bias of the estimate is close to zero for the second-order jackknifed ridge (J2R) estimator.
The negative binomial ridge regression model (NBRRM) has been widely used when multicollinearity exists among the explanatory variables. In the literature, several types of ridge estimators have been proposed for estimating the biasing parameter in the NBRRM; see, e.g., Türkan & Özel (2018), Rashad et al. (2019), Alobaidi et al. (2021), Akram et al. (2022) and Jabur et al. (2022).
Kibria & Lukman (2020) proposed a new ridge-type estimator, called the Kibria-Lukman (KL) estimator, that supersedes the ridge estimator. However, we observed that the KL estimator in the LRM still carries a substantial amount of bias. On this note, we apply the jackknife procedure to the KL estimator in the negative binomial model.
Our objective is to jackknife the KL estimator for the NBRM. In Section 2, we review the NBRM and some existing estimators with their properties, derive the new estimator, and compare it with the existing methods theoretically. In Section 3, we consider the selection of the biasing parameter k, which is very important, and adopt several methods for estimating it. In Section 4, a simulation study is conducted to evaluate the performance of the new estimator. A real-life application and concluding remarks are given in Sections 5 and 6, respectively.
2. Methodology
The NB distribution, a member of the family of discrete probability distributions, is commonly used to model the number of successes in a sequence of independent and identically distributed Bernoulli trials before a stated number of failures occurs. It is often used for analyzing count data whenever the response variable follows a negative binomial distribution, arising from a sequence of Bernoulli trials with success probability π that continues until θ successes are observed.
The NB distribution probability function is given by

$$f(y_i;\theta,\mu_i)=\frac{\Gamma(y_i+\theta)}{\Gamma(\theta)\,y_i!}\left(\frac{\theta}{\theta+\mu_i}\right)^{\theta}\left(\frac{\mu_i}{\theta+\mu_i}\right)^{y_i},\qquad y_i=0,1,2,\ldots \tag{1}$$

where $E(y_i)=\mu_i$ and $\mathrm{Var}(y_i)=\mu_i+\alpha\mu_i^{2}$. The negative binomial mean is modeled with parameter $\mu_i=\exp(z_i^{\prime}\beta)$, with the over-dispersion parameter given as $\alpha=1/\theta$. Equation (1) can be rewritten as

$$f(y_i;\alpha,\mu_i)=\exp\left\{y_i\log\left(\frac{\alpha\mu_i}{1+\alpha\mu_i}\right)-\frac{1}{\alpha}\log(1+\alpha\mu_i)+\log\left(\frac{\Gamma(y_i+1/\alpha)}{\Gamma(y_i+1)\,\Gamma(1/\alpha)}\right)\right\} \tag{2}$$

where the first term involves the canonical parameter $\log\bigl(\alpha\mu_i/(1+\alpha\mu_i)\bigr)$ and the remaining terms do not depend on $\beta$, which exposes the exponential-family (GLM) form of the distribution.
The negative binomial model is estimated using the method of ML. Cameron & Trivedi (2013) describe the likelihood function as follows. Let $y_i\sim NB(\mu_i,\alpha)$, $i=1,\ldots,n$, where $z_i^{\prime}$ is the $i$th row of the data matrix $Z$ of order $n\times(p+1)$ with $p$ explanatory variables, and $\beta$ is a $(p+1)\times 1$ vector of coefficients. The likelihood function is

$$L(\alpha,\beta)=\prod_{i=1}^{n}f(y_i;\alpha,\mu_i), \tag{3}$$

and $\beta$ is evaluated by the MLE, which maximizes the log-likelihood given by

$$\ell(\alpha,\beta)=\sum_{i=1}^{n}\left\{y_i\log\left(\frac{\alpha\mu_i}{1+\alpha\mu_i}\right)-\frac{1}{\alpha}\log(1+\alpha\mu_i)+\log\Gamma\left(y_i+\frac{1}{\alpha}\right)-\log\Gamma(y_i+1)-\log\Gamma\left(\frac{1}{\alpha}\right)\right\} \tag{4}$$

where $\mu_i=\exp(z_i^{\prime}\beta)$ and $\alpha=1/\theta$. The MLE can be achieved by working out the likelihood equation

$$\frac{\partial\ell(\alpha,\beta)}{\partial\beta}=\sum_{i=1}^{n}\frac{y_i-\mu_i}{1+\alpha\mu_i}\,z_i=0. \tag{5}$$
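As a minimal sketch (ours, not the authors' code), the log-likelihood in Equation (4) can be evaluated with numpy and scipy, treating the over-dispersion parameter α as given; the function name is our assumption:

```python
# Sketch: NB log-likelihood of Equation (4) with log link mu_i = exp(z_i' beta).
import numpy as np
from scipy.special import gammaln

def nb_loglik(beta, alpha, Z, y):
    mu = np.exp(Z @ beta)
    return np.sum(
        y * np.log(alpha * mu / (1.0 + alpha * mu))
        - np.log(1.0 + alpha * mu) / alpha
        + gammaln(y + 1.0 / alpha)
        - gammaln(y + 1.0)
        - gammaln(1.0 / alpha)
    )
```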
Equation (5) is nonlinear in $\beta$; hence, the weighted least squares (IRLS) algorithm is adopted as follows:

$$\hat{\beta}_{ML}=(Z^{\prime}PZ)^{-1}Z^{\prime}P\hat{z} \tag{6}$$

where $P=\mathrm{diag}\left(\dfrac{\hat{\mu}_i}{1+\alpha\hat{\mu}_i}\right)$ and $\hat{z}$ is a vector whose $i$th element equals $\hat{z}_i=\log(\hat{\mu}_i)+\dfrac{y_i-\hat{\mu}_i}{\hat{\mu}_i}$. The ML estimator of $\beta$ is asymptotically normally distributed with mean vector $\beta$ and covariance matrix $(Z^{\prime}PZ)^{-1}$. The mean squared error established on the asymptotic covariance matrix equals

$$\mathrm{MSE}(\hat{\beta}_{ML})=\operatorname{tr}\left[(Z^{\prime}PZ)^{-1}\right]=\sum_{j=1}^{p+1}\frac{1}{\lambda_j} \tag{7}$$

where $\lambda_j$ is the $j$th eigenvalue of the $Z^{\prime}PZ$ matrix.
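The following sketch (ours) implements the IRLS iteration of Equation (6) under the simplifying assumption that α is known; the starting value and convergence rule are illustrative choices:

```python
# Sketch: IRLS for the NB MLE with log link, alpha treated as known.
import numpy as np

def nb_irls(Z, y, alpha, n_iter=100, tol=1e-8):
    beta = np.zeros(Z.shape[1])
    for _ in range(n_iter):
        mu = np.exp(Z @ beta)
        w = mu / (1.0 + alpha * mu)          # diagonal of P
        z_adj = Z @ beta + (y - mu) / mu     # adjusted response z-hat
        ZtP = Z.T * w                        # Z'P without forming diag(w)
        beta_new = np.linalg.solve(ZtP @ Z, ZtP @ z_adj)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```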
Månsson (2012) proposed the negative binomial ridge estimator to account for multicollinearity, defined as follows:

$$\hat{\beta}_{NBRR}=(Z^{\prime}PZ+kI)^{-1}Z^{\prime}PZ\,\hat{\beta}_{ML} \tag{8}$$

where the tuning parameter $k>0$. The scalar MSE is defined as follows:

$$\mathrm{MSE}(\hat{\beta}_{NBRR})=\sum_{j=1}^{p+1}\frac{\lambda_j}{(\lambda_j+k)^2}+k^{2}\sum_{j=1}^{p+1}\frac{\gamma_j^{2}}{(\lambda_j+k)^2} \tag{9}$$

where $\gamma_j$ is the $j$th element of $\gamma=Q^{\prime}\beta$ and $Q$ is the matrix whose columns are the normalized eigenvectors of $Z^{\prime}PZ$.
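Building on the converged IRLS quantities, a sketch (ours) of the ridge estimate in Equation (8); the function name is an assumption:

```python
# Sketch: NB ridge estimate of Equation (8) from the converged IRLS weights.
import numpy as np

def nb_ridge(Z, y, alpha, beta_ml, k):
    mu = np.exp(Z @ beta_ml)
    w = mu / (1.0 + alpha * mu)
    S = (Z.T * w) @ Z                     # S = Z'PZ
    return np.linalg.solve(S + k * np.eye(S.shape[0]), S @ beta_ml)
```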
2.1. The Proposed Estimator
Oranye & Ugwuowo (2021) opined that the jackknife procedure reduces the bias of the Poisson KL estimator in generalized linear models; the estimator they derived has smaller bias and also possesses a desirable large-sample property. Thus, we develop the jackknifed NB model as follows.

The negative binomial KL estimator is given by

$$\hat{\beta}_{NBKL}=(Z^{\prime}PZ+kI)^{-1}(Z^{\prime}PZ-kI)\,\hat{\beta}_{ML}.$$

Let $\lambda_{1}\geq\lambda_{2}\geq\cdots\geq\lambda_{p+1}>0$ be the eigenvalues of the $Z^{\prime}PZ$ matrix, insomuch that

$$Q^{\prime}Z^{\prime}PZ\,Q=\Lambda=\mathrm{diag}(\lambda_{1},\ldots,\lambda_{p+1}), \tag{10}$$

where $Q$ is an orthogonal matrix whose columns are the normalized eigenvectors of $Z^{\prime}PZ$. So, writing $\gamma=Q^{\prime}\beta$ and $\tilde{Z}=ZQ$, the negative binomial MLE can be written as

$$\hat{\gamma}_{ML}=\Lambda^{-1}\tilde{Z}^{\prime}P\hat{z}, \tag{11}$$

where $\hat{\beta}_{ML}=Q\hat{\gamma}_{ML}$. Khurana et al. (2014) reduced the bias of the ridge estimator using the jackknife procedure, and Quenouille (1956) studied the properties of the jackknife procedure. In canonical form, the negative binomial KL (NBKL) estimator is written as

$$\hat{\gamma}_{NBKL}=(\Lambda+kI)^{-1}(\Lambda-kI)\,\hat{\gamma}_{ML}. \tag{12}$$
The jackknife procedure will now be applied to the negative binomial KL estimator in Equation (12). Deleting the $i$th observation and writing $\hat{\gamma}_{NBKL}=(\Lambda+kI)^{-1}\bigl(\tilde{Z}^{\prime}P\hat{z}-k\hat{\gamma}_{ML}\bigr)$, we obtain

$$\hat{\gamma}_{NBKL(-i)}=\bigl(\Lambda+kI-\tilde{z}_{i}\tilde{z}_{i}^{\prime}\bigr)^{-1}\bigl(\tilde{Z}^{\prime}P\hat{z}-\tilde{z}_{i}\hat{z}_{i}-k\hat{\gamma}_{ML}\bigr). \tag{13}$$

Equation (13) can be written as follows. According to the Sherman-Morrison-Woodbury theorem (Rao, 1973),

$$\bigl(\Lambda+kI-\tilde{z}_{i}\tilde{z}_{i}^{\prime}\bigr)^{-1}=(\Lambda+kI)^{-1}+\frac{(\Lambda+kI)^{-1}\tilde{z}_{i}\tilde{z}_{i}^{\prime}(\Lambda+kI)^{-1}}{1-w_{i}}.$$

Hence,

$$\hat{\gamma}_{NBKL(-i)}=\hat{\gamma}_{NBKL}-\frac{(\Lambda+kI)^{-1}\tilde{z}_{i}e_{i}}{1-w_{i}}, \tag{14}$$

where $w_{i}=\tilde{z}_{i}^{\prime}(\Lambda+kI)^{-1}\tilde{z}_{i}$, $\tilde{z}_{i}^{\prime}$ is the $i$th row of $\tilde{Z}=ZQ$, $e_{i}=\hat{z}_{i}-\tilde{z}_{i}^{\prime}\hat{\gamma}_{NBKL}$ is the $i$th NBKL residual, and $\hat{z}_{i}$ is the $i$th element of $\hat{z}$.
Also, following Khurana et al. (2014), the weighted pseudo-values are defined as

$$Q_{i}=\hat{\gamma}_{NBKL}+n(1-w_{i})\bigl(\hat{\gamma}_{NBKL}-\hat{\gamma}_{NBKL(-i)}\bigr). \tag{15}$$

The weighted jackknifed estimator of $\gamma$ is established as

$$\hat{\gamma}_{JNBKL}=\frac{1}{n}\sum_{i=1}^{n}Q_{i}=\bigl[I+2k(\Lambda+kI)^{-1}\bigr]\hat{\gamma}_{NBKL}, \tag{16}$$

since

$$\mathrm{Bias}(\hat{\gamma}_{NBKL})=E(\hat{\gamma}_{NBKL})-\gamma=-2k(\Lambda+kI)^{-1}\gamma. \tag{17}$$

Using the identity

$$\bigl[I+2k(\Lambda+kI)^{-1}\bigr](\Lambda+kI)^{-1}(\Lambda-kI)=I-(2k)^{2}(\Lambda+kI)^{-2}, \tag{18}$$

jackknifing the negative binomial KL estimator gives

$$\hat{\gamma}_{JNBKL}=A\,\hat{\gamma}_{ML} \tag{19}$$

where $A=I-(2k)^{2}(\Lambda+kI)^{-2}$.
We then follow the procedure of Akdeniz & Akdeniz (2010) to modify the estimator of Khurana et al. (2014), replacing $\hat{\gamma}_{ML}$ with $\hat{\gamma}_{NBKL}$ in Equation (19). Hence, we get the modified jackknifed negative binomial KL (MJNBKL) estimator:

$$\hat{\gamma}_{MJNBKL}=A\,\hat{\gamma}_{NBKL}=\bigl[I-(2k)^{2}(\Lambda+kI)^{-2}\bigr](\Lambda+kI)^{-1}(\Lambda-kI)\,\hat{\gamma}_{ML}. \tag{20}$$
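In the canonical coordinates all the shrinkage matrices are diagonal, so the NBKL, JNBKL and MJNBKL estimates of Equations (12), (19) and (20) reduce to componentwise factors. The following sketch (ours, not the authors' code) computes all three:

```python
# Sketch: KL-family estimators in canonical (eigen) coordinates.
import numpy as np

def kl_family(Z, y, alpha, beta_ml, k):
    mu = np.exp(Z @ beta_ml)
    w = mu / (1.0 + alpha * mu)
    S = (Z.T * w) @ Z
    lam, Q = np.linalg.eigh(S)            # S = Q diag(lam) Q'
    g_ml = Q.T @ beta_ml                  # canonical MLE, gamma-hat
    f = (lam - k) / (lam + k)             # KL shrinkage factors, Eq. (12)
    a = 1.0 - (2.0 * k) ** 2 / (lam + k) ** 2   # diagonal of A, Eq. (19)
    g_kl, g_jkl, g_mjkl = f * g_ml, a * g_ml, a * f * g_ml   # Eq. (20)
    return Q @ g_kl, Q @ g_jkl, Q @ g_mjkl      # back to beta coordinates
```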
2.2. The Characteristics of the MJNBKL Estimator
The bias vector, covariance matrix and mean squared error matrix (MSEM) of the MJNBKL estimator are

$$b_{MJ}=\mathrm{Bias}(\hat{\gamma}_{MJNBKL})=\bigl[A(\Lambda+kI)^{-1}(\Lambda-kI)-I\bigr]\gamma,$$
$$V_{MJ}=\mathrm{Cov}(\hat{\gamma}_{MJNBKL})=A(\Lambda+kI)^{-1}(\Lambda-kI)\,\Lambda^{-1}\,(\Lambda-kI)(\Lambda+kI)^{-1}A,$$
$$\mathrm{MSEM}(\hat{\gamma}_{MJNBKL})=V_{MJ}+b_{MJ}b_{MJ}^{\prime},$$

where $A=I-(2k)^{2}(\Lambda+kI)^{-2}$. For the comparisons below, the corresponding quantities of the existing estimators are $V_{ML}=\Lambda^{-1}$ with $b_{ML}=0$ for the MLE; $V_{R}=\Lambda(\Lambda+kI)^{-2}$ with $b_{R}=-k(\Lambda+kI)^{-1}\gamma$ for the NB ridge estimator; $V_{JR}=\bigl[I-k^{2}(\Lambda+kI)^{-2}\bigr]^{2}\Lambda^{-1}$ with $b_{JR}=-k^{2}(\Lambda+kI)^{-2}\gamma$ for the jackknifed NB ridge estimator; and $V_{KL}=(\Lambda-kI)^{2}(\Lambda+kI)^{-2}\Lambda^{-1}$ with $b_{KL}=-2k(\Lambda+kI)^{-1}\gamma$ for the NBKL estimator (all matrices being diagonal in the canonical coordinates).
The following lemmas are adopted for the theoretical comparisons.
Lemma 1.
Let $M$ be an $n\times n$ positive definite matrix, that is $M>0$, and let $\alpha$ be some vector; then $M-\alpha\alpha^{\prime}\geq 0$ if and only if $\alpha^{\prime}M^{-1}\alpha\leq 1$ (Farebrother, 1976).
Lemma 2.
Let $\hat{\beta}_{i}=A_{i}y$, $i=1,2$, be two linear estimators of $\beta$. Let $D=\mathrm{Cov}(\hat{\beta}_{1})-\mathrm{Cov}(\hat{\beta}_{2})>0$, where $\mathrm{Cov}(\hat{\beta}_{i})$, $i=1,2$, is the covariance matrix of $\hat{\beta}_{i}$, and let $b_{i}=\mathrm{Bias}(\hat{\beta}_{i})$, $i=1,2$. Then

$$\Delta(\hat{\beta}_{1},\hat{\beta}_{2})=\mathrm{MSEM}(\hat{\beta}_{1})-\mathrm{MSEM}(\hat{\beta}_{2})=D+b_{1}b_{1}^{\prime}-b_{2}b_{2}^{\prime}>0$$

if and only if $b_{2}^{\prime}(D+b_{1}b_{1}^{\prime})^{-1}b_{2}\leq 1$ (Trenkler & Toutenburg, 1990).
2.3. Comparison among the Estimators
2.3.1. Comparison between $\hat{\gamma}_{MJNBKL}$ and $\hat{\gamma}_{ML}$
Theorem 1.
The MJNBKL estimator ($\hat{\gamma}_{MJNBKL}$) supersedes the maximum likelihood estimator for the NB ($\hat{\gamma}_{ML}$) if and only if $b_{MJ}^{\prime}(V_{ML}-V_{MJ})^{-1}b_{MJ}\leq 1$.
Proof. The difference between $\mathrm{MSEM}(\hat{\gamma}_{ML})$ and $\mathrm{MSEM}(\hat{\gamma}_{MJNBKL})$ is

$$\Delta_{1}=V_{ML}-V_{MJ}-b_{MJ}b_{MJ}^{\prime},$$

where $V_{ML}-V_{MJ}=\mathrm{diag}\left\{\dfrac{1-c_{j}^{2}}{\lambda_{j}}\right\}_{j=1}^{p+1}>0$ whenever $0<c_{j}<1$, with $c_{j}=\left[1-\dfrac{(2k)^{2}}{(\lambda_{j}+k)^{2}}\right]\dfrac{\lambda_{j}-k}{\lambda_{j}+k}$. Consequently, by Lemma 1, $\Delta_{1}$ is positive definite provided $b_{MJ}^{\prime}(V_{ML}-V_{MJ})^{-1}b_{MJ}\leq 1$. □
2.3.2. Comparison between $\hat{\gamma}_{MJNBKL}$ and $\hat{\gamma}_{NBRR}$
Theorem 2.
The MJNBKL estimator ($\hat{\gamma}_{MJNBKL}$) supersedes the ridge regression estimator for the NB ($\hat{\gamma}_{NBRR}$) if and only if $b_{MJ}^{\prime}(V_{R}-V_{MJ}+b_{R}b_{R}^{\prime})^{-1}b_{MJ}\leq 1$, provided $V_{R}-V_{MJ}>0$.
Proof. The difference between $\mathrm{MSEM}(\hat{\gamma}_{NBRR})$ and $\mathrm{MSEM}(\hat{\gamma}_{MJNBKL})$ is

$$\Delta_{2}=V_{R}-V_{MJ}+b_{R}b_{R}^{\prime}-b_{MJ}b_{MJ}^{\prime},$$

where $V_{R}-V_{MJ}=\mathrm{diag}\left\{\dfrac{\lambda_{j}^{2}/(\lambda_{j}+k)^{2}-c_{j}^{2}}{\lambda_{j}}\right\}_{j=1}^{p+1}>0$ whenever $c_{j}^{2}<\lambda_{j}^{2}/(\lambda_{j}+k)^{2}$. Consequently, by Lemma 2, $\Delta_{2}$ is positive definite provided that $b_{MJ}^{\prime}(V_{R}-V_{MJ}+b_{R}b_{R}^{\prime})^{-1}b_{MJ}\leq 1$. □
2.3.3. Comparison between $\hat{\gamma}_{MJNBKL}$ and $\hat{\gamma}_{JNBRR}$
Theorem 3.
The MJNBKL estimator ($\hat{\gamma}_{MJNBKL}$) supersedes the jackknifed ridge regression estimator for the NB ($\hat{\gamma}_{JNBRR}$) if and only if $b_{MJ}^{\prime}(V_{JR}-V_{MJ}+b_{JR}b_{JR}^{\prime})^{-1}b_{MJ}\leq 1$, provided $V_{JR}-V_{MJ}>0$.
Proof. The difference between $\mathrm{MSEM}(\hat{\gamma}_{JNBRR})$ and $\mathrm{MSEM}(\hat{\gamma}_{MJNBKL})$ is

$$\Delta_{3}=V_{JR}-V_{MJ}+b_{JR}b_{JR}^{\prime}-b_{MJ}b_{MJ}^{\prime},$$

where $V_{JR}-V_{MJ}=\mathrm{diag}\left\{\dfrac{d_{j}^{2}-c_{j}^{2}}{\lambda_{j}}\right\}_{j=1}^{p+1}>0$ whenever $c_{j}^{2}<d_{j}^{2}$, with $d_{j}=1-k^{2}/(\lambda_{j}+k)^{2}$. Consequently, by Lemma 2, $\Delta_{3}$ is positive definite provided that $b_{MJ}^{\prime}(V_{JR}-V_{MJ}+b_{JR}b_{JR}^{\prime})^{-1}b_{MJ}\leq 1$. □
2.3.4. Comparison between $\hat{\gamma}_{MJNBKL}$ and $\hat{\gamma}_{NBKL}$
Theorem 4.
The MJNBKL estimator ($\hat{\gamma}_{MJNBKL}$) supersedes the KL estimator for the NB ($\hat{\gamma}_{NBKL}$) if and only if $b_{MJ}^{\prime}(V_{KL}-V_{MJ}+b_{KL}b_{KL}^{\prime})^{-1}b_{MJ}\leq 1$, provided $V_{KL}-V_{MJ}>0$.
Proof. The difference between $\mathrm{MSEM}(\hat{\gamma}_{NBKL})$ and $\mathrm{MSEM}(\hat{\gamma}_{MJNBKL})$ is

$$\Delta_{4}=V_{KL}-V_{MJ}+b_{KL}b_{KL}^{\prime}-b_{MJ}b_{MJ}^{\prime},$$

where $V_{KL}-V_{MJ}=\mathrm{diag}\left\{\dfrac{f_{j}^{2}(1-a_{j}^{2})}{\lambda_{j}}\right\}_{j=1}^{p+1}>0$, since $0<a_{j}<1$ for $\lambda_{j}>k$, with $f_{j}=(\lambda_{j}-k)/(\lambda_{j}+k)$ and $a_{j}=1-(2k)^{2}/(\lambda_{j}+k)^{2}$. Consequently, by Lemma 2, $\Delta_{4}$ is positive definite provided $b_{MJ}^{\prime}(V_{KL}-V_{MJ}+b_{KL}b_{KL}^{\prime})^{-1}b_{MJ}\leq 1$. □
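As a toy numerical illustration of these comparisons (ours; the eigenvalues, coefficients and k below are arbitrary), the scalar MSEs can be compared componentwise in the canonical form:

```python
# Toy check of Theorem 1 in canonical form, where every matrix is diagonal.
import numpy as np

lam = np.array([10.0, 1.0, 0.05])   # eigenvalues; one near zero (collinearity)
gam = np.array([0.5, 0.5, 0.5])     # true canonical coefficients (illustrative)
k = 0.02                            # a small shrinkage parameter

f = (lam - k) / (lam + k)                    # KL factors
a = 1.0 - (2.0 * k) ** 2 / (lam + k) ** 2    # diagonal of A
c = a * f                                    # MJNBKL factors

mse_ml = np.sum(1.0 / lam)                                     # MLE
mse_mj = np.sum(c**2 / lam) + np.sum((c - 1.0) ** 2 * gam**2)  # MJNBKL
print(mse_ml, mse_mj)   # approx. 21.1 vs 2.81: MJNBKL dominates here
```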
3. Estimating the Value of the Biasing Parameter
The selection of the biasing parameter k is very important. To estimate the optimal value of k in Equation (20), different methods that have been proposed for estimating biasing parameters in both linear and generalized linear models are adopted. The ridge parameter k reduces the problem of multicollinearity. The following shrinkage-parameter estimators, proposed by different authors, are used to select the optimal k for the negative binomial regression model (an illustrative computation is sketched after this list):
- $\hat{k}_{1}$: the estimator of Lukman et al. (2021);
- $\hat{k}_{2}$: the estimator of Kibria (2003);
- $\hat{k}_{3}$: the estimator of Månsson (2012);
- $\hat{k}_{4}$: the estimator of Lukman & Ayinde (2017).
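The exact formulas for $\hat{k}_{1}$-$\hat{k}_{4}$ are those of the cited sources; purely as a hedged illustration of how such rules are computed from the canonical coefficients, the sketch below implements two classical members of this family (the Hoerl-Kennard rule and Kibria's geometric-mean rule); the function name and the $\hat{\sigma}^{2}=1$ default are our assumptions:

```python
# Illustrative shrinkage-parameter rules (not necessarily the paper's k1-k4).
import numpy as np

def k_rules(Q, beta_hat, sigma2=1.0):
    alpha_hat = Q.T @ beta_hat                    # canonical coefficients
    k_hk = sigma2 / np.max(alpha_hat**2)          # Hoerl-Kennard (1970) rule
    k_gm = sigma2 / np.exp(np.mean(np.log(alpha_hat**2)))  # Kibria (2003) GM rule
    return k_hk, k_gm
```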
4. Monte Carlo Simulation
A Monte Carlo simulation experiment was conducted in this research to assess the effectiveness of the proposed estimator under various levels of multicollinearity.
4.1. Simulation Design
Following Kibria (2003) and Månsson & Shukur (2011), the dependent variable of the NBRM is generated from n observations as

$$y_{i}\sim NB(\mu_{i},\theta),\qquad i=1,\ldots,n,$$

with $\mu_{i}=\exp(z_{i}^{\prime}\beta)$. Here $\beta=(\beta_{1},\ldots,\beta_{p})^{\prime}$ with $\sum_{j=1}^{p}\beta_{j}^{2}=1$ and $\beta_{1}=\cdots=\beta_{p}$. The explanatory variables at various levels of correlation are generated as

$$z_{ij}=(1-\rho^{2})^{1/2}\,w_{ij}+\rho\,w_{i(p+1)},\qquad i=1,\ldots,n,\; j=1,\ldots,p,$$

where $\rho^{2}$ specifies the different stages of multicollinearity among the explanatory variables (Kibria & Banik (2016), Arum et al. (2022) and Arum et al. (2023)), and the $w_{ij}$ are computer-generated pseudo-random numbers from the standard normal distribution.
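For concreteness, the following sketch (ours) generates one simulated dataset under this design; the seed, the absence of an intercept and the equal-beta direction are illustrative choices, and the Gamma-Poisson mixture is an equivalent way of drawing NB(μᵢ, θ) responses:

```python
# Sketch: one simulated dataset under the stated design.
import numpy as np

rng = np.random.default_rng(1)
n, p, rho, theta = 50, 5, 0.99, 2.0

w = rng.standard_normal((n, p + 1))
Z = np.sqrt(1.0 - rho**2) * w[:, :p] + rho * w[:, [p]]  # correlated regressors

beta = np.full(p, 1.0 / np.sqrt(p))   # beta_1 = ... = beta_p, sum of squares 1
mu = np.exp(Z @ beta)

# NB(mu, theta) via Gamma-Poisson mixture: g ~ Gamma(theta, scale=1/theta),
# y | g ~ Poisson(mu * g) has mean mu and variance mu + mu^2 / theta.
g = rng.gamma(theta, 1.0 / theta, size=n)
y = rng.poisson(mu * g)
```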
Since the sample size affects the accuracy of the prediction, five representative values of the sample size are considered: n = 25, 50, 75, 100 and 200. The levels of correlation considered are ρ = 0.8, 0.9, 0.99 and 0.999, and the numbers of explanatory variables, which can also inflate the MSE, are p = 5 and p = 9. The mean squared error is estimated as in equation (37):

$$\mathrm{MSE}(\hat{\beta})=\frac{1}{R}\sum_{j=1}^{R}(\hat{\beta}_{j}-\beta)^{\prime}(\hat{\beta}_{j}-\beta), \tag{37}$$

where $\hat{\beta}_{j}$ is the estimated parameter vector in the $j$th replication and $\beta$ holds the actual parameter values. The experiment is replicated R = 5000 times. Simulated MSE values of the estimators for p = 5 and p = 9 under the different selected shrinkage parameters are shown in Table 1 and Table 2, respectively.
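A minimal sketch (ours) of the criterion in Equation (37); the array layout for the replicated estimates is our assumption:

```python
# Simulated-MSE criterion of Equation (37); `estimates` is an (R, p) array
# stacking the R replicated coefficient estimates.
import numpy as np

def sim_mse(estimates, beta_true):
    diff = estimates - beta_true
    return np.mean(np.sum(diff**2, axis=1))
```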
From Table 1 and Table 2, we observe that an increase in the correlation increases the estimated MSE values. The number of explanatory variables also affects the estimators: as p increases, the MSE increases. For a given degree of correlation, increasing the sample size decreases the MSE of the estimators. The proposed MJNBKL estimator has the smallest MSE compared with the other estimators for p = 5 and p = 9 and for the sample sizes 25, 50, 75, 100 and 200. Furthermore, in terms of the optimal shrinkage parameter k, the MJNBKL estimator shows its greatest superiority over the other estimators for k₃, which yields the lowest MSE in the simulation study.
5. Real-Life Application
We adopted two datasets to illustrate the theoretical findings of this research. The first is the English league football data taken from Alanaz & Algamal (2018) and Arum et al. (2022), which is count data. The data consist of 20 observations with one dependent and five independent variables: x₁ is the number of yellow cards, x₂ the number of red cards, x₃ the number of goals scored, x₄ the number of goals conceded, and x₅ the number of points earned. The response variable y denotes the number of matches won. We examined whether the Poisson regression model fits the data using the deviance test, and the results show that it does. We also observed that the data have a high level of multicollinearity, because the condition index for the dataset is greater than one. The data were analyzed with the same biasing parameters as in Section 3, and the results in Table 3 show that the proposed MJNBKL estimator has a smaller MSE than the other estimators in the study. The regression coefficients of all the estimators have the same signs.
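As a generic sketch (ours, not tied to the paper's exact computation), the collinearity diagnostic used for both datasets can be obtained as the condition number of the (possibly weighted) cross-product matrix:

```python
# Condition number of S = Z'PZ (pass P=None for the unweighted version).
import numpy as np

def condition_number(Z, P=None):
    S = Z.T @ (Z if P is None else P @ Z)
    lam = np.linalg.eigvalsh(S)
    return np.sqrt(lam.max() / lam.min())
```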
For the second real-life application, we used the aircraft damage dataset to examine the efficiency of the proposed estimator. Researchers have used this dataset to fit the Poisson regression model (Amin et al., 2020). Myers et al. (2012) show that there is a high level of multicollinearity in the dataset, with a condition number of 219.37. The performance of the estimators was assessed via the mean squared error (MSE), and the results are presented in Table 4.
6. Conclusions
This research examined the selection of a new biasing parameter for the negative binomial regression model. The modified jackknifed Kibria-Lukman (MJNBKL) estimator was proposed to reduce the bias of the jackknifed KL estimator. Theoretical comparisons were conducted to establish the superiority of the MJNBKL estimator over existing ones, namely the MLE, the negative binomial ridge, the jackknifed NB ridge and the negative binomial KL estimators, using the MSE criterion. The performance of the proposed estimator was also compared with the existing ones through a Monte Carlo simulation study and two real-life application datasets. The results from the simulation and the real-life applications show that the proposed MJNBKL estimator outperformed the other estimators by having the smallest MSE across all sample sizes n and all levels of correlation ρ for the four biasing parameters used, with k₃ being the optimal shrinkage parameter with the lowest MSE. This research can be further extended to other generalized linear models (GLMs), such as the logistic regression model and the gamma regression model, to mitigate the same problem in those models.
Acknowledgments
This research was sponsored by Afe Babalola University, Ado Ekiti, Nigeria, and the authors express their gratitude for their support.
References
- Abonazel, R.M.; Saber, A.A.; Awwad, F.A. Kibria-Lukman estimator for the Conway-Maxwell-Poisson regression model: Simulation and applications. Sci. Afr. 2023, 19, e01553.
- Akdeniz, D.E.; Akdeniz, F. Efficiency of the modified jackknifed Liu-type estimator. Stat. Papers 2010, 53, 265–280.
- Akram, M.N.; Abonazel, M.R.; Amin, M.; Kibria, B.M.G.; Afzal, N. A new Stein estimator for the zero-inflated negative binomial regression model. Concurr. Comput. Pract. Exp. 2022, 34, e7045.
- Alanaz, M.M.; Algamal, Z.Y. Proposed methods in estimating the ridge regression parameter in Poisson regression model. Electron. J. Appl. Stat. Anal. 2018, 11, 506–515.
- Algamal, Z.Y. Diagnostic in Poisson regression models. Electron. J. Appl. Stat. Anal. 2012, 5, 178–186.
- Algamal, Z.Y.; Abonazel, M.R.; Awwad, F.A.; Eldin, E.T. Modified jackknife ridge estimator for the Conway-Maxwell-Poisson model. Sci. Afr. 2023, 19, e01543.
- Alobaidi, N.N.; Shamany, R.E.; Algamal, Z.Y. A new ridge estimator for the negative binomial regression model. Thail. Stat. 2021, 19, 116–125.
- Amin, M.; Akram, M.N.; Amanullah, M. On the James-Stein estimator for the Poisson regression model. Commun. Stat. Simul. Comput. 2020, 51, 5596–5608.
- Amin, M.; Akram, M.N.; Majid, A. On the estimation of Bell regression model using ridge estimator. Commun. Stat. Simul. Comput. 2021, 52, 854–867.
- Arum, K.C.; Ugwuowo, F.I. Combining principal component and robust ridge estimators in linear regression model with multicollinearity and outlier. Concurr. Comput. Pract. Exp. 2022, 34, e6803.
- Arum, K.C.; Ugwuowo, F.I.; Oranye, H.E. Robust modified jackknife ridge estimator for the Poisson regression model with multicollinearity and outliers. Sci. Afr. 2022, 17, e01386.
- Arum, K.C.; Ugwuowo, F.I.; Oranye, H.E.; Alakija, T.O.; Ugah, T.E.; Asogwa, O.C. Combating outliers and multicollinearity in linear regression model using robust Kibria-Lukman mixed with principal component estimator, simulation and computation. Sci. Afr. 2023, 19, e01566.
- Babar, I.; Ayed, H.; Chand, S.; Suhail, M.; Khan, Y.A.; Marzouki, R. Modified Liu estimators in the linear regression model: An application to tobacco data. PLoS ONE 2021, 16, 13.
- Cameron, A.C.; Trivedi, P.K. Regression Analysis of Count Data, 2nd ed.; Econometric Society Monograph No. 53; Cambridge University Press: Cambridge, UK, 2013.
- Dawoud, I.; Abonazel, M.R.; Awwad, F.A. Generalized Kibria-Lukman estimator: Method, simulation and application. Front. Appl. Math. Stat. 2022.
- Dawoud, I.; Awwad, F.A.; Eldin, E.T.; Abonazel, M.R. New robust estimators for handling multicollinearity and outliers in the Poisson model: Simulation and applications. Axioms 2022, 11, 612.
- De Jong, P.; Heller, G.Z. Generalized Linear Models for Insurance Data; International Series on Actuarial Science; Cambridge University Press: Cambridge, UK, 2008.
- Farebrother, R.W. Further results on the mean squared error of ridge regression. J. R. Stat. Soc. B 1976, 38, 248–250.
- Hoerl, A.E.; Kennard, R.W. Ridge regression: Biased estimation for non-orthogonal problems. Technometrics 1970, 12, 55–67.
- Jabur, D.M.; Rashad, N.K.; Algamal, Z.Y. Jackknifed Liu-type estimator in the negative binomial regression model. Int. J. Nonlinear Anal. Appl. 2022, 13, 2675–2684.
- Khurana, M.; Chaubey, Y.P.; Chandra, S. Jackknifing the ridge regression estimator: A revisit. Commun. Stat. Theory Methods 2014, 43, 5249–5262.
- Kibria, B.M.G. Performance of some new ridge regression estimators. Commun. Stat. Simul. Comput. 2003, 32, 419–435.
- Kibria, B.M.G.; Banik, S. Some ridge regression estimators and their performances. J. Mod. Appl. Stat. Methods 2016, 15, 12.
- Kibria, B.M.G.; Lukman, A.F. A new ridge-type estimator for the linear regression model: Simulations and applications. Scientifica 2020, 2020, 1–16.
- Li, Y.; Yang, H. A new Liu-type estimator in linear regression model. Stat. Papers 2012, 53, 427–437.
- Liu, K. Using Liu-type estimator to combat collinearity. Commun. Stat. Theory Methods 2003, 32, 1009–1020.
- Lukman, A.F.; Ayinde, K.; Sek, S.K.; Adewuyi, E. A modified new two-parameter estimator in a linear regression model. Model. Simul. Eng. 2019, 2019, 6342702.
- Malinvaud, E. Statistical Methods of Econometrics, 3rd ed.; North-Holland: Amsterdam, The Netherlands, 1980.
- Månsson, K.; Shukur, G. A Poisson ridge regression estimator. Econ. Model. 2011, 28, 1475–1481.
- Månsson, K. On ridge estimators for the negative binomial regression model. Econ. Model. 2012, 29, 178–184.
- Myers, R.H.; Montgomery, D.C.; Vining, G.G.; Robinson, T.J. Generalized Linear Models: With Applications in Engineering and the Sciences; Wiley: New York, NY, USA, 2012; Volume 791.
- Oranye, H.E.; Ugwuowo, F.I. Modified jackknifed Kibria-Lukman estimator for the Poisson regression model. Concurr. Comput. Pract. Exp. 2021.
- Ozkale, M.R.; Kaciranlar, S. The restricted and unrestricted two-parameter estimators. Commun. Stat. Theory Methods 2007, 36, 2707–2725.
- Perveen, I.; Suhail, M. Bootstrap Liu estimators for Poisson regression model. Commun. Stat. Simul. Comput. 2021, 52, 2811–2821.
- Qasim, M.; Kibria, B.M.G.; Månsson, K.; Sjölander, P. A new Poisson Liu regression estimator: Method and application. J. Appl. Stat. 2020, 47, 2258–2271.
- Quenouille, M.H. Notes on bias in estimation. Biometrika 1956, 43, 353–360.
- Rao, C.R. Linear Statistical Inference and Its Applications, 2nd ed.; John Wiley: New York, NY, USA, 1973.
- Rashad, N.K.; Algamal, Z.Y. A new ridge estimator for the Poisson regression model. Iran. J. Sci. Technol. Trans. A Sci. 2019, 43, 2921–2928.
- Schaeffer, R.L.; Roi, L.D.; Wolfe, R.A. A ridge logistic estimator. Commun. Stat. Theory Methods 1984, 13, 99–113.
- Singh, B.; Chaubey, Y.P.; Dwivedi, T.D. An almost unbiased ridge estimator. Sankhya Ser. B 1986, 48, 342–346.
- Suhail, M.; Chand, S.; Kibria, B.M.G. Quantile-based estimation of biasing parameters in ridge regression model. Commun. Stat. Simul. Comput. 2020, 49, 1530782.
- Trenkler, G.; Toutenburg, H. Mean squared error matrix comparisons between biased estimators: An overview of recent results. Stat. Pap. 1990, 31, 165–179.
- Türkan, S.; Özel, G. A jackknifed estimator for the negative binomial regression model. Commun. Stat. Simul. Comput. 2018.
- Ugwuowo, F.I.; Oranye, H.E.; Arum, K.C. On the jackknifed Kibria-Lukman estimator for the linear regression model. Commun. Stat. Simul. Comput. 2021.
- Wasim, D.; Khan, S.A.; Suhail, M. Modified robust ridge M-estimators for linear regression models: An application to tobacco data. J. Stat. Comput. Simul. 2023, 1–22.
- Yang, H.; Chang, X. A new two-parameter estimator in linear regression. Commun. Stat. Theory Methods 2010, 39, 923–934.
Table 1. Estimated MSE values of the estimators for p = 5.

| n | ρ | MLE | Ridge (k₁) | Jackridge (k₁) | KL (k₁) | MJNBKL (k₁) | Ridge (k₂) | Jackridge (k₂) | KL (k₂) | MJNBKL (k₂) |
|---|---|---|---|---|---|---|---|---|---|---|
| 25 | 0.8 | 0.378 | 0.378 | 0.378 | 0.377 | 0.376 | 0.374 | 0.374 | 0.37 | 0.367 |
| | 0.9 | 0.392 | 0.392 | 0.392 | 0.391 | 0.389 | 0.387 | 0.386 | 0.381 | 0.377 |
| | 0.99 | 0.645 | 0.617 | 0.616 | 0.591 | 0.545 | 0.559 | 0.544 | 0.492 | 0.418 |
| | 0.999 | 3.172 | 2.676 | 2.616 | 2.254 | 1.653 | 1.998 | 1.752 | 1.241 | 1.205 |
| 50 | 0.8 | 0.357 | 0.357 | 0.357 | 0.356 | 0.355 | 0.353 | 0.353 | 0.349 | 0.346 |
| | 0.9 | 0.371 | 0.371 | 0.371 | 0.37 | 0.368 | 0.366 | 0.365 | 0.36 | 0.356 |
| | 0.99 | 0.624 | 0.596 | 0.595 | 0.57 | 0.524 | 0.538 | 0.523 | 0.471 | 0.397 |
| | 0.999 | 3.151 | 2.655 | 2.595 | 2.233 | 1.632 | 1.977 | 1.731 | 1.22 | 1.184 |
| 75 | 0.8 | 0.3558 | 0.3558 | 0.3558 | 0.3548 | 0.3538 | 0.3518 | 0.3518 | 0.3478 | 0.3448 |
| | 0.9 | 0.3698 | 0.3698 | 0.3698 | 0.3688 | 0.3668 | 0.3648 | 0.3638 | 0.3588 | 0.3548 |
| | 0.99 | 0.6228 | 0.5948 | 0.5938 | 0.5688 | 0.5228 | 0.5368 | 0.5218 | 0.4698 | 0.3958 |
| | 0.999 | 3.1498 | 2.6538 | 2.5938 | 2.2318 | 1.6308 | 1.9758 | 1.7298 | 1.2188 | 1.1828 |
| 100 | 0.8 | 0.3547 | 0.3547 | 0.3547 | 0.3537 | 0.3527 | 0.3507 | 0.3507 | 0.3467 | 0.3437 |
| | 0.9 | 0.3687 | 0.3687 | 0.3687 | 0.3677 | 0.3657 | 0.3637 | 0.3627 | 0.3577 | 0.3537 |
| | 0.99 | 0.6217 | 0.5937 | 0.5927 | 0.5677 | 0.5217 | 0.5357 | 0.5207 | 0.4687 | 0.3947 |
| | 0.999 | 3.1487 | 2.6527 | 2.5927 | 2.2307 | 1.6297 | 1.9747 | 1.7287 | 1.2177 | 1.1817 |
| 200 | 0.8 | 0.3303 | 0.3303 | 0.3303 | 0.3293 | 0.3283 | 0.3263 | 0.3263 | 0.3223 | 0.3193 |
| | 0.9 | 0.3443 | 0.3443 | 0.3443 | 0.3433 | 0.3413 | 0.3393 | 0.3383 | 0.3333 | 0.3293 |
| | 0.99 | 0.5973 | 0.5693 | 0.5683 | 0.5433 | 0.4973 | 0.5113 | 0.4963 | 0.4443 | 0.3703 |
| | 0.999 | 3.1243 | 2.6283 | 2.5683 | 2.2063 | 1.6053 | 1.9503 | 1.7043 | 1.1933 | 1.1573 |

| n | ρ | MLE | Ridge (k₃) | Jackridge (k₃) | KL (k₃) | MJNBKL (k₃) | Ridge (k₄) | Jackridge (k₄) | KL (k₄) | MJNBKL (k₄) |
|---|---|---|---|---|---|---|---|---|---|---|
| 25 | 0.8 | 0.368 | 0.368 | 0.356 | 0.344 | 0.321 | 0.374 | 0.374 | 0.37 | 0.367 |
| | 0.9 | 0.372 | 0.372 | 0.362 | 0.35 | 0.332 | 0.387 | 0.386 | 0.381 | 0.377 |
| | 0.99 | 0.617 | 0.617 | 0.616 | 0.501 | 0.5 | 0.559 | 0.544 | 0.492 | 0.418 |
| | 0.999 | 2.676 | 2.676 | 2.616 | 1.652 | 1.058 | 1.998 | 1.752 | 1.241 | 1.205 |
| 50 | 0.8 | 0.368 | 0.347 | 0.335 | 0.323 | 0.3 | 0.353 | 0.353 | 0.349 | 0.346 |
| | 0.9 | 0.372 | 0.351 | 0.341 | 0.329 | 0.311 | 0.366 | 0.365 | 0.36 | 0.356 |
| | 0.99 | 0.617 | 0.596 | 0.595 | 0.48 | 0.479 | 0.538 | 0.523 | 0.471 | 0.397 |
| | 0.999 | 2.676 | 2.655 | 2.595 | 1.631 | 1.037 | 1.977 | 1.731 | 1.22 | 1.184 |
| 75 | 0.8 | 0.368 | 0.3458 | 0.3338 | 0.3218 | 0.2988 | 0.3518 | 0.3518 | 0.3478 | 0.3448 |
| | 0.9 | 0.372 | 0.3498 | 0.3398 | 0.3278 | 0.3098 | 0.3648 | 0.3638 | 0.3588 | 0.3548 |
| | 0.99 | 0.617 | 0.5948 | 0.5938 | 0.4788 | 0.4778 | 0.5368 | 0.5218 | 0.4698 | 0.3958 |
| | 0.999 | 2.676 | 2.6538 | 2.5938 | 1.6298 | 1.0358 | 1.9758 | 1.7298 | 1.2188 | 1.1828 |
| 100 | 0.8 | 0.368 | 0.3447 | 0.3327 | 0.3207 | 0.2977 | 0.3507 | 0.3507 | 0.3467 | 0.3437 |
| | 0.9 | 0.372 | 0.3487 | 0.3387 | 0.3267 | 0.3087 | 0.3637 | 0.3627 | 0.3577 | 0.3537 |
| | 0.99 | 0.617 | 0.5937 | 0.5927 | 0.4777 | 0.4767 | 0.5357 | 0.5207 | 0.4687 | 0.3947 |
| | 0.999 | 2.676 | 2.6527 | 2.5927 | 1.6287 | 1.0347 | 1.9747 | 1.7287 | 1.2177 | 1.1817 |
| 200 | 0.8 | 0.3303 | 0.3203 | 0.3083 | 0.2963 | 0.2733 | 0.3263 | 0.3263 | 0.3223 | 0.3193 |
| | 0.9 | 0.3443 | 0.3243 | 0.3143 | 0.3023 | 0.2843 | 0.3393 | 0.3383 | 0.3333 | 0.3293 |
| | 0.99 | 0.5973 | 0.5693 | 0.5683 | 0.4533 | 0.4523 | 0.5113 | 0.4963 | 0.4443 | 0.3703 |
| | 0.999 | 3.1243 | 2.6283 | 2.5683 | 1.6043 | 1.0103 | 1.9503 | 1.7043 | 1.1933 | 1.1573 |
Table 2. Estimated MSE values of the estimators for p = 9.

| n | ρ | MLE | Ridge (k₁) | Jackridge (k₁) | KL (k₁) | MJNBKL (k₁) | Ridge (k₂) | Jackridge (k₂) | KL (k₂) | MJNBKL (k₂) |
|---|---|---|---|---|---|---|---|---|---|---|
| 25 | 0.8 | 1.143 | 1.143 | 1.143 | 1.142 | 1.141 | 1.139 | 1.139 | 1.135 | 1.132 |
| | 0.9 | 1.157 | 1.157 | 1.157 | 1.156 | 1.154 | 1.152 | 1.151 | 1.146 | 1.142 |
| | 0.99 | 1.41 | 1.382 | 1.381 | 1.356 | 1.31 | 1.324 | 1.309 | 1.257 | 1.183 |
| | 0.999 | 3.937 | 3.441 | 3.381 | 3.019 | 2.418 | 2.763 | 2.517 | 2.006 | 1.97 |
| 50 | 0.8 | 1.122 | 1.122 | 1.122 | 1.121 | 1.12 | 1.118 | 1.118 | 1.114 | 1.111 |
| | 0.9 | 1.136 | 1.136 | 1.136 | 1.135 | 1.133 | 1.131 | 1.13 | 1.125 | 1.121 |
| | 0.99 | 1.389 | 1.361 | 1.36 | 1.335 | 1.289 | 1.303 | 1.288 | 1.236 | 1.162 |
| | 0.999 | 3.916 | 3.42 | 3.36 | 2.998 | 2.397 | 2.742 | 2.496 | 1.985 | 1.949 |
| 75 | 0.8 | 1.1208 | 1.1208 | 1.1208 | 1.1198 | 1.1188 | 1.1168 | 1.1168 | 1.1128 | 1.1098 |
| | 0.9 | 1.1348 | 1.1348 | 1.1348 | 1.1338 | 1.1318 | 1.1298 | 1.1288 | 1.1238 | 1.1198 |
| | 0.99 | 1.3878 | 1.3598 | 1.3588 | 1.3338 | 1.2878 | 1.3018 | 1.2868 | 1.2348 | 1.1608 |
| | 0.999 | 3.9148 | 3.4188 | 3.3588 | 2.9968 | 2.3958 | 2.7408 | 2.4948 | 1.9838 | 1.9478 |
| 100 | 0.8 | 1.1197 | 1.1197 | 1.1197 | 1.1187 | 1.1177 | 1.1157 | 1.1157 | 1.1117 | 1.1087 |
| | 0.9 | 1.1337 | 1.1337 | 1.1337 | 1.1327 | 1.1307 | 1.1287 | 1.1277 | 1.1227 | 1.1187 |
| | 0.99 | 1.3867 | 1.3587 | 1.3577 | 1.3327 | 1.2867 | 1.3007 | 1.2857 | 1.2337 | 1.1597 |
| | 0.999 | 3.9137 | 3.4177 | 3.3577 | 2.9957 | 2.3947 | 2.7397 | 2.4937 | 1.9827 | 1.9467 |
| 200 | 0.8 | 1.0953 | 1.0953 | 1.0953 | 1.0943 | 1.0933 | 1.0913 | 1.0913 | 1.0873 | 1.0843 |
| | 0.9 | 1.1093 | 1.1093 | 1.1093 | 1.1083 | 1.1063 | 1.1043 | 1.1033 | 1.0983 | 1.0943 |
| | 0.99 | 1.3623 | 1.3343 | 1.3333 | 1.3083 | 1.2623 | 1.2763 | 1.2613 | 1.2093 | 1.1353 |
| | 0.999 | 3.8893 | 3.3933 | 3.3333 | 2.9713 | 2.3703 | 2.7153 | 2.4693 | 1.9583 | 1.9223 |

| n | ρ | MLE | Ridge (k₃) | Jackridge (k₃) | KL (k₃) | MJNBKL (k₃) | Ridge (k₄) | Jackridge (k₄) | KL (k₄) | MJNBKL (k₄) |
|---|---|---|---|---|---|---|---|---|---|---|
| 25 | 0.8 | 1.143 | 1.133 | 1.121 | 1.109 | 1.086 | 1.139 | 1.139 | 1.135 | 1.132 |
| | 0.9 | 1.157 | 1.137 | 1.127 | 1.115 | 1.097 | 1.152 | 1.151 | 1.146 | 1.142 |
| | 0.99 | 1.41 | 1.382 | 1.381 | 1.266 | 1.265 | 1.324 | 1.309 | 1.257 | 1.183 |
| | 0.999 | 3.937 | 3.441 | 3.381 | 2.417 | 1.823 | 2.763 | 2.517 | 2.006 | 1.97 |
| 50 | 0.8 | 1.122 | 1.112 | 1.1 | 1.088 | 1.065 | 1.118 | 1.118 | 1.114 | 1.111 |
| | 0.9 | 1.136 | 1.116 | 1.106 | 1.094 | 1.076 | 1.131 | 1.13 | 1.125 | 1.121 |
| | 0.99 | 1.389 | 1.361 | 1.36 | 1.245 | 1.244 | 1.303 | 1.288 | 1.236 | 1.162 |
| | 0.999 | 3.916 | 3.42 | 3.36 | 2.396 | 1.802 | 2.742 | 2.496 | 1.985 | 1.949 |
| 75 | 0.8 | 1.1208 | 1.1108 | 1.0988 | 1.0868 | 1.0638 | 1.1168 | 1.1168 | 1.1128 | 1.1098 |
| | 0.9 | 1.1348 | 1.1148 | 1.1048 | 1.0928 | 1.0748 | 1.1298 | 1.1288 | 1.1238 | 1.1198 |
| | 0.99 | 1.3878 | 1.3598 | 1.3588 | 1.2438 | 1.2428 | 1.3018 | 1.2868 | 1.2348 | 1.1608 |
| | 0.999 | 3.9148 | 3.4188 | 3.3588 | 2.3948 | 1.8008 | 2.7408 | 2.4948 | 1.9838 | 1.9478 |
| 100 | 0.8 | 1.1197 | 1.1097 | 1.0977 | 1.0857 | 1.0627 | 1.1157 | 1.1157 | 1.1117 | 1.1087 |
| | 0.9 | 1.1337 | 1.1137 | 1.1037 | 1.0917 | 1.0737 | 1.1287 | 1.1277 | 1.1227 | 1.1187 |
| | 0.99 | 1.3867 | 1.3587 | 1.3577 | 1.2427 | 1.2417 | 1.3007 | 1.2857 | 1.2337 | 1.1597 |
| | 0.999 | 3.9137 | 3.4177 | 3.3577 | 2.3937 | 1.7997 | 2.7397 | 2.4937 | 1.9827 | 1.9467 |
| 200 | 0.8 | 1.0953 | 1.0853 | 1.0733 | 1.0613 | 1.0383 | 1.0913 | 1.0913 | 1.0873 | 1.0843 |
| | 0.9 | 1.1093 | 1.0893 | 1.0793 | 1.0673 | 1.0493 | 1.1043 | 1.1033 | 1.0983 | 1.0943 |
| | 0.99 | 1.3623 | 1.3343 | 1.3333 | 1.2183 | 1.2173 | 1.2763 | 1.2613 | 1.2093 | 1.1353 |
| | 0.999 | 3.8893 | 3.3933 | 3.3333 | 2.3693 | 1.7753 | 2.7153 | 2.4693 | 1.9583 | 1.9223 |
Table 3. Regression coefficients of all the estimators and their corresponding MSE values (English league football data).

| Coef. | MLE | RIDGE | JRIDGE | KL | MJNBKL |
|---|---|---|---|---|---|
| Intercept | 1.0955 | 0.5763 | 0.4789 | 0.55674 | 0.4470 |
| x₁ | 0.00534 | 0.00534 | 0.00643 | 0.0053810 | 0.00971 |
| x₂ | -0.0154 | -0.0154 | -0.00536 | -0.014483 | -0.0093 |
| x₃ | 0.01389 | 0.01389 | 0.01522 | 0.013970 | 0.01735 |
| x₄ | 0.00078 | 0.00079 | 0.00368 | 0.000942 | 0.00866 |
| x₅ | 0.02874 | 0.02875 | 0.02554 | 0.028586 | 0.01922 |
| MSE | 0.00803 | 0.0051 | 0.0039 | 0.00189 | 0.00182 |
Table 4. Regression coefficients adopting the aircraft damage data.

| Coeff. | MLE | RIDGE | JRIDGE | KL | MJNBKL |
|---|---|---|---|---|---|
| Intercept | -0.4060 | -0.16755 | -0.0221 | -0.10676 | -0.18840 |
| x₁ | 0.568772 | 0.379945 | 0.3212759 | 0.3905872 | -0.0739795 |
| x₂ | 0.165425 | 0.170544 | 0.169616 | 0.167538 | 0.205048 |
| x₃ | -0.01352 | -0.015299 | -0.016419 | -0.0157773 | -0.0150024 |
| MSE | 1.029008 | 0.272690 | 0.23568 | 0.2248674 | 0.0750591 |