
Variable Selection and Parameter Estimation with M-Estimation Method for Ultra-high Dimensional Linear Model

Applied Mathematics (應用數學), 2021, No. 4

ZHU Yanling(朱艷玲),WANG Kai(汪凱),ZHAO Mingtao(趙明濤)

(School of Statistics and Applied Mathematics,Anhui University of Finance and Economics,Bengbu 233000,China)

Abstract: In this paper, variable selection and parameter estimation for the linear regression model in the ultra-high dimensional case are considered. By unifying least squares, least absolute deviation, quantile regression and Huber regression into a general framework, the proposed penalized likelihood M-estimator is proved to have good large-sample properties. In the numerical simulations, variable selection and parameter estimation perform best when forward regression (FR) screening is combined with the local linear approximation (LLA) penalty. In the ultra-high dimensional case, our general method is robust and effective for both variable selection and parameter estimation.

Key words: Ultra-high dimensionality; M-estimation; Penalized likelihood; Variable selection

1.Introduction

For the classical linear regression model $Y = X\beta + \varepsilon$, $Y = (y_1, y_2, \ldots, y_n)^T$ is the response vector, $X = (X_1, X_2, \ldots, X_{p_n}) = (x_1, x_2, \ldots, x_n)^T = (x_{ij})_{n\times p_n}$ is an $n\times p_n$ design matrix, and $\varepsilon = (\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_n)^T$ is a random error vector. When the dimension $p_n$ is high, it is often assumed that only a small number of the predictors contribute to the response, which ideally amounts to assuming that the parameter vector $\beta$ is sparse. Under sparsity, variable selection improves the accuracy of estimation by effectively identifying the subset of important predictors, and also enhances the interpretability of the model.
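To make the sparse ultra-high dimensional setting concrete, the following sketch generates data from such a model; the dimensions, sparsity level and coefficient values are illustrative choices, not those used in the simulations of Section 3.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p_n = 100, 1000   # illustrative: many more predictors than observations
s = 5                # only a handful of predictors are truly active (sparsity)

# Design matrix X (n x p_n) and a sparse coefficient vector beta
X = rng.standard_normal((n, p_n))
beta = np.zeros(p_n)
beta[:s] = [3.0, 1.5, 2.0, 1.0, 0.5]   # nonzero only on the first s coordinates

eps = rng.standard_normal(n)   # random error vector
Y = X @ beta + eps             # the linear model Y = X beta + eps
```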

In this paper, we assume that the function $\rho$ is convex; hence the objective function is convex and any local minimizer obtained is a global minimizer. In Section 2, we discuss some theoretical properties of the LLA estimator. In Section 3, we provide a simple and efficient algorithm and report the numerical simulation results. The proofs are given in Section 4.
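For orientation, penalized M-estimation objectives of this kind typically take the form displayed below. The exact statement of (1.1) is not reproduced in this excerpt, so the display is an assumed standard formulation with a folded-concave penalty $p_\lambda$, together with the tangent-line approximation that defines the LLA estimator.

```latex
% Assumed generic form of the penalized M-estimation objective (not a verbatim copy of (1.1)):
\hat{\beta} \in \arg\min_{\beta\in\mathbb{R}^{p_n}}
  \sum_{i=1}^{n} \rho\bigl(y_i - x_i^{T}\beta\bigr)
  + n\sum_{j=1}^{p_n} p_{\lambda}\bigl(|\beta_j|\bigr).

% LLA replaces the penalty by its tangent line at an initial estimate \beta^{(0)},
% turning the problem into a weighted-\ell_1 convex program:
p_{\lambda}\bigl(|\beta_j|\bigr) \;\approx\;
  p_{\lambda}\bigl(|\beta_j^{(0)}|\bigr)
  + p_{\lambda}'\bigl(|\beta_j^{(0)}|\bigr)\bigl(|\beta_j| - |\beta_j^{(0)}|\bigr).
```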

2.Main Results

3.Simulation Results

In this section we evaluate the performance of the M-estimator proposed in (1.1) by simulation studies.
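The four performance indicators referred to throughout this section (estimation error, prediction error, correct exclusion of unimportant variables, and omission of important variables) can be computed as in the following sketch. The paper's exact definitions are not reproduced in this excerpt, so the formulas and names below are illustrative assumptions.

```python
import numpy as np

def evaluate(beta_hat, beta_true, X_test, Y_test, tol=1e-8):
    """Illustrative versions of the four indicators discussed in this section."""
    est_err  = np.sum(np.abs(beta_hat - beta_true))           # 1) estimation error (L1)
    pred_err = np.mean(np.abs(Y_test - X_test @ beta_hat))    # 2) prediction error
    zero_set    = np.abs(beta_true) <= tol
    nonzero_set = ~zero_set
    correct_excl = np.mean(np.abs(beta_hat[zero_set]) <= tol)     # 3) unimportant vars excluded
    missed_true  = np.mean(np.abs(beta_hat[nonzero_set]) <= tol)  # 4) important vars missed
    return est_err, pred_err, correct_excl, missed_true
```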

For the RSIS+LLA method, the first-step selection tends to be more robust, and the method performs well on the estimation-error and prediction-error indicators, but it loses some ability to identify important variables, so it easily omits important variables and incurs large errors. In fact, we can keep the selection and estimation robust by using the LAD loss function in the second-step variable selection.
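As a concrete illustration of that last remark, one way to carry out the second-step fit with the LAD loss is median (quantile $\tau = 0.5$) regression on the screened predictors. The sketch below uses statsmodels for this and omits the LLA penalty weights, so it is a simplified assumption rather than the paper's exact procedure.

```python
import numpy as np
import statsmodels.api as sm

def lad_refit(Y, X, screened_idx):
    """Refit on the screened predictors with the LAD loss (median regression).

    Only a sketch of the idea mentioned above: an absolute-deviation loss in the
    second step keeps selection robust.  LLA penalty weights are omitted here.
    """
    X_sub = sm.add_constant(X[:, screened_idx])
    res = sm.QuantReg(Y, X_sub).fit(q=0.5)   # q = 0.5 gives the LAD / median fit
    return res.params
```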

For the HOLP+LLA method, its performance is almost equal to that of FR+LLA on three indicators, namely prediction error, correct exclusion of unimportant variables, and omission of important variables, but it is slightly worse on the first indicator, the estimation error.
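For reference, HOLP screening is usually implemented as the high-dimensional ordinary least-squares projection $\tilde\beta = X^T(XX^T)^{-1}Y$, keeping the predictors with the largest $|\tilde\beta_j|$. The sketch below follows that standard recipe and is not taken from the paper itself.

```python
import numpy as np

def holp_screen(Y, X, d):
    """HOLP screening: beta_tilde = X^T (X X^T)^{-1} Y, keep the d largest |entries|.

    d is a tuning choice, e.g. of order n / log n; this assumes the usual HOLP recipe.
    """
    beta_tilde = X.T @ np.linalg.solve(X @ X.T, Y)   # uses only the n x n Gram matrix
    return np.argsort(-np.abs(beta_tilde))[:d]
```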

Tab.1 Numerical Results for $\varepsilon \sim N(0,1)$

Example 2 We consider the case of the LAD loss function and the error term $\varepsilon \sim t_5$. The estimated results are shown in Tab.2.

Tab.2 Numerical Results for $\varepsilon \sim t_5$

When the error term follows this heavy-tailed distribution, all six methods in Tab.2 perform better than under the standard normal error on the first indicator (estimation error) and the third indicator (correct exclusion of unimportant variables). On the second indicator, the prediction error, they are slightly worse, and the fourth indicator is essentially unchanged. The overall conclusion is consistent with Example 1, i.e. the FR+LLA method performs slightly better than the others.

Example 3 We consider the case of the LAD loss function and the error term $\varepsilon \sim 0.9N(0,1) + 0.1N(0,9)$. The estimated results are shown in Tab.3.

Tab.3 Numerical Results for $\varepsilon \sim 0.9N(0,1) + 0.1N(0,9)$
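For readers reproducing Examples 2 and 3, the two non-Gaussian error distributions can be simulated as follows. The sample size is an illustrative choice, and the second parameter of $N(0,9)$ is read as a variance, the usual convention in this literature.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100   # illustrative sample size

# Example 2: heavy-tailed errors, eps ~ t_5
eps_t5 = rng.standard_t(df=5, size=n)

# Example 3: contaminated normal, eps ~ 0.9 N(0,1) + 0.1 N(0,9)
is_outlier = rng.random(n) < 0.1
eps_mix = np.where(is_outlier,
                   rng.normal(0.0, 3.0, n),   # N(0, 9) component (sd = 3)
                   rng.normal(0.0, 1.0, n))   # N(0, 1) component
```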

Synthesizing the simulation results of Examples 1 to 3, we see that when the number of explanatory variables exceeds the sample size, the proposed scheme, which uses the forward regression (FR) method for variable screening in the first step and the proposed local linear approximation (LLA) penalty in the second step, performs quite well on all four indicators. This also shows that, for ultra-high dimensional data models, the FR+LLA screening method we provide is feasible and effective, and can be applied to a wider range of data with satisfactory results.
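Assuming FR denotes the usual greedy forward-regression screener (the abbreviation is not expanded in this excerpt), the first-step screening can be sketched as below. The implementation is deliberately simple and unoptimized, refitting by least squares at every candidate step.

```python
import numpy as np

def forward_regression_screen(Y, X, d):
    """Greedy forward regression: repeatedly add the predictor that most reduces
    the residual sum of squares, until d predictors have been selected."""
    n, p = X.shape
    selected = []
    for _ in range(d):
        best_j, best_rss = None, np.inf
        for j in range(p):
            if j in selected:
                continue
            Xs = X[:, selected + [j]]
            coef, *_ = np.linalg.lstsq(Xs, Y, rcond=None)
            rss = np.sum((Y - Xs @ coef) ** 2)
            if rss < best_rss:
                best_rss, best_j = rss, j
        selected.append(best_j)
    return selected
```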

4.Proofs of Main Results
