Automatic variable selection in a linear model on massive data
From MaRDI portal
Publication:5042096
DOI: 10.1080/03610918.2020.1752377
OpenAlex: W3024069037
MaRDI QID: Q5042096
Publication date: 18 October 2022
Published in: Communications in Statistics - Simulation and Computation
Full work available at URL: https://doi.org/10.1080/03610918.2020.1752377
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- A partially linear framework for massive heterogeneous data
- Divide and conquer local average regression
- Distributed testing and estimation under sparse high dimensional models
- Test by adaptive Lasso quantile method for real-time detection of a change-point
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Additive partially linear models for massive heterogeneous data
- High-dimensional integrative analysis with homogeneity and sparsity recovery
- Block average quantile regression for massive dataset
- Composite versus model-averaged quantile regression
- Computational and statistical analyses for robust non-convex sparse regularized regression problem
- Panel data quantile regression with grouped fixed effects
- Distributed inference for quantile regression processes
- Penalized expectile regression: an alternative to penalized quantile regression
- Adaptive LASSO model selection in a multiphase quantile regression
- A split-and-conquer approach for analysis of
- Sampling Lasso quantile regression for large-scale data
- Model Selection for High-Dimensional Quadratic Regression via Regularization
- Feasible algorithm for linear mixed model for massive data
- A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models