Large-scale distributed non-negative sparse coding and sparse dictionary learning
Proceedings of the 18th ACM SIGKDD international conference on Knowledge discovery and data mining
Journal of Computational Neuroscience
Bagging ensemble selection for regression
AI'12 Proceedings of the 25th Australasian joint conference on Advances in Artificial Intelligence
A fresh perspective: learning to sparsify for detection in massive noisy sensor networks
Proceedings of the 12th international conference on Information processing in sensor networks
FeaFiner: biomarker identification from medical data through feature generalization and selection
Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining
The Journal of Machine Learning Research
Sparsity regret bounds for individual sequences in online linear regression
The Journal of Machine Learning Research
Interquantile shrinkage and variable selection in quantile regression
Computational Statistics & Data Analysis
Nonparametric sparsity and regularization
The Journal of Machine Learning Research
Modern statistics deals with large and complex data sets, and consequently with models containing a large number of parameters. This book presents a detailed account of recently developed approaches, including the Lasso and its variants for various models, boosting methods, undirected graphical modeling, and procedures controlling false positive selections. A special characteristic of the book is that it combines comprehensive mathematical theory on high-dimensional statistics with methodology, algorithms, and illustrations on real data examples. This in-depth approach highlights the methods' great potential and practical applicability in a variety of settings. As such, it is a valuable resource for researchers, graduate students, and experts in statistics, applied mathematics, and computer science.
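To make the Lasso mentioned above concrete, here is a minimal sketch of Lasso regression solved by coordinate descent with soft-thresholding, one of its standard algorithms. This is an illustrative pure-Python implementation, not code from the book; the function names, the fixed iteration count, and the tiny example data are all assumptions chosen for clarity.

```python
def soft_threshold(rho, alpha):
    # Proximal operator of the L1 penalty: shrinks rho toward zero by alpha.
    if rho > alpha:
        return rho - alpha
    if rho < -alpha:
        return rho + alpha
    return 0.0

def lasso_cd(X, y, alpha, n_iter=200):
    # Coordinate-descent Lasso: minimize (1/2n)||y - Xw||^2 + alpha*||w||_1.
    # X is a list of rows, y a list of targets; no intercept for simplicity.
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of feature j with the partial residual
            # (the residual with feature j's own contribution added back).
            rho = sum(
                X[i][j] * (y[i]
                           - sum(X[i][k] * w[k] for k in range(p))
                           + X[i][j] * w[j])
                for i in range(n)
            ) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            # Soft-thresholding sets small coefficients exactly to zero,
            # which is what makes the Lasso perform variable selection.
            w[j] = soft_threshold(rho, alpha) / z if z > 0 else 0.0
    return w

# Toy example: y depends only on the first feature (y = 2 * x0).
X = [[1, 1], [2, -1], [3, 1], [4, -1]]
y = [2, 4, 6, 8]
w = lasso_cd(X, y, alpha=0.05)
```

On this toy problem the penalty zeroes out the irrelevant second coefficient while the first stays close to its true value of 2, illustrating the sparsity-inducing selection the blurb refers to.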