组稀疏模型及其算法综述
Survey on Group Sparse Models and Algorithms
Abstract
Sparsity and group sparsity have important applications in statistics, signal processing, and machine learning. This paper summarizes and analyzes the differences and connections between various group sparse models, and compares the models' variable selection ability, variable group selection ability, variable selection consistency, and variable group selection consistency. It also surveys the algorithms for solving group sparse models and points out the advantages and disadvantages of each algorithm. Finally, future research directions for group sparse models are discussed.
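For context (a minimal sketch, not taken from the paper itself), the two canonical formulations behind the keywords are the lasso of Tibshirani (reference 1) and the group lasso of Yuan and Lin (reference 2). Assuming standard notation, y denotes the response vector, X the design matrix, β the coefficient vector partitioned into G non-overlapping groups β_1, …, β_G of sizes p_1, …, p_G, and λ ≥ 0 a regularization parameter:

\hat{\beta}_{\mathrm{lasso}} = \arg\min_{\beta}\ \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1

\hat{\beta}_{\mathrm{group}} = \arg\min_{\beta}\ \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2 + \lambda \sum_{g=1}^{G} \sqrt{p_g}\,\lVert \beta_g \rVert_2

The ℓ_1 penalty drives individual coefficients to zero, whereas the sum of unsquared group-wise ℓ_2 norms drives entire groups of coefficients to zero at once; replacing the inner ℓ_2 norm with an ℓ_∞ norm yields the ℓ_(1,∞) variants compared in references 7 and 8.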
Source
电子学报 (Acta Electronica Sinica), 2015, 43(4): 776-782 [Core collection]

DOI
10.3969/j.issn.0372-2112.2015.04.021
Keywords
sparsity; group sparsity; variable selection; variable group selection; consistency
Address
Institute of Automation, China University of Petroleum (Beijing), Beijing 102249, China

Language
Chinese
Document type
Review

ISSN
0372-2112

Subject
Automation technology; computer technology
Funding
National Natural Science Foundation of China

Accession number
CSCD:5442859
References (75 in total; the first 20 are listed below)
1. Tibshirani R. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B, 1996, 58(1): 267-288. (Cited in CSCD: 953)
2. Yuan M. Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society: Series B, 2006, 68(1): 49-67. (Cited in CSCD: 118)
3. Turlach B A. Simultaneous variable selection. Technometrics, 2005, 47(3): 349-363. (Cited in CSCD: 9)
4. Tropp J A. Algorithms for simultaneous sparse approximation. Signal Processing, 2006, 86(3): 589-602. (Cited in CSCD: 77)
5. Quattoni A. Transfer learning for image classification with sparse prototype representations. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2008: 1-8. (Cited in CSCD: 2)
6. Schmidt M W. Structure learning in random fields for heart motion abnormality detection. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2008: 1-8. (Cited in CSCD: 1)
7. Quattoni A. An efficient projection for L_(1,∞) regularization. Proceedings of the 26th Annual International Conference on Machine Learning, 2009: 857-864. (Cited in CSCD: 4)
8. Vogt J E. The group-Lasso: l_(1,∞) regularization versus l_(1,2) regularization. Proceedings of the 32nd DAGM Conference on Pattern Recognition, 2010: 252-261. (Cited in CSCD: 2)
9. Huang J. The benefit of group sparsity. The Annals of Statistics, 2010, 38(4): 1978-2004. (Cited in CSCD: 9)
10. Sra S. Fast projections onto ℓ_(1,q)-norm balls for grouped feature selection. Lecture Notes in Computer Science, 2011: 305-317. (Cited in CSCD: 1)
11. Kowalski M. Sparse regression using mixed norms. Applied and Computational Harmonic Analysis, 2009, 27(3): 303-324. (Cited in CSCD: 4)
12. Rakotomamonjy A. Lp-Lq penalty for sparse linear and sparse multiple kernel multi-task learning. IEEE Transactions on Neural Networks, 2011, 22(8): 1307-1320. (Cited in CSCD: 11)
13. Simon N. Standardization and the group lasso penalty. Statistica Sinica, 2012, 22(3): 983-1001. (Cited in CSCD: 1)
14. Bunea F. The group square-root lasso: theoretical properties and fast algorithms. IEEE Transactions on Information Theory, 2014, 60(2): 1313-1325. (Cited in CSCD: 1)
15. Belloni A. Square-root lasso: pivotal recovery of sparse signals via conic programming. Biometrika, 2011, 98(4): 791-806. (Cited in CSCD: 4)
16. Wang H. A note on adaptive group lasso. Computational Statistics and Data Analysis, 2008, 52(12): 5277-5286. (Cited in CSCD: 8)
17. Wei F. Consistent group selection in high-dimensional linear regression. Bernoulli, 2010, 16(4): 1369-1384. (Cited in CSCD: 6)
18. Zou H. The adaptive lasso and its oracle properties. Journal of the American Statistical Association, 2006, 101(476): 1418-1429. (Cited in CSCD: 209)
19. Zhang H H. Adaptive lasso for Cox's proportional hazards model. Biometrika, 2007, 94(3): 691-703. (Cited in CSCD: 21)
20. Huang J. Adaptive lasso for sparse high-dimensional regression models. Statistica Sinica, 2008, 18(4): 1603-1618. (Cited in CSCD: 13)