A survey of ensemble learning approaches

XU Jiwei, YANG Yun


    Author biography: XU Jiwei (1995-), male, born in Hunan; master's student whose research interests include machine learning and ensemble learning. E-mail: 420076887@qq.com.
    Corresponding author: YANG Yun, yangyun@ynu.edu.cn
  • Funding:

    National Natural Science Foundation of China (61876166, 61663046).

  • Abstract: The learning process of machine learning can be viewed as a search through a hypothesis space for a model with strong generalization ability and high robustness, and finding a suitable model in this space is difficult. Ensemble learning, a family of combinatorial optimization learning methods, not only combines multiple simple models into a composite model with better performance, but also lets researchers design combination schemes tailored to specific machine learning problems to obtain more powerful solutions. This paper reviews the history of ensemble learning, summarizes its three key strategies (diversity generation, model training, and model combination), describes current application scenarios of ensemble learning, and concludes with an analysis and outlook of future research directions.
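The abstract's core idea, combining many simple models into a stronger one, can be illustrated with a minimal, standard-library-only Python sketch of Bagging [17]: each member is a threshold classifier (a "decision stump") trained on a bootstrap resample of the data, and member outputs are combined by majority vote. The toy dataset and all names below are illustrative assumptions of this sketch, not part of the surveyed work.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical toy data: 1-D points in [0, 1), labeled 1 when x > 0.5.
xs = [random.random() for _ in range(200)]
ys = [1 if x > 0.5 else 0 for x in xs]

def train_stump(sample):
    """Fit a threshold classifier by brute-force search over a small grid."""
    best_t, best_acc = 0.0, 0.0
    for t in (i / 20 for i in range(21)):
        acc = sum((x > t) == (y == 1) for x, y in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def bagging(xs, ys, n_models=11):
    """Bagging: train each stump on a bootstrap resample (with replacement)."""
    pairs = list(zip(xs, ys))
    return [train_stump(random.choices(pairs, k=len(pairs))) for _ in range(n_models)]

def predict(ensemble, x):
    """Combine member outputs by simple majority vote."""
    votes = [1 if x > t else 0 for t in ensemble]
    return Counter(votes).most_common(1)[0][0]

ensemble = bagging(xs, ys)
accuracy = sum(predict(ensemble, x) == y for x, y in zip(xs, ys)) / len(xs)
print(f"ensemble training accuracy: {accuracy:.2f}")
```

Majority voting helps because members trained on different resamples tend to make partly independent errors, so the combined vote can be more reliable than any single member, the effect analyzed by Hansen and Salamon [7].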
  • [1] TANG W,ZHOU Z H.Bagging-based selective clusterer ensemble[J].Journal of Software,2005,16(4):496-502.
    [2] DIETTERICH T G.Ensemble methods in machine learning[J].Proc International Workshop on Multiple Classifier Systems,2000,1857(1):1-15.
    [3] ZHANG C X,ZHANG J S.A survey of selective ensemble learning algorithms[J].Chinese Journal of Computers,2011,34(8):1399-1410.
    [4] KOHAVI R,WOLPERT D.Bias plus variance decomposition for zero-one loss functions[C].Thirteenth International Conference on International Conference on Machine Learning,1996:275-283.
    [5] CAO Y,MIAO Q G,LIU J C,et al.Advances and prospects of AdaBoost algorithm[J].Acta Automatica Sinica,2013,39(6):745-758.
    [6] DASARATHY B V,SHEELA B V.A composite classifier system design:Concepts and methodology[C].Proceedings of the IEEE,1979,67(5):708-713.
    [7] HANSEN L K,SALAMON P.Neural network ensembles[C].IEEE Computer Society,1990:993-1001.
    [8] YU L,WU T J.Ensemble learning:A survey of Boosting algorithms[J].Pattern Recognition and Artificial Intelligence,2004,17(1):52-59.
    [9] FREUND Y,SCHAPIRE R E.A decision-theoretic generalization of on-line learning and an application to boosting[J].Journal of Computer and System Sciences,1997,55(1):23-37.
    [10] XIONG Z B.Research on GDP time series forecasting based on integrating ARIMA with neural networks[J].Journal of Applied Statistics and Management,2011,30(2):306-314.
    [11] JACOBS R A,et al.Adaptive mixtures of local experts[J].Neural Computation,1991,3(1):79-87.
    [12] LI X,ZHANG T W,GUO Z.A novel ensemble method of feature gene selection based on recursive partition-tree[J].Chinese Journal of Computers,2004,27(5):675-682.
    [13] GU Y,XU Z B,SUN J,et al.An intrusion detection ensemble system based on features extracted by PCA and ICA[J].Journal of Computer Research and Development,2006,43(4):633-638.
    [14] WOLPERT D H.Stacked generalization[M].US:Springer,1992:241-259.
    [15] FREUND Y,SCHAPIRE R E.Game theory,on-line prediction and boosting[C].Conference on Computational Learning Theory,1996.DOI: 10.1145/238061.238163.
    [16] FREUND Y,SCHAPIRE R E.Experiments with a new boosting algorithm[C].International Conference on Machine Learning,1996:148-156.
    [17] BREIMAN L.Bagging predictors[J].Machine Learning,1996,24:123-140.
    [18] WOODS K S,KEGELMEYER W P,BOWYER K W.Combination of multiple classifiers using local accuracy estimates[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,1997,19(4):405-410.
    [19] BREIMAN L.Random forests[J].Machine Learning,2001,45(1):5-32.
    [20] ZHANG C,MA Y.Ensemble machine learning:methods and applications[M].US:Springer,2012.
    [21] BROWN G,et al.Diversity creation methods:a survey and categorisation[J].Information Fusion,2005,6(1):5-20.
    [22] BROWN G.Diversity in neural network ensembles[D].Birmingham:University of Birmingham,2004.
    [23] TUMER K,GHOSH J.Error correlation and error reduction in ensemble classifiers[J].Connection Science,1996,8(3/4):385-404.
    [24] KROGH A,VEDELSBY J.Neural network ensembles,cross validation and active learning[J].International Conference on Neural Information Processing Systems,1994,7(10):231-238.
    [25] TANG E K,SUGANTHAN P N,YAO X.An analysis of diversity measures[J].Machine Learning,2006,65(1):247-271.
    [26] BANFIELD R E,et al.Ensemble diversity measures and their application to thinning[J].Information Fusion,2005,6(1):49-62.
    [27] KUNCHEVA L I,WHITAKER C J.Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy[J].Machine Learning,2003,51(2):181-207.
    [28] KUNCHEVA L I.Combining pattern classifiers:methods and algorithms[M].New Jersey:John Wiley & Sons,2004.
    [29] ZHOU Z H.Ensemble methods:foundations and algorithms[M].Taylor & Francis,2012:77-79.
    [30] YANG Y,JIANG J.Hybrid sampling-based clustering ensemble with global and local constitutions[J].IEEE Transactions on Neural Networks & Learning Systems,2017,27(5):952-965.
    [31] YANG Y,CHEN K.Unsupervised learning via iteratively constructed clustering ensemble[C].International Joint Conference on Neural Networks,2017:1-8.
    [32] BOSTROM H.Feature vs. classifier fusion for predictive data mining:a case study in pesticide classification[C].International Conference on Information Fusion,2007:1-7.
    [33] HAN D Q,HAN C Z,YANG Y.Multi-class SVM classifiers fusion based on evidence combination[C].International Conference on Wavelet Analysis and Pattern Recognition,2008:579-584.
    [34] MAIMON O,ROKACH L.Improving supervised learning by feature decomposition[J].International Symposium on Foundations of Information and Knowledge Systems,2002,2284:178-196.
    [35] HO T K.The random subspace method for constructing decision forests[J].IEEE Transactions on Pattern Analysis & Machine Intelligence,1998,20(8):832-844.
    [36] ROKACH L.Genetic algorithm-based feature set partitioning for classification problems[J].Pattern Recognition,2008,41(5):1676-1700.
    [37] ROKACH L.Decomposition methodology for classification tasks:a meta decomposer framework[J].Pattern Analysis & Applications,2006,9(2/3):257-271.
    [38] KUSIAK A.Decomposition in data mining:an industrial case study[J].IEEE Transactions on Electronics Packaging Manufacturing,2000,23(4):345-353.
    [39] YANG Y,LIU X,YE Q,et al.Ensemble learning based person Re-Identification with multiple feature representations[J].Complexity,2018(To Appear).
    [40] ROKACH L,MAIMON O.Feature set decomposition for decision trees[M].Netherlands:IOS Press,2005:131-158.
    [41] YANG Y,JIANG J.HMM-based hybrid meta-clustering ensemble for temporal data[J].Knowledge-Based Systems,2014,56:299-310.
    [42] BREIMAN L.Randomizing outputs to increase prediction accuracy[J].Machine Learning,2000,40(3):229-242.
    [43] DIETTERICH T G,BAKIRI G.Solving multiclass learning problems via error-correcting output codes[J].AI Access Foundation,1994,2(1):263-286.
    [44] SALZBERG S L.C4.5:Programs for machine learning by J.Ross Quinlan.Morgan Kaufmann Publishers 1993[J].Machine Learning,1994,16(3):235-240.
    [45] LEE S J,XU Z,LI T,et al.A novel bagging C4.5 algorithm based on wrapper feature selection for supporting wise clinical decision making[J].Journal of Biomedical Informatics,2017,78:144-155.
    [46] GÖNEN M,ALPAYDIN E.Multiple kernel learning algorithms[J].Journal of Machine Learning Research,2011,12:2211-2268.
    [47] PARTRIDGE D,YATES W B.Engineering multiversion neural-net systems[J].Neural Computation,1996,8(4):869-893.
    [48] YATES W B,PARTRIDGE D.Use of methodological diversity to improve neural network generalisation[J].Neural Computing & Applications,1996,4(2):114-128.
    [49] JORDAN M I,LECUN Y,SOLLA S A.Advances in neural information processing systems[J].Biochemical & Biophysical Research Communications,2002,159(6):125-132.
    [50] OPITZ D W,SHAVLIK J W.Generating accurate and diverse members of a neural-network ensemble[J].Advances in Neural Information Processing Systems,1996,8:535-541.
    [51] LIU Y.Generate different neural networks by negative correlation learning[J].Springer Berlin Heidelberg,2005,3610:417-417.
    [52] BROWN G,WYATT J.Negative correlation learning and the ambiguity family of ensemble methods[C].Multiple Classifier Systems,International Workshop MCS,Guildford,UK,2003:266-275.
    [53] ROSEN B E.Ensemble learning using decorrelated neural networks[J].Connection Science,1996,8(3/4):373-384.
    [54] YANG Y,CHEN K.An ensemble of competitive learning networks with different representations for temporal data clustering[C].International Joint Conference on Neural Networks,2006:3120-3127.
    [55] YANG Y,LIU X.A robust semi-supervised learning approach via mixture of label information[M].Amsterdam:Elsevier Science Inc,2015.
    [56] WANG W,JONES P,PARTRIDGE D.Diversity between neural networks and decision trees for building multiple classifier systems[C].International Workshop on Multiple Classifier Systems,2000:240-249.
    [57] ROKACH L.Collective-agreement-based pruning of ensembles[J].Computational Statistics & Data Analysis,2009,53(4): 1015-1026.
    [58] LIU H,MANDVIKAR A,MODY J.An empirical study of building compact ensembles[C].Advances in Web-Age Information Management:International Conference,2004:622-627.
    [59] ZHOU Z H,WU J,TANG W.Ensembling neural networks:many could be better than all[J].Artificial Intelligence,2002,137(1/2):239-263.
    [60] CHANDRA A,YAO X.Evolving hybrid ensembles of learning machines for better generalisation[J].Neurocomputing,2006,69(7):686-700.
    [61] LIU Y,YAO X.Ensemble learning via negative correlation[J].Neural Netw,1999,12(10):1399-1404.
    [62] BREIMAN L.Pasting small votes for classification in large databases and on-line[J].Machine Learning,1999,36(1/2):85-103.
    [63] BREIMAN L.Bias variance and arcing classifiers[J].Additives for Polymers,1996,2002(6):10.
    [64] BÜHLMANN P,YU B.Analyzing bagging[J].Annals of Statistics,2002,30(4):927-961.
    [65] SCHAPIRE R E.The strength of weak learnability[J].Proceedings of the Second Annual Workshop on Computational Learning Theory,1990,5(2):197-227.
    [66] ZHU X,BAO C,QIU W.Bagging very weak learners with lazy local learning[C].International Conference on Pattern Recognition,2012:1-4.
    [67] ZHU X,YANG Y.A lazy bagging approach to classification[J].Pattern Recognition,2008,41(10):2980-2992.
    [68] TANG W,ZHOU Z H.Bagging-based selective clusterer ensemble[J].Journal of Software,2005,16(4):496-502.(in Chinese)
    [69] ZHANG C X,ZHANG J S.A survey of selective ensemble learning algorithms[J].Chinese Journal of Computers,2011,34(8):1399-1410.(in Chinese)
    [70] HASTIE T,FRIEDMAN J,TIBSHIRANI R.The elements of statistical learning[J].Technometrics,2001,45(3):267-268.
    [71] MEIR R,RÄTSCH G.An introduction to boosting and leveraging[J].Advanced Lectures on Machines Learning,2003,2600:119-184.
    [72] SCHAPIRE R E.The boosting approach to machine learning[J].An Overview,2003,171:149-171.
    [73] CAO Y,MIAO Q G,LIU J C,et al.Advances and prospects of AdaBoost algorithm[J].Acta Automatica Sinica,2013,39(6):745-758.(in Chinese)
    [74] YU L,WU T J.Ensemble learning:A survey of Boosting algorithms[J].Pattern Recognition and Artificial Intelligence,2004,17(1):52-59.(in Chinese)
    [75] BENMOKHTAR R,HUET B.Classifier fusion:combination methods for semantic indexing in video content[C].International Conference on Artificial Neural Networks,2006:65-74.
    [76] LAM L.Classifier combinations:implementations and theoretical issues[C].International Workshop on Multiple Classifier Systems,2000:77-86.
    [77] RAHMAN A F R,FAIRHURST M C.Serial combination of multiple experts:a unified evaluation[J].Pattern Analysis & Applications,1999,2(4):292-311.
    [78] WOŹNIAK M,GRAÑA M,CORCHADO E.A survey of multiple classifier systems as hybrid systems[J].Information Fusion,2014,16(1):3-17.
    [79] KAI M T,WITTEN I H.Stacked generalization:when does it work?[C].Fifteenth International Joint Conference on Artifical Intelligence,1997:866-871.
    [80] SIGLETOS G,PALIOURAS G,SPYROPOULOS C D,et al.Combining information extraction systems using voting and stacked generalization[J].Journal of Machine Learning Research,2005,6(3):1751-1782.
    [81] LEDEZMA A,ALER R,SANCHIS A,et al.GA-stacking:Evolutionary stacked generalization[J].Intelligent Data Analysis,2010,14(1):89-119.
    [82] KUMARI G T P.A study of Bagging and Boosting approaches to develop meta-classifier[J].Engineering Science and Technology,2012,2(5):850-855.
    [83] VALENTINI G,DIETTERICH T G.Bias-variance analysis of SVM for the development of SVM-based Ensemble[J].Journal of Machine Learning Research,2004,5(3):725-775.
    [84] ARISTOTLE J.Weighted majority algorithm[M].Secut Press,2013.
    [85] PAPPENBERGER F,BEVEN K J,HUNTER N M,et al.Cascading model uncertainty from medium range weather forecasts (10 days) through a rainfall-runoff model to flood inundation predictions within the European flood forecasting system (EFFS)[J].Hydrology & Earth System Sciences,2005,9(4):381-393.
    [86] ISLAM M M,YAO X,MURASE K.A constructive algorithm for training cooperative neural network ensembles[J].IEEE Transactions on Neural Networks,2003,14(4):820.
    [87] KIM D,KIM C.Forecasting time series with genetic fuzzy predictor ensemble[J].Fuzzy Systems IEEE Transactions on,1997,5(4):523-535.
    [88] ASSAAD M,BONÉ R,CARDOT H.A new boosting algorithm for improved time-series forecasting with recurrent neural networks[J].Information Fusion,2008,9(1):41-55.
    [89] XIONG Z B.Research on GDP time series forecasting based on integrating ARIMA with neural networks[J].Journal of Applied Statistics and Management,2011,30(2):306-314.(in Chinese)
    [90] NEUGEBAUER J,BREMER J,HINRICHS C,et al.Generalized cascade classification model with customized transformation based ensembles[C].International Joint Conference on Neural Networks,2016:4056-4063.
    [91] BAGNALL A,LINES J,HILLS J,et al.Time-series classification with COTE:the collective of transformation-based ensembles[J].IEEE Transactions on Knowledge & Data Engineering,2015,27(9):2522-2535.
    [92] NGUYEN M N,LI X L,NG S K.Ensemble based positive unlabeled learning for time series classification[M].Springer Berlin Heidelberg,2012:243-257.
    [93] OEDA S,KURIMOTO I,ICHIMURA T.Time series data classification using recurrent neural network with ensemble learning[M].Springer Berlin Heidelberg,2006:742-748.
    [94] GIACOMEL F,PEREIRA A C M,GALANTE R.Improving financial time series prediction through output classification by a neural network ensemble[M].Springer International Publishing,2015:331-338.
    [95] YANG Y,CHEN K.Time series clustering via RPCL network ensemble with different representations[J].Systems Man and Cybernetics,2011,41(2):190-199.
    [96] YANG Y,CHEN K.Temporal data clustering via weighted clustering ensemble with different representations[J].IEEE Transactions on Knowledge and Data Engineering,2011,23(2):307-320.
    [97] YANG Y,JIANG J.Bi-weighted ensemble via HMM-based approaches for temporal data clustering[J].Pattern Recognition,2018,76:391-403.
    [98] YANG Y,JIANG J.Adaptive bi-weighting toward automatic initialization and model selection for HMM-based hybrid meta-clustering ensembles[J].IEEE Transactions on Cybernetics,2018,pp(99):1-12.
    [99] YANG Y.Temporal data mining via unsupervised ensemble learning[M].Elsevier,2017.
    [100] YANG Y,CHEN K.Combining competitive learning networks of various representations for sequential data clustering[C].Combining Competitie Learning Networks for Sequence Clustering,2006:315-336.DOI: 10.1007/978-3-540-36122-0_13.
    [101] STREHL A,GHOSH J.Cluster ensembles—a knowledge reuse framework for combining multiple partitions[J].JMLR org,2003,3(3):583-617.
    [102] YANG Y,LI I,WANG W,et al.An adaptive semi-supervised clustering approach via multiple density-based information[J].Neurocomputing,2017,257:193-205.
    [103] REKOW E D.CAD/CAM in dentistry[J].Alpha Omegan,1991,84(4):41.
    [104] GARG A X,ADHIKARI N K J,MCDONALD H,et al.Effects of computerized clinical decision support systems on practitioner performance and patient outcomes:a systematic review[J].Centre for Reviews and Dissemination (UK),2005,293(10):1223-1238.
    [105] LI Y,YANG L,WANG P,et al.Classification of parkinson's disease by decision tree based instance selection and ensemble learning algorithms[J].Journal of Medical Imaging & Health Informatics,2017,7(2).DOI: 10.1116/jmihi.2017.2033.
    [106] ISSAC B,ISRAR N.Case studies in intelligent computing:achievements and trends[M].CRC Press,2014:517-532.
    [107] SUBRAMANIAN D,SUBRAMANIAN V,DESWAL A,et al.New predictive models of heart failure mortality using time-series measurements and ensemble models[J].Circulation Heart Failure,2011,4(4):456-462.
    [108] GUO H Y,WANG D.A multilevel optimal feature selection and ensemble learning for a specific CAD system-pulmonary nodule detection[J].Applied Mechanics & Materials,2013,380-384:1593-1599.
    [109] LI X,ZHANG T W,GUO Z.A novel ensemble method of feature gene selection based on recursive partition-tree[J].Chinese Journal of Computers,2004,27(5):675-682.(in Chinese)
    [110] TAN A C,GILBERT D.Ensemble machine learning on gene expression data for cancer classification[J].Appl Bioinformatics,2003,2(Suppl 3):S75.
    [111] YANG P,YANG Y H,ZHOU B B,et al.A review of ensemble methods in bioinformatics[J].Current Bioinformatics,2010,5(4):296-308.
    [112] OKUN O.Feature selection and ensemble methods for bioinformatics:Algorithmic Classification and Implementations[M].Information Science Reference-Imprint of:IGI Publishing,2011.
    [113] YANG P,ZHOU B B,YANG J Y,et al.Stability of feature selection algorithms and ensemble feature selection methods in bioinformatics[M].John Wiley & Sons Inc,2014:333-352.
    [114] MA Y.An empirical investigation of tree ensembles in biometrics and bioinformatics research[D].Morgantown:West Virginia University,2007.
    [115] ROWLAND C H.Intrusion detection system[P].US,Patent 6 405 318,2002-06-11.
    [116] SOMMER R,PAXSON V.Outside the closed world:on using machine learning for network intrusion detection[J].IEEE Symposium on Security and Privacy,2010,41(3):305-316.
    [117] KARTHIKEYAN S S,KARTHIKEYAN,MAYBELL P S.An ensemble design of intrusion detection system for handling uncertainty using Neutrosophic Logic Classifier[J].Elsevier Science Publishers B V,2012,28(2):88-96.
    [118] GAIKWAD D,THOOL R.DAREnsemble:decision tree and rule learner based ensemble for network intrusion detection system[M].Springer International Publishing,2016.
    [119] MEHETREY P,SHAHRIARI B,MOH M.Collaborative ensemble-learning based intrusion detection systems for clouds[C].International Conference on Collaboration Technologies and Systems,2017:404-411.
    [120] GU Y,XU Z B,SUN J,et al.An intrusion detection ensemble system based on features extracted by PCA and ICA[J].Journal of Computer Research and Development,2006,43(4):633-638.(in Chinese)
    [121] SORNSUWIT P,JAIYEN S.Intrusion detection model based on ensemble learning for U2R and R2L attacks[C].International Conference on Information Technology and Electrical Engineering,2016:354-359.
    [122] CHEBROLU S,ABRAHAM A,THOMAS J P.Feature deduction and ensemble design of intrusion detection systems[J].Computers & Security,2005,24(4):295-307.
    [123] GIACINTO G,PERDISCI R,DEL RIO M,et al.Intrusion detection in computer networks by a modular ensemble of one-class classifiers[J].Information Fusion,2008,9(1):69-82.
Publication history
  • Received: 2018-08-01
  • Published: 2018-11-10

Affiliation: 1. School of Software, Yunnan University, Kunming 650500, Yunnan, China