When Efficient Model Averaging Out-Performs Bagging and Boosting
Ian Davidson, SUNY Albany; Wei Fan

Ensemble Techniques
- Techniques such as boosting and bagging are methods of combining models.
- They are used extensively in ML and DM and seem to work well in a large variety of situations.
- But model averaging is the "correct" Bayesian method of using multiple models.
- Does model averaging have a place in ML and DM?

What is Model Averaging?
- Averaging of class probabilities, each weighted by its model's posterior and integrated over the model space (the formula is reconstructed after these notes).
- Removes model uncertainty by averaging.
- Prohibitive for large model spaces, such as decision trees.

Efficient Model Averaging: PBMA and Random DT
- PBMA (Davidson 04): parametric bootstrap model averaging. Use a parametric model to generate multiple bootstraps computed from a single training set.
- Random Decision Tree (Fan et al 03): construct each tree's structure randomly; a categorical feature is used only once in a decision path; continuous features get random thresholds; leaf node statistics are estimated from the data; average the class probabilities of multiple trees.
- (Code sketches of both follow after these notes.)

Our Empirical Study
- Idea: when model uncertainty occurs, model averaging should perform well.
- Four specific but common situations in which factoring in model uncertainty is beneficial: class label noise, the many-label problem, sample selection bias, and small data sets.

Class Label Noise
- Randomly flip 10% of the labels (sketched after these notes).

Data Set with Many Classes

Biased Training Sets
- See ICDM 2005 for a formal analysis, KDD 2006 for estimating accuracy, and ICDM 2006 for a case study.

Universe of Examples
- Two classes, red and green: red where f2 > f1, green where f2 <= f1 (sketched after these notes).

Unbiased and Biased Samples

Results on unbiased vs. biased training samples:
Method                  Unbiased    Biased
Single Decision Tree    97.1%       92.1%
Random Decision Tree    96.9%       95.9%
Bagging                 97.82%      93.52%
PBMA                    99.08%      94.55%
Boosting                96.405%     92.7%

Scope of This Paper
- Identifies conditions where model averaging should outperform bagging and boosting.
- Empirically verifies these claims.
- Other questions: why do bagging and boosting perform badly in these conditions?
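The "What is Model Averaging?" notes label the pieces of an equation (class probability, posterior weighting, integration over the model space) whose symbols did not survive extraction. A standard statement of Bayesian model averaging consistent with those labels, written here as a reconstruction rather than the slide's exact notation, is

\[
P(y \mid x, D) \;=\; \sum_{m \in \mathcal{M}} \underbrace{P(y \mid x, m)}_{\text{class probability}} \; \underbrace{P(m \mid D)}_{\text{posterior weighting}},
\]

where the sum (an integral for a continuous model space) runs over the model space \(\mathcal{M}\) and \(D\) is the training data.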
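The PBMA bullet only says that a parametric model generates multiple bootstraps from a single training set, so the sketch below fills in one plausible reading: fit a base learner once, redraw the training labels from its predicted class distributions to form each bootstrap replicate, refit, and average the class probabilities. The use of scikit-learn's DecisionTreeClassifier, the label-redrawing scheme, and the helper name pbma_predict_proba are illustrative assumptions, not Davidson (2004)'s implementation.

```python
# Minimal sketch of parametric bootstrap model averaging (PBMA), assuming the
# bootstraps are formed by redrawing labels from a model fitted to the single
# training set; the actual method may differ in detail.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def pbma_predict_proba(X_train, y_train, X_test, n_bootstraps=25, seed=0):
    rng = np.random.default_rng(seed)
    # Shallow tree so the fitted class probabilities are not all 0/1.
    base = DecisionTreeClassifier(max_depth=5, random_state=seed).fit(X_train, y_train)
    p_train = base.predict_proba(X_train)            # parametric model of the labels
    averaged = np.zeros((len(X_test), len(base.classes_)))
    for _ in range(n_bootstraps):
        # Parametric bootstrap: each training label is redrawn from the class
        # distribution the fitted model assigns to that example.
        y_boot = np.array([rng.choice(base.classes_, p=row) for row in p_train])
        member = DecisionTreeClassifier(
            random_state=int(rng.integers(1 << 31))).fit(X_train, y_boot)
        # Align the member's class columns with the base model's classes.
        proba = np.zeros_like(averaged)
        cols = np.searchsorted(base.classes_, member.classes_)
        proba[:, cols] = member.predict_proba(X_test)
        averaged += proba
    return averaged / n_bootstraps                    # averaged class probabilities
```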
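The Random Decision Tree bullet describes the construction well enough to sketch: the tree structure is grown without consulting labels, continuous splits use random thresholds, leaf statistics are filled in from the training data, and the ensemble averages class probabilities. The sketch below keeps to continuous features (it omits the "categorical feature used once per path" rule) and draws thresholds from the observed feature range, which is an assumption rather than Fan et al. (2003)'s exact procedure.

```python
# Sketch of Random Decision Trees: random structure, data-driven leaf
# statistics, averaged class probabilities across trees.
import numpy as np
from collections import Counter

def build_structure(X, rng, depth=0, max_depth=8):
    """Grow a random structure: random feature, random threshold, labels unused."""
    if depth == max_depth or len(X) < 2:
        return {"leaf": True, "counts": Counter()}
    f = int(rng.integers(X.shape[1]))
    lo, hi = X[:, f].min(), X[:, f].max()
    if lo == hi:
        return {"leaf": True, "counts": Counter()}
    t = float(rng.uniform(lo, hi))
    return {"leaf": False, "feature": f, "threshold": t,
            "left": build_structure(X[X[:, f] <= t], rng, depth + 1, max_depth),
            "right": build_structure(X[X[:, f] > t], rng, depth + 1, max_depth)}

def fill_leaves(node, X, y):
    """Leaf node statistics are estimated from the labelled training data."""
    if node["leaf"]:
        node["counts"].update(y)
        return
    mask = X[:, node["feature"]] <= node["threshold"]
    fill_leaves(node["left"], X[mask], y[mask])
    fill_leaves(node["right"], X[~mask], y[~mask])

def tree_proba(node, x, classes):
    if node["leaf"]:
        total = sum(node["counts"].values())
        if total == 0:
            return np.ones(len(classes)) / len(classes)   # empty leaf: uniform
        return np.array([node["counts"][c] / total for c in classes])
    child = "left" if x[node["feature"]] <= node["threshold"] else "right"
    return tree_proba(node[child], x, classes)

def random_decision_trees(X, y, X_test, n_trees=30, seed=0):
    rng = np.random.default_rng(seed)
    classes = sorted(set(y))
    trees = []
    for _ in range(n_trees):
        t = build_structure(X, rng)
        fill_leaves(t, X, y)
        trees.append(t)
    # Average the class probabilities of the individual trees.
    return np.mean([[tree_proba(t, x, classes) for x in X_test] for t in trees], axis=0)
```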
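The class-label-noise condition is simple to reproduce; the helper below flips 10% of binary labels, matching the slide's description (the 0/1 encoding is an assumption, and a multi-class version would redraw a different label instead).

```python
# Sketch of the label-noise setup: randomly flip a fixed fraction of labels.
import numpy as np

def flip_labels(y, fraction=0.10, seed=0):
    rng = np.random.default_rng(seed)
    y_noisy = np.array(y).copy()
    idx = rng.choice(len(y_noisy), size=int(fraction * len(y_noisy)), replace=False)
    y_noisy[idx] = 1 - y_noisy[idx]      # assumes binary 0/1 labels
    return y_noisy
```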
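The "Universe of Examples" and "Unbiased and Biased Samples" slides are mostly figures, so the sketch below reproduces only what the text states: two features f1, f2, with class red where f2 > f1 and green otherwise (the comparison symbols were garbled, so the direction is an assumption), plus an illustrative feature-dependent selection bias; the paper's actual biasing scheme is not given in this text.

```python
# Sketch of the synthetic universe and a feature-biased training sample.
import numpy as np

def make_universe(n=10000, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0, size=(n, 2))      # columns: f1, f2
    y = (X[:, 1] > X[:, 0]).astype(int)         # 1 = red, 0 = green (assumed)
    return X, y

def biased_sample(X, y, n=500, seed=1):
    rng = np.random.default_rng(seed)
    # Illustrative bias: selection depends on the features only, so points
    # with small f1 are much more likely to enter the training set.
    keep_prob = np.where(X[:, 0] < 0.3, 0.9, 0.05)
    keep_prob /= keep_prob.sum()
    idx = rng.choice(len(X), size=n, replace=False, p=keep_prob)
    return X[idx], y[idx]
```

Training each method on the output of biased_sample and testing on a fresh unbiased draw from make_universe is the kind of unbiased-vs-biased comparison the results table above reports.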