Statistical Learning and Inference
Prof. Liqing Zhang
Dept. of Computer Science & Engineering, Shanghai Jiaotong University

Books and References
- Trevor Hastie, Robert Tibshirani, Jerome Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer-Verlag, 2001
- Vladimir N. Vapnik, The Nature of Statistical Learning Theory, 2nd ed., Springer, 2000
- S. Mendelson, "A Few Notes on Statistical Learning Theory", in Advanced Lectures in Machine Learning: Machine Learning Summer School 2002, S. Mendelson and A. J. Smola (eds.), Lecture Notes in Computer Science 2600, Springer, 2003
- M. Vidyasagar, Learning and Generalization: With Applications to Neural Networks, 2nd ed., Springer, 2003

Overview of the Course
- Introduction
- Overview of Supervised Learning
- Linear Methods for Regression and Classification
- Basis Expansions and Regularization
- Kernel Methods
- Model Selection and Inference
- Support Vector Machines
- Bayesian Inference
- Unsupervised Learning

Why Statistical Learning?
"We are drowning in information, but starved of knowledge." - R. Roger
"Quiet statisticians have changed our world; not by discovering new facts or technical developments, but by changing the ways we reason, experiment, and form our opinions." - I. Hacking
Question: why are today's computers so inefficient at processing intelligent information - images, video, and audio; cognition; language?

ML: SARS Risk Prediction
Predict a patient's SARS risk from pre-hospital attributes (age, gender, blood pressure, chest X-ray) and in-hospital attributes (albumin, blood pO2, white count, RBC count).

ML: Auto Vehicle Navigation
Learn the steering direction from sensor input.

Protein Folding

The Scale of Biomedical Data

Computational Science vs. Brain Science
Computer information processing:
- computation is based on logic
- computation and data are separate
- data processing and storage are simple
- intelligent information processing is complex and slow
- cognitive ability is weak
- processing mode: logic -> concepts -> statistical information
Brain information processing:
- computation is based on statistical information
- computation and data are integrated
- data processing and storage mechanisms are unknown
- intelligent information processing is simple and fast
- cognitive ability is strong
- processing mode: statistical information -> concepts -> logic

Function Estimation Model
The function estimation model of learning from examples:
- Generator (G): generates observations x (typically in R^n), independently drawn from some fixed distribution F(x).
- Supervisor (S): labels each input x with an output value y according to some fixed distribution F(y|x).
- Learning Machine (LM): "learns" from an i.i.d. l-sample of (x, y)-pairs output by G and S, by choosing the function that best approximates S from a parameterised function class f(x, a), where the index a ranges over the parameter set Λ.
Key concepts: the joint distribution F(x, y); an i.i.d. k-sample drawn from F; the functions f(x, a), each equivalently represented by its index a.
[Diagram: x flows from G to S and to LM; S outputs the label y, LM outputs its estimate ŷ.]

The Problem of Risk Minimization
- The loss functional (L, or Q(z, a) with z = (x, y)): the error of a given function on a given example.
- The risk functional R(a) = ∫ Q(z, a) dF(z): the expected loss of a given function on an example drawn from F(x, y) - the usual notion of the generalisation error of that function.

Three Main Learning Problems
- Pattern recognition: Q(z, a) is the 0-1 loss, 0 if f(x, a) = y and 1 otherwise.
- Regression estimation: Q(z, a) = (y - f(x, a))^2.
- Density estimation: Q(z, a) = -log p(z, a).

General Formulation: The Goal of Learning
Given an i.i.d. k-sample z_1, ..., z_k drawn from a fixed distribution F(z), and a class of loss functionals Q(z, a) with a in Λ, we wish to minimise the risk R(a), finding the minimising index a*.

The Empirical Risk Minimization (ERM) Inductive Principle
- Define the empirical risk (sample/training error): R_emp(a) = (1/k) Σ_{i=1..k} Q(z_i, a).
- Define the empirical risk minimiser: a_k = argmin_{a in Λ} R_emp(a).
- ERM approximates Q(z, a*) with Q(z, a_k), the R_emp minimiser - that is, ERM approximates a* with a_k.
- Least-squares and maximum-likelihood are realisations of ERM.

Four Issues of Learning Theory
1. Theory of consistency of learning processes: what are the (necessary and sufficient) conditions for consistency - convergence of R_emp to R - of a learning process based on the ERM principle?
2. Non-asymptotic theory of the rate of convergence: how fast does a learning process converge?
3. Generalization ability of learning processes: how can one control the rate of convergence (the generalization ability) of a learning process?
4. Constructing learning algorithms (e.g. the SVM): how can one construct algorithms that control the generalization ability?

Change in Scientific Methodology
Traditional: formulate hypothesis; design experiment; collect data; analyze results; review hypothesis; repeat/publish.
New: design large experiments; collect large data; put the data in a large database; formulate hypothesis; evaluate the hypothesis on the database; run limited experiments; review hypothesis; repeat/publish.

Learning & Adaptation
In the broadest sense, any method that incorporates information from training samples into the design of a classifier employs learning. Because classification problems are complex, we cannot guess the best classification decision ahead of time; we need to learn it. Creating a classifier therefore involves positing some general form of model, or form of the classifier, and using examples to learn the complete classifier.

Supervised Learning
In supervised learning, a teacher provides a category label for each pattern in a training set. These labelled patterns are then used to train a classifier, which can thereafter solve similar classification problems by itself.

Unsupervised Learning
In unsupervised learning, or clustering, there is no explicit teacher or labelled training data. The system forms natural clusters of the input patterns and classifies each pattern according to the cluster it belongs to.

Reinforcement Learning
In reinforcement learning, a teacher only tells the classifier whether it is right when it suggests a category for a pattern; the teacher does not say what the correct category is.

Classification
- The task of the classifier is to use the feature vector provided by the feature extractor to assign the object to a category. Classification is the main topic of this course.
- The abstraction provided by the feature-vector representation of the input data enables the development of a largely domain-independent theory of classification.
- Essentially, the classifier divides the feature space into regions corresponding to the different categories.
- The difficulty of a classification problem depends on the variability of the feature values for objects in the same category, relative to the variation of feature values between categories. Variability may be natural or due to noise; describing it through statistics leads to statistical pattern recognition.
- Question: how do we design a classifier that can cope with the variability in feature values? What is the best possible performance?
[Figure: objects represented in feature space with axes X1 (perimeter) and X2 (area); a decision surface S(x) = 0 separates Class A (S(x) > 0) from Class B (S(x) < 0). Noise and biological variation cause each class to spread; classification error arises where the classes overlap.]
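The Generator/Supervisor/Learning-Machine model can be sketched in a few lines of Python. This is a toy illustration, not material from the slides: the distribution F(x), the conditional F(y|x), and the one-parameter linear function class f(x, a) = a*x are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generator G: draws observations x from a fixed distribution F(x).
def generator(n):
    return rng.uniform(-1.0, 1.0, size=n)

# Supervisor S: labels each x according to a fixed conditional F(y|x)
# (here, a noisy linear rule with true index a = 2.0).
def supervisor(x):
    return 2.0 * x + rng.normal(0.0, 0.1, size=x.shape)

# Learning Machine LM: from an i.i.d. l-sample of (x, y)-pairs, picks
# the function f(x, a) = a*x in the class that best approximates S.
l = 200
x = generator(l)
y = supervisor(x)
a_hat = np.sum(x * y) / np.sum(x * x)  # least-squares choice of index a
```

With 200 samples and small label noise, the learned index `a_hat` lands close to the supervisor's true value of 2.0.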
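The loss functionals for the three main learning problems can be written out directly; the forms below are the standard choices from Vapnik's formulation, with z = (x, y) and f_x denoting the prediction f(x, a).

```python
import numpy as np

def pattern_recognition_loss(y, f_x):
    # 0-1 loss: 0 if the predicted category matches the label, 1 otherwise.
    return float(y != f_x)

def regression_loss(y, f_x):
    # Squared loss for regression estimation.
    return (y - f_x) ** 2

def density_estimation_loss(p_z):
    # Negative log-likelihood of the modelled density p(z, a) at z.
    return -np.log(p_z)

print(pattern_recognition_loss(1, 0))  # 1.0
print(regression_loss(3.0, 2.5))       # 0.25
```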
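The ERM inductive principle itself can be demonstrated concretely: compute R_emp(a) = (1/k) Σ Q(z_i, a) on an i.i.d. k-sample and pick its minimiser a_k. The data-generating rule and the grid of candidate indices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# An i.i.d. k-sample z_1..z_k with z = (x, y); the true index is 1.5.
k = 500
x = rng.normal(size=k)
y = 1.5 * x + rng.normal(0.0, 0.2, size=k)

def emp_risk(a):
    # Empirical risk R_emp(a) = (1/k) * sum_i Q(z_i, a), squared loss.
    return np.mean((y - a * x) ** 2)

# ERM: the empirical risk minimiser a_k over a grid of candidate indices.
grid = np.linspace(-3.0, 3.0, 601)
a_k = grid[np.argmin([emp_risk(a) for a in grid])]
```

For squared loss this grid search and the least-squares solution coincide (up to grid resolution), which is exactly the sense in which least-squares is a realisation of ERM.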
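The unsupervised-learning idea of forming natural clusters without a teacher can be sketched with a minimal k-means loop; the two-cluster data below is an illustrative assumption, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(3)

# Unlabelled input patterns drawn from two natural clusters.
pts = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),
                 rng.normal(3.0, 0.3, size=(50, 2))])

# Minimal k-means: with no teacher, form k clusters and classify each
# pattern according to the cluster it belongs to.
k = 2
centres = pts[[0, 50]].copy()  # one starting centre from each region
for _ in range(20):
    # Assign every pattern to its nearest centre.
    dists = ((pts[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    labels = np.argmin(dists, axis=1)
    # Move each centre to the mean of its assigned patterns.
    centres = np.array([pts[labels == j].mean(axis=0) for j in range(k)])
```

After convergence the two centres sit near the two natural cluster means, and `labels` partitions the patterns without any category labels having been provided.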
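The idea of a decision surface S(x) = 0 dividing a two-dimensional feature space into Class A (S(x) > 0) and Class B (S(x) < 0) can be sketched as follows; the class distributions and the means-based choice of the surface are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two classes of objects in feature space (X1, X2), each spread by noise.
n = 100
class_a = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(n, 2))
class_b = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(n, 2))

# A linear decision surface S(x) = w.x + b = 0; here w and b are chosen
# so the surface passes midway between the two class means.
mu_a, mu_b = class_a.mean(axis=0), class_b.mean(axis=0)
w = mu_a - mu_b                   # normal vector to the surface
b = -0.5 * w @ (mu_a + mu_b)      # offset placing it between the means

def classify(x):
    # Region S(x) > 0 is Class A; region S(x) < 0 is Class B.
    return "A" if w @ x + b > 0 else "B"

acc = (sum(classify(p) == "A" for p in class_a) +
       sum(classify(p) == "B" for p in class_b)) / (2 * n)
```

With well-separated classes the accuracy is essentially perfect; shrinking the gap between the class means (or inflating the noise) makes the classes overlap, and the overlap region is exactly where classification error becomes unavoidable.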