1. Give the definitions of, or your comprehension of, the following terms. (12)
   - The inductive learning hypothesis (P17)
   - Overfitting (P49)
   - Consistent learner (P148)

2. Give brief answers to the following questions. (15)
   (a) If the size of a version space is |VS_{H,D}|, in general, what is the smallest number of queries that may be required by a concept learner using an optimal query strategy to perfectly learn the target concept? (P27)
   (b) In general, decision trees represent a disjunction of conjunctions of constraints on the attribute values of instances. What expression does the following decision tree correspond to? (The decision tree figure is not reproduced here; only its leaf labels Yes, No, Yes, No survive.)

3. Explain inductive bias, and list the inductive bias of the CANDIDATE-ELIMINATION algorithm, of decision tree learning (ID3), and of the BACKPROPAGATION algorithm. (10)

4. How can overfitting be handled in decision trees and in neural networks? (10)
   Solution:
   Decision tree:
   - Stop growing the tree earlier
   - Post-pruning
   Neural network:
   - Weight decay
   - Validation set

5. Prove that the LMS weight update rule

       w_i ← w_i + η (V_train(b) − V̂(b)) x_i

   performs a gradient descent to minimize the squared error. In particular, define the squared error E as in the text:

       E = Σ_{b ∈ training examples} (V_train(b) − V̂(b))²

   Now calculate the derivative of E with respect to the weight w_i, assuming that V̂(b) is a linear function as defined in the text. Gradient descent is achieved by updating each weight in proportion to −∂E/∂w_i; therefore, you must show that the LMS training rule alters weights in this proportion for each training example it encounters.
   Solution: Since V̂(b) = Σ_i w_i x_i is linear in the weights,

       ∂E/∂w_i = Σ_b 2 (V_train(b) − V̂(b)) · ∂(V_train(b) − V̂(b))/∂w_i
               = −2 Σ_b (V_train(b) − V̂(b)) x_i.

   Updating each weight in proportion to −∂E/∂w_i (absorbing the constant factor 2 into the learning rate η) yields exactly the per-example LMS update w_i ← w_i + η (V_train(b) − V̂(b)) x_i.

6. (The statement of this question is garbled in the source; what survives involves the more-general-than relation h_j ≥_g h_k, defined by (∀x ∈ X)[(h_k(x) = 1) → (h_j(x) = 1)], applied to decision trees D1 and D2.) (10)
   Solution: The hypothesis is false. One counterexample is A XOR B: when A ≠ B the training examples are all positive, and when A = B they are all negative; then, using ID3 to extend D1, the new tree D2 will be equivalent to D1 (D2 equals D1).

7. Design a two-input perceptron that implements the boolean function A ∧ ¬B. Design a two-layer network of perceptrons that implements A XOR B. (10)

8. Suppose a hypothesis space contains three hypotheses, h1, h2, and h3, and the posterior probabilities of these hypotheses given the training data are __, __, and __ respectively (the values are elided in the source). If a new instance x is encountered, which is classified positive by h1 but negative by h2 and h3, give the result and the detailed classification procedure of the Bayes optimal classifier. (10) (P125)
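For question 7, one concrete answer can be sketched in code. The specific weights and thresholds below are our choice (many settings work): a single threshold unit computes A ∧ ¬B, and a two-layer network computes XOR as (A ∧ ¬B) ∨ (¬A ∧ B), which a single perceptron cannot represent because XOR is not linearly separable.

```python
def perceptron(weights, threshold):
    """Return a threshold unit: fires (1) when the weighted input sum exceeds threshold."""
    return lambda *xs: int(sum(w * x for w, x in zip(weights, xs)) > threshold)

# Question 7a: A AND (NOT B) with a single unit.
a_and_not_b = perceptron([1, -1], 0.5)

# Question 7b: two-layer network for A XOR B = (A AND NOT B) OR (NOT A AND B).
h1 = perceptron([1, -1], 0.5)   # hidden unit: fires for A=1, B=0
h2 = perceptron([-1, 1], 0.5)   # hidden unit: fires for A=0, B=1
out = perceptron([1, 1], 0.5)   # output unit: OR of the two hidden units

def xor_net(a, b):
    return out(h1(a, b), h2(a, b))

# Print the full truth table to check both designs.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, a_and_not_b(a, b), xor_net(a, b))
```

The hidden layer carves the input space with two half-planes, and the output unit takes their union; this is the standard construction for functions that are not linearly separable.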
9. Suppose S is a collection of training-example days described by attributes including Humidity, which can have the values High or Normal. Assume S contains 10 examples, [7+, 3−]. Of these 10 examples, suppose 3 of the positive and 2 of the negative examples have Humidity = High, and the remainder have Humidity = Normal. Calculate the information gain due to sorting the original 10 examples by the attribute Humidity. (log2 1 = 0, log2 2 = 1, log2 3 ≈ 1.585, log2 4 = 2, log2 5 ≈ 2.322, log2 6 ≈ 2.585, log2 7 ≈ 2.807, log2 8 = 3, log2 9 ≈ 3.170, log2 10 ≈ 3.322) (5)
   Solution:
   (a) Here we denote S = [7+, 3−]; then
       Entropy([7+, 3−]) = −(7/10) log2(7/10) − (3/10) log2(3/10) ≈ 0.881
   (b) Gain(S, Humidity) = Entropy(S) − Σ_{v ∈ Values(Humidity)} (|S_v|/|S|) Entropy(S_v),
       where Values(Humidity) = {High, Normal} and S_v = {s ∈ S | Humidity(s) = v}.
       S_High = [3+, 2−]: Entropy(S_High) = −(3/5) log2(3/5) − (2/5) log2(2/5) ≈ 0.971, |S_High| = 5
       S_Normal = [4+, 1−]: Entropy(S_Normal) = −(4/5) log2(4/5) − (1/5) log2(1/5) ≈ 0.722, |S_Normal| = 5
       Thus Gain(S, Humidity) ≈ 0.881 − ((5/10) × 0.971 + (5/10) × 0.722) ≈ 0.035

10. Finish the following algorithms. (10)
    (1) GRADIENT-DESCENT(training_examples, η)
        Each training example is a pair of the form ⟨x, t⟩, where x is the vector of input values and t is the target output value. η is the learning rate.
        - Initialize each w_i to some small random value
        - Until the termination condition is met, Do
          - Initialize each Δw_i to zero
          - For each ⟨x, t⟩ in training_examples, Do
            - Input the instance x to the unit and compute the output o
            - For each linear unit weight w_i, Do
              __________
          - For each linear unit weight w_i, Do
            __________
    (2) FIND-S algorithm
        - Initialize h to the most specific hypothesis in H
        - For each positive training instance x
          - For each attribute constraint a_i in h
            - If __________
              Then do nothing
              Else replace a_i in h by the next more general constraint that is satisfied by x
        - Output hypothesis h
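The entropy and gain values in the solution to question 9 can be checked with a short script. This is a sketch (the function name is ours, not from the source), computing Entropy(S), the two branch entropies, and Gain(S, Humidity) exactly as in the solution above:

```python
from math import log2

def entropy(pos, neg):
    """Entropy of a boolean-labelled sample with `pos` positives and `neg` negatives."""
    total = pos + neg
    e = 0.0
    for count in (pos, neg):
        p = count / total
        if p > 0:            # 0 * log2(0) is taken as 0 by convention
            e -= p * log2(p)
    return e

# Question 9: S = [7+, 3-]; Humidity=High covers [3+, 2-], Normal covers [4+, 1-].
entropy_s      = entropy(7, 3)   # ≈ 0.881
entropy_high   = entropy(3, 2)   # ≈ 0.971
entropy_normal = entropy(4, 1)   # ≈ 0.722
gain = entropy_s - (5 / 10) * entropy_high - (5 / 10) * entropy_normal
print(round(gain, 3))            # ≈ 0.035
```

Running this confirms the hand calculation, including that the High-branch entropy rounds to 0.971 rather than 0.972.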