Several Theoretical Issues in Machine Learning
Zongben Xu (Xi'an Jiaotong University)
Email: Homepage: http:/

Outline
- Universality theory of linear learning machines
- Regularization theory based on error modeling
- New models and new theory for sparse information processing

Universality Theory of Linear Learning Machines

A New Learning Paradigm: LtDAHP (Learning through Deterministic Assignment of Hidden Parameters)
Zongben Xu (Xi'an Jiaotong University, Xi'an, China)

A supervised learning problem:
difficult or easy?
- Can a difficult learning problem be solved more simply?
- Is a linear machine universal?

Outline
- Some Related Concepts
- LtRAHP: Learning through Random Assignment of Hidden Parameters
- LtDAHP: Learning through Deterministic Assignment of Hidden Parameters
- Concluding Remarks

Some Related Concepts: Supervised Learning
Supervised Learning: Given a finite number of input/output samples, find a function f in a machine (hypothesis space) H that approximates the unknown relation between the input and output spaces.
The relation is a black box; typical instances: face recognition, social networks, stock index tracking.
Learning principle: empirical risk minimization (ERM). Machine: feed-forward neural networks (FNNs).

Some Related Concepts: HP vs BP
Hidden parameters (HPs): determine the hidden predictors (the non-linear mechanism).
Bright parameters (BPs): determine how the hidden predictors are linearly combined (the linear mechanism).

Some Related Concepts: OSL vs TSL
One-Stage Learning (OSL): HPs and BPs are trained simultaneously, in one stage.
Two-Stage Learning (TSL): HPs and BPs are trained separately, in two stages.
  Stage 1: assign the hidden parameters, T = assign(·).
  Stage 2: train the bright parameters.

Some Related Concepts: Main Concerns
Q1: How to specify the assign function?
- T = assign(a): ADM
- T = assign(·): random assignment (LtRAHP)
- T = assign(n): deterministic assignment (LtDAHP)
Q2: Can TSL work?
- Universal approximation?
- Does it degrade the generalization ability?
- Consistency/convergence?
- Effectiveness & efficiency?
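The contrast between OSL and TSL can be made concrete with a small sketch. The following is a minimal NumPy illustration of TSL in the LtRAHP spirit (akin to extreme-learning-machine-style random-feature methods): Stage 1 draws the hidden parameters at random and freezes them; Stage 2 trains only the bright parameters by linear least squares. The tanh activation, the weight scale, and all function names here are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def fit_tsl_random(X, y, n_hidden=40, seed=0):
    """TSL with random hidden-parameter assignment (LtRAHP-style sketch).

    Stage 1 (hidden parameters): draw the hidden weights and biases at
    random and fix them; they are never trained.
    Stage 2 (bright parameters): solve a linear least-squares problem for
    the coefficients that linearly combine the hidden predictors.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=4.0, size=(d, n_hidden))  # hidden weights (scale is an arbitrary illustrative choice)
    b = rng.normal(scale=2.0, size=n_hidden)       # hidden biases
    H = np.tanh(X @ W + b)                         # hidden-predictor (feature) matrix
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # bright parameters via least squares
    return W, b, beta

def predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: learn f(x) = sin(2*pi*x) on [0, 1] from 200 samples.
X = np.linspace(0, 1, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()
W, b, beta = fit_tsl_random(X, y, n_hidden=40)
mse = np.mean((predict(X, W, b, beta) - y) ** 2)
```

Only the linear Stage 2 involves optimization, which is why the slides speak of the machine acting as a linear learning machine once the hidden parameters are assigned.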
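For LtDAHP the assign function is deterministic rather than random. The slides do not spell out the assignment rule at this point, so the sketch below uses equally spaced hidden-node centers purely as one illustrative deterministic assignment (the actual LtDAHP theory prescribes specific point sets); Stage 2 is again a linear least-squares fit of the bright parameters. All names and the slope constant are hypothetical.

```python
import numpy as np

def fit_tsl_deterministic(X, y, n_hidden=40, slope=8.0):
    """TSL with deterministically assigned hidden parameters (LtDAHP-style sketch).

    Stage 1: place hidden-node centers t_k on an equally spaced grid over
    the data range (an illustrative deterministic rule only).
    Stage 2: solve for the bright (output) parameters by least squares.
    """
    t = np.linspace(X.min(), X.max(), n_hidden)  # deterministic centers, no randomness
    H = np.tanh(slope * (X - t))                 # fixed hidden predictors
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return t, beta

def predict_det(X, t, beta, slope=8.0):
    return np.tanh(slope * (X - t)) @ beta

# Same toy target as before: f(x) = sin(2*pi*x) on [0, 1].
X = np.linspace(0, 1, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()
t, beta = fit_tsl_deterministic(X, y)
mse_det = np.mean((predict_det(X, t, beta) - y) ** 2)
```

A deterministic assignment makes the whole procedure reproducible and removes the variance introduced by random hidden parameters, which is one motivation the slides raise for moving from LtRAHP to LtDAHP.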