1. Title
On the Expressivity of Convolutional Neural Networks
二、主讲人
熊欢
3. Time
November 26, 2021, 14:30–16:00
4. Venue
Tencent Meeting, ID: 215 925 239
5. Abstract
One fundamental problem in deep learning is understanding the excellent practical performance of deep neural networks (NNs). One explanation for their superiority is that they can realize a large family of complicated functions, i.e., they have powerful expressivity. The expressivity of a neural network with piecewise linear activations (PLNN) can be quantified by the maximal number of linear regions into which it can separate its input space. In this talk, we present several mathematical results needed for studying the linear regions of piecewise linear convolutional neural networks (PLCNNs), and use them to derive the maximal and average numbers of linear regions of one-layer PLCNNs. Furthermore, we obtain upper and lower bounds on the number of linear regions of multi-layer PLCNNs. The Rectified Linear Unit (ReLU) is a piecewise linear activation function that has been widely adopted in various architectures. Our results suggest that deeper ReLU CNNs have more powerful expressivity than their shallow counterparts, and that, per parameter, ReLU CNNs are more expressive than fully-connected ReLU NNs, in terms of the number of linear regions.
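To make the notion of "linear regions" concrete, here is a small illustrative sketch (not from the talk): a one-hidden-layer ReLU network with a 1-D input is piecewise linear, and each hidden unit contributes at most one breakpoint, so n hidden units yield at most n + 1 linear regions. The code below estimates the region count empirically by counting distinct consecutive ReLU activation patterns over a dense input grid; all weights and the grid range are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative sketch: count the linear regions of a random one-hidden-layer
# ReLU network f(x) = relu(W1 x + b1) with 1-D input, by detecting changes
# in the ReLU activation pattern along a dense 1-D grid. With n_hidden units,
# generic weights give at most n_hidden + 1 regions on the real line.

rng = np.random.default_rng(0)
n_hidden = 5
W1 = rng.standard_normal((n_hidden, 1))  # hidden-layer weights
b1 = rng.standard_normal(n_hidden)       # hidden-layer biases

xs = np.linspace(-10.0, 10.0, 200001).reshape(-1, 1)  # dense 1-D input grid
pre = xs @ W1.T + b1                # pre-activations, shape (grid, n_hidden)
patterns = pre > 0                  # ReLU activation pattern at each input

# Each change of pattern between consecutive grid points crosses a region
# boundary, so the number of regions seen is (number of changes) + 1.
changes = np.any(patterns[1:] != patterns[:-1], axis=1).sum()
n_regions = int(changes) + 1
print(n_regions)  # bounded above by n_hidden + 1
```

Note that the grid only detects breakpoints falling inside [-10, 10], so the count is a lower estimate of the true number of regions; the talk's results concern exact maximal and average counts for convolutional architectures.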
6. About the Speaker
Huan Xiong is a professor and doctoral supervisor at the Institute for Advanced Study in Mathematics, Harbin Institute of Technology. He received his bachelor's degree from the School of Mathematical Sciences, Peking University, and his Ph.D. from the University of Zurich in 2016, under the supervision of Guo-Niu Han and Paul-Olivier Dehaye. His research interests are combinatorics and machine learning. He has published more than twenty papers in renowned international journals and conferences, including IEEE TPAMI, JCTA, Sci. China Math., ICML, NeurIPS, CVPR, ICCV, ECCV, and FPSAC. He has led or participated in several research projects funded by the Swiss National Science Foundation and the French National Centre for Scientific Research (CNRS).
7. Organizers
Frontiers Science Center for Nonlinear Expectations
Research Center for Mathematics and Interdisciplinary Sciences