Lectures
Enhanced Expressive Power and Fast Training of Neural Networks by Random Projections
Editor: Lin Yu | Published: March 18, 2019

Speaker: Jianfeng Cai, The Hong Kong University of Science and Technology

Time: 2019-03-18, 10:00

Location: Conference Room 105, Experiment Building, Haiyun Campus

Abstract:

Random projections can perform dimension reduction efficiently for datasets with nonlinear low-dimensional structures. One well-known example is that random matrices embed sparse vectors into a low-dimensional subspace nearly isometrically, a fact known as the restricted isometry property in compressed sensing. In this talk, we explore some applications of random projections in deep neural networks. We characterize the expressive power of fully connected neural networks when the input data are sparse vectors or lie on a low-dimensional smooth manifold. We prove that the number of neurons required to approximate a Lipschitz function to a prescribed precision depends on the sparsity or the dimension of the manifold, and only weakly on the dimension of the input vector. The key to our proof is that random projections stably embed the set of sparse vectors, or a low-dimensional smooth manifold, into a low-dimensional subspace. Based on this fact, we also propose new neural network models in which, at each layer, the input is first projected onto a low-dimensional subspace by a random projection, and then the standard linear connection and non-linear activation are applied. In this way, the number of parameters in the network is significantly reduced, so training can be accelerated without much performance loss.
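For context, a matrix A satisfies the restricted isometry property of order s with constant δ if (1 − δ)‖x‖² ≤ ‖Ax‖² ≤ (1 + δ)‖x‖² for every s-sparse vector x. Below is a minimal sketch of the kind of layer the abstract describes: a fixed Gaussian random projection onto a low-dimensional subspace, followed by a standard trainable linear map and non-linear activation. This is an illustration of the general idea only, not the speaker's actual model; the class name RandomProjectionLayer, the choice of PyTorch, and all dimensions are assumptions for the example.

```python
import torch
import torch.nn as nn

class RandomProjectionLayer(nn.Module):
    """Hypothetical sketch: fixed random projection, then linear + activation."""

    def __init__(self, in_dim: int, proj_dim: int, out_dim: int):
        super().__init__()
        # Fixed (non-trainable) Gaussian random projection, scaled so that
        # norms are approximately preserved in expectation.
        proj = torch.randn(proj_dim, in_dim) / proj_dim ** 0.5
        self.register_buffer("proj", proj)
        # Trainable parameters act only on the proj_dim-dimensional projection,
        # so the parameter count scales with proj_dim rather than with in_dim.
        self.linear = nn.Linear(proj_dim, out_dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x @ self.proj.t()  # project onto the low-dimensional subspace
        return self.act(self.linear(x))

# Example: 10000-dimensional inputs projected down to 128 dimensions.
layer = RandomProjectionLayer(in_dim=10000, proj_dim=128, out_dim=64)
y = layer(torch.randn(32, 10000))
print(y.shape)  # torch.Size([32, 64])
```

The trainable linear map here has 128 × 64 weights instead of 10000 × 64, which illustrates how fixing the first stage as a random projection shrinks the number of learned parameters at each layer.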

