Applying Dropout to Prevent Shallow Neural Networks from Overtraining

This is a Bachelor's thesis from Lund University / Computational Biology and Biological Physics - Undergoing reorganization

Author: Denhanh Huynh; [2017]

Keywords: Physics and Astronomy;

Abstract: Artificial neural networks are machine learning systems inspired by the neural networks of the human brain. A problem that has to be overcome for neural networks is overtraining, which means that the network performs well on the data used to train it but does not make good predictions on new data. One branch of artificial neural networks, called deep neural networks, uses many hidden layers of neurons to produce state-of-the-art results on a wide variety of problems. Because of the size of these networks, training requires substantial computation, and some methods for dealing with overtraining that are available for shallow neural networks, with only a few hidden layers, become impractical. Dropout is a recently developed method that reduces overtraining without being too computationally demanding for deep neural networks. In this project, however, dropout is applied to shallow neural networks, and this thesis shows that dropout is a good way to reduce overtraining in shallow neural networks on a variety of classification problems.
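To illustrate the technique the abstract describes, here is a minimal NumPy sketch of "inverted" dropout applied to the hidden layer of a shallow network. The layer sizes, drop probability, and function names are illustrative assumptions, not taken from the thesis itself: each hidden unit is zeroed with probability `p_drop` during training and the survivors are rescaled so that expected activations match, leaving the test-time forward pass unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_drop, training=True):
    """Inverted dropout: during training, zero each unit with probability
    p_drop and rescale survivors by 1/(1 - p_drop) so the expected
    activation is preserved; at test time, return activations unchanged."""
    if not training or p_drop == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p_drop  # True = unit kept
    return activations * mask / (1.0 - p_drop)

# One hidden layer of a shallow network (sizes chosen for illustration).
x = rng.normal(size=(4, 10))          # batch of 4 inputs, 10 features each
W = rng.normal(size=(10, 32)) * 0.1   # hidden-layer weights
h = np.tanh(x @ W)                    # hidden activations

h_train = dropout(h, p_drop=0.5)                  # noisy, used while training
h_test = dropout(h, p_drop=0.5, training=False)   # deterministic at test time
```

Because each training pass sees a different random sub-network, hidden units cannot rely on specific co-adapted partners, which is the mechanism by which dropout reduces overtraining.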
