Exploring the LASSO as a Pruning Method

This is a Bachelor's thesis from Lund University / Computational Biology and Biological Physics (undergoing reorganisation)

Abstract: In this study, the efficiency of various pruning algorithms was investigated, with an emphasis on regularization methods. Pruning is a method that aims to remove redundant components, such as weights, neurons or filters, from a neural network. In particular, the study covered the LASSO (Least Absolute Shrinkage and Selection Operator) and extensions derived from it, which were compared with other methods, including optimal brain damage and the elastic net. The methods were first implemented for MLPs and then extended to CNNs with some alterations for increased computational efficiency. Pruning was applied at the level of weights, neurons and filters. It was concluded that the LASSO tends to yield superior sparsity at the level of weights, but that the group LASSO's ability to select variables simultaneously is a worthwhile addition; optimal results can be obtained by combining both penalties when regularizing the cost function.
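As a rough illustration of the regularization approach the abstract describes, the sketch below (not taken from the thesis) adds a LASSO (L1) penalty on individual weights and a group-LASSO penalty over the rows of each weight matrix (one group per output neuron) to a standard cross-entropy cost in PyTorch; the model architecture and the penalty strengths l1_strength and group_strength are hypothetical choices for the example.

```python
# Minimal sketch (assumed setup, not the thesis code): LASSO + group LASSO
# added to the cost function of a small MLP, using PyTorch.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
criterion = nn.CrossEntropyLoss()
l1_strength, group_strength = 1e-4, 1e-3  # hypothetical penalty strengths

def regularized_loss(inputs, targets):
    base = criterion(model(inputs), targets)
    weights = [p for name, p in model.named_parameters() if "weight" in name]
    # LASSO: L1 penalty on every individual weight, driving single weights to zero.
    l1 = sum(w.abs().sum() for w in weights)
    # Group LASSO: L2 norm per weight-matrix row (one group per output neuron),
    # so entire neurons can be pruned together.
    group = sum(w.norm(dim=1).sum() for w in weights)
    return base + l1_strength * l1 + group_strength * group

# Example forward/backward pass on dummy data.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
loss = regularized_loss(x, y)
loss.backward()
```

Combining the two terms, as the conclusion suggests, gives sparsity both within and across neurons; after training, weights or whole rows whose magnitudes fall below a threshold can be removed.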
