Performance Analysis of Various Activation Functions Using LSTM Neural Network For Movie Recommendation Systems

This is a Bachelor's thesis from KTH/School of Electrical Engineering and Computer Science (EECS)

Authors: Philip Song; André Brogärd [2020]


Abstract: Recommendation systems have grown in importance and popularity across many different areas. This thesis focuses on recommendation systems for movies. Recurrent neural networks using LSTM blocks have shown some success for movie recommendation systems. Research has indicated that changing the activation functions in LSTM blocks can improve performance, measured as prediction accuracy. In this study we compare four activation functions (hyperbolic tangent, sigmoid, ELU and SELU) used in LSTM blocks, and how they affect the prediction accuracy of the neural network. Specifically, they are applied to the block input and the block output of the LSTM blocks. Our results indicate that the hyperbolic tangent, which is the default, and the sigmoid function perform about the same, whereas the ELU and SELU functions perform worse. Further research is needed to identify other activation functions that could improve prediction accuracy, and to address certain aspects of our methodology.
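A minimal sketch (not the authors' implementation) of how such a comparison could be set up in Keras: in tf.keras.layers.LSTM, the activation argument is applied to the block input (candidate cell state) and the block output, while recurrent_activation controls the gates. The layer sizes, vocabulary size, and next-movie prediction framing below are illustrative assumptions, not details taken from the thesis.

```python
# Sketch: compare LSTM block input/output activations for a next-movie
# prediction model. The `activation` argument of tf.keras.layers.LSTM is
# applied to the block input and block output; `recurrent_activation`
# (left at its default, sigmoid) controls the gates.
import tensorflow as tf

def build_model(activation: str, num_movies: int = 10000) -> tf.keras.Model:
    """Build a simple sequence model with a configurable LSTM activation."""
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(num_movies, 64),          # movie-ID embeddings (sizes assumed)
        tf.keras.layers.LSTM(128, activation=activation),    # 'tanh', 'sigmoid', 'elu' or 'selu'
        tf.keras.layers.Dense(num_movies, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

# The four activation functions studied in the thesis.
for act in ["tanh", "sigmoid", "elu", "selu"]:
    model = build_model(act)
    # model.fit(train_sequences, next_movie_ids, validation_split=0.1, epochs=5)
```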
