Search: "Weight Initialization"
Showing results 1 - 5 of 9 theses containing the words Weight Initialization.
1. Investigating Relations between Regularization and Weight Initialization in Artificial Neural Networks
Bachelor's thesis, Lunds universitet/Beräkningsbiologi och biologisk fysik. Abstract: L2 regularization is a common method used to prevent overtraining in artificial neural networks. However, the regularization strength has to be properly adjusted for the method to work as intended, and this value is usually found by trial and error, which can take considerable time, especially for larger networks.
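The L2 penalty the abstract refers to can be illustrated with a minimal sketch (not code from the thesis; the function name, the example weights, and the strength `lam = 0.01` are illustrative assumptions):

```python
import numpy as np

def l2_regularized_loss(data_loss, weights, lam):
    """Add an L2 penalty, lam * sum of squared weights, to a base loss.

    `lam` is the regularization strength that, as the abstract notes,
    usually has to be tuned by trial and error.
    """
    penalty = lam * sum(np.sum(w ** 2) for w in weights)
    return data_loss + penalty

# Hypothetical usage: two weight arrays of all ones, lam = 0.01.
# Sum of squares is 4 + 3 = 7, so the penalty adds 0.07 to the loss.
weights = [np.ones((2, 2)), np.ones((3,))]
loss = l2_regularized_loss(1.0, weights, 0.01)  # 1.07
```

Larger `lam` shrinks the weights more aggressively; too large and the network underfits, which is why the strength must be tuned per problem.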
2. Self-Supervised Transformer Networks for Error Classification of Tightening Traces
Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Transformers have shown remarkable results in the domains of Natural Language Processing and Computer Vision. This naturally raises the question of whether their success can be replicated in other domains.
3. Decoding Electrocorticography Signals by Deep Learning for Brain-Computer Interface
Master's thesis, KTH/Skolan för kemi, bioteknologi och hälsa (CBH). Abstract: A Brain-Computer Interface (BCI) offers paralyzed patients the opportunity to control their movements without any neuromuscular activity. Signal processing of neuronal activity enables the decoding of movement intentions, and a patient's ability to control an effector is closely linked to this decoding performance.
4. Lifetime estimation of lithium-ion batteries for stationary energy storage system
Master's thesis, KTH/Skolan för kemivetenskap (CHE). Abstract: With the continuing transition to renewable, inherently intermittent energy sources such as solar and wind power, electrical energy storage will become progressively more important for managing energy production and demand. A key technology in this area is the Li-ion battery.
5. A new scheme for training ReLU-based multi-layer feedforward neural networks
Master's thesis, KTH/Skolan för datavetenskap och kommunikation (CSC). Abstract: A new scheme for training Rectified Linear Unit (ReLU) based feedforward neural networks is examined in this thesis. The project starts from the row-by-row updating strategy designed for Single-hidden Layer Feedforward neural Networks (SLFNs).
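The thesis's own row-by-row scheme is not reproduced here, but the standard baseline it departs from, a ReLU feedforward layer with He (Kaiming) initialization, the usual weight-initialization choice for ReLU networks, can be sketched as follows (layer sizes and seeds are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def he_init(fan_in, fan_out):
    """He initialization: weights drawn from N(0, 2 / fan_in),
    which keeps activation variance roughly constant through ReLU layers."""
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def relu(x):
    """Rectified Linear Unit: elementwise max(0, x)."""
    return np.maximum(0.0, x)

# Forward pass of one hidden layer on a dummy batch of 5 inputs with 4 features
W = he_init(4, 8)
x = rng.normal(size=(5, 4))
h = relu(x @ W)  # hidden activations, all non-negative by construction
```

Variance-aware initialization of this kind is the conventional counterpart to the alternative training scheme the thesis investigates.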