Search: "Gated Recurrent Units GRUs"

Found 1 thesis containing the words Gated Recurrent Units GRUs.

  1. Temporal Localization of Representations in Recurrent Neural Networks

    Master's thesis, Högskolan Dalarna / School of Information and Engineering

    Author: Asadullah Najam; [2023]
    Keywords: Recurrent Neural Networks (RNNs); Deep Learning; Time Series Prediction; Exploding Values; Gradient Decay; Long Short-Term Memory (LSTMs); Gated Recurrent Units (GRUs); Attention Mechanism; Moving Representations; Localizing Representations

    Abstract: Recurrent Neural Networks (RNNs) are pivotal in deep learning for time series prediction, but they suffer from 'exploding values' and 'gradient decay,' particularly when learning temporally distant interactions. Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU) have addressed these issues to an extent, but the precise mitigating mechanisms remain unclear.
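The 'gradient decay' and 'exploding values' the abstract mentions can be sketched with a toy calculation (not from the thesis itself): in a simplified linear RNN with scalar recurrent weight w, the gradient flowing back through T unrolled steps scales like w**T, so |w| < 1 shrinks it toward zero and |w| > 1 blows it up. The helper below is a hypothetical illustration of that scaling.

```python
def backprop_factor(w: float, steps: int) -> float:
    """Magnitude of the gradient contribution after `steps` unrolled steps
    of a toy linear RNN with scalar recurrent weight `w` (illustrative only)."""
    factor = 1.0
    for _ in range(steps):
        factor *= w  # each backward step multiplies the gradient by w
    return factor

# |w| < 1: the gradient decays to numerical insignificance over 50 steps.
decay = backprop_factor(0.5, 50)    # on the order of 1e-15
# |w| > 1: the gradient explodes over the same horizon.
explode = backprop_factor(1.5, 50)  # on the order of 1e8
```

Gating mechanisms in LSTMs and GRUs mitigate this by letting the network learn paths along which the effective multiplier stays close to 1, which is the behavior the thesis investigates.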