Sökning: "attractor neural network"

Showing results 1 - 5 of 10 theses containing the words attractor neural network.

  1. Hierarchical Clustering using Brain-like Recurrent Attractor Neural Networks

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Hannah Kühn; [2023]
    Keywords: Hierarchical Clustering; Attractor Network; Recurrent Neural Network; Brain-like computing;

    Abstract: Hierarchical clustering is a family of machine learning methods with many applications, among them data science and data mining. This thesis belongs to the research area of brain-like computing and introduces a novel approach to hierarchical clustering using a brain-like recurrent neural network. READ MORE
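    As a generic illustration of the "attractor neural network" idea behind these search results (not the clustering method of this particular thesis), the following Python sketch stores a few patterns in a Hopfield-style recurrent network with a Hebbian rule and recalls one of them from a corrupted cue. The network size, number of patterns, and synchronous update rule are assumptions chosen only for illustration.

```python
import numpy as np

# Minimal Hopfield-style attractor network: Hebbian storage and
# iterative recall. A generic sketch of the attractor idea only,
# not the method of the thesis listed above.

def train(patterns):
    """Hebbian outer-product learning; patterns are {-1, +1} vectors."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=20):
    """Synchronous updates until the state settles into an attractor."""
    for _ in range(steps):
        new_state = np.sign(W @ state)
        new_state[new_state == 0] = 1  # break ties toward +1
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

rng = np.random.default_rng(0)
stored = rng.choice([-1, 1], size=(3, 64))        # three random patterns
noisy = stored[0].copy()
flip = rng.choice(64, size=8, replace=False)      # corrupt 8 of 64 bits
noisy[flip] *= -1
W = train(stored)
print(np.array_equal(recall(W, noisy), stored[0]))  # usually True
```

    With only three stored patterns in 64 units the network is well below its storage capacity, so the corrupted cue typically settles back into the original pattern.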

  2. Role of Context in Episodic Memory: A Bayesian-Hebbian Neural Network Model of Episodic Recall

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Rohan Raj; [2022]
    Keywords: episodic memory; long-term memory; Bayesian Confidence Propagation Neural Network; synaptic plasticity; plasticity modulation; computational neuroscience;

    Abstract: Episodic memory forms a fundamental aspect of human memory that accounts for the storage of events as well as the spatio-temporal relations between events during a lifetime. These spatio-temporal relations in which episodes are embedded can be understood as their contexts. Contexts play a crucial role in episodic memory retrieval. READ MORE

  3. Effects of Network Size in a Recurrent Bayesian Confidence Propagating Neural Network With two Synaptic Traces

    Bachelor's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Authors: William Laius Lundgren; Ludwig Karlsson; [2021]
    Keywords: BCPNN; computational brain modelling; short-term memory; sequential learning; hypercolumns;

    Abstract: A modular Recurrent Bayesian Confidence Propagating Neural Network (BCPNN) with two synaptic time traces is a computational neural network that can serve as a model of biological short-term memory. The units in the network are grouped into modules called hypercolumns, within which there is a competitive winner-takes-all mechanism. READ MORE
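    The hypercolumn competition mentioned in this abstract can be sketched in a few lines: unit support values are grouped into hypercolumns and normalized with a softmax, giving a soft winner-takes-all within each module. The shapes and the softmax choice here are my own assumptions for illustration, not the thesis's actual BCPNN update equations.

```python
import numpy as np

# Soft winner-takes-all within hypercolumns: each module's units
# compete, and activities within a module sum to one. Illustrative
# only; shapes and the softmax choice are assumptions.

def hypercolumn_softmax(support, n_hypercolumns, units_per_hc):
    """Normalize unit support values within each hypercolumn."""
    s = support.reshape(n_hypercolumns, units_per_hc)
    s = s - s.max(axis=1, keepdims=True)       # numerical stability
    e = np.exp(s)
    return (e / e.sum(axis=1, keepdims=True)).reshape(-1)

support = np.random.default_rng(1).normal(size=3 * 4)  # 3 hypercolumns x 4 units
activity = hypercolumn_softmax(support, n_hypercolumns=3, units_per_hc=4)
print(activity.reshape(3, 4).sum(axis=1))      # each hypercolumn sums to 1
```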

  4. Modelling Immediate Serial Recall using a Bayesian Attractor Neural Network

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Julia Ericson; [2021]
    Keywords: Bayesian Confidence Propagating Neural Network; Phonological Loop; Computational model; Immediate serial recall;

    Abstract: In recent decades, computational models have become useful tools for studying biological neural networks. These models are typically constrained either by behavioural data from neuropsychological studies or by biological data from neuroscience. READ MORE

  5. Attractor Neural Network modelling of the Lifespan Retrieval Curve

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Patrícia Pereira; [2020]
    Keywords: reminiscence bump; attractor neural network; Bayesian Confidence Propagation Neural Network (BCPNN); recency; synaptic plasticity; episodic memory;

    Abstract: Human capability to recall episodic memories depends on how much time has passed since the memory was encoded. This dependency is described by a memory retrieval curve that reflects an interesting phenomenon referred to as the reminiscence bump: a tendency for older people to recall more memories formed during their young adulthood than in other periods of life. READ MORE