Cellular Automata as Synthetic Training Data: Exploring behavioural patterns of neural networks

This is a Master's thesis from Uppsala universitet / Institutionen för informationsteknologi (Uppsala University, Department of Information Technology)

Author: Onur Yüksel; [2023]


Abstract: At the height of the third AI spring, with interest in AI research and industrial applications booming, it is worth taking a step back to re-explore the basics. In the spirit of data-centric AI, we discuss the use of Cellular Automata (CA) as a source of synthetic training data and explore how properties of CA rules relate to learning. We present three experiments in which neural networks (NNs) are trained on one-dimensional elementary CA. First, we explore how well a one-hidden-layer NN learns CA rules. Second, we examine how a rule's lambda property relates to the network's uncertainty. Finally, we explore how the CA rule itself affects the network's uncertainty, in relation to the number of parameters in the network and the number of training data points. While the results are qualitative and require elaboration, the findings of the last experiment can be summarized as three behavioural patterns of one-hidden-layer networks: (I) networks whose uncertainty converges to zero; (II) networks that plateau at non-zero uncertainty values with low variance across the experiment's parameter space; (III) networks that exhibit high uncertainty variance.
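As background for the setup described above, the following is a minimal sketch (not taken from the thesis; function names are illustrative) of how a Wolfram elementary CA rule yields (neighborhood, next-state) training pairs, and how the lambda property, Langton's lambda, can be computed as the fraction of neighborhoods mapping to a non-quiescent state:

```python
import numpy as np

def rule_table(rule_number):
    """Output bit for each of the 8 three-cell neighborhoods (index 0..7)
    of a Wolfram elementary CA rule, read from the rule's binary code."""
    return [(rule_number >> n) & 1 for n in range(8)]

def training_pairs(rule_number):
    """All 8 (neighborhood, next-state) pairs for one elementary rule --
    the kind of synthetic data a small network can be trained on."""
    table = rule_table(rule_number)
    # Each neighborhood n encodes three cells (left, center, right) as bits.
    X = np.array([[(n >> 2) & 1, (n >> 1) & 1, n & 1] for n in range(8)])
    y = np.array([table[n] for n in range(8)])
    return X, y

def langton_lambda(rule_number):
    """Langton's lambda: fraction of neighborhoods mapped to the
    non-quiescent state (here, state 1)."""
    return sum(rule_table(rule_number)) / 8

# Example: rule 110 has five 1-bits in its code, so lambda = 5/8.
X, y = training_pairs(110)
print(langton_lambda(110))  # -> 0.625
```

This only sketches the data-generation side; the thesis's network architectures, training procedure, and uncertainty measure are not reproduced here.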
