Search: "redundancy in neural networks"

Showing results 1 - 5 of 9 theses containing the words redundancy in neural networks.

  1. Visual Attention Guided Adaptive Quantization for x265 using Deep Learning

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Mikaela Gärde; [2023]
    Keywords: video encoding; deep learning; visual attention; adaptive quantization;

    Abstract: Video-on-demand streaming is rising drastically in popularity, bringing new challenges to the video coding field. There is a need for new video coding techniques that improve performance and reduce bitrates.

  2. More efficient training using equivariant neural networks

    Professional degree thesis (advanced level), Uppsala universitet/Avdelningen Vi3

    Author: Karl Bylander; [2023]
    Keywords: convolutional neural networks; equivariance; equivariant neural networks; transmission electron microscopy; machine learning;

    Abstract: Convolutional neural networks are equivariant to translations; equivariance to other symmetries, however, is not defined, and the class output may vary depending on the input's orientation. To mitigate this, the training data can be augmented, at the cost of increased redundancy in the model.
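    A minimal, hypothetical sketch (not from the thesis) of the trade-off this abstract describes: augmenting a training set with rotated copies restores orientation coverage, but multiplies the amount of data the model has to absorb. The helper `augment_with_rotations` and the toy data are illustrative assumptions.

    ```python
    import numpy as np

    def augment_with_rotations(images):
        """Return each image plus its 90/180/270-degree rotations."""
        out = []
        for img in images:
            for k in range(4):                  # k * 90 degrees
                out.append(np.rot90(img, k))
        return np.stack(out)

    images = np.random.rand(10, 32, 32)         # 10 toy grayscale images
    augmented = augment_with_rotations(images)
    print(images.shape, "->", augmented.shape)  # (10, 32, 32) -> (40, 32, 32)
    ```

    An equivariant network builds the symmetry into its layers instead, avoiding this four-fold blow-up of the training data.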

  3. Using Reinforcement Learning to Correct Soft Errors of Deep Neural Networks

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Yuhang Li; [2023]
    Keywords: DNN; soft errors; redundancy; DRL; DQN; transfer learning; training time;

    Abstract: Deep Neural Networks (DNNs) are becoming increasingly important in many aspects of human life, particularly in safety-critical areas such as autonomous driving and aerospace systems. However, soft errors, including bit flips, can significantly degrade the performance of these systems, with serious consequences.
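    A hypothetical sketch (not from the thesis, which studies RL-based correction) of what a single soft error can do: flipping one bit in the IEEE-754 encoding of a float32 weight may barely change it or blow it up entirely, depending on which bit is hit. `flip_bit` is an illustrative helper.

    ```python
    import struct

    def flip_bit(value, bit):
        """Flip one bit (0..31) of a float32 and return the corrupted value."""
        (as_int,) = struct.unpack("<I", struct.pack("<f", value))
        (corrupt,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
        return corrupt

    w = 0.5
    print(flip_bit(w, 3))    # low mantissa bit: ~0.5000005, negligible
    print(flip_bit(w, 30))   # exponent bit: ~1.7e38, catastrophic
    ```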

  4. Distillation or loss of information? The effects of distillation on model redundancy

    Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Eva Elzbieta Sventickaite; [2022]
    Keywords: distillation; distillation effects; distilbert; distilmbert; distilroberta; distilgpt-2; distilled neurons; redundancy; redundancy in neural networks; redundancy in language models; neuron reduction in language models; distilled language models;

    Abstract: The necessity of billions of parameters in large language models has lately been questioned, as there are still open questions about how information is captured in the networks. It could be argued that without this knowledge, there may be a tendency to overparameterize the models.

  5. Non-Destructive Biomass and Relative Growth Rate Estimation in Aeroponic Agriculture using Machine Learning

    Master's thesis, Lunds universitet/Matematik LTH

    Author: Oskar Åström; [2022]
    Keywords: Machine Learning; Image Analysis; Aeroponics; Hydroculture; Relative Growth Rate; Multi-variate Regression; Neural Network; ResNet; Plant Growth; Plant Physiology; Technology and Engineering;

    Abstract: Optimising plant growth in a controlled climate requires good measurements of both biomass (measured in grams) and relative growth rate (measured in grams of growth per day, per gram of plant). To do this efficiently and continuously at the individual-plant level during development, measurements must be non-destructive, avoiding frequent, labor-intensive weighing of plant biomass.
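    The units in this abstract (grams of growth per day, per gram of plant) match the classical relative growth rate definition, RGR = (ln W2 - ln W1) / (t2 - t1). Assuming the thesis uses this standard formula, a minimal sketch:

    ```python
    import math

    def relative_growth_rate(w1, w2, t1, t2):
        """RGR between two biomass estimates w1, w2 (grams) at days t1, t2."""
        return (math.log(w2) - math.log(w1)) / (t2 - t1)

    # e.g. a plant estimated at 2.0 g on day 0 and 2.6 g on day 3
    print(relative_growth_rate(w1=2.0, w2=2.6, t1=0, t2=3))  # ~0.087 g/g/day
    ```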