Advanced search

Found 2 theses matching the above search criteria.

  1. Fine-Tuning Pre-Trained Language Models for CEFR-Level and Keyword Conditioned Text Generation: A comparison between Google’s T5 and OpenAI’s GPT-2

    Master’s thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Quintus Roos; [2022]
    Keywords: Transformer-based Pre-trained Language Models; Natural Language Processing; Natural Language Generation; Conditional Text Generation; Text Classification; Fine-tuning; Instruction Tuning; English Language Learning

    Abstract: This thesis investigates the possibilities of conditionally generating English sentences based on keywords framing the content and on different difficulty levels of vocabulary. It aims to contribute to the field of Conditional Text Generation (CTG), a type of Natural Language Generation (NLG) in which the process of creating text is based on a set of conditions.
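
    A minimal sketch of what keyword- and level-conditioned generation can look like in practice, using the Hugging Face transformers library with an off-the-shelf GPT-2 model. The plain-text prefix format ("level: ... | keywords: ...") is an illustrative assumption, not necessarily the conditioning scheme used in the thesis, which fine-tunes T5 and GPT-2 on conditioned data before generating.

```python
# Illustrative sketch only: conditioning GPT-2 on a CEFR level and keywords
# via a plain-text prefix. The prefix convention is a hypothetical example;
# a real setup would first fine-tune the model on (prefix, sentence) pairs.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Encode the conditions as a textual prefix; the model continues from it.
prefix = "level: B1 | keywords: airport, luggage, delay ->"
inputs = tokenizer(prefix, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```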

  2. Comparing Catastrophic Interference between Incremental Moment Matching-Mean and Hard Attention to the Task

    Bachelor’s thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Authors: Quintus Roos; William Lilliesköld; [2020]
    Keywords:

    Abstract: When a neural network trained on data to solve one problem is subsequently trained on new data to solve another problem, it tends to forget what it previously knew that made it able to solve the first problem. This phenomenon is called Catastrophic Interference.
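
    A minimal toy sketch of the catastrophic interference phenomenon described above, using PyTorch and synthetic data rather than the thesis’s actual tasks or the IMM-Mean and Hard Attention to the Task methods it compares: a small network is trained on task A, then on task B, and its accuracy on task A is measured before and after the sequential training.

```python
# Illustrative sketch only: demonstrating catastrophic interference on two
# synthetic two-class tasks. Accuracy on task A typically drops sharply
# after the network is trained on task B.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Two-class toy task: Gaussian blobs centred at +shift and -shift.
    x = torch.cat([torch.randn(200, 2) + shift, torch.randn(200, 2) - shift])
    y = torch.cat([torch.zeros(200, dtype=torch.long),
                   torch.ones(200, dtype=torch.long)])
    return x, y

def train(model, x, y, steps=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))

xa, ya = make_task(torch.tensor([3.0, 0.0]))  # task A
xb, yb = make_task(torch.tensor([0.0, 3.0]))  # task B, different boundary

train(model, xa, ya)
print("Task A accuracy after training on A:", accuracy(model, xa, ya))

train(model, xb, yb)  # sequential training on task B overwrites task A
print("Task A accuracy after training on B:", accuracy(model, xa, ya))
```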