Search: "GPT-2"

Showing results 1 - 5 of 16 theses containing the word GPT-2.

  1. Recommendation of Text Properties for Short Texts with the Use of Machine Learning : A Comparative Study of State-of-the-Art Techniques Including BERT and GPT-2

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Luciano Zapata; [2023]
    Keywords: Text classification; Short texts; Deep Learning; BERT; GPT; GPT-2; Transformers; Natural Language Processing;

    Abstract: Text mining has gained considerable attention due to the extensive usage of electronic documents. The significant increase in electronic document usage has created a necessity to process and analyze them effectively.

  2. A Preliminary Observation: Can One Linguistic Feature Be the Deterministic Factor for More Accurate Fake News Detection?

    Master's thesis, Uppsala University/Department of Linguistics and Philology

    Author: Yini Chen; [2023]
    Keywords: Fake news detection; Generative models;

    Abstract: This study inspected three linguistic features, specifically the percentage of nouns per sentence, the percentage of verbs per sentence, and the mean dependency distance of the sentence, and observed their respective influence on fake news classification accuracy. In contrast to previous studies, where linguistic features are combined into a set and leveraged together, this study attempted to untangle the effective individual features from the previously proposed optimal sets.

  3. Fine-Tuning Pre-Trained Language Models for CEFR-Level and Keyword Conditioned Text Generation : A comparison between Google’s T5 and OpenAI’s GPT-2

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Quintus Roos; [2022]
    Keywords: Transformer-based Pre-trained Language Models; Natural Language Processing; Natural Language Generation; Conditional Text Generation; Text Classification; Fine-tuning; Instruction tuning; English Language Learning;

    Abstract: This thesis investigates the possibilities of conditionally generating English sentences based on keywords framing the content and on different difficulty levels of vocabulary. It aims to contribute to the field of Conditional Text Generation (CTG), a type of Natural Language Generation (NLG) in which the process of creating text is based on a set of conditions.

  4. Evaluation of generative machine learning models : Judging the quality of generated data with the use of neural networks

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Sam Yousefzadegan Hedin; [2022]
    Keywords: Generative Modeling; MAUVE; Deep Learning; GPT-2; Evaluation;

    Abstract: Generative machine learning models are capable of generating remarkably realistic samples. Some models generate images that look entirely natural, and others generate text that reads as if a human wrote it. However, judging the quality of these models is a major challenge.

  5. Towards a Language Model for Stenography : A Proof of Concept

    Master's thesis, Uppsala University/Department of Linguistics and Philology

    Author: Naomi Johanna Langstraat; [2022]
    Keywords: stenography; language model; GPT-2; GPT; grapheme-to-phoneme; G2P; low-resource; perplexity; compression;

    Abstract: The availability of the stenographic manuscripts of Astrid Lindgren has sparked an interest in the creation of a language model for stenography. By its very nature, stenography is low-resource, and the lack of available data requires a tool for making use of ordinary data.