Search: "BERT encoding"

Found 4 theses containing the words BERT encoding.

  1. Object Classification using Language Models

    Thesis for professional degree at advanced level, Uppsala universitet/Signaler och system

    Author: Gustav From; [2022]
    Keywords: Classifier; BERT; machine learning; ML; language model; IMDB; word2Vec; doc2Vec; NLP;

    Abstract: In today's digital world, ever more emails and messages must be sent, processed, and handled. Categorizing and classifying these texts manually takes a long time and costs companies significant time and money.

  2. Chinese Zero Pronoun Resolution with Neural Networks

    Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Yifan Yang; [2022]
    Keywords: zero pronoun; zero pronoun resolution; Chinese zero pronoun; machine translation; neural network;

    Abstract: In this thesis, I explored several neural-network-based models to resolve zero pronouns in Chinese-English translation tasks. I reviewed previous work that frames resolution as a classification task, such as determining whether a candidate in a given set is the antecedent of a zero pronoun; these approaches can be categorized as rule-based and supervised methods. A minimal sketch of this pairwise framing follows below.
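    To make the classification framing concrete, here is a minimal, hypothetical sketch of pairwise candidate scoring with a BERT-style encoder: each (zero-pronoun context, candidate antecedent) pair is encoded and a binary classifier decides whether the candidate is the antecedent. The encoder choice, the linear classifier, and the name score_candidate are illustrative assumptions, not the thesis's actual architecture.

```python
# Hypothetical sketch: zero-pronoun antecedent resolution as pairwise
# classification. The classifier is untrained here; in practice it would
# be fine-tuned on labeled (context, candidate) pairs.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
encoder = AutoModel.from_pretrained("bert-base-chinese")
classifier = torch.nn.Linear(encoder.config.hidden_size, 2)

def score_candidate(context: str, candidate: str) -> float:
    # Encode the zero-pronoun context and a candidate antecedent as a
    # sentence pair; the [CLS] vector summarizes their compatibility.
    inputs = tokenizer(context, candidate, return_tensors="pt", truncation=True)
    with torch.no_grad():
        cls = encoder(**inputs).last_hidden_state[:, 0]
    return classifier(cls).softmax(-1)[0, 1].item()  # P(candidate is antecedent)

# Choose the best candidate from a given set, as in the reviewed work.
context = "张三买了一本书，(φ)很有趣。"  # (φ) marks the zero pronoun
candidates = ["张三", "一本书"]
print(max(candidates, key=lambda c: score_candidate(context, c)))
```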

  3. A retrieval-based chatbot's opinion on the trolley problem

    Bachelor's thesis,

    Authors: Hampus Björklin; Tim Abrahamsson; Oscar Widenfalk; [2021]
    Keywords: chatbot; language model; trolley problem; BERT encoding; Discord;

    Abstract: The goal of this project was to create a chatbot capable of debating a user with limited resources, including a discussion thread from the online debate forum Kialo. A retrieval-based bot was designed, and the discussion thread was converted into a database from which the bot could select an appropriate answer. A minimal sketch of such retrieval over BERT encodings follows below.
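    As a rough illustration of the retrieval idea (the keywords above mention BERT encoding), the following hypothetical sketch encodes candidate answers once and, at query time, returns the answer whose embedding is most cosine-similar to the user's message. Mean pooling and all names here are assumptions for illustration, not the project's actual design.

```python
# Hypothetical sketch: retrieval-based answer selection over BERT encodings.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    # Mean-pool token vectors into one sentence vector (a common choice).
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, n_tokens, 768)
    return hidden.mean(dim=1).squeeze(0)

# "Database": pre-encoded candidate answers, e.g. mined from a Kialo thread.
answers = [
    "Pulling the lever minimizes the total harm done.",
    "Redirecting the trolley makes you actively responsible for a death.",
]
answer_vecs = torch.stack([embed(a) for a in answers])

def reply(user_message: str) -> str:
    # Return the stored answer closest to the user's message.
    sims = torch.nn.functional.cosine_similarity(
        answer_vecs, embed(user_message).unsqueeze(0))
    return answers[sims.argmax().item()]

print(reply("Should you pull the lever?"))
```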

  4. Using Bidirectional Encoder Representations from Transformers for Conversational Machine Comprehension

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Evangelina Gogoulou; [2019]
    Keywords: conversational machine comprehension; question answering; transformers; self-attention; language modelling;

    Abstract: Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language representation model, designed to pre-train deep bidirectional representations with the goal of extracting context-sensitive features from an input text [1]. One of the challenging problems in Natural Language Processing is Conversational Machine Comprehension (CMC). A minimal sketch of contextual feature extraction with BERT follows below.
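    To make "context-sensitive features" concrete, here is a minimal, generic sketch of extracting per-token BERT representations with the Hugging Face transformers library; the same word receives different vectors in different contexts. This is a standard usage example under assumed model names, not the thesis's CMC architecture.

```python
# Sketch: the same word gets different BERT vectors in different contexts.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def token_vector(sentence: str, word: str) -> torch.Tensor:
    # Return the contextual hidden state of the first occurrence of `word`.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (n_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

vec_a = token_vector("The bank approved the loan.", "bank")
vec_b = token_vector("They sat on the bank of the river.", "bank")
sim = torch.nn.functional.cosine_similarity(vec_a, vec_b, dim=0)
print(f"cosine similarity: {sim.item():.3f}")  # well below 1.0: context matters
```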