Search: "BERT encoding"
Found 4 theses containing the words BERT encoding.
1. Object Classification using Language Models
Thesis for a professional degree (advanced level), Uppsala universitet/Signaler och system. Abstract: In today's digital world, an ever-growing number of emails and messages must be sent, processed, and handled. Categorizing and classifying these texts manually takes a long time and costs companies considerable time and money.
2. Chinese Zero Pronoun Resolution with Neural Networks
Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi. Abstract: In this thesis, I explored several neural-network-based models for resolving zero pronouns in Chinese-English translation tasks. I reviewed previous work that treats resolution as a classification task (for example, determining whether a candidate in a given set is the antecedent of a zero pronoun), which can be categorized into rule-based and supervised methods.
3. A retrieval-based chatbot's opinion on the trolley problem
Bachelor's thesis. Abstract: The goal of this project was to create a chatbot capable of debating a user with limited resources, including a discussion thread from the online debate forum Kialo. A retrieval-based bot was designed, and the discussion thread was converted into a database that the bot could interpret and choose an appropriate answer from.
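The retrieval approach described in this abstract can be sketched with a simple bag-of-words similarity search. The database entries, tokenization, and scoring below are illustrative assumptions, not the thesis's actual implementation.

```python
from collections import Counter
import math

def bow(text):
    """Lowercased bag-of-words counts for a piece of text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical database: each entry pairs a claim from the discussion
# thread with a stored reply the bot can return.
database = [
    ("the trolley problem is unsolvable",
     "Many ethicists disagree; utilitarians argue for minimizing harm."),
    ("pulling the lever is murder",
     "Inaction also has moral weight when five lives are at stake."),
]

def choose_answer(user_input):
    """Retrieve the reply whose claim is most similar to the user's input."""
    query = bow(user_input)
    best = max(database, key=lambda entry: cosine(query, bow(entry[0])))
    return best[1]

print(choose_answer("Is pulling the lever murder?"))
```

A real system would use stronger text representations (e.g. TF-IDF or sentence embeddings), but the retrieve-by-similarity structure is the same.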
4. Using Bidirectional Encoder Representations from Transformers for Conversational Machine Comprehension
Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language representation model, designed to pre-train deep bidirectional representations with the goal of extracting context-sensitive features from an input text [1]. One of the challenging problems in the field of Natural Language Processing is Conversational Machine Comprehension (CMC).
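The defining property of the bidirectional representations this abstract describes is that the same word receives different vectors in different contexts, because the encoder reads both left and right neighbors. The toy encoder below (hash-derived word vectors mixed with neighbor vectors) is only a minimal illustration of that idea, not BERT itself; in practice one would load a pre-trained model, e.g. via the Hugging Face transformers library.

```python
import hashlib

DIM = 8  # toy embedding dimensionality

def word_vec(word):
    """Deterministic pseudo-embedding derived from a hash of the word."""
    digest = hashlib.sha256(word.lower().encode()).digest()
    return [b / 255.0 for b in digest[:DIM]]

def contextual_vecs(tokens):
    """Toy bidirectional encoding: each token's vector is mixed with the
    vectors of its left AND right neighbors, so identical words end up
    with different representations in different contexts."""
    out = []
    for i, tok in enumerate(tokens):
        vec = word_vec(tok)
        if i > 0:  # left context
            vec = [v + 0.5 * x for v, x in zip(vec, word_vec(tokens[i - 1]))]
        if i < len(tokens) - 1:  # right context
            vec = [v + 0.5 * x for v, x in zip(vec, word_vec(tokens[i + 1]))]
        out.append(vec)
    return out

# "bank" is token index 1 in both sentences, but its surrounding context
# differs, so the two context-sensitive vectors are not equal.
a = contextual_vecs("the bank of the river".split())
b = contextual_vecs("the bank approved the loan".split())
print(a[1] != b[1])
```

Real BERT achieves this context sensitivity with stacked self-attention layers rather than neighbor averaging, but the contrast with static (context-free) word vectors is the same.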