Search: "Bidirectional Encoder Representation from Transformers BERT"
Showing results 1 - 5 of 7 theses containing the words Bidirectional Encoder Representation from Transformers BERT.
1. Classifying personal data on contextual information
Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS). Abstract: In this thesis, a novel approach to classifying personal data is tested. Previous personal data classification models read the personal data before classifying it. This thesis instead investigates an approach that classifies personal data from contextual information frequently available in data sets.
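A minimal sketch of the general idea, assuming scikit-learn: predict whether a field contains personal data from contextual metadata (here, column names) rather than from the stored values themselves. The column names and labels below are invented for illustration and are not from the thesis.

```python
# Sketch: classify fields as personal/non-personal from column names only.
# Assumptions: scikit-learn installed; training examples are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

column_names = ["email_address", "invoice_total", "phone_number",
                "product_sku", "date_of_birth", "warehouse_id"]
is_personal = [1, 0, 1, 0, 1, 0]  # 1 = personal data, 0 = not personal data

# Character n-grams over the column name serve as the contextual signal.
clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
clf.fit(column_names, is_personal)

print(clf.predict(["customer_email", "shipment_weight"]))
```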
2. Help Document Recommendation System
Master's thesis, Malmö University/Faculty of Technology and Society (TS). Abstract: Help documents are important for an organization's use of technology applications licensed from a vendor. Customers and internal employees frequently interact with the help document section to learn how to use the applications and to keep up with new features and developments.
3. AI Second that Emotion - Using Natural Language Processing to Study the Impact of Non-Stereotyped Video Advertising on Consumers’ Emotions & Online Consumer Engagement
Master's thesis, University of Gothenburg/Graduate School. Abstract: This paper aims to provide a deeper understanding of emotional and online-engagement behavioral responses to non-stereotyped gender role depictions in video advertisements. Consumer responses to two video ads by the well-known brands Gillette and Always that portray non-stereotyped gender roles were analyzed.
4. Bidirectional Encoder Representations from Transformers (BERT) for Question Answering in the Telecom Domain: Adapting a BERT-like language model to the telecom domain using the ELECTRA pre-training approach
Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS). Abstract: The Natural Language Processing (NLP) research area has seen notable advancements in recent years, one being the ELECTRA model, which improves the sample efficiency of BERT pre-training by introducing a discriminative pre-training approach. Most publicly available language models are trained on general-domain datasets.
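A minimal sketch of ELECTRA's discriminative pre-training objective, assuming the Hugging Face transformers and torch libraries: instead of predicting masked tokens, a discriminator flags which tokens in a corrupted sentence were replaced. The public checkpoint "google/electra-small-discriminator" is a stand-in here, not the thesis's telecom-adapted model.

```python
# Sketch: replaced-token detection with a pre-trained ELECTRA discriminator.
# Assumptions: transformers and torch installed; public checkpoint used as stand-in.
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

model_name = "google/electra-small-discriminator"
discriminator = ElectraForPreTraining.from_pretrained(model_name)
tokenizer = ElectraTokenizerFast.from_pretrained(model_name)

# "jumps" has been corrupted to "eats"; the discriminator should flag it as replaced.
corrupted = "the quick brown fox eats over the lazy dog"
inputs = tokenizer(corrupted, return_tensors="pt")

with torch.no_grad():
    logits = discriminator(**inputs).logits[0]  # one replacement score per token

for token, score in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), logits):
    print(f"{token:>10}  {'REPLACED' if score > 0 else 'original'}")
```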
5. Coronavirus public sentiment analysis with BERT deep learning
Bachelor's thesis, Dalarna University/Informatics. Abstract: Microblogs have become a central platform where people express their thoughts and opinions on public events in China. With the sudden outbreak of the coronavirus, coronavirus-related posts are typically followed by an immediate burst in microblog volume, which provides a great opportunity to explore public sentiment about these events.
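A minimal sketch of BERT-based sentiment scoring, assuming the Hugging Face transformers library. The public multilingual checkpoint "nlptown/bert-base-multilingual-uncased-sentiment" is used as a stand-in for the thesis's microblog model, and the example posts are invented.

```python
# Sketch: score short posts with a pre-trained BERT sentiment classifier.
# Assumptions: transformers installed; checkpoint and posts are illustrative stand-ins.
from transformers import pipeline

sentiment = pipeline(
    "sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",
)

posts = [
    "The new quarantine measures are reassuring and well organised.",
    "Supplies are running low and nobody is telling us anything.",
]
for post, pred in zip(posts, sentiment(posts)):
    print(f"{pred['label']:>8}  score={pred['score']:.2f}  {post}")
```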