Search: "bert"

Showing results 1 - 5 of 79 theses containing the word bert.

  1. Evaluation of Approaches for Representation and Sentiment of Customer Reviews

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Stavros Giorgis; [2021]
    Keywords: machine learning; nlp; text analytics; sentiment analysis; transformers; tfidf; bow; fasttext; word2vec; bert; xlnet; roberta;

    Abstract: Sentiment classification of customer reviews is a real-world application for many companies that offer text analytics and opinion extraction across domains such as consumer electronics, hotels, restaurants, and car rental agencies. Recent progress in Natural Language Processing has produced many new state-of-the-art approaches for representing the meaning of sentences, phrases, and words using vector space models, so-called embeddings.
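
    As a rough illustration of the embedding-based pipeline this abstract describes (a sketch only, not the thesis's own setup; the model name and toy data are assumptions), a sentiment classifier can be trained directly on top of sentence embeddings:

        # Minimal sketch: sentiment classification on sentence embeddings.
        # Requires sentence-transformers and scikit-learn; the model name
        # and toy reviews are illustrative, not taken from the thesis.
        from sentence_transformers import SentenceTransformer
        from sklearn.linear_model import LogisticRegression

        encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model choice

        reviews = ["Great hotel, friendly staff.",
                   "The rental car broke down twice.",
                   "Lovely restaurant, will come back.",
                   "Awful service and noisy rooms."]
        labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

        X = encoder.encode(reviews)  # one dense vector per review
        clf = LogisticRegression().fit(X, labels)

        print(clf.predict(encoder.encode(["The food was wonderful."])))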

  2. Evaluating semantic similarity using sentence embeddings

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Jacob Malmberg; [2021]
    Keywords: (none listed)

    Abstract: Semantic similarity search is the task of retrieving documents or sentences whose content is semantically similar to a user-submitted search term. The task is performed constantly, for instance whenever someone searches for information on the internet.
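
    A minimal sketch of semantic similarity search with sentence embeddings, assuming a sentence-transformers model (the model name and example corpus are illustrative, not from the thesis):

        # Minimal sketch: rank a corpus by semantic similarity to a query.
        # Model name and corpus are assumptions for illustration only.
        from sentence_transformers import SentenceTransformer, util

        encoder = SentenceTransformer("all-MiniLM-L6-v2")

        corpus = ["How to train a neural network",
                  "Pancake recipe with blueberries",
                  "Fine-tuning BERT for text classification"]
        corpus_emb = encoder.encode(corpus, convert_to_tensor=True)

        query_emb = encoder.encode("teaching deep learning models",
                                   convert_to_tensor=True)
        scores = util.cos_sim(query_emb, corpus_emb)[0]  # cosine similarity per doc

        best = scores.argmax().item()
        print(corpus[best], float(scores[best]))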

  3. Semantic Topic Modeling and Trend Analysis

    Master's thesis, Linköping University/Statistics and Machine Learning

    Author: Jasleen Kaur Mann; [2021]
    Keywords: NLP; unsupervised topic modelling; trend analysis; LDA; BERT; Sentence-BERT; TF-IDF; transformer-based language models; document clustering;

    Abstract: This thesis focuses on finding an end-to-end unsupervised solution to a two-step problem: extracting semantically meaningful topics from a large temporal text corpus, and analysing how those topics trend over time. To achieve this, the focus is on the latest developments in Natural Language Processing (NLP) related to pre-trained language models such as Google's Bidirectional Encoder Representations from Transformers (BERT) and other BERT-based models.
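
    One hedged sketch of the embed-then-cluster idea behind such topic modeling; the clustering method and the TF-IDF topic-labeling step are generic choices for illustration, not necessarily what the thesis uses:

        # Minimal sketch: cluster sentence embeddings into topics and label
        # each cluster with its top TF-IDF terms. Model, cluster count, and
        # documents are assumptions.
        import numpy as np
        from sentence_transformers import SentenceTransformer
        from sklearn.cluster import KMeans
        from sklearn.feature_extraction.text import TfidfVectorizer

        docs = ["stock markets fell sharply today",
                "the central bank raised interest rates",
                "the team won the championship final",
                "injury forces star player to retire"]

        emb = SentenceTransformer("all-MiniLM-L6-v2").encode(docs)
        topics = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(emb)

        vec = TfidfVectorizer(stop_words="english")
        tfidf = vec.fit_transform(docs)
        terms = np.array(vec.get_feature_names_out())

        for c in range(2):
            rows = np.flatnonzero(topics == c)
            mean_tfidf = tfidf[rows].mean(axis=0).A1  # average TF-IDF in cluster
            print(f"topic {c}:", terms[mean_tfidf.argsort()[-3:][::-1]])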

  4. Large-Context Question Answering with Cross-Lingual Transfer

    Master's thesis, Uppsala University/Department of Information Technology

    Author: Markus Sagen; [2021]
    Keywords: Long-Context Multilingual Model; Longformer XLM-R; Longformer; Long-term Context; Extending Context; Extend Context; Large-Context; Long-Context; Large Context; Long Context; Cross-Lingual; Multi-Lingual; Cross Lingual; Multi Lingual; QA; Question-Answering; Question Answering; Transformer model; Machine Learning; Transfer Learning; SQuAD; Memory; Efficient; Monolingual; Multilingual; QA model; Language Model; Huggingface; BERT; RoBERTa; XLM-R; mBERT; Multilingual BERT; Efficient Transformers; Reformer; Linformer; Performer; Transformer-XL; Wikitext-103; TriviaQA; HotpotQA; WikiHopQA; VINNOVA; Peltarion; AI; LM; MLM; Deep Learning; Natural Language Processing; NLP; Attention; Transformers; Datasets;

    Abstract: Models based on the transformer architecture have become some of the most prominent for solving a multitude of natural language processing (NLP) tasks since the architecture's introduction in 2017. However, much research on transformer models has focused primarily on achieving high performance, and many problems remain unsolved.
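
    For context, a minimal extractive question-answering call with a multilingual transformer via the Hugging Face pipeline API; the specific checkpoint is an assumption, and this is not the thesis's long-context, cross-lingually transferred model:

        # Minimal sketch: extractive QA with a multilingual transformer.
        # The checkpoint name is an assumption for illustration.
        from transformers import pipeline

        qa = pipeline("question-answering",
                      model="deepset/xlm-roberta-base-squad2")

        result = qa(question="When was the transformer architecture introduced?",
                    context="The transformer architecture was introduced in 2017 "
                            "and has since dominated NLP research.")
        print(result["answer"])  # expected: "2017"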

  5. Exploring Transformer-Based Contextual Knowledge Graph Embeddings: How the Design of the Attention Mask and the Input Structure Affect Learning in Transformer Models

    Master's thesis, Linköping University/Artificial Intelligence and Integrated Computer Systems

    Author: Oskar Holmström; [2021]
    Keywords: Knowledge Graph; Knowledge Graph Embedding; Embedding; Transformer model; BERT; Attention mask;

    Abstract: Knowledge graphs have become commonplace as a compact way to store information and look up facts. However, their discrete representation makes a knowledge graph unusable for tasks that need a continuous representation, such as predicting relationships between entities, where the most probable relationship must be found.
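
    To make "continuous representation" concrete, here is a toy translational scoring function in the style of TransE, a classic knowledge-graph-embedding baseline rather than the transformer-based approach the thesis studies; all entities, relations, and vectors are illustrative:

        # Toy sketch of continuous knowledge graph embeddings (TransE-style
        # scoring, not the transformer-based method studied in the thesis).
        # Entities and relations get vectors; after training, plausible
        # triples satisfy h + r ≈ t, so lower distance means more plausible.
        import numpy as np

        rng = np.random.default_rng(0)
        entities = {"Stockholm": 0, "Sweden": 1, "Oslo": 2}
        relations = {"capital_of": 0}

        dim = 16
        E = rng.normal(size=(len(entities), dim))   # entity embeddings (untrained)
        R = rng.normal(size=(len(relations), dim))  # relation embeddings

        def score(head, rel, tail):
            # Negated distance: higher score = more plausible triple.
            return -np.linalg.norm(
                E[entities[head]] + R[relations[rel]] - E[entities[tail]])

        print(score("Stockholm", "capital_of", "Sweden"))
        print(score("Oslo", "capital_of", "Sweden"))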