Search: "Multilingual BERT"

Showing results 1–5 of 22 theses containing the words Multilingual BERT.

  1. Ensuring Brand Safety by Using Contextual Text Features: A Study of Text Classification with BERT

    Master's thesis, Uppsala University / Department of Linguistics and Philology

    Author: Lingqing Song; [2023]
    Keywords: ;

    Abstract: When advertisements are placed on web pages, the context in which the advertisements are presented is important. For example, manufacturers of kitchen knives may not want their advertisement to appear in a news article about a knife-wielding murderer.

  2. Exploring Cross-Lingual Transfer Learning for Swedish Named Entity Recognition: Fine-tuning of English and Multilingual Pre-trained Models

    Bachelor's thesis, KTH / School of Electrical Engineering and Computer Science (EECS)

    Authors: Daniel Lai Wikström; Axel Sparr; [2023]
    Keywords: NER; Cross-lingual transfer; Transformer; BERT; Deep Learning;

    Abstract: Named Entity Recognition (NER) is a critical task in Natural Language Processing (NLP), and recent advancements in language model pre-training have significantly improved its performance. However, this improvement is not universally applicable, as smaller languages often lack large pre-training datasets or the computational budget required.

  3. Multilingual Transformer Models for Maltese Named Entity Recognition

    Master's thesis, Uppsala University / Department of Linguistics and Philology

    Author: Kris Farrugia; [2022]
    Keywords: low-resource; named-entity; information extraction; Maltese;

    Abstract: The recently developed state-of-the-art models for Named Entity Recognition are heavily dependent upon huge amounts of available annotated data. Consequently, it is extremely challenging for data-scarce languages to obtain significant results.

  4. Distilling Multilingual Transformer Models for Efficient Document Retrieval: Distilling multi-Transformer models with distillation losses involving multi-Transformer interactions

    Master's thesis, KTH / School of Electrical Engineering and Computer Science (EECS)

    Author: Xuecong Liu; [2022]
    Keywords: Dense Passage Retrieval; Knowledge Distillation; Multilingual Transformer; Document Retrieval; Open Domain Question Answering;

    Abstract: Open Domain Question Answering (OpenQA) is the task of automatically finding answers to a query in a given set of documents. Language-agnostic OpenQA is an increasingly important research area in the globalised world, where the answers can be in a different language from the question.

  5. Analysis of Syntactic Behaviour of Neural Network Models by Using Gradient-Based Saliency Method: Comparative Study of Chinese and English BERT, Multilingual BERT and RoBERTa

    Master's thesis, Uppsala University / Department of Linguistics and Philology

    Author: Jiayi Zhang; [2022]
    Keywords: neural network models; gradient-based saliency; BERT; mBERT; RoBERTa;

    Abstract: Neural network models such as Transformer-based BERT, mBERT and RoBERTa are achieving impressive performance (Devlin et al., 2019; Lewis et al., 2020; Liu et al., 2019; Raffel et al.