Sökning: "Multilingual Transformer"

Showing results 1 - 5 of 16 theses containing the words Multilingual Transformer.

  1. Exploring Cross-Lingual Transfer Learning for Swedish Named Entity Recognition: Fine-tuning of English and Multilingual Pre-trained Models

    Bachelor's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Authors: Daniel Lai Wikström; Axel Sparr [2023]
    Keywords: NER; Cross-lingual transfer; Transformer; BERT; Deep Learning;

    Abstract: Named Entity Recognition (NER) is a critical task in Natural Language Processing (NLP), and recent advances in language-model pre-training have significantly improved its performance. However, this improvement is not universally applicable: smaller languages often lack the large pre-training datasets or the computational budget needed to benefit from it. READ MORE
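
    The fine-tuning setup named in the title can be sketched in a few lines with the Hugging Face transformers API; the model name and tag set below are illustrative assumptions, not the thesis's exact configuration:

        # Fine-tune a multilingual encoder for Swedish NER (token classification).
        # Model name and tag set are assumed for illustration.
        import torch
        from transformers import AutoTokenizer, AutoModelForTokenClassification

        labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]
        tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
        model = AutoModelForTokenClassification.from_pretrained(
            "bert-base-multilingual-cased", num_labels=len(labels)
        )

        enc = tokenizer("Daniel bor i Stockholm", return_tensors="pt")
        # Dummy labels, one per subword; real training aligns gold tags to subwords.
        dummy = torch.zeros_like(enc["input_ids"])
        loss = model(**enc, labels=dummy).loss
        loss.backward()  # an optimizer step on this gradient is one fine-tuning step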

  2. Multilingual Transformer Models for Maltese Named Entity Recognition

    Master's thesis, Uppsala University/Department of Linguistics and Philology

    Author: Kris Farrugia [2022]
    Keywords: low-resource; named-entity; information extraction; Maltese;

    Abstract: The recently developed state-of-the-art models for Named Entity Recognition are heavily dependent upon huge amounts of available annotated data. Consequently, it is extremely challenging for data-scarce languages to obtain significant results. READ MORE

  3. Distilling Multilingual Transformer Models for Efficient Document Retrieval: Distilling multi-Transformer models with distillation losses involving multi-Transformer interactions

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Xuecong Liu [2022]
    Keywords: Dense Passage Retrieval; Knowledge Distillation; Multilingual Transformer; Document Retrieval; Open Domain Question Answering;

    Abstract: Open Domain Question Answering (OpenQA) is the task of automatically finding answers to a query in a given set of documents. Language-agnostic OpenQA, where the answer can be in a different language from the question, is an increasingly important research area in a globalised world. READ MORE
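
    One common form of the distillation losses mentioned in the title matches the student's in-batch query-passage similarity distribution to the teacher's; the sketch below shows that generic formulation under assumed embedding shapes, not necessarily the thesis's exact loss:

        # Knowledge distillation for dense retrieval: KL divergence between the
        # student's and teacher's softmax over in-batch query-passage scores.
        import torch
        import torch.nn.functional as F

        def retrieval_kd_loss(student_q, student_p, teacher_q, teacher_p, tau=1.0):
            # Embeddings are [batch, dim]; the score matrices are [batch, batch].
            s_scores = student_q @ student_p.T / tau
            t_scores = teacher_q @ teacher_p.T / tau
            return F.kl_div(F.log_softmax(s_scores, dim=-1),
                            F.softmax(t_scores, dim=-1),
                            reduction="batchmean")

        q_s, p_s = torch.randn(4, 128), torch.randn(4, 128)  # small student
        q_t, p_t = torch.randn(4, 768), torch.randn(4, 768)  # large teacher
        print(retrieval_kd_loss(q_s, p_s, q_t, p_t))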

  4. Analysis of Syntactic Behaviour of Neural Network Models by Using Gradient-Based Saliency Method: Comparative Study of Chinese and English BERT, Multilingual BERT and RoBERTa

    Master's thesis, Uppsala University/Department of Linguistics and Philology

    Author: Jiayi Zhang [2022]
    Keywords: neural network models; gradient-based saliency; BERT; mBERT; RoBERTa;

    Abstract: Neural network models such as Transformer-based BERT, mBERT and RoBERTa are achieving impressive performance (Devlin et al., 2019; Lewis et al., 2020; Liu et al., 2019; Raffel et al. READ MORE
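
    The gradient-based saliency method in the title can be illustrated as follows: backpropagate a score of interest to the input embeddings and take each token's gradient norm as its saliency. The model and the subject-verb agreement probe below are assumptions for illustration, not the thesis's exact pipeline:

        # Gradient-based saliency: how strongly each input token influences the
        # model's score for a plural verb at the masked position.
        import torch
        from transformers import AutoTokenizer, AutoModelForMaskedLM

        tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
        model = AutoModelForMaskedLM.from_pretrained("bert-base-cased")

        text = f"The keys to the cabinet {tokenizer.mask_token} on the table"
        enc = tokenizer(text, return_tensors="pt")
        embeds = model.get_input_embeddings()(enc["input_ids"]).detach().requires_grad_(True)
        logits = model(inputs_embeds=embeds, attention_mask=enc["attention_mask"]).logits

        mask_pos = (enc["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
        verb_id = tokenizer.convert_tokens_to_ids("are")  # plural agreement probe
        logits[0, mask_pos, verb_id].backward()
        saliency = embeds.grad.norm(dim=-1)[0]  # one score per input token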

  5. Unsupervised multilingual distractor generation for fill-in-the-blank questions

    Master's thesis, Uppsala University/Department of Linguistics and Philology

    Author: Zhe Han [2022]
    Keywords: Multilingual; Distractor; BERT;

    Abstract: Fill-in-the-blank multiple choice questions (MCQs) play an important role in the educational field, but generating them manually is quite resource-consuming, so automatic generation has gradually become an attractive NLP task. Within this area, question creation itself has become a mainstream NLP research topic, while distractor (wrong alternative) generation (DG) still remains out of the spotlight. READ MORE
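
    An unsupervised, BERT-based take on distractor generation can be sketched as follows: mask the blank, let a multilingual masked language model rank fill-ins, and keep high-probability candidates other than the correct answer. The example sentence and model choice are assumptions, not the thesis's method:

        # Unsupervised distractor generation with a multilingual masked LM:
        # top-ranked fill-ins for the blank, minus the correct answer.
        import torch
        from transformers import AutoTokenizer, AutoModelForMaskedLM

        tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
        model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

        answer = "Paris"
        enc = tokenizer(f"The capital of France is {tokenizer.mask_token}.",
                        return_tensors="pt")
        mask_pos = (enc["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]

        with torch.no_grad():
            logits = model(**enc).logits
        top = torch.topk(logits[0, mask_pos], k=10).indices
        candidates = [tokenizer.decode([int(i)]).strip() for i in top]
        distractors = [c for c in candidates if c.lower() != answer.lower()][:3]
        print(distractors)  # plausible but wrong alternatives for the blank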