Search: "multilingual pre-trained language models"

Showing results 1 - 5 of 16 theses containing the words multilingual pre-trained language models.

  1. Ensuring Brand Safety by Using Contextual Text Features: A Study of Text Classification with BERT

    Master's thesis, Uppsala University, Department of Linguistics and Philology

    Author: Lingqing Song; [2023]

    Abstract: When advertisements are placed on web pages, the context in which they appear matters. For example, manufacturers of kitchen knives may not want their advertisement to appear in a news article about a knife-wielding murderer.
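
    A minimal sketch of the kind of setup this abstract points to: fine-tuning BERT as a binary classifier over page text to judge whether an ad placement is brand-safe. The checkpoint name, the toy sentences, and the two labels are assumptions for illustration, not details taken from the thesis.

      import torch
      from torch.optim import AdamW
      from transformers import AutoTokenizer, AutoModelForSequenceClassification

      tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
      model = AutoModelForSequenceClassification.from_pretrained(
          "bert-base-uncased", num_labels=2
      )

      texts = [
          "Five easy weeknight dinners to make with a good chef's knife.",
          "Police report a knife attack in the city centre overnight.",
      ]
      labels = torch.tensor([0, 1])  # 0 = brand-safe context, 1 = unsafe context (toy labels)

      batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
      optimizer = AdamW(model.parameters(), lr=2e-5)

      model.train()
      loss = model(**batch, labels=labels).loss  # cross-entropy computed by the classification head
      loss.backward()
      optimizer.step()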

  2. Monolingual and Cross-Lingual Survey Response Annotation

    Master's thesis, Uppsala University, Department of Linguistics and Philology

    Author: Yahui Zhao; [2023]
    Keywords: transfer learning; zero-shot cross-lingual transfer; model-based transfer; multilingual pre-trained language models; sequence labeling; open-ended questions; democracy

    Abstract: Multilingual natural language processing (NLP) is increasingly recognized for its potential to process diverse types of text, including social media posts, reviews, and technical reports. Multilingual language models such as mBERT and XLM-RoBERTa (XLM-R) play a pivotal role in multilingual NLP.
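
    A minimal sketch of the model-based zero-shot cross-lingual transfer named in the keywords: fine-tune XLM-R for sequence labeling on source-language (English) annotations, then apply the same model unchanged to target-language text. The tag set and example sentence are assumptions, not the thesis's survey data.

      import torch
      from transformers import AutoTokenizer, AutoModelForTokenClassification

      tags = ["O", "B-TOPIC", "I-TOPIC"]  # hypothetical label set
      tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
      model = AutoModelForTokenClassification.from_pretrained(
          "xlm-roberta-base", num_labels=len(tags)
      )

      # ... fine-tune on English token-level annotations here (omitted) ...

      # Zero-shot step: the fine-tuned model is applied directly to another language.
      text = "Yttrandefrihet och fria val är viktiga för demokratin."
      batch = tokenizer(text, return_tensors="pt")
      with torch.no_grad():
          logits = model(**batch).logits
      pred = logits.argmax(dim=-1)[0].tolist()
      tokens = tokenizer.convert_ids_to_tokens(batch["input_ids"][0])
      print(list(zip(tokens, [tags[i] for i in pred])))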

  3. Exploring Cross-Lingual Transfer Learning for Swedish Named Entity Recognition: Fine-tuning of English and Multilingual Pre-trained Models

    Bachelor's thesis, KTH, School of Electrical Engineering and Computer Science (EECS)

    Authors: Daniel Lai Wikström; Axel Sparr; [2023]
    Keywords: NER; named entity recognition; cross-lingual transfer; Transformer; BERT; deep learning

    Abstract: Named Entity Recognition (NER) is a critical task in Natural Language Processing (NLP), and recent advances in language-model pre-training have significantly improved its performance. However, the improvement does not carry over to every language, since smaller languages often lack large pre-training datasets or the computational budget to exploit them.
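
    A minimal sketch of the comparison the title describes: fine-tune the same token-classification head on Swedish NER data, once starting from an English-only checkpoint and once from a multilingual one, then evaluate both on the same Swedish test set. The checkpoint names and tag set are assumptions for illustration.

      from transformers import AutoTokenizer, AutoModelForTokenClassification

      ner_tags = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

      for checkpoint in ["bert-base-cased", "bert-base-multilingual-cased"]:
          tokenizer = AutoTokenizer.from_pretrained(checkpoint)
          model = AutoModelForTokenClassification.from_pretrained(
              checkpoint, num_labels=len(ner_tags)
          )
          # ... fine-tune on Swedish NER data (e.g., with transformers.Trainer),
          # then score both runs on the same held-out Swedish test set ...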

  4. Can Wizards be Polyglots: Towards a Multilingual Knowledge-grounded Dialogue System

    Master's thesis, Uppsala University, Department of Linguistics and Philology

    Author: Evelyn Kai Yan Liu; [2022]
    Keywords: Knowledge-grounded dialogue; Dialogue systems; Generative question answering; Multilingual question answering; Multilingual dialogue systems; Transfer learning; Multi-task learning; Sequential training; Conversational AI; Natural Language Processing (NLP); Deep learning; Machine learning

    Abstract: Research on open-domain, knowledge-grounded dialogue systems has advanced rapidly thanks to the paradigm shift introduced by large language models (LLMs). While these strides have improved dialogue-system performance, the scope of that work remains mostly monolingual and English-centric.
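
    A minimal sketch of how a multilingual knowledge-grounded response generator can be wired with a seq2seq model: the model conditions on a retrieved knowledge snippet plus the dialogue history and generates a reply. mT5 and the prompt format are assumptions, and a pre-trained-only checkpoint would still need fine-tuning on dialogue data before it produces useful replies.

      from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

      tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
      model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

      # Knowledge snippet (e.g., from a retriever) and the dialogue so far.
      knowledge = "The aurora borealis is caused by charged solar particles hitting the atmosphere."
      history = "User: Why does the sky sometimes glow green in the north?"
      prompt = f"knowledge: {knowledge} dialogue: {history}"

      inputs = tokenizer(prompt, return_tensors="pt")
      output_ids = model.generate(**inputs, max_new_tokens=40)
      print(tokenizer.decode(output_ids[0], skip_special_tokens=True))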

  5. Task-agnostic knowledge distillation of mBERT to Swedish

    Master's thesis, KTH, School of Electrical Engineering and Computer Science (EECS)

    Author: Added Kina; [2022]
    Keywords: Natural Language Processing; Transformers; Knowledge Distillation; BERT; Multilingual Models; Cross-Lingual Transfer

    Abstract: Large Transformer models have shown strong performance across many natural language processing tasks. However, slow inference, a strong dependency on powerful hardware, and high energy consumption limit their availability.
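
    A minimal sketch of the soft-label distillation loss commonly used when compressing a teacher such as mBERT into a smaller student: KL divergence between temperature-scaled teacher and student output distributions. This is a generic illustration of knowledge distillation, not the exact objective used in the thesis.

      import torch
      import torch.nn.functional as F

      def distillation_loss(student_logits, teacher_logits, temperature=2.0):
          # Soften both distributions, then measure how far the student is from the teacher.
          student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
          teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
          # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
          return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * temperature ** 2

      # Toy example: a batch of 2 positions over a 5-symbol vocabulary.
      teacher_logits = torch.randn(2, 5)
      student_logits = torch.randn(2, 5, requires_grad=True)
      loss = distillation_loss(student_logits, teacher_logits)
      loss.backward()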