Search: "Contextualized Language Models"

Showing results 1 - 5 of 7 theses containing the words Contextualized Language Models.

  1. Cross-Lingual and Genre-Supervised Parsing and Tagging for Low-Resource Spoken Data

    Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Iliana Fosteri; [2023]
    Keywords: dependency parsing; part-of-speech tagging; low-resource languages; transcribed speech; large language models; cross-lingual learning; transfer learning; multi-task learning; Universal Dependencies;

    Abstract: Dealing with low-resource languages is challenging because there is insufficient data to train machine-learning models to make predictions for these languages. One way to address this problem is to use data from higher-resource languages, which enables the transfer of learning from those languages to the low-resource targets. READ MORE

  2. Improving BERTScore for Machine Translation Evaluation Through Contrastive Learning

    Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Oreen Yousuf; [2022]
    Keywords: machine translation; evaluation; BERTScore; contrastive learning; SimCSE; Hausa; Somali; Chinese;

    Abstract: Since the advent of automatic evaluation, tasks within Natural Language Processing (NLP), including machine translation, have been able to make better use of both time and labor resources. More recently, multilingual pre-trained models (MLMs) have expanded many languages' capacity to participate in NLP research. READ MORE

  3. Analyzing the Anisotropy Phenomenon in Transformer-based Masked Language Models

    Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Ziyang Luo; [2021]
    Keywords: Transformer; BERT; RoBERTa; Anisotropy;

    Abstract: In this thesis, we examine in detail the anisotropy phenomenon in the popular masked language models BERT and RoBERTa. We propose a possible explanation for this counterintuitive phenomenon. READ MORE

  4. Unsupervised Lexical Semantic Change Detection with Context-Dependent Word Representations

    Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Huiling You; [2021]
    Keywords:

    Abstract: In this work, we explore the usefulness of contextualized embeddings from language models for lexical semantic change (LSC) detection. Using diachronic corpora spanning two time periods, we construct word embeddings for a selected set of target words, aiming to detect potential LSC of each target word across time. READ MORE

  5. Automatic Categorization of News Articles With Contextualized Language Models

    Master's thesis, Linköpings universitet/Artificiell intelligens och integrerade datorsystem

    Author: Lukas Borggren; [2021]
    Keywords: Natural Language Processing; Text Classification; Hierarchical Classification; Hierarchical Multi-label Text Classification; Domain Specialization; Metadata Features; Model Compression; Quantization; Pruning; Machine Learning; Deep Learning; Contextualized Language Models; BERT; ELECTRA; News Media;

    Abstract: This thesis investigates how pre-trained contextualized language models can be adapted for multi-label text classification of Swedish news articles. Various classifiers are built on pre-trained BERT and ELECTRA models, exploring both global and local classifier approaches. READ MORE