Search: "low resource language"

Showing results 1 - 5 of 51 theses containing the words low resource language.

  1. How negation influences word order in languages: Automatic classification of word order preference in positive and negative transitive clauses

    Master's thesis, Uppsala University/Department of Linguistics and Philology

    Author: Chen Lyu; [2023]
    Keywords:

    Abstract: In this work, we explore the possibility of using word alignment in a parallel corpus to project language annotations, such as part-of-speech tags and dependency relations, from high-resource languages to low-resource languages. We use a parallel corpus of Bible translations, comprising 1,444 translations in 986 languages, and a well-developed parser to annotate the source languages (English, French, German, and Czech). READ MORE

  2. Exploring implications of the EU Taxonomy on funding and disclosure for Swedish SMEs

    Master's thesis, SLU/Dept. of Economics

    Authors: Jessie Westerberg; Sofia Gren; [2023]
    Keywords: EU taxonomy; CSRD; GAR; BTAR; SMEs; voluntary disclosure

    Abstract: In 2019 the EU taxonomy was launched as a classification system that aims to provide a common language and framework for sustainable finance. Its purpose is to help investors and businesses identify environmentally sustainable economic activities, supporting the EU's transition towards a low-carbon, resource-efficient economy by 2050. READ MORE

  3. Head-to-head Transfer Learning Comparisons made Possible: A Comparative Study of Transfer Learning Methods for Neural Machine Translation of the Baltic Languages

    Master's thesis, Uppsala University/Department of Linguistics and Philology

    Author: Mathias Stenlund; [2023]
    Keywords: machine translation; transfer learning; Latvian; Lithuanian; low-resource languages; transformers; parent language; child language; comparative study

    Abstract: The difficulty of training adequate MT models with data-hungry NMT frameworks for low-resource language pairs has created a need to alleviate the scarcity of sufficiently large parallel corpora. Various transfer learning methods have been proposed as solutions to this problem, in which a new model for a target task is initialized with parameters learned from some other high-resource task. READ MORE

  4. Context-aware Swedish Lexical Simplification: Using pre-trained language models to propose contextually fitting synonyms

    Bachelor's thesis, Linköping University/Department of Computer Science

    Author: Emil Graichen; [2023]
    Keywords: automatic text simplification; lexical simplification; Swedish; BERT; GPT-3; evaluation dataset; synonymy

    Abstract: This thesis presents the development and evaluation of context-aware lexical simplification (LS) systems for Swedish. In total, three LS models, LäsBERT, LäsBERT-baseline, and LäsGPT, were created and evaluated on a newly constructed Swedish LS evaluation dataset. READ MORE

  5. Neural machine translation of Gawarbati

    Bachelor's thesis, Stockholm University/Division of Computational Linguistics

    Author: Katarina Gillholm; [2023]
    Keywords: machine translation; neural machine translation; NMT; low resource language; Gawarbati; transfer learning; GPT

    Abstract: New neural models have led to major advances in machine translation, but they still perform worse on languages that lack large amounts of parallel data, so-called low-resource languages. Gawarbati is a small, endangered low-resource language for which only 5,000 parallel sentences are available. READ MORE