Sökning: "low-resource language"

Showing results 16 - 20 of 31 theses containing the words low-resource language.

  16. Improving BERTScore for Machine Translation Evaluation Through Contrastive Learning

    Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Oreen Yousuf; [2022]
    Keywords: machine translation; evaluation; BERTScore; contrastive learning; SimCSE; Hausa; Somali; Chinese;

    Abstract: Since the advent of automatic evaluation, tasks within Natural Language Processing (NLP), including Machine Translation, have been able to make better use of both time and labor resources. More recently, multilingual pre-trained models (MLMs) have raised many languages' capacity to participate in NLP research.
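The metric at the center of this thesis can be computed with the bert_score package. The sketch below is a minimal illustration, assuming Python with bert-score installed; the SimCSE checkpoint named in it is a public model used only as a stand-in for a contrastively trained encoder, not the encoders the thesis itself fine-tunes.

```python
# Minimal BERTScore sketch (assumes: pip install bert-score).
# The SimCSE checkpoint is a public stand-in, not the thesis's fine-tuned encoder.
from bert_score import score

candidates = ["the cat sat on the mat"]        # MT system output
references = ["a cat was sitting on the mat"]  # human reference

# Default BERTScore with an off-the-shelf English encoder.
P, R, F1 = score(candidates, references, lang="en")
print(f"baseline F1: {F1.mean().item():.4f}")

# Swap in a contrastively trained (SimCSE-style) encoder, in the spirit of the thesis.
# num_layers must be given explicitly for models bert-score does not know about.
P, R, F1 = score(
    candidates,
    references,
    model_type="princeton-nlp/sup-simcse-bert-base-uncased",  # assumed public checkpoint
    num_layers=12,
)
print(f"SimCSE-encoder F1: {F1.mean().item():.4f}")
```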

  17. SPEECH SYNTHESIS AND RECOGNITION FOR A LOW-RESOURCE LANGUAGE: Connecting TTS and ASR for mutual benefit

    Master's thesis, Göteborgs universitet / Institutionen för filosofi, lingvistik och vetenskapsteori

    Author: Liliia Makashova; [2021-09-23]
    Keywords: Speech synthesis; automatic speech recognition; low-resource language; machine learning; transfer learning;

    Abstract: Speech synthesis (text-to-speech, TTS) and speech recognition (automatic speech recognition, ASR) are the NLP technologies least available for low-resource and indigenous languages. A lack of computational and data resources is the major obstacle to developing linguistic tools for these languages.
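The "mutual benefit" idea in the title, where synthesis and recognition feed each other, can be illustrated with a simple round trip through off-the-shelf pipelines. This is a rough sketch assuming Hugging Face transformers; the English checkpoints are placeholders for whatever low-resource models are actually available, and it is not the thesis's training setup.

```python
# Rough TTS -> ASR round-trip sketch (assumes: pip install transformers torch).
# English checkpoints stand in for a low-resource language; not the thesis's setup.
import numpy as np
from transformers import pipeline

tts = pipeline("text-to-speech", model="facebook/mms-tts-eng")        # assumed checkpoint
asr = pipeline("automatic-speech-recognition",
               model="facebook/wav2vec2-base-960h")                   # assumed checkpoint

text = "speech technology for low resource languages"

# Synthesize audio from text ...
speech = tts(text)                       # {"audio": np.ndarray, "sampling_rate": int}
audio = np.squeeze(speech["audio"])

# ... and transcribe it back; the (text, audio, transcript) triple is the kind of
# synthetic pair one could add to ASR training data or use to sanity-check the TTS.
result = asr({"raw": audio, "sampling_rate": speech["sampling_rate"]})
print(result["text"])
```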

  18. Low-resource Language Question Answering System with BERT

    Professional degree thesis (advanced level), Mittuniversitetet/Institutionen för informationssystem och -teknologi

    Author: Herman Jansson; [2021]
    Keywords: BERT; Question Answering system; Reading Comprehension; Low resource language; SQuADv2;

    Abstract: The complexity of staying at the forefront of information retrieval systems is constantly increasing. BERT, a recent natural language processing technology, has reached superhuman performance on reading-comprehension tasks in high-resource languages.
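For the SQuAD 2.0-style reading comprehension described here, where a question may have no answer in the passage, inference with a BERT-family model can be sketched with Hugging Face transformers. The English checkpoint below is a public stand-in, not the model trained in the thesis.

```python
# Minimal SQuAD 2.0-style QA sketch (assumes: pip install transformers torch).
# The checkpoint is a public English SQuAD2 model, standing in for the thesis's model.
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

context = (
    "BERT is a pre-trained language model that can be fine-tuned for "
    "reading-comprehension tasks such as SQuAD."
)

# handle_impossible_answer=True lets the model return an empty answer,
# matching SQuAD 2.0's unanswerable questions.
print(qa(question="What can BERT be fine-tuned for?",
         context=context, handle_impossible_answer=True))
print(qa(question="Who invented the telephone?",
         context=context, handle_impossible_answer=True))
```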

  19. Extractive Text Summarization of Norwegian News Articles Using BERT

    Master's thesis, Linköpings universitet/Medie- och Informationsteknik; Linköpings universitet/Tekniska fakulteten

    Authors: Thomas Indrias Biniam; Adam Morén; [2021]
    Keywords: extractive text summarization; NLP; deep learning; BERT; BERTSum; Multilingual BERT; Norwegian BERT; transformer; Norwegian; news articles;

    Abstract: Extractive text summarization has long been an important research area in Natural Language Processing, and numerous methods have been proposed for extracting information from text documents. Recent work has shown great success on English summarization tasks by fine-tuning the language model BERT on large summarization datasets.
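BERTSum itself fine-tunes BERT with inserted [CLS] tokens and a sentence-level classifier; the sketch below is only a much simpler unsupervised stand-in, ranking sentences by the similarity of their multilingual BERT embeddings to the document as a whole. It assumes Hugging Face transformers and torch, and the toy Norwegian sentences are invented for illustration.

```python
# Simple embedding-based extractive summarizer (not BERTSum; a rough stand-in).
# Assumes: pip install transformers torch
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

def embed(sentences):
    """Mean-pooled mBERT embeddings, one vector per sentence."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state       # (n, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1)        # (n, seq, 1)
    return (hidden * mask).sum(1) / mask.sum(1)         # (n, dim)

def extract_summary(sentences, k=2):
    """Pick the k sentences most similar to the mean document embedding."""
    vecs = embed(sentences)
    doc = vecs.mean(0, keepdim=True)
    scores = torch.nn.functional.cosine_similarity(vecs, doc)
    top = scores.topk(min(k, len(sentences))).indices.sort().values
    return [sentences[int(i)] for i in top]

article = [
    "Regjeringen la i dag fram et nytt budsjettforslag.",     # toy example sentences
    "Forslaget inneholder økte bevilgninger til forskning.",
    "Debatten i Stortinget ventes å bli lang.",
]
print(extract_summary(article, k=2))
```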

  20. Automatic Speech Recognition for low-resource languages using Wav2Vec2: Modern Standard Arabic (MSA) as an example of a low-resource language

    Master's thesis, Högskolan Dalarna/Institutionen för information och teknik

    Author: Taha Zouhair; [2021]
    Keywords: Automatic Speech Recognition; Facebook Wav2Vec; Mozilla Common Voice; Low-Resource Language;

    Abstract: The need for fully automatic translation at DigitalTolk, a Stockholm-based company providing translation services, leads to exploring Automatic Speech Recognition as a first step for Modern Standard Arabic (MSA). Facebook AI recently released a second version of its Wav2Vec models, dubbed Wav2Vec 2.0.
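Inference with a Wav2Vec 2.0 CTC model of the kind this thesis fine-tunes can be sketched with Hugging Face transformers. The Arabic checkpoint name and the WAV path below are placeholders, and this is not the author's training pipeline.

```python
# Wav2Vec 2.0 CTC inference sketch (assumes: pip install transformers torch torchaudio).
# Checkpoint and audio path are placeholders, not the thesis's own model or data.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-arabic"  # assumed public Arabic model
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)

# Load a clip (e.g. from Mozilla Common Voice) and resample to the 16 kHz the model expects.
waveform, sample_rate = torchaudio.load("example_msa_clip.wav")   # placeholder path
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000).mean(0)

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits            # (1, time, vocab)

# Greedy CTC decoding: take the most likely token per frame, then collapse.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```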