Search: "Multilingual question answering"

Showing results 1 - 5 of 6 theses containing the words Multilingual question answering.

  1. Can Wizards be Polyglots: Towards a Multilingual Knowledge-grounded Dialogue System

    Master's thesis, Uppsala universitet/Institutionen för lingvistik och filologi

    Author: Evelyn Kai Yan Liu; [2022]
    Keywords: Knowledge-grounded dialogue; Dialogue systems; Generative question answering; Multilingual question answering; Multilingual dialogue systems; Transfer learning; Multi-task learning; Sequential training; Conversational AI; Natural Language Processing (NLP); Deep learning; Machine learning;

    Abstract: Research on open-domain, knowledge-grounded dialogue systems has been advancing rapidly due to the paradigm shift introduced by large language models (LLMs). While these strides have improved the performance of dialogue systems, their scope is mostly monolingual and English-centric.

  2. Low-resource Language Question Answering System with BERT

    Professional degree thesis (advanced level), Mittuniversitetet/Institutionen för informationssystem och -teknologi

    Author: Herman Jansson; [2021]
    Keywords: BERT; Question Answering system; Reading Comprehension; Low resource language; SQuADv2;

    Abstract: The complexity of staying at the forefront of information retrieval systems is constantly increasing. BERT, a recent natural language processing technology, has reached superhuman performance on reading comprehension tasks in high-resource languages.
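
    Since the abstract does not include an implementation, the following is only a minimal sketch of the SQuAD-style extractive reading-comprehension setup it describes, using the Hugging Face transformers library. The checkpoint name and the Swedish example are illustrative assumptions, not taken from the thesis.

        # Minimal sketch of SQuAD-style extractive QA with a multilingual BERT
        # checkpoint (assumed example; the thesis's own model would be fine-tuned
        # on SQuADv2-style data, which this off-the-shelf checkpoint is not).
        from transformers import pipeline

        qa = pipeline("question-answering", model="bert-base-multilingual-cased")

        context = "Mittuniversitetet har campus i Sundsvall och Östersund."
        question = "Var har Mittuniversitetet campus?"

        # Returns the predicted answer span and a confidence score; without QA
        # fine-tuning the span-prediction head is untrained, so the output is
        # only meant to show the interface, not a meaningful answer.
        prediction = qa(question=question, context=context)
        print(prediction["answer"], prediction["score"])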

  3. Large-Context Question Answering with Cross-Lingual Transfer

    Master's thesis, Uppsala universitet/Institutionen för informationsteknologi

    Author: Markus Sagen; [2021]
    Keywords: Long-Context Multilingual Model; Longformer XLM-R; Longformer; Long-term Context; Extending Context; Extend Context; Large-Context; Long-Context; Large Context; Long Context; Cross-Lingual; Multi-Lingual; Cross Lingual; Multi Lingual; QA; Question-Answering; Question Answering; Transformer model; Machine Learning; Transfer Learning; SQuAD; Memory; Efficient; Monolingual; Multilingual; QA model; Language Model; Huggingface; BERT; RoBERTa; XLM-R; mBERT; Multilingual BERT; Efficient Transformers; Reformer; Linformer; Performer; Transformer-XL; Wikitext-103; TriviaQA; HotpotQA; WikiHopQA; VINNOVA; Peltarion; AI; LM; MLM; Deep Learning; Natural Language Processing; NLP; Attention; Transformers; Datasets;

    Abstract: Models based on the transformer architecture have become some of the most prominent for solving a multitude of natural language processing (NLP) tasks since the architecture's introduction in 2017. However, much research related to the transformer model has focused primarily on achieving high performance, and many problems remain unsolved.

  4. Investigating the Effect of Complementary Information Stored in Multiple Languages on Question Answering Performance: A Study of the Multilingual-T5 for Extractive Question Answering

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Björn Aurell Hansson; [2021]
    Keywords: Machine learning; Transformers; multilingual-T5; question answering; NLP; transformer models; natural language processing;

    Abstract: Extractive question answering is a popular domain in the field of natural language processing, where machine learning models are tasked with answering questions given a context. Historically, the field has been centered on monolingual models, but recently more and more multilingual models have been developed, such as Google's MT5 [1].
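
    The abstract gives no code, so the sketch below only illustrates, under stated assumptions, how a multilingual-T5 checkpoint is queried in T5's text-to-text format via Hugging Face transformers. The prompt template, the google/mt5-small checkpoint, and the example are assumptions; an un-fine-tuned checkpoint will not produce a sensible answer.

        # Sketch of question answering in mT5's text-to-text format (assumed
        # prompt layout; the thesis fine-tunes multilingual-T5 for extractive QA
        # and may format its inputs differently).
        from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

        tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
        model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

        question = "Vilket år grundades KTH?"
        context = "KTH grundades 1827 och ligger i Stockholm."
        prompt = f"question: {question} context: {context}"

        # Encode the prompt and generate a short free-text answer.
        inputs = tokenizer(prompt, return_tensors="pt")
        output_ids = model.generate(**inputs, max_new_tokens=16)
        print(tokenizer.decode(output_ids[0], skip_special_tokens=True))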

  5. A Method for the Assisted Translation of QA Datasets Using Multilingual Sentence Embeddings

    Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS)

    Author: Thomas Vakili; [2020]
    Keywords: Natural Language Processing (NLP); Information Retrieval (IR); Multilingual Sentence Embeddings; QA Datasets; Lesser-Resourced Languages; language technology; language-agnostic sentence embeddings; question-answer corpora;

    Abstract: This thesis presents a method that reduces the amount of labour required to translate the English question answering dataset SQuAD into Swedish. The purpose of the study is to help shrink the gap between natural language processing research in English and research in lesser-resourced languages by providing a method for creating datasets in these languages that are counterparts to those used in English.
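
    The abstract names only the ingredients (SQuAD and multilingual sentence embeddings), so the sketch below shows just the general idea of matching an English source sentence to its closest Swedish counterpart with off-the-shelf multilingual embeddings. The sentence-transformers model name and the cosine-similarity criterion are assumptions, not the procedure actually used in the thesis.

        # Illustration of the general idea only: score Swedish candidate
        # sentences against an English source sentence using multilingual
        # sentence embeddings and cosine similarity (model name is an assumption).
        from sentence_transformers import SentenceTransformer, util

        model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

        english_sentence = "The Eiffel Tower was completed in 1889."
        swedish_candidates = [
            "Eiffeltornet stod färdigt 1889.",
            "Paris är huvudstad i Frankrike.",
        ]

        source = model.encode(english_sentence, convert_to_tensor=True)
        candidates = model.encode(swedish_candidates, convert_to_tensor=True)

        # Pick the candidate whose embedding is closest to the English sentence.
        scores = util.cos_sim(source, candidates)[0]
        best = int(scores.argmax())
        print(swedish_candidates[best], float(scores[best]))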