Search: "TriviaQA"

Found 2 theses containing the word TriviaQA.

  1. Active Learning for Extractive Question Answering

    Master's thesis, Linköping University / Statistics and Machine Learning

    Author: Salvador Marti Roman; [2022]
    Keywords: Machine Learning; Deep Learning; Active Learning; Natural Language Processing; NLP; Question Answering; Transformers; Uncertainty; Language Models;

    Abstract: Data labelling for question answering (QA) tasks is a costly procedure that requires oracles to read lengthy excerpts of text and reason to extract an answer to a given question from within the text. QA is a task in natural language processing (NLP), where a majority of recent advancements have come from leveraging the vast corpora of unlabelled and unstructured text available online.

  2. Large-Context Question Answering with Cross-Lingual Transfer

    Master's thesis, Uppsala University / Department of Information Technology

    Author: Markus Sagen; [2021]
    Keywords: Long-Context Multilingual Model; Longformer XLM-R; Longformer; Long-term Context; Extending Context; Extend Context; Large-Context; Long-Context; Large Context; Long Context; Cross-Lingual; Multi-Lingual; Cross Lingual; Multi Lingual; QA; Question-Answering; Question Answering; Transformer model; Machine Learning; Transfer Learning; SQuAD; Memory; Efficient; Monolingual; Multilingual; QA model; Language Model; Huggingface; BERT; RoBERTa; XLM-R; mBERT; Multilingual BERT; Efficient Transformers; Reformer; Linformer; Performer; Transformer-XL; Wikitext-103; TriviaQA; HotpotQA; WikiHopQA; VINNOVA; Peltarion; AI; LM; MLM; Deep Learning; Natural Language Processing; NLP; Attention; Transformers; Datasets;

    Abstract: Models based on the transformer architecture have become among the most prominent for solving a multitude of natural language processing (NLP) tasks since its introduction in 2017. However, much research related to the transformer model has focused primarily on achieving high performance, and many problems remain unsolved.