Sökning: "BERT for passage-retrieval"

Found 2 theses containing the words BERT for passage-retrieval.

  1. Distilling Multilingual Transformer Models for Efficient Document Retrieval: Distilling multi-Transformer models with distillation losses involving multi-Transformer interactions

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Xuecong Liu; [2022]
    Keywords: Dense Passage Retrieval; Knowledge Distillation; Multilingual Transformer; Document Retrieval; Open Domain Question Answering

    Abstract: Open Domain Question Answering (OpenQA) is the task of automatically finding answers to a query in a given set of documents. Language-agnostic OpenQA, where the answer can be in a different language from the question, is an increasingly important research area in a globalised world.

  2. Zero-shot, One Kill: BERT for Neural Information Retrieval

    Master's thesis, Uppsala University/Department of Linguistics and Philology

    Author: Stergios Efes; [2021]
    Keywords: neural information retrieval; passage ranking; weak supervision; question answering; passage reranking; BERT; transfer-learning in IR; zero-shot IR; passage-retrieval; BERT for passage-retrieval; MS Marco; information retrieval; neural IR

    Abstract: [Background]: The advent of Bidirectional Encoder Representations from Transformers (BERT) language models (Devlin et al., 2018) and of MS Marco (Bajaj et al., 2016), a large-scale, publicly available human-annotated dataset for machine reading comprehension, led the field of information retrieval (IR) to experience a revolution (Lin et al. …