Search: "distilroberta"

Found 2 theses containing the word distilroberta.

  1. Distillation or loss of information? : The effects of distillation on model redundancy

    Master's thesis, Uppsala University / Department of Linguistics and Philology

    Author: Eva Elzbieta Sventickaite; [2022]
    Keywords: distillation; distillation effects; distilbert; distilmbert; distilroberta; distilgpt-2; distilled neurons; redundancy; redundancy in neural networks; redundancy in language models; neuron reduction in language models; distilled language models;

    Abstract: The necessity of billions of parameters in large language models has lately been questioned, as there are still unanswered questions about how information is captured in the networks. It could be argued that without this knowledge, there may be a tendency to overparameterize the models.
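    Code sketch (not from the thesis): a minimal illustration of the kind of distilled model the thesis studies, assuming the Hugging Face "transformers" library and the public roberta-base and distilroberta-base checkpoints; it simply compares the parameter counts of the original model and its distilled counterpart.

    from transformers import AutoModel

    # Load the original model and its distilled counterpart from the Hugging Face hub.
    teacher = AutoModel.from_pretrained("roberta-base")
    student = AutoModel.from_pretrained("distilroberta-base")

    def count_params(model):
        # Total number of trainable parameters in the model.
        return sum(p.numel() for p in model.parameters() if p.requires_grad)

    print(f"roberta-base:       {count_params(teacher):,} parameters")
    print(f"distilroberta-base: {count_params(student):,} parameters")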

  2. Text ranking based on semantic meaning of sentences

    Master's thesis, KTH / School of Electrical Engineering and Computer Science (EECS)

    Author: Olivia Stigeborn; [2021]
    Keywords: Natural language processing; Word Embedding; Resume Ranking; Semantic meaning; Språkteknologi (language technology); Ordinbäddning (word embedding); CV rankning (resume ranking); Semantisk betydelse (semantic meaning);

    Abstract: Finding a suitable candidate-to-client match is an important part of a consulting company's work. It takes a lot of time and effort for the company's recruiters to read possibly hundreds of resumes to find a suitable candidate.
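    Code sketch (not from the thesis): a minimal illustration of ranking candidate resumes against a job description by semantic similarity, assuming the "sentence-transformers" library; the model name "all-distilroberta-v1" and the example texts are illustrative choices, not the thesis's actual setup.

    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-distilroberta-v1")

    job_description = "Backend developer with experience in Python and cloud services."
    resumes = [
        "Five years of Python development, AWS deployments, REST APIs.",
        "Graphic designer specialising in branding and print media.",
        "Data engineer building Spark and Python pipelines on GCP.",
    ]

    # Encode the query and the candidate texts into sentence embeddings.
    query_emb = model.encode(job_description, convert_to_tensor=True)
    resume_embs = model.encode(resumes, convert_to_tensor=True)

    # Rank resumes by cosine similarity to the job description, best match first.
    scores = util.cos_sim(query_emb, resume_embs)[0]
    for score, resume in sorted(zip(scores.tolist(), resumes), reverse=True):
        print(f"{score:.3f}  {resume}")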