Search: "naturlig bearbetning av språk"

Found 4 theses containing the words naturlig bearbetning av språk.

  1. Automated Vulnerability Management

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Yuhan Ma; [2023]
    Keywords: Software security; Machine learning; Automation; Vulnerability management; Natural language processing;

    Abstract: The field of software security is constantly evolving, and security must be taken into consideration throughout the entire product life cycle. This is particularly important in today's dynamic security landscape, where threats and vulnerabilities constantly change.

  2. Discover patterns within train log data using unsupervised learning and network analysis

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Zehua Guo; [2022]
    Keywords: Log analysis; Natural language processing; Unsupervised learning; Clustering; Network analysis;

    Abstract: With the development of information technology in recent years, log analysis has gradually become a hot research topic. However, manual log analysis requires specialized knowledge and is time-consuming. Therefore, more and more researchers are searching for ways to automate it.

  3. Task-agnostic knowledge distillation of mBERT to Swedish

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Added Kina; [2022]
    Keywords: Natural Language Processing; Transformers; Knowledge Distillation; BERT; Multilingual Models; Cross-Lingual Transfer;

    Abstract: Large transformer models have shown great performance on multiple natural language processing tasks. However, slow inference, a strong dependency on powerful hardware, and high energy consumption limit their availability.

  4. DistillaBSE: Task-agnostic distillation of multilingual sentence embeddings: Exploring deep self-attention distillation with switch transformers

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Boris Bubla; [2021]
    Keywords: Transformers; Knowledge Distillation; Language Agnostic BERT Sentence Embeddings; Natural Language Processing; Switch Transformers;

    Abstract: The recent development of massive multilingual transformer networks has resulted in drastic improvements in model performance. These models, however, are so large that they suffer from high inference latency and consume vast computing resources. Such features hinder widespread adoption of the models in industry and some academic settings.