Search: "naturligtspråkbehandling"
Showing results 1 - 5 of 6 theses containing the word naturligtspråkbehandling.
1. Classification of invoices using a 2D NLP approach : A comparison between methods for invoice information extraction for the purpose of classification
Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS). Abstract: Many companies handle a large number of invoices every year, and categorizing them manually takes considerable time and resources. For a model to categorize invoices automatically, the documents must first be properly read and processed by the model.
2. Evaluating the robustness of DistilBERT to data shift in toxicity detection
Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS). Abstract: With the rise of social media, cyberbullying and the online spread of hate have become serious problems with devastating consequences. Mentimeter is an interactive presentation tool that enables the audience to participate by typing their own answers to questions asked by the presenter.
3. Classifying and Comparing Latent Space Representations of Unstructured Log Data
Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS). Abstract: This thesis explores and compares various methods for producing vector representations of unstructured log data. Ericsson wanted to investigate machine learning methods for analyzing logs produced by its systems, in order to reduce the cost and effort of manual log analysis.
4. Automating Question Generation Given the Correct Answer
Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS). Abstract: In this thesis, we propose an end-to-end deep learning model for a question generation task. Given a Wikipedia article written in English and a segment of text appearing in the article, the model can generate a simple question whose answer is the given text segment. The model is based on an encoder-decoder architecture.
5. Using Bidirectional Encoder Representations from Transformers for Conversational Machine Comprehension
Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS). Abstract: Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language representation model designed to pre-train deep bidirectional representations, with the goal of extracting context-sensitive features from an input text [1]. One of the challenging problems in the field of Natural Language Processing is Conversational Machine Comprehension (CMC).