Search: "domänspecifika språk"
Showing results 1 - 5 of 18 theses containing the words "domänspecifika språk" (domain-specific languages).
1. A Language for Board Games – Development of an Embedded Domain-Specific Language for Describing Board Games
Bachelor's thesis, Göteborgs universitet/Institutionen för data- och informationsteknik. Abstract: In recent years, board games have increasingly moved into the digital medium. One way to make digital board games easier to create is to design a domain-specific language (DSL) for that purpose. This thesis details the process of developing an embedded DSL for describing board games, with Haskell as its host language.
2. Context-aware security testing of Android applications : Detecting exploitable vulnerabilities through Android model-based security testing
Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: This master's thesis explores ways to uncover and exploit vulnerabilities in Android applications by introducing a novel approach to security testing. The research question focuses on finding an effective method for detecting vulnerabilities related to an application's context.
3. Exploring Knowledge Vaults with ChatGPT : A Domain-Driven Natural Language Approach to Document-Based Answer Retrieval
Bachelor's thesis, Mittuniversitetet/Institutionen för data- och elektroteknik (2023-). Abstract: Problem solving is an important aspect of many professions, including factory environments, where problems can lead to reduced production or even production stoppages. This study focuses on a specific domain: a pulp mill, in collaboration with SCA Massa.
4. A Prompting Framework for Natural Language Processing in the Medical Field : Assessing the Potential of Large Language Models for Swedish Healthcare
Master's thesis, KTH/Medicinteknik och hälsosystem. Abstract: The increasing digitisation of healthcare through technology and artificial intelligence has affected the medical field in many ways. Generative Pre-trained Transformers (GPTs) are a family of language models trained on extensive data sets to generate human-like text, and they have been shown to achieve a strong understanding of natural language.
5. An Arrow Metalanguage for Partially Invertible Computation
Master's thesis, KTH/Skolan för elektroteknik och datavetenskap (EECS). Abstract: Programming languages traditionally describe computations going one way: a program might compute a hash value from a string, or an encrypted message from a plaintext. However, sometimes it is also of interest to go the other way: for encryption, we not only want to encrypt messages but also to decrypt them, and to be sure that decryption correctly reproduces the original message.