ASSESSING STRATEGIES FOR BEHAVIOUR CONSISTENCY CHECKING USING LLMS

This is a Bachelor's thesis from Mälardalens universitet/Akademin för innovation, design och teknik

Author: Oskar Berglund; [2024]


Abstract: UML diagrams are used today to aid developers in a variety of ways. Among other things, UML supports the creation of abstract representations of projects and of views onto different aspects of the system being modelled. When a developer is implementing functionality described in a UML diagram, it is important to know whether any inconsistencies exist between the diagram and the implementation. While inconsistency tolerance is a commonly followed paradigm, inconsistencies still need to be identified so that the important ones can be resolved. These inconsistencies, and how to find them, are the focus of this thesis. We perform behaviour consistency checks between a UML activity diagram and its associated code implementation. To test ChatGPT's ability to perform behaviour consistency checks, we gave it a UML diagram and a code base implementing that diagram as input, and asked it to respond with a behaviour consistency check between them. Because ChatGPT answers differently every time, even to identical prompts, the results varied; by establishing criteria and rules, however, we could define and conclude how ChatGPT can be used for behaviour consistency checks. We found that ChatGPT can perform these checks, but with varying success. When presented with two artefacts containing no inconsistencies, the check was correct every time; when inconsistencies existed between the artefacts, the checks could be incorrect. We therefore conclude that ChatGPT can be used as a tool for behaviour consistency checks, but that specific approaches are needed to limit the risk of erroneous results.
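The prompting setup described in the abstract can be sketched roughly as follows. This is an illustrative assumption, not the thesis's actual prompt or materials: the function name, the prompt wording, and the example artefacts (a PlantUML activity diagram and a small handler function) are all hypothetical.

```python
# Sketch of the kind of prompt used for a behaviour consistency check.
# All names and artefact contents here are illustrative assumptions,
# not the thesis's actual prompts or materials.

def build_consistency_prompt(activity_diagram: str, code: str) -> str:
    """Assemble a single prompt asking an LLM to compare a UML
    activity diagram (e.g. in PlantUML text form) with its code."""
    return (
        "Perform a behaviour consistency check between the UML activity "
        "diagram and the code implementation below. List every "
        "inconsistency you find, or state that the artefacts are "
        "consistent.\n\n"
        "--- UML activity diagram ---\n"
        f"{activity_diagram}\n\n"
        "--- Code implementation ---\n"
        f"{code}\n"
    )

diagram = """@startuml
start
:read input;
if (input valid?) then (yes)
  :process input;
else (no)
  :report error;
endif
stop
@enduml"""

implementation = """def handle(data):
    if is_valid(data):
        return process(data)
    raise ValueError("invalid input")"""

prompt = build_consistency_prompt(diagram, implementation)
# The assembled prompt would then be sent to the chat model; because the
# model answers differently each time, repeating the check and comparing
# answers is one way to mitigate response variability.
```

Since the thesis found the checks unreliable when inconsistencies were present, a setup like this would typically send the same prompt several times and aggregate the answers rather than trust a single response.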
