Performance Evaluation of Serverless Edge Computing for AI Applications: Implementation, evaluation and modeling of an object-detection application running on a serverless architecture implemented with Kubernetes

This is a Master's thesis from KTH / School of Electrical Engineering and Computer Science (EECS)

Abstract: Serverless edge computing is a distributed network and computing system in which data is processed at the edge of the network on a serverless architecture. It can provide large-scale computing and storage resources with low latency, which are very useful in AI applications such as object detection. However, serverless computing architectures are commonly analyzed with simple models, such as single-server or multi-server queues, and it is important to verify that these models actually explain the behavior of real systems. This project therefore focuses on the performance evaluation of serverless edge computing for AI applications, with the aim of proposing more realistic and accurate models of real serverless architectures. Our objective is to evaluate the performance of, and mathematically model, an object-detection application running on a serverless architecture implemented with Kubernetes. The project provides a detailed description of the implementation of the serverless platform and the YOLOv5-based object-detection application. After implementation, we design experiments and evaluate both the response time and the quality of the object-detection results. We conclude that the number of users in the system significantly affects the service time, and we observe that requests do not queue in the system, so mathematical models with an explicit queue alone cannot describe it. We therefore consider the processor-sharing model more appropriate for modeling this serverless architecture. This gives insight into how to build more realistic and accurate queueing models for serverless architectures. As future work, other researchers can reproduce our serverless platform and extend it, for example by deploying other serverless applications on it and evaluating their performance, or by designing other experimental use cases and further analyzing the queueing behavior of serverless architectures based on this project.
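As a brief illustration of the modeling idea mentioned above (not taken from the thesis itself): in the classical M/G/1 processor-sharing model, all requests present in the system share the server capacity equally instead of waiting in a queue, and the mean sojourn time depends only on the mean service requirement,

\[
\mathbb{E}[T] \;=\; \frac{\mathbb{E}[S]}{1-\rho}, \qquad \rho = \lambda\,\mathbb{E}[S] < 1,
\]

where \(\lambda\) is the arrival rate and \(\mathbb{E}[S]\) the mean service requirement. In contrast to a FCFS single-server queue, every request is in service immediately, which is consistent with the observation that there is no waiting queue in the measured system.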
