Deployment of AI Model inside Docker on ARM-Cortex-based Single-Board Computer: Technologies, Capabilities, and Performance

This is a Master's thesis from Blekinge Tekniska Högskola / Institutionen för datalogi och datorsystemteknik.

Abstract: IoT has become tremendously popular. It provides information access, processing, and connectivity for a huge number of devices and sensors. IoT systems, however, often do not process information locally; instead, they send it to remote locations in the Cloud. This adds a large amount of data traffic to the network and additional delay to data processing. The latter can have a significant impact on applications that require fast response times, such as sophisticated artificial intelligence (AI) applications including augmented reality, face recognition, and object detection. Consequently, the edge computing paradigm, which enables computation near the data source, has gained significant importance in recent years. IoT devices can be employed to provide computational resources at the edge of the network, near the sensors and actuators. The aim of this thesis is to design and implement an edge computing concept that brings AI models to a small embedded IoT device through virtualization. Virtualization technology enables easy packaging and shipping of applications to different hardware platforms; it also enables the mobility of AI models between edge devices and the Cloud. We implement an AI model inside a Docker container, which is deployed on a Firefly-RK3399 single-board computer (SBC). Furthermore, we conduct CPU and memory performance evaluations of Docker on the Firefly-RK3399. The methodology adopted to reach our goal is experimental research. First, the relevant literature was studied in order to demonstrate, through implementation, the feasibility of our concept. We then set up an experiment covering the measurement of performance metrics under synthetic load in multiple scenarios. Results are validated by repeating the experiment and by statistical analysis. The results of this study show that an AI model can successfully be deployed and executed inside a Docker container on an ARM Cortex-based single-board computer: a Docker image of the OpenFace face recognition model is built for the ARM architecture of the Firefly SBC. The performance evaluation reveals that the overhead of Docker in terms of CPU and memory is negligible. The research work describes the mechanisms by which an AI application can be containerized on the ARM architecture, and we conclude that these methods can be applied to containerize software applications on ARM-based IoT devices. Furthermore, the insignificant overhead introduced by Docker allows applications to be deployed inside containers with little performance loss. The capabilities of the IoT device, i.e. the Firefly-RK3399, are explored in this thesis; the device is shown to be capable and powerful, which provides an insight for further studies.
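As a rough illustration of the measurement step summarized above, the following is a minimal sketch assuming the Docker SDK for Python is installed on the host. The image tag openface-arm:latest and the load command are hypothetical placeholders, not the actual image or scripts used in the thesis. The sketch launches a containerized workload and samples its CPU and memory usage through the Docker Engine stats API, which is one way such overhead measurements could be automated.

```python
# Sketch only: run a containerized workload and sample its CPU/memory usage
# via the Docker Engine stats API. Image and command are placeholders.
import docker

IMAGE = "openface-arm:latest"      # hypothetical ARM image tag
COMMAND = "python3 benchmark.py"   # hypothetical synthetic-load script

def cpu_percent(stats: dict) -> float:
    """Approximate container CPU usage (%) from one stats snapshot,
    using the usual delta between cpu_stats and precpu_stats."""
    cpu = stats["cpu_stats"]["cpu_usage"]["total_usage"]
    pre_cpu = stats["precpu_stats"].get("cpu_usage", {}).get("total_usage", 0)
    sys = stats["cpu_stats"].get("system_cpu_usage", 0)
    pre_sys = stats["precpu_stats"].get("system_cpu_usage", 0)
    online = stats["cpu_stats"].get("online_cpus", 1)
    cpu_delta, sys_delta = cpu - pre_cpu, sys - pre_sys
    return (cpu_delta / sys_delta) * online * 100.0 if sys_delta > 0 else 0.0

def main() -> None:
    client = docker.from_env()
    container = client.containers.run(IMAGE, COMMAND, detach=True)
    try:
        # Single non-streaming snapshot of the container's resource usage.
        stats = container.stats(stream=False)
        mem_mib = stats["memory_stats"].get("usage", 0) / (1024 ** 2)
        print(f"CPU: {cpu_percent(stats):.1f} %  memory: {mem_mib:.1f} MiB")
    finally:
        container.stop()
        container.remove()

if __name__ == "__main__":
    main()
```

In practice such a sampler would be run repeatedly for each load scenario, alongside an equivalent native (non-containerized) run, so that the Docker overhead can be estimated statistically.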
