Serverless Microservice Architecture for Cloud-Edge Intelligence in Sensor Networks / Loconte, Davide; Ieva, Saverio; Gramegna, Filippo; Bilenchi, Ivano; Fasciano, Corrado; Pinto, Agnese; Loseto, Giuseppe; Scioscia, Floriano; Ruta, Michele; Di Sciascio, Eugenio. - In: IEEE SENSORS JOURNAL. - ISSN 1530-437X. - Electronic. - (In press).
Serverless Microservice Architecture for Cloud-Edge Intelligence in Sensor Networks
Davide Loconte; Saverio Ieva; Filippo Gramegna; Ivano Bilenchi; Corrado Fasciano; Agnese Pinto; Giuseppe Loseto; Floriano Scioscia; Michele Ruta; Eugenio Di Sciascio
In press
Abstract
Machine Learning (ML) is increasingly exploited in a wide range of application areas to analyze data streams from large-scale sensor networks, train predictive models, and perform inference. The Cloud-Edge Intelligence (CEI) computing paradigm integrates cloud infrastructures for resource-intensive ML tasks with devices at the border of a local network for distributed data preprocessing, small-scale model training, and prediction tasks. This enables a tunable trade-off between ML accuracy and improved data privacy, response latency, and bandwidth usage. Prevalent CEI architectures are based on microservices encapsulated in containers, but serverless computing is emerging as an alternative model: it relies on stateless, event-driven functions to facilitate the development and provisioning of application components, increase infrastructure elasticity, and reduce management effort. This paper proposes a novel CEI framework for sensor-based applications that exploits serverless computing for data management and ML tasks. Small-scale model training occurs at the edge with local data for quick prediction response, while large-scale models are trained in the cloud on the full sensor network data and then fed back to edge nodes to progressively improve accuracy. A fully functional prototype has been built by leveraging open source software tools, selected devices for field sensing and edge computing, and a commercial cloud platform. Experiments validate the feasibility and sustainability of the proposal, compared to an existing container-oriented microservice architecture.
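For illustration only, the sketch below shows how a stateless, event-driven function of the kind described in the abstract might serve edge-side predictions, perform small-scale incremental training on local sensor data, and install a large-scale model pushed back from the cloud. The handler entry point, the use of Python with scikit-learn, the /tmp/model.pkl storage path, and the event field names are assumptions made for this example; the paper's actual runtime, tools, and message formats are not specified in the abstract.

```python
# Hypothetical serverless handler sketch; not the paper's implementation.
import json
import pickle
from pathlib import Path

from sklearn.linear_model import SGDRegressor  # assumed small-scale edge model

MODEL_PATH = Path("/tmp/model.pkl")  # assumed node-local model location


def _load_model():
    """Return the most recently stored model: either the locally trained
    one or the large-scale model fed back from the cloud."""
    if MODEL_PATH.exists():
        return pickle.loads(MODEL_PATH.read_bytes())
    return SGDRegressor()  # cold start: untrained local model


def handle(event: str) -> str:
    """Stateless event-driven entry point: each invocation parses the event,
    performs one task, persists only to node-local storage, and returns."""
    payload = json.loads(event)
    model = _load_model()

    if payload["type"] == "train":
        # Small-scale incremental training at the edge on a local reading;
        # "features" is a list of numbers, "target" a single value.
        model.partial_fit([payload["features"]], [payload["target"]])
        MODEL_PATH.write_bytes(pickle.dumps(model))
        return json.dumps({"status": "trained"})

    if payload["type"] == "model_update":
        # Large-scale model trained in the cloud, fed back to the edge node
        # (hex-encoded pickle is an assumption of this sketch).
        MODEL_PATH.write_bytes(bytes.fromhex(payload["model_hex"]))
        return json.dumps({"status": "model replaced"})

    # Default: low-latency prediction at the edge; assumes a model has
    # already been trained locally or received from the cloud.
    prediction = model.predict([payload["features"]])[0]
    return json.dumps({"prediction": float(prediction)})
```

Because each invocation reloads the model from node-local storage, the function itself holds no state between events, matching the serverless model the abstract describes; a cloud-trained model simply overwrites the locally trained one when it arrives.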