Serverless Data Pipelines for IoT Data Analytics: A Cloud Vendors Perspective and Solutions DOI
Shivananda R. Poojara, Chinmaya Kumar Dehury, Pelle Jakovits

et al.

Springer eBooks, Journal Year: 2022, Volume and Issue: unknown, P. 107 - 132

Published: Sept. 22, 2022

Language: English

Challenges and Solutions in Network Security for Serverless Computing DOI Open Access
Sina Ahmadi

International Journal of Current Science Research and Review, Journal Year: 2024, Volume and Issue: 07(01)

Published: Jan. 11, 2024

This research study explores the challenges and solutions related to serverless computing so that computer systems connected to a network can be protected. Serverless is defined as a method of managing services without the need for fixed servers. The study uses a qualitative approach, which does not include any numerical data and instead examines the identified non-numerical security challenges in detail. In the literature review, past studies from 2019 to 2023 are reviewed to identify gaps and build a foundation for investigating security. The review is based on thematic analysis, with all findings organized into meaningful themes. The findings include challenges such as data privacy, insecure dependencies, and limited control. The strategies to overcome these include encryption, strong monitoring, and other relevant measures. The study also suggests the use of blockchain technology and Artificial Intelligence. In short, it provides insights to improve security and guides future researchers to innovate creative approaches to these developing challenges.

Language: English

Citations

11

A Serverless Approach for Resource-constrained Smart Locker Networks DOI Open Access

M. Seleiro, José Simão, Nuno Datia

et al.

Procedia Computer Science, Journal Year: 2025, Volume and Issue: 256, P. 602 - 609

Published: Jan. 1, 2025

Language: English

Citations

0

Towards an Edge-Fog-Cloud Serverless Continuum for IoT Data Processing Pipeline DOI
Thanda Shwe, Masayoshi Aritsugi

Published: Feb. 18, 2024

As most Internet of Things (IoT) applications are event-driven, the emergence of the serverless computing paradigm, which is a natural fit for event-driven applications, is promising for hosting multi-tenant IoT applications. Furthermore, the increasing resource capability of low-cost edge and fog devices provides an opportunity to take advantage of the resources available, leading to an edge-fog-cloud continuum that can conduct processing across the entire continuum. To identify the necessary adaptations, we integrate the serverless paradigm into each layer of the continuum and investigate performance parameters by running workloads using benchmarks.
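
To make the event-driven fit concrete, the sketch below shows what a serverless IoT processing handler might look like in the OpenWhisk Python action style (a main(params) function returning a dict). The field names ("readings", "threshold") and the aggregation logic are illustrative assumptions rather than details taken from the paper; in principle the same handler could be deployed at the edge, fog, or cloud layer of the continuum.

# Minimal sketch of an event-driven IoT processing function in the
# OpenWhisk Python action style (def main(params) -> dict).  Field
# names and aggregation logic are illustrative assumptions only.

def main(params):
    readings = params.get("readings", [])          # numeric sensor samples
    threshold = float(params.get("threshold", 0))  # alert threshold, if any

    if not readings:
        return {"count": 0, "mean": None, "alerts": 0}

    mean = sum(readings) / len(readings)
    alerts = sum(1 for r in readings if r > threshold)

    # Only the aggregate leaves this layer, reducing upstream traffic.
    return {"count": len(readings), "mean": mean, "alerts": alerts}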

Language: English

Citations

3

Performance Analysis of Apache OpenWhisk Across the Edge-Cloud Continuum DOI

Areej Alabbas, Ashish Kaushal, Osama Almurshed

et al.

Published: July 1, 2023

Serverless computing offers opportunities for auto-scaling, a pay-for-use cost model, quicker deployment, and faster updates to support services. Apache OpenWhisk is one such open-source, distributed serverless platform that can be used to execute user functions in a stateless manner. We conduct a performance analysis of OpenWhisk on an edge-cloud continuum, using a function chain for video applications. We consider a combination of Raspberry Pi and cloud nodes to deploy OpenWhisk, modifying a number of parameters, such as the maximum memory limit and the runtime, to investigate application behaviours. The five main factors considered are: cold and warm activation, input size, CPU architecture, runtime packages used, and concurrent invocations. The results have been evaluated in terms of initialization, execution time, minimum memory requirement, inference time, and accuracy.
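
Cold versus warm activation, one of the five factors the study varies, can be measured with a small timing harness around the wsk CLI, as in the hedged sketch below. The action name and repetition count are hypothetical, it assumes a working wsk client pointed at an OpenWhisk deployment, and the paper's own benchmark (function chains, Raspberry Pi and cloud nodes, varying memory limits) is considerably more elaborate.

# Minimal sketch of a cold vs. warm activation timing harness around
# the wsk CLI.  Action name and repetition count are assumptions.

import subprocess
import time

ACTION = "video-frame-classifier"   # hypothetical action name
REPETITIONS = 10

def invoke_once():
    """Blocking invocation; returns wall-clock latency in seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["wsk", "action", "invoke", ACTION, "--blocking", "--result"],
        check=True, capture_output=True,
    )
    return time.perf_counter() - start

# The first call after (re)deployment is typically a cold start; the
# following calls reuse a warm container.
latencies = [invoke_once() for _ in range(REPETITIONS)]
print(f"cold (first) : {latencies[0]:.3f} s")
print(f"warm (mean)  : {sum(latencies[1:]) / (REPETITIONS - 1):.3f} s")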

Language: English

Citations

6

Comparison of Reinforcement Learning Algorithms for Edge Computing Applications Deployed by Serverless Technologies DOI Creative Commons
Mauro Femminella, Gianluca Reali

Algorithms, Journal Year: 2024, Volume and Issue: 17(8), P. 320 - 320

Published: July 23, 2024

Edge computing is one of the technological areas currently considered among the most promising for the implementation of many types of applications. In particular, IoT-type applications can benefit from reduced latency and better data protection. However, the price that typically has to be paid for the offered opportunities includes the need to use a limited amount of resources compared to a traditional cloud environment. Indeed, it may happen that only a single node is used. In these situations, it is essential to introduce memory and resource management techniques that allow optimized usage while still guaranteeing acceptable performance in terms of probability of rejection. For this reason, serverless technologies managed by reinforcement learning algorithms are an active area of research. In this paper, we explore and compare the performance of some machine learning algorithms for managing horizontal function autoscaling in an edge system. We make use of an open-source platform deployed in a Kubernetes cluster and experimentally fine-tune the algorithms. The results obtained are useful both for understanding the basic mechanisms of typical edge systems and related technologies, and for determining system behaviour, guiding configuration choices for operation.
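
As a rough illustration of the class of algorithms being compared, the sketch below implements tabular Q-learning for a toy horizontal-autoscaling problem. The discretized state space, action set, reward shape, and synthetic workload are assumptions made for brevity and are not the paper's experimental setup.

# Minimal sketch of tabular Q-learning for horizontal function
# autoscaling.  Environment, states, and reward are toy assumptions.

import random

STATES = range(5)        # discretized load levels
ACTIONS = (-1, 0, +1)    # remove / keep / add one function replica
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def step(load, replicas, action):
    """Toy environment: reward trades off latency against resource use."""
    replicas = max(1, min(10, replicas + action))
    latency_penalty = max(0, load - replicas)      # under-provisioning
    resource_penalty = 0.2 * replicas              # memory/CPU footprint
    reward = -(latency_penalty + resource_penalty)
    next_load = random.choice(list(STATES))        # synthetic workload
    return next_load, replicas, reward

load, replicas = 2, 1
for _ in range(5000):
    a = (random.choice(ACTIONS) if random.random() < EPSILON
         else max(ACTIONS, key=lambda x: q[(load, x)]))
    next_load, replicas, r = step(load, replicas, a)
    best_next = max(q[(next_load, x)] for x in ACTIONS)
    q[(load, a)] += ALPHA * (r + GAMMA * best_next - q[(load, a)])
    load = next_load

# Learned scaling decision for each load level.
print({s: max(ACTIONS, key=lambda x: q[(s, x)]) for s in STATES})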

Language: English

Citations

1

Moving Healthcare AI Support Systems for Visually Detectable Diseases to Constrained Devices DOI Creative Commons
Tess Watt, Christos Chrysoulas, Peter J. Barclay

et al.

Applied Sciences, Journal Year: 2024, Volume and Issue: 14(24), P. 11474 - 11474

Published: Dec. 10, 2024

Image classification usually requires connectivity and access to the cloud, which is often limited in many parts of the world, including hard-to-reach rural areas. Tiny machine learning (tinyML) aims to solve this problem by hosting artificial intelligence (AI) assistants on constrained devices, eliminating connectivity issues by processing data within the device itself, without Internet or cloud access. This study explores the use of tinyML to provide healthcare support with low-spec devices in low-connectivity environments, focusing on the diagnosis of skin diseases and the ethics of AI in such a setting. To investigate this, images of skin lesions were used to train a model for classifying visually detectable diseases (VDDs). The model weights were then offloaded to a Raspberry Pi with a webcam attached, to be used on-device. It was found that the developed prototype achieved a test accuracy of 78% when trained on the HAM10000 dataset, and 85% on the ISIC 2020 Challenge dataset.
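
A minimal sketch of the on-device inference step described here, using the TensorFlow Lite runtime on a Raspberry Pi, is given below. The model file name, the input preprocessing (a float model normalized to [0, 1]), and the two-class label set are assumptions for illustration, not the study's actual artefacts.

# Minimal sketch of on-device image classification with the TFLite
# runtime.  File names, input scaling, and labels are assumptions.

import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter

MODEL_PATH = "skin_lesion_classifier.tflite"   # hypothetical exported model
IMAGE_PATH = "lesion.jpg"                      # e.g. a frame from the webcam
LABELS = ["benign", "malignant"]               # hypothetical label set

interpreter = Interpreter(model_path=MODEL_PATH)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Resize to the model's expected input shape and normalize to [0, 1]
# (assumes a float32 model rather than a quantized one).
_, height, width, _ = inp["shape"]
image = Image.open(IMAGE_PATH).convert("RGB").resize((width, height))
data = np.expand_dims(np.asarray(image, dtype=np.float32) / 255.0, axis=0)

interpreter.set_tensor(inp["index"], data)
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])[0]
print(LABELS[int(np.argmax(scores))], float(np.max(scores)))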

Language: English

Citations

1

A General-Purpose Distributed Analytic Platform Based on Edge Computing and Computational Intelligence Applied on Smart Grids DOI Creative Commons
Juan Ignacio Guerrero, Antonio Martín, Antonio Parejo

et al.

Sensors, Journal Year: 2023, Volume and Issue: 23(8), P. 3845 - 3845

Published: April 9, 2023

Currently, in many data landscapes, information is distributed across various sources and presented in diverse formats. This fragmentation can pose a significant challenge to the efficient application of analytical methods. In this sense, data mining is mainly based on clustering or classification techniques, which are easier to implement in distributed environments. However, the solution of some problems requires the usage of mathematical equations or stochastic models, which are more difficult to distribute. Usually, these types of problems need to centralize the required information before the modelling technique is applied. In distributed environments, this centralization may cause an overloading of communication channels due to massive data transmission and also privacy issues when sending sensitive data. To mitigate this problem, this paper describes a general-purpose distributed analytic platform based on edge computing for smart grid networks. Through the distributed analytic engine (DAE), the calculation process of expressions (that require data from several sources) is decomposed between the existing nodes, which allows exchanging partial results without exchanging the original information. In this way, the master node ultimately obtains the result of the expressions. The proposed platform is examined using three different computational intelligence algorithms, i.e., a genetic algorithm, a genetic algorithm with evolution control, and particle swarm optimization, to decompose the expression to be calculated and to distribute the tasks among the nodes. The platform has been successfully applied to a case study focused on key performance indicators of a smart grid, achieving a reduction in the number of messages by more than 91% compared to the traditional approach.
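
The decomposition idea can be pictured with a toy example: a global expression (here, the mean of a measurement held across several nodes) is split so that each node returns only a partial aggregate and the master combines them, never seeing the raw records. The node names, data, and expression below are invented for illustration and are not the paper's DAE.

# Toy sketch of decomposing an expression into per-node partial results.
# Node names, values, and the expression are illustrative assumptions.

# Raw measurements stay local to each node.
node_data = {
    "substation_a": [10.2, 11.5, 9.8],
    "substation_b": [12.1, 12.4],
    "substation_c": [8.7, 9.9, 10.4, 11.0],
}

def partial_result(values):
    """Computed on the node; only the aggregate leaves the node."""
    return {"sum": sum(values), "count": len(values)}

# Master node: combine partials into the final indicator.
partials = [partial_result(v) for v in node_data.values()]
total_sum = sum(p["sum"] for p in partials)
total_count = sum(p["count"] for p in partials)
print("global mean:", total_sum / total_count)   # one message per node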

Language: English

Citations

2

An Empirical Study on Edge-to-Cloud Continuum for Smart Applications: Performance, Design Patterns, and Key Factors DOI

Norman Chen, Adel N. Toosi, Bahman Javadi

et al.

Published: July 7, 2024

Language: English

Citations

0

Application of Proximal Policy Optimization for Resource Orchestration in Serverless Edge Computing DOI Creative Commons
Mauro Femminella, Gianluca Reali

Computers, Journal Year: 2024, Volume and Issue: 13(9), P. 224 - 224

Published: Sept. 6, 2024

Serverless computing is a new cloud model suitable for providing services in both large and edge clusters. In edge clusters, the autoscaling of functions plays a key role on serverless platforms, as dynamic scaling of function instances can lead to reduced latency and efficient resource usage, typical requirements of edge-hosted services. However, a badly configured autoscaler can introduce unexpected latency due to so-called "cold start" events or service request losses. In this work, we focus on the optimization of resource-based autoscaling in OpenFaaS, the most-adopted open-source Kubernetes-based serverless platform, leveraging real-world traffic traces. We resort to a reinforcement learning algorithm named Proximal Policy Optimization to dynamically configure the target value of the Kubernetes Horizontal Pod Autoscaler, trained on real traffic. This was accomplished via a state space able to take into account resource consumption, performance values, and the time of day. In addition, the reward definition promotes Service-Level Agreement (SLA) compliance. We evaluate the proposed agent, comparing its performance in terms of average latency, CPU and memory consumption, and loss percentage with respect to the baseline system. The experimental results show the benefits provided by the agent, obtaining latency within the SLA while limiting resource consumption and request loss.
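
As a hedged sketch of the ingredients described here, the snippet below shows one possible shape for the agent's observation (resource consumption, performance, time of day) and an SLA-aware reward. The feature scaling, weights, and SLA threshold are assumptions for illustration, not the paper's exact definitions.

# Sketch of an observation vector and SLA-aware reward for an RL agent
# tuning autoscaling.  Thresholds and weights are assumed values.

SLA_LATENCY_MS = 200.0      # hypothetical Service-Level Agreement bound

def build_state(cpu_util, mem_util, latency_ms, hour_of_day):
    """Observation: resource consumption, performance, and time of day."""
    return (cpu_util, mem_util, latency_ms / SLA_LATENCY_MS, hour_of_day / 23.0)

def reward(latency_ms, cpu_util, mem_util, lost_requests):
    """Reward SLA compliance; penalize resource usage and request losses."""
    sla_bonus = 1.0 if latency_ms <= SLA_LATENCY_MS else -1.0
    return sla_bonus - 0.3 * (cpu_util + mem_util) - 2.0 * lost_requests

# Example: a compliant, moderately loaded control step.
print(build_state(0.55, 0.40, 150.0, hour_of_day=14))
print(reward(150.0, 0.55, 0.40, lost_requests=0))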

Language: English

Citations

0