IoT-Based Service Allocation in Edge Computing Using Game Theory DOI
Kushagra Agrawal, Polat Göktaş, Biswajit Sahoo

et al.

Lecture notes in computer science, Journal Year: 2024, Volume and Issue: unknown, P. 45 - 60

Published: Dec. 31, 2024

Language: English

Embryonic Machine-Deep Learning, Smart Healthcare and Privacy Deliberations in Hospital Industry: Lensing Confidentiality of Patient’s Information and Personal Data in Legal-Ethical Landscapes Projecting Futuristic Dimensions DOI
Bhupinder Singh, Christian Kaunert

Published: Jan. 1, 2024

Language: English

Citations

12

LEVERAGING ARTIFICIAL INTELLIGENCE (AI) IN PUBLIC SECTOR FINANCIAL RISK MANAGEMENT: INNOVATIONS, CHALLENGES, AND FUTURE DIRECTIONS DOI
Mehdi Bouchetara, Messaoud Zerouti, Anaïs Radja Zouambi

et al.

EDPACS, Journal Year: 2024, Volume and Issue: 69(9), P. 124 - 144

Published: July 16, 2024

Language: English

Citations

10

A Comprehensive Review of AI Techniques for Resource Management in Fog Computing: Trends, Challenges, and Future Directions DOI Creative Commons
Deafallah Alsadie

IEEE Access, Journal Year: 2024, Volume and Issue: 12, P. 118007 - 118059

Published: Jan. 1, 2024

Language: English

Citations

10

IoT‐5G and B5G/6G resource allocation and network slicing orchestration using learning algorithms DOI Creative Commons
Ado Adamou Abba Ari, Faustin Samafou, Arouna Ndam Njoya

et al.

IET Networks, Journal Year: 2025, Volume and Issue: 14(1)

Published: Jan. 1, 2025

Abstract The advent of 5G networks has precipitated an unparalleled surge in demand for mobile communication services, propelled by sophisticated wireless technologies. An increasing number of countries are moving from fourth-generation (4G) to fifth-generation (5G) networks, creating a new expectation of services that are dynamic, transparent, and differentiated. It is anticipated that these services will be adapted to a multitude of use cases and become standard practice. The diversity of increasingly complex network infrastructures presents significant challenges, particularly in the management of resources and the orchestration of services. Network Slicing is emerging as a promising approach to address them, as it facilitates efficient Resource Allocation (RA) and supports self-service capabilities. However, effective segmentation and implementation require the development of robust algorithms that guarantee optimal RA. In this regard, artificial intelligence and machine learning (ML) have demonstrated their utility in the analysis of large datasets and the facilitation of intelligent decision-making processes. However, certain ML methodologies have a limited ability to adapt to the evolving environments characteristic of beyond-5G (B5G/6G) networks. This paper examines the specific challenges associated with the evolution of B5G/6G networks, with a particular focus on solutions for RA and dynamic slicing requirements. Moreover, the article presents potential avenues for further research in this domain, with the objective of enhancing the efficiency of next-generation networks through the adoption of innovative technological solutions.

Language: English

Citations

1

Adapting Containerized Workloads for the Continuum Computing DOI Creative Commons
Alberto Robles-Enciso, Antonio Skármeta

IEEE Access, Journal Year: 2024, Volume and Issue: 12, P. 104102 - 104114

Published: Jan. 1, 2024

Container and microservices management platforms are currently among the most important tools for cloud computing, but since the scope of these platforms is homogeneous architectures, they have serious limitations in adapting to new computing paradigms. Therefore, the default scheduler of such systems faces significant challenges when tasked with orchestrating workloads in a Continuum Computing environment, as the heterogeneous nodes have very different characteristics and restrictions. To overcome this limitation we decided to use Kubernetes, as it is a popular tool, and we propose to replace the native scheduler with a reimplementation that gives us complete flexibility in the process of assigning pods to nodes, providing a framework to design algorithms that consider all the parameters necessary for deploying services in the Continuum. In addition, we address the limiting aspects of the K8s scheduler, notably its pod-by-pod allocation approach, which makes it difficult to optimise a set of allocations. To test our proposal, we perform several tests on a real environment based on virtual machines, with stress tests conducted to measure the performance of each method. We then present a series of results that justify the benefits of our proposal, including the reduced overhead of the pod-by-pod approach and how the batch-based approach greatly improves efficiency. The results show the usefulness of both approaches and that the scheduler extension points alone are not enough to support these requirements.
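The batch-based idea described in the abstract, scoring a whole set of pods against all nodes jointly rather than placing pods one at a time, can be illustrated with a minimal sketch. This is not the authors' actual algorithm; the greedy pairwise scoring, the single-resource model, and all names are illustrative assumptions.

```python
# Illustrative batch-based assignment: score every (pod, node) pair, then
# assign greedily from the best-scoring pair down, instead of placing pods
# one at a time as the default Kubernetes scheduler does.
# The scoring rule (prefer the most free CPU after placement) is made up.

def score(pod, node):
    """Higher is better; -inf marks an infeasible placement."""
    free = node["cpu_free"] - pod["cpu"]
    return free if free >= 0 else float("-inf")

def batch_schedule(pods, nodes):
    """Assign a whole batch of pods, considering all pairs jointly."""
    node_by_name = {n["name"]: n for n in nodes}
    pod_by_name = {p["name"]: p for p in pods}
    # Rank all (pod, node) pairs so the best global matches are taken first.
    pairs = sorted(
        ((score(p, n), p["name"], n["name"]) for p in pods for n in nodes),
        reverse=True,
    )
    assignments = {}
    for s, pname, nname in pairs:
        if s == float("-inf") or pname in assignments:
            continue
        node, pod = node_by_name[nname], pod_by_name[pname]
        if node["cpu_free"] >= pod["cpu"]:  # re-check: capacity may have shrunk
            assignments[pname] = nname
            node["cpu_free"] -= pod["cpu"]
    return assignments

pods = [{"name": "web", "cpu": 2}, {"name": "db", "cpu": 3}]
nodes = [{"name": "edge-1", "cpu_free": 3}, {"name": "cloud-1", "cpu_free": 8}]
print(batch_schedule(pods, nodes))  # both fit on cloud-1
```

Because all pairs are ranked before any pod is bound, a batch scheduler can avoid the ordering artifacts of pod-by-pod placement, which is the efficiency gain the abstract reports.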

Language: English

Citations

4

Artificial Intelligence-Based Cloud Planning and Migration to Cut the Cost of Cloud DOI Creative Commons
Sasibhushan Rao Chanthati

American Journal of Smart Technology and Solutions, Journal Year: 2024, Volume and Issue: 3(2), P. 13 - 24

Published: Aug. 7, 2024

The paper titled “Artificial Intelligence-Based Cloud Planning and Migration to Cut the Cost of Cloud” aims to examine how AI can be implemented to improve cloud planning and migration in a bid to reduce their costs. The proposal is concerned with the utilization of multiple techniques, such as machine learning models, natural language processing, and reinforcement learning, to manage the process of migrating to the cloud. By incorporating AI within cloud transitions, it supports organizations' productivity, stability, and security during those transitions. It provides detailed pseudocode and a scenario, making the content sufficiently intelligible for IT professionals who wish to implement these algorithms. In this regard, it helps fill the gap demonstrated in the current literature regarding the link between theoretical uses of AI and its application towards enhancing the deployment efficacy and cost-efficiency of cloud services. The article was first completed in 2021 and later modified with the latest updates to date in 2024.
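The kind of cost-driven planning the abstract describes can be sketched with a toy model: learn each candidate provider's effective billing rate from historical (usage, bill) samples, predict next month's bill, and migrate to the cheapest target. This is a hypothetical illustration, not the paper's pseudocode; the providers, rates, and workload figures are invented.

```python
# Hypothetical AI-assisted migration planner: fit a one-variable
# least-squares model (bill ~ rate * usage) per provider from billing
# history, then pick the provider with the lowest predicted bill.

def fit_rate(samples):
    """Least-squares slope through the origin for (usage, bill) pairs."""
    num = sum(u * b for u, b in samples)
    den = sum(u * u for u, _ in samples)
    return num / den

def plan_migration(history, expected_usage):
    """Return (provider, predicted_bill) with the lowest predicted cost."""
    predictions = {
        provider: fit_rate(samples) * expected_usage
        for provider, samples in history.items()
    }
    best = min(predictions, key=predictions.get)
    return best, round(predictions[best], 2)

# Invented billing history: (cpu_hours, monthly_bill) samples per provider.
history = {
    "provider_a": [(100, 520.0), (200, 1010.0)],
    "provider_b": [(150, 600.0), (300, 1230.0)],
}
print(plan_migration(history, expected_usage=250))
# → ('provider_b', 1020.0)
```

A production planner would add more features (storage, egress, reserved-instance discounts) and a richer model, but the loop of fit, predict, and choose is the same shape.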

Language: English

Citations

4

Machine Learning-Based Resource Management in Fog Computing: A Systematic Literature Review DOI Creative Commons
Fahim Ullah Khan, Ibrar Ali Shah, Sadaqat Jan

et al.

Sensors, Journal Year: 2025, Volume and Issue: 25(3), P. 687 - 687

Published: Jan. 23, 2025

This systematic literature review analyzes machine learning (ML)-based techniques for resource management in fog computing. Utilizing the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol, this paper focuses on ML and deep learning (DL) solutions. Resource management in the fog computing domain was thoroughly analyzed by identifying key factors and constraints. A total of 68 research papers and extended versions were finally selected and included in the study. The findings highlight a strong preference for DL in addressing resource management challenges within the paradigm; i.e., 66% of the reviewed articles leveraged DL techniques, while 34% utilized ML. Key metrics such as latency, energy consumption, task scheduling, and QoS are interconnected and critical for optimization. The analysis reveals the prime challenges addressed by ML-based resource management. Latency is the most frequently addressed parameter, investigated in 77% of the articles, followed by energy consumption and task scheduling at 44% and 33%, respectively. Furthermore, according to our evaluation, an extensive range of challenges, including computational scalability management, data availability and quality, model complexity, and interpretability, were addressed by employing 73, 53, 45, and 46 ML/DL techniques, respectively.

Language: English

Citations

0
