Artificial Intelligence-Defined Wireless Networking for Computational Offloading and Resource Allocation in Edge Computing Networks
Syed Danial Ali Shah, Mark Gregory, Fayçal Bouhafs, et al.

IEEE Open Journal of the Communications Society, Journal Year: 2024, Volume and Issue: 5, P. 2039 - 2057

Published: Jan. 1, 2024

The advent of the Internet of Everything and new Ultra-Reliable Low-Latency Communication (URLLC) services has resulted in exponential growth of data demands at the network's edge. To meet the stringent performance requirements of evolving 5G (and beyond) applications, deploying dedicated resources closer to mobile users is essential. Multi-Access Edge Computing (MEC) is a promising technology for bringing computational resources closer to users. However, the distributed and limited MEC resources must be effectively optimized to maximize the number of users benefiting from low-latency services in each time slot in highly congested, large-scale, and dynamic wireless network scenarios. In this research, we propose and evaluate a novel Artificial Intelligence-Defined Wireless Networking (AIDWN) approach that builds on conventional Software-Defined Networking (SDN), implementing an AI-defined application plane for offloading and resource allocation in MEC-enabled networks. AIDWN implements a deep reinforcement learning framework with neural networks that dynamically adapt offloading decisions while considering the handover, mobility, and coordination challenges of multi-MEC server environments. Compared with recent state-of-the-art proposals, the proposed approach demonstrates a substantial improvement, utilizing more than 90% of resources per time slot across all servers, and it accommodates significantly more users in congested scenarios. We identify various future research directions, highlighting the potential of AI-defined networking for simplifying the management of next-generation networks.
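
To make the paper's decision loop concrete: a deep reinforcement learning offloading agent observes per-server load and task features each time slot and picks an offloading target. The sketch below is a minimal, illustrative stand-in (semi-gradient Q-learning with a linear approximator, not the AIDWN architecture); the state features, reward shape, and hyperparameters are assumptions.

```python
# Minimal sketch of a DRL-style offloading policy for a multi-MEC system.
# Illustrative only: state features, reward shape, and hyperparameters are
# assumptions, not the AIDWN paper's actual design.
import numpy as np

rng = np.random.default_rng(0)

N_SERVERS = 3                     # actions 0..N-1: offload to server i
N_ACTIONS = N_SERVERS + 1         # last action: execute locally
STATE_DIM = N_SERVERS + 2         # per-server load + task size + link quality

W = rng.normal(scale=0.1, size=(N_ACTIONS, STATE_DIM))  # linear Q approximator

def q_values(state):
    return W @ state

def reward(state, action):
    """Toy reward: low latency is good. Local execution pays compute delay;
    offloading pays transmission delay plus the chosen server's queueing delay."""
    loads, task_size, link = state[:N_SERVERS], state[-2], state[-1]
    if action == N_SERVERS:                      # local execution
        return -task_size * 2.0
    return -(task_size / max(link, 0.1) + loads[action])

eps, alpha, gamma = 0.1, 0.01, 0.9
state = rng.uniform(size=STATE_DIM)
for step in range(5000):
    # epsilon-greedy action selection over servers + local execution
    action = rng.integers(N_ACTIONS) if rng.random() < eps else int(np.argmax(q_values(state)))
    r = reward(state, action)
    next_state = rng.uniform(size=STATE_DIM)     # stand-in for network dynamics
    td_error = r + gamma * np.max(q_values(next_state)) - q_values(state)[action]
    W[action] += alpha * td_error * state        # semi-gradient Q-learning update
    state = next_state

print("learned per-action Q for a sample state:", np.round(q_values(state), 3))
```

A full AIDWN-style agent would replace the linear approximator with neural networks and draw next_state from real network dynamics (handover, mobility, server coordination) rather than random samples.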

Language: English

Wireless Powered Mobile Edge Computing Networks: A Survey
Xiaojie Wang, Jiameng Li, Zhaolong Ning, et al.

ACM Computing Surveys, Journal Year: 2023, Volume and Issue: 55(13s), P. 1 - 37

Published: Jan. 11, 2023

Wireless Powered Mobile Edge Computing (WPMEC) is an integration of Mobile Edge Computing (MEC) and Wireless Power Transfer (WPT) technologies, intended both to improve the computing capabilities of mobile devices and to compensate for their limited battery capacities. Generally, energy transmitters, mobile devices, and edge servers form a WPMEC system that realizes a closed loop of sending and collecting energy as well as offloading and receiving task data. Due to the constraints of time-varying network environments, time-coupled battery energy levels, and the half-duplex character of devices, the joint design of computation offloading and resource allocation solutions in WPMEC systems has become extremely challenging, and a great number of studies have been devoted to it in recent years. In this article, we first introduce the basic model of a WPMEC system. Then, we present the key issues and techniques related to WPMEC. In addition, we summarize solutions to critical problems in WPMEC networks. Finally, we discuss some research challenges and open issues.
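
The closed loop described here hides a core trade-off: a longer harvesting phase yields more transmit energy but leaves less time to offload. A minimal sketch of that harvest-then-offload time split, with an assumed linear energy-harvesting model and Shannon-rate uplink (all constants illustrative):

```python
# Sketch of the classic "harvest-then-offload" trade-off in WPMEC.
# In the first fraction tau of a slot the device harvests RF energy; in the
# remaining (1 - tau) it offloads using that energy. Constants are illustrative.
import math

def offloaded_bits(tau, T=1.0, eta=0.6, P_tx=3.0, h=0.05, B=1e6, N0=1e-9):
    """Bits offloaded in one slot of length T seconds for harvest fraction tau."""
    harvested = eta * P_tx * h * tau * T          # energy harvested (J)
    t_off = (1.0 - tau) * T                       # offloading time (s)
    if t_off <= 0:
        return 0.0
    p_dev = harvested / t_off                     # device spends all its energy
    rate = B * math.log2(1.0 + p_dev * h / N0)    # Shannon rate (bit/s)
    return rate * t_off

# Coarse grid search over the time split: more harvesting means more transmit
# power but less time to use it, so throughput peaks at an interior tau.
best = max(((tau / 100, offloaded_bits(tau / 100)) for tau in range(1, 100)),
           key=lambda p: p[1])
print(f"best tau = {best[0]:.2f}, bits offloaded ~ {best[1]:.3e}")
```

The time-coupled battery levels mentioned in the abstract make the real problem harder than this single-slot toy: leftover energy links every slot's decision to the next.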

Language: English

Citations: 88

Machine Learning for Large-Scale Optimization in 6G Wireless Networks
Yandong Shi, Lixiang Lian, Yuanming Shi, et al.

IEEE Communications Surveys & Tutorials, Journal Year: 2023, Volume and Issue: 25(4), P. 2088 - 2132

Published: Jan. 1, 2023

The sixth generation (6G) wireless systems are envisioned to enable the paradigm shift from "connected things" to "connected intelligence", featured by ultra-high density, large-scale and dynamic heterogeneity, diversified functional requirements, and machine learning capabilities, which leads to a growing need for highly efficient intelligent algorithms. Classic optimization-based algorithms usually require a precise mathematical model of data links and suffer from poor performance and high computational cost in realistic 6G applications. Building on domain knowledge (e.g., optimization models and theoretical tools), machine learning (ML) stands out as a promising and viable methodology for many complex large-scale optimization problems in 6G, due to its superior performance, computational efficiency, scalability, and generalizability. In this paper, we systematically review the most representative "learning to optimize" techniques in diverse domains of 6G wireless networks by identifying the inherent feature of the underlying optimization problem and investigating the specifically designed ML frameworks from the perspective of optimization. In particular, we cover algorithm unrolling, learning to branch-and-bound, graph neural networks for structured optimization, deep reinforcement learning for stochastic optimization, end-to-end learning for semantic optimization, as well as federated learning for distributed optimization, for solving challenging large-scale problems arising from a variety of crucial wireless applications. Through in-depth discussion, we shed light on the excellent performance of ML-based optimization with respect to classical methods and provide insightful guidance for developing advanced ML techniques in 6G networks. Neural network design, theoretical tools for different ML methods, implementation issues, and challenges and future research directions are also discussed to support the practical use of ML for large-scale optimization in 6G networks.
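
Of the surveyed families, algorithm unrolling is the most self-contained to illustrate: a classical iterative solver is rewritten as a fixed stack of layers whose parameters can then be trained. Below is a sketch that unrolls ISTA for sparse recovery; in learned variants such as LISTA the matrices W1, W2 and the threshold become trainable, but here they are fixed to their classical values so the example runs stand-alone.

```python
# Sketch of algorithm unrolling: ISTA for sparse recovery written as a fixed
# number of "layers". In learned variants (e.g., LISTA) W1, W2, and theta
# below become trainable parameters; here they keep their classical values.
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 40, 100, 5
A = rng.normal(size=(m, n)) / np.sqrt(m)       # measurement matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
y = A @ x_true                                 # noiseless measurements

L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
lam = 0.05
W1 = A.T / L                                   # input weight (trainable in LISTA)
W2 = np.eye(n) - A.T @ A / L                   # recurrent weight (trainable in LISTA)
theta = lam / L                                # soft-threshold level (trainable in LISTA)

def soft(v, t):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
for layer in range(16):                        # 16 unrolled "layers" instead of iterating to convergence
    x = soft(W1 @ y + W2 @ x, theta)

print("support recovered:", np.nonzero(np.abs(x) > 1e-3)[0])
print("true support:     ", np.sort(np.nonzero(x_true)[0]))
```

The design point this illustrates is the survey's central one: the unrolled network inherits the solver's structure (and hence interpretability), while training the per-layer parameters buys far fewer iterations than the classical schedule.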

Language: English

Citations: 54

Resource Scheduling in Edge Computing: Architecture, Taxonomy, Open Issues and Future Research Directions
Mostafa Raeisi-Varzaneh, Omar Dakkak, Adib Habbal, et al.

IEEE Access, Journal Year: 2023, Volume and Issue: 11, P. 25329 - 25350

Published: Jan. 1, 2023

An inflection point in the computing industry is occurring with the implementation of the Internet of Things and 5G communications, which has pushed centralized cloud computing toward the edge, resulting in a paradigm shift in computing. The purpose of edge computing is to provide computing, network control, and storage that accommodate computationally intense and latency-critical applications at resource-limited endpoints. Edge computing allows devices to offload their overflowing tasks to edge servers. This procedure can fully exploit the server's computational capabilities and efficiently execute operations. However, transferring all tasks to an edge server leads to long processing delays and surprisingly high energy consumption when tasks are numerous. Aside from this, leaving powerful cloud centers unused leads to resource waste. Thus, adopting a collaborative scheduling approach based on task properties, optimization targets, and the system status of devices, edge servers, and cloud centers is critical for successful operation. This paper briefly summarizes the edge computing architecture and the task-processing workflow. Meanwhile, typical application scenarios are examined. Resource scheduling techniques are then discussed and compared across four collaboration modes. As part of our survey, we present a thorough overview of the various offloading schemes proposed by researchers. Additionally, based on the literature surveyed, we examine fairness and load-balancing indicators in resource scheduling. Finally, open issues, challenges, and future research directions are discussed.
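
The device/edge/cloud collaboration this survey organizes can be reduced, in its simplest form, to a per-task tier choice. A toy sketch with illustrative delay models and constants (local pays only compute time; edge and cloud pay transfer time, and cloud adds a WAN round trip):

```python
# Sketch of the device/edge/cloud trade-off: for each task, pick the tier with
# the lowest estimated completion delay. All models and constants are
# illustrative assumptions, not the survey's taxonomy.
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float     # CPU cycles required
    data_bits: float  # input size to transfer if offloaded

F_LOCAL, F_EDGE, F_CLOUD = 1e9, 8e9, 30e9     # CPU speeds (cycles/s)
R_EDGE, R_CLOUD = 20e6, 5e6                   # uplink rates (bit/s)
RTT_CLOUD = 0.08                              # extra WAN round trip (s)

def delays(t: Task):
    """Estimated completion delay (s) for each execution tier."""
    return {
        "local": t.cycles / F_LOCAL,
        "edge":  t.data_bits / R_EDGE + t.cycles / F_EDGE,
        "cloud": t.data_bits / R_CLOUD + RTT_CLOUD + t.cycles / F_CLOUD,
    }

# Small tasks stay local, medium ones favor the edge, and only very
# compute-heavy tasks justify the cloud's transfer and RTT cost.
for t in [Task(2e8, 1e5), Task(5e9, 2e6), Task(4e10, 5e6)]:
    d = delays(t)
    best = min(d, key=d.get)
    print(f"{t}: best tier = {best} ({d[best]*1e3:.1f} ms)")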

Language: English

Citations: 47

Real-time data visual monitoring of triboelectric nanogenerators enabled by Deep learning
H. H. Zhang, Tao Liu, Xuelian Zou, et al.

Nano Energy, Journal Year: 2024, Volume and Issue: 130, P. 110186 - 110186

Published: Aug. 27, 2024

Language: English

Citations: 12

A Comprehensive Review of AI Techniques for Resource Management in Fog Computing: Trends, Challenges, and Future Directions
Deafallah Alsadie

IEEE Access, Journal Year: 2024, Volume and Issue: 12, P. 118007 - 118059

Published: Jan. 1, 2024

Language: English

Citations: 10

Resource allocation problem and artificial intelligence: the state-of-the-art review (2009–2023) and open research challenges
Javad Hassannataj Joloudari, Sanaz Mojrian, Hamid Saadatfar, et al.

Multimedia Tools and Applications, Journal Year: 2024, Volume and Issue: 83(26), P. 67953 - 67996

Published: Jan. 29, 2024

Language: English

Citations: 8

A Survey of Edge Computing Resource Allocation and Task Scheduling Optimization

Xiaowei Xu, Han Ding, Ling Wang, et al.

Communications in Computer and Information Science, Journal Year: 2024, Volume and Issue: unknown, P. 125 - 135

Published: Jan. 1, 2024

Language: English

Citations: 7

Comprehensive survey on resource allocation for edge-computing-enabled metaverse

Tanmay Baidya, Sangman Moh

Computer Science Review, Journal Year: 2024, Volume and Issue: 54, P. 100680 - 100680

Published: Sept. 9, 2024

Language: English

Citations: 7

Leveraging Deep Learning to Strengthen the Cyber-Resilience of Renewable Energy Supply Chains: A Survey
Malka N. Halgamuge

IEEE Communications Surveys & Tutorials, Journal Year: 2024, Volume and Issue: 26(3), P. 2146 - 2175

Published: Jan. 1, 2024

Deep learning shows immense potential for strengthening the cyber-resilience of renewable energy supply chains. However, research gaps persist in comprehensive benchmarks, real-world model evaluations, and data generation tailored to the domain. This study explores applying state-of-the-art deep learning techniques to secure renewable energy supply chains, drawing insights from over 300 publications. We aim to provide an updated, rigorous analysis of deep learning applications in this field to guide future research. We systematically review literature spanning 2020-2023, retrieving relevant articles from major databases. We examine deep learning's role in intrusion/anomaly detection, supply chain cyberattack detection frameworks, security standards, historical attack analysis, risk management strategies, model architectures, and cyber datasets. Our analysis demonstrates that deep learning enables anomaly detection by processing massively distributed data. We highlight crucial design factors, including accuracy, adaptation capability, communication security, and resilience to adversarial threats. A comparison of 18 attacks informs risk analysis. We also showcase model architectures, evaluating their relative strengths and limitations for these applications. Moreover, our review emphasizes best practices for data curation, considering quality, labeling, access efficiency, and governance. Effective integration necessitates tuning guidance and domain-tailored data generation. This multi-dimensional analysis motivates focused efforts on enhancing model explanations, securing communications, continually retraining models, and establishing standardized assessment protocols. Overall, we provide a roadmap for progress in leveraging deep learning's potential.
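
The anomaly-detection pattern the survey emphasizes is reconstruction-based: train on normal telemetry, then flag samples the model reconstructs poorly. As a deliberately simplified stand-in for a deep autoencoder, the sketch below uses a PCA projection (a linear autoencoder) on synthetic data; the data, threshold rule, and dimensions are illustrative assumptions.

```python
# Sketch of reconstruction-error anomaly detection, the pattern behind the
# deep models the survey reviews. A PCA projection acts as a linear stand-in
# for an autoencoder: fit on normal telemetry, flag samples whose
# reconstruction error exceeds a percentile threshold. Data is synthetic.
import numpy as np

rng = np.random.default_rng(7)
normal = rng.normal(size=(500, 8)) @ rng.normal(size=(8, 8)) * 0.5  # correlated "normal" telemetry
attack = normal[:20] + rng.normal(scale=4.0, size=(20, 8))          # perturbed samples

mu = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mu, full_matrices=False)
P = Vt[:3]                                       # keep 3 principal components

def recon_error(X):
    """Encode to the principal subspace, decode, and measure the residual."""
    Z = (X - mu) @ P.T                           # encode
    return np.linalg.norm((X - mu) - Z @ P, axis=1)  # decode and compare

threshold = np.percentile(recon_error(normal), 99)   # 99th-percentile rule
flags = recon_error(attack) > threshold
print(f"flagged {flags.sum()} of {len(attack)} perturbed samples "
      f"(threshold {threshold:.2f})")
```

A deployed detector of the kind surveyed would replace the linear projection with a trained autoencoder and revisit the threshold as the telemetry distribution drifts, which is the continual-retraining point the abstract raises.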

Language: English

Citations: 6

Contemporary advances in multi-access edge computing: A survey of fundamentals, architecture, technologies, deployment cases, security, challenges, and directions
Mobasshir Mahbub, Raed M. Shubair

Journal of Network and Computer Applications, Journal Year: 2023, Volume and Issue: 219, P. 103726 - 103726

Published: Aug. 26, 2023

Language: English

Citations: 15