Deep Learning Algorithms for Detecting and Mitigating DDoS Attacks

Soran Hamad, Shavan Askar, Farah Sami Khoshaba et al.

Indonesian Journal of Computer Science, Journal Year: 2024, Volume and Issue: 13(2)

Published: April 20, 2024

The rising threat of Distributed Denial of Service (DDoS) attacks means that capable, adaptive detection tools are required now more than ever. This research explores the latest solutions for preventing DDoS attacks and emphasizes how Artificial Intelligence (AI) is involved in enhancing end-to-end detection techniques. Through an analysis of several key approaches, this work notes that AI-guided models can quickly identify and counteract unusual traffic patterns that may indicate an oncoming attack. Essential aspects of creating networks resilient against such attacks include machine learning algorithms and sophisticated data analytics, together with AI-based systems for pattern recognition. Importantly, behavioral analysis performs well because it can distinguish and adapt to changing attack vectors. Additionally, the paper puts into perspective how AI makes positive mitigation strategies possible, including quick interventions such as temporarily halting traffic, rerouting targeted flows, block listing, and real-time control panel operations. At the same time, current prevention techniques are critically addressed, along with the persistent challenges and limitations fundamental to them; what emerges is that they must always be ready for innovation and improvement as attacks evolve over time. The paper aligns itself with the position that AI-driven mechanisms are a natural defense against network security attacks, and it underlines the importance of integrating AI-based and conventional practices in order to enhance resilience efficiently against cyber threats that are evolving all the time.
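
The core idea this abstract describes, an AI model flagging traffic that deviates from learned normal behavior, can be illustrated with a minimal anomaly-detection sketch. This is not the paper's model: the traffic features (packet rate, mean packet size, unique source IPs), the synthetic data, and the choice of scikit-learn's IsolationForest are all illustrative assumptions.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic "normal" traffic windows: [packets/s, mean packet size (bytes), unique source IPs]
normal = rng.normal(loc=[500, 800, 50], scale=[50, 100, 5], size=(1000, 3))
# Synthetic flood windows: very high packet rate, small packets, many spoofed sources
attack = rng.normal(loc=[50000, 120, 4000], scale=[5000, 20, 400], size=(20, 3))

# Train only on normal traffic, then score a mix of normal and attack windows.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
labels = detector.predict(np.vstack([normal[:5], attack[:5]]))
print(labels)  # +1 = looks normal, -1 = flagged as anomalous (candidate DDoS)

In a deployment, flagged windows would trigger the mitigation actions the abstract lists, such as rate limiting, rerouting, or block listing.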

Language: English

Auto-Scaling Techniques in Cloud Computing: Issues and Research Directions

Saleha Alharthi, Afra Alshamsi, Anoud Alseiari et al.

Sensors, Journal Year: 2024, Volume and Issue: 24(17), P. 5551 - 5551

Published: Aug. 28, 2024

In the dynamic world of cloud computing, auto-scaling stands as a beacon of efficiency, dynamically aligning resources with fluctuating demands. This paper presents a comprehensive review of auto-scaling techniques, highlighting significant advancements and persisting challenges in the field. First, we overview the fundamental principles and mechanisms of auto-scaling, including its role in improving the cost, performance, and energy consumption of cloud services. We then discuss the various strategies employed, ranging from threshold-based rules and queuing theory to sophisticated machine learning and time series analysis approaches. After that, we explore critical issues and practices in several studies that demonstrate how these challenges can be addressed. We conclude by offering insights into promising research directions, emphasizing the development of predictive scaling and the integration of advanced techniques to achieve more effective and efficient auto-scaling solutions.
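
Of the strategies the review surveys, threshold-based rules are the simplest to make concrete. The sketch below is a generic illustration under assumed parameters (CPU thresholds of 80%/30% and replica bounds chosen arbitrarily); it is not drawn from any specific system in the paper.

from dataclasses import dataclass

@dataclass
class ThresholdAutoscaler:
    replicas: int = 2
    min_replicas: int = 1
    max_replicas: int = 10
    scale_out_cpu: float = 0.80   # add a replica when average CPU exceeds 80%
    scale_in_cpu: float = 0.30    # remove a replica when average CPU drops below 30%

    def step(self, avg_cpu: float) -> int:
        # One reconciliation tick: compare observed utilization to thresholds.
        if avg_cpu > self.scale_out_cpu and self.replicas < self.max_replicas:
            self.replicas += 1
        elif avg_cpu < self.scale_in_cpu and self.replicas > self.min_replicas:
            self.replicas -= 1
        return self.replicas

scaler = ThresholdAutoscaler()
for load in [0.5, 0.9, 0.95, 0.85, 0.2, 0.1]:
    print(f"cpu={load:.2f} -> replicas={scaler.step(load)}")

The ML and time-series approaches the paper surveys replace the fixed thresholds with demand forecasts, scaling before, rather than after, the load changes.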

Language: English

Citations: 7

Machine Learning-Based Resource Management in Fog Computing: A Systematic Literature Review

Fahim Ullah Khan, Ibrar Ali Shah, Sadaqat Jan et al.

Sensors, Journal Year: 2025, Volume and Issue: 25(3), P. 687 - 687

Published: Jan. 23, 2025

This systematic literature review analyzes machine learning (ML)-based techniques for resource management in fog computing. Utilizing the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol, this paper focuses on ML and deep learning (DL) solutions. Resource management in the fog computing domain was thoroughly analyzed by identifying its key factors and constraints. A total of 68 research papers and extended versions were finally selected and included in the study. The findings highlight a strong preference for DL in addressing challenges within the paradigm: 66% of the reviewed articles leveraged DL techniques, while 34% utilized classical ML. Key parameters such as latency, energy consumption, task scheduling, and QoS are interconnected and critical for optimization, and the analysis reveals that these are the prime concerns addressed by ML-based resource management. Latency is the most frequently investigated parameter, examined in 77% of the articles, followed by energy consumption and task scheduling at 44% and 33%, respectively. Furthermore, according to our evaluation, the selected studies address an extensive range of challenges, including computational scalability and management, data availability and quality, and model complexity and interpretability, employing 73, 53, 45, and 46 ML/DL techniques, respectively.

Language: English

Citations: 0

Forensics and security issues in the Internet of Things
Shams Forruque Ahmed, Shanjana Shuravi Shawon, Afsana Bhuyian et al.

Wireless Networks, Journal Year: 2025, Volume and Issue: unknown

Published: March 27, 2025

Language: English

Citations: 0

Advancing Fog-Edge Continuum: A Hybrid Approach Using GRL and Stable Matching for Task Offloading
Nilesh Kumar Verma, K. Jairam Naik

Procedia Computer Science, Journal Year: 2025, Volume and Issue: 258, P. 3523 - 3534

Published: Jan. 1, 2025

Language: English

Citations: 0

Fog Computing Challenges and Opportunities in IoT Networks: A Review

Z. Ahmed, Shavan Askar, Diana Hayder Hussein et al.

Procedia Computer Science, Journal Year: 2025, Volume and Issue: 259, P. 1749 - 1764

Published: Jan. 1, 2025

Language: English

Citations: 0

Ripple-Induced Whale Optimization Algorithm for Independent Tasks Scheduling on Fog Computing
Zulfiqar Ali Khan, Izzatdin Abdul Aziz

IEEE Access, Journal Year: 2024, Volume and Issue: 12, P. 65736 - 65753

Published: Jan. 1, 2024

Due to the revolution of the Internet of Things (IoT), the amount of generated data has been redoubling, leading to higher latency and network traffic. This has resulted in delayed services and increased energy consumption at cloud servers. Fog computing tackles the issues associated with the long geographical distance between end-users and cloud servers by extending service provision closer to the edge, reducing makespan and optimizing energy consumption during workload execution. Instead of offloading all tasks to the cloud, delay-sensitive tasks are executed at fog nodes, while others are offloaded to the cloud. However, the resources of the fog layer are limited, posing a challenge for task scheduling in fog computing, particularly as a multi-objective optimization problem. Meta-heuristic algorithms have the potential to find an optimal solution to such problems within a reasonable time. The Whale Optimization Algorithm (WOA) is a relatively new meta-heuristic algorithm that has received significant attention from researchers due to its impressive characteristics. However, being an exploitation-oriented technique, it falls into local optima and lacks diversity in the solutions it generates over iterations. Limited exploration capabilities also compromise the diversity of the search space and prolong convergence. Therefore, in this study, an enhanced Ripple-induced Whale Optimization Algorithm (RWOA) is proposed, utilizing ripple effects to schedule independent tasks in fog computing. It aims to minimize makespan while maximizing throughput over a fog-cloud infrastructure, improving the poor exploration of WOA through substantial changes. Extensive simulations were performed to assess the effectiveness of the proposed algorithm. RWOA outperformed TCaS, HFSGA, MGWO, and WOAmM on two datasets: Random and NASA Ames iPSC. The statistical significance of the results was validated with the Friedman test and the Wilcoxon Signed-rank test.
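
For context, the baseline WOA that RWOA builds on can be sketched as follows: continuous whale positions are decoded into task-to-node assignments and the makespan is minimized. This is plain WOA under synthetic assumptions (random task lengths and three fog node speeds), not the ripple-induced variant the paper proposes.

import numpy as np

rng = np.random.default_rng(1)
task_len = rng.uniform(10, 100, size=30)    # workload of each independent task
node_speed = np.array([5.0, 10.0, 20.0])    # processing speed of each fog node

def makespan(pos):
    # Decode a continuous position vector into a node index per task.
    assign = np.clip(pos, 0, len(node_speed) - 1e-9).astype(int)
    loads = np.zeros(len(node_speed))
    for t, n in zip(task_len, assign):
        loads[n] += t / node_speed[n]
    return loads.max()

n_whales, dim, iters = 20, len(task_len), 200
X = rng.uniform(0, len(node_speed), size=(n_whales, dim))
best = min(X, key=makespan).copy()

for it in range(iters):
    a = 2 - 2 * it / iters                  # control parameter decreasing 2 -> 0
    for i in range(n_whales):
        A = 2 * a * rng.random() - a        # scalar coefficient in [-a, a]
        C = 2 * rng.random(dim)
        if rng.random() < 0.5:
            if abs(A) < 1:                  # exploitation: encircle the best whale
                X[i] = best - A * np.abs(C * best - X[i])
            else:                           # exploration: move relative to a random whale
                rand = X[rng.integers(n_whales)]
                X[i] = rand - A * np.abs(C * rand - X[i])
        else:                               # bubble-net spiral around the best whale
            l = rng.uniform(-1, 1)
            X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
        X[i] = np.clip(X[i], 0, len(node_speed) - 1e-9)
        if makespan(X[i]) < makespan(best):
            best = X[i].copy()

print("best makespan:", round(makespan(best), 2))

The paper's critique is visible here: every update pulls whales toward the current best, so exploration is weak; RWOA's ripple mechanism is its remedy for that bias.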

Language: English

Citations: 3

EXPLORING THE IMPACT OF ARTIFICIAL INTELLIGENCE ON HUMAN-ROBOT COOPERATION IN THE CONTEXT OF INDUSTRY 4.0
Hawkar ASAAD, Shavan Askar, Ahmed KAKAMIN et al.

Applied Computer Science, Journal Year: 2024, Volume and Issue: 20(2), P. 138 - 156

Published: June 30, 2024

The function of Artificial Intelligence (AI) in Human-Robot Cooperation (HRC) within Industry 4.0 is unequivocally important and cannot be undervalued. AI uses Machine Learning (ML) and Deep Learning (DL) to enhance collaboration between humans and robots in smart manufacturing. These algorithms effectively manage and analyze data from sensors, machinery, and other associated entities. As an outcome, they can extract significant insights that are beneficial for optimizing the manufacturing process overall. Non-intelligent systems, by contrast, hinder coordination, collaboration, and communication among the various components, so efficiency, quality, and productivity all suffer as a whole. Additionally, AI makes it possible to implement sophisticated learning processes that improve human-robot effectiveness in assembly tasks by enabling interaction at a level comparable to human-human interactions. When AI is widely applied in HRC, a new dynamic environment is created in which responsibilities are divided and distributed throughout social and physical spaces. In conclusion, AI plays a crucial and indispensable role in facilitating effective and efficient cooperation within the framework of Industry 4.0. The implementation of AI-based algorithms, encompassing deep learning, machine learning, and reinforcement learning, is highly consequential: it enhances and streamlines production procedures and boosts the overall productivity and efficiency of the industry.

Language: English

Citations: 2

THE UTILIZATION OF 6G IN INDUSTRY 4.0
Hanan M. Shukur, Shavan Askar, Subhi R. M. Zeebaree et al.

Applied Computer Science, Journal Year: 2024, Volume and Issue: 20(2), P. 75 - 89

Published: June 30, 2024

The sixth-generation (6G) communication technology has potential in various applications, for instance, industrial automation, intelligent transportation, healthcare systems, and energy consumption prediction. On the other hand, concerns about privacy measures and the security of 6G-enabled networks are considered critical issues and challenges. The integration of 6G with advanced technologies, for example edge computing, Artificial Intelligence (AI), and the Internet of Things (IoT), is a common theme of this paper. Additionally, the paper discusses the challenges and advancements required for 6G to be utilized alongside these technologies, involving edge technology, big data analytics, and deep learning. In this review paper, the authors overview cutting-edge technologies like IoT, IoMT, AI, and edge computing that address human requirements and issues. In addition, they consider the value of new approaches such as big data, federated learning, and machine learning, whose multiple aspects can be merged to collectively support network growth in the 6G era. These integrations can enable real-time monitoring, intelligent transportation solutions, improved signal reconstruction, and automation. The paper also illustrates the considerations these integrations face regarding performance requirements, security, and privacy concerns. Overall, it suggests that 6G can revolutionize different sides of our society and enhance the efficiency and accuracy of future automation sectors.

Language: English

Citations: 1

Cost-effective task offloading and trajectory optimization in UAV assisted edge networks with DDPG

Jiaqing Shen, Xu Bai, Xiaoguang Tu et al.

International Journal of Web Information Systems, Journal Year: 2024, Volume and Issue: unknown

Published: Sept. 10, 2024

Purpose: Unmanned aerial vehicles (UAVs), known for their exceptional flexibility and maneuverability, have become an integral part of mobile edge computing systems in edge networks. This paper aims to minimize system costs within a communication cycle. To this end, it develops a task offloading model for UAV-assisted edge networks under dynamic channel conditions. The study seeks to execute tasks efficiently while satisfying UAV energy constraints, and it validates the effectiveness of the proposed method through performance comparisons with other similar algorithms. Design/methodology/approach: To address this issue, the paper proposes a task offloading and trajectory optimization algorithm using the deep deterministic policy gradient, which jointly optimizes Internet of Things (IoT) device scheduling, power distribution, and UAV flight trajectory to reduce system costs. Findings: The analysis of simulation results indicates that the proposed algorithm achieves lower cost redundancy compared with others, along with cost reductions of 22.8% with respect to task size, 34.5% with respect to time, 11.8% with respect to the number of IoT devices, 25.35% with respect to the cycles required per communication cycle, and 33.6% with respect to per-bit tasks. Originality/value: A multi-objective optimization problem is established under dynamic channel conditions, and the proposed approach is validated.
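
To make the learner concrete, here is a minimal sketch of one deep deterministic policy gradient (DDPG) update step (critic regression, actor gradient, soft target update) in PyTorch. The state/action dimensions, network sizes, and the synthetic batch (with reward defined as negative system cost) are assumptions for illustration, not the paper's exact formulation.

import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
state_dim, action_dim, gamma, tau = 6, 3, 0.99, 0.005

def mlp(inp, out):
    return nn.Sequential(nn.Linear(inp, 64), nn.ReLU(), nn.Linear(64, out))

actor, critic = mlp(state_dim, action_dim), mlp(state_dim + action_dim, 1)
actor_t, critic_t = copy.deepcopy(actor), copy.deepcopy(critic)
opt_a = torch.optim.Adam(actor.parameters(), lr=1e-4)
opt_c = torch.optim.Adam(critic.parameters(), lr=1e-3)

# Synthetic batch of transitions (s, a, r, s'); reward is the negative system cost.
s = torch.randn(64, state_dim)
a = torch.tanh(torch.randn(64, action_dim))   # e.g., normalized power levels / heading
r = -torch.rand(64, 1)                        # negative cost per communication cycle
s2 = torch.randn(64, state_dim)

# Critic update: regress Q(s, a) toward r + gamma * Q_target(s', actor_target(s')).
with torch.no_grad():
    target = r + gamma * critic_t(torch.cat([s2, torch.tanh(actor_t(s2))], dim=1))
critic_loss = nn.functional.mse_loss(critic(torch.cat([s, a], dim=1)), target)
opt_c.zero_grad(); critic_loss.backward(); opt_c.step()

# Actor update: follow the deterministic policy gradient through the critic.
actor_loss = -critic(torch.cat([s, torch.tanh(actor(s))], dim=1)).mean()
opt_a.zero_grad(); actor_loss.backward(); opt_a.step()

# Soft (Polyak) update of the target networks.
for tp, p in zip(list(actor_t.parameters()) + list(critic_t.parameters()),
                 list(actor.parameters()) + list(critic.parameters())):
    tp.data.mul_(1 - tau).add_(tau * p.data)

print(f"critic loss {critic_loss.item():.4f}, actor loss {actor_loss.item():.4f}")

In the paper's setting the continuous action vector would encode device scheduling, transmit power, and UAV heading, which is why a deterministic continuous-control method like DDPG fits the problem.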

Language: English

Citations: 1

Multiobjective Offloading Optimization in Fog Computing Using Deep Reinforcement Learning
Hojjat Mashal, Mohammad Hossein Rezvani

Journal of Computer Networks and Communications, Journal Year: 2024, Volume and Issue: 2024(1)

Published: Jan. 1, 2024

Edge computing allows IoT tasks to be processed by devices with passive processing capacity at and near the network's edge instead of being sent to cloud servers. However, 5G-enabled architectures such as the Fog Radio Access Network (F-RAN) use smart devices to bring the delay down to even a few milliseconds. This is important, especially in latency-sensitive applications such as online digital games. A trade-off must be made between delay and energy consumption: if too many tasks are executed locally on devices or fog servers, energy consumption increases because mobile devices such as smartphones and tablets have limited battery charges. This paper proposes a Deep Reinforcement Learning (DRL) method for offloading optimization. In designing the states, we consider all three critical components: memory consumption, the number of CPU cycles, and the network mode. This makes the modeling aware of the workload of tasks. As a result, the model matches the requirements of the real world. For each device that submits a task to the system, a reward is computed; it includes the total cost. The output of our DRL method specifies to which edge/fog/cloud node each task should be offloaded. The results show that our technique produces less resource waste than RL when the workload is very high. In addition, it consumes 30% fewer resources than the FIFO method. It also provides better local execution than other methods.
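
A minimal sketch of the state and reward design this abstract outlines: each state carries memory consumption, CPU cycles, and network mode, and the reward is the negative total cost of delay and energy. The tier parameters, memory capacities, and weights below are illustrative assumptions, not the paper's values.

from dataclasses import dataclass

@dataclass
class TaskState:
    memory_mb: float     # memory the task consumes
    cpu_mcycles: float   # CPU demand in mega-cycles
    network_mode: int    # 0 = local device, 1 = fog node, 2 = cloud server

# Per-tier parameters: (speed in mega-cycles/s, power in watts, network delay in s, memory cap in MB)
TIERS = {0: (1e3, 2.0, 0.00, 256), 1: (5e3, 1.0, 0.02, 2048), 2: (2e4, 0.5, 0.15, 65536)}

def reward(state: TaskState, w_delay: float = 0.5, w_energy: float = 0.5) -> float:
    speed, power, net_delay, mem_cap = TIERS[state.network_mode]
    if state.memory_mb > mem_cap:       # infeasible placement gets a large penalty
        return -1e6
    exec_time = state.cpu_mcycles / speed
    delay = exec_time + net_delay
    energy = power * exec_time          # energy spent computing on this tier
    return -(w_delay * delay + w_energy * energy)   # higher reward = lower total cost

task = TaskState(memory_mb=128, cpu_mcycles=4e3, network_mode=1)
print(max(range(3), key=lambda m: reward(TaskState(128, 4e3, m))))  # best mode for this task

The DRL agent learns exactly this kind of mapping from task states to edge/fog/cloud placements, with the weights expressing the delay-versus-energy trade-off the abstract describes.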

Language: English

Citations: 1