Harnessing Artificial Intelligence and Machine Learning to Transform Cloud Computing with Enhanced Efficiency and Personalization
Veeramalai Sankaradass, Ramsriprasaath Devasenan

Research Square, Journal Year: 2024, Volume and Issue: unknown

Published: Dec. 20, 2024

Abstract This work evaluates how ML and generative AI (GAI) can be integrated into the cloud computing model to optimize resource use, minimize energy consumption, and provide value-added services. Like other large-scale distributed systems, cloud platforms face several concerns, including dynamic resource management, security issues, and user-interface issues. To address these gaps, this work proposes a single D-PAL framework that uses predictive applications for synthetic data generation. Within this framework, workload prediction and scheduling estimation are performed through ML and Reinforcement Learning, with data augmentation via GANs. Experimental assessment shows measurable improvements in performance, energy consumption, and customised service delivery. The paper advances theoretical and empirical understanding of how AI can be deployed with new methods to improve and maintain usability. Future work will focus on scaling the proposed models and integrating further techniques to increase control.

Language: English

Advancements in heuristic task scheduling for IoT applications in fog-cloud computing: challenges and prospects
Deafallah Alsadie

PeerJ Computer Science, Journal Year: 2024, Volume and Issue: 10, P. e2128 - e2128

Published: June 17, 2024

Fog computing has emerged as a prospective paradigm to address the computational requirements of IoT applications, extending cloud capabilities to the network edge. Task scheduling is pivotal in enhancing energy efficiency, optimizing resource utilization, and ensuring timely execution of tasks within fog environments. This article presents a comprehensive review of advancements in task-scheduling methodologies for fog systems, covering priority-based, greedy-heuristic, metaheuristic, learning-based, hybrid, and nature-inspired heuristic approaches. Through a systematic analysis of the relevant literature, we highlight the strengths and limitations of each approach and identify key challenges facing task scheduling, including dynamic environments, heterogeneity, scalability, resource constraints, security concerns, and algorithm transparency. Furthermore, we propose future research directions to address these challenges, including the integration of machine learning techniques for real-time adaptation, leveraging federated and collaborative learning, developing resource-aware and energy-efficient algorithms, incorporating security-aware techniques, and advancing explainable AI methodologies. By addressing these challenges and pursuing these directions, we aim to facilitate the development of more robust, adaptable, and efficient task-scheduling solutions, ultimately fostering trust, security, and sustainability in fog systems and facilitating their widespread adoption across diverse application domains.

Language: English

Citations

8

Leveraging Community-based Approaches for Enhancing Resource Allocation in Fog Computing Environment

Alasef M. Ghalwah, Ghaidaa A. Al-Sultany

Engineering Technology & Applied Science Research, Journal Year: 2025, Volume and Issue: 15(1), P. 20372 - 20378

Published: Feb. 2, 2025

Efficient resource allocation in fog computing environments is essential to address the increasing demand for high-performance and adaptable network services. Traditional methods lack granular differentiation based on traffic characteristics, often resulting in suboptimal bandwidth utilization and elevated latency. To enhance efficiency, this study applies a community-based approach leveraging the Louvain algorithm to dynamically cluster nodes with similar demands. By forming communities around latency needs, the approach enables targeted bandwidth distribution, aligning each community with optimized pathways that meet its specific requirements. The results indicate notable performance gains, including a 14% improvement in download performance and an average 23% reduction in latency for time-sensitive applications. These improvements highlight the effectiveness of the proposed method in managing diverse demands, improving data-flow stability, and enhancing the overall infrastructure. The findings underscore its potential to support scalable, adaptable, and secure management, positioning it as a viable solution to meet the complex needs of IoT and other distributed systems.
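
The community-formation step described above can be illustrated with a minimal, standard-library sketch. Note that this is a simplified stand-in for the Louvain algorithm: instead of modularity optimization, it merely links nodes whose latency demands fall within a threshold and takes connected components as communities. The node names and demand values are invented for illustration.

```python
# Simplified stand-in for Louvain-style community formation:
# fog nodes whose latency demands differ by less than a threshold
# are linked, and connected components become "communities".
# (The paper uses the full Louvain modularity algorithm; this
# sketch only illustrates the grouping idea with invented demands.)

def cluster_by_latency(demands, threshold):
    """Group node ids whose latency demands are transitively similar."""
    nodes = sorted(demands)
    parent = {n: n for n in nodes}

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]  # path compression
            n = parent[n]
        return n

    def union(a, b):
        parent[find(a)] = find(b)

    # Link every pair of nodes with similar latency requirements.
    for i, a in enumerate(nodes):
        for b in nodes[i + 1:]:
            if abs(demands[a] - demands[b]) < threshold:
                union(a, b)

    communities = {}
    for n in nodes:
        communities.setdefault(find(n), set()).add(n)
    return list(communities.values())

# Hypothetical per-node latency demands (ms).
demands = {"n1": 5, "n2": 7, "n3": 40, "n4": 42, "n5": 90}
print(cluster_by_latency(demands, threshold=10))
```

Once communities are formed, each group can be assigned a bandwidth pathway matched to its shared latency need, which is the targeted-distribution idea the abstract describes.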

Language: English

Citations

0

Dynamic Surgical Prioritization: A Machine Learning and XAI-Based Strategy
Fabián Silva-Aravena, Jenny Morales, Manoj Jayabalan et al.

Technologies, Journal Year: 2025, Volume and Issue: 13(2), P. 72 - 72

Published: Feb. 8, 2025

Surgical waiting lists present significant challenges to healthcare systems, particularly in resource-constrained settings where equitable prioritization and efficient resource allocation are critical. We aim to address these issues by developing a novel, dynamic, and interpretable framework for prioritizing surgical patients. Our methodology integrates machine learning (ML), stochastic simulations, and explainable AI (XAI) to capture the temporal evolution of dynamic priority scores, qp(t), while ensuring transparency in decision making. Specifically, we employ the Light Gradient Boosting Machine (LightGBM) for predictive modeling, stochastic simulations to account for variables and competitive interactions, and SHapley Additive Explanations (SHAPs) to interpret model outputs at both global and patient-specific levels. The hybrid approach demonstrates strong performance on a dataset of 205 patients from an otorhinolaryngology (ENT) unit of a high-complexity hospital in Chile. The LightGBM achieved a mean squared error (MSE) of 0.00018 and a coefficient of determination (R2) value of 0.96282, underscoring its high accuracy in estimating qp(t). Stochastic simulations effectively captured temporal changes, illustrating that Patient 1's qp(t) increased from 0.50 (at t=0) to 1.026 (at t=10) due to growth in variables such as severity and urgency. SHAP analyses identified severity (Sever) as the most influential variable, contributing substantially, while non-clinical factors, such as the capacity to participate in family activities (Lfam), exerted a moderating influence. Additionally, our framework achieves a reduction in waiting times of up to 26%, demonstrating its effectiveness in optimizing prioritization. Finally, our strategy combines adaptability with interpretability, providing transparent prioritization that aligns with evolving patient needs and resource constraints.
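
The evaluation metrics reported above (MSE and R²) can be sketched with standard-library Python. A one-feature least-squares line stands in for the paper's LightGBM model, and the qp(t)-style data points below are invented toy values, not the study's data.

```python
# How the reported evaluation metrics (MSE and R^2) are computed.
# A one-feature least-squares fit stands in for the paper's
# LightGBM model; the data points below are invented toy values.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def mse(y_true, y_pred):
    """Mean squared error: average of squared residuals."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    my = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - my) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

# Toy priority scores qp(t): nearly linear growth over time t.
xs = [0, 1, 2, 3, 4, 5]
ys = [0.50, 0.61, 0.69, 0.82, 0.90, 1.02]
a, b = fit_line(xs, ys)
preds = [a * x + b for x in xs]
print(f"MSE = {mse(ys, preds):.5f}, R2 = {r2(ys, preds):.5f}")
```

An MSE near zero together with an R² near 1 is what the abstract's figures (0.00018 and 0.96282) indicate: the model's predicted qp(t) tracks the observed scores closely.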

Language: English

Citations

0

Reinforcement learning-based solution for resource management in fog computing: A comprehensive survey
Reyhane Ghafari, N. Mansouri

Expert Systems with Applications, Journal Year: 2025, Volume and Issue: unknown, P. 127214 - 127214

Published: March 1, 2025

Language: English

Citations

0

Reward Shaping in DRL: A Novel Framework for Adaptive Resource Management in Dynamic Environments
Mario Chahoud, Hani Sami, Rabeb Mizouni et al.

Information Sciences, Journal Year: 2025, Volume and Issue: unknown, P. 122238 - 122238

Published: April 1, 2025

Language: English

Citations

0

Evaluation of Optimization Algorithm for Application Placement Problem in Fog Computing: A Systematic Review
Ankur Goswami, Kirit Modi, Chirag M. Patel et al.

Archives of Computational Methods in Engineering, Journal Year: 2025, Volume and Issue: unknown

Published: Feb. 20, 2025

Language: English

Citations

0

Resource allocation algorithm for 5G and B5G D2D underlay wireless cellular networks
Malle Gopal, T. Velmurugan

Multimedia Tools and Applications, Journal Year: 2024, Volume and Issue: 83(25), P. 66841 - 66868

Published: Jan. 25, 2024

Language: English

Citations

2

Fuzzy Reinforcement Learning Algorithm for Efficient Task Scheduling in Fog-Cloud IoT-Based Systems
Reyhane Ghafari, N. Mansouri

Journal of Grid Computing, Journal Year: 2024, Volume and Issue: 22(4)

Published: Sept. 23, 2024

Language: English

Citations

2

Optimizing Femtocell Networks: A Novel Game Theory Based Power Management Model for Enhanced SINR and Energy Efficiency
Gregorius Airlangga, Denny Jean Cross Sihombing, Julius Bata et al.

IEEE Access, Journal Year: 2024, Volume and Issue: 12, P. 74444 - 74455

Published: Jan. 1, 2024

This research presents a novel game-theory-based model for femtocell power management, engineered to significantly enhance the Signal-to-Interference-plus-Noise Ratio (SINR) while optimizing energy consumption across wireless communication networks. Femtocells, as a solution to the increasing demand for high-quality indoor network coverage, face challenges in power management and interference mitigation. Our proposed model addresses these issues, providing a sophisticated algorithmic approach that ensures high SINR levels without a proportional increase in energy usage. Through a series of simulations, the model's performance was evaluated against existing techniques. The results, delineated in several tables, revealed that the model consistently achieved and often surpassed targeted SINR levels with modest power increments, even at high targets. Notably, at a SINR target of 20, it sustained 23.62 while maintaining a reasonable power profile. Additionally, the model exhibited exceptional operational efficiency, characterized by low execution times and rapid convergence rates under varying conditions. This responsiveness is essential for adapting to user mobility and traffic patterns, particularly in dense urban settings during peak usage periods.
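
The power-versus-SINR trade-off the abstract describes can be illustrated with the classical Foschini-Miljanic distributed power-control iteration, in which each femtocell rescales its transmit power by the ratio of target to measured SINR. This is a textbook baseline, not the paper's game-theoretic model; the channel gains, noise level, and target value below are invented for the sketch.

```python
# Classical distributed power-control iteration (Foschini-Miljanic
# style): each femtocell scales its transmit power by
# target_SINR / current_SINR until all links meet the target.
# This is NOT the paper's game-theoretic model -- just a standard
# baseline; all gains and noise values here are invented.

def sinr(i, powers, gain, noise):
    """SINR of link i: own received signal over interference plus noise."""
    signal = gain[i][i] * powers[i]
    interference = sum(gain[i][j] * powers[j]
                       for j in range(len(powers)) if j != i)
    return signal / (interference + noise)

def power_control(gain, noise, target, iters=100):
    """Iterate the per-link power update from unit initial powers."""
    powers = [1.0] * len(gain)
    for _ in range(iters):
        # Synchronous update: all links rescale using the old powers.
        powers = [target / sinr(i, powers, gain, noise) * powers[i]
                  for i in range(len(powers))]
    return powers

# Two femtocells: strong direct gains, weak cross-interference.
gain = [[1.0, 0.05],
        [0.04, 1.0]]
noise, target = 0.01, 20.0   # linear SINR target (about 13 dB)
powers = power_control(gain, noise, target)
print([round(sinr(i, powers, gain, noise), 3) for i in range(2)])
```

The iteration converges whenever the target is feasible for the given gain matrix, and it reaches the target with the minimum total power, which is the kind of "high SINR without proportional energy increase" behaviour the abstract's game-theoretic model is benchmarked against.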

Language: English

Citations

1

Machine Learning Based Intelligent Management System for Energy Storage Using Computing Application
Bhawani Sankar Panigrahi, R. Kishore Kanna, Pragyan Paramita Das et al.

EAI Endorsed Transactions on Energy Web, Journal Year: 2024, Volume and Issue: 11

Published: June 5, 2024

INTRODUCTION: Cloud computing, a still-emerging technology, allows customers to pay for services based on usage. It provides internet-based services, whilst virtualization optimizes a PC's available resources. OBJECTIVES: The foundation of cloud computing is the data center, comprising networked computers, cables, electricity components, and various other elements that host and store corporate data. In data centres, high performance has always been a critical concern, but this often comes at the cost of increased energy consumption. METHODS: The most problematic factor is reducing power consumption while maintaining service quality, balancing system efficiency and energy use. Our proposed approach requires a comprehensive understanding of usage patterns within the environment. RESULTS: We examined usage trends to demonstrate that, with the application of the right optimization principles and models, significant energy savings can be made in data centers. During the prediction phase, the optimization model, with its 97% accuracy rate, enables more accurate future forecasts. CONCLUSION: Energy is a major concern. To handle incoming requests with the fewest resources possible, given increasing demand and widespread adoption, it is essential to maintain effective and efficient data center strategies.

Language: English

Citations

1