A survey on various security protocols of edge computing
Tathagata Bhattacharya, Adithya Vardhan Peddi, Srikanth Ponaganti

et al.

The Journal of Supercomputing, Journal year: 2024, Issue: 81(1)

Published: Dec. 18, 2024

Language: English

A Survey on the Convergence of Edge Computing and AI for UAVs: Opportunities and Challenges
Patrick McEnroe, Shen Wang, Madhusanka Liyanage

et al.

IEEE Internet of Things Journal, Journal year: 2022, Issue: 9(17), pp. 15435-15459

Published: May 19, 2022

The latest 5G mobile networks have enabled many exciting Internet of Things (IoT) applications that employ unmanned aerial vehicles (UAVs/drones). The success of most UAV-based IoT applications is heavily dependent on artificial intelligence (AI) technologies, for instance, computer vision and path planning. These AI methods must process data and provide decisions while ensuring low latency and low energy consumption. However, the existing cloud-based paradigm finds it difficult to meet these strict UAV requirements. Edge AI, which runs on-device or on edge servers close to users, is well suited to improving such services. This article provides a comprehensive analysis of the impact of edge AI on key technical aspects of UAVs (i.e., autonomous navigation, formation control, power management, security and privacy, computer vision, and communication) and on representative UAV applications (e.g., delivery systems, civil infrastructure inspection, precision agriculture, search and rescue (SAR) operations, acting as wireless base stations (BSs), and drone light shows). As guidance for researchers and practitioners, this article also explores implementation challenges, lessons learned, and future research directions.

Language: English

Cited by

253

Machine-Learning-Assisted Security and Privacy Provisioning for Edge Computing: A Survey
Shivani Singh, A. Razia Sulthana, Tanvi Shewale

et al.

IEEE Internet of Things Journal, Journal year: 2021, Issue: 9(1), pp. 236-260

Published: Aug. 10, 2021

Edge computing (EC) is a technological game changer that has the ability to connect millions of sensors and provide services at the device end. The broad vision of EC integrates storage, processing, monitoring, and control operations in the network. Though EC provides end-to-end connectivity, speeds up operations, and reduces the latency of data transfer, security remains a major concern. The tremendous growth in the number of edge devices and the amount of sensitive information generated and sent to the cloud creates a large attack surface; therefore, the need to secure both static and mobile edge devices is imperative. This article is a comprehensive survey that first describes the security and privacy issues at the various layers of the EC architecture that result from networking heterogeneous devices. Second, it discusses the wide range of machine learning and deep learning algorithms that are applied to EC use cases. Following this, the article broadly details the different types of attacks the edge network confronts and the intrusion detection systems that correspondingly overcome these concerns. The techniques for securing EC are tabulated. Finally, open issues in securing edge networks and future research directions are provided.
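To make the survey's scope concrete, here is a minimal, hedged sketch of the kind of machine-learning-assisted intrusion detector such surveys cover: a Random Forest classifier over network-flow features collected at an edge node. The feature set, labelling rule, and data below are synthetic placeholders, not taken from the surveyed work.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-in for per-flow features gathered at an edge node
# (e.g., packet rate, mean packet size, duration -- names are assumptions).
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 6))
# Placeholder labelling rule standing in for a real labelled edge-traffic dataset.
y = (X[:, 0] + 0.5 * X[:, 3] > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```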

Language: English

Cited by

83

COVID-19 Risk Prediction for Diabetic Patients Using Fuzzy Inference System and Machine Learning Approaches
Alok Aggarwal, Madam Chakradar, Manpreet Singh Bhatia

et al.

Journal of Healthcare Engineering, Journal year: 2022, Issue: 2022, pp. 1-10

Published: April 1, 2022

Individuals with pre-existing diabetes seem to be vulnerable to COVID-19 due to changes in blood sugar levels and diabetes-related complications. As observed globally, around 20-50% of individuals affected by coronavirus had diabetes. However, there is no recent finding that diabetic patients are more prone to contracting COVID-19 than nondiabetic patients, although a few findings have shown that diabetic patients could be at least twice as likely to die from COVID-19 complications. Considering the multifold mortality rate of diabetic patients, this study proposes a risk prediction model for diabetic patients using a fuzzy inference system and machine learning approaches. The study is aimed at estimating the risk level without a medical practitioner's advice so that timely action can be taken to overcome the elevated mortality rate. The proposed model takes eight input parameters, which were found to be the most influential symptoms for diabetic patients with COVID-19. With the help of various state-of-the-art techniques, fifteen models were built over the fuzzy rule base. The CatBoost classifier gives the best accuracy, recall, precision, F1 score, and kappa score. After hyper-parameter optimization, it showed 76% accuracy, followed by logistic regression and XGBoost with 75.1% and 74.7% accuracy, respectively. Stratified k-fold cross-validation was used for validation purposes.
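A minimal sketch of the evaluation pipeline the abstract describes: a CatBoost classifier scored with stratified k-fold cross-validation. The eight input features and labels below are synthetic placeholders; the paper's actual symptom parameters, rule base, and hyper-parameters are not reproduced here.

```python
import numpy as np
from catboost import CatBoostClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the eight symptom/parameter inputs (real data not shown).
rng = np.random.default_rng(7)
X = rng.normal(size=(600, 8))
y = (X[:, :3].sum(axis=1) + 0.3 * rng.normal(size=600) > 0).astype(int)

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = []
for train_idx, test_idx in skf.split(X, y):
    model = CatBoostClassifier(iterations=200, depth=4, verbose=0, random_seed=0)
    model.fit(X[train_idx], y[train_idx])
    scores.append(accuracy_score(y[test_idx], model.predict(X[test_idx])))
print(f"Mean stratified 5-fold accuracy: {np.mean(scores):.3f}")
```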

Language: English

Cited by

36

A Hybrid Cryptographic Mechanism for Secure Data Transmission in Edge AI Networks

Abdulmohsen Almalawi, Shabbir Hassan, Adil Fahad

et al.

International Journal of Computational Intelligence Systems, Journal year: 2024, Issue: 17(1)

Published: Feb. 6, 2024

As Edge AI systems become more prevalent, ensuring data privacy and security in these decentralized networks is essential. In this work, a novel hybrid cryptographic mechanism is presented by combining Ant Lion Optimization (ALO) with Diffie–Hellman-based Twofish cryptography (DHT) for secure data transmission. The developed work collects data from the created edge system and processes it using an Autoencoder. The Autoencoder learns data patterns and identifies malicious data entry. The Diffie–Hellman (DH) key exchange generates a shared secret key for encryption, while ALO optimizes the key and improves performance. Further, the Twofish algorithm performs encryption with the generated key, preventing security threats during transmission. The implementation results of the study show that the proposed mechanism achieved a higher accuracy of 99.45%, a lower time consumption of 2 s, a minimum delay of 0.8 s, and a reduced energy consumption of 3.2 mJ.
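The core of the scheme, a DH key exchange producing a shared secret that keys a symmetric cipher, can be sketched with the Python cryptography package. This is only an illustrative approximation: X25519 stands in for the paper's DH variant, AES-GCM stands in for Twofish (which the package does not provide), and the ALO key optimization and Autoencoder filtering steps are omitted.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral key pair and exchanges public keys.
edge_priv, server_priv = X25519PrivateKey.generate(), X25519PrivateKey.generate()
shared_edge = edge_priv.exchange(server_priv.public_key())
shared_server = server_priv.exchange(edge_priv.public_key())
assert shared_edge == shared_server  # both sides now hold the same secret

# Derive a 256-bit symmetric key from the DH secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"edge-ai-session").derive(shared_edge)

# Encrypt an edge-device payload with the derived key (AES-GCM as a stand-in).
aead = AESGCM(key)
nonce = os.urandom(12)
ciphertext = aead.encrypt(nonce, b"sensor reading: 23.5 C", None)
assert aead.decrypt(nonce, ciphertext, None) == b"sensor reading: 23.5 C"
```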

Language: English

Cited by

6

A Survey on Edge Computing (EC) Security Challenges: Classification, Threats, and Mitigation Strategies
Abdul Manan Sheikh, Md. Rafiqul Islam, Mohamed Hadi Habaebi

et al.

Future Internet, Journal year: 2025, Issue: 17(4), pp. 175-175

Published: April 16, 2025

Edge computing (EC) is a distributed approach to processing data at the network edge, either by the device itself or by a local server, instead of in centralized data centers or the cloud. EC's proximity to the data source can provide faster insights, better response times, and more efficient bandwidth utilization. However, its architecture makes it vulnerable to security breaches and diverse attack vectors. The edge paradigm has limited availability of resources like memory and battery power. Also, the heterogeneous nature of the hardware and communication protocols and the difficulty of timely updating security patches remain challenges. A significant number of researchers have presented countermeasures for the detection and mitigation of security threats in an EC paradigm, yet a security approach that differs from the traditional privacy-preserving mechanisms already used in the cloud is required. Artificial Intelligence (AI) greatly improves EC security through advanced threat detection, automated responses, and optimized resource management. When combined with Physical Unclonable Functions (PUFs), AI further strengthens security by leveraging PUFs' unique and unclonable attributes alongside AI's adaptive and efficient management features. This paper investigates various EC security strategies and cutting-edge solutions. It presents a comparison between existing strategies, highlighting their benefits and limitations. Additionally, it offers a detailed discussion of EC threats, including their characteristics and a classification of the different types. It also provides an overview of the privacy needs of EC, detailing the technological methods employed to address these threats. Its goal is to assist future researchers in pinpointing potential research opportunities.

Language: English

Cited by

0

Security Issues and Challenges in Edge Computing Architecture for the Drone Industry
Imdad Ali Shah

Advances in Computational Intelligence and Robotics book series, Journal year: 2025, Issue: unknown, pp. 257-270

Published: March 28, 2025

The way unmanned aerial vehicles (UAVs) function is being completely transformed by the incorporation of cutting-edge computing into the drone industry, which offers notable improvements in information processing, latency reduction, and autonomous capability. This chapter explores the functional concepts and architectural frameworks essential for implementing edge computing in drones. We outline techniques for maximizing computational resources, boosting real-time decision-making, and increasing overall system efficiency by examining the relationship between drones and edge nodes. The implementation of edge infrastructure, edge-enabled applications, and the difficulties with security, reliability, and network connectivity are important subjects.

Language: English

Cited by

0

Deep Learning based Wireless Channel Prediction: 5G Scenario
Rajat Varshney, Chirag Gangal, Mohd Sharique

et al.

Procedia Computer Science, Journal year: 2023, Issue: 218, pp. 2626-2635

Published: Jan. 1, 2023

In the area of wireless communication, channel estimation is a challenging problem due to the need for real-time implementation as well as the system's dependence on estimation accuracy. This work presents a Long Short-Term Memory (LSTM)-based deep learning (DL) approach for the prediction of the channel response in real-world non-stationary channels. The model uses a pre-defined history of channel impulse response (CIR) data along with two other features, viz. the transmitter-receiver update distance and the root-mean-square delay spread, whose values also change with time along with the channel response. The objective is to obtain an approximate estimate of the CIRs from this history through DL and to compare it with conventional methods. For training the model, a sample dataset is generated using the open-source simulation software NYUSIM, which realizes channel samples from measurement-based models with various multipath parameters. From the test results, it is found that the proposed model provides a viable lightweight solution for wireless communication.
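A minimal sketch of such a predictor, assuming the CIR history is flattened into fixed-length magnitude vectors and the two side features (T-R update distance, RMS delay spread) are appended to each time step; the window length, tap count, and random data below are placeholders rather than the paper's NYUSIM setup.

```python
import numpy as np
from tensorflow.keras import layers, models

HISTORY = 8                # past CIR snapshots per input window (assumed)
CIR_TAPS = 32              # taps per CIR magnitude vector (assumed)
N_FEATURES = CIR_TAPS + 2  # CIR taps + T-R update distance + RMS delay spread

model = models.Sequential([
    layers.Input(shape=(HISTORY, N_FEATURES)),
    layers.LSTM(64),
    layers.Dense(CIR_TAPS),  # predicted next CIR snapshot
])
model.compile(optimizer="adam", loss="mse")

# Random stand-in data; the paper trains on NYUSIM-generated channel samples.
X = np.random.rand(256, HISTORY, N_FEATURES).astype("float32")
y = np.random.rand(256, CIR_TAPS).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0).shape)  # (1, CIR_TAPS)
```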

Language: English

Cited by

6

Scalable Mobile Computing: From Cloud Computing to Mobile Edge Computing

Baghiani Radouane, Lyamine Guezouli, Ahmed Korichi

et al.

Published: March 30, 2022

In recent years, cloud computing has emerged as a major change in the world of technology, whereby workloads are transferred from local data centers located within companies and organizations to large, centralized data centers. On the horizon, a new wave forces a transition to edge computing, which places computation at the borders of networks so that it is close to the source of the processed data. This can improve the performance and reliability of applications and services and reduce the cost of their operation by optimizing the distance data must travel. In a mobile environment, we have to ask which technology is best suited to handle such an environment, whether it be cloud or edge computing. This survey goes over several papers of varying scope to understand researchers' focus. We also consider how mobility, far from being a simple phenomenon, helps drive the emergence of new technologies.

Language: English

Cited by

10

Online data poisoning attack against edge AI paradigm for IoT-enabled smart city
Yanxu Zhu, Hong Wen, Jinsong Wu

et al.

Mathematical Biosciences & Engineering, Journal year: 2023, Issue: 20(10), pp. 17726-17746

Published: Jan. 1, 2023

The deep integration of edge computing and Artificial Intelligence (AI) in IoT (Internet of Things)-enabled smart cities has given rise to new edge AI paradigms that are more vulnerable to attacks such as data and model poisoning and evasion attacks. This work proposes an online data poisoning attack framework based on the edge AI environment of IoT-enabled smart cities, which takes into account the limited storage space and uses a rehearsal-based buffer mechanism to manipulate the model by incrementally polluting the sample data stream that arrives at an appropriately sized cache. A maximum-gradient-based sample selection strategy is presented, which converts the operation of traversing historical gradients into an iterative computation method to overcome the problem of the periodic overwriting of the sample cache after training. Additionally, a maximum-loss-based sample pollution strategy is proposed to solve the problem of each sample being updated only once in basic attacks, transforming the bi-level optimization of the attack from offline mode to online mode. Finally, gray-box attack algorithms are implemented and evaluated on edge devices using a simulated smart-city environment with open-grid datasets. The results show that the proposed attack outperforms existing baseline methods in both effectiveness and overhead.
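The maximum-gradient-based cache selection can be illustrated with a small, self-contained sketch: a fixed-size rehearsal buffer that keeps the streaming samples whose per-sample loss gradient (here, for a logistic-regression surrogate) has the largest norm. The surrogate model, dimensions, and data stream are assumptions for illustration, not the paper's exact algorithm or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, CACHE = 10, 32
w = rng.normal(size=DIM)  # surrogate model weights (held fixed here for simplicity)

def grad_norm(x, y, w):
    """L2 norm of the per-sample logistic-loss gradient w.r.t. w."""
    p = 1.0 / (1.0 + np.exp(-x @ w))
    return float(np.linalg.norm((p - y) * x))

buffer = []  # entries are (gradient norm, features, label)
for _ in range(500):                        # simulated online sample stream
    x, y = rng.normal(size=DIM), float(rng.integers(0, 2))
    g = grad_norm(x, y, w)
    if len(buffer) < CACHE:
        buffer.append((g, x, y))
    else:
        weakest = min(range(CACHE), key=lambda i: buffer[i][0])
        if g > buffer[weakest][0]:          # evict the lowest-gradient sample
            buffer[weakest] = (g, x, y)

print("cache size:", len(buffer),
      "| min kept gradient norm:", round(min(b[0] for b in buffer), 3))
```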

Language: English

Cited by

5

Predicting short-term mobile Internet traffic from Internet activity using recurrent neural networks
Guto Leoni Santos, Pierangelo Rosati, Theo Lynn

et al.

International Journal of Network Management, Journal year: 2021, Issue: 32(3)

Published: Nov. 2, 2021

Mobile network traffic prediction is an important input into capacity planning and optimization. Existing approaches may lack the speed and computational complexity to account for bursting, non-linear patterns, or other correlations in time-series mobile network data. We compare the performance of two deep learning (DL) architectures, long short-term memory (LSTM) and gated recurrent unit (GRU), and two conventional machine learning (ML) architectures, Random Forest and Decision Tree, for predicting mobile Internet traffic using 2 months of Telecom Italia data for the city of Milan. K-Means clustering was used a priori to group cells based on Internet activity, and the Grid Search method was used to identify the best configuration for each model. The predictive quality of the models was evaluated using root mean squared error and mean absolute error. Both DL algorithms were effective in modeling Internet activity and seasonality, both within days and across the 2 months. We find variations in performance across cell clusters within the city. Overall, the DL models outperformed the ML models, and the LSTM outperformed the GRU in our experiments.
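A compressed sketch of the LSTM-vs-GRU comparison on a univariate traffic series, scored with RMSE and MAE; the synthetic sine-plus-noise series, window length, and layer sizes are placeholders for the Telecom Italia data and the Grid Search-selected configurations.

```python
import numpy as np
from tensorflow.keras import layers, models

# Toy stand-in series for per-cell Internet activity.
rng = np.random.default_rng(1)
series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * rng.normal(size=2000)

WINDOW = 24  # time steps of history per prediction (assumed)
X = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])[..., None]
y = series[WINDOW:]
split = int(0.8 * len(X))

def build(cell):
    m = models.Sequential([layers.Input(shape=(WINDOW, 1)), cell(32), layers.Dense(1)])
    m.compile(optimizer="adam", loss="mse")
    return m

for name, cell in [("LSTM", layers.LSTM), ("GRU", layers.GRU)]:
    m = build(cell)
    m.fit(X[:split], y[:split], epochs=2, batch_size=64, verbose=0)
    pred = m.predict(X[split:], verbose=0).ravel()
    rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
    mae = np.mean(np.abs(pred - y[split:]))
    print(f"{name}: RMSE={rmse:.3f}  MAE={mae:.3f}")
```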

Language: English

Cited by

10