Optimization of the Electronic Nose Sensor Array for Asthma Detection Based on Genetic Algorithm
Dava Aulia, Riyanarto Sarno, Shintami Chusnul Hidayati

et al.

IEEE Access, Journal Year: 2023, Volume and Issue: 11, P. 74924 - 74935

Published: Jan. 1, 2023

The human body releases several types of gases and volatile organic compounds through exhaled breath. These compounds can be used as markers of lung disease, including asthma. An electronic nose can play a role in determining a patient's condition. The main problem that often occurs is the selection of appropriate sensors, based on their characteristics and performance in detecting various gases, to provide an optimal system while still providing high accuracy. Genetic algorithms have the advantage of being applicable to feature-selection problems and can effectively handle noise and collinearity through three genetic operators: crossover, mutation, and selection. This study aims to apply this method to determine the number of sensors needed for identifying healthy people and asthma suspects. Several classification methods are combined with the selected sensor arrays to obtain an optimized system: support vector machine (SVM), random forest (RF), extreme gradient boosting (XGBoost), artificial neural network (ANN), one-dimensional convolutional neural network (1D-CNN), long short-term memory (LSTM), gated recurrent unit (GRU), 1D CNN-LSTM, and 1D CNN-GRU. These machine-learning approaches usually make such systems highly accurate, depending on their parameters. The experimental results showed that the genetic algorithm was able to produce an array of five sensors that provided a certain pattern of exhaled breath from healthy people and asthma suspects. Meanwhile, the 1D-CNN model was chosen for this dataset, achieving an accuracy of 96.6%, precision of 96.1%, recall of 95.5%, and F1-score of 95.6%.
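The sensor-selection step described above lends itself to a compact illustration. The sketch below is not the authors' code: it encodes a candidate sensor array as a binary mask and evolves a population of masks with the three genetic operators (selection, crossover, mutation), scoring each mask by the cross-validated accuracy of an SVM. The population size, mutation rate, and choice of SVM scorer are illustrative assumptions, and X, y stand for the user's own breath dataset.

```python
# Illustrative sketch: GA-based selection of e-nose sensors (columns of X).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    if mask.sum() == 0:                      # empty sensor subsets are invalid
        return 0.0
    clf = SVC(kernel="rbf")
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.1):
    n_sensors = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_sensors))        # random bit masks
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]      # selection: keep best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_sensors)                    # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_sensors) < p_mut                # bit-flip mutation
            child[flip] = 1 - child[flip]
            children.append(child)
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()].astype(bool)                    # best sensor mask found
```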

Language: English

A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Laith Alzubaidi, Jinshuai Bai, Aiman Al-Sabaawi

et al.

Journal of Big Data, Journal Year: 2023, Volume and Issue: 10(1)

Published: April 14, 2023

Abstract: Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have only small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with vast background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of data to automatically learn representations. Ultimately, a larger dataset would generate a better model, although its performance is also application dependent. This issue is the main barrier that leads many applications to dismiss the use of DL. Having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey on state-of-the-art techniques to deal with training DL models and to overcome three challenges: small datasets, imbalanced datasets, and lack of generalization. The paper starts by listing the learning techniques. Next, the types of DL architectures are introduced. After that, solutions to address the lack of training data are listed, such as Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINN), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are then followed by some related tips about data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset. The survey ends with a list of applications that suffer from data scarcity, and several alternatives are proposed for generating more data in each of them: Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors' knowledge, this review offers a comprehensive overview of strategies to tackle data scarcity in DL.
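Of the remedies listed in this survey, Transfer Learning is the one most commonly reached for when data are scarce. The following sketch shows the generic pattern under stated assumptions (an ImageNet-pretrained ResNet-18 backbone from torchvision, a hypothetical 10-class target task, an arbitrary learning rate): freeze the pretrained weights and train only a small new head on the limited labelled data.

```python
# Minimal Transfer Learning sketch: reuse a pretrained backbone, train a new head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")    # pretrained backbone (torchvision >= 0.13)
for param in model.parameters():
    param.requires_grad = False                      # freeze all pretrained layers

model.fc = nn.Linear(model.fc.in_features, 10)       # new head for the small target task

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One fine-tuning step: only the new head receives gradient updates."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```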

Language: English

Citations: 379

A Review of ARIMA vs. Machine Learning Approaches for Time Series Forecasting in Data Driven Networks
Vaia I. Kontopoulou, Athanasios D. Panagopoulos, Ioannis Kakkos

et al.

Future Internet, Journal Year: 2023, Volume and Issue: 15(8), P. 255 - 255

Published: July 30, 2023

In the broad scientific field of time series forecasting, ARIMA models and their variants have been widely applied for half a century now due to their mathematical simplicity and flexibility in application. However, with recent advances in the development and efficient deployment of artificial intelligence techniques, this view is rapidly changing, with a shift towards machine and deep learning approaches becoming apparent, even without a complete evaluation of the superiority of the new approaches over the classic statistical algorithms. Our work constitutes an extensive review of the published literature comparing ARIMA and machine learning algorithms on forecasting problems, as well as combining the two into hybrid statistical-AI models, across a wide variety of data applications (finance, health, weather, utilities, and network traffic prediction). The review has shown that AI algorithms display better prediction performance in most applications, with a few notable exceptions analyzed in our Discussion and Conclusions sections, while hybrid models steadily outperform their individual parts, utilizing the best algorithmic features of both worlds.
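The hybrid statistical-AI idea the review examines can be sketched simply: let ARIMA model the linear component of the series, let a machine-learning regressor model the nonlinear structure left in the ARIMA residuals, and add the two forecasts. The Python sketch below follows this pattern under illustrative assumptions (an ARIMA(2, 1, 2) order, a six-lag residual window, a random-forest residual model); it is not tied to any specific study in the review.

```python
# Illustrative hybrid statistical-AI forecaster: ARIMA + ML on the residuals.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.ensemble import RandomForestRegressor

def hybrid_forecast(y, horizon=12, n_lags=6):
    # 1) Linear part: ARIMA fit and forecast.
    arima = ARIMA(y, order=(2, 1, 2)).fit()
    linear_fc = arima.forecast(steps=horizon)

    # 2) Nonlinear part: learn the ARIMA residuals from their own lags.
    resid = np.asarray(arima.resid)
    X = np.array([resid[i - n_lags:i] for i in range(n_lags, len(resid))])
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, resid[n_lags:])

    # 3) Roll the residual model forward and add it to the ARIMA forecast.
    window, resid_fc = list(resid[-n_lags:]), []
    for _ in range(horizon):
        nxt = rf.predict(np.array(window[-n_lags:]).reshape(1, -1))[0]
        resid_fc.append(nxt)
        window.append(nxt)
    return np.asarray(linear_fc) + np.asarray(resid_fc)
```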

Language: English

Citations: 111

Role of artificial intelligence (AI) in fish growth and health status monitoring: a review on sustainable aquaculture
Arghya Mandal, Apurba Ratan Ghosh

Aquaculture International, Journal Year: 2023, Volume and Issue: 32(3), P. 2791 - 2820

Published: Oct. 10, 2023

Language: English

Citations: 59

A Critical Review of RNN and LSTM Variants in Hydrological Time Series Predictions
Muhammad Waqas, Usa Wannasingha Humphries

MethodsX, Journal Year: 2024, Volume and Issue: 13, P. 102946 - 102946

Published: Sept. 12, 2024

Language: English

Citations: 43

Read-First LSTM model: A new variant of long short term memory neural network for predicting solar radiation data
Mohammad Ehteram, Mahdie Afshari Nia, Fatemeh Panahi

et al.

Energy Conversion and Management, Journal Year: 2024, Volume and Issue: 305, P. 118267 - 118267

Published: March 7, 2024

Language: English

Citations: 20

Machine Learning and Deep Learning Paradigms: From Techniques to Practical Applications and Research Frontiers
Kamran Razzaq, Mahmood Shah

Computers, Journal Year: 2025, Volume and Issue: 14(3), P. 93 - 93

Published: March 6, 2025

Machine learning (ML) and deep learning (DL), subsets of artificial intelligence (AI), are the core technologies that lead to significant transformation and innovation in various industries by integrating AI-driven solutions. Understanding ML and DL is essential to logically analyse their applicability and identify their effectiveness in different areas like healthcare, finance, agriculture, manufacturing, and transportation. ML consists of supervised, unsupervised, semi-supervised, and reinforcement learning techniques. On the other hand, DL, a subfield of ML comprising neural networks (NNs), can deal with complicated datasets in the health, autonomous systems, and finance industries. This study presents a holistic view of both technologies, analysing their algorithms and each application's capacity to address real-world problems. The study also investigates the application areas in which ML and DL techniques are implemented. Moreover, it highlights the latest trends and possible future avenues for research and development (R&D), which consist of developing hybrid models, generative AI, and incorporating emerging technologies. The study aims to provide a comprehensive overview of ML and DL that can serve as a reference guide for researchers, industry professionals, practitioners, and policy makers.
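As a minimal, self-contained illustration of the two paradigms contrasted above, the sketch below trains a classical supervised ML model (logistic regression) and a small neural network (a multilayer perceptron, the basic building block of DL) on the same labelled data; the synthetic dataset and hyperparameters are illustrative assumptions, not drawn from the study.

```python
# Toy comparison: classical supervised ML vs. a small neural network.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

ml_model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)            # classical ML
dl_model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                         random_state=0).fit(X_tr, y_tr)                # small neural network

print("logistic regression accuracy:", ml_model.score(X_te, y_te))
print("neural network accuracy:     ", dl_model.score(X_te, y_te))
```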

Language: English

Citations: 4

A Comparative Analysis on Suicidal Ideation Detection Using NLP, Machine, and Deep Learning
Rezaul Haque, Naimul Islam, Maidul Islam

et al.

Technologies, Journal Year: 2022, Volume and Issue: 10(3), P. 57 - 57

Published: April 29, 2022

Social networks are essential resources for obtaining information about people's opinions and feelings towards various issues as they share their views with friends and family. Suicidal ideation detection via online social network analysis has emerged as an important research topic with significant difficulties in the fields of NLP and psychology in recent years. With proper exploitation of social media, the complicated early symptoms of suicidal ideation can be discovered and, hence, many lives can be saved. This study offers a comparative analysis of multiple machine learning and deep learning models to identify suicidal thoughts from the social media platform Twitter. The principal purpose of our study is to achieve better model performance than prior works in recognizing indications of suicidal ideation with high accuracy, in order to help avoid suicide attempts. We applied text pre-processing and feature extraction approaches such as CountVectorizer and word embedding, and trained several models for this goal. Experiments were conducted on a dataset of 49,178 instances retrieved from live tweets by 18 suicidal and non-suicidal keywords using the Python Tweepy API. Our experimental findings reveal that RF achieved the highest classification score among the machine learning algorithms, with 93% accuracy and an F1 score of 0.92. However, training the classifiers with word embedding increases the performance of the ML models, where BiLSTM reaches 93.6% accuracy and a 0.93 F1 score.
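The classical pipeline the study describes, CountVectorizer features feeding a random forest classifier, can be sketched as follows. The toy texts and labels are illustrative placeholders, not the authors' 49,178-tweet dataset, and the hyperparameters are assumptions.

```python
# Sketch: bag-of-words features (CountVectorizer) + random forest text classifier.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

texts = ["i cannot go on anymore", "great day at the beach with friends",
         "nobody would miss me", "excited about the new job"]
labels = [1, 0, 1, 0]   # 1 = suicidal ideation, 0 = non-suicidal (toy labels)

X_tr, X_te, y_tr, y_te = train_test_split(texts, labels, test_size=0.5, random_state=0)

clf = make_pipeline(CountVectorizer(),
                    RandomForestClassifier(n_estimators=300, random_state=0))
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), zero_division=0))
```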

Language: English

Citations: 71

A real-time adaptive model for bearing fault classification and remaining useful life estimation using deep neural network
Muktesh Gupta, Rajesh Wadhvani, Akhtar Rasool

et al.

Knowledge-Based Systems, Journal Year: 2022, Volume and Issue: 259, P. 110070 - 110070

Published: Oct. 30, 2022

Language: English

Citations: 49

Integration of Deep Learning into the IoT: A Survey of Techniques and Challenges for Real-World Applications
Abdussalam Elhanashi, Pierpaolo Dini, Sergio Saponara

et al.

Electronics, Journal Year: 2023, Volume and Issue: 12(24), P. 4925 - 4925

Published: Dec. 7, 2023

The internet of things (IoT) has emerged as a pivotal technological paradigm facilitating interconnected and intelligent devices across multifarious domains. The proliferation of IoT has resulted in an unprecedented surge of data, presenting formidable challenges concerning efficient processing, meaningful analysis, and informed decision making. Deep-learning (DL) methodologies, notably convolutional neural networks (CNNs), recurrent neural networks (RNNs), and deep-belief networks (DBNs), have demonstrated significant efficacy in mitigating these challenges by furnishing robust tools for learning and extracting insights from vast and diverse IoT-generated data. This survey article offers a comprehensive and meticulous examination of recent scholarly endeavors encompassing the amalgamation of deep-learning techniques within the IoT landscape. Our scrutiny encompasses an extensive exploration of DL models, expounding on their architectures and applications across IoT domains, including but not limited to smart cities, healthcare informatics, and surveillance applications. We proffer insights into prospective research trajectories, discerning the exigency for innovative solutions that surmount extant limitations and intricacies in deploying DL methodologies effectively within IoT frameworks.
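As a concrete, minimal example of the kind of DL model the survey covers for IoT data, the sketch below defines a small 1D CNN that classifies fixed-length windows of multivariate sensor readings; the channel count, window length, and number of classes are illustrative assumptions.

```python
# Minimal 1D CNN for classifying windows of multivariate IoT sensor data.
import torch
import torch.nn as nn

class IoTSensorCNN(nn.Module):
    def __init__(self, n_channels=3, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # global pooling -> (batch, 32, 1)
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                      # x: (batch, channels, window)
        return self.classifier(self.features(x).squeeze(-1))

model = IoTSensorCNN()
dummy = torch.randn(8, 3, 128)                 # 8 windows of 3-channel sensor readings
print(model(dummy).shape)                      # -> torch.Size([8, 4])
```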

Language: English

Citations: 32

Deep learning-driven hybrid model for short-term load forecasting and smart grid information management
Xinyu Wen, Jiacheng Liao, Qingyi Niu

et al.

Scientific Reports, Journal Year: 2024, Volume and Issue: 14(1)

Published: June 14, 2024

Accurate power load forecasting is crucial for the sustainable operation of smart grids. However, the complexity and uncertainty of the load, along with large-scale and high-dimensional energy information, present challenges in handling intricate dynamic features and long-term dependencies. This paper proposes a computational approach to address these challenges in short-term power load forecasting and energy information management, with the goal of accurately predicting future load demand. The study introduces a hybrid method that combines multiple deep learning models: a Gated Recurrent Unit (GRU) is employed to capture long-term dependencies in the time series data, while a Temporal Convolutional Network (TCN) efficiently learns local patterns in the data. Additionally, an attention mechanism is incorporated to automatically focus on the input components most relevant to the prediction task, further enhancing model performance. According to the experimental evaluation conducted on four public datasets, including GEFCom2014, the proposed algorithm outperforms the baseline models on various metrics such as prediction accuracy, efficiency, and stability. Notably, on the GEFCom2014 dataset, FLOPs are reduced by over 48.8%, inference time is shortened by more than 46.7%, and MAPE is improved by 39%. The proposed approach significantly enhances the reliability, stability, and cost-effectiveness of smart grids, which facilitates risk assessment and the optimization of operational planning in the context of energy information management in smart grid systems.
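A rough PyTorch sketch of the hybrid architecture described above is given below: a GRU branch for long-term dependencies, a simplified TCN-style convolutional branch for local patterns, and an attention layer that weights time steps before a forecasting head. Layer sizes, dilations, and the 24-step input window are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch of a GRU + TCN-style + attention hybrid for short-term load forecasting.
import torch
import torch.nn as nn

class HybridLoadForecaster(nn.Module):
    def __init__(self, n_features=1, hidden=32):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.tcn = nn.Sequential(                               # simplified (non-causal) TCN-style block
            nn.Conv1d(n_features, hidden, kernel_size=3, padding=2, dilation=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=4, dilation=4),
            nn.ReLU(),
        )
        self.attn = nn.Linear(2 * hidden, 1)                    # per-time-step attention score
        self.head = nn.Linear(2 * hidden, 1)                    # next-step load prediction

    def forward(self, x):                                       # x: (batch, time, features)
        g, _ = self.gru(x)                                      # (batch, time, hidden)
        t = self.tcn(x.transpose(1, 2)).transpose(1, 2)         # (batch, time, hidden)
        t = t[:, :g.size(1), :]                                 # align lengths with the GRU branch
        h = torch.cat([g, t], dim=-1)                           # fuse both branches
        w = torch.softmax(self.attn(h), dim=1)                  # attention weights over time steps
        context = (w * h).sum(dim=1)                            # weighted summary vector
        return self.head(context)                               # (batch, 1)

model = HybridLoadForecaster()
window = torch.randn(16, 24, 1)      # 16 sequences of 24 hourly load values
print(model(window).shape)           # -> torch.Size([16, 1])
```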

Language: English

Citations: 13