Published: July 3, 2024
Language: English
Elsevier eBooks, Journal Year: 2025, Volume and Issue: unknown, P. 181 - 201
Published: Jan. 1, 2025
Citations: 0
Crop Protection, Journal Year: 2025, Volume and Issue: unknown, P. 107206 - 107206
Published: March 1, 2025
Language: English
Citations: 0
AgriEngineering, Journal Year: 2025, Volume and Issue: 7(4), P. 103 - 103
Published: April 3, 2025
Accurate weed segmentation in Unmanned Aerial Vehicle (UAV) imagery remains a significant challenge in precision agriculture due to environmental variability, weak contextual representation, and inaccurate boundary detection. To address these limitations, we propose the Multi-Scale Edge-Aware Network (MSEA-Net), a lightweight and efficient deep learning framework designed to enhance segmentation accuracy while maintaining computational efficiency. Specifically, we introduce the Spatial-Channel Attention (MSCA) module to recalibrate spatial and channel dependencies, improving local–global feature fusion and reducing redundant computations. Additionally, the Edge-Enhanced Bottleneck (EEBA) module integrates Sobel-based edge detection to refine boundary delineation, ensuring sharper object separation in dense vegetation environments. Extensive evaluations on publicly available datasets demonstrate the effectiveness of MSEA-Net, which achieves a mean Intersection over Union (IoU) of 87.42% on the Motion-Blurred UAV Images of Sorghum Fields dataset and 71.35% on the CoFly-WeedDB dataset, outperforming benchmark models. MSEA-Net also maintains a compact architecture with only 6.74 M parameters and a model size of 25.74 MB, making it suitable for real-time UAV-based segmentation. These results highlight the potential of MSEA-Net for efficient, automated weed segmentation in UAV deployments.
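The sketch below is a minimal illustration (in PyTorch, which the abstract does not specify) of the Sobel-based edge-enhancement idea described above: fixed Sobel kernels extract per-channel edge maps that are concatenated with the input features and fused back by a 1x1 convolution. The class name SobelEdgeBlock and all layer sizes are assumptions for illustration, not the authors' EEBA implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SobelEdgeBlock(nn.Module):
    """Illustrative edge-aware block: fixed Sobel kernels produce per-channel
    edge maps that are concatenated with the input and fused by a 1x1 conv."""
    def __init__(self, channels: int):
        super().__init__()
        gx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
        gy = gx.t()
        # One (Gx, Gy) kernel pair per input channel, applied depthwise.
        kernel = torch.stack([gx, gy]).unsqueeze(1).repeat(channels, 1, 1, 1)
        self.register_buffer("kernel", kernel)  # shape: (2*channels, 1, 3, 3)
        self.channels = channels
        self.fuse = nn.Conv2d(channels * 3, channels, kernel_size=1)

    def forward(self, x):
        # Depthwise Sobel filtering: every channel yields Gx and Gy responses.
        edges = F.conv2d(x, self.kernel, padding=1, groups=self.channels)
        return self.fuse(torch.cat([x, edges], dim=1))

# Usage: refine a 64-channel feature map from a segmentation backbone.
features = torch.randn(1, 64, 128, 128)
print(SobelEdgeBlock(64)(features).shape)  # torch.Size([1, 64, 128, 128])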
Language: English
Citations: 0
Smart Agricultural Technology, Journal Year: 2025, Volume and Issue: unknown, P. 100966 - 100966
Published: April 1, 2025
Language: English
Citations: 0
Drones, Journal Year: 2024, Volume and Issue: 8(7), P. 293 - 293
Published: June 28, 2024
Invasive knotweeds are rhizomatous, herbaceous perennial plants that pose significant ecological threats due to their aggressive growth and ability to outcompete native plants. Although detecting and identifying knotweeds is crucial for effective management, current ground-based survey methods are labor-intensive and limited in their ability to cover large or hard-to-access areas. This study was conducted to determine the optimum flight height of drones for aerial detection of knotweeds at different phenological stages and to develop automated detection on drone images using the state-of-the-art Swin Transformer. The results of this study found that, at the vegetative stage, Japanese knotweed and giant knotweed were detectable at ≤35 m and ≤25 m above the canopy, respectively, using an RGB sensor, while the flowers were detectable at ≤20 m. Thermal and multispectral sensors were not able to detect any knotweed species. The Swin Transformer achieved higher precision, recall, and accuracy on the images acquired with the RGB sensor than conventional convolutional neural networks (CNNs). This study demonstrated the use of drones, sensors, and deep learning in revolutionizing invasive knotweed detection.
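As a hedged illustration of the classification setup described above, the sketch below fine-tunes an ImageNet-pretrained Swin-T from torchvision on a two-class problem (knotweed vs. background); the class count, optimizer, and learning rate are assumptions rather than the authors' training configuration.

import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained Swin-T and replace the classification head
# for a two-class problem (e.g., knotweed vs. background).
model = models.swin_t(weights=models.Swin_T_Weights.IMAGENET1K_V1)
model.head = nn.Linear(model.head.in_features, 2)

# Fine-tune on 224x224 RGB tiles cropped from the drone imagery.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 224, 224)   # dummy batch of image tiles
labels = torch.randint(0, 2, (8,))     # dummy class labels
logits = model(images)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()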
Language: English
Citations: 2
Forests, Journal Year: 2024, Volume and Issue: 15(10), P. 1814 - 1814
Published: Oct. 17, 2024
As the main area for photosynthesis in trees, the canopy absorbs a large amount of carbon dioxide and plays an irreplaceable role in regulating the carbon cycle of the atmosphere and mitigating climate change. Therefore, monitoring tree crown growth is crucial. However, traditional field investigation methods are often time-consuming and labor-intensive, as well as limited in coverage, which may result in incomplete and inaccurate assessments. In response to the challenges encountered in the application of tree crown segmentation algorithms, such as adhesion between individual crowns and insufficient generalization ability of the algorithm, this study proposes an improved algorithm based on Mask R-CNN (Mask Region-based Convolutional Neural Network), which identifies irregular crown edges in RGB images obtained from drones. Firstly, it optimizes the backbone network by improving ResNeXt and embedding the SENet (Squeeze-and-Excitation Networks) module to enhance the model's feature extraction capability. Secondly, BiFPN-CBAM is introduced to enable the model to learn and utilize features more effectively. Finally, the mask loss function is replaced with Boundary-Dice to further improve the segmentation effect. In this study, TCSNet also incorporated the concept of panoptic segmentation, achieving coherent and consistent segmentation throughout the entire scene through fine boundary recognition and integration. TCSNet was tested on three datasets with different geographical environments and forest types, namely artificial forests, natural forests, and urban forests, with artificial forests performing the best. Compared with the original algorithm, precision increased by 6.6%, recall by 1.8%, and F1-score by 4.2%, highlighting its potential and robustness in tree crown detection and segmentation.
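A minimal sketch of the Squeeze-and-Excitation (SENet) module that the abstract embeds into the ResNeXt backbone is shown below, assuming a PyTorch implementation; the channel count and reduction ratio are illustrative, and the BiFPN-CBAM and Boundary-Dice components are not reproduced here.

import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: global-average-pool the feature map,
    learn per-channel weights, and rescale the channels."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.pool(x).view(b, c)          # squeeze: (B, C)
        w = self.fc(w).view(b, c, 1, 1)      # excitation: per-channel weights
        return x * w                         # rescale the feature map

# Example: recalibrate the output of one backbone stage.
features = torch.randn(2, 256, 56, 56)
print(SEBlock(256)(features).shape)  # torch.Size([2, 256, 56, 56])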
Language: English
Citations: 2
AgriEngineering, Journal Year: 2024, Volume and Issue: 6(1), P. 555 - 573
Published: March 1, 2024
The invasive morning glory, Ipomoea purpurea (Convolvulaceae), poses a mounting challenge in vineyards by hindering grape harvest and serving as a secondary host of disease pathogens, necessitating advanced detection and control strategies. This study introduces a novel automated image analysis framework using aerial images obtained from a small fixed-wing unmanned aircraft system (UAS) with an RGB camera for the large-scale detection of I. purpurea flowers. The study aimed to assess sampling fidelity in comparison with actual infestation measured by ground validation surveys. The UAS was systematically operated over 16 vineyard plots infested with I. purpurea and another set of plots without infestation. We used a semi-supervised segmentation model incorporating a Gaussian Mixture Model (GMM) with the Expectation-Maximization algorithm to detect and count flowers, and the flower detectability of the GMM was compared with that of conventional K-means methods. The results of this study showed that the GMM detected the presence of flowers in all infested plots with 0% type I and type II errors, while the K-means method produced errors of 6.3%. The GMM and K-means methods detected 76% and 65% of the flowers, respectively. These results underscore the effectiveness of the GMM-based approach in accurately detecting and quantifying infestation. The study demonstrated the efficiency of UAS imagery coupled with semi-supervised segmentation for monitoring I. purpurea in vineyards, achieving success without relying on data-driven deep-learning models.
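The sketch below illustrates the core GMM/EM color-clustering step described above, assuming scikit-learn; the brightest-cluster heuristic, the function name gmm_flower_mask, and the component count are illustrative assumptions, and the semi-supervised labeling and flower counting used in the study are not reproduced.

import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_flower_mask(rgb_image: np.ndarray, n_components: int = 3) -> np.ndarray:
    """Cluster pixel colors with a Gaussian Mixture Model (fit by EM) and
    return a binary mask for the brightest cluster, taken here as flowers."""
    h, w, _ = rgb_image.shape
    pixels = rgb_image.reshape(-1, 3).astype(np.float64)

    gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                          random_state=0)
    labels = gmm.fit_predict(pixels)

    # Assumption: the flower cluster is the one with the highest mean brightness.
    brightness = gmm.means_.mean(axis=1)
    flower_cluster = int(np.argmax(brightness))
    return (labels == flower_cluster).reshape(h, w)

# Usage with a dummy tile; in practice, pass a UAS image tile.
tile = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
mask = gmm_flower_mask(tile)
print(mask.sum(), "pixels flagged as flowers")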
Language: English
Citations: 1
Journal of Ambient Intelligence and Humanized Computing, Journal Year: 2024, Volume and Issue: 15(9), P. 3547 - 3561
Published: July 12, 2024
Language: English
Citations: 1
International Journal of Systems Assurance Engineering and Management, Journal Year: 2024, Volume and Issue: 15(9), P. 4636 - 4648
Published: Aug. 23, 2024
Language: English
Citations: 1
Drones, Journal Year: 2024, Volume and Issue: 8(11), P. 645 - 645
Published: Nov. 6, 2024
The automatic identification of plant species using unmanned aerial vehicles (UAVs) is a valuable tool for ecological research. However, challenges such as reduced spatial resolution due to high-altitude operations, image degradation from camera optics and sensor limitations, and information loss caused by terrain shadows hinder accurate species classification from UAV imagery. This study addresses these issues by proposing a novel preprocessing pipeline and evaluating its impact on model performance. Our approach improves image quality through a multi-step pipeline that includes Enhanced Super-Resolution Generative Adversarial Networks (ESRGAN) for resolution enhancement, Contrast-Limited Adaptive Histogram Equalization (CLAHE) for contrast improvement, and white balance adjustments for accurate color representation. These steps ensure high-quality input data, leading to better model performance. For feature extraction and classification, we employ a pre-trained VGG-16 deep convolutional neural network, followed by machine learning classifiers, including Support Vector Machine (SVM), random forest (RF), and Extreme Gradient Boosting (XGBoost). This hybrid approach, combining deep feature extraction with classical classifiers, not only enhances classification accuracy but also reduces computational resource requirements compared with relying solely on deep learning models. Notably, VGG-16 + SVM achieved an outstanding 97.88% accuracy on the dataset preprocessed with ESRGAN and white balance adjustments, with a precision of 97.9%, a recall of 97.8%, and an F1 score of 0.978. Through a comprehensive comparative study, we demonstrate that the proposed framework, utilizing VGG-16 feature extraction on preprocessed images, achieves superior performance in plant species classification from UAV imagery.
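A hedged sketch of the CLAHE, white-balance, and VGG-16 + SVM stages described above is given below, assuming OpenCV, torchvision, and scikit-learn; the ESRGAN super-resolution step is omitted, and the parameter values and helper names (preprocess, extract_features) are illustrative rather than the authors' exact pipeline.

import cv2
import numpy as np
import torch
from torchvision import models, transforms
from sklearn.svm import SVC

def preprocess(bgr: np.ndarray) -> np.ndarray:
    """CLAHE on the luminance channel plus a simple gray-world white balance."""
    # Contrast-Limited Adaptive Histogram Equalization on the L channel.
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab = cv2.merge((clahe.apply(l), a, b))
    img = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR).astype(np.float32)

    # Gray-world white balance: scale each channel toward the global mean.
    channel_means = img.reshape(-1, 3).mean(axis=0)
    img *= channel_means.mean() / (channel_means + 1e-6)
    return np.clip(img, 0, 255).astype(np.uint8)

# VGG-16 as a fixed feature extractor feeding a classical SVM classifier.
vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
vgg.classifier = vgg.classifier[:-1]   # drop the final 1000-way layer -> 4096-d
vgg.eval()
to_tensor = transforms.Compose([
    transforms.ToTensor(),
    transforms.Resize((224, 224)),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def extract_features(bgr: np.ndarray) -> np.ndarray:
    rgb = cv2.cvtColor(preprocess(bgr), cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        return vgg(to_tensor(rgb).unsqueeze(0)).squeeze(0).numpy()

# Dummy training data; replace with real image tiles and species labels.
X = np.stack([extract_features(np.random.randint(0, 256, (256, 256, 3), np.uint8))
              for _ in range(4)])
y = np.array([0, 1, 0, 1])
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:1]))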
Language: English
Citations: 1