Abstract
Automated disease recognition plays a pivotal role in advancing smart artificial intelligence (AI)-based agriculture and is crucial for achieving higher crop yields. Although substantial research has been conducted on deep learning-based automated plant disease recognition systems, these efforts have predominantly focused on leaf diseases while neglecting diseases affecting fruits. We propose an efficient architecture for effective fruit disease recognition with state-of-the-art performance to address this gap.
Our method integrates advanced techniques, such as multi-head attention mechanisms and lightweight convolutions, to enhance both efficiency and performance. Its ultralightweight design emphasizes minimizing computational costs, ensuring compatibility with memory-constrained edge devices and enhancing accessibility and practical usability.
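The abstract does not disclose the exact layer design; as a rough sketch only, the block below pairs a depthwise-separable convolution with multi-head self-attention in PyTorch. Every layer name, channel count, and head count here is an assumption for illustration, not the authors' architecture.

```python
# Rough sketch only: pairing a lightweight (depthwise-separable) convolution with
# multi-head self-attention, as described generically in the abstract. All layer
# names, channel counts, and head counts are assumptions, not the authors' design.
import torch
import torch.nn as nn

class LightweightAttentionBlock(nn.Module):
    def __init__(self, channels: int = 64, num_heads: int = 4):
        super().__init__()
        # Depthwise + pointwise convolution keeps the parameter count low.
        self.depthwise = nn.Conv2d(channels, channels, kernel_size=3,
                                   padding=1, groups=channels, bias=False)
        self.pointwise = nn.Conv2d(channels, channels, kernel_size=1, bias=False)
        self.norm = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)
        # Multi-head self-attention over the spatial positions.
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=num_heads,
                                          batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.act(self.norm(self.pointwise(self.depthwise(x))))
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)            # (B, H*W, C)
        attended, _ = self.attn(tokens, tokens, tokens)  # self-attention
        return x + attended.transpose(1, 2).reshape(b, c, h, w)

# Example: a 56x56 feature map with 64 channels passes through unchanged in shape.
block = LightweightAttentionBlock(channels=64, num_heads=4)
print(block(torch.randn(1, 64, 56, 56)).shape)  # torch.Size([1, 64, 56, 56])
```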
Experimental evaluations were conducted on three diverse datasets containing multi-class images of disease-affected and healthy samples of sugar apple (Annona squamosa), pomegranate (Punica granatum), and guava (Psidium guajava).
The proposed model attained exceptional results, with test set accuracies and weighted precision, recall, and F1-scores exceeding 99%, which also outperformed pretrained large-scale models.
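For reference, "weighted" precision, recall, and F1 average the per-class scores weighted by class support; a minimal scikit-learn sketch (with placeholder label arrays, not the paper's data) looks like this:

```python
# Minimal sketch: weighted precision, recall, and F1 with scikit-learn.
# y_true and y_pred are placeholder arrays, not results from the paper.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = [0, 0, 1, 1, 2, 2, 2, 3]   # ground-truth class indices on a test set
y_pred = [0, 0, 1, 2, 2, 2, 2, 3]   # model predictions

precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted", zero_division=0
)
print(f"accuracy={accuracy_score(y_true, y_pred):.4f}  "
      f"precision={precision:.4f}  recall={recall:.4f}  f1={f1:.4f}")
```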
Combining high accuracy with an ultralightweight design, this work represents a significant step forward in developing accessible AI solutions for agriculture, contributing to the advancement of sustainable agriculture.
Frontiers in Plant Science,
Journal year: 2024, Issue 15
Published: Dec. 12, 2024
Flavescence dorée (FD) poses a significant threat to grapevine health, with the American leafhopper, Scaphoideus titanus, serving as its primary vector. FD is responsible for yield losses and high production costs due to mandatory insecticide treatments, infected plant uprooting, and replanting.
Another potential vector is the mosaic leafhopper, Orientus ishidae, commonly found in agroecosystems. The current monitoring approach, which involves periodic human identification of yellow sticky traps, is labor-intensive and time-consuming.
Therefore, there is a compelling need to develop an automatic pest detection system leveraging recent advances in computer vision and deep learning techniques. However, progress in developing such a system has been hindered by the lack of effective datasets for training.
To fill this gap, our study contributes a fully annotated dataset of S. titanus and O. ishidae from yellow sticky traps, which includes more than 600 images with approximately 1500 identifications per class. Assisted by entomologists, we performed the annotation process, then trained and compared the performance of two state-of-the-art object detection algorithms: YOLOv8 and Faster R-CNN.
Pre-processing, including cropping to eliminate irrelevant background information and image enhancements to improve the overall quality of the dataset, was employed. Additionally, we tested the impact of altering image resolution and data augmentation, while also addressing issues related to class detection.
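As an illustration of such a training setup (not the authors' exact configuration), a YOLOv8 run with explicit image resolution and augmentation settings using the Ultralytics API might look like the sketch below; the dataset file name, image size, and augmentation values are assumptions.

```python
# Illustrative sketch: training YOLOv8 on a custom sticky-trap dataset with the
# Ultralytics API. The dataset file, image size, and augmentation values are
# assumptions for demonstration, not the settings used in the study.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # small pretrained checkpoint as a starting point

model.train(
    data="sticky_traps.yaml",            # hypothetical config: train/val paths, 2 classes
    epochs=100,
    imgsz=640,                           # input resolution; the study varied this
    fliplr=0.5,                          # horizontal-flip probability (augmentation)
    hsv_h=0.015, hsv_s=0.7, hsv_v=0.4,   # colour jitter in HSV space
    degrees=10.0,                        # random rotation
)

metrics = model.val()  # reports mAP@0.5 and mAP@[0.5:0.95] on the validation split
print(metrics.box.map50, metrics.box.map)
```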
The results, evaluated through 10-fold cross-validation, revealed promising accuracy, with YOLOv8 achieving an mAP@0.5 of 92%, an F1-score above 90%, and an mAP@[0.5:0.95] of 66%. Meanwhile, Faster R-CNN reached 86% and 55%, respectively.
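As context for the evaluation protocol, 10-fold cross-validation repeatedly holds out one tenth of the images for testing and averages the scores; a minimal sketch of such a split (with placeholder file names and stand-in scoring calls, not the study's code) is shown below.

```python
# Minimal sketch of a 10-fold cross-validation loop over image files.
# File names and the train/evaluate helpers are placeholders, not the study's code.
import numpy as np
from sklearn.model_selection import KFold

image_files = np.array([f"img_{i:04d}.jpg" for i in range(600)])  # placeholder list
kf = KFold(n_splits=10, shuffle=True, random_state=42)

fold_scores = []
for fold, (train_idx, test_idx) in enumerate(kf.split(image_files)):
    train_files, test_files = image_files[train_idx], image_files[test_idx]
    # train_detector(train_files)        # hypothetical training call
    # score = evaluate_map(test_files)   # hypothetical mAP evaluation
    score = 0.0                          # stand-in value
    fold_scores.append(score)
    print(f"fold {fold}: {len(train_files)} train / {len(test_files)} test images")

print("mean mAP across folds:", np.mean(fold_scores))
```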
This outcome offers encouraging prospects for pest management strategies in the fight against Flavescence dorée.
Remote Sensing,
Journal year: 2025, Issue 17(6), pp. 962 - 962
Published: March 9, 2025
The Advanced Insect Detection Network (AIDN), which represents a significant advancement in the application of deep learning for ecological monitoring, is specifically designed to enhance the accuracy and efficiency of insect detection from unmanned aerial vehicle (UAV) imagery.
Utilizing a novel architecture that incorporates advanced activation and normalization techniques, multi-scale feature fusion, and a custom-tailored loss function, AIDN addresses the unique challenges posed by the small size, high mobility, and diverse backgrounds of insects in aerial images.
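The abstract does not describe how the multi-scale fusion is implemented; the FPN-style sketch below is only one common way to fuse feature maps of different resolutions in PyTorch, with all channel sizes assumed rather than taken from AIDN.

```python
# Illustrative sketch of multi-scale feature fusion (FPN-style top-down pathway).
# Channel sizes and the number of pyramid levels are assumptions, not AIDN's design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleFusion(nn.Module):
    def __init__(self, in_channels=(64, 128, 256), out_channels=128):
        super().__init__()
        self.lateral = nn.ModuleList(
            nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels
        )
        self.smooth = nn.ModuleList(
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1)
            for _ in in_channels
        )

    def forward(self, features):
        # features: list of maps from fine (high resolution) to coarse (low resolution)
        laterals = [conv(f) for conv, f in zip(self.lateral, features)]
        # Top-down pass: upsample each coarser map and add it to the next finer one.
        for i in range(len(laterals) - 1, 0, -1):
            laterals[i - 1] = laterals[i - 1] + F.interpolate(
                laterals[i], size=laterals[i - 1].shape[-2:], mode="nearest"
            )
        return [conv(l) for conv, l in zip(self.smooth, laterals)]

# Example with three feature maps at decreasing resolution.
fusion = MultiScaleFusion()
feats = [torch.randn(1, 64, 80, 80), torch.randn(1, 128, 40, 40), torch.randn(1, 256, 20, 20)]
print([f.shape for f in fusion(feats)])
```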
In comprehensive testing against established models, AIDN demonstrated superior performance, achieving 92% precision, 88% recall, an F1-score of 90%, and a mean Average Precision (mAP) score of 89%.
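For reference, the reported F1-score is consistent with the stated precision and recall, since F1 is their harmonic mean: F1 = 2PR / (P + R) = (2 × 0.92 × 0.88) / (0.92 + 0.88) ≈ 0.90.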
These results signify a substantial improvement over traditional models such as YOLO v4, SSD, and Faster R-CNN, which typically show performance metrics approximately 10–15% lower across similar tests.
The practical implications of AIDN are profound, offering benefits for agricultural management and biodiversity conservation. By automating detection and classification processes, it reduces the labor-intensive tasks of manual monitoring, enabling more frequent and accurate data collection. This improvement in data collection quality and frequency enhances decision making in pest management and conservation, leading to more effective interventions and strategies.
AIDN's design and capabilities set a new standard in the field, promising scalable solutions for UAV-based insect monitoring. Its ongoing development is expected to integrate additional sensory data and real-time adaptive capabilities, further broadening its applicability and ensuring its role as a transformative tool in insect monitoring and environmental science.