Abstract
To understand the processes behind pollinator declines, and thus to maintain pollination efficiency, we also have to understand the fundamental drivers influencing pollinator behaviour. In this study, we aim to explore the foraging behaviour of wild bumblebees, recognizing its importance from both economic and conservation perspectives. We recorded Bombus terrestris on Lotus creticus, Persicaria capitata, and Trifolium pratense patches in five-minute-long slots in urban areas of Terceira (Azores, Portugal). For automated bumblebee detection, we created computer vision models based on a deep learning algorithm, with custom datasets. We achieved high F1 scores of 0.88 and 0.95, indicating accurate detection. We found that flower cover percentage, but not plant species, influenced the attractiveness of the patches, with a significant positive effect. There were no differences between species in the time spent on flower heads. The handling time was longer on large-headed flowers than on those with smaller heads. However, our results did not indicate differences in the time bumblebees spent on flowers among the three species. Here, we justify computer vision-based analysis as a reliable tool for studying behavioural ecology.
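The F1 scores above summarize how well the detector's predictions match manually annotated bumblebees. As an illustrative sketch only (not the authors' code), the snippet below shows a common way such a score can be computed by matching predicted boxes to ground-truth boxes; the box format and the IoU threshold of 0.5 are assumptions.

```python
# Illustrative sketch (not the authors' code): compute a detection F1 score
# by greedily matching predicted boxes to ground-truth boxes at IoU >= 0.5.
# Boxes are assumed to be (x_min, y_min, x_max, y_max) tuples in pixels.

def iou(a, b):
    """Intersection-over-union of two boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def detection_f1(predictions, ground_truths, iou_thr=0.5):
    """predictions / ground_truths: lists (one entry per image) of box lists."""
    tp = fp = fn = 0
    for preds, gts in zip(predictions, ground_truths):
        unmatched = list(gts)
        for p in preds:
            best = max(unmatched, key=lambda g: iou(p, g), default=None)
            if best is not None and iou(p, best) >= iou_thr:
                tp += 1
                unmatched.remove(best)  # each ground-truth box matches at most once
            else:
                fp += 1
        fn += len(unmatched)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Example: one image with one correct and one spurious detection -> F1 = 0.67
print(detection_f1([[(10, 10, 50, 50), (200, 200, 220, 220)]],
                   [[(12, 11, 52, 49)]]))
```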
PLoS ONE, 2024, 19(4): e0295474. Published: April 3, 2024.
Insect monitoring is essential to design effective conservation strategies, which are indispensable to mitigate worldwide insect declines and biodiversity loss. For this purpose, traditional monitoring methods are widely established and can provide data with a high taxonomic resolution. However, processing of captured insect samples is often time-consuming and expensive, which limits the number of potential replicates. Automated monitoring methods can facilitate data collection at a higher spatiotemporal resolution with comparatively lower effort and cost. Here, we present the Insect Detect DIY (do-it-yourself) camera trap for non-invasive automated monitoring of flower-visiting insects, based on low-cost off-the-shelf hardware components combined with open-source software. Custom trained deep learning models detect and track insects landing on an artificial flower platform in real time on-device and subsequently classify the cropped detections on a local computer. Field deployment of the solar-powered camera trap confirmed its resistance to high temperatures and humidity, which enables autonomous deployment during a whole season. On-device detection and tracking can estimate insect activity/abundance after metadata post-processing. Our classification model achieved a high top-1 accuracy on the test dataset and generalized well to real-world images. The software is highly customizable and can be adapted to different use cases. With custom trained models, as well as accessible programming, many possible applications surpassing our proposed method can be realized.
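The abstract describes a two-stage workflow: detection and tracking run on the device, while classification of the cropped detections happens later on a local computer. The sketch below is a minimal illustration of that hand-off, not the Insect Detect implementation; the detector, its output format, and the file paths are assumptions.

```python
# Minimal sketch (not the Insect Detect implementation): crop each detection
# from a frame and save it with metadata, so a separate classification model
# can process the crops later on a local computer.
import csv
import datetime as dt
from pathlib import Path

import cv2  # OpenCV for image handling

def save_detection_crops(frame, detections, out_dir="crops", log_path="metadata.csv"):
    """detections: iterable of (x_min, y_min, x_max, y_max, confidence) in pixels."""
    Path(out_dir).mkdir(exist_ok=True)
    timestamp = dt.datetime.now().strftime("%Y%m%d_%H%M%S")
    with open(log_path, "a", newline="") as f:
        writer = csv.writer(f)
        for i, (x1, y1, x2, y2, conf) in enumerate(detections):
            crop = frame[int(y1):int(y2), int(x1):int(x2)]
            crop_name = f"{timestamp}_{i}.jpg"
            cv2.imwrite(str(Path(out_dir) / crop_name), crop)
            # Log metadata for later post-processing (e.g. activity/abundance estimates).
            writer.writerow([timestamp, crop_name, x1, y1, x2, y2, conf])
```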
Journal of Pollination Ecology, 2025, 37: 1-21. Published: Jan. 10, 2025.
Monitoring plant-pollinator interactions is crucial for understanding the factors influencing these relationships across space and time. Traditional methods in pollination ecology are resource-intensive, while time-lapse photography offers potential as a non-destructive, automated complementary technique. However, accurate identification of pollinators at finer taxonomic levels (i.e., genus or species) requires sufficiently high image quality. This study assessed the feasibility of using a smartphone setup to capture images of arthropods visiting flowers and evaluated whether the images offered sufficient resolution for arthropod identification by taxonomists. Smartphones were positioned above target flowers from various plant species in urban green areas around Leipzig and Halle, Germany. We present the proportions of identifications (instances) at different taxonomic levels (order, family, genus, species) based on visible features as interpreted by taxonomists, and we document limitations that stem from the setup (e.g., fixed camera positioning preventing some features from being distinguished despite sufficient resolution) and from low image quality. Recommendations are provided to address these challenges.
Our results indicate that 89.81% of all Hymenoptera instances were identified to family level and 84.56% to a pollinator group, but only 25.35% to genus level. Taxonomists were less able to identify Dipterans to these levels, with nearly 50% not identifiable and only 26.18% and 15.19% identified at the two finer levels. This was due to their small size and the more challenging features needed for identification (e.g., wing veins).
Advancing smartphone technology, along with its accessibility, affordability, and user-friendliness, makes it a promising option for coarse-level pollinator monitoring.
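The reported percentages are proportions of annotated instances that reached a given taxonomic level. As a hypothetical illustration only, the sketch below shows how such proportions could be derived from an annotation table; the column names and toy data are invented for the example.

```python
# Hypothetical illustration: share of instances per order that were identified
# to each taxonomic level. Column names and data are invented for the example.
import pandas as pd

annotations = pd.DataFrame({
    "order":    ["Hymenoptera", "Hymenoptera", "Hymenoptera", "Diptera", "Diptera"],
    "id_level": ["family",      "genus",       "order",       "order",   "family"],
})

proportions = (annotations.groupby("order")["id_level"]
               .value_counts(normalize=True)   # fraction of instances per level
               .mul(100)
               .round(2))
print(proportions)
```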
Agronomy, 2025, 15(3): 693. Published: March 13, 2025.
Pest infestations have always been a major factor affecting tea production. Real-time detection of tea pests using machine vision is a mainstream method in modern agricultural pest control. Currently, there is a notable absence of devices capable of real-time monitoring of small-sized tea pests on the market, and the scarcity of available open-source datasets remains a critical limitation. This manuscript proposes the YOLOv8-FasterTea algorithm based on cross-domain transfer learning, which was successfully deployed on a novel monitoring device. The proposed method leverages transfer learning from the natural language character domain to the tea pest detection domain, termed cross-domain transfer learning, based on the complex and small characteristics shared by characters and tea pests. With sufficient character samples, this approach can effectively enhance the tiny-feature extraction capabilities of deep networks and mitigate the few-shot learning problem in tea pest detection. The information and texture features of small tea pests are more likely to be lost as the layers of a neural network become deeper. Therefore, the proposed method, YOLOv8-FasterTea, removes the P5 layer and adds a P2 small-target detection layer to the YOLOv8 model. Additionally, the original C2f module is replaced with lighter convolutional modules to reduce the loss of information about small targets. Finally, this paper applies the method to outdoor monitoring equipment. Experimental results demonstrate that, on the small-sample yellow board dataset, the mAP@0.5 value of the model increased by approximately 6%, on average, after cross-domain transfer learning. Detection performance further improved by 3.7%, while the model size was reduced by 46.6%.
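For readers unfamiliar with the transfer-learning step the abstract builds on, the sketch below shows generic fine-tuning of a pretrained YOLOv8 model with the Ultralytics API. It is not the modified YOLOv8-FasterTea architecture (P2 layer added, P5 removed, lighter modules replacing C2f); the dataset file name is hypothetical.

```python
# Generic transfer-learning sketch with the Ultralytics YOLOv8 API; this is NOT
# the YOLOv8-FasterTea architecture described above, only the usual fine-tuning
# step it builds on. The dataset file "tea_pests.yaml" is hypothetical.
from ultralytics import YOLO

# Start from weights pretrained on a large source dataset and fine-tune them
# on a (small) target pest dataset -- the basic idea behind transfer learning.
model = YOLO("yolov8n.pt")
model.train(data="tea_pests.yaml", epochs=100, imgsz=640)

# Evaluate: mAP@0.5 is the metric the abstract reports improvements for.
metrics = model.val()
print(metrics.box.map50)
```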
Research Square, 2025, volume and issue unknown. Published: April 7, 2025.
Abstract
Pollinating insects provide essential ecosystem services, and using time-lapse photography to automate their observation could improve monitoring efficiency. Computer vision models, trained on clear citizen science photos, can detect insects in similar images with high accuracy, but their performance on field images taken with automated setups is unknown. We evaluated the generalisation of three lightweight YOLO detectors (YOLOv5-nano, YOLOv5-small, YOLOv7-tiny), previously trained on such citizen science images, for detecting ~ 1,300 flower-visiting arthropod individuals in nearly 24,000 images captured with a fixed smartphone setup. These field images featured unseen backgrounds and smaller arthropods than the training data. The model with the highest number of trainable parameters performed best, localising 91.21% of Hymenoptera and 80.69% of Diptera individuals. However, classification recall was lower (80.45% and 66.90%, respectively), partly due to Syrphidae mimicking Hymenoptera and to the challenge posed by smaller, blurrier flower visitors. This study reveals both the potential and the limitations of such models for real-world automated monitoring, suggesting that they work well for larger, sharply visible pollinators but need improvement for less sharp cases.
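The abstract distinguishes localisation (an arthropod is found at all) from classification (it is also assigned the correct class). A hypothetical sketch of that distinction, with invented data structures, is given below.

```python
# Hypothetical sketch: localisation recall (individual was found) versus
# classification recall (found AND assigned the correct class).
# The data structures and values are invented for the example.

def recalls(matches):
    """matches: one dict per ground-truth individual, e.g.
    {"true_class": "Diptera", "detected": True, "pred_class": "Diptera"}"""
    total = len(matches)
    localised = sum(m["detected"] for m in matches)
    correct = sum(m["detected"] and m["pred_class"] == m["true_class"] for m in matches)
    return localised / total, correct / total

example = [
    {"true_class": "Diptera", "detected": True,  "pred_class": "Diptera"},
    {"true_class": "Diptera", "detected": True,  "pred_class": "Hymenoptera"},  # mimic-like confusion
    {"true_class": "Diptera", "detected": False, "pred_class": None},
]
print(recalls(example))  # (0.67, 0.33): localisation recall exceeds classification recall
```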
Remote Sensing in Ecology and Conservation, 2025, volume and issue unknown. Published: April 17, 2025.
Abstract
Insects represent nearly half of all known multicellular species, but knowledge about them lags behind that for most vertebrate species. In part for this reason, they are often neglected in biodiversity conservation policies and practice. Computer vision tools, such as insect camera traps for automated monitoring, have the potential to revolutionize insect study and conservation. To further advance insect camera trapping and the analysis of their image data, effective processing pipelines are needed. In this paper, we present a flexible and fast pipeline designed to analyse these recordings by detecting, tracking and classifying nocturnal insects in a broad taxonomy of 15 insect classes, with resolution to individual moth species. A classifier with anomaly detection is proposed to filter out dark, blurred or partially visible insects that will be uncertain to classify correctly. A simple track-by-detection algorithm is used to track the classified insects by incorporating feature embeddings, distance and area cost. We evaluated the computational speed and power performance of different edge computing devices (Raspberry Pi's and NVIDIA Jetson Nano) and compared various time-lapse (TL) strategies with tracking. The minimum difference in detections was found between 2-min TL intervals and 0.5 frames per second; however, for periods of fewer than one night, the Pearson correlation decreases. Shifting from tracking to TL intervals would reduce the number of recorded images and allow real-time processing on the trap with a Raspberry Pi. The Jetson Nano is an energy-efficient solution, capable of real-time processing at 0.5 fps. Our pipeline was applied to more than 5.7 million images recorded at 0.5 frames per second by 12 light traps during two full seasons, located in diverse habitats, including bogs, heaths and forests. Our results thus show the scalability of the pipeline across insect camera traps.
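The tracking step described above combines appearance and geometry into a single association cost. The sketch below is a minimal illustration of such a cost (embedding, distance and area terms), not the authors' implementation; the weights and normalisation constant are assumptions.

```python
# Minimal sketch (not the authors' implementation) of a combined association
# cost for track-by-detection, mixing an appearance term (feature-embedding
# cosine distance), a spatial-distance term and an area-change term.
# The weights and the max_dist normalisation constant are assumptions.
import numpy as np

def association_cost(det, track, w_emb=0.5, w_dist=0.3, w_area=0.2, max_dist=200.0):
    """det/track: dicts with 'embedding' (1-D array), 'center' (x, y), 'area'."""
    emb_cost = 1.0 - (np.dot(det["embedding"], track["embedding"]) /
                      (np.linalg.norm(det["embedding"]) * np.linalg.norm(track["embedding"])))
    dist_cost = min(np.hypot(det["center"][0] - track["center"][0],
                             det["center"][1] - track["center"][1]) / max_dist, 1.0)
    area_cost = abs(det["area"] - track["area"]) / max(det["area"], track["area"])
    return w_emb * emb_cost + w_dist * dist_cost + w_area * area_cost

# Each new detection is assigned to the existing track with the lowest cost
# (e.g. greedily or via the Hungarian algorithm), subject to a cost threshold.
```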
Insects, 2024, 15(9): 729. Published: Sept. 22, 2024.
To understand the processes behind pollinator declines and for the conservation of pollination services, we need to understand the fundamental drivers influencing pollinator behaviour. Here, we aimed to elucidate how wild bumblebees interact with three plant species and investigated their foraging behaviour at varying flower densities. We video-recorded Bombus terrestris in 60 × 60 cm quadrats of Lotus creticus, Persicaria capitata, and Trifolium pratense in urban areas of Terceira (Azores, Portugal). For automated bumblebee detection and counting, we created deep learning-based computer vision models with custom datasets. We achieved high model accuracies of 0.88 and 0.95 (Trifolium), indicating accurate bumblebee detection. In our study, flower cover was the only factor that influenced the attractiveness of the patches, while plant species did not have an effect. We detected a significant positive effect of flower cover on the attractiveness of the patches for flower-visiting bumblebees.
The time spent per unit of inflorescence surface area was longer for the other species than for Persicaria. However, our results did not indicate differences in the time spent per inflorescence among the species. We also justify computer vision-based analysis as a reliable tool for studying behavioural ecology.
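As a purely illustrative sketch (not the authors' statistical analysis), the snippet below shows how a positive effect of flower cover on patch visitation, controlling for plant species, could be tested with a Poisson GLM; the column names, toy data and model family are assumptions made for the example.

```python
# Illustrative sketch only (not the authors' analysis): test for a positive
# effect of flower cover on bumblebee visit counts while controlling for plant
# species. Column names, toy data and the Poisson family are assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

patches = pd.DataFrame({
    "visits":       [2, 5, 9, 1, 4, 7, 3, 6, 11],
    "flower_cover": [10, 30, 60, 8, 25, 55, 12, 35, 70],   # per cent cover
    "species":      ["Lotus", "Lotus", "Lotus",
                     "Persicaria", "Persicaria", "Persicaria",
                     "Trifolium", "Trifolium", "Trifolium"],
})

model = smf.glm("visits ~ flower_cover + C(species)",
                data=patches, family=sm.families.Poisson()).fit()
print(model.summary())  # a positive, significant flower_cover coefficient would
                        # correspond to the kind of effect reported above
```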