Applied Sciences, Journal Year: 2024, Volume and Issue: 14(8), P. 3264 - 3264, Published: April 12, 2024
This study delves into the analysis of a vineyard in Carinthia, Austria, focusing on the automated derivation of ecosystem structures and individual vine parameters, including heights, leaf area index (LAI), leaf surface area (LSA), and the geographic positioning of single plants. For these intricate segmentation processes, nuanced UAS-based data acquisition techniques are necessary. The detection of vines was based on 3D point cloud data, generated at a phenological stage in which the plants were in the absence of foliage. The mean distance from the derived vine locations to reference measurements taken with a GNSS device was 10.7 cm, with a root mean square error (RMSE) of 1.07. Vine height, derived from a normalized digital surface model (nDSM) using photogrammetric data, showcased a strong correlation (R2 = 0.83) with real-world measurements. Vines underwent classification through an object-based image analysis (OBIA) framework. This process enabled the computation of parameters at the individual plant level post-segmentation. Consequently, it delivered comprehensive canopy characteristics rapidly, surpassing the speed of manual methods. With the use of uncrewed aerial systems (UAS) equipped with optical sensors, dense point clouds were computed for canopy-related parameters of the vines. While the LAI and LSA computations await validation, they underscore the technical feasibility of obtaining precise geometric and morphological datasets from UAS-collected data paired with point cloud analysis.
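The abstract does not include the underlying processing code, so the following is only a minimal sketch of how per-vine heights could be read from a normalized digital surface model (nDSM = DSM - DTM). The raster file names, the sampling window radius, and the use of rasterio/numpy are assumptions for illustration, not the authors' workflow.

```python
# Illustrative sketch: per-vine height from an nDSM (nDSM = DSM - DTM).
# File names and the small sampling window are assumptions, not study settings.
import numpy as np
import rasterio

with rasterio.open("dsm.tif") as dsm_src, rasterio.open("dtm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype(float)
    dtm = dtm_src.read(1).astype(float)
    transform = dsm_src.transform

ndsm = dsm - dtm  # canopy height above ground

def vine_height(x, y, radius_px=3):
    """Return the maximum nDSM value in a small window around a vine position."""
    col, row = ~transform * (x, y)          # map coordinates -> pixel indices
    r, c = int(round(row)), int(round(col))
    window = ndsm[max(r - radius_px, 0):r + radius_px + 1,
                  max(c - radius_px, 0):c + radius_px + 1]
    return float(np.nanmax(window))

# Example: heights for a list of GNSS-referenced vine positions (hypothetical coordinates).
vine_positions = [(452301.2, 5162840.7), (452303.1, 5162840.9)]
heights = [vine_height(x, y) for x, y in vine_positions]
```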
Sensors International, Journal Year: 2024, Volume and Issue: 5, P. 100292 - 100292, Published: Jan. 1, 2024
The integration of Artificial Intelligence (AI) and Internet of Things (IoT) technologies is transforming precision agriculture by enhancing crop monitoring and management. This review explores cutting-edge methodologies and innovations in modern agriculture, including high-throughput phenotyping, remote sensing, and automated agricultural robots (AgroBots). These systems automate tasks such as harvesting, sorting, and weed detection, significantly reducing labor costs and environmental impacts. High-throughput phenotyping leverages spectral imaging and robotics to collect data on plant traits, enabling informed decisions on fertilization, irrigation, and pest control. DGPS and remote sensing offer precise, real-time data essential for soil condition assessment and crop health monitoring. Advanced image segmentation techniques ensure accurate detection of plants and fruits, overcoming challenges posed by varying lighting conditions and complex backgrounds. Case studies like the PACMAN SCRI project for apple crop load management and Project PANTHEON's SCADA system for hazelnut orchards demonstrate the transformative potential of AI and IoT in optimizing agricultural practices. The upcoming 5G and future 6G mobile networks promise to address connectivity challenges, promoting widespread adoption of smart farming. However, several research gaps remain. Integrating diverse datasets, ensuring scalability for small and medium-sized farms, and improving decision-making need further investigation. Developing robust models and devices for varied field conditions, creating user-friendly interfaces for farmers, and addressing privacy and security concerns are essential. Addressing these gaps can enhance the effectiveness of AI and IoT, leading to more sustainable and productive farming systems.
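The review does not prescribe a specific vegetation index for crop health monitoring, but a common indicator derived from multispectral remote sensing data is NDVI. The sketch below computes it from red and near-infrared bands; the band file names are placeholders.

```python
# Illustrative sketch: NDVI = (NIR - Red) / (NIR + Red) from two single-band
# rasters. Band file names are assumptions; any multispectral sensor providing
# red and near-infrared reflectance works the same way.
import numpy as np
import rasterio

with rasterio.open("red.tif") as red_src, rasterio.open("nir.tif") as nir_src:
    red = red_src.read(1).astype(float)
    nir = nir_src.read(1).astype(float)
    profile = red_src.profile

ndvi = np.where(nir + red > 0, (nir - red) / (nir + red), np.nan)

profile.update(dtype="float32", count=1)
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi.astype("float32"), 1)
```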
Remote Sensing, Journal Year: 2024, Volume and Issue: 16(3), P. 584 - 584, Published: Feb. 3, 2024
Precision viticulture systems are essential for enhancing traditional intensive viticulture, achieving high-quality results, and minimizing costs. This study explores the integration of Unmanned Aerial Vehicles (UAVs) and artificial intelligence in precision viticulture, focusing on vine detection and vineyard zoning. Vine detection employs the YOLO (You Only Look Once) deep learning algorithm, achieving a remarkable 90% accuracy by analysing UAV imagery with various spectral ranges from different phenological stages.
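The abstract does not state which YOLO implementation or weights were used; as one possible illustration, the ultralytics package can run a detector over UAV image tiles. The weights file name and confidence threshold below are hypothetical.

```python
# Illustrative sketch: running a YOLO detector over UAV image tiles with the
# ultralytics package. "vine_detector.pt" and the 0.25 confidence threshold are
# hypothetical; the paper does not specify them.
from pathlib import Path
from ultralytics import YOLO

model = YOLO("vine_detector.pt")  # hypothetical fine-tuned weights

detections = []
for tile in Path("uav_tiles").glob("*.jpg"):
    results = model.predict(source=str(tile), conf=0.25, verbose=False)
    for box in results[0].boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        detections.append((tile.name, x1, y1, x2, y2, float(box.conf[0])))

print(f"{len(detections)} vine detections across all tiles")
```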
Vineyard zoning, achieved through the application of K-means clustering, incorporates geospatial data such as the Normalized Difference Vegetation Index (NDVI) and the assessment of nitrogen, phosphorus, and potassium content in leaf blades and petioles. This approach enables efficient resource management tailored to each zone's specific needs. The research aims to develop a decision-support model for precision viticulture. The proposed model demonstrates high accuracy in vine detection and defines vineyard zones with variable weighting factors assigned to the input variables while preserving location information, revealing significant differences in the measured variables between zones. The model's advantages lie in its rapid results and minimal data requirements, offering profound insights into the benefits of precise vineyard management. It has the potential to expedite decision making, allowing adaptive strategies based on the unique conditions of each zone.
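To make the zoning step concrete, here is a minimal K-means sketch over per-vine NDVI and leaf nutrient values. The input table, feature names, scaling, and the choice of three zones are assumptions; the study's weighting of variables is not reproduced.

```python
# Illustrative sketch: K-means vineyard zoning from per-vine NDVI and leaf
# nutrient measurements. The CSV layout and n_clusters=3 are assumptions.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("vine_observations.csv")      # hypothetical per-vine table
features = df[["ndvi", "nitrogen", "phosphorus", "potassium"]]

scaled = StandardScaler().fit_transform(features)
zones = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)

df["zone"] = zones                              # zone label per vine
print(df.groupby("zone")[["ndvi", "nitrogen"]].mean())
```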
Computers and Electronics in Agriculture, Journal Year: 2023, Volume and Issue: 213, P. 108174 - 108174, Published: Sept. 6, 2023
A smartphone with both colour and time-of-flight depth cameras is used for automated grape yield estimation in Chardonnay grapes. A new technique is developed to automatically identify berries in the smartphone's depth maps. This utilises distortion peaks in the depth map caused by the diffused scattering of light within each berry. The technique is then extended to allow unsupervised training of a YOLOv7 model for berry detection in RGB images. A correlation coefficient (R2) of 0.946 was achieved when comparing the berry count observed in the RGB images with those accurately identified by YOLO. Additionally, an average precision score of 0.970 was attained. Two techniques are presented to estimate berry size and to generate 3D models of grape bunches using the depth information.
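The peak-identification step is described only at a high level; the sketch below shows one generic way to flag local maxima ("distortion peaks") in a depth map with scipy. The 9x9 neighbourhood and the 5 mm offset are arbitrary assumptions, not values from the paper.

```python
# Illustrative sketch: locating local peaks in a smartphone depth map.
# Neighbourhood size and threshold are assumptions, not the paper's method.
import numpy as np
from scipy.ndimage import maximum_filter

depth = np.load("depth_map.npy")              # hypothetical depth map in mm

local_max = maximum_filter(depth, size=9)     # strongest value in each 9x9 window
background = np.median(depth)
peaks = (depth == local_max) & (depth > background + 5.0)

rows, cols = np.nonzero(peaks)
print(f"Candidate berry peaks: {len(rows)}")
```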
The Journal of Agricultural Science, Journal Year: 2024, Volume and Issue: 162(1), P. 19 - 32, Published: Feb. 1, 2024
Abstract. Varietal identification plays a pivotal role in viticulture for several purposes. Nowadays, such identification is accomplished using ampelography and molecular markers, techniques requiring specific expertise and equipment. Deep learning, on the other hand, appears to be a viable and cost-effective alternative, as recent studies claim that computer vision models can identify different vine varieties with high accuracy. Such works, however, limit their scope to a handful of selected varieties and do not provide accuracy figures from external data validation. In the current study, five well-known computer vision models were applied to leaf images to verify whether the results presented in the literature can be replicated over a larger data set consisting of 27 varieties and 26,382 images. It was built over 2 years of dedicated field sampling at three geographically distinct sites, with an external validation data set collected from the Internet. Cross-validation on the purpose-built data set confirmed the results reported in the literature. However, the same models, when validated against the independent set, appear unable to generalize beyond the training data and to retain the performances measured during cross-validation. These results indicate that further enhancements have to be done to fill this gap and develop a more reliable model to discriminate among grape varieties, underlining that, to achieve this purpose, image resolution is a crucial factor in the development of such models.
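The abstract contrasts cross-validation on the purpose-built set with a test on an independent, Internet-collected set. The sketch below illustrates that evaluation protocol with scikit-learn on precomputed feature vectors; the feature files and the random-forest classifier are placeholders, not the CNN models the study evaluated.

```python
# Illustrative sketch of the protocol: k-fold cross-validation on a purpose-built
# set versus a single test on an external, independently collected set.
# X_*/y_* files are hypothetical precomputed features and labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import StratifiedKFold, cross_val_score

X_field, y_field = np.load("field_X.npy"), np.load("field_y.npy")
X_web, y_web = np.load("web_X.npy"), np.load("web_y.npy")

clf = RandomForestClassifier(n_estimators=300, random_state=0)

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
cv_scores = cross_val_score(clf, X_field, y_field, cv=cv)
print(f"Cross-validation accuracy: {cv_scores.mean():.3f}")

# External validation: train once on the whole field set, test on web images.
clf.fit(X_field, y_field)
print(f"External-set accuracy: {accuracy_score(y_web, clf.predict(X_web)):.3f}")
```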
ACS Omega, Journal Year: 2025, Volume and Issue: unknown, Published: Jan. 11, 2025
As robots undertake increasingly complex tasks, such as real-time visible image sensing, environmental analysis, and weather monitoring under harsh conditions, the design of an appropriate robot shell has become crucial to ensure the reliability of internal electronic components. Several key factors, including the cooling efficiency, transparency, mechanical performance, and weathering resistance of the material, are proposed in this research for future functionality. In this study, a polymeric double-layered shell for fabrication by stereolithography 3D printing was designed, featuring a porous outer layer and a spherical inner shell. The shell provides approximately 90% transmission in the visible-to-near-infrared wavelength range (450-1050 nm) and ensures the proper functioning of optical devices, such as cameras, lidar, and solar cells, inside the robot. In addition, the material displays high emittance in the mid-infrared range (5-20 μm) to facilitate effective radiative cooling and protect the control system from thermal damage. The 3D-printed shell was exposed to a real outdoor environment for three months, and its stable performance confirms its weathering ability. Moreover, the porous structure promotes mechanical strength while moving; an optimal porosity of 50% was designed for continuous movement under impact. Finite element simulations were also used to show that the porosity significantly reduces the strain energy upon impact. Compared with a conventional single-layer shell at 130 mJ, the proposed design exhibits a reduced strain energy of 22.09 mJ. This design, which offers excellent weathering resistance and radiative cooling, is promising for applications in both land and water robot shells.
Sensors, Journal Year: 2025, Volume and Issue: 25(2), P. 431 - 431, Published: Jan. 13, 2025
Assessing vines' vigour is essential for vineyard management and the automatization of viticulture machines, including shaking adjustments of berry harvesters during grape harvest or leaf pruning applications. To address these problems, growth classes of precisely located grapevines, labeled as ground truth data based on a standardized growth class assessment, were predicted with specifically selected Machine Learning (ML) classifiers (Random Forest Classifier (RFC), Support Vector Machines (SVM)), utilizing multispectral UAV (Unmanned Aerial Vehicle) sensor data. The input features for ML model training comprise spectral, structural, and texture feature types generated from orthomosaics (spectral features), Digital Terrain and Surface Models (DTM/DSM; structural features), and Gray-Level Co-occurrence Matrix (GLCM) calculations (texture features).
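GLCM texture features of this kind can be derived per image window, for example with scikit-image. The sketch below computes contrast, homogeneity, correlation, and energy for one window of a single orthomosaic band; the window size, 32-level quantization, and offsets are assumptions, not the study's settings.

```python
# Illustrative sketch: GLCM texture features (contrast, homogeneity, correlation,
# energy) for a single-band window. Window size and quantization are assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

band = np.load("orthomosaic_band.npy")                 # hypothetical reflectance band
window = band[:64, :64]                                # one analysis window

# Quantize reflectance to 32 gray levels for the co-occurrence matrix.
levels = 32
quantized = np.digitize(window, np.linspace(window.min(), window.max(), levels)) - 1
quantized = quantized.clip(0, levels - 1).astype(np.uint8)

glcm = graycomatrix(quantized, distances=[1], angles=[0, np.pi / 2],
                    levels=levels, symmetric=True, normed=True)

features = {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "homogeneity", "correlation", "energy")}
print(features)
```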
The specific features were selected based on extensive literature research, especially in the fields of precision agri- and viticulture. To integrate only vine-canopy-exclusive information into the classifications, the different features were extracted and spatially aggregated (zonal statistics), combined with a pixel- and object-based image-segmentation-technique-created vine row mask around each single grapevine position. The canopy vigour classes were progressively grouped, starting from seven groups, for model training. Model overall performance metrics were optimized with grid-search-based hyperparameter tuning and repeated k-fold cross-validation. Finally, the ML-based prediction results are extensively discussed and evaluated with overall (accuracy, f1-weighted) and class-specific classification metrics (user's and producer's accuracy).
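A minimal sketch of the tuning setup described here, using scikit-learn's grid search with repeated stratified k-fold cross-validation for both classifiers: the parameter grids, fold counts, and feature files are assumptions, not the study's configuration.

```python
# Illustrative sketch: RFC and SVM vigour classification with grid-search
# hyperparameter tuning and repeated k-fold cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RepeatedStratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X = np.load("vine_features.npy")      # spectral + structural + texture features
y = np.load("growth_classes.npy")     # labeled growth classes per grapevine

cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=0)

searches = {
    "RFC": GridSearchCV(RandomForestClassifier(random_state=0),
                        {"n_estimators": [200, 500], "max_depth": [None, 10]},
                        scoring="f1_weighted", cv=cv),
    "SVM": GridSearchCV(make_pipeline(StandardScaler(), SVC()),
                        {"svc__C": [1, 10], "svc__gamma": ["scale", 0.1]},
                        scoring="f1_weighted", cv=cv),
}

for name, search in searches.items():
    search.fit(X, y)
    print(name, search.best_params_, f"f1_weighted={search.best_score_:.3f}")
```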
Agriculture, Journal Year: 2025, Volume and Issue: 15(3), P. 298 - 298, Published: Jan. 30, 2025
Phenotypic traits of fungi and their automated extraction are crucial for evaluating genetic diversity, breeding new varieties, and estimating yield. However, research on the high-throughput, rapid, and non-destructive extraction of fungal phenotypic traits using 3D point clouds remains limited. In this study, a smartphone is used to capture multi-view images of shiitake mushrooms (Lentinula edodes) from three different heights and angles, employing the YOLOv8x model to segment the primary image regions. The segmented images were reconstructed in 3D using Structure from Motion (SfM) and Multi-View Stereo (MVS). To automatically segment individual mushroom instances, we developed a CP-PointNet++ network integrated with clustering methods, achieving an overall accuracy (OA) of 97.45% in segmentation. The computed phenotype parameters correlated strongly with manual measurements, yielding R2 > 0.8 and nRMSE < 0.09 for pileus transverse and longitudinal diameters, R2 = 0.53 and RMSE = 3.26 mm for pileus height, R2 = 0.79 and nRMSE = 0.12 for stipe diameter, and R2 = 0.65 and RMSE = 4.98 mm for stipe height. Using these parameters, yield estimation was performed with PLSR, SVR, RF, and GRNN machine learning models, demonstrating superior performance (R2 = 0.91). This approach is also adaptable to extracting phenotypic traits of other fungi, providing valuable support for breeding and yield-estimation initiatives.
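The yield-estimation step compares several regressors over the extracted phenotype parameters. As a minimal sketch of that comparison, the snippet below cross-validates PLSR, SVR, and RF with scikit-learn; GRNN has no scikit-learn implementation and is omitted, and the feature/label files and 5-fold setup are assumptions.

```python
# Illustrative sketch: comparing regressors for yield estimation from extracted
# phenotype parameters. File names and model settings are assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X = np.load("phenotype_params.npy")   # e.g. pileus diameters, stipe diameter/height
y = np.load("yield.npy")              # measured yield per sample

models = {
    "PLSR": PLSRegression(n_components=2),
    "SVR": make_pipeline(StandardScaler(), SVR(C=10.0)),
    "RF": RandomForestRegressor(n_estimators=300, random_state=0),
}

for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R2 = {r2.mean():.2f}")
```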