Mathematical and Computational Applications, Journal Year: 2024, Volume and Issue: 29(4), P. 56-56. Published: July 13, 2024
Feature selection is a preprocessing step in machine learning that aims to reduce dimensionality and improve performance. Feature selection approaches are often classified, according to how the subset of features is evaluated, as filter, wrapper, and embedded approaches. The high performance of wrapper approaches is, at the same time, associated with the disadvantage of a high computational cost. Cost-reduction mechanisms have been proposed in the literature, where competitive performance is achieved more efficiently. This work applies simple but effective resource-saving mechanisms, namely fixed and incremental sampling-fraction strategies and a memory to avoid repeated evaluations, in a multi-objective permutational-based differential evolution algorithm for feature selection. The selected approach is an extension of the DE-FSPM algorithm with the selection mechanism of the GDE3 algorithm. The results showed resource savings, especially in the number of evaluations required by the search process. Nonetheless, it was also detected that the algorithm's performance diminished. Therefore, the results reported in the literature on the effectiveness of these cost-reduction mechanisms for single-objective feature selection were only partially sustained in the multi-objective case.
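The two resource-saving mechanisms described above, a fixed or incremental sampling fraction of the training data and a memory that avoids re-evaluating repeated feature subsets, can be sketched as follows. All function names, the linear growth schedule, and the default fraction are illustrative assumptions, not the DE-FSPM/GDE3 implementation:

```python
def make_cached_evaluator(evaluate, total_generations,
                          initial_fraction=0.2, incremental=True):
    """Sketch of two resource-saving mechanisms: a memory (cache) keyed
    on the feature subset so repeated subsets are never re-evaluated,
    and a sampling fraction of the training data that is either fixed
    or grows with the generation number (hypothetical schedule)."""
    cache = {}

    def fraction(generation):
        if not incremental:
            return initial_fraction          # fixed-fraction strategy
        # Incremental strategy: grow linearly toward the full training set.
        step = (1.0 - initial_fraction) / max(1, total_generations - 1)
        return initial_fraction + step * generation

    def cached_eval(subset, generation):
        key = (frozenset(subset), round(fraction(generation), 6))
        if key not in cache:                 # memory: skip repeated evaluations
            cache[key] = evaluate(subset, fraction(generation))
        return cache[key]

    return cached_eval
```

The cache key includes the sampling fraction so that a subset re-seen at a larger fraction is re-evaluated on the bigger sample rather than served a stale, lower-fidelity fitness value.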
Geoderma, Journal Year: 2024, Volume and Issue: 449, P. 116999-116999. Published: Aug. 13, 2024
Understanding and predicting global soil moisture (SM) is crucial for water resource management and agricultural production. While deep learning (DL) methods have shown strong performance in SM prediction, imbalances among training samples with different characteristics pose a significant challenge. We propose that improving the diversity and balance of batch samples during gradient descent can help address this issue. To test this hypothesis, we developed a Cluster-Averaged Sampling (CAS) strategy utilizing unsupervised clustering techniques. This approach involves training the model on data sampled evenly from clusters, ensuring both sample diversity and numerical consistency within each cluster. This prevents the model from overemphasizing specific sample characteristics, leading to more balanced feature learning.
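The batch-construction idea described above can be sketched as follows. The simplified k-means step, the function name, and all parameters are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def cluster_averaged_batches(features, labels, n_clusters=4,
                             per_cluster=8, n_batches=3, seed=0):
    """Sketch of Cluster-Averaged Sampling (CAS): cluster the training
    samples without supervision, then draw the same number of samples
    from every cluster for each mini-batch, so no single sample regime
    dominates gradient descent."""
    rng = np.random.default_rng(seed)
    # Simplified k-means as a stand-in for the unsupervised clustering step.
    centroids = features[rng.choice(len(features), n_clusters, replace=False)]
    for _ in range(10):
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(assign == k):
                centroids[k] = features[assign == k].mean(axis=0)
    # Build batches with an equal quota per (non-empty) cluster.
    batches = []
    for _ in range(n_batches):
        idx = []
        for k in range(n_clusters):
            members = np.flatnonzero(assign == k)
            if len(members):
                idx.extend(rng.choice(members, per_cluster,
                                      replace=len(members) < per_cluster))
        batches.append((features[idx], labels[idx]))
    return batches
```

Because every cluster contributes the same quota to each batch, rare regimes (for example, high-latitude samples) appear in every gradient step rather than in proportion to their global frequency.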
Experiments using the LandBench1.0 dataset with five random seeds for 1-day lead-time predictions reveal that CAS outperforms several Long Short-Term Memory (LSTM)-based models that do not employ the strategy. The median Coefficient of Determination (R2) improved by 2.36 % to 4.31 %, while the Kling-Gupta Efficiency (KGE) improved by 1.95 % to 3.16 %. In high-latitude areas, R2 improvements exceeded 40 % in some regions. To further validate CAS under realistic conditions, we tested it on Soil Moisture Active Passive Level 3 (SMAP-L3) satellite data for 1- to 3-day predictions, confirming its efficacy. This study substantiates the proposed hypothesis and introduces a novel method for enhancing the generalization of DL models.