Owing to the powerful memorization capabilities of deep neural networks, they tend to overfit noisy labels, resulting in a degradation of their discrimination ability. Sample selection methods that filter out possibly clean labels have become the mainstream in learning with noisy labels. However, there is a large gap between the sizes of the filtered, possibly clean subset and the unlabeled subset, which is particularly obvious under high noise rates; as a result, label-free samples cannot be fully used, leaving space for performance improvement. This paper proposes an improved Sample selection framework with an OverSampling strategy, SOS, to overcome this deficiency. It mines the useful information carried by label-free instances to boost models' performance by combining the oversampling strategy with existing SOTA methods. We demonstrate the effectiveness of SOS through extensive experimental results on both synthetic and real-world datasets. The code will be available at https://github.com/LanXiaoPang613/SOS.
Applied Sciences, Journal Year: 2023, Volume and Issue: 13(12), P. 6933 - 6933, Published: June 8, 2023
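The abstract above does not spell out how the "possibly clean" subset is filtered; in many sample selection methods this is done by fitting a two-component mixture model to per-sample losses, treating the small-loss component as clean. A minimal sketch of that widely used small-loss criterion, assuming a PyTorch classifier and an unshuffled loader (`select_clean` and its threshold are illustrative, not the paper's exact procedure):

```python
import numpy as np
import torch
import torch.nn.functional as F
from sklearn.mixture import GaussianMixture

def select_clean(model, eval_loader, device, threshold=0.5):
    """Split a noisily labeled training set into a (probably) clean
    subset and an unlabeled subset by fitting a two-component GMM to
    per-sample losses: small-loss samples are treated as clean.
    Assumes eval_loader iterates the dataset in order (shuffle=False)."""
    model.eval()
    losses = []
    with torch.no_grad():
        for x, y in eval_loader:
            logits = model(x.to(device))
            losses.append(F.cross_entropy(logits, y.to(device),
                                          reduction="none").cpu())
    losses = torch.cat(losses).numpy().reshape(-1, 1)
    # Min-max normalize losses before fitting the mixture model.
    losses = (losses - losses.min()) / (losses.max() - losses.min() + 1e-8)
    gmm = GaussianMixture(n_components=2, max_iter=100, reg_covar=5e-4)
    gmm.fit(losses)
    # Posterior probability of belonging to the low-mean (clean) component.
    clean_prob = gmm.predict_proba(losses)[:, gmm.means_.argmin()]
    clean_idx = np.where(clean_prob > threshold)[0]
    noisy_idx = np.where(clean_prob <= threshold)[0]
    return clean_idx, noisy_idx
```

Under high noise rates this split is exactly where the size gap the abstract describes appears: `clean_idx` shrinks while `noisy_idx` grows.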
As one of the most important fields in computer vision, object detection has undergone marked development in recent years. Generally, it requires many labeled samples for training, but these are not easy to collect and label in specialized fields. In the case of few samples, general detectors typically exhibit overfitting and poor generalizability when recognizing unknown objects, and existing few-shot object detection (FSOD) methods also cannot make good use of support information or manage the potential problem of relationships between the support branch and the query branch. To address this issue, we propose in this paper a novel framework called Decoupled Multi-scale Attention (DMA-Net), the core of which is the Decoupled Multi-scale Attention Module (DMAM), consisting of three primary parts: a multi-scale feature extractor, an attention module, and a decoupled gradient module (DGM). DMAM performs multi-scale feature extraction and layer-to-layer fusion, which can exploit support information more efficiently, while the DGM reduces the impact of optimization exchange between the two branches. DMA-Net can implement incremental FSOD, making it suitable for practical applications. Extensive experimental results demonstrate that DMA-Net achieves comparable performance on generic FSOD benchmarks, particularly in the incremental FSOD setting, where it achieves state-of-the-art performance.
PLoS ONE, Journal Year: 2024, Volume and Issue: 19(12), P. e0309841 - e0309841, Published: Dec. 5, 2024
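The abstract leaves the decoupled gradient module (DGM) unspecified. A common way to realize gradient decoupling between two branches that share a backbone is an identity layer that rescales gradients on the backward pass, so one branch only weakly influences the shared weights. A hypothetical sketch under that assumption (`DecoupledGradientModule` and the scale value are illustrative, not the paper's implementation):

```python
import torch
import torch.nn as nn

class _ScaleGradient(torch.autograd.Function):
    """Identity in the forward pass; scales the gradient in the
    backward pass, attenuating how strongly this branch updates
    whatever feeds into it."""

    @staticmethod
    def forward(ctx, x, scale):
        ctx.scale = scale
        return x

    @staticmethod
    def backward(ctx, grad_output):
        # Gradient w.r.t. x is scaled; scale itself gets no gradient.
        return grad_output * ctx.scale, None

class DecoupledGradientModule(nn.Module):
    """Hypothetical stand-in for DMA-Net's DGM: forwards features
    unchanged but attenuates the gradient flowing back from one
    branch, limiting optimization interference between the support
    and query branches."""

    def __init__(self, scale: float = 0.1):
        super().__init__()
        self.scale = scale

    def forward(self, x):
        return _ScaleGradient.apply(x, self.scale)

# Usage: insert between a shared backbone and a branch-specific head.
# feats = backbone(images)
# support_feats = DecoupledGradientModule(scale=0.01)(feats)
```

The design choice here is that decoupling happens only in the backward pass, so inference is unaffected while the two branches stop competing for the shared backbone during training.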
Deep neural networks have powerful memorization capabilities, yet they frequently suffer from overfitting to noisy labels, leading to a decline in classification and generalization performance. To address this issue, sample selection methods that filter out potentially clean labels have been proposed. However, there is a significant gap in size between the filtered, possibly clean subset and the unlabeled subset, which becomes particularly pronounced at high noise rates. Consequently, these methods underutilize label-free samples, leaving room for performance improvement. This study introduces an enhanced sample selection framework with an oversampling strategy (SOS) to overcome this limitation. It leverages the valuable information contained in label-free instances to enhance model performance by combining SOS with existing state-of-the-art methods. We validate the effectiveness of SOS through extensive experiments conducted on both synthetic and real-world datasets, such as CIFAR, WebVision, and Clothing1M. The source code will be made available at https://github.com/LanXiaoPang613/SOS.
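The oversampling rule itself is not detailed in the abstract. One plausible reading, replicating indices of the smaller filtered-clean subset so the labeled and unlabeled subsets are drawn at matched rates, might look like the following sketch (`oversample_clean` is a hypothetical helper, not the authors' code):

```python
import numpy as np
from torch.utils.data import DataLoader, Subset

def oversample_clean(dataset, clean_idx, noisy_idx, batch_size=64):
    """Build loaders where the small filtered-clean subset is
    oversampled (indices repeated with replacement) to match the
    size of the unlabeled subset, so label-free samples are fully
    consumed each epoch instead of being truncated."""
    clean_idx = np.asarray(clean_idx)
    noisy_idx = np.asarray(noisy_idx)
    n_clean, n_noisy = len(clean_idx), len(noisy_idx)
    if n_clean < n_noisy:
        extra = np.random.choice(clean_idx, n_noisy - n_clean, replace=True)
        clean_idx = np.concatenate([clean_idx, extra])
    labeled_loader = DataLoader(Subset(dataset, clean_idx.tolist()),
                                batch_size=batch_size, shuffle=True)
    unlabeled_loader = DataLoader(Subset(dataset, noisy_idx.tolist()),
                                  batch_size=batch_size, shuffle=True)
    return labeled_loader, unlabeled_loader
```

With the two subsets balanced this way, each epoch can consume the full unlabeled subset rather than truncating it to the size of the labeled one, which matches the motivation stated in the abstract.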