Quality Grading of Oudemansiella raphanipes Using Three-Teacher Knowledge Distillation with Cascaded Structure for LightWeight Neural Networks
Haoxuan Chen, Huamao Huang, Yangyang Peng et al.
Agriculture, Journal Year: 2025, Volume and Issue: 15(3), P. 301–301
Published: Jan. 30, 2025
Oudemansiella raphanipes is valued for its rich nutritional content and medicinal properties, but traditional manual grading methods are time-consuming and labor-intensive. To address this, deep learning techniques are employed to automate the grading process, and knowledge distillation (KD) is used to enhance the accuracy of a small-parameter model while maintaining low resource occupation and fast response speed on resource-limited devices. This study employs a three-teacher KD framework and investigates three cascaded structures: the parallel model, the standard series model, and the series model with residual connections (residual-series model). The student is the lightweight ShuffleNet V2 0.5x, and the teacher models are VGG16, ResNet50, and Xception. Our experiments show that the cascaded structures yield improved performance indices compared with an ensemble of teachers with equal weights; in particular, the residual-series model outperforms the other models, achieving 99.7% accuracy on the testing dataset with an average inference time of 5.51 ms. The findings of this study have the potential for broader application in environments requiring automated quality grading.
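As a rough illustration of the idea described above, the following minimal sketch (assuming PyTorch, a temperature T = 4, a distillation weight ALPHA = 0.7, and four quality grades; none of these are the paper's reported settings) shows how logits from three teachers could be fused in a residual-series cascade and distilled into a student.

import torch
import torch.nn.functional as F

# Minimal sketch of three-teacher knowledge distillation with a
# residual-series cascade. Temperature T, weight ALPHA, and the
# four-grade output size are assumptions, not the paper's settings.
T, ALPHA, NUM_GRADES = 4.0, 0.7, 4

def residual_series(logits_list):
    """Combine teacher logits in series, each stage adding a residual
    (skip) connection from the previous combined stage."""
    fused = logits_list[0]
    for logits in logits_list[1:]:
        fused = logits + fused          # residual connection to the next stage
    return fused / len(logits_list)     # rescale to a comparable magnitude

def kd_loss(student_logits, teacher_logits, labels):
    """Soft-target KL term plus hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return ALPHA * soft + (1.0 - ALPHA) * hard

# Stand-in logits for the VGG16, ResNet50 and Xception teachers and the
# ShuffleNet V2 0.5x student; in practice these come from forward passes.
batch = 8
teacher_logits = [torch.randn(batch, NUM_GRADES) for _ in range(3)]
student_logits = torch.randn(batch, NUM_GRADES, requires_grad=True)
labels = torch.randint(0, NUM_GRADES, (batch,))

loss = kd_loss(student_logits, residual_series(teacher_logits), labels)
loss.backward()

A plain parallel structure would instead combine the three logit tensors independently (for example, a simple average), whereas the series variants let each stage refine the previous combined output.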
Language: English
Application of a Multi-Teacher Distillation Regression Model Based on Clustering Integration and Adaptive Weighting in Dam Deformation Prediction
Fawang Guo, Jing Yuan, Danyang Li et al.
Water, Journal Year: 2025, Volume and Issue: 17(7), P. 988–988
Published: March 27, 2025
Deformation is a key physical quantity that reflects the safety status of dams. Dam deformation is influenced by multiple factors and has seasonal periodic patterns. Due to the challenges in accurately predicting dam deformation with traditional linear models, deep learning methods have been increasingly applied in recent years. In response to problems such as an excessively long training time, too-high model complexity, and the limited generalization ability of the large number of complex hybrid models in the current research field, we propose an improved multi-teacher distillation network for regression tasks to improve the performance of the model. The teacher network is constructed using a Transformer, which considers global dependencies, while the student network is a Temporal Convolutional Network (TCN). To improve efficiency, we draw on the concept of clustering integration to reduce the number of teacher networks and improve the loss function for regression tasks. We incorporate an adaptive weight module into the network to assign more weight to teachers with more accurate prediction results. Finally, knowledge information is formed based on the differences between the teacher network and the student network. For a concrete-faced rockfill dam located in Guizhou province, China, the results demonstrate that, compared with other methods, this approach exhibits higher accuracy and practicality.
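To make the adaptive-weighting step concrete, here is a small hypothetical sketch (PyTorch; the softmax-over-negative-error weighting, the temperature tau, and the MSE-based loss mix are illustrative assumptions, not the authors' formulation) in which teachers with smaller prediction errors receive larger weights and the student is distilled against the weighted ensemble.

import torch
import torch.nn.functional as F

# Illustrative sketch of adaptively weighted multi-teacher distillation for
# a regression target such as dam deformation. The weighting rule and loss
# mix below are assumptions made for clarity, not the paper's exact design.

def adaptive_teacher_weights(teacher_preds, targets, tau=1.0):
    """Give larger weights to teachers whose predictions are closer to the
    measured deformation: softmax over the negative per-teacher MSE."""
    errors = torch.stack([F.mse_loss(p, targets) for p in teacher_preds])
    return F.softmax(-errors / tau, dim=0)        # one weight per teacher

def distill_regression_loss(student_pred, teacher_preds, targets, alpha=0.5):
    """Blend supervision from measurements with the weighted teacher ensemble."""
    w = adaptive_teacher_weights(teacher_preds, targets)
    ensemble = sum(wi * p for wi, p in zip(w, teacher_preds))
    soft = F.mse_loss(student_pred, ensemble.detach())   # distillation term
    hard = F.mse_loss(student_pred, targets)             # measurement term
    return alpha * soft + (1.0 - alpha) * hard

# Stand-in predictions: three Transformer teachers and one TCN student, each
# predicting one deformation value per sample in the batch.
batch = 16
targets = torch.randn(batch, 1)
teacher_preds = [targets + 0.1 * torch.randn(batch, 1) for _ in range(3)]
student_pred = torch.randn(batch, 1, requires_grad=True)

loss = distill_regression_loss(student_pred, teacher_preds, targets)
loss.backward()

The clustering-integration step described in the abstract would sit before this weighting: similar teachers are grouped and only a reduced set of representatives is kept, lowering the cost of the ensemble.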
Language: English
Class-adaptive attention transfer and multilevel entropy decoupled knowledge distillation
Multimedia Systems, Journal Year: 2025, Volume and Issue: 31(3)
Published: April 15, 2025
Language: English
A Study on the Change of Artistic Creation Styles in the Internet Era in Higher Art Education Based on the Perspective of Network Analysis
Applied Mathematics and Nonlinear Sciences, Journal Year: 2024, Volume and Issue: 9(1)
Published: Jan. 1, 2024
Artistic creation in the network era, examined in terms of artistic style, can be roughly divided into several important stages in time, such as 1985-1995, 1996-2005, 2006-2015, and 2016-present. This paper integrates pedagogical knowledge distillation into an artistic style feature extraction model to investigate the change of artistic creation styles in the network era. Using global features, the color histogram of an art image region is extracted by calculating features on the H, S, and V channels, drawing from both color and texture features. The feature vectors are normalized, assuming that they conform to a Gaussian distribution. The style is quantified using an adaptive weighted Gram matrix to improve the accuracy of classification, and joint features are used to examine the dissemination of artistic styles. Plotting the scatter of chromaticity values L*-b* for all painting samples reveals that the luminance L* of the paintings fluctuates between 21.38 and 79.94, the red/greenness value a* between -8.57 and 1.7, and the yellow/blueness value b* between 1.67 and 16.21. The temporal development characteristics of the artistic styles follow a clear pattern. In a comparison of the eigenvalues under variable parameters, the moment of inertia of the images from the period 1985-1995 rose from 147.698 to 615.965 after noise addition, and there are different degrees of growth in the other periods, but with different magnitudes of growth and obvious differences.
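The abstract names HSV color histograms and an adaptive weighted Gram matrix without giving formulas; the snippet below is a hypothetical illustration of both feature steps on toy data (NumPy; the bin counts and per-channel weights are arbitrary choices, not values from the paper).

import numpy as np

# Hypothetical illustration of the two feature steps named in the abstract:
# (1) an HSV color histogram over an image region, (2) a weighted Gram
# matrix over feature maps as a style descriptor. Bin counts and channel
# weights are arbitrary choices, not values from the paper.

def hsv_histogram(hsv_region, bins=(8, 4, 4)):
    """Normalized joint histogram over the H, S and V channels of a region."""
    h, s, v = hsv_region[..., 0], hsv_region[..., 1], hsv_region[..., 2]
    hist, _ = np.histogramdd(
        np.stack([h.ravel(), s.ravel(), v.ravel()], axis=1),
        bins=bins, range=[(0, 1), (0, 1), (0, 1)],
    )
    return hist.ravel() / hist.sum()

def weighted_gram(features, weights):
    """Adaptive weighted Gram matrix: features has shape (C, H, W); each
    channel is scaled by its weight before taking inner products."""
    c = features.shape[0]
    flat = features.reshape(c, -1) * weights[:, None]
    gram = flat @ flat.T
    return gram / flat.shape[1]          # normalize by number of positions

# Toy data: a random HSV region and a small feature tensor.
region = np.random.rand(64, 64, 3)
feats = np.random.rand(16, 32, 32)
weights = np.random.rand(16)             # in practice adapted per style class

print(hsv_histogram(region).shape)        # (128,) for 8*4*4 bins
print(weighted_gram(feats, weights).shape)  # (16, 16)

The Gram matrix captures correlations between feature channels, a common style descriptor; adaptive weights would let the model emphasize the channels most discriminative for a given period.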
Language: English