Human and environmental feature-driven neural network for path-constrained robot navigation using deep reinforcement learning
Engineering Science and Technology an International Journal, Journal Year: 2025, Issue: 64, P. 101993 - 101993, Published: Feb. 25, 2025, Language: English
Memory-driven deep-reinforcement learning for autonomous robot navigation in partially observable environments
Engineering Science and Technology an International Journal, Journal Year: 2025, Issue: 62, P. 101942 - 101942, Published: Jan. 24, 2025, Language: English
High-Quality Text-to-Image Generation Using High-Detail Feature-Preserving Network
Applied Sciences, Journal Year: 2025, Issue: 15(2), P. 706 - 706, Published: Jan. 13, 2025
Multistage text-to-image generation algorithms have shown remarkable success. However, the images produced often lack detail and suffer from feature loss. This is because these methods mainly focus on extracting features from text, using only conventional residual blocks for post-extraction processing. This results in a loss of features, greatly reducing the quality of the generated images and necessitating more computational resources, which will severely limit their use and application in optical devices such as cameras and smartphones. To address these issues, a novel High-Detail Feature-Preserving Network (HDFpNet) is proposed to effectively generate high-quality, near-realistic images from text descriptions. The initial text-to-image generation (iT2IG) module is used to generate initial feature maps and avoid feature loss. Next, the fast excitation-and-squeeze feature extraction (FESFE) module recursively generates high-detail, feature-preserving features at lower computational cost through three steps: channel excitation (CE), fast feature extraction (FFE), and channel squeeze (CS). Finally, a channel attention (CA) mechanism further enriches the feature details. Compared with the state of the art, experimental results obtained on the CUB-Bird and MS-COCO datasets demonstrate that HDFpNet achieves better performance and visual presentation, especially regarding feature preservation.
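For readers unfamiliar with excitation-and-squeeze style gating, the sketch below illustrates the general pattern the abstract alludes to. It is not the authors' FESFE module: the class name, the reduction ratio, the residual connection, and the single 3x3 convolution standing in for the fast feature extraction step are illustrative assumptions layered on a standard squeeze-and-excitation block in PyTorch.

import torch
import torch.nn as nn

class ChannelExciteSqueezeBlock(nn.Module):
    # Illustrative stand-in for an excitation-and-squeeze feature block (not the paper's FESFE).
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # global spatial pooling: one descriptor per channel
        self.gate = nn.Sequential(
            nn.Linear(channels, max(channels // reduction, 1)),  # channel excitation: compress
            nn.ReLU(inplace=True),
            nn.Linear(max(channels // reduction, 1), channels),  # channel squeeze: restore width
            nn.Sigmoid(),                                        # per-channel gates in [0, 1]
        )
        # A single 3x3 convolution as a cheap feature-extraction step (assumption).
        self.extract = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        g = self.gate(self.pool(x).view(b, c)).view(b, c, 1, 1)
        # The residual path keeps the incoming features intact (the feature-preserving idea).
        return x + self.extract(x) * g

A block like this could be applied recursively to intermediate feature maps; the gating adds only two small linear layers per stage, which keeps the overhead modest.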
Language: English
Web-Based Real-Time Alarm and Teleoperation System for Autonomous Navigation Failures Using ROS 1 and ROS 2
Actuators, Journal Year: 2025, Issue: 14(4), P. 164 - 164, Published: Mar. 26, 2025
This paper presents an alarm system and teleoperation control framework, comparing ROS 1 and ROS 2 within a local network, to mitigate the risk of robots failing to reach their goals during autonomous navigation. Such failures can occur when the robot moves through irregular terrain, becomes stuck on small steps, or approaches walls and obstacles without maintaining a safe distance. These issues may arise due to a combination of technical, environmental, and operational factors, including inaccurate sensor data, blind spots, localization errors, infeasible path planning, and an inability to adapt to unexpected obstacles. The system integrates a web-based graphical interface, developed using frontend frameworks, and a joystick for real-time monitoring of the robot's localization, velocity, and proximity to obstacles. The robot is equipped with RGB-D and tracking cameras, 2D LiDAR, and odometry sensors, providing detailed environmental data. The system provides sensory feedback through visual alerts on the web interface and joystick vibration when the robot approaches walls, faces potential collisions with objects, or loses stability. The system is evaluated in both simulation (Gazebo) and real-world experiments, where latency is measured and performance is assessed for ROS 1 and ROS 2. The results demonstrate that both systems operate effectively in real time, ensuring safety and enabling timely operator intervention. ROS 2 offers lower latency for LiDAR inputs, making it advantageous over ROS 1. However, its camera latency is higher, suggesting the need for optimizations in image data processing. Additionally, the platform supports the integration of additional sensors and applications based on user requirements.
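The abstract describes LiDAR-based proximity alerts that are forwarded to a web interface and a joystick. The snippet below is a minimal ROS 2 (rclpy) sketch of that idea, not the authors' code: the node name, topic names, and the 0.4 m safety threshold are assumptions, and the published alert would still need to be relayed to the browser, for example through a rosbridge-style web bridge.

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from std_msgs.msg import String

class ProximityAlarmNode(Node):
    # Publishes a text alert whenever any valid LiDAR range drops below a safety threshold.
    def __init__(self):
        super().__init__('proximity_alarm')
        self.safe_distance = 0.4  # metres; hypothetical threshold
        self.alert_pub = self.create_publisher(String, 'alarm/alerts', 10)
        self.create_subscription(LaserScan, 'scan', self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        if valid and min(valid) < self.safe_distance:
            alert = String()
            alert.data = f'Obstacle at {min(valid):.2f} m, below the safe distance'
            self.alert_pub.publish(alert)

def main():
    rclpy.init()
    rclpy.spin(ProximityAlarmNode())
    rclpy.shutdown()

if __name__ == '__main__':
    main()

A ROS 1 counterpart would use rospy with the equivalent Publisher and Subscriber calls, which is what makes a side-by-side latency comparison of the two stacks straightforward.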
Language: English