International Journal of Neural Systems, Journal Year: 2024, Volume and Issue: 34(05), Published: Feb. 18, 2024
Classifying images has become a straightforward and accessible task, thanks to the advent of Deep Neural Networks. Nevertheless, not much attention is given to the privacy concerns associated with sensitive data contained in images. In this study, we propose a solution to this issue by exploring an intersection between Machine Learning and cryptography. In particular, Fully Homomorphic Encryption (FHE) emerges as a promising solution, as it enables computations to be performed on encrypted data. We therefore propose a Residual Network implementation based on FHE which allows the classification of images, ensuring that only the user can see the result. We suggest a circuit that reduces the memory requirements by more than [Formula: see text] compared to the most recent works, while maintaining a high level of accuracy and a short computational time. We implement it using the well-known Cheon–Kim–Kim–Song (CKKS) scheme, which enables approximate computations. We evaluate the results from three perspectives: memory requirements, time of calculations and precision. We demonstrate that it is possible to evaluate ResNet20 in less than five minutes on a laptop with approximately 15 GB of memory, achieving 91.67% accuracy on the CIFAR-10 dataset, almost equivalent to the plain model (92.60%).
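Because CKKS only supports additions and multiplications, FHE-friendly networks of this kind typically replace non-polynomial activations such as ReLU with low-degree polynomial approximations. The sketch below is only a generic illustration of that idea in Python; the degree, interval, and fitting method are illustrative assumptions, not the approximation used in the paper.

# Hypothetical illustration: approximate ReLU by a low-degree polynomial so it
# can be evaluated under CKKS. Degree and input range are illustrative choices.
import numpy as np

def fit_relu_polynomial(degree=4, bound=5.0, samples=1001):
    """Least-squares polynomial approximation of ReLU on [-bound, bound]."""
    x = np.linspace(-bound, bound, samples)
    coeffs = np.polyfit(x, np.maximum(x, 0.0), degree)
    return np.poly1d(coeffs)

poly_relu = fit_relu_polynomial()
x = np.linspace(-5, 5, 11)
print(np.round(poly_relu(x), 3))        # approximate activations
print(np.round(np.maximum(x, 0.0), 3))  # exact ReLU, for comparison

Under encryption, only the polynomial's additions and multiplications would be executed on ciphertexts; the fit itself is done once in the clear.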
Proceedings of the IEEE, Journal Year: 2022, Volume and Issue: 110(10), P. 1572 - 1609, Published: Oct. 1, 2022
Data privacy concerns are increasing significantly in the context of the Internet of Things, cloud services, edge computing, artificial intelligence applications, and other applications enabled by next-generation networks. Homomorphic encryption addresses these challenges by enabling multiple operations to be performed on encrypted messages without decryption. This article comprehensively addresses homomorphic encryption from both theoretical and practical perspectives. It delves into the mathematical foundations required to understand fully homomorphic encryption (FHE). It consequently covers the design fundamentals and security properties of FHE and describes the main FHE schemes based on various mathematical problems. On a more practical level, this article presents a view of privacy-preserving machine learning using homomorphic encryption and then surveys FHE at length from an engineering angle, covering its potential application in fog computing services. It also provides a comprehensive analysis of existing state-of-the-art FHE libraries and tools, implemented in software and hardware, and the performance thereof.
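To make the "operations on encrypted messages without decryption" property concrete, the following toy Paillier-style additively homomorphic scheme shows how multiplying two ciphertexts adds the underlying plaintexts. It is a minimal sketch with insecure toy parameters, not code from the survey or any production library.

# Toy Paillier scheme (additively homomorphic). The primes are far too small
# for real security; this only demonstrates the homomorphic property.
import math
import random

p, q = 1009, 1013                     # toy primes (insecure)
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                  # valid because we fix g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:        # randomizer must be invertible mod n
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

c1, c2 = encrypt(17), encrypt(25)
c_sum = (c1 * c2) % n2                # multiplying ciphertexts adds plaintexts
assert decrypt(c_sum) == 42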
Homomorphic encryption (HE) enables the secure offloading of computations to the cloud by providing computation on encrypted data (ciphertexts). HE is based on noisy encryption schemes in which noise accumulates as more computations are applied to the data. The limited number of operations applicable to the data prevents practical applications from exploiting HE. Bootstrapping enables an unlimited number of operations, or fully homomorphic encryption (FHE), by refreshing the ciphertext. Unfortunately, bootstrapping requires a significant amount of additional computation and memory bandwidth as well. Prior works have proposed hardware accelerators for computational primitives of FHE. However, to the best of our knowledge, this is the first work to propose an FHE accelerator that supports bootstrapping as a first-class citizen.
Blockchains, Journal Year: 2025, Volume and Issue: 3(1), P. 1 - 1, Published: Jan. 1, 2025
Federated learning (FL) has emerged as an efficient machine learning (ML) method with crucial privacy protection features. It is adapted for training models in Internet of Things (IoT)-related domains, including smart healthcare systems (SHSs), where the introduction of IoT devices and technologies can give rise to various security concerns. However, since FL cannot solely address all these challenges, privacy-enhancing technologies (PETs) and blockchain are often integrated to enhance FL frameworks within SHSs. Critical questions remain regarding how these technologies are combined and how they contribute to enhancing privacy. This survey addresses these questions by investigating recent advancements in the combination of FL, PETs, and blockchain in smart healthcare. First, this survey emphasizes the integration of PETs into the FL context. Second, addressing the challenge of integrating blockchain with FL, it examines three main technical dimensions, such as blockchain-enabled model storage, model aggregation, and gradient upload frameworks. It further explores how these technologies collectively ensure the integrity and confidentiality of data, highlighting their significance in building a trustworthy SHS that safeguards sensitive patient information.
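As a point of reference for the model-aggregation dimension mentioned above, the snippet below sketches a plain FedAvg-style weighted average of client updates. It is a generic illustration in Python with hypothetical client data, not the survey's blockchain-enabled aggregation frameworks.

# Minimal FedAvg-style aggregation step (generic sketch, illustrative data).
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Weighted average of client model parameters (one array per client)."""
    total = sum(client_sizes)
    stacked = np.stack(client_weights)                    # shape: (clients, params)
    coeffs = np.array(client_sizes, dtype=float) / total  # weights by local data size
    return (coeffs[:, None] * stacked).sum(axis=0)

# Three hypothetical clients with different amounts of local data.
updates = [np.array([0.1, 0.2]), np.array([0.3, 0.1]), np.array([0.2, 0.4])]
sizes = [100, 300, 600]
print(fed_avg(updates, sizes))   # aggregated global model parameters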
Homomorphic Encryption (HE) is one of the most promising post-quantum cryptographic schemes that enable privacy-preserving computation on servers. However, noise accumulates as we perform operations on HE-encrypted data, restricting the number of possible operations. Fully HE (FHE) removes this restriction by introducing the bootstrapping operation, which refreshes the data; however, FHE schemes are highly memory-bound. Bootstrapping, in particular, requires loading GBs of evaluation keys and plaintexts from off-chip memory, which makes FHE acceleration fundamentally bottlenecked by the off-chip memory bandwidth. In this paper, we propose ARK, an Accelerator for FHE with Runtime data generation and inter-operation Key reuse. ARK enables practical FHE workloads with a novel algorithm-architecture co-design to accelerate bootstrapping. We first eliminate the off-chip bandwidth bottleneck through runtime data generation and inter-operation key reuse. This approach fully exploits on-chip memory by substantially reducing the size of the working set. On top of such algorithmic enhancements, we build a microarchitecture that minimizes data movement through an efficient, alternating data distribution policy based on the access patterns and a streamlined dataflow organization of the tailored functional units – including base conversion, number-theoretic transform, and automorphism units. Overall, our co-design effectively handles the heavy computation and data movement overheads of FHE, drastically reducing the cost of its key operations.
IEEE Access, Journal Year: 2022, Volume and Issue: 10, P. 117477 - 117500, Published: Jan. 1, 2022
Outsourced computation for neural networks allows users access to state-of-the-art models without investing in specialized hardware and know-how. The problem is that the users lose control over potentially privacy-sensitive data. With homomorphic encryption (HE), a third party can perform computation on encrypted data without revealing its content. In this paper, we reviewed scientific articles and publications in the particular area of Deep Learning Architectures for Privacy-Preserving Machine Learning (PPML) with Fully Homomorphic Encryption. We analyzed the changes to neural network architectures that make them compatible with HE and how these changes impact performance. Next, we identify numerous challenges in HE-based privacy-preserving deep learning, such as computational overhead, usability, and limitations posed by the encryption schemes. Furthermore, we discuss potential solutions to these PPML challenges. Finally, we propose evaluation metrics that allow for a better and more meaningful comparison of PPML solutions.
Computers & Security, Journal Year: 2023, Volume and Issue: 137, P. 103605 - 103605, Published: Nov. 29, 2023
The wide adoption of Machine Learning to solve a large set of real-life problems came with the need to collect and process large volumes of data, some of which are considered personal and sensitive, raising serious concerns about data protection. Privacy-enhancing technologies (PETs) are often indicated as a solution to protect data and to achieve the general trustworthiness required by current EU regulations on data protection and AI. However, an off-the-shelf application of PETs is insufficient to ensure high-quality data protection, which one first needs to understand. This work systematically discusses the risks against data protection in modern Machine Learning systems, taking the original perspective of the data owners, who are those who hold the various data sets, the models, or both, throughout the machine learning life cycle and considering different deployment architectures. It argues that the origin of the threats and the level of protection offered depend on the processing phase, the role of the parties involved, and the architecture where the systems are deployed. By offering a framework in which to discuss privacy and confidentiality risks for data owners, and by identifying and assessing privacy-preserving countermeasures for machine learning, this work could facilitate the discussion about compliance with such regulations and directives. We also outline the challenges and research questions that are still unsolved in the field. In this respect, this paper provides researchers and developers working on machine learning with a comprehensive body of knowledge to let them advance the science in this field, as well as in closely related fields such as Artificial Intelligence.
IEEE Transactions on Information Forensics and Security, Journal Year: 2023, Volume and Issue: 18, P. 2175 - 2187, Published: Jan. 1, 2023
Inference of machine learning models with data privacy guarantees has been widely studied as privacy concerns are getting growing attention from the community. Among others, secure inference based on Fully Homomorphic Encryption (FHE) has proven its utility by providing stringent privacy guarantees at a sometimes affordable cost. Still, previous work was restricted to shallow and narrow neural networks and simple tasks due to the high computational cost incurred by FHE. In this paper, we propose a more efficient way of evaluating convolutions in FHE, where the cost remains constant regardless of the kernel size, resulting in a 12–46× timing improvement for various kernel sizes. Combining our methods with FHE bootstrapping, we achieve at least an 18.9% (and 48.1%) reduction in the homomorphic evaluation time of 20-layer CNN classifiers (and part of it) on the CIFAR-10/100 (and ImageNet, respectively) datasets. Furthermore, in consideration of our method being particularly effective for CNNs with intensive convolutional operations, we explore such CNNs, obtaining inference that is 5× faster than prior works of the same or less accuracy.
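For context on why kernel size normally matters, the sketch below simulates in plaintext the standard rotate-and-multiply formulation of a packed convolution, where each kernel tap costs one ciphertext rotation plus one SIMD multiplication. This is the conventional baseline such work improves upon, not the paper's optimized algorithm; np.roll stands in for a homomorphic rotation.

# Plaintext simulation of a packed, HE-style circular cross-correlation:
# one rotation and one pointwise multiplication per kernel tap.
import numpy as np

def simd_conv1d(packed, kernel):
    acc = np.zeros_like(packed, dtype=float)
    for offset, weight in enumerate(kernel):
        acc += weight * np.roll(packed, -offset)   # rotate, scale, accumulate
    return acc

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
k = np.array([0.25, 0.5, 0.25])
print(simd_conv1d(x, k))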
Proceedings on Privacy Enhancing Technologies, Journal Year: 2023, Volume and Issue: 2023(1), P. 325 - 342, Published: Jan. 1, 2023
Privacy-preserving solutions enable companies to offload confidential data to third-party services while fulfilling government regulations. To accomplish this, they leverage various cryptographic techniques such as Homomorphic Encryption (HE), which allows performing computation on encrypted data. Most HE schemes work in a SIMD fashion, and the packing method can dramatically affect the running time and memory costs. Finding a packing that leads to an optimal, performant implementation is a hard task. We present a simple and intuitive framework that abstracts the packing decision for the user. We explain its underlying data structures and optimizer, and propose a novel algorithm for 2D convolution operations. We used this framework to implement an HE-friendly version of AlexNet, which runs in three minutes, several orders of magnitude faster than other state-of-the-art solutions that only use HE.
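To illustrate what a packing decision looks like, the snippet below simulates in plaintext the classic diagonal packing for an encrypted matrix-vector product, where the matrix is laid out by generalized diagonals so the product takes n rotations and n SIMD multiplications. This is one well-known packing of the kind such a framework reasons about, not necessarily the layout its optimizer would choose.

# Plaintext simulation of the diagonal-packing matrix-vector product.
import numpy as np

def diagonal_matvec(A, v):
    n = len(v)
    result = np.zeros(n)
    for i in range(n):
        diag = np.array([A[j, (j + i) % n] for j in range(n)])  # i-th generalized diagonal
        result += diag * np.roll(v, -i)                          # rotation + SIMD multiply
    return result

A = np.arange(16.0).reshape(4, 4)
v = np.array([1.0, 2.0, 3.0, 4.0])
assert np.allclose(diagonal_matvec(A, v), A @ v)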
Journal of Network and Computer Applications, Journal Year: 2024, Volume and Issue: 231, P. 103989 - 103989, Published: Aug. 2, 2024
The metaverse is a nascent concept that envisions a virtual universe, a collaborative space where individuals can interact, create, and participate in a wide range of activities. Privacy is a critical concern as the metaverse evolves and immersive experiences become more prevalent. The metaverse privacy problem refers to the challenges and concerns surrounding the personal information and data shared within Virtual Reality (VR) environments as VR becomes more accessible. The metaverse will harness advancements from various technologies such as Artificial Intelligence (AI), Extended Reality (XR) and Mixed Reality (MR) to provide personalized services to its users. Moreover, to enable such experiences, it relies on the collection of fine-grained user data, which leads to privacy issues. Therefore, before the potential of the metaverse can be fully realized, these privacy-related concerns must be addressed. This includes safeguarding users' control over their data, ensuring the security of their information, and protecting in-world actions and interactions from unauthorized sharing. In this paper, we explore the future privacy challenges that metaverses are expected to face, given their reliance on AI for tracking users, creating XR and MR experiences, and facilitating interactions. We thoroughly analyze technical solutions such as differential privacy, Homomorphic Encryption, and Federated Learning, and discuss sociotechnical issues regarding privacy.
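Of the technical solutions named above, differential privacy is the simplest to show in a few lines. The sketch below applies the Laplace mechanism to a hypothetical count of users who performed some in-world action; the epsilon and data are illustrative values only, not drawn from the paper.

# Minimal Laplace-mechanism sketch (illustrative parameters and data).
import numpy as np

def dp_count(values, epsilon=0.5, sensitivity=1.0):
    """Differentially private count: true count plus Laplace(sensitivity/epsilon) noise."""
    true_count = float(np.sum(values))
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

interacted = np.array([1, 0, 1, 1, 0, 1, 1])   # did each user perform the action?
print(dp_count(interacted))                    # noisy, privacy-preserving count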
ACM Computing Surveys, Journal Year: 2024, Volume and Issue: 56(12), P. 1 - 32, Published: July 5, 2024
Fully Homomorphic Encryption (FHE) is a key technology enabling privacy-preserving computing. However, the fundamental challenge of FHE is its inefficiency, due primarily to the underlying polynomial computations with high computation complexity and extremely time-consuming ciphertext maintenance operations. To tackle this challenge, various FHE accelerators have recently been proposed by both research and industrial communities. This article takes the first initiative to conduct a systematic study of 14 FHE accelerators: cuHE/cuFHE, nuFHE, HEAT, HEAX, HEXL, HEXL-FPGA, 100×, F1, CraterLake, BTS, ARK, Poseidon, FAB, and TensorFHE. We make our observations on the evolution trajectory of these existing accelerators and establish a qualitative connection between them. Then, we perform testbed evaluations of representative open-source accelerators to provide a quantitative comparison of them. Finally, with the insights learned from both studies, we discuss potential directions to inform the future design and implementation of FHE accelerators.