2021 International Conference on Decision Aid Sciences and Application (DASA), Journal Year: 2020, Volume and Issue: unknown, P. 238 - 244, Published: Nov. 8, 2020
Business processes indicate how to handle and deal with different business situations. Companies focus on the flexibility of their information technology architecture as the main strategy for decision making, so their systems should satisfy adaptability criteria. In this paper, we propose a framework to measure the quality of an existing workflow by using services. Based on the results of the measurements, an enhancement can be obtained using concepts from graph theory, such as min-cut and max-cut algorithms, together with some re-factoring techniques. To reduce bugs that could occur and to reach a long lifespan for the organization's workflow, we proposed, designed, and implemented a new mainframe that takes the workflow as input and checks whether it meets specific measurements. If it is not qualified, it will be enhanced automatically by our proposed framework to meet the consistency rules within the system. The framework's components are measurement, determination, parsing, and code adaptation. We presented and discussed a real case study to help in understanding, illustrating, and analyzing the behavior of the approach and its applicability, in addition to related scenarios.
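The abstract invokes min-cut concepts from graph theory as the basis for workflow enhancement. The paper's own framework is not reproduced here, but as background, a minimal sketch of how a minimum cut is computed (via Edmonds-Karp max-flow on a hypothetical capacity matrix; all names and the example graph are illustrative) looks like this:

```python
from collections import deque

def max_flow_min_cut(capacity, source, sink):
    """Edmonds-Karp max flow; returns (flow value, source side of a min cut)."""
    n = len(capacity)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = [-1] * n
        parent[source] = source
        q = deque([source])
        while q and parent[sink] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[sink] == -1:
            break                      # no augmenting path left
        # Bottleneck residual capacity along the found path.
        bottleneck = float("inf")
        v = sink
        while v != source:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v] - flow[u][v])
            v = u
        # Augment flow along the path.
        v = sink
        while v != source:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck
    # Nodes still reachable from the source form the min cut's source side.
    reachable = {source}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in range(n):
            if v not in reachable and capacity[u][v] - flow[u][v] > 0:
                reachable.add(v)
                q.append(v)
    return total, reachable

cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
total, source_side = max_flow_min_cut(cap, 0, 3)  # total == 5, source_side == {0}
```

By the max-flow min-cut theorem, the returned flow value equals the capacity of the minimum cut, and the residual-reachable nodes give its source side — a natural place to split a workflow graph.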
ACM Transactions on Storage, Journal Year: 2023, Volume and Issue: 19(4), P. 1 - 22, Published: July 7, 2023
More and more data are stored in cloud storage, which brings two major challenges. First, modified files should be quickly synchronized to ensure consistency; for example, delta synchronization (sync) achieves efficient sync by synchronizing only the updated part of a file. Second, the huge volume of data needs to be deduplicated and encrypted; for example, Message-Locked Encryption (MLE) implements deduplication by encrypting the content consistently among different users. However, when the two are combined, a few updates can cause large traffic amplification for both the keys and the ciphertext in MLE-based sync, significantly degrading its efficiency. A feature-based encryption sync scheme, FeatureSync, was proposed to address this problem. However, with the further improvement of network bandwidth, the performance of FeatureSync stagnates. In our preliminary experimental evaluations, we find that the computational overhead in high-bandwidth environments is the main bottleneck of FeatureSync. In this article, we propose an enhanced feature-based encryption sync scheme, FASTSync, to optimize the performance in high-bandwidth environments. The evaluations on a lightweight prototype implementation show that FASTSync reduces the sync time by 70.3% and 37.3%, on average, compared with FeatureSync.
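As a rough illustration of the delta-sync idea discussed above (synchronizing only the updated part of a file), the following sketch compares fixed-size block hashes and ships only the blocks that changed. The block size, function names, and protocol are illustrative assumptions, not the article's scheme; FeatureSync and FASTSync additionally handle MLE-style encryption and feature grouping, which this toy omits:

```python
import hashlib

BLOCK = 4  # toy block size; real systems use KiB-scale blocks

def block_hashes(data: bytes) -> list:
    # One digest per fixed-size block.
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def delta(old: bytes, new: bytes) -> list:
    # Return (block_index, block_bytes) pairs that must be uploaded.
    old_h = block_hashes(old)
    changed = []
    for j, i in enumerate(range(0, len(new), BLOCK)):
        block = new[i:i + BLOCK]
        if j >= len(old_h) or hashlib.sha256(block).hexdigest() != old_h[j]:
            changed.append((j, block))
    return changed

def apply_delta(old: bytes, new_len: int, changed: list) -> bytes:
    # Server-side reconstruction: patch the changed blocks into the old copy.
    blocks = [old[i:i + BLOCK] for i in range(0, len(old), BLOCK)]
    for j, block in changed:
        while j >= len(blocks):
            blocks.append(b"")
        blocks[j] = block
    return b"".join(blocks)[:new_len]

old = b"aaaabbbbcccc"
new = b"aaaaXXXXcccc"
patch = delta(old, new)   # only the middle block differs
assert apply_delta(old, len(new), patch) == new
```

A few changed bytes thus cost one block of upload instead of the whole file; the traffic amplification the article describes appears when each such block also forces new MLE keys and ciphertext to be redistributed.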
Cloud computing is widely used across industries due to benefits such as scalability, availability, and powerful resource integration. Though it is based on a cost model, it became affordable thanks to virtualization technology. However, virtualization has caused many security issues, both at the Hypervisor (HV) level and at the Virtual Machine (VM) level. Security issues arise from vulnerabilities that are exploited by adversaries to launch different kinds of attacks, leading to deterioration of the Quality of Service (QoS) in cloud computing. Apart from virtualization-related issues, data communication challenges are also evidenced. This paper focuses on the countermeasures associated with virtualization. Also, the proposed research work provides useful insights into the current state of the art to find the challenges to be addressed with respect to three attacks: the VM side-channel attack, which involves a class of shared usage of hardware among VMs; hypervisor attacks, which occur through compromised VMs; and attacks targeting live migration by exploiting dynamic allocation schemes. The findings of this work motivate further investigation into these specific attacks to improve the state of the art.
IEEE Transactions on Parallel and Distributed Systems, Journal Year: 2022, Volume and Issue: unknown, P. 1 - 1, Published: Jan. 1, 2022
Delta sync (synchronization) is a key bandwidth-saving technique for cloud storage services. The representative delta sync utility, rsync, matches data chunks by sliding a search window byte-by-byte to maximize the redundancy detection for bandwidth efficiency. However, it is difficult for this process to cater to forthcoming high-bandwidth cloud storage services, which require a lightweight sync approach that can also support large files well. Moreover, rsync employs invariant chunking and compression methods during the sync process, making it unable to learn from various network environments and perform well under different conditions. Inspired by the Content-Defined Chunking (CDC) technique used in deduplication, we propose NetSync, an adaptive CDC-based delta sync approach with less computing and protocol (metadata) overhead than state-of-the-art approaches. Besides, NetSync can choose appropriate chunking and compressing strategies. The idea of NetSync is to (1) simplify the chunk matching by proposing a fast weak hash called FastFP that is piggybacked on the rolling hashes of CDC, and by redesigning the matching process to exploit deduplication locality and weak/strong hash properties; and (2) minimize the sync time by adaptively choosing the parameters according to the current network conditions. Our evaluation results, driven by both benchmark and real-world datasets, suggest that NetSync performs 2×–10× faster and supports 30%–80% more clients than the rsync-based WebR2sync+ and the deduplication-based approach.
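The Content-Defined Chunking (CDC) idea that NetSync builds on can be sketched briefly: a rolling (gear-style) hash is updated byte by byte, and a chunk boundary is declared whenever its low bits are all zero, so boundaries depend on content rather than position. The table seed, mask, and size limits below are arbitrary toy choices, and this is generic CDC, not NetSync's FastFP:

```python
import random

# 256-entry table of random 64-bit values, one per possible byte value.
_rnd = random.Random(42)
GEAR = [_rnd.getrandbits(64) for _ in range(256)]
MASK = 0x3F          # 6 zero bits -> expected chunk size around 64 bytes

def cdc_chunks(data: bytes, min_size: int = 16, max_size: int = 256) -> list:
    """Split data into content-defined chunks using a gear rolling hash."""
    chunks, start, h = [], 0, 0
    for i, byte in enumerate(data):
        # Gear hash: the shift ages out old bytes, the table adds the new one.
        h = ((h << 1) + GEAR[byte]) & 0xFFFFFFFFFFFFFFFF
        size = i + 1 - start
        if (size >= min_size and (h & MASK) == 0) or size >= max_size:
            chunks.append(data[start:i + 1])
            start, h = i + 1, 0
    if start < len(data):
        chunks.append(data[start:])   # trailing partial chunk
    return chunks
```

Because boundaries are chosen by content, inserting bytes near the front of a file shifts only the nearby chunks while later boundaries realign — which is what makes CDC-based matching cheaper than rsync's byte-by-byte sliding window.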
Security and Communication Networks, Journal Year: 2019, Volume and Issue: 2019, P. 1 - 16, Published: Dec. 18, 2019
The lifeblood of every organization is its confidential information. The accentuation on cybersecurity has expanded considerably over the course of the last few years because of the number of attacks at the individual and even state level. One specific zone of consideration is the assurance of security in the nuclear sector, which may relate to both Instrumentation and Control (I&C) and Information Technology (IT). The present measures are insufficient for securing such information because of the lack of its identification, classification, and securing (owing to its multifaceted nature). With the increasing trend of data storage and management with the assistance of the cloud, confidentiality threats are immensely increasing. As there is no safeguard that can make our systems a hundred percent secure, the best approach is to provide security in distinct layers. The basic purpose of layered security is the benefit that if one layer fails or is compromised, another layer compensates and keeps access control in the owner's hand. In this paper, we propose a multilevel protection-based scheme for cloud computing using the Modular Encryption Standard (MES). We secure the cloud framework and further enhance it by utilizing a multicloud modular approach. The results obtained by performing simulations depict that the proposed scheme works more efficiently than the commonly used schemes.
Journal of Control and Decision, Journal Year: 2022, Volume and Issue: 10(4), P. 494 - 503, Published: Aug. 25, 2022
Because of the on-demand servicing and scalability features of cloud computing, security and confidentiality have turned into key concerns. Maintaining transaction information on third-party servers carries significant dangers, as malicious individuals try to gain illegal access to the data architecture. This research proposes a security-aware data transfer scheme for the cloud based on the blowfish algorithm (BFA) to address the issue. The user is verified initially with an identification, which is separately imported using a pattern matching technique. Further, the BFA is utilised to encrypt the data and save it in the cloud. This can safeguard and streamline the proof, so that a client cannot retrieve the data without verification, which makes the environment secure. The suggested approach's performance is evaluated with several metrics, including encryption time, decryption time, memory utilisation, and runtime. Compared with the existing methodology, the investigational findings clearly show that the proposed method takes the least time for encryption.
Cloud computing provides flexible on-demand data outsourcing services for individuals and organizations to store their data in a server. The security of cloud storage is guaranteed through providing confidentiality. Cloud computing has several benefits, but a major risk is providing security for the user's data, since the infrastructures are vulnerable to various threats. In order to improve secure data storage, an Orchini Similarity Authentication based Streebog Hashing Secured Data Storage (OSA-SHSDS) mechanism is introduced. The main aim of OSA-SHSDS is to authenticate the user with a higher confidentiality rate and lesser space complexity. In the server, the user's personal details are to be registered. After registering, the server generates an ID and password for every registered user. Whenever a user wants to access the data, he or she has to log in for authentication. The server verifies whether the user is authenticated or not using the Orchini similarity measure. After authentication, the user is allowed to access the server. This helps to increase the authentication accuracy and minimize the computation time. The experimental evaluation against existing methods is performed using the Amazon dataset with metrics such as authentication accuracy and computation time. The observed results reveal that the proposed mechanism attains higher accuracy with minimum time as well as lower space complexity than other state-of-the-art methods.
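The registration/login flow described above can be sketched as follows. Streebog (GOST R 34.11-2012) is not available in Python's standard library, so SHA-256 stands in for the hash, and the Orchini similarity check is not modeled; the class and method names are illustrative, not from the paper:

```python
import hashlib
import hmac
import os

class AuthServer:
    """Toy registration/login flow: the server stores only salted digests."""

    def __init__(self):
        self.users = {}  # user_id -> (salt, digest)

    def register(self, user_id: str, password: str) -> None:
        # A fresh random salt per user prevents identical passwords
        # from producing identical stored digests.
        salt = os.urandom(16)
        digest = hashlib.sha256(salt + password.encode()).digest()
        self.users[user_id] = (salt, digest)

    def login(self, user_id: str, password: str) -> bool:
        if user_id not in self.users:
            return False
        salt, digest = self.users[user_id]
        candidate = hashlib.sha256(salt + password.encode()).digest()
        # Constant-time comparison avoids timing side channels.
        return hmac.compare_digest(candidate, digest)
```

Storing only salted digests means a leaked user table does not directly reveal passwords, and verification reduces to recomputing the digest and comparing in constant time.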
Caderno Pedagógico, Journal Year: 2024, Volume and Issue: 21(8), P. e6539 - e6539, Published: Aug. 8, 2024
Introduction: Python is widely used in the development of analytical dashboards due to its versatility, ease of use, and vast library of tools. Dashboards are visual tools that organize and display data, facilitating quick and efficient analyses. Objective: This work investigates the impact of advanced data visualization techniques in dashboards on the usefulness perceived by users and on the results obtained by organizations. The research seeks to understand how these techniques influence the effectiveness of dashboards in business decisions, as well as the technical challenges in creating such dashboards: the integration of different data sources, the choice of adequate libraries, and the optimization of performance for large volumes of data. Method: Using a mixed methodology combining qualitative and quantitative approaches, 102 programmers were interviewed, resulting in a final sample of 93 participants. Authors such as Zhang (2020), Marques et al. (2020), and Alasiri and Salameh were fundamental to the analysis. Results: Significant challenges were identified, including the integration of several data sources and the selection of appropriate libraries, underlining the importance of good practices and the careful management of technological choices. The analysis of the results also made it possible to understand the need for accessibility: analytical dashboards are currently not easily accessible to people with special needs. Conclusions: The study concluded that the use of advanced methods significantly increases the capacity of organizations to make informed strategic decisions, highlighting the relevance of these techniques in today's competitive corporate scenario. In addition, it suggests an inclusive approach for future research.