OBM Neurobiology, Journal Year: 2024, Volume and Issue: 08(03), P. 1 - 16, Published: Aug. 19, 2024
Nowadays, mental health disorders have become a significant concern for everyone. There are diverse reasons emanating from the workplace, business, and the everyday environment. Therefore, there is a current need to use technology to detect and review their symptoms and causes. Accordingly, in this study, the researcher attempted to recognize ChatGPT's role in decision-making and the recognition of mental health disorders among Egyptian entrepreneurs. The study used a quantitative approach and based its findings on 332 valid samples. The study's results, obtained through path analysis using Analysis of Moment Structures (AMOS), confirmed the positive effect of user perception of ChatGPT and trust in it on the process of recognizing these disorders. On the other hand, a negative effect on recognition was also found, and the study's results demonstrate that this hurts the process. This study can assist in the development of policies that improve available digital services, such as ChatGPT or other AI tools, and increase societal awareness. Moreover, by providing empirical evidence from entrepreneurs in a developing country context, the findings contribute to the existing psychology, technology, and management literature.
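The path analysis above was run in AMOS; as a rough orientation only, a recursive path model of this kind can be approximated in Python as a chain of OLS regressions. The sketch below is a hypothetical illustration: the column names (user_perception, trust, recognition) and the input file are assumptions, not the study's actual variables or data.

```python
# Sketch only: approximating a recursive path model (estimated in AMOS in the
# study) as a chain of OLS regressions. Column names and the input file are
# hypothetical placeholders, not the study's variables or data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_responses.csv")  # assumed: 332 rows of survey scores

# Path a: user perception of ChatGPT -> trust in ChatGPT
m_trust = smf.ols("trust ~ user_perception", data=df).fit()

# Paths b and c: perception and trust -> recognition of mental health disorders
m_recog = smf.ols("recognition ~ user_perception + trust", data=df).fit()

print(m_trust.params)    # unstandardized path coefficients
print(m_recog.params)
print(m_trust.rsquared, m_recog.rsquared)
```

Dedicated SEM software such as AMOS additionally reports model-fit indices; the regressions here only recover the path coefficients for a fully recursive model.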
Frontiers in Artificial Intelligence, Journal Year: 2024, Volume and Issue: 7, Published: June 18, 2024
The release of GPT-4 has garnered widespread attention across various fields, signaling the impending adoption and application of Large Language Models (LLMs). However, previous research has predominantly focused on the technical principles of ChatGPT and its social impact, overlooking its effects on human–computer interaction and user psychology. This paper explores the multifaceted impacts of ChatGPT on human–computer interaction, user psychology, and society through a literature review. The author investigates ChatGPT's technical foundation, including the Transformer architecture and the RLHF (Reinforcement Learning from Human Feedback) process, which enable it to generate human-like responses. In terms of human–computer interaction, the paper studies the significant improvements GPT models bring to conversational interfaces. The analysis extends to psychological impacts, weighing ChatGPT's potential to mimic human empathy and support learning against the risks of reduced interpersonal connections. In commercial domains, the paper discusses applications in customer service and related services, highlighting efficiency gains alongside challenges such as privacy issues. Finally, it offers predictions and recommendations for future development directions and their impact on human–AI relationships.
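The abstract names the Transformer architecture as part of ChatGPT's foundation. For orientation only, the sketch below implements scaled dot-product attention, the core Transformer operation, in plain NumPy; the shapes and random inputs are illustrative and unrelated to any model discussed in the paper.

```python
# Minimal sketch of scaled dot-product attention, the core Transformer
# operation referenced in the abstract. Shapes and inputs are illustrative.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d_k) arrays; returns attention-weighted values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # scaled pairwise similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V                                 # weighted mix of values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)     # (4, 8)
```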
npj Mental Health Research, Journal Year: 2024, Volume and Issue: 3(1), Published: Oct. 27, 2024
Abstract
The global mental health crisis underscores the need for accessible, effective interventions. Chatbots based on generative artificial intelligence (AI), like ChatGPT, are emerging as novel solutions, but research on real-life usage is limited. We interviewed nineteen individuals about their experiences of using AI chatbots for mental health. Participants reported high engagement and positive impacts, including better relationships and healing from trauma and loss. We developed four themes: (1) a sense of 'emotional sanctuary', (2) 'insightful guidance', particularly regarding relationships, (3) the 'joy of connection', and (4) comparisons between the 'AI therapist' and human therapy. Some themes echoed prior research on rule-based chatbots, while others appeared unique to generative AI. Participants emphasised the importance of safety guardrails, human-like memory, and the ability to lead the therapeutic process. Generative AI chatbots may offer mental health support that feels meaningful to users, but further research is needed into their effectiveness.
Scientific Reports, Journal Year: 2025, Volume and Issue: 15(1), Published: Jan. 7, 2025
To explore the attitudes of healthcare professionals and the public on applying ChatGPT in clinical practice. The successful application of ChatGPT in clinical practice depends not only on its technical performance but also, critically, on the perceptions of both healthcare professionals and the non-healthcare public. This study has a qualitative design based on artificial intelligence. It was divided into five steps: data collection, cleaning, validation of relevance, sentiment analysis, and content analysis using the K-means algorithm. The data comprised 3130 comments amounting to 1,593,650 words. The dictionary method showed positive and negative emotions such as anger, disgust, fear, sadness, surprise, good, and happy emotions. Healthcare professionals prioritized ChatGPT's efficiency but raised ethical and accountability concerns, while the public valued its accessibility and emotional support yet expressed worries about privacy and misinformation. Bridging these perspectives by improving reliability, safeguarding privacy, and clearly defining ChatGPT's role is essential for its practical integration into clinical practice.
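The abstract outlines a five-step pipeline that includes dictionary-based sentiment analysis and K-means content analysis. The sketch below is a minimal, hedged illustration of those two steps in Python: the example comments, the tiny emotion lexicon, and the cluster count are invented placeholders, not the study's data or settings.

```python
# Sketch of the two analysis steps named in the abstract: a dictionary (lexicon)
# pass for emotion signals and K-means clustering of TF-IDF vectors for content
# analysis. The comments, lexicon, and cluster count are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

comments = [
    "ChatGPT saves me time on clinical documentation",
    "I fear it will leak patient data",
    "Surprisingly good emotional support late at night",
]

emotion_lexicon = {
    "fear": ["fear", "worried", "scared"],
    "good": ["good", "helpful", "saves"],
    "surprise": ["surprise", "surprising"],
}

def emotion_counts(text: str) -> dict:
    """Count tokens that match any entry of each emotion word list."""
    tokens = text.lower().split()
    return {emotion: sum(any(tok.startswith(w) for w in words) for tok in tokens)
            for emotion, words in emotion_lexicon.items()}

for c in comments:
    print(emotion_counts(c))

# Content analysis: group comments into themes via K-means on TF-IDF features.
X = TfidfVectorizer(stop_words="english").fit_transform(comments)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```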
JMIR Mental Health, Journal Year: 2025, Volume and Issue: 12, P. e60432 - e60432, Published: Feb. 21, 2025
Background
Conversational artificial intelligence (CAI) is emerging as a promising digital technology for mental health care. CAI apps, such as psychotherapeutic chatbots, are available in app stores, but their use raises ethical concerns.
Objective
We aimed to provide a comprehensive overview of the ethical considerations surrounding CAI as a therapist for individuals with mental health issues.
Methods
We conducted a systematic search across the PubMed, Embase, APA PsycINFO, Web of Science, Scopus, the Philosopher's Index, and ACM Digital Library databases. Our search comprised 3 elements: embodied artificial intelligence, ethics, and mental health. We defined CAI as a conversational agent that interacts with a person and uses artificial intelligence to formulate output. We included articles discussing the ethical challenges of CAI functioning in the role of a therapist and added additional articles through snowball searching. Articles had to be written in English or Dutch. All article types were considered except abstracts of symposia. Screening for eligibility was done by 2 independent researchers (MRM and TS or AvB). An initial charting form was created based on the expected concerns and was revised and complemented during the charting process. The concerns were divided into themes. When a concern occurred in more than one article, we identified it as a distinct theme.
Results
We included 101 articles, of which 95% (n=96) were published in 2018 or later. Most articles were reviews (n=22, 21.8%), followed by commentaries (n=17, 16.8%). The following 10 themes were distinguished: (1) safety and harm (discussed in 52/101, 51.5% of articles); the most common topics within this theme were suicidality and crisis management, harmful or wrong suggestions, and the risk of dependency on CAI; (2) explicability, transparency, and trust (n=26, 25.7%), including the effects of "black box" algorithms on trust; (3) responsibility and accountability (n=31, 30.7%); (4) empathy and humanness (n=29, 28.7%); (5) justice (n=41, 40.6%), including inequalities due to differences in digital literacy; (6) anthropomorphization and deception (n=24, 23.8%); (7) autonomy (n=12, 11.9%); (8) effectiveness (n=38, 37.6%); (9) privacy and confidentiality (n=62, 61.4%); and (10) concerns about care workers' jobs (n=16, 15.8%). Other concerns were discussed in 9.9% (n=10) of the articles.
Conclusions
This scoping review has comprehensively covered the ethical aspects of CAI in a therapeutic role. While certain themes remain underexplored and stakeholders' perspectives are insufficiently represented, the study highlights critical areas for further research. These include evaluating the risks and benefits of CAI in comparison with human therapists, determining its appropriate roles in therapeutic contexts and its impact on care access, and addressing accountability. Addressing these gaps can inform normative analysis and guide the development of guidelines for responsible CAI use.
E-Learning and Digital Media, Journal Year: 2024, Volume and Issue: unknown, Published: June 3, 2024
In the digital era, Artificial Intelligence (AI) has arisen as a revolutionary influence with the potential to transform multiple spheres of human life. Chatbots, particularly OpenAI's Chat Generative Pre-trained Transformer (ChatGPT), are increasingly recognised as promising tools in diverse aspects, including mental health. This study delves into ChatGPT's effectiveness as an emotional resilience support tool specifically for Generation Z (Gen Z), a demographic deeply engaged in digital interactions. Employing a sequential explanatory design that integrates quantitative and qualitative analyses, the research investigates Gen Z users' perceptions of its effectiveness, barriers to its utilisation, and its impact on emotional resilience. The findings reveal significant acknowledgement of ChatGPT's role in enhancing well-being alongside notable concerns regarding privacy and security. Further, the insights underscore the significance of personalised interactions, a nonjudgmental space, and active listening as characteristics of ChatGPT that foster resilience. Moreover, the study identifies key areas for improvement, such as expanded topic coverage and cultural representation. Educational stakeholders and mental health professionals are encouraged to utilise these insights and integrate ChatGPT and other AI tools into tailored support frameworks for Gen Z.
Acta Neuropsychiatrica, Journal Year: 2024, Volume and Issue: unknown, P. 1 - 14, Published: Nov. 11, 2024
Tools based on generative artificial intelligence (AI) such as ChatGPT have the potential to transform modern society, including the field of medicine. Due to the prominent role of language in psychiatry, e.g., for diagnostic assessment and psychotherapy, these tools may be particularly useful within this medical field. Therefore, the aim of this study was to systematically review the literature on generative AI applications in psychiatry and mental health.
BACKGROUND
Generative Artificial Intelligence (AI) chatbots have the potential to improve mental health care for practitioners and clients. Evidence demonstrates that AI can assist with tasks such as documentation, research, counselling, and therapeutic exercises. However, research examining practitioners' perspectives is limited.
OBJECTIVE
Drawing on qualitative and quantitative data, this mixed-methods study investigates: (1) practitioners' views on different uses of AI chatbots; (2) their likelihood of recommending chatbots to clients; and (3) whether that likelihood of recommendation increases after viewing a demonstration.
METHODS
Participants were 23 practitioners (17 female, 6 male; M age = 39.39, SD = 16.20). In forty-five-minute interviews, participants selected the three most helpful chatbot uses from 11 options and rated their likelihood of recommending chatbots to clients on a Likert scale before and after an 11-minute chatbot demonstration.
RESULTS
Binomial tests found that Generating Case Notes was selected at greater-than-chance levels (p < .001), while Support with Session Planning (p = .863) and Identifying and Suggesting Literature (p = .096) were not. Although 55% (n = 12) were likely to recommend chatbots to clients, a binomial test found no significant difference from the 50% threshold (p = .738). A paired samples t-test showed that likelihood of recommendation increased significantly (p = .002) from pre-demonstration to post-demonstration.
CONCLUSIONS
Findings suggest practitioners favour administrative uses of AI chatbots and are more open to recommending them after exposure. This highlights the need for practitioner education and guidelines to support the safe and effective integration of AI chatbots in mental health care.
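The RESULTS above rest on binomial tests against a chance or 50% threshold and a paired samples t-test. The sketch below shows the structure of those tests in scipy; the selection count, chance probability, and rating vectors are invented placeholders (individual scores are not reported in the abstract), so it illustrates the test setup rather than reproducing the reported p values.

```python
# Sketch of the reported test structure using scipy. The sample size (23) and
# the 12-of-23 recommendation count follow the abstract; the selection count,
# chance probability, and rating vectors are placeholders.
from scipy.stats import binomtest, ttest_rel

# Was "Generating Case Notes" picked more often than chance? With 3 picks from
# 11 options, a per-option chance rate of 3/11 is one plausible null (assumed).
print(binomtest(k=14, n=23, p=3 / 11).pvalue)

# Likelihood of recommending chatbots to clients vs a 50% threshold (12 of 23).
print(binomtest(k=12, n=23, p=0.5).pvalue)

# Paired samples t-test on pre- vs post-demonstration recommendation ratings.
pre = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3, 4, 3, 2, 3, 3, 4, 2, 3, 3, 2, 4, 3, 3]
post = [4, 3, 4, 4, 3, 3, 5, 4, 3, 4, 4, 4, 3, 4, 4, 4, 3, 4, 4, 3, 5, 4, 4]
print(ttest_rel(post, pre))
```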
JMIR Mental Health, Journal Year: 2025, Volume and Issue: 12, P. e70014 - e70014, Published: Feb. 18, 2025
Abstract
Background
The global shortage of mental health professionals, exacerbated by increasing needs post COVID-19, has stimulated growing interest in leveraging large language models to address these challenges.
Objectives
This systematic review aims to evaluate the current capabilities of generative artificial intelligence (GenAI) in the context of mental health applications.
Methods
A comprehensive search across 5 databases yielded 1046 references, of which 8 studies met the inclusion criteria. The included studies were original research with experimental designs (eg, Turing tests, sociocognitive tasks, trials, or qualitative methods); a focus on GenAI models; and explicit measurement of abilities (eg, empathy and emotional awareness), outcomes, or user experience (eg, perceived trust and empathy).
Results
The studies, published between 2023 and 2024, primarily evaluated models such as ChatGPT-3.5 and 4.0, Bard, and Claude on tasks including psychoeducation, diagnosis, emotional awareness, and clinical interventions. Most used zero-shot prompting and human evaluators to assess AI responses, using standardized rating scales for analysis. However, these methods were often insufficient to fully capture the complexity of GenAI's capabilities. The reliance on single-shot prompting techniques, limited comparisons, and task-based assessments isolated from real-world contexts may oversimplify GenAI's capabilities and overlook the nuances of human–artificial intelligence interaction, especially in applications that require contextual reasoning and cultural sensitivity. The findings suggest that while GenAI models demonstrate strengths in psychoeducation, their diagnostic accuracy, competence, and ability to engage users emotionally remain limited. Users frequently reported concerns about trustworthiness and a lack of engagement.
Conclusions
Future studies could use more sophisticated evaluation methods, such as few-shot and chain-of-thought prompting, to uncover GenAI's potential. Longitudinal studies and broader comparisons against benchmarks are needed to explore the effects of GenAI-integrated care.
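The review contrasts the zero-shot prompting used in most included studies with the few-shot and chain-of-thought prompting it recommends. The sketch below illustrates how those three prompt styles differ for a single mental-health-flavoured task; the vignette, the example messages, and the query_model function are hypothetical stand-ins, not anything from the reviewed studies or a real API.

```python
# Hedged sketch contrasting the prompting styles discussed in the review.
# `query_model` is a hypothetical placeholder for an actual LLM API call.
VIGNETTE = "I have not slept properly for weeks and I dread going to work."

zero_shot = f"Identify the emotions expressed in this message:\n{VIGNETTE}"

few_shot = (
    "Identify the emotions expressed in each message.\n"
    "Message: 'I can't stop crying since my dog died.' Emotions: grief, sadness\n"
    "Message: 'My heart races before every exam.' Emotions: anxiety, fear\n"
    f"Message: '{VIGNETTE}' Emotions:"
)

chain_of_thought = (
    f"Message: '{VIGNETTE}'\n"
    "First describe the situation, then reason step by step about what the "
    "person is feeling, and only then name the emotions."
)

def query_model(prompt: str) -> str:
    """Placeholder: swap in a real client (eg, for ChatGPT or Claude) here."""
    return f"[model response to {len(prompt)} chars of prompt]"

for name, prompt in [("zero-shot", zero_shot), ("few-shot", few_shot),
                     ("chain-of-thought", chain_of_thought)]:
    print(name, "->", query_model(prompt))
```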
Advances in computational intelligence and robotics book series, Journal Year: 2025, Volume and Issue: unknown, P. 305 - 332, Published: Jan. 10, 2025
To increase social interaction and encourage independent learning in Pancasila citizenship education classes, this project investigates the use of ChatGPT as a new teaching tool. The main goal is to test how well ChatGPT encourages student participation and group work across different classroom environments. By offering empirical evidence of its efficacy as a teaching tool, the research adds to the growing body of literature on AI in education. We have highlighted the importance of facilitating interactions and self-directed learning, which are essential for the growth of analytical and communicative abilities. Educators and lawmakers can use the research's practical insights to incorporate AI tools into school courses and improve engagement and learning outcomes. To better equip their students to face the problems of the digital age, instructors should participate in ongoing professional development opportunities to learn how to use ChatGPT in the classroom.