Child Development, 2023, 94(6), P. 1472–1490. Published: Nov. 1, 2023.
The study of how children learn numbers has yielded one of the most productive research programs in cognitive development, spanning empirical and computational methods, as well as nativist and empiricist philosophies. This paper provides a tutorial on how to think computationally about learning models in a domain like number, where learners take finite data and go far beyond what they directly observe or perceive. To illustrate, the paper then outlines a model which acquires a counting procedure using observations of sets and words, extending the proposal of Piantadosi et al. (2012). The new version responds to several critiques of the original work and describes an approach that is likely appropriate for acquiring further aspects of mathematics.
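The kind of inference the tutorial describes, in which a learner with finite observations goes beyond them, can be illustrated with a deliberately tiny Bayesian sketch. This is my own toy construction, not the authors' model: hypotheses about what a number word like "two" means are compared by how narrowly they predict the observed set sizes (the "size principle"), so the exact meaning wins as consistent evidence accumulates.

```python
# Toy Bayesian comparison of candidate meanings for the word "two",
# given observed pairings of the word with set sizes. All names and
# numbers here are illustrative assumptions, not from the paper.
from fractions import Fraction

MAX_SET = 8  # assumed cap on observable set sizes

# Each hypothesis maps "two" to the set of set sizes it is compatible with.
hypotheses = {
    "exactly-2": {2},
    "at-least-2": set(range(2, MAX_SET + 1)),
    "any-size": set(range(1, MAX_SET + 1)),
}

def posterior(data):
    """Uniform prior; likelihood 1/|extension| per consistent observation."""
    scores = {}
    for name, ext in hypotheses.items():
        p = Fraction(1)
        for n in data:
            p *= Fraction(1, len(ext)) if n in ext else Fraction(0)
        scores[name] = p
    total = sum(scores.values())
    return {k: (v / total if total else Fraction(0)) for k, v in scores.items()}

# "two" heard four times alongside sets of exactly two objects:
data = [2, 2, 2, 2]
post = posterior(data)
best = max(post, key=post.get)
```

Even four observations suffice here: the broader hypotheses are penalized for spreading their probability over sizes that never occur, which is one way finite data can license a conclusion far stronger than the data themselves.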
Oxford University Press eBooks, 2023, volume and issue unknown. Published: Feb. 16, 2023.
Abstract
This book argues that there is a joint in nature between seeing and thinking, perception and cognition. Perception is constitutively iconic, nonconceptual, and nonpropositional, whereas cognition does not have these properties constitutively. The argument does not appeal to "intuitions," as is common in philosophy, but to empirical evidence, including experiments in neuroscience and psychology. That cognition affects perception, i.e., that perception is cognitively penetrable, does not impugn the joint in nature. A key part of the argument is that we perceive not only low-level properties like colors, shapes, and textures but also high-level properties such as faces and causation. Along the way, the book explains the difference between perception and perceptual memory, the differences between format and content, whether perception is probabilistic despite our lack of awareness of probabilistic properties, whether perceptual representations for categories are concepts, whether perception need be singular, whether attribution and discrimination are equally fundamental, and whether the basic features of the mind known as "core cognition" are a third category between perception and cognition. A chapter on consciousness leverages these results to argue against some of the most widely accepted theories of consciousness. Although this is not a book about consciousness, much of the rest of it repurposes work on consciousness to isolate the scientific basis of perception.
Proceedings of the National Academy of Sciences, 2022, 119(5). Published: Jan. 24, 2022.
A major goal of linguistics and cognitive science is to understand what class of learning systems can acquire natural language. Until recently, the computational requirements of language have been used to argue that learning is impossible without a highly constrained hypothesis space. Here, we describe a learning system that is maximally unconstrained, operating over the space of all computations, and that is able to acquire many key structures present in natural language from positive evidence alone. We demonstrate this by providing the same model with data from 74 distinct formal languages which have been argued to capture key features of language, have been studied in experimental work, or come from an interesting complexity class. The model successfully induces the latent system generating the observed strings from small amounts of data in almost all cases, including for regular languages (e.g., …).
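The core move, inferring which formal language generated a set of positive strings by trading a simplicity prior against a size-principle likelihood, can be sketched in miniature. This is my own drastically simplified illustration, not the paper's model (which searches the space of all computations); the hypotheses, description-length costs, and bounds below are all assumptions chosen for the demo.

```python
# Tiny positive-evidence learner over a handful of hand-written
# formal-language hypotheses, scored by prior * likelihood in log space.
import math

def lang_an(max_n):          # a^n (regular)
    return {"a" * n for n in range(1, max_n + 1)}

def lang_ab_n(max_n):        # (ab)^n (regular)
    return {"ab" * n for n in range(1, max_n + 1)}

def lang_anbn(max_n):        # a^n b^n (context-free)
    return {"a" * n + "b" * n for n in range(1, max_n + 1)}

def lang_any_ab(max_len):    # every string over {a, b} up to max_len
    out = set()
    for length in range(1, max_len + 1):
        for i in range(2 ** length):
            out.add("".join("ab"[(i >> k) & 1] for k in range(length)))
    return out

# (finite enumeration of the language, description-length cost):
# longer descriptions get a lower prior.
hypotheses = {
    "a^n": (lang_an(6), 2.0),
    "(ab)^n": (lang_ab_n(6), 3.0),
    "a^n b^n": (lang_anbn(6), 4.0),
    "{a,b}*": (lang_any_ab(12), 1.0),
}

def log_posterior(name, data):
    lang, cost = hypotheses[name]
    lp = -cost  # simplicity prior ~ exp(-description length)
    for s in data:
        if s not in lang:
            return -math.inf  # positive evidence must be generable
        lp += -math.log(len(lang))  # size principle: smaller language fits tighter
    return lp

data = ["ab", "aabb", "aaabbb"]  # positive evidence only
best = max(hypotheses, key=lambda h: log_posterior(h, data))
```

With only three strings, the permissive `{a,b}*` hypothesis is never contradicted, yet it loses to `a^n b^n` because its likelihood is diluted over thousands of strings it would also generate; this is how learning from positive evidence alone can still reject overly general grammars.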
Cell Reports, 2024, 43(3), P. 113952. Published: March 1, 2024.
When exposed to sensory sequences, do macaque monkeys spontaneously form abstract internal models that generalize to novel experiences? Here, we show that neuronal populations in the ventrolateral prefrontal cortex jointly encode visual sequences by separate codes for the specific pictures presented and for their sequential structure. We recorded neurons while monkeys passively viewed sequence mismatches in the local-global paradigm. Even without any overt task or response requirements, representations of sequence structure, serial order, and image identity arose within distinct but superimposed subspaces. Representations of sequence structure rapidly update following a single exposure to a mismatch sequence and represent sequences of different complexity. Finally, those representations generalize across sequences sharing the same repetition structure but comprising different images. These results suggest rich internal models of sequences, reflecting both abstract structure and content-specific information.
Synthese, 2024, 203(5). Published: May 3, 2024.
Abstract
Natural language syntax yields an unbounded array of hierarchically structured expressions. We claim that these are used in the service of active inference in accord with the free-energy principle (FEP). While conceptual advances alongside modelling and simulation work have attempted to connect speech segmentation and linguistic communication with the FEP, we extend this program to the underlying computations responsible for generating syntactic objects. We argue that recently proposed principles of economy in language design, such as "minimal search" criteria from theoretical syntax, adhere to the FEP. This affords a greater degree of explanatory power to the FEP with respect to higher cognitive functions and offers linguistics a grounding in first principles of computability. While we mostly focus on building new principled relations between syntax and the FEP, we also show through a sample of preliminary examples how both tree-geometric depth and a Kolmogorov complexity estimate (recruiting a Lempel–Ziv compression algorithm) can be used to accurately predict legal operations on syntactic workspaces, directly in line with formulations of variational free energy minimization. This is used to motivate a general design principle we term Turing–Chomsky Compression (TCC). We use TCC to align the concerns of linguists with the normative account of self-organization furnished by the FEP, marshalling evidence from psycholinguistics to ground claims of efficient computation within active inference.
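The general idea behind a Lempel–Ziv-based Kolmogorov complexity estimate can be shown in a few lines. This is a rough sketch of the technique, not the paper's implementation: Kolmogorov complexity is uncomputable, but the compressed length of a string under an LZ-family coder (zlib's DEFLATE here) gives a computable upper-bound proxy, so structured sequences score lower than unstructured ones of the same length.

```python
# Compressed length as a stand-in for Kolmogorov complexity.
import random
import zlib

def lz_complexity(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a computable
    upper-bound proxy for the (uncomputable) Kolmogorov complexity."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

random.seed(0)
regular = "ab" * 200                                      # highly structured
noisy = "".join(random.choice("ab") for _ in range(400))  # same length, same alphabet
```

Here `lz_complexity(regular)` comes out far smaller than `lz_complexity(noisy)`, mirroring the intuition that more compressible structures are cheaper to encode; the paper's use of such estimates to rank syntactic operations rests on the same compressibility contrast.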
Successive auditory inputs are rarely independent, their relationships ranging from local transitions between elements to hierarchical and nested representations. In many situations, humans retrieve these dependencies even from limited datasets. However, how this learning operates at multiple scale levels is poorly understood. Here, we used the formalism proposed by network science to study the representation of higher-order structures of interaction in sequences. We show that human adults exhibited biases in their perception of elements, which made them sensitive to high-order structures such as communities. This behavior is consistent with the creation of a parsimonious, simplified model of the evidence they receive, achieved by pruning and completing relations between elements. This observation suggests that the brain does not rely on exact memories but on a parsimonious model of the world. Moreover, this bias can be analytically modeled by a memory/efficiency trade-off. The model correctly accounts for previous findings, including transition probabilities as well as high-order structures, unifying sequence learning across scales. We finally propose putative neuronal implementations of such a bias.
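The "pruning and completing" idea can be made concrete with a small sketch. This is my own toy, not the authors' analytical model: a transition graph is estimated from an observed sequence, rarely seen edges are pruned (a memory limit), and edges are added between nodes that share enough neighbors (completion), so the internal model ends up reflecting community structure rather than exact transition counts. The thresholds and the example sequence are illustrative assumptions.

```python
# Prune-and-complete sketch of a parsimonious internal model of a sequence.
from collections import Counter
from itertools import combinations

def internal_model(seq, prune_below=2, complete_shared=2):
    """Return undirected edges of a simplified transition graph."""
    counts = Counter(zip(seq, seq[1:]))  # observed directed transitions
    # pruning: keep only transitions seen at least prune_below times
    edges = {frozenset(e) for e, c in counts.items()
             if c >= prune_below and e[0] != e[1]}
    nodes = set(seq)
    neighbors = {n: {m for m in nodes if frozenset((n, m)) in edges}
                 for n in nodes}
    # completion: link nodes that share enough neighbors
    completed = set(edges)
    for a, b in combinations(nodes, 2):
        if len(neighbors[a] & neighbors[b]) >= complete_shared:
            completed.add(frozenset((a, b)))
    return completed

# A walk on a 4-node community {A,B,C,D} (edge B-D never observed),
# plus one rare transition to E at the end:
seq = "ABCDABCDACACE"
model = internal_model(seq)
```

On this input the rare C→E transition is pruned away, while the unobserved B–D link is filled in because B and D share two neighbors: the model "remembers" an edge it never saw and forgets one it did, which is exactly the signature of a memory/efficiency trade-off rather than exact recall.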