A New Theory About Our Perceptions of the World
New research is filling in gaps between two prevailing theories about how the brain generates our perception of the world.
One, called Bayesian decoding theory, says that to best recognize what's in front of us, the brain combines the sensory signals it receives with our preconceived notions. The second, called efficient coding, says that sensory resources such as the neurons of the brain and retina are limited, so they encode the most frequently encountered objects and scenes with the greatest precision.
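In the simplest textbook case, Bayesian decoding amounts to a precision-weighted average of what the senses report and what the observer expects. The sketch below is illustrative only; the function name, the orientation example, and all numbers are assumptions for the sake of the example, not details from the study.

```python
def bayesian_estimate(measurement, sigma_sensory, prior_mean, sigma_prior):
    """Posterior-mean estimate for a Gaussian prior and Gaussian sensory
    noise: a precision-weighted average of the noisy measurement and the
    prior expectation. Noisier senses pull the estimate toward the prior."""
    w_sensory = 1.0 / sigma_sensory ** 2   # precision of the sensory signal
    w_prior = 1.0 / sigma_prior ** 2       # precision of the prior belief
    return (w_sensory * measurement + w_prior * prior_mean) / (w_sensory + w_prior)


# Hypothetical example: a noisy reading of a line's tilt (in degrees),
# interpreted against a prior expectation centered on vertical (90 degrees).
print(bayesian_estimate(measurement=80.0, sigma_sensory=10.0,
                        prior_mean=90.0, sigma_prior=5.0))
```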
Alan Stocker, an assistant professor in the psychology and the electrical and systems engineering departments, and Xue-Xin Wei, a psychology graduate student, combined these two concepts to create a new theory about how we perceive our world: How often we observe an object or scene shapes both what we expect of something similar in the future and how accurately we’ll see it.
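One way to picture the combined idea is a toy model in which the same prior distribution does double duty: it sets how precisely a stimulus is encoded (efficient coding) and it biases how the noisy signal is decoded (Bayesian decoding). The sketch below is a minimal illustration under assumed numbers and a made-up orientation prior; it is not the authors' actual model.

```python
import numpy as np

# Hypothetical prior over stimulus orientation (degrees): cardinal
# orientations (0, 90, 180) are assumed to occur more often than obliques.
theta = np.arange(0.0, 180.0, 1.0)
prior = 1.1 + np.cos(np.radians(4.0 * theta))   # peaks at 0, 90, 180 degrees
prior /= prior.sum()                            # normalize to a distribution


def sensory_noise(stim, base_sigma=20.0):
    """Efficient-coding assumption: encoding precision tracks the prior,
    so frequently seen stimuli are measured with less noise."""
    p = np.interp(stim, theta, prior)
    return base_sigma / (1.0 + 4.0 * p / prior.max())


def perceive(stim, rng):
    """Encode with prior-dependent noise, then decode with the same prior
    (posterior-mean estimate). Frequent stimuli are seen more accurately;
    rare ones are noisier and pulled toward what the observer expects."""
    sigma = sensory_noise(stim)
    measurement = rng.normal(stim, sigma)
    likelihood = np.exp(-0.5 * ((theta - measurement) / sigma) ** 2)
    posterior = likelihood * prior
    posterior /= posterior.sum()
    return float((theta * posterior).sum())


rng = np.random.default_rng(0)
print(perceive(90.0, rng))   # frequent (vertical) orientation: small error
print(perceive(45.0, rng))   # rare (oblique) orientation: larger, biased error
```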