Here's a controversial idea: almost all of what you perceive could really be an internal mental simulation of the external world, informed only occasionally by sampling data from your senses. This is not a Matrix-like conspiracy theory — some neuroscientists actually say it's quite likely to be the case.

In other words, reality is real, but what you see, that's all in your head.

But not everyone agrees that we have a built-in virtual reality machine. At the other extreme of this debate are neuroscientists who say most of our perception is based on the information we receive through our eyes, ears and other sensory organs, and that the brain's mental simulation merely serves to fill in the gaps.

Whichever the case may be, everyone does agree that both sensory information and our mental models have a role in how we perceive the world, says Timothy Verstynen, an assistant professor of psychology and neuroscience at Carnegie Mellon University. As we start walking about in the world, seeing, touching and hearing, our brain learns from experience and builds models to predict our future interactions with the environment.

"That's a much more efficient way to get around in the world than to try to process every single bit of sensory data that your senses collect," said Verstynen, who subscribes to the mental simulation theory, saying that as much as 90 percent of our perception is actually mental fabrication. "Rather than trying to process all these tsunamis of incoming information, it's probably more important to have this simulation of the world and just randomly and sparsely check your senses to see if your model is correct and if it's not, fix it."
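The strategy Verstynen describes — run an internal model most of the time and only sparsely sample the senses to correct it — can be sketched as a toy program. This is an illustrative analogy only, not a model from the research; all names, numbers and thresholds here are invented.

```python
import random

random.seed(0)

def perceive(true_value=10.0, steps=100, check_every=10, tolerance=1.0):
    """Toy 'predictive perception' loop: trust an internal model most of
    the time, and only occasionally sample the (noisy) senses to check it."""
    belief = 0.0           # the internal model's current estimate
    sensor_reads = 0
    for t in range(steps):
        if t % check_every == 0:                        # sparse sensory check
            sample = true_value + random.gauss(0, 0.5)  # noisy sense data
            sensor_reads += 1
            if abs(sample - belief) > tolerance:        # prediction error?
                belief = sample                         # fix the model
        # on all other steps, act on the model's prediction without sensing
    return belief, sensor_reads

belief, reads = perceive()
print(belief, reads)
```

The loop touches the sensor only 10 times out of 100 steps, yet its belief ends up close to the true value — a cartoon of why checking a simulation is cheaper than processing every bit of incoming data.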

An imaginary heaviness

The mental simulation theory means that, for example, when you pick up an object, the weight that you feel is mostly what your brain makes up rather than the true weight of the object.

Some of the evidence for this idea comes from illusions, which occur when objects defy our expectations. One example is the size-weight illusion: take two balls with the same weight but different sizes, and pick them up. The smaller ball will feel heavier than the bigger one.

"That's because your expectation is that the smaller ball should be lighter. When you pick it up and there's a discrepancy between what you expected and what you felt, it makes it seem like it's heavier," Verstynen said.

This shows how expectations influence the way we perceive the world, Verstynen said. "If we were purely at the whim of our senses, that size-weight illusion wouldn't exist," he said.

A few years ago, Verstynen and his colleagues accidentally discovered a new illusion, which seems to drastically manipulate one's perception of force. Using a virtual reality system, they had people hold a block while watching a virtual replica of their arm and the block, their real arm hidden from view. The participants were then asked to lift the block with their other hand. As they watched the virtual block being lifted from their palm, the researchers kept the actual load on their hands unchanged. This made the participants perceive a strong increase in the force on their hand, even though in reality the true load remained constant.

Hard-wired expectations

Illusions often highlight some of our core assumptions about the world. We expect smaller objects to be lighter, and objects in shadow to be darker. These assumptions seem to be deeply ingrained, and training people to expect these illusions only slightly weakens the effect.

"You can train it a little bit, but I haven't seen any studies that have shown it to go away completely in a normal, healthy adult," Verstynen said.

There are several possible reasons why our mental expectations have such a strong say in our perception. For example, a brain that works off of mental models has more free resources to devote to potentially important, unexpected sensory signals. It may also be more efficient for the brain to judge a property of an object — say, its weight — by comparing it to its own model, rather than computing it from scratch over a large range of possible weights every time.

Human-like robots may need mental models too

Currently, most of robotics is geared toward getting robots to process and react to sensory input, Verstynen says. But to get a human-like robot, its brain needs to have some of the same mental shortcuts that a human brain uses. In other words, a robot, too, needs to have built-in expectations and check in with sensory input rather than trying to process every bit of information that comes its way.
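The idea that a robot should carry built-in expectations and only "check in" with its sensors can be sketched in a few lines. This is a minimal hypothetical example, not code from any robotics project; the size-to-weight prior and the surprise threshold are invented for illustration.

```python
def expected_weight(volume_cm3, assumed_density=1.0):
    """Prior expectation: bigger objects should be heavier
    (assumed density is in g per cubic cm)."""
    return volume_cm3 * assumed_density

def grasp(volume_cm3, sensed_weight_g, surprise_threshold=0.5):
    """Compare the sensed weight against the model's expectation, and only
    escalate to full sensory processing when the prediction error is large."""
    predicted = expected_weight(volume_cm3)
    error = abs(sensed_weight_g - predicted) / max(predicted, 1e-9)
    if error > surprise_threshold:
        return "surprise: re-examine object"  # prediction failed, sense more
    return "proceed on model"                 # prediction good enough

print(grasp(100, 95))    # weight roughly matches the expectation
print(grasp(100, 400))   # far heavier than expected
```

When the sensed weight roughly matches the prior, the robot proceeds on its model; only a large mismatch — the machine analogue of the size-weight illusion's "discrepancy" — triggers costlier sensory processing.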

"So much effort is put into processing sensory information and there's not much research on how to build these internal representations," Verstynen said.

The field of artificial intelligence started with a mission to mimic the human mind, but it took a hard turn a few decades ago toward more specific computational tasks, like playing chess or Jeopardy. Now, cognitive scientists are increasingly arguing that AI research should return to learning from the human brain, and that robots can truly learn from people.

"Part of the problem is that robotics and AI tend to be dominated by engineers, and neuroscience and computational neuroscience are dominated by mathematicians and biologists," Verstynen said. "They're separate fields that don't talk to each other as much as they should."

Stephanie Pappas contributed reporting to this article.