Facial Recognition

Wired Science has a short discussion about how humans recognize and process facial characteristics and why we sometimes stare at people with facial deformities. An evolutionary response causes our brains to momentarily stumble when we see people who don't have symmetrical features:

To decide, your eyes sweep over the person’s face, retrieving only parts, mainly just his nose and eyes. Your brain will then try to assemble those pieces into a configuration that you know something about.

When the pieces you supply match nothing in the gallery of known facial expressions, when you encounter a person whose nose, mouth or eyes are distorted in a way you have never encountered before, you instinctively lock on. Your gaze remains riveted, and your brain stays tuned for further information.

“When a face is distorted, we have no pattern to match that,” Rosenberg said. “All primates show this [staring] at something very different, something they have not evolved to see. They need to investigate further. ‘Are they one of us or not?’ In other species, when an animal looks very different, they get rejected.”

Some of this response might be applicable to interface design, one would think. How do we respond to interfaces that aren't symmetrical or don't fit a recognizable pattern? Are the same processes at work? Some studies show that better-designed interfaces are perceived as more usable than functionally identical but poorly designed ones.