Gender. Age. Weight. Emotion. Personality.
All of these factors affect how we move.
Dr. Nikolaus Troje has spent large parts of his career studying biological motion and how little visual information we actually need to recognize activity.
“Biological motion is fascinating,” says Troje, a professor in the Department of Biology and the Centre for Vision Research at York University.
“There’s an ease by which the human visual system fills in information and recognizes whether they’re male or female, young or old, happy or sad. The changes are very, very minute and yet your visual system picks them up with precision and interprets them with expertise.”
Troje’s research has varied applications in fields that include computer animation, neurological disease, psychological therapy and safety.
Can you tell me a little about yourself?
I am just in the process of moving my lab, students and office to York University in Toronto. I spent the past 15 years at Queen’s University and it was very good.
But this new position at York’s Centre for Vision Research provides a more interesting research environment, and it comes with a reduced teaching and administrative load. I’m looking forward to the change. Change is good for us, I think.
What is biomotion?
It was originally called biological motion when the concept was first introduced in the mid-70s. Back then, you would put people in a dark room with dark clothing. When filming them, the only thing you could see were small point lights attached to 10 or 15 points. In a still frame you would see just a random array of dots, but the moment the movie is set into motion you can’t help seeing a person in action. You can even recognize gender, age, emotions and personality traits from these point-light displays. Biomotion is a powerful example of how little visual input we need in order to see a lot.
How did you become interested in biomotion?
The main question for me is this: given that the sensory information that reaches our brain is noisy and riddled with messy physiology, how is it that the world we experience feels so predictable, solid and, therefore, “real”? Biomotion is a very good example for demonstrating and studying how the brain, and specifically how our visual system, organizes this flow of sparse, incomplete incoming information and turns it into something rich and real that we can work with.
What have been your contributions?
I entered the field in 1999. There wasn’t much interest in biological motion research at that time. I started using a new technology called motion capture. You see it everywhere today, but it was brand new at that time.
My lab was the first to systematically collect motion data to investigate walking behaviour and to study the differences in walking style that characterize individual people. We then developed methods that would allow us to identify and quantify the subtle variations in walking behaviour that are characteristic for biological attributes, emotions and personality traits.
In designing perceptual studies, we often use so-called point-light displays: in these displays we represent a human in motion by just a few dots that move as if they were attached to the main joints of the body.
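The idea behind a point-light display is simple enough to sketch in a few lines of code. The following is a hypothetical illustration, not the lab’s actual stimulus software: real displays drive 11 to 15 dots from motion-capture data, whereas here each limb dot simply swings sinusoidally in anti-phase, the way a walker’s arms and legs do.

```python
import math

# Illustrative base (x, y) positions for a crude point-light "walker".
# All numbers are made up for demonstration; real displays use mocap data.
JOINTS = {
    "head": (0.0, 6.0),
    "shoulder_l": (-1.0, 5.0), "shoulder_r": (1.0, 5.0),
    "hip_l": (-0.5, 3.0), "hip_r": (0.5, 3.0),
    "knee_l": (-0.5, 1.5), "knee_r": (0.5, 1.5),
    "ankle_l": (-0.5, 0.0), "ankle_r": (0.5, 0.0),
}

def frame(t, swing=0.6, freq=1.0):
    """Return dot positions at time t; left/right limbs move in anti-phase."""
    dots = {}
    for name, (x, y) in JOINTS.items():
        if "_" in name:  # a limb dot: oscillate horizontally
            phase = 0.0 if name.endswith("_l") else math.pi
            x += swing * math.sin(2 * math.pi * freq * t + phase)
        dots[name] = (x, y)
    return dots

# A single frame is just a cloud of dots -- the percept of a person
# only emerges once the frames are played in sequence.
print(len(frame(0.25)), "dots")
```

Rendered frame by frame, the dots are meaningless in any still image but cohere into a walking figure the moment they move, which is the core observation behind the paradigm.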
One study that is particularly interesting for the safety industry showed that people derive intention and a sense of animacy from these point-light displays even if the moving dots are randomly scrambled. An observer wouldn’t be able to make out the human shape, but can still derive the direction in which someone is oriented. Observers also have the clear perception of seeing something that is alive.
How does that look?
You can see a demonstration on my lab’s website, using a human or cat or pigeon. The dots are on the main joints of people and allow you to reconstruct the form.
People are very good at sensing the direction of the person or the pigeon even after clicking the “scramble” button, which randomizes the locations of the moving dots. Now, the interesting observation is that this clear sense of animacy and the ability to see the intended walking direction disappear when the display is turned upside down.
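The two manipulations just described can be sketched as follows. The function names and coordinate conventions are illustrative, not taken from Troje’s lab: scrambling relocates each dot’s entire trajectory to a random position while preserving its local motion, and inversion flips the display vertically, which is what destroys the percept.

```python
import random

def scramble(trajectories, extent=5.0, seed=0):
    """Move each dot's trajectory to a random location, keeping the dot's
    own motion intact -- the global body form is destroyed, the local
    motion of every dot is not. trajectories: list of [(x, y), ...]."""
    rng = random.Random(seed)
    out = []
    for traj in trajectories:
        x0, y0 = traj[0]
        ox = rng.uniform(-extent, extent) - x0  # random new base position
        oy = rng.uniform(-extent, extent) - y0
        out.append([(x + ox, y + oy) for x, y in traj])
    return out

def invert(trajectories):
    """Flip the display upside down. Local motion is preserved, but the
    feet no longer accelerate downward the way gravity dictates."""
    return [[(x, -y) for x, y in traj] for traj in trajectories]
```

Note that both operations leave every dot’s frame-to-frame displacement untouched; only the spatial arrangement (scramble) or orientation (invert) changes, which is why the comparison isolates the contribution of local motion.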
It turned out that the critical information conveying animacy and implied direction is in the motion of the feet: as long as they show signs of gravitational acceleration, an automatic detection system kicks in and tells your brain you might be dealing with something that is alive. We therefore called this neural mechanism the brain’s “life detector.”
How do we learn about the life detector?
It’s an inborn, hereditary trait. Experiments have shown that newborn babies already respond to it.
The feet grab your attention; the degree to which you respond is genetically determined. It speaks to the evolutionary origins of how people respond to the way a moving animal (or another human, for that matter) responds to gravity when moving efficiently.
The life detector sets an alarm which tells us that something interesting is going on in our visual environment that requires further inspection. It works in our visual periphery, too, and guides our attention.
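The physical signature the life detector is said to key on, gravitational acceleration, is easy to check numerically. Here is a minimal sketch (illustrative only, not the lab’s analysis), assuming we have height samples of a foot dot during its ballistic swing phase:

```python
def vertical_acceleration(ys, dt):
    """Estimate acceleration from height samples ys taken every dt seconds,
    using the second finite difference."""
    return [(ys[i + 1] - 2 * ys[i] + ys[i - 1]) / dt**2
            for i in range(1, len(ys) - 1)]

# A foot in ballistic flight follows y(t) = y0 + v*t - 0.5*g*t^2,
# so its estimated vertical acceleration should come out near -g.
g, dt = 9.81, 0.01
ys = [1.0 + 2.0 * t - 0.5 * g * t * t for t in (i * dt for i in range(20))]
acc = vertical_acceleration(ys, dt)
print(round(acc[0], 2))  # ~ -9.81 m/s^2
```

A trajectory whose dots never show this characteristic downward acceleration, such as a dot riding on a wheel, carries motion but not this particular signature of life.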
How does your work apply in the safety area?
This interests people who work in safety because it tells engineers where to put reflective materials when designing safety clothing. Doing it right means utilizing our old, innate ability to respond to the gravitational acceleration of the feet.
What would be your No. 1 piece of advice for safety officers?
For pedestrians, the most important place to put safety tape is on the feet or the ankles.
What about for vehicles?
Placing markers on a wheel of a car or bicycle generates visual motion which is always good, but it wouldn’t trigger the very efficient life detector system because the motion is not affected by gravitational acceleration.
I think there might be ways to bring that back into the game, but that would require detailed knowledge of the most adequate stimulus for triggering the life detector.
Are there real-life examples?
Hunters will tell you that it’s your footfall that scares away animals. Some will still recommend wearing long coats that disguise the motion of the legs and feet. If you want to see wildlife, you can get much closer in a canoe, on a bicycle or in a car. The minute you put your feet on the ground and move them under conditions in which gravitational acceleration plays out, animals will run or fly away. The fact that animals as well as humans respond to these visual invariants means that it is based on an evolutionarily old mechanism.
Stalking behaviour in animals is also characterized by avoiding accelerated footfall. Feet are lowered gently, not just to avoid sound, but mainly to avoid triggering the life detection system.
How does biomotion work with computer animation?
When computer animation meant making aliens, monsters and cartoon characters, getting the details of motion right wasn’t a big issue. Today, we find many applications in which computer animation replaces a real actor in a stunt. The intention is to get it real enough that the audience doesn’t even notice an animation has replaced the real person.
In order to fool the human visual system into believing that it sees a real person, we have to understand what it knows about real motion. An interesting area into which I am currently moving my research addresses the question of how the details of human motion depend on body shape and the way weight is distributed over the body.
If you take the motion from one person and apply it to another it doesn’t necessarily look right. We’re working now to find out what the relationships are between body motion and body shape, what the human visual system knows about these relations, and where it tolerates deviations from such relations.
What about in the neurology field?
Many neurological diseases go along with changes in the way you move. An experienced neurologist can tell whether a person is developing Parkinson’s disease from the way they walk. Our techniques can be used to develop tools for the early diagnosis of neurological disorders.
What applications are there in mental health?
There are therapeutic uses — when you’re depressed you walk differently. We measure that quantitatively and describe what those changes are.
There are relations between your mental state and the way you move. The interesting finding is that the causal relationships go both ways. If we induce a person to walk as if they were happier, they start feeling happier.
What is next for you?
We are moving beyond point-light displays and looking not just at motion but at other sources of visual information used for person perception, including body shape and how body motion interacts with it.
We’re also starting to use a lot of virtual reality for research, using head-mounted displays to see people in front of and around you. VR lends itself very well to applied work: when and how you see people in the dark, for example.
Barb Wilkinson is a freelance writer and editor based in Edmonton.