IDG.net - CIO Magazine
February 1, 1999
PERCEPTIVE COMPUTERS
Are you ready for "mood computers"
that are so personal they can read your facial expressions
and determine whether you're happy, sad, confused or lying,
and then respond accordingly?
Ready or not, such intimate interfaces are currently in
the works at two major computer research centers. At the MIT
Media Laboratory in Cambridge, Mass., researchers are
dabbling in "perceptual intelligence": computers that read
and respond to expressed human behavior. Meanwhile, a group at the
Carnegie Mellon University Robotics Institute in Pittsburgh
has already developed a prototype computer vision system
that automatically distinguishes among subtle facial expressions.
The point? Scientists believe that if computers can be
programmed to detect the subtleties of facial expressions
and physical gestures, then they can be adapted to suit any
number of real-world applications, including:
Criminal investigation. Imagine if a polygraph machine
could add face analysis to its quiver of lie-detection tools.
Biomedical applications, including treatment of facial disorders.
Security systems. Facial recognition could be key in
controlling access to buildings and computer systems.
"There are lots of real-world applications,"
Pentland, academic head of the Media Lab at MIT. Among
Pentland's current projects are "Smart Rooms" and "Smart
Desks," which use cameras, microphones and other sensors to
recognize people, recall their preferences and anticipate
their needs. "By knowing the situation-whether you're in a
meeting, driving a car, talking to a client-information can
be better tailored to the user's needs," Pentland says.
At Carnegie Mellon, researchers under the direction of
adjunct faculty member Jeffrey Cohn have developed a
computer vision system called Automated Face Analysis that
automatically recognizes subtly different facial expressions
based on "facial action coding system action units" using
hidden Markov mathematical models. Translation: The system
extracts facial expressions and analyzes them against a
database that can then equate them with emotions or
desires.
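The recognition step the CMU researchers describe, coding facial motion as action units and scoring the resulting sequences with hidden Markov models, can be illustrated with a toy example. Everything below is invented for illustration: the two-state models, the probabilities, and the mapping of symbols to action units are assumptions, not the actual Automated Face Analysis system.

```python
import numpy as np

def forward_log_likelihood(obs, start, trans, emit):
    """Score an observation sequence under a discrete HMM using the
    forward algorithm, normalizing each step to avoid underflow."""
    alpha = start * emit[:, obs[0]]            # prob. of first obs in each state
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]   # propagate states, weight by emission
        step = alpha.sum()
        loglik += np.log(step)
        alpha = alpha / step
    return loglik

# Two hidden states (neutral, apex). Observation symbols stand in for coded
# action units: 0 = none, 1 = lip-corner puller (AU 12, smiling),
# 2 = lip-corner depressor (AU 15, frowning). All numbers are invented.
START = np.array([1.0, 0.0])
TRANS = np.array([[0.5, 0.5],
                  [0.2, 0.8]])
MODELS = {
    "smile": np.array([[0.8, 0.1, 0.1],    # neutral state: mostly no AU
                       [0.1, 0.8, 0.1]]),  # apex state: mostly AU 12
    "frown": np.array([[0.8, 0.1, 0.1],
                       [0.1, 0.1, 0.8]]),  # apex state: mostly AU 15
}

def classify(obs):
    """Pick the expression whose HMM assigns the sequence the highest likelihood."""
    return max(MODELS,
               key=lambda name: forward_log_likelihood(obs, START, TRANS, MODELS[name]))

print(classify([0, 1, 1, 1, 0]))  # a burst of AU 12 frames scores highest under "smile"
```

In a real system, each expression's model would be trained on many labeled video sequences rather than written down by hand, and the observations would come from automatic tracking of facial features.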
In addition to the potential applications listed above,
Cohn believes automated face analysis could prove valuable
to such diverse clients as product marketers and
politicians, who could really see how well their messages
are being received. And then there's the customer service
angle, Pentland says: "recognizing the valued customer
instantly when he or she enters the store, or noticing when
someone looks like they could use some help."
Still, don't expect to be looked at so closely by
computers anytime soon. These systems are just in the
development stage at both MIT and Carnegie Mellon. Says
Cohn, "I would see two to four years as a likely deployment
- Tom Field