Even if an android’s appearance is so realistic that it could be mistaken for a human in a photograph, watching it move in person can feel unsettling. It can smile, frown, or display other familiar expressions, but it is difficult to discern a consistent emotional state behind them, leaving you unsure of what it is truly feeling and creating a sense of unease.
Until now, a ‘patchwork method’ has been used to let robots that can move many parts of their face, such as androids, display facial expressions for extended periods. This method involves preparing multiple pre-arranged action scenarios and switching between them as needed, while ensuring that unnatural facial movements are excluded.
However, this approach poses practical challenges: complex action scenarios must be prepared in advance, noticeable unnatural movements during transitions must be minimized, and movements must be fine-tuned to subtly control the expressions conveyed.
In this study, lead author Hisashi Ishihara and his research group developed a dynamic facial expression synthesis technology using “waveform movements,” which represents the various gestures that constitute facial movements, such as “breathing,” “blinking,” and “yawning,” as individual waves. These waves are propagated to the related facial areas and overlaid to generate complex facial movements in real time. This method eliminates the need to prepare complex and diverse action data while also avoiding noticeable movement transitions.
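The idea can be pictured with a minimal sketch: each gesture is a simple periodic wave with its own rhythm and a gain for each facial area it reaches, and the waves are summed at every actuator to produce one frame of motion. The gesture names, periods, gains, actuator labels, and plain sine waveforms below are illustrative assumptions for the sake of the example, not the researchers’ actual model.

```python
import math

# Hypothetical facial actuators that the gesture waves can reach.
ACTUATORS = ["brow", "eyelid", "cheek", "jaw"]

# Each gesture is modelled as a sinusoid with its own period (seconds) and
# per-actuator gains, standing in for "propagation to the related facial areas".
GESTURES = {
    "breathing": {"period": 4.0,  "gains": {"cheek": 0.2, "jaw": 0.4}},
    "blinking":  {"period": 3.0,  "gains": {"eyelid": 1.0, "brow": 0.1}},
    "yawning":   {"period": 15.0, "gains": {"jaw": 1.0, "eyelid": 0.3, "brow": 0.2}},
}

def wave(t, period):
    """One gesture's waveform value at time t (a plain sine for illustration)."""
    return math.sin(2.0 * math.pi * t / period)

def compose_frame(t):
    """Overlay every gesture wave onto each actuator to get one motion frame."""
    frame = {a: 0.0 for a in ACTUATORS}
    for gesture in GESTURES.values():
        v = wave(t, gesture["period"])
        for actuator, gain in gesture["gains"].items():
            frame[actuator] += gain * v   # superposition of the individual waves
    return frame

if __name__ == "__main__":
    # Stream frames in (simulated) real time; no pre-scripted scenarios are needed.
    for step in range(5):
        t = step * 0.1
        print(round(t, 1), compose_frame(t))
```

Because each frame is computed on the fly from the currently active waves, there is no switching between prepared scenarios and therefore no transition to hide.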
Furthermore, by introducing “waveform modulation,” which adjusts the individual waveforms based on the robot’s internal state, changes in internal conditions, such as mood, can be instantly reflected as variations in facial movements.
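Waveform modulation can be illustrated in the same spirit: the internal state reshapes a wave before it drives the face. The single “arousal” value and the way it scales speed and amplitude in the sketch below are assumptions made for illustration; the published work does not necessarily map internal state to waveform parameters this way.

```python
import math

# One hypothetical gesture wave: base period (seconds) and base amplitude.
BLINK = {"period": 3.0, "amplitude": 1.0}

def modulated_wave(t, gesture, arousal):
    """Value of a gesture wave at time t, reshaped by an internal 'arousal'
    level in [0, 1]: higher arousal here means faster, larger movement."""
    period = gesture["period"] / (0.5 + arousal)        # excited -> shorter period
    amplitude = gesture["amplitude"] * (0.5 + arousal)  # excited -> bigger motion
    return amplitude * math.sin(2.0 * math.pi * t / period)

if __name__ == "__main__":
    for arousal in (0.1, 0.9):  # calm vs. excited internal state
        samples = [round(modulated_wave(s * 0.5, BLINK, arousal), 2) for s in range(4)]
        print(arousal, samples)
```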
“Advancing this research in dynamic
facial expression synthesis will enable robots capable of complex facial
movements to exhibit more lively expressions and convey mood changes that
respond to their surrounding circumstances, including interactions with
humans,” says senior author Koichi Osuka. “This could greatly enrich emotional
communication between humans and robots.”
Ishihara adds, “Rather than creating
superficial movements, further development of a system in which internal
emotions are reflected in every detail of an android’s actions could lead to
the creation of androids perceived as having a heart.”
By enabling robots to adaptively adjust and express emotions, this technology is expected to significantly enhance the value of communication robots, allowing them to exchange information with humans in a more natural, humanlike manner.
Source: https://resou.osaka-u.ac.jp/en/research/2024/20241223_2
Image Credit: Hisashi Ishihara