1 code implementation • 4 Oct 2021 • Zhaojie Luo, Shoufeng Lin, Rui Liu, Jun Baba, Yuichiro Yoshikawa, Hiroshi Ishiguro
We note that the decoupling of emotional features from other speech information (such as speaker identity and linguistic content)
no code implementations • 15 Sep 2021 • Shuyun Tang, Zhaojie Luo, Guoshun Nan, Yuichiro Yoshikawa, Hiroshi Ishiguro
Automatic emotion recognition (AER) based on enriched multimodal inputs, including text, speech, and visual cues, is crucial to the development of emotionally intelligent machines.
no code implementations • 11 Mar 2021 • Yuki Tamaru, Yasunori Ozaki, Yuki Okafuji, Junya Nakanishi, Yuichiro Yoshikawa, Jun Baba
For a humanoid robot to make eye contact and initiate communication with a person, it is necessary to estimate the person's head position.
1 code implementation • 4 Mar 2020 • Changzeng Fu, Chaoran Liu, Carlos Toshinori Ishi, Yuichiro Yoshikawa, Hiroshi Ishiguro
Text categorization is the task of assigning labels to documents written in a natural language, and it has numerous real-world applications, including sentiment analysis and traditional topic assignment tasks.
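To make the task concrete, here is a minimal sketch of text categorization, not taken from the paper above: a hypothetical keyword-based sentiment categorizer in pure Python, with the word lists and label names chosen purely for illustration.

```python
from collections import Counter

# Illustrative only (not the paper's method): a tiny bag-of-words
# categorizer that labels a document by counting label-indicative words.
# The keyword sets below are hypothetical examples.
POSITIVE = {"great", "excellent", "love", "good", "wonderful"}
NEGATIVE = {"bad", "terrible", "hate", "poor", "awful"}

def categorize(document: str) -> str:
    """Assign 'positive', 'negative', or 'neutral' to a document."""
    tokens = Counter(document.lower().split())
    pos = sum(tokens[w] for w in POSITIVE)
    neg = sum(tokens[w] for w in NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(categorize("The plot was great and the acting excellent"))  # positive
```

Real systems replace the hand-built keyword sets with features and weights learned from labeled documents, but the input-output contract (document in, label out) is the same.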
no code implementations • WS 2018 • Kazuki Sakai, Ryuichiro Higashinaka, Yuichiro Yoshikawa, Hiroshi Ishiguro, Junji Tomita
The results suggest that inserting the question-answer dialogue enhances familiarity and naturalness.
no code implementations • 28 Feb 2017 • Ahmed Hussain Qureshi, Yutaka Nakamura, Yuichiro Yoshikawa, Hiroshi Ishiguro
For safe, natural, and effective human-robot social interaction, it is essential to develop a system that allows a robot to exhibit perceivable, responsive behaviors in reaction to complex human behaviors.
no code implementations • 24 Feb 2017 • Ahmed Hussain Qureshi, Yutaka Nakamura, Yuichiro Yoshikawa, Hiroshi Ishiguro
For robots to coexist with humans in a social world like ours, it is crucial that they possess human-like social interaction skills.