Search Results for author: Che-Jui Chang

Found 5 papers, 2 papers with code

BattleAgent: Multi-modal Dynamic Emulation on Historical Battles to Complement Historical Analysis

1 code implementation • 23 Apr 2024 • Shuhang Lin, Wenyue Hua, Lingyao Li, Che-Jui Chang, Lizhou Fan, Jianchao Ji, Hang Hua, Mingyu Jin, Jiebo Luo, Yongfeng Zhang

This novel system aims to simulate complex dynamic interactions among multiple agents, as well as between agents and their environments, over a period of time.

Decision Making • Language Modelling

On the Equivalency, Substitutability, and Flexibility of Synthetic Data

no code implementations • 24 Mar 2024 • Che-Jui Chang, Danrui Li, Seonghyeon Moon, Mubbasir Kapadia

In addition, our study of the impact of synthetic data distributions on downstream performance reveals the importance of flexible data generators in narrowing domain gaps for improved model adaptability.

The Importance of Multimodal Emotion Conditioning and Affect Consistency for Embodied Conversational Agents

no code implementations • 26 Sep 2023 • Che-Jui Chang, Samuel S. Sohn, Sen Zhang, Rajath Jayashankar, Muhammad Usman, Mubbasir Kapadia

We conducted a user study with 199 participants to assess how the average person judges the affect perceived from multimodal behaviors that are consistent or inconsistent with a driving affect.

Transfer Learning from Monolingual ASR to Transcription-free Cross-lingual Voice Conversion

1 code implementation • 30 Sep 2020 • Che-Jui Chang

Cross-lingual voice conversion (VC) is a task that aims to synthesize speech in a target speaker's voice while preserving the source content, where the source and target speakers speak different languages.

Transfer Learning • Voice Conversion
