Search Results for author: Shuai Tan

Found 5 papers, 0 papers with code

EDTalk: Efficient Disentanglement for Emotional Talking Head Synthesis

no code implementations • 2 Apr 2024 • Shuai Tan, Bin Ji, Mengxiao Bi, Ye Pan

Achieving disentangled control over multiple facial motions and accommodating diverse input modalities greatly enhances the applicability and entertainment value of talking head generation.

Disentanglement · Talking Head Generation

FlowVQTalker: High-Quality Emotional Talking Face Generation through Normalizing Flow and Quantization

no code implementations • 11 Mar 2024 • Shuai Tan, Bin Ji, Ye Pan

Specifically, we develop a flow-based coefficient generator that encodes the dynamics of facial emotion into a multi-emotion-class latent space represented as a mixture distribution.

Quantization · Talking Face Generation
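The flow-based coefficient generator described in this abstract maps expression coefficients into a latent space whose prior is a mixture distribution with one component per emotion class. Below is a minimal PyTorch sketch of that general idea, assuming RealNVP-style affine coupling layers and a unit-variance Gaussian component per emotion; all class names, dimensions, and layer counts are illustrative, not the authors' implementation.

```python
# Hypothetical sketch: a normalizing flow over expression coefficients
# with a mixture-of-Gaussians prior, one component per emotion class.
# Not the FlowVQTalker code; names and sizes are illustrative.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style coupling: transforms half the dims conditioned on the rest."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim // 2, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),  # predicts scale and shift for the other half
        )

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                      # bound scales for stability
        y2 = x2 * torch.exp(s) + t
        return torch.cat([x1, y2], dim=-1), s.sum(-1)  # output, log|det J|

class EmotionFlow(nn.Module):
    """Maps coefficients to a latent space where each emotion class owns
    one unit-variance Gaussian component of the mixture prior."""
    def __init__(self, dim=64, num_emotions=8, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(AffineCoupling(dim) for _ in range(num_layers))
        self.means = nn.Parameter(torch.randn(num_emotions, dim) * 0.1)

    def log_prob(self, coeffs, emotion_id):
        z, log_det = coeffs, coeffs.new_zeros(coeffs.shape[0])
        for layer in self.layers:
            z, ld = layer(z)
            log_det = log_det + ld
            z = z.flip(-1)                     # mix halves between layers
        mu = self.means[emotion_id]            # (batch, dim) component means
        log_pz = -0.5 * ((z - mu) ** 2).sum(-1)  # Gaussian log-density up to a constant
        return log_pz + log_det
```

Training such a model would maximize log_prob over (coefficient, emotion) pairs; generation would sample near the chosen emotion's mean and invert the coupling layers (the inverse pass is omitted here for brevity).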

Style2Talker: High-Resolution Talking Head Generation with Emotion Style and Art Style

no code implementations • 11 Mar 2024 • Shuai Tan, Bin Ji, Ye Pan

Although automatically animating audio-driven talking heads has recently received growing interest, previous efforts have mainly concentrated on achieving lip synchronization with the audio, neglecting two crucial elements for generating expressive videos: emotion style and art style.

Talking Face Generation · Talking Head Generation

Say Anything with Any Style

no code implementations • 11 Mar 2024 • Shuai Tan, Bin Ji, Yu Ding, Ye Pan

To adapt to different speaking styles, we steer clear of a universal network and instead explore an elaborate HyperStyle to produce style-specific weight offsets for the style branch.
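The "style-specific weight offsets" above follow a hypernetwork pattern: a small network maps a style code to an offset that is added to shared base weights, so each speaking style effectively gets its own parameters without training a separate network per style. Here is a hedged sketch for a single linear layer; `HyperLinear`, the hidden width, and the 0.1 offset scale are assumptions, not the paper's HyperStyle module.

```python
# Hypothetical sketch: a hypernetwork that turns a style code into weight
# offsets for one layer of a style branch. Illustrative only.
import torch
import torch.nn as nn

class HyperLinear(nn.Module):
    """Linear layer whose effective weights are shared base weights plus a
    style-conditioned offset predicted by a small hypernetwork."""
    def __init__(self, in_dim, out_dim, style_dim, hidden=64):
        super().__init__()
        self.base_weight = nn.Parameter(torch.empty(out_dim, in_dim))
        nn.init.kaiming_uniform_(self.base_weight)
        self.bias = nn.Parameter(torch.zeros(out_dim))
        self.hyper = nn.Sequential(            # style code -> flat weight offset
            nn.Linear(style_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim * in_dim),
        )
        self.out_dim, self.in_dim = out_dim, in_dim

    def forward(self, x, style_code):
        # One offset weight matrix per style code in the batch.
        delta = self.hyper(style_code).view(-1, self.out_dim, self.in_dim)
        w = self.base_weight.unsqueeze(0) + 0.1 * delta  # assumed small-offset scale
        return torch.einsum("boi,bi->bo", w, x) + self.bias
```

The small multiplier on the predicted offset keeps early training close to the shared weights, a common stabilization choice for hypernetworks.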

EMMN: Emotional Motion Memory Network for Audio-driven Emotional Talking Face Generation

no code implementations • ICCV 2023 • Shuai Tan, Bin Ji, Ye Pan

During training, the emotion embedding and mouth features are used as keys, and the corresponding expression features are used as values to create key-value pairs stored in the proposed Motion Memory Net.

Talking Face Generation
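The key-value storage described in the EMMN abstract can be sketched as attention over a memory: a query built from the emotion embedding and mouth features attends to slot keys and returns a blend of stored expression features. In the sketch below the memory is a set of learned parameters, a simplification of the training-time write described in the abstract; the slot count, dimensions, and query projection are illustrative assumptions.

```python
# Hypothetical sketch: attention over a key-value memory where queries come
# from an emotion embedding plus mouth features and values hold expression
# features. Dimensions and slot count are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MotionMemory(nn.Module):
    def __init__(self, emo_dim=32, mouth_dim=96, num_slots=256,
                 key_dim=128, value_dim=64):
        super().__init__()
        self.query_proj = nn.Linear(emo_dim + mouth_dim, key_dim)
        self.keys = nn.Parameter(torch.randn(num_slots, key_dim))      # memory keys
        self.values = nn.Parameter(torch.randn(num_slots, value_dim))  # expression features

    def forward(self, emotion_emb, mouth_feat):
        # Build a query from the emotion embedding and mouth features.
        query = self.query_proj(torch.cat([emotion_emb, mouth_feat], dim=-1))
        scores = query @ self.keys.t() / self.keys.shape[-1] ** 0.5
        attn = F.softmax(scores, dim=-1)       # soft address over memory slots
        return attn @ self.values              # recalled expression features
```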
