no code implementations • 12 Feb 2024 • Md Musfiqur Rahman, Matt Jordan, Murat Kocaoglu
As an application of our algorithm, we evaluate two large conditional generative models that are pre-trained on the CelebA dataset by analyzing the strength of spurious correlations and the level of disentanglement they achieve.
no code implementations • 2 Jan 2024 • Md Musfiqur Rahman, Murat Kocaoglu
To address this, we propose a sequential training algorithm that, given the causal structure and a pre-trained conditional generative model, trains a deep causal generative model; the resulting model leverages the pre-trained component and can provably sample from identifiable interventional and counterfactual distributions.
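The core idea of sampling interventional distributions from a causal structure can be illustrated with a minimal sketch (not the authors' implementation): nodes are sampled in topological order, each from a conditional model given its sampled parents, and an intervention `do(X=x)` simply overrides that node's mechanism. The graph, node names, and the toy lambda "models" below are all illustrative stand-ins for pre-trained conditional generative models.

```python
import random

# Hypothetical causal structure A -> B -> C, listed in topological order.
graph = {"A": [], "B": ["A"], "C": ["B"]}
order = ["A", "B", "C"]

# Toy stand-ins for pre-trained conditional generative models:
# each maps the sampled parent values to a sample of the node.
models = {
    "A": lambda parents: random.gauss(0.0, 1.0),
    "B": lambda parents: 2.0 * parents["A"] + random.gauss(0.0, 0.1),
    "C": lambda parents: parents["B"] ** 2 + random.gauss(0.0, 0.1),
}

def sample(do=None):
    """Ancestral sampling; `do` fixes intervened nodes, severing their
    dependence on parents (graph surgery)."""
    do = do or {}
    values = {}
    for node in order:
        if node in do:
            values[node] = do[node]  # intervention overrides the mechanism
        else:
            parents = {p: values[p] for p in graph[node]}
            values[node] = models[node](parents)
    return values

obs = sample()                 # one observational sample of (A, B, C)
intv = sample(do={"B": 1.0})   # one sample from P(A, C | do(B = 1))
```

Counterfactual sampling additionally requires inferring the exogenous noise from an observed sample before replaying the mechanisms under the intervention, which is where the identifiability conditions discussed in the abstract become essential.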