1 code implementation • 1 Dec 2023 • Tianyu Ding, Tianyi Chen, Haidong Zhu, Jiachen Jiang, Yiqi Zhong, Jinxin Zhou, Guangzhi Wang, Zhihui Zhu, Ilya Zharkov, Luming Liang
The rapid growth of Large Language Models (LLMs) has been a driving force in transforming various domains, reshaping the artificial general intelligence landscape.
1 code implementation • 30 Nov 2023 • Jinxin Zhou, Tianyu Ding, Tianyi Chen, Jiachen Jiang, Ilya Zharkov, Zhihui Zhu, Luming Liang
We present DREAM, a novel training framework (Diffusion Rectification and Estimation Adaptive Models) that requires minimal code changes (just three lines) yet significantly improves the alignment between training and sampling in diffusion models.
no code implementations • 9 Oct 2023 • Jiachen Jiang, Jinxin Zhou, Peng Wang, Qing Qu, Dustin Mixon, Chong You, Zhihui Zhu
However, most existing empirical and theoretical studies of neural collapse focus on the case where the number of classes is small relative to the dimension of the feature space.
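Neural collapse is often diagnosed by measuring how much last-layer features vary within each class relative to the separation between class means. A minimal sketch of such a within-class variability metric is below; the exact normalization varies from paper to paper, so treat this as an illustrative assumption rather than the metric used in this work.

```python
import numpy as np

def within_class_variability(features: np.ndarray, labels: np.ndarray) -> float:
    """NC1-style diagnostic: trace of within-class scatter relative to
    between-class scatter. Definitions differ across the neural-collapse
    literature; this is one common variant, not the paper's exact metric."""
    classes = np.unique(labels)
    global_mean = features.mean(axis=0)
    dim = features.shape[1]
    sw = np.zeros((dim, dim))  # within-class scatter
    sb = np.zeros((dim, dim))  # between-class scatter
    for c in classes:
        fc = features[labels == c]
        mu = fc.mean(axis=0)
        centered = fc - mu
        sw += centered.T @ centered / len(fc)
        d = (mu - global_mean)[:, None]
        sb += d @ d.T
    sw /= len(classes)
    sb /= len(classes)
    # Zero when every sample equals its class mean (perfect collapse).
    return float(np.trace(sw @ np.linalg.pinv(sb)))

# Perfectly collapsed toy features: each class's samples are identical.
feats = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
labs = np.array([0, 0, 1, 1])
print(within_class_variability(feats, labs))  # 0.0 for perfect collapse
```

When the number of classes approaches or exceeds the feature dimension — the regime this paper studies — the between-class scatter can no longer form a simplex equiangular tight frame, which is why the small-class assumption matters.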
1 code implementation • 11 Mar 2021 • Ryan-Rhys Griffiths, Jiachen Jiang, Douglas J. K. Buisson, Dan R. Wilkins, Luigi C. Gallo, Adam Ingram, Alpha A. Lee, Dirk Grupe, Erin Kara, Michael L. Parker, William Alston, Anthony Bourached, George Cann, Andrew Young, Stefanie Komossa
Such a reprocessing model would be characterised by lags between X-ray and optical/UV emission due to differences in light travel time.
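In a reprocessing picture, the optical/UV lag behind the X-rays is set by the extra light-travel distance from the corona to the reprocessing region. A minimal sketch of that geometric relation (lag = extra path length / c) is below; the distance value is a hypothetical placeholder, not a fit from the paper.

```python
# Speed of light in metres per second.
C = 299_792_458.0

def light_travel_lag_seconds(extra_path_metres: float) -> float:
    """Lag between X-ray and reprocessed optical/UV emission implied by an
    extra light-travel path length (purely geometric; ignores transfer-
    function smearing over an extended reprocessor)."""
    return extra_path_metres / C

# Hypothetical example: a reprocessing region 100 light-seconds farther
# from the observer's line of sight than the X-ray source.
lag = light_travel_lag_seconds(100 * C)
print(lag)  # 100.0 seconds
```

In practice such lags are estimated from cross-correlations of irregularly sampled light curves, which is where the Gaussian-process modelling of this paper comes in.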
Tasks: Gaussian Processes · High Energy Astrophysical Phenomena