no code implementations • 15 Apr 2022 • Tong Yang, Yifei Wang, Long Sha, Jan Engelbrecht, Pengyu Hong
To the best of our knowledge, by applying abstract algebra to statistical learning, this work develops the first formal language for general knowledge graphs; it also sheds light on the problem of neural-symbolic integration from an algebraic perspective.
no code implementations • NeurIPS 2021 • Zhixuan Yu, Haozheng Yu, Long Sha, Sujoy Ganguly, Hyun Soo Park
(2) Geometric consistency: the points in the continuous correspondence fields must collectively satisfy multiview consistency.
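The multiview-consistency constraint above can be illustrated with a toy pinhole-camera check: a candidate 3D point is consistent only if its reprojection matches the observed 2D correspondence in every view. This is a minimal illustrative sketch with hypothetical cameras and points, not the authors' method; `project` and `multiview_consistent` are made-up names for this example.

```python
import math

def project(point3d, cam_center, focal=1.0):
    """Pinhole projection of a 3D point for a camera looking down +z,
    translated to cam_center (deliberately simplified camera model)."""
    x, y, z = (p - c for p, c in zip(point3d, cam_center))
    return (focal * x / z, focal * y / z)

def multiview_consistent(point3d, cameras, observations, tol=1e-6):
    """Check that the reprojection error is below tol in every view."""
    errs = [math.dist(project(point3d, c), obs)
            for c, obs in zip(cameras, observations)]
    return max(errs) < tol

# Two cameras observing the same 3D point
X = (0.5, 0.25, 4.0)
cams = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
obs = [project(X, c) for c in cams]
print(multiview_consistent(X, cams, obs))                # True: the true point
print(multiview_consistent((0.5, 0.3, 4.0), cams, obs))  # False: a perturbed point
```

The point is that consistency is a joint condition over all views at once: perturbing the 3D point breaks agreement with at least one camera.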
no code implementations • 13 Aug 2020 • Tong Yang, Long Sha, Pengyu Hong
While most gradient-based optimization methods focus on exploiting high-dimensional geometric features, the random error that accumulates in stochastic implementations of these algorithms has received little attention.
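The accumulation of random error in a stochastic optimizer can be seen on a one-dimensional quadratic: exact gradient descent contracts to the optimum, while the stochastic iterate settles into a noise floor. This is an illustrative sketch of the general phenomenon under assumed parameters, not the paper's analysis; the function and noise model are hypothetical.

```python
import random

random.seed(0)

def descend(steps=200, lr=0.1, noise=0.5):
    """Gradient descent on f(x) = x^2 / 2 (exact gradient = x), run twice:
    once with the exact gradient and once with zero-mean Gaussian noise
    added to mimic a stochastic gradient estimate."""
    x_exact = x_sgd = 1.0
    for _ in range(steps):
        x_exact -= lr * x_exact
        x_sgd -= lr * (x_sgd + random.gauss(0.0, noise))
    return x_exact, x_sgd

x_exact, x_sgd = descend()
print(abs(x_exact))  # contracts geometrically toward 0
print(abs(x_sgd))    # stalls at a noise floor set by lr and noise variance
```

Even though the per-step noise has zero mean, its variance does not vanish, so the stochastic iterate fluctuates around the optimum rather than converging to it.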
no code implementations • 9 Aug 2020 • Tong Yang, Long Sha, Justin Li, Pengyu Hong
In this work, we developed a deep-learning-based approach to forecast the spreading trend of SARS-CoV-2 in the United States.
no code implementations • CVPR 2020 • Long Sha, Jennifer Hobbs, Panna Felsen, Xinyu Wei, Patrick Lucey, Sujoy Ganguly
We evaluate our method on a new college basketball dataset and demonstrate state-of-the-art performance in variable and dynamic environments.
no code implementations • 22 May 2020 • Tong Yang, Long Sha, Pengyu Hong
We demonstrated the existence of a group algebraic structure hidden in relational knowledge embedding problems, which suggests that a group-based embedding framework is essential for designing embedding models.
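The group-based view of relation embedding can be sketched by representing relations as unit complex numbers, i.e. elements of the rotation group U(1), in the spirit of rotation-based embedding models. This is a minimal hypothetical example checking the group axioms (closure, identity, inverse), not the specific framework the paper proposes.

```python
import cmath

def rel(theta):
    """Embed a relation as a unit complex number (a planar rotation)."""
    return cmath.exp(1j * theta)

def apply(entity, relation):
    """Applying a relation rotates the entity embedding."""
    return entity * relation

r1, r2 = rel(0.3), rel(1.1)

# Closure: the composition of two relations is again a unit rotation
assert abs(abs(r1 * r2) - 1.0) < 1e-12

# Identity and inverse: rel(0) is the identity, the conjugate inverts
assert abs(r1 * r1.conjugate() - rel(0.0)) < 1e-12

# A triple (h, r, t) holds when applying r to h lands on t
h, t = rel(0.7), rel(1.0)           # entities placed on the unit circle
assert abs(apply(h, r1) - t) < 1e-12
print("group axioms hold for rotation-based relation embeddings")
```

Because composition, inversion, and the identity are all available inside the embedding space, composite relations (e.g. relation chains and their reversals) can be modeled without leaving it, which is the structural property a group-based framework guarantees.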
no code implementations • ICLR 2020 • Xin Xing, Long Sha, Pengyu Hong, Zuofeng Shang, Jun S. Liu
Deep neural networks (DNNs) can be huge, requiring a considerable amount of energy and computational resources to operate, which limits their applications in numerous scenarios.
no code implementations • 30 Dec 2019 • Jennifer Hobbs, Matthew Holbrook, Nathan Frank, Long Sha, Patrick Lucey
Central to all machine learning algorithms is data representation.
no code implementations • 25 Sep 2019 • Tong Yang, Long Sha, Pengyu Hong
We have rigorously proved the existence of a group algebraic structure hidden in relational knowledge embedding problems, which suggests that a group-based embedding framework is essential for model design.
no code implementations • ICLR 2019 • Long Sha, Jonathan Schwarcz, Pengyu Hong
This modification produces statistically significant improvements in comparison with traditional ANN nodes in the context of Convolutional Neural Networks and Long Short-Term Memory networks.
2 code implementations • ICLR 2019 • Eric Zhan, Stephan Zheng, Yisong Yue, Long Sha, Patrick Lucey
We study the problem of training sequential generative models for capturing coordinated multi-agent trajectory behavior, such as offensive basketball gameplay.