no code implementations • 1 Apr 2024 • Shourya Bose, Yu Zhang, Kibaek Kim
The advent of smart meters has enabled pervasive collection of energy consumption data for training short-term load forecasting models.
1 code implementation • 28 Feb 2024 • Guangji Bai, Yijiang Li, Chen Ling, Kibaek Kim, Liang Zhao
The transformative impact of large language models (LLMs) like LLaMA and GPT on natural language processing is countered by their prohibitive computational demands.
no code implementations • 19 Feb 2024 • Zilinghan Li, Shilan He, Pranshu Chaturvedi, Volodymyr Kindratenko, Eliu A Huerta, Kibaek Kim, Ravi Madduri
Federated learning enables multiple data owners to collaboratively train robust machine learning models without transferring large or sensitive local datasets, by sharing only the parameters of the locally trained models.
no code implementations • 11 Jan 2024 • Sihan Zeng, Youngdae Kim, Yuxuan Ren, Kibaek Kim
At the heart of power system operations, alternating current optimal power flow (ACOPF) studies the generation of electric power in the most economical way under network-wide load requirements, and can be formulated as a highly structured non-convex quadratically constrained quadratic program (QCQP).
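For context, the generic form of the QCQP class the abstract refers to can be written as follows (this is the standard textbook form, not the paper's specific ACOPF formulation; non-convexity arises when the matrices $Q_i$ are indefinite):

$$
\min_{x \in \mathbb{R}^n} \; x^\top Q_0 x + c_0^\top x
\quad \text{s.t.} \quad
x^\top Q_i x + c_i^\top x \le b_i, \quad i = 1, \dots, m.
$$

In ACOPF, $x$ collects the real and imaginary parts of bus voltages, and the quadratic constraints encode the power-flow physics.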
no code implementations • 21 Nov 2023 • Shourya Bose, Yu Zhang, Kibaek Kim
The widespread adoption of smart meters provides access to detailed and localized load consumption data, suitable for training building-level load forecasting models.
1 code implementation • 26 Sep 2023 • Zilinghan Li, Pranshu Chaturvedi, Shilan He, Han Chen, Gagandeep Singh, Volodymyr Kindratenko, E. A. Huerta, Kibaek Kim, Ravi Madduri
Nonetheless, because of the disparity of computing resources among different clients (i.e., device heterogeneity), synchronous federated learning algorithms suffer from degraded efficiency when waiting for straggler clients.
no code implementations • 22 Sep 2023 • Shourya Bose, Kibaek Kim
The advent of smart meters has enabled pervasive collection of energy consumption data for training short-term load forecasting (STLF) models.
1 code implementation • 17 Aug 2023 • Zilinghan Li, Shilan He, Pranshu Chaturvedi, Trung-Hieu Hoang, Minseok Ryu, E. A. Huerta, Volodymyr Kindratenko, Jordan Fuhrman, Maryellen Giger, Ryan Chard, Kibaek Kim, Ravi Madduri
Cross-silo privacy-preserving federated learning (PPFL) is a powerful tool to collaboratively train robust and generalized machine learning (ML) models without sharing sensitive (e.g., healthcare or financial) local data.
no code implementations • 28 Feb 2023 • Minseok Ryu, Kibaek Kim
This paper considers distributed optimization (DO) where multiple agents cooperate to minimize a global objective function, expressed as a sum of local objectives, subject to some constraints.
no code implementations • 18 Feb 2022 • Minseok Ryu, Kibaek Kim
Differential privacy (DP) techniques can be applied to the federated learning model to statistically guarantee data privacy against inference attacks on the communication among the learning agents.
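As a minimal sketch of the idea (not the paper's specific mechanism), the classic Gaussian mechanism clips a client's model update to bound its sensitivity and then adds calibrated noise before it is communicated; the function name and parameters below are illustrative:

```python
import numpy as np

def gaussian_mechanism(update, clip_norm, epsilon, delta):
    """Clip an update to L2 norm `clip_norm`, then add Gaussian noise
    calibrated for (epsilon, delta)-differential privacy."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    # Standard Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + np.random.normal(0.0, sigma, size=update.shape)
```

Each agent would apply such a mechanism to its local update before sharing it, so that an eavesdropper on the communication channel cannot confidently infer individual training records.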
1 code implementation • 8 Feb 2022 • Minseok Ryu, Youngdae Kim, Kibaek Kim, Ravi K. Madduri
Federated learning (FL) enables training models at different sites and aggregating the locally trained weights, instead of transferring data to a central location for training as in classical machine learning.
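The weight-aggregation step described above can be sketched as a FedAvg-style weighted average (a generic illustration, not the APPFL framework's actual API):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Average client model parameters, weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Two clients with equal data contribute equally to the global model.
global_w = fedavg([np.array([0.0, 0.0]), np.array([2.0, 2.0])], [50, 50])
# → array([1., 1.])
```

In practice the server repeats this aggregation every communication round, sending the averaged weights back to the clients for further local training.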
no code implementations • 22 Oct 2021 • Sihan Zeng, Alyssa Kody, Youngdae Kim, Kibaek Kim, Daniel K. Molzahn
We train our RL policy using deep Q-learning, and show that this policy can result in significantly accelerated convergence (up to a 59% reduction in the number of iterations compared to existing, curvature-informed penalty parameter selection methods).
no code implementations • 11 Jun 2021 • Minseok Ryu, Kibaek Kim
Differential privacy (DP) techniques can be applied to the federated learning model to protect data privacy against inference attacks on the communication among the learning agents.