1 code implementation • 5 Feb 2024 • Riccardo Grazzi, Julien Siems, Simon Schrodi, Thomas Brox, Frank Hutter
State-of-the-art foundation models such as GPT-4 perform surprisingly well at in-context learning (ICL), a variant of meta-learning in which the model learns to solve tasks during a single forward pass by exploiting contextual information provided as part of its input.
1 code implementation • NeurIPS 2023 • Julien Siems, Konstantin Ditschuneit, Winfried Ripken, Alma Lindborg, Maximilian Schambach, Johannes S. Otterbach, Martin Genzel
Generalized Additive Models (GAMs) have recently experienced a resurgence in popularity due to their interpretability, which arises from expressing the target value as a sum of non-linear transformations of the features.
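The additive structure that makes GAMs interpretable can be written in the standard form, where the (possibly link-transformed) prediction is a sum of per-feature shape functions:

```latex
g\!\left(\mathbb{E}[y \mid x]\right) = \beta_0 + \sum_{i=1}^{d} f_i(x_i)
```

Each $f_i$ is a learned non-linear transformation of a single feature $x_i$, so every feature's contribution can be inspected in isolation by plotting $f_i$.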
no code implementations • 18 Mar 2023 • Julien Siems, Maximilian Schambach, Sebastian Schulze, Johannes S. Otterbach
In this work, we focus on developing dynamic inventory ordering policies for a multi-echelon, i.e., multi-stage, supply chain.
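For orientation, the classical static baseline in this setting is a base-stock (order-up-to) rule, which dynamic policies aim to improve upon. A minimal single-echelon sketch, assuming zero lead time and synthetic demand (the rule, names, and numbers here are illustrative, not the paper's method):

```python
import random

random.seed(0)

def order_up_to(inventory_position, base_stock_level):
    """Classical base-stock rule: order just enough to restore the
    inventory position to a fixed target level (orders are non-negative)."""
    return max(0, base_stock_level - inventory_position)

# Simulate one stage with random demand over a few periods.
base_stock = 20
on_hand = 20
for period in range(5):
    demand = random.randint(0, 10)
    on_hand = max(0, on_hand - demand)        # serve demand from stock
    order = order_up_to(on_hand, base_stock)  # replenish up to target
    on_hand += order                          # zero lead time assumed
```

With zero lead time the rule trivially restores the target every period; lead times, holding/backlog costs, and multiple echelons are what make the dynamic policy design non-trivial.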
1 code implementation • 31 Aug 2022 • Jakob Weissteiner, Jakob Heiss, Julien Siems, Sven Seuken
In this paper, we address this shortcoming by presenting a Bayesian optimization-based combinatorial assignment (BOCA) mechanism.
1 code implementation • ICLR 2022 • Arber Zela, Julien Siems, Lucas Zimmer, Jovita Lukasik, Margret Keuper, Frank Hutter
We show that surrogate NAS benchmarks can model the true performance of architectures better than tabular benchmarks (at a small fraction of the cost), that they lead to faithful estimates of how well different NAS methods work on the original non-surrogate benchmark, and that they can generate new scientific insight.
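The surrogate idea can be sketched minimally: fit a regression model on (architecture-encoding, performance) pairs collected from real training runs, then query the model instead of training new architectures. A hypothetical sketch with synthetic encodings and scores (the actual benchmarks use more sophisticated models, e.g. gradient-boosted trees and graph neural networks):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical setup: each architecture is a fixed-length binary encoding
# of its operations; "accuracy" stands in for the expensive ground-truth
# training result a tabular benchmark would record.
n_archs, enc_dim = 500, 16
encodings = rng.integers(0, 2, size=(n_archs, enc_dim)).astype(float)
accuracy = encodings @ rng.normal(size=enc_dim) + 0.1 * rng.normal(size=n_archs)

# Fit the surrogate on the collected (encoding, performance) pairs.
surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(encodings, accuracy)

# A NAS method can now query predicted performance for unseen
# architectures at negligible cost, instead of training each candidate.
new_archs = rng.integers(0, 2, size=(5, enc_dim)).astype(float)
predicted = surrogate.predict(new_archs)
```

Because the surrogate generalizes across encodings, it can also score architectures that were never trained, which is what lets surrogate benchmarks cover spaces far larger than any tabular benchmark.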
1 code implementation • ICLR 2020 • Arber Zela, Julien Siems, Frank Hutter
One-shot neural architecture search (NAS) has played a crucial role in making NAS methods computationally feasible in practice.