no code implementations • 16 Feb 2023 • Gerhard Paaß, Sven Giesselbach
When the models are large enough, they can be instructed by prompts to solve new tasks without any fine-tuning.
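The prompting idea mentioned above can be sketched minimally as follows; the task wording, examples, and labels are invented for illustration, and a real system would pass the resulting string to a large language model rather than print it:

```python
def build_prompt(instruction, examples, query):
    # Few-shot prompt: the model is "instructed" purely through text,
    # with no gradient updates (fine-tuning) of its weights.
    lines = [instruction, ""]
    for text, label in examples:
        lines.append(f"Text: {text}\nLabel: {label}")
    lines.append(f"Text: {query}\nLabel:")
    return "\n".join(lines)

prompt = build_prompt(
    "Classify the sentiment of each text as positive or negative.",
    [("I loved this film.", "positive"), ("Terrible service.", "negative")],
    "What a wonderful day!",
)
print(prompt)
```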
no code implementations • 29 Mar 2021 • Vishwani Gupta, Katharina Beckh, Sven Giesselbach, Dennis Wegener, Tim Wirtz
We find that our approach is agnostic to concept drifts, i.e., the machine learning task is independent of the hypotheses in a text.
no code implementations • 11 Nov 2019 • Johannes Burdack, Fabian Horst, Sven Giesselbach, Ibrahim Hassan, Sabrina Daffner, Wolfgang I. Schöllhorn
Therefore, the aim of this analysis is to compare different combinations of commonly applied data preprocessing steps and test their effects on the classification performance of gait patterns.
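One commonly applied preprocessing step of the kind compared in that analysis is z-score standardization of the feature matrix; the sketch below applies it to invented gait-feature values (the feature names and numbers are assumptions for illustration only):

```python
import numpy as np

def standardize(X):
    # Center each feature (column) to zero mean and scale to unit variance,
    # a standard preprocessing step before training a classifier.
    return (X - X.mean(axis=0)) / X.std(axis=0)

# Rows: trials; columns: hypothetical gait features (e.g. stride time, cadence).
X = np.array([
    [1.02, 110.0],
    [0.98, 118.0],
    [1.10, 105.0],
    [0.95, 122.0],
])

X_std = standardize(X)
print(X_std.mean(axis=0))  # approximately [0, 0]
print(X_std.std(axis=0))   # approximately [1, 1]
```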
no code implementations • WS 2019 • Vishwani Gupta, Sven Giesselbach, Stefan Rüping, Christian Bauckhage
Word-based embedding approaches such as Word2Vec capture the meaning of words and the relations between them particularly well when trained on large text collections; however, they fail to do so with small datasets.
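The embedding behavior described above can be illustrated with cosine similarity between dense word vectors; the toy four-dimensional vectors below are invented for illustration, whereas a real Word2Vec model would learn them from a large corpus:

```python
import numpy as np

def cosine_similarity(u, v):
    # Cosine of the angle between two embedding vectors; in a well-trained
    # model, semantically related words score close to 1.0.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy embeddings (illustrative values only, not from a trained model).
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.0]),
    "queen": np.array([0.7, 0.7, 0.2, 0.0]),
    "apple": np.array([0.0, 0.1, 0.9, 0.5]),
}

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```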
1 code implementation • 29 Mar 2019 • Laura von Rueden, Sebastian Mayer, Katharina Beckh, Bogdan Georgiev, Sven Giesselbach, Raoul Heese, Birgit Kirsch, Julius Pfrommer, Annika Pick, Rajkumar Ramamurthy, Michal Walczak, Jochen Garcke, Christian Bauckhage, Jannis Schuecker
It considers the source of knowledge, its representation, and its integration into the machine learning pipeline.
2 code implementations • 30 Nov 2018 • Sven Giesselbach, Katrin Ullrich, Michael Kamp, Daniel Paurat, Thomas Gärtner
We propose a novel transfer learning approach for orphan screening called corresponding projections.
no code implementations • 12 Jul 2018 • Linara Adilova, Sven Giesselbach, Stefan Rüping
In this paper, we report on an alternative approach where we first construct a relation extraction model using distant supervision, and only later make use of a domain expert to refine the results.
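Distant supervision, as used in the first step above, labels training sentences by matching them against known knowledge-base facts; the sketch below shows the basic co-occurrence heuristic (the knowledge base and sentences are invented for illustration):

```python
# Distant supervision sketch: a sentence is labeled with a relation whenever
# both entities of a known knowledge-base fact co-occur in it.
knowledge_base = [
    ("Bonn", "located_in", "Germany"),
    ("Paris", "located_in", "France"),
]

sentences = [
    "Bonn is a city in western Germany.",
    "Paris hosted the summer games.",
    "Germany borders France.",
]

def distant_labels(kb, sentences):
    labeled = []
    for sentence in sentences:
        for head, relation, tail in kb:
            if head in sentence and tail in sentence:
                # Co-occurrence is treated as (noisy) evidence of the relation;
                # in the paper's setting, a domain expert later refines results.
                labeled.append((sentence, head, relation, tail))
    return labeled

for example in distant_labels(knowledge_base, sentences):
    print(example)
```

The noise of this heuristic (co-occurrence does not guarantee the relation is expressed) is exactly what motivates the later expert refinement step.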