Search Results for author: Maximilian Jeblick

Found 3 papers, 2 papers with code

H2O-Danube-1.8B Technical Report

no code implementations30 Jan 2024 Philipp Singer, Pascal Pfeiffer, Yauhen Babakhin, Maximilian Jeblick, Nischay Dhankhar, Gabor Fodor, Sri Satish Ambati

We present H2O-Danube, a series of small 1.8B language models consisting of H2O-Danube-1.8B, trained on 1T tokens, and the incrementally improved H2O-Danube2-1.8B, trained on an additional 2T tokens.

Language Modelling

h2oGPT: Democratizing Large Language Models

2 code implementations13 Jun 2023 Arno Candel, Jon McKinney, Philipp Singer, Pascal Pfeiffer, Maximilian Jeblick, Prithvi Prabhu, Jeff Gambera, Mark Landry, Shivam Bansal, Ryan Chesler, Chun Ming Lee, Marcos V. Conde, Pasha Stetsenko, Olivier Grellier, SriSatish Ambati

Applications built on top of Large Language Models (LLMs) such as GPT-4 represent a revolution in AI due to their human-level capabilities in natural language processing.

Chatbot Fairness +8
