no code implementations • 17 Mar 2024 • Boaz Carmeli, Yonatan Belinkov, Ron Meir
Artificial agents that learn to communicate in order to accomplish a given task acquire communication protocols that are typically opaque to humans.
no code implementations • 25 Jan 2024 • Asaf Yehudai, Boaz Carmeli, Yosi Mass, Ofir Arviv, Nathaniel Mills, Assaf Toledo, Eyal Shnarch, Leshem Choshen
Furthermore, we compare models trained on our data with models trained on human-written data: ELI5 and ASQA for LFQA, and CNN-DailyMail for summarization.
no code implementations • 2 Mar 2023 • Asaf Yehudai, Matan Vetzler, Yosi Mass, Koren Lazar, Doron Cohen, Boaz Carmeli
Intent detection with semantically similar fine-grained intents is a challenging task.
2 code implementations • 29 Nov 2022 • George Kour, Samuel Ackerman, Orna Raz, Eitan Farchi, Boaz Carmeli, Ateret Anaby-Tavor
The ability to compare the semantic similarity between text corpora is important in a variety of natural language processing applications.
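Corpus-level similarity can be illustrated with a simple baseline: compare the mean bag-of-words distributions of two corpora by cosine similarity. This is only an illustrative sketch, not the metric proposed in the paper.

```python
from collections import Counter
import math

def bow_vector(corpus):
    """Normalized bag-of-words vector of a corpus (list of sentences)."""
    counts = Counter()
    for sentence in corpus:
        counts.update(sentence.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(u[w] * v.get(w, 0.0) for w in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# toy corpora: partially overlapping vocabulary
corpus_a = ["the cat sat on the mat", "a cat chased a mouse"]
corpus_b = ["the dog sat on the rug", "a dog chased a ball"]
sim = cosine(bow_vector(corpus_a), bow_vector(corpus_b))
```

Real corpus-similarity metrics of this kind typically replace the bag-of-words representation with contextual sentence embeddings, but the comparison structure is the same.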
no code implementations • 4 Nov 2022 • Boaz Carmeli, Ron Meir, Yonatan Belinkov
The field of emergent communication aims to understand the characteristics of communication as it emerges from artificial agents solving tasks that require information exchange.
1 code implementation • 21 Oct 2022 • Ella Rabinovich, Boaz Carmeli
Prominent questions about the role of sensory vs. linguistic input in the way we acquire and use language have been extensively studied in the psycholinguistic literature.
no code implementations • 22 Jun 2022 • Naama Zwerdling, Segev Shlomov, Esther Goldbraich, George Kour, Boaz Carmeli, Naama Tepper, Inbal Ronen, Vitaly Zabershinsky, Ateret Anaby-Tavor
Models for text generation have become a focus of many research tasks, especially the generation of sentence corpora.
no code implementations • 21 Feb 2022 • Zvi Kons, Aharon Satt, Hong-Kwang Kuo, Samuel Thomas, Boaz Carmeli, Ron Hoory, Brian Kingsbury
The NNSI reduces the need for manual labeling by automatically selecting highly ambiguous samples and labeling them with high accuracy.
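The selection step described here can be illustrated with a margin-based uncertainty criterion, a standard active-learning heuristic: a sample is ambiguous when the gap between its top two predicted classes is small. The exact NNSI criterion may differ; the predictor and intents below are hypothetical.

```python
def margin_uncertainty(probs):
    """Margin between the top two class probabilities; smaller = more ambiguous."""
    top2 = sorted(probs, reverse=True)[:2]
    return top2[0] - top2[1]

def select_ambiguous(samples, predict, k):
    """Return the k samples with the smallest predicted-class margin."""
    return sorted(samples, key=lambda s: margin_uncertainty(predict(s)))[:k]

# toy predictor: probability distributions over three intents (hypothetical)
preds = {
    "reset my password": [0.90, 0.05, 0.05],  # confident prediction
    "it does not work":  [0.40, 0.35, 0.25],  # ambiguous
    "close my account":  [0.50, 0.45, 0.05],  # ambiguous
}
chosen = select_ambiguous(list(preds), lambda s: preds[s], k=2)
```

Only the ambiguous utterances are selected, so manual (or high-accuracy automatic) labeling effort is concentrated where the model is least certain.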
no code implementations • 24 Oct 2021 • Eyal Ben-David, Boaz Carmeli, Ateret Anaby-Tavor
We show that intent prediction can be improved by training a deep text-to-text neural model to generate successive user utterances from unlabeled dialogue data.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Naama Tepper, Esther Goldbraich, Naama Zwerdling, George Kour, Ateret Anaby-Tavor, Boaz Carmeli
Data balancing is a known technique for improving the performance of classification tasks.
no code implementations • ACL 2020 • Yosi Mass, Boaz Carmeli, Haggai Roitman, David Konopnicki
The two models match user queries to FAQ answers and questions, respectively.
1 code implementation • 8 Nov 2019 • Ateret Anaby-Tavor, Boaz Carmeli, Esther Goldbraich, Amir Kantor, George Kour, Segev Shlomov, Naama Tepper, Naama Zwerdling
Building on recent advances in natural language modeling and text generation, we propose a novel data augmentation method for text classification tasks.
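A common recipe for generation-based augmentation is: generate candidate sentences per class with a fine-tuned language model, then keep only candidates that a classifier trained on the original data confidently assigns to the intended class. The sketch below shows only the filtering step; `classify` and the candidate sentences are hypothetical stand-ins, not the paper's implementation.

```python
def filter_generated(candidates, classify, threshold=0.8):
    """Keep generated (text, label) pairs that the baseline classifier
    confidently assigns to the intended label -- a proxy for quality."""
    kept = []
    for text, label in candidates:
        pred_label, confidence = classify(text)
        if pred_label == label and confidence >= threshold:
            kept.append((text, label))
    return kept

# hypothetical generated candidates for the "refund" class
candidates = [
    ("please refund my last order", "refund"),
    ("the weather is nice today", "refund"),  # off-topic generation
]

def classify(text):
    # stub classifier standing in for one trained on the original data
    return ("refund", 0.95) if "refund" in text else ("other", 0.90)

augmented = filter_generated(candidates, classify)
```

The surviving pairs are added to the training set, which is most useful when the original labeled data is scarce.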
no code implementations • ICLR 2019 • Alon Jacovi, Guy Hadash, Einat Kermany, Boaz Carmeli, Ofer Lavi, George Kour, Jonathan Berant
We propose a method for end-to-end training of a base neural network that integrates calls to existing black-box functions.
no code implementations • 24 Apr 2018 • Guy Hadash, Einat Kermany, Boaz Carmeli, Ofer Lavi, George Kour, Alon Jacovi
At inference time, we replace each estimator with its existing application counterpart and let the base network solve the task by interacting with the existing application.
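The estimate-and-replace idea described above can be sketched in miniature: during training, a differentiable estimator is fit to mimic the black-box function so gradients can flow through it; at inference, the estimator is swapped for the real function. The linear estimator and the `existing_app` function below are illustrative assumptions, not the papers' architecture.

```python
def train_estimator(blackbox, inputs, lr=0.1, epochs=200):
    """Fit a linear estimator y = w*x + b to mimic a scalar black-box
    function, so gradients can flow through it during training."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x in inputs:
            err = (w * x + b) - blackbox(x)  # squared-error gradient step
            w -= lr * err * x
            b -= lr * err
    return lambda x: w * x + b

def existing_app(x):
    # hypothetical non-differentiable call into an existing application
    return 2.0 * x + 1.0

estimator = train_estimator(existing_app, [0.0, 0.5, 1.0, 1.5])

def base_network(x, call):
    """Base network that routes part of its computation through `call`:
    the estimator during training, the real application at inference."""
    return call(x) + 3.0

train_out = base_network(1.0, estimator)     # differentiable surrogate path
infer_out = base_network(1.0, existing_app)  # estimator replaced at inference
```

Because the estimator approximates the black box well on the training inputs, swapping it for the real application at inference changes the output only marginally.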