no code implementations • 28 Sep 2023 • Drew Linsley, John Griffin, Jason Parker Brown, Adam N Roose, Michael Frank, Peter Linsley, Steven Finkbeiner, Jeremy Linsley
Recent breakthroughs by deep neural networks (DNNs) in natural language processing (NLP) and computer vision have been driven by a scale-up of models and data rather than the discovery of novel computing paradigms.
no code implementations • 6 Apr 2023 • Rafael Sousa, Marcio Pereira, Yongin Kwon, TaeHo Kim, Namsoon Jung, Chang Soo Kim, Michael Frank, Guido Araujo
Although code generation for Convolutional Neural Network (CNN) models has been extensively studied, performing efficient data slicing and parallelization for highly-constrained Multicore Neural Processing Units (NPUs) is still a challenging problem.
no code implementations • WS 2019 • Abdellah Fourtassi, Isaac Scheinfeld, Michael Frank
How do children learn abstract concepts such as animal vs. artifact?
no code implementations • ACL 2017 • Gabriel Doyle, Amir Goldberg, Sameer Srivastava, Michael Frank
Cultural fit is widely believed to affect the success of individuals and the groups to which they belong.
no code implementations • 28 Oct 2015 • Michael Codish, Michael Frank, Avraham Itzhakov, Alice Miller
The number $R(4, 3, 3)$ is often presented as the unknown Ramsey number with the best chances of being found "soon".
no code implementations • 18 Sep 2014 • Michael Codish, Michael Frank, Avraham Itzhakov, Alice Miller
This paper introduces a general methodology, based on abstraction and symmetry, for solving hard graph edge-coloring problems, and demonstrates its use to provide further evidence that the Ramsey number $R(4, 3, 3)=30$.
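The Ramsey claim above has a concrete combinatorial meaning: $R(4, 3, 3) = 30$ says that every 3-coloring of the edges of $K_{30}$ contains a $K_4$ in the first color or a triangle in one of the other two, while some coloring of $K_{29}$ avoids all three. As a minimal sketch (not the paper's SAT-based method), the brute-force check that a candidate coloring is such a witness can be written as follows; the function name `is_good_coloring` and the edge-dictionary representation are illustrative assumptions:

```python
from itertools import combinations

def is_good_coloring(n, color, clique_sizes=(4, 3, 3)):
    """Check whether an edge coloring of K_n witnesses R(4,3,3) > n.

    `color` maps each unordered edge (i, j) with i < j to 0, 1, or 2.
    The coloring is good iff color c contains no monochromatic clique
    of clique_sizes[c] vertices (K_4 in color 0, K_3 in colors 1 and 2).
    This is a naive check; the paper uses abstraction and symmetry
    breaking to make the search over colorings tractable.
    """
    for c, k in enumerate(clique_sizes):
        for clique in combinations(range(n), k):
            # All edges inside the clique share color c => bad coloring.
            if all(color[(i, j)] == c for i, j in combinations(clique, 2)):
                return False
    return True

# Tiny example: K_3 with its three edges in three distinct colors has
# no monochromatic triangle, so it trivially witnesses R(4,3,3) > 3.
rainbow = {(0, 1): 0, (0, 2): 1, (1, 2): 2}
print(is_good_coloring(3, rainbow))  # True
```

A coloring that puts all edges of $K_3$ in color 1 fails immediately, since the triangle itself is monochromatic in a color whose forbidden clique size is 3.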
no code implementations • NeurIPS 2013 • Nathaniel J. Smith, Noah Goodman, Michael Frank
Language users are remarkably good at making inferences about speakers' intentions in context, and children learning their native language also display substantial skill in acquiring the meanings of unknown words.