no code implementations • 30 Aug 2023 • Paul Tarau
We show the effectiveness of our design via Natlog apps working as orchestrators for JAX and Pytorch pipelines and as DCG-driven GPT-3 and DALL·E prompt generators.
1 code implementation • 24 Jun 2023 • Paul Tarau
We automate deep step-by-step reasoning in an LLM dialog thread by recursively exploring alternatives (OR-nodes) and expanding details (AND-nodes) up to a given depth.
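The recursive AND/OR expansion can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `oracle` function is a hypothetical deterministic stand-in for the LLM calls that would generate alternatives and details.

```python
# Sketch: recursive AND/OR exploration of a reasoning task up to a depth limit.
# `oracle` is a hypothetical stand-in for an LLM dialog turn.

def oracle(kind, topic):
    """Return a few sub-items for a topic (stubbed; a real system asks an LLM)."""
    return [f"{topic}/{kind}{i}" for i in range(2)]

def explore(topic, depth):
    """OR-nodes branch over alternatives; AND-nodes expand each into details."""
    if depth == 0:
        return [topic]  # leaf of the reasoning tree
    leaves = []
    for alt in oracle("alt", topic):        # OR-node: alternative answers
        for detail in oracle("step", alt):  # AND-node: details to be expanded
            leaves.extend(explore(detail, depth - 1))
    return leaves

leaves = explore("goal", 2)  # 2 alternatives x 2 details per level -> 16 leaves
```

In the real system each `oracle` call is a prompt in the dialog thread, and the depth bound keeps the recursion (and token budget) finite.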
1 code implementation • 5 Aug 2022 • Paul Tarau
The problem is known as graph node property prediction; our approach consists in emulating, with the help of a Prolog program, the key information propagation steps of a Graph Neural Network's training and inference stages.
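The propagation step being emulated can be sketched in plain Python rather than Prolog. This is a hedged illustration of neighbor-to-neighbor label propagation, not the paper's actual program; the graph and labels are made up for the example.

```python
# Sketch: iterated label propagation, the key information-flow step that a
# GNN's training/inference performs; graph and seed labels are hypothetical.
from collections import Counter

def propagate(adj, seed_labels, iters=2):
    """Unlabeled nodes repeatedly adopt the majority label
    among their already-labeled neighbors."""
    labels = dict(seed_labels)
    for _ in range(iters):
        new = dict(labels)
        for node, nbrs in adj.items():
            if node in labels:
                continue  # seed/previously decided labels stay fixed
            votes = Counter(labels[n] for n in nbrs if n in labels)
            if votes:
                new[node] = votes.most_common(1)[0][0]
        labels = new
    return labels

# A path graph 0-1-2-3 with labeled endpoints:
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
out = propagate(adj, {0: "a", 3: "b"})
```

A Prolog rendering of the same idea would express the neighbor aggregation as clauses over edge/2 and label/2 facts, with inference replacing the explicit loop.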
no code implementations • 17 Sep 2021 • Paul Tarau
We introduce Natlog, a lightweight Logic Programming language, sharing Prolog's unification-driven execution model, but with a simplified syntax and semantics.
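The unification-driven execution model Natlog shares with Prolog can be illustrated with a minimal unifier. This sketch is not Natlog's implementation: terms are tuples, and any string starting with an uppercase letter plays the role of a logic variable.

```python
# Sketch: Prolog-style unification over tuple terms; strings beginning with
# an uppercase letter are treated as variables (illustrative convention only).

def walk(t, s):
    """Follow variable bindings in substitution s to a representative term."""
    while isinstance(t, str) and t[:1].isupper() and t in s:
        t = s[t]
    return t

def unify(a, b, s=None):
    """Return an extended substitution unifying a and b, or None on failure."""
    s = dict(s or {})
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if isinstance(a, str) and a[:1].isupper():
        s[a] = b
        return s
    if isinstance(b, str) and b[:1].isupper():
        s[b] = a
        return s
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None  # clash: distinct constants or arities

# f(X, b) unified with f(a, Y) binds X=a, Y=b:
s = unify(("f", "X", "b"), ("f", "a", "Y"))
```

Execution in such a language is then resolution: matching a goal against clause heads via `unify` and continuing with the instantiated clause body.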
no code implementations • 22 Sep 2020 • Paul Tarau, Valeria de Paiva
Keywords: combinatorial generation of provable formulas of a given size, intuitionistic and linear logic theorem provers, theorems of the implicational fragment of propositional linear intuitionistic logic, Curry-Howard isomorphism, efficient generation of linear lambda terms in normal form, Prolog programs for lambda term generation and theorem proving.
2 code implementations • 31 Jul 2020 • Paul Tarau, Eduardo Blanco
Working on the Prolog facts and their inferred consequences, the dialog engine specializes the text graph with respect to a query and reveals interactively the document's most relevant content elements.
2 code implementations • 20 Sep 2019 • Paul Tarau, Eduardo Blanco
We build a bridge between neural network-based machine learning and graph-based natural language processing and introduce a unified approach to keyphrase, summary and relation extraction by aggregating dependency graphs from links provided by a deep-learning based dependency parser.
1 code implementation • Conference 2004 • Rada Mihalcea, Paul Tarau
In this paper, we introduce TextRank – a graph-based ranking model for text processing and show how this model can be successfully used in natural language applications.
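The core of such a graph-based ranking model can be sketched as PageRank-style iteration over a word co-occurrence graph. This is a simplified illustration of the idea, not the paper's exact formulation; the toy edge list is invented for the example.

```python
# Sketch: TextRank-style scoring, i.e. PageRank iterated on an undirected
# word co-occurrence graph (edge list here is a hypothetical toy example).

def textrank(edges, d=0.85, iters=30):
    """Score each node by damped iteration: high-degree, well-connected
    words accumulate the highest rank."""
    nodes = {n for e in edges for n in e}
    nbrs = {n: set() for n in nodes}
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)  # co-occurrence links are undirected
    score = {n: 1.0 for n in nodes}
    for _ in range(iters):
        score = {n: (1 - d) + d * sum(score[m] / len(nbrs[m]) for m in nbrs[n])
                 for n in nodes}
    return score

edges = [("graph", "rank"), ("rank", "text"), ("graph", "text"), ("graph", "model")]
sc = textrank(edges)  # "graph" has the most links, so it ranks highest
```

For keyphrase extraction, the top-scoring words are then collapsed into multi-word phrases when they are adjacent in the text.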