no code implementations • 13 Jul 2021 • Steven Van Vaerenbergh, Adrián Pérez-Suay
This chapter provides an overview of the different Artificial Intelligence (AI) systems that are being used in contemporary digital tools for Mathematics Education (ME).
no code implementations • 28 Mar 2019 • Michele Cirillo, Simone Scardapane, Steven Van Vaerenbergh, Aurelio Uncini
In this brief we investigate the generalization properties of a recently proposed class of non-parametric activation functions, the kernel activation functions (KAFs).
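A kernel activation function models each neuron's nonlinearity as a kernel expansion over a fixed dictionary of sample points, with the mixing coefficients learned alongside the network weights. A minimal numpy sketch, assuming a Gaussian kernel and an illustrative choice of dictionary and coefficients:

```python
import numpy as np

def kaf(s, alpha, dictionary, gamma=1.0):
    """Kernel activation function: a non-parametric activation computed as
    a kernel expansion over a fixed dictionary, with learnable mixing
    coefficients alpha."""
    # Gaussian kernel between each pre-activation s and every dictionary point
    K = np.exp(-gamma * (s[:, None] - dictionary[None, :]) ** 2)
    return K @ alpha

dictionary = np.linspace(-2, 2, 9)         # fixed grid of sample points
alpha = np.maximum(0.0, dictionary)        # illustrative coefficients only;
                                           # in training these are adapted
s = np.array([-1.0, 0.0, 1.0])             # pre-activations of three neurons
out = kaf(s, alpha, dictionary)
print(out.shape)  # (3,)
```

Because the dictionary is fixed, only the vector `alpha` per neuron is trained, which keeps the added parameter count linear in the dictionary size.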
no code implementations • 6 Feb 2019 • Simone Scardapane, Steven Van Vaerenbergh, Danilo Comminiello, Aurelio Uncini
Complex-valued neural networks (CVNNs) have been shown to be powerful nonlinear approximators when the input data can be properly modeled in the complex domain.
no code implementations • 11 Jul 2018 • Simone Scardapane, Steven Van Vaerenbergh, Danilo Comminiello, Simone Totaro, Aurelio Uncini
Gated recurrent neural networks have achieved remarkable results in the analysis of sequential data.
no code implementations • 26 Feb 2018 • Simone Scardapane, Steven Van Vaerenbergh, Danilo Comminiello, Aurelio Uncini
Graph neural networks (GNNs) are a class of neural networks that allow efficient inference on data associated with a graph structure, such as citation networks or knowledge graphs.
2 code implementations • 22 Feb 2018 • Simone Scardapane, Steven Van Vaerenbergh, Amir Hussain, Aurelio Uncini
Complex-valued neural networks (CVNNs) are a powerful modeling tool for domains where data can be naturally interpreted in terms of complex numbers.
no code implementations • 16 Feb 2018 • Steven Van Vaerenbergh, Ignacio Santamaria, Victor Elvira, Matteo Salvatori
In this paper, we study the problem of locating a predefined sequence of patterns in a time series.
2 code implementations • 13 Jul 2017 • Simone Scardapane, Steven Van Vaerenbergh, Simone Totaro, Aurelio Uncini
Neural networks are generally built by interleaving (adaptable) linear layers with (fixed) nonlinear activation functions.
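The structure described above, adaptable linear layers alternating with fixed nonlinearities, can be sketched in a few lines of numpy; the layer sizes and the choice of ReLU here are illustrative, not taken from the paper:

```python
import numpy as np

def relu(x):
    # fixed (non-adaptable) nonlinear activation
    return np.maximum(0.0, x)

def mlp_forward(x, weights, biases):
    """Forward pass through alternating (adaptable) linear layers
    and (fixed) nonlinear activation functions."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)                  # linear map, then fixed activation
    return h @ weights[-1] + biases[-1]      # final linear layer, no activation

rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 8)), rng.standard_normal((8, 1))]
biases = [np.zeros(8), np.zeros(1)]
y = mlp_forward(rng.standard_normal((2, 4)), weights, biases)
print(y.shape)  # (2, 1)
```

The paper's proposal (kernel activation functions) replaces the fixed `relu` step with a learnable, per-neuron kernel expansion.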
no code implementations • 12 Jun 2017 • Steven Van Vaerenbergh, Simone Scardapane, Ignacio Santamaria
In kernel methods, temporal information on the data is commonly included by using time-delayed embeddings as inputs.
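The time-delayed embedding mentioned above stacks a window of past samples into each input vector, so a kernel method sees temporal context without any recurrent structure. A minimal sketch:

```python
import numpy as np

def time_delay_embedding(x, L):
    """Build input vectors from L consecutive samples of a 1-D series,
    the usual way of feeding temporal information to a kernel method."""
    x = np.asarray(x)
    return np.stack([x[i:i + L] for i in range(len(x) - L + 1)])

x = np.arange(6)                 # toy time series [0, 1, 2, 3, 4, 5]
X = time_delay_embedding(x, 3)
print(X)
# [[0 1 2]
#  [1 2 3]
#  [2 3 4]
#  [3 4 5]]
```

Each row of `X` can then be fed to a kernel regressor as an ordinary input vector.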
no code implementations • 11 Sep 2016 • Steven Van Vaerenbergh, Jesus Fernandez-Bes, Víctor Elvira
We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms.
no code implementations • 27 Jan 2015 • Jesus Fernandez-Bes, Víctor Elvira, Steven Van Vaerenbergh
We introduce a probabilistic approach to the LMS filter.
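For reference, the classical LMS filter that this probabilistic treatment builds on is a stochastic-gradient update of a linear filter; the sketch below shows standard LMS only (the paper's probabilistic formulation goes beyond it), with an illustrative step size:

```python
import numpy as np

def lms(X, d, mu=0.05):
    """Classical least mean squares: for each sample, compute the a-priori
    error e = d - w @ x, then update w <- w + mu * e * x."""
    w = np.zeros(X.shape[1])
    errors = []
    for x, target in zip(X, d):
        e = target - w @ x
        w += mu * e * x
        errors.append(e)
    return w, np.array(errors)

# system identification: recover a known linear filter from noisy data
rng = np.random.default_rng(2)
w_true = np.array([0.5, -0.3, 0.1])
X = rng.standard_normal((500, 3))
d = X @ w_true + 0.01 * rng.standard_normal(500)
w_hat, err = lms(X, d)
print(np.allclose(w_hat, w_true, atol=0.05))  # True
```

The step size `mu` trades convergence speed against steady-state error; a probabilistic view of the filter makes that trade-off explicit through uncertainty estimates.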
no code implementations • 20 Oct 2013 • Il Memming Park, Sohan Seth, Steven Van Vaerenbergh
The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that "kernelizes" the celebrated (linear) least mean squares algorithm.
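The "kernelized" LMS update can be sketched directly: each incoming sample becomes a kernel center whose coefficient is the step size times its a-priori error. This naive version grows the dictionary with every sample (practical variants sparsify it); the Gaussian kernel width and step size below are illustrative choices:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=0.3):
    return np.exp(-np.sum((x - y) ** 2, axis=-1) / (2 * sigma ** 2))

def klms(X, d, eta=0.5, sigma=0.3):
    """Kernel least mean squares: the LMS update carried out in an RKHS.
    Prediction is a kernel expansion over past samples; each new sample is
    appended as a center with coefficient eta * (a-priori error)."""
    centers, alphas, errors = [], [], []
    for x, target in zip(X, d):
        y_hat = sum(a * gaussian_kernel(c, x, sigma)
                    for a, c in zip(alphas, centers)) if centers else 0.0
        e = target - y_hat          # a-priori error before the update
        errors.append(e)
        centers.append(x)           # naive: dictionary grows linearly
        alphas.append(eta * e)
    return np.array(errors)

# online identification of a static nonlinearity from noisy samples
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
d = np.sin(3 * X[:, 0]) + 0.01 * rng.standard_normal(200)
err = klms(X, d)
# prediction errors shrink as more samples arrive
print(np.abs(err[-20:]).mean() < np.abs(err[:20]).mean())
```

The linear growth of the dictionary is the main practical drawback; the paper's KLMS-with-forward-backward-splitting context concerns exactly such refinements.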
no code implementations • 12 Mar 2013 • Fernando Pérez-Cruz, Steven Van Vaerenbergh, Juan José Murillo-Fuentes, Miguel Lázaro-Gredilla, Ignacio Santamaria
Gaussian processes (GPs) are versatile tools that have been successfully employed to solve nonlinear estimation problems in machine learning, but that are rarely used in signal processing.
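The basic GP regression computation, the posterior mean and variance at test points under a chosen kernel, fits in a few lines; the RBF kernel, length scale, and noise level below are illustrative assumptions:

```python
import numpy as np

def rbf(A, B, ell=1.0):
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * sq / ell**2)

def gp_posterior(X, y, X_star, noise=1e-2, ell=1.0):
    """GP regression: posterior mean and variance at X_star, computed via
    a Cholesky factorization of the regularized kernel matrix."""
    K = rbf(X, X, ell) + noise * np.eye(len(X))
    K_s = rbf(X, X_star, ell)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(rbf(X_star, X_star, ell)) - np.sum(v**2, axis=0)
    return mean, var

X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X[:, 0])                 # noiseless samples of a smooth function
mean, var = gp_posterior(X, y, np.array([[2.5]]))
print(abs(mean[0] - np.sin(2.5)) < 0.1)  # posterior mean tracks sin(2.5)
```

The Cholesky-based solve is the standard numerically stable route; its cubic cost in the number of samples is what motivates the sparse and online GP approximations common in signal processing.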
no code implementations • 16 Aug 2011 • Miguel Lázaro-Gredilla, Steven Van Vaerenbergh, Neil Lawrence
In this work we introduce a mixture of GPs to address the data association problem, i.e., to label a group of observations according to the sources that generated them.