Search Results for author: Steven Van Vaerenbergh

Found 14 papers, 2 papers with code

A Classification of Artificial Intelligence Systems for Mathematics Education

no code implementations • 13 Jul 2021 • Steven Van Vaerenbergh, Adrián Pérez-Suay

This chapter provides an overview of the different Artificial Intelligence (AI) systems that are being used in contemporary digital tools for Mathematics Education (ME).

Classification

On the Stability and Generalization of Learning with Kernel Activation Functions

no code implementations • 28 Mar 2019 • Michele Cirillo, Simone Scardapane, Steven Van Vaerenbergh, Aurelio Uncini

In this brief we investigate the generalization properties of a recently-proposed class of non-parametric activation functions, the kernel activation functions (KAFs).
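
For context, a KAF models each activation as a kernel expansion over a small fixed dictionary with trainable mixing coefficients; in the commonly used Gaussian-kernel formulation (standard notation assumed here, not necessarily this brief's own symbols):

    f(s) = \sum_{i=1}^{D} \alpha_i \, \kappa(s, d_i),
    \qquad
    \kappa(s, d_i) = \exp\left( -\gamma \, (s - d_i)^2 \right)

where s is the pre-activation value, the d_i form a fixed dictionary of D points, and the alpha_i are the coefficients learned per neuron.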

Widely Linear Kernels for Complex-Valued Kernel Activation Functions

no code implementations • 6 Feb 2019 • Simone Scardapane, Steven Van Vaerenbergh, Danilo Comminiello, Aurelio Uncini

Complex-valued neural networks (CVNNs) have been shown to be powerful nonlinear approximators when the input data can be properly modeled in the complex domain.

Image Classification

Improving Graph Convolutional Networks with Non-Parametric Activation Functions

no code implementations • 26 Feb 2018 • Simone Scardapane, Steven Van Vaerenbergh, Danilo Comminiello, Aurelio Uncini

Graph neural networks (GNNs) are a class of neural networks that allow efficient inference on data associated with a graph structure, such as citation networks or knowledge graphs.

Knowledge Graphs

Complex-valued Neural Networks with Non-parametric Activation Functions

2 code implementations • 22 Feb 2018 • Simone Scardapane, Steven Van Vaerenbergh, Amir Hussain, Aurelio Uncini

Complex-valued neural networks (CVNNs) are a powerful modeling tool for domains where data can be naturally interpreted in terms of complex numbers.

Kafnets: kernel-based non-parametric activation functions for neural networks

2 code implementations • 13 Jul 2017 • Simone Scardapane, Steven Van Vaerenbergh, Simone Totaro, Aurelio Uncini

Neural networks are generally built by interleaving (adaptable) linear layers with (fixed) nonlinear activation functions.
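
As a rough illustration of the kernel activation functions (KAFs) proposed in this paper, here is a minimal PyTorch-style sketch of such an activation layer; the dictionary size, bandwidth rule, and layer sizes are illustrative assumptions, not the reference implementation released with the paper.

    import torch
    import torch.nn as nn

    class KAF(nn.Module):
        """Minimal kernel activation function: a learnable mixture of
        Gaussian kernels over a fixed 1-D dictionary (a sketch, not the
        authors' released code)."""
        def __init__(self, num_units, dict_size=20, boundary=3.0):
            super().__init__()
            # Fixed dictionary of sample points, shared by all units.
            d = torch.linspace(-boundary, boundary, dict_size)
            self.register_buffer("dict", d.view(1, 1, -1))
            # Kernel bandwidth from the dictionary spacing (rule of thumb).
            self.gamma = (1.0 / (2.0 * (d[1] - d[0]) ** 2)).item()
            # Trainable mixing coefficients, one set per unit.
            self.alpha = nn.Parameter(0.3 * torch.randn(1, num_units, dict_size))

        def forward(self, s):
            # s: (batch, num_units) pre-activations.
            K = torch.exp(-self.gamma * (s.unsqueeze(-1) - self.dict) ** 2)
            return (K * self.alpha).sum(dim=-1)

    # Example: swap a fixed nonlinearity for a KAF in a small network.
    net = nn.Sequential(nn.Linear(10, 32), KAF(32), nn.Linear(32, 1))
    y = net(torch.randn(4, 10))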

Recursive Multikernel Filters Exploiting Nonlinear Temporal Structure

no code implementations • 12 Jun 2017 • Steven Van Vaerenbergh, Simone Scardapane, Ignacio Santamaria

In kernel methods, temporal information on the data is commonly included by using time-delayed embeddings as inputs.
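
As a small illustration of the time-delayed embeddings mentioned above, each input to a kernel adaptive filter is a window of the most recent samples; the embedding length below is an arbitrary choice and NumPy is assumed.

    import numpy as np

    def time_delay_embedding(x, L):
        """Build overlapping windows of length L so a kernel adaptive
        filter sees a short slice of the signal's recent past."""
        x = np.asarray(x, dtype=float)
        n = len(x) - L + 1
        # Row t holds [x[t], x[t+1], ..., x[t+L-1]].
        return np.stack([x[t:t + L] for t in range(n)])

    x = np.sin(0.1 * np.arange(100))     # toy signal
    X = time_delay_embedding(x, L=5)     # shape (96, 5); row t is x[t:t+5]
    y = x[5:]                            # one-step-ahead targets, paired with X[:-1]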

On the Relationship between Online Gaussian Process Regression and Kernel Least Mean Squares Algorithms

no code implementations • 11 Sep 2016 • Steven Van Vaerenbergh, Jesus Fernandez-Bes, Víctor Elvira

We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms.

Regression

Bayesian Extensions of Kernel Least Mean Squares

no code implementations • 20 Oct 2013 • Il Memming Park, Sohan Seth, Steven Van Vaerenbergh

The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that "kernelizes" the celebrated (linear) least mean squares algorithm.
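
For readers unfamiliar with KLMS, here is a bare-bones sketch of the algorithm with a Gaussian kernel, a fixed step size, and no dictionary sparsification; all parameter values are illustrative assumptions.

    import numpy as np

    def klms(X, y, eta=0.2, gamma=1.0):
        """Naive kernel least mean squares: predict with a growing kernel
        expansion, then add the current input as a new center weighted by
        eta times the prediction error (a sketch without any budget rule)."""
        centers, weights, preds = [], [], []
        for x_t, y_t in zip(X, y):
            if centers:
                k = np.exp(-gamma * np.sum((np.array(centers) - x_t) ** 2, axis=1))
                y_hat = float(np.dot(weights, k))
            else:
                y_hat = 0.0
            preds.append(y_hat)
            err = y_t - y_hat
            centers.append(x_t)          # every sample becomes a center
            weights.append(eta * err)    # LMS-style update in feature space
        return np.array(preds)

    # Example: one-step prediction of a noisy sinusoid from delay vectors.
    x = np.sin(0.2 * np.arange(200)) + 0.05 * np.random.randn(200)
    X = np.stack([x[t:t + 5] for t in range(195)])
    y_pred = klms(X, x[5:200])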

Gaussian Processes for Nonlinear Signal Processing

no code implementations • 12 Mar 2013 • Fernando Pérez-Cruz, Steven Van Vaerenbergh, Juan José Murillo-Fuentes, Miguel Lázaro-Gredilla, Ignacio Santamaria

Gaussian processes (GPs) are versatile tools that have been successfully employed to solve nonlinear estimation problems in machine learning, but that are rarely used in signal processing.

BIG-bench Machine Learning, Gaussian Processes, +2

Overlapping Mixtures of Gaussian Processes for the Data Association Problem

no code implementations • 16 Aug 2011 • Miguel Lázaro-Gredilla, Steven Van Vaerenbergh, Neil Lawrence

In this work we introduce a mixture of GPs to address the data association problem, i.e., to label a group of observations according to the sources that generated them.

Gaussian Processes, Multi-Object Tracking, +1
