no code implementations • 19 Apr 2024 • Shuo Li, Mike Davies, Mehrdad Yaghoobi
Hyperspectral imaging (HSI) is a key technology for earth observation, surveillance, medical imaging and diagnostics, astronomy and space exploration.
no code implementations • 14 Mar 2024 • Andrew Wang, Mike Davies
Ill-posed image reconstruction problems appear in many scenarios such as remote sensing, where obtaining high-quality images is crucial for environmental monitoring, disaster management and urban planning.
1 code implementation • 18 Dec 2023 • Jérémy Scanvic, Mike Davies, Patrice Abry, Julián Tachella
These methods critically rely on invariance to translations and/or rotations of the image distribution to learn from incomplete measurement data alone.
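The core idea, that invariance of the image distribution supplies information in the nullspace of the measurement operator, can be illustrated with a toy inpainting example (a sketch using an assumed 8x8 image mask and 90-degree rotations for illustration, not the paper's actual method):

```python
import numpy as np

# If the image distribution is invariant to a group of transformations T_g,
# then measurements of transformed images act like extra "virtual" operators
# A∘T_g, which can reveal parts of the image that A alone never sees.
# Toy example: inpainting that observes only the left half of the image,
# combined with the four 90-degree rotations.
mask = np.zeros((8, 8), dtype=bool)
mask[:, :4] = True                    # A observes only the left half

coverage = np.zeros((8, 8), dtype=bool)
for g in range(4):
    coverage |= np.rot90(mask, g)     # pixels seen by the virtual operator A∘T_g

print(mask.mean(), coverage.mean())   # half the pixels vs. all of them
```

Here the union of the rotated observation masks covers the whole image, which is exactly the kind of structure that lets such methods learn from incomplete measurements alone.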
no code implementations • 5 Oct 2023 • Sumit Bam Shrestha, Jonathan Timcheck, Paxon Frady, Leobardo Campos-Macias, Mike Davies
Loihi 2 is an asynchronous, brain-inspired research processor that generalizes several fundamental elements of neuromorphic architecture, such as stateful neuron models communicating with event-driven spikes, in order to address limitations of the first generation Loihi.
1 code implementation • 10 Apr 2023 • Jason Yik, Korneel Van den Berghe, Douwe den Blanken, Younes Bouhadjar, Maxime Fabre, Paul Hueber, Denis Kleyko, Noah Pacik-Nelson, Pao-Sheng Vincent Sun, Guangzhi Tang, Shenqi Wang, Biyan Zhou, Soikat Hasan Ahmed, George Vathakkattil Joseph, Benedetto Leto, Aurora Micheli, Anurag Kumar Mishra, Gregor Lenz, Tao Sun, Zergham Ahmed, Mahmoud Akl, Brian Anderson, Andreas G. Andreou, Chiara Bartolozzi, Arindam Basu, Petrut Bogdan, Sander Bohte, Sonia Buckley, Gert Cauwenberghs, Elisabetta Chicca, Federico Corradi, Guido de Croon, Andreea Danielescu, Anurag Daram, Mike Davies, Yigit Demirag, Jason Eshraghian, Tobias Fischer, Jeremy Forest, Vittorio Fra, Steve Furber, P. Michael Furlong, William Gilpin, Aditya Gilra, Hector A. Gonzalez, Giacomo Indiveri, Siddharth Joshi, Vedant Karia, Lyes Khacef, James C. Knight, Laura Kriener, Rajkumar Kubendran, Dhireesha Kudithipudi, Yao-Hong Liu, Shih-Chii Liu, Haoyuan Ma, Rajit Manohar, Josep Maria Margarit-Taulé, Christian Mayr, Konstantinos Michmizos, Dylan Muir, Emre Neftci, Thomas Nowotny, Fabrizio Ottati, Ayca Ozcelikkale, Priyadarshini Panda, Jongkil Park, Melika Payvand, Christian Pehle, Mihai A. Petrovici, Alessandro Pierro, Christoph Posch, Alpha Renner, Yulia Sandamirskaya, Clemens JS Schaefer, André van Schaik, Johannes Schemmel, Samuel Schmidgall, Catherine Schuman, Jae-sun Seo, Sadique Sheik, Sumit Bam Shrestha, Manolis Sifalakis, Amos Sironi, Matthew Stewart, Kenneth Stewart, Terrence C. Stewart, Philipp Stratmann, Jonathan Timcheck, Nergis Tömen, Gianvito Urgese, Marian Verhelst, Craig M. Vineyard, Bernhard Vogginger, Amirreza Yousefzadeh, Fatima Tuz Zohora, Charlotte Frenkel, Vijay Janapa Reddi
The NeuroBench framework introduces a common set of tools and systematic methodology for inclusive benchmark measurement, delivering an objective reference framework for quantifying neuromorphic approaches in both hardware-independent (algorithm track) and hardware-dependent (system track) settings.
1 code implementation • 16 Mar 2023 • Jonathan Timcheck, Sumit Bam Shrestha, Daniel Ben Dayan Rubin, Adam Kupryjanow, Garrick Orchard, Lukasz Pindor, Timothy Shea, Mike Davies
A critical enabler for progress in neuromorphic computing research is the ability to transparently evaluate different neuromorphic solutions on important tasks and to compare them to state-of-the-art conventional solutions.
no code implementations • 5 Sep 2022 • Dongdong Chen, Mike Davies, Matthias J. Ehrhardt, Carola-Bibiane Schönlieb, Ferdia Sherry, Julián Tachella
From early image processing to modern computational imaging, successful models and algorithms have relied on a fundamental property of natural signals: symmetry.
1 code implementation • 23 Mar 2022 • Julián Tachella, Dongdong Chen, Mike Davies
In this paper, we present necessary and sufficient sensing conditions for learning the signal model from measurement data alone; these conditions depend only on the dimension of the model and on the number of operators, or on properties of the group action under which the model is invariant.
1 code implementation • 28 Jan 2022 • Julián Tachella, Dongdong Chen, Mike Davies
In many real-world inverse problems, only incomplete measurement data are available for training, which poses a challenge for learning a reconstruction function.
no code implementations • 5 Nov 2021 • Garrick Orchard, E. Paxon Frady, Daniel Ben Dayan Rubin, Sophia Sanborn, Sumit Bam Shrestha, Friedrich T. Sommer, Mike Davies
The biologically inspired spiking neurons used in neuromorphic computing are nonlinear filters with dynamic state variables -- very different from the stateless neuron models used in deep learning.
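The contrast with stateless deep-learning neurons can be made concrete with a minimal discrete-time leaky integrate-and-fire neuron (a generic textbook sketch, not Loihi's actual neuron model; the decay and threshold values are illustrative assumptions):

```python
# A leaky integrate-and-fire (LIF) neuron is a nonlinear filter: its membrane
# potential v is a dynamic state variable that integrates input over time,
# and spikes are emitted through a threshold nonlinearity with reset.
def lif(inputs, decay=0.9, threshold=1.0):
    v, spikes = 0.0, []
    for u in inputs:
        v = decay * v + u          # leaky integration of input current (state)
        if v >= threshold:         # nonlinearity: emit a spike and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

out = lif([0.3] * 20)              # constant input produces periodic spiking
print(out)
```

Unlike a ReLU unit, the same input value can produce a spike or not depending on the neuron's history, which is what makes these models temporal filters rather than pointwise functions.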
no code implementations • 9 Jun 2021 • Denis Kleyko, Mike Davies, E. Paxon Frady, Pentti Kanerva, Spencer J. Kent, Bruno A. Olshausen, Evgeny Osipov, Jan M. Rabaey, Dmitri A. Rachkovskij, Abbas Rahimi, Friedrich T. Sommer
We see them acting as a framework for computing with distributed representations that can play the role of an abstraction layer for emerging computing hardware.
no code implementations • 17 Aug 2020 • Thomas Feuillen, Mike Davies, Luc Vandendorpe, Laurent Jacques
This work focuses on the reconstruction of sparse signals from their 1-bit measurements.
no code implementations • 20 Jun 2020 • Junqi Tang, Mike Davies
In this work we propose an efficient stochastic plug-and-play (PnP) algorithm for imaging inverse problems.
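The general shape of a stochastic PnP iteration, a minibatch gradient step on the data-fidelity term followed by a denoiser, can be sketched as follows (a toy version with a soft-thresholding stand-in for the denoiser and assumed step sizes, not the authors' specific algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 32, 128
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = 1.0
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true                        # noiseless linear measurements

def denoiser(v, tau=0.02):
    # stand-in plug-and-play denoiser: soft-thresholding (promotes sparsity);
    # in practice this slot is filled by a learned or off-the-shelf denoiser
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(n)
eta, batch = 0.25, 16
for _ in range(400):
    idx = rng.choice(m, batch, replace=False)              # random measurement minibatch
    grad = (m / batch) * A[idx].T @ (A[idx] @ x - y[idx])  # unbiased gradient estimate
    x = denoiser(x - eta * grad)                           # gradient step, then denoiser
print(np.linalg.norm(x - x_true))
```

The stochastic variant touches only a subset of the measurements per iteration, which is the source of the efficiency gain for large-scale imaging problems.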
no code implementations • CVPR 2021 • Julián Tachella, Junqi Tang, Mike Davies
While the NTK theory accurately predicts the filter associated with networks trained using standard gradient descent, our analysis shows that it falls short of explaining the behaviour of networks trained using the popular Adam optimizer.
no code implementations • 27 Apr 2020 • E. Paxon Frady, Garrick Orchard, David Florey, Nabil Imam, Ruokun Liu, Joyesh Mishra, Jonathan Tse, Andreas Wild, Friedrich T. Sommer, Mike Davies
Neuromorphic computing applies insights from neuroscience to uncover innovations in computing technology.
no code implementations • 27 Feb 2020 • Derek Driggs, Junqi Tang, Jingwei Liang, Mike Davies, Carola-Bibiane Schönlieb
We introduce SPRING, a novel stochastic proximal alternating linearized minimization algorithm for solving a class of non-smooth and non-convex optimization problems.
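The flavour of a stochastic proximal alternating scheme can be sketched on nonnegative matrix factorization, a standard non-smooth, non-convex test problem (a generic sketch with plain minibatch gradients and assumed step sizes, not the SPRING algorithm or its variance-reduced estimators):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 20, 200, 3
M = np.abs(rng.standard_normal((m, r))) @ np.abs(rng.standard_normal((r, n)))

U = np.abs(rng.standard_normal((m, r)))
V = np.abs(rng.standard_normal((r, n)))
prox = lambda W: np.maximum(W, 0.0)        # prox of the nonnegativity indicator

batch = 32
for _ in range(3000):
    idx = rng.choice(n, batch, replace=False)          # minibatch of columns
    # U-block: stochastic linearized step, then prox
    R = U @ V[:, idx] - M[:, idx]
    gU = (n / batch) * R @ V[:, idx].T                 # unbiased gradient in U
    LU = (n / batch) * np.linalg.norm(V[:, idx] @ V[:, idx].T, 2) + 1e-12
    U = prox(U - gU / LU)
    # V-block: gradient for the sampled columns, then prox
    R = U @ V[:, idx] - M[:, idx]
    gV = U.T @ R
    LV = np.linalg.norm(U.T @ U, 2) + 1e-12
    V[:, idx] = prox(V[:, idx] - gV / LV)
print(np.linalg.norm(U @ V - M) / np.linalg.norm(M))   # relative fit error
```

Each block update is a proximal gradient step with a step size set from the (minibatch) Lipschitz constant of that block's partial gradient, which is the basic PALM structure the stochastic variants build on.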
Image Deconvolution · Stochastic Optimization · Optimization and Control (90C26)
1 code implementation • 23 Jan 2020 • Mohammad Golbabaee, Guido Buonincontri, Carolin Pirkl, Marion Menzel, Bjoern Menze, Mike Davies, Pedro Gomez
We propose a dictionary-matching-free pipeline for multi-parametric quantitative MRI image computing.
no code implementations • 22 Oct 2019 • Junqi Tang, Karen Egiazarian, Mohammad Golbabaee, Mike Davies
We investigate this phenomenon and propose a theory-inspired mechanism that lets practitioners efficiently characterize whether an inverse problem benefits from being solved with stochastic optimization techniques.
no code implementations • 23 May 2019 • Konstantinos Pitas, Andreas Loukas, Mike Davies, Pierre Vandergheynst
Deep convolutional neural networks (CNNs) have been shown to fit a random labeling of the data while still generalizing well on normal labels.
no code implementations • 21 May 2019 • Konstantinos Pitas, Mike Davies, Pierre Vandergheynst
Recently developed smart pruning algorithms use the DNN response over the training set for a variety of cost functions to determine redundant network weights, leading to less accuracy degradation and potentially shorter retraining times.
1 code implementation • 3 Oct 2018 • Mohammad Golbabaee, Zhouye Chen, Yves Wiaux, Mike Davies
Current popular methods for Magnetic Resonance Fingerprinting (MRF) recovery are bottlenecked by the heavy computation of a matched-filtering step, driven by the growing size and complexity of the fingerprint dictionaries in multi-parametric quantitative MRI applications.
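The matched-filtering step being accelerated can be sketched in a few lines (a toy single-parameter example with an assumed exponential-decay signal model; real MRF fingerprints come from Bloch-equation simulations over several parameters):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.01, 3.0, 200)            # acquisition time points (s)
T2_grid = np.linspace(0.02, 2.0, 500)      # candidate T2 values (s)

# Dictionary: one normalized decay curve ("fingerprint") per candidate T2
D = np.exp(-t[None, :] / T2_grid[:, None])
D /= np.linalg.norm(D, axis=1, keepdims=True)

T2_true = 0.37
x = np.exp(-t / T2_true) + 0.01 * rng.standard_normal(t.size)  # noisy voxel signal

# Matched filtering: inner product with every atom, keep the best match.
# The cost grows linearly with dictionary size (and this runs per voxel),
# which is the bottleneck dictionary-free methods aim to remove.
best = np.argmax(D @ x)
print(T2_grid[best])                       # close to the true 0.37 s
```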
no code implementations • 27 Aug 2018 • Heyi Li, Dong-Dong Chen, Bill Nailon, Mike Davies, Dave Laurenson
We explore the use of deep learning for breast mass segmentation in mammograms.
1 code implementation • 12 Mar 2018 • Konstantinos Pitas, Mike Davies, Pierre Vandergheynst
Recent DNN pruning algorithms have succeeded in reducing the number of parameters in fully connected layers, often with little or no drop in classification accuracy.
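The simplest member of this algorithm family, global magnitude pruning, can be sketched as follows (a generic illustration, not the specific algorithms analyzed in the paper; the sparsity level is an assumption):

```python
import numpy as np

rng = np.random.default_rng(4)

def magnitude_prune(W, sparsity=0.9):
    # Zero out the smallest-magnitude fraction of the weights: find the
    # threshold at the requested quantile, then mask everything below it.
    k = int(W.size * sparsity)
    thresh = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    mask = np.abs(W) > thresh
    return W * mask, mask

W = rng.standard_normal((256, 128))        # weights of a fully connected layer
W_pruned, mask = magnitude_prune(W, sparsity=0.9)
print(mask.mean())                          # fraction of weights kept, about 0.1
```

In practice pruning is followed by fine-tuning, and the question such papers study is how much accuracy survives at a given sparsity.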
no code implementations • ICLR 2018 • Konstantinos Pitas, Mike Davies, Pierre Vandergheynst
Recent DNN pruning algorithms have succeeded in reducing the number of parameters in fully connected layers, often with little or no drop in classification accuracy.
1 code implementation • 30 Dec 2017 • Konstantinos Pitas, Mike Davies, Pierre Vandergheynst
Recently the generalization error of deep neural networks has been analyzed through the PAC-Bayesian framework, for the case of fully connected layers.
no code implementations • 15 May 2017 • Ping Tak Peter Tang, Tsung-Han Lin, Mike Davies
Under a moderate but well-defined assumption, we prove that the SNN indeed solves sparse coding.
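The sparse-coding problem in question can be illustrated with the closely related non-spiking locally competitive algorithm (LCA) dynamics, whose structure (leaky state, feed-forward drive, lateral inhibition, threshold nonlinearity) is what spiking implementations approximate; the dictionary, regularization weight and step size below are illustrative assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(5)
n, d, k = 30, 100, 3
Phi = rng.standard_normal((n, d))
Phi /= np.linalg.norm(Phi, axis=0)          # unit-norm dictionary atoms

a_true = np.zeros(d)
a_true[rng.choice(d, k, replace=False)] = rng.uniform(1.0, 2.0, k)
y = Phi @ a_true                            # observed signal

lam, dt = 0.1, 0.05
st = lambda u: np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)  # threshold nonlinearity
G = Phi.T @ Phi - np.eye(d)                 # lateral inhibition weights
b = Phi.T @ y                               # feed-forward drive
u = np.zeros(d)                             # membrane-potential-like state
for _ in range(2000):
    u += dt * (b - u - G @ st(u))           # Euler step of the LCA dynamics
a = st(u)
print(np.linalg.norm(Phi @ a - y))          # small residual: a sparse code was found
```

At a fixed point these dynamics satisfy the optimality conditions of the l1-regularized sparse coding (LASSO) objective, which is the sense in which a network of such units "solves" sparse coding.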
no code implementations • 9 Feb 2014 • Alhussein Fawzi, Mike Davies, Pascal Frossard
The dictionary learning problem, which jointly learns the dictionary and linear classifier, is cast as a difference of convex (DC) program and solved efficiently with an iterative DC solver.
no code implementations • 9 Dec 2013 • Mike Davies, Gilles Puy, Pierre Vandergheynst, Yves Wiaux
Inspired by the recently proposed Magnetic Resonance Fingerprinting (MRF) technique, we develop a principled compressed sensing framework for quantitative MRI.
Information Theory