13 Dec 2016 • Victor Dorobantu, Per Andre Stromhaug, Jess Renteria
The vanishing and exploding gradient problems are well-studied obstacles that make it difficult for recurrent neural networks to learn long-term time dependencies.
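The mechanism behind these problems can be illustrated with a toy example (not from the paper): in a scalar linear recurrence h_t = w * h_{t-1}, the backpropagated gradient dh_T/dh_0 is w**T, which shrinks toward zero when |w| < 1 and blows up when |w| > 1 as the number of time steps T grows.

```python
def gradient_through_time(w: float, steps: int) -> float:
    """Gradient of h_T w.r.t. h_0 for the toy recurrence h_t = w * h_{t-1}.

    Backpropagation through time multiplies the same Jacobian (here the
    scalar w) once per step, so the gradient is w ** steps.
    """
    grad = 1.0
    for _ in range(steps):
        grad *= w  # chain rule applies the step Jacobian repeatedly
    return grad


# Over 100 steps the gradient vanishes for |w| < 1 and explodes for |w| > 1.
print(gradient_through_time(0.9, 100))  # ~2.7e-5 (vanishing)
print(gradient_through_time(1.1, 100))  # ~1.4e+4 (exploding)
```

In a real RNN the scalar w becomes the recurrent weight matrix, and the same effect is governed by its singular values, which is why norm-preserving (orthogonal) recurrent matrices are of interest for learning long-term dependencies.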