Multi-task learning (MTL) introduces an inductive bias based on a priori relations between tasks: the model is encouraged to learn more general representations by exploiting these inter-task relations as a training signal. Hierarchical MTL, in which different tasks are supervised at different layers of a deep neural network, provides a more effective inductive bias than “flat” MTL, where all tasks share the same top layer. Supervising lower layers directly also helps mitigate the vanishing gradient problem in deep networks.
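The core idea can be sketched with a minimal forward pass: a low-level task head reads an earlier layer of the shared stack, while a high-level task head reads a later layer. This is an illustrative NumPy sketch only; the dimensions, task names, and `forward` function are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only)
d_in, d_h1, d_h2 = 8, 16, 16
n_low, n_high = 5, 3  # e.g. a low-level tagging task vs. a higher-level task

# Shared stack of two layers
W1 = rng.normal(size=(d_in, d_h1))
W2 = rng.normal(size=(d_h1, d_h2))

# Task heads: the low-level head is attached to layer 1,
# the high-level head to layer 2
W_low = rng.normal(size=(d_h1, n_low))
W_high = rng.normal(size=(d_h2, n_high))

def forward(x):
    h1 = np.tanh(x @ W1)   # lower layer: supervised by the low-level task
    h2 = np.tanh(h1 @ W2)  # upper layer: supervised by the high-level task
    return h1 @ W_low, h2 @ W_high

x = rng.normal(size=(4, d_in))  # batch of 4 inputs
low_logits, high_logits = forward(x)
print(low_logits.shape, high_logits.shape)  # (4, 5) (4, 3)
```

During training, each task's loss would be backpropagated from its own head, so the gradient of the low-level task reaches the lower layer directly rather than passing through the full depth of the network.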
Source: Deep multi-task learning with low level tasks supervised at lower layers
| Task | Papers | Share |
|---|---|---|
| Multi-Task Learning | 3 | 27.27% |
| Recommendation Systems | 2 | 18.18% |
| Session-Based Recommendations | 1 | 9.09% |
| Generalization Bounds | 1 | 9.09% |
| Weather Forecasting | 1 | 9.09% |
| CCG Supertagging | 1 | 9.09% |
| Chunking | 1 | 9.09% |
| Domain Adaptation | 1 | 9.09% |