Paper review
-
[2021.04.12] Barlow Twins | Paper review | 2021. 4. 15. 11:06
J. Zbontar et al., 2021, "Barlow Twins: Self-Supervised Learning via Redundancy Reduction" This paper proposes an objective function that naturally avoids such collapse: it measures the cross-correlation matrix between the outputs of two identical networks fed with distorted versions of a sample, and pushes that matrix as close to the identity matrix as possible. This causes the representation vectors of distor..
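The truncated summary above states the core idea. A minimal NumPy sketch of such a cross-correlation objective (the function name and the `lam` trade-off weight are illustrative choices, not taken from the paper's code):

```python
import numpy as np

def barlow_twins_loss(z_a, z_b, lam=5e-3):
    """Sketch of the redundancy-reduction objective: push the
    cross-correlation matrix of two embedding batches toward identity."""
    # Normalize each embedding dimension over the batch (zero mean, unit std)
    z_a = (z_a - z_a.mean(0)) / z_a.std(0)
    z_b = (z_b - z_b.mean(0)) / z_b.std(0)
    n = z_a.shape[0]
    c = z_a.T @ z_b / n                                   # d x d cross-correlation
    on_diag = ((np.diag(c) - 1) ** 2).sum()               # invariance term
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()   # redundancy term
    return on_diag + lam * off_diag
```

When the two views give identical, perfectly decorrelated embeddings, the cross-correlation matrix is exactly the identity and the loss is zero; any residual correlation between distinct dimensions is penalized by the off-diagonal term.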
-
[2021.03.15] Few-shot Learning; Self-supervised Learning | Paper review | 2021. 3. 15. 13:43
S. Gidaris et al., 2018, "Dynamic few-shot visual learning without forgetting" This paper proposes a few-shot object recognition system that can dynamically learn novel categories from only a few training examples while at the same time not forgetting the base categories on which it was trained. To achieve this, the authors introduce two components: 1) Classifier of a ConvNet as a ..
-
[2021.03.09] Few-shot Learning; Self-supervised Learning | Paper review | 2021. 3. 11. 14:29
[Seminar Video] Metric-based Approaches to Meta-learning. Common Terminology regarding a Dataset for Meta-learning. What is a Metric-based Approach to Meta-learning? It refers to methodologies that apply the concept of metric learning to meta-learning: Deep siamese network; Matching network; Prototypical network; Relation network. Reference: http://dmqm.korea.ac.kr/activity/seminar/301 G. Koch, et al., 2015, "Siamese Neura..
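Of the metric-based methods listed above, the prototypical network has the simplest classification rule: average the support embeddings of each class into a prototype, then assign a query to the nearest prototype. A NumPy sketch (the function name and distance choice are illustrative, assuming pre-computed embeddings):

```python
import numpy as np

def prototypical_classify(support, support_labels, query):
    """Sketch of a prototypical-network step: each class prototype is the
    mean of its support embeddings; a query goes to the nearest prototype
    by Euclidean distance."""
    labels = np.array(support_labels)
    classes = sorted(set(support_labels))
    # Class prototype = mean embedding of that class's support set
    protos = np.stack([support[labels == c].mean(0) for c in classes])
    dists = np.linalg.norm(protos - query, axis=1)
    return classes[int(np.argmin(dists))]
```

Matching networks and relation networks replace this fixed Euclidean comparison with attention over individual support examples and a learned similarity module, respectively.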
-
[2021.01.28] Unsupervised Learning; Semi-supervised Learning | Paper review | 2021. 1. 28. 19:09
R. Caruana, 1997, "Multitask Learning" Multi-task learning (MTL) is an inductive transfer mechanism whose principal goal is to improve generalization performance. MTL improves generalization by leveraging the domain-specific information contained in the training signals of related tasks; in effect, the training signals for the extra tasks serve as an inductive bias. The standard methodolo..
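The summary describes MTL's inductive-transfer mechanism; the usual realization is hard parameter sharing, where all tasks pass through one shared representation before task-specific heads. A minimal NumPy forward-pass sketch (layer sizes and names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared hidden layer: the inductive bias, since every task must use it
W_shared = rng.normal(size=(4, 8))
# Task-specific output heads (e.g. a regression task and a 3-way classifier)
W_task_a = rng.normal(size=(8, 1))
W_task_b = rng.normal(size=(8, 3))

def forward(x):
    """Hard parameter sharing: one shared representation, two task heads."""
    h = np.tanh(x @ W_shared)       # representation shared across tasks
    return h @ W_task_a, h @ W_task_b

x = rng.normal(size=(2, 4))
y_a, y_b = forward(x)               # gradients from both tasks would flow into W_shared
```

Because both task losses backpropagate into `W_shared`, the extra task's training signal regularizes the representation used by the main task.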