Recent Posts
[2021.01.28] Unsupervised learning, Semi-Supervised Learning | Paper review | 2021. 1. 28. 19:09
R. Caruana, 1997, "Multitask Learning". Multi-task learning (MTL) is an inductive transfer mechanism whose principal goal is to improve generalization performance. MTL improves generalization by leveraging the domain-specific information contained in the training signals of related tasks. In effect, the training signals for the extra tasks serve as an inductive bias. The standard methodolo..
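As a rough illustration of this idea, here is a minimal hard-parameter-sharing sketch in PyTorch (the class and variable names are hypothetical, not from the paper): a shared trunk feeds several task-specific heads, so every task's loss contributes gradients to the shared representation.

import torch
import torch.nn as nn

class SharedTrunkMTL(nn.Module):
    # One shared trunk, several task-specific heads (hard parameter sharing).
    def __init__(self, in_dim=16, hidden=32, num_tasks=3):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.heads = nn.ModuleList(nn.Linear(hidden, 1) for _ in range(num_tasks))

    def forward(self, x):
        h = self.trunk(x)  # shared representation
        return [head(h) for head in self.heads]  # one output per task

model = SharedTrunkMTL()
x = torch.randn(8, 16)
targets = [torch.randn(8, 1) for _ in range(3)]  # dummy per-task targets

# Summing the per-task losses means the extra tasks' training signals
# all flow into the shared trunk, acting as an inductive bias.
loss = sum(nn.functional.mse_loss(out, t) for out, t in zip(model(x), targets))
loss.backward()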
[PyTorch] .detach() | Data/Machine learning | 2021. 1. 28. 16:42
To stop a tensor from tracking its history, you can call .detach() to separate it from the computation history and prevent subsequent operations from being tracked. (source)

Example (Source: here)

import torch
import torch.nn as nn

modelA = nn.Linear(10, 10)
modelB = nn.Linear(10, 10)
modelC = nn.Linear(10, 10)

x = torch.randn(1, 10)
a = modelA(x)

# b is computed from a.detach(), so the backward pass stops there
b = modelB(a.detach())
b.mean().backward()
print(modelA.weight.grad)  # None: .detach() cut modelA out of the graph
print(modelB.weight.grad)  # populated

# c is computed from the non-detached a, so gradients reach modelA
c = modelC(a)
c.mean().backward()
print(modelA.weight.grad)  # now populated
print(modelC.weight.grad)  # populated
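A typical use case is a stop-gradient: the detached tensor still carries the same values, but its producers are excluded from the backward pass. This is exactly the mechanism the SimSiam post below builds on.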
[PyTorch] .detach() in Loss Function | Data/Machine learning | 2021. 1. 28. 14:15
What happens if you put .detach() in a loss function, as in the SimSiam algorithm? Example 1. Let's say we have the following equations: $J = y_1 y_2$, $y_1 = 2x$, $y_2 = 3x$. Then, naturally, the derivative of $J$ w.r.t. $x$ is $\frac{\partial J}{\partial x} = \frac{\partial}{\partial x}(2x)(3x) = \frac{\partial}{\partial x}6x^2 = 12x$. However, if .detach() is applied to $y_1$, we treat $y_1$ as a constant when computing derivatives: $$ \frac{\partial..
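Both derivatives can be checked with autograd directly; here is a minimal sketch (the value x = 2 is illustrative):

import torch

# Without .detach(): J = (2x)(3x) = 6x^2, so dJ/dx = 12x
x = torch.tensor(2.0, requires_grad=True)
y1 = 2 * x
y2 = 3 * x
J = y1 * y2
J.backward()
print(x.grad)  # tensor(24.) = 12 * 2

# With y1 detached: y1 is treated as a constant, so dJ/dx = 3 * y1 = 6x
x = torch.tensor(2.0, requires_grad=True)
y1 = (2 * x).detach()
y2 = 3 * x
J = y1 * y2
J.backward()
print(x.grad)  # tensor(12.) = 6 * 2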