-
Dilated Causal Convolution from WaveNet (Data/Machine learning, 2021. 3. 1. 13:20)
Concept: It was first proposed in the paper on WaveNet, which was developed by Google to generate realistic-sounding speech from text. You can try WaveNet text-to-speech here. A comparison with and without the dilated causal convolution (DCC) is shown in the following figure: it can be observed that the DCC covers a longer stretch of the time series, which allows the model to capture the global e..
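The idea behind the DCC can be sketched in plain NumPy (this is an illustrative sketch, not code from the post): a causal convolution only looks at the current and past samples, and dilating it by a factor d spaces those taps d steps apart, so stacking layers with dilations 1, 2, 4, ... grows the receptive field exponentially with depth.

```python
import numpy as np

def dilated_causal_conv1d(x, w, dilation=1):
    """Causal 1-D convolution: the output at time t only sees
    x[t], x[t - d], x[t - 2d], ... (never the future).
    The past is zero-padded so the output length matches the input."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(w[j] * xp[pad + t - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

# Receptive field of a stack with kernel size 2 and dilations 1, 2, 4, 8,
# as in WaveNet: 1 + sum over layers of (kernel_size - 1) * dilation.
dilations = [1, 2, 4, 8]
receptive_field = 1 + sum((2 - 1) * d for d in dilations)  # 16 time steps
```

Feeding an impulse through the layer makes the causality visible: the response appears only at and after the impulse, never before it.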
-
PCA using Python (Mathematics, 2021. 2. 2. 19:28)
    import os
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.decomposition import TruncatedSVD
    from utils.helper_03 import *

    # set random seed
    np.random.seed(1)

    # Generate data
    x = np.arange(-0.5, 0.5, 0.01)
    y = x + (np.random.rand(len(x)) - 0.5) * 0.2
    plt.figure(figsize=(5, 5))
    plt.plot(x, y, 'o'); plt.grid()

    target_arr = np.stack((x, y)).T  # shape (100, 2)
    u, s, vh = np.linalg.svd(target_arr, full_matrices=True)
    first_com..
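As a sanity check on the SVD route (a sketch assuming the same toy data as above, not code from the post), the first right singular vector of the centred data matrix should agree with the first component reported by sklearn's PCA, up to an overall sign flip:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
x = np.arange(-0.5, 0.5, 0.01)
y = x + (rng.random(len(x)) - 0.5) * 0.2
target_arr = np.stack((x, y)).T              # shape (100, 2)

# Centre first: PCA directions are the right singular vectors of the
# *centred* data matrix, so centring makes SVD and PCA agree.
centred = target_arr - target_arr.mean(axis=0)
u, s, vh = np.linalg.svd(centred, full_matrices=False)
first_component_svd = vh[0]                  # direction of maximum variance

first_component_pca = PCA(n_components=1).fit(target_arr).components_[0]

# The two unit directions match up to sign.
assert np.allclose(np.abs(first_component_svd), np.abs(first_component_pca))
```

Since y is roughly equal to x here, the first component points approximately along the diagonal (1, 1) / sqrt(2).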
-
[2021.01.28] Unsupervised Learning, Semi-Supervised Learning (Paper review, 2021. 1. 28. 19:09)
R. Caruana, 1997, "Multitask Learning". Multi-task learning (MTL) is an inductive-transfer mechanism whose principal goal is to improve generalization performance. MTL improves generalization by leveraging the domain-specific information contained in the training signals of related tasks; in effect, the training signals for the extra tasks serve as an inductive bias. The standard methodolo..
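The hard-parameter-sharing setup Caruana describes can be sketched as follows (a minimal NumPy sketch; the dimensions and task names are illustrative, not from the paper): one hidden layer is shared by all tasks, so the training signal from each extra task shapes the shared representation, while each task keeps its own small output head.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared hidden layer: during training its weights would receive gradients
# from every task, which is how the extra tasks act as an inductive bias.
W_shared = rng.normal(size=(4, 8))      # input dim 4 -> shared representation dim 8

# One small output head per task; only these weights are task-specific.
heads = {
    "main_task":  rng.normal(size=(8, 1)),
    "extra_task": rng.normal(size=(8, 1)),
}

def forward(x, task):
    h = np.tanh(x @ W_shared)           # representation shared across tasks
    return h @ heads[task]              # task-specific prediction

x = rng.normal(size=(5, 4))             # a batch of 5 examples
y_main = forward(x, "main_task")        # shape (5, 1)
y_extra = forward(x, "extra_task")      # same shared features, different head
```

Both predictions are computed from the same hidden activations; only the final linear map differs per task.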