-
Access 'Decayed learning rate' in TF (Data/Machine learning, 2020. 10. 9. 09:50)
We assume that we use tf.keras.optimizers.schedules.ExponentialDecay and we'd like to print the current decayed learning rate from a Callback. Normally, we can print the learning rate with the following Callback class:

    class LearningRateTracker(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs=None):
            print("current lr: ", self.model.optimizer.learning_rate)

However, the current d..
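When a schedule is assigned, `optimizer.learning_rate` holds the schedule object rather than a plain float, so the decayed value at a given step has to be computed from the schedule's formula. As a framework-free sketch, the value ExponentialDecay produces follows its documented formula (the function below is an illustrative stand-in, not the TF API):

```python
# Decayed LR as computed by tf.keras.optimizers.schedules.ExponentialDecay:
#   lr(step) = initial_learning_rate * decay_rate ** (step / decay_steps)
# With staircase=True the exponent is truncated to an integer, so the rate
# decays in discrete jumps instead of continuously.

def exponential_decay(initial_learning_rate, decay_steps, decay_rate,
                      step, staircase=False):
    """Mirror of ExponentialDecay's formula for a given global step."""
    exponent = step / decay_steps
    if staircase:
        exponent = exponent // 1  # floor: decay only at multiples of decay_steps
    return initial_learning_rate * decay_rate ** exponent

# After decay_steps=100 steps with decay_rate=0.5, the rate halves:
print(exponential_decay(0.1, 100, 0.5, 100))                  # 0.05
print(exponential_decay(0.1, 100, 0.5, 50))                   # ~0.0707 (continuous)
print(exponential_decay(0.1, 100, 0.5, 50, staircase=True))   # 0.1 (not yet decayed)
```

Inside the callback, the equivalent is to call the schedule with the optimizer's current iteration count rather than printing the schedule object itself.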
-
Weights & Biases (W&B) Tutorial: TF2 (Data/Machine learning, 2020. 10. 8. 12:50)
1. Log in to wandb in a command window with the following command:
       $ wandb login
2. Open the Jupyter notebook.
3. Initialize wandb as follows:
       wandb.init(project="bayesian_optimization_test01", config={"lr": 0.001, "layer_size": 64})
   In config you can specify all the hyper-parameters you want to track.
4. All metrics and loss values can be logged with the following code: from wandb.keras impo..
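Since wandb needs an account and a login to run, here is an offline stand-in that sketches the pattern the steps above describe: a run is created with a config of hyper-parameters, and metrics are appended per step via a log() call. The `MiniRun` class and its names are hypothetical illustrations, not the real wandb API:

```python
# Offline stand-in for the wandb logging pattern: the constructor plays the
# role of wandb.init(project=..., config=...), and log() plays the role of
# wandb.log({...}). Purely illustrative; not the wandb library.

class MiniRun:
    def __init__(self, project, config):
        self.project = project
        self.config = dict(config)   # hyper-parameters to track
        self.history = []            # one dict of metrics per log() call

    def log(self, metrics):
        """Record a dict of metric values for the current step/epoch."""
        self.history.append(dict(metrics))

# Usage mirrors steps 3-4 above:
run = MiniRun(project="bayesian_optimization_test01",
              config={"lr": 0.001, "layer_size": 64})
for epoch in range(3):
    run.log({"epoch": epoch, "loss": 1.0 / (epoch + 1)})

print(len(run.history))    # 3
print(run.config["lr"])    # 0.001
```

In the real library, wandb's Keras callback makes these log() calls automatically at the end of every epoch.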
-
Bayesian Optimization (Data/Machine learning, 2020. 10. 7. 20:17)
Using Bayesian Optimization (BO), we can automate the hyper-parameter tuning process.
Background / theory: refer to this blog posting.
Python open-source library for BO and its tutorial: refer to this blog posting and the library's tutorial document.
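To make the idea concrete, here is a minimal self-contained BO loop: a Gaussian-process surrogate plus an upper-confidence-bound (UCB) acquisition function searching for the best value of a single hyper-parameter in [0, 1]. The quadratic `objective` is an assumed stand-in for, say, validation accuracy; a real tuning job would use a library such as the one referenced above:

```python
import numpy as np

def objective(x):
    """Stand-in for an expensive metric; unknown maximum at x = 0.6."""
    return -(x - 0.6) ** 2

def rbf(a, b, length_scale=0.2):
    """RBF kernel matrix between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale ** 2)

def gp_posterior(X, y, X_cand, noise=1e-6):
    """Zero-mean GP posterior mean and std at the candidate points."""
    K = rbf(X, X) + noise * np.eye(len(X))
    K_s = rbf(X, X_cand)
    mu = K_s.T @ np.linalg.solve(K, y)
    var = np.clip(1.0 - np.sum(K_s * np.linalg.solve(K, K_s), axis=0),
                  1e-12, None)
    return mu, np.sqrt(var)

def bayes_opt(n_iter=15, kappa=2.0):
    X_cand = np.linspace(0.0, 1.0, 201)   # candidate hyper-parameter grid
    X = np.array([0.0, 1.0])              # two initial evaluations
    y = objective(X)
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, X_cand)
        x_next = X_cand[np.argmax(mu + kappa * sigma)]  # UCB: explore + exploit
        X = np.append(X, x_next)
        y = np.append(y, objective(x_next))
    return X[np.argmax(y)], y.max()

best_x, best_y = bayes_opt()
print(best_x)   # close to the true optimum 0.6
```

The UCB term `mu + kappa * sigma` is what automates the tuning: early iterations sample where uncertainty is high, later ones concentrate near the observed optimum.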
-
Word cloud (Data/Machine learning, 2020. 9. 12. 15:34)
Easy way to generate a word cloud: use the library "wordcloud": https://github.com/amueller/word_cloud

Simple example:

    from wordcloud import WordCloud
    word_string = 'oh oh oh oh ........ culture black culture black culture'
    wordcloud = WordCloud(width=1600, height=800, collocations=False).generate(word_string)
    # 'width', 'height' increase the resolution of the image
    # 'collocations=Fa..
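WordCloud sizes each word by its frequency, and with collocations enabled it also counts frequently co-occurring word pairs (bigrams) as single entries. The standard-library sketch below shows that unigram-vs-bigram distinction on the example string; WordCloud's own tokenizer applies stopwords and lowercasing, so its internal counts can differ slightly:

```python
from collections import Counter

# Unigram counts: what the cloud is based on when collocations=False.
word_string = 'oh oh oh oh culture black culture black culture'
words = word_string.split()
unigrams = Counter(words)

# Bigram counts: the extra pair entries collocations=True can add.
bigrams = Counter(' '.join(pair) for pair in zip(words, words[1:]))

print(unigrams.most_common(2))   # [('oh', 4), ('culture', 3)]
print(bigrams['black culture'])  # 2
```

So with collocations left at its default, a pair like "black culture" may appear in the image as one combined entry; collocations=False keeps only individual words.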