Access 'Decayed learning rate' in TF
We assume that we use tf.keras.optimizers.schedules.ExponentialDecay and we'd like to print the current decayed learning rate using a Callback. Normally, we can print the learning rate using the following Callback class:

class LearningRateTracker(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        # Print the optimizer's learning_rate attribute at the end of each epoch.
        print("current lr: ", self.model.optimizer.learning_rate)
However, the current decayed learning rate cannot be accessed like that.
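The reason is that, when a LearningRateSchedule such as ExponentialDecay is passed to the optimizer, the learning_rate attribute holds the schedule object itself rather than the current scalar value, so the callback above only prints the schedule object. A minimal sketch, assuming an Adam optimizer and placeholder schedule parameters (not from the original post):

import tensorflow as tf

# Hypothetical schedule and optimizer, just to illustrate what gets printed.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.9)
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)

# learning_rate refers to the schedule object, not a scalar, so printing it
# shows something like "<... ExponentialDecay object at 0x...>".
print(optimizer.learning_rate)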
To print out the current decayed learning rate, you should use the following Callback class:

class LearningRateTracker(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        # _decayed_lr() applies the schedule at the optimizer's current step
        # and returns the resulting learning rate as a tensor.
        current_decayed_lr = self.model.optimizer._decayed_lr(tf.float32).numpy()
        print("current decayed lr: {:0.7f}".format(current_decayed_lr))
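For context, here is a minimal usage sketch showing where the callback is attached; the data, model, and hyperparameter values are placeholders and not part of the original post:

import numpy as np
import tensorflow as tf

# Placeholder data and model, only to demonstrate attaching the callback.
x = np.random.rand(256, 10).astype("float32")
y = np.random.rand(256, 1).astype("float32")

lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=100, decay_rate=0.9)

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr_schedule),
              loss="mse")

# The decayed learning rate is printed at the end of every epoch.
model.fit(x, y, epochs=3, callbacks=[LearningRateTracker()])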