Learning Rate Scheduling with Callbacks in TensorFlow

One useful tweak for faster training of neural networks is to vary (in most cases, reduce) the learning rate hyperparameter used by gradient-based optimization algorithms over the course of training.

Keras provides a callback, tf.keras.callbacks.LearningRateScheduler, that can be used to control this hyperparameter over time (number of iterations/epochs). To use it, we define a scheduling function that takes the epoch index and the current learning rate and returns the learning rate to use, wrap that function in the callback, and pass the callback to model.fit(), as in the sketch below.
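A minimal sketch of the wiring; the scheduler here is a do-nothing placeholder, and `model`, `x_train`, and `y_train` are assumed to be defined elsewhere:

```python
import tensorflow as tf

# A scheduling function receives the epoch index (0-based) and the
# current learning rate, and returns the rate to use for that epoch.
def scheduler(epoch, lr):
    return lr  # placeholder: keep the rate unchanged

callback = tf.keras.callbacks.LearningRateScheduler(scheduler)

# `model`, `x_train`, and `y_train` are assumed to exist already.
model.fit(x_train, y_train, epochs=20, callbacks=[callback])
```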

There are endless ways to schedule or control the learning rate; this section presents a few examples.

The following scheduling function keeps the learning rate at a constant value regardless of time.
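One way to write it (the 0.01 starting value is an arbitrary choice):

```python
INITIAL_LR = 0.01  # arbitrary starting value

def constant_schedule(epoch, lr):
    # Return the same rate every epoch, ignoring both arguments.
    return INITIAL_LR
```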

The following scheduling function gradually decreases the learning rate over time from a starting value. The mathematical formula is lr = lr0 / (1 + k*t), where lr0 is the initial learning rate value, k is a decay hyperparameter, and t is the epoch/iteration number.
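A sketch of this schedule; the values chosen for lr0 and k are arbitrary:

```python
INITIAL_LR = 0.01  # lr0, arbitrary starting value
DECAY = 0.1        # k, arbitrary decay hyperparameter

def time_based_schedule(epoch, lr):
    # lr = lr0 / (1 + k*t), with the epoch index playing the role of t.
    return INITIAL_LR / (1 + DECAY * epoch)
```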

The following scheduling function exponentially decreases the learning rate over time from the starting point. Mathematically it can be represented as lr = lr0 * exp(-k*t), where lr0 is the initial learning rate value, k is a decay hyperparameter, and t is the epoch/iteration number.
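A sketch of this schedule, with the same arbitrary lr0 and k as before:

```python
import math

INITIAL_LR = 0.01  # lr0, arbitrary starting value
DECAY = 0.1        # k, arbitrary decay hyperparameter

def exponential_schedule(epoch, lr):
    # lr = lr0 * exp(-k*t), with the epoch index playing the role of t.
    return INITIAL_LR * math.exp(-DECAY * epoch)
```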

The following scheduling function keeps the learning rate at its starting value for the first ten epochs and decreases it exponentially after that.
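A sketch of this schedule; it works on the rate passed in by the callback, so it needs no starting constant of its own (the decay value is arbitrary):

```python
import math

DECAY = 0.1  # arbitrary decay hyperparameter

def delayed_exponential_schedule(epoch, lr):
    # Hold the incoming rate steady for the first ten epochs...
    if epoch < 10:
        return lr
    # ...then multiply it by exp(-k) once per epoch afterwards.
    return lr * math.exp(-DECAY)
```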

The following chart visualizes the learning rate as it is scheduled by each of the previously defined functions.
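A short matplotlib sketch that produces such a chart by replaying each of the functions defined above; the 25-epoch window and the starting rate are arbitrary:

```python
import matplotlib.pyplot as plt

def simulate(schedule, initial_lr=0.01, epochs=25):
    # Replay a schedule epoch by epoch, feeding each returned rate
    # back in as the next epoch's current rate, as Keras would.
    lr, history = initial_lr, []
    for epoch in range(epochs):
        lr = schedule(epoch, lr)
        history.append(lr)
    return history

for name, fn in [("constant", constant_schedule),
                 ("time-based decay", time_based_schedule),
                 ("exponential decay", exponential_schedule),
                 ("constant then exponential", delayed_exponential_schedule)]:
    plt.plot(simulate(fn), label=name)

plt.xlabel("epoch")
plt.ylabel("learning rate")
plt.legend()
plt.show()
```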
