keras.optimizers

Built-in optimizer classes.

<code>schedules</code> module: Public API for tf.keras.optimizers.schedules namespace.
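A learning-rate schedule from this namespace can be passed to any optimizer in place of a fixed learning rate. A minimal sketch (the hyperparameter values are illustrative only):

```python
import tensorflow as tf

# Decay the learning rate by a factor of 0.96 every 1000 steps.
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=1000,
    decay_rate=0.96,
)

# The schedule object is accepted wherever a float learning rate is.
opt = tf.keras.optimizers.SGD(learning_rate=schedule)

# Calling the schedule with a step count yields the rate at that step.
print(float(schedule(0)))
```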

<code>class Adadelta</code>: Optimizer that implements the Adadelta algorithm.

<code>class Adagrad</code>: Optimizer that implements the Adagrad algorithm.

<code>class Adam</code>: Optimizer that implements the Adam algorithm.
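As a quick orientation for the class entries above, here is a minimal sketch of instantiating one of these optimizers (Adam, with an illustrative learning rate) and applying a single gradient step outside of `model.fit`:

```python
import tensorflow as tf

# Illustrative hyperparameter; not a recommendation.
opt = tf.keras.optimizers.Adam(learning_rate=1e-3)

# One manual update step on a single variable: minimize var ** 2.
var = tf.Variable(2.0)
with tf.GradientTape() as tape:
    loss = var ** 2
grads = tape.gradient(loss, [var])
opt.apply_gradients(zip(grads, [var]))

# The variable has moved toward the minimum at 0.
print(float(var.numpy()))
```

The same pattern applies to every optimizer class listed here; only the constructor arguments differ.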

<code>class Adamax</code>: Optimizer that implements the Adamax algorithm.

<code>class Ftrl</code>: Optimizer that implements the FTRL algorithm.

<code>class Nadam</code>: Optimizer that implements the NAdam algorithm.

<code>class Optimizer</code>: Updated base class for optimizers.

<code>class RMSprop</code>: Optimizer that implements the RMSprop algorithm.

<code>class SGD</code>: Stochastic gradient descent optimizer, with optional momentum.

<code>deserialize(...)</code>: Inverse of the <code>serialize</code> function.

<code>get(...)</code>: Retrieves a Keras Optimizer instance.
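The `get` helper resolves several kinds of identifiers to an `Optimizer` instance; a string name and an existing instance both work, as this sketch shows:

```python
import tensorflow as tf

# A string identifier resolves to a default-configured instance.
opt_from_name = tf.keras.optimizers.get("adam")

# An Optimizer instance is passed through unchanged.
opt_passthrough = tf.keras.optimizers.get(opt_from_name)

print(type(opt_from_name).__name__)
```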

<code>serialize(...)</code>: Serializes an optimizer into a config dict; inverse of <code>deserialize</code>.
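The `serialize`/`deserialize` pair round-trips an optimizer through a plain Python dict, which is how optimizer state travels inside saved model configs. A minimal sketch:

```python
import tensorflow as tf

opt = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)

# serialize returns a dict holding the class name and hyperparameters.
config = tf.keras.optimizers.serialize(opt)

# deserialize rebuilds an equivalent optimizer from that dict.
clone = tf.keras.optimizers.deserialize(config)

print(type(clone).__name__)
```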