optax_adagrad

phasic.optax_wrapper.optax_adagrad(
    learning_rate=0.01,
    initial_accumulator_value=0.1,
    eps=1e-07,
)

Create an Optax Adagrad optimizer wrapped for use with phasic.

Parameters

learning_rate : float or optax.Schedule = 0.01

Learning rate, either a fixed float or an Optax learning-rate schedule.

initial_accumulator_value : float = 0.1

Initial value for the per-parameter squared-gradient accumulator.

eps : float = 1e-07

Small constant added to the denominator for numerical stability.

Returns

: OptaxOptimizer

Wrapped Adagrad optimizer compatible with phasic SVGD.
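To illustrate what these parameters control, here is a minimal pure-Python sketch of a single standard Adagrad step. This is the textbook update rule, not phasic's or Optax's exact implementation: the wrapped optimizer returned by `optax_adagrad` applies the update internally when driven by phasic SVGD.

```python
import math

def adagrad_step(param, grad, acc, learning_rate=0.01, eps=1e-07):
    """One standard Adagrad update for a single scalar parameter.

    `acc` starts at `initial_accumulator_value` (0.1 by default) and
    accumulates squared gradients; `eps` keeps the division stable
    when the accumulator is near zero.
    """
    acc = acc + grad * grad
    param = param - learning_rate * grad / (math.sqrt(acc) + eps)
    return param, acc

# One step from param=1.0 with gradient 1.0, starting the accumulator
# at the default initial_accumulator_value of 0.1.
p, a = adagrad_step(1.0, 1.0, acc=0.1)
```

A larger `initial_accumulator_value` shrinks the very first updates (the denominator starts bigger), which can make early optimization less aggressive.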