tidy3d.plugins.invdes.AdamOptimizer#
- class AdamOptimizer[source]#
Bases: AbstractOptimizer
Specification for a gradient-descent optimization using the Adam method.
- Parameters:
attrs (dict = {}) – Dictionary storing arbitrary metadata for a Tidy3D object. This dictionary can be freely used by the user for storing data without affecting the operation of Tidy3D, as it is not used internally. Note that, unlike regular Tidy3D fields, attrs are mutable. For example, the following is allowed for setting an attr: obj.attrs['foo'] = bar. Also note that Tidy3D will raise a TypeError if attrs contain objects that cannot be serialized. One can check if attrs are serializable by calling obj.json().
design (Union[InverseDesign, InverseDesignMulti]) – Specification describing the inverse design problem we wish to optimize.
learning_rate (PositiveFloat) – Step size for the gradient descent optimizer.
maximize (bool = True) – If True, the optimizer will maximize the objective function. If False, the optimizer will minimize the objective function.
num_steps (PositiveInt) – Number of steps in the gradient descent optimizer.
results_cache_fname (Optional[str] = None) – If specified, will save the optimization state to a local .pkl file using dill.dump(). This file stores an InverseDesignResult corresponding to the latest state of the optimization. To continue this run from the file using the same optimizer instance, call optimizer.complete_run_from_history(). Alternatively, the latest results can be loaded with td.InverseDesignResult.from_file(fname) and then continued using optimizer.continue_run(result). See the usage sketch after this parameter list.
store_full_results (bool = True) – If True, stores the full history of the vector fields, specifically the gradient, params, and optimizer state. For large design regions and many iterations, storing the full history of these fields can lead to large file size and memory usage. In some cases, we recommend setting this field to False, which will only store the last computed state of these variables.
beta1 (ConstrainedFloatValue = 0.9) – Beta 1 parameter in the Adam optimization method.
beta2 (ConstrainedFloatValue = 0.999) – Beta 2 parameter in the Adam optimization method.
eps (PositiveFloat = 1e-08) – Epsilon parameter in the Adam optimization method.
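The snippet below is a minimal usage sketch, not a definitive recipe: it assumes a previously built InverseDesign (or InverseDesignMulti) instance named design, the file name and hyperparameter values are illustrative, and the resume calls are shown commented out since they only apply once a cached history exists.

```python
import tidy3d as td
import tidy3d.plugins.invdes as tdi

# `design` is assumed to be a previously constructed tdi.InverseDesign
# (or tdi.InverseDesignMulti) describing the problem to optimize.
optimizer = tdi.AdamOptimizer(
    design=design,
    learning_rate=0.2,    # step size for gradient descent
    num_steps=30,         # number of optimization iterations
    maximize=True,        # maximize (rather than minimize) the objective
    beta1=0.9,            # Adam first-moment decay rate
    beta2=0.999,          # Adam second-moment decay rate
    eps=1e-8,             # Adam numerical-stability constant
    results_cache_fname="invdes_history.pkl",  # save latest state to disk
)

# Resume an interrupted run from the cached .pkl with the same instance:
# result = optimizer.complete_run_from_history()

# Or load the stored result explicitly and continue it:
# result = td.InverseDesignResult.from_file("invdes_history.pkl")
# result = optimizer.continue_run(result)
```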
Attributes
design
Methods
initial_state(parameters) – Initial state of the optimizer.
update(parameters, gradient[, state]) – Update the parameters (and optimizer state) given the gradient.
Inherited Common Usage
- beta1#
- beta2#
- eps#
- __hash__()#
Hash method.
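For reference, the roles of learning_rate, beta1, beta2, and eps can be illustrated with a plain-NumPy sketch of a single Adam step. This is illustrative only and is not the library's internal implementation.

```python
import numpy as np

def adam_step(params, grad, m, v, step,
              learning_rate=0.2, beta1=0.9, beta2=0.999, eps=1e-8):
    """One illustrative Adam step (ascent direction, i.e. maximize=True)."""
    # Exponential moving averages of the gradient and its square.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction for the zero-initialized moment estimates.
    m_hat = m / (1 - beta1**step)
    v_hat = v / (1 - beta2**step)
    # eps guards against division by zero when the second moment is small.
    params = params + learning_rate * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v
```

Here m and v are zero-initialized arrays with the same shape as params, and step counts iterations starting from 1.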