Releases: FluxML/Optimisers.jl
v0.4.6
v0.4.5
Optimisers v0.4.5
Merged pull requests:
- type parametrization for rules (#209) (@CarloLucibello)
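A minimal sketch of what this parametrization is for, assuming the rule structs now store hyperparameters in whatever float type you pass rather than forcing Float64 (the exact field types here are an assumption):

```julia
using Optimisers

# Construct Adam with Float32 hyperparameters; with parametric rule structs these
# are assumed to be kept as Float32 instead of being widened to Float64.
rule = Adam(1f-3, (0.9f0, 0.999f0))

# Usage is unchanged: build state for a model, then apply a gradient.
model = (w = rand(Float32, 3),)
state = Optimisers.setup(rule, model)
grad = (w = ones(Float32, 3),)
state, model = Optimisers.update(state, model, grad)
```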
Closed issues:
- Type Constraints in the Rule Structs (#205)
v0.4.4
v0.4.3
Optimisers v0.4.3
Merged pull requests:
v0.4.2
Optimisers v0.4.2
Merged pull requests:
- Add `Adapt.adapt_structure` method for `Optimisers.Leaf` (#180) (@vpuri3)
- fix AdamW (#198) (@CarloLucibello)
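A hedged sketch of what PR #180 enables: `Adapt.adapt` can now recurse into an optimiser state tree, converting the arrays stored inside each `Optimisers.Leaf` alongside the model. The CPU-only `Array` conversion below stands in for the usual GPU case (`adapt(CuArray, state)` or `cu(state)` with CUDA.jl loaded), which is the assumed motivation:

```julia
using Optimisers, Adapt

model = (w = rand(Float32, 3, 3), b = zeros(Float32, 3))
state = Optimisers.setup(Adam(), model)   # a tree of Optimisers.Leaf mirroring the model

# adapt now reaches the momenta stored inside each Leaf; converting to Array is a
# no-op on CPU, but the same call with a GPU array type would move the state over.
state_cpu = Adapt.adapt(Array, state)
```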
Closed issues:
v0.4.1
v0.4.0
Optimisers v0.4.0
Merged pull requests:
- make docstrings consistent (#187) (@CarloLucibello)
- Add the option `couple` to AdamW and set the default to match PyTorch (#188) (@CarloLucibello)
- fix epsilon for Float16 (#190) (@CarloLucibello)
- docs for nothing behavior and for walking a tree with keypath (#191) (@CarloLucibello)
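A sketch of the `couple` option from PR #188, assuming the keyword is spelled as in the PR title: with `couple = true` (the new default) the weight decay is coupled to the learning rate, matching PyTorch's AdamW, while `couple = false` keeps the previous fully decoupled behaviour. Treat the exact constructor signature as an assumption:

```julia
using Optimisers

rule_default   = AdamW()                 # new default: decay coupled to the learning rate
rule_decoupled = AdamW(couple = false)   # previous behaviour (keyword name assumed from PR #188)

model = (w = rand(3),)
state = Optimisers.setup(rule_default, model)
```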
Closed issues:
- Stable docs will 404 until a new version is tagged (#25)
- Allow keyword arguments for optimisers (#74)
- doc improvement: working with custom model types (#84)
- Rename or outsource `iswriteable` (#99)
- Split out the `rules.jl` as a sub-package (or a separate package)? (#108)
- Wrong model update for BatchNorm for some specific syntax (#123)
- Use `OptChain` as an alias for `OptimiserChain`? (#138)
- `nothing` does not correspond to updating the state with a zero gradient. (#140)
- Utility for walking a tree (e.g. gradients) w.r.t. a model (#143)
- Adam optimizer can produce NaNs with Float16 due to small epsilon (#167)
- mark as public any non-exported but documented interface (#189)
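On issue #140 above: a `nothing` gradient marks a subtree as having produced no gradient, which is not the same as a zero gradient. A minimal sketch of the assumed behaviour, where the parameters and optimiser state of that subtree are left untouched:

```julia
using Optimisers

model = (w = ones(3), b = zeros(3))
state = Optimisers.setup(Momentum(0.1), model)

# `b = nothing` means "no gradient here": `b` and its momentum are skipped,
# rather than being updated as if the gradient were zero.
grads = (w = fill(0.5, 3), b = nothing)
state, model = Optimisers.update(state, model, grads)
```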
v0.3.4
Optimisers v0.3.4
Merged pull requests:
- Fix macro signature in docstring (#184) (@abhro)
- fix docs (#185) (@CarloLucibello)
- allow Functors 0.5 (#186) (@CarloLucibello)
Closed issues:
v0.3.3
Optimisers v0.3.3
Merged pull requests:
- add missing SignDecay doc reference (#168) (@CarloLucibello)
- fix broken doc links (#170) (@CarloLucibello)
- add `trainables` (#171) (@CarloLucibello)
- fix broken documentation (#172) (@CarloLucibello)
- add `path` option to `trainables` (#174) (@CarloLucibello)
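A brief sketch of `trainables` from PRs #171 and #174, assuming it collects the trainable parameter arrays of a model, and that `path = true` additionally yields a `KeyPath` locating each array in the model (the exact return types are an assumption):

```julia
using Optimisers

model = (layers = ((W = rand(3, 4), b = zeros(3)), (W = rand(2, 3), b = zeros(2))),)

ps = trainables(model)                    # the trainable arrays, flattened into a vector
for (kp, p) in trainables(model; path = true)
    println(kp, " => ", size(p))          # KeyPath into the model, and the array it points to
end
```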
Closed issues:
- Documenter CI is failing (#169)
v0.3.2
Optimisers v0.3.2
Merged pull requests: