Releases: FluxML/Optimisers.jl

Optimisers v0.4.6 (01 Apr 20:06, commit 4ff61fc)

Diff since v0.4.5

Merged pull requests:

Optimisers v0.4.5 (22 Feb 06:50, commit b75f9e1)

Diff since v0.4.4

Merged pull requests:

Closed issues:

  • Type Constraints in the Rule Structs (#205)
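
On #205 above: the concern is that hard-coding field types such as Float64 in a rule struct prevents the rule from carrying other number types. Below is a minimal sketch of a custom rule with a parametric field, using the documented AbstractRule/init/apply! interface; the rule name and its behaviour are hypothetical:

```julia
using Optimisers

# Hypothetical rule: plain gradient descent with a parametric step size,
# so ScaledDescent(0.1f0) carries a Float32 rather than forcing Float64.
struct ScaledDescent{T<:Real} <: Optimisers.AbstractRule
    eta::T
end

Optimisers.init(o::ScaledDescent, x::AbstractArray) = nothing  # stateless rule

function Optimisers.apply!(o::ScaledDescent, state, x, dx)
    # Return the unchanged state and the step to subtract, eta .* dx.
    return state, o.eta .* dx
end

# Usage: state = Optimisers.setup(ScaledDescent(0.1f0), model)
```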

Optimisers v0.4.4 (07 Jan 18:38, commit 91569f6)

Diff since v0.4.3

Merged pull requests:

Optimisers v0.4.3 (05 Jan 19:39, commit 6518001)

Diff since v0.4.2

Merged pull requests:

Optimisers v0.4.2 (11 Dec 16:08, commit 669798c)

Diff since v0.4.1

Merged pull requests:

Closed issues:

  • Optimiser state not moving to GPU (#179)
  • AdamW: epsilon and lambda swapped? (#197)
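
On #197 above: the doubt concerns the positional order of epsilon and lambda in AdamW. Passing them by keyword sidesteps the question entirely; a minimal sketch, assuming the all-keyword constructors available since v0.3.2:

```julia
using Optimisers

# Keywords make the intent explicit, whatever the positional order is:
rule = AdamW(eta = 1e-3, lambda = 1e-2, epsilon = 1e-8)

# On #179: the state tree mirrors the model, so calling setup after the
# model has been moved to the GPU keeps the rule's moments on the same
# device. `gpu` stands in for whatever transfer function you use
# (e.g. Flux.gpu) -- an assumption, not part of this package:
# state = Optimisers.setup(rule, gpu(model))
```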

Optimisers v0.4.1 (08 Nov 21:23, commit 2639523)

Diff since v0.4.0

Merged pull requests:

Optimisers v0.4.0 (07 Nov 07:38, commit 38c9d62)

Diff since v0.3.4

Merged pull requests:

Closed issues:

  • Stable docs will 404 until a new version is tagged (#25)
  • Allow keyword arguments for optimisers (#74)
  • doc improvement: working with custom model types (#84)
  • Rename or outsource iswriteable (#99)
  • Split out the rules.jl as a sub-package (or a separate package)? (#108)
  • Wrong model update for BatchNorm for some specific syntax (#123)
  • Use OptChain as an alias for OptimiserChain? (#138)
  • nothing does not correspond to updating the state with a zero gradient. (#140)
  • Utility for walking a tree (e.g. gradients) w.r.t. a model (#143)
  • Adam optimizer can produce NaNs with Float16 due to small epsilon (#167)
  • mark as public any non-exported but documented interface (#189)
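
On #167 above: Adam's default epsilon of 1e-8 is below the smallest Float16 subnormal (about 6e-8), so in Float16 it rounds to zero and the denominator sqrt(v) + epsilon can vanish, producing Inf/NaN steps. A minimal sketch of the workaround, choosing an epsilon Float16 can represent (keyword form assumed):

```julia
using Optimisers

Float16(1e-8) == 0  # true: the default epsilon underflows in Float16

# An epsilon comfortably inside Float16's range avoids the 0/0 step:
rule = Adam(epsilon = 1e-4)
```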

Optimisers v0.3.4 (05 Nov 05:53, commit 0ae05d6)

Diff since v0.3.3

Merged pull requests:

Closed issues:

  • Design (#1)
  • convenience constructors (#11)
  • Restructure is not type stable but could be made stable? (#177)
  • AdamW optimizer implemented incorrectly - weight decay does not incorporate learning rate (#182)
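
On #182 above: in Loshchilov and Hutter's AdamW the decay term is scaled by the learning rate (each step subtracts eta * lambda * w alongside the Adam step), whereas following Adam with a plain WeightDecay subtracts lambda * w with no eta factor. A sketch of the two behaviours; the `couple` keyword is assumed, as in Optimisers.jl v0.4:

```julia
using Optimisers

eta, lambda = 1f-3, 1f-2

# Paper-style AdamW: decay coupled to the learning rate.
paper = AdamW(eta = eta, lambda = lambda, couple = true)

# The pre-fix behaviour: Adam's step, then a decay term lambda .* w
# that never sees eta.
legacy = OptimiserChain(Adam(eta), WeightDecay(lambda))
```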

Optimisers v0.3.3 (09 Apr 02:27, commit c2ae321)

Diff since v0.3.2

Merged pull requests:

Closed issues:

  • Documenter CI is failing (#169)

Optimisers v0.3.2 (08 Feb 04:27, commit 1908a1c)

Diff since v0.3.1

Merged pull requests:

  • WeightDecay for L1 norm (#159) (@mcabbott)
  • Add all-keyword constructors, much like @kwdef (#160) (@mcabbott)
  • CompatHelper: add new compat entry for Statistics at version 1, (keep existing compat) (#164) (@github-actions[bot])
  • Don't load Yota at all (#166) (@mcabbott)
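
On #159 and #160 above: a minimal sketch of both additions, assuming the L1 rule is exposed as SignDecay (its name in current Optimisers.jl) and that keyword constructors mirror the field names:

```julia
using Optimisers

# SignDecay adds lambda .* sign(x) to the gradient (an L1 penalty),
# where WeightDecay adds lambda .* x (L2):
rule = OptimiserChain(SignDecay(1f-4), Descent(1f-2))

# All-keyword construction, much like @kwdef:
mom = Momentum(eta = 0.01, rho = 0.9)
```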