beta v0.0.1

Pre-release

@TYTTYTTYT released this 13 May 00:57
· 3 commits to main since this release

Insight

AdamR is an Adam-style optimizer that tends to keep parameters close to their pre-trained values during fine-tuning.
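The release notes do not spell out the update rule, but one natural reading of "maintain the parameters from pre-training" is a decoupled weight decay that pulls parameters toward their initial (pre-trained) values instead of toward zero. The sketch below illustrates that idea for a single tensor using NumPy; the function name, signature, and hyperparameter defaults are assumptions for illustration, not this project's actual API.

```python
import numpy as np

def adamr_step(param, grad, m, v, init_param, t,
               lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, decay=0.01):
    """One hypothetical AdamR update on a NumPy array.

    Identical to AdamW except that the decoupled decay term pulls
    `param` toward `init_param` (the pre-trained weights) rather
    than toward zero.
    """
    # Standard Adam moment estimates with bias correction
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Decay toward the initial parameters instead of toward the origin
    param = param - lr * (m_hat / (np.sqrt(v_hat) + eps)
                          + decay * (param - init_param))
    return param, m, v
```

With a zero gradient, the Adam term vanishes and the step reduces to pulling `param` a small distance toward `init_param`, which is exactly the "maintain the pre-trained parameters" behavior described above.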

TODOs

  • A basic AdamR optimizer, usable like other PyTorch optimizers
  • Refine the documentation
  • Provide a function to specify the initial parameters