optimizers
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including Transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
🦁 Lion, a new optimizer discovered by Google Brain via evolutionary program search that is purportedly better than Adam(W), in PyTorch
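Lion's update rule is compact enough to sketch: it keeps a single momentum buffer and takes a sign-based step. A minimal NumPy sketch of the published rule, not this repo's API; the defaults below are illustrative:

```python
import numpy as np

def lion_step(theta, grad, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.01):
    """One Lion update; theta, grad, m are same-shape arrays."""
    # Interpolate the fresh gradient into momentum, keep only the sign.
    update = np.sign(beta1 * m + (1.0 - beta1) * grad)
    # Decoupled weight decay, as in AdamW.
    theta = theta - lr * (update + wd * theta)
    # Momentum itself is updated with a second, slower factor.
    m = beta2 * m + (1.0 - beta2) * grad
    return theta, m
```

Because every step has magnitude lr per coordinate, Lion is typically run with a smaller learning rate and larger weight decay than AdamW.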
🐦 Opytimizer is a Python library of meta-heuristic optimization algorithms.
A New Optimization Technique for Deep Neural Networks
RAdam implemented in Keras & TensorFlow
Keras/TF implementation of AdamW, SGDW, NadamW, Warm Restarts, and Learning Rate multipliers
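For context on the "W": AdamW/SGDW decouple weight decay from the gradient-based update rather than folding it into the gradient as L2 regularization, which matters once updates are adaptively rescaled. A hedged NumPy sketch contrasting the two (function name and defaults are illustrative):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999,
              eps=1e-8, wd=1e-2, decoupled=True):
    """One Adam/AdamW step; t is the 1-based step count."""
    if not decoupled:
        # Plain Adam + L2: decay enters the gradient and is then
        # distorted by the adaptive denominator below.
        grad = grad + wd * theta
    m = b1 * m + (1.0 - b1) * grad
    v = b2 * v + (1.0 - b2) * grad**2
    m_hat = m / (1.0 - b1**t)  # bias correction
    v_hat = v / (1.0 - b2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    if decoupled:
        # AdamW: decay is applied to the weights directly, untouched
        # by the adaptive rescaling.
        theta = theta - lr * wd * theta
    return theta, m, v
```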
Implementation of Adam-atan2, proposed by Google DeepMind, in PyTorch
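Adam-atan2 replaces Adam's epsilon-guarded division m_hat / (sqrt(v_hat) + eps) with an atan2 call, which behaves like that ratio when it is small but saturates instead of blowing up, removing the eps hyperparameter. A sketch of just the step direction; the scale constants a and b below are placeholders (the paper and repo use their own defaults):

```python
import numpy as np

def adam_atan2_direction(m_hat, v_hat, a=1.0, b=1.0):
    # arctan2(x, y) ~ x / y for small ratios, but is bounded by
    # +/- pi/2, so v_hat == 0 needs no epsilon guard.
    return a * np.arctan2(m_hat, b * np.sqrt(v_hat))
```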
Code for the paper "Facial Emotion Recognition: State of the Art Performance on FER2013"
Accelerated tensor operations and dynamic neural networks based on reverse mode automatic differentiation for every device that can run Swift - from watchOS to Linux
Fast, Modern, and Low Precision PyTorch Optimizers
Summarize Massive Datasets using Submodular Optimization
Instantly improve the training performance of your TensorFlow models with just 2 lines of code!
FrostNet: Towards Quantization-Aware Network Architecture Search
Prodigy and Schedule-Free, together at last.
Neutron: a PyTorch-based implementation of the Transformer and its variants.
Integration to get optimizer information from the SolarEdge portal
Lookahead mechanism for optimizers in Keras.
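Lookahead wraps any inner optimizer: "fast" weights are trained as usual, and every k steps the "slow" weights are pulled a fraction alpha toward them, after which the fast weights restart from the slow ones. A minimal sketch of the synchronization step (k and alpha follow the paper's common choices; the calling convention is an assumption):

```python
import numpy as np

def lookahead_sync(slow, fast, step, k=5, alpha=0.5):
    """Call after each inner-optimizer step; returns (slow, fast)."""
    if step % k == 0:
        # Pull slow weights toward the explored fast weights...
        slow = slow + alpha * (fast - slow)
        # ...and restart exploration from there.
        fast = slow.copy()
    return slow, fast
```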
Neural-network optimizers implemented from scratch in NumPy (Adam, Adadelta, RMSProp, SGD, etc.)
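As a taste of what such from-scratch code looks like, a hedged RMSProp step in plain NumPy (textbook defaults, not necessarily this repo's):

```python
import numpy as np

def rmsprop_step(theta, grad, v, lr=1e-3, rho=0.9, eps=1e-8):
    # Exponential moving average of squared gradients...
    v = rho * v + (1.0 - rho) * grad**2
    # ...scales each parameter's step by its recent gradient magnitude.
    theta = theta - lr * grad / (np.sqrt(v) + eps)
    return theta, v
```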
Toy implementations of some popular ML optimizers using Python/JAX