optimizers
🧑🏫 60+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that purportedly outperforms Adam(W), in PyTorch (the update rule is sketched after this list)
🐦 Opytimizer, a Python library of meta-heuristic optimization algorithms
A New Optimization Technique for Deep Neural Networks
RAdam implemented in Keras & TensorFlow
Keras/TF implementation of AdamW, SGDW, NadamW, Warm Restarts, and Learning Rate multipliers
Accelerated tensor operations and dynamic neural networks based on reverse-mode automatic differentiation, for every device that can run Swift, from watchOS to Linux
Code for the paper "Facial Emotion Recognition: State of the Art Performance on FER2013"
Instantly improve the training performance of your TensorFlow models with just 2 lines of code!
FrostNet: Towards Quantization-Aware Network Architecture Search
Implementation of Adam-atan2, proposed by Google DeepMind, in PyTorch (the core update is sketched after this list)
Summarize Massive Datasets using Submodular Optimization
Fast, Modern, Memory-Efficient, and Low-Precision PyTorch Optimizers
Neutron: a PyTorch-based implementation of the Transformer and its variants
Prodigy and ScheduleFree, together at last.
Integration to get optimizer information from the SolarEdge portal
Neural network optimizers implemented from scratch in numpy (Adam, Adadelta, RMSProp, SGD, etc.); a minimal Adam example appears after this list
Lookahead mechanism for optimizers in Keras (the outer-step rule is sketched after this list)
Toy implementations of some popular ML optimizers using Python/JAX
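For orientation, a minimal numpy sketch of the Lion update rule as published by Google Brain ("Symbolic Discovery of Optimization Algorithms"): the parameter step is the sign of an interpolated momentum, with AdamW-style decoupled weight decay. The function name and signature below are illustrative, not the API of the repo listed above.

```python
import numpy as np

def lion_update(param, grad, m, lr=1e-4, beta1=0.9, beta2=0.99, weight_decay=0.0):
    """One Lion step on numpy arrays; `m` is the momentum buffer."""
    # Update direction: sign of a beta1-interpolation of momentum and gradient.
    update = np.sign(beta1 * m + (1.0 - beta1) * grad)
    # Decoupled weight decay, as in AdamW.
    param = param - lr * (update + weight_decay * param)
    # The momentum buffer uses a separate interpolation rate beta2.
    m = beta2 * m + (1.0 - beta2) * grad
    return param, m
```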
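A sketch of the Adam-atan2 idea: replace Adam's `m_hat / (sqrt(v_hat) + eps)` with `atan2(m_hat, sqrt(v_hat))`, which is scale-invariant and bounded, so the epsilon hyperparameter disappears. This is a simplified illustration of the core substitution; the published method and the PyTorch repo above may apply additional scaling constants that are omitted here.

```python
import numpy as np

def adam_atan2_update(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999):
    """One Adam-atan2 step (simplified); t is the 1-based step count."""
    m = beta1 * m + (1.0 - beta1) * grad        # first-moment estimate
    v = beta2 * v + (1.0 - beta2) * grad**2     # second-moment estimate
    m_hat = m / (1.0 - beta1**t)                # bias correction, as in Adam
    v_hat = v / (1.0 - beta2**t)
    # atan2 replaces division by sqrt(v_hat) + eps; no epsilon needed.
    param = param - lr * np.arctan2(m_hat, np.sqrt(v_hat))
    return param, m, v
```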
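For comparison, the textbook Adam update (Kingma & Ba, 2015) in numpy, the kind of from-scratch implementation the numpy optimizers repo above contains. Again, the function is a self-contained sketch rather than that repo's code.

```python
import numpy as np

def adam_update(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step on numpy arrays; t is the 1-based step count."""
    m = beta1 * m + (1.0 - beta1) * grad        # first-moment estimate
    v = beta2 * v + (1.0 - beta2) * grad**2     # second-moment estimate
    m_hat = m / (1.0 - beta1**t)                # bias-corrected moments
    v_hat = v / (1.0 - beta2**t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```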
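Finally, the Lookahead outer step (Zhang et al., 2019) referenced by the Keras repo above: an inner optimizer updates the "fast" weights every step, and every k steps the "slow" weights are pulled toward them and the fast weights restart from the slow ones. A minimal sketch, independent of any particular inner optimizer:

```python
import numpy as np

def lookahead_step(slow, fast, step, k=5, alpha=0.5):
    """Lookahead outer update; `step` counts inner-optimizer steps from 1."""
    if step % k == 0:
        slow = slow + alpha * (fast - slow)   # slow weights interpolate toward fast
        fast = slow.copy()                    # fast weights restart from slow
    return slow, fast
```

Because the rule only touches the weights, it wraps any inner optimizer (SGD, Adam, RAdam, ...) without changing that optimizer's own state.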