# adam-optimizer

On the Variance of the Adaptive Learning Rate and Beyond

Python · 2548 stars · Updated 4 years ago
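
This is the RAdam (Rectified Adam) paper. Below is a minimal NumPy sketch of the rectified update it describes, not code from the repository itself; `radam_step` and its defaults are illustrative names:

```python
import numpy as np

def radam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One RAdam update (t starts at 1). Falls back to a plain momentum
    step while the variance of the adaptive learning rate is intractable."""
    m = beta1 * m + (1 - beta1) * grad           # first moment (EMA of gradients)
    v = beta2 * v + (1 - beta2) * grad**2        # second moment (EMA of squares)
    m_hat = m / (1 - beta1**t)                   # bias-corrected first moment

    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2**t / (1.0 - beta2**t)

    if rho_t > 4.0:  # variance tractable: apply the rectified adaptive step
        v_hat = np.sqrt(v / (1 - beta2**t))
        r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf) /
                      ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        theta = theta - lr * r_t * m_hat / (v_hat + eps)
    else:            # early steps: un-adapted SGD with momentum
        theta = theta - lr * m_hat
    return theta, m, v
```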

Deep learning library in plain NumPy.

Python · 321 stars · Updated 3 years ago

This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers"

180 stars · Updated 4 years ago

CS F425 Deep Learning course at BITS Pilani (Goa Campus)

Jupyter Notebook · 109 stars · Updated 2 months ago

ADAS (short for Adaptive Step Size) is an optimizer that, unlike optimizers that merely normalize the derivative, fine-tunes the step size itself, making step-size scheduling obsolete and achieving state-of-the-art training performance.

C++ · 85 stars · Updated 4 years ago

Lion and Adam optimization comparison

Jupyter Notebook · 61 stars · Updated 2 years ago
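
For reference, a minimal NumPy sketch of the Lion update being compared, as published in "Symbolic Discovery of Optimization Algorithms"; `lion_step` and its defaults are illustrative, not taken from this repository:

```python
import numpy as np

def lion_step(theta, grad, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update: the step direction is only the sign of an
    interpolated momentum, so every coordinate moves by the same magnitude
    (unlike Adam, which scales each coordinate by its second moment)."""
    update = np.sign(beta1 * m + (1 - beta1) * grad)   # sign-only direction
    theta = theta - lr * (update + wd * theta)          # decoupled weight decay
    m = beta2 * m + (1 - beta2) * grad                  # momentum tracks gradients
    return theta, m
```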

Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge

Python · 51 stars · Updated 6 years ago
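
Padam's core idea is a partially adaptive exponent p ∈ (0, 1/2] on the second-moment term, interpolating between SGD with momentum (small p) and AMSGrad/Adam (p = 1/2). A minimal NumPy sketch under that reading, with bias correction omitted; `padam_step` and its defaults are illustrative, not taken from the reproduction:

```python
import numpy as np

def padam_step(theta, grad, m, v, v_max, lr=0.1, beta1=0.9, beta2=0.999,
               p=0.125, eps=1e-8):
    """One Padam update: AMSGrad-style max on the second moment, but a
    partially adaptive exponent p in place of Adam's fixed 1/2."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    v_max = np.maximum(v_max, v)                 # AMSGrad max trick
    theta = theta - lr * m / (v_max**p + eps)    # p = 1/2 recovers AMSGrad
    return theta, m, v, v_max
```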

Implementation of the Adam optimizer in Python.

Python · 48 stars · Updated 8 years ago
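
For context, the textbook Adam update that repositories like this one implement, as a minimal NumPy sketch with the defaults from the Adam paper; `adam_step` is an illustrative name, not this repository's API:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update with bias-corrected moment estimates (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad           # EMA of gradients
    v = beta2 * v + (1 - beta2) * grad**2        # EMA of squared gradients
    m_hat = m / (1 - beta1**t)                   # correct initialization bias
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Usage: minimize f(x) = x^2 (gradient 2x) starting from x = 5
theta, m, v = np.array(5.0), 0.0, 0.0
for t in range(1, 501):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
```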

Toy implementations of some popular ML optimizers using Python/JAX

Python · 44 stars · Updated 4 years ago

A collection of various gradient descent algorithms implemented in Python from scratch

Python · 38 stars · Updated 2 years ago
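
As a baseline for collections like this one, minimal NumPy sketches of the two simplest variants, plain gradient descent and classical momentum; the function names are illustrative:

```python
import numpy as np

def sgd_step(theta, grad, lr=0.01):
    """Plain gradient descent: step against the gradient."""
    return theta - lr * grad

def momentum_step(theta, grad, velocity, lr=0.01, mu=0.9):
    """Classical momentum: accumulate a velocity, then step along it."""
    velocity = mu * velocity - lr * grad
    return theta + velocity, velocity
```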

This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. The library is updated constantly.

C · 32 stars · Updated 2 years ago

A compressed adaptive optimizer for training large-scale deep learning models using PyTorch

Python · 27 stars · Updated 5 years ago

This project implements deep NN / RNN based solutions: flexible methods that can adaptively fill in, backfill, and predict time series using a large number of heterogeneous training datasets.

Python · 26 stars · Updated 9 years ago

Modified XGBoost implementation from scratch with NumPy, using the Adam and RMSProp optimizers.

Jupyter Notebook · 26 stars · Updated 5 years ago

Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for TensorFlow.

Python · 25 stars · Updated 6 years ago
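
Lookahead wraps an inner optimizer: the fast weights take k inner steps, then the slow weights move a fraction alpha of the way toward them. A minimal NumPy sketch of that loop, using plain SGD as the inner optimizer for illustration; nothing here is taken from the linked repository:

```python
import numpy as np

def lookahead(slow, grad_fn, k=5, alpha=0.5, inner_lr=0.01, outer_steps=100):
    """Lookahead: k fast steps with an inner optimizer (plain SGD here),
    then pull the slow weights alpha of the way toward the fast weights."""
    for _ in range(outer_steps):
        fast = slow.copy()
        for _ in range(k):                      # k steps forward
            fast -= inner_lr * grad_fn(fast)
        slow = slow + alpha * (fast - slow)     # 1 step back (interpolation)
    return slow

# Usage: minimize f(x) = ||x||^2, whose gradient is 2x
x = lookahead(np.ones(3), lambda x: 2 * x)
```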

Implementation of the Adam optimization algorithm using NumPy.

Jupyter Notebook · 20 stars · Updated 5 years ago