transformer-network
A PyTorch-based sequence-to-sequence framework with a focus on Neural Machine Translation.
A list of efficient attention modules.
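Efficient attention modules are variants of the standard scaled dot-product attention from "Attention Is All You Need". As a baseline for comparison, here is a minimal sketch of that standard mechanism using only the Python standard library (the function name and list-of-lists representation are my own, chosen so the example runs without any ML framework):

```python
import math

def scaled_dot_product_attention(queries, keys, values):
    """Standard scaled dot-product attention over plain Python lists.

    queries, keys, values: lists of equal-length float vectors.
    Returns one output vector per query.
    """
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        # Score each key: dot(q, k) / sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        # Softmax over the scores (subtract the max for numerical stability).
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Output is the attention-weighted sum of the value vectors.
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        outputs.append(out)
    return outputs
```

The "efficient" modules in such lists typically replace the quadratic score computation above (every query against every key) with sparse, low-rank, or kernelized approximations.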
[TPAMI 2023 ESI Highly Cited Paper] SePiCo: Semantic-Guided Pixel Contrast for Domain Adaptive Semantic Segmentation https://arxiv.org/abs/2204.08808
This repository contains my research work on building state-of-the-art next-basket recommendations using techniques such as autoencoders, TF-IDF, attention-based Bi-LSTMs, and Transformer networks.
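TF-IDF, one of the techniques named above, weights a term by how often it appears in a document, discounted by how many documents contain it. A minimal stdlib-only sketch (the function name and the plain `tf * log(n / df)` variant are my own choices; real pipelines often use smoothed or normalized variants):

```python
import math
from collections import Counter

def tfidf(docs):
    """Compute TF-IDF weights for a list of tokenized documents.

    docs: list of token lists, e.g. [["milk", "bread"], ["milk", "eggs"]].
    Returns one {term: weight} dict per document, where
    weight = (term count / doc length) * log(num docs / docs containing term).
    """
    n = len(docs)
    # Document frequency: in how many documents each term appears.
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({
            term: (count / len(doc)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return weights
```

Terms that appear in every document (like a staple item in every basket) get weight zero, while terms unique to one basket are weighted highest.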
Implementation of Transformer Pointer-Critic Deep Reinforcement Learning Algorithm
Implementation of a basic conversational agent (a.k.a. chatbot) using the PyTorch Transformer module.
Using Bayesian optimization via Ax platform + SAASBO model to simultaneously optimize 23 hyperparameters in 100 iterations (set a new Matbench benchmark).
A PyTorch implementation of a transformer network trained using back-translation
Implementation of Transformer, BERT, and GPT models in both TensorFlow 2.0 and PyTorch.
The objective of the project is to generate an abstractive summary of a longer article. The process includes all preprocessing steps and summarizing the whole article, which helps capture the important context of a longer article.
Codes and write-up for Red Dragon AI Advanced NLP Course.
Enhancing burn severity mapping through dual-branch transformer network with bi-temporal Sentinel-2 multi-spectral images
Neural Style Transfer for Images & Videos using models trained by a feedforward transformer network.