# generative-pre-trained-transformer

HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer

Jupyter Notebook · 26 · 1 year ago

A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken, intended to support AI transformer-model education and to reverse-engineer GPT models from scratch.

Python · 22 · 2 years ago

Drawing inspiration from Andrej Karpathy’s iconic lecture, "Let’s Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step-by-step, we’ll construct a GPT model from the ground up, demystifying its architecture and bringing its mechanics to life through hands-on coding.

Jupyter Notebook · 18 · 9 months ago
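The core mechanism Karpathy's lecture builds toward is scaled dot-product attention. A dependency-free sketch of that idea (a simplified illustration, not this repository's code), where each query attends over all keys and returns a weighted average of the values:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention over lists of vectors (pure Python)."""
    d = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        # Weighted sum of value vectors.
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

# Tiny demo: with one-hot values, each output row is just the attention weights.
Q = K = V = [[1.0, 0.0], [0.0, 1.0]]
out = attention(Q, K, V)
```

In a real GPT block, Q, K, and V are learned linear projections of the token embeddings, and a causal mask prevents attending to future positions.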

An academic implementation of GPT: only math and raw JAX

Python · 1 · 1 month ago

A Generatively Pretrained Transformer that generates Shakespeare-esque quotes

Jupyter Notebook · 1 · 2 years ago

ToyGPT, inspired by Andrej Karpathy’s GPT from scratch, builds a toy generative pre-trained transformer at its most basic level: a simple bigram language model with attention, intended to teach the basics of creating an LLM from scratch.

Jupyter Notebook · 1 · 9 months ago
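The bigram language model at the heart of ToyGPT can be sketched in a few lines of plain Python. This is a simplified count-based version (the repository's neural, attention-augmented variant learns these statistics instead): each next character is sampled in proportion to how often it follows the current one in the training text.

```python
import random
from collections import defaultdict

def train_bigram(text):
    """Count character bigrams: counts[a][b] = times b follows a in text."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    """Sample each next character in proportion to its bigram count."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        nxt = counts.get(out[-1])
        if not nxt:  # no known successor; stop early
            break
        chars, weights = zip(*nxt.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

corpus = "to be or not to be"
model = train_bigram(corpus)
sample = generate(model, "t", 10)
```

A bigram model only conditions on the single previous token; replacing that one-step lookup with attention over the whole context is exactly the step that turns this toy into a transformer.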

(GPT-1) | Generative Pre-trained Transformer - 1

0 · 2 years ago

Repository for all things Natural Language Processing

Jupyter Notebook · 0 · 2 years ago

I built a GPT model from scratch to generate text.

Jupyter Notebook · 0 · 1 year ago

This is an NLP coursework repository for the Honours Bachelor of Artificial Intelligence program at Durham College. It contains the weekly labs, assignments, and final project completed during the Winter 2024 term.

Jupyter Notebook · 0 · 3 months ago