teacher-student
Awesome Knowledge Distillation
PyTorch implementation of various Knowledge Distillation (KD) methods.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
DALI: a large Dataset of synchronised Audio, LyrIcs and vocal notes.
Improving Multi-hop Knowledge Base Question Answering by Learning Intermediate Supervision Signals. WSDM 2021.
[ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Chenyu You, Xiaohui Xie, Zhangyang Wang
College Based Data Management System
An online student-teacher portal where teachers can upload assignments for their subjects, which students can then download.
A Comprehensive Survey on Knowledge Distillation
PyTorch implementation of "Distilling the Knowledge in a Neural Network"
Deep Neural Network Compression based on Student-Teacher Network
Student-teacher interactive platform
This project implements knowledge distillation from DINOv2 (Vision Transformer) to convolutional networks, enabling efficient visual representation learning with reduced computational requirements.
PyTorch implementation of a graph convolutional network (Kipf et al., 2017) with the vanilla teacher-student knowledge distillation architecture (Hinton et al., 2015); see the loss sketch after this list.
Mobile-first education software for teachers.
Semi-supervised teacher-student framework
Teaching materials for Procedural Programming Lab
Code for our JSTARS paper "Semi-MCNN: A semisupervised multi-CNN ensemble learning method for urban land cover classification using submeter HRRS images"
REST API in Django using Django REST Framework.
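
Several of the repositories above (the Hinton et al. reimplementation and the GCN teacher-student example, among others) build on the same vanilla knowledge-distillation loss. The sketch below shows that loss in PyTorch under stated assumptions: the function name, the `temperature` and `alpha` hyperparameters, and the `teacher`/`student` models are illustrative, not taken from any specific repository listed here.

```python
# Minimal sketch of the vanilla teacher-student distillation loss
# (Hinton et al., 2015). Names and hyperparameter values are assumptions
# for illustration only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Blend soft-target KL loss (teacher -> student) with hard-label CE."""
    # Soften both distributions with the temperature, then match them via KL.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Usage sketch: the teacher runs frozen in no-grad mode, the student trains as usual.
# with torch.no_grad():
#     teacher_logits = teacher(images)
# loss = distillation_loss(student(images), teacher_logits, labels)
```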