
knowledge-distillation

Python · 5718 stars · updated 2 months ago

Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.

Python · 3130 stars · updated 2 years ago

2888 stars · updated 3 months ago

"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)

Python · 2524 stars · updated 2 years ago

SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime

Python · 2475 stars · updated 1 day ago
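
The INT8 path mentioned in the entry above reduces to a scale-and-round primitive. Below is a minimal sketch of symmetric per-tensor INT8 quantization in plain PyTorch; it illustrates the underlying arithmetic only and is not the neural-compressor API (function names are made up for illustration).

```python
import torch

def quantize_int8(x: torch.Tensor):
    # Map the tensor's dynamic range onto the signed 8-bit grid [-127, 127].
    scale = x.abs().max().clamp_min(1e-8) / 127.0
    q = torch.clamp(torch.round(x / scale), -127, 127).to(torch.int8)
    return q, scale

def dequantize_int8(q: torch.Tensor, scale: torch.Tensor):
    # Recover a float approximation of the original tensor.
    return q.float() * scale

w = torch.randn(64, 64)
q, s = quantize_int8(w)
print((w - dequantize_int8(q, s)).abs().max())  # worst-case rounding error
```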

A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments.

Python · 1961 stars · updated 2 years ago
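
For context on what such KD experiments optimize, here is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015); the temperature T and mixing weight alpha are illustrative defaults, not values from this repository.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # KL divergence between temperature-softened distributions; the T*T
    # factor keeps soft-target gradients on the same scale as the hard term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```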

This is a collection of our NAS and Vision Transformer work.

Python · 1795 stars · updated 1 year ago

PyTorch implementations of various knowledge distillation (KD) methods.

Python · 1712 stars · updated 4 years ago
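
One representative family in such method zoos is FitNets-style hint distillation, sketched below under assumed 4-D feature shapes; the adapter layer and loss are a generic illustration, not code from this repository.

```python
import torch.nn as nn
import torch.nn.functional as F

class HintLoss(nn.Module):
    """FitNets-style hint: a 1x1 conv adapts the student's feature width
    to the teacher's, then the two maps are matched with MSE."""
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        self.adapt = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, s_feat, t_feat):  # (N, Cs, H, W) and (N, Ct, H, W)
        return F.mse_loss(self.adapt(s_feat), t_feat)
```
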
yoshitomo-matsubara/torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.

Python · 1541 stars · updated 25 days ago

Improving Convolutional Networks via Attention Transfer (ICLR 2017)

Jupyter Notebook · 1457 stars · updated 7 years ago
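
The core of attention transfer fits in a few lines: spatial attention maps are channel-wise statistics of squared activations, L2-normalized before matching. A minimal sketch, assuming student and teacher feature maps share spatial size:

```python
import torch.nn.functional as F

def attention_map(feat):  # feat: (N, C, H, W)
    # Channel-wise mean of squared activations, flattened and L2-normalized.
    return F.normalize(feat.pow(2).mean(dim=1).flatten(1))

def at_loss(s_feat, t_feat):
    # Assumes the two feature maps share spatial size (H, W).
    return (attention_map(s_feat) - attention_map(t_feat)).pow(2).mean()
```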

A complete PyTorch codebase for image classification: training, prediction, test-time augmentation (TTA), model ensembling, model deployment, CNN feature extraction with SVM or random-forest classifiers, and model distillation.

Jupyter Notebook · 1443 stars · updated 3 years ago
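
Of the pipeline steps listed above, test-time augmentation is the simplest to show: average the model's predictions over deterministic transforms of the input. A minimal sketch using a horizontal flip (the choice of transform is illustrative):

```python
import torch

@torch.no_grad()
def tta_predict(model, images):  # images: (N, C, H, W)
    # Average logits over the identity and a horizontal flip of the input.
    logits = model(images) + model(torch.flip(images, dims=[3]))
    return (logits / 2).softmax(dim=1)
```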

Jupyter Notebook · 1291 stars · updated 9 months ago

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.

1138 stars · updated 5 months ago
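
The survey's "Knowledge Elicitation" step typically means sampling teacher outputs to use as supervised targets for the student (sequence-level KD). A hedged sketch with Hugging Face transformers; the checkpoint name is a placeholder, not a real model id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# "teacher-model" is a placeholder checkpoint name for illustration.
tok = AutoTokenizer.from_pretrained("teacher-model")
teacher = AutoModelForCausalLM.from_pretrained("teacher-model")

prompt = "Explain knowledge distillation in one sentence."
inputs = tok(prompt, return_tensors="pt")
out = teacher.generate(**inputs, max_new_tokens=64, do_sample=True)
response = tok.decode(out[0], skip_special_tokens=True)
# (prompt, response) now serves as one supervised fine-tuning
# example for the student model.
```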

Collection of recent methods on (deep) neural network compression and acceleration.

949 stars · updated 5 months ago