distillation
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Awesome Knowledge Distillation
Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021).
A collection of classic and cutting-edge industry papers in recommendation, advertising, and search.
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
PyTorch implementation of various knowledge distillation (KD) methods.
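Since several entries here implement variants of the same core recipe, a minimal sketch of the classic Hinton-style KD loss may help orient readers; the temperature `T` and mixing weight `alpha` below are illustrative defaults, not values taken from any listed repo:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: KL divergence between temperature-softened distributions.
    # Scaling by T*T keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```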
A PyTorch-based knowledge distillation toolkit for natural language processing
PaddleSlim is an open-source library for deep model compression and architecture search.
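PaddleSlim exposes its own Paddle-based API; as a framework-neutral illustration of the magnitude-pruning idea behind such compression libraries, PyTorch's built-in pruning utility zeroes the smallest-magnitude weights in one call (the 30% sparsity level is an arbitrary example):

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(128, 64)
# Zero out the 30% of weights with the smallest L1 magnitude;
# the mask is kept alongside the weights until prune.remove() makes it permanent.
prune.l1_unstructured(layer, name="weight", amount=0.3)
print(float((layer.weight == 0).float().mean()))  # ~0.3 sparsity
```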
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
Pruning and distillation for mobilev2-yolov5s, with support for ncnn and TensorRT deployment. Ultra-light but with better performance!
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models.
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks (NeurIPS 2020 workshop).
Prompt engineering for developers
Segmind Distilled Diffusion
⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation (AAAI 2025 Oral)
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
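To give a concrete sense of what such robustness benchmarks measure, here is a minimal sketch of the fast gradient sign method (FGSM), the simplest white-box attack; `model`, `x`, `y`, and the budget `eps` are placeholders rather than anything from the library above:

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=8 / 255):
    # Differentiate the loss with respect to the input pixels.
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Step in the gradient-sign direction that increases the loss,
    # then clamp back to the valid [0, 1] image range.
    return (x + eps * x.grad.sign()).clamp(0.0, 1.0).detach()
```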
irresponsible innovation. Try now at https://chat.dev/
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
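That library ships its own quantization API; as a generic illustration of the low-precision idea, PyTorch's stock dynamic quantization converts `nn.Linear` weights to int8 with a single call (the toy model below is made up for the example):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).eval()

# Dynamic quantization: weights are stored as int8 and activations
# are quantized on the fly at inference time.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
print(quantized)
```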
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.