npu
An open-source project for Windows developers to learn how to add AI to Windows apps using local models and APIs.
Lemonade helps users run local LLMs with the highest performance by configuring state-of-the-art inference engines for their NPUs and GPUs. Join our Discord: https://discord.gg/5xXzkMu8Zk
Efficient Inference of Transformer models
Ollama alternative for the Rockchip NPU: an efficient solution for running AI and deep-learning models on Rockchip devices with optimized NPU support (rkllm)
No-code CLI designed for accelerating ONNX workflows
Easy installation and usage of the Rockchip NPUs found in the RK3588 and similar SoCs
Free TPU V3plus for FPGA: the free version of a commercial AI processor (EEP-TPU) for deep-learning edge inference
ONNXim is a fast cycle-level simulator that can model multi-core NPUs for DNN inference
Hardware design of a universal NPU (CNN accelerator) for various convolutional neural networks
High-speed, easy-to-use LLM serving framework for local deployment
An interactive Ascend-NPU process viewer
Simplified AI runtime integration for mobile app development