XpulpNN: Enabling Energy Efficient and Flexible Inference of Quantized Neural Networks on RISC-V based IoT End Nodes
Abstract
XpulpNN introduces lightweight RISC-V ISA extensions that add 4-bit and 2-bit SIMD instructions to accelerate the inference of heavily Quantized Neural Networks (QNNs) on IoT end nodes. The architecture utilizes a parallel cluster of 8 RISC-V cores.
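
To make the role of the sub-byte SIMD instructions concrete, the sketch below models in plain C what a 4-bit packed sum-of-dot-product operation of this kind computes on two 32-bit registers: eight signed 4-bit activations and eight signed 4-bit weights are multiplied lane-wise and accumulated into a 32-bit sum. The function and variable names (sdotp4_model, a_packed, w_packed) are illustrative assumptions, not the actual mnemonics or intrinsics of the XpulpNN extension.

#include <stdint.h>

/* Scalar model of a hypothetical 4-bit SIMD sum-of-dot-product:
 * eight signed 4-bit lanes per 32-bit operand, multiply-accumulate
 * into a 32-bit accumulator. A single extended-ISA instruction
 * would perform this whole loop in one cycle. */
static inline int32_t sdotp4_model(uint32_t a_packed, uint32_t w_packed,
                                   int32_t acc)
{
    for (int i = 0; i < 8; i++) {
        /* Extract the i-th 4-bit lane of each operand. */
        int32_t a = (int32_t)((a_packed >> (4 * i)) & 0xF);
        int32_t w = (int32_t)((w_packed >> (4 * i)) & 0xF);
        /* Sign-extend from 4-bit two's complement (range -8..7). */
        if (a & 0x8) a -= 16;
        if (w & 0x8) w -= 16;
        acc += a * w;
    }
    return acc;
}

In a QNN convolution or fully connected layer, an inner loop over packed weight and activation words would call such an operation repeatedly, so replacing the unpack/multiply/accumulate sequence with one instruction directly reduces the cycles spent per multiply-accumulate.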