Please check out our blog post Numerical Encodings for DNN Accelerators published on SIGARCH Computer Architecture Today!
HBFP Emulator
HBFP is a hybrid Block Floating-Point (BFP) - Floating-Point (FP) number representation for DNN training introduced by the ColTraIn team. HBFP offers the accuracy of 32-bit floating point with the numeric and silicon density of 8-bit fixed point across a wide variety of models (ResNet, WideResNet, DenseNet, AlexNet, LSTM, and BERT). We foresee HBFP laying the foundation for accurate training algorithms running on accelerators with an order of magnitude denser arithmetic than conventional or novel floating-point-based platforms. The ColTraIn emulator repository includes several example DNN models, including CNNs, LSTMs, and BERT, for both HBFP and a reference FP32 baseline. Check out the ColTraIn HBFP Training emulator on GitHub.
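The core idea behind BFP, quantizing values into blocks that share a single exponent while the rest of the pipeline stays in floating point, can be sketched in a few lines of PyTorch. The snippet below is only an illustrative sketch, not the emulator's actual API: the function name `bfp_quantize` and its parameters (`mantissa_bits`, `block_size`) are hypothetical placeholders.

```python
# Illustrative sketch of block floating-point (BFP) quantization.
# NOT the emulator's API: bfp_quantize, mantissa_bits, and block_size are hypothetical names.
import torch


def bfp_quantize(x: torch.Tensor, mantissa_bits: int = 8, block_size: int = 64) -> torch.Tensor:
    """Quantize a tensor in blocks of `block_size` elements that share one exponent."""
    flat = x.flatten()
    pad = (-flat.numel()) % block_size              # pad so the tensor splits evenly into blocks
    flat = torch.nn.functional.pad(flat, (0, pad))
    blocks = flat.view(-1, block_size)

    # Shared exponent per block: exponent of the largest-magnitude element in the block.
    max_abs = blocks.abs().amax(dim=1, keepdim=True).clamp(min=1e-38)
    exp = torch.floor(torch.log2(max_abs))

    # Scale mantissas into a fixed-point range, round, clamp, then rescale back.
    scale = 2.0 ** (exp - (mantissa_bits - 1))
    q = torch.round(blocks / scale).clamp(-(2 ** (mantissa_bits - 1)),
                                          2 ** (mantissa_bits - 1) - 1)
    return (q * scale).view(-1)[: x.numel()].view_as(x)


# Example: quantize dot-product operands while the rest of training stays in FP32
# (HBFP restricts BFP to the dot products and keeps the other operations in FP).
x = torch.randn(4, 128)
x_bfp = bfp_quantize(x, mantissa_bits=8, block_size=64)
```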
- S. B. Harma, M. Drumond, T. Lin (2021). ColTraIn HBFP Training Emulator. Source code.