Deep neural networks (DNNs) perform complex tasks such as image classification, speech recognition, and natural language processing with high accuracy. However, most DNN accelerators are built on a linear synapse model, which limits their computational density, and existing techniques to improve accelerator density have generally fallen short. While analog neural networks are a viable route to denser computation, their linear synapse characteristics demand bulky circuits. To circumvent these problems, a group of researchers at EPFL has proposed a novel nonlinear analog synapse circuit together with a black-box training method that interpolates data from circuit simulations to compute gradients.
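To make the black-box idea concrete, here is a minimal sketch of how training could differentiate through an interpolant of simulated synapse data instead of an analytic model. Everything in it is assumed for illustration: the tanh-shaped sample curve standing in for circuit-simulation output, the cubic-spline interpolant, and the `forward`/`backward` helpers are not taken from the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical samples of the nonlinear synapse transfer curve, standing in
# for data that would come from circuit simulation (tanh is an assumption).
x_samples = np.linspace(-1.0, 1.0, 41)
y_samples = np.tanh(2.0 * x_samples)

# Training sees only the interpolated curve, never an analytic model.
synapse = CubicSpline(x_samples, y_samples)
synapse_grad = synapse.derivative()

def forward(w, x):
    """Output of one nonlinear synapse for weight w and input x."""
    return synapse(w * x)

def backward(w, x, upstream):
    """Gradient of the loss w.r.t. w, via the interpolant's derivative."""
    return upstream * synapse_grad(w * x) * x

# A toy gradient-descent step on a single synapse with a squared-error loss.
w, x, target, lr = 0.3, 0.8, 0.5, 0.1
y = forward(w, x)
w -= lr * backward(w, x, 2.0 * (y - target))
print(f"updated weight: {w:.4f}")
```

The point of the sketch is that `synapse` and `synapse_grad` come entirely from sampled data, so the same training loop applies to whatever nonlinearity the fabricated circuit actually exhibits.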

In their paper recently published in the journal IEEE Micro, Ahmet Caner Yuzuguler, Firat Celik, Mario Drumond, Babak Falsafi, and Pascal Frossard demonstrate that their synapse circuit is not only more resilient to fabrication errors than existing models, but also has a low hardware footprint. The simulation results presented in the paper show that the circuit, trained with the black-box method, achieves high classification accuracy with minimal deviation across fabricated chips, without any need for calibration or retraining.
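One way to build intuition for this kind of resilience claim is to perturb the sampled transfer curve and see how far a spline fitted to the perturbed data drifts from the nominal one. The sketch below is only a stand-in experiment: the Gaussian perturbation model, its magnitude, and the nominal tanh curve are assumptions, not the paper's process-variation model or benchmark.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 41)
nominal = np.tanh(2.0 * x)  # assumed nominal simulated transfer curve

# Emulate 100 fabricated chips by adding Gaussian noise to the curve
# (an assumed variation model) and measure worst-case output drift.
probe = np.linspace(-0.9, 0.9, 200)
worst = 0.0
for _ in range(100):
    chip = CubicSpline(x, nominal + rng.normal(scale=0.02, size=x.shape))
    worst = max(worst, np.max(np.abs(chip(probe) - np.tanh(2.0 * probe))))

print(f"worst-case transfer-curve deviation across chips: {worst:.3f}")
```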

Compared to the baseline digital accelerator, the proposed circuit offers 12x better energy efficiency and 29x better computational density.

The research team plans to extend the precision of their weight generator circuit to support DNN applications that require weight precision higher than 4 bits. They are also exploring alternative digital-to-analog converter (DAC) designs for the weight generator. Although the proposed circuit is applicable to any type of neural network, the EPFL researchers aim to benchmark their design on a recurrent neural network (RNN) workload and achieve a significant improvement in performance and energy efficiency.
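For a sense of what a 4-bit weight budget means, the following sketch shows a symmetric uniform quantizer of the kind a b-bit weight-generator DAC might realize; the function name, value range, and quantization scheme are illustrative assumptions rather than the team's design.

```python
import numpy as np

def quantize_weights(w, bits=4, w_max=1.0):
    """Symmetric uniform quantizer with integer codes in
    [-(2**(bits-1)-1), 2**(bits-1)-1], i.e. 15 levels at 4 bits
    (illustrative; not the paper's weight-generator DAC)."""
    q_max = 2 ** (bits - 1) - 1
    scale = w_max / q_max
    return np.clip(np.round(w / scale), -q_max, q_max) * scale

w = np.random.default_rng(1).uniform(-1.0, 1.0, 8)
print(quantize_weights(w, bits=4))  # coarse 4-bit grid
print(quantize_weights(w, bits=8))  # finer grid for higher-precision workloads
```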

Reference: A. C. Yuzuguler, F. Celik, M. Drumond, B. Falsafi, and P. Frossard, IEEE Micro. https://ieeexplore.ieee.org/document/8772076