Efficient and robust bitstream processing in binarised neural networks

Aygun S., Güneş E. O., De Vleeschouwer C.

ELECTRONICS LETTERS, vol.57, no.5, pp.219-222, 2021 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 57 Issue: 5
  • Publication Date: 2021
  • Doi Number: 10.1049/ell2.12045
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Applied Science & Technology Source, Business Source Elite, Business Source Premier, Communication Abstracts, Computer & Applied Sciences, INSPEC, zbMATH, Directory of Open Access Journals
  • Page Numbers: pp.219-222
  • Keywords: Logic circuits, Logic elements, Neural net devices, Neural nets (circuit implementations)
  • Yıldız Technical University Affiliated: Yes


Binarised neural networks, which represent both weights and activations as single-bit binary values, offer computationally attractive solutions across a variety of neural network applications. A lightweight binarised neural network can be constructed from logic gates and counters together with a two-valued activation function unit. However, because each weight and neuron output is stored in a single bit, such networks are sensitive to bit-flipping errors. To cope with this error sensitivity, the binarised weights and neurons are manipulated as bitstreams in the manner of stochastic computing. Stochastic computing is shown to provide robustness against bit errors in the data while retaining a simple hardware structure, whose implementation is further simplified by a novel subtraction-free realisation of the neuron activation.
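The robustness argument can be illustrated with a minimal Python sketch of unipolar stochastic encoding. This is an assumption-laden toy, not the letter's hardware design: the `encode`/`decode` helpers and the bitstream length `n = 1024` are illustrative choices, and the letter's actual circuit is built from logic gates and counters.

```python
import random

# Unipolar stochastic encoding: a value p in [0, 1] becomes a bitstream
# whose fraction of 1s approximates p. (Illustrative sketch only; the
# hardware in the letter realises this with logic gates and counters.)
def encode(p, n, rng):
    return [1 if rng.random() < p else 0 for _ in range(n)]

def decode(bits):
    return sum(bits) / len(bits)

rng = random.Random(0)
n = 1024
stream = encode(0.75, n, rng)

# In a plain binarised network, one flipped bit inverts a weight outright.
# In a stochastic bitstream of length n, flipping one bit changes the sum
# of 1s by exactly 1, so the decoded value moves by exactly 1/n.
flipped = list(stream)
flipped[rng.randrange(n)] ^= 1
err = abs(decode(flipped) - decode(stream))
print(err)  # exactly 1/1024
```

Because every bit carries equal, tiny weight, the damage from a single upset is bounded by `1/n`, which is the intuition behind using bitstream processing to harden single-bit weights and activations.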