Analog implementation of a current-mode Rectified Linear Unit (ReLU) for artificial neural networks

Date

2019-05

Authors

Vuppala, Sai Srujana

Abstract

This report explores the design of building blocks that can be employed in analog implementations of Artificial Neural Networks (ANNs) with on-chip learning capability. A circuit for a Rectified Linear Unit (ReLU) is proposed. The design employs current-mode inputs that are combined and applied to a tightly coupled active feedback loop presenting a low input resistance. The design also provides simultaneous evaluation of the derivative. The basic operation is verified within a neural network implementation of a logic function. The impact of the non-linearity of the ReLU cell is evaluated using a polynomial model within a neural network for MNIST digit recognition. As an extension, a discrete-time design that allows for a fully differential implementation of an analog artificial neural network is proposed.
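The behavioral ideas named in the abstract can be sketched in software: the ideal ReLU, the derivative signal that the circuit evaluates simultaneously (needed for on-chip learning), and a polynomial model of the cell's non-linearity. This is a minimal illustrative sketch; the polynomial coefficients `a2` and `a3` are hypothetical placeholders, not values from the report.

```python
import numpy as np

def relu(x):
    """Ideal rectified linear unit: max(0, x)."""
    return np.maximum(0.0, x)

def relu_derivative(x):
    """Derivative of the ReLU (0 for x < 0, 1 for x > 0).
    The proposed circuit makes this signal available at the same
    time as the activation, which is what enables on-chip learning."""
    return (x > 0).astype(float)

def nonideal_relu(x, a2=0.05, a3=-0.02):
    """Hypothetical polynomial model of a non-ideal analog ReLU:
    the positive branch is distorted by quadratic and cubic terms.
    Coefficients a2 and a3 are illustrative only."""
    y = relu(x)
    return y + a2 * y**2 + a3 * y**3

x = np.array([-2.0, 0.0, 1.0, 2.0])
print(relu(x))             # ideal activation
print(relu_derivative(x))  # derivative for the learning rule
print(nonideal_relu(x))    # polynomially distorted activation
```

In a training loop, the distorted activation would replace the ideal one in the forward pass while the derivative signal drives the weight updates, which is how the report assesses the impact of the cell's non-linearity on MNIST accuracy.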
