A Memory Based Approach for Digital Implementation of Tanh using LUT and RALUT

Authors

Samira Sorayassa and Majid Ahmadi, University of Windsor, Canada

Abstract

The hyperbolic tangent (Tanh) has been widely used as a preferred activation function for implementing multi-layer neural networks. The differentiability of this function makes it suitable for derivative-based learning algorithms such as the error back-propagation technique. In this paper, two different memory-based techniques for accurate approximation and digital implementation of the Tanh function, using a Look-Up Table (LUT) and a Range Addressable Look-Up Table (RALUT), are presented. A thorough comparative study of the two techniques in terms of their hardware resource usage on FPGA and their accuracy is given. The schematic of the synthesized design for a specific case is provided as an example.
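To illustrate the distinction between the two memory-based schemes described in the abstract, the following is a minimal software sketch, not the paper's hardware design: a plain LUT stores tanh at uniformly spaced addresses, while a RALUT stores one value per input *range*, allowing denser breakpoints near zero and sparse ones in the saturation regions. The input range, table sizes, and breakpoint placement below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# --- Plain LUT: tanh sampled at uniformly spaced addresses (illustrative sizes) ---
X_MIN, X_MAX, LUT_SIZE = -4.0, 4.0, 256
lut_values = np.tanh(np.linspace(X_MIN, X_MAX, LUT_SIZE))  # precomputed table

def tanh_lut(x):
    """Approximate tanh(x) by quantizing x to the nearest LUT address."""
    idx = np.round((x - X_MIN) / (X_MAX - X_MIN) * (LUT_SIZE - 1))
    idx = np.clip(idx, 0, LUT_SIZE - 1).astype(int)
    return lut_values[idx]

# --- RALUT: one stored value per input range; breakpoints are denser where
# tanh changes quickly (near 0) and sparse in the saturation regions, so far
# fewer entries are needed for comparable accuracy. ---
ralut_breakpoints = np.concatenate([np.linspace(-4.0, -1.5, 4, endpoint=False),
                                    np.linspace(-1.5, 1.5, 24, endpoint=False),
                                    np.linspace(1.5, 4.0, 5)])
ralut_values = np.tanh(ralut_breakpoints)

def tanh_ralut(x):
    """Approximate tanh(x) by locating which stored range x falls into."""
    idx = np.searchsorted(ralut_breakpoints, x)
    idx = np.clip(idx, 0, len(ralut_values) - 1)
    return ralut_values[idx]

# Quick comparison against the reference tanh
x = np.array([-3.0, -0.5, 0.0, 0.7, 2.5])
print("LUT  :", tanh_lut(x))
print("RALUT:", tanh_ralut(x))
print("exact:", np.tanh(x))
```

In hardware terms, the LUT maps every quantized input code to its own memory word, whereas the RALUT's range decoding trades a small amount of comparison logic for a much smaller memory, which is the resource trade-off the paper compares on FPGA.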

Keywords

Tanh Activation Function, Tanh Implementation on FPGA, Approximation Methods, Lookup Tables (LUT), Range Addressable Lookup Tables (RALUT).

Full Text | Volume 12, Number 22