Realization of the sigmoid activation function for neural networks on current FPGAs by the table-driven method | Vestnik Tomskogo gosudarstvennogo universiteta. Upravlenie, vychislitelnaja tehnika i informatika – Tomsk State University Journal of Control and Computer Science. 2024. № 69. DOI: 10.17223/19988605/69/13

Realization of the sigmoid activation function for neural networks on current FPGAs by the table-driven method

In this work, the sigmoid function is implemented using the bit-level mapping method. Within this method, the inputs and outputs of the sigmoid function are represented in binary code in fixed-point format. Each output bit is treated separately and represented by a Boolean function of the input bits, or by its truth table. The possibilities of implementing sigmoid-function output-bit calculators on FPGA programmable logic blocks are assessed. Two implementation approaches are analyzed: one based on truth tables and one based on minimized Boolean functions. In all implemented circuits, the input and output bit widths are equal. The circuits based on truth tables have bit widths in the range of 6 to 11 bits. It is shown that the sigmoid output-bit calculators with 7- and 8-bit inputs occupy just a single programmable logic block and perform the calculation in the shortest time. The proposed variant of the sigmoid function calculator can be used as a part of trained neural networks implemented in hardware. The author declares no conflicts of interest.
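The bit-level mapping described above can be modeled in software before synthesis: each output bit of the quantized sigmoid becomes its own truth table over the input code. The sketch below is a minimal Python model under assumed word formats (a two's-complement input with 3 integer bits and a purely fractional output; the abstract does not specify the paper's exact formats), not the author's RTL implementation.

```python
import math

def build_sigmoid_bit_tables(n_bits=8, int_bits=3):
    """Build one truth table per output bit of a quantized sigmoid.

    Assumed fixed-point formats (for illustration only):
    - input: n_bits two's complement with `int_bits` integer bits;
    - output: n_bits unsigned, all bits fractional, since sigmoid(x) is in (0, 1).
    """
    frac_in = n_bits - int_bits
    size = 1 << n_bits
    # tables[k][code] is output bit k for the n-bit input code.
    tables = [[0] * size for _ in range(n_bits)]
    for code in range(size):
        # Decode the two's-complement input code to a real value.
        signed = code - size if code >= size // 2 else code
        x = signed / (1 << frac_in)
        y = 1.0 / (1.0 + math.exp(-x))
        # Quantize to n_bits fractional bits, saturating at the maximum code.
        q = min(round(y * (1 << n_bits)), (1 << n_bits) - 1)
        for k in range(n_bits):
            tables[k][code] = (q >> k) & 1
    return tables

def sigmoid_from_tables(tables, code):
    """Assemble the output word from the per-bit truth tables."""
    return sum(tables[k][code] << k for k in range(len(tables)))
```

For 6-input (or fracturable 6/5-input) FPGA look-up tables, each such per-bit truth table of a 7- or 8-bit input is exactly the kind of function that packs into a handful of LUTs within one logic block, which matches the resource figures reported in the abstract.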


Keywords

neural network, sigmoid function, FPGA, table-driven method

Authors

Name: Ushenina Inna V.
Organization: Penza State Technological University
E-mail: ivl23@yandex.ru
