Paper Type |
Opinion |
Title |
Activation Function of Sigmoid and Hyperbolic Tangent in Training Backpropagation Model - (A Comparison) |
Author |
Maslina Darus[a], Rokiah Ahmad[a], and Siti Mariyam Hj. Shamsuddin[b] |
Email |
- |
Abstract: The activation of a single neural unit in a neural model is determined by an activation function applied to the net input of that unit. In a backpropagation network, a sigmoidal function is generally chosen for this purpose. The convergence rate obtained with this activation function in a backpropagation network depends on the complexity of the input data. If the problem to be solved is a classification problem, the convergence rate might not be an issue; if the problem involves function minimization, the convergence rate will be an issue. To what extent are these statements true? To answer this, we examine the effectiveness of the sigmoid and hyperbolic tangent (tanh) activation functions for classification problems with the backpropagation model.
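
For illustration only (this is a minimal sketch, not the authors' implementation), the following Python code compares the two activation functions on a small classification task trained with backpropagation. The network size (2-2-1), the XOR data set, the learning rate, and the epoch count are assumptions chosen for the example; the targets are rescaled to (-1, 1) for tanh because its output range differs from the sigmoid's.

import numpy as np

# Logistic sigmoid and hyperbolic tangent, with derivatives written in terms of
# the unit's output, the usual form used in backpropagation weight updates.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(y):          # y = sigmoid(x)
    return y * (1.0 - y)

def tanh_deriv(y):             # y = tanh(x)
    return 1.0 - y ** 2

def train_xor(act, act_deriv, bipolar, lr=0.5, epochs=5000, seed=0):
    """Train a 2-2-1 backpropagation network on XOR; return the final mean squared error."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t = np.array([[0.0], [1.0], [1.0], [0.0]])
    if bipolar:                # tanh outputs lie in (-1, 1), so rescale the targets
        t = 2.0 * t - 1.0
    W1 = rng.uniform(-0.5, 0.5, (2, 2)); b1 = np.zeros(2)
    W2 = rng.uniform(-0.5, 0.5, (2, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = act(X @ W1 + b1)                  # hidden-layer outputs
        y = act(h @ W2 + b2)                  # network output
        err = y - t
        delta2 = err * act_deriv(y)           # output-layer error signal
        delta1 = (delta2 @ W2.T) * act_deriv(h)
        W2 -= lr * (h.T @ delta2); b2 -= lr * delta2.sum(axis=0)
        W1 -= lr * (X.T @ delta1); b1 -= lr * delta1.sum(axis=0)
    return float(np.mean(err ** 2))

print("sigmoid MSE:", train_xor(sigmoid, sigmoid_deriv, bipolar=False))
print("tanh    MSE:", train_xor(np.tanh, tanh_deriv, bipolar=True))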
|
|
Start & End Page |
57 - 64 |
Received Date |
2000-03-24 |
Revised Date |
|
Accepted Date |
2000-06-05 |
Keyword |
sigmoid function, hyperbolic tangent, backpropagation model
Volume |
Vol.27 No.1 (JUNE 2000) |
DOI |
|
Citation |
Darus M., Ahmad R. and Shamsuddin S.M.H., Activation Function of Sigmoid and Hyperbolic Tangent in Training Backpropagation Model - (A Comparison), Chiang Mai J. Sci., 2000; 27(1): 57-64. |
SDGs |
|