The nonlinearities that allow neural networks to capture complex patterns in data are referred to as activation functions. Over the course of the development of neural networks, several nonlinear activation functions have been introduced to make gradient-based deep learning tractable. If you are asking, 'Which activation function should I use for my neural network model?', you should probably go with ReLU. The main exception is when you are implementing a gating mechanism, as in LSTM or GRU cells, in which case sigmoid and/or tanh are the appropriate choices.
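As a concrete illustration (a minimal NumPy sketch, not from the original post; the function names here are our own), these three activations can be defined as follows:

```python
import numpy as np

def relu(x):
    # Rectified linear unit: max(0, x) elementwise.
    # The usual default choice for hidden layers.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid: squashes inputs to (0, 1).
    # Used for the gates in LSTM/GRU cells.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes inputs to (-1, 1).
    # Used for candidate/cell states in gated cells.
    return np.tanh(x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("relu:   ", relu(x))
print("sigmoid:", sigmoid(x))
print("tanh:   ", tanh(x))
```

Running this shows the key difference in behavior: ReLU passes positive values through unchanged and zeroes out negatives, while sigmoid and tanh saturate at their bounds, which is exactly what makes them suitable as soft gates.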