Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning
Activation functions are essential for deep learning methods to learn and perform complex tasks such as image classification. The Rectified Linear Unit (ReLU) has been widely used and has become the default activation function across the deep learning community since 2012. Although ReLU has been popular, ho...
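The abstract is truncated before the proposed function is defined, so as a hedged illustration only, the sketch below shows a thresholded ReLU-Swish-like activation consistent with the title. Both the piecewise form f(x) = x·sigmoid(x) + T for x ≥ 0, f(x) = T for x < 0, and the default threshold T = -0.20 are assumptions drawn from the commonly cited Flatten-T Swish definition, not from this record.

```python
import numpy as np

# Sketch of a thresholded ReLU-Swish-like activation (assumed form):
#   f(x) = x * sigmoid(x) + T   for x >= 0
#   f(x) = T                    for x <  0
# The default threshold T = -0.20 is an assumption, not taken from this record.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def flatten_t_swish(x, T=-0.20):
    # Swish-like curve on the positive side, flattened to the constant T
    # on the negative side (ReLU-style gating with a nonzero floor).
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0.0, x * sigmoid(x) + T, T)

if __name__ == "__main__":
    xs = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(flatten_t_swish(xs))  # negative inputs flatten to T
```

Setting T = 0 in this sketch reduces it to Swish on the positive side with negative inputs mapped exactly to zero, which makes the role of the threshold easy to see.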
| Main Authors: | , , , |
|---|---|
| Format: | EJournal Article |
| Published: | Universitas Ahmad Dahlan, 2018-07-31 |
| Subjects: | |
| Online Access: | Get Fulltext |
Internet: Get Fulltext

3rd Floor Main Library

| Call Number: | A1234.567 |
|---|---|
| Copy 1 | Available |