Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning

Activation functions are essential for deep learning methods to learn and perform complex tasks such as image classification. The Rectified Linear Unit (ReLU) has been widely used and has become the default activation function across the deep learning community since 2012. Although ReLU has been popular, ho...
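The abstract is truncated here, but the title describes a thresholded, ReLU-Swish-like activation. As a minimal sketch, assuming the Flatten-T Swish (FTS) form given in the authors' preprint, FTS(x) = x * sigmoid(x) + T for x >= 0 and FTS(x) = T otherwise, with T = -0.20 reported as the default threshold, a NumPy version might look like:

    import numpy as np

    def flatten_t_swish(x, T=-0.20):
        # Assumed FTS form: Swish (x * sigmoid(x)) shifted by a threshold T
        # on the non-negative side, and flattened to the constant T on the
        # negative side. T = -0.20 is the default reported in the preprint;
        # it is treated here as a configurable parameter.
        x = np.asarray(x, dtype=float)
        swish = x / (1.0 + np.exp(-x))         # equals x * sigmoid(x)
        return np.where(x >= 0, swish + T, T)  # flatten negatives to T

    # Example: negative inputs collapse to T; non-negative inputs follow Swish + T
    print(flatten_t_swish([-2.0, -0.5, 0.0, 1.0, 3.0]))

Setting T = 0 recovers a Swish-like function gated at zero, much like ReLU gates the identity, which is presumably the sense of "ReLU-Swish-like" in the title.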

Bibliographic Details
Main Authors: Chieng, Hock Hung (Author), Wahid, Noorhaniza (Author), Pauline, Ong (Author), Perla, Sai Raj Kishore (Author)
Format: EJournal Article
Published: Universitas Ahmad Dahlan, 2018-07-31.
Online Access: Get Fulltext
