
    Effective Usage of Activation Functions for Data Classification with TensorFlow in Deep Neural Networks

    View/Open
    com033.pdf (403.9 KB)
    Date
    2019
    Author
    Priyabhashana, B
    Jayasena, KPN
    Abstract
    Artificial neural networks are computer systems modeled on the human brain and nervous system. In data classification, neural networks provide fast and efficient results. Neural network models are trained using sets of labeled data, and their ability to handle new data is based on that training. A neural network contains thousands of interconnected nodes organized into hidden layers. The activation functions included in the network produce an output from a given input or set of inputs. This research focused on comparing the effects of using several activation functions across multiple hidden layers for classification of the MNIST (Mixed National Institute of Standards and Technology) data set. Data classification was performed with the TensorFlow library, and the neural network model was built using TensorFlow with the help of Keras. The experiments compared the Rectified Linear Unit (ReLU), Leaky ReLU, Hyperbolic Tangent (tanh), Exponential Linear Unit (ELU), sigmoid, softplus, softmax and softsign activation functions. Data were collected under two methodologies: a single hidden layer with one activation function, and multiple hidden layers with multiple activation functions. The results of the study show accuracy rates higher than 88% for both training and testing when multiple hidden layers with multiple activation functions are used.
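    The activation functions compared in the abstract have standard mathematical definitions. As a pure-Python sketch (for illustration only; the paper itself uses TensorFlow's built-in implementations, and the alpha defaults below are common conventions, not values taken from the paper), they can be written as:

    ```python
    import math

    def relu(x):
        # Rectified Linear Unit: max(0, x)
        return max(0.0, x)

    def leaky_relu(x, alpha=0.01):
        # Leaky ReLU: small slope alpha for negative inputs
        return x if x > 0 else alpha * x

    def elu(x, alpha=1.0):
        # Exponential Linear Unit: smooth, saturating negative branch
        return x if x > 0 else alpha * (math.exp(x) - 1.0)

    def sigmoid(x):
        # Logistic sigmoid: squashes input into (0, 1)
        return 1.0 / (1.0 + math.exp(-x))

    def tanh(x):
        # Hyperbolic tangent: squashes input into (-1, 1)
        return math.tanh(x)

    def softplus(x):
        # Smooth approximation of ReLU: log(1 + e^x)
        return math.log1p(math.exp(x))

    def softsign(x):
        # Like tanh but with polynomial (slower) saturation
        return x / (1.0 + abs(x))

    def softmax(xs):
        # Softmax over a vector: exponentiate and normalise so the
        # outputs sum to 1 (typically used on the output layer)
        m = max(xs)  # subtract the max for numerical stability
        exps = [math.exp(x - m) for x in xs]
        total = sum(exps)
        return [e / total for e in exps]
    ```

    In the classification setting the abstract describes, the first seven functions would be candidates for the hidden layers, while softmax is normally reserved for the output layer to turn the network's final scores into class probabilities.
    
    
    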
    URI
    http://ir.kdu.ac.lk/handle/345/2284
    Collections
    • Computing [68]

    Library copyright © 2017  General Sir John Kotelawala Defence University, Sri Lanka
    Contact Us | Send Feedback