DANTE: Deep alternations for training neural networks

Sinha, Vaibhav B. and Kudugunta, Sneha and Sankar, Adepu Ravi and Chavali, Surya Teja and Balasubramanian, Vineeth N. (2020) DANTE: Deep alternations for training neural networks. Neural Networks, 131. pp. 127-143. ISSN 0893-6080

Full text not available from this repository.

Abstract

We present DANTE, a novel method for training neural networks using the alternating minimization principle. DANTE offers an alternative perspective to the gradient-based backpropagation techniques traditionally used to train deep networks. It uses an adaptation of quasi-convexity to cast the training of a neural network as a bi-quasi-convex optimization problem. We show that for network configurations with both differentiable (e.g. sigmoid) and non-differentiable (e.g. ReLU) activation functions, the alternations in this formulation can be performed effectively. DANTE also extends to networks with multiple hidden layers. In experiments on standard datasets, neural networks trained with the proposed method were competitive with those trained by traditional backpropagation, both in terms of solution quality and training speed.
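To illustrate the alternating-minimization idea described in the abstract, the sketch below trains a one-hidden-layer sigmoid network on toy data by alternating between the two weight matrices: with the hidden layer fixed, the output layer is a convex least-squares problem solved in closed form; with the output layer fixed, the hidden layer is updated by normalized gradient steps. This is a minimal stand-in, not the paper's exact algorithm (DANTE solves each quasi-convex subproblem with stochastic normalized gradient descent); all names, sizes, and step sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy regression data with a nonlinear target (illustrative only).
X = rng.normal(size=(200, 2))
y = (np.sin(X[:, 0]) + X[:, 1] ** 2).reshape(-1, 1)

hidden = 16
W1 = rng.normal(scale=0.5, size=(2, hidden))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(hidden, 1))   # hidden -> output weights

def mse(W1, W2):
    return np.mean((sigmoid(X @ W1) @ W2 - y) ** 2)

loss0 = mse(W1, W2)  # loss at random initialization

for outer in range(20):
    # Step 1: with W2 fixed, take a few normalized gradient steps on W1.
    # DANTE uses stochastic normalized gradient descent here, exploiting
    # quasi-convexity of the subproblem; plain normalized GD is a stand-in.
    for _ in range(10):
        H = sigmoid(X @ W1)
        err = H @ W2 - y                  # (n, 1) residuals
        dH = (err @ W2.T) * H * (1 - H)   # chain rule through the sigmoid
        g = X.T @ dH / len(X)
        W1 -= 0.5 * g / (np.linalg.norm(g) + 1e-12)

    # Step 2: with W1 fixed, the network is linear in W2, so this
    # subproblem is convex and has a closed-form least-squares solution.
    H = sigmoid(X @ W1)
    W2 = np.linalg.lstsq(H, y, rcond=None)[0]

final_loss = mse(W1, W2)
print(f"initial MSE: {loss0:.4f}, final MSE: {final_loss:.4f}")
```

Note that neither subproblem ever needs end-to-end backpropagation: each alternation touches only one layer's weights, which is what makes the per-step optimization tractable.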

IITH Creators (ORCiD):
Sinha, Vaibhav B. (UNSPECIFIED)
Kudugunta, Sneha (UNSPECIFIED)
Sankar, Adepu (UNSPECIFIED)
Chavali, Surya Teja (UNSPECIFIED)
Balasubramanian, Vineeth N (UNSPECIFIED)
Item Type: Article
Uncontrolled Keywords: Activation functions; Alternating minimization; Backpropagation techniques; Convex optimization problems; Neural network configurations; Non-differentiable; Quasi convexity; Training speed
Subjects: Computer science
Divisions: Department of Computer Science & Engineering
Depositing User: LibTrainee 2021
Date Deposited: 31 Jul 2021 10:51
Last Modified: 31 Jul 2021 10:51
URI: http://raiithold.iith.ac.in/id/eprint/8609
Publisher URL: http://doi.org/10.1016/j.neunet.2020.07.026
Related URLs:
