Joseph, K.J. and Balasubramanian, Vineeth N (2020) Meta-Consolidation for Continual Learning. Advances in Neural Information Processing Systems. ISSN 1049-5258
Full text not available from this repository.
Abstract
The ability to continuously learn and adapt itself to new tasks, without losing grasp of already acquired knowledge, is a hallmark of biological learning systems, which current deep learning systems fall short of. In this work, we present a novel methodology for continual learning called MERLIN: Meta-Consolidation for Continual Learning. We assume that the weights of a neural network ψ, for solving task t, come from a meta-distribution p(ψ|t). This meta-distribution is learned and consolidated incrementally. We operate in the challenging online continual learning setting, where a data point is seen by the model only once. Our experiments with continual learning benchmarks of MNIST, CIFAR-10, CIFAR-100 and Mini-ImageNet datasets show consistent improvement over five baselines, including a recent state-of-the-art, corroborating the promise of MERLIN.
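The abstract's central idea, treating task-specific network weights ψ as samples from a learned meta-distribution p(ψ|t), can be illustrated with a small sketch. The code below is not the authors' MERLIN implementation; it is a minimal, hypothetical conditional VAE over flattened weight vectors, and all names and dimensions (WeightVAE, WEIGHT_DIM, TASK_EMB_DIM, LATENT_DIM) are assumptions made purely for illustration.

    # Minimal sketch (not the authors' code) of modelling weights psi of a
    # task-specific network as samples from a learned meta-distribution p(psi | t),
    # here parameterised by a small conditional VAE over flattened weight vectors.
    import torch
    import torch.nn as nn

    WEIGHT_DIM = 512     # dimension of the flattened task-network weights (assumed)
    TASK_EMB_DIM = 16    # dimension of the task embedding (assumed)
    LATENT_DIM = 32      # latent dimension of the meta-distribution (assumed)


    class WeightVAE(nn.Module):
        """Conditional VAE approximating p(psi | t): encode a weight vector together
        with a task embedding into a latent code, then decode back to weights."""

        def __init__(self, n_tasks: int):
            super().__init__()
            self.task_emb = nn.Embedding(n_tasks, TASK_EMB_DIM)
            self.encoder = nn.Sequential(
                nn.Linear(WEIGHT_DIM + TASK_EMB_DIM, 256), nn.ReLU(),
                nn.Linear(256, 2 * LATENT_DIM),            # outputs [mu, log_var]
            )
            self.decoder = nn.Sequential(
                nn.Linear(LATENT_DIM + TASK_EMB_DIM, 256), nn.ReLU(),
                nn.Linear(256, WEIGHT_DIM),
            )

        def forward(self, psi: torch.Tensor, task_id: torch.Tensor):
            t = self.task_emb(task_id)
            mu, log_var = self.encoder(torch.cat([psi, t], dim=-1)).chunk(2, dim=-1)
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)  # reparameterisation
            psi_hat = self.decoder(torch.cat([z, t], dim=-1))
            return psi_hat, mu, log_var

        @torch.no_grad()
        def sample_weights(self, task_id: torch.Tensor) -> torch.Tensor:
            """Draw psi ~ p(psi | t) by sampling the latent prior and decoding."""
            t = self.task_emb(task_id)
            z = torch.randn(task_id.shape[0], LATENT_DIM)
            return self.decoder(torch.cat([z, t], dim=-1))


    def vae_loss(psi, psi_hat, mu, log_var):
        # Reconstruct the weight vector and pull the posterior towards N(0, I).
        recon = ((psi - psi_hat) ** 2).sum(dim=-1).mean()
        kl = (-0.5 * (1 + log_var - mu.pow(2) - log_var.exp()).sum(dim=-1)).mean()
        return recon + kl

At test time, under this sketch, weights for a given task could be sampled from the learned meta-distribution via sample_weights and loaded into the task network; how the meta-distribution is consolidated across tasks and used in the online setting is specific to the paper and not reproduced here.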
IITH Creators: Joseph, K.J. (ORCiD: UNSPECIFIED); Balasubramanian, Vineeth N (ORCiD: UNSPECIFIED)
Item Type: Article
Uncontrolled Keywords: Biological learning; Continual learning; Data points; Novel methodology; Recent state; Learning systems; Deep learning; Image enhancement
Subjects: Computer science
Divisions: Department of Computer Science & Engineering
Depositing User: LibTrainee 2021
Date Deposited: 31 Jul 2021 11:52
Last Modified: 31 Jul 2021 11:52
URI: http://raiithold.iith.ac.in/id/eprint/8616