@@ -9,9 +9,18 @@ You can find the list of the latter on my [Google Scholar](https://scholar.googl
Every model must inherit `inclearn.models.base.IncrementalLearner`.
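The inheritance pattern above can be sketched as follows. This is a hypothetical illustration: the stand-in base class and its method names (`before_task`, `train_task`, `after_task`) are assumptions for the sketch, not the actual `inclearn.models.base.IncrementalLearner` interface.

```python
# Hypothetical stand-in for inclearn.models.base.IncrementalLearner,
# used only to illustrate the subclassing pattern a new model follows.
class IncrementalLearner:
    """Base class every incremental model inherits from (sketch)."""

    def before_task(self, train_loader, val_loader):
        # Hook called before training on a new task; often a no-op.
        pass

    def train_task(self, train_loader, val_loader):
        # Subclasses must implement the task-specific training loop.
        raise NotImplementedError

    def after_task(self, inc_dataset):
        # Hook called after a task, e.g. to update exemplar memory.
        pass


class MyModel(IncrementalLearner):
    """A minimal model: only the training loop is overridden."""

    def train_task(self, train_loader, val_loader):
        # A real model would iterate over train_loader here.
        return "trained"


model = MyModel()
model.train_task(None, None)
```

A new model therefore only needs to override the hooks it cares about; the base class supplies the lifecycle scaffolding.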
- ## Small Task Incremental Learning
+ <div align="center">
- Under review, preprint on arXiv [here](https://arxiv.org/abs/2004.13513).
+ # PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning
+
+ [![Paper](https://img.shields.io/badge/arXiv-2004.13513-brightgreen)](https://arxiv.org/abs/2004.13513)
+ ![ECCV](https://img.shields.io/badge/ECCV-2020-blue)
+
+ </div>
+
+ ![podnet](images/podnet.png)
+
+ ![podnet plot](images/podnet_plot.png)
If you use this paper/code in your research, please consider citing us:
@@ -57,9 +66,16 @@ python3 -minclearn --options options/podnet/podnet_cnn_imagenet100.yaml options/
Furthermore, several options files are available to reproduce the ablations showcased
in the paper. Please see the directory `./options/podnet/ablations/`.
- ## Insight From the Future for Continual Learning
+ <div align="center">
+
+ # Insight From the Future for Continual Learning
+
+ [![Paper](https://img.shields.io/badge/arXiv-2006.13748-brightgreen)](https://arxiv.org/abs/2006.13748)
+ ![Preprint](https://img.shields.io/badge/Preprint-2020-blue)
+
+ </div>
- Under review, preprint on arXiv [here](https://arxiv.org/abs/2006.13748).
+ ![ghost](images/ghost.png)
If you use this paper/code in your research, please consider citing us: