Deep Learning - Continual Learning
Neural networks achieve state-of-the-art performance on many machine learning tasks and in some cases even surpass humans. Despite this success, they lack important properties of the way humans and animals learn. One of these is the ability to learn tasks in sequential order, where only training examples of the current task are available. Today's neural networks suffer a drastic drop in performance on previous tasks when they are trained on a new one, a phenomenon known as "catastrophic forgetting". Research in the area of continual learning aims to prevent catastrophic forgetting in an efficient and scalable way, while at the same time enabling the transfer of knowledge from previous tasks to future ones and vice versa. This ability to learn tasks sequentially and benefit from other learned tasks is considered by many an important step toward general artificial intelligence.
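The forgetting effect described above can be reproduced in a deliberately simple toy experiment: a single logistic unit is trained first on task A and then on a conflicting task B, after which its accuracy on task A collapses. The data, task construction, and hyperparameters below are hypothetical choices made for illustration only and are not taken from the publications listed on this page.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(center, n=200):
    # Two Gaussian clusters: label 0 around -center, label 1 around +center.
    c = np.asarray(center, dtype=float)
    X = np.vstack([rng.normal(-c, 1.0, size=(n, 2)),
                   rng.normal(+c, 1.0, size=(n, 2))])
    y = np.concatenate([np.zeros(n), np.ones(n)])
    return X, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def train(w, b, X, y, lr=0.1, epochs=200):
    # Plain full-batch gradient descent on the logistic loss.
    for _ in range(epochs):
        g = sigmoid(X @ w + b) - y
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def accuracy(w, b, X, y):
    return float(((sigmoid(X @ w + b) > 0.5) == y).mean())

# Task B uses the same clusters as task A with the labels swapped, so the
# optimal weights of the two tasks conflict (an adversarial toy setup).
XA, yA = make_task([2.0, 0.0])
XB, yB = make_task([-2.0, 0.0])

w, b = np.zeros(2), 0.0
w, b = train(w, b, XA, yA)
acc_A_after_A = accuracy(w, b, XA, yA)  # high: task A has been learned

w, b = train(w, b, XB, yB)              # continue training on task B only
acc_A_after_B = accuracy(w, b, XA, yA)  # collapses: catastrophic forgetting

print(acc_A_after_A, acc_A_after_B)
```

Because the weights are updated using task B's data alone, nothing constrains them to stay close to the solution for task A; continual-learning methods add exactly such constraints (e.g. via regularization or replayed samples).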
- R. Nägele et al., "Charge Based Mixed-Signal Multiply-Accumulate Circuit for Energy Efficient In-Memory Computing", in 2021 Kleinheubach Conference, 2021, pp. 1–4. doi: 10.23919/IEEECONF54431.2021.9598440.
- F. Wiewel and B. Yang, "Condensed Composite Memory Continual Learning", in 2021 International Joint Conference on Neural Networks (IJCNN), 2021, pp. 1–8. doi: 10.1109/IJCNN52387.2021.9533491.
- A. Bartler, A. Bühler, F. Wiewel, M. Döbler, and B. Yang, "MT3: Meta Test-Time Training for Self-Supervised Test-Time Adaptation", 2021. [Online]. Available at: https://arxiv.org/abs/2103.16201
- F. Fallah, F. Wiewel, and B. Yang, "Semi-supervised Riemannian Dimensionality Reduction and Classification Using a Manifold-based Random Walker Graph", in 2020 28th European Signal Processing Conference (EUSIPCO), 2021, pp. 1120–1124. doi: 10.23919/Eusipco47968.2020.9287877.
- F. Wiewel and B. Yang, "Entropy-based Sample Selection for Online Continual Learning", in 2020 28th European Signal Processing Conference (EUSIPCO), 2021, pp. 1477–1481. doi: 10.23919/Eusipco47968.2020.9287846.
- F. Wiewel, A. Brendle, and B. Yang, "Continual Learning Through One-Class Classification Using VAE", in ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020, pp. 3307–3311. doi: 10.1109/ICASSP40776.2020.9054743.
- A. Bartler, F. Wiewel, L. Mauch, and B. Yang, "Training Variational Autoencoders with Discrete Latent Variables Using Importance Sampling", in 2019 27th European Signal Processing Conference (EUSIPCO), 2019, pp. 1–5. doi: 10.23919/EUSIPCO.2019.8902811.
- F. Wiewel and B. Yang, "Localizing catastrophic forgetting in neural networks", arXiv preprint arXiv:1906.02568, 2019. [Online]. Available at: https://arxiv.org/abs/1906.02568
- F. Wiewel and B. Yang, "Continual Learning for Anomaly Detection with Variational Autoencoder", in ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019, pp. 3837–3841. doi: 10.1109/ICASSP.2019.8682702.
Exercise: Advanced Mathematics for Signal and Information Processing