Compressed Latent Replays for Lightweight Continual Learning on Spiking Neural Networks
Description
Rehearsal-based Continual Learning (CL) has been widely investigated for Deep Neural Networks (DNNs), but is still lacking for Spiking Neural Networks (SNNs). We present the first memory-efficient implementation of Latent Replay (LR)-based CL for SNNs, targeting resource-constrained devices. LRs mitigate forgetting by combining new samples with latent representations of previously learned data. Experiments on the Spiking Heidelberg Digits (SHD) dataset with Sample-Incremental and Class-Incremental tasks reach 92% Top-1 accuracy on average, without forgetting. Furthermore, we compress the LRs in the time domain, reducing their memory footprint by 140x with a 4% accuracy drop. On a Multi-Class-Incremental task, our SNN learns 10 new classes and reaches 78.4% accuracy on the SHD test set.
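
The description sketches the mechanism: a frozen lower portion of the network produces latent spike trains, compressed replays of earlier data are kept in a buffer, and the trainable head rehearses old and new data together. Below is a minimal PyTorch sketch of that idea, assuming a toy linear frontend in place of the frozen spiking layers and simple time-binning as the compression scheme; all names, shapes, and constants (frontend, compress_time, T_BINS, etc.) are illustrative assumptions, not the authors' implementation, which reports up to 140x compression with a lossier method.

import torch
import torch.nn as nn

T, T_BINS = 100, 10           # original vs. compressed time steps (assumed)
LATENT_DIM, N_CLASSES = 256, 20

frontend = nn.Linear(700, LATENT_DIM)    # stand-in for the frozen SNN layers
head = nn.Linear(LATENT_DIM, N_CLASSES)  # trainable classifier head

def compress_time(latent_spikes):
    # Compress a latent spike train along time by averaging consecutive bins:
    # (T, batch, LATENT_DIM) -> (T_BINS, batch, LATENT_DIM),
    # i.e. a ~T/T_BINS reduction in replay memory.
    t, b, d = latent_spikes.shape
    return latent_spikes.reshape(T_BINS, t // T_BINS, b, d).mean(dim=1)

replay_buffer = []  # (compressed latent, label) pairs from earlier tasks

def rehearsal_step(new_x, new_y, optimizer, loss_fn):
    # One training step mixing new samples with stored latent replays.
    with torch.no_grad():                 # lower layers stay frozen
        new_latent = frontend(new_x)      # (T, batch, LATENT_DIM)
    batches = [(compress_time(new_latent), new_y)] + replay_buffer
    optimizer.zero_grad()
    for latent, y in batches:
        logits = head(latent.mean(dim=0))  # rate-decode over time bins
        loss_fn(logits, y).backward()      # gradients accumulate over batches
    optimizer.step()

# After finishing a task, one would populate the buffer for future rehearsal,
# e.g.: replay_buffer.append((compress_time(frontend(x)).detach(), y))

The sketch illustrates why the memory saving is nearly free at training time: only the compressed latents are stored and replayed, while the frozen frontend never needs to reprocess old raw inputs.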
Event Type
Work-in-Progress Poster
Time
Wednesday, June 26, 5:00pm - 6:00pm PDT
Location
Level 2 Lobby
Topics
AI
Autonomous Systems
Cloud
Design
EDA
Embedded Systems
IP
Security