[C96] EventGuard: Sparsity-Aware In-Sensor Denoising for Frame-Based Event Vision Sensors

Abstract

Frame-based Event Vision Sensors (EVS) have emerged as a highly efficient solution for capturing dynamic scenes in resource-constrained edge systems. However, their output frames are inevitably contaminated by sensor-inherent background noise, and traditional deep learning denoising algorithms suffer from suboptimal energy efficiency due to massive data movement over dense memory structures, rendering in-sensor deployment impractical. In this paper, we introduce EventGuard, a sparsity-aware in-sensor denoising system for frame-based EVS that uses an algorithm-hardware co-design to dynamically preserve event sparsity while achieving high denoising accuracy. To replace heavy floating-point operations, the proposed system incorporates a hybrid Spiking-Binary Neural Network (SNN-BNN) filter that suppresses both temporal and spatial noise, operating exclusively on 1-bit logic. To minimize the significant dynamic power overhead introduced by empty memory fetches, we also propose the Spatially-Indexed Sparsity-Aware (SISA) microarchitecture, which utilizes a coordinate-driven scatter pipeline to evaluate only valid active events. Synthesis and simulation results show that our proposed co-design not only delivers denoising accuracy comparable to deep learning baselines (up to 25.43~dB SNR and 0.97 AUC) but also consumes only 22–150~pJ per event, safely satisfying the strict energy envelopes of state-of-the-art in-sensor deployments.

Publication
The Embedded Vision Workshop (EVW)
Woosung Chung (정우성)
Combined MS-PhD student
Chanwook Hwang (황찬욱)
Combined MS-PhD student