Abstract:
Neuromorphic computing is an emerging field that implements bio-inspired perception and processing with the potential for low-power, low-latency computation. Conventional algorithms and hardware are optimized for each other, but they are ill-suited to exploit the efficiency of neuromorphic processing principles. In this thesis, we develop an event-based line and line-segment tracking algorithm that captures the low-level features of a scene, on top of which other neuromorphic embedded vision tasks can be built. We use the Dynamic Vision Sensor (DVS) event camera to obtain low-latency inputs. Our algorithm detects lines with a Hough Transform implemented as a Spiking Neural Network (SNN). SNNs let us leverage the asynchronous event camera inputs and perform sparse computation, and they enable low power consumption on neuromorphic hardware because the neurons are processed in parallel. We minimize the number of spiking neurons to further reduce power consumption. Fixing the number of neurons coarsely discretizes the Hough space; population coding techniques then allow a finer effective discretization. The algorithm accepts event camera inputs of varying high resolutions with minimal loss of accuracy. We use artificial, recorded, and real-time event camera data to verify the functionality of our algorithm on a GPU, which enables parallel processing of the SNN. We also show promising results for line tracking running on the SpiNNaker neuromorphic hardware with a limited number of neurons. We propose extensions of the algorithm to detect line segments and conduct preliminary experiments to evaluate the idea. Hence, we have designed a building-block neuromorphic algorithm that paves the way for edge neuromorphic applications requiring low latency and low power consumption.