
RoSee - High-speed neuromorphic vision for household robots

PhD - Antwerpen

Take robotic perception to the next level!

Neuromorphic systems aim to closely replicate how the brain processes information. An event-based camera mimics the biological visual system by asynchronously reporting per-pixel brightness changes as events, enabling ultrafast imaging. Compared to traditional cameras, it offers a much higher dynamic range, lower latency, and higher temporal resolution (on the order of 1 μs). Moreover, it provides a stream of information in time (events, or spikes), matching naturally how spiking neural networks process information. This combination of bio-inspired vision sensors and spiking neural networks aims to achieve novel and better machine vision paradigms (less data, less processing, lower power, lower latency) that are deployable in practical applications such as robot vision and autonomy (self-driving, self-flying).
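To make the event-stream idea concrete, the sketch below shows one simple (and purely illustrative) way such asynchronous brightness-change events might be binned into per-pixel spike counts before feeding a spiking network; the `Event` type and `events_to_spike_counts` helper are hypothetical names for this example, not part of any imec framework.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One event-camera output: a single pixel's brightness change."""
    x: int          # pixel column
    y: int          # pixel row
    t_us: int       # timestamp in microseconds (~1 us resolution)
    polarity: int   # +1 brightness increase, -1 decrease

def events_to_spike_counts(events, width, height, t_start_us, t_end_us):
    """Bin the events falling in a time window into a per-pixel
    spike-count grid - one naive way to present event data to a
    spiking neural network."""
    grid = [[0] * width for _ in range(height)]
    for ev in events:
        if t_start_us <= ev.t_us < t_end_us:
            grid[ev.y][ev.x] += 1
    return grid
```

Note that binning discards the sub-window timing that makes event cameras attractive in the first place; richer encodings that preserve event timing are exactly the kind of question this project investigates.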

In this PhD project, you will work with event-based sensors on data collected in real time or taken from known datasets. You will develop a spike-based neuromorphic solution either on custom imec hardware or based on known, available frameworks running on a computer or embedded system. More specifically, you will focus on the following research questions:

  • Can we build a low-power, low-latency, event-based vision system based on spiking neural networks that requires less data and less processing? We will investigate different solutions for event-based data preprocessing, novel ways to perform spike encoding, and how to carry out the learning. The trade-off between power consumption, hardware, and system capability will be analyzed.
  • Can we combine event-based cameras with other sensors, such as radar or traditional cameras, to augment their functions, i.e., overcome the limitations of the individual sensors and achieve greater capability? We will explore how to use and synchronize the different data streams, combining them in a common spiking neural network substrate.
  • How should this low-latency perceptual information be used by the control system towards safety and autonomy?
  • How can these novel techniques be applied to achieve breakthrough robotic perception?
Depending on running projects, we will devise a demonstrator to showcase the developed solutions.
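As a rough illustration of the spike-based processing mentioned above, the sketch below implements a minimal leaky integrate-and-fire neuron driven by input spike times; the parameter values (`tau_us`, `threshold`, `weight`) are arbitrary assumptions chosen for demonstration, not values from this project.

```python
import math

def lif_neuron(spike_times_us, tau_us=10_000.0, threshold=1.0, weight=0.4):
    """Minimal leaky integrate-and-fire neuron (illustrative only).

    The membrane potential decays exponentially with time constant
    `tau_us` and jumps by `weight` on each input spike; when it crosses
    `threshold`, the neuron emits an output spike and resets to zero.
    """
    v = 0.0          # membrane potential
    last_t = 0       # time of the previous input spike (us)
    out_spikes = []
    for t in sorted(spike_times_us):
        v *= math.exp(-(t - last_t) / tau_us)  # leak since last input
        v += weight                            # integrate the spike
        last_t = t
        if v >= threshold:
            out_spikes.append(t)               # fire
            v = 0.0                            # reset
    return out_spikes
```

The key property for low-latency perception is visible here: whether the neuron fires depends on the *timing* of its inputs, so closely spaced events (fast motion) produce output spikes that widely spaced events do not.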

Required background: Engineering science, computer/data science

Type of work: 60% algorithm development, 30% experimental, 10% literature

Supervisor: Steven Latré

Daily advisor: Werner Van Leekwijck, Inton Tsang

The reference code for this position is 2022-067. Mention this reference code on your application form.