Recent advances in neurotechnology have significantly facilitated access to neural signals from human subjects. These neural signals have great potential to benefit a wide range of applications, both consumer and medical. For example, brain-machine interfaces (BMIs) use signals recorded from the brain to drive external hardware, such as robotic arms or computer devices. Such BMIs may be employed to help patients suffering from paralysis, or to aid computer users by providing a thought-controlled cursor that replaces the traditional mouse.
Importantly, brain-machine interfaces require smart algorithms to translate neural signals into sensible computer instructions. For BMIs to be useful in everyday applications or for clinical purposes in patients with impairments, these algorithms should run on electronic chips that are small (i.e. portable), energy-efficient (i.e. long-lasting and light-weight, without large batteries) and smart (i.e. capable of running our best machine learning algorithms). However, such chips are currently hard to come by.
In this project, we will explore whether the machine learning chips being developed at imec are suited to meet practical BMI needs. We will examine whether the power consumption, speed and intelligence of neuromorphic chips are sufficient to serve BMIs under daily circumstances. For this purpose, we will develop and train deep learning algorithms, embed these algorithms on machine learning chips, and then feed the chips various neural signals (e.g. EEG, ECoG) with the purpose of driving mechanical and digital actuators and predicting clinical events such as epileptic seizures.
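To make the decoding step concrete, the sketch below shows a minimal, purely illustrative example of the kind of computation such a pipeline performs: mapping feature windows extracted from a neural signal to a discrete actuator command. The simulated "EEG" features, the two movement classes, and the simple logistic-regression decoder are all assumptions for illustration; the actual project would use real recordings and deep learning models embedded on the chip.

```python
# Hypothetical sketch of a neural-signal decoder (NOT the project's actual
# pipeline): classify simulated EEG feature windows into a binary command.
import numpy as np

rng = np.random.default_rng(0)

# Simulate band-power features for two imagined-movement classes.
n_per_class, n_features = 200, 8
X = np.vstack([
    rng.normal(0.0, 1.0, (n_per_class, n_features)),  # class 0: "left"
    rng.normal(1.0, 1.0, (n_per_class, n_features)),  # class 1: "right"
])
y = np.repeat([0, 1], n_per_class)

# Train a tiny logistic-regression decoder with plain gradient descent.
w, b, lr = np.zeros(n_features), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted class probabilities
    w -= lr * (X.T @ (p - y)) / len(y)        # gradient step on weights
    b -= lr * np.mean(p - y)                  # gradient step on bias

def decode(window):
    """Turn one feature window into an actuator command."""
    p = 1.0 / (1.0 + np.exp(-(window @ w + b)))
    return "right" if p > 0.5 else "left"

acc = np.mean((1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

On a neuromorphic chip, the same decode step would run continuously on streaming windows under a tight power budget, which is exactly the constraint this project sets out to evaluate.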
Hence, with this project we aim to drive the transition from basic neuroscientific knowledge to real-world brain-machine interface applications for clinical and commercial purposes.
Type of work: specify percentage dedicated to literature, technology study, experimental work, other
Supervisor: Rudy Lauwereins
Daily advisor: Bram Verhoef
The reference code for this PhD position is STS1712-32. Mention this reference code on your application form.