PhD - Leuven
Machine learning has recently made it possible to tackle more and more problems, such as cognitive recognition, robotics, discovery, and many others. Most machine learning algorithms, however, operate deterministically: they cannot handle probabilistic inputs or quantify the uncertainty associated with their predictions, which leads to false-positive responses. One way to increase confidence in the results is to use giant data sets and very large-scale models, both of which are unachievable in the context of edge devices.
Probabilistic machine learning models, such as Bayesian neural networks or probabilistic graphical models with Gibbs sampling, offer an alternative: they incorporate uncertainty and can be trained with limited data sets. This makes them attractive for inference and training on resource-constrained devices, such as smartphones, smart glasses, and other embedded devices. Yet probabilistic learning and inference depend crucially on a careful probabilistic representation of uncertainty. This requires the device to generate tuneable stochastic samples at very high throughput and with a low energy and area footprint. Current digital CMOS implementations, however, fail to offer such solutions.
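To give a flavour of why such hardware matters, the sketch below (an illustration, not part of the vacancy) shows Gibbs sampling on a toy two-spin Ising model: each update resamples one variable from its conditional distribution, which is exactly the kind of tuneable random-bit generation the envisioned hardware would accelerate. All names and parameters here are illustrative choices.

```python
import math
import random

def gibbs_ising(J, h, n_steps, seed=0):
    """Gibbs sampling for a small Ising model with +/-1 spins.

    J: symmetric coupling matrix (list of lists), h: bias vector.
    Each step resamples one spin from its conditional distribution
    P(s_i = +1 | rest) = sigmoid(2 * (sum_j J[i][j]*s[j] + h[i])),
    which targets the Boltzmann distribution P(s) ~ exp(s^T J s / 1 + h^T s).
    """
    rng = random.Random(seed)
    n = len(h)
    s = [rng.choice([-1, 1]) for _ in range(n)]
    samples = []
    for step in range(n_steps):
        i = step % n  # systematic sweep over the spins
        field = sum(J[i][j] * s[j] for j in range(n) if j != i) + h[i]
        p_up = 1.0 / (1.0 + math.exp(-2.0 * field))
        s[i] = 1 if rng.random() < p_up else -1
        samples.append(tuple(s))
    return samples

# Two ferromagnetically coupled spins: aligned configurations dominate.
J = [[0.0, 1.0], [1.0, 0.0]]
h = [0.0, 0.0]
samples = gibbs_ising(J, h, n_steps=20000)
aligned = sum(1 for a, b in samples if a == b) / len(samples)
```

Every sweep of such a sampler consumes fresh random bits with a tuneable bias, which is why a compact high-throughput stochastic bit source is the enabling primitive.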
On-chip probabilistic samplers can also be implemented by exploiting emerging silicon devices, such as magnetic nanodevices (called “p-bits”). These p-bits promise to be more energy-efficient and compact, and make it possible to conceive a system combining many magnetic nanodevices, each capable of generating tuneable random bits at GHz speeds. The outputs of these devices can be smartly combined, in an analog or digital fashion, to generate a wide variety of tuneable probability distributions. We foresee such building blocks becoming a key component of the probabilistic machine learning processors of the future.
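The behavioural model behind a p-bit is simple enough to sketch in a few lines: each device emits a bit whose probability of being 1 follows a sigmoid of its input bias, and digitally combining several such bits shapes the resulting distribution. The sketch below is a software stand-in for the hardware; all function names and parameters are illustrative assumptions.

```python
import math
import random

def p_bit(bias, rng):
    """One tuneable stochastic bit: P(output = 1) = sigmoid(bias).

    Behavioural model of a magnetic p-bit's sigmoidal response; the
    physical device would draw such bits from thermal noise at GHz rates.
    """
    return 1 if rng.random() < 1.0 / (1.0 + math.exp(-bias)) else 0

def combined_sampler(biases, n_samples, seed=0):
    """Combine several p-bits digitally; here, by summing their outputs.

    Summing independent tuneable bits yields a (Poisson-)binomial
    distribution; other digital combinations give other shapes.
    """
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_samples):
        total = sum(p_bit(b, rng) for b in biases)
        counts[total] = counts.get(total, 0) + 1
    return counts

# Four unbiased p-bits: the sum follows a Binomial(4, 0.5) distribution.
counts = combined_sampler(biases=[0.0, 0.0, 0.0, 0.0], n_samples=10000)
```

Sweeping the bias inputs tunes each bit's probability continuously, which is the property that lets a network of p-bits realise the conditional distributions needed by Gibbs-style algorithms.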
In this PhD, the student will be responsible for exploring the optimal design of p-bit-based random number generators, as well as their integration into machine learning processor chips for probabilistic algorithms.
The thesis work will involve the following tasks:
Understanding probabilistic algorithms, e.g., quantum Monte Carlo, for edge computing and data fusion applications.
Understanding the mechanisms of probabilistic bit elements.
Designing probabilistic-bit-based random number generators and benchmarking their performance against state-of-the-art digital designs.
Integrating the random number generators into a machine learning processor.
Benchmarking the efficiency of the resulting processor for given probabilistic algorithms.
Experimental verification of the magnetic random access memory (MRAM)-enabled processor hardware.
This thesis work allows the student to learn about probabilistic algorithms, probabilistic-bit-based machine learning, co-optimization of p-bit-based ML performance, and testing of the hardware building blocks.
Required background: MTech in Electronics/Electrical Engineering, or MTech in AI and ML.
Type of work: 90% modeling, design, testing, and 10% literature.
Supervisor: Marian Verhelst
Daily advisor: Ankit Kumar, Marian Verhelst
The reference code for this position is 2024-035. Mention this reference code on your application form.