Leuven
Develop eco-design principles for artificial intelligence and machine learning systems by co-optimizing software and hardware to minimize environmental impact across the entire life-cycle.
In recent years, Artificial Intelligence (AI) has seen extremely rapid adoption following the success of Machine Learning (ML). Deep neural networks in particular have revolutionized computer vision and natural language processing, through convolutional neural networks and large language models respectively. However, these deep neural networks require massive amounts of data to train their billions of parameters, and the strong growth of such resource-intensive models amplifies the environmental pressures threatening life on Earth as we know it. The design of sustainable AI systems has thus emerged as a crucial research question.
This problem is mainly being attacked from two angles. First, the machine learning community is leading many efforts in software improvements. The aim of these improvements is to design ML algorithms that are less compute-intensive, typically by using fewer parameters (e.g., sparse neural networks, knowledge distillation), fewer datapoints (e.g., coresets, compressive learning), and less precision (e.g., weight quantization). Second, the hardware community is developing a broad family of hardware accelerators that are specifically designed to run existing ML tasks efficiently (e.g., Neural Processing Units, or NPUs). There is growing interest in going beyond these separate approaches towards a holistic one in which algorithms and hardware are co-designed (as argued in the Hardware Lottery essay by Sara Hooker). By opening up the design space, the aim is to let more radical solutions emerge that bring substantial performance gains (e.g., neuromorphic computing, hyperdimensional computing).
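As a toy illustration of the "less precision" direction mentioned above, the sketch below performs symmetric post-training quantization of a weight matrix to 8-bit integers with NumPy. The layer shape, per-tensor scale choice, and error check are illustrative assumptions, not part of the project description.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantization of float32 weights to int8."""
    scale = np.max(np.abs(weights)) / 127.0               # one scale per tensor
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 tensor, e.g. to check the accuracy loss."""
    return q.astype(np.float32) * scale

# Toy example: one fully connected layer with random weights.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(256, 128)).astype(np.float32)
q, s = quantize_int8(w)
print("max abs error:", np.max(np.abs(w - dequantize(q, s))))         # bounded by scale / 2
print(f"memory: {w.nbytes / 1e3:.0f} kB -> {q.nbytes / 1e3:.0f} kB")  # 4x smaller
```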
However, these optimizations so far mainly target improvements during algorithm execution (e.g., an NPU with reduced power consumption and latency during training and/or inference). Although such improvements can deliver sustainability gains in one targeted area (e.g., by reducing the power consumption of an ML model, an NPU reduces the emissions during the use phase of that model), it remains unclear whether they benefit the environment when the whole life-cycle of the ML system is considered (e.g., the potentially higher embodied emissions or shorter lifetime of a specialized NPU compared with an off-the-shelf general-purpose graphics processing unit, or GPU).
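To make this life-cycle trade-off concrete, the back-of-the-envelope sketch below amortizes embodied emissions over a device's lifetime and adds use-phase electricity emissions. All device names, power figures, lifetimes, and grid carbon intensities are placeholder assumptions chosen purely for illustration; they are not measurements and do not come from the project description.

```python
def emissions_per_hour(embodied_kgco2e, avg_power_w, lifetime_hours,
                       grid_kgco2e_per_kwh):
    """Embodied emissions amortized per active hour, plus use-phase electricity."""
    use_phase = avg_power_w / 1000.0 * grid_kgco2e_per_kwh    # kgCO2e per hour
    return embodied_kgco2e / lifetime_hours + use_phase

# Hypothetical devices: a specialized NPU (lower power, higher embodied cost,
# shorter lifetime) vs. a general-purpose GPU. All values are made up.
for grid in (0.25, 0.05):                                     # kgCO2e per kWh
    npu = emissions_per_hour(150.0, 30.0, 2 * 365 * 8, grid)  # ~2 years at 8 h/day
    gpu = emissions_per_hour(100.0, 120.0, 5 * 365 * 8, grid) # ~5 years at 8 h/day
    print(f"grid {grid} kgCO2e/kWh -> NPU {npu:.3f}, GPU {gpu:.3f} kgCO2e/hour")

# With a carbon-intensive grid, the low-power NPU comes out ahead; with a cleaner
# grid, embodied emissions dominate and the longer-lived GPU wins. This sensitivity
# is exactly why a whole-life-cycle view is needed.
```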
This PhD thesis will study the eco-design of machine learning systems while co-optimizing software and hardware. A doubly holistic approach will thus be followed: the design space encompasses software-hardware co-optimization, and the target objective encompasses the whole life-cycle of the ML system.
This will be tackled from an interdisciplinary perspective, involving knowledge from machine learning and algorithm design, hardware design (System Technology Co-Optimisation or STCO), and life-cycle analysis.
The PhD topic is intentionally very open, but could for example address the following research questions:
The candidate holds an engineering master's degree (computer science, electrical engineering, or applied mathematics) and has good knowledge of at least one of the following areas: machine learning, hardware design, or sustainability and life-cycle analysis.
As the research project is interdisciplinary, the candidate moreover has a strong interest in, and the ability to develop expertise in, the remaining areas. Good communication skills, collaboration abilities, and a solid project management record will also be considered.
Required background: Engineering (Computer Science, Electrical Engineering, Applied Mathematics) with a background in at least one of: (i) Machine Learning, (ii) Hardware design, (iii) Sustainability.
Type of work: 80% modeling/simulation, 20% literature
Supervisor: David Bol
Daily advisor: Vincent Schellekens
The reference code for this position is 2025-174. Mention this reference code on your application form.