Dr. Abdullah Makkeh


  • Since 2019 Postdoctoral fellow at the Department of Data-driven Analysis of Biological Networks
  • Since 2019 Guest Researcher at the Neural Systems Theory group, Max Planck Institute for Dynamics and Self-Organization, Göttingen
  • 2018 – 2019 Postdoctoral fellow at the Computational Neuroscience Group, University of Tartu
  • 2018 PhD in Informatics, University of Tartu
  • 2014 – 2018 PhD student in Theoretical Computer Science, University of Tartu
  • 2011 – 2013 MSc in Mathematics, Lebanese University
  • 2009 – 2011 BSc in Mathematics, Lebanese University



Major Research Interests
My research aims to extend the capabilities of information theory for studying complex systems.
Complex systems such as the brain, artificial neural networks, and interactive agents share a common requirement for functioning: they must process information. These systems perform their computations by handling vast amounts of information: storing it (e.g., memorization), transferring it (e.g., communication), or modifying it (e.g., perception or decision-making). To understand these systems, we examine them through the lens of information processing using the mathematical framework of partial information decomposition (PID) within information theory. For instance, PID allows us to answer questions such as: Which parts of the system carry unique information about certain inputs? How much of the information held by different parts of the system is redundant, and how much is synergistic?
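For two source variables S_1, S_2 and a target T, the decomposition can be written schematically as follows (the notation is illustrative; the concrete measures used to quantify the individual terms differ between PID frameworks, including the pointwise shared information measure introduced in the Physical Review E paper listed below):

\[
I(T; S_1, S_2) = I_{\mathrm{unq}}(T; S_1 \setminus S_2) + I_{\mathrm{unq}}(T; S_2 \setminus S_1) + I_{\mathrm{red}}(T; S_1, S_2) + I_{\mathrm{syn}}(T; S_1, S_2),
\]

where the four terms quantify the information about T that is provided uniquely by S_1, uniquely by S_2, redundantly by both sources, and only synergistically by the two sources taken together.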
Currently, we are developing a novel neural network architecture called Infomorphic Networks, where neurons directly learn to optimize their information processing via local PID-based goal functions.
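As a rough sketch of such a goal function (an illustration only, with an assumed split of a neuron's inputs into a receptive input R and a contextual input C; the exact formulation and the choice of PID measure are given in the publications listed below): a neuron with output Y can be trained to maximize a weighted combination of the PID atoms of the information that Y carries about its inputs,

\[
G = \gamma_{\mathrm{red}}\,\Pi_{\mathrm{red}}(Y; R, C) + \gamma_{\mathrm{unq},R}\,\Pi_{\mathrm{unq}}(Y; R \setminus C) + \gamma_{\mathrm{unq},C}\,\Pi_{\mathrm{unq}}(Y; C \setminus R) + \gamma_{\mathrm{syn}}\,\Pi_{\mathrm{syn}}(Y; R, C),
\]

where the weights \gamma select which kind of information processing (redundant, unique, or synergistic) the neuron is encouraged to perform.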





Selected Recent Publications

  • Makkeh, A., Graetz, M., Schneider, A. C., Ehrlich, D. A., Priesemann, V., & Wibral, M. (2025). A General Framework for Interpretable Neural Learning based on Local Information-Theoretic Goal Functions. Proceedings of the National Academy of Sciences (PNAS), In Press.
  • Schneider, A. C., Neuhaus, V., Ehrlich, D. A., Makkeh, A., Ecker, A. S., Priesemann, V., & Wibral, M. (2025). What should a neuron aim for? Designing local objective functions based on information theory. The Thirteenth International Conference on Learning Representations (ICLR).
  • Ehrlich, D. A., Schneider, A. C., Priesemann, V., Wibral, M., & Makkeh, A. (2023). A Measure of the Complexity of Neural Representations based on Partial Information Decomposition. Transactions on Machine Learning Research (TMLR).
  • Gutknecht, A. J., Wibral, M., & Makkeh, A. (2021). Bits and pieces: Understanding information decomposition from part-whole relationships and formal logic. Proceedings of the Royal Society A.
  • Makkeh, A., Gutknecht, A. J., & Wibral, M. (2021). Introducing a differentiable measure of pointwise shared information. Physical Review E.




For a full list of publications, see Abdullah Makkeh on Google Scholar.