Professor Andrzej Bargiela

Director and CEO, INFOHUB Ltd. · Professor of Computer Science

Nottingham Science and Technology Park, United Kingdom

Profile

Professor Andrzej Bargiela holds a BSc/MSc in Electronics and Informatics from the Silesian Technical University (1978) and a PhD from the University of Durham, United Kingdom (1984). Over an academic career spanning several decades of contributions to computer science, he has held positions as Senior Research Fellow, Senior Lecturer, Reader, and ultimately Professor at leading British universities.

In 2007 he was appointed Professor of Computer Science at the University of Nottingham, a milestone accompanied by the role of Director of the School of Computer Science at the University's Malaysia Campus. Between 2011 and 2013 he served as Director of Internationalisation in the School of Computer Science at the University of Nottingham, further broadening the scope of his academic leadership.

As the founding Director and Chief Executive Officer of INFOHUB Ltd., a research spin-off company established in 2003, Professor Bargiela bridges the gap between academic research and commercial applications in human-centred computing, data mining, and decision support.

The title of Full Professor, conferred by the President of Poland in 2005, recognises the distinguished standing of this body of work. Visiting Professorships at universities in New Zealand, Poland, Japan, Finland, Italy, and Canada reflect the international reach and collaborative nature of the research programme.

6,600+ Citations · 40+ Years Active · 6 Research Areas · 50+ PhD Graduates

Research Interests

The research programme falls under the broad heading of Computational Intelligence, encompassing six interconnected areas that together address fundamental and applied challenges in information processing and modelling.

Foundations of Computing

Representation of information and uncertainty through granular algorithms, exploring the mathematical underpinnings of Granular Computing as a paradigm for human-like reasoning.

Human-Centred Information Processing

Semantic transformation of data, multi-abstraction reasoning, and verification of information abstractions that place human cognition at the centre of system design.

Fuzzy and Rough Sets

Algorithms and applications employing fuzzy set theory and rough set methodologies for classification, decision support, and knowledge extraction from imprecise data.

Parallel and Distributed Computing

Computing environments, fault-tolerant architectures, and parallel algorithms designed to harness distributed resources for computationally demanding research tasks.

Neural Computation

Computation with artificial and biological neural networks, exploring architectures inspired by neuroscience for pattern recognition, learning, and adaptive system behaviour.

Complex Systems

Modelling and optimisation of systems characterised by structural and information uncertainties, with applications ranging from water distribution to traffic management.

Selected Academic Contributions

A selection of notable books and edited volumes reflecting the breadth of the research programme in granular computing, modelling, and computational intelligence.

Granular Computing: An Introduction
A. Bargiela, W. Pedrycz. Kluwer Academic Publishers. A foundational text establishing the theoretical framework of granular computing as a discipline.
Human-Centric Information Processing Through Granular Modelling
A. Bargiela, W. Pedrycz (Eds.). Springer, Studies in Computational Intelligence. Documenting milestone contributions to human-centred information processing research.
Seminal Contributions to Modelling and Simulation
K. Al-Begain, A. Bargiela (Eds.). Springer, Simulation Foundations, Methods and Applications. Celebrating 30 years of the European Council for Modelling and Simulation.
Inaugural Lecture on Information and Uncertainty
University of Nottingham. A popular-level discussion of the research programme and its implications for understanding information granularity in everyday reasoning.

View the full publication list →

INFOHUB Ltd.

INFOHUB Ltd. is a research spin-off company founded in 2003, operating from Nottingham Science and Technology Park. The company specialises in human-centred applications encompassing decision support, data mining, and information services.

By translating academic insights into practical solutions, INFOHUB demonstrates how theoretical advances in computational intelligence can be applied to real-world challenges. The company connects the rigour of university research with the demands of industry and public-sector organisations seeking intelligent, data-driven tools.

Company Overview

Founded: 2003

Location: Nottingham, UK

Focus: Decision support, data mining, information services

Status: University spin-off company

Latest News

November 27, 2025

Deep Learning Meets Biological Vision: Lessons from Retinal Modelling

The remarkable success of deep convolutional neural networks in image recognition tasks has drawn attention to the parallels between artificial and biological visual systems. While modern architectures such as ResNet and Vision Transformers have pushed performance on benchmarks to near-human levels, the computational strategies employed by the biological retina remain a rich source of inspiration for the design of more efficient and robust visual processing systems.

The retina is far more than a passive light sensor. It contains multiple layers of neurons that perform sophisticated signal processing before any information reaches the brain. Photoreceptors convert light into electrical signals, but these signals are then processed by successive layers of bipolar, horizontal, amacrine, and ganglion cells, each contributing to operations such as contrast enhancement, motion detection, and adaptation to varying light levels.
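To make the centre-surround idea concrete, the sketch below is an illustrative assumption rather than the retinal model discussed in the article: it approximates light adaptation by divisive normalisation against a local luminance estimate, and contrast enhancement with a difference-of-Gaussians filter, using NumPy and SciPy; all parameter values are arbitrary.

```python
# Minimal sketch of two retina-inspired operations (illustrative only):
# light adaptation via divisive normalisation, and centre-surround
# contrast enhancement via a difference-of-Gaussians filter.
import numpy as np
from scipy.ndimage import gaussian_filter

def adapt_to_luminance(image, sigma=15.0, eps=1e-6):
    """Divide each pixel by a local luminance estimate (light adaptation)."""
    local_mean = gaussian_filter(image, sigma)
    return image / (local_mean + eps)

def center_surround(image, sigma_center=1.0, sigma_surround=3.0):
    """Difference-of-Gaussians response: excitatory centre minus inhibitory surround."""
    center = gaussian_filter(image, sigma_center)
    surround = gaussian_filter(image, sigma_surround)
    return center - surround

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "scene": a bright square on a dark, noisy background.
    scene = rng.normal(0.1, 0.02, size=(64, 64))
    scene[20:44, 20:44] += 0.8
    adapted = adapt_to_luminance(scene)
    response = center_surround(adapted)
    # The strongest responses occur along the square's edges.
    print("peak centre-surround response:", float(np.abs(response).max()))
```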

Read more
November 17, 2025

European Smart City Initiatives and the Future of Urban Mobility

Urban mobility is undergoing a period of rapid transformation across Europe, driven by environmental policy, technological innovation, and changing citizen expectations. The European Commission has been at the forefront of this shift, funding large-scale demonstration projects that bring together cities, technology providers, and research institutions to develop and test integrated smart city solutions. The Smart Cities Marketplace, operated by the European Commission, serves as a central platform connecting these initiatives and facilitating knowledge exchange between participating cities.

Among the most ambitious of these efforts are the Horizon 2020 lighthouse projects, which designate selected cities as living laboratories for testing innovations in energy, transport, and digital infrastructure. Projects such as REMOURBAN, GrowSmarter, Triangulum, and SmarterTogether have collectively involved over 120 cities across Europe, deploying more than 550 demonstrations of technological and social innovation.

Read more
April 29, 2025

Balancing Accuracy and Interpretability in Fuzzy Classification Systems

Classification is one of the most fundamental tasks in machine learning, and the range of available techniques has grown enormously over the past two decades. Yet a recurring tension runs through the field: the methods that achieve the highest accuracy on benchmark datasets are often the most difficult for human experts to understand and trust. Fuzzy classification systems offer a distinctive middle ground, combining the ability to learn from data with a representational framework that preserves human interpretability.

Fuzzy set theory, originally proposed by Lotfi Zadeh in 1965, provides a mathematical language for expressing partial membership and vague boundaries. In the context of classification, this means that an input pattern need not belong entirely to one class or another; instead, it can have graded membership across multiple categories. This property aligns well with many real-world situations where boundaries between classes are inherently imprecise.
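As a concrete illustration of graded membership, the minimal sketch below (not code from the article) defines hypothetical triangular membership functions for three temperature classes and shows a single reading belonging partially to two of them; the class names and breakpoints are assumptions chosen for clarity.

```python
# Minimal illustration of graded class membership in the sense of fuzzy sets:
# one input can belong partially to several linguistic categories at once.
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic classes over a temperature axis (degrees Celsius).
CLASSES = {
    "cool": (5.0, 12.0, 19.0),
    "warm": (15.0, 22.0, 29.0),
    "hot":  (25.0, 32.0, 40.0),
}

def memberships(temperature):
    """Graded membership of one reading across all classes."""
    return {name: round(triangular(temperature, *abc), 3)
            for name, abc in CLASSES.items()}

if __name__ == "__main__":
    # 17 degrees sits near the cool/warm boundary, so it belongs partly to both.
    print(memberships(17.0))  # {'cool': 0.286, 'warm': 0.286, 'hot': 0.0}
```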

Read more
August 25, 2024

The Role of Granular Computing in Modern Artificial Intelligence

Artificial intelligence has made remarkable strides in recent years, yet many of the most powerful models remain opaque in their reasoning. Granular computing offers a compelling alternative perspective, one that is rooted in the way humans naturally process information through layers of abstraction and contextual grouping. The IEEE Systems, Man, and Cybernetics Society Technical Committee on Granular Computing has long championed this approach as a formal framework for building computational systems that operate at varying levels of detail.

At its core, granular computing is concerned with the construction and manipulation of information granules: collections of objects that are drawn together by similarity, proximity, or functional equivalence. These granules can take the form of fuzzy sets, rough sets, intervals, or clusters, depending on the nature of the problem and the level of precision required. The central insight is that human reasoning rarely operates on raw, atomic data.
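The short sketch below illustrates the simplest of these granule types under assumed parameters: one-dimensional readings are grouped by proximity and each group is summarised as an interval granule. The grouping rule and the gap threshold are illustrative choices, not a method taken from the article.

```python
# Minimal sketch of interval information granules: nearby one-dimensional
# readings are grouped by proximity and each group becomes an interval.
def interval_granules(values, gap=2.0):
    """Group sorted values whose neighbours are closer than `gap`,
    returning one (lower, upper, size) interval granule per group."""
    ordered = sorted(values)
    granules = []
    group = [ordered[0]]
    for v in ordered[1:]:
        if v - group[-1] <= gap:
            group.append(v)
        else:
            granules.append((min(group), max(group), len(group)))
            group = [v]
    granules.append((min(group), max(group), len(group)))
    return granules

if __name__ == "__main__":
    # Sensor-style readings with three natural clumps.
    readings = [1.1, 1.4, 2.0, 9.5, 10.2, 10.9, 25.0, 25.3]
    for lo, hi, n in interval_granules(readings):
        print(f"granule [{lo}, {hi}] covering {n} readings")
```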

Read more