Machine Learning


Here is a list of projects in which the machine learning group is involved:


Energy Informatics

The Energy Informatics Lab develops and applies modern information and communication technology to address challenges related to sustainable energy production, distribution, and conservation. Energy informatics is a broad field that draws on methods from artificial intelligence, machine learning, optimization, Internet of Things technology, high-performance computing, algorithms, computer security, and visualization, with applications in energy systems such as wind energy, solar power, energy storage, smart grids, and green data centers.

Machine Teaching for Explainable AI

“Machine Teaching for Explainable AI” is a joint project between the Department of Informatics at the University of Bergen, the Valencian Research Institute for AI (VRAIN) and industrial partners Equinor and Eviny. The project is financed by the Norwegian Research Council.

The last decade has witnessed an explosive rise in the use of decision systems built on modern AI techniques that are often opaque, such as deep learning. These black-box systems, based on large amounts of data, are a key tool in making important decisions for individuals, companies, and society at large. It is of the utmost importance that users can evaluate and trust these decisions. The field of explainable artificial intelligence (XAI) addresses this issue by giving human users a better understanding of the behavior of complex AI systems.

This project is directed at example-based explanations, with a novel focus on the simplicity of examples. It will develop mathematical formulations of simplicity across various representation domains that correlate well with simplicity for the human learner. The project will develop the conceptual and practical setting of example-based explanation, thereby expanding the techniques of machine teaching and breaking new ground by applying them to XAI in an innovative way.
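
The core idea of example-based explanation can be illustrated with a small sketch. Everything below is an illustrative assumption, not the project's actual method: we explain a black-box prediction by retrieving a nearby training example with the same predicted label, and the "simplicity" measure (here, a crude count of non-zero features) stands in for the principled formulations the project aims to develop.

```python
# Illustrative sketch of example-based explanation (assumed setup, not the
# project's method): explain a black-box prediction by the simplest nearby
# training example that the model labels the same way.

def explain(x, predict, train_set, alpha=0.5):
    """Return the training example that best trades off closeness to x
    against simplicity, among examples the model labels like x."""
    target = predict(x)

    def distance(a, b):                     # plain Euclidean distance
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

    def complexity(a):                      # proxy: count of non-zero features
        return sum(1 for v in a if v != 0)

    candidates = [e for e in train_set if predict(e) == target]
    return min(candidates, key=lambda e: distance(x, e) + alpha * complexity(e))

# Toy black box: classify by the sign of the first feature.
predict = lambda v: v[0] > 0
train = [(1.0, 5.0, 3.0), (1.2, 0.0, 0.0), (-2.0, 1.0, 0.0)]
print(explain((1.1, 0.2, 0.1), predict, train))  # → (1.2, 0.0, 0.0)
```

The weight `alpha` controls how much simplicity is favored over faithfulness to the query point; choosing and justifying such a trade-off is exactly where a principled simplicity measure matters.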

Parameterized Complexity for Practical Computing

Computational tasks vital in all areas of science, industry and society are often NP-hard, meaning they are unlikely to be solvable optimally on even the fastest computers in any reasonable time. Practitioners use heuristics, a term that denotes intuitive or ad-hoc rules such as the greedy heuristic, which makes choices based on what looks best at the moment. Heuristics often do a good job, but come with no guarantees on efficiency or solution quality. A central contemporary challenge of theoretical computer science is to explain the surprising effectiveness of heuristics on real-world data for hard problems. This project takes up that challenge in the framework of Parameterized Complexity (PC). A high-impact outcome will be novel fixed-parameter tractable (FPT) subroutines for heuristics in a variety of application areas that outperform best-practice approaches on standard benchmarks.

A novel theory of FPT will allow systematic transfer of results to the design of heuristics, kernelization, approximation and exponential algorithms. Approximation and worst-case exponential algorithms will be developed based on extremal gradients and smooth FPT that asymptotically beat best-known theoretical results. Breakthrough theory and experiments will be of interest in other areas of algorithm theory, related areas of mathematics, data visualization, machine learning and heuristic practice.
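
To make the FPT and kernelization ideas concrete, here is a minimal sketch (not taken from the project) of a textbook FPT subroutine: Buss kernelization for Vertex Cover. Any vertex covering more than k edges must be in every size-k cover; after applying this rule exhaustively, a yes-instance has at most k² edges, so simple branching on the small kernel finishes quickly when k is small, even though the problem is NP-hard in general.

```python
# Minimal sketch of a classic FPT subroutine: Buss kernelization plus
# branching for Vertex Cover, parameterized by the cover size k.

def vertex_cover(edges, k):
    """Return True iff the graph has a vertex cover of size <= k."""
    edges = {frozenset(e) for e in edges}
    # Kernelization rule: a vertex of degree > k must be in any size-k cover.
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:
                edges = {e for e in edges if v not in e}  # take v into cover
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:   # kernel bound: yes-instances have <= k^2 edges
        return False
    if not edges:
        return True
    # Branch on an arbitrary edge: one of its endpoints must be in the cover.
    u, v = next(iter(edges))
    return (vertex_cover({e for e in edges if u not in e}, k - 1)
            or vertex_cover({e for e in edges if v not in e}, k - 1))

# A triangle needs a cover of size 2:
print(vertex_cover([(1, 2), (2, 3), (1, 3)], 1))  # → False
print(vertex_cover([(1, 2), (2, 3), (1, 3)], 2))  # → True
```

The running time depends exponentially only on the parameter k, not on the graph size; this separation is what makes FPT subroutines attractive inside practical heuristics.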

Retail Fresh

In 2019, food waste amounted to 931 million tonnes, of which 13% occurred at retailers. The rapid deterioration of fresh produce gives it a short and variable shelf life, so fresh produce is typically not supported by automated goods-supply systems. Manual ordering has to account for large demand variations, high sales volumes and particularly high product-quality requirements, which in turn may explain the high degree of waste for fresh produce. To reach the UN Sustainable Development Goal target of a 50% reduction in food waste by 2030, retailers must implement waste-reduction measures for fresh produce that are much more efficient than the state of the art.

In the project, Link Retail and the University of Bergen will develop a scalable system based on machine learning to minimize wastage of fresh goods while sustaining sales. The software will be tested iteratively on different categories (bread, fruit and vegetables, and then meat, fish and deli) in close collaboration with Meny.
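
The trade-off at the heart of such a system, wasting unsold goods versus losing sales, can be sketched with classic newsvendor logic. The function below is a hypothetical illustration, not Link Retail's actual software: it orders at the demand quantile determined by the relative costs of under- and over-stocking, estimated from recent sales history.

```python
# Hypothetical sketch (not the project's system) of the core ordering
# trade-off: pick an order quantity at the critical quantile of recent
# demand, balancing lost sales against wasted fresh goods.

def order_quantity(daily_sales, understock_cost, overstock_cost):
    """Order at the critical quantile of the empirical demand distribution."""
    q = understock_cost / (understock_cost + overstock_cost)  # critical ratio
    history = sorted(daily_sales)
    idx = min(len(history) - 1, int(q * len(history)))
    return history[idx]

# Bread: a lost sale costs 2, a wasted loaf costs 6, so order conservatively.
sales = [30, 42, 35, 28, 40, 33, 38, 31]
print(order_quantity(sales, understock_cost=2, overstock_cost=6))  # → 31
```

A machine-learning system would replace the raw sales history with a learned demand forecast per product and store, but the waste-versus-sales balance it must strike is the same.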


DISTINGUISH

The ability to fine-tune well positioning in real time through geosteering increases the potential value of wells and minimizes risks. Geosteering is essential in petroleum drilling and is spreading to other types of drilling, including geothermal projects, civil wells and tunnels, and soon CO2 storage wells. Currently, geosteering decisions rely on fast-paced manual interpretation of real-time data, which needs to consider pre-job modeling, the geological setting, and embedded uncertainties. Following the newest trends in the industry, the geosteering workflow of the future should capture and update uncertainty in an ensemble of geomodels.

The most advanced ensemble methods form the core of ensemble-based closed-loop reservoir management (EnCLRM), the new standard for optimizing the development of petroleum fields. However, EnCLRM is not fast enough for real-time operations due to the complexity of the geomodelling involved. We plan to remove this complexity by creating new Generative-Network (GN) geomodels. GN-geomodels "learn geology" before the operation and have sub-second performance. They unlock next-generation data assimilation and new predictive decision-support AI. DISTINGUISH will develop these technologies and combine them into the "geosteering workflow of the future." It proposes a new way of thinking, substituting "depth of detection" with "distance of prediction" and probabilistic decision support.
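
The ensemble data-assimilation step can be sketched in miniature. The code below is a toy illustration under stated assumptions, not DISTINGUISH code: a one-parameter ensemble Kalman update that shifts every member of a geomodel ensemble toward a single real-time measurement, which is the kind of update a fast GN-geomodel would make feasible at sub-second rates.

```python
# Toy sketch (assumed setup, not the project's implementation) of an
# ensemble Kalman update: assimilate one measurement into an ensemble
# of scalar model parameters.

import random

def enkf_update(ensemble, forward, obs, obs_var):
    """ensemble: parameter values; forward: maps a parameter to a predicted
    observation; obs/obs_var: measurement and its error variance."""
    preds = [forward(m) for m in ensemble]
    n = len(ensemble)
    m_mean = sum(ensemble) / n
    p_mean = sum(preds) / n
    cov_mp = sum((m - m_mean) * (p - p_mean)
                 for m, p in zip(ensemble, preds)) / (n - 1)
    var_p = sum((p - p_mean) ** 2 for p in preds) / (n - 1)
    gain = cov_mp / (var_p + obs_var)          # Kalman gain
    rng = random.Random(0)                     # seeded perturbed observations
    return [m + gain * (obs + rng.gauss(0, obs_var ** 0.5) - forward(m))
            for m in ensemble]

# Prior ensemble of a hypothetical layer-depth parameter; the (assumed)
# logging tool reads twice the depth, and it measures 23.0.
prior = [9.0, 10.0, 11.0, 12.0]
posterior = enkf_update(prior, forward=lambda d: 2 * d, obs=23.0, obs_var=0.1)
```

After the update, the ensemble mean moves from 10.5 toward the value 11.5 implied by the measurement, while the remaining spread reflects the uncertainty that the workflow is designed to carry forward into the next decision.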