
Artificial Intelligence: Systems & Infrastructures

Artificial Intelligence has established itself in high-performance computing and provides the sciences with a wealth of new methods for analysing data. That is why we operate the LRZ AI systems, host partners’ AI platforms and evaluate innovative AI chips for their potential use in research.

From the natural sciences to the humanities: Artificial Intelligence (AI) is now in constant interdisciplinary use. Its methods can be used to comprehensively analyse and recalculate Big Data. AI also complements the various methods of high-performance computing and extends conventional simulations with statistical analyses where mathematics and classical programming still hit their limits. With its LRZ AI systems, the LRZ provides researchers with powerful clusters equipped with Graphics Processing Units (GPUs).

Strong partnerships create even more opportunities: BayernKI, the Bavarian IT infrastructure for AI, and the European AI factory HammerHAI provide additional AI resources and supplement the range of training courses and workshops. Last but by no means least, partnerships with the Munich Centre for Machine Learning (MCML) and the German Aerospace Centre (DLR) are broadening the LRZ’s range of AI services as well as its experience with AI.

The new technology is developing dynamically, but the energy requirements of AI are very high and AI models are often unreliable. This raises research questions for which the LRZ is seeking answers in collaboration with renowned international institutes.

Our Mission: To turn research into knowledge as quickly as possible

Analysing measured values, developing models, calculating simulations: if research can process its observations efficiently, we can increase our knowledge more quickly and better solve the challenges of the present. That is why we do more than just provide scientists and researchers with high-performance technology and innovative tools that reduce the time it takes to turn research into knowledge. Above all, we focus on personal support and a comprehensive, up-to-date training programme: at the LRZ, you can learn programming languages as well as how to use high-performance and AI systems. And it goes without saying that LRZ specialists are continuously optimising the available computing power, the infrastructure for data management and the user-friendliness of technology and tools.

From pattern recognition to language models: AI technology at the LRZ

The LRZ's AI Systems

The LRZ’s AI systems are made up of various high-performance cluster segments equipped with Graphics Processing Units (GPUs), for example from NVIDIA. The LRZ provides a practical software stack for developing your own models, as well as tried-and-tested data sets and common AI applications.
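
By way of illustration, here is a minimal PyTorch sketch of the kind of workload such a GPU segment runs: it picks up an available GPU and performs a single training step. Model, data and hyperparameters are placeholders chosen for this sketch; the snippet is not part of the LRZ software stack documentation.

# Minimal sketch: one training step on a GPU, as a stand-in for a typical
# workload on an AI cluster segment. Model and data are placeholders.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using {device}, visible GPUs: {torch.cuda.device_count()}")

# Toy classifier and random batch; a real job would load its own data set.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(64, 128, device=device)          # one batch of features
targets = torch.randint(0, 10, (64,), device=device)  # one batch of labels

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"Loss after one step: {loss.item():.4f}")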

To the AI systems

The largest AI chip in the world

Cerebras Systems has developed a chip specially designed for working with and training large language models: the Wafer Scale Engine 2 (WSE-2). It is the size of a pizza box and contains 2.6 trillion transistors, 850,000 computing cores and around 40 gigabytes of fast static on-chip memory. When working with Large Language Models (LLMs), data can flow across the chip and intermediate results can be temporarily stored and recalculated on it. Together with researchers, the LRZ is evaluating a CS-2 system built around the WSE-2 for its potential use in science.

To Cerebras

Dynamic data flows

Not only does AI need powerful computers, it also needs flexible storage solutions. Regardless of whether we’re talking about computer vision or language models: anyone who works with AI needs to prepare AI systems for their tasks. Training runs iteratively or in batch mode, switching between calculation or analysis and the storage of intermediate results. For these work steps, the LRZ’s AI systems offer, among other things, an interactive web server, and they can be connected to the Data Storage System (DSS) and the LRZ Cloud Storage. This creates a flexible, scalable environment equipped to process a wide range of Big Data tasks.
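
To make the alternation between computing and storing intermediate results concrete, here is a hedged Python sketch of an iterative job that periodically writes checkpoints. The checkpoint directory is a placeholder; in practice it would point at a mounted storage system such as a DSS or cloud-storage path.

# Sketch of an iterative job that alternates between computing and persisting
# intermediate results. The checkpoint directory is a placeholder that would
# point at mounted storage (e.g. a DSS or cloud-storage path) in practice.
import os
import pickle

CHECKPOINT_DIR = os.environ.get("CHECKPOINT_DIR", "checkpoints")  # placeholder
os.makedirs(CHECKPOINT_DIR, exist_ok=True)

def run_iteration(state: dict) -> dict:
    """Stand-in for one round of training or analysis."""
    state["step"] += 1
    state["metric"] = 1.0 / state["step"]  # dummy intermediate result
    return state

state = {"step": 0, "metric": None}
for _ in range(20):
    state = run_iteration(state)
    # Persist intermediate results so long-running jobs can be resumed later.
    if state["step"] % 5 == 0:
        path = os.path.join(CHECKPOINT_DIR, f"state_{state['step']:04d}.pkl")
        with open(path, "wb") as f:
            pickle.dump(state, f)
        print(f"Saved intermediate state to {path}")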

To the webserver

Working with AI: Research Scenarios

  • Language models for protein codes
    Bioinformatics

    Language models help us to decipher the code of life. The bioinformaticians at the Technical University of Munich’s RostLab have been using Large Language Models (LLMs) for some time to analyse protein sequences. Proteins work in a similar way to human languages: their sequences are constructed like words and sentences. For their analyses, the researchers trained language models on protein code using the specialised Cerebras CS-2 system, thereby laying the foundations for deciphering proteins and developing new materials from them. A short usage sketch follows this list.

    Click here to read more

  • AI plus HPC
    Surrogate models for supercomputing

    AI enriches conventional HPC with new statistical methods. For example, AI uses data and measured values to supplement simulations that cannot (yet) be calculated mathematically or whose inputs are extremely difficult to collect. It also makes it possible to create more variants of a simulation in less time. As a result, surrogate or substitute models, which expand and sometimes even help to improve conventional simulations, are becoming more widely accepted. A small sketch of the idea follows this list.

    Click here to read more

  • Eradicating hate speech
    Community Management

    Social communities face the problem of moderating content and filtering out and deleting fake content or even hate speech. Using this task as an example, an LRZ team compared different technologies: they trained the same language models on a GPU cluster and on the Cerebras system. The result: the Cerebras system performed these tasks considerably faster.

    Click here to read more

  • A strong connection
    terrabyte

    The LRZ created the terrabyte high-performance data analysis platform in collaboration with the German Aerospace Centre (DLR). It is located in Garching and is connected via a fast data link to the DLR archive, which has been storing satellite data for decades. This provides a wealth of data and gives the natural and environmental sciences in Bavaria practical AI tools for research. With the help of terrabyte, monitoring systems for forests and nature are being created, as well as AI models for interpreting and analysing LiDAR and raster data.

  • More efficient diagnostics 
    Medicine 

    The LRZ, in collaboration with the University Hospital and the Paracelsus Medical University of Salzburg, is currently developing innovative tools for medical diagnostics with the help of AI. A combination of pattern or image recognition and language models is intended to accelerate bladder cancer diagnoses. After recording ultrasound and other images of the bladder and inputting the data, the tool should be able to generate an automated diagnostic text. 

    Click here for more information
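
As a hedged illustration of the protein-language-model scenario above: the RostLab publishes protein language models such as Rostlab/prot_bert on Hugging Face, and the sketch below embeds a protein sequence with such a model. This is a generic usage example following that model’s public documentation, not the training setup that ran on the Cerebras system.

# Sketch: embedding a protein sequence with a publicly available protein
# language model (Rostlab/prot_bert on Hugging Face). It illustrates the
# "proteins as language" idea, not the Cerebras training setup itself.
import re
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertModel.from_pretrained("Rostlab/prot_bert")
model.eval()

# Amino acids are the "words": the tokenizer expects space-separated residues,
# with rare residues (U, Z, O, B) mapped to the unknown symbol X.
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
sequence = " ".join(re.sub(r"[UZOB]", "X", sequence))

inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One embedding vector per residue (plus special tokens), usable for
# downstream structure or function prediction.
print(outputs.last_hidden_state.shape)  # e.g. (1, sequence length + 2, 1024)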
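
The surrogate-model scenario can likewise be sketched in a few lines: a small neural network is fitted to input/output pairs from a simulation and then answers new queries far more cheaply than rerunning the simulation. Everything here is illustrative; the "simulation" is an artificial stand-in function, not an LRZ code.

# Sketch of a surrogate model: a small neural network learns the input/output
# behaviour of an expensive simulation from sample runs and then stands in for
# it. The "simulation" here is an artificial placeholder function.
import torch
import torch.nn as nn

def expensive_simulation(x: torch.Tensor) -> torch.Tensor:
    """Stand-in for a costly HPC simulation mapping parameters to a result."""
    return torch.sin(3.0 * x) + 0.5 * x**2

# Training data from a limited number of "simulation runs".
x_train = torch.linspace(-2.0, 2.0, 200).unsqueeze(1)
y_train = expensive_simulation(x_train)

# Small fully connected network as the surrogate.
surrogate = nn.Sequential(
    nn.Linear(1, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(surrogate(x_train), y_train)
    loss.backward()
    optimizer.step()

# The trained surrogate now approximates the simulation at new points cheaply.
x_new = torch.tensor([[0.5], [1.5]])
print("surrogate: ", surrogate(x_new).detach().squeeze().tolist())
print("simulation:", expensive_simulation(x_new).squeeze().tolist())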

Strong AI partnerships

BayernKI

Researching and developing AI methods and models in Bavaria: as part of its high-tech agenda, the Free State of Bavaria first established around 130 AI professorships and then BayernKI. This high-performance infrastructure is housed at both the LRZ and the Erlangen National High-Performance Computing Centre (NHR@FAU) and connected by a fast data line. Here, researchers of all disciplines get fast and flexibly scalable access to AI technology.

To BayernKI 

HammerHAI

The EuroHPC Joint Undertaking is funding an AI factory in Germany in which the LRZ is a partner: HammerHAI is expected to meet the demand for computing resources for AI methods in research, industry and the public sector. To this end, the High-Performance Computing Centre Stuttgart (HLRS) is working with the LRZ and other research institutes to create an AI-optimised supercomputing infrastructure and provide support. The LRZ advises users and runs training courses on how to use AI.

To HammerHAI

Munich Centre for Machine Learning

The LRZ is closely linked to the Munich Centre for Machine Learning (MCML), not just through its Board of Directors but also by hosting AI technology for the MCML. This is structured in a similar way to the LRZ’s AI systems and can be quickly connected to them if more computing power is needed for AI applications.

To the MCML

Trillion Parameter Consortium

Trustworthy, as versatile as possible and useful for research: AI applications are still a long way from this ideal. The Trillion Parameter Consortium (TPC), founded by Argonne National Laboratory in the US, brings together around 60 research institutes from around the world to develop reliable, generative AI models for science.

To TPC


AI research & development projects at LRZ

We research the latest computer and storage technologies as well as Internet tools. In collaboration with partners, we develop technologies for future computing, energy-efficient computing and IT security, as well as tools for data analysis and the development of artificial intelligence systems. Here is an overview of all our research projects. 

Your introduction to the world of AI and data

The LRZ doesn’t just support researchers with AI technology and applications, we also advise and support you in collecting and processing data: get in touch with the LRZ Big Data and AI team if you would like to know how to prepare, harmonise and ultimately process even the largest data sets at the LRZ yourself. You can also rest assured that we will provide you with the resources that best suit your AI projects.

To Consulting