Multimodal Large Language Models (M-LLMs)

Use Cases

Analysis of multimodal clinical reports

Multimodal LLMs can be used to analyze clinical reports that include not only text but also medical images, laboratory test results, and voice recordings. This allows for a more comprehensive understanding of a patient's clinical picture and can support physicians in diagnostic and therapeutic decisions.
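One way to sketch this workflow is the step that precedes the model call: gathering the report's text, laboratory values, and image references into a single structured prompt. The `build_report_prompt` helper and the field names below are illustrative assumptions, not a real clinical API.

```python
# Hypothetical sketch: assembling a multimodal clinical report into one
# structured prompt before handing it to a multimodal LLM.
# All field names and the task wording are illustrative placeholders.

def build_report_prompt(text: str, lab_results: dict, image_refs: list) -> str:
    """Combine free-text notes, lab values, and image references into one prompt."""
    labs = "\n".join(f"- {name}: {value}" for name, value in lab_results.items())
    images = "\n".join(f"- [image] {ref}" for ref in image_refs)
    return (
        "Clinical notes:\n" + text + "\n\n"
        "Laboratory results:\n" + labs + "\n\n"
        "Attached images:\n" + images + "\n\n"
        "Task: summarize the clinical picture and flag findings that need review."
    )

prompt = build_report_prompt(
    text="Patient reports persistent cough and fatigue.",
    lab_results={"WBC": "11.2 x10^9/L", "CRP": "48 mg/L"},
    image_refs=["chest_xray_frontal.png"],
)
print(prompt)
```

In a real deployment the image references would be passed to the model as actual image inputs; the point of the sketch is only that the modalities are aligned in one request rather than analyzed in isolation.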

Generation of personalized treatment plans

Using multimodal data such as diagnostic test results and medical images, multimodal LLMs can generate personalized treatment plans for patients with complex or multiple conditions. These plans can take into account individual factors such as adverse drug reactions, comorbidities, and patient preferences.
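A plausible safety pattern here is a deterministic pre-filter in front of the model: candidate therapies are screened against the patient's recorded adverse reactions and comorbidities, and only the viable ones are offered to the LLM for plan drafting. The data structures and drug names below are invented for illustration.

```python
# Hypothetical pre-filter sketch: drop candidate therapies that conflict with
# the patient's known adverse reactions or comorbidities before an LLM drafts
# a personalized plan. All names and records here are illustrative.

def filter_candidates(candidates, adverse_reactions, comorbidities):
    """Return names of therapies not contraindicated by the patient's history."""
    patient_factors = adverse_reactions | comorbidities
    viable = []
    for therapy in candidates:
        if set(therapy["contraindications"]) & patient_factors:
            continue  # conflicts with the patient's record: exclude
        viable.append(therapy["name"])
    return viable

candidates = [
    {"name": "drug_a", "contraindications": {"penicillin_allergy"}},
    {"name": "drug_b", "contraindications": {"renal_impairment"}},
    {"name": "drug_c", "contraindications": set()},
]
viable = filter_candidates(candidates, {"penicillin_allergy"}, {"hypertension"})
print(viable)  # drug_a is excluded by the recorded allergy
```

Keeping this check outside the model makes the contraindication logic auditable, which matters when the downstream generator is probabilistic.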

Symptom recognition through multimodal data analysis

Multimodal LLMs can be trained to recognize symptoms of various medical conditions through integrated analysis of textual data, medical images, and voice data. For example, they can be used to identify signs of deteriorating mental health by analyzing patients' texts, facial expressions in images, and tone of voice in voice recordings.
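The integration step described above is often implemented as late fusion: each modality (text, image, voice) produces its own risk score, and the scores are combined into one. A minimal sketch, assuming a weighted average with made-up weights and scores (not clinical values):

```python
# Hypothetical late-fusion sketch: per-modality symptom-risk scores are
# combined by a weighted average. Weights and scores are illustrative
# placeholders, not validated clinical parameters.

def fuse_scores(scores: dict, weights: dict) -> float:
    """Weighted average of per-modality scores, each assumed to lie in [0, 1]."""
    total_weight = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_weight

weights = {"text": 0.5, "image": 0.3, "voice": 0.2}
scores = {"text": 0.8, "image": 0.6, "voice": 0.7}  # e.g. mood-deterioration signals
risk = fuse_scores(scores, weights)
print(round(risk, 2))  # 0.72
```

Normalizing by the sum of weights lets the same function handle cases where a modality (say, a voice recording) is missing from a given patient's data.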

Potential benefits for healthcare and scientific research

LLMs can be used for a variety of tasks:

Generating text, translating languages, writing different kinds of creative content, and answering questions in an informative way

Summarizing large amounts of text in a concise and informative way

Generating new ideas and solutions to complex problems

What we do


Consulting and support

We provide consulting and technical support for the platforms and software that enable medical research and diagnosis, from the installation and configuration of artificial intelligence platforms to the operation of molecular simulation software.


Custom development

We develop customized IT solutions for medical research and diagnosis: for example, artificial intelligence algorithms for early disease detection, or software for personalizing medical care.