Wall Calendar 2025

In 2025, the NFDI4BIOIMAGE wall calendar entered its second round.

In 2025, our wall calendar broadened its scope to a consortium-spanning edition. Each month we presented a new image, taken with mobile phones, cameras, drones, and microscopes, together with its story.

The cover page of our 2025 NFDI4BIOIMAGE calendar depicts a microscopic image of a brain tumor cryosection, with different cell types highlighted in distinct colors based on the multimodal integration of bioimaging and spatial transcriptomics data.

This overlay of a spinning disk confocal image with gene expression data generated on a Xenium platform precisely characterizes the cellular composition of a human tumor sample. Multimodal datasets from spatial-omics experiments and other technologies are crucial for an in-depth understanding of the structure of biological tissues and the interactions between different cell types in both physiological and pathological conditions.

After the spatial analysis of the cells' transcriptomes, the dense tissue sample was re-imaged at higher resolution for improved cell segmentation. Subsequent registration of the segmented cell patterns with the spatial transcript counts identified and highlighted tumor cells (green) alongside immune cells (red), stromal cells (violet) and neurons (cyan) in the brain sample based on gene expression patterns. Cell nuclei are shown in white. Cell types were identified based on the expression patterns of biomarkers such as PTCH1 for tumor cells, CD4 for immune cells and AQP4 for neurons, among others.
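The registration step described above, assigning transcript detections to segmented cells, can be illustrated with a minimal sketch. Labels, coordinates and gene names below are entirely hypothetical and are not taken from the actual study pipeline:

```python
import numpy as np

# Toy label image: two segmented cells (labels 1 and 2) on background 0
labels = np.zeros((6, 6), dtype=int)
labels[1:3, 1:3] = 1
labels[3:6, 3:6] = 2

# Hypothetical transcript detections as (row, col, gene)
transcripts = [(1, 2, "PTCH1"), (4, 4, "CD4"), (5, 5, "CD4"), (0, 0, "AQP4")]

# Assign each transcript to the cell whose segment it falls into
counts = {}
for r, c, gene in transcripts:
    cell = labels[r, c]
    if cell:  # label 0 = background, transcript stays unassigned
        counts[(cell, gene)] = counts.get((cell, gene), 0) + 1

print(counts)  # per-cell gene expression counts
```

Real pipelines operate on millions of transcripts and use more careful boundary handling, but the lookup of segment labels at transcript coordinates is the core idea.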

This image showcases a project of the NFDI4BIOIMAGE task areas 1 and 3, which aims to improve the annotation, visualization and FAIR sharing of multimodal imaging datasets, i.e. images of biological samples combined with data from non-imaging technologies. Such spatial-omics experiments contribute substantially to an in-depth understanding of health and disease.

An exemplary plugin for the OMERO image data management system is omero-vitessce, developed by Michele Bortolomeazzi, a bioinformatician at DKFZ Heidelberg and member of NFDI4BIOIMAGE. It enables the visualization of images stored and annotated in OMERO with the Vitessce multimodal data viewer.

The image was created by Michele using a dataset generated in a collaborative study of the Single-cell Open Lab, the Division of Chromatin Networks, and the Division of Pediatric Neurooncology of the German Cancer Research Center (DKFZ), the German Cancer Consortium (DKTK), and the Hopp Children’s Cancer Center (KiTZ), all located in Heidelberg. The study is currently under review but is shared prior to publication.

The datasets are archived in the BioImage Archive and thus openly accessible. Code for both image processing and data analysis is available on GitHub, with extensive documentation for reproducibility.

January 2025:

The image for this month was captured during a joint research project between the Center for Orthopaedics at Greifswald University Medical Center (Greifswald, Germany), which supplied the samples, and the biomedical research center ZIK plasmatis led by Prof. Dr. Sander Bekeschus at the Leibniz Institute for Plasma Science and Technology (INP), where image acquisition was performed by Lea Miebach and Sander Bekeschus. The study investigated the cellular effects of therapeutic oxidation using medical gas plasma technology in the context of regenerative medicine.

The image showcases primary bone mesenchymal stromal and stem cells isolated from an arthroplasty patient cohort. After treatment, the cells were fixed with 4% paraformaldehyde and stained with MitoSpy Green and Flash Phalloidin Red; nuclei were counterstained with DAPI.

High-content imaging was performed using a standardized, automated acquisition setup. Images were acquired in the appropriate fluorescence channels using a 20x air objective (NA = 0.8) in spinning-disk confocal mode.

Algorithm-based, unsupervised image analysis was used to quantify nuclei, mitochondria, and actin filament morphology. Pixel intensities were first corrected using flatfield correction; sliding-parabola functions were then applied to enhance signal-to-noise ratios. Cell regions were detected by thresholding a pseudofluorescence channel (combined from all fluorescence channels) relative to the background of each individual image. Mitochondria were detected as fluorescent spots inside the segmented cell regions using distinct discrimination values (e.g., size, relative intensity, and contrast of the spots against the background and against each other for splitting). For texture analysis, the SER texture method was used, and speckle and ridge filters were applied with kernel normalization to 1-pixel units, followed by quantitative analysis of object features. The texture filter settings were identical across all analyzed fields of view, as the curvature of the sliding parabola was kept constant. Experimental metadata and analysis details are stored in an interconnected, dedicated database to ensure a reproducible image analysis workflow.

You can appreciate the characteristic morphology and the complex actin cytoskeleton crucial for stem cell function. Can you spot the heart formed by the prominent actin protrusions of interconnected cells?
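The core of such a workflow, flatfield correction followed by background-relative thresholding and spot detection, can be sketched on synthetic data. All parameters and the image content below are illustrative stand-ins, not the study's actual settings:

```python
import numpy as np
from scipy import ndimage

def flatfield_correct(img, sigma=20):
    """Divide by a heavily smoothed copy of the image to remove uneven illumination."""
    background = ndimage.gaussian_filter(img, sigma)
    return img / np.maximum(background, 1e-6)

def detect_spots(img, rel_threshold=1.5, min_size=4):
    """Label connected regions brighter than rel_threshold x the image median."""
    mask = img > rel_threshold * np.median(img)
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_size]
    return labels, keep

# Synthetic test image: sloped background plus three bright 5x5 "spots"
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:128, 0:128]
img = 100 + 0.5 * xx + rng.normal(0, 1, (128, 128))
for cy, cx in [(30, 30), (64, 90), (100, 50)]:
    img[cy - 2:cy + 3, cx - 2:cx + 3] += 400

corrected = flatfield_correct(img)
labels, spots = detect_spots(corrected)
print(len(spots))  # -> 3 detected spots
```

The commercial analysis software described in the text uses sliding-parabola background fitting and texture filters instead of a plain Gaussian background model, but the background-relative thresholding logic is analogous.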

NFDI4BIOIMAGE provides infrastructure for storing and managing bioimage data using Open Microscopy Environment Remote Objects (OMERO), which directly supports the reproducibility, sharing, and accessibility of experiments like the one described. The experimental metadata, analysis details, and the image analysis workflow align with the principles of NFDI4BIOIMAGE, ensuring that data and findings can be properly archived, validated, and reused by the broader scientific community.

Stay tuned for more updates.

Dr. Mohsen Ahmadi, Data Steward at the NFDI4BIOIMAGE consortium and Leibniz Institute for Plasma Science and Technology – INP Greifswald

 

February 2025:

The image of the month for February (PID: http://id.bildindex.de/thing/0002180951) depicts a wall painting from the partially destroyed Savior Transfiguration Cathedral in Odessa, Ukraine. It is situated in the southern side choir of the cathedral.

The mural is part of the Stations of the Cross (Via Crucis), a series of images depicting Jesus Christ on the day of his crucifixion, which serves to commemorate his suffering, especially during the Lenten season. Here, we see Christ collapsing for the third time under the weight of the cross he is carrying.

The iconography notation mentioned in the metadata refers to the classification system "Iconclass", which captures and indexes image contents.

The painting and the entire cathedral were damaged by a Russian missile on July 23, 2023, although the cathedral had been declared a UNESCO World Heritage Site in January 2023 as part of Odessa’s Old Town.

The cathedral was built as the main Orthodox church of Odessa in 1794, demolished by the Soviets in 1936, and rebuilt only recently, in 1999-2005, after Ukraine’s independence; its damage is therefore a painful loss.

The photograph was taken two months after the attack, in September 2023, as part of the project “Documenting Ukrainian Cultural Heritage” run by Bildarchiv Foto Marburg and TIB Hanover in cooperation with Blue Shield Germany and NGOs in Ukraine. It was funded by the 'Beauftragte der Bundesregierung für Kultur und Medien' (Federal Government Commissioner for Culture and the Media), the Ukraine Art Aid Center and the 'Deutsch-Ukrainische Gesellschaft für Wirtschaft und Wissenschaft e.V.', and co-financed by the Irene and Sigurd Greven Stiftung, the UKRAINE-Förderlinie, the Ernst von Siemens Kunststiftung and the HERMANN REEMTSMA STIFTUNG. In 2022 and 2023, 22 photographers documented about 400 buildings threatened or damaged by the war. The author of this image, Oleg Kutskyi (born in 1947 in Odessa), is a professional documentary photographer. Since 1991 he has been a member of the National Union of Photographic Artists of Ukraine.

A selected range of the approximately 4000 images is available here. For security reasons, some of the images obtained in the project will remain hidden from the general public until the war is over.

NFDI4Culture, the Consortium for Research Data on Material and Immaterial Cultural Heritage, deals with a variety of data from architecture, art history, musicology, theatre, dance, and film and media studies. This includes not only images, but also audio, video, augmented and virtual reality, 3D models, notated music and more. The consortium sees preserving endangered data as part of its responsibility: especially when the original cultural asset is lost, digital representations can become immaterial cultural heritage in their own right. NFDI4Culture therefore engages in diverse projects on cultural heritage preservation, as well as on fostering resilience against the numerous threats that can endanger cultural heritage, including natural disasters, fires, accidents, wars and robbery. NFDI4Culture is funded by the Deutsche Forschungsgemeinschaft (DFG) – 441958017.

Dr. Martha Stellmacher (NFDI4Culture/Saxon State and University Library Dresden) and Dr. Gabi Pahnke (NFDI4Culture/Deutsches Dokumentationszentrum für Kunstgeschichte – Bildarchiv Foto Marburg) with support of Kamila Bojarska (Deutsches Dokumentationszentrum für Kunstgeschichte – Bildarchiv Foto Marburg)

March 2025: 

This month’s featured image, provided by Michael Schwarz from the Max Planck Institute for Evolutionary Biology in Ploen, offers a fascinating view into the microscopic world of biofilms. Captured at 125x magnification, the image reveals the intricate details of a biofilm created by Pseudomonas fluorescens on the surface of a liquid culture medium. This snapshot was taken with a Zeiss Axio Zoom V16 microscope, highlighting the complexity of microbial life and its interactions.

The biofilm depicted was the result of an inoculation involving two distinct bacterial strains: a cellulose-overexpressing, surface-colonizing wild type tagged with mScarlet, and a GFP-tagged mutant that lacks the ability to colonize surfaces. As the culture matures, initial micro-colonies form and eventually fuse into a confluent biofilm, as shown in the image. Mr. Schwarz’s research is dedicated to understanding how these colonies interact: whether they mix, remain distinct, or one strain eventually overgrows the other.

Interestingly, biofilms like these can collapse under their own mass, prompting Pseudomonas fluorescens to evolve new survival strategies. This dynamic process contributes to the emergence of a unique life cycle.

The original data from this study is managed using the OMERO image data management system. While the OMERO instance of the Fraunhofer Institute is not publicly accessible, efforts were made to adhere to the FAIR principles by transferring the data to the publicly accessible OMERO instance at the University of Münster. This was achieved with the Python-based CLI tool omero-cli-transfer, which since 2023 has addressed the long-standing challenge of transferring data between different OMERO instances. This development underscores the commitment of the bioimaging community, together with OME, to evolve OMERO in response to researchers’ needs. Additionally, the image has been made available on Zenodo for wider accessibility.

By showcasing projects like this one, we illustrate the potential and importance of proper image data management. Such efforts serve as examples of how data can be systematically shared, validated, and reused, demonstrating the critical role these practices play in enhancing our understanding of, for example, microbial ecosystems and their intricate behaviors.

Stay tuned for more interesting stories and showcases in the months to come.

 

April 2025: Imaging in Geospatial Context

This month we present a screenshot from the tool BIIGLE, which stands for Benthic Image Indexing and Graphical Labelling Environment. BIIGLE is a web-based tool designed for the efficient annotation of still images and videos [1, 2]. Originally developed for marine environmental research and monitoring, it can be applied to various types of image and video annotation. The tool is freely available and can be installed in cloud environments, on a local network, or on mobile platforms during research expeditions. A public instance can be found at biigle.de. The tool is supported and used within the NFDI consortia NFDI4Biodiversity and NFDI4Earth.

Main features of BIIGLE are:

  • Efficient annotation of large image and video collections. The annotation tools are optimized for a wide range of tasks and have been tested by scientists worldwide.
  • Rapid review and exploration of thousands of annotations for quality control and training. The label-review grid view is designed to maximize human visual perception capabilities.
  • Accelerated image annotation using computer vision. Tools like Machine Learning Assisted Image Annotation method (MAIA) and Magic SAM assist in handling the increasing volume of unannotated images in environmental monitoring and research.
  • Users can create custom hierarchical classification schemes or import species labels from the World Register of Marine Species. Labels can be edited, versioned, and shared with collaborators.
  • Exploration of large gigapixel mosaics, ranging from microscopy images to benthic maps, directly in a web browser. The annotation tool works seamlessly and efficiently, even with extremely large images. 

The annotated image in the calendar shows the coastline of Fernandina Island, Galapagos, which is the habitat of the Galapagos Marine Iguana (Amblyrhynchus cristatus). The image is a large mosaic that was stitched together from many individual images captured by a drone. The green annotations marking the iguanas were machine-generated as part of a feasibility study for the automatic analysis of the data in the project Iguanas from above [3, 4].

[1] Langenkämper, D., Zurowietz, M., Schoening, T., & Nattkemper, T. W. (2017). BIIGLE 2.0-browsing and annotating large marine image collections. Frontiers in Marine Science, 4, 83. https://doi.org/10.3389/fmars.2017.00083

[2] Zurowietz, M., & Nattkemper, T. W. (2021). Current trends and future directions of large scale image and video annotation: Observations from four years of BIIGLE 2.0. Frontiers in Marine Science, 8, 760036. https://doi.org/10.3389/fmars.2021.760036

[3] Varela-Jaramillo, A., Rivas-Torres, G., Guayasamin, J. M., Steinfartz, S., & MacLeod, A. (2023). A pilot study to estimate the population size of endangered Galápagos marine iguanas using drones. Frontiers in Zoology, 20(1), 4. https://doi.org/10.1186/s12983-022-00478-5

[4] https://iguanasfromabove.com

May 2025: FAIR Plant Data

This month’s microscopy image captures the interaction between the barley cv. Golden Promise and the barley powdery mildew fungus Blumeria graminis f.sp. hordei, observed 48 hours post-inoculation. The fungus was stained with Coomassie dye, enhancing its visibility against the barley leaves. The leaves were prepared and fixed onto slides, followed by scanning with a Zeiss Axio Scan Z.1 microscope scanner using a 5x objective lens.

The upper section of the image displays the hyphal colonies, which were automatically segmented, highlighting the fungal structures (black) against the plant tissue (white). The lower section presents a machine learning-based analysis where a Convolutional Neural Network (CNN) was employed to predict the fungal structures. Here, the red bounding boxes show the outer boundaries of detected objects, while the green contours precisely trace the segmented hyphae, illustrating the effectiveness of the segmentation and prediction processes.
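Extracting bounding boxes of detected objects from a binary segmentation mask, like the red boxes described above, can be done with connected-component labelling. The mask below is a toy stand-in for real segmentation output, not part of the BluVision pipeline:

```python
import numpy as np
from scipy import ndimage

def object_boxes(mask):
    """Return bounding boxes (row/col slices) of connected objects in a binary mask."""
    labels, n = ndimage.label(mask)
    return ndimage.find_objects(labels)

# Toy mask with two "hyphal" objects
mask = np.zeros((10, 10), dtype=bool)
mask[1:4, 1:5] = True
mask[6:9, 6:8] = True

boxes = object_boxes(mask)
for box in boxes:
    print((box[0].start, box[0].stop, box[1].start, box[1].stop))
```

In a real workflow the mask would come from the CNN's predicted segmentation, and the contours would be traced on the labelled regions themselves.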

The automated image analysis was done using the open source BluVision Micro software.

The metadata of this study was collected using the Minimum Information About a Plant Phenotyping Experiment (MIAPPE) standard. MIAPPE is a community-driven standard that specifies the essential metadata required to describe a plant phenotyping experiment, so that the data are well documented and structured, and to ensure that plant phenotyping data follow the FAIR principles — meaning the data should be Findable, Accessible, Interoperable, and Reusable by both humans and machines.

MIAPPE organizes information into several major sections:

  • General metadata: Information about the project, the people involved, and administrative details (project title, contributors, licensing, etc.).
  • Study: Describes a specific study within the broader project, including its purpose, timeline, and related publications.
  • Biological material: Detailed information about the plant material used, such as species, variety, accession number, provenance, and any genetic modifications.
  • Experimental design: Covers how the experiment was organized, including replication, randomization, plot layout, treatments, and control groups.
  • Environment: Describes the growth conditions and environmental factors that could affect the phenotype — including soil type, irrigation, light intensity, temperature, humidity, and nutrient conditions.
  • Observations: Details about the traits measured, the measurement protocols, the timing of observations, and the instrumentation used (for example, imaging systems or manual scoring).
  • Data files and access: Links to the actual datasets produced, and descriptions of their formats and accessibility.

MIAPPE is not intended to capture every possible detail but focuses on the minimum necessary for a meaningful interpretation and reuse of the data. It is designed to be adaptable and can be expanded for specific domains, like high-throughput imaging, field experiments, or specific species. MIAPPE does not prescribe how measurements are made (e.g., manual vs. automated) but insists that the methods be clearly documented. MIAPPE metadata can be represented using ISA-Tab.
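As a rough illustration, a MIAPPE-style study description in the tab-separated ISA-Tab investigation-file layout might begin like the fragment below. All values are hypothetical and the field list is abbreviated; consult the MIAPPE ISA-Tab mapping for the authoritative fields:

```
STUDY
Study Identifier	barley-mildew-2025-01
Study Title	Barley cv. Golden Promise x Blumeria graminis, 48 h post-inoculation
STUDY CONTACTS
Study Person Last Name	Doe
Study Person Affiliation	Example Institute
```

Each tab-separated key/value row corresponds to one MIAPPE attribute; study-level tables (samples, assays) then reference these entries.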

MIAPPE is an important tool because phenotyping experiments can be complex and heavily environment-dependent. MIAPPE ensures that critical information is captured so experiments can be reproduced or compared accurately. Phenotyping data are also increasingly large and diverse, so standardizing metadata allows easier sharing and aggregation across institutions and projects. As plant sciences integrate genomics, phenomics, and environmental data, standardized metadata is crucial for linking datasets from different disciplines. And, as is true for all data, without detailed metadata old datasets become meaningless over time. MIAPPE ensures that data remain useful years after an experiment is completed.

Check out the MIAPPE GitHub repository for more information.

 

June 2025: As below, so above

This month’s calendar image, contributed by Kevin Warstat from the working group Shoot Dynamics of the Institute of Bio- and Geosciences (IBG) at Forschungszentrum Jülich, offers an interesting aerial perspective on agricultural research at the cutting edge of data-driven science. Unlike the microscopic dimensions we are used to, this illustration brings us high above the fields—a testament to the power and versatility of imaging technologies in life sciences.

Captured from a UAV (unmanned aerial vehicle) equipped with two advanced cameras—a Sony Alpha 7 Mark IV (for RGB color) and the MicaSense RedEdge-MX Dual multispectral sensor—this image collage showcases the diverse capabilities of aerial imaging. The MicaSense camera collects up to 10 narrowband spectral channels, with each individual image later registered and combined in a meticulous post-processing workflow. The processed images are then stitched into a single, geo-referenced orthomosaic, a comprehensive map-like image of the landscape that reveals information invisible to the naked eye.

The illustration presents a side-by-side comparison: on the left, a true-color RGB orthomosaic is displayed, split into its three channels. On the right, a corresponding NDVI (Normalized Difference Vegetation Index) orthomosaic of the same agricultural field is featured, with two additional images above it illustrating the crucial red and near-infrared bands that contribute to NDVI calculations.

The adoption of UAV-based multispectral imaging and NDVI analysis allows researchers to uncover vital insights about crop health, stress, and development—key information for both sustainable agriculture and environmental stewardship. NDVI in particular, a method highlighted in a series of recent publications by the working group (see here, here, and here), enables precise inference about the physiological condition of plants by harnessing the differences in light reflection between the visible red and near-infrared bands.
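NDVI itself is a simple per-pixel computation on the red and near-infrared reflectance bands; a minimal sketch follows (the reflectance values are made up for illustration):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index, computed pixel-wise."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance values: vegetation reflects strongly in the near-infrared
nir = np.array([[0.60, 0.55], [0.10, 0.50]])
red = np.array([[0.08, 0.10], [0.09, 0.40]])
print(np.round(ndvi(nir, red), 2))
```

Values near 1 indicate dense, healthy vegetation, values near 0 bare soil or stressed plants; in practice the computation runs over the registered band rasters of the whole orthomosaic.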

This work is part of PhenoRob, the University of Bonn’s flagship Cluster of Excellence for digital agriculture and a pilot project for research data management (RDM). PhenoRob is actively involved with FAIRagro, a fellow consortium under the National Research Data Infrastructure (NFDI) umbrella committed to developing innovative, collaborative RDM solutions for agrosystem research. They also provide their own public repository for their research data. The image is part of this publication and stems from the flight campaign “Flugtag 11.07.2023” (flight day, July 11, 2023).

By integrating advanced sensor technology, sophisticated post-processing, and a commitment to open, reusable data practices, this month’s image captures not just a field, but the spirit of innovation transforming agricultural research today. The stunning orthomosaics are both a practical research tool and a visual testament to how remote sensing and data management are reshaping our ability to observe, analyze, and ultimately nurture the landscapes on which we depend.

Stay tuned as we continue to showcase new perspectives and technological achievements from across the life sciences in the months ahead.

July 2025: A Pearl Necklace of Autofluorescent Metabolic Foci in Bacteria

This month’s image showcases the exploration of a mysterious pattern of natural autofluorescence in the multicellular bacterium Streptomyces coelicolor. These bacteria grow as vegetative hyphae forming a complex mycelial network.

The image presents the z projection of an image stack of bacteria captured using spinning disk confocal microscopy. This optical sectioning allowed Pilar Lörzing, PhD student in the research group Molecular Biophysics of Michael Schlierf at TU Dresden, to sufficiently resolve the newly discovered, patterned three-dimensional arrangements of hyphae. The depth information is color-coded in the image.
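A depth-coded z projection of this kind can be sketched as a maximum-intensity projection that also records, per pixel, the slice index of the brightest voxel, which is then mapped to a color scale. The stack below is a toy example, not the actual data:

```python
import numpy as np

def depth_coded_projection(stack):
    """Maximum-intensity projection plus, per pixel, the z index of the
    brightest voxel (the value that is then mapped to a color scale)."""
    mip = stack.max(axis=0)
    depth = stack.argmax(axis=0)
    return mip, depth

# Toy 3-slice stack with one bright structure per slice
stack = np.zeros((3, 4, 4))
stack[0, 1, 1] = 5.0
stack[1, 2, 2] = 7.0
stack[2, 3, 3] = 9.0

mip, depth = depth_coded_projection(stack)
print(mip[1, 1], depth[1, 1])  # brightest voxel at (1, 1) sits in slice 0
```

Tools such as Fiji/ImageJ produce this kind of projection directly; the sketch only shows the underlying operation.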

As part of a collaborative project led by Denis Iliasov in the group of Thorsten Mascher from the Department of General Microbiology at TU Dresden, Pilar investigated the intrinsic autofluorescence of filamentous Streptomyces species. She visualized autofluorescent foci representing previously uncharacterized spatial features of Streptomyces hyphae, located at hyphal tips, at branching points and along the filaments.

Through multimodal investigations linking microscopy with biochemical and genetic approaches, the research teams identified the flavin-binding metabolic enzyme dihydrolipoyl dehydrogenase (LpdA) as the source of the autofluorescent foci, as described in this preprint. Unlike typical bacterial metabolic enzymes, which are delocalized in the cytoplasm, LpdA accumulates in discrete intracellular foci. Computational modeling of their spatial organization suggests that the foci serve as metabolic centers in bacterial hyphae.

 

August 2025: Hand-in-hand with bio-imaging

August shows us the quantification of tissue reperfusion using real-time spectral imaging and deep learning. The image was captured by Haowen Jiang and Claire Chalopin from the University of Applied Sciences and Arts, Göttingen.

Reperfusion injury is the tissue damage caused when blood supply returns to tissue after a period of ischemia or lack of oxygen. This image illustrates tissue oxygen saturation in the hand, calculated using various computer-assisted methods based on hyper-spectral and multi-spectral imaging. The hyperspectral imaging system had a spectral range of 500–1000 nm, with 100 bands at a spectral resolution of 5 nm. In contrast, the multispectral imaging system had 16 bands in the 460–600 nm range and 15 bands in the 600–850 nm range. Both imaging modalities were operated in reflectance mode.
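One classical, non-AI way to derive oxygen saturation from such spectral measurements is linear unmixing of oxy- and deoxyhemoglobin contributions. The sketch below uses made-up extinction coefficients and only two wavelengths purely for illustration; it does not reproduce the study's actual methods:

```python
import numpy as np

# Hypothetical extinction coefficients (rows: wavelengths, cols: [HbO2, Hb]);
# real calibrations use published hemoglobin absorption spectra, not these numbers
E = np.array([[1.0, 3.0],
              [2.5, 0.8]])

def oxygen_saturation(absorbances):
    """Solve the 2x2 linear unmixing system for [HbO2, Hb] and return StO2."""
    hbo2, hb = np.linalg.solve(E, absorbances)
    return hbo2 / (hbo2 + hb)

# Absorbances synthesized from 70% saturated blood under these coefficients
measured = E @ np.array([0.7, 0.3])
print(round(oxygen_saturation(measured), 2))  # recovers 0.7
```

With 16 or 100 bands, the same idea becomes an overdetermined least-squares fit across all wavelengths, which is where the richer hyperspectral data pays off.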

The purpose is to compare perfusion parameters derived from multi-spectral cameras – which provide relatively limited spectral information but enable real-time imaging (panels 3 and 4) – with those obtained from a hyper-spectral medical system, which captures rich spectral data but does not support real-time imaging (panel 2). The image demonstrates that deep learning methods (4) outperform classical, non-AI-based approaches (3).

These findings lay the groundwork for future real-time, quantitative assessment of tissue perfusion during organ transplantation surgeries.

September 2025: When Two Cells Meet – The Story of an Immune Synapse

At first glance, it could be a landscape from another planet. But this is not outer space. This is the moment two human immune cells meet under a scanning electron microscope, frozen in time at the nanoscale.

Captured by Werner Zuschratter at the Leibniz Institute for Neurobiology (LIN) in Magdeburg, the image reveals the immune synapse, the intricate contact point between a Jurkat T cell (cyan) and a Raji B cell (yellow). Here, cell membranes exchange not only a handshake but molecular information that can determine the fate of an immune response.

The B cell was “primed” with a bacterial signal (staphylococcal enterotoxin E) to entice the T cell into conversation. The cells were then gently placed on poly-L-lysine-coated slides, fixed, dried, and coated in gold to preserve every detail. Using a scanning electron microscope, the researchers could then magnify this exchange 6,500 times, capturing even the finest folds and protrusions in stunning clarity.

But what makes this image truly powerful is not just its beauty, it’s the story preserved behind it. Every preparation step, microscope setting, and analysis parameter is stored as rich, standardized metadata. This ensures that anyone, anywhere, can not only see the image but also understand exactly how it was created and replicate the work.

Stay tuned for more updates.

 

October 2025: Meet Peltigera neckeri: The Tiny Lichen Teaming Up with Cyanobacteria in the Eifel

If you wander through the mossy stones of the Northern Eifel, you might stumble upon some tiny, unassuming wonders – like this little lichen superstar, Peltigera neckeri. At first glance, it might just look like a patch of blue-gray fuzz, but zoom in and you’ll see a fascinating partnership at work. This lichen is a master of teamwork: a fungus (the mycobiont) teams up with cyanobacteria (of the genus Nostoc), creating a symbiotic duo that can fix carbon through photosynthesis and access nitrogen. This makes it a real survivor in nutrient-poor environments.

This lichen specimen was recorded in the Northern Eifel region between Blankenheim and Schmidtheim, Germany, on 28 October 2023 at 12:21 CEST, under cloudy, slightly rainy skies, with a temperature of about 14°C. It was collected from moss substrate. Recording such local, microenvironmental parameters is crucial as every collected specimen experiences subtly different conditions—humidity, light, substrate, immediate surroundings—all of which can influence the organism’s appearance, physiology and associated microbial community.

Identification of this lichen was confirmed through ITS sequence analysis—sequencing of the internal transcribed spacer region in the nuclear ribosomal DNA. The ITS marker is widely used in fungal taxonomy because it evolves fast enough to distinguish closely related species, while still being alignable across a broad range of taxa. By comparing ITS sequences to reference databases we can reliably assign species even when morphological features are ambiguous. In addition, sequencing data can reveal intraspecific variation, cryptic species and phylogeographic patterns.
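In practice, ITS-based identification relies on alignment and search tools such as BLAST against curated reference databases. As a toy illustration of the underlying idea, here is percent identity over an already-aligned, gap-free fragment (the sequences are invented):

```python
def percent_identity(seq_a, seq_b):
    """Naive identity over two aligned, gap-free sequences of equal length."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100 * matches / len(seq_a)

# Invented ITS fragments differing at the last position: 11 of 12 positions match
query = "ATGCTTAGGCTA"
ref   = "ATGCTTAGGCTT"
print(round(percent_identity(query, ref), 1))  # -> 91.7
```

Real workflows additionally handle gaps, alignment quality and database-wide search statistics, which is why dedicated tools are used instead of a raw identity score.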

This work is part of the Use Case LichenMetaImage under NFDI4Microbiota. The goal is to develop a customized metadata template for lichen imaging. As described above, specialized metadata is required to describe this fantastic team in its nutrient-poor environment; the integration of specific imaging, environmental and molecular data is therefore necessary to make datasets more reproducible and FAIR (Findable, Accessible, Interoperable, Reusable).

Capturing, curating and sharing such research data through infrastructures like NFDI4BIOIMAGE ensures long-term accessibility and reuse. Standardized metadata—covering organism, location, environmental conditions and imaging parameters—supports reproducibility and enables cross-disciplinary insights. In the context of biodiversity and symbiosis research, robust research data management is essential to connect observations like this one to broader ecological and molecular datasets.

 

November 2025: When Brain Tumors Host an Immune Party

This month’s microscopy image captures what looks like a microscopic get-together: a tertiary lymphoid structure (TLS) forming inside a human glioblastoma. While most gliomas are known as “immune-cold” tumors, keeping the immune system at arm’s length, a few seem to break the rules and host their own immune meet-up.

This image was captured by Jadranka Macas at the Edinger Institute (Institute of Neurology) of the Goethe University Frankfurt and is part of a recent study published here: https://doi.org/10.1016/j.immuni.2025.09.018. The research team at Goethe University, University Cancer Center and Frankfurt Cancer Institute investigates how these TLSs influence the immune microenvironment in gliomas and why patients with such immune “hotspots” tend to live longer.

The image shows a human FFPE glioblastoma section, stained and scanned using the Lunaphore COMET high-plex seq-IF system. It displays a perivascular cluster of B cells, T cells, plasma cells, macrophages and endothelial cells, each glowing in its assigned color:

  • CD20 (red) – B-cells
  • CD3 (green) – T-cells
  • CD163 (white) – anti-inflammatory macrophages
  • MZB1 (magenta) – plasma and memory B-cells
  • NF (orange) – neurofilaments
  • GAP43 (cyan) – neurons
  • vWF (blue) – endothelial cells

The full dataset includes 43 fluorescence channels (including autofluorescence), captured at 0.23 µm/pixel with 16-bit depth in OME-TIFF format — a real data giant.

A single image like this holds millions of data points and detailed metadata: acquisition settings, fluorophore channels, biomedical features, file formats and more. Without solid research data management (RDM) practices, it would be nearly impossible to keep track of all this information or to make it reusable for future studies.
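A back-of-the-envelope estimate shows the scale involved. The actual scan area is not stated, so the 1 cm x 1 cm field below is an assumption; only the pixel size, channel count and bit depth come from the text:

```python
# Rough, uncompressed size of a hypothetical 1 cm x 1 cm scan at the stated sampling
pixel_size_um = 0.23
channels = 43
bytes_per_pixel = 2                      # 16-bit depth
side_px = int(10_000 / pixel_size_um)    # 1 cm = 10,000 micrometers
total_bytes = side_px ** 2 * channels * bytes_per_pixel
print(f"{side_px} px per side, about {total_bytes / 1e9:.0f} GB uncompressed")
```

Even a modest field of view thus reaches hundreds of gigabytes before compression, which is why formats like OME-TIFF with tiling and pyramids, plus solid RDM, are essential.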

What might look like a colorful piece of microscopy art is, in fact, a carefully managed and richly annotated dataset. This November, let’s take a cue from these glioma TLSs: keep your data organized, your metadata complete and your collaborations lively.

 

December 2025: Craving for nitrogen – Hungry green yeast cells

December 2025 offers us a colorful microscopy visualization of microbial oil droplets in the fungus Ustilago maydis. Many organisms growing under nitrogen-starvation conditions tend to produce lipid droplets due to an accumulation of lipids in their cells. In this experiment, we can admire this process thanks to the imaging of genetically engineered cells packed with oil droplets. The droplets were visualized by staining with BODIPY, a fluorescent compound used in biological imaging. The study was conducted in the framework of the BioSC project ‘NextVegOil’, which aims at a sustainable, tailor-made microbial palm oil substitute made from agricultural residues.

This image was nicely annotated according to REMBI (Recommended Metadata for Biological Images). Thanks to this annotation, researchers get an overview of the instrument (Zeiss Axio Observer Z1), the imaging method (wide-field whole-organism microscopy), the image acquisition parameters and, finally, the biological entity (intact yeast cells and intracellular lipid droplets). The annotation also gives a direct link to the comprehensive description of the biosample linked to the publication.
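A REMBI-style annotation of the kind described could be sketched, for example, as a structured record like the following. Field names paraphrase REMBI categories rather than quoting the specification, and the link is hypothetical:

```yaml
study_component:
  imaging_method: wide-field whole organism microscopy
image_acquisition:
  instrument: Zeiss Axio Observer Z1
  # objective, exposure and channel settings would follow here
biosample:
  organism: Ustilago maydis
  biological_entity: intact yeast cells and intracellular lipid droplets
  description_link: https://example.org/biosample   # hypothetical link
```

The value of such a record is that each element (instrument, method, sample) is machine-readable and can be validated or queried independently.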

Thanks to Kira Müntjes and Kerstin Schipper from Heinrich Heine University in Düsseldorf for sharing this FAIR dataset with us!