From Image to Results - C. elegans Embryo Cell Division Tracking
Chukwuebuka William Okafornta, PhD Student
AG Müller-Reichert, Core Facility Cellular Imaging, Faculty of Medicine Carl Gustav Carus, TU Dresden

Dr. Philipp Seidel, Product Marketing Manager Life Sciences Software
ZEISS Microscopy
Abstract

From Image to Results | AI-Powered Cell Division Tracking in C. elegans

In this series "From Image to Results", explore various case studies explaining how to reach results from your demanding samples and acquired images in an efficient way. For each case study, we highlight different samples, imaging systems, and research questions.

In this seventh episode, we study a C. elegans embryo during the very early stages of embryonic cell division. Learn how to segment cell bodies and nuclei using Deep Learning semantic segmentation on the ZEISS arivis Cloud web platform (previously known as APEER).

Case Study Overview

Sample

C. elegans embryo

Task

Cell division tracking

Results

Precise cell fate mapping in the developing embryo

System

ZEISS Lattice Lightsheet 7

Software

ZEISS arivis Pro, ZEISS arivis Cloud

Introduction

Figure 1: "C. elegans embryonic development" by Bob Goldstein is licensed under CC BY-ND 4.0.

In this use case, we study a C. elegans embryo during the very early stages of embryonic cell division (Figure 1). C. elegans is a particularly useful model organism for developmental biology because its development has been studied extensively. Researchers know the exact timing of its cell divisions and its cell differentiation dynamics through to the adult stage with precisely 959 cells. This allows genetic functions or the effects of drugs to be studied by observing any perturbations of the normal developmental scheme [1].

Microscopic imaging has been established as a key method for studying early embryos with its ability to provide the spatial context of cell divisions and cell migration patterns. However, commonly used microscopy techniques are often too invasive to observe these delicate processes over extended periods of time. In this case study, we show the success of sensitive and gentle long-term imaging of C. elegans embryos with ZEISS Lattice Lightsheet 7.

As C. elegans is a prime model organism for studying asymmetric cell division, it would have been highly desirable to determine single cell lineages. But limitations in image quality precluded any deeper analysis with conventional methods.

In this case study, we present how segmentation of cell bodies and nuclei using new methods enables this analysis. We leveraged the superior performance of Deep Learning semantic segmentation on the ZEISS arivis Cloud platform (previously known as APEER).

Material and Methods

Figure 2: Deskewed and deconvolved data set of a developing C. elegans embryo as a maximum projection movie. Membranes and nuclei are shown in green and spindle poles in purple. Cell divisions can be observed by the splitting of centrosomes and by membrane compartmentalization.

This C. elegans line was created by C. W. Okafornta, Dresden University of Technology, Germany. It is equipped with fluorescent marker proteins for cell membranes (PH::mKate2), nuclei (H2B::mCherry) and centrosomes (gammaTub::GFP). This allows for visualization of cellular structures to analyze cell division cycles and cell compartmentalization during live embryonic development.

Image data for this use case was captured with ZEISS Lattice Lightsheet 7. This system is designed for gentle, volumetric live cell imaging, making it the ideal instrument for observing delicate structures over long periods of time. The C. elegans embryos were imaged using the green channel (centrosomes) and red channel (cell membranes and nuclei). Live acquisition was performed for almost 2 hours with one volume every 30 sec, 476 planes per volume for a total of 120,000 images.

When using lattice light sheet microscopy, the sample is imaged at an angle, due to the geometry inherent to this technology. Consequently, the raw data must be processed (deconvolved and deskewed) in ZEN (Figure 2) before further analyses can be performed.

The processed .czi file from ZEN was directly imported into ZEISS arivis Pro (formerly Vision4D) and converted into the arivis .sis file format, a format specifically designed for fast handling of large data sets. We then reduced the data size by keeping only every second time point, resampling the data set to 50% in the z dimension, and converting it to 8-bit. The final data set has 2 channels, 79 slices, 60 time points, and a size of ~0.9 GB.
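As an aside, this kind of data reduction can also be reproduced outside of arivis Pro. The following is a minimal Python sketch (using numpy and scikit-image) of the same three steps applied to a hypothetical 5D array of shape (T, C, Z, Y, X); it illustrates the operations only and is not the actual arivis Pro implementation.

import numpy as np
from skimage.transform import rescale
from skimage.exposure import rescale_intensity
from skimage.util import img_as_ubyte

def reduce_dataset(stack_tczyx):
    """Keep every second time point, resample z to 50%, convert to 8-bit."""
    reduced = stack_tczyx[::2]                       # every second time point
    out = []
    for volume in reduced:                           # one multi-channel volume per time point
        channels = []
        for channel in volume:
            # downsample only along z (the first axis of the 3D stack)
            small = rescale(channel, (0.5, 1.0, 1.0),
                            anti_aliasing=True, preserve_range=True)
            # normalize to [0, 1] before the 8-bit conversion
            small = rescale_intensity(small, out_range=(0.0, 1.0))
            channels.append(img_as_ubyte(small))
        out.append(np.stack(channels))
    return np.stack(out)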


ZEISS arivis Pro

ZEISS arivis Pro is modular software for working with multi-channel 2D, 3D, and 4D images of almost unlimited size; it is highly scalable and independent of local system resources. Many modern microscope systems such as high-speed confocal, light sheet / SPIM, super-resolution, electron microscopy, or X-ray instruments can produce huge amounts of imaging data. ZEISS arivis Pro handles such datasets without constraints and in a relatively short time.

Software Processing

Figure 3: C. elegans embryo data in the ZEISS arivis Cloud Annotation Tool. Data from 6 time points were uploaded and 76 slices were annotated for detectable cells (dark yellow) and nuclei (gray).


We proceeded to upload image data from 6 individual time points onto the arivis Cloud platform in order to train Deep Learning models using the arivis AI toolkit. From these, 76 random slices were annotated for cell bodies and nuclei (Figure 3). From the resulting labelled dataset, two distinct Deep Learning segmentation models were trained, one for cell bodies and one for nuclei.

Figure 4: Training of semantic segmentation models for cell bodies (upper panel) and nuclei (lower panel). Notice the good convergence of training and validation data, suggesting that the model was not overfitted over 800 epochs. Training resulted in high prediction accuracy for both models (>99% for nuclei; >96% for cell bodies).


The training convergence shows that both training runs were successful and produced models that accurately predict the desired objects (Figure 4). Finally, we downloaded the models to use them for object inference on the complete data set directly in ZEISS arivis Pro.

Image Analysis Pipeline

For image analysis in ZEISS arivis Pro, we aimed to segment cell bodies, nuclei, and centrosomes.

Centrosomes were segmented conventionally with watershed segmentation, requiring some pre-processing (background subtraction and denoising) to clean up the raw image. For nuclei and cell bodies, we embedded Deep Learning models imported from ZEISS arivis Cloud directly into the arivis Pro analysis pipeline with the newly introduced “Deep Learning Segmenter” function.
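A comparable conventional spot segmentation can be sketched with scikit-image as shown below: top-hat background subtraction, Gaussian denoising, thresholding, and a seeded watershed mirror the type of steps described above. The function and all parameter values are illustrative assumptions, not the settings used in the arivis Pro pipeline.

import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gaussian, threshold_otsu
from skimage.morphology import white_tophat, ball
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_spots(volume):
    # background subtraction (top-hat) followed by Gaussian denoising
    clean = gaussian(white_tophat(volume, ball(5)), sigma=1)
    mask = clean > threshold_otsu(clean)
    # seed the watershed from local maxima of the distance transform
    distance = ndi.distance_transform_edt(mask)
    peaks = peak_local_max(distance, labels=mask.astype(int), min_distance=3)
    markers = np.zeros(mask.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=mask)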

After segmentation, all resulting object classes were further optimized and filtered using morphological operations and filter functions. For example, the raw cell body segmentations still contained many merged cells, which could be separated by applying morphological opening, splitting, and closing filters.
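A stripped-down sketch of this kind of post-processing, again with scikit-image and purely illustrative structuring-element sizes, could look like this:

from skimage.morphology import binary_opening, binary_closing, ball
from skimage.measure import label

def split_merged_cells(cell_mask):
    # opening with a small ball removes thin bridges between touching cells
    opened = binary_opening(cell_mask, ball(3))
    # closing with an even smaller ball fills pinholes left by the opening
    closed = binary_closing(opened, ball(2))
    # connected-component labelling then assigns one id per separated cell
    return label(closed)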

Next, object tracking was performed based on cell bodies. Parameters were adjusted to also detect (binary) cell divisions. This allowed for tracking complete cell lineages from a few initial cells. Furthermore, compartments were created to study cell bodies along with the respective nuclei and centrosomes contained in them.
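Conceptually, division-aware tracking can be thought of as linking each segmented cell to the objects it overlaps in the next frame, with a one-to-two link marking a division. The toy function below illustrates this idea on two consecutive label images; it is not the tracking algorithm implemented in ZEISS arivis Pro.

import numpy as np

def link_frames(labels_t, labels_t_next):
    """Map every cell id at time t to the ids it overlaps at time t+1."""
    links = {}
    for mother in np.unique(labels_t):
        if mother == 0:                              # 0 is background
            continue
        overlap = labels_t_next[labels_t == mother]
        daughters = [int(d) for d in np.unique(overlap) if d != 0]
        links[int(mother)] = daughters               # two daughters indicate a division
    return links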

Finally, segmented cell bodies, nuclei and centrosomes were grouped per time point to extract global average measurements of numbers, volumes and intensities.
The sketch summarizes the image analysis procedure. The arivis Pro pipeline to perform these operations will be available in detail following the conclusion of this case study.
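The final grouping step corresponds to a simple per-time-point aggregation of the exported object measurements. A minimal pandas sketch with hypothetical column names is shown below.

import pandas as pd

# toy table of per-object measurements (column names are assumptions)
objects = pd.DataFrame({
    "time_point":     [0, 0, 0, 1, 1, 1, 1],
    "object_class":   ["cell", "cell", "nucleus", "cell", "cell", "cell", "nucleus"],
    "volume":         [120.0, 98.0, 35.0, 110.0, 95.0, 102.0, 30.0],
    "mean_intensity": [40.1, 38.7, 55.2, 41.0, 39.5, 42.3, 57.8],
})

summary = (objects
           .groupby(["time_point", "object_class"])
           .agg(count=("volume", "size"),
                mean_volume=("volume", "mean"),
                mean_intensity=("mean_intensity", "mean")))
print(summary)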

Execution in ZEISS arivis Pro and arivis Cloud

In this video tutorial, see how the workflow is performed in ZEISS arivis Pro and arivis Cloud.

Validation

Figure 5: The centrosome segment overlaid with the raw image. Membranes and nuclei are shown in green and spindle poles in purple. Centrosome segments are color-coded by time point ranging from blue (1 second) to red (115 seconds).

To validate the quality of the centrosome segmentation, we overlaid segmented centrosomes with the raw image (Figure 5). In this video clip, objects are color-coded based on their time of appearance. Centrosomes vary in size and intensity throughout the cell cycle. All centrosomes, including the small interphase centrosomes, were successfully segmented.

Figure 6: Cell body segments validation. Video clip shows the sequence of slices over time and z direction, both with precise and robust segmentation results.

Figure 6 shows a video clip of how the Deep Learning segmentation and post-processing generates robust cell body segmentations. The clip displays slices of the data set both in time and in z direction. The segmentation results were sufficiently precise to use them for reliable cell tracking.

Figure 7: Validation of nucleus segments. The raw image data is shown in green and nucleus segments in red. The nucleus segmentation worked correctly in general, but not as precisely during cell divisions.

Figure 7 shows the results from Deep Learning segmentation of nuclei. Nuclei segmentations were robust, although not as precise as cell segmentations. Particularly during mitosis, the Deep Learning model struggled to identify all nuclei with sufficient precision.

Figure 8: Cell tracking validation. One initial cell and its progeny are shown together with the respective cell lineage track.

Figure 8 illustrates the quality of cell tracking based on the cell lineage track of one initial embryonic cell. The clip shows that both cell movements and cell divisions are correctly detected by the tracking algorithm.

Results

Cell Growth & Cell Divisions

One fundamental read-out of such an imaging experiment is the quantification of cell divisions in the embryo with precise kinetics. With robust segmentations available for cell bodies, nuclei, and centrosomes, we have three distinct object groups with which to accomplish this (Figure 9). Examining the graphs of cell numbers and nuclei numbers, we find distinct plateaus in the plots. This means that there are distinct waves of collective cell divisions in the early embryo at ~3, 20, 30, and 47 minutes.

Counting centrosomes can also be used to indicate cell divisions within the embryo. However, because there are two centrosomes within each cell, centrosome numbers are roughly double the cell and nuclei numbers. Because centrosomes divide during interphase, and not during cell division, there are no discernible plateaus as observed for cells and nuclei.


Figure 9A: Quantification of cell growth over time by cell numbers. Time points of collective divisions are indicated as dashed vertical lines.


Figure 9B: Quantification of nuclei numbers. Time points of collective divisions are indicated as dashed vertical lines.


Figure 9C: Quantification of centrosome numbers. Time points of collective divisions are indicated as dashed vertical lines.


Figure 10: Illustration of cell lineages. This scheme shows the cell lineage tracking for the 6 initial embryo cells. Every circle indicates a cell identified at one point in time and forks indicate a cell division. The red, dashed boxes indicate collective waves of cell division as identified in Figure 9.

Cell Lineage Tracking

Measurements as shown in Figure 9 are appropriate for global observations. However, they are not sufficient for detailed analyses, such as the study of deterministic cell fates within the developing embryo. Such resolution can only be obtained from cell lineage tracking.

Figure 10 shows the schematic result of cell tracking in this data set. For the initial 6 cells detectable at the beginning of the recording, the algorithm succeeded in generating complete cell lineage tracks. Red, dashed boxes indicate the waves of collective cell divisions. In Figure 10, cell divisions are indicated as forks in the cell hierarchies, and the scheme convincingly shows that most cell divisions occur in these waves.

The detailed cell lineage analysis now allows for observations and quantifications at the single-cell level or within a particular branch of cell progenies. One outcome is the qualitative illustration of cell fates in the 3D context of the embryo (Figures 11 and 12). Notice how cells descending from a particular initial cell occupy coherent patches of the embryo at later stages, a phenomenon known as mosaic development [2].

Figure 11: Illustration of mosaic development in the early embryo. Still images at t = 0, 10, 20, 30, 40, and 50 min.
Figure 12: Illustration of mosaic development in the early embryo. Same as Figure 11 but as a continuous video clip.


Figure 13: Illustration of centrosome dynamics during cell divisions. The video clip shows the cell division dynamics of a single cell lineage track (green) and centrosome (red) during this process. The lower panel shows the schematic hierarchy of the cell lineage track.

Centrosome Dynamics in Single Cell Lineage Tracks

To illustrate how measurements on single cell lineage tracks can deliver more robust quantitative results, we now reevaluate the relationship between centrosome morphology and cell division. Figure 13 illustrates qualitatively how the centrosome signal increases before cell division and drops afterwards.


Figure 14: Global centrosome intensity dynamics. Average centrosome intensities for each time frame were plotted. Phases of collective cell divisions are indicated as dashed vertical lines. Notice that centrosome peak intensities can’t be detected if cell division events aren’t sufficiently synchronized (red box).

Figure 14 shows the global measurement of average centrosome intensities. Because there are distinct collective waves of cell division in the embryo, we can detect peak centrosome intensities that coincide with these waves. But what happens if there are no collective waves of cell division? Then, any global measurement of centrosome properties would become diluted and potentially undetectable (compare red box in Figure 14).


Figure 15A: Centrosome intensity relative to cell division. Centrosome intensities were measured in 6 randomly selected cell lineage tracks and plotted relative to the respective cell division event. Illustration of a subset of the chosen tracks.

This analysis can be optimized by employing single cell lineage tracks. We therefore defined a fixed observation window (+/- 10 frames) around a set of randomly selected cell division events. We then measured centrosome fluorescence intensities in these sets and plotted them relative to the cell division event (Figure 15; cell division at time = 0). In other words, the cell measurements were “synchronized in-silico”.
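The alignment itself reduces to extracting a fixed window from each per-track intensity trace around the division frame and averaging the aligned windows. A minimal numpy sketch is shown below; 'traces' and 'division_frames' are hypothetical inputs holding, per lineage track, the centrosome intensity time series and the frame index of the division event.

import numpy as np

def align_to_division(traces, division_frames, window=10):
    """Average intensity traces after aligning them to their division frames."""
    aligned = []
    for track_id, div_frame in division_frames.items():
        trace = np.asarray(traces[track_id])
        if div_frame - window < 0 or div_frame + window >= len(trace):
            continue                                 # skip events too close to the recording borders
        aligned.append(trace[div_frame - window:div_frame + window + 1])
    aligned = np.array(aligned)
    return aligned.mean(axis=0), aligned.std(axis=0)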


Figure 15B: Centrosome intensity relative to cell division. Centrosome intensities were measured in 6 randomly selected cell lineage tracks and plotted relative to the respective cell division event. The figure shows the resulting plot (single tracks in gray, average and standard deviation in black).

The results show that centrosome intensities strongly increase in the 10 minutes prior to cell division and then steeply drop after it. What we observe here is the formation of spindle poles at a late mitotic stage that actively pull chromosomes apart just before the cell splits into its two daughter cells.

Summary

In this case study, we performed a more in-depth analysis of imaging data of a developing C. elegans embryo previously acquired with ZEISS Lattice Lightsheet 7.

Now, we have employed Deep Learning segmentation for the challenging task of identifying cells and nuclei despite considerable noise in the data set. We used the arivis AI toolkit on the ZEISS arivis Cloud platform to label the data and train the Deep Learning models. We then transferred the trained models to ZEISS arivis Pro and incorporated them into our existing workflow pipelines. This allowed us to obtain better segmentation results and also simplified the workflow considerably, making complicated pre-processing schemes obsolete. Of note, although arivis AI Deep Learning operates strictly on 2D slices, ZEISS arivis Pro automatically generated meaningful 3D objects with the integrated Deep Learning models.

In this analysis we illustrated how changing upstream methods in the workflow (e.g., improving the segmentation quality) unlocks new opportunities for downstream analysis. Now we could employ ZEISS arivis Pro’s robust cell lineage tracking, a very typical task for such types of data. With this, we were able to refine our prior morphological analysis by zooming into the dynamics of single cell lineage tracks. This is exemplified by the quantification of centrosome dynamics during cell division. Additionally, any other quantitative read-out could be improved in this way.

Try It for Yourself

Download a trial version of ZEISS arivis Pro

