interfaces also allow use of the data in
third-party analysis tools like Huygens
Deconvolution software, ImageJ, Fiji,
CellProfiler and KNIME.
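As an illustration of that interoperability, the sketch below exports an acquired stack as an ImageJ-compatible TIFF that Fiji or CellProfiler can open directly. It assumes the Python packages numpy and tifffile; the array shape, file name and pixel size are placeholders rather than values from any particular system.

```python
# Minimal sketch: write acquired image data as an ImageJ-style TIFF so it can
# be opened in ImageJ/Fiji or imported into CellProfiler and KNIME workflows.
# The stack below is a synthetic placeholder, not data from a real instrument.
import numpy as np
import tifffile

# Placeholder 5-D stack in (time, z, channel, y, x) order, 16-bit
stack = np.zeros((3, 10, 2, 512, 512), dtype=np.uint16)

tifffile.imwrite(
    "experiment_001.tif",
    stack,
    imagej=True,                         # embed ImageJ-compatible metadata
    resolution=(1 / 0.325, 1 / 0.325),   # pixels per micron (illustrative)
    metadata={"axes": "TZCYX", "unit": "um"},
)
```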
“The integration of Leica HCS A
(High Content Screening Automation)
into Leica Microsystems’ confocal and
widefield systems supports the screening
of a large number of samples under various conditions for robust statistics,” says
Kappel. Immediate image analysis eliminates the need to manually sift through a large number of specimens, where rare events, such as a single dividing cell among many others, can easily be missed.
“Computer Aided Microscopy (CAM)
allows these events to continuously stream
to external storage devices where they are
analyzed simultaneously during image
acquisition,” says Kappel. In conjunction
with CAM, Leica HCS A can respond to
feedback from the analysis software about
an event detected during acquisition.
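The feedback loop Kappel describes can be pictured with a short, purely hypothetical sketch; the function names below are invented placeholders and do not correspond to Leica's actual CAM or HCS A interfaces.

```python
# Hypothetical sketch of a feedback-driven ("computer-aided microscopy") loop:
# analysis results produced during acquisition steer what is imaged next.
import numpy as np

def acquire_overview(position):
    # Placeholder: a fast, low-resolution overview at one stage position.
    return np.random.rand(256, 256)

def find_rare_events(image, quantile=0.9999):
    # Placeholder analysis: coordinates of bright outliers, standing in for
    # the detection of a rare event such as a dividing cell.
    rows, cols = np.where(image > np.quantile(image, quantile))
    return list(zip(rows.tolist(), cols.tolist()))

def acquire_detail(position, roi):
    # Placeholder: re-image the detected region at higher resolution.
    print(f"Re-imaging {roi} at position {position} in high resolution")

def run_screen(positions):
    for pos in positions:
        overview = acquire_overview(pos)
        # The analysis result feeds straight back into the acquisition queue.
        for roi in find_rare_events(overview):
            acquire_detail(pos, roi)

run_screen(positions=[(0, 0), (0, 1), (1, 0)])
```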
High-content screening is a growing discipline in life science research, as the number of complex experiments is increasing, and statistically relevant data is needed to further scientific discoveries. “Research is automated by means of large-scale screening experiments,” says Kappel. “This allows a higher throughput of experiments and higher data quality.”
Automated analysis during acquisition saves time, since scientists don’t need to manually sift through data sets, and fewer rare events are missed. It also saves on lab equipment costs, as experiments don’t need to be repeated as often.
Further enhancing image analysis
Some techniques based on statistical data analysis have also appeared in recent years. These approaches bring more efficient solutions to traditional image processing issues like classification, pattern matching or tracking. “Superpixel algorithms allow you to simplify images with understandable region partitioning, giving access to well-known fast graph computations,” says Doux. “Some descriptors like SIFT or FREAK bring some help for extracting relevant information independently from the scale and orientation of the features.”
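A rough sketch of these two ideas is shown below, assuming scikit-image and OpenCV (opencv-python) are available; the input is a synthetic image rather than a real micrograph, and all parameter values are illustrative only.

```python
# Superpixel partitioning plus SIFT keypoints on a synthetic grayscale image.
import numpy as np
import cv2
from skimage.segmentation import slic

# Synthetic 8-bit grayscale image standing in for a micrograph
image = (np.random.rand(512, 512) * 255).astype(np.uint8)

# Superpixels: partition the image into ~500 roughly homogeneous regions,
# which can then serve as nodes for fast graph-based computations.
segments = slic(image, n_segments=500, compactness=10, channel_axis=None)
print("superpixel labels:", segments.max() + 1)

# SIFT keypoints and descriptors: features that are largely invariant to the
# scale and orientation of the structures they describe.
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(image, None)
print("SIFT keypoints:", len(keypoints))
```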
However, in terms of technology, some
GPUs offer a large amount of memory
that can be used for processing. And, CPU
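On the GPU point, the following is a minimal sketch of moving one processing step into GPU memory; it assumes the CuPy package and a CUDA-capable card, and the data volume is synthetic.

```python
# Filter a large image stack on the GPU, then bring the result back to the host.
import numpy as np
import cupy as cp
from cupyx.scipy import ndimage as cndi

volume = np.random.rand(64, 1024, 1024).astype(np.float32)   # ~256 MB stack

gpu_volume = cp.asarray(volume)                       # host -> GPU memory
smoothed = cndi.gaussian_filter(gpu_volume, sigma=2)  # filtering runs on the GPU
result = cp.asnumpy(smoothed)                         # GPU -> host for further use
```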
All these enhancements help manage the growth of data and modalities in terms of performance and data correlation. Better statistical data analysis helps improve image segmentation and may lead to better study of complex structures, especially in the life sciences.
“Thus, a better understanding of challenging issues can be achieved, for example, segmenting textured phases in materials science or detecting specific cells inside a complex life science sample,” says Doux.
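A minimal sketch of the kind of cell detection Doux mentions is given below, assuming scikit-image; the image is synthetic and the size filter is an illustrative value, not a recommendation.

```python
# Simple statistical cell detection: Otsu threshold, cleanup, connected components.
import numpy as np
from skimage import filters, measure, morphology

image = np.random.rand(512, 512)               # placeholder fluorescence image

threshold = filters.threshold_otsu(image)      # global statistical threshold
mask = image > threshold
mask = morphology.remove_small_objects(mask, min_size=50)  # drop noise specks

labels = measure.label(mask)                   # connected components = candidate cells
regions = measure.regionprops(labels)
print(f"{len(regions)} candidate cells; mean area = "
      f"{np.mean([r.area for r in regions]):.1f} px")
```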
To more accurately model the true physiological state, therapeutic and disease progression research is shifting toward in vivo models in whole organisms. Preclinical trial and drug safety studies are also incorporating in vivo studies earlier in the process to evaluate toxicity and efficacy.
“Multiphoton systems, such as Nikon’s
A1R MP+ confocal microscope, are
challenged with motion artifacts when
penetrating deep into tissue,” says Sysko.
“When an organism shifts, breathes,
muscles contract, blood flow or slow drift
occurs, real-time image analysis must
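One common building block for handling such motion, offered here as a generic sketch rather than Nikon's own approach, is image registration by phase cross-correlation; the example assumes scikit-image and SciPy and uses a synthetic frame pair.

```python
# Drift correction by sub-pixel registration of a drifted frame to a reference.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

reference = np.random.rand(256, 256)
# Simulate a drifted frame: the same scene displaced by a few pixels
drifted = nd_shift(reference, shift=(3.0, -2.0), mode="nearest")

# Estimate, with sub-pixel precision, the shift that re-registers the frames
offset, error, _ = phase_cross_correlation(reference, drifted, upsample_factor=10)

# Apply that shift to bring the drifted frame back onto the reference
corrected = nd_shift(drifted, shift=offset, mode="nearest")
print("registration shift (rows, cols):", offset)
```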
Along with the goal of imaging normal physiological states, label-free techniques are developing and becoming more prevalent, and reliable brightfield image analysis techniques are in demand to analyze data sets without the benefit of high contrast.
While some solutions have been made
to support the processing of image data,
“It all started with high-speed and
high-resolution imaging systems,” says
Rhodes. “Huge amounts of image data
can nowadays be acquired easily within a single experiment. And that will
lead to even more data after the image analysis.”
These data and images must be put
into the correct context for researchers to
find the answers they are looking for. Maintaining back-traceability of derived data from image analysis inside big data sets is one needed enhancement.
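Such back-traceability can be approximated today by storing a provenance record alongside every derived result. The sketch below, using only the Python standard library and NumPy, is one hypothetical way to do this and is not a feature of any particular vendor's software; all names and parameters are placeholders.

```python
# Record how a derived result was produced so it can be traced back to its source.
import hashlib
import json
from datetime import datetime, timezone
import numpy as np

def provenance_record(image, image_name, algorithm, parameters, results_path):
    # Hash the pixel data so the derived result can always be tied to the exact image
    digest = hashlib.sha256(np.ascontiguousarray(image).tobytes()).hexdigest()
    return {
        "source_image": image_name,
        "source_sha256": digest,
        "algorithm": algorithm,
        "parameters": parameters,
        "results": results_path,
        "created": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical usage with a placeholder image and analysis step
image = np.zeros((512, 512), dtype=np.uint16)
record = provenance_record(image, "experiment_001.tif",
                           "otsu_threshold_cell_count", {"min_size": 50},
                           "experiment_001_counts.csv")
with open("experiment_001_provenance.json", "w") as f:
    json.dump(record, f, indent=2)
```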
There’s also a growing need for personalized image analysis. “The microscopy
software needs to have open interfaces
to third-party and open source elements,
so that every researcher can define their
project and the information in the image
which is relevant to their scientific question,” says Haas. Further enhancements
can be expected for the seamless integration of third-party and open software tools
into microscopy software.
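As one example of such integration, the pyimagej package exposes ImageJ/Fiji inside Python; the sketch below follows its documented initialization pattern, though exact calls can vary between versions and the file name is a placeholder.

```python
# Minimal sketch, assuming the pyimagej package and a Java runtime are installed.
import imagej

ij = imagej.init("sc.fiji:fiji", mode="headless")  # start a headless Fiji
image = ij.io().open("experiment_001.tif")         # open an image through ImageJ
pixels = ij.py.from_java(image)                     # convert to a NumPy-backed array
print(type(pixels), getattr(pixels, "shape", None))
```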
Image analysis in the future
The potential for image analysis is huge,
and its importance will continue to grow.
But so that researchers do not get lost in all the possibilities for analyzing images, it will become crucial to decide which analysis algorithm is the correct one for the scientific question to be answered.
“If there are no standards or guidelines
put into place, tons of ‘data garbage’ will be
produced easily considering the available
computation power: Just test enough different ways to analyze your image data until
you get what you would like to see,” says
Rhodes. If vendors can enhance and expand the capabilities of image analysis while providing guidelines and standards for the scientific community, the future will be bright.
Overall, image analysis is widely used and, being conveniently accessible to researchers, has become indispensable for quantitative, quantifiable and reliable results.
The future possibilities for image analysis lie in setting up predictive models of biophysical processes and development.
— Lindsay Hock, Editor
Nikon A1 confocal images using NIS-Elements for 3-D volume rendering with 3-D object detection and measurement. Image: Nikon Instruments