Ⅰ. Introduction
Light microscopy is primarily used to examine objects that are difficult to observe with the naked eye, such as cells or bacteria. Image acquisition with light microscopes began once the object to be observed could be magnified using a convex lens. The microscope traces back to a corrective lens developed by the Arab scientist A. I. Firnas in the 9th century, and Z. Janssen of the Netherlands was the first to develop a compound microscope that, like current microscopes, combines two lenses (an objective lens and an eyepiece) [1]. Janssen's microscope combined a convex and a concave lens but was not widely used because of severe blurring. The microscope came into use for observation and research around 1660, when the first compound microscopes used light as a light source at 30 times magnification [2]. Many researchers subsequently attempted to improve resolution and magnification, and Abbe and Zeiss developed lenses that corrected spherical and chromatic aberration, forming the basic structure of modern light microscopy [3].
The basic principle of light microscopy is the refraction of light. A light microscope works by placing an objective lens with a short focal length close to the object to form a first magnified real image, which is then magnified again by the eyepiece. Fine adjustment of the focus is essential because even a slight change in the distance between the object and the objective lens prevents a clear image from forming. The total magnification of a light microscope is the product of the magnifications of the objective lens and the eyepiece. For example, using a 10x eyepiece with a 100x objective lens yields a final image at 1000x magnification.
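This product rule can be written down directly; a minimal sketch with illustrative function and variable names (not from the source) is:

```python
def total_magnification(objective: float, eyepiece: float) -> float:
    # Total magnification of a compound microscope is the product of the
    # objective-lens and eyepiece magnifications.
    return objective * eyepiece

print(total_magnification(100, 10))  # 100x objective with a 10x eyepiece -> 1000
```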
Recently, methods for maximizing image quality in light microscopy have been actively studied. However, light microscopy faces two main limitations on resolution: 1. the diffraction limit and 2. uncertainty in single-molecule localization. The diffraction limit states that, regardless of lens performance, two objects closer together than approximately half the wavelength of visible light cannot be distinguished [4,5]. Consequently, although the resolution limit of light microscopy using visible light, approximately 200 nm, suffices to identify the types of organelles inside cells, it is difficult to resolve their detailed structure. In addition, a molecule that absorbs a photon is raised to a higher energy level, entering an unstable excited state; it then emits light energy to return to the stable ground state. Because this cycle of excitation and emission takes only approximately 3 ns, even a high-performance detector cannot resolve individual emission events, and all molecules appear to emit light continuously. This gives rise to single-molecule localization uncertainty, making it impossible, owing to the point spread function, to localize molecules separated by less than the resolution limit [6].
To overcome this limitation, many studies have been conducted, and in 1989, Moerner et al. succeeded in observing a single molecule for the first time in a dense medium [7,8]. However, the noise inevitably generated in light microscopy (including additive white Gaussian noise, Poisson noise, and mixed Poisson-Gaussian noise types) degrades the resolution and deteriorates the image quality [9]. Fig. 1 shows an explanatory diagram of the three noise models in the aforementioned light microscopic image.
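The three noise models can be illustrated with a short NumPy sketch (an illustrative simulation, not taken from the reviewed papers; the noise levels and the 100-photon budget are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
clean = np.full((256, 256), 0.5)  # a flat test image with intensity 0.5

# 1. Additive white Gaussian noise: signal-independent, zero-mean.
gaussian = clean + rng.normal(0.0, 0.05, clean.shape)

# 2. Poisson (shot) noise: signal-dependent; variance grows with intensity.
photons = 100  # assumed photon budget per pixel
poisson = rng.poisson(clean * photons) / photons

# 3. Mixed Poisson-Gaussian noise: shot noise plus detector read-out noise.
mixed = rng.poisson(clean * photons) / photons + rng.normal(0.0, 0.02, clean.shape)
```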
Software-based algorithms are widely used to reduce the noise in light microscopy. Among the noise reduction algorithms, the total variation (TV)-based approach has been introduced to estimate noiseless areas in images from noisy observations. The basic equation for the TV norm metric is as follows [10]:

TV(f) = Σ_{p∈Ω} √[ (φ_horizontal f)(p)² + (φ_perpendicular f)(p)² ]
where φ_horizontal and φ_perpendicular denote derivative operators applied at pixel coordinates p∈Ω in the horizontal and perpendicular directions, respectively. In addition, the non-local means (NLM) approach, which achieves superior performance by fusing similar patches within an image, was developed by Buades et al. [11]. The basic equation for the NLM denoising procedure is as follows [9]:

g(i) = Σ_j w(i, j) f(j),   w(i, j) = (1/Z(i)) exp( −‖f(N_i) − f(N_j)‖₂² / h² ),

with the normalizing constant Z(i) = Σ_j exp( −‖f(N_i) − f(N_j)‖₂² / h² ) so that Σ_j w(i, j) = 1.
where g(i) denotes the denoised value of the pixel of interest i, f denotes the noisy image, and h denotes the control parameter of the weight function based on the L2 norm. Here, the N_i and N_j patch windows are of equal size.
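As a concrete illustration of the two approaches above, the sketch below computes a discrete TV norm with forward differences and applies a naive, unoptimized NLM filter; the patch size, search-window size, and h are illustrative choices rather than values from the cited papers:

```python
import numpy as np

def tv_norm(u):
    # Discrete total-variation norm: the sum over all pixels of the gradient
    # magnitude, using forward differences as horizontal/perpendicular
    # derivative operators (last row/column padded by replication).
    dh = np.diff(u, axis=1, append=u[:, -1:])  # horizontal derivative
    dv = np.diff(u, axis=0, append=u[-1:, :])  # perpendicular derivative
    return np.sum(np.sqrt(dh ** 2 + dv ** 2))

def nlm_denoise(f, patch=3, search=7, h=0.1):
    # Naive non-local means: each output pixel is a weighted average of the
    # pixels in a search window, weighted by the similarity (mean squared
    # L2 distance) of the patches surrounding them.
    pr, sr = patch // 2, search // 2
    pad = sr + pr
    fp = np.pad(f, pad, mode='reflect')
    out = np.zeros_like(f, dtype=float)
    for i in range(f.shape[0]):
        for j in range(f.shape[1]):
            ci, cj = i + pad, j + pad
            ref = fp[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            weights, values = [], []
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ni, nj = ci + di, cj + dj
                    cand = fp[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    d2 = np.mean((ref - cand) ** 2)
                    weights.append(np.exp(-d2 / h ** 2))
                    values.append(fp[ni, nj])
            w = np.array(weights)
            out[i, j] = np.dot(w, values) / w.sum()
    return out
```

A flat image has zero TV norm and passes through NLM unchanged, which is a quick sanity check for both functions.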
Another image denoising method uses the wavelet transform. Because the wavelet transform combines the characteristics of the spatial domain with the performance of the frequency domain, many studies have applied it to signals and images, for example for noise reduction and contrast enhancement in image processing [12-15]. Wavelet-based image noise reduction is widely used to remove additive Gaussian noise by thresholding the wavelet coefficients. According to Nowak et al., the wavelet transform is also effective in removing Rician-distributed noise, which is primarily observed in magnetic resonance imaging [16-18]. In particular, wavelet-based noise removal has proven useful for optical and microscopic images because of its principle of sequentially projecting the image onto dyadic frequency sub-bands. Accordingly, the applicability of wavelet-like transforms to noise removal in light microscopy images is also being studied. AlAsadi demonstrated the usefulness of the contourlet transform for noise reduction in various imaging systems and showed its advantage of providing multiscale analysis and a high degree of directionality in photon imaging [19].
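As a sketch of wavelet-threshold denoising, the following implements a one-level 2-D Haar transform by hand and soft-thresholds only the detail sub-bands; this is a simplified stand-in for the transforms used in the reviewed papers, and the threshold value is an arbitrary illustration:

```python
import numpy as np

def haar2d(x):
    # One-level 2-D Haar transform: split into approximation (LL) and
    # detail (LH, HL, HH) sub-bands. Assumes even image dimensions.
    a = (x[0::2] + x[1::2]) / 2  # row low-pass
    d = (x[0::2] - x[1::2]) / 2  # row high-pass
    ll, lh = (a[:, 0::2] + a[:, 1::2]) / 2, (a[:, 0::2] - a[:, 1::2]) / 2
    hl, hh = (d[:, 0::2] + d[:, 1::2]) / 2, (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    # Exact inverse of haar2d.
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2], x[1::2] = a + d, a - d
    return x

def soft(c, t):
    # Soft threshold: shrink coefficients toward zero by t.
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def wavelet_denoise(x, t=0.1):
    # Threshold only the detail sub-bands: Gaussian noise spreads into many
    # small detail coefficients, while structure stays in a few large ones.
    ll, lh, hl, hh = haar2d(x)
    return ihaar2d(ll, soft(lh, t), soft(hl, t), soft(hh, t))
```

With a zero threshold the transform round-trips exactly, and a constant image is unchanged for any threshold because its detail coefficients are zero.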
Based on these noise reduction methods, various improved filters and algorithms have recently been developed and applied to light microscopy. In particular, image processing methods based on artificial intelligence and deep learning are being actively applied to light microscopy and are beginning to be introduced as approaches for noise reduction. As mentioned above, studies on improving the quality of light microscopy images have taken many different approaches. Thus, this study aimed to systematically analyze the research designs, noise reduction algorithms, and evaluation methods used in previous studies and to propose directions for future research.
Ⅱ. Materials and Methods
1. Design of study
Our systematic review of quantitative and comparative evaluations of noise reduction algorithms in light microscopic images followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines (PRISMA checklist and statement) [20].
2. Data source and search strategy
MEDLINE-EBSCO, EMBASE, and the Cochrane Library, the most commonly used databases, were searched. The search covered publications from January 1985 to May 2020 to identify relevant papers on the quantitative and comparative evaluation of noise reduction algorithms in light microscopic images. In accordance with the guidelines, only English-language publications were included. Table 1 provides an overview of the search strategies used in this study [21-36].
3. Descriptive analysis
All authors screened the titles and abstracts of the retrieved papers for validity and then reviewed, categorized, and verified all of the data.
For the descriptive analysis, the retrieved papers were classified in detail by noise reduction method. Additionally, microscope systems were classified into basic and light-sheet types, and papers were organized by study type (simulation, experiment, or both). Finally, papers were classified by the measurement method or parameter used to evaluate the noise level and image quality and thereby demonstrate the efficacy of the noise reduction algorithm.
Ⅲ. Results and Discussion
A total of 139 research papers were identified using the three search engines (MEDLINE-EBSCO, EMBASE, and the Cochrane Library). After excluding 44 duplicates from the 139 papers, 95 papers were initially included. When searching for papers, we used the abstract, title, and keyword fields and improved accuracy by joining abstract and title with "OR." Subsequently, we excluded conference proceedings, and all authors reviewed the titles and abstracts to eliminate papers unrelated to the topic of this manuscript. During this screening, six conference proceedings and 51 inappropriate papers (not denoising, confocal system, not microscopy, tutorial, optical coherence tomography, artifact reduction, segmentation, uniformity correction, ultrasound, and micro-CT) were excluded. Of the 36 papers that met the eligibility criteria, 20 that did not match the review criteria and trials were excluded, and 16 papers were selected for the systematic review [21-36]. A PRISMA flowchart (including identification, screening, and eligibility) of the study is shown in Fig. 2.
Fig. 3 shows the total number of papers on noise reduction algorithms in light microscopic images by publication year. Polynomial fitting of the graph yielded an approximately Gaussian distribution, with an r-square (coefficient of determination) of 0.110 and a standard deviation of 0.748.
1. Types of study
Studies on general images, including light microscopy images, can be divided into simulations and experiments. Fig. 4 shows the number of publications for each type of study. Among the 16 identified papers, six reported only experimental results [22–24,28,32,36] and three reported only simulation results [21,25,29]. Seven papers reported results from both experiments and simulations [26,27,30,31,33-35].
This research trend demonstrates that few simulation programs or tools can accurately model microscopes. In several of the reviewed studies, the Shepp-Logan phantom or phantoms that can be imported from MATLAB were primarily used as simulation images. However, there is a steady increase in studies that validate simulation results with actual experiments. Thus, developing a simulation program or tool that can accurately model light microscopy systems is expected to enable denoising studies with improved validation accuracy.
2. Imaging techniques
Light microscopic image acquisition methods were organized into basic and light-sheet types. Fig. 5 shows the number of publications for each imaging technique. Among the 16 identified papers, 15 addressed the basic light microscopy system [21–33,35,36], and only one was related to light-sheet microscopy [34]. Most studies have therefore focused on reducing noise in images acquired with basic light microscopy systems. We believe this trend reflects the fact that recently developed equipment with improved functionality already produces images with significantly reduced noise.
3. Noise reduction methods
Various methods have been used to reduce noise in light microscopy images. Fig. 6 shows the number of publications for each noise reduction method. Among the 16 identified papers, transform-based modeling techniques were the most widely used, appearing in seven papers, with the wavelet transform applied most often to light microscopic images, in six papers [21,22,25-27,32,33]. TV- and NLM-based approaches were used in three papers [24,30,35], and improved versions of conventional filtering methods were investigated in two studies [28,36]. In addition, noise reduction via machine learning with dictionary learning, compressed sensing (CS)-based technology, and approaches based on singular value decomposition and the Gaussian Markov random field have been applied to light microscopic images [23,29,31,34].
The wavelet transform is the most widely used software technology for reducing noise in light microscopy images. Its principle suits light microscopy systems because it makes it easy to analyze signal characteristics while controlling frequency and spatial components simultaneously. In general, sub-band coding is computed efficiently using the fast wavelet transform. Wang et al. improved the accuracy of observing gene expression using the stationary wavelet transform, and Luisier et al. analyzed its applicability to the mixed Poisson-Gaussian noise model [21,26]. Following this active use of the wavelet transform for light microscopy images, the more recent contourlet transform has also been employed as a denoising method [33]. The contourlet transform overcomes one of the biggest drawbacks of wavelets: the difficulty of representing continuous edges in a two-dimensional image. Because the wavelet transform uses basis functions of different sizes in its multi-resolution structure, it cannot represent smooth contours, which can be an issue during denoising. Yang et al. used an algorithm based on the contourlet domain to reduce the mixed Poisson-Gaussian noise generated in light microscopy images. Thus, the contourlet transform, which preserves more edge areas while removing noise from light microscopy images, has been proposed as an alternative to the wavelet transform [33].
4. Evaluation method
The evaluation of the acquired light microscopy images was divided into comparison and quantitative analysis methods. Fig. 7 shows the number of publications for each evaluation method. Among the 16 identified papers, 10 used comparison evaluation parameters [21,24-27,29-33] and one used quantitative evaluation parameters [23] for image analysis. Five publications reported image analysis results using both comparison and quantitative evaluation parameters [22,28,34-36].
The simplest approach to quantitatively evaluating the noise level is to measure the standard deviation of a region of interest. However, it is important to analyze the noise level together with the signal or contrast in an image, and the signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) evaluation parameters have therefore been used in many studies. In particular, the study that measured the noise level using only quantitative evaluation methods used the SNR and CNR exclusively.
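These measures can be computed directly from ROI statistics; the sketch below uses one common definition of each (definitions vary across the reviewed papers, so treat these as illustrative):

```python
import numpy as np

def snr(roi):
    # Signal-to-noise ratio of a region of interest: mean signal over its
    # standard deviation (one common definition among several in use).
    return roi.mean() / roi.std()

def cnr(roi_a, roi_b):
    # Contrast-to-noise ratio between two ROIs: absolute mean difference
    # over a pooled noise estimate.
    noise = np.sqrt((roi_a.var() + roi_b.var()) / 2)
    return abs(roi_a.mean() - roi_b.mean()) / noise
```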
However, when software technology is used to reduce noise in any image, including light microscopic images, the preservation of edge areas inevitably degrades. Because this means that the noise level and the edge areas must be evaluated simultaneously when analyzing the resulting image, parameters that evaluate comparison or similarity are often used. In this systematic review as well, comparison evaluation parameters were the most frequently used, appearing in 10 studies. Among them, the mean error [24,25,27,32] and the peak signal-to-noise ratio (PSNR) [25,26,29,30,33] were applied most often. In addition, the structural similarity index measure, which can jointly analyze the similarity of images and their overall structure, is increasingly used as an evaluation parameter [31]. In particular, analyzing the relationship between imaging type and evaluation method showed that comparison evaluation parameters were mainly used in simulation research, whereas quantitative evaluation methods were mainly used for image evaluation with new equipment.
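For reference, the mean squared error and PSNR against a noise-free reference image can be sketched as follows (assuming images normalized to [0, 1]):

```python
import numpy as np

def mse(ref, img):
    # Mean squared error against the noise-free reference image.
    return np.mean((ref - img) ** 2)

def psnr(ref, img, peak=1.0):
    # Peak signal-to-noise ratio in dB; 'peak' is the maximum possible
    # pixel value (1.0 for images normalized to [0, 1]).
    return 10 * np.log10(peak ** 2 / mse(ref, img))
```

For example, a uniform error of 0.1 on a [0, 1] image gives an MSE of 0.01 and a PSNR of 20 dB.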
Recently, many studies have both confirmed quantitative evaluation results for noise levels and analyzed comparison evaluation results. Among the studies using the two evaluations concurrently, the coefficient of variation (COV), which mainly evaluates the noise level, and the PSNR, used as a comparison evaluation parameter, were applied most often [22,36]. The COV is an evaluation parameter that can check the noise level and signal simultaneously more simply than the SNR measurement method. In addition, in the most recently published paper, by Kim et al., the credibility of the study was enhanced by using no-reference-based evaluation parameters in addition to the evaluation methods used in the analysis of the resulting image [36]. No-reference-based evaluation parameters are actively used in the medical imaging field, and we expect that they can be extended to denoising studies of light microscopy images in the future.
Ⅳ. Conclusion
Noise reduction in light microscopic images is important for the accurate physical and biological analysis of tissues. Software technologies for reducing such noise are widely used, and the usefulness of these algorithms has been proven through various quantitative and comparative evaluations. In this study, a systematic review of recent papers on the quantitative and comparative evaluation of noise reduction algorithms in light microscopic images was conducted. Over the 35 years up to 2020, 139 related papers were identified, and the 16 papers selected for review were analyzed in various respects.
As a result of this systematic review, we demonstrated that studies on image quality improvement through noise reduction in light microscopy images are steadily being validated using both simulations and actual experiments. In particular, research has mainly focused on applying algorithms to images acquired with basic light microscopy systems, confirming that there remain many areas for improvement in image quality. In addition, transform-based algorithms are widely used to improve the quality of light microscope images, and image quality tends to be evaluated using comparison and quantitative methods simultaneously. In the future, we expect that light microscope image processing technologies and various evaluation methods will be presented by many researchers and that various optics-based software technologies will be additionally considered.