Medical image fusion quality evaluation method combining multi-angle and multi-scale information

At present, the existing evaluation indicators for image fusion algorithms do not conform to the standards of the human visual system and are not well suited to medical image application scenarios. Therefore, this paper proposes a medical image fusion quality evaluation method that combines multi-angle and multi-scale information. We use a Gabor filter to decompose the image and obtain multi-angle texture information, and design a new index, called Multiscale Structural Similarity with Image Resolution (SSIM-IR), to comprehensively evaluate the fused image in terms of edge, structure and sharpness. The experimental results show that this method is well suited to medical image application scenarios, is consistent with subjective evaluation, and can be used to measure the strengths and weaknesses of medical image fusion algorithms.


Introduction
Multi-modal medical image fusion can comprehensively and accurately reflect the structure and function of human organs and tissues, integrating the advantages of different imaging modalities so as to help doctors make a more accurate diagnosis. With the development of image fusion technology, how to evaluate the performance of image fusion algorithms has become a new research direction. Since there is no reference image against which a medical fusion result can be compared, quality evaluation of medical image fusion algorithms remains a difficult problem.
Early objective evaluation indicators based on the pixel features of the image lack consideration of the source images and are mainly no-reference image evaluation indexes. In addition, there are evaluation indicators based on information theory and on structural similarity. At present, the most common evaluation indicators are based on the Human Visual System (HVS), which evaluate the quality of fused images by simulating how human eyes perceive images. However, existing HVS-based indicators have the limitation that their results are not fully consistent with subjective perception. Making objective evaluation indexes more consistent with subjective visual perception is an inevitable trend. Therefore, many researchers are inclined to combine multiple indicators into a comprehensive one. Currently, there is no evaluation method dedicated to medical image fusion algorithms.
In this paper, a medical image fusion quality evaluation method combining multi-angle and multi-scale information is proposed. It comprehensively considers edge, structure, brightness, contrast and sharpness, and can be used to evaluate both gray and color image fusion algorithms. The experimental results show that the proposed method performs well on medical image fusion algorithms and provides results consistent with human visual standards.

Multiscale structural similarity with image resolution
In order to better evaluate the quality of a fused image, this paper designs a new index called Multiscale Structural Similarity with Image Resolution (SSIM-IR).
The Structural Similarity (SSIM) [1] combines the mean, variance and covariance of images to evaluate the brightness, contrast and structure between a source image and a fusion image, without considering image sharpness; it is a semi-reference index.
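The standard SSIM formula combines luminance, contrast and structure terms derived from the means, variances and covariance. As a minimal sketch (using a single global window rather than the sliding local windows typically used in practice), it can be computed as:

```python
import numpy as np

def ssim_global(x, y, L=255.0):
    """Global SSIM between two images (single-window simplification).

    C1 and C2 are the standard stabilizing constants from the SSIM paper.
    """
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()              # mean values (brightness)
    vx, vy = x.var(), y.var()                # variances (contrast)
    cov = ((x - mx) * (y - my)).mean()       # covariance (structure)
    return ((2 * mx * my + C1) * (2 * cov + C2)) / (
        (mx**2 + my**2 + C1) * (vx + vy + C2))
```

Identical images yield an SSIM of 1; any brightness, contrast or structural difference lowers the score.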
The Average Gradient (AG) measures the sharpness of an image. Let AG_X and AG_Y be the average gradients of source image X and fusion image Y respectively. The sharpness influence factor R(X, Y) reflects the sharpness change from the source image to the fusion image, and is combined with SSIM to form SSIM-IR. In order to integrate all source images and fusion images into the evaluation, SSIM-IR further combines image structure information to measure the structural similarity and sharpness between the two source images and the generated fusion image.
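The average gradient is commonly defined as the mean magnitude of local intensity differences. A minimal numpy sketch using forward finite differences (the exact gradient operator used by the paper is not specified, so this is an assumption):

```python
import numpy as np

def average_gradient(img):
    """Average gradient (AG): mean magnitude of local intensity changes."""
    img = img.astype(np.float64)
    dx = np.diff(img, axis=1)[:-1, :]   # horizontal differences, trimmed to common shape
    dy = np.diff(img, axis=0)[:, :-1]   # vertical differences, trimmed to common shape
    return np.sqrt((dx**2 + dy**2) / 2.0).mean()
```

A constant image has AG = 0; sharper images with stronger local transitions score higher.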
The principle of SSIM points out that the variance and covariance of an image reflect its structural information. Following the definition of Mutual Information (MI) [2], a weight value for the structural information shared between images X and Y is defined. Given source images A and B and fusion image F, the overall SSIM-IR index is obtained by computing SSIM-IR between each source image and the fusion image and combining the two terms with these structural-information weights.
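Mutual information between two images can be estimated from their joint intensity histogram. A sketch of this standard estimate (the bin count is an illustrative assumption):

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Mutual information (in bits) estimated from the joint histogram."""
    hist, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = hist / hist.sum()                       # joint distribution
    px = pxy.sum(axis=1, keepdims=True)           # marginal of x
    py = pxy.sum(axis=0, keepdims=True)           # marginal of y
    nz = pxy > 0                                  # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

An image shares maximal information with itself, so MI(X, X) exceeds the MI between two independent images.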

Pre-processing
In medical image fusion algorithms, the YUV color space is frequently used when fusing a color image with a gray image. When evaluating such a fusion algorithm, a YUV transformation is first performed on the color source image and the fusion image, and the resulting Y (luminance) component is taken as the input.
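Only the luminance channel of the YUV transform is needed for evaluation. A sketch using the BT.601 luma weights (the exact YUV variant used by the paper is not specified, so BT.601 is an assumption):

```python
import numpy as np

def rgb_to_luma(rgb):
    """Extract the Y (luminance) component from an RGB image (channels last).

    Uses BT.601 weights: Y = 0.299 R + 0.587 G + 0.114 B.
    """
    weights = np.array([0.299, 0.587, 0.114])
    return rgb.astype(np.float64) @ weights
```

A gray source image can be passed through unchanged, so both inputs to the evaluator are single-channel luminance images.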

Image decomposition
Gabor Filter [3] is sensitive to image edges and provides good direction-selection and scale-selection characteristics. Moreover, the frequency and orientation responses of the Gabor filter are similar to those of the human visual system. Thus, the Gabor filter can extract texture details from different angles to obtain texture images and separate out structure images, achieving a multi-angle and multi-scale decomposition. The decomposition step is: decompose source images A and B and fusion image F with the Gabor filter at each angle θ ∈ Θ = {0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°} to obtain texture images T_A^θ, T_B^θ and T_F^θ, and structural images S_A, S_B and S_F.

Texture information evaluation
Q^{AB/F} [4] is an objective no-reference evaluation index of fused image quality; its algorithm uses local edge measurement to estimate how well the salient information of the source images is preserved in the fusion image. The multi-angle texture images contain edge information from the source images and the fusion image. Thus, Q^{AB/F} is used to calculate the edge gradient index Q_θ^{AB/F} between the texture images of the source images at each angle, T_A^θ and T_B^θ, and the texture image of the fusion image, T_F^θ; the evaluation score Index_T of the texture information is then obtained through an L2-norm calculation. The L2 norm, also known as the Euclidean norm, is defined as the square root of the sum of squares of all elements of a vector. Taking the edge gradient indexes at all angles as the elements of a vector and integrating them through the L2 norm gives:

Index_T = sqrt( Σ_{θ∈Θ} (Q_θ^{AB/F})² )
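A real-valued oriented Gabor kernel, and the L2-norm aggregation of per-angle edge scores described above, can be sketched as follows (kernel size and frequency parameters are illustrative assumptions; filtering each image is then a 2-D convolution with the kernel):

```python
import numpy as np

def gabor_kernel(theta_deg, ksize=15, sigma=3.0, lam=6.0, gamma=0.5):
    """Real-valued Gabor kernel oriented at theta_deg degrees."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(np.float64)
    t = np.deg2rad(theta_deg)
    xr = x * np.cos(t) + y * np.sin(t)       # rotate coordinates to the filter angle
    yr = -x * np.sin(t) + y * np.cos(t)
    # Gaussian envelope modulated by a cosine carrier along the rotated axis
    return np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def texture_index(q_per_angle):
    """L2-norm aggregation of the per-angle edge gradient indexes."""
    return float(np.sqrt(np.sum(np.square(q_per_angle))))
```

One kernel is built per angle in Θ; convolving an image with each kernel yields its texture image at that orientation.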

Structural information evaluation
Most of the organ structure, brightness and contrast information is retained in the structural image, which can be evaluated by the SSIM-IR designed in this paper.
Formula (6) is used to calculate the SSIM-IR value and obtain the evaluation score Index_S of the structural information of the structural images.

Index calculation
Calculate the amount of information in the structure images and texture images decomposed by the Gabor filter, and use it to weight Index_S and Index_T to obtain the final quality evaluation score Index.
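Information Entropy (EN) of an image is a standard histogram-based measure. The exact way EN weights the two sub-scores is not given in this excerpt, so the combination below (entropy-normalized weighting of Index_T and Index_S) is only an assumed sketch:

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (EN, in bits) of an image's intensity histogram."""
    hist, _ = np.histogram(img.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking the log
    return float(-(p * np.log2(p)).sum())

def final_index(index_t, index_s, en_t, en_s):
    """Hypothetical entropy-weighted combination of the two sub-scores."""
    return (en_t * index_t + en_s * index_s) / (en_t + en_s)
```

A constant image carries no information (EN = 0), so its score would contribute nothing under this weighting.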

Analysis of experimental results
In order to ensure a comprehensive comparison with the proposed indicator, the experiment adopts six indicators: SF, EN, FMI [2], VIFF [5], Q^{AB/F} [4] and MS-SSIM [1]. For all six indicators, a larger value indicates better fusion image quality.

Contrast experiment
In order to verify the effectiveness of the proposed quality assessment method, the Pearson Linear Correlation Coefficient (PLCC) and the Spearman Rank-order Correlation Coefficient (SROCC) are selected to evaluate the correlation between the subjective evaluation and the objective indicators [8]. The value range of both SROCC and PLCC is [-1, 1], reflecting the relationship between the subjective and objective rankings of the fusion algorithms. The higher the value, the more consistent the quality evaluation method is with the subjective evaluation results. The following figure shows the PLCC and SROCC values of the proposed method and the comparison indexes.
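PLCC and SROCC can be computed directly, since SROCC is simply the Pearson correlation of the rank vectors. A minimal numpy sketch (without tie correction):

```python
import numpy as np

def plcc(x, y):
    """Pearson linear correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc**2).sum() * (yc**2).sum()))

def srocc(x, y):
    """Spearman rank-order correlation: PLCC applied to the ranks."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return plcc(rank(x), rank(y))
```

Perfectly monotone agreement between subjective scores and an objective index gives SROCC = 1, even when the relationship is nonlinear.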
From the histogram in Figure 2, the proposed method has the highest PLCC and SROCC values, which means that this method is not only in line with the HVS standard but also more suitable for medical image fusion scenes. (ITM Web of Conferences 47, 02049 (2022), CCCAR2022, https://doi.org/10.1051/itmconf/20224702049)

Case analysis
In order to further verify the performance of the proposed quality assessment method on medical image fusion between different modalities, two pairs of medical images are selected from the experimental dataset for analysis. Table 1 shows the evaluation results of these images under the comparison indicators and the proposed method, including the index values and their ranks. Combining the value ranks of all indicators in Table 1, it can be found that the objective ranking obtained by the proposed method is basically consistent with the subjective evaluation results, while the other evaluation indexes deviate from the subjective ranking to some extent. This proves that the proposed method is applicable to, and better suited for, the evaluation of medical image fusion algorithms.

Conclusion
This paper proposes a medical image fusion quality evaluation method combining multi-angle and multi-scale information. The method utilizes the Gabor filter to achieve multi-angle and multi-scale image decomposition, and designs the Multiscale Structural Similarity with Image Resolution (SSIM-IR) index to evaluate structural information. In the experiments, the PLCC and SROCC values and the case analysis of multi-modal fusion results prove that the proposed evaluation method performs better than other indicators in the evaluation of medical image fusion algorithms and is more consistent with subjective evaluation. This method can be used to evaluate multi-modal medical image fusion algorithms and promote the development of medical image fusion technology.

This research is supported by the National Natural Science Foundation of China (61861004, 61762007, 61862006).

EN
The two weighting terms in the final index represent the total Information Entropy (EN) of the texture images and the structure images respectively.

Figure 3.
Comparison of image fusion results. (a) and (b) are CT and MR-T1 images, and (c) and (d) are MR-T2 and PET images, respectively, scanned along the axial plane of the human brain; (e)-(j) are the fusion images obtained by the CNN, CVT, DWT, LP, NSCT and NSST_PAPCNN fusion algorithms.

Table 1.
Comparison results of different fusion algorithms and evaluation indexes of the fusion images. Through recognition and comparison of the fused images in Figure 3, the subjective ranking in Table 1 can be obtained, shown in the rank column.