Neural network solutions for artificial intelligence based on the new MIU-Net model for segmentation of lung images in the diagnosis and treatment of lung diseases
Abstract
The purpose of this research is to segment the lungs on chest X-rays, a crucial step in medical image processing for diagnosing and treating lung diseases. The study enhances lung segmentation by modifying the basic architecture of the U-Net model, integrating Inception blocks and a squeeze-and-excitation (SE) mechanism to improve segmentation accuracy. The Shenzhen dataset of chest radiographs serves as the test bed for these modifications, and the resulting MIU-Net model is applied to automatic segmentation of the chest organs, underscoring its relevance to lung disease diagnosis and treatment. The experiments use two types of data augmentation, Contrast Limited Adaptive Histogram Equalization (CLAHE) and the addition of Gaussian noise, to test the model's robustness under varied imaging conditions, and a comparative analysis is conducted against both the baseline U-Net and U-Net with Inception blocks. The results show that the improved U-Net with Inception and SE blocks substantially outperforms the original U-Net: its Dice coefficient is 0.9040 on the original data, 0.9306 with CLAHE applied, and 0.9232 with Gaussian noise added. These findings highlight the practical value of more accurate chest X-ray segmentation for early disease detection and treatment optimization, with implications that extend beyond academic interest to clinical practice in lung disease management.
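For readers who want a concrete picture of the components named in the abstract, the sketch below is a minimal, illustrative implementation in PyTorch and OpenCV, assuming standard formulations of these techniques. It is not the authors' published code: the channel counts, SE reduction ratio, noise level, and CLAHE parameters are all assumptions made for illustration.

# Illustrative sketch (assumed PyTorch/OpenCV implementation, not the authors'
# published code): a squeeze-and-excitation (SE) block, a small Inception-style
# block of the kind the abstract describes adding to U-Net, the two
# augmentations (CLAHE and Gaussian noise), and the Dice coefficient used to
# score segmentations. All hyperparameters here are assumptions.
import cv2
import numpy as np
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-excitation: reweight feature channels by global context."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Assumes channels >= reduction so the bottleneck is non-empty.
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))            # squeeze: global average pooling -> (B, C)
        w = self.fc(w).view(b, c, 1, 1)   # excitation: per-channel weights in (0, 1)
        return x * w                      # rescale each feature map


class InceptionSEBlock(nn.Module):
    """Parallel 1x1/3x3/5x5 convolutions, concatenated, projected, SE-weighted."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        self.b3 = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(in_ch, out_ch, kernel_size=5, padding=2)
        self.project = nn.Conv2d(3 * out_ch, out_ch, kernel_size=1)
        self.se = SEBlock(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)
        return self.se(self.act(self.project(y)))


def clahe_augment(img_u8: np.ndarray) -> np.ndarray:
    """CLAHE on an 8-bit grayscale X-ray; clip limit and tile grid are assumptions."""
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(img_u8)


def gaussian_noise_augment(img_u8: np.ndarray, sigma: float = 10.0) -> np.ndarray:
    """Add zero-mean Gaussian noise, clipped back to the valid 8-bit range."""
    noisy = img_u8.astype(np.float32) + np.random.normal(0.0, sigma, img_u8.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)


def dice_coefficient(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-6) -> float:
    """Dice = 2|A n B| / (|A| + |B|) on binarized masks, the metric reported above."""
    pred_bin = (pred > 0.5).float()
    inter = (pred_bin * target).sum()
    return float((2 * inter + eps) / (pred_bin.sum() + target.sum() + eps))

In a full model, blocks of this kind would plausibly replace U-Net's standard double-convolution stages in the encoder and decoder; the exact placement and hyperparameters used in MIU-Net are not specified in the abstract.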