Utilizing laser-induced breakdown spectroscopy to investigate iron concentrations in the blood of individuals with iron deficiency anemia compared to healthy controls
DOI: https://doi.org/10.52113/2/12.02.2025/44-55

Abstract
This study used Laser-Induced Breakdown Spectroscopy (LIBS) to detect iron deficiency in blood samples by comparing anemic samples to healthy ones. LIBS was used to analyze one healthy (H1) and two anemic (E1 and E2) blood samples. Iron spectral lines (Fe I, Fe II, and Fe III) were detected between 393 and 646 nanometers. The wavelengths were checked against the NIST database to ensure accuracy. A relative technique was used to estimate the iron content associated with each spectral line: the intensity of each line was divided by the intensity of the line with the highest spectral intensity. The arithmetic mean of these relative values was then used to calculate each sample's average iron content.
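The relative technique described above can be sketched as follows. This is a minimal illustration only: the function name and the example line intensities are hypothetical assumptions, not measurements from the study.

```python
def relative_iron_content(line_intensities):
    """Estimate relative iron content from LIBS line intensities.

    Each line's intensity is divided by the intensity of the
    strongest line, then the arithmetic mean of these relative
    values is returned (the procedure described in the abstract).
    """
    peak = max(line_intensities)
    relative = [i / peak for i in line_intensities]
    return sum(relative) / len(relative)


# Hypothetical Fe line intensities for one sample (arbitrary units)
sample_intensities = [120.0, 95.0, 80.0, 60.0]
print(round(relative_iron_content(sample_intensities), 3))
```

Comparing this mean relative value across samples is what allows the healthy and anemic samples to be ranked, since weaker iron lines yield a lower average.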
The iron spectral lines were found to be weaker in the anemic samples (E1 and E2) than in the healthy sample (H1). The average iron level in the healthy sample (H1) was somewhat higher (0.563) than in the anemic samples (0.456 for E1 and 0.486 for E2). In addition, the iron content dropped more in the first anemic sample (E1) than in the second anemic sample (E2), which may reflect a difference in the severity of anemia between the two samples.
This study shows that LIBS can accurately detect iron deficiency in blood samples: the anemic samples exhibited substantially lower iron content and weaker spectral lines than the healthy sample. These results indicate that LIBS could serve as a rapid, minimally invasive method for identifying individuals with iron-deficiency anemia. Further studies with larger sample sizes are recommended to confirm the accuracy and practical value of these findings.
License
Copyright (c) 2026 Zainab Baqi

This work is licensed under a Creative Commons Attribution 4.0 International License.
