Publication:
A real-time framework for video dehazing using bounded transmission and controlled Gaussian filter

dc.contributor.authorAlajarmeh, Aen_US
dc.contributor.authorZaidan, AAen_US
dc.date.accessioned2024-05-29T02:53:17Z
dc.date.available2024-05-29T02:53:17Z
dc.date.issued2018
dc.description.abstractThe haze phenomenon exerts a degrading effect that decreases contrast and causes color shifts in outdoor images and videos. The presence of haze in outdoor images and videos is bothersome, unpleasant, and occasionally even dangerous. The atmospheric light scattering (ALS) model is widely used to restore hazy images. In this model, two unknown parameters must be estimated: the airlight and the scene transmission. The quality of dehazed images and video frames depends considerably on these two parameters, as well as on the speed and accuracy of the refinement of the approximated scene transmission; this refinement is necessary to ensure spatial coherency of the output dehazed video. Spatial coherency must be accounted for in order to eliminate the flickering artifacts usually observed when single-image dehazing methods are extended to video. Classic methods typically require high computational capacity to dehaze videos in real time. In the driver-assistance context, however, these approaches are inappropriate because of the limited resources typical of mobile environments. To address this issue, this study proposes a framework for real-time video dehazing. The framework consists of two stages: a single-image dehazing stage using the bounded transmission (BT) method, which dehazes a single video frame in real time with high accuracy; and a transmission refinement stage using a filter we call the controlled Gaussian filter (CGF), which is proposed for the linear and simplified refinement of the scene transmission. To evaluate the proposed framework, three image datasets and two video streams are employed. Experimental results show that the single-image stage of the proposed framework is at least seven times faster than existing methods. In addition, an analysis of variance (ANOVA) test shows that the quality of the dehazed images in this stage is statistically similar to or better than that obtained with existing methods. Experiments also show that the video stage of the proposed framework achieves real-time video dehazing with better quality than existing methods.
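Note on the ALS model referenced in the abstract: haze formation is commonly written as I(x) = J(x)·t(x) + A·(1 − t(x)), where I is the observed hazy frame, J the scene radiance, A the airlight, and t the scene transmission. The sketch below is only a minimal illustration of inverting this model for one frame once A and a refined transmission map are available; it is not the paper's BT or CGF implementation, and the function name, the transmission floor t0, and the assumed [0, 1] value range are choices made for the example.

import numpy as np

def dehaze_frame(frame, airlight, transmission, t0=0.1):
    """Invert the ALS model I = J*t + A*(1 - t) to recover scene radiance J.

    frame        : HxWx3 float array in [0, 1] (hazy input frame)
    airlight     : length-3 array, the estimated atmospheric light A
    transmission : HxW array, the (refined) scene transmission t in (0, 1]
    t0           : lower bound on t, preventing noise amplification in dense haze
    """
    t = np.maximum(transmission, t0)[..., np.newaxis]   # broadcast t over color channels
    radiance = (frame - airlight) / t + airlight        # J = (I - A) / t + A
    return np.clip(radiance, 0.0, 1.0)

In a video setting, the paper's CGF would refine the coarse per-frame transmission before this inversion so that the output stays spatially coherent and free of flicker; the details of that filter are given in the article itself.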
dc.identifier.doi10.1007/s11042-018-5861-4
dc.identifier.epage26350
dc.identifier.eissn1573-7721
dc.identifier.issn1380-7501
dc.identifier.issue20
dc.identifier.wosWOS:000444201500007
dc.identifier.spage26315
dc.identifier.urihttps://oarep.usim.edu.my/handle/123456789/11380
dc.identifier.volume77
dc.languageEnglish
dc.language.isoen_US
dc.publisherSpringeren_US
dc.relation.ispartofMultimedia Tools and Applications
dc.sourceWeb of Science (ISI)
dc.subjectSingle-image dehazingen_US
dc.subjectReal-time video dehazingen_US
dc.subjectAtmospheric light scattering modelen_US
dc.subjectBounded transmissionen_US
dc.subjectImage integralsen_US
dc.subjectAirlight estimationen_US
dc.titleA real-time framework for video dehazing using bounded transmission and controlled Gaussian filter
dc.typeArticleen_US
dspace.entity.typePublication