KSII Transactions on Internet and Information Systems
    Monthly Online Journal (eISSN: 1976-7277)

PATN: Polarized Attention based Transformer Network for Multi-focus image fusion

Vol. 17, No. 4, April 30, 2023
DOI: 10.3837/tiis.2023.04.011

Abstract

In this paper, we propose a framework for multi-focus image fusion called PATN. Our approach aggregates deep features extracted by a U-type Transformer with shallow features extracted by the polarized self-attention (PSA) module, so that PATN captures both long-range texture information and local detail information in the image. Meanwhile, the edge-preserving quality of the fused image is enhanced by a dense residual block containing the Sobel gradient operator, and three loss functions are introduced to retain more texture information from the source images. PATN is compared with 17 state-of-the-art MFIF methods on three datasets to verify its effectiveness and robustness.
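As a concrete illustration of the edge-preserving block described in the abstract, the sketch below shows how a dense residual block with a fixed (non-trainable) Sobel gradient operator might be wired up in PyTorch. This is a minimal sketch under stated assumptions, not the authors' implementation: the class names SobelGradient and DenseResidualSobelBlock, the channel and growth parameters, and the fusion wiring are hypothetical.

# Illustrative sketch only; the module names and wiring are assumptions,
# not the PATN authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SobelGradient(nn.Module):
    """Applies fixed horizontal/vertical Sobel kernels per channel."""
    def __init__(self, channels):
        super().__init__()
        gx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
        gy = gx.t()
        # One non-trainable depthwise (gx, gy) kernel pair per input channel.
        kernel = torch.stack([gx, gy]).unsqueeze(1).repeat(channels, 1, 1, 1)
        self.register_buffer("kernel", kernel)
        self.channels = channels

    def forward(self, x):
        g = F.conv2d(x, self.kernel, padding=1, groups=self.channels)
        gx, gy = g[:, 0::2], g[:, 1::2]  # interleaved per-channel responses
        return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)  # gradient magnitude

class DenseResidualSobelBlock(nn.Module):
    """Densely connected convolutions plus a Sobel branch, merged by 1x1 conv."""
    def __init__(self, channels, growth=32, layers=3):
        super().__init__()
        self.convs = nn.ModuleList()
        c = channels
        for _ in range(layers):
            self.convs.append(nn.Sequential(
                nn.Conv2d(c, growth, 3, padding=1), nn.ReLU(inplace=True)))
            c += growth  # dense concatenation grows the channel count
        self.sobel = SobelGradient(channels)
        self.fuse = nn.Conv2d(c + channels, channels, 1)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))
        edges = self.sobel(x)  # inject edge features from the input
        out = self.fuse(torch.cat(feats + [edges], dim=1))
        return out + x  # residual connection preserves the input

For example, DenseResidualSobelBlock(64)(torch.rand(1, 64, 128, 128)) returns a tensor of the same shape; the fixed Sobel branch supplies gradient-magnitude features that the 1x1 fusion convolution can exploit to sharpen edges in the fused output.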




Cite this article

[IEEE Style]
P. Wu, Z. Hua, and J. Li, "PATN: Polarized Attention based Transformer Network for Multi-focus image fusion," KSII Transactions on Internet and Information Systems, vol. 17, no. 4, pp. 1234-1257, 2023. DOI: 10.3837/tiis.2023.04.011.

[ACM Style]
Pan Wu, Zhen Hua, and Jinjiang Li. 2023. PATN: Polarized Attention based Transformer Network for Multi-focus image fusion. KSII Transactions on Internet and Information Systems, 17, 4, (2023), 1234-1257. DOI: 10.3837/tiis.2023.04.011.

[BibTeX Style]
@article{tiis:38665, title={PATN: Polarized Attention based Transformer Network for Multi-focus image fusion}, author={Pan Wu and Zhen Hua and Jinjiang Li}, journal={KSII Transactions on Internet and Information Systems}, DOI={10.3837/tiis.2023.04.011}, volume={17}, number={4}, year={2023}, month={April}, pages={1234-1257}}