KSII Transactions on Internet and Information Systems
    Monthly Online Journal (eISSN: 1976-7277)

Adaptive Attention Annotation Model: Optimizing the Prediction Path through Dependency Fusion


Abstract

Previous methods build image annotation models by leveraging three basic dependencies: the relation between an image and its labels (image/label), between images (image/image), and between labels (label/label). Although plenty of studies show that multiple dependencies can work jointly to improve annotation performance, in these frameworks the dependencies do not actually "work jointly": performance largely depends on the result predicted by the image/label component. To address this problem, we propose the adaptive attention annotation model (AAAM), which associates these dependencies with the prediction path, i.e., the sequence of labels (tags) in the order they are detected. In particular, we optimize the prediction path by detecting relevant labels from the easy-to-detect to the hard-to-detect, identified with the Binary Cross-Entropy (BCE) and Triplet Margin (TM) losses, respectively. In addition, to capture the information of each label without explicitly extracting regional features, we propose a self-attention mechanism that implicitly enhances the relevant regions and suppresses the irrelevant ones. To validate the effectiveness of the model, we conduct experiments on three well-known public datasets, COCO 2014, IAPR TC-12 and NUS-WIDE, and achieve better performance than state-of-the-art methods.
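
To make the two ideas named in the abstract concrete, the following is a minimal sketch, not the authors' implementation: a simple spatial attention weighting stands in for the paper's self-attention mechanism over CNN feature-map regions, and training combines a BCE term for easy-to-detect labels with a Triplet Margin term for hard-to-detect ones. All module names, tensor shapes, and the trade-off weight alpha are illustrative assumptions, written in PyTorch.

```python
# Illustrative sketch only; names, shapes, and alpha are assumptions, not the paper's code.
import torch
import torch.nn as nn

class AttentionAnnotator(nn.Module):
    def __init__(self, feat_dim=2048, num_labels=80):
        super().__init__()
        self.attn = nn.Conv2d(feat_dim, 1, kernel_size=1)   # one spatial attention map
        self.classifier = nn.Linear(feat_dim, num_labels)   # multi-label scores

    def forward(self, feat_map):
        # feat_map: (B, C, H, W) features from a CNN backbone
        weights = torch.softmax(self.attn(feat_map).flatten(2), dim=-1)  # (B, 1, H*W)
        pooled = (feat_map.flatten(2) * weights).sum(-1)                 # (B, C), relevant regions up-weighted
        return self.classifier(pooled), pooled

bce_loss = nn.BCEWithLogitsLoss()            # easy-to-detect labels: multi-label BCE
tm_loss = nn.TripletMarginLoss(margin=1.0)   # hard-to-detect labels: metric constraint

def combined_loss(logits, targets, anchor, positive, negative, alpha=0.5):
    # Weighted sum of the two terms; alpha is an assumed trade-off weight.
    return bce_loss(logits, targets) + alpha * tm_loss(anchor, positive, negative)

if __name__ == "__main__":
    model = AttentionAnnotator()
    feats = torch.randn(4, 2048, 7, 7)                     # dummy backbone features
    logits, emb = model(feats)
    targets = torch.randint(0, 2, (4, 80)).float()         # 0/1 multi-label ground truth
    anchor, pos, neg = emb, emb.roll(1, 0), torch.randn(4, 2048)
    print(combined_loss(logits, targets, anchor, pos, neg).item())
```

In this sketch the BCE term scores all labels directly, while the triplet term shapes the image embedding so that images sharing a hard-to-detect label end up closer than unrelated ones; how the paper selects anchors, positives, and negatives along the prediction path is not reproduced here.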



Cite this article

[IEEE Style]
F. Wang, J. Liu, S. Zhang, G. Zhang, Y. Zheng, X. Li, W. Liang, Y. Li, "Adaptive Attention Annotation Model: Optimizing the Prediction Path through Dependency Fusion," KSII Transactions on Internet and Information Systems, vol. 13, no. 9, pp. 4665-4683, 2019. DOI: 10.3837/tiis.2019.09.019.

[ACM Style]
Fangxin Wang, Jie Liu, Shuwu Zhang, Guixuan Zhang, Yang Zheng, Xiaoqian Li, Wei Liang, and Yuejun Li. 2019. Adaptive Attention Annotation Model: Optimizing the Prediction Path through Dependency Fusion. KSII Transactions on Internet and Information Systems, 13, 9, (2019), 4665-4683. DOI: 10.3837/tiis.2019.09.019.

[BibTeX Style]
@article{tiis:22217, title="Adaptive Attention Annotation Model: Optimizing the Prediction Path through Dependency Fusion", author="Fangxin Wang and Jie Liu and Shuwu Zhang and Guixuan Zhang and Yang Zheng and Xiaoqian Li and Wei Liang and Yuejun Li", journal="KSII Transactions on Internet and Information Systems", DOI={10.3837/tiis.2019.09.019}, volume={13}, number={9}, year="2019", month={September}, pages={4665-4683}}