• KSII Transactions on Internet and Information Systems
    Monthly Online Journal (eISSN: 1976-7277)

Weighted Contrastive Learning for Complementary Label Learning

Vol. 19, No. 11, November 30, 2025
DOI: 10.3837/tiis.2025.11.016

Abstract

Complementary label learning (CLL), in which each sample is annotated only with a class it does not belong to, has garnered increasing attention in machine learning. Existing deep learning-based CLL methods mainly conduct experiments on unstructured data, using text or image datasets to measure final performance. Notably, as the core data form in key fields such as finance and healthcare, tabular data contains both continuous and discrete categorical features, with complex feature distributions and hidden correlations. This makes it difficult for traditional CLL models to effectively capture the nonlinear relationships among tabular features, resulting in weak feature learning capabilities. Moreover, existing contrastive learning mechanisms adapt poorly to tabular data. In response, this paper introduces a novel CLL model tailored to tabular data with complementary labels. Specifically, we combine CLL and contrastive learning in a multi-task model, where both self-supervised and supervised contrastive learning serve as auxiliary feature extractors. In addition, the self-supervised and supervised contrastive objectives are constructed from different augmented views, and the probability score of the pseudo-label is used as a weight coefficient to enhance the supervised contrastive loss function. Furthermore, to better fit tabular datasets, we adopt a data augmentation mechanism designed for them. The experimental results confirm the effectiveness of the proposed method in the complementary label learning setting: it achieves an average accuracy improvement of 0.73%-2.66% over existing methods and outperforms all comparative baselines on every dataset.
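The abstract describes weighting each anchor's supervised contrastive term by the probability score of its pseudo-label. The paper's exact formulation is not reproduced here, so the following is only a minimal NumPy sketch of that idea under common SupCon conventions: L2-normalized embeddings, a temperature-scaled similarity matrix, and per-anchor confidence weights supplied by the model's pseudo-labeling step (the function name and signature are illustrative, not from the paper).

```python
import numpy as np

def weighted_supcon_loss(z, labels, weights, temperature=0.1):
    """Sketch of a confidence-weighted supervised contrastive loss.

    z        : (N, d) embeddings of augmented views
    labels   : (N,)   pseudo-labels assigned by the model
    weights  : (N,)   per-sample confidence scores (pseudo-label probabilities)
    """
    # L2-normalize so the dot product is cosine similarity
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    n = len(z)
    sim = (z @ z.T) / temperature

    # Exclude self-similarity from the softmax denominator
    self_mask = 1.0 - np.eye(n)
    exp_sim = np.exp(sim) * self_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))

    # Positives: other samples sharing the same pseudo-label
    pos_mask = (labels[:, None] == labels[None, :]).astype(float)
    np.fill_diagonal(pos_mask, 0.0)
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0  # anchors with at least one positive

    per_anchor = -(pos_mask * log_prob).sum(axis=1)[valid] / pos_counts[valid]
    # Weight each anchor's term by its pseudo-label confidence
    w = weights[valid]
    return float((w * per_anchor).sum() / w.sum())
```

With uniform weights this reduces to the standard supervised contrastive loss averaged over anchors; down-weighting low-confidence pseudo-labels is what limits the damage from incorrect pseudo-labels inferred from complementary supervision.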




Cite this article

[IEEE Style]
J. Liu, Y. Zhou, J. Liu, Y. Li, "Weighted Contrastive Learning for Complementary Label Learning," KSII Transactions on Internet and Information Systems, vol. 19, no. 11, pp. 4050-4071, 2025. DOI: 10.3837/tiis.2025.11.016.

[ACM Style]
Jiayi Liu, Yangyang Zhou, Jiabin Liu, and Yan Li. 2025. Weighted Contrastive Learning for Complementary Label Learning. KSII Transactions on Internet and Information Systems, 19, 11, (2025), 4050-4071. DOI: 10.3837/tiis.2025.11.016.

[BibTeX Style]
@article{tiis:105179, title="Weighted Contrastive Learning for Complementary Label Learning", author="Jiayi Liu and Yangyang Zhou and Jiabin Liu and Yan Li", journal="KSII Transactions on Internet and Information Systems", DOI={10.3837/tiis.2025.11.016}, volume={19}, number={11}, year="2025", month={November}, pages={4050-4071}}