Summary

Following the intuitive idea of detecting changes by directly measuring dissimilarities between pairs of features, change detection methods based on feature similarity learning have become a crucial line of research. However, large variations in the scale and location of the required contextual information, together with the heavy imbalance between easy and hard samples, remain challenging issues. To address the first issue, we propose the Local-Specificity and Wide-View Attention Network (LSWVANet), which features a series of attention modules named Local-Specificity and Wide-View Attention Modules (LSWVAMs). Each LSWVAM consists of two contextual feature units: the Local-Specificity Feature Pyramid unit, which extracts part-specific contexts at a fine-grained level to focus on subtle changes within local discriminative parts, and the Wide-View Feature Pyramid unit, which extracts wide-view contexts at a long-range level to highlight significant changes in large-scale regions. To tackle the second issue, we introduce a novel sample-specific loss function, the Hard Sample-Aware Contrastive Loss (HSACL), which downweights easy samples from both the changed and unchanged categories, thereby rapidly shifting the training focus towards the informative hard samples. Experiments on three challenging datasets, VL-CMU-CD, PCD2015, and PSCD, demonstrate that our approach achieves state-of-the-art accuracy.
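To make the hard-sample-aware idea concrete, the sketch below shows one plausible, focal-style way of downweighting easy pairs in a pixel-wise contrastive loss: unchanged pairs whose feature distance is already small and changed pairs that already exceed the margin contribute little, so the gradient concentrates on hard samples. The function name `hsacl_sketch`, the margin value, and the modulation exponent `gamma` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F


def hsacl_sketch(feat_t0, feat_t1, change_mask, margin=2.0, gamma=2.0):
    """Illustrative hard-sample-aware contrastive loss (a sketch, not HSACL itself).

    feat_t0, feat_t1: (B, C, H, W) feature maps of the image pair.
    change_mask:      (B, H, W) binary mask, 1 = changed, 0 = unchanged.
    """
    # Per-pixel Euclidean distance between the paired features.
    dist = torch.sqrt(((feat_t0 - feat_t1) ** 2).sum(dim=1) + 1e-6)  # (B, H, W)
    y = change_mask.float()

    # Unchanged pixels: easy samples already have a small distance,
    # so their weight shrinks as dist -> 0.
    w_unchanged = (dist / margin).clamp(max=1.0) ** gamma
    loss_unchanged = (1.0 - y) * w_unchanged * dist ** 2

    # Changed pixels: easy samples already violate the margin little or not at all,
    # so their weight shrinks as the margin violation shrinks.
    hinge = F.relu(margin - dist)
    w_changed = (hinge / margin) ** gamma
    loss_changed = y * w_changed * hinge ** 2

    return (loss_unchanged + loss_changed).mean()


# Example usage with random tensors (hypothetical shapes):
# f0, f1 = torch.randn(2, 64, 32, 32), torch.randn(2, 64, 32, 32)
# mask = torch.randint(0, 2, (2, 32, 32))
# loss = hsacl_sketch(f0, f1, mask)
```

Under this weighting, both easy categories are suppressed simultaneously, which is the behaviour the summary attributes to HSACL; the specific focal-style modulation is only one way to realize it.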

Full-Text