
One paper has been accepted by Pattern Recognition.

Our paper entitled "IW-ViT: Independence-Driven Weighting Vision Transformer for Out-of-Distribution Generalization" has been accepted by Pattern Recognition.

Title: IW-ViT: Independence-Driven Weighting Vision Transformer for Out-of-Distribution Generalization

Authors: Weifeng Liu*, Haoran Yu, Yingjie Wang, Baodi Liu, Dapeng Tao, Honglong Chen

Abstract: The Vision Transformer (ViT) has shown excellent performance in a wide range of computer vision applications under the independently and identically distributed (i.i.d.) assumption. However, when the test distribution differs from the training distribution, model performance drops significantly. To address this problem, we propose independence-driven sample weighting to improve the model's out-of-distribution (OOD) generalization: a set of sample weights is learned that removes spurious correlations between irrelevant features and labels by eliminating the statistical dependencies among features. Previous work based on independence-driven sample weighting learned sample weights only from the final output of the feature extractor. In contrast, we account for the fact that spurious correlations differ across layers of the feature-extraction process. Combining the modular architecture of ViT with independence-driven sample weighting, we propose the Independence-Driven Weighting Vision Transformer (IW-ViT) for out-of-distribution generalization. IW-ViT is built from a specialized encoder block, the IW-Block, each of which incorporates an independence sample weighting module. Every IW-Block learns its own set of sample weights and generates a weighted loss, so that the spurious correlations arising in different blocks are eliminated differentially. We conduct extensive experiments on a variety of datasets; the results demonstrate that IW-ViT significantly outperforms previous methods under different OOD generalization settings.
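The mechanism described in the abstract, per-block sample weighting that decorrelates features, can be illustrated with a short PyTorch sketch. This is not the authors' implementation: the dependence measure (off-diagonal entries of a weighted covariance over a random-Fourier-feature map, in the spirit of earlier independence-driven weighting work such as StableNet), the shared classification head, the [CLS]-token readout, and all names (IWBlock, IWViT, sample_weights, n_rff) are assumptions made here for illustration only.

```python
# Minimal sketch of per-block independence-driven sample weighting.
# NOT the paper's released code; see the assumptions stated above.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class IWBlock(nn.Module):
    """ViT-style encoder block plus a per-sample weighting step (sketch)."""

    def __init__(self, dim, heads, n_rff=64):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )
        # Fixed random Fourier projection, used only to measure dependence.
        self.register_buffer("rff_w", torch.randn(dim, n_rff))
        self.register_buffer("rff_b", 2 * math.pi * torch.rand(n_rff))

    def forward(self, x):
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        x = x + self.mlp(self.norm2(x))
        return x

    def sample_weights(self, feats, n_iters=50, lr=0.1):
        """Learn batch weights that decorrelate random Fourier features.

        Off-diagonal weighted covariance of the RFF map is a proxy for
        pairwise dependence between feature dimensions; driving it to
        zero reweights the batch toward approximate independence.
        """
        phi = torch.cos(feats.detach() @ self.rff_w + self.rff_b)  # (B, R)
        logit_w = torch.zeros(feats.size(0), device=feats.device,
                              requires_grad=True)
        opt = torch.optim.Adam([logit_w], lr=lr)
        for _ in range(n_iters):
            w = torch.softmax(logit_w, dim=0) * feats.size(0)  # mean ~= 1
            mu = (w[:, None] * phi).mean(dim=0)
            centered = phi - mu
            cov = (w[:, None] * centered).T @ centered / feats.size(0)
            loss = (cov - torch.diag(torch.diagonal(cov))).pow(2).sum()
            opt.zero_grad()
            loss.backward()
            opt.step()
        return (torch.softmax(logit_w, dim=0) * feats.size(0)).detach()


class IWViT(nn.Module):
    """Stack of IW-Blocks; each block contributes a weighted loss term."""

    def __init__(self, dim=192, depth=4, heads=3, n_classes=10):
        super().__init__()
        self.blocks = nn.ModuleList(IWBlock(dim, heads) for _ in range(depth))
        self.head = nn.Linear(dim, n_classes)  # shared head: a simplification

    def forward(self, tokens, labels=None):
        total_loss = tokens.new_zeros(())
        for blk in self.blocks:
            tokens = blk(tokens)
            if labels is not None:
                cls = tokens[:, 0]              # assume [CLS] at index 0
                w = blk.sample_weights(cls)     # per-block sample weights
                ce = F.cross_entropy(self.head(cls), labels, reduction="none")
                total_loss = total_loss + (w * ce).mean()
        return self.head(tokens[:, 0]), total_loss


# Toy usage on random tokens (8 images, 16 patches + [CLS], dim 192).
model = IWViT()
tokens = torch.randn(8, 17, 192)
labels = torch.randint(0, 10, (8,))
logits, loss = model(tokens, labels)
loss.backward()
```

The inner loop learns weights on detached features, so weight learning does not disturb the backbone gradients; each block then contributes its own weighted loss term, mirroring the block-wise differential elimination of spurious correlations the abstract describes. How IW-ViT actually measures dependence and combines or regularizes the per-block losses is specified in the paper, not in this sketch.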
