CVPR 2022 Paper Digest (11 Apr 2022): 12 papers! Tracking, transformers, contrastive learning, and more
Updated on: 11 Apr 2022
Total number: 12
Object Tracking - 1 paper
Visible-Thermal UAV Tracking: A Large-Scale Benchmark and New Baseline
Paper: http://arxiv.org/abs/2204.04120
Code: None
Transformers - 1 paper
Gravitationally Lensed Black Hole Emission Tomography
Paper: http://arxiv.org/abs/2204.03715
Code: None
Contrastive Learning - 1 paper
Probabilistic Representations for Video Contrastive Learning
Paper: http://arxiv.org/abs/2204.03946
Code: None
Other - 9 papers
Dancing under the stars: video denoising in starlight
Paper: http://arxiv.org/abs/2204.04210
Code: None
General Incremental Learning with Domain-aware Categorical Representations
Paper: http://arxiv.org/abs/2204.04078
Code: None
Identifying Ambiguous Similarity Conditions via Semantic Matching
Paper: http://arxiv.org/abs/2204.04053
Code: None
Does Robustness on ImageNet Transfer to Downstream Tasks?
Paper: http://arxiv.org/abs/2204.03934
Code: None
Deep Hyperspectral-Depth Reconstruction Using Single Color-Dot Projection
Paper: http://arxiv.org/abs/2204.03929
Code: None
CD$^2$-pFed: Cyclic Distillation-guided Channel Decoupling for Model Personalization in Federated Learning
Paper: http://arxiv.org/abs/2204.03880
Code: None
Reusing the Task-specific Classifier as a Discriminator: Discriminator-free Adversarial Domain Adaptation
Paper: http://arxiv.org/abs/2204.03838
Code: https://github.com/xiaoachen98/DALN
TorMentor: Deterministic dynamic-path, data augmentations with fractals
Paper: http://arxiv.org/abs/2204.03776
Code: None
TemporalUV: Capturing Loose Clothing with Temporally Coherent UV Coordinates
Paper: http://arxiv.org/abs/2204.03671
Code: None