Daily Academic Digest 2.6
Founder
2024-05-23 15:12:47

CV - Computer Vision | ML - Machine Learning | RL - Reinforcement Learning | NLP - Natural Language Processing

Subjects: cs.AI

1.Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class-Incremental Learning(ICLR 2023)

Authors: Yibo Yang, Haobo Yuan, Xiangtai Li, Zhouchen Lin, Philip Torr, Dacheng Tao

Paper link: https://openreview.net/forum?id=y5W8tpojhtJ

Project code: https://github.com/NeuralCollapseApplications/FSCIL

Abstract:

Few-shot class-incremental learning (FSCIL) has been a challenging problem as only a few training samples are accessible for each novel class in the new sessions. Finetuning the backbone or adjusting the classifier prototypes trained in the prior sessions would inevitably cause a misalignment between the feature and classifier of old classes, which explains the well-known catastrophic forgetting problem. In this paper, we deal with this misalignment dilemma in FSCIL inspired by the recently discovered phenomenon named neural collapse, which reveals that the last-layer features of the same class will collapse into a vertex, and the vertices of all classes are aligned with the classifier prototypes, which are formed as a simplex equiangular tight frame (ETF). It corresponds to an optimal geometric structure for classification due to the maximized Fisher Discriminant Ratio. We propose a neural collapse inspired framework for FSCIL. A group of classifier prototypes are pre-assigned as a simplex ETF for the whole label space, including the base session and all the incremental sessions. During training, the classifier prototypes are not learnable, and we adopt a novel loss function that drives the features into their corresponding prototypes. Theoretical analysis shows that our method holds the neural collapse optimality and does not break the feature-classifier alignment in an incremental fashion. Experiments on the miniImageNet, CUB-200, and CIFAR-100 datasets demonstrate that our proposed framework outperforms state-of-the-art methods. Our code will be publicly available.
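The core construction above can be sketched in a few lines: the classifier prototypes for the whole label space are pre-computed as the columns of a simplex ETF and kept frozen, while a regression-style loss pulls each feature toward its prototype. A minimal NumPy sketch (an illustration of the construction, not the authors' implementation; dimensions and names are made up):

```python
import numpy as np

def simplex_etf(d, K, seed=0):
    """Columns are K unit-norm prototypes in R^d whose pairwise cosine
    similarity is -1/(K-1), i.e. a simplex equiangular tight frame."""
    rng = np.random.default_rng(seed)
    # Random orthonormal basis U in R^{d x K} (requires d >= K)
    U, _ = np.linalg.qr(rng.standard_normal((d, K)))
    # ETF construction: sqrt(K/(K-1)) * U (I_K - (1/K) 1 1^T)
    return np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)

# Fixed prototypes for the whole label space (base + all incremental sessions)
M = simplex_etf(d=512, K=100)

def dot_regression_loss(f, m):
    """Toy loss driving a normalized feature f toward its fixed prototype m:
    zero exactly when f aligns with m."""
    f = f / np.linalg.norm(f)
    return 0.5 * (f @ m - 1.0) ** 2
```

The key property, which the tests below check, is that every prototype has unit norm and every pair of prototypes meets at the maximal equal angle, the geometry neural collapse predicts for a balanced classifier.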

2.Visual Imitation Learning with Patch Rewards

Authors: Minghuan Liu, Tairan He, Weinan Zhang, Shuicheng Yan, Zhongwen Xu

Paper link: https://arxiv.org/abs/2302.00965v1

Project code: https://github.com/sail-sg/patchail

Abstract:

Visual imitation learning enables reinforcement learning agents to learn to behave from expert visual demonstrations such as videos or image sequences, without explicit, well-defined rewards. Previous research either adopted supervised learning techniques or induced simple and coarse scalar rewards from pixels, neglecting the dense information contained in image demonstrations. In this work, we propose to measure the expertise of various local regions of image samples, called patches, and recover multi-dimensional patch rewards accordingly. A patch reward is a more precise reward characterization that serves as a fine-grained expertise measurement and visual explainability tool. Specifically, we present Adversarial Imitation Learning with Patch Rewards (PatchAIL), which employs a patch-based discriminator to measure the expertise of different local parts of given images and provide patch rewards. The patch-based knowledge is also used to regularize the aggregated reward and stabilize training. We evaluate our method on the DeepMind Control Suite and Atari tasks. The experimental results demonstrate that PatchAIL outperforms baseline methods and provides valuable interpretations for visual demonstrations.
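The patch-reward idea can be illustrated with a toy example: score each local region of an image with a discriminator, keep the per-patch scores as a reward map, and aggregate them into a scalar. The sketch below substitutes a linear scorer for PatchAIL's convolutional patch discriminator (all names, shapes, and the scorer itself are illustrative assumptions, not the authors' code):

```python
import numpy as np

def patch_rewards(img, w, patch=8):
    """Score non-overlapping patch x patch regions of a 2-D image with a toy
    linear discriminator w, returning the per-patch reward map and its mean."""
    H, W = img.shape
    rows = []
    for i in range(0, H - patch + 1, patch):
        row = []
        for j in range(0, W - patch + 1, patch):
            logit = img[i:i + patch, j:j + patch].ravel() @ w
            # With D = sigmoid(logit), the GAN-style reward
            # log D - log(1 - D) is exactly the raw logit.
            row.append(logit)
        rows.append(row)
    R = np.array(rows)
    return R, R.mean()  # multi-dimensional patch rewards + aggregated scalar

rng = np.random.default_rng(0)
R, r = patch_rewards(rng.standard_normal((64, 64)), rng.standard_normal(64))
```

The reward map `R` is what makes the signal visually interpretable: each entry localizes how "expert-like" one region of the frame appears, instead of collapsing the whole image to a single scalar.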

Subjects: cs.CV

1.No One Left Behind: Real-World Federated Class-Incremental Learning

Authors: Dong-Guw Lee, Myung-Hwan Jeon, Younggun Cho, Ayoung Kim

Paper link: https://arxiv.org/abs/2302.00965v1

Project code: https://github.com/jiahuadong/lga

Abstract:

Federated learning (FL) is a popular collaborative training framework that aggregates the model parameters of decentralized local clients. However, most existing models unreasonably assume that the data categories of the FL framework are known and fixed in advance. This causes the global model's recognition performance on old categories to degrade significantly (i.e., catastrophic forgetting) when local clients receive new categories consecutively under limited memory for storing old categories. Moreover, some new local clients that collect novel categories unseen by other clients may be introduced to FL training irregularly, which further exacerbates the catastrophic forgetting on old categories. To tackle the above issues, we propose a novel Local-Global Anti-forgetting (LGA) model to address local and global catastrophic forgetting on old categories, which is a pioneering work to explore a global class-incremental model in the FL field. Specifically, to tackle the class imbalance of local clients and surmount local forgetting, we develop a category-balanced gradient-adaptive compensation loss and a category gradient-induced semantic distillation loss. They can balance the heterogeneous forgetting speeds of hard-to-forget and easy-to-forget old categories, while ensuring the consistency of intrinsic class relations across different incremental tasks. Moreover, a proxy server is designed to tackle global forgetting caused by Non-IID class imbalance between different clients. It collects perturbed prototype images of new categories from local clients via prototype gradient communication under privacy preservation, and augments them via self-supervised prototype augmentation to choose the best old global model and improve the local distillation gain. Experiments on representative datasets verify the superior performance of our model against other comparison methods.
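The abstract's category-balanced compensation rests on one core ingredient: up-weighting rare, easy-to-forget old categories relative to frequent ones so their forgetting speeds equalize. That ingredient can be sketched with a generic effective-number class weighting; this is a stand-in illustration of class balancing, not the LGA loss itself, and `beta` is an assumed hyperparameter:

```python
import numpy as np

def class_balanced_weights(counts, beta=0.99):
    """Generic effective-number class weighting: classes with few samples
    receive larger loss weights, frequent classes smaller ones; the weights
    are normalized to average 1 so the overall loss scale is unchanged."""
    counts = np.asarray(counts, dtype=float)
    # Effective number of samples per class: (1 - beta^n) / (1 - beta)
    eff = (1.0 - np.power(beta, counts)) / (1.0 - beta)
    w = 1.0 / eff
    return w * len(counts) / w.sum()

# Frequent, medium, and rare classes on one client
w = class_balanced_weights([500, 50, 5])
```

In a class-incremental client, `counts` would come from the few stored exemplars of old categories versus the abundant samples of new ones, so the weighting counteracts the local imbalance the paper targets.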
