SC2Net: Sparse LSTMs for Sparse Coding

Published: 2017-12-06


Title: SC2Net: Sparse LSTMs for Sparse Coding

Speaker: Zhou Tianyi (周天异), Scientist

Affiliation: Institute of High Performance Computing (IHPC), A*STAR, Singapore

Time: December 7, 2017, 10:00 a.m.

Venue: School of Computer Science and Technology, Room B518

Host: School of Computer Science and Technology

All faculty and students are welcome to attend!


Abstract: The iterative shrinkage-thresholding algorithm (ISTA) is one of the most popular optimization solvers for computing sparse codes. However, ISTA suffers from the following problems: 1) it employs a non-adaptive updating strategy, learning the parameters on each dimension with a fixed learning rate, which may lead to inferior performance due to a lack of diversity across dimensions; 2) it does not incorporate historical information into its updating rules, even though such information has been proven helpful for speeding up convergence. To address these issues, we propose a novel formulation of ISTA (termed adaptive ISTA) that introduces a novel adaptive momentum vector. To solve the adaptive ISTA efficiently, we recast it as a recurrent neural network unit and show its connection with the well-known long short-term memory (LSTM) model. Building on this new unit, we present a neural network (termed SC2Net) that computes sparse codes in an end-to-end manner. To the best of our knowledge, this is one of the first works to bridge the L1 solver and the LSTM, and it may provide new insights into both model-based optimization and the LSTM. Extensive experiments show the effectiveness of our method on both unsupervised and supervised tasks.
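The two shortcomings above can be made concrete with a short sketch. Below is a minimal NumPy implementation of vanilla ISTA for the lasso problem, which uses a single fixed step size 1/L for every dimension, alongside a simple momentum variant that accumulates historical gradients. The momentum version is a hypothetical illustration of how history can enter the update; the talk's adaptive ISTA instead learns a per-dimension momentum vector inside a recurrent unit.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding: the proximal operator of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """Vanilla ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Note the single fixed step size 1/L shared by all dimensions."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

def momentum_ista(A, b, lam, n_iter=500, beta=0.8):
    """Illustrative heavy-ball-style variant: a velocity term carries
    historical gradient information into each update (an assumption for
    illustration, not the talk's learned adaptive momentum)."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    v = np.zeros_like(x)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        v = beta * v + grad / L             # accumulate gradient history
        x = soft_threshold(x - v, lam / L)
    return x
```

On a small noiseless sparse-recovery problem both variants recover the true sparse code; the momentum term effectively enlarges the step using past gradients, which is the kind of historical information the adaptive ISTA exploits in a learned, per-dimension way.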


Bio: Zhou Tianyi is a scientist at the Institute of High Performance Computing (IHPC), A*STAR, Singapore, where he leads and contributes to a variety of machine learning projects, including image recognition and taxi route planning. He previously served as a senior R&D engineer at Sony's research and development center in Silicon Valley, where he was responsible for the visual perception component of the company's autonomous vehicle project. He received his Ph.D. from Nanyang Technological University, Singapore, and has published more than ten papers in top international journals and conferences, including AAAI, IJCAI, CVPR, and TIP. He received a best poster nomination award at ACML 2012 and the best paper award at the BeyondLabeler workshop at IJCAI 2016.