  Nonlinear Projection Trick in Kernel Methods
 Prof. Nojun Kwak (Seoul National University)
 
Education/Career:
                 1993-1997: B.S., School of Electrical Engineering, Seoul National University
                 1997-1999: M.S., School of Electrical Engineering, Seoul National University
                 1999-2003: Ph.D., School of Electrical and Computer Engineering, Seoul National University
                 2003-2006: Senior Engineer, Telecommunication R&D Center, Samsung Electronics
                 2006-2007: BK Assistant Professor, Institute of Information Technology, Seoul National University
                 2007-2013: Assistant/Associate Professor, Department of Electronics Engineering, Ajou University
                 2013-present: Associate Professor, Graduate School of Convergence Science and Technology, Seoul National University

 

 Publications: numerous papers on subspace learning

 

 Major research: principal component analysis based on the Lp-norm, the nonlinear projection trick, etc.

 

 Research interests: pattern recognition, machine learning, image processing, computer vision, etc.

 

Title

Nonlinear Projection Trick in Kernel Methods

Abstract

In kernel methods such as kernel PCA and support vector machines, the so-called kernel trick (KT) is used to avoid direct calculations in a high (virtually infinite) dimensional kernel space. In this tutorial, we introduce an alternative to the kernel trick that explicitly maps the input data into a reduced-dimensional kernel space. This mapping is easily obtained from the eigenvalue decomposition of the kernel matrix. The proposed method is named the nonlinear projection trick (NPT), in contrast to the KT. First, the NPT is shown to be equivalent to the KT as well as to the Nyström method. Second, an incremental version of the NPT (INPT) is proposed, which can be used to implement arbitrary kernel methods incrementally; the INPT is also shown to be equivalent to a QR factorization in an RKHS. With this technique, the applicability of kernel methods is widened to arbitrary algorithms that do not rely on the dot product.
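The core construction described in the abstract can be sketched in a few lines of NumPy: eigendecompose the kernel matrix K = UΛUᵀ, keep the nonzero eigen-directions, and take Y = UΛ^{1/2} as the explicit coordinates of the training data in the reduced-dimensional kernel space, so that YYᵀ = K. A minimal sketch follows; the function names (`rbf_kernel`, `npt_train_coordinates`) and the choice of a Gaussian kernel are our own illustrative assumptions, not code from the tutorial.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def npt_train_coordinates(K, tol=1e-10):
    # Eigendecompose the (PSD) kernel matrix: K = U diag(lam) U^T.
    lam, U = np.linalg.eigh(K)
    keep = lam > tol                  # discard numerically zero eigen-directions
    lam, U = lam[keep], U[:, keep]
    # Explicit coordinates in the reduced kernel space: Y @ Y.T == K.
    Y = U * np.sqrt(lam)
    return Y, lam, U

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))           # 6 training samples in R^3
K = rbf_kernel(X, X)
Y, lam, U = npt_train_coordinates(K)

# Dot products of the explicit coordinates reproduce the kernel matrix, so any
# algorithm -- even one not expressible via dot products -- can now run on Y.
print(np.allclose(Y @ Y.T, K))        # True

# A new point x is mapped via y = diag(lam)^{-1/2} U^T k(x), where k(x) holds
# the kernel values between x and the training samples.
x = rng.normal(size=(1, 3))
kx = rbf_kernel(X, x)                 # shape (6, 1)
y = (U / np.sqrt(lam)).T @ kx         # coordinates of x, shape (r, 1)
```

Note the contrast with the kernel trick: here the feature-space coordinates are materialized explicitly, which is what allows dot-product-free algorithms to be kernelized.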
