Multitask Learning / Domain Adaptation



Multitask learning aims to improve the performance of learning algorithms by learning classifiers for multiple tasks jointly. This works particularly well if the tasks share some commonality and each is slightly undersampled. One example is spam filtering: every user has a slightly different distribution over spam and not-spam emails (e.g., all emails in Russian are spam for me -- but not for my Russian colleagues), yet there is definitely a common aspect across users. Multitask learning works because encouraging a classifier (or a modification thereof) to also perform well on a slightly different task is a better regularizer than uninformed ones, e.g., enforcing that all weights are small (the typical l2-regularization).

Relevant publications:

[PDF][CODE][BIBTEX] Minmin Chen, Zhixiang (Eddie) Xu, Kilian Q. Weinberger, Fei Sha. Marginalized Stacked Denoising Autoencoders for Domain Adaptation. Proceedings of the 29th International Conference on Machine Learning (ICML), Edinburgh, Scotland, Omnipress, pages 767-774, 2012.

[PDF][CODE][BIBTEX] Minmin Chen, Kilian Q. Weinberger, John C. Blitzer. Co-Training for Domain Adaptation. Proceedings of Advances in Neural Information Processing Systems 24 (NIPS). [To appear]

[PDF][CODE][BIBTEX] Minmin Chen, Kilian Q. Weinberger, Yixin Chen. Automatic Feature Decomposition for Single View Co-training. Proceedings of the 28th International Conference on Machine Learning (ICML-11), pages 953-960, ACM, Bellevue, USA, 2011.

[PDF][BIBTEX] Olivier Chapelle, Pannagadatta Shivaswamy, Srinivas Vadrevu, Kilian Q. Weinberger, Ya Zhang, Belle Tseng. Multi-Task Learning for Boosting with Application to Web Search Ranking. Machine Learning Journal, ISSN 0885-6125, pages 1-25, Springer Verlag, 2011.

[PDF][CODE][BIBTEX] Shibin Parameswaran and Kilian Q. Weinberger. Large Margin Multi-Task Metric Learning. In J. Lafferty, C. K. I. Williams, J. Shawe-Taylor, R.S. Zemel, and A. Culotta (eds.), Advances in Neural Information Processing Systems 23 (NIPS), pages 1867-1875, 2010.

[PDF][TALK][BIBTEX] O. Chapelle, S. Vadrevu, K. Q. Weinberger, P. Shivaswamy, Y. Zhang, B. Tseng. Multi-Task Learning for Boosting with Application to Web Search Ranking, KDD 2010. Proceedings of the 16th international conference on Knowledge discovery and data mining (SIGKDD): 1189-1198 ACM.

[PDF][BIBTEX] Yuzong Liu, Mohit Sharma, Charles M. Gaona, Jonathan D. Breshears, Jarod Roland, Zachary V. Freudenburg, Kilian Q. Weinberger, and Eric C. Leuthardt. Decoding Ipsilateral Finger Movements from ECoG Signals in Humans. In J. Lafferty, C. K. I. Williams, J. Shawe-Taylor, R.S. Zemel, and A. Culotta (eds.), Advances in Neural Information Processing Systems 23 (NIPS), pages 1468-1476, 2010. [Attention! Results in Figure 3 could not be reproduced and should be considered invalid. We apologize for this mishap. All other results are not affected. ]

[PDF][BIBTEX] J. Attenberg, K. Q. Weinberger, A. Smola, A. Dasgupta, M. Zinkevich, Collaborative Email-Spam Filtering with the Hashing-Trick. Sixth Conference on Email and Anti-Spam (CEAS) 2009

[PDF][BIBTEX] K. Q. Weinberger, A. Dasgupta, J. Langford, A. Smola, J. Attenberg. Feature Hashing for Large Scale Multitask Learning. In Proceedings of the Twenty-Sixth International Conference on Machine Learning (ICML-09), Canada.