Neural Networks with Categorical Feature Embedding for Classification and Visualization (CENN)


Thanks to their superior expressive power for supervised learning, neural networks have been widely applied in many areas of artificial intelligence, such as computer vision and speech recognition. However, for traditional data mining tasks with hand-crafted features, neural networks fall short in two respects: 1) categorical features are problematic because neural networks inherently accept only numerical inputs; 2) their interpretability leaves something to be desired, since extracting knowledge from a trained network is difficult. To address these problems, we advocate a simple yet effective neural network architecture with Category Embedding (CENN). It directly handles both numerical and categorical features and provides visual insight into feature similarities. At its core, CENN learns a numerical embedding for each category of a categorical feature; based on these embeddings, we can visualize all categories in the embedding space and extract knowledge about the similarity between categories. In addition, CENN is closely related to one-hot encoding: we show that the first-layer weight matrix of CENN is a low-rank decomposition of the first-layer weight matrix of a neural network trained on one-hot encoded inputs.
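As a rough illustration of the idea (a sketch, not the released Torch code), the snippet below builds a CENN-style model for one categorical feature and a block of numerical features: the categorical index is mapped to a learned embedding via nn.LookupTable, concatenated with the numerical features, and passed through an ordinary feed-forward classifier. All sizes (K, d, numNumerical, hidden, numClasses) are placeholder values chosen for the example.

```lua
require 'nn'

-- Placeholder sizes for the sketch: K categories embedded into d dimensions,
-- plus numNumerical ordinary numerical features.
local K, d, numNumerical, hidden, numClasses = 10, 4, 5, 32, 3

-- Branch 1: categorical index -> learned d-dimensional embedding.
-- Branch 2: numerical features pass through unchanged.
local branches = nn.ParallelTable()
branches:add(nn.LookupTable(K, d))
branches:add(nn.Identity())

local model = nn.Sequential()
model:add(branches)
model:add(nn.JoinTable(2))                  -- concatenate embedding and numerical features
model:add(nn.Linear(d + numNumerical, hidden))
model:add(nn.ReLU())
model:add(nn.Linear(hidden, numClasses))
model:add(nn.LogSoftMax())

-- Example forward pass on a batch of two samples.
local cat = torch.LongTensor{3, 7}          -- category indices (1-based)
local num = torch.randn(2, numNumerical)    -- numerical features
local out = model:forward({cat, num})       -- 2 x numClasses log-probabilities
```

After training, the rows of the LookupTable weight matrix are the learned category embeddings, which can be plotted to visualize category similarities. The relation to one-hot encoding follows from the fact that feeding a one-hot vector through a linear layer simply selects one row of its weight matrix, so the embedding table composed with the next linear layer acts as a low-rank factorization of that one-hot weight matrix.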

Torch Code Download

Data download: poker-hand

For any questions or bug reports, please contact Yixin Chen.

Reference

TBD (will be updated when it is published)

Acknowledgement

The authors are supported in part by grants IIS-1343896, DBI-1356669, and III-1526012 from the U.S. National Science Foundation, a Microsoft Research New Faculty Fellowship, and a Barnes-Jewish Hospital Foundation grant.