
SwitchTab: Switched Autoencoders Are Effective Tabular Learners (Open Access)
Author(s)
Jing Wu,
Suiyao Chen,
Qi Zhao,
Renat Sergazinov,
Chen Li,
Shengjie Liu,
Chongchao Zhao,
Tianpei Xie,
Hanqing Guo,
Cheng Ji,
Daniel Cociorva,
Hakan Brunzel
Publication year: 2024
Self-supervised representation learning methods have achieved significant success in computer vision and natural language processing, where data samples exhibit explicit spatial or semantic dependencies. However, applying these methods to tabular data is challenging due to the less pronounced dependencies among data samples. In this paper, we address this limitation by introducing SwitchTab, a novel self-supervised method specifically designed to capture latent dependencies in tabular data. SwitchTab leverages an asymmetric encoder-decoder framework to decouple mutual and salient features among data pairs, resulting in more representative embeddings. These embeddings, in turn, contribute to better decision boundaries and lead to improved results in downstream tasks. To validate the effectiveness of SwitchTab, we conduct extensive experiments across various domains involving tabular data. The results showcase superior performance in end-to-end prediction tasks with fine-tuning. Moreover, we demonstrate that pre-trained salient embeddings can be utilized as plug-and-play features to enhance the performance of various traditional classification methods (e.g., Logistic Regression, XGBoost, etc.). Lastly, we highlight the capability of SwitchTab to create explainable representations through visualization of decoupled mutual and salient features in the latent space.
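The feature-switching idea in the abstract — encoding each sample of a pair into a mutual and a salient part, then reconstructing each sample both from its own parts and from the partner's mutual part — can be sketched with untrained linear maps. This is a minimal illustrative sketch, not the paper's implementation: all dimensions, weight matrices, and the simple linear encoder/decoder here are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions chosen for illustration (not from the paper).
d_in, d_emb = 8, 4

# One shared encoder plus two projectors that split its output into a
# "mutual" part (structure shared across samples) and a "salient" part
# (sample-specific information).
W_enc = rng.normal(size=(d_in, d_emb))
W_mut = rng.normal(size=(d_emb, d_emb))
W_sal = rng.normal(size=(d_emb, d_emb))
W_dec = rng.normal(size=(2 * d_emb, d_in))  # decoder sees [mutual, salient]

def encode(x):
    z = x @ W_enc
    return z @ W_mut, z @ W_sal  # (mutual, salient) embeddings

def decode(m, s):
    return np.concatenate([m, s], axis=-1) @ W_dec

# A pair of samples drawn from a (random) data batch.
x1, x2 = rng.normal(size=(2, d_in))
m1, s1 = encode(x1)
m2, s2 = encode(x2)

# Plain reconstructions keep each sample's own parts ...
rec1, rec2 = decode(m1, s1), decode(m2, s2)
# ... while "switched" reconstructions swap the mutual parts across the
# pair; training both paths pressures the mutual embedding to carry only
# information common to the pair, decoupling it from the salient part.
sw1, sw2 = decode(m2, s1), decode(m1, s2)

# A reconstruction loss over all four outputs would drive the decoupling.
loss = sum(np.mean((r - x) ** 2)
           for r, x in [(rec1, x1), (rec2, x2), (sw1, x1), (sw2, x2)])
```

In a real model the linear maps would be neural networks trained to minimize this combined reconstruction loss; the salient embeddings are what the abstract describes as plug-and-play features for downstream classifiers.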
Language(s): English
