Speech Recognition using Biologically-Inspired Neural Networks
Thomas Bohnstingl, Ayush Garg, et al.
ICASSP 2022
Transformer-based language models have become the de facto standard in natural language processing. However, they underperform on tabular data compared to traditional tree-based methods. We posit that current approaches fail to realize the full potential of language models in this domain due to (i) the heterogeneity of tabular data and (ii) the difficulty models face in interpreting numerical values. Based on this hypothesis, we propose the Tabular Domain Transformer (TDTransformer) framework. TDTransformer uses a distinct embedding process for each column type, and alignment layers transform these type-specific embeddings into a common space. In addition, TDTransformer adapts piece-wise linear encoding to numerical values, which further improves performance. We evaluate the proposed method on 76 real-world tabular classification datasets from the OpenML benchmark. Extensive experiments show that TDTransformer improves over state-of-the-art methods.
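The abstract describes the architecture only at a high level. The PyTorch sketch below illustrates one plausible reading of it: a distinct embedding module per column type, per-type alignment layers that project into a shared token space, and piece-wise linear encoding (PLE) for numerical columns. The paper's implementation is not reproduced here, so all module names, dimensions, and the PLE formulation (which follows the bin-based scheme of Gorishniy et al., 2022) are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn


class PiecewiseLinearEncoding(nn.Module):
    """Encode a scalar feature against fixed bin edges (e.g. training-set quantiles)."""

    def __init__(self, bin_edges):
        super().__init__()
        edges = torch.as_tensor(bin_edges, dtype=torch.float32)
        self.register_buffer("lo", edges[:-1])
        self.register_buffer("hi", edges[1:])

    def forward(self, x):
        # x: (batch,). Each output component is the fractional position of x
        # inside one bin: 1 above the bin, 0 below it, linear in between.
        enc = (x.unsqueeze(-1) - self.lo) / (self.hi - self.lo)
        return enc.clamp(0.0, 1.0)  # (batch, n_bins)


class TabularEncoder(nn.Module):
    """Per-column-type embeddings + alignment layers + transformer backbone."""

    def __init__(self, cat_cardinalities, num_bin_edges, d_model=64):
        super().__init__()
        # Distinct embedding process for each column type.
        self.cat_embeds = nn.ModuleList(
            nn.Embedding(card, d_model) for card in cat_cardinalities
        )
        self.num_encoders = nn.ModuleList(
            PiecewiseLinearEncoding(edges) for edges in num_bin_edges
        )
        # Alignment layers: map each column type into the common d_model space.
        self.cat_align = nn.Linear(d_model, d_model)
        self.num_align = nn.ModuleList(
            nn.Linear(len(edges) - 1, d_model) for edges in num_bin_edges
        )
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x_cat, x_num):
        # x_cat: (batch, n_cat) int64, x_num: (batch, n_num) float32.
        tokens = [self.cat_align(emb(x_cat[:, i]))
                  for i, emb in enumerate(self.cat_embeds)]
        tokens += [align(enc(x_num[:, j]))
                   for j, (enc, align) in enumerate(zip(self.num_encoders, self.num_align))]
        # One token per column, contextualized by the transformer.
        return self.backbone(torch.stack(tokens, dim=1))  # (batch, n_cols, d_model)


# Hypothetical usage: two categorical columns, one numerical column whose
# bin edges would come from training-data quantiles.
model = TabularEncoder(cat_cardinalities=[5, 12],
                       num_bin_edges=[[0.0, 1.0, 2.0, 5.0]])
x_cat = torch.stack([torch.randint(0, 5, (8,)), torch.randint(0, 12, (8,))], dim=1)
out = model(x_cat, torch.rand(8, 1) * 5.0)  # (8, 3, 64)
```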