OpenTab: Advancing Large Language Models as Open-domain Table Reasoners
Large Language Models (LLMs) trained on large volumes of data excel at various natural language tasks, but they cannot handle tasks requiring knowledge that they have not been trained on. One solution is to use a retriever that fetches relevant information to expand the LLM's knowledge scope. However, existing text-oriented retrieval-based LLMs are not ideal on structured table data due to diversified data modalities and large table sizes. In this work, we propose OpenTab, an open-domain table reasoning framework powered by LLMs. Overall, OpenTab leverages a table retriever to fetch relevant tables and then generates SQL programs to parse the retrieved tables efficiently. Utilizing the intermediate data derived from the SQL executions, it conducts grounded inference to produce an accurate response. Extensive experimental evaluation shows that OpenTab significantly outperforms baselines in both open- and closed-domain settings, achieving up to 21.5% higher accuracy.
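The retrieve-then-SQL pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the retriever and the SQL-generating LLM are stubbed out with hypothetical functions (`retrieve_table`, `generate_sql`), and a small in-memory SQLite table stands in for a retrieved open-domain table.

```python
import sqlite3

def retrieve_table(question: str):
    """Stub for the table retriever: returns the name and rows of the
    table judged most relevant to the question (hardcoded here)."""
    rows = [("Berlin", 3645000), ("Hamburg", 1841000)]
    return "cities", rows

def generate_sql(question: str, table_name: str) -> str:
    """Stub for the LLM step: emits a SQL program that parses the
    retrieved table (a real system would prompt an LLM with the
    table schema and the question)."""
    return f"SELECT name FROM {table_name} ORDER BY population DESC LIMIT 1"

def answer(question: str) -> str:
    table_name, rows = retrieve_table(question)
    conn = sqlite3.connect(":memory:")
    conn.execute(f"CREATE TABLE {table_name} (name TEXT, population INTEGER)")
    conn.executemany(f"INSERT INTO {table_name} VALUES (?, ?)", rows)
    # Execute the generated SQL; the intermediate result grounds the
    # final response instead of relying on the LLM's parametric memory.
    result = conn.execute(generate_sql(question, table_name)).fetchall()
    conn.close()
    return result[0][0]

print(answer("Which city has the largest population?"))  # -> Berlin
```

In the full framework the executed result would be handed back to the LLM for grounded inference; here it is returned directly for simplicity.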
Further reading
- Access the paper on arXiv.org