
TAPAS Large Finetuned WTQ

Developed by Google
TAPAS is a table question answering model based on the BERT architecture. It is pre-trained in a self-supervised manner on Wikipedia table data and answers natural language questions about table content.
Downloads: 124.85k
Release Date: 3/2/2022

Model Overview

This model is fine-tuned specifically for the WikiTable Questions (WTQ) task and can understand table structures and answer questions about their content. It is published in both relative and absolute position embedding variants and supports operations such as cell selection and numerical reasoning over table values.
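As a quick illustration, the model can be used through the table-question-answering pipeline in Hugging Face transformers (which also requires pandas). This is a minimal sketch assuming the checkpoint is published as google/tapas-large-finetuned-wtq on the Hugging Face Hub; the table is a made-up example, and all table values must be strings:

```python
from transformers import pipeline

# Load the checkpoint through the table-question-answering pipeline.
tqa = pipeline("table-question-answering", model="google/tapas-large-finetuned-wtq")

# Tables are passed as column -> values mappings; all values must be strings.
table = {
    "Actor": ["Brad Pitt", "Leonardo Di Caprio", "George Clooney"],
    "Number of movies": ["87", "53", "69"],
}

result = tqa(table=table, query="How many movies does Leonardo Di Caprio have?")
print(result["answer"])  # expected output along the lines of "53"
```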

Model Features

Dual Pre-training Mechanism
Combines masked language modeling (MLM) with an intermediate pre-training stage to strengthen numerical reasoning over tables
Multi-task Chained Fine-tuning
Fine-tuned in a chain on the SQA, WikiSQL, and WTQ datasets to improve generalization
Optional Position Embeddings
Offers both relative position embeddings (position indexes reset per cell) and absolute position embeddings; see the loading sketch after this list
Joint Prediction Architecture
Jointly trains a cell selection head and an aggregation head, enabling prediction of both selected cell values and aggregation operations
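A minimal sketch of choosing between the two position embedding variants. The default checkpoint uses relative position indexes (reset per cell); the assumption here, following the upstream repository's naming, is that the absolute position variant is available under a "no_reset" revision:

```python
from transformers import TapasForQuestionAnswering

# Default: relative position embeddings (position index resets per cell).
model_rel = TapasForQuestionAnswering.from_pretrained(
    "google/tapas-large-finetuned-wtq"
)

# Assumption: the absolute position embedding variant is published under
# the "no_reset" revision of the same repository.
model_abs = TapasForQuestionAnswering.from_pretrained(
    "google/tapas-large-finetuned-wtq", revision="no_reset"
)

# The chosen variant is reflected in this config flag.
print(model_rel.config.reset_position_index_per_cell)  # True for the default
```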

Model Capabilities

Table structure understanding
Natural language question parsing
Table cell selection
Numerical comparison and calculation
Aggregation operation prediction (e.g., SUM, COUNT, AVERAGE), as sketched below
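To see cell selection and aggregation prediction together: the pipeline output exposes the selected cells alongside the predicted aggregation operator. The table here is invented for illustration, and the exact answer strings may vary:

```python
from transformers import pipeline

tqa = pipeline("table-question-answering", model="google/tapas-large-finetuned-wtq")

table = {
    "City": ["Paris", "London", "Berlin"],
    "Population (millions)": ["2.1", "8.8", "3.7"],
}

result = tqa(table=table, query="What is the total population of these cities?")
print(result["aggregator"])  # predicted operator, e.g. "SUM"
print(result["cells"])       # values of the selected cells
print(result["answer"])      # combined answer string, e.g. "SUM > 2.1, 8.8, 3.7"
```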

Use Cases

Knowledge Question Answering
Wikipedia Table Question Answering
Answer natural language questions about Wikipedia infoboxes and data tables
Achieved 50.97% accuracy on the WTQ development set
Business Intelligence
Financial Statement Analysis
Parse financial statements and answer queries about revenue, growth, and other metrics
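A hypothetical sketch of the financial use case; the quarterly figures below are fabricated purely to show the query pattern:

```python
from transformers import pipeline

tqa = pipeline("table-question-answering", model="google/tapas-large-finetuned-wtq")

# Fabricated quarterly figures, for illustration only.
statement = {
    "Quarter": ["Q1", "Q2", "Q3", "Q4"],
    "Revenue ($M)": ["120", "135", "150", "170"],
}

print(tqa(table=statement, query="Which quarter had the highest revenue?")["answer"])
print(tqa(table=statement, query="What is the total revenue?")["answer"])
```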