
TAPAS Base Fine-tuned on WikiSQL (Supervised)

Developed by Google
TAPAS is a BERT-based Transformer model designed for table question answering. It is pre-trained in a self-supervised manner on English Wikipedia tables and associated text, and this checkpoint is additionally fine-tuned on the WikiSQL dataset in a supervised fashion.
Downloads: 737
Released: 3/2/2022

Model Overview

This model learns bidirectional representations of tables and associated text through masked language modeling and an intermediate pre-training stage. It is suited to table-based question answering, supporting both cell selection and aggregation operations (such as SUM, AVERAGE, and COUNT).

Model Features

Two-stage Pre-training: combines masked language modeling with an intermediate pre-training stage to strengthen numerical reasoning over tables.
Relative Position Embeddings: resets position indices at the start of each table cell, making table structure easier to model.
Joint Fine-tuning: jointly trains the cell-selection head and the aggregation head on the SQA and WikiSQL datasets.
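The relative position embedding idea above can be sketched in a few lines. This is a simplified illustration of the position-index reset (an assumption for clarity, not the library's actual implementation): within each table cell, token positions restart from zero instead of continuing a single global sequence index.

```python
# Sketch: TAPAS-style cell-relative position indices.
# Token positions restart at 0 for every table cell, so the model
# sees cell-local positions rather than one flat sequence position.

def cell_relative_positions(cells):
    """cells: list of token lists, one inner list per table cell."""
    positions = []
    for cell in cells:
        positions.extend(range(len(cell)))  # reset to 0 at each new cell
    return positions

# Two cells: ["San", "Francisco"] and ["California"]
print(cell_relative_positions([["San", "Francisco"], ["California"]]))
# -> [0, 1, 0]
```

Because indices depend only on a token's position within its own cell, the same embedding table generalizes across tables of different widths and lengths.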

Model Capabilities

Table Question Answering
Table Content Parsing
Cell Selection
Numerical Aggregation (SUM, AVERAGE, COUNT)
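The last two capabilities work together: the cell-selection head picks a set of cells, and the aggregation head picks an operator to apply to them. A minimal sketch of that two-head combination, using hypothetical values rather than real model outputs:

```python
# Sketch: combining TAPAS's two prediction heads.
# The cell-selection head yields a list of cell values; the
# aggregation head yields one of a small set of operators.

AGGREGATORS = {
    "NONE": lambda vals: vals,                    # answer is the cells themselves
    "SUM": lambda vals: sum(vals),
    "AVERAGE": lambda vals: sum(vals) / len(vals),
    "COUNT": lambda vals: len(vals),
}

def answer(selected_cells, aggregation):
    return AGGREGATORS[aggregation](selected_cells)

# e.g. "What is the total revenue of Q1 and Q2?" ->
# selected cells [120.0, 95.5], predicted aggregation "SUM"
print(answer([120.0, 95.5], "SUM"))    # -> 215.5
print(answer([120.0, 95.5], "COUNT"))  # -> 2
```

The NONE operator covers lookup-style questions where the selected cells are returned verbatim, while SUM, AVERAGE, and COUNT cover the numerical questions that plain span extraction cannot answer.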

Use Cases

Business Intelligence
Financial Statement Analysis
Automatically answers natural language questions about financial statement data
Can accurately extract specific metrics and perform aggregation operations like summation
Data Querying
Natural Language Interface for Databases
Converts natural language questions into table query operations
Supports SQL-like query functionality without requiring SQL knowledge
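To make the "no SQL required" idea concrete, here is a toy natural-language interface over a small table. It uses simple keyword matching purely for illustration (the real model uses learned representations, and the table and questions below are invented):

```python
# Toy sketch: answering questions over a table without the user
# writing SQL. Keyword matching stands in for the model's learned
# question understanding.

TABLE = [
    {"city": "Paris", "population": 2.1},
    {"city": "London", "population": 8.9},
]

def query(question):
    q = question.lower()
    if "how many" in q:              # COUNT-style question
        return len(TABLE)
    for row in TABLE:                # lookup-style question
        if row["city"].lower() in q:
            return row["population"]
    return None

print(query("What is the population of London?"))  # -> 8.9
print(query("How many cities are listed?"))        # -> 2
```

In the fine-tuned model, the same kinds of questions are answered by selecting cells and (optionally) aggregating them, which is what gives the SQL-like behavior without an explicit query language.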