
TAPAS Base

Developed by Google
A BERT-based table understanding model, pretrained on Wikipedia table data through self-supervised learning, that supports table question answering and statement verification tasks.
Downloads: 2,457
Release Date: 3/2/2022

Model Overview

TAPAS is a BERT-based Transformer model designed specifically for processing tabular data together with related text. Through two pretraining stages, masked language modeling (MLM) followed by intermediate pretraining, the model learns bidirectional representations of tables and text, making it suitable for downstream tasks such as table question answering and statement verification.

Model Features

Dual-Stage Pretraining
Combines masked language modeling (MLM) with intermediate pretraining to enhance numerical reasoning capabilities for tables
Position Embedding Support
Offers two position-embedding variants: relative positions that reset at each cell (the default) and absolute positions (load with revision='no_reset')
Table Flattening Processing
Flattens table structures into sequences for combined processing with contextual text
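The flattening and position-embedding ideas above can be sketched in plain Python. This is an illustrative toy, not the official TAPAS tokenizer: it splits on whitespace instead of using WordPiece, and the function name and index conventions are assumptions for the sketch. It shows how a query and table become one token sequence tagged with row and column indices, and how "relative" position ids reset to 0 at the start of every cell while "absolute" ids run continuously.

```python
# Toy sketch of TAPAS-style table flattening (not the real tokenizer).
def flatten_table(query, table, reset_position_per_cell=True):
    """Flatten a query plus a table (list of rows; first row is the header)
    into parallel lists: tokens, row ids, column ids, position ids.
    Query tokens get row id 0 and column id 0."""
    tokens, row_ids, col_ids, pos_ids = [], [], [], []
    abs_pos = 0
    # Query tokens come first.
    for word in query.split():
        tokens.append(word)
        row_ids.append(0)
        col_ids.append(0)
        pos_ids.append(abs_pos)
        abs_pos += 1
    # Table cells follow, row by row; row 0 is the header row.
    for r, row in enumerate(table):
        for c, cell in enumerate(row):
            for i, word in enumerate(str(cell).split()):
                tokens.append(word)
                row_ids.append(r)
                col_ids.append(c + 1)  # columns are 1-indexed; 0 means query
                # Relative positions restart at 0 inside every cell;
                # absolute positions keep counting across the sequence.
                pos_ids.append(i if reset_position_per_cell else abs_pos)
                abs_pos += 1
    return tokens, row_ids, col_ids, pos_ids
```

For the query "how many people" over a two-row table, the multi-word cell "2.1 million" gets relative position ids 0 and 1, while in absolute mode those same tokens continue the global count.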

Model Capabilities

Table Data Understanding
Table Question Answering
Statement Verification
Joint Table-Text Representation Learning

Use Cases

Intelligent Q&A
Table Data Question Answering
Answers user questions based on table content
Handles questions that require reasoning over table data, such as selecting and aggregating cell values
Information Verification
Table Statement Verification
Verifies whether textual statements are supported by table data
Classifies a statement as supported or refuted given the table contents
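Statement verification with TAPAS (e.g. the TabFact fine-tuned variant) is binary classification: the model labels a statement as supported or refuted by the table. The checker below is a toy illustration of that task for simple "entity's column is value" claims via direct lookup; it is not the model, and the function name and dict-row table format are assumptions for the sketch.

```python
# Toy sketch: verify a simple claim of the form "<entity>'s <column> is <value>"
# against a table given as a list of dict rows.
def verify(table, entity, column, claimed_value):
    """Return True if the claim is supported by the table, False if refuted
    or if the entity does not appear in any row."""
    for row in table:
        # Match the row by any cell equal to the entity name.
        if entity in row.values():
            return str(row.get(column)) == str(claimed_value)
    return False  # entity absent: the claim is unsupported
```

For a table `[{"city": "Paris", "population": "2.1M"}]`, the claim "Paris's population is 2.1M" is supported, while "Paris's population is 3M" is refuted.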