
TAPAS Large

Developed by Google
TAPAS is a BERT-like model based on the Transformer architecture, specifically designed for processing tabular data and related text. It is pre-trained through self-supervised learning on a massive collection of English Wikipedia tables and associated text.
Downloads: 211
Release time: 3/2/2022

Model Overview

The TAPAS model learns bidirectional representations of tables and text through masked language modeling and intermediate pre-training, primarily used for downstream tasks such as table question answering or statement verification.
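As a sketch of how this kind of downstream use looks in practice, the snippet below prepares a table for TAPAS-style question answering via the Hugging Face transformers library. The checkpoint name `google/tapas-large-finetuned-wtq` is a fine-tuned variant of this base model, and actually running the query requires network access to download its weights, so the call is left commented out.

```python
import pandas as pd

# TAPAS expects every table cell as a string; numeric columns must be cast.
def prepare_table(rows):
    """Build a TAPAS-ready table from a list of dicts (all cells as str)."""
    return pd.DataFrame(rows).astype(str)

table = prepare_table([
    {"City": "Paris", "Population": 2148000},
    {"City": "Berlin", "Population": 3645000},
])

def answer_question(table, question):
    # Usage sketch: downloads ~1.3 GB of weights on first call.
    from transformers import pipeline
    qa = pipeline("table-question-answering",
                  model="google/tapas-large-finetuned-wtq")
    return qa(table=table, query=question)["answer"]

# answer_question(table, "Which city has the larger population?")
```

Note that casting every cell to `str` is required by the TAPAS tokenizer, which flattens the table into token sequences rather than consuming typed columns.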

Model Features

Joint Table-Text Processing
Capable of simultaneously processing tabular data and related text, learning joint representations that link the two.
Dual Pre-training Objectives
Combines masked language modeling with an intermediate pre-training stage that strengthens numerical reasoning over tables.
Flexible Position Embeddings
Offers both relative position embeddings (default) and absolute position embeddings to accommodate different needs.
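The choice between the two position-embedding schemes is exposed through the model configuration in transformers: the `reset_position_index_per_cell` flag restarts position indices at every cell (relative embeddings, the default) or counts positions across the whole flattened table (absolute embeddings). A minimal sketch:

```python
from transformers import TapasConfig

# Default configuration: relative position embeddings, i.e. the position
# index restarts at the beginning of every table cell.
relative_cfg = TapasConfig()

# Absolute position embeddings: positions run over the flattened table,
# matching the original TAPAS setup before the relative-embedding variant.
absolute_cfg = TapasConfig(reset_position_index_per_cell=False)
```

Relative embeddings are the default because they generalize better to large tables, where absolute positions of identical cells vary with table layout.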

Model Capabilities

Table Data Understanding
Text-Table Association Analysis
Table Question Answering
Statement Verification

Use Cases

Information Retrieval
Table Question Answering System
Extracts or infers answers from tables in response to user questions
Can be used to build efficient table question answering applications
Data Verification
Statement Truthfulness Verification
Verifies whether textual statements are supported by tabular data
Can be used in scenarios such as fact-checking
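For the statement-verification use case, a hedged sketch of the transformers API follows. The TabFact-fine-tuned checkpoint name and the label mapping (index 1 = "supported") are assumptions based on the common setup for this task, and running the function requires network access, so it is defined but not invoked here.

```python
import pandas as pd

# TAPAS requires all table cells as strings.
table = pd.DataFrame({
    "Player": ["Alice", "Bob"],
    "Goals": ["3", "1"],
})

def verify_statement(table, statement):
    # Sketch only: checkpoint name and label order are assumptions;
    # calling this downloads the model weights.
    from transformers import TapasTokenizer, TapasForSequenceClassification
    name = "google/tapas-large-finetuned-tabfact"
    tok = TapasTokenizer.from_pretrained(name)
    model = TapasForSequenceClassification.from_pretrained(name)
    inputs = tok(table=table, queries=[statement], return_tensors="pt")
    logits = model(**inputs).logits
    return "supported" if logits.argmax(-1).item() == 1 else "refuted"

# verify_statement(table, "Alice scored more goals than Bob.")
```

The classifier head outputs one logit per class over the (table, statement) pair, so verification reduces to a binary sequence-classification problem.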