
TAPAS Base Fine-tuned on TabFact

Developed by Google
TAPAS is a BERT-like model based on the Transformer architecture and designed specifically for processing tabular data. It is pre-trained in a self-supervised manner on English Wikipedia tables and fine-tuned on the TabFact dataset to determine whether a sentence is supported or refuted by the contents of a table.
Downloads: 6,669
Release date: 3/2/2022

Model Overview

This model determines whether a sentence is supported or refuted by the contents of a table, and can be used for table-based question answering or for judging the entailment relationship between a sentence and a table.
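The snippet below is a minimal sketch of this verification task using the Hugging Face transformers library together with pandas and PyTorch; the checkpoint name google/tapas-base-finetuned-tabfact, the example table, and the claim are illustrative assumptions, not part of this card.

```python
import pandas as pd
import torch
from transformers import TapasTokenizer, TapasForSequenceClassification

model_name = "google/tapas-base-finetuned-tabfact"  # assumed checkpoint id
tokenizer = TapasTokenizer.from_pretrained(model_name)
model = TapasForSequenceClassification.from_pretrained(model_name)

# TAPAS expects the table as a pandas DataFrame whose cells are strings.
table = pd.DataFrame(
    {
        "City": ["Paris", "Berlin", "Madrid"],
        "Population (millions)": ["2.1", "3.7", "3.3"],
    }
)
claim = "Berlin has the largest population of the three cities."

inputs = tokenizer(table=table, queries=claim, padding="max_length", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# The fine-tuned head is a binary classifier; the label names are read from
# the checkpoint configuration rather than hard-coded here.
predicted = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted])
```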

Model Features

Dual Pre-training Objectives
Combines masked language modeling (MLM) with an intermediate pre-training step that strengthens numerical reasoning over tables.
Relative Position Embeddings
The default variant uses relative position embeddings, i.e. the position index is reset at every cell of the table, which improves how the model handles table structure (a configuration sketch follows this list).
Synthetic Training Samples
The intermediate pre-training phase uses millions of synthetically generated training examples to build a balanced dataset, improving the model's generalization ability.
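As a rough illustration of the position-embedding setting mentioned above, the sketch below reads the corresponding flag from the model configuration via the Hugging Face transformers library; the attribute name reset_position_index_per_cell comes from TapasConfig, and switching it off is shown only as an experiment, not a recommendation.

```python
from transformers import TapasConfig, TapasForSequenceClassification

# The default (relative) variant resets the position index at every table cell.
config = TapasConfig.from_pretrained("google/tapas-base-finetuned-tabfact")
print(config.reset_position_index_per_cell)  # expected: True

# Illustrative only: disable the per-cell reset to mimic absolute position
# embeddings before loading the weights.
config.reset_position_index_per_cell = False
model = TapasForSequenceClassification.from_pretrained(
    "google/tapas-base-finetuned-tabfact", config=config
)
```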

Model Capabilities

Table Fact Verification
Table-based Question Answering
Entailment Judgment

Use Cases

Data Validation
Table Content Verification
Verify whether a given sentence is supported or refuted by the contents of a table (a batch-verification sketch appears at the end of this section).
Intelligent Question Answering
Table-based Question Answering
Answer natural language questions based on table content.
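For the verification use case above, the following sketch (again an illustration assuming the transformers and pandas packages and a made-up table) checks several sentences against one table in a single batch.

```python
import pandas as pd
import torch
from transformers import TapasTokenizer, TapasForSequenceClassification

model_name = "google/tapas-base-finetuned-tabfact"  # assumed checkpoint id
tokenizer = TapasTokenizer.from_pretrained(model_name)
model = TapasForSequenceClassification.from_pretrained(model_name)

# One table, several claims: the tokenizer builds one input per claim.
table = pd.DataFrame({"Player": ["Ann", "Bo"], "Goals": ["12", "7"]})
claims = [
    "Ann scored 12 goals.",
    "Bo scored more goals than Ann.",
]

inputs = tokenizer(table=table, queries=claims, padding="max_length", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the predicted label (supported/refuted) for each claim.
for claim, pred in zip(claims, logits.argmax(dim=-1).tolist()):
    print(f"{claim} -> {model.config.id2label[pred]}")
```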