
TAPAS Tiny

Developed by Google
TAPAS is a Transformer-based table question answering model, pre-trained in a self-supervised manner on English Wikipedia table data. It supports table QA and table entailment tasks.
Downloads: 44
Release date: 3/2/2022

Model Overview

This model adopts a BERT-like architecture, learning joint representations of tables and text through masked language modeling and intermediate pre-training, suitable for downstream fine-tuning on table QA and table entailment tasks.
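The joint table-text representation described above starts by linearizing the table and pairing each token with structural indices. A minimal sketch of that idea (this is an illustration of the concept, not the actual TAPAS tokenizer; the function name and id scheme are assumptions):

```python
# Sketch: flatten a table and prepend the question, tracking row and
# column indices per token so a model can recover the table structure
# from the linear token sequence (TAPAS uses learned row/column
# embeddings for this purpose).

def flatten_table(question, table):
    """table: list of rows, each a list of cell strings.

    Returns parallel lists (tokens, row_ids, col_ids). Question tokens
    get row_id = col_id = 0; cell tokens get 1-based row/column ids.
    """
    tokens, row_ids, col_ids = [], [], []
    for tok in question.split():
        tokens.append(tok.lower())
        row_ids.append(0)
        col_ids.append(0)
    for r, row in enumerate(table, start=1):
        for c, cell in enumerate(row, start=1):
            for tok in cell.split():
                tokens.append(tok.lower())
                row_ids.append(r)
                col_ids.append(c)
    return tokens, row_ids, col_ids

table = [["Player", "Goals"], ["Messi", "12"], ["Ronaldo", "10"]]
tokens, rows, cols = flatten_table("How many goals did Messi score?", table)
```

In the real model these id sequences are mapped to position embeddings and summed with the token embeddings, which is what lets a flat BERT-style encoder reason about cells, rows, and columns.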

Model Features

Dual-objective Pre-training
Combines masked language modeling with an intermediate pre-training stage to strengthen numerical reasoning over tables
Flexible Position Embedding
Offers both relative (default) and absolute position embedding versions
Table Flattening
Flattens table structures into sequences while preserving contextual relationships with associated text
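The masked-language-modeling half of the dual objective above can be sketched on a flattened token sequence. This is a simplified illustration (the 15% rate follows BERT convention; TAPAS masks whole words/cells, which this sketch does not reproduce):

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=0):
    """Sketch of the MLM objective: randomly replace tokens with [MASK]
    and record the originals as prediction targets. Positions that are
    not masked get a None label and are not scored."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            masked.append("[MASK]")
            labels.append(tok)   # the model must predict this token
        else:
            masked.append(tok)
            labels.append(None)  # position excluded from the loss
    return masked, labels

seq = ["player", "goals", "messi", "12", "ronaldo", "10"]
masked, labels = mask_tokens(seq)
```

During pre-training the encoder sees `masked` and is trained to recover each label from the surrounding table and text context.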

Model Capabilities

Table data comprehension
Table question answering
Table entailment
Joint table-text representation learning

Use Cases

Intelligent Document Processing
Financial Statement QA
Automatically answer revenue, profit, and related questions from corporate financial statements
Knowledge Base Systems
Wikipedia Table Query
Parse Wikipedia table content and answer user queries