
Code Trans T5 Large Source Code Summarization Python Multitask Finetune

Developed by SEBIS
A pretrained model based on the T5-large architecture, designed for Python code summarization with multi-task learning support
Downloads 78
Release Time: 3/2/2022

Model Overview

This model generates natural-language descriptions of Python functions and can also be fine-tuned for other code-related tasks; it performs best on tokenized Python code
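
A minimal usage sketch with the Hugging Face transformers library is shown below. The model identifier is inferred from this page's title and SEBIS's naming convention on the Hugging Face Hub, so treat it as an assumption; the input must be pre-tokenized Python code.

```python
# Minimal sketch, assuming the model is published on the Hugging Face Hub under
# the identifier below (inferred from the page title, not confirmed here).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "SEBIS/code_trans_t5_large_source_code_summarization_python_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The model works best on tokenized Python code (space-separated tokens).
tokenized_code = "def add ( a , b ) : return a + b"
inputs = tokenizer(tokenized_code, return_tensors="pt")
summary_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```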

Model Features

Multi-task pretraining
Pretrained with multi-task learning on 13 supervised tasks and 7 unsupervised datasets, improving generalization across code-related tasks
Tokenization optimization
Optimized for tokenized Python code; it performs best when functions are supplied as space-separated token streams (see the tokenization sketch after this list)
Large-scale training
Pretrained for 500,000 steps with multi-task learning and then fine-tuned for the target summarization task on TPU hardware
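
One way to produce the space-separated token format is Python's standard tokenize module; the sketch below is illustrative only, and the exact tokenizer used to prepare the CodeTrans training data is an assumption, not documented on this page.

```python
# Minimal sketch: whitespace-tokenize a Python function before summarization.
# The original CodeTrans preprocessing pipeline may differ from this.
import io
import tokenize

def python_to_token_string(source: str) -> str:
    """Flatten Python source into a single space-separated token string."""
    tokens = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        # Drop purely structural tokens that carry no surface text.
        if tok.type in (tokenize.NEWLINE, tokenize.NL, tokenize.INDENT,
                        tokenize.DEDENT, tokenize.ENDMARKER):
            continue
        tokens.append(tok.string)
    return " ".join(tokens)

source = "def add(a, b):\n    return a + b\n"
print(python_to_token_string(source))
# -> def add ( a , b ) : return a + b
```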

Model Capabilities

Python code summarization
Automatic source code documentation generation
Code understanding and analysis

Use Cases

Software development
Automatic function documentation
Automatically generates descriptive documentation for Python functions
Reports a BLEU score of 13.37 on the Python test set (see the evaluation sketch after this list)
Code review assistance
Helps developers quickly understand code logic through summarization
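
The BLEU figure above can be reproduced in spirit with a corpus-level BLEU scorer; the sketch below uses sacrebleu, but whether the reported 13.37 was computed with this exact BLEU variant is an assumption, and the example strings are hypothetical.

```python
# Minimal sketch of scoring generated summaries against reference docstrings.
import sacrebleu

# Hypothetical model outputs and matching reference descriptions.
hypotheses = [
    "open the file and return its contents",
    "print a message and exit with the given code",
]
references = [[
    "opens a file and returns its contents",
    "prints a message and exits with the given exit code",
]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```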