Original Transformer
Developed by ubaada
A PyTorch implementation of the original Transformer architecture from the 2017 paper 'Attention Is All You Need': a 65-million-parameter base model trained specifically for English-to-German translation.
Downloads 26
Release Time: 11/4/2024
Model Overview
This is a custom HuggingFace adaptation of the original Transformer architecture, intended primarily for English-to-German machine translation.
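For orientation, the sketch below shows how a custom checkpoint like this is typically loaded through the transformers library. The repo id ubaada/original-transformer and the use of the seq2seq auto-class are assumptions for illustration only; custom architectures generally require trust_remote_code=True, and the exact loading code should be taken from the model card on the Hub.

    # Minimal loading sketch. Assumptions: the repo id, and that the checkpoint
    # registers its custom architecture with the HuggingFace auto-classes.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    repo_id = "ubaada/original-transformer"  # hypothetical repo id for illustration

    # trust_remote_code=True lets transformers import the custom model code
    # shipped with the repository instead of a built-in architecture.
    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForSeq2SeqLM.from_pretrained(repo_id, trust_remote_code=True)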
Model Features
Original Transformer implementation
Strictly follows the original architecture design from the 2017 paper 'Attention Is All You Need'
English-German translation optimization
Specifically trained and optimized for English-to-German translation tasks
Medium-scale parameters
A 65-million-parameter base configuration that balances translation quality and efficiency (see the parameter-count sketch below)
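The 65M figure matches the base configuration reported in the 2017 paper (d_model = 512, 6 encoder and 6 decoder layers, 8 attention heads, d_ff = 2048, and a shared BPE vocabulary of roughly 37,000 tokens for WMT14 En-De). The rough estimate below ignores biases and layer norms; it is only meant to show where those parameters come from.

    # Back-of-the-envelope parameter count for the base Transformer from
    # "Attention Is All You Need" (2017). Layer norms and biases are omitted,
    # so the total is approximate.
    d_model, d_ff, n_layers, vocab = 512, 2048, 6, 37000

    embeddings = vocab * d_model               # shared input/output embedding matrix
    attn_block = 4 * d_model * d_model         # W_q, W_k, W_v, W_o projections
    ffn_block  = 2 * d_model * d_ff            # two position-wise linear layers

    encoder = n_layers * (attn_block + ffn_block)        # self-attention + FFN
    decoder = n_layers * (2 * attn_block + ffn_block)    # self-attn + cross-attn + FFN

    total = embeddings + encoder + decoder
    print(f"~{total / 1e6:.0f}M parameters")   # ~63M, in line with the ~65M reported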
Model Capabilities
English-to-German text translation
German-to-English text translation
Use Cases
Machine translation
Daily expression translation
Translate English daily expressions into German
Example: 'This is my cat' → 'Das ist meine Katze.'
Document translation
Translate English documents into German
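A minimal end-to-end translation sketch, under the same assumptions as the loading example above (hypothetical repo id, custom code exposed through the auto-classes); the generation settings are illustrative defaults, not values from the model card.

    # Translation sketch; repo id and auto-class support are assumptions.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    repo_id = "ubaada/original-transformer"  # hypothetical repo id
    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForSeq2SeqLM.from_pretrained(repo_id, trust_remote_code=True)

    inputs = tokenizer("This is my cat", return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
    # Expected output per the example above: "Das ist meine Katze."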