FANformer-1B

Developed by dongyh
FANformer-1B is an autoregressive language model enhanced with an explicit periodicity-modeling mechanism, with 1.1 billion non-embedding parameters trained on 1 trillion tokens.
Release date: 3/20/2025

Model Overview

A decoder-only large language model with enhanced periodicity modeling, suitable for general text generation and comprehension tasks.

Model Features

Periodic Modeling Enhancement
Captures periodic patterns in data through FAN layers, improving learning efficiency and downstream performance.
Efficient Training
Trained on 1 trillion tokens, it outperforms comparably sized models.
Lightweight Design
At 1.1 billion non-embedding parameters, the model maintains strong performance while keeping computational requirements modest.
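The "FAN layer" named above comes from Fourier Analysis Networks, which inject explicit sin/cos features of a learned projection alongside an ordinary activated projection. The sketch below is a minimal NumPy illustration of that general idea, not the model's actual implementation; all names and shapes (`W_p`, `W_pbar`, the ReLU activation) are illustrative assumptions.

```python
import numpy as np

def fan_layer(x, W_p, W_pbar, b):
    """Sketch of a FAN-style layer (illustrative, not FANformer's exact code).

    The input is projected twice: a "periodic" branch whose projection is
    passed through cos and sin, and a conventional branch with a bias and
    a ReLU placeholder activation. The branches are concatenated.

    Shapes: x (d_in,), W_p (d_in, d_p), W_pbar (d_in, d_pbar), b (d_pbar,)
    Output: (2 * d_p + d_pbar,)
    """
    p = x @ W_p                          # projection for the periodic branch
    g = np.maximum(x @ W_pbar + b, 0.0)  # ordinary activated projection
    return np.concatenate([np.cos(p), np.sin(p), g])

# Tiny usage example with random weights
rng = np.random.default_rng(0)
x = rng.normal(size=4)
out = fan_layer(x,
                rng.normal(size=(4, 3)),   # W_p
                rng.normal(size=(4, 5)),   # W_pbar
                np.zeros(5))               # b
print(out.shape)  # (11,) = 2*3 periodic features + 5 ordinary features
```

Because the cos/sin features are bounded and periodic in the projected input, the layer can represent repeating structure that a plain MLP branch would have to approximate piecewise.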

Model Capabilities

Text Generation
Language Understanding
Knowledge Q&A
Logical Reasoning

Use Cases

Text Generation
Academic Writing Assistance
Generates academic-style scientific text.
Scores 72.45% on the arc_easy benchmark.
Educational Applications
Science Q&A System
Answers fundamental STEM questions.
Achieves 94.8% accuracy on the sciq test set.