JPharmatron-7B-base
Developed by EQUES
JPharmatron-7B-base is a 7-billion-parameter large language model for Japanese and English, designed specifically for pharmaceutical applications and research.
Downloads: 104
Released: 4/1/2025
Model Overview
This model is based on the Qwen2.5-7B architecture and was continually pre-trained on 2 billion tokens from Japanese datasets, focusing on natural language processing tasks in the pharmaceutical field.
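For reference, here is a minimal loading-and-completion sketch using the Hugging Face transformers library. The repo id EQUES/JPharmatron-7B-base is an assumption based on the developer and model names, not confirmed by this page; check the official model repository for the exact identifier.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EQUES/JPharmatron-7B-base"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 7B weights take roughly 14 GB at bf16
    device_map="auto",
)

# This is a base (non-instruct) model, so use plain text completion
# rather than a chat template.
prompt = "アスピリンの主な薬理作用は"  # "The main pharmacological action of aspirin is ..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))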
Model Features
Domain targeting
Designed specifically for pharmaceutical applications and research, with domain-specific optimizations.
Multilingual support
Supports Japanese and English, suitable for cross-language pharmaceutical research.
Continual pre-training
Built on Qwen2.5-7B and continually pre-trained on 2 billion Japanese pharmaceutical-domain tokens.
Model Capabilities
Text understanding in the pharmaceutical field
Cross-language term standardization
Pharmaceutical knowledge Q&A
Pharmaceutical document analysis
Use Cases
Pharmaceutical research
Pharmacist qualification exam Q&A
A Q&A system based on questions from the Japanese national pharmacist qualification exam
Performs strongly on the YakugakuQA benchmark
Cross-language term standardization
Normalizes pharmaceutical synonyms and terms between Japanese and English
Shows competitive results on the NayoseQA benchmark
Statement consistency verification
Evaluates consistency reasoning between paired statements
Outperforms some commercial models on the SogoCheck task
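To make the cross-language term standardization use case concrete, the sketch below (reusing the model and tokenizer loaded in the example above) builds a few-shot completion prompt. The prompt format and term pairs are hypothetical illustrations, not taken from the NayoseQA benchmark.

# Few-shot prompt mapping English drug names to standard Japanese terms.
# The pairs below are illustrative; the model is asked to continue the pattern.
prompt = (
    "英語の医薬品用語を日本語の標準表記に変換する。\n"
    "English: acetaminophen -> 日本語: アセトアミノフェン\n"
    "English: ibuprofen -> 日本語:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=16, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))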