Erlangshen DeBERTa V2 710M Chinese
Apache-2.0
This is a 710M-parameter DeBERTa-v2 model for Chinese natural language understanding (NLU) tasks. It is pre-trained with whole-word masking (WWM), providing strong support for Chinese NLP.
Tags: Large Language Model, Transformers, Chinese
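Since the model is pre-trained with whole-word masking, a natural way to try it is masked-word prediction. The sketch below is a minimal, hedged example using the Hugging Face `transformers` library; the Hub repository ID `IDEA-CCNL/Erlangshen-DeBERTa-v2-710M-Chinese` is an assumption inferred from the model name, and the example sentence is illustrative only.

```python
# Minimal fill-mask sketch for this model. Assumptions: the Hub ID below
# matches this card, and the checkpoint exposes a masked-LM head.
from transformers import AutoModelForMaskedLM, AutoTokenizer, FillMaskPipeline

model_id = "IDEA-CCNL/Erlangshen-DeBERTa-v2-710M-Chinese"  # assumed Hub ID

# use_fast=False: the slow tokenizer is a safe default for Chinese
# DeBERTa-v2 checkpoints that ship a SentencePiece/BERT-style vocab.
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForMaskedLM.from_pretrained(model_id)

fill_mask = FillMaskPipeline(model, tokenizer)

# Illustrative input: "The true meaning of life is [MASK]."
text = "生活的真谛是[MASK]。"
for candidate in fill_mask(text, top_k=5):
    # Each candidate carries the predicted token and its probability.
    print(candidate["token_str"], candidate["score"])
```

Downloading the 710M checkpoint happens on first use; subsequent calls read from the local Hugging Face cache.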