grc_odycy_joint_trf
General-purpose natural language processing pipeline for Ancient Greek, supporting multiple tasks including POS tagging and dependency parsing
Downloads: 76
Release Time: 4/3/2023
Model Overview
odyCy is a spaCy-based natural language processing pipeline for Ancient Greek, supporting various tasks such as POS tagging, dependency parsing, and lemmatization. This model achieves state-of-the-art performance on multiple tasks in the Universal Dependencies Perseus treebank.
Model Features
Multi-task Processing
Supports various NLP tasks including POS tagging, dependency parsing, and lemmatization
High Performance
Achieves state-of-the-art performance on multiple tasks in the Universal Dependencies Perseus treebank
Stable Performance
Demonstrates relatively stable performance across different evaluation datasets
Transformer-based
Utilizes transformer architecture for more accurate semantic understanding
Model Capabilities
POS tagging
Dependency parsing
Lemmatization
Morphological feature analysis
Sentence boundary detection
Use Cases
Classical Literature Research
Ancient Greek Text Analysis
Used for analyzing the grammatical structure of ancient Greek texts such as Homeric epics
Accurately identifies POS, grammatical relations, and morphological variations
Language Education
Ancient Greek Learning Aid
Helps students understand syntactic structures of Ancient Greek
Provides accurate POS tagging and grammatical analysis
🚀 grc_odycy_joint_trf
A general-purpose NLP pipeline for Ancient Greek, achieving state-of-the-art performance on multiple NLP tasks.
🚀 Quick Start
Check out our Documentation on Basic Usage. You can also Open in Colab.
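Basic usage follows the standard spaCy API. Below is a minimal sketch, assuming the pipeline package has been installed as described in the documentation; the `try/except` lets the snippet degrade gracefully in environments where spaCy or the model package is missing, and the sample sentence (the opening of the Iliad) is only an illustration:

```python
def describe(doc):
    """One analysis row per token: surface form, lemma, coarse POS, dependency label."""
    return [(t.text, t.lemma_, t.pos_, t.dep_) for t in doc]

try:
    import spacy

    # Load the installed odyCy pipeline and analyse a Homeric line.
    nlp = spacy.load("grc_odycy_joint_trf")
    for row in describe(nlp("μῆνιν ἄειδε θεὰ Πηληϊάδεω Ἀχιλῆος")):
        print(*row, sep="\t")
except (ImportError, OSError):
    # spaCy or the grc_odycy_joint_trf package is not installed here.
    pass
```

Each row exposes the attributes that the tagger, lemmatizers, and parser components fill in; morphological features are likewise available via `token.morph`.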
✨ Features
odyCy achieves state-of-the-art performance on multiple tasks on unseen test data from the Universal Dependencies Perseus treebank, and ranks second on even more tasks on the PROIEL treebank's test set. Its performance also appears relatively stable across the two evaluation datasets compared with other NLP pipelines.
For plots and tables on odyCy's performance, check out the Documentation page on Performance.
📚 Documentation
Model Information
| Property | Details |
|---|---|
| Model Type | grc_odycy_joint_trf |
| Version | 0.7.0 |
| spaCy | >=3.7.4,<3.8.0 |
| Default Pipeline | transformer, tagger, morphologizer, parser, trainable_lemmatizer, frequency_lemmatizer |
| Components | transformer, tagger, morphologizer, parser, trainable_lemmatizer, frequency_lemmatizer |
| Vectors | 0 keys, 0 unique vectors (0 dimensions) |
| Sources | n/a |
| License | MIT |
| Author | Jan Kostkan, Márton Kardos |
Label Scheme
2299 labels for 3 components
Component | Labels |
---|---|
tagger |
--------- , --p---fa- , --s---ma- , -3paia--- , -3paim--- , -3siia--- , A- , C- , Df , Dq , Du , F- , G- , I- , Ma , Mo , Nb , Ne , Pc , Pd , Pi , Pk , Pp , Pr , Ps , Px , R- , S- , V- , a-------- , a-------s , a-d---fa- , a-d---fd- , a-d---fg- , a-d---fn- , a-d---ma- , a-d---md- , a-d---mg- , a-d---mn- , a-d---mnc , a-d---mv- , a-d---na- , a-d---ng- , a-d---nn- , a-p----dc , a-p---fa- , a-p---fac , a-p---fas , a-p---fd- , a-p---fdc , a-p---fds , a-p---fg- , a-p---fgc , a-p---fn- , a-p---fnc , a-p---fns , a-p---fv- , a-p---m-- , a-p---m-c , a-p---ma- , a-p---mac , a-p---mas , a-p---md- , a-p---mdc , a-p---mds , a-p---mg- , a-p---mgc , a-p---mgs , a-p---mn- , a-p---mnc , a-p---mns , a-p---mv- , a-p---mvs , a-p---na- , a-p---nac , a-p---nas , a-p---nd- , a-p---ndc , a-p---nds , a-p---ng- , a-p---ngs , a-p---nn- , a-p---nnc , a-p---nns , a-p---nv- , a-s----d- , a-s----dc , a-s----g- , a-s----gc , a-s---fa- , a-s---fac , a-s---fas , a-s---fd- , a-s---fds , a-s---fg- , a-s---fgc , a-s---fgs , a-s---fn- , a-s---fnc , a-s---fns , a-s---fv- , a-s---m-- , a-s---ma- , a-s---mac , a-s---mas , a-s---md- , a-s---mdc , a-s---mds , a-s---mg- , a-s---mgc , a-s---mgs , a-s---mn- , a-s---mnc , a-s---mns , a-s---mv- , a-s---mvc , a-s---mvs , a-s---na- , a-s---nac , a-s---nas , a-s---nd- , a-s---ndc , a-s---nds , a-s---ng- , a-s---nn- , a-s---nnc , a-s---nns , a-s---nv- , a-s---nvs , c-------- , d-------- , d-------c , d-------s , g-------- , i-------- , l-------- , l-d---fa- , l-d---fg- , l-d---mg- , l-d---mn- , l-d---na- , l-d---nn- , l-p---fa- , l-p---fd- , l-p---fg- , l-p---fn- , l-p---ma- , l-p---md- , l-p---mg- , l-p---mn- , l-p---na- , l-p---nd- , l-p---ng- , l-p---nn- , l-s---fa- , l-s---fd- , l-s---fg- , l-s---fn- , l-s---ma- , l-s---md- , l-s---mg- , l-s---mn- , l-s---na- , l-s---nd- , l-s---ng- , l-s---nn- , m-------- , m-p---m-- , m-p---md- , m-p---nn- , n-----fg- , n-----na- , n-----nn- , n-d----a- , n-d---fa- , n-d---fd- , n-d---fg- , n-d---fn- , n-d---ma- , n-d---md- , 
n-d---mg- , n-d---mn- , n-d---mv- , n-d---na- , n-d---nn- , n-p----d- , n-p----g- , n-p---fa- , n-p---fd- , n-p---fg- , n-p---fn- , n-p---fv- , n-p---ma- , n-p---md- , n-p---mg- , n-p---mn- , n-p---mv- , n-p---na- , n-p---nd- , n-p---ng- , n-p---nn- , n-p---nv- , n-s----d- , n-s----g- , n-s----n- , n-s----v- , n-s---fa- , n-s---fd- , n-s---fg- , n-s---fn- , n-s---fv- , n-s---m-- , n-s---ma- , n-s---md- , n-s---mg- , n-s---mn- , n-s---mv- , n-s---na- , n-s---nd- , n-s---ng- , n-s---nn- , n-s---nv- , p-------- , p-d----d- , p-d----n- , p-d---fa- , p-d---fd- , p-d---fg- , p-d---fn- , p-d---ma- , p-d---md- , p-d---mg- , p-d---mn- , p-d---mv- , p-p----a- , p-p----d- , p-p----g- , p-p----n- , p-p---fa- , p-p---fd- , p-p---fg- , p-p---fn- , p-p---ma- , p-p---md- , p-p---mg- , p-p---mn- , p-p---na- , p-p---nd- , p-p---ng- , p-p---nn- , p-s----a- , p-s----d- , p-s----g- , p-s----n- , p-s---fa- , p-s---fd- , p-s---fg- , p-s---fn- , p-s---ma- , p-s---md- , p-s---mg- , p-s---mn- , p-s---mv- , p-s---na- , p-s---nd- , p-s---ng- , p-s---nn- , p1p---fa- , p1p---ma- , p1p---md- , p1p---mg- , p1p---mn- , p1s---fa- , p1s---fd- , p1s---fg- , p1s---fn- , p1s---ma- , p1s---md- , p1s---mg- , p1s---mn- , p2p----a- , p2p----d- , p2p---ma- , p2p---mg- , p2p---mn- , p2s----a- , p2s----d- , p2s----g- , p2s----n- , p2s---ma- , p2s---md- , p2s---mg- , p3s---fa- , p3s---ma- , r-------- , u-------- , v---na--- , v--amm--- , v--an---- , v--ana--- , v--ane--- , v--anm--- , v--anp--- , v--fna--- , v--fne--- , v--fnm--- , v--fnp--- , v--pna--- , v--pnd--- , v--pne--- , v--pnp--- , v--ppefa- , v--ppemn- , v--rn---- , v--rna--- , v--rne--- , v--rnp--- , v--tna--- , v-dapafn- , v-dapama- , v-dapamg- , v-dapamn- , v-dapmfn- , v-dapmmn- , v-dappma- , v-dappmn- , v-dppafg- , v-dppama- , v-dppamn- , v-dppefn- , v-dppema- , v-dppemd- , v-dppemn- , v-dpppmn- , v-drpama- , v-drpamn- , v-drpefn- , v-drpemn- , v-p-pmma- , v-pap-mn- , v-papafa- , v-papafg- , v-papafn- , v-papama- , v-papamd- , v-papamg- , 
v-papamn- , v-papana- , v-papand- , v-papann- , v-papefn- , v-papema- , v-papemn- , v-papmfa- , v-papmfg- , v-papmfn- , v-papmma- , v-papmmd- , v-papmmg- , v-papmmn- , v-papmna- , v-papmng- , v-papmnn- , v-pappfd- , v-pappfg- , v-pappfn- , v-pappma- , v-pappmd- , v-pappmg- , v-pappmn- , v-pappna- , v-pappng- , v-pappnn- , v-pfpama- , v-pfpamg- , v-pfpamn- , v-pfpema- , v-pfpemn- , v-pfpmfa- , v-pfpmfn- , v-pfpmma- , v-pfpmmd- , v-pfpmmg- , v-pfpmmn- , v-pfpmnn- , v-pfppmn- , v-ppp-mn- , v-pppafa- , v-pppafd- , v-pppafg- , v-pppafn- , v-pppafv- , v-pppama- , v-pppamd- , v-pppamg- , v-pppamn- , v-pppamv- , v-pppana- , v-pppand- , v-pppang- , v-pppann- , v-pppefa- , v-pppefd- , v-pppefg- , v-pppefn- , v-pppefv- , v-pppema- , v-pppemd- , v-pppemg- , v-pppemn- , v-pppemv- , v-pppena- , v-pppend- , v-pppeng- , v-pppenn- , v-ppppma- , v-ppppmd- , v-ppppmn- , v-prp-mn- , v-prpafa- , v-prpafd- , v-prpafn- , v-prpama- , v-prpamd- , v-prpamg- , v-prpamn- , v-prpana- , v-prpang- , v-prpefa- , v-prpefd- , v-prpefg- , v-prpefn- , v-prpema- , v-prpemd- , v-prpemg- , v-prpemn- , v-prpena- , v-prpend- , v-prpeng- , v-prpenn- , v-prppfn- , v-prppmn- , v-sagamn- , v-saiamn- , v-samp--- , v-sap-mg- , v-sap-mn- , v-sapafa- , v-sapafd- , v-sapafg- , v-sapafn- , v-sapama- , v-sapamd- , v-sapamg- , v-sapamn- , v-sapamv- , v-sapana- , v-sapang- , v-sapann- , v-sapanv- , v-sapema- , v-sapemn- , v-sapmfa- , v-sapmfd- , v-sapmfg- , v-sapmfn- , v-sapmma- , v-sapmmd- , v-sapmmg- , v-sapmmn- , v-sapmna- , v-sapmng- , v-sapmnn- , v-sappfa- , v-sappfd- , v-sappfg- , v-sappfn- , v-sappma- , v-sappmd- , v-sappmg- , v-sappmn- , v-sappna- , v-sappng- , v-sappnn- , v-sappnv- , v-sfpafa- , v-sfpafd- , v-sfpafn- , v-sfpama- , v-sfpamd- , v-sfpamg- , v-sfpamn- , v-sfpmfa- , v-sfpmfd- , v-sfpmfg- , v-sfpmfn- , v-sfpmma- , v-sfpmmg- , v-sfpmmn- , v-sfpmna- , v-sfppma- , v-spiamn- , v-spp-mn- , v-spp-nn- , v-sppa--- , v-sppafa- , v-sppafd- , v-sppafg- , v-sppafn- , v-sppafv- , v-sppama- , v-sppamd- , 
v-sppamg- , v-sppamn- , v-sppamv- , v-sppana- , v-sppand- , v-sppang- , v-sppann- , v-sppanv- , v-sppefa- , v-sppefd- , v-sppefg- , v-sppefn- , v-sppema- , v-sppemd- , v-sppemg- , v-sppemn- , v-sppemv- , v-sppena- , v-sppend- , v-sppeng- , v-sppenn- , v-spppfa- , v-spppfd- , v-spppfg- , v-spppfn- , v-spppma- , v-spppmn- , v-srp-mn- , v-srpafa- , v-srpafd- , v-srpafg- , v-srpafn- , v-srpama- , v-srpamd- , v-srpamg- , v-srpamn- , v-srpamv- , v-srpana- , v-srpand- , v-srpang- , v-srpann- , v-srpefa- , v-srpefd- , v-srpefg- , v-srpefn- , v-srpema- , v-srpemd- , v-srpemg- , v-srpemn- , v-srpemv- , v-srpena- , v-srpend- , v-srpeng- , v-srpenn- , v-srppfn- , v-srppma- , v-srppmn- , v-srppmv- , v1paia--- , v1paim--- , v1paip--- , v1paoa--- , v1paom--- , v1paop--- , v1pasa--- , v1pase--- , v1pasm--- , v1pasp--- , v1pfia--- , v1pfim--- , v1pfom--- , v1piia--- , v1piie--- , v1plia--- , v1plie--- , v1ppia--- , v1ppie--- , v1ppip--- , v1ppoa--- , v1ppoe--- , v1ppsa--- , v1ppse--- , v1pria--- , v1prie--- , v1prsa--- , v1prse--- , v1ptie--- , v1s-sa--- , v1sa-a--- , v1saia--- , v1saie--- , v1saim--- , v1saip--- , v1sao---- , v1saoa--- , v1saoe--- , v1saom--- , v1saop--- , v1sasa--- , v1sase--- , v1sasm--- , v1sasp--- , v1sfi---- , v1sfia--- , v1sfie--- , v1sfim--- , v1sfip--- , v1siia--- , v1siie--- , v1slia--- , v1slie--- , v1slim--- , v1spia--- , v1spie--- , v1spoa--- , v1spoe--- , v1spsa--- , v1spse--- , v1sria--- , v1srie--- , v1sroa--- , v1sroe--- , v1srsa--- , v1stie--- , v1stim--- , v2daia--- , v2dama--- , v2dasa--- , v2dase--- , v2dfia--- , v2dfim--- , v2diia--- , v2diie--- , v2dpia--- , v2dpma--- , v2dpme--- , v2dria--- , v2drma--- , v2paia--- , v2paim--- , v2paip--- , v2pama--- , v2pame--- , v2pamm--- , v2paoa--- , v2paom--- , v2paop--- , v2pasa--- , v2pase--- , v2pasm--- , v2pasp--- , v2pfia--- , v2pfim--- , v2piia--- , v2piie--- , v2ppia--- , v2ppie--- , v2ppma--- , v2ppme--- , v2ppoa--- , v2ppoe--- , v2ppsa--- , v2pria--- , v2prie--- , v2prma--- , v2prmp--- , 
v2proa--- , v2prsa--- , v2saia--- , v2saie--- , v2saim--- , v2saip--- , v2sam---- , v2sama--- , v2same--- , v2samm--- , v2samp--- , v2saoa--- , v2saoe--- , v2saom--- , v2saop--- , v2sasa--- , v2sase--- , v2sasm--- , v2sasp--- , v2sfi---- , v2sfia--- , v2sfie--- , v2sfim--- , v2sfip--- , v2siia--- , v2siie--- , v2siip--- , v2slia--- , v2slie--- , v2slim--- , v2spia--- , v2spie--- , v2spma--- , v2spme--- , v2spoa--- , v2spoe--- , v2spsa--- , v2spse--- , v2sria--- , v2srie--- , v2srma--- , v2srme--- , v2sroa--- , v2srsa--- , v2stie--- , v3-roe--- , v3daia--- , v3daim--- , v3daip--- , v3daoa--- , v3dfia--- , v3dfim--- , v3diia--- , v3diie--- , v3dlia--- , v3dlie--- , v3dlim--- , v3dpia--- , v3dpie--- , v3dpma--- , v3dpme--- , v3dpsa--- , v3dria--- , v3pai---- , v3paia--- , v3paie--- , v3paim--- , v3paip--- , v3pamm--- , v3paoa--- , v3paoe--- , v3paom--- , v3paop--- , v3pasa--- , v3pase--- , v3pasm--- , v3pasp--- , v3pfia--- , v3pfie--- , v3pfim--- , v3piia--- , v3piie--- , v3piip--- , v3plia--- , v3plie--- , v3plim--- , v3plip--- , v3ppia--- , `v3ppi |
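The nine-character tagger labels above appear to follow the positional tag convention of the Perseus/AGLDT treebanks, where each position encodes one feature (POS, person, number, tense, mood, voice, gender, case, degree) and `-` marks a feature that does not apply. Under that assumption, here is a small decoder sketch; the value tables are a partial inventory for illustration, not the full tagset:

```python
# Decode 9-character positional tags (Perseus/AGLDT convention, assumed here).
# Each position encodes one morphological feature; "-" means "not applicable".
FIELDS = ["pos", "person", "number", "tense", "mood", "voice", "gender", "case", "degree"]

# Partial value inventory (sketch); unknown codes fall through unchanged.
VALUES = {
    "pos": {"n": "noun", "v": "verb", "a": "adjective", "d": "adverb",
            "l": "article", "g": "particle", "c": "conjunction",
            "r": "preposition", "p": "pronoun", "m": "numeral",
            "i": "interjection", "u": "punctuation"},
    "number": {"s": "singular", "p": "plural", "d": "dual"},
    "tense": {"p": "present", "i": "imperfect", "r": "perfect",
              "l": "pluperfect", "t": "future perfect", "f": "future",
              "a": "aorist"},
    "mood": {"i": "indicative", "s": "subjunctive", "o": "optative",
             "n": "infinitive", "m": "imperative", "p": "participle"},
    "voice": {"a": "active", "p": "passive", "m": "middle", "e": "medio-passive"},
    "gender": {"m": "masculine", "f": "feminine", "n": "neuter"},
    "case": {"n": "nominative", "g": "genitive", "d": "dative",
             "a": "accusative", "v": "vocative"},
    "degree": {"c": "comparative", "s": "superlative"},
}

def decode_tag(tag):
    """Map a 9-character positional tag to a {feature: value} dict."""
    assert len(tag) == 9, "positional tags are exactly 9 characters"
    out = {}
    for field, ch in zip(FIELDS, tag):
        if ch == "-":
            continue  # feature not applicable for this token
        if field == "person":
            out[field] = ch  # "1", "2", or "3"
        else:
            out[field] = VALUES[field].get(ch, ch)
    return out

print(decode_tag("v3paia---"))
# → {'pos': 'verb', 'person': '3', 'number': 'plural',
#    'tense': 'aorist', 'mood': 'indicative', 'voice': 'active'}
```

Note that the label list also contains short PROIEL-style POS tags (e.g. `Nb`, `Df`, `C-`), which this decoder does not cover.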
📄 License
This project is licensed under the MIT license.