Languages
An index of 358+ Wikipedia languages, 31 of which currently have pre-trained NLP models available. Each modeled language includes tokenizers, n-gram models, Markov chains, vocabularies, and embeddings, with an average compression ratio of 4.23x.
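To illustrate the kind of n-gram Markov chain listed above, here is a minimal, generic sketch (not this site's actual implementation or API): it counts which word follows each n-gram in a training text, then samples a continuation from those counts.

```python
from collections import defaultdict
import random

def build_markov_chain(text, n=2):
    """Map each n-gram (tuple of n words) to the list of words seen after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - n):
        key = tuple(words[i:i + n])
        chain[key].append(words[i + n])
    return chain

def generate(chain, n=2, length=10, seed=0):
    """Walk the chain from a random starting n-gram, sampling followers."""
    rng = random.Random(seed)
    state = rng.choice(list(chain.keys()))
    out = list(state)
    for _ in range(length - n):
        followers = chain.get(tuple(out[-n:]))
        if not followers:  # dead end: no observed continuation
            break
        out.append(rng.choice(followers))
    return " ".join(out)
```

A real per-language model would be trained on that language's Wikipedia text and stored with its vocabulary; the toy above only shows the chain-building and sampling mechanics.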