camembert-large / special_tokens_map.json

Commit History

Added Fast tokenizer files with model_max_length
757d2aa (unverified)

wissamantoun committed on
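
This commit adds the fast tokenizer files and sets model_max_length, so the tokenizer reports the model's real context size when loaded. A minimal sketch of checking this, assuming the Hub repo id "camembert/camembert-large" (the exact repo id is not stated on this page) and the standard transformers AutoTokenizer API:

```python
from transformers import AutoTokenizer

# Load the fast tokenizer from the Hub; the repo id here is an assumption.
tokenizer = AutoTokenizer.from_pretrained("camembert/camembert-large")

# With the files from this commit, model_max_length should be the model's
# actual limit (e.g. 512) rather than a very large sentinel value.
print(type(tokenizer).__name__)    # e.g. CamembertTokenizerFast
print(tokenizer.model_max_length)  # e.g. 512
```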