| modelId (string, 4–112 chars) | sha (string, 40 chars) | lastModified (string, 24 chars) | tags (sequence) | pipeline_tag (29 classes) | private (bool) | author (string, 2–38 chars, nullable) | config (null) | id (string, 4–112 chars) | downloads (float64, 0–36.8M, nullable) | likes (float64, 0–712, nullable) | library_name (17 classes) | readme (string, 0–186k chars) | embedding (sequence) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
hfl/chinese-macbert-base | a986e004d2a7f2a1c2f5a3edef4e20604a974ed1 | 2021-05-19T19:09:45.000Z | ["pytorch", "tf", "jax", "bert", "fill-mask", "zh", "arxiv:2004.13922", "transformers", "license:apache-2.0", "autotrain_compatible"] | fill-mask | false | hfl | null | hfl/chinese-macbert-base | 36,823,840 | 43 | transformers | ---
language:
- zh
tags:
- bert
license: "apache-2.0"
---
<p align="center">
<br>
<img src="https://github.com/ymcui/MacBERT/raw/master/pics/banner.png" width="500"/>
<br>
</p>
<p align="center">
<a href="https://github.com/ymcui/MacBERT/blob/master/LICENSE">
<img alt="GitHub" src="https://img.... | [-0.12368089705705643, 0.0159906018525362, 0.07993485778570175, 0.010601235553622246, 0.06334946304559708, 0.02685576304793358, 0.021748080849647522, 0.007624071557074785, 0.0053676413372159, -0.0035076644271612167, 0.08696867525577545, -0.03400150686502457, 0.06885205209255219, 0.063679352... |
microsoft/deberta-base | 7d4c0126b06bd59dccd3e48e467ed11e37b77f3f | 2022-01-13T13:56:18.000Z | ["pytorch", "tf", "rust", "deberta", "en", "arxiv:2006.03654", "transformers", "deberta-v1", "license:mit"] | null | false | microsoft | null | microsoft/deberta-base | 23,662,412 | 15 | transformers | ---
language: en
tags: deberta-v1
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
---
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention
[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and enhanced mask decoder. It... | [-0.10310466587543488, -0.1119682714343071, 0.015463879331946373, 0.00776915205642581, 0.017533447593450546, -0.0014948627213016152, -0.015752648934721947, 0.038549475371837616, -0.017899053171277046, 0.048586562275886536, 0.026507191359996796, -0.007420028559863567, -0.02895142324268818, 0... |
bert-base-uncased | 418430c3b5df7ace92f2aede75700d22c78a0f95 | 2022-06-06T11:41:24.000Z | ["pytorch", "tf", "jax", "rust", "bert", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1810.04805", "transformers", "exbert", "license:apache-2.0", "autotrain_compatible"] | fill-mask | false | null | null | bert-base-uncased | 22,268,934 | 204 | transformers | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# BERT base model (uncased)
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](http... | [-0.10429023206233978, -0.07568823546171188, 0.0528205968439579, 0.029537051916122437, 0.03592930734157562, 0.06076014041900635, 0.016745345667004585, -0.011535749770700932, 0.014104398898780346, -0.024821121245622635, 0.0364287793636322, -0.055009908974170685, 0.05354085937142372, 0.041097... |
gpt2 | 6c0e6080953db56375760c0471a8c5f2929baf11 | 2021-05-19T16:25:59.000Z | ["pytorch", "tf", "jax", "tflite", "rust", "gpt2", "text-generation", "en", "transformers", "exbert", "license:mit"] | text-generation | false | null | null | gpt2 | 11,350,803 | 164 | transformers | ---
language: en
tags:
- exbert
license: mit
---
# GPT-2
Test the whole generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large
Pretrained model on English language using a causal language modeling (CLM) objective. It was introduced in
[this paper](https://d4mucfpksywv.cloudfront.net/better... | [-0.05892602726817131, -0.08013296872377396, 0.060150083154439926, 0.0071877371519804, 0.07087277621030807, 0.028216511011123657, 0.01796095259487629, 0.037697840481996536, 0.03646615520119667, -0.03444734588265419, -0.016371622681617737, -0.00010747667693067342, 0.03819410130381584, 0.0576... |
distilbert-base-uncased | 043235d6088ecd3dd5fb5ca3592b6913fd516027 | 2022-05-31T19:08:36.000Z | ["pytorch", "tf", "jax", "rust", "distilbert", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1910.01108", "transformers", "exbert", "license:apache-2.0", "autotrain_compatible"] | fill-mask | false | null | null | distilbert-base-uncased | 11,250,037 | 70 | transformers | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# DistilBERT base model (uncased)
This model is a distilled version of the [BERT base model](https://huggingface.co/bert-base-uncased). It was
introduced in [this paper](https://arxiv.org/abs/1910.01108). The code for the disti... | [-0.13866068422794342, -0.06151656433939934, 0.08515582978725433, 0.01706661842763424, 0.014973160810768604, -0.052277322858572006, -0.007684790063649416, 0.061743155121803284, 0.00879606045782566, -0.05457921698689461, 0.02629678323864937, 0.030604751780629158, 0.04207247868180275, 0.00054... |
Jean-Baptiste/camembert-ner | dbec8489a1c44ecad9da8a9185115bccabd799fe | 2022-04-04T01:13:33.000Z | ["pytorch", "camembert", "token-classification", "fr", "dataset:Jean-Baptiste/wikiner_fr", "transformers", "autotrain_compatible"] | token-classification | false | Jean-Baptiste | null | Jean-Baptiste/camembert-ner | 9,833,060 | 11 | transformers | ---
language: fr
datasets:
- Jean-Baptiste/wikiner_fr
widget:
- text: "Je m'appelle jean-baptiste et je vis à montréal"
- text: "george washington est allé à washington"
---
# camembert-ner: model fine-tuned from camemBERT for NER task.
## Introduction
[camembert-ner] is a NER model that was fine-tuned from camemBER... | [-0.11113610863685608, -0.03646893799304962, 0.059876054525375366, -0.003222051775082946, 0.012641402892768383, -0.04548738896846771, -0.01905476488173008, 0.06513305753469467, 0.05475887656211853, -0.036326371133327484, 0.01774558238685131, -0.053867973387241364, 0.0843958929181099, 0.0142... |
bert-base-cased | a8d257ba9925ef39f3036bfc338acf5283c512d9 | 2021-09-06T08:07:18.000Z | ["pytorch", "tf", "jax", "bert", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1810.04805", "transformers", "exbert", "license:apache-2.0", "autotrain_compatible"] | fill-mask | false | null | null | bert-base-cased | 7,598,326 | 30 | transformers | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# BERT base model (cased)
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https:... | [-0.089531309902668, -0.07718607783317566, 0.05249395594000816, 0.018646515905857086, 0.0447709858417511, 0.06499486416578293, 0.01449884008616209, 0.013737021014094353, 0.022442683577537537, -0.017234643921256065, 0.04510723799467087, -0.0327119417488575, 0.06715553998947144, 0.04273235425... |
roberta-base | 251c3c36356d3ad6845eb0554fdb9703d632c6cc | 2021-07-06T10:34:50.000Z | ["pytorch", "tf", "jax", "rust", "roberta", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1907.11692", "arxiv:1806.02847", "transformers", "exbert", "license:mit", "autotrain_compatible"] | fill-mask | false | null | null | roberta-base | 7,254,067 | 45 | transformers | ---
language: en
tags:
- exbert
license: mit
datasets:
- bookcorpus
- wikipedia
---
# RoBERTa base model
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1907.11692) and first released in
[this repository](https://github.com... | [-0.08212313055992126, -0.10390054434537888, -0.027432728558778763, 0.05361437052488327, -0.008713857270777225, 0.08931197971105576, 0.012921089306473732, -0.0319417305290699, 0.03874579817056656, 0.02321513742208481, 0.06043674796819687, -0.031464289873838425, 0.07827506959438324, 0.029055... |
SpanBERT/spanbert-large-cased | a49cba45de9565a5d3e7b089a94dbae679e64e79 | 2021-05-19T11:31:33.000Z | ["pytorch", "jax", "bert", "transformers"] | null | false | SpanBERT | null | SpanBERT/spanbert-large-cased | 7,120,559 | 3 | transformers | Entry not found | [0.0461147278547287, -0.038838207721710205, -0.01049656979739666, -0.03682169318199158, 0.011261860840022564, 0.013094935566186905, 0.0019101888174191117, -0.013979103416204453, 0.027092741802334785, -0.015212527476251125, 0.017284274101257324, -0.08189476281404495, 0.03817418962717056, -0.... |
xlm-roberta-base | f6d161e8f5f6f2ed433fb4023d6cb34146506b3f | 2022-06-06T11:40:43.000Z | ["pytorch", "tf", "jax", "xlm-roberta", "fill-mask", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha"... | fill-mask | false | null | null | xlm-roberta-base | 6,960,013 | 42 | transformers | ---
tags:
- exbert
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
... | [-0.08281730860471725, 0.003249414497986436, -0.041813094168901443, -0.047962404787540436, 0.060015056282281876, 0.03705935552716255, 0.03477831184864044, 0.03974342346191406, -0.008022570051252842, 0.021637847647070885, 0.08089376986026764, -0.04877686873078346, 0.08540689200162888, -0.035... |
distilbert-base-uncased-finetuned-sst-2-english | 00c3f1ef306e837efb641eaca05d24d161d9513c | 2022-07-22T08:00:55.000Z | ["pytorch", "tf", "rust", "distilbert", "text-classification", "en", "dataset:sst2", "dataset:glue", "transformers", "license:apache-2.0", "model-index"] | text-classification | false | null | null | distilbert-base-uncased-finetuned-sst-2-english | 5,401,984 | 77 | transformers | ---
language: en
license: apache-2.0
datasets:
- sst2
- glue
model-index:
- name: distilbert-base-uncased-finetuned-sst-2-english
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: glue
type: glue
config: sst2
split: validation
metrics:
... | [-0.034528184682130814, -0.04061563313007355, -0.060734909027814865, 0.035735320299863815, 0.08161136507987976, 0.04066551476716995, 0.0003900852461811155, 0.05125672370195389, -0.01557959709316492, -0.04621459171175957, 0.06463723629713058, -0.0745718702673912, 0.007900773547589779, -0.046... |
distilroberta-base | c1149320821601524a8d373726ed95bbd2bc0dc2 | 2022-07-22T08:13:21.000Z | ["pytorch", "tf", "jax", "rust", "roberta", "fill-mask", "en", "dataset:openwebtext", "arxiv:1910.01108", "arxiv:1910.09700", "transformers", "exbert", "license:apache-2.0", "autotrain_compatible"] | fill-mask | false | null | null | distilroberta-base | 5,192,102 | 21 | transformers | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- openwebtext
---
# Model Card for DistilRoBERTa base
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training Details](#training-details)
5. [Evaluation](#evaluat... | [-0.1325169950723648, -0.03109726682305336, 0.035468652844429016, 0.0266091488301754, 0.027898555621504784, -0.04462097957730293, -0.04662588611245155, 0.06629492342472076, -0.08276285231113434, -0.04405451565980911, 0.023149758577346802, -0.024889251217246056, 0.013186859898269176, -0.0024... |
distilgpt2 | ca98be8f8f0994e707b944a9ef55e66fbcf9e586 | 2022-07-22T08:12:56.000Z | ["pytorch", "tf", "jax", "tflite", "rust", "gpt2", "text-generation", "en", "dataset:openwebtext", "arxiv:1910.01108", "arxiv:2201.08542", "arxiv:2203.12574", "arxiv:1910.09700", "arxiv:1503.02531", "transformers", "exbert", "license:apache-2.0", "model-index", "co2_eq_emissions"... | text-generation | false | null | null | distilgpt2 | 4,525,173 | 77 | transformers | ---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- openwebtext
model-index:
- name: distilgpt2
results:
- task:
type: text-generation
name: Text Generation
dataset:
type: wikitext
name: WikiText-103
metrics:
- type: perplexity
name: Perplexity
... | [-0.12711955606937408, -0.02593734860420227, 0.022106902673840523, 0.04674231633543968, 0.011120681650936604, -0.0465211421251297, 0.0013098361669108272, 0.08374807238578796, -0.028656257316470146, -0.08449400216341019, 0.015780135989189148, -0.07485173642635345, -0.02609829604625702, 0.017... |
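Each row pairs a model repository's metadata with an embedding of its README text, so rows can be compared by embedding similarity. As a minimal sketch (the `ModelRow` class and `cosine` helper are illustrative names, not part of the dataset, and the stand-in field values below are typed by hand from the schema, with toy 3-dimensional vectors in place of the long truncated embeddings):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ModelRow:
    # Field names follow the column headers of the preview above.
    modelId: str
    sha: str                    # 40-char commit hash
    lastModified: str           # ISO-8601 timestamp string
    tags: List[str]
    pipeline_tag: Optional[str]
    private: bool
    author: Optional[str]
    downloads: Optional[float]
    likes: Optional[float]
    library_name: str
    readme: str
    embedding: List[float]      # embedding of the README text

def cosine(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

# Toy rows with stand-in embeddings (real rows carry much longer vectors).
bert = ModelRow("bert-base-uncased", "4184" + "0" * 36, "2022-06-06T11:41:24.000Z",
                ["pytorch", "bert"], "fill-mask", False, None,
                22_268_934.0, 204.0, "transformers", "...", [0.1, 0.2, 0.3])
gpt2 = ModelRow("gpt2", "6c0e" + "0" * 36, "2021-05-19T16:25:59.000Z",
                ["pytorch", "gpt2"], "text-generation", False, None,
                11_350_803.0, 164.0, "transformers", "...", [0.1, 0.2, 0.25])
print(round(cosine(bert.embedding, gpt2.embedding), 4))
```

Since downloads and likes are nullable float64 columns, code consuming real rows should guard against `None` before sorting or aggregating on them.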
No dataset card yet. Downloads last month: 8.