syu03/DPO_model (Hugging Face model repository)
Tags: PEFT · Safetensors · llama · arxiv:1910.09700
Branch: main
DPO_model — 16 MB, 1 contributor
History: 6 commits (latest: syu03, "Upload config.json", 0625a78, verified, about 1 year ago)
File                          Size        Last commit message        Date
.gitattributes                1.52 kB     initial commit             about 1 year ago
README.md                     5.11 kB     Upload 7 files             about 1 year ago
adapter_config (1).json       656 Bytes   Upload 7 files             about 1 year ago
adapter_model.safetensors     3.42 MB     Upload 7 files             about 1 year ago
config.json                   682 Bytes   Upload config.json         about 1 year ago
model.safetensors             3.42 MB     Upload model.safetensors   about 1 year ago
special_tokens_map (1).json   325 Bytes   Upload 7 files             about 1 year ago
tokenizer.json                9.09 MB     Upload 7 files             about 1 year ago
tokenizer_config (1).json     54.6 kB     Upload 7 files             about 1 year ago
training_args.bin             5.11 kB     Upload 7 files             about 1 year ago