PDS DPO (pdsdpo)
Website: https://pds-dpo.github.io/
Models (2)
pdsdpo/SynthAlign-7B • Image-Text-to-Text • Updated Dec 26, 2024 • 27 downloads • 1 like
pdsdpo/SynthAlign-7B-LoRA • Image-Text-to-Text • Updated Dec 26, 2024 • 15 downloads • 1 like
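
Both checkpoints are tagged Image-Text-to-Text, so they can in principle be tried with standard Hugging Face tooling. Below is a minimal sketch, assuming the repositories ship ordinary transformers configs; the exact architecture and prompt format are not stated on this page, and the LoRA variant may additionally require peft and its base checkpoint, so treat the generic pipeline call as a starting point rather than the models' documented usage. The image URL is a placeholder.

    # Minimal sketch (assumption: the repo works with the generic
    # image-text-to-text pipeline; swap in the model's own loading
    # code if it does not).
    from transformers import pipeline

    pipe = pipeline("image-text-to-text", model="pdsdpo/SynthAlign-7B")

    messages = [
        {
            "role": "user",
            "content": [
                {"type": "image", "url": "https://example.com/sample.jpg"},  # placeholder image
                {"type": "text", "text": "Describe this image."},
            ],
        }
    ]

    out = pipe(text=messages, max_new_tokens=64)
    print(out)
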
Datasets (2)
pdsdpo/synthalign_v1_1_data • Viewer • Updated Jul 29 • 12.4k downloads • 108 likes
pdsdpo/synthalign-v1_0-data • Viewer • Updated Jun 29 • 23k downloads • 114 likes
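
Both datasets have the hosted Viewer enabled, which usually means they load directly with the datasets library. A minimal sketch under that assumption follows; split and column names are not listed on this page, so the code inspects them rather than assuming a schema.

    # Minimal sketch: download the preference data and inspect its structure.
    # Split and column names are not shown on this page, so print them
    # instead of assuming a particular schema.
    from datasets import load_dataset

    ds = load_dataset("pdsdpo/synthalign_v1_1_data")  # or "pdsdpo/synthalign-v1_0-data"
    print(ds)                   # available splits and row counts
    split = next(iter(ds))      # first split name
    print(ds[split][0])         # one example, to see the column layout
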