tommyp111/pythia-160m-lora-sae
- `dense_4h_to_h/latents-768/k-12`: wandb run
- `dense_h_to_4h/latents-3072-k-32-multi-topk`: wandb run
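The card itself does not document the repository's file layout, so the sketch below only downloads the full snapshot with `huggingface_hub` and inspects whatever checkpoint files are present. The assumption that the SAE weights are stored as `.safetensors` files is mine, not something stated in the card.

```python
import os

from huggingface_hub import snapshot_download
from safetensors.torch import load_file

# Download the whole repository snapshot locally.
local_dir = snapshot_download(repo_id="tommyp111/pythia-160m-lora-sae")

# Walk the snapshot and print every file; if a file looks like a
# safetensors checkpoint (an assumption, not documented in the card),
# load it and report the tensor names and shapes.
for root, _, files in os.walk(local_dir):
    for name in files:
        path = os.path.join(root, name)
        print(path)
        if name.endswith(".safetensors"):
            tensors = load_file(path)
            print({k: tuple(v.shape) for k, v in tensors.items()})
```

Once the actual file names and format are confirmed, the tensors can be mapped onto whichever SAE implementation was used for training; the wandb runs linked above are the authoritative record of the training configuration.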
Included in the Pythia SAE collection (4 items, updated Jun 28, 2025).