
These SAEs were trained on the residual-stream outputs of each layer using EleutherAI's SAE library (https://github.com/EleutherAI/sae), on a subset of FineWeb text provided in lukemarks/vader-post-training.

The SAE training details are available at https://huggingface.co/apart/llama3.2_1b_base_saes_vader/blob/main/config.json.

The FVU (fraction of variance unexplained) and dead_pct (percentage of dead latents) metrics for each SAE run are saved under the respective layer directories; e.g., see https://huggingface.co/apart/llama3.2_1b_base_saes_vader/blob/main/model.layers.12/metrics.json
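For readers unfamiliar with these metrics, the following is a minimal sketch of how FVU and dead_pct are conventionally computed. This is an illustration with synthetic data and hypothetical function names, not the code used to produce this repo's metrics files:

```python
import numpy as np

def fvu(x, x_hat):
    """Fraction of variance unexplained: squared reconstruction error
    normalized by the variance of the original activations.
    0 = perfect reconstruction; 1 = no better than predicting the mean."""
    return np.sum((x - x_hat) ** 2) / np.sum((x - x.mean(axis=0)) ** 2)

def dead_pct(latent_acts, eps=0.0):
    """Fraction of SAE latents that never activate above eps
    anywhere in the evaluation batch (i.e., 'dead' features)."""
    fired = (latent_acts > eps).any(axis=0)
    return 1.0 - fired.mean()

# Synthetic stand-ins for residual activations, their reconstruction,
# and the SAE's latent activations (batch of 1024 tokens).
rng = np.random.default_rng(0)
x = rng.normal(size=(1024, 64))
x_hat = x + 0.1 * rng.normal(size=x.shape)       # small reconstruction error
acts = np.maximum(rng.normal(size=(1024, 256)), 0)  # ReLU-style latents

reconstruction_fvu = fvu(x, x_hat)
fraction_dead = dead_pct(acts)
```

With a reconstruction whose error variance is ~1% of the input variance, `reconstruction_fvu` comes out near 0.01, and with dense random latents no feature is dead.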

