These SAEs were trained on the residual stream outputs of each layer using EleutherAI's SAE library (https://github.com/EleutherAI/sae), on a subset of FineWeb provided in lukemarks/vader-post-training.
Training details for the SAEs are given in https://huggingface.co/apart/llama3.2_1b_base_saes_vader/blob/main/config.json.
FVU and dead_pct metrics for each SAE run are saved under the respective layer directories; see, e.g., https://huggingface.co/apart/llama3.2_1b_base_saes_vader/blob/main/model.layers.12/metrics.json.
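As a rough sketch of what those two metrics measure, the snippet below reimplements them from their standard definitions (FVU: fraction of variance unexplained by the reconstruction; dead_pct: share of SAE latents that never fire over a batch). The exact formulas used by the EleutherAI library may differ; `fvu` and `dead_pct` here are illustrative reimplementations, not the library's code.

```python
import numpy as np

def fvu(x: np.ndarray, x_hat: np.ndarray) -> float:
    """Fraction of variance unexplained: residual variance of the SAE
    reconstruction divided by the centered variance of the inputs."""
    resid = ((x - x_hat) ** 2).sum()
    total = ((x - x.mean(axis=0)) ** 2).sum()
    return float(resid / total)

def dead_pct(latent_acts: np.ndarray) -> float:
    """Percentage of SAE latents (columns) that never activate over a
    batch of post-ReLU activations with shape [n_tokens, n_latents]."""
    return float((latent_acts.max(axis=0) <= 0).mean() * 100)

# Synthetic data standing in for residual-stream activations.
rng = np.random.default_rng(0)
x = rng.normal(size=(256, 64))               # fake residual-stream batch
x_hat = x + 0.1 * rng.normal(size=x.shape)   # near-perfect reconstruction
acts = np.maximum(rng.normal(size=(256, 512)), 0)
acts[:, :64] = 0.0                           # force 64 of 512 latents dead

print(f"FVU: {fvu(x, x_hat):.4f}")
print(f"dead_pct: {dead_pct(acts):.1f}%")
```

A low FVU means the SAE reconstructs the residual stream faithfully; a high dead_pct indicates wasted dictionary capacity.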
Base model: meta-llama/Llama-3.2-1B-Instruct