---
license: apache-2.0
base_model:
- watt-ai/watt-tool-8B
---

This model was converted to FP8 format from `watt-ai/watt-tool-8B` using the llmcompressor library by vLLM. Refer to the [original model card](https://huggingface.co/watt-ai/watt-tool-8B) for more details on the model.
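For reference, a conversion of this kind can be reproduced with llmcompressor's `oneshot` entry point and a `QuantizationModifier` recipe. This is a hedged sketch, not the exact script used to produce this checkpoint; the `FP8_DYNAMIC` scheme and the `lm_head` exclusion are assumptions based on common llmcompressor FP8 recipes, and the output directory name is illustrative.

```python
# Sketch of an FP8 conversion with llmcompressor (assumed recipe, not the
# exact one used for this checkpoint).
from llmcompressor.modifiers.quantization import QuantizationModifier
from llmcompressor.transformers import oneshot

# Quantize all Linear layers to FP8 with dynamic activation scales,
# leaving the lm_head in higher precision (a common choice for quality).
recipe = QuantizationModifier(
    targets="Linear",
    scheme="FP8_DYNAMIC",
    ignore=["lm_head"],
)

oneshot(
    model="watt-ai/watt-tool-8B",
    recipe=recipe,
    output_dir="watt-tool-8B-FP8",  # illustrative path
)
```

The resulting directory can then be loaded directly by vLLM, which reads the compressed-tensors metadata saved alongside the weights.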