---
language: en
tags:
- snomed-ct
- text-generation
---
# Chatty Mapper
## Model description
This is a text-generation model for SNOMED-CT code mapping. Because it is generative, it is prone to hallucination and should not be used for any kind of production purpose, but it was fun to build. It is based on Mixtral7b and was fine-tuned on part of the SNOMED-CT corpus, then tested against a gold standard.
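
For illustration only, a fine-tuning pair for this kind of mapping task might look like the dictionary below. The actual prompt template and data format used to train this model are not documented in this card, so everything in the snippet is hypothetical apart from the SNOMED code itself (taken from the evaluation table further down).

```python
# Hypothetical prompt/completion pair for the SNOMED-CT mapping task.
# The real fine-tuning template for this model is not documented here.
training_example = {
    "prompt": "Map the clinical parameter 'Peripheral oxygen saturation' to its SNOMED-CT concept ID.",
    "completion": "431314004",
}
```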
## How to use
The model can be loaded with the `transformers` library:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "MattStammers/chatty_mapper"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Illustrative example: map a clinical parameter name to a SNOMED-CT concept ID
prompt = "Map the clinical parameter 'Heart rate' to its SNOMED-CT concept ID."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## Model Performance

- Accuracy: 0.0
- Precision: 0.0
- Recall: 0.0
Example predictions (head of the evaluation DataFrame):

| ParameterName | SNOMEDCode | ExtractedSNOMEDNumbers | CorrectPrediction |
|---|---|---|---|
| *Heart rate | 364075005 | 3222222 | False |
| Peripheral oxygen saturation | 431314004 | 4222222000000000000000000000000000000000000000... | False |
| Mean arterial pressure | 1285244000 | NaN | False |
| *Diastolic blood pressure | 271650006 | NaN | False |
| *Systolic blood pressure | 271649006 | NaN | False |
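
The accuracy appears to be an exact-match score against the gold standard: a prediction only counts as correct when the extracted number equals the gold-standard code. A minimal sketch of how it could be recomputed from a results table like the one above (the CSV path is an assumption):

```python
import pandas as pd

# Load the evaluation results; the file name is an assumption for illustration.
results = pd.read_csv("evaluation_results.csv")

# Exact match: the extracted number must equal the gold-standard SNOMED code.
results["CorrectPrediction"] = (
    results["ExtractedSNOMEDNumbers"].astype(str) == results["SNOMEDCode"].astype(str)
)
print(f"Accuracy: {results['CorrectPrediction'].mean():.1f}")
```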
## Limitations and bias

The model is prone to wandering off-topic and is certainly not medical-grade.
## Acknowledgments

Thanks to the Mistral AI team for creating the base model.
## Upload script

The card and model were pushed to the Hub with the snippet below (`model_card_content`, `model_save_path`, and `repo_url` are defined earlier in the original script):

```python
from huggingface_hub import Repository

# Save the model card in the model directory
with open("models/chatty_mapper/README.md", "w") as f:
    f.write(model_card_content)

# Use Hugging Face's Repository class for Git operations
repo = Repository(local_dir=model_save_path, clone_from=repo_url)
repo.git_add()
repo.git_commit("Initial model upload with model card and metrics")
repo.git_push()

print("Model, model card, and metrics successfully pushed to: https://huggingface.co/MattStammers/chatty_mapper")
```