The `srd2plantUml_Salesforce_codet5-base` model was fine-tuned using a dataset o…

- **Fine-tuned from model:** `Salesforce/codet5-base`

### Model Sources

- **Paper:** [CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation](https://arxiv.org/abs/2109.00859)

## Uses
The model may have limitations in handling complex or highly technical SRDs outs…

### Recommendations

Users should be cautious of potential limitations in complex diagrams and may need to manually adjust or review the generated PlantUML code for accuracy, especially in domains not covered by the training data.
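One lightweight way to act on this recommendation is a structural sanity check before rendering. The helper below is a hypothetical sketch (not part of the model or its tokenizer): it only verifies that the generated text is wrapped in the `@startuml`/`@enduml` delimiters PlantUML requires, which catches truncated or off-format generations early.

```python
def looks_like_plantuml(uml_code: str) -> bool:
    """Hypothetical sanity check: a PlantUML diagram must be wrapped
    in @startuml/@enduml before a renderer will accept it."""
    code = uml_code.strip()
    return code.startswith("@startuml") and code.endswith("@enduml")

# A well-formed generation passes; a bare fragment does not.
print(looks_like_plantuml("@startuml\nactor User\n@enduml"))  # True
print(looks_like_plantuml("actor User"))                      # False
```

This does not validate the diagram's semantics, so a manual review of the generated code is still advisable.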
## How to Get Started with the Model
To use the model:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer
model = AutoModelForSeq2SeqLM.from_pretrained("amgharhind/srd2plantUml_codet5base_V2")
tokenizer = AutoTokenizer.from_pretrained("amgharhind/srd2plantUml_codet5base_V2")

# Example usage
input_text = "Sample SRD input for a use case diagram"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs)
uml_code = tokenizer.decode(outputs[0], skip_special_tokens=True)

print("Generated PlantUML code:", uml_code)
```
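Because the CodeT5 encoder accepts only a bounded input length, very long SRDs may be silently truncated. As a hedged workaround sketch (the 200-word chunk size is an illustrative assumption, not a model requirement), a long SRD can be split into word-bounded chunks and each chunk passed to `model.generate` separately:

```python
def chunk_srd(text: str, max_words: int = 200) -> list:
    """Split a long SRD into word-bounded chunks so each stays within
    the encoder's input limit; chunk size here is illustrative."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

chunks = chunk_srd("requirement " * 450)
print(len(chunks))  # 3: two full 200-word chunks and one 50-word remainder
```

Splitting on word boundaries keeps individual requirements mostly intact, but chunking at sentence or requirement boundaries would likely yield more coherent per-chunk diagrams.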