Maxwell-TCS-v0.2 is an experimental SOTA **t**ask **c**omplexity **s**corer based on the ModernBERT-Large architecture.
Maxwell-TCS can be used for a variety of downstream tasks such as prompt difficulty prediction, dataset annotation, dataset augmentation, and more.
- **Developed by:** [Shreyan C](https://huggingface.co/thethinkmachine) | BUD Ecosystem Inc.
- **Model type:** Bidirectional Encoder Representations from Transformers, based on the ModernBERT-Large architecture.
- **Language(s) (NLP):** English (en)
- **License:** Apache License, Version 2.0
- **Finetuned from model:** ModernBERT-Large

## Applications
- **Prompt Complexity Scoring:** Maxwell can be used to predict the complexity of a given instruction or prompt.
- **Dataset Annotation:** Maxwell can be used to annotate the complexity of instructions in a dataset.
- **Reward Model:** Maxwell can be used as a reward model for reinforcement learning tasks.
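As a minimal sketch of how the scorer might be called, assuming the checkpoint loads under a standard `AutoModelForSequenceClassification` head that emits a single regression logit (the repo id, head shape, and score scale below are assumptions, not confirmed by this card):

```python
# Hedged usage sketch. Assumptions (not confirmed by the model card):
# - the Hugging Face repo id "thethinkmachine/Maxwell-TCS-v0.2"
# - a sequence-classification head with one regression logit
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

def complexity_score(logits: torch.Tensor) -> float:
    """Collapse a (1, 1) logits tensor to a scalar complexity score."""
    return logits.squeeze().item()

if __name__ == "__main__":
    repo_id = "thethinkmachine/Maxwell-TCS-v0.2"  # assumed repo id
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSequenceClassification.from_pretrained(repo_id)

    prompt = "Prove that the square root of 2 is irrational."
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    print(f"complexity: {complexity_score(outputs.logits):.4f}")
```

For dataset annotation, the same call can simply be mapped over each instruction in a dataset, storing the returned score as an extra column.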
### Recommendations