---
base_model:
- AIMH/mental-longformer-base-4096
language:
- en
library_name: transformers
license: mit
metrics:
- name: F1 Score
  type: f1
  value: 0.5524
  verified: false
- name: Accuracy
  type: accuracy
  value: 0.6064
  verified: false
- name: Precision
  type: precision
  value: 0.602
  verified: false
- name: Recall
  type: recall
  value: 0.5385
  verified: false
pipeline_tag: text-classification
---
# About This Model
This model is fine-tuned from the AIMH/mental-longformer-base-4096 checkpoint on the drmuskangarg/CAMS dataset. For more information about the base Longformer model, please visit its model page. We use the same configuration as AIMH/mental-longformer-base-4096, including its tokenizer.
# Usage
If you wish to use my model to run inference on your dataset or fine-tune it further, you can import it in a Python script or notebook:
```python
from transformers import LongformerTokenizer, LongformerForSequenceClassification

# Load the base model's tokenizer and the fine-tuned classification model
tokenizer = LongformerTokenizer.from_pretrained("aimh/mental-longformer-base-4096")
model = LongformerForSequenceClassification.from_pretrained("stackofsugar/mentallongformer-cams-finetuned")
```
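A minimal inference sketch then looks like this (the input text is a made-up example): tokenize the post, run a forward pass, and take the argmax over the logits to get the predicted class.

```python
import torch

text = "I have been feeling overwhelmed ever since I lost my job."  # hypothetical input
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=4096)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_class_id])
```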
If you prefer the high-level HuggingFace pipeline API for making predictions, you can also use it in a Python script or notebook:
```python
from transformers import pipeline

pipe = pipeline(
    "text-classification",
    model="stackofsugar/mentallongformer-cams-finetuned",
    tokenizer="aimh/mental-longformer-base-4096",
)
```
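Calling the pipeline returns the predicted label and its score; the input text and score below are illustrative:

```python
result = pipe("I have been feeling overwhelmed ever since I lost my job.")
print(result)  # e.g. [{'label': '...', 'score': 0.87}]
```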
# More Information
For more information, visit my GitHub Repo.