Fine-tuned GPT-Neo Model

This is a fine-tuned version of the GPT-Neo 1.3B model, adapted for specific downstream tasks.

Model Details

  • Model Type: GPT-Neo
  • Fine-tuned for: [Specify tasks or datasets]
  • Model size: 1.32B parameters
  • Tensor type: F32 (safetensors)

Usage

To use the model, run the following code:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("Torrchy/fine-tuned-gpt-neo")
tokenizer = AutoTokenizer.from_pretrained("Torrchy/fine-tuned-gpt-neo")
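
Once the model and tokenizer are loaded, text can be generated in the standard causal-LM way. The snippet below is a minimal sketch; the prompt and generation settings (max_new_tokens, temperature, sampling) are illustrative assumptions, not values recommended by the model author.

# Minimal generation example (illustrative prompt and settings)
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,  # GPT-Neo has no pad token; reuse EOS to silence warnings
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))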