SmallLM / README.md
---
license: mit
language:
  - en
tags:
  - code
  - model
  - slm
  - language model
pipeline_tag: text-generation
library_name: pytorch
---

SmallLM

SmallLM is a specialized transformer-based language model with fewer than 50 million parameters. It uses a tokenizer derived from minbpe, developed by Andrej Karpathy, and has been trained on custom datasets. The model is in active development, and we are continuously working to improve its performance. Thanks to its compact size, it is designed to run on resource-constrained end devices. The model has not been released yet, but we are working to make it available soon. Stay tuned!
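To illustrate the kind of byte-pair-encoding scheme minbpe implements, here is a minimal, self-contained training sketch. This is illustrative only: the helper names are hypothetical and this is not SmallLM's actual tokenizer code.

```python
# Minimal byte-pair-encoding (BPE) sketch in the spirit of minbpe.
# Illustrative only; hypothetical helpers, not SmallLM's actual tokenizer.
from collections import Counter

def get_pair_counts(ids):
    # Count occurrences of each adjacent token pair.
    return Counter(zip(ids, ids[1:]))

def merge(ids, pair, new_id):
    # Replace every occurrence of `pair` with the single token `new_id`.
    out, i = [], 0
    while i < len(ids):
        if i < len(ids) - 1 and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2
        else:
            out.append(ids[i])
            i += 1
    return out

def train_bpe(text, vocab_size):
    # Start from raw UTF-8 bytes (256 base tokens) and greedily
    # merge the most frequent pair until the vocab is full.
    ids = list(text.encode("utf-8"))
    merges = {}
    for new_id in range(256, vocab_size):
        counts = get_pair_counts(ids)
        if not counts:
            break
        pair = max(counts, key=counts.get)
        ids = merge(ids, pair, new_id)
        merges[pair] = new_id
    return merges

merges = train_bpe("low lower lowest", vocab_size=260)
print(len(merges))  # number of learned merges
```

A real tokenizer would also store the merge table to encode and decode new text; minbpe's `BasicTokenizer` follows the same greedy-merge idea.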

Features

  • Can be specialized for specific tasks
  • Compact model size (fewer than 50M parameters)
  • Custom tokenizer derived from minbpe
  • Trained on custom datasets
  • Runs on CPU-only devices
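As a rough sanity check on the sub-50M parameter budget, one can estimate the size of a decoder-only transformer from its dimensions. The configuration below is hypothetical, not SmallLM's published architecture:

```python
def transformer_param_count(vocab_size, d_model, n_layers, d_ff=None):
    # Rough parameter estimate for a decoder-only transformer,
    # ignoring biases and layer norms (hypothetical dims, not SmallLM's config).
    if d_ff is None:
        d_ff = 4 * d_model                  # common feed-forward expansion
    embed = vocab_size * d_model            # token embeddings (tied with output head)
    attn = 4 * d_model * d_model            # Q, K, V, and output projections per layer
    ffn = 2 * d_model * d_ff                # two feed-forward projections per layer
    return embed + n_layers * (attn + ffn)

# e.g. a 512-dim, 8-layer model with a 32k vocabulary stays well under 50M:
print(transformer_param_count(32_000, 512, 8))  # → 41549824
```

Budgets like this explain why such models fit comfortably in memory on CPU-only devices, especially after int8 quantization.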

Limitations

Due to its relatively small size, limited training data, and shorter training duration, the model may underperform compared to larger language models, particularly on tasks requiring broad world knowledge or long-context reasoning.

Contact

For questions or feedback, please reach out to us through the Discussions tab.

Resources