codebyam committed
Commit 0333203 · verified · 1 Parent(s): 45120c0

Update README.md

Files changed (1)
  1. README.md +39 -3
README.md CHANGED

---
license: mit
language:
- en
tags:
- code
- model
- slm
- language model
pipeline_tag: text-generation
library_name: PyTorch
---
# SmallLM

SmallLM is a specialized transformer-based language model with fewer than 50 million parameters. It uses a tokenizer derived from minbpe, developed by Andrej Karpathy, and has been trained on custom datasets. The model is currently in active development, and we are continuously working to improve its performance. It can be used on small end devices without any issues. The model has not been released yet, but we are working to make it available soon. Stay tuned!
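## Tokenizer Example (Illustrative)

SmallLM's exact tokenizer configuration has not been published, so the snippet below is only a minimal sketch of how a minbpe-style BPE tokenizer is typically trained and used. The corpus path, vocabulary size, and file prefix are assumptions for illustration, not SmallLM's released settings.

```python
# Requires minbpe: clone https://github.com/karpathy/minbpe and add it to PYTHONPATH.
from minbpe import BasicTokenizer

# Train a byte-level BPE tokenizer on a text corpus (hypothetical file and vocab size).
text = open("corpus.txt", encoding="utf-8").read()
tokenizer = BasicTokenizer()
tokenizer.train(text, vocab_size=8192)  # 256 raw byte tokens + learned merges

# Encode and decode a sample string; decode(encode(s)) round-trips to s.
ids = tokenizer.encode("SmallLM is a compact language model.")
print(ids)
print(tokenizer.decode(ids))

# Persist the merges and vocab (writes smalllm_tok.model and smalllm_tok.vocab).
tokenizer.save("smalllm_tok")
```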
## Features

- Can be specialized for specific tasks
- Compact model size (fewer than 50 million parameters; see the sketch below)
- Custom tokenizer from minbpe
- Trained on custom datasets
- Can run on CPU-based devices
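## Model Size Sketch (Illustrative)

SmallLM's actual architecture and hyperparameters have not been published. The sketch below only shows how a GPT-style causal transformer with hypothetical settings (8K vocabulary, 384-dimensional embeddings, 6 layers) stays comfortably under a 50M-parameter budget in PyTorch.

```python
import torch
import torch.nn as nn

# Hypothetical hyperparameters; not SmallLM's real configuration.
vocab_size, d_model, n_layers, n_heads, d_ff, max_len = 8192, 384, 6, 6, 1536, 512

class TinyCausalLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        block = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=d_ff, batch_first=True, norm_first=True
        )
        self.blocks = nn.TransformerEncoder(block, num_layers=n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size, bias=False)
        self.lm_head.weight = self.tok_emb.weight  # weight tying trims the parameter count

    def forward(self, idx):
        b, t = idx.shape
        x = self.tok_emb(idx) + self.pos_emb(torch.arange(t, device=idx.device))
        causal_mask = nn.Transformer.generate_square_subsequent_mask(t).to(idx.device)
        x = self.blocks(x, mask=causal_mask)
        return self.lm_head(x)  # (batch, seq_len, vocab_size) logits

model = TinyCausalLM()
print(f"parameters: {sum(p.numel() for p in model.parameters()) / 1e6:.1f}M")  # ~14M, well under 50M
```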
## Limitations

Due to its relatively small size, limited training data, and shorter training duration, the model may occasionally underperform compared to other small language models.
## Contact

For any questions or feedback, please reach out to us through Discussions.
## Resources

- Tokenizer Source: [minbpe](https://github.com/karpathy/minbpe)