chore: info
README.md
The architecture of StripedHyena-Hessian-7B is different from traditional decoder-only Transformers.

StripedHyena is a hybrid architecture composed of multi-head, grouped-query attention and gated convolutions arranged in [Hyena](https://arxiv.org/abs/2302.10866) blocks.

- Constant memory decoding in Hyena blocks via representation of convolutions as state-space models (modal or canonical form), or as truncated filters.
- Lower latency to preprocess long prompts.
- Improvements to training and inference compute-optimal scaling laws, compared to Transformers.
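The constant-memory decoding point can be illustrated with a minimal NumPy sketch (sizes and parameter values here are illustrative, not StripedHyena's actual weights or implementation): a long causal convolution whose filter is expressed in modal/state-space form can be decoded step by step with a fixed-size recurrent state, instead of caching the full input history.

```python
import numpy as np

# Illustrative diagonal state-space model (modal form). A causal convolution
# y = h * x with filter taps h[k] = sum_i C[i] * A[i]**k * B[i] can be
# decoded autoregressively with a state of fixed size d_state, independent
# of sequence length. All sizes/values below are hypothetical.
d_state = 4
rng = np.random.default_rng(0)
A = rng.uniform(0.5, 0.9, d_state)   # diagonal state transition (stable poles)
B = rng.standard_normal(d_state)     # input projection
C = rng.standard_normal(d_state)     # readout projection

def filter_taps(T):
    # Explicit convolution filter induced by the SSM parameters.
    return np.array([(C * A**k * B).sum() for k in range(T)])

def conv_decode(x):
    # Recurrent decoding: memory is O(d_state), not O(sequence length).
    s = np.zeros(d_state)
    out = []
    for x_t in x:
        s = A * s + B * x_t        # fixed-size state update (diagonal A)
        out.append(float(C @ s))   # readout at this step
    return np.array(out)

# The recurrent decode matches the explicit causal convolution.
T = 16
x = rng.standard_normal(T)
y_conv = np.convolve(x, filter_taps(T))[:T]
y_rec = conv_decode(x)
assert np.allclose(y_conv, y_rec)
```

Because each step only touches the fixed-size state, generation cost per token stays flat as the sequence grows, which is the advantage the bullet above describes.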