washeed/audio-transcribe
Likes: 3
Tags: Automatic Speech Recognition · Transformers · PyTorch · JAX · Safetensors · whisper · Inference Endpoints
Branch: refs/pr/1
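The page tags this checkpoint as a Whisper-style automatic speech recognition model served through Transformers, so a minimal usage sketch with the transformers pipeline might look like the following. The repo ID comes from the page itself; the 30-second chunking and device placement are assumptions based on standard Whisper checkpoints, not anything stated here.

```python
# Minimal sketch, assuming a standard Whisper-style checkpoint.
# Requires: pip install transformers torch  (plus ffmpeg for mp3 decoding)
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="washeed/audio-transcribe",  # repo ID shown on this page
    chunk_length_s=30,                 # Whisper operates on 30-second windows
    device=-1,                         # CPU; set to a GPU index if available
)

# Any local audio path works; the mp3 in this repo is used only as an example.
result = asr("Last one April 2nd.mp3")
print(result["text"])
```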
Files and versions
1 contributor · History: 11 commits
Latest commit: 78ec653 (verified) by Callmejammie, "Upload Last one April 2nd.mp3", about 2 months ago

| File | Size | LFS | Last commit | Last updated |
|---|---|---|---|---|
| .gitattributes | 1.61 kB | | Upload Last one April 2nd.mp3 | about 2 months ago |
| Last one April 2nd.mp3 | 1.41 MB | LFS | Upload Last one April 2nd.mp3 | about 2 months ago |
| README.md | 1.77 kB | | Update README.md | 12 months ago |
| added_tokens.json | 34.6 kB | | Upload 21 files | about 1 year ago |
| config.json | 1.27 kB | | Upload 21 files | about 1 year ago |
| flax_model.msgpack | 6.17 GB | LFS | Upload 21 files | about 1 year ago |
| generation_config.json | 3.9 kB | | Upload 21 files | about 1 year ago |
| merges.txt | 494 kB | | Upload 21 files | about 1 year ago |
| model.fp32-00001-of-00002.safetensors | 4.99 GB | LFS | Upload 21 files | about 1 year ago |
| model.fp32-00002-of-00002.safetensors | 1.18 GB | LFS | Upload 21 files | about 1 year ago |
| model.safetensors | 3.09 GB | LFS | Upload 21 files | about 1 year ago |
| model.safetensors.index.fp32.json | 118 kB | | Upload 21 files | about 1 year ago |
| normalizer.json | 52.7 kB | | Upload 21 files | about 1 year ago |
| preprocessor_config.json | 340 Bytes | | Upload 21 files | about 1 year ago |
| pytorch_model.bin | 3.09 GB | LFS | Upload 21 files | about 1 year ago |
| pytorch_model.bin.index.fp32.json | 118 kB | | Upload 21 files | about 1 year ago |
| pytorch_model.fp32-00001-of-00002.bin | 4.99 GB | LFS | Upload 21 files | about 1 year ago |
| pytorch_model.fp32-00002-of-00002.bin | 1.18 GB | LFS | Upload 21 files | about 1 year ago |
| special_tokens_map.json | 2.07 kB | | Upload 21 files | about 1 year ago |
| tokenizer.json | 2.48 MB | | Upload 21 files | about 1 year ago |
| tokenizer_config.json | 283 kB | | Upload 21 files | about 1 year ago |
| vocab.json | 1.04 MB | | Upload 21 files | about 1 year ago |

Detected pickle imports:
- pytorch_model.bin: collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.HalfStorage
- pytorch_model.fp32-00001-of-00002.bin: collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage
- pytorch_model.fp32-00002-of-00002.bin: collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage
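Since the pickled .bin weights duplicate what is already available in safetensors form, a loading sketch can point at the safetensors copy and pin the refs/pr/1 branch shown above. The Whisper class names below are assumptions based on the repository's whisper tag; revision and use_safetensors are standard from_pretrained arguments, and the fp16 dtype is inferred from the HalfStorage entry in the pickled single-file checkpoint.

```python
# Sketch: load the safetensors weights directly, avoiding the pickled .bin files,
# and pin the branch shown in this file listing.
import torch
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model = WhisperForConditionalGeneration.from_pretrained(
    "washeed/audio-transcribe",
    revision="refs/pr/1",        # branch shown on this page
    use_safetensors=True,        # prefer model.safetensors over pytorch_model.bin
    torch_dtype=torch.float16,   # the 3.09 GB single-file weights appear to be fp16
)
processor = WhisperProcessor.from_pretrained(
    "washeed/audio-transcribe", revision="refs/pr/1"
)
```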