nayohan/corningQA-llama2-13b-chat
Tags: Text Generation · Transformers · PyTorch · English · llama · conversational · text-generation-inference · Inference Endpoints
Dataset: myngsoooo/CorningAI-DocQA
Paper: arxiv:2307.09288
License: apache-2.0
Files and versions
corningQA-llama2-13b-chat · 1 contributor · History: 9 commits
Latest commit: 28598da (verified), nayohan, "Update README.md", 11 months ago
.gitattributes                      1.52 kB                 initial commit            12 months ago
README.md                           3.34 kB                 Update README.md          11 months ago
config.json                         703 Bytes               Upload LlamaForCausalLM   12 months ago
generation_config.json              183 Bytes               Upload LlamaForCausalLM   12 months ago
pytorch_model-00001-of-00006.bin    9.96 GB    LFS  pickle  Upload LlamaForCausalLM   12 months ago
pytorch_model-00002-of-00006.bin    9.94 GB    LFS  pickle  Upload LlamaForCausalLM   12 months ago
pytorch_model-00003-of-00006.bin    9.94 GB    LFS  pickle  Upload LlamaForCausalLM   12 months ago
pytorch_model-00004-of-00006.bin    9.87 GB    LFS          Upload LlamaForCausalLM   12 months ago
pytorch_model-00005-of-00006.bin    9.87 GB    LFS          Upload LlamaForCausalLM   12 months ago
pytorch_model-00006-of-00006.bin    2.49 GB    LFS  pickle  Upload LlamaForCausalLM   12 months ago
pytorch_model.bin.index.json        29.9 kB                 Upload LlamaForCausalLM   12 months ago
special_tokens_map.json             551 Bytes               Upload tokenizer          12 months ago
tokenizer.json                      1.84 MB                 Upload tokenizer          12 months ago
tokenizer.model                     500 kB     LFS          Upload tokenizer          12 months ago
tokenizer_config.json               1.83 kB                 Upload tokenizer          12 months ago

Pickle scan: each flagged .bin shard was detected to contain three pickle imports: torch._utils._rebuild_tensor_v2, torch.FloatStorage, and collections.OrderedDict.