poetica / logs / poetry_generation_20241117.log
Commit 8702989 by abhisheksan: Update model configuration and fix tokenizer files; remove outdated model binary
2024-11-17 00:08:48,570 - main - INFO - Initializing model on device: cpu
2024-11-17 00:08:50,303 - main - INFO - Model and tokenizer loaded successfully
2024-11-17 00:13:06,341 - main - INFO - Initializing model on device: cpu
2024-11-17 00:13:07,660 - main - INFO - Model and tokenizer loaded successfully
2024-11-17 16:33:11,148 - main - INFO - Initializing model on device: cpu
2024-11-17 16:33:13,017 - main - ERROR - Error initializing model: Error(s) in loading state_dict for GPT2LMHeadModel:
size mismatch for transformer.wpe.weight: copying a param with shape torch.Size([400, 384]) from checkpoint, the shape in current model is torch.Size([128, 384]).
2024-11-17 16:33:13,017 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
  File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 137, in initialize
    await self._load_and_optimize_model()
  File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 185, in _load_and_optimize_model
    self.model.load_state_dict(state_dict, strict=False)
  File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\torch\nn\modules\module.py", line 2189, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for GPT2LMHeadModel:
	size mismatch for transformer.wpe.weight: copying a param with shape torch.Size([400, 384]) from checkpoint, the shape in current model is torch.Size([128, 384]).
2024-11-17 16:33:13,020 - main - ERROR - Failed to initialize model manager
2024-11-17 16:33:41,008 - main - INFO - Initializing model on device: cpu
2024-11-17 16:33:43,152 - main - INFO - Model and tokenizer loaded successfully
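The failure at 16:33:13 is a checkpoint/config mismatch: the saved weights carry 400 position embeddings (`transformer.wpe.weight` of shape `[400, 384]`), while the freshly constructed model only allocates 128. Note that `strict=False` (as used in `_load_and_optimize_model`) does not help here — it only tolerates missing or unexpected keys, not shape mismatches. A minimal sketch reproducing the error with a hypothetical `TinyTransformer` module (not the project's actual code; plain `torch`, no GPT-2 download):

```python
import torch
import torch.nn as nn

class TinyTransformer(nn.Module):
    """Hypothetical stand-in for GPT2LMHeadModel, keeping only the
    position-embedding table that caused the mismatch above."""
    def __init__(self, n_positions: int, n_embd: int = 384):
        super().__init__()
        self.wpe = nn.Embedding(n_positions, n_embd)  # position embeddings

# "Checkpoint" saved from a model configured with 400 positions.
ckpt = TinyTransformer(n_positions=400).state_dict()

# Loading into a 128-position model raises RuntimeError even with
# strict=False, mirroring the traceback in this log.
try:
    TinyTransformer(n_positions=128).load_state_dict(ckpt, strict=False)
except RuntimeError as e:
    assert "size mismatch" in str(e)

# The fix (consistent with the successful load at 16:33:43) is to build
# the model with the checkpoint's own geometry before loading:
model = TinyTransformer(n_positions=ckpt["wpe.weight"].shape[0])
model.load_state_dict(ckpt)  # loads cleanly
```

For the real model the equivalent fix would be constructing `GPT2Config` with `n_positions` (and `n_embd`) matching the checkpoint, which is presumably what the "Update model configuration" commit above did.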