Runtime error

Exit code: 1. Reason:
ne 44, in __init__
    super().__init__(
  File "/usr/local/lib/python3.10/site-packages/llama_index/indices/base.py", line 66, in __init__
    index_struct = self.build_index_from_nodes(nodes)
  File "/usr/local/lib/python3.10/site-packages/llama_index/token_counter/token_counter.py", line 78, in wrapped_llm_predict
    f_return_val = f(_self, *args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/llama_index/indices/vector_store/base.py", line 203, in build_index_from_nodes
    return self._build_index_from_nodes(nodes)
  File "/usr/local/lib/python3.10/site-packages/llama_index/indices/vector_store/base.py", line 192, in _build_index_from_nodes
    self._add_nodes_to_index(index_struct, nodes)
  File "/usr/local/lib/python3.10/site-packages/llama_index/indices/vector_store/base.py", line 168, in _add_nodes_to_index
    embedding_results = self._get_node_embedding_results(nodes)
  File "/usr/local/lib/python3.10/site-packages/llama_index/indices/vector_store/base.py", line 87, in _get_node_embedding_results
    ) = self._service_context.embed_model.get_queued_text_embeddings()
  File "/usr/local/lib/python3.10/site-packages/llama_index/embeddings/base.py", line 168, in get_queued_text_embeddings
    embeddings = self._get_text_embeddings(cur_batch_texts)
  File "/usr/local/lib/python3.10/site-packages/llama_index/embeddings/openai.py", line 267, in _get_text_embeddings
    return get_embeddings(
  File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 336, in wrapped_f
    return copy(f, *args, **kw)
  File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 475, in __call__
    do = self.iter(retry_state=retry_state)
  File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
  File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 419, in exc_check
    raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7f58671bdb70 state=finished raised APIRemovedInV1>]
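
The underlying exception is in the last frame: APIRemovedInV1 is raised by the openai Python package from version 1.0.0 onward whenever code still calls the pre-1.0 module-level API (for example openai.Embedding.create), which is what this llama_index release's embeddings/openai.py does. tenacity keeps retrying the same failing call and finally surfaces it as a RetryError. Below is a minimal sketch of how to confirm and work around this; the requirements.txt file name, the text-embedding-ada-002 model name, and the pinned versions are assumptions for illustration, not taken from this app.

    # Sketch: reproduce the failure mode outside llama_index.
    # On openai >= 1.0.0 the legacy call raises APIRemovedInV1;
    # the v1-style client call is its replacement.
    import openai

    print(openai.__version__)  # 1.x here means the legacy surface has been removed

    # Legacy (pre-1.0) style, as used by this llama_index release --
    # raises APIRemovedInV1 on openai 1.x:
    # openai.Embedding.create(model="text-embedding-ada-002", input=["hello"])

    # v1 style used by newer llama-index releases:
    from openai import OpenAI
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.embeddings.create(model="text-embedding-ada-002", input=["hello"])
    print(len(resp.data[0].embedding))  # dimensionality of the returned embedding

    # Alternative without touching code: pin the SDK in requirements.txt
    # (assumption: that is where this app declares its dependencies), e.g.
    #   openai<1.0.0
    # or upgrade llama-index to a release built for the v1 client.

Pinning openai below 1.0 keeps the existing llama_index code working as-is; moving to the v1 client together with a matching llama-index release is the longer-term path.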
