runtime error
Exit code: 1. Reason: *lora_kargs
│   173 │   │   logging.info('Loading LLAMA')
│ ❱ 174 │   │   llama_tokenizer = LlamaTokenizer.from_pretrained("Vision-CAIR/
│   175 │   │   llama_tokenizer.pad_token = "$$"
│   176 │
│   177 │   │   if low_resource:
│
│ /usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base
│ .py:1809 in from_pretrained
│
│   1806 │   │   │   )
│   1807 │
│   1808 │   │   if all(full_file_name is None for full_file_name in resolved_
│ ❱ 1809 │   │   │   raise EnvironmentError(
│   1810 │   │   │   │   f"Can't load tokenizer for '{pretrained_model_name_or
│   1811 │   │   │   │   "'https://huggingface.co/models', make sure you don't
│   1812 │   │   │   │   f"Otherwise, make sure '{pretrained_model_name_or_pat
╰──────────────────────────────────────────────────────────────────────────────╯
OSError: Can't load tokenizer for 'Vision-CAIR/llama-2-7b-chat-pytorch'. If
you were trying to load it from 'https://huggingface.co/models', make sure you
don't have a local directory with the same name. Otherwise, make sure
'Vision-CAIR/llama-2-7b-chat-pytorch' is the correct path to a directory
containing all relevant files for a LlamaTokenizer tokenizer.
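The OSError above means `from_pretrained` resolved no tokenizer files at the given path or repo id. A minimal diagnostic sketch that can be run before the failing call, to distinguish a wrong path from a checkpoint directory that simply lacks tokenizer files. The file names checked are an assumption for illustration (`tokenizer.model` is the usual SentencePiece vocab for LLaMA-style tokenizers), not the library's authoritative lookup order:

```python
from pathlib import Path

# Assumed file names a LlamaTokenizer typically needs in a local
# checkpoint directory; adjust for your checkpoint layout.
EXPECTED_FILES = ("tokenizer.model", "tokenizer_config.json")

def missing_tokenizer_files(checkpoint_dir: str) -> list[str]:
    """Return the expected tokenizer files absent from checkpoint_dir.

    An empty list suggests the directory at least contains the usual
    tokenizer assets; a non-empty list explains an OSError like the
    one in the traceback above.
    """
    root = Path(checkpoint_dir)
    return [name for name in EXPECTED_FILES if not (root / name).is_file()]
```

If the path is a Hub repo id rather than a local directory, the same error can also mean a local folder with the identical name is shadowing the repo, as the message itself notes.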