Runtime error
Exit code: 1. Reason:

Disabling PyTorch because PyTorch >= 2.1 is required but found 2.0.1
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Using device: cpu
tokenizer_config.json: 100%|██████████| 3.36k/3.36k [00:00<00:00, 19.1MB/s]
tokenizer.model: 100%|██████████| 500k/500k [00:00<00:00, 45.2MB/s]
tokenizer.json: 100%|██████████| 1.84M/1.84M [00:00<00:00, 30.6MB/s]
added_tokens.json: 100%|██████████| 293/293 [00:00<00:00, 2.09MB/s]
special_tokens_map.json: 100%|██████████| 455/455 [00:00<00:00, 2.11MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 16, in <module>
    base_model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1885, in __getattribute__
    requires_backends(cls, cls._backends)
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1871, in requires_backends
    raise ImportError("".join(failed))
ImportError: AutoModelForCausalLM requires the PyTorch library but it was not found in your environment. Checkout the instructions on the installation page: https://pytorch.org/get-started/locally/ and follow the ones that match your environment. Please note that you may need to restart your runtime after installation.
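The first log line is the actual root cause: torch 2.0.1 is installed, but this transformers build requires PyTorch >= 2.1, so transformers disables its PyTorch backend at import time. The later "None of PyTorch, TensorFlow >= 2.0, or Flax have been found" message and the final ImportError are downstream symptoms of that disabled backend, not a truly missing install. A minimal sketch of a fix, assuming this app is a Hugging Face Space whose dependencies come from a `requirements.txt` (that file layout is an assumption; the `>= 2.1` floor is taken from the log):

```text
# requirements.txt — raise the torch floor so transformers keeps its
# PyTorch backend enabled (the 2.1 minimum comes from the version check
# in the log; a newer transformers release may require a higher floor)
torch>=2.1
transformers
```

After changing the pin, rebuild or factory-restart the Space so the dependencies are reinstalled, which matches the traceback's note that the runtime may need a restart after installation.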