Description
Hello,
when I try to download the model, the procedure fails with the following traceback:
C:\Users\a038775\AppData\Roaming\Python\Python311\Scripts>interpreter --local
Open Interpreter supports multiple local model providers.
[?] Select a provider:
Ollama
> Llamafile
LM Studio
Jan
[?] Select a model:
rocket-3b.Q4_K_M.llamafile
> ↓ Download new model
Your machine has 15.66GB of RAM, and 382.65GB of free storage space.
Your computer could handle a mid-sized model (4-10GB), Mistral-7B might be the best model for your computer.
In general, the larger the model, the better the performance, but choose a model that best fits your computer's
hardware.
Only models you have the storage space to download are shown:
[?] Select a model to download::
Llama-3.1-8B-Instruct (4.95GB)
Gemma-2-9b (5.79GB)
Phi-3-mini (2.42GB)
Moondream2 (vision) (1.98GB)
> Mistral-7B-Instruct (4.40GB)
Gemma-2-27b (16.70GB)
TinyLlama-1.1B (0.70GB)
LLaVA 1.5 (vision) (4.29GB)
WizardCoder-Python-13B (7.33GB)
WizardCoder-Python-34B (20.22GB)
Mixtral-8x7B-Instruct (30.03GB)
An error occurred while trying to download the model. Please try again or use a different local model provider.
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "C:\Users\a038775\AppData\Roaming\Python\Python311\Scripts\interpreter.exe\__main__.py", line 7, in <module>
File "C:\Users\a038775\AppData\Roaming\Python\Python311\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 612, in main
start_terminal_interface(interpreter)
File "C:\Users\a038775\AppData\Roaming\Python\Python311\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 471, in start_terminal_interface
interpreter = profile(
^^^^^^^^
File "C:\Users\a038775\AppData\Roaming\Python\Python311\site-packages\interpreter\terminal_interface\profiles\profiles.py", line 64, in profile
return apply_profile(interpreter, profile, profile_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\a038775\AppData\Roaming\Python\Python311\site-packages\interpreter\terminal_interface\profiles\profiles.py", line 148, in apply_profile
exec(profile["start_script"], scope, scope)
File "<string>", line 1, in <module>
File "C:\Users\a038775\AppData\Roaming\Python\Python311\site-packages\interpreter\core\core.py", line 145, in local_setup
self = local_setup(self)
^^^^^^^^^^^^^^^^^
File "C:\Users\a038775\AppData\Roaming\Python\Python311\site-packages\interpreter\terminal_interface\local_setup.py", line 454, in local_setup
model_name = model_path.split("/")[-1]
^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'split'
No errors occur if I download other models.
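From the traceback, the crash seems to happen because `local_setup.py` (line 454) goes on to call `model_path.split("/")[-1]` even after the download step has failed, leaving `model_path` set to `None`. Below is a minimal sketch of that pattern and the kind of guard that would avoid the `AttributeError`; the names `download_model` and `pick_model_name` are hypothetical, not the actual Open Interpreter functions.

```python
# Minimal sketch, assuming the download helper returns the local file path
# on success and None on failure (e.g. a broken or unreachable download URL).
# `download_model` and `pick_model_name` are hypothetical names.

def pick_model_name(download_model) -> str | None:
    model_path = download_model()  # None when the download fails

    # Guard that the failing code path appears to be missing: without it,
    # model_path.split("/") raises
    #   AttributeError: 'NoneType' object has no attribute 'split'
    if model_path is None:
        print(
            "An error occurred while trying to download the model. "
            "Please try again or use a different local model provider."
        )
        return None

    model_name = model_path.split("/")[-1]
    return model_name
```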
Reproduce
Expected behavior
Screenshots
No response
Open Interpreter version
0.4.3
Python version
3.11.9
Operating System name and version
Windows 11
Additional context
No response