
Continuous Instruction Refusal #1309

Open
accessor-io opened this issue Jun 18, 2024 · 2 comments

@accessor-io

### Describe the bug

OI works fine for a while, then it begins to disregard file-location directions.

While writing this, another error occurred: raw OpenAI hidden-parameter chunks now appear in the terminal after submitting a prompt.
[screenshot]

### Reproduce

Typing `--version` into the prompt terminal yields:

```
  and interact with you in this chat interface. I don't have a software version in a traditional sense. However, if you are referring to the version of your browser, operating system, or any other software installed on your
  machine, I would be happy to help you find that information. Could you please clarify?
```

### Expected behavior

I expect it to look through the correct directory. Even after explicit instructions not to look through the `/node_modules/` folder, it continues to search it anyway. This is a recurring issue when working with web-app directories.
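As a workaround (hypothetical, not a built-in Open Interpreter feature), one option is to enumerate the relevant files yourself with `find`, pruning `node_modules`, and paste the resulting list into the prompt so the model never needs to walk the tree:

```shell
# Sketch of a workaround: list project source files while pruning node_modules,
# so that directory is never traversed at all. The *.js pattern is only an
# example; adjust it for your project.
find . -path ./node_modules -prune -o -type f -name '*.js' -print
```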

### Screenshots

![image](https://github.com/OpenInterpreter/open-interpreter/assets/147679457/d9ebf483-4c03-4657-a83a-18a50766155a)


### Open Interpreter version

0.2.6

### Python version

3.10.2

### Operating System name and version

Ubuntu 22.04

### Additional context

_No response_
@accessor-io
Author

Am I the only one who has continuous problems with this? What am I doing wrong? I have set my API key I don't know how many times.

dot@meta:~/Cross-Link-Manager$ interpreter -y --model gpt-4o --max_tokens 4096 --context_window 10000 --max_output 10000 --force_task_completion  --fast  --auto_run

We have updated our profile file format. Would you like to migrate your profile file to the new format? No data will be lost.

(y/n) y

Migration complete.


▌ A new version of Open Interpreter is available.                                                 

▌ Please run: pip install --upgrade open-interpreter                                              

────────────────────────────────────────────────────────────────────────────────────────────────────
> wget https://huggingface.co/jartine/Mixtral-8x7B-v0.1.llamafile/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M-server.llamafile
Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 736, in completion
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 655, in completion
    return self.streaming(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 833, in streaming
    response = openai_client.chat.completions.create(**data, timeout=timeout)
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
  File "/home/dot/.local/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 606, in create
    return self._post(
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 1112, in completion
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 1085, in completion
    response = openai_chat_completions.completion(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 742, in completion
    raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 344, in fixed_litellm_completions
    yield from litellm.completion(**params)
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3472, in wrapper
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3363, in wrapper
    result = original_function(*args, **kwargs)
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 2480, in completion
    raise exception_type(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 9927, in exception_type
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 8500, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/respond.py", line 78, in respond
    for chunk in interpreter.llm.run(messages_for_llm):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 263, in run
    yield from run_function_calling_llm(self, params)
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/run_function_calling_llm.py", line 44, in run_function_calling_llm
    for chunk in llm.completions(**request_params):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 347, in fixed_litellm_completions
    raise first_error
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 328, in fixed_litellm_completions
    yield from litellm.completion(**params)
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3472, in wrapper
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3363, in wrapper
    result = original_function(*args, **kwargs)
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 2480, in completion
    raise exception_type(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 9927, in exception_type
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 8467, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 736, in completion
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 655, in completion
    return self.streaming(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 813, in streaming
    openai_client = self._get_openai_client(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 550, in _get_openai_client
    _new_client = OpenAI(
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/interpreter", line 8, in <module>
    sys.exit(main())
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 509, in main
    start_terminal_interface(interpreter)
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 475, in start_terminal_interface
    interpreter.chat()
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/core.py", line 200, in chat
    for _ in self._streaming_chat(message=message, display=display):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/core.py", line 232, in _streaming_chat
    yield from terminal_interface(self, message)
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/terminal_interface/terminal_interface.py", line 133, in terminal_interface
    for chunk in interpreter.chat(message, display=False, stream=True):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/core.py", line 271, in _streaming_chat
    yield from self._respond_and_store()
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/core.py", line 321, in _respond_and_store
    for chunk in respond(self):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/respond.py", line 101, in respond
    raise Exception(
Exception: Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 736, in completion
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 655, in completion
    return self.streaming(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 833, in streaming
    response = openai_client.chat.completions.create(**data, timeout=timeout)
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
  File "/home/dot/.local/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 606, in create
    return self._post(
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 1112, in completion
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 1085, in completion
    response = openai_chat_completions.completion(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 742, in completion
    raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 344, in fixed_litellm_completions
    yield from litellm.completion(**params)
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3472, in wrapper
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3363, in wrapper
    result = original_function(*args, **kwargs)
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 2480, in completion
    raise exception_type(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 9927, in exception_type
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 8500, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/respond.py", line 78, in respond
    for chunk in interpreter.llm.run(messages_for_llm):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 263, in run
    yield from run_function_calling_llm(self, params)
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/run_function_calling_llm.py", line 44, in run_function_calling_llm
    for chunk in llm.completions(**request_params):
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 347, in fixed_litellm_completions
    raise first_error
  File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 328, in fixed_litellm_completions
    yield from litellm.completion(**params)
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3472, in wrapper
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3363, in wrapper
    result = original_function(*args, **kwargs)
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 2480, in completion
    raise exception_type(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 9927, in exception_type
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 8467, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Traceback (most recent call last):
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 736, in completion
    raise e
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 655, in completion
    return self.streaming(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 813, in streaming
    openai_client = self._get_openai_client(
  File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 550, in _get_openai_client
    _new_client = OpenAI(
  File "/home/dot/.local/lib/python3.10/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable



There might be an issue with your API key(s).

To reset your API key (we'll use OPENAI_API_KEY for this example, but you may need to reset your ANTHROPIC_API_KEY, HUGGINGFACE_API_KEY, etc):
        Mac/Linux: 'export OPENAI_API_KEY=your-key-here'. Update your ~/.zshrc on MacOS or ~/.bashrc on Linux with the new key if it has already been persisted there.,
        Windows: 'setx OPENAI_API_KEY your-key-here' then restart terminal.


dot@meta:~/Cross-Link-Manager$ wget https://huggingface.co/jartine/Mixtral-8x7B-v0.1.llamafile/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M-server.llamafile
Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
ERROR: could not open HSTS store at '/home/dot/.wget-hsts'. HSTS will be disabled.
--2024-06-19 16:35:48--  https://huggingface.co/jartine/Mixtral-8x7B-v0.1.llamafile/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M-server.llamafile
Resolving huggingface.co (huggingface.co)... 2600:9000:234c:dc00:17:b174:6d00:93a1, 2600:9000:234c:8e00:17:b174:6d00:93a1, 2600:9000:234c:e800:17:b174:6d00:93a1, ...
Connecting to huggingface.co (huggingface.co)|2600:9000:234c:dc00:17:b174:6d00:93a1|:443... connected.
HTTP request sent, awaiting response... 401 Unauthorized

Username/Password Authentication Failed.
dot@meta:~/Cross-Link-Manager$ 
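The 401 in the traceback says the client saw the literal key `x`, i.e. `OPENAI_API_KEY` was not actually set in the shell that launched `interpreter`. A minimal sanity check (the key value below is a placeholder, not a real key) is:

```shell
# Confirm OPENAI_API_KEY is exported in the shell that will launch interpreter.
# If printenv prints nothing, the export did not take effect in this shell.
export OPENAI_API_KEY=sk-your-key-here   # placeholder, not a real key
printenv OPENAI_API_KEY
```

If the variable was added to `~/.bashrc`, it only applies to new shells; either open a fresh terminal or run `source ~/.bashrc` first.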

@accessor-io
Author

[screenshot]

Nothing happens.
