2 changes: 1 addition & 1 deletion docs/code-execution/settings.mdx
@@ -4,4 +4,4 @@ title: Settings

The `interpreter.computer` is responsible for executing code.

[Click here to view `interpreter.computer` settings.](https://docs.openinterpreter.com/settings/all-settings#computer)
[Click here](https://docs.openinterpreter.com/settings/all-settings#computer) to view `interpreter.computer` settings.
9 changes: 9 additions & 0 deletions docs/getting-started/setup.mdx
@@ -2,6 +2,15 @@
title: Setup
---

<iframe
width="560"
height="315"
src="https://www.youtube.com/watch?v=5sk3t8ilDR8"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>

## Installation from `pip`

If you are familiar with Python, we recommend installing Open Interpreter via `pip`
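For quick reference, the command typically used to install the package from PyPI is shown below; it assumes Python 3 and `pip` are already available on your system.

```shell
pip install open-interpreter
```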
38 changes: 24 additions & 14 deletions docs/guides/basic-usage.mdx
@@ -28,18 +28,20 @@ title: Basic Usage

### Interactive Chat

To start an interactive chat in your terminal, either run `interpreter` from the command line:
To start an interactive chat in your terminal, either run `interpreter` from the command line or call `interpreter.chat()` from a .py file.

```shell
<CodeGroup>

```shell Terminal
interpreter
```

Or `interpreter.chat()` from a .py file:

```python
```python Python
interpreter.chat()
```

</CodeGroup>

---

### Programmatic Chat
@@ -60,18 +62,22 @@ interpreter.chat("These look great but can you make the subtitles bigger?")

### Start a New Chat

In your terminal, Open Interpreter behaves like ChatGPT and will not remember previous conversations. Simply run `interpreter` to start a new chat:
In your terminal, Open Interpreter behaves like ChatGPT and will not remember previous conversations. Simply run `interpreter` to start a new chat.

```shell
In Python, Open Interpreter remembers conversation history. If you want to start fresh, you can reset it.

<CodeGroup>

```shell Terminal
interpreter
```

In Python, Open Interpreter remembers conversation history. If you want to start fresh, you can reset it:

```python
```python Python
interpreter.messages = []
```

</CodeGroup>

---

### Save and Restore Chats
@@ -80,13 +86,15 @@ In your terminal, Open Interpreter will save previous conversations to `<your ap

You can resume any of them by running `--conversations`. Use your arrow keys to select one, then press `ENTER` to resume it.

```shell
In Python, `interpreter.chat()` returns a List of messages, which can be used to resume a conversation with `interpreter.messages = messages`.

<CodeGroup>

```shell Terminal
interpreter --conversations
```

In Python, `interpreter.chat()` returns a List of messages, which can be used to resume a conversation with `interpreter.messages = messages`:

```python
```python Python
# Save messages to 'messages'
messages = interpreter.chat("My name is Killian.")

@@ -97,6 +105,8 @@ interpreter.messages = []
interpreter.messages = messages
```

</CodeGroup>

---

### Configure Default Settings
12 changes: 0 additions & 12 deletions docs/guides/profiles.mdx
@@ -31,15 +31,3 @@ interpreter.loop = True
There are many settings that can be configured. [See them all
here](/settings/all-settings)
</Tip>

## Helpful settings for local models

Local models benefit from more coercion and guidance. This verbosity of adding extra context to messages can impact the conversational experience of Open Interpreter. The following settings allow templates to be applied to messages to improve the steering of the language model while maintaining the natural flow of conversation.

`interpreter.user_message_template` allows users to have their message wrapped in a template. This can be helpful steering a language model to a desired behaviour without needing the user to add extra context to their message.

`interpreter.always_apply_user_message_template` has all user messages to be wrapped in the template. If False, only the last User message will be wrapped.

`interpreter.code_output_template` wraps the output from the computer after code is run. This can help with nudging the language model to continue working or to explain outputs.

`interpreter.empty_code_output_template` is the message that is sent to the language model if code execution results in no output.
12 changes: 12 additions & 0 deletions docs/guides/running-locally.mdx
@@ -49,6 +49,18 @@ interpreter.llm.api_base = "http://localhost:11434"
interpreter.chat("how many files are on my desktop?")
```

## Helpful settings for local models

Local models benefit from more coercion and guidance, but the extra context this adds to messages can impact the conversational experience of Open Interpreter. The following settings apply templates to messages to improve the steering of the language model while maintaining the natural flow of conversation.

`interpreter.user_message_template` wraps the user's message in a template. This can help steer a language model toward a desired behaviour without requiring the user to add extra context to their message.

`interpreter.always_apply_user_message_template` causes every user message to be wrapped in the template. If `False`, only the last user message is wrapped.

`interpreter.code_output_template` wraps the output from the computer after code is run. This can help nudge the language model to continue working or to explain outputs.

`interpreter.empty_code_output_template` is the message that is sent to the language model if code execution results in no output.
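As a rough sketch of how these settings fit together, the example below sets all four. The template strings are illustrative assumptions rather than shipped defaults, and it assumes `{content}` is the placeholder in the output templates as well as in the user message template.

```python
from interpreter import interpreter

# Hypothetical templates: tune the wording for your local model.
# {content} is replaced with the user's message before it is sent to the model.
interpreter.user_message_template = (
    "{content} Please write and run code to accomplish this."
)

# Wrap only the last user message instead of every one.
interpreter.always_apply_user_message_template = False

# Nudge the model to keep working or to explain results
# (assuming {content} also stands in for the captured output).
interpreter.code_output_template = (
    "The code produced this output: {content}\n"
    "If the task is not finished, continue; otherwise explain the result."
)
interpreter.empty_code_output_template = (
    "The code ran without printing any output. Continue, or explain what happened."
)

interpreter.chat("How many files are on my desktop?")
```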

<Info>
Other configuration settings are explained in
[Settings](/settings/all-settings).
21 changes: 8 additions & 13 deletions docs/language-models/introduction.mdx
@@ -10,25 +10,20 @@ For this reason, we recommend starting with a **hosted** model, then switching t

<CardGroup>

<Card
title="Hosted setup"
icon="cloud"
href="/language-models/hosted-models"
>
<Card title="Hosted setup" icon="cloud" href="/language-models/hosted-models">
Connect to a hosted language model like GPT-4 **(recommended)**
</Card>

<Card
title="Local setup"
icon="microchip"
href="/language-models/local-models"
>
<Card title="Local setup" icon="microchip" href="/language-models/local-models">
Set up a local language model like Mistral
</Card>

</CardGroup>

<br/>
<br/>
<br />
<br />

<span class="opacity-50">Thank you to the incredible [LiteLLM](https://litellm.ai/) team for their efforts in connecting Open Interpreter to hosted providers.</span>
<Info>
Thank you to the incredible [LiteLLM](https://litellm.ai/) team for their
efforts in connecting Open Interpreter to hosted providers.
</Info>
5 changes: 5 additions & 0 deletions docs/language-models/local-models/janai.mdx
@@ -49,3 +49,8 @@ interpreter.context_window = 5000
```

</CodeGroup>

<Warning>
If Jan is producing strange output, or no output at all, make sure to update
to the latest version and clean your cache.
</Warning>
2 changes: 1 addition & 1 deletion docs/language-models/settings.mdx
@@ -4,4 +4,4 @@ title: Settings

The `interpreter.llm` is responsible for running the language model.

[Click here to view `interpreter.llm` settings.](/settings/all-settings#language-model)
[Click here](/settings/all-settings#language-model) to view `interpreter.llm` settings.
4 changes: 2 additions & 2 deletions docs/mint.json
@@ -97,10 +97,10 @@
{
"group": "Code Execution",
"pages": [
"code-execution/settings",
"code-execution/usage",
"code-execution/computer-api",
"code-execution/custom-languages"
"code-execution/custom-languages",
"code-execution/settings"
]
},
{
7 changes: 6 additions & 1 deletion docs/safety/introduction.mdx
@@ -14,4 +14,9 @@ Safety is a top priority for us at Open Interpreter. Running LLM generated code

## Notice

Open Interpreter is not responsible for any damage caused by using the package. These safety measures provide no guarantees of safety or security. Please be careful when running code generated by Open Interpreter, and make sure you understand what it will do before running it.
<Warning>
Open Interpreter is not responsible for any damage caused by using the
package. These safety measures provide no guarantees of safety or security.
Please be careful when running code generated by Open Interpreter, and make
sure you understand what it will do before running it.
</Warning>
6 changes: 1 addition & 5 deletions docs/settings/all-settings.mdx
@@ -393,7 +393,7 @@ interpreter --help

</CodeGroup>

### Force Task Completion
### Loop (Force Task Completion)

Runs Open Interpreter in a loop, requiring it to admit to completing or failing every task.
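In Python, the equivalent is the `interpreter.loop` attribute. A minimal sketch, assuming a model is already configured:

```python
from interpreter import interpreter

# Keep looping until the model explicitly states that the task is complete or has failed.
interpreter.loop = True

interpreter.chat("Rename every .txt file on my desktop to include today's date.")
```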

Expand Down Expand Up @@ -622,8 +622,6 @@ This property holds a list of `messages` between the user and the interpreter.

You can use it to restore a conversation:

<CodeGroup>

```python
interpreter.chat("Hi! Can you print hello world?")

@@ -652,8 +650,6 @@ print(interpreter.messages)
interpreter.messages = messages # A list that resembles the one above
```

</CodeGroup>

### User Message Template

A template applied to the User's message. `{content}` will be replaced with the user's message, then sent to the language model.
19 changes: 10 additions & 9 deletions examples/README.md
@@ -2,17 +2,18 @@

This directory contains various examples demonstrating how to use Open Interpreter in different scenarios and configurations. Each example is designed to provide a practical guide to integrating and leveraging Open Interpreter's capabilities in your projects.

## Overview

- **Terminal Usage**: Examples of how to use Open Interpreter directly from your terminal.
- **Python Integration**: How to integrate Open Interpreter into your Python scripts for more complex workflows.
- **Custom Profiles**: Examples of using YAML files for setting default behaviors and configurations.

## Colab Notebooks

[Google Colab](https://colab.google/) provides a sandboxed development environment for you to run code in. Here are some Jupyter Notebooks on Colab that you can try:

Local 3: [![Local 3](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1jWKKwVCQneCTB5VNQNWO0Wxqg1vG_E1T#scrollTo=13ISLtY9_v7g)
Interactive Demo: [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1WKmRXZgsErej2xUriKzxrEAXdxMSgWbb?usp=sharing)
### Local 3

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1jWKKwVCQneCTB5VNQNWO0Wxqg1vG_E1T#scrollTo=13ISLtY9_v7g)

### Interactive Demo

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1WKmRXZgsErej2xUriKzxrEAXdxMSgWbb?usp=sharing)

### Voice Interface

Voice Interface: [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1NojYGHDgxH6Y1G1oxThEBBb2AtyODBIK)
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1NojYGHDgxH6Y1G1oxThEBBb2AtyODBIK)