README.md

Additionally, for Windows:

### 1. Install Skyvern

```bash
pip install skyvern
```


### 2. Run Skyvern
This is most helpful for the first run (database setup, migrations, etc.).

```bash
skyvern quickstart
```


### 3. Run a task

#### UI (Recommended)

Start the Skyvern service and UI (once the database is up and running):

```bash
skyvern run all
```


Go to http://localhost:8080 and use the UI to run a task.

#### Code

```python
from skyvern import Skyvern

skyvern = Skyvern()
task = await skyvern.run_task(prompt="Find the top post on hackernews today")
print(task)
```

Skyvern runs the task in a browser window that pops up and closes it when the task is done. You can review the task at http://localhost:8080/history

You can also run a task on different targets:
```python
from skyvern import Skyvern

# Run on Skyvern Cloud
skyvern = Skyvern(api_key="SKYVERN API KEY")

# Run on your local Skyvern service
skyvern = Skyvern(base_url="http://localhost:8000", api_key="LOCAL SKYVERN API KEY")

task = await skyvern.run_task(prompt="Find the top post on hackernews today")
print(task)
```


## Advanced Usage

### Control your own browser (Chrome)
> ⚠️ WARNING: Since [Chrome 136](https://developer.chrome.com/blog/remote-debugging-port), Chrome refuses CDP connections to a browser using the default user_data_dir. To use your browser data, Skyvern copies your default user_data_dir to `./tmp/user_data_dir` the first time it connects to your local browser. ⚠️
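The copy-on-first-connect behavior above can be sketched as follows (illustrative only; the function name and arguments are ours, not Skyvern's internals):

```python
import shutil
from pathlib import Path

def copy_user_data_dir(src: Path, dst: Path) -> Path:
    """Copy the Chrome profile once so CDP can attach to a non-default dir."""
    if not dst.exists():
        # First connection: snapshot the default profile.
        shutil.copytree(src, dst)
    return dst
```

Subsequent connections reuse the copy, so changes made in your day-to-day Chrome profile after the first run are not reflected.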

1. With Python Code
```python
from skyvern import Skyvern

# The path to your Chrome browser. This example path is for Mac.
browser_path = "/Applications/Google Chrome.app/Contents/MacOS/Google Chrome"
skyvern = Skyvern(
    browser_path=browser_path,
)
task = await skyvern.run_task(
prompt="Find the top post on hackernews today",
)
```


2. With Skyvern Service

Add two variables to your .env file:
```bash
# The path to your Chrome browser. This example path is for Mac.
CHROME_EXECUTABLE_PATH="/Applications/Google Chrome.app/Contents/MacOS/Google Chrome"
BROWSER_TYPE=cdp-connect
```


Restart the Skyvern service with `skyvern run all` and run the task through the UI or code.
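A quick way to confirm both variables are visible to the service process (a hedged sketch; the variable names come from the `.env` example above):

```python
import os

def cdp_config_ok(env=os.environ) -> bool:
    """Check that the CDP-connect settings from .env are present."""
    return bool(env.get("CHROME_EXECUTABLE_PATH")) and env.get("BROWSER_TYPE") == "cdp-connect"
```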

### Run Skyvern with any remote browser
Grab the CDP connection URL and pass it to Skyvern:

```python
from skyvern import Skyvern

skyvern = Skyvern(cdp_url="your cdp connection url")
task = await skyvern.run_task(
prompt="Find the top post on hackernews today",
)
```
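If you start Chrome yourself with `--remote-debugging-port=9222`, the CDP URL can be read from the standard DevTools `/json/version` endpoint. A small sketch (helper names are ours):

```python
import json
from urllib.request import urlopen

def cdp_url_from_version_payload(payload: str) -> str:
    """Extract the browser-level CDP URL from a /json/version response."""
    return json.loads(payload)["webSocketDebuggerUrl"]

def get_cdp_url(host: str = "http://localhost:9222") -> str:
    """Fetch the CDP URL from a locally running Chrome."""
    with urlopen(f"{host}/json/version") as resp:
        return cdp_url_from_version_payload(resp.read().decode())
```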


### Get consistent output schema from your run
You can do this by adding the `data_extraction_schema` parameter:
```python
from skyvern import Skyvern

skyvern = Skyvern()
task = await skyvern.run_task(
    prompt="Find the top post on hackernews today",
    data_extraction_schema={
        "type": "object",
        "properties": {
            "title": {
                "type": "string",
                "description": "The title of the top post"
            }
        }
    }
)
```
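To see why a schema helps, here is a toy check that an extracted result carries the declared top-level keys (the output shape is a hypothetical example, not Skyvern's exact return type):

```python
def has_declared_keys(output: dict, schema: dict) -> bool:
    """True if every property declared in an object schema is present."""
    if schema.get("type") != "object":
        return True  # only object schemas are checked in this sketch
    return all(key in output for key in schema.get("properties", {}))
```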


### Helpful commands to debug issues


```bash
# Launch the Skyvern Server Separately
skyvern run server

# Stop the Skyvern UI
skyvern stop ui

# Stop the Skyvern Server Separately
skyvern stop server
```


## Docker Compose setup

Make sure to have [uv](https://docs.astral.sh/uv/getting-started/installation/) installed.

More extensive documentation can be found on our [📕 docs page](https://www.skyvern.com/docs). Please let us know if something is unclear or missing by opening an issue or reaching out to us [via email](mailto:[email protected]) or [discord](https://discord.gg/fG2XXEuQX3).

If you want to chat with the Skyvern repository to get a high-level overview of how it is structured, how to build on it, and how to resolve usage questions, check out [Code Sage](https://sage.storia.ai?utm_source=github&utm_medium=referral&utm_campaign=skyvern-readme).

# Supported LLMs
| Provider | Supported Models |
| -------- | ------- |
| Gemini | Gemini 2.5 Pro and flash, Gemini 2.0 |
| Ollama | Run any locally hosted model via [Ollama](https://github.com/ollama/ollama) |
| OpenRouter | Access models through [OpenRouter](https://openrouter.ai) |
| OpenAI-compatible | Any custom API endpoint that follows OpenAI's API format (via [liteLLM](https://docs.litellm.ai/docs/providers/openai_compatible)), including CometAPI |

#### Environment Variables

Recommended `LLM_KEY`: `OPENROUTER`
##### OpenAI-Compatible
| Variable | Description| Type | Sample Value|
| -------- | ------- | ------- | ------- |
| `ENABLE_OPENAI_COMPATIBLE`| Register a custom OpenAI-compatible API endpoint (e.g., CometAPI) | Boolean | `true`, `false` |
| `OPENAI_COMPATIBLE_MODEL_NAME` | Model name for OpenAI-compatible endpoint (e.g., `cometapi/gpt-4o`, `yi-34b`, `gpt-3.5-turbo`, `mistral-large`, etc.) | String | `cometapi/gpt-4o`|
| `OPENAI_COMPATIBLE_API_KEY` | API key for OpenAI-compatible endpoint (e.g., CometAPI API Key) | String | `sk-comet-1234567890`|
| `OPENAI_COMPATIBLE_API_BASE` | Base URL for OpenAI-compatible endpoint (e.g., `https://api.cometapi.com`, `https://api.together.xyz/v1`, `http://localhost:8000/v1`, etc.) | String | `https://api.cometapi.com`|
| `OPENAI_COMPATIBLE_API_VERSION` | API version for OpenAI-compatible endpoint, optional| String | `2023-05-15`|
| `OPENAI_COMPATIBLE_MAX_TOKENS` | Maximum tokens for completion, optional| Integer | `4096`, `8192`, etc.|
| `OPENAI_COMPATIBLE_TEMPERATURE` | Temperature setting, optional| Float | `0.0`, `0.5`, `0.7`, etc.|
| `OPENAI_COMPATIBLE_SUPPORTS_VISION` | Whether model supports vision, optional (e.g., `true` for CometAPI multimodal models) | Boolean | `true`, `false`|

Supported LLM Key: `OPENAI_COMPATIBLE`

Note: For CometAPI, you can also use `https://api.cometapi.com/v1` or `https://api.cometapi.com/v1/chat/completions` as `OPENAI_COMPATIBLE_API_BASE`.
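Every endpoint registered this way speaks the same wire format. A sketch of the JSON body an OpenAI-compatible `/v1/chat/completions` call carries (the model name is a placeholder):

```python
import json

def chat_request_body(model: str, prompt: str) -> str:
    """Build the JSON body for an OpenAI-compatible chat completion call."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
```

Because the shape is fixed, swapping providers only means changing the base URL, API key, and model name.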

##### General LLM Configuration
| Variable | Description| Type | Sample Value|
| -------- | ------- | ------- | ------- |
If you have any questions or concerns around licensing, please [contact us](mailto:[email protected]).

# Star History

[![Star History Chart](https://api.star-history.com/svg?repos=Skyvern-AI/skyvern&type=Date)](https://star-history.com/#Skyvern-AI/skyvern&Date)
docker-compose.yml

services:
# - LLM_KEY=OPENROUTER
# - OPENROUTER_API_KEY=<your_openrouter_api_key>
# - OPENROUTER_MODEL=mistralai/mistral-small-3.1-24b-instruct
# CometAPI Support:
# CometAPI provides access to 500+ LLMs and multimodal models via an OpenAI-compatible endpoint.
# - ENABLE_OPENAI_COMPATIBLE=true
# - LLM_KEY=OPENAI_COMPATIBLE
# - OPENAI_COMPATIBLE_API_KEY=<your_cometapi_key>
# - OPENAI_COMPATIBLE_API_BASE=https://api.cometapi.com
# - OPENAI_COMPATIBLE_MODEL_NAME=cometapi/gpt-4o # Replace with the specific model you want to use on CometAPI (e.g., cometapi/claude-3-opus)
# - OPENAI_COMPATIBLE_SUPPORTS_VISION=true # Set to true if your chosen model supports vision (e.g., for multimodal models)
# Groq Support:
# - ENABLE_GROQ=true
# - LLM_KEY=GROQ
# # Optional: persist Bitwarden CLI config
# - ~/bitwarden-cli-config:/app/.config
# labels:
# - "traefik.enable=false" # Don't expose via reverse proxy