12 changes: 6 additions & 6 deletions api-reference/queries/endpoint/unarchive.mdx
@@ -14,7 +14,7 @@ In Dune context, delete action is replaced by archive as deletion of queries is
<RequestExample>

```bash cURL
curl --request GET \
curl --request POST \
--url https://api.dune.com/api/v1/query/{queryId}/unarchive \
--header 'X-DUNE-API-KEY: <x-dune-api-key>'
```
@@ -37,14 +37,14 @@ url = "https://api.dune.com/api/v1/query/{queryId}/unarchive"

headers = {"X-DUNE-API-KEY": "<x-dune-api-key>"}

response = requests.request("GET", url, headers=headers)
response = requests.request("POST", url, headers=headers)

print(response.text)

```

```javascript JavaScript
const options = {method: 'GET', headers: {'X-DUNE-API-KEY': '<x-dune-api-key>'}};
const options = {method: 'POST', headers: {'X-DUNE-API-KEY': '<x-dune-api-key>'}};

fetch('https://api.dune.com/api/v1/query/{queryId}/unarchive', options)
.then(response => response.json())
@@ -65,7 +65,7 @@ func main() {

url := "https://api.dune.com/api/v1/query/{queryId}/unarchive"

req, _ := http.NewRequest("GET", url, nil)
req, _ := http.NewRequest("POST", url, nil)

req.Header.Add("X-DUNE-API-KEY", "<x-dune-api-key>")

@@ -92,7 +92,7 @@ curl_setopt_array($curl, [
CURLOPT_MAXREDIRS => 10,
CURLOPT_TIMEOUT => 30,
CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
CURLOPT_CUSTOMREQUEST => "GET",
CURLOPT_CUSTOMREQUEST => "POST",
CURLOPT_HTTPHEADER => [
"X-DUNE-API-KEY: <x-dune-api-key>"
],
@@ -111,7 +111,7 @@ if ($err) {
```

```java Java
HttpResponse<String> response = Unirest.get("https://api.dune.com/api/v1/query/{queryId}/unarchive")
HttpResponse<String> response = Unirest.post("https://api.dune.com/api/v1/query/{queryId}/unarchive")
.header("X-DUNE-API-KEY", "<x-dune-api-key>")
.asString();
```
12 changes: 6 additions & 6 deletions api-reference/queries/endpoint/unprivate.mdx
@@ -10,7 +10,7 @@ To access Query endpoints, an [Analyst plan](https://dune.com/pricing) or higher
<RequestExample>

```bash cURL
curl --request GET \
curl --request POST \
--url https://api.dune.com/api/v1/query/{queryId}/unprivate \
--header 'X-DUNE-API-KEY: <x-dune-api-key>'
```
@@ -33,14 +33,14 @@ url = "https://api.dune.com/api/v1/query/{queryId}/unprivate"

headers = {"X-DUNE-API-KEY": "<x-dune-api-key>"}

response = requests.request("GET", url, headers=headers)
response = requests.request("POST", url, headers=headers)

print(response.text)

```

```javascript JavaScript
const options = {method: 'GET', headers: {'X-DUNE-API-KEY': '<x-dune-api-key>'}};
const options = {method: 'POST', headers: {'X-DUNE-API-KEY': '<x-dune-api-key>'}};

fetch('https://api.dune.com/api/v1/query/{queryId}/unprivate', options)
.then(response => response.json())
@@ -61,7 +61,7 @@ func main() {

url := "https://api.dune.com/api/v1/query/{queryId}/unprivate"

req, _ := http.NewRequest("GET", url, nil)
req, _ := http.NewRequest("POST", url, nil)

req.Header.Add("X-DUNE-API-KEY", "<x-dune-api-key>")

@@ -88,7 +88,7 @@ curl_setopt_array($curl, [
CURLOPT_MAXREDIRS => 10,
CURLOPT_TIMEOUT => 30,
CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
CURLOPT_CUSTOMREQUEST => "GET",
CURLOPT_CUSTOMREQUEST => "POST",
CURLOPT_HTTPHEADER => [
"X-DUNE-API-KEY: <x-dune-api-key>"
],
@@ -107,7 +107,7 @@ if ($err) {
```

```java Java
HttpResponse<String> response = Unirest.get("https://api.dune.com/api/v1/query/{queryId}/unprivate")
HttpResponse<String> response = Unirest.post("https://api.dune.com/api/v1/query/{queryId}/unprivate")
.header("X-DUNE-API-KEY", "<x-dune-api-key>")
.asString();
```
12 changes: 6 additions & 6 deletions api-reference/queries/endpoint/update.mdx
@@ -35,22 +35,22 @@ query = dune.update_query(
)
```

```python Python
```bash cURL
curl --request PATCH \
--url https://api.dune.com/api/v1/query/{queryId} \
--header 'Content-Type: application/json' \
--header 'X-DUNE-API-KEY: <x-dune-api-key>' \
--data '{
"query_id": 1252207,
"query_sql": "{{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}"
"query_sql": "SELECT * FROM {{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}"
}'
```

```javascript JavaScript
const options = {
method: 'PATCH',
headers: {'X-DUNE-API-KEY': '<x-dune-api-key>', 'Content-Type': 'application/json'},
body: '{"query_id":1252207,"query_sql":"{{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}"}'
body: '{"query_id":1252207,"query_sql":"SELECT * FROM {{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}"}'
};

fetch('https://api.dune.com/api/v1/query/{queryId}', options)
@@ -73,7 +73,7 @@ func main() {

url := "https://api.dune.com/api/v1/query/{queryId}"

payload := strings.NewReader("{\n \"query_id\": 1252207,\n ,\n \"query_sql\": \"{{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}")
Contributor: i think we can keep the \n and also add the SELECT * FROM

Contributor (author): The ,\n makes this invalid json. But yes the query_sql should have select * from.

payload := strings.NewReader("{\n \"query_id\": 1252207,\n \"query_sql\": \"SELECT * FROM {{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}")

req, _ := http.NewRequest("PATCH", url, payload)

@@ -104,7 +104,7 @@ curl_setopt_array($curl, [
CURLOPT_TIMEOUT => 30,
CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
CURLOPT_CUSTOMREQUEST => "PATCH",
CURLOPT_POSTFIELDS => "{\n \"query_id\": 1252207,\n \"query_sql\": \"{{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}",
CURLOPT_POSTFIELDS => "{\n \"query_id\": 1252207,\n \"query_sql\": \"SELECT * FROM {{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}",
CURLOPT_HTTPHEADER => [
"Content-Type: application/json",
"X-DUNE-API-KEY: <x-dune-api-key>"
@@ -127,7 +127,7 @@ if ($err) {
HttpResponse<String> response = Unirest.patch("https://api.dune.com/api/v1/query/{queryId}")
.header("X-DUNE-API-KEY", "<x-dune-api-key>")
.header("Content-Type", "application/json")
.body("{\n \"query_id\": 1252207,\n \"query_sql\": \"{{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}")
.body("{\n \"query_id\": 1252207,\n \"query_sql\": \"SELECT * FROM {{blockchain}}.transactions WHERE to = {{address}} AND block_number > {{blocknumber}}\"}")
.asString();
```

2 changes: 1 addition & 1 deletion api-reference/quickstart/queries-eg.mdx
@@ -6,7 +6,7 @@ In this quickstart, we will walk through how to turn any dashboard (or set of qu

### Prerequisites
- Python environment set up (check out [Anaconda Navigator](https://docs.continuum.io/free/navigator/) if you want somewhere to start.)
- Have a Dune API key from the team/user who's queries you want to manage (to obtain one [follow the steps here](../overview/authentication#generate-an-api-key))
- Have a Dune API key from the team/user whose queries you want to manage (to obtain one [follow the steps here](../overview/authentication#generate-an-api-key))

### Set up a new repo from the GitHub template

44 changes: 22 additions & 22 deletions api-reference/quickstart/results-eg.mdx
@@ -97,27 +97,27 @@ You can choose to either get the latest query result without triggering an execu
<Tabs>
<Tab title="Get latest result without execution">
```python
query_result = dune.get_latest_result(3373921) # get latest result in json format
# query_result = dune.get_latest_result_dataframe(3373921) # get latest result in Pandas dataframe format
```
query_result = dune.get_latest_result(3373921) # get latest result in json format
# query_result = dune.get_latest_result_dataframe(3373921) # get latest result in Pandas dataframe format
```
</Tab>
<Tab title="Query a query">
```python
query = QueryBase(
query_id=3373921,

# uncomment and change the parameter values if needed
# params=[
# QueryParameter.text_type(name="contract", value="0x6B175474E89094C44Da98b954EedeAC495271d0F"), # default is DAI
# QueryParameter.text_type(name="owner", value="owner"), # default using vitalik.eth's wallet
# ],
)
<Tab title="Query a query">
```python
query = QueryBase(
query_id=3373921,

# uncomment and change the parameter values if needed
# params=[
# QueryParameter.text_type(name="contract", value="0x6B175474E89094C44Da98b954EedeAC495271d0F"), # default is DAI
# QueryParameter.text_type(name="owner", value="owner"), # default using vitalik.eth's wallet
# ],
)

query_result = dune.run_query_dataframe(
query=query
# , ping_frequency = 10 # uncomment to change the seconds between checking execution status, default is 1 second
# , performance="large" # uncomment to run query on large engine, default is medium
# , batch_size = 5_000 # uncomment to change the maximum number of rows to retrieve per batch of results, default is 32_000
query=query
# , ping_frequency = 10 # uncomment to change the seconds between checking execution status, default is 1 second
# , performance="large" # uncomment to run query on large engine, default is medium
# , batch_size = 5_000 # uncomment to change the maximum number of rows to retrieve per batch of results, default is 32_000
)

# Note: to get the result in csv format, call run_query_csv(); for json format, call run_query().
@@ -164,10 +164,10 @@ You can choose to either get the latest query result without triggering an execu
)

query_result = dune.run_query_dataframe(
query=query
# , ping_frequency = 10 # uncomment to change the seconds between checking execution status, default is 1 second
# , performance="large" # uncomment to run query on large engine, default is medium
# , batch_size = 5_000 # uncomment to change the maximum number of rows to retrieve per batch of results, default is 32_000
query=query
# , ping_frequency = 10 # uncomment to change the seconds between checking execution status, default is 1 second
# , performance="large" # uncomment to run query on large engine, default is medium
# , batch_size = 5_000 # uncomment to change the maximum number of rows to retrieve per batch of results, default is 32_000
)

# Note: to get the result in csv format, call run_query_csv(); for json format, call run_query().
4 changes: 2 additions & 2 deletions api-reference/quickstart/tables-eg.mdx
@@ -16,7 +16,7 @@ DUNE_API_KEY=<paste your API key here>
</Tip>

### Upload the CSV
Follow below steps to upload your CSV. Please make sure to modify paths to your .env file and to your CSV file.
Follow the below steps to upload your CSV. Please make sure to modify paths to your .env file and to your CSV file.

```python
import dotenv, os
@@ -45,7 +45,7 @@ with open(csv_file_path) as open_file:

Once the upload is successful, you will see the data show up under [Your Data](https://dune.com/queries?category=uploaded_data) in the Data Explorer.

You can query your uploaded table under the name `dune.<team or user handle>.dataset_<table name defined>`. For example, here I defined the table name to be "cereal_table" and my team name is "dune", so to access the uploaded table we will do `select * from dune.dune.dataset_cereal_table`
You can query your uploaded table under the name `dune.<team or user handle>.dataset_<table name defined>`. For example, here I defined the table name to be "cereal_table" and my team name is "dune", so to access the uploaded table we will do `select * from dune.dune.dataset_cereal_table`.



2 changes: 1 addition & 1 deletion api-reference/tables/endpoint/clear.mdx
@@ -23,7 +23,7 @@ headers = {
"X-DUNE-API-KEY": "<x-dune-api-key>"
}

response = requests.request("POST", url, data=data, headers=headers)
Contributor: keep as is

response = requests.request("POST", url, headers=headers)
```

</RequestExample>
4 changes: 2 additions & 2 deletions api-reference/tables/endpoint/create.mdx
@@ -13,7 +13,7 @@ The resulting table will be empty, and can be inserted into with the [/insert en
</Note>

## Schema
You need to define the schema of your data by providing `schema` array of columns in the request. Each column has three parameters:
You need to define the schema of your data by providing a `schema` array of columns in the request. Each column has three parameters:

**name**: the name of the field

@@ -61,7 +61,7 @@ table = dune.create_table(
```typescript TS SDK
import { DuneClient, ColumnType } from "@duneanalytics/client-sdk";

client = new DuneClient(process.env.DUNE_API_KEY!);
const client = new DuneClient(process.env.DUNE_API_KEY!);
const result = await client.table.create({
namespace: "my_user",
table_name: "interest_rates",
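
For orientation on the Schema section above, a hedged Python sketch of the raw create request — the endpoint path, the column fields other than `name`, and the type strings are assumptions, since the rest of the parameter list is collapsed in this diff:

```python
import requests

url = "https://api.dune.com/api/v1/table/create"  # assumed path for the create endpoint
headers = {
    "X-DUNE-API-KEY": "<x-dune-api-key>",
    "Content-Type": "application/json",
}
payload = {
    "namespace": "my_user",
    "table_name": "interest_rates",
    # Column fields beyond "name" and the type strings below are assumptions.
    "schema": [
        {"name": "rate_date", "type": "timestamp"},
        {"name": "rate", "type": "double"},
    ],
}

response = requests.post(url, json=payload, headers=headers)
print(response.status_code, response.text)
```
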
10 changes: 5 additions & 5 deletions api-reference/tables/endpoint/insert.mdx
@@ -7,7 +7,7 @@ To be able to insert into a table, it must have been created with the [/create e

<Note>
- The data in the files must conform to the schema, and must use the same column names as the schema.
- One successful `/insert` request consumes 1 credits.
- One successful `/insert` request consumes 1 credit.
- The maximum request size is 1.2GB
</Note>

@@ -19,7 +19,7 @@ A status code of 200 means that the data in the request was successfully inserte
If you get any other status code, you can safely retry your request after addressing the issue that the error message indicated.
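
As a rough sketch of that retry guidance (the endpoint path, namespace, table name, and file below are illustrative placeholders, not taken from this diff):

```python
import time

import requests

# Illustrative coordinates; substitute your own namespace, table, and file.
url = "https://api.dune.com/api/v1/table/my_user/interest_rates/insert"
headers = {
    "X-DUNE-API-KEY": "<x-dune-api-key>",
    "Content-Type": "text/csv",
}

with open("./interest_rates.csv", "rb") as f:
    payload = f.read()

for attempt in range(3):
    response = requests.post(url, data=payload, headers=headers)
    if response.status_code == 200:
        break  # 200 means everything was inserted; partial inserts don't happen
    # Any other status: inspect the error, fix the issue, then retry safely.
    print(response.status_code, response.text)
    time.sleep(2 ** attempt)
```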

## Concurrent requests
A limited number of concurrent insertion requests per table is supported. However, there will be a slight performance penalty as we serialize the writes behind the scenes to ensure data integrity. Larger number of concurrent requests per table may result in an increased number of failures. Therefore, we recommend managing your requests within a 5-10 threshold to maintain optimal performance.
A limited number of concurrent insertion requests per table is supported. However, there will be a slight performance penalty as we serialize the writes behind the scenes to ensure data integrity. A larger number of concurrent requests per table may result in an increased number of failures. Therefore, we recommend keeping concurrency within a 5-10 request threshold to maintain optimal performance.
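
A minimal sketch of staying inside that 5-10 guideline, assuming a batch of pre-split CSV files and an illustrative endpoint path:

```python
from concurrent.futures import ThreadPoolExecutor

import requests

API_URL = "https://api.dune.com/api/v1/table/my_user/interest_rates/insert"  # illustrative
HEADERS = {"X-DUNE-API-KEY": "<x-dune-api-key>", "Content-Type": "text/csv"}

def insert_file(path: str) -> int:
    """POST one CSV file to the insert endpoint and return the status code."""
    with open(path, "rb") as f:
        response = requests.post(API_URL, data=f.read(), headers=HEADERS)
    return response.status_code

files = ["./part_1.csv", "./part_2.csv", "./part_3.csv"]  # illustrative batch

# Cap in-flight inserts at 5 so requests stay inside the recommended 5-10 range.
with ThreadPoolExecutor(max_workers=5) as pool:
    print(list(pool.map(insert_file, files)))
```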

## Supported filetypes
### CSV files (`Content-Type: text/csv`)
@@ -32,15 +32,15 @@ Each line must have keys that match the column names of the table.
## Data types
DuneSQL supports a variety of types which are not natively supported in many data exchange formats. Here we provide guidance on how to work with such types.
### Varbinary values
When uploading varbinary data using JSON or CSV formats, you need to convert the binary data into a textual representation. Reason being, JSON or CSV don't natively support binary values. There are many ways to transform binary data to a textual representation. We support **hexadecimal** and **base64** encodings.
When uploading varbinary data using JSON or CSV formats, you need to convert the binary data into a textual representation. The reason being, JSON or CSV don't natively support binary values. There are many ways to transform binary data to a textual representation. We support **hexadecimal** and **base64** encodings.

#### base64
Base64 is a binary-to-text encoding scheme that transforms binary data into a sequence of characters. All characters are taken from a set of 64 characters.

Example: `{"varbinary_column":"SGVsbG8gd29ybGQK"}`

#### hexadecimal
In the hexadecimal representation input data should contain an even number of characters in the range `[0-9a-fA-F]` always prefixed with `0x`.
In the hexadecimal representation, input data should contain an even number of characters in the range `[0-9a-fA-F]` always prefixed with `0x`.

Example: `{"varbinary_column":"0x92b7d1031988c7af"}`

@@ -74,7 +74,7 @@ with open("./interest_rates.csv", "rb") as data:
import * as fs from "fs/promises";
import { DuneClient, ContentType } from "@duneanalytics/client-sdk";

client = new DuneClient(process.env.DUNE_API_KEY!);
const client = new DuneClient(process.env.DUNE_API_KEY!);
const data = await fs.readFile("./sample_table_insert.csv");
// Or JSON
// const data: Buffer = await fs.readFile("./sample_table_insert.json");
2 changes: 1 addition & 1 deletion api-reference/tables/endpoint/upload.mdx
@@ -16,7 +16,7 @@ For working with uploads, keep in mind that:
- File has to be < 200 MB
- Column names in the table can't start with a special character or digits.
- Private uploads require a Premium subscription.
- If you upload to an existing table name, it will delete the old data and overwite it with your new data. Appends are only supported for the `/create`, `/insert` endpoints.
- If you upload to an existing table name, it will delete the old data and overwrite it with your new data. Appends are only supported for the `/create`, `/insert` endpoints.
- To delete an upload table, you must go to `user settings (dune.com) -> data -> delete`.

If you have larger datasets you want to upload, please [contact us here](https://docs.google.com/forms/d/e/1FAIpQLSekx61WzIh-MII18zRj1G98aJeLM7U0VEBqaa6pVk_DQ7lq6Q/viewform)