Commit
Merge branch 'master' into the-test-pollution-hunt
haakonvt authored Jun 13, 2024
2 parents f6e3237 + e675cee commit e5c0509
Showing 34 changed files with 634 additions and 233 deletions.
2 changes: 1 addition & 1 deletion .devcontainer/devcontainer.json
@@ -3,7 +3,7 @@
"name": "Dev Container for Cognite Python SDK",

// Python base image reference: https://github.com/devcontainers/images/tree/main/src/python
"image": "mcr.microsoft.com/devcontainers/python:3.12-bullseye",
"image": "mcr.microsoft.com/devcontainers/python:3.8-bullseye",

// Features to add to the dev container. More info: https://containers.dev/features
"features": {
28 changes: 28 additions & 0 deletions .github/workflows/verify-jupyter.yml
@@ -0,0 +1,28 @@
---
name: build

on:
  pull_request:
    branches: [master]

jobs:
  build_and_test_jupyter_pyodide:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ./.github/actions/setup
      - name: Build package
        run: poetry build
      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "20"
      - name: Install dependencies
        run: npm install pyodide@0.25.0 # JupyterLite currently using pyodide 0.25.0
      - name: Install cognite-sdk in pyodide environment
        run: |
          whl_file=$(find dist -name "*.whl" | sed 's|^dist/||') # Find the built wheel file, remove dist/ prefix
          echo "Found built wheel file: $whl_file"
          SDK_FILE_PATH=$whl_file \
          PACKAGES="[\"pyodide-http\", \"http://localhost:3000/dist/$whl_file\"]" \
          node scripts/test-pyodide.js
28 changes: 28 additions & 0 deletions .github/workflows/verify-streamlit.yml
@@ -0,0 +1,28 @@
---
name: build

on:
  pull_request:
    branches: [master]

jobs:
  build_and_test_streamlit_pyodide:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ./.github/actions/setup
      - name: Build package
        run: poetry build
      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "20"
      - name: Install dependencies
        run: npm install pyodide@0.25.1 # stlite currently using pyodide 0.25.1
      - name: Install cognite-sdk in pyodide environment
        run: |
          whl_file=$(find dist -name "*.whl" | sed 's|^dist/||') # Find the built wheel file, remove dist/ prefix
          echo "Found built wheel file: $whl_file"
          SDK_FILE_PATH=$whl_file \
          PACKAGES="[\"pyodide-http\", \"http://localhost:3000/dist/$whl_file\"]" \
          node scripts/test-pyodide.js
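Both pyodide workflows locate the freshly built wheel with the same shell step. It can be sanity-checked locally with a stand-in wheel (the filename below is made up for illustration):

```shell
# Simulate a built wheel, then extract its filename the same way the workflows do
mkdir -p dist
touch dist/cognite_sdk-0.0.0-py3-none-any.whl
whl_file=$(find dist -name "*.whl" | sed 's|^dist/||')  # strip the dist/ prefix
echo "Found built wheel file: $whl_file"
```

The `sed` substitution is needed because `find` prints paths relative to its starting directory, while the pyodide loader is handed a bare filename to resolve against the local server's `dist/` route.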
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -1,7 +1,7 @@
---
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.4.5
rev: v0.4.8
hooks:
- id: ruff
args:
@@ -48,7 +48,7 @@ repos:
require_serial: true # avoid possible race conditions

- repo: https://github.com/jsh9/pydoclint # Run after 'custom-checks' as these may auto-fix
rev: 0.4.1
rev: 0.4.2
hooks:
- id: pydoclint
require_serial: true # Spammy in run-all scenarios (more than fast enough already)
40 changes: 39 additions & 1 deletion CHANGELOG.md
@@ -17,7 +17,45 @@ Changes are grouped as follows
- `Fixed` for any bug fixes.
- `Security` in case of vulnerabilities.

## [7.45.0] - 2024-05-28
## [7.49.2] - 2024-06-12
### Fixed
- Converting rows (`RowList` and `RowListWrite`) to a pandas DataFrame no longer silently drops rows that do not have
any columnar data.

## [7.49.1] - 2024-06-11
### Fixed
- Fixed resetting `dataSetId` to `None` in a `ThreeDModelUpdate`.

## [7.49.0] - 2024-06-05
### Added
- `WorkflowExecutionAPI.list` now allows filtering by execution status.

## [7.48.1] - 2024-06-04
### Fixed
- A bug introduced in `7.45.0` that would short-circuit raw datapoint queries too early when a large number of time
  series were requested at the same time and `include_outside_points=True` was used (empty cursors are to be expected).

## [7.48.0] - 2024-06-04
### Changed
- Mark Data Workflows SDK implementation as Generally Available.

## [7.47.0] - 2024-06-04
### Added
- Support for retrieving labels via `client.labels.retrieve`.

## [7.46.2] - 2024-06-03
### Added
- Added option for silencing `FeaturePreviewWarnings` via `cognite.client.global_config`.

## [7.46.1] - 2024-05-31
### Fixed
- Pyodide issue related to missing tzdata package.

## [7.46.0] - 2024-05-31
### Added
- `RawRowsAPI.insert_dataframe` now has a new `dropna` setting (defaulting to True, since NaN values would otherwise cause the insert to raise later).

## [7.45.0] - 2024-05-31
### Added
- DatapointsAPI now supports `timezone` and new calendar-based granularities like `month`, `quarter` and `year`.
These API features are in beta, and the SDK implementation in alpha, meaning breaking changes can
4 changes: 3 additions & 1 deletion cognite/client/_api/datapoint_tasks.py
@@ -749,8 +749,10 @@ def _store_first_batch(self, dps: DatapointsAny, first_limit: int) -> None:
self._unpack_and_store(FIRST_IDX, dps)

# Are we done after first batch?
if not self.first_cursor or len(dps) < first_limit:
if len(dps) < first_limit:
self._is_done = True
elif not self.first_cursor and not self.query.include_outside_points:
self._is_done = True # no cursor when including outside...
elif not self.query.use_cursors and self.first_start == self.query.end:
self._is_done = True
elif self.query.limit is not None and len(dps) <= self.query.limit <= first_limit: # TODO: len == limit??
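The reworked early-exit ladder in `_store_first_batch` can be read as a small decision function. A simplified sketch covering only the first two branches shown in the diff (names are hypothetical, not the SDK's actual internals):

```python
from __future__ import annotations


def is_done_after_first_batch(
    n_received: int,
    first_limit: int,
    first_cursor: str | None,
    include_outside_points: bool,
) -> bool:
    # Fewer datapoints than requested: the server has nothing more to return.
    if n_received < first_limit:
        return True
    # A missing cursor normally means the query is exhausted, but when outside
    # points are included, an empty cursor is expected and must not end the query.
    if not first_cursor and not include_outside_points:
        return True
    return False
```

This captures the bug fix from `7.48.1`: before, an empty cursor alone was enough to mark the query done, terminating `include_outside_points=True` queries prematurely.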
45 changes: 44 additions & 1 deletion cognite/client/_api/labels.py
@@ -1,6 +1,6 @@
from __future__ import annotations

from typing import Iterator, Sequence, cast, overload
from typing import Iterator, Literal, Sequence, cast, overload

from cognite.client._api_client import APIClient
from cognite.client._constants import DEFAULT_LIMIT_READ
@@ -52,6 +52,49 @@ def __call__(
chunk_size=chunk_size,
)

@overload
def retrieve(self, external_id: str, ignore_unknown_ids: Literal[True]) -> LabelDefinition | None: ...

@overload
def retrieve(self, external_id: str, ignore_unknown_ids: Literal[False] = False) -> LabelDefinition: ...

@overload
def retrieve(self, external_id: SequenceNotStr[str], ignore_unknown_ids: bool = False) -> LabelDefinitionList: ...

def retrieve(
self, external_id: str | SequenceNotStr[str], ignore_unknown_ids: bool = False
) -> LabelDefinition | LabelDefinitionList | None:
"""`Retrieve one or more label definitions by external id. <https://developer.cognite.com/api#tag/Labels/operation/byIdsLabels>`_
Args:
external_id (str | SequenceNotStr[str]): External ID or list of external ids
ignore_unknown_ids (bool): If True, ignore IDs and external IDs that are not found rather than throw an exception.
Returns:
LabelDefinition | LabelDefinitionList | None: The requested label definition(s)
Examples:
Get label by external id::
>>> from cognite.client import CogniteClient
>>> client = CogniteClient()
>>> res = client.labels.retrieve(external_id="my_label", ignore_unknown_ids=True)
"""
is_single = isinstance(external_id, str)
external_ids = [external_id] if is_single else external_id
identifiers = IdentifierSequence.load(external_ids=external_ids) # type: ignore[arg-type]
result = self._retrieve_multiple(
list_cls=LabelDefinitionList,
resource_cls=LabelDefinition,
identifiers=identifiers,
ignore_unknown_ids=ignore_unknown_ids,
)
if is_single:
return result[0] if result else None
return result

def list(
self,
name: str | None = None,
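The new `retrieve` method's single-vs-sequence dispatch follows a common `@overload` pattern: the overloads give type checkers precise return types, while one runtime implementation handles both shapes. A minimal standalone sketch (with a stand-in for the actual API call):

```python
from __future__ import annotations

from typing import Sequence, overload


@overload
def retrieve(external_id: str) -> dict | None: ...
@overload
def retrieve(external_id: Sequence[str]) -> list[dict]: ...


def retrieve(external_id: str | Sequence[str]) -> dict | list[dict] | None:
    is_single = isinstance(external_id, str)
    external_ids = [external_id] if is_single else list(external_id)
    # Stand-in for the batched API lookup:
    results = [{"externalId": xid} for xid in external_ids]
    return (results[0] if results else None) if is_single else results
```

Returning `None` for a missing single item (rather than raising) is what makes the `Literal[True]` overload on `ignore_unknown_ids` in the diff report `LabelDefinition | None` instead of `LabelDefinition`.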
43 changes: 37 additions & 6 deletions cognite/client/_api/raw.py
@@ -4,14 +4,15 @@
import random
import threading
import time
from collections import deque
from collections import defaultdict, deque
from typing import TYPE_CHECKING, Any, Iterator, Sequence, cast, overload

from cognite.client._api_client import APIClient
from cognite.client._constants import _RUNNING_IN_BROWSER, DEFAULT_LIMIT_READ
from cognite.client.data_classes import Database, DatabaseList, Row, RowList, RowWrite, Table, TableList
from cognite.client.data_classes.raw import RowCore
from cognite.client.utils._auxiliary import (
find_duplicates,
interpolate_and_url_encode,
is_finite,
is_unlimited,
@@ -523,7 +524,12 @@ def insert(
)

def insert_dataframe(
self, db_name: str, table_name: str, dataframe: pd.DataFrame, ensure_parent: bool = False
self,
db_name: str,
table_name: str,
dataframe: pd.DataFrame,
ensure_parent: bool = False,
dropna: bool = True,
) -> None:
"""`Insert pandas dataframe into a table <https://developer.cognite.com/api#tag/Raw/operation/postRows>`_
@@ -534,23 +540,48 @@ def insert_dataframe(
table_name (str): Name of the table.
dataframe (pd.DataFrame): The dataframe to insert. Index will be used as row keys.
ensure_parent (bool): Create database/table if they don't already exist.
dropna (bool): Remove NaNs (but keep None's in dtype=object columns) before inserting. Done individually per column. Default: True
Examples:
Insert new rows into a table::
Insert new rows into a table:
>>> import pandas as pd
>>> from cognite.client import CogniteClient
>>>
>>> client = CogniteClient()
>>> df = pd.DataFrame(data={"a": 1, "b": 2}, index=["r1", "r2", "r3"])
>>> res = client.raw.rows.insert_dataframe("db1", "table1", df)
>>> df = pd.DataFrame(
... {"col-a": [1, 3, None], "col-b": [2, -1, 9]},
... index=["r1", "r2", "r3"])
>>> res = client.raw.rows.insert_dataframe(
... "db1", "table1", df, dropna=True)
"""
if not dataframe.index.is_unique:
raise ValueError("Dataframe index is not unique (used for the row keys)")
rows = dataframe.to_dict(orient="index")
elif not dataframe.columns.is_unique:
raise ValueError(f"Dataframe columns are not unique: {sorted(find_duplicates(dataframe.columns))}")

rows = self._df_to_rows_skip_nans(dataframe) if dropna else dataframe.to_dict(orient="index")
self.insert(db_name=db_name, table_name=table_name, row=rows, ensure_parent=ensure_parent)

@staticmethod
def _df_to_rows_skip_nans(df: pd.DataFrame) -> dict[str, dict[str, Any]]:
np = local_import("numpy")
rows: defaultdict[str, dict[str, Any]] = defaultdict(dict)
object_cols = df.select_dtypes("object").columns

for column_id, col in df.items():
if column_id not in object_cols:
col = col.dropna()
else:
# pandas treat None as NaN, but numpy does not:
mask = np.logical_or(col.to_numpy() == None, col.notna()) # noqa: E711
col = col[mask]

for idx, val in col.items():
rows[idx][column_id] = val
return dict(rows)

def _process_row_input(self, row: Sequence[Row] | Sequence[RowWrite] | Row | RowWrite | dict) -> list[list[dict]]:
assert_type(row, "row", [Sequence, dict, RowCore])
rows = []
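The `object`-column special case in `_df_to_rows_skip_nans` exists because pandas coerces `None` to NaN in numeric columns, while `object` columns can hold a genuine `None` that should survive into the inserted rows. The per-column drop can be illustrated without pandas; a stdlib-only sketch over dict-of-columns data (hypothetical helper; the real method is additionally dtype-aware and keeps `None` only in `object` columns):

```python
import math


def rows_skip_nans(columns: dict[str, dict[str, object]]) -> dict[str, dict[str, object]]:
    """Build row dicts from dict-of-columns data, dropping float NaN cells."""
    rows: dict[str, dict[str, object]] = {}
    for column_id, cells in columns.items():
        for row_key, value in cells.items():
            if isinstance(value, float) and math.isnan(value):
                continue  # drop NaN cells individually, per column
            rows.setdefault(row_key, {})[column_id] = value
    return rows
```

Because cells are dropped per column, a row missing a value in one column still keeps its values from the others, unlike a row-wise `dropna` which would discard the whole row.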
4 changes: 2 additions & 2 deletions cognite/client/_api/sequences.py
@@ -742,7 +742,7 @@ def filter(
>>> client = CogniteClient()
>>> asset_filter = filters.Equals("asset_id", 123)
>>> is_efficiency = filters.Equals(["metadata", "type"], "efficiency")
>>> res = client.time_series.filter(filter=filters.And(asset_filter, is_efficiency), sort="created_time")
>>> res = client.sequences.filter(filter=filters.And(asset_filter, is_efficiency), sort="created_time")
Note that you can check the API documentation above to see which properties you can filter on
with which filters.
@@ -756,7 +756,7 @@
>>> client = CogniteClient()
>>> asset_filter = filters.Equals(SequenceProperty.asset_id, 123)
>>> is_efficiency = filters.Equals(SequenceProperty.metadata_key("type"), "efficiency")
>>> res = client.time_series.filter(
>>> res = client.sequences.filter(
... filter=filters.And(asset_filter, is_efficiency),
... sort=SortableSequenceProperty.created_time)
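For intuition, `Equals` and `And` filter objects like the ones in the corrected examples typically flatten into a nested JSON filter body sent to the API. A minimal sketch (illustrative structure only, not necessarily the SDK's exact wire format):

```python
def equals(prop, value):
    # A property may be a single name or a nested path like ["metadata", "type"]
    path = prop if isinstance(prop, list) else [prop]
    return {"equals": {"property": path, "value": value}}


def and_(*filters):
    return {"and": list(filters)}


filter_body = and_(equals("asset_id", 123), equals(["metadata", "type"], "efficiency"))
```

Composing small dict-builders like this keeps arbitrarily nested `and`/`or` trees easy to construct and inspect before sending.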
27 changes: 22 additions & 5 deletions cognite/client/_api/transformations/notifications.py
@@ -1,14 +1,17 @@
from __future__ import annotations

from typing import Sequence
from typing import Sequence, overload

from cognite.client._api_client import APIClient
from cognite.client._constants import DEFAULT_LIMIT_READ
from cognite.client.data_classes import TransformationNotification, TransformationNotificationList
from cognite.client.data_classes import (
TransformationNotification,
TransformationNotificationList,
TransformationNotificationWrite,
)
from cognite.client.data_classes.transformations.notifications import (
TransformationNotificationCore,
TransformationNotificationFilter,
TransformationNotificationWrite,
)
from cognite.client.utils._identifier import IdentifierSequence
from cognite.client.utils._validation import assert_type
@@ -17,13 +20,27 @@
class TransformationNotificationsAPI(APIClient):
_RESOURCE_PATH = "/transformations/notifications"

@overload
def create(
self, notification: TransformationNotification | TransformationNotificationWrite
) -> TransformationNotification: ...

@overload
def create(
self, notification: Sequence[TransformationNotification] | Sequence[TransformationNotificationWrite]
) -> TransformationNotificationList: ...

def create(
self, notification: TransformationNotification | Sequence[TransformationNotification]
self,
notification: TransformationNotification
| TransformationNotificationWrite
| Sequence[TransformationNotification]
| Sequence[TransformationNotificationWrite],
) -> TransformationNotification | TransformationNotificationList:
"""`Subscribe for notifications on the transformation errors. <https://developer.cognite.com/api#tag/Transformation-Notifications/operation/createTransformationNotifications>`_
Args:
notification (TransformationNotification | Sequence[TransformationNotification]): Notification or list of notifications to create.
notification (TransformationNotification | TransformationNotificationWrite | Sequence[TransformationNotification] | Sequence[TransformationNotificationWrite]): Notification or list of notifications to create.
Returns:
TransformationNotification | TransformationNotificationList: Created notification(s)