Create Universal Query Language #400

Merged
59 commits (changes shown from 33 commits)
6370357
Add scratch of implementation
PawelPeczek-Roboflow May 10, 2024
bb81a3b
WIP
PawelPeczek-Roboflow May 14, 2024
4fe77e6
Bring foundations of universal query language
PawelPeczek-Roboflow May 14, 2024
065a7d4
WIP
PawelPeczek-Roboflow May 14, 2024
c9cb8de
WIP
PawelPeczek-Roboflow May 14, 2024
f877177
Add next operations
PawelPeczek-Roboflow May 14, 2024
c04eeba
Fix issues spotted while testing
PawelPeczek-Roboflow May 15, 2024
55bd2b9
WIP - adding kinds into manifests
PawelPeczek-Roboflow May 16, 2024
10427d1
Add manifests of kinds for operations entities
PawelPeczek-Roboflow May 16, 2024
bcec3f0
Add scratches of implementations for new blocks
PawelPeczek-Roboflow May 16, 2024
ae86452
Fix issues spotted while testing
PawelPeczek-Roboflow May 16, 2024
2362b5e
Adjust EE to find dict selectors
PawelPeczek-Roboflow May 16, 2024
8db0206
Add ability to inject dict-based references into blocks run_*() metho…
PawelPeczek-Roboflow May 16, 2024
ab952ba
Get rid of strict typing to make py3.8 happy with lack of ellipsis
PawelPeczek-Roboflow May 16, 2024
d388466
Add scratch of new Roboflow platform sink and remove constraint on not…
PawelPeczek-Roboflow May 17, 2024
6abb716
Add scratch of implementation for new Roboflow data collector sink
PawelPeczek-Roboflow May 17, 2024
7e378aa
First draft of Batch[B] container
PawelPeczek-Roboflow May 20, 2024
dcfe9bb
Merge branch 'feature/sv_detections_in_workflows' into feature/create…
PawelPeczek-Roboflow May 20, 2024
e58a6ad
First draft changes into steps to plug new data abstractions
PawelPeczek-Roboflow May 20, 2024
5b64c2e
Push next steps changes
PawelPeczek-Roboflow May 20, 2024
dccf008
Resolve conflicts with trunk
PawelPeczek-Roboflow May 20, 2024
a8e8b4c
Finalise initial round of refactor
PawelPeczek-Roboflow May 21, 2024
fd44357
WIP - refactor of tests
PawelPeczek-Roboflow May 21, 2024
9aab498
WIP - refactor of tests
PawelPeczek-Roboflow May 21, 2024
6cfc263
Fix workflows unit tests
PawelPeczek-Roboflow May 21, 2024
b5f5f0d
Fixed tests for workflows
PawelPeczek-Roboflow May 21, 2024
238293c
Fixed serialisation in inference server
PawelPeczek-Roboflow May 21, 2024
d869325
Make new manifests entries
PawelPeczek-Roboflow May 21, 2024
611eaf3
Fix typo
PawelPeczek-Roboflow May 21, 2024
93ce263
Add changes required to complete scratch of Roboflow backend sink
PawelPeczek-Roboflow May 22, 2024
59a717e
Fix initial implementation of sink
PawelPeczek-Roboflow May 22, 2024
d83bb78
Fix issues detected with data collector
PawelPeczek-Roboflow May 23, 2024
d6d38f1
Add integration tests for detections filter
PawelPeczek-Roboflow May 23, 2024
740f9b9
Add unit tests for common utils used to process underlying data
PawelPeczek-Roboflow May 24, 2024
7ec8b77
Add a bunch of other tests for utils
PawelPeczek-Roboflow May 24, 2024
bb1373a
Add remaining tests for sv detections utils
PawelPeczek-Roboflow May 27, 2024
74ee418
Add tests for detections consensus block
PawelPeczek-Roboflow May 27, 2024
8aaf460
Add UQL exceptions to HTTP API
PawelPeczek-Roboflow May 27, 2024
e8664b2
Add tests for batches abstraction
PawelPeczek-Roboflow May 27, 2024
85ebfbe
Add tests for image abstraction
PawelPeczek-Roboflow May 27, 2024
3ebd347
Fix integration tests of inference server regarding workflows
PawelPeczek-Roboflow May 27, 2024
b45ab3f
Add fixes to problems detected during UI development
PawelPeczek-Roboflow May 29, 2024
6b1825e
Apply fixes
PawelPeczek-Roboflow May 29, 2024
4769190
Export sink block
PawelPeczek-Roboflow May 29, 2024
cc51da5
Fix issues spotted while testing
PawelPeczek-Roboflow May 31, 2024
b0fdb76
Fix API boot issues
PawelPeczek-Roboflow May 31, 2024
f7c05c4
Rename blocks
PawelPeczek-Roboflow May 31, 2024
2b64ba7
Fix unit tests
PawelPeczek-Roboflow May 31, 2024
ba6ea47
Fix integration tests
PawelPeczek-Roboflow May 31, 2024
8d7a936
Apply changes suggested in PR CR
PawelPeczek-Roboflow Jun 3, 2024
181f32a
Merge with fresh 'develop'
PawelPeczek-Roboflow Jun 3, 2024
7c67dff
Commit to trigger CI
PawelPeczek-Roboflow Jun 3, 2024
fa8e15e
Revert changes
PawelPeczek-Roboflow Jun 3, 2024
6ad0080
Fix tests
PawelPeczek-Roboflow Jun 3, 2024
ad19f1b
Try to address security issue
PawelPeczek-Roboflow Jun 3, 2024
af8283b
Fix issue with non-existent variable
PawelPeczek-Roboflow Jun 3, 2024
02d54e3
Add comment about security vulnerability
PawelPeczek-Roboflow Jun 3, 2024
245d0fc
Remove unwanted unwrapping in one of UQL ops
PawelPeczek-Roboflow Jun 3, 2024
3924b59
Add changes in exceptions handling
PawelPeczek-Roboflow Jun 3, 2024
94 changes: 93 additions & 1 deletion inference/core/entities/responses/workflows.py
@@ -1,7 +1,11 @@
from typing import Any, Dict, List
from typing import Any, Dict, List, Optional

from pydantic import BaseModel, Field

from inference.core.workflows.core_steps.common.query_language.entities.introspection import (
OperationDescription,
OperatorDescription,
)
from inference.core.workflows.entities.types import Kind
from inference.core.workflows.execution_engine.introspection.entities import (
BlockDescription,
@@ -28,6 +32,9 @@ class ExternalWorkflowsBlockSelectorDefinition(BaseModel):
is_list_element: bool = Field(
description="Boolean flag defining if list of references will be accepted"
)
is_dict_element: bool = Field(
description="Boolean flag defining if dict of references will be accepted"
)


class ExternalBlockPropertyPrimitiveDefinition(BaseModel):
@@ -40,6 +47,88 @@ class ExternalBlockPropertyPrimitiveDefinition(BaseModel):
)


class ExternalOperationDescription(BaseModel):
operation_type: str
compound: bool
input_kind: List[str]
output_kind: List[str]
nested_operation_input_kind: Optional[List[str]] = None
nested_operation_output_kind: Optional[List[str]] = None
description: Optional[str] = None

@classmethod
def from_internal_entity(
cls, operation_description: OperationDescription
) -> "ExternalOperationDescription":
nested_operation_input_kind, nested_operation_output_kind = None, None
if operation_description.nested_operation_input_kind:
nested_operation_input_kind = [
k.name for k in operation_description.nested_operation_input_kind
]
if operation_description.nested_operation_output_kind:
nested_operation_output_kind = [
k.name for k in operation_description.nested_operation_output_kind
]
return cls(
operation_type=operation_description.operation_type,
compound=operation_description.compound,
input_kind=[k.name for k in operation_description.input_kind],
output_kind=[k.name for k in operation_description.output_kind],
nested_operation_input_kind=nested_operation_input_kind,
nested_operation_output_kind=nested_operation_output_kind,
description=operation_description.description,
)


class ExternalOperatorDescription(BaseModel):
operator_type: str
operands_number: int
operands_kinds: List[List[str]]
description: Optional[str] = None

@classmethod
def from_internal_entity(
cls, operator_description: OperatorDescription
) -> "ExternalOperatorDescription":
operands_kinds = [
[k.name for k in kind] for kind in operator_description.operands_kinds
]
return cls(
operator_type=operator_description.operator_type,
operands_number=operator_description.operands_number,
operands_kinds=operands_kinds,
description=operator_description.description,
)


class UniversalQueryLanguageDescription(BaseModel):
operations_description: List[ExternalOperationDescription]
operators_descriptions: List[ExternalOperatorDescription]

@classmethod
def from_internal_entities(
cls,
operations_descriptions: List[OperationDescription],
operators_descriptions: List[OperatorDescription],
) -> "UniversalQueryLanguageDescription":
operations_descriptions = [
ExternalOperationDescription.from_internal_entity(
operation_description=operation_description
)
for operation_description in operations_descriptions
]
operators_descriptions = [
ExternalOperatorDescription.from_internal_entity(
operator_description=operator_description
)
for operator_description in operators_descriptions
]
return cls(
operations_description=operations_descriptions,
operators_descriptions=operators_descriptions,
)


class WorkflowsBlocksDescription(BaseModel):
blocks: List[BlockDescription] = Field(
description="List of loaded blocks descriptions"
@@ -54,3 +143,6 @@ class WorkflowsBlocksDescription(BaseModel):
description="List defining all properties for all blocks that can be filled "
"with primitive values in workflow definition."
)
universal_query_language_description: UniversalQueryLanguageDescription = Field(
description="Definitions of Universal Query Language operations and operators"
)
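
The new response models flatten internal Kind objects into plain strings, so API clients can read the UQL description without importing workflow internals. Below is a minimal sketch of building and serializing these entities by hand; the operation, operator, and kind names are hypothetical, and model_dump_json assumes pydantic v2.

from inference.core.entities.responses.workflows import (
    ExternalOperationDescription,
    ExternalOperatorDescription,
    UniversalQueryLanguageDescription,
)

# Hand-built, hypothetical descriptions standing in for the output of
# prepare_operations_descriptions() / prepare_operators_descriptions().
description = UniversalQueryLanguageDescription(
    operations_description=[
        ExternalOperationDescription(
            operation_type="detections_filter",  # hypothetical operation name
            compound=True,
            input_kind=["object_detection_prediction"],  # hypothetical kind names
            output_kind=["object_detection_prediction"],
            nested_operation_input_kind=["object_detection_prediction"],
            nested_operation_output_kind=["boolean"],
        )
    ],
    operators_descriptions=[
        ExternalOperatorDescription(
            operator_type="equals",  # hypothetical operator name
            operands_number=2,
            operands_kinds=[["string", "integer"], ["string", "integer"]],
        )
    ],
)
print(description.model_dump_json(indent=2))  # model_dump_json requires pydantic v2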
20 changes: 14 additions & 6 deletions inference/core/interfaces/http/http_api.py
@@ -75,6 +75,7 @@
from inference.core.entities.responses.workflows import (
ExternalBlockPropertyPrimitiveDefinition,
ExternalWorkflowsBlockSelectorDefinition,
UniversalQueryLanguageDescription,
WorkflowInferenceResponse,
WorkflowsBlocksDescription,
WorkflowValidationStatus,
@@ -134,8 +135,9 @@
from inference.core.managers.base import ModelManager
from inference.core.roboflow_api import get_workflow_specification
from inference.core.utils.notebooks import start_notebook
from inference.core.workflows.core_steps.sinks.active_learning.middleware import (
WorkflowsActiveLearningMiddleware,
from inference.core.workflows.core_steps.common.query_language.introspection.core import (
prepare_operations_descriptions,
prepare_operators_descriptions,
)
from inference.core.workflows.entities.base import OutputDefinition, StepExecutionMode
from inference.core.workflows.errors import (
@@ -426,9 +428,6 @@ async def count_errors(request: Request, call_next):

self.app = app
self.model_manager = model_manager
self.workflows_active_learning_middleware = WorkflowsActiveLearningMiddleware(
cache=cache,
)

async def process_inference_request(
inference_request: InferenceRequest, **kwargs
@@ -459,7 +458,6 @@ async def process_workflow_inference_request(
workflow_init_parameters = {
"workflows_core.model_manager": model_manager,
"workflows_core.api_key": workflow_request.api_key,
"workflows_core.active_learning_middleware": self.workflows_active_learning_middleware,
"workflows_core.background_tasks": background_tasks,
}
execution_engine = ExecutionEngine.init(
@@ -877,6 +875,7 @@ async def describe_workflows_blocks() -> WorkflowsBlocksDescription:
property_description=c.property_description,
compatible_element=c.compatible_element,
is_list_element=c.is_list_element,
is_dict_element=c.is_dict_element,
)
for c in connections
]
@@ -891,11 +890,20 @@
)
for primitives_connection in blocks_connections.primitives_connections
]
uql_operations_descriptions = prepare_operations_descriptions()
uql_operators_descriptions = prepare_operators_descriptions()
universal_query_language_description = (
UniversalQueryLanguageDescription.from_internal_entities(
operations_descriptions=uql_operations_descriptions,
operators_descriptions=uql_operators_descriptions,
)
)
return WorkflowsBlocksDescription(
blocks=blocks_description.blocks,
declared_kinds=blocks_description.declared_kinds,
kinds_connections=kinds_connections,
primitives_connections=primitives_connections,
universal_query_language_description=universal_query_language_description,
)

@app.post(
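
With the wiring above, the blocks-description endpoint now returns UQL metadata alongside blocks and kinds. A short client-side sketch of consuming the new field; the server URL and endpoint path are assumptions, as neither appears in this diff:

import requests

# Assumed local server address and describe endpoint path; adjust to your deployment.
response = requests.get("http://localhost:9001/workflows/blocks/describe")
response.raise_for_status()
payload = response.json()

uql = payload["universal_query_language_description"]
for operation in uql["operations_description"]:
    print(operation["operation_type"], "->", operation["output_kind"])
for operator in uql["operators_descriptions"]:
    print(operator["operator_type"], operator["operands_number"])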
101 changes: 12 additions & 89 deletions inference/core/interfaces/http/orjson_utils.py
@@ -1,32 +1,17 @@
import base64
from typing import Any, Dict, List, Optional, Union

import numpy as np
import orjson
import supervision as sv
from fastapi.responses import ORJSONResponse
from pydantic import BaseModel

from inference.core.entities.responses.inference import InferenceResponse
from inference.core.utils.image_utils import ImageType, encode_image_to_jpeg_bytes
from inference.core.workflows.constants import (
CLASS_ID_KEY,
CLASS_NAME_KEY,
CONFIDENCE_KEY,
DETECTION_ID_KEY,
HEIGHT_KEY,
KEYPOINTS_CLASS_ID_KEY,
KEYPOINTS_CLASS_NAME_KEY,
KEYPOINTS_CONFIDENCE_KEY,
KEYPOINTS_KEY,
KEYPOINTS_XY_KEY,
PARENT_ID_KEY,
POLYGON_KEY,
TRACKER_ID_KEY,
WIDTH_KEY,
X_KEY,
Y_KEY,
from inference.core.utils.image_utils import ImageType
from inference.core.workflows.core_steps.common.serializers import (
serialise_sv_detections,
)
from inference.core.workflows.entities.base import WorkflowImageData


class ORJSONResponseBytes(ORJSONResponse):
@@ -68,7 +53,7 @@ def serialise_workflow_result(
for key, value in result.items():
if key in excluded_fields:
continue
if contains_image(element=value):
if isinstance(value, WorkflowImageData):
value = serialise_image(image=value)
elif isinstance(value, dict):
value = serialise_dict(elements=value)
@@ -83,7 +68,7 @@ def serialise_workflow_result(
def serialise_list(elements: List[Any]) -> List[Any]:
result = []
for element in elements:
if contains_image(element=element):
if isinstance(element, WorkflowImageData):
element = serialise_image(image=element)
elif isinstance(element, dict):
element = serialise_dict(elements=element)
@@ -98,7 +83,7 @@ def serialise_list(elements: List[Any]) -> List[Any]:
def serialise_dict(elements: Dict[str, Any]) -> Dict[str, Any]:
serialised_result = {}
for key, value in elements.items():
if contains_image(element=value):
if isinstance(value, WorkflowImageData):
value = serialise_image(image=value)
elif isinstance(value, dict):
value = serialise_dict(elements=value)
@@ -117,70 +102,8 @@ def contains_image(element: Any) -> bool:
)


def serialise_image(image: Dict[str, Any]) -> Dict[str, Any]:
image["type"] = "base64"
image["value"] = base64.b64encode(
encode_image_to_jpeg_bytes(image["value"])
).decode("ascii")
return image


def serialise_sv_detections(detections: sv.Detections) -> List[Dict[str, Any]]:
serialized_detections = []
for xyxy, mask, confidence, class_id, tracker_id, data in detections:
detection_dict = {}

if isinstance(xyxy, np.ndarray):
xyxy = xyxy.astype(float).tolist()
x1, y1, x2, y2 = xyxy
detection_dict[WIDTH_KEY] = abs(x2 - x1)
detection_dict[HEIGHT_KEY] = abs(y2 - y1)
detection_dict[X_KEY] = x1 + detection_dict[WIDTH_KEY] / 2
detection_dict[Y_KEY] = y1 + detection_dict[HEIGHT_KEY] / 2

detection_dict[CONFIDENCE_KEY] = float(confidence)
detection_dict[CLASS_ID_KEY] = int(class_id)
if mask is not None:
polygon = sv.mask_to_polygons(mask=mask)
detection_dict[POLYGON_KEY] = []
for x, y in polygon[0]:
detection_dict[POLYGON_KEY].append(
{
X_KEY: float(x),
Y_KEY: float(y),
}
)
if tracker_id is not None:
detection_dict[TRACKER_ID_KEY] = int(tracker_id)
detection_dict[CLASS_NAME_KEY] = str(data["class_name"])
detection_dict[DETECTION_ID_KEY] = str(data[DETECTION_ID_KEY])
if PARENT_ID_KEY in data:
detection_dict[PARENT_ID_KEY] = str(data[PARENT_ID_KEY])
if (
KEYPOINTS_CLASS_ID_KEY in data
and KEYPOINTS_CLASS_NAME_KEY in data
and KEYPOINTS_CONFIDENCE_KEY in data
and KEYPOINTS_XY_KEY in data
):
kp_class_id = data[KEYPOINTS_CLASS_ID_KEY]
kp_class_name = data[KEYPOINTS_CLASS_NAME_KEY]
kp_confidence = data[KEYPOINTS_CONFIDENCE_KEY]
kp_xy = data[KEYPOINTS_XY_KEY]
detection_dict[KEYPOINTS_KEY] = []
for (
keypoint_class_id,
keypoint_class_name,
keypoint_confidence,
(x, y),
) in zip(kp_class_id, kp_class_name, kp_confidence, kp_xy):
detection_dict[KEYPOINTS_KEY].append(
{
KEYPOINTS_CLASS_ID_KEY: int(keypoint_class_id),
KEYPOINTS_CLASS_NAME_KEY: str(keypoint_class_name),
KEYPOINTS_CONFIDENCE_KEY: float(keypoint_confidence),
X_KEY: float(x),
Y_KEY: float(y),
}
)
serialized_detections.append(detection_dict)
return serialized_detections
def serialise_image(image: WorkflowImageData) -> Dict[str, Any]:
return {
"type": "base64",
"value": image.base64_image,
}
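
The refactor replaces the duck-typed contains_image() check with explicit isinstance dispatch on WorkflowImageData and moves detections serialization into the shared serialise_sv_detections helper. A self-contained sketch of the dispatch pattern, with a stub class standing in for the real WorkflowImageData:

from typing import Any, Dict

class WorkflowImageData:  # stub for illustration; not the real workflows entity
    def __init__(self, base64_image: str):
        self.base64_image = base64_image

def serialise_image(image: WorkflowImageData) -> Dict[str, Any]:
    # Mirrors the new implementation above: images always leave as base64.
    return {"type": "base64", "value": image.base64_image}

def serialise_value(value: Any) -> Any:
    # isinstance dispatch replaces the old contains_image() duck-typing.
    if isinstance(value, WorkflowImageData):
        return serialise_image(image=value)
    if isinstance(value, dict):
        return {key: serialise_value(value=item) for key, item in value.items()}
    if isinstance(value, list):
        return [serialise_value(value=item) for item in value]
    return value

print(serialise_value(value={"image": WorkflowImageData(base64_image="aGVsbG8=")}))
# {'image': {'type': 'base64', 'value': 'aGVsbG8='}}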
2 changes: 1 addition & 1 deletion inference/core/roboflow_api.py
@@ -228,7 +228,7 @@ def register_image_at_roboflow(
image_bytes: bytes,
batch_name: str,
tags: Optional[List[str]] = None,
inference_id=None,
inference_id: Optional[str] = None,
) -> dict:
url = f"{API_BASE_URL}/dataset/{dataset_id}/upload"
params = [
1 change: 1 addition & 0 deletions inference/core/utils/image_utils.py
@@ -419,6 +419,7 @@ def convert_gray_image_to_bgr(image: np.ndarray) -> np.ndarray:

def np_image_to_base64(image: np.ndarray) -> bytes:
"""
TODO: This function is probably broken!
Convert a numpy image to a base64 encoded byte string.

Args:
23 changes: 14 additions & 9 deletions inference/core/workflows/constants.py
@@ -3,25 +3,30 @@
OUTPUT_NODE_KIND = "OUTPUT_NODE"
IMAGE_TYPE_KEY = "type"
IMAGE_VALUE_KEY = "value"
ROOT_PARENT_ID_KEY = "root_parent_id"
PARENT_ID_KEY = "parent_id"
KEYPOINTS_KEY = "keypoints"
KEYPOINTS_CLASS_ID_KEY = "keypoints_class_id"
KEYPOINTS_CLASS_NAME_KEY = "keypoints_class_name"
KEYPOINTS_CONFIDENCE_KEY = "keypoints_confidence"
KEYPOINTS_XY_KEY = "keypoints_xy"
KEYPOINTS_KEY_IN_INFERENCE_RESPONSE = "keypoints"
KEYPOINTS_CLASS_NAME_KEY_IN_INFERENCE_RESPONSE = "class"
KEYPOINTS_CLASS_ID_KEY_IN_INFERENCE_RESPONSE = "class_id"
KEYPOINTS_CONFIDENCE_KEY_IN_INFERENCE_RESPONSE = "confidence"
KEYPOINTS_CLASS_ID_KEY_IN_SV_DETECTIONS = "keypoints_class_id"
KEYPOINTS_CLASS_NAME_KEY_IN_SV_DETECTIONS = "keypoints_class_name"
KEYPOINTS_CONFIDENCE_KEY_IN_SV_DETECTIONS = "keypoints_confidence"
KEYPOINTS_XY_KEY_IN_SV_DETECTIONS = "keypoints_xy"
PREDICTION_TYPE_KEY = "prediction_type"
PARENT_COORDINATES_KEY = "parent_coordinates"
PARENT_DIMENSIONS_KEY = "parent_dimensions"
ROOT_PARENT_COORDINATES_KEY = "root_parent_coordinates"
ROOT_PARENT_DIMENSIONS_KEY = "root_parent_dimensions"
ORIGIN_COORDINATES_KEY = "origin_coordinates"
LEFT_TOP_X_KEY = "left_top_x"
LEFT_TOP_Y_KEY = "left_top_y"
ORIGIN_SIZE_KEY = "origin_image_size"
WIDTH_KEY = "width"
HEIGHT_KEY = "height"
DETECTION_ID_KEY = "detection_id"
PARENT_COORDINATES_SUFFIX = "_parent_coordinates"
X_KEY = "x"
Y_KEY = "y"
CONFIDENCE_KEY = "confidence"
CLASS_ID_KEY = "class_id"
CLASS_NAME_KEY = "class"
POLYGON_KEY = "points"
TRACKER_ID_KEY = "tracker_id"
PREDICTION_TYPE_KEY = "prediction_type"
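
The keypoints constants are split into *_IN_INFERENCE_RESPONSE and *_IN_SV_DETECTIONS families because the HTTP response payload and the sv.Detections.data store use different key names for the same fields. A hypothetical translation sketch; the sample response and the mapping code are illustrative, not part of this PR:

# Key names copied from the constants above; the keypoints sample is made up.
KEYPOINTS_CLASS_NAME_KEY_IN_INFERENCE_RESPONSE = "class"
KEYPOINTS_CONFIDENCE_KEY_IN_INFERENCE_RESPONSE = "confidence"
KEYPOINTS_CLASS_NAME_KEY_IN_SV_DETECTIONS = "keypoints_class_name"
KEYPOINTS_CONFIDENCE_KEY_IN_SV_DETECTIONS = "keypoints_confidence"

response_keypoints = [
    {"class": "nose", "class_id": 0, "confidence": 0.98, "x": 110, "y": 54},
    {"class": "left_eye", "class_id": 1, "confidence": 0.91, "x": 98, "y": 40},
]

sv_detections_data = {
    KEYPOINTS_CLASS_NAME_KEY_IN_SV_DETECTIONS: [
        kp[KEYPOINTS_CLASS_NAME_KEY_IN_INFERENCE_RESPONSE] for kp in response_keypoints
    ],
    KEYPOINTS_CONFIDENCE_KEY_IN_SV_DETECTIONS: [
        kp[KEYPOINTS_CONFIDENCE_KEY_IN_INFERENCE_RESPONSE] for kp in response_keypoints
    ],
}
print(sv_detections_data)  # {'keypoints_class_name': ['nose', 'left_eye'], ...}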