A Python client that interacts with both ahnlich DB and AI
Install the client with either Poetry or pip:
- Using Poetry
poetry add ahnlich-client-py
- Using pip
pip3 install ahnlich-client-py
The ahnlich client has some noteworthy modules that provide useful context:
- Bincode
- Serde Types
- Serde Binary
The above are classes generated by serde_generate to represent the primitive Rust types and to provide base bincode serialization capabilities.
- Query: generated from the spec document; contains all the types used to send a request to the ahnlich database.
- Server Response: generated from the spec document; contains all the possible server responses.
- Builders: request builders used to queue multiple queries and send them at once (see the bulk requests example below)
- Exceptions: possible client exceptions
- Libs: contains helpers such as create_store_key
All query types have an associated server response, all of which can be found in:
from ahnlich_client_py import server_response
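For instance, the create_store_key helper from Libs builds a store key from a plain list of floats; the same helper appears in the set and delete key examples below:
from ahnlich_client_py.libs import create_store_key
# builds the one-dimensional store key that set/get/delete operate on
store_key = create_store_key(data=[1.0, 2.0, 3.0, 3.9, 4.9])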
- Blocking clients
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
- Nonblocking clients
from ahnlich_client_py.non_blocking_client import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
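As a minimal sketch of using the non-blocking client, assuming its methods mirror the blocking API as awaitable coroutines (and that cleanup() is likewise awaitable), usage could look like this:
import asyncio
from ahnlich_client_py.non_blocking_client import AhnlichDBClient

async def main():
    # assumes the non-blocking client takes the same constructor arguments as the blocking one
    client = AhnlichDBClient(address="127.0.0.1", port=1369)  # adjust the port to your ahnlich-db instance
    try:
        # assumes each command is a coroutine mirroring the blocking client's methods
        response = await client.ping()
        print(response)
    finally:
        # assumes cleanup() is awaitable on the non-blocking client
        await client.cleanup()

asyncio.run(main())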
The ahnlich client has the ability to reuse connections. The configuration can be changed by overriding the defaults at class initialization.
@dataclass
class AhnlichDBPoolSettings:
idle_timeout: float = 30.0
max_lifetime: float = 600.0
min_idle_connections: int = 3
max_pool_size: int = 10
enable_background_collector: bool = True
dispose_batch_size: int = 0
Where:
- enable_background_collector (default True): if True, starts a background worker that disposes expired and idle connections to maintain the requested pool state. If False, connections are disposed on each connection release.
- idle_timeout (default 30.0): inactivity time in seconds after which an extra connection is disposed (a connection is considered extra if the number of endpoint connections exceeds min_idle_connections).
- max_lifetime (default 600.0): number of seconds after which any connection is disposed.
- min_idle_connections (default 3): minimum number of connections to the ahnlich db endpoint that the pool tries to hold. Connections exceeding that number are considered extra and are disposed after idle_timeout seconds of inactivity.
- max_pool_size (default 10): maximum number of connections in the pool.
- dispose_batch_size (default 0): maximum number of expired and idle connections to dispose on connection release (ignored when the background collector is running).
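As a sketch of overriding these defaults, assuming the AhnlichDBPoolSettings dataclass shown above can be imported from the client package and passed to the client via a pool_settings keyword argument (both the import path and the argument name are assumptions to confirm against the installed version):
from ahnlich_client_py import AhnlichDBClient
# hypothetical import path for the dataclass shown above -- confirm where it lives in your installed version
from ahnlich_client_py.config import AhnlichDBPoolSettings

settings = AhnlichDBPoolSettings(
    idle_timeout=60.0,  # dispose extra connections after 60 seconds of inactivity
    min_idle_connections=5,  # try to keep at least 5 connections to the db endpoint
    max_pool_size=20,  # never hold more than 20 connections
)
# hypothetical keyword argument name (pool_settings) -- confirm against the client signature
client = AhnlichDBClient(address="127.0.0.1", port=port, pool_settings=settings)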
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.ping(tracing_id)
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.info_server()
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.list_clients()
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.list_stores(tracing_id)
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.create_store(
store_name = "test store",
dimension = 5,
create_predicates = [
"job"
],
error_if_exists=True
)
Once a store's dimension is fixed, all store_keys must conform to that dimension. Only one-dimensional arrays/vectors of length N are accepted, where N is the store dimension.
from ahnlich_client_py.libs import create_store_key
from ahnlich_client_py.internals import db_query as query  # generated DB query types, aliased to match the examples
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
store_key = create_store_key(data=[5.0, 3.0, 4.0, 3.9, 4.9])
store_value = {"rank": query.MetadataValue__RawString(value="chunin")}
response = client.set(
store_name = "test store",
inputs=[(store_key, store_value)]
)
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.drop_store(
store_name = "test store",
error_if_not_exists=True
)
Returns an array of tuples of (store_key, store_value), at most the specified closest_n.
from ahnlich_client_py.libs import create_store_key
from ahnlich_client_py.internals import db_query as query  # generated DB query types, aliased to match the examples
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
key = create_store_key(data=[1.0, 2.0, 3.0, 3.9, 4.9])  # search key matching the store's dimension (5)
response = client.get_sim_n(
store_name = "test store",
search_input = key,
closest_n = 3,
algorithm = query.Algorithm__CosineSimilarity(),
condition = None,
tracing_id=None,
)
closest_n is a nonzero integer value.
Returns an array of tuples of (store_key, store_value).
from ahnlich_client_py.libs import create_store_key
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
key = create_store_key(data=[5.0, 3.0, 4.0, 3.9, 4.9])  # the key stored in the set example above
response = client.get_key(
store_name = "test store",
keys=[key]
)
Same as get_key but returns results based on defined conditions.
from ahnlich_client_py.internals import db_query as query  # generated DB query types, aliased to match the examples
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
condition = query.PredicateCondition__Value(
query.Predicate__Equals(
key="job",
value=query.MetadataValue__RawString(value="sorcerer")
)
)
response = client.get_by_predicate(
store_name = "test store",
condition=condition
)
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.create_pred_index(
store_name = "test store",
predicates=["job", "rank"]
)
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.drop_pred_index(
store_name = "test store",
predicates=["job"],
error_if_not_exists=True
)
from ahnlich_client_py.internals import db_query as query  # generated DB query types, aliased to match the examples
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.create_non_linear_algorithm_index(
store_name = "test store",
non_linear_indices=[query.NonLinearAlgorithm__KDTree()],
tracing_id = None
)
from ahnlich_client_py.internals import db_query as query  # generated DB query types, aliased to match the examples
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.drop_non_linear_algorithm_index(
store_name = "test store",
non_linear_indices=[query.NonLinearAlgorithm__KDTree()],
error_if_not_exists=True,
tracing_id = None
)
from ahnlich_client_py.libs import create_store_key
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
store_key = create_store_key(data=[5.0, 3.0, 4.0, 3.9, 4.9])
response = client.delete_key(
store_name = "test store",
keys=[store_key]
)
from ahnlich_client_py.internals import db_query as query  # generated DB query types, aliased to match the examples
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
condition = query.PredicateCondition__Value(
query.Predicate__Equals(
key="job",
value=query.MetadataValue__RawString(value="sorcerer")
)
)
response = client.delete_predicate(
store_name = "test store",
condition = condition
)
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.ping(tracing_id)
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.info_server(tracing_id)
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.list_stores(tracing_id)
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.create_store(
store_name = "test store",
model = ai_query.AIModel__AllMiniLML6V2(),
store_type = ai_query.AIStoreType__RawString(),
predicates = [
"job"
],
non_linear_indices= [],
error_if_exists = True,
# Store original controls if we choose to store the raw inputs
# within the DB in order to be able to retrieve the originals again
# during query, else only store values are returned
store_original = True,
tracing_id=None,
)
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
store_inputs = [
(
ai_query.StoreInput__RawString("Jordan One"),
{"brand": ai_query.MetadataValue__RawString("Nike")},
),
(
ai_query.StoreInput__RawString("Yeezey"),
{"brand": ai_query.MetadataValue__RawString("Adidas")},
),
]
response = client.set(
store_name = "test store",
inputs=store_inputs,
tracing_id=None
)
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.drop_store(
store_name = "test store",
error_if_not_exists=True,
tracing_id=None
)
Returns an array of tuples of (store_key, store_value), at most the specified closest_n.
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
search_input = ai_query.StoreInput__RawString("Jordan")
response = client.get_sim_n(
store_name = "test store",
search_input = search_input,
closest_n = 3,
algorithm = ai_query.Algorithm__CosineSimilarity(),
condition = None,
tracing_id=None
)
closest_n is a nonzero integer value.
Same as get_key but returns results based on defined conditions.
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
condition = ai_query.PredicateCondition__Value(
ai_query.Predicate__Equals(
key="brand",
value=ai_query.MetadataValue__RawString(value="Nike")
)
)
response = client.get_by_predicate(
store_name = "test store",
condition=condition,
tracing_id=None,
)
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.create_pred_index(
store_name = "test store",
predicates=["job", "rank"],
tracing_id=None,
)
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.drop_pred_index(
store_name = "test store",
predicates=["job"],
error_if_not_exists=True,
tracing_id=None,
)
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.create_non_linear_algorithm_index(
store_name = "test store",
non_linear_indices=[ai_query.NonLinearAlgorithm__KDTree()],
tracing_id = None
)
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.drop_non_linear_algorithm_index(
store_name = "test store",
non_linear_indices=[ai_query.NonLinearAlgorithm__KDTree()],
error_if_not_exists=True,
tracing_id = None
)
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
key = ai_query.StoreInput__RawString("Custom Made Jordan 4")
response = client.delete_key(
store_name = "test store",
keys=[key],
tracing_id=None
)
Clients have the ability to send multiple requests at once, and these requests will be handled sequentially. The builder class takes care of this. The response is a list of all individual request responses.
from ahnlich_client_py import AhnlichDBClient
from ahnlich_client_py import server_response
client = AhnlichDBClient(address="127.0.0.1", port=port)
request_builder = client.pipeline()
request_builder.ping()
request_builder.info_server()
request_builder.list_clients()
request_builder.list_stores()
response: server_response.ServerResult = client.exec()
The same applies to the AI client; a sketch follows.
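A corresponding sketch for the AI client, assuming it exposes the same pipeline()/exec() surface as the DB client:
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
request_builder = client.pipeline()
request_builder.ping()
request_builder.info_server()
request_builder.list_stores()
response = client.exec()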
The DB and AI client classes can be used as context managers, closing the connection pool automatically when the context ends.
from ahnlich_client_py import AhnlichDBClient
from ahnlich_client_py import server_response
with AhnlichDBClient(address="127.0.0.1", port=port) as db_client:
    response: server_response.ServerResult = db_client.ping()
Alternatively, the connection pool can be closed explicitly by calling cleanup() on the client, as shown below.
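For example, when the context manager is not used, cleanup() can be called in a finally block so the pool is always released:
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
try:
    response = client.ping()
finally:
    # release the pooled connections once the client is no longer needed
    client.cleanup()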
Replace the contents of the MSG_TAG file with your new tag message.
From your feature branch, either use the makefile:
make bump-py-client BUMP_RULE=[major, minor, patch]
or
poetry run bumpversion [major, minor, patch]
When your PR is made, changes in the client version file trigger a release build to PyPI.
- Store Key: a one-dimensional vector
- Store Value: a dictionary containing text or binary values associated with a store key
- Store Predicates: also called predicate indices; indices that improve filtering of store_values
- Predicates: operations that can be used to filter data (Equals, NotEquals, Contains, etc.)
- PredicateConditions: conditions that use a single predicate or tie multiple predicates together using the AND, OR, or Value operation, where Value means just a single predicate. Example: Value
from ahnlich_client_py.internals import db_query
condition = db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="job", value=db_query.MetadataValue__RawString(value="sorcerer"))
)
MetadataValue can also be binary (a list of u8s):
condition = db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="image_data", value=db_query.MetadataValue__Image(value=[2,2,3,4,5,6,7]))
)
Example: AND
# AND takes a tuple of predicate conditions
condition = db_query.PredicateCondition__AND(
(
db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="job", value=db_query.MetadataValue__RawString(value="sorcerer"))
),
db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="rank", value=db_query.MetadataValue__RawString(value="chunin"))
)
)
)
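An OR condition can be sketched the same way; the variant name PredicateCondition__OR below is an assumption mirroring the AND naming above:
# OR ties two predicate conditions together, mirroring the AND form above
condition = db_query.PredicateCondition__OR(
(
db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="job", value=db_query.MetadataValue__RawString(value="sorcerer"))
),
db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="rank", value=db_query.MetadataValue__RawString(value="chunin"))
)
)
)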
- Search Input: a string or binary file that can be stored by the AI proxy. Note that binary input depends on the models used by the store and supported by Ahnlich AI.
- AIModels: supported AI models used by ahnlich ai.
- AIStoreType: the type of store to be created, either Binary or String.
Version | Description |
---|---|
0.0.0 | Base Python clients (Async and Sync) to connect to ahnlich db and AI, with connection pooling and Bincode serialization and deserialization |