feat(robot-server, api): accept RTP overrides via /protocols and inject them in python executor #14688
Changes from 7 commits
@@ -1,4 +1,5 @@
"""Router for /protocols endpoints."""
import json
import logging
from textwrap import dedent
from datetime import datetime
@@ -165,6 +166,10 @@ async def create_protocol(
            " protocol resources on the robot."
        ),
    ),
    runTimeParameterValues: Optional[str] = Form(
        default=None,
        description="Key value pairs of run-time parameters defined in a protocol.",
    ),
Review comment: It might be worth clarifying in this description that:
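For illustration only (not part of this PR's diff): a sketch of what this field carries from the client's side, with the overrides serialized as a JSON string inside the multipart form. The robot address, version header, and parameter names below are made-up placeholders.

```python
# Illustrative client-side sketch: upload a protocol with run-time parameter
# overrides sent as a JSON-encoded multipart form field.
import json
import requests  # third-party HTTP client, assumed available

rtp_overrides = {"sample_count": 48, "dry_run": True}  # hypothetical parameter names

with open("my_protocol.py", "rb") as protocol_file:
    response = requests.post(
        "http://ROBOT_IP:31950/protocols",  # placeholder robot address
        headers={"Opentrons-Version": "*"},  # assumed version header
        files=[("files", protocol_file)],
        data={"runTimeParameterValues": json.dumps(rtp_overrides)},
    )
print(response.status_code, response.json())
```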
    protocol_directory: Path = Depends(get_protocol_directory),
    protocol_store: ProtocolStore = Depends(get_protocol_store),
    analysis_store: AnalysisStore = Depends(get_analysis_store),
@@ -184,6 +189,7 @@ async def create_protocol(
    Arguments:
        files: List of uploaded files, from form-data.
        key: Optional key for client-side tracking
        runTimeParameterValues: Key value pairs of run-time parameters defined in a protocol.
Review comment: Let's make this snake_case for consistency.
Reply: It's actually camel case for robot-server HTTP API consistency purposes.
        protocol_directory: Location to store uploaded files.
        protocol_store: In-memory database of protocol resources.
        analysis_store: In-memory database of protocol analyses.
@@ -205,11 +211,32 @@ async def create_protocol(
        assert file.filename is not None
    buffered_files = await file_reader_writer.read(files=files)  # type: ignore[arg-type]

    if isinstance(runTimeParameterValues, str):
        # We have to do this isinstance check because if `runTimeParameterValues` is
        # not specified in the request, then it gets assigned a Form(None) value
        # instead of just a None. \(O.o)/
        parsed_rtp = json.loads(runTimeParameterValues)
Review comment: Nit, doesn't have to happen in this PR: this will raise a 500 error if the client passes invalid JSON. Ideally, it would return a 422 instead. I wonder if there's a way to make a custom "run-time parameter multipart form field" Pydantic type so FastAPI can do that for us.
Reply: Hmm, interesting idea. I'll add a TODO.
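Not part of this PR, but a minimal sketch of the kind of validation the comment above is suggesting: catch the JSON decode error at the endpoint boundary and surface it as a 422 instead of letting it bubble up as a 500. The helper name and error message are assumptions, not the robot-server's actual error model.

```python
# Illustrative sketch only: parse the form field defensively so malformed JSON
# becomes a client error (422) instead of an unhandled 500.
import json
from typing import Dict, Optional, Union

from fastapi import HTTPException, status


def parse_rtp_field(
    runTimeParameterValues: Optional[str],
) -> Optional[Dict[str, Union[float, bool, str]]]:
    """Decode the runTimeParameterValues form field, rejecting bad JSON."""
    if runTimeParameterValues is None:
        return None
    try:
        return json.loads(runTimeParameterValues)
    except json.JSONDecodeError as e:
        # robot-server has its own error-response machinery; HTTPException is
        # just a stand-in for illustration.
        raise HTTPException(
            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
            detail=f"runTimeParameterValues is not valid JSON: {e}",
        ) from e
```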
    else:
        parsed_rtp = None

    content_hash = await file_hasher.hash(buffered_files)
    cached_protocol_id = protocol_store.get_id_by_hash(content_hash)

    if cached_protocol_id is not None:
        # Protocol exists in database
        resource = protocol_store.get(protocol_id=cached_protocol_id)
        if parsed_rtp:
            # This protocol exists in database but needs to be re-analyzed with the
            # passed-in RTP overrides
            task_runner.run(
                protocol_analyzer.analyze,
                protocol_resource=resource,
                analysis_id=analysis_id,
                run_time_param_overrides=parsed_rtp,
            )
            analysis_store.add_pending(
                protocol_id=cached_protocol_id,
                analysis_id=analysis_id,
            )
Review comment: This will probably need to be implemented in a different PR, but at some point before releasing this, we'll need to cap the maximum number of protocol analyses. I'm curious if you have any thoughts on what the limit will be?
Reply: Yeah, we'll have to figure that out. How did we land on the max number of protocols and runs to store? Is there a hard cutoff for memory availability we need to keep in mind?
Reply: It's mostly how long it would take to parse them all into Pydantic models if we had to do some kind of migration.
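Purely as a thought experiment on the capping idea discussed above (nothing here is in the PR): one shape this could take is pruning the oldest analyses once a per-protocol limit is exceeded. The store methods and the limit below are hypothetical.

```python
# Hypothetical sketch of capping stored analyses per protocol; the
# AnalysisStore methods used here are assumed, not real robot-server APIs.
MAX_ANALYSES_PER_PROTOCOL = 5  # made-up limit for illustration


def prune_analyses(analysis_store, protocol_id: str) -> None:
    """Delete the oldest analyses once the protocol exceeds the cap."""
    analysis_ids = analysis_store.get_ids_by_protocol(protocol_id)  # oldest first (assumed)
    excess = len(analysis_ids) - MAX_ANALYSES_PER_PROTOCOL
    for analysis_id in analysis_ids[: max(excess, 0)]:
        analysis_store.remove(analysis_id)  # hypothetical method
```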
        analyses = analysis_store.get_summaries_by_protocol(
            protocol_id=cached_protocol_id
        )
@@ -228,7 +255,8 @@ async def create_protocol(
        )

        log.info(
            f'Protocol with id "{cached_protocol_id}" with same contents already exists. returning existing protocol data in response payload'
            f'Protocol with id "{cached_protocol_id}" with same contents already exists.'
            f" Returning existing protocol data in response payload."
        )

        return await PydanticResponse.create(
@@ -238,7 +266,6 @@ async def create_protocol(
        )

    try:
        # Can make the passed in RTPs as part of protocolSource returned here
        source = await protocol_reader.save(
            files=buffered_files,
            directory=protocol_directory / protocol_id,
@@ -272,6 +299,7 @@ async def create_protocol(
        protocol_analyzer.analyze,
        protocol_resource=protocol_resource,
        analysis_id=analysis_id,
        run_time_param_overrides=parsed_rtp,
    )
    pending_analysis = analysis_store.add_pending(
        protocol_id=protocol_id,
Review comment: We should use the PE type here if possible.
Reply: I had a TODO for this exact thing. I or PyCharm must have removed it during the refactor. Putting it back in.
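To make the suggestion above concrete, here is a rough sketch of what a typed analyzer signature could look like. `RunTimeParamValuesType` is written out as a hypothetical alias; the actual Protocol Engine type (and whether it already exists under that name) would need to be confirmed against the codebase.

```python
# Hypothetical sketch only: give the overrides argument a real type instead of
# an untyped dict from json.loads. The alias below is assumed, not the actual
# Protocol Engine type.
from typing import Dict, Optional, Union

RunTimeParamValuesType = Dict[str, Union[float, bool, str]]  # assumed shape


class ProtocolAnalyzer:
    async def analyze(
        self,
        protocol_resource,  # ProtocolResource from the protocol store
        analysis_id: str,
        run_time_param_overrides: Optional[RunTimeParamValuesType] = None,
    ) -> None:
        """Analyze the protocol, applying any run-time parameter overrides."""
        ...
```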