This project is a proof of concept (POC) in which a client uploads pipeline artifacts to an object storage such as S3, and a consumer reads those artifacts efficiently. The consumer is scalable, containerized, and ready to run on Kubernetes/OpenShift.
The diagram below represents the high-level workflow; the uploader CLI and processor boxes are the components being implemented here.
The infra needed to run this POC is basically AWS S3 and AWS SQS.
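For local experimentation, the bucket and queue can be provisioned with boto3. This is a minimal sketch only; the bucket name, queue name, and region below are assumptions for illustration, not names defined by this project.

```python
# Minimal sketch: provision the S3 bucket and SQS queue this POC relies on.
# The names and region below are assumptions, not part of this project.
import boto3

REGION = "us-east-1"
BUCKET = "poc-artifacts"        # hypothetical bucket name
QUEUE = "poc-artifacts-queue"   # hypothetical queue name

s3 = boto3.client("s3", region_name=REGION)
sqs = boto3.client("sqs", region_name=REGION)

# In us-east-1, CreateBucket needs no LocationConstraint.
s3.create_bucket(Bucket=BUCKET)
queue_url = sqs.create_queue(QueueName=QUEUE)["QueueUrl"]
print(f"bucket={BUCKET} queue_url={queue_url}")
```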
See the developer guide for instructions on how to run this POC or contribute.
The steps below show an end-to-end (e2e) execution from the uploader (producer) to the processor (consumer):

1. Check that SQS and S3 are empty: no messages and no files (see the verification sketch after this list).
2. Simulate a HOST A where the artifact exists and needs to be uploaded to S3; a message is also sent to SQS. For this step the code `test_host_producer` was executed (a producer sketch follows this list).
3. Check that there are now messages in SQS and files in S3.
4. Simulate a HOST B that consumes messages from SQS; for each message it downloads the artifact from S3, runs an action (in this case the `import_to_ibutsu` action), then deletes the file from S3 and finally deletes the message from SQS. For this step the code `test_host_consumer` was executed (see the consumer sketch after this list).
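Steps 1 and 3 can be verified with a short boto3 check. A minimal sketch, assuming the hypothetical `poc-artifacts` bucket and `poc-artifacts-queue` queue used throughout these examples:

```python
# Minimal sketch: report whether the queue and bucket are empty (step 1)
# or populated (step 3). Bucket and queue names are assumptions.
import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="poc-artifacts-queue")["QueueUrl"]

attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["ApproximateNumberOfMessages"]
)["Attributes"]
objects = s3.list_objects_v2(Bucket="poc-artifacts").get("Contents", [])

print(f"messages: {attrs['ApproximateNumberOfMessages']}, files: {len(objects)}")
```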
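The producer side (step 2) boils down to one upload and one notification. A hedged sketch of what `test_host_producer` does, under the assumption that the message body simply carries the S3 object key; the actual message schema, paths, and names live in this project's code:

```python
# Minimal producer sketch (HOST A): upload an artifact to S3, then notify
# the consumer via SQS. Names and message schema are assumptions.
import json
import boto3

BUCKET = "poc-artifacts"          # hypothetical bucket name
QUEUE = "poc-artifacts-queue"     # hypothetical queue name

s3 = boto3.client("s3")
sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName=QUEUE)["QueueUrl"]

artifact = "results/junit.xml"    # hypothetical local artifact path
key = "artifacts/junit.xml"       # hypothetical object key

s3.upload_file(artifact, BUCKET, key)
sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps({"s3_key": key}))
```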
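The consumer loop (step 4) mirrors the description above: receive, download, act, clean up. A minimal sketch of what `test_host_consumer` does; `import_to_ibutsu` is represented by a placeholder here, since its real signature is defined in this project:

```python
# Minimal consumer sketch (HOST B): for each SQS message, download the
# artifact, run the action, then delete the file and the message.
import json
import boto3

BUCKET = "poc-artifacts"          # hypothetical bucket name
QUEUE = "poc-artifacts-queue"     # hypothetical queue name

s3 = boto3.client("s3")
sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName=QUEUE)["QueueUrl"]

def import_to_ibutsu(path):       # placeholder for the real action
    print(f"importing {path} to ibutsu")

while True:
    resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1,
                               WaitTimeSeconds=10)
    for msg in resp.get("Messages", []):
        key = json.loads(msg["Body"])["s3_key"]
        local = "/tmp/" + key.rsplit("/", 1)[-1]
        s3.download_file(BUCKET, key, local)        # download the artifact
        import_to_ibutsu(local)                     # run the action
        s3.delete_object(Bucket=BUCKET, Key=key)    # delete file from S3
        sqs.delete_message(QueueUrl=queue_url,      # delete message from SQS
                           ReceiptHandle=msg["ReceiptHandle"])
```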
Also see the scenarios in the developer guide.