Concurrent Python made simple
Pyper is a flexible framework for concurrent and parallel data-processing, based on functional programming patterns. Used for 🔀 ETL Systems, ⚙️ Data Microservices, and 🌐 Data Collection
See the Documentation
Key features:
- 💡 Intuitive API: Easy to learn, easy to think about. Implements clean abstractions to seamlessly unify threaded, multiprocessed, and asynchronous work.
- 🚀 Functional Paradigm: Python functions are the building blocks of data pipelines. Lets you write clean, reusable code naturally.
- 🛡️ Safety: Hides the heavy lifting of underlying task execution and resource clean-up. No more worrying about race conditions, memory leaks, or thread-level error handling.
- ⚡ Efficiency: Designed from the ground up for lazy execution, using queues, workers, and generators.
- ✨ Pure Python: Lightweight, with zero sub-dependencies.
Install the latest version using pip:

```
$ pip install python-pyper
```

Note that `python-pyper` is the PyPI registered package.
In Pyper, the `task` decorator is used to transform functions into composable pipelines.
Let's simulate a pipeline that performs a series of transformations on some data.
```python
import asyncio
import time

from pyper import task


def get_data(limit: int):
    for i in range(limit):
        yield i


async def step1(data: int):
    await asyncio.sleep(1)
    print("Finished async wait", data)
    return data


def step2(data: int):
    time.sleep(1)
    print("Finished sync wait", data)
    return data


def step3(data: int):
    for i in range(10_000_000):
        _ = i * i
    print("Finished heavy computation", data)
    return data


async def main():
    # Define a pipeline of tasks using `pyper.task`
    pipeline = task(get_data, branch=True) \
        | task(step1, workers=20) \
        | task(step2, workers=20) \
        | task(step3, workers=20, multiprocess=True)

    # Call the pipeline
    total = 0
    async for output in pipeline(limit=20):
        total += output
    print("Total:", total)


if __name__ == "__main__":
    asyncio.run(main())
```
Pyper provides an elegant abstraction of the execution of each task, allowing you to focus on building out the logical functions of your program. In the `main` function:

- `pipeline` defines a function; this takes the parameters of its first task (`get_data`) and yields each output from its last task (`step3`)
- Tasks are piped together using the `|` operator (motivated by Unix's pipe operator) as a syntactic representation of passing inputs/outputs between tasks
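As a minimal sketch of both points above (using only the `task`, `branch`, and `|` usage demonstrated in this README; an all-synchronous pipeline is consumed with a plain `for` loop, as in the non-async example further below):

```python
from pyper import task

def get_data(limit: int):
    # First task: its parameters become the pipeline's parameters
    for i in range(limit):
        yield i

def double(data: int):
    # Last task: its outputs are what the pipeline yields
    return data * 2

# `|` pipes get_data's outputs into double
pipeline = task(get_data, branch=True) | task(double)

# The pipeline is called like get_data and iterated for double's outputs
print(list(pipeline(limit=3)))  # expected: [0, 2, 4]
```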
In the pipeline, we are executing three different types of work:

- `task(step1, workers=20)` spins up 20 `asyncio.Task`s to handle asynchronous IO-bound work
- `task(step2, workers=20)` spins up 20 threads to handle synchronous IO-bound work
- `task(step3, workers=20, multiprocess=True)` spins up 20 processes to handle synchronous CPU-bound work
`task` acts as one intuitive API for unifying the execution of each different type of function.
Each task has workers that submit outputs to the next task within the pipeline via queue-based data structures; this is the mechanism underpinning how concurrency and parallelism are achieved. See the docs for a breakdown of what a pipeline looks like under the hood.
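As a rough, standalone illustration of this mechanism (plain `threading` and `queue` from the standard library, not Pyper's actual internals), a worker pulls inputs from one queue, applies its task, and pushes outputs onto the queue feeding the next stage:

```python
import queue
import threading

def worker(func, in_q, out_q):
    # Pull items until the end-of-stream sentinel, apply the task, forward results
    while (item := in_q.get()) is not None:
        out_q.put(func(item))
    out_q.put(None)  # propagate the sentinel downstream

q1, q2 = queue.Queue(), queue.Queue()
threading.Thread(target=worker, args=(lambda x: x * 2, q1, q2)).start()

for i in range(3):
    q1.put(i)
q1.put(None)  # no more inputs

while (result := q2.get()) is not None:
    print(result)  # prints 0, 2, 4
```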
See a non-async example
Pyper pipelines are by default non-async, as long as their tasks are defined as synchronous functions. For example:
```python
import time

from pyper import task


def get_data(limit: int):
    for i in range(limit):
        yield i


def step1(data: int):
    time.sleep(1)
    print("Finished sync wait", data)
    return data


def step2(data: int):
    for i in range(10_000_000):
        _ = i * i
    print("Finished heavy computation", data)
    return data


def main():
    pipeline = task(get_data, branch=True) \
        | task(step1, workers=20) \
        | task(step2, workers=20, multiprocess=True)

    total = 0
    for output in pipeline(limit=20):
        total += output
    print("Total:", total)


if __name__ == "__main__":
    main()
```
A pipeline consisting of at least one asynchronous function becomes an `AsyncPipeline`, which exposes the same usage API, provided `async` and `await` syntax is used in the obvious places. This makes it effortless to combine synchronously and asynchronously defined functions where needed.
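For example, a sketch of a mixed pipeline (reusing only the parameters shown above); the single async task makes the whole pipeline an `AsyncPipeline`, so it is consumed with `async for`:

```python
import asyncio

from pyper import task

def get_data(limit: int):
    for i in range(limit):
        yield i

async def fetch(data: int):
    # The one asynchronous task in an otherwise synchronous pipeline
    await asyncio.sleep(0.1)
    return data

def enrich(data: int):
    return data + 1

async def main():
    pipeline = task(get_data, branch=True) | task(fetch, workers=5) | task(enrich)
    async for output in pipeline(limit=5):
        print(output)

if __name__ == "__main__":
    asyncio.run(main())
```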
To explore more of Pyper's features, see some further examples
Pyper is implemented in pure Python, with no sub-dependencies. It is built on top of the well-established built-in Python modules:
- threading for thread-based concurrency
- multiprocessing for parallelism
- asyncio for async-based concurrency
- concurrent.futures for unifying threads, processes, and async code
This project is licensed under the terms of the MIT license.