Python Pipes
One of my favorite parts of the Elm programming language is its ability to pipe functions in a nice, clean syntax, like so:

```elm
1
    |> (+) 1
    |> (*) 2
```
I covered this in my previous post on functional programming in Python, where I built an `evaluate_pipeline` function that served a similar purpose:

```python
def evaluate_pipeline(value, functions):
    # apply each function in order, feeding the result forward
    for func in functions:
        value = func(value)
    return value
```
Recently I have been feeling the lack of Python's functional capabilities, so I went looking for a cleaner pipeline in Python. What I found was both wonderful and scary. It creates a class to hook into Python's underlying "dunder" methods to allow for a pipeline that looks like this:

```python
result = 1 >> add_one >> double
```

The syntax is pretty, but extremely foreign to Python. Let's take a look at how it works.
Pipe Decorator
The key here is that all of the functions in the pipeline have to be decorated. The decorator wraps each function in a class that implements `__rrshift__`, allowing us to define custom handling of the bitwise right shift operator:

```python
from typing import Callable


class Pipe:
    def __init__(self, func: Callable):
        self.func = func

    def __rrshift__(self, other):
        # called for `other >> self` when `other` doesn't
        # know how to right-shift by a Pipe
        return self.func(other)


def pipe(func: Callable) -> Pipe:
    return Pipe(func)
```
This decorator then wraps any function with `Pipe`, which knows how to take a value, apply the function, and pass the result along:

```python
@pipe
def add_one(x: int) -> int:
    return x + 1


@pipe
def double(x: int) -> int:
    return x * 2


result = 1 >> add_one >> double  # (1 + 1) * 2 == 4
```
Caveats
This works great for single-arity functions, or with functions that all take a common first argument and return one of a similar type. In fact, I found this works as a hack for `pandas.DataFrame`, creating pipeable operations that all take a data frame as a first argument.
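Incidentally, pandas already ships a method-chaining version of this idea: `DataFrame.pipe` applies a function that takes the frame as its first argument, passing any extra arguments through. A minimal sketch (the operation names here are my own, not from anything above):

```python
import pandas as pd


def double_col(df: pd.DataFrame, col: str) -> pd.DataFrame:
    # return a copy with the given column doubled
    out = df.copy()
    out[col] = out[col] * 2
    return out


def add_total(df: pd.DataFrame) -> pd.DataFrame:
    # return a copy with a running total of column "a"
    out = df.copy()
    out["total"] = out["a"].cumsum()
    return out


df = pd.DataFrame({"a": [1, 2, 3]})
result = df.pipe(double_col, "a").pipe(add_total)
```

No decorators needed here, since `pipe` is a method on the frame rather than a wrapper around each function.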
Since each item in the pipeline is now a `Pipe` object instead of a function, you cannot natively call `@pipe`-decorated functions. To allow for this interop, though, the wrapped function can be applied at call time (assuming the `Pipe` class stores it as `.func`):

```python
@pipe
def add(a: int, b: int) -> int:
    return a + b


# add(1, 2) would fail -- a Pipe is not callable -- but the
# underlying function is still reachable:
result = add.func(1, 2)  # 3
```

However, that is not nearly as clean.
OOP Pipeline
We can achieve something similar without the crazy `>>` syntax:

```python
class Pipeline:
    def __init__(self, value):
        self.value = value

    def pipe(self, func, *args, **kwargs):
        # apply func with the current value as its first argument
        return Pipeline(func(self.value, *args, **kwargs))


def add(a: int, b: int) -> int:
    return a + b


result = Pipeline(1).pipe(add, 2).pipe(add, 10).value  # 13
```
This works basically the same without any crazy Python syntax; however, you do need to add `.value` at the end if you want an actual value instead of a `Pipeline` object. This method is also cleaner in that we can take pure Python functions as arguments without any need for partial application.
Reduce Pipeline
While we are at it, we can also make a pipeline out of `reduce` and some s-expression-style tuples:

```python
from functools import reduce


def pipeline(value, operations):
    # each operation is a tuple of (function, *extra_args);
    # the piped value is passed as the first argument
    return reduce(lambda acc, op: op[0](acc, *op[1:]), operations, value)


def add(a, b):
    return a + b


result = pipeline(1, [(add, 2), (add, 10)])  # 13
```
This is very similar to the OOP version in that we don't need `partial` or any specially decorated functions, and here we don't need to call `.value` at the end to get an actual value. This also has the benefit of being able to dynamically build the `operations` list of functions and arguments and pass it into the pipeline.
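For example, the operations list can be assembled at runtime from plain data. This is a sketch reusing a reduce-based pipeline; the registry and config shapes are made up for illustration:

```python
from functools import reduce


def pipeline(value, operations):
    # apply each (function, *extra_args) tuple in order
    return reduce(lambda acc, op: op[0](acc, *op[1:]), operations, value)


def add(a, b):
    return a + b


def scale(x, factor):
    return x * factor


# a registry lets operation names arrive as data
# (e.g. loaded from a config file)
registry = {"add": add, "scale": scale}
config = [("add", 2), ("scale", 10)]

operations = [(registry[name], *args) for name, *args in config]
result = pipeline(1, operations)  # (1 + 2) * 10 == 30
```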
Assumptions
So far, all of the pipelines have assumed that every function passed into the pipeline returns a value and takes the pipelined value as its first argument. This is a little different from the Elm or Haskell way, where the last argument of a function is the one that is pipelined. We can fix that in a few places by swapping the order in which `*args` and the piped value are passed in.
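As a sketch of that swap (using the reduce-based version, with names of my own choosing), the Elm-style last-argument convention only changes where the piped value goes:

```python
from functools import reduce


def pipeline_last(value, operations):
    # pass the extra args first and the piped value last,
    # matching the Elm/Haskell convention
    return reduce(lambda acc, op: op[0](*op[1:], acc), operations, value)


def subtract(a, b):
    return a - b


# the piped value fills the *last* parameter: subtract(10, 7)
result = pipeline_last(7, [(subtract, 10)])  # 3
```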
Conclusions
Pipelines are a cool concept, and being able to cleanly implement them in a few ways in Python was interesting. I am not sure I would ever use the `>>` pipeline in a production system, but it was an interesting look at using dunder methods to extend Python's syntax.