One of my favorite parts of the Elm programming language is its pipe operator, |>, which lets you chain functions in a nice, clean syntax such as value |> step1 |> step2.
I covered this in my previous post on functional programming in Python, where I built an evaluate_pipeline function that served a similar purpose.
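That earlier implementation isn't reproduced here, so as a refresher, this is a minimal sketch of what such an evaluate_pipeline function can look like (the exact signature is an assumption):

```python
from functools import reduce
from typing import Any, Callable, Iterable

def evaluate_pipeline(value: Any, funcs: Iterable[Callable[[Any], Any]]) -> Any:
    # Thread the starting value through each function in order.
    return reduce(lambda acc, func: func(acc), funcs, value)

result = evaluate_pipeline(2, [lambda x: x + 1, lambda x: x * 10])  # 30
```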
Recently I have been feeling the lack of these functional capabilities in Python, so I went looking for a cleaner way to write pipelines.
What I found was both wonderful and scary. It creates a class that hooks into Python's underlying "dunder" methods to allow for a pipeline that looks something like result = 5 >> add_one >> double. The syntax is pretty, but extremely foreign to Python. Let's take a look at how it works.
The key here is that every function used in the pipeline has to be decorated. The decorator wraps each function in a class that implements __rrshift__, allowing us to define custom handling of the bitwise right-shift operator:
```python
def pipe(func: Callable):
    return Pipe(func)
```
This decorator wraps any function in Pipe, which knows how to take a value, apply the function to it, and pass the result along.
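Putting the pieces together, a minimal reconstruction of the whole pattern might look like this (the class internals and the example functions are my sketch, not necessarily the original code):

```python
from typing import Callable

class Pipe:
    """Wraps a function so that `value >> wrapped` applies the function."""

    def __init__(self, func: Callable):
        self.func = func

    def __rrshift__(self, other):
        # Triggered by `other >> self` when `other` (e.g. an int) does not
        # know how to right-shift a Pipe: apply the wrapped function and
        # return a plain value, ready for the next `>>` in the chain.
        return self.func(other)

def pipe(func: Callable) -> Pipe:
    return Pipe(func)

@pipe
def add_one(x: int) -> int:
    return x + 1

@pipe
def double(x: int) -> int:
    return x * 2

result = 5 >> add_one >> double  # (5 + 1) * 2 = 12
```

Because __rrshift__ returns a plain value rather than another Pipe, each >> step hands an ordinary Python object to the next stage.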
This works great for single-arity functions, or for sets of functions that all take a common first argument and return a value of a similar type. In fact, I first found this as a hack for pandas.DataFrame, creating pipeable operations that all take a data frame as their first argument.
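To illustrate that shape without pulling pandas in, here is the same pattern over a list of dicts standing in for a data frame (the operations and field names are invented for the example, and the Pipe class is used directly as the decorator for brevity):

```python
from typing import Callable

class Pipe:
    def __init__(self, func: Callable):
        self.func = func

    def __rrshift__(self, other):
        return self.func(other)

@Pipe
def drop_missing(rows: list) -> list:
    # Keep only rows where every field is present.
    return [row for row in rows if all(v is not None for v in row.values())]

@Pipe
def add_totals(rows: list) -> list:
    # Derive a total field from price and qty.
    return [{**row, "total": row["price"] * row["qty"]} for row in rows]

rows = [
    {"price": 2, "qty": 3},
    {"price": None, "qty": 1},
]
result = rows >> drop_missing >> add_totals
# [{'price': 2, 'qty': 3, 'total': 6}]
```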
Since each item in the pipeline is now a Pipe object instead of a function, you cannot natively call @pipe-decorated functions; to keep that interop, the wrapper can instead be applied at call time.
```python
def add(a: int, b: int) -> int:
    return a + b
```
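A sketch of that call-time interop, with Pipe re-declared so the snippet runs standalone, and functools.partial filling in the extra argument (my reconstruction, not necessarily the original code):

```python
from functools import partial
from typing import Callable

class Pipe:
    def __init__(self, func: Callable):
        self.func = func

    def __rrshift__(self, other):
        return self.func(other)

def pipe(func: Callable) -> Pipe:
    return Pipe(func)

def add(a: int, b: int) -> int:
    return a + b

# add stays a plain function, so add(1, 2) still works;
# pipe(...) is applied at the call site when we want to chain it.
result = 10 >> pipe(partial(add, b=5))  # 15
```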
However, that is not nearly as clean.
We can achieve something similar without the crazy dunder tricks by using a small wrapper class.
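One way to sketch such a class: a wrapper holding a value attribute, with a chaining method (the name then is my invention) that passes extra arguments straight through to each function:

```python
from typing import Any, Callable

class Pipeline:
    def __init__(self, value: Any):
        self.value = value

    def then(self, func: Callable, *args, **kwargs) -> "Pipeline":
        # Pass the current value as the first argument, along with any
        # extra arguments -- no functools.partial required.
        return Pipeline(func(self.value, *args, **kwargs))

def add(a: int, b: int) -> int:
    return a + b

result = Pipeline(1).then(add, 2).then(add, 10).value  # 13
```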
This works basically the same without any crazy Python syntax; however, you do need to add .value at the end if you want an actual value instead of a Pipeline object. This method is also cleaner in that we can take pure Python functions as arguments without any need for partial.
While we are at it, we can also make a pipeline out of reduce and some s-expression-style tuples.
```python
from functools import reduce
```
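A sketch of this reduce-based approach; the tuple shape of (function, *extra_args) is my assumption:

```python
from functools import reduce
from typing import Any, Iterable

def evaluate(value: Any, operations: Iterable[tuple]) -> Any:
    # Each operation is a tuple of (function, *extra_args); the running
    # value is passed as the function's first argument.
    return reduce(lambda acc, op: op[0](acc, *op[1:]), operations, value)

def add(a: int, b: int) -> int:
    return a + b

def mul(a: int, b: int) -> int:
    return a * b

# The operations list is plain data, so it can be built dynamically.
operations = [(add, 2), (mul, 10)]
result = evaluate(1, operations)  # (1 + 2) * 10 = 30
```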
Very similar to the OOP version in that we don't need partial or any specially decorated functions, and here we don't need to call .value at the end to get an actual value. This also has the benefit that we can dynamically build the operations list of functions and arguments and pass it into the pipeline.
So far all of the pipelines have assumed that every function passed into the pipeline returns a value, and that it takes the pipelined value as its first argument. This is a little different from the Elm or Haskell way, where the last argument of a function is the one that is pipelined.
We can fix that in each version by swapping the order in which the piped value is passed, supplying it last instead of first.
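For the reduce-based version, for example, that is a one-line change (a sketch):

```python
from functools import reduce
from typing import Any, Iterable

def evaluate_last(value: Any, operations: Iterable[tuple]) -> Any:
    # Elm-style: the piped value goes in as the LAST argument.
    return reduce(lambda acc, op: op[0](*op[1:], acc), operations, value)

def sub(a: int, b: int) -> int:
    return a - b

# Each step computes extra_arg - acc, since acc is now the last argument.
result = evaluate_last(1, [(sub, 10)])  # sub(10, 1) = 9
```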
Pipelines are a cool concept, and being able to cleanly implement them in a few ways in Python was interesting. I am not sure I would ever use the >> pipeline in a production system, but it was an interesting look at using Python's special methods to extend its syntax.