Using Metaprogramming for Architecting Flow in Elixir

TL;DR You can adapt the ideas presented in the last article to any kind of application. Macros can be used to build a Plug-like DSL for your specific use-case. But be careful: Use metaprogramming wisely and only where it’s a good fit. Do not try to build the “Plug for everything”.

Adapting Plug for Your Use-Case

Our use-case from the previous article is the conversion of images via a Mix task:

All activities in the BPMN flow chart above should be pluggable (green tasks).

The major properties of Plug are:

A Plug is a module (or function) that takes a Plug.Conn struct and returns a (modified) Plug.Conn struct.

Each request is processed by a Plug pipeline, a series of plugs that get invoked one after another.

The Plug.Conn struct contains all information received in the request and all information necessary to give a response to the request.

We will call our “Plugs” simply “Steps”, because they represent steps in a business process (and because naming things is hard 😄).

A Step will be defined as a module implementing the Step behaviour.

defmodule Converter.Step do
  # Plug also supports functions as Plugs;
  # we could do that, but for the sake of this article, we won't :)
  @type t :: module

  @callback call(token :: Converter.Token.t()) :: Converter.Token.t()

  defmacro __using__(_opts \\ []) do
    quote do
      @behaviour Converter.Step

      alias Converter.Token
    end
  end
end
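
To make this concrete, here is a sketch of one possible Step. Note that `Converter.Token`'s fields and the option parsing shown are simplified stand-ins for the real modules from the previous article:

```elixir
# NOTE: a minimal stand-in; the real Converter.Token may carry more fields
defmodule Converter.Token do
  defstruct argv: [], options: %{}, errors: []
end

defmodule Converter.Step.ParseOptions do
  # in the real project, this module would start with `use Converter.Step`
  alias Converter.Token

  def call(%Token{argv: argv} = token) do
    # a simplified parse; the accepted options here are hypothetical
    {options, _args, invalid} = OptionParser.parse(argv, strict: [format: :string])

    %Token{token | options: Map.new(options), errors: Enum.map(invalid, &elem(&1, 0))}
  end
end

Converter.Step.ParseOptions.call(%Converter.Token{argv: ["--format", "png"]})
# the returned token now has options: %{format: "png"} and no errors
```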

The use Converter.StepBuilder part is where the metaprogramming starts:

defmodule Converter.StepBuilder do
  # this macro is invoked by `use Converter.StepBuilder`
  defmacro __using__(_opts \\ []) do
    quote do
      # we enable the module attribute `@steps` to accumulate all its values;
      # this means that the value of this attribute is not reset when
      # set a second or third time, but rather the new values are prepended
      Module.register_attribute(__MODULE__, :steps, accumulate: true)

      # register this module to be called before compiling the source
      @before_compile Converter.StepBuilder

      # import the `step/1` macro to build the pipeline
      import Converter.StepBuilder

      # implement the `Step` behaviour's callback
      def call(token) do
        # we defer this call to a function, which we will generate at compile time;
        # we can't generate this function (`call/1`) directly because we would get
        # a compiler error since the function would be missing when the compiler
        # checks run
        do_call(token)
      end
    end
  end

  # this macro gets used to register another Step with our pipeline
  defmacro step(module) do
    quote do
      # this is why we set the module attribute to `accumulate: true`:
      # all Step modules will be stored in this module attribute,
      # so we can read them back before compiling
      @steps unquote(module)
    end
  end

  # this macro is called after all macros were evaluated (e.g. the `use` statement
  # and all `step/1` calls), but before the source gets compiled
  defmacro __before_compile__(_env) do
    quote do
      # this quoted code gets inserted into the module containing
      # our `use Converter.StepBuilder` statement
      defp do_call(token) do
        # we read the @steps and hand them to another function for execution
        #
        # IMPORTANT: the reason for deferring again here is that we want to do
        #            as little complexity as possible in our generated code in
        #            order to minimize the implicitness in our code!
        steps = Enum.reverse(@steps)

        Converter.StepBuilder.call_steps(token, steps)
      end
    end
  end

  def call_steps(initial_token, steps) do
    # to implement the "handing down" of our token through the pipeline,
    # we utilize `Enum.reduce/3` and use the accumulator to store the token
    Enum.reduce(steps, initial_token, fn step, token ->
      step.call(token)
    end)
  end
end

That seems like a lot to take in. But in the end, it's rather simple:

Each call to step/1 adds another module to the @steps attribute.

Right before compiling, we generate a do_call/1 function, which reads the accumulated Step modules from this attribute.

A third function is used to actually call all the Steps.
We do this to minimize the work done in the generated parts of our code.
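
The attribute accumulation and the `@before_compile` hook are the two mechanisms doing the heavy lifting here. They can be illustrated with a stripped-down, self-contained demo (the module names here are made up):

```elixir
defmodule Recorder do
  defmacro __using__(_opts) do
    quote do
      # every write to @entries now prepends instead of overwriting
      Module.register_attribute(__MODULE__, :entries, accumulate: true)
      @before_compile Recorder
    end
  end

  defmacro __before_compile__(env) do
    # read back everything that was accumulated in the using module
    entries = Module.get_attribute(env.module, :entries)

    quote do
      # generate a function returning the entries in their original order
      def entries, do: unquote(Enum.reverse(entries))
    end
  end
end

defmodule Example do
  use Recorder

  @entries :one
  @entries :two
end

Example.entries()
# => [:one, :two]
```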

Also, please note how there is no reference to Converter.Token in our StepBuilder and how it’s just ~40 lines of code. That’s pretty cool!

I really like how this provides visibility into the “business process” that our code is concerned with. This piece of code can serve as an entrypoint for new contributors, since it is not only the runtime blueprint, but it also serves as documentation.

If you read this far, take a deep breath. You’re about to take the red pill.

Advanced Metaprogramming for Complex Flows

In most cases, business processes are more complicated than our example.

Even the flow of our Mix task is less trivial than we made it out to be:
This diagram completely ignores the fact that this flow has at least two different outcomes: one following an early exit, where the given arguments cannot be validated, and a happy one, where images are found and converted successfully.

If we remodel our process based on this insight, the result looks something like this:

To express this change in our MyProcess module, we have to be able to provide a filter condition to our step macro, expressing under which circumstances a Step module should be called:

defmodule Converter.MyProcess do
  use Converter.StepBuilder

  step Converter.Step.ParseOptions
  step Converter.Step.ValidateOptions

  # we'll provide the conditions via a keyword
  step Converter.Step.PrepareConversion, if: token.errors == []
  step Converter.Step.ConvertImages, if: token.errors == []
  step Converter.Step.ReportResults, if: token.errors == []

  # `if:` is not something Elixir provides, we'll have to implement it ourselves
  # also, we could have named this any way we wanted, `if:` just seemed obvious
  step Converter.Step.ReportErrors, if: token.errors != []
end

With this, we can model the flow from the diagram.

Let’s see how this is done.

Compiling Steps as Case-Statements

To add the dynamic conditions provided via if:, we have to revise our approach from the beginning and rework our __before_compile__/1 and step/1 macros:

defmodule Converter.StepBuilder do
  defmacro __using__(_opts \\ []) do
    # this macro remains unchanged
    quote do
      Module.register_attribute(__MODULE__, :steps, accumulate: true)

      @before_compile Converter.StepBuilder

      import Converter.StepBuilder

      def call(token) do
        do_call(token)
      end
    end
  end

  defmacro step(module) do
    quote do
      # we are now using 2-element tuples to save the steps
      # (the second element will be used to store the given conditions)
      @steps {unquote(module), true}
    end
  end

  defmacro __before_compile__(env) do
    # read steps from `env` (they are in reverse order, like before)
    steps = Module.get_attribute(env.module, :steps)

    # we are compiling the body of our `do_call/1` as a quoted expression
    body = Converter.StepBuilder.compile(steps)

    quote do
      # unlike before, we do not call another function, but rather unquote the
      # body returned by `Converter.StepBuilder.compile/1`
      defp do_call(token) do
        unquote(body)
      end
    end
  end

  def compile(steps) do
    token = quote do: token

    # we use Enum.reduce/3 like before, but this time we are compiling all the
    # calls at compile-time into multiple nested case-statements
    Enum.reduce(steps, token, &compile_step/2)
  end

  defp compile_step({step, _conditions}, acc) do
    quoted_call =
      quote do
        unquote(step).call(token)
      end

    # this is where the magic happens: we generate a case-statement for
    # each call and nest them into each other
    quote do
      case unquote(quoted_call) do
        %Converter.Token{} = token ->
          # this is where all the previously compiled case-statements are inserted,
          # thereby "wrapping" them in this new case-statement
          unquote(acc)

        _ ->
          raise unquote("expected #{inspect(step)}.call/1 to return a Token")
      end
    end
  end
end

Okay, that was fast. Here’s how the “nested case-statements” technique works:

When we read the steps attribute, we get the reversed list of all steps. We start with the quoted token variable as the initial accumulator of our reducer …

# NOTE: the plus sign (+) isn't code; it marks the lines added in each iteration

+| token

… then wrap a case-statement for the first step in our list around it …

+| case Converter.Step.ReportResults.call(token) do
+|   %Converter.Token{} = token ->
 |     token
 |
+|   _ ->
+|     raise("expected Converter.Step.ReportResults.call/1 to return a Token")
+| end

… and with each iteration of the reducer, we wrap the previous block in a new case-statement for the current step in our list …

+| case Converter.Step.ConvertImages.call(token) do
+|   %Converter.Token{} = token ->
 |     case Converter.Step.ReportResults.call(token) do
 |       %Converter.Token{} = token ->
 |         token
 |
 |       _ ->
 |         raise("expected Converter.Step.ReportResults.call/1 to return a Token")
 |     end
 |
+|   _ ->
+|     raise("expected Converter.Step.ConvertImages.call/1 to return a Token")
+| end

At the end we get a long list of nested case-statements representing our flow:

defp do_call(token) do
  case Converter.Step.ParseOptions.call(token) do
    %Converter.Token{} = token ->
      case Converter.Step.ValidateOptions.call(token) do
        %Converter.Token{} = token ->
          case Converter.Step.PrepareConversion.call(token) do
            %Converter.Token{} = token ->
              case Converter.Step.ConvertImages.call(token) do
                %Converter.Token{} = token ->
                  case Converter.Step.ReportResults.call(token) do
                    %Converter.Token{} = token ->
                      token

                    _ ->
                      raise("expected Converter.Step.ReportResults.call/1 to ...")
                  end

                _ ->
                  raise("expected Converter.Step.ConvertImages.call/1 to ...")
              end

            _ ->
              raise("expected Converter.Step.PrepareConversion.call/1 to ...")
          end

        _ ->
          raise("expected Converter.Step.ValidateOptions.call/1 to ...")
      end

    _ ->
      raise("expected Converter.Step.ParseOptions.call/1 to ...")
  end
end

That’s a lot to take in. But in the end, it’s not that complicated:

We call a step and check the result via a case macro.

If there is an unexpected return, we raise an exception.

If not, we put the result into the next step and so on …

Think of it as a series of assignments …

result1 =
  case step1(token) do
    %Token{} = result1 -> result1
    _ -> raise "Step1 did not work!"
  end

result2 =
  case step2(result1) do
    %Token{} = result2 -> result2
    _ -> raise "Step2 did not work!"
  end

result3 =
  case step3(result2) do
    # and so on ...
  end

… only that we nest the case-statements instead of assigning them to variables.

case step1(token) do
  %Token{} = result1 ->
    case step2(result1) do
      %Token{} = result2 ->
        case step3(result2) do
          # and so on ...
        end

      _ ->
        raise "Step2 did not work!"
    end

  _ ->
    raise "Step1 did not work!"
end

We add support for the conditions by extending our step macro with a variant that accepts the if: keyword …

defmacro step(module, if: conditions) do
  quote do
    # the second element of the tuple stores the given conditions
    @steps {unquote(module), unquote(Macro.escape(conditions))}
  end
end

… and by updating compile_step/2 to include the given conditions:

defp compile_step({step, conditions}, acc) do
  quoted_call =
    quote do
      unquote(step).call(token)
    end

  quote do
    # instead of just calling the Step, we are compiling the given conditions
    # into the call
    result = unquote(compile_conditions(quoted_call, conditions))

    case result do
      %Converter.Token{} = token ->
        unquote(acc)

      _ ->
        raise unquote("expected #{inspect(step)}.call/1 to return a Token")
    end
  end
end

defp compile_conditions(quoted_call, true) do
  # if no conditions were given, we simply call the Step
  quoted_call
end

defp compile_conditions(quoted_call, conditions) do
  quote do
    # we have to use `var!/1` for our variable to be accessible
    # by the code inside `conditions`
    var!(token) = token

    # to avoid "unused variable" warnings, we assign the variable to `_`
    _ = var!(token)

    if unquote(conditions) do
      # if the given conditions are truthy, we call the Step
      unquote(quoted_call)
    else
      # otherwise, we just return the token
      token
    end
  end
end

This compiles the step and conditions into a block of code, which ensures access to the current token and tests the given conditions with an if statement.
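
The var!/1 call deserves a closer look: Elixir macros are hygienic, meaning variables inside a quote do not clash with (or see) variables in the caller's scope unless explicitly marked. A tiny, made-up demonstration:

```elixir
defmodule Hygiene do
  # references the *caller's* `token` variable, escaping hygiene via `var!/1`
  defmacro unhygienic_read do
    quote do: var!(token)
  end
end

defmodule Demo do
  import Hygiene

  def run do
    token = :from_caller

    # without `var!/1` inside the macro, the quoted `token` would be a
    # different, hygienic variable and this would not even compile
    unhygienic_read()
  end
end

Demo.run()
# => :from_caller
```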

Here’s an example for the last step ReportErrors, which should only be invoked if token.errors != []:

result =
  (
    var!(token) = token
    _ = var!(token)

    if token.errors() != [] do
      Converter.Step.ReportErrors.call(token)
    else
      token
    end
  )

case result do
  %Converter.Token{} = token ->
    token

  _ ->
    raise("expected Converter.Step.ReportErrors.call/1 to return a Token")
end

These blocks of code are then nested into each other as explained before.
The generated code might seem cumbersome, but since it is generated at compile-time, you never actually have to read it.
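
If you ever do want to read it, you don't have to work blindly: the standard library's Macro.to_string/1 turns any quoted expression back into source code, so while developing you could pipe the generated body through it and IO.puts/1 the result from within __before_compile__/1. A self-contained example (Step1 and Token are placeholder names):

```elixir
# any quoted expression can be turned back into readable source
quoted =
  quote do
    case Step1.call(token) do
      %Token{} = token -> token
      _ -> raise "expected Step1.call/1 to return a Token"
    end
  end

quoted |> Macro.to_string() |> IO.puts()
```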

Icing on the Cake: Adding cond-like blocks to Steps

We are now able to model our improved flow diagram. But we’re not quite there yet.

We don’t want to write the same if: conditional for each individual step on a path.
Ideally, we want to recognize the paths from the flow diagram in our code without comparing conditions.

We can implement this in a rather simple fashion: inside a conditional block, we take all step calls and append the conditions given in the block’s head to each of them.
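
Before diving into the implementation, here is a sketch of how MyProcess could read with such a block. The exact DSL shape is our design goal rather than something shown in the original code, and the snippet is wrapped in quote/2 only so it parses on its own without the full StepBuilder:

```elixir
quoted =
  quote do
    defmodule Converter.MyProcess do
      use Converter.StepBuilder

      step Converter.Step.ParseOptions
      step Converter.Step.ValidateOptions

      # each path from the flow diagram becomes one `->` clause
      step do
        token.errors == [] ->
          step Converter.Step.PrepareConversion
          step Converter.Step.ConvertImages
          step Converter.Step.ReportResults

        token.errors != [] ->
          step Converter.Step.ReportErrors
      end
    end
  end
```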

defmacro step(do: clauses) do
  Enum.reduce(clauses, nil, fn {:->, _, [[conditions], args]}, acc ->
    # we collect all calls inside the current `->` block ...
    quoted_calls =
      case args do
        {:__block__, _, quoted_calls} -> quoted_calls
        single_quoted_call -> [single_quoted_call]
      end

    # ... and add conditions where applicable
    quote do
      unquote(acc)
      unquote(add_conditions(quoted_calls, conditions))
    end
  end)
end

defp add_conditions(list, conditions) when is_list(list) do
  Enum.map(list, &add_conditions(&1, conditions))
end

# quoted calls to our `step/1` macro look like this:
#
#     {:step, _, [MyStepModule]}
#
# so all we have to do is append the `if:` condition
#
#     {:step, _, [MyStepModule, [if: conditions]]}
#
defp add_conditions({:step, meta, args}, conditions) do
  {:step, meta, args ++ [[if: conditions]]}
end

# if we encounter any other calls, we just leave them intact
defp add_conditions(ast, _conditions) do
  ast
end
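
The AST rewrite at the heart of add_conditions/2 can be tried out in isolation: a quoted step call really is just a 3-element tuple that we can pattern match and extend (the module and condition used here are made up):

```elixir
# a quoted call is a tuple of {name, metadata, arguments}
quoted = quote do: step(SomeModule)
{:step, meta, args} = quoted

conditions = quote do: token.errors == []

# appending the `if:` keyword is plain list manipulation
rewritten = {:step, meta, args ++ [[if: conditions]]}

IO.puts(Macro.to_string(rewritten))
# prints something like: step(SomeModule, if: token.errors == [])
```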

But it is also freakin’ scary:
We just wrote a macro that rewrites other macro calls, which in turn generate the code paths of our process via AST manipulation at compile-time.

With great power comes great responsibility

At this point, you won’t be surprised to hear that Elixir’s metaprogramming facilities are sometimes referred to as “sane insanity”, because you can do these insane things, but at least only at compile-time.

Avoid excessive use of metaprogramming, as it tends to make code implicit, side effects less obvious, and debugging a nightmare.

Realize that it is more important to understand the principles behind the presented ideas than to build a magically generic solution.

Do not attempt to build the “Plug for everything”.

Build a solution tailored towards your specific problem, since this is the real strength of metaprogramming:
You can build a great DSL that uses conditionals, case-statements, pattern matching and guards under the hood to abstract away the most common use-case of your domain.

Plug & Phoenix do this and the router is a great example of how to create a meaningful DSL for the most common use-case in a large domain!

Hi, I'm René.
I've been fortunate enough to work in software since I was 17 years old.
Currently, I am Head of Product Development @5MindsIT.
I have a keen interest in distributed systems and hold a PhD in Production Economics.
In my spare time I am an Open Source maintainer and the creator of Credo, ElixirStatus and ElixirWeekly.