Here is a software analysis anti-pattern I have seen many times in my career. It is popping up in my current project, and I am trying to work out how to subvert it early.

Let me describe the cycle.

A set of commonly changing business rules is hard-coded in the source code.
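As a hedged sketch of that starting point, imagine shipping rules buried directly in application code (the rules, names, and numbers here are hypothetical):

```python
def shipping_cost(order_total, country):
    # Free shipping over 50 -- a "temporary" promotion that never ended.
    if order_total >= 50:
        return 0
    # Flat domestic rate.
    if country == "GB":
        return 4
    # Everyone else pays the international rate.
    return 12

print(shipping_cost(60, "GB"))  # -> 0
print(shipping_cost(20, "US"))  # -> 12
```

Every tweak to a threshold or a rate means a code change, a review, and a release, which is exactly what makes the next step of the cycle tempting.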

Developers are too busy, or their processes too slow, to respond promptly to requests to change the business rules.

It is decided that the rules should be exposed through a user interface, so the configuration can be customised by a non-developer who directly understands the business.

Great effort is made to create an interface to support customisation.
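What that effort typically produces is the same logic re-expressed as data that a UI can edit. A minimal sketch, under the assumption of a first-match-wins rule list (all names here are hypothetical):

```python
# The rules from before, externalised as data for a hypothetical rules UI.
RULES = [
    {"when": {"min_total": 50}, "cost": 0},   # free shipping over 50
    {"when": {"country": "GB"}, "cost": 4},   # domestic flat rate
    {"when": {},                "cost": 12},  # fallback: international rate
]

def matches(condition, order):
    # An order matches a rule only if it satisfies every listed condition.
    if "min_total" in condition and order["total"] < condition["min_total"]:
        return False
    if "country" in condition and order["country"] != condition["country"]:
        return False
    return True

def shipping_cost(order):
    # First matching rule wins, so the ORDER of the list is itself a rule --
    # one of the subtle interactions the prose below comes back to.
    for rule in RULES:
        if matches(rule["when"], order):
            return rule["cost"]
```

Note that even this toy interpreter smuggles in semantics (first-match-wins, conjunctive conditions) that the non-developer editing the list must understand.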

Optionally: Once again, it is realised that Boolean logic is clumsy in a user interface that isn't plain text.

Optionally: Some potential customisations are missed, so the interface needs to be redeveloped, or the unsupported rules stay hard-coded in the source and are treated as special cases.

It turns out that, although the new interface enables the editing of the rules, the editing isn't the hard part. Reasoning about how all the different rules interact – that is the hard part. Considering every situation, not just the main ones – that is the hard part. Testing your understanding to make sure it is correctly implemented – that is the hard part. These are the special skills that the developers bring.

The non-developer gives up in disgust, generally blaming the user interface as too complex, not realising that the problem domain itself is more complex than they appreciated.

The task reverts to the busy developers, who continue to laboriously work through the consequences of every change, but now they have to edit the rules in a custom language/interface, without comments, without source control, without a testing framework, without IDE support, without a coding standard, …
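For contrast, here is a hedged sketch of what keeping the rules in ordinary source code preserves: comments explaining intent, and unit tests pinning down the rule interactions the paragraph above calls the hard part. The rule and names are hypothetical.

```python
def discount(customer_years, order_total):
    # Loyalty discount: 5% after 2 years, but never on orders under 10,
    # because small orders are already near cost. This comment -- and the
    # tests below -- are exactly what a custom rules interface cannot hold.
    if customer_years >= 2 and order_total >= 10:
        return round(order_total * 0.05, 2)
    return 0.0

# Tests make the rule interactions explicit and repeatable.
assert discount(3, 100) == 5.0   # loyal customer, normal order
assert discount(3, 5) == 0.0     # loyal customer, tiny order: no discount
assert discount(1, 100) == 0.0   # new customer: no discount
```

None of this requires special tooling; it falls out of treating the rules as code kept under version control.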