As a follow-up to the last example, here is some Java code that uses the ruleengine with Kafka Streams.

What I am doing here is streaming the data from an existing Kafka topic "topic1", running it through the ruleengine, and then outputting the results to "topic1_filtered". The keys in the Kafka topic are strings and the values are in Avro binary format.
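The actual program uses the Kafka Streams API and needs a running Kafka cluster, so here is only the per-record shape of that pipeline, sketched with plain Java collections. The "age" rule and the field name are made up for illustration; in the real setup the rules come from the project zip file, not from code.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class FilterSketch {

    // Hypothetical stand-in for running a record through the ruleengine.
    // Here a record passes if a (made-up) "age" field is 26 or above.
    static boolean passesRules(Map<String, Object> value) {
        return (Integer) value.getOrDefault("age", 0) >= 26;
    }

    // Same shape as the Kafka Streams topology: take the records from
    // "topic1", keep only those that pass the rules, and the result is
    // what would be written to "topic1_filtered".
    static List<Map<String, Object>> filterRecords(List<Map<String, Object>> topic1Records) {
        return topic1Records.stream()
                .filter(FilterSketch::passesRules)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Map<String, Object>> records = new ArrayList<>();
        Map<String, Object> ok = new HashMap<>();
        ok.put("age", 30);
        Map<String, Object> tooYoung = new HashMap<>();
        tooYoung.put("age", 20);
        records.add(ok);
        records.add(tooYoung);
        System.out.println(filterRecords(records).size()); // prints 1
    }
}
```

In the real topology the filter predicate would call the engine instead of the hardcoded check, which is exactly the point: the predicate in the code stays the same while the rules change in the project file.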

The good thing about this is that the rules for filtering live in the external project zip file which the ruleengine uses. So if you change or extend this logic, you do not have to touch this code. Imagine you have many applications that stream data: not having the logic in the code is a big advantage, because you don't have to search for the logic in the code and updates are done more easily. The logic is maintained and kept in a central location outside your code.

You will need two files to run this code: a project file from the Business Rules Maintenance Tool, which is the web application you use to write the business rules and construct complex business logic the easy way, and an Avro file.

The program will read the schema of the records in the Avro file and then loop over all records, executing the ruleengine against each one. At the end of the code there is a commented block. If you want to loop over the results of the ruleengine, which show what exactly happened when the rules were executed, then uncomment that block.

On Github at http://github.com/uwegeercken the source code and the documentation are also available. You will also find documentation for the API with many useful functions you can use.

As always: if you have the business logic (rules) outside your code, it will make life much easier. You don't have to hardcode the logic in your program. This makes maintenance of the code easier, and the code itself easier to overview.

The business logic (knowledge) is in a central place and not spread over many different systems and programs.

You know what is really cool: you can plug the business ruleengine into any Java based application or tool, including web sites.

I have posted articles here on how to use the ruleengine from Java code, but also with Apache Hadoop MapReduce, Pentaho Data Integration (ETL) and Apache Nifi, a dataflow management system.

The advantage is always the same: you don't hardcode business logic in your code or process. That keeps your code cleaner and thus more agile. Agile, clean code directly translates into more quality and satisfaction for both you and your customers, and it also has a positive impact on time and cost.

Finally I found some time to also write about integrating the ruleengine into an Apache Kafka consumer. The use case here is to check and validate your data, coming from a Kafka topic, before you process it further. But the ruleengine can not only validate the data; it can also apply actions to modify the data.

I have used the "Rent a Ferrari" scenario a couple of times in the past. Somebody wants to rent a Ferrari, but there are some business rules to follow before the applicant can drive this shiny car. The rules are:

Driver must have a valid drivers license

Driver must be of age 26 or above

If driver is less than 26 years old, he/she has to pay an extra fee

Here is how the business logic looks in the Business Rules Maintenance Tool, a web application to set up and orchestrate complex business logic and actions:

At the bottom of the screenshot you can see that there are two actions. One is fired when the business logic fails and the other one when it passes. The action updates a field "status" to contain the value "passed" or "failed".
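To make the rules and actions concrete, here is a plain-Java rendering of the same logic. This is only an illustration: in the real setup the rules and actions live in the ruleengine project file, not in code, and the map field names here are my own assumptions.

```java
import java.util.HashMap;
import java.util.Map;

public class FerrariRuleCheck {

    // Hypothetical plain-Java version of the three rental rules and the
    // two actions. In the real setup this logic is maintained in the
    // Business Rules Maintenance Tool, not hardcoded like this.
    static String evaluate(Map<String, Object> applicant) {
        boolean hasLicense = Boolean.TRUE.equals(applicant.get("license"));
        int age = (Integer) applicant.get("age");
        boolean paysExtraFee = Boolean.TRUE.equals(applicant.get("extra_fee"));

        // Rule 1: driver must have a valid drivers license.
        // Rule 2: driver must be 26 or above ...
        // Rule 3: ... or, if younger, must pay the extra fee.
        boolean passed = hasLicense && (age >= 26 || paysExtraFee);

        // The two actions: update the "status" field to "passed" or "failed".
        applicant.put("status", passed ? "passed" : "failed");
        return (String) applicant.get("status");
    }

    public static void main(String[] args) {
        Map<String, Object> applicant = new HashMap<>();
        applicant.put("license", true);
        applicant.put("age", 23);
        applicant.put("extra_fee", false);
        // 23 years old, has a license, not willing to pay the fee
        System.out.println(evaluate(applicant)); // prints failed
    }
}
```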

I went ahead and set up a Kafka connector to read data from a MySQL database. The relevant table is labeled "ferrari applications". The connector picks up changes or additions to records based on the (autoincrement) id or based on the modified column, which is a timestamp.

One could imagine that the data for this table is collected in a web application of the car rental company. (Yes of course, the data could also go directly to a Kafka Topic from the application.)

This is what the table looks like:

The table carries the name and age of the applicant, whether he/she has a drivers license, and whether he/she is willing to pay an extra fee to get the car if the age is lower than 26. The "status" column at the end is undefined.

As data arrives or is modified in the table the Kafka connector will pick it up and send it to a Kafka Topic "test_jdbc_ferrari_applications". The data arrives in Kafka as JSON like this:
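A record might look roughly like this. The field names are assumptions based on the table described above, and depending on the connector's converter settings, Kafka Connect may additionally wrap the payload in a schema envelope:

```json
{
  "id": 1,
  "name": "John",
  "age": 23,
  "license": true,
  "extra_fee": false,
  "status": null
}
```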

The last part of the puzzle is a Java class that connects to Kafka and subscribes to the topic. When a record arrives, it runs through the ruleengine: the business rules are run against the data. Based on whether the business logic passed or failed, the relevant action, as described further above, is fired and the record is updated.

// loop over the records
for (ConsumerRecord<String, String> record : records) {
    // create a collection of fields from the incoming record
    RowFieldCollection collection = getRowFieldCollection(record.value());

    // as we are in a loop, we remove the results of the ruleengine before we run it again
    bre.getRuleExecutionCollection().clear();

    // throw exception but continue in case of problems
    // could be changes in the kafka record (missing fields) or missing data
    try {
        // run the ruleengine using the record key and the collection of fields
        // note that the business logic might contain actions that modify the data
        bre.run(record.key(), collection);
        ....
        ....
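RowFieldCollection comes from the ruleengine library, and the job of the getRowFieldCollection() helper is to turn the JSON message value into named fields. As a self-contained illustration of that step only, here is a minimal stand-in using nothing but the JDK. A real implementation would use a proper JSON parser and the library's RowFieldCollection class; the regex and the method name here are my own.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FieldExtractor {

    // Minimal stand-in for getRowFieldCollection(): pull the scalar
    // fields out of a flat JSON object into a map of name -> value.
    // Illustration only; use a real JSON parser in production code.
    static Map<String, String> toFields(String json) {
        Map<String, String> fields = new HashMap<>();
        Pattern p = Pattern.compile("\"(\\w+)\"\\s*:\\s*(\"([^\"]*)\"|[^,}\\s]+)");
        Matcher m = p.matcher(json);
        while (m.find()) {
            // group 3 is set for quoted string values, group 2 for bare values
            fields.put(m.group(1), m.group(3) != null ? m.group(3) : m.group(2));
        }
        return fields;
    }

    public static void main(String[] args) {
        String value = "{\"name\": \"John\", \"age\": 23, \"license\": true}";
        Map<String, String> fields = toFields(value);
        System.out.println(fields.get("age")); // prints 23
    }
}
```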

Once running, the Java class waits for incoming records from Kafka and processes them. For this simple example, the validated and updated record is then output.

The data corresponds to the data captured in the MySQL table, sent to Kafka using Kafka Connect and then picked up by the Java consumer class. The last column, though, contains the results of the validation done with the business rules and the resulting action that was fired: the "status" column was updated to "failed" or "passed".

That's it. You could of course do further processing and e.g. store the data in another Kafka topic. A (micro-)service could pick it up from there and react to the results of the validation. Or you could make the results available for analytical purposes.

This is a simple example with simple business logic. But at this point you can modify the business logic and add additional rules and actions and you don't have to change the process or code. This could be done by a business expert reacting to changing conditions or contracts.

As you can see, the ruleengine plugs in everywhere. You not only get rid of the business logic in your code, but you also have a central place for it. Again this is a plus for quality, as the maintenance of the logic happens in one central place. Defining their logic in a central, user-oriented tool is also more transparent for the business users. And lastly, when your colleagues leave the company, you no longer have to look for (ever-changing) business logic in their code and processes, spread all over the place.