Hmm, sounds like a developer/coder approach to recognising individual service actions that work together to produce an outcome? Found this https://www.bing.com/videos/search?q=microservices+tutorial&view=detail&mid=14EF572197376FF7378D14EF572197376FF7378D&FORM=VIRE Sounds like what we did: recognising generic actions that reflect the real world of work, putting users (including IoTs) in as drivers. As such it removes the need for custom coding of the individual actions. It seems to be a term that sits in techie land, making it sound more complicated to business than it is?

If my thoughts are right, then yes, it is the start of the coders thinking the right way about business processes/BPM. BUT the objective is to create microservices which allow business users, not coders, to build custom and adaptive requirements that create the desired outcomes.

Agree that businesses are not, and don't want to be, coders.....? Not sure where you got that from, but direct input by users, in business language, to build exactly what is required is the future. The actual build calls for business analyst skills and good interpersonal communication...

Agree when it comes to the original build, but where change is anticipated you can build in the capability for users to implement a change quickly. That's part of encouraging users not to fear change, and showing that their knowledge matters....

Business people don't code, but they do have to express their work in a logical way, which is a challenge for many people. At one of my clients, business people can't even explain how they handle some particular cases, although they manage them perfectly. Just another challenge for business analysts.

+1 @Alexander "not coding but codifying". I see the mania for teaching kids coding as mostly focused on actual coding -- which, while nice as far as it goes, is not the most important thing required for software. That important thing is captured in the phrase "codifying". Being really good at data modeling and database normalization is just one example of an important form of codifying.

Hi Stefan. I can't speak for Alexander, but I'm not sure how much we disagree. My point was more about the "code rules" meme we see so much of today -- it's unfair of me to over-interpret your brief comment. However, I am not much of a fan of classifying modeling as a form of coding, or business analysts as developers. That's like saying that human life is just chemistry, because we are all just molecules. My argument is an argument for emergent properties. The business analyst who is modeling is doing something that is not helpfully described as coding, but is rather performing the "work of business semantics wrangling" where business ideas are native to, or first-class citizens of, the technology. (I hope I have not misrepresented your POV.)

Not much maybe John. I see BPMN as a graphic DSL. Domain Specific *Language*. What you do with this language is program, nothing else. You just don't do it with letters in some programming language like java or cobol. The graphic symbols are, like the letters, carrying meaning, they are referring to something else. Graphic symbols as well as programs in the usual programming languages can be interpreted and executed. There is nothing filthy with programming, be it in letters or in graphic symbols. There is nothing wrong or lower level with being a technician. You can be doing both simultaneously, business modeling and programming. This explanation is more than obvious of course, but I see this looking down on "devs" everywhere.

Stefan, you are touching a debate which is close to my heart. I wish to separate however the question of "working with symbols" from the question of "productivity" and "cost/benefit". At some point "quantity turns into quality" and the emergent properties of high-level software development tools make it possible for business analysts to be successful even if they aren't programmers. It's much more difficult to do this than many people imagined -- even if RAD has always been a nice sales pitch. I recall a time when there would be ads for "programmer/analysts" -- which meant 95% programming and 5% analysis. And really it was only the analysis that the company cared about in terms of cost/benefit. So any programmer that can figure out how to increase their productivity, i.e. the analysis part of their workday, will be that much more successful. And that probably means more first-class business semantics and less traditional code. What isn't acceptable though is any lack of respect for the hard work of coding or symbol wrangling -- at any level of abstraction.

We agree more or less. For any high-level language to exist, however, low-level programming remains a requirement (high-level languages are themselves written in e.g. C or JavaScript), and this "low-level work" is worth no less, since you depend on it. A business developer/modeler cannot escape also being a (lower-level) developer, as far as I know. E.g. in a form editor you still need to handle some low-level stuff in, say, JavaScript.
Forrester:
"Myth No. 1: Low-code platforms are for citizen developers, not pro developers. Some vendors — for example, Caspio, Intuit QuickBase, ManyWho, Snappii, TrackVia, and Zoho — seek unique market positions by targeting business experts capable of delivering apps relying on spreadsheets, desktop databases, and similar tools. We rarely found these citizen developers in our research.4 More typically, developers used low-code platforms to create tools for citizen developers to deliver apps.
Myth No. 2: Low-code platforms eliminate the need for any programming. Generally, developers can use low-code platforms to create self-contained, relatively simple applications without having to write any code.5 But typical business applications require some programming and scripting for: 1) integrating with other applications and databases; 2) writing custom algorithms; and 3) accommodating technologies not incorporated in the low-code platform (e.g., native mobile application development). For these use cases, teams will need to use programming extensions within the low-code environment or rely on external programming languages and scripts."
(http://docplayer.net/28406044-Vendor-landscape-the-fractured-fertile-terrain-of-low-code-application-platforms.html)

@Stefan, I think we actually agree that business people must define by themselves how their business works. Unfortunately, it is rather unusual for them to follow your "Business people that model are developers." Of course, they are "coding", but in a different way in comparison with typical developers. Business people provide the WHY: why a particular business process is as it is presented on a BPMN diagram, why the activity ABC must come before the activity XYZ. We forget this "why" and draw a simple BPMN diagram "ABC" -> "XYZ". It would be nice to capture those WHYs and generate a BPMN diagram "automagically".

One of the languages I am using is called OpenEdge "Advanced Business Language". Handy marketing, eh? It's made for business devs, I mean people! I would not recommend it, btw. ;-)
I'm sorry, I do not see any benefit in this metaphysical opposition of business languages vs programmer languages. On the contrary, as I explained. Of course I do see benefit in orchestration of microservices with BPMN etc. I've been developing business apps for more than 20 years, btw. Not operating systems or the like. Btw: "generate a BPMN diagram automagically"? Speak up! Voice recognition? AI? Machine learning? Wow, business running on its own, like, what was it called, "automatic trading"? Just start your PC and the money is "automagically" transferred to your account! This is the God app, highest level possible!

Microservice is just a technology infrastructure paradigm. It promises a new kind of scalability by avoiding aggregation of everything into a single place. That is a good idea in general and follows the way that the internet succeeded in scaling.

BPM, which is a human practice, should work pretty much the same whether you are using a mainframe, a three-tier architecture, or a cloud of microservices.

WfMC did some advanced work on the kind of standards that would need to be in place to have a collection of small workflow / bpm servers working together as microservices. There are three key protocols necessary, and these are best represented in the following image:

Think of the middle column as a set of microservices and you see how you scale by adding more servers, instead of by making a single server larger. See the entire presentation at:

Microservices are key to isolating technical issues from purely business issues. As shown by Volker Stiehl, BPM can work so much better (even putting to rest the round-trip problem) when business people can "just do business processes". With such a "business logic segregation pattern" in place, plus executable BPMN, domain-knowledgeable business analysts will be able to help business realize the promise of BPM. Microservices are an essential part of this development. (It's only taken 20 years to get here.) [Note: Let's not even consider the idea that microservices replace BPM; they are complementary, and BPM provides essential and irreducible business orchestration logic and workflow pattern capability.]

The challenge is process governance and getting funding for an on-going microservices programme -- which is similar to the same challenge for master data management. Part of the solution will be realized via progressive disaggregation of ERP systems on one hand, and on the other delivery of a new generation of cloud-based, API-enabled applications. In both cases it's reasonable to expect implementation as well-documented microservices. As for governance, we are also seeing the rise of domain ontologies which provide the common domain-specific templates for microservice construction.

Yeah, but if you've read Stiehl (and I have; don't get me wrong, love the concept) it's very hard to do in real life. Most don't have the tenacity or the long view to do what he proposes, mostly because of your second paragraph on governance and funding.

RE "It's only taken 20 years to get here." Not correct; somebody had this about 17 years ago.

Actually, you're perfectly right about "Microservices are key to isolating technical issues from purely business issues." If process modeling is done correctly, then all necessary microservices are defined and their implementation is very straightforward.

Thanks for the feedback @Patrick and @Alexander. As for "20 years to get here" -- that timeframe referred not only to microservices, but to that in combination with executable BPMN and the business logic segregation pattern too. As Patrick says, few organizations have the tenacity or the long view. There have been many frustrations with the promise of BPM, no matter the commitment or enthusiasm or vision of business leadership. However, today BPM technology is better than before, significantly better. Opportunity awaits. Time to really sell BPM technology again, directly or indirectly.

+1 to John for mentioning process-driven apps.
@Patrick: it's extremely tough to do, but we're doing it. It's just a scaffolding right now, but this is the direction. We call this "business microservices" (not really a great name), and it embeds the technical concept of microservices and several BPM concepts (like call activity templates, error handling patterns), some business concepts (such as roles, teams, data objects) and some interface paradigms. Because it is really this combination that is relevant to a business user.
And it's not tough to build, but really tough to scale - think CI/CD over multiple custom deployments in different environments, with lots of customer-led forks. But whoever cracks this will get the Holy Grail.

RE "that timeframe referred not only to microservices, but to that in combination with executable BPMN and the business logic segregation pattern too." Yes, this is a very unfortunate industry in which everyone disagrees with the rest.

+1 @Alexander "microbusinesses" -- the term suggests full-on P+L -- which on the face of it sounds ridiculous -- however, one doesn't have to get too futuristic to draw a connection to the so-called gig economy, where progressively more finely grained services are in market. Very quickly we run into the economics of transaction costs -- but this is out-of-scope for this discussion.

More and more, customers will want functionality that is not in ACM/BPM "packages".

They will want seamless automated connectivity between packages and external "services", the latter we can generally describe as 'engines' that accept a message/request and yield back a message/response.

The location of the engines (local to the package, more likely at a distance) can vary.

How a package links to engines can vary:
a) the package directly messages an engine and waits for a response;
b) the package appends messages to a generic data exchanger and waits for, or later "pulls", responses;
c) the engines effect a routine "push" to a generic data exchanger at some reasonable frequency for standing messages (e.g. daily 1800 hrs weather report).

Packages will want to make requests using their native data element naming conventions, services will want to respond using their own native data element naming conventions.

Formatting for transport and pickup
--------------------------------------------------
For bulk data, time is not super important; transport is not going to consolidate to standard data element names in a standard format; so the logical/practical solution is:

a) formatting of outgoing data to the data exchanger becomes the responsibility of the package

N.B. A generic data exchanger reduces the number of required outgoing formats to ONE, allows the package to post all data to be shared using the package's native data element naming conventions, and allows pickup by engines on a need-to-know basis via a "subscription" matrix

b) formatting of "pull" pickup data at the data exchanger is the responsibility of each engine.
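The data-exchanger scheme described above can be sketched in a few lines. This is a hypothetical illustration only (all class and topic names are invented): packages post messages under their own native element names, and each engine pulls only the topics listed for it in a "subscription" matrix.

```python
# Minimal sketch of a generic data exchanger with a subscription matrix.
# Packages post in their native naming conventions; engines pull on a
# need-to-know basis. Names and payloads are invented for illustration.
from collections import defaultdict


class DataExchanger:
    def __init__(self):
        self._queues = defaultdict(list)        # topic -> pending messages
        self._subscriptions = defaultdict(set)  # engine -> subscribed topics

    def subscribe(self, engine: str, topic: str) -> None:
        """One entry in the 'subscription' matrix: engine wants this topic."""
        self._subscriptions[engine].add(topic)

    def post(self, topic: str, message: dict) -> None:
        """A package appends a message using its native data element names."""
        self._queues[topic].append(message)

    def pull(self, engine: str) -> list:
        """An engine picks up pending messages for its subscribed topics."""
        picked = []
        for topic in self._subscriptions[engine]:
            picked.extend(self._queues[topic])
            self._queues[topic] = []
        return picked


exchanger = DataExchanger()
exchanger.subscribe("weather_engine", "wx.request")
exchanger.post("wx.request", {"site_code": "YUL", "at": "1800"})
print(exchanger.pull("weather_engine"))  # the engine reformats as it sees fit
```

Note how this keeps the number of outgoing formats at ONE per package, as described above: the package never learns each engine's naming conventions, and each engine translates on pickup.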

Microservices address some of the issues of adding more functionality by executing that functionality OUTSIDE the monolithic BPM-suite or CASE-suite tool. Current practice is to embed user-defined code within the BPM-suite tool; thus many BPM-suite tools are becoming "programmable monoliths". This is just wrong. User-defined code must be executed as a microservice, in a separate computing process.
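The point above can be illustrated with a minimal sketch: instead of embedding user-defined code inside the BPM engine's own process, run it in a separate OS process and exchange data over a plain interface (here stdin/stdout JSON). The user script and its payload fields are invented for illustration; a real deployment would put the code behind its own service endpoint.

```python
# Sketch: execute a user-defined step out-of-process, like a microservice,
# rather than embedding it in the BPM suite. Field names are invented.
import json
import subprocess
import sys

# The "user-defined code" -- owned by the customer, not by the BPM platform.
USER_CODE = """
import json, sys
payload = json.load(sys.stdin)
payload["discount"] = 0.1 if payload["total"] > 100 else 0.0
json.dump(payload, sys.stdout)
"""


def run_as_service(payload: dict) -> dict:
    """Run the user step in a separate computing process and return its reply."""
    proc = subprocess.run(
        [sys.executable, "-c", USER_CODE],
        input=json.dumps(payload),
        capture_output=True,
        text=True,
        check=True,
    )
    return json.loads(proc.stdout)


print(run_as_service({"total": 150}))
```

A crash or hang in the user code is then isolated from the engine, which is the practical argument against the "programmable monolith".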

@Alexander
In our operations, a new formal release comprising a 1.5 MM lines-of-code executable, release notes, an updated help file, plus the occasional utility, plus mastering/deposit of an archive copy of the source with an agent, costs about $5,000. We are naturally averse to doing this more than a few times a year.

We don't allow apps to poke data to our db structures nor do we allow apps to read our db structures. Both bypass complex read/write security - the security goes down to the individual data element name or, if the data element is part of a cluster, the cluster itself.

Data imports to our data exchanger are read by an import engine that applies pre-conditions to such data before any processing is engaged (e.g. an incoming message referencing a part number involves a lookup of the part number in our part number index before the data is imported).

Sorry @Karl, but I agree with @Alex - BPMS as programmable monoliths are simply going extinct - there is no way you could accommodate all possible customer requests, at the right speed and cost, without integrating it to existing software, outside your platform.

The main problem of any application development practice is to find the “right” services/classes/functions/etc. For example, microservices gurus are talking about “business capabilities” and Domain-Driven Design (DDD).

BPM (if done correctly) naturally provides perfect, business-friendly microservices, because any explicit and machine-executable process is already a set of elements with single responsibility (SRP). Just wrap the events, roles, rules, human activities, automation activities and services produced via “correct” process modelling as microservices.

MicroService Architecture (MSA) is an architectural style (not technology stack, sorry Keith) to implement an application as a set of microservices. Any process template and case template is an application.

Thus BPM and MSA are a perfect marriage to reduce the complexity and increase the agility of business systems.

If I were to take several steps back and abstract MS at a very high level in relation to BPM, and compare them to SOA elements for the world of processes, then yes, I definitely would see the relevance and relation. I would then group formlets, scripts and integrations into that category as well.

This is a software architecture question, so I’ll permit myself a technical answer for a change.

I recently gave a developer conference presentation on Process-oriented reactive service architecture (link to title abstract and a video) that discusses this topic. The short version is that business process modelling and execution ideas can usefully inform system architecture, by making an analogy between business service activities and individual microservices.

Another intriguing takeaway is that using a workflow automation platform for orchestration gives you the possibility of including user tasks in a microservice architecture, effectively allowing software architects to treat human case workers as if they were microservices. The jury’s out on whether that’s a good thing.
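The "humans as microservices" idea above can be sketched in a few lines. This is an invented illustration, not any particular platform's API: from the orchestrator's point of view, a human user task exposes the same request/response interface as an automated service; the only real difference is latency.

```python
# Sketch: an orchestrator that calls automated services and human user
# tasks through one common interface. All names are invented.
import queue


class AutomatedService:
    def handle(self, request: dict) -> dict:
        return {**request, "result": "auto-approved"}


class HumanTask:
    """A user task: the request sits in a work inbox until a person acts.
    Here the 'person' completes it immediately so the sketch runs."""

    def __init__(self):
        self.inbox = queue.Queue()

    def handle(self, request: dict) -> dict:
        self.inbox.put(request)       # in reality: wait for a case worker
        pending = self.inbox.get()
        return {**pending, "result": "human-reviewed"}


def orchestrate(task, request: dict) -> dict:
    """The orchestrator neither knows nor cares which kind it is calling."""
    return task.handle(request)


print(orchestrate(AutomatedService(), {"case": 1}))
print(orchestrate(HumanTask(), {"case": 2}))
```

Whether this uniformity is a good thing is, as the comment says, debatable; the sketch only shows that the interface symmetry is trivial to achieve.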

This "Microservices Architecture" is growing on me. It recognises the component (service) nature of the basic and simple fundamentals of business, of which there are fewer than 15 (hence "micro"). This totality of capability creates a Digital Process Platform, which in turn quickly creates the required Adaptive Applications. However, the starting point is understanding what is required and engaging users; here the BPM discipline gets the ball rolling, but only after everyone understands just how it is all translated into actual delivery of the desired process application.

Whilst the fundamentals are simple, the following should be incorporated into the architecture to make it a "platform", and the "game" changes....it does work!
• Process engine to ensure all works to plan
• Rules engine reflecting the real world of compliance
• Calculation engine automating system work
• State engine: real-time feedback from any point
• Workflow: everything connected in the right order
• Audit trail, events, escalations = control
• Rapid prototyping: user involvement in the build
• Time recording supports activity-based costing
• Real-time reporting becomes predictive
• Build mash-ups: one screen, multiple data sources
• Linked intelligent Ajax grids: enter data only once
• Roles and performers: people and machines
• Management hierarchy: see who does what and when; reallocate work
• Call web services wrapped up in a process
• User interface dynamically created linking people, roles, task type and data via forms for specific instances, web or client-server
• TAG library connecting to legacy systems/data, so that build and maintenance of the resulting web page is simplified as the developer only has to work with mark-up and not code
• Content handler and in-memory work capability to ensure high performance
• Pre-built templates for custom documents, letters, e-mails, messages etc., dynamically populated with instance-specific data, with edit capability in the browser
• Simplified table creation in the RDBMS
• Process and task versioning control
• Server-side message queue calling on legacy

A very good question! There indeed seems to be a delta when it comes to technical advances, tendencies and, ultimately, real-life benefits from process optimization and automation. For the extended f...