How do you know when you need a BPM solution?

My customer is looking for a Business Process Management (BPM) solution. What they need is simple document routing and an approval system. What are the drivers for implementing a BPM system? What is the threshold where a developer should suggest implementing a BPM solution vs. a workflow tool or custom development?

When does jBPM fit? When does a state machine built into an app fit? What problems should exist that determine that you need to go with a solution similar to jBPM?

I am looking for some real world examples of "we tried to build the solution ourselves, but ended up going with AquaLogic/jBPM/Lombardi because of _". Please fill in the blank.

Matazzoni answered 2/2, 2011 at 0:57 Comment(3)
I suggest you spell out "BPM".If
Can you provide an example of "BPM solution" and "Workflow tool"? For example, would the jBPM fit into either of these categories: jboss.org/jbpmTindle
That is the problem that I am trying to solve. When does jBPM fit? When does a state machine built into an app fit? What problems should exist that determine that you need to go with a solution similar to jBPM?Matazzoni
26

BPM Acid Test (from Essential Business Process Modeling by Michael Havey, published by O'Reilly).

... BPM is suited only for applications with an essential sense of state or process - that is, applications that are process-oriented. An application passes the BPM acid test if it is legitimately process-oriented. The travel agency application, for example, passes the test because it is best understood in terms of state of the itinerary and is defined at all times by how far the itinerary has gotten. Other typical characteristics of a process-oriented application include the following:

  • Long-running -

From start to finish, the process spans hours, days, weeks, months, or more.

  • Persisted state -

Because the process is long-lived, its state is persisted to a database so that it outlasts the server hosting it.

  • Bursty, sleeps most of the time -

The process spends most of its time asleep, waiting for the next triggering event to occur, at which point it wakes up and performs a flurry of activities.

  • Orchestration of system or human communications -

The process is responsible for managing and coordinating the communications of various system or human actors.

... For example, in an automated teller machine, which lets users query their account balance, withdraw cash, deposit checks and cash, and pay bills - any sense of process is fleeting and inessential; an ATM is an online transaction processor, not a process-oriented application.
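
To make "long-running, persisted state, bursty" concrete, here is a rough, purely illustrative Java sketch built around the book's travel-itinerary example. The class, enum, and event names are invented for this answer, not taken from any engine: the instance does a small burst of work per event and otherwise lives, serialized, in a database.

    import java.io.Serializable;

    // Hypothetical sketch only: a process instance that sleeps in storage and is
    // woken by external events (an airline callback, a customer email, a timer).
    class ItineraryProcess implements Serializable {
        enum Step { QUOTE_REQUESTED, FLIGHTS_BOOKED, HOTEL_BOOKED, CONFIRMED }

        Step step = Step.QUOTE_REQUESTED;

        // Each call is one "burst": advance the state, then the caller persists
        // this object again so it outlives any single server.
        void on(String event) {
            if (step == Step.QUOTE_REQUESTED && event.equals("flightsBooked")) {
                step = Step.FLIGHTS_BOOKED;
            } else if (step == Step.FLIGHTS_BOOKED && event.equals("hotelBooked")) {
                step = Step.HOTEL_BOOKED;
            } else if (step == Step.HOTEL_BOOKED && event.equals("customerConfirmed")) {
                step = Step.CONFIRMED;
            }
            // Unexpected events are ignored; a real engine would also handle
            // timers, escalation and auditing here.
        }
    }

An ATM, by contrast, completes its whole interaction in one short session, so there is nothing worth persisting between events, which is the point of the acid test.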

Choate answered 8/2, 2011 at 6:11 Comment(5)
Thank you for the description of BPM, but when do you decide to invest the time and effort to use BPM? I can do all that you are talking about with a simple state machine and a good web service API. At what point does that become "not enough"?Matazzoni
I once had a contract where I spent the majority of my time convincing my employer not to do what they were paying me to do. It was a glorious SOA/BPEL project where all the fancy tools were used to retire an FTP schedule. Only now, BPEL was supposed to schedule FTP jobs. It was insane.Choate
Before this thought naturally occurs to you, it's not a good time to introduce BPM.Choate
+1. @Matazzoni - the keys are long-running and persistent. What happens to your state machine when the web server is restarted? We're talking days, not minutes. As @Yuriy mentions in a comment, one has to have a cake, too. BPM can help move "should I be at this step" business logic steps out of application code. It can also be overkill (as Yuriy also mentions).Evocation
@Yuriy Excellent example and brings insight into when to make the right decision. Other war stories are still welcome.Matazzoni
23

I wrote a workflow engine modeled after jBPM, because my employer wanted to own the IP. The reason you use such a tool, instead of creating your own finite state machine, is that it accommodates changes without altering persistence and supports edge cases of workflow processes, as I'll explain.

Accommodating Changes Without Altering Persistence

Your typical, or perhaps better to call it "naive", finite state machine implementation features a set of database tables tightly coupled to the data being managed and the process it flows through. There might be a way to keep past versions and track who took what action during the process as well. Where this runs into problems is with changes to the data and process structure: those tightly coupled tables then need to be altered to reflect the new structure, and they may not be backward compatible with the old.
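
As a point of reference, here is a minimal, purely hypothetical Java sketch of that "naive" shape (the table and column names are invented): the state is a column tied to one specific document type and one specific process, which is exactly what makes later structural changes painful.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import javax.sql.DataSource;

    // Hypothetical "naive" state machine: the process is baked into an enum
    // and a column on the document's own table.
    public class PressReleaseDao {
        public enum State { DRAFT, IN_REVIEW, APPROVED, PUBLISHED }

        private final DataSource dataSource;

        public PressReleaseDao(DataSource dataSource) { this.dataSource = dataSource; }

        // Adding a LEGAL_REVIEW step later means changing the enum, altering the
        // table and its constraints, and migrating every row already in flight.
        public void moveTo(long id, State next) throws SQLException {
            try (Connection c = dataSource.getConnection();
                 PreparedStatement ps = c.prepareStatement(
                         "UPDATE press_release SET state = ? WHERE id = ?")) {
                ps.setString(1, next.name());
                ps.setLong(2, id);
                ps.executeUpdate();
            }
        }
    }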

A workflow engine overcomes this challenge in two ways: by using serialization to represent the data and the process, and by abstracting integration points, in particular security. The serialization aspect means data and process move together through the system. This allows data instances of the same type to follow completely different processes, to the point that a process can be altered at runtime, by adding a new state for instance. And none of this requires changing the underlying storage.
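
A rough sketch of that idea, with invented class names rather than any real engine's model: the definition and the instance travel together as serialized objects, so adding a state is a data change, not a schema change.

    import java.io.Serializable;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of the serialization approach; not jBPM's actual model.
    class ProcessDefinition implements Serializable {
        List<String> states;                 // e.g. ["Draft", "Review", "Approved"]
        Map<String, String> transitions;     // action -> next state
    }

    class ProcessInstance implements Serializable {
        ProcessDefinition definition;        // each instance carries its own process
        String currentState;
        Map<String, Object> variables;       // the business data moving through it
    }

    // Storage stays a single generic table, e.g.:
    //   CREATE TABLE process_instance (id BIGINT PRIMARY KEY, body BLOB)
    // Inserting a "Legal Review" state only changes the serialized definition of
    // new (or migrated) instances; no ALTER TABLE is needed.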

Integration points are the means of injecting algorithms into the process and of tying into authentication stores (i.e. the users who must take action). Injected algorithms might include determining whether or not a state is complete; an example of an authentication store is LDAP.

Now, the tradeoff is that search becomes difficult. For instance, because the data is serialized, it's usually not possible to query historical information other than by retrieving the records, deserializing them, and analyzing them in code.
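
For example, a report as simple as "how many instances are sitting in Review?" ends up looking like the sketch below (reusing the hypothetical types above), because there is no column to put in a WHERE clause.

    import java.io.ByteArrayInputStream;
    import java.io.ObjectInputStream;
    import java.util.List;

    // Hypothetical reporting code: fetch every serialized instance, deserialize,
    // filter in memory. Fine for small volumes, painful for analytics.
    class StuckInstanceReport {
        static long countIn(String state, List<byte[]> serializedInstances) throws Exception {
            long count = 0;
            for (byte[] bytes : serializedInstances) {
                try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
                    ProcessInstance instance = (ProcessInstance) in.readObject();
                    if (state.equals(instance.currentState)) {
                        count++;
                    }
                }
            }
            return count;
        }
    }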

Edge Cases

The other aspect of a workflow tool is the experience embedded in its design and functionality: things you will likely not consider when rolling your own and may live to regret omitting when you do need them. The two cases that come to my mind are timed tasks and parallel execution paths.

Timed tasks assign an actor responsibility for data after a certain duration has passed. For instance, say a press release is written and submitted for approval, and then sits for a week without review. What you probably want your system to do is identify that lingering document and draw the attention of the appropriate parties.
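
If you roll your own, that escalation typically becomes an extra moving part like the sketch below (the repository and notifier interfaces are invented stand-ins); a workflow engine gives you the equivalent as a timer attached to the task, including surviving server restarts.

    import java.time.Duration;
    import java.time.Instant;
    import java.util.List;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Hypothetical home-grown escalation sweep. Note it is in-memory: if the
    // server restarts, nothing fires until the scheduler is set up again.
    class EscalationSweep {
        interface PendingApprovals { List<Long> submittedBefore(Instant cutoff); }
        interface Notifier { void nudgeReviewers(long documentId); }

        static void schedule(PendingApprovals approvals, Notifier notifier) {
            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            scheduler.scheduleAtFixedRate(() -> {
                Instant cutoff = Instant.now().minus(Duration.ofDays(7));
                for (long documentId : approvals.submittedBefore(cutoff)) {
                    notifier.nudgeReviewers(documentId);
                }
            }, 0, 1, TimeUnit.HOURS);
        }
    }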

Parallel execution paths are uncommon in my experience (Content Management Systems), but the situation still arises often enough. It's the idea that a given piece of data is sent down two different paths of review or processing, only to be recombined at some later point. This type of problem requires useful merging algorithms and the ability to represent the data in multiple branches simultaneously. Weaving that into a homespun solution after the fact is much trickier than it may seem, especially if you want to keep track of historical data.
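
A bare-bones, invented sketch of just the join side hints at the bookkeeping involved: every fork needs a counter and a merge step, and every piece of history now belongs to a branch rather than to "the" document.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Hypothetical join for parallel review branches; not an engine API.
    class ParallelJoin {
        private final Map<String, Integer> remainingBranches = new ConcurrentHashMap<>();
        private final Map<String, String> mergedComments = new ConcurrentHashMap<>();

        void fork(String documentId, int branchCount) {
            remainingBranches.put(documentId, branchCount);
            mergedComments.put(documentId, "");
        }

        // Called when one branch (e.g. legal review, editorial review) finishes.
        // Returns true only when the last branch arrives and the join completes.
        synchronized boolean branchFinished(String documentId, String comments) {
            mergedComments.merge(documentId, comments + "\n", String::concat);
            return remainingBranches.merge(documentId, -1, Integer::sum) == 0;
        }

        String mergedResult(String documentId) {
            return mergedComments.get(documentId);
        }
    }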

Conclusion

If your system is not likely to change, rolling your own may be the easier solution, particularly if it's acceptable for changes to break old information. But if you suspect you need that kind of durability, or will run into some of these uncommon but thorny scenarios, a workflow tool provides a lot more flexibility and insurance that you won't paint yourself into a corner as the data and business processes change.

Tindle answered 10/2, 2011 at 4:37 Comment(0)
6

Maybe asking a few questions could help.

Will the processes change? Will an older version of a process live on while newer versions of the process come into existence? Should the running time of processes (and of each step) be measured?

Is it about business processes (orchestrating the state of multiple resources) or resource lifecycles (only the state of a single document/resource)? ...

Sorry if it's not much of an answer.

Legend answered 2/2, 2011 at 1:41 Comment(1)
Have my upvote for those excellent and crucial requirement analysis questions. Please add more when you think of them.Cryptoclastic
3

I'd take a closer look at the business need that drives your effort (i.e. the "business case"). In my understanding, a BPM/workflow effort might have one or more of the following goals:

1. Automate actions

This is usually required to replace humans with machines through the automation of tasks such as creating documents, archiving information, notifying users, etc.

2. Track each process

Companies need to establish tracking when there is a significant number of processes and business users lose track of them, as they usually run them in office documents and emails. Every external request for status (e.g. from a client) turns into an investigation.

3. Establish control

For managers it's usually important to gain a high-level view of the process and study it statistically: see whether KPIs are being met, where the lags are, which exceptions occur, etc.

4. Manage in-process document exchange and collaboration

BPMs often serve as a document exchange tool, as they enable a switch from email and verbal communication to a traceable exchange within the BPM.

5. Automate data exchange between enterprise systems

This is a pure integration case and usually arises when a number of actions are already performed with (or by) various systems, and there is a need to automate the information exchange among them.


Now, full-featured, ready-to-use BPMs are good for (2), (3) and sometimes (4). jBPM and other workflow engines are good for (1) and (3), but with an important caveat: they require complex configuration/development.

SOA-based process orchestration engines (sometimes called BPM too!) are good for (5) and (3).

Please feel free to add to the list and argue! I've posted this as my blog post and elaborated a bit more here: http://processmate.net/do-you-need-a-bpm-or-a-workflow/

Tuppeny answered 8/4, 2013 at 19:24 Comment(0)
1

Ultimately, all business systems that deal with the processing of business-related information are BPM or workflow systems. Any business's information processing can be described in terms of workflows or "business processes", involving roles and activities.

The fact that these business activities are often described in Java, C# or other programming languages is basically just the result of automating without technology mature enough to prescribe and describe business processes with automated agents.

The emphasis has been on growth, time to market, and so on, and computerisation was carried out without proper thought for long-term maintenance and flexibility. 99% of what is in code now should not be.

In contrast, real time control systems, video games, high performance computing, forecasting, business intelligence and mathematical analysis are all examples of problems that do not lend themselves to graphical workflow description. These are things that should be done by computers and maintained by computer experts.

Business processes should be prescribed, described, and readable by business operations experts. The flexibility gains will become increasingly recognised as the technology that enables this (workflow systems) gets better, and more widely accepted as the world economy de-emphasises "growth".

Suttee answered 9/2, 2011 at 11:50 Comment(0)
1
  • When you want to provide flexibility (easy changes) to the process, even for running processes.
  • When you want the business users to control the process. Those are the domain experts who own and define the processes. By using a BPM solution you allow them to control the process using modelling tools instead of programming languages. This is essential.
  • When you want the system to scale easily. Normally, system admins have the ability to monitor and operate the system; if this should also be possible for business users, then BPM might be a good choice.
Veneer answered 15/11, 2018 at 10:28 Comment(1)
I'm a bit cautious about your arguments. First: from what I've seen, it seems really difficult in most BPM engines to alter the process definition for an already running process. Second: the promise of getting non-tech users (those you called domain experts) to write BPM(N) is in fact often deceptive; you end up with code in the BPM boxes, and it's eventually written by developers. Third: a lot of BPM solutions are built as monolithic BPM engines, poorly scalable compared to some lighter event-based ones.Sidedress
