Simplifying BPM and the zero code hypothesis, themes from #BPMNext

It’s a week now since I got back from BPMNext and I’m still buzzing with the seeds of ideas planted there. First up, I felt motivated to write on the recurrent theme of how to ‘simplify BPM’ and its cousin, ‘Zero-Coding’ applications. The latter remains as controversial as ever and perhaps always will, so I think I’ll state up front my agreement with Scott Francis’ diplomatic conclusion in his piece on ‘the zero code hypothesis’ that:
It looks like those of us writing code will have some work to do for a few more years yet.
But there are two important corollaries to this hypothesis that I think worthy of mention.
‘Zero code’ is often shorthand for “empowering ‘real’ users who are unfamiliar with code”. And that is both a worthy and necessary goal. After all, the days when a genius such as Isaac Newton could reach the pinnacle of several fields in one lifetime are long behind us. In the modern world we are all specialists. So if we believe BPM suites are powerful tools for managing our work, then we need to strive to make them available to experts in all manner of fields. Let’s not give up on ‘zero code’ whilst acknowledging the limitations of our best efforts to date.
Looking back over BPMNext, we saw some examples of how we can already allow these ‘real’ users to solve some of the simpler problems. For example: the first speaker was Brian Reale of Colosa showing how, with 10 minutes and no design environment, he could create effective mobile data capture tools to complement more traditional BPM installs. Not the only use for ProcessMaker, but a sweet spot that makes sense to me. At almost the other end of the conference we saw Keith Swenson show Cognoscenti’s implementation of what he terms a ‘Personal Assistant’, starting to automate simple and repetitive work without getting in the way of the expert user. And others too showed how some part of what we traditionally call BPM can be placed in the hands of non-coders.
Secondly, the best of these higher-level tools allow those of us who are coders to be dramatically more productive. This is why I got into BPM in the first place: rather than forcing users to cobble together a coherent workflow from spreadsheets, post-it notes and highly specialised desktop applications, we can smoothly hand work from one person to the next and optimise whatever makes sense within the clear boundary of a service task on a BPMN diagram.
Not least in this category is BP3’s own Brazos UI framework, providing responsive UIs specifically for interacting with RESTful BPM servers. And whilst it is undoubtedly early days, the camunda-sponsored BPMN.io shows promise as a truly capable and embeddable tool for presenting, annotating and even authoring business processes. Plus @JakobFreund gets kudos for shameless geekery, laugh-out-loud! Two projects I will follow with interest.
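To give a feel for how embeddable it aims to be, here is a rough sketch of dropping a read-only diagram into a page using the bpmn-js viewer. Treat it as illustrative only (it is not taken from either project): the package entry point and the importXML call have varied across bpmn-js versions, and ‘#canvas’ is just whatever empty div you provide.

```typescript
// Illustrative sketch only: embedding the bpmn.io viewer (bpmn-js) in a web page.
// Entry points and the importXML API differ between bpmn-js versions, so check
// the docs for the release you actually use.
import Viewer from 'bpmn-js/lib/Viewer';

const viewer = new Viewer({ container: '#canvas' }); // '#canvas' = any empty div on the page

async function showDiagram(bpmnXml: string): Promise<void> {
  // Render the BPMN 2.0 XML into the container element...
  await viewer.importXML(bpmnXml);
  // ...then scale the diagram to fit the available space.
  viewer.get('canvas').zoom('fit-viewport');
}
```
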
The current trend for specialised web applications that do one thing really well and offer APIs to access their functionality is truly the opportunity for BPM to ‘cross the chasm’ that Paul Harmon spoke of in his keynote at last year’s BPMNext. And one I am busy working on at Syncapt. But that sounds like another post…

Berlin, Business Process – and a Fish Tank

This Wednesday I had the pleasure of visiting Berlin on a gorgeous summer’s day to join the other members of the OMG’s BPMN Model Interchange Working Group. As you might surmise from the name, the main objective for the group is to make BPMN model interchange a reality. And if you’ve had cause to exchange models between tools you will know this is quite a quest, though hopefully a noble one.

It should be said up front that this work is still in its early stages: assembling a set of test cases that exercise visual features, then the conformance classes and ultimately the exchange of executable models. Another piece of work under way is to create tools that automate comparison of different vendors’ models. In time it should be possible for everyone to use these tools to evaluate and self-certify their own products.
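
To make the comparison idea concrete, here is a toy sketch of what an automated check might do. It is emphatically not the working group’s actual tooling, and the element list and function names are my own invention: it simply pulls flow-node ids out of two vendors’ BPMN 2.0 XML exports and reports anything missing from either side.

```typescript
// Toy illustration of automated model comparison, NOT the MIWG's real tooling.
// Extracts id/type pairs for a few common flow-node elements and diffs the two sets.
function flowNodes(bpmnXml: string): Map<string, string> {
  const nodes = new Map<string, string>();
  // Naive scan; a real tool would use a proper XML parser and the BPMN metamodel.
  const pattern = /<(?:\w+:)?(task|userTask|serviceTask|startEvent|endEvent|exclusiveGateway)\b[^>]*\bid="([^"]+)"/g;
  for (const match of bpmnXml.matchAll(pattern)) {
    nodes.set(match[2], match[1]); // id -> element type
  }
  return nodes;
}

function compareModels(vendorA: string, vendorB: string): void {
  const a = flowNodes(vendorA);
  const b = flowNodes(vendorB);
  for (const [id, type] of a) {
    if (!b.has(id)) console.log(`Missing in B: ${type} ${id}`);
  }
  for (const [id, type] of b) {
    if (!a.has(id)) console.log(`Missing in A: ${type} ${id}`);
  }
}
```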

The morning was filled with the glamorous business of poring over specifications, spreadsheets and so on, so in the afternoon we decided we were in need of some light relief! No, it was not an early exit to the bar, but rather a practical test of the current state of the art. We were able to assemble a chain of five modelling tools that each elaborated on a single process model, which was then, with further elaboration, executed.

It should be said that this was not the most complex process known to man, but neither was it a ‘tame’ one defined for the purpose. It is, in fact, a pre-existing process used by one of the participants. Well, judge for yourself…

This executable model was returned to the first modelling tool, where we could see that the proprietary execution extensions were preserved even though that tool had no knowledge of them.

We recorded the session and hope to provide a short video summary shortly.

Also on Wednesday, as it happened, the Activiti team made a new release, and as you can see on Tijs’ blog a key highlight of this release is a dramatically reworked REST interface. Curiosity forced me to see whether I could import the executable model using that new API. Indeed I could, but I think I’ll save the details for a separate post as this one is quite long enough.
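
For the impatient, the gist of the call is roughly as follows. This is only a sketch against a default local activiti-rest install: the URL, port, demo ‘kermit’ credentials and file name are all assumptions to adjust for your own setup, and it leans on Node 18+ for the built-in fetch and FormData.

```typescript
// Rough sketch, not a definitive recipe: POSTing a BPMN 2.0 XML file to the
// Activiti REST deployments resource of a default local activiti-rest webapp.
import { readFile } from 'node:fs/promises';

async function deployProcess(pathToBpmn: string): Promise<void> {
  const xml = await readFile(pathToBpmn, 'utf8');

  const form = new FormData();
  // The upload is treated as a process deployment when the file name ends in .bpmn20.xml
  form.append('file', new Blob([xml], { type: 'text/xml' }), 'miwg-demo.bpmn20.xml');

  const res = await fetch('http://localhost:8080/activiti-rest/service/repository/deployments', {
    method: 'POST',
    headers: { Authorization: 'Basic ' + Buffer.from('kermit:kermit').toString('base64') },
    body: form,
  });

  if (!res.ok) throw new Error(`Deployment failed: ${res.status} ${await res.text()}`);
  console.log('Deployed:', await res.json());
}
```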

So what was that about the fish tank? Ah yes, quite the most remarkable lift I’ve seen in the conference hotel. And yes, that is a scuba diver about half way down.

Now THAT’S a fish tank

Reflections on BPMNext

So before it is all completely stale I thought I’d review an exhilarating couple of days at BPMNext last week. Thanks to Bruce Silver and Nathaniel Palmer for coming up with a great format: short and snappy, with genuine innovations not only described but demonstrated. I think 30 minutes rather than the more common 45 or 50 made everyone hone their message and spare us the filler slides. All this in an environment filled with people clearly passionate about BPM and wanting to move the industry on. Paul Harmon’s keynote, arguing that BPM has yet to ‘cross the chasm’ to widespread adoption, set the tone and the challenge: to finally reach the point where both business and technology people can look at one solution and agree that it indeed solves the problem to hand.

As the name implies, the ideas and products on show are not all ready for prime time, but without exception everyone provided something requiring, and worthy of, further thought.

Modeling and beyond

Fluxicon had what could have been the hard job of starting the first day but Anne and Christian certainly made it look a breeze (pun intended) with their example of how 19th century sailors learned to mine the data of their predecessors for the best routes. The demo showed how easily a prepared data set can be loaded into their DISCO tool and manipulated to zoom in on areas causing bottlenecks and other undesirables with an easy swoosh of a slider here and there.

Peopleserv had an interesting take on the under-represented area of resource management and work allocation. With a minimalist model consisting of nested spheres, Roy Altman showed how to create a master hub of all organisational structures, not merely the obvious ones like reporting lines. This model can be populated from as many sources as needed, given how such data is scattered across existing systems, and then queried as a simple web service for inclusion in every application that needs it.

Gero probably gets the prize for best toys in his presentation on process modeling, with a camera box that observed and responded to his hands (and on one occasion his head!) to select and modify items in the Signavio browser-based modeling tool. He also showed speech control paired with language analysis (English and German) to specify a sequence of tasks and to correct stylistic mistakes, such as changing ‘Purchased Items’ to ‘Purchase Items’.

ITP Commerce demonstrated a repository-wide analysis capable of applying a variety of quality measures, such as BPMN-I and Bruce’s ‘Method and Style’ conventions, to produce a super-concise R.A.G. report that I could see being very helpful in driving continuous improvement. And who could not relate to Stephan’s quip that, when it comes to defining what ‘good’ processes are, we should of course look to the BPMN2 spec, shouldn’t we? ‘No, not really’ 🙂

Integration

John Reynolds (IBM) gave a great presentation likening the way business processes need to be modifiable to the way actors may ad-lib on stage and even weave those ad-libs into future performances. And crucially, this ad-libbing depends on a very data-centric view of process that I too feel tends to be neglected. I presume that to anyone who’s been following the case management debate the parallels are obvious…

Lloyd Dugan and Mohamed Keshk explained how SCA stubs can be generated from the simple process and data interfaces of BPMN models to bootstrap an Enterprise Architecture of composable and policy-governed components. Extra geek points for the command-line demo, guys!

Jakob Freund of Camunda memorably poked fun at zero-coding BPM solutions with the line that it was a lot of fun to see a business analyst try to wire up a web service, but not for the BA! His thesis was that business and IT are best left in their own favoured tools, with Camunda Cycle synchronising all the details between the two. Certainly Bruce was impressed.

Social and mobile

Scott Francis of BP3 got in the first mention of responsive design in his session on the state of the mobile nation. Current mobile BPM can fall into the trap of being a firehose of many different things all jammed into the same task/due-date/priority mould; we already have that, and it’s called email. And I have to say that using IBM Process Designer ‘coach’ views (forms design) with BP3’s own UI rendering provided a compelling tool chain for integrating a BPMS into a mobile line-of-business app suitable for a variety of devices.

OpenText notably turned up in force. Their ‘touch’ initiative looks to have a full suite of social features integrated within the process client (rather than the outside-in approach of BP3 or TidalWave), but for a certain class of enterprise users I can see that being more comfortable.

Bonitasoft also presented the outside-in approach, demonstrating how Chatter could deliver all task notifications to the Chatter stream. Processing them was just a click through to the specific task’s form. And if Chatter is not your preferred social app, my bet is that with 300 connectors they’ve probably got the one you’re looking for.

Webratio showed their novel model-driven application tooling, using BPMN for the macro process definition, and introduced me to WebML (now being standardised as IFML at OMG) for composing the application interface and logic visually. Emanuele Molteni raised the tantalising prospect of blending these two modeling languages to build fully visual applications, and I’d really like to look into this further. And of course the best-in-show app was written in this tool, so watch out for that when the videos are published for wider voting shortly.

TidalWave pushed the theme that social apps need to be running in their users’ native world by showing how business interactions could be projected from their BPMS into Facebook. Of course that’s not every user’s habitat, but for those for whom it is: perfect! Joel Garcia also pointed out the benefits of OAuth not only as an authentication mechanism but also as a means to start building up an integrated view of the user’s interests.

Analytical

Robert Shapiro of Process Analytica showed us both replay of historical behaviour and forward-looking simulation of a process in order to optimise resourcing levels. This optimisation is able to chase one goal, such as reduced idle time, whilst taking account of other constraining factors such as SLAs. Though the tool predates the BPSim specification, it is an early adopter of that spec.

Dominic Greenwood of Whitestein Technologies is clearly focused on goals and gave a compelling demonstration of how planning milestones and their goal-seeking controller combine to put BPM on the front foot. Agility within a framework of governance provided by BPMN (prescriptive) snippets. And a great visualisation of the now-familiar process continuum!

Manoj Das asked, slightly tongue-in-cheek, “What do executives want? Pictures!” before introducing the latest BAM offering from Oracle. A variety of tools, such as moving averages, deviation from trend and missed events, showed how to monitor the business in a way supportive of ‘management by exception’.

My apologies to Patrick Schmidt of SAP; I had to step out at this point. But I did gather it’s all powered by HANA and is very fast 🙂

Carl Hilliar (Kofax) took up the challenge to ‘show me’ and provisioned a new cloud-based instance of their suite to conduct his demo, live during the presentation. For me, he had some interesting points on the distinction between public cloud, private cloud and on-premises. Essentially the private cloud is for those who want finer-grained control over upgrades; otherwise Kofax chooses the timing, with advance warning and sensitivity to working hours of course.

Knowledge Partners International presented a case study of how Freddie Mac cut their disaster response time from 26 weeks at Hurricane Katrina to less than 3 at the time of Sandy by applying the principles of The Decision Model. But more than anything, it seemed to me the gain came from eliminating the waste of translating wordy descriptions into executable rules, in a way that is tolerable to the subject matter experts.

Bosch Software Innovations described pushing event awareness and rules to the low-powered CPUs that make up ‘The Internet of Things’. At this stage it is understandably focused on problems like optimising the moment to perform preventative maintenance on high-end machinery, but as more and more of the world falls under the control of these devices it was an intriguing glimpse of the future.

Unstructured process

Denis Gagne gave us a whirlwind tour of how to model everything process-related. My own favourite was without doubt the Discovery Accelerator, which supports both facilitated sessions and existing documentation as routes to getting started with BPM. It is a great tool for drawing out the activities, data and people that make up business processes, or as Denis puts it the W5 (“who, what, where, when and why”). Check it out at http://www.discoveryaccelerator.com/.

Dave Duggal of EnterpriseWeb gave a stream of consciousness talk that treats the problem domain or business (indeed I suspect the whole world!) holistically as a single network. Both human workers and system agents can crawl the network to manage work in boundless ways that I can scarcely begin to grasp if I’m honest!

Computas showcased their ACM implementation, or, as they refer to it, ‘malleable tasks’. The breadth of the solution was staggering, but in every case it gives knowledge workers the ability to respond to events by planning appropriate tasks from a pre-defined set. ‘Agility within Governance’, you might say.

It would be harsh on so many others to say that the organisers had saved the best for last, but Keith Swenson’s talk on ‘Anti-fragile’ was certainly excellent. With a simplicity that belies its power, he demonstrated the necessity of decentralised and emergent systems to support ‘real-world’ business.

Wow! As I said at the beginning, a very exhilarating couple of days!