Reflections on BPMNext

So before it all goes completely stale I thought I'd review an exhilarating couple of days at BPMNext last week. Thanks to Bruce Silver and Nathaniel Palmer for coming up with a great format: short and snappy, with genuine innovations not only described but displayed. I think 30 minutes rather than the more common 45 or 50 made everyone hone their message and spare us the filler slides. All this in an environment filled with people clearly passionate about BPM and wanting to move the industry on. Paul Harmon's keynote, that BPM is yet to 'cross the chasm' to widespread adoption, set the tone and the challenge: to finally reach the point where both business and technology people can look at one solution and agree that it indeed solves the problem to hand.

As the name implies, the ideas and products on show are not all ready for prime time, but without exception everyone presented something that both requires and rewards further thought.

Modeling and beyond

Fluxicon had what could have been the hard job of starting the first day, but Anne and Christian certainly made it look a breeze (pun intended) with their example of how 19th-century sailors learned to mine the data of their predecessors for the best routes. The demo showed how easily a prepared data set can be loaded into their DISCO tool and manipulated to zoom in on areas causing bottlenecks and other undesirables with an easy swoosh of a slider here and there.

Peopleserv had an interesting take on the under-represented area of resource management and work allocation. With a minimalist model consisting of nested spheres, Roy Altman showed how to create a master hub of all organisational structures, not merely the obvious ones like reporting lines. This model can be populated from as many sources as needed, given how scattered that information is across existing systems, and then queried as a simple web service by any application that needs it.

Gero probably gets the prize for best toys in his presentation on process modeling, with a camera box that observed and responded to his hands (and on one occasion his head!) to select and modify items in the Signavio browser-based modeling tool. He also showed speech control paired with language analysis (English and German) to specify a sequence of tasks and to correct stylistic mistakes, such as changing 'Purchased Items' to 'Purchase Items'.

ITP Commerce demonstrated a repository-wide analysis capable of applying a variety of quality measures, such as BPMN-I and Bruce's 'Method and Style' conventions, to produce a super-concise R.A.G. (red/amber/green) report that I could see being very helpful in driving continuous improvement. And who could not relate to Stephan's quip that when it comes to defining what 'good' processes are, we should of course look to the BPMN2 spec, shouldn't we? 'No, not really' 🙂

Integration

John Reynolds (IBM) gave a great presentation comparing the way a business process needs to be modifiable to the way actors may ad-lib on stage and even weave those ad-libs into future performances. Crucially, this ad-libbing depends on a very data-centric view of process that I too feel tends to be neglected. I presume that to anyone who's been following the case management debate the parallels are obvious…

Lloyd Dugan and Mohamed Keshk explained how SCA stubs can be generated from the simple process and data interfaces of BPMN models to bootstrap an Enterprise Architecture of composable and policy-governed components. Extra geek points for the command-line demo, guys!

Jakob Freund of Camunda memorably poked fun at zero-coding BPM solutions with the line that it is a lot of fun to see a business analyst try to wire up a web service – but not for the BA! His thesis was that business and IT are best left in their own favoured tools, with Camunda Cycle synchronising all the details between the two. Certainly Bruce was impressed.

Social and mobile

Scott Francis of BP3 got in the first mention of responsive design in his session on the state of the mobile nation. Current mobile BPM can fall into the trap of being a firehose of many different things all jammed into the same task/due-date/priority mould – and we already have that; it's called email. And I have to say that using IBM Process Designer 'coach' views (forms design) with BP3's own UI rendering made for a compelling tool chain for integrating a BPMS into a mobile line-of-business app suitable for a variety of devices.

OpenText was notably a vendor that had turned up in force. Their 'touch' initiative looks to have a full suite of social features integrated within the process client (rather than the outside-in approach of BP3 or TidalWave), and for a certain class of enterprise users I can see that being more comfortable.

Bonitasoft also presented the outside-in approach, demonstrating how Chatter could deliver all task notifications into the Chatter stream. Processing them was just a click through to the specific task's form. And if Chatter is not your preferred social app, my bet is that with 300 connectors they've probably got the one you're looking for.

Webratio showed their novel model-driven application tooling, using BPMN for the macro process definition, and introduced me to WebML (now being standardised as IFML at OMG) for composing the application interface and logic visually. I'd really like to look into this further; Emanuele Molteni raised the tantalising prospect of blending these two modeling languages to build fully visual applications. And of course the best-in-show app was written in this tool, so watch out for that when the videos are published for wider voting shortly.

TidalWave pushed the theme that social apps need to run in their users' native world by showing how business interactions could be projected from their BPMS into Facebook. Of course that's not every user's habitat, but for those for whom it is – perfect! Joel Garcia also pointed out the benefits of OAuth, not only as an authentication mechanism but also as a means to start building up an integrated view of the user's interests.

Analytical

Robert Shapiro of Process Analytica showed us both replay of historical behaviour and forward-looking simulation of a process in order to optimise resourcing levels. This optimisation can chase one goal, such as reduced idle time, whilst taking account of other constraining factors such as SLAs. Though the tool predates the BPSim specification, it is an early adopter of that spec.

Dominic Greenwood of Whitestein Technologies is clearly focused on goals and gave a compelling demonstration of how planning milestones and their goal-seeking controller combine to put BPM on the front foot: agility within a framework of governance provided by prescriptive BPMN snippets. And a great visualisation of the now-familiar process continuum!

Manoj Das asked, slightly tongue-in-cheek, "What do executives want? Pictures!" before introducing the latest BAM offering from Oracle. A variety of tools such as moving averages, deviation from trend and missed events showed how to monitor the business in a way that supports 'management by exception'.

My apologies to Patrick Schmidt of SAP: I had to step out at this point. But I did gather it's all powered by HANA and is very fast 🙂

Carl Hilliar (Kofax) took up the challenge to 'show me' and, during the presentation, provisioned a new cloud-based instance of their suite to conduct his demo. For me, he had some interesting points on the distinction between public cloud, private cloud and on-premises. Essentially the private cloud is for those who want finer-grained control over upgrades; otherwise Kofax choose when to upgrade, with advance warning and sensitivity to working hours of course.

Knowledge Partners International presented a case study of how Freddie Mac have cut their disaster response time from 26 weeks at Hurricane Katrina to less than 3 at the time of Sandy by applying the principles of The Decision Model. But more than anything, it seemed to me, the gain came from eliminating the waste of translating wordy descriptions into executable rules, in a way that is tolerable to the subject matter experts.

Bosch Software Innovations described pushing event awareness and rules to the low-powered CPUs that make up 'The Internet of Things'. At this stage it is understandably focused on problems like optimising the moment to perform preventative maintenance on high-end machinery, but as more and more of the world falls under the control of these devices it was an intriguing glimpse of the future.

Unstructured process

Denis Gagne gave us a whirlwind tour of how to model everything process-related. My own favourite was without doubt the Discovery Accelerator, which supports both facilitated sessions and existing documentation as routes to getting started with BPM. It is a great tool for drawing out the activities, data and people that make up business processes, or as Denis puts it the W5 ("who, what, where, when and why"). Check it out at http://www.discoveryaccelerator.com/.

Dave Duggal of EnterpriseWeb gave a stream-of-consciousness talk that treats the problem domain or business (indeed, I suspect, the whole world!) holistically as a single network. Both human workers and system agents can crawl the network to manage work in boundless ways that, if I'm honest, I can scarcely begin to grasp!

Computas showcased their ACM implementation, or as they refer to it, 'malleable tasks'. The breadth of the solution was staggering, but in every case it gives knowledge workers the ability to respond to events by planning appropriate tasks from a pre-defined set. 'Agility within Governance', you might say.

It would be harsh on so many others to say that the organisers had saved the best for last, but Keith Swenson's talk on 'Anti-fragile' was certainly excellent. With a simplicity that belies its power, he demonstrated the necessity of decentralised and emergent systems to support 'real-world' business.

Wow! As I said at the beginning, a very exhilarating couple of days!