BPMN™ in Action!

By Tim Stephenson,

Monday, June 22, 2020 – 16:00 – 17:30

This year’s Business Process interchange demonstration is today – almost 10 years to the day since the current major version of BPMN was first published. We will showcase a digital transformation using a wide range of tools for both modelling and automation.

More information and free registration here.

BPMN™ IN ACTION! Object Management Group® (OMG®) cordially invites Business Process Modeling practitioners and interested parties to attend this free, innovative demo of BPMN in Action! Leading software vendors will demonstrate live the iterative elaboration and interchange of a Business Process Model and Notation™ (BPMN) model using their respective tools that implement the BPMN standard. This is the perfect opportunity to ask questions of the creators and innovators supporting this most widely adopted business process standard.

To view the participating vendors, click here. For more information about the upcoming OMG 2020 Q2 meeting, click here.

Object Management Group® (OMG®) is an international, open membership, not-for-profit computer industry standards consortium. OMG Task Forces develop enterprise integration standards for a wide range of technologies and an even wider range of industries. OMG standards enable powerful visual design, execution and maintenance of software and other processes.

New sustainability reporting year 2019-20 is now open

By Tim Stephenson,

Hello!

Thanks for using the SRP app this year as part of your sustainability reporting. We hope you found the system useful and would love to hear about your experience and any suggestions or issues you may have via the contact form. We also publish tips and other notifications on Twitter so you may like to follow @srpdigital.

We’ve put the new 2019-20 report live this weekend, so please do log in and get started. This year we have new questions on Plastics and Air Pollution, so do look out for those.

All the best from me and the SRP team,
Tim

Sustainability Reporting Portal (SRP) is an app to support data-driven reporting of a range of non-financial measures covering Environment, Social and Governance. It has been developed in conjunction with the NHS Sustainability Unit and includes modelling specific to the UK health sector.

BPMN IN ACTION – Meet and Greet

By Tim Stephenson,

The Object Management Group (OMG) cordially invites Business Process Modeling practitioners and interested parties to attend this free, innovative and informative meet and greet. Cocktails and light snacks will be served while leading software vendors demonstrate live the iterative elaboration and interchange of a BPMN model using their respective tools that implement the BPMN standard.

  • DATE: Monday, June 17, 2019
  • TIME: 5:00 pm – 7:00 pm
  • PLACE: Park Inn City West, La Guardiaweg 59, Amsterdam, The Netherlands (hotel info)
  • COST: Complimentary (Registration Required)
  • CONTACT: info@omg.org

Register

Vendors participating in the Live interoperability demo*

*Current list of vendors; more may be added. Subject to change.

About OMG
The Object Management Group® (OMG®) is an international, open membership, not-for-profit computer industry standards consortium. OMG Task Forces develop enterprise integration standards for a wide range of technologies and an even wider range of industries. OMG’s modeling standards enable powerful visual design, execution and maintenance of software and other processes. Visit www.omg.org for more information.

Getting on the front foot: A structured approach to Digital Process Automation

By Tim Stephenson,

First published on Zaizi.com

Last time we looked at how you can take a bottom-up approach with Digital Process Automation (DPA) to control your legacy systems. Now we’ll turn our attention to using it pro-actively — taking a top-down approach and creating a roadmap to completely revolutionise your processes.

Just as we are starting to see digital natives enter the workforce, we are also seeing a younger generation of business analysts and leaders exposed to Business Process Model and Notation (BPMN) in their education. So how do BPMN and DPA relate?

 BPMN – the face of DPA

BPMN has a simplicity that belies its power. Starting with a few basic shapes easily sketched during whiteboard discussions, it can provide a precision that minimises misunderstanding. Given that the majority of us are visual thinkers, it is ideally placed to be the common language to bridge the gap between service owners (responsible business users) and technologists.

Whilst it is easy to draw shapes, it is imperative to have some structure for any process overhaul to get the best out of DPA.

Different organisations are, well, different. How you embark on your very first process automation journey will be different to how you evolve your approach when your confidence in the discipline grows.

Wherever you are in your understanding, what follows provides a good template.

The three levels of process modelling

Start with the ‘End-to-end’ view

The top process should describe a single value chain (which we could describe as Level 1). In other words, it runs from the time that a user starts trying to do something to the time they have successfully achieved it. The other aspect of a value chain is that it’s abstract: it doesn’t go into every detail, just the key milestones. In particular, at this stage you should leave out any complexities introduced by unexpected events or having to go back around a particular loop.

Focus on your front line staff

Next, look at one stage of your end-to-end process and drill down into how that actually gets done. Look at the individual tasks that your people do. For example, “when we receive this, we need to do that”. Here you need to understand what happens when something goes wrong and whether there is rework required.

This is your ‘Level 2’ process. You’ll have a whole collection of them, each tackling one step in your value chain.

Depending on how complex Level 2 is, you may also want to push some of the details down into a Level 2b. For this introductory article, we don’t need to concern ourselves with that.

Look at how your systems contribute

Your front line staff probably have a whole range of systems and knowledge sources that they use to support their decision making. These may be databases and fulfilment systems that they have to update to perform part of a task, or offline sources such as a book of regulations or policies that they need to refer to in order to know how to act in different circumstances.

The way these systems are integrated into what people do can be seen as Level 3 processes. If you’ve been reading since the first blog about legacy transformation and APIs, this is where it all comes together. These Level 3 processes are where we make use of those APIs. This is how DPA allows you to make legacy modernisation a practical proposition.

So that’s the approach. We’ve not gone into any details of BPMN, but creating each of these three levels of process is a collaborative task, very possibly done in front of a whiteboard, and it leans heavily on BPMN as the common language for all stakeholders.

BPMN Interchange takes on Data Objects and Call Activities

By Tim Stephenson,

BPMN process map divided into segments performed by each tool

Amazing to think this is the fifth year of the BPMN Model Interchange Working Group (MIWG). Yesterday was the annual demonstration, and once again there was a room full of BPMN practitioners and vendors, as well as a live stream. The demonstration took the form of ten vendors each modelling an additional part on top of the previous tool’s contribution. In other words, at each handover the model was interchanged between two tools, carrying forward all standard model elements and vendor extensions.

The scenario required a customer data object to be used by numerous tasks. For those unfamiliar with this aspect of BPMN, this requires a single non-visual Data Object to be contained in the model and several visual Data Object References to be placed on the diagram and linked to the tasks consuming them. Each Data Object Reference is typically assigned a state that describes what is expected at that point in the process.

Excerpt of process model and the property sheet for the selected Data Object Reference
Data Object Reference and underlying Data Object it refers to.

A Call Activity is a vital construct that allows process reuse. This scenario showed separate tools creating the called and calling processes, and a third connecting them together.

Excerpt of process model showing callActivity and process called
Call activity invokes independent process

And this is the called process

This marked a milestone year for KnowProcess: for the first time, Tim demoed the process modeller modifying processes, as opposed to only viewing and executing models created in other tools. More on that in a future post. For now, here is the completed process re-imported into KnowProcess.

2018 BPMN MIWG scenario in full

Watch the live stream recording below, with all the ‘fun’ of co-ordinating participants in 10 countries in real time.

BPMN in Action: BPMN MIWG Capability Demonstration Seattle 2018

Decision modeling to measure progress on Climate Change and Social Value Acts

By Tim Stephenson,

Decision Camp 2018 logo

Between 17th and 19th September I will be participating in Decision Camp 2018, an annual event to take stock of the latest progress in Decision Management. This year I hope to see that DMN adoption is gathering pace; indeed, the programme seems to suggest it is.

My own presentation this year will be a case study in the way we’ve applied the DMN technology to Sustainability Reporting. Here’s the abstract.

The Climate Change Act and Social Value Act in the UK place responsibilities on many private sector and all public sector organisations to consider not only the value to customers and stakeholders of their activities but also the effect on greenhouse gases and the society they operate within.
The healthcare sector has a particularly significant place in this work, forming as it does a large part of the public sector and being present in every part of UK society. This programme that we’ve been involved with for some years now aims to recruit that considerable geographic & economic muscle to positively drive social and environmental action.

The programme collates the best and most up-to-date climate and social value modelling available and applies it to the problem of calculating a complete Sustainability report for each and every hospital trust and clinical commissioning group (CCGs – who commission primary healthcare providers) in England. In numerical terms this deals with almost 500 data inputs for each of around 200 hospital trusts, a further 200 CCGs, as well as Ambulance Trusts, Mental Health providers and so on.
This case study will show how DMN has been used to document the previously black box system allowing far more people to understand, critique and improve the reporting process. Furthermore, by making these decision models executable we have dramatically reduced the complexity of maintaining and updating the system as new and updated data becomes available.

Update 20th Sept: Here are the slides from my presentation.

MySQL backup and restore with extreme space limits

By Tim Stephenson,

Bank Safe

Digital Ocean is a very powerful and yet simple-to-use Virtual Private Server (VPS) provider. But then you knew that, unless you’ve been living under a rock for the last couple of years ;-). The one area I have struggled with is the small disk sizes (at least until Block Storage reaches your preferred datacentre).

This post shows how to work around size restrictions and back up and restore databases that fill most of those disks.

Before you start

Ensure you have two droplets: one containing the database you want to back up, and one to hold the backup and restore onto.

Also ensure you can ssh between them; the easiest way to do this is to set up key-based authentication as described here: https://www.digitalocean.com/community/tutorials/how-to-set-up-ssh-keys--2.
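If you have not set up key-based authentication before, it only takes a couple of commands. Here is a minimal sketch, where the ed25519 key type and the backupserver hostname are illustrative choices rather than anything prescribed:

# On the source droplet, generate a key pair (accept the default file location)
ssh-keygen -t ed25519

# Copy the public key to the backup droplet so ssh stops prompting for a password
ssh-copy-id user@backupserver

# Confirm the passwordless connection works
ssh user@backupserver "echo ok"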

Creating the backup

This little beauty will dump all tables, run the result through gzip compression and pipe the lot to another server (droplet). Not only a backup but potentially an off-site backup.

mysqldump -u usr -p dbname | gzip | ssh backupserver "cat > dump.sql.gz"

Or backup just a subset of tables:

mysqldump -u usr -p dbname table1 table2 | gzip | ssh backupserver "cat > dump.sql.gz"

A variation could even pipe the dump into the mysql client on the backup server to provide a warm standby, but for now I’ll leave that as an exercise for the reader!

Restoring the backup

The reverse one-liner to restore directly into the database is this:

ssh user@backupserver "cat ~/dump.sql.gz" | gunzip | mysql -u usr -p dbname

Scheduling with cron

Placing these commands in a cron job is straightforward apart from the small matter of supplying the password, which will be requested interactively in the above examples. This can be avoided using the client section of the MySQL configuration file. Since the documentation can be a bit dense, here’s a shortcut: put this into ~/.my.cnf (and don’t forget to change permissions so others cannot read it):

[client]
user = dbusername
password = "dbpassword"
host = localhost
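To tie it together, here is a rough sketch of a scheduled backup. The 02:00 time and the dated filename are just illustrative, dbname and backupserver are the placeholders from the examples above, and it assumes the ssh key has no passphrase. Note that % is special in crontab entries and must be escaped:

# Restrict the credentials file so only you can read it
chmod 600 ~/.my.cnf

# Then run crontab -e and add a nightly off-site backup at 02:00
# m h dom mon dow  command
0 2 * * * mysqldump dbname | gzip | ssh backupserver "cat > dump-$(date +\%F).sql.gz"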

First month with Bq Aquaris M10

By Tim Stephenson,

Actually my M10 arrived on 20th April, so it’s been a little over a month now, but never let it be said that I rush to conclusions!

By now, you will no doubt have read many a mixed review of the first Ubuntu tablet, so perhaps first I should declare something of my background. I’ve been a pretty keen Ubuntu user for a decade now, but I wouldn’t want you thinking I’m an insider; far from it. I don’t even really think of myself as a techie. Though I have written software and led software teams all my career, computing is for me just a tool to solve problems. A powerful tool, to be sure, but in my job (or vocation) I’m solving problems that allow small companies to compete with large ones and help all of us tread more lightly on our fragile planet.

I tried Ubuntu out of frustration at losing 50% of my computing power to Windows and McAfee, even though we bought high-end Lenovo laptops for all staff in those days. I was persuaded to try a Mac a few years back and I tolerate it as an adequate tool, but no more. My only real gripe with Ubuntu (and a few other distros that I’ve tried) is that WiFi can still be a bit hit or miss if you pick the wrong hardware.

So, why would I shell out for an Ubuntu tablet?

For me, every tablet thus far has been a consumption device. It’s a handy way to check email or watch a TV show on the train. I love being able to read both web content and books without lugging a laptop, but to be honest a phone is just as good for that. I’ve been super-grateful for JuiceSSH on more than one occasion to kick a malingering server back into life or do an emergency database fix on the go, but honestly you wouldn’t plan to run your life that way, would you?

But with Ubuntu Touch we have the promise of running ‘normal’ applications on the go with minimal compromise, and no compromise at all if you find yourself near a screen and pull out a Bluetooth keyboard.

So what will I do with it? Email, iPlayer radio, web and ebook reading, social media, content management (WordPress), business modelling (BPMN, DMN, UML), internal and customer meetings (mostly voice but some screen sharing) and project management (Freemind, Github, Gitlab). In other words, everything that does not actually involve sitting in front of heavy development tools like Eclipse or XCode.

Now I know that some of those are not going to be straightforward, but let’s see how I get on…

 

Who wants ‘cloud-native’ business process?

By Tim Stephenson,

No, this is not link-baiting on the perils of putting your important business processes into the cloud, but an actual question. For a couple of years (at least) it has been clear that ‘modern’ applications are being built with the set of technologies loosely termed ‘HTML5’. And most BPM suites have responded lately with a RESTful API. But it always feels to me that these are traditional enterprise applications wrapped in a light sprinkling of cloudy fairy dust.

Where are the tools that support genuine ‘business processes’ to be integrated with lightweight micro-sites by my existing web agency? Where are the data-binding tools that allow me to use messages triggered from standard HTML pages to control process-driven applications? Where is the dynamically scalable, utility computing platform that can host my BPMN models? Which suite lets me tell it which task or project management tool my user tasks should show up in? If you too have been hunting, read on…

For some time now KnowProcess has been delivering projects that rely on the features above, married with HTML5 features like offline use, localStorage and richer form validation. We picked our favourite open source BPM engine and gradually built the infrastructure around it to make a BPMS worthy of being termed cloud-native. A couple of examples: Syncapt is performing a number of marketing automation tasks such as lead management and customer care follow-ups. Trakeo is helping organisations reduce their environmental and social impacts within the complex environments of statutory and voluntary regulation.

Now at last we’re getting around to making it available as a stand-alone service. You’ll be able to create an account and immediately start using the built-in processes. And of course you’ll be able to deploy your own too, using our built-in service tasks. There’s a little way to go yet, sorting out the developer documentation and so on, but if it sounds interesting why not get in touch and we’ll get you on the early adopter programme: info (at) knowprocess.com or @knowprocess

Oh, how much does it cost, you say? True to its cloud pedigree, it will be a freemium model, so expect a small use-based subscription for the service. We owe a lot to existing open source projects and, if you want, you’ll be able to deploy it yourself on your own kit, though of course we feel you’ll miss much of the benefit that way. The plan is to find an open source home for it soon, perhaps even within an existing project.

 

Simplifying BPM and the zero code hypothesis, themes from #BPMNext

By Tim Stephenson,

A week now since I got back from BPMNext and I’m still buzzing with the seeds of ideas planted there. First up, I felt motivated to write about the recurrent theme of how to ‘simplify BPM’ and its cousin, ‘Zero-Coding’ applications. The latter remains as controversial as ever and perhaps always will be, so I think I’ll state up front my agreement with Scott Francis’ diplomatic conclusion in his piece on ‘the zero code hypothesis’ that:
It looks like those of us writing code will have some work to do for a few more years yet.
But there are two important corollaries to this hypothesis that I think worthy of mention.
‘Zero code’ is often a short-hand for “empowering ‘real’ users who are unfamiliar with code”. And that is both a worthy and necessary goal. After all, the days when a genius such as Isaac Newton could reach the pinnacle of several fields in one lifetime are long behind us. In the modern world we are all specialists. So if we believe BPM suites are powerful tools for managing our work then we need to strive to make them available to experts in all manner of fields. Let’s not give up on ‘zero code’ whilst acknowledging the limitations of our best efforts to date.
Looking back over BPMNext, we saw some examples of how we can already allow these ‘real’ users to solve some of the simpler problems. For example, the first speaker was Brian Reale of Colossa, showing how, with 10 minutes and no design environment, he could create effective mobile data capture tools to complement more traditional BPM installs. Not the only use for ProcessMaker, but a sweet spot that makes sense to me. At almost the other end of the conference we saw Keith Swenson show Cognoscenti’s implementation of what he terms a ‘Personal Assistant’, starting to automate simple and repetitive work without getting in the way of the expert user. And others too showed how some part of what we traditionally call BPM can be placed in the hands of non-coders.
Secondly, the best of these higher-level tools allow those of us who are coders to be dramatically more productive. This is why I got into BPM in the first place: rather than forcing users to cobble together a coherent workflow from spreadsheets, post-it notes and highly specialised desktop applications, we can smoothly hand work from one person to the next and optimise whatever makes sense within the clear boundary of a service task on a BPMN diagram.
Not least amongst this category is BP3’s own Brazos UI framework for providing responsive UIs specifically for interacting with RESTful BPM servers. And whilst it is undoubtedly early days, the camunda-sponsored BPMN.io shows promise as a truly capable and embeddable tool for presenting, annotating and even authoring business processes. Plus @JakobFreund gets kudos for shameless geekery – laugh-out-loud! Two projects I will follow with interest.
The current trend for specialised web applications that do one thing really well and offer APIs to access their functionality is truly the opportunity for BPM to ‘cross-the-chasm’ that Paul Harmon spoke of in his keynote at last year’s BPMNext. And one I am busy working on at Syncapt. But that sounds like another post….