Thursday, April 19, 2018

bpmNEXT 2018 day 2 (part 2)

RPA Enablement: Focus on Long-Term Value and Continuous Process Improvement
Massimiliano Delsante - Cognitive Technology Ltd.

The myInvenio tool can be used to discover processes based on data already collected.  It derives the process (the tasks, actors, sequence, etc.) from the data and cross-checks it against the cases already recorded (for example, to see which ones deviate, or where time is spent).
This information can then be used to determine which activities are the best candidates for automation.  By running a simulation, you could decide for example to add two robots for automating one of the steps (at least for the simple cases) and keep one employee for the more complex and exceptional cases.
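As a back-of-the-envelope illustration of the kind of what-if question such a simulation answers, here is a trivial sketch; the case volumes, processing times, and class name are invented for this post, not anything myInvenio produces:

```java
// Back-of-the-envelope sketch of the staffing decision described above:
// two robots split the simple cases, one employee keeps the complex and
// exceptional ones.  All volumes and rates are made-up assumptions.
public class StaffingSimulation {

    // Total hours the robots need if they split the simple cases evenly.
    static double robotHours(int simpleCases, double minutesPerCase, int robots) {
        return simpleCases * minutesPerCase / robots / 60.0;
    }

    // Total hours the employee needs for the complex cases.
    static double humanHours(int complexCases, double minutesPerCase) {
        return complexCases * minutesPerCase / 60.0;
    }

    public static void main(String[] args) {
        // Hypothetical monthly volume: 800 simple cases, 200 complex ones.
        System.out.printf("robots: %.1f h, employee: %.1f h%n",
                robotHours(800, 2.0, 2), humanHours(200, 15.0));
    }
}
```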



Integration is Still Cool, and Core in your BPM Strategy
Ben Alexander - PMG.net

PMG provides drag-and-drop low-code processes, with pre-built connectors.  The demo process included human tasks for approval, but also supported integration with email, phone, text, Slack, etc.  It contacted external services (like Azure ML) for risk assessment, and included some RPA integration as well.



Making Process Personal
Paul Holmes-Higgin and Micha Kiener - Flowable

Chat is becoming an increasingly important communication channel for customers.  Flowable showed an example of how banks use lots of different channels to communicate with customers, like a chatbot, combining BPMN2 and CMMN during conversations.
A digital assistant constantly helps the client advisor during the conversation by creating (sub)cases, advising actions, etc.  For example, it can help enter a client address change, validate the information, ask for validation, send confirmation emails, involve a compliance officer if necessary, etc.  Behind the scenes, the digital assistant is backed by a process (with forms, etc.).  Finally, machine learning can be integrated to replace some of the manual steps.



Robotics, Customer Interactions, and BPM
Francois Bonnet - ITESOFT

A demo with an actual (3D-printed, open-source) robot!  Francois brought a robot with video and voice recognition capabilities.  The robot could for example be used in a shop to greet clients.  Voice recognition can be used to start a process (for example when a customer comes in).  The robot can respond to several commands, follow you, do face recognition, take pictures, etc., all by configuring various processes.  The voice and face recognition don't always work perfectly yet, but it was interesting to see anyway!



The Future of Voice in Business Process Automation
Brandon Brown - K2

Voice recognition can be used to create a chatbot.  The chatbot can for example be used to request PTO, or to get your tasks (and complete or even delegate them).  But chatbots aren't great for everything.  Some data is just easier to provide in a structured form.  But even forms can be enhanced with, for example, sentiment analysis (to automatically update the data based on the sentiment detected in the text provided in the form).  You can then for example create standard processes for how to respond to certain sentiments.



State Machine Applied to Corporate Loans Process
Fernando Leibowich Beker - BeeckerCo

Processes can be unstructured and rely on rules to define whether tasks should be triggered or not.  The demo used the IBM BPM state machine in combination with IBM ODM, where the rules define what the next state will be based on the current state and the input.
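The state-machine-plus-rules idea can be illustrated with a minimal sketch; the states, events, and transition table below are hypothetical stand-ins for what IBM ODM would express as business rules:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a rule-driven state machine for a loan process.
// The states and inputs are hypothetical; in the demo, IBM ODM rules
// play the role of this transition table.
public class LoanStateMachine {

    // Each rule maps "currentState|input" to the next state.
    private final Map<String, String> rules = new HashMap<>();

    public LoanStateMachine() {
        rules.put("SUBMITTED|documents-complete", "UNDER_REVIEW");
        rules.put("UNDER_REVIEW|risk-approved", "APPROVED");
        rules.put("UNDER_REVIEW|risk-rejected", "REJECTED");
        rules.put("APPROVED|funds-disbursed", "ACTIVE");
    }

    // Look up the next state; stay in the current state if no rule matches.
    public String next(String currentState, String input) {
        return rules.getOrDefault(currentState + "|" + input, currentState);
    }

    public static void main(String[] args) {
        LoanStateMachine sm = new LoanStateMachine();
        String state = "SUBMITTED";
        state = sm.next(state, "documents-complete"); // UNDER_REVIEW
        state = sm.next(state, "risk-approved");      // APPROVED
        System.out.println(state);
    }
}
```

The point of externalizing the table into a rules engine is that business users can change the transitions without redeploying the process.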

Wednesday, April 18, 2018

bpmNEXT 2018 day 2

An awesome surprise this year: the videos from yesterday are already available on YouTube!  So I've updated my posts from yesterday with the links.  Amazing job!

BPM 2018-2022: Outlook for the Next Five Years
Nathaniel Palmer

Nathaniel started with an outlook of where we are (maybe?) going in the next few years.  The three R's that will define BPM, in his view, are Robots, Rules and Relationships.  With everything running in the cloud.  And using Blockchain ;)
Interaction has already changed significantly (with everyone having a smartphone), but he predicts the smartphone (as we know it) will go away in the next five years - with consumer adoption of new interfaces accelerating even more.
Robots (including any kind of smart device or service) will represent customers in various interactions.  And they will do a lot of the work done by employees nowadays.  Even autonomously.  All this will have an impact on application architectures, almost introducing a 4th tier in the typical 3-tier architecture.
The future-proof BPM platform (aka the Digital Transformation Platform) brings together various capabilities (like Workflow Mgmt, Decision Mgmt, Machine Learning, etc.) - possibly from different vendors - processing events from many different sources (services, IoT devices, robots, etc.).
And he ended with the advice that the best way to invent the future is to help create it!


A Next-Generation Backendless Workflow Orchestration API for ISVs
Brian Reale and Taylor Dondich, ProcessMaker

ProcessMaker is showcasing their cloud-based process service.  It exposes a REST API for interacting with it, and has connectors to various external services.  The service does not come with a BPMN2 designer, but it accepts BPMN2 and offers a programmatic interface to create processes as well.  They also introduced a "simplified" designer that ISVs can use to define processes (which underneath exports to BPMN2 as well), but which hides a lot of the more complex constructs available.



CapBPM’s IQ – No-code BPM development – Turning Ideas into Value
Max Young, Capital Labs

To avoid being locked into one vendor, IQ offers a generic web-based user interface for BPM that can be used on top of various underlying BPM platforms.  On the authoring side you can define process and data models and do different kinds of analysis.  In the end, it generates open-source application code that works with a specific product (which your developers can use as a starting point).


Monitoring Transparency for High-Volume, Next-Generation Workflows
Jakob Freund and Ryan Johnston - Camunda

Camunda showed Zeebe, their next-generation process execution platform.  The demo starts when an arbitrage opportunity is detected, and then does various risk calculations.  Zeebe Simple Monitor is a web-based monitoring tool to look at deployed processes and running instances.  With Optimize you can create and look at reports based on the various events that Zeebe generates, including charts, heat maps, alerts, etc.
And as a treat, they showed a Doom-like Easter egg inside their Cockpit, where you can walk through your process "dungeon" and shoot tokens with your shotgun :)

bpmNEXT 2018 - Part 2

Decision as a Service (DaaS): The DMN Platform Revolution
Denis Gagné - Trisotech

Denis showed the progress Trisotech has made in offering DMN modeling and execution capabilities as a service.  The DMN Modeler is a complete modeling environment for DMN, including collaboration, simulation, test cases, searching, etc.  After creating a DMN model, he showed various ways of creating a new DMN decision service to expose.
Next, this can be deployed into the cloud (including using our own Drools / Red Hat Decision Manager DMN engine).  Once deployed, it can be tested with a simple HTML form, it has a REST API, the debugging environment allows you to look at the requests that were actually made, etc.  Using an API management tool, you can add even more features, like authorization.
Finally, it's of course possible to include these decision services in your processes.



Timing the Stock Market with DMN
Bruce Silver - methodandstyle.com

Bruce implemented a DMN model for predicting when to buy and sell stocks.  Based on historical stock data, it uses a DMN model to detect patterns (based on local minima and maxima, smoothing, etc.).  This service is then orchestrated by a process (using Microsoft Flow) that downloads 1 year of data for specific stocks, processes it and presents the results - using various connectors (to get information from and into Excel, call the REST decision service, etc.).  His goal was to show how a non-programmer like himself can use DMN for real-life use cases that can then be fully executed.  And you should buy his DMN Cookbook for all the details :)
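The core pattern detection (smooth the series, then flag local minima and maxima) can be sketched in a few lines; the window size, the sample prices, and the BUY/SELL labels below are illustrative assumptions, not Bruce's actual DMN logic:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the kind of pattern detection described above: smooth a daily
// price series with a moving average, then flag local minima (buy
// candidates) and local maxima (sell candidates).
public class ExtremaDetector {

    // Simple centered moving average; the edges keep their original values.
    static double[] smooth(double[] prices, int window) {
        double[] out = prices.clone();
        int half = window / 2;
        for (int i = half; i < prices.length - half; i++) {
            double sum = 0;
            for (int j = i - half; j <= i + half; j++) sum += prices[j];
            out[i] = sum / window;
        }
        return out;
    }

    // A point is a local min/max if it lies below/above both neighbours.
    static List<String> signals(double[] smoothed) {
        List<String> result = new ArrayList<>();
        for (int i = 1; i < smoothed.length - 1; i++) {
            if (smoothed[i] < smoothed[i - 1] && smoothed[i] < smoothed[i + 1]) {
                result.add("BUY@" + i);
            } else if (smoothed[i] > smoothed[i - 1] && smoothed[i] > smoothed[i + 1]) {
                result.add("SELL@" + i);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        double[] prices = {10, 9, 8, 9, 11, 12, 11, 10, 11, 13};
        System.out.println(signals(smooth(prices, 3)));
    }
}
```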


Smarter Contracts with DMN
Edson Tirelli - Red Hat

One of the challenges of using Blockchain for smart contracts is that some of the languages used there (for example in Ethereum) aren't always easy to understand or use (especially for non-experts).  Edson's goal was to try to use DMN instead, as a language for smart contracts that users can understand.  Using an example of selling a property, he showed how some of the logic was externalized from the contract into a DMN decision service.  The contract raised an event, which the Ethereum oracle picks up, and it then contacts the DMN service (running in the cloud).  Using a simple web app to initialize and finalize the sale, you could see the Blockchain being updated with all the relevant data.
Pretty cool, although as Edson is my colleague I am obviously biased ;)



Designing the Data-Driven Company
Jochen Seemann - MID GmbH


The Business Decision Map is a way to represent decisions at different levels: tactical decisions, operational decisions and business events.  Using the example of a car rental company, it allows you to represent the decisions they need to make at the different levels.  Using the MID Innovator tool, these decisions can be represented using DMN.  But other options, like PMML and machine learning, can also be combined.



Using Customer Journeys to Connect Theory with Reality
Till Reiter and Enrico Teterra - Signavio

Since the focus of any company should be on the customer, Signavio developed a new notation for representing customer journeys and linking those to processes and business intelligence.  Using the example of a communication company where a customer has a connectivity issue, they showed an end-to-end example.  The customer (with different moods) goes through various steps, and traffic lights link these to actual data collected at runtime, or to the business process involved.  Drilling into the data, it became apparent that a process improvement to reduce the number of field visits would be worth the effort, and everything was linked to the data to substantiate that claim.



Discovering the Organizational DNA
Jude Chagas Pereira, IYCON
Frank Kowalkowski, Knowledge Consultants, Inc.

Afterspyre offers various kinds of analytics to help organizations make the right decisions.  By modeling your organizational DNA (like objectives, technology solutions, datacenters, etc.), the tool can then find the relationships between all of these (for example, which datacenter is running which objectives).  Other options include sentiment analysis (based on feedback from customers), affinity matrices (checking how well different things go together), ranking (comparing different options with each other), etc.

Tuesday, April 17, 2018

bpmNEXT 2018 kicking off!

Attending the bpmNEXT event again this year in Santa Barbara.  I have been looking forward to this event for quite a few months, so I'm happy to be able to join again this year.  I will try to blog about my impressions.


Welcome and Business of BPM Kickoff
Bruce Silver

Bruce started with a kickoff and introduction, explaining why bpmNEXT is different from other BPM events out there (on purpose!), trying to bring together some of the best and brightest people leading BPM efforts across the globe.  And he's right (at least in my opinion), bpmNEXT is different, which is why I enjoy returning to it every year.


The Future of Process in Digital Business
Jim Sinur - Aragon Research

Jim is pitching how process is now part of a much bigger 'digital' shift.  The focus is on the customer journey (or employee or partner journey), to make everything smarter, faster and better - hopefully resulting in new business opportunities, better customer loyalty, agility, etc.  A lot of different technologies (including BPM and DMN of course, but also AI, chatbots, self-service, etc.) are all converging towards the same goals.  Rather than just data, the focus is moving more towards intelligence.  And rather than doing it all at once, he presented 10 mini-journeys that can get you closer one step at a time, each focused on one specific area where they have seen customers succeed (content, collaboration, process, persona, customer interaction, analytics, AI, agile, low code and business functions).  He zoomed in on areas like the decision management framework and customer journey mapping.  But processes are still at the center of IT innovation, although they are driven by much more, including AI, wearables, etc.


A new architecture for automation
Neil Ward-Dutton - MWD Advisors

Neil tried to summarize a lot of the discussions he has been having with their community related to automation.  There is an abundance of technology (all playing a part in automation), resources (with cloud), competitors, etc., generating lots of expectation (and investigation) but also fear, chaos and disruption.  Customers need a way to organize this tsunami of technologies.
Neil introduced a model for representing how work gets done.  Customers need to think about how this applies to them, ranging from very programmatic (P) work (like straight-through processes), over transactional (T), to very exploratory (E) work (like case management).  Depending on your focus, different technologies (AI, decision management, machine learning, RPA, etc.) might play a role in that.  With a rapidly moving technology market, customers might end up with a combination of a lot of those.


After these introductory talks, the ignite presentations are kicking off.

Secure, Private, Decentralized Business Processes for Blockchains
Vanessa Bridge - ConsenSys

ConsenSys is using BPMN in combination with Blockchain.  By using processes to interact with the Blockchain, they simplify how to work with smart contracts and take advantage of some of the process capabilities (e.g. timers) for some of the logic.  They presented two use cases: a token sale and anonymous voting.
Whenever a request for buying tokens comes in, the process is responsible for creating the smart contract (encrypting some of the information), checking the funds available, passing along the tokens, etc.
The voting system allows you to enter some information about the vote itself and who should participate.  Again a smart contract is created, which allows participants to register and cast their vote (again encrypted).



Turn IoT Technology into Operational Capability
Pieter van Schalkwyk - XMPro

IoT devices produce a lot of data, but how do you create the glue that connects this data to your operational decisions?  By creating data flows (in this case from a cooling tower, for example), you can combine data from different listeners, transform it, and take actions (using a library of extensible components).  Active listeners look for the relevant data from the IoT devices and can then, for example, end up triggering a BPM tool, calling an AI predictive service running in the cloud, etc.  Doing so can transform your Internet of Things into an Internet of People, helping the people who make the operational decisions as much as possible.
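The listener / transform / action pattern described here can be sketched in a few lines; this is an illustrative stand-in, not XMPro's actual API, and the sensor names, units and threshold are invented:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;
import java.util.function.Predicate;

// Illustrative sketch of the listener -> transform -> action pattern:
// sensor readings flow through a transformation, and the readings that
// satisfy the trigger condition are the ones that would kick off an
// action (e.g. starting a BPM process or calling a predictive service).
public class SensorFlow {

    record Reading(String sensor, double value) {}

    // Return the readings whose transformed value satisfies the trigger.
    static List<Reading> triggered(List<Reading> stream,
                                   Function<Reading, Double> transform,
                                   Predicate<Double> trigger) {
        List<Reading> hits = new ArrayList<>();
        for (Reading r : stream) {
            if (trigger.test(transform.apply(r))) {
                hits.add(r);
            }
        }
        return hits;
    }

    public static void main(String[] args) {
        List<Reading> stream = List.of(
                new Reading("cooling-tower-1", 71.0),   // Fahrenheit
                new Reading("cooling-tower-1", 86.5));
        // Transform Fahrenheit to Celsius; trigger an alert above 30 degrees C.
        List<Reading> alerts = triggered(stream,
                r -> (r.value() - 32) * 5 / 9,
                c -> c > 30);
        alerts.forEach(r -> System.out.println("ALERT " + r.sensor() + " " + r.value()));
    }
}
```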


Business Milestones as Configuration: Process Director App Events
Scott Menter - BPLogix

One of the challenges of executing processes is how to easily get an idea of their status, one that makes sense at the business level.  (Low-level) app events (coming from your processes) are given business context (making them business events) and are used and combined to keep track of business goals.  A journal then collects these business events, which business users can inspect, react to, etc.

 

More coming after lunch.

Friday, September 22, 2017

Watch Drools, jBPM and OptaPlanner Day LIVE (Sept 26)

We will be streaming all the sessions of the Drools, jBPM and OptaPlanner day in New York on September 26th 2017 LIVE!  Check the full agenda here.

Use the following link to watch: http://red.ht/2wuOgi1


Tuesday, August 8, 2017

jBPM 7.1 available

Since we have moved to a more agile delivery with monthly community releases, we are happy to announce the availability of jBPM 7.1.0.Final.

You can find all information here:
Downloads
Documentation
Release notes
Ready to give it a try but not sure how to start?  Take a look at the jbpm-installer chapter.

The focus of this release has been mostly on improving the experience and the capabilities for process and task administrators.  These admins keep an eye on your infrastructure, making sure the execution of all processes in your system is in good health, and resolving any issues that might show up.

To make it easier for these process and task administrators to do their work, we have added a bunch of improvements and new features for them:
  • Error management: Errors that happen during the execution of your processes (or tasks, jobs, etc.) are now better detected and stored.  This could for example be an (unhandled) exception during the execution of your process instance or a job that has hit its retry limit.  
    • At the core engine level, errors are stored in the database and can be acknowledged.  Note that the engine will always guarantee a consistent state for all your process instances, so when an exception like this happens, the engine is rolled back to the last known state and the error is logged.
    • Through the web-based console, process admins can take a detailed look at any exception that might have happened through the new Execution errors view, acknowledge them, and if possible take action to resolve the issue.
    • The process instance list has been extended with a new column to show any errors possibly related to that instance.

  • Quick filters: Searching for information about specific process instances, tasks, jobs or errors is now easier thanks to a new search tab, where you can find the data you need by adding quick filters (for example on the state of your process instances, the time they were started, the name of the task, etc.).
  • Navigation: New actions have been added to the process instance, task, jobs and errors views to more easily navigate between them where appropriate.  For example, you can navigate to the errors associated with a specific process instance (if any), or take a look at the process instance associated with a specific task or job.
  • Task admin view: the task view that was included in previous versions has been split into two separate views:
    • Tasks: Aims to be used by task operators / end users to work on tasks assigned (or potentially assigned) to them
    • Task administration: Designed to be used by administrators, to manage tasks belonging to other users.  This perspective is only available for users with the roles admin and process-admin.  It is similar to the "Admin" filter tab on the former tasks perspective.
  • Project and team metrics
    • A brand new dashboard is now available for every project listed in the authoring library.  After opening the project details page, a metrics card shows up on the right side of the screen.  The card shows the history of contributions (commits) made to that specific project over time.  Clicking the View All link gives access to the full dashboard, which shows several metrics about the project's contributions.

    • A brand new dashboard has also been added to the Teams page.  A metrics card on the right side shows the history of all contributions (commits).  Clicking the View All link gives access to a full dashboard showing overall contribution metrics.


More detail can be found in the full release notes.  Especially to our process and task administrators, enjoy !
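To illustrate the acknowledge workflow from the error management feature above, here is a standalone sketch.  jBPM stores execution errors in the database and exposes them through the admin services and the web console; the class and method names below are simplified illustrations, not the actual jBPM API:

```java
import java.util.ArrayList;
import java.util.List;

// Standalone sketch of the acknowledge workflow: the engine records an
// error when it rolls back after an unhandled exception, an admin reviews
// the open errors, and acknowledging one removes it from the open list.
public class ErrorJournal {

    static class ExecutionError {
        final long processInstanceId;
        final String message;
        boolean acknowledged;

        ExecutionError(long processInstanceId, String message) {
            this.processInstanceId = processInstanceId;
            this.message = message;
        }
    }

    private final List<ExecutionError> errors = new ArrayList<>();

    // Called when the engine rolls back after an unhandled exception.
    void record(long processInstanceId, String message) {
        errors.add(new ExecutionError(processInstanceId, message));
    }

    // Unacknowledged errors are what an admin would see in the errors view.
    List<ExecutionError> unacknowledged() {
        List<ExecutionError> open = new ArrayList<>();
        for (ExecutionError e : errors) {
            if (!e.acknowledged) open.add(e);
        }
        return open;
    }

    void acknowledge(ExecutionError e) {
        e.acknowledged = true;
    }
}
```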

Drools, jBPM and OptaPlanner are switching to agile delivery!

Edson recently blogged about how Drools, jBPM and OptaPlanner are moving towards a more agile delivery.  The goal is to be able to release new features much quicker and more often to the community, by having monthly community releases.

Since this obviously has an impact on our entire community (hopefully an overall positive impact, of course ;)), I wanted to highlight some of the most important consequences as well:
  • More frequent releases give the community earlier access to new features
  • Reducing the scope of each release allows us to do more predictable releases
  • Since bug fixes are included in each release as usual, users will be able to pick those up quicker as well
As a result, starting with v7.0 a few weeks ago, you should see releases more often now.  It does mean that each individual release will be smaller in size.  But overall we believe we will be able to deliver new features and fixes faster and more predictably!

Feel free to take a look at Edson's blog for a few more details.