Digital Process Automation (DPA) is the evolutionary step beyond the Business Process Management (BPM) discipline of the past 10 years. Most enterprises have some form of business process legacy built into their organization. Through this lens, we can see how organizations have mastered their processes by creating flows and rules that dictate the behavior of the organization. From the perspective of a business process, clients work along the typical “happy path” or, sometimes, an “alternate flow path” or “exception flow path”. As we help clients embrace the power of artificial intelligence (AI) to enhance business decisions, we notice that many limit AI to improving a decision at a particular path step. Others consider the options available in that moment, within that path step. What many fail to realize is the immense potential of AI to revolutionize business decision-making by changing the flows themselves based on prior executions, and turning the comparison of those paths’ behavior into a champion-challenger capability.
In simple terms, champion-challenger testing helps businesses learn whether a promising “challenger” can defeat (deliver better outcomes than) the current “champion”. The champion is the production strategy (the current decision logic and business rules), and it competes with the challenger (a variant of that logic and those rules). A layer of AI deployed in this process compares the two outcomes on the same inputs to predict the better decision. With this idea in mind, consider a process riddled with alternate flows, where paths branch everywhere.
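The mechanics can be sketched in a few lines: route a small share of traffic to the challenger and record which strategy decided, so outcomes can later be compared on like inputs. The strategy names, rules, and field names below are illustrative assumptions, not production logic:

```python
import random

# Sketch of champion-challenger routing. The rules and field names
# (minutes, miles) are assumptions made for this illustration.
def champion(case):
    """Current production rule: long-running changes go to manual review."""
    return "review" if case["minutes"] > 15 else "auto_approve"

def challenger(case):
    """Variant rule: also consider the distance between old and new zip codes."""
    if case["minutes"] > 15 or case["miles"] > 30:
        return "review"
    return "auto_approve"

def route(case, challenger_share=0.1, rng=random.random):
    """Send a small share of traffic to the challenger, recording which
    strategy decided so outcomes can be compared later."""
    strategy = challenger if rng() < challenger_share else champion
    return {"strategy": strategy.__name__, "decision": strategy(case)}
```

Keeping the routed share small limits the business risk of an unproven challenger while still generating enough comparable outcomes to learn from.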
As you examine customer interactions, you will notice status changes that happen during each interaction, and overall you see all types of situations – some work well, some drop to an alternate path, and some to an exception. In all of them, you can monitor each step with your BPM system and track the characteristics of the situation. With a few basic tweaks to the reports, a tester can see exactly what they asked to see. But this is not a use of AI. To really use AI, the tester must treat the system as one among subject matter experts – and becoming a subject matter expert typically takes training, observation, and most importantly, practice.
Let us consider an example: if a customer interacted to change their address, you will most likely have the following data:
- How long did it take to make the change?
- Did the zip code change?
- Have they ever had the new zip code before?
- What is the distance between the two zip codes?
- Did account characteristics change from the old zip code to the new one?
- Did the credit card balance increase before the change of address?
- Was another authorized user added to the account?
- What method did they use to make the change?
- Did they interact with a CSR or use self-service?
We can compare this data with other outcomes to find the set of customers matching a specific pattern. Let us say that the pattern we want is:
- Less than 15 minutes to make the change
- 30 miles or less between zip codes
- Phone call with a CSR
- CSR #123
For this study, let us explore a particular outcome: the customer applies for a mortgage within the next 180 days.
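Under the pattern and outcome defined above, the selection reduces to a simple filter over interaction records. The field names here (minutes, miles, channel, csr_id, applied_within_180d) are hypothetical, chosen only to make the sketch concrete:

```python
# Hypothetical interaction records; field names are assumptions for this sketch.
def matches_pattern(rec):
    """True when a change-of-address record fits the target pattern."""
    return (rec["minutes"] < 15                # change took under 15 minutes
            and rec["miles"] <= 30             # zip codes within 30 miles
            and rec["channel"] == "phone_csr"  # phone call with a CSR
            and rec["csr_id"] == 123)          # handled by CSR #123

def mortgage_rate(records):
    """Share of pattern-matching customers who applied within 180 days."""
    matched = [r for r in records if matches_pattern(r)]
    if not matched:
        return 0.0
    return sum(r["applied_within_180d"] for r in matched) / len(matched)
```

Computing this rate for different candidate patterns is what lets us rank them – the basis for the CSR comparison that follows.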
What characteristics can we infer from the above data? For example, can we identify CSRs who generate a large number of applications because of the conversation? Supposing that CSR X generates more applications, we can narrow the data set to CSR X’s calls and study what CSR X does differently. Also, is there a clustering pattern in the event behaviors that identifies something other than the happy path? Let us assume there was a common interaction event the customer used to signal an intent, and that, as it turns out, outcomes from CSR X rank top in the challenger test. With this new insight, we construct a mechanism that automatically proposes that intent and makes it the challenger. We can now reproduce what CSR X does across all other CSRs. For example, CSR X’s best practice is to explain the mortgage opportunity while the change of address is happening. After adding that intent to the challenger pathway, we can observe the behavior.

Interestingly, letting a machine make choices such as this one does require oversight. For example, if we decided whether to explain the mortgage process based on zip code, we would potentially face regulatory action. Because of our implementation methods and, more particularly, our deep expertise with the solutions, we can create the transparency controls necessary for humans to keep the machines from committing regulatory errors. In fact, with this approach, we can also show exactly how the machine learned each behavior. The entire procedure is auditable, and that auditability is essential for AI to interact and work successfully within the business.
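The transparency controls described above can be sketched as a thin audit wrapper around any decision strategy: it withholds attributes that policy forbids (zip code, in this example) and records every decision with its inputs. The field names and the excluded attribute are assumptions for illustration:

```python
from datetime import datetime, timezone

def decide_with_audit(case, strategy, audit_log, blocked=frozenset({"zip_code"})):
    """Run a decision strategy on a filtered view of the case, excluding
    attributes policy forbids, and append an audit record so humans can
    trace every machine decision."""
    allowed = {k: v for k, v in case.items() if k not in blocked}
    decision = strategy(allowed)
    audit_log.append({
        "at": datetime.now(timezone.utc).isoformat(),
        "inputs": allowed,                   # exactly what the strategy saw
        "strategy": getattr(strategy, "__name__", repr(strategy)),
        "decision": decision,
    })
    return decision
```

Because the strategy never sees the blocked attributes, the audit log doubles as proof of what the machine could and could not have based its choice on.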
This interactive approach of providing alternate pathways to a customer helps us tailor challenger opportunities against the happy-path champion.
As many organizations have already deployed various BPM applications, the next step forward is to take the event markers that occur during an interaction and observe them through an AI prism. This enables us to dynamically alter process flows in a champion-challenger configuration.