This is the final installment of my posts on the Functional Fast Track session I conducted at the 21st Annual North American Shared Services and Outsourcing Week, entitled "Digitizing the P2P Process to Create Process Intelligence and Better Customer Experience."
March 8, 2017, Loews Royal Pacific, Orlando, Florida
Defining Your P2P Process for Automation
We will now look into the design of KPIs and the overarching process performance measurement system (PPMS).
While I present a typical eight-step project methodology here, there are different but equally suitable ways of setting this up. I’m sure that many of us have been involved in process transformation projects before and will be familiar with these alternatives.
Also while I have some detailed diagrams here, I will not expound too much on the methodology. Instead, I will comment on those steps that are often overlooked but are important digitization goal posts.
The most logical starting point is process mapping. The process map is a consensus between the process owner and their stakeholders about the current state of the P2P process. Using this common understanding, the manual process may then be mirrored in the proposed automated workflow. Remember also that processes are rarely static and are almost never organized neatly. Process mapping is a preparatory step that ensures we do not miss shifting process conditions as well as important but less obvious process tasks. A good illustration is how an organization manages internally when orders are split apart by a supplier due to delivery bottlenecks. In this example, the original process may not have been designed for such split deliveries, but the contingency happens often enough for internal practices to have evolved just to accommodate it.
Process mapping can uncover instances of execution failure that hold back overall process performance. For instance, different processors may not handle invoices in the prescribed, standard manner, increasing the likelihood of duplicate payments. Once these potentially troublesome process failures are known, we can institute regular monitoring and define the events that trigger remedial action.
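To make this concrete, here is a minimal sketch of the kind of duplicate-payment screening such monitoring might run. The record layout and field names (`vendor_id`, `invoice_no`, `amount`) are illustrative assumptions, not taken from any particular ERP:

```python
from collections import defaultdict

def normalize(invoice_no: str) -> str:
    """Strip punctuation and leading zeros that different processors
    often key in inconsistently (e.g. 'INV-001' vs 'inv001')."""
    cleaned = "".join(ch for ch in invoice_no.upper() if ch.isalnum())
    return cleaned.lstrip("0")

def potential_duplicates(invoices):
    """Group invoices by (vendor, normalized invoice number, amount);
    any group with more than one entry is a candidate duplicate payment
    for a processor to review."""
    groups = defaultdict(list)
    for inv in invoices:
        key = (inv["vendor_id"], normalize(inv["invoice_no"]), inv["amount"])
        groups[key].append(inv)
    return [grp for grp in groups.values() if len(grp) > 1]
```

A real screen would add fuzzier checks (near-identical amounts, date windows), but even this simple grouping catches the common case of the same invoice keyed twice in slightly different formats.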
As we mirror the original manual process in the automated workflow, we should ensure that no control point is overlooked. This is not so straightforward when process steps are compressed or re-arranged, or when (in the course of enabling system options for various users) the original simple linear process becomes layered. A case in point is where the technology makes different invoicing and payment channels accessible to vendors, creating a risk of duplicate payments that did not exist in the original manual process.
The next step involves objective-setting and prioritization. The diagram below gives a few examples of possible priorities.
Setting the baseline
The diagram below lists some questions to help set the performance tracking baseline. As part of our stock-taking, we need to diagnose our current performance measurement.
We need to be aware of the different types of performance measurement issues that may currently exist, each falling under one of five categories:
- We can have issues that relate to data collection and calculation;
- A second issue type involves prescribing the right measuring stick, meaning proper definition. For instance, we may be measuring % spend with key suppliers / total spend – in this measure, are key suppliers the ones with long-term supply contracts, are they the suppliers of a predetermined set of strategic commodities and services, or are they those with at least US$50,000 spend over the last six months?
- The third item relates to unclear targets or standards.
- The fourth issue type is the lack of corrective action when the standards are not met.
- The last category pertains to the creeping accumulation of scorecard metrics with the passage of time. Compounding this, it is not uncommon to fail to drop measures that have become irrelevant. Lastly, as processes and roles evolve, a metric can end up being reported to the wrong person. Given these, it is not hard to lose sight of what the whole body of measurements is supposed to signify.
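The second category above (definition of the measuring stick) is worth a concrete illustration. The sketch below shows how the same "% spend with key suppliers" metric yields different answers under two of the definitions mentioned; all supplier names and spend figures are made up:

```python
# Hypothetical six-month spend per supplier, in US$.
spend = {"A": 120_000, "B": 55_000, "C": 40_000, "D": 5_000}

# Definition 1: key suppliers are those with long-term supply contracts.
long_term_contract = {"A", "C"}

# Definition 2: key suppliers are those with at least US$50,000 spend
# over the last six months.
threshold = 50_000

def pct_key_spend(key_suppliers):
    """% of total spend going to the given set of key suppliers."""
    total = sum(spend.values())
    return round(100 * sum(spend[s] for s in key_suppliers) / total, 1)

by_contract = pct_key_spend(long_term_contract)                       # 72.7
by_threshold = pct_key_spend({s for s, v in spend.items() if v >= threshold})  # 79.5
```

The same organization, the same spend data, yet two materially different numbers – which is exactly why the definition must be pinned down before the metric goes on a scorecard.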
In searching for the best internal benchmarks, we need to be mindful of the different dimensions to observe for performance variations. The diagram below presents an example where commodity clusters, locations and business and functional units are relevant dimensions.
Develop performance indicators
Knowing our performance measurement issues and which performance indicators need to be modified, improved or changed, we can now look at identifying alternative metrics that would better capture P2P process performance status.
Each distinct measurement will typically fall under one of three categories: efficiency, effectiveness, and compliance measures. Taken together, the mix should paint a picture not only of how well procurement execution is working, but also of how well the component businesses are being satisfied and how well the sourcing strategy is being achieved.
After the list of possible performance indicators is identified, the attributes of those indicators need to be defined (e.g., purpose of metric, method of calculation, frequency of measurement, etc.). Next, we need to agree on the KPI selection criteria and trim the performance indicators down to the preferred list of organizational KPIs.
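One lightweight way to keep these attributes consistent across the KPI catalog is to capture each definition in a structured record. The sketch below is an illustrative assumption – the specific fields and the sample KPI are mine, not a prescribed standard:

```python
from dataclasses import dataclass

@dataclass
class KpiDefinition:
    """One entry in the KPI catalog, capturing the attributes the
    methodology asks us to define for each candidate indicator."""
    name: str
    category: str      # "efficiency", "effectiveness", or "compliance"
    purpose: str       # why the metric exists
    calculation: str   # method of calculation
    frequency: str     # frequency of measurement, e.g. "monthly"
    owner: str         # who the metric is reported to

# A hypothetical example entry.
first_time_match = KpiDefinition(
    name="First-time match rate",
    category="efficiency",
    purpose="Track the share of invoices matching the PO without manual touch",
    calculation="matched_invoices / total_invoices",
    frequency="monthly",
    owner="P2P process owner",
)
```

Keeping an explicit `owner` field also guards against the fifth measurement issue noted earlier: metrics drifting to the wrong recipient as roles evolve.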
Define data collection and processing
The next step is essentially translating the process design and process measurement plan into a set of business requirements. In this phase we define things like:
- the channels or portals by which transaction data can be received; and
- the volume, velocity, timing and variety of data expected to pass through each channel.
As we consider switching on application functionalities, we may need to add detail to the business requirements. This can be simple things like defining units of measurement – like choosing between pounds or kilograms, for instance – as part of setting data input quality standards for a user channel.
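A minimal sketch of what such an input-quality standard can look like in practice: a channel accepts weights in either pounds or kilograms and normalizes everything to kilograms on intake. The conversion table and function name are illustrative assumptions:

```python
# Accepted units for this hypothetical channel, normalized to kilograms.
UNIT_TO_KG = {"kg": 1.0, "lb": 0.45359237}

def normalize_weight(value: float, unit: str) -> float:
    """Validate the unit of measure on a transaction and convert the
    quantity to kilograms; reject anything outside the agreed standard."""
    unit = unit.strip().lower()
    if unit not in UNIT_TO_KG:
        raise ValueError(f"Unsupported unit of measure: {unit!r}")
    return round(value * UNIT_TO_KG[unit], 3)
```

Rejecting non-standard units at the channel boundary is cheaper than reconciling mixed units downstream in matching and reporting.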
Design Reporting and Visualization
After we have spelled out the application routines, we next design our reports. Our toolkit to create visibility will include dashboards, mashups, alerting, and predictions. For today, we will just lightly touch on the KPI dashboard.
The purpose of a dashboard is to provide all the relevant information quickly and concisely presented in a clear and intuitive view.
If you are using an ERP system with a built-in dashboard like Oracle’s WorkCenter, make the most of it. Built-in dashboards offer attractive graphics designed to give a snapshot of operations – yet this functionality is often switched off or scaled back. Do not reinvent the wheel by developing your own dashboard straightaway.
This completes a very quick walkthrough of the performance tracking design process. As a final note on this, allow me to highlight that performance tracking design is often drawn out and complex and, as such, always requires attention to structure and methodology.
Some things to remember
I will end with some practical tips that can smooth the progress of your digitization journey.
People need training
The P2P staff and perhaps even the key suppliers will need training. Without this, the program is doomed to under-deliver: processors will fail to take advantage of the full functionality of the tool. Formal training is best augmented by the formation of communities of practice across the implementation locations, so that participants can share best practices as these begin to emerge.
Another reason the digitization project may under-deliver is when staff are unable to respond to the alerts and absorb the information surge in the new reports. With greater procurement and accounts payable efficiency, the mix of skills required for P2P roles changes with more time being spent on dispute resolution and other analytical tasks. To the extent that the automation results in the elimination of transactional jobs, the residual organization will need training to improve the ability to tackle diverse tasks in a wider job scope.
Process intelligence allows us to discern evolving customer demand patterns, change course accordingly and reduce unwanted inventory. Our P2P staff and our key suppliers will need to be proficient within a supply chain model that leans more towards a “pull” rather than a “push” strategy. Preparatory learning is again necessary.
New process, new controls
Segregation of duties is a focus area when the automation initiative results in a much leaner organization. One issue that’s easy to overlook is how the intended segregation is bypassed through password sharing when staff go on leave. On the plus side, process supervision improves tremendously with greater visibility, allowing such capabilities as capacity planning and productivity benchmarking as part of regular operations management.
Time and again, we have seen the post-implementation disappointment, if not panic, when realized productivity gains from an automation project undershoot committed efficiency targets. A common enough response is to attempt to wring out the original FTE saves from the project by whatever means. By force fitting the project results to the original targets, however, we may be compromising process controls.
My view is that the agile or scrum framework for these projects only guarantees the early completion of the automation project but does not necessarily translate to the early realization of productivity targets. Productivity follows a j-curve. It is not reasonable to expect staff to immediately move up the learning curve in their transformed and enriched roles. Even managers need time to get used to drawing upon the enhanced process visibility to better orchestrate the activities of their teams.
One final note on undershooting project targets: One common experience we have in the smaller Asian locations of multinational companies is that the global or regional system implementation fails to incorporate requirements peculiar to the local businesses, say, the capture of GST or VAT information required by the tax authorities. Fortunately, the shared services centers in India and the Philippines offer the capability to perform the residual manual work, allowing the smaller locations to achieve some of the benefits of accessing digitized information.
Let me conclude my presentation with a few takeaways.
A well-implemented digitization project always yields solid efficiency gains. But more important than this, digitization enables many reporting and analytics tools. This, in turn, empowers P2P process owners to interpret information in a timely manner and promptly fine-tune operations.
Process Intelligence derives directly from how well you select and define your KPIs: You can analyze, compare, trend, correlate information, and draw conclusions based on what you intended to observe and highlight in the first place. You know that your process intelligence is effective when performance indicators track in real-time how well the P2P process contributes to the organization’s objectives.
Armed with process intelligence, it is now up to our P2P process owners not just to manage capacity, exercise control, and achieve process stability and sustainability, but also to progress towards an agile, flexible, and customer-demand-driven procurement model.
Credits and thanks to Ms. Iris Celeste Brem, whose Master’s thesis "Keeping Track of the Performance of the Purchase-to-Pay Process of Philips Lighting" (December 2015) described in great detail the eight-step project methodology used for this presentation, as well as a couple of the fantastic insights shared here.