Tens of thousands of purchases, transparent as glass: untangling the knot

    It’s not easy to clean up a large bank’s procurement, especially when it is split across two independent systems, an ERP and an EDMS. When VTB and VTB24 merged, their information systems were unified as well, and the single procurement process now runs through all of them. What to do? Process Mining came to the rescue: one of the most interesting technologies for researching, analyzing and monitoring business processes, but also one that is very hard to apply well.

    Process Mining is an approach to analyzing business processes using advanced data collection and processing technologies. We have seen many expensive, large-scale projects that took on process analysis with Process Mining. Even when those projects were completed, in 80% of cases the beautiful diagrams they produced went unused. But the sad statistics didn’t scare us, and we decided to untangle our knot of processes with Process Mining anyway. Details under the cut.

    As we have already said, the implementation was complicated primarily by the fact that after the merger of VTB and VTB24, the bank’s procurement process runs through several information systems, each responsible for a different stage of the process. In addition, we had to account for historical data from a decommissioned system. As a result, we ended up with a diverse set of data sources: an IBM Lotus database, an MS SQL database, an Oracle database, and SAP (integrated via RFC). To complete the picture, the sources sit in different network segments, which also had to be taken into account in the solution architecture and integration methods. By the way, we have a separate post about merging the banks’ network segments. But back to the procurement business processes.
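    Records from such different systems arrive in very different shapes, so before any mining can happen, each one has to be mapped into a common event-log row (case, activity, timestamp). A minimal sketch of that mapping; the field names here are hypothetical, not the real schemas of the bank's systems:

```python
from datetime import datetime

# Hypothetical raw record from one of the source systems (e.g. the Oracle DB);
# real field names differ per system and are mapped in the collection layer.
raw = {"purchase_id": "P-0001", "status": "Request created", "ts": "2019-03-01T09:15:00"}

def to_event(source: str, rec: dict) -> dict:
    """Map a source-specific record to the unified event-log row:
    one purchase = one case, one status change = one event."""
    return {
        "case_id": rec["purchase_id"],
        "activity": rec["status"],
        "timestamp": datetime.fromisoformat(rec["ts"]),
        "source": source,  # provenance, kept for reconciliation later
    }

event = to_event("oracle", raw)
print(event["case_id"], event["activity"])
```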

    In fact, the good intention of putting the business processes in order decomposed into two tasks:

    • restore the business process from data across all sources, for subsequent data-driven optimization;
    • calculate key performance indicators (KPIs) of the process, for reporting to management.
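    In event-log terms the two tasks are closely related: the same (case, activity, timestamp) triples yield both the process graph and the KPIs. An illustrative sketch on toy data (the statuses are made up, not the bank's real ones):

```python
from collections import Counter
from datetime import datetime, timedelta

# Toy event log: (case_id, activity, timestamp) — illustrative statuses only.
log = [
    ("P-1", "Created",   datetime(2019, 1, 1)),
    ("P-1", "Approved",  datetime(2019, 1, 3)),
    ("P-1", "Completed", datetime(2019, 1, 9)),
    ("P-2", "Created",   datetime(2019, 1, 2)),
    ("P-2", "Approved",  datetime(2019, 1, 8)),
]

# Task 1: restore the process — the directly-follows relation between steps.
by_case = {}
for case, act, ts in sorted(log, key=lambda e: (e[0], e[2])):
    by_case.setdefault(case, []).append(act)
edges = Counter((trace[i], trace[i + 1])
                for trace in by_case.values() for i in range(len(trace) - 1))

# Task 2: a KPI — mean lead time of the cases that reached the final status.
durations = [max(ts for c, _, ts in log if c == case) -
             min(ts for c, _, ts in log if c == case)
             for case, trace in by_case.items() if trace[-1] == "Completed"]
mean_lead = sum(durations, timedelta()) / len(durations)

print(edges[("Created", "Approved")], mean_lead.days)
```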

    The technological implementation in the bank includes the following components. The Process Mining platform is built on Celonis; the data collection component is Pentaho DI + PostgreSQL; data storage and data marts run on the Vertica columnar database. The Pentaho DI + PostgreSQL bundle lets us centrally collect and process data from the sources (IBM Lotus, Oracle, MS SQL, SAP RFC). Vertica is a powerful column-oriented database that stores data in compressed form and handles large analytical queries quickly. That is why Vertica serves as the data source for Celonis, which takes the data model for automated business process discovery and subsequent analysis.
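    To make the warehouse side concrete, here is a rough sketch of what the event-log table and a bulk load into Vertica could look like. The table and column names are hypothetical; in the real pipeline the staging and loading is orchestrated by Pentaho DI:

```python
# Hypothetical DDL for the unified event-log table in Vertica.
DDL = """
CREATE TABLE IF NOT EXISTS procurement_events (
    case_id   VARCHAR(64)  NOT NULL,   -- purchase identifier
    activity  VARCHAR(256) NOT NULL,   -- process step / status
    ts        TIMESTAMP    NOT NULL,
    source    VARCHAR(32)  NOT NULL    -- originating system (lotus/oracle/...)
)
ORDER BY case_id, ts;                  -- sort order helps per-case scans
"""

def copy_stmt(staged_file: str) -> str:
    """Vertica bulk-load statement for one staged CSV extract."""
    return (f"COPY procurement_events FROM LOCAL '{staged_file}' "
            f"DELIMITER ',' DIRECT;")

print(copy_stmt("/stage/oracle_events.csv"))
```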

    Our key tool is Celonis, which we use for Process Mining. It has rich built-in visualization and analytics, which can be extended through its Python API, opening the door to all modern data analysis approaches.

    Overall, each of the components we chose performs its own task perfectly, and together they fit into a single solution. The new platform lets us provide Process Mining as a service, with an adjustable level of detail and data refresh frequency. For some of the bank’s tasks we deliver data for Process Mining every 15 minutes; for this task, there is no need to refresh the data more than once a day.

    Celonis makes it very convenient to build Excel reports from analytical views, which keeps every calculation of the tool transparent. We concluded that alongside each implemented KPI it is convenient to have, on the same sheet, the complete list of transactions (events) from which that KPI was calculated. As a result, we solve analytical and internal reporting tasks in parallel, which is an important advantage.
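    The idea of keeping a KPI and its underlying events side by side can be sketched like this (plain CSV here instead of Excel, and illustrative column names):

```python
import csv
import io

# Illustrative events behind one KPI (mean approval time, in days).
events = [
    {"case_id": "P-1", "activity": "Approval", "days": 2},
    {"case_id": "P-2", "activity": "Approval", "days": 6},
]
kpi = sum(e["days"] for e in events) / len(events)

buf = io.StringIO()
w = csv.writer(buf)
w.writerow(["KPI: mean approval time (days)", kpi])  # the number itself...
w.writerow([])                                       # blank separator
w.writerow(["case_id", "activity", "days"])          # ...and its evidence
for e in events:
    w.writerow([e["case_id"], e["activity"], e["days"]])

report = buf.getvalue()
print(report.splitlines()[0])
```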

    A digital business process model assembled this way lets you detect repeated approval loops, delays in particular statuses, inefficient or overloaded performers, the best and worst units in terms of KPIs, and much more. Analyzing information from a process perspective makes it easy to move from analysis of the numbers to optimization. We can view information on each purchase in Celonis together with its entire change history; previously this would have meant visiting nearly a dozen systems.
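    Two of those checks, approval loops and time spent in a status, fall straight out of the event log. A minimal sketch on one toy purchase (statuses are illustrative):

```python
from collections import Counter, defaultdict
from datetime import datetime, timedelta

# Toy trace of one purchase, with a repeated approval step.
trace = [
    ("Created",             datetime(2019, 2, 1, 9)),
    ("Sent for approval",   datetime(2019, 2, 1, 12)),
    ("Returned for rework", datetime(2019, 2, 4, 10)),
    ("Sent for approval",   datetime(2019, 2, 5, 15)),
    ("Approved",            datetime(2019, 2, 7, 11)),
]

# Approval loops: any activity that occurs more than once within a case.
loops = {act: n for act, n in Counter(a for a, _ in trace).items() if n > 1}

# Time spent in each status = gap until the next status change, summed.
time_in_status = defaultdict(timedelta)
for i, (act, ts) in enumerate(trace[:-1]):
    time_in_status[act] += trace[i + 1][1] - ts

longest = max(time_in_status, key=time_in_status.get)
print(loops, longest)
```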

    With Process Mining we can analyze both a specific purchase and any sample of interest, by type, unit or other parameters, over time. This makes it easy to identify inefficient process steps or, for example, find out why the process deviates from a given model. This is how we learned for certain that contract negotiation is usually one of the longest steps in the process. We also calculated the percentage of purchases that never reach a final status, and identified the reasons for this.
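    The share of purchases that never reach a final status is a simple aggregate over case traces, and looking at the last status reached hints at where they drop out. A sketch with toy traces (illustrative statuses again):

```python
# Toy traces: the sequence of statuses each purchase passed through.
traces = {
    "P-1": ["Created", "Approved", "Completed"],
    "P-2": ["Created", "Approved"],   # stalled after approval
    "P-3": ["Created", "Rejected"],   # dropped out early
    "P-4": ["Created", "Approved", "Completed"],
}

FINAL = "Completed"
unfinished = {c: t[-1] for c, t in traces.items() if t[-1] != FINAL}
share = len(unfinished) / len(traces)

# The last reached status hints at where cases drop out of the process.
print(f"{share:.0%} of purchases never reach '{FINAL}': {unfinished}")
```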

    Going further, Process Mining lets us not only identify problems from multifaceted statistics, but also discover the best paths through procurement, and understand why not everyone uses them.
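    One common way to surface such best paths is variant analysis: group cases by the exact sequence of steps they took and look at the frequent ones. A toy sketch:

```python
from collections import Counter

# Toy variants: each tuple is the exact path one purchase took (illustrative).
paths = [
    ("Created", "Approved", "Completed"),
    ("Created", "Approved", "Completed"),
    ("Created", "Returned", "Approved", "Completed"),
]

# A "variant" is a distinct path through the process; a frequent, short
# variant that still reaches the final status is a candidate best path.
variants = Counter(paths)
best_path, freq = variants.most_common(1)[0]

print(best_path, freq)
```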

    Okay, Process Mining is great, but what about the project’s specific objectives? We handled the first of them successfully and on schedule. Initially we only had to restore the business processes for procurement by the Department of Information Technology, but after seeing the first results, the internal customer asked us to scale the solution to all of the bank’s purchases. We managed to do this without shifting the agreed deadlines.

    The second task, calculating KPIs, was not so simple. The strict error tolerances for the KPI calculations required high quality of the collected data: 96–98% relative to the sources. This quality was not achieved immediately; it took time for the finance department to walk us through the specifics of the business process. The bank’s Process Mining competence center and the finance department jointly identified low-quality data and peculiarities of technical implementations that sometimes distorted the process models.
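    A quality target like 96–98% implies a reconciliation step. A minimal sketch of the idea, with toy counts; in practice such figures would come from COUNT(*) queries against each source system and the warehouse copy:

```python
# Toy record counts per source system vs. the warehouse copy.
source_counts    = {"oracle": 12000, "lotus": 8000, "mssql": 5000}
warehouse_counts = {"oracle": 11800, "lotus": 7750, "mssql": 4900}

def quality(src: dict, wh: dict) -> float:
    """Share of source records that made it into the warehouse, in percent."""
    return 100.0 * sum(wh.values()) / sum(src.values())

q = quality(source_counts, warehouse_counts)
assert 96.0 <= q <= 98.0, f"quality {q:.1f}% outside the target band"
print(f"{q:.1f}%")
```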

    As a result of the project, we ended up among the lucky 20% whom Process Mining really helped. And it is not luck. Building a process model from real data that is refreshed daily, calculating process indicators and presenting it all in beautiful, convenient analytical views is only part of the story. Many projects overlook the one thing without which no Process Mining will work: data quality. We did a lot of work with the internal customer to improve data quality, so that our system can not only run analysis but also prepare regular reports for important management decisions.

    As a result of the project, our very understanding of Process Mining changed slightly. It is an approach to collecting disparate information about a process and analyzing it in depth with modern tools. Moreover, it is an approach built on the continuous, sequential collection, recording and analysis of events from information systems about the target object of research and its evolution as it moves through the process.

    Our solution based on Process Mining technology has proven useful to a large number of different users involved in the procurement process. Within a single system they can now deeply analyze these processes, monitor the status of specific purchases and KPIs, and, finally, automate reporting. In numbers: the introduction of Process Mining, together with a package of measures by the finance department, reduced procurement process time by 25%, while the total number of purchases tripled.

    Celonis has a rich marketplace of paid add-ons, but we concluded that it is better to develop our own custom tools with the Celonis Python API. We will share that experience in future articles.
