Digitisation is about more than storing analogue content, such as books, songs or films, on digital media. It is a megatrend, enabled by several new technologies – sensor systems, mobile communications and ultra-fast data processing – that ultimately synchronises physical and digital realities in real time.
This development is being driven not by technology addicts and nerds, but by tangible economic advantages: computer-based processing of real-time data creates benefits that are measurable in financial terms. One of the classic SAP Hana business cases in preventive maintenance involves sensors located in the housing of a pump that measure temperature, moisture and vibration and transmit these values to an analytics system in real time.
High-performance statistical algorithms use event-based rules to forecast the probability that a specific pump might fail within the next 24 hours. Maintenance alerts or orders are generated immediately, as and when required. The site technician who receives the order feeds the actual condition of the pump back into the system and enables the analytics application to learn from accurate or inaccurate forecasts. The potential benefits of this approach are obvious: shorter pump downtimes, fewer false positive alerts and defective pumps, and significantly lower operating costs.
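To make the idea concrete, here is a minimal Python sketch of such an event-based rule. It is not SAP ESP code; the logistic weights, field names and the 0.8 alert threshold are purely illustrative assumptions.

```python
import math

def failure_probability(temp_c, moisture_pct, vibration_mm_s):
    """Toy logistic model of pump failure risk; weights and offsets are
    illustrative only, not calibrated values."""
    score = (0.08 * (temp_c - 70)
             + 0.05 * (moisture_pct - 40)
             + 0.9 * (vibration_mm_s - 4))
    return 1 / (1 + math.exp(-score))

def maintenance_alert(reading, threshold=0.8):
    """Turn one sensor reading into a probability plus an alert decision."""
    p = failure_probability(reading["temp_c"],
                            reading["moisture_pct"],
                            reading["vibration_mm_s"])
    return {"probability": p, "alert": p >= threshold}
```

In a real deployment the weights would come from the rule detection engine described below, not from hand-tuning.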
ROI does not fall from the sky
The potential is enormous, but the hoped-for ROI won’t just fall from the sky. What is needed in both engineering departments and IT are new solution architectures, qualifications and mind-sets.
Our pump scenario includes several sub-systems (see figure): (1) the pump with its sensors, (2) an event stream processor, such as SAP ESP, that computes probabilities of failure and creates maintenance alerts, (3) an ERP solution for processing maintenance orders and receiving feedback data and (4) a rule detection engine that generates and continuously improves the algorithms in the ESP.
Event stream processing
Event streams deliver large amounts of data whose quality is often uncertain. Prior to further processing, such data need to be complemented and/or adjusted. The art of designing a solution architecture is to strike a balance between high data throughput and high error tolerance. Both specially designed architecture models (such as the lambda architecture) and specific programming paradigms are used for this purpose.
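A minimal Python sketch of this trade-off, with purely illustrative field names: events missing an essential reading are dropped to keep throughput high, while merely incomplete events are complemented with defaults.

```python
# Default values used to complement incomplete events (illustrative).
DEFAULTS = {"moisture_pct": 40.0}

def cleanse(stream):
    """Complement missing optional fields; discard events missing essential
    ones. A deliberately simple throughput-vs-tolerance trade-off."""
    for event in stream:
        if "vibration_mm_s" not in event:   # essential reading missing: drop
            continue
        yield {**DEFAULTS, **event}         # fill optional gaps with defaults
```

In a lambda-style architecture, a stricter version of this cleansing would run in the batch layer while the speed layer applies only the cheap checks shown here.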
Closed feedback loops
ESP and ERP data are merged in the rule detection engine to conduct further analyses and to be used as a basis for improving the rules in the event stream processor. Historical information thus indirectly influences future maintenance orders, resulting in a closed feedback loop.
All these steps happen automatically, continuously and – more or less – without delay or latency. The overall system is therefore capable of responding flexibly to any new insights or changes in the “behaviour” of the pumps, whatever the cause may be.
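As a toy illustration of such a self-learning rule (not the actual rule detection engine), the alert threshold can adjust itself from technician feedback: false alarms push it up, missed failures push it down.

```python
class ThresholdRule:
    """Self-adjusting alert threshold driven by technician feedback.
    Step size and bounds are illustrative assumptions."""

    def __init__(self, threshold=0.8, step=0.02):
        self.threshold = threshold
        self.step = step

    def feedback(self, predicted_alert, pump_was_defective):
        if predicted_alert and not pump_was_defective:
            # False positive: become more conservative.
            self.threshold = min(0.99, self.threshold + self.step)
        elif not predicted_alert and pump_was_defective:
            # Missed failure: become more sensitive.
            self.threshold = max(0.01, self.threshold - self.step)
```

The essential point is the closed loop itself: every confirmed or refuted forecast flows back and changes future alerting behaviour without manual intervention.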
Closed control or feedback loops are not a new thing – many process control systems use them. The true innovation in this scenario is fully digitised business models: such feedback loops exist not only for process steps in manufacturing but also for complete, end-to-end business processes. Their underlying rules are not rigid but can be designed so that they are self-learning.
Processing event streams and self-learning feedback loops alone pose a real challenge to seasoned Abap developers. Things become even more complex when solutions use sophisticated statistical algorithms, including Bayesian networks, Arima/Armax and latent variable models.
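The core operation behind, say, a Bayesian approach can be sketched in a few lines of Python; the prior and likelihoods below are illustrative assumptions, not output of SAP PAL.

```python
def bayes_update(prior, p_evidence_given_failure, p_evidence_given_ok):
    """Posterior failure probability after observing one piece of sensor
    evidence, via Bayes' rule."""
    numerator = p_evidence_given_failure * prior
    denominator = numerator + p_evidence_given_ok * (1 - prior)
    return numerator / denominator
```

Observing evidence that is far more likely under failure than under normal operation raises the failure belief; full Bayesian networks chain many such updates across dependent variables.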
For both SAP BW and S/4, SAP has pushed the door wide open to the world of statistics by introducing HAP (Hana Analysis Process) and SAP PAL (Predictive Analysis Library) and by integrating R. Seasoned programmers will in future be replaced by data scientists – many universities already offer related master’s degree courses, and their alumni will be working either for you or for your competitors ten years down the line.
Induction rather than deduction
Nowadays, organisations operate in an unstable environment. Behavioural patterns of customers and the associated decision rules for business processes change by the day, hour or minute, rather than at annual intervals. This is why the architectures outlined above need to be extremely flexible whilst always ensuring full control over governance and compliance processes. This will work only if (a) the focus is exclusively on correlations instead of explaining cause-effect relationships and if (b) processes are data-driven instead of trying to impose one’s own world view on data (i.e. induction rather than deduction). Changing these mind-sets takes a lot more than just migrating from ERP 6.0 to S/4 at the purely technical level.
IT as a composer
The CIO can play a key role in this process by accelerating the evolution of the organisation. With or without outsourcing, this is no longer about doing all development work in-house. Rather, the organisation needs to be continuously improved so that it can efficiently manage external providers and monitor them in a results-driven manner.
However, “results-driven” does not refer to ITIL service levels, but means that IT service providers are benchmarked against quality criteria for the performance of algorithms (in our example, accuracy in predicting pump failure). How such algorithms work in detail is of secondary significance. At best, in-house employees will have developed an understanding of the underlying processes. Crowdsourcing portals for developing algorithms, such as www.kaggle.com, or SAP’s own Idea Marketplace (ideas.sap.com) indicate where we are heading.
Striking a balance between agility and stability will only be possible if in-house IT experts move from the “musician” to the “composer” stage in terms of their skillset. Instead of cramming every conceivable functionality into Z programs or BAdIs, the openness of a platform like Hana can be utilised to integrate other solutions, for instance via Extended Application Services (XS).
Both simple tasks, like spreadsheet-based reporting, and highly specific work steps, such as calculating travel times as part of real-time customer segmentation, can thus be shifted to other applications. At the same time, when structuring data flows, care must be taken to clarify where persistent structures are needed (such as the new ADSO in the Hana-based BW) and which parts should be mapped via virtual (for instance Open ODS View) or hybrid (e.g. Composite Provider) structures.
At the end of the day, it is the CIO’s responsibility to ensure that their organisation makes the related decisions and is capable of effortlessly combining tools such as crowdsourcing or SaaS.
Conventional ERP systems have a deterministic structure. Admittedly, order probabilities can be stored in the SAP CRM system at quote line-item level, and a quote may, or may not, subsequently turn into a customer order. Yet the subsequent steps, such as material requirements planning, are not controlled by probabilities – a severe limitation, given that Hana is a platform that includes a wide range of stochastic functions.
In future, it will be necessary to make decisions in a partly automated setting – not starting from assumptions (which are probably wrong) but based on calculated probabilities that come much closer to reality. Our pump scenario is a case in point: given the limited number of available service technicians, maintenance alerts will be triggered only above a certain computed, continuously adjusted probability threshold.
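A minimal sketch of such capacity-driven alerting, with hypothetical field names: the effective threshold is simply the failure probability of the lowest-ranked pump that still gets a technician, so it adjusts itself as risks and staffing change.

```python
def dispatch(pumps, technicians_available):
    """Trigger maintenance orders only for the highest-risk pumps that the
    available technicians can actually handle."""
    ranked = sorted(pumps, key=lambda p: p["probability"], reverse=True)
    return ranked[:technicians_available]
```

Run daily (or continuously), this turns a fixed alert threshold into one implicitly derived from capacity.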
Hana is an invaluable tool for digitising business processes. To unleash its full potential, architectures, requirements for employees and traditional mind-sets need to be rethought and redesigned from the ground up. The focus of IT is shifting from simply agreeing on service levels to composing an entire symphony of applications and partners.
The next article in our short series will deal with identifying value-adding business cases for the “digitisation” with Hana.