Digital transformation has changed the way we work together, revolutionizing business processes and models. Digitalization has become part of our everyday life, so much so that imagining a life without it seems impossible. Artificial intelligence (AI) is the new shiny thing that everyone wants to play with, and the magic word is data.
Since May 25, 2018 (the introduction of the European General Data Protection Regulation, or GDPR for short), there has been no doubt about the tremendous importance of data. Companies and websites use it every day to analyze (and subsequently capitalize on) our behavior as consumers. For example, I recently searched the web for a purse that my wife wanted for her birthday – it didn’t even take a minute for ads promoting all kinds of women’s purses to crop up.
AI goes much further: Powerful algorithms hidden behind state-of-the-art software recognize correlations between the burning rainforests in Brazil and a potential terrorist attack in Afghanistan. AI algorithms are most effective at recognizing patterns and drawing conclusions from them, for example in the form of concrete actions or recommendations.
What distinguishes one of the most important subcategories of artificial intelligence – machine learning (ML) – is not only the recognition of patterns, but also the ability to train itself to behave in a specific way. Its functionality goes far beyond conventional Boolean operators; machine learning is real learning based on data sets.
Today, the conditions for artificial intelligence and machine learning are better than ever before. Three circumstances have made this possible, first and foremost big data. More data has been generated in the past three years than in the entire previous history of the Internet. For the year 2020, we assume 44 zettabytes of data – in numbers, that is 44 × 10^21 bytes = 44 000 000 000 000 000 000 000 bytes.
This much data of course requires a ton of computing power. With Moore’s Law in mind (the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years), we know that computing power has increased rapidly since the AI topic began gaining traction in 1956 – which is the second circumstance making the conditions for AI better than ever before. My current calculator has more processing power than my first C64.
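As a rough back-of-the-envelope sketch of what that doubling means over the period from 1956 until today (not a precise model of actual hardware), the growth factor can be computed directly from the two-year doubling rule:

```python
# Back-of-the-envelope sketch of Moore's Law: transistor counts double roughly every two years.
# This only illustrates the compounding effect; it is not based on real chip data.
start_year, end_year = 1956, 2020
doublings = (end_year - start_year) / 2          # one doubling every two years
growth_factor = 2 ** doublings                   # total growth over the period
print(f"{doublings:.0f} doublings -> growth factor of roughly {growth_factor:.2e}")
```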
The third circumstance concerns algorithms. Development environments like Jupyter Notebook, in combination with open source libraries, are easily accessible to everyone – implementing algorithms is no longer an insurmountable hurdle.
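To give a feel for how low that hurdle has become, the following minimal sketch trains and evaluates a classifier with scikit-learn on one of its bundled example data sets; it can be run as-is in a Jupyter Notebook cell, assuming scikit-learn is installed:

```python
# Minimal sketch: train and evaluate a classifier on scikit-learn's bundled iris data set.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = RandomForestClassifier(random_state=42)
model.fit(X_train, y_train)          # the model "learns" patterns from the training data
print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))
```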
Digital Innovation Lab for AI
There are several approaches for different aspects of data management and the AI lifecycle. However, assembling and fitting them to your own needs can be a challenging experience.
That’s why we created the Digital Innovation Lab, which gives us all the tools we need to identify, test and capitalize on trends. It helps us illuminate, question and test modern technologies to their limits. The Digital Innovation Lab is about questioning the status quo, exploring new possibilities and discovering how we and our customers can work in a more target-oriented, standardized manner. At our core, however, we are and will remain SAP consultants – which is why we have, of course, already tested all imaginable scenarios in SAP Data Intelligence 3 (even though it has only been available to the general public since March 2020).
We need the right information as well as the right quantity and the right quality of data to achieve the right results. Of course, we do not know what the results will be in advance – our clients are often surprised by their own data. While we define the use case and vision in advance, we keep an open mind about what is coming. I can confidently say that the whole process of data exploration and feature engineering needs a lot of brainpower – you are forced to think for yourself. Even if the algorithms are predetermined, business users have to think about the resulting insights and derive actions from them. Artificial intelligence does not abolish (human) thinking; it merely enhances it.
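As a small, hypothetical illustration of that feature-engineering step (the data, column names and the chosen feature are invented for the example), a meaningful feature is often a simple, human-designed transformation of raw columns that the algorithm could never decide on by itself:

```python
# Hypothetical feature-engineering sketch with pandas: data and column names are invented.
import pandas as pd

orders = pd.DataFrame({
    "customer": ["A", "A", "B", "B", "B"],
    "order_value": [120.0, 80.0, 40.0, 60.0, 55.0],
    "returned": [0, 1, 0, 0, 1],
})

# A business user might decide that the return rate per customer is a meaningful feature.
features = orders.groupby("customer").agg(
    avg_order_value=("order_value", "mean"),
    return_rate=("returned", "mean"),
)
print(features)
```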