High-quality data is the basis for better decisions – companies have to be able to trust the data if they are to generate added value and thereby achieve a competitive advantage. A holistic data strategy and a central analytics platform are essential for this.
Of the large amounts of data generated in companies, often only a small part can be used profitably. Some records may be unavailable because they are stored on fragmented systems. Others might not be in the required format or may be of dubious quality. If there are deficiencies in data quality, availability or completeness, companies cannot be confident of achieving reliable analytical results. Remedying this usually means adapting processes to a new system – or re-planning them entirely.
ReThink workshop
In the ReThink workshop, you can determine your digital maturity and plan your digital journey.
Added value thanks to a centralised data platform
Companies need to develop a data strategy that allows them to extract added value from their data. ‘We have found that many companies maintain various data silos in different departments which they use to run separate analyses. The results can then contradict each other or may be inadequate. Companies are far from using the full potential of their data,’ says Ioannis Theocharis, Data & AI Consultant at Swisscom. Trustworthy results can only be obtained from trustworthy data.
Ideally, companies should use the cloud to aggregate and consolidate all relevant data from different sources across their organisations and make it available for analysis so that it can form the basis for business decisions. For this to be achieved, data must be transparently collected, classified, structured and homogenised. ‘This effort can discourage companies from aggregating data on a central analytics platform,’ says Dave Schadock, Team Leader Data & AI Consulting at Swisscom. When a new data platform is introduced, for example in the form of a cloud migration, and the previous data architecture and structure are replaced, the new setup often initially serves the same business function as the old system. The two data experts therefore find that it is not unusual for customers to doubt the benefits or to question the point of such a major upgrade. ‘Never change a running system’ can be the refrain.
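To make the idea of collecting, structuring and homogenising data a little more concrete, the following minimal sketch (in Python with pandas) shows how two departmental exports could be mapped onto one shared schema before being loaded onto a central platform. The file names, column names and currency field are purely illustrative assumptions, not a description of any real customer setup.

```python
# Minimal sketch: consolidating two hypothetical departmental exports.
# File and column names are illustrative assumptions only.
import pandas as pd

def homogenise(df: pd.DataFrame, column_map: dict, source: str) -> pd.DataFrame:
    """Rename columns to a shared schema, normalise types and tag the source."""
    out = df.rename(columns=column_map)
    out["order_date"] = pd.to_datetime(out["order_date"])   # one date format
    out["revenue_chf"] = out["revenue_chf"].astype(float)   # one unit/currency
    out["source_system"] = source                           # keep lineage transparent
    return out[["customer_id", "order_date", "revenue_chf", "source_system"]]

sales = homogenise(pd.read_csv("sales_export.csv"),
                   {"CustID": "customer_id", "Date": "order_date", "Umsatz": "revenue_chf"},
                   source="sales")
erp = homogenise(pd.read_csv("erp_export.csv"),
                 {"customer": "customer_id", "booked_on": "order_date", "amount": "revenue_chf"},
                 source="erp")

# One consolidated, uniformly structured table, ready for the central analytics platform
consolidated = pd.concat([sales, erp], ignore_index=True)
```

The point of the sketch is less the code itself than the discipline it stands for: every source is mapped explicitly, and the origin of each record remains visible after consolidation.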
Identifying added value and reducing initial resistance
The potential of any new platform must be communicated in order to allay doubts. ‘The return on investment inevitably follows once the platform is being intensively used. The many new opportunities that can be exploited pay off in the long term,’ Schadock continues. He points to various successful customer projects. Geobrugg AG, for example, has been able to achieve a 15% increase in efficiency and quality. It has also eliminated the time-consuming manual maintenance of a very extensive collection of Excel spreadsheets.
There are different routes to an analytics platform in the cloud. Rather than tackling a full-scale transformation all at once, which can be time-consuming and complex, a new solution can also be implemented gradually with the right data strategy. ‘It’s often worth migrating the most important things first, such as the core business processes, and implementing new use cases on the platform. You can then leave other applications running in parallel in the old system and migrate them to the new architecture at a later stage,’ explains Ioannis Theocharis. With this pragmatic approach, new features and possibilities can be implemented one after the other while legacy systems remain in use. The discontinuities caused by handing data between different, distributed systems are steadily reduced as more and more processes are migrated to the central data platform.
Using AI while still applying critical thinking
When data from existing in-house systems is used for an analytics platform, trustworthiness is usually a given. ‘Aggregated data from different sources that have been inspected and approved by departments can also be given a quality seal,’ says Dave Schadock. In this way, the sources and the processes behind them can be seen as transparent and trustworthy. On the other hand, if external data sources that cannot be fully inspected are integrated via APIs – such as social networks, e-mails, weather data, etc. – a serious review of the data obtained is essential.
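What such a review might look like in practice can be sketched with a simple plausibility check. The example below assumes weather records pulled from an external API; the field names and thresholds are invented for illustration and would need to be adapted to the actual interface.

```python
# Minimal sketch of a plausibility check for externally sourced records,
# e.g. weather data obtained via an API. Field names and value ranges are
# illustrative assumptions, not a real interface.
from datetime import datetime

def review_weather_record(record: dict) -> list[str]:
    """Return a list of issues; an empty list means the record passes the basic review."""
    issues = []
    if not record.get("station_id"):
        issues.append("missing station_id")
    temp = record.get("temperature_c")
    if temp is None or not -60 <= temp <= 60:
        issues.append(f"implausible temperature: {temp}")
    try:
        datetime.fromisoformat(record.get("measured_at", ""))
    except ValueError:
        issues.append("invalid or missing timestamp")
    return issues

incoming = [{"station_id": "ZRH", "temperature_c": 18.4, "measured_at": "2024-05-01T12:00:00"},
            {"station_id": "", "temperature_c": 184.0, "measured_at": "yesterday"}]

# Only records that pass the review are forwarded to the central platform
accepted = [r for r in incoming if not review_weather_record(r)]
rejected = [r for r in incoming if review_weather_record(r)]
```

Even a lightweight gate like this makes the difference between blindly ingesting external data and ingesting it with a documented, repeatable check.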
The same applies when using external AI models trained on data from unknown sources. ‘AI models should be used in a controlled manner and the results interpreted with expertise,’ says Dave Schadock. Many users still overestimate the accuracy of AI results. For now, at least, the output generated by ChatGPT and other large language model (LLM) applications must still be reviewed. ‘These LLMs should merely be considered co-pilots that need a certain degree of supervision – and not autopilots that can be allowed to work completely independently. You always have to think critically and interpret the results, placing them within your company’s overall context.’
According to Ioannis Theocharis, the result is only ever as good as the data itself. A final quality check or a suitable quality seal is essential. Ultimately, there is always a risk of distorted results if the analysis is based on insufficient or incomplete data. ‘Some customers want a technology that can take useless data and improve its quality at the touch of a button – that is, turn junk data into trustworthy information.’ Unfortunately, that is still a long way off, and companies need to be selective and process data professionally to ensure they have a good dataset that enables reliable analysis.
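A final quality check of this kind does not have to be elaborate. The sketch below, again in Python with pandas and with invented column names and thresholds, shows one way a dataset could be summarised and flagged before anyone bases an analysis on it.

```python
# Minimal sketch of a final quality gate before analysis: the dataset is only
# released if completeness and duplicate thresholds are met. Column names and
# thresholds are illustrative assumptions.
import pandas as pd

def quality_report(df: pd.DataFrame, required: list, max_missing: float = 0.05) -> dict:
    """Summarise completeness and duplicates instead of silently analysing bad data."""
    missing_share = df[required].isna().mean()    # share of missing values per column
    duplicate_share = df.duplicated().mean()      # share of fully duplicated rows
    passed = bool((missing_share <= max_missing).all() and duplicate_share == 0)
    return {"missing_share": missing_share.to_dict(),
            "duplicate_share": float(duplicate_share),
            "passed": passed}

df = pd.DataFrame({"customer_id": [1, 2, None], "revenue_chf": [100.0, None, 250.0]})
report = quality_report(df, required=["customer_id", "revenue_chf"])
# report["passed"] is False here: a third of the values are missing, so the
# dataset should be cleaned or completed before any analysis is trusted.
```

The report does not improve the data, it only makes its shortcomings visible – which is exactly the selectivity and professional handling the two experts describe.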
ReThink workshop
Let’s ReThink – transformation means trust: in digitisation, in employees and in your own management. In the ReThink workshop with our experts, you will learn more about your digital maturity and the potential of your company.