Take your IT Production to the next level thanks to data insights

Top real-world experts share their insights with you.

DATA | IT PRODUCTION

By Oksana Biens, 16/12/2022
Conversation with market leaders about data

How to break silos, connect data across your entire organisation and leverage it to improve the level of service you deliver to your clients?

This conference organised during the Alenia Production Tour 2022 gathered multiple CIOs, data leaders and operational experts, including Laurent Demeestere, machine learning engineer at Engie Digital, Guillaume Besson, strategy manager at AI Builders and Nathalie Bouillé, regional director at Dynatrace. It was hosted by Oksana Biens, data expert at Alenia.

We touched on topics like data-driven IT production, the importance of leveraging data in your decision-making, best-in-class sectors, data quality, the main challenges encountered by our guests and much more. We hope you enjoy this conversation.

Oksana Biens, Alenia - Hello everybody! Let’s start with the basics. How would you define data-driven IT production?

Guillaume Besson, AI Builders - Data-driven IT production is a buzzword that covers several realities. Essentially, we use business intelligence, AI and advanced data analytics to understand what is going on within IT systems. We have seen two phases in the development of data-driven solutions: a period of innovation (from 2016 to 2019), when we witnessed a shift from considering data as a cost to viewing it as an asset and discovered all the possibilities it can offer. Since 2019, the solutions for generating insights have become more mature, and we are starting to realise the full potential of IT-related data.

OB - Thank you Guillaume. Nathalie, you and your colleagues at Dynatrace accompany clients in implementing solutions to generate value from data. What is your definition of data-driven production?

Nathalie Bouillé, Dynatrace - We at Dynatrace are in the observability market, and data-driven production for us is about making the right decisions thanks to data in the most automated way. In a sense this is not new: when I was younger we were already talking about data warehouses and business intelligence, then people put data into data lakes. What’s new is that now we tend to trace everything. Every single swipe, every user action on mobile and on the Internet results in huge volumes of data. Moreover, we now work in hybrid cloud environments, so the types of data are very different. Collecting data and putting it into data lakes costs a lot. The challenge is to gather a lot of data in order to generate insights with the minimum cost. So it's all about observing data production and data development to help people deliver applications that work perfectly, in the fastest way possible.

"The challenge is to gather a lot of data in order to generate insights with the minimum cost." N. Bouillé, Dynatrace

OB - Thank you for your thoughts, Nathalie. Now that we all agree on the topic of our discussion, I would like to start by explaining why at Alenia we think it's very important for CIOs and their teams to start leveraging data to drive decision-making. From our experience, there are quite a lot of benefits for managing your IT production that you can get from your data, and here are some examples.

The first one is breaking silos. In most organisations, data is scattered between business silos and IT silos, so just having this 360 view of your IT ecosystem enriched with real-time production status is already a huge benefit, allowing you to get a quick overview of what's going on in your IT production. 

The second one is reducing the number of incidents following production interventions. When you have changes in production that directly impact your critical business applications, you have well-defined procedures to limit the impact of such interventions. But how many times have your technical teams performed what they qualify as a minor change on an infrastructure component, only for multiple business teams to end up impacted by that change, simply because the technical team was unable to assess its whole impact on the IT ecosystem? So it's critical to be able to qualify the full potential impact beforehand, to limit side effects and reduce the number of incidents that occur after interventions.

The third one is accelerating troubleshooting. In case a production incident does happen, it's very important to ensure a proactive response. You can achieve that by providing a centralised platform for your support teams to be able to facilitate investigation, establish correlations between events and quickly identify the root cause, and maybe even have a recommended corrective action suggested automatically by the system for the incident.

The fourth one is predicting future failures. That’s the ultimate goal: avoiding incidents before they happen based on the analysis of abnormal behaviour patterns. 

And last but not least is optimising resources and processes. For example, you might want to identify recurrent user requests to automate them, or identify a common root cause for multiple incidents to address it as a problem. Or even predict a production risk for the next day, week or month to be able to adjust your resources accordingly.
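To make the fourth benefit more concrete, here is a minimal sketch of anomaly detection on a production metric. This is a generic illustration, not any vendor's actual method: it flags samples that deviate from a rolling baseline by more than a few standard deviations.

```python
from statistics import mean, stdev

def detect_anomalies(samples, window=20, threshold=3.0):
    """Flag indices whose value deviates from the rolling baseline
    by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(samples[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# A flat response-time series with one sudden spike at the end:
samples = [100.0 + (i % 5) * 0.1 for i in range(30)] + [250.0]
print(detect_anomalies(samples))  # the spike at index 30 is flagged
```

In a real system the baseline would come from an observability platform's time series rather than an in-memory list, and the threshold would be tuned per metric, but the principle of learning "normal" behaviour and alerting on deviations is the same.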

Data-driven production in cartoon

OB - Laurent, you work at the digital department of Engie, a transversal department that serves the whole company. Can you give us examples of the major objectives that you help your company accomplish thanks to data?

Laurent Demeestere, Engie - Maybe I should start by presenting the place of Engie Digital in the global Engie ecosystem. Engie Digital is a software company within the Engie group. Our mission is to leverage data and digital capabilities to reach a carbon-neutral economy. We develop, deploy and run solutions for the customers of various Engie entities.

As an example, we help our clients accelerate decarbonization by monitoring their carbon footprint. We optimise the equipment and maintenance of Engie's renewable assets by collecting data from wind, solar, hydro-electric, biogas equipment and crossing this data with the weather forecast. We enable predictive maintenance by identifying abnormal behaviour with the help of AI. We transform cities by helping our clients face complex urban challenges like reducing CO2 emissions or regulating traffic. 

OB - The energy industry seems quite advanced in leveraging data compared to the banking sector, where we have seen only a few large scale data valorization projects. Guillaume, you're working with clients from various sectors, which are best in class and which fall behind?

GB - Yes, I have worked a lot with clients from the pharmaceutical sector, energy, heavy industry and many others. We have seen real progress in maturity in all sectors over the past years. The idea of deploying data analytics solutions has gained traction, but moving from a pilot project to an industrialised solution remains complicated, and many fail. 

We have found one of the biggest challenges is the shift in corporate culture. It is in my opinion the most important factor that determines the success or the failure of data-driven IT production projects. 

OB - And talking about the banking sector in particular, what are the challenges that prevent companies from getting the most from the data? 

NB - Many banks and insurance companies have been spending a lot of money on tools to monitor their IT systems for the past 30 years. The profusion of tools and environments created silos, and it became very hard to have a global view of what's happening. It is not sufficient to monitor the network, the middleware and the applications separately, because there are a lot of dependencies between these layers. The big challenge is to make people think globally and have an understanding of the dependencies between everything that is happening. The goal is not just to collect data, but get answers from data to make the right decisions.

"Data quality and adoption rate are two major challenges we face at Engie Digital" L. Demeestere, Engie Digital

LD - For me, data quality and adoption rate are two major challenges that we face at Engie Digital. Regarding adoption, the question is how to track and increase the adoption of provided solutions by our clients. You need to use the proper monitoring tools to do that, like Dynatrace (that we happily use at Engie Digital). And with this kind of tool you can check the activity of your customers and assess user satisfaction by analysing the data points. And to increase adoption, the best strategy is to listen to your customers, do better testing and get regular feedback.

As for data quality, the question is how to measure, track, and monitor the quality of data you are using, since bad quality data will generate bad quality insights. This question implies several challenges: you need to get the proper tools and techniques, you need to get the proper people involved, and you need to define the proper alerting level. When you have some corrections to apply, you have to do it at the company level, and not only at the scale of your project, which is a big challenge as well. To conclude, you should allocate sufficient time and resources to the data quality topics, which are often underestimated. 
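As a minimal sketch of what "measure, track and monitor" can look like in practice (a generic illustration, not Engie Digital's actual tooling; the field names are hypothetical), here is a completeness check that flags fields falling below an agreed alerting threshold:

```python
def quality_report(records, required_fields, alert_threshold=0.95):
    """Per-field completeness rate, with an alert flag when the rate
    drops below the agreed threshold."""
    total = len(records)
    report = {}
    for field in required_fields:
        filled = sum(1 for r in records if r.get(field) not in (None, ""))
        rate = filled / total if total else 0.0
        report[field] = {"completeness": rate, "alert": rate < alert_threshold}
    return report

# Two meter readings, one missing its timestamp:
records = [
    {"meter_id": "A1", "reading": 42.0, "timestamp": "2022-12-16T10:00"},
    {"meter_id": "A2", "reading": 37.5, "timestamp": ""},
]
report = quality_report(records, ["meter_id", "reading", "timestamp"], 0.9)
```

Completeness is only one dimension; real data quality dashboards typically also track validity, freshness and consistency, and route alerts to the people who own the data, which is where the company-level effort mentioned above comes in.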

OB - I absolutely agree, especially regarding the data quality challenge. This is the major point that we see with our clients. Some of them are waiting to have their data all clean before they start leveraging it, which in my opinion is a mistake. Data quality management is not a one-off exercise, but a continuous effort. 

You need to create a virtuous circle: the more you exploit your data, the more people see what they can get in terms of business value, the more they understand that they need to play a part in improving the quality of the data at their level, and the better the data gets, hence generating better insights. 

When you described challenges preventing companies from leveraging their data, I was expecting to hear about required expert technical skills, especially with Dynatrace. Isn’t it something that you need? You can get a Formula One car, but it doesn't mean that you are able to race it, right? 

NB - Well, one of the benefits of Dynatrace is to enable people who are not experts to understand what's happening with their IT production. So if you're not an expert, you will still be able to see the issues in your IT ecosystem, pin down the root cause, assess the impact, and identify necessary corrective actions. But you have to have a few trained people in charge of the platform to deploy it at scale, and Dynatrace does not work in isolation: it has to be connected with alerting systems and other systems to do auto-remediation or to automate quality gates or security gates.

Market maturity overview

OB - Yet, despite all the mentioned challenges, there are numerous companies that succeed in leveraging their data. Can you share some success stories with us? How did you help your clients or your company use data for better decision making and improved IT production management? 

GB - Let me start with an example we saw a few months ago at AI Builders. A company in the energy industry was lagging behind in terms of data valorization. They decided to accelerate their data transformation and designed a new data organisation which brought together the data office, IT and digital services within the same entity, directly under the Chief Strategy Officer. This allowed them to align all stakeholders on a common goal and put data considerations at the heart of the overall business strategy.

NB - At Dynatrace, we work with a lot of international banks. They all have the same challenge, which is to deliver better applications and provide better service to end customers. In order to meet this challenge, the objective is to avoid failures, reduce the time to repair, and deliver better applications to production faster. 

To get there, the first step is to reduce the number of incidents, the big ones but also the small ones. This actually starts before the application is released: making sure that any potential issues are detected by the development team, the IT support team and the testers before go-live reduces difficulties and improves user satisfaction.

There are a lot of things to say, but behind it there's a lot of data. And the challenge is to organise the data, structure the data, and make sure you don't only get data but answers to improve the quality of service provided to your clients.

LD - I would like to talk to you about our Ellipse platform that helps us envision a carbon-neutral future. We help our customers perform decarbonization at scale, and the end users of this platform are sustainability managers. The platform allows them to track their progress in the carbon-neutral transition, using data from multiple sources, including energy consumption bills, to monitor and control the CO2 emissions.

OB - Thank you very much for these examples. The promise offered by leveraging data is big, but it is not always easy to kick things off. Could you share some practical advice: where should companies start with leveraging their data? What are the pitfalls to be avoided and the key success factors? 

GB - I think the key to successfully implementing data-driven IT production is to have allies within your organisation. Develop a project or an organisation which creates consensus among the different stakeholders. Build this by making people work together, aligned both on the strategic part and on the deployment part.

"You need allies within your organisation to successfully implement data-driven IT" Guillaume Besson, AI Builders

NB - If I take the use case “how to observe the IT ecosystem to increase quality and reduce costs”, I would say that you need the sponsorship of the CTO and the head of data, because there is a lot of resistance inside every company: everyone wants to manage their own area, their own tools, their own data. Thinking globally is important. You have to think big and at the same time start somewhere. So there is a sort of maturity assessment to be done to understand where you are at the moment, where to go, and who will progressively embrace the project, so you can deploy it locally and then go big.

LD - And don't neglect your data quality. When budgeting your projects, allocate time and resources to it. Don't forget to increase your customer adoption and build a strong data culture in your company.

OB - Thank you very much, now we'll open for Q&A. “Which industries are the most advanced in leveraging data?” Guillaume, this is for you!

GB - I don't know about all the existing industries, obviously! The most advanced industries I've seen are the energy and the pharmaceutical industries. The energy industry because they compete with many global companies: they need to be at the forefront of what exists today in terms of data, including data-driven IT production. The pharmaceutical industry is also quite good because they have traditionally had to deal with a lot of data from their R&D.

OB - And I think we should not forget the retail industry as well... 

NB - … Yes, of course. The digital teams in the retail industry are well advanced.

OB - Thank you. Next question: “What are the best solutions to overcome data quality issues?”

NB - There is a question of how you collect the data and the contextual information, the metadata, to put the data into a data model that then the artificial intelligence will use. If you don't collect the contextual information with the data, in the end you will have data that is not clean.

LD - There are a lot of existing tools that can help. At Engie Digital we use Splunk to build dashboards for controlling data quality, but you can think of other tools, e.g., data governance tools like Collibra, to address these kinds of issues. You have to find a good balance between custom and off-the-shelf solutions in your company; if your projects and problems are very specific, a custom solution might work best.

OB - I would also add that in my opinion the best way to improve data quality is to improve data literacy within the organisation and make sure that everyone understands that they have a role to play in this process. And the second thing is ensuring that you have data quality controls in place for checking known unknowns (i.e. when you assume that a potential issue might occur at a particular point in the process or system). In this way you can control this particular type of issue in a specific system, at a specific point in your ecosystem. But you also have unknown unknowns, i.e. unpredictable things that might happen, in which case observability can be of great help.
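A known-unknowns control can be as simple as a set of explicit validation rules run at one specific point in the pipeline. Here is a minimal, hypothetical sketch (the rule names and record fields are illustrative, not from any client system):

```python
def check_known_unknowns(batch, rules):
    """Run explicit validation rules (the 'known unknowns') against a
    batch of records and return the violations found at this control point."""
    violations = []
    for index, record in enumerate(batch):
        for name, rule in rules.items():
            if not rule(record):
                violations.append((index, name))
    return violations

# Hypothetical rules for a payments feed:
rules = {
    "positive_amount": lambda r: r.get("amount", 0) > 0,
    "has_currency": lambda r: bool(r.get("currency")),
}
batch = [
    {"amount": 10.0, "currency": "EUR"},
    {"amount": -5.0, "currency": ""},
]
violations = check_known_unknowns(batch, rules)
```

Rules like these only catch what you anticipated; for the unknown unknowns mentioned above, observability tooling that surfaces unexpected behaviour is the complement.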

Understand the KPI

OB - Here is a new question: “How to think globally if the organisation is siloed and not global?”

GB - I think the fact that an organisation is siloed is not necessarily a bad thing under two conditions. The first one is that the management needs to have a global view, and the second one is that people can talk to each other even given the siloed functioning of the company.

OB - Our last question: “Is data-driven IT production easier to improve when the production teams are embedded in feature teams with the dev team, or if application production teams are independent?”

Allow me to give my point of view on this one. For me, it's way easier if support teams are embedded within feature teams so they are aware of new releases and can anticipate potential exploitation issues that the developers might not think of because it's not in their culture. In case something does go wrong after a new release, support teams can be more proactive in fixing it if they know what has been changed.

Marjane Mabrouk, Alenia - I partially agree with you. From what we've seen, when production support is embedded within the development team, like in agile-at-scale models, most of the time the quality of data decreases.

"The best results we have seen so far to improve data-driven IT production are when support teams drive the design of the monitoring solutions." D. Cebrián, Santander

Dámaso Cebrián, Santander - In our experience at Santander, when you have a feature product team that really takes observability and data model training into account, it's great, because they are the ones who can tailor the models to the actual use cases that bring real value. But at this point in time very few production support teams worry about observability and know how to put it in place. So we typically get the best results when support teams are the ones driving the design of the monitoring solutions.

MM - And it’s important to involve the business, because most of the time IT is doing monitoring without the business: they are thinking for the business instead of integrating the users in this exercise. We are deviating more towards monitoring than data, but these two topics are definitely intertwined.

Nathalie, do you see this pattern with your clients? 

NB - It's a good question. Sometimes people tend to love Dynatrace so much that they keep it. They keep the platform for themselves, so they work with an improved observability platform, but they don't go to the business, they don't involve other people in the project. It is just one more tool on top of the other ones. It's not unsuccessful, but it's not the massive ambition that we have to improve the quality of the data and the quality of service.

So the conditions for a great project are to think big, but also to get the support and sponsorship of the people working in the DevOps teams, in application support, and in IT monitoring as well.

OB - Thank you very much all of you. If you want to dig deeper into these topics, you can get in touch with us here, we’ll be happy to keep this conversation going.
