
Episode 2 – Data: an effective lever for risk management


Never before have companies generated as much data as they do today. Data related to risks and insurance is no exception to this trend: it is multiplying rapidly and constitutes a veritable gold mine of strategic information.
However, this wealth of data remains underutilized: it is too often scattered internally and among multiple players (brokers, insurers, experts, etc.), and stored in a variety of systems and formats (ERP, RMIS, Excel files, etc.).
As a result, answering relatively simple questions (How has the insurance budget changed from year to year? How much was paid in brokerage commissions? How many claims were filed?) can be very time-consuming, as it requires consolidating multiple pieces of information whose management may also have been delegated to third parties.

At KYU, we believe it is essential to free risk managers from the chore of data consolidation so they can focus on analysis and value creation. This involves:

  • Autonomy in operations so they no longer have to rely on third parties,
  • The implementation of fast and reliable analysis tools,
  • The use of AI to exploit unstructured textual and visual data.

Our conviction: your data is an under-exploited gold mine of productivity gains and added value! Don’t let it accumulate out of sight; structure it, centralize it, and exploit it.

1. What if you could “take back control” of your data?
Today, many companies depend on their brokers and insurers to access their own data and perform analyses. The result is a lack of responsiveness, visibility, and control, which can prove detrimental when changes occur or an analysis is needed.

However, “taking back control” does not mean replacing your partners. It means bringing key information that is useful for managing risks and associated costs back in-house. This requires establishing clear rules to ensure that information remains comprehensive and reliable over time.

In practical terms, this involves several steps:

  • Mapping your analysis needs and the associated data to be hosted internally (insured values, claims history, expert reports, etc.) in a secure environment such as SharePoint;
  • Establishing scalable organizational repositories to ensure the uniqueness and reliability of the data collected and to avoid inconsistencies, particularly those caused by reorganizations, acquisitions, and asset disposals;
  • Contracting with your stakeholders (brokers, insurers, experts) so that they provide data on a regular basis, in consistent and compatible formats.

These foundations are essential for moving from a logic of dependency to true autonomy in data exploitation, thereby transforming a fragmented resource into a strategic asset.
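For illustration, here is a minimal sketch of the kind of automated checks that keep such a repository reliable over time, assuming the broker delivers a claims bordereau in an agreed Excel format (all file names and column names below are hypothetical, not a prescribed layout):

```python
# Minimal sketch (Python/pandas) of repository quality controls.
# File names and columns (site_id, claim_id, etc.) are illustrative only.
import pandas as pd

# Internal repository of insured sites, maintained by the risk management team
sites = pd.read_excel("sites_repository.xlsx")      # columns: site_id, entity, country
# Claims bordereau delivered periodically by the broker, in an agreed format
claims = pd.read_excel("broker_claims_2024.xlsx")   # columns: claim_id, site_id, date, amount

issues = []

# 1. Uniqueness: each claim must appear only once
duplicates = claims[claims.duplicated(subset="claim_id", keep=False)]
if not duplicates.empty:
    issues.append(f"{duplicates['claim_id'].nunique()} duplicated claim IDs")

# 2. Consistency: every claim must reference a site known in the repository
unknown_sites = claims[~claims["site_id"].isin(sites["site_id"])]
if not unknown_sites.empty:
    issues.append(f"{len(unknown_sites)} claims on sites missing from the repository")

# 3. Completeness: key fields must be populated
missing = claims[claims[["date", "amount"]].isna().any(axis=1)]
if not missing.empty:
    issues.append(f"{len(missing)} claims with a missing date or amount")

print("Data quality report:", issues or "no issues detected")
```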

Our approach: We help our clients identify their key data and build their repositories, organization, and governance to ensure completeness and reliability over time.

2. What tools are needed to meet analysis and reporting requirements?
Once the data has been collected, it must be turned into analyses that can be acted on. Traditional solutions such as RMIS (Risk Management Information Systems) provide some answers, but their strength lies mainly in the robustness of their transactional processes rather than in the flexibility of their analytics modules. Moreover, they rarely cover the full scope of needs, and their implementation times and costs are increasingly high.

Conversely, Business Intelligence tools such as Power BI and Looker Studio offer an agile and powerful alternative for building dashboards and reports, with many advantages:

  • They are quick to set up and easy to maintain and upgrade over time;
  • They enable the creation of dynamic and fully customizable dashboards, addressing everything from very general reporting needs to the most detailed analysis requirements;
  • Reports can be easily shared within the organization, with fine-grained access rights management.
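For illustration, a minimal sketch of the preparation step that typically feeds such a dashboard: consolidating yearly claims exports into a single flat table that Power BI or Looker Studio can connect to directly (the folder layout and column names are assumptions made for the example):

```python
# Minimal sketch: consolidate yearly claims exports into one tidy table
# for a BI dashboard. Paths and column names are illustrative assumptions.
from pathlib import Path
import pandas as pd

frames = []
for path in Path("claims_exports").glob("claims_*.xlsx"):   # e.g. claims_2022.xlsx, claims_2023.xlsx
    df = pd.read_excel(path)
    df["source_file"] = path.name                            # keep traceability to the original export
    frames.append(df)

claims = pd.concat(frames, ignore_index=True)

# Light normalization so the dashboard's filters and measures behave consistently
claims["date"] = pd.to_datetime(claims["date"])
claims["year"] = claims["date"].dt.year
claims["amount"] = pd.to_numeric(claims["amount"], errors="coerce")

# A single flat file is enough for a first dashboard iteration
claims.to_csv("claims_consolidated.csv", index=False)
```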

Our approach: we favor agile deployment with limited business overhead to deliver dashboards in less than three months within the framework set by your IT department.

3. And what about AI?
The rise of Artificial Intelligence tools is currently generating a great deal of enthusiasm and just as many questions. While they are a powerful lever for automating, accelerating, and improving the reliability of analyses, it is essential to bear in mind their current limitations and the associated risks, particularly in terms of data confidentiality.
It is unrealistic to expect AI tools to reliably exploit a heterogeneous data set and extract a relevant analysis from a complex question without a minimum of upstream structuring and configuration.

That said, deploying AI remains a priority for certain use cases where it brings real added value:

  • Automatic consolidation of data from multiple sources, detection of inconsistencies or duplicates, and intelligent data cleansing (illustrated in the sketch after this list);
  • Processing text and image files to create structured summaries or populate databases (insurance policies, expert reports, audit reports, etc.);
  • Natural-language queries of structured data libraries or text files to answer simple questions (What is the cost of claims per entity over the last three years? How do they break down by type?).
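As an illustration of the first use case, here is a minimal sketch of the deterministic consolidation checks that an AI-based cleansing step can build on, assuming claims extracts from a broker and an insurer in Excel (column names, the amount threshold, and the similarity threshold are assumptions for the example):

```python
# Minimal sketch: flag inconsistencies and probable duplicates when
# consolidating claims from two sources. Column names are illustrative.
from difflib import SequenceMatcher
import pandas as pd

broker = pd.read_excel("claims_broker.xlsx")     # columns: claim_id, site_name, amount
insurer = pd.read_excel("claims_insurer.xlsx")   # columns: claim_id, site_name, amount

merged = broker.merge(insurer, on="claim_id", how="outer",
                      suffixes=("_broker", "_insurer"), indicator=True)

# Claims reported by only one of the two sources
only_one_side = merged[merged["_merge"] != "both"]

# Amount discrepancies on claims present in both sources
both = merged[merged["_merge"] == "both"]
amount_gap = both[(both["amount_broker"] - both["amount_insurer"]).abs() > 1_000]

# Near-duplicate site names (e.g. "Lyon Plant" vs "Plant Lyon") flagged for review
def similar(a: str, b: str) -> bool:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() > 0.85

names = broker["site_name"].dropna().unique()
near_duplicates = [(a, b) for i, a in enumerate(names) for b in names[i + 1:] if similar(a, b)]

print(len(only_one_side), "claims in a single source,",
      len(amount_gap), "amount discrepancies,",
      len(near_duplicates), "suspected duplicate site names")
```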

Our belief: you already have AI tools in your internal IT systems that can help you increase efficiency. We can help you create agents that will enable you to quickly take your operations to the next level.