Knowledge graphs combined with LLMs address regulatory compliance challenges by automating analysis and providing accurate, explainable guidance that minimizes hallucination risks.

Regulatory compliance is a major challenge for modern organizations as they deal with huge amounts of constantly evolving legal documents. Most struggle to quickly identify relevant requirements, understand regulatory relationships, and ensure compliance across multiple jurisdictions.
Knowledge graphs combined with large language models (LLMs) can make the life of legal experts much easier. They do that by automating regulatory analysis, providing contextual insights, and delivering explainable compliance guidance. This approach transforms regulatory complexity into accessible and actionable intelligence.
Industry use cases by persona
Before getting to our specific use case, let’s have a quick look at the diverse needs of regulatory compliance and policy comprehension. We’ll do that by examining the requirements of different personas.
The first persona in this domain is the legal professional, and the leading use case here is enhanced document analysis. The objective is not to replace humans with AI, but to cut research time, reduce errors and omissions, and leave legal interpretation to lawyers. The focus is on optimizing the mundane, repeatable work so that professionals can concentrate on what they do best.
The second persona is the domain expert, the SME. These are people who know as much as the lawyers but have stronger technical skills, and their objective is to know enough about regulations to find what they need. They want a first-hand understanding of what is required without the mediation of legal professionals, which adds both cost and time.
The third use case is policy comprehension: how to make the enormous volume of internal policies easily usable and understandable. This is not about regulations but about internal policy, so the impact and the risks differ from those in the previous two use cases.
Meet DORA
Now, let’s focus on a specific use case around DORA (Digital Operational Resilience Act). This act is currently mandatory for all financial institutions across the European Union, and its aim is to strengthen the digital resilience of financial entities.
In the screenshot below, we see two paragraphs from Chapter II, Section 1, Article 5, which deals with ICT risk management. The highlighted parts show that the management body is not simply responsible: it holds the ultimate responsibility for what this regulation requires.
This is important for any executive. DORA is now a regulation with direct application across all EU jurisdictions, and it defines ICT risk as a “board-level duty and priority”.
This raises a lot of questions such as:
- What is digital operational resilience?
- What is a “Financial entity”?
- What “framework” is required?
- What does “ICT risk” mean?
- What are “effective and prudent” criteria?
- What are the possible sanctions?
Can ChatGPT help?
Let’s try asking ChatGPT a question about DORA. We can simply share the regulation and ask it to provide the exact definition of digital operational resilience.
As we can see in the screenshot above, the answer is not only far from the actual definition but dangerously different, even though the model was given the Act and therefore the actual definition (see the screenshot below for comparison):
Taking laws as graphs
How can semantic knowledge graphs help? First of all, legal acts have hierarchical precedence and are very logically structured documents. Precision matters, as law is exact and aims to avoid ambiguity. Law also relies on strictly defined terms and on many references, such as links to other acts or sections.
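To make this concrete, here is a minimal sketch in Python, using the open-source rdflib library, of how an act’s hierarchy and cross-references could be expressed as RDF triples. The namespace, property names, and the cross-reference are illustrative assumptions, not Graphwise’s actual data model:

```python
# A minimal sketch (not Graphwise's actual data model) of how a legal act's
# hierarchy and cross-references can be expressed as RDF triples.
# The namespace and property names are illustrative assumptions.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("https://example.org/dora/")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

chapter2 = EX["chapter-2"]
section1 = EX["section-1"]
article5 = EX["article-5"]

# Hierarchical precedence: Chapter II > Section 1 > Article 5
g.add((chapter2, RDF.type, EX.Chapter))
g.add((section1, RDF.type, EX.Section))
g.add((article5, RDF.type, EX.Article))
g.add((section1, EX.partOf, chapter2))
g.add((article5, EX.partOf, section1))
g.add((article5, RDFS.label, Literal("Article 5")))

# A cross-reference in the text becomes an explicit, queryable link
# (the target article here is purely illustrative)
g.add((article5, EX.refersTo, EX["article-6"]))

print(g.serialize(format="turtle"))
```

Once the hierarchy and the references are explicit triples, questions like “which articles refer to this one” become simple graph queries instead of manual reading.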
The methods of the Semantic Web for Linked Open Data can be easily reused here, as they rely on a logically similar set of requirements. This is why we at Graphwise believe that semantic knowledge graphs are the best approach for regulatory compliance and understanding.
When combined with LLMs, they offer key capabilities:
- Build your graph – helping you turn your unstructured data into structured content, rely on model-driven extraction, and introduce formal knowledge management.
- Link to your graph – enriching your documents with background sources, identifying the terms, concepts, or entities in it, and resolving definitions and references.
- Talk to your graph – ensuring relevance (retrieve, score, rank), consuming narrative insights, enabling self-service.
Although the current hype is to talk mainly about the third capability, building your graph and enriching it with background sources are at least as important as the actual consumption of the knowledge you have. Unless you have well-structured knowledge enriched with context, there is no substance from which to generate any insights.
How to go about it?
Some of the common tasks around building your graph are converting your data into a graph format (RDF or JSON-LD), mapping it to an ontological model, extracting definitions, structuring your knowledge, and more. You can do them easily with Graphwise’s tools: Graph Automation converts and harmonizes your data, while Graph Modeling offers a taxonomy management tool that structures all the definitions of the DORA regulation. This is the backbone of our knowledge management.
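As a rough illustration of what such a structured definition could look like, here is a hedged sketch that models one DORA term as a SKOS concept and serializes it to JSON-LD. It is not the actual output of Graph Automation or Graph Modeling; the concept URIs are assumptions and the definition text is left as a placeholder:

```python
# A hedged sketch of one DORA definition as a SKOS concept. This illustrates
# the "convert to RDF, map to a model" step; the URIs are assumptions and the
# definition text is a placeholder, not a quote from the regulation.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

DORA = Namespace("https://example.org/dora/concepts/")  # hypothetical namespace

g = Graph()
g.bind("skos", SKOS)
g.bind("dora", DORA)

concept = DORA["digital-operational-resilience"]
g.add((concept, RDF.type, SKOS.Concept))
g.add((concept, SKOS.prefLabel,
       Literal("digital operational resilience", lang="en")))
g.add((concept, SKOS.definition,
       Literal("<exact definition text copied verbatim from the regulation>",
               lang="en")))
g.add((concept, SKOS.broader, DORA["operational-resilience"]))

# JSON-LD is the other graph format mentioned above
print(g.serialize(format="json-ld"))
```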
Now, if we look again at Article 5, we see the same text, but enriched with taxonomy mentions and candidate concepts. The document is more than it was before, because it now carries the background knowledge of your organization.
In linking to your graph, our objective is to identify legal concepts and enrich them with background graph knowledge. So, let’s look again at some simple text from Article 5.
With the help of Graphwise tooling, you can see that “financial entity” is a collective term describing 12 entities. This helps you understand whether your type of organization is considered a financial entity. We see that financial entities are required to have some kind of framework, which is mentioned often in the text. The tool offers several candidates among the extracted terms and suggests the most likely one. “ICT risk” is also a defined term, and so on. Without the proper technology, it would be difficult to understand what this brief paragraph is really about.
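To give a feel for what concept linking produces, here is a deliberately naive sketch that scans a paraphrased Article 5 excerpt for known taxonomy labels and resolves each mention to a concept URI. Graphwise tooling does this with model-driven extraction and disambiguation; the label-to-URI table and the excerpt below are toy assumptions:

```python
# A deliberately naive stand-in for the concept-linking step: scan a
# paragraph for known taxonomy labels and record each mention with the
# concept URI it resolves to. The table below is a toy assumption.
import re

taxonomy = {
    "financial entity": "https://example.org/dora/concepts/financial-entity",
    "ICT risk": "https://example.org/dora/concepts/ict-risk",
    "ICT risk management framework": "https://example.org/dora/concepts/ict-risk-management-framework",
}

# Paraphrase of the Article 5 wording discussed above, not a verbatim quote
article_5_excerpt = (
    "The management body of the financial entity holds the ultimate "
    "responsibility for the arrangements related to the ICT risk "
    "management framework."
)

def link_mentions(text: str, labels: dict) -> list:
    """Return every taxonomy label found in the text (overlaps are kept)."""
    mentions = []
    for label, concept_uri in labels.items():
        for match in re.finditer(re.escape(label), text, flags=re.IGNORECASE):
            mentions.append({
                "surface_form": match.group(0),
                "start": match.start(),
                "concept": concept_uri,
            })
    return sorted(mentions, key=lambda m: m["start"])

for m in link_mentions(article_5_excerpt, taxonomy):
    print(f"{m['start']:>4}  {m['surface_form']!r}  ->  {m['concept']}")
```

Real tooling additionally resolves overlapping and ambiguous mentions and ranks candidate concepts, which is exactly what the suggested candidates in the screenshot show.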
Finally, for talking to your graph, you’ll need Graphwise GraphDB. To test these capabilities, let’s use the control question we asked ChatGPT: “What is the definition of digital operational resilience?”
As we can see from the screenshot above, when we have structured knowledge, we can retrieve the exact definition from the original document. Now we can ask even more complex questions, such as “What is the business impact analysis in this case?” or “How should software, hardware, and people be interlinked?”.
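For readers who want to see the mechanics, here is a hedged sketch of the retrieval step: query the definition from a GraphDB repository over its standard SPARQL endpoint and pass it to an LLM as grounded context. The repository name, the SKOS modeling, and the prompt wording follow the illustrative examples above and are assumptions, not the actual Graphwise setup:

```python
# A hedged sketch of "talk to your graph": fetch the stored definition from a
# GraphDB SPARQL endpoint and hand it to an LLM as grounded context. The
# repository name ("dora") and the skos:prefLabel / skos:definition modeling
# follow the illustrative examples above and are assumptions.
import requests

# GraphDB exposes a SPARQL endpoint per repository under /repositories/<id>
GRAPHDB_SPARQL = "http://localhost:7200/repositories/dora"

query = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?definition WHERE {
  ?concept skos:prefLabel "digital operational resilience"@en ;
           skos:definition ?definition .
}
"""

resp = requests.get(
    GRAPHDB_SPARQL,
    params={"query": query},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
bindings = resp.json()["results"]["bindings"]
definition = bindings[0]["definition"]["value"] if bindings else None

# The retrieved text becomes the context of the LLM prompt, so the narrated
# answer is grounded in the regulation instead of being guessed.
prompt = (
    "Answer strictly from the context below and cite it.\n\n"
    f"Context (retrieved from DORA): {definition}\n\n"
    "Question: What is the definition of digital operational resilience?"
)
print(prompt)
```

Because the definition comes back verbatim from the graph, the LLM only narrates it rather than reconstructing it from memory.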
DORA has the answers, and we have validated our answers against the responses of a subject matter expert.
When we work with exact definitions and sources, even the narrated responses are high in precision and quality. So, structuring your graph and linking it to background sources leads to a substantial improvement in the quality of the retrieved insights.
To wrap it up
Given the requirements of regulatory compliance and policy comprehension, hallucination is not just a problem, it is an anti-requirement. In these domains, structuring background knowledge matters far more for retrieving information than generative AI capabilities that produce an approximate summary of exact knowledge.
We can also easily see the benefits of using LLMs grounded in knowledge graphs for the legal personas outlined earlier.
By reducing the time needed to navigate and structure legal documents, this approach improves efficiency while also enhancing accuracy, as there are fewer omissions and errors. Legal professionals experience higher satisfaction because they are freed from mundane, repetitive tasks to focus on what they do best: interpretation and analysis. And last but not least, the resulting improvements in service margins boost profitability, while faster turnaround times for legal insights increase overall productivity.