The best way to focus your internal business chatbot: Memgraph’s graph database expert Dominik Tomicevic shares the context best practice he’s seeing from the enterprise AI frontline

Give your AI the right information, and it will deliver informed, relevant answers. Over the past year, working with a wide range of AI developer teams across industries, we’ve observed the quiet rise of what could be called context engineering, gradually taking the place of traditional prompt engineering. Analysts like Gartner have also recognised the enormous potential of this shift, noting that moving from prompts to context is critical for achieving ‘scalable, adaptive high-impact enterprise AI’.
Context engineering sets up the LLM to reason correctly
Old-style prompt engineering guides the AI model to recall information, but that can only ever be drawn from what the LLM has already learned. General-purpose LLMs haven’t been trained on your company’s internal data, policies, or workflows. Therefore, tweaking prompts won’t reliably produce accurate results. Without the right context, the model is prone to hallucination and errors.
What we need is a way of curating, structuring, and integrating the right information so the model can operate effectively. That process needs to include defining the task, identifying relevant data sources, structuring that data, and taking account of previous interactions or memory.
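To make that concrete, here is a minimal sketch in Python of what such a curated context package might look like before the model is ever called. The class and field names are purely illustrative; they are not part of any Memgraph or LLM API.

```python
from dataclasses import dataclass, field


# Illustrative container for the ingredients listed above: task, sources, structured facts, memory.
@dataclass
class ContextPackage:
    task: str                                          # what the model is being asked to do
    sources: list[str] = field(default_factory=list)   # where the supporting data came from
    facts: list[str] = field(default_factory=list)     # curated, structured statements
    memory: list[str] = field(default_factory=list)    # relevant prior interactions

    def render(self) -> str:
        """Assemble the curated context into a single block for the LLM."""
        lines = [f"Task: {self.task}", f"Sources: {', '.join(self.sources)}", "Relevant facts:"]
        lines += [f"- {fact}" for fact in self.facts]
        lines += ["Prior interactions:"]
        lines += [f"- {turn}" for turn in self.memory]
        return "\n".join(lines)


package = ContextPackage(
    task="Summarise the refund policy for enterprise customers",
    sources=["policy_wiki", "crm"],
    facts=["Enterprise refunds require approval from the account team."],
    memory=["User previously asked about SLA terms."],
)
print(package.render())
```

The point of the sketch is only that the context is assembled explicitly and deliberately, rather than left to whatever the prompt happens to contain.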
Multi-shot prompting—feeding the model past attempts and errors—works only if the context includes the right historical and organisational information. That’s why we need the extra step of context engineering to set up the environment for the LLM to reason correctly and produce actionable output.
To be clear, this work is not done at the interface, which is all a prompt revision sequence can ever touch, but at the back end, where you ingest and structure the data the model reasons across. Metadata and organisational modelling are crucial components. Mapping relationships between teams, policies, and processes is the only practical way a new AI application can gain sufficient semantic understanding of how a specific company operates.
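As a rough illustration of that back-end modelling step, the sketch below writes a small slice of an organisation into a graph database over Bolt, which Memgraph supports via the standard neo4j Python driver. The connection details, node labels, and relationship types are assumptions made for the example, not a prescribed schema.

```python
from neo4j import GraphDatabase  # Memgraph speaks the Bolt protocol, so this driver works

# Connection details are assumptions for a local instance; adjust to your deployment.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("", ""))

# Model a slice of the organisation: a team, the policy it owns, the process that policy governs.
MODEL_QUERY = """
MERGE (t:Team {name: $team})
MERGE (p:Policy {name: $policy})
MERGE (w:Process {name: $process})
MERGE (t)-[:OWNS]->(p)
MERGE (p)-[:GOVERNS]->(w)
"""

with driver.session() as session:
    session.run(
        MODEL_QUERY,
        team="Customer Success",
        policy="Enterprise Refund Policy",
        process="Refund Approval",
    )

driver.close()
```

Once those relationships exist as first-class data, they can be queried and handed to the model rather than re-explained in every prompt.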
Enter the GraphRAG dragon
Context engineering is the only way to evolve an LLM into an agent capable of understanding complex data interactions. GraphRAG (Graph-based Retrieval-Augmented Generation) is particularly powerful in this regard. By organising data in a graph format, GraphRAG captures the relationships between people, policies, customer data, and operational events.
This approach leverages the best of both worlds: the inherent strengths of graph databases and the retrieval capabilities of RAG. A GraphRAG-extended LLM can efficiently traverse the graph of all the context you have provided, filtering out irrelevant information in milliseconds and focusing on the most pertinent nodes.
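A hedged sketch of that retrieval step, continuing the illustrative schema above: a short Cypher traversal pulls only the nodes within two hops of the entity the user asked about, and those rows become the curated context handed to the model. The two-hop cut-off, the labels, and the entity names are assumptions for the example, not a fixed recipe.

```python
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("", ""))

# Pull everything within two hops of the policy the user asked about,
# so only the most pertinent part of the graph reaches the model.
RETRIEVE_QUERY = """
MATCH (p:Policy {name: $policy})-[*1..2]-(related)
RETURN DISTINCT labels(related) AS kind, related.name AS name
LIMIT 25
"""

with driver.session() as session:
    records = session.run(RETRIEVE_QUERY, policy="Enterprise Refund Policy")
    context_facts = [f"{record['kind'][0]}: {record['name']}" for record in records]

driver.close()
print("\n".join(context_facts))  # these lines become the curated context passed to the LLM
```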
Recursive summarisation further refines this context, trimming extraneous details while preserving essential information. Meanwhile, stepwise reasoning ensures the model applies the right context to each task, without being overwhelmed by irrelevant data.
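One simple way to picture recursive summarisation is the sketch below, in which oversized context is summarised level by level until it fits a budget. Here `call_llm` is a stand-in for whatever completion client you use, not a real library call, and the character budget is an arbitrary assumption.

```python
def call_llm(prompt: str) -> str:
    """Stand-in for your completion API; replace with a real client call."""
    raise NotImplementedError("plug in your LLM client here")


def recursive_summarise(chunks: list[str], max_chars: int = 4000) -> str:
    """Trim extraneous detail level by level until the context fits the budget."""
    combined = "\n".join(chunks)
    if len(combined) <= max_chars:
        return combined
    if len(chunks) == 1:
        # A single oversized chunk: ask for one tighter summary and stop.
        return call_llm(f"Summarise in under {max_chars} characters, keeping essentials:\n{combined}")
    # Summarise each chunk, then pair the summaries so each level works on fewer, shorter pieces.
    summaries = [call_llm(f"Summarise, keeping policies, owners and dates:\n{c}") for c in chunks]
    paired = ["\n".join(summaries[i:i + 2]) for i in range(0, len(summaries), 2)]
    return recursive_summarise(paired, max_chars)
```

Stepwise reasoning then works on these condensed pieces one task at a time, rather than on the raw corpus.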
Accurate, compliant, and up-to-date information
Ongoing monitoring and governance are also critical, because enterprise data is constantly changing—new customer behaviour, updated processes, and shifting regulations all affect context. GraphRAG supports this by consolidating inputs from multiple stakeholders and data silos, ensuring that the LLM always has access to accurate, compliant, and up-to-date information.
In summary, prompt engineering tells the model what to do, but context engineering provides the foundation for meaningful results. By organising, filtering, and summarising complex enterprise data, GraphRAG elevates your AI, enabling your LLM to deliver precise, actionable insights.
Combined with well-crafted prompts and carefully curated context, this approach lets organisations deploy AI that is not only effective but also adaptive to changing business needs.
The author is CEO of London-headquartered knowledge graph leader Memgraph
