Platforms like ChatGPT, Gemini, and Perplexity are increasingly used for shopping queries, with ChatGPT alone reaching 501 million monthly users and holding a 74.2% market share in the U.S.
Clever Use of System Instructions
If AI systems can understand your brand and what it offers, there is a higher chance it will show up in search results, or even be suggested to customers looking for options. For retail brands, this could translate into AI assistants that intelligently reference inventory levels, customer loyalty status, and seasonal trends before suggesting products. For a travel marketplace, it could mean using real-time weather data, user preferences, and booking history to generate personalised trip recommendations via chatbot. Today, one of these systems can plan a two-week holiday for you around your preferences, says Moddemann.
ALSO READ: Does GEO Render Traditional SEO Redundant?
But context engineering is not only beneficial for in-house models; it can influence generic LLMs too. If a user is looking for a solution to their skin ailments and your brand pages offer relevant cues to a solution, the model will reference them, complete with citations. At the back end, this means mastering prompt design, retrieval-augmented generation (RAG), and the clever use of system instructions, memory, and tool integrations.
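To make the RAG idea above concrete, here is a minimal sketch: retrieve the most relevant brand page for a user query, then assemble it into a grounded prompt. The page names, text, and the keyword-overlap scoring are all illustrative assumptions; real systems use embedding-based retrieval.

```python
# Minimal RAG sketch: keyword-overlap retrieval plus prompt assembly.
# Document contents and the toy scoring scheme are illustrative only.

def score(query: str, doc: str) -> int:
    """Count how many query words appear in the document (toy relevance)."""
    return sum(1 for word in query.lower().split() if word in doc.lower())

def retrieve(query: str, docs: dict[str, str], k: int = 1) -> list[str]:
    """Return the ids of the k best-matching documents."""
    ranked = sorted(docs, key=lambda d: score(query, docs[d]), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: dict[str, str]) -> str:
    """Feed retrieved brand content to the model as grounding context."""
    context = "\n".join(f"[{d}] {docs[d]}" for d in retrieve(query, docs))
    return (f"Context:\n{context}\n\n"
            f"Answer the user, citing sources by id.\nUser: {query}")

brand_pages = {
    "skincare-101": "Our ceramide moisturiser targets dry, flaky skin ailments.",
    "returns": "Returns are free within 30 days of purchase.",
}
prompt = build_prompt("moisturiser for dry skin", brand_pages)
```

The point of the sketch: the model never has to "know" your brand in advance; the relevant page is pulled in at query time, which is why well-structured brand content gets cited.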
So, if brand marketers want to feed the model the right context, such as documents, user history, or structured data, they will need a multi-modal approach. After all, search isn’t just shifting from keywords to intent; it is also moving from primarily text-based results to results that align text, image, and video content around shared themes, or, to get technical, ‘semantic clusters’.
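A toy illustration of what "semantic clusters" means in practice: grouping text, image, and video assets around a shared theme by comparing their descriptive tags. Production pipelines would use embedding models; the asset names, tag sets, and threshold below are illustrative assumptions.

```python
# Toy semantic clustering: greedily merge assets whose tag overlap
# (Jaccard similarity) exceeds a threshold. Tags are illustrative.

def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap between two tag sets (0 = unrelated, 1 = identical)."""
    return len(a & b) / len(a | b)

assets = {
    "blog-post":   {"skincare", "routine", "dry-skin"},
    "infographic": {"skincare", "dry-skin", "ingredients"},
    "video":       {"travel", "packing", "tips"},
}

def cluster(assets: dict[str, set[str]], threshold: float = 0.3) -> list[set[str]]:
    """Group asset names into theme clusters by tag similarity."""
    clusters: list[tuple[set[str], set[str]]] = []  # (member names, pooled tags)
    for name, tags in assets.items():
        for members, pooled in clusters:
            if jaccard(tags, pooled) >= threshold:
                members.add(name)
                pooled |= tags
                break
        else:
            clusters.append(({name}, set(tags)))
    return [members for members, _ in clusters]
```

Here the blog post and infographic land in one skincare cluster while the travel video stands alone, regardless of media type; that cross-format grouping is the cluster a model can surface as one theme.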
Multi-Modal Breadth and Depth of Information
A simple example is making videos searchable. AI models now index spoken content for relevance, so adding transcripts, captions, and timestamps to videos helps them parse and surface the information at the right time.
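As a sketch of that tactic, the snippet below turns a timestamped transcript into a WebVTT caption file, the standard format platforms and crawlers can parse alongside a video. The transcript lines themselves are illustrative assumptions.

```python
# Render a timestamped transcript as a WebVTT caption file.
# The cue text below is illustrative only.

def to_timestamp(seconds: float) -> str:
    """Format seconds as a WebVTT hh:mm:ss.mmm timestamp."""
    h, rem = divmod(int(seconds), 3600)
    m, s = divmod(rem, 60)
    ms = int(round((seconds - int(seconds)) * 1000))
    return f"{h:02d}:{m:02d}:{s:02d}.{ms:03d}"

def to_webvtt(cues: list[tuple[float, float, str]]) -> str:
    """Render (start, end, text) cues as a WebVTT document."""
    lines = ["WEBVTT", ""]
    for start, end, text in cues:
        lines.append(f"{to_timestamp(start)} --> {to_timestamp(end)}")
        lines.append(text)
        lines.append("")
    return "\n".join(lines)

transcript = [
    (0.0, 4.5, "Welcome to our two-minute guide to layering skincare."),
    (4.5, 9.0, "Step one: start with the lightest product first."),
]
vtt = to_webvtt(transcript)
```

Because each cue carries its own timestamp, a crawler can attribute a spoken claim to a precise moment in the video rather than to the video as a whole.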
In March, when Google launched Gemini 2.0 for AI Overviews in the US, the most significant move for the search giant was that users could ask more nuanced questions that previously needed multiple searches. The upgrade lets users not only access high-quality web content but also tap into fresh, real-time sources such as the Knowledge Graph (information about the real world) and shopping data for billions of products. The company explained that it uses a “query fan-out” technique, issuing multiple related searches concurrently across subtopics and data sources and then bringing those results together into an easy-to-understand response. For content marketers, the implication is that content needs to match the breadth and depth of information those fanned-out queries cover.
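The fan-out idea can be sketched in a few lines: run several related sub-queries concurrently and merge the hits. `fetch_results` is a stand-in for a real search backend, and decomposing the user's question into sub-queries (itself a model step) is assumed to have already happened.

```python
# Sketch of a "query fan-out": concurrent sub-searches, merged results.
# fetch_results is a stub; a real system would query live indexes/APIs.

from concurrent.futures import ThreadPoolExecutor

def fetch_results(sub_query: str) -> list[str]:
    """Stub for one search against one source; returns fake hits."""
    return [f"result for '{sub_query}'"]

def fan_out(sub_queries: list[str]) -> list[str]:
    """Issue the related searches concurrently and flatten the results."""
    with ThreadPoolExecutor() as pool:
        batches = list(pool.map(fetch_results, sub_queries))
    return [hit for batch in batches for hit in batch]

# Sub-queries a nuanced question might fan out into (illustrative):
hits = fan_out([
    "Kyoto indoor attractions",
    "Kyoto weather November",
    "Kyoto hotels near station",
])
```

Each sub-query can hit a different source (web index, Knowledge Graph, shopping data), which is why a single nuanced question can surface content a single keyword search never would.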
In-house models, where there is full control over the architecture, offer even greater flexibility: brands can fine-tune the model, build custom memory systems, and design bespoke workflows.
A few terms to understand in context engineering:
- Short-Term Memory: Recent interactions or chat history that help maintain continuity.
- Long-Term Memory: Stored knowledge or previous sessions retrieved from databases or vector stores.
- Retrieved Knowledge (RAG): External documents or data pulled in real time to enrich the model’s understanding.
- Tool Definitions: APIs or functions the AI can call (e.g., search, summarize, calculate).
- Structured Output Templates: JSON schemas or formatting rules that guide how the AI should respond.
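The layers listed above come together at request time. Below is a hedged sketch of how a context-engineered request to an LLM might be assembled; every field's contents, and the payload shape itself, are illustrative assumptions rather than any particular vendor's API.

```python
# Assembling the context layers (memory, RAG, tools, output schema)
# into a single model request. All contents are illustrative.

import json

def build_request(user_msg, chat_history, stored_facts, retrieved_docs,
                  tools, output_schema):
    """Combine each context layer into one request payload."""
    system = "\n".join([
        "You are a retail shopping assistant.",
        "Known customer facts: " + "; ".join(stored_facts),        # long-term memory
        "Reference documents: " + " | ".join(retrieved_docs),      # retrieved knowledge (RAG)
        "Respond as JSON matching: " + json.dumps(output_schema),  # structured output template
    ])
    return {
        "system": system,
        "messages": chat_history + [{"role": "user", "content": user_msg}],  # short-term memory
        "tools": tools,  # tool definitions the model may call
    }

request = build_request(
    user_msg="Any jackets for my trip?",
    chat_history=[{"role": "user", "content": "I fly to Oslo in January."}],
    stored_facts=["loyalty tier: gold", "prefers sustainable brands"],
    retrieved_docs=["Winter jackets restocked this week."],
    tools=[{"name": "check_inventory", "description": "Look up stock by SKU"}],
    output_schema={"type": "object", "properties": {"reply": {"type": "string"}}},
)
```

Each comment maps one line back to a term from the list, which is the practical meaning of context engineering: deciding what goes into each layer before the model ever sees the question.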
Traditional Rules Still Apply
“Traditional SEO optimisations you might be familiar with, like EEAT, are absolutely still applicable in the world of AI, and maybe even more so,” says Moddemann. (EEAT stands for Experience, Expertise, Authoritativeness, and Trustworthiness, a framework Google uses to evaluate the quality and reliability of website content.)
Stick to the fundamentals, and add a layer of intent. Ask whether the searcher’s goal is informational, transactional, or navigational. Keywords still guide search intent, but content must consider semantic clusters, not just exact matches. Write in a way that is conversational without letting go of accuracy, and freshness still counts.
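A rule-of-thumb illustration of those three intent categories; the cue lists are assumptions for the sake of the example, and production systems use trained classifiers rather than keyword rules.

```python
# Toy intent classifier for the informational / transactional /
# navigational split. Keyword cues are illustrative assumptions.

INTENT_CUES = {
    "transactional": ["buy", "price", "discount", "order", "book a"],
    "navigational": ["login", "homepage", "contact", "official site"],
}

def classify_intent(query: str) -> str:
    """Label a query with the first intent whose cues it matches."""
    q = query.lower()
    for intent, cues in INTENT_CUES.items():
        if any(cue in q for cue in cues):
            return intent
    return "informational"  # default: the searcher wants to learn something
```

Even this crude split is useful editorially: an informational query calls for an explainer, a transactional one for a product page, a navigational one for clear site structure.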
On the backend, fast load times, mobile optimisation and crawlability remain crucial. In fact, AI crawlers rely on XML sitemaps, schema markup, and clean site architecture to understand and index content.
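As one concrete form of that schema markup, the sketch below renders a schema.org `Product` block as JSON-LD, ready to embed in a page; the product details and helper name are illustrative.

```python
# Generate schema.org Product markup as an embeddable JSON-LD script tag.
# Product name, price, and URL below are illustrative only.

import json

def product_jsonld(name: str, price: str, currency: str, url: str) -> str:
    """Render a schema.org Product block for a page's <head>."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "url": url,
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
        },
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"

snippet = product_jsonld(
    "Ceramide Moisturiser", "18.99", "GBP",
    "https://example.com/moisturiser",
)
```

Markup like this gives a crawler the product's name, price, and currency as unambiguous fields instead of forcing it to infer them from page copy.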
Backlinks and citations remain key trust signals for machines parsing content.
ALSO READ: Can GenAI Organise The Unorganised Web?