In 2026, Google ranks pages on E-E-A-T signals, Core Web Vitals performance, and how precisely the content matches search intent. Keyword stuffing and link schemes have lost ground to demonstrable expertise, fast load times, and structured data that surfaces answers in AI-generated search results.
Google's E-E-A-T framework — Experience, Expertise, Authoritativeness, and Trustworthiness — is now the dominant lens through which quality raters and the algorithm itself evaluate pages. Experience is the newest addition: Google wants evidence that the author has first-hand, real-world knowledge of the topic, not just a summary of what others have published.
In practical terms, this means pages need named authors with visible bios that link to their credentials, an About page that establishes the business or individual clearly, citations from credible third-party sources, and brand mentions across the web — reviews, press coverage, and professional directories. For South African businesses, local citations on Google Business Profile, Hello Peter, and industry bodies carry real weight.
Content depth matters too. A single 300-word overview page on a broad topic cannot compete with a content cluster that covers the subject from multiple angles, with internal links connecting pillar pages to supporting pieces. Google measures topical authority: the more comprehensively a site covers a subject, the more trustworthy it appears for related queries. Thin pages with no original insight are being filtered out at scale.
For reference, HubSpot's guide to Google ranking factors provides a useful overview of how these quality signals interact across the algorithm.
Page experience signals — specifically Core Web Vitals — remain a confirmed ranking factor in 2026. The three metrics that matter are LCP (Largest Contentful Paint, which measures how fast the main content loads), CLS (Cumulative Layout Shift, which measures visual stability as the page renders), and INP (Interaction to Next Paint, which replaced FID in 2024 and measures how quickly the page responds to user input).
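As a rough illustration of how these metrics are captured from real users, the sketch below uses the open-source web-vitals JavaScript library. The reporting endpoint is a placeholder, not part of any Google API, and the snippet assumes a bundler that can resolve the npm package.

```typescript
// Minimal field measurement of the three Core Web Vitals using the
// open-source `web-vitals` library (npm install web-vitals).
import { onLCP, onCLS, onINP, type Metric } from 'web-vitals';

// Placeholder endpoint: swap in your own analytics collector.
const REPORTING_URL = '/analytics/vitals';

function report(metric: Metric): void {
  // Each callback fires when the metric is final or the page is hidden,
  // so sendBeacon is the safest way to get the data out.
  const body = JSON.stringify({
    name: metric.name,     // 'LCP' | 'CLS' | 'INP'
    value: metric.value,   // milliseconds for LCP/INP, unitless score for CLS
    rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
  });
  navigator.sendBeacon?.(REPORTING_URL, body);
}

onLCP(report); // main content loaded
onCLS(report); // visual stability
onINP(report); // input responsiveness (replaced FID in 2024)
```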
South African sites face a structural challenge here. Load shedding disrupts local hosting infrastructure, shared hosting on undersized South African servers struggles with time-to-first-byte, and many small business sites still rely on unoptimised WordPress themes with heavy render-blocking scripts. A site that scores below 50 on Google PageSpeed Insights for mobile is competing with one hand tied behind its back, regardless of content quality. Image compression, a CDN, and deferred JavaScript are the starting points — not nice-to-haves.
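To track that threshold without opening the browser tool each time, a page's mobile performance score can be pulled from the public PageSpeed Insights v5 API. The snippet below is a minimal sketch: the example.co.za URL is a placeholder, and heavier usage would need an API key passed as a key parameter.

```typescript
// Quick check of a page's mobile performance score via the public
// PageSpeed Insights v5 API.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function mobileScore(pageUrl: string): Promise<number> {
  const params = new URLSearchParams({ url: pageUrl, strategy: 'mobile' });
  const res = await fetch(`${PSI_ENDPOINT}?${params}`);
  if (!res.ok) throw new Error(`PSI request failed: ${res.status}`);
  const data = await res.json();
  // Lighthouse reports performance as 0-1; scale it to match the 0-100
  // score shown in the PageSpeed Insights UI.
  return Math.round(data.lighthouseResult.categories.performance.score * 100);
}

// Example: flag a page that falls below the 50 threshold mentioned above.
mobileScore('https://example.co.za').then((score) => {
  console.log(score < 50 ? `Needs work: ${score}` : `OK: ${score}`);
});
```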
Search intent is equally non-negotiable. Google classifies queries as transactional (the user wants to buy or act), informational (the user wants to learn), or navigational (the user wants to reach a specific site). Serving an informational article to a transactional query — or a product page to someone researching options — is a mismatch that no amount of keyword repetition will fix. The page type, structure, and call to action need to align with what the searcher actually wants at that moment.
The arrival of Google's AI Overviews and widespread AI-assisted search has introduced a new layer of ranking signals that most South African sites are not yet optimised for. Structured data — JSON-LD schema markup — tells AI systems exactly what a page is about: a product, an FAQ, a local business, an article, a how-to guide. Pages with accurate schema are significantly more likely to be cited in AI-generated answer panels than those without it.
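As an illustration, the sketch below builds FAQPage markup as a TypeScript object and injects it as a JSON-LD script tag. The question and answer text are invented, and on server-rendered sites the same JSON would normally be embedded directly in the page head rather than added at runtime.

```typescript
// Illustrative FAQPage markup, expressed as a TypeScript object and
// injected as a JSON-LD script tag.
const faqSchema = {
  '@context': 'https://schema.org',
  '@type': 'FAQPage',
  mainEntity: [
    {
      '@type': 'Question',
      name: 'Do you deliver nationwide?',
      acceptedAnswer: {
        '@type': 'Answer',
        text: 'Yes, we deliver to all major South African cities within 3-5 working days.',
      },
    },
  ],
};

const script = document.createElement('script');
script.type = 'application/ld+json';
script.textContent = JSON.stringify(faqSchema);
document.head.appendChild(script);
```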
Entity recognition is related: Google's Knowledge Graph now maps businesses, people, places, and concepts as entities with relationships between them. A business that appears consistently across Google Business Profile, Wikipedia, Wikidata, LinkedIn, and industry directories builds entity salience — it becomes a known, trusted node in the knowledge graph rather than an anonymous domain.
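In markup terms, those cross-platform profiles are usually declared with schema.org's sameAs property on an Organization object. The business name and URLs below are placeholders for a hypothetical company, and the object would be injected the same way as the FAQ example above.

```typescript
// Declaring one entity across platforms with schema.org's `sameAs` property.
// All names and URLs are placeholders for a hypothetical business.
const organisationSchema = {
  '@context': 'https://schema.org',
  '@type': 'Organization',
  name: 'Example Digital (Pty) Ltd',
  url: 'https://www.example.co.za',
  sameAs: [
    'https://www.linkedin.com/company/example-digital', // LinkedIn page
    'https://www.wikidata.org/wiki/Q00000000',          // placeholder Wikidata ID
    'https://g.page/example-digital',                    // Google Business Profile
  ],
};
```

Consistent sameAs links signal that the domain, the LinkedIn page, and the Business Profile all refer to the same entity, which is exactly the consistency the Knowledge Graph rewards.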
In 2025, llms.txt emerged as a new convention — a plain-text file placed at the root of a domain that tells large language models (including Perplexity, Claude, and ChatGPT) which pages are most important, what the site covers, and how it is structured. While not yet a direct Google ranking signal, it influences AI-driven referral traffic and citation likelihood, which is increasingly where organic discovery happens.
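For orientation, the sketch below follows the commonly circulated llms.txt layout: a single top-level heading with the site name, a short blockquote summary, and sections of annotated links. The business, pages, and URLs are invented for illustration.

```text
# Example Digital

> Cape Town web design and SEO agency. Key service and guide pages are
> listed below for LLM-driven discovery.

## Services
- [SEO services](https://www.example.co.za/seo): local and technical SEO for SA businesses
- [Web design](https://www.example.co.za/web-design): conversion-focused WordPress builds

## Guides
- [Core Web Vitals guide](https://www.example.co.za/guides/core-web-vitals): fixing LCP, CLS and INP
```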
What is losing importance: exact-match keyword stuffing has been penalised since the Panda and Penguin updates but still persists on older sites, where it now actively harms rankings. Low-quality link schemes from private blog networks, directory spam, and paid links that violate Google's policies are regularly caught by SpamBrain, Google's AI-powered spam filter. Keyword density as a metric is obsolete; relevance and semantic coverage are what the algorithm measures.
Explore Technical SEO Factors →