Over the next three months, I’m documenting how I improve the Generative Engine Optimization (GEO) of my personal website.
My goal is simple:
Make my website easier for both Google and AI systems to understand, crawl, and cite.
Recently, I ran a GEO audit on my website (ingvarestorco.com) using my own analysis framework rather than relying on paid tools like Semrush. The results were eye-opening.
The audit gave my site a 40/100 GEO score, revealing that while the site technically works, it still lacks the signals needed for strong AI visibility.
More importantly, the report showed something I hadn’t fully realised:
AI systems may not yet recognise my name or expertise because my site lacks external authority signals and structured entity data.
So I decided to treat this as a hands-on experiment.
For the next three months, I’ll be applying what I’ve learned about crawling, indexing, structured data, and AI citability.
This article documents what I’ve learned so far and how I’m applying it to my own website.
When I first reviewed the report, I expected a few technical fixes.
Instead, it revealed deeper issues affecting how AI systems see my website.
Here are the key GEO scores from the audit:
| GEO Category | Score |
|---|---|
| AI Citability | 47/100 |
| Brand Authority | 22/100 |
| Content EEAT | 48/100 |
| Technical GEO | 54/100 |
| Schema & Structured Data | 42/100 |
| Platform Optimization | 20/100 |
Overall GEO score: 40/100.
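As a quick sanity check on the report, the overall score can be recomputed from the category table above. A simple mean comes out slightly below the reported 40/100, which suggests the audit tool weights some categories more heavily (the weighting itself isn’t published, so this is just a check, not the tool’s formula):

```python
# Recompute an overall score from the audit's published category scores.
scores = {
    "AI Citability": 47,
    "Brand Authority": 22,
    "Content EEAT": 48,
    "Technical GEO": 54,
    "Schema & Structured Data": 42,
    "Platform Optimization": 20,
}

# Unweighted mean of the six categories (the real tool likely weights them).
mean_score = sum(scores.values()) / len(scores)
print(f"Simple mean: {mean_score:.1f}/100")
```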
The biggest insight for me was that SEO alone isn’t enough anymore.
A site can rank in Google but still be invisible to AI systems.
That’s where Generative Engine Optimization comes in.
Read the full GEO audit report generated using my personal analysis tool: https://ingvarestorco.com/wp-content/uploads/2026/03/GEO-AUDIT-REPORT.pdf

Even though I’ve worked in SEO and web development for years, reviewing the fundamentals helped me understand why my site had crawling and visibility issues.
Google search works through three main stages: crawling, indexing, and serving results.
Google uses automated bots called Googlebot to explore the web and discover pages.
Crawlers find pages through links from already-known pages and through XML sitemaps.
During my audit review, I discovered something surprising:
My XML sitemap was broken.
That means Google crawlers may have difficulty discovering pages beyond the homepage.
Fixing this became my first GEO improvement task.
I immediately regenerated the sitemap and resubmitted /sitemap_index.xml in Google Search Console. This small fix alone can significantly improve crawl efficiency.
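A broken sitemap like mine is easy to catch with a small script: parse the XML and list the URLs it exposes, failing loudly if the file is malformed. A minimal sketch (the URL in the comment is my own sitemap; swap in yours):

```python
# Minimal sitemap sanity check: parse sitemap XML and return the URLs it
# exposes. ET.fromstring raises ParseError on broken XML, which is exactly
# the failure my audit surfaced.
import urllib.request
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_bytes: bytes) -> list[str]:
    """Return every <loc> value in a sitemap or sitemap index."""
    root = ET.fromstring(xml_bytes)
    return [loc.text for loc in root.iter(f"{NS}loc")]

# Example (network call, run manually):
# with urllib.request.urlopen("https://ingvarestorco.com/sitemap_index.xml", timeout=10) as resp:
#     print(*sitemap_urls(resp.read()), sep="\n")
```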
After crawling, Google tries to understand the page and store it in its index.
It evaluates signals like content quality, keyword relevance, metadata, and structured data.
While reviewing my homepage code, I realised something else:
My page builder renders most content via JavaScript.
The audit highlighted that AI crawlers might only see navigation and schema but not the actual content.
AI crawlers often do not execute JavaScript like browsers do.
That means critical information, such as my name, my services, and the core page copy, should exist in server-rendered HTML.
This is something I’m now actively improving on my homepage.
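The JavaScript-rendering problem can be checked the same way a non-rendering crawler sees a page: fetch the raw HTML, skip JS execution entirely, and test whether key phrases are present. A small sketch (the phrases and URL in the comment are placeholders for your own):

```python
# Check which key phrases are missing from the raw, pre-JavaScript HTML,
# i.e. the version of the page a non-rendering AI crawler would see.
import urllib.request

def missing_phrases(html: str, phrases: list[str]) -> list[str]:
    """Return the phrases that do NOT appear in the raw HTML (case-insensitive)."""
    lowered = html.lower()
    return [p for p in phrases if p.lower() not in lowered]

# Example (network call, run manually):
# html = urllib.request.urlopen("https://ingvarestorco.com/").read().decode("utf-8", "replace")
# print(missing_phrases(html, ["SEO specialist", "Ingvar"]))
```

If a phrase shows up as missing here but is visible in a browser, it is being injected by JavaScript and may be invisible to AI crawlers.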
Traditional search engines rank pages based on relevance and authority.
But AI search engines do something slightly different.
Instead of showing ten links, they generate direct answers.
This means visibility now depends on citability.
AI models prefer sources that have clear structure, verifiable authority, and machine-readable entity data.
This is exactly what Generative Engine Optimization (GEO) focuses on.
Before doing this audit, I thought GEO was just another SEO buzzword.
Now I see it differently.
GEO is about making sure AI systems can confidently reference your content as a source.
For my site, this means improving structured entity data, external authority signals, and content depth.
The audit highlighted that my website currently lacks strong brand recognition signals, which is why AI models may not identify me as a known entity yet.
That was a big wake-up call.
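Structured entity data is the most concrete of those missing signals. One common approach is a JSON-LD `Person` block in the page head, which tells AI systems who the site belongs to and where that identity can be verified. A sketch of what I mean, with placeholder values (the names, job title, and profile URLs are illustrative, not my final markup):

```python
# Build a JSON-LD Person entity block of the kind the audit flagged as
# missing. All personal values below are placeholders.
import json

person_schema = {
    "@context": "https://schema.org",
    "@type": "Person",
    "name": "Your Name",               # placeholder
    "jobTitle": "SEO Specialist",      # placeholder
    "url": "https://ingvarestorco.com",
    "sameAs": [                        # third-party profiles that confirm identity
        "https://example.com/your-profile",  # placeholder
    ],
}

# Embed the result in the page head inside:
# <script type="application/ld+json"> ... </script>
snippet = json.dumps(person_schema, indent=2)
```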
Instead of fixing everything at once, I created a three-month roadmap.
The first month focuses on infrastructure: fixing the XML sitemap, moving key content into server-rendered HTML, and adding an llms.txt file. These changes improve crawlability and AI understanding.
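For context, llms.txt is a proposed convention: a plain-Markdown file served at the site root that summarises the site for AI systems, with an H1 title, a short blockquote description, and sections of curated links. A minimal sketch of what mine might look like (the summary and link are placeholders, not my final file):

```text
# ingvarestorco.com

> Personal website of an SEO and web development specialist, documenting
> a three-month Generative Engine Optimization (GEO) experiment.

## Blog

- [GEO audit write-up](https://ingvarestorco.com/blog/): findings and roadmap
```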
The audit also showed my blog content is still too thin for topical authority.
I currently have only three blog posts, which is far below the expected coverage for an SEO specialist’s site.
Over the next month, my goal is to publish 8–10 new articles.
Each article will include clear structure, practical detail, and first-hand, experience-based insights.
One of the biggest weaknesses identified in the audit is brand authority.
My online presence currently exists mostly on platforms I control.
AI systems rely on independent third-party sources to confirm expertise.
I plan to create profiles on independent, third-party platforms where my expertise can be verified.
I also want to publish at least one guest article on an SEO or web development site.
These signals help AI systems verify who I am and what I specialise in.
One of the most important lessons from this process is that AI visibility is different from search rankings.
In traditional SEO, success often means ranking on page one.
In AI search, success means being cited in answers.
That requires content that is clear, well-structured, and easy to cite.
This is why I’m shifting my blog from purely informational posts to experience-based articles like this one.
Another insight from the audit was related to EEAT signals.
Google values demonstrated experience, not just explanations.
By documenting my own learning journey, I can show real first-hand experience rather than just explanations.
This creates stronger experience signals for both search engines and AI systems.
Google crawling is the process where automated bots called Googlebot discover and scan webpages across the internet so they can be indexed.
Google search works through three main steps: crawling, indexing, and serving results.
Generative Engine Optimization (GEO) is the practice of optimising websites so that AI search engines can discover, understand, and cite content in generated answers.
AI visibility refers to how often your content appears as a source in AI-generated answers across platforms like ChatGPT, Gemini, and Perplexity.
A GEO audit evaluates how well a website is optimised for AI search systems by analysing areas such as AI citability, brand authority, content EEAT, technical signals, schema and structured data, and platform optimisation.
This GEO audit was a valuable reminder that search is evolving quickly.
Optimising for Google alone is no longer enough.
Websites now need to be understandable to both search engines and AI systems.
Over the next three months, I’ll continue applying these lessons to my website and documenting the results.
My goal is simple:
Improve my GEO score, strengthen my authority signals, and make my content easier for AI systems to cite.
And if the process works, I’ll have a real case study showing how GEO improvements affect AI visibility over time.