From Page 5 to Top Result: My SEO Checklist
A proven SEO optimisation guide you can follow and see results too!
A couple of years ago, I was tasked with improving the visibility of a product website. We were buried in search results, stuck on page five, while our competitors claimed the spotlight. That experience sent me down a rabbit hole of SEO. After a series of targeted improvements and best-practice fixes, we pushed the site to the top result for our product category within a couple of months.
Now, as AI-first websites rise and everyone talks about being “AI SEO” friendly, it’s easy to forget that many regular websites still haven’t nailed the fundamentals of traditional SEO. These core principles still matter, and they help ensure broader coverage across search engines (AI or not!). This post is an attempt to move the web ecosystem forward by highlighting the fundamental SEO practices for everyone building online.
TL;DR: Want to implement these SEO improvements directly in your code? Just visit this link and paste the prompt into your IDE to get started.
P.S. I’ve ranked these by impact so you can decide what’s worth tackling first.
🥇 High Impact
Server Side Rendering (SSR)
Search engines primarily crawl the HTML source of a page, not content that is rendered later by client-side JavaScript.
To ensure your key content is crawlable:
Use Server-Side Rendering (SSR) for important pages like the homepage, features, or blog.
In frameworks like Next.js, enabling SSR ensures content is included in the initial HTML response.
If you're using the App Router in Next.js, you can easily take advantage of SSR through features like streaming SSR and React Server Components.
How to check if SSR is working:
Right-click on your page and select “View Page Source.” If you see your actual content in the HTML, SSR is properly set up.
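To make the difference concrete, here’s a rough sketch (with a hypothetical product name) of what the page source looks like in each case:

```html
<!-- Server-rendered: the crawler sees the actual content in the initial HTML -->
<div id="root">
  <h1>Acme Analytics: Dashboards for Small Teams</h1>
  <p>Track your metrics in real time…</p>
</div>

<!-- Client-rendered only: the crawler sees an empty shell until JavaScript runs -->
<div id="root"></div>
<script src="/bundle.js"></script>
```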
Metadata: Say What You Do
Search engines rely on the `<title>` and `<meta name="description">` tags to understand your content and display it in results.
Make them specific, keyword-rich, and helpful. They should cover the most important aspects of your product.
Example:
<title>{YOUR_PRODUCT_NAME} | {ADDITIONAL SHORT CONTEXT ABOUT WHAT IT DOES}</title>
<meta name="description" content="{A BRIEF DESCRIPTION COVERING IMPORTANT KEYWORDS OF YOUR PRODUCT}">
Optimising these dramatically improved our ranking.
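Filling in the placeholders for a hypothetical product, this might look like:

```html
<title>Acme Analytics | Real-Time Dashboards for Small Teams</title>
<meta name="description" content="Acme Analytics turns your raw data into real-time dashboards, alerts, and weekly reports, with no SQL required.">
```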
Lighthouse Score: Time to fix your bad code
Google’s Lighthouse score reflects signals, such as Core Web Vitals, that Google uses to judge whether your site is fast, accessible, and optimised for various devices.
You can check it here: PageSpeed Insights
To improve your score:
Define width and height attributes on all images to avoid layout shifts
Split JS bundles using code-splitting.
Defer loading of non-critical scripts like Google Tag Manager
Lazy load heavy components which are below the fold. This has a huge positive impact on your LCP.
Aim for 90+ across performance, accessibility, best practices, and SEO.
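The first three items above can be sketched in markup like this (file names and the analytics ID are placeholders):

```html
<!-- Explicit dimensions prevent layout shift (CLS) -->
<img src="/hero.jpg" width="1200" height="630" alt="Product hero">

<!-- Below-the-fold images can be lazy loaded -->
<img src="/feature.jpg" width="800" height="450" alt="Feature screenshot" loading="lazy">

<!-- Non-critical third-party scripts are deferred -->
<script src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX" defer></script>
```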
Schema Markup: For Rich Results
Use structured data to:
Enhance search listings (e.g., FAQs, Reviews)
Enable features like breadcrumbs, ratings, and product highlights
Use JSON-LD in the <head> and validate via Google’s Rich Results Test.
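As a minimal sketch, an FAQ block in JSON-LD (with placeholder text) looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What does {YOUR PRODUCT NAME} do?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A short answer covering your product's key capability."
    }
  }]
}
</script>
```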
Mobile Responsive
Mobile responsiveness is non-negotiable.
Most users browse on mobile, and Googlebot uses mobile-first indexing.
Checklist:
Use responsive layouts with flex or grid
Make buttons and inputs tap-friendly
Avoid fixed-width containers
Test responsiveness across multiple devices.
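The checklist above can be sketched with a viewport meta tag and a fluid grid (class names are hypothetical):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Fluid grid: columns collapse naturally on narrow screens */
  .features {
    display: grid;
    grid-template-columns: repeat(auto-fit, minmax(280px, 1fr));
    gap: 1rem;
  }
  /* Tap-friendly targets: roughly 44px minimum is a common guideline */
  button { min-height: 44px; min-width: 44px; }
</style>
```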
Add Sitemap
Sitemaps make it easier for search engines to crawl and index your content.
Use tools like next-sitemap or similar plugins to:
Generate a sitemap for all site pages
Generate a separate image sitemap if you use many visuals
Generate a video sitemap too if it’s relevant
Submit via Google Search Console.
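For reference, a generated sitemap entry looks roughly like this (example.com is a placeholder domain):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/features</loc>
  </url>
</urlset>
```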
Add Canonical Tags
Avoid duplicate content penalties by specifying the main version of a page using canonical tags. You lose the algorithm’s trust if multiple URLs serve the same content.
Example:
<link rel="canonical" href="https://{YOUR_PRODUCT}/features" />
This is especially important when UTM tags or filtered pages are involved.
Increase Content of Site
More content = more keywords you can rank for.
Useful additions:
A blog page containing articles that explain related concepts
FAQ pages targeting common queries
Use-case driven sections that match user intent
Helpful content increases trust, keeps users longer, and makes your site eligible for more queries. As you publish more posts covering different aspects of your product, Google associates your site with a wider range of related queries.
Fetch Device Appropriate Images
Serving the same large images to all devices is wasteful.
We started dynamically serving optimized image sizes based on device and screen resolution. It's especially useful for:
Using srcset to define multiple image resolutions
Ensuring mobile users get lighter assets
Hosting assets on a CDN for faster delivery
This significantly improves Lighthouse performance, conserves bandwidth for mobile users, and ensures faster visual rendering.
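A minimal srcset sketch (paths, sizes, and alt text are hypothetical):

```html
<img
  src="/images/hero-800.jpg"
  srcset="/images/hero-400.jpg 400w,
          /images/hero-800.jpg 800w,
          /images/hero-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 800px"
  width="800" height="450"
  alt="Product dashboard overview">
```

The browser picks the smallest candidate that still looks sharp for the current viewport and pixel density.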
I recommend using ImageKit for greater control over your assets, or leveraging Next.js Image component, which handles optimizations out of the box.
🥈 Moderate Impact
Avoid Errors in Console
Console errors (JS, 404s, CORS issues) hurt SEO.
Google bots interpret errors as signs of poor quality or broken user experience.
Regularly review and resolve issues in DevTools > Console.
Add Images with Alt Text
Alt text isn't just for accessibility; it's also a ranking signal.
Tips:
Add descriptive alt tags for all important images
Include relevant keywords when appropriate
Avoid keyword stuffing
Example:
<img src="mobile-ui.png" alt="{YOUR PRODUCT NAME} mobile interface">
Importance of h1, h2 Tags
Heading tags signal content hierarchy. Search engines give these more weight than regular paragraph text.
Best practices:
Use a single <h1> per page for your primary heading
Use <h2>, <h3>, etc. for subsections
Include relevant keywords naturally
Avoid using styled <div>s to mimic headings; crawlers don’t interpret them as structural indicators.
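A clean hierarchy for a product page might look like this (headings are hypothetical):

```html
<h1>{YOUR PRODUCT NAME}: What It Does</h1>
<h2>Key Features</h2>
<h3>Real-Time Sync</h3>
<h2>Pricing</h2>
```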
Link Soup
All important pages should be reachable from one another.
To build this:
Add contextual links between pages
Ensure no page is orphaned
Use the footer to link to high-value pages (e.g., Privacy, Blog, Docs, About)
The easier it is to crawl your site, the better your indexation.
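A simple footer block that links to high-value pages might look like:

```html
<footer>
  <nav aria-label="Footer">
    <a href="/blog">Blog</a>
    <a href="/docs">Docs</a>
    <a href="/about">About</a>
    <a href="/privacy">Privacy</a>
  </nav>
</footer>
```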
Robots.txt: Controlling What Gets Crawled
Your robots.txt file is a basic but essential tool for directing crawler behaviour.
Use it to:
Prevent crawling of dev or private pages (note that robots.txt blocks crawling, not indexing itself)
Ensure important paths are accessible
Avoid wasting crawl budget on irrelevant content
Example setup:
User-agent: *
Disallow: /dev/
Allow: /
Keep this file simple and up to date with your site's structure.
Freshness Counts
Content that’s updated regularly ranks better.
Make it a habit to:
Update old posts with new insights
Fix outdated examples or screenshots
Refresh publish dates when relevant
This signals relevance to both search engines and users.
Avoid Redirects Internally
Avoid sending users through unnecessary redirects. Redirect chains waste crawl budget and may be seen as suspicious by crawlers.
Where possible:
Link users directly to the target page
Avoid multiple hops (e.g., internal → short link → external)
Clean URL structures improve ranking and performance.
🥉 Lower Impact
Add Google Tag Manager
Google Tag Manager helps manage analytics and third-party scripts.
Best practice:
Load GTM asynchronously
Defer loading it until after first paint to preserve performance
You still track what matters, just more efficiently.
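As a simplified sketch (not the official GTM snippet, which also bootstraps the dataLayer), you could defer loading the container until after the load event; GTM-XXXXXXX is a placeholder container ID:

```html
<script>
  // Simplified sketch: load GTM after the page has painted,
  // so it doesn't compete with critical resources.
  window.addEventListener('load', function () {
    var s = document.createElement('script');
    s.async = true;
    s.src = 'https://www.googletagmanager.com/gtm.js?id=GTM-XXXXXXX';
    document.head.appendChild(s);
  });
</script>
```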
Favicon
A favicon is small but signals completeness.
Missing favicons are flagged in Lighthouse audits and reduce credibility.
Add one using:
<link rel="icon" href="{YOUR_FAVICON_PATH}">
Add Open Graph Tags
Open Graph (OG) and Twitter tags make your site links look good when shared.
They don’t drastically affect SEO but help with trust and shareability.
Example:
<meta property="og:title" content="{YOUR PRODUCT NAME}">
<meta name="twitter:card" content="summary_large_image">
Social Links in Footer
Add your verified handles (e.g., Twitter, Discord, GitHub) in the footer.
This helps:
Reinforce brand authority
Ensure crawlers associate your domain with your social profiles
Make it easier for users to find you across platforms
Serve Images from a CDN and List Them in a Separate Sitemap
CDNs speed up content delivery. Serving images through them reduces latency.
Steps:
Host all static images on a CDN
Create a separate image sitemap listing all image URLs
Submit it via Search Console
This improves both performance and image discoverability.
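An image sitemap entry, using Google’s image sitemap namespace, looks roughly like this (URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/features</loc>
    <image:image>
      <image:loc>https://cdn.example.com/images/feature-overview.png</image:loc>
    </image:image>
  </url>
</urlset>
```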
Track Everything: Google Search Console
Google Search Console is essential for monitoring your SEO progress.
Key features:
Submit updated sitemaps
Track impressions, CTRs, and keyword rankings
Catch crawl errors early
Review reports weekly
Final Thoughts
Bringing more of the web up to the SEO baseline is not just about traffic; it’s about giving useful, well-built websites the visibility they deserve. These learnings are based on what worked for me in the past, and I’m sure there’s still plenty more to learn and improve.
If this helps even a few people approach SEO with a bit more clarity and confidence, it’ll be worth the effort.
P.S. I’ll be sharing insights on AI SEO soon, so you can learn how to boost your product’s presence on tools like GPT and Perplexity too.
Feel free to ping me on X if you need to discuss anything around SEO!


